diff --git a/.dotnet.azure/CHANGELOG.md b/.dotnet.azure/CHANGELOG.md new file mode 100644 index 000000000..972422632 --- /dev/null +++ b/.dotnet.azure/CHANGELOG.md @@ -0,0 +1,495 @@ +# Release History + +## 2.0.0-beta.3 (2024-08-23) + +This change updates the library for compatibility with the latest `2.0.0-beta.9` of the `OpenAI` package and the `2024-07-01-preview` Azure OpenAI service API version label, as published on 8/5. + +### Features Added + +- The library now directly supports alternative authentication audiences, including Azure Government. This can be specified by providing an appropriate `AzureOpenAIAudience` value to the `AzureOpenAIClientOptions.Audience` property when creating a client. See the client configuration section of the README for more details. + +Additional new features from the `OpenAI` package can be found in [the OpenAI changelog](https://github.com/openai/openai-dotnet/blob/main/CHANGELOG.md). + +**Please note**: Structured Outputs support is not yet available with the `2024-07-01-preview` service API version. This means that attempting to use the feature with this library version will fail with an unrecognized property for either `response_format` or `strict` in request payloads; all existing functionality is unaffected. Azure OpenAI support for Structured Outputs is coming soon. + +### Breaking Changes + +No Azure-specific breaking changes are present in this update. + +The update from `OpenAI` `2.0.0-beta.7` to `2.0.0-beta.9` does bring a number of breaking changes, however, as described in [the OpenAI changelog](https://github.com/openai/openai-dotnet/blob/main/CHANGELOG.md): + +- Removed client constructors that do not explicitly take an API key parameter or an endpoint via an `OpenAIClientOptions` parameter, making it clearer how to appropriately instantiate a client. 
([13a9c68](https://github.com/openai/openai-dotnet/commit/13a9c68647c8d54475f1529a63b13ad711bd4ba6)) +- Removed the endpoint parameter from all client constructors, making it clearer that an alternative endpoint must be specified via the `OpenAIClientOptions` parameter. ([13a9c68](https://github.com/openai/openai-dotnet/commit/13a9c68647c8d54475f1529a63b13ad711bd4ba6)) +- Removed `OpenAIClient`'s `Endpoint` `protected` property. ([13a9c68](https://github.com/openai/openai-dotnet/commit/13a9c68647c8d54475f1529a63b13ad711bd4ba6)) +- Made `OpenAIClient`'s constructor that takes a `ClientPipeline` parameter `protected internal` instead of just `protected`. ([13a9c68](https://github.com/openai/openai-dotnet/commit/13a9c68647c8d54475f1529a63b13ad711bd4ba6)) +- Renamed the `User` property in applicable Options classes to `EndUserId`, making its purpose clearer. ([13a9c68](https://github.com/openai/openai-dotnet/commit/13a9c68647c8d54475f1529a63b13ad711bd4ba6)) +- Changed name of return types from methods returning streaming collections from `ResultCollection` to `CollectionResult`. ([7bdecfd](https://github.com/openai/openai-dotnet/commit/7bdecfd8d294be933c7779c7e5b6435ba8a8eab0)) +- Changed return types from methods returning paginated collections from `PageableCollection` to `PageCollection`. ([7bdecfd](https://github.com/openai/openai-dotnet/commit/7bdecfd8d294be933c7779c7e5b6435ba8a8eab0)) +- Users must now call `GetAllValues` on the collection of pages to enumerate collection items directly. Corresponding protocol methods return `IEnumerable<ClientResult>` where each collection item represents a single service response holding a page of values. ([7bdecfd](https://github.com/openai/openai-dotnet/commit/7bdecfd8d294be933c7779c7e5b6435ba8a8eab0)) +- Updated `VectorStoreFileCounts` and `VectorStoreFileAssociationError` types from `readonly struct` to `class`. 
([58f93c8](https://github.com/openai/openai-dotnet/commit/58f93c8d5ea080adfee8b37ae3cc034ebb06c79f)) + +### Bugs Fixed + +- Removed an inappropriate null check in `FileClient.GetFiles()` (azure-sdk-for-net 44912) +- Addressed issues with automatic retry behavior, including for HTTP 429 rate limit errors: + - Authorization headers are now appropriately reapplied to retried requests + - Automatic retry behavior will now honor header-based intervals from `Retry-After` and related response headers +- The client will now originate an `x-ms-client-request-id` header to match prior library behavior and facilitate troubleshooting + +Additional, non-Azure-specific bug fixes can be found in [the OpenAI changelog](https://github.com/openai/openai-dotnet/blob/main/CHANGELOG.md). + +## 2.0.0-beta.2 (2024-06-14) + +### Features Added + +- Per changes to the [OpenAI .NET client library](https://github.com/openai/openai-dotnet), most convenience methods now directly accept optional `CancellationToken` parameters, removing the need to use protocol methods + +### Breaking Changes + +- In support of `CancellationToken`s in methods, an overridden method signature for streaming chat completions was changed, and a new minimum version of 2.0.0-beta.5 is established for the `OpenAI` dependency. These styles of breaks will be extraordinarily rare. + +### Bugs Fixed + +- See breaking changes: when streaming chat completions, an error of "Unrecognized request argument supplied: stream_options" occurs when using Azure.AI.OpenAI 2.0.0-beta.1 with OpenAI 2.0.0-beta.5+. This is fixed with the new version. + +## 2.0.0-beta.1 (2024-06-07) + +**Please note**: This update brings a *major* set of changes to the Azure.AI.OpenAI library. 
+ +With the release of the official [OpenAI .NET client library](https://github.com/openai/openai-dotnet), the `Azure.AI.OpenAI` library has migrated to become a companion to OpenAI's package that offers Azure client configuration and strongly-typed extension support for Azure-specific request and response models. + +**We'd love your feedback:** our goal is to move the new `OpenAI` .NET library and its refreshed `Azure.AI.OpenAI` companion into a General Availability status as quickly as we can; we've heard loud and clear that the perpetual preview/prerelease status is an adoption blocker. To reach that goal, your feedback -- either on the issues here, in `azure-sdk-for-net`, or the issues on the new `openai-dotnet` OpenAI repository -- will be invaluable. + +### Features Added + +**OpenAI parity**: built on the OpenAI .NET library, full parity support is available for the breadth of common features, including: + +- Assistants V2 with streaming +- Audio transcription/translation and text-to-speech generation +- (Coming soon) Batch +- Chat completion +- Embeddings +- Files +- Fine-tuning +- Image generation with dall-e-3 +- Vector stores + +**Azure OpenAI**: updated to the latest `2024-05-01-preview` service API, new features include: + +- Assistants v2 with streaming +- Improved configuration for On Your Data +- Expanded Responsible AI content filter annotations + +### Breaking Changes + +Given the nature of this update, breaking changes are extensive. Please see the README and the [OpenAI library README](https://github.com/openai/openai-dotnet/blob/main/README.md) for usage details. OpenAI's library carries forward many of the same design concepts as the previous standalone Azure.AI.OpenAI library, but considerable improvements have been made to the surface that will require significant code adjustments. 
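As a brief, hedged sketch of the companion-library pattern described above (the endpoint and deployment names are placeholders, not values from this changelog):

```csharp
using Azure.AI.OpenAI;
using Azure.Identity;
using OpenAI.Chat;

// The Azure library configures the top-level client; scenario clients come from the OpenAI library.
AzureOpenAIClient azureClient = new(
    new Uri("https://your-azure-openai-resource.com"), // placeholder endpoint
    new DefaultAzureCredential());

// Scenario clients (chat, embeddings, audio, ...) are obtained from the top-level client.
ChatClient chatClient = azureClient.GetChatClient("my-gpt-4o-deployment"); // placeholder deployment

ChatCompletion completion = chatClient.CompleteChat(
    new UserChatMessage("Say 'this is a test.'"));
Console.WriteLine(completion.Content[0].Text);
```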
+ +## 1.0.0-beta.17 (2024-05-03) + +### Features Added + +- Image input support for `gpt-4-turbo` chat completions now works with image data in addition to internet URLs. + Images may now be used as `gpt-4-turbo` message content items via one of three constructors: + - `ChatMessageImageContent(Uri)` -- the existing constructor, used for URL-based image references + - `ChatMessageImageContent(Stream,string)` -- (new) used with a stream and known MIME type (like `image/png`) + - `ChatMessageImageContent(BinaryData,string)` -- (new) used with a BinaryData instance and known MIME type + Please see the [readme example](https://github.com/Azure/azure-sdk-for-net/blob/main/sdk/openai/Azure.AI.OpenAI/README.md#chat-with-images-using-gpt-4-turbo) for more details. + +### Breaking Changes + +- Public visibility of the `ChatMessageImageUrl` type is removed to promote more flexible use of data sources in + `ChatMessageImageContent`. Code that previously created a `ChatMessageImageUrl` using a `Uri` should simply provide + the `Uri` to the `ChatMessageImageContent` constructor directly. 
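As a hedged illustration of the three constructor shapes listed in the 1.0.0-beta.17 entry above (the file path and MIME type are illustrative, not part of the original entry):

```csharp
// URL-based image reference (the existing constructor)
ChatMessageImageContent fromUri = new(new Uri("https://example.com/photo.png"));

// Stream with a known MIME type (new)
using Stream imageStream = File.OpenRead("photo.png"); // illustrative local file
ChatMessageImageContent fromStream = new(imageStream, "image/png");

// BinaryData with a known MIME type (new)
BinaryData imageData = BinaryData.FromBytes(File.ReadAllBytes("photo.png"));
ChatMessageImageContent fromData = new(imageData, "image/png");
```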
+ +## 1.0.0-beta.16 (2024-04-11) + +### Features Added + +**Audio** + +- `GetAudioTranscription()` now supports word-level timestamp granularities via `AudioTranscriptionOptions`: + - The `Verbose` option for `ResponseFormat` must be used for any timing information to be populated + - `TimestampGranularityFlags` accepts a combination of the `.Word` and `.Segment` granularity values in + `AudioTimestampGranularity`, joined when needed via the single-pipe `|` operator + - For example, `TimestampGranularityFlags = AudioTimestampGranularity.Word | AudioTimestampGranularity.Segment` + will request that both word-level and segment-level timestamps are provided on the transcription result + - If not otherwise specified, `Verbose` format will default to using segment-level timestamp information + - Corresponding word-level information is found on the `.Words` collection of `AudioTranscription`, peer to the + existing `.Segments` collection + - Note that word-level timing information incurs a small amount of additional processing latency; segment-level + timestamps do not incur this extra latency +- `GenerateSpeechFromText()` can now use `Wav` and `Pcm` values from `SpeechGenerationResponseFormat`; these new + options provide alternative uncompressed formats to `Flac` + +**Chat** + +- `ChatCompletions` and `StreamingChatCompletionsUpdate` now include the reported `Model` value from the response +- Log probability information is now included in `StreamingChatCompletionsUpdate` when `logprobs` are requested on + `GetChatCompletionsStreaming()` +- [AOAI] Custom Blocklist information in content filter results is now represented in a more structured + `ContentFilterDetailedResults` type +- [AOAI] A new `IndirectAttack` content filter entry is now present on content filter results for prompts + +### Breaking Changes + +- [AOAI] `AzureChatExtensionMessageContext`'s `RequestContentFilterResults` now uses the new + `ContentFilterDetailedResults` type, changed from the previous 
`IReadOnlyList<ContentFilterBlocklistIdResult>`. The + previous list is now present on `CustomBlockLists.Details`, supplemented with a new `CustomBlockLists.Filtered` + property. + +### Bugs Fixed + +- [AOAI] An issue that sometimes caused `StreamingChatCompletionsUpdate` instances from Azure OpenAI to inappropriately exclude + top-level information like `Id` and `CreatedAt` has been addressed + +## 1.0.0-beta.15 (2024-03-20) + +This release targets the latest `2024-03-01-preview` service API label and brings support for the `Dimensions` property when using new embedding models. + +### Features Added + +- `EmbeddingsOptions` now includes the `Dimensions` property, new to Azure OpenAI's `2024-03-01-preview` service API. + +### Bugs Fixed + +- Several issues with the `ImageGenerations` response object being treated as writeable are fixed: + - `ImageGenerations` no longer has an erroneous public constructor + - `ImageGenerations.Created` no longer has a public setter + - `ImageGenerations.Data` is now an `IReadOnlyList` instead of an `IList` + - A corresponding replacement factory method for mocks is added to `AzureOpenAIModelFactory` + +## 1.0.0-beta.14 (2024-03-04) + +### Features Added + +- Text-to-speech using OpenAI TTS models is now supported. See [OpenAI's API reference](https://platform.openai.com/docs/api-reference/audio/createSpeech) or the [Azure OpenAI quickstart](https://learn.microsoft.com/azure/ai-services/openai/text-to-speech-quickstart) for detailed overview and background information. + - The new method `GenerateSpeechFromText` exposes this capability on `OpenAIClient`. + - Text-to-speech converts text into lifelike spoken audio in a chosen voice, together with other optional configurations. + - This method works for both Azure OpenAI and non-Azure `api.openai.com` client configurations + +### Breaking Changes + +"On Your Data" changes: + +- Introduced a new type `AzureChatExtensionDataSourceResponseCitation` for a more structured representation of citation data. 
+- Correspondingly, updated `AzureChatExtensionsMessageContext`: + - Replaced `Messages` with `Citations` of type `AzureChatExtensionDataSourceResponseCitation`. + - Added `Intent` as a string type. +- Renamed "AzureCognitiveSearch" to "AzureSearch": + - `AzureCognitiveSearchChatExtensionConfiguration` is now `AzureSearchChatExtensionConfiguration`. + - `AzureCognitiveSearchIndexFieldMappingOptions` is now `AzureSearchIndexFieldMappingOptions`. +- Check the project README for updated code snippets. + +### Other Changes + +- New properties in `ChatCompletionsOptions`: + - `EnableLogProbabilities`: Allows retrieval of log probabilities (REST: `logprobs`) + - `LogProbabilitiesPerToken`: The number of most likely tokens to return per token (REST: `top_logprobs`) +- Introduced a new property in `CompletionsOptions`: + - `Suffix`: Defines the suffix that follows the completion of inserted text (REST: `suffix`) +- Image generation response now includes content filtering details (specific to Azure OpenAI endpoint): + - `ImageGenerationData.ContentFilterResults`: Information about the content filtering results. (REST: `content_filter_results`) + - `ImageGenerationData.PromptFilterResults`: Information about the content filtering category (REST: `prompt_filter_results`) + +## 1.0.0-beta.13 (2024-02-01) + +### Breaking Changes + +- Removed the setter of the `Functions` property of the `ChatCompletionsOptions` class as per the guidelines for collection properties. + +### Bugs Fixed + +- Addressed an issue with the public constructor for `ChatCompletionsFunctionToolCall` that failed to set the tool call type in the corresponding request. + +## 1.0.0-beta.12 (2023-12-15) + +Like beta.11, beta.12 is another release that brings further refinements and fixes. It remains based on the `2023-12-01-preview` service API version for Azure OpenAI and does not add any new service capabilities. 
+ +### Features Added + +**Updates for using streaming tool calls:** + +- A new .NET-specific `StreamingToolCallUpdate` type has been added to better represent streaming tool call updates + when using chat tools. + - This new type includes an explicit `ToolCallIndex` property, reflecting `index` in the REST schema, to allow + resilient deserialization of parallel function tool calling. +- A convenience constructor has been added for `ChatRequestAssistantMessage` that can automatically populate from a prior + `ChatResponseMessage` when using non-streaming chat completions. +- A public constructor has been added for `ChatCompletionsFunctionToolCall` to allow more intuitive reconstruction of + `ChatCompletionsToolCall` instances for use in `ChatRequestAssistantMessage` instances made from streaming responses. + +**Other additions:** + +- To facilitate reuse of user message contents, `ChatRequestUserMessage` now provides a public `Content` property (`string`) as well as a public `MultimodalContentItems` property (`IList<ChatMessageContentItem>`). +- A new `StreamingResponse<T>` type is introduced that implicitly exposes an `IAsyncEnumerable<T>` derived from + the underlying response. +- `OpenAIClient.GetCompletionsStreaming()` now returns a `StreamingResponse<Completions>` that may be directly + enumerated over. `StreamingCompletions`, `StreamingChoice`, and the corresponding methods are removed. +- Because Chat Completions use a distinct structure for their streaming response messages, a new + `StreamingChatCompletionsUpdate` type is introduced that encapsulates this update data. +- Correspondingly, `OpenAIClient.GetChatCompletionsStreaming()` now returns a + `StreamingResponse<StreamingChatCompletionsUpdate>` that may be enumerated over directly. + `StreamingChatCompletions`, `StreamingChatChoice`, and related methods are removed. +- For more information, please see + [the related pull request description](https://github.com/Azure/azure-sdk-for-net/pull/39347) as well as the + updated snippets in the project README. 
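The streaming rework above enables direct `await foreach` consumption. A minimal sketch, assuming an already-configured `OpenAIClient` named `client` and a populated `ChatCompletionsOptions`:

```csharp
StreamingResponse<StreamingChatCompletionsUpdate> response =
    await client.GetChatCompletionsStreamingAsync(chatCompletionsOptions);

await foreach (StreamingChatCompletionsUpdate update in response)
{
    // ContentUpdate carries the incremental text of the latest streamed chunk, when present.
    if (!string.IsNullOrEmpty(update.ContentUpdate))
    {
        Console.Write(update.ContentUpdate);
    }
}
```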
+ +#### `deploymentOrModelName` moved to `*Options.DeploymentName` + +`deploymentOrModelName` and related method parameters on `OpenAIClient` have been moved to `DeploymentName` +properties in the corresponding method options. This is intended to promote consistency across scenario, +language, and Azure/non-Azure OpenAI use. + +As an example, the following: + +```csharp +ChatCompletionsOptions chatCompletionsOptions = new() +{ + Messages = { new(ChatRole.User, "Hello, assistant!") }, +}; +Response<ChatCompletions> response = client.GetChatCompletions("gpt-4", chatCompletionsOptions); +``` + +...is now re-written as: + +```csharp +ChatCompletionsOptions chatCompletionsOptions = new() +{ + DeploymentName = "gpt-4", + Messages = { new(ChatRole.User, "Hello, assistant!") }, +}; +Response<ChatCompletions> response = client.GetChatCompletions(chatCompletionsOptions); +``` + +#### Consistency in complex method options type constructors + +With the migration of `DeploymentName` into method complex options types, these options types now follow a common pattern: each complex options type features a default constructor that allows `init`-style setting +of properties as well as a single additional constructor that accepts *all* required parameters for the corresponding +method. Existing constructors that no longer meet that "all" requirement, including those impacted by the addition of +`DeploymentName`, have been removed. The "convenience" constructors that represented required parameter data +differently -- for example, `EmbeddingsOptions(string)` -- have also been removed in favor of the consistent choice +between `init`-style property setting and directly providing all required parameters. 
+ +More exhaustively, *removed* are: + +- `AudioTranscriptionOptions(BinaryData)` +- `AudioTranslationOptions(BinaryData)` +- `ChatCompletionsOptions(IEnumerable)` +- `CompletionsOptions(IEnumerable)` +- `EmbeddingsOptions(string)` +- `EmbeddingsOptions(IEnumerable)` + +And *added* as replacements are: + +- `AudioTranscriptionOptions(string, BinaryData)` +- `AudioTranslationOptions(string, BinaryData)` +- `ChatCompletionsOptions(string, IEnumerable)` +- `CompletionsOptions(string, IEnumerable)` +- `EmbeddingsOptions(string, IEnumerable)` + +#### Embeddings now represented as `ReadOnlyMemory<float>` + +Changed the representation of embeddings (specifically, the type of the `Embedding` property of the `EmbeddingItem` class) +from `IReadOnlyList<float>` to `ReadOnlyMemory<float>` as part of a broader effort to establish consistency across the +.NET ecosystem. + +#### `SearchKey` and `EmbeddingKey` properties replaced by `SetSearchKey` and `SetEmbeddingKey` methods + +Replaced the `SearchKey` and `EmbeddingKey` properties of the `AzureCognitiveSearchChatExtensionConfiguration` class with +new `SetSearchKey` and `SetEmbeddingKey` methods respectively. These methods simplify the configuration of the Azure Cognitive +Search chat extension by receiving a plain string instead of an `AzureKeyCredential`, promote more sensible key and secret +management, and align with the Azure SDK guidelines. + +## 1.0.0-beta.8 (2023-09-21) + +### Features Added + +- Audio Transcription and Audio Translation using OpenAI Whisper models are now supported. See [OpenAI's API + reference](https://platform.openai.com/docs/api-reference/audio) or the [Azure OpenAI + quickstart](https://learn.microsoft.com/azure/ai-services/openai/whisper-quickstart) for detailed overview and + background information. 
+ - The new methods `GetAudioTranscription` and `GetAudioTranslation` expose these capabilities on `OpenAIClient` + - Transcription produces text in the primary, supported, spoken input language of the audio data provided, together + with any optional associated metadata + - Translation produces text, translated to English and reflective of the audio data provided, together with any + optional associated metadata + - These methods work for both Azure OpenAI and non-Azure `api.openai.com` client configurations + +### Breaking Changes + +- The underlying representation of `PromptFilterResults` (for `Completions` and `ChatCompletions`) has had its response + body key changed from `prompt_annotations` to `prompt_filter_results` +- **Prior versions of the `Azure.AI.OpenAI` library may no longer populate `PromptFilterResults` as expected** and it's + highly recommended to upgrade to this version if the use of Azure OpenAI content moderation annotations for input data + is desired +- If a library version upgrade is not immediately possible, it's advised to use `Response.GetRawResponse()` and manually + extract the `prompt_filter_results` object from the top level of the `Completions` or `ChatCompletions` response `Content` + payload + +### Bugs Fixed + +- Support for the described breaking change for `PromptFilterResults` was added and this library version will now again + deserialize `PromptFilterResults` appropriately +- `PromptFilterResults` and `ContentFilterResults` are now exposed on the result classes for streaming Completions and + Chat Completions. `Streaming(Chat)Completions.PromptFilterResults` will report an index-sorted list of all prompt + annotations received so far while `Streaming(Chat)Choice.ContentFilterResults` will reflect the latest-received + content annotations that were populated and received while streaming + +## 1.0.0-beta.7 (2023-08-25) + +### Features Added + +- The Azure OpenAI "using your own data" feature is now supported. 
See [the Azure OpenAI using your own data quickstart](https://learn.microsoft.com/azure/ai-services/openai/use-your-data-quickstart) for conceptual background and detailed setup instructions. + - Azure OpenAI chat extensions are configured via a new `AzureChatExtensionsOptions` property on `ChatCompletionsOptions`. When an `AzureChatExtensionsOptions` is provided, configured requests will only work with clients configured to use the Azure OpenAI service, as the capabilities are unique to that service target. + - `AzureChatExtensionsOptions` then has `AzureChatExtensionConfiguration` instances added to its `Extensions` property, with these instances representing the supplementary information needed for Azure OpenAI to use desired data sources to supplement chat completions behavior. + - `ChatChoice` instances on a `ChatCompletions` response value that used chat extensions will then also have their `Message` property supplemented by an `AzureChatExtensionMessageContext` instance. This context contains a collection of supplementary `Messages` that describe the behavior of extensions that were used and supplementary response data, such as citations, provided along with the response. + - See the README sample snippet for a simplified example of request/response use with "using your own data" + +## 1.0.0-beta.6 (2023-07-19) + +### Features Added + +- DALL-E image generation is now supported. See [the Azure OpenAI quickstart](https://learn.microsoft.com/azure/cognitive-services/openai/dall-e-quickstart) for conceptual background and detailed setup instructions. + - `OpenAIClient` gains a new `GetImageGenerations` method that accepts an `ImageGenerationOptions` and produces an `ImageGenerations` via its response. This response object encapsulates the temporary storage location of generated images for future retrieval. + - In contrast to other capabilities, DALL-E image generation does not require explicit creation or specification of a deployment or model. 
Its surface as such does not include this concept. +- Functions for chat completions are now supported: see [OpenAI's blog post on the topic](https://openai.com/blog/function-calling-and-other-api-updates) for much more detail. + - A list of `FunctionDefinition` objects may be populated on `ChatCompletionsOptions` via its `Functions` property. These definitions include a name and description together with a serialized JSON Schema representation of its parameters; these parameters can be generated easily via `BinaryData.FromObjectAsJson` with dynamic objects -- see the README for example usage. + - **NOTE**: Chat Functions requires a minimum of the `-0613` model versions for `gpt-4` and `gpt-3.5-turbo`/`gpt-35-turbo`. Please ensure you're using these later model versions, as Functions are not supported with older model revisions. For Azure OpenAI, you can update a deployment's model version or create a new model deployment with an updated version via the Azure AI Studio interface, also accessible through Azure Portal. +- (Azure OpenAI specific) Completions and Chat Completions responses now include embedded content filter annotations for prompts and responses +- A new `Azure.AI.OpenAI.AzureOpenAIModelFactory` is now present for mocking. + +### Breaking Changes + +- `ChatMessage`'s one-parameter constructor has been replaced with a no-parameter constructor. Please replace any hybrid construction with one of these two options that either completely rely on property setting or completely rely on constructor parameters. + +## 1.0.0-beta.5 (2023-03-22) + +This is a significant release that brings GPT-4 model support (chat) and the ability to use non-Azure OpenAI (not just Azure OpenAI resources) to the .NET library. It also makes a number of clarifying adjustments to request properties for completions. + +### Features Added +- GPT-4 models are now supported via new `GetChatCompletions` and `GetChatCompletionsStreaming` methods on `OpenAIClient`. 
These use the `/chat/completions` REST endpoint and represent the [OpenAI Chat messages format](https://platform.openai.com/docs/guides/chat). + - The `gpt-3.5-turbo` model can also be used with Chat completions; prior models like `text-davinci-003` cannot be used with Chat completions and should still use the `GetCompletions` methods. +- Support for using OpenAI's endpoint via valid API keys obtained from https://platform.openai.com has been added. `OpenAIClient` has new constructors that accept an OpenAI API key instead of an Azure endpoint URI and credential; once configured, Completions, Chat Completions, and Embeddings can be used with identical calling patterns. + +### Breaking Changes + +A number of Completions request properties have been renamed and further documented for clarity. +- `CompletionsOptions` (REST request payload): + - `CacheLevel` and `CompletionConfig` are removed. + - `LogitBias` (REST: `logit_bias`), previously a `<string, int>` Dictionary, is now an `<int, int>` Dictionary named `TokenSelectionBiases`. + - `LogProbability` (REST: `logprobs`) is renamed to `LogProbabilityCount`. + - `Model` is removed (in favor of the method-level parameter for deployment or model name) + - `Prompt` is renamed to `Prompts` + - `SnippetCount` (REST: `n`) is renamed to `ChoicesPerPrompt`. + - `Stop` is renamed to `StopSequences`. +- Method and property documentation are broadly updated, with renames from REST schema (like `n` becoming `ChoicesPerPrompt`) specifically noted in `<remarks>`. + +## 1.0.0-beta.4 (2023-02-23) + +### Bugs fixed +- Addressed issues that sometimes caused `beta.3`'s new `GetStreamingCompletions` method to execute indefinitely + +## 1.0.0-beta.3 (2023-02-17) + +### Features Added +- Support for streaming Completions responses, a capability that parallels setting `stream=true` in the REST API, is now available. A new `GetStreamingCompletions` method on `OpenAIClient` provides a `StreamingCompletions` response value. 
This, in turn, exposes a collection of `StreamingChoice` objects as an `IAsyncEnumerable` that will update as a streamed response progresses. `StreamingChoice` further exposes an `IAsyncEnumerable` of streaming text elements via a `GetTextStreaming` method. Used together, this facilitates providing faster, live-updating responses for Completions via the convenient `await foreach` pattern. +- ASP.NET integration via `Microsoft.Extensions.Azure`'s `IAzureClientBuilder` interfaces is available. `OpenAIClient` is now a supported client type for these extension methods. + +### Breaking Changes +- `CompletionsLogProbability.TokenLogProbability`, available on `Choice` elements of a `Completions` response value's `.Choices` collection when a non-zero `LogProbability` value is provided via `CompletionsOptions`, is now an `IReadOnlyList<float?>` vs. its previous type of `IReadOnlyList<float>`. This nullability addition accommodates circumstances where some tokens produce expected null values in log probability arrays. + +### Bugs Fixed +- Setting `CompletionsOptions.Echo` to true while also setting a non-zero `CompletionsOptions.LogProbability` no longer results in a deserialization error during response processing. + +## 1.0.0-beta.2 (2023-02-08) +### Bugs Fixed +- Adjusted bad name `finishReason` to `finish_reason` in deserializer class + +## 1.0.0-beta.1 (2023-02-06) + +### Features Added + +- This is the initial preview release for Azure OpenAI inference capabilities, including completions and embeddings. 
diff --git a/.dotnet.azure/Directory.Build.props b/.dotnet.azure/Directory.Build.props new file mode 100644 index 000000000..0d1b0a607 --- /dev/null +++ b/.dotnet.azure/Directory.Build.props @@ -0,0 +1,43 @@ + + + $(MSBuildThisFileDirectory) + $(MSBuildThisFileDirectory)eng + $(RepoRoot)src + $(RepoRoot)src/SDKs + true + true + true + + + + + Debug + AnyCPU + $(Platform) + + + + + $(RepoRoot)artifacts\ + $(ArtifactsDir)obj\ + $(ArtifactsDir)bin\ + $(ArtifactsDir)packages\$(Configuration)\ + + $(MSBuildProjectName) + + $([System.IO.Path]::GetFullPath('$(ArtifactsBinDir)$(OutDirName)\')) + $(BaseOutputPath)$(Configuration)\ + $(BaseOutputPath)$(PlatformName)\$(Configuration)\ + + $([System.IO.Path]::GetFullPath('$(ArtifactsObjDir)$(OutDirName)\')) + $(BaseIntermediateOutputPath)$(Configuration)\ + $(BaseIntermediateOutputPath)$(PlatformName)\$(Configuration)\ + + $(ArtifactsPackagesDir)/$(MSBuildProjectName) + + + import-required-properties + + + + diff --git a/.dotnet.azure/Directory.Build.targets b/.dotnet.azure/Directory.Build.targets new file mode 100644 index 000000000..3d4330c2e --- /dev/null +++ b/.dotnet.azure/Directory.Build.targets @@ -0,0 +1,10 @@ + + + + <_Parameter1>SourcePath + <_Parameter2>$(MSBuildProjectDirectory) + + + + + diff --git a/.dotnet.azure/README.md b/.dotnet.azure/README.md new file mode 100644 index 000000000..3a201aa89 --- /dev/null +++ b/.dotnet.azure/README.md @@ -0,0 +1,536 @@ +# Azure OpenAI client library for .NET + +The Azure OpenAI client library for .NET is a companion to the official [OpenAI client library for .NET](https://github.com/openai/openai-dotnet). The Azure OpenAI library configures a client for use with Azure OpenAI and provides additional strongly typed extension support for request and response models specific to Azure OpenAI scenarios. + +Azure OpenAI is a managed service that allows developers to deploy, tune, and generate content from OpenAI models on Azure resources. 
+ + [Source code](https://github.com/Azure/azure-sdk-for-net/blob/main/sdk/openai/Azure.AI.OpenAI/src) | [Package (NuGet)](https://www.nuget.org/packages/Azure.AI.OpenAI) | [API reference documentation](https://learn.microsoft.com/azure/cognitive-services/openai/reference) | [Product documentation](https://learn.microsoft.com/azure/cognitive-services/openai/) | [Samples](https://github.com/Azure/azure-sdk-for-net/blob/main/sdk/openai/Azure.AI.OpenAI/tests/Samples) + +## Getting started + +### Prerequisites + +To use an Azure OpenAI resource, you must have: + +1. An [Azure subscription](https://azure.microsoft.com/free/dotnet/) +1. [Azure OpenAI access](https://learn.microsoft.com/azure/cognitive-services/openai/overview#how-do-i-get-access-to-azure-openai) + +These prerequisites allow you to create an Azure OpenAI resource and get both a connection URL and API keys. For more information, see [Quickstart: Get started generating text using Azure OpenAI Service](https://learn.microsoft.com/azure/cognitive-services/openai/quickstart). + +### Install the package + +Install the client library for .NET with [NuGet](https://www.nuget.org/): + +```dotnetcli +dotnet add package Azure.AI.OpenAI --prerelease +``` + +The `Azure.AI.OpenAI` package builds on the [official OpenAI package](https://www.nuget.org/packages/OpenAI), which is included as a dependency. + +### Authenticate the client + +To interact with Azure OpenAI or OpenAI, create an instance of [AzureOpenAIClient][azure_openai_client_class] with one of the following approaches: + +- [Create client with a Microsoft Entra credential](#create-client-with-a-microsoft-entra-credential) **(Recommended)** +- [Create client with an API key](#create-client-with-an-api-key) + +#### Create client with a Microsoft Entra credential + +A secure, keyless authentication approach is to use Microsoft Entra ID (formerly Azure Active Directory) via the [Azure Identity library][azure_identity]. To use the library: + +1. 
Install the [Azure.Identity package](https://www.nuget.org/packages/Azure.Identity): + + ```dotnetcli + dotnet add package Azure.Identity + ``` + +1. Use the desired credential type from the library. For example, [DefaultAzureCredential][azure_identity_dac]: + +```C# Snippet:ConfigureClient:WithEntra +AzureOpenAIClient azureClient = new( + new Uri("https://your-azure-openai-resource.com"), + new DefaultAzureCredential()); +ChatClient chatClient = azureClient.GetChatClient("my-gpt-4o-mini-deployment"); +``` + +##### Configure client for Azure sovereign cloud + +If your Microsoft Entra credentials are issued by an entity other than Azure Public Cloud, you can set the `Audience` property on `AzureOpenAIClientOptions` to modify the token authorization scope used for requests. + +For example, the following will configure the client to authenticate tokens via Azure Government Cloud, using `https://cognitiveservices.azure.us/.default` as the authorization scope: + +```C# Snippet:ConfigureClient:GovernmentAudience +AzureOpenAIClientOptions options = new() +{ + Audience = AzureOpenAIAudience.AzureGovernment, +}; +AzureOpenAIClient azureClient = new( + new Uri("https://your-azure-openai-resource.com"), + new DefaultAzureCredential(), + options); +ChatClient chatClient = azureClient.GetChatClient("my-gpt-4o-mini-deployment"); +``` + +For a custom or non-enumerated value, the authorization scope can be provided directly as the value for `Audience`: + +```C# Snippet:ConfigureClient:CustomAudience +AzureOpenAIClientOptions optionsWithCustomAudience = new() +{ + Audience = "https://cognitiveservices.azure.com/.default", +}; +``` + +#### Create client with an API key + +While not as secure as Microsoft Entra-based authentication, it's possible to authenticate using a client subscription key: + +```C# Snippet:ConfigureClient:WithAOAITopLevelClient +string keyFromEnvironment = Environment.GetEnvironmentVariable("AZURE_OPENAI_API_KEY"); + +AzureOpenAIClient azureClient = new( + new
Uri("https://your-azure-openai-resource.com"), + new AzureKeyCredential(keyFromEnvironment)); +ChatClient chatClient = azureClient.GetChatClient("my-gpt-35-turbo-deployment"); +``` + +## Key concepts + +### Assistants + +See [OpenAI's Assistants API overview](https://platform.openai.com/docs/assistants/overview). + +### Audio transcription/translation and text-to-speech generation + +See [OpenAI Capabilities: Speech to text](https://platform.openai.com/docs/guides/speech-to-text/speech-to-text). + +### Batch + +See [OpenAI's Batch API guide](https://platform.openai.com/docs/guides/batch). + +### Chat completion + +Chat models take a list of messages as input and return a model-generated message as output. Although the chat format is +designed to make multi-turn conversations easy, it's also useful for single-turn tasks without any conversation. + +See [OpenAI Capabilities: Chat completion](https://platform.openai.com/docs/guides/text-generation/chat-completions-api). + +### Image generation + +See [OpenAI Capabilities: Image generation](https://platform.openai.com/docs/guides/images/introduction). + +### Files + +See [OpenAI's Files API reference](https://platform.openai.com/docs/api-reference/files). + +### Text embeddings + +See [OpenAI Capabilities: Embeddings](https://platform.openai.com/docs/guides/embeddings/embeddings). + +### Thread safety + +We guarantee that all client instance methods are thread-safe and independent of each other ([guideline](https://azure.github.io/azure-sdk/dotnet_introduction.html#dotnet-service-methods-thread-safety)). This ensures that the recommendation of reusing client instances is always safe, even across threads. 
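
The thread-safety guarantee above means a single client instance can safely serve overlapping requests, so there is no need to construct a new client per operation. As a brief illustrative sketch (not an official sample; it reuses the hypothetical endpoint and deployment name from the examples in this README and assumes the usual `Azure.AI.OpenAI`, `Azure.Identity`, and `OpenAI.Chat` usings):

```C#
AzureOpenAIClient azureClient = new(
    new Uri("https://your-azure-openai-resource.com"),
    new DefaultAzureCredential());
ChatClient chatClient = azureClient.GetChatClient("my-gpt-35-turbo-deployment");

// The same ChatClient instance can be shared across concurrent operations.
Task<ClientResult<ChatCompletion>>[] pendingCompletions =
[
    chatClient.CompleteChatAsync(new UserChatMessage("Write a haiku about the sea.")),
    chatClient.CompleteChatAsync(new UserChatMessage("Write a limerick about the sea.")),
];
await Task.WhenAll(pendingCompletions);
```

Reusing one client this way also allows connections and other pipeline resources to be shared across requests.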
+ +### Additional concepts + + +[Client options](https://github.com/Azure/azure-sdk-for-net/blob/main/sdk/core/Azure.Core/README.md#configuring-service-clients-using-clientoptions) | +[Accessing the response](https://github.com/Azure/azure-sdk-for-net/blob/main/sdk/core/Azure.Core/README.md#accessing-http-response-details-using-responset) | +[Long-running operations](https://github.com/Azure/azure-sdk-for-net/blob/main/sdk/core/Azure.Core/README.md#consuming-long-running-operations-using-operationt) | +[Handling failures](https://github.com/Azure/azure-sdk-for-net/blob/main/sdk/core/Azure.Core/README.md#reporting-errors-requestfailedexception) | +[Diagnostics](https://github.com/Azure/azure-sdk-for-net/blob/main/sdk/core/Azure.Core/samples/Diagnostics.md) | +[Mocking](https://learn.microsoft.com/dotnet/azure/sdk/unit-testing-mocking) | +[Client lifetime](https://devblogs.microsoft.com/azure-sdk/lifetime-management-and-thread-safety-guarantees-of-azure-sdk-net-clients/) + + +## Examples + +You can familiarize yourself with different APIs using [Samples from OpenAI's .NET library](https://github.com/openai/openai-dotnet/tree/main/examples) or [Azure.AI.OpenAI-specific samples](https://github.com/Azure/azure-sdk-for-net/tree/main/sdk/openai/Azure.AI.OpenAI/tests/Samples). Most OpenAI capabilities are available on both Azure OpenAI and OpenAI using the same scenario clients and methods, so not all scenarios are redundantly covered here. 
+ +### Get a chat completion + +```C# Snippet:SimpleChatResponse +AzureOpenAIClient azureClient = new( + new Uri("https://your-azure-openai-resource.com"), + new DefaultAzureCredential()); +ChatClient chatClient = azureClient.GetChatClient("my-gpt-35-turbo-deployment"); + +ChatCompletion completion = chatClient.CompleteChat( + [ + // System messages represent instructions or other guidance about how the assistant should behave + new SystemChatMessage("You are a helpful assistant that talks like a pirate."), + // User messages represent user input, whether historical or the most recent input + new UserChatMessage("Hi, can you help me?"), + // Assistant messages in a request represent conversation history for responses + new AssistantChatMessage("Arrr! Of course, me hearty! What can I do for ye?"), + new UserChatMessage("What's the best way to train a parrot?"), + ]); + +Console.WriteLine($"{completion.Role}: {completion.Content[0].Text}"); +``` + +### Stream chat messages + +Streaming chat completions use the `CompleteChatStreaming` and `CompleteChatStreamingAsync` methods, which return a `CollectionResult<StreamingChatCompletionUpdate>` or `AsyncCollectionResult<StreamingChatCompletionUpdate>` instead of a `ClientResult`. These result collections can be iterated over using `foreach` or `await foreach`, with each update arriving as new data is available from the streamed response. + +```C# Snippet:StreamChatMessages +AzureOpenAIClient azureClient = new( + new Uri("https://your-azure-openai-resource.com"), + new DefaultAzureCredential()); +ChatClient chatClient = azureClient.GetChatClient("my-gpt-35-turbo-deployment"); + +CollectionResult<StreamingChatCompletionUpdate> completionUpdates = chatClient.CompleteChatStreaming( + [ + new SystemChatMessage("You are a helpful assistant that talks like a pirate."), + new UserChatMessage("Hi, can you help me?"), + new AssistantChatMessage("Arrr! Of course, me hearty!
What can I do for ye?"), + new UserChatMessage("What's the best way to train a parrot?"), + ]); + +foreach (StreamingChatCompletionUpdate completionUpdate in completionUpdates) +{ + foreach (ChatMessageContentPart contentPart in completionUpdate.ContentUpdate) + { + Console.Write(contentPart.Text); + } +} +``` + +### Use chat tools + +**Tools** extend chat completions by allowing an assistant to invoke defined functions and other capabilities in the +process of fulfilling a chat completions request. To use chat tools, start by defining a function tool. Here, we root the tools in local methods for clarity and convenience: + +```C# Snippet:ChatTools:DefineTool +static string GetCurrentLocation() +{ + // Call the location API here. + return "San Francisco"; +} + +static string GetCurrentWeather(string location, string unit = "celsius") +{ + // Call the weather API here. + return $"31 {unit}"; +} + +ChatTool getCurrentLocationTool = ChatTool.CreateFunctionTool( + functionName: nameof(GetCurrentLocation), + functionDescription: "Get the user's current location" +); + +ChatTool getCurrentWeatherTool = ChatTool.CreateFunctionTool( + functionName: nameof(GetCurrentWeather), + functionDescription: "Get the current weather in a given location", + functionParameters: BinaryData.FromString(""" + { + "type": "object", + "properties": { + "location": { + "type": "string", + "description": "The city and state, e.g. Boston, MA" + }, + "unit": { + "type": "string", + "enum": [ "celsius", "fahrenheit" ], + "description": "The temperature unit to use. Infer this from the specified location." 
+ } + }, + "required": [ "location" ] + } + """) +); +``` + +With the tool defined, include that new definition in the options for a chat completions request: + +```C# Snippet:ChatTools:RequestWithFunctions +ChatCompletionOptions options = new() +{ + Tools = { getCurrentLocationTool, getCurrentWeatherTool }, +}; + +List<ChatMessage> conversationMessages = + [ + new UserChatMessage("What's the weather like in Boston?"), + ]; +ChatCompletion completion = chatClient.CompleteChat(conversationMessages, options); +``` + +When the assistant decides that one or more tools should be used, the response message includes one or more "tool +calls" that must all be resolved via "tool messages" on the subsequent request. This resolution of tool calls into +new request messages can be thought of as a sort of "callback" for chat completions. + +To provide tool call resolutions to the assistant to allow the request to continue, provide all prior historical +context -- including the original system and user messages, the response from the assistant that included the tool +calls, and the tool messages that resolved each of those tools -- when making a subsequent request. + +```C# Snippet:ChatTools:HandleToolCalls +// Purely for convenience and clarity, this standalone local method handles tool call responses. +string GetToolCallContent(ChatToolCall toolCall) +{ + if (toolCall.FunctionName == getCurrentWeatherTool.FunctionName) + { + // Validate arguments before using them; they're not always guaranteed to be valid JSON!
+ try + { + using JsonDocument argumentsDocument = JsonDocument.Parse(toolCall.FunctionArguments); + if (!argumentsDocument.RootElement.TryGetProperty("location", out JsonElement locationElement)) + { + // Handle missing required "location" argument + } + else + { + string location = locationElement.GetString(); + if (argumentsDocument.RootElement.TryGetProperty("unit", out JsonElement unitElement)) + { + return GetCurrentWeather(location, unitElement.GetString()); + } + else + { + return GetCurrentWeather(location); + } + } + } + catch (JsonException) + { + // Handle the JsonException (bad arguments) here + } + } + // Handle unexpected tool calls + throw new NotImplementedException(); +} + +if (completion.FinishReason == ChatFinishReason.ToolCalls) +{ + // Add a new assistant message to the conversation history that includes the tool calls + conversationMessages.Add(new AssistantChatMessage(completion)); + + foreach (ChatToolCall toolCall in completion.ToolCalls) + { + conversationMessages.Add(new ToolChatMessage(toolCall.Id, GetToolCallContent(toolCall))); + } + + // Now make a new request with all the messages thus far, including the original +} +``` + +When using tool calls with streaming responses, accumulate tool call details much like you'd accumulate the other +portions of streamed choices, in this case using the accumulated `StreamingToolCallUpdate` data to instantiate new +tool call messages for assistant message history. Note that the model will ignore `ChoiceCount` when providing tools +and that all streamed responses should map to a single, common choice index in the range of `[0..(ChoiceCount - 1)]`. 
+ +```C# Snippet:ChatTools:StreamingChatTools +Dictionary<int, string> toolCallIdsByIndex = []; +Dictionary<int, string> functionNamesByIndex = []; +Dictionary<int, StringBuilder> functionArgumentBuildersByIndex = []; +StringBuilder contentBuilder = new(); + +foreach (StreamingChatCompletionUpdate streamingChatUpdate + in chatClient.CompleteChatStreaming(conversationMessages, options)) +{ + foreach (ChatMessageContentPart contentPart in streamingChatUpdate.ContentUpdate) + { + contentBuilder.Append(contentPart.Text); + } + foreach (StreamingChatToolCallUpdate toolCallUpdate in streamingChatUpdate.ToolCallUpdates) + { + if (!string.IsNullOrEmpty(toolCallUpdate.Id)) + { + toolCallIdsByIndex[toolCallUpdate.Index] = toolCallUpdate.Id; + } + if (!string.IsNullOrEmpty(toolCallUpdate.FunctionName)) + { + functionNamesByIndex[toolCallUpdate.Index] = toolCallUpdate.FunctionName; + } + if (!string.IsNullOrEmpty(toolCallUpdate.FunctionArgumentsUpdate)) + { + StringBuilder argumentsBuilder + = functionArgumentBuildersByIndex.TryGetValue(toolCallUpdate.Index, out StringBuilder existingBuilder) + ?
existingBuilder + : new(); + argumentsBuilder.Append(toolCallUpdate.FunctionArgumentsUpdate); + functionArgumentBuildersByIndex[toolCallUpdate.Index] = argumentsBuilder; + } + } +} + +List<ChatToolCall> toolCalls = []; +foreach (KeyValuePair<int, string> indexToIdPair in toolCallIdsByIndex) +{ + toolCalls.Add(ChatToolCall.CreateFunctionToolCall( + indexToIdPair.Value, + functionNamesByIndex[indexToIdPair.Key], + functionArgumentBuildersByIndex[indexToIdPair.Key].ToString())); +} + +conversationMessages.Add(new AssistantChatMessage(toolCalls, contentBuilder.ToString())); + +// Placeholder: each tool call must be resolved, like in the non-streaming case +string GetToolCallOutput(ChatToolCall toolCall) => null; + +foreach (ChatToolCall toolCall in toolCalls) +{ + conversationMessages.Add(new ToolChatMessage(toolCall.Id, GetToolCallOutput(toolCall))); +} + +// Repeat with the history and all tool call resolution messages added +``` + +### Use your own data with Azure OpenAI + +The use your own data feature is unique to Azure OpenAI and won't work with a client configured to use the non-Azure service. +See [the Azure OpenAI using your own data quickstart](https://learn.microsoft.com/azure/ai-services/openai/use-your-data-quickstart) for conceptual background and detailed setup instructions. + +**NOTE:** The concurrent use of [Chat Functions](#use-chat-tools) and Azure Chat Extensions on a single request isn't yet supported. Supplying both will result in the Chat Functions information being ignored and the operation behaving as if only the Azure Chat Extensions were provided. To address this limitation, consider separating the evaluation of Chat Functions and Azure Chat Extensions across multiple requests in your solution design. + +```C# Snippet:ChatUsingYourOwnData +// Extension methods to use data sources with options are subject to SDK surface changes. Suppress the +// warning to acknowledge this and use the subject-to-change AddDataSource method.
+#pragma warning disable AOAI001 + +ChatCompletionOptions options = new(); +options.AddDataSource(new AzureSearchChatDataSource() +{ + Endpoint = new Uri("https://your-search-resource.search.windows.net"), + IndexName = "contoso-products-index", + Authentication = DataSourceAuthentication.FromApiKey( + Environment.GetEnvironmentVariable("OYD_SEARCH_KEY")), +}); + +ChatCompletion completion = chatClient.CompleteChat( + [ + new UserChatMessage("What are the best-selling Contoso products this month?"), + ], + options); + +AzureChatMessageContext onYourDataContext = completion.GetAzureMessageContext(); + +if (onYourDataContext?.Intent is not null) +{ + Console.WriteLine($"Intent: {onYourDataContext.Intent}"); +} +foreach (AzureChatCitation citation in onYourDataContext?.Citations ?? []) +{ + Console.WriteLine($"Citation: {citation.Content}"); +} +``` + +### Use Assistants and stream a run + +[Assistants](https://platform.openai.com/docs/assistants/overview) provide a stateful, service-persisted conversational +model that can be enriched with a larger array of tools than Chat Completions. + +Creating an `AssistantClient` is similar to other scenario clients. An important difference is that Assistants features +are marked as `[Experimental]` to reflect the API's beta status, and thus you'll need to suppress the corresponding +warning to instantiate a client. This can be done in the `.csproj` file via the `` element or, as below, in +the code itself with a `#pragma` directive. + +```C# Snippet:Assistants:CreateClient +AzureOpenAIClient azureClient = new( + new Uri("https://your-azure-openai-resource.com"), + new DefaultAzureCredential()); + +// The Assistants feature area is in beta, with API specifics subject to change. +// Suppress the [Experimental] warning via .csproj or, as here, in the code to acknowledge. 
+#pragma warning disable OPENAI001 +AssistantClient assistantClient = azureClient.GetAssistantClient(); +``` + +With a client, you can then create Assistants, Threads, and new Messages on a thread in preparation to start a run. As is the case for other shared API surfaces, you should use an Azure OpenAI model deployment name wherever a model name is requested. + +```C# Snippet:Assistants:PrepareToRun +Assistant assistant = await assistantClient.CreateAssistantAsync( + model: "my-gpt-4o-deployment", + new AssistantCreationOptions() + { + Name = "My Friendly Test Assistant", + Instructions = "You politely help with math questions. Use the code interpreter tool when asked to " + + "visualize numbers.", + Tools = { ToolDefinition.CreateCodeInterpreter() }, + }); +ThreadInitializationMessage initialMessage = new( + MessageRole.User, + [ + "Hi, Assistant! Draw a graph for a line with a slope of 4 and y-intercept of 9." + ]); +AssistantThread thread = await assistantClient.CreateThreadAsync(new ThreadCreationOptions() +{ + InitialMessages = { initialMessage }, +}); +``` + +You can then start a run and stream updates as they arrive using the `Streaming` method variants. Handle the updates +you're interested in by checking each update's enumerated kind and/or by matching it against one of the derived types +of the streaming update class, as shown here for content: + +```C# Snippet:Assistants:StreamRun +RunCreationOptions runOptions = new() +{ + AdditionalInstructions = "When possible, talk like a pirate." +}; +await foreach (StreamingUpdate streamingUpdate + in assistantClient.CreateRunStreamingAsync(thread, assistant, runOptions)) +{ + if (streamingUpdate.UpdateKind == StreamingUpdateReason.RunCreated) + { + Console.WriteLine($"--- Run started!
---"); + } + else if (streamingUpdate is MessageContentUpdate contentUpdate) + { + Console.Write(contentUpdate.Text); + if (contentUpdate.ImageFileId is not null) + { + Console.WriteLine($"[Image content file ID: {contentUpdate.ImageFileId}]"); + } + } +} +``` + +Remember that things like Assistants, Threads, and Vector Stores are persistent resources. You can save their IDs to +reuse them later or, as demonstrated below, delete them when no longer desired. + +```C# Snippet:Assistants:Cleanup +// Optionally, delete persistent resources that are no longer needed. +_ = await assistantClient.DeleteAssistantAsync(assistant); +_ = await assistantClient.DeleteThreadAsync(thread); +``` + +## Next steps + +## Troubleshooting + +When you interact with Azure OpenAI using the .NET SDK, errors returned by the service correspond to the same HTTP status codes returned for [REST API][openai_rest] requests. + +For example, if you try to create a client using an endpoint that doesn't match your Azure OpenAI Resource endpoint, a `404` error is returned, indicating `Resource Not Found`. + +## Contributing + +See the [OpenAI CONTRIBUTING.md][openai_contrib] for details on building, testing, and contributing to this library. + +This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit [cla.microsoft.com][cla]. + +When you submit a pull request, a CLA-bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., label, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA. + +This project has adopted the [Microsoft Open Source Code of Conduct][code_of_conduct].
For more information, see the [Code of Conduct FAQ][code_of_conduct_faq] or contact [opencode@microsoft.com][email_opencode] with any additional questions or comments. + + +[azure_identity]: https://learn.microsoft.com/dotnet/api/overview/azure/identity-readme?view=azure-dotnet +[azure_identity_dac]: https://learn.microsoft.com/dotnet/api/azure.identity.defaultazurecredential?view=azure-dotnet +[msdocs_openai_chat_quickstart]: https://learn.microsoft.com/azure/ai-services/openai/chatgpt-quickstart?pivots=programming-language-csharp +[msdocs_openai_dalle_quickstart]: https://learn.microsoft.com/azure/ai-services/openai/dall-e-quickstart?pivots=programming-language-csharp +[msdocs_openai_whisper_quickstart]: https://learn.microsoft.com/azure/ai-services/openai/whisper-quickstart +[msdocs_openai_tts_quickstart]: https://learn.microsoft.com/azure/ai-services/openai/text-to-speech-quickstart +[msdocs_openai_completion]: https://learn.microsoft.com/azure/cognitive-services/openai/how-to/completions +[msdocs_openai_embedding]: https://learn.microsoft.com/azure/cognitive-services/openai/concepts/understand-embeddings +[style-guide-msft]: https://docs.microsoft.com/style-guide/capitalization +[style-guide-cloud]: https://aka.ms/azsdk/cloud-style-guide +[azure_openai_client_class]: https://github.com/Azure/azure-sdk-for-net/blob/main/sdk/openai/Azure.AI.OpenAI/src/Custom/AzureOpenAIClient.cs +[openai_rest]: https://learn.microsoft.com/azure/cognitive-services/openai/reference +[azure_openai_completions_docs]: https://learn.microsoft.com/azure/cognitive-services/openai/how-to/completions +[azure_openai_embeddgings_docs]: https://learn.microsoft.com/azure/cognitive-services/openai/concepts/understand-embeddings +[openai_contrib]: https://github.com/Azure/azure-sdk-for-net/blob/main/CONTRIBUTING.md +[cla]: https://cla.microsoft.com +[code_of_conduct]: https://opensource.microsoft.com/codeofconduct/ +[code_of_conduct_faq]: https://opensource.microsoft.com/codeofconduct/faq/ 
+[email_opencode]: mailto:opencode@microsoft.com + +![Impressions](https://azure-sdk-impressions.azurewebsites.net/api/impressions/azure-sdk-for-net/sdk/openai/Azure.AI.OpenAI/README.png) \ No newline at end of file diff --git a/.dotnet.azure/assets.json b/.dotnet.azure/assets.json new file mode 100644 index 000000000..39d86235b --- /dev/null +++ b/.dotnet.azure/assets.json @@ -0,0 +1,6 @@ +{ + "AssetsRepo": "Azure/azure-sdk-assets", + "AssetsRepoPrefixPath": "net", + "TagPrefix": "net/openai/Azure.AI.OpenAI", + "Tag": "net/openai/Azure.AI.OpenAI_23ae923738" +} diff --git a/.dotnet.azure/eng/CodeAnalysis.ruleset b/.dotnet.azure/eng/CodeAnalysis.ruleset new file mode 100644 index 000000000..d6ade187a --- /dev/null +++ b/.dotnet.azure/eng/CodeAnalysis.ruleset @@ -0,0 +1,404 @@ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + \ No newline at end of file diff --git a/.dotnet.azure/eng/Directory.Build.Common.props b/.dotnet.azure/eng/Directory.Build.Common.props new file mode 100644 index 000000000..59a538ae8 --- /dev/null +++ b/.dotnet.azure/eng/Directory.Build.Common.props @@ -0,0 +1,56 @@ + + + + true + true + true + 11.0 + true + + $(NoWarn); + NU5105; + + CA1812; + CA1716; + CA1308; + CA1819; + CA1710; + CA1028; + 
CA1032; + CA1063; + CA1066; + CA1815; + CA2007; + CA2231; + CA2225; + CA1714; + CA1062; + CA1031; + CA2000; + CA2012; + + MSB3245; + AZPROVISION001; + + true + + + + + + netstandard2.0 + $(WarningsNotAsErrors);NU1901;NU1902;NU1903;NU1904 + $(RepoEngPath)\CodeAnalysis.ruleset + + + + + + net8.0;net6.0 + $(RequiredTargetFrameworks);net462 + + + + + + diff --git a/.dotnet.azure/eng/Directory.Build.Common.targets b/.dotnet.azure/eng/Directory.Build.Common.targets new file mode 100644 index 000000000..ec2340250 --- /dev/null +++ b/.dotnet.azure/eng/Directory.Build.Common.targets @@ -0,0 +1,3 @@ + + + diff --git a/.dotnet.azure/eng/Packages.Data.props b/.dotnet.azure/eng/Packages.Data.props new file mode 100644 index 000000000..f868bc705 --- /dev/null +++ b/.dotnet.azure/eng/Packages.Data.props @@ -0,0 +1,395 @@ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + 1.0.0-dev.20240806.1 + + + diff --git a/.dotnet.azure/nuget.config b/.dotnet.azure/nuget.config new file mode 100644 index 000000000..1f889a235 --- /dev/null +++ b/.dotnet.azure/nuget.config @@ -0,0 +1,9 @@ + + + + + + + + + diff --git a/.dotnet.azure/sdk/openai/.gitignore b/.dotnet.azure/sdk/openai/.gitignore new file mode 100644 index 000000000..5e5364f16 --- /dev/null +++ b/.dotnet.azure/sdk/openai/.gitignore @@ -0,0 +1 @@ +#Azure.AI.OpenAI/Directory.Build.props \ No 
newline at end of file diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI.sln b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI.sln new file mode 100644 index 000000000..20dcf6c40 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI.sln @@ -0,0 +1,34 @@ + +Microsoft Visual Studio Solution File, Format Version 12.00 +# Visual Studio Version 17 +VisualStudioVersion = 17.10.35004.147 +MinimumVisualStudioVersion = 10.0.40219.1 +Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Azure.AI.OpenAI", "Azure.AI.OpenAI\src\Azure.AI.OpenAI.csproj", "{A80B9566-84A5-4AE4-AA0A-72B18646F1EC}" +EndProject +Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "OpenAI", "..\..\..\.dotnet\src\OpenAI.csproj", "{8BEE571B-DB25-4BE5-B9EB-2CA81D12EBC6}" +EndProject +Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Azure.AI.OpenAI.Tests", "Azure.AI.OpenAI\tests\Azure.AI.OpenAI.Tests.csproj", "{23DAB09E-3986-4248-AC80-2273C20FCD90}" +EndProject +Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "OpenAI.TestFramework", "tools\TestFramework\src\OpenAI.TestFramework.csproj", "{D1E3E196-BAA8-47C2-905A-B1C20733AEA8}" +EndProject +Global + GlobalSection(SolutionConfigurationPlatforms) = preSolution + Unsigned|Any CPU = Unsigned|Any CPU + EndGlobalSection + GlobalSection(ProjectConfigurationPlatforms) = postSolution + {A80B9566-84A5-4AE4-AA0A-72B18646F1EC}.Unsigned|Any CPU.ActiveCfg = Debug|Any CPU + {A80B9566-84A5-4AE4-AA0A-72B18646F1EC}.Unsigned|Any CPU.Build.0 = Debug|Any CPU + {8BEE571B-DB25-4BE5-B9EB-2CA81D12EBC6}.Unsigned|Any CPU.ActiveCfg = Unsigned|Any CPU + {8BEE571B-DB25-4BE5-B9EB-2CA81D12EBC6}.Unsigned|Any CPU.Build.0 = Unsigned|Any CPU + {23DAB09E-3986-4248-AC80-2273C20FCD90}.Unsigned|Any CPU.ActiveCfg = Debug|Any CPU + {23DAB09E-3986-4248-AC80-2273C20FCD90}.Unsigned|Any CPU.Build.0 = Debug|Any CPU + {D1E3E196-BAA8-47C2-905A-B1C20733AEA8}.Unsigned|Any CPU.ActiveCfg = Debug|Any CPU + {D1E3E196-BAA8-47C2-905A-B1C20733AEA8}.Unsigned|Any CPU.Build.0 = Debug|Any CPU + 
EndGlobalSection + GlobalSection(SolutionProperties) = preSolution + HideSolutionNode = FALSE + EndGlobalSection + GlobalSection(ExtensibilityGlobals) = postSolution + SolutionGuid = {A68497E4-547C-42B4-8EE4-6776A8238EE4} + EndGlobalSection +EndGlobal diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/Directory.Build.props b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/Directory.Build.props new file mode 100644 index 000000000..924ecfa8f --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/Directory.Build.props @@ -0,0 +1,20 @@ + + + + + + $(RepoRoot)/../.dotnet/src/OpenAI.csproj + 1.1.0-beta.5 + + + + + + + + + diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/assets.json b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/assets.json new file mode 100644 index 000000000..78d850850 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/assets.json @@ -0,0 +1,6 @@ +{ + "AssetsRepo": "Azure/azure-sdk-assets", + "AssetsRepoPrefixPath": "net", + "TagPrefix": "dotnet.azure/openai/Azure.AI.OpenAI", + "Tag": "dotnet.azure/openai/Azure.AI.OpenAI_9a2f5cd1c9" +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Azure.AI.OpenAI.csproj b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Azure.AI.OpenAI.csproj new file mode 100644 index 000000000..6bff57bff --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Azure.AI.OpenAI.csproj @@ -0,0 +1,64 @@ + + + + + Azure OpenAI's official extension package for using OpenAI's .NET library with the Azure OpenAI Service. 
+ + Azure.AI.OpenAI Client Library + 2.0.0-beta.4 + Microsoft Azure OpenAI + true + $(RequiredTargetFrameworks) + true + $(NoWarn);CS1591;AZC0012;AZC0102;CS8002;CS0436;AZC0112 + enable + preview + disable + + + + + 0024000004800000940000000602000000240000525341310004000001000100d15ddcb29688295338af4b7686603fe614abd555e09efba8fb88ee09e1f7b1ccaeed2e8f823fa9eef3fdd60217fc012ea67d2479751a0b8c087a4185541b851bd8b16f8d91b840e51b1cb0ba6fe647997e57429265e85ef62d565db50a69ae1647d54d7bd855e4db3d8a91510e5bcbd0edfbbecaa20a7bd9ae74593daa7b11b4 + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Assistants/AzureAssistantClient.Protocol.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Assistants/AzureAssistantClient.Protocol.cs new file mode 100644 index 000000000..c1c987098 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Assistants/AzureAssistantClient.Protocol.cs @@ -0,0 +1,570 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. 
+ +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Diagnostics.CodeAnalysis; + +namespace Azure.AI.OpenAI.Assistants; + +[Experimental("OPENAI001")] +internal partial class AzureAssistantClient : AssistantClient +{ + public override async Task<ClientResult> CreateAssistantAsync(BinaryContent content, RequestOptions options = null) + { + Argument.AssertNotNull(content, nameof(content)); + + using PipelineMessage message = CreateCreateAssistantRequest(content, options); + return ClientResult.FromResponse(await Pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + public override ClientResult CreateAssistant(BinaryContent content, RequestOptions options = null) + { + Argument.AssertNotNull(content, nameof(content)); + + using PipelineMessage message = CreateCreateAssistantRequest(content, options); + return ClientResult.FromResponse(Pipeline.ProcessMessage(message, options)); + } + + public override IAsyncEnumerable<ClientResult> GetAssistantsAsync(int? limit, string order, string after, string before, RequestOptions options) + { + AzureAssistantsPageEnumerator enumerator = new(Pipeline, _endpoint, limit, order, after, before, _apiVersion, options); + return PageCollectionHelpers.CreateAsync(enumerator); + } + + public override IEnumerable<ClientResult> GetAssistants(int?
limit, string order, string after, string before, RequestOptions options)
+    {
+        AzureAssistantsPageEnumerator enumerator = new(Pipeline, _endpoint, limit, order, after, before, _apiVersion, options);
+        return PageCollectionHelpers.Create(enumerator);
+    }
+
+    public override async Task<ClientResult> GetAssistantAsync(string assistantId, RequestOptions options)
+    {
+        Argument.AssertNotNullOrEmpty(assistantId, nameof(assistantId));
+
+        using PipelineMessage message = CreateGetAssistantRequest(assistantId, options);
+        return ClientResult.FromResponse(await Pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false));
+    }
+
+    public override ClientResult GetAssistant(string assistantId, RequestOptions options)
+    {
+        Argument.AssertNotNullOrEmpty(assistantId, nameof(assistantId));
+
+        using PipelineMessage message = CreateGetAssistantRequest(assistantId, options);
+        return ClientResult.FromResponse(Pipeline.ProcessMessage(message, options));
+    }
+
+    public override async Task<ClientResult> ModifyAssistantAsync(string assistantId, BinaryContent content, RequestOptions options = null)
+    {
+        Argument.AssertNotNullOrEmpty(assistantId, nameof(assistantId));
+        Argument.AssertNotNull(content, nameof(content));
+
+        using PipelineMessage message = CreateModifyAssistantRequest(assistantId, content, options);
+        return ClientResult.FromResponse(await Pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false));
+    }
+
+    public override ClientResult ModifyAssistant(string assistantId, BinaryContent content, RequestOptions options = null)
+    {
+        Argument.AssertNotNullOrEmpty(assistantId, nameof(assistantId));
+        Argument.AssertNotNull(content, nameof(content));
+
+        using PipelineMessage message = CreateModifyAssistantRequest(assistantId, content, options);
+        return ClientResult.FromResponse(Pipeline.ProcessMessage(message, options));
+    }
+
+    public override async Task<ClientResult> DeleteAssistantAsync(string assistantId, RequestOptions options)
+    {
+        Argument.AssertNotNullOrEmpty(assistantId,
nameof(assistantId)); + + using PipelineMessage message = CreateDeleteAssistantRequest(assistantId, options); + return ClientResult.FromResponse(await Pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + public override ClientResult DeleteAssistant(string assistantId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(assistantId, nameof(assistantId)); + + using PipelineMessage message = CreateDeleteAssistantRequest(assistantId, options); + return ClientResult.FromResponse(Pipeline.ProcessMessage(message, options)); + } + + /// + public override async Task CreateMessageAsync(string threadId, BinaryContent content, RequestOptions options = null) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + + using PipelineMessage message = CreateCreateMessageRequest(threadId, content, options); + return ClientResult.FromResponse(await Pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + /// + public override ClientResult CreateMessage(string threadId, BinaryContent content, RequestOptions options = null) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + + using PipelineMessage message = CreateCreateMessageRequest(threadId, content, options); + return ClientResult.FromResponse(Pipeline.ProcessMessage(message, options)); + } + + /// + public override IAsyncEnumerable GetMessagesAsync(string threadId, int? limit, string order, string after, string before, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + + AzureMessagesPageEnumerator enumerator = new(Pipeline, _endpoint, threadId, limit, order, after, before, _apiVersion, options); + return PageCollectionHelpers.CreateAsync(enumerator); + } + + public override IEnumerable GetMessages(string threadId, int? 
limit, string order, string after, string before, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + + AzureMessagesPageEnumerator enumerator = new(Pipeline, _endpoint, threadId, limit, order, after, before, _apiVersion, options); + return PageCollectionHelpers.Create(enumerator); + } + + /// + public override async Task GetMessageAsync(string threadId, string messageId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + + using PipelineMessage message = CreateGetMessageRequest(threadId, messageId, options); + return ClientResult.FromResponse(await Pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + /// + public override ClientResult GetMessage(string threadId, string messageId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + + using PipelineMessage message = CreateGetMessageRequest(threadId, messageId, options); + return ClientResult.FromResponse(Pipeline.ProcessMessage(message, options)); + } + + /// + public override async Task ModifyMessageAsync(string threadId, string messageId, BinaryContent content, RequestOptions options = null) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + Argument.AssertNotNullOrEmpty(messageId, nameof(messageId)); + + using PipelineMessage message = CreateModifyMessageRequest(threadId, messageId, content, options); + return ClientResult.FromResponse(await Pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + /// + public override ClientResult ModifyMessage(string threadId, string messageId, BinaryContent content, RequestOptions options = null) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + Argument.AssertNotNullOrEmpty(messageId, nameof(messageId)); + + using PipelineMessage message = CreateModifyMessageRequest(threadId, messageId, content, options); + return ClientResult.FromResponse(Pipeline.ProcessMessage(message, 
options)); + } + + /// + public override async Task DeleteMessageAsync(string threadId, string messageId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + Argument.AssertNotNullOrEmpty(messageId, nameof(messageId)); + + using PipelineMessage message = CreateDeleteMessageRequest(threadId, messageId, options); + return ClientResult.FromResponse(await Pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + /// + public override ClientResult DeleteMessage(string threadId, string messageId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + Argument.AssertNotNullOrEmpty(messageId, nameof(messageId)); + + using PipelineMessage message = CreateDeleteMessageRequest(threadId, messageId, options); + return ClientResult.FromResponse(Pipeline.ProcessMessage(message, options)); + } + + /// + public override async Task CreateThreadAndRunAsync(BinaryContent content, RequestOptions options = null) + { + PipelineMessage message = null; + try + { + message = CreateCreateThreadAndRunRequest(content, options); + return ClientResult.FromResponse(await Pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + finally + { + if (options?.BufferResponse != false) + { + message.Dispose(); + } + } + } + + /// + public override ClientResult CreateThreadAndRun(BinaryContent content, RequestOptions options = null) + { + PipelineMessage message = null; + try + { + message = CreateCreateThreadAndRunRequest(content, options); + return ClientResult.FromResponse(Pipeline.ProcessMessage(message, options)); + } + finally + { + if (options?.BufferResponse != false) + { + message.Dispose(); + } + } + } + + /// + public override async Task CreateRunAsync(string threadId, BinaryContent content, RequestOptions options = null) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + + PipelineMessage message = null; + try + { + message = CreateCreateRunRequest(threadId, 
content, options); + return ClientResult.FromResponse(await Pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + finally + { + if (options?.BufferResponse != false) + { + message.Dispose(); + } + } + } + + /// + public override ClientResult CreateRun(string threadId, BinaryContent content, RequestOptions options = null) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + + PipelineMessage message = null; + try + { + message = CreateCreateRunRequest(threadId, content, options); + return ClientResult.FromResponse(Pipeline.ProcessMessage(message, options)); + } + finally + { + if (options?.BufferResponse != false) + { + message.Dispose(); + } + } + } + + public override IAsyncEnumerable GetRunsAsync(string threadId, int? limit, string order, string after, string before, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + + AzureRunsPageEnumerator enumerator = new(Pipeline, _endpoint, threadId, limit, order, after, before, _apiVersion, options); + return PageCollectionHelpers.CreateAsync(enumerator); + } + + public override IEnumerable GetRuns(string threadId, int? 
limit, string order, string after, string before, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + + AzureRunsPageEnumerator enumerator = new(Pipeline, _endpoint, threadId, limit, order, after, before, _apiVersion, options); + return PageCollectionHelpers.Create(enumerator); + } + + /// + public override async Task GetRunAsync(string threadId, string runId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + Argument.AssertNotNullOrEmpty(runId, nameof(runId)); + + using PipelineMessage message = CreateGetRunRequest(threadId, runId, options); + return ClientResult.FromResponse(await Pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + /// + public override ClientResult GetRun(string threadId, string runId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + Argument.AssertNotNullOrEmpty(runId, nameof(runId)); + + using PipelineMessage message = CreateGetRunRequest(threadId, runId, options); + return ClientResult.FromResponse(Pipeline.ProcessMessage(message, options)); + } + + /// + public override async Task ModifyRunAsync(string threadId, string runId, BinaryContent content, RequestOptions options = null) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + Argument.AssertNotNullOrEmpty(runId, nameof(runId)); + + using PipelineMessage message = CreateModifyRunRequest(threadId, runId, content, options); + return ClientResult.FromResponse(await Pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + /// + public override ClientResult ModifyRun(string threadId, string runId, BinaryContent content, RequestOptions options = null) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + Argument.AssertNotNullOrEmpty(runId, nameof(runId)); + + using PipelineMessage message = CreateModifyRunRequest(threadId, runId, content, options); + return 
ClientResult.FromResponse(Pipeline.ProcessMessage(message, options)); + } + + /// + public override async Task CancelRunAsync(string threadId, string runId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + Argument.AssertNotNullOrEmpty(runId, nameof(runId)); + + using PipelineMessage message = CreateCancelRunRequest(threadId, runId, options); + return ClientResult.FromResponse(await Pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + /// + public override ClientResult CancelRun(string threadId, string runId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + Argument.AssertNotNullOrEmpty(runId, nameof(runId)); + + using PipelineMessage message = CreateCancelRunRequest(threadId, runId, options); + return ClientResult.FromResponse(Pipeline.ProcessMessage(message, options)); + } + + /// + public override async Task SubmitToolOutputsToRunAsync(string threadId, string runId, BinaryContent content, RequestOptions options = null) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + Argument.AssertNotNullOrEmpty(runId, nameof(runId)); + + PipelineMessage message = null; + try + { + message = CreateSubmitToolOutputsToRunRequest(threadId, runId, content, options); + return ClientResult.FromResponse(await Pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + finally + { + if (options?.BufferResponse != false) + { + message.Dispose(); + } + } + } + + /// + public override ClientResult SubmitToolOutputsToRun(string threadId, string runId, BinaryContent content, RequestOptions options = null) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + Argument.AssertNotNullOrEmpty(runId, nameof(runId)); + + PipelineMessage message = null; + try + { + message = CreateSubmitToolOutputsToRunRequest(threadId, runId, content, options); + return ClientResult.FromResponse(Pipeline.ProcessMessage(message, options)); + } + finally + { + 
if (options?.BufferResponse != false) + { + message.Dispose(); + } + } + } + + public override IAsyncEnumerable GetRunStepsAsync(string threadId, string runId, int? limit, string order, string after, string before, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + Argument.AssertNotNullOrEmpty(runId, nameof(runId)); + + AzureRunStepsPageEnumerator enumerator = new(Pipeline, _endpoint, threadId, runId, limit, order, after, before, _apiVersion, options); + return PageCollectionHelpers.CreateAsync(enumerator); + } + + public override IEnumerable GetRunSteps(string threadId, string runId, int? limit, string order, string after, string before, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + Argument.AssertNotNullOrEmpty(runId, nameof(runId)); + + AzureRunStepsPageEnumerator enumerator = new(Pipeline, _endpoint, threadId, runId, limit, order, after, before, _apiVersion, options); + return PageCollectionHelpers.Create(enumerator); + } + + /// + public override async Task GetRunStepAsync(string threadId, string runId, string stepId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + Argument.AssertNotNullOrEmpty(runId, nameof(runId)); + Argument.AssertNotNullOrEmpty(stepId, nameof(stepId)); + + using PipelineMessage message = CreateGetRunStepRequest(threadId, runId, stepId, options); + return ClientResult.FromResponse(await Pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + /// + public override ClientResult GetRunStep(string threadId, string runId, string stepId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + Argument.AssertNotNullOrEmpty(runId, nameof(runId)); + Argument.AssertNotNullOrEmpty(stepId, nameof(stepId)); + + using PipelineMessage message = CreateGetRunStepRequest(threadId, runId, stepId, options); + return ClientResult.FromResponse(Pipeline.ProcessMessage(message, 
options)); + } + + /// + public override async Task CreateThreadAsync(BinaryContent content, RequestOptions options = null) + { + using PipelineMessage message = CreateCreateThreadRequest(content, options); + return ClientResult.FromResponse(await Pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + /// + public override ClientResult CreateThread(BinaryContent content, RequestOptions options = null) + { + using PipelineMessage message = CreateCreateThreadRequest(content, options); + return ClientResult.FromResponse(Pipeline.ProcessMessage(message, options)); + } + + /// + public override async Task GetThreadAsync(string threadId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + + using PipelineMessage message = CreateGetThreadRequest(threadId, options); + return ClientResult.FromResponse(await Pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + /// + public override ClientResult GetThread(string threadId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + + using PipelineMessage message = CreateGetThreadRequest(threadId, options); + return ClientResult.FromResponse(Pipeline.ProcessMessage(message, options)); + } + + /// + public override async Task ModifyThreadAsync(string threadId, BinaryContent content, RequestOptions options = null) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + + using PipelineMessage message = CreateModifyThreadRequest(threadId, content, options); + return ClientResult.FromResponse(await Pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + /// + public override ClientResult ModifyThread(string threadId, BinaryContent content, RequestOptions options = null) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + + using PipelineMessage message = CreateModifyThreadRequest(threadId, content, options); + return 
ClientResult.FromResponse(Pipeline.ProcessMessage(message, options)); + } + + /// + public override async Task DeleteThreadAsync(string threadId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + + using PipelineMessage message = CreateDeleteThreadRequest(threadId, options); + return ClientResult.FromResponse(await Pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + /// + public override ClientResult DeleteThread(string threadId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + + using PipelineMessage message = CreateDeleteThreadRequest(threadId, options); + return ClientResult.FromResponse(Pipeline.ProcessMessage(message, options)); + } + + private new PipelineMessage CreateCreateAssistantRequest(BinaryContent content, RequestOptions options = null) + => NewJsonPostBuilder(content, options).WithPath("assistants").Build(); + + private new PipelineMessage CreateGetAssistantRequest(string assistantId, RequestOptions options) + => NewJsonGetBuilder(options).WithPath("assistants", assistantId).Build(); + + private new PipelineMessage CreateModifyAssistantRequest(string assistantId, BinaryContent content, RequestOptions options) + => NewJsonPostBuilder(content, options).WithPath("assistants", assistantId).Build(); + + private new PipelineMessage CreateDeleteAssistantRequest(string assistantId, RequestOptions options) + => NewJsonDeleteBuilder(options).WithPath("assistants", assistantId).Build(); + + private PipelineMessage CreateCreateThreadRequest(BinaryContent content, RequestOptions options) + => NewJsonPostBuilder(content, options).WithPath("threads").Build(); + + private PipelineMessage CreateGetThreadsRequest(int? 
limit, string order, string after, string before, RequestOptions options) + => NewGetListBuilder(limit, order, after, before, options).WithPath("threads").Build(); + + private PipelineMessage CreateGetThreadRequest(string threadId, RequestOptions options) + => NewJsonGetBuilder(options).WithPath("threads", threadId).Build(); + + private PipelineMessage CreateModifyThreadRequest(string threadId, BinaryContent content, RequestOptions options) + => NewJsonPostBuilder(content, options).WithPath("threads", threadId).Build(); + + private PipelineMessage CreateDeleteThreadRequest(string threadId, RequestOptions options) + => NewJsonDeleteBuilder(options).WithPath("threads", threadId).Build(); + + private PipelineMessage CreateCreateMessageRequest(string threadId, BinaryContent content, RequestOptions options) + => NewJsonPostBuilder(content, options).WithPath("threads", threadId, "messages").Build(); + + private PipelineMessage CreateGetMessageRequest(string threadId, string messageId, RequestOptions options) + => NewJsonGetBuilder(options).WithPath("threads", threadId, "messages", messageId).Build(); + + private PipelineMessage CreateModifyMessageRequest(string threadId, string messageId, BinaryContent content, RequestOptions options) + => NewJsonPostBuilder(content, options).WithPath("threads", threadId, "messages", messageId).Build(); + + private PipelineMessage CreateDeleteMessageRequest(string threadId, string messageId, RequestOptions options) + => NewJsonDeleteBuilder(options).WithPath("threads", threadId, "messages", messageId).Build(); + + private PipelineMessage CreateCreateThreadAndRunRequest(BinaryContent content, RequestOptions options) + => NewJsonPostBuilder(content, options).WithPath("threads", "runs").Build(); + + private PipelineMessage CreateCreateRunRequest(string threadId, BinaryContent content, RequestOptions options) + => NewJsonPostBuilder(content, options).WithPath("threads", threadId, "runs").Build(); + + private PipelineMessage 
CreateGetRunRequest(string threadId, string runId, RequestOptions options) + => NewJsonGetBuilder(options).WithPath("threads", threadId, "runs", runId).Build(); + + private PipelineMessage CreateModifyRunRequest(string threadId, string runId, BinaryContent content, RequestOptions options) + => NewJsonPostBuilder(content, options).WithPath("threads", threadId, "runs", runId).Build(); + + private PipelineMessage CreateCancelRunRequest(string threadId, string runId, RequestOptions options) + => NewBuilder(options).WithMethod("POST").WithPath("threads", threadId, "runs", runId, "cancel").WithAccept("application/json").Build(); + + private PipelineMessage CreateSubmitToolOutputsToRunRequest(string threadId, string runId, BinaryContent content, RequestOptions options) + => NewJsonPostBuilder(content, options).WithPath("threads", threadId, "runs", runId, "submit_tool_outputs").Build(); + + private PipelineMessage CreateGetRunStepRequest(string threadId, string runId, string stepId, RequestOptions options) + => NewJsonGetBuilder(options).WithPath("threads", threadId, "runs", runId, "steps", stepId).Build(); + + private AzureOpenAIPipelineMessageBuilder NewBuilder(RequestOptions options) + => new AzureOpenAIPipelineMessageBuilder(Pipeline, _endpoint, _apiVersion) + .WithAssistantsHeader() + .WithOptions(options); + + private AzureOpenAIPipelineMessageBuilder NewJsonPostBuilder(BinaryContent content, RequestOptions options) + => NewBuilder(options) + .WithMethod("POST") + .WithContent(content, "application/json") + .WithAccept("application/json"); + + private AzureOpenAIPipelineMessageBuilder NewJsonGetBuilder(RequestOptions options) + => NewBuilder(options) + .WithMethod("GET") + .WithAccept("application/json"); + + private AzureOpenAIPipelineMessageBuilder NewJsonDeleteBuilder(RequestOptions options) + => NewBuilder(options) + .WithMethod("DELETE") + .WithAccept("application/json"); + + private AzureOpenAIPipelineMessageBuilder NewGetListBuilder(int? 
limit, string order, string after, string before, RequestOptions options)
+        => NewJsonGetBuilder(options)
+            .WithCommonListParameters(limit, order, after, before);
+}
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Assistants/AzureAssistantClient.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Assistants/AzureAssistantClient.cs
new file mode 100644
index 000000000..c8b48f925
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Assistants/AzureAssistantClient.cs
@@ -0,0 +1,32 @@
+// Copyright (c) Microsoft Corporation. All rights reserved.
+// Licensed under the MIT License.
+
+using System.ClientModel.Primitives;
+
+namespace Azure.AI.OpenAI.Assistants;
+
+/// <summary>
+/// The scenario client used for assistant operations with the Azure OpenAI service.
+/// </summary>
+/// <remarks>
+/// To retrieve an instance of this type, use the matching method on <see cref="AzureOpenAIClient"/>.
+/// </remarks>
+internal partial class AzureAssistantClient : AssistantClient
+{
+    private readonly Uri _endpoint;
+    private readonly string _apiVersion;
+
+    internal AzureAssistantClient(ClientPipeline pipeline, Uri endpoint, AzureOpenAIClientOptions options)
+        : base(pipeline, new OpenAIClientOptions() { Endpoint = endpoint })
+    {
+        Argument.AssertNotNull(pipeline, nameof(pipeline));
+        Argument.AssertNotNull(endpoint, nameof(endpoint));
+        options ??= new();
+
+        _endpoint = endpoint;
+        _apiVersion = options.Version;
+    }
+
+    protected AzureAssistantClient()
+    { }
+}
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Assistants/Internal/Pagination/AzureAssistantsPageEnumerator.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Assistants/Internal/Pagination/AzureAssistantsPageEnumerator.cs
new file mode 100644
index 000000000..f351279a7
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Assistants/Internal/Pagination/AzureAssistantsPageEnumerator.cs
@@ -0,0 +1,49 @@
+// Copyright (c) Microsoft Corporation. All rights reserved.
+// Licensed under the MIT License.
+
+using System.ClientModel;
+using System.ClientModel.Primitives;
+
+#nullable enable
+
+namespace Azure.AI.OpenAI.Assistants;
+
+internal partial class AzureAssistantsPageEnumerator : AssistantsPageEnumerator
+{
+    private readonly Uri _endpoint;
+    private readonly string _apiVersion;
+
+    public AzureAssistantsPageEnumerator(
+        ClientPipeline pipeline,
+        Uri endpoint,
+        int? limit, string order, string after, string before,
+        string apiVersion,
+        RequestOptions options)
+        : base(pipeline, endpoint, limit, order, after, before, options)
+    {
+        _endpoint = endpoint;
+        _apiVersion = apiVersion;
+    }
+
+    internal override async Task<ClientResult> GetAssistantsAsync(int? limit, string order, string after, string before, RequestOptions options)
+    {
+        using PipelineMessage message = CreateGetAssistantsRequest(limit, order, after, before, options);
+        return ClientResult.FromResponse(await Pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false));
+    }
+
+    internal override ClientResult GetAssistants(int? limit, string order, string after, string before, RequestOptions options)
+    {
+        using PipelineMessage message = CreateGetAssistantsRequest(limit, order, after, before, options);
+        return ClientResult.FromResponse(Pipeline.ProcessMessage(message, options));
+    }
+
+    private PipelineMessage CreateGetAssistantsRequest(int?
limit, string order, string after, string before, RequestOptions options)
+        => new AzureOpenAIPipelineMessageBuilder(Pipeline, _endpoint, _apiVersion)
+            .WithAssistantsHeader()
+            .WithOptions(options)
+            .WithMethod("GET")
+            .WithAccept("application/json")
+            .WithCommonListParameters(limit, order, after, before)
+            .WithPath("assistants")
+            .Build();
+}
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Assistants/Internal/Pagination/AzureMessagesPageEnumerator.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Assistants/Internal/Pagination/AzureMessagesPageEnumerator.cs
new file mode 100644
index 000000000..79ad28d13
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Assistants/Internal/Pagination/AzureMessagesPageEnumerator.cs
@@ -0,0 +1,52 @@
+// Copyright (c) Microsoft Corporation. All rights reserved.
+// Licensed under the MIT License.
+
+using System.ClientModel;
+using System.ClientModel.Primitives;
+
+namespace Azure.AI.OpenAI.Assistants;
+
+internal partial class AzureMessagesPageEnumerator : MessagesPageEnumerator
+{
+    private readonly Uri _endpoint;
+    private readonly string _apiVersion;
+
+    public AzureMessagesPageEnumerator(
+        ClientPipeline pipeline,
+        Uri endpoint,
+        string threadId,
+        int? limit, string order, string after, string before,
+        string apiVersion,
+        RequestOptions options)
+        : base(pipeline, endpoint, threadId, limit, order, after, before, options)
+    {
+        _endpoint = endpoint;
+        _apiVersion = apiVersion;
+    }
+
+    internal override async Task<ClientResult> GetMessagesAsync(string threadId, int?
limit, string order, string after, string before, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + + using PipelineMessage message = CreateGetMessagesRequest(threadId, limit, order, after, before, options); + return ClientResult.FromResponse(await Pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + internal override ClientResult GetMessages(string threadId, int? limit, string order, string after, string before, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + + using PipelineMessage message = CreateGetMessagesRequest(threadId, limit, order, after, before, options); + return ClientResult.FromResponse(Pipeline.ProcessMessage(message, options)); + } + + private PipelineMessage CreateGetMessagesRequest(string threadId, int? limit, string order, string after, string before, RequestOptions options) + => new AzureOpenAIPipelineMessageBuilder(Pipeline, _endpoint, _apiVersion) + .WithAssistantsHeader() + .WithOptions(options) + .WithMethod("GET") + .WithAccept("application/json") + .WithCommonListParameters(limit, order, after, before) + .WithPath("threads", threadId, "messages") + .Build(); +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Assistants/Internal/Pagination/AzureRunStepsPageEnumerator.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Assistants/Internal/Pagination/AzureRunStepsPageEnumerator.cs new file mode 100644 index 000000000..e7f6ae902 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Assistants/Internal/Pagination/AzureRunStepsPageEnumerator.cs @@ -0,0 +1,54 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. 
+ +using System.ClientModel; +using System.ClientModel.Primitives; + +namespace Azure.AI.OpenAI.Assistants; + +internal partial class AzureRunStepsPageEnumerator : RunStepsPageEnumerator +{ + private readonly Uri _endpoint; + private readonly string _apiVersion; + + public AzureRunStepsPageEnumerator( + ClientPipeline pipeline, + Uri endpoint, + string threadId, string runId, + int? limit, string order, string after, string before, + string apiVersion, + RequestOptions options) + : base(pipeline, endpoint, threadId, runId, limit, order, after, before, options) + { + _endpoint = endpoint; + _apiVersion = apiVersion; + } + + internal override async Task GetRunStepsAsync(string threadId, string runId, int? limit, string order, string after, string before, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + Argument.AssertNotNullOrEmpty(runId, nameof(runId)); + + using PipelineMessage message = CreateGetRunStepsRequest(threadId, runId, limit, order, after, before, options); + return ClientResult.FromResponse(await Pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + internal override ClientResult GetRunSteps(string threadId, string runId, int? limit, string order, string after, string before, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + Argument.AssertNotNullOrEmpty(runId, nameof(runId)); + + using PipelineMessage message = CreateGetRunStepsRequest(threadId, runId, limit, order, after, before, options); + return ClientResult.FromResponse(Pipeline.ProcessMessage(message, options)); + } + + private PipelineMessage CreateGetRunStepsRequest(string threadId, string runId, int? 
limit, string order, string after, string before, RequestOptions options) + => new AzureOpenAIPipelineMessageBuilder(Pipeline, _endpoint, _apiVersion) + .WithAssistantsHeader() + .WithOptions(options) + .WithMethod("GET") + .WithAccept("application/json") + .WithCommonListParameters(limit, order, after, before) + .WithPath("threads", threadId, "runs", runId, "steps") + .Build(); +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Assistants/Internal/Pagination/AzureRunsPageEnumerator.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Assistants/Internal/Pagination/AzureRunsPageEnumerator.cs new file mode 100644 index 000000000..65afbca20 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Assistants/Internal/Pagination/AzureRunsPageEnumerator.cs @@ -0,0 +1,51 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +using System.ClientModel; +using System.ClientModel.Primitives; + +namespace Azure.AI.OpenAI.Assistants; + +internal partial class AzureRunsPageEnumerator : RunsPageEnumerator +{ + private readonly Uri _endpoint; + private readonly string _apiVersion; + + public AzureRunsPageEnumerator( + ClientPipeline pipeline, + Uri endpoint, + string threadId, int? limit, string order, string after, string before, + string apiVersion, + RequestOptions options) + : base(pipeline, endpoint, threadId, limit, order, after, before, options) + { + _endpoint = endpoint; + _apiVersion = apiVersion; + } + + internal override async Task GetRunsAsync(string threadId, int? limit, string order, string after, string before, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + + using PipelineMessage message = CreateGetRunsRequest(threadId, limit, order, after, before, options); + return ClientResult.FromResponse(await Pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + internal override ClientResult GetRuns(string threadId, int? 
limit, string order, string after, string before, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + + using PipelineMessage message = CreateGetRunsRequest(threadId, limit, order, after, before, options); + return ClientResult.FromResponse(Pipeline.ProcessMessage(message, options)); + } + + private PipelineMessage CreateGetRunsRequest(string threadId, int? limit, string order, string after, string before, RequestOptions options) + => new AzureOpenAIPipelineMessageBuilder(Pipeline, _endpoint, _apiVersion) + .WithAssistantsHeader() + .WithOptions(options) + .WithMethod("GET") + .WithAccept("application/json") + .WithCommonListParameters(limit, order, after, before) + .WithPath("threads", threadId, "runs") + .Build(); +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Audio/AzureAudioClient.Protocol.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Audio/AzureAudioClient.Protocol.cs new file mode 100644 index 000000000..874b4f3be --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Audio/AzureAudioClient.Protocol.cs @@ -0,0 +1,84 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. 
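The page enumerators above resolve each page by re-issuing the same GET request with updated cursor parameters (`limit`, `order`, `after`, `before`). The cursor loop they implement can be sketched in isolation; everything below (the `Page` record and the `fetchPage` callback) is a hypothetical stand-in for the SDK's pipeline round-trip, not part of this library:

```csharp
using System;
using System.Collections.Generic;

// Minimal sketch of cursor-based page enumeration, mirroring the
// limit/after parameters used by the runs and run-steps enumerators above.
// `fetchPage` stands in for the pipeline round-trip and is hypothetical.
public static class CursorPagination
{
    public record Page(IReadOnlyList<string> Items, string LastId, bool HasMore);

    public static IEnumerable<string> EnumerateAll(
        Func<string, int, Page> fetchPage, int limit = 20)
    {
        string after = null;
        bool hasMore = true;
        while (hasMore)
        {
            // e.g. GET /threads/{id}/runs?limit=20&after=...
            Page page = fetchPage(after, limit);
            foreach (string item in page.Items)
            {
                yield return item;
            }
            after = page.LastId;   // cursor for the next request
            hasMore = page.HasMore;
        }
    }
}
```

The enumeration stops as soon as a page reports no further results, so callers pay only for the pages they actually consume.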
+ +using System.ClientModel; +using System.ClientModel.Primitives; +using System.ComponentModel; +using OpenAI.Audio; + +namespace Azure.AI.OpenAI.Audio; + +internal partial class AzureAudioClient : AudioClient +{ + [EditorBrowsable(EditorBrowsableState.Never)] + public override ClientResult TranscribeAudio(BinaryContent content, string contentType, RequestOptions options = null) + { + using PipelineMessage message = CreateTranscribeAudioRequestMessage(content, contentType, options); + return ClientResult.FromResponse(Pipeline.ProcessMessage(message, options)); + } + + [EditorBrowsable(EditorBrowsableState.Never)] + public override async Task TranscribeAudioAsync(BinaryContent content, string contentType, RequestOptions options = null) + { + using PipelineMessage message = CreateTranscribeAudioRequestMessage(content, contentType, options); + PipelineResponse response = await Pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false); + return ClientResult.FromResponse(response); + } + + [EditorBrowsable(EditorBrowsableState.Never)] + public override ClientResult TranslateAudio(BinaryContent content, string contentType, RequestOptions options = null) + { + using PipelineMessage message = CreateTranslateAudioRequestMessage(content, contentType, options); + return ClientResult.FromResponse(Pipeline.ProcessMessage(message, options)); + } + + [EditorBrowsable(EditorBrowsableState.Never)] + public override async Task TranslateAudioAsync(BinaryContent content, string contentType, RequestOptions options = null) + { + using PipelineMessage message = CreateTranslateAudioRequestMessage(content, contentType, options); + PipelineResponse response = await Pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false); + return ClientResult.FromResponse(response); + } + + [EditorBrowsable(EditorBrowsableState.Never)] + public override ClientResult GenerateSpeech(BinaryContent content, RequestOptions options = null) + { + using PipelineMessage message = 
CreateGenerateSpeechFromTextRequestMessage(content, options); + return ClientResult.FromResponse(Pipeline.ProcessMessage(message, options)); + } + + [EditorBrowsable(EditorBrowsableState.Never)] + public override async Task GenerateSpeechAsync(BinaryContent content, RequestOptions options = null) + { + using PipelineMessage message = CreateGenerateSpeechFromTextRequestMessage(content, options); + PipelineResponse response = await Pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false); + return ClientResult.FromResponse(response); + } + + private PipelineMessage CreateTranscribeAudioRequestMessage(BinaryContent content, string contentType, RequestOptions options) + => new AzureOpenAIPipelineMessageBuilder(Pipeline, _endpoint, _apiVersion, _deploymentName) + .WithMethod("POST") + .WithPath("audio", "transcriptions") + .WithContent(content, contentType) + .WithAccept("application/json") + .WithOptions(options) + .Build(); + + private PipelineMessage CreateTranslateAudioRequestMessage(BinaryContent content, string contentType, RequestOptions options) + => new AzureOpenAIPipelineMessageBuilder(Pipeline, _endpoint, _apiVersion, _deploymentName) + .WithMethod("POST") + .WithPath("audio", "translations") + .WithContent(content, contentType) + .WithAccept("application/json") + .WithOptions(options) + .Build(); + + private PipelineMessage CreateGenerateSpeechFromTextRequestMessage(BinaryContent content, RequestOptions options) + => new AzureOpenAIPipelineMessageBuilder(Pipeline, _endpoint, _apiVersion, _deploymentName) + .WithMethod("POST") + .WithPath("audio", "speech") + .WithContent(content, "application/json") + .WithAccept("application/octet-stream") + .WithOptions(options) + .Build(); +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Audio/AzureAudioClient.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Audio/AzureAudioClient.cs new file mode 100644 index 000000000..f15480951 --- /dev/null +++ 
b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Audio/AzureAudioClient.cs @@ -0,0 +1,36 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +using OpenAI.Audio; +using System.ClientModel.Primitives; + +namespace Azure.AI.OpenAI.Audio; + +/// +/// The scenario client used for audio operations with the Azure OpenAI service. +/// +/// +/// To retrieve an instance of this type, use the matching method on . +/// +internal partial class AzureAudioClient : AudioClient +{ + private readonly string _deploymentName; + private readonly Uri _endpoint; + private readonly string _apiVersion; + + internal AzureAudioClient(ClientPipeline pipeline, string deploymentName, Uri endpoint, AzureOpenAIClientOptions options) + : base(pipeline, model: deploymentName, new OpenAIClientOptions() { Endpoint = endpoint }) + { + Argument.AssertNotNull(pipeline, nameof(pipeline)); + Argument.AssertNotNullOrEmpty(deploymentName, nameof(deploymentName)); + Argument.AssertNotNull(endpoint, nameof(endpoint)); + options ??= new(); + + _deploymentName = deploymentName; + _endpoint = endpoint; + _apiVersion = options.Version; + } + + protected AzureAudioClient() + { } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/AzureOpenAIAudience.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/AzureOpenAIAudience.cs new file mode 100644 index 000000000..719fcdd8b --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/AzureOpenAIAudience.cs @@ -0,0 +1,69 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +using System.ComponentModel; + +namespace Azure.AI.OpenAI; + +/// +/// Represents cloud authentication audiences available for Azure OpenAI. +/// These audiences correspond to authorization token authentication scopes. 
+/// +public readonly partial struct AzureOpenAIAudience : IEquatable +{ + private readonly string _value; + + /// + /// Initializes a new instance of the object. + /// + /// + /// Please consider using one of the known, valid values like or . + /// + /// + /// The Microsoft Entra audience to use when forming authorization scopes. + /// For Azure OpenAI, this value corresponds to a URL that identifies the Azure cloud where the resource is located. + /// For more information: . + /// + /// is null. + public AzureOpenAIAudience(string value) + { + Argument.AssertNotNullOrEmpty(value, nameof(value)); + _value = value; + } + + private const string AzurePublicCloudValue = "https://cognitiveservices.azure.com/.default"; + private const string AzureGovernmentValue = "https://cognitiveservices.azure.us/.default"; + + /// + /// The authorization audience used to connect to the public Azure cloud. Default if not otherwise specified. + /// + public static AzureOpenAIAudience AzurePublicCloud { get; } = new AzureOpenAIAudience(AzurePublicCloudValue); + + /// + /// The authorization audience used to authenticate with the Azure Government cloud. + /// + /// + /// For more information, please refer to + /// . + /// + public static AzureOpenAIAudience AzureGovernment { get; } = new AzureOpenAIAudience(AzureGovernmentValue); + + /// Determines if two values are the same. + public static bool operator ==(AzureOpenAIAudience left, AzureOpenAIAudience right) => left.Equals(right); + /// Determines if two values are not the same. + public static bool operator !=(AzureOpenAIAudience left, AzureOpenAIAudience right) => !left.Equals(right); + /// Converts a string to a . 
+ public static implicit operator AzureOpenAIAudience(string value) => new AzureOpenAIAudience(value); + + /// + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is AzureOpenAIAudience other && Equals(other); + /// + public bool Equals(AzureOpenAIAudience other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + /// + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + /// + public override string ToString() => _value; +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/AzureOpenAIClient.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/AzureOpenAIClient.cs new file mode 100644 index 000000000..9569b4ef6 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/AzureOpenAIClient.cs @@ -0,0 +1,329 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +global using OpenAI; +global using OpenAI.Assistants; +global using OpenAI.Audio; +global using OpenAI.Batch; +global using OpenAI.Chat; +global using OpenAI.Embeddings; +global using OpenAI.Files; +global using OpenAI.FineTuning; +global using OpenAI.Images; +global using OpenAI.Models; +global using OpenAI.Moderations; +global using OpenAI.VectorStores; + +using System.ClientModel; +using System.ClientModel.Primitives; +using System.ComponentModel; +using System.Diagnostics.CodeAnalysis; +using Azure.AI.OpenAI.Assistants; +using Azure.AI.OpenAI.Audio; +using Azure.AI.OpenAI.Batch; +using Azure.AI.OpenAI.Chat; +using Azure.AI.OpenAI.Embeddings; +using Azure.AI.OpenAI.Files; +using Azure.AI.OpenAI.FineTuning; +using Azure.AI.OpenAI.Images; +using Azure.AI.OpenAI.VectorStores; +using Azure.Core; + +#pragma warning disable AZC0007 + +namespace Azure.AI.OpenAI; + +/// +/// The top-level client for the Azure OpenAI service. 
+/// +/// +/// For scenario-specific operations, create a corresponding client using the matching method on this type, e.g. . +/// +public partial class AzureOpenAIClient : OpenAIClient +{ + private readonly Uri _endpoint; + private readonly AzureOpenAIClientOptions _options; + + /// + /// Creates a new instance of that will connect to a specified Azure OpenAI + /// service resource endpoint using an API key. + /// + /// + /// + /// For token-based authentication, including the use of managed identity, please use the alternate constructor: + /// + /// + /// + /// The Azure OpenAI resource endpoint to use. This should not include model deployment or operation information. For example: https://my-resource.openai.azure.com. + /// The API key to authenticate with the service. + public AzureOpenAIClient(Uri endpoint, ApiKeyCredential credential) : this(endpoint, credential, new AzureOpenAIClientOptions()) + { + } + + /// + /// Creates a new instance of that will connect to a specified Azure OpenAI + /// service resource endpoint using an API key. + /// + /// + /// + /// For token-based authentication, including the use of managed identity, please use the alternate constructor: + /// + /// + /// + /// The Azure OpenAI resource endpoint to use. This should not include model deployment or operation information. For example: https://my-resource.openai.azure.com. + /// The API key to authenticate with the service. + public AzureOpenAIClient(Uri endpoint, AzureKeyCredential credential) : this(endpoint, credential, new AzureOpenAIClientOptions()) + { + } + + /// + /// Creates a new instance of that will connect to an Azure OpenAI service resource + /// using token authentication, including for tokens issued via managed identity. + /// + /// + /// For API-key-based authentication, please use the alternate constructor: + /// + /// + /// The Azure OpenAI resource endpoint to use. This should not include model deployment or operation information. 
For example: https://my-resource.openai.azure.com. + /// The token credential to authenticate with the service. + public AzureOpenAIClient(Uri endpoint, TokenCredential credential) : this(endpoint, credential, new AzureOpenAIClientOptions()) + { + } + + /// + /// Creates a new instance of that will connect to a specified Azure OpenAI + /// service resource endpoint using an API key. + /// + /// + /// + /// For token-based authentication, including the use of managed identity, please use the alternate constructor: + /// + /// + /// + /// The Azure OpenAI resource endpoint to use. This should not include model deployment or operation information. For example: https://my-resource.openai.azure.com. + /// The API key to authenticate with the service. + /// The options to configure the client. + public AzureOpenAIClient(Uri endpoint, ApiKeyCredential credential, AzureOpenAIClientOptions options) + : this(CreatePipeline(credential, options), endpoint, options) + { + } + + /// + /// Creates a new instance of that will connect to a specified Azure OpenAI + /// service resource endpoint using an API key. + /// + /// + /// + /// For token-based authentication, including the use of managed identity, please use the alternate constructor: + /// + /// + /// + /// The Azure OpenAI resource endpoint to use. This should not include model deployment or operation information. For example: https://my-resource.openai.azure.com. + /// The API key to authenticate with the service. + /// The options to configure the client. + public AzureOpenAIClient(Uri endpoint, AzureKeyCredential credential, AzureOpenAIClientOptions options) + : this(CreatePipeline(credential?.Key, options), endpoint, options) + { + } + + /// + /// Creates a new instance of that will connect to an Azure OpenAI service resource + /// using token authentication, including for tokens issued via managed identity. 
+    ///
+    ///
+    /// For API-key-based authentication, please use the alternate constructor:
+    ///
+    ///
+    ///
+    /// The Azure OpenAI resource endpoint to use. This should not include model deployment or operation information.
+    ///
+    ///
+    /// Example: https://my-resource.openai.azure.com
+    ///
+    ///
+    /// The token credential to use when authenticating with the provided endpoint.
+    /// The scenario-independent options to use.
+    public AzureOpenAIClient(Uri endpoint, TokenCredential credential, AzureOpenAIClientOptions options = null)
+        : this(CreatePipeline(credential, options), endpoint, options)
+    { }
+
+    ///
+    /// Creates a new instance of .
+    ///
+    /// The client pipeline to use.
+    /// The endpoint to use.
+    /// The additional client options to use.
+    protected AzureOpenAIClient(ClientPipeline pipeline, Uri endpoint, AzureOpenAIClientOptions options)
+        : base(pipeline, new OpenAIClientOptions() { Endpoint = endpoint })
+    {
+        Argument.AssertNotNull(pipeline, nameof(pipeline));
+        Argument.AssertNotNull(endpoint, nameof(endpoint));
+        options ??= new();
+
+        _endpoint = endpoint;
+        _options = options;
+    }
+
+    ///
+    /// Creates a new instance of for mocking.
+    ///
+    protected AzureOpenAIClient()
+    { }
+
+    ///
+    /// Gets a new instance configured for assistant operation use with the Azure OpenAI service.
+    ///
+    /// A new instance.
+    [Experimental("OPENAI001")]
+    public override AssistantClient GetAssistantClient()
+        => new AzureAssistantClient(Pipeline, _endpoint, _options);
+
+    ///
+    /// Gets a new instance configured for audio operation use with the Azure OpenAI service.
+    ///
+    /// The model deployment name to use for the new client's audio operations.
+    /// A new instance.
+    public override AudioClient GetAudioClient(string deploymentName)
+        => new AzureAudioClient(Pipeline, deploymentName, _endpoint, _options);
+
+    ///
+    /// Gets a new instance configured for batch operation use with the Azure OpenAI service.
+    ///
+    /// The model deployment name to use for the new client's batch operations.
+    /// A new instance.
+    public BatchClient GetBatchClient(string deploymentName)
+        => new AzureBatchClient(Pipeline, deploymentName, _endpoint, _options);
+
+    ///
+    /// This method is unsupported for Azure OpenAI. Please use the alternate
+    /// method that accepts a model deployment name, instead.
+    ///
+    [EditorBrowsable(EditorBrowsableState.Never)]
+    public override BatchClient GetBatchClient() => GetBatchClient(deploymentName: null);
+
+    ///
+    /// Gets a new instance configured for chat completion operation use with the Azure OpenAI service.
+    ///
+    /// The model deployment name to use for the new client's chat completion operations.
+    /// A new instance.
+    public override ChatClient GetChatClient(string deploymentName)
+        => new AzureChatClient(Pipeline, deploymentName, _endpoint, _options);
+
+    ///
+    /// Gets a new instance configured for embedding operation use with the Azure OpenAI service.
+    ///
+    /// The model deployment name to use for the new client's embedding operations.
+    /// A new instance.
+    public override EmbeddingClient GetEmbeddingClient(string deploymentName)
+        => new AzureEmbeddingClient(Pipeline, deploymentName, _endpoint, _options);
+
+    ///
+    /// Gets a new instance configured for file operation use with the Azure OpenAI service.
+    ///
+    /// A new instance.
+    public override FileClient GetFileClient()
+        => new AzureFileClient(Pipeline, _endpoint, _options);
+
+    ///
+    /// Gets a new instance configured for fine-tuning operation use with the Azure OpenAI service.
+    ///
+    /// A new instance.
+    public override FineTuningClient GetFineTuningClient()
+        => new AzureFineTuningClient(Pipeline, _endpoint, _options);
+
+    ///
+    /// Gets a new instance configured for image operation use with the Azure OpenAI service.
+    ///
+    /// The model deployment name to use for the new client's image operations.
+    /// A new instance.
+ public override ImageClient GetImageClient(string deploymentName) + => new AzureImageClient(Pipeline, deploymentName, _endpoint, _options); + + /// + /// Model management operations are not supported with Azure OpenAI. + /// + [EditorBrowsable(EditorBrowsableState.Never)] + public override ModelClient GetModelClient() + => throw new NotSupportedException($"Azure OpenAI does not support the OpenAI model management API. Please " + + "use the Azure AI Services Account Management API to interact with Azure OpenAI model deployments."); + + /// + /// Moderation operations are not supported with Azure OpenAI. + /// + [EditorBrowsable(EditorBrowsableState.Never)] + public override ModerationClient GetModerationClient(string _) + => throw new NotSupportedException($"Azure OpenAI does not support the OpenAI moderations API. Please refer " + + "to the documentation on Microsoft's Responsible AI embedded content filters to learn more about Azure " + + "OpenAI's content filter policies and content filter annotations."); + + /// + /// Gets a new instance configured for vector store operation use with the + /// Azure OpenAI service. + /// + /// A new instance. + [Experimental("OPENAI001")] + public override VectorStoreClient GetVectorStoreClient() + => new AzureVectorStoreClient(Pipeline, _endpoint, _options); + + private static ClientPipeline CreatePipeline(PipelinePolicy authenticationPolicy, AzureOpenAIClientOptions options) + => ClientPipeline.Create( + options ?? 
new(), + perCallPolicies: + [ + CreateAddUserAgentHeaderPolicy(options), + CreateAddClientRequestIdHeaderPolicy(), + ], + perTryPolicies: + [ + authenticationPolicy, + ], + beforeTransportPolicies: []); + + internal static ClientPipeline CreatePipeline(ApiKeyCredential credential, AzureOpenAIClientOptions options = null) + { + Argument.AssertNotNull(credential, nameof(credential)); + return CreatePipeline(ApiKeyAuthenticationPolicy.CreateHeaderApiKeyPolicy(credential, "api-key"), options); + } + + internal static ClientPipeline CreatePipeline(TokenCredential credential, AzureOpenAIClientOptions options = null) + { + Argument.AssertNotNull(credential, nameof(credential)); + string authorizationScope = options?.Audience?.ToString() + ?? AzureOpenAIAudience.AzurePublicCloud.ToString(); + return CreatePipeline(new AzureTokenAuthenticationPolicy(credential, [authorizationScope]), options); + } + + private static PipelinePolicy CreateAddUserAgentHeaderPolicy(AzureOpenAIClientOptions options = null) + { + Core.TelemetryDetails telemetryDetails = new(typeof(AzureOpenAIClient).Assembly, options?.ApplicationId); + return new GenericActionPipelinePolicy( + requestAction: request => + { + if (request?.Headers?.TryGetValue(s_userAgentHeaderKey, out string _) == false) + { + request.Headers.Set(s_userAgentHeaderKey, telemetryDetails.ToString()); + } + }); + } + + private static PipelinePolicy CreateAddClientRequestIdHeaderPolicy() + { + return new GenericActionPipelinePolicy(request => + { + if (request?.Headers is not null) + { + string requestId = request.Headers.TryGetValue(s_clientRequestIdHeaderKey, out string existingHeader) == true + ? 
existingHeader + : Guid.NewGuid().ToString().ToLowerInvariant(); + request.Headers.Set(s_clientRequestIdHeaderKey, requestId); + } + }); + } + + private static readonly string s_userAgentHeaderKey = "User-Agent"; + private static readonly string s_clientRequestIdHeaderKey = "x-ms-client-request-id"; + private static PipelineMessageClassifier s_pipelineMessageClassifier; + internal static PipelineMessageClassifier PipelineMessageClassifier + => s_pipelineMessageClassifier ??= PipelineMessageClassifier.Create(stackalloc ushort[] { 200, 201 }); +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/AzureOpenAIClientOptions.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/AzureOpenAIClientOptions.cs new file mode 100644 index 000000000..1b2989bfb --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/AzureOpenAIClientOptions.cs @@ -0,0 +1,103 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +using System.ClientModel.Primitives; + +namespace Azure.AI.OpenAI; + +/// +/// Defines the scenario-independent, client-level options for the Azure-specific OpenAI client. +/// +public partial class AzureOpenAIClientOptions : ClientPipelineOptions +{ + internal string Version { get; } + + /// + /// The authorization audience to use when authenticating with Azure authentication tokens + /// + /// + /// By default, the public Azure cloud will be used to authenticate tokens. Modify this value to authenticate tokens + /// with other clouds like Azure Government. + /// + public AzureOpenAIAudience? Audience + { + get => _authorizationAudience; + set + { + AssertNotFrozen(); + _authorizationAudience = value; + } + } + private AzureOpenAIAudience? 
_authorizationAudience;
+
+    ///
+    public string ApplicationId
+    {
+        get => _applicationId;
+        set
+        {
+            AssertNotFrozen();
+            _applicationId = value;
+        }
+    }
+    private string _applicationId;
+
+    ///
+    /// Initializes a new instance of
+    ///
+    /// The service API version to use with the client.
+    /// The provided service API version is not supported.
+    public AzureOpenAIClientOptions(ServiceVersion version = LatestVersion)
+        : base()
+    {
+        Version = version switch
+        {
+            ServiceVersion.V2024_04_01_Preview => "2024-04-01-preview",
+            ServiceVersion.V2024_05_01_Preview => "2024-05-01-preview",
+            ServiceVersion.V2024_06_01 => "2024-06-01",
+            ServiceVersion.V2024_07_01_Preview => "2024-07-01-preview",
+            _ => throw new NotSupportedException()
+        };
+        RetryPolicy = new RetryWithDelaysPolicy();
+    }
+
+    /// The version of the service to use.
+    public enum ServiceVersion
+    {
+        /// Service version "2024-04-01-preview".
+        V2024_04_01_Preview = 7,
+        /// Service version "2024-05-01-preview".
+        V2024_05_01_Preview = 8,
+        /// Service version "2024-06-01".
+        V2024_06_01 = 9,
+        /// Service version "2024-07-01-preview".
+        V2024_07_01_Preview = 10,
+    }
+
+    internal class RetryWithDelaysPolicy : ClientRetryPolicy
+    {
+        protected override TimeSpan GetNextDelay(PipelineMessage message, int tryCount)
+        {
+            TimeSpan? TryGetTimeSpanFromHeader(string headerName, int millisecondsPerValue = 1, bool allowDateTimeOffset = false)
+            {
+                if (double.TryParse(
+                    message?.Response?.Headers?.TryGetValue(headerName, out string textValue) == true ? textValue : null,
+                    out double doubleValue) == true)
+                {
+                    return TimeSpan.FromMilliseconds(millisecondsPerValue * doubleValue);
+                }
+                // Parse the header's value, not its name, when an HTTP-date is allowed (e.g. Retry-After).
+                else if (allowDateTimeOffset
+                    && message?.Response?.Headers?.TryGetValue(headerName, out string dateText) == true
+                    && DateTimeOffset.TryParse(dateText, out DateTimeOffset delayUntil))
+                {
+                    return delayUntil - DateTimeOffset.Now;
+                }
+                return null;
+            }
+
+            TimeSpan? delayFromHeader =
+                TryGetTimeSpanFromHeader("retry-after-ms")
+                ?? TryGetTimeSpanFromHeader("x-ms-retry-after-ms")
+                ?? TryGetTimeSpanFromHeader("Retry-After", millisecondsPerValue: 1000, allowDateTimeOffset: true);
+
+            return delayFromHeader ??
base.GetNextDelay(message, tryCount); + } + } + + private const ServiceVersion LatestVersion = ServiceVersion.V2024_07_01_Preview; +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/AzureTokenAuthenticationPolicy.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/AzureTokenAuthenticationPolicy.cs new file mode 100644 index 000000000..6fb81cb1a --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/AzureTokenAuthenticationPolicy.cs @@ -0,0 +1,80 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +using Azure.Core; +using System.ClientModel.Primitives; +using System.Net; + +namespace Azure.AI.OpenAI; + +internal partial class AzureTokenAuthenticationPolicy : PipelinePolicy +{ + private readonly TokenCredential _credential; + private readonly string[] _scopes; + private readonly TimeSpan _refreshOffset; + private AccessToken? _currentToken; + + public AzureTokenAuthenticationPolicy(TokenCredential credential, IEnumerable scopes, TimeSpan? refreshOffset = null) + { + Argument.AssertNotNull(credential, nameof(credential)); + Argument.AssertNotNull(scopes, nameof(scopes)); + + _credential = credential; + _scopes = scopes.ToArray(); + _refreshOffset = refreshOffset ?? 
s_defaultRefreshOffset; + } + + public override void Process(PipelineMessage message, IReadOnlyList pipeline, int currentIndex) + { + if (message?.Request is not null) + { + if (!IsTokenFresh()) + { + TokenRequestContext tokenRequestContext = CreateRequestContext(message.Request); + _currentToken = _credential.GetToken(tokenRequestContext, cancellationToken: default); + } + message?.Request?.Headers?.Add("Authorization", $"Bearer {_currentToken.Value.Token}"); + } + ProcessNext(message, pipeline, currentIndex); + if (message?.Response?.Status == (int)HttpStatusCode.Unauthorized) + { + _currentToken = null; + } + } + + public override async ValueTask ProcessAsync(PipelineMessage message, IReadOnlyList pipeline, int currentIndex) + { + if (message?.Request is not null) + { + if (!IsTokenFresh()) + { + TokenRequestContext tokenRequestContext = CreateRequestContext(message.Request); + _currentToken + = await _credential.GetTokenAsync(tokenRequestContext, cancellationToken: default).ConfigureAwait(false); + } + message?.Request?.Headers?.Add("Authorization", $"Bearer {_currentToken.Value.Token}"); + } + await ProcessNextAsync(message, pipeline, currentIndex).ConfigureAwait(false); + if (message?.Response?.Status == (int)HttpStatusCode.Unauthorized) + { + _currentToken = null; + } + } + + private bool IsTokenFresh() + { + if (!_currentToken.HasValue) return false; + DateTimeOffset refreshAt = _currentToken.Value.RefreshOn ?? (_currentToken.Value.ExpiresOn - _refreshOffset); + return DateTimeOffset.UtcNow < refreshAt; + } + + private TokenRequestContext CreateRequestContext(PipelineRequest request) + { + string clientRequestId = request.Headers.TryGetValue("x-ms-client-request-id", out string messageClientId) == true + ? 
messageClientId + : null; + return new TokenRequestContext(_scopes, clientRequestId); + } + + private static readonly TimeSpan s_defaultRefreshOffset = TimeSpan.FromMinutes(5); +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Batch/AzureBatchClient.Protocol.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Batch/AzureBatchClient.Protocol.cs new file mode 100644 index 000000000..00cec454c --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Batch/AzureBatchClient.Protocol.cs @@ -0,0 +1,105 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +using System.ClientModel; +using System.ClientModel.Primitives; + +namespace Azure.AI.OpenAI.Batch; + +internal partial class AzureBatchClient : BatchClient +{ + public override async Task CreateBatchAsync(BinaryContent content, RequestOptions options = null) + { + Argument.AssertNotNull(content, nameof(content)); + + using PipelineMessage message = CreateCreateBatchRequest(content, options); + return ClientResult.FromResponse(await Pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + public override ClientResult CreateBatch(BinaryContent content, RequestOptions options = null) + { + Argument.AssertNotNull(content, nameof(content)); + + using PipelineMessage message = CreateCreateBatchRequest(content, options); + return ClientResult.FromResponse(Pipeline.ProcessMessage(message, options)); + } + + public override IAsyncEnumerable GetBatchesAsync(string after, int? limit, RequestOptions options) + { + BatchesPageEnumerator enumerator = new(Pipeline, _endpoint, after, limit, options); + return PageCollectionHelpers.CreateAsync(enumerator); + } + + public override IEnumerable GetBatches(string after, int? 
limit, RequestOptions options) + { + BatchesPageEnumerator enumerator = new(Pipeline, _endpoint, after, limit, options); + return PageCollectionHelpers.Create(enumerator); + } + + public override async Task GetBatchAsync(string batchId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(batchId, nameof(batchId)); + + using PipelineMessage message = CreateRetrieveBatchRequest(batchId, options); + return ClientResult.FromResponse(await Pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + public override ClientResult GetBatch(string batchId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(batchId, nameof(batchId)); + + using PipelineMessage message = CreateRetrieveBatchRequest(batchId, options); + return ClientResult.FromResponse(Pipeline.ProcessMessage(message, options)); + } + + public override async Task CancelBatchAsync(string batchId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(batchId, nameof(batchId)); + + using PipelineMessage message = CreateCancelBatchRequest(batchId, options); + return ClientResult.FromResponse(await Pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + public override ClientResult CancelBatch(string batchId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(batchId, nameof(batchId)); + + using PipelineMessage message = CreateCancelBatchRequest(batchId, options); + return ClientResult.FromResponse(Pipeline.ProcessMessage(message, options)); + } + + private new PipelineMessage CreateCreateBatchRequest(BinaryContent content, RequestOptions options) + => new AzureOpenAIPipelineMessageBuilder(Pipeline, _endpoint, _apiVersion, _deploymentName) + .WithMethod("POST") + .WithPath("batches") + .WithContent(content, "application/json") + .WithAccept("application/json") + .WithOptions(options) + .Build(); + + private new PipelineMessage CreateGetBatchesRequest(string after, int? 
limit, RequestOptions options) + => new AzureOpenAIPipelineMessageBuilder(Pipeline, _endpoint, _apiVersion, _deploymentName) + .WithMethod("GET") + .WithPath("batches") + .WithOptionalQueryParameter("after", after) + .WithOptionalQueryParameter("limit", limit) + .WithAccept("application/json") + .WithOptions(options) + .Build(); + + private new PipelineMessage CreateRetrieveBatchRequest(string batchId, RequestOptions options) + => new AzureOpenAIPipelineMessageBuilder(Pipeline, _endpoint, _apiVersion, _deploymentName) + .WithMethod("GET") + .WithPath("batches", batchId) + .WithAccept("application/json") + .WithOptions(options) + .Build(); + + private new PipelineMessage CreateCancelBatchRequest(string batchId, RequestOptions options) + => new AzureOpenAIPipelineMessageBuilder(Pipeline, _endpoint, _apiVersion, _deploymentName) + .WithMethod("POST") + .WithPath("batches", batchId, "cancel") + .WithAccept("application/json") + .WithOptions(options) + .Build(); +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Batch/AzureBatchClient.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Batch/AzureBatchClient.cs new file mode 100644 index 000000000..e04d9daf9 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Batch/AzureBatchClient.cs @@ -0,0 +1,36 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +using System.ClientModel.Primitives; +using OpenAI.Batch; + +namespace Azure.AI.OpenAI.Batch; + +/// +/// The scenario client used for batch operations with the Azure OpenAI service. +/// +/// +/// To retrieve an instance of this type, use the matching method on <see cref="AzureOpenAIClient"/>. 
+/// +internal partial class AzureBatchClient : BatchClient +{ + private readonly Uri _endpoint; + private readonly string _deploymentName; + private readonly string _apiVersion; + + internal AzureBatchClient(ClientPipeline pipeline, string deploymentName, Uri endpoint, AzureOpenAIClientOptions options) + : base(pipeline, new OpenAIClientOptions() { Endpoint = endpoint }) + { + Argument.AssertNotNull(pipeline, nameof(pipeline)); + Argument.AssertNotNullOrEmpty(deploymentName, nameof(deploymentName)); + Argument.AssertNotNull(endpoint, nameof(endpoint)); + options ??= new(); + + _deploymentName = deploymentName; + _endpoint = endpoint; + _apiVersion = options.Version; + } + + protected AzureBatchClient() + { } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/AzureChatClient.Protocol.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/AzureChatClient.Protocol.cs new file mode 100644 index 000000000..1420399fc --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/AzureChatClient.Protocol.cs @@ -0,0 +1,41 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +using OpenAI.Chat; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.ComponentModel; + +namespace Azure.AI.OpenAI.Chat; + +internal partial class AzureChatClient : ChatClient +{ + [EditorBrowsable(EditorBrowsableState.Never)] + public override ClientResult CompleteChat(BinaryContent content, RequestOptions options = null) + { + using PipelineMessage message = CreateCompleteChatRequestMessage(content, options); + PipelineResponse response = Pipeline.ProcessMessage(message, options); + return ClientResult.FromResponse(message.BufferResponse ? 
response : message.ExtractResponse()); + } + + [EditorBrowsable(EditorBrowsableState.Never)] + public override async Task CompleteChatAsync(BinaryContent content, RequestOptions options = null) + { + using PipelineMessage message = CreateCompleteChatRequestMessage(content, options); + PipelineResponse response = await Pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false); + return ClientResult.FromResponse(message.BufferResponse ? response : message.ExtractResponse()); + } + + private PipelineMessage CreateCompleteChatRequestMessage( + BinaryContent content, + RequestOptions options = null, + bool? bufferResponse = true) + => new AzureOpenAIPipelineMessageBuilder(Pipeline, _endpoint, _apiVersion, _deploymentName) + .WithPath("chat", "completions") + .WithMethod("POST") + .WithContent(content, "application/json") + .WithAccept("application/json") + .WithResponseContentBuffering(bufferResponse) + .WithOptions(options) + .Build(); +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/AzureChatClient.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/AzureChatClient.cs new file mode 100644 index 000000000..86ca7d229 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/AzureChatClient.cs @@ -0,0 +1,55 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +using OpenAI.Chat; +using System.ClientModel; +using System.ClientModel.Primitives; + +#pragma warning disable AZC0112 + +namespace Azure.AI.OpenAI.Chat; + +/// +/// The scenario client used for chat completion operations with the Azure OpenAI service. +/// +/// +/// To retrieve an instance of this type, use the matching method on . 
+/// +internal partial class AzureChatClient : ChatClient +{ + private readonly string _deploymentName; + private readonly Uri _endpoint; + private readonly string _apiVersion; + + internal AzureChatClient(ClientPipeline pipeline, string deploymentName, Uri endpoint, AzureOpenAIClientOptions options) + : base(pipeline, model: deploymentName, new OpenAIClientOptions() { Endpoint = endpoint }) + { + Argument.AssertNotNull(pipeline, nameof(pipeline)); + Argument.AssertNotNullOrEmpty(deploymentName, nameof(deploymentName)); + Argument.AssertNotNull(endpoint, nameof(endpoint)); + options ??= new(); + + _deploymentName = deploymentName; + _endpoint = endpoint; + _apiVersion = options.Version; + } + + protected AzureChatClient() + { } + + /// + public override AsyncCollectionResult CompleteChatStreamingAsync(IEnumerable messages, ChatCompletionOptions options = null, CancellationToken cancellationToken = default) + { + options ??= new(); + options.StreamOptions = null; + return base.CompleteChatStreamingAsync(messages, options, cancellationToken); + } + + /// + public override CollectionResult CompleteChatStreaming(IEnumerable messages, ChatCompletionOptions options = null, CancellationToken cancellationToken = default) + { + options ??= new(); + options.StreamOptions = null; + return base.CompleteChatStreaming(messages, options, cancellationToken); + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/AzureChatCompletion.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/AzureChatCompletion.cs new file mode 100644 index 000000000..299f2dea3 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/AzureChatCompletion.cs @@ -0,0 +1,38 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. 
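The `AzureChatClient` streaming overrides above null out `ChatCompletionOptions.StreamOptions` before delegating to the base `ChatClient` implementation, since the targeted Azure service API version does not accept `stream_options`. A hedged consumption sketch of that surface — the endpoint, environment variable, and deployment name are placeholders, and the snippet assumes the `Azure.AI.OpenAI` and `OpenAI` packages from this changeset:

```csharp
using System;
using System.ClientModel;
using Azure.AI.OpenAI;
using OpenAI.Chat;

// Placeholder endpoint/deployment; GetChatClient routes to the AzureChatClient shown above.
AzureOpenAIClient azureClient = new(
    new Uri("https://my-resource.openai.azure.com"),
    new ApiKeyCredential(Environment.GetEnvironmentVariable("AZURE_OPENAI_API_KEY")));
ChatClient chatClient = azureClient.GetChatClient("my-gpt-4o-deployment");

// CompleteChatStreamingAsync yields incremental updates; the Azure override
// clears StreamOptions internally before the request is sent.
await foreach (StreamingChatCompletionUpdate update in chatClient.CompleteChatStreamingAsync(
    new ChatMessage[] { new UserChatMessage("Hello!") }))
{
    foreach (ChatMessageContentPart part in update.ContentUpdate)
    {
        Console.Write(part.Text);
    }
}
```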
+ +using System.Diagnostics.CodeAnalysis; +using Azure.AI.OpenAI.Chat; +using Azure.AI.OpenAI.Internal; +using OpenAI.Chat; + +#pragma warning disable AZC0112 + +namespace Azure.AI.OpenAI; + +public static partial class AzureChatCompletionExtensions +{ + [Experimental("AOAI001")] + public static ContentFilterResultForPrompt GetContentFilterResultForPrompt(this ChatCompletion chatCompletion) + { + return AdditionalPropertyHelpers.GetAdditionalListProperty( + chatCompletion.SerializedAdditionalRawData, + "prompt_filter_results")?[0]; + } + + [Experimental("AOAI001")] + public static ContentFilterResultForResponse GetContentFilterResultForResponse(this ChatCompletion chatCompletion) + { + return AdditionalPropertyHelpers.GetAdditionalProperty( + chatCompletion.Choices?[0]?.SerializedAdditionalRawData, + "content_filter_results"); + } + + [Experimental("AOAI001")] + public static AzureChatMessageContext GetAzureMessageContext(this ChatCompletion chatCompletion) + { + return AdditionalPropertyHelpers.GetAdditionalProperty( + chatCompletion.Choices?[0]?.Message?.SerializedAdditionalRawData, + "context"); + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/AzureChatCompletionOptions.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/AzureChatCompletionOptions.cs new file mode 100644 index 000000000..54ab5606e --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/AzureChatCompletionOptions.cs @@ -0,0 +1,36 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. 
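The `AzureChatCompletionExtensions` methods above surface Azure-specific annotations that the service tucks into `SerializedAdditionalRawData` (`prompt_filter_results`, `content_filter_results`, `context`). A sketch of how a caller might inspect them — the severity property names are assumptions based on the Azure content filter schema, not verified against this changeset:

```csharp
using System;
using Azure.AI.OpenAI;
using Azure.AI.OpenAI.Chat;
using OpenAI.Chat;

// Assumes `chatClient` targets an Azure OpenAI deployment (see the client
// construction sketch earlier in this diff).
ChatCompletion completion = chatClient.CompleteChat(
    new ChatMessage[] { new UserChatMessage("Describe a sunny day.") }).Value;

// Extension methods read the Azure-only fields out of the raw response data.
ContentFilterResultForPrompt promptFilter = completion.GetContentFilterResultForPrompt();
ContentFilterResultForResponse responseFilter = completion.GetContentFilterResultForResponse();

Console.WriteLine($"Prompt 'sexual' severity: {promptFilter?.Sexual?.Severity}");
Console.WriteLine($"Response 'violence' severity: {responseFilter?.Violence?.Severity}");
```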
+ +using System.Diagnostics.CodeAnalysis; +using Azure.AI.OpenAI.Chat; +using Azure.AI.OpenAI.Internal; + +namespace Azure.AI.OpenAI; + +public static partial class AzureChatCompletionOptionsExtensions +{ + [Experimental("AOAI001")] + public static void AddDataSource(this ChatCompletionOptions options, AzureChatDataSource dataSource) + { + options.SerializedAdditionalRawData ??= new Dictionary(); + + IList existingSources + = AdditionalPropertyHelpers.GetAdditionalListProperty( + options.SerializedAdditionalRawData, + "data_sources") + ?? new ChangeTrackingList(); + existingSources.Add(dataSource); + AdditionalPropertyHelpers.SetAdditionalProperty( + options.SerializedAdditionalRawData, + "data_sources", + existingSources); + } + + [Experimental("AOAI001")] + public static IReadOnlyList GetDataSources(this ChatCompletionOptions options) + { + return AdditionalPropertyHelpers.GetAdditionalListProperty( + options.SerializedAdditionalRawData, + "data_sources") as IReadOnlyList; + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/AzureStreamingChatCompletionUpdate.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/AzureStreamingChatCompletionUpdate.cs new file mode 100644 index 000000000..9de53f530 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/AzureStreamingChatCompletionUpdate.cs @@ -0,0 +1,42 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. 
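The `AddDataSource` extension above appends an `AzureChatDataSource` to a `data_sources` list held in the options' `SerializedAdditionalRawData`, which is how the Azure-only "On Your Data" feature rides on the shared `ChatCompletionOptions` type. A hedged usage sketch, with placeholder resource names and key:

```csharp
using System;
using Azure.AI.OpenAI;
using Azure.AI.OpenAI.Chat;
using OpenAI.Chat;

// Attach an Azure Search "On Your Data" source to a single request.
ChatCompletionOptions options = new();
options.AddDataSource(new AzureSearchChatDataSource()
{
    Endpoint = new Uri("https://my-search.search.windows.net"),
    IndexName = "my-index",
    Authentication = DataSourceAuthentication.FromApiKey("<search-api-key>"),
});

// The data source travels in the request payload's `data_sources` array;
// `chatClient` is assumed to target an Azure OpenAI deployment.
ChatCompletion completion = chatClient.CompleteChat(
    new ChatMessage[] { new UserChatMessage("What does my data say?") }, options).Value;
```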
+ +using Azure.AI.OpenAI.Chat; +using Azure.AI.OpenAI.Internal; +using OpenAI.Chat; +using System.Diagnostics.CodeAnalysis; + +#pragma warning disable AZC0112 + +namespace Azure.AI.OpenAI; + +public static partial class AzureStreamingChatCompletionUpdateExtensions +{ + [Experimental("AOAI001")] + public static AzureChatMessageContext GetAzureMessageContext(this StreamingChatCompletionUpdate chatUpdate) + { + if (chatUpdate.Choices?.Count > 0) + { + return AdditionalPropertyHelpers.GetAdditionalProperty( + chatUpdate.Choices[0].Delta?.SerializedAdditionalRawData, + "context"); + } + return null; + } + + [Experimental("AOAI001")] + public static ContentFilterResultForPrompt GetContentFilterResultForPrompt(this StreamingChatCompletionUpdate chatUpdate) + { + return AdditionalPropertyHelpers.GetAdditionalListProperty( + chatUpdate.SerializedAdditionalRawData, + "prompt_filter_results")?[0]; + } + + [Experimental("AOAI001")] + public static ContentFilterResultForResponse GetContentFilterResultForResponse(this StreamingChatCompletionUpdate chatUpdate) + { + return AdditionalPropertyHelpers.GetAdditionalProperty( + chatUpdate?.Choices?.ElementAtOrDefault(0)?.SerializedAdditionalRawData, + "content_filter_results"); + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/GeneratorStubs.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/GeneratorStubs.cs new file mode 100644 index 000000000..368320c83 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/GeneratorStubs.cs @@ -0,0 +1,11 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. 
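The streaming-update extensions above pull the same Azure context out of each `StreamingChatCompletionUpdate` delta. A sketch of reading citations as they arrive — the `Citations`, `Title`, and `Url` member names are assumptions based on the citation model stubs later in this diff:

```csharp
using System;
using Azure.AI.OpenAI;
using Azure.AI.OpenAI.Chat;
using OpenAI.Chat;

// Assumes `chatClient`, `messages`, and `options` (with a data source attached)
// are set up as in the earlier sketches.
await foreach (StreamingChatCompletionUpdate update
    in chatClient.CompleteChatStreamingAsync(messages, options))
{
    // Context (intent, citations, retrieved documents) arrives on the delta.
    AzureChatMessageContext context = update.GetAzureMessageContext();
    if (context?.Citations is not null)
    {
        foreach (AzureChatCitation citation in context.Citations)
        {
            Console.WriteLine($"Citation: {citation.Title} ({citation.Url})");
        }
    }
}
```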
+ +namespace Azure.AI.OpenAI.Chat; + +[CodeGenModel("AzureChatDataSource")] public abstract partial class AzureChatDataSource { } +[CodeGenModel("AzureChatMessageContextCitation")] public partial class AzureChatCitation { } +[CodeGenModel("AzureChatMessageContextAllRetrievedDocuments")] public partial class AzureChatRetrievedDocument { } +[CodeGenModel("AzureChatMessageContextAllRetrievedDocumentsFilterReason")] public readonly partial struct AzureChatRetrievedDocumentFilterReason { } +[CodeGenModel("AzureChatMessageContext")] public partial class AzureChatMessageContext { } +[CodeGenModel("AzureSearchChatDataSourceParametersQueryType")] public readonly partial struct DataSourceQueryType { } diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/Internal/GeneratorStubs.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/Internal/GeneratorStubs.cs new file mode 100644 index 000000000..4a2bb2bcd --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/Internal/GeneratorStubs.cs @@ -0,0 +1,18 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. 
+ +namespace Azure.AI.OpenAI.Chat; + +[CodeGenModel("AzureChatDataSourceAccessTokenAuthenticationOptions")] internal partial class InternalAzureChatDataSourceAccessTokenAuthenticationOptions { } +[CodeGenModel("AzureChatDataSourceApiKeyAuthenticationOptions")] internal partial class InternalAzureChatDataSourceApiKeyAuthenticationOptions { } +[CodeGenModel("AzureChatDataSourceConnectionStringAuthenticationOptions")] internal partial class InternalAzureChatDataSourceConnectionStringAuthenticationOptions { } +[CodeGenModel("AzureChatDataSourceDeploymentNameVectorizationSource")] internal partial class InternalAzureChatDataSourceDeploymentNameVectorizationSource { } +[CodeGenModel("AzureChatDataSourceEncodedApiKeyAuthenticationOptions")] internal partial class InternalAzureChatDataSourceEncodedApiKeyAuthenticationOptions { } +[CodeGenModel("AzureChatDataSourceKeyAndKeyIdAuthenticationOptions")] internal partial class InternalAzureChatDataSourceKeyAndKeyIdAuthenticationOptions { } +[CodeGenModel("AzureChatDataSourceModelIdVectorizationSource")] internal partial class InternalAzureChatDataSourceModelIdVectorizationSource { } +[CodeGenModel("AzureChatDataSourceSystemAssignedManagedIdentityAuthenticationOptions")] internal partial class InternalAzureChatDataSourceSystemAssignedManagedIdentityAuthenticationOptions { } +[CodeGenModel("AzureChatDataSourceUserAssignedManagedIdentityAuthenticationOptions")] internal partial class InternalAzureChatDataSourceUserAssignedManagedIdentityAuthenticationOptions { } +[CodeGenModel("AzureSearchChatDataSourceParametersIncludeContext")] internal readonly partial struct InternalAzureSearchChatDataSourceParametersIncludeContext { } +[CodeGenModel("UnknownAzureChatDataSource")] internal partial class InternalUnknownAzureChatDataSource { } +[CodeGenModel("UnknownAzureChatDataSourceAuthenticationOptions")] internal partial class InternalUnknownAzureChatDataSourceAuthenticationOptions { } 
+[CodeGenModel("UnknownAzureChatDataSourceVectorizationSource")] internal partial class InternalUnknownAzureChatDataSourceVectorizationSource { } diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/Internal/InternalAzureChatDataSourceEndpointVectorizationSource.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/Internal/InternalAzureChatDataSourceEndpointVectorizationSource.cs new file mode 100644 index 000000000..b818eab4e --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/Internal/InternalAzureChatDataSourceEndpointVectorizationSource.cs @@ -0,0 +1,10 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +namespace Azure.AI.OpenAI.Chat; + +[CodeGenModel("AzureChatDataSourceEndpointVectorizationSource")] +internal partial class InternalAzureChatDataSourceEndpointVectorizationSource +{ + internal DataSourceAuthentication Authentication { get; set; } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/Internal/InternalAzureCosmosDBChatDataSourceParameters.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/Internal/InternalAzureCosmosDBChatDataSourceParameters.cs new file mode 100644 index 000000000..a3a7a1bc3 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/Internal/InternalAzureCosmosDBChatDataSourceParameters.cs @@ -0,0 +1,63 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +namespace Azure.AI.OpenAI.Chat; + +[CodeGenModel("AzureCosmosDBChatDataSourceParameters")] +internal partial class InternalAzureCosmosDBChatDataSourceParameters +{ + [CodeGenMember("IncludeContexts")] + private IList<string> _internalIncludeContexts = new ChangeTrackingList<string>(); + private DataSourceOutputContextFlags? _outputContextFlags; + + /// + public DataSourceOutputContextFlags? 
OutputContextFlags + { + get => DataSourceOutputContextFlagsExtensions.FromStringList(_internalIncludeContexts); + internal set + { + _outputContextFlags = value; + _internalIncludeContexts = _outputContextFlags?.ToStringList(); + } + } + + /// + /// The authentication options to use with the Azure CosmosDB data source. + /// + /// + /// Azure CosmosDB data sources support any of the following options: + /// + /// + /// + /// + [CodeGenMember("Authentication")] + public DataSourceAuthentication Authentication { get; set; } + + /// Gets the index field mappings. + /// + /// Supported field mappings for Elasticsearch data sources include: + /// + /// -- Required + /// -- Required + /// + /// + /// + /// + /// + /// + [CodeGenMember("FieldsMapping")] + public DataSourceFieldMappings FieldMappings { get; set; } + + /// + /// The vectorization dependency used for embeddings. + /// + /// + /// Supported vectorization dependencies for Azure CosmosDB data sources include: + /// + /// + /// + /// + /// + [CodeGenMember("EmbeddingDependency")] + public DataSourceVectorizer VectorizationSource { get; set; } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/Internal/InternalAzureMachineLearningIndexDataSourceParameters.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/Internal/InternalAzureMachineLearningIndexDataSourceParameters.cs new file mode 100644 index 000000000..07930fab6 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/Internal/InternalAzureMachineLearningIndexDataSourceParameters.cs @@ -0,0 +1,37 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. 
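The `OutputContextFlags` property above (repeated across the data source parameter classes) bridges a flags enum and the wire-format string list behind `IncludeContexts`. A minimal, self-contained sketch of that conversion pattern — the type and helper names below are illustrative, not the library's internal implementations:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

[Flags]
public enum OutputContextFlags
{
    Intent = 1 << 0,
    Citations = 1 << 1,
    AllRetrievedDocuments = 1 << 2,
}

public static class OutputContextFlagsExtensions
{
    private static readonly Dictionary<OutputContextFlags, string> s_wireNames = new()
    {
        [OutputContextFlags.Intent] = "intent",
        [OutputContextFlags.Citations] = "citations",
        [OutputContextFlags.AllRetrievedDocuments] = "all_retrieved_documents",
    };

    // Serialize the set flags into the string list the service expects.
    public static IList<string> ToStringList(this OutputContextFlags flags)
        => s_wireNames.Where(pair => flags.HasFlag(pair.Key))
                      .Select(pair => pair.Value)
                      .ToList();

    // Rehydrate flags from a wire-format list; null when nothing is present.
    public static OutputContextFlags? FromStringList(IList<string> values)
    {
        OutputContextFlags? result = null;
        foreach (KeyValuePair<OutputContextFlags, string> pair in s_wireNames)
        {
            if (values?.Contains(pair.Value) == true)
            {
                result = (result ?? 0) | pair.Key;
            }
        }
        return result;
    }
}
```

This keeps the public surface strongly typed while the generated layer continues to serialize plain strings.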
+ +namespace Azure.AI.OpenAI.Chat; + +[CodeGenModel("AzureMachineLearningIndexChatDataSourceParameters")] +internal partial class InternalAzureMachineLearningIndexChatDataSourceParameters +{ + [CodeGenMember("IncludeContexts")] + private IList _internalIncludeContexts = new ChangeTrackingList(); + private DataSourceOutputContextFlags? _outputContextFlags; + + /// + public DataSourceOutputContextFlags? OutputContextFlags + { + get => DataSourceOutputContextFlagsExtensions.FromStringList(_internalIncludeContexts); + internal set + { + _outputContextFlags = value; + _internalIncludeContexts = _outputContextFlags?.ToStringList(); + } + } + + /// + /// The authentication options to use with the Azure Machine Learning Index data source. + /// + /// + /// Azure Machine Learning Index data sources support any of the following options: + /// + /// + /// + /// + /// + /// + [CodeGenMember("Authentication")] + public DataSourceAuthentication Authentication { get; set; } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/Internal/InternalAzureSearchChatDataSourceParameters.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/Internal/InternalAzureSearchChatDataSourceParameters.cs new file mode 100644 index 000000000..a8cc7c74f --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/Internal/InternalAzureSearchChatDataSourceParameters.cs @@ -0,0 +1,71 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +namespace Azure.AI.OpenAI.Chat; + +[CodeGenModel("AzureSearchChatDataSourceParameters")] +internal partial class InternalAzureSearchChatDataSourceParameters +{ + [CodeGenMember("IncludeContexts")] + private IList _internalIncludeContexts = new ChangeTrackingList(); + private DataSourceOutputContextFlags? _outputContextFlags; + + /// + public DataSourceOutputContextFlags? 
OutputContextFlags + { + get => DataSourceOutputContextFlagsExtensions.FromStringList(_internalIncludeContexts); + internal set + { + _outputContextFlags = value; + _internalIncludeContexts = _outputContextFlags?.ToStringList(); + } + } + + /// + /// The authentication options to use with the Azure Search data source. + /// + /// + /// Azure Search data sources support any of the following options: + /// + /// + /// + /// + /// + /// + /// + [CodeGenMember("Authentication")] + public DataSourceAuthentication Authentication { get; set; } + + /// Gets the index field mappings. + /// + /// Supported field mappings for Azure Search data sources include: + /// + /// + /// + /// + /// + /// + /// + /// + /// + /// + [CodeGenMember("FieldsMapping")] + public DataSourceFieldMappings FieldMappings { get; set; } + + /// The query type for the Azure Search resource to use. + [CodeGenMember("QueryType")] + public DataSourceQueryType? QueryType { get; set; } + + /// + /// The vectorization dependency used for embeddings. + /// + /// + /// Supported vectorization dependencies for Azure Search data sources include: + /// + /// + /// + /// + /// + [CodeGenMember("EmbeddingDependency")] + public DataSourceVectorizer VectorizationSource { get; set; } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/Internal/InternalElasticsearchChatDataSourceParameters.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/Internal/InternalElasticsearchChatDataSourceParameters.cs new file mode 100644 index 000000000..568067e4c --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/Internal/InternalElasticsearchChatDataSourceParameters.cs @@ -0,0 +1,73 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. 
+ +namespace Azure.AI.OpenAI.Chat; + +[CodeGenModel("ElasticsearchChatDataSourceParameters")] +internal partial class InternalElasticsearchChatDataSourceParameters +{ + [CodeGenMember("IncludeContexts")] + private IList _internalIncludeContexts = new ChangeTrackingList(); + private DataSourceOutputContextFlags? _outputContextFlags; + + /// + public DataSourceOutputContextFlags? OutputContextFlags + { + get => DataSourceOutputContextFlagsExtensions.FromStringList(_internalIncludeContexts); + internal set + { + _outputContextFlags = value; + _internalIncludeContexts = _outputContextFlags?.ToStringList(); + } + } + + /// + /// The authentication options to use with the Elasticsearch data source. + /// + /// + /// Elasticsearch data sources support any of the following options: + /// + /// + /// + /// + /// + [CodeGenMember("Authentication")] + public DataSourceAuthentication Authentication { get; set; } + + /// Gets the index field mappings. + /// + /// Supported field mappings for Elasticsearch data sources include: + /// + /// + /// + /// + /// + /// + /// + /// + /// + [CodeGenMember("FieldsMapping")] + public DataSourceFieldMappings FieldMappings { get; set; } + + /// + /// Gets the query type. + /// + /// + /// Elasticsearch supports and . + /// + [CodeGenMember("QueryType")] + public DataSourceQueryType? QueryType { get; set; } + /// + /// The vectorization dependency used for embeddings. 
+ /// + /// + /// Supported vectorization dependencies for Elasticsearch data sources include: + /// + /// + /// + /// + /// + /// + [CodeGenMember("EmbeddingDependency")] + public DataSourceVectorizer VectorizationSource { get; set; } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/Internal/InternalPineconeChatDataSourceParameters.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/Internal/InternalPineconeChatDataSourceParameters.cs new file mode 100644 index 000000000..3b9288b15 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/Internal/InternalPineconeChatDataSourceParameters.cs @@ -0,0 +1,61 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +namespace Azure.AI.OpenAI.Chat; + +[CodeGenModel("PineconeChatDataSourceParameters")] +internal partial class InternalPineconeChatDataSourceParameters +{ + [CodeGenMember("IncludeContexts")] + private IList _internalIncludeContexts = new ChangeTrackingList(); + private DataSourceOutputContextFlags? _outputContextFlags; + + /// + public DataSourceOutputContextFlags? OutputContextFlags + { + get => DataSourceOutputContextFlagsExtensions.FromStringList(_internalIncludeContexts); + internal set + { + _outputContextFlags = value; + _internalIncludeContexts = _outputContextFlags?.ToStringList(); + } + } + + /// + /// The authentication options to use with the Pinecone data source. + /// + /// + /// Pinecone data sources support any of the following options: + /// + /// + /// + /// + [CodeGenMember("Authentication")] + public DataSourceAuthentication Authentication { get; set; } + + /// The index field mappings. This is required for Pinecone data sources. 
+ /// + /// Supported field mappings for Pinecone data sources include: + /// + /// -- Required + /// + /// + /// + /// + /// + /// + [CodeGenMember("FieldsMapping")] + public DataSourceFieldMappings FieldMappings { get; set; } + + /// + /// The vectorization dependency used for embeddings. + /// + /// + /// Supported vectorization dependencies for Pinecone data sources include: + /// + /// + /// + /// + [CodeGenMember("EmbeddingDependency")] + public DataSourceVectorizer VectorizationSource { get; set; } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/OnYourData/AzureCosmosDBChatDataSource.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/OnYourData/AzureCosmosDBChatDataSource.cs new file mode 100644 index 000000000..9f3933cde --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/OnYourData/AzureCosmosDBChatDataSource.cs @@ -0,0 +1,133 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. 
+ +using System.Diagnostics.CodeAnalysis; + +namespace Azure.AI.OpenAI.Chat; + +[CodeGenModel("AzureCosmosDBChatDataSource")] +public partial class AzureCosmosDBChatDataSource : AzureChatDataSource +{ + [CodeGenMember("Parameters")] + internal InternalAzureCosmosDBChatDataSourceParameters InternalParameters { get; } + + /// + required public string ContainerName + { + get => InternalParameters.ContainerName; + init => InternalParameters.ContainerName = value; + } + + /// + required public string DatabaseName + { + get => InternalParameters.DatabaseName; + init => InternalParameters.DatabaseName = value; + } + + /// + required public string IndexName + { + get => InternalParameters.IndexName; + init => InternalParameters.IndexName = value; + } + + /// + required public DataSourceAuthentication Authentication + { + get => InternalParameters.Authentication; + init => InternalParameters.Authentication = value; + } + + /// + required public DataSourceVectorizer VectorizationSource + { + get => InternalParameters.VectorizationSource; + init => InternalParameters.VectorizationSource = value; + } + + /// + required public DataSourceFieldMappings FieldMappings + { + get => InternalParameters.FieldMappings; + init => InternalParameters.FieldMappings = value; + } + + /// + public int? TopNDocuments + { + get => InternalParameters.TopNDocuments; + init => InternalParameters.TopNDocuments = value; + } + + /// + public bool? InScope + { + get => InternalParameters.InScope; + init => InternalParameters.InScope = value; + } + + /// + public int? Strictness + { + get => InternalParameters.Strictness; + init => InternalParameters.Strictness = value; + } + + /// + public string RoleInformation + { + get => InternalParameters.RoleInformation; + init => InternalParameters.RoleInformation = value; + } + + /// + public int? MaxSearchQueries + { + get => InternalParameters.MaxSearchQueries; + init => InternalParameters.MaxSearchQueries = value; + } + + /// + public bool? 
AllowPartialResult + { + get => InternalParameters.AllowPartialResult; + init => InternalParameters.AllowPartialResult = value; + } + + /// + public DataSourceOutputContextFlags? OutputContextFlags + { + get => InternalParameters.OutputContextFlags; + init => InternalParameters.OutputContextFlags = value; + } + + /// + /// Initializes a new instance of . + /// + public AzureCosmosDBChatDataSource() : base(type: "azure_cosmos_db", serializedAdditionalRawData: null) + { + InternalParameters = new(); + } + + // CUSTOM: Made internal. + /// Initializes a new instance of . + /// The parameter information to control the use of the Azure CosmosDB data source. + /// is null. + internal AzureCosmosDBChatDataSource(InternalAzureCosmosDBChatDataSourceParameters internalParameters) : this() + { + Argument.AssertNotNull(internalParameters, nameof(internalParameters)); + InternalParameters = internalParameters; + } + + /// Initializes a new instance of . + /// + /// Keeps track of any properties unknown to the library. + /// The parameter information to control the use of the Azure Search data source. + [SetsRequiredMembers] + internal AzureCosmosDBChatDataSource(string type, IDictionary serializedAdditionalRawData, InternalAzureCosmosDBChatDataSourceParameters internalParameters) + : base(type, serializedAdditionalRawData) + { + InternalParameters = internalParameters; + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/OnYourData/AzureMachineLearningIndexChatDataSource.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/OnYourData/AzureMachineLearningIndexChatDataSource.cs new file mode 100644 index 000000000..cd0550e0f --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/OnYourData/AzureMachineLearningIndexChatDataSource.cs @@ -0,0 +1,126 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. 
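Because `AzureCosmosDBChatDataSource` above exposes `required` init-only properties that project onto `InternalParameters`, the compiler forces all mandatory Cosmos DB settings to be supplied at construction time via object-initializer syntax. A hedged sketch — the values are placeholders and the `DataSourceAuthentication`/`DataSourceVectorizer` factory names are assumptions based on this library's data source surface:

```csharp
using Azure.AI.OpenAI.Chat;

// Omitting any `required` member below is a compile-time error.
AzureCosmosDBChatDataSource cosmosSource = new()
{
    ContainerName = "my-container",
    DatabaseName = "my-database",
    IndexName = "my-index",
    Authentication = DataSourceAuthentication.FromConnectionString("<connection-string>"),
    VectorizationSource = DataSourceVectorizer.FromDeploymentName("my-embedding-deployment"),
    FieldMappings = new DataSourceFieldMappings(),
};
```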
+ +using System.Diagnostics.CodeAnalysis; + +namespace Azure.AI.OpenAI.Chat; + +[CodeGenModel("AzureMachineLearningIndexChatDataSource")] +public partial class AzureMachineLearningIndexChatDataSource : AzureChatDataSource +{ + [CodeGenMember("Parameters")] + internal InternalAzureMachineLearningIndexChatDataSourceParameters InternalParameters { get; } + + /// + required public string IndexName + { + get => InternalParameters.Name; + init => InternalParameters.Name = value; + } + + /// + required public string ProjectResourceId + { + get => InternalParameters.ProjectResourceId; + init => InternalParameters.ProjectResourceId = value; + } + + /// + required public DataSourceAuthentication Authentication + { + get => InternalParameters.Authentication; + init => InternalParameters.Authentication = value; + } + + /// + required public string Version + { + get => InternalParameters.Version; + init => InternalParameters.Version = value; + } + + /// + public int? TopNDocuments + { + get => InternalParameters.TopNDocuments; + init => InternalParameters.TopNDocuments = value; + } + + /// + public bool? InScope + { + get => InternalParameters.InScope; + init => InternalParameters.InScope = value; + } + + /// + public int? Strictness + { + get => InternalParameters.Strictness; + init => InternalParameters.Strictness = value; + } + + /// + public string RoleInformation + { + get => InternalParameters.RoleInformation; + init => InternalParameters.RoleInformation = value; + } + + /// + public int? MaxSearchQueries + { + get => InternalParameters.MaxSearchQueries; + init => InternalParameters.MaxSearchQueries = value; + } + + /// + public bool? AllowPartialResult + { + get => InternalParameters.AllowPartialResult; + init => InternalParameters.AllowPartialResult = value; + } + + /// + public DataSourceOutputContextFlags? 
OutputContextFlags + { + get => InternalParameters.OutputContextFlags; + init => InternalParameters.OutputContextFlags = value; + } + + /// + public string Filter + { + get => InternalParameters.Filter; + init => InternalParameters.Filter = value; + } + + /// + /// Creates a new instance of . + /// + public AzureMachineLearningIndexChatDataSource() : base(type: "azure_ml_index", serializedAdditionalRawData: null) + { + InternalParameters = new(); + } + + // CUSTOM: Made internal. + /// Initializes a new instance of . + /// The parameter information to control the use of the Azure Machine Learning Index data source. + /// is null. + internal AzureMachineLearningIndexChatDataSource(InternalAzureMachineLearningIndexChatDataSourceParameters internalParameters) + { + Argument.AssertNotNull(internalParameters, nameof(internalParameters)); + InternalParameters = internalParameters; + } + + /// Initializes a new instance of . + /// + /// Keeps track of any properties unknown to the library. + /// The parameter information to control the use of the Azure Search data source. + [SetsRequiredMembers] + internal AzureMachineLearningIndexChatDataSource(string type, IDictionary serializedAdditionalRawData, InternalAzureMachineLearningIndexChatDataSourceParameters internalParameters) + : base(type, serializedAdditionalRawData) + { + InternalParameters = internalParameters; + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/OnYourData/AzureSearchChatDataSource.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/OnYourData/AzureSearchChatDataSource.cs new file mode 100644 index 000000000..799cc13dc --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/OnYourData/AzureSearchChatDataSource.cs @@ -0,0 +1,147 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. 
+ +using System.Diagnostics.CodeAnalysis; + +namespace Azure.AI.OpenAI.Chat; + +[CodeGenModel("AzureSearchChatDataSource")] +public partial class AzureSearchChatDataSource : AzureChatDataSource +{ + [CodeGenMember("Parameters")] + internal InternalAzureSearchChatDataSourceParameters InternalParameters { get; } + + /// + required public Uri Endpoint + { + get => InternalParameters.Endpoint; + init => InternalParameters.Endpoint = value; + } + + /// + required public string IndexName + { + get => InternalParameters.IndexName; + init => InternalParameters.IndexName = value; + } + + /// + required public DataSourceAuthentication Authentication + { + get => InternalParameters.Authentication; + init => InternalParameters.Authentication = value; + } + + /// + public int? TopNDocuments + { + get => InternalParameters.TopNDocuments; + init => InternalParameters.TopNDocuments = value; + } + + /// + public bool? InScope + { + get => InternalParameters.InScope; + init => InternalParameters.InScope = value; + } + + /// + public int? Strictness + { + get => InternalParameters.Strictness; + init => InternalParameters.Strictness = value; + } + + /// + public string RoleInformation + { + get => InternalParameters.RoleInformation; + init => InternalParameters.RoleInformation = value; + } + + /// + public int? MaxSearchQueries + { + get => InternalParameters.MaxSearchQueries; + init => InternalParameters.MaxSearchQueries = value; + } + + /// + public bool? AllowPartialResult + { + get => InternalParameters.AllowPartialResult; + init => InternalParameters.AllowPartialResult = value; + } + + /// + public DataSourceOutputContextFlags? OutputContextFlags + { + get => InternalParameters.OutputContextFlags; + init => InternalParameters.OutputContextFlags = value; + } + + /// + public DataSourceFieldMappings FieldMappings + { + get => InternalParameters.FieldMappings; + init => InternalParameters.FieldMappings = value; + } + + /// + public DataSourceQueryType? 
QueryType + { + get => InternalParameters.QueryType; + init => InternalParameters.QueryType = value; + } + + /// + public string SemanticConfiguration + { + get => InternalParameters.SemanticConfiguration; + init => InternalParameters.SemanticConfiguration = value; + } + + /// + public string Filter + { + get => InternalParameters.Filter; + init => InternalParameters.Filter = value; + } + + /// + public DataSourceVectorizer VectorizationSource + { + get => InternalParameters.VectorizationSource; + init => InternalParameters.VectorizationSource = value; + } + + /// + /// Creates a new instance of . + /// + public AzureSearchChatDataSource() : base(type: "azure_search", serializedAdditionalRawData: null) + { + InternalParameters = new(); + } + + // CUSTOM: Made internal. + /// Initializes a new instance of . + /// The parameter information to control the use of the Azure Search data source. + /// is null. + internal AzureSearchChatDataSource(InternalAzureSearchChatDataSourceParameters internalParameters) : this() + { + Argument.AssertNotNull(internalParameters, nameof(internalParameters)); + InternalParameters = internalParameters; + } + + /// Initializes a new instance of . + /// + /// Keeps track of any properties unknown to the library. + /// The parameter information to control the use of the Azure Search data source. 
+    [SetsRequiredMembers]
+    internal AzureSearchChatDataSource(string type, IDictionary<string, BinaryData> serializedAdditionalRawData, InternalAzureSearchChatDataSourceParameters internalParameters)
+        : base(type, serializedAdditionalRawData)
+    {
+        InternalParameters = internalParameters;
+    }
+}
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/OnYourData/DataSourceAuthentication.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/OnYourData/DataSourceAuthentication.cs
new file mode 100644
index 000000000..070b79547
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/OnYourData/DataSourceAuthentication.cs
@@ -0,0 +1,23 @@
+// Copyright (c) Microsoft Corporation. All rights reserved.
+// Licensed under the MIT License.
+
+namespace Azure.AI.OpenAI.Chat;
+
+[CodeGenModel("AzureChatDataSourceAuthenticationOptions")]
+public partial class DataSourceAuthentication
+{
+    public static DataSourceAuthentication FromApiKey(string apiKey)
+        => new InternalAzureChatDataSourceApiKeyAuthenticationOptions(apiKey);
+    public static DataSourceAuthentication FromAccessToken(string accessToken)
+        => new InternalAzureChatDataSourceAccessTokenAuthenticationOptions(accessToken);
+    public static DataSourceAuthentication FromConnectionString(string connectionString)
+        => new InternalAzureChatDataSourceConnectionStringAuthenticationOptions(connectionString);
+    public static DataSourceAuthentication FromKeyAndKeyId(string key, string keyId)
+        => new InternalAzureChatDataSourceKeyAndKeyIdAuthenticationOptions(key, keyId);
+    public static DataSourceAuthentication FromEncodedApiKey(string encodedApiKey)
+        => new InternalAzureChatDataSourceEncodedApiKeyAuthenticationOptions(encodedApiKey);
+    public static DataSourceAuthentication FromSystemManagedIdentity()
+        => new InternalAzureChatDataSourceSystemAssignedManagedIdentityAuthenticationOptions();
+    public static DataSourceAuthentication FromUserManagedIdentity(string identityResourceId)
+        => new
InternalAzureChatDataSourceUserAssignedManagedIdentityAuthenticationOptions(identityResourceId); +} \ No newline at end of file diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/OnYourData/DataSourceFieldMappings.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/OnYourData/DataSourceFieldMappings.cs new file mode 100644 index 000000000..999ade6bc --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/OnYourData/DataSourceFieldMappings.cs @@ -0,0 +1,99 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +namespace Azure.AI.OpenAI.Chat; + +[CodeGenModel("AzureSearchChatDataSourceParametersFieldsMapping")] +public partial class DataSourceFieldMappings +{ + /// + /// The name of the index field to use as a title. + /// + [CodeGenMember("TitleField")] + public string TitleFieldName { get; init;} + + /// + /// The name of the index field to use as a URL. + /// + /// + /// + /// This field is applicable to data source types including: , + /// , , and + /// . + /// + /// + /// It is not applicable to types including: . + /// + /// + [CodeGenMember("UrlField")] + public string UrlFieldName { get; init;} + + /// The name of the index field to use as a filepath. + /// + /// + /// This field is applicable to data source types including: , + /// , , and + /// . + /// + /// + /// It is not applicable to types including: . + /// + /// + [CodeGenMember("FilepathField")] + public string FilepathFieldName { get; init; } + + /// The names of index fields that should be treated as content. + /// + /// + /// This field is applicable to data source types including: , + /// , , and + /// . + /// + /// + /// It is not applicable to types including: . + /// + /// + [CodeGenMember("ContentFields")] + public IList ContentFieldNames { get; } = new ChangeTrackingList(); + + /// The separator pattern that content fields should use. 
+ /// + /// + /// This field is applicable to data source types including: , + /// , , and + /// . + /// + /// + /// It is not applicable to types including: . + /// + /// + [CodeGenMember("ContentFieldsSeparator")] + public string ContentFieldSeparator { get; init;} + + /// The names of fields that represent vector data. + /// + /// + /// This field is applicable to data source types including: , + /// , and . + /// + /// + /// It is not applicable to types including: and + /// . + /// + /// + [CodeGenMember("VectorFields")] + public IList VectorFieldNames { get; } = new ChangeTrackingList(); + + /// The names of fields that represent image vector data. + /// + /// This configuration is only applicable to . + /// + [CodeGenMember("ImageVectorFields")] + public IList ImageVectorFieldNames { get; } = new ChangeTrackingList(); + + /// + /// Initializes a new instance of . + /// + public DataSourceFieldMappings() + {} +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/OnYourData/DataSourceOutputContextFlags.Serialization.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/OnYourData/DataSourceOutputContextFlags.Serialization.cs new file mode 100644 index 000000000..b5a0ff12d --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/OnYourData/DataSourceOutputContextFlags.Serialization.cs @@ -0,0 +1,50 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. 
+
+namespace Azure.AI.OpenAI.Chat;
+
+internal static partial class DataSourceOutputContextFlagsExtensions
+{
+    public static IList<string> ToStringList(this DataSourceOutputContextFlags flags)
+    {
+        List<string> contexts = [];
+        if (flags.HasFlag(DataSourceOutputContextFlags.Intent))
+        {
+            contexts.Add("intent");
+        }
+        if (flags.HasFlag(DataSourceOutputContextFlags.Citations))
+        {
+            contexts.Add("citations");
+        }
+        if (flags.HasFlag(DataSourceOutputContextFlags.AllRetrievedDocuments))
+        {
+            contexts.Add("all_retrieved_documents");
+        }
+        return contexts;
+    }
+
+    public static DataSourceOutputContextFlags? FromStringList(IList<string> strings)
+    {
+        if (strings is null)
+        {
+            return null;
+        }
+        DataSourceOutputContextFlags flags = 0;
+        foreach (string s in strings)
+        {
+            if (s == "citations")
+            {
+                flags |= DataSourceOutputContextFlags.Citations;
+            }
+            else if (s == "intent")
+            {
+                flags |= DataSourceOutputContextFlags.Intent;
+            }
+            else if (s == "all_retrieved_documents")
+            {
+                flags |= DataSourceOutputContextFlags.AllRetrievedDocuments;
+            }
+        }
+        return flags;
+    }
+}
\ No newline at end of file
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/OnYourData/DataSourceOutputContextFlags.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/OnYourData/DataSourceOutputContextFlags.cs
new file mode 100644
index 000000000..1123f96f7
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/OnYourData/DataSourceOutputContextFlags.cs
@@ -0,0 +1,24 @@
+// Copyright (c) Microsoft Corporation. All rights reserved.
+// Licensed under the MIT License.
+
+namespace Azure.AI.OpenAI.Chat;
+
+/// 
+/// The include_context flags to request for an On Your Data retrieval result, which control what information
+/// will be available on instances in the response.
+/// 
+/// 
+/// By default, intent and citations will be requested.
+/// 
+/// This value is provided as a bitmask flag.
For example, to request intent and all_retrieved_documents
+/// contexts, use the bitwise OR operator by assigning
+/// <see cref="DataSourceOutputContextFlags.Intent"/> | <see cref="DataSourceOutputContextFlags.AllRetrievedDocuments"/>.
+/// 
+[Flags]
+public enum DataSourceOutputContextFlags : int
+{
+    Intent = 1 << 0,
+    Citations = 1 << 1,
+    AllRetrievedDocuments = 1 << 2,
+}
\ No newline at end of file
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/OnYourData/DataSourceVectorizer.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/OnYourData/DataSourceVectorizer.cs
new file mode 100644
index 000000000..51e4847b3
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/OnYourData/DataSourceVectorizer.cs
@@ -0,0 +1,26 @@
+// Copyright (c) Microsoft Corporation. All rights reserved.
+// Licensed under the MIT License.
+
+namespace Azure.AI.OpenAI.Chat;
+
+[CodeGenModel("AzureChatDataSourceVectorizationSource")]
+public abstract partial class DataSourceVectorizer
+{
+    /// 
+    /// Creates a new data source embedding dependency reference from an authenticated endpoint.
+    /// 
+    /// 
+    /// Vectorization endpoint authentication only supports api-key- and access-token-based authentication, as
+    /// created via <see cref="DataSourceAuthentication.FromApiKey(string)"/> and
+    /// <see cref="DataSourceAuthentication.FromAccessToken(string)"/>, respectively.
+    /// 
+    /// <param name="endpoint">The endpoint to use for vectorization.</param>
+    /// <param name="authentication">The authentication mechanism to use with the endpoint.</param>
+ /// + public static DataSourceVectorizer FromEndpoint(Uri endpoint, DataSourceAuthentication authentication) + => new InternalAzureChatDataSourceEndpointVectorizationSource(endpoint, authentication); + public static DataSourceVectorizer FromDeploymentName(string deploymentName) + => new InternalAzureChatDataSourceDeploymentNameVectorizationSource(deploymentName); + public static DataSourceVectorizer FromModelId(string modelId) + => new InternalAzureChatDataSourceModelIdVectorizationSource(modelId); +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/OnYourData/ElasticsearchChatDataSource.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/OnYourData/ElasticsearchChatDataSource.cs new file mode 100644 index 000000000..8ded23802 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/OnYourData/ElasticsearchChatDataSource.cs @@ -0,0 +1,131 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +using System.Diagnostics.CodeAnalysis; + +namespace Azure.AI.OpenAI.Chat; + +[CodeGenModel("ElasticsearchChatDataSource")] +public partial class ElasticsearchChatDataSource : AzureChatDataSource +{ + [CodeGenMember("Parameters")] + internal InternalElasticsearchChatDataSourceParameters InternalParameters { get; } + + /// + required public Uri Endpoint + { + get => InternalParameters.Endpoint; + init => InternalParameters.Endpoint = value; + } + + /// + required public string IndexName + { + get => InternalParameters.IndexName; + init => InternalParameters.IndexName = value; + } + + /// + required public DataSourceAuthentication Authentication + { + get => InternalParameters.Authentication; + init => InternalParameters.Authentication = value; + } + + /// + public int? TopNDocuments + { + get => InternalParameters.TopNDocuments; + init => InternalParameters.TopNDocuments = value; + } + + /// + public bool? 
InScope + { + get => InternalParameters.InScope; + init => InternalParameters.InScope = value; + } + + /// + public int? Strictness + { + get => InternalParameters.Strictness; + init => InternalParameters.Strictness = value; + } + + /// + public string RoleInformation + { + get => InternalParameters.RoleInformation; + init => InternalParameters.RoleInformation = value; + } + + /// + public int? MaxSearchQueries + { + get => InternalParameters.MaxSearchQueries; + init => InternalParameters.MaxSearchQueries = value; + } + + /// + public bool? AllowPartialResult + { + get => InternalParameters.AllowPartialResult; + init => InternalParameters.AllowPartialResult = value; + } + + /// + public DataSourceOutputContextFlags? OutputContextFlags + { + get => InternalParameters.OutputContextFlags; + init => InternalParameters.OutputContextFlags = value; + } + + /// + public DataSourceFieldMappings FieldMappings + { + get => InternalParameters.FieldMappings; + init => InternalParameters.FieldMappings = value; + } + + /// + public DataSourceQueryType? QueryType + { + get => InternalParameters.QueryType; + init => InternalParameters.QueryType = value; + } + + /// + public DataSourceVectorizer VectorizationSource + { + get => InternalParameters.VectorizationSource; + init => InternalParameters.VectorizationSource = value; + } + + public ElasticsearchChatDataSource() : base(type: "elasticsearch", serializedAdditionalRawData: null) + { + InternalParameters = new(); + } + + // CUSTOM: Made internal. + /// Initializes a new instance of . + /// The parameter information to control the use of the Elasticsearch data source. + /// is null. + internal ElasticsearchChatDataSource(InternalElasticsearchChatDataSourceParameters internalParameters) + : this() + { + Argument.AssertNotNull(internalParameters, nameof(internalParameters)); + InternalParameters = internalParameters; + } + + /// Initializes a new instance of . + /// + /// Keeps track of any properties unknown to the library. 
+    /// The parameter information to control the use of the Elasticsearch data source.
+    [SetsRequiredMembers]
+    internal ElasticsearchChatDataSource(string type, IDictionary<string, BinaryData> serializedAdditionalRawData, InternalElasticsearchChatDataSourceParameters internalParameters)
+        : base(type, serializedAdditionalRawData)
+    {
+        InternalParameters = internalParameters;
+    }
+}
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/OnYourData/PineconeChatDataSource.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/OnYourData/PineconeChatDataSource.cs
new file mode 100644
index 000000000..fd7106040
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Chat/OnYourData/PineconeChatDataSource.cs
@@ -0,0 +1,123 @@
+// Copyright (c) Microsoft Corporation. All rights reserved.
+// Licensed under the MIT License.
+
+using System.Diagnostics.CodeAnalysis;
+
+namespace Azure.AI.OpenAI.Chat;
+
+[CodeGenModel("PineconeChatDataSource")]
+public partial class PineconeChatDataSource : AzureChatDataSource
+{
+    [CodeGenMember("Parameters")]
+    internal InternalPineconeChatDataSourceParameters InternalParameters { get; }
+
+    /// 
+    required public string Environment
+    {
+        get => InternalParameters.Environment;
+        init => InternalParameters.Environment = value;
+    }
+
+    /// 
+    required public string IndexName
+    {
+        get => InternalParameters.IndexName;
+        init => InternalParameters.IndexName = value;
+    }
+
+    /// 
+    required public DataSourceAuthentication Authentication
+    {
+        get => InternalParameters.Authentication;
+        init => InternalParameters.Authentication = value;
+    }
+
+    /// 
+    required public DataSourceVectorizer VectorizationSource
+    {
+        get => InternalParameters.VectorizationSource;
+        init => InternalParameters.VectorizationSource = value;
+    }
+
+    /// 
+    required public DataSourceFieldMappings FieldMappings
+    {
+        get => InternalParameters.FieldMappings;
+        init => InternalParameters.FieldMappings = value;
+    }
+
+    /// 
+    public int?
TopNDocuments
+    {
+        get => InternalParameters.TopNDocuments;
+        init => InternalParameters.TopNDocuments = value;
+    }
+
+    /// 
+    public bool? InScope
+    {
+        get => InternalParameters.InScope;
+        init => InternalParameters.InScope = value;
+    }
+
+    /// 
+    public int? Strictness
+    {
+        get => InternalParameters.Strictness;
+        init => InternalParameters.Strictness = value;
+    }
+
+    /// 
+    public string RoleInformation
+    {
+        get => InternalParameters.RoleInformation;
+        init => InternalParameters.RoleInformation = value;
+    }
+
+    /// 
+    public int? MaxSearchQueries
+    {
+        get => InternalParameters.MaxSearchQueries;
+        init => InternalParameters.MaxSearchQueries = value;
+    }
+
+    /// 
+    public bool? AllowPartialResult
+    {
+        get => InternalParameters.AllowPartialResult;
+        init => InternalParameters.AllowPartialResult = value;
+    }
+
+    /// 
+    public DataSourceOutputContextFlags? OutputContextFlags
+    {
+        get => InternalParameters.OutputContextFlags;
+        init => InternalParameters.OutputContextFlags = value;
+    }
+
+    public PineconeChatDataSource() : base(type: "pinecone", serializedAdditionalRawData: null)
+    {
+        InternalParameters = new();
+    }
+
+    // CUSTOM: Made internal.
+    /// Initializes a new instance of <see cref="PineconeChatDataSource"/>.
+    /// <param name="internalParameters">The parameter information to control the use of the Pinecone data source.</param>
+    /// <exception cref="ArgumentNullException"><paramref name="internalParameters"/> is null.</exception>
+    internal PineconeChatDataSource(InternalPineconeChatDataSourceParameters internalParameters) : this()
+    {
+        Argument.AssertNotNull(internalParameters, nameof(internalParameters));
+        InternalParameters = internalParameters;
+    }
+
+    /// Initializes a new instance of <see cref="PineconeChatDataSource"/>.
+    /// <param name="type"></param>
+    /// <param name="serializedAdditionalRawData">Keeps track of any properties unknown to the library.</param>
+    /// <param name="internalParameters">The parameter information to control the use of the Pinecone data source.</param>
+    [SetsRequiredMembers]
+    internal PineconeChatDataSource(string type, IDictionary<string, BinaryData> serializedAdditionalRawData, InternalPineconeChatDataSourceParameters internalParameters)
+        : base(type, serializedAdditionalRawData)
+    {
+        InternalParameters = internalParameters;
+    }
+}
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Common/AdditionalPropertyHelpers.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Common/AdditionalPropertyHelpers.cs
new file mode 100644
index 000000000..bc0c0a10b
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Common/AdditionalPropertyHelpers.cs
@@ -0,0 +1,48 @@
+// Copyright (c) Microsoft Corporation. All rights reserved.
+// Licensed under the MIT License.
+
+using System.ClientModel.Primitives;
+using System.Text.Json;
+
+namespace Azure.AI.OpenAI.Internal;
+
+internal static class AdditionalPropertyHelpers
+{
+    internal static T GetAdditionalProperty<T>(IDictionary<string, BinaryData> additionalProperties, string key)
+        where T : class, IJsonModel<T>
+    {
+        if (additionalProperties?.TryGetValue(key, out BinaryData binaryProperty) != true)
+        {
+            return null;
+        }
+        return (T)ModelReaderWriter.Read(binaryProperty, typeof(T));
+    }
+
+    internal static IList<T> GetAdditionalListProperty<T>(IDictionary<string, BinaryData> additionalProperties, string key)
+        where T : class, IJsonModel<T>
+    {
+        if (additionalProperties?.TryGetValue(key, out BinaryData binaryProperty) != true)
+        {
+            return null;
+        }
+        List<T> items = [];
+        using JsonDocument document = JsonDocument.Parse(binaryProperty);
+        foreach (JsonElement element in document.RootElement.EnumerateArray())
+        {
+            items.Add((T)ModelReaderWriter.Read(BinaryData.FromObjectAsJson(element), typeof(T)));
+        }
+        return items;
+    }
+
+    internal static void SetAdditionalProperty<T>(IDictionary<string, BinaryData> additionalProperties, string key, T value)
+    {
+        using MemoryStream stream = new();
+        using (Utf8JsonWriter writer = new(stream))
+        {
+            writer.WriteObjectValue(value);
+        }
+        stream.Position = 0;
+        BinaryData binaryValue =
BinaryData.FromStream(stream);
+        additionalProperties[key] = binaryValue;
+    }
+}
\ No newline at end of file
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Common/ContentFilterBlocklistResult.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Common/ContentFilterBlocklistResult.cs
new file mode 100644
index 000000000..12a3b13e2
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Common/ContentFilterBlocklistResult.cs
@@ -0,0 +1,30 @@
+// Copyright (c) Microsoft Corporation. All rights reserved.
+// Licensed under the MIT License.
+
+#nullable disable
+
+namespace Azure.AI.OpenAI;
+
+[CodeGenModel("AzureContentFilterBlocklistResult")]
+public partial class ContentFilterBlocklistResult
+{
+    public IReadOnlyDictionary<string, bool> BlocklistFilterStatuses
+    {
+        get
+        {
+            if (_filteredByBlocklistId is null)
+            {
+                _filteredByBlocklistId = [];
+                foreach (InternalAzureContentFilterBlocklistResultDetail internalDetail in InternalDetails ?? [])
+                {
+                    _filteredByBlocklistId[internalDetail.Id] = internalDetail.Filtered;
+                }
+            }
+            return _filteredByBlocklistId;
+        }
+    }
+    private Dictionary<string, bool> _filteredByBlocklistId;
+
+    [CodeGenMember("Details")]
+    private IReadOnlyList<InternalAzureContentFilterBlocklistResultDetail> InternalDetails { get; }
+}
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Common/ContentFilterResultForPrompt.Serialization.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Common/ContentFilterResultForPrompt.Serialization.cs
new file mode 100644
index 000000000..2116f1867
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Common/ContentFilterResultForPrompt.Serialization.cs
@@ -0,0 +1,53 @@
+// Copyright (c) Microsoft Corporation. All rights reserved.
+// Licensed under the MIT License.
+
+using System.ClientModel.Primitives;
+using System.Text.Json;
+
+namespace Azure.AI.OpenAI;
+
+public partial class ContentFilterResultForPrompt
+{
+    internal static ContentFilterResultForPrompt DeserializeContentFilterResultForPrompt(JsonElement element, ModelReaderWriterOptions options = null)
+    {
+        options ??= ModelSerializationExtensions.WireOptions;
+
+        if (element.ValueKind == JsonValueKind.Null)
+        {
+            return null;
+        }
+        int? promptIndex = default;
+        InternalAzureContentFilterResultForPromptContentFilterResults contentFilterResults = default;
+        IDictionary<string, BinaryData> serializedAdditionalRawData = default;
+        Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>();
+        foreach (var property in element.EnumerateObject())
+        {
+            if (property.NameEquals("prompt_index"u8))
+            {
+                if (property.Value.ValueKind == JsonValueKind.Null)
+                {
+                    continue;
+                }
+                promptIndex = property.Value.GetInt32();
+                continue;
+            }
+            if (property.NameEquals("content_filter_results"u8)
+                // CUSTOMIZATION: some models, such as gpt-4o, observationally use a different, singular label
+                || property.NameEquals("content_filter_result"u8))
+            {
+                if (property.Value.ValueKind == JsonValueKind.Null)
+                {
+                    continue;
+                }
+                contentFilterResults = InternalAzureContentFilterResultForPromptContentFilterResults.DeserializeInternalAzureContentFilterResultForPromptContentFilterResults(property.Value, options);
+                continue;
+            }
+            if (options.Format != "W")
+            {
+                rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText()));
+            }
+        }
+        serializedAdditionalRawData = rawDataDictionary;
+        return new ContentFilterResultForPrompt(promptIndex, contentFilterResults, serializedAdditionalRawData);
+    }
+}
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Common/ContentFilterResultForPrompt.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Common/ContentFilterResultForPrompt.cs
new file mode 100644
index 000000000..15bc9b158
--- /dev/null
+++ 
b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Common/ContentFilterResultForPrompt.cs @@ -0,0 +1,33 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +namespace Azure.AI.OpenAI; + +[CodeGenModel("AzureContentFilterResultForPrompt")] +public partial class ContentFilterResultForPrompt +{ + internal int? PromptIndex { get; } + /// Gets the content filter results. + [CodeGenMember("ContentFilterResults")] + internal InternalAzureContentFilterResultForPromptContentFilterResults InternalResults { get; } + + /// + internal InternalAzureContentFilterResultForPromptContentFilterResultsError Error { get; set; } + + /// + public ContentFilterSeverityResult Sexual => InternalResults?.Sexual; + /// + public ContentFilterSeverityResult Violence => InternalResults?.Violence; + /// + public ContentFilterSeverityResult Hate => InternalResults?.Hate; + /// + public ContentFilterSeverityResult SelfHarm => InternalResults?.SelfHarm; + /// + public ContentFilterDetectionResult Profanity => InternalResults?.Profanity; + /// + public ContentFilterBlocklistResult CustomBlocklists => InternalResults?.CustomBlocklists; + /// + public ContentFilterDetectionResult Jailbreak => InternalResults?.Jailbreak; + /// + public ContentFilterDetectionResult IndirectAttack => InternalResults?.IndirectAttack; +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Common/ContentFilterResultForResponse.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Common/ContentFilterResultForResponse.cs new file mode 100644 index 000000000..4b21e611e --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Common/ContentFilterResultForResponse.cs @@ -0,0 +1,10 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. 
+ +namespace Azure.AI.OpenAI; + +[CodeGenModel("AzureContentFilterResultForChoice")] +public partial class ContentFilterResultForResponse +{ + internal InternalAzureContentFilterResultForPromptContentFilterResultsError Error { get; } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Common/GeneratorStubs.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Common/GeneratorStubs.cs new file mode 100644 index 000000000..6c52e6539 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Common/GeneratorStubs.cs @@ -0,0 +1,21 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +namespace Azure.AI.OpenAI; +[CodeGenModel("AzureContentFilterBlocklistIdResult")] internal partial class InternalAzureContentFilterBlocklistIdResult { } +[CodeGenModel("AzureContentFilterBlocklistResultDetail")] internal partial class InternalAzureContentFilterBlocklistResultDetail { } +[CodeGenModel("AzureContentFilterResultForChoiceProtectedMaterialCode")] public partial class ContentFilterProtectedMaterialResult { } +[CodeGenModel("AzureContentFilterResultForChoiceProtectedMaterialCodeCitation")] public partial class ContentFilterProtectedMaterialCitedResult { } +[CodeGenModel("AzureContentFilterResultForPromptContentFilterResults")] internal partial class InternalAzureContentFilterResultForPromptContentFilterResults { } +[CodeGenModel("AzureContentFilterResultForPromptContentFilterResultsError")] internal partial class InternalAzureContentFilterResultForPromptContentFilterResultsError { } +[CodeGenModel("AzureContentFilterSeverityResultSeverity")] public readonly partial struct ContentFilterSeverity { } +[CodeGenModel("AzureContentFilterSeverityResult")] public partial class ContentFilterSeverityResult +{ + [CodeGenMember("Severity")] + public ContentFilterSeverity Severity { get; } +} +[CodeGenModel("AzureContentFilterDetectionResult")] public partial class ContentFilterDetectionResult { } 
+[CodeGenModel("AzureOpenAIChatErrorInnerError")] internal partial class InternalAzureOpenAIChatErrorInnerError { }
+[CodeGenModel("AzureOpenAIChatErrorInnerErrorCode")] internal readonly partial struct InternalAzureOpenAIChatErrorInnerErrorCode { }
+[CodeGenModel("AzureOpenAIDalleErrorInnerError")] internal partial class InternalAzureOpenAIDalleErrorInnerError { }
+[CodeGenModel("AzureOpenAIDalleErrorInnerErrorCode")] internal readonly partial struct InternalAzureOpenAIDalleErrorInnerErrorCode { }
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Embeddings/AzureEmbeddingClient.Protocol.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Embeddings/AzureEmbeddingClient.Protocol.cs
new file mode 100644
index 000000000..afe5ac46e
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Embeddings/AzureEmbeddingClient.Protocol.cs
@@ -0,0 +1,36 @@
+// Copyright (c) Microsoft Corporation. All rights reserved.
+// Licensed under the MIT License.
+
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.ComponentModel;
+using OpenAI.Embeddings;
+
+namespace Azure.AI.OpenAI.Embeddings;
+
+internal partial class AzureEmbeddingClient : EmbeddingClient
+{
+    [EditorBrowsable(EditorBrowsableState.Never)]
+    public override ClientResult GenerateEmbeddings(BinaryContent content, RequestOptions options = null)
+    {
+        using PipelineMessage message = CreateEmbeddingPipelineMessage(content, options);
+        return ClientResult.FromResponse(Pipeline.ProcessMessage(message, options));
+    }
+
+    [EditorBrowsable(EditorBrowsableState.Never)]
+    public override async Task<ClientResult> GenerateEmbeddingsAsync(BinaryContent content, RequestOptions options = null)
+    {
+        using PipelineMessage message = CreateEmbeddingPipelineMessage(content, options);
+        PipelineResponse response = await Pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false);
+        return ClientResult.FromResponse(response);
+    }
+
+    private PipelineMessage
CreateEmbeddingPipelineMessage(BinaryContent content, RequestOptions options = null)
+        => new AzureOpenAIPipelineMessageBuilder(Pipeline, _endpoint, _apiVersion, _deploymentName)
+            .WithMethod("POST")
+            .WithPath("embeddings")
+            .WithContent(content, "application/json")
+            .WithAccept("application/json")
+            .WithOptions(options)
+            .Build();
+}
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Embeddings/AzureEmbeddingClient.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Embeddings/AzureEmbeddingClient.cs
new file mode 100644
index 000000000..7bbdf2477
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Embeddings/AzureEmbeddingClient.cs
@@ -0,0 +1,36 @@
+// Copyright (c) Microsoft Corporation. All rights reserved.
+// Licensed under the MIT License.
+
+using OpenAI.Embeddings;
+using System.ClientModel.Primitives;
+
+namespace Azure.AI.OpenAI.Embeddings;
+
+/// 
+/// The scenario client used for embedding operations with the Azure OpenAI service.
+/// 
+/// 
+/// To retrieve an instance of this type, use the matching method on <see cref="AzureOpenAIClient"/>.
+/// +internal partial class AzureEmbeddingClient : EmbeddingClient +{ + private readonly string _deploymentName; + private readonly Uri _endpoint; + private readonly string _apiVersion; + + internal AzureEmbeddingClient(ClientPipeline pipeline, string deploymentName, Uri endpoint, AzureOpenAIClientOptions options) + : base(pipeline, model: deploymentName, new OpenAIClientOptions() { Endpoint = endpoint }) + { + Argument.AssertNotNull(pipeline, nameof(pipeline)); + Argument.AssertNotNullOrEmpty(deploymentName, nameof(deploymentName)); + Argument.AssertNotNull(endpoint, nameof(endpoint)); + options ??= new(); + + _deploymentName = deploymentName; + _endpoint = endpoint; + _apiVersion = options.Version; + } + + protected AzureEmbeddingClient() + { } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Files/AzureFileClient.Protocol.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Files/AzureFileClient.Protocol.cs new file mode 100644 index 000000000..07b8d3ccf --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Files/AzureFileClient.Protocol.cs @@ -0,0 +1,141 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. 
+ +using System.ClientModel; +using System.ClientModel.Primitives; +using System.ComponentModel; + +namespace Azure.AI.OpenAI.Files; + +internal partial class AzureFileClient : FileClient +{ + [EditorBrowsable(EditorBrowsableState.Never)] + public override ClientResult DeleteFile(string fileId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(fileId, nameof(fileId)); + + using PipelineMessage message = CreateDeleteRequestMessage(fileId, options); + return ClientResult.FromResponse(Pipeline.ProcessMessage(message, options)); + } + + [EditorBrowsable(EditorBrowsableState.Never)] + public override async Task<ClientResult> DeleteFileAsync(string fileId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(fileId, nameof(fileId)); + + using PipelineMessage message = CreateDeleteRequestMessage(fileId, options); + return ClientResult.FromResponse(await Pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + [EditorBrowsable(EditorBrowsableState.Never)] + public override ClientResult DownloadFile(string fileId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(fileId, nameof(fileId)); + + using PipelineMessage message = CreateDownloadContentRequestMessage(fileId, options); + return ClientResult.FromResponse(Pipeline.ProcessMessage(message, options)); + } + + [EditorBrowsable(EditorBrowsableState.Never)] + public override async Task<ClientResult> DownloadFileAsync(string fileId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(fileId, nameof(fileId)); + + using PipelineMessage message = CreateDownloadContentRequestMessage(fileId, options); + return ClientResult.FromResponse(await Pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + [EditorBrowsable(EditorBrowsableState.Never)] + public override ClientResult GetFile(string fileId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(fileId, nameof(fileId)); + + using PipelineMessage message = CreateGetFileRequestMessage(fileId, options); + return
ClientResult.FromResponse(Pipeline.ProcessMessage(message, options)); + } + + [EditorBrowsable(EditorBrowsableState.Never)] + public override async Task<ClientResult> GetFileAsync(string fileId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(fileId, nameof(fileId)); + + using PipelineMessage message = CreateGetFileRequestMessage(fileId, options); + return ClientResult.FromResponse(await Pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + [EditorBrowsable(EditorBrowsableState.Never)] + public override ClientResult GetFiles(string purpose, RequestOptions options) + { + using PipelineMessage message = CreateGetFilesRequestMessage(purpose, options); + return ClientResult.FromResponse(Pipeline.ProcessMessage(message, options)); + } + + [EditorBrowsable(EditorBrowsableState.Never)] + public override async Task<ClientResult> GetFilesAsync(string purpose, RequestOptions options) + { + using PipelineMessage message = CreateGetFilesRequestMessage(purpose, options); + return ClientResult.FromResponse(await Pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + [EditorBrowsable(EditorBrowsableState.Never)] + public override ClientResult UploadFile(BinaryContent content, string contentType, RequestOptions options = null) + { + Argument.AssertNotNull(content, nameof(content)); + Argument.AssertNotNullOrEmpty(contentType, nameof(contentType)); + + using PipelineMessage message = CreateUploadRequestMessage(content, contentType, options); + return ClientResult.FromResponse(Pipeline.ProcessMessage(message, options)); + } + + [EditorBrowsable(EditorBrowsableState.Never)] + public override async Task<ClientResult> UploadFileAsync(BinaryContent content, string contentType, RequestOptions options = null) + { + Argument.AssertNotNull(content, nameof(content)); + Argument.AssertNotNullOrEmpty(contentType, nameof(contentType)); + + using PipelineMessage message = CreateUploadRequestMessage(content, contentType, options); + return ClientResult.FromResponse(await
Pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + private PipelineMessage CreateDeleteRequestMessage(string fileId, RequestOptions options) + => new AzureOpenAIPipelineMessageBuilder(Pipeline, _endpoint, _apiVersion) + .WithMethod("DELETE") + .WithPath("files", fileId) + .WithAccept("application/json") + .WithOptions(options) + .Build(); + + private PipelineMessage CreateDownloadContentRequestMessage(string fileId, RequestOptions options) + => new AzureOpenAIPipelineMessageBuilder(Pipeline, _endpoint, _apiVersion) + .WithMethod("GET") + .WithPath("files", fileId, "content") + .WithAccept("application/json") + .WithOptions(options) + .Build(); + + private PipelineMessage CreateGetFileRequestMessage(string fileId, RequestOptions options) + => new AzureOpenAIPipelineMessageBuilder(Pipeline, _endpoint, _apiVersion) + .WithMethod("GET") + .WithPath("files", fileId) + .WithAccept("application/json") + .WithOptions(options) + .Build(); + + private PipelineMessage CreateGetFilesRequestMessage(string purpose, RequestOptions options) + => new AzureOpenAIPipelineMessageBuilder(Pipeline, _endpoint, _apiVersion) + .WithMethod("GET") + .WithPath("files") + .WithAccept("application/json") + .WithOptionalQueryParameter("purpose", purpose) + .WithOptions(options) + .Build(); + + private PipelineMessage CreateUploadRequestMessage(BinaryContent content, string contentType, RequestOptions options) + => new AzureOpenAIPipelineMessageBuilder(Pipeline, _endpoint, _apiVersion) + .WithMethod("POST") + .WithPath("files") + .WithContent(content, contentType) + .WithAccept("application/json") + .WithOptions(options) + .Build(); +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Files/AzureFileClient.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Files/AzureFileClient.cs new file mode 100644 index 000000000..60eb3403f --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Files/AzureFileClient.cs @@ -0,0 +1,78 @@ +// 
Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +using OpenAI.Files; +using System.ClientModel; +using System.ClientModel.Primitives; + +namespace Azure.AI.OpenAI.Files; + +/// <summary> +/// The scenario client used for Files operations with the Azure OpenAI service. +/// </summary> +/// <remarks> +/// To retrieve an instance of this type, use the matching method on <see cref="AzureOpenAIClient"/>. +/// </remarks> +internal partial class AzureFileClient : FileClient +{ + private readonly Uri _endpoint; + private readonly string _apiVersion; + + internal AzureFileClient(ClientPipeline pipeline, Uri endpoint, AzureOpenAIClientOptions options) + : base(pipeline, new OpenAIClientOptions() { Endpoint = endpoint }) + { + Argument.AssertNotNull(pipeline, nameof(pipeline)); + Argument.AssertNotNull(endpoint, nameof(endpoint)); + options ??= new(); + + _endpoint = endpoint; + _apiVersion = options.Version; + } + + protected AzureFileClient() + { } + + /// <inheritdoc/> + public override ClientResult<OpenAIFileInfo> UploadFile(Stream file, string filename, FileUploadPurpose purpose, CancellationToken cancellationToken = default) + { + if (purpose != FileUploadPurpose.FineTune) + { + return base.UploadFile(file, filename, purpose, cancellationToken); + } + + // need to set the content type for fine tuning file uploads in Azure OpenAI + Argument.AssertNotNull(file, "file"); + Argument.AssertNotNullOrEmpty(filename, "filename"); + + using MultipartFormDataBinaryContent content = CreateMultiPartContentWithMimeType(file, filename, purpose); + ClientResult clientResult = UploadFile(content, content.ContentType, new() { CancellationToken = cancellationToken }); + return ClientResult.FromValue(OpenAIFileInfo.FromResponse(clientResult.GetRawResponse()), clientResult.GetRawResponse()); + } + + /// <inheritdoc/> + public override async Task<ClientResult<OpenAIFileInfo>> UploadFileAsync(Stream file, string filename, FileUploadPurpose purpose, CancellationToken cancellationToken = default) + { + if (purpose != FileUploadPurpose.FineTune) + { + return await base.UploadFileAsync(file,
filename, purpose, cancellationToken) + .ConfigureAwait(false); + } + + // need to set the content type for fine tuning file uploads in Azure OpenAI + Argument.AssertNotNull(file, "file"); + Argument.AssertNotNullOrEmpty(filename, "filename"); + + using MultipartFormDataBinaryContent content = CreateMultiPartContentWithMimeType(file, filename, purpose); + ClientResult result = await UploadFileAsync(content, content.ContentType, new() { CancellationToken = cancellationToken }) + .ConfigureAwait(continueOnCapturedContext: false); + return ClientResult.FromValue(OpenAIFileInfo.FromResponse(result.GetRawResponse()), result.GetRawResponse()); + } + + private MultipartFormDataBinaryContent CreateMultiPartContentWithMimeType(Stream file, string filename, FileUploadPurpose purpose) + { + MultipartFormDataBinaryContent multipartFormDataBinaryContent = new MultipartFormDataBinaryContent(); + multipartFormDataBinaryContent.Add(file, "file", filename, "text/plain"); + multipartFormDataBinaryContent.Add(purpose.ToString(), "purpose"); + return multipartFormDataBinaryContent; + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/FineTuning/AzureFineTuningClient.Extensions.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/FineTuning/AzureFineTuningClient.Extensions.cs new file mode 100644 index 000000000..822b2c2d1 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/FineTuning/AzureFineTuningClient.Extensions.cs @@ -0,0 +1,61 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +#nullable enable + +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Diagnostics.CodeAnalysis; +using OpenAI.FineTuning; + +namespace Azure.AI.OpenAI.FineTuning; + +/// <summary> +/// Extension methods for Azure fine tuning clients. +/// </summary> +internal static class AzureFineTuningClientExtensions +{ + /// <summary> + /// Deletes an Azure OpenAI fine tuning job.
+ /// </summary> + /// <param name="client">The Azure OpenAI fine tuning client.</param> + /// <param name="jobId">The identifier for the fine tuning job to delete.</param> + /// <param name="options">(Optional) The request options to use.</param> + /// <returns>The request result.</returns> + /// <remarks>The Azure OpenAI service will always return a success (HTTP 204) regardless of whether or not + /// the job you are trying to delete exists.</remarks> + [Experimental("AOAI001")] + public static ClientResult DeleteJob(this FineTuningClient client, string jobId, RequestOptions? options = null) + { + Argument.AssertNotNullOrEmpty(jobId, nameof(jobId)); + return Cast(client).DeleteJob(jobId, options); + } + + /// <summary> + /// Deletes an Azure OpenAI fine tuning job. + /// </summary> + /// <param name="client">The Azure OpenAI fine tuning client.</param> + /// <param name="jobId">The identifier for the fine tuning job to delete.</param> + /// <param name="options">(Optional) The request options to use.</param> + /// <returns>The request result.</returns> + /// <remarks>The Azure OpenAI service will always return a success (HTTP 204) regardless of whether or not + /// the job you are trying to delete exists.</remarks> + [Experimental("AOAI001")] + public static Task<ClientResult> DeleteJobAsync(this FineTuningClient client, string jobId, RequestOptions? options = null) + { + Argument.AssertNotNullOrEmpty(jobId, nameof(jobId)); + return Cast(client).DeleteJobAsync(jobId, options); + } + + private static AzureFineTuningClient Cast(FineTuningClient? client) + { + Argument.AssertNotNull(client, nameof(client)); + var azureClient = client as AzureFineTuningClient; + if (azureClient == null) + { + throw new InvalidOperationException("Only supported on Azure OpenAI fine tuning clients"); + } + + return azureClient; + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/FineTuning/AzureFineTuningClient.Protocol.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/FineTuning/AzureFineTuningClient.Protocol.cs new file mode 100644 index 000000000..479f12cde --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/FineTuning/AzureFineTuningClient.Protocol.cs @@ -0,0 +1,137 @@ +// Copyright (c) Microsoft Corporation.
All rights reserved. +// Licensed under the MIT License. + +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Diagnostics.CodeAnalysis; + +namespace Azure.AI.OpenAI.FineTuning; + +internal partial class AzureFineTuningClient : FineTuningClient +{ + private readonly PipelineMessageClassifier DeleteJobClassifier = PipelineMessageClassifier.Create(new ushort[] { 204 }); + + public override ClientResult CreateJob(BinaryContent content, RequestOptions options = null) + { + using PipelineMessage message = CreateCreateJobRequestMessage(content, options); + return ClientResult.FromResponse(Pipeline.ProcessMessage(message, options)); + } + + public override async Task<ClientResult> CreateJobAsync(BinaryContent content, RequestOptions options = null) + { + using PipelineMessage message = CreateCreateJobRequestMessage(content, options); + PipelineResponse response = await Pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false); + return ClientResult.FromResponse(response); + } + + public override ClientResult GetJob(string fineTuningJobId, RequestOptions options) + { + using PipelineMessage message = CreateGetJobRequestMessage(fineTuningJobId, options); + return ClientResult.FromResponse(Pipeline.ProcessMessage(message, options)); + } + + public override async Task<ClientResult> GetJobAsync(string fineTuningJobId, RequestOptions options) + { + using PipelineMessage message = CreateGetJobRequestMessage(fineTuningJobId, options); + PipelineResponse response = await Pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false); + return ClientResult.FromResponse(response); + } + + public override IEnumerable<ClientResult> GetJobs(string after, int? limit, RequestOptions options) + { + AzureFineTuningJobsPageEnumerator enumerator = new(Pipeline, _endpoint, after, limit, _apiVersion, options); + return PageCollectionHelpers.Create(enumerator); + } + + public override IAsyncEnumerable<ClientResult> GetJobsAsync(string after, int?
limit, RequestOptions options) + { + AzureFineTuningJobsPageEnumerator enumerator = new(Pipeline, _endpoint, after, limit, _apiVersion, options); + return PageCollectionHelpers.CreateAsync(enumerator); + } + + public override IEnumerable<ClientResult> GetJobEvents(string fineTuningJobId, string after, int? limit, RequestOptions options) + { + AzureFineTuningJobEventsPageEnumerator enumerator = new(Pipeline, _endpoint, fineTuningJobId, after, limit, _apiVersion, options); + return PageCollectionHelpers.Create(enumerator); + } + + public override IAsyncEnumerable<ClientResult> GetJobEventsAsync(string fineTuningJobId, string after, int? limit, RequestOptions options) + { + AzureFineTuningJobEventsPageEnumerator enumerator = new(Pipeline, _endpoint, fineTuningJobId, after, limit, _apiVersion, options); + return PageCollectionHelpers.CreateAsync(enumerator); + } + + public override IEnumerable<ClientResult> GetJobCheckpoints(string fineTuningJobId, string after, int? limit, RequestOptions options) + { + AzureFineTuningJobCheckpointsPageEnumerator enumerator = new(Pipeline, _endpoint, fineTuningJobId, after, limit, _apiVersion, options); + return PageCollectionHelpers.Create(enumerator); + } + + public override IAsyncEnumerable<ClientResult> GetJobCheckpointsAsync(string fineTuningJobId, string after, int?
limit, RequestOptions options) + { + AzureFineTuningJobCheckpointsPageEnumerator enumerator = new(Pipeline, _endpoint, fineTuningJobId, after, limit, _apiVersion, options); + return PageCollectionHelpers.CreateAsync(enumerator); + } + + public override ClientResult CancelJob(string fineTuningJobId, RequestOptions options) + { + using PipelineMessage message = CreateCancelJobRequestMessage(fineTuningJobId, options); + return ClientResult.FromResponse(Pipeline.ProcessMessage(message, options)); + } + + public override async Task<ClientResult> CancelJobAsync(string fineTuningJobId, RequestOptions options) + { + using PipelineMessage message = CreateCancelJobRequestMessage(fineTuningJobId, options); + PipelineResponse response = await Pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false); + return ClientResult.FromResponse(response); + } + + [Experimental("AOAI001")] + public virtual ClientResult DeleteJob(string jobId, RequestOptions options = null) + { + using PipelineMessage message = CreateDeleteJobRequestMessage(jobId, options); + return ClientResult.FromResponse(Pipeline.ProcessMessage(message, options)); + } + + [Experimental("AOAI001")] + public virtual async Task<ClientResult> DeleteJobAsync(string jobId, RequestOptions options = null) + { + using PipelineMessage message = CreateDeleteJobRequestMessage(jobId, options); + PipelineResponse response = await Pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false); + return ClientResult.FromResponse(response); + } + + private PipelineMessage CreateCreateJobRequestMessage(BinaryContent content, RequestOptions options) + => new AzureOpenAIPipelineMessageBuilder(Pipeline, _endpoint, _apiVersion) + .WithMethod("POST") + .WithPath("fine_tuning", "jobs") + .WithContent(content, "application/json") + .WithAccept("application/json") + .WithOptions(options) + .Build(); + + private PipelineMessage CreateGetJobRequestMessage(string jobId, RequestOptions options) + => new AzureOpenAIPipelineMessageBuilder(Pipeline, _endpoint,
_apiVersion) + .WithMethod("GET") + .WithPath("fine_tuning", "jobs", jobId) + .WithAccept("application/json") + .WithOptions(options) + .Build(); + + private PipelineMessage CreateCancelJobRequestMessage(string jobId, RequestOptions options) + => new AzureOpenAIPipelineMessageBuilder(Pipeline, _endpoint, _apiVersion) + .WithMethod("POST") + .WithPath("fine_tuning", "jobs", jobId, "cancel") + .WithAccept("application/json") + .WithOptions(options) + .Build(); + + private PipelineMessage CreateDeleteJobRequestMessage(string jobId, RequestOptions options) + => new AzureOpenAIPipelineMessageBuilder(Pipeline, _endpoint, _apiVersion) + .WithMethod("DELETE") + .WithPath("fine_tuning", "jobs", jobId) + .WithAccept("application/json") + .WithClassifier(DeleteJobClassifier) + .WithOptions(options) + .Build(); +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/FineTuning/AzureFineTuningClient.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/FineTuning/AzureFineTuningClient.cs new file mode 100644 index 000000000..5c69e1bdd --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/FineTuning/AzureFineTuningClient.cs @@ -0,0 +1,33 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +using OpenAI.FineTuning; +using System.ClientModel.Primitives; + +namespace Azure.AI.OpenAI.FineTuning; + +/// <summary> +/// The scenario client used for fine-tuning operations with the Azure OpenAI service. +/// </summary> +/// <remarks> +/// To retrieve an instance of this type, use the matching method on <see cref="AzureOpenAIClient"/>.
+/// </remarks> +internal partial class AzureFineTuningClient : FineTuningClient +{ + private readonly Uri _endpoint; + private readonly string _apiVersion; + + internal AzureFineTuningClient(ClientPipeline pipeline, Uri endpoint, AzureOpenAIClientOptions options) + : base(pipeline, new OpenAIClientOptions() { Endpoint = endpoint }) + { + Argument.AssertNotNull(pipeline, nameof(pipeline)); + Argument.AssertNotNull(endpoint, nameof(endpoint)); + options ??= new(); + + _endpoint = endpoint; + _apiVersion = options.Version; + } + + protected AzureFineTuningClient() + { } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/FineTuning/Pagination/AzureFineTuningJobCheckpointsPageEnumerator.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/FineTuning/Pagination/AzureFineTuningJobCheckpointsPageEnumerator.cs new file mode 100644 index 000000000..509b5c2ce --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/FineTuning/Pagination/AzureFineTuningJobCheckpointsPageEnumerator.cs @@ -0,0 +1,34 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +#nullable enable + +using System.ClientModel.Primitives; + +namespace Azure.AI.OpenAI.FineTuning; + +internal class AzureFineTuningJobCheckpointsPageEnumerator : FineTuningJobCheckpointsPageEnumerator +{ + private readonly string _apiVersion; + + public AzureFineTuningJobCheckpointsPageEnumerator( + ClientPipeline pipeline, + Uri endpoint, + string jobId, string after, int? limit, + string apiVersion, + RequestOptions options) + : base(pipeline, endpoint, jobId, after, limit, options) + { + _apiVersion = apiVersion; + } + + internal override PipelineMessage CreateGetFineTuningJobCheckpointsRequest(string fineTuningJobId, string after, int?
limit, RequestOptions options) + => new AzureOpenAIPipelineMessageBuilder(_pipeline, _endpoint, _apiVersion) + .WithMethod("GET") + .WithPath("fine_tuning", "jobs", fineTuningJobId, "checkpoints") + .WithOptionalQueryParameter("after", after) + .WithOptionalQueryParameter("limit", limit) + .WithAccept("application/json") + .WithOptions(options) + .Build(); +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/FineTuning/Pagination/AzureFineTuningJobEventsPageEnumerator.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/FineTuning/Pagination/AzureFineTuningJobEventsPageEnumerator.cs new file mode 100644 index 000000000..469bc8f8d --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/FineTuning/Pagination/AzureFineTuningJobEventsPageEnumerator.cs @@ -0,0 +1,34 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +#nullable enable + +using System.ClientModel.Primitives; + +namespace Azure.AI.OpenAI.FineTuning; + +internal class AzureFineTuningJobEventsPageEnumerator : FineTuningJobEventsPageEnumerator +{ + private readonly string _apiVersion; + + public AzureFineTuningJobEventsPageEnumerator( + ClientPipeline pipeline, + Uri endpoint, + string jobId, string after, int? limit, + string apiVersion, + RequestOptions options) + : base(pipeline, endpoint, jobId, after, limit, options) + { + _apiVersion = apiVersion; + } + + internal override PipelineMessage CreateGetFineTuningEventsRequest(string fineTuningJobId, string after, int? 
limit, RequestOptions options) + => new AzureOpenAIPipelineMessageBuilder(_pipeline, _endpoint, _apiVersion) + .WithMethod("GET") + .WithPath("fine_tuning", "jobs", fineTuningJobId, "events") + .WithOptionalQueryParameter("after", after) + .WithOptionalQueryParameter("limit", limit) + .WithAccept("application/json") + .WithOptions(options) + .Build(); +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/FineTuning/Pagination/AzureFineTuningJobsPageEnumerator.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/FineTuning/Pagination/AzureFineTuningJobsPageEnumerator.cs new file mode 100644 index 000000000..be89ef203 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/FineTuning/Pagination/AzureFineTuningJobsPageEnumerator.cs @@ -0,0 +1,35 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +#nullable enable + +using System.ClientModel.Primitives; + +namespace Azure.AI.OpenAI.FineTuning; + +internal class AzureFineTuningJobsPageEnumerator : FineTuningJobsPageEnumerator +{ + private readonly string _apiVersion; + + public AzureFineTuningJobsPageEnumerator( + ClientPipeline pipeline, + Uri endpoint, + string? after, + int? limit, + string apiVersion, + RequestOptions options) + : base(pipeline, endpoint, after, limit, options) + { + _apiVersion = apiVersion; + } + + internal override PipelineMessage CreateGetFineTuningJobsRequest(string? after, int? 
limit, RequestOptions options) + => new AzureOpenAIPipelineMessageBuilder(_pipeline, _endpoint, _apiVersion) + .WithMethod("GET") + .WithPath("fine_tuning", "jobs") + .WithOptionalQueryParameter("after", after) + .WithOptionalQueryParameter("limit", limit) + .WithAccept("application/json") + .WithOptions(options) + .Build(); +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Images/AzureGeneratedImage.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Images/AzureGeneratedImage.cs new file mode 100644 index 000000000..d0adc6fa0 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Images/AzureGeneratedImage.cs @@ -0,0 +1,27 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +using Azure.AI.OpenAI.Internal; +using OpenAI.Images; +using System.Diagnostics.CodeAnalysis; + +namespace Azure.AI.OpenAI; + +public static class AzureGeneratedImageExtensions +{ + [Experimental("AOAI001")] + public static ImageContentFilterResultForPrompt GetContentFilterResultForPrompt(this GeneratedImage image) + { + return AdditionalPropertyHelpers.GetAdditionalProperty<ImageContentFilterResultForPrompt>( + image.SerializedAdditionalRawData, + "prompt_filter_results"); + } + + [Experimental("AOAI001")] + public static ImageContentFilterResultForResponse GetContentFilterResultForResponse(this GeneratedImage image) + { + return AdditionalPropertyHelpers.GetAdditionalProperty<ImageContentFilterResultForResponse>( + image.SerializedAdditionalRawData, + "content_filter_results"); + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Images/AzureImageClient.Protocol.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Images/AzureImageClient.Protocol.cs new file mode 100644 index 000000000..81fc8a739 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Images/AzureImageClient.Protocol.cs @@ -0,0 +1,84 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License.
+ +using System.ClientModel; +using System.ClientModel.Primitives; +using System.ComponentModel; +using OpenAI.Images; + +namespace Azure.AI.OpenAI.Images; + +internal partial class AzureImageClient : ImageClient +{ + [EditorBrowsable(EditorBrowsableState.Never)] + public override ClientResult GenerateImages(BinaryContent content, RequestOptions options = null) + { + using PipelineMessage message = CreateGenerateImagesRequestMessage(content, options); + return ClientResult.FromResponse(Pipeline.ProcessMessage(message, options)); + } + + [EditorBrowsable(EditorBrowsableState.Never)] + public override async Task<ClientResult> GenerateImagesAsync(BinaryContent content, RequestOptions options = null) + { + using PipelineMessage message = CreateGenerateImagesRequestMessage(content, options); + PipelineResponse response = await Pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false); + return ClientResult.FromResponse(response); + } + + [EditorBrowsable(EditorBrowsableState.Never)] + public override ClientResult GenerateImageEdits(BinaryContent content, string contentType, RequestOptions options = null) + { + using PipelineMessage message = CreateGenerateImageEditsRequestMessage(content, contentType, options); + return ClientResult.FromResponse(Pipeline.ProcessMessage(message, options)); + } + + [EditorBrowsable(EditorBrowsableState.Never)] + public override async Task<ClientResult> GenerateImageEditsAsync(BinaryContent content, string contentType, RequestOptions options = null) + { + using PipelineMessage message = CreateGenerateImageEditsRequestMessage(content, contentType, options); + PipelineResponse response = await Pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false); + return ClientResult.FromResponse(response); + } + + [EditorBrowsable(EditorBrowsableState.Never)] + public override ClientResult GenerateImageVariations(BinaryContent content, string contentType, RequestOptions options = null) + { + using PipelineMessage message =
CreateGenerateImageVariationsRequestMessage(content, contentType, options); + return ClientResult.FromResponse(Pipeline.ProcessMessage(message, options)); + } + + [EditorBrowsable(EditorBrowsableState.Never)] + public override async Task<ClientResult> GenerateImageVariationsAsync(BinaryContent content, string contentType, RequestOptions options = null) + { + using PipelineMessage message = CreateGenerateImageVariationsRequestMessage(content, contentType, options); + PipelineResponse response = await Pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false); + return ClientResult.FromResponse(response); + } + + private PipelineMessage CreateGenerateImagesRequestMessage(BinaryContent content, RequestOptions options = null) + => new AzureOpenAIPipelineMessageBuilder(Pipeline, _endpoint, _apiVersion, _deploymentName) + .WithMethod("POST") + .WithPath("images", "generations") + .WithContent(content, "application/json") + .WithAccept("application/json") + .WithOptions(options) + .Build(); + + private PipelineMessage CreateGenerateImageEditsRequestMessage(BinaryContent content, string contentType, RequestOptions options = null) + => new AzureOpenAIPipelineMessageBuilder(Pipeline, _endpoint, _apiVersion, _deploymentName) + .WithMethod("POST") + .WithPath("images", "edits") + .WithContent(content, contentType) + .WithAccept("application/json") + .WithOptions(options) + .Build(); + + private PipelineMessage CreateGenerateImageVariationsRequestMessage(BinaryContent content, string contentType, RequestOptions options = null) + => new AzureOpenAIPipelineMessageBuilder(Pipeline, _endpoint, _apiVersion, _deploymentName) + .WithMethod("POST") + .WithPath("images", "variations") + .WithContent(content, contentType) + .WithAccept("application/json") + .WithOptions(options) + .Build(); +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Images/AzureImageClient.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Images/AzureImageClient.cs new file mode 100644 index
000000000..ebeda9dae --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Images/AzureImageClient.cs @@ -0,0 +1,36 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +using OpenAI.Images; +using System.ClientModel.Primitives; + +namespace Azure.AI.OpenAI.Images; + +/// <summary> +/// The scenario client used for image operations with the Azure OpenAI service. +/// </summary> +/// <remarks> +/// To retrieve an instance of this type, use the matching method on <see cref="AzureOpenAIClient"/>. +/// </remarks> +internal partial class AzureImageClient : ImageClient +{ + private readonly string _deploymentName; + private readonly Uri _endpoint; + private readonly string _apiVersion; + + internal AzureImageClient(ClientPipeline pipeline, string deploymentName, Uri endpoint, AzureOpenAIClientOptions options) + : base(pipeline, model: deploymentName, new OpenAIClientOptions() { Endpoint = endpoint }) + { + Argument.AssertNotNull(pipeline, nameof(pipeline)); + Argument.AssertNotNullOrEmpty(deploymentName, nameof(deploymentName)); + Argument.AssertNotNull(endpoint, nameof(endpoint)); + options ??= new(); + + _deploymentName = deploymentName; + _endpoint = endpoint; + _apiVersion = options.Version; + } + + protected AzureImageClient() + { } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Images/GeneratorStubs.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Images/GeneratorStubs.cs new file mode 100644 index 000000000..181c1652f --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Images/GeneratorStubs.cs @@ -0,0 +1,7 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License.
+ +namespace Azure.AI.OpenAI; + +[CodeGenModel("AzureContentFilterImagePromptResults")] public partial class ImageContentFilterResultForPrompt { } +[CodeGenModel("AzureContentFilterImageResponseResults")] public partial class ImageContentFilterResultForResponse { } diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Internal/AzureOpenAIChatError.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Internal/AzureOpenAIChatError.cs new file mode 100644 index 000000000..4b73bcea1 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Internal/AzureOpenAIChatError.cs @@ -0,0 +1,52 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +using System.ClientModel.Primitives; +using System.Text; +using System.Text.Json; + +namespace Azure.AI.OpenAI; + +[CodeGenModel("AzureOpenAIChatError")] +internal partial class AzureOpenAIChatError +{ + internal static AzureOpenAIChatError TryCreateFromResponse(PipelineResponse response) + { + try + { + using JsonDocument errorDocument = JsonDocument.Parse(response.Content); + AzureOpenAIChatErrorResponse errorResponse + = AzureOpenAIChatErrorResponse.DeserializeAzureOpenAIChatErrorResponse(errorDocument.RootElement); + return errorResponse.Error; + } + catch (InvalidOperationException) + { + return null; + } + catch (JsonException) + { + return null; + } + } + + public string ToExceptionMessage(int httpStatus) + { + StringBuilder messageBuilder = new(); + messageBuilder.Append($"HTTP {httpStatus}"); + messageBuilder.Append(!string.IsNullOrEmpty(Type) || !string.IsNullOrEmpty(Code) ? " (" : string.Empty); + messageBuilder.Append(Type); + messageBuilder.Append(!string.IsNullOrEmpty(Type) ? ": " : string.Empty); + messageBuilder.Append(Code); + messageBuilder.Append(!string.IsNullOrEmpty(Type) || !string.IsNullOrEmpty(Code) ? 
")" : string.Empty); + messageBuilder.AppendLine(); + + if (!string.IsNullOrEmpty(Param)) + { + messageBuilder.AppendLine($"Parameter: {Param}"); + } + + messageBuilder.AppendLine(); + messageBuilder.Append(Message); + return messageBuilder.ToString(); + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Internal/AzureOpenAIDalleError.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Internal/AzureOpenAIDalleError.cs new file mode 100644 index 000000000..e7cab23b0 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Internal/AzureOpenAIDalleError.cs @@ -0,0 +1,52 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +using System.ClientModel.Primitives; +using System.Text; +using System.Text.Json; + +namespace Azure.AI.OpenAI; + +[CodeGenModel("AzureOpenAIDalleError")] +internal partial class AzureOpenAIDalleError +{ + internal static AzureOpenAIDalleError TryCreateFromResponse(PipelineResponse response) + { + try + { + using JsonDocument errorDocument = JsonDocument.Parse(response.Content); + AzureOpenAIDalleErrorResponse errorResponse + = AzureOpenAIDalleErrorResponse.DeserializeAzureOpenAIDalleErrorResponse(errorDocument.RootElement); + return errorResponse.Error; + } + catch (InvalidOperationException) + { + return null; + } + catch (JsonException) + { + return null; + } + } + + public string ToExceptionMessage(int httpStatus) + { + StringBuilder messageBuilder = new(); + messageBuilder.Append($"HTTP {httpStatus}"); + messageBuilder.Append(!string.IsNullOrEmpty(Type) || !string.IsNullOrEmpty(Code) ? " (" : string.Empty); + messageBuilder.Append(Type); + messageBuilder.Append(!string.IsNullOrEmpty(Type) ? ": " : string.Empty); + messageBuilder.Append(Code); + messageBuilder.Append(!string.IsNullOrEmpty(Type) || !string.IsNullOrEmpty(Code) ? 
")" : string.Empty); + messageBuilder.AppendLine(); + + if (!string.IsNullOrEmpty(Param)) + { + messageBuilder.AppendLine($"Parameter: {Param}"); + } + + messageBuilder.AppendLine(); + messageBuilder.Append(Message); + return messageBuilder.ToString(); + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Internal/ClientPipelineExtensions.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Internal/ClientPipelineExtensions.cs new file mode 100644 index 000000000..247188a14 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Internal/ClientPipelineExtensions.cs @@ -0,0 +1,66 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +using System.ClientModel; +using System.ClientModel.Primitives; + +namespace Azure.AI.OpenAI; + +internal static partial class ClientPipelineExtensions +{ + public static async ValueTask ProcessMessageAsync( + this ClientPipeline pipeline, + PipelineMessage message, + RequestOptions options) + { + await pipeline.SendAsync(message).ConfigureAwait(false); + + if (message.Response.IsError && (options?.ErrorOptions & ClientErrorBehaviors.NoThrow) != ClientErrorBehaviors.NoThrow) + { + throw await TryBufferResponseAndCreateErrorAsync(message).ConfigureAwait(false) switch + { + string errorMessage when !string.IsNullOrEmpty(errorMessage) + => new ClientResultException(errorMessage, message.Response), + _ => new ClientResultException(message.Response), + }; + } + + return message.Response; + } + + public static PipelineResponse ProcessMessage( + this ClientPipeline pipeline, + PipelineMessage message, + RequestOptions options) + { + pipeline.Send(message); + + if (message.Response.IsError && (options?.ErrorOptions & ClientErrorBehaviors.NoThrow) != ClientErrorBehaviors.NoThrow) + { + throw TryBufferResponseAndCreateError(message) switch + { + string errorMessage when !string.IsNullOrEmpty(errorMessage) + => new ClientResultException(errorMessage, 
message.Response), + _ => new ClientResultException(message.Response), + }; + } + + return message.Response; + } + + private static string TryBufferResponseAndCreateError(PipelineMessage message) + { + message.Response.BufferContent(); + return TryCreateErrorMessageFromResponse(message.Response); + } + + private static async Task<string> TryBufferResponseAndCreateErrorAsync(PipelineMessage message) + { + await message.Response.BufferContentAsync().ConfigureAwait(false); + return TryCreateErrorMessageFromResponse(message.Response); + } + + private static string TryCreateErrorMessageFromResponse(PipelineResponse response) + => AzureOpenAIChatError.TryCreateFromResponse(response)?.ToExceptionMessage(response.Status) + ?? AzureOpenAIDalleError.TryCreateFromResponse(response)?.ToExceptionMessage(response.Status); +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Internal/ClientUriBuilder.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Internal/ClientUriBuilder.cs new file mode 100644 index 000000000..6d81fd5ee --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Internal/ClientUriBuilder.cs @@ -0,0 +1,26 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +namespace Azure.AI.OpenAI; + +internal partial class ClientUriBuilder +{ + public Uri ToUri() + { + if (_pathBuilder is not null) + { + UriBuilder.Path = _pathBuilder.ToString(); + } + + // CUSTOMIZATION: UriBuilder.Query.set always prepends its own '?' pre-net5.0, which results in a double '??'. + // Pending generated ClientUriBuilder updates, mitigate by trimming the query's leading '?' so that only + // the one the builder adds remains. 
+ // see: https://github.com/Azure/autorest.csharp/issues/4815 + if (_queryBuilder is not null) + { + UriBuilder.Query = _queryBuilder.ToString().TrimStart('?'); + } + + return UriBuilder.Uri; + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Internal/GeneratorStubs.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Internal/GeneratorStubs.cs new file mode 100644 index 000000000..ba8c821f4 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/Internal/GeneratorStubs.cs @@ -0,0 +1,8 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +namespace Azure.AI.OpenAI; + +[CodeGenModel("AzureOpenAIChatErrorResponse")] internal partial class AzureOpenAIChatErrorResponse { } + +[CodeGenModel("AzureOpenAIDalleErrorResponse")] internal partial class AzureOpenAIDalleErrorResponse { } diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/VectorStores/AzureVectorStoreClient.Protocol.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/VectorStores/AzureVectorStoreClient.Protocol.cs new file mode 100644 index 000000000..63040be8b --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/VectorStores/AzureVectorStoreClient.Protocol.cs @@ -0,0 +1,310 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +using System.ClientModel; +using System.ClientModel.Primitives; + +namespace Azure.AI.OpenAI.VectorStores; + +internal partial class AzureVectorStoreClient : VectorStoreClient +{ + public override IAsyncEnumerable GetVectorStoresAsync(int? limit, string order, string after, string before, RequestOptions options) + { + AzureVectorStoresPageEnumerator enumerator = new(Pipeline, _endpoint, limit, order, after, before, _apiVersion, options); + return PageCollectionHelpers.CreateAsync(enumerator); + } + + public override IEnumerable GetVectorStores(int? 
limit, string order, string after, string before, RequestOptions options) + { + AzureVectorStoresPageEnumerator enumerator = new(Pipeline, _endpoint, limit, order, after, before, _apiVersion, options); + return PageCollectionHelpers.Create(enumerator); + } + + public override async Task<ClientResult> CreateVectorStoreAsync(BinaryContent content, RequestOptions options = null) + { + using PipelineMessage message = CreateCreateVectorStoreRequest(content, options); + return ClientResult.FromResponse(await Pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + public override ClientResult CreateVectorStore(BinaryContent content, RequestOptions options = null) + { + using PipelineMessage message = CreateCreateVectorStoreRequest(content, options); + return ClientResult.FromResponse(Pipeline.ProcessMessage(message, options)); + } + + public override async Task<ClientResult> GetVectorStoreAsync(string vectorStoreId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId)); + + using PipelineMessage message = CreateGetVectorStoreRequest(vectorStoreId, options); + return ClientResult.FromResponse(await Pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + public override ClientResult GetVectorStore(string vectorStoreId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId)); + + using PipelineMessage message = CreateGetVectorStoreRequest(vectorStoreId, options); + return ClientResult.FromResponse(Pipeline.ProcessMessage(message, options)); + } + + public override async Task<ClientResult> ModifyVectorStoreAsync(string vectorStoreId, BinaryContent content, RequestOptions options = null) + { + Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId)); + Argument.AssertNotNull(content, nameof(content)); + + using PipelineMessage message = CreateModifyVectorStoreRequest(vectorStoreId, content, options); + return ClientResult.FromResponse(await Pipeline.ProcessMessageAsync(message, 
options).ConfigureAwait(false)); + } + + public override ClientResult ModifyVectorStore(string vectorStoreId, BinaryContent content, RequestOptions options = null) + { + Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId)); + Argument.AssertNotNull(content, nameof(content)); + + using PipelineMessage message = CreateModifyVectorStoreRequest(vectorStoreId, content, options); + return ClientResult.FromResponse(Pipeline.ProcessMessage(message, options)); + } + + public override async Task DeleteVectorStoreAsync(string vectorStoreId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId)); + + using PipelineMessage message = CreateDeleteVectorStoreRequest(vectorStoreId, options); + return ClientResult.FromResponse(await Pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + public override ClientResult DeleteVectorStore(string vectorStoreId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId)); + + using PipelineMessage message = CreateDeleteVectorStoreRequest(vectorStoreId, options); + return ClientResult.FromResponse(Pipeline.ProcessMessage(message, options)); + } + + public override IAsyncEnumerable GetFileAssociationsAsync(string vectorStoreId, int? limit, string order, string after, string before, string filter, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId)); + + AzureVectorStoreFilesPageEnumerator enumerator = new(Pipeline, _endpoint, vectorStoreId, limit, order, after, before, filter, _apiVersion, options); + return PageCollectionHelpers.CreateAsync(enumerator); + } + + public override IEnumerable GetFileAssociations(string vectorStoreId, int? 
limit, string order, string after, string before, string filter, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId)); + + AzureVectorStoreFilesPageEnumerator enumerator = new(Pipeline, _endpoint, vectorStoreId, limit, order, after, before, filter, _apiVersion, options); + return PageCollectionHelpers.Create(enumerator); + } + + public override async Task AddFileToVectorStoreAsync(string vectorStoreId, BinaryContent content, RequestOptions options = null) + { + Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId)); + Argument.AssertNotNull(content, nameof(content)); + + using PipelineMessage message = CreateCreateVectorStoreFileRequest(vectorStoreId, content, options); + return ClientResult.FromResponse(await Pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + public override ClientResult AddFileToVectorStore(string vectorStoreId, BinaryContent content, RequestOptions options = null) + { + Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId)); + Argument.AssertNotNull(content, nameof(content)); + + using PipelineMessage message = CreateCreateVectorStoreFileRequest(vectorStoreId, content, options); + return ClientResult.FromResponse(Pipeline.ProcessMessage(message, options)); + } + + public override async Task GetFileAssociationAsync(string vectorStoreId, string fileId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId)); + Argument.AssertNotNullOrEmpty(fileId, nameof(fileId)); + + using PipelineMessage message = CreateGetVectorStoreFileRequest(vectorStoreId, fileId, options); + return ClientResult.FromResponse(await Pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + public override ClientResult GetFileAssociation(string vectorStoreId, string fileId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId)); + Argument.AssertNotNullOrEmpty(fileId, 
nameof(fileId)); + + using PipelineMessage message = CreateGetVectorStoreFileRequest(vectorStoreId, fileId, options); + return ClientResult.FromResponse(Pipeline.ProcessMessage(message, options)); + } + + public override async Task RemoveFileFromStoreAsync(string vectorStoreId, string fileId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId)); + Argument.AssertNotNullOrEmpty(fileId, nameof(fileId)); + + using PipelineMessage message = CreateDeleteVectorStoreFileRequest(vectorStoreId, fileId, options); + return ClientResult.FromResponse(await Pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + public override ClientResult RemoveFileFromStore(string vectorStoreId, string fileId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId)); + Argument.AssertNotNullOrEmpty(fileId, nameof(fileId)); + + using PipelineMessage message = CreateDeleteVectorStoreFileRequest(vectorStoreId, fileId, options); + return ClientResult.FromResponse(Pipeline.ProcessMessage(message, options)); + } + + public override async Task CreateBatchFileJobAsync(string vectorStoreId, BinaryContent content, RequestOptions options = null) + { + Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId)); + Argument.AssertNotNull(content, nameof(content)); + + using PipelineMessage message = CreateCreateVectorStoreFileBatchRequest(vectorStoreId, content, options); + return ClientResult.FromResponse(await Pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + public override ClientResult CreateBatchFileJob(string vectorStoreId, BinaryContent content, RequestOptions options = null) + { + Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId)); + Argument.AssertNotNull(content, nameof(content)); + + using PipelineMessage message = CreateCreateVectorStoreFileBatchRequest(vectorStoreId, content, options); + return 
ClientResult.FromResponse(Pipeline.ProcessMessage(message, options)); + } + + public override async Task GetBatchFileJobAsync(string vectorStoreId, string batchId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId)); + Argument.AssertNotNullOrEmpty(batchId, nameof(batchId)); + + using PipelineMessage message = CreateGetVectorStoreFileBatchRequest(vectorStoreId, batchId, options); + return ClientResult.FromResponse(await Pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + public override ClientResult GetBatchFileJob(string vectorStoreId, string batchId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId)); + Argument.AssertNotNullOrEmpty(batchId, nameof(batchId)); + + using PipelineMessage message = CreateGetVectorStoreFileBatchRequest(vectorStoreId, batchId, options); + return ClientResult.FromResponse(Pipeline.ProcessMessage(message, options)); + } + + public override async Task CancelBatchFileJobAsync(string vectorStoreId, string batchId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId)); + Argument.AssertNotNullOrEmpty(batchId, nameof(batchId)); + + using PipelineMessage message = CreateCancelVectorStoreFileBatchRequest(vectorStoreId, batchId, options); + return ClientResult.FromResponse(await Pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + public override ClientResult CancelBatchFileJob(string vectorStoreId, string batchId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId)); + Argument.AssertNotNullOrEmpty(batchId, nameof(batchId)); + + using PipelineMessage message = CreateCancelVectorStoreFileBatchRequest(vectorStoreId, batchId, options); + return ClientResult.FromResponse(Pipeline.ProcessMessage(message, options)); + } + + public override IAsyncEnumerable GetFileAssociationsAsync(string vectorStoreId, string batchId, 
int? limit, string order, string after, string before, string filter, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId)); + Argument.AssertNotNullOrEmpty(batchId, nameof(batchId)); + + AzureVectorStoreFileBatchesPageEnumerator enumerator = new(Pipeline, _endpoint, vectorStoreId, batchId, limit, order, after, before, filter, _apiVersion, options); + return PageCollectionHelpers.CreateAsync(enumerator); + } + + public override IEnumerable GetFileAssociations(string vectorStoreId, string batchId, int? limit, string order, string after, string before, string filter, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId)); + Argument.AssertNotNullOrEmpty(batchId, nameof(batchId)); + + AzureVectorStoreFileBatchesPageEnumerator enumerator = new(Pipeline, _endpoint, vectorStoreId, batchId, limit, order, after, before, filter, _apiVersion, options); + return PageCollectionHelpers.Create(enumerator); + } + + private new PipelineMessage CreateCreateVectorStoreRequest(BinaryContent content, RequestOptions options) + => new AzureOpenAIPipelineMessageBuilder(Pipeline, _endpoint, _apiVersion) + .WithMethod("POST") + .WithPath("vector_stores") + .WithContent(content, "application/json") + .WithAccept("application/json") + .WithOptions(options) + .Build(); + + private new PipelineMessage CreateGetVectorStoreRequest(string vectorStoreId, RequestOptions options) + => new AzureOpenAIPipelineMessageBuilder(Pipeline, _endpoint, _apiVersion) + .WithMethod("GET") + .WithPath("vector_stores", vectorStoreId) + .WithAccept("application/json") + .WithOptions(options) + .Build(); + + private new PipelineMessage CreateModifyVectorStoreRequest(string vectorStoreId, BinaryContent content, RequestOptions options) + => new AzureOpenAIPipelineMessageBuilder(Pipeline, _endpoint, _apiVersion) + .WithMethod("POST") + .WithPath("vector_stores", vectorStoreId) + .WithContent(content, "application/json") + 
.WithAccept("application/json") + .WithOptions(options) + .Build(); + + private new PipelineMessage CreateDeleteVectorStoreRequest(string vectorStoreId, RequestOptions options) + => new AzureOpenAIPipelineMessageBuilder(Pipeline, _endpoint, _apiVersion) + .WithMethod("DELETE") + .WithPath("vector_stores", vectorStoreId) + .WithAccept("application/json") + .WithOptions(options) + .Build(); + + private new PipelineMessage CreateCreateVectorStoreFileRequest(string vectorStoreId, BinaryContent content, RequestOptions options) + => new AzureOpenAIPipelineMessageBuilder(Pipeline, _endpoint, _apiVersion) + .WithMethod("POST") + .WithPath("vector_stores", vectorStoreId, "files") + .WithContent(content, "application/json") + .WithAccept("application/json") + .WithOptions(options) + .Build(); + + private new PipelineMessage CreateGetVectorStoreFileRequest(string vectorStoreId, string fileId, RequestOptions options) + => new AzureOpenAIPipelineMessageBuilder(Pipeline, _endpoint, _apiVersion) + .WithMethod("GET") + .WithPath("vector_stores", vectorStoreId, "files", fileId) + .WithAccept("application/json") + .WithOptions(options) + .Build(); + + private new PipelineMessage CreateDeleteVectorStoreFileRequest(string vectorStoreId, string fileId, RequestOptions options) + => new AzureOpenAIPipelineMessageBuilder(Pipeline, _endpoint, _apiVersion) + .WithMethod("DELETE") + .WithPath("vector_stores", vectorStoreId, "files", fileId) + .WithAccept("application/json") + .WithOptions(options) + .Build(); + + private new PipelineMessage CreateCreateVectorStoreFileBatchRequest(string vectorStoreId, BinaryContent content, RequestOptions options) + => new AzureOpenAIPipelineMessageBuilder(Pipeline, _endpoint, _apiVersion) + .WithMethod("POST") + .WithPath("vector_stores", vectorStoreId, "file_batches") + .WithContent(content, "application/json") + .WithAccept("application/json") + .WithOptions(options) + .Build(); + + private new PipelineMessage CreateGetVectorStoreFileBatchRequest(string 
vectorStoreId, string batchId, RequestOptions options) + => new AzureOpenAIPipelineMessageBuilder(Pipeline, _endpoint, _apiVersion) + .WithMethod("GET") + .WithPath("vector_stores", vectorStoreId, "file_batches", batchId) + .WithAccept("application/json") + .WithOptions(options) + .Build(); + + private new PipelineMessage CreateCancelVectorStoreFileBatchRequest(string vectorStoreId, string batchId, RequestOptions options) + => new AzureOpenAIPipelineMessageBuilder(Pipeline, _endpoint, _apiVersion) + .WithMethod("POST") + .WithPath("vector_stores", vectorStoreId, "file_batches", batchId, "cancel") + .WithAccept("application/json") + .WithOptions(options) + .Build(); +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/VectorStores/AzureVectorStoreClient.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/VectorStores/AzureVectorStoreClient.cs new file mode 100644 index 000000000..52b07ee81 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/VectorStores/AzureVectorStoreClient.cs @@ -0,0 +1,35 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +using OpenAI.VectorStores; +using System.ClientModel.Primitives; +using System.Diagnostics.CodeAnalysis; + +namespace Azure.AI.OpenAI.VectorStores; + +/// +/// The scenario client used for vector store operations with the Azure OpenAI service. +/// +/// +/// To retrieve an instance of this type, use the matching method on . 
+/// +[Experimental("OPENAI001")] +internal partial class AzureVectorStoreClient : VectorStoreClient +{ + private readonly Uri _endpoint; + private readonly string _apiVersion; + + internal AzureVectorStoreClient(ClientPipeline pipeline, Uri endpoint, AzureOpenAIClientOptions options) + : base(pipeline, new OpenAIClientOptions() { Endpoint = endpoint }) + { + Argument.AssertNotNull(pipeline, nameof(pipeline)); + Argument.AssertNotNull(endpoint, nameof(endpoint)); + options ??= new(); + + _endpoint = endpoint; + _apiVersion = options.Version; + } + + protected AzureVectorStoreClient() + { } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/VectorStores/Internal/Pagination/AzureVectorStoreFileBatchesPageEnumerator.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/VectorStores/Internal/Pagination/AzureVectorStoreFileBatchesPageEnumerator.cs new file mode 100644 index 000000000..21ee0a3fb --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/VectorStores/Internal/Pagination/AzureVectorStoreFileBatchesPageEnumerator.cs @@ -0,0 +1,54 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +using System.ClientModel; +using System.ClientModel.Primitives; + +namespace Azure.AI.OpenAI.VectorStores; + +internal partial class AzureVectorStoreFileBatchesPageEnumerator : VectorStoreFileBatchesPageEnumerator +{ + private readonly Uri _endpoint; + private readonly string _apiVersion; + + public AzureVectorStoreFileBatchesPageEnumerator( + ClientPipeline pipeline, + Uri endpoint, + string vectorStoreId, string batchId, int? limit, string order, string after, string before, string filter, + string apiVersion, + RequestOptions options) + : base(pipeline, endpoint, vectorStoreId, batchId, limit, order, after, before, filter, options) + { + _endpoint = endpoint; + _apiVersion = apiVersion; + } + + internal override async Task GetFileAssociationsAsync(string vectorStoreId, string batchId, int? 
limit, string order, string after, string before, string filter, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId)); + Argument.AssertNotNullOrEmpty(batchId, nameof(batchId)); + + using PipelineMessage message = CreateGetFilesInVectorStoreBatchesRequest(vectorStoreId, batchId, limit, order, after, before, filter, options); + return ClientResult.FromResponse(await Pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + internal override ClientResult GetFileAssociations(string vectorStoreId, string batchId, int? limit, string order, string after, string before, string filter, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId)); + Argument.AssertNotNullOrEmpty(batchId, nameof(batchId)); + + using PipelineMessage message = CreateGetFilesInVectorStoreBatchesRequest(vectorStoreId, batchId, limit, order, after, before, filter, options); + return ClientResult.FromResponse(Pipeline.ProcessMessage(message, options)); + } + + private new PipelineMessage CreateGetFilesInVectorStoreBatchesRequest(string vectorStoreId, string batchId, int? 
limit, string order, string after, string before, string filter, RequestOptions options) + => new AzureOpenAIPipelineMessageBuilder(Pipeline, _endpoint, _apiVersion) + .WithAssistantsHeader() + .WithOptions(options) + .WithMethod("GET") + .WithAccept("application/json") + .WithCommonListParameters(limit, order, after, before) + .WithOptionalQueryParameter("filter", filter) + .WithPath("vector_stores", vectorStoreId, "file_batches", batchId, "files") + .Build(); +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/VectorStores/Internal/Pagination/AzureVectorStoreFilesPageEnumerator.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/VectorStores/Internal/Pagination/AzureVectorStoreFilesPageEnumerator.cs new file mode 100644 index 000000000..4152a4869 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/VectorStores/Internal/Pagination/AzureVectorStoreFilesPageEnumerator.cs @@ -0,0 +1,53 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +using System.ClientModel; +using System.ClientModel.Primitives; + +namespace Azure.AI.OpenAI.VectorStores; + +internal partial class AzureVectorStoreFilesPageEnumerator : VectorStoreFilesPageEnumerator +{ + private readonly Uri _endpoint; + private readonly string _apiVersion; + + public AzureVectorStoreFilesPageEnumerator( + ClientPipeline pipeline, + Uri endpoint, + string vectorStoreId, + int? limit, string order, string after, string before, string filter, + string apiVersion, + RequestOptions options) + : base(pipeline, endpoint, vectorStoreId, limit, order, after, before, filter, options) + { + _endpoint = endpoint; + _apiVersion = apiVersion; + } + + internal override async Task GetFileAssociationsAsync(string vectorStoreId, int? 
limit, string order, string after, string before, string filter, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId)); + + using PipelineMessage message = CreateGetVectorStoreFilesRequest(vectorStoreId, limit, order, after, before, filter, options); + return ClientResult.FromResponse(await Pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + internal override ClientResult GetFileAssociations(string vectorStoreId, int? limit, string order, string after, string before, string filter, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId)); + + using PipelineMessage message = CreateGetVectorStoreFilesRequest(vectorStoreId, limit, order, after, before, filter, options); + return ClientResult.FromResponse(Pipeline.ProcessMessage(message, options)); + } + + private new PipelineMessage CreateGetVectorStoreFilesRequest(string vectorStoreId, int? limit, string order, string after, string before, string filter, RequestOptions options) + => new AzureOpenAIPipelineMessageBuilder(Pipeline, _endpoint, _apiVersion) + .WithAssistantsHeader() + .WithOptions(options) + .WithMethod("GET") + .WithAccept("application/json") + .WithCommonListParameters(limit, order, after, before) + .WithOptionalQueryParameter("filter", filter) + .WithPath("vector_stores", vectorStoreId, "files") + .Build(); +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/VectorStores/Internal/Pagination/AzureVectorStoresPageEnumerator.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/VectorStores/Internal/Pagination/AzureVectorStoresPageEnumerator.cs new file mode 100644 index 000000000..d8a90039f --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Custom/VectorStores/Internal/Pagination/AzureVectorStoresPageEnumerator.cs @@ -0,0 +1,47 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. 
+ +using System.ClientModel; +using System.ClientModel.Primitives; + +namespace Azure.AI.OpenAI.VectorStores; + +internal partial class AzureVectorStoresPageEnumerator : VectorStoresPageEnumerator +{ + private readonly Uri _endpoint; + private readonly string _apiVersion; + + public AzureVectorStoresPageEnumerator( + ClientPipeline pipeline, + Uri endpoint, + int? limit, string order, string after, string before, + string apiVersion, + RequestOptions options) + : base(pipeline, endpoint, limit, order, after, before, options) + { + _endpoint = endpoint; + _apiVersion = apiVersion; + } + + internal override async Task GetVectorStoresAsync(int? limit, string order, string after, string before, RequestOptions options) + { + using PipelineMessage message = CreateGetVectorStoresRequest(limit, order, after, before, options); + return ClientResult.FromResponse(await Pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + internal override ClientResult GetVectorStores(int? limit, string order, string after, string before, RequestOptions options) + { + using PipelineMessage message = CreateGetVectorStoresRequest(limit, order, after, before, options); + return ClientResult.FromResponse(Pipeline.ProcessMessage(message, options)); + } + + private new PipelineMessage CreateGetVectorStoresRequest(int? 
limit, string order, string after, string before, RequestOptions options) + => new AzureOpenAIPipelineMessageBuilder(Pipeline, _endpoint, _apiVersion) + .WithAssistantsHeader() + .WithOptions(options) + .WithMethod("GET") + .WithAccept("application/json") + .WithCommonListParameters(limit, order, after, before) + .WithPath("vector_stores") + .Build(); +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureChatCitation.Serialization.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureChatCitation.Serialization.cs new file mode 100644 index 000000000..7f9738eea --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureChatCitation.Serialization.cs @@ -0,0 +1,186 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace Azure.AI.OpenAI.Chat +{ + public partial class AzureChatCitation : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(AzureChatCitation)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("content") != true) + { + writer.WritePropertyName("content"u8); + writer.WriteStringValue(Content); + } + if (SerializedAdditionalRawData?.ContainsKey("title") != true && Optional.IsDefined(Title)) + { + writer.WritePropertyName("title"u8); + writer.WriteStringValue(Title); + } + if (SerializedAdditionalRawData?.ContainsKey("url") != true && Optional.IsDefined(Url)) + { + writer.WritePropertyName("url"u8); + writer.WriteStringValue(Url); + } + if (SerializedAdditionalRawData?.ContainsKey("filepath") != true && Optional.IsDefined(Filepath)) + { + writer.WritePropertyName("filepath"u8); + writer.WriteStringValue(Filepath); + } + if (SerializedAdditionalRawData?.ContainsKey("chunk_id") != true && Optional.IsDefined(ChunkId)) + { + writer.WritePropertyName("chunk_id"u8); + writer.WriteStringValue(ChunkId); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + AzureChatCitation IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(AzureChatCitation)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeAzureChatCitation(document.RootElement, options); + } + + internal static AzureChatCitation DeserializeAzureChatCitation(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string content = default; + string title = default; + string url = default; + string filepath = default; + string chunkId = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("content"u8)) + { + content = property.Value.GetString(); + continue; + } + if (property.NameEquals("title"u8)) + { + title = property.Value.GetString(); + continue; + } + if (property.NameEquals("url"u8)) + { + url = property.Value.GetString(); + continue; + } + if (property.NameEquals("filepath"u8)) + { + filepath = property.Value.GetString(); + continue; + } + if (property.NameEquals("chunk_id"u8)) + { + chunkId = property.Value.GetString(); + continue; + } + if (options.Format != "W") + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new AzureChatCitation( + content, + title, + url, + filepath, + chunkId, + serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(AzureChatCitation)} does not support writing '{options.Format}' format."); + } + } + + AzureChatCitation IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeAzureChatCitation(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(AzureChatCitation)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + /// Deserializes the model from a raw response. + /// The result to deserialize the model from. + internal static AzureChatCitation FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeAzureChatCitation(document.RootElement); + } + + /// Convert into a . + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureChatCitation.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureChatCitation.cs new file mode 100644 index 000000000..114bc7d3b --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureChatCitation.cs @@ -0,0 +1,87 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace Azure.AI.OpenAI.Chat +{ + /// The AzureChatMessageContextCitation. + public partial class AzureChatCitation + { + /// + /// Keeps track of any properties unknown to the library. 
+ /// + /// To assign an object to the value of this property use . + /// + /// + /// To assign an already formatted json string to this property use . + /// + /// + /// Examples: + /// + /// + /// BinaryData.FromObjectAsJson("foo") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromString("\"foo\"") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromObjectAsJson(new { key = "value" }) + /// Creates a payload of { "key": "value" }. + /// + /// + /// BinaryData.FromString("{\"key\": \"value\"}") + /// Creates a payload of { "key": "value" }. + /// + /// + /// + /// + internal IDictionary SerializedAdditionalRawData { get; set; } + /// Initializes a new instance of . + /// The content of the citation. + /// is null. + internal AzureChatCitation(string content) + { + Argument.AssertNotNull(content, nameof(content)); + + Content = content; + } + + /// Initializes a new instance of . + /// The content of the citation. + /// The title for the citation. + /// The URL of the citation. + /// The file path for the citation. + /// The chunk ID for the citation. + /// Keeps track of any properties unknown to the library. + internal AzureChatCitation(string content, string title, string url, string filepath, string chunkId, IDictionary serializedAdditionalRawData) + { + Content = content; + Title = title; + Url = url; + Filepath = filepath; + ChunkId = chunkId; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + /// Initializes a new instance of for deserialization. + internal AzureChatCitation() + { + } + + /// The content of the citation. + public string Content { get; } + /// The title for the citation. + public string Title { get; } + /// The URL of the citation. + public string Url { get; } + /// The file path for the citation. + public string Filepath { get; } + /// The chunk ID for the citation. 
+ public string ChunkId { get; } + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureChatDataSource.Serialization.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureChatDataSource.Serialization.cs new file mode 100644 index 000000000..76263b02b --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureChatDataSource.Serialization.cs @@ -0,0 +1,130 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Text.Json; + +namespace Azure.AI.OpenAI.Chat +{ + [PersistableModelProxy(typeof(InternalUnknownAzureChatDataSource))] + public partial class AzureChatDataSource : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(AzureChatDataSource)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + AzureChatDataSource IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(AzureChatDataSource)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeAzureChatDataSource(document.RootElement, options); + } + + internal static AzureChatDataSource DeserializeAzureChatDataSource(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + if (element.TryGetProperty("type", out JsonElement discriminator)) + { + switch (discriminator.GetString()) + { + case "azure_cosmos_db": return AzureCosmosDBChatDataSource.DeserializeAzureCosmosDBChatDataSource(element, options); + case "azure_ml_index": return AzureMachineLearningIndexChatDataSource.DeserializeAzureMachineLearningIndexChatDataSource(element, options); + case "azure_search": return AzureSearchChatDataSource.DeserializeAzureSearchChatDataSource(element, options); + case "elasticsearch": return ElasticsearchChatDataSource.DeserializeElasticsearchChatDataSource(element, options); + case "pinecone": return PineconeChatDataSource.DeserializePineconeChatDataSource(element, options); + } + } + return InternalUnknownAzureChatDataSource.DeserializeInternalUnknownAzureChatDataSource(element, options); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<AzureChatDataSource>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    return ModelReaderWriter.Write(this, options);
+                default:
+                    throw new FormatException($"The model {nameof(AzureChatDataSource)} does not support writing '{options.Format}' format.");
+            }
+        }
+
+        AzureChatDataSource IPersistableModel<AzureChatDataSource>.Create(BinaryData data, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<AzureChatDataSource>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    {
+                        using JsonDocument document = JsonDocument.Parse(data);
+                        return DeserializeAzureChatDataSource(document.RootElement, options);
+                    }
+                default:
+                    throw new FormatException($"The model {nameof(AzureChatDataSource)} does not support reading '{options.Format}' format.");
+            }
+        }
+
+        string IPersistableModel<AzureChatDataSource>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J";
+
+        /// <summary> Deserializes the model from a raw response. </summary>
+        /// <param name="response"> The result to deserialize the model from. </param>
+        internal static AzureChatDataSource FromResponse(PipelineResponse response)
+        {
+            using var document = JsonDocument.Parse(response.Content);
+            return DeserializeAzureChatDataSource(document.RootElement);
+        }
+
+        /// <summary> Convert into a <see cref="BinaryContent"/>. </summary>
+        internal virtual BinaryContent ToBinaryContent()
+        {
+            return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions);
+        }
+    }
+}
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureChatDataSource.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureChatDataSource.cs
new file mode 100644
index 000000000..01acaf5cf
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureChatDataSource.cs
@@ -0,0 +1,68 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.Collections.Generic;
+
+namespace Azure.AI.OpenAI.Chat
+{
+    /// <summary>
+    /// A representation of configuration data for a single Azure OpenAI chat data source.
+    /// This will be used by a chat completions request that should use Azure OpenAI chat extensions to augment the
+    /// response behavior.
+    /// The use of this configuration is compatible only with Azure OpenAI.
+    /// Please note <see cref="AzureChatDataSource"/> is the base class. According to the scenario, a derived class of the base class might need to be assigned here, or this property needs to be casted to one of the possible derived classes.
+    /// The available derived classes include <see cref="AzureCosmosDBChatDataSource"/>, <see cref="AzureMachineLearningIndexChatDataSource"/>, <see cref="AzureSearchChatDataSource"/>, <see cref="ElasticsearchChatDataSource"/> and <see cref="PineconeChatDataSource"/>.
+    /// </summary>
+    public abstract partial class AzureChatDataSource
+    {
+        /// <summary>
+        /// Keeps track of any properties unknown to the library.
+        /// <para>
+        /// To assign an object to the value of this property use <see cref="BinaryData.FromObjectAsJson{T}(T, System.Text.Json.JsonSerializerOptions?)"/>.
+        /// </para>
+        /// <para>
+        /// To assign an already formatted json string to this property use <see cref="BinaryData.FromString(string)"/>.
+        /// </para>
+        /// <para>
+        /// Examples:
+        /// <list type="bullet">
+        /// <item>
+        /// <term>BinaryData.FromObjectAsJson("foo")</term>
+        /// <description>Creates a payload of "foo".</description>
+        /// </item>
+        /// <item>
+        /// <term>BinaryData.FromString("\"foo\"")</term>
+        /// <description>Creates a payload of "foo".</description>
+        /// </item>
+        /// <item>
+        /// <term>BinaryData.FromObjectAsJson(new { key = "value" })</term>
+        /// <description>Creates a payload of { "key": "value" }.</description>
+        /// </item>
+        /// <item>
+        /// <term>BinaryData.FromString("{\"key\": \"value\"}")</term>
+        /// <description>Creates a payload of { "key": "value" }.</description>
+        /// </item>
+        /// </list>
+        /// </para>
+        /// </summary>
+        internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; }
+        /// <summary> Initializes a new instance of <see cref="AzureChatDataSource"/>. </summary>
+        protected AzureChatDataSource()
+        {
+        }
+
+        /// <summary> Initializes a new instance of <see cref="AzureChatDataSource"/>. </summary>
+        /// <param name="type"> The differentiating type identifier for the data source. </param>
+        /// <param name="serializedAdditionalRawData"> Keeps track of any properties unknown to the library. </param>
+        internal AzureChatDataSource(string type, IDictionary<string, BinaryData> serializedAdditionalRawData)
+        {
+            Type = type;
+            SerializedAdditionalRawData = serializedAdditionalRawData;
+        }
+
+        /// <summary> The differentiating type identifier for the data source. </summary>
+        internal string Type { get; set; }
+    }
+}
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureChatMessageContext.Serialization.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureChatMessageContext.Serialization.cs
new file mode 100644
index 000000000..40f6cd625
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureChatMessageContext.Serialization.cs
@@ -0,0 +1,176 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace Azure.AI.OpenAI.Chat
+{
+    public partial class AzureChatMessageContext : IJsonModel<AzureChatMessageContext>
+    {
+        void IJsonModel<AzureChatMessageContext>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<AzureChatMessageContext>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(AzureChatMessageContext)} does not support writing '{format}' format.");
+            }
+
+            writer.WriteStartObject();
+            if (SerializedAdditionalRawData?.ContainsKey("intent") != true && Optional.IsDefined(Intent))
+            {
+                writer.WritePropertyName("intent"u8);
+                writer.WriteStringValue(Intent);
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("citations") != true && Optional.IsCollectionDefined(Citations))
+            {
+                writer.WritePropertyName("citations"u8);
+                writer.WriteStartArray();
+                foreach (var item in Citations)
+                {
+                    writer.WriteObjectValue(item, options);
+                }
+                writer.WriteEndArray();
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("all_retrieved_documents") != true && Optional.IsDefined(AllRetrievedDocuments))
+            {
+                writer.WritePropertyName("all_retrieved_documents"u8);
+                writer.WriteObjectValue(AllRetrievedDocuments, options);
+            }
+            if (SerializedAdditionalRawData != null)
+            {
+                foreach (var item in SerializedAdditionalRawData)
+                {
+                    if (ModelSerializationExtensions.IsSentinelValue(item.Value))
+                    {
+                        continue;
+                    }
+ writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + AzureChatMessageContext IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(AzureChatMessageContext)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeAzureChatMessageContext(document.RootElement, options); + } + + internal static AzureChatMessageContext DeserializeAzureChatMessageContext(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string intent = default; + IReadOnlyList citations = default; + AzureChatRetrievedDocument allRetrievedDocuments = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("intent"u8)) + { + intent = property.Value.GetString(); + continue; + } + if (property.NameEquals("citations"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + List array = new List(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(AzureChatCitation.DeserializeAzureChatCitation(item, options)); + } + citations = array; + continue; + } + if (property.NameEquals("all_retrieved_documents"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + allRetrievedDocuments = 
AzureChatRetrievedDocument.DeserializeAzureChatRetrievedDocument(property.Value, options); + continue; + } + if (options.Format != "W") + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new AzureChatMessageContext(intent, citations ?? new ChangeTrackingList(), allRetrievedDocuments, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(AzureChatMessageContext)} does not support writing '{options.Format}' format."); + } + } + + AzureChatMessageContext IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeAzureChatMessageContext(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(AzureChatMessageContext)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + /// Deserializes the model from a raw response. + /// The result to deserialize the model from. + internal static AzureChatMessageContext FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeAzureChatMessageContext(document.RootElement); + } + + /// Convert into a . 
+ internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureChatMessageContext.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureChatMessageContext.cs new file mode 100644 index 000000000..dd2367368 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureChatMessageContext.cs @@ -0,0 +1,73 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace Azure.AI.OpenAI.Chat +{ + /// + /// An additional property, added to chat completion response messages, produced by the Azure OpenAI service when using + /// extension behavior. This includes intent and citation information from the On Your Data feature. + /// + public partial class AzureChatMessageContext + { + /// + /// Keeps track of any properties unknown to the library. + /// + /// To assign an object to the value of this property use . + /// + /// + /// To assign an already formatted json string to this property use . + /// + /// + /// Examples: + /// + /// + /// BinaryData.FromObjectAsJson("foo") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromString("\"foo\"") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromObjectAsJson(new { key = "value" }) + /// Creates a payload of { "key": "value" }. + /// + /// + /// BinaryData.FromString("{\"key\": \"value\"}") + /// Creates a payload of { "key": "value" }. + /// + /// + /// + /// + internal IDictionary SerializedAdditionalRawData { get; set; } + /// Initializes a new instance of . + internal AzureChatMessageContext() + { + Citations = new ChangeTrackingList(); + } + + /// Initializes a new instance of . + /// The detected intent from the chat history, which is used to carry conversation context between interactions. + /// The citations produced by the data retrieval. 
+ /// Summary information about documents retrieved by the data retrieval operation. + /// Keeps track of any properties unknown to the library. + internal AzureChatMessageContext(string intent, IReadOnlyList citations, AzureChatRetrievedDocument allRetrievedDocuments, IDictionary serializedAdditionalRawData) + { + Intent = intent; + Citations = citations; + AllRetrievedDocuments = allRetrievedDocuments; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + /// The detected intent from the chat history, which is used to carry conversation context between interactions. + public string Intent { get; } + /// The citations produced by the data retrieval. + public IReadOnlyList Citations { get; } + /// Summary information about documents retrieved by the data retrieval operation. + public AzureChatRetrievedDocument AllRetrievedDocuments { get; } + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureChatRetrievedDocument.Serialization.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureChatRetrievedDocument.Serialization.cs new file mode 100644 index 000000000..963e4902b --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureChatRetrievedDocument.Serialization.cs @@ -0,0 +1,268 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace Azure.AI.OpenAI.Chat +{ + public partial class AzureChatRetrievedDocument : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(AzureChatRetrievedDocument)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("content") != true) + { + writer.WritePropertyName("content"u8); + writer.WriteStringValue(Content); + } + if (SerializedAdditionalRawData?.ContainsKey("title") != true && Optional.IsDefined(Title)) + { + writer.WritePropertyName("title"u8); + writer.WriteStringValue(Title); + } + if (SerializedAdditionalRawData?.ContainsKey("url") != true && Optional.IsDefined(Url)) + { + writer.WritePropertyName("url"u8); + writer.WriteStringValue(Url); + } + if (SerializedAdditionalRawData?.ContainsKey("filepath") != true && Optional.IsDefined(Filepath)) + { + writer.WritePropertyName("filepath"u8); + writer.WriteStringValue(Filepath); + } + if (SerializedAdditionalRawData?.ContainsKey("chunk_id") != true && Optional.IsDefined(ChunkId)) + { + writer.WritePropertyName("chunk_id"u8); + writer.WriteStringValue(ChunkId); + } + if (SerializedAdditionalRawData?.ContainsKey("search_queries") != true) + { + writer.WritePropertyName("search_queries"u8); + writer.WriteStartArray(); + foreach (var item in SearchQueries) + { + writer.WriteStringValue(item); + } + writer.WriteEndArray(); + } + if (SerializedAdditionalRawData?.ContainsKey("data_source_index") != true) + { + writer.WritePropertyName("data_source_index"u8); + writer.WriteNumberValue(DataSourceIndex); + } + if (SerializedAdditionalRawData?.ContainsKey("original_search_score") != true && Optional.IsDefined(OriginalSearchScore)) + { + writer.WritePropertyName("original_search_score"u8); + writer.WriteNumberValue(OriginalSearchScore.Value); + } + if (SerializedAdditionalRawData?.ContainsKey("rerank_score") != true && Optional.IsDefined(RerankScore)) + { + writer.WritePropertyName("rerank_score"u8); + 
writer.WriteNumberValue(RerankScore.Value); + } + if (SerializedAdditionalRawData?.ContainsKey("filter_reason") != true && Optional.IsDefined(FilterReason)) + { + writer.WritePropertyName("filter_reason"u8); + writer.WriteStringValue(FilterReason.Value.ToString()); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + AzureChatRetrievedDocument IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(AzureChatRetrievedDocument)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeAzureChatRetrievedDocument(document.RootElement, options); + } + + internal static AzureChatRetrievedDocument DeserializeAzureChatRetrievedDocument(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string content = default; + string title = default; + string url = default; + string filepath = default; + string chunkId = default; + IReadOnlyList searchQueries = default; + int dataSourceIndex = default; + double? originalSearchScore = default; + double? rerankScore = default; + AzureChatRetrievedDocumentFilterReason? 
filterReason = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("content"u8)) + { + content = property.Value.GetString(); + continue; + } + if (property.NameEquals("title"u8)) + { + title = property.Value.GetString(); + continue; + } + if (property.NameEquals("url"u8)) + { + url = property.Value.GetString(); + continue; + } + if (property.NameEquals("filepath"u8)) + { + filepath = property.Value.GetString(); + continue; + } + if (property.NameEquals("chunk_id"u8)) + { + chunkId = property.Value.GetString(); + continue; + } + if (property.NameEquals("search_queries"u8)) + { + List array = new List(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(item.GetString()); + } + searchQueries = array; + continue; + } + if (property.NameEquals("data_source_index"u8)) + { + dataSourceIndex = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("original_search_score"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + originalSearchScore = property.Value.GetDouble(); + continue; + } + if (property.NameEquals("rerank_score"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + rerankScore = property.Value.GetDouble(); + continue; + } + if (property.NameEquals("filter_reason"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + filterReason = new AzureChatRetrievedDocumentFilterReason(property.Value.GetString()); + continue; + } + if (options.Format != "W") + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new AzureChatRetrievedDocument( + content, + title, + url, + filepath, + chunkId, + searchQueries, + dataSourceIndex, + originalSearchScore, + 
rerankScore, + filterReason, + serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(AzureChatRetrievedDocument)} does not support writing '{options.Format}' format."); + } + } + + AzureChatRetrievedDocument IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeAzureChatRetrievedDocument(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(AzureChatRetrievedDocument)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + /// Deserializes the model from a raw response. + /// The result to deserialize the model from. + internal static AzureChatRetrievedDocument FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeAzureChatRetrievedDocument(document.RootElement); + } + + /// Convert into a . 
+ internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureChatRetrievedDocument.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureChatRetrievedDocument.cs new file mode 100644 index 000000000..c0e4b0f94 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureChatRetrievedDocument.cs @@ -0,0 +1,113 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; +using System.Linq; + +namespace Azure.AI.OpenAI.Chat +{ + /// The AzureChatMessageContextAllRetrievedDocuments. + public partial class AzureChatRetrievedDocument + { + /// + /// Keeps track of any properties unknown to the library. + /// + /// To assign an object to the value of this property use . + /// + /// + /// To assign an already formatted json string to this property use . + /// + /// + /// Examples: + /// + /// + /// BinaryData.FromObjectAsJson("foo") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromString("\"foo\"") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromObjectAsJson(new { key = "value" }) + /// Creates a payload of { "key": "value" }. + /// + /// + /// BinaryData.FromString("{\"key\": \"value\"}") + /// Creates a payload of { "key": "value" }. + /// + /// + /// + /// + internal IDictionary SerializedAdditionalRawData { get; set; } + /// Initializes a new instance of . + /// The content of the citation. + /// The search queries executed to retrieve documents. + /// The index of the data source used for retrieval. + /// or is null. 
+ internal AzureChatRetrievedDocument(string content, IEnumerable<string> searchQueries, int dataSourceIndex) + { + Argument.AssertNotNull(content, nameof(content)); + Argument.AssertNotNull(searchQueries, nameof(searchQueries)); + + Content = content; + SearchQueries = searchQueries.ToList(); + DataSourceIndex = dataSourceIndex; + } + + /// <summary> Initializes a new instance of <see cref="AzureChatRetrievedDocument"/>. </summary> + /// <param name="content"> The content of the citation. </param> + /// <param name="title"> The title for the citation. </param> + /// <param name="url"> The URL of the citation. </param> + /// <param name="filepath"> The file path for the citation. </param> + /// <param name="chunkId"> The chunk ID for the citation. </param> + /// <param name="searchQueries"> The search queries executed to retrieve documents. </param> + /// <param name="dataSourceIndex"> The index of the data source used for retrieval. </param> + /// <param name="originalSearchScore"> The original search score for the retrieval. </param> + /// <param name="rerankScore"> The rerank score for the retrieval. </param> + /// <param name="filterReason"> If applicable, an indication of why the document was filtered. </param> + /// <param name="serializedAdditionalRawData"> Keeps track of any properties unknown to the library. </param> + internal AzureChatRetrievedDocument(string content, string title, string url, string filepath, string chunkId, IReadOnlyList<string> searchQueries, int dataSourceIndex, double? originalSearchScore, double? rerankScore, AzureChatRetrievedDocumentFilterReason? filterReason, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + Content = content; + Title = title; + Url = url; + Filepath = filepath; + ChunkId = chunkId; + SearchQueries = searchQueries; + DataSourceIndex = dataSourceIndex; + OriginalSearchScore = originalSearchScore; + RerankScore = rerankScore; + FilterReason = filterReason; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + /// <summary> Initializes a new instance of <see cref="AzureChatRetrievedDocument"/> for deserialization. </summary> + internal AzureChatRetrievedDocument() + { + } + + /// <summary> The content of the citation. </summary> + public string Content { get; } + /// <summary> The title for the citation. </summary> + public string Title { get; } + /// <summary> The URL of the citation. </summary> + public string Url { get; } + /// <summary> The file path for the citation. </summary> + public string Filepath { get; } + /// <summary> The chunk ID for the citation. </summary> 
+ public string ChunkId { get; } + /// <summary> The search queries executed to retrieve documents. </summary> + public IReadOnlyList<string> SearchQueries { get; } + /// <summary> The index of the data source used for retrieval. </summary> + public int DataSourceIndex { get; } + /// <summary> The original search score for the retrieval. </summary> + public double? OriginalSearchScore { get; } + /// <summary> The rerank score for the retrieval. </summary> + public double? RerankScore { get; } + /// <summary> If applicable, an indication of why the document was filtered. </summary> + public AzureChatRetrievedDocumentFilterReason? FilterReason { get; } + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureChatRetrievedDocumentFilterReason.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureChatRetrievedDocumentFilterReason.cs new file mode 100644 index 000000000..36be67132 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureChatRetrievedDocumentFilterReason.cs @@ -0,0 +1,48 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ComponentModel; + +namespace Azure.AI.OpenAI.Chat +{ + /// <summary> The AzureChatMessageContextAllRetrievedDocumentsFilterReason. </summary> + public readonly partial struct AzureChatRetrievedDocumentFilterReason : IEquatable<AzureChatRetrievedDocumentFilterReason> + { + private readonly string _value; + + /// <summary> Initializes a new instance of <see cref="AzureChatRetrievedDocumentFilterReason"/>. </summary> + /// <exception cref="ArgumentNullException"> <paramref name="value"/> is null. </exception> + public AzureChatRetrievedDocumentFilterReason(string value) + { + _value = value ?? throw new ArgumentNullException(nameof(value)); + } + + private const string ScoreValue = "score"; + private const string RerankValue = "rerank"; + + /// <summary> score. </summary> + public static AzureChatRetrievedDocumentFilterReason Score { get; } = new AzureChatRetrievedDocumentFilterReason(ScoreValue); + /// <summary> rerank. </summary> + public static AzureChatRetrievedDocumentFilterReason Rerank { get; } = new AzureChatRetrievedDocumentFilterReason(RerankValue); + /// <summary> Determines if two <see cref="AzureChatRetrievedDocumentFilterReason"/> values are the same. </summary> 
+ public static bool operator ==(AzureChatRetrievedDocumentFilterReason left, AzureChatRetrievedDocumentFilterReason right) => left.Equals(right); + /// <summary> Determines if two <see cref="AzureChatRetrievedDocumentFilterReason"/> values are not the same. </summary> + public static bool operator !=(AzureChatRetrievedDocumentFilterReason left, AzureChatRetrievedDocumentFilterReason right) => !left.Equals(right); + /// <summary> Converts a string to a <see cref="AzureChatRetrievedDocumentFilterReason"/>. </summary> + public static implicit operator AzureChatRetrievedDocumentFilterReason(string value) => new AzureChatRetrievedDocumentFilterReason(value); + + /// <inheritdoc /> + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is AzureChatRetrievedDocumentFilterReason other && Equals(other); + /// <inheritdoc /> + public bool Equals(AzureChatRetrievedDocumentFilterReason other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + /// <inheritdoc /> + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + /// <inheritdoc /> + public override string ToString() => _value; + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureCosmosDBChatDataSource.Serialization.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureCosmosDBChatDataSource.Serialization.cs new file mode 100644 index 000000000..9b490f30b --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureCosmosDBChatDataSource.Serialization.cs @@ -0,0 +1,147 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace Azure.AI.OpenAI.Chat +{ + public partial class AzureCosmosDBChatDataSource : IJsonModel<AzureCosmosDBChatDataSource> + { + void IJsonModel<AzureCosmosDBChatDataSource>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(AzureCosmosDBChatDataSource)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("parameters") != true) + { + writer.WritePropertyName("parameters"u8); + writer.WriteObjectValue(InternalParameters, options); + } + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + AzureCosmosDBChatDataSource IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(AzureCosmosDBChatDataSource)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeAzureCosmosDBChatDataSource(document.RootElement, options); + } + + internal static AzureCosmosDBChatDataSource DeserializeAzureCosmosDBChatDataSource(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + InternalAzureCosmosDBChatDataSourceParameters parameters = default; + string type = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("parameters"u8)) + { + parameters = InternalAzureCosmosDBChatDataSourceParameters.DeserializeInternalAzureCosmosDBChatDataSourceParameters(property.Value, options); + continue; + } + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (options.Format != "W") + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new AzureCosmosDBChatDataSource(type, serializedAdditionalRawData, parameters); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(AzureCosmosDBChatDataSource)} does not support writing '{options.Format}' format."); + } + } + + AzureCosmosDBChatDataSource IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeAzureCosmosDBChatDataSource(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(AzureCosmosDBChatDataSource)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + /// Deserializes the model from a raw response. + /// The result to deserialize the model from. + internal static new AzureCosmosDBChatDataSource FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeAzureCosmosDBChatDataSource(document.RootElement); + } + + /// Convert into a . 
+ internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureCosmosDBChatDataSource.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureCosmosDBChatDataSource.cs new file mode 100644 index 000000000..e33af59b8 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureCosmosDBChatDataSource.cs @@ -0,0 +1,14 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace Azure.AI.OpenAI.Chat +{ + /// <summary> Represents a data source configuration that will use an Azure CosmosDB resource. </summary> + public partial class AzureCosmosDBChatDataSource : AzureChatDataSource + { + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureMachineLearningIndexChatDataSource.Serialization.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureMachineLearningIndexChatDataSource.Serialization.cs new file mode 100644 index 000000000..2384629e5 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureMachineLearningIndexChatDataSource.Serialization.cs @@ -0,0 +1,147 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace Azure.AI.OpenAI.Chat +{ + public partial class AzureMachineLearningIndexChatDataSource : IJsonModel<AzureMachineLearningIndexChatDataSource> + { + void IJsonModel<AzureMachineLearningIndexChatDataSource>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(AzureMachineLearningIndexChatDataSource)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("parameters") != true) + { + writer.WritePropertyName("parameters"u8); + writer.WriteObjectValue(InternalParameters, options); + } + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + AzureMachineLearningIndexChatDataSource IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(AzureMachineLearningIndexChatDataSource)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeAzureMachineLearningIndexChatDataSource(document.RootElement, options); + } + + internal static AzureMachineLearningIndexChatDataSource DeserializeAzureMachineLearningIndexChatDataSource(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + InternalAzureMachineLearningIndexChatDataSourceParameters parameters = default; + string type = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("parameters"u8)) + { + parameters = InternalAzureMachineLearningIndexChatDataSourceParameters.DeserializeInternalAzureMachineLearningIndexChatDataSourceParameters(property.Value, options); + continue; + } + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (options.Format != "W") + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new AzureMachineLearningIndexChatDataSource(type, serializedAdditionalRawData, parameters); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(AzureMachineLearningIndexChatDataSource)} does not support writing '{options.Format}' format."); + } + } + + AzureMachineLearningIndexChatDataSource IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeAzureMachineLearningIndexChatDataSource(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(AzureMachineLearningIndexChatDataSource)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + /// Deserializes the model from a raw response. + /// The result to deserialize the model from. + internal static new AzureMachineLearningIndexChatDataSource FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeAzureMachineLearningIndexChatDataSource(document.RootElement); + } + + /// Convert into a . 
+ internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureMachineLearningIndexChatDataSource.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureMachineLearningIndexChatDataSource.cs new file mode 100644 index 000000000..7abef53a2 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureMachineLearningIndexChatDataSource.cs @@ -0,0 +1,14 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace Azure.AI.OpenAI.Chat +{ + /// Represents a data source configuration that will use an Azure Machine Learning vector index. + public partial class AzureMachineLearningIndexChatDataSource : AzureChatDataSource + { + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureOpenAIChatError.Serialization.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureOpenAIChatError.Serialization.cs new file mode 100644 index 000000000..d4bda9baa --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureOpenAIChatError.Serialization.cs @@ -0,0 +1,190 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace Azure.AI.OpenAI +{ + internal partial class AzureOpenAIChatError : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(AzureOpenAIChatError)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("code") != true && Optional.IsDefined(Code)) + { + writer.WritePropertyName("code"u8); + writer.WriteStringValue(Code); + } + if (SerializedAdditionalRawData?.ContainsKey("message") != true && Optional.IsDefined(Message)) + { + writer.WritePropertyName("message"u8); + writer.WriteStringValue(Message); + } + if (SerializedAdditionalRawData?.ContainsKey("param") != true && Optional.IsDefined(Param)) + { + writer.WritePropertyName("param"u8); + writer.WriteStringValue(Param); + } + if (SerializedAdditionalRawData?.ContainsKey("type") != true && Optional.IsDefined(Type)) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData?.ContainsKey("inner_error") != true && Optional.IsDefined(InnerError)) + { + writer.WritePropertyName("inner_error"u8); + writer.WriteObjectValue(InnerError, options); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + AzureOpenAIChatError IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(AzureOpenAIChatError)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeAzureOpenAIChatError(document.RootElement, options); + } + + internal static AzureOpenAIChatError DeserializeAzureOpenAIChatError(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string code = default; + string message = default; + string param = default; + string type = default; + InternalAzureOpenAIChatErrorInnerError innerError = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("code"u8)) + { + code = property.Value.GetString(); + continue; + } + if (property.NameEquals("message"u8)) + { + message = property.Value.GetString(); + continue; + } + if (property.NameEquals("param"u8)) + { + param = property.Value.GetString(); + continue; + } + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (property.NameEquals("inner_error"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + innerError = InternalAzureOpenAIChatErrorInnerError.DeserializeInternalAzureOpenAIChatErrorInnerError(property.Value, options); + continue; + } + if (options.Format != "W") + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new AzureOpenAIChatError( + code, + message, + param, + type, + innerError, + serializedAdditionalRawData); + } + + BinaryData 
IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(AzureOpenAIChatError)} does not support writing '{options.Format}' format."); + } + } + + AzureOpenAIChatError IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeAzureOpenAIChatError(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(AzureOpenAIChatError)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + /// Deserializes the model from a raw response. + /// The result to deserialize the model from. + internal static AzureOpenAIChatError FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeAzureOpenAIChatError(document.RootElement); + } + + /// Convert into a . 
+ internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureOpenAIChatError.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureOpenAIChatError.cs new file mode 100644 index 000000000..b379c88ac --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureOpenAIChatError.cs @@ -0,0 +1,77 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace Azure.AI.OpenAI +{ + /// The structured representation of an error from an Azure OpenAI chat completion request. + internal partial class AzureOpenAIChatError + { + /// + /// Keeps track of any properties unknown to the library. + /// + /// To assign an object to the value of this property use . + /// + /// + /// To assign an already formatted json string to this property use . + /// + /// + /// Examples: + /// + /// + /// BinaryData.FromObjectAsJson("foo") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromString("\"foo\"") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromObjectAsJson(new { key = "value" }) + /// Creates a payload of { "key": "value" }. + /// + /// + /// BinaryData.FromString("{\"key\": \"value\"}") + /// Creates a payload of { "key": "value" }. + /// + /// + /// + /// + internal IDictionary SerializedAdditionalRawData { get; set; } + /// Initializes a new instance of . + internal AzureOpenAIChatError() + { + } + + /// Initializes a new instance of . + /// The distinct, machine-generated identifier for the error. + /// A human-readable message associated with the error. + /// If applicable, the request input parameter associated with the error. + /// If applicable, the input line number associated with the error. + /// If applicable, an upstream error that originated this error. + /// Keeps track of any properties unknown to the library. 
+ internal AzureOpenAIChatError(string code, string message, string param, string type, InternalAzureOpenAIChatErrorInnerError innerError, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + Code = code; + Message = message; + Param = param; + Type = type; + InnerError = innerError; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + /// <summary> The distinct, machine-generated identifier for the error. </summary> + public string Code { get; } + /// <summary> A human-readable message associated with the error. </summary> + public string Message { get; } + /// <summary> If applicable, the request input parameter associated with the error. </summary> + public string Param { get; } + /// <summary> If applicable, the input line number associated with the error. </summary> + public string Type { get; } + /// <summary> If applicable, an upstream error that originated this error. </summary> + public InternalAzureOpenAIChatErrorInnerError InnerError { get; } + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureOpenAIChatErrorResponse.Serialization.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureOpenAIChatErrorResponse.Serialization.cs new file mode 100644 index 000000000..a85c386fd --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureOpenAIChatErrorResponse.Serialization.cs @@ -0,0 +1,140 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace Azure.AI.OpenAI +{ + internal partial class AzureOpenAIChatErrorResponse : IJsonModel<AzureOpenAIChatErrorResponse> + { + void IJsonModel<AzureOpenAIChatErrorResponse>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(AzureOpenAIChatErrorResponse)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("error") != true && Optional.IsDefined(Error)) + { + writer.WritePropertyName("error"u8); + writer.WriteObjectValue(Error, options); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + AzureOpenAIChatErrorResponse IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(AzureOpenAIChatErrorResponse)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeAzureOpenAIChatErrorResponse(document.RootElement, options); + } + + internal static AzureOpenAIChatErrorResponse DeserializeAzureOpenAIChatErrorResponse(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + AzureOpenAIChatError error = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("error"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + error = AzureOpenAIChatError.DeserializeAzureOpenAIChatError(property.Value, options); + continue; + } + if (options.Format != "W") + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new AzureOpenAIChatErrorResponse(error, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(AzureOpenAIChatErrorResponse)} does not support writing '{options.Format}' format."); + } + } + + AzureOpenAIChatErrorResponse IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeAzureOpenAIChatErrorResponse(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(AzureOpenAIChatErrorResponse)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + /// Deserializes the model from a raw response. + /// The result to deserialize the model from. + internal static AzureOpenAIChatErrorResponse FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeAzureOpenAIChatErrorResponse(document.RootElement); + } + + /// Convert into a . + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureOpenAIChatErrorResponse.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureOpenAIChatErrorResponse.cs new file mode 100644 index 000000000..767fce2b0 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureOpenAIChatErrorResponse.cs @@ -0,0 +1,61 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace Azure.AI.OpenAI +{ + /// A structured representation of an error an Azure OpenAI request. + internal partial class AzureOpenAIChatErrorResponse + { + /// + /// Keeps track of any properties unknown to the library. + /// + /// To assign an object to the value of this property use . + /// + /// + /// To assign an already formatted json string to this property use . + /// + /// + /// Examples: + /// + /// + /// BinaryData.FromObjectAsJson("foo") + /// Creates a payload of "foo". 
+ /// + /// + /// BinaryData.FromString("\"foo\"") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromObjectAsJson(new { key = "value" }) + /// Creates a payload of { "key": "value" }. + /// + /// + /// BinaryData.FromString("{\"key\": \"value\"}") + /// Creates a payload of { "key": "value" }. + /// + /// + /// + /// + internal IDictionary SerializedAdditionalRawData { get; set; } + /// Initializes a new instance of . + internal AzureOpenAIChatErrorResponse() + { + } + + /// Initializes a new instance of . + /// + /// Keeps track of any properties unknown to the library. + internal AzureOpenAIChatErrorResponse(AzureOpenAIChatError error, IDictionary serializedAdditionalRawData) + { + Error = error; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + /// Gets the error. + public AzureOpenAIChatError Error { get; } + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureOpenAIDalleError.Serialization.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureOpenAIDalleError.Serialization.cs new file mode 100644 index 000000000..2498a9dac --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureOpenAIDalleError.Serialization.cs @@ -0,0 +1,190 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace Azure.AI.OpenAI +{ + internal partial class AzureOpenAIDalleError : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(AzureOpenAIDalleError)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("code") != true && Optional.IsDefined(Code)) + { + writer.WritePropertyName("code"u8); + writer.WriteStringValue(Code); + } + if (SerializedAdditionalRawData?.ContainsKey("message") != true && Optional.IsDefined(Message)) + { + writer.WritePropertyName("message"u8); + writer.WriteStringValue(Message); + } + if (SerializedAdditionalRawData?.ContainsKey("param") != true && Optional.IsDefined(Param)) + { + writer.WritePropertyName("param"u8); + writer.WriteStringValue(Param); + } + if (SerializedAdditionalRawData?.ContainsKey("type") != true && Optional.IsDefined(Type)) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData?.ContainsKey("inner_error") != true && Optional.IsDefined(InnerError)) + { + writer.WritePropertyName("inner_error"u8); + writer.WriteObjectValue(InnerError, options); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + AzureOpenAIDalleError IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(AzureOpenAIDalleError)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeAzureOpenAIDalleError(document.RootElement, options); + } + + internal static AzureOpenAIDalleError DeserializeAzureOpenAIDalleError(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string code = default; + string message = default; + string param = default; + string type = default; + InternalAzureOpenAIDalleErrorInnerError innerError = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("code"u8)) + { + code = property.Value.GetString(); + continue; + } + if (property.NameEquals("message"u8)) + { + message = property.Value.GetString(); + continue; + } + if (property.NameEquals("param"u8)) + { + param = property.Value.GetString(); + continue; + } + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (property.NameEquals("inner_error"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + innerError = InternalAzureOpenAIDalleErrorInnerError.DeserializeInternalAzureOpenAIDalleErrorInnerError(property.Value, options); + continue; + } + if (options.Format != "W") + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new AzureOpenAIDalleError( + code, + message, + param, + type, + innerError, + serializedAdditionalRawData); + } + + BinaryData 
IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(AzureOpenAIDalleError)} does not support writing '{options.Format}' format."); + } + } + + AzureOpenAIDalleError IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeAzureOpenAIDalleError(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(AzureOpenAIDalleError)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + /// Deserializes the model from a raw response. + /// The result to deserialize the model from. + internal static AzureOpenAIDalleError FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeAzureOpenAIDalleError(document.RootElement); + } + + /// Convert into a . 
+ internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureOpenAIDalleError.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureOpenAIDalleError.cs new file mode 100644 index 000000000..7dfb2de16 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureOpenAIDalleError.cs @@ -0,0 +1,77 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace Azure.AI.OpenAI +{ + /// The structured representation of an error from an Azure OpenAI image generation request. + internal partial class AzureOpenAIDalleError + { + /// + /// Keeps track of any properties unknown to the library. + /// + /// To assign an object to the value of this property use . + /// + /// + /// To assign an already formatted json string to this property use . + /// + /// + /// Examples: + /// + /// + /// BinaryData.FromObjectAsJson("foo") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromString("\"foo\"") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromObjectAsJson(new { key = "value" }) + /// Creates a payload of { "key": "value" }. + /// + /// + /// BinaryData.FromString("{\"key\": \"value\"}") + /// Creates a payload of { "key": "value" }. + /// + /// + /// + /// + internal IDictionary SerializedAdditionalRawData { get; set; } + /// Initializes a new instance of . + internal AzureOpenAIDalleError() + { + } + + /// Initializes a new instance of . + /// The distinct, machine-generated identifier for the error. + /// A human-readable message associated with the error. + /// If applicable, the request input parameter associated with the error. + /// If applicable, the input line number associated with the error. + /// If applicable, an upstream error that originated this error. + /// Keeps track of any properties unknown to the library. 
+ internal AzureOpenAIDalleError(string code, string message, string param, string type, InternalAzureOpenAIDalleErrorInnerError innerError, IDictionary serializedAdditionalRawData) + { + Code = code; + Message = message; + Param = param; + Type = type; + InnerError = innerError; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + /// The distinct, machine-generated identifier for the error. + public string Code { get; } + /// A human-readable message associated with the error. + public string Message { get; } + /// If applicable, the request input parameter associated with the error. + public string Param { get; } + /// If applicable, the input line number associated with the error. + public string Type { get; } + /// If applicable, an upstream error that originated this error. + public InternalAzureOpenAIDalleErrorInnerError InnerError { get; } + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureOpenAIDalleErrorResponse.Serialization.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureOpenAIDalleErrorResponse.Serialization.cs new file mode 100644 index 000000000..cdcc8bc20 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureOpenAIDalleErrorResponse.Serialization.cs @@ -0,0 +1,140 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace Azure.AI.OpenAI +{ + internal partial class AzureOpenAIDalleErrorResponse : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(AzureOpenAIDalleErrorResponse)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("error") != true && Optional.IsDefined(Error)) + { + writer.WritePropertyName("error"u8); + writer.WriteObjectValue(Error, options); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + AzureOpenAIDalleErrorResponse IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(AzureOpenAIDalleErrorResponse)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeAzureOpenAIDalleErrorResponse(document.RootElement, options); + } + + internal static AzureOpenAIDalleErrorResponse DeserializeAzureOpenAIDalleErrorResponse(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + AzureOpenAIDalleError error = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("error"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + error = AzureOpenAIDalleError.DeserializeAzureOpenAIDalleError(property.Value, options); + continue; + } + if (options.Format != "W") + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new AzureOpenAIDalleErrorResponse(error, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(AzureOpenAIDalleErrorResponse)} does not support writing '{options.Format}' format."); + } + } + + AzureOpenAIDalleErrorResponse IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<AzureOpenAIDalleErrorResponse>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    {
+                        using JsonDocument document = JsonDocument.Parse(data);
+                        return DeserializeAzureOpenAIDalleErrorResponse(document.RootElement, options);
+                    }
+                default:
+                    throw new FormatException($"The model {nameof(AzureOpenAIDalleErrorResponse)} does not support reading '{options.Format}' format.");
+            }
+        }
+
+        string IPersistableModel<AzureOpenAIDalleErrorResponse>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J";
+
+        /// Deserializes the model from a raw response.
+        /// The result to deserialize the model from.
+        internal static AzureOpenAIDalleErrorResponse FromResponse(PipelineResponse response)
+        {
+            using var document = JsonDocument.Parse(response.Content);
+            return DeserializeAzureOpenAIDalleErrorResponse(document.RootElement);
+        }
+
+        /// Convert into a <see cref="BinaryContent"/>.
+        internal virtual BinaryContent ToBinaryContent()
+        {
+            return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions);
+        }
+    }
+}
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureOpenAIDalleErrorResponse.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureOpenAIDalleErrorResponse.cs
new file mode 100644
index 000000000..6aafb0c3d
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureOpenAIDalleErrorResponse.cs
@@ -0,0 +1,61 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.Collections.Generic;
+
+namespace Azure.AI.OpenAI
+{
+    /// A structured representation of an error from an Azure OpenAI request.
+    internal partial class AzureOpenAIDalleErrorResponse
+    {
+        ///
+        /// Keeps track of any properties unknown to the library.
+        ///
+        /// To assign an object to the value of this property use .
+        ///
+        ///
+        /// To assign an already formatted json string to this property use .
+        ///
+        ///
+        /// Examples:
+        ///
+        ///
+        /// BinaryData.FromObjectAsJson("foo")
+        /// Creates a payload of "foo".
+ /// + /// + /// BinaryData.FromString("\"foo\"") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromObjectAsJson(new { key = "value" }) + /// Creates a payload of { "key": "value" }. + /// + /// + /// BinaryData.FromString("{\"key\": \"value\"}") + /// Creates a payload of { "key": "value" }. + /// + /// + /// + /// + internal IDictionary SerializedAdditionalRawData { get; set; } + /// Initializes a new instance of . + internal AzureOpenAIDalleErrorResponse() + { + } + + /// Initializes a new instance of . + /// + /// Keeps track of any properties unknown to the library. + internal AzureOpenAIDalleErrorResponse(AzureOpenAIDalleError error, IDictionary serializedAdditionalRawData) + { + Error = error; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + /// Gets the error. + public AzureOpenAIDalleError Error { get; } + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureSearchChatDataSource.Serialization.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureSearchChatDataSource.Serialization.cs new file mode 100644 index 000000000..a9deddec3 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureSearchChatDataSource.Serialization.cs @@ -0,0 +1,147 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace Azure.AI.OpenAI.Chat +{ + public partial class AzureSearchChatDataSource : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(AzureSearchChatDataSource)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("parameters") != true) + { + writer.WritePropertyName("parameters"u8); + writer.WriteObjectValue(InternalParameters, options); + } + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + AzureSearchChatDataSource IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(AzureSearchChatDataSource)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeAzureSearchChatDataSource(document.RootElement, options); + } + + internal static AzureSearchChatDataSource DeserializeAzureSearchChatDataSource(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + InternalAzureSearchChatDataSourceParameters parameters = default; + string type = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("parameters"u8)) + { + parameters = InternalAzureSearchChatDataSourceParameters.DeserializeInternalAzureSearchChatDataSourceParameters(property.Value, options); + continue; + } + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (options.Format != "W") + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new AzureSearchChatDataSource(type, serializedAdditionalRawData, parameters); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(AzureSearchChatDataSource)} does not support writing '{options.Format}' format."); + } + } + + AzureSearchChatDataSource IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeAzureSearchChatDataSource(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(AzureSearchChatDataSource)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + /// Deserializes the model from a raw response. + /// The result to deserialize the model from. + internal static new AzureSearchChatDataSource FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeAzureSearchChatDataSource(document.RootElement); + } + + /// Convert into a . 
+        internal override BinaryContent ToBinaryContent()
+        {
+            return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions);
+        }
+    }
+}
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureSearchChatDataSource.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureSearchChatDataSource.cs
new file mode 100644
index 000000000..9c4af92db
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/AzureSearchChatDataSource.cs
@@ -0,0 +1,14 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.Collections.Generic;
+
+namespace Azure.AI.OpenAI.Chat
+{
+    /// Represents a data source configuration that will use an Azure Search resource.
+    public partial class AzureSearchChatDataSource : AzureChatDataSource
+    {
+    }
+}
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/ContentFilterBlocklistResult.Serialization.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/ContentFilterBlocklistResult.Serialization.cs
new file mode 100644
index 000000000..a2f4dd0e6
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/ContentFilterBlocklistResult.Serialization.cs
@@ -0,0 +1,161 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace Azure.AI.OpenAI
+{
+    public partial class ContentFilterBlocklistResult : IJsonModel<ContentFilterBlocklistResult>
+    {
+        void IJsonModel<ContentFilterBlocklistResult>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ?
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ContentFilterBlocklistResult)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("filtered") != true) + { + writer.WritePropertyName("filtered"u8); + writer.WriteBooleanValue(Filtered); + } + if (SerializedAdditionalRawData?.ContainsKey("details") != true && Optional.IsCollectionDefined(InternalDetails)) + { + writer.WritePropertyName("details"u8); + writer.WriteStartArray(); + foreach (var item in InternalDetails) + { + writer.WriteObjectValue(item, options); + } + writer.WriteEndArray(); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + ContentFilterBlocklistResult IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ContentFilterBlocklistResult)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeContentFilterBlocklistResult(document.RootElement, options); + } + + internal static ContentFilterBlocklistResult DeserializeContentFilterBlocklistResult(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + bool filtered = default; + IReadOnlyList details = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("filtered"u8)) + { + filtered = property.Value.GetBoolean(); + continue; + } + if (property.NameEquals("details"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + List array = new List(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(InternalAzureContentFilterBlocklistResultDetail.DeserializeInternalAzureContentFilterBlocklistResultDetail(item, options)); + } + details = array; + continue; + } + if (options.Format != "W") + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new ContentFilterBlocklistResult(filtered, details ?? new ChangeTrackingList(), serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(ContentFilterBlocklistResult)} does not support writing '{options.Format}' format."); + } + } + + ContentFilterBlocklistResult IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeContentFilterBlocklistResult(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(ContentFilterBlocklistResult)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + /// Deserializes the model from a raw response. + /// The result to deserialize the model from. + internal static ContentFilterBlocklistResult FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeContentFilterBlocklistResult(document.RootElement); + } + + /// Convert into a . 
+ internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/ContentFilterBlocklistResult.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/ContentFilterBlocklistResult.cs new file mode 100644 index 000000000..d033e0e34 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/ContentFilterBlocklistResult.cs @@ -0,0 +1,71 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace Azure.AI.OpenAI +{ + /// A collection of true/false filtering results for configured custom blocklists. + public partial class ContentFilterBlocklistResult + { + /// + /// Keeps track of any properties unknown to the library. + /// + /// To assign an object to the value of this property use . + /// + /// + /// To assign an already formatted json string to this property use . + /// + /// + /// Examples: + /// + /// + /// BinaryData.FromObjectAsJson("foo") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromString("\"foo\"") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromObjectAsJson(new { key = "value" }) + /// Creates a payload of { "key": "value" }. + /// + /// + /// BinaryData.FromString("{\"key\": \"value\"}") + /// Creates a payload of { "key": "value" }. + /// + /// + /// + /// + internal IDictionary SerializedAdditionalRawData { get; set; } + /// Initializes a new instance of . + /// A value indicating whether any of the detailed blocklists resulted in a filtering action. + internal ContentFilterBlocklistResult(bool filtered) + { + Filtered = filtered; + InternalDetails = new ChangeTrackingList(); + } + + /// Initializes a new instance of . + /// A value indicating whether any of the detailed blocklists resulted in a filtering action. + /// The pairs of individual blocklist IDs and whether they resulted in a filtering action. 
+ /// Keeps track of any properties unknown to the library. + internal ContentFilterBlocklistResult(bool filtered, IReadOnlyList internalDetails, IDictionary serializedAdditionalRawData) + { + Filtered = filtered; + InternalDetails = internalDetails; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + /// Initializes a new instance of for deserialization. + internal ContentFilterBlocklistResult() + { + } + + /// A value indicating whether any of the detailed blocklists resulted in a filtering action. + public bool Filtered { get; } + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/ContentFilterDetectionResult.Serialization.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/ContentFilterDetectionResult.Serialization.cs new file mode 100644 index 000000000..9c520fede --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/ContentFilterDetectionResult.Serialization.cs @@ -0,0 +1,147 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace Azure.AI.OpenAI +{ + public partial class ContentFilterDetectionResult : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ContentFilterDetectionResult)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("filtered") != true) + { + writer.WritePropertyName("filtered"u8); + writer.WriteBooleanValue(Filtered); + } + if (SerializedAdditionalRawData?.ContainsKey("detected") != true) + { + writer.WritePropertyName("detected"u8); + writer.WriteBooleanValue(Detected); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + ContentFilterDetectionResult IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ContentFilterDetectionResult)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeContentFilterDetectionResult(document.RootElement, options); + } + + internal static ContentFilterDetectionResult DeserializeContentFilterDetectionResult(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + bool filtered = default; + bool detected = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("filtered"u8)) + { + filtered = property.Value.GetBoolean(); + continue; + } + if (property.NameEquals("detected"u8)) + { + detected = property.Value.GetBoolean(); + continue; + } + if (options.Format != "W") + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new ContentFilterDetectionResult(filtered, detected, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(ContentFilterDetectionResult)} does not support writing '{options.Format}' format."); + } + } + + ContentFilterDetectionResult IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<ContentFilterDetectionResult>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    {
+                        using JsonDocument document = JsonDocument.Parse(data);
+                        return DeserializeContentFilterDetectionResult(document.RootElement, options);
+                    }
+                default:
+                    throw new FormatException($"The model {nameof(ContentFilterDetectionResult)} does not support reading '{options.Format}' format.");
+            }
+        }
+
+        string IPersistableModel<ContentFilterDetectionResult>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J";
+
+        /// <summary> Deserializes the model from a raw response. </summary>
+        /// <param name="response"> The result to deserialize the model from. </param>
+        internal static ContentFilterDetectionResult FromResponse(PipelineResponse response)
+        {
+            using var document = JsonDocument.Parse(response.Content);
+            return DeserializeContentFilterDetectionResult(document.RootElement);
+        }
+
+        /// <summary> Convert into a <see cref="BinaryContent"/>. </summary>
+        internal virtual BinaryContent ToBinaryContent()
+        {
+            return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions);
+        }
+    }
+}
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/ContentFilterDetectionResult.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/ContentFilterDetectionResult.cs
new file mode 100644
index 000000000..34aab2400
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/ContentFilterDetectionResult.cs
@@ -0,0 +1,77 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.Collections.Generic;
+
+namespace Azure.AI.OpenAI
+{
+    /// <summary>
+    /// A labeled content filter result item that indicates whether the content was detected and whether the content was
+    /// filtered.
+    /// </summary>
+    public partial class ContentFilterDetectionResult
+    {
+        /// <summary>
+        /// Keeps track of any properties unknown to the library.
+        /// <para>
+        /// To assign an object to the value of this property use <see cref="BinaryData.FromObjectAsJson{T}(T, System.Text.Json.JsonSerializerOptions?)"/>.
+        /// </para>
+        /// <para>
+        /// To assign an already formatted json string to this property use <see cref="BinaryData.FromString(string)"/>.
+        /// </para>
+        /// <para>
+        /// Examples:
+        /// <list type="bullet">
+        /// <item>
+        /// <term>BinaryData.FromObjectAsJson("foo")</term>
+        /// <description>Creates a payload of "foo".</description>
+        /// </item>
+        /// <item>
+        /// <term>BinaryData.FromString("\"foo\"")</term>
+        /// <description>Creates a payload of "foo".</description>
+        /// </item>
+        /// <item>
+        /// <term>BinaryData.FromObjectAsJson(new { key = "value" })</term>
+        /// <description>Creates a payload of { "key": "value" }.</description>
+        /// </item>
+        /// <item>
+        /// <term>BinaryData.FromString("{\"key\": \"value\"}")</term>
+        /// <description>Creates a payload of { "key": "value" }.</description>
+        /// </item>
+        /// </list>
+        /// </para>
+        /// </summary>
+        internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; }
+        /// <summary> Initializes a new instance of <see cref="ContentFilterDetectionResult"/>. </summary>
+        /// <param name="filtered"> Whether the content detection resulted in a content filtering action. </param>
+        /// <param name="detected"> Whether the labeled content category was detected in the content. </param>
+        internal ContentFilterDetectionResult(bool filtered, bool detected)
+        {
+            Filtered = filtered;
+            Detected = detected;
+        }
+
+        /// <summary> Initializes a new instance of <see cref="ContentFilterDetectionResult"/>. </summary>
+        /// <param name="filtered"> Whether the content detection resulted in a content filtering action. </param>
+        /// <param name="detected"> Whether the labeled content category was detected in the content. </param>
+        /// <param name="serializedAdditionalRawData"> Keeps track of any properties unknown to the library. </param>
+        internal ContentFilterDetectionResult(bool filtered, bool detected, IDictionary<string, BinaryData> serializedAdditionalRawData)
+        {
+            Filtered = filtered;
+            Detected = detected;
+            SerializedAdditionalRawData = serializedAdditionalRawData;
+        }
+
+        /// <summary> Initializes a new instance of <see cref="ContentFilterDetectionResult"/> for deserialization. </summary>
+        internal ContentFilterDetectionResult()
+        {
+        }
+
+        /// <summary> Whether the content detection resulted in a content filtering action. </summary>
+        public bool Filtered { get; }
+        /// <summary> Whether the labeled content category was detected in the content. </summary>
+ public bool Detected { get; } + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/ContentFilterProtectedMaterialCitedResult.Serialization.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/ContentFilterProtectedMaterialCitedResult.Serialization.cs new file mode 100644 index 000000000..2e46d1992 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/ContentFilterProtectedMaterialCitedResult.Serialization.cs @@ -0,0 +1,151 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace Azure.AI.OpenAI +{ + public partial class ContentFilterProtectedMaterialCitedResult : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ContentFilterProtectedMaterialCitedResult)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("license") != true && Optional.IsDefined(License)) + { + writer.WritePropertyName("license"u8); + writer.WriteStringValue(License); + } + if (SerializedAdditionalRawData?.ContainsKey("URL") != true && Optional.IsDefined(URL)) + { + writer.WritePropertyName("URL"u8); + writer.WriteStringValue(URL.AbsoluteUri); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + 
ContentFilterProtectedMaterialCitedResult IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ContentFilterProtectedMaterialCitedResult)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeContentFilterProtectedMaterialCitedResult(document.RootElement, options); + } + + internal static ContentFilterProtectedMaterialCitedResult DeserializeContentFilterProtectedMaterialCitedResult(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string license = default; + Uri url = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("license"u8)) + { + license = property.Value.GetString(); + continue; + } + if (property.NameEquals("URL"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + url = new Uri(property.Value.GetString()); + continue; + } + if (options.Format != "W") + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new ContentFilterProtectedMaterialCitedResult(license, url, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<ContentFilterProtectedMaterialCitedResult>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    return ModelReaderWriter.Write(this, options);
+                default:
+                    throw new FormatException($"The model {nameof(ContentFilterProtectedMaterialCitedResult)} does not support writing '{options.Format}' format.");
+            }
+        }
+
+        ContentFilterProtectedMaterialCitedResult IPersistableModel<ContentFilterProtectedMaterialCitedResult>.Create(BinaryData data, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<ContentFilterProtectedMaterialCitedResult>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    {
+                        using JsonDocument document = JsonDocument.Parse(data);
+                        return DeserializeContentFilterProtectedMaterialCitedResult(document.RootElement, options);
+                    }
+                default:
+                    throw new FormatException($"The model {nameof(ContentFilterProtectedMaterialCitedResult)} does not support reading '{options.Format}' format.");
+            }
+        }
+
+        string IPersistableModel<ContentFilterProtectedMaterialCitedResult>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J";
+
+        /// <summary> Deserializes the model from a raw response. </summary>
+        /// <param name="response"> The result to deserialize the model from. </param>
+        internal static ContentFilterProtectedMaterialCitedResult FromResponse(PipelineResponse response)
+        {
+            using var document = JsonDocument.Parse(response.Content);
+            return DeserializeContentFilterProtectedMaterialCitedResult(document.RootElement);
+        }
+
+        /// <summary> Convert into a <see cref="BinaryContent"/>. </summary>
+ internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/ContentFilterProtectedMaterialCitedResult.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/ContentFilterProtectedMaterialCitedResult.cs new file mode 100644 index 000000000..0d8391506 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/ContentFilterProtectedMaterialCitedResult.cs @@ -0,0 +1,65 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace Azure.AI.OpenAI +{ + /// The AzureContentFilterResultForChoiceProtectedMaterialCodeCitation. + public partial class ContentFilterProtectedMaterialCitedResult + { + /// + /// Keeps track of any properties unknown to the library. + /// + /// To assign an object to the value of this property use . + /// + /// + /// To assign an already formatted json string to this property use . + /// + /// + /// Examples: + /// + /// + /// BinaryData.FromObjectAsJson("foo") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromString("\"foo\"") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromObjectAsJson(new { key = "value" }) + /// Creates a payload of { "key": "value" }. + /// + /// + /// BinaryData.FromString("{\"key\": \"value\"}") + /// Creates a payload of { "key": "value" }. + /// + /// + /// + /// + internal IDictionary SerializedAdditionalRawData { get; set; } + /// Initializes a new instance of . + internal ContentFilterProtectedMaterialCitedResult() + { + } + + /// Initializes a new instance of . + /// The name or identifier of the license associated with the detection. + /// The URL associated with the license. + /// Keeps track of any properties unknown to the library. 
+ internal ContentFilterProtectedMaterialCitedResult(string license, Uri url, IDictionary serializedAdditionalRawData) + { + License = license; + URL = url; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + /// The name or identifier of the license associated with the detection. + public string License { get; } + /// The URL associated with the license. + public Uri URL { get; } + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/ContentFilterProtectedMaterialResult.Serialization.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/ContentFilterProtectedMaterialResult.Serialization.cs new file mode 100644 index 000000000..1cf232dd8 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/ContentFilterProtectedMaterialResult.Serialization.cs @@ -0,0 +1,162 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace Azure.AI.OpenAI +{ + public partial class ContentFilterProtectedMaterialResult : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ContentFilterProtectedMaterialResult)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("filtered") != true) + { + writer.WritePropertyName("filtered"u8); + writer.WriteBooleanValue(Filtered); + } + if (SerializedAdditionalRawData?.ContainsKey("detected") != true) + { + writer.WritePropertyName("detected"u8); + writer.WriteBooleanValue(Detected); + } + if (SerializedAdditionalRawData?.ContainsKey("citation") != true && Optional.IsDefined(Citation)) + { + writer.WritePropertyName("citation"u8); + writer.WriteObjectValue(Citation, options); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + ContentFilterProtectedMaterialResult IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ContentFilterProtectedMaterialResult)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeContentFilterProtectedMaterialResult(document.RootElement, options); + } + + internal static ContentFilterProtectedMaterialResult DeserializeContentFilterProtectedMaterialResult(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + bool filtered = default; + bool detected = default; + ContentFilterProtectedMaterialCitedResult citation = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("filtered"u8)) + { + filtered = property.Value.GetBoolean(); + continue; + } + if (property.NameEquals("detected"u8)) + { + detected = property.Value.GetBoolean(); + continue; + } + if (property.NameEquals("citation"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + citation = ContentFilterProtectedMaterialCitedResult.DeserializeContentFilterProtectedMaterialCitedResult(property.Value, options); + continue; + } + if (options.Format != "W") + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new ContentFilterProtectedMaterialResult(filtered, detected, citation, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<ContentFilterProtectedMaterialResult>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    return ModelReaderWriter.Write(this, options);
+                default:
+                    throw new FormatException($"The model {nameof(ContentFilterProtectedMaterialResult)} does not support writing '{options.Format}' format.");
+            }
+        }
+
+        ContentFilterProtectedMaterialResult IPersistableModel<ContentFilterProtectedMaterialResult>.Create(BinaryData data, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<ContentFilterProtectedMaterialResult>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    {
+                        using JsonDocument document = JsonDocument.Parse(data);
+                        return DeserializeContentFilterProtectedMaterialResult(document.RootElement, options);
+                    }
+                default:
+                    throw new FormatException($"The model {nameof(ContentFilterProtectedMaterialResult)} does not support reading '{options.Format}' format.");
+            }
+        }
+
+        string IPersistableModel<ContentFilterProtectedMaterialResult>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J";
+
+        /// <summary> Deserializes the model from a raw response. </summary>
+        /// <param name="response"> The result to deserialize the model from. </param>
+        internal static ContentFilterProtectedMaterialResult FromResponse(PipelineResponse response)
+        {
+            using var document = JsonDocument.Parse(response.Content);
+            return DeserializeContentFilterProtectedMaterialResult(document.RootElement);
+        }
+
+        /// <summary> Convert into a <see cref="BinaryContent"/>. </summary>
+ internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/ContentFilterProtectedMaterialResult.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/ContentFilterProtectedMaterialResult.cs new file mode 100644 index 000000000..83551b0f3 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/ContentFilterProtectedMaterialResult.cs @@ -0,0 +1,78 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace Azure.AI.OpenAI +{ + /// The AzureContentFilterResultForChoiceProtectedMaterialCode. + public partial class ContentFilterProtectedMaterialResult + { + /// + /// Keeps track of any properties unknown to the library. + /// + /// To assign an object to the value of this property use . + /// + /// + /// To assign an already formatted json string to this property use . + /// + /// + /// Examples: + /// + /// + /// BinaryData.FromObjectAsJson("foo") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromString("\"foo\"") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromObjectAsJson(new { key = "value" }) + /// Creates a payload of { "key": "value" }. + /// + /// + /// BinaryData.FromString("{\"key\": \"value\"}") + /// Creates a payload of { "key": "value" }. + /// + /// + /// + /// + internal IDictionary SerializedAdditionalRawData { get; set; } + /// Initializes a new instance of . + /// Whether the content detection resulted in a content filtering action. + /// Whether the labeled content category was detected in the content. + internal ContentFilterProtectedMaterialResult(bool filtered, bool detected) + { + Filtered = filtered; + Detected = detected; + } + + /// Initializes a new instance of . + /// Whether the content detection resulted in a content filtering action. 
+ /// Whether the labeled content category was detected in the content. + /// If available, the citation details describing the associated license and its location. + /// Keeps track of any properties unknown to the library. + internal ContentFilterProtectedMaterialResult(bool filtered, bool detected, ContentFilterProtectedMaterialCitedResult citation, IDictionary serializedAdditionalRawData) + { + Filtered = filtered; + Detected = detected; + Citation = citation; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + /// Initializes a new instance of for deserialization. + internal ContentFilterProtectedMaterialResult() + { + } + + /// Whether the content detection resulted in a content filtering action. + public bool Filtered { get; } + /// Whether the labeled content category was detected in the content. + public bool Detected { get; } + /// If available, the citation details describing the associated license and its location. + public ContentFilterProtectedMaterialCitedResult Citation { get; } + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/ContentFilterResultForPrompt.Serialization.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/ContentFilterResultForPrompt.Serialization.cs new file mode 100644 index 000000000..552f85dfa --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/ContentFilterResultForPrompt.Serialization.cs @@ -0,0 +1,112 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Text.Json; + +namespace Azure.AI.OpenAI +{ + public partial class ContentFilterResultForPrompt : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ContentFilterResultForPrompt)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("prompt_index") != true && Optional.IsDefined(PromptIndex)) + { + writer.WritePropertyName("prompt_index"u8); + writer.WriteNumberValue(PromptIndex.Value); + } + if (SerializedAdditionalRawData?.ContainsKey("content_filter_results") != true && Optional.IsDefined(InternalResults)) + { + writer.WritePropertyName("content_filter_results"u8); + writer.WriteObjectValue(InternalResults, options); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + ContentFilterResultForPrompt IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ContentFilterResultForPrompt)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeContentFilterResultForPrompt(document.RootElement, options); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<ContentFilterResultForPrompt>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    return ModelReaderWriter.Write(this, options);
+                default:
+                    throw new FormatException($"The model {nameof(ContentFilterResultForPrompt)} does not support writing '{options.Format}' format.");
+            }
+        }
+
+        ContentFilterResultForPrompt IPersistableModel<ContentFilterResultForPrompt>.Create(BinaryData data, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<ContentFilterResultForPrompt>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    {
+                        using JsonDocument document = JsonDocument.Parse(data);
+                        return DeserializeContentFilterResultForPrompt(document.RootElement, options);
+                    }
+                default:
+                    throw new FormatException($"The model {nameof(ContentFilterResultForPrompt)} does not support reading '{options.Format}' format.");
+            }
+        }
+
+        string IPersistableModel<ContentFilterResultForPrompt>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J";
+
+        /// <summary> Deserializes the model from a raw response. </summary>
+        /// <param name="response"> The result to deserialize the model from. </param>
+        internal static ContentFilterResultForPrompt FromResponse(PipelineResponse response)
+        {
+            using var document = JsonDocument.Parse(response.Content);
+            return DeserializeContentFilterResultForPrompt(document.RootElement);
+        }
+
+        /// <summary> Convert into a <see cref="BinaryContent"/>. </summary>
+ internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/ContentFilterResultForPrompt.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/ContentFilterResultForPrompt.cs new file mode 100644 index 000000000..46aa27b09 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/ContentFilterResultForPrompt.cs @@ -0,0 +1,60 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace Azure.AI.OpenAI +{ + /// A content filter result associated with a single input prompt item into a generative AI system. + public partial class ContentFilterResultForPrompt + { + /// + /// Keeps track of any properties unknown to the library. + /// + /// To assign an object to the value of this property use . + /// + /// + /// To assign an already formatted json string to this property use . + /// + /// + /// Examples: + /// + /// + /// BinaryData.FromObjectAsJson("foo") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromString("\"foo\"") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromObjectAsJson(new { key = "value" }) + /// Creates a payload of { "key": "value" }. + /// + /// + /// BinaryData.FromString("{\"key\": \"value\"}") + /// Creates a payload of { "key": "value" }. + /// + /// + /// + /// + internal IDictionary SerializedAdditionalRawData { get; set; } + /// Initializes a new instance of . + internal ContentFilterResultForPrompt() + { + } + + /// Initializes a new instance of . + /// The index of the input prompt associated with the accompanying content filter result categories. + /// The content filter category details for the result. + /// Keeps track of any properties unknown to the library. + internal ContentFilterResultForPrompt(int? 
promptIndex, InternalAzureContentFilterResultForPromptContentFilterResults internalResults, IDictionary serializedAdditionalRawData) + { + PromptIndex = promptIndex; + InternalResults = internalResults; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/ContentFilterResultForResponse.Serialization.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/ContentFilterResultForResponse.Serialization.cs new file mode 100644 index 000000000..5b2644c9c --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/ContentFilterResultForResponse.Serialization.cs @@ -0,0 +1,270 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace Azure.AI.OpenAI +{ + public partial class ContentFilterResultForResponse : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ContentFilterResultForResponse)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("sexual") != true && Optional.IsDefined(Sexual)) + { + writer.WritePropertyName("sexual"u8); + writer.WriteObjectValue(Sexual, options); + } + if (SerializedAdditionalRawData?.ContainsKey("hate") != true && Optional.IsDefined(Hate)) + { + writer.WritePropertyName("hate"u8); + writer.WriteObjectValue(Hate, options); + } + if (SerializedAdditionalRawData?.ContainsKey("violence") != true && Optional.IsDefined(Violence)) + { + writer.WritePropertyName("violence"u8); + writer.WriteObjectValue(Violence, options); + } + if (SerializedAdditionalRawData?.ContainsKey("self_harm") != true && Optional.IsDefined(SelfHarm)) + { + writer.WritePropertyName("self_harm"u8); + writer.WriteObjectValue(SelfHarm, options); + } + if (SerializedAdditionalRawData?.ContainsKey("profanity") != true && Optional.IsDefined(Profanity)) + { + writer.WritePropertyName("profanity"u8); + writer.WriteObjectValue(Profanity, options); + } + if (SerializedAdditionalRawData?.ContainsKey("custom_blocklists") != true && Optional.IsDefined(CustomBlocklists)) + { + writer.WritePropertyName("custom_blocklists"u8); + writer.WriteObjectValue(CustomBlocklists, options); + } + if (SerializedAdditionalRawData?.ContainsKey("error") != true && Optional.IsDefined(Error)) + { + writer.WritePropertyName("error"u8); + writer.WriteObjectValue(Error, options); + } + if (SerializedAdditionalRawData?.ContainsKey("protected_material_text") != true && Optional.IsDefined(ProtectedMaterialText)) + { + writer.WritePropertyName("protected_material_text"u8); + writer.WriteObjectValue(ProtectedMaterialText, options); + } + if (SerializedAdditionalRawData?.ContainsKey("protected_material_code") != true && 
Optional.IsDefined(ProtectedMaterialCode)) + { + writer.WritePropertyName("protected_material_code"u8); + writer.WriteObjectValue(ProtectedMaterialCode, options); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + ContentFilterResultForResponse IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ContentFilterResultForResponse)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeContentFilterResultForResponse(document.RootElement, options); + } + + internal static ContentFilterResultForResponse DeserializeContentFilterResultForResponse(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + ContentFilterSeverityResult sexual = default; + ContentFilterSeverityResult hate = default; + ContentFilterSeverityResult violence = default; + ContentFilterSeverityResult selfHarm = default; + ContentFilterDetectionResult profanity = default; + ContentFilterBlocklistResult customBlocklists = default; + InternalAzureContentFilterResultForPromptContentFilterResultsError error = default; + ContentFilterDetectionResult protectedMaterialText = default; + ContentFilterProtectedMaterialResult protectedMaterialCode = default; + 
IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("sexual"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + sexual = ContentFilterSeverityResult.DeserializeContentFilterSeverityResult(property.Value, options); + continue; + } + if (property.NameEquals("hate"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + hate = ContentFilterSeverityResult.DeserializeContentFilterSeverityResult(property.Value, options); + continue; + } + if (property.NameEquals("violence"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + violence = ContentFilterSeverityResult.DeserializeContentFilterSeverityResult(property.Value, options); + continue; + } + if (property.NameEquals("self_harm"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + selfHarm = ContentFilterSeverityResult.DeserializeContentFilterSeverityResult(property.Value, options); + continue; + } + if (property.NameEquals("profanity"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + profanity = ContentFilterDetectionResult.DeserializeContentFilterDetectionResult(property.Value, options); + continue; + } + if (property.NameEquals("custom_blocklists"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + customBlocklists = ContentFilterBlocklistResult.DeserializeContentFilterBlocklistResult(property.Value, options); + continue; + } + if (property.NameEquals("error"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + error = InternalAzureContentFilterResultForPromptContentFilterResultsError.DeserializeInternalAzureContentFilterResultForPromptContentFilterResultsError(property.Value, options); + continue; + } + if (property.NameEquals("protected_material_text"u8)) + { + if 
(property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + protectedMaterialText = ContentFilterDetectionResult.DeserializeContentFilterDetectionResult(property.Value, options); + continue; + } + if (property.NameEquals("protected_material_code"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + protectedMaterialCode = ContentFilterProtectedMaterialResult.DeserializeContentFilterProtectedMaterialResult(property.Value, options); + continue; + } + if (options.Format != "W") + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new ContentFilterResultForResponse( + sexual, + hate, + violence, + selfHarm, + profanity, + customBlocklists, + error, + protectedMaterialText, + protectedMaterialCode, + serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(ContentFilterResultForResponse)} does not support writing '{options.Format}' format."); + } + } + + ContentFilterResultForResponse IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeContentFilterResultForResponse(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(ContentFilterResultForResponse)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + /// Deserializes the model from a raw response. + /// The result to deserialize the model from. + internal static ContentFilterResultForResponse FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeContentFilterResultForResponse(document.RootElement); + } + + /// Convert into a . + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/ContentFilterResultForResponse.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/ContentFilterResultForResponse.cs new file mode 100644 index 000000000..5228ea56a --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/ContentFilterResultForResponse.cs @@ -0,0 +1,129 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace Azure.AI.OpenAI +{ + /// A content filter result for a single response item produced by a generative AI system. + public partial class ContentFilterResultForResponse + { + /// + /// Keeps track of any properties unknown to the library. + /// + /// To assign an object to the value of this property use . + /// + /// + /// To assign an already formatted json string to this property use . + /// + /// + /// Examples: + /// + /// + /// BinaryData.FromObjectAsJson("foo") + /// Creates a payload of "foo". 
+ /// + /// + /// BinaryData.FromString("\"foo\"") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromObjectAsJson(new { key = "value" }) + /// Creates a payload of { "key": "value" }. + /// + /// + /// BinaryData.FromString("{\"key\": \"value\"}") + /// Creates a payload of { "key": "value" }. + /// + /// + /// + internal IDictionary SerializedAdditionalRawData { get; set; } + /// Initializes a new instance of . + internal ContentFilterResultForResponse() + { + } + + /// Initializes a new instance of . + /// + /// A content filter category for language related to anatomical organs and genitals, romantic relationships, acts + /// portrayed in erotic or affectionate terms, pregnancy, physical sexual acts, including those portrayed as an + /// assault or a forced sexual violent act against one's will, prostitution, pornography, and abuse. + /// + /// + /// A content filter category that can refer to any content that attacks or uses pejorative or discriminatory + /// language with reference to a person or identity group based on certain differentiating attributes of these groups + /// including but not limited to race, ethnicity, nationality, gender identity and expression, sexual orientation, + /// religion, immigration status, ability status, personal appearance, and body size. + /// + /// + /// A content filter category for language related to physical actions intended to hurt, injure, damage, or kill + /// someone or something; describes weapons, guns, and related entities, such as manufacturers, associations, + /// legislation, and so on. + /// + /// + /// A content filter category that describes language related to physical actions intended to purposely hurt, injure, + /// damage one's body or kill oneself. + /// + /// + /// A detection result that identifies whether crude, vulgar, or otherwise objectionable language is present in the + /// content. + /// + /// A collection of binary filtering outcomes for configured custom blocklists. 
+ /// If present, details about an error that prevented content filtering from completing its evaluation. + /// A detection result that describes a match against text protected under copyright or other status. + /// A detection result that describes a match against licensed code or other protected source material. + /// Keeps track of any properties unknown to the library. + internal ContentFilterResultForResponse(ContentFilterSeverityResult sexual, ContentFilterSeverityResult hate, ContentFilterSeverityResult violence, ContentFilterSeverityResult selfHarm, ContentFilterDetectionResult profanity, ContentFilterBlocklistResult customBlocklists, InternalAzureContentFilterResultForPromptContentFilterResultsError error, ContentFilterDetectionResult protectedMaterialText, ContentFilterProtectedMaterialResult protectedMaterialCode, IDictionary serializedAdditionalRawData) + { + Sexual = sexual; + Hate = hate; + Violence = violence; + SelfHarm = selfHarm; + Profanity = profanity; + CustomBlocklists = customBlocklists; + Error = error; + ProtectedMaterialText = protectedMaterialText; + ProtectedMaterialCode = protectedMaterialCode; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + /// + /// A content filter category for language related to anatomical organs and genitals, romantic relationships, acts + /// portrayed in erotic or affectionate terms, pregnancy, physical sexual acts, including those portrayed as an + /// assault or a forced sexual violent act against one's will, prostitution, pornography, and abuse. 
+ /// + public ContentFilterSeverityResult Sexual { get; } + /// + /// A content filter category that can refer to any content that attacks or uses pejorative or discriminatory + /// language with reference to a person or identity group based on certain differentiating attributes of these groups + /// including but not limited to race, ethnicity, nationality, gender identity and expression, sexual orientation, + /// religion, immigration status, ability status, personal appearance, and body size. + /// + public ContentFilterSeverityResult Hate { get; } + /// + /// A content filter category for language related to physical actions intended to hurt, injure, damage, or kill + /// someone or something; describes weapons, guns, and related entities, such as manufacturers, associations, + /// legislation, and so on. + /// + public ContentFilterSeverityResult Violence { get; } + /// + /// A content filter category that describes language related to physical actions intended to purposely hurt, injure, + /// damage one's body or kill oneself. + /// + public ContentFilterSeverityResult SelfHarm { get; } + /// + /// A detection result that identifies whether crude, vulgar, or otherwise objectionable language is present in the + /// content. + /// + public ContentFilterDetectionResult Profanity { get; } + /// A collection of binary filtering outcomes for configured custom blocklists. + public ContentFilterBlocklistResult CustomBlocklists { get; } + /// A detection result that describes a match against text protected under copyright or other status. + public ContentFilterDetectionResult ProtectedMaterialText { get; } + /// A detection result that describes a match against licensed code or other protected source material. 
+ public ContentFilterProtectedMaterialResult ProtectedMaterialCode { get; } + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/ContentFilterSeverity.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/ContentFilterSeverity.cs new file mode 100644 index 000000000..91a9f01a8 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/ContentFilterSeverity.cs @@ -0,0 +1,54 @@ +// + +#nullable disable + +using System; +using System.ComponentModel; + +namespace Azure.AI.OpenAI +{ + /// The AzureContentFilterSeverityResultSeverity. + public readonly partial struct ContentFilterSeverity : IEquatable + { + private readonly string _value; + + /// Initializes a new instance of . + /// is null. + public ContentFilterSeverity(string value) + { + _value = value ?? throw new ArgumentNullException(nameof(value)); + } + + private const string SafeValue = "safe"; + private const string LowValue = "low"; + private const string MediumValue = "medium"; + private const string HighValue = "high"; + + /// safe. + public static ContentFilterSeverity Safe { get; } = new ContentFilterSeverity(SafeValue); + /// low. + public static ContentFilterSeverity Low { get; } = new ContentFilterSeverity(LowValue); + /// medium. + public static ContentFilterSeverity Medium { get; } = new ContentFilterSeverity(MediumValue); + /// high. + public static ContentFilterSeverity High { get; } = new ContentFilterSeverity(HighValue); + /// Determines if two values are the same. + public static bool operator ==(ContentFilterSeverity left, ContentFilterSeverity right) => left.Equals(right); + /// Determines if two values are not the same. + public static bool operator !=(ContentFilterSeverity left, ContentFilterSeverity right) => !left.Equals(right); + /// Converts a string to a . 
+ public static implicit operator ContentFilterSeverity(string value) => new ContentFilterSeverity(value); + + /// + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is ContentFilterSeverity other && Equals(other); + /// + public bool Equals(ContentFilterSeverity other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + /// + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + /// + public override string ToString() => _value; + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/ContentFilterSeverityResult.Serialization.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/ContentFilterSeverityResult.Serialization.cs new file mode 100644 index 000000000..6f7da3ac8 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/ContentFilterSeverityResult.Serialization.cs @@ -0,0 +1,147 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace Azure.AI.OpenAI +{ + public partial class ContentFilterSeverityResult : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ContentFilterSeverityResult)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("filtered") != true) + { + writer.WritePropertyName("filtered"u8); + writer.WriteBooleanValue(Filtered); + } + if (SerializedAdditionalRawData?.ContainsKey("severity") != true) + { + writer.WritePropertyName("severity"u8); + writer.WriteStringValue(Severity.ToString()); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + ContentFilterSeverityResult IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ContentFilterSeverityResult)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeContentFilterSeverityResult(document.RootElement, options); + } + + internal static ContentFilterSeverityResult DeserializeContentFilterSeverityResult(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + bool filtered = default; + ContentFilterSeverity severity = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("filtered"u8)) + { + filtered = property.Value.GetBoolean(); + continue; + } + if (property.NameEquals("severity"u8)) + { + severity = new ContentFilterSeverity(property.Value.GetString()); + continue; + } + if (options.Format != "W") + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new ContentFilterSeverityResult(filtered, severity, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(ContentFilterSeverityResult)} does not support writing '{options.Format}' format."); + } + } + + ContentFilterSeverityResult IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeContentFilterSeverityResult(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(ContentFilterSeverityResult)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + /// Deserializes the model from a raw response. + /// The result to deserialize the model from. + internal static ContentFilterSeverityResult FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeContentFilterSeverityResult(document.RootElement); + } + + /// Convert into a . 
+ internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/ContentFilterSeverityResult.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/ContentFilterSeverityResult.cs new file mode 100644 index 000000000..95f0fa1bc --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/ContentFilterSeverityResult.cs @@ -0,0 +1,75 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace Azure.AI.OpenAI +{ + /// + /// A labeled content filter result item that indicates whether the content was filtered and what the qualitative + /// severity level of the content was, as evaluated against content filter configuration for the category. + /// + public partial class ContentFilterSeverityResult + { + /// + /// Keeps track of any properties unknown to the library. + /// + /// To assign an object to the value of this property use . + /// + /// + /// To assign an already formatted json string to this property use . + /// + /// + /// Examples: + /// + /// + /// BinaryData.FromObjectAsJson("foo") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromString("\"foo\"") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromObjectAsJson(new { key = "value" }) + /// Creates a payload of { "key": "value" }. + /// + /// + /// BinaryData.FromString("{\"key\": \"value\"}") + /// Creates a payload of { "key": "value" }. + /// + /// + /// + /// + internal IDictionary SerializedAdditionalRawData { get; set; } + /// Initializes a new instance of . + /// Whether the content severity resulted in a content filtering action. + /// The labeled severity of the content. + internal ContentFilterSeverityResult(bool filtered, ContentFilterSeverity severity) + { + Filtered = filtered; + Severity = severity; + } + + /// Initializes a new instance of . 
+ /// Whether the content severity resulted in a content filtering action. + /// The labeled severity of the content. + /// Keeps track of any properties unknown to the library. + internal ContentFilterSeverityResult(bool filtered, ContentFilterSeverity severity, IDictionary serializedAdditionalRawData) + { + Filtered = filtered; + Severity = severity; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + /// Initializes a new instance of for deserialization. + internal ContentFilterSeverityResult() + { + } + + /// Whether the content severity resulted in a content filtering action. + public bool Filtered { get; } + /// The labeled severity of the content. + public ContentFilterSeverity Severity { get; } + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/DataSourceAuthentication.Serialization.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/DataSourceAuthentication.Serialization.cs new file mode 100644 index 000000000..88ed804d9 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/DataSourceAuthentication.Serialization.cs @@ -0,0 +1,132 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Text.Json; + +namespace Azure.AI.OpenAI.Chat +{ + [PersistableModelProxy(typeof(InternalUnknownAzureChatDataSourceAuthenticationOptions))] + public partial class DataSourceAuthentication : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(DataSourceAuthentication)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + DataSourceAuthentication IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(DataSourceAuthentication)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeDataSourceAuthentication(document.RootElement, options); + } + + internal static DataSourceAuthentication DeserializeDataSourceAuthentication(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + if (element.TryGetProperty("type", out JsonElement discriminator)) + { + switch (discriminator.GetString()) + { + case "access_token": return InternalAzureChatDataSourceAccessTokenAuthenticationOptions.DeserializeInternalAzureChatDataSourceAccessTokenAuthenticationOptions(element, options); + case "api_key": return InternalAzureChatDataSourceApiKeyAuthenticationOptions.DeserializeInternalAzureChatDataSourceApiKeyAuthenticationOptions(element, options); + case "connection_string": return InternalAzureChatDataSourceConnectionStringAuthenticationOptions.DeserializeInternalAzureChatDataSourceConnectionStringAuthenticationOptions(element, options); + case "encoded_api_key": return InternalAzureChatDataSourceEncodedApiKeyAuthenticationOptions.DeserializeInternalAzureChatDataSourceEncodedApiKeyAuthenticationOptions(element, options); + case "key_and_key_id": return InternalAzureChatDataSourceKeyAndKeyIdAuthenticationOptions.DeserializeInternalAzureChatDataSourceKeyAndKeyIdAuthenticationOptions(element, options); + case "system_assigned_managed_identity": return InternalAzureChatDataSourceSystemAssignedManagedIdentityAuthenticationOptions.DeserializeInternalAzureChatDataSourceSystemAssignedManagedIdentityAuthenticationOptions(element, options); + case "user_assigned_managed_identity": return 
InternalAzureChatDataSourceUserAssignedManagedIdentityAuthenticationOptions.DeserializeInternalAzureChatDataSourceUserAssignedManagedIdentityAuthenticationOptions(element, options); + } + } + return InternalUnknownAzureChatDataSourceAuthenticationOptions.DeserializeInternalUnknownAzureChatDataSourceAuthenticationOptions(element, options); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(DataSourceAuthentication)} does not support writing '{options.Format}' format."); + } + } + + DataSourceAuthentication IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeDataSourceAuthentication(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(DataSourceAuthentication)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + /// Deserializes the model from a raw response. + /// The result to deserialize the model from. + internal static DataSourceAuthentication FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeDataSourceAuthentication(document.RootElement); + } + + /// Convert into a . 
+ internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/DataSourceAuthentication.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/DataSourceAuthentication.cs new file mode 100644 index 000000000..60914d91a --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/DataSourceAuthentication.cs @@ -0,0 +1,64 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace Azure.AI.OpenAI.Chat +{ + /// + /// The AzureChatDataSourceAuthenticationOptions. + /// Please note is the base class. Depending on the scenario, a derived class of the base class might need to be assigned here, or this property may need to be cast to one of the possible derived classes. + /// + public abstract partial class DataSourceAuthentication + { + /// + /// Keeps track of any properties unknown to the library. + /// + /// To assign an object to the value of this property use . + /// + /// + /// To assign an already formatted json string to this property use . + /// + /// + /// Examples: + /// + /// + /// BinaryData.FromObjectAsJson("foo") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromString("\"foo\"") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromObjectAsJson(new { key = "value" }) + /// Creates a payload of { "key": "value" }. + /// + /// + /// BinaryData.FromString("{\"key\": \"value\"}") + /// Creates a payload of { "key": "value" }. + /// + /// + /// + /// + internal IDictionary SerializedAdditionalRawData { get; set; } + /// Initializes a new instance of . + protected DataSourceAuthentication() + { + } + + /// Initializes a new instance of . + /// Discriminator. + /// Keeps track of any properties unknown to the library. 
+ internal DataSourceAuthentication(string type, IDictionary serializedAdditionalRawData) + { + Type = type; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + /// Discriminator. + internal string Type { get; set; } + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/DataSourceFieldMappings.Serialization.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/DataSourceFieldMappings.Serialization.cs new file mode 100644 index 000000000..7afbad202 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/DataSourceFieldMappings.Serialization.cs @@ -0,0 +1,252 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace Azure.AI.OpenAI.Chat +{ + public partial class DataSourceFieldMappings : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(DataSourceFieldMappings)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("title_field") != true && Optional.IsDefined(TitleFieldName)) + { + writer.WritePropertyName("title_field"u8); + writer.WriteStringValue(TitleFieldName); + } + if (SerializedAdditionalRawData?.ContainsKey("url_field") != true && Optional.IsDefined(UrlFieldName)) + { + writer.WritePropertyName("url_field"u8); + writer.WriteStringValue(UrlFieldName); + } + if (SerializedAdditionalRawData?.ContainsKey("filepath_field") != true && Optional.IsDefined(FilepathFieldName)) + { + writer.WritePropertyName("filepath_field"u8); + writer.WriteStringValue(FilepathFieldName); + } + if (SerializedAdditionalRawData?.ContainsKey("content_fields") != true && Optional.IsCollectionDefined(ContentFieldNames)) + { + writer.WritePropertyName("content_fields"u8); + writer.WriteStartArray(); + foreach (var item in ContentFieldNames) + { + writer.WriteStringValue(item); + } + writer.WriteEndArray(); + } + if (SerializedAdditionalRawData?.ContainsKey("content_fields_separator") != true && Optional.IsDefined(ContentFieldSeparator)) + { + writer.WritePropertyName("content_fields_separator"u8); + writer.WriteStringValue(ContentFieldSeparator); + } + if (SerializedAdditionalRawData?.ContainsKey("vector_fields") != true && Optional.IsCollectionDefined(VectorFieldNames)) + { + writer.WritePropertyName("vector_fields"u8); + writer.WriteStartArray(); + foreach (var item in VectorFieldNames) + { + writer.WriteStringValue(item); + } + writer.WriteEndArray(); + } + if (SerializedAdditionalRawData?.ContainsKey("image_vector_fields") != true && Optional.IsCollectionDefined(ImageVectorFieldNames)) + { + writer.WritePropertyName("image_vector_fields"u8); + writer.WriteStartArray(); + foreach (var item in 
ImageVectorFieldNames) + { + writer.WriteStringValue(item); + } + writer.WriteEndArray(); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + DataSourceFieldMappings IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(DataSourceFieldMappings)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeDataSourceFieldMappings(document.RootElement, options); + } + + internal static DataSourceFieldMappings DeserializeDataSourceFieldMappings(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string titleField = default; + string urlField = default; + string filepathField = default; + IList contentFields = default; + string contentFieldsSeparator = default; + IList vectorFields = default; + IList imageVectorFields = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("title_field"u8)) + { + titleField = property.Value.GetString(); + continue; + } + if (property.NameEquals("url_field"u8)) + { + urlField = property.Value.GetString(); + continue; + } + if 
(property.NameEquals("filepath_field"u8)) + { + filepathField = property.Value.GetString(); + continue; + } + if (property.NameEquals("content_fields"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + List array = new List(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(item.GetString()); + } + contentFields = array; + continue; + } + if (property.NameEquals("content_fields_separator"u8)) + { + contentFieldsSeparator = property.Value.GetString(); + continue; + } + if (property.NameEquals("vector_fields"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + List array = new List(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(item.GetString()); + } + vectorFields = array; + continue; + } + if (property.NameEquals("image_vector_fields"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + List array = new List(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(item.GetString()); + } + imageVectorFields = array; + continue; + } + if (options.Format != "W") + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new DataSourceFieldMappings( + titleField, + urlField, + filepathField, + contentFields ?? new ChangeTrackingList(), + contentFieldsSeparator, + vectorFields ?? new ChangeTrackingList(), + imageVectorFields ?? new ChangeTrackingList(), + serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<DataSourceFieldMappings>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(DataSourceFieldMappings)} does not support writing '{options.Format}' format."); + } + } + + DataSourceFieldMappings IPersistableModel<DataSourceFieldMappings>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<DataSourceFieldMappings>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeDataSourceFieldMappings(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(DataSourceFieldMappings)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<DataSourceFieldMappings>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + /// Deserializes the model from a raw response. + /// The result to deserialize the model from. + internal static DataSourceFieldMappings FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeDataSourceFieldMappings(document.RootElement); + } + + /// Convert into a <see cref="BinaryContent"/>. + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/DataSourceFieldMappings.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/DataSourceFieldMappings.cs new file mode 100644 index 000000000..9933608ce --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/DataSourceFieldMappings.cs @@ -0,0 +1,66 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace Azure.AI.OpenAI.Chat +{ + /// The AzureSearchChatDataSourceParametersFieldsMapping.
+ public partial class DataSourceFieldMappings + { + /// + /// Keeps track of any properties unknown to the library. + /// + /// To assign an object to the value of this property use . + /// + /// + /// To assign an already formatted json string to this property use . + /// + /// + /// Examples: + /// + /// + /// BinaryData.FromObjectAsJson("foo") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromString("\"foo\"") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromObjectAsJson(new { key = "value" }) + /// Creates a payload of { "key": "value" }. + /// + /// + /// BinaryData.FromString("{\"key\": \"value\"}") + /// Creates a payload of { "key": "value" }. + /// + /// + /// + /// + internal IDictionary SerializedAdditionalRawData { get; set; } + + /// Initializes a new instance of . + /// The name of the index field to use as a title. + /// The name of the index field to use as a URL. + /// The name of the index field to use as a filepath. + /// The names of index fields that should be treated as content. + /// The separator pattern that content fields should use. + /// The names of fields that represent vector data. + /// The names of fields that represent image vector data. + /// Keeps track of any properties unknown to the library. 
+ internal DataSourceFieldMappings(string titleFieldName, string urlFieldName, string filepathFieldName, IList<string> contentFieldNames, string contentFieldSeparator, IList<string> vectorFieldNames, IList<string> imageVectorFieldNames, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + TitleFieldName = titleFieldName; + UrlFieldName = urlFieldName; + FilepathFieldName = filepathFieldName; + ContentFieldNames = contentFieldNames; + ContentFieldSeparator = contentFieldSeparator; + VectorFieldNames = vectorFieldNames; + ImageVectorFieldNames = imageVectorFieldNames; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/DataSourceQueryType.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/DataSourceQueryType.cs new file mode 100644 index 000000000..1f05c54d6 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/DataSourceQueryType.cs @@ -0,0 +1,57 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ComponentModel; + +namespace Azure.AI.OpenAI.Chat +{ + /// The AzureSearchChatDataSourceParametersQueryType. + public readonly partial struct DataSourceQueryType : IEquatable<DataSourceQueryType> + { + private readonly string _value; + + /// Initializes a new instance of <see cref="DataSourceQueryType"/>. + /// <paramref name="value"/> is null. + public DataSourceQueryType(string value) + { + _value = value ?? throw new ArgumentNullException(nameof(value)); + } + + private const string SimpleValue = "simple"; + private const string SemanticValue = "semantic"; + private const string VectorValue = "vector"; + private const string VectorSimpleHybridValue = "vector_simple_hybrid"; + private const string VectorSemanticHybridValue = "vector_semantic_hybrid"; + + /// simple. + public static DataSourceQueryType Simple { get; } = new DataSourceQueryType(SimpleValue); + /// semantic. + public static DataSourceQueryType Semantic { get; } = new DataSourceQueryType(SemanticValue); + /// vector.
+ public static DataSourceQueryType Vector { get; } = new DataSourceQueryType(VectorValue); + /// vector_simple_hybrid. + public static DataSourceQueryType VectorSimpleHybrid { get; } = new DataSourceQueryType(VectorSimpleHybridValue); + /// vector_semantic_hybrid. + public static DataSourceQueryType VectorSemanticHybrid { get; } = new DataSourceQueryType(VectorSemanticHybridValue); + /// Determines if two values are the same. + public static bool operator ==(DataSourceQueryType left, DataSourceQueryType right) => left.Equals(right); + /// Determines if two values are not the same. + public static bool operator !=(DataSourceQueryType left, DataSourceQueryType right) => !left.Equals(right); + /// Converts a string to a . + public static implicit operator DataSourceQueryType(string value) => new DataSourceQueryType(value); + + /// + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is DataSourceQueryType other && Equals(other); + /// + public bool Equals(DataSourceQueryType other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + /// + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? 
StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + /// + public override string ToString() => _value; + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/DataSourceVectorizer.Serialization.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/DataSourceVectorizer.Serialization.cs new file mode 100644 index 000000000..15db67fa0 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/DataSourceVectorizer.Serialization.cs @@ -0,0 +1,128 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Text.Json; + +namespace Azure.AI.OpenAI.Chat +{ + [PersistableModelProxy(typeof(InternalUnknownAzureChatDataSourceVectorizationSource))] + public partial class DataSourceVectorizer : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(DataSourceVectorizer)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + DataSourceVectorizer IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(DataSourceVectorizer)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeDataSourceVectorizer(document.RootElement, options); + } + + internal static DataSourceVectorizer DeserializeDataSourceVectorizer(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + if (element.TryGetProperty("type", out JsonElement discriminator)) + { + switch (discriminator.GetString()) + { + case "deployment_name": return InternalAzureChatDataSourceDeploymentNameVectorizationSource.DeserializeInternalAzureChatDataSourceDeploymentNameVectorizationSource(element, options); + case "endpoint": return InternalAzureChatDataSourceEndpointVectorizationSource.DeserializeInternalAzureChatDataSourceEndpointVectorizationSource(element, options); + case "model_id": return InternalAzureChatDataSourceModelIdVectorizationSource.DeserializeInternalAzureChatDataSourceModelIdVectorizationSource(element, options); + } + } + return InternalUnknownAzureChatDataSourceVectorizationSource.DeserializeInternalUnknownAzureChatDataSourceVectorizationSource(element, options); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(DataSourceVectorizer)} does not support writing '{options.Format}' format."); + } + } + + DataSourceVectorizer IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<DataSourceVectorizer>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeDataSourceVectorizer(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(DataSourceVectorizer)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<DataSourceVectorizer>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + /// Deserializes the model from a raw response. + /// The result to deserialize the model from. + internal static DataSourceVectorizer FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeDataSourceVectorizer(document.RootElement); + } + + /// Convert into a <see cref="BinaryContent"/>. + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/DataSourceVectorizer.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/DataSourceVectorizer.cs new file mode 100644 index 000000000..474c24590 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/DataSourceVectorizer.cs @@ -0,0 +1,64 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace Azure.AI.OpenAI.Chat +{ + /// + /// A representation of a data vectorization source usable as an embedding resource with a data source. + /// Please note <see cref="DataSourceVectorizer"/> is the base class. According to the scenario, a derived class of the base class might need to be assigned here, or this property needs to be cast to one of the possible derived classes. + /// + public abstract partial class DataSourceVectorizer + { + /// + /// Keeps track of any properties unknown to the library. + /// + /// To assign an object to the value of this property use <see cref="BinaryData.FromObjectAsJson{T}(T)"/>.
+ /// + /// + /// To assign an already formatted json string to this property use . + /// + /// + /// Examples: + /// + /// + /// BinaryData.FromObjectAsJson("foo") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromString("\"foo\"") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromObjectAsJson(new { key = "value" }) + /// Creates a payload of { "key": "value" }. + /// + /// + /// BinaryData.FromString("{\"key\": \"value\"}") + /// Creates a payload of { "key": "value" }. + /// + /// + /// + /// + internal IDictionary SerializedAdditionalRawData { get; set; } + /// Initializes a new instance of . + protected DataSourceVectorizer() + { + } + + /// Initializes a new instance of . + /// The differentiating identifier for the concrete vectorization source. + /// Keeps track of any properties unknown to the library. + internal DataSourceVectorizer(string type, IDictionary serializedAdditionalRawData) + { + Type = type; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + /// The differentiating identifier for the concrete vectorization source. + internal string Type { get; set; } + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/ElasticsearchChatDataSource.Serialization.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/ElasticsearchChatDataSource.Serialization.cs new file mode 100644 index 000000000..5265a9315 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/ElasticsearchChatDataSource.Serialization.cs @@ -0,0 +1,147 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace Azure.AI.OpenAI.Chat +{ + public partial class ElasticsearchChatDataSource : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ElasticsearchChatDataSource)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("parameters") != true) + { + writer.WritePropertyName("parameters"u8); + writer.WriteObjectValue(InternalParameters, options); + } + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + ElasticsearchChatDataSource IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ElasticsearchChatDataSource)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeElasticsearchChatDataSource(document.RootElement, options); + } + + internal static ElasticsearchChatDataSource DeserializeElasticsearchChatDataSource(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + InternalElasticsearchChatDataSourceParameters parameters = default; + string type = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("parameters"u8)) + { + parameters = InternalElasticsearchChatDataSourceParameters.DeserializeInternalElasticsearchChatDataSourceParameters(property.Value, options); + continue; + } + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (options.Format != "W") + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new ElasticsearchChatDataSource(type, serializedAdditionalRawData, parameters); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(ElasticsearchChatDataSource)} does not support writing '{options.Format}' format."); + } + } + + ElasticsearchChatDataSource IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeElasticsearchChatDataSource(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(ElasticsearchChatDataSource)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + /// Deserializes the model from a raw response. + /// The result to deserialize the model from. + internal static new ElasticsearchChatDataSource FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeElasticsearchChatDataSource(document.RootElement); + } + + /// Convert into a . + internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/ElasticsearchChatDataSource.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/ElasticsearchChatDataSource.cs new file mode 100644 index 000000000..db281d1d6 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/ElasticsearchChatDataSource.cs @@ -0,0 +1,14 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace Azure.AI.OpenAI.Chat +{ + /// The ElasticsearchChatDataSource. 
+ public partial class ElasticsearchChatDataSource : AzureChatDataSource + { + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/ImageContentFilterResultForPrompt.Serialization.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/ImageContentFilterResultForPrompt.Serialization.cs new file mode 100644 index 000000000..1ecef8716 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/ImageContentFilterResultForPrompt.Serialization.cs @@ -0,0 +1,234 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace Azure.AI.OpenAI +{ + public partial class ImageContentFilterResultForPrompt : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ImageContentFilterResultForPrompt)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("profanity") != true && Optional.IsDefined(Profanity)) + { + writer.WritePropertyName("profanity"u8); + writer.WriteObjectValue(Profanity, options); + } + if (SerializedAdditionalRawData?.ContainsKey("custom_blocklists") != true && Optional.IsDefined(CustomBlocklists)) + { + writer.WritePropertyName("custom_blocklists"u8); + writer.WriteObjectValue(CustomBlocklists, options); + } + if (SerializedAdditionalRawData?.ContainsKey("jailbreak") != true) + { + writer.WritePropertyName("jailbreak"u8); + writer.WriteObjectValue(Jailbreak, options); + } + if (SerializedAdditionalRawData?.ContainsKey("sexual") != true && Optional.IsDefined(Sexual)) + { + writer.WritePropertyName("sexual"u8); + writer.WriteObjectValue(Sexual, options); + } + if 
(SerializedAdditionalRawData?.ContainsKey("violence") != true && Optional.IsDefined(Violence)) + { + writer.WritePropertyName("violence"u8); + writer.WriteObjectValue(Violence, options); + } + if (SerializedAdditionalRawData?.ContainsKey("hate") != true && Optional.IsDefined(Hate)) + { + writer.WritePropertyName("hate"u8); + writer.WriteObjectValue(Hate, options); + } + if (SerializedAdditionalRawData?.ContainsKey("self_harm") != true && Optional.IsDefined(SelfHarm)) + { + writer.WritePropertyName("self_harm"u8); + writer.WriteObjectValue(SelfHarm, options); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + ImageContentFilterResultForPrompt IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ImageContentFilterResultForPrompt)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeImageContentFilterResultForPrompt(document.RootElement, options); + } + + internal static ImageContentFilterResultForPrompt DeserializeImageContentFilterResultForPrompt(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + ContentFilterDetectionResult profanity = default; + ContentFilterBlocklistResult customBlocklists = default; + ContentFilterDetectionResult jailbreak = default; + ContentFilterSeverityResult sexual = default; + ContentFilterSeverityResult violence = default; + ContentFilterSeverityResult hate = default; + ContentFilterSeverityResult selfHarm = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("profanity"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + profanity = ContentFilterDetectionResult.DeserializeContentFilterDetectionResult(property.Value, options); + continue; + } + if (property.NameEquals("custom_blocklists"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + customBlocklists = ContentFilterBlocklistResult.DeserializeContentFilterBlocklistResult(property.Value, options); + continue; + } + if (property.NameEquals("jailbreak"u8)) + { + jailbreak = ContentFilterDetectionResult.DeserializeContentFilterDetectionResult(property.Value, options); + continue; + } + if (property.NameEquals("sexual"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } 
+ sexual = ContentFilterSeverityResult.DeserializeContentFilterSeverityResult(property.Value, options); + continue; + } + if (property.NameEquals("violence"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + violence = ContentFilterSeverityResult.DeserializeContentFilterSeverityResult(property.Value, options); + continue; + } + if (property.NameEquals("hate"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + hate = ContentFilterSeverityResult.DeserializeContentFilterSeverityResult(property.Value, options); + continue; + } + if (property.NameEquals("self_harm"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + selfHarm = ContentFilterSeverityResult.DeserializeContentFilterSeverityResult(property.Value, options); + continue; + } + if (options.Format != "W") + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new ImageContentFilterResultForPrompt( + sexual, + violence, + hate, + selfHarm, + serializedAdditionalRawData, + profanity, + customBlocklists, + jailbreak); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(ImageContentFilterResultForPrompt)} does not support writing '{options.Format}' format."); + } + } + + ImageContentFilterResultForPrompt IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeImageContentFilterResultForPrompt(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(ImageContentFilterResultForPrompt)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + /// Deserializes the model from a raw response. + /// The result to deserialize the model from. + internal static new ImageContentFilterResultForPrompt FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeImageContentFilterResultForPrompt(document.RootElement); + } + + /// Convert into a . + internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/ImageContentFilterResultForPrompt.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/ImageContentFilterResultForPrompt.cs new file mode 100644 index 000000000..86b7f00cc --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/ImageContentFilterResultForPrompt.cs @@ -0,0 +1,85 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace Azure.AI.OpenAI +{ + /// A content filter result for an image generation operation's input request content. + public partial class ImageContentFilterResultForPrompt : ImageContentFilterResultForResponse + { + /// Initializes a new instance of . + /// + /// A detection result that describes user prompt injection attacks, where malicious users deliberately exploit + /// system vulnerabilities to elicit unauthorized behavior from the LLM. 
This could lead to inappropriate content + /// generation or violations of system-imposed restrictions. + /// + /// <paramref name="jailbreak"/> is null. + internal ImageContentFilterResultForPrompt(ContentFilterDetectionResult jailbreak) + { + Argument.AssertNotNull(jailbreak, nameof(jailbreak)); + + Jailbreak = jailbreak; + } + + /// Initializes a new instance of <see cref="ImageContentFilterResultForPrompt"/>. + /// + /// A content filter category for language related to anatomical organs and genitals, romantic relationships, acts + /// portrayed in erotic or affectionate terms, pregnancy, physical sexual acts, including those portrayed as an + /// assault or a forced sexual violent act against one's will, prostitution, pornography, and abuse. + /// + /// + /// A content filter category for language related to physical actions intended to hurt, injure, damage, or kill + /// someone or something; describes weapons, guns and related entities, such as manufacturers, associations, + /// legislation, and so on. + /// + /// + /// A content filter category that can refer to any content that attacks or uses pejorative or discriminatory + /// language with reference to a person or identity group based on certain differentiating attributes of these groups + /// including but not limited to race, ethnicity, nationality, gender identity and expression, sexual orientation, + /// religion, immigration status, ability status, personal appearance, and body size. + /// + /// + /// A content filter category that describes language related to physical actions intended to purposely hurt, injure, + /// damage one's body or kill oneself. + /// + /// Keeps track of any properties unknown to the library. + /// + /// A detection result that identifies whether crude, vulgar, or otherwise objectionable language is present in the + /// content. + /// + /// A collection of binary filtering outcomes for configured custom blocklists.
+ /// + /// A detection result that describes user prompt injection attacks, where malicious users deliberately exploit + /// system vulnerabilities to elicit unauthorized behavior from the LLM. This could lead to inappropriate content + /// generation or violations of system-imposed restrictions. + /// + internal ImageContentFilterResultForPrompt(ContentFilterSeverityResult sexual, ContentFilterSeverityResult violence, ContentFilterSeverityResult hate, ContentFilterSeverityResult selfHarm, IDictionary<string, BinaryData> serializedAdditionalRawData, ContentFilterDetectionResult profanity, ContentFilterBlocklistResult customBlocklists, ContentFilterDetectionResult jailbreak) : base(sexual, violence, hate, selfHarm, serializedAdditionalRawData) + { + Profanity = profanity; + CustomBlocklists = customBlocklists; + Jailbreak = jailbreak; + } + + /// Initializes a new instance of <see cref="ImageContentFilterResultForPrompt"/> for deserialization. + internal ImageContentFilterResultForPrompt() + { + } + + /// + /// A detection result that identifies whether crude, vulgar, or otherwise objectionable language is present in the + /// content. + /// + public ContentFilterDetectionResult Profanity { get; } + /// A collection of binary filtering outcomes for configured custom blocklists. + public ContentFilterBlocklistResult CustomBlocklists { get; } + /// + /// A detection result that describes user prompt injection attacks, where malicious users deliberately exploit + /// system vulnerabilities to elicit unauthorized behavior from the LLM. This could lead to inappropriate content + /// generation or violations of system-imposed restrictions.
+ /// + public ContentFilterDetectionResult Jailbreak { get; } + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/ImageContentFilterResultForResponse.Serialization.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/ImageContentFilterResultForResponse.Serialization.cs new file mode 100644 index 000000000..d666cf634 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/ImageContentFilterResultForResponse.Serialization.cs @@ -0,0 +1,185 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace Azure.AI.OpenAI +{ + public partial class ImageContentFilterResultForResponse : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ImageContentFilterResultForResponse)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("sexual") != true && Optional.IsDefined(Sexual)) + { + writer.WritePropertyName("sexual"u8); + writer.WriteObjectValue(Sexual, options); + } + if (SerializedAdditionalRawData?.ContainsKey("violence") != true && Optional.IsDefined(Violence)) + { + writer.WritePropertyName("violence"u8); + writer.WriteObjectValue(Violence, options); + } + if (SerializedAdditionalRawData?.ContainsKey("hate") != true && Optional.IsDefined(Hate)) + { + writer.WritePropertyName("hate"u8); + writer.WriteObjectValue(Hate, options); + } + if (SerializedAdditionalRawData?.ContainsKey("self_harm") != true && Optional.IsDefined(SelfHarm)) + { + writer.WritePropertyName("self_harm"u8); + writer.WriteObjectValue(SelfHarm, options); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in 
SerializedAdditionalRawData)
+                {
+                    if (ModelSerializationExtensions.IsSentinelValue(item.Value))
+                    {
+                        continue;
+                    }
+                    writer.WritePropertyName(item.Key);
+#if NET6_0_OR_GREATER
+                    writer.WriteRawValue(item.Value);
+#else
+                    using (JsonDocument document = JsonDocument.Parse(item.Value))
+                    {
+                        JsonSerializer.Serialize(writer, document.RootElement);
+                    }
+#endif
+                }
+            }
+            writer.WriteEndObject();
+        }
+
+        ImageContentFilterResultForResponse IJsonModel<ImageContentFilterResultForResponse>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<ImageContentFilterResultForResponse>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(ImageContentFilterResultForResponse)} does not support reading '{format}' format.");
+            }
+
+            using JsonDocument document = JsonDocument.ParseValue(ref reader);
+            return DeserializeImageContentFilterResultForResponse(document.RootElement, options);
+        }
+
+        internal static ImageContentFilterResultForResponse DeserializeImageContentFilterResultForResponse(JsonElement element, ModelReaderWriterOptions options = null)
+        {
+            options ??= ModelSerializationExtensions.WireOptions;
+
+            if (element.ValueKind == JsonValueKind.Null)
+            {
+                return null;
+            }
+            ContentFilterSeverityResult sexual = default;
+            ContentFilterSeverityResult violence = default;
+            ContentFilterSeverityResult hate = default;
+            ContentFilterSeverityResult selfHarm = default;
+            IDictionary<string, BinaryData> serializedAdditionalRawData = default;
+            Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>();
+            foreach (var property in element.EnumerateObject())
+            {
+                if (property.NameEquals("sexual"u8))
+                {
+                    if (property.Value.ValueKind == JsonValueKind.Null)
+                    {
+                        continue;
+                    }
+                    sexual = ContentFilterSeverityResult.DeserializeContentFilterSeverityResult(property.Value, options);
+                    continue;
+                }
+                if (property.NameEquals("violence"u8))
+                {
+                    if (property.Value.ValueKind == JsonValueKind.Null)
+                    {
+                        continue;
+                    }
+                    violence =
ContentFilterSeverityResult.DeserializeContentFilterSeverityResult(property.Value, options);
+                    continue;
+                }
+                if (property.NameEquals("hate"u8))
+                {
+                    if (property.Value.ValueKind == JsonValueKind.Null)
+                    {
+                        continue;
+                    }
+                    hate = ContentFilterSeverityResult.DeserializeContentFilterSeverityResult(property.Value, options);
+                    continue;
+                }
+                if (property.NameEquals("self_harm"u8))
+                {
+                    if (property.Value.ValueKind == JsonValueKind.Null)
+                    {
+                        continue;
+                    }
+                    selfHarm = ContentFilterSeverityResult.DeserializeContentFilterSeverityResult(property.Value, options);
+                    continue;
+                }
+                if (options.Format != "W")
+                {
+                    rawDataDictionary ??= new Dictionary<string, BinaryData>();
+                    rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText()));
+                }
+            }
+            serializedAdditionalRawData = rawDataDictionary;
+            return new ImageContentFilterResultForResponse(sexual, violence, hate, selfHarm, serializedAdditionalRawData);
+        }
+
+        BinaryData IPersistableModel<ImageContentFilterResultForResponse>.Write(ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<ImageContentFilterResultForResponse>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    return ModelReaderWriter.Write(this, options);
+                default:
+                    throw new FormatException($"The model {nameof(ImageContentFilterResultForResponse)} does not support writing '{options.Format}' format.");
+            }
+        }
+
+        ImageContentFilterResultForResponse IPersistableModel<ImageContentFilterResultForResponse>.Create(BinaryData data, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ?
((IPersistableModel<ImageContentFilterResultForResponse>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    {
+                        using JsonDocument document = JsonDocument.Parse(data);
+                        return DeserializeImageContentFilterResultForResponse(document.RootElement, options);
+                    }
+                default:
+                    throw new FormatException($"The model {nameof(ImageContentFilterResultForResponse)} does not support reading '{options.Format}' format.");
+            }
+        }
+
+        string IPersistableModel<ImageContentFilterResultForResponse>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J";
+
+        /// <summary> Deserializes the model from a raw response. </summary>
+        /// <param name="response"> The result to deserialize the model from. </param>
+        internal static ImageContentFilterResultForResponse FromResponse(PipelineResponse response)
+        {
+            using var document = JsonDocument.Parse(response.Content);
+            return DeserializeImageContentFilterResultForResponse(document.RootElement);
+        }
+
+        /// <summary> Convert into a <see cref="BinaryContent"/>. </summary>
+        internal virtual BinaryContent ToBinaryContent()
+        {
+            return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions);
+        }
+    }
+}
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/ImageContentFilterResultForResponse.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/ImageContentFilterResultForResponse.cs
new file mode 100644
index 000000000..ed4b17e85
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/ImageContentFilterResultForResponse.cs
@@ -0,0 +1,105 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.Collections.Generic;
+
+namespace Azure.AI.OpenAI
+{
+    /// <summary> A content filter result for an image generation operation's output response content. </summary>
+    public partial class ImageContentFilterResultForResponse
+    {
+        ///
+        /// Keeps track of any properties unknown to the library.
+        /// <para>
+        /// To assign an object to the value of this property use <see cref="BinaryData.FromObjectAsJson{T}(T, System.Text.Json.JsonSerializerOptions?)"/>.
+        /// </para>
+        /// <para>
+        /// To assign an already formatted json string to this property use <see cref="BinaryData.FromString(string)"/>.
+        /// </para>
+        /// <para>
+        /// Examples:
+        /// <list type="bullet">
+        /// <item>
+        /// <term>BinaryData.FromObjectAsJson("foo")</term>
+        /// <description>Creates a payload of "foo".</description>
+        /// </item>
+        /// <item>
+        /// <term>BinaryData.FromString("\"foo\"")</term>
+        /// <description>Creates a payload of "foo".</description>
+        /// </item>
+        /// <item>
+        /// <term>BinaryData.FromObjectAsJson(new { key = "value" })</term>
+        /// <description>Creates a payload of { "key": "value" }.</description>
+        /// </item>
+        /// <item>
+        /// <term>BinaryData.FromString("{\"key\": \"value\"}")</term>
+        /// <description>Creates a payload of { "key": "value" }.</description>
+        /// </item>
+        /// </list>
+        /// </para>
+        internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; }
+        /// <summary> Initializes a new instance of <see cref="ImageContentFilterResultForResponse"/>. </summary>
+        internal ImageContentFilterResultForResponse()
+        {
+        }
+
+        /// <summary> Initializes a new instance of <see cref="ImageContentFilterResultForResponse"/>. </summary>
+        /// <param name="sexual">
+        /// A content filter category for language related to anatomical organs and genitals, romantic relationships, acts
+        /// portrayed in erotic or affectionate terms, pregnancy, physical sexual acts, including those portrayed as an
+        /// assault or a forced sexual violent act against one's will, prostitution, pornography, and abuse.
+        /// </param>
+        /// <param name="violence">
+        /// A content filter category for language related to physical actions intended to hurt, injure, damage, or kill
+        /// someone or something; describes weapons, guns and related entities, such as manufactures, associations,
+        /// legislation, and so on.
+        /// </param>
+        /// <param name="hate">
+        /// A content filter category that can refer to any content that attacks or uses pejorative or discriminatory
+        /// language with reference to a person or identity group based on certain differentiating attributes of these groups
+        /// including but not limited to race, ethnicity, nationality, gender identity and expression, sexual orientation,
+        /// religion, immigration status, ability status, personal appearance, and body size.
+        /// </param>
+        /// <param name="selfHarm">
+        /// A content filter category that describes language related to physical actions intended to purposely hurt, injure,
+        /// damage one's body or kill oneself.
+        /// </param>
+        /// <param name="serializedAdditionalRawData"> Keeps track of any properties unknown to the library. </param>
+        internal ImageContentFilterResultForResponse(ContentFilterSeverityResult sexual, ContentFilterSeverityResult violence, ContentFilterSeverityResult hate, ContentFilterSeverityResult selfHarm, IDictionary<string, BinaryData> serializedAdditionalRawData)
+        {
+            Sexual = sexual;
+            Violence = violence;
+            Hate = hate;
+            SelfHarm = selfHarm;
+            SerializedAdditionalRawData = serializedAdditionalRawData;
+        }
+
+        /// <summary>
+        /// A content filter category for language related to anatomical organs and genitals, romantic relationships, acts
+        /// portrayed in erotic or affectionate terms, pregnancy, physical sexual acts, including those portrayed as an
+        /// assault or a forced sexual violent act against one's will, prostitution, pornography, and abuse.
+        /// </summary>
+        public ContentFilterSeverityResult Sexual { get; }
+        /// <summary>
+        /// A content filter category for language related to physical actions intended to hurt, injure, damage, or kill
+        /// someone or something; describes weapons, guns and related entities, such as manufactures, associations,
+        /// legislation, and so on.
+        /// </summary>
+        public ContentFilterSeverityResult Violence { get; }
+        /// <summary>
+        /// A content filter category that can refer to any content that attacks or uses pejorative or discriminatory
+        /// language with reference to a person or identity group based on certain differentiating attributes of these groups
+        /// including but not limited to race, ethnicity, nationality, gender identity and expression, sexual orientation,
+        /// religion, immigration status, ability status, personal appearance, and body size.
+        /// </summary>
+        public ContentFilterSeverityResult Hate { get; }
+        ///
+        /// A content filter category that describes language related to physical actions intended to purposely hurt, injure,
+        /// damage one's body or kill oneself.
+ /// + public ContentFilterSeverityResult SelfHarm { get; } + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/Internal/Argument.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/Internal/Argument.cs new file mode 100644 index 000000000..5a8ab138b --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/Internal/Argument.cs @@ -0,0 +1,126 @@ +// + +#nullable disable + +using System; +using System.Collections; +using System.Collections.Generic; + +namespace Azure.AI.OpenAI +{ + internal static class Argument + { + public static void AssertNotNull(T value, string name) + { + if (value is null) + { + throw new ArgumentNullException(name); + } + } + + public static void AssertNotNull(T? value, string name) + where T : struct + { + if (!value.HasValue) + { + throw new ArgumentNullException(name); + } + } + + public static void AssertNotNullOrEmpty(IEnumerable value, string name) + { + if (value is null) + { + throw new ArgumentNullException(name); + } + if (value is ICollection collectionOfT && collectionOfT.Count == 0) + { + throw new ArgumentException("Value cannot be an empty collection.", name); + } + if (value is ICollection collection && collection.Count == 0) + { + throw new ArgumentException("Value cannot be an empty collection.", name); + } + using IEnumerator e = value.GetEnumerator(); + if (!e.MoveNext()) + { + throw new ArgumentException("Value cannot be an empty collection.", name); + } + } + + public static void AssertNotNullOrEmpty(string value, string name) + { + if (value is null) + { + throw new ArgumentNullException(name); + } + if (value.Length == 0) + { + throw new ArgumentException("Value cannot be an empty string.", name); + } + } + + public static void AssertNotNullOrWhiteSpace(string value, string name) + { + if (value is null) + { + throw new ArgumentNullException(name); + } + if (string.IsNullOrWhiteSpace(value)) + { + throw new ArgumentException("Value cannot be empty or contain only 
white-space characters.", name); + } + } + + public static void AssertNotDefault(ref T value, string name) + where T : struct, IEquatable + { + if (value.Equals(default)) + { + throw new ArgumentException("Value cannot be empty.", name); + } + } + + public static void AssertInRange(T value, T minimum, T maximum, string name) + where T : notnull, IComparable + { + if (minimum.CompareTo(value) > 0) + { + throw new ArgumentOutOfRangeException(name, "Value is less than the minimum allowed."); + } + if (maximum.CompareTo(value) < 0) + { + throw new ArgumentOutOfRangeException(name, "Value is greater than the maximum allowed."); + } + } + + public static void AssertEnumDefined(Type enumType, object value, string name) + { + if (!Enum.IsDefined(enumType, value)) + { + throw new ArgumentException($"Value not defined for {enumType.FullName}.", name); + } + } + + public static T CheckNotNull(T value, string name) + where T : class + { + AssertNotNull(value, name); + return value; + } + + public static string CheckNotNullOrEmpty(string value, string name) + { + AssertNotNullOrEmpty(value, name); + return value; + } + + public static void AssertNull(T value, string name, string message = null) + { + if (value != null) + { + throw new ArgumentException(message ?? 
"Value must be null.", name); + } + } + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/Internal/BinaryContentHelper.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/Internal/BinaryContentHelper.cs new file mode 100644 index 000000000..e6f35c517 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/Internal/BinaryContentHelper.cs @@ -0,0 +1,133 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.Collections.Generic; +using System.Text.Json; + +namespace Azure.AI.OpenAI +{ + internal static class BinaryContentHelper + { + public static BinaryContent FromEnumerable(IEnumerable enumerable) + where T : notnull + { + Utf8JsonBinaryContent content = new Utf8JsonBinaryContent(); + content.JsonWriter.WriteStartArray(); + foreach (var item in enumerable) + { + content.JsonWriter.WriteObjectValue(item, ModelSerializationExtensions.WireOptions); + } + content.JsonWriter.WriteEndArray(); + + return content; + } + + public static BinaryContent FromEnumerable(IEnumerable enumerable) + { + Utf8JsonBinaryContent content = new Utf8JsonBinaryContent(); + content.JsonWriter.WriteStartArray(); + foreach (var item in enumerable) + { + if (item == null) + { + content.JsonWriter.WriteNullValue(); + } + else + { +#if NET6_0_OR_GREATER + content.JsonWriter.WriteRawValue(item); +#else + using (JsonDocument document = JsonDocument.Parse(item)) + { + JsonSerializer.Serialize(content.JsonWriter, document.RootElement); + } +#endif + } + } + content.JsonWriter.WriteEndArray(); + + return content; + } + + public static BinaryContent FromEnumerable(ReadOnlySpan span) + where T : notnull + { + Utf8JsonBinaryContent content = new Utf8JsonBinaryContent(); + content.JsonWriter.WriteStartArray(); + for (int i = 0; i < span.Length; i++) + { + content.JsonWriter.WriteObjectValue(span[i], ModelSerializationExtensions.WireOptions); + } + content.JsonWriter.WriteEndArray(); + + return content; + } + + public static 
BinaryContent FromDictionary(IDictionary dictionary) + where TValue : notnull + { + Utf8JsonBinaryContent content = new Utf8JsonBinaryContent(); + content.JsonWriter.WriteStartObject(); + foreach (var item in dictionary) + { + content.JsonWriter.WritePropertyName(item.Key); + content.JsonWriter.WriteObjectValue(item.Value, ModelSerializationExtensions.WireOptions); + } + content.JsonWriter.WriteEndObject(); + + return content; + } + + public static BinaryContent FromDictionary(IDictionary dictionary) + { + Utf8JsonBinaryContent content = new Utf8JsonBinaryContent(); + content.JsonWriter.WriteStartObject(); + foreach (var item in dictionary) + { + content.JsonWriter.WritePropertyName(item.Key); + if (item.Value == null) + { + content.JsonWriter.WriteNullValue(); + } + else + { +#if NET6_0_OR_GREATER + content.JsonWriter.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(content.JsonWriter, document.RootElement); + } +#endif + } + } + content.JsonWriter.WriteEndObject(); + + return content; + } + + public static BinaryContent FromObject(object value) + { + Utf8JsonBinaryContent content = new Utf8JsonBinaryContent(); + content.JsonWriter.WriteObjectValue(value, ModelSerializationExtensions.WireOptions); + return content; + } + + public static BinaryContent FromObject(BinaryData value) + { + Utf8JsonBinaryContent content = new Utf8JsonBinaryContent(); +#if NET6_0_OR_GREATER + content.JsonWriter.WriteRawValue(value); +#else + using (JsonDocument document = JsonDocument.Parse(value)) + { + JsonSerializer.Serialize(content.JsonWriter, document.RootElement); + } +#endif + return content; + } + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/Internal/ChangeTrackingDictionary.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/Internal/ChangeTrackingDictionary.cs new file mode 100644 index 000000000..058e71abe --- /dev/null +++ 
b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/Internal/ChangeTrackingDictionary.cs @@ -0,0 +1,164 @@ +// + +#nullable disable + +using System; +using System.Collections; +using System.Collections.Generic; + +namespace Azure.AI.OpenAI +{ + internal class ChangeTrackingDictionary : IDictionary, IReadOnlyDictionary where TKey : notnull + { + private IDictionary _innerDictionary; + + public ChangeTrackingDictionary() + { + } + + public ChangeTrackingDictionary(IDictionary dictionary) + { + if (dictionary == null) + { + return; + } + _innerDictionary = new Dictionary(dictionary); + } + + public ChangeTrackingDictionary(IReadOnlyDictionary dictionary) + { + if (dictionary == null) + { + return; + } + _innerDictionary = new Dictionary(); + foreach (var pair in dictionary) + { + _innerDictionary.Add(pair); + } + } + + public bool IsUndefined => _innerDictionary == null; + + public int Count => IsUndefined ? 0 : EnsureDictionary().Count; + + public bool IsReadOnly => IsUndefined ? false : EnsureDictionary().IsReadOnly; + + public ICollection Keys => IsUndefined ? Array.Empty() : EnsureDictionary().Keys; + + public ICollection Values => IsUndefined ? 
Array.Empty() : EnsureDictionary().Values; + + public TValue this[TKey key] + { + get + { + if (IsUndefined) + { + throw new KeyNotFoundException(nameof(key)); + } + return EnsureDictionary()[key]; + } + set + { + EnsureDictionary()[key] = value; + } + } + + IEnumerable IReadOnlyDictionary.Keys => Keys; + + IEnumerable IReadOnlyDictionary.Values => Values; + + public IEnumerator> GetEnumerator() + { + if (IsUndefined) + { + IEnumerator> enumerateEmpty() + { + yield break; + } + return enumerateEmpty(); + } + return EnsureDictionary().GetEnumerator(); + } + + IEnumerator IEnumerable.GetEnumerator() + { + return GetEnumerator(); + } + + public void Add(KeyValuePair item) + { + EnsureDictionary().Add(item); + } + + public void Clear() + { + EnsureDictionary().Clear(); + } + + public bool Contains(KeyValuePair item) + { + if (IsUndefined) + { + return false; + } + return EnsureDictionary().Contains(item); + } + + public void CopyTo(KeyValuePair[] array, int index) + { + if (IsUndefined) + { + return; + } + EnsureDictionary().CopyTo(array, index); + } + + public bool Remove(KeyValuePair item) + { + if (IsUndefined) + { + return false; + } + return EnsureDictionary().Remove(item); + } + + public void Add(TKey key, TValue value) + { + EnsureDictionary().Add(key, value); + } + + public bool ContainsKey(TKey key) + { + if (IsUndefined) + { + return false; + } + return EnsureDictionary().ContainsKey(key); + } + + public bool Remove(TKey key) + { + if (IsUndefined) + { + return false; + } + return EnsureDictionary().Remove(key); + } + + public bool TryGetValue(TKey key, out TValue value) + { + if (IsUndefined) + { + value = default; + return false; + } + return EnsureDictionary().TryGetValue(key, out value); + } + + public IDictionary EnsureDictionary() + { + return _innerDictionary ??= new Dictionary(); + } + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/Internal/ChangeTrackingList.cs 
b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/Internal/ChangeTrackingList.cs new file mode 100644 index 000000000..9c6986a3a --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/Internal/ChangeTrackingList.cs @@ -0,0 +1,150 @@ +// + +#nullable disable + +using System; +using System.Collections; +using System.Collections.Generic; +using System.Linq; + +namespace Azure.AI.OpenAI +{ + internal class ChangeTrackingList : IList, IReadOnlyList + { + private IList _innerList; + + public ChangeTrackingList() + { + } + + public ChangeTrackingList(IList innerList) + { + if (innerList != null) + { + _innerList = innerList; + } + } + + public ChangeTrackingList(IReadOnlyList innerList) + { + if (innerList != null) + { + _innerList = innerList.ToList(); + } + } + + public bool IsUndefined => _innerList == null; + + public int Count => IsUndefined ? 0 : EnsureList().Count; + + public bool IsReadOnly => IsUndefined ? false : EnsureList().IsReadOnly; + + public T this[int index] + { + get + { + if (IsUndefined) + { + throw new ArgumentOutOfRangeException(nameof(index)); + } + return EnsureList()[index]; + } + set + { + if (IsUndefined) + { + throw new ArgumentOutOfRangeException(nameof(index)); + } + EnsureList()[index] = value; + } + } + + public void Reset() + { + _innerList = null; + } + + public IEnumerator GetEnumerator() + { + if (IsUndefined) + { + IEnumerator enumerateEmpty() + { + yield break; + } + return enumerateEmpty(); + } + return EnsureList().GetEnumerator(); + } + + IEnumerator IEnumerable.GetEnumerator() + { + return GetEnumerator(); + } + + public void Add(T item) + { + EnsureList().Add(item); + } + + public void Clear() + { + EnsureList().Clear(); + } + + public bool Contains(T item) + { + if (IsUndefined) + { + return false; + } + return EnsureList().Contains(item); + } + + public void CopyTo(T[] array, int arrayIndex) + { + if (IsUndefined) + { + return; + } + EnsureList().CopyTo(array, arrayIndex); + } + + public bool 
Remove(T item) + { + if (IsUndefined) + { + return false; + } + return EnsureList().Remove(item); + } + + public int IndexOf(T item) + { + if (IsUndefined) + { + return -1; + } + return EnsureList().IndexOf(item); + } + + public void Insert(int index, T item) + { + EnsureList().Insert(index, item); + } + + public void RemoveAt(int index) + { + if (IsUndefined) + { + throw new ArgumentOutOfRangeException(nameof(index)); + } + EnsureList().RemoveAt(index); + } + + public IList EnsureList() + { + return _innerList ??= new List(); + } + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/Internal/ClientPipelineExtensions.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/Internal/ClientPipelineExtensions.cs new file mode 100644 index 000000000..e1d028882 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/Internal/ClientPipelineExtensions.cs @@ -0,0 +1,41 @@ +// + +#nullable disable + +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Threading.Tasks; + +namespace Azure.AI.OpenAI +{ + internal static partial class ClientPipelineExtensions + { + public static async ValueTask> ProcessHeadAsBoolMessageAsync(this ClientPipeline pipeline, PipelineMessage message, RequestOptions options) + { + PipelineResponse response = await pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false); + switch (response.Status) + { + case >= 200 and < 300: + return ClientResult.FromValue(true, response); + case >= 400 and < 500: + return ClientResult.FromValue(false, response); + default: + return new ErrorResult(response, new ClientResultException(response)); + } + } + + public static ClientResult ProcessHeadAsBoolMessage(this ClientPipeline pipeline, PipelineMessage message, RequestOptions options) + { + PipelineResponse response = pipeline.ProcessMessage(message, options); + switch (response.Status) + { + case >= 200 and < 300: + return ClientResult.FromValue(true, response); + case >= 400 and < 
500: + return ClientResult.FromValue(false, response); + default: + return new ErrorResult(response, new ClientResultException(response)); + } + } + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/Internal/ClientUriBuilder.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/Internal/ClientUriBuilder.cs new file mode 100644 index 000000000..aa2ae4da3 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/Internal/ClientUriBuilder.cs @@ -0,0 +1,190 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; +using System.Linq; +using System.Text; + +namespace Azure.AI.OpenAI +{ + internal partial class ClientUriBuilder + { + private UriBuilder _uriBuilder; + private StringBuilder _pathBuilder; + private StringBuilder _queryBuilder; + + public ClientUriBuilder() + { + } + + private UriBuilder UriBuilder => _uriBuilder ??= new UriBuilder(); + + private StringBuilder PathBuilder => _pathBuilder ??= new StringBuilder(UriBuilder.Path); + + private StringBuilder QueryBuilder => _queryBuilder ??= new StringBuilder(UriBuilder.Query); + + public void Reset(Uri uri) + { + _uriBuilder = new UriBuilder(uri); + _pathBuilder = new StringBuilder(UriBuilder.Path); + _queryBuilder = new StringBuilder(UriBuilder.Query); + } + + public void AppendPath(string value, bool escape) + { + Argument.AssertNotNullOrWhiteSpace(value, nameof(value)); + + if (escape) + { + value = Uri.EscapeDataString(value); + } + + if (PathBuilder.Length > 0 && PathBuilder[PathBuilder.Length - 1] == '/' && value[0] == '/') + { + PathBuilder.Remove(PathBuilder.Length - 1, 1); + } + + PathBuilder.Append(value); + UriBuilder.Path = PathBuilder.ToString(); + } + + public void AppendPath(bool value, bool escape = false) + { + AppendPath(ModelSerializationExtensions.TypeFormatters.ConvertToString(value), escape); + } + + public void AppendPath(float value, bool escape = true) + { + 
AppendPath(ModelSerializationExtensions.TypeFormatters.ConvertToString(value), escape); + } + + public void AppendPath(double value, bool escape = true) + { + AppendPath(ModelSerializationExtensions.TypeFormatters.ConvertToString(value), escape); + } + + public void AppendPath(int value, bool escape = true) + { + AppendPath(ModelSerializationExtensions.TypeFormatters.ConvertToString(value), escape); + } + + public void AppendPath(byte[] value, string format, bool escape = true) + { + AppendPath(ModelSerializationExtensions.TypeFormatters.ConvertToString(value, format), escape); + } + + public void AppendPath(IEnumerable value, bool escape = true) + { + AppendPath(ModelSerializationExtensions.TypeFormatters.ConvertToString(value), escape); + } + + public void AppendPath(DateTimeOffset value, string format, bool escape = true) + { + AppendPath(ModelSerializationExtensions.TypeFormatters.ConvertToString(value, format), escape); + } + + public void AppendPath(TimeSpan value, string format, bool escape = true) + { + AppendPath(ModelSerializationExtensions.TypeFormatters.ConvertToString(value, format), escape); + } + + public void AppendPath(Guid value, bool escape = true) + { + AppendPath(ModelSerializationExtensions.TypeFormatters.ConvertToString(value), escape); + } + + public void AppendPath(long value, bool escape = true) + { + AppendPath(ModelSerializationExtensions.TypeFormatters.ConvertToString(value), escape); + } + + public void AppendQuery(string name, string value, bool escape) + { + Argument.AssertNotNullOrWhiteSpace(name, nameof(name)); + Argument.AssertNotNullOrWhiteSpace(value, nameof(value)); + + if (QueryBuilder.Length > 0) + { + QueryBuilder.Append('&'); + } + + if (escape) + { + value = Uri.EscapeDataString(value); + } + + QueryBuilder.Append(name); + QueryBuilder.Append('='); + QueryBuilder.Append(value); + } + + public void AppendQuery(string name, bool value, bool escape = false) + { + AppendQuery(name, 
ModelSerializationExtensions.TypeFormatters.ConvertToString(value), escape); + } + + public void AppendQuery(string name, float value, bool escape = true) + { + AppendQuery(name, ModelSerializationExtensions.TypeFormatters.ConvertToString(value), escape); + } + + public void AppendQuery(string name, DateTimeOffset value, string format, bool escape = true) + { + AppendQuery(name, ModelSerializationExtensions.TypeFormatters.ConvertToString(value, format), escape); + } + + public void AppendQuery(string name, TimeSpan value, string format, bool escape = true) + { + AppendQuery(name, ModelSerializationExtensions.TypeFormatters.ConvertToString(value, format), escape); + } + + public void AppendQuery(string name, double value, bool escape = true) + { + AppendQuery(name, ModelSerializationExtensions.TypeFormatters.ConvertToString(value), escape); + } + + public void AppendQuery(string name, decimal value, bool escape = true) + { + AppendQuery(name, ModelSerializationExtensions.TypeFormatters.ConvertToString(value), escape); + } + + public void AppendQuery(string name, int value, bool escape = true) + { + AppendQuery(name, ModelSerializationExtensions.TypeFormatters.ConvertToString(value), escape); + } + + public void AppendQuery(string name, long value, bool escape = true) + { + AppendQuery(name, ModelSerializationExtensions.TypeFormatters.ConvertToString(value), escape); + } + + public void AppendQuery(string name, TimeSpan value, bool escape = true) + { + AppendQuery(name, ModelSerializationExtensions.TypeFormatters.ConvertToString(value), escape); + } + + public void AppendQuery(string name, byte[] value, string format, bool escape = true) + { + AppendQuery(name, ModelSerializationExtensions.TypeFormatters.ConvertToString(value, format), escape); + } + + public void AppendQuery(string name, Guid value, bool escape = true) + { + AppendQuery(name, ModelSerializationExtensions.TypeFormatters.ConvertToString(value), escape); + } + + public void AppendQueryDelimited(string 
name, IEnumerable value, string delimiter, bool escape = true) + { + var stringValues = value.Select(v => ModelSerializationExtensions.TypeFormatters.ConvertToString(v)); + AppendQuery(name, string.Join(delimiter, stringValues), escape); + } + + public void AppendQueryDelimited(string name, IEnumerable value, string delimiter, string format, bool escape = true) + { + var stringValues = value.Select(v => ModelSerializationExtensions.TypeFormatters.ConvertToString(v, format)); + AppendQuery(name, string.Join(delimiter, stringValues), escape); + } + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/Internal/ErrorResult.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/Internal/ErrorResult.cs new file mode 100644 index 000000000..f9ea3276e --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/Internal/ErrorResult.cs @@ -0,0 +1,23 @@ +// + +#nullable disable + +using System.ClientModel; +using System.ClientModel.Primitives; + +namespace Azure.AI.OpenAI +{ + internal class ErrorResult : ClientResult + { + private readonly PipelineResponse _response; + private readonly ClientResultException _exception; + + public ErrorResult(PipelineResponse response, ClientResultException exception) : base(default, response) + { + _response = response; + _exception = exception; + } + + public override T Value => throw _exception; + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/Internal/ModelSerializationExtensions.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/Internal/ModelSerializationExtensions.cs new file mode 100644 index 000000000..f4286bb0e --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/Internal/ModelSerializationExtensions.cs @@ -0,0 +1,399 @@ +// + +#nullable disable + +using System; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Diagnostics; +using System.Globalization; +using System.Text.Json; +using System.Xml; + 
+namespace Azure.AI.OpenAI +{ + internal static class ModelSerializationExtensions + { + internal static readonly ModelReaderWriterOptions WireOptions = new ModelReaderWriterOptions("W"); + internal static readonly BinaryData SentinelValue = BinaryData.FromObjectAsJson("__EMPTY__"); + + public static object GetObject(this JsonElement element) + { + switch (element.ValueKind) + { + case JsonValueKind.String: + return element.GetString(); + case JsonValueKind.Number: + if (element.TryGetInt32(out int intValue)) + { + return intValue; + } + if (element.TryGetInt64(out long longValue)) + { + return longValue; + } + return element.GetDouble(); + case JsonValueKind.True: + return true; + case JsonValueKind.False: + return false; + case JsonValueKind.Undefined: + case JsonValueKind.Null: + return null; + case JsonValueKind.Object: + var dictionary = new Dictionary(); + foreach (var jsonProperty in element.EnumerateObject()) + { + dictionary.Add(jsonProperty.Name, jsonProperty.Value.GetObject()); + } + return dictionary; + case JsonValueKind.Array: + var list = new List(); + foreach (var item in element.EnumerateArray()) + { + list.Add(item.GetObject()); + } + return list.ToArray(); + default: + throw new NotSupportedException($"Not supported value kind {element.ValueKind}"); + } + } + + public static byte[] GetBytesFromBase64(this JsonElement element, string format) + { + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + + return format switch + { + "U" => TypeFormatters.FromBase64UrlString(element.GetRequiredString()), + "D" => element.GetBytesFromBase64(), + _ => throw new ArgumentException($"Format is not supported: '{format}'", nameof(format)) + }; + } + + public static DateTimeOffset GetDateTimeOffset(this JsonElement element, string format) => format switch + { + "U" when element.ValueKind == JsonValueKind.Number => DateTimeOffset.FromUnixTimeSeconds(element.GetInt64()), + _ => TypeFormatters.ParseDateTimeOffset(element.GetString(), format) + }; 
+ + public static TimeSpan GetTimeSpan(this JsonElement element, string format) => TypeFormatters.ParseTimeSpan(element.GetString(), format); + + public static char GetChar(this JsonElement element) + { + if (element.ValueKind == JsonValueKind.String) + { + var text = element.GetString(); + if (text == null || text.Length != 1) + { + throw new NotSupportedException($"Cannot convert \"{text}\" to a char"); + } + return text[0]; + } + else + { + throw new NotSupportedException($"Cannot convert {element.ValueKind} to a char"); + } + } + + [Conditional("DEBUG")] + public static void ThrowNonNullablePropertyIsNull(this JsonProperty property) + { + throw new JsonException($"A property '{property.Name}' defined as non-nullable but received as null from the service. This exception only happens in DEBUG builds of the library and would be ignored in the release build"); + } + + public static string GetRequiredString(this JsonElement element) + { + var value = element.GetString(); + if (value == null) + { + throw new InvalidOperationException($"The requested operation requires an element of type 'String', but the target element has type '{element.ValueKind}'."); + } + return value; + } + + public static void WriteStringValue(this Utf8JsonWriter writer, DateTimeOffset value, string format) + { + writer.WriteStringValue(TypeFormatters.ToString(value, format)); + } + + public static void WriteStringValue(this Utf8JsonWriter writer, DateTime value, string format) + { + writer.WriteStringValue(TypeFormatters.ToString(value, format)); + } + + public static void WriteStringValue(this Utf8JsonWriter writer, TimeSpan value, string format) + { + writer.WriteStringValue(TypeFormatters.ToString(value, format)); + } + + public static void WriteStringValue(this Utf8JsonWriter writer, char value) + { + writer.WriteStringValue(value.ToString(CultureInfo.InvariantCulture)); + } + + public static void WriteBase64StringValue(this Utf8JsonWriter writer, byte[] value, string format) + { + if 
(value == null)
+            {
+                writer.WriteNullValue();
+                return;
+            }
+            switch (format)
+            {
+                case "U":
+                    writer.WriteStringValue(TypeFormatters.ToBase64UrlString(value));
+                    break;
+                case "D":
+                    writer.WriteBase64StringValue(value);
+                    break;
+                default:
+                    throw new ArgumentException($"Format is not supported: '{format}'", nameof(format));
+            }
+        }
+
+        public static void WriteNumberValue(this Utf8JsonWriter writer, DateTimeOffset value, string format)
+        {
+            if (format != "U")
+            {
+                throw new ArgumentOutOfRangeException(nameof(format), "Only 'U' format is supported when writing a DateTimeOffset as a Number.");
+            }
+            writer.WriteNumberValue(value.ToUnixTimeSeconds());
+        }
+
+        public static void WriteObjectValue<T>(this Utf8JsonWriter writer, T value, ModelReaderWriterOptions options = null)
+        {
+            switch (value)
+            {
+                case null:
+                    writer.WriteNullValue();
+                    break;
+                case IJsonModel<T> jsonModel:
+                    jsonModel.Write(writer, options ?? WireOptions);
+                    break;
+                case byte[] bytes:
+                    writer.WriteBase64StringValue(bytes);
+                    break;
+                case BinaryData bytes0:
+                    writer.WriteBase64StringValue(bytes0);
+                    break;
+                case JsonElement json:
+                    json.WriteTo(writer);
+                    break;
+                case int i:
+                    writer.WriteNumberValue(i);
+                    break;
+                case decimal d:
+                    writer.WriteNumberValue(d);
+                    break;
+                case double d0:
+                    if (double.IsNaN(d0))
+                    {
+                        writer.WriteStringValue("NaN");
+                    }
+                    else
+                    {
+                        writer.WriteNumberValue(d0);
+                    }
+                    break;
+                case float f:
+                    writer.WriteNumberValue(f);
+                    break;
+                case long l:
+                    writer.WriteNumberValue(l);
+                    break;
+                case string s:
+                    writer.WriteStringValue(s);
+                    break;
+                case bool b:
+                    writer.WriteBooleanValue(b);
+                    break;
+                case Guid g:
+                    writer.WriteStringValue(g);
+                    break;
+                case DateTimeOffset dateTimeOffset:
+                    writer.WriteStringValue(dateTimeOffset, "O");
+                    break;
+                case DateTime dateTime:
+                    writer.WriteStringValue(dateTime, "O");
+                    break;
+                case IEnumerable<KeyValuePair<string, object>> enumerable:
+                    writer.WriteStartObject();
+                    foreach (var pair in enumerable)
+                    {
+                        writer.WritePropertyName(pair.Key);
+                        writer.WriteObjectValue(pair.Value, options);
+                    }
+                    writer.WriteEndObject();
+                    break;
+                case IEnumerable<object> objectEnumerable:
+                    writer.WriteStartArray();
+                    foreach (var item in objectEnumerable)
+                    {
+                        writer.WriteObjectValue(item, options);
+                    }
+                    writer.WriteEndArray();
+                    break;
+                case TimeSpan timeSpan:
+                    writer.WriteStringValue(timeSpan, "P");
+                    break;
+                default:
+                    throw new NotSupportedException($"Not supported type {value.GetType()}");
+            }
+        }
+
+        public static void WriteObjectValue(this Utf8JsonWriter writer, object value, ModelReaderWriterOptions options = null)
+        {
+            writer.WriteObjectValue<object>(value, options);
+        }
+
+        internal static bool IsSentinelValue(BinaryData value)
+        {
+            ReadOnlySpan<byte> sentinelSpan = SentinelValue.ToMemory().Span;
+            ReadOnlySpan<byte> valueSpan = value.ToMemory().Span;
+            return sentinelSpan.SequenceEqual(valueSpan);
+        }
+
+        internal static class TypeFormatters
+        {
+            private const string RoundtripZFormat = "yyyy-MM-ddTHH:mm:ss.fffffffZ";
+            public const string DefaultNumberFormat = "G";
+
+            public static string ToString(bool value) => value ? "true" : "false";
+
+            public static string ToString(DateTime value, string format) => value.Kind switch
+            {
+                DateTimeKind.Utc => ToString((DateTimeOffset)value, format),
+                _ => throw new NotSupportedException($"DateTime {value} has a Kind of {value.Kind}. Generated clients require it to be UTC. 
You can call DateTime.SpecifyKind to change Kind property value to DateTimeKind.Utc.") + }; + + public static string ToString(DateTimeOffset value, string format) => format switch + { + "D" => value.ToString("yyyy-MM-dd", CultureInfo.InvariantCulture), + "U" => value.ToUnixTimeSeconds().ToString(CultureInfo.InvariantCulture), + "O" => value.ToUniversalTime().ToString(RoundtripZFormat, CultureInfo.InvariantCulture), + "o" => value.ToUniversalTime().ToString(RoundtripZFormat, CultureInfo.InvariantCulture), + "R" => value.ToString("r", CultureInfo.InvariantCulture), + _ => value.ToString(format, CultureInfo.InvariantCulture) + }; + + public static string ToString(TimeSpan value, string format) => format switch + { + "P" => XmlConvert.ToString(value), + _ => value.ToString(format, CultureInfo.InvariantCulture) + }; + + public static string ToString(byte[] value, string format) => format switch + { + "U" => ToBase64UrlString(value), + "D" => Convert.ToBase64String(value), + _ => throw new ArgumentException($"Format is not supported: '{format}'", nameof(format)) + }; + + public static string ToBase64UrlString(byte[] value) + { + int numWholeOrPartialInputBlocks = checked(value.Length + 2) / 3; + int size = checked(numWholeOrPartialInputBlocks * 4); + char[] output = new char[size]; + + int numBase64Chars = Convert.ToBase64CharArray(value, 0, value.Length, output, 0); + + int i = 0; + for (; i < numBase64Chars; i++) + { + char ch = output[i]; + if (ch == '+') + { + output[i] = '-'; + } + else + { + if (ch == '/') + { + output[i] = '_'; + } + else + { + if (ch == '=') + { + break; + } + } + } + } + + return new string(output, 0, i); + } + + public static byte[] FromBase64UrlString(string value) + { + int paddingCharsToAdd = (value.Length % 4) switch + { + 0 => 0, + 2 => 2, + 3 => 1, + _ => throw new InvalidOperationException("Malformed input") + }; + char[] output = new char[(value.Length + paddingCharsToAdd)]; + int i = 0; + for (; i < value.Length; i++) + { + char ch = 
value[i];
+                    if (ch == '-')
+                    {
+                        output[i] = '+';
+                    }
+                    else
+                    {
+                        if (ch == '_')
+                        {
+                            output[i] = '/';
+                        }
+                        else
+                        {
+                            output[i] = ch;
+                        }
+                    }
+                }
+
+                for (; i < output.Length; i++)
+                {
+                    output[i] = '=';
+                }
+
+                return Convert.FromBase64CharArray(output, 0, output.Length);
+            }
+
+            public static DateTimeOffset ParseDateTimeOffset(string value, string format) => format switch
+            {
+                "U" => DateTimeOffset.FromUnixTimeSeconds(long.Parse(value, CultureInfo.InvariantCulture)),
+                _ => DateTimeOffset.Parse(value, CultureInfo.InvariantCulture, DateTimeStyles.AssumeUniversal)
+            };
+
+            public static TimeSpan ParseTimeSpan(string value, string format) => format switch
+            {
+                "P" => XmlConvert.ToTimeSpan(value),
+                _ => TimeSpan.ParseExact(value, format, CultureInfo.InvariantCulture)
+            };
+
+            public static string ConvertToString(object value, string format = null) => value switch
+            {
+                null => "null",
+                string s => s,
+                bool b => ToString(b),
+                int or float or double or long or decimal => ((IFormattable)value).ToString(DefaultNumberFormat, CultureInfo.InvariantCulture),
+                byte[] b0 when format != null => ToString(b0, format),
+                IEnumerable<string> s0 => string.Join(",", s0),
+                DateTimeOffset dateTime when format != null => ToString(dateTime, format),
+                TimeSpan timeSpan when format != null => ToString(timeSpan, format),
+                TimeSpan timeSpan0 => XmlConvert.ToString(timeSpan0),
+                Guid guid => guid.ToString(),
+                BinaryData binaryData => ConvertToString(binaryData.ToArray(), format),
+                _ => value.ToString()
+            };
+        }
+    }
+}
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/Internal/MultipartFormDataBinaryContent.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/Internal/MultipartFormDataBinaryContent.cs
new file mode 100644
index 000000000..43825f6b4
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/Internal/MultipartFormDataBinaryContent.cs
@@ -0,0 +1,197 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using 
System.Globalization;
+using System.IO;
+using System.Net.Http;
+using System.Net.Http.Headers;
+using System.Threading;
+using System.Threading.Tasks;
+
+namespace Azure.AI.OpenAI
+{
+    internal class MultipartFormDataBinaryContent : BinaryContent
+    {
+        private readonly MultipartFormDataContent _multipartContent;
+        private static readonly Random _random = new Random();
+        private static readonly char[] _boundaryValues = "0123456789=ABCDEFGHIJKLMNOPQRSTUVWXYZ_abcdefghijklmnopqrstuvwxyz".ToCharArray();
+
+        public MultipartFormDataBinaryContent()
+        {
+            _multipartContent = new MultipartFormDataContent(CreateBoundary());
+        }
+
+        public string ContentType
+        {
+            get
+            {
+                return _multipartContent.Headers.ContentType.ToString();
+            }
+        }
+
+        internal HttpContent HttpContent => _multipartContent;
+
+        private static string CreateBoundary()
+        {
+            Span<char> chars = new char[70];
+            byte[] random = new byte[70];
+            _random.NextBytes(random);
+            int mask = 255 >> 2;
+            for (int i = 0; i < 70; i++)
+            {
+                chars[i] = _boundaryValues[random[i] & mask];
+            }
+            return chars.ToString();
+        }
+
+        public void Add(string content, string name, string filename = null, string contentType = null)
+        {
+            Argument.AssertNotNull(content, nameof(content));
+            Argument.AssertNotNullOrEmpty(name, nameof(name));
+
+            Add(new StringContent(content), name, filename, contentType);
+        }
+
+        public void Add(int content, string name, string filename = null, string contentType = null)
+        {
+            Argument.AssertNotNull(content, nameof(content));
+            Argument.AssertNotNullOrEmpty(name, nameof(name));
+
+            string value = content.ToString("G", CultureInfo.InvariantCulture);
+            Add(new StringContent(value), name, filename, contentType);
+        }
+
+        public void Add(long content, string name, string filename = null, string contentType = null)
+        {
+            Argument.AssertNotNull(content, nameof(content));
+            Argument.AssertNotNullOrEmpty(name, nameof(name));
+
+            string value = content.ToString("G", CultureInfo.InvariantCulture);
+            Add(new 
StringContent(value), name, filename, contentType); + } + + public void Add(float content, string name, string filename = null, string contentType = null) + { + Argument.AssertNotNull(content, nameof(content)); + Argument.AssertNotNullOrEmpty(name, nameof(name)); + + string value = content.ToString("G", CultureInfo.InvariantCulture); + Add(new StringContent(value), name, filename, contentType); + } + + public void Add(double content, string name, string filename = null, string contentType = null) + { + Argument.AssertNotNull(content, nameof(content)); + Argument.AssertNotNullOrEmpty(name, nameof(name)); + + string value = content.ToString("G", CultureInfo.InvariantCulture); + Add(new StringContent(value), name, filename, contentType); + } + + public void Add(decimal content, string name, string filename = null, string contentType = null) + { + Argument.AssertNotNull(content, nameof(content)); + Argument.AssertNotNullOrEmpty(name, nameof(name)); + + string value = content.ToString("G", CultureInfo.InvariantCulture); + Add(new StringContent(value), name, filename, contentType); + } + + public void Add(bool content, string name, string filename = null, string contentType = null) + { + Argument.AssertNotNull(content, nameof(content)); + Argument.AssertNotNullOrEmpty(name, nameof(name)); + + string value = content ? 
"true" : "false"; + Add(new StringContent(value), name, filename, contentType); + } + + public void Add(Stream content, string name, string filename = null, string contentType = null) + { + Argument.AssertNotNull(content, nameof(content)); + Argument.AssertNotNullOrEmpty(name, nameof(name)); + + Add(new StreamContent(content), name, filename, contentType); + } + + public void Add(byte[] content, string name, string filename = null, string contentType = null) + { + Argument.AssertNotNull(content, nameof(content)); + Argument.AssertNotNullOrEmpty(name, nameof(name)); + + Add(new ByteArrayContent(content), name, filename, contentType); + } + + public void Add(BinaryData content, string name, string filename = null, string contentType = null) + { + Argument.AssertNotNull(content, nameof(content)); + Argument.AssertNotNullOrEmpty(name, nameof(name)); + + Add(new ByteArrayContent(content.ToArray()), name, filename, contentType); + } + + private void Add(HttpContent content, string name, string filename, string contentType) + { + if (filename != null) + { + Argument.AssertNotNullOrEmpty(filename, nameof(filename)); + AddFilenameHeader(content, name, filename); + } + if (contentType != null) + { + Argument.AssertNotNullOrEmpty(contentType, nameof(contentType)); + AddContentTypeHeader(content, contentType); + } + _multipartContent.Add(content, name); + } + + public static void AddFilenameHeader(HttpContent content, string name, string filename) + { + ContentDispositionHeaderValue header = new ContentDispositionHeaderValue("form-data") { Name = name, FileName = filename }; + content.Headers.ContentDisposition = header; + } + + public static void AddContentTypeHeader(HttpContent content, string contentType) + { + MediaTypeHeaderValue header = new MediaTypeHeaderValue(contentType); + content.Headers.ContentType = header; + } + + public override bool TryComputeLength(out long length) + { + if (_multipartContent.Headers.ContentLength is long contentLength) + { + length = 
contentLength;
+                return true;
+            }
+            length = 0;
+            return false;
+        }
+
+        public override void WriteTo(Stream stream, CancellationToken cancellationToken = default)
+        {
+#if NET6_0_OR_GREATER
+            _multipartContent.CopyTo(stream, default, cancellationToken);
+#else
+            _multipartContent.CopyToAsync(stream).GetAwaiter().GetResult();
+#endif
+        }
+
+        public override async Task WriteToAsync(Stream stream, CancellationToken cancellationToken = default)
+        {
+#if NET6_0_OR_GREATER
+            await _multipartContent.CopyToAsync(stream, cancellationToken).ConfigureAwait(false);
+#else
+            await _multipartContent.CopyToAsync(stream).ConfigureAwait(false);
+#endif
+        }
+
+        public override void Dispose()
+        {
+            _multipartContent.Dispose();
+        }
+    }
+}
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/Internal/Optional.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/Internal/Optional.cs
new file mode 100644
index 000000000..56bba5f65
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/Internal/Optional.cs
@@ -0,0 +1,48 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace Azure.AI.OpenAI
+{
+    internal static class Optional
+    {
+        public static bool IsCollectionDefined<T>(IEnumerable<T> collection)
+        {
+            return !(collection is ChangeTrackingList<T> changeTrackingList && changeTrackingList.IsUndefined);
+        }
+
+        public static bool IsCollectionDefined<TKey, TValue>(IDictionary<TKey, TValue> collection)
+        {
+            return !(collection is ChangeTrackingDictionary<TKey, TValue> changeTrackingDictionary && changeTrackingDictionary.IsUndefined);
+        }
+
+        public static bool IsCollectionDefined<TKey, TValue>(IReadOnlyDictionary<TKey, TValue> collection)
+        {
+            return !(collection is ChangeTrackingDictionary<TKey, TValue> changeTrackingDictionary && changeTrackingDictionary.IsUndefined);
+        }
+
+        public static bool IsDefined<T>(T? value)
+            where T : struct
+        {
+            return value.HasValue;
+        }
+
+        public static bool IsDefined(object value)
+        {
+            return value != null;
+        }
+
+        public static bool IsDefined(JsonElement value)
+        {
+            return value.ValueKind != JsonValueKind.Undefined;
+        }
+
+        public static bool IsDefined(string value)
+        {
+            return value != null;
+        }
+    }
+}
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/Internal/Utf8JsonBinaryContent.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/Internal/Utf8JsonBinaryContent.cs
new file mode 100644
index 000000000..7f70307bf
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/Internal/Utf8JsonBinaryContent.cs
@@ -0,0 +1,52 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System.ClientModel;
+using System.IO;
+using System.Text.Json;
+using System.Threading;
+using System.Threading.Tasks;
+
+namespace Azure.AI.OpenAI
+{
+    internal class Utf8JsonBinaryContent : BinaryContent
+    {
+        private readonly MemoryStream _stream;
+        private readonly BinaryContent _content;
+
+        public Utf8JsonBinaryContent()
+        {
+            _stream = new MemoryStream();
+            _content = Create(_stream);
+            JsonWriter = new Utf8JsonWriter(_stream);
+        }
+
+        public Utf8JsonWriter JsonWriter { get; }
+
+        public override async Task WriteToAsync(Stream stream, CancellationToken cancellationToken = default)
+        {
+            await JsonWriter.FlushAsync().ConfigureAwait(false);
+            await _content.WriteToAsync(stream, cancellationToken).ConfigureAwait(false);
+        }
+
+        public override void WriteTo(Stream stream, CancellationToken cancellationToken = default)
+        {
+            JsonWriter.Flush();
+            _content.WriteTo(stream, cancellationToken);
+        }
+
+        public override bool TryComputeLength(out long length)
+        {
+            length = JsonWriter.BytesCommitted + JsonWriter.BytesPending;
+            return true;
+        }
+
+        public override void Dispose()
+        {
+            JsonWriter.Dispose();
+            _content.Dispose();
+            _stream.Dispose();
+        }
+    }
+}
diff --git 
a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureChatDataSourceAccessTokenAuthenticationOptions.Serialization.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureChatDataSourceAccessTokenAuthenticationOptions.Serialization.cs
new file mode 100644
index 000000000..344aed027
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureChatDataSourceAccessTokenAuthenticationOptions.Serialization.cs
@@ -0,0 +1,148 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace Azure.AI.OpenAI.Chat
+{
+    internal partial class InternalAzureChatDataSourceAccessTokenAuthenticationOptions : IJsonModel<InternalAzureChatDataSourceAccessTokenAuthenticationOptions>
+    {
+        void IJsonModel<InternalAzureChatDataSourceAccessTokenAuthenticationOptions>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalAzureChatDataSourceAccessTokenAuthenticationOptions>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(InternalAzureChatDataSourceAccessTokenAuthenticationOptions)} does not support writing '{format}' format.");
+            }
+
+            writer.WriteStartObject();
+            if (SerializedAdditionalRawData?.ContainsKey("access_token") != true)
+            {
+                writer.WritePropertyName("access_token"u8);
+                writer.WriteStringValue(AccessToken);
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("type") != true)
+            {
+                writer.WritePropertyName("type"u8);
+                writer.WriteStringValue(Type);
+            }
+            if (SerializedAdditionalRawData != null)
+            {
+                foreach (var item in SerializedAdditionalRawData)
+                {
+                    if (ModelSerializationExtensions.IsSentinelValue(item.Value))
+                    {
+                        continue;
+                    }
+                    writer.WritePropertyName(item.Key);
+#if NET6_0_OR_GREATER
+                    writer.WriteRawValue(item.Value);
+#else
+                    using (JsonDocument document = JsonDocument.Parse(item.Value))
+                    {
+                        JsonSerializer.Serialize(writer, document.RootElement);
+                    }
+#endif
+                }
+            }
+            writer.WriteEndObject();
+        }
+
+        InternalAzureChatDataSourceAccessTokenAuthenticationOptions IJsonModel<InternalAzureChatDataSourceAccessTokenAuthenticationOptions>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalAzureChatDataSourceAccessTokenAuthenticationOptions>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(InternalAzureChatDataSourceAccessTokenAuthenticationOptions)} does not support reading '{format}' format.");
+            }
+
+            using JsonDocument document = JsonDocument.ParseValue(ref reader);
+            return DeserializeInternalAzureChatDataSourceAccessTokenAuthenticationOptions(document.RootElement, options);
+        }
+
+        internal static InternalAzureChatDataSourceAccessTokenAuthenticationOptions DeserializeInternalAzureChatDataSourceAccessTokenAuthenticationOptions(JsonElement element, ModelReaderWriterOptions options = null)
+        {
+            options ??= ModelSerializationExtensions.WireOptions;
+
+            if (element.ValueKind == JsonValueKind.Null)
+            {
+                return null;
+            }
+            string accessToken = default;
+            string type = default;
+            IDictionary<string, BinaryData> serializedAdditionalRawData = default;
+            Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>();
+            foreach (var property in element.EnumerateObject())
+            {
+                if (property.NameEquals("access_token"u8))
+                {
+                    accessToken = property.Value.GetString();
+                    continue;
+                }
+                if (property.NameEquals("type"u8))
+                {
+                    type = property.Value.GetString();
+                    continue;
+                }
+                if (options.Format != "W")
+                {
+                    rawDataDictionary ??= new Dictionary<string, BinaryData>();
+                    rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText()));
+                }
+            }
+            serializedAdditionalRawData = rawDataDictionary;
+            return new InternalAzureChatDataSourceAccessTokenAuthenticationOptions(type, serializedAdditionalRawData, accessToken);
+        }
+
+        BinaryData IPersistableModel<InternalAzureChatDataSourceAccessTokenAuthenticationOptions>.Write(ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? 
((IPersistableModel<InternalAzureChatDataSourceAccessTokenAuthenticationOptions>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    return ModelReaderWriter.Write(this, options);
+                default:
+                    throw new FormatException($"The model {nameof(InternalAzureChatDataSourceAccessTokenAuthenticationOptions)} does not support writing '{options.Format}' format.");
+            }
+        }
+
+        InternalAzureChatDataSourceAccessTokenAuthenticationOptions IPersistableModel<InternalAzureChatDataSourceAccessTokenAuthenticationOptions>.Create(BinaryData data, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalAzureChatDataSourceAccessTokenAuthenticationOptions>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    {
+                        using JsonDocument document = JsonDocument.Parse(data);
+                        return DeserializeInternalAzureChatDataSourceAccessTokenAuthenticationOptions(document.RootElement, options);
+                    }
+                default:
+                    throw new FormatException($"The model {nameof(InternalAzureChatDataSourceAccessTokenAuthenticationOptions)} does not support reading '{options.Format}' format.");
+            }
+        }
+
+        string IPersistableModel<InternalAzureChatDataSourceAccessTokenAuthenticationOptions>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J";
+
+        /// <summary> Deserializes the model from a raw response. </summary>
+        /// <param name="response"> The result to deserialize the model from. </param>
+        internal static new InternalAzureChatDataSourceAccessTokenAuthenticationOptions FromResponse(PipelineResponse response)
+        {
+            using var document = JsonDocument.Parse(response.Content);
+            return DeserializeInternalAzureChatDataSourceAccessTokenAuthenticationOptions(document.RootElement);
+        }
+
+        /// <summary> Convert into a <see cref="BinaryContent"/>. </summary>
+        internal override BinaryContent ToBinaryContent()
+        {
+            return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions);
+        }
+    }
+}
+
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureChatDataSourceAccessTokenAuthenticationOptions.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureChatDataSourceAccessTokenAuthenticationOptions.cs
new file mode 100644
index 000000000..2fbc3dc9f
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureChatDataSourceAccessTokenAuthenticationOptions.cs
@@ -0,0 +1,42 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.Collections.Generic;
+
+namespace Azure.AI.OpenAI.Chat
+{
+    /// <summary> The AzureChatDataSourceAccessTokenAuthenticationOptions. </summary>
+    internal partial class InternalAzureChatDataSourceAccessTokenAuthenticationOptions : DataSourceAuthentication
+    {
+        /// <summary> Initializes a new instance of <see cref="InternalAzureChatDataSourceAccessTokenAuthenticationOptions"/>. </summary>
+        /// <param name="accessToken"></param>
+        /// <exception cref="ArgumentNullException"> <paramref name="accessToken"/> is null. </exception>
+        internal InternalAzureChatDataSourceAccessTokenAuthenticationOptions(string accessToken)
+        {
+            Argument.AssertNotNull(accessToken, nameof(accessToken));
+
+            Type = "access_token";
+            AccessToken = accessToken;
+        }
+
+        /// <summary> Initializes a new instance of <see cref="InternalAzureChatDataSourceAccessTokenAuthenticationOptions"/>. </summary>
+        /// <param name="type"> Discriminator. </param>
+        /// <param name="serializedAdditionalRawData"> Keeps track of any properties unknown to the library. </param>
+        /// <param name="accessToken"></param>
+        internal InternalAzureChatDataSourceAccessTokenAuthenticationOptions(string type, IDictionary<string, BinaryData> serializedAdditionalRawData, string accessToken) : base(type, serializedAdditionalRawData)
+        {
+            AccessToken = accessToken;
+        }
+
+        /// <summary> Initializes a new instance of <see cref="InternalAzureChatDataSourceAccessTokenAuthenticationOptions"/> for deserialization. </summary>
+        internal InternalAzureChatDataSourceAccessTokenAuthenticationOptions()
+        {
+        }
+
+        /// <summary> Gets the access token. </summary>
+        internal string AccessToken { get; set; }
+    }
+}
+
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureChatDataSourceApiKeyAuthenticationOptions.Serialization.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureChatDataSourceApiKeyAuthenticationOptions.Serialization.cs
new file mode 100644
index 000000000..4d448e27c
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureChatDataSourceApiKeyAuthenticationOptions.Serialization.cs
@@ -0,0 +1,148 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace Azure.AI.OpenAI.Chat
+{
+    internal partial class InternalAzureChatDataSourceApiKeyAuthenticationOptions : IJsonModel<InternalAzureChatDataSourceApiKeyAuthenticationOptions>
+    {
+        void IJsonModel<InternalAzureChatDataSourceApiKeyAuthenticationOptions>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalAzureChatDataSourceApiKeyAuthenticationOptions>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(InternalAzureChatDataSourceApiKeyAuthenticationOptions)} does not support writing '{format}' format.");
+            }
+
+            writer.WriteStartObject();
+            if (SerializedAdditionalRawData?.ContainsKey("key") != true)
+            {
+                writer.WritePropertyName("key"u8);
+                writer.WriteStringValue(Key);
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("type") != true)
+            {
+                writer.WritePropertyName("type"u8);
+                writer.WriteStringValue(Type);
+            }
+            if (SerializedAdditionalRawData != null)
+            {
+                foreach (var item in SerializedAdditionalRawData)
+                {
+                    if (ModelSerializationExtensions.IsSentinelValue(item.Value))
+                    {
+                        continue;
+                    }
+                    writer.WritePropertyName(item.Key);
+#if NET6_0_OR_GREATER
+                    writer.WriteRawValue(item.Value);
+#else
+                    using (JsonDocument document = JsonDocument.Parse(item.Value))
+                    {
+                        JsonSerializer.Serialize(writer, document.RootElement);
+                    }
+#endif
+                }
+            }
+            writer.WriteEndObject();
+        }
+
+        InternalAzureChatDataSourceApiKeyAuthenticationOptions IJsonModel<InternalAzureChatDataSourceApiKeyAuthenticationOptions>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalAzureChatDataSourceApiKeyAuthenticationOptions>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(InternalAzureChatDataSourceApiKeyAuthenticationOptions)} does not support reading '{format}' format.");
+            }
+
+            using JsonDocument document = JsonDocument.ParseValue(ref reader);
+            return DeserializeInternalAzureChatDataSourceApiKeyAuthenticationOptions(document.RootElement, options);
+        }
+
+        internal static InternalAzureChatDataSourceApiKeyAuthenticationOptions DeserializeInternalAzureChatDataSourceApiKeyAuthenticationOptions(JsonElement element, ModelReaderWriterOptions options = null)
+        {
+            options ??= ModelSerializationExtensions.WireOptions;
+
+            if (element.ValueKind == JsonValueKind.Null)
+            {
+                return null;
+            }
+            string key = default;
+            string type = default;
+            IDictionary<string, BinaryData> serializedAdditionalRawData = default;
+            Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>();
+            foreach (var property in element.EnumerateObject())
+            {
+                if (property.NameEquals("key"u8))
+                {
+                    key = property.Value.GetString();
+                    continue;
+                }
+                if (property.NameEquals("type"u8))
+                {
+                    type = property.Value.GetString();
+                    continue;
+                }
+                if (options.Format != "W")
+                {
+                    rawDataDictionary ??= new Dictionary<string, BinaryData>();
+                    rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText()));
+                }
+            }
+            serializedAdditionalRawData = rawDataDictionary;
+            return new InternalAzureChatDataSourceApiKeyAuthenticationOptions(type, serializedAdditionalRawData, key);
+        }
+
+        BinaryData IPersistableModel<InternalAzureChatDataSourceApiKeyAuthenticationOptions>.Write(ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalAzureChatDataSourceApiKeyAuthenticationOptions>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    return ModelReaderWriter.Write(this, options);
+                default:
+                    throw new FormatException($"The model {nameof(InternalAzureChatDataSourceApiKeyAuthenticationOptions)} does not support writing '{options.Format}' format.");
+            }
+        }
+
+        InternalAzureChatDataSourceApiKeyAuthenticationOptions IPersistableModel<InternalAzureChatDataSourceApiKeyAuthenticationOptions>.Create(BinaryData data, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalAzureChatDataSourceApiKeyAuthenticationOptions>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    {
+                        using JsonDocument document = JsonDocument.Parse(data);
+                        return DeserializeInternalAzureChatDataSourceApiKeyAuthenticationOptions(document.RootElement, options);
+                    }
+                default:
+                    throw new FormatException($"The model {nameof(InternalAzureChatDataSourceApiKeyAuthenticationOptions)} does not support reading '{options.Format}' format.");
+            }
+        }
+
+        string IPersistableModel<InternalAzureChatDataSourceApiKeyAuthenticationOptions>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J";
+
+        /// <summary> Deserializes the model from a raw response. </summary>
+        /// <param name="response"> The result to deserialize the model from. </param>
+        internal static new InternalAzureChatDataSourceApiKeyAuthenticationOptions FromResponse(PipelineResponse response)
+        {
+            using var document = JsonDocument.Parse(response.Content);
+            return DeserializeInternalAzureChatDataSourceApiKeyAuthenticationOptions(document.RootElement);
+        }
+
+        /// <summary> Convert into a <see cref="BinaryContent"/>. </summary>
+        internal override BinaryContent ToBinaryContent()
+        {
+            return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions);
+        }
+    }
+}
+
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureChatDataSourceApiKeyAuthenticationOptions.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureChatDataSourceApiKeyAuthenticationOptions.cs
new file mode 100644
index 000000000..91325ee4c
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureChatDataSourceApiKeyAuthenticationOptions.cs
@@ -0,0 +1,42 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.Collections.Generic;
+
+namespace Azure.AI.OpenAI.Chat
+{
+    /// <summary> The AzureChatDataSourceApiKeyAuthenticationOptions. </summary>
+    internal partial class InternalAzureChatDataSourceApiKeyAuthenticationOptions : DataSourceAuthentication
+    {
+        /// <summary> Initializes a new instance of <see cref="InternalAzureChatDataSourceApiKeyAuthenticationOptions"/>. </summary>
+        /// <param name="key"></param>
+        /// <exception cref="ArgumentNullException"> <paramref name="key"/> is null. </exception>
+        internal InternalAzureChatDataSourceApiKeyAuthenticationOptions(string key)
+        {
+            Argument.AssertNotNull(key, nameof(key));
+
+            Type = "api_key";
+            Key = key;
+        }
+
+        /// <summary> Initializes a new instance of <see cref="InternalAzureChatDataSourceApiKeyAuthenticationOptions"/>. </summary>
+        /// <param name="type"> Discriminator. </param>
+        /// <param name="serializedAdditionalRawData"> Keeps track of any properties unknown to the library. </param>
+        /// <param name="key"></param>
+        internal InternalAzureChatDataSourceApiKeyAuthenticationOptions(string type, IDictionary<string, BinaryData> serializedAdditionalRawData, string key) : base(type, serializedAdditionalRawData)
+        {
+            Key = key;
+        }
+
+        /// <summary> Initializes a new instance of <see cref="InternalAzureChatDataSourceApiKeyAuthenticationOptions"/> for deserialization. </summary>
+        internal InternalAzureChatDataSourceApiKeyAuthenticationOptions()
+        {
+        }
+
+        /// <summary> Gets the key. </summary>
+ internal string Key { get; set; } + } +} + diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureChatDataSourceConnectionStringAuthenticationOptions.Serialization.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureChatDataSourceConnectionStringAuthenticationOptions.Serialization.cs new file mode 100644 index 000000000..87563d7c1 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureChatDataSourceConnectionStringAuthenticationOptions.Serialization.cs @@ -0,0 +1,148 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace Azure.AI.OpenAI.Chat +{ + internal partial class InternalAzureChatDataSourceConnectionStringAuthenticationOptions : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalAzureChatDataSourceConnectionStringAuthenticationOptions)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("connection_string") != true) + { + writer.WritePropertyName("connection_string"u8); + writer.WriteStringValue(ConnectionString); + } + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + 
JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalAzureChatDataSourceConnectionStringAuthenticationOptions IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalAzureChatDataSourceConnectionStringAuthenticationOptions)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalAzureChatDataSourceConnectionStringAuthenticationOptions(document.RootElement, options); + } + + internal static InternalAzureChatDataSourceConnectionStringAuthenticationOptions DeserializeInternalAzureChatDataSourceConnectionStringAuthenticationOptions(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string connectionString = default; + string type = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("connection_string"u8)) + { + connectionString = property.Value.GetString(); + continue; + } + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (options.Format != "W") + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalAzureChatDataSourceConnectionStringAuthenticationOptions(type, serializedAdditionalRawData, connectionString); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format 
= options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalAzureChatDataSourceConnectionStringAuthenticationOptions)} does not support writing '{options.Format}' format."); + } + } + + InternalAzureChatDataSourceConnectionStringAuthenticationOptions IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalAzureChatDataSourceConnectionStringAuthenticationOptions(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalAzureChatDataSourceConnectionStringAuthenticationOptions)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + /// Deserializes the model from a raw response. + /// The result to deserialize the model from. + internal static new InternalAzureChatDataSourceConnectionStringAuthenticationOptions FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalAzureChatDataSourceConnectionStringAuthenticationOptions(document.RootElement); + } + + /// Convert into a . 
+        internal override BinaryContent ToBinaryContent()
+        {
+            return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions);
+        }
+    }
+}
+
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureChatDataSourceConnectionStringAuthenticationOptions.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureChatDataSourceConnectionStringAuthenticationOptions.cs
new file mode 100644
index 000000000..aec715710
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureChatDataSourceConnectionStringAuthenticationOptions.cs
@@ -0,0 +1,42 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.Collections.Generic;
+
+namespace Azure.AI.OpenAI.Chat
+{
+    /// <summary> The AzureChatDataSourceConnectionStringAuthenticationOptions. </summary>
+    internal partial class InternalAzureChatDataSourceConnectionStringAuthenticationOptions : DataSourceAuthentication
+    {
+        /// <summary> Initializes a new instance of <see cref="InternalAzureChatDataSourceConnectionStringAuthenticationOptions"/>. </summary>
+        /// <param name="connectionString"></param>
+        /// <exception cref="ArgumentNullException"> <paramref name="connectionString"/> is null. </exception>
+        internal InternalAzureChatDataSourceConnectionStringAuthenticationOptions(string connectionString)
+        {
+            Argument.AssertNotNull(connectionString, nameof(connectionString));
+
+            Type = "connection_string";
+            ConnectionString = connectionString;
+        }
+
+        /// <summary> Initializes a new instance of <see cref="InternalAzureChatDataSourceConnectionStringAuthenticationOptions"/>. </summary>
+        /// <param name="type"> Discriminator. </param>
+        /// <param name="serializedAdditionalRawData"> Keeps track of any properties unknown to the library. </param>
+        /// <param name="connectionString"></param>
+        internal InternalAzureChatDataSourceConnectionStringAuthenticationOptions(string type, IDictionary<string, BinaryData> serializedAdditionalRawData, string connectionString) : base(type, serializedAdditionalRawData)
+        {
+            ConnectionString = connectionString;
+        }
+
+        /// <summary> Initializes a new instance of <see cref="InternalAzureChatDataSourceConnectionStringAuthenticationOptions"/> for deserialization. </summary>
+        internal InternalAzureChatDataSourceConnectionStringAuthenticationOptions()
+        {
+        }
+
+        /// <summary> Gets the connection string. </summary>
+ internal string ConnectionString { get; set; } + } +} + diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureChatDataSourceDeploymentNameVectorizationSource.Serialization.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureChatDataSourceDeploymentNameVectorizationSource.Serialization.cs new file mode 100644 index 000000000..a322da00a --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureChatDataSourceDeploymentNameVectorizationSource.Serialization.cs @@ -0,0 +1,163 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace Azure.AI.OpenAI.Chat +{ + internal partial class InternalAzureChatDataSourceDeploymentNameVectorizationSource : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalAzureChatDataSourceDeploymentNameVectorizationSource)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("deployment_name") != true) + { + writer.WritePropertyName("deployment_name"u8); + writer.WriteStringValue(DeploymentName); + } + if (SerializedAdditionalRawData?.ContainsKey("dimensions") != true && Optional.IsDefined(Dimensions)) + { + writer.WritePropertyName("dimensions"u8); + writer.WriteNumberValue(Dimensions.Value); + } + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + 
writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalAzureChatDataSourceDeploymentNameVectorizationSource IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalAzureChatDataSourceDeploymentNameVectorizationSource)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalAzureChatDataSourceDeploymentNameVectorizationSource(document.RootElement, options); + } + + internal static InternalAzureChatDataSourceDeploymentNameVectorizationSource DeserializeInternalAzureChatDataSourceDeploymentNameVectorizationSource(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string deploymentName = default; + int? 
dimensions = default; + string type = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("deployment_name"u8)) + { + deploymentName = property.Value.GetString(); + continue; + } + if (property.NameEquals("dimensions"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + dimensions = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (options.Format != "W") + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalAzureChatDataSourceDeploymentNameVectorizationSource(type, serializedAdditionalRawData, deploymentName, dimensions); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalAzureChatDataSourceDeploymentNameVectorizationSource)} does not support writing '{options.Format}' format."); + } + } + + InternalAzureChatDataSourceDeploymentNameVectorizationSource IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalAzureChatDataSourceDeploymentNameVectorizationSource(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalAzureChatDataSourceDeploymentNameVectorizationSource)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + /// Deserializes the model from a raw response. + /// The result to deserialize the model from. + internal static new InternalAzureChatDataSourceDeploymentNameVectorizationSource FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalAzureChatDataSourceDeploymentNameVectorizationSource(document.RootElement); + } + + /// Convert into a . + internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} + diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureChatDataSourceDeploymentNameVectorizationSource.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureChatDataSourceDeploymentNameVectorizationSource.cs new file mode 100644 index 000000000..35bec8189 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureChatDataSourceDeploymentNameVectorizationSource.cs @@ -0,0 +1,65 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace Azure.AI.OpenAI.Chat +{ + /// + /// Represents a vectorization source that makes internal service calls against an Azure OpenAI embedding model + /// deployment. 
In contrast with the endpoint-based vectorization source, a deployment-name-based vectorization source
+    /// must be part of the same Azure OpenAI resource but can be used even in private networks.
+    /// </summary>
+    internal partial class InternalAzureChatDataSourceDeploymentNameVectorizationSource : DataSourceVectorizer
+    {
+        /// <summary> Initializes a new instance of <see cref="InternalAzureChatDataSourceDeploymentNameVectorizationSource"/>. </summary>
+        /// <param name="deploymentName">
+        /// The embedding model deployment to use for vectorization. This deployment must exist within the same Azure OpenAI
+        /// resource as the model deployment being used for chat completions.
+        /// </param>
+        /// <exception cref="ArgumentNullException"> <paramref name="deploymentName"/> is null. </exception>
+        internal InternalAzureChatDataSourceDeploymentNameVectorizationSource(string deploymentName)
+        {
+            Argument.AssertNotNull(deploymentName, nameof(deploymentName));
+
+            Type = "deployment_name";
+            DeploymentName = deploymentName;
+        }
+
+        /// <summary> Initializes a new instance of <see cref="InternalAzureChatDataSourceDeploymentNameVectorizationSource"/>. </summary>
+        /// <param name="type"> The differentiating identifier for the concrete vectorization source. </param>
+        /// <param name="serializedAdditionalRawData"> Keeps track of any properties unknown to the library. </param>
+        /// <param name="deploymentName">
+        /// The embedding model deployment to use for vectorization. This deployment must exist within the same Azure OpenAI
+        /// resource as the model deployment being used for chat completions.
+        /// </param>
+        /// <param name="dimensions">
+        /// The number of dimensions to request on embeddings.
+        /// Only supported in 'text-embedding-3' and later models.
+        /// </param>
+        internal InternalAzureChatDataSourceDeploymentNameVectorizationSource(string type, IDictionary<string, BinaryData> serializedAdditionalRawData, string deploymentName, int? dimensions) : base(type, serializedAdditionalRawData)
+        {
+            DeploymentName = deploymentName;
+            Dimensions = dimensions;
+        }
+
+        /// <summary> Initializes a new instance of <see cref="InternalAzureChatDataSourceDeploymentNameVectorizationSource"/> for deserialization. </summary>
+        internal InternalAzureChatDataSourceDeploymentNameVectorizationSource()
+        {
+        }
+
+        /// <summary>
+        /// The embedding model deployment to use for vectorization. This deployment must exist within the same Azure OpenAI
+        /// resource as the model deployment being used for chat completions.
+        /// </summary>
+        internal string DeploymentName { get; set; }
+        /// <summary>
+        /// The number of dimensions to request on embeddings.
+        /// Only supported in 'text-embedding-3' and later models.
+        /// </summary>
+        internal int? Dimensions { get; set; }
+    }
+}
+
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureChatDataSourceEncodedApiKeyAuthenticationOptions.Serialization.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureChatDataSourceEncodedApiKeyAuthenticationOptions.Serialization.cs
new file mode 100644
index 000000000..b24dfb4f4
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureChatDataSourceEncodedApiKeyAuthenticationOptions.Serialization.cs
@@ -0,0 +1,148 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace Azure.AI.OpenAI.Chat
+{
+    internal partial class InternalAzureChatDataSourceEncodedApiKeyAuthenticationOptions : IJsonModel<InternalAzureChatDataSourceEncodedApiKeyAuthenticationOptions>
+    {
+        void IJsonModel<InternalAzureChatDataSourceEncodedApiKeyAuthenticationOptions>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ?
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalAzureChatDataSourceEncodedApiKeyAuthenticationOptions)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("encoded_api_key") != true) + { + writer.WritePropertyName("encoded_api_key"u8); + writer.WriteStringValue(EncodedApiKey); + } + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalAzureChatDataSourceEncodedApiKeyAuthenticationOptions IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalAzureChatDataSourceEncodedApiKeyAuthenticationOptions)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalAzureChatDataSourceEncodedApiKeyAuthenticationOptions(document.RootElement, options); + } + + internal static InternalAzureChatDataSourceEncodedApiKeyAuthenticationOptions DeserializeInternalAzureChatDataSourceEncodedApiKeyAuthenticationOptions(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string encodedApiKey = default; + string type = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("encoded_api_key"u8)) + { + encodedApiKey = property.Value.GetString(); + continue; + } + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (options.Format != "W") + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalAzureChatDataSourceEncodedApiKeyAuthenticationOptions(type, serializedAdditionalRawData, encodedApiKey); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalAzureChatDataSourceEncodedApiKeyAuthenticationOptions)} does not support writing '{options.Format}' format."); + } + } + + InternalAzureChatDataSourceEncodedApiKeyAuthenticationOptions IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalAzureChatDataSourceEncodedApiKeyAuthenticationOptions(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalAzureChatDataSourceEncodedApiKeyAuthenticationOptions)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + /// Deserializes the model from a raw response. + /// The result to deserialize the model from. + internal static new InternalAzureChatDataSourceEncodedApiKeyAuthenticationOptions FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalAzureChatDataSourceEncodedApiKeyAuthenticationOptions(document.RootElement); + } + + /// Convert into a . 
+        internal override BinaryContent ToBinaryContent()
+        {
+            return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions);
+        }
+    }
+}
+
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureChatDataSourceEncodedApiKeyAuthenticationOptions.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureChatDataSourceEncodedApiKeyAuthenticationOptions.cs
new file mode 100644
index 000000000..bb4089d7c
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureChatDataSourceEncodedApiKeyAuthenticationOptions.cs
@@ -0,0 +1,42 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.Collections.Generic;
+
+namespace Azure.AI.OpenAI.Chat
+{
+    /// <summary> The AzureChatDataSourceEncodedApiKeyAuthenticationOptions. </summary>
+    internal partial class InternalAzureChatDataSourceEncodedApiKeyAuthenticationOptions : DataSourceAuthentication
+    {
+        /// <summary> Initializes a new instance of <see cref="InternalAzureChatDataSourceEncodedApiKeyAuthenticationOptions"/>. </summary>
+        /// <param name="encodedApiKey"></param>
+        /// <exception cref="ArgumentNullException"> <paramref name="encodedApiKey"/> is null. </exception>
+        internal InternalAzureChatDataSourceEncodedApiKeyAuthenticationOptions(string encodedApiKey)
+        {
+            Argument.AssertNotNull(encodedApiKey, nameof(encodedApiKey));
+
+            Type = "encoded_api_key";
+            EncodedApiKey = encodedApiKey;
+        }
+
+        /// <summary> Initializes a new instance of <see cref="InternalAzureChatDataSourceEncodedApiKeyAuthenticationOptions"/>. </summary>
+        /// <param name="type"> Discriminator. </param>
+        /// <param name="serializedAdditionalRawData"> Keeps track of any properties unknown to the library. </param>
+        /// <param name="encodedApiKey"></param>
+        internal InternalAzureChatDataSourceEncodedApiKeyAuthenticationOptions(string type, IDictionary<string, BinaryData> serializedAdditionalRawData, string encodedApiKey) : base(type, serializedAdditionalRawData)
+        {
+            EncodedApiKey = encodedApiKey;
+        }
+
+        /// <summary> Initializes a new instance of <see cref="InternalAzureChatDataSourceEncodedApiKeyAuthenticationOptions"/> for deserialization. </summary>
+        internal InternalAzureChatDataSourceEncodedApiKeyAuthenticationOptions()
+        {
+        }
+
+        /// <summary> Gets the encoded api key. </summary>
+ internal string EncodedApiKey { get; set; } + } +} + diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureChatDataSourceEndpointVectorizationSource.Serialization.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureChatDataSourceEndpointVectorizationSource.Serialization.cs new file mode 100644 index 000000000..e984e66bc --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureChatDataSourceEndpointVectorizationSource.Serialization.cs @@ -0,0 +1,174 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace Azure.AI.OpenAI.Chat +{ + internal partial class InternalAzureChatDataSourceEndpointVectorizationSource : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalAzureChatDataSourceEndpointVectorizationSource)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("endpoint") != true) + { + writer.WritePropertyName("endpoint"u8); + writer.WriteStringValue(Endpoint.AbsoluteUri); + } + if (SerializedAdditionalRawData?.ContainsKey("authentication") != true) + { + writer.WritePropertyName("authentication"u8); + writer.WriteObjectValue(Authentication, options); + } + if (SerializedAdditionalRawData?.ContainsKey("dimensions") != true && Optional.IsDefined(Dimensions)) + { + writer.WritePropertyName("dimensions"u8); + writer.WriteNumberValue(Dimensions.Value); + } + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach 
(var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalAzureChatDataSourceEndpointVectorizationSource IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalAzureChatDataSourceEndpointVectorizationSource)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalAzureChatDataSourceEndpointVectorizationSource(document.RootElement, options); + } + + internal static InternalAzureChatDataSourceEndpointVectorizationSource DeserializeInternalAzureChatDataSourceEndpointVectorizationSource(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + Uri endpoint = default; + DataSourceAuthentication authentication = default; + int? 
dimensions = default; + string type = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("endpoint"u8)) + { + endpoint = new Uri(property.Value.GetString()); + continue; + } + if (property.NameEquals("authentication"u8)) + { + authentication = DataSourceAuthentication.DeserializeDataSourceAuthentication(property.Value, options); + continue; + } + if (property.NameEquals("dimensions"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + dimensions = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (options.Format != "W") + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalAzureChatDataSourceEndpointVectorizationSource(type, serializedAdditionalRawData, endpoint, authentication, dimensions); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalAzureChatDataSourceEndpointVectorizationSource)} does not support writing '{options.Format}' format."); + } + } + + InternalAzureChatDataSourceEndpointVectorizationSource IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalAzureChatDataSourceEndpointVectorizationSource(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalAzureChatDataSourceEndpointVectorizationSource)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + /// Deserializes the model from a raw response. + /// The result to deserialize the model from. + internal static new InternalAzureChatDataSourceEndpointVectorizationSource FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalAzureChatDataSourceEndpointVectorizationSource(document.RootElement); + } + + /// Convert into a . + internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} + diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureChatDataSourceEndpointVectorizationSource.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureChatDataSourceEndpointVectorizationSource.cs new file mode 100644 index 000000000..374ef45d2 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureChatDataSourceEndpointVectorizationSource.cs @@ -0,0 +1,80 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace Azure.AI.OpenAI.Chat +{ + /// Represents a vectorization source that makes public service calls against an Azure OpenAI embedding model deployment. + internal partial class InternalAzureChatDataSourceEndpointVectorizationSource : DataSourceVectorizer + { + /// Initializes a new instance of . 
+        /// <param name="endpoint">
+        /// Specifies the resource endpoint URL from which embeddings should be retrieved.
+        /// It should be in the format of:
+        /// https://YOUR_RESOURCE_NAME.openai.azure.com/openai/deployments/YOUR_DEPLOYMENT_NAME/embeddings.
+        /// The api-version query parameter is not allowed.
+        /// </param>
+        /// <param name="authentication">
+        /// The authentication mechanism to use with the endpoint-based vectorization source.
+        /// Endpoint authentication supports API key and access token mechanisms.
+        /// Please note <see cref="DataSourceAuthentication"/> is the base class. According to the scenario, a derived class of the base class might need to be assigned here, or this property needs to be casted to one of the possible derived classes.
+        /// </param>
+        /// <exception cref="ArgumentNullException"> <paramref name="endpoint"/> or <paramref name="authentication"/> is null. </exception>
+        internal InternalAzureChatDataSourceEndpointVectorizationSource(Uri endpoint, DataSourceAuthentication authentication)
+        {
+            Argument.AssertNotNull(endpoint, nameof(endpoint));
+            Argument.AssertNotNull(authentication, nameof(authentication));
+
+            Type = "endpoint";
+            Endpoint = endpoint;
+            Authentication = authentication;
+        }
+
+        /// <summary> Initializes a new instance of <see cref="InternalAzureChatDataSourceEndpointVectorizationSource"/>. </summary>
+        /// <param name="type"> The differentiating identifier for the concrete vectorization source. </param>
+        /// <param name="serializedAdditionalRawData"> Keeps track of any properties unknown to the library. </param>
+        /// <param name="endpoint">
+        /// Specifies the resource endpoint URL from which embeddings should be retrieved.
+        /// It should be in the format of:
+        /// https://YOUR_RESOURCE_NAME.openai.azure.com/openai/deployments/YOUR_DEPLOYMENT_NAME/embeddings.
+        /// The api-version query parameter is not allowed.
+        /// </param>
+        /// <param name="authentication">
+        /// The authentication mechanism to use with the endpoint-based vectorization source.
+        /// Endpoint authentication supports API key and access token mechanisms.
+        /// Please note <see cref="DataSourceAuthentication"/> is the base class. According to the scenario, a derived class of the base class might need to be assigned here, or this property needs to be casted to one of the possible derived classes.
+        /// </param>
+        /// <param name="dimensions">
+        /// The number of dimensions to request on embeddings.
+        /// Only supported in 'text-embedding-3' and later models.
+        /// </param>
+        internal InternalAzureChatDataSourceEndpointVectorizationSource(string type, IDictionary<string, BinaryData> serializedAdditionalRawData, Uri endpoint, DataSourceAuthentication authentication, int? dimensions) : base(type, serializedAdditionalRawData)
+        {
+            Endpoint = endpoint;
+            Authentication = authentication;
+            Dimensions = dimensions;
+        }
+
+        /// <summary> Initializes a new instance of <see cref="InternalAzureChatDataSourceEndpointVectorizationSource"/> for deserialization. </summary>
+        internal InternalAzureChatDataSourceEndpointVectorizationSource()
+        {
+        }
+
+        /// <summary>
+        /// Specifies the resource endpoint URL from which embeddings should be retrieved.
+        /// It should be in the format of:
+        /// https://YOUR_RESOURCE_NAME.openai.azure.com/openai/deployments/YOUR_DEPLOYMENT_NAME/embeddings.
+        /// The api-version query parameter is not allowed.
+        /// </summary>
+        internal Uri Endpoint { get; set; }
+        /// <summary>
+        /// The number of dimensions to request on embeddings.
+        /// Only supported in 'text-embedding-3' and later models.
+        /// </summary>
+        internal int? Dimensions { get; set; }
+    }
+}
+
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureChatDataSourceKeyAndKeyIdAuthenticationOptions.Serialization.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureChatDataSourceKeyAndKeyIdAuthenticationOptions.Serialization.cs
new file mode 100644
index 000000000..3d1b11f30
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureChatDataSourceKeyAndKeyIdAuthenticationOptions.Serialization.cs
@@ -0,0 +1,159 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace Azure.AI.OpenAI.Chat
+{
+    internal partial class InternalAzureChatDataSourceKeyAndKeyIdAuthenticationOptions : IJsonModel<InternalAzureChatDataSourceKeyAndKeyIdAuthenticationOptions>
+    {
+        void IJsonModel<InternalAzureChatDataSourceKeyAndKeyIdAuthenticationOptions>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ?
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalAzureChatDataSourceKeyAndKeyIdAuthenticationOptions)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("key") != true) + { + writer.WritePropertyName("key"u8); + writer.WriteStringValue(Key); + } + if (SerializedAdditionalRawData?.ContainsKey("key_id") != true) + { + writer.WritePropertyName("key_id"u8); + writer.WriteStringValue(KeyId); + } + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalAzureChatDataSourceKeyAndKeyIdAuthenticationOptions IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalAzureChatDataSourceKeyAndKeyIdAuthenticationOptions)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalAzureChatDataSourceKeyAndKeyIdAuthenticationOptions(document.RootElement, options); + } + + internal static InternalAzureChatDataSourceKeyAndKeyIdAuthenticationOptions DeserializeInternalAzureChatDataSourceKeyAndKeyIdAuthenticationOptions(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string key = default; + string keyId = default; + string type = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("key"u8)) + { + key = property.Value.GetString(); + continue; + } + if (property.NameEquals("key_id"u8)) + { + keyId = property.Value.GetString(); + continue; + } + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (options.Format != "W") + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalAzureChatDataSourceKeyAndKeyIdAuthenticationOptions(type, serializedAdditionalRawData, key, keyId); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalAzureChatDataSourceKeyAndKeyIdAuthenticationOptions)} does not support writing '{options.Format}' format."); + } + } + + InternalAzureChatDataSourceKeyAndKeyIdAuthenticationOptions IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalAzureChatDataSourceKeyAndKeyIdAuthenticationOptions(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalAzureChatDataSourceKeyAndKeyIdAuthenticationOptions)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + /// Deserializes the model from a raw response. + /// The result to deserialize the model from. + internal static new InternalAzureChatDataSourceKeyAndKeyIdAuthenticationOptions FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalAzureChatDataSourceKeyAndKeyIdAuthenticationOptions(document.RootElement); + } + + /// Convert into a . 
+ internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} + diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureChatDataSourceKeyAndKeyIdAuthenticationOptions.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureChatDataSourceKeyAndKeyIdAuthenticationOptions.cs new file mode 100644 index 000000000..9f6c3b347 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureChatDataSourceKeyAndKeyIdAuthenticationOptions.cs @@ -0,0 +1,49 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace Azure.AI.OpenAI.Chat +{ + /// The AzureChatDataSourceKeyAndKeyIdAuthenticationOptions. + internal partial class InternalAzureChatDataSourceKeyAndKeyIdAuthenticationOptions : DataSourceAuthentication + { + /// Initializes a new instance of . + /// + /// + /// or is null. + internal InternalAzureChatDataSourceKeyAndKeyIdAuthenticationOptions(string key, string keyId) + { + Argument.AssertNotNull(key, nameof(key)); + Argument.AssertNotNull(keyId, nameof(keyId)); + + Type = "key_and_key_id"; + Key = key; + KeyId = keyId; + } + + /// Initializes a new instance of . + /// Discriminator. + /// Keeps track of any properties unknown to the library. + /// + /// + internal InternalAzureChatDataSourceKeyAndKeyIdAuthenticationOptions(string type, IDictionary serializedAdditionalRawData, string key, string keyId) : base(type, serializedAdditionalRawData) + { + Key = key; + KeyId = keyId; + } + + /// Initializes a new instance of for deserialization. + internal InternalAzureChatDataSourceKeyAndKeyIdAuthenticationOptions() + { + } + + /// Gets the key. + internal string Key { get; set; } + /// Gets the key id. 
+ internal string KeyId { get; set; } + } +} + diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureChatDataSourceModelIdVectorizationSource.Serialization.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureChatDataSourceModelIdVectorizationSource.Serialization.cs new file mode 100644 index 000000000..cbbae8276 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureChatDataSourceModelIdVectorizationSource.Serialization.cs @@ -0,0 +1,148 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace Azure.AI.OpenAI.Chat +{ + internal partial class InternalAzureChatDataSourceModelIdVectorizationSource : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalAzureChatDataSourceModelIdVectorizationSource)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("model_id") != true) + { + writer.WritePropertyName("model_id"u8); + writer.WriteStringValue(ModelId); + } + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + 
} + + InternalAzureChatDataSourceModelIdVectorizationSource IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalAzureChatDataSourceModelIdVectorizationSource)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalAzureChatDataSourceModelIdVectorizationSource(document.RootElement, options); + } + + internal static InternalAzureChatDataSourceModelIdVectorizationSource DeserializeInternalAzureChatDataSourceModelIdVectorizationSource(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string modelId = default; + string type = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("model_id"u8)) + { + modelId = property.Value.GetString(); + continue; + } + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (options.Format != "W") + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalAzureChatDataSourceModelIdVectorizationSource(type, serializedAdditionalRawData, modelId); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalAzureChatDataSourceModelIdVectorizationSource)} does not support writing '{options.Format}' format."); + } + } + + InternalAzureChatDataSourceModelIdVectorizationSource IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalAzureChatDataSourceModelIdVectorizationSource(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalAzureChatDataSourceModelIdVectorizationSource)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + /// Deserializes the model from a raw response. + /// The result to deserialize the model from. + internal static new InternalAzureChatDataSourceModelIdVectorizationSource FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalAzureChatDataSourceModelIdVectorizationSource(document.RootElement); + } + + /// Convert into a . 
+ internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} + diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureChatDataSourceModelIdVectorizationSource.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureChatDataSourceModelIdVectorizationSource.cs new file mode 100644 index 000000000..b4bf391b5 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureChatDataSourceModelIdVectorizationSource.cs @@ -0,0 +1,45 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace Azure.AI.OpenAI.Chat +{ + /// + /// Represents a vectorization source that makes service calls based on a search service model ID. + /// This source type is currently only supported by Elasticsearch. + /// + internal partial class InternalAzureChatDataSourceModelIdVectorizationSource : DataSourceVectorizer + { + /// Initializes a new instance of . + /// The embedding model build ID to use for vectorization. + /// is null. + internal InternalAzureChatDataSourceModelIdVectorizationSource(string modelId) + { + Argument.AssertNotNull(modelId, nameof(modelId)); + + Type = "model_id"; + ModelId = modelId; + } + + /// Initializes a new instance of . + /// The differentiating identifier for the concrete vectorization source. + /// Keeps track of any properties unknown to the library. + /// The embedding model build ID to use for vectorization. + internal InternalAzureChatDataSourceModelIdVectorizationSource(string type, IDictionary serializedAdditionalRawData, string modelId) : base(type, serializedAdditionalRawData) + { + ModelId = modelId; + } + + /// Initializes a new instance of for deserialization. + internal InternalAzureChatDataSourceModelIdVectorizationSource() + { + } + + /// The embedding model build ID to use for vectorization. 
+ internal string ModelId { get; set; } + } +} + diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureChatDataSourceSystemAssignedManagedIdentityAuthenticationOptions.Serialization.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureChatDataSourceSystemAssignedManagedIdentityAuthenticationOptions.Serialization.cs new file mode 100644 index 000000000..3d97453fc --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureChatDataSourceSystemAssignedManagedIdentityAuthenticationOptions.Serialization.cs @@ -0,0 +1,137 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace Azure.AI.OpenAI.Chat +{ + internal partial class InternalAzureChatDataSourceSystemAssignedManagedIdentityAuthenticationOptions : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalAzureChatDataSourceSystemAssignedManagedIdentityAuthenticationOptions)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalAzureChatDataSourceSystemAssignedManagedIdentityAuthenticationOptions IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalAzureChatDataSourceSystemAssignedManagedIdentityAuthenticationOptions)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalAzureChatDataSourceSystemAssignedManagedIdentityAuthenticationOptions(document.RootElement, options); + } + + internal static InternalAzureChatDataSourceSystemAssignedManagedIdentityAuthenticationOptions DeserializeInternalAzureChatDataSourceSystemAssignedManagedIdentityAuthenticationOptions(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string type = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (options.Format != "W") + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalAzureChatDataSourceSystemAssignedManagedIdentityAuthenticationOptions(type, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalAzureChatDataSourceSystemAssignedManagedIdentityAuthenticationOptions)} does not support writing '{options.Format}' format."); + } + } + + InternalAzureChatDataSourceSystemAssignedManagedIdentityAuthenticationOptions IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalAzureChatDataSourceSystemAssignedManagedIdentityAuthenticationOptions(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalAzureChatDataSourceSystemAssignedManagedIdentityAuthenticationOptions)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + /// Deserializes the model from a raw response. + /// The result to deserialize the model from. + internal static new InternalAzureChatDataSourceSystemAssignedManagedIdentityAuthenticationOptions FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalAzureChatDataSourceSystemAssignedManagedIdentityAuthenticationOptions(document.RootElement); + } + + /// Convert into a . 
+ internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} + diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureChatDataSourceSystemAssignedManagedIdentityAuthenticationOptions.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureChatDataSourceSystemAssignedManagedIdentityAuthenticationOptions.cs new file mode 100644 index 000000000..c08da9869 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureChatDataSourceSystemAssignedManagedIdentityAuthenticationOptions.cs @@ -0,0 +1,27 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace Azure.AI.OpenAI.Chat +{ + /// The AzureChatDataSourceSystemAssignedManagedIdentityAuthenticationOptions. + internal partial class InternalAzureChatDataSourceSystemAssignedManagedIdentityAuthenticationOptions : DataSourceAuthentication + { + /// Initializes a new instance of . + internal InternalAzureChatDataSourceSystemAssignedManagedIdentityAuthenticationOptions() + { + Type = "system_assigned_managed_identity"; + } + + /// Initializes a new instance of . + /// Discriminator. + /// Keeps track of any properties unknown to the library. 
+ internal InternalAzureChatDataSourceSystemAssignedManagedIdentityAuthenticationOptions(string type, IDictionary serializedAdditionalRawData) : base(type, serializedAdditionalRawData) + { + } + } +} + diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureChatDataSourceUserAssignedManagedIdentityAuthenticationOptions.Serialization.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureChatDataSourceUserAssignedManagedIdentityAuthenticationOptions.Serialization.cs new file mode 100644 index 000000000..92a62ed8c --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureChatDataSourceUserAssignedManagedIdentityAuthenticationOptions.Serialization.cs @@ -0,0 +1,148 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace Azure.AI.OpenAI.Chat +{ + internal partial class InternalAzureChatDataSourceUserAssignedManagedIdentityAuthenticationOptions : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalAzureChatDataSourceUserAssignedManagedIdentityAuthenticationOptions)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("managed_identity_resource_id") != true) + { + writer.WritePropertyName("managed_identity_resource_id"u8); + writer.WriteStringValue(ManagedIdentityResourceId); + } + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalAzureChatDataSourceUserAssignedManagedIdentityAuthenticationOptions IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalAzureChatDataSourceUserAssignedManagedIdentityAuthenticationOptions)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalAzureChatDataSourceUserAssignedManagedIdentityAuthenticationOptions(document.RootElement, options); + } + + internal static InternalAzureChatDataSourceUserAssignedManagedIdentityAuthenticationOptions DeserializeInternalAzureChatDataSourceUserAssignedManagedIdentityAuthenticationOptions(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string managedIdentityResourceId = default; + string type = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("managed_identity_resource_id"u8)) + { + managedIdentityResourceId = property.Value.GetString(); + continue; + } + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (options.Format != "W") + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalAzureChatDataSourceUserAssignedManagedIdentityAuthenticationOptions(type, serializedAdditionalRawData, managedIdentityResourceId); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalAzureChatDataSourceUserAssignedManagedIdentityAuthenticationOptions)} does not support writing '{options.Format}' format."); + } + } + + InternalAzureChatDataSourceUserAssignedManagedIdentityAuthenticationOptions IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalAzureChatDataSourceUserAssignedManagedIdentityAuthenticationOptions(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalAzureChatDataSourceUserAssignedManagedIdentityAuthenticationOptions)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + /// Deserializes the model from a raw response. + /// The result to deserialize the model from. + internal static new InternalAzureChatDataSourceUserAssignedManagedIdentityAuthenticationOptions FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalAzureChatDataSourceUserAssignedManagedIdentityAuthenticationOptions(document.RootElement); + } + + /// Convert into a . 
+ internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} + diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureChatDataSourceUserAssignedManagedIdentityAuthenticationOptions.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureChatDataSourceUserAssignedManagedIdentityAuthenticationOptions.cs new file mode 100644 index 000000000..20e6e6b2f --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureChatDataSourceUserAssignedManagedIdentityAuthenticationOptions.cs @@ -0,0 +1,42 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace Azure.AI.OpenAI.Chat +{ + /// The AzureChatDataSourceUserAssignedManagedIdentityAuthenticationOptions. + internal partial class InternalAzureChatDataSourceUserAssignedManagedIdentityAuthenticationOptions : DataSourceAuthentication + { + /// Initializes a new instance of . + /// + /// is null. + internal InternalAzureChatDataSourceUserAssignedManagedIdentityAuthenticationOptions(string managedIdentityResourceId) + { + Argument.AssertNotNull(managedIdentityResourceId, nameof(managedIdentityResourceId)); + + Type = "user_assigned_managed_identity"; + ManagedIdentityResourceId = managedIdentityResourceId; + } + + /// Initializes a new instance of . + /// Discriminator. + /// Keeps track of any properties unknown to the library. + /// + internal InternalAzureChatDataSourceUserAssignedManagedIdentityAuthenticationOptions(string type, IDictionary serializedAdditionalRawData, string managedIdentityResourceId) : base(type, serializedAdditionalRawData) + { + ManagedIdentityResourceId = managedIdentityResourceId; + } + + /// Initializes a new instance of for deserialization. + internal InternalAzureChatDataSourceUserAssignedManagedIdentityAuthenticationOptions() + { + } + + /// Gets the managed identity resource id. 
+ internal string ManagedIdentityResourceId { get; set; } + } +} + diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureContentFilterBlocklistIdResult.Serialization.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureContentFilterBlocklistIdResult.Serialization.cs new file mode 100644 index 000000000..e00fc19cc --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureContentFilterBlocklistIdResult.Serialization.cs @@ -0,0 +1,148 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace Azure.AI.OpenAI +{ + internal partial class InternalAzureContentFilterBlocklistIdResult : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalAzureContentFilterBlocklistIdResult)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("id") != true) + { + writer.WritePropertyName("id"u8); + writer.WriteStringValue(Id); + } + if (SerializedAdditionalRawData?.ContainsKey("filtered") != true) + { + writer.WritePropertyName("filtered"u8); + writer.WriteBooleanValue(Filtered); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + 
InternalAzureContentFilterBlocklistIdResult IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalAzureContentFilterBlocklistIdResult)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalAzureContentFilterBlocklistIdResult(document.RootElement, options); + } + + internal static InternalAzureContentFilterBlocklistIdResult DeserializeInternalAzureContentFilterBlocklistIdResult(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string id = default; + bool filtered = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("id"u8)) + { + id = property.Value.GetString(); + continue; + } + if (property.NameEquals("filtered"u8)) + { + filtered = property.Value.GetBoolean(); + continue; + } + if (options.Format != "W") + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalAzureContentFilterBlocklistIdResult(id, filtered, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalAzureContentFilterBlocklistIdResult)} does not support writing '{options.Format}' format."); + } + } + + InternalAzureContentFilterBlocklistIdResult IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalAzureContentFilterBlocklistIdResult(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalAzureContentFilterBlocklistIdResult)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + /// Deserializes the model from a raw response. + /// The result to deserialize the model from. + internal static InternalAzureContentFilterBlocklistIdResult FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalAzureContentFilterBlocklistIdResult(document.RootElement); + } + + /// Convert into a . 
+ internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} + diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureContentFilterBlocklistIdResult.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureContentFilterBlocklistIdResult.cs new file mode 100644 index 000000000..195018268 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureContentFilterBlocklistIdResult.cs @@ -0,0 +1,81 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace Azure.AI.OpenAI +{ + /// + /// A content filter result item that associates an existing custom blocklist ID with a value indicating whether or not + /// the corresponding blocklist resulted in content being filtered. + /// + internal partial class InternalAzureContentFilterBlocklistIdResult + { + /// + /// Keeps track of any properties unknown to the library. + /// + /// To assign an object to the value of this property use . + /// + /// + /// To assign an already formatted json string to this property use . + /// + /// + /// Examples: + /// + /// + /// BinaryData.FromObjectAsJson("foo") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromString("\"foo\"") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromObjectAsJson(new { key = "value" }) + /// Creates a payload of { "key": "value" }. + /// + /// + /// BinaryData.FromString("{\"key\": \"value\"}") + /// Creates a payload of { "key": "value" }. + /// + /// + /// + /// + internal IDictionary SerializedAdditionalRawData { get; set; } + /// Initializes a new instance of . + /// The ID of the custom blocklist associated with the filtered status. + /// Whether the associated blocklist resulted in the content being filtered. + /// is null. 
+ internal InternalAzureContentFilterBlocklistIdResult(string id, bool filtered) + { + Argument.AssertNotNull(id, nameof(id)); + + Id = id; + Filtered = filtered; + } + + /// Initializes a new instance of . + /// The ID of the custom blocklist associated with the filtered status. + /// Whether the associated blocklist resulted in the content being filtered. + /// Keeps track of any properties unknown to the library. + internal InternalAzureContentFilterBlocklistIdResult(string id, bool filtered, IDictionary serializedAdditionalRawData) + { + Id = id; + Filtered = filtered; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + /// Initializes a new instance of for deserialization. + internal InternalAzureContentFilterBlocklistIdResult() + { + } + + /// The ID of the custom blocklist associated with the filtered status. + internal string Id { get; set; } + /// Whether the associated blocklist resulted in the content being filtered. + internal bool Filtered { get; set; } + } +} + diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureContentFilterBlocklistResultDetail.Serialization.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureContentFilterBlocklistResultDetail.Serialization.cs new file mode 100644 index 000000000..02ac4d99c --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureContentFilterBlocklistResultDetail.Serialization.cs @@ -0,0 +1,148 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace Azure.AI.OpenAI +{ + internal partial class InternalAzureContentFilterBlocklistResultDetail : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalAzureContentFilterBlocklistResultDetail)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("filtered") != true) + { + writer.WritePropertyName("filtered"u8); + writer.WriteBooleanValue(Filtered); + } + if (SerializedAdditionalRawData?.ContainsKey("id") != true) + { + writer.WritePropertyName("id"u8); + writer.WriteStringValue(Id); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalAzureContentFilterBlocklistResultDetail IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalAzureContentFilterBlocklistResultDetail)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalAzureContentFilterBlocklistResultDetail(document.RootElement, options); + } + + internal static InternalAzureContentFilterBlocklistResultDetail DeserializeInternalAzureContentFilterBlocklistResultDetail(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + bool filtered = default; + string id = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("filtered"u8)) + { + filtered = property.Value.GetBoolean(); + continue; + } + if (property.NameEquals("id"u8)) + { + id = property.Value.GetString(); + continue; + } + if (options.Format != "W") + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalAzureContentFilterBlocklistResultDetail(filtered, id, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalAzureContentFilterBlocklistResultDetail)} does not support writing '{options.Format}' format."); + } + } + + InternalAzureContentFilterBlocklistResultDetail IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalAzureContentFilterBlocklistResultDetail(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalAzureContentFilterBlocklistResultDetail)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + /// Deserializes the model from a raw response. + /// The result to deserialize the model from. + internal static InternalAzureContentFilterBlocklistResultDetail FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalAzureContentFilterBlocklistResultDetail(document.RootElement); + } + + /// Convert into a . 
+ internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} + diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureContentFilterBlocklistResultDetail.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureContentFilterBlocklistResultDetail.cs new file mode 100644 index 000000000..d0726035c --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureContentFilterBlocklistResultDetail.cs @@ -0,0 +1,78 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace Azure.AI.OpenAI +{ + /// The AzureContentFilterBlocklistResultDetail. + internal partial class InternalAzureContentFilterBlocklistResultDetail + { + /// + /// Keeps track of any properties unknown to the library. + /// + /// To assign an object to the value of this property use . + /// + /// + /// To assign an already formatted json string to this property use . + /// + /// + /// Examples: + /// + /// + /// BinaryData.FromObjectAsJson("foo") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromString("\"foo\"") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromObjectAsJson(new { key = "value" }) + /// Creates a payload of { "key": "value" }. + /// + /// + /// BinaryData.FromString("{\"key\": \"value\"}") + /// Creates a payload of { "key": "value" }. + /// + /// + /// + /// + internal IDictionary SerializedAdditionalRawData { get; set; } + /// Initializes a new instance of . + /// A value indicating whether the blocklist produced a filtering action. + /// The ID of the custom blocklist evaluated. + /// is null. + internal InternalAzureContentFilterBlocklistResultDetail(bool filtered, string id) + { + Argument.AssertNotNull(id, nameof(id)); + + Filtered = filtered; + Id = id; + } + + /// Initializes a new instance of . + /// A value indicating whether the blocklist produced a filtering action. 
+ /// The ID of the custom blocklist evaluated. + /// Keeps track of any properties unknown to the library. + internal InternalAzureContentFilterBlocklistResultDetail(bool filtered, string id, IDictionary serializedAdditionalRawData) + { + Filtered = filtered; + Id = id; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + /// Initializes a new instance of for deserialization. + internal InternalAzureContentFilterBlocklistResultDetail() + { + } + + /// A value indicating whether the blocklist produced a filtering action. + internal bool Filtered { get; set; } + /// The ID of the custom blocklist evaluated. + internal string Id { get; set; } + } +} + diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureContentFilterResultForPromptContentFilterResults.Serialization.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureContentFilterResultForPromptContentFilterResults.Serialization.cs new file mode 100644 index 000000000..c2378dd52 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureContentFilterResultForPromptContentFilterResults.Serialization.cs @@ -0,0 +1,263 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace Azure.AI.OpenAI +{ + internal partial class InternalAzureContentFilterResultForPromptContentFilterResults : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalAzureContentFilterResultForPromptContentFilterResults)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("sexual") != true && Optional.IsDefined(Sexual)) + { + writer.WritePropertyName("sexual"u8); + writer.WriteObjectValue(Sexual, options); + } + if (SerializedAdditionalRawData?.ContainsKey("hate") != true && Optional.IsDefined(Hate)) + { + writer.WritePropertyName("hate"u8); + writer.WriteObjectValue(Hate, options); + } + if (SerializedAdditionalRawData?.ContainsKey("violence") != true && Optional.IsDefined(Violence)) + { + writer.WritePropertyName("violence"u8); + writer.WriteObjectValue(Violence, options); + } + if (SerializedAdditionalRawData?.ContainsKey("self_harm") != true && Optional.IsDefined(SelfHarm)) + { + writer.WritePropertyName("self_harm"u8); + writer.WriteObjectValue(SelfHarm, options); + } + if (SerializedAdditionalRawData?.ContainsKey("profanity") != true && Optional.IsDefined(Profanity)) + { + writer.WritePropertyName("profanity"u8); + writer.WriteObjectValue(Profanity, options); + } + if (SerializedAdditionalRawData?.ContainsKey("custom_blocklists") != true && Optional.IsDefined(CustomBlocklists)) + { + writer.WritePropertyName("custom_blocklists"u8); + writer.WriteObjectValue(CustomBlocklists, options); + } + if (SerializedAdditionalRawData?.ContainsKey("error") != true && Optional.IsDefined(Error)) + { + writer.WritePropertyName("error"u8); + writer.WriteObjectValue(Error, options); + } + if (SerializedAdditionalRawData?.ContainsKey("jailbreak") != true) + { + writer.WritePropertyName("jailbreak"u8); + writer.WriteObjectValue(Jailbreak, options); + } + if (SerializedAdditionalRawData?.ContainsKey("indirect_attack") != true) + { + writer.WritePropertyName("indirect_attack"u8); + 
writer.WriteObjectValue(IndirectAttack, options); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalAzureContentFilterResultForPromptContentFilterResults IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalAzureContentFilterResultForPromptContentFilterResults)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalAzureContentFilterResultForPromptContentFilterResults(document.RootElement, options); + } + + internal static InternalAzureContentFilterResultForPromptContentFilterResults DeserializeInternalAzureContentFilterResultForPromptContentFilterResults(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + ContentFilterSeverityResult sexual = default; + ContentFilterSeverityResult hate = default; + ContentFilterSeverityResult violence = default; + ContentFilterSeverityResult selfHarm = default; + ContentFilterDetectionResult profanity = default; + ContentFilterBlocklistResult customBlocklists = default; + InternalAzureContentFilterResultForPromptContentFilterResultsError error = default; + ContentFilterDetectionResult jailbreak = default; + ContentFilterDetectionResult indirectAttack 
= default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("sexual"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + sexual = ContentFilterSeverityResult.DeserializeContentFilterSeverityResult(property.Value, options); + continue; + } + if (property.NameEquals("hate"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + hate = ContentFilterSeverityResult.DeserializeContentFilterSeverityResult(property.Value, options); + continue; + } + if (property.NameEquals("violence"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + violence = ContentFilterSeverityResult.DeserializeContentFilterSeverityResult(property.Value, options); + continue; + } + if (property.NameEquals("self_harm"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + selfHarm = ContentFilterSeverityResult.DeserializeContentFilterSeverityResult(property.Value, options); + continue; + } + if (property.NameEquals("profanity"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + profanity = ContentFilterDetectionResult.DeserializeContentFilterDetectionResult(property.Value, options); + continue; + } + if (property.NameEquals("custom_blocklists"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + customBlocklists = ContentFilterBlocklistResult.DeserializeContentFilterBlocklistResult(property.Value, options); + continue; + } + if (property.NameEquals("error"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + error = InternalAzureContentFilterResultForPromptContentFilterResultsError.DeserializeInternalAzureContentFilterResultForPromptContentFilterResultsError(property.Value, options); + continue; + } + if (property.NameEquals("jailbreak"u8)) + { + 
jailbreak = ContentFilterDetectionResult.DeserializeContentFilterDetectionResult(property.Value, options); + continue; + } + if (property.NameEquals("indirect_attack"u8)) + { + indirectAttack = ContentFilterDetectionResult.DeserializeContentFilterDetectionResult(property.Value, options); + continue; + } + if (options.Format != "W") + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalAzureContentFilterResultForPromptContentFilterResults( + sexual, + hate, + violence, + selfHarm, + profanity, + customBlocklists, + error, + jailbreak, + indirectAttack, + serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalAzureContentFilterResultForPromptContentFilterResults)} does not support writing '{options.Format}' format."); + } + } + + InternalAzureContentFilterResultForPromptContentFilterResults IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalAzureContentFilterResultForPromptContentFilterResults(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalAzureContentFilterResultForPromptContentFilterResults)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + /// Deserializes the model from a raw response. + /// The result to deserialize the model from. + internal static InternalAzureContentFilterResultForPromptContentFilterResults FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalAzureContentFilterResultForPromptContentFilterResults(document.RootElement); + } + + /// Convert into a . + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} + diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureContentFilterResultForPromptContentFilterResults.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureContentFilterResultForPromptContentFilterResults.cs new file mode 100644 index 000000000..8c654f487 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureContentFilterResultForPromptContentFilterResults.cs @@ -0,0 +1,169 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace Azure.AI.OpenAI +{ + /// The AzureContentFilterResultForPromptContentFilterResults. + internal partial class InternalAzureContentFilterResultForPromptContentFilterResults + { + /// + /// Keeps track of any properties unknown to the library. + /// + /// To assign an object to the value of this property use . 
+ /// + /// + /// To assign an already formatted json string to this property use . + /// + /// + /// Examples: + /// + /// + /// BinaryData.FromObjectAsJson("foo") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromString("\"foo\"") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromObjectAsJson(new { key = "value" }) + /// Creates a payload of { "key": "value" }. + /// + /// + /// BinaryData.FromString("{\"key\": \"value\"}") + /// Creates a payload of { "key": "value" }. + /// + /// + /// + /// + internal IDictionary SerializedAdditionalRawData { get; set; } + /// Initializes a new instance of . + /// + /// A detection result that describes user prompt injection attacks, where malicious users deliberately exploit + /// system vulnerabilities to elicit unauthorized behavior from the LLM. This could lead to inappropriate content + /// generation or violations of system-imposed restrictions. + /// + /// + /// A detection result that describes attacks on systems powered by Generative AI models that can happen every time + /// an application processes information that wasn’t directly authored by either the developer of the application or + /// the user. + /// + /// or is null. + internal InternalAzureContentFilterResultForPromptContentFilterResults(ContentFilterDetectionResult jailbreak, ContentFilterDetectionResult indirectAttack) + { + Argument.AssertNotNull(jailbreak, nameof(jailbreak)); + Argument.AssertNotNull(indirectAttack, nameof(indirectAttack)); + + Jailbreak = jailbreak; + IndirectAttack = indirectAttack; + } + + /// Initializes a new instance of . + /// + /// A content filter category for language related to anatomical organs and genitals, romantic relationships, acts + /// portrayed in erotic or affectionate terms, pregnancy, physical sexual acts, including those portrayed as an + /// assault or a forced sexual violent act against one's will, prostitution, pornography, and abuse. 
+ /// 
+ /// 
+ /// A content filter category that can refer to any content that attacks or uses pejorative or discriminatory 
+ /// language with reference to a person or identity group based on certain differentiating attributes of these groups 
+ /// including but not limited to race, ethnicity, nationality, gender identity and expression, sexual orientation, 
+ /// religion, immigration status, ability status, personal appearance, and body size. 
+ /// 
+ /// 
+ /// A content filter category for language related to physical actions intended to hurt, injure, damage, or kill 
+ /// someone or something; describes weapons, guns and related entities, such as manufacturers, associations, 
+ /// legislation, and so on. 
+ /// 
+ /// 
+ /// A content filter category that describes language related to physical actions intended to purposely hurt, injure, 
+ /// damage one's body or kill oneself. 
+ /// 
+ /// 
+ /// A detection result that identifies whether crude, vulgar, or otherwise objectionable language is present in the 
+ /// content. 
+ /// 
+ /// A collection of binary filtering outcomes for configured custom blocklists. 
+ /// If present, details about an error that prevented content filtering from completing its evaluation. 
+ /// 
+ /// A detection result that describes user prompt injection attacks, where malicious users deliberately exploit 
+ /// system vulnerabilities to elicit unauthorized behavior from the LLM. This could lead to inappropriate content 
+ /// generation or violations of system-imposed restrictions. 
+ /// 
+ /// 
+ /// A detection result that describes attacks on systems powered by Generative AI models that can happen every time 
+ /// an application processes information that wasn’t directly authored by either the developer of the application or 
+ /// the user. 
+ /// 
+ /// Keeps track of any properties unknown to the library. 
+ internal InternalAzureContentFilterResultForPromptContentFilterResults(ContentFilterSeverityResult sexual, ContentFilterSeverityResult hate, ContentFilterSeverityResult violence, ContentFilterSeverityResult selfHarm, ContentFilterDetectionResult profanity, ContentFilterBlocklistResult customBlocklists, InternalAzureContentFilterResultForPromptContentFilterResultsError error, ContentFilterDetectionResult jailbreak, ContentFilterDetectionResult indirectAttack, IDictionary serializedAdditionalRawData) + { + Sexual = sexual; + Hate = hate; + Violence = violence; + SelfHarm = selfHarm; + Profanity = profanity; + CustomBlocklists = customBlocklists; + Error = error; + Jailbreak = jailbreak; + IndirectAttack = indirectAttack; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + /// Initializes a new instance of for deserialization. + internal InternalAzureContentFilterResultForPromptContentFilterResults() + { + } + + /// + /// A content filter category for language related to anatomical organs and genitals, romantic relationships, acts + /// portrayed in erotic or affectionate terms, pregnancy, physical sexual acts, including those portrayed as an + /// assault or a forced sexual violent act against one's will, prostitution, pornography, and abuse. + /// + internal ContentFilterSeverityResult Sexual { get; set; } + /// + /// A content filter category that can refer to any content that attacks or uses pejorative or discriminatory + /// language with reference to a person or identity group based on certain differentiating attributes of these groups + /// including but not limited to race, ethnicity, nationality, gender identity and expression, sexual orientation, + /// religion, immigration status, ability status, personal appearance, and body size. 
+ /// 
+ internal ContentFilterSeverityResult Hate { get; set; } 
+ /// 
+ /// A content filter category for language related to physical actions intended to hurt, injure, damage, or kill 
+ /// someone or something; describes weapons, guns and related entities, such as manufacturers, associations, 
+ /// legislation, and so on. 
+ /// 
+ internal ContentFilterSeverityResult Violence { get; set; } 
+ /// 
+ /// A content filter category that describes language related to physical actions intended to purposely hurt, injure, 
+ /// damage one's body or kill oneself. 
+ /// 
+ internal ContentFilterSeverityResult SelfHarm { get; set; } 
+ /// 
+ /// A detection result that identifies whether crude, vulgar, or otherwise objectionable language is present in the 
+ /// content. 
+ /// 
+ internal ContentFilterDetectionResult Profanity { get; set; } 
+ /// A collection of binary filtering outcomes for configured custom blocklists. 
+ internal ContentFilterBlocklistResult CustomBlocklists { get; set; } 
+ /// If present, details about an error that prevented content filtering from completing its evaluation. 
+ internal InternalAzureContentFilterResultForPromptContentFilterResultsError Error { get; set; } 
+ /// 
+ /// A detection result that describes user prompt injection attacks, where malicious users deliberately exploit 
+ /// system vulnerabilities to elicit unauthorized behavior from the LLM. This could lead to inappropriate content 
+ /// generation or violations of system-imposed restrictions. 
+ /// 
+ internal ContentFilterDetectionResult Jailbreak { get; set; } 
+ /// 
+ /// A detection result that describes attacks on systems powered by Generative AI models that can happen every time 
+ /// an application processes information that wasn’t directly authored by either the developer of the application or 
+ /// the user. 
+ /// + internal ContentFilterDetectionResult IndirectAttack { get; set; } + } +} + diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureContentFilterResultForPromptContentFilterResultsError.Serialization.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureContentFilterResultForPromptContentFilterResultsError.Serialization.cs new file mode 100644 index 000000000..24d22365e --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureContentFilterResultForPromptContentFilterResultsError.Serialization.cs @@ -0,0 +1,148 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace Azure.AI.OpenAI +{ + internal partial class InternalAzureContentFilterResultForPromptContentFilterResultsError : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalAzureContentFilterResultForPromptContentFilterResultsError)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("code") != true) + { + writer.WritePropertyName("code"u8); + writer.WriteNumberValue(Code); + } + if (SerializedAdditionalRawData?.ContainsKey("message") != true) + { + writer.WritePropertyName("message"u8); + writer.WriteStringValue(Message); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalAzureContentFilterResultForPromptContentFilterResultsError IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalAzureContentFilterResultForPromptContentFilterResultsError)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalAzureContentFilterResultForPromptContentFilterResultsError(document.RootElement, options); + } + + internal static InternalAzureContentFilterResultForPromptContentFilterResultsError DeserializeInternalAzureContentFilterResultForPromptContentFilterResultsError(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + int code = default; + string message = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("code"u8)) + { + code = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("message"u8)) + { + message = property.Value.GetString(); + continue; + } + if (options.Format != "W") + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalAzureContentFilterResultForPromptContentFilterResultsError(code, message, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalAzureContentFilterResultForPromptContentFilterResultsError)} does not support writing '{options.Format}' format."); + } + } + + InternalAzureContentFilterResultForPromptContentFilterResultsError IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalAzureContentFilterResultForPromptContentFilterResultsError(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalAzureContentFilterResultForPromptContentFilterResultsError)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + /// Deserializes the model from a raw response. + /// The result to deserialize the model from. + internal static InternalAzureContentFilterResultForPromptContentFilterResultsError FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalAzureContentFilterResultForPromptContentFilterResultsError(document.RootElement); + } + + /// Convert into a . 
+ internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} + diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureContentFilterResultForPromptContentFilterResultsError.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureContentFilterResultForPromptContentFilterResultsError.cs new file mode 100644 index 000000000..af8b8a13f --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureContentFilterResultForPromptContentFilterResultsError.cs @@ -0,0 +1,78 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace Azure.AI.OpenAI +{ + /// The AzureContentFilterResultForPromptContentFilterResultsError. + internal partial class InternalAzureContentFilterResultForPromptContentFilterResultsError + { + /// + /// Keeps track of any properties unknown to the library. + /// + /// To assign an object to the value of this property use . + /// + /// + /// To assign an already formatted json string to this property use . + /// + /// + /// Examples: + /// + /// + /// BinaryData.FromObjectAsJson("foo") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromString("\"foo\"") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromObjectAsJson(new { key = "value" }) + /// Creates a payload of { "key": "value" }. + /// + /// + /// BinaryData.FromString("{\"key\": \"value\"}") + /// Creates a payload of { "key": "value" }. + /// + /// + /// + /// + internal IDictionary SerializedAdditionalRawData { get; set; } + /// Initializes a new instance of . + /// A distinct, machine-readable code associated with the error. + /// A human-readable message associated with the error. + /// is null. 
+ internal InternalAzureContentFilterResultForPromptContentFilterResultsError(int code, string message) + { + Argument.AssertNotNull(message, nameof(message)); + + Code = code; + Message = message; + } + + /// Initializes a new instance of . + /// A distinct, machine-readable code associated with the error. + /// A human-readable message associated with the error. + /// Keeps track of any properties unknown to the library. + internal InternalAzureContentFilterResultForPromptContentFilterResultsError(int code, string message, IDictionary serializedAdditionalRawData) + { + Code = code; + Message = message; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + /// Initializes a new instance of for deserialization. + internal InternalAzureContentFilterResultForPromptContentFilterResultsError() + { + } + + /// A distinct, machine-readable code associated with the error. + internal int Code { get; set; } + /// A human-readable message associated with the error. + internal string Message { get; set; } + } +} + diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureCosmosDBChatDataSourceParameters.Serialization.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureCosmosDBChatDataSourceParameters.Serialization.cs new file mode 100644 index 000000000..105a22779 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureCosmosDBChatDataSourceParameters.Serialization.cs @@ -0,0 +1,317 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace Azure.AI.OpenAI.Chat +{ + internal partial class InternalAzureCosmosDBChatDataSourceParameters : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalAzureCosmosDBChatDataSourceParameters)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("top_n_documents") != true && Optional.IsDefined(TopNDocuments)) + { + writer.WritePropertyName("top_n_documents"u8); + writer.WriteNumberValue(TopNDocuments.Value); + } + if (SerializedAdditionalRawData?.ContainsKey("in_scope") != true && Optional.IsDefined(InScope)) + { + writer.WritePropertyName("in_scope"u8); + writer.WriteBooleanValue(InScope.Value); + } + if (SerializedAdditionalRawData?.ContainsKey("strictness") != true && Optional.IsDefined(Strictness)) + { + writer.WritePropertyName("strictness"u8); + writer.WriteNumberValue(Strictness.Value); + } + if (SerializedAdditionalRawData?.ContainsKey("role_information") != true && Optional.IsDefined(RoleInformation)) + { + writer.WritePropertyName("role_information"u8); + writer.WriteStringValue(RoleInformation); + } + if (SerializedAdditionalRawData?.ContainsKey("max_search_queries") != true && Optional.IsDefined(MaxSearchQueries)) + { + writer.WritePropertyName("max_search_queries"u8); + writer.WriteNumberValue(MaxSearchQueries.Value); + } + if (SerializedAdditionalRawData?.ContainsKey("allow_partial_result") != true && Optional.IsDefined(AllowPartialResult)) + { + writer.WritePropertyName("allow_partial_result"u8); + writer.WriteBooleanValue(AllowPartialResult.Value); + } + if (SerializedAdditionalRawData?.ContainsKey("include_contexts") != true && Optional.IsCollectionDefined(_internalIncludeContexts)) + { + writer.WritePropertyName("include_contexts"u8); + writer.WriteStartArray(); + foreach (var item in _internalIncludeContexts) + { + writer.WriteStringValue(item); + } + writer.WriteEndArray(); + } + if (SerializedAdditionalRawData?.ContainsKey("container_name") != true) + { + 
writer.WritePropertyName("container_name"u8); + writer.WriteStringValue(ContainerName); + } + if (SerializedAdditionalRawData?.ContainsKey("database_name") != true) + { + writer.WritePropertyName("database_name"u8); + writer.WriteStringValue(DatabaseName); + } + if (SerializedAdditionalRawData?.ContainsKey("embedding_dependency") != true) + { + writer.WritePropertyName("embedding_dependency"u8); + writer.WriteObjectValue(VectorizationSource, options); + } + if (SerializedAdditionalRawData?.ContainsKey("index_name") != true) + { + writer.WritePropertyName("index_name"u8); + writer.WriteStringValue(IndexName); + } + if (SerializedAdditionalRawData?.ContainsKey("authentication") != true) + { + writer.WritePropertyName("authentication"u8); + writer.WriteObjectValue(Authentication, options); + } + if (SerializedAdditionalRawData?.ContainsKey("fields_mapping") != true) + { + writer.WritePropertyName("fields_mapping"u8); + writer.WriteObjectValue(FieldMappings, options); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalAzureCosmosDBChatDataSourceParameters IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalAzureCosmosDBChatDataSourceParameters)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalAzureCosmosDBChatDataSourceParameters(document.RootElement, options); + } + + internal static InternalAzureCosmosDBChatDataSourceParameters DeserializeInternalAzureCosmosDBChatDataSourceParameters(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + int? topNDocuments = default; + bool? inScope = default; + int? strictness = default; + string roleInformation = default; + int? maxSearchQueries = default; + bool? allowPartialResult = default; + IList includeContexts = default; + string containerName = default; + string databaseName = default; + DataSourceVectorizer embeddingDependency = default; + string indexName = default; + DataSourceAuthentication authentication = default; + DataSourceFieldMappings fieldsMapping = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("top_n_documents"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + topNDocuments = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("in_scope"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + inScope = property.Value.GetBoolean(); + continue; + } + if (property.NameEquals("strictness"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + strictness = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("role_information"u8)) + { + roleInformation = 
property.Value.GetString(); + continue; + } + if (property.NameEquals("max_search_queries"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + maxSearchQueries = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("allow_partial_result"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + allowPartialResult = property.Value.GetBoolean(); + continue; + } + if (property.NameEquals("include_contexts"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + List array = new List(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(item.GetString()); + } + includeContexts = array; + continue; + } + if (property.NameEquals("container_name"u8)) + { + containerName = property.Value.GetString(); + continue; + } + if (property.NameEquals("database_name"u8)) + { + databaseName = property.Value.GetString(); + continue; + } + if (property.NameEquals("embedding_dependency"u8)) + { + embeddingDependency = DataSourceVectorizer.DeserializeDataSourceVectorizer(property.Value, options); + continue; + } + if (property.NameEquals("index_name"u8)) + { + indexName = property.Value.GetString(); + continue; + } + if (property.NameEquals("authentication"u8)) + { + authentication = DataSourceAuthentication.DeserializeDataSourceAuthentication(property.Value, options); + continue; + } + if (property.NameEquals("fields_mapping"u8)) + { + fieldsMapping = DataSourceFieldMappings.DeserializeDataSourceFieldMappings(property.Value, options); + continue; + } + if (options.Format != "W") + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalAzureCosmosDBChatDataSourceParameters( + topNDocuments, + inScope, + strictness, + roleInformation, + maxSearchQueries, + allowPartialResult, + includeContexts ?? 
new ChangeTrackingList(), + containerName, + databaseName, + embeddingDependency, + indexName, + authentication, + fieldsMapping, + serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalAzureCosmosDBChatDataSourceParameters)} does not support writing '{options.Format}' format."); + } + } + + InternalAzureCosmosDBChatDataSourceParameters IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalAzureCosmosDBChatDataSourceParameters(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalAzureCosmosDBChatDataSourceParameters)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + /// Deserializes the model from a raw response. + /// The result to deserialize the model from. + internal static InternalAzureCosmosDBChatDataSourceParameters FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalAzureCosmosDBChatDataSourceParameters(document.RootElement); + } + + /// Convert into a . 
+ internal virtual BinaryContent ToBinaryContent()
+ {
+ return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions);
+ }
+ }
+}
+
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureCosmosDBChatDataSourceParameters.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureCosmosDBChatDataSourceParameters.cs
new file mode 100644
index 000000000..69fcfdfdf
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureCosmosDBChatDataSourceParameters.cs
@@ -0,0 +1,165 @@
+//
+
+#nullable disable
+
+using System;
+using System.Collections.Generic;
+
+namespace Azure.AI.OpenAI.Chat
+{
+ /// The AzureCosmosDBChatDataSourceParameters.
+ internal partial class InternalAzureCosmosDBChatDataSourceParameters
+ {
+ ///
+ /// Keeps track of any properties unknown to the library.
+ ///
+ /// To assign an object to the value of this property use .
+ ///
+ ///
+ /// To assign an already formatted json string to this property use .
+ ///
+ ///
+ /// Examples:
+ ///
+ ///
+ /// BinaryData.FromObjectAsJson("foo")
+ /// Creates a payload of "foo".
+ ///
+ ///
+ /// BinaryData.FromString("\"foo\"")
+ /// Creates a payload of "foo".
+ ///
+ ///
+ /// BinaryData.FromObjectAsJson(new { key = "value" })
+ /// Creates a payload of { "key": "value" }.
+ ///
+ ///
+ /// BinaryData.FromString("{\"key\": \"value\"}")
+ /// Creates a payload of { "key": "value" }.
+ ///
+ ///
+ ///
+ ///
+ internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; }
+ /// Initializes a new instance of .
+ ///
+ ///
+ ///
+ /// Please note is the base class. According to the scenario, a derived class of the base class might need to be assigned here, or this property needs to be cast to one of the possible derived classes.
+ ///
+ ///
+ ///
+ /// Please note is the base class.
According to the scenario, a derived class of the base class might need to be assigned here, or this property needs to be cast to one of the possible derived classes.
+ ///
+ ///
+ /// , , , , or is null.
+ internal InternalAzureCosmosDBChatDataSourceParameters(string containerName, string databaseName, DataSourceVectorizer vectorizationSource, string indexName, DataSourceAuthentication authentication, DataSourceFieldMappings fieldMappings)
+ {
+ Argument.AssertNotNull(containerName, nameof(containerName));
+ Argument.AssertNotNull(databaseName, nameof(databaseName));
+ Argument.AssertNotNull(vectorizationSource, nameof(vectorizationSource));
+ Argument.AssertNotNull(indexName, nameof(indexName));
+ Argument.AssertNotNull(authentication, nameof(authentication));
+ Argument.AssertNotNull(fieldMappings, nameof(fieldMappings));
+
+ _internalIncludeContexts = new ChangeTrackingList<string>();
+ ContainerName = containerName;
+ DatabaseName = databaseName;
+ VectorizationSource = vectorizationSource;
+ IndexName = indexName;
+ Authentication = authentication;
+ FieldMappings = fieldMappings;
+ }
+
+ /// Initializes a new instance of .
+ /// The configured number of documents to feature in the query.
+ /// Whether queries should be restricted to use of the indexed data.
+ ///
+ /// The configured strictness of the search relevance filtering.
+ /// Higher strictness will increase precision but lower recall of the answer.
+ ///
+ ///
+ /// Additional instructions for the model to inform how it should behave and any context it should reference when
+ /// generating a response. You can describe the assistant's personality and tell it how to format responses.
+ /// This is limited to 100 tokens and counts against the overall token limit.
+ ///
+ ///
+ /// The maximum number of rewritten queries that should be sent to the search provider for a single user message.
+ /// By default, the system will make an automatic determination.
+ ///
+ ///
+ /// If set to true, the system will allow partial search results to be used and the request will fail if all
+ /// partial queries fail. If not specified or specified as false, the request will fail if any search query fails.
+ ///
+ ///
+ /// The output context properties to include on the response.
+ /// By default, citations and intent will be requested.
+ ///
+ ///
+ ///
+ ///
+ /// Please note is the base class. According to the scenario, a derived class of the base class might need to be assigned here, or this property needs to be cast to one of the possible derived classes.
+ ///
+ ///
+ ///
+ /// Please note is the base class. According to the scenario, a derived class of the base class might need to be assigned here, or this property needs to be cast to one of the possible derived classes.
+ ///
+ ///
+ /// Keeps track of any properties unknown to the library.
+ internal InternalAzureCosmosDBChatDataSourceParameters(int? topNDocuments, bool? inScope, int? strictness, string roleInformation, int? maxSearchQueries, bool? allowPartialResult, IList<string> internalIncludeContexts, string containerName, string databaseName, DataSourceVectorizer vectorizationSource, string indexName, DataSourceAuthentication authentication, DataSourceFieldMappings fieldMappings, IDictionary<string, BinaryData> serializedAdditionalRawData)
+ {
+ TopNDocuments = topNDocuments;
+ InScope = inScope;
+ Strictness = strictness;
+ RoleInformation = roleInformation;
+ MaxSearchQueries = maxSearchQueries;
+ AllowPartialResult = allowPartialResult;
+ _internalIncludeContexts = internalIncludeContexts;
+ ContainerName = containerName;
+ DatabaseName = databaseName;
+ VectorizationSource = vectorizationSource;
+ IndexName = indexName;
+ Authentication = authentication;
+ FieldMappings = fieldMappings;
+ SerializedAdditionalRawData = serializedAdditionalRawData;
+ }
+
+ /// Initializes a new instance of for deserialization.
+ internal InternalAzureCosmosDBChatDataSourceParameters() + { + } + + /// The configured number of documents to feature in the query. + internal int? TopNDocuments { get; set; } + /// Whether queries should be restricted to use of the indexed data. + internal bool? InScope { get; set; } + /// + /// The configured strictness of the search relevance filtering. + /// Higher strictness will increase precision but lower recall of the answer. + /// + internal int? Strictness { get; set; } + /// + /// Additional instructions for the model to inform how it should behave and any context it should reference when + /// generating a response. You can describe the assistant's personality and tell it how to format responses. + /// This is limited to 100 tokens and counts against the overall token limit. + /// + internal string RoleInformation { get; set; } + /// + /// The maximum number of rewritten queries that should be sent to the search provider for a single user message. + /// By default, the system will make an automatic determination. + /// + internal int? MaxSearchQueries { get; set; } + /// + /// If set to true, the system will allow partial search results to be used and the request will fail if all + /// partial queries fail. If not specified or specified as false, the request will fail if any search query fails. + /// + internal bool? AllowPartialResult { get; set; } + /// Gets the container name. + internal string ContainerName { get; set; } + /// Gets the database name. + internal string DatabaseName { get; set; } + /// Gets the index name. 
+ internal string IndexName { get; set; } + } +} + diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureMachineLearningIndexChatDataSourceParameters.Serialization.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureMachineLearningIndexChatDataSourceParameters.Serialization.cs new file mode 100644 index 000000000..d5071162c --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureMachineLearningIndexChatDataSourceParameters.Serialization.cs @@ -0,0 +1,305 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace Azure.AI.OpenAI.Chat +{ + internal partial class InternalAzureMachineLearningIndexChatDataSourceParameters : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalAzureMachineLearningIndexChatDataSourceParameters)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("top_n_documents") != true && Optional.IsDefined(TopNDocuments)) + { + writer.WritePropertyName("top_n_documents"u8); + writer.WriteNumberValue(TopNDocuments.Value); + } + if (SerializedAdditionalRawData?.ContainsKey("in_scope") != true && Optional.IsDefined(InScope)) + { + writer.WritePropertyName("in_scope"u8); + writer.WriteBooleanValue(InScope.Value); + } + if (SerializedAdditionalRawData?.ContainsKey("strictness") != true && Optional.IsDefined(Strictness)) + { + writer.WritePropertyName("strictness"u8); + writer.WriteNumberValue(Strictness.Value); + } + if (SerializedAdditionalRawData?.ContainsKey("role_information") != true && Optional.IsDefined(RoleInformation)) + { + 
writer.WritePropertyName("role_information"u8); + writer.WriteStringValue(RoleInformation); + } + if (SerializedAdditionalRawData?.ContainsKey("max_search_queries") != true && Optional.IsDefined(MaxSearchQueries)) + { + writer.WritePropertyName("max_search_queries"u8); + writer.WriteNumberValue(MaxSearchQueries.Value); + } + if (SerializedAdditionalRawData?.ContainsKey("allow_partial_result") != true && Optional.IsDefined(AllowPartialResult)) + { + writer.WritePropertyName("allow_partial_result"u8); + writer.WriteBooleanValue(AllowPartialResult.Value); + } + if (SerializedAdditionalRawData?.ContainsKey("include_contexts") != true && Optional.IsCollectionDefined(_internalIncludeContexts)) + { + writer.WritePropertyName("include_contexts"u8); + writer.WriteStartArray(); + foreach (var item in _internalIncludeContexts) + { + writer.WriteStringValue(item); + } + writer.WriteEndArray(); + } + if (SerializedAdditionalRawData?.ContainsKey("authentication") != true) + { + writer.WritePropertyName("authentication"u8); + writer.WriteObjectValue(Authentication, options); + } + if (SerializedAdditionalRawData?.ContainsKey("project_resource_id") != true) + { + writer.WritePropertyName("project_resource_id"u8); + writer.WriteStringValue(ProjectResourceId); + } + if (SerializedAdditionalRawData?.ContainsKey("name") != true) + { + writer.WritePropertyName("name"u8); + writer.WriteStringValue(Name); + } + if (SerializedAdditionalRawData?.ContainsKey("version") != true) + { + writer.WritePropertyName("version"u8); + writer.WriteStringValue(Version); + } + if (SerializedAdditionalRawData?.ContainsKey("filter") != true && Optional.IsDefined(Filter)) + { + writer.WritePropertyName("filter"u8); + writer.WriteStringValue(Filter); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + 
writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalAzureMachineLearningIndexChatDataSourceParameters IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalAzureMachineLearningIndexChatDataSourceParameters)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalAzureMachineLearningIndexChatDataSourceParameters(document.RootElement, options); + } + + internal static InternalAzureMachineLearningIndexChatDataSourceParameters DeserializeInternalAzureMachineLearningIndexChatDataSourceParameters(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + int? topNDocuments = default; + bool? inScope = default; + int? strictness = default; + string roleInformation = default; + int? maxSearchQueries = default; + bool? 
allowPartialResult = default; + IList includeContexts = default; + DataSourceAuthentication authentication = default; + string projectResourceId = default; + string name = default; + string version = default; + string filter = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("top_n_documents"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + topNDocuments = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("in_scope"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + inScope = property.Value.GetBoolean(); + continue; + } + if (property.NameEquals("strictness"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + strictness = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("role_information"u8)) + { + roleInformation = property.Value.GetString(); + continue; + } + if (property.NameEquals("max_search_queries"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + maxSearchQueries = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("allow_partial_result"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + allowPartialResult = property.Value.GetBoolean(); + continue; + } + if (property.NameEquals("include_contexts"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + List array = new List(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(item.GetString()); + } + includeContexts = array; + continue; + } + if (property.NameEquals("authentication"u8)) + { + authentication = DataSourceAuthentication.DeserializeDataSourceAuthentication(property.Value, options); + continue; + } + if (property.NameEquals("project_resource_id"u8)) + { + projectResourceId = 
property.Value.GetString(); + continue; + } + if (property.NameEquals("name"u8)) + { + name = property.Value.GetString(); + continue; + } + if (property.NameEquals("version"u8)) + { + version = property.Value.GetString(); + continue; + } + if (property.NameEquals("filter"u8)) + { + filter = property.Value.GetString(); + continue; + } + if (options.Format != "W") + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalAzureMachineLearningIndexChatDataSourceParameters( + topNDocuments, + inScope, + strictness, + roleInformation, + maxSearchQueries, + allowPartialResult, + includeContexts ?? new ChangeTrackingList(), + authentication, + projectResourceId, + name, + version, + filter, + serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalAzureMachineLearningIndexChatDataSourceParameters)} does not support writing '{options.Format}' format."); + } + } + + InternalAzureMachineLearningIndexChatDataSourceParameters IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalAzureMachineLearningIndexChatDataSourceParameters(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalAzureMachineLearningIndexChatDataSourceParameters)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + /// Deserializes the model from a raw response. + /// The result to deserialize the model from. + internal static InternalAzureMachineLearningIndexChatDataSourceParameters FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalAzureMachineLearningIndexChatDataSourceParameters(document.RootElement); + } + + /// Convert into a . + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} + diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureMachineLearningIndexChatDataSourceParameters.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureMachineLearningIndexChatDataSourceParameters.cs new file mode 100644 index 000000000..571ad60a2 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureMachineLearningIndexChatDataSourceParameters.cs @@ -0,0 +1,155 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace Azure.AI.OpenAI.Chat +{ + /// The AzureMachineLearningIndexChatDataSourceParameters. + internal partial class InternalAzureMachineLearningIndexChatDataSourceParameters + { + /// + /// Keeps track of any properties unknown to the library. + /// + /// To assign an object to the value of this property use . 
+ /// + /// + /// To assign an already formatted json string to this property use . + /// + /// + /// Examples: + /// + /// + /// BinaryData.FromObjectAsJson("foo") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromString("\"foo\"") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromObjectAsJson(new { key = "value" }) + /// Creates a payload of { "key": "value" }. + /// + /// + /// BinaryData.FromString("{\"key\": \"value\"}") + /// Creates a payload of { "key": "value" }. + /// + /// + /// + /// + internal IDictionary SerializedAdditionalRawData { get; set; } + /// Initializes a new instance of . + /// + /// Please note is the base class. According to the scenario, a derived class of the base class might need to be assigned here, or this property needs to be casted to one of the possible derived classes.. + /// + /// The ID of the Azure Machine Learning index project to use. + /// The name of the Azure Machine Learning index to use. + /// The version of the vector index to use. + /// , , or is null. + internal InternalAzureMachineLearningIndexChatDataSourceParameters(DataSourceAuthentication authentication, string projectResourceId, string name, string version) + { + Argument.AssertNotNull(authentication, nameof(authentication)); + Argument.AssertNotNull(projectResourceId, nameof(projectResourceId)); + Argument.AssertNotNull(name, nameof(name)); + Argument.AssertNotNull(version, nameof(version)); + + _internalIncludeContexts = new ChangeTrackingList(); + Authentication = authentication; + ProjectResourceId = projectResourceId; + Name = name; + Version = version; + } + + /// Initializes a new instance of . + /// The configured number of documents to feature in the query. + /// Whether queries should be restricted to use of the indexed data. + /// + /// The configured strictness of the search relevance filtering. + /// Higher strictness will increase precision but lower recall of the answer. 
+ /// + /// + /// Additional instructions for the model to inform how it should behave and any context it should reference when + /// generating a response. You can describe the assistant's personality and tell it how to format responses. + /// This is limited to 100 tokens and counts against the overall token limit. + /// + /// + /// The maximum number of rewritten queries that should be sent to the search provider for a single user message. + /// By default, the system will make an automatic determination. + /// + /// + /// If set to true, the system will allow partial search results to be used and the request will fail if all + /// partial queries fail. If not specified or specified as false, the request will fail if any search query fails. + /// + /// + /// The output context properties to include on the response. + /// By default, citations and intent will be requested. + /// + /// + /// Please note is the base class. According to the scenario, a derived class of the base class might need to be assigned here, or this property needs to be casted to one of the possible derived classes.. + /// + /// The ID of the Azure Machine Learning index project to use. + /// The name of the Azure Machine Learning index to use. + /// The version of the vector index to use. + /// A search filter, which is only applicable if the vector index is of the 'AzureSearch' type. + /// Keeps track of any properties unknown to the library. + internal InternalAzureMachineLearningIndexChatDataSourceParameters(int? topNDocuments, bool? inScope, int? strictness, string roleInformation, int? maxSearchQueries, bool? 
allowPartialResult, IList internalIncludeContexts, DataSourceAuthentication authentication, string projectResourceId, string name, string version, string filter, IDictionary serializedAdditionalRawData) + { + TopNDocuments = topNDocuments; + InScope = inScope; + Strictness = strictness; + RoleInformation = roleInformation; + MaxSearchQueries = maxSearchQueries; + AllowPartialResult = allowPartialResult; + _internalIncludeContexts = internalIncludeContexts; + Authentication = authentication; + ProjectResourceId = projectResourceId; + Name = name; + Version = version; + Filter = filter; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + /// Initializes a new instance of for deserialization. + internal InternalAzureMachineLearningIndexChatDataSourceParameters() + { + } + + /// The configured number of documents to feature in the query. + internal int? TopNDocuments { get; set; } + /// Whether queries should be restricted to use of the indexed data. + internal bool? InScope { get; set; } + /// + /// The configured strictness of the search relevance filtering. + /// Higher strictness will increase precision but lower recall of the answer. + /// + internal int? Strictness { get; set; } + /// + /// Additional instructions for the model to inform how it should behave and any context it should reference when + /// generating a response. You can describe the assistant's personality and tell it how to format responses. + /// This is limited to 100 tokens and counts against the overall token limit. + /// + internal string RoleInformation { get; set; } + /// + /// The maximum number of rewritten queries that should be sent to the search provider for a single user message. + /// By default, the system will make an automatic determination. + /// + internal int? MaxSearchQueries { get; set; } + /// + /// If set to true, the system will allow partial search results to be used and the request will fail if all + /// partial queries fail. 
If not specified or specified as false, the request will fail if any search query fails. + /// + internal bool? AllowPartialResult { get; set; } + /// The ID of the Azure Machine Learning index project to use. + internal string ProjectResourceId { get; set; } + /// The name of the Azure Machine Learning index to use. + internal string Name { get; set; } + /// The version of the vector index to use. + internal string Version { get; set; } + /// A search filter, which is only applicable if the vector index is of the 'AzureSearch' type. + internal string Filter { get; set; } + } +} + diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureOpenAIChatErrorInnerError.Serialization.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureOpenAIChatErrorInnerError.Serialization.cs new file mode 100644 index 000000000..694933771 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureOpenAIChatErrorInnerError.Serialization.cs @@ -0,0 +1,167 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace Azure.AI.OpenAI +{ + internal partial class InternalAzureOpenAIChatErrorInnerError : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalAzureOpenAIChatErrorInnerError)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("code") != true && Optional.IsDefined(Code)) + { + writer.WritePropertyName("code"u8); + writer.WriteStringValue(Code.Value.ToString()); + } + if (SerializedAdditionalRawData?.ContainsKey("revised_prompt") != true && Optional.IsDefined(RevisedPrompt)) + { + writer.WritePropertyName("revised_prompt"u8); + writer.WriteStringValue(RevisedPrompt); + } + if (SerializedAdditionalRawData?.ContainsKey("content_filter_results") != true && Optional.IsDefined(ContentFilterResults)) + { + writer.WritePropertyName("content_filter_results"u8); + writer.WriteObjectValue(ContentFilterResults, options); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalAzureOpenAIChatErrorInnerError IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalAzureOpenAIChatErrorInnerError)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalAzureOpenAIChatErrorInnerError(document.RootElement, options); + } + + internal static InternalAzureOpenAIChatErrorInnerError DeserializeInternalAzureOpenAIChatErrorInnerError(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + InternalAzureOpenAIChatErrorInnerErrorCode? code = default; + string revisedPrompt = default; + ContentFilterResultForPrompt contentFilterResults = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("code"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + code = new InternalAzureOpenAIChatErrorInnerErrorCode(property.Value.GetString()); + continue; + } + if (property.NameEquals("revised_prompt"u8)) + { + revisedPrompt = property.Value.GetString(); + continue; + } + if (property.NameEquals("content_filter_results"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + contentFilterResults = ContentFilterResultForPrompt.DeserializeContentFilterResultForPrompt(property.Value, options); + continue; + } + if (options.Format != "W") + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalAzureOpenAIChatErrorInnerError(code, revisedPrompt, contentFilterResults, serializedAdditionalRawData); + } + + BinaryData 
IPersistableModel<InternalAzureOpenAIChatErrorInnerError>.Write(ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalAzureOpenAIChatErrorInnerError>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    return ModelReaderWriter.Write(this, options);
+                default:
+                    throw new FormatException($"The model {nameof(InternalAzureOpenAIChatErrorInnerError)} does not support writing '{options.Format}' format.");
+            }
+        }
+
+        InternalAzureOpenAIChatErrorInnerError IPersistableModel<InternalAzureOpenAIChatErrorInnerError>.Create(BinaryData data, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalAzureOpenAIChatErrorInnerError>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    {
+                        using JsonDocument document = JsonDocument.Parse(data);
+                        return DeserializeInternalAzureOpenAIChatErrorInnerError(document.RootElement, options);
+                    }
+                default:
+                    throw new FormatException($"The model {nameof(InternalAzureOpenAIChatErrorInnerError)} does not support reading '{options.Format}' format.");
+            }
+        }
+
+        string IPersistableModel<InternalAzureOpenAIChatErrorInnerError>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J";
+
+        /// <summary> Deserializes the model from a raw response. </summary>
+        /// <param name="response"> The result to deserialize the model from. </param>
+        internal static InternalAzureOpenAIChatErrorInnerError FromResponse(PipelineResponse response)
+        {
+            using var document = JsonDocument.Parse(response.Content);
+            return DeserializeInternalAzureOpenAIChatErrorInnerError(document.RootElement);
+        }
+
+        /// <summary> Convert into a <see cref="BinaryContent"/>. </summary>
+ internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} + diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureOpenAIChatErrorInnerError.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureOpenAIChatErrorInnerError.cs new file mode 100644 index 000000000..0912cd016 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureOpenAIChatErrorInnerError.cs @@ -0,0 +1,70 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace Azure.AI.OpenAI +{ + /// The AzureOpenAIChatErrorInnerError. + internal partial class InternalAzureOpenAIChatErrorInnerError + { + /// + /// Keeps track of any properties unknown to the library. + /// + /// To assign an object to the value of this property use . + /// + /// + /// To assign an already formatted json string to this property use . + /// + /// + /// Examples: + /// + /// + /// BinaryData.FromObjectAsJson("foo") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromString("\"foo\"") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromObjectAsJson(new { key = "value" }) + /// Creates a payload of { "key": "value" }. + /// + /// + /// BinaryData.FromString("{\"key\": \"value\"}") + /// Creates a payload of { "key": "value" }. + /// + /// + /// + /// + internal IDictionary SerializedAdditionalRawData { get; set; } + /// Initializes a new instance of . + internal InternalAzureOpenAIChatErrorInnerError() + { + } + + /// Initializes a new instance of . + /// The code associated with the inner error. + /// If applicable, the modified prompt used for generation. + /// The content filter result details associated with the inner error. + /// Keeps track of any properties unknown to the library. + internal InternalAzureOpenAIChatErrorInnerError(InternalAzureOpenAIChatErrorInnerErrorCode? 
code, string revisedPrompt, ContentFilterResultForPrompt contentFilterResults, IDictionary<string, BinaryData> serializedAdditionalRawData)
+        {
+            Code = code;
+            RevisedPrompt = revisedPrompt;
+            ContentFilterResults = contentFilterResults;
+            SerializedAdditionalRawData = serializedAdditionalRawData;
+        }
+
+        /// <summary> The code associated with the inner error. </summary>
+        internal InternalAzureOpenAIChatErrorInnerErrorCode? Code { get; set; }
+        /// <summary> If applicable, the modified prompt used for generation. </summary>
+        internal string RevisedPrompt { get; set; }
+        /// <summary> The content filter result details associated with the inner error. </summary>
+        internal ContentFilterResultForPrompt ContentFilterResults { get; set; }
+    }
+}
+
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureOpenAIChatErrorInnerErrorCode.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureOpenAIChatErrorInnerErrorCode.cs
new file mode 100644
index 000000000..1e1ab2f0c
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureOpenAIChatErrorInnerErrorCode.cs
@@ -0,0 +1,46 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ComponentModel;
+
+namespace Azure.AI.OpenAI
+{
+    /// <summary> The AzureOpenAIChatErrorInnerError_code. </summary>
+    internal readonly partial struct InternalAzureOpenAIChatErrorInnerErrorCode : IEquatable<InternalAzureOpenAIChatErrorInnerErrorCode>
+    {
+        private readonly string _value;
+
+        /// <summary> Initializes a new instance of <see cref="InternalAzureOpenAIChatErrorInnerErrorCode"/>. </summary>
+        /// <exception cref="ArgumentNullException"> <paramref name="value"/> is null. </exception>
+        public InternalAzureOpenAIChatErrorInnerErrorCode(string value)
+        {
+            _value = value ?? throw new ArgumentNullException(nameof(value));
+        }
+
+        private const string ResponsibleAIPolicyViolationValue = "ResponsibleAIPolicyViolation";
+
+        /// <summary> ResponsibleAIPolicyViolation. </summary>
+        internal static InternalAzureOpenAIChatErrorInnerErrorCode ResponsibleAIPolicyViolation { get; } = new InternalAzureOpenAIChatErrorInnerErrorCode(ResponsibleAIPolicyViolationValue);
+        /// <summary> Determines if two <see cref="InternalAzureOpenAIChatErrorInnerErrorCode"/> values are the same. </summary>
+        public static bool operator ==(InternalAzureOpenAIChatErrorInnerErrorCode left, InternalAzureOpenAIChatErrorInnerErrorCode right) => left.Equals(right);
+        /// <summary> Determines if two <see cref="InternalAzureOpenAIChatErrorInnerErrorCode"/> values are not the same. </summary>
+        public static bool operator !=(InternalAzureOpenAIChatErrorInnerErrorCode left, InternalAzureOpenAIChatErrorInnerErrorCode right) => !left.Equals(right);
+        /// <summary> Converts a string to a <see cref="InternalAzureOpenAIChatErrorInnerErrorCode"/>. </summary>
+        public static implicit operator InternalAzureOpenAIChatErrorInnerErrorCode(string value) => new InternalAzureOpenAIChatErrorInnerErrorCode(value);
+
+        /// <inheritdoc />
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override bool Equals(object obj) => obj is InternalAzureOpenAIChatErrorInnerErrorCode other && Equals(other);
+        /// <inheritdoc />
+        public bool Equals(InternalAzureOpenAIChatErrorInnerErrorCode other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase);
+
+        /// <inheritdoc />
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0;
+        /// <inheritdoc />
+        public override string ToString() => _value;
+    }
+}
+
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureOpenAIDalleErrorInnerError.Serialization.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureOpenAIDalleErrorInnerError.Serialization.cs
new file mode 100644
index 000000000..328bdf127
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureOpenAIDalleErrorInnerError.Serialization.cs
@@ -0,0 +1,167 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace Azure.AI.OpenAI
+{
+    internal partial class InternalAzureOpenAIDalleErrorInnerError : IJsonModel<InternalAzureOpenAIDalleErrorInnerError>
+    {
+        void IJsonModel<InternalAzureOpenAIDalleErrorInnerError>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ?
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalAzureOpenAIDalleErrorInnerError)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("code") != true && Optional.IsDefined(Code)) + { + writer.WritePropertyName("code"u8); + writer.WriteStringValue(Code.Value.ToString()); + } + if (SerializedAdditionalRawData?.ContainsKey("revised_prompt") != true && Optional.IsDefined(RevisedPrompt)) + { + writer.WritePropertyName("revised_prompt"u8); + writer.WriteStringValue(RevisedPrompt); + } + if (SerializedAdditionalRawData?.ContainsKey("content_filter_results") != true && Optional.IsDefined(ContentFilterResults)) + { + writer.WritePropertyName("content_filter_results"u8); + writer.WriteObjectValue(ContentFilterResults, options); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalAzureOpenAIDalleErrorInnerError IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalAzureOpenAIDalleErrorInnerError)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalAzureOpenAIDalleErrorInnerError(document.RootElement, options); + } + + internal static InternalAzureOpenAIDalleErrorInnerError DeserializeInternalAzureOpenAIDalleErrorInnerError(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + InternalAzureOpenAIDalleErrorInnerErrorCode? code = default; + string revisedPrompt = default; + ImageContentFilterResultForPrompt contentFilterResults = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("code"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + code = new InternalAzureOpenAIDalleErrorInnerErrorCode(property.Value.GetString()); + continue; + } + if (property.NameEquals("revised_prompt"u8)) + { + revisedPrompt = property.Value.GetString(); + continue; + } + if (property.NameEquals("content_filter_results"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + contentFilterResults = ImageContentFilterResultForPrompt.DeserializeImageContentFilterResultForPrompt(property.Value, options); + continue; + } + if (options.Format != "W") + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalAzureOpenAIDalleErrorInnerError(code, revisedPrompt, contentFilterResults, serializedAdditionalRawData); 
+        }
+
+        BinaryData IPersistableModel<InternalAzureOpenAIDalleErrorInnerError>.Write(ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalAzureOpenAIDalleErrorInnerError>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    return ModelReaderWriter.Write(this, options);
+                default:
+                    throw new FormatException($"The model {nameof(InternalAzureOpenAIDalleErrorInnerError)} does not support writing '{options.Format}' format.");
+            }
+        }
+
+        InternalAzureOpenAIDalleErrorInnerError IPersistableModel<InternalAzureOpenAIDalleErrorInnerError>.Create(BinaryData data, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalAzureOpenAIDalleErrorInnerError>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    {
+                        using JsonDocument document = JsonDocument.Parse(data);
+                        return DeserializeInternalAzureOpenAIDalleErrorInnerError(document.RootElement, options);
+                    }
+                default:
+                    throw new FormatException($"The model {nameof(InternalAzureOpenAIDalleErrorInnerError)} does not support reading '{options.Format}' format.");
+            }
+        }
+
+        string IPersistableModel<InternalAzureOpenAIDalleErrorInnerError>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J";
+
+        /// <summary> Deserializes the model from a raw response. </summary>
+        /// <param name="response"> The result to deserialize the model from. </param>
+        internal static InternalAzureOpenAIDalleErrorInnerError FromResponse(PipelineResponse response)
+        {
+            using var document = JsonDocument.Parse(response.Content);
+            return DeserializeInternalAzureOpenAIDalleErrorInnerError(document.RootElement);
+        }
+
+        /// <summary> Convert into a <see cref="BinaryContent"/>. </summary>
+ internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} + diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureOpenAIDalleErrorInnerError.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureOpenAIDalleErrorInnerError.cs new file mode 100644 index 000000000..d4b590f53 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureOpenAIDalleErrorInnerError.cs @@ -0,0 +1,70 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace Azure.AI.OpenAI +{ + /// The AzureOpenAIDalleErrorInnerError. + internal partial class InternalAzureOpenAIDalleErrorInnerError + { + /// + /// Keeps track of any properties unknown to the library. + /// + /// To assign an object to the value of this property use . + /// + /// + /// To assign an already formatted json string to this property use . + /// + /// + /// Examples: + /// + /// + /// BinaryData.FromObjectAsJson("foo") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromString("\"foo\"") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromObjectAsJson(new { key = "value" }) + /// Creates a payload of { "key": "value" }. + /// + /// + /// BinaryData.FromString("{\"key\": \"value\"}") + /// Creates a payload of { "key": "value" }. + /// + /// + /// + /// + internal IDictionary SerializedAdditionalRawData { get; set; } + /// Initializes a new instance of . + internal InternalAzureOpenAIDalleErrorInnerError() + { + } + + /// Initializes a new instance of . + /// The code associated with the inner error. + /// If applicable, the modified prompt used for generation. + /// The content filter result details associated with the inner error. + /// Keeps track of any properties unknown to the library. + internal InternalAzureOpenAIDalleErrorInnerError(InternalAzureOpenAIDalleErrorInnerErrorCode? 
code, string revisedPrompt, ImageContentFilterResultForPrompt contentFilterResults, IDictionary<string, BinaryData> serializedAdditionalRawData)
+        {
+            Code = code;
+            RevisedPrompt = revisedPrompt;
+            ContentFilterResults = contentFilterResults;
+            SerializedAdditionalRawData = serializedAdditionalRawData;
+        }
+
+        /// <summary> The code associated with the inner error. </summary>
+        internal InternalAzureOpenAIDalleErrorInnerErrorCode? Code { get; set; }
+        /// <summary> If applicable, the modified prompt used for generation. </summary>
+        internal string RevisedPrompt { get; set; }
+        /// <summary> The content filter result details associated with the inner error. </summary>
+        internal ImageContentFilterResultForPrompt ContentFilterResults { get; set; }
+    }
+}
+
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureOpenAIDalleErrorInnerErrorCode.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureOpenAIDalleErrorInnerErrorCode.cs
new file mode 100644
index 000000000..35cf2d978
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureOpenAIDalleErrorInnerErrorCode.cs
@@ -0,0 +1,46 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ComponentModel;
+
+namespace Azure.AI.OpenAI
+{
+    /// <summary> The AzureOpenAIDalleErrorInnerError_code. </summary>
+    internal readonly partial struct InternalAzureOpenAIDalleErrorInnerErrorCode : IEquatable<InternalAzureOpenAIDalleErrorInnerErrorCode>
+    {
+        private readonly string _value;
+
+        /// <summary> Initializes a new instance of <see cref="InternalAzureOpenAIDalleErrorInnerErrorCode"/>. </summary>
+        /// <exception cref="ArgumentNullException"> <paramref name="value"/> is null. </exception>
+        public InternalAzureOpenAIDalleErrorInnerErrorCode(string value)
+        {
+            _value = value ?? throw new ArgumentNullException(nameof(value));
+        }
+
+        private const string ResponsibleAIPolicyViolationValue = "ResponsibleAIPolicyViolation";
+
+        /// <summary> ResponsibleAIPolicyViolation. </summary>
+        internal static InternalAzureOpenAIDalleErrorInnerErrorCode ResponsibleAIPolicyViolation { get; } = new InternalAzureOpenAIDalleErrorInnerErrorCode(ResponsibleAIPolicyViolationValue);
+        /// <summary> Determines if two <see cref="InternalAzureOpenAIDalleErrorInnerErrorCode"/> values are the same. </summary>
+        public static bool operator ==(InternalAzureOpenAIDalleErrorInnerErrorCode left, InternalAzureOpenAIDalleErrorInnerErrorCode right) => left.Equals(right);
+        /// <summary> Determines if two <see cref="InternalAzureOpenAIDalleErrorInnerErrorCode"/> values are not the same. </summary>
+        public static bool operator !=(InternalAzureOpenAIDalleErrorInnerErrorCode left, InternalAzureOpenAIDalleErrorInnerErrorCode right) => !left.Equals(right);
+        /// <summary> Converts a string to a <see cref="InternalAzureOpenAIDalleErrorInnerErrorCode"/>. </summary>
+        public static implicit operator InternalAzureOpenAIDalleErrorInnerErrorCode(string value) => new InternalAzureOpenAIDalleErrorInnerErrorCode(value);
+
+        /// <inheritdoc />
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override bool Equals(object obj) => obj is InternalAzureOpenAIDalleErrorInnerErrorCode other && Equals(other);
+        /// <inheritdoc />
+        public bool Equals(InternalAzureOpenAIDalleErrorInnerErrorCode other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase);
+
+        /// <inheritdoc />
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0;
+        /// <inheritdoc />
+        public override string ToString() => _value;
+    }
+}
+
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureSearchChatDataSourceParameters.Serialization.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureSearchChatDataSourceParameters.Serialization.cs
new file mode 100644
index 000000000..d855a3a0c
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureSearchChatDataSourceParameters.Serialization.cs
@@ -0,0 +1,353 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace Azure.AI.OpenAI.Chat
+{
+    internal partial class InternalAzureSearchChatDataSourceParameters : IJsonModel<InternalAzureSearchChatDataSourceParameters>
+    {
+        void IJsonModel<InternalAzureSearchChatDataSourceParameters>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ?
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalAzureSearchChatDataSourceParameters)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("top_n_documents") != true && Optional.IsDefined(TopNDocuments)) + { + writer.WritePropertyName("top_n_documents"u8); + writer.WriteNumberValue(TopNDocuments.Value); + } + if (SerializedAdditionalRawData?.ContainsKey("in_scope") != true && Optional.IsDefined(InScope)) + { + writer.WritePropertyName("in_scope"u8); + writer.WriteBooleanValue(InScope.Value); + } + if (SerializedAdditionalRawData?.ContainsKey("strictness") != true && Optional.IsDefined(Strictness)) + { + writer.WritePropertyName("strictness"u8); + writer.WriteNumberValue(Strictness.Value); + } + if (SerializedAdditionalRawData?.ContainsKey("role_information") != true && Optional.IsDefined(RoleInformation)) + { + writer.WritePropertyName("role_information"u8); + writer.WriteStringValue(RoleInformation); + } + if (SerializedAdditionalRawData?.ContainsKey("max_search_queries") != true && Optional.IsDefined(MaxSearchQueries)) + { + writer.WritePropertyName("max_search_queries"u8); + writer.WriteNumberValue(MaxSearchQueries.Value); + } + if (SerializedAdditionalRawData?.ContainsKey("allow_partial_result") != true && Optional.IsDefined(AllowPartialResult)) + { + writer.WritePropertyName("allow_partial_result"u8); + writer.WriteBooleanValue(AllowPartialResult.Value); + } + if (SerializedAdditionalRawData?.ContainsKey("include_contexts") != true && Optional.IsCollectionDefined(_internalIncludeContexts)) + { + writer.WritePropertyName("include_contexts"u8); + writer.WriteStartArray(); + foreach (var item in _internalIncludeContexts) + { + writer.WriteStringValue(item); + } + writer.WriteEndArray(); + } + if (SerializedAdditionalRawData?.ContainsKey("endpoint") != true) + { + 
writer.WritePropertyName("endpoint"u8); + writer.WriteStringValue(Endpoint.AbsoluteUri); + } + if (SerializedAdditionalRawData?.ContainsKey("index_name") != true) + { + writer.WritePropertyName("index_name"u8); + writer.WriteStringValue(IndexName); + } + if (SerializedAdditionalRawData?.ContainsKey("authentication") != true) + { + writer.WritePropertyName("authentication"u8); + writer.WriteObjectValue(Authentication, options); + } + if (SerializedAdditionalRawData?.ContainsKey("fields_mapping") != true && Optional.IsDefined(FieldMappings)) + { + writer.WritePropertyName("fields_mapping"u8); + writer.WriteObjectValue(FieldMappings, options); + } + if (SerializedAdditionalRawData?.ContainsKey("query_type") != true && Optional.IsDefined(QueryType)) + { + writer.WritePropertyName("query_type"u8); + writer.WriteStringValue(QueryType.Value.ToString()); + } + if (SerializedAdditionalRawData?.ContainsKey("semantic_configuration") != true && Optional.IsDefined(SemanticConfiguration)) + { + writer.WritePropertyName("semantic_configuration"u8); + writer.WriteStringValue(SemanticConfiguration); + } + if (SerializedAdditionalRawData?.ContainsKey("filter") != true && Optional.IsDefined(Filter)) + { + writer.WritePropertyName("filter"u8); + writer.WriteStringValue(Filter); + } + if (SerializedAdditionalRawData?.ContainsKey("embedding_dependency") != true && Optional.IsDefined(VectorizationSource)) + { + writer.WritePropertyName("embedding_dependency"u8); + writer.WriteObjectValue(VectorizationSource, options); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + 
InternalAzureSearchChatDataSourceParameters IJsonModel<InternalAzureSearchChatDataSourceParameters>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalAzureSearchChatDataSourceParameters>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalAzureSearchChatDataSourceParameters)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalAzureSearchChatDataSourceParameters(document.RootElement, options); + } + + internal static InternalAzureSearchChatDataSourceParameters DeserializeInternalAzureSearchChatDataSourceParameters(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + int? topNDocuments = default; + bool? inScope = default; + int? strictness = default; + string roleInformation = default; + int? maxSearchQueries = default; + bool? allowPartialResult = default; + IList<InternalAzureSearchChatDataSourceParametersIncludeContext> includeContexts = default; + Uri endpoint = default; + string indexName = default; + DataSourceAuthentication authentication = default; + DataSourceFieldMappings fieldsMapping = default; + DataSourceQueryType?
queryType = default; + string semanticConfiguration = default; + string filter = default; + DataSourceVectorizer embeddingDependency = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("top_n_documents"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + topNDocuments = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("in_scope"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + inScope = property.Value.GetBoolean(); + continue; + } + if (property.NameEquals("strictness"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + strictness = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("role_information"u8)) + { + roleInformation = property.Value.GetString(); + continue; + } + if (property.NameEquals("max_search_queries"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + maxSearchQueries = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("allow_partial_result"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + allowPartialResult = property.Value.GetBoolean(); + continue; + } + if (property.NameEquals("include_contexts"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + List array = new List(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(item.GetString()); + } + includeContexts = array; + continue; + } + if (property.NameEquals("endpoint"u8)) + { + endpoint = new Uri(property.Value.GetString()); + continue; + } + if (property.NameEquals("index_name"u8)) + { + indexName = property.Value.GetString(); + continue; + } + if (property.NameEquals("authentication"u8)) + { + authentication = 
DataSourceAuthentication.DeserializeDataSourceAuthentication(property.Value, options); + continue; + } + if (property.NameEquals("fields_mapping"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + fieldsMapping = DataSourceFieldMappings.DeserializeDataSourceFieldMappings(property.Value, options); + continue; + } + if (property.NameEquals("query_type"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + queryType = new DataSourceQueryType(property.Value.GetString()); + continue; + } + if (property.NameEquals("semantic_configuration"u8)) + { + semanticConfiguration = property.Value.GetString(); + continue; + } + if (property.NameEquals("filter"u8)) + { + filter = property.Value.GetString(); + continue; + } + if (property.NameEquals("embedding_dependency"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + embeddingDependency = DataSourceVectorizer.DeserializeDataSourceVectorizer(property.Value, options); + continue; + } + if (options.Format != "W") + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalAzureSearchChatDataSourceParameters( + topNDocuments, + inScope, + strictness, + roleInformation, + maxSearchQueries, + allowPartialResult, + includeContexts ?? new ChangeTrackingList(), + endpoint, + indexName, + authentication, + fieldsMapping, + queryType, + semanticConfiguration, + filter, + embeddingDependency, + serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalAzureSearchChatDataSourceParameters>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalAzureSearchChatDataSourceParameters)} does not support writing '{options.Format}' format."); + } + } + + InternalAzureSearchChatDataSourceParameters IPersistableModel<InternalAzureSearchChatDataSourceParameters>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalAzureSearchChatDataSourceParameters>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalAzureSearchChatDataSourceParameters(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalAzureSearchChatDataSourceParameters)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalAzureSearchChatDataSourceParameters>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + /// <summary> Deserializes the model from a raw response. </summary> + /// <param name="response"> The result to deserialize the model from. </param> + internal static InternalAzureSearchChatDataSourceParameters FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalAzureSearchChatDataSourceParameters(document.RootElement); + } + + /// <summary> Convert into a <see cref="BinaryContent"/>. </summary>
+ internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} + diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureSearchChatDataSourceParameters.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureSearchChatDataSourceParameters.cs new file mode 100644 index 000000000..470fb9c45 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureSearchChatDataSourceParameters.cs @@ -0,0 +1,164 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace Azure.AI.OpenAI.Chat +{ + /// The AzureSearchChatDataSourceParameters. + internal partial class InternalAzureSearchChatDataSourceParameters + { + /// + /// Keeps track of any properties unknown to the library. + /// + /// To assign an object to the value of this property use . + /// + /// + /// To assign an already formatted json string to this property use . + /// + /// + /// Examples: + /// + /// + /// BinaryData.FromObjectAsJson("foo") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromString("\"foo\"") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromObjectAsJson(new { key = "value" }) + /// Creates a payload of { "key": "value" }. + /// + /// + /// BinaryData.FromString("{\"key\": \"value\"}") + /// Creates a payload of { "key": "value" }. + /// + /// + /// + /// + internal IDictionary SerializedAdditionalRawData { get; set; } + /// Initializes a new instance of . + /// The absolute endpoint path for the Azure Search resource to use. + /// The name of the index to use, as specified in the Azure Search resource. + /// + /// The authentication mechanism to use with Azure Search. + /// Please note is the base class. According to the scenario, a derived class of the base class might need to be assigned here, or this property needs to be casted to one of the possible derived classes.. 
+ /// + /// , or is null. + internal InternalAzureSearchChatDataSourceParameters(Uri endpoint, string indexName, DataSourceAuthentication authentication) + { + Argument.AssertNotNull(endpoint, nameof(endpoint)); + Argument.AssertNotNull(indexName, nameof(indexName)); + Argument.AssertNotNull(authentication, nameof(authentication)); + + _internalIncludeContexts = new ChangeTrackingList(); + Endpoint = endpoint; + IndexName = indexName; + Authentication = authentication; + } + + /// Initializes a new instance of . + /// The configured number of documents to feature in the query. + /// Whether queries should be restricted to use of the indexed data. + /// + /// The configured strictness of the search relevance filtering. + /// Higher strictness will increase precision but lower recall of the answer. + /// + /// + /// Additional instructions for the model to inform how it should behave and any context it should reference when + /// generating a response. You can describe the assistant's personality and tell it how to format responses. + /// This is limited to 100 tokens and counts against the overall token limit. + /// + /// + /// The maximum number of rewritten queries that should be sent to the search provider for a single user message. + /// By default, the system will make an automatic determination. + /// + /// + /// If set to true, the system will allow partial search results to be used and the request will fail if all + /// partial queries fail. If not specified or specified as false, the request will fail if any search query fails. + /// + /// + /// The output context properties to include on the response. + /// By default, citations and intent will be requested. + /// + /// The absolute endpoint path for the Azure Search resource to use. + /// The name of the index to use, as specified in the Azure Search resource. + /// + /// The authentication mechanism to use with Azure Search. + /// Please note is the base class. 
According to the scenario, a derived class of the base class might need to be assigned here, or this property needs to be casted to one of the possible derived classes.. + /// + /// The field mappings to use with the Azure Search resource. + /// The query type for the Azure Search resource to use. + /// Additional semantic configuration for the query. + /// A filter to apply to the search. + /// + /// The vectorization source to use with Azure Search. + /// Supported sources for Azure Search include endpoint and deployment name. + /// Please note is the base class. According to the scenario, a derived class of the base class might need to be assigned here, or this property needs to be casted to one of the possible derived classes.. + /// + /// Keeps track of any properties unknown to the library. + internal InternalAzureSearchChatDataSourceParameters(int? topNDocuments, bool? inScope, int? strictness, string roleInformation, int? maxSearchQueries, bool? allowPartialResult, IList internalIncludeContexts, Uri endpoint, string indexName, DataSourceAuthentication authentication, DataSourceFieldMappings fieldMappings, DataSourceQueryType? queryType, string semanticConfiguration, string filter, DataSourceVectorizer vectorizationSource, IDictionary serializedAdditionalRawData) + { + TopNDocuments = topNDocuments; + InScope = inScope; + Strictness = strictness; + RoleInformation = roleInformation; + MaxSearchQueries = maxSearchQueries; + AllowPartialResult = allowPartialResult; + _internalIncludeContexts = internalIncludeContexts; + Endpoint = endpoint; + IndexName = indexName; + Authentication = authentication; + FieldMappings = fieldMappings; + QueryType = queryType; + SemanticConfiguration = semanticConfiguration; + Filter = filter; + VectorizationSource = vectorizationSource; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + /// Initializes a new instance of for deserialization. 
+ internal InternalAzureSearchChatDataSourceParameters() + { + } + + /// The configured number of documents to feature in the query. + internal int? TopNDocuments { get; set; } + /// Whether queries should be restricted to use of the indexed data. + internal bool? InScope { get; set; } + /// + /// The configured strictness of the search relevance filtering. + /// Higher strictness will increase precision but lower recall of the answer. + /// + internal int? Strictness { get; set; } + /// + /// Additional instructions for the model to inform how it should behave and any context it should reference when + /// generating a response. You can describe the assistant's personality and tell it how to format responses. + /// This is limited to 100 tokens and counts against the overall token limit. + /// + internal string RoleInformation { get; set; } + /// + /// The maximum number of rewritten queries that should be sent to the search provider for a single user message. + /// By default, the system will make an automatic determination. + /// + internal int? MaxSearchQueries { get; set; } + /// + /// If set to true, the system will allow partial search results to be used and the request will fail if all + /// partial queries fail. If not specified or specified as false, the request will fail if any search query fails. + /// + internal bool? AllowPartialResult { get; set; } + /// The absolute endpoint path for the Azure Search resource to use. + internal Uri Endpoint { get; set; } + /// The name of the index to use, as specified in the Azure Search resource. + internal string IndexName { get; set; } + /// Additional semantic configuration for the query. + internal string SemanticConfiguration { get; set; } + /// A filter to apply to the search. 
+ internal string Filter { get; set; } + } +} + diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureSearchChatDataSourceParametersIncludeContext.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureSearchChatDataSourceParametersIncludeContext.cs new file mode 100644 index 000000000..90e6d8676 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalAzureSearchChatDataSourceParametersIncludeContext.cs @@ -0,0 +1,52 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ComponentModel; + +namespace Azure.AI.OpenAI.Chat +{ + /// <summary> The AzureSearchChatDataSourceParametersIncludeContext. </summary> + internal readonly partial struct InternalAzureSearchChatDataSourceParametersIncludeContext : IEquatable<InternalAzureSearchChatDataSourceParametersIncludeContext> + { + private readonly string _value; + + /// <summary> Initializes a new instance of <see cref="InternalAzureSearchChatDataSourceParametersIncludeContext"/>. </summary> + /// <exception cref="ArgumentNullException"> <paramref name="value"/> is null. </exception> + public InternalAzureSearchChatDataSourceParametersIncludeContext(string value) + { + _value = value ?? throw new ArgumentNullException(nameof(value)); + } + + private const string CitationsValue = "citations"; + private const string IntentValue = "intent"; + private const string AllRetrievedDocumentsValue = "all_retrieved_documents"; + + /// <summary> citations. </summary> + internal static InternalAzureSearchChatDataSourceParametersIncludeContext Citations { get; } = new InternalAzureSearchChatDataSourceParametersIncludeContext(CitationsValue); + /// <summary> intent. </summary> + internal static InternalAzureSearchChatDataSourceParametersIncludeContext Intent { get; } = new InternalAzureSearchChatDataSourceParametersIncludeContext(IntentValue); + /// <summary> all_retrieved_documents. </summary> + internal static InternalAzureSearchChatDataSourceParametersIncludeContext AllRetrievedDocuments { get; } = new InternalAzureSearchChatDataSourceParametersIncludeContext(AllRetrievedDocumentsValue); + /// <summary> Determines if two <see cref="InternalAzureSearchChatDataSourceParametersIncludeContext"/> values are the same. </summary>
+ public static bool operator ==(InternalAzureSearchChatDataSourceParametersIncludeContext left, InternalAzureSearchChatDataSourceParametersIncludeContext right) => left.Equals(right); + /// <summary> Determines if two <see cref="InternalAzureSearchChatDataSourceParametersIncludeContext"/> values are not the same. </summary> + public static bool operator !=(InternalAzureSearchChatDataSourceParametersIncludeContext left, InternalAzureSearchChatDataSourceParametersIncludeContext right) => !left.Equals(right); + /// <summary> Converts a string to a <see cref="InternalAzureSearchChatDataSourceParametersIncludeContext"/>. </summary> + public static implicit operator InternalAzureSearchChatDataSourceParametersIncludeContext(string value) => new InternalAzureSearchChatDataSourceParametersIncludeContext(value); + + /// <inheritdoc/> + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalAzureSearchChatDataSourceParametersIncludeContext other && Equals(other); + /// <inheritdoc/> + public bool Equals(InternalAzureSearchChatDataSourceParametersIncludeContext other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + /// <inheritdoc/> + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ?
StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + /// + public override string ToString() => _value; + } +} + diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalElasticsearchChatDataSourceParameters.Serialization.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalElasticsearchChatDataSourceParameters.Serialization.cs new file mode 100644 index 000000000..488442d4b --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalElasticsearchChatDataSourceParameters.Serialization.cs @@ -0,0 +1,329 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace Azure.AI.OpenAI.Chat +{ + internal partial class InternalElasticsearchChatDataSourceParameters : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalElasticsearchChatDataSourceParameters)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("top_n_documents") != true && Optional.IsDefined(TopNDocuments)) + { + writer.WritePropertyName("top_n_documents"u8); + writer.WriteNumberValue(TopNDocuments.Value); + } + if (SerializedAdditionalRawData?.ContainsKey("in_scope") != true && Optional.IsDefined(InScope)) + { + writer.WritePropertyName("in_scope"u8); + writer.WriteBooleanValue(InScope.Value); + } + if (SerializedAdditionalRawData?.ContainsKey("strictness") != true && Optional.IsDefined(Strictness)) + { + writer.WritePropertyName("strictness"u8); + writer.WriteNumberValue(Strictness.Value); + } + if (SerializedAdditionalRawData?.ContainsKey("role_information") != true && 
Optional.IsDefined(RoleInformation)) + { + writer.WritePropertyName("role_information"u8); + writer.WriteStringValue(RoleInformation); + } + if (SerializedAdditionalRawData?.ContainsKey("max_search_queries") != true && Optional.IsDefined(MaxSearchQueries)) + { + writer.WritePropertyName("max_search_queries"u8); + writer.WriteNumberValue(MaxSearchQueries.Value); + } + if (SerializedAdditionalRawData?.ContainsKey("allow_partial_result") != true && Optional.IsDefined(AllowPartialResult)) + { + writer.WritePropertyName("allow_partial_result"u8); + writer.WriteBooleanValue(AllowPartialResult.Value); + } + if (SerializedAdditionalRawData?.ContainsKey("include_contexts") != true && Optional.IsCollectionDefined(_internalIncludeContexts)) + { + writer.WritePropertyName("include_contexts"u8); + writer.WriteStartArray(); + foreach (var item in _internalIncludeContexts) + { + writer.WriteStringValue(item); + } + writer.WriteEndArray(); + } + if (SerializedAdditionalRawData?.ContainsKey("endpoint") != true) + { + writer.WritePropertyName("endpoint"u8); + writer.WriteStringValue(Endpoint.AbsoluteUri); + } + if (SerializedAdditionalRawData?.ContainsKey("index_name") != true) + { + writer.WritePropertyName("index_name"u8); + writer.WriteStringValue(IndexName); + } + if (SerializedAdditionalRawData?.ContainsKey("authentication") != true) + { + writer.WritePropertyName("authentication"u8); + writer.WriteObjectValue(Authentication, options); + } + if (SerializedAdditionalRawData?.ContainsKey("fields_mapping") != true && Optional.IsDefined(FieldMappings)) + { + writer.WritePropertyName("fields_mapping"u8); + writer.WriteObjectValue(FieldMappings, options); + } + if (SerializedAdditionalRawData?.ContainsKey("query_type") != true && Optional.IsDefined(QueryType)) + { + writer.WritePropertyName("query_type"u8); + writer.WriteStringValue(QueryType.Value.ToString()); + } + if (SerializedAdditionalRawData?.ContainsKey("embedding_dependency") != true && 
Optional.IsDefined(VectorizationSource)) + { + writer.WritePropertyName("embedding_dependency"u8); + writer.WriteObjectValue(VectorizationSource, options); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalElasticsearchChatDataSourceParameters IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalElasticsearchChatDataSourceParameters)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalElasticsearchChatDataSourceParameters(document.RootElement, options); + } + + internal static InternalElasticsearchChatDataSourceParameters DeserializeInternalElasticsearchChatDataSourceParameters(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + int? topNDocuments = default; + bool? inScope = default; + int? strictness = default; + string roleInformation = default; + int? maxSearchQueries = default; + bool? allowPartialResult = default; + IList includeContexts = default; + Uri endpoint = default; + string indexName = default; + DataSourceAuthentication authentication = default; + DataSourceFieldMappings fieldsMapping = default; + DataSourceQueryType? 
queryType = default; + DataSourceVectorizer embeddingDependency = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("top_n_documents"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + topNDocuments = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("in_scope"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + inScope = property.Value.GetBoolean(); + continue; + } + if (property.NameEquals("strictness"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + strictness = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("role_information"u8)) + { + roleInformation = property.Value.GetString(); + continue; + } + if (property.NameEquals("max_search_queries"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + maxSearchQueries = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("allow_partial_result"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + allowPartialResult = property.Value.GetBoolean(); + continue; + } + if (property.NameEquals("include_contexts"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + List array = new List(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(item.GetString()); + } + includeContexts = array; + continue; + } + if (property.NameEquals("endpoint"u8)) + { + endpoint = new Uri(property.Value.GetString()); + continue; + } + if (property.NameEquals("index_name"u8)) + { + indexName = property.Value.GetString(); + continue; + } + if (property.NameEquals("authentication"u8)) + { + authentication = DataSourceAuthentication.DeserializeDataSourceAuthentication(property.Value, options); + continue; + } + if 
(property.NameEquals("fields_mapping"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + fieldsMapping = DataSourceFieldMappings.DeserializeDataSourceFieldMappings(property.Value, options); + continue; + } + if (property.NameEquals("query_type"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + queryType = new DataSourceQueryType(property.Value.GetString()); + continue; + } + if (property.NameEquals("embedding_dependency"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + embeddingDependency = DataSourceVectorizer.DeserializeDataSourceVectorizer(property.Value, options); + continue; + } + if (options.Format != "W") + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalElasticsearchChatDataSourceParameters( + topNDocuments, + inScope, + strictness, + roleInformation, + maxSearchQueries, + allowPartialResult, + includeContexts ?? new ChangeTrackingList(), + endpoint, + indexName, + authentication, + fieldsMapping, + queryType, + embeddingDependency, + serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalElasticsearchChatDataSourceParameters)} does not support writing '{options.Format}' format."); + } + } + + InternalElasticsearchChatDataSourceParameters IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalElasticsearchChatDataSourceParameters(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalElasticsearchChatDataSourceParameters)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + /// Deserializes the model from a raw response. + /// The result to deserialize the model from. + internal static InternalElasticsearchChatDataSourceParameters FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalElasticsearchChatDataSourceParameters(document.RootElement); + } + + /// Convert into a . + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} + diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalElasticsearchChatDataSourceParameters.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalElasticsearchChatDataSourceParameters.cs new file mode 100644 index 000000000..fdfc831ab --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalElasticsearchChatDataSourceParameters.cs @@ -0,0 +1,152 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace Azure.AI.OpenAI.Chat +{ + /// The ElasticsearchChatDataSourceParameters. + internal partial class InternalElasticsearchChatDataSourceParameters + { + /// + /// Keeps track of any properties unknown to the library. + /// + /// To assign an object to the value of this property use . + /// + /// + /// To assign an already formatted json string to this property use . 
+ /// + /// + /// Examples: + /// + /// + /// BinaryData.FromObjectAsJson("foo") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromString("\"foo\"") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromObjectAsJson(new { key = "value" }) + /// Creates a payload of { "key": "value" }. + /// + /// + /// BinaryData.FromString("{\"key\": \"value\"}") + /// Creates a payload of { "key": "value" }. + /// + /// + /// + /// + internal IDictionary SerializedAdditionalRawData { get; set; } + /// Initializes a new instance of . + /// + /// + /// + /// Please note is the base class. According to the scenario, a derived class of the base class might need to be assigned here, or this property needs to be casted to one of the possible derived classes.. + /// + /// , or is null. + internal InternalElasticsearchChatDataSourceParameters(Uri endpoint, string indexName, DataSourceAuthentication authentication) + { + Argument.AssertNotNull(endpoint, nameof(endpoint)); + Argument.AssertNotNull(indexName, nameof(indexName)); + Argument.AssertNotNull(authentication, nameof(authentication)); + + _internalIncludeContexts = new ChangeTrackingList(); + Endpoint = endpoint; + IndexName = indexName; + Authentication = authentication; + } + + /// Initializes a new instance of . + /// The configured number of documents to feature in the query. + /// Whether queries should be restricted to use of the indexed data. + /// + /// The configured strictness of the search relevance filtering. + /// Higher strictness will increase precision but lower recall of the answer. + /// + /// + /// Additional instructions for the model to inform how it should behave and any context it should reference when + /// generating a response. You can describe the assistant's personality and tell it how to format responses. + /// This is limited to 100 tokens and counts against the overall token limit. 
+ /// + /// + /// The maximum number of rewritten queries that should be sent to the search provider for a single user message. + /// By default, the system will make an automatic determination. + /// + /// + /// If set to true, the system will allow partial search results to be used and the request will fail if all + /// partial queries fail. If not specified or specified as false, the request will fail if any search query fails. + /// + /// + /// The output context properties to include on the response. + /// By default, citations and intent will be requested. + /// + /// + /// + /// + /// Please note is the base class. According to the scenario, a derived class of the base class might need to be assigned here, or this property needs to be casted to one of the possible derived classes.. + /// + /// + /// + /// + /// Please note is the base class. According to the scenario, a derived class of the base class might need to be assigned here, or this property needs to be casted to one of the possible derived classes.. + /// + /// Keeps track of any properties unknown to the library. + internal InternalElasticsearchChatDataSourceParameters(int? topNDocuments, bool? inScope, int? strictness, string roleInformation, int? maxSearchQueries, bool? allowPartialResult, IList internalIncludeContexts, Uri endpoint, string indexName, DataSourceAuthentication authentication, DataSourceFieldMappings fieldMappings, DataSourceQueryType? 
queryType, DataSourceVectorizer vectorizationSource, IDictionary serializedAdditionalRawData) + { + TopNDocuments = topNDocuments; + InScope = inScope; + Strictness = strictness; + RoleInformation = roleInformation; + MaxSearchQueries = maxSearchQueries; + AllowPartialResult = allowPartialResult; + _internalIncludeContexts = internalIncludeContexts; + Endpoint = endpoint; + IndexName = indexName; + Authentication = authentication; + FieldMappings = fieldMappings; + QueryType = queryType; + VectorizationSource = vectorizationSource; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + /// Initializes a new instance of for deserialization. + internal InternalElasticsearchChatDataSourceParameters() + { + } + + /// The configured number of documents to feature in the query. + internal int? TopNDocuments { get; set; } + /// Whether queries should be restricted to use of the indexed data. + internal bool? InScope { get; set; } + /// + /// The configured strictness of the search relevance filtering. + /// Higher strictness will increase precision but lower recall of the answer. + /// + internal int? Strictness { get; set; } + /// + /// Additional instructions for the model to inform how it should behave and any context it should reference when + /// generating a response. You can describe the assistant's personality and tell it how to format responses. + /// This is limited to 100 tokens and counts against the overall token limit. + /// + internal string RoleInformation { get; set; } + /// + /// The maximum number of rewritten queries that should be sent to the search provider for a single user message. + /// By default, the system will make an automatic determination. + /// + internal int? MaxSearchQueries { get; set; } + /// + /// If set to true, the system will allow partial search results to be used and the request will fail if all + /// partial queries fail. If not specified or specified as false, the request will fail if any search query fails. 
+        /// </summary>
+        internal bool? AllowPartialResult { get; set; }
+        /// <summary> Gets the endpoint. </summary>
+        internal Uri Endpoint { get; set; }
+        /// <summary> Gets the index name. </summary>
+        internal string IndexName { get; set; }
+    }
+}
+
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalPineconeChatDataSourceParameters.Serialization.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalPineconeChatDataSourceParameters.Serialization.cs
new file mode 100644
index 000000000..e7acec9ad
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalPineconeChatDataSourceParameters.Serialization.cs
@@ -0,0 +1,305 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace Azure.AI.OpenAI.Chat
+{
+    internal partial class InternalPineconeChatDataSourceParameters : IJsonModel<InternalPineconeChatDataSourceParameters>
+    {
+        void IJsonModel<InternalPineconeChatDataSourceParameters>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ?
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalPineconeChatDataSourceParameters)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("top_n_documents") != true && Optional.IsDefined(TopNDocuments)) + { + writer.WritePropertyName("top_n_documents"u8); + writer.WriteNumberValue(TopNDocuments.Value); + } + if (SerializedAdditionalRawData?.ContainsKey("in_scope") != true && Optional.IsDefined(InScope)) + { + writer.WritePropertyName("in_scope"u8); + writer.WriteBooleanValue(InScope.Value); + } + if (SerializedAdditionalRawData?.ContainsKey("strictness") != true && Optional.IsDefined(Strictness)) + { + writer.WritePropertyName("strictness"u8); + writer.WriteNumberValue(Strictness.Value); + } + if (SerializedAdditionalRawData?.ContainsKey("role_information") != true && Optional.IsDefined(RoleInformation)) + { + writer.WritePropertyName("role_information"u8); + writer.WriteStringValue(RoleInformation); + } + if (SerializedAdditionalRawData?.ContainsKey("max_search_queries") != true && Optional.IsDefined(MaxSearchQueries)) + { + writer.WritePropertyName("max_search_queries"u8); + writer.WriteNumberValue(MaxSearchQueries.Value); + } + if (SerializedAdditionalRawData?.ContainsKey("allow_partial_result") != true && Optional.IsDefined(AllowPartialResult)) + { + writer.WritePropertyName("allow_partial_result"u8); + writer.WriteBooleanValue(AllowPartialResult.Value); + } + if (SerializedAdditionalRawData?.ContainsKey("include_contexts") != true && Optional.IsCollectionDefined(_internalIncludeContexts)) + { + writer.WritePropertyName("include_contexts"u8); + writer.WriteStartArray(); + foreach (var item in _internalIncludeContexts) + { + writer.WriteStringValue(item); + } + writer.WriteEndArray(); + } + if (SerializedAdditionalRawData?.ContainsKey("environment") != true) + { + 
writer.WritePropertyName("environment"u8); + writer.WriteStringValue(Environment); + } + if (SerializedAdditionalRawData?.ContainsKey("index_name") != true) + { + writer.WritePropertyName("index_name"u8); + writer.WriteStringValue(IndexName); + } + if (SerializedAdditionalRawData?.ContainsKey("authentication") != true) + { + writer.WritePropertyName("authentication"u8); + writer.WriteObjectValue(Authentication, options); + } + if (SerializedAdditionalRawData?.ContainsKey("embedding_dependency") != true) + { + writer.WritePropertyName("embedding_dependency"u8); + writer.WriteObjectValue(VectorizationSource, options); + } + if (SerializedAdditionalRawData?.ContainsKey("fields_mapping") != true) + { + writer.WritePropertyName("fields_mapping"u8); + writer.WriteObjectValue(FieldMappings, options); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalPineconeChatDataSourceParameters IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalPineconeChatDataSourceParameters)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalPineconeChatDataSourceParameters(document.RootElement, options); + } + + internal static InternalPineconeChatDataSourceParameters DeserializeInternalPineconeChatDataSourceParameters(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + int? topNDocuments = default; + bool? inScope = default; + int? strictness = default; + string roleInformation = default; + int? maxSearchQueries = default; + bool? allowPartialResult = default; + IList includeContexts = default; + string environment = default; + string indexName = default; + DataSourceAuthentication authentication = default; + DataSourceVectorizer embeddingDependency = default; + DataSourceFieldMappings fieldsMapping = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("top_n_documents"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + topNDocuments = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("in_scope"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + inScope = property.Value.GetBoolean(); + continue; + } + if (property.NameEquals("strictness"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + strictness = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("role_information"u8)) + { + roleInformation = property.Value.GetString(); + continue; + } + if 
(property.NameEquals("max_search_queries"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + maxSearchQueries = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("allow_partial_result"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + allowPartialResult = property.Value.GetBoolean(); + continue; + } + if (property.NameEquals("include_contexts"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + List array = new List(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(item.GetString()); + } + includeContexts = array; + continue; + } + if (property.NameEquals("environment"u8)) + { + environment = property.Value.GetString(); + continue; + } + if (property.NameEquals("index_name"u8)) + { + indexName = property.Value.GetString(); + continue; + } + if (property.NameEquals("authentication"u8)) + { + authentication = DataSourceAuthentication.DeserializeDataSourceAuthentication(property.Value, options); + continue; + } + if (property.NameEquals("embedding_dependency"u8)) + { + embeddingDependency = DataSourceVectorizer.DeserializeDataSourceVectorizer(property.Value, options); + continue; + } + if (property.NameEquals("fields_mapping"u8)) + { + fieldsMapping = DataSourceFieldMappings.DeserializeDataSourceFieldMappings(property.Value, options); + continue; + } + if (options.Format != "W") + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalPineconeChatDataSourceParameters( + topNDocuments, + inScope, + strictness, + roleInformation, + maxSearchQueries, + allowPartialResult, + includeContexts ?? 
new ChangeTrackingList<string>(),
+                environment,
+                indexName,
+                authentication,
+                embeddingDependency,
+                fieldsMapping,
+                serializedAdditionalRawData);
+        }
+
+        BinaryData IPersistableModel<InternalPineconeChatDataSourceParameters>.Write(ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalPineconeChatDataSourceParameters>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    return ModelReaderWriter.Write(this, options);
+                default:
+                    throw new FormatException($"The model {nameof(InternalPineconeChatDataSourceParameters)} does not support writing '{options.Format}' format.");
+            }
+        }
+
+        InternalPineconeChatDataSourceParameters IPersistableModel<InternalPineconeChatDataSourceParameters>.Create(BinaryData data, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalPineconeChatDataSourceParameters>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    {
+                        using JsonDocument document = JsonDocument.Parse(data);
+                        return DeserializeInternalPineconeChatDataSourceParameters(document.RootElement, options);
+                    }
+                default:
+                    throw new FormatException($"The model {nameof(InternalPineconeChatDataSourceParameters)} does not support reading '{options.Format}' format.");
+            }
+        }
+
+        string IPersistableModel<InternalPineconeChatDataSourceParameters>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J";
+
+        /// <summary> Deserializes the model from a raw response. </summary>
+        /// <param name="response"> The result to deserialize the model from. </param>
+        internal static InternalPineconeChatDataSourceParameters FromResponse(PipelineResponse response)
+        {
+            using var document = JsonDocument.Parse(response.Content);
+            return DeserializeInternalPineconeChatDataSourceParameters(document.RootElement);
+        }
+
+        /// <summary> Convert into a <see cref="BinaryContent"/>. </summary>
+ internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} + diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalPineconeChatDataSourceParameters.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalPineconeChatDataSourceParameters.cs new file mode 100644 index 000000000..2e19d59b5 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalPineconeChatDataSourceParameters.cs @@ -0,0 +1,172 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace Azure.AI.OpenAI.Chat +{ + /// The PineconeChatDataSourceParameters. + internal partial class InternalPineconeChatDataSourceParameters + { + /// + /// Keeps track of any properties unknown to the library. + /// + /// To assign an object to the value of this property use . + /// + /// + /// To assign an already formatted json string to this property use . + /// + /// + /// Examples: + /// + /// + /// BinaryData.FromObjectAsJson("foo") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromString("\"foo\"") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromObjectAsJson(new { key = "value" }) + /// Creates a payload of { "key": "value" }. + /// + /// + /// BinaryData.FromString("{\"key\": \"value\"}") + /// Creates a payload of { "key": "value" }. + /// + /// + /// + /// + internal IDictionary SerializedAdditionalRawData { get; set; } + /// Initializes a new instance of . + /// The environment name to use with Pinecone. + /// The name of the Pinecone database index to use. + /// + /// The authentication mechanism to use with Pinecone. + /// Supported authentication mechanisms for Pinecone include: API key. + /// Please note is the base class. According to the scenario, a derived class of the base class might need to be assigned here, or this property needs to be casted to one of the possible derived classes.. 
+ /// + /// + /// The vectorization source to use as an embedding dependency for the Pinecone data source. + /// Supported vectorization sources for Pinecone include: deployment name. + /// Please note is the base class. According to the scenario, a derived class of the base class might need to be assigned here, or this property needs to be casted to one of the possible derived classes.. + /// + /// + /// Field mappings to apply to data used by the Pinecone data source. + /// Note that content field mappings are required for Pinecone. + /// + /// , , , or is null. + internal InternalPineconeChatDataSourceParameters(string environment, string indexName, DataSourceAuthentication authentication, DataSourceVectorizer vectorizationSource, DataSourceFieldMappings fieldMappings) + { + Argument.AssertNotNull(environment, nameof(environment)); + Argument.AssertNotNull(indexName, nameof(indexName)); + Argument.AssertNotNull(authentication, nameof(authentication)); + Argument.AssertNotNull(vectorizationSource, nameof(vectorizationSource)); + Argument.AssertNotNull(fieldMappings, nameof(fieldMappings)); + + _internalIncludeContexts = new ChangeTrackingList(); + Environment = environment; + IndexName = indexName; + Authentication = authentication; + VectorizationSource = vectorizationSource; + FieldMappings = fieldMappings; + } + + /// Initializes a new instance of . + /// The configured number of documents to feature in the query. + /// Whether queries should be restricted to use of the indexed data. + /// + /// The configured strictness of the search relevance filtering. + /// Higher strictness will increase precision but lower recall of the answer. + /// + /// + /// Additional instructions for the model to inform how it should behave and any context it should reference when + /// generating a response. You can describe the assistant's personality and tell it how to format responses. + /// This is limited to 100 tokens and counts against the overall token limit. 
+ /// + /// + /// The maximum number of rewritten queries that should be sent to the search provider for a single user message. + /// By default, the system will make an automatic determination. + /// + /// + /// If set to true, the system will allow partial search results to be used and the request will fail if all + /// partial queries fail. If not specified or specified as false, the request will fail if any search query fails. + /// + /// + /// The output context properties to include on the response. + /// By default, citations and intent will be requested. + /// + /// The environment name to use with Pinecone. + /// The name of the Pinecone database index to use. + /// + /// The authentication mechanism to use with Pinecone. + /// Supported authentication mechanisms for Pinecone include: API key. + /// Please note is the base class. According to the scenario, a derived class of the base class might need to be assigned here, or this property needs to be casted to one of the possible derived classes.. + /// + /// + /// The vectorization source to use as an embedding dependency for the Pinecone data source. + /// Supported vectorization sources for Pinecone include: deployment name. + /// Please note is the base class. According to the scenario, a derived class of the base class might need to be assigned here, or this property needs to be casted to one of the possible derived classes.. + /// + /// + /// Field mappings to apply to data used by the Pinecone data source. + /// Note that content field mappings are required for Pinecone. + /// + /// Keeps track of any properties unknown to the library. + internal InternalPineconeChatDataSourceParameters(int? topNDocuments, bool? inScope, int? strictness, string roleInformation, int? maxSearchQueries, bool? 
allowPartialResult, IList internalIncludeContexts, string environment, string indexName, DataSourceAuthentication authentication, DataSourceVectorizer vectorizationSource, DataSourceFieldMappings fieldMappings, IDictionary serializedAdditionalRawData) + { + TopNDocuments = topNDocuments; + InScope = inScope; + Strictness = strictness; + RoleInformation = roleInformation; + MaxSearchQueries = maxSearchQueries; + AllowPartialResult = allowPartialResult; + _internalIncludeContexts = internalIncludeContexts; + Environment = environment; + IndexName = indexName; + Authentication = authentication; + VectorizationSource = vectorizationSource; + FieldMappings = fieldMappings; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + /// Initializes a new instance of for deserialization. + internal InternalPineconeChatDataSourceParameters() + { + } + + /// The configured number of documents to feature in the query. + internal int? TopNDocuments { get; set; } + /// Whether queries should be restricted to use of the indexed data. + internal bool? InScope { get; set; } + /// + /// The configured strictness of the search relevance filtering. + /// Higher strictness will increase precision but lower recall of the answer. + /// + internal int? Strictness { get; set; } + /// + /// Additional instructions for the model to inform how it should behave and any context it should reference when + /// generating a response. You can describe the assistant's personality and tell it how to format responses. + /// This is limited to 100 tokens and counts against the overall token limit. + /// + internal string RoleInformation { get; set; } + /// + /// The maximum number of rewritten queries that should be sent to the search provider for a single user message. + /// By default, the system will make an automatic determination. + /// + internal int? 
MaxSearchQueries { get; set; } + /// + /// If set to true, the system will allow partial search results to be used and the request will fail if all + /// partial queries fail. If not specified or specified as false, the request will fail if any search query fails. + /// + internal bool? AllowPartialResult { get; set; } + /// The environment name to use with Pinecone. + internal string Environment { get; set; } + /// The name of the Pinecone database index to use. + internal string IndexName { get; set; } + } +} + diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalUnknownAzureChatDataSource.Serialization.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalUnknownAzureChatDataSource.Serialization.cs new file mode 100644 index 000000000..140d7777d --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalUnknownAzureChatDataSource.Serialization.cs @@ -0,0 +1,137 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace Azure.AI.OpenAI.Chat +{ + internal partial class InternalUnknownAzureChatDataSource : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(AzureChatDataSource)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + AzureChatDataSource IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(AzureChatDataSource)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeAzureChatDataSource(document.RootElement, options); + } + + internal static InternalUnknownAzureChatDataSource DeserializeInternalUnknownAzureChatDataSource(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string type = "Unknown"; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (options.Format != "W") + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalUnknownAzureChatDataSource(type, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(AzureChatDataSource)} does not support writing '{options.Format}' format."); + } + } + + AzureChatDataSource IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<AzureChatDataSource>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    {
+                        using JsonDocument document = JsonDocument.Parse(data);
+                        return DeserializeAzureChatDataSource(document.RootElement, options);
+                    }
+                default:
+                    throw new FormatException($"The model {nameof(AzureChatDataSource)} does not support reading '{options.Format}' format.");
+            }
+        }
+
+        string IPersistableModel<AzureChatDataSource>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J";
+
+        /// <summary> Deserializes the model from a raw response. </summary>
+        /// <param name="response"> The result to deserialize the model from. </param>
+        internal static new InternalUnknownAzureChatDataSource FromResponse(PipelineResponse response)
+        {
+            using var document = JsonDocument.Parse(response.Content);
+            return DeserializeInternalUnknownAzureChatDataSource(document.RootElement);
+        }
+
+        /// <summary> Convert into a <see cref="BinaryContent"/>. </summary>
+        internal override BinaryContent ToBinaryContent()
+        {
+            return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions);
+        }
+    }
+}
+
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalUnknownAzureChatDataSource.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalUnknownAzureChatDataSource.cs
new file mode 100644
index 000000000..551c0d6dd
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalUnknownAzureChatDataSource.cs
@@ -0,0 +1,26 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.Collections.Generic;
+
+namespace Azure.AI.OpenAI.Chat
+{
+    /// <summary> Unknown version of AzureChatDataSource. </summary>
+    internal partial class InternalUnknownAzureChatDataSource : AzureChatDataSource
+    {
+        /// <summary> Initializes a new instance of <see cref="InternalUnknownAzureChatDataSource"/>. </summary>
+        /// <param name="type"> The differentiating type identifier for the data source. </param>
+        /// <param name="serializedAdditionalRawData"> Keeps track of any properties unknown to the library. </param>
+        internal InternalUnknownAzureChatDataSource(string type, IDictionary<string, BinaryData> serializedAdditionalRawData) : base(type, serializedAdditionalRawData)
+        {
+        }
+
+        /// <summary> Initializes a new instance of <see cref="InternalUnknownAzureChatDataSource"/> for deserialization. </summary>
+        internal InternalUnknownAzureChatDataSource()
+        {
+        }
+    }
+}
+
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalUnknownAzureChatDataSourceAuthenticationOptions.Serialization.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalUnknownAzureChatDataSourceAuthenticationOptions.Serialization.cs
new file mode 100644
index 000000000..a852533c4
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalUnknownAzureChatDataSourceAuthenticationOptions.Serialization.cs
@@ -0,0 +1,137 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace Azure.AI.OpenAI.Chat
+{
+    internal partial class InternalUnknownAzureChatDataSourceAuthenticationOptions : IJsonModel<DataSourceAuthentication>
+    {
+        void IJsonModel<DataSourceAuthentication>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<DataSourceAuthentication>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(DataSourceAuthentication)} does not support writing '{format}' format.");
+            }
+
+            writer.WriteStartObject();
+            if (SerializedAdditionalRawData?.ContainsKey("type") != true)
+            {
+                writer.WritePropertyName("type"u8);
+                writer.WriteStringValue(Type);
+            }
+            if (SerializedAdditionalRawData != null)
+            {
+                foreach (var item in SerializedAdditionalRawData)
+                {
+                    if (ModelSerializationExtensions.IsSentinelValue(item.Value))
+                    {
+                        continue;
+                    }
+                    writer.WritePropertyName(item.Key);
+#if NET6_0_OR_GREATER
+                    writer.WriteRawValue(item.Value);
+#else
+                    using (JsonDocument document = JsonDocument.Parse(item.Value))
+                    {
+                        JsonSerializer.Serialize(writer, document.RootElement);
+                    }
+#endif
+                }
+            }
+            writer.WriteEndObject();
+        }
+
+        DataSourceAuthentication IJsonModel<DataSourceAuthentication>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ?
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(DataSourceAuthentication)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeDataSourceAuthentication(document.RootElement, options); + } + + internal static InternalUnknownAzureChatDataSourceAuthenticationOptions DeserializeInternalUnknownAzureChatDataSourceAuthenticationOptions(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string type = "Unknown"; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (options.Format != "W") + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalUnknownAzureChatDataSourceAuthenticationOptions(type, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(DataSourceAuthentication)} does not support writing '{options.Format}' format."); + } + } + + DataSourceAuthentication IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeDataSourceAuthentication(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(DataSourceAuthentication)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + /// Deserializes the model from a raw response. + /// The result to deserialize the model from. + internal static new InternalUnknownAzureChatDataSourceAuthenticationOptions FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalUnknownAzureChatDataSourceAuthenticationOptions(document.RootElement); + } + + /// Convert into a . + internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} + diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalUnknownAzureChatDataSourceAuthenticationOptions.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalUnknownAzureChatDataSourceAuthenticationOptions.cs new file mode 100644 index 000000000..183368888 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalUnknownAzureChatDataSourceAuthenticationOptions.cs @@ -0,0 +1,26 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace Azure.AI.OpenAI.Chat +{ + /// Unknown version of AzureChatDataSourceAuthenticationOptions. + internal partial class InternalUnknownAzureChatDataSourceAuthenticationOptions : DataSourceAuthentication + { + /// Initializes a new instance of . + /// Discriminator. + /// Keeps track of any properties unknown to the library. 
+        internal InternalUnknownAzureChatDataSourceAuthenticationOptions(string type, IDictionary<string, BinaryData> serializedAdditionalRawData) : base(type, serializedAdditionalRawData)
+        {
+        }
+
+        /// <summary> Initializes a new instance of <see cref="InternalUnknownAzureChatDataSourceAuthenticationOptions"/> for deserialization. </summary>
+        internal InternalUnknownAzureChatDataSourceAuthenticationOptions()
+        {
+        }
+    }
+}
+
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalUnknownAzureChatDataSourceVectorizationSource.Serialization.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalUnknownAzureChatDataSourceVectorizationSource.Serialization.cs
new file mode 100644
index 000000000..ed6bccf0c
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalUnknownAzureChatDataSourceVectorizationSource.Serialization.cs
@@ -0,0 +1,137 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace Azure.AI.OpenAI.Chat
+{
+    internal partial class InternalUnknownAzureChatDataSourceVectorizationSource : IJsonModel<DataSourceVectorizer>
+    {
+        void IJsonModel<DataSourceVectorizer>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ?
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(DataSourceVectorizer)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + DataSourceVectorizer IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(DataSourceVectorizer)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeDataSourceVectorizer(document.RootElement, options); + } + + internal static InternalUnknownAzureChatDataSourceVectorizationSource DeserializeInternalUnknownAzureChatDataSourceVectorizationSource(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string type = "Unknown"; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (options.Format != "W") + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalUnknownAzureChatDataSourceVectorizationSource(type, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(DataSourceVectorizer)} does not support writing '{options.Format}' format."); + } + } + + DataSourceVectorizer IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeDataSourceVectorizer(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(DataSourceVectorizer)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + /// Deserializes the model from a raw response. + /// The result to deserialize the model from. + internal static new InternalUnknownAzureChatDataSourceVectorizationSource FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalUnknownAzureChatDataSourceVectorizationSource(document.RootElement); + } + + /// Convert into a . + internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} + diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalUnknownAzureChatDataSourceVectorizationSource.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalUnknownAzureChatDataSourceVectorizationSource.cs new file mode 100644 index 000000000..56560cf74 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/InternalUnknownAzureChatDataSourceVectorizationSource.cs @@ -0,0 +1,26 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace Azure.AI.OpenAI.Chat +{ + /// Unknown version of AzureChatDataSourceVectorizationSource. + internal partial class InternalUnknownAzureChatDataSourceVectorizationSource : DataSourceVectorizer + { + /// Initializes a new instance of . + /// The differentiating identifier for the concrete vectorization source. + /// Keeps track of any properties unknown to the library. 
+        internal InternalUnknownAzureChatDataSourceVectorizationSource(string type, IDictionary<string, BinaryData> serializedAdditionalRawData) : base(type, serializedAdditionalRawData)
+        {
+        }
+
+        /// <summary> Initializes a new instance of <see cref="InternalUnknownAzureChatDataSourceVectorizationSource"/> for deserialization. </summary>
+        internal InternalUnknownAzureChatDataSourceVectorizationSource()
+        {
+        }
+    }
+}
+
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/PineconeChatDataSource.Serialization.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/PineconeChatDataSource.Serialization.cs
new file mode 100644
index 000000000..ca7bb2911
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/PineconeChatDataSource.Serialization.cs
@@ -0,0 +1,147 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace Azure.AI.OpenAI.Chat
+{
+    public partial class PineconeChatDataSource : IJsonModel<PineconeChatDataSource>
+    {
+        void IJsonModel<PineconeChatDataSource>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ?
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(PineconeChatDataSource)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("parameters") != true) + { + writer.WritePropertyName("parameters"u8); + writer.WriteObjectValue(InternalParameters, options); + } + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + PineconeChatDataSource IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(PineconeChatDataSource)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializePineconeChatDataSource(document.RootElement, options); + } + + internal static PineconeChatDataSource DeserializePineconeChatDataSource(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + InternalPineconeChatDataSourceParameters parameters = default; + string type = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("parameters"u8)) + { + parameters = InternalPineconeChatDataSourceParameters.DeserializeInternalPineconeChatDataSourceParameters(property.Value, options); + continue; + } + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (options.Format != "W") + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new PineconeChatDataSource(type, serializedAdditionalRawData, parameters); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(PineconeChatDataSource)} does not support writing '{options.Format}' format."); + } + } + + PineconeChatDataSource IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializePineconeChatDataSource(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(PineconeChatDataSource)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + /// Deserializes the model from a raw response. + /// The result to deserialize the model from. + internal static new PineconeChatDataSource FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializePineconeChatDataSource(document.RootElement); + } + + /// Convert into a . + internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/PineconeChatDataSource.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/PineconeChatDataSource.cs new file mode 100644 index 000000000..273aaca38 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Generated/PineconeChatDataSource.cs @@ -0,0 +1,14 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace Azure.AI.OpenAI.Chat +{ + /// The PineconeChatDataSource. 
+    public partial class PineconeChatDataSource : AzureChatDataSource
+    {
+    }
+}
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Utility/AzureOpenAIPipelineMessageBuilder.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Utility/AzureOpenAIPipelineMessageBuilder.cs
new file mode 100644
index 000000000..07d5d3711
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Utility/AzureOpenAIPipelineMessageBuilder.cs
@@ -0,0 +1,167 @@
+// Copyright (c) Microsoft Corporation. All rights reserved.
+// Licensed under the MIT License.
+
+using System.ClientModel;
+using System.ClientModel.Primitives;
+
+namespace Azure.AI.OpenAI;
+
+/// <summary>
+/// A helper class to standardize custom protocol message creation across various Azure OpenAI scenario clients.
+/// </summary>
+internal class AzureOpenAIPipelineMessageBuilder
+{
+    private readonly ClientPipeline _pipeline;
+    private readonly Uri _endpoint;
+    private readonly string _deploymentName;
+    private string[] _pathComponents;
+    private readonly List<KeyValuePair<string, string>> _queryStringParameters = [];
+    private string _method;
+    private BinaryContent _content;
+    private readonly Dictionary<string, string> _headers = [];
+    private PipelineMessageClassifier _classifier;
+    private RequestOptions _options;
+    private bool? _bufferResponse;
+
+    /// <summary>
+    /// Creates a new instance of <see cref="AzureOpenAIPipelineMessageBuilder"/>.
+    /// </summary>
+    /// <param name="pipeline"></param>
+    /// <param name="endpoint"></param>
+    /// <param name="apiVersion"></param>
+    /// <param name="deploymentName"></param>
+    public AzureOpenAIPipelineMessageBuilder(ClientPipeline pipeline, Uri endpoint, string apiVersion, string deploymentName = null)
+    {
+        _pipeline = pipeline;
+        _endpoint = endpoint;
+        _deploymentName = deploymentName;
+        _queryStringParameters.Add(new KeyValuePair<string, string>("api-version", apiVersion));
+    }
+
+    public AzureOpenAIPipelineMessageBuilder WithPath(params string[] pathComponents)
+    {
+        _pathComponents = pathComponents;
+        return this;
+    }
+
+    public AzureOpenAIPipelineMessageBuilder WithOptionalQueryParameter(string name, string value)
+    {
+        if (!string.IsNullOrEmpty(value))
+        {
+            _queryStringParameters.Add(new(name, value));
+        }
+        return this;
+    }
+
+    public AzureOpenAIPipelineMessageBuilder WithOptionalQueryParameter<T>(string name, T? value)
+        where T : struct, IConvertible
+        => WithOptionalQueryParameter(name, value.HasValue ? Convert.ChangeType(value.Value, typeof(string)).ToString() : null);
+
+    public AzureOpenAIPipelineMessageBuilder WithCommonListParameters(int?
limit, string order, string after, string before) + => WithOptionalQueryParameter("limit", limit) + .WithOptionalQueryParameter("order", order) + .WithOptionalQueryParameter("after", after) + .WithOptionalQueryParameter("before", before); + + public AzureOpenAIPipelineMessageBuilder WithMethod(string requestMethod) + { + _method = requestMethod; + return this; + } + + public AzureOpenAIPipelineMessageBuilder WithContent(BinaryContent content, string contentType) + { + _content = content; + _headers["Content-Type"] = contentType; + return this; + } + + public AzureOpenAIPipelineMessageBuilder WithHeader(string name, string value) + { + _headers[name] = value; + return this; + } + + public AzureOpenAIPipelineMessageBuilder WithAssistantsHeader() + { + _headers[s_OpenAIBetaFeatureHeader] = s_OpenAIBetaAssistantsV2HeaderValue; + return this; + } + + public AzureOpenAIPipelineMessageBuilder WithAccept(string acceptHeaderValue) + => WithHeader("Accept", acceptHeaderValue); + + public AzureOpenAIPipelineMessageBuilder WithOptions(RequestOptions requestOptions) + { + _options = requestOptions; + return this; + } + + public AzureOpenAIPipelineMessageBuilder WithResponseContentBuffering(bool? shouldBufferContent) + { + _bufferResponse = shouldBufferContent; + return this; + } + + public AzureOpenAIPipelineMessageBuilder WithClassifier(PipelineMessageClassifier classifier) + { + _classifier = classifier; + return this; + } + + public PipelineMessage Build() + { + Argument.AssertNotNullOrWhiteSpace(_method, nameof(_method)); + + PipelineMessage message = _pipeline.CreateMessage(); + message.ResponseClassifier = _classifier ?? 
AzureOpenAIClient.PipelineMessageClassifier; + if (_bufferResponse.HasValue) + { + message.BufferResponse = _bufferResponse.Value; + } + PipelineRequest request = message.Request; + request.Method = _method; + SetUri(request); + foreach (KeyValuePair pair in _headers) + { + request.Headers.Set(pair.Key, pair.Value); + } + request.Content = _content; + if (_options is not null) + { + message.Apply(_options); + } + return message; + } + + private void SetUri(PipelineRequest request) + { + ClientUriBuilder uriBuilder = new(); + uriBuilder.Reset(_endpoint); + + bool hasTrailingSlash = _endpoint.AbsoluteUri.EndsWith("/"); + uriBuilder.AppendPath($"{(hasTrailingSlash ? "" : "/")}openai", escape: false); + + if (!string.IsNullOrEmpty(_deploymentName)) + { + uriBuilder.AppendPath($"/deployments/", escape: false); + uriBuilder.AppendPath(_deploymentName, escape: true); + } + + foreach (string pathComponent in _pathComponents ?? []) + { + uriBuilder.AppendPath("/", escape: false); + uriBuilder.AppendPath(pathComponent, escape: true); + } + + foreach (KeyValuePair queryStringPair in _queryStringParameters) + { + uriBuilder.AppendQuery(queryStringPair.Key, queryStringPair.Value, escape: true); + } + + request.Uri = uriBuilder.ToUri(); + } + + private static readonly string s_OpenAIBetaFeatureHeader = "OpenAI-Beta"; + private static readonly string s_OpenAIBetaAssistantsV2HeaderValue = "assistants=v2"; +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Utility/CustomSerializationHelpers.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Utility/CustomSerializationHelpers.cs new file mode 100644 index 000000000..23fc567fa --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Utility/CustomSerializationHelpers.cs @@ -0,0 +1,131 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. 
+
+#nullable enable
+
+using System.ClientModel.Primitives;
+using System.Text.Json;
+
+namespace Azure.AI.OpenAI;
+
+internal static partial class CustomSerializationHelpers
+{
+    internal static TOutput DeserializeNewInstance<TOutput, UInstanceInput>(
+        UInstanceInput existingInstance,
+        Func<JsonElement, ModelReaderWriterOptions, TOutput> deserializationFunc,
+        ref Utf8JsonReader reader,
+        ModelReaderWriterOptions options)
+            where UInstanceInput : IJsonModel<TOutput>
+    {
+        options ??= new("W");
+        var format = options.Format == "W" ? ((IJsonModel<TOutput>)existingInstance).GetFormatFromOptions(options) : options.Format;
+        if (format != "J")
+        {
+            throw new FormatException($"The model {nameof(UInstanceInput)} does not support '{format}' format.");
+        }
+
+        using JsonDocument document = JsonDocument.ParseValue(ref reader);
+        return deserializationFunc.Invoke(document.RootElement, options);
+    }
+
+    internal static TOutput DeserializeNewInstance<TOutput, UInstanceInput>(
+        UInstanceInput existingInstance,
+        Func<JsonElement, ModelReaderWriterOptions, TOutput> deserializationFunc,
+        BinaryData data,
+        ModelReaderWriterOptions options)
+            where UInstanceInput : IPersistableModel<TOutput>
+    {
+        options ??= new("W");
+        var format = options.Format == "W" ?
((IPersistableModel)existingInstance).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return deserializationFunc.Invoke(document.RootElement, options)!; + } + default: + throw new FormatException($"The model {nameof(UInstanceInput)} does not support '{format}' format."); + } + } + + internal static void SerializeInstance( + UInstanceInput instance, + Action serializationFunc, + Utf8JsonWriter writer, + ModelReaderWriterOptions options) + where UInstanceInput : IJsonModel + { + options ??= new ModelReaderWriterOptions("W"); + AssertSupportedJsonWriteFormat(instance, options); + serializationFunc.Invoke(instance, writer, options); + } + + internal static void SerializeInstance( + T instance, + Action serializationFunc, + Utf8JsonWriter writer, + ModelReaderWriterOptions options) + where T : IJsonModel + => SerializeInstance(instance, serializationFunc, writer, options); + + internal static BinaryData SerializeInstance( + UInstanceInput instance, + ModelReaderWriterOptions options) + where UInstanceInput : IPersistableModel + { + options ??= new("W"); + AssertSupportedPersistableWriteFormat(instance, options); + return ModelReaderWriter.Write(instance, options); + } + + internal static BinaryData SerializeInstance(T instance, ModelReaderWriterOptions options) + where T : IPersistableModel + => SerializeInstance(instance, options); + + internal static void AssertSupportedJsonWriteFormat(T instance, ModelReaderWriterOptions options) + where T : IJsonModel + => AssertSupportedJsonWriteFormat(instance, options); + + internal static void AssertSupportedJsonWriteFormat(UInstanceInput instance, ModelReaderWriterOptions options) + where UInstanceInput : IJsonModel + { + var format = options.Format == "W" ? 
((IJsonModel<TOutput>)instance).GetFormatFromOptions(options) : options.Format;
+        if (format != "J")
+        {
+            throw new FormatException($"The model {nameof(UInstanceInput)} does not support '{format}' format.");
+        }
+    }
+
+    internal static void AssertSupportedPersistableWriteFormat<T>(T instance, ModelReaderWriterOptions options)
+        where T : IPersistableModel<T>
+        => AssertSupportedPersistableWriteFormat<T, T>(instance, options);
+
+    internal static void AssertSupportedPersistableWriteFormat<UInstanceInput, TOutput>(UInstanceInput instance, ModelReaderWriterOptions options)
+        where UInstanceInput : IPersistableModel<TOutput>
+    {
+        var format = options.Format == "W" ? ((IPersistableModel<TOutput>)instance).GetFormatFromOptions(options) : options.Format;
+        if (format != "J")
+        {
+            throw new FormatException($"The model {nameof(UInstanceInput)} does not support '{format}' format.");
+        }
+    }
+
+    internal static void WriteSerializedAdditionalRawData(this Utf8JsonWriter writer, IDictionary<string, BinaryData> dictionary, ModelReaderWriterOptions options)
+    {
+        if (dictionary != null)
+        {
+            foreach (var item in dictionary)
+            {
+                writer.WritePropertyName(item.Key);
+#if NET6_0_OR_GREATER
+                writer.WriteRawValue(item.Value);
+#else
+                using JsonDocument document = JsonDocument.Parse(item.Value);
+                JsonSerializer.Serialize(writer, document.RootElement);
+#endif
+            }
+        }
+    }
+}
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Utility/Generator/CodeGenClientAttribute.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Utility/Generator/CodeGenClientAttribute.cs
new file mode 100644
index 000000000..8c309829a
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Utility/Generator/CodeGenClientAttribute.cs
@@ -0,0 +1,16 @@
+// Copyright (c) Microsoft Corporation. All rights reserved.
+// Licensed under the MIT License.
+
+#nullable enable
+
+namespace Azure.AI.OpenAI;
+
+[AttributeUsage(AttributeTargets.Class)]
+internal class CodeGenClientAttribute : CodeGenTypeAttribute
+{
+    public Type?
ParentClient { get; set; }
+
+    public CodeGenClientAttribute(string originalName) : base(originalName)
+    {
+    }
+}
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Utility/Generator/CodeGenMemberAttribute.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Utility/Generator/CodeGenMemberAttribute.cs
new file mode 100644
index 000000000..595fce61c
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Utility/Generator/CodeGenMemberAttribute.cs
@@ -0,0 +1,18 @@
+// Copyright (c) Microsoft Corporation. All rights reserved.
+// Licensed under the MIT License.
+
+#nullable enable
+
+namespace Azure.AI.OpenAI;
+
+[AttributeUsage(AttributeTargets.Property | AttributeTargets.Field)]
+internal class CodeGenMemberAttribute : CodeGenTypeAttribute
+{
+    public CodeGenMemberAttribute() : base(null)
+    {
+    }
+
+    public CodeGenMemberAttribute(string originalName) : base(originalName)
+    {
+    }
+}
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Utility/Generator/CodeGenModelAttribute.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Utility/Generator/CodeGenModelAttribute.cs
new file mode 100644
index 000000000..c9de6f820
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Utility/Generator/CodeGenModelAttribute.cs
@@ -0,0 +1,28 @@
+// Copyright (c) Microsoft Corporation. All rights reserved.
+// Licensed under the MIT License.
+
+#nullable enable
+
+namespace Azure.AI.OpenAI;
+
+[AttributeUsage(AttributeTargets.Class | AttributeTargets.Enum | AttributeTargets.Struct)]
+internal class CodeGenModelAttribute : CodeGenTypeAttribute
+{
+    /// <summary>
+    /// Gets or sets a comma-separated list of additional model usage modes. Allowed values: model, error, input, output.
+    /// </summary>
+    public string[]? Usage { get; set; }
+
+    /// <summary>
+    /// Gets or sets a comma-separated list of additional model serialization formats.
+    /// </summary>
+    public string[]?
Formats { get; set; } + + public CodeGenModelAttribute() : base(null) + { + } + + public CodeGenModelAttribute(string originalName) : base(originalName) + { + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Utility/Generator/CodeGenSerializationAttribute.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Utility/Generator/CodeGenSerializationAttribute.cs new file mode 100644 index 000000000..f893b8434 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Utility/Generator/CodeGenSerializationAttribute.cs @@ -0,0 +1,54 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +#nullable enable + +namespace Azure.AI.OpenAI; + +[AttributeUsage(AttributeTargets.Class | AttributeTargets.Struct, AllowMultiple = true, Inherited = true)] +internal class CodeGenSerializationAttribute : Attribute +{ + /// + /// Gets or sets the property name which these hooks should apply to + /// + public string? PropertyName { get; set; } + /// + /// Gets or sets the serialization path of the property in the JSON + /// + public string[]? SerializationPath { get; } + /// + /// Gets or sets the method name to use when serializing the property value (property name excluded) + /// The signature of the serialization hook method must be or compatible with when invoking: + /// private void SerializeHook(Utf8JsonWriter writer); + /// + public string? SerializationValueHook { get; set; } + /// + /// Gets or sets the method name to use when deserializing the property value from the JSON + /// private static void DeserializationHook(JsonProperty property, ref TypeOfTheProperty propertyValue); // if the property is required + /// private static void DeserializationHook(JsonProperty property, ref Optional<TypeOfTheProperty> propertyValue); // if the property is optional + /// + public string? 
DeserializationValueHook { get; set; } + /// + /// Gets or sets the method name to use when serializing the property value (property name excluded) + /// The signature of the serialization hook method must be or compatible with when invoking: + /// private void SerializeHook(StringBuilder builder); + /// + public string? BicepSerializationValueHook { get; set; } + + public CodeGenSerializationAttribute(string propertyName) + { + PropertyName = propertyName; + } + + public CodeGenSerializationAttribute(string propertyName, string serializationName) + { + PropertyName = propertyName; + SerializationPath = new[] { serializationName }; + } + + public CodeGenSerializationAttribute(string propertyName, string[] serializationPath) + { + PropertyName = propertyName; + SerializationPath = serializationPath; + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Utility/Generator/CodeGenSuppressAttribute.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Utility/Generator/CodeGenSuppressAttribute.cs new file mode 100644 index 000000000..4b0c1e72c --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Utility/Generator/CodeGenSuppressAttribute.cs @@ -0,0 +1,17 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. 
+ +namespace Azure.AI.OpenAI; + +[AttributeUsage(AttributeTargets.Class | AttributeTargets.Enum | AttributeTargets.Struct, AllowMultiple = true)] +internal class CodeGenSuppressAttribute : Attribute +{ + public string Member { get; } + public Type[] Parameters { get; } + + public CodeGenSuppressAttribute(string member, params Type[] parameters) + { + Member = member; + Parameters = parameters; + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Utility/Generator/CodeGenTypeAttribute.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Utility/Generator/CodeGenTypeAttribute.cs new file mode 100644 index 000000000..2587e1f57 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Utility/Generator/CodeGenTypeAttribute.cs @@ -0,0 +1,19 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +#nullable enable + +using System; + +namespace Azure.AI.OpenAI; + +[AttributeUsage(AttributeTargets.Class)] +internal class CodeGenTypeAttribute : Attribute +{ + public string? OriginalName { get; } + + public CodeGenTypeAttribute(string? originalName) + { + OriginalName = originalName; + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Utility/GenericActionPipelinePolicy.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Utility/GenericActionPipelinePolicy.cs new file mode 100644 index 000000000..79e5ccc30 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Utility/GenericActionPipelinePolicy.cs @@ -0,0 +1,32 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. 
+
+using System.ClientModel.Primitives;
+
+namespace Azure.AI.OpenAI;
+
+internal partial class GenericActionPipelinePolicy : PipelinePolicy
+{
+    private Action<PipelineRequest> _requestAction;
+    private Action<PipelineResponse> _responseAction;
+
+    public GenericActionPipelinePolicy(Action<PipelineRequest> requestAction = null, Action<PipelineResponse> responseAction = null)
+    {
+        _requestAction = requestAction;
+        _responseAction = responseAction;
+    }
+
+    public override void Process(PipelineMessage message, IReadOnlyList<PipelinePolicy> pipeline, int currentIndex)
+    {
+        _requestAction?.Invoke(message.Request);
+        ProcessNext(message, pipeline, currentIndex);
+        _responseAction?.Invoke(message.Response);
+    }
+
+    public override async ValueTask ProcessAsync(PipelineMessage message, IReadOnlyList<PipelinePolicy> pipeline, int currentIndex)
+    {
+        _requestAction?.Invoke(message.Request);
+        await ProcessNextAsync(message, pipeline, currentIndex).ConfigureAwait(false);
+        _responseAction?.Invoke(message.Response);
+    }
+}
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Utility/Polyfill/System.Diagnostics.CodeAnalysis.ExperimentalAttribute.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Utility/Polyfill/System.Diagnostics.CodeAnalysis.ExperimentalAttribute.cs
new file mode 100644
index 000000000..b702c283e
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Utility/Polyfill/System.Diagnostics.CodeAnalysis.ExperimentalAttribute.cs
@@ -0,0 +1,62 @@
+// Copyright (c) Microsoft Corporation. All rights reserved.
+// Licensed under the MIT License.
+
+#if !NET8_0_OR_GREATER
+
+#pragma warning disable SA1649
+#nullable enable
+
+namespace System.Diagnostics.CodeAnalysis
+{
+    /// <summary>
+    /// Indicates that an API is experimental and it may change in the future.
+    /// </summary>
+    /// <remarks>
+    /// This attribute allows call sites to be flagged with a diagnostic that indicates that an experimental
+    /// feature is used. Authors can use this attribute to ship preview features in their assemblies.
+ /// + [AttributeUsage(AttributeTargets.Assembly | + AttributeTargets.Module | + AttributeTargets.Class | + AttributeTargets.Struct | + AttributeTargets.Enum | + AttributeTargets.Constructor | + AttributeTargets.Method | + AttributeTargets.Property | + AttributeTargets.Field | + AttributeTargets.Event | + AttributeTargets.Interface | + AttributeTargets.Delegate, Inherited = false)] + internal sealed class ExperimentalAttribute : Attribute + { + /// + /// Initializes a new instance of the class, specifying the ID that the compiler will use + /// when reporting a use of the API the attribute applies to. + /// + /// The ID that the compiler will use when reporting a use of the API the attribute applies to. + public ExperimentalAttribute(string diagnosticId) + { + DiagnosticId = diagnosticId; + } + + /// + /// Gets the ID that the compiler will use when reporting a use of the API the attribute applies to. + /// + /// The unique diagnostic ID. + /// + /// The diagnostic ID is shown in build output for warnings and errors. + /// This property represents the unique ID that can be used to suppress the warnings or errors, if needed. + /// + public string DiagnosticId { get; } + + /// + /// Gets or sets the URL for corresponding documentation. + /// The API accepts a format string instead of an actual URL, creating a generic URL that includes the diagnostic ID. + /// + /// The format string that represents a URL to corresponding documentation. + /// An example format string is https://contoso.com/obsoletion-warnings/{0}. + public string? 
UrlFormat { get; set; } + } +} + +#endif // !NET8_0_OR_LATER diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Utility/Polyfill/System.Diagnostics.CodeAnalysis.SetsRequiredMembersAttribute.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Utility/Polyfill/System.Diagnostics.CodeAnalysis.SetsRequiredMembersAttribute.cs new file mode 100644 index 000000000..9e5f8890d --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Utility/Polyfill/System.Diagnostics.CodeAnalysis.SetsRequiredMembersAttribute.cs @@ -0,0 +1,11 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +#if !NET7_0_OR_GREATER + +namespace System.Diagnostics.CodeAnalysis; + +[AttributeUsage(AttributeTargets.Constructor, AllowMultiple = false, Inherited = false)] +internal sealed class SetsRequiredMembersAttribute : Attribute { } + +#endif // !NET7_0_OR_GREATER diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Utility/Polyfill/System.Runtime.CompilerServices.CompilerFeatureRequiredAttribute.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Utility/Polyfill/System.Runtime.CompilerServices.CompilerFeatureRequiredAttribute.cs new file mode 100644 index 000000000..0ad171cbd --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Utility/Polyfill/System.Runtime.CompilerServices.CompilerFeatureRequiredAttribute.cs @@ -0,0 +1,18 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. 
+
+#if !NET7_0_OR_GREATER
+
+namespace System.Runtime.CompilerServices;
+
+[AttributeUsage(AttributeTargets.All, AllowMultiple = true, Inherited = false)]
+internal sealed class CompilerFeatureRequiredAttribute(string featureName) : Attribute
+{
+    public string FeatureName { get; } = featureName;
+    public bool IsOptional { get; init; }
+
+    public const string RefStructs = nameof(RefStructs);
+    public const string RequiredMembers = nameof(RequiredMembers);
+}
+
+#endif // !NET7_0_OR_GREATER
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Utility/Polyfill/System.Runtime.CompilerServices.IsExternalInit.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Utility/Polyfill/System.Runtime.CompilerServices.IsExternalInit.cs
new file mode 100644
index 000000000..b34c34cfa
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Utility/Polyfill/System.Runtime.CompilerServices.IsExternalInit.cs
@@ -0,0 +1,12 @@
+// Copyright (c) Microsoft Corporation. All rights reserved.
+// Licensed under the MIT License.
+
+#if !NET5_0_OR_GREATER
+
+using System.ComponentModel;
+namespace System.Runtime.CompilerServices;
+
+[EditorBrowsable(EditorBrowsableState.Never)]
+internal static class IsExternalInit { }
+
+#endif // !NET5_0_OR_GREATER
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Utility/Polyfill/System.Runtime.CompilerServices.RequiredMemberAttribute.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Utility/Polyfill/System.Runtime.CompilerServices.RequiredMemberAttribute.cs
new file mode 100644
index 000000000..b10b19591
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/src/Utility/Polyfill/System.Runtime.CompilerServices.RequiredMemberAttribute.cs
@@ -0,0 +1,11 @@
+// Copyright (c) Microsoft Corporation. All rights reserved.
+// Licensed under the MIT License.
+ +#if !NET7_0_OR_GREATER + +namespace System.Runtime.CompilerServices; + +[AttributeUsage(AttributeTargets.Class | AttributeTargets.Struct | AttributeTargets.Field | AttributeTargets.Property, AllowMultiple = false, Inherited = false)] +internal sealed class RequiredMemberAttribute : Attribute { } + +#endif // !NET7_0_OR_GREATER diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Assets/edit_sample_image.png b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Assets/edit_sample_image.png new file mode 100644 index 000000000..869bb1e04 Binary files /dev/null and b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Assets/edit_sample_image.png differ diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Assets/edit_sample_mask.png b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Assets/edit_sample_mask.png new file mode 100644 index 000000000..98b9c237c Binary files /dev/null and b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Assets/edit_sample_mask.png differ diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Assets/fine_tuning.jsonl b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Assets/fine_tuning.jsonl new file mode 100644 index 000000000..2c345d259 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Assets/fine_tuning.jsonl @@ -0,0 +1,11 @@ +{"messages":[{"role":"system","content":"Given a sports headline, provide the following fields in a JSON dictionary, where applicable: \"player\" (full name), \"team\", \"sport\", and \"gender\". In the case there is more than one team return an array of that dictionary. 
Do not include any markdown characters such as ```json and ```"},{"role":"user","content":"Sources: Colts grant RB Taylor OK to seek trade"},{"role":"assistant","content":"{\n \"player\": \"Jonathan Taylor\",\n \"team\": \"Indianapolis Colts\",\n \"sport\": \"football\",\n \"gender\": \"male\"\n}"}]} +{"messages":[{"role":"system","content":"Given a sports headline, provide the following fields in a JSON dictionary, where applicable: \"player\" (full name), \"team\", \"sport\", and \"gender\". In the case there is more than one team return an array of that dictionary. Do not include any markdown characters such as ```json and ```"},{"role":"user","content":"OSU 'split down middle' on starting QB battle"},{"role":"assistant","content":"{\n \"team\": \"Ohio State University\",\n \"sport\": \"football\"\n}"}]} +{"messages":[{"role":"system","content":"Given a sports headline, provide the following fields in a JSON dictionary, where applicable: \"player\" (full name), \"team\", \"sport\", and \"gender\". In the case there is more than one team return an array of that dictionary. Do not include any markdown characters such as ```json and ```"},{"role":"user","content":"Christian McCaffrey given 2 year extension by 49ers"},{"role":"assistant","content":"{\n \"player\": \"Christian McCaffrey\",\n \"team\": \"San Francisco 49ers\",\n \"sport\": \"football\",\n \"gender\": \"male\"\n}"}]} +{"messages":[{"role":"system","content":"Given a sports headline, provide the following fields in a JSON dictionary, where applicable: \"player\" (full name), \"team\", \"sport\", and \"gender\". In the case there is more than one team return an array of that dictionary. 
Do not include any markdown characters such as ```json and ```"},{"role":"user","content":"Tucupita Marcano banned for life by MLB for betting on baseball"},{"role":"assistant","content":"{\n \"player\": \"Tucupita Marcano\",\n \"sport\": \"baseball\",\n \"gender\": \"male\"\n}"}]} +{"messages":[{"role":"system","content":"Given a sports headline, provide the following fields in a JSON dictionary, where applicable: \"player\" (full name), \"team\", \"sport\", and \"gender\". In the case there is more than one team return an array of that dictionary. Do not include any markdown characters such as ```json and ```"},{"role":"user","content":"Who will win the 2024 NBA finals? Predictions about the Celtics vs. Mavericks matchup"},{"role":"assistant","content":"[\n {\n \"team\": \"Celtics\",\n \"sport\": \"basketball\"\n },\n {\n \"team\": \"Mavericks\",\n \"sport\": \"basketball\"\n }\n]"}]} +{"messages":[{"role":"system","content":"Given a sports headline, provide the following fields in a JSON dictionary, where applicable: \"player\" (full name), \"team\", \"sport\", and \"gender\". In the case there is more than one team return an array of that dictionary. Do not include any markdown characters such as ```json and ```"},{"role":"user","content":"Pavleski will not play in 2024-2025 season"},{"role":"assistant","content":"{\n \"player\": \"Pavelski\",\n \"sport\": \"hockey\",\n \"gender\": \"male\"\n}"}]} +{"messages":[{"role":"system","content":"Given a sports headline, provide the following fields in a JSON dictionary, where applicable: \"player\" (full name), \"team\", \"sport\", and \"gender\". In the case there is more than one team return an array of that dictionary. 
Do not include any markdown characters such as ```json and ```"},{"role":"user","content":"Charges against Scottie Scheffler have been dropped after arrest"},{"role":"assistant","content":"{\n \"player\": \"Scottie Scheffler\",\n \"sport\": \"golf\",\n \"gender\": \"male\"\n}"}]} +{"messages":[{"role":"system","content":"Given a sports headline, provide the following fields in a JSON dictionary, where applicable: \"player\" (full name), \"team\", \"sport\", and \"gender\". In the case there is more than one team return an array of that dictionary. Do not include any markdown characters such as ```json and ```"},{"role":"user","content":"Perez picked by Red Bull for F1 2025 deal"},{"role":"assistant","content":"{\n \"player\": \"Perez\",\n \"team\": \"Red Bull\",\n \"sport\": \"F1\",\n \"gender\": \"male\"\n}"}]} +{"messages":[{"role":"system","content":"Given a sports headline, provide the following fields in a JSON dictionary, where applicable: \"player\" (full name), \"team\", \"sport\", and \"gender\". In the case there is more than one team return an array of that dictionary. Do not include any markdown characters such as ```json and ```"},{"role":"user","content":"DL Johnson III waived by 49ers"},{"role":"assistant","content":"{\n \"player\": \"DL Johnson III\",\n \"team\": \"49ers\",\n \"sport\": \"football\",\n \"gender\": \"male\"\n}"}]} +{"messages":[{"role":"system","content":"Given a sports headline, provide the following fields in a JSON dictionary, where applicable: \"player\" (full name), \"team\", \"sport\", and \"gender\". In the case there is more than one team return an array of that dictionary. 
Do not include any markdown characters such as ```json and ```"},{"role":"user","content":"Trevor Williams placed on injured list"},{"role":"assistant","content":"{\n \"player\": \"Trevor Williams\",\n \"sport\": \"baseball\",\n \"gender\": \"male\"\n}"}]} +{"messages":[{"role":"system","content":"Given a sports headline, provide the following fields in a JSON dictionary, where applicable: \"player\" (full name), \"team\", \"sport\", and \"gender\". In the case there is more than one team return an array of that dictionary. Do not include any markdown characters such as ```json and ```"},{"role":"user","content":"Coco Gauff, and Iga Swiatek will meet in French Open semis"},{"role":"assistant","content":"[\n {\n \"player\": \"Coco Gauff\",\n \"sport\": \"tennis\",\n \"gender\": \"female\"\n },\n {\n \"player\": \"Iga Swiatek\",\n \"sport\": \"tennis\",\n \"gender\": \"female\"\n }\n]"}]} \ No newline at end of file diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Assets/french.wav b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Assets/french.wav new file mode 100644 index 000000000..847f3463a Binary files /dev/null and b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Assets/french.wav differ diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Assets/hello_world.m4a b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Assets/hello_world.m4a new file mode 100644 index 000000000..ed8e09c8f Binary files /dev/null and b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Assets/hello_world.m4a differ diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Assets/playback_test_config.json b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Assets/playback_test_config.json new file mode 100644 index 000000000..d55e5d96e --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Assets/playback_test_config.json @@ -0,0 +1,38 @@ +{ + "default": { + "endpoint": "https://Sanitized.openai.azure.com/", + "key": "Sanitized", + "deployment": "gpt-4-turbo", + 
"resource_group": "Sanitized", + "subscription_id": "Sanitized" + }, + "audio": { + "deployment": "whisper" + }, + "embedding": { + "deployment": "text-embedding-ada-002" + }, + "fine_tuning": { + "deployment": "gpt-35-turbo-0613", + "fine_tuned_model": "gpt-35-turbo-0613.ft-53f9c10199f84dfea3ec772341862ff5-azure-ai-openai-integration-test" + }, + "image": { + "deployment": "dall-e-3" + }, + "rate_limited_chat": { + "endpoint": "https://Sanitized.openai.azure.com/", + "key": "Sanitized", + "deployment": "gpt-35-turbo-low-quota" + }, + "search": { + "endpoint": "https://Sanitized.search.windows.net/", + "key": "Sanitized", + "index": "openaiwikisearchindex" + }, + "tts": { + "deployment": "tts" + }, + "vision": { + "deployment": "gpt-4-vision-preview" + } +} \ No newline at end of file diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Assets/speed-talking.wav b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Assets/speed-talking.wav new file mode 100644 index 000000000..2a09e2737 Binary files /dev/null and b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Assets/speed-talking.wav differ diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Assets/stop_sign.png b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Assets/stop_sign.png new file mode 100644 index 000000000..002b3ae1a Binary files /dev/null and b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Assets/stop_sign.png differ diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Assets/variation_sample_image.png b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Assets/variation_sample_image.png new file mode 100644 index 000000000..119a13e8f Binary files /dev/null and b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Assets/variation_sample_image.png differ diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/AssistantTests.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/AssistantTests.cs new file mode 100644 index 000000000..67b4c354f --- /dev/null +++ 
b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/AssistantTests.cs
@@ -0,0 +1,653 @@
+// Copyright (c) Microsoft Corporation. All rights reserved.
+// Licensed under the MIT License.
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.Collections.Generic;
+using System.Diagnostics;
+using System.Linq;
+using System.Text;
+using System.Threading.Tasks;
+using Azure.AI.OpenAI.Tests.Utils.Config;
+using OpenAI;
+using OpenAI.Assistants;
+using OpenAI.Files;
+using OpenAI.TestFramework;
+using OpenAI.TestFramework.Utils;
+using OpenAI.VectorStores;
+
+namespace Azure.AI.OpenAI.Tests;
+
+public class AssistantTests(bool isAsync) : AoaiTestBase<AssistantClient>(isAsync)
+{
+    [Test]
+    [Category("Smoke")]
+    public void CanCreateClient() => Assert.That(GetTestClient(), Is.InstanceOf<AssistantClient>());
+
+    [Test]
+    [Category("Smoke")]
+    public void VerifyClientOptionMutability()
+    {
+        AzureOpenAIClientOptions options = null;
+        Assert.DoesNotThrow(() =>
+            options = new AzureOpenAIClientOptions()
+            {
+                ApplicationId = "init does not throw",
+            });
+        Assert.DoesNotThrow(() =>
+            options.ApplicationId = "set before freeze OK");
+        AzureOpenAIClient azureClient = new(
+            new Uri("https://www.microsoft.com/placeholder"),
+            new ApiKeyCredential("placeholder"),
+            options);
+        Assert.Throws<InvalidOperationException>(() =>
+            options.ApplicationId = "set after freeze throws");
+    }
+
+    [RecordedTest]
+    public async Task BasicAssistantOperationsWork()
+    {
+        AssistantClient client = GetTestClient();
+        string modelName = client.DeploymentOrThrow();
+        Assistant assistant = await client.CreateAssistantAsync(modelName);
+        Validate(assistant);
+        Assert.That(assistant.Name, Is.Null.Or.Empty);
+        assistant = await client.ModifyAssistantAsync(assistant.Id, new AssistantModificationOptions()
+        {
+            Name = "test assistant name",
+        });
+        Assert.That(assistant.Name, Is.EqualTo("test assistant name"));
+        bool deleted = await client.DeleteAssistantAsync(assistant.Id);
+        Assert.That(deleted, Is.True);
+        assistant = await client.CreateAssistantAsync(modelName, new AssistantCreationOptions()
+        {
+            Metadata =
+            {
+                ["testkey"] = "hello!"
+            },
+        });
+        Validate(assistant);
+        Assistant retrievedAssistant = await client.GetAssistantAsync(assistant.Id);
+        Assert.That(retrievedAssistant.Id, Is.EqualTo(assistant.Id));
+        Assert.That(retrievedAssistant.Metadata.TryGetValue("testkey", out string metadataValue) && metadataValue == "hello!");
+        Assistant modifiedAssistant = await client.ModifyAssistantAsync(assistant.Id, new AssistantModificationOptions()
+        {
+            Metadata =
+            {
+                ["testkey"] = "goodbye!",
+            },
+        });
+        Assert.That(modifiedAssistant.Id, Is.EqualTo(assistant.Id));
+        AsyncPageCollection<Assistant> recentAssistants = client.GetAssistantsAsync();
+        Assistant firstAssistant = await recentAssistants.GetAllValuesAsync().FirstOrDefaultAsync();
+        Assert.That(firstAssistant, Is.Not.Null);
+        Assert.That(firstAssistant.Metadata.TryGetValue("testkey", out string newMetadataValue) && newMetadataValue == "goodbye!");
+    }
+
+    [RecordedTest]
+    public async Task BasicThreadOperationsWork()
+    {
+        AssistantClient client = GetTestClient();
+        AssistantThread thread = await client.CreateThreadAsync();
+        Validate(thread);
+        Assert.That(thread.CreatedAt, Is.GreaterThan(s_2024));
+        bool deleted = await client.DeleteThreadAsync(thread.Id);
+        Assert.That(deleted, Is.True);
+
+        ThreadCreationOptions options = new()
+        {
+            Metadata =
+            {
+                ["threadMetadata"] = "threadMetadataValue",
+            }
+        };
+        thread = await client.CreateThreadAsync(options);
+        Validate(thread);
+        Assert.That(thread.Metadata.TryGetValue("threadMetadata", out string threadMetadataValue) && threadMetadataValue == "threadMetadataValue");
+        AssistantThread retrievedThread = await client.GetThreadAsync(thread.Id);
+        Assert.That(retrievedThread.Id, Is.EqualTo(thread.Id));
+        thread = await client.ModifyThreadAsync(thread, new ThreadModificationOptions()
+        {
+            Metadata =
+            {
+                ["threadMetadata"] = "newThreadMetadataValue",
+            },
+        });
+        Assert.That(thread.Metadata.TryGetValue("threadMetadata", out threadMetadataValue) && threadMetadataValue == "newThreadMetadataValue");
+    }
+
+    [RecordedTest]
+    public async Task SettingResponseFormatWorks()
+    {
+        AssistantClient client = GetTestClient();
+        string modelName = client.DeploymentOrThrow();
+
+        Assistant assistant = await client.CreateAssistantAsync(modelName, new()
+        {
+            ResponseFormat = AssistantResponseFormat.JsonObject,
+        });
+        Validate(assistant);
+        Assert.That(assistant.ResponseFormat, Is.EqualTo(AssistantResponseFormat.JsonObject));
+        assistant = await client.ModifyAssistantAsync(assistant, new()
+        {
+            ResponseFormat = AssistantResponseFormat.Text,
+        });
+        Assert.That(assistant.ResponseFormat, Is.EqualTo(AssistantResponseFormat.Text));
+        AssistantThread thread = await client.CreateThreadAsync();
+        Validate(thread);
+        ThreadMessage message = await client.CreateMessageAsync(thread.Id, MessageRole.User, ["Write some JSON for me!"]);
+        Validate(message);
+        ThreadRun run = await client.CreateRunAsync(thread, assistant, new()
+        {
+            ResponseFormat = AssistantResponseFormat.JsonObject,
+        });
+        Validate(run);
+        Assert.That(run.ResponseFormat, Is.EqualTo(AssistantResponseFormat.JsonObject));
+    }
+
+    [RecordedTest]
+    public async Task StreamingToolCall()
+    {
+        AssistantClient client = GetTestClient();
+        string modelName = client.DeploymentOrThrow();
+        FunctionToolDefinition getWeatherTool = new("get_current_weather") { Description = "Gets the user's current weather" };
+        Assistant assistant = await client.CreateAssistantAsync(modelName, new()
+        {
+            Tools = { getWeatherTool }
+        });
+        Validate(assistant);
+
+        Stopwatch stopwatch = Stopwatch.StartNew();
+        void Print(string message) => Console.WriteLine($"[{stopwatch.ElapsedMilliseconds,6}] {message}");
+
+        Print(" >>> Beginning call ...");
+
+        ThreadCreationOptions thrdOpt = new()
+        {
+            InitialMessages = { new(MessageRole.User, ["What should I wear outside right now?"]), },
+        };
+        AsyncCollectionResult<StreamingUpdate> asyncResults = client.CreateThreadAndRunStreamingAsync(assistant, thrdOpt);
+
+        Print(" >>> Starting enumeration ...");
+
+        ThreadRun run = null;
+
+        do
+        {
+            run = null;
+            List<ToolOutput> toolOutputs = new();
+            await foreach (StreamingUpdate update in asyncResults)
+            {
+                string message = update.UpdateKind.ToString();
+
+                if (update is RunUpdate runUpdate)
+                {
+                    message += $" run_id:{runUpdate.Value.Id}";
+                    run = runUpdate.Value;
+                }
+                if (update is RequiredActionUpdate requiredActionUpdate)
+                {
+                    Assert.That(requiredActionUpdate.FunctionName, Is.EqualTo(getWeatherTool.FunctionName));
+                    Assert.That(requiredActionUpdate.GetThreadRun().Status, Is.EqualTo(RunStatus.RequiresAction));
+                    message += $" {requiredActionUpdate.FunctionName}";
+                    toolOutputs.Add(new(requiredActionUpdate.ToolCallId, "warm and sunny"));
+                }
+                if (update is MessageContentUpdate contentUpdate)
+                {
+                    message += $" {contentUpdate.Text}";
+                }
+                Print(message);
+            }
+            if (toolOutputs.Count > 0)
+            {
+                asyncResults = client.SubmitToolOutputsToRunStreamingAsync(run, toolOutputs);
+            }
+        } while (run?.Status.IsTerminal == false);
+    }
+
+    [RecordedTest]
+    public async Task BasicMessageOperationsWork()
+    {
+        // TODO FIXME Can't currently delete messages on AOAI
+        bool aoaiDeleteBugFixed = false;
+
+        AssistantClient client = GetTestClient();
+        AssistantThread thread = await client.CreateThreadAsync();
+        Validate(thread);
+        ThreadMessage message = await client.CreateMessageAsync(thread.Id, MessageRole.User, ["Hello, world!"]);
+        Validate(message);
+        Assert.That(message.CreatedAt, Is.GreaterThan(s_2024));
+        Assert.That(message.Content?.Count, Is.EqualTo(1));
+        Assert.That(message.Content[0], Is.Not.Null);
+        Assert.That(message.Content[0].Text, Is.EqualTo("Hello, world!"));
+
+        if (aoaiDeleteBugFixed)
+        {
+            bool deleted = await client.DeleteMessageAsync(message);
+            Assert.That(deleted, Is.True);
+        }
+
+        message = await client.CreateMessageAsync(thread.Id, MessageRole.User, ["Goodbye, world!"], new MessageCreationOptions()
+        {
+            Metadata =
+            {
+                ["messageMetadata"] = "messageMetadataValue",
+            },
+        });
+        Validate(message);
+        Assert.That(message.Metadata.TryGetValue("messageMetadata", out string metadataValue) && metadataValue == "messageMetadataValue");
+
+        ThreadMessage retrievedMessage = await client.GetMessageAsync(thread.Id, message.Id);
+        Assert.That(retrievedMessage.Id, Is.EqualTo(message.Id));
+
+        message = await client.ModifyMessageAsync(message, new MessageModificationOptions()
+        {
+            Metadata =
+            {
+                ["messageMetadata"] = "newValue",
+            }
+        });
+        Assert.That(message.Metadata.TryGetValue("messageMetadata", out metadataValue) && metadataValue == "newValue");
+
+        var messagePage = await client.GetMessagesAsync(thread).ToListAsync();
+        if (aoaiDeleteBugFixed)
+        {
+            Assert.That(messagePage.Count, Is.EqualTo(1));
+        }
+        else
+        {
+            Assert.That(messagePage.Count, Is.EqualTo(2));
+        }
+
+        Assert.That(messagePage.ElementAt(0).Id, Is.EqualTo(message.Id));
+        Assert.That(messagePage.ElementAt(0).Metadata.TryGetValue("messageMetadata", out metadataValue) && metadataValue == "newValue");
+    }
+
+    [RecordedTest]
+    public async Task ThreadWithInitialMessagesWorks()
+    {
+        const string userGreeting = "Hello, world!";
+        const string userQuestion = "Can you describe why stop signs are the shape and color that they are?";
+
+        AssistantClient client = GetTestClient();
+        ThreadCreationOptions options = new()
+        {
+            InitialMessages =
+            {
+                new ThreadInitializationMessage(MessageRole.User, [userGreeting]),
+                new ThreadInitializationMessage(MessageRole.User, [ userQuestion ])
+                {
+                    Metadata =
+                    {
+                        ["messageMetadata"] = "messageMetadataValue",
+                    },
+                },
+            },
+        };
+        AssistantThread thread = await client.CreateThreadAsync(options);
+        Validate(thread);
+        List<ThreadMessage> messageList = await client.GetMessagesAsync(thread, new()
+            { Order = ListOrder.OldestFirst }).ToListAsync();
+        Assert.That(messageList.Count, Is.EqualTo(2));
+        Assert.That(messageList[0].Role, Is.EqualTo(MessageRole.User));
+        Assert.That(messageList[0].Content?.Count, Is.EqualTo(1));
+        Assert.That(messageList[0].Content[0].Text, Is.EqualTo(userGreeting));
+        Assert.That(messageList[1].Content[0], Is.Not.Null);
+        Assert.That(messageList[1].Content[0].Text, Is.EqualTo(userQuestion));
+    }
+
+    [RecordedTest]
+    public async Task BasicRunOperationsWork()
+    {
+        AssistantClient client = GetTestClient();
+        string modelName = client.DeploymentOrThrow();
+        Assistant assistant = await client.CreateAssistantAsync(modelName);
+        Validate(assistant);
+        AssistantThread thread = await client.CreateThreadAsync();
+        Validate(thread);
+        List<ThreadRun> runPage = await client.GetRunsAsync(thread.Id).ToListAsync();
+        Assert.That(runPage.Count, Is.EqualTo(0));
+        ThreadMessage message = await client.CreateMessageAsync(thread.Id, MessageRole.User, ["Hello, assistant!"]);
+        Validate(message);
+        ThreadRun run = await client.CreateRunAsync(thread.Id, assistant.Id);
+        Validate(run);
+        Assert.That(run.Status, Is.EqualTo(RunStatus.Queued));
+        Assert.That(run.CreatedAt, Is.GreaterThan(s_2024));
+        ThreadRun retrievedRun = await client.GetRunAsync(thread.Id, run.Id);
+        Assert.That(retrievedRun.Id, Is.EqualTo(run.Id));
+        runPage = await client.GetRunsAsync(thread.Id).ToListAsync();
+        Assert.That(runPage.Count, Is.EqualTo(1));
+        Assert.That(runPage.ElementAt(0).Id, Is.EqualTo(run.Id));
+
+        List<ThreadMessage> messages = await client.GetMessagesAsync(thread).ToListAsync();
+        Assert.That(messages.Count, Is.GreaterThanOrEqualTo(1));
+
+        run = await WaitUntilReturnLast(
+            run,
+            () => client.GetRunAsync(run),
+            r => r.Status.IsTerminal);
+        Assert.That(run.Status, Is.EqualTo(RunStatus.Completed));
+
+        Assert.Multiple(() =>
+        {
+            Assert.That(run.Status, Is.EqualTo(RunStatus.Completed));
+            Assert.That(run.CompletedAt, Is.GreaterThan(s_2024));
+            Assert.That(run.RequiredActions, Is.Empty);
+            Assert.That(run.AssistantId, Is.EqualTo(assistant.Id));
+            Assert.That(run.FailedAt, Is.Null);
+            Assert.That(run.IncompleteDetails, Is.Null);
+        });
+        messages = await client.GetMessagesAsync(thread).ToListAsync();
+        Assert.That(messages.Count, Is.EqualTo(2));
+
+        Assert.That(messages.ElementAt(0).Role, Is.EqualTo(MessageRole.Assistant));
+        Assert.That(messages.ElementAt(1).Role, Is.EqualTo(MessageRole.User));
+        Assert.That(messages.ElementAt(1).Id, Is.EqualTo(message.Id));
+    }
+
+    [RecordedTest]
+    public async Task BasicRunStepFunctionalityWorks()
+    {
+        AssistantClient client = GetTestClient();
+        string modelName = client.DeploymentOrThrow();
+        Assistant assistant = await client.CreateAssistantAsync(modelName, new AssistantCreationOptions()
+        {
+            Tools = { new CodeInterpreterToolDefinition() },
+            Instructions = "Call the code interpreter tool when asked to visualize mathematical concepts.",
+        });
+        Validate(assistant);
+
+        AssistantThread thread = await client.CreateThreadAsync(new ThreadCreationOptions()
+        {
+            InitialMessages = { new(MessageRole.User, ["Please graph the equation y = 3x + 4"]), },
+        });
+        Validate(thread);
+
+        ThreadRun run = await client.CreateRunAsync(thread, assistant);
+        Validate(run);
+
+        run = await WaitUntilReturnLast(
+            run,
+            () => client.GetRunAsync(run),
+            r => r.Status.IsTerminal);
+        Assert.That(run.Status, Is.EqualTo(RunStatus.Completed));
+        Assert.That(run.Usage?.TotalTokens, Is.GreaterThan(0));
+
+        List<RunStep> runSteps = await client.GetRunStepsAsync(run).ToListAsync();
+        Assert.That(runSteps.Count(), Is.GreaterThan(1));
+        Assert.Multiple(() =>
+        {
+            Assert.That(runSteps.ElementAt(0).AssistantId, Is.EqualTo(assistant.Id));
+            Assert.That(runSteps.ElementAt(0).ThreadId, Is.EqualTo(thread.Id));
+            Assert.That(runSteps.ElementAt(0).RunId, Is.EqualTo(run.Id));
+            Assert.That(runSteps.ElementAt(0).CreatedAt, Is.GreaterThan(s_2024));
+            Assert.That(runSteps.ElementAt(0).CompletedAt, Is.GreaterThan(s_2024));
+        });
+        RunStepDetails details =
+            runSteps.ElementAt(0).Details;
+        Assert.That(details?.CreatedMessageId, Is.Not.Null.Or.Empty);
+
+        details = runSteps.ElementAt(1).Details;
+        Assert.Multiple(() =>
+        {
+            Assert.That(details?.ToolCalls.Count, Is.GreaterThan(0));
+            Assert.That(details.ToolCalls[0].ToolKind, Is.EqualTo(RunStepToolCallKind.CodeInterpreter));
+            Assert.That(details.ToolCalls[0].ToolCallId, Is.Not.Null.Or.Empty);
+            Assert.That(details.ToolCalls[0].CodeInterpreterInput, Is.Not.Null.Or.Empty);
+            Assert.That(details.ToolCalls[0].CodeInterpreterOutputs?.Count, Is.GreaterThan(0));
+            Assert.That(details.ToolCalls[0].CodeInterpreterOutputs[0].ImageFileId, Is.Not.Null.Or.Empty);
+        });
+    }
+
+    [RecordedTest]
+    public async Task FunctionToolsWork()
+    {
+        AssistantClient client = GetTestClient();
+        string modelName = client.DeploymentOrThrow();
+        Assistant assistant = await client.CreateAssistantAsync(modelName, new AssistantCreationOptions()
+        {
+            Tools =
+            {
+                new FunctionToolDefinition()
+                {
+                    FunctionName = "get_favorite_food_for_day_of_week",
+                    Description = "gets the user's favorite food for a given day of the week, like Tuesday",
+                    Parameters = BinaryData.FromObjectAsJson(new
+                    {
+                        type = "object",
+                        properties = new
+                        {
+                            day_of_week = new
+                            {
+                                type = "string",
+                                description = "a day of the week, like Tuesday or Saturday",
+                            }
+                        }
+                    }),
+                },
+            },
+        });
+        Validate(assistant);
+        Assert.That(assistant.Tools?.Count, Is.EqualTo(1));
+
+        FunctionToolDefinition responseToolDefinition = assistant.Tools[0] as FunctionToolDefinition;
+        Assert.That(responseToolDefinition?.FunctionName, Is.EqualTo("get_favorite_food_for_day_of_week"));
+        Assert.That(responseToolDefinition?.Parameters, Is.Not.Null);
+
+        ThreadRun run = await client.CreateThreadAndRunAsync(
+            assistant,
+            new ThreadCreationOptions()
+            {
+                InitialMessages = { new(MessageRole.User, ["What should I eat on Thursday?"]) },
+            },
+            new RunCreationOptions()
+            {
+                AdditionalInstructions = "Call provided tools when appropriate.",
+            });
+        Validate(run);
+        Console.WriteLine($" Run status right after creation: {run.Status}");
+
+        // TODO FIXME: The underlying OpenAI code doesn't consider the "requires_action" status to be terminal even though it is.
+        // Work around this here
+        run = await WaitUntilReturnLast(
+            run,
+            () => client.GetRunAsync(run),
+            r => r.Status.IsTerminal || r.Status.Equals(RunStatus.RequiresAction));
+
+        Assert.That(run.Status, Is.EqualTo(RunStatus.RequiresAction));
+        Assert.That(run.RequiredActions?.Count, Is.EqualTo(1));
+        Assert.That(run.RequiredActions[0].ToolCallId, Is.Not.Null.Or.Empty);
+        Assert.That(run.RequiredActions[0].FunctionName, Is.EqualTo("get_favorite_food_for_day_of_week"));
+        Assert.That(run.RequiredActions[0].FunctionArguments, Is.Not.Null.Or.Empty);
+
+        run = await client.SubmitToolOutputsToRunAsync(run, [new(run.RequiredActions[0].ToolCallId, "tacos")]);
+        Assert.That(run.Status.IsTerminal, Is.False);
+
+        run = await WaitUntilReturnLast(
+            run,
+            () => client.GetRunAsync(run),
+            r => r.Status.IsTerminal);
+        Assert.That(run.Status, Is.EqualTo(RunStatus.Completed));
+
+        List<ThreadMessage> messages = await client.GetMessagesAsync(run.ThreadId, new() { Order = ListOrder.NewestFirst })
+            .ToListAsync();
+        Assert.That(messages.Count, Is.GreaterThan(1));
+        Assert.That(messages.ElementAt(0).Role, Is.EqualTo(MessageRole.Assistant));
+        Assert.That(messages.ElementAt(0).Content?[0], Is.Not.Null);
+        Assert.That(messages.ElementAt(0).Content?[0].Text, Does.Contain("tacos"));
+    }
+
+    [RecordedTest]
+    public async Task BasicFileSearchWorks()
+    {
+        // First, we need to upload a simple test file.
+        AssistantClient client = GetTestClient();
+        string modelName = client.DeploymentOrThrow();
+        FileClient fileClient = GetTestClientFrom<FileClient>(client);
+
+        OpenAIFileInfo testFile = await fileClient.UploadFileAsync(
+            BinaryData.FromString("""
+                This file describes the favorite foods of several people.
+ + Summanus Ferdinand: tacos + Tekakwitha Effie: pizza + Filip Carola: cake + """), + "favorite_foods.txt", + FileUploadPurpose.Assistants); + Validate(testFile); + + // Create an assistant, using the creation helper to make a new vector store + Assistant assistant = await client.CreateAssistantAsync(modelName, new() + { + Tools = { new FileSearchToolDefinition() }, + ToolResources = new() + { + FileSearch = new() + { + NewVectorStores = + { + new VectorStoreCreationHelper([testFile]), + } + } + } + }); + Validate(assistant); + Assert.That(assistant.ToolResources?.FileSearch?.VectorStoreIds, Has.Count.EqualTo(1)); + string createdVectorStoreId = assistant.ToolResources.FileSearch.VectorStoreIds[0]; + ValidateById(createdVectorStoreId); + + // Modify an assistant to use the existing vector store + assistant = await client.ModifyAssistantAsync(assistant, new AssistantModificationOptions() + { + ToolResources = new() + { + FileSearch = new() + { + VectorStoreIds = { assistant.ToolResources.FileSearch.VectorStoreIds[0] }, + }, + }, + }); + Assert.That(assistant.ToolResources?.FileSearch?.VectorStoreIds, Has.Count.EqualTo(1)); + Assert.That(assistant.ToolResources.FileSearch.VectorStoreIds[0], Is.EqualTo(createdVectorStoreId)); + + // Create a thread with an override vector store + AssistantThread thread = await client.CreateThreadAsync(new ThreadCreationOptions() + { + InitialMessages = { new(MessageRole.User, ["Using the files you have available, what's Filip's favorite food?"]) }, + ToolResources = new() + { + FileSearch = new() + { + NewVectorStores = + { + new VectorStoreCreationHelper([testFile.Id]) + } + } + } + }); + Validate(thread); + Assert.That(thread.ToolResources?.FileSearch?.VectorStoreIds, Has.Count.EqualTo(1)); + createdVectorStoreId = thread.ToolResources.FileSearch.VectorStoreIds[0]; + ValidateById(createdVectorStoreId); + + // Ensure that modifying the thread with an existing vector store works + thread = await client.ModifyThreadAsync(thread, new 
ThreadModificationOptions() + { + ToolResources = new() + { + FileSearch = new() + { + VectorStoreIds = { createdVectorStoreId }, + } + } + }); + Assert.That(thread.ToolResources?.FileSearch?.VectorStoreIds, Has.Count.EqualTo(1)); + Assert.That(thread.ToolResources.FileSearch.VectorStoreIds[0], Is.EqualTo(createdVectorStoreId)); + + ThreadRun run = await client.CreateRunAsync(thread, assistant); + Validate(run); + run = await WaitUntilReturnLast( + run, + () => client.GetRunAsync(run), + r => r.Status.IsTerminal); + Assert.That(run.Status, Is.EqualTo(RunStatus.Completed)); + + AsyncPageCollection<ThreadMessage> messages = client.GetMessagesAsync(thread, new() { Order = ListOrder.NewestFirst }); + int numPages = 0; + int numMessages = 0; + bool hasCake = false; + await foreach (PageResult<ThreadMessage> page in messages) + { + numPages++; + foreach (ThreadMessage message in page.Values) + { + numMessages++; + foreach (MessageContent content in message.Content) + { + Console.WriteLine(content.Text); + hasCake |= content.Text?.ToLowerInvariant().Contains("cake") == true; + foreach (TextAnnotation annotation in content.TextAnnotations) + { + Console.WriteLine($" --> From file: {annotation.InputFileId}, replacement: {annotation.TextToReplace}"); + } + } + } + } + + Assert.That(numPages, Is.GreaterThan(0)); + Assert.That(numMessages, Is.GreaterThan(0)); + Assert.That(hasCake, Is.True); + } + + [RecordedTest] + public async Task StreamingRunWorks() + { + AssistantClient client = GetTestClient(); + string modelName = client.DeploymentOrThrow(); + Assistant assistant = await client.CreateAssistantAsync(modelName); + Validate(assistant); + + AssistantThread thread = await client.CreateThreadAsync(new ThreadCreationOptions() + { + InitialMessages = { new(MessageRole.User, ["Hello there, assistant! How are you today?"]), }, + }); + Validate(thread); + + AsyncCollectionResult<StreamingUpdate> streamingResult = client.CreateRunStreamingAsync(thread.Id, assistant.Id); + + StringBuilder content = new(); + DateTimeOffset? 
lastUpdate = null; + StreamingUpdateReason? lastUpdateReason = null; + + await foreach (StreamingUpdate update in streamingResult) + { + if (update is RunUpdate runUpdate) + { + lastUpdateReason = runUpdate.UpdateKind; + lastUpdate = update.UpdateKind switch + { + StreamingUpdateReason.RunCreated => runUpdate.Value.CreatedAt, + StreamingUpdateReason.RunQueued => runUpdate.Value.StartedAt, + StreamingUpdateReason.RunInProgress => runUpdate.Value.StartedAt, + StreamingUpdateReason.RunCompleted => runUpdate.Value.CompletedAt, + _ => null, + }; + } + if (update is MessageContentUpdate contentUpdate) + { + // TODO FIXME: The OpenAI library code is currently incorrectly returning a MessageRole.User value here. + // It should instead be null or at least Assistant + //Assert.That(contentUpdate.Role, Is.Null.Or.EqualTo(MessageRole.Assistant)); + Assert.That(contentUpdate.Text, Is.Not.Null); // can be empty string + content.Append(contentUpdate.Text); + } + } + + Assert.That(lastUpdateReason, Is.EqualTo(StreamingUpdateReason.RunCompleted)); + Assert.That(lastUpdate, Is.Not.Null.And.GreaterThan(s_2024)); + Assert.That(content, Has.Length.GreaterThan(0)); + } + + private static readonly DateTimeOffset s_2024 = new(2024, 1, 1, 0, 0, 0, TimeSpan.Zero); +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/AudioTests.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/AudioTests.cs new file mode 100644 index 000000000..5312b8083 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/AudioTests.cs @@ -0,0 +1,167 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. 
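For context, the assistants tests above repeatedly poll a run via a `WaitUntilReturnLast` helper until a terminal status (or, as the workaround notes, `requires_action`) is observed. A minimal sketch of that pattern, assuming a simple fixed-delay loop and a generic signature — the real helper lives in the shared test framework and may differ:

```csharp
using System;
using System.Threading.Tasks;

// Hypothetical sketch of the run-polling pattern used by the tests above;
// the actual WaitUntilReturnLast helper in the test framework may differ.
public static class PollingSketch
{
    public static async Task<T> WaitUntilReturnLast<T>(
        T initial,
        Func<Task<T>> refreshAsync,   // re-queries the service, e.g. () => client.GetRunAsync(run)
        Func<T, bool> isDone,         // e.g. r => r.Status.IsTerminal
        int maxAttempts = 120)
    {
        T current = initial;
        for (int attempt = 0; !isDone(current) && attempt < maxAttempts; attempt++)
        {
            await Task.Delay(TimeSpan.FromSeconds(1)); // simple fixed backoff
            current = await refreshAsync();
        }
        return current;
    }
}
```

Taking the completion predicate as a parameter is what lets a caller treat `requires_action` as terminal without changing the helper itself.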
+ +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.IO; +using System.Threading.Tasks; +using OpenAI.Audio; +using OpenAI.TestFramework; + +namespace Azure.AI.OpenAI.Tests; + +public class AudioTests(bool isAsync) : AoaiTestBase<AudioClient>(isAsync) +{ + [Test] + [Category("Smoke")] + public void CanCreateClient() => Assert.That(GetTestClient(), Is.InstanceOf<AudioClient>()); + + [RecordedTest] + public async Task TranscriptionWorks() + { + AudioClient audioClient = GetTestClient(); + AudioTranscription transcription = await audioClient.TranscribeAudioAsync(Assets.HelloWorld.RelativePath); + Assert.That(transcription?.Text, Is.Not.Null.Or.Empty); + } + + [RecordedTest] + public async Task TranslationWorks() + { + AudioClient audioClient = GetTestClient(); + AudioTranslation translation = await audioClient.TranslateAudioAsync(Assets.WhisperFrenchDescription.RelativePath); + Assert.That(translation?.Text, Is.Not.Null.Or.Empty); + } + + [RecordedTest] + public async Task TextToSpeechWorks() + { + AudioClient audioClient = GetTestClient("tts"); + BinaryData ttsData = await audioClient.GenerateSpeechAsync( + "hello, world!", + GeneratedSpeechVoice.Alloy); + Assert.That(ttsData, Is.Not.Null); + } + + [RecordedTest] + [TestCase(AudioTranscriptionFormat.Simple)] + [TestCase(AudioTranscriptionFormat.Verbose)] + [TestCase(AudioTranscriptionFormat.Srt)] + [TestCase(AudioTranscriptionFormat.Vtt)] + [TestCase(null)] + public async Task TranscriptionWorksWithFormat(AudioTranscriptionFormat? 
format) + { + AudioClient client = GetTestClient(); + + var audioInfo = Assets.HelloWorld; + using Stream audioFileStream = File.OpenRead(audioInfo.RelativePath); + AudioTranscriptionOptions options = new() + { + Temperature = 0.25f, + ResponseFormat = format, + }; + + AudioTranscription transcription = await client.TranscribeAudioAsync( + audioFileStream, audioInfo.Name, options); + + Assert.That(transcription, Is.Not.Null); + Assert.That(transcription.Text, Is.Not.Null.Or.Empty); + + if (format == AudioTranscriptionFormat.Simple) + { + Assert.That(transcription.Duration, Is.Null); + Assert.That(transcription.Language, Is.Null); + Assert.That(transcription.Segments, Is.Null.Or.Empty); + } + else if (format == AudioTranscriptionFormat.Verbose) + { + Assert.That(transcription.Duration, Is.GreaterThan(TimeSpan.FromSeconds(0))); + Assert.That(transcription.Language, Is.Not.Null.Or.Empty); + Assert.That(transcription.Segments, Is.Not.Null.Or.Empty); + + TranscribedSegment firstSegment = transcription.Segments[0]; + Assert.That(firstSegment, Is.Not.Null); + Assert.That(firstSegment.Id, Is.EqualTo(0)); + Assert.That(firstSegment.Start, Is.GreaterThanOrEqualTo(TimeSpan.FromSeconds(0))); + Assert.That(firstSegment.End, Is.GreaterThan(firstSegment.Start)); + Assert.That(firstSegment.Text, Is.Not.Null.Or.Empty); + } + } + + [RecordedTest] + [TestCase(AudioTimestampGranularities.Default)] + [TestCase(AudioTimestampGranularities.Word)] + [TestCase(AudioTimestampGranularities.Segment)] + [TestCase(AudioTimestampGranularities.Word | AudioTimestampGranularities.Segment)] + public async Task TranscriptionTimestampGranularitiesWork(AudioTimestampGranularities granularityFlags) + { + AudioClient client = GetTestClient(); + var audioInfo = Assets.HelloWorld; + using Stream audioFileStream = File.OpenRead(audioInfo.RelativePath); + AudioTranscriptionOptions options = new() + { + Granularities = granularityFlags, + ResponseFormat = AudioTranscriptionFormat.Verbose, + }; + ClientResult 
transcriptionResult = await client.TranscribeAudioAsync( + audioFileStream, + audioInfo.Name, + options); + PipelineResponse response = transcriptionResult.GetRawResponse(); + Assert.That(response, Is.Not.Null); + AudioTranscription transcription = transcriptionResult.Value; + Assert.That(transcription.Text, Is.Not.Null.Or.Empty); + Assert.That( + transcription.Words?.Count > 0, + Is.EqualTo(granularityFlags.HasFlag(AudioTimestampGranularities.Word)), + "Word-level information should appear (and only appear) when requested"); + Assert.That( + transcription.Segments?.Count > 0, + Is.EqualTo(granularityFlags.HasFlag(AudioTimestampGranularities.Segment) || granularityFlags == AudioTimestampGranularities.Default), + "Segment-level information should appear (and only appear) when requested or when no flags were provided"); + } + + [RecordedTest] + [TestCase(AudioTranslationFormat.Simple)] + [TestCase(AudioTranslationFormat.Verbose)] + [TestCase(AudioTranslationFormat.Srt)] + [TestCase(AudioTranslationFormat.Vtt)] + [TestCase(null)] + public async Task TranslationWorksWithFormat(AudioTranslationFormat? 
format) + { + AudioClient client = GetTestClient(); + + var audioInfo = Assets.WhisperFrenchDescription; + using Stream audioFileStream = File.OpenRead(audioInfo.RelativePath); + AudioTranslationOptions options = new() + { + ResponseFormat = format, + }; + + AudioTranslation translation = await client.TranslateAudioAsync( + audioFileStream, audioInfo.Name, options); + + Assert.That(translation, Is.Not.Null); + Assert.That(translation.Text, Is.Not.Null.Or.Empty); + + if (format == AudioTranslationFormat.Simple) + { + Assert.That(translation.Duration, Is.Null); + Assert.That(translation.Language, Is.Null); + Assert.That(translation.Segments, Is.Null.Or.Empty); + } + else if (format == AudioTranslationFormat.Verbose) + { + Assert.That(translation.Duration, Is.GreaterThan(TimeSpan.FromSeconds(0))); + Assert.That(translation.Language, Is.Not.Null.Or.Empty); + Assert.That(translation.Segments, Is.Not.Null.Or.Empty); + + TranscribedSegment firstSegment = translation.Segments[0]; + Assert.That(firstSegment, Is.Not.Null); + Assert.That(firstSegment.Id, Is.EqualTo(0)); + Assert.That(firstSegment.Start, Is.GreaterThanOrEqualTo(TimeSpan.FromSeconds(0))); + Assert.That(firstSegment.End, Is.GreaterThan(firstSegment.Start)); + Assert.That(firstSegment.Text, Is.Not.Null.Or.Empty); + } + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Azure.AI.OpenAI.Tests.csproj b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Azure.AI.OpenAI.Tests.csproj new file mode 100644 index 000000000..5cc3cb985 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Azure.AI.OpenAI.Tests.csproj @@ -0,0 +1,61 @@ + + + + $(RequiredTargetFrameworks) + + + $(NoWarn);CS1591;CS8002;SA1402;SA1507;SA1508;SA1633;SA1028;SA1505;OPENAI001;AOAI001 + preview + enable + + + + + + + + + + + + + + + + + + + + + + + + + + + PreserveNewest + + + Never + + + + + + Utils\Polyfill\%(RecursiveDir)\%(Filename).cs + + + + + + + <_Parameter1>TestProjectSourceBasePath + 
<_Parameter2>$(MSBuildThisFileDirectory) + + + + diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/BatchTests.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/BatchTests.cs new file mode 100644 index 000000000..357c022d1 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/BatchTests.cs @@ -0,0 +1,222 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.IO; +using System.Linq; +using System.Net.Http; +using System.Text.Json; +using System.Threading.Tasks; +using Azure.AI.OpenAI.Tests.Models; +using Azure.AI.OpenAI.Tests.Utils; +using Azure.AI.OpenAI.Tests.Utils.Config; +using OpenAI.Batch; +using OpenAI.Chat; +using OpenAI.Embeddings; +using OpenAI.Files; +using OpenAI.TestFramework; +using OpenAI.TestFramework.Mocks; +using OpenAI.TestFramework.Utils; + +namespace Azure.AI.OpenAI.Tests; + +public class BatchTests : AoaiTestBase<BatchClient> +{ + public BatchTests(bool isAsync) : base(isAsync) + { } + + [Test] + [Category("Smoke")] + public void CanCreateClient() => Assert.That(GetTestClient(), Is.InstanceOf<BatchClient>()); + + [RecordedTest] + [Ignore("Azure OpenAI does not yet support batch file uploads")] + public async Task SimpleBatchCompletionsTest() + { + BatchClient batchClient = GetTestClient(new TestClientOptions(AzureOpenAIClientOptions.ServiceVersion.V2024_06_01)); + await using BatchOperations ops = new(this, batchClient); + + // Create the batch operations to send and upload them + ops.ChatClient.CompleteChat([new SystemChatMessage("You are a saccharine AI"), new UserChatMessage("Tell me about yourself")]); + ops.ChatClient.CompleteChat([new UserChatMessage("Give me a large random number")]); + Assert.That(ops.Operations, Has.Count.EqualTo(2)); + string inputFileId = await ops.UploadBatchFileAsync(); + + // Create the batch operation + using var requestContent = new 
BatchOptions() + { + InputFileId = inputFileId, + Endpoint = ops.Operations.Select(o => o.Url).Distinct().First(), + Metadata = + { + [ "description" ] = "Azure OpenAI .Net SDK integration test framework " + nameof(SimpleBatchCompletionsTest), + } + }.ToBinaryContent(); + + ClientResult response = await batchClient.CreateBatchAsync(requestContent); + BatchObject batchObj = ExtractAndValidateBatchObj(response); + + // Poll until we've completed, failed, or were canceled + while ("completed" != batchObj.Status) + { + response = await batchClient.GetBatchAsync(batchObj.Id, new()); + batchObj = ExtractAndValidateBatchObj(response); + } + + Assert.That(batchObj.OutputFileID, Is.Not.Null.Or.Empty); + BinaryData outputData = await ops.DownloadAndValidateResultAsync(batchObj.OutputFileID!); + var parsedOutput = BatchResult.From(outputData); + Assert.That(parsedOutput, Is.Not.Null); + Assert.That(parsedOutput, Has.Count.EqualTo(ops.Operations.Count)); + for (int i = 0; i < parsedOutput.Count; i++) + { + Assert.That(parsedOutput[i].CustomId, Is.EqualTo(ops.Operations[i].CustomId), "Wrong custom ID at index {0}", i); + var completion = parsedOutput[i].Response!; + Assert.That(completion, Is.Not.Null); + Assert.That(completion.Role, Is.EqualTo(ChatMessageRole.Assistant)); + Assert.That(completion.Content, Has.Count.EqualTo(1)); + Assert.That(completion.Content[0].Kind, Is.EqualTo(ChatMessageContentPartKind.Text)); + Assert.That(completion.Content[0].Text, Is.Not.Null.Or.Empty); + } + + } + + #region helper methods + + private BinaryData ValidateHasRawJsonResponse(ClientResult result) + { + Assert.That(result, Is.Not.Null); + PipelineResponse response = result.GetRawResponse(); + Assert.That(response, Is.Not.Null); + Assert.That(response.Status, Is.GreaterThanOrEqualTo(200).And.LessThan(300)); + Assert.That(response.Headers.GetFirstOrDefault("Content-Type"), Does.StartWith("application/json")); + + return response.Content; + } + + private void ValidateBatchResult(BatchObject 
batchObj) + { + Assert.That(batchObj, Is.Not.Null); + Assert.That(batchObj.Id, Is.Not.Null.Or.Empty); + Assert.That(batchObj.Status, Is.Not.Null); + Assert.That(batchObj.Status, Is.AnyOf("validating", "in_progress", "finalizing", "completed")); + } + + private BatchObject ExtractAndValidateBatchObj(ClientResult result) + { + var binaryData = ValidateHasRawJsonResponse(result); + var batchObj = BatchObject.From(binaryData); + ValidateBatchResult(batchObj); + return batchObj; + } + + #endregion + + #region helper classes + + private class BatchOperations : IAsyncDisposable + { + private MockHttpMessageHandler _handler; + private List _operations; + private string? _uploadId; + private FileClient _fileClient; + + public BatchOperations(AoaiTestBase testBase, BatchClient batchClient) + { + _handler = new(MockHttpMessageHandler.ReturnEmptyJson); + _handler.OnRequest += HandleRequest; + _operations = new(); + + BatchFileName = "batch-" + Guid.NewGuid().ToString("D") + ".json"; + + _fileClient = testBase.GetTestClientFrom(batchClient); + + // Generate the fake pipeline to capture requests and save them to a file later + AzureOpenAIClient fakeTopLevel = new AzureOpenAIClient( + new Uri("https://not.a.real.endpoint.fake"), + new ApiKeyCredential("not.a.real.key"), + new() { Transport = _handler.Transport }); + + ChatClient = fakeTopLevel.GetChatClient(testBase.TestConfig.GetConfig().DeploymentOrThrow("chat client")); + EmbeddingClient = fakeTopLevel.GetEmbeddingClient(testBase.TestConfig.GetConfig().DeploymentOrThrow("embedding client")); + } + + public string BatchFileName { get; } + public IReadOnlyList Operations => _operations; + public ChatClient ChatClient { get; } + public EmbeddingClient EmbeddingClient { get; } + + public async Task UploadBatchFileAsync() + { + if (Operations.Count == 0) + { + throw new InvalidOperationException(); + } + + using MemoryStream stream = new MemoryStream(); + JsonHelpers.Serialize(stream, _operations, JsonOptions.OpenAIJsonOptions); + 
stream.Seek(0, SeekOrigin.Begin); + var data = BinaryData.FromStream(stream); + + using var content = BinaryContent.Create(data); + + OpenAIFileInfo file = await _fileClient.UploadFileAsync(data, BatchFileName, FileUploadPurpose.Batch); + _uploadId = file.Id; + Assert.That(_uploadId, Is.Not.Null.Or.Empty); + return _uploadId; + } + + public async Task DownloadAndValidateResultAsync(string outputId) + { + ClientResult response = await _fileClient.DownloadFileAsync(outputId); + Assert.That(response, Is.Not.Null); + Assert.That(response.Value, Is.Not.Null); + return response.Value; + } + + public async ValueTask DisposeAsync() + { + // clean up any files + if (_uploadId != null) + { + await _fileClient.DeleteFileAsync(_uploadId); + } + + _handler.OnRequest -= HandleRequest; + _handler.Dispose(); + _operations.Clear(); + } + + private void HandleRequest(object? sender, CapturedRequest request) + { + JsonElement? element = null; + if (request.Content != null) + { + using var json = JsonDocument.Parse(request.Content.ToMemory()); + element = json.RootElement.Clone(); + } + + BatchOperation operation = new() + { + Method = request.Method, + Url = request.Uri?.AbsolutePath ?? string.Empty, + Body = element + }; + + _operations.Add(operation); + } + + public class BatchOperation + { + public string CustomId { get; } = Guid.NewGuid().ToString(); + public HttpMethod Method { get; init; } = HttpMethod.Get; + public string Url { get; init; } = string.Empty; + public JsonElement? Body { get; init; } + } + } + + #endregion +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/ChatTests.Functions.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/ChatTests.Functions.cs new file mode 100644 index 000000000..ebe881c6a --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/ChatTests.Functions.cs @@ -0,0 +1,280 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. 
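The `BatchOperations` helper above captures each mocked request and later serializes the captured list as the batch input file. A minimal, self-contained sketch of that serialization step, assuming simplified record types — the real `BatchOperation` shape and JSON settings in the test framework may differ:

```csharp
using System.Collections.Generic;
using System.Text;
using System.Text.Json;

// Hypothetical mirror of the BatchOperation capture record used above.
public record BatchOperationSketch(string CustomId, string Method, string Url, JsonElement? Body);

public static class BatchInputSketch
{
    // Batch input is JSON Lines: one serialized operation per line, each
    // carrying a custom_id so results can be matched back to requests.
    public static string ToJsonLines(IEnumerable<BatchOperationSketch> ops)
    {
        var sb = new StringBuilder();
        foreach (var op in ops)
        {
            sb.AppendLine(JsonSerializer.Serialize(new
            {
                custom_id = op.CustomId,
                method = op.Method,
                url = op.Url,
                body = op.Body,
            }));
        }
        return sb.ToString();
    }
}
```

This is also why `SimpleBatchCompletionsTest` asserts that each parsed output's `CustomId` matches the operation at the same index.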
+ +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text; +using System.Text.Json; +using System.Threading.Tasks; +using OpenAI.Chat; +using OpenAI.TestFramework; + +namespace Azure.AI.OpenAI.Tests; + +public partial class ChatTests +{ + [Obsolete] + private static readonly ChatFunction FUNCTION_TEMPERATURE = new( + "get_future_temperature", + "requests the anticipated future temperature at a provided location to help inform advice about topics like choice of attire", + BinaryData.FromString( + """ + { + "type": "object", + "properties": { + "locationName": { + "type": "string", + "description": "the name or brief description of a location for weather information" + }, + "date": { + "type": "string", + "description": "the day, month, and year for which to retrieve weather information" + } + } + } + """)); + + public enum FunctionCallTestType + { + Auto, + None, + Function, + } + + [RecordedTest] + [TestCase(FunctionCallTestType.None)] + [TestCase(FunctionCallTestType.Auto)] + [TestCase(FunctionCallTestType.Function)] + [Obsolete] + public async Task SimpleFunctionCallWorks(FunctionCallTestType functionCallType) + { + ChatClient client = GetTestClient(); + + List<ChatMessage> messages = new() + { + new SystemChatMessage("You are a helpful assistant."), + new UserChatMessage("What should I wear in Honolulu next Thursday?") + }; + var requestOptions = new ChatCompletionOptions() + { + FunctionChoice = functionCallType switch + { + FunctionCallTestType.Auto => ChatFunctionChoice.Auto, + FunctionCallTestType.None => ChatFunctionChoice.None, + FunctionCallTestType.Function => new ChatFunctionChoice(FUNCTION_TEMPERATURE), + _ => throw new NotImplementedException(), + }, + Functions = { FUNCTION_TEMPERATURE }, + MaxTokens = 512, + }; + + ClientResult<ChatCompletion> response = await client.CompleteChatAsync(messages, requestOptions); + Assert.That(response, Is.Not.Null); + + ChatCompletion completion = response.Value; + 
Assert.IsNotNull(completion); + Assert.That(completion.Id, Is.Not.Null.Or.Empty); + + ContentFilterResultForPrompt filter = completion.GetContentFilterResultForPrompt(); + Assert.IsNotNull(filter); + Assert.That(filter.SelfHarm, Is.Not.Null); + Assert.That(filter.SelfHarm.Filtered, Is.False); + Assert.That(filter.SelfHarm.Severity, Is.EqualTo(ContentFilterSeverity.Safe)); + + if (functionCallType == FunctionCallTestType.None) + { + Assert.That(completion.FinishReason, Is.EqualTo(ChatFinishReason.Stop)); + Assert.That(completion.FunctionCall, Is.Null); + + Assert.That(completion.Content, Has.Count.GreaterThan(0)); + Assert.That(completion.Content, Has.All.Not.Null); + + ChatMessageContentPart content = completion.Content[0]; + Assert.That(content.Kind, Is.EqualTo(ChatMessageContentPartKind.Text)); + Assert.That(content.Text, Is.Not.Null.Or.Empty); + + // test complete, as we were merely validating that we didn't get what we shouldn't + return; + } + + // TODO old tests look for stop reason of function_call for both auto and function, but the service currently returns "stop" + // for function + if (functionCallType == FunctionCallTestType.Auto) + { + Assert.That(completion.FinishReason, Is.EqualTo(ChatFinishReason.FunctionCall)); + } + else + { + Assert.That(completion.FinishReason, Is.EqualTo(ChatFinishReason.Stop)); + } + + Assert.That(completion.Content, Has.Count.EqualTo(0)); + + Assert.That(completion.FunctionCall, Is.Not.Null); + Assert.That(completion.FunctionCall.FunctionName, Is.EqualTo(FUNCTION_TEMPERATURE.FunctionName)); + Assert.That(completion.FunctionCall.FunctionArguments, Is.Not.Null); + var parsedArgs = JsonSerializer.Deserialize(completion.FunctionCall.FunctionArguments, SERIALIZER_OPTIONS)!; + Assert.That(parsedArgs, Is.Not.Null); + Assert.That(parsedArgs.LocationName, Is.Not.Null.Or.Empty); + Assert.That(parsedArgs.Date, Is.Not.Null.Or.Empty); + + // Complete the function call + messages.Add(new AssistantChatMessage(completion.FunctionCall)); + 
messages.Add(new FunctionChatMessage(FUNCTION_TEMPERATURE.FunctionName, JsonSerializer.Serialize(new + { + temperature = 31, + unit = "celsius" + }))); + + requestOptions = new() + { + Functions = { FUNCTION_TEMPERATURE }, + MaxTokens = requestOptions.MaxTokens, + }; + + completion = await client.CompleteChatAsync(messages, requestOptions); + Assert.That(completion, Is.Not.Null); + Assert.That(completion.FinishReason, Is.EqualTo(ChatFinishReason.Stop)); + + ContentFilterResultForResponse responseFilter = completion.GetContentFilterResultForResponse(); + Assert.That(responseFilter, Is.Not.Null); + Assert.That(responseFilter.Hate, Is.Not.Null); + Assert.That(responseFilter.Hate.Severity, Is.EqualTo(ContentFilterSeverity.Safe)); + Assert.That(responseFilter.Hate.Filtered, Is.False); + + Assert.That(completion.Content, Has.Count.GreaterThan(0)); + Assert.That(completion.Content[0], Is.Not.Null); + Assert.That(completion.Content[0].Text, Is.Not.Null.Or.Empty); + Assert.That(completion.Content[0].Kind, Is.EqualTo(ChatMessageContentPartKind.Text)); + } + + [RecordedTest] + [TestCase(FunctionCallTestType.None)] + [TestCase(FunctionCallTestType.Auto)] + [TestCase(FunctionCallTestType.Function)] + [Obsolete] + public async Task SimpleFunctionCallWorksStreaming(FunctionCallTestType functionCallType) + { + StringBuilder content = new(); + bool foundPromptFilter = false; + bool foundResponseFilter = false; + string? 
functionName = null; + StringBuilder functionArgs = new(); + + ChatClient client = GetTestClient(); + + List messages = new() + { + new SystemChatMessage("You are a helpful assistant."), + new UserChatMessage("What should I wear in Honolulu next Thursday?") + }; + var requestOptions = new ChatCompletionOptions() + { + FunctionChoice = functionCallType switch + { + FunctionCallTestType.Auto => ChatFunctionChoice.Auto, + FunctionCallTestType.None => ChatFunctionChoice.None, + FunctionCallTestType.Function => new ChatFunctionChoice(FUNCTION_TEMPERATURE), + _ => throw new NotImplementedException(), + }, + Functions = { FUNCTION_TEMPERATURE }, + MaxTokens = 512, + }; + + Action validateUpdate = (update) => + { + Assert.That(update.ContentUpdate, Is.Not.Null); + Assert.That(update.ContentUpdate, Has.All.Not.Null); + + if (update.FunctionCallUpdate != null) + { + Assert.That(update.FunctionCallUpdate.FunctionName, Is.Null.Or.EqualTo(FUNCTION_TEMPERATURE.FunctionName)); + functionName ??= update.FunctionCallUpdate.FunctionName; + + Assert.That(update.FunctionCallUpdate.FunctionArgumentsUpdate, Is.Not.Null); + functionArgs.Append(update.FunctionCallUpdate.FunctionArgumentsUpdate); + } + + foreach (var part in update.ContentUpdate) + { + Assert.That(part.Kind, Is.EqualTo(ChatMessageContentPartKind.Text)); + Assert.That(part.Text, Is.Not.Null); // Could be empty string + + content.Append(part.Text); + } + + var promptFilter = update.GetContentFilterResultForPrompt(); + if (!foundPromptFilter && promptFilter?.Hate != null) + { + Assert.That(promptFilter.Hate.Filtered, Is.False); + Assert.That(promptFilter.Hate.Severity, Is.EqualTo(ContentFilterSeverity.Safe)); + foundPromptFilter = true; + } + + var responseFilter = update.GetContentFilterResultForResponse(); + if (!foundResponseFilter && responseFilter?.Hate != null) + { + Assert.That(responseFilter.Hate.Filtered, Is.False); + Assert.That(responseFilter.Hate.Severity, Is.EqualTo(ContentFilterSeverity.Safe)); + 
foundResponseFilter = true; + } + }; + + AsyncCollectionResult<StreamingChatCompletionUpdate> response = client.CompleteChatStreamingAsync(messages, requestOptions); + Assert.That(response, Is.Not.Null); + + await foreach (StreamingChatCompletionUpdate update in response) + { + validateUpdate(update); + } + + Assert.That(foundPromptFilter, Is.True); + + if (functionCallType != FunctionCallTestType.None) + { + Assert.That(functionName, Is.Not.Null); + var parsedArgs = JsonSerializer.Deserialize<TemperatureFunctionRequestArguments>(functionArgs.ToString(), SERIALIZER_OPTIONS)!; + Assert.That(parsedArgs, Is.Not.Null); + Assert.That(parsedArgs.LocationName, Is.Not.Null.Or.Empty); + Assert.That(parsedArgs.Date, Is.Not.Null.Or.Empty); + + // TODO FIXME: There isn't a clear or obvious way to pass the assistant function message back to the service, and the constructors that allow + // us manual control are internal. So let's use JSON. + var converted = ModelReaderWriter.Read<ChatFunctionCall>(BinaryData.FromString(JsonSerializer.Serialize(new { name = functionName, arguments = functionArgs.ToString() }))); + messages.Add(new AssistantChatMessage(converted)); + messages.Add(new FunctionChatMessage(FUNCTION_TEMPERATURE.FunctionName, JsonSerializer.Serialize(new + { + temperature = 31, + unit = "celsius" + }))); + + requestOptions = new() + { + Functions = { FUNCTION_TEMPERATURE }, + MaxTokens = requestOptions.MaxTokens, + }; + + content.Clear(); + foundPromptFilter = false; + foundResponseFilter = false; + functionName = null; + functionArgs.Clear(); + + response = client.CompleteChatStreamingAsync(messages, requestOptions); + Assert.That(response, Is.Not.Null); + + await foreach (StreamingChatCompletionUpdate update in response) + { + validateUpdate(update); + } + } + + Assert.That(foundPromptFilter, Is.True); + Assert.That(foundResponseFilter, Is.True); + Assert.That(functionName, Is.Null); + Assert.That(functionArgs, Has.Length.EqualTo(0)); + Assert.That(content.ToString(), Is.Not.Null.Or.Empty); + } +} diff --git 
a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/ChatTests.Tools.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/ChatTests.Tools.cs new file mode 100644 index 000000000..77160c89a --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/ChatTests.Tools.cs @@ -0,0 +1,326 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +using System; +using System.ClientModel; +using System.Collections.Generic; +using System.Text; +using System.Text.Json; +using System.Threading.Tasks; +using OpenAI.Chat; +using OpenAI.TestFramework; + +namespace Azure.AI.OpenAI.Tests +{ + public partial class ChatTests + { + private static readonly JsonSerializerOptions SERIALIZER_OPTIONS = new() + { + PropertyNameCaseInsensitive = true, + PropertyNamingPolicy = JsonNamingPolicy.CamelCase + }; + + private static readonly ChatTool TOOL_TEMPERATURE = ChatTool.CreateFunctionTool( + "get_future_temperature", + "requests the anticipated future temperature at a provided location to help inform advice about topics like choice of attire", + BinaryData.FromString( + """ + { + "type": "object", + "properties": { + "locationName": { + "type": "string", + "description": "the name or brief description of a location for weather information" + }, + "date": { + "type": "string", + "description": "the day, month, and year for which to retrieve weather information" + } + } + } + """)); + + private class TemperatureFunctionRequestArguments + { + public string? LocationName { get; set; } + public string? 
Date { get; set; } + } + + public enum ToolChoiceTestType + { + None, + Auto, + Tool, + Required + } + + [RecordedTest] + [TestCase(ToolChoiceTestType.None)] + [TestCase(ToolChoiceTestType.Auto)] + [TestCase(ToolChoiceTestType.Tool)] + [TestCase(ToolChoiceTestType.Required, Ignore = "This seems to be considered invalid")] + public async Task SimpleToolWorks(ToolChoiceTestType toolChoice) + { + ChatClient client = GetTestClient(); + + List<ChatMessage> messages = new() + { + new SystemChatMessage("You are a helpful assistant."), + new UserChatMessage("What should I wear in Honolulu next Thursday?") + }; + var requestOptions = new ChatCompletionOptions() + { + ToolChoice = toolChoice switch + { + ToolChoiceTestType.None => ChatToolChoice.None, + ToolChoiceTestType.Auto => ChatToolChoice.Auto, + ToolChoiceTestType.Tool => new ChatToolChoice(TOOL_TEMPERATURE), + ToolChoiceTestType.Required => ChatToolChoice.Required, + _ => throw new NotImplementedException(), + }, + Tools = { TOOL_TEMPERATURE }, + MaxTokens = 512, + }; + + ClientResult<ChatCompletion> response = await client.CompleteChatAsync(messages, requestOptions); + Assert.That(response, Is.Not.Null); + + ChatCompletion completion = response.Value; + Assert.IsNotNull(completion); + Assert.That(completion.Id, Is.Not.Null.Or.Empty); + + ContentFilterResultForPrompt filter = completion.GetContentFilterResultForPrompt(); + Assert.IsNotNull(filter); + Assert.That(filter.SelfHarm, Is.Not.Null); + Assert.That(filter.SelfHarm.Filtered, Is.False); + Assert.That(filter.SelfHarm.Severity, Is.EqualTo(ContentFilterSeverity.Safe)); + + if (toolChoice == ToolChoiceTestType.None) + { + Assert.That(completion.FinishReason, Is.EqualTo(ChatFinishReason.Stop)); + Assert.That(completion.ToolCalls, Has.Count.EqualTo(0)); + + Assert.That(completion.Content, Has.Count.GreaterThan(0)); + Assert.That(completion.Content, Has.All.Not.Null); + + ChatMessageContentPart content = completion.Content[0]; + Assert.That(content.Kind, 
Is.EqualTo(ChatMessageContentPartKind.Text)); + Assert.That(content.Text, Is.Not.Null.Or.Empty); + + // test complete, as we were merely validating that we didn't get what we shouldn't + return; + } + + // TODO old tests look for stop reason of function_call for both auto and function, but the service currently returns "stop" + // for function + if (toolChoice == ToolChoiceTestType.Auto) + { + Assert.That(completion.FinishReason, Is.EqualTo(ChatFinishReason.ToolCalls)); + } + else + { + Assert.That(completion.FinishReason, Is.EqualTo(ChatFinishReason.Stop)); + } + + Assert.That(completion.Content, Has.Count.EqualTo(0)); + Assert.That(completion.ToolCalls, Has.Count.EqualTo(1)); + Assert.That(completion.ToolCalls, Has.All.Not.Null); + + ChatToolCall toolCall = completion.ToolCalls[0]; + Assert.That(toolCall.Id, Is.Not.Null.Or.Empty); + Assert.That(toolCall.Kind, Is.EqualTo(ChatToolCallKind.Function)); + Assert.That(toolCall.FunctionName, Is.EqualTo(TOOL_TEMPERATURE.FunctionName)); + Assert.That(toolCall.FunctionArguments, Is.Not.Null); + var parsedArgs = JsonSerializer.Deserialize(toolCall.FunctionArguments, SERIALIZER_OPTIONS)!; + Assert.That(parsedArgs, Is.Not.Null); + Assert.That(parsedArgs.LocationName, Is.Not.Null.Or.Empty); + Assert.That(parsedArgs.Date, Is.Not.Null.Or.Empty); + + // Complete the tool call + messages.Add(new AssistantChatMessage([toolCall])); + messages.Add(new ToolChatMessage(toolCall.Id, JsonSerializer.Serialize(new + { + temperature = 31, + unit = "celsius" + }))); + + requestOptions = new() + { + Tools = { TOOL_TEMPERATURE }, + MaxTokens = requestOptions.MaxTokens + }; + + completion = await client.CompleteChatAsync(messages, requestOptions); + Assert.That(completion, Is.Not.Null); + Assert.That(completion.FinishReason, Is.EqualTo(ChatFinishReason.Stop)); + + ContentFilterResultForPrompt promptFilter = completion.GetContentFilterResultForPrompt(); + Assert.That(promptFilter, Is.Not.Null); + Assert.That(promptFilter.Hate, Is.Not.Null); + 
Assert.That(promptFilter.Hate.Severity, Is.EqualTo(ContentFilterSeverity.Safe)); + Assert.That(promptFilter.Hate.Filtered, Is.False); + + ContentFilterResultForResponse responseFilter = completion.GetContentFilterResultForResponse(); + Assert.That(responseFilter, Is.Not.Null); + Assert.That(responseFilter.Hate, Is.Not.Null); + Assert.That(responseFilter.Hate.Severity, Is.EqualTo(ContentFilterSeverity.Safe)); + Assert.That(responseFilter.Hate.Filtered, Is.False); + + Assert.That(completion.Content, Has.Count.GreaterThan(0)); + Assert.That(completion.Content, Has.All.Not.Null); + Assert.That(completion.Content[0].Text, Is.Not.Null.Or.Empty); + Assert.That(completion.Content[0].Kind, Is.EqualTo(ChatMessageContentPartKind.Text)); + } + + [RecordedTest] + [TestCase(ToolChoiceTestType.None)] + [TestCase(ToolChoiceTestType.Auto)] + [TestCase(ToolChoiceTestType.Tool)] + [TestCase(ToolChoiceTestType.Required, Ignore = "This seems to be considered invalid")] + public async Task SimpleToolWorksStreaming(ToolChoiceTestType toolChoice) + { + StringBuilder content = new(); + bool foundPromptFilter = false; + bool foundResponseFilter = false; + string? toolId = null; + string? 
toolName = null; + StringBuilder toolArgs = new(); + + ChatClient client = GetTestClient(); + + List messages = new() + { + new SystemChatMessage("You are a helpful assistant."), + new UserChatMessage("What should I wear in Honolulu next Thursday?") + }; + var requestOptions = new ChatCompletionOptions() + { + ToolChoice = toolChoice switch + { + ToolChoiceTestType.None => ChatToolChoice.None, + ToolChoiceTestType.Auto => ChatToolChoice.Auto, + ToolChoiceTestType.Tool => new ChatToolChoice(TOOL_TEMPERATURE), + ToolChoiceTestType.Required => ChatToolChoice.Required, + _ => throw new NotImplementedException(), + }, + Tools = { TOOL_TEMPERATURE }, + MaxTokens = 512, + }; + + Action validateUpdate = (update) => + { + Assert.That(update.ContentUpdate, Is.Not.Null); + Assert.That(update.ContentUpdate, Has.All.Not.Null); + Assert.That(update.ToolCallUpdates, Is.Not.Null); + Assert.That(update.ToolCallUpdates, Has.All.Not.Null); + + if (update.ToolCallUpdates.Count > 0) + { + Assert.That(update.ToolCallUpdates, Has.Count.EqualTo(1)); + + StreamingChatToolCallUpdate toolUpdate = update.ToolCallUpdates[0]; + Assert.That(toolUpdate.Index, Is.EqualTo(0)); + Assert.That(toolUpdate.Id, Is.Null.Or.Not.Empty); + toolId ??= toolUpdate.Id; + Assert.That(toolUpdate.FunctionName, Is.Null.Or.EqualTo(TOOL_TEMPERATURE.FunctionName)); + toolName ??= toolUpdate.FunctionName; + + Assert.That(toolUpdate.FunctionArgumentsUpdate, Is.Not.Null); + toolArgs.Append(toolUpdate.FunctionArgumentsUpdate); + } + + foreach (var part in update.ContentUpdate) + { + Assert.That(part.Kind, Is.EqualTo(ChatMessageContentPartKind.Text)); + Assert.That(part.Text, Is.Not.Null); // Could be empty string + + content.Append(part.Text); + } + + var promptFilter = update.GetContentFilterResultForPrompt(); + if (!foundPromptFilter && promptFilter?.Hate != null) + { + Assert.That(promptFilter.Hate.Filtered, Is.False); + Assert.That(promptFilter.Hate.Severity, Is.EqualTo(ContentFilterSeverity.Safe)); + foundPromptFilter 
= true; + } + + var responseFilter = update.GetContentFilterResultForResponse(); + if (!foundResponseFilter && responseFilter?.Hate != null) + { + Assert.That(responseFilter.Hate.Filtered, Is.False); + Assert.That(responseFilter.Hate.Severity, Is.EqualTo(ContentFilterSeverity.Safe)); + foundResponseFilter = true; + } + }; + + AsyncCollectionResult response = client.CompleteChatStreamingAsync(messages, requestOptions); + Assert.That(response, Is.Not.Null); + + await foreach (StreamingChatCompletionUpdate update in response) + { + validateUpdate(update); + } + + Assert.That(foundPromptFilter, Is.True); + + if (toolChoice != ToolChoiceTestType.None) + { + Assert.That(content, Has.Length.EqualTo(0)); + Assert.That(toolId, Is.Not.Null); + Assert.That(toolName, Is.Not.Null); + Assert.That(toolArgs, Has.Length.GreaterThan(0)); + var parsedArgs = JsonSerializer.Deserialize(toolArgs.ToString(), SERIALIZER_OPTIONS)!; + Assert.That(parsedArgs, Is.Not.Null); + Assert.That(parsedArgs.LocationName, Is.Not.Null.Or.Empty); + Assert.That(parsedArgs.Date, Is.Not.Null.Or.Empty); + + // Complete the tool call + messages.Add( + new AssistantChatMessage( + [ + ChatToolCall.CreateFunctionToolCall( + toolId, + toolName, + toolArgs.ToString() + ) + ] + ) + ); + messages.Add(new ToolChatMessage(toolId, JsonSerializer.Serialize(new + { + temperature = 31, + unit = "celsius" + }))); + + requestOptions = new() + { + Tools = { TOOL_TEMPERATURE }, + MaxTokens = requestOptions.MaxTokens + }; + + content.Clear(); + foundPromptFilter = false; + foundResponseFilter = false; + toolId = null; + toolName = null; + toolArgs.Clear(); + + response = client.CompleteChatStreamingAsync(messages, requestOptions); + Assert.That(response, Is.Not.Null); + + await foreach (StreamingChatCompletionUpdate update in response) + { + validateUpdate(update); + } + } + + Assert.That(foundPromptFilter, Is.True); + Assert.That(foundResponseFilter, Is.True); + Assert.That(content.ToString(), Is.Not.Null.Or.Empty); + 
Assert.That(toolId, Is.Null); + Assert.That(toolName, Is.Null); + Assert.That(toolArgs, Has.Length.EqualTo(0)); + } + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/ChatTests.Vision.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/ChatTests.Vision.cs new file mode 100644 index 000000000..85399aac4 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/ChatTests.Vision.cs @@ -0,0 +1,137 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +using System; +using System.ClientModel; +using System.IO; +using System.Text; +using System.Threading.Tasks; +using OpenAI.Chat; +using OpenAI.TestFramework; + +namespace Azure.AI.OpenAI.Tests +{ + public partial class ChatTests + { + [RecordedTest] + [TestCase(true)] + [TestCase(false)] + public async Task ChatWithImages(bool useUri) + { + var imageAsset = Assets.DogAndCat; + ChatClient client = GetTestClient("vision"); + + ChatMessageContentPart imagePart; + if (useUri) + { + imagePart = ChatMessageContentPart.CreateImageMessageContentPart( + imageAsset.Url, ImageChatMessageContentPartDetail.Low); + } + else + { + using var stream = File.OpenRead(imageAsset.RelativePath); + var imageData = BinaryData.FromStream(stream); + + imagePart = ChatMessageContentPart.CreateImageMessageContentPart( + imageData, imageAsset.MimeType, ImageChatMessageContentPartDetail.Low); + } + + ChatMessage[] messages = + [ + new SystemChatMessage("You are a helpful assistant that helps describe images."), + new UserChatMessage(imagePart, ChatMessageContentPart.CreateTextMessageContentPart("describe this image")) + ]; + + ChatCompletionOptions options = new() + { + MaxTokens = 2048, + }; + + var response = await client.CompleteChatAsync(messages, options); + Assert.That(response, Is.Not.Null); + + Assert.That(response.Value.Id, Is.Not.Null.Or.Empty); + Assert.That(response.Value.CreatedAt, Is.GreaterThan(START_2024)); + Assert.That(response.Value.FinishReason, 
Is.EqualTo(ChatFinishReason.Stop)); + Assert.That(response.Value.Role, Is.EqualTo(ChatMessageRole.Assistant)); + Assert.That(response.Value.Usage, Is.Not.Null); + Assert.That(response.Value.Usage.InputTokens, Is.GreaterThan(10)); + Assert.That(response.Value.Usage.OutputTokens, Is.GreaterThan(10)); + Assert.That(response.Value.Usage.TotalTokens, Is.GreaterThan(20)); + + Assert.That(response.Value.Content, Has.Count.EqualTo(1)); + ChatMessageContentPart choice = response.Value.Content[0]; + Assert.That(choice.Kind, Is.EqualTo(ChatMessageContentPartKind.Text)); + Assert.That(choice.Text, Is.Not.Null.Or.Empty); + Assert.That(choice.Text.ToLowerInvariant(), Does.Contain("dog").Or.Contain("cat")); + + // TODO FIXME: Some models (e.g. gpt-4o) randomly return prompt filters with some missing entries + var promptFilter = response.Value.GetContentFilterResultForPrompt(); + Assert.That(promptFilter, Is.Not.Null); + //Assert.That(promptFilter.Hate, Is.Not.Null); + //Assert.That(promptFilter.Hate.Filtered, Is.False); + //Assert.That(promptFilter.Hate.Severity, Is.EqualTo(ContentFilterSeverity.Safe)); + + var responseFilter = response.Value.GetContentFilterResultForResponse(); + Assert.That(responseFilter, Is.Not.Null); + Assert.That(responseFilter.Hate, Is.Not.Null); + Assert.That(responseFilter.Hate.Filtered, Is.False); + Assert.That(responseFilter.Hate.Severity, Is.EqualTo(ContentFilterSeverity.Safe)); + } + + [RecordedTest] + [TestCase(true)] + [TestCase(false)] + public async Task ChatWithImagesStreaming(bool useUri) + { + bool foundPromptFilter = false; + bool foundResponseFilter = false; + StringBuilder content = new(); + + ChatClient client = GetTestClient("vision"); + + ChatMessageContentPart imagePart; + var imageAsset = Assets.DogAndCat; + if (useUri) + { + imagePart = ChatMessageContentPart.CreateImageMessageContentPart( + imageAsset.Url, ImageChatMessageContentPartDetail.Low); + } + else + { + using var stream = File.OpenRead(imageAsset.RelativePath); + var
imageData = BinaryData.FromStream(stream); + + imagePart = ChatMessageContentPart.CreateImageMessageContentPart( + imageData, imageAsset.MimeType, ImageChatMessageContentPartDetail.Low); + } + + ChatMessage[] messages = + [ + new SystemChatMessage("You are a helpful assistant that helps describe images."), + new UserChatMessage(imagePart, ChatMessageContentPart.CreateTextMessageContentPart("describe this image")) + ]; + + ChatCompletionOptions options = new() + { + MaxTokens = 2048, + }; + + AsyncCollectionResult<StreamingChatCompletionUpdate> response = client.CompleteChatStreamingAsync(messages, options); + Assert.That(response, Is.Not.Null); + + await foreach (StreamingChatCompletionUpdate update in response) + { + ValidateUpdate(update, content, ref foundPromptFilter, ref foundResponseFilter); + } + + // TODO FIXME: gpt-4o models seem to return inconsistent prompt filters, so skip this for now + //Assert.That(foundPromptFilter, Is.True); + Assert.That(foundResponseFilter, Is.True); + Assert.That(content, Has.Length.GreaterThan(0)); + + string c = content.ToString().ToLowerInvariant(); + Assert.That(c, Does.Contain("dog").Or.Contain("cat")); + } + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/ChatTests.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/ChatTests.cs new file mode 100644 index 000000000..8e376ecb4 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/ChatTests.cs @@ -0,0 +1,575 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License.
+ +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Diagnostics; +using System.Linq; +using System.Net.Http; +using System.Reflection; +using System.Text; +using System.Threading.Tasks; +using Azure.AI.OpenAI.Chat; +using Azure.AI.OpenAI.Tests.Utils.Config; +using OpenAI.Chat; +using OpenAI.TestFramework; +using OpenAI.TestFramework.Mocks; +using OpenAI.TestFramework.Utils; + +namespace Azure.AI.OpenAI.Tests; + +public partial class ChatTests : AoaiTestBase +{ + public ChatTests(bool isAsync) : base(isAsync) + { } + + #region General tests + + [Test] + [Category("Smoke")] + public async Task DefaultUserAgentStringWorks() + { + using MockHttpMessageHandler pipeline = new(MockHttpMessageHandler.ReturnEmptyJson); + + Uri endpoint = new Uri("https://www.bing.com/"); + string apiKey = "not-a-real-one"; + string model = "ignore"; + + AzureOpenAIClient topLevel = new( + endpoint, + new ApiKeyCredential(apiKey), + new AzureOpenAIClientOptions() + { + Transport = pipeline.Transport + }); + + ChatClient client = WrapClient(topLevel.GetChatClient(model)); + + await client.CompleteChatAsync([new UserChatMessage("Hello")]); + + Assert.That(pipeline.Requests, Is.Not.Empty); + + var request = pipeline.Requests[0]; + Assert.That(request.Method, Is.EqualTo(HttpMethod.Post)); + Assert.That(request.Uri?.GetLeftPart(UriPartial.Authority), Is.EqualTo(endpoint.GetLeftPart(UriPartial.Authority))); + Assert.That(request.Headers.GetValueOrDefault("api-key")?.FirstOrDefault(), Is.EqualTo(apiKey)); + Assert.That(request.Headers.GetValueOrDefault("User-Agent")?.FirstOrDefault(), Does.Contain("azsdk-net-AI.OpenAI/")); + Assert.That(request.Content, Is.Not.Null); + var jsonString = request.Content.ToString(); + Assert.That(jsonString, Is.Not.Null.Or.Empty); + Assert.That(jsonString, Does.Contain("\"messages\"").And.Contain("\"model\"").And.Contain(model)); + } + + [Test] + [Category("Smoke")] + public void 
DataSourceSerializationWorks() + { + AzureSearchChatDataSource source = new() + { + Endpoint = new Uri("https://some-search-resource.azure.com"), + Authentication = DataSourceAuthentication.FromApiKey("test-api-key"), + IndexName = "index-name-here", + FieldMappings = new() + { + ContentFieldNames = { "hello" }, + TitleFieldName = "hi", + }, + AllowPartialResult = true, + QueryType = DataSourceQueryType.Simple, + OutputContextFlags = DataSourceOutputContextFlags.AllRetrievedDocuments | DataSourceOutputContextFlags.Citations, + VectorizationSource = DataSourceVectorizer.FromEndpoint( + new Uri("https://my-embedding.com"), + DataSourceAuthentication.FromApiKey("embedding-api-key")), + }; + dynamic serialized = ModelReaderWriter.Write(source).ToDynamicFromJson(); + Assert.That(serialized?.type?.ToString(), Is.EqualTo("azure_search")); + Assert.That(serialized?.parameters?.authentication?.type?.ToString(), Is.EqualTo("api_key")); + Assert.That(serialized?.parameters?.authentication?.key?.ToString(), Does.Contain("test")); + Assert.That(serialized?.parameters?.index_name?.ToString(), Is.EqualTo("index-name-here")); + Assert.That(serialized?.parameters?.fields_mapping?.content_fields?[0]?.ToString(), Is.EqualTo("hello")); + Assert.That(serialized?.parameters?.fields_mapping?.title_field?.ToString(), Is.EqualTo("hi")); + Assert.That(bool.TryParse(serialized?.parameters?.allow_partial_result?.ToString(), out bool parsed) && parsed == true); + Assert.That(serialized?.parameters?.query_type?.ToString(), Is.EqualTo("simple")); + Assert.That(serialized?.parameters?.include_contexts?[0]?.ToString(), Is.EqualTo("citations")); + Assert.That(serialized?.parameters?.include_contexts?[1]?.ToString(), Is.EqualTo("all_retrieved_documents")); + Assert.That(serialized?.parameters?.embedding_dependency?.type?.ToString(), Is.EqualTo("endpoint")); + + ChatCompletionOptions options = new(); + options.AddDataSource(new ElasticsearchChatDataSource() + { + Authentication = 
DataSourceAuthentication.FromAccessToken("foo-token"), + Endpoint = new Uri("https://my-elasticsearch.com"), + IndexName = "my-index-name", + InScope = true, + }); + + IReadOnlyList<ChatDataSource> sourcesFromOptions = options.GetDataSources(); + Assert.That(sourcesFromOptions, Has.Count.EqualTo(1)); + Assert.That(sourcesFromOptions[0], Is.InstanceOf<ElasticsearchChatDataSource>()); + Assert.That(((ElasticsearchChatDataSource)sourcesFromOptions[0]).IndexName, Is.EqualTo("my-index-name")); + + options.AddDataSource(new AzureCosmosDBChatDataSource() + { + Authentication = DataSourceAuthentication.FromApiKey("api-key"), + ContainerName = "my-container-name", + DatabaseName = "my_database_name", + FieldMappings = new() + { + ContentFieldNames = { "hello", "world" }, + }, + IndexName = "my-index-name", + VectorizationSource = DataSourceVectorizer.FromDeploymentName("my-deployment"), + }); + sourcesFromOptions = options.GetDataSources(); + Assert.That(sourcesFromOptions, Has.Count.EqualTo(2)); + Assert.That(sourcesFromOptions[1], Is.InstanceOf<AzureCosmosDBChatDataSource>()); + } + + [RecordedTest] + public async Task ChatCompletionBadKeyGivesHelpfulError() + { + string mockKey = "not-a-valid-key-and-should-still-be-sanitized"; + + try + { + ChatClient chatClient = GetTestClient(keyCredential: new ApiKeyCredential(mockKey)); + _ = await chatClient.CompleteChatAsync([new UserChatMessage("oops, this won't work with that key!")]); + Assert.Fail("No exception was thrown"); + } + catch (Exception thrownException) + { + Assert.That(thrownException, Is.InstanceOf<ClientResultException>()); + Assert.That(thrownException.Message, Does.Contain("invalid subscription key")); + Assert.That(thrownException.Message, Does.Not.Contain(mockKey)); + } + } + + [RecordedTest] + [Category("Smoke")] + public async Task DefaultAzureCredentialWorks() + { + ChatClient chatClient = GetTestClient(tokenCredential: this.TestEnvironment.Credential); + ChatCompletion chatCompletion = await chatClient.CompleteChatAsync([ChatMessage.CreateUserMessage("Hello, world!")]); + Assert.That(chatCompletion,
Is.Not.Null); + Assert.That(chatCompletion.Content, Is.Not.Null.Or.Empty); + Assert.That(chatCompletion.Content[0].Text, Is.Not.Null.Or.Empty); + } + + [RecordedTest] + [Ignore("Delay behavior not emulated by recordings, and needs to be run manually with some time in between iterations due to service throttling behaviour")] + [TestCase("x-ms-retry-after-ms", "1000", 1000)] + [TestCase("retry-after-ms", "1400", 1400)] + [TestCase("Retry-After", "1", 1000)] + [TestCase("Retry-After", "1.5", 1500)] + [TestCase("retry-after-ms", "200", 200)] + [TestCase("x-fake-test-retry-header", "1400", 800)] + public async Task RateLimitedRetryWorks(string headerName, string headerValue, double expectedDelayMilliseconds) + { + const string responseClass = "HttpClientTransportResponse"; + const string responseField = "_httpResponse"; + IConfiguration testConfig = TestConfig.GetConfig("rate_limited_chat")!; + Assert.That(testConfig, Is.Not.Null); + + int failureCount = 0; + string? clientRequestId = null; + + TestPipelinePolicy replaceHeadersPolicy = new( + requestAction: (request) => + { + clientRequestId ??= request.Headers.GetFirstOrDefault("x-ms-client-request-id"); + }, + responseAction: (response) => + { + if (response.Status != 200) + { + failureCount++; + + Type httpPipelineResponseType = typeof(HttpClientPipelineTransport).GetNestedType(responseClass, BindingFlags.NonPublic) + ?? throw new InvalidOperationException($"Could not find the expected {responseClass} inner non-public class"); + FieldInfo httpResponseField = httpPipelineResponseType.GetField(responseField, BindingFlags.Instance | BindingFlags.NonPublic) + ?? throw new InvalidOperationException($"Could not find the expected {responseClass}.{responseField} field"); + HttpResponseMessage httpResponse = httpResponseField.GetValue(response) as HttpResponseMessage + ?? throw new InvalidOperationException($"Could not determine the HttpResponseMessage to modify"); + + httpResponse.Headers.Remove("x-ms-retry-after-ms"); + httpResponse.Headers.Remove("retry-after-ms"); + httpResponse.Headers.Remove("Retry-After"); + httpResponse.Headers.TryAddWithoutValidation(headerName, headerValue); + } + }); + + TestClientOptions options = new(); + options.AddPolicy(replaceHeadersPolicy, PipelinePosition.PerTry); + + ChatClient client = GetTestClient(testConfig, options); + + BinaryContent requestContent = BinaryContent.Create(BinaryData.FromString($$""" + { + "model": "{{testConfig.Deployment}}", + "messages": [ + { "role": "user", "content": "Write three haikus about tropical fruit." } + ] + } + """)); + RequestOptions noThrowOptions = new() { ErrorOptions = ClientErrorBehaviors.NoThrow }; + + TimeSpan? observed200Delay = null; + TimeSpan? observed429Delay = null; + + for (int i = 0; i < 4 && !observed429Delay.HasValue; i++) + { + Stopwatch requestWatch = Stopwatch.StartNew(); + ClientResult protocolResult = await client.CompleteChatAsync(requestContent, noThrowOptions); + PipelineResponse response = protocolResult.GetRawResponse(); + bool responseHasRequestId = response.Headers.TryGetValue("x-ms-client-request-id", out string?
requestIdFromResponse); + Assert.That(responseHasRequestId, Is.True); + Assert.That(requestIdFromResponse, Is.EqualTo(clientRequestId)); + switch (response.Status) + { + case 200: + observed200Delay = requestWatch.Elapsed; + break; + case 429: + observed429Delay = requestWatch.Elapsed; + break; + default: + Assert.Fail(); + break; + } + clientRequestId = null; + } + + Assert.That(observed200Delay.HasValue, Is.True); + Assert.That(observed429Delay.HasValue, Is.True); + Assert.That(failureCount, Is.EqualTo(4)); + Assert.That(observed429Delay!.Value.TotalMilliseconds, Is.GreaterThan(expectedDelayMilliseconds)); + Assert.That(observed429Delay!.Value.TotalMilliseconds, Is.LessThan(3 * expectedDelayMilliseconds + 2 * observed200Delay!.Value.TotalMilliseconds)); + } + + #endregion + + #region Regular chat completions tests + + [RecordedTest] + public async Task ChatCompletion() + { + ChatClient chatClient = GetTestClient(); + ClientResult chatCompletion = await chatClient.CompleteChatAsync([new UserChatMessage("hello, world!")]); + Assert.That(chatCompletion, Is.Not.Null); + Assert.That(chatCompletion.Value, Is.Not.Null); + Assert.That(chatCompletion.Value, Is.InstanceOf()); + Assert.That(chatCompletion.Value.Content, Is.Not.Null.Or.Empty); + } + + [RecordedTest] + public async Task ChatCompletionWithHistoryAndLogProbabilities() + { + ChatClient client = GetTestClient(); + + ChatCompletion response = await client.CompleteChatAsync( + [ + new SystemChatMessage("You are a helpful assistant."), + new UserChatMessage("I am baking a pizza, can you help me?"), + new AssistantChatMessage("Of course, I'd be happy to help! What do you need assistance with? 
Do you need a recipe, cooking time and temperature suggestions, topping ideas, or something else?"), + new UserChatMessage("What temperature should I bake at?") + ], + new ChatCompletionOptions() + { + IncludeLogProbabilities = true, + TopLogProbabilityCount = 3 + }); + + Assert.That(response, Is.Not.Null); + Assert.That(response.Id, Is.Not.Null.Or.Empty); + Assert.That(response.CreatedAt, Is.GreaterThan(new DateTimeOffset(2024, 01, 01, 00, 00, 00, TimeSpan.Zero))); + Assert.That(response.FinishReason, Is.Not.Null.Or.Empty); + Assert.That(response.Content, Is.Not.Null.Or.Empty); + Assert.That(response.Content.Count, Is.EqualTo(1)); + Assert.That(response.Usage, Is.Not.Null); + Assert.That(response.Usage.InputTokens, Is.GreaterThan(10)); + Assert.That(response.Usage.OutputTokens, Is.GreaterThan(10)); + Assert.That(response.Usage.TotalTokens, Is.GreaterThan(20)); + Assert.That(response.ContentTokenLogProbabilities, Is.Not.Null.Or.Empty); + foreach (var logProb in response.ContentTokenLogProbabilities) + { + Assert.That(logProb, Is.Not.Null); + Assert.That(logProb.TopLogProbabilities, Is.Not.Null.Or.Empty); + Assert.That(logProb.TopLogProbabilities.Count, Is.EqualTo(3)); + } + + ChatMessageContentPart content = response.Content[0]; + Assert.That(content.Kind, Is.EqualTo(ChatMessageContentPartKind.Text)); + Assert.That(content.Text, Is.Not.Null.Or.Empty); + Assert.That(content.Text, Does + .Contain("Fahrenheit") + .Or.Contain("Celsius") + .Or.Contain("F") + .Or.Contain("C") + .Or.Contain("oven")); + } + + [RecordedTest] + public async Task ChatCompletionWithTextFormat() + { + ChatClient client = GetTestClient(); + ChatCompletionOptions options = new() + { + ResponseFormat = ChatResponseFormat.Text + }; + + ChatCompletion response = await client.CompleteChatAsync([new UserChatMessage("Give me a random number")], options); + Assert.That(response, Is.Not.Null); + Assert.That(response.Content, Is.Not.Null.Or.Empty); + Assert.That(response.Content[0].Text, 
Is.Not.Null.Or.Empty); + } + + [RecordedTest] + public async Task ChatCompletionContentFilter() + { + ChatClient client = GetTestClient(); + ClientResult chatCompletionResult = await client.CompleteChatAsync([ChatMessage.CreateUserMessage("Hello, world!")]); + Console.WriteLine($"--- RESPONSE ---"); + ChatCompletion chatCompletion = chatCompletionResult; + ContentFilterResultForPrompt promptFilterResult = chatCompletion.GetContentFilterResultForPrompt(); + Assert.That(promptFilterResult, Is.Not.Null); + Assert.That(promptFilterResult.Sexual?.Filtered, Is.False); + Assert.That(promptFilterResult.Sexual?.Severity, Is.EqualTo(ContentFilterSeverity.Safe)); + ContentFilterResultForResponse responseFilterResult = chatCompletion.GetContentFilterResultForResponse(); + Assert.That(responseFilterResult, Is.Not.Null); + Assert.That(responseFilterResult.Hate?.Severity, Is.EqualTo(ContentFilterSeverity.Safe)); + Assert.That(responseFilterResult.ProtectedMaterialCode, Is.Null); + } + + [RecordedTest] + public async Task SearchExtensionWorks() + { + var searchConfig = TestConfig.GetConfig("search")!; + Assert.That(searchConfig, Is.Not.Null); + string searchIndex = searchConfig.GetValueOrThrow("index"); + + AzureSearchChatDataSource source = new() + { + Endpoint = searchConfig.Endpoint, + Authentication = DataSourceAuthentication.FromApiKey(searchConfig.Key), + IndexName = searchIndex, + AllowPartialResult = true, + QueryType = DataSourceQueryType.Simple, + }; + ChatCompletionOptions options = new(); + options.AddDataSource(source); + + ChatClient client = GetTestClient(); + + ClientResult chatCompletionResult = await client.CompleteChatAsync( + [new UserChatMessage("What does the term 'PR complete' mean?")], + options); + Assert.That(chatCompletionResult, Is.Not.Null); + + ChatCompletion chatCompletion = chatCompletionResult.Value; + Assert.That(chatCompletion, Is.Not.Null); + Assert.That(chatCompletion.FinishReason, Is.EqualTo(ChatFinishReason.Stop)); + 
Assert.That(chatCompletion.Content, Is.Not.Null.Or.Empty); + + var content = chatCompletion.Content[0]; + Assert.That(content.Kind, Is.EqualTo(ChatMessageContentPartKind.Text)); + Assert.That(content.Text, Is.Not.Null.Or.Empty); + + AzureChatMessageContext context = chatCompletion.GetAzureMessageContext(); + Assert.IsNotNull(context); + Assert.That(context.Intent, Is.Not.Null.Or.Empty); + Assert.That(context.Citations, Has.Count.GreaterThan(0)); + Assert.That(context.Citations[0].Filepath, Is.Not.Null.Or.Empty); + Assert.That(context.Citations[0].Content, Is.Not.Null.Or.Empty); + Assert.That(context.Citations[0].ChunkId, Is.Not.Null.Or.Empty); + Assert.That(context.Citations[0].Title, Is.Not.Null.Or.Empty); + } + + #endregion + + #region Streaming chat completion tests + + [RecordedTest] + public async Task ChatCompletionBadKeyGivesHelpfulErrorStreaming() + { + string mockKey = "not-a-valid-key-and-should-still-be-sanitized"; + + try + { + ChatClient chatClient = GetTestClient(keyCredential: new ApiKeyCredential(mockKey)); + var messages = new[] { new UserChatMessage("oops, this won't work with that key!") }; + + AsyncCollectionResult<StreamingChatCompletionUpdate> result = chatClient.CompleteChatStreamingAsync(messages); + await foreach (StreamingChatCompletionUpdate update in result) + { + Assert.Fail("No exception was thrown"); + } + + Assert.Fail("No exception was thrown"); + } + catch (Exception thrownException) + { + Assert.That(thrownException, Is.InstanceOf<ClientResultException>()); + Assert.That(thrownException.Message, Does.Contain("invalid subscription key")); + Assert.That(thrownException.Message, Does.Not.Contain(mockKey)); + } + } + + [RecordedTest] + public async Task ChatCompletionStreaming() + { + StringBuilder builder = new(); + bool foundPromptFilter = false; + bool foundResponseFilter = false; + + ChatClient chatClient = GetTestClient(); + + ChatMessage[] messages = + [ + new SystemChatMessage("You are a curmudgeon"), + new UserChatMessage("Hello, assistant!") + ]; + ChatCompletionOptions options =
new() + { + MaxTokens = 512, + IncludeLogProbabilities = true, + TopLogProbabilityCount = 1, + }; + + AsyncCollectionResult streamingResults = chatClient.CompleteChatStreamingAsync(messages, options); + Assert.That(streamingResults, Is.Not.Null); + + await foreach (StreamingChatCompletionUpdate update in streamingResults) + { + ValidateUpdate(update, builder, ref foundPromptFilter, ref foundResponseFilter); + } + + string allText = builder.ToString(); + Assert.That(allText, Is.Not.Null.Or.Empty); + + Assert.That(foundPromptFilter, Is.True); + Assert.That(foundResponseFilter, Is.True); + } + + [RecordedTest] + public async Task SearchExtensionWorksStreaming() + { + StringBuilder builder = new(); + bool foundPromptFilter = false; + bool foundResponseFilter = false; + List contexts = new(); + + var searchConfig = TestConfig.GetConfig("search")!; + Assert.That(searchConfig, Is.Not.Null); + string searchIndex = searchConfig.GetValueOrThrow("index"); + + AzureSearchChatDataSource source = new() + { + Endpoint = searchConfig.Endpoint, + Authentication = DataSourceAuthentication.FromApiKey(searchConfig.Key), + IndexName = searchIndex, + AllowPartialResult = true, + QueryType = DataSourceQueryType.Simple, + }; + + ChatCompletionOptions options = new(); + options.AddDataSource(source); + + ChatMessage[] messages = [new UserChatMessage("What does the term 'PR complete' mean?")]; + + ChatClient client = GetTestClient(); + + AsyncCollectionResult chatUpdates = client.CompleteChatStreamingAsync(messages, options); + Assert.IsNotNull(chatUpdates); + + await foreach (StreamingChatCompletionUpdate update in chatUpdates) + { + ValidateUpdate(update, builder, ref foundPromptFilter, ref foundResponseFilter); + + AzureChatMessageContext context = update.GetAzureMessageContext(); + if (context != null) + { + contexts.Add(context); + } + } + + string allText = builder.ToString(); + Assert.That(allText, Is.Not.Null.Or.Empty); + + // TODO FIXME: When using data sources, the service does 
not appear to return request nor response filtering information
+        //Assert.That(foundPromptFilter, Is.True);
+        //Assert.That(foundResponseFilter, Is.True);
+
+        Assert.That(contexts, Has.Count.EqualTo(1));
+        Assert.That(contexts[0].Intent, Is.Not.Null.And.Not.Empty);
+        Assert.That(contexts[0].Citations, Has.Count.GreaterThan(0));
+        Assert.That(contexts[0].Citations[0].Content, Is.Not.Null.And.Not.Empty);
+        Assert.That(contexts[0].Citations[0].Filepath, Is.Not.Null.And.Not.Empty);
+        Assert.That(contexts[0].Citations[0].ChunkId, Is.Not.Null.And.Not.Empty);
+        Assert.That(contexts[0].Citations[0].Title, Is.Not.Null.And.Not.Empty);
+    }
+
+    #endregion
+
+    #region Helper methods
+
+    private void ValidateUpdate(StreamingChatCompletionUpdate update, StringBuilder builder, ref bool foundPromptFilter, ref bool foundResponseFilter)
+    {
+        if (update.CreatedAt == UNIX_EPOCH)
+        {
+            // This is the first message that usually contains the service's request content filtering results
+            ContentFilterResultForPrompt promptFilter = update.GetContentFilterResultForPrompt();
+            if (promptFilter?.SelfHarm != null)
+            {
+                Assert.That(promptFilter.SelfHarm.Filtered, Is.False);
+                Assert.That(promptFilter.SelfHarm.Severity, Is.EqualTo(ContentFilterSeverity.Safe));
+                foundPromptFilter = true;
+            }
+        }
+        else
+        {
+            Assert.That(update.Id, Is.Not.Null.And.Not.Empty);
+            Assert.That(update.CreatedAt, Is.GreaterThan(new DateTimeOffset(2024, 01, 01, 00, 00, 00, TimeSpan.Zero)));
+            Assert.That(update.FinishReason, Is.Null.Or.EqualTo(ChatFinishReason.Stop));
+            if (update.Usage != null)
+            {
+                Assert.That(update.Usage.InputTokens, Is.GreaterThanOrEqualTo(0));
+                Assert.That(update.Usage.OutputTokens, Is.GreaterThanOrEqualTo(0));
+                Assert.That(update.Usage.TotalTokens, Is.GreaterThanOrEqualTo(0));
+            }
+
+            Assert.That(update.Model, Is.Not.Null);
+            Assert.That(update.Role, Is.Null.Or.EqualTo(ChatMessageRole.Assistant));
+            Assert.That(update.ContentUpdate, Is.Not.Null);
+
+            Assert.That(update.ContentTokenLogProbabilities, Is.Not.Null);
+            foreach (var logProb in update.ContentTokenLogProbabilities)
+            {
+                Assert.That(logProb.TopLogProbabilities, Is.Not.Null);
+                Assert.That(logProb.TopLogProbabilities.Count, Is.EqualTo(1));
+            }
+
+            foreach (ChatMessageContentPart part in update.ContentUpdate)
+            {
+                Assert.That(part.Kind, Is.EqualTo(ChatMessageContentPartKind.Text));
+                Assert.That(part.Text, Is.Not.Null);
+
+                builder.Append(part.Text);
+            }
+
+            if (!foundResponseFilter)
+            {
+                ContentFilterResultForResponse responseFilter = update.GetContentFilterResultForResponse();
+                if (responseFilter?.Violence != null)
+                {
+                    Assert.That(responseFilter.Violence.Filtered, Is.False);
+                    Assert.That(responseFilter.Violence.Severity, Is.EqualTo(ContentFilterSeverity.Safe));
+                    foundResponseFilter = true;
+                }
+            }
+        }
+    }
+
+    #endregion
+}
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/EmbeddingTests.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/EmbeddingTests.cs
new file mode 100644
index 000000000..771d59fbb
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/EmbeddingTests.cs
@@ -0,0 +1,27 @@
+// Copyright (c) Microsoft Corporation. All rights reserved.
+// Licensed under the MIT License.
+
+using System.ClientModel;
+using System.Threading.Tasks;
+using OpenAI.Embeddings;
+using OpenAI.TestFramework;
+
+namespace Azure.AI.OpenAI.Tests;
+
+public class EmbeddingTests : AoaiTestBase<EmbeddingClient>
+{
+    public EmbeddingTests(bool isAsync) : base(isAsync)
+    { }
+
+    [Test]
+    [Category("Smoke")]
+    public void CanCreateClient() => Assert.That(GetTestClient(), Is.InstanceOf<EmbeddingClient>());
+
+    [RecordedTest]
+    public async Task SimpleEmbeddingWithTopLevelClient()
+    {
+        EmbeddingClient embeddingClient = GetTestClient();
+        ClientResult<Embedding> embeddingResult = await embeddingClient.GenerateEmbeddingAsync("sample text to embed");
+        Assert.That(embeddingResult?.Value?.Vector.Length, Is.GreaterThan(0));
+    }
+}
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/FileTests.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/FileTests.cs
new file mode 100644
index 000000000..48b533798
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/FileTests.cs
@@ -0,0 +1,40 @@
+// Copyright (c) Microsoft Corporation. All rights reserved.
+// Licensed under the MIT License.
+
+using System;
+using System.Threading.Tasks;
+using OpenAI.Files;
+using OpenAI.TestFramework;
+
+namespace Azure.AI.OpenAI.Tests;
+
+public class FileTests : AoaiTestBase<FileClient>
+{
+    public FileTests(bool isAsync) : base(isAsync)
+    { }
+
+    [Test]
+    [Category("Smoke")]
+    public void CanCreateClient() => Assert.That(GetTestClient(), Is.InstanceOf<FileClient>());
+
+    [RecordedTest]
+    public async Task CanUploadAndDeleteFiles()
+    {
+        FileClient client = GetTestClient();
+        OpenAIFileInfo file = await client.UploadFileAsync(
+            BinaryData.FromString("hello, world!"),
+            "test_file_delete_me.txt",
+            FileUploadPurpose.Assistants);
+        Validate(file);
+        bool deleted = await client.DeleteFileAsync(file.Id);
+        Assert.That(deleted, Is.True);
+    }
+
+    [RecordedTest]
+    public async Task CanListFiles()
+    {
+        FileClient client = GetTestClient();
+        OpenAIFileInfoCollection files = await client.GetFilesAsync();
+        Assert.That(files, Has.Count.GreaterThan(0));
+    }
+}
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/FineTuningTests.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/FineTuningTests.cs
new file mode 100644
index 000000000..b77689edf
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/FineTuningTests.cs
@@ -0,0 +1,416 @@
+// Copyright (c) Microsoft Corporation. All rights reserved.
+// Licensed under the MIT License.
+ +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Linq; +using System.Text.Json; +using System.Threading.Tasks; +using Azure.AI.OpenAI.FineTuning; +using Azure.AI.OpenAI.Tests.Models; +using Azure.AI.OpenAI.Tests.Utils; +using Azure.AI.OpenAI.Tests.Utils.Config; +using OpenAI.Chat; +using OpenAI.Files; +using OpenAI.FineTuning; +using OpenAI.TestFramework; +using OpenAI.TestFramework.Utils; + +namespace Azure.AI.OpenAI.Tests; + +public class FineTuningTests : AoaiTestBase +{ + public FineTuningTests(bool isAsync) : base(isAsync) + { } + + [Test] + [Category("Smoke")] + public void CanCreateClient() => Assert.That(GetTestClient(), Is.InstanceOf()); + + [RecordedTest] + public async Task JobsFineTuning() + { + FineTuningClient client = GetTestClient(); + + int count = 25; + + await foreach (FineTuningJob job in EnumerateJobsAsync(client)) + { + if (count-- <= 0) + { + break; + } + + Assert.That(job, Is.Not.Null); + Assert.That(job.ID, !(Is.Null.Or.Empty)); + Assert.That(job.FineTunedModel, Is.Null.Or.Not.Empty); // this either null or set to some non-empty value + Assert.That(job.Status, !(Is.Null.Or.Empty)); + Assert.That(job.Object, Is.EqualTo("fine_tuning.job")); + } + } + + [RecordedTest] + public async Task CheckpointsFineTuning() + { + string fineTunedModel = GetFineTunedModel(); + FineTuningClient client = GetTestClient(); + + // Check if the model exists by searching all jobs + FineTuningJob job = await EnumerateJobsAsync(client) + .FirstOrDefaultAsync(j => j.FineTunedModel == fineTunedModel)!; + Assert.That(job, Is.Not.Null); + Assert.That(job!.Status, Is.EqualTo("succeeded")); + + int count = 25; + await foreach (FineTuningCheckpoint checkpoint in EnumerateCheckpoints(client, job.ID)) + { + if (count-- <= 0) + { + break; + } + + Assert.That(checkpoint, Is.Not.Null); + Assert.That(checkpoint.ID, !(Is.Null.Or.Empty)); + Assert.That(checkpoint.CreatedAt, 
Is.GreaterThan(START_2024)); + Assert.That(checkpoint.FineTunedModelCheckpoint, !(Is.Null.Or.Empty)); + Assert.That(checkpoint.Metrics, Is.Not.Null); + Assert.That(checkpoint.Metrics.Step, Is.GreaterThan(0)); + Assert.That(checkpoint.Metrics.TrainLoss, Is.GreaterThan(0)); + Assert.That(checkpoint.Metrics.TrainMeanTokenAccuracy, Is.GreaterThan(0)); + //Assert.That(checkpoint.Metrics.ValidLoss, Is.GreaterThan(0)); + //Assert.That(checkpoint.Metrics.ValidMeanTokenAccuracy, Is.GreaterThan(0)); + //Assert.That(checkpoint.Metrics.FullValidLoss, Is.GreaterThan(0)); + //Assert.That(checkpoint.Metrics.FullValidMeanTokenAccuracy, Is.GreaterThan(0)); + } + } + + [RecordedTest] + public async Task EventsFineTuning() + { + string fineTunedModel = GetFineTunedModel(); + FineTuningClient client = GetTestClient(); + + // Check if the model exists by searching all jobs + FineTuningJob job = await EnumerateJobsAsync(client) + .FirstOrDefaultAsync(j => j.FineTunedModel == fineTunedModel)!; + Assert.That(job, Is.Not.Null); + Assert.That(job!.Status, Is.EqualTo("succeeded")); + + HashSet ids = new(); + + int count = 25; + var asyncEnum = EnumerateAsync((after, limit, opt) => client.GetJobEventsAsync(job.ID, after, limit, opt)); + await foreach (FineTuningJobEvent evt in asyncEnum) + { + if (count-- <= 0) + { + break; + } + + Assert.That(evt, Is.Not.Null); + Assert.That(evt.ID, !(Is.Null.Or.Empty)); + Assert.That(evt.Object, Is.EqualTo("fine_tuning.job.event")); + Assert.That(evt.CreatedAt, Is.GreaterThan(START_2024)); + Assert.That(evt.Level, !(Is.Null.Or.Empty)); + Assert.That(evt.Message, !(Is.Null.Or.Empty)); + + bool added = ids.Add(evt.ID); + Assert.That(added, Is.True, "Duplicate event ID detected {0}", evt.ID); + } + } + + [RecordedTest] + public async Task DeleteFineTuningModel() + { + FineTuningClient client = GetTestClient(); + Assert.That(client, Is.Not.Null); + Assert.That(client, Is.InstanceOf()); + + // The service always happily returns HTTP 204 regardless of whether or 
not the model exists + bool deleted = await DeleteJobAndVerifyAsync(client, "does-not-exist"); + Assert.That(deleted, Is.True); + } + + [RecordedTest] + public async Task CreateAndCancelFineTuning() + { + var fineTuningFile = Assets.FineTuning; + + FineTuningClient client = GetTestClient(); + FileClient fileClient = GetTestClientFrom(client); + + // upload training data + OpenAIFileInfo uploadedFile = await UploadAndWaitForCompleteOrFail(fileClient, fineTuningFile.RelativePath); + + // Create the fine tuning job + using var requestContent = new FineTuningOptions() + { + Model = client.DeploymentOrThrow(), + TrainingFile = uploadedFile.Id + }.ToBinaryContent(); + + ClientResult result = await client.CreateJobAsync(requestContent); + FineTuningJob job = ValidateAndParse(result); + Assert.That(job.ID, !(Is.Null.Or.Empty)); + + await using RunOnScopeExit _ = new(async () => + { + bool deleted = await DeleteJobAndVerifyAsync(client, job.ID); + Assert.True(deleted, "Failed to delete fine tuning job: {0}", job.ID); + }); + + // Wait for some events to become available + ListResponse events; + int maxLoops = 10; + do + { + result = await client.GetJobEventsAsync(job.ID, null, 10, new()).FirstOrDefaultAsync(); + events = ValidateAndParse>(result); + + if (events.Data?.Count > 0) + { + Assert.That(events.Data[0], Is.Not.Null); + Assert.That(events.Data[0].ID, !(Is.Null.Or.Empty)); + Assert.That(events.Data[0].Level, !(Is.Null.Or.Empty)); + Assert.That(events.Data[0].Message, !(Is.Null.Or.Empty)); + Assert.That(events.Data[0].CreatedAt, Is.GreaterThan(START_2024)); + + break; + } + + await Task.Delay(TimeSpan.FromSeconds(2)); + + } while (maxLoops-- > 0); + + // Cancel the fine tuning job + result = await client.CancelJobAsync(job.ID, new()); + job = ValidateAndParse(result); + + // Make sure the job status shows as cancelled + job = await WaitForJobToEnd(client, job); + Assert.That(job.Status, Is.EqualTo("cancelled")); + } + + [RecordedTest] + [Category("LongRunning")] // 
CAUTION: This test can take up 30 *minutes* to run in live mode + public async Task CreateAndDeleteFineTuning() + { + var fineTuningFile = Assets.FineTuning; + + FineTuningClient client = GetTestClient(); + FileClient fileClient = GetTestClientFrom(client); + + // upload training data + OpenAIFileInfo uploadedFile = await UploadAndWaitForCompleteOrFail(fileClient, fineTuningFile.RelativePath); + Assert.That(uploadedFile.Status, Is.EqualTo(OpenAIFileStatus.Processed)); + + // Create the fine tuning job + using var requestContent = new FineTuningOptions() + { + Model = client.DeploymentOrThrow(), + TrainingFile = uploadedFile.Id + }.ToBinaryContent(); + + ClientResult result = await client.CreateJobAsync(requestContent); + FineTuningJob job = ValidateAndParse(result); + Assert.That(job.ID, Is.Not.Null.Or.Empty); + Assert.That(job.Error, Is.Null); + Assert.That(job.Status, !(Is.Null.Or.EqualTo("failed").Or.EqualTo("cancelled"))); + + // Wait for the fine tuning to complete + job = await WaitForJobToEnd(client, job); + Assert.That(job.Status, Is.EqualTo("succeeded"), "Fine tuning did not succeed"); + Assert.That(job.FineTunedModel, Is.Not.Null.Or.Empty); + + // Delete the fine tuned model + bool deleted = await DeleteJobAndVerifyAsync(client, job.ID); + Assert.True(deleted, "Failed to delete fine tuning model: {0}", job.FineTunedModel); + } + + [RecordedTest] + [Category("LongRunning")] // CAUTION: This test can take around 10 to 15 *minutes* in live mode to run + public async Task DeployAndChatWithModel() + { + string fineTunedModel = GetFineTunedModel(); + FineTuningClient client = GetTestClient(); + + AzureDeploymentClient deploymentClient = GetTestClientFrom(client); + string? deploymentName = null; + await using RunOnScopeExit _ = new(async () => + { + if (deploymentName != null) + { + await deploymentClient.DeleteDeploymentAsync(deploymentName); + } + }); + + // Check if the model exists by searching all jobs + FineTuningJob? 
job = await EnumerateJobsAsync(client) + .FirstOrDefaultAsync(j => j.FineTunedModel == fineTunedModel); + Assert.That(job, Is.Not.Null); + Assert.That(job!.Status, Is.EqualTo("succeeded")); + + // Deploy the model and wait for the deployment to finish + deploymentName = "azure-ai-openai-test-" + Recording?.Random.NewGuid().ToString(); + AzureDeployedModel deployment = await deploymentClient.CreateDeploymentAsync(deploymentName, fineTunedModel); + Assert.That(deployment, Is.Not.Null); + Assert.That(deployment.ID, !(Is.Null.Or.Empty)); + Assert.That(deployment.Properties, Is.Not.Null); + + deployment = await WaitUntilReturnLast( + deployment, + () => deploymentClient.GetDeploymentAsync(deploymentName), + (d) => + { + Assert.That(deployment?.Properties?.ProvisioningState, !(Is.Null.Or.Empty)); + + return d.Properties.ProvisioningState == "Succeeded" + || d.Properties.ProvisioningState == "Failed" + || d.Properties.ProvisioningState == "Canceled"; + }, + TimeSpan.FromMinutes(1), + TimeSpan.FromMinutes(30)); + + Assert.That(deployment.Properties.ProvisioningState, Is.EqualTo("Succeeded")); + + // Run a chat completion test + ChatClient chatClient = GetTestClientFrom(client, deploymentName); + + ChatCompletion completion = await chatClient.CompleteChatAsync( + [ + new SystemChatMessage("Convert sports headline to JSON: \"player\" (full name), \"team\", \"sport\", and \"gender\". If more than one return an array. 
No markdown"), + new UserChatMessage("Pavleski will not play in 2024-2025 season") + ]); + Assert.That(completion, Is.Not.Null); + Assert.That(completion.FinishReason, Is.EqualTo(ChatFinishReason.Stop)); + Assert.That(completion.Content, Has.Count.GreaterThan(0)); + Assert.That(completion.Content[0].Kind, Is.EqualTo(ChatMessageContentPartKind.Text)); + Assert.That(completion.Content[0].Text, !Is.Null.Or.Empty); + + // we expect a JSON payload as the response so let's try to deserialize it + using var jsonDoc = JsonDocument.Parse(completion.Content[0].Text, new() + { + AllowTrailingCommas = true, + CommentHandling = JsonCommentHandling.Skip, + MaxDepth = 2 + }); + JsonElement json = jsonDoc.RootElement; + if (json.ValueKind == JsonValueKind.Array) + { + json = json.EnumerateArray().FirstOrDefault(); + } + + Assert.That(json.ValueKind, Is.EqualTo(JsonValueKind.Object)); + Assert.That(json.EnumerateObject().Select(p => p.Name), Has.Some.Match("(player)|(team)|(sport)|(gender)")); + } + + #region helper methods + + private string GetFineTunedModel() + { + string? 
model = TestConfig.GetConfig() + ?.GetValue("fine_tuned_model"); + Assert.That(model, !(Is.Null.Or.Empty), "Failed to find the already fine tuned model to use"); + return model!; + } + + private async Task UploadAndWaitForCompleteOrFail(FileClient fileClient, string path) + { + OpenAIFileInfo uploadedFile = await fileClient.UploadFileAsync(path, FileUploadPurpose.FineTune); + Validate(uploadedFile); + + uploadedFile = await WaitUntilReturnLast( + uploadedFile, + () => fileClient.GetFileAsync(uploadedFile.Id), + f => f.Status == OpenAIFileStatus.Processed || f.Status == OpenAIFileStatus.Error, + TimeSpan.FromSeconds(5), + TimeSpan.FromMinutes(5)) + .ConfigureAwait(false); + + return uploadedFile; + } + + private Task WaitForJobToEnd(FineTuningClient client, FineTuningJob job) + { + RequestOptions options = new(); + string jobId = job.ID; + + // NOTE: Fine tuning jobs can take up 30 minutes to complete so the timeouts here are longer to account for that + return WaitUntilReturnLast( + job, + async () => + { + ClientResult result = await client.GetJobAsync(jobId, options).ConfigureAwait(false); + return ValidateAndParse(result); + }, + j => j.Status == "cancelled" || j.Status == "failed" || j.Status == "succeeded", + TimeSpan.FromMinutes(1), + TimeSpan.FromMinutes(40)); + } + + private IAsyncEnumerable EnumerateJobsAsync(FineTuningClient client) + => EnumerateAsync((after, limit, opt) => client.GetJobsAsync(after, limit, opt)); + + private IAsyncEnumerable EnumerateCheckpoints(FineTuningClient client, string jobId) + => EnumerateAsync((after, limit, opt) => client.GetJobCheckpointsAsync(jobId, after, limit, opt)); + + private async IAsyncEnumerable EnumerateAsync(Func> getAsyncEnumerable) + where T : FineTuningModelBase + { + int numPerFetch = 10; + RequestOptions reqOptions = new(); + + await foreach (ClientResult pageResult in getAsyncEnumerable(null, numPerFetch, reqOptions)) + { + ListResponse items = ValidateAndParse>(pageResult); + if (items.Data?.Count > 0) + { 
+ foreach (T item in items.Data) + { + yield return item; + } + } + } + } + + private async Task DeleteJobAndVerifyAsync(FineTuningClient client, string jobId, TimeSpan? timeBetween = null, TimeSpan? maxWaitTime = null) + { + var stopTime = DateTimeOffset.Now + (maxWaitTime ?? TimeSpan.FromMinutes(1)); + var sleepTime = timeBetween ?? TimeSpan.FromSeconds(2); + + RequestOptions noThrow = new() + { + ErrorOptions = ClientErrorBehaviors.NoThrow + }; + + // Since the DeleteJob and DeleteJobAsync are extensions methods, we need to call them on the unwrapped type, + // instead of the dynamically wrapped type. + var rawClient = UnWrap(client); + + bool success = false; + while (DateTimeOffset.Now < stopTime) + { + ClientResult result = IsAsync + ? await rawClient.DeleteJobAsync(jobId, noThrow).ConfigureAwait(false) + : rawClient.DeleteJob(jobId, noThrow); + Assert.That(result, Is.Not.Null); + + // verify the deletion actually succeeded + result = await client.GetJobAsync(jobId, noThrow).ConfigureAwait(false); + var rawResponse = result.GetRawResponse(); + success = rawResponse.Status == 404; + if (success) + { + break; + } + + await Task.Delay(sleepTime).ConfigureAwait(false); + } + + return success; + } + + #endregion +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/ImageTests.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/ImageTests.cs new file mode 100644 index 000000000..f82f5762f --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/ImageTests.cs @@ -0,0 +1,76 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. 
+
+using System;
+using System.ClientModel;
+using System.Threading.Tasks;
+using OpenAI.Images;
+using OpenAI.TestFramework;
+
+namespace Azure.AI.OpenAI.Tests;
+
+public class ImageTests(bool isAsync) : AoaiTestBase<ImageClient>(isAsync)
+{
+    [RecordedTest]
+    [Category("Smoke")]
+    public void CanCreateClient()
+    {
+        ImageClient client = GetTestClient(tokenCredential: TestEnvironment.Credential);
+        Assert.That(client, Is.InstanceOf<ImageClient>());
+    }
+
+    [RecordedTest]
+    public async Task BadKeyGivesHelpfulError()
+    {
+        string mockKey = "not-a-valid-key-and-should-still-be-sanitized";
+
+        try
+        {
+            ImageClient client = GetTestClient(keyCredential: new ApiKeyCredential(mockKey));
+            _ = await client.GenerateImageAsync("a delightful exception message, in contemporary watercolor");
+            Assert.Fail("No exception was thrown");
+        }
+        catch (Exception thrownException)
+        {
+            Assert.That(thrownException, Is.InstanceOf<ClientResultException>());
+            Assert.That(thrownException.Message, Does.Contain("invalid subscription key"));
+            Assert.That(thrownException.Message, Does.Not.Contain(mockKey));
+        }
+    }
+
+    [RecordedTest]
+    public async Task CanCreateSimpleImage()
+    {
+        ImageClient client = GetTestClient();
+        GeneratedImage image = await client.GenerateImageAsync("a tabby cat", new()
+        {
+            Quality = GeneratedImageQuality.Standard,
+            Size = GeneratedImageSize.W1024xH1024,
+            EndUserId = "test_user",
+            ResponseFormat = GeneratedImageFormat.Bytes,
+        });
+        Assert.That(image, Is.Not.Null);
+        Assert.That(image.ImageBytes, Is.Not.Null);
+    }
+
+    [RecordedTest]
+    public async Task CanGetContentFilterResults()
+    {
+        ImageClient client = GetTestClient();
+        ClientResult<GeneratedImage> imageResult = await client.GenerateImageAsync("a tabby cat", new()
+        {
+            Quality = GeneratedImageQuality.Standard,
+            Size = GeneratedImageSize.W1024xH1024,
+            EndUserId = "test_user",
+            ResponseFormat = GeneratedImageFormat.Uri,
+        });
+        GeneratedImage image = imageResult.Value;
+        Assert.That(image, Is.Not.Null);
+        Assert.That(image.ImageUri, Is.Not.Null);
+
Console.WriteLine($"RESPONSE--\n{imageResult.GetRawResponse().Content}"); + ImageContentFilterResultForPrompt promptResults = image.GetContentFilterResultForPrompt(); + ImageContentFilterResultForResponse responseResults = image.GetContentFilterResultForResponse(); + Assert.That(promptResults?.Sexual?.Severity, Is.EqualTo(ContentFilterSeverity.Safe)); + Assert.That(responseResults?.Sexual?.Severity, Is.EqualTo(ContentFilterSeverity.Safe)); + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Models/AutoOrLongValue.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Models/AutoOrLongValue.cs new file mode 100644 index 000000000..e794736a3 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Models/AutoOrLongValue.cs @@ -0,0 +1,95 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +#nullable enable + +using System; +using System.Globalization; +using System.Text.Json; + +namespace Azure.AI.OpenAI.Tests.Models; + +public readonly struct AutoOrLongValue +{ + public const string NULL = "<>"; + public const string AUTO = "auto"; + + private readonly long? _longValue; + private readonly string _stringValue; + + public AutoOrLongValue() + { + _longValue = null; + _stringValue = NULL; + } + + public AutoOrLongValue(string value) + { + if (value == null) + { + throw new ArgumentNullException("value"); + } + else if (string.Equals(value, AUTO, StringComparison.OrdinalIgnoreCase)) + { + _longValue = null; + _stringValue = AUTO; + } + else if (string.Equals(value, NULL, StringComparison.OrdinalIgnoreCase)) + { + _longValue = null; + _stringValue = NULL; + } + else + { + throw new NotSupportedException(); + } + } + + public AutoOrLongValue(long value) + { + _longValue = value; + _stringValue = value.ToString(CultureInfo.InvariantCulture); + } + + public JsonElement? 
ToJsonElement() + { + if (_stringValue == NULL) + { + return null; + } + + using var json = JsonDocument.Parse( + _longValue?.ToString(CultureInfo.InvariantCulture) + ?? $"\"{_stringValue}\""); + + return json.RootElement.Clone(); + } + + public static AutoOrLongValue FromJsonElement(JsonElement element) + { + if (element.ValueKind == JsonValueKind.String) + { + return new(element.GetString() ?? NULL); + } + else if (element.ValueKind == JsonValueKind.Null) + { + return new(); + } + else if (element.ValueKind == JsonValueKind.Number) + { + return new(element.GetInt64()); + } + else + { + throw new JsonException("Unsupported element kind: " + element.ValueKind); + } + } + + public bool HasValue => _stringValue != NULL && HasLongValue; + public string StringValue => _stringValue; + public bool HasLongValue => _longValue.HasValue; + public long LongValue => _longValue ?? throw new InvalidOperationException("No corresponding long value"); + + public static implicit operator AutoOrLongValue(long val) => new AutoOrLongValue(val); + public static implicit operator AutoOrLongValue(string? val) => new AutoOrLongValue(val ?? NULL); +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Models/AzureDeployedModel.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Models/AzureDeployedModel.cs new file mode 100644 index 000000000..bcb3cfaa8 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Models/AzureDeployedModel.cs @@ -0,0 +1,26 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +#nullable enable + +namespace Azure.AI.OpenAI.Tests.Models; + +public class AzureDeployedModel +{ + required public string ID { get; init; } + required public string Name { get; init; } + required public Props Properties { get; init; } + + public class Props + { + required public ModelInfo Model { get; init; } + required public string ProvisioningState { get; init; } + } + + public class ModelInfo + { + public string? 
Model { get; init; } + required public string Name { get; init; } + required public string Version { get; init; } + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Models/AzureDeploymentClient.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Models/AzureDeploymentClient.cs new file mode 100644 index 000000000..3b51c441d --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Models/AzureDeploymentClient.cs @@ -0,0 +1,252 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Globalization; +using System.IO; +using System.Linq; +using System.Net.Http; +using System.Text.Json; +using System.Threading; +using System.Threading.Tasks; +using Azure.AI.OpenAI.Tests.Utils; +using Azure.AI.OpenAI.Tests.Utils.Config; +using OpenAI.TestFramework.Utils; + +namespace Azure.AI.OpenAI.Tests.Models; + +internal class AzureDeploymentClient : IDisposable +{ + private const string DEFAULT_API_VERSION = "2023-10-01-preview"; + private const string DEFAULT_SKU_NAME = "standard"; + private const int DEFAULT_CAPACITY = 1; + + private CancellationTokenSource _cts; + private ClientPipeline _pipeline; + private Core.AccessToken? _cachedAuthToken; + private readonly Core.TokenCredential _credential; + private readonly string _subscriptionId; + private readonly string _resourceGroup; + private readonly string _resourceName; + private readonly string _endpointUrl; + private readonly string _apiVersion; + + internal AzureDeploymentClient() + { + // for mocking + _cts = new(); + _pipeline = ClientPipeline.Create(); + _subscriptionId = _resourceGroup = _resourceName = _endpointUrl = string.Empty; + _apiVersion = DEFAULT_API_VERSION; + _credential = null!; + } + + public AzureDeploymentClient(IConfiguration config, Core.TokenCredential credential, string? apiVersion = null, PipelineTransport? 
transport = null) + { + if (config == null) + { + throw new ArgumentNullException(nameof(config)); + } + + _cts = new(); + _pipeline = ClientPipeline.Create(new ClientPipelineOptions() + { + Transport = transport ?? new HttpClientPipelineTransport() + }); + _credential = credential ?? throw new ArgumentNullException(nameof(credential)); + + _subscriptionId = config.GetValueOrThrow("subscription_id"); + _resourceGroup = config.GetValueOrThrow("resource_group"); + _resourceName = config.Endpoint?.IdnHost.Split('.').FirstOrDefault() + ?? throw new KeyNotFoundException("Could extract the resource name from the endpoint URL in the config"); + + _endpointUrl = $"https://management.azure.com/subscriptions/{_subscriptionId}/resourceGroups/{_resourceGroup}/providers/Microsoft.CognitiveServices/accounts/{_resourceName}/deployments/"; + + _apiVersion = DEFAULT_API_VERSION; + if (!string.IsNullOrWhiteSpace(apiVersion)) + { + _apiVersion = Uri.EscapeDataString(apiVersion); + } + } + + public virtual AzureDeployedModel CreateDeployment(string deploymentName, string modelName, string? skuName = DEFAULT_SKU_NAME, int capacity = DEFAULT_CAPACITY, CancellationToken token = default) + => CreateDeploymentAsync(false, deploymentName, modelName, skuName, capacity, token).GetAwaiter().GetResult(); + + public virtual Task CreateDeploymentAsync(string deploymentName, string modelName, string? 
skuName = DEFAULT_SKU_NAME, int capacity = DEFAULT_CAPACITY, CancellationToken token = default) + => CreateDeploymentAsync(true, deploymentName, modelName, skuName, capacity, token).AsTask(); + + public virtual AzureDeployedModel GetDeployment(string deploymentName, CancellationToken token = default) + => GetDeploymentAsync(false, deploymentName, token).GetAwaiter().GetResult(); + + public virtual Task GetDeploymentAsync(string deploymentName, CancellationToken token = default) + => GetDeploymentAsync(true, deploymentName, token).AsTask(); + + public virtual bool DeleteDeployment(string deploymentName, CancellationToken token = default) + => DeleteDeploymentAsync(false, deploymentName, token).GetAwaiter().GetResult(); + + public virtual Task DeleteDeploymentAsync(string deploymentName, CancellationToken token = default) + => DeleteDeploymentAsync(true, deploymentName, token).AsTask(); + + public void Dispose() + { + _cts.Cancel(); + _cts.Dispose(); + } + + private async ValueTask CreateDeploymentAsync(bool isAsync, string deploymentName, string modelName, string? 
skuName, int capacity, CancellationToken token) + { + BinaryContent content = ToJsonContent(new + { + sku = new + { + name = skuName, + capacity = capacity.ToString(CultureInfo.InvariantCulture), + }, + properties = new + { + model = new + { + format = "OpenAI", + name = modelName, + version = "1" + } + } + }); + + PipelineResponse response = await SendRequestAsync(isAsync, HttpMethod.Put, deploymentName, content, token) + .ConfigureAwait(false); + return FromJsonContent(response, token); + } + + private async ValueTask GetDeploymentAsync(bool isAsync, string deploymentName, CancellationToken token) + { + PipelineResponse response = await SendRequestAsync(isAsync, HttpMethod.Get, deploymentName, null, token) + .ConfigureAwait(false); + return FromJsonContent(response, token); + } + + private async ValueTask DeleteDeploymentAsync(bool isAsync, string deploymentName, CancellationToken token) + { + PipelineResponse response = await SendRequestAsync(isAsync, HttpMethod.Delete, deploymentName, null, token) + .ConfigureAwait(false); + ThrowOnFailed(response); + return true; + } + + private static BinaryContent ToJsonContent(T value) + { + Utf8JsonBinaryContent content = new(); + JsonSerializer.Serialize(content.JsonWriter, value, typeof(T), JsonOptions.AzureJsonOptions); + return content; + } + + private class ErrorDetail + { + public string? Code { get; init; } + public string? Message { get; init; } + } + + private class ErrorInfo + { + public ErrorDetail? Error { get; init; } + } + + private static void ThrowOnFailed(PipelineResponse response) + { + if (response.IsError) + { + if (response.Content != null + && response.Headers.GetFirstOrDefault("Content-Type")?.StartsWith("application/json") == true) + { + using Stream errorStream = response.Content.ToStream(); + ErrorInfo? 
error = JsonHelpers.Deserialize(errorStream, JsonOptions.AzureJsonOptions); + if (error?.Error != null) + { + throw new ClientResultException($"[{response.Status} - {error.Error.Code}] {error.Error.Message}", response); + } + } + + throw new ClientResultException(response); + } + } + + private static T FromJsonContent(PipelineResponse response, CancellationToken token) + { + ThrowOnFailed(response); + + using Stream stream = response.Content.ToStream(); + return JsonHelpers.Deserialize(stream, JsonOptions.AzureJsonOptions) + ?? throw new InvalidDataException("Service returned a null JSON response body"); + } + + private async ValueTask SendRequestAsync(bool isAsync, HttpMethod method, string pathPart, BinaryContent? body, CancellationToken token) + { + var linked = CancellationTokenSource.CreateLinkedTokenSource(_cts.Token, token); + + PipelineMessage message = _pipeline.CreateMessage(); + message.Apply(new() + { + CancellationToken = linked.Token, + ErrorOptions = ClientErrorBehaviors.NoThrow + }); + + string requestId = Guid.NewGuid().ToString(); + string bearerToken = await GetOrRenewAuthTokenAsync(isAsync, requestId, token).ConfigureAwait(false); + + string fullEndpoint = _endpointUrl + pathPart + "?api-version=" + _apiVersion; + + PipelineRequest request = message.Request; + request.Method = method.Method; + request.Uri = new Uri(fullEndpoint); + request.Headers.Add("x-ms-client-request-id", requestId); + request.Headers.Add("Authorization", "Bearer " + bearerToken); + if (body != null) + { + request.Headers.Add("Content-Type", "application/json"); + request.Content = body; + } + + if (isAsync) + { + await _pipeline.SendAsync(message).ConfigureAwait(false); + } + else + { + _pipeline.Send(message); + } + + return message.Response ?? 
throw new InvalidOperationException("No response was set after sending"); + } + + private async ValueTask GetOrRenewAuthTokenAsync(bool isAsync, string requestId, CancellationToken token) + { + // TODO FIXME: Use more streamlined way to get bearer auth token + if (_cachedAuthToken?.ExpiresOn > DateTimeOffset.Now.AddSeconds(-5)) + { + return _cachedAuthToken.Value.Token; + } + + var context = new Core.TokenRequestContext( + [ + "https://management.azure.com/.default" + ], + requestId); + + Core.AccessToken authToken; + if (isAsync) + { + authToken = await _credential.GetTokenAsync(context, token).ConfigureAwait(false); + } + else + { + authToken = _credential.GetToken(context, token); + } + + string bearerToken = authToken.Token; + _cachedAuthToken = authToken; + return bearerToken; + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Models/BatchObject.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Models/BatchObject.cs new file mode 100644 index 000000000..0c0e7d517 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Models/BatchObject.cs @@ -0,0 +1,22 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +using System; +using System.Text.Json; +using Azure.AI.OpenAI.Tests.Utils; + +namespace Azure.AI.OpenAI.Tests.Models; + +public class BatchObject +{ + public static BatchObject From(BinaryData data) + { + return JsonSerializer.Deserialize(data, JsonOptions.OpenAIJsonOptions) + ?? throw new InvalidOperationException("Response was null JSON"); + } + + public string? Status { get; set; } + public string? Id { get; set; } + public string? OutputFileID { get; set; } + public string? 
ErrorFileId { get; set; }
+}
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Models/BatchOptions.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Models/BatchOptions.cs
new file mode 100644
index 000000000..40a580370
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Models/BatchOptions.cs
@@ -0,0 +1,29 @@
+// Copyright (c) Microsoft Corporation. All rights reserved.
+// Licensed under the MIT License.
+
+using System;
+using System.ClientModel;
+using System.Collections.Generic;
+using System.IO;
+using Azure.AI.OpenAI.Tests.Utils;
+using OpenAI.TestFramework.Utils;
+
+namespace Azure.AI.OpenAI.Tests.Models;
+
+public class BatchOptions
+{
+    public string? InputFileId { get; set; }
+    public string? Endpoint { get; set; }
+    public string CompletionWindow { get; set; } = "24h";
+    public IDictionary<string, string> Metadata { get; } = new Dictionary<string, string>();
+
+    public BinaryContent ToBinaryContent()
+    {
+        using MemoryStream stream = new MemoryStream();
+        JsonHelpers.Serialize(stream, this, JsonOptions.OpenAIJsonOptions);
+
+        stream.Seek(0, SeekOrigin.Begin);
+        var data = BinaryData.FromStream(stream);
+        return BinaryContent.Create(data);
+    }
+}
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Models/BatchResult.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Models/BatchResult.cs
new file mode 100644
index 000000000..9d36d8d18
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Models/BatchResult.cs
@@ -0,0 +1,41 @@
+// Copyright (c) Microsoft Corporation. All rights reserved.
+// Licensed under the MIT License.
+
+using System;
+using System.Collections.Generic;
+using System.IO;
+using System.Text;
+using System.Text.Json;
+using Azure.AI.OpenAI.Tests.Utils;
+
+namespace Azure.AI.OpenAI.Tests.Models;
+
+public class BatchResult<T>
+{
+    public string? ID { get; init; }
+    public string? CustomId { get; init; }
+    public T? Response { get; init; }
+    public JsonElement?
Error { get; init; }
+
+    public static IReadOnlyList<BatchResult<T>> From(BinaryData data)
+    {
+        List<BatchResult<T>> list = new();
+        using var reader = new StreamReader(data.ToStream(), Encoding.UTF8, false);
+        string? line;
+        while ((line = reader.ReadLine()) != null)
+        {
+            if (string.IsNullOrWhiteSpace(line))
+            {
+                continue;
+            }
+
+            var entry = JsonSerializer.Deserialize<BatchResult<T>>(line, JsonOptions.OpenAIJsonOptions);
+            if (entry != null)
+            {
+                list.Add(entry);
+            }
+        }
+
+        return list;
+    }
+}
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Models/FineTuningCheckpoint.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Models/FineTuningCheckpoint.cs
new file mode 100644
index 000000000..8ab2bb46a
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Models/FineTuningCheckpoint.cs
@@ -0,0 +1,29 @@
+// Copyright (c) Microsoft Corporation. All rights reserved.
+// Licensed under the MIT License.
+
+#nullable enable
+
+using System;
+
+namespace Azure.AI.OpenAI.Tests.Models
+{
+    public class FineTuningCheckpoint : FineTuningModelBase
+    {
+        public DateTimeOffset CreatedAt { get; init; }
+        public string? FineTunedModelCheckpoint { get; init; }
+        public string?
FineTuningJobID { get; init; }
+        public int StepNumber { get; init; }
+        public MetricsInfo Metrics { get; init; } = new MetricsInfo();
+
+        public class MetricsInfo
+        {
+            public int Step { get; init; }
+            public float TrainLoss { get; init; }
+            public float TrainMeanTokenAccuracy { get; init; }
+            public float ValidLoss { get; init; }
+            public float ValidMeanTokenAccuracy { get; init; }
+            public float FullValidLoss { get; init; }
+            public float FullValidMeanTokenAccuracy { get; init; }
+        }
+    }
+}
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Models/FineTuningHyperparameters.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Models/FineTuningHyperparameters.cs
new file mode 100644
index 000000000..9057245c8
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Models/FineTuningHyperparameters.cs
@@ -0,0 +1,89 @@
+// Copyright (c) Microsoft Corporation. All rights reserved.
+// Licensed under the MIT License.
+
+#nullable enable
+
+using System;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace Azure.AI.OpenAI.Tests.Models;
+
+public class FineTuningHyperparameters : IJsonModel<FineTuningHyperparameters>
+{
+    private Dictionary<string, JsonElement> _values = new();
+
+    public AutoOrLongValue? BatchSize
+    {
+        get => Get("batch_size");
+        set => Set("batch_size", value);
+    }
+
+    public AutoOrLongValue? LearningRateMultiplier
+    {
+        get => Get("learning_rate_multiplier");
+        set => Set("learning_rate_multiplier", value);
+    }
+
+    public AutoOrLongValue? NumEpochs
+    {
+        get => Get("n_epochs");
+        set => Set("n_epochs", value);
+    }
+
+    private AutoOrLongValue? Get(string key)
+    {
+        if (_values.TryGetValue(key, out JsonElement element))
+        {
+            return AutoOrLongValue.FromJsonElement(element);
+        }
+
+        return null;
+    }
+
+    private void Set(string key, AutoOrLongValue? value)
+    {
+        JsonElement?
element = value?.ToJsonElement();
+        if (element == null)
+        {
+            _values.Remove(key);
+        }
+        else
+        {
+            _values[key] = element.Value;
+        }
+    }
+
+    FineTuningHyperparameters IJsonModel<FineTuningHyperparameters>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options)
+    {
+        var dict = JsonSerializer.Deserialize<Dictionary<string, JsonElement>>(ref reader);
+        FineTuningHyperparameters instance = new();
+        instance._values = dict ?? new Dictionary<string, JsonElement>();
+        return instance;
+    }
+
+    FineTuningHyperparameters IPersistableModel<FineTuningHyperparameters>.Create(BinaryData data, ModelReaderWriterOptions options)
+    {
+        ReadOnlyMemory<byte> rawData = data.ToMemory();
+        var reader = new Utf8JsonReader(rawData.Span);
+        return ((IJsonModel<FineTuningHyperparameters>)this).Create(ref reader, options);
+    }
+
+    string IPersistableModel<FineTuningHyperparameters>.GetFormatFromOptions(ModelReaderWriterOptions options)
+        => ModelReaderWriterOptions.Json.Format;
+
+    void IJsonModel<FineTuningHyperparameters>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+    {
+        writer.WriteStartObject();
+        foreach (var kvp in _values)
+        {
+            writer.WritePropertyName(kvp.Key);
+            kvp.Value.WriteTo(writer);
+        }
+        writer.WriteEndObject();
+    }
+
+    BinaryData IPersistableModel<FineTuningHyperparameters>.Write(ModelReaderWriterOptions options)
+        => ModelReaderWriter.Write(this, options);
+}
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Models/FineTuningJob.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Models/FineTuningJob.cs
new file mode 100644
index 000000000..e2834f3fe
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Models/FineTuningJob.cs
@@ -0,0 +1,22 @@
+// Copyright (c) Microsoft Corporation. All rights reserved.
+// Licensed under the MIT License.
+
+#nullable enable
+
+using System;
+using System.Collections.Generic;
+
+namespace Azure.AI.OpenAI.Tests.Models;
+
+public class FineTuningJob : FineTuningModelBase
+{
+    public DateTimeOffset CreatedAt { get; init; }
+    public IReadOnlyDictionary<string, string>? Error { get; set; }
+    public string? FineTunedModel { get; init; }
+    public string Model { get; init; } = string.Empty;
+    public string?
OrganizationID { get; init; }
+    public string Status { get; set; } = string.Empty;
+    public IReadOnlyList<string>? ResultFiles { get; init; }
+    public int? TrainedTokens { get; init; }
+    public DateTimeOffset EstimatedFinish { get; init; }
+}
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Models/FineTuningJobEvent.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Models/FineTuningJobEvent.cs
new file mode 100644
index 000000000..4e444b0b7
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Models/FineTuningJobEvent.cs
@@ -0,0 +1,15 @@
+// Copyright (c) Microsoft Corporation. All rights reserved.
+// Licensed under the MIT License.
+
+#nullable enable
+
+using System;
+
+namespace Azure.AI.OpenAI.Tests.Models;
+
+public class FineTuningJobEvent : FineTuningModelBase
+{
+    public DateTimeOffset CreatedAt { get; init; }
+    public string? Level { get; init; }
+    public string? Message { get; init; }
+}
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Models/FineTuningModelBase.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Models/FineTuningModelBase.cs
new file mode 100644
index 000000000..3e75eee9e
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Models/FineTuningModelBase.cs
@@ -0,0 +1,12 @@
+// Copyright (c) Microsoft Corporation. All rights reserved.
+// Licensed under the MIT License.
+
+#nullable enable
+
+namespace Azure.AI.OpenAI.Tests.Models;
+
+public abstract class FineTuningModelBase
+{
+    required public string ID { get; init; }
+    required public string Object { get; init; }
+}
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Models/FineTuningOptions.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Models/FineTuningOptions.cs
new file mode 100644
index 000000000..4c44995f8
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Models/FineTuningOptions.cs
@@ -0,0 +1,26 @@
+// Copyright (c) Microsoft Corporation. All rights reserved.
+// Licensed under the MIT License.
+
+using System.ClientModel;
+using System.IO;
+using Azure.AI.OpenAI.Tests.Utils;
+using OpenAI.TestFramework.Utils;
+
+namespace Azure.AI.OpenAI.Tests.Models;
+
+public class FineTuningOptions
+{
+    required public string TrainingFile { get; init; }
+    required public string Model { get; init; }
+    public int? Seed { get; set; }
+    public string? Suffix { get; set; }
+    public FineTuningHyperparameters? Hyperparameters { get; init; }
+
+    public BinaryContent ToBinaryContent()
+    {
+        MemoryStream stream = new();
+        JsonHelpers.Serialize(stream, this, JsonOptions.OpenAIJsonOptions);
+        stream.Seek(0, SeekOrigin.Begin);
+        return BinaryContent.Create(stream);
+    }
+}
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Models/ListResponse.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Models/ListResponse.cs
new file mode 100644
index 000000000..c5207f796
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Models/ListResponse.cs
@@ -0,0 +1,14 @@
+// Copyright (c) Microsoft Corporation. All rights reserved.
+// Licensed under the MIT License.
+
+#nullable enable
+
+using System.Collections.Generic;
+
+namespace Azure.AI.OpenAI.Tests.Models;
+
+public class ListResponse<T>
+{
+    public bool HasMore { get; init; }
+    public IReadOnlyList<T>?
Data { get; init; } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Properties/AssemblyInfo.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Properties/AssemblyInfo.cs new file mode 100644 index 000000000..278184836 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Properties/AssemblyInfo.cs @@ -0,0 +1,4 @@ +using System.Runtime.CompilerServices; +using Castle.Core.Internal; + +[assembly: InternalsVisibleTo(InternalsVisible.ToDynamicProxyGenAssembly2)] diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Samples/00_ClientConfiguration.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Samples/00_ClientConfiguration.cs new file mode 100644 index 000000000..a4ebb8fc1 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Samples/00_ClientConfiguration.cs @@ -0,0 +1,64 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +#nullable disable + +using System; +using Azure.Identity; +using OpenAI.Chat; + +namespace Azure.AI.OpenAI.Samples; + +public partial class AzureOpenAISamples +{ + public void CreateAnAzureOpenAIClient() + { + #region Snippet:ConfigureClient:WithAOAITopLevelClient + string keyFromEnvironment = Environment.GetEnvironmentVariable("AZURE_OPENAI_API_KEY"); + + AzureOpenAIClient azureClient = new( + new Uri("https://your-azure-openai-resource.com"), + new AzureKeyCredential(keyFromEnvironment)); + ChatClient chatClient = azureClient.GetChatClient("my-gpt-35-turbo-deployment"); + #endregion + } + + public void CreateAnAzureOpenAIClientWithEntra() + { + #region Snippet:ConfigureClient:WithEntra + AzureOpenAIClient azureClient = new( + new Uri("https://your-azure-openai-resource.com"), + new DefaultAzureCredential()); + ChatClient chatClient = azureClient.GetChatClient("my-gpt-4o-mini-deployment"); + #endregion + } + + public void UseAzureGovernment() + { + #region Snippet:ConfigureClient:GovernmentAudience + AzureOpenAIClientOptions options = new() + { + 
Audience = AzureOpenAIAudience.AzureGovernment,
+        };
+        AzureOpenAIClient azureClient = new(
+            new Uri("https://your-azure-openai-resource.com"),
+            new DefaultAzureCredential(),
+            options);
+        ChatClient chatClient = azureClient.GetChatClient("my-gpt-4o-mini-deployment");
+        #endregion
+    }
+
+    public void UseCustomAuthorizationScope()
+    {
+        #region Snippet:ConfigureClient:CustomAudience
+        AzureOpenAIClientOptions optionsWithCustomAudience = new()
+        {
+            Audience = "https://cognitiveservices.azure.com/.default",
+        };
+        #endregion
+
+        AzureOpenAIClient azureClient = new(
+            new Uri("https://your-azure-openai-resource.com"),
+            new DefaultAzureCredential(),
+            optionsWithCustomAudience);
+        ChatClient chatClient = azureClient.GetChatClient("my-gpt-4o-mini-deployment");
+    }
+}
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Samples/01_Chat.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Samples/01_Chat.cs
new file mode 100644
index 000000000..3ad7584e5
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Samples/01_Chat.cs
@@ -0,0 +1,292 @@
+// Copyright (c) Microsoft Corporation. All rights reserved.
+// Licensed under the MIT License.
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.Collections.Generic;
+using System.Text;
+using System.Text.Json;
+using Azure.Identity;
+using OpenAI.Chat;
+
+namespace Azure.AI.OpenAI.Samples;
+
+public partial class AzureOpenAISamples
+{
+    public void BasicChat()
+    {
+        #region Snippet:SimpleChatResponse
+        AzureOpenAIClient azureClient = new(
+            new Uri("https://your-azure-openai-resource.com"),
+            new DefaultAzureCredential());
+        ChatClient chatClient = azureClient.GetChatClient("my-gpt-35-turbo-deployment");
+
+        ChatCompletion completion = chatClient.CompleteChat(
+            [
+                // System messages represent instructions or other guidance about how the assistant should behave
+                new SystemChatMessage("You are a helpful assistant that talks like a pirate."),
+                // User messages represent user input, whether historical or the most recent input
+                new UserChatMessage("Hi, can you help me?"),
+                // Assistant messages in a request represent conversation history for responses
+                new AssistantChatMessage("Arrr! Of course, me hearty! What can I do for ye?"),
+                new UserChatMessage("What's the best way to train a parrot?"),
+            ]);
+
+        Console.WriteLine($"{completion.Role}: {completion.Content[0].Text}");
+        #endregion
+    }
+
+    public void StreamingChat()
+    {
+        #region Snippet:StreamChatMessages
+        AzureOpenAIClient azureClient = new(
+            new Uri("https://your-azure-openai-resource.com"),
+            new DefaultAzureCredential());
+        ChatClient chatClient = azureClient.GetChatClient("my-gpt-35-turbo-deployment");
+
+        CollectionResult<StreamingChatCompletionUpdate> completionUpdates = chatClient.CompleteChatStreaming(
+            [
+                new SystemChatMessage("You are a helpful assistant that talks like a pirate."),
+                new UserChatMessage("Hi, can you help me?"),
+                new AssistantChatMessage("Arrr! Of course, me hearty!
What can I do for ye?"), + new UserChatMessage("What's the best way to train a parrot?"), + ]); + + foreach (StreamingChatCompletionUpdate completionUpdate in completionUpdates) + { + foreach (ChatMessageContentPart contentPart in completionUpdate.ContentUpdate) + { + Console.Write(contentPart.Text); + } + } + #endregion + } + + public void ChatWithTools() + { + #region Snippet:ChatTools:DefineTool + static string GetCurrentLocation() + { + // Call the location API here. + return "San Francisco"; + } + + static string GetCurrentWeather(string location, string unit = "celsius") + { + // Call the weather API here. + return $"31 {unit}"; + } + + ChatTool getCurrentLocationTool = ChatTool.CreateFunctionTool( + functionName: nameof(GetCurrentLocation), + functionDescription: "Get the user's current location" + ); + + ChatTool getCurrentWeatherTool = ChatTool.CreateFunctionTool( + functionName: nameof(GetCurrentWeather), + functionDescription: "Get the current weather in a given location", + functionParameters: BinaryData.FromString(""" + { + "type": "object", + "properties": { + "location": { + "type": "string", + "description": "The city and state, e.g. Boston, MA" + }, + "unit": { + "type": "string", + "enum": [ "celsius", "fahrenheit" ], + "description": "The temperature unit to use. Infer this from the specified location." 
+                }
+            },
+            "required": [ "location" ]
+        }
+        """)
+        );
+        #endregion
+
+        AzureOpenAIClient azureClient = new(
+            new Uri("https://your-azure-openai-resource.com"),
+            new DefaultAzureCredential());
+        ChatClient chatClient = azureClient.GetChatClient("my-gpt-35-turbo-deployment");
+
+        #region Snippet:ChatTools:RequestWithFunctions
+        ChatCompletionOptions options = new()
+        {
+            Tools = { getCurrentLocationTool, getCurrentWeatherTool },
+        };
+
+        List<ChatMessage> conversationMessages =
+        [
+            new UserChatMessage("What's the weather like in Boston?"),
+        ];
+        ChatCompletion completion = chatClient.CompleteChat(conversationMessages, options);
+        #endregion
+
+        #region Snippet:ChatTools:HandleToolCalls
+        // Purely for convenience and clarity, this standalone local method handles tool call responses.
+        string GetToolCallContent(ChatToolCall toolCall)
+        {
+            if (toolCall.FunctionName == getCurrentWeatherTool.FunctionName)
+            {
+                // Validate arguments before using them; it's not always guaranteed to be valid JSON!
+                try
+                {
+                    using JsonDocument argumentsDocument = JsonDocument.Parse(toolCall.FunctionArguments);
+                    if (!argumentsDocument.RootElement.TryGetProperty("location", out JsonElement locationElement))
+                    {
+                        // Handle missing required "location" argument
+                    }
+                    else
+                    {
+                        string location = locationElement.GetString();
+                        if (argumentsDocument.RootElement.TryGetProperty("unit", out JsonElement unitElement))
+                        {
+                            return GetCurrentWeather(location, unitElement.GetString());
+                        }
+                        else
+                        {
+                            return GetCurrentWeather(location);
+                        }
+                    }
+                }
+                catch (JsonException)
+                {
+                    // Handle the JsonException (bad arguments) here
+                }
+            }
+            // Handle unexpected tool calls
+            throw new NotImplementedException();
+        }
+
+        if (completion.FinishReason == ChatFinishReason.ToolCalls)
+        {
+            // Add a new assistant message to the conversation history that includes the tool calls
+            conversationMessages.Add(new AssistantChatMessage(completion));
+
+            foreach (ChatToolCall toolCall in completion.ToolCalls)
+            {
conversationMessages.Add(new ToolChatMessage(toolCall.Id, GetToolCallContent(toolCall))); + } + + // Now make a new request with all the messages thus far, including the original + } + #endregion + } + + public void StreamingChatToolCalls() + { + static string GetCurrentLocation() + { + // Call the location API here. + return "San Francisco"; + } + + static string GetCurrentWeather(string location, string unit = "celsius") + { + // Call the weather API here. + return $"31 {unit}"; + } + + ChatTool getCurrentLocationTool = ChatTool.CreateFunctionTool( + functionName: nameof(GetCurrentLocation), + functionDescription: "Get the user's current location" + ); + + ChatTool getCurrentWeatherTool = ChatTool.CreateFunctionTool( + functionName: nameof(GetCurrentWeather), + functionDescription: "Get the current weather in a given location", + functionParameters: BinaryData.FromString(""" + { + "type": "object", + "properties": { + "location": { + "type": "string", + "description": "The city and state, e.g. Boston, MA" + }, + "unit": { + "type": "string", + "enum": [ "celsius", "fahrenheit" ], + "description": "The temperature unit to use. Infer this from the specified location." 
+                }
+            },
+            "required": [ "location" ]
+        }
+        """)
+        );
+
+        AzureOpenAIClient azureClient = new(
+            new Uri("https://your-azure-openai-resource.com"),
+            new DefaultAzureCredential());
+        ChatClient chatClient = azureClient.GetChatClient("my-gpt-35-turbo-deployment");
+
+        ChatCompletionOptions options = new()
+        {
+            Tools = { getCurrentLocationTool, getCurrentWeatherTool },
+        };
+
+        List<ChatMessage> conversationMessages =
+        [
+            new UserChatMessage("What's the weather like in Boston?"),
+        ];
+
+        #region Snippet:ChatTools:StreamingChatTools
+        Dictionary<int, string> toolCallIdsByIndex = [];
+        Dictionary<int, string> functionNamesByIndex = [];
+        Dictionary<int, StringBuilder> functionArgumentBuildersByIndex = [];
+        StringBuilder contentBuilder = new();
+
+        foreach (StreamingChatCompletionUpdate streamingChatUpdate
+            in chatClient.CompleteChatStreaming(conversationMessages, options))
+        {
+            foreach (ChatMessageContentPart contentPart in streamingChatUpdate.ContentUpdate)
+            {
+                contentBuilder.Append(contentPart.Text);
+            }
+            foreach (StreamingChatToolCallUpdate toolCallUpdate in streamingChatUpdate.ToolCallUpdates)
+            {
+                if (!string.IsNullOrEmpty(toolCallUpdate.Id))
+                {
+                    toolCallIdsByIndex[toolCallUpdate.Index] = toolCallUpdate.Id;
+                }
+                if (!string.IsNullOrEmpty(toolCallUpdate.FunctionName))
+                {
+                    functionNamesByIndex[toolCallUpdate.Index] = toolCallUpdate.FunctionName;
+                }
+                if (!string.IsNullOrEmpty(toolCallUpdate.FunctionArgumentsUpdate))
+                {
+                    StringBuilder argumentsBuilder
+                        = functionArgumentBuildersByIndex.TryGetValue(toolCallUpdate.Index, out StringBuilder existingBuilder)
+                            ?
existingBuilder
+                            : new();
+                    argumentsBuilder.Append(toolCallUpdate.FunctionArgumentsUpdate);
+                    functionArgumentBuildersByIndex[toolCallUpdate.Index] = argumentsBuilder;
+                }
+            }
+        }
+
+        List<ChatToolCall> toolCalls = [];
+        foreach (KeyValuePair<int, string> indexToIdPair in toolCallIdsByIndex)
+        {
+            toolCalls.Add(ChatToolCall.CreateFunctionToolCall(
+                indexToIdPair.Value,
+                functionNamesByIndex[indexToIdPair.Key],
+                functionArgumentBuildersByIndex[indexToIdPair.Key].ToString()));
+        }
+
+        conversationMessages.Add(new AssistantChatMessage(toolCalls, contentBuilder.ToString()));
+
+        // Placeholder: each tool call must be resolved, like in the non-streaming case
+        string GetToolCallOutput(ChatToolCall toolCall) => null;
+
+        foreach (ChatToolCall toolCall in toolCalls)
+        {
+            conversationMessages.Add(new ToolChatMessage(toolCall.Id, GetToolCallOutput(toolCall)));
+        }
+
+        // Repeat with the history and all tool call resolution messages added
+        #endregion
+    }
+}
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Samples/02_Oyd.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Samples/02_Oyd.cs
new file mode 100644
index 000000000..1f1a85d69
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Samples/02_Oyd.cs
@@ -0,0 +1,54 @@
+// Copyright (c) Microsoft Corporation. All rights reserved.
+// Licensed under the MIT License.
+
+#nullable disable
+
+using System;
+using Azure.AI.OpenAI.Chat;
+using Azure.Identity;
+using OpenAI.Chat;
+
+namespace Azure.AI.OpenAI.Samples;
+
+public partial class AzureOpenAISamples
+{
+    public void OnYourDataSearch()
+    {
+        AzureOpenAIClient azureClient = new(
+            new Uri("https://your-azure-openai-resource.com"),
+            new DefaultAzureCredential());
+        ChatClient chatClient = azureClient.GetChatClient("my-gpt-35-turbo-deployment");
+
+        #region Snippet:ChatUsingYourOwnData
+        // Extension methods to use data sources with options are subject to SDK surface changes. Suppress the
+        // warning to acknowledge this and use the subject-to-change AddDataSource method.
+        #pragma warning disable AOAI001
+
+        ChatCompletionOptions options = new();
+        options.AddDataSource(new AzureSearchChatDataSource()
+        {
+            Endpoint = new Uri("https://your-search-resource.search.windows.net"),
+            IndexName = "contoso-products-index",
+            Authentication = DataSourceAuthentication.FromApiKey(
+                Environment.GetEnvironmentVariable("OYD_SEARCH_KEY")),
+        });
+
+        ChatCompletion completion = chatClient.CompleteChat(
+            [
+                new UserChatMessage("What are the best-selling Contoso products this month?"),
+            ],
+            options);
+
+        AzureChatMessageContext onYourDataContext = completion.GetAzureMessageContext();
+
+        if (onYourDataContext?.Intent is not null)
+        {
+            Console.WriteLine($"Intent: {onYourDataContext.Intent}");
+        }
+        foreach (AzureChatCitation citation in onYourDataContext?.Citations ?? [])
+        {
+            Console.WriteLine($"Citation: {citation.Content}");
+        }
+        #endregion
+    }
+}
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Samples/03_Assistants.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Samples/03_Assistants.cs
new file mode 100644
index 000000000..aaf5b15ff
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Samples/03_Assistants.cs
@@ -0,0 +1,78 @@
+// Copyright (c) Microsoft Corporation. All rights reserved.
+// Licensed under the MIT License.
+
+#nullable disable
+
+using System;
+using System.Threading.Tasks;
+using Azure.Identity;
+using OpenAI.Assistants;
+
+namespace Azure.AI.OpenAI.Samples;
+
+public partial class AzureOpenAISamples
+{
+    public async Task StreamingAssistantRunAsync()
+    {
+        #region Snippet:Assistants:CreateClient
+        AzureOpenAIClient azureClient = new(
+            new Uri("https://your-azure-openai-resource.com"),
+            new DefaultAzureCredential());
+
+        // The Assistants feature area is in beta, with API specifics subject to change.
+        // Suppress the [Experimental] warning via .csproj or, as here, in the code to acknowledge.
+        #pragma warning disable OPENAI001
+        AssistantClient assistantClient = azureClient.GetAssistantClient();
+        #endregion
+
+        #region Snippet:Assistants:PrepareToRun
+        Assistant assistant = await assistantClient.CreateAssistantAsync(
+            model: "my-gpt-4o-deployment",
+            new AssistantCreationOptions()
+            {
+                Name = "My Friendly Test Assistant",
+                Instructions = "You politely help with math questions. Use the code interpreter tool when asked to "
+                    + "visualize numbers.",
+                Tools = { ToolDefinition.CreateCodeInterpreter() },
+            });
+        ThreadInitializationMessage initialMessage = new(
+            MessageRole.User,
+            [
+                "Hi, Assistant! Draw a graph for a line with a slope of 4 and y-intercept of 9."
+            ]);
+        AssistantThread thread = await assistantClient.CreateThreadAsync(new ThreadCreationOptions()
+        {
+            InitialMessages = { initialMessage },
+        });
+        #endregion
+
+        #region Snippet:Assistants:StreamRun
+        RunCreationOptions runOptions = new()
+        {
+            AdditionalInstructions = "When possible, talk like a pirate."
+        };
+        await foreach (StreamingUpdate streamingUpdate
+            in assistantClient.CreateRunStreamingAsync(thread, assistant, runOptions))
+        {
+            if (streamingUpdate.UpdateKind == StreamingUpdateReason.RunCreated)
+            {
+                Console.WriteLine($"--- Run started! ---");
+            }
+            else if (streamingUpdate is MessageContentUpdate contentUpdate)
+            {
+                Console.Write(contentUpdate.Text);
+                if (contentUpdate.ImageFileId is not null)
+                {
+                    Console.WriteLine($"[Image content file ID: {contentUpdate.ImageFileId}]");
+                }
+            }
+        }
+        #endregion
+
+        #region Snippet:Assistants:Cleanup
+        // Optionally, delete persistent resources that are no longer needed.
+        _ = await assistantClient.DeleteAssistantAsync(assistant);
+        _ = await assistantClient.DeleteThreadAsync(thread);
+        #endregion
+    }
+}
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Utils/AoaiTestBase.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Utils/AoaiTestBase.cs
new file mode 100644
index 000000000..70c178dd2
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Utils/AoaiTestBase.cs
@@ -0,0 +1,742 @@
+// Copyright (c) Microsoft Corporation. All rights reserved.
+// Licensed under the MIT License.
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.IO;
+using System.Linq;
+using System.Text;
+using System.Text.Json;
+using System.Threading.Tasks;
+using Azure.AI.OpenAI.Tests.Models;
+using Azure.AI.OpenAI.Tests.Utils;
+using Azure.AI.OpenAI.Tests.Utils.Config;
+using NUnit.Framework.Interfaces;
+using OpenAI.Assistants;
+using OpenAI.Audio;
+using OpenAI.Batch;
+using OpenAI.Chat;
+using OpenAI.Embeddings;
+using OpenAI.Files;
+using OpenAI.FineTuning;
+using OpenAI.Images;
+using OpenAI.TestFramework;
+using OpenAI.TestFramework.Recording.Proxy;
+using OpenAI.TestFramework.Recording.Proxy.Service;
+using OpenAI.TestFramework.Recording.RecordingProxy;
+using OpenAI.TestFramework.Recording.Sanitizers;
+using OpenAI.TestFramework.Utils;
+using OpenAI.VectorStores;
+using TokenCredential = Azure.Core.TokenCredential;
+
+namespace Azure.AI.OpenAI.Tests;
+
+public class AoaiTestBase<TClient> : RecordedClientTestBase where TClient : class
+{
+    private const string AZURE_URI_SANITIZER_PATTERN = @"(?<=/(subscriptions|resourceGroups|accounts)/)([^/]+?)(?=(/|$))";
+    private const string SMALL_1x1_PNG = "iVBORw0KGgoAAAANSUhEUgAAAAEAAAABCAIAAACQd1PeAAAAAXNSR0IArs4c6QAAAARnQU1BAACxjwv8YQUAAAAJcEhZcwAAFiQAABYkAZsVxhQAAAAMSURBVBhXY2BgYAAAAAQAAVzN/2kAAAAASUVORK5CYII=";
+
+    public static readonly DateTimeOffset START_2024 = new DateTimeOffset(2024, 01, 01, 00, 00, 00,
TimeSpan.Zero); + public static readonly DateTimeOffset UNIX_EPOCH = +#if NETFRAMEWORK + DateTimeOffset.Parse("1970-01-01T00:00:00.0000000+00:00"); +#else + DateTimeOffset.UnixEpoch; +#endif + + internal TestConfig TestConfig { get; } + + internal Assets Assets { get; } + + public AzureTestEnvironment TestEnvironment { get; } + + protected AoaiTestBase(bool isAsync) : this(isAsync, null) + { } + + protected AoaiTestBase(bool isAsync, RecordedTestMode? mode = null) + : base(isAsync, mode) + { + TestConfig = new TestConfig(Mode); + Assets = new Assets(); + TestEnvironment = new AzureTestEnvironment(Mode); + + // Remove some of the default sanitizers to customize their behaviour + RecordingOptions.SanitizersToRemove.AddRange( + [ + "AZSDK2003", // Location header (we use a less restrictive sanitizer) + "AZSDK4001", // Replaces entire host name in URL. We want to mask only subdomain part to make it easier to distinguish requests + "AZSDK3430", // OpenAI liberally uses "id" in its JSON responses, and we want to keep them in the recordings + "AZSDK3493", // $..name in JSON. 
OpenAI uses this for things that don't need to be sanitized + ]); + + // Prevent resource names from leaking into recordings + RecordingOptions.Sanitizers.AddRange( + [ + new UriRegexSanitizer(SanitizedJsonConfig.HOST_SUBDOMAIN_PATTERN) + { + Value = SanitizedJsonConfig.MASK_STRING + }, + new UriRegexSanitizer(AZURE_URI_SANITIZER_PATTERN) + { + Value = SanitizedJsonConfig.MASK_STRING + }, + new HeaderRegexSanitizer("Location") + { + Regex = AZURE_URI_SANITIZER_PATTERN, + Value = SanitizedJsonConfig.MASK_STRING + }, + new HeaderRegexSanitizer("Azure-AsyncOperation") + { + Regex = AZURE_URI_SANITIZER_PATTERN, + Value = SanitizedJsonConfig.MASK_STRING + }, + new BodyKeySanitizer("$..endpoint") + { + Regex = SanitizedJsonConfig.HOST_SUBDOMAIN_PATTERN, + Value = SanitizedJsonConfig.MASK_STRING + } + ]); + + // Prevent keys from leaking into our recordings + RecordingOptions.SanitizeJsonBody("*..key", "*..api_key"); + + // Because the current implementation of multi-part form content data in OpenAI and Azure OpenAI uses random + // to generate boundaries, this causes problems during playback as the boundary will be different each time. + // Longer term, we should find a way to pass the TestRecording.Random to the multi-part form generator in the + // code. 
The simplest solution for now is to disable recording the body for these mime types.
+        RecordingOptions.RequestOverride = request =>
+        {
+            if (request?.Headers.GetFirstOrDefault("Content-Type")?.StartsWith("multipart/form-data") == true)
+            {
+                return RequestRecordMode.RecordWithoutRequestBody;
+            }
+
+            return RequestRecordMode.Record;
+        };
+        RecordingOptions.Sanitizers.Add(new HeaderRegexSanitizer("Content-Type")
+        {
+            Regex = @"multipart/form-data; boundary=[^\s]+",
+            Value = "multipart/form-data; boundary=***"
+        });
+
+        // Data URIs trimmed to prevent the recording from being too large
+        RecordingOptions.Sanitizers.Add(new BodyKeySanitizer("$..url")
+        {
+            Regex = @"(?<=data:image/png;base64,)(.+)",
+            Value = SMALL_1x1_PNG
+        });
+
+        // Base64 encoded images in the response are replaced with a 1x1 black pixel PNG image to ensure valid data
+        RecordingOptions.Sanitizers.Add(new BodyKeySanitizer("$..b64_json")
+        {
+            Value = SMALL_1x1_PNG
+        });
+    }
+
+    /// <summary>
+    /// Gets the top level test client to use for testing.
+    /// </summary>
+    /// <param name="config">The test configuration to use</param>
+    /// <param name="options">(Optional) The client options to use.</param>
+    /// <param name="tokenCredential">(Optional) The token credential to use. If this is null, an API key will be read from the
+    /// test configuration.</param>
+    /// <param name="keyCredential">(Optional) The key credential to use instead of the one from the configuration.</param>
+    public virtual AzureOpenAIClient GetTestTopLevelClient(
+        IConfiguration? config,
+        TestClientOptions? options = null,
+        TokenCredential? tokenCredential = null,
+        ApiKeyCredential? keyCredential = null)
+    {
+        // First validate that the config has the parameters we need
+        if (config == null)
+        {
+            throw CreateKeyNotFoundEx("any configuration");
+        }
+        else if (config.Endpoint is null)
+        {
+            throw CreateKeyNotFoundEx("endpoint");
+        }
+        else if (tokenCredential == null && keyCredential == null && string.IsNullOrEmpty(config.Key))
+        {
+            throw CreateKeyNotFoundEx("API key");
+        }
+
+        // Configure the test options as needed
+        options ??= new();
+        Action<PipelineRequest>?
requestAction = options.ShouldOutputRequests ? DumpRequest : null; + Action? responseAction = options.ShouldOutputResponses ? DumpResponse : null; + options.AddPolicy(new TestPipelinePolicy(requestAction, responseAction), PipelinePosition.PerCall); + + options = ConfigureClientOptions(options); + + AzureOpenAIClient topLevelClient; + if (tokenCredential != null) + { + topLevelClient = new AzureOpenAIClient(config.Endpoint, tokenCredential, options); + } + else + { + topLevelClient = new AzureOpenAIClient(config.Endpoint, keyCredential ?? new ApiKeyCredential(config.Key!), options); + } + + return topLevelClient; + } + + /// + /// Gets the properly instrumented client to use for testing. This have proper support for automatic sync/async method testing, + /// as well as recording, and playback support. + /// + /// (Optional) The client options to use. + /// (Optional) The token credential to use. If this is null, an API key will be read from the + /// test configuration. + /// (Optional) The key credential to use instead of the one from the configuration. + /// The test client instance. + public virtual TClient GetTestClient(TestClientOptions? options = null, TokenCredential? tokenCredential = null, ApiKeyCredential? keyCredential = null) + => GetTestClient(TestConfig.GetConfig(), options, tokenCredential, keyCredential); + + /// + /// Gets the properly instrumented client to use for testing. This have proper support for automatic sync/async method testing, + /// as well as recording, and playback support. + /// + /// + /// (Optional) The client options to use. + /// (Optional) The token credential to use. If this is null, an API key will be read from the + /// test configuration. + /// (Optional) The key credential to use instead of the one from the configuration. + /// The test client instance. + public virtual TClient GetTestClient(string configName, TestClientOptions? options = null, TokenCredential? tokenCredential = null, ApiKeyCredential? 
keyCredential = null) + => GetTestClient(TestConfig.GetConfig(configName), options, tokenCredential, keyCredential); + + /// + /// Gets a different type of client using the same configuration as the specified client. + /// + /// The type of other client to create. + /// The client instance whose configuration we want to use. + /// (Optional) The specific deployment to use instead of the one from the config. + /// + /// The client instance passed was not instrumented + public virtual TExplicitClient GetTestClientFrom(TClient client, string? deploymentName = null) + { + var instrumented = (TopLevelInfo?)GetClientContext(client); + if (instrumented?.TopLevelClient != null + && instrumented?.Config != null) + { + return GetTestClient(instrumented.TopLevelClient, instrumented.Config, deploymentName); + } + + throw new NotSupportedException("The client provided was not properly instrumented. Please make sure to get your test client " + + "instances using the GetTestClient() methods"); + } + + #region overrides + + /// + protected override RecordedTestMode GetDefaultRecordedTestMode() + => AzureTestEnvironment.DefaultRecordMode; + + /// + protected override ProxyServiceOptions CreateProxyServiceOptions() + => new() + { + DotnetExecutable = TestEnvironment.DotNetExe.FullName, + TestProxyDll = TestEnvironment.TestProxyDll.FullName, + DevCertFile = TestEnvironment.TestProxyHttpsCert.FullName, + DevCertPassword = TestEnvironment.TestProxyHttpsCertPassword, + StorageLocationDir = TestEnvironment.RepoRoot.FullName, + }; + + /// + protected override RecordingStartInformation CreateRecordingSessionStartInfo() + { + // This uses the same directory structure as the previous Azure.Core.TestFramework used for an easy drop in replacement. 
+ // For example, suppose your test class is (and your class name matches the file name): + // c:\src\azure-sdk-for-net\sdk\openai\Azure.AI.OpenAI\tests\ChatTests.cs + // Then this would return something like: + // sdk\openai\Azure.AI.OpenAI\tests\SessionRecords\ChatTests\TestName.json + DirectoryInfo? sourceDir = GetType().Assembly.GetAssemblySourceDir(); + string relativeDir = PathHelpers.GetRelativePath( + TestEnvironment.RepoRoot.FullName, + sourceDir?.FullName ?? TestEnvironment.RepoRoot.FullName); + + string recordingFile = Path.Combine( + relativeDir, + "SessionRecords", + GetType().Name, + GetRecordedTestFileName()); + + // Start at the source directory for the current test project, and then walk up the directory structure searching for + // an "assets.json" file. + string? assetsFile = null; + for ( + DirectoryInfo? current = sourceDir; + current != null && current.FullName != TestEnvironment.RepoRoot.FullName; + current = current?.Parent) + { + string file = Path.Combine(current!.FullName, "assets.json"); + if (File.Exists(file)) + { + assetsFile = file; + break; + } + } + + return new() + { + RecordingFile = recordingFile, + AssetsFile = assetsFile + }; + } + + #endregion + + /// + /// Polls until a condition has been met with a maximum wait time. The function will always return the last value even + /// if the condition was not met. + /// + /// The value in the . + /// The initial value. + /// The asynchronous function to get the latest state of the value. + /// When we should stop waiting. + /// (Optional) The amount of time to wait between retries. This will be ignored in playback + /// mode. Default is 2 seconds. + /// (Optional) The maximum amount of time to wait until the condition becomes true. This will be ignored in + /// playback mode. The default is 2 minutes. + /// The final state. This will return when the conditions have been met or we timed out. 
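The polling contract documented here — fetch repeatedly, stop once a condition holds or a deadline passes, and always hand back the last observed value even on timeout — can be sketched as follows (a simplified synchronous Python analogue of the pattern, not the C# implementation itself):

```python
import time

def wait_until_return_last(initial, get, stop_condition, delay=0.01, max_wait=0.5):
    """Poll `get` until `stop_condition` holds or `max_wait` elapses; always return the last value."""
    stop_time = time.monotonic() + max_wait
    result = initial
    while not stop_condition(result) and time.monotonic() < stop_time:
        time.sleep(delay)
        result = get()
    return result

counter = iter(range(10))
value = wait_until_return_last(0, lambda: next(counter), lambda v: v >= 3)
print(value)  # 3
```

Returning the last value rather than raising on timeout lets the caller assert on whatever terminal state the service actually reached, which makes test failures more informative.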
+    protected virtual Task<T> WaitUntilReturnLast<T>(T initialValue, Func<Task<ClientResult<T>>> getAsync, Predicate<T> stopCondition, TimeSpan? waitTimeBetweenRequests = null, TimeSpan? maxWait = null)
+        => WaitUntilReturnLast(initialValue, new Func<Task<T>>(async () => await getAsync().ConfigureAwait(false)), stopCondition, waitTimeBetweenRequests, maxWait);
+
+    /// <summary>
+    /// Polls until a condition has been met with a maximum wait time. The function will always return the last value even
+    /// if the condition was not met.
+    /// </summary>
+    /// <typeparam name="T">The return value.</typeparam>
+    /// <param name="initialValue">The initial value.</param>
+    /// <param name="getAsync">The asynchronous function to get the latest state of the value.</param>
+    /// <param name="stopCondition">When we should stop waiting.</param>
+    /// <param name="waitTimeBetweenRequests">(Optional) The amount of time to wait between retries. This will be ignored in playback
+    /// mode. Default is 2 seconds.</param>
+    /// <param name="maxWait">(Optional) The maximum amount of time to wait until the condition becomes true. This will be ignored in
+    /// playback mode. The default is 2 minutes.</param>
+    /// <returns>The final state. This will return when the conditions have been met or we timed out.</returns>
+    protected virtual async Task<T> WaitUntilReturnLast<T>(T initialValue, Func<Task<T>> getAsync, Predicate<T> stopCondition, TimeSpan? waitTimeBetweenRequests = null, TimeSpan? maxWait = null)
+    {
+        TimeSpan delay, max;
+        if (Mode == RecordedTestMode.Playback)
+        {
+            delay = TimeSpan.FromMilliseconds(10);
+            max = TimeSpan.FromSeconds(30);
+        }
+        else
+        {
+            delay = waitTimeBetweenRequests ?? TimeSpan.FromSeconds(2);
+            max = maxWait ?? TimeSpan.FromMinutes(2);
+        }
+
+        DateTimeOffset stopTime = DateTimeOffset.Now + max;
+        T result = initialValue;
+
+        while (!stopCondition(result) && DateTimeOffset.Now < stopTime)
+        {
+            await Task.Delay(delay).ConfigureAwait(false);
+            result = await getAsync().ConfigureAwait(false);
+        }
+
+        return result;
+    }
+
+    ///
+    /// Gets the properly instrumented client to use for testing. This has proper support for automatic sync/async method testing,
+    /// as well as recording and playback support.
+ /// + /// The test configuration to use + /// (Optional) The client options to use. + /// (Optional) The token credential to use. If this is null, an API key will be read from the + /// test configuration. + /// (Optional) The key credential to use instead of the one from the configuration. + /// The test client instance. + protected virtual TClient GetTestClient(IConfiguration? config, TestClientOptions? options = null, TokenCredential? tokenCredential = null, ApiKeyCredential? keyCredential = null) + { + AzureOpenAIClient topLevelClient = GetTestTopLevelClient(config, options, tokenCredential, keyCredential); + return GetTestClient(topLevelClient, config!); + } + + /// + /// Gets the properly instrumented client to use for testing. This have proper support for automatic sync/async method testing, + /// as well as recording, and playback support. + /// + /// The type of test client to get. + /// The top level client to use. + /// The configuration to use to get the deployment information (if needed). + /// The instrumented client instance to use. + /// Support for the type of client being requested has not been implemented yet. + protected virtual TExplicitClient GetTestClient(AzureOpenAIClient topLevelClient, IConfiguration config, string? deploymentName = null) + { + Func getDeployment = () => deploymentName ?? config?.Deployment ?? 
throw CreateKeyNotFoundEx("deployment"); + object clientObject; + + switch (typeof(TExplicitClient).Name) + { + case nameof(AssistantClient): + clientObject = topLevelClient.GetAssistantClient(); + break; + case nameof(AudioClient): + clientObject = topLevelClient.GetAudioClient(getDeployment()); + break; + case nameof(BatchClient): + clientObject = topLevelClient.GetBatchClient(getDeployment()); + break; + case nameof(ChatClient): + clientObject = topLevelClient.GetChatClient(getDeployment()); + break; + case nameof(EmbeddingClient): + clientObject = topLevelClient.GetEmbeddingClient(getDeployment()); + break; + case nameof(FileClient): + clientObject = topLevelClient.GetFileClient(); + break; + case nameof(FineTuningClient): + clientObject = topLevelClient.GetFineTuningClient(); + break; + case nameof(ImageClient): + clientObject = topLevelClient.GetImageClient(getDeployment()); + break; + case nameof(VectorStoreClient): + clientObject = topLevelClient.GetVectorStoreClient(); + break; + case nameof(AzureDeploymentClient): + var accessor = NonPublic.FromField("_transport"); + clientObject = new AzureDeploymentClient( + config, + TestEnvironment.Credential, + transport: accessor.Get(topLevelClient.Pipeline)); + break; + default: + throw new NotImplementedException($"Test client helpers not yet implemented for {typeof(TExplicitClient)}"); + }; + + object instrumented = WrapClient( + typeof(TExplicitClient), + clientObject, + new TopLevelInfo + { + TopLevelClient = topLevelClient, + Config = config, + }, + null); + + return (TExplicitClient)instrumented; + } + + private Exception CreateKeyNotFoundEx(string whatIsMissing) + { + return new KeyNotFoundException($"Could not find any {whatIsMissing} to use. 
Please make sure you have the necessary" + + $" {TestConfig.AssetsJson} config file, or have the needed environment variables set"); + } + + private static void DumpRequest(PipelineRequest request) + { + Console.WriteLine($"--- New request ---"); + Console.WriteLine($"{request.Method} {request?.Uri}"); + string headers = string.Join("\n ", + request!.Headers + .Select(kvp => $"{kvp.Key}: {(kvp.Key.ToLowerInvariant().Contains("auth") ? "***" : kvp.Value)}")); + Console.Write(" "); + Console.WriteLine(headers); + + if (request?.Content is not null) + { + using MemoryStream stream = new(); + request.Content.WriteTo(stream, default); + stream.Position = 0; + + string? contentType = request.Headers.GetFirstOrDefault("Content-Type"); + if (IsProbableTextContent(contentType)) + { + DumpText(contentType, stream); + } + else + { + DumpHex(stream); + } + } + } + + private static void DumpResponse(PipelineResponse response) + { + Console.WriteLine($"--- Response ---"); + Console.WriteLine($"{response.Status} - {response.ReasonPhrase}"); + string headers = string.Join( + "\n ", + response.Headers + .Where(kvp => !kvp.Key.ToLowerInvariant().Contains("client-")) + .Select(kvp => $"{kvp.Key}: {kvp.Value}")); + Console.Write(" "); + Console.WriteLine(headers); + + response.BufferContent(); + + if (response!.Content is not null) + { + using Stream stream = response.Content.ToStream(); + string? contentType = response.Headers.GetFirstOrDefault("Content-Type"); + if (IsProbableTextContent(contentType)) + { + DumpText(contentType, stream); + } + else + { + DumpHex(stream); + } + } + + Console.WriteLine(); + } + + private static bool IsProbableTextContent(string? contentType) + { + contentType = contentType?.ToLowerInvariant() ?? string.Empty; + return contentType.StartsWith("application/json") + || contentType.StartsWith("text/"); + } + + private static void DumpText(string? 
contentType, Stream stream)
+    {
+        if (contentType?.ToLowerInvariant().StartsWith("application/json") == true)
+        {
+            using var json = JsonDocument.Parse(stream);
+
+            stream = new MemoryStream();
+            using (Utf8JsonWriter writer = new(stream, new() { Indented = true }))
+            {
+                json.WriteTo(writer);
+            }
+
+            stream.Seek(0, SeekOrigin.Begin);
+        }
+
+        using StreamReader reader = new(stream);
+        Console.WriteLine(reader.ReadToEnd());
+    }
+
+    private static void DumpHex(Stream stream, int maxLines = 256)
+    {
+        byte[] buffer = new byte[32];
+        StringBuilder hex = new(3 * buffer.Length);
+        StringBuilder chars = new(buffer.Length);
+
+        int read = 0;
+        for (int lines = 0; (read = stream.FillBuffer(buffer)) > 0 && lines < maxLines; lines++)
+        {
+            for (int i = 0; i < read; i++)
+            {
+                hex.AppendFormat("{0:X2} ", buffer[i]);
+
+                char c = Convert.ToChar(buffer[i]);
+                chars.Append(char.IsControl(c) ? ' ' : c);
+            }
+
+            // StringBuilder has no PadRight, so convert to string before padding the final (possibly short) row
+            Console.Write(hex.ToString().PadRight(buffer.Length * 3));
+            Console.Write("| ");
+            Console.WriteLine(chars);
+
+            hex.Clear();
+            chars.Clear();
+        }
+
+        if (read != 0)
+        {
+            Console.WriteLine(" ... 
truncated"); + } + } + + protected void ValidateById(string id) + { + Assert.That(id, Is.Not.Null.Or.Empty); + switch (typeof(T).Name) + { + case nameof(Assistant): + _assistantIdsToDelete.Add(id); + break; + case nameof(AssistantThread): + _threadIdsToDelete.Add(id); + break; + case nameof(OpenAIFileInfo): + _fileIdsToDelete.Add(id); + break; + case nameof(ThreadRun): + break; + case nameof(VectorStore): + _vectorStoreIdsToDelete.Add(id); + break; + default: + throw new NotImplementedException(); + } + } + + protected void ValidateById(string id, string parentId) + { + Assert.That(id, Is.Not.Null.Or.Empty); + Assert.That(parentId, Is.Not.Null.Or.Empty); + switch (typeof(T).Name) + { + case nameof(ThreadMessage): + _threadIdsWithMessageIdsToDelete.Add((parentId, id)); + break; + case nameof(VectorStoreFileAssociation): + _vectorStoreFileAssociationsToRemove.Add((parentId, id)); + break; + default: + throw new NotImplementedException(); + } + } + + /// + /// Performs basic, invariant validation of a target that was just instantiated from its corresponding origination + /// mechanism. If applicable, the instance is recorded into the test run for cleanup of persistent resources. + /// + /// Instance type being validated. + /// The instance to validate. + /// The provided instance type isn't supported. 
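The `ValidateById` overloads above all follow one pattern: assert the ID is non-empty, then queue it in a per-type list so the teardown step can delete the resource after the test run. A minimal sketch of that bookkeeping (Python, with hypothetical kind names rather than the real model types):

```python
class CleanupRegistry:
    """Track created resource IDs per kind so a teardown step can delete them later."""

    def __init__(self):
        self._ids = {"assistant": [], "thread": [], "file": [], "vector_store": []}

    def register(self, kind: str, resource_id: str):
        if not resource_id:
            raise ValueError("resource id must be non-empty")
        if kind not in self._ids:
            raise NotImplementedError(kind)
        self._ids[kind].append(resource_id)

    def drain(self, kind: str):
        """Return and clear the pending IDs for a kind, e.g. during [TearDown]."""
        ids, self._ids[kind] = self._ids[kind], []
        return ids

reg = CleanupRegistry()
reg.register("assistant", "asst_123")
print(reg.drain("assistant"))  # ['asst_123']
```

Throwing on an unknown kind (the `NotImplementedException` in the C# switch) is deliberate: a new resource type that silently skips registration would leak service-side resources.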
+ protected void Validate(T target) + { + if (target is ThreadMessage message) + { + ValidateById(message.Id, message.ThreadId); + } + else if (target is VectorStoreFileAssociation fileAssociation) + { + ValidateById(fileAssociation.VectorStoreId, fileAssociation.FileId); + } + else + { + ValidateById(target switch + { + Assistant assistant => assistant.Id, + AssistantThread thread => thread.Id, + OpenAIFileInfo file => file.Id, + ThreadRun run => run.Id, + VectorStore store => store.Id, + _ => throw new NotImplementedException(), + }); + } + } + + [TearDown] + protected void Cleanup() + { + AzureOpenAIClient topLevelCleanupClient = GetTestTopLevelClient(TestConfig.GetConfig(), new() + { + ShouldOutputRequests = false, + ShouldOutputResponses = false, + }); + AssistantClient client = topLevelCleanupClient.GetAssistantClient(); + VectorStoreClient vectorStoreClient = topLevelCleanupClient.GetVectorStoreClient(); + FileClient fileClient = topLevelCleanupClient.GetFileClient(); + RequestOptions requestOptions = new() { ErrorOptions = ClientErrorBehaviors.NoThrow, }; + foreach ((string threadId, string messageId) in _threadIdsWithMessageIdsToDelete) + { + Console.WriteLine($"Cleanup: {messageId} -> {client.DeleteMessage(threadId, messageId, requestOptions)?.GetRawResponse().Status}"); + } + foreach (string assistantId in _assistantIdsToDelete) + { + Console.WriteLine($"Cleanup: {assistantId} -> {client.DeleteAssistant(assistantId, requestOptions)?.GetRawResponse().Status}"); + } + foreach (string threadId in _threadIdsToDelete) + { + Console.WriteLine($"Cleanup: {threadId} -> {client.DeleteThread(threadId, requestOptions)?.GetRawResponse().Status}"); + } + foreach ((string vectorStoreId, string fileId) in _vectorStoreFileAssociationsToRemove) + { + Console.WriteLine($"Cleanup: {vectorStoreId}<->{fileId} => {vectorStoreClient.RemoveFileFromStore(vectorStoreId, fileId, requestOptions)?.GetRawResponse().Status}"); + } + foreach (string vectorStoreId in 
_vectorStoreIdsToDelete)
+        {
+            Console.WriteLine($"Cleanup: {vectorStoreId} => {vectorStoreClient.DeleteVectorStore(vectorStoreId, requestOptions)?.GetRawResponse().Status}");
+        }
+        foreach (string fileId in _fileIdsToDelete)
+        {
+            Console.WriteLine($"Cleanup: {fileId} -> {fileClient.DeleteFile(fileId, requestOptions)?.GetRawResponse().Status}");
+        }
+        _threadIdsWithMessageIdsToDelete.Clear();
+        _assistantIdsToDelete.Clear();
+        _threadIdsToDelete.Clear();
+        _vectorStoreFileAssociationsToRemove.Clear();
+        _vectorStoreIdsToDelete.Clear();
+        _fileIdsToDelete.Clear();
+
+        // If we are in recording mode, update the recorded playback configuration as well
+        if (Mode == RecordedTestMode.Record
+            && TestContext.CurrentContext.Result.Outcome == ResultState.Success)
+        {
+            TestConfig.SavePlaybackConfig();
+        }
+    }
+
+    protected static void ValidateClientResult(ClientResult result)
+    {
+        Assert.That(result, Is.Not.Null);
+        Assert.That(result.GetRawResponse(), Is.Not.Null);
+    }
+
+    protected static PipelineResponse ValidateClientResultResponse(ClientResult result)
+    {
+        ValidateClientResult(result);
+
+        PipelineResponse response = result.GetRawResponse();
+        Assert.That(response.Status, Is.GreaterThanOrEqualTo(200).And.LessThan(300));
+        Assert.That(response.Headers, Is.Not.Null);
+        Assert.That(response.Headers.GetFirstOrDefault("Content-Type"), Does.StartWith("application/json"));
+        Assert.That(response.Content, Is.Not.Null);
+
+        return response;
+    }
+
+    protected virtual TModel ValidateAndParse<TModel>(ClientResult result) where TModel : IJsonModel<TModel>
+    {
+        var response = ValidateClientResultResponse(result);
+
+        TModel? model = ModelReaderWriter.Read<TModel>(response.Content, ModelReaderWriterOptions.Json);
+        Assert.That(model, Is.Not.Null);
+        return model!;
+    }
+
+    protected virtual TModel ValidateAndParse<TModel>(ClientResult result, JsonSerializerOptions? 
options = null) + { + var response = ValidateClientResultResponse(result); + + using Stream stream = response.Content.ToStream(); + Assert.That(stream, Is.Not.Null); + + TModel? model = JsonHelpers.Deserialize(stream, options ?? JsonOptions.OpenAIJsonOptions); + Assert.That(model, Is.Not.Null); + return model!; + } + + internal class TopLevelInfo + { + //required public object Client { get; init; } + required public AzureOpenAIClient TopLevelClient { get; init; } + required public IConfiguration Config { get; init; } + } + + private readonly List _assistantIdsToDelete = []; + private readonly List _threadIdsToDelete = []; + private readonly List<(string, string)> _threadIdsWithMessageIdsToDelete = []; + private readonly List _fileIdsToDelete = []; + private readonly List<(string, string)> _vectorStoreFileAssociationsToRemove = []; + private readonly List _vectorStoreIdsToDelete = []; +} + +public class TestClientOptions : AzureOpenAIClientOptions +{ + public TestClientOptions() : base() + { } + + public TestClientOptions(ServiceVersion version) : base(version) + { } + + public bool ShouldOutputRequests { get; set; } = true; + public bool ShouldOutputResponses { get; set; } = true; +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Utils/Assets.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Utils/Assets.cs new file mode 100644 index 000000000..15590bc9c --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Utils/Assets.cs @@ -0,0 +1,81 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. 
+ +using System; +using System.IO; + +namespace Azure.AI.OpenAI.Tests +{ + public class Assets + { + public Assets() + { + HelloWorld = new() + { + Type = AssetType.Audio, + Language = "en", + Description = "Hello world", + Name = "hello_world.m4a", + RelativePath = GetPath("hello_world.m4a"), + MimeType = "audio/m4a" + }; + WhisperFrenchDescription = new() + { + Type = AssetType.Audio, + Language = "fr", + Description = "Whisper description in French", + Name = "french.wav", + RelativePath = GetPath("french.wav"), + MimeType = "audio/wave" + }; + DogAndCat = new() + { + Type = AssetType.Image, + Language = null, + Description = "A picture of a cat next to a dog", + Name = "variation_sample_image.jpg", + RelativePath = GetPath("variation_sample_image.png"), + MimeType = "image/png", + Url = new Uri("https://cdn.openai.com/API/images/guides/image_variation_original.webp") + }; + FineTuning = new() + { + Type = AssetType.Text, + Language = "en", + Description = "Fine tuning data for Open AI to generate a JSON object based on sports headlines", + Name = "fine_tuning.jsonl", + RelativePath = GetPath("fine_tuning.jsonl"), + MimeType = "text/plain" + }; + } + + public virtual AssetInfo HelloWorld { get; } + public virtual AssetInfo WhisperFrenchDescription { get; } + public virtual AssetInfo DogAndCat { get; } + public virtual AssetInfo FineTuning { get; } + + protected virtual string GetPath(string assetName) + { + return Path.Combine("Assets", assetName); + } + } + + public enum AssetType + { + Text, + Audio, + Image, + Raw + } + + public class AssetInfo + { + required public AssetType Type { get; init; } + required public string Name { get; init; } + required public string RelativePath { get; init; } + required public string MimeType { get; init; } + public string? Language { get; init; } + public string? Description { get; init; } + public Uri? 
Url { get; init; } + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Utils/AzureTestEnvironment.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Utils/AzureTestEnvironment.cs new file mode 100644 index 000000000..be87fb307 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Utils/AzureTestEnvironment.cs @@ -0,0 +1,236 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +using System; +using System.Collections.Generic; +using System.IO; +using System.Linq; +using System.Reflection; +using Azure.Core; +using Azure.Identity; +using OpenAI.TestFramework; +using OpenAI.TestFramework.Mocks; +using OpenAI.TestFramework.Recording; +using OpenAI.TestFramework.Utils; + +namespace Azure.AI.OpenAI.Tests.Utils; + +/// +/// Represents an Azure test environment. +/// +public class AzureTestEnvironment +{ + private readonly RecordedTestMode _mode; + private readonly string _optionPrefix; + private TokenCredential? _credential; + + /// + /// Initializes a new instance. + /// + /// The recorded test mode to use. + public AzureTestEnvironment(RecordedTestMode mode) + { + _mode = mode; + + /** + * We want to be able to to find "root" folders: + * - The root of the Git repo on disk + * - The root folder of the source code (eng/sdk) + * These two are usually the same. In external repos, they may however be a little different. + * + * To search for these folders, we use a simple method where we search up from these starting folders: + * - Check the "SourcePath" assembly metadata attribute value. All projects in the Azure C# repo automatically have this attribute + * added as part of the build "magic" (see {repo_root}\Directory.Build.Targets) + * - Where the executing assembly is running from + * Until we find a parent folder that contains a specific subfolder(s). 
+ */ + DirectoryInfo?[] startingPoints = + [ + AssemblyHelper.GetAssemblySourceDir(), + new FileInfo(Assembly.GetExecutingAssembly().Location).Directory, + ]; + + RepoRoot = FindFirstParentWithSubfolders(startingPoints, ".git") + ?? throw new InvalidOperationException("Could not determine the GIT root folder for this repository"); + + string sourceRoot = (FindFirstParentWithSubfolders(startingPoints, "eng", "sdk") ?? RepoRoot) + .FullName; + + DotNetExe = AssemblyHelper.GetDotnetExecutable() + ?? throw new InvalidOperationException( + "Could not determine the dotnet executable to use. Do you have .Net installed or have your paths correctly configured?"); + + TestProxyDll = new FileInfo( + AssemblyHelper.GetAssemblyMetadata("TestProxyPath") + ?? throw new InvalidOperationException("Could not determine the path to the recording test proxy DLL")); + + TestProxyHttpsCert = new FileInfo(Path.Combine( + sourceRoot, + "eng", + "common", + "testproxy", + "dotnet-devcert.pfx")); + if (!TestProxyHttpsCert.Exists) + { + throw new InvalidOperationException("Could not find test proxy HTTPS root certificate to use."); + } + + TestProxyHttpsCertPassword = "password"; + + string? serviceName = null; + DirectoryInfo? sourceDir = GetType().Assembly.GetAssemblySourceDir(); + if (sourceDir != null) + { + string relativePath = PathHelpers.GetRelativePath( + Path.Combine(sourceRoot, "sdk"), + sourceDir.FullName); + serviceName = relativePath + .Split(new char[] { Path.DirectorySeparatorChar }, StringSplitOptions.RemoveEmptyEntries) + .FirstOrDefault()!; + } + + _optionPrefix = serviceName?.ToUpperInvariant() + "_"; + } + + /// + /// Gets the root Git folder. + /// + public DirectoryInfo RepoRoot { get; } + + /// + /// Gets the path to the dotnet executable. This will be used in combination with to start the + /// recording test proxy service. 
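The search strategy described in the comment above — start from a few candidate directories and walk up until a parent contains the expected subfolders (for example `.git`) — reduces to a short loop. A sketch under those assumptions (Python, with a temporary directory standing in for a repo checkout):

```python
import tempfile
from pathlib import Path
from typing import Optional

def find_parent_with_subfolders(start: Path, *subfolders: str) -> Optional[Path]:
    """Walk up from `start` until a directory containing all of `subfolders` is found."""
    current: Optional[Path] = start
    while current is not None:
        if all((current / sub).is_dir() for sub in subfolders):
            return current
        # Stop once the filesystem root is reached (its parent is itself)
        current = current.parent if current.parent != current else None
    return None

with tempfile.TemporaryDirectory() as tmp:
    repo_root = Path(tmp)
    (repo_root / ".git").mkdir()
    nested = repo_root / "sdk" / "openai" / "tests"
    nested.mkdir(parents=True)
    assert find_parent_with_subfolders(nested, ".git") == repo_root
```

Trying several starting points (assembly source path first, then the executing assembly's location) keeps the lookup working both inside the Azure SDK repo and in external checkouts where those two roots differ.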
+ /// + public FileInfo DotNetExe { get; } + + /// + /// The path to test proxy DLL that will be used when starting the recording test proxy service. + /// + public FileInfo TestProxyDll { get; } + + /// + /// Gets the HTTPS certificate file to use as the signing certificate for HTTPS connections to the test proxy. + /// + public FileInfo TestProxyHttpsCert { get; } + + /// + /// Gets the password for . + /// + public string TestProxyHttpsCertPassword { get; } + + /// + /// Gets the token credential to use during testing. This will change depending on the record mode. + /// + public TokenCredential Credential => _credential ??= GetCredential(); + + /// + /// Gets the default record mode to use for the test. This will attempt to read from the test context, or environment variables. + /// + public static RecordedTestMode DefaultRecordMode + { + get + { + string? modeString = TestContext.Parameters["TestMode"] + ?? Environment.GetEnvironmentVariable("AZURE_TEST_MODE"); + + if (Enum.TryParse(modeString, true, out RecordedTestMode mode)) + { + return mode; + } + + return RecordedTestMode.Playback; + } + } + + /// + /// Gets an optional value from environment variables. + /// + /// The name of the value to retrieve. + /// The value, or null if it did not exist. + public string? GetOptionalVariable(string name) + { + return new[] + { + _optionPrefix + name, + name, + "AZURE_" + name + } + .Select(Environment.GetEnvironmentVariable) + .FirstOrDefault(value => !string.IsNullOrWhiteSpace(value)); + } + + /// + /// Gets a value from environment variables, or throws an exception if it does not exist. + /// + /// The name of the value to retrieve. + /// The value. + /// If the value did not exist. + public string GetVariable(string name) + { + string? optionalVariable = GetOptionalVariable(name); + return optionalVariable + ?? 
throw new InvalidOperationException($"Could not find required environment variable '{_optionPrefix + name}' or '{name}'.");
+    }
+
+    private static DirectoryInfo? FindFirstParentWithSubfolders(IEnumerable<DirectoryInfo?> startingDirs, params string[] subFolders)
+        => startingDirs
+            .Select(d => FindParentWithSubfolders(d, subFolders))
+            .FirstOrDefault(d => d != null);
+
+    private static DirectoryInfo? FindParentWithSubfolders(DirectoryInfo? start, params string[] subFolders)
+    {
+        if (subFolders == null || subFolders.Length == 0)
+        {
+            return null;
+        }
+
+        for (DirectoryInfo? current = start; current != null; current = current.Parent)
+        {
+            if (!current.Exists)
+            {
+                return null;
+            }
+            else if (subFolders.All(sub => current.EnumerateDirectories(sub).Any()))
+            {
+                return current;
+            }
+        }
+
+        return null;
+    }
+
+    private TokenCredential GetCredential()
+    {
+        if (_mode == RecordedTestMode.Playback)
+        {
+            return new MockTokenCredential();
+        }
+
+        // I'm not sure exactly what the possible combinations to use here are, so I've essentially copied the logic from
+        // TestEnvironment.cs in Azure.Core.TestFramework (though it is a little simplified here)
+        string? clientSecret = GetOptionalVariable("CLIENT_SECRET");
+        string? 
systemAccessToken = GetOptionalVariable("SYSTEM_ACCESSTOKEN"); + + if (!string.IsNullOrWhiteSpace(clientSecret)) + { + return new ClientSecretCredential( + GetVariable("TENANT_ID"), + GetVariable("CLIENT_ID"), + clientSecret); + } + else if (!string.IsNullOrWhiteSpace(systemAccessToken)) + { + return new AzurePipelinesCredential( + GetVariable("AZURESUBSCRIPTION_TENANT_ID"), + GetVariable("AZURESUBSCRIPTION_CLIENT_ID"), + GetVariable("AZURESUBSCRIPTION_SERVICE_CONNECTION_ID"), + systemAccessToken, + new AzurePipelinesCredentialOptions { AuthorityHost = new Uri(GetVariable("AZURE_AUTHORITY_HOST")) }); + } + else + { + return new DefaultAzureCredential( + new DefaultAzureCredentialOptions() { ExcludeManagedIdentityCredential = true }); + } + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Utils/Config/BasicConfig.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Utils/Config/BasicConfig.cs new file mode 100644 index 000000000..d7f7e596d --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Utils/Config/BasicConfig.cs @@ -0,0 +1,58 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +#nullable enable + +using System; +using System.Collections.Generic; + +namespace Azure.AI.OpenAI.Tests.Utils.Config +{ + /// + /// A basic configuration that allows you to directly set values. + /// + public class BasicConfig : IConfiguration + { + private Dictionary _values = new Dictionary(StringComparer.OrdinalIgnoreCase); + + /// + public Uri? Endpoint { get; set; } + /// + public string? Key { get; set; } + /// + public string? Deployment { get; set; } + + /// + /// Adds an additional value to the configuration. + /// + /// The type of the value to add. + /// The key. + /// The value to add. + /// The instance for chaining. + public BasicConfig AddValue(string key, TVal? 
value) + { + if (value != null) + { + _values[key] = value; + } + else + { + _values.Remove(key); + } + + return this; + } + + /// + public TVal? GetValue(string key) + { + if (_values.TryGetValue(key, out object? val) + && val is TVal cast) + { + return cast; + } + + return default; + } + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Utils/Config/EnvironmentValuesConfig.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Utils/Config/EnvironmentValuesConfig.cs new file mode 100644 index 000000000..3cd72d98c --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Utils/Config/EnvironmentValuesConfig.cs @@ -0,0 +1,85 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +#nullable enable + +using System; +using System.ComponentModel; + +namespace Azure.AI.OpenAI.Tests.Utils.Config +{ + /// + /// Configuration that reads from environment variables. + /// + public class EnvironmentValuesConfig : INamedConfiguration + { + private const char ENV_KEY_SEPARATOR = '_'; + private const string SUFFIX_AOAI_API_KEY = "API_KEY"; + private const string SUFFIX_AOAI_ENDPOINT = "ENDPOINT"; + private const string SUFFIX_AOAI_DEPLOYMENT = "DEPLOYMENT"; + + private readonly string _prefix; + + /// + /// Creates a new instance. + /// + /// The environment value prefix to use. For example AZURE_OPENAI. + /// The prefix specified was null. + public EnvironmentValuesConfig(string prefix) + { + _prefix = prefix + ?.TrimEnd(ENV_KEY_SEPARATOR) + .ToUpperInvariant() + ?? throw new ArgumentNullException(nameof(prefix)); + + Endpoint = GetValue(SUFFIX_AOAI_ENDPOINT); + Key = GetValue(SUFFIX_AOAI_API_KEY); + Deployment = GetValue(SUFFIX_AOAI_DEPLOYMENT); + } + + /// + /// Creates a new instance. + /// + /// The environment value prefix to use. For example AZURE_OPENAI. + /// The specific type of client we want to get environment variable for + /// The prefix specified was null. 
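The environment-backed configuration that follows composes keys as `PREFIX_KEY` (upper-cased, underscore-separated), so for example the prefix `AZURE_OPENAI` plus the key `ENDPOINT` resolves to `AZURE_OPENAI_ENDPOINT`. The composition rule can be sketched like this (Python, with a hypothetical helper name; the real class also performs typed conversion via `TypeDescriptor`):

```python
import os

def get_config_value(prefix: str, key: str):
    """Compose PREFIX_KEY (upper-cased, trailing separators trimmed) and read it from the environment."""
    env_key = f"{prefix.rstrip('_').upper()}_{key.upper()}"
    return os.environ.get(env_key)

os.environ["AZURE_OPENAI_ENDPOINT"] = "https://example.openai.azure.com"
print(get_config_value("azure_openai", "endpoint"))  # https://example.openai.azure.com
```

Nesting a client name into the prefix (e.g. `AZURE_OPENAI_CHAT`) then gives per-client overrides for endpoint, key, and deployment without changing the lookup logic.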
+        public EnvironmentValuesConfig(string prefix, string clientName)
+            : this($"{prefix}{ENV_KEY_SEPARATOR}{clientName}")
+        {
+            Name = clientName;
+        }
+
+        /// <inheritdoc />
+        public string? Name { get; }
+
+        /// <inheritdoc />
+        public Uri? Endpoint { get; }
+
+        /// <inheritdoc />
+        public string? Key { get; }
+
+        /// <inheritdoc />
+        public string? Deployment { get; }
+
+        /// <inheritdoc />
+        public TVal? GetValue<TVal>(string key)
+        {
+            string envKey = $"{_prefix}{ENV_KEY_SEPARATOR}{key.ToUpperInvariant()}";
+
+            string? value = Environment.GetEnvironmentVariable(envKey);
+            if (value == null)
+            {
+                return default;
+            }
+            else if (value is TVal val)
+            {
+                return val;
+            }
+            else
+            {
+                var defaultConverter = TypeDescriptor.GetConverter(typeof(TVal));
+                return (TVal?)defaultConverter.ConvertFromInvariantString(value);
+            }
+        }
+    }
+}
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Utils/Config/FlattenedConfig.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Utils/Config/FlattenedConfig.cs
new file mode 100644
index 000000000..205a5d072
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Utils/Config/FlattenedConfig.cs
@@ -0,0 +1,79 @@
+// Copyright (c) Microsoft Corporation. All rights reserved.
+// Licensed under the MIT License.
+
+using System;
+using System.Collections.Generic;
+using System.Linq;
+using OpenAI.TestFramework.Utils;
+
+namespace Azure.AI.OpenAI.Tests.Utils.Config;
+
+/// <summary>
+/// Represents a flattened configuration that reads from one or more configurations in order. It will also
+/// record the values read from each configuration.
+/// </summary>
+public class FlattenedConfig : IConfiguration
+{
+    private IReadOnlyList<INamedConfiguration?> _configs;
+    private IDictionary<string, SanitizedJsonConfig>? _recordedConfig;
+
+    /// <summary>
+    /// Creates a new instance.
+    /// </summary>
+    /// <param name="configs">The configurations to read from in order.</param>
+    /// <param name="recordedConfig">Where to store the recorded configuration.</param>
+    /// <exception cref="ArgumentNullException">The configs passed was null.</exception>
+    public FlattenedConfig(INamedConfiguration?[] configs, IDictionary<string, SanitizedJsonConfig> recordedConfig)
+    {
+        _configs = configs ??
throw new ArgumentNullException(nameof(configs));
+        _recordedConfig = recordedConfig ?? throw new ArgumentNullException(nameof(recordedConfig));
+
+        Endpoint = GetAndRecordProperty(c => c.Endpoint, (c, v) => c.Endpoint = v);
+        Key = GetAndRecordProperty(c => c.Key, (c, v) => c.Key = v);
+        Deployment = GetAndRecordProperty(c => c.Deployment, (c, v) => c.Deployment = v);
+    }
+
+    /// <inheritdoc />
+    public Uri? Endpoint { get; }
+    /// <inheritdoc />
+    public string? Key { get; }
+    /// <inheritdoc />
+    public string? Deployment { get; }
+
+    /// <inheritdoc />
+    public TVal? GetValue<TVal>(string key)
+    {
+        TVal? value = default;
+        INamedConfiguration? selected = _configs
+            .Where(config => config != null)
+            .FirstOrDefault(config => (value = config!.GetValue<TVal>(key)) != null);
+
+        if (_recordedConfig != null && selected != null && value != null)
+        {
+            string configName = selected.Name ?? JsonConfig.DEFAULT_CONFIG_NAME;
+            SanitizedJsonConfig recorded = _recordedConfig.GetOrAdd(configName, _ => new SanitizedJsonConfig());
+            recorded.SetValue(key, value);
+        }
+
+        return value;
+    }
+
+    private TVal? GetAndRecordProperty<TVal>(Func<IConfiguration, TVal?> getter, Action<SanitizedJsonConfig, TVal?> setter)
+    {
+        TVal? value = default;
+        INamedConfiguration? selected = _configs
+            .Where(config => config != null)
+            .FirstOrDefault(config => (value = getter(config!)) != null);
+
+        if (_recordedConfig != null && selected != null && value != null)
+        {
+            string configName = selected.Name ?? JsonConfig.DEFAULT_CONFIG_NAME;
+            SanitizedJsonConfig recorded = _recordedConfig.GetOrAdd(configName, _ => new SanitizedJsonConfig());
+            setter(recorded, value);
+        }
+
+        return value;
+    }
+}
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Utils/Config/IConfiguration.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Utils/Config/IConfiguration.cs
new file mode 100644
index 000000000..97511d548
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Utils/Config/IConfiguration.cs
@@ -0,0 +1,129 @@
+// Copyright (c) Microsoft Corporation. All rights reserved.
+// Licensed under the MIT License.
+
+using System;
+using System.Collections.Generic;
+using OpenAI.TestFramework.AutoSyncAsync;
+
+namespace Azure.AI.OpenAI.Tests.Utils.Config;
+
+/// <summary>
+/// A test configuration for an Azure resource.
+/// </summary>
+public interface IConfiguration
+{
+    /// <summary>
+    /// The endpoint to use for sending requests to the Azure resource.
+    /// </summary>
+    Uri? Endpoint { get; }
+
+    /// <summary>
+    /// The API key to use for authenticating requests to the Azure resource.
+    /// </summary>
+    string? Key { get; }
+
+    /// <summary>
+    /// The deployment to use for this Azure resource.
+    /// </summary>
+    string? Deployment { get; }
+
+    /// <summary>
+    /// Gets additional values from the test configuration for the Azure resource.
+    /// </summary>
+    /// <typeparam name="TVal">The type of the value.</typeparam>
+    /// <param name="key">The name of the value (usually snake cased). For example: fine_tuned_model.</param>
+    /// <returns>The parsed value for that key, or null if the key was not found, or failed to be parsed.</returns>
+    TVal? GetValue<TVal>(string key);
+}
+
+/// <summary>
+/// A named test configuration for an Azure resource.
+/// </summary>
+public interface INamedConfiguration : IConfiguration
+{
+    /// <summary>
+    /// The name of the configuration.
+    /// </summary>
+    string? Name { get; }
+}
+
+/// <summary>
+/// Extension methods for <see cref="IConfiguration"/>.
+/// </summary>
+public static class ConfigurationExtensions
+{
+    /// <summary>
+    /// Gets additional values from the test configuration for the Azure resource, but throws exceptions if the key is not found.
+    /// </summary>
+    /// <typeparam name="TVal">The type of the value.</typeparam>
+    /// <param name="config">The configuration to get a value from.</param>
+    /// <param name="key">The name of the value (usually snake cased). For example: fine_tuned_model.</param>
+    /// <returns>The successfully parsed value for that key.</returns>
+    /// <exception cref="ArgumentNullException">If the configuration passed was null.</exception>
+    /// <exception cref="KeyNotFoundException">If the key could not be found.</exception>
+    public static TVal GetValueOrThrow<TVal>(this IConfiguration? config, string key)
+    {
+        if (config == null)
+        {
+            throw new ArgumentNullException(nameof(config));
+        }
+
+        return config.GetValue<TVal>(key)
+            ??
throw new KeyNotFoundException($"Could not find a value for '{key}' in the test configuration");
+    }
+
+    /// <summary>
+    /// Gets the configuration that was used when creating the client instance.
+    /// </summary>
+    /// <typeparam name="TExplicitClient">The type of the client.</typeparam>
+    /// <param name="client">The client instance.</param>
+    /// <returns>The configuration.</returns>
+    /// <exception cref="ArgumentException">The client did not have a config associated with it.</exception>
+    public static IConfiguration GetConfigOrThrow<TExplicitClient>(this TExplicitClient client) where TExplicitClient : class
+    {
+        var instrumented = GetTopLevelClientInfo(client);
+        return instrumented.Config ?? throw new ArgumentException("The client was instrumented with a null configuration");
+    }
+
+    /// <summary>
+    /// Gets the deployment to use from the configuration, or throws if none was found.
+    /// </summary>
+    /// <param name="config">The config.</param>
+    /// <param name="clientName">(Optional) The client name to include in the exception message.</param>
+    /// <returns>The deployment.</returns>
+    /// <exception cref="KeyNotFoundException">The deployment was not set or found.</exception>
+    public static string DeploymentOrThrow(this IConfiguration? config, string? clientName = null)
+    {
+        string str = clientName == null ? string.Empty : clientName + " ";
+        return config?.Deployment
+            ?? throw new KeyNotFoundException($"Could not find a {str}deployment in the test configuration");
+    }
+
+    /// <summary>
+    /// Gets the deployment from the specified client.
+    /// </summary>
+    /// <typeparam name="TExplicitClient">The type of the client.</typeparam>
+    /// <param name="client">The client instance.</param>
+    /// <returns>The deployment name used for that client instance.</returns>
+    /// <exception cref="ArgumentException">The client was not properly instrumented.</exception>
+    /// <exception cref="KeyNotFoundException">The client did not have a deployment configured.</exception>
+    public static string DeploymentOrThrow<TExplicitClient>(this TExplicitClient client) where TExplicitClient : class
+    {
+        var instrumented = GetTopLevelClientInfo(client);
+        return instrumented.Config.DeploymentOrThrow(client!.GetType().Name);
+    }
+
+    private static AoaiTestBase.TopLevelInfo GetTopLevelClientInfo<TExplicitClient>(TExplicitClient? client)
+        where TExplicitClient : class
+    {
+        if (client == null)
+        {
+            throw new ArgumentNullException(nameof(client));
+        }
+
+        return ((AoaiTestBase.TopLevelInfo?)(client as IAutoSyncAsync)?.Context)
+            ??
throw new ArgumentException( + $"The client was not properly wrapped for automatic sync/async ({client.GetType().Name})", + nameof(client)); + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Utils/Config/JsonConfig.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Utils/Config/JsonConfig.cs new file mode 100644 index 000000000..e7334b11f --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Utils/Config/JsonConfig.cs @@ -0,0 +1,62 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +using System; +using System.Collections.Generic; +using System.Text.Json; +using System.Text.Json.Serialization; +using OpenAI.TestFramework.Utils; + +namespace Azure.AI.OpenAI.Tests.Utils.Config; + +/// +/// A configuration that is deserialized from JSON. +/// +public class JsonConfig : IConfiguration +{ + /// + /// The default configuration key to use. + /// + public const string DEFAULT_CONFIG_NAME = "default"; + + /// + /// The JSON configuration to use when serializing and deserializing. + /// + public static readonly JsonSerializerOptions JSON_OPTIONS = new() + { + PropertyNameCaseInsensitive = true, + PropertyNamingPolicy = JsonOptions.SnakeCaseLower, + DictionaryKeyPolicy = JsonOptions.SnakeCaseLower, + WriteIndented = true, + AllowTrailingCommas = true, +#if NETFRAMEWORK + IgnoreNullValues = true, +#else + DefaultIgnoreCondition = JsonIgnoreCondition.WhenWritingNull, +#endif + }; + + /// + public Uri? Endpoint { get; init; } + /// + public string? Key { get; init; } + /// + public string? Deployment { get; init; } + + /// + /// Json values that are not part of the class go here. + /// + [JsonExtensionData] + public Dictionary? ExtensionData { get; set; } + + /// + public TVal? 
GetValue<TVal>(string key)
+    {
+        if (ExtensionData?.TryGetValue(key, out JsonElement value) == true)
+        {
+            return value.Deserialize<TVal>(JSON_OPTIONS);
+        }
+
+        return default;
+    }
+}
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Utils/Config/NamedConfig.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Utils/Config/NamedConfig.cs
new file mode 100644
index 000000000..5690f48a9
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Utils/Config/NamedConfig.cs
@@ -0,0 +1,39 @@
+// Copyright (c) Microsoft Corporation. All rights reserved.
+// Licensed under the MIT License.
+
+#nullable enable
+
+using System;
+using System.Text.Json;
+
+namespace Azure.AI.OpenAI.Tests.Utils.Config;
+
+/// <summary>
+/// A wrapper around a test configuration to associate an optional name.
+/// </summary>
+public class NamedConfig : INamedConfiguration
+{
+    private readonly IConfiguration? _config;
+
+    /// <summary>
+    /// Creates a new instance.
+    /// </summary>
+    /// <param name="config">The configuration instance.</param>
+    /// <param name="name">The name of the config.</param>
+    public NamedConfig(IConfiguration? config, string? name)
+    {
+        _config = config;
+        Name = name;
+    }
+
+    /// <inheritdoc />
+    public string? Name { get; }
+    /// <inheritdoc />
+    public Uri? Endpoint => _config?.Endpoint;
+    /// <inheritdoc />
+    public string? Key => _config?.Key;
+    /// <inheritdoc />
+    public string? Deployment => _config?.Deployment;
+    /// <inheritdoc />
+    public TVal? GetValue<TVal>(string key) => _config == null ? default : _config.GetValue<TVal>(key);
+}
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Utils/Config/SanitizedJsonConfig.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Utils/Config/SanitizedJsonConfig.cs
new file mode 100644
index 000000000..54e111aa7
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Utils/Config/SanitizedJsonConfig.cs
@@ -0,0 +1,182 @@
+// Copyright (c) Microsoft Corporation. All rights reserved.
+// Licensed under the MIT License.
+ +using System; +using System.Collections.Generic; +using System.Runtime.CompilerServices; +using System.Text.Json; +using System.Text.Json.Serialization; +using System.Text.RegularExpressions; +using OpenAI.TestFramework.Utils; + +namespace Azure.AI.OpenAI.Tests.Utils.Config +{ + /// + /// A sanitized JSON configuration. This will automatically sanitize the Endpoint, Key, subscription ID and resource group in the configuration + /// file. Please make sure to add any additional sanitization rules to the dictionary. + /// + public class SanitizedJsonConfig : IConfiguration + { + /// + /// The string to use when masking sensitive data. + /// + public const string MASK_STRING = "Sanitized"; + + /// + /// The pattern to match the subdomain of a URL. + /// + public const string HOST_SUBDOMAIN_PATTERN = @"(?<=.+://)([^\.]+)(?=[\./])"; + + private static readonly Regex HOST_SUBDOMAIN_MATCHER = new Regex(HOST_SUBDOMAIN_PATTERN, RegexOptions.Compiled); + private static readonly IReadOnlyDictionary> SANITIZERS = new Dictionary> + { + ["subscription_id"] = v => MASK_STRING, + ["resource_group"] = v => MASK_STRING, + ["endpoint"] = v => v is not null && (v is string || v is Uri) + ? MaskUriSubdomain(v.ToString())! + : MASK_STRING, + ["key"] = v => MASK_STRING, + ["api_key"] = v => MASK_STRING, + }; + + private Uri? _endpoint; + private string? _key; + private string? _deployment; + + /// + /// Creates a new instance. + /// + public SanitizedJsonConfig() + { + ExtensionData = new SortedDictionary(); + } + + /// + /// Creates a new instance from another . + /// + /// The configuration to create from. + /// If the configuration was null. 
+        public SanitizedJsonConfig(JsonConfig config) : this()
+        {
+            if (config == null)
+            {
+                throw new ArgumentNullException(nameof(config));
+            }
+
+            Endpoint = config.Endpoint;
+            Key = config.Key;
+            Deployment = config.Deployment;
+
+            if (config?.ExtensionData != null)
+            {
+                foreach (var kvp in config.ExtensionData)
+                {
+                    switch (kvp.Value.ValueKind)
+                    {
+                        case JsonValueKind.Undefined:
+                        case JsonValueKind.Null:
+                            break;
+                        case JsonValueKind.String:
+                            SetValue(kvp.Key, kvp.Value.GetString());
+                            break;
+                        default:
+                            ExtensionData[kvp.Key] = kvp.Value.Clone();
+                            break;
+                    }
+                }
+            }
+        }
+
+        /// <inheritdoc />
+        public Uri? Endpoint
+        {
+            get => _endpoint;
+            set => _endpoint = MaskProperty(value);
+        }
+
+        /// <inheritdoc />
+        public string? Key
+        {
+            get => _key;
+            set => _key = MaskProperty(value);
+        }
+
+        /// <inheritdoc />
+        public string? Deployment
+        {
+            get => _deployment;
+            set => _deployment = MaskProperty(value);
+        }
+
+        /// <summary>
+        /// Json values that are not part of the class go here.
+        /// </summary>
+        [JsonExtensionData]
+        public IDictionary<string, JsonElement> ExtensionData { get; }
+
+        /// <inheritdoc />
+        public virtual TVal? GetValue<TVal>(string key)
+        {
+            if (ExtensionData?.TryGetValue(key, out JsonElement value) == true)
+            {
+                return value.Deserialize<TVal>(JsonConfig.JSON_OPTIONS);
+            }
+
+            return default;
+        }
+
+        /// <summary>
+        /// Sets an additional value in the configuration. If the value is null it will be removed.
+        /// </summary>
+        /// <typeparam name="TVal">Type of the value to set.</typeparam>
+        /// <param name="key">The name of the value (usually snake cased). For example: fine_tuned_model.</param>
+        /// <param name="value">The value to set.</param>
+        public virtual void SetValue<TVal>(string key, TVal? value)
+        {
+            if (value == null)
+            {
+                if (ExtensionData != null)
+                {
+                    ExtensionData.Remove(key);
+                }
+            }
+            else
+            {
+                value = MaskData(key, value);
+                JsonElement json = JsonHelpers.SerializeToElement(value, JsonConfig.JSON_OPTIONS);
+                ExtensionData[key] = json;
+            }
+        }
+
+        private static TVal? MaskProperty<TVal>(TVal? value, [CallerMemberName] string? key = null)
+        {
+            string convertedKey = JsonConfig.JSON_OPTIONS.PropertyNamingPolicy?.ConvertName(key ??
string.Empty) ?? string.Empty; + return MaskData(convertedKey, value); + } + + private static TVal? MaskData(string key, TVal? value) + { + if (value == null) + { + return default; + } + else if (SANITIZERS.TryGetValue(key ?? string.Empty, out var sanitizer)) + { + return (TVal?)sanitizer(value); + } + + return value; + } + + private static Uri? MaskUriSubdomain(string? uri) + { + if (uri == null) + { + return null; + } + + string maskedUrl = HOST_SUBDOMAIN_MATCHER.Replace(uri.ToString(), MASK_STRING); + return new Uri(maskedUrl); + } + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Utils/Extensions.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Utils/Extensions.cs new file mode 100644 index 000000000..d25ef06b3 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Utils/Extensions.cs @@ -0,0 +1,68 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +using System; +using System.IO; +using System.Text; + +namespace Azure.AI.OpenAI.Tests.Utils; + +/// +/// Helper extension methods. +/// +public static class Extensions +{ + /// + /// Attempts to fill the buffer as much as possible from a stream. This will try to keep reading + /// until the buffer is filled, or the stream ends. + /// + /// The stream to read from. + /// The buffer to try to fill. + /// The number of bytes read. + public static int FillBuffer(this Stream stream, byte[] buffer) + { + if (stream == null) + throw new ArgumentNullException(nameof(stream)); + else if (buffer == null) + throw new ArgumentNullException(nameof(buffer)); + + int totalRead = 0; + while (totalRead < buffer.Length) + { + int read = stream.Read(buffer, totalRead, buffer.Length - totalRead); + if (read == 0) + { + return totalRead; + } + + totalRead += read; + } + + return totalRead; + } + + /// + /// Pads the current instance with the specified character on the left. 
+ /// + /// The string builder instance + /// The total width we want the string builder to be + /// The padding characters + /// The same builder for chaining, with any needed padding. + public static StringBuilder PadRight(this StringBuilder builder, int totalWidth, char paddingChar = ' ') + { + if (builder == null) + throw new ArgumentNullException(nameof(builder)); + else if (totalWidth < 0) + throw new ArgumentOutOfRangeException(nameof(totalWidth), "Total width must be greater than or equal to 0."); + else if (totalWidth == 0) + return builder; + + int padding = totalWidth - builder.Length; + if (padding > 0) + { + builder.Append(paddingChar, padding); + } + + return builder; + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Utils/JsonOptions.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Utils/JsonOptions.cs new file mode 100644 index 000000000..4489b0169 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Utils/JsonOptions.cs @@ -0,0 +1,168 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. 
+ +using System; +using System.Buffers; +using System.Globalization; +using System.Runtime.CompilerServices; +using System.Text.Json; + +#nullable enable + +namespace Azure.AI.OpenAI.Tests.Utils; + +/// +/// A helper class to make working with older versions of System.Text.Json simpler +/// +public static class JsonOptions +{ + // TODO FIXME once we update to newer versions of System.Text.JSon we should switch to using + // JsonNamingPolicy.SnakeCaseLower + public static JsonNamingPolicy SnakeCaseLower { get; } = + new SnakeCaseNamingPolicy(); + + public static JsonSerializerOptions OpenAIJsonOptions { get; } = new() + { + PropertyNameCaseInsensitive = true, + PropertyNamingPolicy = SnakeCaseLower, +#if NETFRAMEWORK + IgnoreNullValues = true, +#else + DefaultIgnoreCondition = System.Text.Json.Serialization.JsonIgnoreCondition.WhenWritingNull, +#endif + Converters = + { + new ModelReaderWriterConverter(), + new UnixDateTimeConverter() + } + }; + + public static JsonSerializerOptions AzureJsonOptions { get; } = new() + { + PropertyNameCaseInsensitive = true, + PropertyNamingPolicy = JsonNamingPolicy.CamelCase, +#if NETFRAMEWORK + IgnoreNullValues = true, +#else + DefaultIgnoreCondition = System.Text.Json.Serialization.JsonIgnoreCondition.WhenWritingNull, +#endif + }; + + // Ported over from the source code for newer versions of System.Text.Json + private class SnakeCaseNamingPolicy : JsonNamingPolicy + { + private enum SeparatorState + { + NotStarted, + UppercaseLetter, + LowercaseLetterOrDigit, + SpaceSeparator + } + + public override string ConvertName(string name) + { + if (string.IsNullOrEmpty(name)) + { + return string.Empty; + } + + return ConvertName('_', name.AsSpan()); + } + + internal static string ConvertName(char separator, ReadOnlySpan chars) + { + char[]? rentedBuffer = null; + + int num = (int)(1.2 * chars.Length); + Span output = num > 128 + ? (rentedBuffer = ArrayPool.Shared.Rent(num))! 
+ : stackalloc char[128]; + + SeparatorState separatorState = SeparatorState.NotStarted; + int charsWritten = 0; + + for (int i = 0; i < chars.Length; i++) + { + char c = chars[i]; + UnicodeCategory unicodeCategory = char.GetUnicodeCategory(c); + switch (unicodeCategory) + { + case UnicodeCategory.UppercaseLetter: + switch (separatorState) + { + case SeparatorState.LowercaseLetterOrDigit: + case SeparatorState.SpaceSeparator: + WriteChar(separator, ref output); + break; + case SeparatorState.UppercaseLetter: + if (i + 1 < chars.Length && char.IsLower(chars[i + 1])) + { + WriteChar(separator, ref output); + } + break; + } + + c = char.ToLowerInvariant(c); + WriteChar(c, ref output); + separatorState = SeparatorState.UppercaseLetter; + break; + + case UnicodeCategory.LowercaseLetter: + case UnicodeCategory.DecimalDigitNumber: + if (separatorState == SeparatorState.SpaceSeparator) + { + WriteChar(separator, ref output); + } + + WriteChar(c, ref output); + separatorState = SeparatorState.LowercaseLetterOrDigit; + break; + + case UnicodeCategory.SpaceSeparator: + if (separatorState != 0) + { + separatorState = SeparatorState.SpaceSeparator; + } + break; + + default: + WriteChar(c, ref output); + separatorState = SeparatorState.NotStarted; + break; + } + } + + string result = output.Slice(0, charsWritten).ToString(); + if (rentedBuffer != null) + { + output.Slice(0, charsWritten).Clear(); + ArrayPool.Shared.Return(rentedBuffer); + } + return result; + + void ExpandBuffer(ref Span destination) + { + int minimumLength = checked(destination.Length * 2); + char[] array = ArrayPool.Shared.Rent(minimumLength); + destination.CopyTo(array); + if (rentedBuffer != null) + { + destination.Slice(0, charsWritten).Clear(); + ArrayPool.Shared.Return(rentedBuffer); + } + rentedBuffer = array; + destination = rentedBuffer; + } + + [MethodImpl(MethodImplOptions.AggressiveInlining)] + void WriteChar(char value, ref Span destination) + { + if (charsWritten == destination.Length) + { + 
ExpandBuffer(ref destination); + } + destination[charsWritten++] = value; + } + } + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Utils/MockTokenCredential.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Utils/MockTokenCredential.cs new file mode 100644 index 000000000..8615c0f07 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Utils/MockTokenCredential.cs @@ -0,0 +1,33 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +using System; +using System.Threading; +using System.Threading.Tasks; +using Azure.Core; + +namespace OpenAI.TestFramework.Mocks; + +/// +/// A mock token credential to be used for testing. +/// +public class MockTokenCredential : TokenCredential +{ + /// + /// Event raised when a token is requested. + /// + public event EventHandler? TokenRequested; + + /// + public override AccessToken GetToken(TokenRequestContext requestContext, CancellationToken cancellationToken) + { + TokenRequested?.Invoke(this, requestContext); + return new AccessToken("TEST TOKEN " + string.Join(",", requestContext.Scopes), DateTimeOffset.MaxValue); + } + + /// + public override ValueTask GetTokenAsync(TokenRequestContext requestContext, CancellationToken cancellationToken) + { + return new(GetToken(requestContext, cancellationToken)); + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Utils/ModelReaderWriterConverter.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Utils/ModelReaderWriterConverter.cs new file mode 100644 index 000000000..255628633 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Utils/ModelReaderWriterConverter.cs @@ -0,0 +1,59 @@ +#nullable enable + +using System; +using System.ClientModel.Primitives; +using System.Linq; +using System.Reflection; +using System.Text.Json; +using System.Text.Json.Serialization; + +namespace Azure.AI.OpenAI.Tests.Utils +{ + /// + /// Adapter to allow mixing reflection based JSON serialization and 
deserialization with the ModelReaderWriter based ones + /// + public class ModelReaderWriterConverter : JsonConverterFactory + { + /// + public override bool CanConvert(Type typeToConvert) + { + bool implementsInterface = typeof(IJsonModel).IsAssignableFrom(typeToConvert); + bool hasParameterlessConstructor = typeToConvert.GetConstructors(BindingFlags.Instance | BindingFlags.NonPublic | BindingFlags.Public) + .Any(ci => ci.GetParameters()?.Count() == 0); + return implementsInterface && hasParameterlessConstructor; + } + + /// + public override JsonConverter CreateConverter(Type typeToConvert, JsonSerializerOptions options) + { + return (JsonConverter)Activator.CreateInstance(typeof(InnerModelReaderWriterConverter<>).MakeGenericType([typeToConvert]))!; + } + + private class InnerModelReaderWriterConverter : JsonConverter where T : IJsonModel + { + private IJsonModel _converter; + + /// + /// Creates a new instance + /// + /// The type does not have any paramterless constructor + public InnerModelReaderWriterConverter() + { + _converter = (IJsonModel)(Activator.CreateInstance(typeof(T), true) + ?? 
throw new ArgumentNullException()); + } + + /// + public override T Read(ref Utf8JsonReader reader, Type typeToConvert, JsonSerializerOptions options) + { + return _converter.Create(ref reader, ModelReaderWriterOptions.Json); + } + + /// + public override void Write(Utf8JsonWriter writer, T value, JsonSerializerOptions options) + { + _converter.Write(writer, ModelReaderWriterOptions.Json); + } + } + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Utils/NonPublic.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Utils/NonPublic.cs new file mode 100644 index 000000000..f890e4e75 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Utils/NonPublic.cs @@ -0,0 +1,123 @@ +#nullable enable + +using System; +using System.Reflection; + +namespace Azure.AI.OpenAI.Tests.Utils; + +/// +/// Helpers to make accessing the many internal or private members of the Azure test framework more streamlined +/// +public static class NonPublic +{ + /// + /// Creates an accessor for an internal, protected, or private property. + /// + /// The type of the class that defines this property. + /// The type of the property. + /// The name of the property. + /// The property accessor. + /// If a property with that name and type could not be found. + public static Accessor FromProperty(string propertyName) where TObj : class + { + PropertyInfo? prop = typeof(TObj).GetProperty( + propertyName, BindingFlags.NonPublic | BindingFlags.Public | BindingFlags.Instance); + + if (prop == null) + { + throw new ArgumentException($"'{propertyName}' property could not be found in '{typeof(TObj).FullName}'"); + } + else if (prop.PropertyType != typeof(TProp)) + { + throw new ArgumentException($"'{propertyName}' property is not of type '{typeof(TProp).FullName}'"); + } + + Func? getter = null; + Action? setter = null; + + MethodInfo? 
method = prop.GetGetMethod(true);
+    if (method != null)
+    {
+        getter = (Func<TObj?, TProp>)method.CreateDelegate(typeof(Func<TObj?, TProp>));
+    }
+
+    method = prop.GetSetMethod(true);
+    if (method != null)
+    {
+        setter = (Action<TObj?, TProp>)method.CreateDelegate(typeof(Action<TObj?, TProp>));
+    }
+
+    return new Accessor<TObj, TProp>(getter, setter);
+    }
+
+    /// <summary>
+    /// Creates an accessor for an internal, protected, or private field.
+    /// </summary>
+    /// <typeparam name="TObj">The type of the class that defines this field.</typeparam>
+    /// <typeparam name="TField">The type of the field.</typeparam>
+    /// <param name="fieldName">The name of the field.</param>
+    /// <returns>The field accessor.</returns>
+    /// <exception cref="ArgumentException">If a field with that name and type could not be found.</exception>
+    public static Accessor<TObj, TField> FromField<TObj, TField>(string fieldName) where TObj : class
+    {
+        FieldInfo? field = typeof(TObj).GetField(
+            fieldName, BindingFlags.NonPublic | BindingFlags.Public | BindingFlags.Instance);
+
+        if (field == null)
+        {
+            throw new ArgumentException($"'{fieldName}' field could not be found in '{typeof(TObj).FullName}'");
+        }
+        else if (field.FieldType != typeof(TField))
+        {
+            throw new ArgumentException($"'{fieldName}' field is not of type '{typeof(TField).FullName}'");
+        }
+
+        Func<TObj?, TField> getter = (instance) => (TField)field.GetValue(instance)!;
+        Action<TObj?, TField>? setter = (instance, val) => field.SetValue(instance, val);
+
+        return new Accessor<TObj, TField>(getter, setter);
+    }
+
+    /// <summary>
+    /// The accessor struct that makes accessing internal, protected, or private properties/fields easier.
+    /// </summary>
+    /// <typeparam name="TObj">The type of the class that defines this property/field.</typeparam>
+    /// <typeparam name="TValue">The type of the property/field.</typeparam>
+    public readonly struct Accessor<TObj, TValue> where TObj : class
+    {
+        private readonly Func<TObj?, TValue> _getter;
+        private readonly Action<TObj?, TValue> _setter;
+
+        public Accessor(Func<TObj?, TValue>? getter, Action<TObj?, TValue>? setter)
+        {
+            HasGet = getter != null;
+            _getter = getter ?? (_ => throw new InvalidOperationException("Get is not supported"));
+            HasSet = setter != null;
+            _setter = setter ?? ((_, __) => throw new InvalidOperationException("Set is not supported"));
+        }
+
+        /// True if we can read the value of the property/field.
+ /// + public bool HasGet { get; } + + /// + /// True if we can set the value of the property/field. + /// + public bool HasSet { get; } + + /// + /// Gets the value of the property/field. + /// + /// The instance to get the value from. Can be null for static properties/fields. + /// The value of the property/field. + public TValue Get(TObj? instance) => _getter(instance); + + /// + /// Sets the value of the property/field. + /// + /// The instance to set the value on. Can be null for static properties/fields. + /// The value to set. + public void Set(TObj? instance, TValue value) => _setter(instance, value); + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Utils/RunOnScopeExit.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Utils/RunOnScopeExit.cs new file mode 100644 index 000000000..5ec65283e --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Utils/RunOnScopeExit.cs @@ -0,0 +1,25 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +#nullable enable + +using System; +using System.Threading.Tasks; + +namespace Azure.AI.OpenAI.Tests.Utils +{ + public class RunOnScopeExit : IAsyncDisposable + { + private Func _asyncFunc; + + public RunOnScopeExit(Func asyncFunc) + { + _asyncFunc = asyncFunc ?? throw new ArgumentNullException(nameof(asyncFunc)); + } + + public async ValueTask DisposeAsync() + { + await _asyncFunc().ConfigureAwait(false); + } + } +} diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Utils/TestConfig.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Utils/TestConfig.cs new file mode 100644 index 000000000..5b999e118 --- /dev/null +++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Utils/TestConfig.cs @@ -0,0 +1,204 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. 
+ +using System; +using System.Collections.Generic; +using System.IO; +using System.Linq; +using System.Reflection; +using System.Text; +using System.Text.Json; +using Azure.AI.OpenAI.Tests.Utils.Config; +using OpenAI.TestFramework; +using OpenAI.TestFramework.Utils; + +namespace Azure.AI.OpenAI.Tests; + +internal class TestConfig +{ + private const string AZURE_OPENAI_ENV_KEY_PREFIX = "AZURE_OPENAI"; + + private readonly bool _isPlayback; + private readonly IReadOnlyDictionary _jsonConfig; + private SortedDictionary _recordedConfig; + + public virtual string AssetsSubFolder => "Assets"; + public virtual string AssetsJson => "test_config.json"; + public virtual string PlaybackAssetsJson => $"playback_{AssetsJson}"; + + public TestConfig(RecordedTestMode? mode) + { + _isPlayback = mode == RecordedTestMode.Playback; + _recordedConfig = new(new DefaultFirstStringComparer()); + + // Load the previous playback configuration and use that to initialize the recorded config + string playbackConfigJson = Path.Combine(AssetsSubFolder, PlaybackAssetsJson); + var playbackConfig = ReadJsonConfig(playbackConfigJson); + if (playbackConfig != null) + { + foreach (var kvp in playbackConfig) + { + _recordedConfig.Add(kvp.Key, new SanitizedJsonConfig(kvp.Value)); + } + } + + // When in playback mode, we always use the playback configuration. This ensures that we run in the same way in CI/CD + // as we do locally. + if (_isPlayback) + { + _jsonConfig = playbackConfig + ?? 
throw new InvalidOperationException($"The playback config file was not found: {playbackConfigJson}");
+        }
+        else
+        {
+            _jsonConfig = new[]
+                {
+                    AssetsJson,
+                    Path.Combine(AssetsSubFolder, AssetsJson),
+                    Path.Combine(Environment.GetFolderPath(Environment.SpecialFolder.UserProfile), ".azure", AssetsSubFolder, AssetsJson),
+                    Path.Combine(Environment.GetFolderPath(Environment.SpecialFolder.ApplicationData), ".azure", AssetsSubFolder, AssetsJson),
+                }
+                .Select(f => ReadJsonConfig(f))
+                .FirstOrDefault(c => c != null)
+                ?? new Dictionary();
+        }
+    }
+
+    public virtual IConfiguration? GetConfig<TClient>()
+        => GetConfig(ToKey<TClient>());
+
+    public virtual IConfiguration? GetConfig(string name)
+    {
+        // In order to populate each property of the Config object, the search order is as follows:
+        // 1. Getting the specific config for the name in the JSON config file
+        // 2. Getting the value from the default config
+        // 3. (Not in playback) Getting the value from the AZURE_OPENAI__ environment variable
+        // 4. (Not in playback) Getting the value from the AZURE_OPENAI_ environment variable
+        // It will fall through each one if the value is null
+
+        return new FlattenedConfig(
+            [
+                new NamedConfig(_jsonConfig.GetValueOrDefault(name), name),
+                new NamedConfig(_jsonConfig.GetValueOrDefault(JsonConfig.DEFAULT_CONFIG_NAME), null),
+                _isPlayback ? null : new EnvironmentValuesConfig(AZURE_OPENAI_ENV_KEY_PREFIX, name),
+                _isPlayback ? null : new EnvironmentValuesConfig(AZURE_OPENAI_ENV_KEY_PREFIX)
+            ], _recordedConfig);
+    }
+
+    public virtual void SavePlaybackConfig()
+    {
+        try
+        {
+            string?
sourceDirectoryPath = typeof(TestConfig).Assembly
+                .GetCustomAttributes<AssemblyMetadataAttribute>()
+                .FirstOrDefault(attrib => attrib.Key == "TestProjectSourceBasePath")
+                ?.Value;
+
+            if (sourceDirectoryPath != null)
+            {
+                string playbackConfigJson = Path.Combine(sourceDirectoryPath, AssetsSubFolder, PlaybackAssetsJson);
+
+                string oldJson = string.Empty;
+                if (File.Exists(playbackConfigJson))
+                {
+                    oldJson = File.ReadAllText(playbackConfigJson);
+                }
+
+                string newJson = JsonSerializer.Serialize(_recordedConfig, JsonConfig.JSON_OPTIONS);
+
+                // Visual Studio's hot reload feature can get upset if you are debugging the code and the playback config
+                // file changes, so we only save it if it is different
+                if (oldJson != newJson)
+                {
+                    File.WriteAllText(playbackConfigJson, newJson, Encoding.UTF8);
+                }
+            }
+        }
+        catch (Exception ex)
+        {
+            Console.Error.WriteLine("Failed to save the playback configuration file. Details: " + ex);
+        }
+    }
+
+    protected static string ToKey<TClient>()
+    {
+        string fullName = typeof(TClient).Name;
+        int stopAt = fullName.LastIndexOf("Client");
+        stopAt = stopAt == -1 ? fullName.Length : stopAt;
+
+        StringBuilder builder = new(fullName.Length);
+        bool prevWasUpper = true;
+
+        for (int i = 0; i < stopAt; i++)
+        {
+            char c = fullName[i];
+            if (char.IsUpper(c))
+            {
+                if (prevWasUpper)
+                {
+                    builder.Append(char.ToLowerInvariant(c));
+                }
+                else
+                {
+                    builder.Append('_');
+                    builder.Append(char.ToLowerInvariant(c));
+                }
+
+                prevWasUpper = true;
+            }
+            else
+            {
+                builder.Append(c);
+                prevWasUpper = false;
+            }
+        }
+
+        return builder.ToString();
+    }
+
+    protected static IReadOnlyDictionary? ReadJsonConfig(string fullPath)
+    {
+        try
+        {
+            if (File.Exists(fullPath))
+            {
+                string json = File.ReadAllText(fullPath);
+                return JsonSerializer.Deserialize>(json, JsonConfig.JSON_OPTIONS);
+            }
+        }
+        catch (Exception)
+        {
+        }
+
+        return null;
+    }
+
+    private class DefaultFirstStringComparer : IComparer<string>
+    {
+        public int Compare(string? x, string?
y)
+        {
+            if (ReferenceEquals(x, y))
+            {
+                return 0;
+            }
+            else if (x == null)
+            {
+                return -1;
+            }
+            else if (y == null)
+            {
+                return 1;
+            }
+            else if (x == JsonConfig.DEFAULT_CONFIG_NAME && y != JsonConfig.DEFAULT_CONFIG_NAME)
+            {
+                return -1;
+            }
+            else if (x != JsonConfig.DEFAULT_CONFIG_NAME && y == JsonConfig.DEFAULT_CONFIG_NAME)
+            {
+                return 1;
+            }
+
+            return string.Compare(x, y, StringComparison.Ordinal);
+        }
+    }
+}
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Utils/UnixDateTimeConverter.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Utils/UnixDateTimeConverter.cs
new file mode 100644
index 000000000..a4c9b1856
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/Utils/UnixDateTimeConverter.cs
@@ -0,0 +1,109 @@
+#nullable enable
+
+using System;
+using System.Text.Json;
+using System.Text.Json.Serialization;
+
+namespace Azure.AI.OpenAI.Tests.Utils
+{
+    public class UnixDateTimeConverter : JsonConverterFactory
+    {
+        private static Lazy<DateTimeOffsetConverter> _dateTimeOffset = new(() => new DateTimeOffsetConverter(), false);
+        private static Lazy<NullableDateTimeOffsetConverter> _nullableDateTimeOffset = new(() => new NullableDateTimeOffsetConverter(), false);
+        private static Lazy<DateTimeConverter> _dateTime = new(() => new DateTimeConverter(), false);
+        private static Lazy<NullableDateTimeConverter> _nullableDateTime = new(() => new NullableDateTimeConverter(), false);
+
+        public override bool CanConvert(Type typeToConvert)
+            => typeToConvert == typeof(DateTime)
+            || typeToConvert == typeof(DateTime?)
+            || typeToConvert == typeof(DateTimeOffset)
+            || typeToConvert == typeof(DateTimeOffset?);
+
+        public override JsonConverter CreateConverter(Type typeToConvert, JsonSerializerOptions options)
+        {
+            switch (typeToConvert)
+            {
+                case Type t when t == typeof(DateTime):
+                    return _dateTime.Value;
+                case Type t when t == typeof(DateTime?):
+                    return _nullableDateTime.Value;
+                case Type t when t == typeof(DateTimeOffset):
+                    return _dateTimeOffset.Value;
+                case Type t when t == typeof(DateTimeOffset?):
+                    return _nullableDateTimeOffset.Value;
+                default:
+                    throw new NotSupportedException();
+            }
+        }
+
+        private static DateTimeOffset? Read(ref Utf8JsonReader reader)
+        {
+            if (reader.TokenType == JsonTokenType.Null)
+            {
+                return default;
+            }
+            else if (reader.TokenType == JsonTokenType.Number)
+            {
+                long unixTimeInSeconds = reader.GetInt64();
+                return DateTimeOffset.FromUnixTimeSeconds(unixTimeInSeconds).ToLocalTime();
+            }
+            else if (reader.TokenType == JsonTokenType.String
+                && long.TryParse(reader.GetString(), out long unixTime))
+            {
+                return DateTimeOffset.FromUnixTimeSeconds(unixTime).ToLocalTime();
+            }
+            else
+            {
+                throw new JsonException("Expected a Unix timestamp as a number or numeric string but got " + reader.TokenType);
+            }
+        }
+
+        private static void Write(Utf8JsonWriter writer, DateTimeOffset? value)
+        {
+            if (value == null)
+            {
+                writer.WriteNullValue();
+            }
+            else
+            {
+                writer.WriteNumberValue(value.Value.ToUnixTimeSeconds());
+            }
+        }
+
+        private class DateTimeOffsetConverter : JsonConverter<DateTimeOffset>
+        {
+            public override DateTimeOffset Read(ref Utf8JsonReader reader, Type typeToConvert, JsonSerializerOptions options)
+                => UnixDateTimeConverter.Read(ref reader) ?? default;
+
+            public override void Write(Utf8JsonWriter writer, DateTimeOffset value, JsonSerializerOptions options)
+                => UnixDateTimeConverter.Write(writer, value);
+        }
+
+        private class NullableDateTimeOffsetConverter : JsonConverter<DateTimeOffset?>
+        {
+            public override DateTimeOffset?
Read(ref Utf8JsonReader reader, Type typeToConvert, JsonSerializerOptions options)
+                => UnixDateTimeConverter.Read(ref reader);
+
+            public override void Write(Utf8JsonWriter writer, DateTimeOffset? value, JsonSerializerOptions options)
+                => UnixDateTimeConverter.Write(writer, value);
+        }
+
+        private class DateTimeConverter : JsonConverter<DateTime>
+        {
+            public override DateTime Read(ref Utf8JsonReader reader, Type typeToConvert, JsonSerializerOptions options)
+                => UnixDateTimeConverter.Read(ref reader)?.LocalDateTime ?? default;
+
+            public override void Write(Utf8JsonWriter writer, DateTime value, JsonSerializerOptions options)
+                => UnixDateTimeConverter.Write(writer, value);
+        }
+
+        private class NullableDateTimeConverter : JsonConverter<DateTime?>
+        {
+            public override DateTime? Read(ref Utf8JsonReader reader, Type typeToConvert, JsonSerializerOptions options)
+                => UnixDateTimeConverter.Read(ref reader)?.LocalDateTime ?? default;
+
+            public override void Write(Utf8JsonWriter writer, DateTime? value, JsonSerializerOptions options)
+                => UnixDateTimeConverter.Write(writer, value);
+        }
+    }
+}
diff --git a/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/VectorStoreTests.cs b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/VectorStoreTests.cs
new file mode 100644
index 000000000..cb6dc8e17
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/Azure.AI.OpenAI/tests/VectorStoreTests.cs
@@ -0,0 +1,242 @@
+// Copyright (c) Microsoft Corporation. All rights reserved.
+// Licensed under the MIT License.
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.Collections.Generic;
+using System.Linq;
+using System.Threading;
+using System.Threading.Tasks;
+using Azure.AI.OpenAI.Tests.Utils.Config;
+using NUnit.Framework;
+using OpenAI;
+using OpenAI.Files;
+using OpenAI.TestFramework;
+using OpenAI.VectorStores;
+
+namespace Azure.AI.OpenAI.Tests;
+
+public class VectorStoreTests : AoaiTestBase<VectorStoreClient>
+{
+    public VectorStoreTests(bool isAsync) : base(isAsync)
+    { }
+
+    [Test]
+    [Category("Smoke")]
+    public void CanCreateClient()
+    {
+        VectorStoreClient client = GetTestClient();
+        Assert.That(client, Is.Not.Null);
+    }
+
+    [RecordedTest]
+    public async Task CanCreateGetAndDeleteVectorStores()
+    {
+        VectorStoreClient client = GetTestClient();
+
+        VectorStore vectorStore = await client.CreateVectorStoreAsync();
+        Validate(vectorStore);
+        bool deleted = await client.DeleteVectorStoreAsync(vectorStore);
+        Assert.That(deleted, Is.True);
+
+        IReadOnlyList<OpenAIFileInfo> testFiles = await GetNewTestFilesAsync(client.GetConfigOrThrow(), 5);
+
+        vectorStore = await client.CreateVectorStoreAsync(new VectorStoreCreationOptions()
+        {
+            FileIds = { testFiles[0].Id },
+            Name = "test vector store",
+            ExpirationPolicy = new VectorStoreExpirationPolicy()
+            {
+                Anchor = VectorStoreExpirationAnchor.LastActiveAt,
+                Days = 3,
+            },
+            Metadata =
+            {
+                ["test-key"] = "test-value",
+            },
+        });
+        Validate(vectorStore);
+        Assert.Multiple(() =>
+        {
+            Assert.That(vectorStore.Name, Is.EqualTo("test vector store"));
+            Assert.That(vectorStore.ExpirationPolicy?.Anchor, Is.EqualTo(VectorStoreExpirationAnchor.LastActiveAt));
+            Assert.That(vectorStore.ExpirationPolicy?.Days, Is.EqualTo(3));
+            Assert.That(vectorStore.FileCounts.Total, Is.EqualTo(1));
+            Assert.That(vectorStore.CreatedAt, Is.GreaterThan(s_2024));
+            Assert.That(vectorStore.ExpiresAt, Is.GreaterThan(s_2024));
+            Assert.That(vectorStore.Status, Is.EqualTo(VectorStoreStatus.InProgress));
+
Assert.That(vectorStore.Metadata?.TryGetValue("test-key", out string metadataValue) == true && metadataValue == "test-value");
+        });
+        vectorStore = await client.GetVectorStoreAsync(vectorStore);
+        Assert.Multiple(() =>
+        {
+            Assert.That(vectorStore.Name, Is.EqualTo("test vector store"));
+            Assert.That(vectorStore.ExpirationPolicy?.Anchor, Is.EqualTo(VectorStoreExpirationAnchor.LastActiveAt));
+            Assert.That(vectorStore.ExpirationPolicy?.Days, Is.EqualTo(3));
+            Assert.That(vectorStore.FileCounts.Total, Is.EqualTo(1));
+            Assert.That(vectorStore.CreatedAt, Is.GreaterThan(s_2024));
+            Assert.That(vectorStore.ExpiresAt, Is.GreaterThan(s_2024));
+            Assert.That(vectorStore.Metadata?.TryGetValue("test-key", out string metadataValue) == true && metadataValue == "test-value");
+        });
+
+        deleted = await client.DeleteVectorStoreAsync(vectorStore.Id);
+        Assert.That(deleted, Is.True);
+
+        vectorStore = await client.CreateVectorStoreAsync(new VectorStoreCreationOptions()
+        {
+            FileIds = testFiles.Select(file => file.Id).ToList()
+        });
+        Validate(vectorStore);
+        Assert.Multiple(() =>
+        {
+            Assert.That(vectorStore.Name, Is.Null.Or.Empty);
+            Assert.That(vectorStore.FileCounts.Total, Is.EqualTo(5));
+        });
+    }
+
+    [RecordedTest]
+    public async Task CanEnumerateVectorStores()
+    {
+        VectorStoreClient client = GetTestClient();
+        for (int i = 0; i < 10; i++)
+        {
+            VectorStore vectorStore = await client.CreateVectorStoreAsync(new VectorStoreCreationOptions()
+            {
+                Name = $"Test Vector Store {i}",
+            });
+            Validate(vectorStore);
+            Assert.That(vectorStore.Name, Is.EqualTo($"Test Vector Store {i}"));
+        }
+
+        AsyncPageCollection<VectorStore> response = client.GetVectorStoresAsync(new VectorStoreCollectionOptions() { Order = ListOrder.NewestFirst });
+        Assert.That(response, Is.Not.Null);
+
+        int lastIdSeen = int.MaxValue;
+        int count = 0;
+        await foreach (VectorStore vectorStore in response.GetAllValuesAsync())
+        {
+            Assert.That(vectorStore.Id, Is.Not.Null);
+            if (vectorStore.Name?.StartsWith("Test Vector 
Store ") == true)
+            {
+                string idString = vectorStore.Name.Substring("Test Vector Store ".Length);
+
+                Assert.That(int.TryParse(idString, out int seenId), Is.True);
+                Assert.That(seenId, Is.LessThan(lastIdSeen));
+                lastIdSeen = seenId;
+            }
+            if (lastIdSeen == 0 || ++count >= 100)
+            {
+                break;
+            }
+        }
+
+        Assert.That(lastIdSeen, Is.EqualTo(0));
+    }
+
+    [RecordedTest]
+    public async Task CanAssociateFiles()
+    {
+        VectorStoreClient client = GetTestClient();
+        VectorStore vectorStore = await client.CreateVectorStoreAsync();
+        Validate(vectorStore);
+
+        IReadOnlyList<OpenAIFileInfo> files = await GetNewTestFilesAsync(client.GetConfigOrThrow(), 3);
+
+        foreach (OpenAIFileInfo file in files)
+        {
+            VectorStoreFileAssociation association = await client.AddFileToVectorStoreAsync(vectorStore, file);
+            Validate(association);
+            Assert.Multiple(() =>
+            {
+                Assert.That(association.FileId, Is.EqualTo(file.Id));
+                Assert.That(association.VectorStoreId, Is.EqualTo(vectorStore.Id));
+                Assert.That(association.LastError, Is.Null);
+                Assert.That(association.CreatedAt, Is.GreaterThan(s_2024));
+                Assert.That(association.Status, Is.AnyOf(VectorStoreFileAssociationStatus.InProgress, VectorStoreFileAssociationStatus.Completed));
+            });
+        }
+
+        bool removed = await client.RemoveFileFromStoreAsync(vectorStore, files[0]);
+        Assert.True(removed);
+
+        // Note: removals aren't immediately reflected when requesting the list
+        await Task.Delay(1000);
+
+        int count = 0;
+        AsyncPageCollection<VectorStoreFileAssociation> response = client.GetFileAssociationsAsync(vectorStore);
+        await foreach (VectorStoreFileAssociation association in response.GetAllValuesAsync())
+        {
+            count++;
+            Assert.That(association.FileId, Is.Not.EqualTo(files[0].Id));
+            Assert.That(association.VectorStoreId, Is.EqualTo(vectorStore.Id));
+        }
+
+        Assert.That(count, Is.EqualTo(2));
+    }
+
+    [RecordedTest]
+    public async Task CanUseBatchIngestion()
+    {
+        VectorStoreClient client = GetTestClient();
+        VectorStore vectorStore = await client.CreateVectorStoreAsync();
+
Validate(vectorStore);
+
+        IReadOnlyList<OpenAIFileInfo> testFiles = await GetNewTestFilesAsync(client.GetConfigOrThrow(), 3);
+
+        VectorStoreBatchFileJob batchJob = await client.CreateBatchFileJobAsync(vectorStore, testFiles);
+        Assert.Multiple(() =>
+        {
+            Assert.That(batchJob.BatchId, Is.Not.Null);
+            Assert.That(batchJob.VectorStoreId, Is.EqualTo(vectorStore.Id));
+            Assert.That(batchJob.Status, Is.EqualTo(VectorStoreBatchFileJobStatus.InProgress));
+        });
+
+        batchJob = await WaitUntilReturnLast(
+            batchJob,
+            () => client.GetBatchFileJobAsync(batchJob),
+            b => b.Status != VectorStoreBatchFileJobStatus.InProgress);
+        Assert.That(batchJob.Status, Is.EqualTo(VectorStoreBatchFileJobStatus.Completed));
+
+        AsyncPageCollection<VectorStoreFileAssociation> response = client.GetFileAssociationsAsync(batchJob);
+        await foreach (VectorStoreFileAssociation association in response.GetAllValuesAsync())
+        {
+            Assert.Multiple(() =>
+            {
+                Assert.That(association.FileId, Is.Not.Null);
+                Assert.That(association.VectorStoreId, Is.EqualTo(vectorStore.Id));
+                Assert.That(association.Status, Is.EqualTo(VectorStoreFileAssociationStatus.Completed));
+                // Assert.That(association.Size, Is.GreaterThan(0));
+                Assert.That(association.CreatedAt, Is.GreaterThan(s_2024));
+                Assert.That(association.LastError, Is.Null);
+            });
+        }
+    }
+
+    private async Task<IReadOnlyList<OpenAIFileInfo>> GetNewTestFilesAsync(IConfiguration config, int count)
+    {
+        AzureOpenAIClient azureClient = GetTestTopLevelClient(config, new()
+        {
+            ShouldOutputRequests = false,
+            ShouldOutputResponses = false,
+        });
+        FileClient client = GetTestClient(azureClient, config);
+
+        List<OpenAIFileInfo> files = [];
+        for (int i = 0; i < count; i++)
+        {
+            OpenAIFileInfo file = await client.UploadFileAsync(
+                BinaryData.FromString("This is a test file").ToStream(),
+                $"test_file_{i.ToString().PadLeft(3, '0')}.txt",
+                FileUploadPurpose.Assistants)
+                .ConfigureAwait(false);
+            Validate(file);
+            files.Add(file);
+        }
+
+        return files;
+    }
+
+    private static readonly DateTimeOffset s_2024 = new(2024, 1, 1, 0, 0, 0, 
TimeSpan.Zero); +} diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/Directory.Build.props b/.dotnet.azure/sdk/openai/tools/TestFramework/Directory.Build.props new file mode 100644 index 000000000..f85173f26 --- /dev/null +++ b/.dotnet.azure/sdk/openai/tools/TestFramework/Directory.Build.props @@ -0,0 +1,18 @@ + + + + false + true + false + false + false + false + true + + + + + diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/TestFramework.sln b/.dotnet.azure/sdk/openai/tools/TestFramework/TestFramework.sln new file mode 100644 index 000000000..a88dc3caf --- /dev/null +++ b/.dotnet.azure/sdk/openai/tools/TestFramework/TestFramework.sln @@ -0,0 +1,31 @@ + +Microsoft Visual Studio Solution File, Format Version 12.00 +# Visual Studio Version 17 +VisualStudioVersion = 17.10.35013.160 +MinimumVisualStudioVersion = 10.0.40219.1 +Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "OpenAI.TestFramework.Tests", "tests\OpenAI.TestFramework.Tests.csproj", "{61E849EB-F8BC-47C7-B730-874DD678BEA7}" +EndProject +Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "OpenAI.TestFramework", "src\OpenAI.TestFramework.csproj", "{BE2FF759-255B-44A8-BAE7-73E287AEEB97}" +EndProject +Global + GlobalSection(SolutionConfigurationPlatforms) = preSolution + Debug|Any CPU = Debug|Any CPU + Release|Any CPU = Release|Any CPU + EndGlobalSection + GlobalSection(ProjectConfigurationPlatforms) = postSolution + {61E849EB-F8BC-47C7-B730-874DD678BEA7}.Debug|Any CPU.ActiveCfg = Debug|Any CPU + {61E849EB-F8BC-47C7-B730-874DD678BEA7}.Debug|Any CPU.Build.0 = Debug|Any CPU + {61E849EB-F8BC-47C7-B730-874DD678BEA7}.Release|Any CPU.ActiveCfg = Release|Any CPU + {61E849EB-F8BC-47C7-B730-874DD678BEA7}.Release|Any CPU.Build.0 = Release|Any CPU + {BE2FF759-255B-44A8-BAE7-73E287AEEB97}.Debug|Any CPU.ActiveCfg = Debug|Any CPU + {BE2FF759-255B-44A8-BAE7-73E287AEEB97}.Debug|Any CPU.Build.0 = Debug|Any CPU + {BE2FF759-255B-44A8-BAE7-73E287AEEB97}.Release|Any CPU.ActiveCfg = Release|Any CPU + 
{BE2FF759-255B-44A8-BAE7-73E287AEEB97}.Release|Any CPU.Build.0 = Release|Any CPU
+	EndGlobalSection
+	GlobalSection(SolutionProperties) = preSolution
+		HideSolutionNode = FALSE
+	EndGlobalSection
+	GlobalSection(ExtensibilityGlobals) = postSolution
+		SolutionGuid = {F145C399-D9D8-45F9-87DC-4BFFF983FA91}
+	EndGlobalSection
+EndGlobal
diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/assets.json b/.dotnet.azure/sdk/openai/tools/TestFramework/assets.json
new file mode 100644
index 000000000..d33e24017
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/tools/TestFramework/assets.json
@@ -0,0 +1,6 @@
+{
+  "AssetsRepo": "Azure/azure-sdk-assets",
+  "AssetsRepoPrefixPath": "net",
+  "TagPrefix": "net/openai/OpenAI.TestFramework",
+  "Tag": "net/openai/OpenAI.TestFramework_f41330e3ac"
+}
diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/Adapters/SyncToAsyncCollectionResult.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Adapters/SyncToAsyncCollectionResult.cs
new file mode 100644
index 000000000..64096eb4a
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Adapters/SyncToAsyncCollectionResult.cs
@@ -0,0 +1,83 @@
+// Copyright (c) Microsoft Corporation. All rights reserved.
+// Licensed under the MIT License.
+
+using System.ClientModel;
+using System.Runtime.CompilerServices;
+using System.Runtime.ExceptionServices;
+
+namespace OpenAI.TestFramework.Adapters;
+
+/// <summary>
+/// An adapter to make a <see cref="CollectionResult{T}"/> look and work like an <see cref="AsyncCollectionResult{T}"/>. This
+/// simplifies writing test cases.
+/// </summary>
+/// <typeparam name="T">The type of the items the enumerator returns.</typeparam>
+public class SyncToAsyncCollectionResult<T> : AsyncCollectionResult<T>
+{
+    private bool _responseSet;
+    private CollectionResult<T>? _syncCollection;
+    private Exception? _ex;
+
+    /// <summary>
+    /// Creates a new instance.
+    /// </summary>
+    /// <param name="syncCollection">The synchronous collection to wrap.</param>
+    /// <exception cref="ArgumentNullException">If the collection was null.</exception>
+    public SyncToAsyncCollectionResult(CollectionResult<T> syncCollection)
+    {
+        _syncCollection = syncCollection ??
throw new ArgumentNullException(nameof(syncCollection));
+        TrySetRawResponse();
+    }
+
+    /// <summary>
+    /// Creates a new instance.
+    /// </summary>
+    /// <param name="ex">The exception to throw.</param>
+    /// <exception cref="ArgumentNullException">If the exception was null.</exception>
+    public SyncToAsyncCollectionResult(Exception ex)
+    {
+        _ex = ex ?? throw new ArgumentNullException(nameof(ex));
+        _syncCollection = null;
+    }
+
+    /// <inheritdoc/>
+    public override IAsyncEnumerator<T> GetAsyncEnumerator(CancellationToken cancellationToken = default)
+    {
+        return InnerEnumerable(cancellationToken).GetAsyncEnumerator();
+    }
+
+    private async IAsyncEnumerable<T> InnerEnumerable([EnumeratorCancellation] CancellationToken cancellationToken = default)
+    {
+        if (_ex != null)
+        {
+            ExceptionDispatchInfo.Capture(_ex).Throw();
+        }
+
+        var asyncWrapper = new SyncToAsyncEnumerator<T>(_syncCollection?.GetEnumerator()!, cancellationToken);
+        while (await asyncWrapper.MoveNextAsync().ConfigureAwait(false))
+        {
+            TrySetRawResponse();
+            yield return asyncWrapper.Current;
+        }
+    }
+
+    private void TrySetRawResponse()
+    {
+        if (_responseSet)
+        {
+            return;
+        }
+
+        // Client result doesn't provide virtual methods so we have to manually set it ourselves here
+        try
+        {
+            var raw = _syncCollection?.GetRawResponse();
+            if (raw != null)
+            {
+                SetRawResponse(raw);
+                _responseSet = true;
+            }
+        }
+        catch (Exception) { /* don't care */ }
+    }
+}
diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/Adapters/SyncToAsyncEnumerable.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Adapters/SyncToAsyncEnumerable.cs
new file mode 100644
index 000000000..c71c1a0e1
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Adapters/SyncToAsyncEnumerable.cs
@@ -0,0 +1,46 @@
+// Copyright (c) Microsoft Corporation. All rights reserved.
+// Licensed under the MIT License.
+
+namespace OpenAI.TestFramework.Adapters;
+
+/// <summary>
+/// Wraps an <see cref="IEnumerable{T}"/> as an <see cref="IAsyncEnumerable{T}"/>.
+/// </summary>
+/// <typeparam name="T">The type of items being enumerated.</typeparam>
+public class SyncToAsyncEnumerable<T> : IAsyncEnumerable<T>
+{
+    private IEnumerable<T> _enumerable;
+    private Exception?
_ex;
+
+    /// <summary>
+    /// Creates a new instance.
+    /// </summary>
+    /// <param name="enumerable">The synchronous enumerable to wrap.</param>
+    public SyncToAsyncEnumerable(IEnumerable<T> enumerable)
+    {
+        _enumerable = enumerable;
+    }
+
+    /// <summary>
+    /// Creates a new instance.
+    /// </summary>
+    /// <param name="ex">The exception to throw.</param>
+    public SyncToAsyncEnumerable(Exception ex)
+    {
+        _ex = ex;
+        _enumerable = Array.Empty<T>();
+    }
+
+    /// <inheritdoc/>
+    public IAsyncEnumerator<T> GetAsyncEnumerator(CancellationToken cancellationToken = default)
+    {
+        if (_ex != null)
+        {
+            return new SyncToAsyncEnumerator<T>(_ex);
+        }
+        else
+        {
+            return new SyncToAsyncEnumerator<T>(_enumerable.GetEnumerator(), cancellationToken);
+        }
+    }
+}
diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/Adapters/SyncToAsyncEnumerator.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Adapters/SyncToAsyncEnumerator.cs
new file mode 100644
index 000000000..fa0ce81b0
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Adapters/SyncToAsyncEnumerator.cs
@@ -0,0 +1,64 @@
+// Copyright (c) Microsoft Corporation. All rights reserved.
+// Licensed under the MIT License.
+
+using System.Runtime.ExceptionServices;
+
+namespace OpenAI.TestFramework.Adapters;
+
+/// <summary>
+/// Wraps an <see cref="IEnumerator{T}"/> as an <see cref="IAsyncEnumerator{T}"/>.
+/// </summary>
+/// <typeparam name="T">The type of items being enumerated.</typeparam>
+public class SyncToAsyncEnumerator<T> : IAsyncEnumerator<T>
+{
+    private IEnumerator<T> _sync;
+    private CancellationToken _token;
+    private Exception? _ex;
+
+    /// <summary>
+    /// Creates a new instance.
+    /// </summary>
+    /// <param name="sync">The synchronous enumerator to wrap.</param>
+    /// <param name="token">(Optional) The cancellation token to use.</param>
+    /// <exception cref="ArgumentNullException">If the enumerator was null.</exception>
+    public SyncToAsyncEnumerator(IEnumerator<T> sync, CancellationToken token = default)
+    {
+        _sync = sync ?? throw new ArgumentNullException(nameof(sync));
+        _token = token;
+    }
+
+    /// <summary>
+    /// Creates a new instance.
+    /// </summary>
+    /// <param name="ex">The exception to throw.</param>
+    /// <exception cref="ArgumentNullException">If the exception was null.</exception>
+    public SyncToAsyncEnumerator(Exception ex)
+    {
+        _sync = Enumerable.Empty<T>().GetEnumerator();
+        _token = default;
+        _ex = ex ??
throw new ArgumentNullException(nameof(ex));
+    }
+
+    /// <inheritdoc/>
+    public T Current => _sync.Current;
+
+    /// <inheritdoc/>
+    public ValueTask DisposeAsync()
+    {
+        _sync.Dispose();
+        return default;
+    }
+
+    /// <inheritdoc/>
+    public ValueTask<bool> MoveNextAsync()
+    {
+        if (_ex != null)
+        {
+            ExceptionDispatchInfo.Capture(_ex).Throw();
+        }
+
+        _token.ThrowIfCancellationRequested();
+        bool ret = _sync.MoveNext();
+        return new ValueTask<bool>(ret);
+    }
+}
diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/Adapters/SyncToAsyncPageCollection.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Adapters/SyncToAsyncPageCollection.cs
new file mode 100644
index 000000000..89b963137
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Adapters/SyncToAsyncPageCollection.cs
@@ -0,0 +1,67 @@
+// Copyright (c) Microsoft Corporation. All rights reserved.
+// Licensed under the MIT License.
+
+using System.ClientModel;
+using System.Runtime.ExceptionServices;
+
+namespace OpenAI.TestFramework.Adapters;
+
+/// <summary>
+/// An adapter to make a <see cref="PageCollection{T}"/> look and work like an <see cref="AsyncPageCollection{T}"/>. This
+/// simplifies writing test cases.
+/// </summary>
+/// <typeparam name="T">The type of the items the enumerator returns.</typeparam>
+public class SyncToAsyncPageCollection<T> : AsyncPageCollection<T>
+{
+    private PageCollection<T>? _syncCollection;
+    private Exception? _ex;
+
+    /// <summary>
+    /// Creates a new instance.
+    /// </summary>
+    /// <param name="syncCollection">The synchronous collection to wrap.</param>
+    /// <exception cref="ArgumentNullException">If the collection was null.</exception>
+    public SyncToAsyncPageCollection(PageCollection<T> syncCollection)
+    {
+        _syncCollection = syncCollection ?? throw new ArgumentNullException(nameof(syncCollection));
+    }
+
+    /// <summary>
+    /// Creates a new instance.
+    /// </summary>
+    /// <param name="ex">The exception to throw.</param>
+    /// <exception cref="ArgumentNullException">If the exception was null.</exception>
+    public SyncToAsyncPageCollection(Exception ex)
+    {
+        _ex = ex ??
throw new ArgumentNullException(nameof(ex));
+        _syncCollection = null;
+    }
+
+    /// <inheritdoc/>
+    protected override Task<PageResult<T>> GetCurrentPageAsyncCore()
+    {
+        if (_ex != null)
+        {
+            return Task.FromException<PageResult<T>>(_ex);
+        }
+        else
+        {
+            return Task.FromResult(_syncCollection!.GetCurrentPage());
+        }
+    }
+
+    /// <inheritdoc/>
+    protected override async IAsyncEnumerator<PageResult<T>> GetAsyncEnumeratorCore(CancellationToken cancellationToken = default)
+    {
+        if (_ex != null)
+        {
+            ExceptionDispatchInfo.Capture(_ex).Throw();
+        }
+
+        foreach (PageResult<T> page in _syncCollection!)
+        {
+            await Task.Delay(0).ConfigureAwait(false);
+            yield return page;
+        }
+    }
+}
diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/AsyncOnlyAttribute.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/AsyncOnlyAttribute.cs
new file mode 100644
index 000000000..3fbfc191f
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/AsyncOnlyAttribute.cs
@@ -0,0 +1,14 @@
+// Copyright (c) Microsoft Corporation. All rights reserved.
+// Licensed under the MIT License.
+
+using NUnit.Framework;
+
+namespace OpenAI.TestFramework;
+
+/// <summary>
+/// Attribute that can be applied to a test to indicate it only runs in asynchronous mode.
+/// </summary>
+[AttributeUsage(AttributeTargets.Method, AllowMultiple = false, Inherited = true)]
+public class AsyncOnlyAttribute() : NUnitAttribute
+{
+}
diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/AutoSyncAsync/AsyncToSyncInterceptor.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/AutoSyncAsync/AsyncToSyncInterceptor.cs
new file mode 100644
index 000000000..08fe268ec
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/AutoSyncAsync/AsyncToSyncInterceptor.cs
@@ -0,0 +1,429 @@
+// Copyright (c) Microsoft Corporation. All rights reserved.
+// Licensed under the MIT License.
+ +using System.ClientModel; +using System.Collections.Concurrent; +using System.Diagnostics; +using System.Reflection; +using Castle.DynamicProxy; +using OpenAI.TestFramework.Adapters; +using Ext = OpenAI.TestFramework.Utils.TypeExtensions; + +namespace OpenAI.TestFramework.AutoSyncAsync; + +/// +/// An interceptor for Castle dynamic proxies that allows you to call the synchronous version of a method when the asynchronous one +/// is called on the proxy. This is useful for testing where you can write the async version of a test, and then automatically test +/// both async and sync methods with the same test code. +/// +[DebuggerStepThrough] +public class AsyncToSyncInterceptor : IInterceptor +{ + private const string AsyncSuffix = "Async"; + + private static readonly TypeArrayEquality s_typeArrayEquality = new(); + private static readonly ConcurrentDictionary> s_syncAsyncPairs = new(); + private static readonly MethodInfo s_taskFromResult = typeof(Task).GetMethod(nameof(Task.FromResult), BindingFlags.Public | BindingFlags.Static)!; + private static readonly MethodInfo s_taskFromException = typeof(Task) + .GetMethods(BindingFlags.Static | BindingFlags.Public) + .Where(m => m.Name == nameof(Task.FromException) && m.IsGenericMethodDefinition) + .First(); + + private readonly BindingFlags _flags; + + /// + /// Creates a new instance. + /// + /// True if you want to use async methods, false otherwise. + /// The binding flags to use when searching for methods. Default is public instance methods. + public AsyncToSyncInterceptor(bool useAsync, BindingFlags flags = BindingFlags.Public | BindingFlags.Instance) + { + UseAsync = useAsync; + _flags = flags; + } + + /// + /// Gets the shared use sync methods instance. + /// + public static AsyncToSyncInterceptor UseSyncMethods { get; } = new(false); + + /// + /// Gets the shared use async methods instance. 
+    /// </summary>
+    public static AsyncToSyncInterceptor UseAsyncMethods { get; } = new(true);
+
+    /// <inheritdoc/>
+    [DebuggerStepThrough]
+    public virtual void Intercept(IInvocation invocation)
+    {
+        // 1. Should we even intercept this?
+        if (ShouldSkipIntercepting(invocation.Method))
+        {
+            invocation.Proceed();
+            return;
+        }
+
+        // 2. Check if this method is one of a pair of Operation and OperationAsync methods.
+        bool isSyncAsyncPair = IsMethodSyncAsyncPair(invocation.Method);
+        if (!isSyncAsyncPair)
+        {
+            throw CreateEx("Method does not have a synchronous and asynchronous pair", invocation.Method);
+        }
+
+        // 3. If it is, check if the method is the synchronous version. We only allow async versions in the test code.
+        bool isAsyncMethod = invocation.Method.Name.EndsWith(AsyncSuffix);
+        if (!isAsyncMethod)
+        {
+            throw CreateEx("You must use the asynchronous versions of the methods when writing your tests", invocation.Method);
+        }
+
+        Type asyncReturnType = invocation.Method.ReturnType;
+
+        // 4. Call the correct synchronous or asynchronous method and wrap the returned result or exception
+        if (UseAsync)
+        {
+            // Async method running in async mode: no need to do anything special, continue normally
+            invocation.Proceed();
+        }
+        else
+        {
+            // Call the equivalent sync method
+            string methodName = RemoveAsyncSuffix(invocation.Method.Name);
+            Type expectedReturnType = ToSyncRetType(asyncReturnType);
+            Type[] expectedArgs = invocation.Method.GetParameters().Select(p => p.ParameterType).ToArray();
+
+            MethodInfo syncMethod = invocation.TargetType.GetMethod(
+                methodName, _flags, binder: null, expectedArgs, modifiers: null)!;
+
+            // This should never happen since we've already checked for the existence of the expected method
+            Debug.Assert(syncMethod != null);
+            if (syncMethod == null)
+            {
+                throw CreateEx("Could not find the synchronous version of the method", invocation.Method);
+            }
+
+            if (syncMethod.ContainsGenericParameters)
+            {
+                syncMethod = 
syncMethod.MakeGenericMethod(invocation.Method.GetGenericArguments()); + } + + // Call the synchronous method + try + { + object? result = syncMethod.Invoke(invocation.InvocationTarget, invocation.Arguments); + if (result != null && !expectedReturnType.IsAssignableFrom(result.GetType())) + { + throw CreateEx("The synchronous method returned an unexpected type", invocation.Method); + } + + invocation.ReturnValue = ToAsyncResult(asyncReturnType, result); + } + catch (TargetInvocationException ex) + { + invocation.ReturnValue = ToAsyncException(asyncReturnType, ex.InnerException ?? ex); + } + } + } + + /// + /// Whether or not we are using async methods. + /// + public bool UseAsync { get; } + + /// Determines whether or not we should skip intercepting this method or not. + /// + /// The method we are inspecting. + /// True to skip intercepting this method, false otherwise. + protected virtual bool ShouldSkipIntercepting(MethodInfo? method) + { + return method == null + // Skip for special names (i.e. getters and setters) + || method.IsSpecialName + // Also for dispose methods + || method.Name == nameof(IDisposable.Dispose) + || method.Name == nameof(IAsyncDisposable.DisposeAsync); + } + + /// + /// Determines whether or not the specified method is part of a pair of synchronous and asynchronous methods. This will + /// check based on 3 factors: + /// + /// If there is a "???" and "???Async" pair of named methods + /// If the arguments are exactly the same for both methods + /// If we know how to determine the expected return type for the synchronous method, from the asynchronous one + /// + /// + /// The method to check. + /// True if it is, false otherwise. + protected virtual bool IsMethodSyncAsyncPair(MethodInfo? 
method) + { + if (method == null || method.DeclaringType == null) + { + return false; + } + + ISet validPrefixes = s_syncAsyncPairs.GetOrAdd(method.DeclaringType, t => DetermineValidSyncAsyncPairs(t, _flags)); + return validPrefixes.Contains(RemoveAsyncSuffix(method.Name)); + } + + /// + /// Determines what the corresponding synchronous return type would be for the specified asynchronous return type. + /// + /// The asynchronous return type. + /// The corresponding synchronous return type. + /// If we don't know what the equivalent would be. + protected virtual Type ToSyncRetType(Type asyncReturnType) + { + if (typeof(Task) == asyncReturnType || typeof(ValueTask) == asyncReturnType) + { + return typeof(void); + } + else if (Ext.IsClosedGenericOf(asyncReturnType, typeof(Task<>), out Type[] genericTypes)) + { + return genericTypes[0]; + } + else if (Ext.IsClosedGenericOf(asyncReturnType, typeof(ValueTask<>), out genericTypes)) + { + return genericTypes[0]; + } + else if (Ext.IsClosedGenericOf(asyncReturnType, typeof(AsyncPageCollection<>), out genericTypes)) + { + return typeof(PageCollection<>).MakeGenericType(genericTypes); + } + else if (Ext.IsClosedGenericOf(asyncReturnType, typeof(AsyncCollectionResult<>), out genericTypes)) + { + return typeof(CollectionResult<>).MakeGenericType(genericTypes); + } + else if (Ext.IsClosedGenericOf(asyncReturnType, typeof(IAsyncEnumerable<>), out genericTypes)) + { + return typeof(IEnumerable<>).MakeGenericType(genericTypes); + } + else + { + throw new NotSupportedException("Don't know how to determine the synchronous equivalent return type of " + asyncReturnType.FullName); + } + } + + /// + /// Wraps the result from a synchronous method into the equivalent asynchronous return type. + /// + /// The asynchronous return type. + /// The result to wrap. + /// The wrapped result. + /// If we don't support the conversion. + protected virtual object? 
result) + { + if (typeof(Task) == asyncReturnType) + { + return Task.CompletedTask; + } + else if (Ext.IsClosedGenericOf(asyncReturnType, typeof(Task<>), out Type[] genericTypes)) + { + return s_taskFromResult + .MakeGenericMethod(genericTypes) + .Invoke(null, [result]); + } + else if (typeof(ValueTask) == asyncReturnType) + { + return new ValueTask(); + } + else if (Ext.IsClosedGenericOf(asyncReturnType, typeof(ValueTask<>), out genericTypes)) + { + return Activator.CreateInstance( + typeof(ValueTask<>).MakeGenericType(genericTypes), + result); + } + else if (Ext.IsClosedGenericOf(asyncReturnType, typeof(AsyncPageCollection<>), out genericTypes)) + { + return Activator.CreateInstance( + typeof(SyncToAsyncPageCollection<>).MakeGenericType(genericTypes), + result); + } + else if (Ext.IsClosedGenericOf(asyncReturnType, typeof(AsyncCollectionResult<>), out genericTypes)) + { + return Activator.CreateInstance( + typeof(SyncToAsyncCollectionResult<>).MakeGenericType(genericTypes), + result); + } + else if (Ext.IsClosedGenericOf(asyncReturnType, typeof(IAsyncEnumerable<>), out genericTypes)) + { + return Activator.CreateInstance( + typeof(SyncToAsyncEnumerable<>).MakeGenericType(genericTypes), + result); + } + else + { + throw new NotSupportedException("Don't know how to create the sync to async wrapper for " + asyncReturnType.FullName); + } + } + + /// + /// Wraps the exception from a synchronous method into the equivalent asynchronous return type. + /// + /// The asynchronous return type. + /// The exception to wrap. + /// The wrapped exception. + /// If we don't support the conversion. + protected virtual object? 
ToAsyncException(Type asyncReturnType, Exception ex) + { + if (typeof(Task) == asyncReturnType) + { + return Task.FromException(ex); + } + else if (Ext.IsClosedGenericOf(asyncReturnType, typeof(Task<>), out Type[] genericTypes)) + { + return s_taskFromException + .MakeGenericMethod(genericTypes) + .Invoke(null, [ex]); + } + else if (typeof(ValueTask) == asyncReturnType) + { + return new ValueTask(Task.FromException(ex)); + } + else if (Ext.IsClosedGenericOf(asyncReturnType, typeof(ValueTask<>), out genericTypes)) + { + var failedTask = s_taskFromException + .MakeGenericMethod(genericTypes) + .Invoke(null, [ex]); + return Activator.CreateInstance( + typeof(ValueTask<>).MakeGenericType(genericTypes), + failedTask); + } + else if (Ext.IsClosedGenericOf(asyncReturnType, typeof(AsyncPageCollection<>), out genericTypes)) + { + return Activator.CreateInstance( + typeof(SyncToAsyncPageCollection<>).MakeGenericType(genericTypes), + ex); + } + else if (Ext.IsClosedGenericOf(asyncReturnType, typeof(AsyncCollectionResult<>), out genericTypes)) + { + return Activator.CreateInstance( + typeof(SyncToAsyncCollectionResult<>).MakeGenericType(genericTypes), + ex); + } + else if (Ext.IsClosedGenericOf(asyncReturnType, typeof(IAsyncEnumerable<>), out genericTypes)) + { + return Activator.CreateInstance( + typeof(SyncToAsyncEnumerable<>).MakeGenericType(genericTypes), + ex); + } + else + { + throw new NotSupportedException("Don't know how to wrap the exception for " + asyncReturnType.FullName); + } + } + + private static InvalidOperationException CreateEx(string description, MethodInfo method) + { + return new InvalidOperationException($"{description}. '{method.DeclaringType?.Name} -> {method.Name}'"); + } + + private static string RemoveAsyncSuffix(string? name) + { + if (name == null) + return string.Empty; + + int index = name.LastIndexOf(AsyncSuffix); + return index >= 0 + ? 
name.Substring(0, index) + : name; + } + + [DebuggerStepperBoundary] + private ISet DetermineValidSyncAsyncPairs(Type declaringType, BindingFlags flags) + { + // Group potential pairs based only on the method name removing the "Async" postfix + var potentialPairs = declaringType.GetMethods(flags) + .Where(m => !m.IsSpecialName) + .GroupBy(m => RemoveAsyncSuffix(m.Name)) + .OrderBy(g => g.Key) + .Select(g => new + { + g.Key, + Potentials = g.Select(m => new + { + m.Name, + Args = m.GetParameters().Select(p => p.ParameterType).ToArray(), + Return = m.ReturnType, + }) + // Order by name to ensure OperationName comes before OperationNameAsync + .OrderBy(p => p.Name) + // Match on method arguments + .GroupBy(g => g.Args, s_typeArrayEquality) + .Select(g => g.ToArray()) + }); + + // Now evaluate potential pairs to ensure that for each argument list for that method, there exists both a synchronous + // and asynchronous version with equivalent return types + HashSet validPairPrefixes = new(); + + foreach (var entry in potentialPairs) + { + bool allValid = entry.Potentials.All(matchedPair => + { + // because of the way we sorted above, we should have exactly 2 entries here, the first is the synchronous method + // the second the corresponding asynchronous method + return matchedPair.Length == 2 + && matchedPair[0].Name + AsyncSuffix == matchedPair[1].Name + && matchedPair[0].Return == ToSyncRetType(matchedPair[1].Return); + }); + + if (allValid) + { + validPairPrefixes.Add(entry.Key); + } + } + + return validPairPrefixes; + } + + /// + /// Helper comparer that compares all of the Types in an array for equality. + /// + private class TypeArrayEquality : IEqualityComparer + { + /// + public bool Equals(Type[]? x, Type[]? 
y) + { + if (ReferenceEquals(x, y)) + { + return true; + } + else if (x == null || y == null) + { + return false; + } + else if (x.LongLength != y.LongLength) + { + return false; + } + + for (long i = 0; i < x.LongLength; i++) + { + if (x[i] != y[i]) + { + return false; + } + } + + return true; + } + + /// + public int GetHashCode(Type[] obj) + { + if (obj == null) + { + return 0; + } + + int rollingHash = 1; // to distinguish empty case from null case + for (long i = 0; i < obj.LongLength; i++) + { + rollingHash = (rollingHash, obj[i].GetHashCode()).GetHashCode(); + } + + return rollingHash; + } + } +} diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/AutoSyncAsync/AutoSyncAsyncMixIn.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/AutoSyncAsync/AutoSyncAsyncMixIn.cs new file mode 100644 index 000000000..538f8fb75 --- /dev/null +++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/AutoSyncAsync/AutoSyncAsyncMixIn.cs @@ -0,0 +1,27 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +namespace OpenAI.TestFramework.AutoSyncAsync; + +/// +/// An implementation of that allows you to get the original back, as well as a place +/// to store an additional context. +/// +public class AutoSyncAsyncMixIn : IAutoSyncAsync +{ + /// + /// Creates a new instance. + /// + /// The original instance. + public AutoSyncAsyncMixIn(object original, object? context = null) + { + Original = original; + Context = context; + } + + /// + public object Original { get; } + + /// + public object? Context { get; } +} diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/AutoSyncAsync/IAutoSyncAsync.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/AutoSyncAsync/IAutoSyncAsync.cs new file mode 100644 index 000000000..551df75c5 --- /dev/null +++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/AutoSyncAsync/IAutoSyncAsync.cs @@ -0,0 +1,22 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. 
+// Licensed under the MIT License. + +namespace OpenAI.TestFramework.AutoSyncAsync; + +/// +/// An interface that serves as a way to identify a dynamically proxied class that supports automatic sync and async testing. This +/// also provides a way to get the original un-proxied instance. +/// +public interface IAutoSyncAsync +{ + /// + /// Gets the original un-proxied instance back. + /// + public object Original { get; } + + /// + /// Any additional context associated with the instrumented object (e.g. options used to create it). + /// + public object? Context { get; } +} diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/AutoSyncAsync/TestProxyGenerationHook.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/AutoSyncAsync/TestProxyGenerationHook.cs new file mode 100644 index 000000000..d11d5d86f --- /dev/null +++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/AutoSyncAsync/TestProxyGenerationHook.cs @@ -0,0 +1,45 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +using System.Reflection; +using Castle.DynamicProxy; +using NUnit.Framework.Interfaces; +using NUnit.Framework.Internal; + +namespace OpenAI.TestFramework.AutoSyncAsync +{ + /// + /// Controls which methods are skipped during dynamic proxy generation. + /// + public class TestProxyGenerationHook : IProxyGenerationHook + { + /// + public void MethodsInspected() + { } + + /// + public void NonProxyableMemberNotification(Type type, MemberInfo memberInfo) + { } + + /// + public bool ShouldInterceptMethod(Type type, MethodInfo methodInfo) + { + IMethodInfo? testMethod = TestExecutionContext.CurrentContext.CurrentTest.Method; + + if (methodInfo == null + // Skip for special names (i.e. 
getters and setters) + || methodInfo.IsSpecialName + // Also for dispose methods + || methodInfo.Name == nameof(IDisposable.Dispose) + || methodInfo.Name == nameof(IAsyncDisposable.DisposeAsync) + // If we are running a sync only or async only, skip intercepting altogether + || testMethod?.IsDefined(false) == true + || testMethod?.IsDefined(false) == true) + { + return false; + } + + return true; + } + } +} diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/AutoSyncAsync/ThisLeakInterceptor.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/AutoSyncAsync/ThisLeakInterceptor.cs new file mode 100644 index 000000000..c7d6c1e70 --- /dev/null +++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/AutoSyncAsync/ThisLeakInterceptor.cs @@ -0,0 +1,25 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +using System.Diagnostics; +using Castle.DynamicProxy; + +namespace OpenAI.TestFramework.AutoSyncAsync; + +/// +/// A basic interceptor that prevents the leaking of the original un-proxied this instance as a return value. +/// +public class ThisLeakInterceptor : IInterceptor +{ + /// + [DebuggerStepThrough] + public void Intercept(IInvocation invocation) + { + invocation.Proceed(); + + if (invocation.ReturnValue == invocation.InvocationTarget) + { + invocation.ReturnValue = invocation.Proxy; + } + } +} diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/AutoSyncAsyncTestFixtureAttribute.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/AutoSyncAsyncTestFixtureAttribute.cs new file mode 100644 index 000000000..29983a94d --- /dev/null +++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/AutoSyncAsyncTestFixtureAttribute.cs @@ -0,0 +1,32 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. 
+ +using NUnit.Framework; +using NUnit.Framework.Interfaces; +using NUnit.Framework.Internal; +using OpenAI.TestFramework.Utils; + +namespace OpenAI.TestFramework; + +/// +/// Attribute used to indicate that a test fixture should automatically be run in both synchronous and asynchronous mode. +/// +[AttributeUsage(AttributeTargets.Class, AllowMultiple = false, Inherited = true)] +public class AutoSyncAsyncTestFixtureAttribute : NUnitAttribute, IFixtureBuilder2 +{ + /// + public IEnumerable BuildFrom(ITypeInfo typeInfo) + => BuildFrom(typeInfo, null!); + + /// + public IEnumerable BuildFrom(ITypeInfo typeInfo, IPreFilter filter) + { + List suites = + [ + .. new TestFixtureAttribute([false]).BuildFrom(typeInfo, new AndPreFilter(filter, new SyncAsyncPreFilter(false))), + .. new TestFixtureAttribute([true]).BuildFrom(typeInfo, new AndPreFilter(filter, new SyncAsyncPreFilter(true))), + ]; + + return suites; + } +} diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/ClientTestBase.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/ClientTestBase.cs new file mode 100644 index 000000000..7406e738e --- /dev/null +++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/ClientTestBase.cs @@ -0,0 +1,174 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +using System.Diagnostics; +using Castle.DynamicProxy; +using NUnit.Framework; +using NUnit.Framework.Internal; +using OpenAI.TestFramework.AutoSyncAsync; +using OpenAI.TestFramework.Utils; + +namespace OpenAI.TestFramework; + +/// +/// Base class for client test cases. This provides support for writing only tests that use the Async version of +/// methods, while automatically creating tests that use the equivalent Sync version of each method. Please note that +/// this will only work for public virtual methods. In order for this to work, you should write a test that uses the +/// async version of a method. 
+/// +[AutoSyncAsyncTestFixture] +public abstract class ClientTestBase +{ + private static ProxyGenerator? s_proxyGenerator = null; + private static ThisLeakInterceptor? s_thisLeakInterceptor = null; + private static AsyncToSyncInterceptor? s_asyncInterceptor = null; + private static AsyncToSyncInterceptor? s_syncInterceptor = null; + + private CancellationTokenSource? _cts = null; + + /// + /// Creates a new instance. + /// + /// True to run the async version of a test, false to run the sync version of a test. + public ClientTestBase(bool isAsync) + { + IsAsync = isAsync; + } + + /// + /// Gets whether or not we are running async tests. + /// + public virtual bool IsAsync { get; } + + /// + /// Gets the start time of the test. + /// + public virtual DateTimeOffset TestStartTime => TestExecutionContext.CurrentContext.StartTime.ToUniversalTime(); + + /// + /// Gets the test timeout. + /// + public virtual TimeSpan TestTimeout => Debugger.IsAttached + ? Default.DebuggerAttachedTestTimeout + : Default.TestTimeout; + + /// + /// Gets the cancellation token to use. + /// + public virtual CancellationToken Token => _cts?.Token ?? default; + + [SetUp] + public void TestSetup() + { + _cts?.Dispose(); + _cts = new CancellationTokenSource(TestTimeout); + } + + [TearDown] + public void TestCleanup() + { + _cts?.Dispose(); + _cts = null; + } + + /// + /// Gets the instance to use to create proxies of classes + /// that allow you to inject additional functionality for testing. + /// + protected static ProxyGenerator ProxyGenerator => s_proxyGenerator ??= new ProxyGenerator(); + + /// + /// An interceptor that prevents leaking a reference to the original instance as a return value from methods. + /// + protected static ThisLeakInterceptor ThisLeakInterceptor => s_thisLeakInterceptor ??= new ThisLeakInterceptor(); + + /// + /// An interceptor to force the use of the sync version of a method. 
+ /// + protected static AsyncToSyncInterceptor UseSyncMethodInterceptor => s_syncInterceptor ??= new AsyncToSyncInterceptor(false); + + /// + /// An interceptor to force the use of the async version of a method. + /// + protected static AsyncToSyncInterceptor UseAsyncMethodInterceptor => s_asyncInterceptor ??= new AsyncToSyncInterceptor(true); + + /// + /// Wraps a client for automatic sync/async testing. This will return a proxied version of the client that will allow you to + /// automatically use the sync versions of methods. + /// + /// The type of the client instance. + /// The client instance to instrument for testing. + /// (Optional) Any additional context to associate with the wrapped client. + /// (Optional) Any additional interceptors to use. + /// The proxied version of the client. + public T WrapClient(T client, object? context = null, params IInterceptor[] interceptors) where T : class + => (T)WrapClient(typeof(T), client, context, interceptors); + + /// + /// Gets the original client from a wrapped client. + /// + /// The type of the client. + /// The wrapped client instance. + /// The original client instance. + /// The client passed was not wrapped. + public virtual T UnWrap(T wrapped) where T : class + { + if (wrapped is IAutoSyncAsync instrumented) + { + return (T)instrumented.Original; + } + + throw new NotSupportedException($"That instance was not wrapped using {nameof(WrapClient)}"); + } + + /// + /// Gets the context associated with the wrapped instance. + /// + /// The type of the client. + /// The wrapped client. + /// The associated context for the wrapped instance. Will be null if none was set. + /// The instance passed was not wrapped. + public virtual object? 
GetClientContext(T client) where T : class + { + if (client is IAutoSyncAsync instrumented) + { + return instrumented.Context; + } + + throw new NotSupportedException($"That instance was not wrapped using {nameof(WrapClient)}"); + } + + /// + /// Wraps a client with sync/async equivalent methods for testing. This enables the automatic testing of the sync version + /// of methods if you write an async test case. + /// + /// The type of the client. + /// The client instance to wrap. + /// (Optional) Any additional context to associate with the wrapped client. + /// (Optional) Any additional interceptors to include. + /// The wrapped version of the client. + protected internal virtual object WrapClient(Type instanceType, object client, object? context, IEnumerable? interceptors) + { + List allInterceptors = new(); + + if (interceptors != null) + { + allInterceptors.AddRange(interceptors); + } + + allInterceptors.Add(ThisLeakInterceptor); + allInterceptors.Add(IsAsync ? UseAsyncMethodInterceptor : UseSyncMethodInterceptor); + + ProxyGenerationOptions options = new(new TestProxyGenerationHook()); + options.AddMixinInstance(new AutoSyncAsyncMixIn(client, context)); + + object proxy = ProxyGenerator.CreateClassProxyWithTarget( + instanceType, + [], + client, + options, + allInterceptors.ToArray()); + + return proxy; + } +} diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/Mocks/CapturedMessage.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Mocks/CapturedMessage.cs new file mode 100644 index 000000000..854b8cd7d --- /dev/null +++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Mocks/CapturedMessage.cs @@ -0,0 +1,175 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +using System.Net; +using System.Net.Http; +using System.Net.Http.Headers; +using OpenAI.TestFramework.Utils; + +namespace OpenAI.TestFramework.Mocks; + +/// +/// A captured message. This is used as part of the . 
+/// +public abstract class CapturedMessage +{ + private static BinaryData? s_emptyData = null; + private static IReadOnlyDictionary>? s_emptyHeaders = null; + + /// + /// An empty header dictionary. + /// + public static IReadOnlyDictionary> EMPTY_HEADERS + => s_emptyHeaders ??= new Dictionary>(); + + /// + /// Empty binary data. + /// + public static BinaryData EMPTY_DATA => s_emptyData ??= new BinaryData(Array.Empty()); + + /// + /// Gets or sets the headers of the captured message. + /// + public IReadOnlyDictionary> Headers { get; init; } = EMPTY_HEADERS; + + /// + /// Gets or sets the content of the captured message. + /// + public BinaryData Content { get; init; } = EMPTY_DATA; + + /// + /// Copies the content from the provided to a new instance. + /// + /// The to copy the content from. + /// A new instance containing the copied content. + public static BinaryData CopyContent(HttpContent? content) + { + if (content == null) + { + return EMPTY_DATA; + } + + using Stream stream = content.ReadAsStreamAsync().Result; + return BinaryData.FromStream(stream); + } + + /// + /// Copies the headers from the provided and to a new dictionary. + /// + /// The to copy headers from. + /// The to copy headers from. + /// A new dictionary containing the copied headers. + public static IReadOnlyDictionary> CopyHeaders(HttpHeaders header, HttpContentHeaders? contentHeaders) + { + Dictionary> dict = new(StringComparer.OrdinalIgnoreCase); + foreach (var kvp in header) + { + dict[kvp.Key] = new List(kvp.Value); + } + + if (contentHeaders != null) + { + foreach (var kvp in contentHeaders) + { + var list = (List?)dict.GetValueOrDefault(kvp.Key); + if (list == null) + { + list = new List(); + dict[kvp.Key] = list; + } + + list.AddRange(kvp.Value); + } + } + + return dict; + } +} + +/// +/// A captured request. +/// +public class CapturedRequest : CapturedMessage +{ + /// + /// Creates a new instance. 
+ /// + public CapturedRequest() + { } + + /// + /// Creates a new instance of using the provided . + /// + /// The to create the captured request from. + public CapturedRequest(HttpRequestMessage request) + { + if (request == null) + { + throw new ArgumentNullException(nameof(request)); + } + + Method = request.Method; + Uri = request.RequestUri; + Headers = CopyHeaders(request.Headers, request.Content?.Headers); + Content = CopyContent(request.Content); + } + + /// + /// Gets or sets the HTTP method of the captured request. + /// + public HttpMethod Method { get; init; } = HttpMethod.Get; + + /// + /// Gets or sets the URI of the captured request. + /// + public Uri? Uri { get; init; } +} + +/// +/// A captured response. +/// +public class CapturedResponse : CapturedMessage +{ + /// + /// Gets or sets the status code of the captured response. + /// + public HttpStatusCode Status { get; init; } = HttpStatusCode.OK; + + /// + /// Gets or sets the reason phrase of the captured response. + /// + public string? ReasonPhrase { get; init; } = "OK"; + + /// + /// Converts the captured response to an . + /// + /// The . 
+ public HttpResponseMessage ToResponse() + { + const string contentPrefix = "Content-"; + + HttpResponseMessage response = new() + { + StatusCode = Status, + ReasonPhrase = ReasonPhrase + }; + + foreach (var kvp in Headers.Where(h => h.Key?.StartsWith(contentPrefix) == false)) + { + response.Headers.TryAddWithoutValidation(kvp.Key, kvp.Value); + } + + if (Content != null && Content.ToMemory().Length > 0) + { + response.Content = new StreamContent(Content.ToStream()); + foreach (var kvp in Headers.Where(h => h.Key?.StartsWith(contentPrefix) == true)) + { + response.Content.Headers.TryAddWithoutValidation(kvp.Key, kvp.Value); + } + } + + return response; + } +} + + diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/Mocks/MockAsyncCollectionResult.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Mocks/MockAsyncCollectionResult.cs new file mode 100644 index 000000000..86e871aa6 --- /dev/null +++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Mocks/MockAsyncCollectionResult.cs @@ -0,0 +1,32 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +using System.ClientModel; +using System.ClientModel.Primitives; + +namespace OpenAI.TestFramework.Mocks; + +/// +/// Represents a mock implementation of the class. +/// +/// The type of the values in the collection. +public class MockAsyncCollectionResult : AsyncCollectionResult +{ + private readonly Func> _enumerateAsyncFunc; + + /// + /// Initializes a new instance of the class + /// with the specified asynchronous enumeration function and optional pipeline response. + /// + /// The function that asynchronously enumerates the values in the collection. + /// The optional pipeline response. + public MockAsyncCollectionResult(Func> enumerateAsyncFunc, PipelineResponse? response = null) : + base(response ?? new MockPipelineResponse()) + { + _enumerateAsyncFunc = enumerateAsyncFunc ?? 
throw new ArgumentNullException(nameof(enumerateAsyncFunc)); + } + + /// + public override IAsyncEnumerator GetAsyncEnumerator(CancellationToken cancellationToken = default) + => _enumerateAsyncFunc().GetAsyncEnumerator(cancellationToken); +} diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/Mocks/MockAsyncPageCollection.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Mocks/MockAsyncPageCollection.cs new file mode 100644 index 000000000..424681252 --- /dev/null +++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Mocks/MockAsyncPageCollection.cs @@ -0,0 +1,70 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +using System.ClientModel; +using System.ClientModel.Primitives; + +namespace OpenAI.TestFramework.Mocks; + +/// +/// Represents a mock implementation of the class. +/// +/// The type of the values in the collection. +public class MockAsyncPageCollection : AsyncPageCollection +{ + private readonly Func> _enumerateAsyncFunc; + private readonly PipelineResponse _response; + private readonly int _itemsPerPage; + private PageResult? _currentPage; + + /// + /// Initializes a new instance. + /// + /// The function that enumerates the collection asynchronously. + /// The pipeline response. + public MockAsyncPageCollection(Func> enumerateAsyncFunc, PipelineResponse response, int itemsPerPage = 5) + { + if (itemsPerPage <= 0) + { + throw new ArgumentOutOfRangeException(nameof(itemsPerPage)); + } + + _enumerateAsyncFunc = enumerateAsyncFunc ?? throw new ArgumentNullException(nameof(enumerateAsyncFunc)); + _response = response; + _itemsPerPage = itemsPerPage; + } + + /// + protected override Task> GetCurrentPageAsyncCore() + => Task.FromResult(_currentPage ?? 
throw new InvalidOperationException("Please call MoveNextAsync first.")); + + /// + protected override async IAsyncEnumerator> GetAsyncEnumeratorCore(CancellationToken cancellationToken = default) + { + List items = new(_itemsPerPage); + int pageStart = 0; + int rolling = 0; + + await foreach (TValue value in _enumerateAsyncFunc()) + { + items.Add(value); + rolling++; + if (items.Count == _itemsPerPage) + { + _currentPage = PageResult.Create(items, ToContinuation(pageStart), ToContinuation(rolling), _response); + yield return _currentPage; + items.Clear(); + pageStart = rolling; + } + } + + if (items.Count > 0) + { + _currentPage = PageResult.Create(items, ToContinuation(pageStart), ToContinuation(rolling), _response); + yield return _currentPage; + } + } + + private static ContinuationToken ToContinuation(int offset) + => ContinuationToken.FromBytes(BinaryData.FromBytes(BitConverter.GetBytes(offset))); +} diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/Mocks/MockCollectionResult.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Mocks/MockCollectionResult.cs new file mode 100644 index 000000000..e12e34b34 --- /dev/null +++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Mocks/MockCollectionResult.cs @@ -0,0 +1,36 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +using System.ClientModel; +using System.ClientModel.Primitives; + +namespace OpenAI.TestFramework.Mocks +{ + /// + /// Represents a mock implementation of the class. + /// + /// The type of the values in the collection. + public class MockCollectionResult : CollectionResult + { + private readonly Func> _enumerateFunc; + + /// + /// Initializes a new instance of the class with the specified enumeration + /// function and optional pipeline response. + /// + /// The function used to enumerate the collection. + /// The pipeline response associated with the collection. + public MockCollectionResult(Func> enumerateFunc, PipelineResponse? 
response = null) : + base(response ?? new MockPipelineResponse()) + { + _enumerateFunc = enumerateFunc ?? throw new ArgumentNullException(nameof(enumerateFunc)); + } + + /// + /// Returns an enumerator that iterates through the collection. + /// + /// An enumerator that can be used to iterate through the collection. + public override IEnumerator GetEnumerator() + => _enumerateFunc().GetEnumerator(); + } +} diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/Mocks/MockHeaders.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Mocks/MockHeaders.cs new file mode 100644 index 000000000..7cac49376 --- /dev/null +++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Mocks/MockHeaders.cs @@ -0,0 +1,91 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +namespace OpenAI.TestFramework.Mocks; + +/// +/// Basic implementation of headers. +/// +public class MockHeaders +{ + private IDictionary> _headers = + new Dictionary>(StringComparer.OrdinalIgnoreCase); + + /// + /// Adds a header value. + /// + /// The name of the header. + /// The value to add. + public virtual void Add(string name, string value) + { + IList? existing; + if (!_headers.TryGetValue(name, out existing)) + { + existing = new List(); + _headers[name] = existing; + } + + existing.Add(value); + } + + /// + /// Removes all values of a header. + /// + /// The name of the header to remove. + /// True if we removed a value, false otherwise. + public virtual bool Remove(string name) => _headers.Remove(name); + + /// + /// Sets the value for a header. This will override all existing values. + /// + /// The name of the header. + /// The value to set. + public virtual void Set(string name, string value) => _headers[name] = new List() { value }; + + /// + /// Gets an enumerator for the header values. In the case of a header with more than one value, they will be joined into + /// a single comma separated string. + /// + /// The enumerator. 
+ public virtual IEnumerator> GetEnumerator() + => _headers + .Select(kvp => new KeyValuePair(kvp.Key, string.Join(",", kvp.Value))) + .GetEnumerator(); + + /// + /// Gets the value for a header. In the case of a header with more than one value, they will be joined into a single comma + /// separated string. + /// + /// The name of the header. + /// The value of the headers + /// True if the header was found, false otherwise. + public virtual bool TryGetValue(string name, out string? value) + { + if (_headers.TryGetValue(name, out IList? existing)) + { + value = string.Join(",", existing); + return true; + } + + value = null; + return false; + } + + /// + /// Gets the values for a header. + /// + /// The name of the header. + /// All of the values for the header. + /// True if the header was found, false otherwise. + public virtual bool TryGetValues(string name, out IEnumerable? values) + { + if (_headers.TryGetValue(name, out IList? existing)) + { + values = existing; + return true; + } + + values = null; + return false; + } +} diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/Mocks/MockHttpMessageHandler.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Mocks/MockHttpMessageHandler.cs new file mode 100644 index 000000000..74f4b9980 --- /dev/null +++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Mocks/MockHttpMessageHandler.cs @@ -0,0 +1,124 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Net; +using System.Net.Http; + +namespace OpenAI.TestFramework.Mocks; + +/// +/// A mock message handler that doesn't use the network. This captures all received requests, and allows you to specify a handler +/// to hand craft response messages. This can be useful for unit testing. +/// +public class MockHttpMessageHandler : HttpMessageHandler, IDisposable +{ + /// + /// Handles a captured request. 
+    /// </summary>
+    /// <param name="request">The captured request.</param>
+    /// <returns>The corresponding response.</returns>
+    public delegate CapturedResponse RequestHandlerDelegate(CapturedRequest request);
+
+    private RequestHandlerDelegate _handler;
+    private List<CapturedRequest> _requests;
+    private List<CapturedResponse> _responses;
+    private PipelineTransport? _transport;
+
+    /// <summary>
+    /// Creates a new instance.
+    /// </summary>
+    /// <param name="requestHandler">(Optional) The handler to use to generate responses. Default returns an empty
+    /// response body with HTTP 204.</param>
+    public MockHttpMessageHandler(RequestHandlerDelegate? requestHandler = null)
+    {
+        _handler = requestHandler ?? ReturnEmpty;
+        _requests = new List<CapturedRequest>();
+        _responses = new List<CapturedResponse>();
+    }
+
+    /// <summary>
+    /// Event raised when a request is received.
+    /// </summary>
+    public event EventHandler<CapturedRequest>? OnRequest;
+
+    /// <summary>
+    /// Event raised when a response is generated.
+    /// </summary>
+    public event EventHandler<CapturedResponse>? OnResponse;
+
+    /// <summary>
+    /// Gets the transport to pass to your System.ClientModel based clients.
+    /// </summary>
+    public PipelineTransport Transport => _transport ??= new HttpClientPipelineTransport(new HttpClient(this));
+
+    /// <summary>
+    /// All received requests.
+    /// </summary>
+    public IReadOnlyList<CapturedRequest> Requests => _requests;
+
+    /// <summary>
+    /// All generated responses.
+    /// </summary>
+    public IReadOnlyList<CapturedResponse> Responses => _responses;
+
+    /// <summary>
+    /// Default handler that always returns an empty JSON payload as the response with the correct headers set.
+    /// </summary>
+    /// <param name="request">The request.</param>
+    /// <returns>An empty successful JSON response.</returns>
+    public static CapturedResponse ReturnEmptyJson(CapturedRequest request)
+        => new()
+        {
+            Status = HttpStatusCode.OK,
+            ReasonPhrase = "OK",
+            Content = BinaryData.FromString("{}"),
+            Headers = new Dictionary<string, IEnumerable<string>>()
+            {
+                ["Content-Type"] = ["application/json"],
+                ["Content-Length"] = ["2"]
+            }
+        };
+
+    /// <summary>
+    /// Default handler that returns an empty HTTP 204 payload.
+    /// </summary>
+    /// <param name="request">The request.</param>
+    /// <returns>An HTTP 204 empty response.</returns>
+    public static CapturedResponse ReturnEmpty(CapturedRequest request)
+        => new() { Status = HttpStatusCode.NoContent };
+
+    private HttpResponseMessage HandleRequest(HttpRequestMessage request, CancellationToken token)
+    {
+        try
+        {
+            CapturedRequest capturedRequest = new(request);
+            OnRequest?.Invoke(this, capturedRequest);
+            _requests.Add(capturedRequest);
+
+            CapturedResponse capturedResponse = _handler(capturedRequest);
+            OnResponse?.Invoke(this, capturedResponse);
+            _responses.Add(capturedResponse);
+
+            return capturedResponse.ToResponse();
+        }
+        catch (Exception ex)
+        {
+            throw new ClientResultException("Failed to process request", null, ex);
+        }
+    }
+
+    #region HttpMessageHandler implementation
+
+#if NET
+    override
+#endif
+    protected HttpResponseMessage Send(HttpRequestMessage request, CancellationToken cancellationToken)
+        => HandleRequest(request, cancellationToken);
+
+    protected override Task<HttpResponseMessage> SendAsync(HttpRequestMessage request, CancellationToken cancellationToken)
+        => Task.FromResult(HandleRequest(request, cancellationToken));
+
+    #endregion
+}
diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/Mocks/MockPageCollection.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Mocks/MockPageCollection.cs
new file mode 100644
index 000000000..1f08987ae
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Mocks/MockPageCollection.cs
@@ -0,0 +1,71 @@
+// Copyright (c) Microsoft Corporation. All rights reserved.
+// Licensed under the MIT License.
+
+using System.ClientModel;
+using System.ClientModel.Primitives;
+
+namespace OpenAI.TestFramework.Mocks;
+
+/// <summary>
+/// Represents a mock implementation of the <see cref="PageCollection{TValue}"/> class.
+/// </summary>
+/// <typeparam name="TValue">The type of the values in the collection.</typeparam>
+public class MockPageCollection<TValue> : PageCollection<TValue>
+{
+    private readonly Func<IEnumerable<TValue>> _enumerateFunc;
+    private readonly PipelineResponse _response;
+    private readonly int _itemsPerPage;
+    private PageResult<TValue>? _currentPage;
+
+    /// <summary>
+    /// Creates a new instance.
+    /// </summary>
+    /// <param name="enumerateFunc">The function used to enumerate the collection.</param>
+    /// <param name="response">The pipeline response.</param>
+    /// <param name="itemsPerPage">(Optional) The number of items per page.</param>
+    public MockPageCollection(Func<IEnumerable<TValue>> enumerateFunc, PipelineResponse response, int itemsPerPage = 5)
+    {
+        if (itemsPerPage <= 0)
+        {
+            throw new ArgumentOutOfRangeException(nameof(itemsPerPage));
+        }
+
+        _enumerateFunc = enumerateFunc ?? throw new ArgumentNullException(nameof(enumerateFunc));
+        _response = response;
+        _itemsPerPage = itemsPerPage;
+    }
+
+    /// <inheritdoc />
+    protected override PageResult<TValue> GetCurrentPageCore()
+        => _currentPage ?? throw new InvalidOperationException("Please call MoveNextAsync first.");
+
+    /// <inheritdoc />
+    protected override IEnumerator<PageResult<TValue>> GetEnumeratorCore()
+    {
+        List<TValue> items = new(_itemsPerPage);
+        int pageStart = 0;
+        int rolling = 0;
+
+        foreach (TValue item in _enumerateFunc())
+        {
+            items.Add(item);
+            rolling++;
+            if (items.Count == _itemsPerPage)
+            {
+                _currentPage = PageResult<TValue>.Create(items, ToContinuation(pageStart), ToContinuation(rolling), _response);
+                yield return _currentPage;
+                // Allocate a fresh list rather than clearing, so the page just yielded keeps its items
+                items = new(_itemsPerPage);
+                pageStart = rolling;
+            }
+        }
+
+        if (items.Count > 0)
+        {
+            _currentPage = PageResult<TValue>.Create(items, ToContinuation(pageStart), null, _response);
+            yield return _currentPage;
+        }
+    }
+
+    private static ContinuationToken ToContinuation(int offset)
+        => ContinuationToken.FromBytes(BinaryData.FromBytes(BitConverter.GetBytes(offset)));
+}
diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/Mocks/MockPipelineResponse.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Mocks/MockPipelineResponse.cs
new file mode 100644
index 000000000..1ade396df
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Mocks/MockPipelineResponse.cs
@@ -0,0 +1,88 @@
+// Copyright (c) Microsoft Corporation. All rights reserved.
+// Licensed under the MIT License.
+
+using System.ClientModel.Primitives;
+
+namespace OpenAI.TestFramework.Mocks;
+
+/// <summary>
+/// A mock implementation of a pipeline response.
+/// </summary>
+public class MockPipelineResponse : PipelineResponse
+{
+    private Stream? _contentStream;
+    private BinaryData? _buffered;
+
+    /// <summary>
+    /// Creates a new instance.
+    /// </summary>
+    /// <param name="status">(Optional) The HTTP status.</param>
+    /// <param name="reasonPhrase">(Optional) The HTTP reason phrase.</param>
+    /// <param name="content">(Optional) The HTTP response body content.</param>
+    public MockPipelineResponse(
+        int? status = null,
+        string? reasonPhrase = null,
+        BinaryData? content = null)
+    {
+        Status = status ?? 200;
+        ReasonPhrase = reasonPhrase ?? "OK";
+        ContentStream = content?.ToStream();
+        // Set the buffered content after ContentStream: the ContentStream setter clears any buffered content
+        _buffered = content;
+        HeadersCore = new MockResponseHeaders();
+    }
+
+    /// <inheritdoc />
+    public override int Status { get; }
+
+    /// <inheritdoc />
+    public override string ReasonPhrase { get; }
+
+    /// <inheritdoc />
+    public override Stream? ContentStream
+    {
+        get => _contentStream;
+        set
+        {
+            _contentStream = value;
+            _buffered = null;
+        }
+    }
+
+    /// <inheritdoc />
+    public override BinaryData Content => _buffered ?? throw new InvalidOperationException("Response content is not yet buffered");
+
+    /// <inheritdoc />
+    protected override PipelineResponseHeaders HeadersCore { get; }
+
+    /// <inheritdoc />
+    public override BinaryData BufferContent(CancellationToken cancellationToken = default)
+        => BufferContentSyncAsync(false, cancellationToken).GetAwaiter().GetResult();
+
+    /// <inheritdoc />
+    public override ValueTask<BinaryData> BufferContentAsync(CancellationToken cancellationToken = default)
+        => BufferContentSyncAsync(true, cancellationToken);
+
+    /// <inheritdoc />
+    public override void Dispose()
+    {
+        ContentStream?.Dispose();
+    }
+
+    private async ValueTask<BinaryData> BufferContentSyncAsync(bool isAsync, CancellationToken token)
+    {
+        if (_buffered != null)
+        {
+            return _buffered;
+        }
+
+        _buffered = ContentStream == null
+            ? BinaryData.FromBytes(Array.Empty<byte>())
+            : isAsync
+                ? await BinaryData.FromStreamAsync(ContentStream, token).ConfigureAwait(false)
+                : BinaryData.FromStream(ContentStream);
+
+        ContentStream?.Dispose();
+        ContentStream = _buffered.ToStream();
+        return _buffered;
+    }
+}
diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/Mocks/MockRequestHeaders.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Mocks/MockRequestHeaders.cs
new file mode 100644
index 000000000..e03c4cd18
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Mocks/MockRequestHeaders.cs
@@ -0,0 +1,38 @@
+// Copyright (c) Microsoft Corporation. All rights reserved.
+// Licensed under the MIT License.
+
+using System.ClientModel.Primitives;
+
+namespace OpenAI.TestFramework.Mocks;
+
+/// <summary>
+/// Mock implementation of request headers.
+/// </summary>
+public class MockRequestHeaders : PipelineRequestHeaders
+{
+    private MockHeaders _headers = new();
+
+    /// <inheritdoc />
+    public override void Add(string name, string value)
+        => _headers.Add(name, value);
+
+    /// <inheritdoc />
+    public override bool Remove(string name)
+        => _headers.Remove(name);
+
+    /// <inheritdoc />
+    public override void Set(string name, string value)
+        => _headers.Set(name, value);
+
+    /// <inheritdoc />
+    public override IEnumerator<KeyValuePair<string, string>> GetEnumerator()
+        => _headers.GetEnumerator();
+
+    /// <inheritdoc />
+    public override bool TryGetValue(string name, out string? value)
+        => _headers.TryGetValue(name, out value);
+
+    /// <inheritdoc />
+    public override bool TryGetValues(string name, out IEnumerable<string>? values)
+        => _headers.TryGetValues(name, out values);
+}
diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/Mocks/MockResponseHeaders.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Mocks/MockResponseHeaders.cs
new file mode 100644
index 000000000..aead0b4b8
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Mocks/MockResponseHeaders.cs
@@ -0,0 +1,26 @@
+// Copyright (c) Microsoft Corporation. All rights reserved.
+// Licensed under the MIT License.
+
+using System.ClientModel.Primitives;
+
+namespace OpenAI.TestFramework.Mocks;
+
+/// <summary>
+/// Mock implementation of response headers.
+/// </summary>
+public class MockResponseHeaders : PipelineResponseHeaders
+{
+    private MockHeaders _headers = new();
+
+    /// <inheritdoc />
+    public override IEnumerator<KeyValuePair<string, string>> GetEnumerator()
+        => _headers.GetEnumerator();
+
+    /// <inheritdoc />
+    public override bool TryGetValue(string name, out string? value)
+        => _headers.TryGetValue(name, out value);
+
+    /// <inheritdoc />
+    public override bool TryGetValues(string name, out IEnumerable<string>? values)
+        => _headers.TryGetValues(name, out values);
+}
diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/Mocks/MockRestService.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Mocks/MockRestService.cs
new file mode 100644
index 000000000..58420f679
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Mocks/MockRestService.cs
@@ -0,0 +1,413 @@
+// Copyright (c) Microsoft Corporation. All rights reserved.
+// Licensed under the MIT License.
+
+using System.Collections.Concurrent;
+using System.Net;
+using System.Net.Sockets;
+using System.Text.Json;
+using OpenAI.TestFramework.Utils;
+
+namespace OpenAI.TestFramework.Mocks;
+
+/// <summary>
+/// Represents a mock REST service for testing purposes.
+/// </summary>
+/// <typeparam name="TData">The type of data stored in the service.</typeparam>
+public class MockRestService<TData> : IDisposable
+{
+    /// <summary>
+    /// Represents an entry in the mock REST service.
+    /// </summary>
+    /// <param name="id">The ID of the entry.</param>
+    /// <param name="data">The data associated with the entry.</param>
+    public record Entry(string id, TData data)
+    {
+#if NETFRAMEWORK
+        public Entry() : this(string.Empty, default!)
+        {
+            // .NET Framework System.Text.Json cannot deserialize records without a parameterless constructor
+        }
+#endif
+    };
+
+    /// <summary>
+    /// Represents an error in the mock REST service.
+    /// </summary>
+    /// <param name="error">The error code.</param>
+    /// <param name="message">The error message.</param>
+    /// <param name="stack">The stack trace of the error.</param>
+    public record Error(int error, string message, string? stack = null);
+
+    private static readonly JsonSerializerOptions s_options = new()
+    {
+        WriteIndented = true,
+#pragma warning disable SYSLIB0020
+        IgnoreNullValues = true
+#pragma warning restore SYSLIB0020
+    };
+
+    private ConcurrentDictionary<string, TData> _data;
+    private HttpListener _listener;
+    private CancellationTokenSource _cts;
+    private Task _workerTask;
+
+    /// <summary>
+    /// Initializes a new instance of the <see cref="MockRestService{TData}"/> class.
+    /// </summary>
+    /// <param name="basePath">(Optional) The base path of the service.</param>
+    /// <param name="port">(Optional) The port number to listen on. If set to 0, a port will be automatically selected.</param>
+    public MockRestService(string?
basePath = null, ushort port = 0) + { + _data = new(); + basePath = basePath?.EnsureEndsWith("/"); + + int maxAttempts = port == 0 ? 15 : 1; + Exception? ex = null; + for (int i = 0; _listener == null && i < maxAttempts; i++) + { + _listener = TryStartListener(basePath ?? string.Empty, port, out ex)!; + } + + if (_listener == null || ex != null) + { + throw new ApplicationException("Failed to start the mock rest service", ex); + } + + HttpEndpoint = TerminatePathWithSlash(new Uri(_listener.Prefixes.First())); + _cts = new(); + _workerTask = Task.Run(() => WorkerAsync(_cts.Token), _cts.Token); + } + + /// + /// Gets the HTTP endpoint of the mock REST service. + /// + public Uri HttpEndpoint { get; } + + /// + /// Gets all entries in the mock REST service. + /// + /// An enumerable collection of entries. + public virtual IEnumerable GetAll() + => _data.Select(kvp => new Entry(kvp.Key, kvp.Value)); + + /// + /// Tries to get an entry from the mock REST service. + /// + /// The ID of the entry to get. + /// When this method returns, contains the entry associated with the specified ID, if found; otherwise, null. + /// true if the entry was found; otherwise, false. + public virtual bool TryGet(string id, out Entry? entry) + { + if (_data.TryGetValue(id, out TData? value)) + { + entry = new(id, value); + return true; + } + + entry = null; + return false; + } + + /// + /// Tries to add an entry to the mock REST service. + /// + /// The ID of the entry to add. + /// The data associated with the entry. + /// When this method returns, contains the added entry, if successful; otherwise, null. + /// true if the entry was added successfully; otherwise, false. + public virtual bool TryAdd(string id, TData data, out Entry? entry) + { + entry = null; + + if (_data.TryAdd(id, data)) + { + entry = new(id, data); + return true; + } + + return false; + } + + /// + /// Tries to delete an entry from the mock REST service. + /// + /// The ID of the entry to delete. 
+ /// true if the entry was deleted successfully; otherwise, false. + public virtual bool TryDelete(string id) + => _data.TryRemove(id, out _); + + /// + /// Tries to update an entry in the mock REST service. + /// + /// The ID of the entry to update. + /// The updated data for the entry. + /// When this method returns, contains the updated entry, if successful; otherwise, null. + /// true if the entry was updated successfully; otherwise, false. + public virtual bool TryUpdate(string id, TData data, out Entry? entry) + { + _data[id] = data; + entry = new(id, data); + return true; + } + + /// + /// Resets the mock REST service removing all entries. + /// + public virtual void Reset() + => _data.Clear(); + + /// + /// Disposes of the resources used by the mock REST service. + /// + public void Dispose() + { + _cts.Cancel(); + _listener.Stop(); + try { _workerTask.Wait(500); } catch { } + _listener.Close(); + _cts.Dispose(); + } + + /// + /// Worker method that handles incoming HTTP requests. + /// + /// The cancellation token. + protected virtual async Task WorkerAsync(CancellationToken token) + { + while (!token.IsCancellationRequested) + { + HttpListenerContext context = await _listener.GetContextAsync().ConfigureAwait(false); + HttpListenerRequest request = context.Request; + HttpListenerResponse response = context.Response; + + if (request == null || request.Url == null) + { + context.Response?.Abort(); + continue; + } + + try + { + response.ContentLength64 = 0; + + string? id = GetId(HttpEndpoint, request.Url); + switch (request.HttpMethod.ToUpperInvariant()) + { + case "GET": + if (id == null) + { + // Send down all data + IEnumerable allData = GetAll(); + WriteJsonResponse(response, 200, allData); + } + else if (TryGet(id, out Entry? 
entry) && entry != null) + { + WriteJsonResponse(response, 200, entry); + } + else + { + response.StatusCode = (int)HttpStatusCode.NotFound; + } + break; + + case "POST": + if (id == null) + { + response.StatusCode = (int)HttpStatusCode.BadRequest; + } + else + { + TData? data = ReadBody(request); + if (data == null) + { + response.StatusCode = (int)HttpStatusCode.GatewayTimeout; + } + else if (TryAdd(id, data, out Entry? entry)) + { + if (entry == null) + { + response.StatusCode = (int)HttpStatusCode.NoContent; + } + else + { + WriteJsonResponse(response, 200, entry); + } + } + else + { + response.StatusCode = (int)HttpStatusCode.Conflict; + } + } + break; + + case "PUT": + if (id == null) + { + response.StatusCode = (int)HttpStatusCode.BadRequest; + } + else + { + TData? data = ReadBody(request); + if (data == null) + { + response.StatusCode = (int)HttpStatusCode.GatewayTimeout; + } + else if (TryUpdate(id, data, out Entry? entry)) + { + if (entry == null) + { + response.StatusCode = (int)HttpStatusCode.NoContent; + } + else + { + WriteJsonResponse(response, 200, entry); + } + } + else + { + response.StatusCode = (int)HttpStatusCode.NotFound; + response.ContentLength64 = 0; + } + } + break; + + case "DELETE": + response.ContentLength64 = 0; + if (id == null) + { + response.StatusCode = (int)HttpStatusCode.BadRequest; + } + else if (TryDelete(id)) + { + response.StatusCode = (int)HttpStatusCode.NoContent; + } + else + { + response.StatusCode = (int)HttpStatusCode.NotFound; + } + break; + + default: + response.StatusCode = (int)HttpStatusCode.MethodNotAllowed; + break; + } + + response.Close(); + } + catch (Exception ex) + { + response.StatusCode = (int)HttpStatusCode.InternalServerError; + try + { + if (response.OutputStream.Length > 0 || response.OutputStream.CanSeek) + { + response.OutputStream.SetLength(0); + } + + if (response.OutputStream.Length == 0) + { + WriteJsonResponse( + response, + (int)HttpStatusCode.InternalServerError, + new Error( + 500, + 
ex.Message +#if DEBUG + , ex.StackTrace +#endif + )); + } + } + catch { /* we tried */ } + } + } + } + + private static ushort GetFreePort() + { + TcpListener? listener = null; + try + { + listener = new TcpListener(IPAddress.Loopback, 0); + listener.Start(); + return (ushort)((IPEndPoint)listener.LocalEndpoint).Port; + } + finally + { + listener?.Stop(); + } + } + + private static HttpListener? TryStartListener(string basePath, ushort port, out Exception? ex) + { + if (port == 0) + { + port = GetFreePort(); + } + + HttpListener? listener = null; + try + { + listener = new(); + listener.Prefixes.Add($"http://localhost:{port}/{basePath}"); + listener.Start(); + ex = null; + return listener; + } + catch (Exception e) + { + listener?.Close(); + ex = e; + return null; + } + } + + private static Uri TerminatePathWithSlash(Uri uri) + { + if (uri.IsAbsoluteUri) + { + if (!uri.AbsolutePath.EndsWith("/")) + { + UriBuilder builder = new(uri); + builder.Path += '/'; + return builder.Uri; + } + } + else if (!uri.OriginalString.EndsWith("/")) + { + return new Uri(uri.OriginalString + '/', UriKind.RelativeOrAbsolute); + } + + return uri; + } + + private static string? GetId(Uri baseUri, Uri requestUri) + { + Uri normalizedRequestUri = TerminatePathWithSlash(requestUri); + Uri relative = baseUri.MakeRelativeUri(normalizedRequestUri); + return relative.OriginalString.Split(["/"], StringSplitOptions.RemoveEmptyEntries).FirstOrDefault(); + } + + private static TData? 
ReadBody(HttpListenerRequest request) + { + if (request.ContentLength64 == 0) + { + return default; + } + + return JsonHelpers.Deserialize(request.InputStream, s_options); + } + + private static void WriteJsonResponse(HttpListenerResponse response, int status, T data) + { + response.StatusCode = status; + + using MemoryStream buffer = new(); + JsonHelpers.Serialize(buffer, data, s_options); + buffer.Seek(0, SeekOrigin.Begin); + + response.ContentType = "application/json"; + response.ContentLength64 = buffer.Length; + buffer.CopyTo(response.OutputStream); + } +} diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/Mocks/MockRestServiceClient.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Mocks/MockRestServiceClient.cs new file mode 100644 index 000000000..a3fb851d3 --- /dev/null +++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Mocks/MockRestServiceClient.cs @@ -0,0 +1,274 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + + +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Globalization; +using System.Net.Http; +using OpenAI.TestFramework.Utils; + +namespace OpenAI.TestFramework.Mocks; + +/// +/// A client for . +/// +/// The type of data used by the client. +public class MockRestServiceClient : IDisposable +{ + private ClientPipeline _pipeline; + private Uri _baseUri; + + /// + /// Only used to generate a dynamic proxy for testing. Do not use this yourself. + /// + internal MockRestServiceClient() + { + _pipeline = null!; + _baseUri = null!; + } + + /// + /// Initializes a new instance of the class with the specified service URI and options. + /// + /// The service URI. + /// The client pipeline options. + public MockRestServiceClient(Uri serviceUri, ClientPipelineOptions? options = null) + { + _pipeline = ClientPipeline.Create(options); + _baseUri = serviceUri ?? 
throw new ArgumentNullException(nameof(serviceUri)); + } + + /// + /// Adds data asynchronously to the service with the specified ID. + /// + /// The ID of the data. + /// The data to add. + /// The cancellation token. + /// A task representing the asynchronous operation. + public virtual Task AddAsync(string id, TData data, CancellationToken token = default) + { + if (string.IsNullOrWhiteSpace(id)) + throw new ArgumentException("Value cannot be null or whitespace.", nameof(id)); + + ValidateData(data); + return SendSyncOrAsync(true, HttpMethod.Post, id, data, token).AsTask(); + } + + /// + /// Adds data synchronously to the service with the specified ID. + /// + /// The ID of the data. + /// The data to add. + /// The cancellation token. + /// The result of the operation. + public virtual ClientResult Add(string id, TData data, CancellationToken token = default) + { + if (string.IsNullOrWhiteSpace(id)) + throw new ArgumentException("Value cannot be null or whitespace.", nameof(id)); + + ValidateData(data); + return SendSyncOrAsync(false, HttpMethod.Post, id, data, token).GetAwaiter().GetResult(); + } + + /// + /// Gets data asynchronously from the service with the specified ID. Will return null if the data does not exist. + /// + /// The ID of the data. + /// The cancellation token. + /// A task representing the asynchronous operation. 
+ public virtual async Task> GetAsync(string id, CancellationToken token = default) + { + if (string.IsNullOrWhiteSpace(id)) + throw new ArgumentException("Value cannot be null or whitespace.", nameof(id)); + + try + { + ClientResult result = await SendSyncOrAsync(true, HttpMethod.Get, id, default, token) + .ConfigureAwait(false); + + var response = result.GetRawResponse(); + return ClientResult.FromOptionalValue( + response.Content.ToObjectFromJson.Entry>().data, + response); + } + catch (ClientResultException ex) + { + if (ex.GetRawResponse()?.Status == 404) + { + return ClientResult.FromOptionalValue(default, ex.GetRawResponse()!); + } + + throw; + } + } + + /// + /// Gets data synchronously from the service with the specified ID. Will return null if the data does not exist. + /// + /// The ID of the data. + /// The cancellation token. + /// The result of the operation. + public virtual ClientResult Get(string id, CancellationToken token = default) + { + if (string.IsNullOrWhiteSpace(id)) + throw new ArgumentException("Value cannot be null or whitespace.", nameof(id)); + + try + { + ClientResult result = SendSyncOrAsync(false, HttpMethod.Get, id, default, token).GetAwaiter().GetResult(); + var response = result.GetRawResponse(); + return ClientResult.FromOptionalValue( + response.Content.ToObjectFromJson.Entry>().data, + response); + } + catch (ClientResultException ex) + { + if (ex.GetRawResponse()?.Status == 404) + { + return ClientResult.FromOptionalValue(default, ex.GetRawResponse()!); + } + + throw; + } + } + + /// + /// Removes data asynchronously from the service with the specified ID. + /// + /// The ID of the data. + /// The cancellation token. + /// A task representing the asynchronous operation. 
+ public virtual async Task> RemoveAsync(string id, CancellationToken token = default) + { + if (string.IsNullOrWhiteSpace(id)) + throw new ArgumentException("Value cannot be null or whitespace.", nameof(id)); + + try + { + ClientResult result = await SendSyncOrAsync(true, HttpMethod.Delete, id, default, token); + return ClientResult.FromValue(true, result.GetRawResponse()); + } + catch (ClientResultException ex) + { + if (ex.GetRawResponse()?.Status == 404) + { + return ClientResult.FromValue(false, ex.GetRawResponse()!); + } + + throw; + } + } + + /// + /// Removes data synchronously from the service with the specified ID. + /// + /// The ID of the data. + /// The cancellation token. + /// The result of the operation. + public virtual ClientResult Remove(string id, CancellationToken token = default) + { + if (string.IsNullOrWhiteSpace(id)) + throw new ArgumentException("Value cannot be null or whitespace.", nameof(id)); + + try + { + ClientResult result = SendSyncOrAsync(false, HttpMethod.Delete, id, default, token).GetAwaiter().GetResult(); + return ClientResult.FromValue(true, result.GetRawResponse()); + } + catch (ClientResultException ex) + { + if (ex.GetRawResponse()?.Status == 404) + { + return ClientResult.FromValue(false, ex.GetRawResponse()!); + } + + throw; + } + } + + /// + /// Disposes of the resources used by the client. + /// + public virtual void Dispose() + { + // no obvious way to dispose of the pipeline, nor the inner transport + } + + /// + /// Validates the data before sending it to the service. + /// + /// The data to validate. + protected virtual void ValidateData(TData? data) + { + if (data == null) + { + throw new ArgumentNullException(nameof(data)); + } + } + + /// + /// Sends the request to the service synchronously or asynchronously. This will serialize the passed in data to JSON using the default + /// serializer. + /// + /// Indicates whether the request should be sent asynchronously. + /// The HTTP method. + /// The ID of the data. 
+ /// The data to send. + /// The cancellation token. + /// The result of the operation. + protected async ValueTask SendSyncOrAsync(bool isAsync, HttpMethod method, string? id, TData? data, CancellationToken token) + { + UriBuilder builder = new(_baseUri); + if (id != null) + { + builder.Path += id; + } + + PipelineMessage message = _pipeline.CreateMessage(); + message.Request.Method = method.Method; + message.Request.Uri = builder.Uri; + message.Apply(new RequestOptions() + { + CancellationToken = token, + BufferResponse = true + }); + + if (data == null) + { + message.Request.Headers.Set("Content-Length", "0"); + } + else + { + using MemoryStream stream = new(); + JsonHelpers.Serialize(stream, data); + var binaryData = BinaryData.FromBytes(new ReadOnlyMemory(stream.GetBuffer(), 0, (int)stream.Length)); + + message.Request.Headers.Set("Content-Length", stream.Length.ToString(CultureInfo.InvariantCulture)); + message.Request.Headers.Set("Content-Type", "application/json"); + message.Request.Content = BinaryContent.Create(binaryData); + } + + if (isAsync) + { + await _pipeline.SendAsync(message).ConfigureAwait(false); + } + else + { + _pipeline.Send(message); + } + + if (message.Response?.IsError == true) + { + if (message.Response.Content?.ToMemory().Length > 0) + { + var error = message.Response.Content.ToObjectFromJson.Error>(); + throw new ClientResultException($"Error {error.error}: {error.message}", message.Response); + } + + throw new ClientResultException(message.Response); + } + + return ClientResult.FromResponse(message.Response!); + } +} diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/OpenAI.TestFramework.csproj b/.dotnet.azure/sdk/openai/tools/TestFramework/src/OpenAI.TestFramework.csproj new file mode 100644 index 000000000..dc92fd798 --- /dev/null +++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/OpenAI.TestFramework.csproj @@ -0,0 +1,39 @@ + + + + $(RequiredTargetFrameworks) + enable + enable + latest + + + + + 
Utils\Polyfill\%(RecursiveDir)\%(Filename).cs + + + + + + + + + + + + + + 0024000004800000940000000602000000240000525341310004000001000100c547cac37abd99c8db225ef2f6c8a3602f3b3606cc9891605d02baa56104f4cfc0734aa39b93bf7852f7d9266654753cc297e7d2edfe0bac1cdcf9f717241550e0a7b191195b7667bb4f64bcb8e2121380fd1d9d46ad2d92d2d15605093924cceaf74c4861eff62abf69b9291ed0a340e113be11e6a7d3113e92484cf7045cc7 + + + + + + + <_Parameter1>TestProxyPath + <_Parameter2>$(NuGetPackageRoot)\azure.sdk.tools.testproxy\$(TestProxyVersion)\tools\net6.0\any\Azure.Sdk.Tools.TestProxy.dll + + + + diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/RecordedClientTestBase.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/RecordedClientTestBase.cs new file mode 100644 index 000000000..27d6ca2e7 --- /dev/null +++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/RecordedClientTestBase.cs @@ -0,0 +1,423 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +using System.ClientModel.Primitives; +using System.Diagnostics; +using System.Net; +using System.Text; +using NUnit.Framework; +using NUnit.Framework.Internal; +using OpenAI.TestFramework.Recording; +using OpenAI.TestFramework.Recording.Proxy; +using OpenAI.TestFramework.Recording.Proxy.Service; +using OpenAI.TestFramework.Recording.RecordingProxy; +using OpenAI.TestFramework.Utils; + +namespace OpenAI.TestFramework; + +/// +/// Base class for client test cases that supports recording and playback of HTTP/HTTPS REST requests. This recording +/// support is provided by use of the Test Proxy . +/// This provides the basic framework to start the Test Proxy, create a recording for a test or playback a recording +/// for a test. It also provides support for automatic testing of async and sync versions of methods (see +/// for more details). 
+/// </summary>
+[NonParallelizable]
+public abstract class RecordedClientTestBase : ClientTestBase
+{
+    /// <summary>
+    /// Invalid characters that will be removed from test names when creating recordings.
+    /// </summary>
+    /// <remarks>
+    /// Using Windows version as it is the most restrictive of all platforms:
+    /// </remarks>
+    protected static readonly ISet<char> s_invalidChars = new HashSet<char>()
+    {
+        '\"', '<', '>', '|', '\0',
+        (char)1, (char)2, (char)3, (char)4, (char)5, (char)6, (char)7, (char)8, (char)9, (char)10,
+        (char)11, (char)12, (char)13, (char)14, (char)15, (char)16, (char)17, (char)18, (char)19, (char)20,
+        (char)21, (char)22, (char)23, (char)24, (char)25, (char)26, (char)27, (char)28, (char)29, (char)30,
+        (char)31, ':', '*', '?', '\\', '/'
+    };
+
+    private DateTimeOffset _testStartTime;
+    private TestRecordingOptions _options;
+
+    /// <summary>
+    /// Creates a new instance.
+    /// </summary>
+    /// <param name="isAsync">True to run the async version of a test, false to run the sync version of a test.</param>
+    public RecordedClientTestBase(bool isAsync) : this(isAsync, null)
+    { }
+
+    /// <summary>
+    /// Creates a new instance.
+    /// </summary>
+    /// <param name="isAsync">True to run the async version of a test, false to run the sync version of a test.</param>
+    /// <param name="mode">(Optional) The recorded test mode to use. If unset, the default recorded test mode will be used.</param>
+    public RecordedClientTestBase(bool isAsync, RecordedTestMode? mode = null) : base(isAsync)
+    {
+        _options = new TestRecordingOptions();
+        Mode = mode ?? GetDefaultRecordedTestMode();
+    }
+
+    /// <inheritdoc />
+    public override DateTimeOffset TestStartTime => _testStartTime;
+
+    /// <summary>
+    /// Gets the test proxy instance to use for the current test case.
+    /// </summary>
+    public ProxyService? Proxy { get; protected internal set; }
+
+    /// <summary>
+    /// Gets or sets the current recording mode for the test.
+    /// </summary>
+    public RecordedTestMode Mode { get; set; }
+
+    /// <summary>
+    /// Gets or sets the recording options to use for the current test. This will be pre-populated with a sensible configuration.
+ /// + public TestRecordingOptions RecordingOptions + { + get => _options; + set => _options = value ?? throw new ArgumentNullException(nameof(value)); + } + + /// + /// Gets the recording for the current test. + /// + public TestRecording? Recording { get; protected internal set; } + + /// + /// Gets the maximum amount of time to wait for starting/tearing down the test proxy, as well as the maximum amount of time + /// to wait for configuring a recording session, and then saving it or closing it. + /// + public virtual TimeSpan TestProxyWaitTime => Debugger.IsAttached + ? Default.DebuggerAttachedTestTimeout + : Default.TestProxyWaitTime; + + /// + /// Gets the test timeout. + /// + public override TimeSpan TestTimeout + { + get + { + if (Debugger.IsAttached) + { + return Default.DebuggerAttachedTestTimeout; + } + + switch (Mode) + { + default: + case RecordedTestMode.Record: + case RecordedTestMode.Live: + return TimeSpan.FromSeconds(60); + + case RecordedTestMode.Playback: + return Default.TestTimeout; + } + } + } + + /// + /// Determines whether or not to use Fiddler. If this is true, then the recording transport will be updated to use Fiddler + /// as the intermediary when talking to the test proxy, as well as accept the Fiddler root certificate. + /// + public virtual bool UseFiddler + { + get + { + // Check to see if Fiddler is already running and capturing traffic by checking to see if a proxy is configured for + // 127.0.0.1:8888 with no credentials + try + { + Uri dummyUri = new("https://not.a.real.uri.com"); + + IWebProxy webProxy = WebRequest.GetSystemWebProxy(); + Uri? 
proxyUri = webProxy?.GetProxy(dummyUri);
+                if (proxyUri == null || proxyUri == dummyUri)
+                {
+                    return false;
+                }
+
+                // assume default of 127.0.0.1:8888 with no credentials
+                var cred = webProxy?.Credentials?.GetCredential(dummyUri, string.Empty);
+                return proxyUri.Host == "127.0.0.1"
+                    && proxyUri.Port == 8888
+                    && string.IsNullOrWhiteSpace(cred?.UserName)
+                    && string.IsNullOrWhiteSpace(cred?.Password);
+            }
+            catch
+            {
+                return false;
+            }
+        }
+    }
+
+    ///
+    /// Checks if the recording has a recorded value for . If there is none, the
+    /// will be added and returned. Otherwise the existing value will be returned.
+    ///
+    /// The name of the value.
+    /// The value to add.
+    /// The existing value, or the newly added value.
+    /// If you called this function outside of a test run.
+    public string? GetOrAddRecordedValue(string name, string valueToAdd)
+        => GetOrAddRecordedValue(name, () => valueToAdd);
+
+    ///
+    /// Checks if the recording has a recorded value for . If there is none, a value will be created, added,
+    /// and returned. Otherwise the existing value will be returned.
+    ///
+    /// The name of the value.
+    /// The factory used to create the value.
+    /// The existing value, or the newly added value.
+    /// If you called this function outside of a test run.
+    public virtual string GetOrAddRecordedValue(string name, Func<string> valueFactory)
+    {
+        if (Recording == null)
+        {
+            throw new InvalidOperationException("Recorded value should not be retrieved outside the test method invocation");
+        }
+
+        return Recording.GetOrAddVariable(name, valueFactory);
+    }
+
+    ///
+    /// Starts the test proxy for the current test. This will be called once at the start of the test fixture.
+    ///
+    /// Asynchronous task.
+ [OneTimeSetUp] + public virtual async Task StartTestProxyAsync() + { + using CancellationTokenSource cts = new(TestProxyWaitTime); + + ProxyServiceOptions options = CreateProxyServiceOptions(); + Proxy = await ProxyService.CreateNewAsync(options, cts.Token).ConfigureAwait(false); + } + + [OneTimeTearDown] + public virtual Task StopTestProxyAsync() + { + Proxy?.Dispose(); + Proxy = null; + + //TODO FIXME: Do we need to do any cleanup here? + return Task.CompletedTask; + } + + /// + /// Starts the test proxy (if it has not already been started), and then configures the recording session for the current + /// test. This should also set the property to the new recording session. + /// + /// Asynchronous task. + [SetUp] + public virtual async Task StartTestRecordingAsync() + { + // Check if the current NUnit test method has a specific attribute applied to it + if (!IsCurrentTestRecorded()) + { + return; + } + + if (Proxy == null) + { + throw new InvalidOperationException("The proxy service was not set and/or started"); + } + + _testStartTime = DateTimeOffset.UtcNow; + + // TODO FIXME: Add logic to ignore certain tests here by throwing IgnoreException()? + + using CancellationTokenSource cts = new(TestProxyWaitTime); + Recording = await StartAndConfigureRecordingSessionAsync(Proxy, cts.Token).ConfigureAwait(false); + + // don't include test proxy overhead as part of the test time + _testStartTime = DateTimeOffset.UtcNow; + } + + /// + /// Stops a recording session for the current test. If the test passed and we are in recording mode, the recording will be saved, + /// otherwise it will be discarded. + /// + /// Asynchronous task. 
+ [TearDown] + public virtual async Task StopTestRecordingAsync() + { + if (!IsCurrentTestRecorded()) + { + return; + } + + bool testsPassed = TestContext.CurrentContext.Result.Outcome.Status == NUnit.Framework.Interfaces.TestStatus.Passed; + using CancellationTokenSource cts = new(TestProxyWaitTime); + + if (Recording != null) + { + await Recording.FinishAsync(testsPassed, cts.Token).ConfigureAwait(false); + } + } + + /// + /// Configures the client options for a System.ClientModel based service client. This will be used to configure the transport + /// such that all requests are routed to the test proxy during recording (for capture), and playback (for replaying captured + /// requests). + /// + /// The type of the client options. + /// The options to configure. + /// The configured client options. + /// The current recording mode is not supported. + /// There was no test recording configured for this test. + public virtual TClientOptions ConfigureClientOptions(TClientOptions options) + where TClientOptions : ClientPipelineOptions + { + if (!IsCurrentTestRecorded()) + { + return options; + } + + // If we are in playback, or record mode we should set the transport to the test proxy transport, except + // in the case where we've explicitly specified the transport ourselves in case we are doing some custom + // work. 
+        if (options.Transport != null)
+        {
+            return options;
+        }
+
+        switch (Mode)
+        {
+            case RecordedTestMode.Live:
+                // no need to do anything special
+                return options;
+
+            case RecordedTestMode.Record:
+                // continue
+                break;
+
+            case RecordedTestMode.Playback:
+                // force the use of a fixed retry with a short timeout
+                options.RetryPolicy = new TestClientRetryPolicy(delay: TimeSpan.FromMilliseconds(100));
+                break;
+
+            default:
+                throw new NotSupportedException("The following mode is not supported: " + Mode);
+        }
+
+        if (Recording == null)
+        {
+            throw new InvalidOperationException("Please call this from within a test method invocation");
+        }
+
+        ProxyTransportOptions transportOptions = Recording.GetProxyTransportOptions();
+        transportOptions.UseFiddler = UseFiddler;
+        if (_options.RequestOverride != null)
+        {
+            transportOptions.ShouldRecordRequest = _options.RequestOverride;
+        }
+
+        options.Transport = new ProxyTransport(transportOptions);
+        return options;
+    }
+
+    ///
+    /// Gets the default recorded test mode to use.
+    ///
+    /// The test mode to use.
+    protected virtual RecordedTestMode GetDefaultRecordedTestMode() => RecordedTestMode.Playback;
+
+    ///
+    /// Gets the name of the recording JSON file that contains the recording. This will be based on a sanitized version
+    /// of the test name, and "Async" will be automatically appended when running the asynchronous versions of tests.
+    ///
+    /// The name of the test to use.
+    protected virtual string GetRecordedTestFileName()
+    {
+        const string c_asyncSuffix = "Async";
+        TestContext.TestAdapter testAdapter = TestContext.CurrentContext.Test;
+
+        StringBuilder builder = new(testAdapter.Name.Length + c_asyncSuffix.Length);
+        foreach (char c in testAdapter.Name)
+        {
+            builder.Append(s_invalidChars.Contains(c) ?
'%' : c); + } + + if (IsAsync) + { + builder.Append(c_asyncSuffix); + } + + builder.Append(".json"); + + return builder.ToString(); + } + + /// + /// Configures a recording/playback session for the current test on the test proxy. This is called at the start of every test. + /// It is responsible for configuring all the necessary sanitizers, matchers, and transforms for the test proxy. + /// + /// The test proxy service to configure the recording session for. + /// The cancellation token to use. + /// The configured test recording session. + /// The test proxy service instance did not have a valid client configured. + /// The recording mode is not supported. + protected virtual async Task StartAndConfigureRecordingSessionAsync(ProxyService proxy, CancellationToken token) + { + var client = proxy.Client ?? throw new ArgumentNullException("Test proxy client was null"); + IDictionary? variables = null; + + ProxyClientResult result; + switch (Mode) + { + case RecordedTestMode.Live: + // nothing to see here + return new TestRecording(string.Empty, RecordedTestMode.Live, proxy); + + case RecordedTestMode.Playback: + var playbackResult = await client.StartPlaybackAsync(CreateRecordingSessionStartInfo(), token).ConfigureAwait(false); + variables = playbackResult.Value; + result = playbackResult; + break; + + case RecordedTestMode.Record: + result = await client.StartRecordingAsync(CreateRecordingSessionStartInfo(), token).ConfigureAwait(false); + break; + + default: + throw new NotSupportedException("Don't know how to handle recording mode: " + Mode); + } + + string? 
recordingId = result.RecordingId;
+        if (string.IsNullOrWhiteSpace(recordingId))
+        {
+            throw new InvalidOperationException("Recording test proxy did not return a recording ID");
+        }
+
+        TestRecording recording = new TestRecording(recordingId!, Mode, proxy, variables);
+        await recording.ApplyOptions(_options, token).ConfigureAwait(false);
+        return recording;
+    }
+
+    ///
+    /// Determines whether or not the current test should be recorded (or played back from a file).
+    ///
+    /// True to enable the use of the recording test proxy, false otherwise.
+    protected virtual bool IsCurrentTestRecorded()
+    {
+        return TestExecutionContext.CurrentContext.CurrentTest.GetCustomAttributes<RecordedTestAttribute>(true).Any();
+    }
+
+    ///
+    /// Creates the options used when starting a new instance of the test proxy service.
+    ///
+    /// The options to use.
+    protected abstract ProxyServiceOptions CreateProxyServiceOptions();
+
+    ///
+    /// Creates the information used to configure a recording/playback session for the current test on the test proxy.
+    ///
+    /// The information to use.
+    protected abstract RecordingStartInformation CreateRecordingSessionStartInfo();
+}
diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/RecordedTestAttribute.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/RecordedTestAttribute.cs
new file mode 100644
index 000000000..81d94268c
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/RecordedTestAttribute.cs
@@ -0,0 +1,18 @@
+// Copyright (c) Microsoft Corporation. All rights reserved.
+// Licensed under the MIT License.
+
+using NUnit.Framework;
+
+namespace OpenAI.TestFramework;
+
+///
+/// An attribute used to indicate that a test should be recorded (or played back from a file). When you inherit from
+/// in your test class, and add this attribute to your test function, and then
+/// make sure to call
+/// on the client options you use to configure a client, this should automatically enable the recording/playback
+/// functionality.
+/// +[AttributeUsage(AttributeTargets.Method, AllowMultiple = false, Inherited = true)] +public class RecordedTestAttribute : TestAttribute +{ +} diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/RecordedTestMode.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/RecordedTestMode.cs new file mode 100644 index 000000000..bc0371ccf --- /dev/null +++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/RecordedTestMode.cs @@ -0,0 +1,25 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +namespace OpenAI.TestFramework; + +/// +/// The recording mode. +/// +public enum RecordedTestMode +{ + /// + /// Talk to live services. No recording or playback is used. + /// + Live, + + /// + /// Record the test and overwrite any existing recordings. + /// + Record, + + /// + /// Playback the test from a recording. + /// + Playback, +} diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Condition.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Condition.cs new file mode 100644 index 000000000..6b0bdad00 --- /dev/null +++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Condition.cs @@ -0,0 +1,16 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +namespace OpenAI.TestFramework.Recording; + +/// +/// A condition used to evaluate whether or not a sanitizer should be applied. +/// +public class Condition +{ + /// Gets or sets the uri regex. + public string? UriRegex { get; set; } + + /// Header condition to apply. + public HeaderCondition? 
ResponseHeader { get; set; } +} diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/HeaderCondition.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/HeaderCondition.cs new file mode 100644 index 000000000..12d2ba55c --- /dev/null +++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/HeaderCondition.cs @@ -0,0 +1,15 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +namespace OpenAI.TestFramework.Recording; + +/// +/// Header condition to apply. +/// +public class HeaderCondition +{ + /// Gets or sets the key. + public string? Key { get; set; } + /// Gets or sets the value regex. + public string? ValueRegex { get; set; } +} diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Matchers/BaseMatcher.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Matchers/BaseMatcher.cs new file mode 100644 index 000000000..c4578d763 --- /dev/null +++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Matchers/BaseMatcher.cs @@ -0,0 +1,38 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +using System.Text.Json; +using System.Text.Json.Serialization; +using OpenAI.TestFramework.Utils; + +namespace OpenAI.TestFramework.Recording.Matchers; + +/// +/// The base class for matchers that are applied during a playback session to match an incoming request +/// to a recorded one. +/// +public abstract class BaseMatcher : IUtf8JsonSerializable +{ + /// + /// Creates a new instance. + /// + /// The type of this sanitizer (e.g. GeneralRegexSanitizer). + /// If the type was null. + protected BaseMatcher(string type) + { + Type = type ?? throw new ArgumentNullException(nameof(type)); + } + + /// + /// Gets the type of the matcher (e.g. BodilessMatcher). + /// + [JsonIgnore] + public string Type { get; } + + /// + public virtual void Write(Utf8JsonWriter writer, JsonSerializerOptions? 
options = null) + { + // By default use reflection based serialization + JsonSerializer.Serialize(writer, this, GetType(), Default.InnerRecordingJsonOptions); + } +} diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Matchers/CustomMatcher.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Matchers/CustomMatcher.cs new file mode 100644 index 000000000..80e0f1b6d --- /dev/null +++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Matchers/CustomMatcher.cs @@ -0,0 +1,42 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +namespace OpenAI.TestFramework.Recording.Matchers; + +/// +/// This matcher exposes the default matcher in a customizable way. Currently this merely includes enabling/disabling body match and +/// adding additional excluded headers. All optional settings are safely defaulted. This means that providing zero additional +/// configuration will produce a sanitizer that is functionally identical to the default. +/// +public class CustomMatcher() : BaseMatcher("CustomDefaultMatcher") +{ + /// + /// A comma separated list of additional headers that should be excluded during matching. "Excluded" headers are entirely ignored. + /// Unlike "ignored" headers, the presence (or lack of presence) of a header will not cause mismatch. + /// + public string? ExcludedHeaders { get; set; } + + /// + /// Should the body value be compared during lookup operations? + /// + public bool? CompareBodies { get; set; } + + /// + /// A comma separated list of additional headers that should be ignored during matching. Any headers that are "ignored" will not + /// do value comparison when matching. This means that if the recording has a header that isn't in the request, a test mismatch + /// exception will be thrown noting the lack of header in the request. This also applies if the header is present in the request + /// but not recording. + /// + public string? 
IgnoredHeaders { get; set; } + + /// + /// A comma separated list of query parameters that should be ignored during matching. + /// + public string? IgnoredQueryParameters { get; set; } + + /// + /// By default, the test-proxy does not sort query params before matching. Setting true will sort query params alphabetically + /// before comparing URI. + /// + public bool? IgnoreQueryOrdering { get; set; } +} diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Matchers/ExistingMatcher.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Matchers/ExistingMatcher.cs new file mode 100644 index 000000000..8d2cb4eab --- /dev/null +++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Matchers/ExistingMatcher.cs @@ -0,0 +1,35 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +using System.Text.Json; + +namespace OpenAI.TestFramework.Recording.Matchers; + +/// +/// Used for specifying the use of pre-existing matchers defined in the test proxy. +/// +/// The name of the existing matcher. +public class ExistingMatcher(string existingMatcherName) : BaseMatcher(existingMatcherName) +{ + private static ExistingMatcher? _bodiless = null; + private static ExistingMatcher? _headerless = null; + + /// + /// This matcher adjusts the "match" operation to EXCLUDE the body when matching a request to a recording's entries. + /// + public static ExistingMatcher Bodiless => _bodiless ??= new ExistingMatcher("BodilessMatcher"); + + /// + /// NOT RECOMMENDED. This matcher adjusts the "match" operation to ignore header differences when matching a request. + /// Be aware that wholly ignoring headers during matching might incur unexpected issues down the line. + /// + public static ExistingMatcher Headerless => _headerless ??= new ExistingMatcher("HeaderlessMatcher"); + + /// + public override void Write(Utf8JsonWriter writer, JsonSerializerOptions? 
options = null) + { + // Pre-existing matchers use an empty JSON object. + writer.WriteStartObject(); + writer.WriteEndObject(); + } +} diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Proxy/ProxyClient.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Proxy/ProxyClient.cs new file mode 100644 index 000000000..55660d990 --- /dev/null +++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Proxy/ProxyClient.cs @@ -0,0 +1,679 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Net.Http; +using System.Text.Json; +using OpenAI.TestFramework.Recording.Matchers; +using OpenAI.TestFramework.Recording.Proxy; +using OpenAI.TestFramework.Recording.Proxy.Service; +using OpenAI.TestFramework.Recording.Sanitizers; +using OpenAI.TestFramework.Recording.Transforms; +using OpenAI.TestFramework.Utils; + +namespace OpenAI.TestFramework.Recording.RecordingProxy; + +/// +/// A client for configuring the recording text proxy. Please see here for more information: +/// https://github.com/Azure/azure-sdk-tools/blob/main/tools/test-proxy/Azure.Sdk.Tools.TestProxy/README.md +/// +public class ProxyClient +{ + protected internal const string X_RECORDING_ID_HEADER = "x-recording-id"; + + private ProxyClientOptions _options; + private ClientPipeline _pipeline; + + /// + /// For testing only. + /// + internal ProxyClient() + { + _options = new(new Uri("http://localhost:0")); + _pipeline = ClientPipeline.Create(); + } + + /// + /// Creates a new instance. + /// + /// The options to use. + public ProxyClient(ProxyClientOptions options) + { + _options = options ?? throw new ArgumentNullException(nameof(options)); + _pipeline = ClientPipeline.Create(options); + } + + /// + /// Starts playback session of recordings. + /// + /// The configuration to use for starting playback. + /// The cancellation token to use. 
+ /// The result that includes any recorded variables. + public virtual ProxyClientResult> StartPlayback(RecordingStartInformation startInfo, CancellationToken token = default) + { + if (startInfo == null) + { + throw new ArgumentNullException(nameof(startInfo)); + } + + PipelineMessage message = CreateJsonRequest(HttpMethod.Post, "playback/start", startInfo, token); + return SendSyncOrAsync>(false, message, token).GetAwaiter().GetResult(); + } + + /// + /// Starts playback session of recordings asynchronously. + /// + /// The configuration to use for starting playback. + /// The cancellation token to use. + /// The result that includes any recorded variables. + public virtual async Task>> StartPlaybackAsync(RecordingStartInformation startInfo, CancellationToken token = default) + { + if (startInfo == null) + { + throw new ArgumentNullException(nameof(startInfo)); + } + + PipelineMessage message = CreateJsonRequest(HttpMethod.Post, "playback/start", startInfo, token); + return await SendSyncOrAsync>(true, message, token).ConfigureAwait(false); + } + + /// + /// Stops a playback session. + /// + /// The ID for the playback session to stop. + /// The cancellation token to use. + /// The client result. + public virtual ProxyClientResult StopPlayback(string recordingId, CancellationToken token = default) + { + if (string.IsNullOrWhiteSpace(recordingId)) + { + throw new ArgumentException("Recording ID cannot be null, empty, or white space only"); + } + + PipelineMessage message = CreateJsonRequest(HttpMethod.Post, "playback/stop", null, token, new() + { + [X_RECORDING_ID_HEADER] = recordingId, + }); + return SendSyncOrAsync(false, message, token).GetAwaiter().GetResult(); + } + + /// + /// Stops a playback session asynchronously. + /// + /// The ID for the playback session to stop. + /// The cancellation token to use. + /// The client result. 
+ public virtual async Task StopPlaybackAsync(string recordingId, CancellationToken token = default) + { + if (string.IsNullOrWhiteSpace(recordingId)) + { + throw new ArgumentException("Recording ID cannot be null, empty, or white space only"); + } + + PipelineMessage message = CreateJsonRequest(HttpMethod.Post, "playback/stop", null, token, new() + { + [X_RECORDING_ID_HEADER] = recordingId, + }); + return await SendSyncOrAsync(true, message, token).ConfigureAwait(false); + } + + /// + /// Starts a recording session. + /// + /// The configuration to use for the recording session. + /// The cancellation token to use. + /// The client result. + public virtual ProxyClientResult StartRecording(RecordingStartInformation startInfo, CancellationToken token = default) + { + if (startInfo == null) + { + throw new ArgumentNullException(nameof(startInfo)); + } + + PipelineMessage message = CreateJsonRequest(HttpMethod.Post, "record/start", startInfo, token); + return SendSyncOrAsync(false, message, token).GetAwaiter().GetResult(); + } + + /// + /// Starts a recording session asynchronously. + /// + /// The configuration to use for the recording session. + /// The cancellation token to use. + /// The client result. + public virtual async Task StartRecordingAsync(RecordingStartInformation startInfo, CancellationToken token = default) + { + if (startInfo == null) + { + throw new ArgumentNullException(nameof(startInfo)); + } + + PipelineMessage message = CreateJsonRequest(HttpMethod.Post, "record/start", startInfo, token); + return await SendSyncOrAsync(true, message, token).ConfigureAwait(false); + } + + /// + /// Stops a recording session. + /// + /// The identifier for the recording session. + /// (Optional) Any additional variables to include with the recording. + /// (Optional) Set this to true to turn off recording. + /// The cancellation token to use. + /// The client result. + public virtual ProxyClientResult StopRecording(string recordingId, IDictionary? 
variables = null, bool skipRecording = false, CancellationToken token = default) + { + if (string.IsNullOrWhiteSpace(recordingId)) + { + throw new ArgumentException("Recording ID cannot be null, empty, or white space only"); + } + + Dictionary additionalHeaders = new() + { + [X_RECORDING_ID_HEADER] = recordingId + }; + + if (skipRecording) + { + additionalHeaders["x-recording-skip"] = "request-response"; + } + + variables ??= new Dictionary(); + PipelineMessage message = CreateJsonRequest(HttpMethod.Post, "record/stop", variables, token, additionalHeaders); + return SendSyncOrAsync(false, message, token).GetAwaiter().GetResult(); + } + + /// + /// Stops a recording session asynchronously. + /// + /// The ID for the recording session to stop. + /// (Optional) Any additional variables to include with the recording. + /// (Optional) Set this to true to turn off recording. + /// The cancellation token to use. + /// The client result. + public virtual async Task StopRecordingAsync(string recordingId, IDictionary? variables = null, bool skipRecording = false, CancellationToken token = default) + { + if (string.IsNullOrWhiteSpace(recordingId)) + { + throw new ArgumentException("Recording ID cannot be null, empty, or white space only"); + } + + Dictionary additionalHeaders = new() + { + [X_RECORDING_ID_HEADER] = recordingId + }; + + if (skipRecording) + { + additionalHeaders["x-recording-skip"] = "request-response"; + } + + variables ??= new Dictionary(); + PipelineMessage message = CreateJsonRequest(HttpMethod.Post, "record/stop", variables, token, additionalHeaders); + return await SendSyncOrAsync(true, message, token).ConfigureAwait(false); + } + + /// + /// Sets options for the proxy. + /// + /// The identifier for the playback/recording session. + /// The options to set. + /// The cancellation token to use. + /// The client result. 
+ public virtual ProxyClientResult SetRecordingTransportOptions(string recordingId, ProxyServiceOptions options, CancellationToken token = default) + { + if (string.IsNullOrWhiteSpace(recordingId)) + { + throw new ArgumentException("Recording ID cannot be null, empty, or white space only"); + } + else if (options == null) + { + throw new ArgumentNullException(nameof(options)); + } + + PipelineMessage message = CreateJsonRequest(HttpMethod.Post, "admin/setrecordingoptions", options, token, new() + { + [X_RECORDING_ID_HEADER] = recordingId, + }); + return SendSyncOrAsync(false, message, token).GetAwaiter().GetResult(); + } + + /// + /// Sets options for the proxy asynchronously. + /// + /// The identifier for the playback/recording session. + /// The options to set. + /// The cancellation token to use. + /// The client result. + public virtual async Task SetRecordingTransportOptionsAsync(string recordingId, ProxyServiceOptions options, CancellationToken token = default) + { + if (string.IsNullOrWhiteSpace(recordingId)) + { + throw new ArgumentException("Recording ID cannot be null, empty, or white space only"); + } + else if (options == null) + { + throw new ArgumentNullException(nameof(options)); + } + + PipelineMessage message = CreateJsonRequest(HttpMethod.Post, "admin/setrecordingoptions", options, token, new() + { + [X_RECORDING_ID_HEADER] = recordingId, + }); + return await SendSyncOrAsync(true, message, token).ConfigureAwait(false); + } + + /// + /// Removes some pre-defined sanitizers to be used during recording/playback by specifying their IDs. + /// + /// The set of sanitizer IDs to remove. + /// (Optional) If specified, the sanitizers will be removed for a particular session only. + /// If null, the sanitizers will be removed globally on the test proxy. + /// The cancellation token to use. + /// The client result. + public virtual ProxyClientResult RemoveSanitizers(ISet sanitizerIds, string? 
recordingId = null, CancellationToken token = default)
+    {
+        if (sanitizerIds == null)
+        {
+            throw new ArgumentNullException(nameof(sanitizerIds));
+        }
+
+        Dictionary<string, string> headers = new();
+        if (recordingId != null)
+        {
+            headers[X_RECORDING_ID_HEADER] = recordingId;
+        }
+
+        PipelineMessage message = CreateJsonRequest(
+            HttpMethod.Post,
+            "admin/removesanitizers",
+            new SanitizerIdList() { Sanitizers = sanitizerIds.ToArray() },
+            token,
+            headers);
+        return SendSyncOrAsync(false, message, token).GetAwaiter().GetResult();
+    }
+
+    ///
+    /// Removes some pre-defined sanitizers to be used during recording/playback by specifying their IDs.
+    ///
+    /// The set of sanitizer IDs to remove.
+    /// (Optional) If specified, the sanitizers will be removed for a particular session only.
+    /// If null, the sanitizers will be removed globally on the test proxy.
+    /// The cancellation token to use.
+    /// The client result.
+    public virtual async Task<ProxyClientResult> RemoveSanitizersAsync(ISet<string> sanitizerIds, string? recordingId = null, CancellationToken token = default)
+    {
+        if (sanitizerIds == null)
+        {
+            throw new ArgumentNullException(nameof(sanitizerIds));
+        }
+
+        Dictionary<string, string> headers = new();
+        if (recordingId != null)
+        {
+            headers[X_RECORDING_ID_HEADER] = recordingId;
+        }
+
+        PipelineMessage message = CreateJsonRequest(
+            HttpMethod.Post,
+            "admin/removesanitizers",
+            new SanitizerIdList() { Sanitizers = sanitizerIds.ToArray() },
+            token,
+            headers);
+        return await SendSyncOrAsync(true, message, token).ConfigureAwait(false);
+    }
+
+    ///
+    /// Adds sanitizers for the recording test proxy.
+    ///
+    /// The sanitizers to add.
+    /// (Optional) If specified, the sanitizers will be added for a particular session only.
+    /// If null, the sanitizers will be added globally on the test proxy.
+    /// The cancellation token to use.
+    /// The client result with the set of sanitizer IDs added.
+    public virtual ProxyClientResult<IEnumerable<string>> AddSanitizers(IEnumerable sanitizers, string?
recordingId = null, CancellationToken token = default)
+    {
+        if (sanitizers == null)
+        {
+            throw new ArgumentNullException(nameof(sanitizers));
+        }
+
+        Dictionary<string, string> headers = new();
+        if (recordingId != null)
+        {
+            headers[X_RECORDING_ID_HEADER] = recordingId;
+        }
+
+        PipelineMessage message = CreateJsonRequest(HttpMethod.Post, "Admin/AddSanitizers", sanitizers, token, headers);
+        ProxyClientResult<SanitizerIdList> result = SendSyncOrAsync<SanitizerIdList>(false, message, token).GetAwaiter().GetResult();
+        return new ProxyClientResult<IEnumerable<string>>(
+            result.Value.Sanitizers ?? Array.Empty<string>(),
+            result.GetRawResponse());
+    }
+
+    ///
+    /// Adds sanitizers for the recording test proxy asynchronously.
+    ///
+    /// The sanitizers to add.
+    /// (Optional) If specified, the sanitizers will be added for a particular session only.
+    /// If null, the sanitizers will be added globally on the test proxy.
+    /// The cancellation token to use.
+    /// The client result with the set of sanitizer IDs added.
+    public virtual async Task<ProxyClientResult<IEnumerable<string>>> AddSanitizersAsync(IEnumerable sanitizers, string? recordingId = null, CancellationToken token = default)
+    {
+        if (sanitizers == null)
+        {
+            throw new ArgumentNullException(nameof(sanitizers));
+        }
+
+        Dictionary<string, string> headers = new();
+        if (recordingId != null)
+        {
+            headers[X_RECORDING_ID_HEADER] = recordingId;
+        }
+
+        PipelineMessage message = CreateJsonRequest(HttpMethod.Post, "Admin/AddSanitizers", sanitizers, token, headers);
+        ProxyClientResult<SanitizerIdList> result = await SendSyncOrAsync<SanitizerIdList>(true, message, token).ConfigureAwait(false);
+        return new ProxyClientResult<IEnumerable<string>>(
+            result.Value.Sanitizers ?? Array.Empty<string>(),
+            result.GetRawResponse());
+    }
+
+    ///
+    /// Sets the matcher to use.
+    ///
+    /// The matcher to use.
+    /// (Optional) If specified, the matcher will be set for a particular session only.
+    /// If null, the matcher will be set globally on the test proxy.
+    /// The cancellation token to use.
+    /// The client result.
+    public virtual ProxyClientResult SetMatcher(BaseMatcher matcher, string?
recordingId = null, CancellationToken token = default) + { + if (matcher == null) + { + throw new ArgumentNullException(nameof(matcher)); + } + + Dictionary headers = new() + { + ["x-abstraction-identifier"] = matcher.Type + }; + + if (recordingId != null) + { + headers[X_RECORDING_ID_HEADER] = recordingId; + } + + PipelineMessage message = CreateJsonRequest(HttpMethod.Post, "admin/setmatcher", matcher, token, headers); + return SendSyncOrAsync(false, message, token).GetAwaiter().GetResult(); + } + + /// + /// Sets the matcher to use asynchronously. + /// + /// The matcher to use. + /// (Optional) If specified, the matcher will be set for a particular session only. + /// If null, the matcher will be set globally on the test proxy. + /// The cancellation token to use. + /// The client result. + public virtual async Task SetMatcherAsync(BaseMatcher matcher, string? recordingId = null, CancellationToken token = default) + { + if (matcher == null) + { + throw new ArgumentNullException(nameof(matcher)); + } + + Dictionary headers = new() + { + ["x-abstraction-identifier"] = matcher.Type + }; + + if (recordingId != null) + { + headers[X_RECORDING_ID_HEADER] = recordingId; + } + + PipelineMessage message = CreateJsonRequest(HttpMethod.Post, "admin/setmatcher", matcher, token, headers); + return await SendSyncOrAsync(true, message, token).ConfigureAwait(false); + } + + /// + /// Adds a transform. + /// + /// The transform to add. + /// (Optional) If specified, the transform will be added for a particular session only. + /// If null, the transform will be added globally on the test proxy. + /// The cancellation token to use. + /// The client result. + public virtual ProxyClientResult AddTransform(BaseTransform transform, string? 
recordingId = null, CancellationToken token = default) + { + if (transform == null) + { + throw new ArgumentNullException(nameof(transform)); + } + + Dictionary headers = new() + { + ["x-abstraction-identifier"] = transform.Type + }; + + if (recordingId != null) + { + headers[X_RECORDING_ID_HEADER] = recordingId; + } + + PipelineMessage message = CreateJsonRequest(HttpMethod.Post, "admin/addtransform", transform, token, headers); + return SendSyncOrAsync(false, message, token).GetAwaiter().GetResult(); + } + + /// + /// Adds a transform asynchronously. + /// + /// The transform to add. + /// (Optional) If specified, the transform will be added for a particular session only. + /// If null, the transform will be added globally on the test proxy. + /// The cancellation token to use. + /// The client result. + public virtual async Task AddTransformAsync(BaseTransform transform, string? recordingId = null, CancellationToken token = default) + { + if (transform == null) + { + throw new ArgumentNullException(nameof(transform)); + } + + Dictionary headers = new() + { + ["x-abstraction-identifier"] = transform.Type + }; + + if (recordingId != null) + { + headers[X_RECORDING_ID_HEADER] = recordingId; + } + + PipelineMessage message = CreateJsonRequest(HttpMethod.Post, "admin/addtransform", transform, token, headers); + return await SendSyncOrAsync(true, message, token).ConfigureAwait(false); + } + + /// + /// Resets the sanitizers, matcher, and transforms to the default. + /// + /// (Optional) If specified, only the particular session will be reset. + /// If null, the reset will apply globally. + /// The cancellation token to use. + /// The client result. + public virtual ProxyClientResult Reset(string? 
recordingId = null, CancellationToken token = default) + { + Dictionary headers = new(); + if (recordingId != null) + { + headers[X_RECORDING_ID_HEADER] = recordingId; + } + + PipelineMessage message = CreateJsonRequest(HttpMethod.Post, "Admin/Reset", null, token, headers); + return SendSyncOrAsync(false, message, token).GetAwaiter().GetResult(); + } + + /// + /// Resets the sanitizers, matcher, and transforms to the default asynchronously. + /// + /// (Optional) If specified, only the particular session will be reset. + /// If null, the reset will apply globally. + /// The cancellation token to use. + /// The client result. + public virtual async Task ResetAsync(string? recordingId = null, CancellationToken token = default) + { + Dictionary headers = new(); + if (recordingId != null) + { + headers[X_RECORDING_ID_HEADER] = recordingId; + } + + PipelineMessage message = CreateJsonRequest(HttpMethod.Post, "Admin/Reset", null, token, headers); + return await SendSyncOrAsync(true, message, token).ConfigureAwait(false); + } + + /// + /// Lists the available sanitizers, matchers, and transforms. + /// + /// The cancellation token. + /// The client result with the HTML returned from the service. + public virtual ProxyClientResult ListAvailable(CancellationToken token = default) + { + PipelineMessage message = CreateJsonRequest(HttpMethod.Get, "Info/Available", null, token); + ProxyClientResult result = SendSyncOrAsync(false, message, token).GetAwaiter().GetResult(); + return new ProxyClientResult(result.GetRawResponse().Content.ToString(), result.GetRawResponse()); + } + + /// + /// Lists the available sanitizers, matchers, and transforms asynchronously. + /// + /// The cancellation token. + /// The client result with the HTML returned from the service. 
+ public virtual async Task> ListAvailableAsync(CancellationToken token = default) + { + PipelineMessage message = CreateJsonRequest(HttpMethod.Get, "Info/Available", null, token); + ProxyClientResult result = await SendSyncOrAsync(true, message, token).ConfigureAwait(false); + return new ProxyClientResult(result.GetRawResponse().Content.ToString(), result.GetRawResponse()); + } + + protected virtual PipelineMessage CreateJsonRequest(HttpMethod method, string path, TBody? body, CancellationToken token, Dictionary? headers = null) + { + PipelineMessage message = _pipeline.CreateMessage(); + message.Apply(new RequestOptions + { + CancellationToken = token, + BufferResponse = true + }); + + PipelineRequest request = message.Request; + request.Method = method.Method; + request.Uri = new Uri(_options.HttpEndpoint, path); + request.Headers.Add("Accept", "application/json"); + + if (headers != null) + { + foreach (var kvp in headers) + { + request.Headers.Add(kvp.Key, kvp.Value); + } + } + + if (body != null) + { + MemoryStream stream = new(); + using Utf8JsonWriter writer = new(stream); + JsonSerializer.Serialize(writer, body, Default.RecordingJsonOptions); + BinaryData jsonBody = BinaryData.FromBytes(new ReadOnlyMemory(stream.GetBuffer(), 0, (int)stream.Length)); + + request.Headers.Add("Content-Type", "application/json"); + request.Content = BinaryContent.Create(jsonBody); + } + + return message; + } + + protected virtual async ValueTask SendSyncOrAsync(bool isAsync, PipelineMessage message, CancellationToken token) + { + if (isAsync) + { + await _pipeline.SendAsync(message).ConfigureAwait(false); + } + else + { + _pipeline.Send(message); + } + + PipelineResponse response = message.Response ?? throw new ClientResultException("Response was null", message.Response); + if (response.IsError) + { + if (response.Content.ToMemory().Length > 0) + { + string contentType = response.Headers.GetFirstOrDefault("Content-Type") ?? 
string.Empty; + + if (contentType.StartsWith("text/", StringComparison.OrdinalIgnoreCase)) + { + string error = response.Content.ToString(); + throw new ClientResultException(error, response); + } + else if (contentType.StartsWith("application/json", StringComparison.OrdinalIgnoreCase)) + { + string error; + try + { + var parsed = response.Content.ToObjectFromJson(new() + { + PropertyNameCaseInsensitive = true + }); + + error = $"{parsed.Status}: {parsed.Message}"; + } + catch + { + error = response.Content.ToString(); + } + + throw new ClientResultException(error, response); + } + } + + throw new ClientResultException(response); + } + + return new ProxyClientResult(response); + } + + protected virtual async ValueTask> SendSyncOrAsync(bool isAsync, PipelineMessage message, CancellationToken token) + { + if (isAsync) + { + await SendSyncOrAsync(isAsync, message, token).ConfigureAwait(false); + } + else + { + SendSyncOrAsync(isAsync, message, token).GetAwaiter().GetResult(); + } + + PipelineResponse response = message.Response!; // we've already validated this is not null in the previous call + + try + { + TResponse? parsed = JsonSerializer.Deserialize(response.Content.ToMemory().Span, Default.TestProxyJsonOptions); + if (parsed == null) + { + throw new InvalidDataException("Response parsed to null"); + } + + return new ProxyClientResult(parsed, response); + } + catch (Exception ex) + { + throw new ClientResultException("Failed to deserialize response", message.Response, ex); + } + } + + private struct ErrorResponse + { + public string? Message { get; set; } + public string? 
Status { get; set; } + } +} diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Proxy/ProxyClientOptions.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Proxy/ProxyClientOptions.cs new file mode 100644 index 000000000..bb35323c9 --- /dev/null +++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Proxy/ProxyClientOptions.cs @@ -0,0 +1,31 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +using System.ClientModel.Primitives; + +namespace OpenAI.TestFramework.Recording.RecordingProxy; + +/// +/// Options for the test proxy client. +/// +public class ProxyClientOptions : ClientPipelineOptions +{ + /// + /// Creates a new instance. + /// + /// The HTTP endpoint. + /// The endpoint was null. + /// The endpoint was not absolute. + public ProxyClientOptions(Uri http) + { + if (http == null) throw new ArgumentNullException(nameof(http)); + else if (!http.IsAbsoluteUri) throw new ArgumentException("URI must be absolute", nameof(http)); + + HttpEndpoint = http; + } + + /// + /// The HTTP endpoint to use + /// + public Uri HttpEndpoint { get; } +} diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Proxy/ProxyClientResult.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Proxy/ProxyClientResult.cs new file mode 100644 index 000000000..a1e16d300 --- /dev/null +++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Proxy/ProxyClientResult.cs @@ -0,0 +1,65 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +using System.ClientModel; +using System.ClientModel.Primitives; +using OpenAI.TestFramework.Utils; + +namespace OpenAI.TestFramework.Recording.RecordingProxy +{ + /// + /// Represents the result of a proxy client operation. + /// + public class ProxyClientResult : ClientResult + { + /// + /// Initializes a new instance of the class. + /// + /// (Optional) The pipeline response. 
+ public ProxyClientResult(PipelineResponse? response = null) + { + if (response != null) + { + SetRawResponse(response); + } + } + + /// + /// Gets the recording ID from the response headers. + /// + public string? RecordingId => GetRawResponse().Headers.GetFirstOrDefault(ProxyClient.X_RECORDING_ID_HEADER); + } + + /// + /// Represents the result of a proxy client operation. + /// + /// The type of the result value. + public class ProxyClientResult : ProxyClientResult + { + /// + /// Initializes a new instance of the class. + /// + /// The result value. + /// (Optional) The pipeline response. + public ProxyClientResult(TResult value, PipelineResponse? response = null) + { + Value = value; + if (response != null) + { + SetRawResponse(response); + } + } + + /// + /// Gets the result value. + /// + public virtual TResult Value { get; } + + /// + /// Implicitly converts the to the result value. + /// + /// The instance. + /// The result value. + public static implicit operator TResult(ProxyClientResult result) => result.Value; + } +} diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Proxy/ProxyService.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Proxy/ProxyService.cs new file mode 100644 index 000000000..8d6460afd --- /dev/null +++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Proxy/ProxyService.cs @@ -0,0 +1,256 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +using System.Diagnostics; +using System.Runtime.InteropServices; +using System.Text; +using NUnit.Framework; +using OpenAI.TestFramework.Recording.RecordingProxy; +using OpenAI.TestFramework.Utils.Processes; + +namespace OpenAI.TestFramework.Recording.Proxy; + +/// +/// Represents the test proxy. 
See here for more information: +/// https://github.com/Azure/azure-sdk-tools/blob/main/tools/test-proxy/Azure.Sdk.Tools.TestProxy/README.md +/// +public class ProxyService : IDisposable +{ + private const int c_maxLines = 50; + + private Process _testProxyProcess; + private Uri? _http; + private Uri? _https; + private TaskCompletionSource<(int, int)> _portsAvailableTcs; + private StringBuilder _errorOutput; + private int _lines; + private ProxyClient? _client; + private WindowsJob? _windowsJob; + + /// + /// Creates a new instance. + /// + /// The options to use. + /// was null. + private ProxyService(ProxyServiceOptions options) + { + if (options == null) + { + throw new ArgumentNullException(nameof(options)); + } + + options.Validate(); + + ProcessStartInfo startInfo = new() + { + FileName = options.DotnetExecutable, + Arguments = $@"""{options.TestProxyDll}"" start -u --storage-location=""{options.StorageLocationDir}""", + RedirectStandardOutput = true, + RedirectStandardError = true, + UseShellExecute = false, + EnvironmentVariables = + { + ["ASPNETCORE_URLS"] = $"http://127.0.0.1:{options.HttpPort};https://127.0.0.1:{options.HttpsPort}", + ["Logging__LogLevel__Azure.Sdk.Tools.TestProxy"] = "Error", + ["Logging__LogLevel__Default"] = "Error", + ["Logging__LogLevel__Microsoft.AspNetCore"] = "Error", + ["Logging__LogLevel__Microsoft.Hosting.Lifetime"] = "Information", + } + }; + + if (options.DevCertFile != null) + { + startInfo.EnvironmentVariables["ASPNETCORE_Kestrel__Certificates__Default__Path"] = options.DevCertFile; + if (options.DevCertPassword != null) + { + startInfo.EnvironmentVariables["ASPNETCORE_Kestrel__Certificates__Default__Password"] = options.DevCertPassword; + } + } + + _errorOutput = new(); + _portsAvailableTcs = new(); + _testProxyProcess = new Process() + { + EnableRaisingEvents = true, + StartInfo = startInfo + }; + + _testProxyProcess.Exited += (_, _) => + { + _portsAvailableTcs.TrySetException(new InvalidOperationException("Test proxy 
process exited unexpectedly")); + }; + _testProxyProcess.ErrorDataReceived += HandleStdErr; + _testProxyProcess.OutputDataReceived += HandleStdOut; + + _windowsJob = null; + if (RuntimeInformation.IsOSPlatform(OSPlatform.Windows)) + { + // If running on Windows, use a Job to instruct the OS to kill the test proxy service process + // should this current process die for any reason. + _windowsJob = new($"TestProxy_{Process.GetCurrentProcess().Id}"); + } + } + + /// + /// Gets the client to use to communicate with this recording test proxy. + /// + public ProxyClient Client => _client + ?? throw new InvalidOperationException("Please wait for the proxy to finish starting first"); + + /// + /// Gets the HTTP endpoint the test recording proxy is listening on. + /// + public Uri HttpEndpoint => _http + ?? throw new InvalidOperationException("Please wait for the proxy to finish starting first"); + + /// + /// Gets the HTTPS endpoint the test recording proxy is listening on. + /// + public Uri HttpsEndpoint => _https + ?? throw new InvalidOperationException("Please wait for the proxy to finish starting first"); + + /// + /// Creates a new instance of the recording test proxy. + /// + /// The options to use for the proxy. + /// The cancellation token to use. + /// The initialized recording test proxy instance. + public static async Task CreateNewAsync(ProxyServiceOptions options, CancellationToken token = default) + { + token.ThrowIfCancellationRequested(); + + ProxyService proxy = new ProxyService(options); + + // Try to make sure the test proxy process is terminated when we exit + AppDomain.CurrentDomain.DomainUnload += (_, _) => proxy.Dispose(); + // TODO FIXME: On Windows, use a job to ensure the OS will properly kill the process + + await proxy.StartAsync(token).ConfigureAwait(false); + return proxy; + } + + /// + /// Tears down the recording test proxy instance. 
+ /// + public void Dispose() + { + _portsAvailableTcs.TrySetException(new ObjectDisposedException(nameof(ProxyService))); + try + { + _testProxyProcess.Kill(); + if (_windowsJob != null) + { + // do NOT call Dispose here. This will terminate this process too. + } + } catch { /* we tried */ } + } + + /// + /// Checks to see if any errors were encountered in the test proxy, and if so throws an exception. + /// + /// If there were any errors encountered. + public void ThrowOnErrors() + { + lock (_errorOutput) + { + if (_errorOutput.Length > 0) + { + string error = _errorOutput.ToString(); + _errorOutput.Clear(); + throw new InvalidOperationException($"An error occurred in the test proxy:\n{error}"); + } + } + } + + /// + /// For testing purposes only. + /// + /// The client to set. + internal void SetClient(ProxyClient client) + { + _client = client; + } + + /// + /// Starts the recording test proxy instance, and waits until we can read the ports it is listening on for + /// HTTP and HTTPS. + /// + /// The cancellation token to use. + /// Asynchronous task. + /// The test proxy failed to start, or we encountered some other error. + protected async Task StartAsync(CancellationToken token = default) + { + token.Register(_portsAvailableTcs.SetCanceled); + + bool success = _testProxyProcess.Start(); + if (!success) + { + throw new InvalidOperationException("The test proxy process failed to start"); + } + + _windowsJob?.Add(_testProxyProcess); + + _testProxyProcess.BeginOutputReadLine(); + _testProxyProcess.BeginErrorReadLine(); + + await _portsAvailableTcs.Task.ConfigureAwait(false); + } + + private static Uri? ParseListeningOnUri(string line) + { + const string nowListeningOn = "Now listening on: "; + int index = line.IndexOf(nowListeningOn, StringComparison.OrdinalIgnoreCase); + if (index < 0) + { + return null; + } + + Uri.TryCreate(line.AsSpan().Slice(index + nowListeningOn.Length).Trim().ToString(), UriKind.Absolute, out Uri?
uri); + return uri; + } + + private void HandleStdErr(object sender, DataReceivedEventArgs args) + { + if (args?.Data != null) + { + lock (_errorOutput) + { + _errorOutput.Append(args.Data); + } + + TestContext.Progress.WriteLine(args.Data); + } + } + + private void HandleStdOut(object sender, DataReceivedEventArgs args) + { + if (_lines++ >= c_maxLines) + { + _portsAvailableTcs.TrySetException(new InvalidOperationException( + $"Failed to start the test proxy. One or both the ports was not populated. http: {_http}, https: {_https}")); + _testProxyProcess.OutputDataReceived -= HandleStdOut; + return; + } + else if (args?.Data == null) + { + return; + } + + Uri? uri = ParseListeningOnUri(args.Data); + if (_http == null && uri?.Scheme == "http") + { + _http = uri; + _client = new ProxyClient(new ProxyClientOptions(_http!)); + } + else if (_https == null && uri?.Scheme == "https") + { + _https = uri; + } + + if (_http != null && _https != null) + { + _testProxyProcess.OutputDataReceived -= HandleStdOut; + _portsAvailableTcs.TrySetResult((_http.Port, _https.Port)); + } + } +} diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Proxy/ProxyServiceOptions.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Proxy/ProxyServiceOptions.cs new file mode 100644 index 000000000..2f3e3d27f --- /dev/null +++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Proxy/ProxyServiceOptions.cs @@ -0,0 +1,82 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +namespace OpenAI.TestFramework.Recording.Proxy; + +/// +/// Options for starting the recording test proxy. +/// +public class ProxyServiceOptions +{ + /// + /// Gets the full path to the dotnet executable. + /// + required public string DotnetExecutable { get; set; } + + /// + /// Gets the full path to the test proxy DLL. 
+ /// + required public string TestProxyDll { get; set; } + + /// + /// The path to the directory to store or read recordings from. + /// + required public string StorageLocationDir { get; set; } + + /// + /// (Optional) The file to use for the HTTPS endpoint certificate. + /// + public string? DevCertFile { get; set; } + + /// + /// (Optional) The password to use for opening the certificate file for the HTTPS endpoint. + /// + public string? DevCertPassword { get; set; } + + /// + /// (Optional) The HTTP port the test proxy should listen on. Set this to 0 to have the next available port be automatically selected. + /// + public ushort HttpPort { get; set; } + + /// + /// (Optional) The HTTPS port the test proxy should listen on. Set this to 0 to have the next available port be automatically selected. + /// + public ushort HttpsPort { get; set; } + + /// + /// Validates the configuration. + /// + /// The storage location directory could not be found. + /// The HTTPS certificate file could not be found. + /// No password was specified for the developer certificate file.
+ protected internal virtual void Validate() + { + List<Exception> exceptions = new(); + + // Use independent checks (not else-if) so that all configuration problems are + // collected and reported together in the AggregateException below. + if (!File.Exists(DotnetExecutable)) + { + exceptions.Add(new FileNotFoundException("Could not find (or read from) the dotnet executable: " + DotnetExecutable)); + } + if (!File.Exists(TestProxyDll)) + { + exceptions.Add(new FileNotFoundException("Could not find (or read from) the test proxy DLL: " + TestProxyDll)); + } + if (!Directory.Exists(StorageLocationDir)) + { + exceptions.Add(new DirectoryNotFoundException("Could not find (or read from) the following directory: " + StorageLocationDir)); + } + if (DevCertFile != null && !File.Exists(DevCertFile)) + { + exceptions.Add(new FileNotFoundException("Could not find (or read from) the HTTPS certificate file: " + DevCertFile)); + } + if (DevCertFile != null && DevCertPassword == null) + { + exceptions.Add(new InvalidOperationException($"You must set the {nameof(DevCertPassword)} property if you specify the {nameof(DevCertFile)}")); + } + + if (exceptions.Any()) + { + throw new AggregateException("The test proxy service configuration is invalid", exceptions); + } + } +} diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Proxy/ProxyTransport.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Proxy/ProxyTransport.cs new file mode 100644 index 000000000..69cb9a8fc --- /dev/null +++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Proxy/ProxyTransport.cs @@ -0,0 +1,216 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Net.Http; +using System.Text.Json; + +namespace OpenAI.TestFramework.Recording.RecordingProxy; + +/// +/// Implements a PipelineTransport that will redirect all HTTP/HTTPS requests to the test proxy for recording or playback.
+/// Depending on the mode, the test proxy will then either forward the request to the upstream service and record the request and response, +/// or playback the response from a previous recording. +/// +public class ProxyTransport : PipelineTransport +{ + private const string DevCertIssuer = "CN=localhost"; + private const string FiddlerCertIssuer = "CN=DO_NOT_TRUST_FiddlerRoot, O=DO_NOT_TRUST, OU=Created by http://www.fiddler2.com"; + private const string FiddlerHost = "ipv4.fiddler"; + + private readonly ProxyTransportOptions _options; + + /// + /// Initializes a new instance of the class. + /// + /// The options for the proxy transport. + public ProxyTransport(ProxyTransportOptions options) + { + _options = options ?? throw new ArgumentNullException(nameof(options)); + + string certIssuer; + if (_options.UseFiddler) + { + certIssuer = FiddlerCertIssuer; + } + else + { + certIssuer = DevCertIssuer; + } + + HttpClientHandler handler = new() + { + ServerCertificateCustomValidationCallback = (_, certificate, _, _) => certificate?.Issuer == certIssuer, + UseCookies = _options.AllowCookies, + AllowAutoRedirect = _options.AllowAutoRedirect + }; + + InnerTransport = new HttpClientPipelineTransport(new HttpClient(handler)); + } + + /// + /// The actual transport to use for sending requests, and receiving responses. + /// + protected PipelineTransport InnerTransport { get; } + + /// + protected override PipelineMessage CreateMessageCore() + { + Exception? 
ex = _options.MismatchException?.GetValue(); + if (ex != null) + { + throw ex; + } + + PipelineMessage message = InnerTransport.CreateMessage(); + PipelineRequest request = message.Request; + + // PipelineRequest no longer has a ClientRequestId property, so we need to set it on the headers directly + request.Headers.Add("x-ms-client-request-id", _options.RequestId); + + return message; + } + + /// + protected override void ProcessCore(PipelineMessage message) + => ProcessCoreSyncOrAsync(message, async: false).GetAwaiter().GetResult(); + + /// + protected override ValueTask ProcessCoreAsync(PipelineMessage message) + => ProcessCoreSyncOrAsync(message, async: true); + + /// + /// Processes the pipeline message synchronously or asynchronously. + /// + /// The pipeline message to process. + /// A flag indicating whether to process asynchronously. + /// A representing the asynchronous operation. + protected virtual async ValueTask ProcessCoreSyncOrAsync(PipelineMessage message, bool async) + { + try + { + RedirectToTestProxy(message); + if (async) + { + await InnerTransport.ProcessAsync(message).ConfigureAwait(false); + } + else + { + InnerTransport.Process(message); + } + + await ProcessResponseSyncAsync(message, async).ConfigureAwait(false); + } + finally + { + // revert the original URI - this is important for tests that rely on aspects of the URI in the pipeline + // e.g. KeyVault caches tokens based on URI + message.Request.Headers.TryGetValue("x-recording-upstream-base-uri", out string? 
original); + if (message.Request.Uri is null) + { + throw new InvalidOperationException("The request cannot have a null URI"); + } + if (original == null) + { + throw new InvalidOperationException("The TestProxy response did not contain the expected \"x-recording-upstream-base-uri\" header"); + } + + var originalBaseUri = new Uri(original); + var builder = new UriBuilder(message.Request.Uri); + builder.Scheme = originalBaseUri.Scheme; + builder.Host = originalBaseUri.Host; + builder.Port = originalBaseUri.Port; + + message.Request.Uri = builder.Uri; + } + } + + /// + /// Processes the response synchronously or asynchronously. + /// + /// The pipeline message containing the response. + /// A flag indicating whether to process asynchronously. + /// A representing the asynchronous operation. + protected virtual async ValueTask ProcessResponseSyncAsync(PipelineMessage message, bool async) + { + if (message.Response?.Headers.TryGetValues("x-request-mismatch", out _) == true) + { + if (message.Response.ContentStream == null) + { + throw new TestRecordingMismatchException("Detected a mismatch but the response had no body"); + } + + using var doc = async + ? await JsonDocument.ParseAsync(message.Response.ContentStream).ConfigureAwait(false) + : JsonDocument.Parse(message.Response.ContentStream); + throw new TestRecordingMismatchException(doc.RootElement.GetProperty("Message").GetString(), null); + } + } + + // copied from https://github.com/Azure/azure-sdk-for-net/blob/main/common/Perf/Azure.Test.Perf/TestProxyPolicy.cs + /// + /// Redirects the pipeline message to the test proxy based on the recording mode. + /// + /// The pipeline message to redirect. 
+ protected virtual void RedirectToTestProxy(PipelineMessage message) + { + if (_options.Mode == RecordedTestMode.Record) + { + switch (_options.ShouldRecordRequest(message.Request)) + { + case RequestRecordMode.Record: + break; + case RequestRecordMode.RecordWithoutRequestBody: + message.Request.Headers.Set("x-recording-skip", "request-body"); + break; + case RequestRecordMode.DoNotRecord: + message.Request.Headers.Set("x-recording-skip", "request-response"); + break; + } + } + else if (_options.Mode == RecordedTestMode.Playback) + { + switch (_options.ShouldRecordRequest(message.Request)) + { + case RequestRecordMode.Record: + break; + case RequestRecordMode.RecordWithoutRequestBody: + // CAUTION: setting the request content to null has the unfortunate side effect of causing any HttpClient backed + // implementation of networking to not send up any Content-??? headers as well which can cause test + // mismatches. Let's work around this by setting some empty content. + message.Request.Content = BinaryContent.Create(BinaryData.FromBytes(Array.Empty())); + break; + case RequestRecordMode.DoNotRecord: + throw new InvalidOperationException( + "Cannot playback when recording has been disabled. Please make sure to skip the test or request."); + } + } + + var request = message.Request; + request.Headers.Set("x-recording-id", _options.RecordingId); + request.Headers.Set("x-recording-mode", _options.Mode.ToString().ToLowerInvariant()); + + if (request.Uri is null) + { + throw new InvalidOperationException("Request URI cannot be null"); + } + + // Intentionally reset the upstream URI in case the request URI changes between retries - e.g. when using GeoRedundant secondary Storage + var builder = new UriBuilder() + { + Scheme = request.Uri.Scheme, + Host = request.Uri.Host, + Port = request.Uri.Port, + }; + request.Headers.Set("x-recording-upstream-base-uri", builder.ToString()); + + Uri baseUri = request.Uri.Scheme == "https" ? 
_options.HttpsEndpoint : _options.HttpEndpoint; + + builder = new(request.Uri); + builder.Host = _options.UseFiddler ? FiddlerHost : baseUri.Host; + builder.Port = baseUri.Port; + + request.Uri = builder.Uri; + } +} diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Proxy/ProxyTransportOptions.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Proxy/ProxyTransportOptions.cs new file mode 100644 index 000000000..a0087e850 --- /dev/null +++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Proxy/ProxyTransportOptions.cs @@ -0,0 +1,72 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +using System.ClientModel.Primitives; +using OpenAI.TestFramework.Utils; + +namespace OpenAI.TestFramework.Recording.RecordingProxy; + +/// +/// The options for the recording test proxy transport. +/// +public class ProxyTransportOptions +{ + private Func? _shouldRecordRequest; + + /// + /// Gets or sets the test proxy HTTP endpoint. + /// + required public Uri HttpEndpoint { get; set; } + + /// + /// Gets or sets the test proxy HTTPS endpoint. + /// + required public Uri HttpsEndpoint { get; set; } + + /// + /// Gets or sets the current test recording mode. + /// + required public RecordedTestMode Mode { get; set; } + + /// + /// Gets or sets the identifier for the recording. + /// + required public string RecordingId { get; set; } + + /// + /// The ID for the request. Please make sure that a consistent ID is used during recording and playback to avoid + /// mismatches. + /// + required public string RequestId { get; set; } + + /// + /// Gets or sets the delegate used to get/set the test recording mismatch exception. + /// + public PropertyDelegate? MismatchException { get; set; } + + /// + /// Gets or sets a value indicating whether to use Fiddler. If this is true, the transport will be updated to accept + /// the Fiddler root certificate. 
+ /// + public bool UseFiddler { get; set; } + + /// + /// Gets or sets the predicate used to determine whether or not a particular request should not be recorded. + /// Default behaviour is to defer to what the matchers/sanitizers do. + /// + public Func ShouldRecordRequest + { + get => _shouldRecordRequest ?? (_ => RequestRecordMode.Record); + set => _shouldRecordRequest = value; + } + + /// + /// Gets or sets a value indicating whether to allow cookies while sending and receiving requests. + /// + public bool AllowCookies { get; set; } + + /// + /// Gets or sets a value indicating whether to allow auto redirect when processing server responses. + /// + public bool AllowAutoRedirect { get; set; } +} diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Proxy/RequestRecordMode.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Proxy/RequestRecordMode.cs new file mode 100644 index 000000000..d8a782327 --- /dev/null +++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Proxy/RequestRecordMode.cs @@ -0,0 +1,23 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +namespace OpenAI.TestFramework.Recording.RecordingProxy; + +/// +/// Enumeration of possible values of how to record a request. This acts as an override. +/// +public enum RequestRecordMode +{ + /// + /// Records the request. + /// + Record, + /// + /// Records the request headers but skips the request body. + /// + RecordWithoutRequestBody, + /// + /// Does not record the request (nor the response). + /// + DoNotRecord, +} diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Proxy/Service/PemPair.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Proxy/Service/PemPair.cs new file mode 100644 index 000000000..15c72d94e --- /dev/null +++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Proxy/Service/PemPair.cs @@ -0,0 +1,15 @@ +// Copyright (c) Microsoft Corporation. 
All rights reserved. +// Licensed under the MIT License. + +namespace OpenAI.TestFramework.Recording.Proxy.Service; + +/// +/// Information about certificates for the test proxy service. +/// +public class PemPair +{ + /// Gets or sets the pem value. + public string? PemValue { get; set; } + /// Gets or sets the pem key. + public string? PemKey { get; set; } +} diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Proxy/Service/ProxyServiceRecordingOptions.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Proxy/Service/ProxyServiceRecordingOptions.cs new file mode 100644 index 000000000..449e90926 --- /dev/null +++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Proxy/Service/ProxyServiceRecordingOptions.cs @@ -0,0 +1,25 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +namespace OpenAI.TestFramework.Recording.Proxy.Service; + +/// +/// Options for the test proxy. +/// +public class ProxyServiceRecordingOptions +{ + /// + /// Whether or not to follow redirects + /// + public bool? HandleRedirects { get; set; } + + /// + /// If set, this will change the "root" path the test proxy uses when loading a recording. + /// + public string? ContextDirectory { get; set; } + + /// + /// Options for the transport. + /// + public ProxyServiceTransportCustomizations? Transport { get; set; } +} diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Proxy/Service/ProxyServiceTransportCustomizations.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Proxy/Service/ProxyServiceTransportCustomizations.cs new file mode 100644 index 000000000..6be1ba257 --- /dev/null +++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Proxy/Service/ProxyServiceTransportCustomizations.cs @@ -0,0 +1,40 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. 
+ +using System.Text.Json.Serialization; +using OpenAI.TestFramework.Utils; + +namespace OpenAI.TestFramework.Recording.Proxy.Service; + +/// +/// Transport customizations for the test proxy service. +/// +public class ProxyServiceTransportCustomizations() +{ + /// Gets or sets whether to allow auto redirect. + public bool? AllowAutoRedirect { get; set; } + + /// + /// If specified, the public key contained here will be used during validation of the SSL connection by + /// comparing thumbprints. + /// + public string? TLSValidationCert { get; set; } + + /// + /// If specified, the TLS validation certificate will only be applied to the specified host. + /// + public string? TSLValidationCertHost { get; set; } + + /// + /// Each certificate pair contained within this list should be added to the clientHandler for the server + /// or an individual recording. + /// + public IList? Certificates { get; set; } + + /// + /// During playback, a response is normally returned all at once. By offering this response time, we can + /// "stretch" the writing of the response bytes over a time range of milliseconds. + /// + [JsonConverter(typeof(TimespanToMillisecondConverter))] + public TimeSpan? PlaybackResponseTime { get; set; } +} diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Proxy/Service/RecordingStartInformation.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Proxy/Service/RecordingStartInformation.cs new file mode 100644 index 000000000..e3526592a --- /dev/null +++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Proxy/Service/RecordingStartInformation.cs @@ -0,0 +1,30 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +using System.Text.Json.Serialization; + +namespace OpenAI.TestFramework.Recording.Proxy.Service; + +/// +/// Information for starting a recording or playback session with the recording test proxy.
+/// +public class RecordingStartInformation +{ + /// + /// Gets or sets the file to save recordings to, or to play back requests from. + /// + [JsonPropertyName("x-recording-file")] + required public string RecordingFile { get; set; } + + /// + /// Gets or sets the path to the "assets.json" file to use for integration with external Git + /// repositories. This enables the proxy to work against repositories that do not emplace their + /// test recordings directly alongside their test implementations. + /// + /// + /// Please refer to the documentation for more information: + /// https://github.com/Azure/azure-sdk-tools/blob/main/tools/test-proxy/documentation/asset-sync/README.md + /// + [JsonPropertyName("x-recording-assets-file")] + public string? AssetsFile { get; set; } +} diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Proxy/Service/SanitizerIdList.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Proxy/Service/SanitizerIdList.cs new file mode 100644 index 000000000..f0982542b --- /dev/null +++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Proxy/Service/SanitizerIdList.cs @@ -0,0 +1,15 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +namespace OpenAI.TestFramework.Recording.Proxy.Service; + +/// +/// Request to remove sanitizers for the test proxy. +/// +public struct SanitizerIdList +{ + /// + /// The IDs of the sanitizers to remove. + /// + public string[]? Sanitizers { get; set; } +} diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Sanitizers/BaseRegexSanitizer.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Sanitizers/BaseRegexSanitizer.cs new file mode 100644 index 000000000..3c52f35b8 --- /dev/null +++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Sanitizers/BaseRegexSanitizer.cs @@ -0,0 +1,25 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. 
+ +namespace OpenAI.TestFramework.Recording.Sanitizers; + +/// +/// The base class for regex-based sanitizers. +/// +public abstract class BaseRegexSanitizer(string type) : BaseSanitizer(type) +{ + /// + /// Gets the regular expression to match what to replace. + /// + public string? Regex { get; set; } + + /// + /// Gets or sets the value to replace the match with. + /// + public string? Value { get; set; } + + /// + /// Gets or sets the group in the regex match to replace. + /// + public string? GroupForReplace { get; set; } +} diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Sanitizers/BaseSanitizer.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Sanitizers/BaseSanitizer.cs new file mode 100644 index 000000000..011145bf5 --- /dev/null +++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Sanitizers/BaseSanitizer.cs @@ -0,0 +1,53 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +using System.Text.Json; +using System.Text.Json.Serialization; +using OpenAI.TestFramework.Utils; + +namespace OpenAI.TestFramework.Recording.Sanitizers; + +/// +/// The base class for all test proxy recording sanitizers. +/// +public abstract class BaseSanitizer : IUtf8JsonSerializable +{ + /// + /// Creates a new instance. + /// + /// The type of this sanitizer (e.g. GeneralRegexSanitizer). + /// If the type was null. + protected BaseSanitizer(string type) + { + Type = type ?? throw new ArgumentNullException(nameof(type)); + } + + /// + /// Gets the type of the sanitizer (e.g. HeaderRegexSanitizer). + /// + [JsonIgnore] + public string Type { get; } + + /// + public void Write(Utf8JsonWriter writer, JsonSerializerOptions? options = null) + { + writer.WriteStartObject(); + { + writer.WriteString("Name"u8, Type); + writer.WritePropertyName("Body"u8); + + SerializeInner(writer, options); + } + writer.WriteEndObject(); + } + + /// + /// Serializes the child types.
By default this will use reflection based serialization. + /// + /// The writer to write to. + protected virtual void SerializeInner(Utf8JsonWriter writer, JsonSerializerOptions? options = null) + { + // By default use reflection based serialization + JsonSerializer.Serialize(writer, this, GetType(), Default.InnerRecordingJsonOptions); + } +} diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Sanitizers/BodyKeySanitizer.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Sanitizers/BodyKeySanitizer.cs new file mode 100644 index 000000000..87b3f67e1 --- /dev/null +++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Sanitizers/BodyKeySanitizer.cs @@ -0,0 +1,25 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +namespace OpenAI.TestFramework.Recording.Sanitizers; + +/// +/// Sanitizer for a request body that matches a particular value in JSON using a JPath expression. +/// +public class BodyKeySanitizer : BaseRegexSanitizer +{ + /// + /// Creates a new instance. + /// + /// The JSON path to match. + /// If the JSON path is null. + public BodyKeySanitizer(string jsonPath) : base("BodyKeySanitizer") + { + JsonPath = jsonPath ?? throw new ArgumentNullException(nameof(jsonPath)); + } + + /// + /// The JPath expression to match a particular value to sanitize. + /// + public string JsonPath { get; } +} diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Sanitizers/BodyRegexSanitizer.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Sanitizers/BodyRegexSanitizer.cs new file mode 100644 index 000000000..e49b6f625 --- /dev/null +++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Sanitizers/BodyRegexSanitizer.cs @@ -0,0 +1,25 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. 
+ +namespace OpenAI.TestFramework.Recording.Sanitizers; + +/// +/// Sanitizer for the body of a request or response. +/// +public class BodyRegexSanitizer : BaseRegexSanitizer +{ + /// + /// Creates a new instance. + /// + /// The regular expression to match what to replace. + /// If the regex was null. + public BodyRegexSanitizer(string regex) : base("BodyRegexSanitizer") + { + Regex = regex ?? throw new ArgumentNullException(nameof(regex)); + } + + /// + /// Condition to apply for the sanitization or transform. If the condition is not met, sanitization is not performed. + /// + public Condition? Condition { get; set; } +} diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Sanitizers/HeaderRegexSanitizer.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Sanitizers/HeaderRegexSanitizer.cs new file mode 100644 index 000000000..d1a76fc04 --- /dev/null +++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Sanitizers/HeaderRegexSanitizer.cs @@ -0,0 +1,25 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +namespace OpenAI.TestFramework.Recording.Sanitizers; + +/// +/// Sanitizer for a request header. +/// +public class HeaderRegexSanitizer : BaseRegexSanitizer +{ + /// + /// Creates a new instance. + /// + /// The header to sanitize. + /// If the key is null. + public HeaderRegexSanitizer(string key) : base("HeaderRegexSanitizer") + { + Key = key ?? throw new ArgumentNullException(nameof(key)); + } + + /// + /// The name of the header to sanitize.
+ /// + public string Key { get; } +} diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Sanitizers/UriRegexSanitizer.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Sanitizers/UriRegexSanitizer.cs new file mode 100644 index 000000000..3c5bad68e --- /dev/null +++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Sanitizers/UriRegexSanitizer.cs @@ -0,0 +1,20 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +namespace OpenAI.TestFramework.Recording.Sanitizers; + +/// +/// Sanitizer for a request URI. +/// +public class UriRegexSanitizer : BaseRegexSanitizer +{ + /// + /// Creates a new instance. + /// + /// The regular expression to match in the request URI. + /// If the regular expression is null. + public UriRegexSanitizer(string regex) : base("UriRegexSanitizer") + { + Regex = regex ?? throw new ArgumentNullException(nameof(regex)); + } +} diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/TestRandom.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/TestRandom.cs new file mode 100644 index 000000000..d3b6bccb6 --- /dev/null +++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/TestRandom.cs @@ -0,0 +1,38 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +namespace OpenAI.TestFramework.Recording; + +/// +/// Represents an implementation of the Random class used for test recordings. +/// +public class TestRandom : Random +{ + private RecordedTestMode _mode; + + /// + /// Initializes a new instance of the TestRandom class. + /// + /// The recorded test mode. + /// The seed value. + public TestRandom(RecordedTestMode mode, int seed) : base(seed) + { + _mode = mode; + } + + /// + /// Generates a new Guid based on the recorded test mode. + /// + /// A new Guid.
+ public Guid NewGuid() + { + if (_mode == RecordedTestMode.Live) + { + return Guid.NewGuid(); + } + + var bytes = new byte[16]; + NextBytes(bytes); + return new Guid(bytes); + } +} diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/TestRecording.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/TestRecording.cs new file mode 100644 index 000000000..d58573f9e --- /dev/null +++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/TestRecording.cs @@ -0,0 +1,250 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +using System.Globalization; +using System.Security.Cryptography; +using OpenAI.TestFramework.Recording.Matchers; +using OpenAI.TestFramework.Recording.Proxy; +using OpenAI.TestFramework.Recording.RecordingProxy; +using OpenAI.TestFramework.Utils; + +namespace OpenAI.TestFramework.Recording; + +/// +/// Represents a test recording session. This is used to record or play back requests and responses. It also provides +/// a random generator that is consistent between recording and playback sessions. +/// +public class TestRecording : IAsyncDisposable +{ + /// + /// The key to use to store the random seed in the recording. + /// + public const string RandomSeedVariableKey = "RandomSeed"; + + private SortedDictionary _variables; + + /// + /// Creates a new instance. + /// + /// The unique identifier for the recording. + /// The current recording mode. + /// The test proxy service instance to use for the recording. + /// (Optional) Any variables to populate this recording with. This is normally used in + /// playback mode to pass in any variables saved as part of the recording. + /// Any of the required parameters are null. + /// Some expected values were missing or null. + /// The current recording mode is not supported. + public TestRecording(string id, RecordedTestMode mode, ProxyService proxy, IDictionary? variables = null) + { + ID = id ??
throw new ArgumentNullException(nameof(id)); + Mode = mode; + Proxy = proxy ?? throw new ArgumentNullException(nameof(proxy)); + _variables = variables == null + ? new() + : new(variables); + + if (Proxy.Client == null) + { + throw new InvalidOperationException("Recording test proxy did not have a client defined"); + } + + int seed; + switch (Mode) + { + case RecordedTestMode.Live: + Random = new TestRandom(Mode, GetRandomSeed()); + break; + + case RecordedTestMode.Record: + seed = GetRandomSeed(); + _variables[RandomSeedVariableKey] = seed.ToString(CultureInfo.InvariantCulture); + Random = new TestRandom(Mode, seed); + break; + + case RecordedTestMode.Playback: + if (Variables.TryGetValue(RandomSeedVariableKey, out string? seedString) + && int.TryParse(seedString, NumberStyles.Integer, CultureInfo.InvariantCulture, out seed)) + { + Random = new TestRandom(Mode, seed); + } + else + { + // To maximise backwards compatibility with the recordings from the previous test framework, we'll just use a random + // seed if one wasn't set instead of failing here. Worst case, we'll get recording mismatches if this is not configured + // correctly. + Random = new TestRandom(Mode, GetRandomSeed()); + } + break; + + default: + throw new NotSupportedException("Unsupported recording mode: " + Mode); + } + } + + /// + /// Gets the unique identifier for this recording. + /// + public string ID { get; } + + /// + /// Gets the current recording mode. + /// + public RecordedTestMode Mode { get; } + + /// + /// Gets the random generator to use for this recording. Using this ensures consistent random values generated during + /// recording, as well as during playback. + /// + public TestRandom Random { get; } + + /// + /// Gets the proxy service associated with the recording. + /// + protected internal ProxyService Proxy { get; } + + /// + /// Gets any variables associated with the recording. 
+ /// + protected IReadOnlyDictionary Variables => _variables; + + /// + /// Disposes of the recording session. If you were recording, this will try to save your captured requests and + /// responses. If you were playing back, this will stop the playback session. + /// + /// Asynchronous task + public virtual ValueTask DisposeAsync() => FinishAsync(true); + + /// + /// Finishes the recording session. This will stop recording or playback. If you were recording, you can use + /// to determine whether or not captured requests and responses will be saved. + /// + /// True to save any captured requests and responses to the file specified in your + /// . False to not save. This is only used if + /// you were recording. + /// The cancellation token to use. + /// Asynchronous task + /// If the recording mode is not supported. + public async virtual ValueTask FinishAsync(bool save, CancellationToken token = default) + { + switch (Mode) + { + case RecordedTestMode.Live: + // nothing to see here, move along + break; + case RecordedTestMode.Playback: + await Proxy.Client.StopPlaybackAsync(ID, token).ConfigureAwait(false); + break; + case RecordedTestMode.Record: + await Proxy.Client.StopRecordingAsync(ID, _variables, !save, token).ConfigureAwait(false); + break; + default: + throw new NotSupportedException("The following mode is not supported: " + Mode); + } + + Proxy.ThrowOnErrors(); + } + + /// + /// Gets a recorded variable. + /// + /// The name of the variable. + /// The variable value, or null if the variable was not set. + public virtual string? GetVariable(string name) + { + return _variables.GetValueOrDefault(name); + } + + /// + /// Sets a recorded variable to a value. + /// + /// The name of the variable. + /// The value to set. + public virtual void SetVariable(string name, string value) + { + _variables[name] = value; + } + + /// + /// Gets a recorded variable, or if it was not set, creates and adds a new variable. + /// + /// The name of the variable. 
+ /// The factory used to create a value if none was previously set. + /// The already existing value, or the newly added value. + public virtual string GetOrAddVariable(string name, Func valueFactory) + { + string? value; + if (!_variables.TryGetValue(name, out value) || value == null) + { + value = valueFactory(); + SetVariable(name, value); + } + + return value; + } + + /// + /// Gets the options to use as the options for creating transport to pass to clients. This will allow the clients to + /// forward requests to the test proxy. + /// + /// The options to use. + public virtual ProxyTransportOptions GetProxyTransportOptions() + { + return new() + { + HttpEndpoint = Proxy.HttpEndpoint, + HttpsEndpoint = Proxy.HttpsEndpoint, + Mode = Mode, + RecordingId = ID, + RequestId = Random.NewGuid().ToString() + }; + } + + /// + /// Applies recording options to the current recording. + /// + /// The recording options to apply for this recording/playback session. + /// The cancellation token to use. + /// Asynchronous task + public virtual async Task ApplyOptions(TestRecordingOptions options, CancellationToken token) + { + if (options.Sanitizers.Any()) + { + await Proxy.Client.AddSanitizersAsync(options.Sanitizers, ID, token).ConfigureAwait(false); + } + + if (options.SanitizersToRemove.Any()) + { + await Proxy.Client.RemoveSanitizersAsync(options.SanitizersToRemove, ID, token).ConfigureAwait(false); + } + + if (Mode == RecordedTestMode.Playback) + { + BaseMatcher matcher = options.Matcher ?? 
new CustomMatcher() + { + CompareBodies = options.CompareBodies, + ExcludedHeaders = options.ExcludedHeaders.JoinOrNull(","), + IgnoredHeaders = options.IgnoredHeaders.JoinOrNull(","), + IgnoredQueryParameters = options.IgnoredQueryParameters.JoinOrNull(","), + }; + + await Proxy.Client.SetMatcherAsync(matcher, ID, token).ConfigureAwait(false); + + foreach (var transform in options.Transforms) + { + await Proxy.Client.AddTransformAsync(transform, ID, token).ConfigureAwait(false); + } + } + } + + private static int GetRandomSeed() + { +#if NET6_0_OR_GREATER + return RandomNumberGenerator.GetInt32(int.MaxValue); +#else + byte[] bytes = new byte[4]; + using var rng = RandomNumberGenerator.Create(); + rng.GetBytes(bytes); + return BitConverter.ToInt32(bytes, 0); +#endif + } +} diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/TestRecordingMismatchException.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/TestRecordingMismatchException.cs new file mode 100644 index 000000000..3f6af0242 --- /dev/null +++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/TestRecordingMismatchException.cs @@ -0,0 +1,44 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +using System.Runtime.Serialization; + +namespace OpenAI.TestFramework.Recording; + +/// +/// Exception thrown when the test recording does not match during playback. +/// +[Serializable] +public class TestRecordingMismatchException : Exception +{ + /// + /// Creates a new instance + /// + public TestRecordingMismatchException() + { + } + + /// + /// Creates a new instance. + /// + /// The exception message. + public TestRecordingMismatchException(string message) : base(message) + { + } + + /// + /// Creates a new instance. + /// + /// The exception message. + /// The inner exception. + public TestRecordingMismatchException(string? message, Exception? 
innerException = null) : base(message, innerException) + { + } + +#if !NET8_0_OR_GREATER + /// + protected TestRecordingMismatchException(SerializationInfo info, StreamingContext context) : base(info, context) + { + } +#endif +} diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/TestRecordingOptions.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/TestRecordingOptions.cs new file mode 100644 index 000000000..de97b0d89 --- /dev/null +++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/TestRecordingOptions.cs @@ -0,0 +1,151 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +using System.ClientModel.Primitives; +using OpenAI.TestFramework.Recording.Matchers; +using OpenAI.TestFramework.Recording.RecordingProxy; +using OpenAI.TestFramework.Recording.Sanitizers; +using OpenAI.TestFramework.Recording.Transforms; +using OpenAI.TestFramework.Utils; + +namespace OpenAI.TestFramework.Recording; + +/// +/// Options to configure a test recording. This can be used to set sanitizers to apply to the URI, headers, and/or body of a request +/// before matching, and before saving the recording. This can also be used to specify which matcher will be used to match a request +/// to a recorded one during playback. Finally this can be used to set the transforms applied to responses from the test proxy. +/// +public class TestRecordingOptions +{ + /// + /// Creates a new instance + /// + public TestRecordingOptions() + { } + + /// + /// The list of sanitizers to apply to request before matching, and before saving a recording. + /// + public IList Sanitizers { get; } = new List(); + + /// + /// Gets or sets the matcher to use. If this is unset, a custom matcher will be created based on the options specified in this class. + /// + public BaseMatcher? Matcher { get; set; } + + /// + /// The list of transforms to apply when returning a response during playback. 
+ /// + public IList Transforms { get; } = new List(); + + /// + /// The sanitizers to remove from the list of default sanitizers. More details about default sanitizers can be found here: + /// https://github.com/Azure/azure-sdk-tools/blob/main/tools/test-proxy/Azure.Sdk.Tools.TestProxy/README.md#removing-a-sanitizer. + /// + /// You can find the list of sanitizer IDs to remove in two ways: + /// + /// Sending a GET request to http://{proxy_endpoint}/Info/Active + /// Looking at the source code for the test proxy here: + /// https://github.com/Azure/azure-sdk-tools/blob/main/tools/test-proxy/Azure.Sdk.Tools.TestProxy/Common/SanitizerDictionary.cs + /// + /// + public ISet SanitizersToRemove { get; } = new HashSet() + { + // For now, we should leave the default sanitizers in place since it is better to err on the side of caution + }; + + /// + /// Query parameters that we are only interested in checking if a value is set, but don't care about the actual value set. + /// + public ISet IgnoredQueryParameters { get; } = new HashSet(); + + /// + /// Headers that we are only interested in checking if a value is set, but don't care about the actual value set. + /// + public ISet IgnoredHeaders { get; } = new HashSet() + { + "Date", + "x-ms-date", + "User-Agent", + }; + + /// + /// Headers to completely disregard when recording and matching. In other words it is as if these headers were never set. + /// + public ISet ExcludedHeaders { get; } = new HashSet() + { +#if NETFRAMEWORK + // .Net framework will add some headers not found in newer .Net versions so let's completely ignore them here. It is also + // different in how it handles setting the Content-Length header when there is no body as compared to .Net + "Connection", + "Content-Length", +#endif + }; + + /// + /// Whether or not we want to compare bodies from the request and the recorded request during playback. Default + /// is true. 
+ /// + public bool CompareBodies { get; set; } = true; + + /// + /// A function used to override if recording is enabled for a particular request. This will override other settings present + /// here. + /// + public Func? RequestOverride { get; set; } + + /// + /// Helper method to simplify sanitizing specific header values. This will add a HeaderRegexSanitizer entry + /// to Sanitizers. The default replacement value will be set to Default.SanitizedValue. + /// + /// The keys to sanitize. + public void SanitizeHeaders(params string[] keys) + => SanitizeHeaders(Default.SanitizedValue, keys); + + /// + /// Helper method to simplify sanitizing specific header values. This will add a HeaderRegexSanitizer entry + /// to Sanitizers. + /// + /// The value to replace matches with. + /// The keys to sanitize. + public virtual void SanitizeHeaders(string sanitizedValue, IEnumerable keys) + { + if (keys == null) + { + return; + } + + foreach (var key in keys) + { + Sanitizers.Add(new HeaderRegexSanitizer(key) { Value = sanitizedValue }); + } + } + + /// + /// Helper method to sanitize specific parts of a JSON request body. This will add a BodyKeySanitizer entry + /// to Sanitizers for each JSON path provided. The default replacement value + /// will be set to Default.SanitizedValue. + /// + /// The JSON paths to sanitize. + public void SanitizeJsonBody(params string[] jsonPaths) + => SanitizeJsonBody(Default.SanitizedValue, jsonPaths); + + /// + /// Helper method to sanitize specific parts of a JSON request body. This will add a BodyKeySanitizer entry + /// to Sanitizers for each JSON path provided. + /// + /// The value to replace matches with. + /// The JSON paths to sanitize.
+ public virtual void SanitizeJsonBody(string sanitizedValue, IEnumerable jsonPaths) + { + if (jsonPaths == null) + { + return; + } + + foreach (var key in jsonPaths) + { + Sanitizers.Add(new BodyKeySanitizer(key) { Value = sanitizedValue }); + } + } +} diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Transforms/BaseTransform.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Transforms/BaseTransform.cs new file mode 100644 index 000000000..11be5e1be --- /dev/null +++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Transforms/BaseTransform.cs @@ -0,0 +1,37 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +using System.Text.Json; +using System.Text.Json.Serialization; +using OpenAI.TestFramework.Utils; + +namespace OpenAI.TestFramework.Recording.Transforms; + +/// +/// Base class for test recording proxy transforms. Transforms are applied when returning a response during playback. +/// +public abstract class BaseTransform : IUtf8JsonSerializable +{ + /// + /// Creates a new instance. + /// + /// The type of this transform (e.g. HeaderTransform). + /// If the type was null. + protected BaseTransform(string type) + { + Type = type ?? throw new ArgumentNullException(nameof(type)); + } + + /// + /// Gets the type of the transform (e.g. HeaderTransform). + /// + [JsonIgnore] + public string Type { get; } + + /// + public virtual void Write(Utf8JsonWriter writer, JsonSerializerOptions?
options = null) + { + // By default use reflection based serialization + JsonSerializer.Serialize(writer, this, GetType(), Default.InnerRecordingJsonOptions); + } +} diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Transforms/HeaderTransform.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Transforms/HeaderTransform.cs new file mode 100644 index 000000000..4817f84c5 --- /dev/null +++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Recording/Transforms/HeaderTransform.cs @@ -0,0 +1,35 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +namespace OpenAI.TestFramework.Recording.Transforms; + +/// +/// Transform applied to headers before the response is generated during recording playback. +/// +public class HeaderTransform : BaseTransform +{ + /// + /// Creates a new instance. + /// + /// The response header to set. + /// If the is null. + public HeaderTransform(string key) : base("HeaderTransform") + { + Key = key ?? throw new ArgumentNullException(nameof(key)); + } + + /// + /// Gets the header to transform. + /// + public string Key { get; } + + /// + /// Gets or sets the value to set. + /// + public string? Value { get; set; } + + /// + /// The condition to apply for this transform. If the condition is not met, no transform is performed. + /// + public Condition? Condition { get; set; } +} diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/SyncOnlyAttribute.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/SyncOnlyAttribute.cs new file mode 100644 index 000000000..2d00681f9 --- /dev/null +++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/SyncOnlyAttribute.cs @@ -0,0 +1,14 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +using NUnit.Framework; + +namespace OpenAI.TestFramework; + +/// +/// Attribute that can be applied to a test to indicate it only runs in synchronous mode. 
+/// +[AttributeUsage(AttributeTargets.Method, AllowMultiple = false, Inherited = true)] +public class SyncOnlyAttribute() : NUnitAttribute +{ +} diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/Utils/AndPreFilters.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Utils/AndPreFilters.cs new file mode 100644 index 000000000..714bb78e1 --- /dev/null +++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Utils/AndPreFilters.cs @@ -0,0 +1,37 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +using System.Reflection; +using NUnit.Framework.Interfaces; + +namespace OpenAI.TestFramework.Utils; + +/// +/// Represents a pre-filter that combines multiple pre-filters using a logical AND operation. +/// +public class AndPreFilter : IPreFilter +{ + private IEnumerable _filters; + + /// + /// Initializes a new instance. + /// + /// The pre-filters to combine. + public AndPreFilter(params IPreFilter[] filters) : this((IEnumerable)filters) + { } + + /// + /// Initializes a new instance. + /// + /// The pre-filters to combine. + public AndPreFilter(IEnumerable filters) + { + _filters = filters?.Where(p => p != null) ?? Array.Empty(); + } + + /// + public bool IsMatch(Type type) => _filters.All(p => p.IsMatch(type)); + + /// + public bool IsMatch(Type type, MethodInfo method) => _filters.All(p => p.IsMatch(type, method)); +} diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/Utils/AssemblyHelper.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Utils/AssemblyHelper.cs new file mode 100644 index 000000000..ae11a0eed --- /dev/null +++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Utils/AssemblyHelper.cs @@ -0,0 +1,100 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. 
+ +using System.Reflection; +using System.Runtime.InteropServices; + +namespace OpenAI.TestFramework.Utils +{ + /// + /// Assembly-related helper methods. + /// + public static class AssemblyHelper + { + /// + /// Gets the value of the named assembly metadata attribute for the assembly where the is defined. + /// + /// The type whose assembly we want to read from. + /// The name of the metadata assembly attribute to read. + /// The value of the metadata attribute, or null if none was specified or could be found. + public static string? GetAssemblyMetadata(string name) + => GetAssemblyMetadata(typeof(T).Assembly, name); + + /// + /// Gets the value of the named assembly metadata attribute from the assembly. + /// + /// The assembly to read the metadata attribute from. + /// The name of the metadata assembly attribute to read. + /// The value of the metadata attribute, or null if none was specified or could be found. + public static string? GetAssemblyMetadata(this Assembly assembly, string name) + { + return assembly + ?.GetCustomAttributes() + .FirstOrDefault(a => a.Key == name && !string.IsNullOrWhiteSpace(a.Value)) + ?.Value; + } + + /// + /// Gets the root source directory for the assembly that defines the type . + /// + /// The type whose assembly source path we want to read. + /// The directory containing the original source path, or null if it was not set or did not exist. + public static DirectoryInfo? GetAssemblySourceDir() + => GetAssemblySourceDir(typeof(T).Assembly); + + /// + /// Gets the source path for the assembly. In order for this to work, you will need to set the assembly metadata attribute in + /// your project file as follows: + /// + /// <ItemGroup> + /// <AssemblyAttribute Include="System.Reflection.AssemblyMetadataAttribute"> + /// <_Parameter1>SourcePath</_Parameter1> + /// <_Parameter2>$(MSBuildProjectDirectory)</_Parameter2> + /// </AssemblyAttribute> + /// </ItemGroup> + /// + /// + /// The assembly whose source path we want to find. 
+ /// The directory containing the original source path, or null if it was not set or did not exist. + public static DirectoryInfo? GetAssemblySourceDir(this Assembly assembly) + { + string? sourcePath = assembly.GetAssemblyMetadata("SourcePath"); + if (sourcePath == null) + { + return null; + } + + DirectoryInfo dir = new(sourcePath); + return dir.Exists + ? dir + : null; + } + + /// + /// Finds the dotnet executable path for the current system. It does this by reading the DOTNET_INSTALL_DIR environment variable + /// first, and then inspecting all folders in the current PATH environment variable. + /// + /// The path to the found dotnet executable, or null if none could be found. + public static FileInfo? GetDotnetExecutable() + { + string dotnetExeName = "dotnet"; + if (RuntimeInformation.IsOSPlatform(OSPlatform.Windows)) + { + dotnetExeName += ".exe"; + } + + List searchDirs = + [ + Environment.GetEnvironmentVariable("DOTNET_INSTALL_DIR"), + ..Environment.GetEnvironmentVariable("PATH") + ?.Split(Path.PathSeparator) + ?? Array.Empty() + ]; + + return searchDirs + .Where(dir => !string.IsNullOrWhiteSpace(dir)) + .Select(dir => new FileInfo(Path.Combine(dir!, dotnetExeName))) + .FirstOrDefault(file => file.Exists); + } + } +} diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/Utils/Default.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Utils/Default.cs new file mode 100644 index 000000000..5774a4774 --- /dev/null +++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Utils/Default.cs @@ -0,0 +1,108 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +using System; +using System.Text.Json; +using System.Text.Json.Serialization; + +namespace OpenAI.TestFramework.Utils; + +/// +/// Options used for various recordings. +/// +public static class Default +{ + private static JsonSerializerOptions? _recordingJsonOptions; + private static JsonSerializerOptions? 
 _innerRecordingJsonOptions; + private static JsonSerializerOptions? _testProxyJsonOptions; + private static TimeSpan? _testProxyWaitTime; + private static TimeSpan? _requestRetryDelay; + private static TimeSpan? _debuggerTestTimeout; + private static TimeSpan? _defaultTestTimeout; + + /// + /// Gets the default value to replace matches with while sanitizing. + /// + public const string SanitizedValue = "Sanitized"; + + /// + /// Gets the JSON serialization options to use for recording sanitizers, matchers, and transforms child instances. + /// + public static JsonSerializerOptions InnerRecordingJsonOptions => _innerRecordingJsonOptions ??= new() + { + PropertyNameCaseInsensitive = true, + PropertyNamingPolicy = JsonNamingPolicy.CamelCase, + WriteIndented = true, +#if NET + DefaultIgnoreCondition = JsonIgnoreCondition.WhenWritingNull, +#else + IgnoreNullValues = true, +#endif + }; + + /// + /// Gets the JSON serialization options to use for recording sanitizers, matchers, and transforms. + /// + public static JsonSerializerOptions RecordingJsonOptions + { + get + { + if (_recordingJsonOptions == null) + { + _recordingJsonOptions = InnerRecordingJsonOptions.Clone(); + _recordingJsonOptions.Converters.Add( + +#if NET6_0 + // .Net 6.0 seems to have a weird bug here. This is not needed for .Net framework, nor .Net 7+ + new Utf8JsonSerializableConverterFactory() +#else + new Utf8JsonSerializableConverter() +#endif + ); + } + + return _recordingJsonOptions; + } + } + + + /// + /// Gets the JSON serialization options to use for the test proxy. + /// + public static JsonSerializerOptions TestProxyJsonOptions => _testProxyJsonOptions ??= new() + { + PropertyNameCaseInsensitive = true, + WriteIndented = true, +#if NET + DefaultIgnoreCondition = JsonIgnoreCondition.WhenWritingNull, +#else + IgnoreNullValues = true, +#endif + }; + + /// + /// The default maximum amount of time to wait for the test proxy operations to finish (e.g. 
start up + /// and configuration, or saving a recording and teardown). + /// + public static TimeSpan TestProxyWaitTime => _testProxyWaitTime ??= TimeSpan.FromMinutes(2); + + /// + /// Gets the maximum number of times to retry requests + /// + public const int MaxRequestRetries = 3; + + /// + /// The amount of time to wait between requests. + /// + public static TimeSpan RequestRetryDelay => _requestRetryDelay ??= TimeSpan.FromSeconds(0.8); + + /// + /// The amount of time to wait when the debugger is attached. This is much higher than normal to allow for more time while debugging. + /// + public static TimeSpan DebuggerAttachedTestTimeout => _debuggerTestTimeout ??= TimeSpan.FromMinutes(15); + + /// + /// The default test timeout. + /// + public static TimeSpan TestTimeout => _defaultTestTimeout ??= TimeSpan.FromSeconds(15); +} diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/Utils/Extensions.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Utils/Extensions.cs new file mode 100644 index 000000000..437e8dc05 --- /dev/null +++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Utils/Extensions.cs @@ -0,0 +1,414 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Diagnostics; + +namespace OpenAI.TestFramework.Utils; + +/// +/// String related extension methods. +/// +public static class StringExtensions +{ + /// + /// Ensures that a string ends with a specified suffix. + /// + /// The string value. + /// The suffix to check for. + /// The string comparison type. Default is . + /// The original string if it ended in the suffix, or a new string value with the suffix appended. 
+ public static string EnsureEndsWith(this string value, string suffix, StringComparison comparison = StringComparison.Ordinal) + { + if (value == null) + { + return null!; + } + + if (value.EndsWith(suffix, comparison)) + { + return value; + } + + return value + suffix; + } + + /// + /// Ensures that a string ends with a specified suffix. + /// + /// The string value. + /// The suffix to check for. + /// The string comparison type. Default is . + /// The original string if it ended in the suffix, or a new string value with the suffix appended. + public static string EnsureEndsWith(this string value, char suffix, StringComparison comparison = StringComparison.Ordinal) + => EnsureEndsWith(value, suffix.ToString(), comparison); +} + +/// +/// Extension methods for System.ClientModel types. +/// +public static class ScmExtensions +{ + /// + /// Gets the first value associated with the specified header name from the pipeline request headers. + /// + /// The pipeline request headers. + /// The name of the header. + /// The first non-empty value associated with the specified header name, or null if the header is not found or has no non-empty values. + public static string? GetFirstOrDefault(this PipelineRequestHeaders headers, string name) + { + if (headers?.TryGetValues(name, out IEnumerable? values) == true) + { + return values?.FirstOrDefault(v => !string.IsNullOrWhiteSpace(v)); + } + + return null; + } + + /// + /// Gets the first value associated with the specified header name from the pipeline response headers. + /// + /// The pipeline response headers. + /// The name of the header. + /// The first non-empty value associated with the specified header name, or null if the header is not found or has no non-empty values. + public static string? GetFirstOrDefault(this PipelineResponseHeaders headers, string name) + { + if (headers?.TryGetValues(name, out IEnumerable? 
values) == true) + { + return values?.FirstOrDefault(v => !string.IsNullOrWhiteSpace(v)); + } + + return null; + } +} + +/// +/// Extensions for collections +/// +public static class CollectionExtensions +{ + /// + /// Adds the elements to a collection. + /// + /// The type of the elements in the collection. + /// The collection to add elements to. + /// The items to add. + public static void AddRange(this ICollection collection, IEnumerable itemsToAdd) + { + foreach (T item in itemsToAdd) + { + collection.Add(item); + } + } + + /// + /// Joins the elements of a collection into a single string using the specified separator. + /// Returns null if the collection is null or empty. + /// + /// The collection of strings to join. + /// The separator string. + /// A string that consists of the elements of the collection joined by the separator, or null if the collection is null or empty. + public static string? JoinOrNull(this IEnumerable values, string separator) + { + if (values == null || !values.Any()) + { + return null; + } + + return string.Join(separator, values); + } + +#if NETFRAMEWORK + /// + /// Gets the value associated with the specified key from the dictionary, or returns the default value if the key is not found. + /// + /// The type of the keys in the dictionary. + /// The type of the values in the dictionary. + /// The dictionary. + /// The key to locate. + /// The value associated with the specified key, or the default value if the key is not found. + public static TVal? GetValueOrDefault(this IReadOnlyDictionary dict, TKey key) + => GetValueOrDefault(dict, key, default!); + + /// + /// Gets the value associated with the specified key from the dictionary, or returns the specified default value if the key is not found. + /// + /// The type of the keys in the dictionary. + /// The type of the values in the dictionary. + /// The dictionary. + /// The key to locate. + /// The default value to return if the key is not found. 
+ /// The value associated with the specified key, or the specified default value if the key is not found. + public static TVal GetValueOrDefault(this IReadOnlyDictionary dict, TKey key, TVal defaultValue) + { + if (dict?.TryGetValue(key, out TVal? value) == true) + { + return value; + } + + return defaultValue; + } +#endif + + /// + /// Gets the value associated with the specified key from the dictionary, or returns the default value if the key is not found. + /// + /// The type of the keys in the dictionary. + /// The type of the values in the dictionary. + /// The dictionary. + /// The key to locate. + /// The value associated with the specified key, or the default value if the key is not found. + public static TVal? GetValueOrDefault(this Dictionary dict, TKey key) where TKey : notnull + => GetValueOrDefault((IDictionary)dict, key, default!); + + /// + /// Gets the value associated with the specified key from the dictionary, or returns the specified default value if the key is not found. + /// + /// The type of the keys in the dictionary. + /// The type of the values in the dictionary. + /// The dictionary. + /// The key to locate. + /// The default value to return if the key is not found. + /// The value associated with the specified key, or the specified default value if the key is not found. + public static TVal GetValueOrDefault(this Dictionary dict, TKey key, TVal defaultValue) where TKey : notnull + => GetValueOrDefault((IDictionary)dict, key, defaultValue); + + /// + /// Gets the value associated with the specified key from the sorted dictionary, or returns the default value if the key is not found. + /// + /// The type of the keys in the sorted dictionary. + /// The type of the values in the sorted dictionary. + /// The sorted dictionary. + /// The key to locate. + /// The value associated with the specified key, or the default value if the key is not found. + public static TVal? 
GetValueOrDefault(this SortedDictionary dict, TKey key) where TKey : notnull + => GetValueOrDefault((IDictionary)dict, key, default!); + + /// + /// Gets the value associated with the specified key from the sorted dictionary, or returns the specified default value if the key is not found. + /// + /// The type of the keys in the sorted dictionary. + /// The type of the values in the sorted dictionary. + /// The sorted dictionary. + /// The key to locate. + /// The default value to return if the key is not found. + /// The value associated with the specified key, or the specified default value if the key is not found. + public static TVal GetValueOrDefault(this SortedDictionary dict, TKey key, TVal defaultValue) where TKey : notnull + => GetValueOrDefault((IDictionary)dict, key, defaultValue); + + /// + /// Gets the value associated with the specified key from the dictionary, or returns the default value if the key is not found. + /// + /// The type of the keys in the dictionary. + /// The type of the values in the dictionary. + /// The dictionary. + /// The key to locate. + /// The value associated with the specified key, or the default value if the key is not found. + public static TVal? GetValueOrDefault(this IDictionary dict, TKey key) + => GetValueOrDefault(dict, key, default!); + + /// + /// Gets the value associated with the specified key from the dictionary, or returns the specified default value if the key is not found. + /// + /// The type of the keys in the dictionary. + /// The type of the values in the dictionary. + /// The dictionary. + /// The key to locate. + /// The default value to return if the key is not found. + /// The value associated with the specified key, or the specified default value if the key is not found. + public static TVal GetValueOrDefault(this IDictionary dict, TKey key, TVal defaultValue) + { + if (dict?.TryGetValue(key, out TVal? 
 value) == true) + { + return value; + } + + return defaultValue; + } + + /// + /// Gets the value associated with the specified key from the dictionary, or creates and adds a new value if the key did not exist. + /// + /// The type of the keys in the dictionary. + /// The type of the values in the dictionary. + /// The dictionary. + /// The key to locate. + /// The function used to create a value for the key if it is not found in the dictionary. + /// The value associated with the specified key, or the value created by the if the key is not found. + public static TValue GetOrAdd(this IDictionary dictionary, TKey key, Func valueFactory) + { + if (dictionary == null) + { + throw new ArgumentNullException(nameof(dictionary)); + } + + if (!dictionary.TryGetValue(key, out TValue? value)) + { + value = valueFactory(key); + dictionary[key] = value; + } + + return value!; + } + + /// + /// Asynchronously returns the first element of a sequence, or a default value if no element + /// is found. + /// + /// The type of the elements in the sequence. + /// The sequence to search. + /// A cancellation token to cancel the operation. + /// Asynchronous task. + public static ValueTask FirstOrDefaultAsync(this IAsyncEnumerable enumerable, CancellationToken token = default) + => FirstOrDefaultAsync(enumerable, _ => true, token); + + /// + /// Asynchronously returns the first element of a sequence that satisfies a specified condition or a default value if no such element + /// is found. + /// + /// The type of the elements in the sequence. + /// The sequence to search. + /// A function to test each element for a condition. + /// A cancellation token to cancel the operation. + /// Asynchronous task. 
+ public static async ValueTask FirstOrDefaultAsync(this IAsyncEnumerable enumerable, Predicate predicate, CancellationToken token = default) + { + await foreach (T item in enumerable.WithCancellation(token)) + { + if (predicate(item)) + { + return item; + } + } + + return default!; + } + + /// + /// Converts an to a asynchronously. + /// + /// The type of the elements in the enumerable. + /// The to convert. + /// The cancellation token. + /// Asynchronous task to do the conversion. + public static async Task> ToListAsync(this IAsyncEnumerable asyncEnumerable, CancellationToken token = default) + { + List list = new List(); + await foreach (T item in asyncEnumerable.WithCancellation(token)) + { + list.Add(item); + } + return list; + } + + /// + /// Converts an async enumerable of pages to a asynchronously. + /// + /// The type of the elements in the enumerable. + /// The to convert. + /// The cancellation token. + /// Asynchronous task to do the conversion. + public static async Task> ToListAsync(this IAsyncEnumerable> pageAsyncEnumerable, CancellationToken token = default) + { + List list = new List(); + await foreach(PageResult page in pageAsyncEnumerable.WithCancellation(token)) + { + list.AddRange(page.Values); + } + return list; + } +} + +/// +/// Helpers for working with paths. +/// +public static class PathHelpers +{ + /// + /// Create a relative path from one path to another. Paths will be resolved before calculating the difference. + /// + /// The source path the output should be relative to. This path is always considered to be a directory. + /// The destination path. + /// The relative path or if the paths don't share the same root. 
+ public static string GetRelativePath(string relativeTo, string path) + { + +#if NET + return Path.GetRelativePath(relativeTo, path); +#else + relativeTo = Path.GetFullPath(relativeTo).EnsureEndsWith(Path.DirectorySeparatorChar); + path = Path.GetFullPath(path).EnsureEndsWith(Path.DirectorySeparatorChar); + + Uri relativeToUri = new Uri(relativeTo); + Uri pathUri = new Uri(path); + + if (relativeToUri.Scheme != pathUri.Scheme) + { + return path; + } + + Uri relative = relativeToUri.MakeRelativeUri(pathUri); + return Uri.UnescapeDataString(relative.ToString()) + .Replace('/', '\\'); +#endif + } +} + + +/// +/// Extensions for types. +/// +public static class TypeExtensions +{ + /// + /// Determines whether the specified type either implements the open generic type specified, + /// or inherits from the open generic type specified. + /// + /// The type to inspect. + /// The open generic type. + /// The arguments of the closed generic type. + /// True if the type implements, or inherits, or is a closed version of the open type. + [DebuggerStepThrough] + public static bool IsClosedGenericOf(this Type type, Type openGeneric, out Type[] closedTypeArguments) + { + Type? closedType = null; + + if (openGeneric.IsInterface) + { + closedType = type.GetInterfaces() + .FirstOrDefault(iType => IsAssignableToOpen(iType, openGeneric)); + } + + if (closedType == null) + { + for (Type? current = type; current != null && closedType == null; current = current.BaseType) + { + if (IsAssignableToOpen(current, openGeneric)) + { + closedType = current; + } + } + } + + closedTypeArguments = closedType?.GetGenericArguments() ?? Array.Empty(); + return closedType != null; + } + + /// + /// Determines if the type is or inherits from the open generic type. + /// + /// The type. + /// The open generic type. + /// True if the open generic type could be assigned from the type. 
+ [DebuggerStepThrough] + public static bool IsAssignableToOpen(this Type type, Type openGeneric) + { + if (!type.IsGenericType || !type.IsConstructedGenericType) + { + return false; + } + + return openGeneric.IsAssignableFrom(type.GetGenericTypeDefinition()); + } +} diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/Utils/IUtf8JsonSerializable.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Utils/IUtf8JsonSerializable.cs new file mode 100644 index 000000000..7b81c3532 --- /dev/null +++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Utils/IUtf8JsonSerializable.cs @@ -0,0 +1,19 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +using System.Text.Json; + +namespace OpenAI.TestFramework.Utils; + +/// +/// Interface applied to types that can be serialized to JSON. +/// +public interface IUtf8JsonSerializable +{ + /// + /// Writes this instance as JSON to the writer. + /// + /// The writer to write to. + /// The options to use when writing. + void Write(Utf8JsonWriter writer, JsonSerializerOptions? options = null); +} diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/Utils/JsonHelpers.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Utils/JsonHelpers.cs new file mode 100644 index 000000000..ed6752ca4 --- /dev/null +++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Utils/JsonHelpers.cs @@ -0,0 +1,143 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +using System.Text.Json; +using System.Text.Json.Serialization; + +namespace OpenAI.TestFramework.Utils; + +public static class JsonHelpers +{ + /// + /// Serializes the specified data to a stream using as a UTF-8 encoded JSON text. + /// + /// The type of the data to serialize. + /// The stream to write the serialized data to. + /// The data to serialize. + /// (Optional) Options to use when serializing. + public static void Serialize(Stream stream, T data, JsonSerializerOptions? 
options = null) + { +#if NETFRAMEWORK + using (Utf8JsonWriter writer = new(stream)) + { + JsonSerializer.Serialize(writer, data, options); + writer.Flush(); + } +#else + JsonSerializer.Serialize(stream, data, options); +#endif + } + + /// + /// Deserializes UTF-8 encoded JSON text from a stream. + /// + /// The type of the data to deserialize. + /// The stream to read the serialized data from. + /// (Optional) Options to use when deserializing. + /// The deserialized data. + public static T? Deserialize(Stream stream, JsonSerializerOptions? options = null) + { +#if NETFRAMEWORK + // For now let's keep it simple and load entire JSON bytes into memory + using MemoryStream buffer = new(); + stream.CopyTo(buffer); + + ReadOnlySpan jsonBytes = buffer.GetBuffer().AsSpan(0, (int)buffer.Length); + return JsonSerializer.Deserialize(jsonBytes, options); +#else + return JsonSerializer.Deserialize(stream, options); +#endif + } + +#if NET6_0_OR_GREATER + // .Net 6 and newer already have the extension method we need defined in JsonSerializer +#else + // TODO FIXME once we move to newer versions of System.Text.Json we can directly use the + // JsonSerializer extension method for elements + public static T? Deserialize(this JsonElement element, JsonSerializerOptions? options = null) + { + using MemoryStream stream = new(); + using Utf8JsonWriter writer = new(stream, new() + { + Encoder = System.Text.Encodings.Web.JavaScriptEncoder.UnsafeRelaxedJsonEscaping, + Indented = false, + SkipValidation = true + }); + element.WriteTo(writer); + writer.Flush(); + + stream.Seek(0, SeekOrigin.Begin); + if (((ulong)stream.Length & 0xffffffff00000000) != 0ul) + { + throw new ArgumentOutOfRangeException("JsonElement is too large"); + } + + ReadOnlySpan span = new(stream.GetBuffer(), 0, (int)stream.Length); + return JsonSerializer.Deserialize(span, options); + } +#endif + + /// + /// Serializes a value to a JsonElement. + /// + /// Type of the data to serialize. + /// The value to serialize. 
+ /// (Optional) Options to use when serializing. + /// The serialized value as a JsonElement. + public static JsonElement SerializeToElement(T value, JsonSerializerOptions? options = null) + { +#if NET6_0_OR_GREATER + return JsonSerializer.SerializeToElement(value, options); +#else + using MemoryStream stream = new(); + Serialize(stream, value, options); + stream.Seek(0, SeekOrigin.Begin); + return JsonDocument.Parse(stream).RootElement; +#endif + } + + /// + /// Creates a clone of the specified JSON serializer options. + /// + /// The JSON serializer options to clone. + /// (Optional) Filter to apply for selecting specific converters to include in the cloned options. + /// A clone of the JSON serializer options. + public static JsonSerializerOptions Clone(this JsonSerializerOptions options, Predicate? converterFilter = null) + { +#if NET + JsonSerializerOptions cloned = new JsonSerializerOptions(options); + if (converterFilter != null) + { + cloned.Converters.Clear(); + foreach (var converter in options.Converters.Where(c => converterFilter(c))) + { + cloned.Converters.Add(converter); + } + } + + return cloned; +#else + JsonSerializerOptions clone = new() + { + AllowTrailingCommas = options.AllowTrailingCommas, + DefaultBufferSize = options.DefaultBufferSize, + DictionaryKeyPolicy = options.DictionaryKeyPolicy, + Encoder = options.Encoder, + IgnoreNullValues = options.IgnoreNullValues, + IgnoreReadOnlyProperties = options.IgnoreReadOnlyProperties, + MaxDepth = options.MaxDepth, + PropertyNameCaseInsensitive = options.PropertyNameCaseInsensitive, + PropertyNamingPolicy = options.PropertyNamingPolicy, + ReadCommentHandling = options.ReadCommentHandling, + WriteIndented = options.WriteIndented, + }; + + foreach (var converter in options.Converters.Where(c => converterFilter?.Invoke(c) ?? 
true)) + { + clone.Converters.Add(converter); + } + + return clone; +#endif + } +} diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/Utils/PropertyDelegate.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Utils/PropertyDelegate.cs new file mode 100644 index 000000000..3c2f467cf --- /dev/null +++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Utils/PropertyDelegate.cs @@ -0,0 +1,49 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +namespace OpenAI.TestFramework.Utils; + +/// +/// Represents a delegate for getting and setting property values. +/// +/// The type of the property value. +public struct PropertyDelegate +{ + private Func? _getter; + private Action? _setter; + + /// + /// Initializes a new instance of the struct. + /// + /// The delegate used to get the property value. + /// The delegate used to set the property value. + public PropertyDelegate(Func getter, Action setter) + { + _getter = getter ?? throw new ArgumentNullException(nameof(getter)); + _setter = setter ?? throw new ArgumentNullException(nameof(setter)); + } + + /// + /// Gets the value of the property. + /// + /// The value of the property. + public TVal GetValue() + { + if (_getter != null) + return _getter(); + else + throw new InvalidOperationException("No getter was set"); + } + + /// + /// Sets the value of the property. + /// + /// The value to set. + public void SetValue(TVal val) + { + if (_setter != null) + _setter(val); + else + throw new InvalidOperationException("No setter was set"); + } +} diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/Utils/SyncAsyncPreFilter.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Utils/SyncAsyncPreFilter.cs new file mode 100644 index 000000000..a2c32fc1a --- /dev/null +++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Utils/SyncAsyncPreFilter.cs @@ -0,0 +1,41 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. 
+// Licensed under the MIT License. + +using System.Reflection; +using NUnit.Framework.Interfaces; + +namespace OpenAI.TestFramework.Utils +{ + /// + /// Filter to exclude sync-only or async-only tests in the appropriate test run. + /// + public class SyncAsyncPreFilter : IPreFilter + { + private bool _isAsync; + + /// + /// Creates a new instance. + /// + /// True to filter for an async test run, false to filter for a sync test run. + public SyncAsyncPreFilter(bool isAsync) + { + _isAsync = isAsync; + } + + /// + public bool IsMatch(Type type) + => type.GetCustomAttribute() != null; + + /// + public bool IsMatch(Type type, MethodInfo method) + { + if (!IsMatch(type)) + { + return false; + } + + return _isAsync && method.GetCustomAttribute() == null + || !_isAsync && method.GetCustomAttribute() == null; + } + } +} diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/Utils/TestClientRetryPolicy.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Utils/TestClientRetryPolicy.cs new file mode 100644 index 000000000..517ff5576 --- /dev/null +++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Utils/TestClientRetryPolicy.cs @@ -0,0 +1,84 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +using System.ClientModel.Primitives; +using System.Diagnostics; +using System.Reflection; + +namespace OpenAI.TestFramework.Utils; + +/// +/// Represents a retry policy to be used when testing clients. +/// +public class TestClientRetryPolicy : ClientRetryPolicy +{ + private Func _getRetries; + + /// + /// Initializes a new instance of the class. + /// + /// The maximum number of retries. + /// The delay between retries. + /// Indicates whether the delay should be exponential. + public TestClientRetryPolicy(int maxRetries = Utils.Default.MaxRequestRetries, TimeSpan? delay = null, bool exponentialDelay = false) + : base(maxRetries) + { + MaxRetries = maxRetries; + Delay = delay ?? 
Utils.Default.RequestRetryDelay; + IsExponentialDelay = exponentialDelay; + + // Of course, even reading the number of retries property on the PipelineMessage is internal only. + // So reflection it is + _getRetries = (Func) + (typeof(PipelineMessage).GetProperty("RetryCount", BindingFlags.NonPublic | BindingFlags.Public | BindingFlags.Instance) + ?.GetGetMethod(true) + ?.CreateDelegate(typeof(Func)) + ?? throw new InvalidOperationException("Failed to get RetryCount property")); + } + + /// + /// Gets the maximum number of retries. + /// + public int MaxRetries { get; } + + /// + /// Gets the delay between retries. + /// + public TimeSpan Delay { get; } + + /// + /// Gets a value indicating whether the delay should be exponential. + /// + public bool IsExponentialDelay { get; } + + /// + protected override TimeSpan GetNextDelay(PipelineMessage message, int tryCount) + { + TimeSpan delay = IsExponentialDelay + ? TimeSpan.FromMilliseconds((1 << tryCount - 1) * Delay.TotalMilliseconds) + : Delay; + + return delay; + } + + /// + protected override bool ShouldRetry(PipelineMessage message, Exception? exception) + { + if (_getRetries(message) >= MaxRetries) + { + return false; + } + + if (!message.ResponseClassifier.TryClassify(message, exception, out bool isRetriable) + && !PipelineMessageClassifier.Default.TryClassify(message, exception, out isRetriable)) + { + Debug.Assert(false, "Failed to classify message"); + } + + return isRetriable; + } + + /// + protected override ValueTask ShouldRetryAsync(PipelineMessage message, Exception? exception) + => new ValueTask(ShouldRetry(message, exception)); +} diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/Utils/TestPipelinePolicy.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Utils/TestPipelinePolicy.cs new file mode 100644 index 000000000..bc004f1bc --- /dev/null +++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Utils/TestPipelinePolicy.cs @@ -0,0 +1,57 @@ +// Copyright (c) Microsoft Corporation. 
All rights reserved. +// Licensed under the MIT License. + +using System.ClientModel.Primitives; + +namespace OpenAI.TestFramework.Utils; + +/// +/// A pipeline policy that raises events before a request sent, and after response has been received. +/// +public class TestPipelinePolicy() : PipelinePolicy() +{ + /// + /// Creates a new instance. This will instantiate the and + /// events based on and respectively. + /// + /// (Optional) Action to perform before sending a request. + /// (Optional) Action to perform after a response is received. + public TestPipelinePolicy(Action? requestAction, Action? responseAction) : this() + { + if (requestAction != null) BeforeRequest += (s, e) => requestAction(e); + + if (responseAction != null) AfterResponse += (s, e) => responseAction(e); + } + + /// + /// Event raised before a request is sent. + /// + public event EventHandler? BeforeRequest; + + /// + /// Event raised after a response has been received. + /// + public event EventHandler? AfterResponse; + + /// + public override void Process(PipelineMessage message, IReadOnlyList pipeline, int currentIndex) + { + BeforeRequest?.Invoke(this, message.Request); + ProcessNext(message, pipeline, currentIndex); + if (message.Response != null) + { + AfterResponse?.Invoke(this, message.Response); + } + } + + /// + public override async ValueTask ProcessAsync(PipelineMessage message, IReadOnlyList pipeline, int currentIndex) + { + BeforeRequest?.Invoke(this, message.Request); + await ProcessNextAsync(message, pipeline, currentIndex).ConfigureAwait(false); + if (message.Response != null) + { + AfterResponse?.Invoke(this, message.Response); + } + } +} diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/Utils/TimespanToMillisecondConverter.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Utils/TimespanToMillisecondConverter.cs new file mode 100644 index 000000000..7045f956b --- /dev/null +++ 
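The exponential branch of `TestClientRetryPolicy.GetNextDelay` above scales the base delay by `2^(tryCount - 1)`. A minimal standalone sketch of that arithmetic (the `BackoffSketch` and `NextDelay` names here are illustrative, not part of the framework):

```csharp
using System;

public static class BackoffSketch
{
    // Mirrors the exponential branch of GetNextDelay:
    // delay = 2^(tryCount - 1) * baseDelay
    public static TimeSpan NextDelay(TimeSpan baseDelay, int tryCount)
        => TimeSpan.FromMilliseconds((1 << (tryCount - 1)) * baseDelay.TotalMilliseconds);

    public static void Main()
    {
        TimeSpan baseDelay = TimeSpan.FromMilliseconds(500);
        for (int tryCount = 1; tryCount <= 4; tryCount++)
        {
            Console.WriteLine(NextDelay(baseDelay, tryCount).TotalMilliseconds);
        }
    }
}
```

With a 500 ms base delay this yields 500, 1000, 2000, and 4000 ms for attempts 1 through 4. Note that in C#, `1 << tryCount - 1` already parses as `1 << (tryCount - 1)` because the shift operator binds more loosely than subtraction; the parentheses just make that explicit.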
b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Utils/TimespanToMillisecondConverter.cs
@@ -0,0 +1,65 @@
+// Copyright (c) Microsoft Corporation. All rights reserved.
+// Licensed under the MIT License.
+
+using System.Globalization;
+using System.Text.Json;
+using System.Text.Json.Serialization;
+
+namespace OpenAI.TestFramework.Utils;
+
+/// <summary>
+/// Converter for TimeSpans to/from integer millisecond values in JSON.
+/// </summary>
+public class TimespanToMillisecondConverter : JsonConverter<TimeSpan?>
+{
+    /// <summary>
+    /// Reads a value from JSON.
+    /// </summary>
+    /// <param name="reader">The <see cref="Utf8JsonReader"/> to read from.</param>
+    /// <param name="typeToConvert">The type of the object to convert.</param>
+    /// <param name="options">The serializer options.</param>
+    /// <returns>The deserialized value.</returns>
+    public override TimeSpan? Read(ref Utf8JsonReader reader, Type typeToConvert, JsonSerializerOptions options)
+    {
+        switch (reader.TokenType)
+        {
+            case JsonTokenType.Null:
+                return null;
+
+            case JsonTokenType.Number:
+                return TimeSpan.FromMilliseconds(reader.GetInt32());
+
+            case JsonTokenType.String:
+                string? strValue = reader.GetString();
+                if (int.TryParse(strValue, NumberStyles.Integer, CultureInfo.InvariantCulture, out int milliseconds))
+                {
+                    return TimeSpan.FromMilliseconds(milliseconds);
+                }
+                else
+                {
+                    throw new JsonException("Invalid millisecond value: " + strValue);
+                }
+
+            default:
+                throw new JsonException($"Don't know how to parse '{reader.TokenType}' as a millisecond value");
+        }
+    }
+
+    /// <summary>
+    /// Writes a value to JSON.
+    /// </summary>
+    /// <param name="writer">The <see cref="Utf8JsonWriter"/> to write to.</param>
+    /// <param name="value">The value to write.</param>
+    /// <param name="options">The serializer options.</param>
+    public override void Write(Utf8JsonWriter writer, TimeSpan? value, JsonSerializerOptions options)
+    {
+        if (value == null)
+        {
+            writer.WriteNullValue();
+        }
+        else
+        {
+            writer.WriteNumberValue((int)Math.Ceiling(value.Value.TotalMilliseconds));
+        }
+    }
+}
diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/Utils/Utf8JsonSerializableConverter.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Utils/Utf8JsonSerializableConverter.cs
new file mode 100644
index 000000000..3613f66c0
--- /dev/null
+++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Utils/Utf8JsonSerializableConverter.cs
@@ -0,0 +1,55 @@
+// Copyright (c) Microsoft Corporation. All rights reserved.
+// Licensed under the MIT License.
+
+using System.Text.Json;
+using System.Text.Json.Serialization;
+
+namespace OpenAI.TestFramework.Utils;
+
+/// <summary>
+/// Converter for types that implement <see cref="IUtf8JsonSerializable"/>.
+/// </summary>
+public class Utf8JsonSerializableConverter : JsonConverter<IUtf8JsonSerializable>
+{
+    private static Utf8JsonSerializableConverter? s_instance;
+
+    /// <summary>
+    /// Gets the shared instance of the converter.
+    /// </summary>
+    public static Utf8JsonSerializableConverter Instance => s_instance ??= new();
+
+    /// <inheritdoc />
+    public override bool CanConvert(Type typeToConvert)
+        => typeof(IUtf8JsonSerializable).IsAssignableFrom(typeToConvert);
+
+    /// <inheritdoc />
+    public override IUtf8JsonSerializable Read(ref Utf8JsonReader reader, Type typeToConvert, JsonSerializerOptions options)
+        => throw new NotSupportedException("Only writing JSON is supported");
+
+    /// <inheritdoc />
+    public override void Write(Utf8JsonWriter writer, IUtf8JsonSerializable value, JsonSerializerOptions options)
+        => value.Write(writer);
+}
+
+#if NET6_0
+///
+/// .NET 6.0 has some odd quirks and is particularly pedantic with converters, so directly using Utf8JsonSerializableConverter would
+/// result in an InvalidCastException. The workaround is to use a converter factory. Thankfully, neither .NET Framework nor .NET 7+
+/// exhibit this behavior.
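The `TimespanToMillisecondConverter` above writes a nullable `TimeSpan` as a whole number of milliseconds and accepts either a number or a numeric string on the way back in. A quick round-trip sketch with `System.Text.Json` (the `Options` record and the condensed inline copy of the converter are illustrative; the real converter lives in the diff above):

```csharp
using System;
using System.Text.Json;
using System.Text.Json.Serialization;

// Condensed copy of the diff's TimespanToMillisecondConverter so this sketch is self-contained.
public class TimespanToMillisecondConverter : JsonConverter<TimeSpan?>
{
    public override TimeSpan? Read(ref Utf8JsonReader reader, Type typeToConvert, JsonSerializerOptions options) =>
        reader.TokenType switch
        {
            JsonTokenType.Null => null,
            JsonTokenType.Number => TimeSpan.FromMilliseconds(reader.GetInt32()),
            JsonTokenType.String when int.TryParse(reader.GetString(), out int ms) => TimeSpan.FromMilliseconds(ms),
            _ => throw new JsonException($"Unexpected token: {reader.TokenType}")
        };

    public override void Write(Utf8JsonWriter writer, TimeSpan? value, JsonSerializerOptions options)
    {
        if (value is null) writer.WriteNullValue();
        else writer.WriteNumberValue((int)Math.Ceiling(value.Value.TotalMilliseconds));
    }
}

public static class ConverterDemo
{
    // Hypothetical example type; not part of the test framework.
    public record Options(TimeSpan? Timeout);

    public static string Serialize()
    {
        var jsonOptions = new JsonSerializerOptions();
        jsonOptions.Converters.Add(new TimespanToMillisecondConverter());
        // TimeSpan is written as an integer millisecond value.
        return JsonSerializer.Serialize(new Options(TimeSpan.FromSeconds(1.5)), jsonOptions);
    }

    public static void Main() => Console.WriteLine(Serialize()); // prints {"Timeout":1500}
}
```

Writing milliseconds as a plain integer keeps recorded test payloads stable and human-readable, at the cost of sub-millisecond precision (note the `Math.Ceiling` when writing).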
+/// +public class Utf8JsonSerializableConverterFactory : JsonConverterFactory +{ + public override bool CanConvert(Type typeToConvert) => typeof(IUtf8JsonSerializable).IsAssignableFrom(typeToConvert); + public override JsonConverter? CreateConverter(Type typeToConvert, JsonSerializerOptions options) + => (JsonConverter?)Activator.CreateInstance(typeof(InnerConverter<>).MakeGenericType(typeToConvert)); + + private class InnerConverter : JsonConverter where T : IUtf8JsonSerializable + { + public override T Read(ref Utf8JsonReader reader, Type typeToConvert, JsonSerializerOptions options) + => (T)Utf8JsonSerializableConverter.Instance.Read(ref reader, typeToConvert, options); + + public override void Write(Utf8JsonWriter writer, T value, JsonSerializerOptions options) + => Utf8JsonSerializableConverter.Instance.Write(writer, value, options); + } +} +#endif diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/src/Utils/WindowsJob.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Utils/WindowsJob.cs new file mode 100644 index 000000000..a5eb2570b --- /dev/null +++ b/.dotnet.azure/sdk/openai/tools/TestFramework/src/Utils/WindowsJob.cs @@ -0,0 +1,208 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +using System.Diagnostics; +using System.Runtime.ConstrainedExecution; +using System.Runtime.InteropServices; +using System.Security; + +namespace OpenAI.TestFramework.Utils.Processes; + +/// +/// A job provides a way to link several processes together on Windows. In this way, they can all be +/// terminated by calling the method. The OS will also automatically terminate +/// the linked processes if the owner process terminates. +/// +public class WindowsJob : IDisposable +{ + private IntPtr _jobHandle; + private int _disposed; + + /// + /// Creates a new job + /// + /// (Optional) The name to associate + public WindowsJob(string? 
name = null) + { + if (!RuntimeInformation.IsOSPlatform(OSPlatform.Windows)) + { + throw new NotSupportedException("This is only supported on Windows platforms"); + } + + var securityAttributes = new SECURITY_ATTRIBUTES() + { + nLength = (uint)Marshal.SizeOf(typeof(SECURITY_ATTRIBUTES)), + lpSecurityDescriptor = IntPtr.Zero, + bInheritHandle = false + }; + + // Create the job handle + _jobHandle = CreateJobObject(ref securityAttributes, name); + if (_jobHandle == IntPtr.Zero) + { + throw new COMException("Failed to create job", Marshal.GetLastWin32Error()); + } + + // Set the job state so that all associated handles are closed + var extendedInfo = new JOBOBJECT_EXTENDED_LIMIT_INFORMATION() + { + BasicLimitInformation = new JOBOBJECT_BASIC_LIMIT_INFORMATION() + { + LimitFlags = JobObjectLimits.LIMIT_KILL_ON_JOB_CLOSE + } + }; + + int length = Marshal.SizeOf(typeof(JOBOBJECT_EXTENDED_LIMIT_INFORMATION)); + IntPtr ptr = IntPtr.Zero; + try + { + ptr = Marshal.AllocHGlobal(length); + Marshal.StructureToPtr(extendedInfo, ptr, false); + + bool success = SetInformationJobObject( + _jobHandle, + JOBOBJECTINFOCLASS.JobObjectExtendedLimitInformation, + ptr, + (uint)length); + + if (!success) + { + throw new COMException("Failed to set the job extended information", Marshal.GetLastWin32Error()); + } + } + finally + { + Marshal.FreeHGlobal(ptr); + } + } + + /// + /// Adds a process to the job + /// + /// The process to add + public void Add(Process process) + { + if (process == null) + { + throw new ArgumentNullException(nameof(process)); + } + else if (process.Handle == IntPtr.Zero) + { + throw new ArgumentException("The specified process has a NULL handle"); + } + + bool success = AssignProcessToJobObject(_jobHandle, process.Handle); + if (!success) + { + throw new COMException("Failed to add the process to the job", Marshal.GetLastWin32Error()); + } + } + + /// + /// Closes the job. 
This will close all linked processes + /// + public void Close() + { + CloseHandle(_jobHandle); + _jobHandle = IntPtr.Zero; + } + + /// + /// Disposes of the job. This will also close all linked process. + /// + public void Dispose() + { + if (Interlocked.Exchange(ref _disposed, 1) == 0) + { + Close(); + } + } + + #region native methods + + [DllImport("kernel32.dll", CharSet = CharSet.Auto, SetLastError = true)] + internal static extern IntPtr CreateJobObject([In] ref SECURITY_ATTRIBUTES lpJobAttributes, string? lpName); + + [DllImport("kernel32.dll", CharSet = CharSet.Auto, SetLastError = true)] + internal static extern IntPtr OpenJobObject(uint dwDesiredAccess, bool bInheritHandles, string lpName); + + [DllImport("kernel32.dll", SetLastError = true)] + [return: MarshalAs(UnmanagedType.Bool)] + internal static extern bool AssignProcessToJobObject(IntPtr hJob, IntPtr hProcess); + + [DllImport("kernel32.dll", SetLastError = true)] + [return: MarshalAs(UnmanagedType.Bool)] + internal static extern bool SetInformationJobObject( + [In] IntPtr hJob, + JOBOBJECTINFOCLASS JobObjectInfoClass, + [In] IntPtr lpJobObjectInfo, + uint cbJobObjectInfoLength); + + [DllImport("kernel32.dll", SetLastError = true)] +#if NETFRAMEWORK + [ReliabilityContract(Consistency.WillNotCorruptState, Cer.Success)] +#endif + [SuppressUnmanagedCodeSecurity] + [return: MarshalAs(UnmanagedType.Bool)] + internal static extern bool CloseHandle(IntPtr hObject); + +#endregion + + #region native types + + [StructLayout(LayoutKind.Sequential)] + internal struct SECURITY_ATTRIBUTES + { + public uint nLength; + public IntPtr lpSecurityDescriptor; + public bool bInheritHandle; + } + + [StructLayout(LayoutKind.Sequential)] + internal struct JOBOBJECT_BASIC_LIMIT_INFORMATION + { + public Int64 PerProcessUserTimeLimit; + public Int64 PerJobUserTimeLimit; + public JobObjectLimits LimitFlags; + public UIntPtr MinimumWorkingSetSize; + public UIntPtr MaximumWorkingSetSize; + public UInt32 ActiveProcessLimit; + 
public UIntPtr Affinity; + public UInt32 PriorityClass; + public UInt32 SchedulingClass; + } + + [StructLayout(LayoutKind.Sequential)] + internal struct JOBOBJECT_EXTENDED_LIMIT_INFORMATION + { + public JOBOBJECT_BASIC_LIMIT_INFORMATION BasicLimitInformation; + public IO_COUNTERS IoInfo; + public UIntPtr ProcessMemoryLimit; + public UIntPtr JobMemoryLimit; + public UIntPtr PeakProcessMemoryUsed; + public UIntPtr PeakJobMemoryUsed; + } + + [StructLayout(LayoutKind.Sequential)] + internal struct IO_COUNTERS + { + public UInt64 ReadOperationCount; + public UInt64 WriteOperationCount; + public UInt64 OtherOperationCount; + public UInt64 ReadTransferCount; + public UInt64 WriteTransferCount; + public UInt64 OtherTransferCount; + } + + internal enum JOBOBJECTINFOCLASS + { + JobObjectExtendedLimitInformation = 9, + } + + internal enum JobObjectLimits : UInt32 + { + LIMIT_KILL_ON_JOB_CLOSE = 0x00002000, + } +} + +#endregion diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/tests/AdaptersTests.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/tests/AdaptersTests.cs new file mode 100644 index 000000000..266d388a3 --- /dev/null +++ b/.dotnet.azure/sdk/openai/tools/TestFramework/tests/AdaptersTests.cs @@ -0,0 +1,107 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +using System.ClientModel; +using System.Diagnostics; +using NUnit.Framework; +using OpenAI.TestFramework.Adapters; +using OpenAI.TestFramework.Mocks; + +namespace OpenAI.TestFramework.Tests; + +[TestFixture] +public class AdaptersTests +{ + public CancellationToken Token => + new CancellationTokenSource(Debugger.IsAttached + ? 
TimeSpan.FromMinutes(15) + : TimeSpan.FromSeconds(5)) + .Token; + + [Test] + public async Task TestSyncToAsyncEnumerator() + { + const int start = 0; + const int num = 100; + + IEnumerator sync = Enumerable.Range(start, num).GetEnumerator(); + await using SyncToAsyncEnumerator async = new(sync, Token); + + for (int i = start; i < num; i++) + { + bool success = await async.MoveNextAsync(); + Assert.That(success, Is.True); + Assert.That(async.Current, Is.EqualTo(i)); + } + } + + [Test] + public async Task TestSyncToAsyncResultCollection() + { + const int start = 0; + const int num = 100; + + MockCollectionResult sync = new(() => Enumerable.Range(start, num)); + SyncToAsyncCollectionResult asyncAdapter = new(sync); + + await using var asyncEnumerator = asyncAdapter.GetAsyncEnumerator(Token); + + for (int i = start; i < num; i++) + { + bool success = await asyncEnumerator.MoveNextAsync(); + Assert.That(success, Is.True); + Assert.That(asyncEnumerator.Current, Is.EqualTo(i)); + } + } + + [Test] + public async Task TestFailedSyncToAsyncResultCollection() + { + MockCollectionResult sync = new(Fail); + SyncToAsyncCollectionResult asyncAdapter = new(sync); + + await using var asyncEnumerator = asyncAdapter.GetAsyncEnumerator(Token); + Assert.ThrowsAsync(() => asyncEnumerator.MoveNextAsync().AsTask()); + } + + [Test] + public async Task TestSyncToAsyncPageableCollection() + { + const int start = 0; + const int num = 100; + const int itemsPerPage = 10; + int expectedPages = (int)Math.Ceiling((double)num / itemsPerPage); + + MockPageCollection sync = new(() => Enumerable.Range(start, num), new MockPipelineResponse(), itemsPerPage); + SyncToAsyncPageCollection asyncAdapter = new(sync); + + int numPages = 0; + int expected = 0; + await foreach (var page in asyncAdapter) + { + numPages++; + foreach (int actual in page.Values) + { + Assert.That(actual, Is.EqualTo(expected)); + expected++; + } + } + + Assert.That(numPages, Is.EqualTo(expectedPages)); + } + + [Test] + public async 
Task TestFailedSyncToAsyncPageableCollection() + { + MockPageCollection sync = new(Fail, new MockPipelineResponse()); + SyncToAsyncPageCollection asyncAdapter = new(sync); + + await using var asyncEnumerator = ((IAsyncEnumerable>)asyncAdapter).GetAsyncEnumerator(Token); + Assert.ThrowsAsync(() => asyncEnumerator.MoveNextAsync().AsTask()); + } + + private static IEnumerable Fail() + { + throw new ApplicationException("This should fail"); + } +} diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/tests/AutoSyncAsyncTests.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/tests/AutoSyncAsyncTests.cs new file mode 100644 index 000000000..44f13bee3 --- /dev/null +++ b/.dotnet.azure/sdk/openai/tools/TestFramework/tests/AutoSyncAsyncTests.cs @@ -0,0 +1,201 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +using System.ClientModel; +using NUnit.Framework; +using OpenAI.TestFramework.Tests.Helpers; + +namespace OpenAI.TestFramework.Tests; + +public class AutoSyncAsyncTests(bool useAsync) : ClientTestBase(useAsync) +{ + private static readonly string EX_MSG = Guid.NewGuid().ToString(); + + [Test] + [SyncOnly] + public void OnlyInSyncMode() + { + Assert.That(IsAsync, Is.False); + } + + [Test] + [AsyncOnly] + public void OnlyInAsyncMode() + { + Assert.That(IsAsync, Is.True); + } + + [Test] + public void CanGetOriginal() + { + MockClient original = new MockClient(); + + MockClient instrumented = WrapClient(original); + Assert.That(instrumented, Is.Not.Null); + Assert.That(ReferenceEquals(original, instrumented), Is.False); + Assert.That(typeof(MockClient).IsAssignableFrom(instrumented.GetType()), Is.True); + + MockClient recovered = UnWrap(instrumented); + Assert.That(recovered, Is.Not.Null); + Assert.That(ReferenceEquals(original, recovered), Is.True); + } + + [Test] + public void CanGetContext() + { + var context = new MockClientContext(); + + MockClient client = WrapClient(new MockClient(), context); + 
Assert.That(client, Is.Not.Null); + + var recoveredContext = GetClientContext(client) as MockClientContext; + Assert.That(recoveredContext, Is.Not.Null); + Assert.That(recoveredContext!.Id, Is.EqualTo(context.Id)); + Assert.That(ReferenceEquals(recoveredContext, context), Is.True); + } + + [Test] + public async Task TaskWorks() + { + MockClient client = WrapClient(new MockClient()); + await client.DoAsync(); + AssertCorrectFunctionCalled(client); + } + + [Test] + public void FailedTaskWorks() + { + MockClient client = WrapClient(new MockClient()); + ArgumentException? ex = Assert.ThrowsAsync(() => client.FailAsync(EX_MSG)); + Assert.That(ex, Is.Not.Null); + Assert.That(ex!.Message, Is.EqualTo(EX_MSG)); + AssertCorrectFunctionCalled(client); + } + + [Test] + public async Task TaskWithResultWorks() + { + MockClient client = WrapClient(new MockClient()); + int count = await client.CountAsync(); + Assert.That(count, Is.EqualTo(IsAsync ? 12 : 5)); + AssertCorrectFunctionCalled(client); + } + + [Test] + public void FailedTaskWithResultWorks() + { + MockClient client = WrapClient(new MockClient()); + ArgumentException? 
ex = Assert.ThrowsAsync(() => client.FailWithResultAsync(EX_MSG)); + Assert.That(ex, Is.Not.Null); + Assert.That(ex!.Message, Is.EqualTo(EX_MSG)); + AssertCorrectFunctionCalled(client); + } + + [Test] + public async Task ResultCollectionWorks() + { + const int num = 3; + const int increment = 2; + + MockClient client = WrapClient(new MockClient()); + AsyncCollectionResult coll = client.ResultCollectionAsync(num, increment); + + Assert.IsNotNull(coll); + Assert.That(coll.GetRawResponse(), Is.Not.Null); + Assert.That(coll.GetRawResponse().Status, Is.EqualTo(200)); + Assert.That(coll.GetRawResponse().ReasonPhrase, Is.EqualTo("OK")); + + int numResults = 0; + await foreach (int i in coll) + { + Assert.That(i, Is.EqualTo(numResults * increment)); + numResults++; + } + + Assert.That(numResults, Is.EqualTo(num)); + AssertCorrectFunctionCalled(client); + } + + [Test] + public void FailedResultCollection() + { + MockClient client = WrapClient(new MockClient()); + + // For now we mimic how the OpenAI and Azure OpenAI libraries work in that no service requests are sent + // until we try to enumerate the async collections. So exceptions aren't expected initially + AsyncCollectionResult coll = client.FailResultCollectionAsync(EX_MSG); + Assert.That(coll, Is.Not.Null); + + IAsyncEnumerator enumerator = coll.GetAsyncEnumerator(); + Assert.That(enumerator, Is.Not.Null); + ArgumentException? 
ex = Assert.ThrowsAsync(() => enumerator.MoveNextAsync().AsTask()); + Assert.That(ex, Is.Not.Null); + Assert.That(ex!.Message, Is.EqualTo(EX_MSG)); + AssertCorrectFunctionCalled(client); + } + + [Test] + public async Task PageableCollectionWorks() + { + const int num = 50; + const int increment = 1; + const int itemsPerPage = 20; + int expectedPages = (int)Math.Ceiling((double)num / itemsPerPage); + + MockClient client = WrapClient(new MockClient()); + AsyncPageCollection coll = client.PageableCollectionAsync(num, increment, itemsPerPage); + Assert.IsNotNull(coll); + + int numPages = 0; + int numResults = 0; + await foreach(PageResult page in coll) + { + Assert.That(page.GetRawResponse(), Is.Not.Null); + Assert.That(page.GetRawResponse().Status, Is.EqualTo(200)); + Assert.That(page.GetRawResponse().ReasonPhrase, Is.EqualTo("OK")); + + numPages++; + foreach (int actual in page.Values) + { + Assert.That(actual, Is.EqualTo(numResults * increment)); + numResults++; + } + } + + Assert.That(numResults, Is.EqualTo(num)); + Assert.That(numPages, Is.EqualTo(expectedPages)); + AssertCorrectFunctionCalled(client); + } + + [Test] + public void FailedPageableCollection() + { + MockClient client = WrapClient(new MockClient()); + + // For now we mimic how the OpenAI and Azure OpenAI libraries work in that no service requests are sent + // until we try to enumerate the async collections. So exceptions aren't expected initially + AsyncPageCollection coll = client.FailPageableCollectionAsync(EX_MSG); + Assert.That(coll, Is.Not.Null); + + IAsyncEnumerator> enumerator = ((IAsyncEnumerable>)coll).GetAsyncEnumerator(); + Assert.That(enumerator, Is.Not.Null); + ArgumentException? 
ex = Assert.ThrowsAsync(() => enumerator.MoveNextAsync().AsTask()); + Assert.That(ex, Is.Not.Null); + Assert.That(ex!.Message, Is.EqualTo(EX_MSG)); + AssertCorrectFunctionCalled(client); + } + + private void AssertCorrectFunctionCalled(MockClient client, int expectedCalls = 1) + { + if (IsAsync) + { + Assert.That(client.AsyncHit, Is.EqualTo(expectedCalls)); + Assert.That(client.SyncHit, Is.EqualTo(0)); + } + else + { + Assert.That(client.AsyncHit, Is.EqualTo(0)); + Assert.That(client.SyncHit, Is.EqualTo(expectedCalls)); + } + } +} diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/tests/Helpers/MockClient.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/tests/Helpers/MockClient.cs new file mode 100644 index 000000000..9eac6054d --- /dev/null +++ b/.dotnet.azure/sdk/openai/tools/TestFramework/tests/Helpers/MockClient.cs @@ -0,0 +1,149 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +using System.ClientModel; +using System.Runtime.CompilerServices; +using OpenAI.TestFramework.Mocks; + +namespace OpenAI.TestFramework.Tests.Helpers; + +public class MockClient +{ + private int _asyncHit; + private int _syncHit; + + public virtual int AsyncHit => _asyncHit; + public virtual int SyncHit => _syncHit; + + public virtual Task DoAsync() + { + Interlocked.Increment(ref _asyncHit); + return Task.Delay(200); + } + + public virtual void Do() + { + Interlocked.Increment(ref _syncHit); + } + + public virtual Task FailAsync(string message) + { + Interlocked.Increment(ref _asyncHit); + return Task.FromException(new ArgumentException(message)); + } + + public virtual void Fail(string message) + { + Interlocked.Increment(ref _syncHit); + throw new ArgumentException(message); + } + + public virtual async Task CountAsync() + { + Interlocked.Increment(ref _asyncHit); + await Task.Delay(100).ConfigureAwait(false); + return 12; + } + + public virtual int Count() + { + Interlocked.Increment(ref _syncHit); + return 5; + } + + 
public virtual Task FailWithResultAsync(string message) + { + Interlocked.Increment(ref _asyncHit); + return Task.FromException(new ArgumentException(message)); + } + + public virtual int FailWithResult(string message) + { + Interlocked.Increment(ref _syncHit); + throw new ArgumentException(message); + } + + public virtual AsyncCollectionResult ResultCollectionAsync(int num, int increment = 5) + { + Interlocked.Increment(ref _asyncHit); + return new MockAsyncCollectionResult(() => EnumerateAsync(num, increment)); + } + + public virtual CollectionResult ResultCollection(int num, int increment = 5) + { + Interlocked.Increment(ref _syncHit); + return new MockCollectionResult(() => Enumerate(num, increment)); + } + + public virtual AsyncCollectionResult FailResultCollectionAsync(string message) + { + Interlocked.Increment(ref _asyncHit); + return new MockAsyncCollectionResult(() => FailEnumerateAsync(message)); + } + + public virtual CollectionResult FailResultCollection(string message) + { + Interlocked.Increment(ref _syncHit); + return new MockCollectionResult(() => FailEnumerate(message)); + } + + public virtual AsyncPageCollection PageableCollectionAsync(int num, int increment, int itemsPerPage) + { + Interlocked.Increment(ref _asyncHit); + return new MockAsyncPageCollection(() => EnumerateAsync(num, increment), new MockPipelineResponse(), itemsPerPage); + } + + public virtual PageCollection PageableCollection(int num, int increment, int itemsPerPage) + { + Interlocked.Increment(ref _syncHit); + return new MockPageCollection(() => Enumerate(num, increment), new MockPipelineResponse(), itemsPerPage); + } + + public virtual AsyncPageCollection FailPageableCollectionAsync(string message) + { + Interlocked.Increment(ref _asyncHit); + return new MockAsyncPageCollection(() => FailEnumerateAsync(message), new MockPipelineResponse()); + } + + public virtual PageCollection FailPageableCollection(string message) + { + Interlocked.Increment(ref _syncHit); + return new 
MockPageCollection(() => FailEnumerate(message), new MockPipelineResponse()); + } + + private async IAsyncEnumerable EnumerateAsync(int num, int increment, [EnumeratorCancellation] CancellationToken token = default) + { + int running = 0; + for (int i = 0; i < num; i++, running += increment) + { + await Task.Delay(100); + yield return running; + } + } + + private IEnumerable Enumerate(int num, int increment) + { + int running = 0; + for (int i = 0; i < num; i++, running += increment) + { + yield return running; + } + } + + private async IAsyncEnumerable FailEnumerateAsync(string message, [EnumeratorCancellation] CancellationToken token = default) + { + bool c = true; + await Task.Delay(100).ConfigureAwait(false); + if (c) + { + throw new ArgumentException(message); + } + + yield break; + } + + private IEnumerable FailEnumerate(string message) + { + throw new ArgumentException(message); + } +} diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/tests/Helpers/MockClientContext.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/tests/Helpers/MockClientContext.cs new file mode 100644 index 000000000..e36ed1e1f --- /dev/null +++ b/.dotnet.azure/sdk/openai/tools/TestFramework/tests/Helpers/MockClientContext.cs @@ -0,0 +1,9 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +namespace OpenAI.TestFramework.Tests.Helpers; + +public class MockClientContext +{ + public string Id { get; } = Guid.NewGuid().ToString(); +} diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/tests/MockStringServiceTests.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/tests/MockStringServiceTests.cs new file mode 100644 index 000000000..d5d3edff4 --- /dev/null +++ b/.dotnet.azure/sdk/openai/tools/TestFramework/tests/MockStringServiceTests.cs @@ -0,0 +1,146 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. 
+ +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Reflection; +using NUnit.Framework; +using OpenAI.TestFramework.Mocks; +using OpenAI.TestFramework.Recording.Proxy; +using OpenAI.TestFramework.Recording.Proxy.Service; +using OpenAI.TestFramework.Utils; + +namespace OpenAI.TestFramework.Tests; + +public class MockStringServiceTests : RecordedClientTestBase +{ + private const string c_basePath = "data"; + + public MockStringServiceTests(bool isAsync) + : base(isAsync, null) + { + RecordingOptions.SanitizersToRemove.Add("AZSDK3430"); // $..id + } + + public DirectoryInfo RepositoryRoot { get; } = FindRepoRoot(); + + [Test] + public async Task AddAndGet() + { + const string id = "first.one"; + const string expected = "The first value goes here"; + + using MockRestService service = new(c_basePath); + var options = ConfigureClientOptions(new ClientPipelineOptions()); + using var client = WrapClient(new MockRestServiceClient(service.HttpEndpoint, options)); + + ClientResult add = await client.AddAsync(id, expected, Token); + Assert.That(add, Is.Not.Null); + Assert.That(add.GetRawResponse().Status, Is.EqualTo(200)); + + string? retrieved = await client.GetAsync("first.one", Token); + Assert.That(retrieved, Is.EqualTo(expected)); + } + + [Test] + public async Task AddAndDelete() + { + const string id = "first.one"; + const string expected = "The first value goes here"; + + using MockRestService service = new(c_basePath); + var options = ConfigureClientOptions(new ClientPipelineOptions()); + using var client = WrapClient(new MockRestServiceClient(service.HttpEndpoint, options)); + + ClientResult add = await client.AddAsync(id, expected, Token); + Assert.That(add, Is.Not.Null); + Assert.That(add.GetRawResponse().Status, Is.EqualTo(200)); + + bool deleted = await client.RemoveAsync(id, Token); + Assert.That(deleted, Is.True); + + string? 
retrieved = await client.GetAsync("first.one", Token); + Assert.That(retrieved, Is.Null); + } + + #region overrides + + protected override ProxyServiceOptions CreateProxyServiceOptions() + => new() + { + DotnetExecutable = AssemblyHelper.GetDotnetExecutable()?.FullName!, + TestProxyDll = AssemblyHelper.GetAssemblyMetadata("TestProxyPath")!, + DevCertFile = Path.Combine( + RepositoryRoot.FullName, + "eng", + "common", + "testproxy", + "dotnet-devcert.pfx"), + DevCertPassword = "password", + StorageLocationDir = RepositoryRoot.FullName, + }; + + protected override RecordingStartInformation CreateRecordingSessionStartInfo() + => new() + { + RecordingFile = GetRecordingFile(), + AssetsFile = GetAssetsFile() + }; + + #endregion + + #region helper methods + + private static DirectoryInfo FindRepoRoot() + { + /** + * This code assumes that we are running in the standard Azure .Net SDK repository layout. With this in mind, + * we generally assume that we are running our test code from + * /artifacts/bin/// + * So to find the root we keep navigating up until we find a folder with a .git subfolder + * + * Another alternative would be to call: git rev-parse --show-toplevel + */ + + DirectoryInfo? current = new FileInfo(Assembly.GetExecutingAssembly().Location).Directory; + while (current != null && !current.EnumerateDirectories(".git").Any()) + { + current = current.Parent; + } + + return current + ?? throw new InvalidOperationException("Could not determine the root folder for this repository"); + } + + private string GetRecordingFile() + { + DirectoryInfo sourceDir = AssemblyHelper.GetAssemblySourceDir() + ?? throw new InvalidOperationException("Could not determine the source path for this assembly"); + string relativeDir = PathHelpers.GetRelativePath(RepositoryRoot.FullName, sourceDir.FullName); + return Path.Combine( + relativeDir, + "SessionRecords", + GetType().Name, + GetRecordedTestFileName()); + } + + private string? GetAssetsFile() + { + DirectoryInfo? 
sourceDir = AssemblyHelper.GetAssemblySourceDir() + ?? throw new InvalidOperationException("Could not determine the source path for this assembly"); + + // walk up the tree until we hit either the repository root, or found a folder with an "assets.json" file + for (; sourceDir != null && sourceDir?.FullName != RepositoryRoot.FullName; sourceDir = sourceDir.Parent) + { + string assetsFile = Path.Combine(sourceDir!.FullName, "assets.json"); + if (File.Exists(assetsFile)) + { + return assetsFile; + } + } + + return null; + } + + #endregion +} diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/tests/OpenAI.TestFramework.Tests.csproj b/.dotnet.azure/sdk/openai/tools/TestFramework/tests/OpenAI.TestFramework.Tests.csproj new file mode 100644 index 000000000..e6934e292 --- /dev/null +++ b/.dotnet.azure/sdk/openai/tools/TestFramework/tests/OpenAI.TestFramework.Tests.csproj @@ -0,0 +1,21 @@ + + + + $(RequiredTargetFrameworks);net8.0 + enable + enable + latest + + + + + + + + + + + + + + diff --git a/.dotnet.azure/sdk/openai/tools/TestFramework/tests/ProxyServiceTests.cs b/.dotnet.azure/sdk/openai/tools/TestFramework/tests/ProxyServiceTests.cs new file mode 100644 index 000000000..496afbc6d --- /dev/null +++ b/.dotnet.azure/sdk/openai/tools/TestFramework/tests/ProxyServiceTests.cs @@ -0,0 +1,351 @@ +// Copyright (c) Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. 
+ +using System.ClientModel; +using System.ClientModel.Primitives; +using NUnit.Framework; +using OpenAI.TestFramework.Mocks; +using OpenAI.TestFramework.Recording; +using OpenAI.TestFramework.Recording.Matchers; +using OpenAI.TestFramework.Recording.Proxy; +using OpenAI.TestFramework.Recording.Proxy.Service; +using OpenAI.TestFramework.Recording.RecordingProxy; +using OpenAI.TestFramework.Recording.Sanitizers; +using OpenAI.TestFramework.Recording.Transforms; +using OpenAI.TestFramework.Utils; + +namespace OpenAI.TestFramework.Tests +{ + [NonParallelizable] + public class ProxyServiceTests(bool isAsync) : ClientTestBase(isAsync) + { + #region Properties and setup/teardown methods + + public DirectoryInfo? RecordingDir { get; private set; } + + public FileInfo? RecordingFile { get; private set; } + + [SetUp] + public void CreateRecordingFile() + { + RecordingDir = new DirectoryInfo(Path.Combine(Path.GetTempPath(), "RecordingTests", Guid.NewGuid().ToString())); + if (!RecordingDir.Exists) + { + RecordingDir.Create(); + } + + RecordingFile = new FileInfo(Path.Combine(RecordingDir.FullName, Path.GetRandomFileName() + ".json")); + } + + [TearDown] + public void DeleteRecordingFile() + { + if (RecordingFile != null) + { + RecordingFile.Delete(); + } + + if (RecordingDir != null) + { + RecordingDir.Delete(true); + } + } + + #endregion + + [Test] + public async Task StartProxy() + { + using ProxyService proxy = await CreateProxyServiceAsync(); + + Assert.That(proxy.HttpEndpoint, Is.Not.Null); + Assert.That(proxy.HttpEndpoint.Port, Is.GreaterThan(0).And.LessThanOrEqualTo(ushort.MaxValue)); + Assert.That(proxy.HttpsEndpoint, Is.Not.Null); + Assert.That(proxy.HttpsEndpoint.Port, Is.GreaterThan(0).And.LessThanOrEqualTo(ushort.MaxValue)); + + ProxyClientResult available = await proxy.Client.ListAvailableAsync(Token); + Assert.That(available, Is.Not.Null); + Assert.That(available.GetRawResponse(), Is.Not.Null); + Assert.That(available.GetRawResponse().Status, Is.EqualTo(200)); 
+ Assert.That(available.Value, Is.Not.Null); + Assert.That(available.Value, Does.Contain("BodilessMatcher")); + } + + [Test] + public async Task AddSanitizers() + { + using ProxyService proxy = await CreateProxyServiceAsync(); + + List sanitizers = + [ + new BodyKeySanitizer("body.key"), + new BodyRegexSanitizer("(.*)") + { + GroupForReplace = "1", + Condition = new Recording.Condition() + { + ResponseHeader = new() + { + Key = "Content-Type", + ValueRegex = "json$" + }, + UriRegex = "https://[^/]+/sub" + } + }, + new HeaderRegexSanitizer("Authentication") + { + Value = "replacement", + GroupForReplace = "1", + Regex = "^Bearer " + }, + new UriRegexSanitizer("https://[^/]+/sub") + { + GroupForReplace = "1", + Value = "replacement" + } + ]; + + ProxyClientResult> result = await proxy.Client.AddSanitizersAsync(sanitizers, token: Token); + Assert.That(result, Is.Not.Null); + Assert.That(result.GetRawResponse(), Is.Not.Null); + Assert.That(result.GetRawResponse().Status, Is.EqualTo(200)); + Assert.That(result.Value, Is.Not.Null); + Assert.That(result.Value, Has.Count.EqualTo(sanitizers.Count)); + } + + [Test] + public async Task SetMatcher() + { + using ProxyService proxy = await CreateProxyServiceAsync(); + + BaseMatcher[] matchers = + [ + ExistingMatcher.Headerless, + ExistingMatcher.Bodiless, + new CustomMatcher() + { + CompareBodies = false, + ExcludedHeaders = "Authorization", + IgnoredHeaders = "Content-Length,Content-Type", + IgnoredQueryParameters = "page,version", + IgnoreQueryOrdering = true, + } + ]; + + foreach (var matcher in matchers) + { + ProxyClientResult result = await proxy.Client.SetMatcherAsync(matcher, token: Token); + Assert.That(result, Is.Not.Null); + Assert.That(result.GetRawResponse(), Is.Not.Null); + Assert.That(result.GetRawResponse().Status, Is.EqualTo(200)); + } + } + + [Test] + public async Task SetTransform() + { + using ProxyService proxy = await CreateProxyServiceAsync(); + + HeaderTransform transform = new("X-Client-RequestId") + { + 
Value = "replacement", + Condition = new() + { + UriRegex = "http.*://[^/]+/(.*)" + } + }; + + ProxyClientResult result = await proxy.Client.AddTransformAsync(transform, token: Token); + Assert.That(result, Is.Not.Null); + Assert.That(result.GetRawResponse(), Is.Not.Null); + Assert.That(result.GetRawResponse().Status, Is.EqualTo(200)); + } + + [Test] + public async Task StartStopRecording() + { + const string key1 = "key1"; + string value1 = Guid.NewGuid().ToString(); + const string key2 = "the.others"; + string value2 = "value"; + + using ProxyService proxy = await CreateProxyServiceAsync(); + + RecordingStartInformation startInfo = new() + { + RecordingFile = RecordingFile!.FullName, + }; + + ProxyClientResult result = await proxy.Client.StartRecordingAsync(startInfo, token: Token); + Assert.That(result, Is.Not.Null); + Assert.That(result.GetRawResponse(), Is.Not.Null); + Assert.That(result.GetRawResponse().Status, Is.EqualTo(200)); + + string recordingId = result.RecordingId!; + Assert.That(recordingId, Is.Not.Null); + + Dictionary additional = new() + { + [key1] = value1, + [key2] = value2, + }; + + result = await proxy.Client.StopRecordingAsync(recordingId, additional, false, Token); + + // At this point we should have a recording file + string recordedJson = File.ReadAllText(RecordingFile.FullName); + Assert.That(recordedJson, Does.Contain(key1) + .And.Contain(value1) + .And.Contain(key2) + .And.Contain(value2)); + } + + [Test] + public async Task RecordAndPlayback() + { + using ProxyService recordingProxyService = await CreateProxyServiceAsync(); + RecordingStartInformation startInfo = new() { RecordingFile = RecordingFile!.FullName }; + + using MockRestService mockRestService = new(); + TestRecordingOptions recordingOptions = new() + { + SanitizersToRemove = + { + "AZSDK3430", // $..id + } + }; + + string id1; + string id2; + + // Start recording, and capture some requests + { + ProxyClientResult result = await 
recordingProxyService.Client.StartRecordingAsync(startInfo, Token); + Assert.That(result, Is.Not.Null); + Assert.That(result.RecordingId, !Is.Null.Or.Empty); + string recordingId = result.RecordingId!; + + await using TestRecording recording = new(recordingId, RecordedTestMode.Record, recordingProxyService); + await recording.ApplyOptions(recordingOptions, Token); + + id1 = recording.Random.NewGuid().ToString(); + id2 = recording.Random.NewGuid().ToString(); + + await SendRequestsAsync(recording, mockRestService.HttpEndpoint, id1, id2, Token); + } + + // validate the service has what we expect + var serviceIds = mockRestService.GetAll() + .Select(e => e.id) + .ToArray(); + Assert.That(serviceIds, Is.EquivalentTo(new[] { id1, id2 })); + + mockRestService.Reset(); + + // Playback the recording + { + ProxyClientResult> result = await recordingProxyService.Client.StartPlaybackAsync(startInfo, Token); + Assert.That(result, Is.Not.Null); + Assert.That(result.RecordingId, !Is.Null.Or.Empty); + string recordingId = result.RecordingId!; + + await using TestRecording playback = new(recordingId, RecordedTestMode.Playback, recordingProxyService, result.Value); + await playback.ApplyOptions(recordingOptions, Token); + + string id = playback.Random.NewGuid().ToString(); + Assert.That(id, Is.EqualTo(id1)); + id = playback.Random.NewGuid().ToString(); + Assert.That(id, Is.EqualTo(id2)); + + await SendRequestsAsync(playback, mockRestService.HttpEndpoint, id1, id2, Token); + } + + // since we are playing back, the service should not have been called + Assert.That(mockRestService.GetAll().Count(), Is.EqualTo(0)); + + static async Task SendRequestsAsync(TestRecording recording, Uri restEndpoint, string id1, string id2, CancellationToken token) + { + const string value1 = "The value for the first item"; + const string value2 = "The secondary value goes here"; + const string id3 = "random"; + const string value3 = "Sure why not"; + + ClientPipelineOptions options = new(); + 
options.RetryPolicy = new TestClientRetryPolicy(0, TimeSpan.FromMilliseconds(100)); + options.Transport = new ProxyTransport(recording.GetProxyTransportOptions()); + + using MockRestServiceClient client = new(restEndpoint, options); + + ClientResult add = await client.AddAsync(id1, value1, token); + Assert.That(add, Is.Not.Null); + Assert.That(add.GetRawResponse().Status, Is.EqualTo(200)); + + add = await client.AddAsync(id2, value2, token); + Assert.That(add, Is.Not.Null); + Assert.That(add.GetRawResponse().Status, Is.EqualTo(200)); + + add = await client.AddAsync(id3, value3, token); + Assert.That(add, Is.Not.Null); + Assert.That(add.GetRawResponse().Status, Is.EqualTo(200)); + + ClientResult get = await client.GetAsync(id2, token); + Assert.That(get, Is.Not.Null); + Assert.That(get.GetRawResponse().Status, Is.EqualTo(200)); + Assert.That(get.Value, Is.EqualTo(value2)); + + get = await client.GetAsync(id3, token); + Assert.That(get, Is.Not.Null); + Assert.That(get.GetRawResponse().Status, Is.EqualTo(200)); + Assert.That(get.Value, Is.EqualTo(value3)); + + ClientResult remove = await client.RemoveAsync(id3, token); + Assert.That(remove.Value, Is.True); + + remove = await client.RemoveAsync("does.not.exist", token); + Assert.That(remove.Value, Is.False); + + get = await client.GetAsync(id3, token); + Assert.That(get, Is.Not.Null); + Assert.That(get.GetRawResponse().Status, Is.EqualTo(404)); + Assert.That(get.Value, Is.Null); + } + } + + #region helper methods + + private async Task CreateProxyServiceAsync() + { + ProxyService?
proxy = null; + try + { + proxy = await ProxyService.CreateNewAsync( + new ProxyServiceOptions() + { + DotnetExecutable = AssemblyHelper.GetDotnetExecutable()?.FullName!, + TestProxyDll = AssemblyHelper.GetAssemblyMetadata("TestProxyPath")!, + StorageLocationDir = RecordingDir!.FullName + }, + Token); + + Assert.That(proxy, Is.Not.Null); + Assert.DoesNotThrow(proxy.ThrowOnErrors); + Assert.That(proxy.Client, Is.Not.Null); + + var wrappedClient = WrapClient(proxy.Client); + var setter = typeof(ProxyService).GetMethod("SetClient", System.Reflection.BindingFlags.NonPublic | System.Reflection.BindingFlags.Public | System.Reflection.BindingFlags.Instance) + ?? throw new InvalidOperationException("Could not find the ProxyService.SetClient method"); + setter.Invoke(proxy, [wrappedClient]); + + var ret = proxy; + proxy = null; + return ret; + } + finally + { + proxy?.Dispose(); + } + } + + #endregion + } +} diff --git a/.dotnet/.github/ISSUE_TEMPLATE/bug_report.yaml b/.dotnet/.github/ISSUE_TEMPLATE/bug_report.yaml new file mode 100644 index 000000000..479ab9af9 --- /dev/null +++ b/.dotnet/.github/ISSUE_TEMPLATE/bug_report.yaml @@ -0,0 +1,72 @@ +name: Bug report +description: Report an issue or bug with this library +labels: ['bug'] +body: + - type: markdown + attributes: + value: | + Thanks for taking the time to fill out this bug report! 
+ - type: checkboxes + id: non_api + attributes: + label: Confirm this is not an issue with the underlying OpenAI API + description: Issues with the underlying OpenAI API should be reported in our [Developer Community](https://community.openai.com/c/api/7) + options: + - label: This is an issue with the .NET library + required: true + - type: checkboxes + id: non_azure + attributes: + label: Confirm this is not an issue with Azure OpenAI + description: Issues related to Azure OpenAI should be reported in the [Azure SDK repo](https://github.com/Azure/azure-sdk-for-net/issues) + options: + - label: This is not an issue with Azure OpenAI + required: true + - type: textarea + id: what-happened + attributes: + label: Describe the bug + description: A clear and concise description of what the bug is, and any additional context. + placeholder: Tell us what you see. + validations: + required: true + - type: textarea + id: repro-steps + attributes: + label: To Reproduce + description: Steps to reproduce the behavior. + placeholder: | + 1. Fetch a '...' + 2. Update the '....' + 3. See error + validations: + required: true + - type: textarea + id: code-snippets + attributes: + label: Code snippets + description: If applicable, add code snippets to help explain your problem.
+ render: C# + validations: + required: false + - type: input + id: os + attributes: + label: OS + placeholder: winOS + validations: + required: true + - type: input + id: language-version + attributes: + label: .NET version + placeholder: + validations: + required: true + - type: input + id: lib-version + attributes: + label: Library version + placeholder: + validations: + required: true \ No newline at end of file diff --git a/.dotnet/.github/ISSUE_TEMPLATE/config.yml b/.dotnet/.github/ISSUE_TEMPLATE/config.yml new file mode 100644 index 000000000..cb9e00e0b --- /dev/null +++ b/.dotnet/.github/ISSUE_TEMPLATE/config.yml @@ -0,0 +1,7 @@ +blank_issues_enabled: false +contact_links: + - name: OpenAI support + url: https://help.openai.com/ + about: | + Please only file issues here that you believe represent actual bugs or feature requests for the OpenAI .NET library. + If you're having general trouble with the OpenAI API, please visit our help center to get support. \ No newline at end of file diff --git a/.dotnet/.github/ISSUE_TEMPLATE/feature_request.yaml b/.dotnet/.github/ISSUE_TEMPLATE/feature_request.yaml new file mode 100644 index 000000000..cdb9812e9 --- /dev/null +++ b/.dotnet/.github/ISSUE_TEMPLATE/feature_request.yaml @@ -0,0 +1,36 @@ +name: Feature request +description: Suggest an idea for this library +labels: ['feature-request'] +body: + - type: markdown + attributes: + value: | + Thanks for taking the time to fill out this feature request! + - type: checkboxes + id: non_api + attributes: + label: Confirm this is not a feature request for the underlying OpenAI API. + description: Feature requests for the underlying OpenAI API should be reported in our [Developer Community](https://community.openai.com/c/api/7) + options: + - label: This is not a feature request for the underlying OpenAI API + required: true + - type: checkboxes + id: non_azure + attributes: + label: Confirm this is not a feature request for Azure OpenAI. 
description: Feature requests for Azure OpenAI should be reported in the [Azure SDK repo](https://github.com/Azure/azure-sdk-for-net/issues) + options: + - label: This is not a feature request for Azure OpenAI + required: true + - type: textarea + id: feature + attributes: + label: Describe the feature or improvement you're requesting + description: A clear and concise description of what you want to happen. + validations: + required: true + - type: textarea + id: context + attributes: + label: Additional context + description: Add any other context about the feature request here. \ No newline at end of file diff --git a/.dotnet/.github/workflows.md b/.dotnet/.github/workflows.md new file mode 100644 index 000000000..e820c8c64 --- /dev/null +++ b/.dotnet/.github/workflows.md @@ -0,0 +1,28 @@ +The workflows in this repository try to follow existing, basic samples with little customization. + +## main.yml +We use a standard dotnet build/test/pack workflow +https://docs.github.com/en/actions/automating-builds-and-tests/building-and-testing-net + +- Build the solution using the dotnet cli + - Strong name the assemblies using a key stored in the repository + https://github.com/dotnet/runtime/blob/main/docs/project/strong-name-signing.md +- Test the built libraries + - In a PR run, only local tests are run. + - In a CI run, live tests are run using a repository secret containing an OpenAI token + https://docs.github.com/en/actions/security-guides/using-secrets-in-github-actions +- Package the built libraries +- Publish the package to a GitHub NuGet registry + https://docs.github.com/en/packages/working-with-a-github-packages-registry/working-with-the-nuget-registry +- Publish a single build artifact containing test results and a nuget package + +## release.yml +Releases are triggered by publishing a release in the GitHub repository.
The release workflow will: + +- Build the solution using the dotnet cli + - Strong name the assemblies using a key stored in the repository +- Test the built libraries + - Live tests are run using a repository secret containing an OpenAI token +- Package the built libraries +- Publish the package to the public NuGet registry +- Publish a single build artifact containing test results and a nuget package diff --git a/.dotnet/.github/workflows/live-test.yml b/.dotnet/.github/workflows/live-test.yml new file mode 100644 index 000000000..a8a706e75 --- /dev/null +++ b/.dotnet/.github/workflows/live-test.yml @@ -0,0 +1,42 @@ +# This workflow is triggered by the user and runs live tests on the codebase. +name: Live Test + +on: + workflow_dispatch: + pull_request: + types: + - labeled + +jobs: + test: + name: Live Test + runs-on: ubuntu-latest + if: github.event_name == 'workflow_dispatch' || contains(github.event.pull_request.labels.*.name, 'live test') + environment: Live Testing + env: + version_suffix_args: ${{ format('/p:VersionSuffix="alpha.{0}"', github.run_number) }} + steps: + - name: Setup .NET + uses: actions/setup-dotnet@v3 + with: + dotnet-version: '8.x' + + - name: Checkout code + uses: actions/checkout@v2 + + - name: Run live tests + run: dotnet test ./tests/OpenAI.Tests.csproj + --configuration Release + --filter="TestCategory!=Smoke&TestCategory!=Images&TestCategory!=Moderations&TestCategory!=Manual" + --logger "trx;LogFilePrefix=live" + --results-directory ${{github.workspace}}/artifacts/test-results + ${{ env.version_suffix_args}} + env: + OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }} + + - name: Upload artifact + uses: actions/upload-artifact@v4 + if: ${{ !cancelled() }} + with: + name: test-artifacts + path: ${{github.workspace}}/artifacts diff --git a/.dotnet/.github/workflows/main.yml b/.dotnet/.github/workflows/main.yml new file mode 100644 index 000000000..54325dc64 --- /dev/null +++ b/.dotnet/.github/workflows/main.yml @@ -0,0 +1,47 @@ +# This workflow
is triggered on push to main, pull request, or by manual invocation. +# It builds and unit tests the codebase. +name: Build and Unit Test + +on: + workflow_dispatch: + push: + branches: + - main + pull_request: + types: [opened, reopened, synchronize] + +jobs: + build: # Test, pack and publish the OpenAI nuget package as a build artifact + name: Build + runs-on: ubuntu-latest + env: + version_suffix_args: ${{ format('/p:VersionSuffix="alpha.{0}"', github.run_number) }} + steps: + - name: Setup .NET + uses: actions/setup-dotnet@v3 + with: + dotnet-version: '8.x' + + - name: Checkout code + uses: actions/checkout@v2 + + - name: Build and pack + run: dotnet pack + --configuration Release + --output "${{github.workspace}}/artifacts/packages" + ${{ env.version_suffix_args}} + + - name: Run unit tests + run: dotnet test + --configuration Release + --filter="TestCategory=Smoke&TestCategory!=Manual" + --logger "trx;LogFilePrefix=smoke" + --results-directory ${{github.workspace}}/artifacts/test-results + ${{ env.version_suffix_args}} + + - name: Upload artifact + uses: actions/upload-artifact@v4 + if: ${{ !cancelled() }} + with: + name: build-artifacts + path: ${{github.workspace}}/artifacts diff --git a/.dotnet/.github/workflows/release.yml b/.dotnet/.github/workflows/release.yml new file mode 100644 index 000000000..05989aaf9 --- /dev/null +++ b/.dotnet/.github/workflows/release.yml @@ -0,0 +1,105 @@ +# This workflow is triggered by new releases and on a daily schedule. +# It builds, unit tests, live tests and publishes the OpenAI nuget package. +# For daily runs, the package is published to the GitHub package registry. +# For releases, the package is published to the NuGet package registry.
+name: Release package + +on: + release: + types: [published] + schedule: + # run every day at 00:00 + - cron: '0 0 * * *' + +jobs: + build: + name: Build + runs-on: ubuntu-latest + environment: Live Testing + env: + version_suffix_args: ${{ github.event_name == 'schedule' && format('/p:VersionSuffix="alpha.{0}"', github.run_number) || '' }} + permissions: + packages: write + contents: write + steps: + - name: Setup .NET + uses: actions/setup-dotnet@v3 + with: + dotnet-version: '8.x' + + - name: Checkout code + uses: actions/checkout@v2 + + # Pack the client nuget package and include url back to the repository and release tag + - name: Build and Pack + run: dotnet pack + --configuration Release + --output "${{ github.workspace }}/artifacts/packages" + /p:PackageProjectUrl="${{ github.server_url }}/${{ github.repository }}/tree/${{ github.event.release.tag_name }}" + /p:PackageReleaseNotes="${{ github.server_url }}/${{ github.repository }}/blob/${{ github.event.release.tag_name }}/CHANGELOG.md" + ${{ env.version_suffix_args }} + + - name: Unit Test + run: dotnet test + --configuration Release + --filter="TestCategory=Smoke&TestCategory!=Manual" + --logger "trx;LogFileName=${{ github.workspace }}/artifacts/test-results/smoke.trx" + ${{ env.version_suffix_args }} + + - name: Run Live Tests + run: dotnet test ./tests/OpenAI.Tests.csproj + --configuration Release + --filter="TestCategory!=Smoke&TestCategory!=Images&TestCategory!=Moderations&TestCategory!=Manual" + --logger "trx;LogFilePrefix=live" + --results-directory ${{ github.workspace }}/artifacts/test-results + ${{ env.version_suffix_args }} + env: + OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }} + + - name: Upload artifact + uses: actions/upload-artifact@v4 + if: ${{ !cancelled() }} + with: + name: build-artifacts + path: ${{ github.workspace }}/artifacts + + deploy: + name: Publish Package + needs: build + runs-on: ubuntu-latest + steps: + - name: Checkout code + uses: actions/checkout@v2 + + - name: Download 
build artifacts + uses: actions/download-artifact@v4 + + - name: Upload release asset + if: github.event_name == 'release' + run: gh release upload ${{ github.event.release.tag_name }} + ${{ github.workspace }}/build-artifacts/packages/*.*nupkg + env: + GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} + + - name: NuGet authenticate + run: dotnet nuget add source + "https://nuget.pkg.github.com/${{ github.repository_owner }}/index.json" + --name "github" + --username ${{ github.actor }} + --password ${{ secrets.GITHUB_TOKEN }} + --store-password-in-clear-text + + - name: Publish package to local feed + run: dotnet nuget push + ${{github.workspace}}/build-artifacts/packages/*.nupkg + --source "github" + --api-key ${{ secrets.GITHUB_TOKEN }} + --skip-duplicate + + - name: Publish package to nuget.org + if: github.event_name == 'release' + run: dotnet nuget push + ${{github.workspace}}/build-artifacts/packages/*.nupkg + --source https://api.nuget.org/v3/index.json + --api-key ${{ secrets.NUGET_API_KEY }} + --skip-duplicate diff --git a/.dotnet/.gitignore b/.dotnet/.gitignore new file mode 100644 index 000000000..f71e3dc8e --- /dev/null +++ b/.dotnet/.gitignore @@ -0,0 +1,179 @@ +## Ignore Visual Studio temporary files, build results, and +## files generated by popular Visual Studio add-ons. 
+ +# User-specific files +*.suo +*.user +*.sln.docstates +.vs/ +*.lock.json +developer/ +launch.json +launchSettings.json + +# Default Assets restore directory +.assets + +# Build results +/artifacts +binaries/ +[Dd]ebug*/ +[Rr]elease/ +build/ +restoredPackages/ +PolicheckOutput/ +tools/net46/ +tools/SdkBuildTools/ +tools/Microsoft.WindowsAzure.Build.Tasks/packages/ +PublishedNugets/ +src/NuGet.Config +tools/7-zip/ +#tools/LocalNugetFeed/Microsoft.Internal.NetSdkBuild.Mgmt.Tools.*.nupkg + +[Tt]est[Rr]esult +[Bb]uild[Ll]og.* + +*_i.c +*_p.c +*.ilk +*.meta +*.obj +*.pch +*.pdb +*.pgc +*.pgd +*.rsp +*.sbr +*.tlb +*.tli +*.tlh +*.tmp +*.vspscc +*.vssscc +.builds + +*.pidb + +*.log +*.scc +# Visual C++ cache files +ipch/ +*.aps +*.ncb +*.opensdf +*.sdf + +# Visual Studio profiler +*.psess +*.vsp + +# VS Code +**/.vscode/* +!.vscode/cspell.json + +# Code analysis +*.CodeAnalysisLog.xml + +# Guidance Automation Toolkit +*.gpState + +# ReSharper is a .NET coding add-in +_ReSharper*/ + +*.[Rr]e[Ss]harper + +# Rider IDE +.idea + +# NCrunch +*.ncrunch* +.*crunch*.local.xml + +# Installshield output folder +[Ee]xpress + +# DocProject is a documentation generator add-in +DocProject/buildhelp/ +DocProject/Help/*.HxT +DocProject/Help/*.HxC +DocProject/Help/*.hhc +DocProject/Help/*.hhk +DocProject/Help/*.hhp +DocProject/Help/Html2 +DocProject/Help/html + +# Click-Once directory +publish + +# Publish Web Output +*.Publish.xml + +# Others +[Bb]in +[Oo]bj +TestResults +[Tt]est[Rr]esult* +*.Cache +ClientBin +~$* +*.dbmdl + +*.[Pp]ublish.xml + +Generated_Code #added for RIA/Silverlight projects + +# Build tasks +tools/*.dll + +# Sensitive files +*.keys +!Azure.Extensions.AspNetCore.DataProtection.Keys +!Azure.Security.KeyVault.Keys +*.pfx +TestConfigurations.xml +*.json.env +*.bicep.env + +# Backup & report files from converting an old project file to a newer +# Visual Studio version. 
Backup files are not needed, because we have git ;-) +_UpgradeReport_Files/ +Backup*/ +UpgradeLog*.XML + +# NuGet +packages +packages/repositories.config +testPackages + +# Mac development +.DS_Store + +# Specification DLLs +*.Specification.dll + +# Generated readme.txt files # +src/*/readme.txt + +build.out +.nuget/ + +# Azure Project +csx/ +*.GhostDoc.xml +pingme.txt + +# TS/Node files +dist/ +node_modules/ + +# MSBuild binary log files +msbuild.binlog + +# BenchmarkDotNet +BenchmarkDotNet.Artifacts + +artifacts +.assets + +# Temporary typespec folders for typespec generation +TempTypeSpecFiles/ diff --git a/.dotnet/CHANGELOG.md b/.dotnet/CHANGELOG.md new file mode 100644 index 000000000..2697eb9f3 --- /dev/null +++ b/.dotnet/CHANGELOG.md @@ -0,0 +1,169 @@ +# Release History + +## 2.0.0-beta.11 (Unreleased) + +### Features Added + +### Breaking Changes + +- Updated fine-tuning pagination methods `GetJobs`, `GetEvents`, and `GetJobCheckpoints` to return `IEnumerable` instead of `ClientResult`. (commit_hash) +- Updated the batching pagination method `GetBatches` to return `IEnumerable` instead of `ClientResult`. (commit_hash) + +### Bugs Fixed + +### Other Changes + +## 2.0.0-beta.10 (2024-08-26) + +### Breaking Changes + +- Renamed `AudioClient`'s `GenerateSpeechFromText` methods to simply `GenerateSpeech`. ([d84bf54](https://github.com/openai/openai-dotnet/commit/d84bf54df14ddac4c49f6efd61467b600d34ecd7)) +- Changed the type of `OpenAIFileInfo`'s `SizeInBytes` property from `long?` to `int?`. ([d84bf54](https://github.com/openai/openai-dotnet/commit/d84bf54df14ddac4c49f6efd61467b600d34ecd7)) + +### Bugs Fixed + +- Fixed a newly introduced bug ([#185](https://github.com/openai/openai-dotnet/pull/185)) where providing `OpenAIClientOptions` to a top-level `OpenAIClient` did not carry over to scenario clients (e.g. 
`ChatClient`) created via that top-level client ([d84bf54](https://github.com/openai/openai-dotnet/commit/d84bf54df14ddac4c49f6efd61467b600d34ecd7)) + +### Other Changes + +- Removed the version path parameter "v1" from the default endpoint URL. ([d84bf54](https://github.com/openai/openai-dotnet/commit/d84bf54df14ddac4c49f6efd61467b600d34ecd7)) + +## 2.0.0-beta.9 (2024-08-23) + +### Features Added + +- Added support for the new [structured outputs](https://platform.openai.com/docs/guides/structured-outputs/introduction) response format feature, which enables chat completions, assistants, and tools on each of those clients to provide a specific JSON Schema that generated content should adhere to. ([3467b53](https://github.com/openai/openai-dotnet/commit/3467b535c918e72237a4c0dc36d4bda5548edb7a)) + - To enable top-level structured outputs for response content, use `ChatResponseFormat.CreateJsonSchemaFormat()` and `AssistantResponseFormat.CreateJsonSchemaFormat()` as the `ResponseFormat` in method options like `ChatCompletionOptions` + - To enable structured outputs for function tools, set `StrictParameterSchemaEnabled` to `true` on the tool definition + - For more information, please see [the new section in readme.md](readme.md#how-to-use-structured-outputs) +- Chat completions: the request message types of `AssistantChatMessage`, `SystemChatMessage`, and `ToolChatMessage` now support array-based content part collections in addition to simple string input. 
([3467b53](https://github.com/openai/openai-dotnet/commit/3467b535c918e72237a4c0dc36d4bda5548edb7a)) +- Added the following model factories (static classes that can be used to instantiate OpenAI models for mocking in non-live test scenarios): + - `OpenAIAudioModelFactory` in the `OpenAI.Audio` namespace ([3284295](https://github.com/openai/openai-dotnet/commit/3284295e7fd9922a3395d921513473bcb483655e)) + - `OpenAIEmbeddingsModelFactory` in the `OpenAI.Embeddings` namespace ([3284295](https://github.com/openai/openai-dotnet/commit/3284295e7fd9922a3395d921513473bcb483655e)) + - `OpenAIFilesModelFactory` in the `OpenAI.Files` namespace ([b1ce397](https://github.com/openai/openai-dotnet/commit/b1ce397ff4f9a55db797167be9e86e138ed5d403)) + - `OpenAIImagesModelFactory` in the `OpenAI.Images` namespace ([3284295](https://github.com/openai/openai-dotnet/commit/3284295e7fd9922a3395d921513473bcb483655e)) + - `OpenAIModelsModelFactory` in the `OpenAI.Models` namespace ([b1ce397](https://github.com/openai/openai-dotnet/commit/b1ce397ff4f9a55db797167be9e86e138ed5d403)) + - `OpenAIModerationsModelFactory` in the `OpenAI.Moderations` namespace ([b1ce397](https://github.com/openai/openai-dotnet/commit/b1ce397ff4f9a55db797167be9e86e138ed5d403)) + +### Breaking Changes + +- Removed client constructors that do not explicitly take an API key parameter or an endpoint via an `OpenAIClientOptions` parameter, making it clearer how to appropriately instantiate a client. ([13a9c68](https://github.com/openai/openai-dotnet/commit/13a9c68647c8d54475f1529a63b13ad711bd4ba6)) +- Removed the endpoint parameter from all client constructors, making it clearer that an alternative endpoint must be specified via the `OpenAIClientOptions` parameter. ([13a9c68](https://github.com/openai/openai-dotnet/commit/13a9c68647c8d54475f1529a63b13ad711bd4ba6)) +- Removed `OpenAIClient`'s `Endpoint` `protected` property. 
([13a9c68](https://github.com/openai/openai-dotnet/commit/13a9c68647c8d54475f1529a63b13ad711bd4ba6)) +- Made `OpenAIClient`'s constructor that takes a `ClientPipeline` parameter `protected internal` instead of just `protected`. ([13a9c68](https://github.com/openai/openai-dotnet/commit/13a9c68647c8d54475f1529a63b13ad711bd4ba6)) +- Renamed the `User` property in applicable Options classes to `EndUserId`, making its purpose clearer. ([13a9c68](https://github.com/openai/openai-dotnet/commit/13a9c68647c8d54475f1529a63b13ad711bd4ba6)) + +### Bugs Fixed + +- The `Assistants` namespace `VectorStoreCreationHelper` type now properly includes a `ChunkingStrategy` property. ([3467b53](https://github.com/openai/openai-dotnet/commit/3467b535c918e72237a4c0dc36d4bda5548edb7a)) + +### Other Changes + +- `ChatCompletion.ToString()` will no longer throw an exception when no content is present, as is the case for tool calls. Additionally, if a tool call is present with no content, `ToString()` will return the serialized form of the first available tool call. ([3467b53](https://github.com/openai/openai-dotnet/commit/3467b535c918e72237a4c0dc36d4bda5548edb7a)) + +## 2.0.0-beta.8 (2024-07-31) + +### Breaking Changes + +- Changed name of return types from methods returning streaming collections from `ResultCollection` to `CollectionResult`. ([7bdecfd](https://github.com/openai/openai-dotnet/commit/7bdecfd8d294be933c7779c7e5b6435ba8a8eab0)) +- Changed return types from methods returning paginated collections from `PageableCollection` to `PageCollection`. ([7bdecfd](https://github.com/openai/openai-dotnet/commit/7bdecfd8d294be933c7779c7e5b6435ba8a8eab0)) +- Users must now call `GetAllValues` on the collection of pages to enumerate collection items directly. Corresponding protocol methods return `IEnumerable` where each collection item represents a single service response holding a page of values. 
([7bdecfd](https://github.com/openai/openai-dotnet/commit/7bdecfd8d294be933c7779c7e5b6435ba8a8eab0)) +- Updated `VectorStoreFileCounts` and `VectorStoreFileAssociationError` types from `readonly struct` to `class`. ([58f93c8](https://github.com/openai/openai-dotnet/commit/58f93c8d5ea080adfee8b37ae3cc034ebb06c79f)) + +### Bugs Fixed + +- ([#49](https://github.com/openai/openai-dotnet/issues/49)) Fixed a bug with extensible enums implementing case-insensitive equality but case-sensitive hash codes. ([0c12500](https://github.com/openai/openai-dotnet/commit/0c125002ffd791594597ef837f4d10582bdff004)) +- ([#57](https://github.com/openai/openai-dotnet/issues/57)) Fixed a bug with request URIs with query string parameters potentially containing a malformed double question mark (`??`) on .NET Framework (net481). ([0c12500](https://github.com/openai/openai-dotnet/commit/0c125002ffd791594597ef837f4d10582bdff004)) +- Added optional `CancellationToken` parameters to methods for `AssistantClient` and `VectorStore` client, consistent with past changes in [19a65a0](https://github.com/openai/openai-dotnet/commit/19a65a0a943fa3bef1ec8504708aaa526a1ee03a). ([d77539c](https://github.com/openai/openai-dotnet/commit/d77539ca04467c166f848953eb866012a265555c)) +- Fixed Assistants `FileSearchToolDefinition`'s `MaxResults` parameter to appropriately serialize and deserialize the value ([d77539c](https://github.com/openai/openai-dotnet/commit/d77539ca04467c166f848953eb866012a265555c)) +- Added missing `[EditorBrowsable(EditorBrowsableState.Never)]` attributes to `AssistantClient` protocol methods, which should improve discoverability of the strongly typed methods. ([d77539c](https://github.com/openai/openai-dotnet/commit/d77539ca04467c166f848953eb866012a265555c)) + +### Other Changes + +- Removed the usage of `init` and updated properties to use `set`. 
([58f93c8](https://github.com/openai/openai-dotnet/commit/58f93c8d5ea080adfee8b37ae3cc034ebb06c79f)) + +## 2.0.0-beta.7 (2024-06-24) + +### Bugs Fixed + +- ([#84](https://github.com/openai/openai-dotnet/issues/84)) Fixed a `NullReferenceException` thrown when adding the custom headers policy while `OpenAIClientOptions` is null ([0b97311](https://github.com/openai/openai-dotnet/commit/0b97311f58dfb28bd883d990f68d548da040a807)) + +## 2.0.0-beta.6 (2024-06-21) + +### Features Added + +- `OrganizationId` and `ProjectId` are now present on `OpenAIClientOptions`. When instantiating a client, providing an instance of `OpenAIClientOptions` with these properties set will cause the client to add the appropriate request headers for org/project, eliminating the need to manually configure the headers. ([9ee7dff](https://github.com/openai/openai-dotnet/commit/9ee7dff064a9412c069a793ff62096b8db4aa43d)) + +### Bugs Fixed + +- ([#72](https://github.com/openai/openai-dotnet/issues/72)) Fixed `filename` request encoding in operations using `multipart/form-data`, including `files` and `audio` ([2ba8e69](https://github.com/openai/openai-dotnet/commit/2ba8e69512e187ea0b761edda8bce2cd5c79c58a)) +- ([#79](https://github.com/openai/openai-dotnet/issues/79)) Fixed hard-coded `user` role for caller-created Assistants API messages on threads ([d665b61](https://github.com/openai/openai-dotnet/commit/d665b61fc7ef1ada00a8ef5c00d1a47d276be032)) +- Fixed non-streaming Assistants API run step details not reporting code interpreter logs when present ([d665b61](https://github.com/openai/openai-dotnet/commit/d665b61fc7ef1ada00a8ef5c00d1a47d276be032)) + +### Breaking Changes + +**Assistants (beta)**: + +- `AssistantClient.CreateMessage()` and the explicit constructor for `ThreadInitializationMessage` now require a `MessageRole` parameter. This properly enables the ability to create an Assistant message representing conversation history on a new thread. 
([d665b61](https://github.com/openai/openai-dotnet/commit/d665b61fc7ef1ada00a8ef5c00d1a47d276be032)) + +## 2.0.0-beta.5 (2024-06-14) + +### Features Added + +- API updates, current to [openai/openai-openapi@dd73070b](https://github.com/openai/openai-openapi/commit/dd73070b1d507645d24c249a63ebebd3ec38c0cb) ([1af6569](https://github.com/openai/openai-dotnet/commit/1af6569e2ceae9d840b8826e42d7e3b2569b43f6)) + - This includes `MaxResults` for Assistants `FileSearchToolDefinition`, `ParallelToolCallsEnabled` for function tools in Assistants and Chat, and `FileChunkingStrategy` for Assistants VectorStores +- Optional `CancellationToken` parameters are now directly present on most methods, eliminating the need to use protocol methods ([19a65a0](https://github.com/openai/openai-dotnet/commit/19a65a0a943fa3bef1ec8504708aaa526a1ee03a)) + +### Bugs Fixed + +- ([#30](https://github.com/openai/openai-dotnet/issues/30)) Mitigated a .NET runtime issue that prevented chat message content and several other types from serializing correctly on targets including mono and wasm ([896b9e0](https://github.com/openai/openai-dotnet/commit/896b9e0bc60f0ace90bd0d1af1254cf2680f8df6)) +- Assistants: Fixed an issue that threw an exception when receiving code interpreter run step logs when streaming a run ([207d597](https://github.com/openai/openai-dotnet/commit/207d59762e7eeb666b7ab2728a0bbee7c0cdd918)) +- Fixed a concurrency condition that could cause `multipart/form-data` requests to no longer generate random content part boundaries (no direct scenario impact) ([7cacdee](https://github.com/openai/openai-dotnet/commit/7cacdee2366df3cfa7e6c43bb050da54d23f8db9)) + +### Breaking Changes + +**Assistants (beta)**: + +- `InputQuote` is removed from Assistants `TextAnnotation` and `TextAnnotationUpdate`, per [openai/openai-openapi@dd73070b](https://github.com/openai/openai-openapi/commit/dd73070b1d507645d24c249a63ebebd3ec38c0cb) 
([1af6569](https://github.com/openai/openai-dotnet/commit/1af6569e2ceae9d840b8826e42d7e3b2569b43f6)) + +### Other Changes + +- Added an environment-variable-based test project to the repository with baseline scenario coverage ([db6328a](https://github.com/openai/openai-dotnet/commit/db6328accdd7927f19915cdc5412eb841f2447c1)) +- Build/analyzer warnings cleaned up throughout the project ([45fc4d7](https://github.com/openai/openai-dotnet/commit/45fc4d72c12314aea83264ebe2e1dc18870e5c06), [b1fa082](https://github.com/openai/openai-dotnet/commit/b1fa0828a875906ef33ffe43ff1cd1a85fbd1a60), [22ab606](https://github.com/openai/openai-dotnet/commit/22ab606b867bbe0ea7f6918843dbc5e11dfe78eb)) +- Proactively aligned library's implementation of server-sent event (SSE) handling with the source of the incoming `System.Net.ServerSentEvents` namespace ([674e0f7](https://github.com/openai/openai-dotnet/commit/674e0f773b26a22eb039e879539c4c7a44fdffdd)) + +## 2.0.0-beta.4 (2024-06-10) + +### Features Added + +- Added new, built-in helpers to simplify the use of text-only message content ([1c40de6](https://github.com/openai/openai-dotnet/commit/1c40de673a67ddf834b673aaabb94b2c42076e03)) + +### Bugs Fixed + +- Optimized embedding deserialization and addressed correctness on big endian systems ([e28b5a7](https://github.com/openai/openai-dotnet/commit/e28b5a7787df4b1baa772406b09a0f650a45c77f)) +- Optimized b64_json message parsing via regex ([efd76b5](https://github.com/openai/openai-dotnet/commit/efd76b50f094c585350240aea051ba342c6f6622)) + +## 2.0.0-beta.3 (2024-06-07) + +### Bugs Fixed + +- Removed a vestigial package reference ([5874f53](https://github.com/openai/openai-dotnet/commit/5874f533722ab46a3e077dacb6c3474e0ecca96e)) + +## 2.0.0-beta.2 (2024-06-06) + +### Bugs Fixed + +- Addressed an assembly properties issue ([bf21eb5](https://github.com/openai/openai-dotnet/commit/bf21eb5ad367aaac418dbbf320f98187fee5089a)) +- Added migration guide to package 
([f150666](https://github.com/openai/openai-dotnet/commit/f150666cd2ed552720207098b3b604a8e1ca73df)) + +## 2.0.0-beta.1 (2024-06-06) + +### Features Added + +This is the official OpenAI client library for C# / .NET. It provides convenient access to the OpenAI REST API from .NET applications and supports all the latest features. It is generated from our [OpenAPI specification](https://github.com/openai/openai-openapi) in collaboration with Microsoft. + +### Breaking Changes + +If you are a user migrating from version 1.11.0 or earlier, we will soon share a migration guide to help you get started. + +- ***Addendum:** the [migration guide](https://github.com/openai/openai-dotnet/blob/main/MigrationGuide.md) is now available.* \ No newline at end of file diff --git a/.dotnet/LICENSE b/.dotnet/LICENSE new file mode 100644 index 000000000..c1a63fe15 --- /dev/null +++ b/.dotnet/LICENSE @@ -0,0 +1,22 @@ +The MIT License (MIT) + +Copyright (c) 2024 OpenAI (https://openai.com) + + +Permission is hereby granted, free of charge, to any person obtaining a copy +of this software and associated documentation files (the "Software"), to deal +in the Software without restriction, including without limitation the rights +to use, copy, modify, merge, publish, distribute, sublicense, and/or sell +copies of the Software, and to permit persons to whom the Software is +furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in +all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE +AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER +LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, +OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN +THE SOFTWARE. diff --git a/.dotnet/MigrationGuide.md b/.dotnet/MigrationGuide.md new file mode 100644 index 000000000..4ef8862ae --- /dev/null +++ b/.dotnet/MigrationGuide.md @@ -0,0 +1,341 @@ +# Guide for migrating to OpenAI 2.0.0-beta.1 or higher from OpenAI 1.11.0 + +This guide is intended to assist in the migration to the official OpenAI library (2.0.0-beta.1 or higher) from [OpenAI 1.11.0][openai_1110], focusing on side-by-side comparisons for similar operations between libraries. Version 2.0.0-beta.1 will be used for comparison with 1.11.0 but this guide can still be safely used when migrating to higher versions. + +Prior to 2.0.0-beta.1, the OpenAI package was a community library not officially supported by OpenAI. See the [CHANGELOG][changelog] for more details. + +Familiarity with the OpenAI 1.11.0 package is assumed. For those new to any OpenAI library for .NET, see the [README][readme] rather than this guide. + +## Table of contents +- [Client usage](#client-usage) +- [Authentication](#authentication) +- [Highlighted scenarios](#highlighted-scenarios) + - [Chat Completions: Text generation](#chat-completions-text-generation) + - [Chat Completions: Streaming](#chat-completions-streaming) + - [Chat Completions: JSON mode](#chat-completions-json-mode) + - [Chat Completions: Vision](#chat-completions-vision) + - [Audio: Speech-to-text](#audio-speech-to-text) + - [Audio: Text-to-speech](#audio-text-to-speech) + - [Image: Image generation](#image-image-generation) +- [Additional examples](#additional-examples) + +## Client usage + +The client usage has considerably changed between libraries. 
While OpenAI 1.11.0 had a single client, `OpenAIAPI`, from which multiple APIs could be accessed, OpenAI 2.0.0-beta.1 keeps a separate client per API. The following snippets illustrate this difference when invoking the image generation capability from the Image API: + +OpenAI 1.11.0: +```cs +OpenAIAPI api = new OpenAIAPI(""); +ImageResult result = await api.ImageGenerations.CreateImageAsync("Draw a quick brown fox jumping over a lazy dog.", Model.DALLE3); +``` + +OpenAI 2.0.0-beta.1: +```cs +ImageClient client = new ImageClient("dall-e-3", ""); +ClientResult<GeneratedImage> result = await client.GenerateImageAsync("Draw a quick brown fox jumping over a lazy dog."); +``` + +Another major difference highlighted in the snippets above is that OpenAI 2.0.0-beta.1 requires the model to be explicitly set during client instantiation, while the `OpenAIAPI` client allows a model to be specified per call. + +The table below illustrates to which client each endpoint of `OpenAIAPI` was ported. Note that the deprecated Completions API is not supported in 2.0.0-beta.1: + +|Old library's endpoint|New library's client +|-|- +|Chat | ChatClient +|ImageGenerations | ImageClient +|TextToSpeech | AudioClient +|Transcriptions | AudioClient +|Translations | AudioClient +|Moderation | ModerationClient +|Embeddings | EmbeddingClient +|Files | FileClient +|Models | ModelClient +|Completions | Not supported + +## Authentication + +To authenticate to OpenAI, you must set an API key when creating a client. + +OpenAI 1.11.0 allowed setting the API key in three different ways: +- Directly from a string +- From an environment variable +- From a configuration file + +```cs +OpenAIAPI api; + +// Sets the API key directly from a string. +api = new OpenAIAPI(""); + +// Attempts to load the API key from environment variables OPENAI_KEY and OPENAI_API_KEY. +api = new OpenAIAPI(APIAuthentication.LoadFromEnv()); + +// Attempts to load the API key from a configuration file. 
+api = new OpenAIAPI(APIAuthentication.LoadFromPath("", "")); +``` + +OpenAI 2.0.0-beta.1 only supports setting it from a string or from an environment variable. The following snippet illustrates the behavior with the `ChatClient`, but other clients behave the same: + +```cs +ChatClient client; + +// Sets the API key directly from a string. +client = new ChatClient("gpt-3.5-turbo", ""); + +// When no API key string is specified, attempts to load the API key from the environment variable OPENAI_API_KEY. +client = new ChatClient("gpt-3.5-turbo"); +``` + +Note that, unlike OpenAI 1.11.0, OpenAI 2.0.0-beta.1 will never attempt to load the API key from the `OPENAI_KEY` environment variable. Only `OPENAI_API_KEY` is supported. + +## Highlighted scenarios + +The following sections illustrate side-by-side comparisons for similar operations between the two libraries, highlighting common scenarios. + +### Chat Completions: Text generation + +OpenAI 1.11.0: +```cs +OpenAIAPI api = new OpenAIAPI(""); +Conversation conversation = api.Chat.CreateConversation(); + +conversation.Model = Model.ChatGPTTurbo; +conversation.AppendSystemMessage("You are a helpful assistant."); +conversation.AppendUserInput("When was the Nobel Prize founded?"); + +await conversation.GetResponseFromChatbotAsync(); + +conversation.AppendUserInput("Who was the first person to be awarded one?"); + +await conversation.GetResponseFromChatbotAsync(); + +foreach (ChatMessage message in conversation.Messages) +{ + Console.WriteLine($"{message.Role}: {message.TextContent}"); +} +``` + +OpenAI 2.0.0-beta.1: +```cs +ChatClient client = new ChatClient("gpt-3.5-turbo", ""); +List<ChatMessage> messages = new List<ChatMessage>() +{ + new SystemChatMessage("You are a helpful assistant."), + new UserChatMessage("When was the Nobel Prize founded?") +}; + +ClientResult<ChatCompletion> result = await client.CompleteChatAsync(messages); + +messages.Add(new AssistantChatMessage(result)); +messages.Add(new UserChatMessage("Who was the first person to be awarded 
one?")); + +result = await client.CompleteChatAsync(messages); + +messages.Add(new AssistantChatMessage(result)); + +foreach (ChatMessage message in messages) +{ + string role = message.GetType().Name; + string text = message.Content[0].Text; + + Console.WriteLine($"{role}: {text}"); +} +``` + +### Chat Completions: Streaming + +OpenAI 1.11.0: +```cs +OpenAIAPI api = new OpenAIAPI(""); +Conversation conversation = api.Chat.CreateConversation(); + +conversation.Model = Model.ChatGPTTurbo; +conversation.AppendUserInput("Give me a list of Nobel Prize winners of the last 5 years."); + +await foreach (string response in conversation.StreamResponseEnumerableFromChatbotAsync()) +{ + Console.Write(response); +} +``` + +OpenAI 2.0.0-beta.1: +```cs +ChatClient client = new ChatClient("gpt-3.5-turbo", ""); +List<ChatMessage> messages = new List<ChatMessage>() +{ + new UserChatMessage("Give me a list of Nobel Prize winners of the last 5 years.") +}; + +await foreach (StreamingChatCompletionUpdate chatUpdate in client.CompleteChatStreamingAsync(messages)) +{ + if (chatUpdate.ContentUpdate.Count > 0) + { + Console.Write(chatUpdate.ContentUpdate[0].Text); + } +} +``` + +### Chat Completions: JSON mode + +OpenAI 1.11.0: +```cs +OpenAIAPI api = new OpenAIAPI(""); +ChatRequest request = new ChatRequest() +{ + Model = Model.ChatGPTTurbo, + ResponseFormat = ChatRequest.ResponseFormats.JsonObject, + Messages = new List<ChatMessage>() + { + new ChatMessage(ChatMessageRole.System, "You are a helpful assistant designed to output JSON."), + new ChatMessage(ChatMessageRole.User, "Give me a JSON object listing Nobel Prize winners of the last 5 years.") + } +}; + +ChatResult result = await api.Chat.CreateChatCompletionAsync(request); + +Console.WriteLine(result); +``` + +OpenAI 2.0.0-beta.1: +```cs +ChatClient client = new ChatClient("gpt-3.5-turbo", ""); +List<ChatMessage> messages = new List<ChatMessage>() +{ + new SystemChatMessage("You are a helpful assistant designed to output JSON."), + new UserChatMessage("Give me a JSON object listing Nobel Prize 
winners of the last 5 years.") +}; +ChatCompletionOptions options = new ChatCompletionOptions() +{ + ResponseFormat = ChatResponseFormat.JsonObject +}; + +ClientResult<ChatCompletion> result = await client.CompleteChatAsync(messages, options); +string text = result.Value.Content[0].Text; + +Console.WriteLine(text); +``` + +### Chat Completions: Vision + +OpenAI 1.11.0: +```cs +OpenAIAPI api = new OpenAIAPI(""); +Conversation conversation = api.Chat.CreateConversation(); +byte[] imageData = await File.ReadAllBytesAsync(""); + +conversation.Model = Model.GPT4_Vision; +conversation.AppendUserInput("Describe this image.", ImageInput.FromImageBytes(imageData)); + +string response = await conversation.GetResponseFromChatbotAsync(); + +Console.WriteLine(response); +``` + +OpenAI 2.0.0-beta.1: +```cs +ChatClient client = new ChatClient("gpt-4-vision-preview", ""); +using FileStream file = File.OpenRead(""); +BinaryData imageData = await BinaryData.FromStreamAsync(file); +List<ChatMessage> messages = new List<ChatMessage>() +{ + new UserChatMessage( + ChatMessageContentPart.CreateTextMessageContentPart("Describe this image."), + ChatMessageContentPart.CreateImageMessageContentPart(imageData, "image/png")) +}; + +ClientResult<ChatCompletion> result = await client.CompleteChatAsync(messages); +string text = result.Value.Content[0].Text; + +Console.WriteLine(text); +``` + +### Audio: Speech-to-text + +OpenAI 1.11.0: +```cs +OpenAIAPI api = new OpenAIAPI(""); +string result = await api.Transcriptions.GetTextAsync("", "fr"); + +Console.WriteLine(result); +``` + +OpenAI 2.0.0-beta.1: +```cs +AudioClient client = new AudioClient("whisper-1", ""); +AudioTranscriptionOptions options = new AudioTranscriptionOptions() +{ + Language = "fr" +}; + +ClientResult<AudioTranscription> result = await client.TranscribeAudioAsync("", options); +string text = result.Value.Text; + +Console.WriteLine(text); +``` + +### Audio: Text-to-speech + +OpenAI 1.11.0: +```cs +OpenAIAPI api = new OpenAIAPI(""); +TextToSpeechRequest request = new TextToSpeechRequest() +{ + Input = 
"Hasta la vista, baby.", + Model = Model.TTS_Speed, + Voice = "alloy" +}; + +await api.TextToSpeech.SaveSpeechToFileAsync(request, ""); +``` + +OpenAI 2.0.0-beta.1: +```cs +AudioClient client = new AudioClient("tts-1", ""); + +ClientResult<BinaryData> result = await client.GenerateSpeechFromTextAsync("Hasta la vista, baby.", GeneratedSpeechVoice.Alloy); +BinaryData data = result.Value; + +await File.WriteAllBytesAsync("", data.ToArray()); +``` + +### Image: Image generation + +OpenAI 1.11.0: +```cs +OpenAIAPI api = new OpenAIAPI(""); +ImageGenerationRequest request = new ImageGenerationRequest() +{ + Prompt = "Draw a quick brown fox jumping over a lazy dog.", + Model = Model.DALLE3, + Quality = "standard", + Size = ImageSize._1024 +}; + +ImageResult result = await api.ImageGenerations.CreateImageAsync(request); + +Console.WriteLine(result.Data[0].Url); +``` + +OpenAI 2.0.0-beta.1: +```cs +ImageClient client = new ImageClient("dall-e-3", ""); +ImageGenerationOptions options = new ImageGenerationOptions() +{ + Quality = GeneratedImageQuality.Standard, + Size = GeneratedImageSize.W1024xH1024 +}; + +ClientResult<GeneratedImage> result = await client.GenerateImageAsync("Draw a quick brown fox jumping over a lazy dog.", options); +Uri imageUri = result.Value.ImageUri; + +Console.WriteLine(imageUri.AbsoluteUri); +``` + +## Additional examples + +For additional examples, see [OpenAI Examples][examples]. 
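+ +As one more image generation variation: the scenario above prints a hosted URL, but the new library can also return the generated image as raw bytes. This is a hedged sketch, assuming the `GeneratedImageResponseFormat.Bytes` option and the `GeneratedImage.ImageBytes` property available in recent 2.0.0 betas; the output file name is illustrative: + +```cs +ImageClient client = new ImageClient("dall-e-3", ""); +ImageGenerationOptions options = new ImageGenerationOptions() +{ + // Ask the service for base64-encoded image data instead of a temporary URL. + ResponseFormat = GeneratedImageResponseFormat.Bytes +}; + +ClientResult<GeneratedImage> result = await client.GenerateImageAsync("Draw a quick brown fox jumping over a lazy dog.", options); +BinaryData imageBytes = result.Value.ImageBytes; + +// Persist the image locally; BinaryData exposes the raw buffer via ToArray(). +await File.WriteAllBytesAsync("fox.png", imageBytes.ToArray()); +``` +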
+ +[readme]: https://github.com/openai/openai-dotnet/blob/main/README.md +[changelog]: https://github.com/openai/openai-dotnet/blob/main/CHANGELOG.md +[examples]: https://github.com/openai/openai-dotnet/tree/main/examples +[openai_1110]: https://aka.ms/openai1110 diff --git a/.dotnet/OpenAI.sln b/.dotnet/OpenAI.sln new file mode 100644 index 000000000..d6350d85f --- /dev/null +++ b/.dotnet/OpenAI.sln @@ -0,0 +1,36 @@ +Microsoft Visual Studio Solution File, Format Version 12.00 +# Visual Studio Version 17 +VisualStudioVersion = 17.9.34902.65 +MinimumVisualStudioVersion = 10.0.40219.1 +Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "OpenAI", "src\OpenAI.csproj", "{28FF4005-4467-4E36-92E7-DEA27DEB1519}" +EndProject +Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "OpenAI.Examples", "examples\OpenAI.Examples.csproj", "{1F1CD1D4-9932-4B73-99D8-C252A67D4B46}" +EndProject +Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "OpenAI.Tests", "tests\OpenAI.Tests.csproj", "{6F156401-2544-41D7-B204-3148C51C1D09}" +EndProject +Global + GlobalSection(SolutionConfigurationPlatforms) = preSolution + Debug|Any CPU = Debug|Any CPU + Release|Any CPU = Release|Any CPU + EndGlobalSection + GlobalSection(ProjectConfigurationPlatforms) = postSolution + {28FF4005-4467-4E36-92E7-DEA27DEB1519}.Debug|Any CPU.ActiveCfg = Debug|Any CPU + {28FF4005-4467-4E36-92E7-DEA27DEB1519}.Debug|Any CPU.Build.0 = Debug|Any CPU + {28FF4005-4467-4E36-92E7-DEA27DEB1519}.Release|Any CPU.ActiveCfg = Release|Any CPU + {28FF4005-4467-4E36-92E7-DEA27DEB1519}.Release|Any CPU.Build.0 = Release|Any CPU + {1F1CD1D4-9932-4B73-99D8-C252A67D4B46}.Debug|Any CPU.ActiveCfg = Debug|Any CPU + {1F1CD1D4-9932-4B73-99D8-C252A67D4B46}.Debug|Any CPU.Build.0 = Debug|Any CPU + {1F1CD1D4-9932-4B73-99D8-C252A67D4B46}.Release|Any CPU.ActiveCfg = Release|Any CPU + {1F1CD1D4-9932-4B73-99D8-C252A67D4B46}.Release|Any CPU.Build.0 = Release|Any CPU + {6F156401-2544-41D7-B204-3148C51C1D09}.Debug|Any CPU.ActiveCfg = Debug|Any CPU + 
{6F156401-2544-41D7-B204-3148C51C1D09}.Debug|Any CPU.Build.0 = Debug|Any CPU + {6F156401-2544-41D7-B204-3148C51C1D09}.Release|Any CPU.ActiveCfg = Release|Any CPU + {6F156401-2544-41D7-B204-3148C51C1D09}.Release|Any CPU.Build.0 = Release|Any CPU + EndGlobalSection + GlobalSection(SolutionProperties) = preSolution + HideSolutionNode = FALSE + EndGlobalSection + GlobalSection(ExtensibilityGlobals) = postSolution + SolutionGuid = {A97F4B90-2591-4689-B1F8-5F21FE6D6CAE} + EndGlobalSection +EndGlobal \ No newline at end of file diff --git a/.dotnet/README.md b/.dotnet/README.md new file mode 100644 index 000000000..7185e3464 --- /dev/null +++ b/.dotnet/README.md @@ -0,0 +1,811 @@ +# OpenAI .NET API library + +[![NuGet version](https://img.shields.io/nuget/vpre/openai.svg)](https://www.nuget.org/packages/OpenAI/absoluteLatest) + +The OpenAI .NET library provides convenient access to the OpenAI REST API from .NET applications. + +It is generated from our [OpenAPI specification](https://github.com/openai/openai-openapi) in collaboration with Microsoft. 
+ +## Table of Contents + +- [Getting started](#getting-started) + - [Prerequisites](#prerequisites) + - [Install the NuGet package](#install-the-nuget-package) +- [Using the client library](#using-the-client-library) + - [Namespace organization](#namespace-organization) + - [Using the async API](#using-the-async-api) + - [Using the `OpenAIClient` class](#using-the-openaiclient-class) +- [How to use chat completions with streaming](#how-to-use-chat-completions-with-streaming) +- [How to use chat completions with tools and function calling](#how-to-use-chat-completions-with-tools-and-function-calling) +- [How to use structured outputs](#how-to-use-structured-outputs) +- [How to generate text embeddings](#how-to-generate-text-embeddings) +- [How to generate images](#how-to-generate-images) +- [How to transcribe audio](#how-to-transcribe-audio) +- [How to use assistants with retrieval augmented generation (RAG)](#how-to-use-assistants-with-retrieval-augmented-generation-rag) +- [How to use streaming and GPT-4o vision with assistants](#how-to-use-streaming-and-gpt-4o-vision-with-assistants) +- [How to work with Azure OpenAI](#how-to-work-with-azure-openai) +- [Advanced scenarios](#advanced-scenarios) + - [Using protocol methods](#using-protocol-methods) + - [Automatically retrying errors](#automatically-retrying-errors) + - [Observability](#observability) + +## Getting started + +### Prerequisites + +To call the OpenAI REST API, you will need an API key. To obtain one, first [create a new OpenAI account](https://platform.openai.com/signup) or [log in](https://platform.openai.com/login). Next, navigate to the [API key page](https://platform.openai.com/account/api-keys) and select "Create new secret key", optionally naming the key. Make sure to save your API key somewhere safe and do not share it with anyone. 
+ +### Install the NuGet package + +Add the client library to your .NET project with [NuGet](https://www.nuget.org/) using your IDE or the dotnet CLI: + +```cli +dotnet add package OpenAI --prerelease +``` + +Note that the code examples included below were written using [.NET 8](https://dotnet.microsoft.com/download/dotnet/8.0). The OpenAI .NET library is compatible with all .NET Standard 2.0 applications but some code examples in this document may depend on newer language features. + +## Using the client library + +The full API of this library can be found in the [api.md](https://github.com/openai/openai-dotnet/blob/main/api/api.md) file, and there are many [code examples](https://github.com/openai/openai-dotnet/tree/main/examples) to help. For instance, the following snippet illustrates the basic use of the chat completions API: + +```csharp +using OpenAI.Chat; + +ChatClient client = new(model: "gpt-4o", Environment.GetEnvironmentVariable("OPENAI_API_KEY")); + +ChatCompletion completion = client.CompleteChat("Say 'this is a test.'"); + +Console.WriteLine($"[ASSISTANT]: {completion}"); +``` + +While you can pass your API key directly as a string, it is highly recommended to keep it in a secure location and instead access it via an environment variable or configuration file as shown above to avoid storing it in source control. + +### Namespace organization + +The library is organized into several namespaces corresponding to OpenAI feature areas. Each namespace contains a corresponding client class. 
+ +| Namespace | Client class | Notes | +| ------------------------------|------------------------------|---------------------| +| `OpenAI.Assistants` | `AssistantClient` | \[Experimental\] | +| `OpenAI.Audio` | `AudioClient` | | +| `OpenAI.Batch` | `BatchClient` | | +| `OpenAI.Chat` | `ChatClient` | | +| `OpenAI.Embeddings` | `EmbeddingClient` | | +| `OpenAI.FineTuning` | `FineTuningClient` | | +| `OpenAI.Files` | `FileClient` | | +| `OpenAI.Images` | `ImageClient` | | +| `OpenAI.Models` | `ModelClient` | | +| `OpenAI.Moderations` | `ModerationClient` | | +| `OpenAI.VectorStores` | `VectorStoreClient` | \[Experimental\] | + +### Using the async API + +Every client method that performs a synchronous API call has an asynchronous variant in the same client class. For instance, the asynchronous variant of the `ChatClient`'s `CompleteChat` method is `CompleteChatAsync`. To rewrite the call above using the asynchronous counterpart, simply `await` the call to the corresponding async variant: + +```csharp +ChatCompletion completion = await client.CompleteChatAsync("Say 'this is a test.'"); +``` + +### Using the `OpenAIClient` class + +In addition to the namespaces mentioned above, there is also the parent `OpenAI` namespace itself: + +```csharp +using OpenAI; +``` + +This namespace contains the `OpenAIClient` class, which offers certain conveniences when you need to work with multiple feature area clients. Specifically, you can use an instance of this class to create instances of the other clients and have them share the same implementation details, which might be more efficient. 
+ +You can create an `OpenAIClient` by specifying the API key that all clients will use for authentication: + +```csharp +OpenAIClient client = new(Environment.GetEnvironmentVariable("OPENAI_API_KEY")); +``` + +Next, to create an instance of an `AudioClient`, for example, you can call the `OpenAIClient`'s `GetAudioClient` method by passing the OpenAI model that the `AudioClient` will use, just as if you were using the `AudioClient` constructor directly. If necessary, you can create additional clients of the same type to target different models. + +```csharp +AudioClient ttsClient = client.GetAudioClient("tts-1"); +AudioClient whisperClient = client.GetAudioClient("whisper-1"); +``` + +## How to use chat completions with streaming + +When you request a chat completion, the default behavior is for the server to generate it in its entirety before sending it back in a single response. Consequently, long chat completions can require waiting for several seconds before hearing back from the server. To mitigate this, the OpenAI REST API supports the ability to stream partial results back as they are being generated, allowing you to start processing the beginning of the completion before it is finished. + +The client library offers a convenient approach to working with streaming chat completions. 
If you wanted to rewrite the example from the previous section using streaming, rather than calling the `ChatClient`'s `CompleteChat` method, you would call its `CompleteChatStreaming` method instead: + +```csharp +CollectionResult<StreamingChatCompletionUpdate> updates + = client.CompleteChatStreaming("Say 'this is a test.'"); +``` + +Notice that the returned value is a `CollectionResult<StreamingChatCompletionUpdate>` instance, which can be enumerated to process the streaming response chunks as they arrive: + +```csharp +Console.WriteLine($"[ASSISTANT]:"); +foreach (StreamingChatCompletionUpdate update in updates) +{ + foreach (ChatMessageContentPart updatePart in update.ContentUpdate) + { + Console.Write(updatePart); + } +} +``` + +Alternatively, you can do this asynchronously by calling the `CompleteChatStreamingAsync` method to get an `AsyncCollectionResult<StreamingChatCompletionUpdate>` and enumerate it using `await foreach`: + +```csharp +AsyncCollectionResult<StreamingChatCompletionUpdate> updates + = client.CompleteChatStreamingAsync("Say 'this is a test.'"); + +Console.WriteLine($"[ASSISTANT]:"); +await foreach (StreamingChatCompletionUpdate update in updates) +{ + foreach (ChatMessageContentPart updatePart in update.ContentUpdate) + { + Console.Write(updatePart.Text); + } +} +``` + +## How to use chat completions with tools and function calling + +In this example, you have two functions. The first function can retrieve a user's current geographic location (e.g., by polling the location service APIs of the user's device), while the second function can query the weather in a given location (e.g., by making an API call to some third-party weather service). You want chat completions to be able to call these functions if the model deems it necessary to have this information in order to respond to a user request. For illustrative purposes, consider the following: + +```csharp +private static string GetCurrentLocation() +{ + // Call the location API here. 
+ return "San Francisco"; +} + +private static string GetCurrentWeather(string location, string unit = "celsius") +{ + // Call the weather API here. + return $"31 {unit}"; +} +``` + +Start by creating two `ChatTool` instances using the static `CreateFunctionTool` method to describe each function: + +```csharp +private static readonly ChatTool getCurrentLocationTool = ChatTool.CreateFunctionTool( + functionName: nameof(GetCurrentLocation), + functionDescription: "Get the user's current location" +); + +private static readonly ChatTool getCurrentWeatherTool = ChatTool.CreateFunctionTool( + functionName: nameof(GetCurrentWeather), + functionDescription: "Get the current weather in a given location", + functionParameters: BinaryData.FromString(""" + { + "type": "object", + "properties": { + "location": { + "type": "string", + "description": "The city and state, e.g. Boston, MA" + }, + "unit": { + "type": "string", + "enum": [ "celsius", "fahrenheit" ], + "description": "The temperature unit to use. Infer this from the specified location." + } + }, + "required": [ "location" ] + } + """) +); +``` + +Next, create a `ChatCompletionOptions` instance and add both tools to its `Tools` property. You will pass the `ChatCompletionOptions` as an argument in your calls to the `ChatClient`'s `CompleteChat` method. + +```csharp +List<ChatMessage> messages = [ + new UserChatMessage("What's the weather like today?"), +]; + +ChatCompletionOptions options = new() +{ + Tools = { getCurrentLocationTool, getCurrentWeatherTool }, +}; +``` + +When the resulting `ChatCompletion` has a `FinishReason` property equal to `ChatFinishReason.ToolCalls`, it means that the model has determined that one or more tools must be called before the assistant can respond appropriately. In those cases, you must first call the function specified in the `ChatCompletion`'s `ToolCalls` and then call the `ChatClient`'s `CompleteChat` method again while passing the function's result as an additional `ToolChatMessage`. 
Repeat this process as needed. + +```csharp +bool requiresAction; + +do +{ + requiresAction = false; + ChatCompletion chatCompletion = client.CompleteChat(messages, options); + + switch (chatCompletion.FinishReason) + { + case ChatFinishReason.Stop: + { + // Add the assistant message to the conversation history. + messages.Add(new AssistantChatMessage(chatCompletion)); + break; + } + + case ChatFinishReason.ToolCalls: + { + // First, add the assistant message with tool calls to the conversation history. + messages.Add(new AssistantChatMessage(chatCompletion)); + + // Then, add a new tool message for each tool call that is resolved. + foreach (ChatToolCall toolCall in chatCompletion.ToolCalls) + { + switch (toolCall.FunctionName) + { + case nameof(GetCurrentLocation): + { + string toolResult = GetCurrentLocation(); + messages.Add(new ToolChatMessage(toolCall.Id, toolResult)); + break; + } + + case nameof(GetCurrentWeather): + { + // The arguments that the model wants to use to call the function are specified as a + // stringified JSON object based on the schema defined in the tool definition. Note that + // the model may hallucinate arguments too. Consequently, it is important to do the + // appropriate parsing and validation before calling the function. + using JsonDocument argumentsJson = JsonDocument.Parse(toolCall.FunctionArguments); + bool hasLocation = argumentsJson.RootElement.TryGetProperty("location", out JsonElement location); + bool hasUnit = argumentsJson.RootElement.TryGetProperty("unit", out JsonElement unit); + + if (!hasLocation) + { + throw new ArgumentNullException(nameof(location), "The location argument is required."); + } + + string toolResult = hasUnit + ? GetCurrentWeather(location.GetString(), unit.GetString()) + : GetCurrentWeather(location.GetString()); + messages.Add(new ToolChatMessage(toolCall.Id, toolResult)); + break; + } + + default: + { + // Handle other unexpected calls. 
+ throw new NotImplementedException(); + } + } + } + + requiresAction = true; + break; + } + + case ChatFinishReason.Length: + throw new NotImplementedException("Incomplete model output due to MaxTokens parameter or token limit exceeded."); + + case ChatFinishReason.ContentFilter: + throw new NotImplementedException("Omitted content due to a content filter flag."); + + case ChatFinishReason.FunctionCall: + throw new NotImplementedException("Deprecated in favor of tool calls."); + + default: + throw new NotImplementedException(chatCompletion.FinishReason.ToString()); + } +} while (requiresAction); +``` + +## How to use structured outputs + +Beginning with the `gpt-4o-mini`, `gpt-4o-mini-2024-07-18`, and `gpt-4o-2024-08-06` model snapshots, structured outputs are available for both top-level response content and tool calls in the chat completion and assistants APIs. + +For information about the feature, see [the Structured Outputs guide](https://platform.openai.com/docs/guides/structured-outputs/introduction). 
+ +To use structured outputs to constrain chat completion content, set an appropriate `ChatResponseFormat` as in the following example: + +```csharp +ChatCompletionOptions options = new() +{ + ResponseFormat = ChatResponseFormat.CreateJsonSchemaFormat( + name: "math_reasoning", + jsonSchema: BinaryData.FromString(""" + { + "type": "object", + "properties": { + "steps": { + "type": "array", + "items": { + "type": "object", + "properties": { + "explanation": { "type": "string" }, + "output": { "type": "string" } + }, + "required": ["explanation", "output"], + "additionalProperties": false + } + }, + "final_answer": { "type": "string" } + }, + "required": ["steps", "final_answer"], + "additionalProperties": false + } + """), + strictSchemaEnabled: true) +}; + +ChatCompletion chatCompletion = await client.CompleteChatAsync( + ["How can I solve 8x + 7 = -23?"], + options); + +using JsonDocument structuredJson = JsonDocument.Parse(chatCompletion.ToString()); + +Console.WriteLine($"Final answer: {structuredJson.RootElement.GetProperty("final_answer").GetString()}"); +Console.WriteLine("Reasoning steps:"); + +foreach (JsonElement stepElement in structuredJson.RootElement.GetProperty("steps").EnumerateArray()) +{ + Console.WriteLine($" - Explanation: {stepElement.GetProperty("explanation").GetString()}"); + Console.WriteLine($" Output: {stepElement.GetProperty("output")}"); +} +``` + +## How to generate text embeddings + +In this example, you want to create a trip-planning website that allows customers to write a prompt describing the kind of hotel that they are looking for and then offers hotel recommendations that closely match this description. To achieve this, it is possible to use text embeddings to measure the relatedness of text strings. In summary, you can get embeddings of the hotel descriptions, store them in a vector database, and use them to build a search index that you can query using the embedding of a given customer's prompt. 
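To make "relatedness" concrete: embedding-based search typically scores each candidate by the cosine similarity between its vector and the query's vector. Here is a minimal, self-contained sketch of that scoring step; the toy 3-dimensional vectors below stand in for real embedding output and are not part of the OpenAI library:

```csharp
using System;

class CosineSimilarityDemo
{
    // Cosine similarity: dot(a, b) / (|a| * |b|). Values near 1 indicate
    // closely related texts; values near 0 indicate unrelated texts.
    public static double CosineSimilarity(ReadOnlySpan<float> a, ReadOnlySpan<float> b)
    {
        double dot = 0, normA = 0, normB = 0;
        for (int i = 0; i < a.Length; i++)
        {
            dot += a[i] * b[i];
            normA += a[i] * a[i];
            normB += b[i] * b[i];
        }
        return dot / (Math.Sqrt(normA) * Math.Sqrt(normB));
    }

    static void Main()
    {
        // Toy vectors standing in for embeddings of hotel descriptions and a query.
        float[] luxuryHotel = { 0.9f, 0.2f, 0.1f };
        float[] boutiqueHotel = { 0.8f, 0.3f, 0.2f };
        float[] campsite = { 0.1f, 0.2f, 0.9f };

        Console.WriteLine($"luxury vs. boutique: {CosineSimilarity(luxuryHotel, boutiqueHotel):F3}");
        Console.WriteLine($"luxury vs. campsite: {CosineSimilarity(luxuryHotel, campsite):F3}");
    }
}
```

In a real application, the inputs would be the 1536-dimensional vectors returned by `GenerateEmbedding`, and a vector database would typically perform this scoring at scale rather than a hand-rolled loop.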
+
+To generate a text embedding, use `EmbeddingClient` from the `OpenAI.Embeddings` namespace:
+
+```csharp
+using OpenAI.Embeddings;
+
+EmbeddingClient client = new(model: "text-embedding-3-small", Environment.GetEnvironmentVariable("OPENAI_API_KEY"));
+
+string description = "Best hotel in town if you like luxury hotels. They have an amazing infinity pool, a spa,"
+    + " and a really helpful concierge. The location is perfect -- right downtown, close to all the tourist"
+    + " attractions. We highly recommend this hotel.";
+
+Embedding embedding = client.GenerateEmbedding(description);
+ReadOnlyMemory<float> vector = embedding.Vector;
+```
+
+Notice that the resulting embedding is a list (also called a vector) of floating point numbers represented as an instance of `ReadOnlyMemory<float>`. By default, the length of the embedding vector will be 1536 when using the `text-embedding-3-small` model or 3072 when using the `text-embedding-3-large` model. Generally, larger embeddings perform better, but using them also tends to cost more in terms of compute, memory, and storage. You can reduce the dimensions of the embedding by creating an instance of the `EmbeddingGenerationOptions` class, setting the `Dimensions` property, and passing it as an argument in your call to the `GenerateEmbedding` method:
+
+```csharp
+EmbeddingGenerationOptions options = new() { Dimensions = 512 };
+
+Embedding embedding = client.GenerateEmbedding(description, options);
+```
+
+## How to generate images
+
+In this example, you want to build an app to help interior designers prototype new ideas based on the latest design trends. As part of the creative process, an interior designer can use this app to generate images for inspiration simply by describing the scene in their head as a prompt. As expected, high-quality, strikingly dramatic images with finer details deliver the best results for this application.
+ +To generate an image, use `ImageClient` from the `OpenAI.Images` namespace: + +```csharp +using OpenAI.Images; + +ImageClient client = new(model: "dall-e-3", Environment.GetEnvironmentVariable("OPENAI_API_KEY")); +``` + +Generating an image always requires a `prompt` that describes what should be generated. To further tailor the image generation to your specific needs, you can create an instance of the `ImageGenerationOptions` class and set the `Quality`, `Size`, and `Style` properties accordingly. Note that you can also set the `ResponseFormat` property of `ImageGenerationOptions` to `GeneratedImageFormat.Bytes` in order to receive the resulting PNG as `BinaryData` (instead of the default remote `Uri`) if this is convenient for your use case. + +```csharp +string prompt = "The concept for a living room that blends Scandinavian simplicity with Japanese minimalism for" + + " a serene and cozy atmosphere. It's a space that invites relaxation and mindfulness, with natural light" + + " and fresh air. Using neutral tones, including colors like white, beige, gray, and black, that create a" + + " sense of harmony. Featuring sleek wood furniture with clean lines and subtle curves to add warmth and" + + " elegance. Plants and flowers in ceramic pots adding color and life to a space. They can serve as focal" + + " points, creating a connection with nature. Soft textiles and cushions in organic fabrics adding comfort" + + " and softness to a space. 
They can serve as accents, adding contrast and texture."; + +ImageGenerationOptions options = new() +{ + Quality = GeneratedImageQuality.High, + Size = GeneratedImageSize.W1792xH1024, + Style = GeneratedImageStyle.Vivid, + ResponseFormat = GeneratedImageFormat.Bytes +}; +``` + +Finally, call the `ImageClient`'s `GenerateImage` method by passing the prompt and the `ImageGenerationOptions` instance as arguments: + +```csharp +GeneratedImage image = client.GenerateImage(prompt, options); +BinaryData bytes = image.ImageBytes; +``` + +For illustrative purposes, you could then save the generated image to local storage: + +```csharp +using FileStream stream = File.OpenWrite($"{Guid.NewGuid()}.png"); +bytes.ToStream().CopyTo(stream); +``` + +## How to transcribe audio + +In this example, an audio file is transcribed using the Whisper speech-to-text model, including both word- and audio-segment-level timestamp information. + +```csharp +using OpenAI.Audio; + +AudioClient client = new(model: "whisper-1", Environment.GetEnvironmentVariable("OPENAI_API_KEY")); + +string audioFilePath = Path.Combine("Assets", "audio_houseplant_care.mp3"); + +AudioTranscriptionOptions options = new() +{ + ResponseFormat = AudioTranscriptionFormat.Verbose, + Granularities = AudioTimestampGranularities.Word | AudioTimestampGranularities.Segment, +}; + +AudioTranscription transcription = client.TranscribeAudio(audioFilePath, options); + +Console.WriteLine("Transcription:"); +Console.WriteLine($"{transcription.Text}"); + +Console.WriteLine(); +Console.WriteLine($"Words:"); +foreach (TranscribedWord word in transcription.Words) +{ + Console.WriteLine($" {word.Word,15} : {word.Start.TotalMilliseconds,5:0} - {word.End.TotalMilliseconds,5:0}"); +} + +Console.WriteLine(); +Console.WriteLine($"Segments:"); +foreach (TranscribedSegment segment in transcription.Segments) +{ + Console.WriteLine($" {segment.Text,90} : {segment.Start.TotalMilliseconds,5:0} - {segment.End.TotalMilliseconds,5:0}"); +} +``` + +## 
How to use assistants with retrieval augmented generation (RAG) + +In this example, you have a JSON document with the monthly sales information of different products, and you want to build an assistant capable of analyzing it and answering questions about it. + +To achieve this, use both `FileClient` from the `OpenAI.Files` namespace and `AssistantClient` from the `OpenAI.Assistants` namespace. + +Important: The Assistants REST API is currently in beta. As such, the details are subject to change, and correspondingly the `AssistantClient` is attributed as `[Experimental]`. To use it, suppress the `OPENAI001` warning at either the project level or, as below, in the code itself. + +```csharp +using OpenAI.Assistants; +using OpenAI.Files; + +// Assistants is a beta API and subject to change; acknowledge its experimental status by suppressing the matching warning. +#pragma warning disable OPENAI001 +OpenAIClient openAIClient = new(Environment.GetEnvironmentVariable("OPENAI_API_KEY")); +FileClient fileClient = openAIClient.GetFileClient(); +AssistantClient assistantClient = openAIClient.GetAssistantClient(); +``` + +Here is an example of what the JSON document might look like: + +```csharp +using Stream document = BinaryData.FromString(""" + { + "description": "This document contains the sale history data for Contoso products.", + "sales": [ + { + "month": "January", + "by_product": { + "113043": 15, + "113045": 12, + "113049": 2 + } + }, + { + "month": "February", + "by_product": { + "113045": 22 + } + }, + { + "month": "March", + "by_product": { + "113045": 16, + "113055": 5 + } + } + ] + } + """).ToStream(); +``` + +Upload this document to OpenAI using the `FileClient`'s `UploadFile` method, ensuring that you use `FileUploadPurpose.Assistants` to allow your assistant to access it later: + +```csharp +OpenAIFileInfo salesFile = fileClient.UploadFile( + document, + "monthly_sales.json", + FileUploadPurpose.Assistants); +``` + +Create a new assistant using an instance of 
the `AssistantCreationOptions` class to customize it. Here, we use:
+
+- A friendly `Name` for the assistant, as it will display in the Playground
+- Tool definition instances for the tools that the assistant should have access to; here, we use `FileSearchToolDefinition` to process the sales document we just uploaded and `CodeInterpreterToolDefinition` so we can analyze and visualize the numeric data
+- Resources for the assistant to use with its tools, here using the `VectorStoreCreationHelper` type to automatically make a new vector store that indexes the sales file; alternatively, you could use `VectorStoreClient` to manage the vector store separately
+
+```csharp
+AssistantCreationOptions assistantOptions = new()
+{
+    Name = "Example: Contoso sales RAG",
+    Instructions =
+        "You are an assistant that looks up sales data and helps visualize the information based"
+        + " on user queries. When asked to generate a graph, chart, or other visualization, use"
+        + " the code interpreter tool to do so.",
+    Tools =
+    {
+        new FileSearchToolDefinition(),
+        new CodeInterpreterToolDefinition(),
+    },
+    ToolResources = new()
+    {
+        FileSearch = new()
+        {
+            NewVectorStores =
+            {
+                new VectorStoreCreationHelper([salesFile.Id]),
+            }
+        }
+    },
+};
+
+Assistant assistant = assistantClient.CreateAssistant("gpt-4o", assistantOptions);
+```
+
+Next, create a new thread. For illustrative purposes, you could include an initial user message asking about the sales information of a given product and then use the `AssistantClient`'s `CreateThreadAndRun` method to get it started:
+
+```csharp
+ThreadCreationOptions threadOptions = new()
+{
+    InitialMessages = { "How well did product 113045 sell in February? Graph its trend over time."
}
+};
+
+ThreadRun threadRun = assistantClient.CreateThreadAndRun(assistant.Id, threadOptions);
+```
+
+Poll the status of the run until it is no longer queued or in progress:
+
+```csharp
+do
+{
+    Thread.Sleep(TimeSpan.FromSeconds(1));
+    threadRun = assistantClient.GetRun(threadRun.ThreadId, threadRun.Id);
+} while (!threadRun.Status.IsTerminal);
+```
+
+If everything went well, the terminal status of the run will be `RunStatus.Completed`.
+
+Finally, you can use the `AssistantClient`'s `GetMessages` method to retrieve the messages associated with this thread, which now include the responses from the assistant to the initial user message.
+
+For illustrative purposes, you could print the messages to the console and also save any images produced by the assistant to local storage:
+
+```csharp
+PageCollection<ThreadMessage> messagePages = assistantClient.GetMessages(threadRun.ThreadId, new MessageCollectionOptions() { Order = ListOrder.OldestFirst });
+IEnumerable<ThreadMessage> messages = messagePages.GetAllValues();
+
+foreach (ThreadMessage message in messages)
+{
+    Console.Write($"[{message.Role.ToString().ToUpper()}]: ");
+    foreach (MessageContent contentItem in message.Content)
+    {
+        if (!string.IsNullOrEmpty(contentItem.Text))
+        {
+            Console.WriteLine($"{contentItem.Text}");
+
+            if (contentItem.TextAnnotations.Count > 0)
+            {
+                Console.WriteLine();
+            }
+
+            // Include annotations, if any.
+            foreach (TextAnnotation annotation in contentItem.TextAnnotations)
+            {
+                if (!string.IsNullOrEmpty(annotation.InputFileId))
+                {
+                    Console.WriteLine($"* File citation, file ID: {annotation.InputFileId}");
+                }
+                if (!string.IsNullOrEmpty(annotation.OutputFileId))
+                {
+                    Console.WriteLine($"* File output, new file ID: {annotation.OutputFileId}");
+                }
+            }
+        }
+        if (!string.IsNullOrEmpty(contentItem.ImageFileId))
+        {
+            OpenAIFileInfo imageInfo = fileClient.GetFile(contentItem.ImageFileId);
+            BinaryData imageBytes = fileClient.DownloadFile(contentItem.ImageFileId);
+            using FileStream stream = File.OpenWrite($"{imageInfo.Filename}.png");
+            imageBytes.ToStream().CopyTo(stream);
+
+            Console.WriteLine($"<image: {imageInfo.Filename}.png>");
+        }
+    }
+    Console.WriteLine();
+}
+```
+
+And it would yield something like this:
+
+```text
+[USER]: How well did product 113045 sell in February? Graph its trend over time.
+
+[ASSISTANT]: Product 113045 sold 22 units in February【4:0†monthly_sales.json】.
+
+Now, I will generate a graph to show its sales trend over time.
+
+* File citation, file ID: file-hGOiwGNftMgOsjbynBpMCPFn
+
+[ASSISTANT]:
+The sales trend for Product 113045 over the past three months shows that:
+
+- In January, 12 units were sold.
+- In February, 22 units were sold, indicating significant growth.
+- In March, sales dropped slightly to 16 units.
+
+The graph above visualizes this trend, showing a peak in sales during February.
+```
+
+## How to use streaming and GPT-4o vision with assistants
+
+This example shows how to use the v2 Assistants API to provide image data to an assistant and then stream the run's response.
+
+As before, you will use a `FileClient` and an `AssistantClient`:
+
+```csharp
+// Assistants is a beta API and subject to change; acknowledge its experimental status by suppressing the matching warning.
+#pragma warning disable OPENAI001
+OpenAIClient openAIClient = new(Environment.GetEnvironmentVariable("OPENAI_API_KEY"));
+FileClient fileClient = openAIClient.GetFileClient();
+AssistantClient assistantClient = openAIClient.GetAssistantClient();
+```
+
+For this example, we will use image data from both a local file and an image located at a URL. For the local data, we upload the file with the `Vision` upload purpose, which also allows it to be downloaded and retrieved later.
+
+```csharp
+OpenAIFileInfo pictureOfAppleFile = fileClient.UploadFile(
+    "picture-of-apple.jpg",
+    FileUploadPurpose.Vision);
+Uri linkToPictureOfOrange = new("https://platform.openai.com/fictitious-files/picture-of-orange.png");
+```
+
+Next, create a new assistant with a vision-capable model like `gpt-4o` and a thread with the image information referenced:
+
+```csharp
+Assistant assistant = assistantClient.CreateAssistant(
+    model: "gpt-4o",
+    new AssistantCreationOptions()
+    {
+        Instructions = "When asked a question, attempt to answer very concisely. "
+            + "Prefer one-sentence answers whenever feasible."
+    });
+
+AssistantThread thread = assistantClient.CreateThread(new ThreadCreationOptions()
+{
+    InitialMessages =
+    {
+        new ThreadInitializationMessage(
+            [
+                "Hello, assistant! Please compare these two images for me:",
+                MessageContent.FromImageFileId(pictureOfAppleFile.Id),
+                MessageContent.FromImageUrl(linkToPictureOfOrange),
+            ]),
+    }
+});
+```
+
+With the assistant and thread prepared, use the `CreateRunStreaming` method to get an enumerable `CollectionResult<StreamingUpdate>`. You can then iterate over this collection with `foreach`. For async calling patterns, use `CreateRunStreamingAsync` and iterate over the `AsyncCollectionResult<StreamingUpdate>` with `await foreach` instead. Note that streaming variants also exist for `CreateThreadAndRunStreaming` and `SubmitToolOutputsToRunStreaming`.
+
+```csharp
+CollectionResult<StreamingUpdate> streamingUpdates = assistantClient.CreateRunStreaming(
+    thread,
+    assistant,
+    new RunCreationOptions()
+    {
+        AdditionalInstructions = "When possible, try to sneak in puns if you're asked to compare things.",
+    });
+```
+
+Finally, to handle the `StreamingUpdate` instances as they arrive, you can use the `UpdateKind` property on the base `StreamingUpdate` and/or downcast to a specifically desired update type, like `MessageContentUpdate` for `thread.message.delta` events or `RequiredActionUpdate` for streaming tool calls.
+
+```csharp
+foreach (StreamingUpdate streamingUpdate in streamingUpdates)
+{
+    if (streamingUpdate.UpdateKind == StreamingUpdateReason.RunCreated)
+    {
+        Console.WriteLine("--- Run started! ---");
+    }
+    if (streamingUpdate is MessageContentUpdate contentUpdate)
+    {
+        Console.Write(contentUpdate.Text);
+    }
+}
+```
+
+This will yield streamed output from the run like the following:
+
+```text
+--- Run started! ---
+The first image shows a red apple with a smooth skin and a single leaf, while the second image depicts an orange with a rough, textured skin and a leaf with droplets of water. Comparing them might seem impossible - it's like apples and oranges!
+```
+
+## How to work with Azure OpenAI
+
+For Azure OpenAI scenarios, use the [Azure SDK](https://github.com/Azure/azure-sdk-for-net) and, more specifically, the [Azure OpenAI client library for .NET](https://github.com/Azure/azure-sdk-for-net/blob/main/sdk/openai/Azure.AI.OpenAI/README.md).
+
+The Azure OpenAI client library for .NET is a companion to this library, and all common capabilities between OpenAI and Azure OpenAI share the same scenario clients, methods, and request/response types. It is designed to make Azure-specific scenarios straightforward, with extensions for Azure-specific concepts like Responsible AI content filter results and On Your Data integration.
+
+```csharp
+AzureOpenAIClient azureClient = new(
+    new Uri("https://your-azure-openai-resource.com"),
+    new DefaultAzureCredential());
+ChatClient chatClient = azureClient.GetChatClient("my-gpt-35-turbo-deployment");
+
+ChatCompletion completion = chatClient.CompleteChat(
+    [
+        // System messages represent instructions or other guidance about how the assistant should behave
+        new SystemChatMessage("You are a helpful assistant that talks like a pirate."),
+        // User messages represent user input, whether historical or the most recent input
+        new UserChatMessage("Hi, can you help me?"),
+        // Assistant messages in a request represent conversation history for responses
+        new AssistantChatMessage("Arrr! Of course, me hearty! What can I do for ye?"),
+        new UserChatMessage("What's the best way to train a parrot?"),
+    ]);
+
+Console.WriteLine($"{completion.Role}: {completion.Content[0].Text}");
+```
+
+## Advanced scenarios
+
+### Using protocol methods
+
+In addition to the client methods that use strongly-typed request and response objects, the .NET library also provides _protocol methods_ that enable more direct access to the REST API. Protocol methods are "binary in, binary out," accepting `BinaryContent` as request bodies and providing `BinaryData` as response bodies.
+
+For example, to use the protocol method variant of the `ChatClient`'s `CompleteChat` method, pass the request body as `BinaryContent`:
+
+```csharp
+ChatClient client = new("gpt-4o", Environment.GetEnvironmentVariable("OPENAI_API_KEY"));
+
+BinaryData input = BinaryData.FromBytes("""
+{
+    "model": "gpt-4o",
+    "messages": [
+       {
+           "role": "user",
+           "content": "How does AI work? Explain it in simple terms."
+ } + ] +} +"""u8.ToArray()); + +using BinaryContent content = BinaryContent.Create(input); +ClientResult result = client.CompleteChat(content); +BinaryData output = result.GetRawResponse().Content; + +using JsonDocument outputAsJson = JsonDocument.Parse(output); +string message = outputAsJson.RootElement + .GetProperty("choices"u8)[0] + .GetProperty("message"u8) + .GetProperty("content"u8) + .GetString(); +``` + +Notice how you can then call the resulting `ClientResult`'s `GetRawResponse` method and retrieve the response body as `BinaryData` via the `PipelineResponse`'s `Content` property. + +### Automatically retrying errors + +By default, the client classes will automatically retry the following errors up to three additional times using exponential backoff: + +- 408 Request Timeout +- 429 Too Many Requests +- 500 Internal Server Error +- 502 Bad Gateway +- 503 Service Unavailable +- 504 Gateway Timeout + +### Observability + +OpenAI .NET library supports experimental distributed tracing and metrics with OpenTelemetry. Check out [Observability with OpenTelemetry](./docs/observability.md) for more details. 
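If the default retry behavior described above does not fit your scenario, it can generally be tuned through the client options, since `OpenAIClientOptions` derives from `System.ClientModel`'s `ClientPipelineOptions`. The following is a sketch, not a definitive recipe: the `ClientRetryPolicy` type and its `maxRetries` parameter come from the `System.ClientModel` package and should be verified against the version you have installed.

```csharp
using System;
using System.ClientModel;
using System.ClientModel.Primitives;
using OpenAI;
using OpenAI.Chat;

// Raise the maximum retry count from the default of 3 to 5.
// ClientRetryPolicy is a System.ClientModel type; check your installed version.
OpenAIClientOptions options = new()
{
    RetryPolicy = new ClientRetryPolicy(maxRetries: 5),
};

ChatClient client = new(
    "gpt-4o",
    new ApiKeyCredential(Environment.GetEnvironmentVariable("OPENAI_API_KEY")),
    options);
```

The same options object can also carry the `Endpoint`, `OrganizationId`, and `ProjectId` settings shown in the API surface earlier.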
diff --git a/.dotnet/api/OpenAI.netstandard2.0.cs b/.dotnet/api/OpenAI.netstandard2.0.cs new file mode 100644 index 000000000..9db987645 --- /dev/null +++ b/.dotnet/api/OpenAI.netstandard2.0.cs @@ -0,0 +1,2376 @@ +namespace OpenAI { + public readonly partial struct ListOrder : IEquatable { + private readonly object _dummy; + private readonly int _dummyPrimitive; + public ListOrder(string value); + public static ListOrder NewestFirst { get; } + public static ListOrder OldestFirst { get; } + public readonly bool Equals(ListOrder other); + [EditorBrowsable(EditorBrowsableState.Never)] + public override readonly bool Equals(object obj); + [EditorBrowsable(EditorBrowsableState.Never)] + public override readonly int GetHashCode(); + public static bool operator ==(ListOrder left, ListOrder right); + public static implicit operator ListOrder(string value); + public static bool operator !=(ListOrder left, ListOrder right); + public override readonly string ToString(); + } + public class OpenAIClient { + protected OpenAIClient(); + public OpenAIClient(ApiKeyCredential credential, OpenAIClientOptions options); + public OpenAIClient(ApiKeyCredential credential); + protected internal OpenAIClient(ClientPipeline pipeline, OpenAIClientOptions options); + public virtual ClientPipeline Pipeline { get; } + public virtual AssistantClient GetAssistantClient(); + public virtual AudioClient GetAudioClient(string model); + public virtual BatchClient GetBatchClient(); + public virtual ChatClient GetChatClient(string model); + public virtual EmbeddingClient GetEmbeddingClient(string model); + public virtual FileClient GetFileClient(); + public virtual FineTuningClient GetFineTuningClient(); + public virtual ImageClient GetImageClient(string model); + public virtual ModelClient GetModelClient(); + public virtual ModerationClient GetModerationClient(string model); + public virtual VectorStoreClient GetVectorStoreClient(); + } + public class OpenAIClientOptions : ClientPipelineOptions { + 
public string ApplicationId { get; set; } + public Uri Endpoint { get; set; } + public string OrganizationId { get; set; } + public string ProjectId { get; set; } + } +} +namespace OpenAI.Assistants { + public class Assistant : IJsonModel, IPersistableModel { + public DateTimeOffset CreatedAt { get; } + public string Description { get; } + public string Id { get; } + public string Instructions { get; } + public IReadOnlyDictionary Metadata { get; } + public string Model { get; } + public string Name { get; } + public float? NucleusSamplingFactor { get; } + public AssistantResponseFormat ResponseFormat { get; } + public float? Temperature { get; } + public ToolResources ToolResources { get; } + public IReadOnlyList Tools { get; } + Assistant IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options); + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options); + Assistant IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options); + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options); + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options); + } + public class AssistantClient { + protected AssistantClient(); + public AssistantClient(ApiKeyCredential credential, OpenAIClientOptions options); + public AssistantClient(ApiKeyCredential credential); + protected internal AssistantClient(ClientPipeline pipeline, OpenAIClientOptions options); + public virtual ClientPipeline Pipeline { get; } + public virtual ClientResult CancelRun(ThreadRun run); + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual ClientResult CancelRun(string threadId, string runId, RequestOptions options); + public virtual ClientResult CancelRun(string threadId, string runId, CancellationToken cancellationToken = default); + public virtual Task> CancelRunAsync(ThreadRun run); + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual Task CancelRunAsync(string threadId, string runId, 
RequestOptions options); + public virtual Task> CancelRunAsync(string threadId, string runId, CancellationToken cancellationToken = default); + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual ClientResult CreateAssistant(BinaryContent content, RequestOptions options = null); + public virtual ClientResult CreateAssistant(string model, AssistantCreationOptions options = null, CancellationToken cancellationToken = default); + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual Task CreateAssistantAsync(BinaryContent content, RequestOptions options = null); + public virtual Task> CreateAssistantAsync(string model, AssistantCreationOptions options = null, CancellationToken cancellationToken = default); + public virtual ClientResult CreateMessage(AssistantThread thread, MessageRole role, IEnumerable content, MessageCreationOptions options = null); + public virtual ClientResult CreateMessage(string threadId, MessageRole role, IEnumerable content, MessageCreationOptions options = null, CancellationToken cancellationToken = default); + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual ClientResult CreateMessage(string threadId, BinaryContent content, RequestOptions options = null); + public virtual Task> CreateMessageAsync(AssistantThread thread, MessageRole role, IEnumerable content, MessageCreationOptions options = null); + public virtual Task> CreateMessageAsync(string threadId, MessageRole role, IEnumerable content, MessageCreationOptions options = null, CancellationToken cancellationToken = default); + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual Task CreateMessageAsync(string threadId, BinaryContent content, RequestOptions options = null); + public virtual ClientResult CreateRun(AssistantThread thread, Assistant assistant, RunCreationOptions options = null); + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual ClientResult CreateRun(string threadId, BinaryContent content, RequestOptions options = 
null);
+ public virtual ClientResult<ThreadRun> CreateRun(string threadId, string assistantId, RunCreationOptions options = null, CancellationToken cancellationToken = default);
+ public virtual Task<ClientResult<ThreadRun>> CreateRunAsync(AssistantThread thread, Assistant assistant, RunCreationOptions options = null);
+ [EditorBrowsable(EditorBrowsableState.Never)]
+ public virtual Task<ClientResult> CreateRunAsync(string threadId, BinaryContent content, RequestOptions options = null);
+ public virtual Task<ClientResult<ThreadRun>> CreateRunAsync(string threadId, string assistantId, RunCreationOptions options = null, CancellationToken cancellationToken = default);
+ public virtual CollectionResult<StreamingUpdate> CreateRunStreaming(AssistantThread thread, Assistant assistant, RunCreationOptions options = null);
+ public virtual CollectionResult<StreamingUpdate> CreateRunStreaming(string threadId, string assistantId, RunCreationOptions options = null, CancellationToken cancellationToken = default);
+ public virtual AsyncCollectionResult<StreamingUpdate> CreateRunStreamingAsync(AssistantThread thread, Assistant assistant, RunCreationOptions options = null);
+ public virtual AsyncCollectionResult<StreamingUpdate> CreateRunStreamingAsync(string threadId, string assistantId, RunCreationOptions options = null, CancellationToken cancellationToken = default);
+ public virtual ClientResult<AssistantThread> CreateThread(ThreadCreationOptions options = null, CancellationToken cancellationToken = default);
+ [EditorBrowsable(EditorBrowsableState.Never)]
+ public virtual ClientResult CreateThread(BinaryContent content, RequestOptions options = null);
+ public virtual ClientResult<ThreadRun> CreateThreadAndRun(Assistant assistant, ThreadCreationOptions threadOptions = null, RunCreationOptions runOptions = null);
+ [EditorBrowsable(EditorBrowsableState.Never)]
+ public virtual ClientResult CreateThreadAndRun(BinaryContent content, RequestOptions options = null);
+ public virtual ClientResult<ThreadRun> CreateThreadAndRun(string assistantId, ThreadCreationOptions threadOptions = null, RunCreationOptions runOptions = null, CancellationToken cancellationToken = default);
+ public virtual Task<ClientResult<ThreadRun>> CreateThreadAndRunAsync(Assistant assistant, ThreadCreationOptions threadOptions = null, RunCreationOptions runOptions = null);
+ [EditorBrowsable(EditorBrowsableState.Never)]
+ public virtual Task<ClientResult> CreateThreadAndRunAsync(BinaryContent content, RequestOptions options = null);
+ public virtual Task<ClientResult<ThreadRun>> CreateThreadAndRunAsync(string assistantId, ThreadCreationOptions threadOptions = null, RunCreationOptions runOptions = null, CancellationToken cancellationToken = default);
+ public virtual CollectionResult<StreamingUpdate> CreateThreadAndRunStreaming(Assistant assistant, ThreadCreationOptions threadOptions = null, RunCreationOptions runOptions = null);
+ public virtual CollectionResult<StreamingUpdate> CreateThreadAndRunStreaming(string assistantId, ThreadCreationOptions threadOptions = null, RunCreationOptions runOptions = null, CancellationToken cancellationToken = default);
+ public virtual AsyncCollectionResult<StreamingUpdate> CreateThreadAndRunStreamingAsync(Assistant assistant, ThreadCreationOptions threadOptions = null, RunCreationOptions runOptions = null);
+ public virtual AsyncCollectionResult<StreamingUpdate> CreateThreadAndRunStreamingAsync(string assistantId, ThreadCreationOptions threadOptions = null, RunCreationOptions runOptions = null, CancellationToken cancellationToken = default);
+ public virtual Task<ClientResult<AssistantThread>> CreateThreadAsync(ThreadCreationOptions options = null, CancellationToken cancellationToken = default);
+ [EditorBrowsable(EditorBrowsableState.Never)]
+ public virtual Task<ClientResult> CreateThreadAsync(BinaryContent content, RequestOptions options = null);
+ public virtual ClientResult<bool> DeleteAssistant(Assistant assistant);
+ [EditorBrowsable(EditorBrowsableState.Never)]
+ public virtual ClientResult DeleteAssistant(string assistantId, RequestOptions options);
+ public virtual ClientResult<bool> DeleteAssistant(string assistantId, CancellationToken cancellationToken = default);
+ public virtual Task<ClientResult<bool>> DeleteAssistantAsync(Assistant assistant);
+ [EditorBrowsable(EditorBrowsableState.Never)]
+ public virtual Task<ClientResult> DeleteAssistantAsync(string assistantId, RequestOptions options);
+ public virtual Task<ClientResult<bool>> DeleteAssistantAsync(string assistantId, CancellationToken cancellationToken = default);
+ public virtual ClientResult<bool> DeleteMessage(ThreadMessage message);
+ [EditorBrowsable(EditorBrowsableState.Never)]
+ public virtual ClientResult DeleteMessage(string threadId, string messageId, RequestOptions options);
+ public virtual ClientResult<bool> DeleteMessage(string threadId, string messageId, CancellationToken cancellationToken = default);
+ public virtual Task<ClientResult<bool>> DeleteMessageAsync(ThreadMessage message);
+ [EditorBrowsable(EditorBrowsableState.Never)]
+ public virtual Task<ClientResult> DeleteMessageAsync(string threadId, string messageId, RequestOptions options);
+ public virtual Task<ClientResult<bool>> DeleteMessageAsync(string threadId, string messageId, CancellationToken cancellationToken = default);
+ public virtual ClientResult<bool> DeleteThread(AssistantThread thread);
+ [EditorBrowsable(EditorBrowsableState.Never)]
+ public virtual ClientResult DeleteThread(string threadId, RequestOptions options);
+ public virtual ClientResult<bool> DeleteThread(string threadId, CancellationToken cancellationToken = default);
+ public virtual Task<ClientResult<bool>> DeleteThreadAsync(AssistantThread thread);
+ [EditorBrowsable(EditorBrowsableState.Never)]
+ public virtual Task<ClientResult> DeleteThreadAsync(string threadId, RequestOptions options);
+ public virtual Task<ClientResult<bool>> DeleteThreadAsync(string threadId, CancellationToken cancellationToken = default);
+ [EditorBrowsable(EditorBrowsableState.Never)]
+ public virtual ClientResult GetAssistant(string assistantId, RequestOptions options);
+ public virtual ClientResult<Assistant> GetAssistant(string assistantId, CancellationToken cancellationToken = default);
+ [EditorBrowsable(EditorBrowsableState.Never)]
+ public virtual Task<ClientResult> GetAssistantAsync(string assistantId, RequestOptions options);
+ public virtual Task<ClientResult<Assistant>> GetAssistantAsync(string assistantId, CancellationToken cancellationToken = default);
+ public virtual PageCollection<Assistant> GetAssistants(AssistantCollectionOptions options = null, CancellationToken cancellationToken = default);
+ public virtual PageCollection<Assistant> GetAssistants(ContinuationToken firstPageToken, CancellationToken cancellationToken = default);
+ [EditorBrowsable(EditorBrowsableState.Never)]
+ public virtual IEnumerable<ClientResult> GetAssistants(int? limit, string order, string after, string before, RequestOptions options);
+ public virtual AsyncPageCollection<Assistant> GetAssistantsAsync(AssistantCollectionOptions options = null, CancellationToken cancellationToken = default);
+ public virtual AsyncPageCollection<Assistant> GetAssistantsAsync(ContinuationToken firstPageToken, CancellationToken cancellationToken = default);
+ [EditorBrowsable(EditorBrowsableState.Never)]
+ public virtual IAsyncEnumerable<ClientResult> GetAssistantsAsync(int? limit, string order, string after, string before, RequestOptions options);
+ public virtual ClientResult<ThreadMessage> GetMessage(ThreadMessage message);
+ [EditorBrowsable(EditorBrowsableState.Never)]
+ public virtual ClientResult GetMessage(string threadId, string messageId, RequestOptions options);
+ public virtual ClientResult<ThreadMessage> GetMessage(string threadId, string messageId, CancellationToken cancellationToken = default);
+ public virtual Task<ClientResult<ThreadMessage>> GetMessageAsync(ThreadMessage message);
+ [EditorBrowsable(EditorBrowsableState.Never)]
+ public virtual Task<ClientResult> GetMessageAsync(string threadId, string messageId, RequestOptions options);
+ public virtual Task<ClientResult<ThreadMessage>> GetMessageAsync(string threadId, string messageId, CancellationToken cancellationToken = default);
+ public virtual PageCollection<ThreadMessage> GetMessages(AssistantThread thread, MessageCollectionOptions options = null);
+ public virtual PageCollection<ThreadMessage> GetMessages(ContinuationToken firstPageToken, CancellationToken cancellationToken = default);
+ public virtual PageCollection<ThreadMessage> GetMessages(string threadId, MessageCollectionOptions options = null, CancellationToken cancellationToken = default);
+ [EditorBrowsable(EditorBrowsableState.Never)]
+ public virtual IEnumerable<ClientResult> GetMessages(string threadId, int? limit, string order, string after, string before, RequestOptions options);
+ public virtual AsyncPageCollection<ThreadMessage> GetMessagesAsync(AssistantThread thread, MessageCollectionOptions options = null);
+ public virtual AsyncPageCollection<ThreadMessage> GetMessagesAsync(ContinuationToken firstPageToken, CancellationToken cancellationToken = default);
+ public virtual AsyncPageCollection<ThreadMessage> GetMessagesAsync(string threadId, MessageCollectionOptions options = null, CancellationToken cancellationToken = default);
+ [EditorBrowsable(EditorBrowsableState.Never)]
+ public virtual IAsyncEnumerable<ClientResult> GetMessagesAsync(string threadId, int? limit, string order, string after, string before, RequestOptions options);
+ public virtual ClientResult<ThreadRun> GetRun(ThreadRun run);
+ [EditorBrowsable(EditorBrowsableState.Never)]
+ public virtual ClientResult GetRun(string threadId, string runId, RequestOptions options);
+ public virtual ClientResult<ThreadRun> GetRun(string threadId, string runId, CancellationToken cancellationToken = default);
+ public virtual Task<ClientResult<ThreadRun>> GetRunAsync(ThreadRun run);
+ [EditorBrowsable(EditorBrowsableState.Never)]
+ public virtual Task<ClientResult> GetRunAsync(string threadId, string runId, RequestOptions options);
+ public virtual Task<ClientResult<ThreadRun>> GetRunAsync(string threadId, string runId, CancellationToken cancellationToken = default);
+ public virtual PageCollection<ThreadRun> GetRuns(AssistantThread thread, RunCollectionOptions options = null);
+ public virtual PageCollection<ThreadRun> GetRuns(ContinuationToken firstPageToken, CancellationToken cancellationToken = default);
+ public virtual PageCollection<ThreadRun> GetRuns(string threadId, RunCollectionOptions options = null, CancellationToken cancellationToken = default);
+ [EditorBrowsable(EditorBrowsableState.Never)]
+ public virtual IEnumerable<ClientResult> GetRuns(string threadId, int? limit, string order, string after, string before, RequestOptions options);
+ public virtual AsyncPageCollection<ThreadRun> GetRunsAsync(AssistantThread thread, RunCollectionOptions options = null);
+ public virtual AsyncPageCollection<ThreadRun> GetRunsAsync(ContinuationToken firstPageToken, CancellationToken cancellationToken = default);
+ public virtual AsyncPageCollection<ThreadRun> GetRunsAsync(string threadId, RunCollectionOptions options = null, CancellationToken cancellationToken = default);
+ [EditorBrowsable(EditorBrowsableState.Never)]
+ public virtual IAsyncEnumerable<ClientResult> GetRunsAsync(string threadId, int? limit, string order, string after, string before, RequestOptions options);
+ [EditorBrowsable(EditorBrowsableState.Never)]
+ public virtual ClientResult GetRunStep(string threadId, string runId, string stepId, RequestOptions options);
+ public virtual ClientResult<RunStep> GetRunStep(string threadId, string runId, string stepId, CancellationToken cancellationToken = default);
+ [EditorBrowsable(EditorBrowsableState.Never)]
+ public virtual Task<ClientResult> GetRunStepAsync(string threadId, string runId, string stepId, RequestOptions options);
+ public virtual Task<ClientResult<RunStep>> GetRunStepAsync(string threadId, string runId, string stepId, CancellationToken cancellationToken = default);
+ public virtual PageCollection<RunStep> GetRunSteps(ThreadRun run, RunStepCollectionOptions options = null);
+ public virtual PageCollection<RunStep> GetRunSteps(ContinuationToken firstPageToken, CancellationToken cancellationToken = default);
+ public virtual PageCollection<RunStep> GetRunSteps(string threadId, string runId, RunStepCollectionOptions options = null, CancellationToken cancellationToken = default);
+ [EditorBrowsable(EditorBrowsableState.Never)]
+ public virtual IEnumerable<ClientResult> GetRunSteps(string threadId, string runId, int? limit, string order, string after, string before, RequestOptions options);
+ public virtual AsyncPageCollection<RunStep> GetRunStepsAsync(ThreadRun run, RunStepCollectionOptions options = null);
+ public virtual AsyncPageCollection<RunStep> GetRunStepsAsync(ContinuationToken firstPageToken, CancellationToken cancellationToken = default);
+ public virtual AsyncPageCollection<RunStep> GetRunStepsAsync(string threadId, string runId, RunStepCollectionOptions options = null, CancellationToken cancellationToken = default);
+ [EditorBrowsable(EditorBrowsableState.Never)]
+ public virtual IAsyncEnumerable<ClientResult> GetRunStepsAsync(string threadId, string runId, int? limit, string order, string after, string before, RequestOptions options);
+ public virtual ClientResult<AssistantThread> GetThread(AssistantThread thread);
+ [EditorBrowsable(EditorBrowsableState.Never)]
+ public virtual ClientResult GetThread(string threadId, RequestOptions options);
+ public virtual ClientResult<AssistantThread> GetThread(string threadId, CancellationToken cancellationToken = default);
+ public virtual Task<ClientResult<AssistantThread>> GetThreadAsync(AssistantThread thread);
+ [EditorBrowsable(EditorBrowsableState.Never)]
+ public virtual Task<ClientResult> GetThreadAsync(string threadId, RequestOptions options);
+ public virtual Task<ClientResult<AssistantThread>> GetThreadAsync(string threadId, CancellationToken cancellationToken = default);
+ public virtual ClientResult<Assistant> ModifyAssistant(Assistant assistant, AssistantModificationOptions options);
+ public virtual ClientResult<Assistant> ModifyAssistant(string assistantId, AssistantModificationOptions options, CancellationToken cancellationToken = default);
+ [EditorBrowsable(EditorBrowsableState.Never)]
+ public virtual ClientResult ModifyAssistant(string assistantId, BinaryContent content, RequestOptions options = null);
+ public virtual Task<ClientResult<Assistant>> ModifyAssistantAsync(Assistant assistant, AssistantModificationOptions options);
+ public virtual Task<ClientResult<Assistant>> ModifyAssistantAsync(string assistantId, AssistantModificationOptions options, CancellationToken cancellationToken = default);
+ [EditorBrowsable(EditorBrowsableState.Never)]
+ public virtual Task<ClientResult> ModifyAssistantAsync(string assistantId, BinaryContent content, RequestOptions options = null);
+ public virtual ClientResult<ThreadMessage> ModifyMessage(ThreadMessage message, MessageModificationOptions options);
+ public virtual ClientResult<ThreadMessage> ModifyMessage(string threadId, string messageId, MessageModificationOptions options, CancellationToken cancellationToken = default);
+ [EditorBrowsable(EditorBrowsableState.Never)]
+ public virtual ClientResult ModifyMessage(string threadId, string messageId, BinaryContent content, RequestOptions options = null);
+ public virtual Task<ClientResult<ThreadMessage>> ModifyMessageAsync(ThreadMessage message, MessageModificationOptions options);
+ public virtual Task<ClientResult<ThreadMessage>> ModifyMessageAsync(string threadId, string messageId, MessageModificationOptions options, CancellationToken cancellationToken = default);
+ [EditorBrowsable(EditorBrowsableState.Never)]
+ public virtual Task<ClientResult> ModifyMessageAsync(string threadId, string messageId, BinaryContent content, RequestOptions options = null);
+ [EditorBrowsable(EditorBrowsableState.Never)]
+ public virtual ClientResult ModifyRun(string threadId, string runId, BinaryContent content, RequestOptions options = null);
+ [EditorBrowsable(EditorBrowsableState.Never)]
+ public virtual Task<ClientResult> ModifyRunAsync(string threadId, string runId, BinaryContent content, RequestOptions options = null);
+ public virtual ClientResult<AssistantThread> ModifyThread(AssistantThread thread, ThreadModificationOptions options);
+ public virtual ClientResult<AssistantThread> ModifyThread(string threadId, ThreadModificationOptions options, CancellationToken cancellationToken = default);
+ [EditorBrowsable(EditorBrowsableState.Never)]
+ public virtual ClientResult ModifyThread(string threadId, BinaryContent content, RequestOptions options = null);
+ public virtual Task<ClientResult<AssistantThread>> ModifyThreadAsync(AssistantThread thread, ThreadModificationOptions options);
+ public virtual Task<ClientResult<AssistantThread>> ModifyThreadAsync(string threadId, ThreadModificationOptions options, CancellationToken cancellationToken = default);
+ [EditorBrowsable(EditorBrowsableState.Never)]
+ public virtual Task<ClientResult> ModifyThreadAsync(string threadId, BinaryContent content, RequestOptions options = null);
+ public virtual ClientResult<ThreadRun> SubmitToolOutputsToRun(ThreadRun run, IEnumerable<ToolOutput> toolOutputs);
+ [EditorBrowsable(EditorBrowsableState.Never)]
+ public virtual ClientResult SubmitToolOutputsToRun(string threadId, string runId, BinaryContent content, RequestOptions options = null);
+ public virtual ClientResult<ThreadRun> SubmitToolOutputsToRun(string threadId, string runId, IEnumerable<ToolOutput> toolOutputs, CancellationToken cancellationToken = default);
+ public virtual Task<ClientResult<ThreadRun>> SubmitToolOutputsToRunAsync(ThreadRun run, IEnumerable<ToolOutput> toolOutputs);
+ [EditorBrowsable(EditorBrowsableState.Never)]
+ public virtual Task<ClientResult> SubmitToolOutputsToRunAsync(string threadId, string runId, BinaryContent content, RequestOptions options = null);
+ public virtual Task<ClientResult<ThreadRun>> SubmitToolOutputsToRunAsync(string threadId, string runId, IEnumerable<ToolOutput> toolOutputs, CancellationToken cancellationToken = default);
+ public virtual CollectionResult<StreamingUpdate> SubmitToolOutputsToRunStreaming(ThreadRun run, IEnumerable<ToolOutput> toolOutputs);
+ public virtual CollectionResult<StreamingUpdate> SubmitToolOutputsToRunStreaming(string threadId, string runId, IEnumerable<ToolOutput> toolOutputs, CancellationToken cancellationToken = default);
+ public virtual AsyncCollectionResult<StreamingUpdate> SubmitToolOutputsToRunStreamingAsync(ThreadRun run, IEnumerable<ToolOutput> toolOutputs);
+ public virtual AsyncCollectionResult<StreamingUpdate> SubmitToolOutputsToRunStreamingAsync(string threadId, string runId, IEnumerable<ToolOutput> toolOutputs, CancellationToken cancellationToken = default);
+ }
+ public class AssistantCollectionOptions {
+ public string AfterId { get; set; }
+ public string BeforeId { get; set; }
+ public ListOrder? Order { get; set; }
+ public int? PageSize { get; set; }
+ }
+ public class AssistantCreationOptions : IJsonModel<AssistantCreationOptions>, IPersistableModel<AssistantCreationOptions> {
+ public string Description { get; set; }
+ public string Instructions { get; set; }
+ public IDictionary<string, string> Metadata { get; set; }
+ public string Name { get; set; }
+ public float? NucleusSamplingFactor { get; set; }
+ public AssistantResponseFormat ResponseFormat { get; set; }
+ public float? Temperature { get; set; }
+ public ToolResources ToolResources { get; set; }
+ public IList<ToolDefinition> Tools { get; }
+ AssistantCreationOptions IJsonModel<AssistantCreationOptions>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options);
+ void IJsonModel<AssistantCreationOptions>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+ AssistantCreationOptions IPersistableModel<AssistantCreationOptions>.Create(BinaryData data, ModelReaderWriterOptions options);
+ string IPersistableModel<AssistantCreationOptions>.GetFormatFromOptions(ModelReaderWriterOptions options);
+ BinaryData IPersistableModel<AssistantCreationOptions>.Write(ModelReaderWriterOptions options);
+ }
+ public class AssistantModificationOptions : IJsonModel<AssistantModificationOptions>, IPersistableModel<AssistantModificationOptions> {
+ public IList<ToolDefinition> DefaultTools { get; set; }
+ public string Description { get; set; }
+ public string Instructions { get; set; }
+ public IDictionary<string, string> Metadata { get; set; }
+ public string Model { get; set; }
+ public string Name { get; set; }
+ public float? NucleusSamplingFactor { get; set; }
+ public AssistantResponseFormat ResponseFormat { get; set; }
+ public float? Temperature { get; set; }
+ public ToolResources ToolResources { get; set; }
+ AssistantModificationOptions IJsonModel<AssistantModificationOptions>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options);
+ void IJsonModel<AssistantModificationOptions>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+ AssistantModificationOptions IPersistableModel<AssistantModificationOptions>.Create(BinaryData data, ModelReaderWriterOptions options);
+ string IPersistableModel<AssistantModificationOptions>.GetFormatFromOptions(ModelReaderWriterOptions options);
+ BinaryData IPersistableModel<AssistantModificationOptions>.Write(ModelReaderWriterOptions options);
+ }
+ public abstract class AssistantResponseFormat : IEquatable<AssistantResponseFormat>, IEquatable<string>, IJsonModel<AssistantResponseFormat>, IPersistableModel<AssistantResponseFormat> {
+ public static AssistantResponseFormat Auto { get; }
+ public static AssistantResponseFormat JsonObject { get; }
+ public static AssistantResponseFormat Text { get; }
+ public static AssistantResponseFormat CreateAutoFormat();
+ public static AssistantResponseFormat CreateJsonObjectFormat();
+ public static AssistantResponseFormat CreateJsonSchemaFormat(string name, BinaryData jsonSchema, string description = null, bool? strictSchemaEnabled = null);
+ public static AssistantResponseFormat CreateTextFormat();
+ [EditorBrowsable(EditorBrowsableState.Never)]
+ public override bool Equals(object obj);
+ [EditorBrowsable(EditorBrowsableState.Never)]
+ public override int GetHashCode();
+ [EditorBrowsable(EditorBrowsableState.Never)]
+ public static bool operator ==(AssistantResponseFormat first, AssistantResponseFormat second);
+ [EditorBrowsable(EditorBrowsableState.Never)]
+ public static implicit operator AssistantResponseFormat(string plainTextFormat);
+ [EditorBrowsable(EditorBrowsableState.Never)]
+ public static bool operator !=(AssistantResponseFormat first, AssistantResponseFormat second);
+ AssistantResponseFormat IJsonModel<AssistantResponseFormat>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options);
+ void IJsonModel<AssistantResponseFormat>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+ AssistantResponseFormat IPersistableModel<AssistantResponseFormat>.Create(BinaryData data, ModelReaderWriterOptions options);
+ string IPersistableModel<AssistantResponseFormat>.GetFormatFromOptions(ModelReaderWriterOptions options);
+ BinaryData IPersistableModel<AssistantResponseFormat>.Write(ModelReaderWriterOptions options);
+ [EditorBrowsable(EditorBrowsableState.Never)]
+ bool IEquatable<AssistantResponseFormat>.Equals(AssistantResponseFormat other);
+ [EditorBrowsable(EditorBrowsableState.Never)]
+ bool IEquatable<string>.Equals(string other);
+ public override string ToString();
+ }
+ public class AssistantThread : IJsonModel<AssistantThread>, IPersistableModel<AssistantThread> {
+ public DateTimeOffset CreatedAt { get; }
+ public string Id { get; }
+ public IReadOnlyDictionary<string, string> Metadata { get; }
+ public ToolResources ToolResources { get; }
+ AssistantThread IJsonModel<AssistantThread>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options);
+ void IJsonModel<AssistantThread>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+ AssistantThread IPersistableModel<AssistantThread>.Create(BinaryData data, ModelReaderWriterOptions options);
+ string IPersistableModel<AssistantThread>.GetFormatFromOptions(ModelReaderWriterOptions options);
+ BinaryData IPersistableModel<AssistantThread>.Write(ModelReaderWriterOptions options);
+ }
+ public class CodeInterpreterToolDefinition : ToolDefinition, IJsonModel<CodeInterpreterToolDefinition>, IPersistableModel<CodeInterpreterToolDefinition> {
+ CodeInterpreterToolDefinition IJsonModel<CodeInterpreterToolDefinition>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options);
+ void IJsonModel<CodeInterpreterToolDefinition>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+ CodeInterpreterToolDefinition IPersistableModel<CodeInterpreterToolDefinition>.Create(BinaryData data, ModelReaderWriterOptions options);
+ string IPersistableModel<CodeInterpreterToolDefinition>.GetFormatFromOptions(ModelReaderWriterOptions options);
+ BinaryData IPersistableModel<CodeInterpreterToolDefinition>.Write(ModelReaderWriterOptions options);
+ protected override void WriteCore(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+ }
+ public class CodeInterpreterToolResources : IJsonModel<CodeInterpreterToolResources>, IPersistableModel<CodeInterpreterToolResources> {
+ public IList<string> FileIds { get; set; }
+ CodeInterpreterToolResources IJsonModel<CodeInterpreterToolResources>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options);
+ void IJsonModel<CodeInterpreterToolResources>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+ CodeInterpreterToolResources IPersistableModel<CodeInterpreterToolResources>.Create(BinaryData data, ModelReaderWriterOptions options);
+ string IPersistableModel<CodeInterpreterToolResources>.GetFormatFromOptions(ModelReaderWriterOptions options);
+ BinaryData IPersistableModel<CodeInterpreterToolResources>.Write(ModelReaderWriterOptions options);
+ }
+ public class FileSearchToolDefinition : ToolDefinition, IJsonModel<FileSearchToolDefinition>, IPersistableModel<FileSearchToolDefinition> {
+ public int? MaxResults { get; set; }
+ FileSearchToolDefinition IJsonModel<FileSearchToolDefinition>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options);
+ void IJsonModel<FileSearchToolDefinition>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+ FileSearchToolDefinition IPersistableModel<FileSearchToolDefinition>.Create(BinaryData data, ModelReaderWriterOptions options);
+ string IPersistableModel<FileSearchToolDefinition>.GetFormatFromOptions(ModelReaderWriterOptions options);
+ BinaryData IPersistableModel<FileSearchToolDefinition>.Write(ModelReaderWriterOptions options);
+ protected override void WriteCore(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+ }
+ public class FileSearchToolResources : IJsonModel<FileSearchToolResources>, IPersistableModel<FileSearchToolResources> {
+ public IList<VectorStoreCreationHelper> NewVectorStores { get; }
+ public IList<string> VectorStoreIds { get; set; }
+ FileSearchToolResources IJsonModel<FileSearchToolResources>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options);
+ void IJsonModel<FileSearchToolResources>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+ FileSearchToolResources IPersistableModel<FileSearchToolResources>.Create(BinaryData data, ModelReaderWriterOptions options);
+ string IPersistableModel<FileSearchToolResources>.GetFormatFromOptions(ModelReaderWriterOptions options);
+ BinaryData IPersistableModel<FileSearchToolResources>.Write(ModelReaderWriterOptions options);
+ }
+ public class FunctionToolDefinition : ToolDefinition, IJsonModel<FunctionToolDefinition>, IPersistableModel<FunctionToolDefinition> {
+ public FunctionToolDefinition();
+ public FunctionToolDefinition(string name);
+ public string Description { get; set; }
+ public required string FunctionName { get; set; }
+ public BinaryData Parameters { get; set; }
+ public bool? StrictParameterSchemaEnabled { get; set; }
+ FunctionToolDefinition IJsonModel<FunctionToolDefinition>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options);
+ void IJsonModel<FunctionToolDefinition>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+ FunctionToolDefinition IPersistableModel<FunctionToolDefinition>.Create(BinaryData data, ModelReaderWriterOptions options);
+ string IPersistableModel<FunctionToolDefinition>.GetFormatFromOptions(ModelReaderWriterOptions options);
+ BinaryData IPersistableModel<FunctionToolDefinition>.Write(ModelReaderWriterOptions options);
+ protected override void WriteCore(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+ }
+ public class MessageCollectionOptions {
+ public string AfterId { get; set; }
+ public string BeforeId { get; set; }
+ public ListOrder? Order { get; set; }
+ public int? PageSize { get; set; }
+ }
+ public abstract class MessageContent : IJsonModel<MessageContent>, IPersistableModel<MessageContent> {
+ public MessageImageDetail? ImageDetail { get; }
+ public string ImageFileId { get; }
+ public Uri ImageUrl { get; }
+ public string Refusal { get; }
+ public string Text { get; }
+ public IReadOnlyList<TextAnnotation> TextAnnotations { get; }
+ public static MessageContent FromImageFileId(string imageFileId, MessageImageDetail? detail = null);
+ public static MessageContent FromImageUrl(Uri imageUri, MessageImageDetail? detail = null);
+ public static MessageContent FromText(string text);
+ public static implicit operator MessageContent(string value);
+ MessageContent IJsonModel<MessageContent>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options);
+ void IJsonModel<MessageContent>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+ MessageContent IPersistableModel<MessageContent>.Create(BinaryData data, ModelReaderWriterOptions options);
+ string IPersistableModel<MessageContent>.GetFormatFromOptions(ModelReaderWriterOptions options);
+ BinaryData IPersistableModel<MessageContent>.Write(ModelReaderWriterOptions options);
+ protected abstract void WriteCore(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+ }
+ public class MessageContentUpdate : StreamingUpdate {
+ public MessageImageDetail? ImageDetail { get; }
+ public string ImageFileId { get; }
+ public string MessageId { get; }
+ public int MessageIndex { get; }
+ public string RefusalUpdate { get; }
+ public MessageRole? Role { get; }
+ public string Text { get; }
+ public TextAnnotationUpdate TextAnnotation { get; }
+ }
+ public class MessageCreationAttachment : IJsonModel<MessageCreationAttachment>, IPersistableModel<MessageCreationAttachment> {
+ public MessageCreationAttachment(string fileId, IEnumerable<ToolDefinition> tools);
+ public string FileId { get; }
+ public IReadOnlyList<ToolDefinition> Tools { get; }
+ MessageCreationAttachment IJsonModel<MessageCreationAttachment>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options);
+ void IJsonModel<MessageCreationAttachment>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+ MessageCreationAttachment IPersistableModel<MessageCreationAttachment>.Create(BinaryData data, ModelReaderWriterOptions options);
+ string IPersistableModel<MessageCreationAttachment>.GetFormatFromOptions(ModelReaderWriterOptions options);
+ BinaryData IPersistableModel<MessageCreationAttachment>.Write(ModelReaderWriterOptions options);
+ }
+ public class MessageCreationOptions : IJsonModel<MessageCreationOptions>, IPersistableModel<MessageCreationOptions> {
+ public IList<MessageCreationAttachment> Attachments { get; set; }
+ public IDictionary<string, string> Metadata { get; set; }
+ MessageCreationOptions IJsonModel<MessageCreationOptions>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options);
+ void IJsonModel<MessageCreationOptions>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+ MessageCreationOptions IPersistableModel<MessageCreationOptions>.Create(BinaryData data, ModelReaderWriterOptions options);
+ string IPersistableModel<MessageCreationOptions>.GetFormatFromOptions(ModelReaderWriterOptions options);
+ BinaryData IPersistableModel<MessageCreationOptions>.Write(ModelReaderWriterOptions options);
+ }
+ public class MessageFailureDetails : IJsonModel<MessageFailureDetails>, IPersistableModel<MessageFailureDetails> {
+ public MessageFailureReason Reason { get; }
+ MessageFailureDetails IJsonModel<MessageFailureDetails>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options);
+ void IJsonModel<MessageFailureDetails>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+ MessageFailureDetails IPersistableModel<MessageFailureDetails>.Create(BinaryData data, ModelReaderWriterOptions options);
+ string IPersistableModel<MessageFailureDetails>.GetFormatFromOptions(ModelReaderWriterOptions options);
+ BinaryData IPersistableModel<MessageFailureDetails>.Write(ModelReaderWriterOptions options);
+ }
+ public readonly partial struct MessageFailureReason : IEquatable<MessageFailureReason> {
+ private readonly object _dummy;
+ private readonly int _dummyPrimitive;
+ public MessageFailureReason(string value);
+ public static MessageFailureReason ContentFilter { get; }
+ public static MessageFailureReason MaxTokens { get; }
+ public static MessageFailureReason RunCancelled { get; }
+ public static MessageFailureReason RunExpired { get; }
+ public static MessageFailureReason RunFailed { get; }
+ public readonly bool Equals(MessageFailureReason other);
+ [EditorBrowsable(EditorBrowsableState.Never)]
+ public override readonly bool Equals(object obj);
+ [EditorBrowsable(EditorBrowsableState.Never)]
+ public override readonly int GetHashCode();
+ public static bool operator ==(MessageFailureReason left, MessageFailureReason right);
+ public static implicit operator MessageFailureReason(string value);
+ public static bool operator !=(MessageFailureReason left, MessageFailureReason right);
+ public override readonly string ToString();
+ }
+ public enum MessageImageDetail {
+ Auto = 0,
+ Low = 1,
+ High = 2
+ }
+ public class MessageModificationOptions : IJsonModel<MessageModificationOptions>, IPersistableModel<MessageModificationOptions> {
+ public IDictionary<string, string> Metadata { get; set; }
+ MessageModificationOptions IJsonModel<MessageModificationOptions>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options);
+ void IJsonModel<MessageModificationOptions>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+ MessageModificationOptions IPersistableModel<MessageModificationOptions>.Create(BinaryData data, ModelReaderWriterOptions options);
+ string IPersistableModel<MessageModificationOptions>.GetFormatFromOptions(ModelReaderWriterOptions options);
+ BinaryData IPersistableModel<MessageModificationOptions>.Write(ModelReaderWriterOptions options);
+ }
+ public enum MessageRole {
+ User = 0,
+ Assistant = 1
+ }
+ public readonly partial struct MessageStatus : IEquatable<MessageStatus> {
+ private readonly object _dummy;
+ private readonly int _dummyPrimitive;
+ public MessageStatus(string value);
+ public static MessageStatus Completed { get; }
+ public static MessageStatus Incomplete { get; }
+ public static MessageStatus InProgress { get; }
+ public readonly bool Equals(MessageStatus other);
+ [EditorBrowsable(EditorBrowsableState.Never)]
+ public override readonly bool Equals(object obj);
+ [EditorBrowsable(EditorBrowsableState.Never)]
+ public override readonly int GetHashCode();
+ public static bool operator ==(MessageStatus left, MessageStatus right);
+ public static implicit operator MessageStatus(string value);
+ public static bool operator !=(MessageStatus left, MessageStatus right);
+ public override readonly string ToString();
+ }
+ public class MessageStatusUpdate : StreamingUpdate {
+ }
+ public abstract class RequiredAction {
+ public string FunctionArguments { get; }
+ public string FunctionName { get; }
+ public string ToolCallId { get; }
+ }
+ public class RequiredActionUpdate : RunUpdate {
+ public string FunctionArguments { get; }
+ public string FunctionName { get; }
+ public string ToolCallId { get; }
+ public ThreadRun GetThreadRun();
+ }
+ public class RunCollectionOptions {
+ public string AfterId { get; set; }
+ public string BeforeId { get; set; }
+ public ListOrder? Order { get; set; }
+ public int? PageSize { get; set; }
+ }
+ public class RunCreationOptions : IJsonModel<RunCreationOptions>, IPersistableModel<RunCreationOptions> {
+ public string AdditionalInstructions { get; set; }
+ public IList<ThreadInitializationMessage> AdditionalMessages { get; }
+ public string InstructionsOverride { get; set; }
+ public int? MaxCompletionTokens { get; set; }
+ public int? MaxPromptTokens { get; set; }
+ public IDictionary<string, string> Metadata { get; }
+ public string ModelOverride { get; set; }
+ public float? NucleusSamplingFactor { get; set; }
+ public bool? ParallelToolCallsEnabled { get; set; }
+ public AssistantResponseFormat ResponseFormat { get; set; }
+ public float? Temperature { get; set; }
+ public ToolConstraint ToolConstraint { get; set; }
+ public IList<ToolDefinition> ToolsOverride { get; }
+ public RunTruncationStrategy TruncationStrategy { get; set; }
+ RunCreationOptions IJsonModel<RunCreationOptions>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options);
+ void IJsonModel<RunCreationOptions>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+ RunCreationOptions IPersistableModel<RunCreationOptions>.Create(BinaryData data, ModelReaderWriterOptions options);
+ string IPersistableModel<RunCreationOptions>.GetFormatFromOptions(ModelReaderWriterOptions options);
+ BinaryData IPersistableModel<RunCreationOptions>.Write(ModelReaderWriterOptions options);
+ }
+ public class RunError : IJsonModel<RunError>, IPersistableModel<RunError> {
+ public RunErrorCode Code { get; }
+ public string Message { get; }
+ RunError IJsonModel<RunError>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options);
+ void IJsonModel<RunError>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+ RunError IPersistableModel<RunError>.Create(BinaryData data, ModelReaderWriterOptions options);
+ string IPersistableModel<RunError>.GetFormatFromOptions(ModelReaderWriterOptions options);
+ BinaryData IPersistableModel<RunError>.Write(ModelReaderWriterOptions options);
+ }
+ public readonly partial struct RunErrorCode : IEquatable<RunErrorCode> {
+ private readonly object _dummy;
+ private readonly int _dummyPrimitive;
+ public RunErrorCode(string value);
+ public static RunErrorCode InvalidPrompt { get; }
+ public static RunErrorCode RateLimitExceeded { get; }
+ public static RunErrorCode ServerError { get; }
+ public readonly bool Equals(RunErrorCode other);
+ [EditorBrowsable(EditorBrowsableState.Never)]
+ public override readonly bool Equals(object obj);
+ [EditorBrowsable(EditorBrowsableState.Never)]
+ public override readonly int GetHashCode();
+ public static bool operator ==(RunErrorCode left, RunErrorCode right);
+ public static implicit operator RunErrorCode(string value);
+ public static bool operator !=(RunErrorCode left, RunErrorCode right);
+ public override readonly string ToString();
+ }
+ public class RunIncompleteDetails : IJsonModel<RunIncompleteDetails>, IPersistableModel<RunIncompleteDetails> {
+ public RunIncompleteReason? Reason { get; }
+ RunIncompleteDetails IJsonModel<RunIncompleteDetails>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options);
+ void IJsonModel<RunIncompleteDetails>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+ RunIncompleteDetails IPersistableModel<RunIncompleteDetails>.Create(BinaryData data, ModelReaderWriterOptions options);
+ string IPersistableModel<RunIncompleteDetails>.GetFormatFromOptions(ModelReaderWriterOptions options);
+ BinaryData IPersistableModel<RunIncompleteDetails>.Write(ModelReaderWriterOptions options);
+ }
+ public readonly partial struct RunIncompleteReason : IEquatable<RunIncompleteReason> {
+ private readonly object _dummy;
+ private readonly int _dummyPrimitive;
+ public RunIncompleteReason(string value);
+ public static RunIncompleteReason MaxCompletionTokens { get; }
+ public static RunIncompleteReason MaxPromptTokens { get; }
+ public readonly bool Equals(RunIncompleteReason other);
+ [EditorBrowsable(EditorBrowsableState.Never)]
+ public override readonly bool Equals(object obj);
+ [EditorBrowsable(EditorBrowsableState.Never)]
+ public override readonly int GetHashCode();
+ public static bool operator ==(RunIncompleteReason left, RunIncompleteReason right);
+ public static implicit operator RunIncompleteReason(string value);
+ public static bool operator !=(RunIncompleteReason left, RunIncompleteReason right);
+ public override readonly string ToString();
+ }
+ public class RunModificationOptions : IJsonModel<RunModificationOptions>, IPersistableModel<RunModificationOptions> {
+ public IDictionary<string, string> Metadata { get; set; }
+ RunModificationOptions IJsonModel<RunModificationOptions>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options);
+ void IJsonModel<RunModificationOptions>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+ RunModificationOptions IPersistableModel<RunModificationOptions>.Create(BinaryData data, ModelReaderWriterOptions options);
+ string IPersistableModel<RunModificationOptions>.GetFormatFromOptions(ModelReaderWriterOptions options);
+ BinaryData IPersistableModel<RunModificationOptions>.Write(ModelReaderWriterOptions options);
+ }
+ public readonly partial struct RunStatus : IEquatable<RunStatus> {
+ private readonly object _dummy;
+ private readonly int _dummyPrimitive;
+ public RunStatus(string value);
+ public static RunStatus Cancelled { get; }
+ public static RunStatus Cancelling { get; }
+ public static RunStatus Completed { get; }
+ public static RunStatus Expired { get; }
+ public static RunStatus Failed { get; }
+ public static RunStatus Incomplete { get; }
+ public static RunStatus InProgress { get; }
+ public bool IsTerminal { get; }
+ public static RunStatus Queued { get; }
+ public static RunStatus RequiresAction { get; }
+ public readonly bool Equals(RunStatus other);
+ [EditorBrowsable(EditorBrowsableState.Never)]
+ public override readonly bool Equals(object obj);
+ [EditorBrowsable(EditorBrowsableState.Never)]
+ public override readonly int GetHashCode();
+ public static bool operator ==(RunStatus left, RunStatus right);
+ public static implicit operator RunStatus(string value);
+ public static bool operator !=(RunStatus left, RunStatus right);
+ public override readonly string ToString();
+ }
+ public class RunStep : IJsonModel<RunStep>, IPersistableModel<RunStep> {
+ public string AssistantId { get; }
+ public DateTimeOffset? CancelledAt { get; }
+ public DateTimeOffset? CompletedAt { get; }
+ public DateTimeOffset CreatedAt { get; }
+ public RunStepDetails Details { get; }
+ public DateTimeOffset? ExpiredAt { get; }
+ public DateTimeOffset? FailedAt { get; }
+ public string Id { get; }
+ public RunStepError LastError { get; }
+ public IReadOnlyDictionary<string, string> Metadata { get; }
+ public string RunId { get; }
+ public RunStepStatus Status { get; }
+ public string ThreadId { get; }
+ public RunStepType Type { get; }
+ public RunStepTokenUsage Usage { get; }
+ RunStep IJsonModel<RunStep>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options);
+ void IJsonModel<RunStep>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+ RunStep IPersistableModel<RunStep>.Create(BinaryData data, ModelReaderWriterOptions options);
+ string IPersistableModel<RunStep>.GetFormatFromOptions(ModelReaderWriterOptions options);
+ BinaryData IPersistableModel<RunStep>.Write(ModelReaderWriterOptions options);
+ }
+ public abstract class RunStepCodeInterpreterOutput : IJsonModel<RunStepCodeInterpreterOutput>, IPersistableModel<RunStepCodeInterpreterOutput> {
+ public string ImageFileId { get; }
+ public string Logs { get; }
+ RunStepCodeInterpreterOutput IJsonModel<RunStepCodeInterpreterOutput>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options);
+ void IJsonModel<RunStepCodeInterpreterOutput>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+ RunStepCodeInterpreterOutput IPersistableModel<RunStepCodeInterpreterOutput>.Create(BinaryData data, ModelReaderWriterOptions options);
+ string IPersistableModel<RunStepCodeInterpreterOutput>.GetFormatFromOptions(ModelReaderWriterOptions options);
+ BinaryData IPersistableModel<RunStepCodeInterpreterOutput>.Write(ModelReaderWriterOptions options);
+ }
+ public class RunStepCollectionOptions {
+ public string AfterId { get; set; }
+ public string BeforeId { get; set; }
+ public ListOrder? Order { get; set; }
+ public int?
PageSize { get; set; } + } + public abstract class RunStepDetails : IJsonModel, IPersistableModel { + public string CreatedMessageId { get; } + public IReadOnlyList ToolCalls { get; } + RunStepDetails IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options); + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options); + RunStepDetails IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options); + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options); + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options); + } + public class RunStepDetailsUpdate : StreamingUpdate { + public string CodeInterpreterInput { get; } + public IReadOnlyList CodeInterpreterOutputs { get; } + public string CreatedMessageId { get; } + public string FunctionArguments { get; } + public string FunctionName { get; } + public string FunctionOutput { get; } + public string StepId { get; } + public string ToolCallId { get; } + public int? 
ToolCallIndex { get; } + } + public class RunStepError : IJsonModel, IPersistableModel { + public RunStepErrorCode Code { get; } + public string Message { get; } + RunStepError IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options); + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options); + RunStepError IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options); + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options); + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options); + } + public readonly partial struct RunStepErrorCode : IEquatable { + private readonly object _dummy; + private readonly int _dummyPrimitive; + public RunStepErrorCode(string value); + public static RunStepErrorCode RateLimitExceeded { get; } + public static RunStepErrorCode ServerError { get; } + public readonly bool Equals(RunStepErrorCode other); + [EditorBrowsable(EditorBrowsableState.Never)] + public override readonly bool Equals(object obj); + [EditorBrowsable(EditorBrowsableState.Never)] + public override readonly int GetHashCode(); + public static bool operator ==(RunStepErrorCode left, RunStepErrorCode right); + public static implicit operator RunStepErrorCode(string value); + public static bool operator !=(RunStepErrorCode left, RunStepErrorCode right); + public override readonly string ToString(); + } + public readonly partial struct RunStepStatus : IEquatable { + private readonly object _dummy; + private readonly int _dummyPrimitive; + public RunStepStatus(string value); + public static RunStepStatus Cancelled { get; } + public static RunStepStatus Completed { get; } + public static RunStepStatus Expired { get; } + public static RunStepStatus Failed { get; } + public static RunStepStatus InProgress { get; } + public readonly bool Equals(RunStepStatus other); + [EditorBrowsable(EditorBrowsableState.Never)] + public override readonly bool Equals(object obj); + 
[EditorBrowsable(EditorBrowsableState.Never)] + public override readonly int GetHashCode(); + public static bool operator ==(RunStepStatus left, RunStepStatus right); + public static implicit operator RunStepStatus(string value); + public static bool operator !=(RunStepStatus left, RunStepStatus right); + public override readonly string ToString(); + } + public class RunStepTokenUsage : IJsonModel, IPersistableModel { + public int CompletionTokens { get; } + public int PromptTokens { get; } + public int TotalTokens { get; } + RunStepTokenUsage IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options); + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options); + RunStepTokenUsage IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options); + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options); + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options); + } + public abstract class RunStepToolCall : IJsonModel, IPersistableModel { + public string CodeInterpreterInput { get; } + public IReadOnlyList CodeInterpreterOutputs { get; } + public string FunctionArguments { get; } + public string FunctionName { get; } + public string FunctionOutput { get; } + public string ToolCallId { get; } + public RunStepToolCallKind ToolKind { get; } + RunStepToolCall IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options); + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options); + RunStepToolCall IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options); + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options); + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options); + } + public enum RunStepToolCallKind { + Unknown = 0, + CodeInterpreter = 1, + FileSearch = 2, + Function = 3 + } + public readonly partial struct RunStepType : IEquatable { + private readonly object _dummy; + private 
readonly int _dummyPrimitive; + public RunStepType(string value); + public static RunStepType MessageCreation { get; } + public static RunStepType ToolCalls { get; } + public readonly bool Equals(RunStepType other); + [EditorBrowsable(EditorBrowsableState.Never)] + public override readonly bool Equals(object obj); + [EditorBrowsable(EditorBrowsableState.Never)] + public override readonly int GetHashCode(); + public static bool operator ==(RunStepType left, RunStepType right); + public static implicit operator RunStepType(string value); + public static bool operator !=(RunStepType left, RunStepType right); + public override readonly string ToString(); + } + public class RunStepUpdate : StreamingUpdate { + } + public abstract class RunStepUpdateCodeInterpreterOutput : IJsonModel, IPersistableModel { + public string ImageFileId { get; } + public string Logs { get; } + public int OutputIndex { get; } + RunStepUpdateCodeInterpreterOutput IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options); + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options); + RunStepUpdateCodeInterpreterOutput IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options); + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options); + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options); + } + public class RunTokenUsage : IJsonModel, IPersistableModel { + public int CompletionTokens { get; } + public int PromptTokens { get; } + public int TotalTokens { get; } + RunTokenUsage IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options); + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options); + RunTokenUsage IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options); + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options); + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options); + } + public class 
RunTruncationStrategy : IJsonModel, IPersistableModel { + public static RunTruncationStrategy Auto { get; } + public int? LastMessages { get; } + public static RunTruncationStrategy CreateLastMessagesStrategy(int lastMessageCount); + RunTruncationStrategy IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options); + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options); + RunTruncationStrategy IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options); + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options); + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options); + } + public class RunUpdate : StreamingUpdate { + } + public abstract class StreamingUpdate { + public StreamingUpdateReason UpdateKind { get; } + } + public enum StreamingUpdateReason { + Unknown = 0, + ThreadCreated = 1, + RunCreated = 2, + RunQueued = 3, + RunInProgress = 4, + RunRequiresAction = 5, + RunCompleted = 6, + RunIncomplete = 7, + RunFailed = 8, + RunCancelling = 9, + RunCancelled = 10, + RunExpired = 11, + RunStepCreated = 12, + RunStepInProgress = 13, + RunStepUpdated = 14, + RunStepCompleted = 15, + RunStepFailed = 16, + RunStepCancelled = 17, + RunStepExpired = 18, + MessageCreated = 19, + MessageInProgress = 20, + MessageUpdated = 21, + MessageCompleted = 22, + MessageFailed = 23, + Error = 24, + Done = 25 + } + public class StreamingUpdate : StreamingUpdate where T : class { + public T Value { get; } + public static implicit operator T(StreamingUpdate update); + } + public class TextAnnotation { + public int EndIndex { get; } + public string InputFileId { get; } + public string OutputFileId { get; } + public int StartIndex { get; } + public string TextToReplace { get; } + } + public class TextAnnotationUpdate { + public int ContentIndex { get; } + public int? EndIndex { get; } + public string InputFileId { get; } + public string OutputFileId { get; } + public int? 
StartIndex { get; } + public string TextToReplace { get; } + } + public class ThreadCreationOptions : IJsonModel, IPersistableModel { + public IList InitialMessages { get; } + public IDictionary Metadata { get; set; } + public ToolResources ToolResources { get; set; } + ThreadCreationOptions IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options); + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options); + ThreadCreationOptions IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options); + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options); + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options); + } + public class ThreadInitializationMessage : MessageCreationOptions { + public ThreadInitializationMessage(MessageRole role, IEnumerable content); + public static implicit operator ThreadInitializationMessage(string initializationMessage); + } + public class ThreadMessage : IJsonModel, IPersistableModel { + public string AssistantId { get; } + public IReadOnlyList Attachments { get; } + public DateTimeOffset? CompletedAt { get; } + public IReadOnlyList Content { get; } + public DateTimeOffset CreatedAt { get; } + public string Id { get; } + public DateTimeOffset? 
IncompleteAt { get; } + public MessageFailureDetails IncompleteDetails { get; } + public IReadOnlyDictionary Metadata { get; } + public MessageRole Role { get; } + public string RunId { get; } + public MessageStatus Status { get; } + public string ThreadId { get; } + ThreadMessage IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options); + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options); + ThreadMessage IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options); + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options); + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options); + } + public class ThreadModificationOptions : IJsonModel, IPersistableModel { + public IDictionary Metadata { get; set; } + public ToolResources ToolResources { get; set; } + ThreadModificationOptions IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options); + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options); + ThreadModificationOptions IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options); + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options); + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options); + } + public class ThreadRun : IJsonModel, IPersistableModel { + public string AssistantId { get; } + public DateTimeOffset? CancelledAt { get; } + public DateTimeOffset? CompletedAt { get; } + public DateTimeOffset CreatedAt { get; } + public DateTimeOffset? ExpiresAt { get; } + public DateTimeOffset? FailedAt { get; } + public string Id { get; } + public RunIncompleteDetails IncompleteDetails { get; } + public string Instructions { get; } + public RunError LastError { get; } + public int? MaxCompletionTokens { get; } + public int? MaxPromptTokens { get; } + public IReadOnlyDictionary Metadata { get; } + public string Model { get; } + public float? 
NucleusSamplingFactor { get; } + public bool? ParallelToolCallsEnabled { get; } + public IReadOnlyList RequiredActions { get; } + public AssistantResponseFormat ResponseFormat { get; } + public DateTimeOffset? StartedAt { get; } + public RunStatus Status { get; } + public float? Temperature { get; } + public string ThreadId { get; } + public ToolConstraint ToolConstraint { get; } + public IReadOnlyList Tools { get; } + public RunTruncationStrategy TruncationStrategy { get; } + public RunTokenUsage Usage { get; } + ThreadRun IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options); + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options); + ThreadRun IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options); + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options); + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options); + } + public class ThreadUpdate : StreamingUpdate { + public DateTimeOffset CreatedAt { get; } + public string Id { get; } + public IReadOnlyDictionary Metadata { get; } + public ToolResources ToolResources { get; } + } + public class ToolConstraint : IJsonModel, IPersistableModel { + public ToolConstraint(ToolDefinition toolDefinition); + public static ToolConstraint Auto { get; } + public static ToolConstraint None { get; } + public static ToolConstraint Required { get; } + ToolConstraint IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options); + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options); + ToolConstraint IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options); + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options); + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options); + } + public abstract class ToolDefinition : IJsonModel, IPersistableModel { + protected ToolDefinition(); + protected ToolDefinition(string type); + public 
static CodeInterpreterToolDefinition CreateCodeInterpreter(); + public static FileSearchToolDefinition CreateFileSearch(int? maxResults = null); + public static FunctionToolDefinition CreateFunction(string name, string description = null, BinaryData parameters = null, bool? strictParameterSchemaEnabled = null); + ToolDefinition IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options); + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options); + ToolDefinition IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options); + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options); + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options); + protected abstract void WriteCore(Utf8JsonWriter writer, ModelReaderWriterOptions options); + } + public class ToolOutput : IJsonModel, IPersistableModel { + public ToolOutput(); + public ToolOutput(string toolCallId, string output); + public string Output { get; set; } + public string ToolCallId { get; set; } + ToolOutput IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options); + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options); + ToolOutput IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options); + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options); + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options); + } + public class ToolResources : IJsonModel, IPersistableModel { + public CodeInterpreterToolResources CodeInterpreter { get; set; } + public FileSearchToolResources FileSearch { get; set; } + ToolResources IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options); + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options); + ToolResources IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options); + string 
IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options); + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options); + } + public class VectorStoreCreationHelper : IJsonModel, IPersistableModel { + public VectorStoreCreationHelper(); + public VectorStoreCreationHelper(IEnumerable files, IDictionary metadata = null); + public VectorStoreCreationHelper(IEnumerable fileIds, IDictionary metadata = null); + public FileChunkingStrategy ChunkingStrategy { get; set; } + public IList FileIds { get; } + public IDictionary Metadata { get; } + VectorStoreCreationHelper IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options); + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options); + VectorStoreCreationHelper IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options); + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options); + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options); + } +} +namespace OpenAI.Audio { + public class AudioClient { + protected AudioClient(); + protected internal AudioClient(ClientPipeline pipeline, string model, OpenAIClientOptions options); + public AudioClient(string model, ApiKeyCredential credential, OpenAIClientOptions options); + public AudioClient(string model, ApiKeyCredential credential); + public virtual ClientPipeline Pipeline { get; } + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual ClientResult GenerateSpeech(BinaryContent content, RequestOptions options = null); + public virtual ClientResult GenerateSpeech(string text, GeneratedSpeechVoice voice, SpeechGenerationOptions options = null, CancellationToken cancellationToken = default); + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual Task GenerateSpeechAsync(BinaryContent content, RequestOptions options = null); + public virtual Task> GenerateSpeechAsync(string text, GeneratedSpeechVoice voice, SpeechGenerationOptions options = 
null, CancellationToken cancellationToken = default); + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual ClientResult TranscribeAudio(BinaryContent content, string contentType, RequestOptions options = null); + public virtual ClientResult TranscribeAudio(Stream audio, string audioFilename, AudioTranscriptionOptions options = null, CancellationToken cancellationToken = default); + public virtual ClientResult TranscribeAudio(string audioFilePath, AudioTranscriptionOptions options = null); + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual Task TranscribeAudioAsync(BinaryContent content, string contentType, RequestOptions options = null); + public virtual Task> TranscribeAudioAsync(Stream audio, string audioFilename, AudioTranscriptionOptions options = null, CancellationToken cancellationToken = default); + public virtual Task> TranscribeAudioAsync(string audioFilePath, AudioTranscriptionOptions options = null); + public virtual ClientResult TranslateAudio(BinaryContent content, string contentType, RequestOptions options = null); + public virtual ClientResult TranslateAudio(Stream audio, string audioFilename, AudioTranslationOptions options = null, CancellationToken cancellationToken = default); + public virtual ClientResult TranslateAudio(string audioFilePath, AudioTranslationOptions options = null); + public virtual Task TranslateAudioAsync(BinaryContent content, string contentType, RequestOptions options = null); + public virtual Task> TranslateAudioAsync(Stream audio, string audioFilename, AudioTranslationOptions options = null, CancellationToken cancellationToken = default); + public virtual Task> TranslateAudioAsync(string audioFilePath, AudioTranslationOptions options = null); + } + [Flags] + public enum AudioTimestampGranularities { + Default = 0, + Word = 1, + Segment = 2 + } + public class AudioTranscription : IJsonModel, IPersistableModel { + public TimeSpan? 
Duration { get; } + public string Language { get; } + public IReadOnlyList Segments { get; } + public string Text { get; } + public IReadOnlyList Words { get; } + AudioTranscription IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options); + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options); + AudioTranscription IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options); + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options); + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options); + } + public enum AudioTranscriptionFormat { + Text = 0, + Simple = 1, + Verbose = 2, + Srt = 3, + Vtt = 4 + } + public class AudioTranscriptionOptions : IJsonModel, IPersistableModel { + public AudioTimestampGranularities Granularities { get; set; } + public string Language { get; set; } + public string Prompt { get; set; } + public AudioTranscriptionFormat? ResponseFormat { get; set; } + public float? Temperature { get; set; } + AudioTranscriptionOptions IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options); + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options); + AudioTranscriptionOptions IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options); + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options); + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options); + } + public class AudioTranslation : IJsonModel, IPersistableModel { + public TimeSpan? 
Duration { get; } + public string Language { get; } + public IReadOnlyList Segments { get; } + public string Text { get; } + AudioTranslation IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options); + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options); + AudioTranslation IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options); + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options); + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options); + } + public enum AudioTranslationFormat { + Text = 0, + Simple = 1, + Verbose = 2, + Srt = 3, + Vtt = 4 + } + public class AudioTranslationOptions : IJsonModel, IPersistableModel { + public string Prompt { get; set; } + public AudioTranslationFormat? ResponseFormat { get; set; } + public float? Temperature { get; set; } + AudioTranslationOptions IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options); + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options); + AudioTranslationOptions IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options); + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options); + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options); + } + public enum GeneratedSpeechFormat { + Mp3 = 0, + Opus = 1, + Aac = 2, + Flac = 3, + Wav = 4, + Pcm = 5 + } + public enum GeneratedSpeechVoice { + Alloy = 0, + Echo = 1, + Fable = 2, + Onyx = 3, + Nova = 4, + Shimmer = 5 + } + public static class OpenAIAudioModelFactory { + public static AudioTranscription AudioTranscription(string language = null, TimeSpan? duration = null, string text = null, IEnumerable words = null, IEnumerable segments = null); + public static AudioTranslation AudioTranslation(string language = null, TimeSpan? 
duration = null, string text = null, IEnumerable segments = null); + public static TranscribedSegment TranscribedSegment(int id = 0, long seekOffset = 0, TimeSpan start = default, TimeSpan end = default, string text = null, IEnumerable tokenIds = null, float temperature = 0, double averageLogProbability = 0, float compressionRatio = 0, double noSpeechProbability = 0); + public static TranscribedWord TranscribedWord(string word = null, TimeSpan start = default, TimeSpan end = default); + } + public class SpeechGenerationOptions : IJsonModel, IPersistableModel { + public GeneratedSpeechFormat? ResponseFormat { get; set; } + public float? Speed { get; set; } + SpeechGenerationOptions IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options); + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options); + SpeechGenerationOptions IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options); + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options); + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options); + } + public readonly partial struct TranscribedSegment : IJsonModel, IPersistableModel, IJsonModel, IPersistableModel { + private readonly object _dummy; + private readonly int _dummyPrimitive; + public double AverageLogProbability { get; } + public float CompressionRatio { get; } + public TimeSpan End { get; } + public int Id { get; } + public double NoSpeechProbability { get; } + public long SeekOffset { get; } + public TimeSpan Start { get; } + public float Temperature { get; } + public string Text { get; } + public IReadOnlyList TokenIds { get; } + readonly TranscribedSegment IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options); + readonly void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options); + readonly object IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options); + readonly void 
IJsonModel<object>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+        readonly TranscribedSegment IPersistableModel<TranscribedSegment>.Create(BinaryData data, ModelReaderWriterOptions options);
+        readonly string IPersistableModel<TranscribedSegment>.GetFormatFromOptions(ModelReaderWriterOptions options);
+        readonly BinaryData IPersistableModel<TranscribedSegment>.Write(ModelReaderWriterOptions options);
+        readonly object IPersistableModel<object>.Create(BinaryData data, ModelReaderWriterOptions options);
+        readonly string IPersistableModel<object>.GetFormatFromOptions(ModelReaderWriterOptions options);
+        readonly BinaryData IPersistableModel<object>.Write(ModelReaderWriterOptions options);
+    }
+    public readonly partial struct TranscribedWord : IJsonModel<TranscribedWord>, IPersistableModel<TranscribedWord>, IJsonModel<object>, IPersistableModel<object> {
+        private readonly object _dummy;
+        private readonly int _dummyPrimitive;
+        public TimeSpan End { get; }
+        public TimeSpan Start { get; }
+        public string Word { get; }
+        readonly TranscribedWord IJsonModel<TranscribedWord>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options);
+        readonly void IJsonModel<TranscribedWord>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+        readonly object IJsonModel<object>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options);
+        readonly void IJsonModel<object>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+        readonly TranscribedWord IPersistableModel<TranscribedWord>.Create(BinaryData data, ModelReaderWriterOptions options);
+        readonly string IPersistableModel<TranscribedWord>.GetFormatFromOptions(ModelReaderWriterOptions options);
+        readonly BinaryData IPersistableModel<TranscribedWord>.Write(ModelReaderWriterOptions options);
+        readonly object IPersistableModel<object>.Create(BinaryData data, ModelReaderWriterOptions options);
+        readonly string IPersistableModel<object>.GetFormatFromOptions(ModelReaderWriterOptions options);
+        readonly BinaryData IPersistableModel<object>.Write(ModelReaderWriterOptions options);
+    }
+}
+namespace OpenAI.Batch {
+    public class BatchClient {
+        protected BatchClient();
+        public BatchClient(ApiKeyCredential credential, OpenAIClientOptions options);
+        public BatchClient(ApiKeyCredential credential);
+        protected internal BatchClient(ClientPipeline pipeline, OpenAIClientOptions options);
+        public virtual ClientPipeline Pipeline { get; }
+        public virtual ClientResult CancelBatch(string batchId, RequestOptions options);
+        public virtual Task<ClientResult> CancelBatchAsync(string batchId, RequestOptions options);
+        public virtual ClientResult CreateBatch(BinaryContent content, RequestOptions options = null);
+        public virtual Task<ClientResult> CreateBatchAsync(BinaryContent content, RequestOptions options = null);
+        public virtual ClientResult GetBatch(string batchId, RequestOptions options);
+        public virtual Task<ClientResult> GetBatchAsync(string batchId, RequestOptions options);
+        public virtual IEnumerable<ClientResult> GetBatches(string after, int? limit, RequestOptions options);
+        public virtual IAsyncEnumerable<ClientResult> GetBatchesAsync(string after, int? limit, RequestOptions options);
+    }
+}
+namespace OpenAI.Chat {
+    public class AssistantChatMessage : ChatMessage, IJsonModel<AssistantChatMessage>, IPersistableModel<AssistantChatMessage> {
+        public AssistantChatMessage(ChatCompletion chatCompletion);
+        public AssistantChatMessage(ChatFunctionCall functionCall, string content = null);
+        public AssistantChatMessage(params ChatMessageContentPart[] contentParts);
+        public AssistantChatMessage(IEnumerable<ChatMessageContentPart> contentParts);
+        public AssistantChatMessage(IEnumerable<ChatToolCall> toolCalls, string content = null);
+        public AssistantChatMessage(string content);
+        public ChatFunctionCall FunctionCall { get; set; }
+        public string ParticipantName { get; set; }
+        public string Refusal { get; set; }
+        public IList<ChatToolCall> ToolCalls { get; }
+        AssistantChatMessage IJsonModel<AssistantChatMessage>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options);
+        void IJsonModel<AssistantChatMessage>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+        AssistantChatMessage IPersistableModel<AssistantChatMessage>.Create(BinaryData data, ModelReaderWriterOptions options);
+        string IPersistableModel<AssistantChatMessage>.GetFormatFromOptions(ModelReaderWriterOptions options);
+        BinaryData IPersistableModel<AssistantChatMessage>.Write(ModelReaderWriterOptions options);
+        protected override void WriteCore(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+    }
+    public class ChatClient {
+        protected ChatClient();
+        protected internal ChatClient(ClientPipeline pipeline, string model, OpenAIClientOptions options);
+        public ChatClient(string model, ApiKeyCredential credential, OpenAIClientOptions options);
+        public ChatClient(string model, ApiKeyCredential credential);
+        public virtual ClientPipeline Pipeline { get; }
+        public virtual ClientResult<ChatCompletion> CompleteChat(params ChatMessage[] messages);
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public virtual ClientResult CompleteChat(BinaryContent content, RequestOptions options = null);
+        public virtual ClientResult<ChatCompletion> CompleteChat(IEnumerable<ChatMessage> messages, ChatCompletionOptions options = null, CancellationToken cancellationToken = default);
+        public virtual Task<ClientResult<ChatCompletion>> CompleteChatAsync(params ChatMessage[] messages);
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public virtual Task<ClientResult> CompleteChatAsync(BinaryContent content, RequestOptions options = null);
+        public virtual Task<ClientResult<ChatCompletion>> CompleteChatAsync(IEnumerable<ChatMessage> messages, ChatCompletionOptions options = null, CancellationToken cancellationToken = default);
+        public virtual CollectionResult<StreamingChatCompletionUpdate> CompleteChatStreaming(params ChatMessage[] messages);
+        public virtual CollectionResult<StreamingChatCompletionUpdate> CompleteChatStreaming(IEnumerable<ChatMessage> messages, ChatCompletionOptions options = null, CancellationToken cancellationToken = default);
+        public virtual AsyncCollectionResult<StreamingChatCompletionUpdate> CompleteChatStreamingAsync(params ChatMessage[] messages);
+        public virtual AsyncCollectionResult<StreamingChatCompletionUpdate> CompleteChatStreamingAsync(IEnumerable<ChatMessage> messages, ChatCompletionOptions options = null, CancellationToken cancellationToken = default);
+    }
+    public class ChatCompletion : IJsonModel<ChatCompletion>, IPersistableModel<ChatCompletion> {
+        public IReadOnlyList<ChatMessageContentPart> Content { get; }
+        public IReadOnlyList<ChatTokenLogProbabilityInfo> ContentTokenLogProbabilities { get; }
+        public DateTimeOffset CreatedAt { get; }
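As a usage sketch of the `ChatClient` surface listed above (assuming the `OpenAI` `2.0.0-beta.9` package; the model name and environment-variable key are placeholders, not part of this changelog):

```csharp
using System;
using System.ClientModel;
using OpenAI.Chat;

// Hypothetical model name; the API key is read from an environment variable.
ChatClient client = new(
    "gpt-4o-mini",
    new ApiKeyCredential(Environment.GetEnvironmentVariable("OPENAI_API_KEY")));

// Convenience overload: returns ClientResult<ChatCompletion>, implicitly
// convertible to ChatCompletion.
ChatCompletion completion = client.CompleteChat(
    new SystemChatMessage("You are a terse assistant."),
    new UserChatMessage("Say hello."));
Console.WriteLine(completion.Content[0].Text);

// Streaming overload: returns CollectionResult<StreamingChatCompletionUpdate>.
// The string argument converts via the implicit ChatMessage operator.
foreach (StreamingChatCompletionUpdate update in client.CompleteChatStreaming("Count to three."))
{
    foreach (ChatMessageContentPart part in update.ContentUpdate)
    {
        Console.Write(part.Text);
    }
}
```

Note that the `params ChatMessage[]` overloads shown here are convenience shortcuts over the `IEnumerable<ChatMessage>` overloads, which additionally accept `ChatCompletionOptions` and a `CancellationToken`.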
+        public ChatFinishReason FinishReason { get; }
+        public ChatFunctionCall FunctionCall { get; }
+        public string Id { get; }
+        public string Model { get; }
+        public string Refusal { get; }
+        public IReadOnlyList<ChatTokenLogProbabilityInfo> RefusalTokenLogProbabilities { get; }
+        public ChatMessageRole Role { get; }
+        public string SystemFingerprint { get; }
+        public IReadOnlyList<ChatToolCall> ToolCalls { get; }
+        public ChatTokenUsage Usage { get; }
+        ChatCompletion IJsonModel<ChatCompletion>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options);
+        void IJsonModel<ChatCompletion>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+        ChatCompletion IPersistableModel<ChatCompletion>.Create(BinaryData data, ModelReaderWriterOptions options);
+        string IPersistableModel<ChatCompletion>.GetFormatFromOptions(ModelReaderWriterOptions options);
+        BinaryData IPersistableModel<ChatCompletion>.Write(ModelReaderWriterOptions options);
+        public override string ToString();
+    }
+    public class ChatCompletionOptions : IJsonModel<ChatCompletionOptions>, IPersistableModel<ChatCompletionOptions> {
+        public string EndUserId { get; set; }
+        public float? FrequencyPenalty { get; set; }
+        public ChatFunctionChoice FunctionChoice { get; set; }
+        public IList<ChatFunction> Functions { get; }
+        public bool? IncludeLogProbabilities { get; set; }
+        public IDictionary<int, int> LogitBiases { get; }
+        public int? MaxTokens { get; set; }
+        public bool? ParallelToolCallsEnabled { get; set; }
+        public float? PresencePenalty { get; set; }
+        public ChatResponseFormat ResponseFormat { get; set; }
+        public long? Seed { get; set; }
+        public IList<string> StopSequences { get; }
+        public float? Temperature { get; set; }
+        public ChatToolChoice ToolChoice { get; set; }
+        public IList<ChatTool> Tools { get; }
+        public int? TopLogProbabilityCount { get; set; }
+        public float? TopP { get; set; }
+        ChatCompletionOptions IJsonModel<ChatCompletionOptions>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options);
+        void IJsonModel<ChatCompletionOptions>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+        ChatCompletionOptions IPersistableModel<ChatCompletionOptions>.Create(BinaryData data, ModelReaderWriterOptions options);
+        string IPersistableModel<ChatCompletionOptions>.GetFormatFromOptions(ModelReaderWriterOptions options);
+        BinaryData IPersistableModel<ChatCompletionOptions>.Write(ModelReaderWriterOptions options);
+    }
+    public enum ChatFinishReason {
+        Stop = 0,
+        Length = 1,
+        ContentFilter = 2,
+        ToolCalls = 3,
+        FunctionCall = 4
+    }
+    [Obsolete("This field is marked as deprecated.")]
+    public class ChatFunction : IJsonModel<ChatFunction>, IPersistableModel<ChatFunction> {
+        public ChatFunction(string functionName, string functionDescription = null, BinaryData functionParameters = null);
+        public string FunctionDescription { get; set; }
+        public string FunctionName { get; }
+        public BinaryData FunctionParameters { get; set; }
+        ChatFunction IJsonModel<ChatFunction>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options);
+        void IJsonModel<ChatFunction>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+        ChatFunction IPersistableModel<ChatFunction>.Create(BinaryData data, ModelReaderWriterOptions options);
+        string IPersistableModel<ChatFunction>.GetFormatFromOptions(ModelReaderWriterOptions options);
+        BinaryData IPersistableModel<ChatFunction>.Write(ModelReaderWriterOptions options);
+    }
+    public class ChatFunctionCall : IJsonModel<ChatFunctionCall>, IPersistableModel<ChatFunctionCall> {
+        public ChatFunctionCall(string functionName, string functionArguments);
+        public string FunctionArguments { get; }
+        public string FunctionName { get; }
+        ChatFunctionCall IJsonModel<ChatFunctionCall>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options);
+        void IJsonModel<ChatFunctionCall>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+        ChatFunctionCall IPersistableModel<ChatFunctionCall>.Create(BinaryData data, ModelReaderWriterOptions options);
+        string IPersistableModel<ChatFunctionCall>.GetFormatFromOptions(ModelReaderWriterOptions options);
+        BinaryData IPersistableModel<ChatFunctionCall>.Write(ModelReaderWriterOptions options);
+    }
+    public class ChatFunctionChoice : IJsonModel<ChatFunctionChoice>, IPersistableModel<ChatFunctionChoice> {
+        public ChatFunctionChoice(ChatFunction chatFunction);
+        public static ChatFunctionChoice Auto { get; }
+        public static ChatFunctionChoice None { get; }
+        ChatFunctionChoice IJsonModel<ChatFunctionChoice>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options);
+        void IJsonModel<ChatFunctionChoice>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+        ChatFunctionChoice IPersistableModel<ChatFunctionChoice>.Create(BinaryData data, ModelReaderWriterOptions options);
+        string IPersistableModel<ChatFunctionChoice>.GetFormatFromOptions(ModelReaderWriterOptions options);
+        BinaryData IPersistableModel<ChatFunctionChoice>.Write(ModelReaderWriterOptions options);
+    }
+    public abstract class ChatMessage : IJsonModel<ChatMessage>, IPersistableModel<ChatMessage> {
+        protected ChatMessage();
+        protected internal ChatMessage(ChatMessageRole role, IEnumerable<ChatMessageContentPart> contentParts);
+        protected internal ChatMessage(ChatMessageRole role, string content);
+        public IList<ChatMessageContentPart> Content { get; }
+        public static AssistantChatMessage CreateAssistantMessage(ChatCompletion chatCompletion);
+        public static AssistantChatMessage CreateAssistantMessage(ChatFunctionCall functionCall, string content = null);
+        public static AssistantChatMessage CreateAssistantMessage(params ChatMessageContentPart[] contentParts);
+        public static AssistantChatMessage CreateAssistantMessage(IEnumerable<ChatMessageContentPart> contentParts);
+        public static AssistantChatMessage CreateAssistantMessage(IEnumerable<ChatToolCall> toolCalls, string content = null);
+        public static AssistantChatMessage CreateAssistantMessage(string content);
+        public static FunctionChatMessage CreateFunctionMessage(string functionName, string content);
+        public static SystemChatMessage CreateSystemMessage(params ChatMessageContentPart[] contentParts);
+        public static SystemChatMessage CreateSystemMessage(IEnumerable<ChatMessageContentPart> contentParts);
+        public static SystemChatMessage CreateSystemMessage(string content);
+        public static ToolChatMessage CreateToolChatMessage(string toolCallId, params ChatMessageContentPart[] contentParts);
+        public static ToolChatMessage CreateToolChatMessage(string toolCallId, IEnumerable<ChatMessageContentPart> contentParts);
+        public static ToolChatMessage CreateToolChatMessage(string toolCallId, string content);
+        public static UserChatMessage CreateUserMessage(params ChatMessageContentPart[] contentParts);
+        public static UserChatMessage CreateUserMessage(IEnumerable<ChatMessageContentPart> contentParts);
+        public static UserChatMessage CreateUserMessage(string content);
+        public static implicit operator ChatMessage(string userMessage);
+        ChatMessage IJsonModel<ChatMessage>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options);
+        void IJsonModel<ChatMessage>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+        ChatMessage IPersistableModel<ChatMessage>.Create(BinaryData data, ModelReaderWriterOptions options);
+        string IPersistableModel<ChatMessage>.GetFormatFromOptions(ModelReaderWriterOptions options);
+        BinaryData IPersistableModel<ChatMessage>.Write(ModelReaderWriterOptions options);
+        protected abstract void WriteCore(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+    }
+    public class ChatMessageContentPart : IJsonModel<ChatMessageContentPart>, IPersistableModel<ChatMessageContentPart> {
+        public BinaryData ImageBytes { get; }
+        public string ImageBytesMediaType { get; }
+        public ImageChatMessageContentPartDetail? ImageDetail { get; }
+        public Uri ImageUri { get; }
+        public ChatMessageContentPartKind Kind { get; }
+        public string Refusal { get; }
+        public string Text { get; }
+        public static ChatMessageContentPart CreateImageMessageContentPart(BinaryData imageBytes, string imageBytesMediaType, ImageChatMessageContentPartDetail? imageDetail = null);
+        public static ChatMessageContentPart CreateImageMessageContentPart(Uri imageUri, ImageChatMessageContentPartDetail? imageDetail = null);
+        public static ChatMessageContentPart CreateRefusalMessageContentPart(string refusal);
+        public static ChatMessageContentPart CreateTextMessageContentPart(string text);
+        public static implicit operator ChatMessageContentPart(string content);
+        ChatMessageContentPart IJsonModel<ChatMessageContentPart>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options);
+        void IJsonModel<ChatMessageContentPart>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+        ChatMessageContentPart IPersistableModel<ChatMessageContentPart>.Create(BinaryData data, ModelReaderWriterOptions options);
+        string IPersistableModel<ChatMessageContentPart>.GetFormatFromOptions(ModelReaderWriterOptions options);
+        BinaryData IPersistableModel<ChatMessageContentPart>.Write(ModelReaderWriterOptions options);
+        public override string ToString();
+    }
+    public readonly partial struct ChatMessageContentPartKind : IEquatable<ChatMessageContentPartKind> {
+        private readonly object _dummy;
+        private readonly int _dummyPrimitive;
+        public ChatMessageContentPartKind(string value);
+        public static ChatMessageContentPartKind Image { get; }
+        public static ChatMessageContentPartKind Refusal { get; }
+        public static ChatMessageContentPartKind Text { get; }
+        public readonly bool Equals(ChatMessageContentPartKind other);
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override readonly bool Equals(object obj);
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override readonly int GetHashCode();
+        public static bool operator ==(ChatMessageContentPartKind left, ChatMessageContentPartKind right);
+        public static implicit operator ChatMessageContentPartKind(string value);
+        public static bool operator !=(ChatMessageContentPartKind left, ChatMessageContentPartKind right);
+        public override readonly string ToString();
+    }
+    public enum ChatMessageRole {
+        System = 0,
+        User = 1,
+        Assistant = 2,
+        Tool = 3,
+        Function = 4
+    }
+    public abstract class ChatResponseFormat : IEquatable<ChatResponseFormat>, IJsonModel<ChatResponseFormat>, IPersistableModel<ChatResponseFormat> {
+        public static ChatResponseFormat JsonObject { get; }
+        public static ChatResponseFormat Text { get; }
+        public static ChatResponseFormat CreateJsonObjectFormat();
+        public static ChatResponseFormat CreateJsonSchemaFormat(string name, BinaryData jsonSchema, string description = null, bool? strictSchemaEnabled = null);
+        public static ChatResponseFormat CreateTextFormat();
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override bool Equals(object obj);
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override int GetHashCode();
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public static bool operator ==(ChatResponseFormat first, ChatResponseFormat second);
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public static bool operator !=(ChatResponseFormat first, ChatResponseFormat second);
+        ChatResponseFormat IJsonModel<ChatResponseFormat>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options);
+        void IJsonModel<ChatResponseFormat>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+        ChatResponseFormat IPersistableModel<ChatResponseFormat>.Create(BinaryData data, ModelReaderWriterOptions options);
+        string IPersistableModel<ChatResponseFormat>.GetFormatFromOptions(ModelReaderWriterOptions options);
+        BinaryData IPersistableModel<ChatResponseFormat>.Write(ModelReaderWriterOptions options);
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        bool IEquatable<ChatResponseFormat>.Equals(ChatResponseFormat other);
+        public override string ToString();
+    }
+    public class ChatTokenLogProbabilityInfo : IJsonModel<ChatTokenLogProbabilityInfo>, IPersistableModel<ChatTokenLogProbabilityInfo> {
+        public float LogProbability { get; }
+        public string Token { get; }
+        public IReadOnlyList<ChatTokenTopLogProbabilityInfo> TopLogProbabilities { get; }
+        public IReadOnlyList<int> Utf8ByteValues { get; }
+        ChatTokenLogProbabilityInfo IJsonModel<ChatTokenLogProbabilityInfo>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options);
+        void IJsonModel<ChatTokenLogProbabilityInfo>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+        ChatTokenLogProbabilityInfo IPersistableModel<ChatTokenLogProbabilityInfo>.Create(BinaryData data, ModelReaderWriterOptions options);
+        string IPersistableModel<ChatTokenLogProbabilityInfo>.GetFormatFromOptions(ModelReaderWriterOptions options);
+        BinaryData IPersistableModel<ChatTokenLogProbabilityInfo>.Write(ModelReaderWriterOptions options);
+    }
+    public class ChatTokenTopLogProbabilityInfo : IJsonModel<ChatTokenTopLogProbabilityInfo>, IPersistableModel<ChatTokenTopLogProbabilityInfo> {
+        public float LogProbability { get; }
+        public string Token { get; }
+        public IReadOnlyList<int> Utf8ByteValues { get; }
+        ChatTokenTopLogProbabilityInfo IJsonModel<ChatTokenTopLogProbabilityInfo>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options);
+        void IJsonModel<ChatTokenTopLogProbabilityInfo>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+        ChatTokenTopLogProbabilityInfo IPersistableModel<ChatTokenTopLogProbabilityInfo>.Create(BinaryData data, ModelReaderWriterOptions options);
+        string IPersistableModel<ChatTokenTopLogProbabilityInfo>.GetFormatFromOptions(ModelReaderWriterOptions options);
+        BinaryData IPersistableModel<ChatTokenTopLogProbabilityInfo>.Write(ModelReaderWriterOptions options);
+    }
+    public class ChatTokenUsage : IJsonModel<ChatTokenUsage>, IPersistableModel<ChatTokenUsage> {
+        public int InputTokens { get; }
+        public int OutputTokens { get; }
+        public int TotalTokens { get; }
+        ChatTokenUsage IJsonModel<ChatTokenUsage>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options);
+        void IJsonModel<ChatTokenUsage>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+        ChatTokenUsage IPersistableModel<ChatTokenUsage>.Create(BinaryData data, ModelReaderWriterOptions options);
+        string IPersistableModel<ChatTokenUsage>.GetFormatFromOptions(ModelReaderWriterOptions options);
+        BinaryData IPersistableModel<ChatTokenUsage>.Write(ModelReaderWriterOptions options);
+    }
+    public class ChatTool : IJsonModel<ChatTool>, IPersistableModel<ChatTool> {
+        public string FunctionDescription { get; }
+        public string FunctionName { get; }
+        public BinaryData FunctionParameters { get; }
+        public ChatToolKind Kind { get; }
+        public bool? StrictParameterSchemaEnabled { get; }
+        public static ChatTool CreateFunctionTool(string functionName, string functionDescription = null, BinaryData functionParameters = null, bool? strictParameterSchemaEnabled = null);
+        ChatTool IJsonModel<ChatTool>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options);
+        void IJsonModel<ChatTool>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+        ChatTool IPersistableModel<ChatTool>.Create(BinaryData data, ModelReaderWriterOptions options);
+        string IPersistableModel<ChatTool>.GetFormatFromOptions(ModelReaderWriterOptions options);
+        BinaryData IPersistableModel<ChatTool>.Write(ModelReaderWriterOptions options);
+    }
+    public class ChatToolCall : IJsonModel<ChatToolCall>, IPersistableModel<ChatToolCall> {
+        public string FunctionArguments { get; }
+        public string FunctionName { get; }
+        public string Id { get; set; }
+        public ChatToolCallKind Kind { get; }
+        public static ChatToolCall CreateFunctionToolCall(string toolCallId, string functionName, string functionArguments);
+        ChatToolCall IJsonModel<ChatToolCall>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options);
+        void IJsonModel<ChatToolCall>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+        ChatToolCall IPersistableModel<ChatToolCall>.Create(BinaryData data, ModelReaderWriterOptions options);
+        string IPersistableModel<ChatToolCall>.GetFormatFromOptions(ModelReaderWriterOptions options);
+        BinaryData IPersistableModel<ChatToolCall>.Write(ModelReaderWriterOptions options);
+    }
+    public readonly partial struct ChatToolCallKind : IEquatable<ChatToolCallKind> {
+        private readonly object _dummy;
+        private readonly int _dummyPrimitive;
+        public ChatToolCallKind(string value);
+        public static ChatToolCallKind Function { get; }
+        public readonly bool Equals(ChatToolCallKind other);
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override readonly bool Equals(object obj);
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override readonly int GetHashCode();
+        public static bool operator ==(ChatToolCallKind left, ChatToolCallKind right);
+        public static implicit operator ChatToolCallKind(string value);
+        public static bool operator !=(ChatToolCallKind left, ChatToolCallKind right);
+        public override readonly string ToString();
+    }
+    public class ChatToolChoice : IJsonModel<ChatToolChoice>, IPersistableModel<ChatToolChoice> {
+        public ChatToolChoice(ChatTool tool);
+        public static ChatToolChoice Auto { get; }
+        public static ChatToolChoice None { get; }
+        public static ChatToolChoice Required { get; }
+        ChatToolChoice IJsonModel<ChatToolChoice>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options);
+        void IJsonModel<ChatToolChoice>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+        ChatToolChoice IPersistableModel<ChatToolChoice>.Create(BinaryData data, ModelReaderWriterOptions options);
+        string IPersistableModel<ChatToolChoice>.GetFormatFromOptions(ModelReaderWriterOptions options);
+        BinaryData IPersistableModel<ChatToolChoice>.Write(ModelReaderWriterOptions options);
+    }
+    public readonly partial struct ChatToolKind : IEquatable<ChatToolKind> {
+        private readonly object _dummy;
+        private readonly int _dummyPrimitive;
+        public ChatToolKind(string value);
+        public static ChatToolKind Function { get; }
+        public readonly bool Equals(ChatToolKind other);
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override readonly bool Equals(object obj);
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override readonly int GetHashCode();
+        public static bool operator ==(ChatToolKind left, ChatToolKind right);
+        public static implicit operator ChatToolKind(string value);
+        public static bool operator !=(ChatToolKind left, ChatToolKind right);
+        public override readonly string ToString();
+    }
+    [Obsolete("This field is marked as deprecated.")]
+    public class FunctionChatMessage : ChatMessage, IJsonModel<FunctionChatMessage>, IPersistableModel<FunctionChatMessage> {
+        public FunctionChatMessage(string functionName, string content = null);
+        public string FunctionName { get; }
+        FunctionChatMessage IJsonModel<FunctionChatMessage>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options);
+        void IJsonModel<FunctionChatMessage>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+        FunctionChatMessage IPersistableModel<FunctionChatMessage>.Create(BinaryData data, ModelReaderWriterOptions options);
+        string IPersistableModel<FunctionChatMessage>.GetFormatFromOptions(ModelReaderWriterOptions options);
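The `ChatTool`, `ChatToolCall`, and `ToolChatMessage` types listed above combine into the usual tool-calling round trip. A minimal sketch, assuming the `OpenAI` `2.0.0-beta.9` package; the model name, `get_weather` function, and its JSON schema are illustrative placeholders:

```csharp
using System;
using System.ClientModel;
using System.Collections.Generic;
using OpenAI.Chat;

ChatClient client = new(
    "gpt-4o-mini",
    new ApiKeyCredential(Environment.GetEnvironmentVariable("OPENAI_API_KEY")));

// Define a hypothetical function tool with an illustrative JSON schema.
ChatTool weatherTool = ChatTool.CreateFunctionTool(
    functionName: "get_weather",
    functionDescription: "Gets the current weather for a city.",
    functionParameters: BinaryData.FromString(
        "{\"type\":\"object\",\"properties\":{\"city\":{\"type\":\"string\"}},\"required\":[\"city\"]}"));

List<ChatMessage> messages = new() { new UserChatMessage("What's the weather in Paris?") };
ChatCompletionOptions options = new();
options.Tools.Add(weatherTool);

ChatCompletion completion = client.CompleteChat(messages, options);
if (completion.FinishReason == ChatFinishReason.ToolCalls)
{
    // Echo the assistant's tool-call turn back into the conversation.
    messages.Add(new AssistantChatMessage(completion));
    foreach (ChatToolCall call in completion.ToolCalls)
    {
        // A real app would dispatch on call.FunctionName and parse call.FunctionArguments;
        // this canned result is a placeholder.
        messages.Add(new ToolChatMessage(call.Id, "{\"temperature_c\": 18}"));
    }
    completion = client.CompleteChat(messages, options);
}
Console.WriteLine(completion.Content[0].Text);
```

The `AssistantChatMessage(ChatCompletion)` constructor shown in the listing is what makes the echo-back step a one-liner; each `ToolChatMessage` must carry the `Id` of the `ChatToolCall` it answers.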
+        BinaryData IPersistableModel<FunctionChatMessage>.Write(ModelReaderWriterOptions options);
+        protected override void WriteCore(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+    }
+    public readonly partial struct ImageChatMessageContentPartDetail : IEquatable<ImageChatMessageContentPartDetail> {
+        private readonly object _dummy;
+        private readonly int _dummyPrimitive;
+        public ImageChatMessageContentPartDetail(string value);
+        public static ImageChatMessageContentPartDetail Auto { get; }
+        public static ImageChatMessageContentPartDetail High { get; }
+        public static ImageChatMessageContentPartDetail Low { get; }
+        public readonly bool Equals(ImageChatMessageContentPartDetail other);
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override readonly bool Equals(object obj);
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override readonly int GetHashCode();
+        public static bool operator ==(ImageChatMessageContentPartDetail left, ImageChatMessageContentPartDetail right);
+        public static implicit operator ImageChatMessageContentPartDetail(string value);
+        public static bool operator !=(ImageChatMessageContentPartDetail left, ImageChatMessageContentPartDetail right);
+        public override readonly string ToString();
+    }
+    public class StreamingChatCompletionUpdate : IJsonModel<StreamingChatCompletionUpdate>, IPersistableModel<StreamingChatCompletionUpdate> {
+        public IReadOnlyList<ChatTokenLogProbabilityInfo> ContentTokenLogProbabilities { get; }
+        public IReadOnlyList<ChatMessageContentPart> ContentUpdate { get; }
+        public DateTimeOffset CreatedAt { get; }
+        public ChatFinishReason? FinishReason { get; }
+        public StreamingChatFunctionCallUpdate FunctionCallUpdate { get; }
+        public string Id { get; }
+        public string Model { get; }
+        public IReadOnlyList<ChatTokenLogProbabilityInfo> RefusalTokenLogProbabilities { get; }
+        public string RefusalUpdate { get; }
+        public ChatMessageRole? Role { get; }
+        public string SystemFingerprint { get; }
+        public IReadOnlyList<StreamingChatToolCallUpdate> ToolCallUpdates { get; }
+        public ChatTokenUsage Usage { get; }
+        StreamingChatCompletionUpdate IJsonModel<StreamingChatCompletionUpdate>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options);
+        void IJsonModel<StreamingChatCompletionUpdate>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+        StreamingChatCompletionUpdate IPersistableModel<StreamingChatCompletionUpdate>.Create(BinaryData data, ModelReaderWriterOptions options);
+        string IPersistableModel<StreamingChatCompletionUpdate>.GetFormatFromOptions(ModelReaderWriterOptions options);
+        BinaryData IPersistableModel<StreamingChatCompletionUpdate>.Write(ModelReaderWriterOptions options);
+    }
+    public class StreamingChatFunctionCallUpdate : IJsonModel<StreamingChatFunctionCallUpdate>, IPersistableModel<StreamingChatFunctionCallUpdate> {
+        public string FunctionArgumentsUpdate { get; }
+        public string FunctionName { get; }
+        StreamingChatFunctionCallUpdate IJsonModel<StreamingChatFunctionCallUpdate>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options);
+        void IJsonModel<StreamingChatFunctionCallUpdate>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+        StreamingChatFunctionCallUpdate IPersistableModel<StreamingChatFunctionCallUpdate>.Create(BinaryData data, ModelReaderWriterOptions options);
+        string IPersistableModel<StreamingChatFunctionCallUpdate>.GetFormatFromOptions(ModelReaderWriterOptions options);
+        BinaryData IPersistableModel<StreamingChatFunctionCallUpdate>.Write(ModelReaderWriterOptions options);
+    }
+    public class StreamingChatToolCallUpdate : IJsonModel<StreamingChatToolCallUpdate>, IPersistableModel<StreamingChatToolCallUpdate> {
+        public string FunctionArgumentsUpdate { get; }
+        public string FunctionName { get; }
+        public string Id { get; }
+        public int Index { get; }
+        public ChatToolCallKind Kind { get; }
+        StreamingChatToolCallUpdate IJsonModel<StreamingChatToolCallUpdate>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options);
+        void IJsonModel<StreamingChatToolCallUpdate>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+        StreamingChatToolCallUpdate IPersistableModel<StreamingChatToolCallUpdate>.Create(BinaryData data, ModelReaderWriterOptions options);
+        string IPersistableModel<StreamingChatToolCallUpdate>.GetFormatFromOptions(ModelReaderWriterOptions options);
+        BinaryData IPersistableModel<StreamingChatToolCallUpdate>.Write(ModelReaderWriterOptions options);
+    }
+    public class SystemChatMessage : ChatMessage, IJsonModel<SystemChatMessage>, IPersistableModel<SystemChatMessage> {
+        public SystemChatMessage(params ChatMessageContentPart[] contentParts);
+        public SystemChatMessage(IEnumerable<ChatMessageContentPart> contentParts);
+        public SystemChatMessage(string content);
+        public string ParticipantName { get; set; }
+        SystemChatMessage IJsonModel<SystemChatMessage>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options);
+        void IJsonModel<SystemChatMessage>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+        SystemChatMessage IPersistableModel<SystemChatMessage>.Create(BinaryData data, ModelReaderWriterOptions options);
+        string IPersistableModel<SystemChatMessage>.GetFormatFromOptions(ModelReaderWriterOptions options);
+        BinaryData IPersistableModel<SystemChatMessage>.Write(ModelReaderWriterOptions options);
+        protected override void WriteCore(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+    }
+    public class ToolChatMessage : ChatMessage, IJsonModel<ToolChatMessage>, IPersistableModel<ToolChatMessage> {
+        public ToolChatMessage(string toolCallId, params ChatMessageContentPart[] contentParts);
+        public ToolChatMessage(string toolCallId, IEnumerable<ChatMessageContentPart> contentParts);
+        public ToolChatMessage(string toolCallId, string content);
+        public string ToolCallId { get; }
+        ToolChatMessage IJsonModel<ToolChatMessage>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options);
+        void IJsonModel<ToolChatMessage>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+        ToolChatMessage IPersistableModel<ToolChatMessage>.Create(BinaryData data, ModelReaderWriterOptions options);
+        string IPersistableModel<ToolChatMessage>.GetFormatFromOptions(ModelReaderWriterOptions options);
+        BinaryData IPersistableModel<ToolChatMessage>.Write(ModelReaderWriterOptions options);
+        protected override void WriteCore(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+    }
+    public class UserChatMessage : ChatMessage, IJsonModel<UserChatMessage>, IPersistableModel<UserChatMessage> {
+        public UserChatMessage(params ChatMessageContentPart[] content);
+        public UserChatMessage(IEnumerable<ChatMessageContentPart> content);
+        public UserChatMessage(string content);
+        public string ParticipantName { get; set; }
+        UserChatMessage IJsonModel<UserChatMessage>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options);
+        void IJsonModel<UserChatMessage>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+        UserChatMessage IPersistableModel<UserChatMessage>.Create(BinaryData data, ModelReaderWriterOptions options);
+        string IPersistableModel<UserChatMessage>.GetFormatFromOptions(ModelReaderWriterOptions options);
+        BinaryData IPersistableModel<UserChatMessage>.Write(ModelReaderWriterOptions options);
+        protected override void WriteCore(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+    }
+}
+namespace OpenAI.Embeddings {
+    public class Embedding : IJsonModel<Embedding>, IPersistableModel<Embedding> {
+        public int Index { get; }
+        public ReadOnlyMemory<float> Vector { get; }
+        Embedding IJsonModel<Embedding>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options);
+        void IJsonModel<Embedding>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+        Embedding IPersistableModel<Embedding>.Create(BinaryData data, ModelReaderWriterOptions options);
+        string IPersistableModel<Embedding>.GetFormatFromOptions(ModelReaderWriterOptions options);
+        BinaryData IPersistableModel<Embedding>.Write(ModelReaderWriterOptions options);
+    }
+    public class EmbeddingClient {
+        protected EmbeddingClient();
+        protected internal EmbeddingClient(ClientPipeline pipeline, string model, OpenAIClientOptions options);
+        public EmbeddingClient(string model, ApiKeyCredential credential, OpenAIClientOptions options);
+        public EmbeddingClient(string model, ApiKeyCredential credential);
+        public virtual ClientPipeline Pipeline { get; }
+        public virtual ClientResult<Embedding> GenerateEmbedding(string input, EmbeddingGenerationOptions options = null, CancellationToken cancellationToken = default);
+        public virtual Task<ClientResult<Embedding>> GenerateEmbeddingAsync(string input, EmbeddingGenerationOptions options = null, CancellationToken cancellationToken = default);
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public virtual ClientResult GenerateEmbeddings(BinaryContent content, RequestOptions options = null);
+        public virtual ClientResult<EmbeddingCollection> GenerateEmbeddings(IEnumerable<IEnumerable<int>> inputs, EmbeddingGenerationOptions options = null, CancellationToken cancellationToken = default);
+        public virtual ClientResult<EmbeddingCollection> GenerateEmbeddings(IEnumerable<string> inputs, EmbeddingGenerationOptions options = null, CancellationToken cancellationToken = default);
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public virtual Task<ClientResult> GenerateEmbeddingsAsync(BinaryContent content, RequestOptions options = null);
+        public virtual Task<ClientResult<EmbeddingCollection>> GenerateEmbeddingsAsync(IEnumerable<IEnumerable<int>> inputs, EmbeddingGenerationOptions options = null, CancellationToken cancellationToken = default);
+        public virtual Task<ClientResult<EmbeddingCollection>> GenerateEmbeddingsAsync(IEnumerable<string> inputs, EmbeddingGenerationOptions options = null, CancellationToken cancellationToken = default);
+    }
+    public class EmbeddingCollection : ObjectModel.ReadOnlyCollection<Embedding>, IJsonModel<EmbeddingCollection>, IPersistableModel<EmbeddingCollection> {
+        public string Model { get; }
+        public EmbeddingTokenUsage Usage { get; }
+        EmbeddingCollection IJsonModel<EmbeddingCollection>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options);
+        void IJsonModel<EmbeddingCollection>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+        EmbeddingCollection IPersistableModel<EmbeddingCollection>.Create(BinaryData data, ModelReaderWriterOptions options);
+        string IPersistableModel<EmbeddingCollection>.GetFormatFromOptions(ModelReaderWriterOptions options);
+        BinaryData IPersistableModel<EmbeddingCollection>.Write(ModelReaderWriterOptions options);
+    }
+    public class EmbeddingGenerationOptions : IJsonModel<EmbeddingGenerationOptions>, IPersistableModel<EmbeddingGenerationOptions> {
+        public int? Dimensions { get; set; }
+        public string EndUserId { get; set; }
+        EmbeddingGenerationOptions IJsonModel<EmbeddingGenerationOptions>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options);
+        void IJsonModel<EmbeddingGenerationOptions>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+        EmbeddingGenerationOptions IPersistableModel<EmbeddingGenerationOptions>.Create(BinaryData data, ModelReaderWriterOptions options);
+        string IPersistableModel<EmbeddingGenerationOptions>.GetFormatFromOptions(ModelReaderWriterOptions options);
+        BinaryData IPersistableModel<EmbeddingGenerationOptions>.Write(ModelReaderWriterOptions options);
+    }
+    public class EmbeddingTokenUsage : IJsonModel<EmbeddingTokenUsage>, IPersistableModel<EmbeddingTokenUsage> {
+        public int InputTokens { get; }
+        public int TotalTokens { get; }
+        EmbeddingTokenUsage IJsonModel<EmbeddingTokenUsage>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options);
+        void IJsonModel<EmbeddingTokenUsage>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+        EmbeddingTokenUsage IPersistableModel<EmbeddingTokenUsage>.Create(BinaryData data, ModelReaderWriterOptions options);
+        string IPersistableModel<EmbeddingTokenUsage>.GetFormatFromOptions(ModelReaderWriterOptions options);
+        BinaryData IPersistableModel<EmbeddingTokenUsage>.Write(ModelReaderWriterOptions options);
+    }
+    public static class OpenAIEmbeddingsModelFactory {
+        public static Embedding Embedding(int index = 0, IEnumerable<float> vector = null);
+        public static EmbeddingCollection EmbeddingCollection(IEnumerable<Embedding> items = null, string model = null, EmbeddingTokenUsage usage = null);
+        public static EmbeddingTokenUsage EmbeddingTokenUsage(int inputTokens = 0, int totalTokens = 0);
+    }
+}
+namespace OpenAI.Files {
+    public class FileClient {
+        protected FileClient();
+        public FileClient(ApiKeyCredential credential, OpenAIClientOptions options);
+        public FileClient(ApiKeyCredential credential);
+        protected internal FileClient(ClientPipeline pipeline, OpenAIClientOptions options);
+        public virtual ClientPipeline Pipeline { get; }
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public virtual ClientResult DeleteFile(string fileId, RequestOptions options);
+        public virtual ClientResult<bool> DeleteFile(string fileId, CancellationToken cancellationToken = default);
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public virtual Task<ClientResult> DeleteFileAsync(string fileId, RequestOptions options);
+        public virtual Task<ClientResult<bool>> DeleteFileAsync(string fileId, CancellationToken cancellationToken = default);
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public virtual ClientResult DownloadFile(string fileId, RequestOptions options);
+        public virtual ClientResult<BinaryData> DownloadFile(string fileId, CancellationToken cancellationToken = default);
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public virtual Task<ClientResult> DownloadFileAsync(string fileId, RequestOptions options);
+        public virtual Task<ClientResult<BinaryData>> DownloadFileAsync(string fileId, CancellationToken cancellationToken = default);
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public virtual ClientResult GetFile(string fileId, RequestOptions options);
+        public virtual ClientResult<OpenAIFileInfo> GetFile(string fileId, CancellationToken cancellationToken = default);
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public virtual Task<ClientResult> GetFileAsync(string fileId, RequestOptions options);
+        public virtual Task<ClientResult<OpenAIFileInfo>> GetFileAsync(string fileId, CancellationToken cancellationToken = default);
+        public virtual ClientResult<OpenAIFileInfoCollection> GetFiles(OpenAIFilePurpose? purpose = null, CancellationToken cancellationToken = default);
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public virtual ClientResult GetFiles(string purpose, RequestOptions options);
+        public virtual Task<ClientResult<OpenAIFileInfoCollection>> GetFilesAsync(OpenAIFilePurpose? purpose = null, CancellationToken cancellationToken = default);
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public virtual Task<ClientResult> GetFilesAsync(string purpose, RequestOptions options);
+        public virtual ClientResult<OpenAIFileInfo> UploadFile(BinaryData file, string filename, FileUploadPurpose purpose);
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public virtual ClientResult UploadFile(BinaryContent content, string contentType, RequestOptions options = null);
+        public virtual ClientResult<OpenAIFileInfo> UploadFile(Stream file, string filename, FileUploadPurpose purpose, CancellationToken cancellationToken = default);
+        public virtual ClientResult<OpenAIFileInfo> UploadFile(string filePath, FileUploadPurpose purpose);
+        public virtual Task<ClientResult<OpenAIFileInfo>> UploadFileAsync(BinaryData file, string filename, FileUploadPurpose purpose);
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public virtual Task<ClientResult> UploadFileAsync(BinaryContent content, string contentType, RequestOptions options = null);
+        public virtual Task<ClientResult<OpenAIFileInfo>> UploadFileAsync(Stream file, string filename, FileUploadPurpose purpose, CancellationToken cancellationToken = default);
+        public virtual Task<ClientResult<OpenAIFileInfo>> UploadFileAsync(string filePath, FileUploadPurpose purpose);
+    }
+    public readonly partial struct FileUploadPurpose : IEquatable<FileUploadPurpose> {
+        private readonly object _dummy;
+        private readonly int _dummyPrimitive;
+        public FileUploadPurpose(string value);
+        public static FileUploadPurpose Assistants { get; }
+        public static FileUploadPurpose Batch { get; }
+        public static FileUploadPurpose FineTune { get; }
+        public static FileUploadPurpose Vision { get; }
+        public readonly bool Equals(FileUploadPurpose other);
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override readonly bool Equals(object obj);
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override readonly int GetHashCode();
+        public static bool operator ==(FileUploadPurpose left, FileUploadPurpose right);
+        public static implicit operator FileUploadPurpose(string value);
+        public static bool operator !=(FileUploadPurpose left, FileUploadPurpose right);
+        public override readonly string ToString();
+    }
+    public class OpenAIFileInfo : IJsonModel<OpenAIFileInfo>, IPersistableModel<OpenAIFileInfo> {
+        public DateTimeOffset CreatedAt { get; }
+        public string Filename { get; }
+        public string Id { get; }
+        public OpenAIFilePurpose Purpose { get; }
+        public int? SizeInBytes { get; }
+        public OpenAIFileStatus Status { get; }
+        public string StatusDetails { get; }
+        OpenAIFileInfo IJsonModel<OpenAIFileInfo>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options);
+        void IJsonModel<OpenAIFileInfo>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+        OpenAIFileInfo IPersistableModel<OpenAIFileInfo>.Create(BinaryData data, ModelReaderWriterOptions options);
+        string IPersistableModel<OpenAIFileInfo>.GetFormatFromOptions(ModelReaderWriterOptions options);
+        BinaryData IPersistableModel<OpenAIFileInfo>.Write(ModelReaderWriterOptions options);
+    }
+    public class OpenAIFileInfoCollection : ObjectModel.ReadOnlyCollection<OpenAIFileInfo>, IJsonModel<OpenAIFileInfoCollection>, IPersistableModel<OpenAIFileInfoCollection> {
+        OpenAIFileInfoCollection IJsonModel<OpenAIFileInfoCollection>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options);
+        void IJsonModel<OpenAIFileInfoCollection>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+        OpenAIFileInfoCollection IPersistableModel<OpenAIFileInfoCollection>.Create(BinaryData data, ModelReaderWriterOptions options);
+        string IPersistableModel<OpenAIFileInfoCollection>.GetFormatFromOptions(ModelReaderWriterOptions options);
+        BinaryData IPersistableModel<OpenAIFileInfoCollection>.Write(ModelReaderWriterOptions options);
+    }
+    public readonly partial struct OpenAIFilePurpose : IEquatable<OpenAIFilePurpose> {
+        private readonly object _dummy;
+        private readonly int _dummyPrimitive;
+        public OpenAIFilePurpose(string value);
+        public static OpenAIFilePurpose Assistants { get; }
+        public static OpenAIFilePurpose AssistantsOutput { get; }
+        public static OpenAIFilePurpose Batch { get; }
+        public static OpenAIFilePurpose BatchOutput { get; }
+        public static OpenAIFilePurpose FineTune { get; }
+        public static OpenAIFilePurpose FineTuneResults { get; }
+        public static OpenAIFilePurpose Vision { get; }
+        public readonly bool Equals(OpenAIFilePurpose other);
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override readonly bool Equals(object obj);
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override readonly int GetHashCode();
+        public static bool operator ==(OpenAIFilePurpose left, OpenAIFilePurpose right);
+        public static implicit operator OpenAIFilePurpose(string value);
+        public static bool operator !=(OpenAIFilePurpose left, OpenAIFilePurpose right);
+        public override readonly string ToString();
+    }
+    public static class OpenAIFilesModelFactory {
+        public static OpenAIFileInfo OpenAIFileInfo(string id = null, int? sizeInBytes = null, DateTimeOffset createdAt = default, string filename = null, OpenAIFilePurpose purpose = default, OpenAIFileStatus status = default, string statusDetails = null);
+        public static OpenAIFileInfoCollection OpenAIFileInfoCollection(IEnumerable<OpenAIFileInfo> items = null);
+    }
+    public readonly partial struct OpenAIFileStatus : IEquatable<OpenAIFileStatus> {
+        private readonly object _dummy;
+        private readonly int _dummyPrimitive;
+        public OpenAIFileStatus(string value);
+        public static OpenAIFileStatus Error { get; }
+        public static OpenAIFileStatus Processed { get; }
+        public static OpenAIFileStatus Uploaded { get; }
+        public readonly bool Equals(OpenAIFileStatus other);
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override readonly bool Equals(object obj);
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override readonly int GetHashCode();
+        public static bool operator ==(OpenAIFileStatus left, OpenAIFileStatus right);
+        public static implicit operator OpenAIFileStatus(string value);
+        public static bool operator !=(OpenAIFileStatus left, OpenAIFileStatus right);
+        public override readonly string ToString();
+    }
+}
+namespace OpenAI.FineTuning {
+    public class FineTuningClient {
+        protected FineTuningClient();
+        public FineTuningClient(ApiKeyCredential credential, OpenAIClientOptions options);
+        public
FineTuningClient(ApiKeyCredential credential); + protected internal FineTuningClient(ClientPipeline pipeline, OpenAIClientOptions options); + public virtual ClientPipeline Pipeline { get; } + public virtual ClientResult CancelJob(string jobId, RequestOptions options); + public virtual Task CancelJobAsync(string jobId, RequestOptions options); + public virtual ClientResult CreateJob(BinaryContent content, RequestOptions options = null); + public virtual Task CreateJobAsync(BinaryContent content, RequestOptions options = null); + public virtual ClientResult GetJob(string jobId, RequestOptions options); + public virtual Task GetJobAsync(string jobId, RequestOptions options); + public virtual IEnumerable GetJobCheckpoints(string jobId, string after, int? limit, RequestOptions options); + public virtual IAsyncEnumerable GetJobCheckpointsAsync(string jobId, string after, int? limit, RequestOptions options); + public virtual IEnumerable GetJobEvents(string jobId, string after, int? limit, RequestOptions options); + public virtual IAsyncEnumerable GetJobEventsAsync(string jobId, string after, int? limit, RequestOptions options); + public virtual IEnumerable GetJobs(string after, int? limit, RequestOptions options); + public virtual IAsyncEnumerable GetJobsAsync(string after, int? 
limit, RequestOptions options); + } +} +namespace OpenAI.Images { + public class GeneratedImage : IJsonModel, IPersistableModel { + public BinaryData ImageBytes { get; } + public Uri ImageUri { get; } + public string RevisedPrompt { get; } + GeneratedImage IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options); + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options); + GeneratedImage IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options); + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options); + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options); + } + public class GeneratedImageCollection : ObjectModel.ReadOnlyCollection, IJsonModel, IPersistableModel { + public DateTimeOffset Created { get; } + public DateTimeOffset CreatedAt { get; } + GeneratedImageCollection IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options); + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options); + GeneratedImageCollection IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options); + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options); + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options); + } + public enum GeneratedImageFormat { + Bytes = 0, + Uri = 1 + } + public enum GeneratedImageQuality { + High = 0, + Standard = 1 + } + public readonly partial struct GeneratedImageSize : IEquatable { + private readonly object _dummy; + private readonly int _dummyPrimitive; + public static readonly GeneratedImageSize W1024xH1024; + public static readonly GeneratedImageSize W1024xH1792; + public static readonly GeneratedImageSize W1792xH1024; + public static readonly GeneratedImageSize W256xH256; + public static readonly GeneratedImageSize W512xH512; + public GeneratedImageSize(int width, int height); + public readonly bool Equals(GeneratedImageSize other); + 
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override readonly bool Equals(object obj);
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override readonly int GetHashCode();
+        public static bool operator ==(GeneratedImageSize left, GeneratedImageSize right);
+        public static bool operator !=(GeneratedImageSize left, GeneratedImageSize right);
+        public override readonly string ToString();
+    }
+    public enum GeneratedImageStyle {
+        Vivid = 0,
+        Natural = 1
+    }
+    public class ImageClient {
+        protected ImageClient();
+        protected internal ImageClient(ClientPipeline pipeline, string model, OpenAIClientOptions options);
+        public ImageClient(string model, ApiKeyCredential credential, OpenAIClientOptions options);
+        public ImageClient(string model, ApiKeyCredential credential);
+        public virtual ClientPipeline Pipeline { get; }
+        public virtual ClientResult<GeneratedImage> GenerateImage(string prompt, ImageGenerationOptions options = null, CancellationToken cancellationToken = default);
+        public virtual Task<ClientResult<GeneratedImage>> GenerateImageAsync(string prompt, ImageGenerationOptions options = null, CancellationToken cancellationToken = default);
+        public virtual ClientResult<GeneratedImage> GenerateImageEdit(Stream image, string imageFilename, string prompt, ImageEditOptions options = null, CancellationToken cancellationToken = default);
+        public virtual ClientResult<GeneratedImage> GenerateImageEdit(Stream image, string imageFilename, string prompt, Stream mask, string maskFilename, ImageEditOptions options = null, CancellationToken cancellationToken = default);
+        public virtual ClientResult<GeneratedImage> GenerateImageEdit(string imageFilePath, string prompt, ImageEditOptions options = null);
+        public virtual ClientResult<GeneratedImage> GenerateImageEdit(string imageFilePath, string prompt, string maskFilePath, ImageEditOptions options = null);
+        public virtual Task<ClientResult<GeneratedImage>> GenerateImageEditAsync(Stream image, string imageFilename, string prompt, ImageEditOptions options = null, CancellationToken cancellationToken = default);
+        public virtual Task<ClientResult<GeneratedImage>> GenerateImageEditAsync(Stream image, string imageFilename, string prompt, Stream mask, string maskFilename, ImageEditOptions options = null, CancellationToken cancellationToken = default);
+        public virtual Task<ClientResult<GeneratedImage>> GenerateImageEditAsync(string imageFilePath, string prompt, ImageEditOptions options = null);
+        public virtual Task<ClientResult<GeneratedImage>> GenerateImageEditAsync(string imageFilePath, string prompt, string maskFilePath, ImageEditOptions options = null);
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public virtual ClientResult GenerateImageEdits(BinaryContent content, string contentType, RequestOptions options = null);
+        public virtual ClientResult<GeneratedImageCollection> GenerateImageEdits(Stream image, string imageFilename, string prompt, int imageCount, ImageEditOptions options = null, CancellationToken cancellationToken = default);
+        public virtual ClientResult<GeneratedImageCollection> GenerateImageEdits(Stream image, string imageFilename, string prompt, Stream mask, string maskFilename, int imageCount, ImageEditOptions options = null, CancellationToken cancellationToken = default);
+        public virtual ClientResult<GeneratedImageCollection> GenerateImageEdits(string imageFilePath, string prompt, int imageCount, ImageEditOptions options = null);
+        public virtual ClientResult<GeneratedImageCollection> GenerateImageEdits(string imageFilePath, string prompt, string maskFilePath, int imageCount, ImageEditOptions options = null);
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public virtual Task<ClientResult> GenerateImageEditsAsync(BinaryContent content, string contentType, RequestOptions options = null);
+        public virtual Task<ClientResult<GeneratedImageCollection>> GenerateImageEditsAsync(Stream image, string imageFilename, string prompt, int imageCount, ImageEditOptions options = null, CancellationToken cancellationToken = default);
+        public virtual Task<ClientResult<GeneratedImageCollection>> GenerateImageEditsAsync(Stream image, string imageFilename, string prompt, Stream mask, string maskFilename, int imageCount, ImageEditOptions options = null, CancellationToken cancellationToken = default);
+        public virtual Task<ClientResult<GeneratedImageCollection>> GenerateImageEditsAsync(string imageFilePath, string prompt, int imageCount, ImageEditOptions options = null);
+        public virtual Task<ClientResult<GeneratedImageCollection>> GenerateImageEditsAsync(string imageFilePath, string prompt, string maskFilePath, int imageCount, ImageEditOptions options = null);
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public virtual ClientResult GenerateImages(BinaryContent content, RequestOptions options = null);
+        public virtual ClientResult<GeneratedImageCollection> GenerateImages(string prompt, int imageCount, ImageGenerationOptions options = null, CancellationToken cancellationToken = default);
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public virtual Task<ClientResult> GenerateImagesAsync(BinaryContent content, RequestOptions options = null);
+        public virtual Task<ClientResult<GeneratedImageCollection>> GenerateImagesAsync(string prompt, int imageCount, ImageGenerationOptions options = null, CancellationToken cancellationToken = default);
+        public virtual ClientResult<GeneratedImage> GenerateImageVariation(Stream image, string imageFilename, ImageVariationOptions options = null, CancellationToken cancellationToken = default);
+        public virtual ClientResult<GeneratedImage> GenerateImageVariation(string imageFilePath, ImageVariationOptions options = null);
+        public virtual Task<ClientResult<GeneratedImage>> GenerateImageVariationAsync(Stream image, string imageFilename, ImageVariationOptions options = null, CancellationToken cancellationToken = default);
+        public virtual Task<ClientResult<GeneratedImage>> GenerateImageVariationAsync(string imageFilePath, ImageVariationOptions options = null);
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public virtual ClientResult GenerateImageVariations(BinaryContent content, string contentType, RequestOptions options = null);
+        public virtual ClientResult<GeneratedImageCollection> GenerateImageVariations(Stream image, string imageFilename, int imageCount, ImageVariationOptions options = null, CancellationToken cancellationToken = default);
+        public virtual ClientResult<GeneratedImageCollection> GenerateImageVariations(string imageFilePath, int imageCount, ImageVariationOptions options = null);
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public virtual Task<ClientResult> GenerateImageVariationsAsync(BinaryContent content, string contentType, RequestOptions options = null);
+        public virtual Task<ClientResult<GeneratedImageCollection>> GenerateImageVariationsAsync(Stream image, string imageFilename, int imageCount, ImageVariationOptions options = null, CancellationToken cancellationToken = default);
+        public virtual Task<ClientResult<GeneratedImageCollection>> GenerateImageVariationsAsync(string imageFilePath, int imageCount, ImageVariationOptions options = null);
+    }
+    public class ImageEditOptions : IJsonModel<ImageEditOptions>, IPersistableModel<ImageEditOptions> {
+        public string EndUserId { get; set; }
+        public GeneratedImageFormat? ResponseFormat { get; set; }
+        public GeneratedImageSize? Size { get; set; }
+        ImageEditOptions IJsonModel<ImageEditOptions>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options);
+        void IJsonModel<ImageEditOptions>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+        ImageEditOptions IPersistableModel<ImageEditOptions>.Create(BinaryData data, ModelReaderWriterOptions options);
+        string IPersistableModel<ImageEditOptions>.GetFormatFromOptions(ModelReaderWriterOptions options);
+        BinaryData IPersistableModel<ImageEditOptions>.Write(ModelReaderWriterOptions options);
+    }
+    public class ImageGenerationOptions : IJsonModel<ImageGenerationOptions>, IPersistableModel<ImageGenerationOptions> {
+        public string EndUserId { get; set; }
+        public GeneratedImageQuality? Quality { get; set; }
+        public GeneratedImageFormat? ResponseFormat { get; set; }
+        public GeneratedImageSize? Size { get; set; }
+        public GeneratedImageStyle? Style { get; set; }
+        ImageGenerationOptions IJsonModel<ImageGenerationOptions>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options);
+        void IJsonModel<ImageGenerationOptions>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+        ImageGenerationOptions IPersistableModel<ImageGenerationOptions>.Create(BinaryData data, ModelReaderWriterOptions options);
+        string IPersistableModel<ImageGenerationOptions>.GetFormatFromOptions(ModelReaderWriterOptions options);
+        BinaryData IPersistableModel<ImageGenerationOptions>.Write(ModelReaderWriterOptions options);
+    }
+    public class ImageVariationOptions : IJsonModel<ImageVariationOptions>, IPersistableModel<ImageVariationOptions> {
+        public string EndUserId { get; set; }
+        public GeneratedImageFormat? ResponseFormat { get; set; }
+        public GeneratedImageSize? Size { get; set; }
+        ImageVariationOptions IJsonModel<ImageVariationOptions>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options);
+        void IJsonModel<ImageVariationOptions>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+        ImageVariationOptions IPersistableModel<ImageVariationOptions>.Create(BinaryData data, ModelReaderWriterOptions options);
+        string IPersistableModel<ImageVariationOptions>.GetFormatFromOptions(ModelReaderWriterOptions options);
+        BinaryData IPersistableModel<ImageVariationOptions>.Write(ModelReaderWriterOptions options);
+    }
+    public static class OpenAIImagesModelFactory {
+        public static GeneratedImage GeneratedImage(BinaryData imageBytes = null, Uri imageUri = null, string revisedPrompt = null);
+        public static GeneratedImageCollection GeneratedImageCollection(DateTimeOffset createdAt = default, IEnumerable<GeneratedImage> items = null);
+    }
+}
+namespace OpenAI.Models {
+    public class ModelClient {
+        protected ModelClient();
+        public ModelClient(ApiKeyCredential credential, OpenAIClientOptions options);
+        public ModelClient(ApiKeyCredential credential);
+        protected internal ModelClient(ClientPipeline pipeline, OpenAIClientOptions options);
+        public virtual ClientPipeline Pipeline { get; }
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public virtual ClientResult DeleteModel(string model, RequestOptions options);
+        public virtual ClientResult<bool> DeleteModel(string model);
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public virtual Task<ClientResult> DeleteModelAsync(string model, RequestOptions options);
+        public virtual Task<ClientResult<bool>> DeleteModelAsync(string model);
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public virtual ClientResult GetModel(string model, RequestOptions options);
+        public virtual ClientResult<OpenAIModelInfo> GetModel(string model);
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public virtual Task<ClientResult> GetModelAsync(string model, RequestOptions options);
+        public virtual Task<ClientResult<OpenAIModelInfo>> GetModelAsync(string model);
+        public virtual ClientResult<OpenAIModelInfoCollection> GetModels();
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public virtual ClientResult GetModels(RequestOptions options);
+        public virtual Task<ClientResult<OpenAIModelInfoCollection>> GetModelsAsync();
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public virtual Task<ClientResult> GetModelsAsync(RequestOptions options);
+    }
+    public class OpenAIModelInfo : IJsonModel<OpenAIModelInfo>, IPersistableModel<OpenAIModelInfo> {
+        public DateTimeOffset CreatedAt { get; }
+        public string Id { get; }
+        public string OwnedBy { get; }
+        OpenAIModelInfo IJsonModel<OpenAIModelInfo>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options);
+        void IJsonModel<OpenAIModelInfo>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+        OpenAIModelInfo IPersistableModel<OpenAIModelInfo>.Create(BinaryData data, ModelReaderWriterOptions options);
+        string IPersistableModel<OpenAIModelInfo>.GetFormatFromOptions(ModelReaderWriterOptions options);
+        BinaryData IPersistableModel<OpenAIModelInfo>.Write(ModelReaderWriterOptions options);
+    }
+    public class OpenAIModelInfoCollection : ObjectModel.ReadOnlyCollection<OpenAIModelInfo>, IJsonModel<OpenAIModelInfoCollection>, IPersistableModel<OpenAIModelInfoCollection> {
+        OpenAIModelInfoCollection IJsonModel<OpenAIModelInfoCollection>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options);
+        void IJsonModel<OpenAIModelInfoCollection>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+        OpenAIModelInfoCollection IPersistableModel<OpenAIModelInfoCollection>.Create(BinaryData data, ModelReaderWriterOptions options);
+        string IPersistableModel<OpenAIModelInfoCollection>.GetFormatFromOptions(ModelReaderWriterOptions options);
+        BinaryData IPersistableModel<OpenAIModelInfoCollection>.Write(ModelReaderWriterOptions options);
+    }
+    public static class OpenAIModelsModelFactory {
+        public static OpenAIModelInfo OpenAIModelInfo(string id = null, DateTimeOffset createdAt = default, string ownedBy = null);
+        public static OpenAIModelInfoCollection OpenAIModelInfoCollection(IEnumerable<OpenAIModelInfo> items = null);
+    }
+}
+namespace OpenAI.Moderations {
+    public class ModerationCategories : IJsonModel<ModerationCategories>, IPersistableModel<ModerationCategories> {
+        public bool Harassment { get; }
+        public bool HarassmentThreatening { get; }
+        public bool Hate { get; }
+        public bool HateThreatening { get; }
+        public bool SelfHarm { get; }
+        public bool SelfHarmInstructions { get; }
+        public bool SelfHarmIntent { get; }
+        public bool Sexual { get; }
+        public bool SexualMinors { get; }
+        public bool Violence { get; }
+        public bool ViolenceGraphic { get; }
+        ModerationCategories IJsonModel<ModerationCategories>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options);
+        void IJsonModel<ModerationCategories>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+        ModerationCategories IPersistableModel<ModerationCategories>.Create(BinaryData data, ModelReaderWriterOptions options);
+        string IPersistableModel<ModerationCategories>.GetFormatFromOptions(ModelReaderWriterOptions options);
+        BinaryData IPersistableModel<ModerationCategories>.Write(ModelReaderWriterOptions options);
+    }
+    public class ModerationCategoryScores : IJsonModel<ModerationCategoryScores>, IPersistableModel<ModerationCategoryScores> {
+        public float Harassment { get; }
+        public float HarassmentThreatening { get; }
+        public float Hate { get; }
+        public float HateThreatening { get; }
+        public float SelfHarm { get; }
+        public float SelfHarmInstructions { get; }
+        public float SelfHarmIntent { get; }
+        public float Sexual { get; }
+        public float SexualMinors { get; }
+        public float Violence { get; }
+        public float ViolenceGraphic { get; }
+        ModerationCategoryScores IJsonModel<ModerationCategoryScores>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options);
+        void IJsonModel<ModerationCategoryScores>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+        ModerationCategoryScores IPersistableModel<ModerationCategoryScores>.Create(BinaryData data, ModelReaderWriterOptions options);
+        string IPersistableModel<ModerationCategoryScores>.GetFormatFromOptions(ModelReaderWriterOptions options);
+        BinaryData IPersistableModel<ModerationCategoryScores>.Write(ModelReaderWriterOptions options);
+    }
+    public class ModerationClient {
+        protected ModerationClient();
+        protected internal ModerationClient(ClientPipeline pipeline, string model, OpenAIClientOptions options);
+        public ModerationClient(string model, ApiKeyCredential credential, OpenAIClientOptions options);
+        public ModerationClient(string model, ApiKeyCredential credential);
+        public virtual ClientPipeline Pipeline { get; }
+        public virtual ClientResult<ModerationResult> ClassifyTextInput(string input, CancellationToken cancellationToken = default);
+        public virtual Task<ClientResult<ModerationResult>> ClassifyTextInputAsync(string input, CancellationToken cancellationToken = default);
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public virtual ClientResult ClassifyTextInputs(BinaryContent content, RequestOptions options = null);
+        public virtual ClientResult<ModerationCollection> ClassifyTextInputs(IEnumerable<string> inputs, CancellationToken cancellationToken = default);
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public virtual Task<ClientResult> ClassifyTextInputsAsync(BinaryContent content, RequestOptions options = null);
+        public virtual Task<ClientResult<ModerationCollection>> ClassifyTextInputsAsync(IEnumerable<string> inputs, CancellationToken cancellationToken = default);
+    }
+    public class ModerationCollection : ObjectModel.ReadOnlyCollection<ModerationResult>, IJsonModel<ModerationCollection>, IPersistableModel<ModerationCollection> {
+        public string Id { get; }
+        public string Model { get; }
+        ModerationCollection IJsonModel<ModerationCollection>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options);
+        void IJsonModel<ModerationCollection>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+        ModerationCollection IPersistableModel<ModerationCollection>.Create(BinaryData data, ModelReaderWriterOptions options);
+        string IPersistableModel<ModerationCollection>.GetFormatFromOptions(ModelReaderWriterOptions options);
+        BinaryData IPersistableModel<ModerationCollection>.Write(ModelReaderWriterOptions options);
+    }
+    public class ModerationResult : IJsonModel<ModerationResult>, IPersistableModel<ModerationResult> {
+        public ModerationCategories Categories { get; }
+        public ModerationCategoryScores CategoryScores { get; }
+        public bool Flagged { get; }
+        ModerationResult IJsonModel<ModerationResult>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options);
+        void IJsonModel<ModerationResult>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+        ModerationResult IPersistableModel<ModerationResult>.Create(BinaryData data, ModelReaderWriterOptions options);
+        string IPersistableModel<ModerationResult>.GetFormatFromOptions(ModelReaderWriterOptions options);
+        BinaryData IPersistableModel<ModerationResult>.Write(ModelReaderWriterOptions options);
+    }
+    public static class OpenAIModerationsModelFactory {
public static ModerationCategories ModerationCategories(bool hate = false, bool hateThreatening = false, bool harassment = false, bool harassmentThreatening = false, bool selfHarm = false, bool selfHarmIntent = false, bool selfHarmInstructions = false, bool sexual = false, bool sexualMinors = false, bool violence = false, bool violenceGraphic = false); + public static ModerationCategoryScores ModerationCategoryScores(float hate = 0, float hateThreatening = 0, float harassment = 0, float harassmentThreatening = 0, float selfHarm = 0, float selfHarmIntent = 0, float selfHarmInstructions = 0, float sexual = 0, float sexualMinors = 0, float violence = 0, float violenceGraphic = 0); + public static ModerationCollection ModerationCollection(string id = null, string model = null, IEnumerable items = null); + public static ModerationResult ModerationResult(bool flagged = false, ModerationCategories categories = null, ModerationCategoryScores categoryScores = null); + } +} +namespace OpenAI.VectorStores { + public abstract class FileChunkingStrategy : IJsonModel, IPersistableModel { + public static FileChunkingStrategy Auto { get; } + public static FileChunkingStrategy Unknown { get; } + public static FileChunkingStrategy CreateStaticStrategy(int maxTokensPerChunk, int overlappingTokenCount); + FileChunkingStrategy IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options); + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options); + FileChunkingStrategy IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options); + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options); + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options); + } + public class StaticFileChunkingStrategy : FileChunkingStrategy, IJsonModel, IPersistableModel { + public StaticFileChunkingStrategy(int maxTokensPerChunk, int overlappingTokenCount); + public int MaxTokensPerChunk { get; } + public int 
OverlappingTokenCount { get; } + StaticFileChunkingStrategy IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options); + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options); + StaticFileChunkingStrategy IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options); + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options); + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options); + } + public class VectorStore : IJsonModel, IPersistableModel { + public DateTimeOffset CreatedAt { get; } + public VectorStoreExpirationPolicy ExpirationPolicy { get; } + public DateTimeOffset? ExpiresAt { get; } + public VectorStoreFileCounts FileCounts { get; } + public string Id { get; } + public DateTimeOffset? LastActiveAt { get; } + public IReadOnlyDictionary Metadata { get; } + public string Name { get; } + public VectorStoreStatus Status { get; } + public int UsageBytes { get; } + VectorStore IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options); + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options); + VectorStore IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options); + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options); + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options); + } + public class VectorStoreBatchFileJob : IJsonModel, IPersistableModel { + public string BatchId { get; } + public DateTimeOffset CreatedAt { get; } + public VectorStoreFileCounts FileCounts { get; } + public VectorStoreBatchFileJobStatus Status { get; } + public string VectorStoreId { get; } + VectorStoreBatchFileJob IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options); + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options); + VectorStoreBatchFileJob IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options); + string 
IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options); + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options); + } + public readonly partial struct VectorStoreBatchFileJobStatus : IEquatable { + private readonly object _dummy; + private readonly int _dummyPrimitive; + public VectorStoreBatchFileJobStatus(string value); + public static VectorStoreBatchFileJobStatus Cancelled { get; } + public static VectorStoreBatchFileJobStatus Completed { get; } + public static VectorStoreBatchFileJobStatus Failed { get; } + public static VectorStoreBatchFileJobStatus InProgress { get; } + public readonly bool Equals(VectorStoreBatchFileJobStatus other); + [EditorBrowsable(EditorBrowsableState.Never)] + public override readonly bool Equals(object obj); + [EditorBrowsable(EditorBrowsableState.Never)] + public override readonly int GetHashCode(); + public static bool operator ==(VectorStoreBatchFileJobStatus left, VectorStoreBatchFileJobStatus right); + public static implicit operator VectorStoreBatchFileJobStatus(string value); + public static bool operator !=(VectorStoreBatchFileJobStatus left, VectorStoreBatchFileJobStatus right); + public override readonly string ToString(); + } + public class VectorStoreClient { + protected VectorStoreClient(); + public VectorStoreClient(ApiKeyCredential credential, OpenAIClientOptions options); + public VectorStoreClient(ApiKeyCredential credential); + protected internal VectorStoreClient(ClientPipeline pipeline, OpenAIClientOptions options); + public virtual ClientPipeline Pipeline { get; } + public virtual ClientResult AddFileToVectorStore(VectorStore vectorStore, OpenAIFileInfo file); + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual ClientResult AddFileToVectorStore(string vectorStoreId, BinaryContent content, RequestOptions options = null); + public virtual ClientResult AddFileToVectorStore(string vectorStoreId, string fileId, CancellationToken cancellationToken = default); + public 
virtual Task> AddFileToVectorStoreAsync(VectorStore vectorStore, OpenAIFileInfo file); + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual Task AddFileToVectorStoreAsync(string vectorStoreId, BinaryContent content, RequestOptions options = null); + public virtual Task> AddFileToVectorStoreAsync(string vectorStoreId, string fileId, CancellationToken cancellationToken = default); + public virtual ClientResult CancelBatchFileJob(VectorStoreBatchFileJob batchJob); + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual ClientResult CancelBatchFileJob(string vectorStoreId, string batchId, RequestOptions options); + public virtual ClientResult CancelBatchFileJob(string vectorStoreId, string batchJobId, CancellationToken cancellationToken = default); + public virtual Task> CancelBatchFileJobAsync(VectorStoreBatchFileJob batchJob); + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual Task CancelBatchFileJobAsync(string vectorStoreId, string batchId, RequestOptions options); + public virtual Task> CancelBatchFileJobAsync(string vectorStoreId, string batchJobId, CancellationToken cancellationToken = default); + public virtual ClientResult CreateBatchFileJob(VectorStore vectorStore, IEnumerable files); + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual ClientResult CreateBatchFileJob(string vectorStoreId, BinaryContent content, RequestOptions options = null); + public virtual ClientResult CreateBatchFileJob(string vectorStoreId, IEnumerable fileIds, CancellationToken cancellationToken = default); + public virtual Task> CreateBatchFileJobAsync(VectorStore vectorStore, IEnumerable files); + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual Task CreateBatchFileJobAsync(string vectorStoreId, BinaryContent content, RequestOptions options = null); + public virtual Task> CreateBatchFileJobAsync(string vectorStoreId, IEnumerable fileIds, CancellationToken cancellationToken = default); + public virtual ClientResult 
CreateVectorStore(VectorStoreCreationOptions vectorStore = null, CancellationToken cancellationToken = default); + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual ClientResult CreateVectorStore(BinaryContent content, RequestOptions options = null); + public virtual Task> CreateVectorStoreAsync(VectorStoreCreationOptions vectorStore = null, CancellationToken cancellationToken = default); + public virtual Task CreateVectorStoreAsync(BinaryContent content, RequestOptions options = null); + public virtual ClientResult DeleteVectorStore(VectorStore vectorStore); + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual ClientResult DeleteVectorStore(string vectorStoreId, RequestOptions options); + public virtual ClientResult DeleteVectorStore(string vectorStoreId, CancellationToken cancellationToken = default); + public virtual Task> DeleteVectorStoreAsync(VectorStore vectorStore); + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual Task DeleteVectorStoreAsync(string vectorStoreId, RequestOptions options); + public virtual Task> DeleteVectorStoreAsync(string vectorStoreId, CancellationToken cancellationToken = default); + public virtual ClientResult GetBatchFileJob(VectorStoreBatchFileJob batchJob); + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual ClientResult GetBatchFileJob(string vectorStoreId, string batchId, RequestOptions options); + public virtual ClientResult GetBatchFileJob(string vectorStoreId, string batchJobId, CancellationToken cancellationToken = default); + public virtual Task> GetBatchFileJobAsync(VectorStoreBatchFileJob batchJob); + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual Task GetBatchFileJobAsync(string vectorStoreId, string batchId, RequestOptions options); + public virtual Task> GetBatchFileJobAsync(string vectorStoreId, string batchJobId, CancellationToken cancellationToken = default); + public virtual ClientResult GetFileAssociation(VectorStore vectorStore, OpenAIFileInfo 
file); + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual ClientResult GetFileAssociation(string vectorStoreId, string fileId, RequestOptions options); + public virtual ClientResult<VectorStoreFileAssociation> GetFileAssociation(string vectorStoreId, string fileId, CancellationToken cancellationToken = default); + public virtual Task<ClientResult<VectorStoreFileAssociation>> GetFileAssociationAsync(VectorStore vectorStore, OpenAIFileInfo file); + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual Task<ClientResult> GetFileAssociationAsync(string vectorStoreId, string fileId, RequestOptions options); + public virtual Task<ClientResult<VectorStoreFileAssociation>> GetFileAssociationAsync(string vectorStoreId, string fileId, CancellationToken cancellationToken = default); + public virtual PageCollection<VectorStoreFileAssociation> GetFileAssociations(VectorStore vectorStore, VectorStoreFileAssociationCollectionOptions options = null); + public virtual PageCollection<VectorStoreFileAssociation> GetFileAssociations(VectorStoreBatchFileJob batchJob, VectorStoreFileAssociationCollectionOptions options = null); + public virtual PageCollection<VectorStoreFileAssociation> GetFileAssociations(ContinuationToken firstPageToken, CancellationToken cancellationToken = default); + public virtual PageCollection<VectorStoreFileAssociation> GetFileAssociations(string vectorStoreId, VectorStoreFileAssociationCollectionOptions options = null, CancellationToken cancellationToken = default); + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual IEnumerable<ClientResult> GetFileAssociations(string vectorStoreId, int? 
limit, string order, string after, string before, string filter, RequestOptions options); + public virtual PageCollection<VectorStoreFileAssociation> GetFileAssociations(string vectorStoreId, string batchJobId, VectorStoreFileAssociationCollectionOptions options = null, CancellationToken cancellationToken = default); + public virtual PageCollection<VectorStoreFileAssociation> GetFileAssociations(string vectorStoreId, string batchJobId, ContinuationToken firstPageToken, CancellationToken cancellationToken = default); + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual IEnumerable<ClientResult> GetFileAssociations(string vectorStoreId, string batchId, int? limit, string order, string after, string before, string filter, RequestOptions options); + public virtual AsyncPageCollection<VectorStoreFileAssociation> GetFileAssociationsAsync(VectorStore vectorStore, VectorStoreFileAssociationCollectionOptions options = null); + public virtual AsyncPageCollection<VectorStoreFileAssociation> GetFileAssociationsAsync(VectorStoreBatchFileJob batchJob, VectorStoreFileAssociationCollectionOptions options = null); + public virtual AsyncPageCollection<VectorStoreFileAssociation> GetFileAssociationsAsync(ContinuationToken firstPageToken, CancellationToken cancellationToken = default); + public virtual AsyncPageCollection<VectorStoreFileAssociation> GetFileAssociationsAsync(string vectorStoreId, VectorStoreFileAssociationCollectionOptions options = null, CancellationToken cancellationToken = default); + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual IAsyncEnumerable<ClientResult> GetFileAssociationsAsync(string vectorStoreId, int? 
limit, string order, string after, string before, string filter, RequestOptions options); + public virtual AsyncPageCollection<VectorStoreFileAssociation> GetFileAssociationsAsync(string vectorStoreId, string batchJobId, VectorStoreFileAssociationCollectionOptions options = null, CancellationToken cancellationToken = default); + public virtual AsyncPageCollection<VectorStoreFileAssociation> GetFileAssociationsAsync(string vectorStoreId, string batchJobId, ContinuationToken firstPageToken, CancellationToken cancellationToken = default); + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual IAsyncEnumerable<ClientResult> GetFileAssociationsAsync(string vectorStoreId, string batchId, int? limit, string order, string after, string before, string filter, RequestOptions options); + public virtual ClientResult<VectorStore> GetVectorStore(VectorStore vectorStore); + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual ClientResult GetVectorStore(string vectorStoreId, RequestOptions options); + public virtual ClientResult<VectorStore> GetVectorStore(string vectorStoreId, CancellationToken cancellationToken = default); + public virtual Task<ClientResult<VectorStore>> GetVectorStoreAsync(VectorStore vectorStore); + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual Task<ClientResult> GetVectorStoreAsync(string vectorStoreId, RequestOptions options); + public virtual Task<ClientResult<VectorStore>> GetVectorStoreAsync(string vectorStoreId, CancellationToken cancellationToken = default); + public virtual PageCollection<VectorStore> GetVectorStores(VectorStoreCollectionOptions options = null, CancellationToken cancellationToken = default); + public virtual PageCollection<VectorStore> GetVectorStores(ContinuationToken firstPageToken, CancellationToken cancellationToken = default); + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual IEnumerable<ClientResult> GetVectorStores(int? 
limit, string order, string after, string before, RequestOptions options); + public virtual AsyncPageCollection<VectorStore> GetVectorStoresAsync(VectorStoreCollectionOptions options = null, CancellationToken cancellationToken = default); + public virtual AsyncPageCollection<VectorStore> GetVectorStoresAsync(ContinuationToken firstPageToken, CancellationToken cancellationToken = default); + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual IAsyncEnumerable<ClientResult> GetVectorStoresAsync(int? limit, string order, string after, string before, RequestOptions options); + public virtual ClientResult<VectorStore> ModifyVectorStore(VectorStore vectorStore, VectorStoreModificationOptions options); + public virtual ClientResult<VectorStore> ModifyVectorStore(string vectorStoreId, VectorStoreModificationOptions vectorStore, CancellationToken cancellationToken = default); + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual ClientResult ModifyVectorStore(string vectorStoreId, BinaryContent content, RequestOptions options = null); + public virtual Task<ClientResult<VectorStore>> ModifyVectorStoreAsync(VectorStore vectorStore, VectorStoreModificationOptions options); + public virtual Task<ClientResult<VectorStore>> ModifyVectorStoreAsync(string vectorStoreId, VectorStoreModificationOptions vectorStore, CancellationToken cancellationToken = default); + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual Task<ClientResult> ModifyVectorStoreAsync(string vectorStoreId, BinaryContent content, RequestOptions options = null); + public virtual ClientResult<bool> RemoveFileFromStore(VectorStore vectorStore, OpenAIFileInfo file); + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual ClientResult RemoveFileFromStore(string vectorStoreId, string fileId, RequestOptions options); + public virtual ClientResult<bool> RemoveFileFromStore(string vectorStoreId, string fileId, CancellationToken cancellationToken = default); + public virtual Task<ClientResult<bool>> RemoveFileFromStoreAsync(VectorStore vectorStore, OpenAIFileInfo file); + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual Task<ClientResult> 
RemoveFileFromStoreAsync(string vectorStoreId, string fileId, RequestOptions options); + public virtual Task<ClientResult<bool>> RemoveFileFromStoreAsync(string vectorStoreId, string fileId, CancellationToken cancellationToken = default); + } + public class VectorStoreCollectionOptions { + public string AfterId { get; set; } + public string BeforeId { get; set; } + public ListOrder? Order { get; set; } + public int? PageSize { get; set; } + } + public class VectorStoreCreationOptions : IJsonModel<VectorStoreCreationOptions>, IPersistableModel<VectorStoreCreationOptions> { + public FileChunkingStrategy ChunkingStrategy { get; set; } + public VectorStoreExpirationPolicy ExpirationPolicy { get; set; } + public IList<string> FileIds { get; set; } + public IDictionary<string, string> Metadata { get; set; } + public string Name { get; set; } + VectorStoreCreationOptions IJsonModel<VectorStoreCreationOptions>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options); + void IJsonModel<VectorStoreCreationOptions>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options); + VectorStoreCreationOptions IPersistableModel<VectorStoreCreationOptions>.Create(BinaryData data, ModelReaderWriterOptions options); + string IPersistableModel<VectorStoreCreationOptions>.GetFormatFromOptions(ModelReaderWriterOptions options); + BinaryData IPersistableModel<VectorStoreCreationOptions>.Write(ModelReaderWriterOptions options); + } + public enum VectorStoreExpirationAnchor { + Unknown = 0, + LastActiveAt = 1 + } + public class VectorStoreExpirationPolicy : IJsonModel<VectorStoreExpirationPolicy>, IPersistableModel<VectorStoreExpirationPolicy> { + public VectorStoreExpirationPolicy(); + public VectorStoreExpirationPolicy(VectorStoreExpirationAnchor anchor, int days); + public required VectorStoreExpirationAnchor Anchor { get; set; } + public required int Days { get; set; } + VectorStoreExpirationPolicy IJsonModel<VectorStoreExpirationPolicy>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options); + void IJsonModel<VectorStoreExpirationPolicy>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options); + VectorStoreExpirationPolicy IPersistableModel<VectorStoreExpirationPolicy>.Create(BinaryData data, ModelReaderWriterOptions options); + string IPersistableModel<VectorStoreExpirationPolicy>.GetFormatFromOptions(ModelReaderWriterOptions options); + BinaryData 
IPersistableModel<VectorStoreExpirationPolicy>.Write(ModelReaderWriterOptions options); + } + public class VectorStoreFileAssociation : IJsonModel<VectorStoreFileAssociation>, IPersistableModel<VectorStoreFileAssociation> { + public FileChunkingStrategy ChunkingStrategy { get; } + public DateTimeOffset CreatedAt { get; } + public string FileId { get; } + public VectorStoreFileAssociationError LastError { get; } + public int Size { get; } + public VectorStoreFileAssociationStatus Status { get; } + public string VectorStoreId { get; } + VectorStoreFileAssociation IJsonModel<VectorStoreFileAssociation>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options); + void IJsonModel<VectorStoreFileAssociation>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options); + VectorStoreFileAssociation IPersistableModel<VectorStoreFileAssociation>.Create(BinaryData data, ModelReaderWriterOptions options); + string IPersistableModel<VectorStoreFileAssociation>.GetFormatFromOptions(ModelReaderWriterOptions options); + BinaryData IPersistableModel<VectorStoreFileAssociation>.Write(ModelReaderWriterOptions options); + } + public class VectorStoreFileAssociationCollectionOptions { + public string AfterId { get; set; } + public string BeforeId { get; set; } + public VectorStoreFileStatusFilter? Filter { get; set; } + public ListOrder? Order { get; set; } + public int? 
PageSize { get; set; } + } + public class VectorStoreFileAssociationError : IJsonModel<VectorStoreFileAssociationError>, IPersistableModel<VectorStoreFileAssociationError> { + public VectorStoreFileAssociationErrorCode Code { get; } + public string Message { get; } + VectorStoreFileAssociationError IJsonModel<VectorStoreFileAssociationError>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options); + void IJsonModel<VectorStoreFileAssociationError>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options); + VectorStoreFileAssociationError IPersistableModel<VectorStoreFileAssociationError>.Create(BinaryData data, ModelReaderWriterOptions options); + string IPersistableModel<VectorStoreFileAssociationError>.GetFormatFromOptions(ModelReaderWriterOptions options); + BinaryData IPersistableModel<VectorStoreFileAssociationError>.Write(ModelReaderWriterOptions options); + } + public readonly partial struct VectorStoreFileAssociationErrorCode : IEquatable<VectorStoreFileAssociationErrorCode> { + private readonly object _dummy; + private readonly int _dummyPrimitive; + public VectorStoreFileAssociationErrorCode(string value); + public static VectorStoreFileAssociationErrorCode InvalidFile { get; } + public static VectorStoreFileAssociationErrorCode ServerError { get; } + public static VectorStoreFileAssociationErrorCode UnsupportedFile { get; } + public readonly bool Equals(VectorStoreFileAssociationErrorCode other); + [EditorBrowsable(EditorBrowsableState.Never)] + public override readonly bool Equals(object obj); + [EditorBrowsable(EditorBrowsableState.Never)] + public override readonly int GetHashCode(); + public static bool operator ==(VectorStoreFileAssociationErrorCode left, VectorStoreFileAssociationErrorCode right); + public static implicit operator VectorStoreFileAssociationErrorCode(string value); + public static bool operator !=(VectorStoreFileAssociationErrorCode left, VectorStoreFileAssociationErrorCode right); + public override readonly string ToString(); + } + public enum VectorStoreFileAssociationStatus { + Unknown = 0, + InProgress = 1, + Completed = 2, + Cancelled = 3, + Failed = 4 + } + public class VectorStoreFileCounts : IJsonModel<VectorStoreFileCounts>, IPersistableModel<VectorStoreFileCounts> { + public int Cancelled { get; } + public int 
Completed { get; } + public int Failed { get; } + public int InProgress { get; } + public int Total { get; } + VectorStoreFileCounts IJsonModel<VectorStoreFileCounts>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options); + void IJsonModel<VectorStoreFileCounts>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options); + VectorStoreFileCounts IPersistableModel<VectorStoreFileCounts>.Create(BinaryData data, ModelReaderWriterOptions options); + string IPersistableModel<VectorStoreFileCounts>.GetFormatFromOptions(ModelReaderWriterOptions options); + BinaryData IPersistableModel<VectorStoreFileCounts>.Write(ModelReaderWriterOptions options); + } + public readonly partial struct VectorStoreFileStatusFilter : IEquatable<VectorStoreFileStatusFilter> { + private readonly object _dummy; + private readonly int _dummyPrimitive; + public VectorStoreFileStatusFilter(string value); + public static VectorStoreFileStatusFilter Cancelled { get; } + public static VectorStoreFileStatusFilter Completed { get; } + public static VectorStoreFileStatusFilter Failed { get; } + public static VectorStoreFileStatusFilter InProgress { get; } + public readonly bool Equals(VectorStoreFileStatusFilter other); + [EditorBrowsable(EditorBrowsableState.Never)] + public override readonly bool Equals(object obj); + [EditorBrowsable(EditorBrowsableState.Never)] + public override readonly int GetHashCode(); + public static bool operator ==(VectorStoreFileStatusFilter left, VectorStoreFileStatusFilter right); + public static implicit operator VectorStoreFileStatusFilter(string value); + public static bool operator !=(VectorStoreFileStatusFilter left, VectorStoreFileStatusFilter right); + public override readonly string ToString(); + } + public class VectorStoreModificationOptions : IJsonModel<VectorStoreModificationOptions>, IPersistableModel<VectorStoreModificationOptions> { + public VectorStoreExpirationPolicy ExpirationPolicy { get; set; } + public IDictionary<string, string> Metadata { get; set; } + public string Name { get; set; } + VectorStoreModificationOptions IJsonModel<VectorStoreModificationOptions>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options); + void IJsonModel<VectorStoreModificationOptions>.Write(Utf8JsonWriter writer, 
ModelReaderWriterOptions options); + VectorStoreModificationOptions IPersistableModel<VectorStoreModificationOptions>.Create(BinaryData data, ModelReaderWriterOptions options); + string IPersistableModel<VectorStoreModificationOptions>.GetFormatFromOptions(ModelReaderWriterOptions options); + BinaryData IPersistableModel<VectorStoreModificationOptions>.Write(ModelReaderWriterOptions options); + } + public enum VectorStoreStatus { + Unknown = 0, + InProgress = 1, + Completed = 2, + Expired = 3 + } +} \ No newline at end of file diff --git a/.dotnet/docs/observability.md b/.dotnet/docs/observability.md new file mode 100644 index 000000000..8dd3290aa --- /dev/null +++ b/.dotnet/docs/observability.md @@ -0,0 +1,57 @@ +## Observability with OpenTelemetry + +> Note: +> OpenAI .NET SDK instrumentation is in development and is not complete. See the [Available sources and meters](#available-sources-and-meters) section for the list of covered operations. + +The OpenAI .NET library is instrumented for distributed tracing and metrics using the .NET [tracing](https://learn.microsoft.com/dotnet/core/diagnostics/distributed-tracing) +and [metrics](https://learn.microsoft.com/dotnet/core/diagnostics/metrics-instrumentation) APIs, and supports [OpenTelemetry](https://learn.microsoft.com/dotnet/core/diagnostics/observability-with-otel). + +The OpenAI .NET instrumentation follows the [OpenTelemetry Semantic Conventions for Generative AI systems](https://github.com/open-telemetry/semantic-conventions/tree/main/docs/gen-ai). + +### How to enable + +The instrumentation is **experimental** - the volume and semantics of the telemetry items may change. + +To enable the instrumentation: + +1. Set the instrumentation feature flag using one of the following options: + + - set the `OPENAI_EXPERIMENTAL_ENABLE_OPEN_TELEMETRY` environment variable to `"true"` + - set the `OpenAI.Experimental.EnableOpenTelemetry` context switch to `true` in your application code during startup, before initializing any OpenAI clients. 
For example: + + ```csharp + AppContext.SetSwitch("OpenAI.Experimental.EnableOpenTelemetry", true); + ``` + +2. Enable OpenAI telemetry: + + ```csharp + builder.Services.AddOpenTelemetry() + .WithTracing(b => + { + b.AddSource("OpenAI.*") + ... + .AddOtlpExporter(); + }) + .WithMetrics(b => + { + b.AddMeter("OpenAI.*") + ... + .AddOtlpExporter(); + }); + ``` + + Distributed tracing is enabled with `AddSource("OpenAI.*")` which tells OpenTelemetry to listen to all [ActivitySources](https://learn.microsoft.com/dotnet/api/system.diagnostics.activitysource) with names starting with `OpenAI.*`. + + Similarly, metrics are configured with `AddMeter("OpenAI.*")` which enables all OpenAI-related [Meters](https://learn.microsoft.com/dotnet/api/system.diagnostics.metrics.meter). + +Consider enabling [HTTP client instrumentation](https://www.nuget.org/packages/OpenTelemetry.Instrumentation.Http) to see all HTTP client +calls made by your application including those done by the OpenAI SDK. +Check out [OpenTelemetry documentation](https://opentelemetry.io/docs/languages/net/getting-started/) for more details. 
+ +### Available sources and meters + +The following sources and meters are available: + +- `OpenAI.ChatClient` - records traces and metrics for `ChatClient` operations (except streaming and protocol methods which are not instrumented yet) diff --git a/.dotnet/examples/Assets/audio_french.wav b/.dotnet/examples/Assets/audio_french.wav new file mode 100644 index 000000000..847f3463a Binary files /dev/null and b/.dotnet/examples/Assets/audio_french.wav differ diff --git a/.dotnet/examples/Assets/audio_houseplant_care.mp3 b/.dotnet/examples/Assets/audio_houseplant_care.mp3 new file mode 100644 index 000000000..8a2f61411 Binary files /dev/null and b/.dotnet/examples/Assets/audio_houseplant_care.mp3 differ diff --git a/.dotnet/examples/Assets/images_dog_and_cat.png b/.dotnet/examples/Assets/images_dog_and_cat.png new file mode 100644 index 000000000..063f3e4d6 Binary files /dev/null and b/.dotnet/examples/Assets/images_dog_and_cat.png differ diff --git a/.dotnet/examples/Assets/images_flower_vase.png b/.dotnet/examples/Assets/images_flower_vase.png new file mode 100644 index 000000000..b2647dd75 Binary files /dev/null and b/.dotnet/examples/Assets/images_flower_vase.png differ diff --git a/.dotnet/examples/Assets/images_flower_vase_with_mask.png b/.dotnet/examples/Assets/images_flower_vase_with_mask.png new file mode 100644 index 000000000..cb1954243 Binary files /dev/null and b/.dotnet/examples/Assets/images_flower_vase_with_mask.png differ diff --git a/.dotnet/examples/Assistants/Example01_RetrievalAugmentedGeneration.cs b/.dotnet/examples/Assistants/Example01_RetrievalAugmentedGeneration.cs new file mode 100644 index 000000000..6d9117e9c --- /dev/null +++ b/.dotnet/examples/Assistants/Example01_RetrievalAugmentedGeneration.cs @@ -0,0 +1,150 @@ +using NUnit.Framework; +using OpenAI.Assistants; +using OpenAI.Files; +using System; +using System.ClientModel; +using System.Collections.Generic; +using System.IO; +using System.Threading; + +namespace OpenAI.Examples; + 
+public partial class AssistantExamples +{ + [Test] + public void Example01_RetrievalAugmentedGeneration() + { + // Assistants is a beta API and subject to change; acknowledge its experimental status by suppressing the matching warning. +#pragma warning disable OPENAI001 + OpenAIClient openAIClient = new(Environment.GetEnvironmentVariable("OPENAI_API_KEY")); + FileClient fileClient = openAIClient.GetFileClient(); + AssistantClient assistantClient = openAIClient.GetAssistantClient(); + + // First, let's contrive a document we'll use retrieval with and upload it. + using Stream document = BinaryData.FromString(""" + { + "description": "This document contains the sale history data for Contoso products.", + "sales": [ + { + "month": "January", + "by_product": { + "113043": 15, + "113045": 12, + "113049": 2 + } + }, + { + "month": "February", + "by_product": { + "113045": 22 + } + }, + { + "month": "March", + "by_product": { + "113045": 16, + "113055": 5 + } + } + ] + } + """).ToStream(); + + OpenAIFileInfo salesFile = fileClient.UploadFile( + document, + "monthly_sales.json", + FileUploadPurpose.Assistants); + + // Now, we'll create a client intended to help with that data + AssistantCreationOptions assistantOptions = new() + { + Name = "Example: Contoso sales RAG", + Instructions = + "You are an assistant that looks up sales data and helps visualize the information based" + + " on user queries. 
When asked to generate a graph, chart, or other visualization, use" + " the code interpreter tool to do so.", + Tools = + { + new FileSearchToolDefinition(), + new CodeInterpreterToolDefinition(), + }, + ToolResources = new() + { + FileSearch = new() + { + NewVectorStores = + { + new VectorStoreCreationHelper([salesFile.Id]), + } + } + }, + }; + + Assistant assistant = assistantClient.CreateAssistant("gpt-4o", assistantOptions); + + // Now we'll create a thread with a user query about the data already associated with the assistant, then run it + ThreadCreationOptions threadOptions = new() + { + InitialMessages = { "How well did product 113045 sell in February? Graph its trend over time." } + }; + + ThreadRun threadRun = assistantClient.CreateThreadAndRun(assistant.Id, threadOptions); + + // Check back to see when the run is done + do + { + Thread.Sleep(TimeSpan.FromSeconds(1)); + threadRun = assistantClient.GetRun(threadRun.ThreadId, threadRun.Id); + } while (!threadRun.Status.IsTerminal); + + // Finally, we'll print out the full history for the thread that includes the augmented generation + PageCollection<ThreadMessage> messagePages + = assistantClient.GetMessages(threadRun.ThreadId, new MessageCollectionOptions() { Order = ListOrder.OldestFirst }); + IEnumerable<ThreadMessage> messages = messagePages.GetAllValues(); + + foreach (ThreadMessage message in messages) + { + Console.Write($"[{message.Role.ToString().ToUpper()}]: "); + foreach (MessageContent contentItem in message.Content) + { + if (!string.IsNullOrEmpty(contentItem.Text)) + { + Console.WriteLine($"{contentItem.Text}"); + + if (contentItem.TextAnnotations.Count > 0) + { + Console.WriteLine(); + } + + // Include annotations, if any. 
+ foreach (TextAnnotation annotation in contentItem.TextAnnotations) + { + if (!string.IsNullOrEmpty(annotation.InputFileId)) + { + Console.WriteLine($"* File citation, file ID: {annotation.InputFileId}"); + } + if (!string.IsNullOrEmpty(annotation.OutputFileId)) + { + Console.WriteLine($"* File output, new file ID: {annotation.OutputFileId}"); + } + } + } + if (!string.IsNullOrEmpty(contentItem.ImageFileId)) + { + OpenAIFileInfo imageInfo = fileClient.GetFile(contentItem.ImageFileId); + BinaryData imageBytes = fileClient.DownloadFile(contentItem.ImageFileId); + using FileStream stream = File.OpenWrite($"{imageInfo.Filename}.png"); + imageBytes.ToStream().CopyTo(stream); + + Console.WriteLine($"<image: {imageInfo.Filename}.png>"); + } + } + Console.WriteLine(); + } + + // Optionally, delete any persistent resources you no longer need. + _ = assistantClient.DeleteThread(threadRun.ThreadId); + _ = assistantClient.DeleteAssistant(assistant); + _ = fileClient.DeleteFile(salesFile.Id); + } +} diff --git a/.dotnet/examples/Assistants/Example01_RetrievalAugmentedGenerationAsync.cs b/.dotnet/examples/Assistants/Example01_RetrievalAugmentedGenerationAsync.cs new file mode 100644 index 000000000..13b49145a --- /dev/null +++ b/.dotnet/examples/Assistants/Example01_RetrievalAugmentedGenerationAsync.cs @@ -0,0 +1,151 @@ +using NUnit.Framework; +using OpenAI.Assistants; +using OpenAI.Files; +using System; +using System.ClientModel; +using System.Collections.Generic; +using System.IO; +using System.Threading; +using System.Threading.Tasks; + +namespace OpenAI.Examples; + +public partial class AssistantExamples +{ + [Test] + public async Task Example01_RetrievalAugmentedGenerationAsync() + { + // Assistants is a beta API and subject to change; acknowledge its experimental status by suppressing the matching warning. 
+#pragma warning disable OPENAI001 + OpenAIClient openAIClient = new(Environment.GetEnvironmentVariable("OPENAI_API_KEY")); + FileClient fileClient = openAIClient.GetFileClient(); + AssistantClient assistantClient = openAIClient.GetAssistantClient(); + + // First, let's contrive a document we'll use retrieval with and upload it. + using Stream document = BinaryData.FromString(""" + { + "description": "This document contains the sale history data for Contoso products.", + "sales": [ + { + "month": "January", + "by_product": { + "113043": 15, + "113045": 12, + "113049": 2 + } + }, + { + "month": "February", + "by_product": { + "113045": 22 + } + }, + { + "month": "March", + "by_product": { + "113045": 16, + "113055": 5 + } + } + ] + } + """).ToStream(); + + OpenAIFileInfo salesFile = await fileClient.UploadFileAsync( + document, + "monthly_sales.json", + FileUploadPurpose.Assistants); + + // Now, we'll create a client intended to help with that data + AssistantCreationOptions assistantOptions = new() + { + Name = "Example: Contoso sales RAG", + Instructions = + "You are an assistant that looks up sales data and helps visualize the information based" + + " on user queries. When asked to generate a graph, chart, or other visualization, use" + + " the code interpreter tool to do so.", + Tools = + { + new FileSearchToolDefinition(), + new CodeInterpreterToolDefinition(), + }, + ToolResources = new() + { + FileSearch = new() + { + NewVectorStores = + { + new VectorStoreCreationHelper([salesFile.Id]), + } + } + }, + }; + + Assistant assistant = await assistantClient.CreateAssistantAsync("gpt-4o", assistantOptions); + + // Now we'll create a thread with a user query about the data already associated with the assistant, then run it + ThreadCreationOptions threadOptions = new() + { + InitialMessages = { "How well did product 113045 sell in February? Graph its trend over time." 
} + }; + + ThreadRun threadRun = await assistantClient.CreateThreadAndRunAsync(assistant.Id, threadOptions); + + // Check back to see when the run is done + do + { + await Task.Delay(TimeSpan.FromSeconds(1)); + threadRun = await assistantClient.GetRunAsync(threadRun.ThreadId, threadRun.Id); + } while (!threadRun.Status.IsTerminal); + + // Finally, we'll print out the full history for the thread that includes the augmented generation + AsyncPageCollection<ThreadMessage> messagePages + = assistantClient.GetMessagesAsync(threadRun.ThreadId, new MessageCollectionOptions() { Order = ListOrder.OldestFirst }); + IAsyncEnumerable<ThreadMessage> messages = messagePages.GetAllValuesAsync(); + + await foreach (ThreadMessage message in messages) + { + Console.Write($"[{message.Role.ToString().ToUpper()}]: "); + foreach (MessageContent contentItem in message.Content) + { + if (!string.IsNullOrEmpty(contentItem.Text)) + { + Console.WriteLine($"{contentItem.Text}"); + + if (contentItem.TextAnnotations.Count > 0) + { + Console.WriteLine(); + } + + // Include annotations, if any. + foreach (TextAnnotation annotation in contentItem.TextAnnotations) + { + if (!string.IsNullOrEmpty(annotation.InputFileId)) + { + Console.WriteLine($"* File citation, file ID: {annotation.InputFileId}"); + } + if (!string.IsNullOrEmpty(annotation.OutputFileId)) + { + Console.WriteLine($"* File output, new file ID: {annotation.OutputFileId}"); + } + } + } + if (!string.IsNullOrEmpty(contentItem.ImageFileId)) + { + OpenAIFileInfo imageInfo = await fileClient.GetFileAsync(contentItem.ImageFileId); + BinaryData imageBytes = await fileClient.DownloadFileAsync(contentItem.ImageFileId); + using FileStream stream = File.OpenWrite($"{imageInfo.Filename}.png"); + imageBytes.ToStream().CopyTo(stream); + + Console.WriteLine($"<image: {imageInfo.Filename}.png>"); + } + } + Console.WriteLine(); + } + + // Optionally, delete any persistent resources you no longer need. 
+ _ = await assistantClient.DeleteThreadAsync(threadRun.ThreadId); + _ = await assistantClient.DeleteAssistantAsync(assistant); + _ = await fileClient.DeleteFileAsync(salesFile.Id); + } +} diff --git a/.dotnet/examples/Assistants/Example02_FunctionCalling.cs b/.dotnet/examples/Assistants/Example02_FunctionCalling.cs new file mode 100644 index 000000000..a0eb05637 --- /dev/null +++ b/.dotnet/examples/Assistants/Example02_FunctionCalling.cs @@ -0,0 +1,193 @@ +using NUnit.Framework; +using OpenAI.Assistants; +using System; +using System.ClientModel; +using System.Collections.Generic; +using System.Text.Json; +using System.Threading; + +namespace OpenAI.Examples; + +public partial class AssistantExamples +{ + [Test] + public void Example02_FunctionCalling() + { + #region + string GetCurrentLocation() + { + // Call a location API here. + return "San Francisco"; + } + + const string GetCurrentLocationFunctionName = "get_current_location"; + + FunctionToolDefinition getLocationTool = new() + { + FunctionName = GetCurrentLocationFunctionName, + Description = "Get the user's current location" + }; + + string GetCurrentWeather(string location, string unit = "celsius") + { + // Call a weather API here. + return $"31 {unit}"; + } + + const string GetCurrentWeatherFunctionName = "get_current_weather"; + + FunctionToolDefinition getWeatherTool = new() + { + FunctionName = GetCurrentWeatherFunctionName, + Description = "Get the current weather in a given location", + Parameters = BinaryData.FromString(""" + { + "type": "object", + "properties": { + "location": { + "type": "string", + "description": "The city and state, e.g. Boston, MA" + }, + "unit": { + "type": "string", + "enum": [ "celsius", "fahrenheit" ], + "description": "The temperature unit to use. Infer this from the specified location." 
+ } + }, + "required": [ "location" ] + } + """), + }; + #endregion + + // Assistants is a beta API and subject to change; acknowledge its experimental status by suppressing the matching warning. +#pragma warning disable OPENAI001 + AssistantClient client = new(Environment.GetEnvironmentVariable("OPENAI_API_KEY")); + + #region + // Create an assistant that can call the function tools. + AssistantCreationOptions assistantOptions = new() + { + Name = "Example: Function Calling", + Instructions = + "Don't make assumptions about what values to plug into functions." + + " Ask for clarification if a user request is ambiguous.", + Tools = { getLocationTool, getWeatherTool }, + }; + + Assistant assistant = client.CreateAssistant("gpt-4-turbo", assistantOptions); + #endregion + + #region + // Create a thread with an initial user message and run it. + ThreadCreationOptions threadOptions = new() + { + InitialMessages = { "What's the weather like today?" } + }; + + ThreadRun run = client.CreateThreadAndRun(assistant.Id, threadOptions); + #endregion + + #region + // Poll the run until it is no longer queued or in progress. + while (!run.Status.IsTerminal) + { + Thread.Sleep(TimeSpan.FromSeconds(1)); + run = client.GetRun(run.ThreadId, run.Id); + + // If the run requires action, resolve them. + if (run.Status == RunStatus.RequiresAction) + { + List<ToolOutput> toolOutputs = []; + + foreach (RequiredAction action in run.RequiredActions) + { + switch (action.FunctionName) + { + case GetCurrentLocationFunctionName: + { + string toolResult = GetCurrentLocation(); + toolOutputs.Add(new ToolOutput(action.ToolCallId, toolResult)); + break; + } + + case GetCurrentWeatherFunctionName: + { + // The arguments that the model wants to use to call the function are specified as a + // stringified JSON object based on the schema defined in the tool definition. Note that + // the model may hallucinate arguments too. 
Consequently, it is important to do the + // appropriate parsing and validation before calling the function. + using JsonDocument argumentsJson = JsonDocument.Parse(action.FunctionArguments); + bool hasLocation = argumentsJson.RootElement.TryGetProperty("location", out JsonElement location); + bool hasUnit = argumentsJson.RootElement.TryGetProperty("unit", out JsonElement unit); + + if (!hasLocation) + { + throw new ArgumentNullException(nameof(location), "The location argument is required."); + } + + string toolResult = hasUnit + ? GetCurrentWeather(location.GetString(), unit.GetString()) + : GetCurrentWeather(location.GetString()); + toolOutputs.Add(new ToolOutput(action.ToolCallId, toolResult)); + break; + } + + default: + { + // Handle other or unexpected calls. + throw new NotImplementedException(); + } + } + } + + // Submit the tool outputs to the assistant, which returns the run to the queued state. + run = client.SubmitToolOutputsToRun(run.ThreadId, run.Id, toolOutputs); + } + } + #endregion + + #region + // With the run complete, list the messages and display their content + if (run.Status == RunStatus.Completed) + { + PageCollection<ThreadMessage> messagePages + = client.GetMessages(run.ThreadId, new MessageCollectionOptions() { Order = ListOrder.OldestFirst }); + IEnumerable<ThreadMessage> messages = messagePages.GetAllValues(); + + foreach (ThreadMessage message in messages) + { + Console.WriteLine($"[{message.Role.ToString().ToUpper()}]: "); + foreach (MessageContent contentItem in message.Content) + { + Console.WriteLine($"{contentItem.Text}"); + + if (contentItem.ImageFileId is not null) + { + Console.WriteLine($" {contentItem.ImageFileId}"); + } + + // Include annotations, if any. 
+ if (contentItem.TextAnnotations.Count > 0) + { + Console.WriteLine(); + foreach (TextAnnotation annotation in contentItem.TextAnnotations) + { + Console.WriteLine($"* File ID used by file_search: {annotation.InputFileId}"); + Console.WriteLine($"* File ID created by code_interpreter: {annotation.OutputFileId}"); + Console.WriteLine($"* Text to replace: {annotation.TextToReplace}"); + Console.WriteLine($"* Message content index range: {annotation.StartIndex}-{annotation.EndIndex}"); + } + } + + } + Console.WriteLine(); + } + } + else + { + throw new NotImplementedException(run.Status.ToString()); + } + #endregion + } +} diff --git a/.dotnet/examples/Assistants/Example02_FunctionCallingAsync.cs b/.dotnet/examples/Assistants/Example02_FunctionCallingAsync.cs new file mode 100644 index 000000000..2ee924c56 --- /dev/null +++ b/.dotnet/examples/Assistants/Example02_FunctionCallingAsync.cs @@ -0,0 +1,193 @@ +using NUnit.Framework; +using OpenAI.Assistants; +using System; +using System.ClientModel; +using System.Collections.Generic; +using System.Text.Json; +using System.Threading.Tasks; + +namespace OpenAI.Examples; + +public partial class AssistantExamples +{ + [Test] + public async Task Example02_FunctionCallingAsync() + { + #region + string GetCurrentLocation() + { + // Call a location API here. + return "San Francisco"; + } + + const string GetCurrentLocationFunctionName = "get_current_location"; + + FunctionToolDefinition getLocationTool = new() + { + FunctionName = GetCurrentLocationFunctionName, + Description = "Get the user's current location" + }; + + string GetCurrentWeather(string location, string unit = "celsius") + { + // Call a weather API here. 
+ return $"31 {unit}"; + } + + const string GetCurrentWeatherFunctionName = "get_current_weather"; + + FunctionToolDefinition getWeatherTool = new() + { + FunctionName = GetCurrentWeatherFunctionName, + Description = "Get the current weather in a given location", + Parameters = BinaryData.FromString(""" + { + "type": "object", + "properties": { + "location": { + "type": "string", + "description": "The city and state, e.g. Boston, MA" + }, + "unit": { + "type": "string", + "enum": [ "celsius", "fahrenheit" ], + "description": "The temperature unit to use. Infer this from the specified location." + } + }, + "required": [ "location" ] + } + """), + }; + #endregion + + // Assistants is a beta API and subject to change; acknowledge its experimental status by suppressing the matching warning. +#pragma warning disable OPENAI001 + AssistantClient client = new(Environment.GetEnvironmentVariable("OPENAI_API_KEY")); + + #region + // Create an assistant that can call the function tools. + AssistantCreationOptions assistantOptions = new() + { + Name = "Example: Function Calling", + Instructions = + "Don't make assumptions about what values to plug into functions." + + " Ask for clarification if a user request is ambiguous.", + Tools = { getLocationTool, getWeatherTool }, + }; + + Assistant assistant = await client.CreateAssistantAsync("gpt-4-turbo", assistantOptions); + #endregion + + #region + // Create a thread with an initial user message and run it. + ThreadCreationOptions threadOptions = new() + { + InitialMessages = { "What's the weather like today?" } + }; + + ThreadRun run = await client.CreateThreadAndRunAsync(assistant.Id, threadOptions); + #endregion + + #region + // Poll the run until it is no longer queued or in progress. + while (!run.Status.IsTerminal) + { + await Task.Delay(TimeSpan.FromSeconds(1)); + run = await client.GetRunAsync(run.ThreadId, run.Id); + + // If the run requires action, resolve them. 
+            if (run.Status == RunStatus.RequiresAction)
+            {
+                List<ToolOutput> toolOutputs = [];
+
+                foreach (RequiredAction action in run.RequiredActions)
+                {
+                    switch (action.FunctionName)
+                    {
+                        case GetCurrentLocationFunctionName:
+                            {
+                                string toolResult = GetCurrentLocation();
+                                toolOutputs.Add(new ToolOutput(action.ToolCallId, toolResult));
+                                break;
+                            }
+
+                        case GetCurrentWeatherFunctionName:
+                            {
+                                // The arguments that the model wants to use to call the function are specified as a
+                                // stringified JSON object based on the schema defined in the tool definition. Note that
+                                // the model may hallucinate arguments too. Consequently, it is important to do the
+                                // appropriate parsing and validation before calling the function.
+                                using JsonDocument argumentsJson = JsonDocument.Parse(action.FunctionArguments);
+                                bool hasLocation = argumentsJson.RootElement.TryGetProperty("location", out JsonElement location);
+                                bool hasUnit = argumentsJson.RootElement.TryGetProperty("unit", out JsonElement unit);
+
+                                if (!hasLocation)
+                                {
+                                    throw new ArgumentNullException(nameof(location), "The location argument is required.");
+                                }
+
+                                string toolResult = hasUnit
+                                    ? GetCurrentWeather(location.GetString(), unit.GetString())
+                                    : GetCurrentWeather(location.GetString());
+                                toolOutputs.Add(new ToolOutput(action.ToolCallId, toolResult));
+                                break;
+                            }
+
+                        default:
+                            {
+                                // Handle other or unexpected calls.
+                                throw new NotImplementedException();
+                            }
+                    }
+                }
+
+                // Submit the tool outputs to the assistant, which returns the run to the queued state.
+                run = await client.SubmitToolOutputsToRunAsync(run.ThreadId, run.Id, toolOutputs);
+            }
+        }
+        #endregion
+
+        #region
+        // With the run complete, list the messages and display their content.
+        if (run.Status == RunStatus.Completed)
+        {
+            AsyncPageCollection<ThreadMessage> messagePages
+                = client.GetMessagesAsync(run.ThreadId, new MessageCollectionOptions() { Order = ListOrder.OldestFirst });
+            IAsyncEnumerable<ThreadMessage> messages = messagePages.GetAllValuesAsync();
+
+            await foreach (ThreadMessage message in messages)
+            {
+                Console.WriteLine($"[{message.Role.ToString().ToUpper()}]: ");
+                foreach (MessageContent contentItem in message.Content)
+                {
+                    Console.WriteLine($"{contentItem.Text}");
+
+                    if (contentItem.ImageFileId is not null)
+                    {
+                        Console.WriteLine($"<image: {contentItem.ImageFileId}>");
+                    }
+
+                    // Include annotations, if any.
+                    if (contentItem.TextAnnotations.Count > 0)
+                    {
+                        Console.WriteLine();
+                        foreach (TextAnnotation annotation in contentItem.TextAnnotations)
+                        {
+                            Console.WriteLine($"* File ID used by file_search: {annotation.InputFileId}");
+                            Console.WriteLine($"* File ID created by code_interpreter: {annotation.OutputFileId}");
+                            Console.WriteLine($"* Text to replace: {annotation.TextToReplace}");
+                            Console.WriteLine($"* Message content index range: {annotation.StartIndex}-{annotation.EndIndex}");
+                        }
+                    }
+
+                }
+                Console.WriteLine();
+            }
+        }
+        else
+        {
+            throw new NotImplementedException(run.Status.ToString());
+        }
+        #endregion
+    }
+}
diff --git a/.dotnet/examples/Assistants/Example02b_FunctionCallingStreaming.cs b/.dotnet/examples/Assistants/Example02b_FunctionCallingStreaming.cs
new file mode 100644
index 000000000..9c3e0adfc
--- /dev/null
+++ b/.dotnet/examples/Assistants/Example02b_FunctionCallingStreaming.cs
@@ -0,0 +1,138 @@
+using NUnit.Framework;
+using OpenAI.Assistants;
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Threading.Tasks;
+
+namespace OpenAI.Examples;
+
+public partial class
AssistantExamples +{ + [Test] + public async Task Example02b_FunctionCallingStreaming() + { + // This example parallels the content at the following location: + // https://platform.openai.com/docs/assistants/tools/function-calling/function-calling-beta + #region Step 1 - Define Functions + + // First, define the functions that the assistant will use in its defined tools. + + FunctionToolDefinition getTemperatureTool = new() + { + FunctionName = "get_current_temperature", + Description = "Gets the current temperature at a specific location.", + Parameters = BinaryData.FromString(""" + { + "type": "object", + "properties": { + "location": { + "type": "string", + "description": "The city and state, e.g., San Francisco, CA" + }, + "unit": { + "type": "string", + "enum": ["Celsius", "Fahrenheit"], + "description": "The temperature unit to use. Infer this from the user's location." + } + } + } + """), + }; + + FunctionToolDefinition getRainProbabilityTool = new() + { + FunctionName = "get_current_rain_probability", + Description = "Gets the current forecasted probability of rain at a specific location," + + " represented as a percent chance in the range of 0 to 100.", + Parameters = BinaryData.FromString(""" + { + "type": "object", + "properties": { + "location": { + "type": "string", + "description": "The city and state, e.g., San Francisco, CA" + } + }, + "required": ["location"] + } + """), + }; + + #endregion + + // Assistants is a beta API and subject to change; acknowledge its experimental status by suppressing the matching warning. +#pragma warning disable OPENAI001 + AssistantClient client = new(Environment.GetEnvironmentVariable("OPENAI_API_KEY")); + + #region Create a new assistant with function tools + // Create an assistant that can call the function tools. + AssistantCreationOptions assistantOptions = new() + { + Name = "Example: Function Calling", + Instructions = + "Don't make assumptions about what values to plug into functions." 
+ + " Ask for clarification if a user request is ambiguous.", + Tools = { getTemperatureTool, getRainProbabilityTool }, + }; + + Assistant assistant = await client.CreateAssistantAsync("gpt-4-turbo", assistantOptions); + #endregion + + #region Step 2 - Create a thread and add messages + AssistantThread thread = await client.CreateThreadAsync(); + ThreadMessage message = await client.CreateMessageAsync( + thread, + MessageRole.User, + [ + "What's the weather in San Francisco today and the likelihood it'll rain?" + ]); + #endregion + + #region Step 3 - Initiate a streaming run + AsyncCollectionResult asyncUpdates + = client.CreateRunStreamingAsync(thread, assistant); + + ThreadRun currentRun = null; + do + { + currentRun = null; + List outputsToSubmit = []; + await foreach (StreamingUpdate update in asyncUpdates) + { + if (update is RunUpdate runUpdate) + { + currentRun = runUpdate; + } + else if (update is RequiredActionUpdate requiredActionUpdate) + { + if (requiredActionUpdate.FunctionName == getTemperatureTool.FunctionName) + { + outputsToSubmit.Add(new ToolOutput(requiredActionUpdate.ToolCallId, "57")); + } + else if (requiredActionUpdate.FunctionName == getRainProbabilityTool.FunctionName) + { + outputsToSubmit.Add(new ToolOutput(requiredActionUpdate.ToolCallId, "25%")); + } + } + else if (update is MessageContentUpdate contentUpdate) + { + Console.Write(contentUpdate.Text); + } + } + if (outputsToSubmit.Count > 0) + { + asyncUpdates = client.SubmitToolOutputsToRunStreamingAsync(currentRun, outputsToSubmit); + } + } + while (currentRun?.Status.IsTerminal == false); + + #endregion + + // Optionally, delete the resources for tidiness if no longer needed. 
+ RequestOptions noThrowOptions = new() { ErrorOptions = ClientErrorBehaviors.NoThrow }; + _ = await client.DeleteThreadAsync(thread.Id, noThrowOptions); + _ = await client.DeleteAssistantAsync(assistant.Id, noThrowOptions); + } +} diff --git a/.dotnet/examples/Assistants/Example03_ListAssistantsWithPagination.cs b/.dotnet/examples/Assistants/Example03_ListAssistantsWithPagination.cs new file mode 100644 index 000000000..83776d7a1 --- /dev/null +++ b/.dotnet/examples/Assistants/Example03_ListAssistantsWithPagination.cs @@ -0,0 +1,29 @@ +using NUnit.Framework; +using OpenAI.Assistants; +using System; +using System.ClientModel; +using System.Collections.Generic; + +namespace OpenAI.Examples; + +public partial class AssistantExamples +{ + [Test] + public void Example03_ListAssistantsWithPagination() + { + // Assistants is a beta API and subject to change; acknowledge its experimental status by suppressing the matching warning. +#pragma warning disable OPENAI001 + AssistantClient client = new(Environment.GetEnvironmentVariable("OPENAI_API_KEY")); + + int count = 0; + + PageCollection assistantPages = client.GetAssistants(); + IEnumerable assistants = assistantPages.GetAllValues(); + foreach (Assistant assistant in assistants) + { + Console.WriteLine($"[{count,3}] {assistant.Id} {assistant.CreatedAt:s} {assistant.Name}"); + + count++; + } + } +} diff --git a/.dotnet/examples/Assistants/Example03_ListAssistantsWithPaginationAsync.cs b/.dotnet/examples/Assistants/Example03_ListAssistantsWithPaginationAsync.cs new file mode 100644 index 000000000..1f4d8218e --- /dev/null +++ b/.dotnet/examples/Assistants/Example03_ListAssistantsWithPaginationAsync.cs @@ -0,0 +1,30 @@ +using NUnit.Framework; +using OpenAI.Assistants; +using System; +using System.ClientModel; +using System.Collections.Generic; +using System.Threading.Tasks; + +namespace OpenAI.Examples; + +public partial class AssistantExamples +{ + [Test] + public async Task Example03_ListAssistantsWithPaginationAsync() + { 
+ // Assistants is a beta API and subject to change; acknowledge its experimental status by suppressing the matching warning. +#pragma warning disable OPENAI001 + AssistantClient client = new(Environment.GetEnvironmentVariable("OPENAI_API_KEY")); + + int count = 0; + + AsyncPageCollection assistantPages = client.GetAssistantsAsync(); + IAsyncEnumerable assistants = assistantPages.GetAllValuesAsync(); + await foreach (Assistant assistant in assistants) + { + Console.WriteLine($"[{count,3}] {assistant.Id} {assistant.CreatedAt:s} {assistant.Name}"); + + count++; + } + } +} diff --git a/.dotnet/examples/Assistants/Example04_AllTheTools.cs b/.dotnet/examples/Assistants/Example04_AllTheTools.cs new file mode 100644 index 000000000..5a8943052 --- /dev/null +++ b/.dotnet/examples/Assistants/Example04_AllTheTools.cs @@ -0,0 +1,206 @@ +using NUnit.Framework; +using OpenAI.Assistants; +using OpenAI.Files; +using System; +using System.ClientModel; +using System.Collections.Generic; +using System.Text.Json; +using System.Threading; + +namespace OpenAI.Examples; + +public partial class AssistantExamples +{ + [Test] + public void Example04_AllTheTools() + { +#pragma warning disable OPENAI001 + + #region Define a function tool + static string GetNameOfFamilyMember(string relation) + => relation switch + { + { } when relation.Contains("father") => "John Doe", + { } when relation.Contains("mother") => "Jane Doe", + _ => throw new ArgumentException(relation, nameof(relation)) + }; + + FunctionToolDefinition getNameOfFamilyMemberTool = new() + { + FunctionName = nameof(GetNameOfFamilyMember), + Description = "Provided a family relation type like 'father' or 'mother', " + + "gets the name of the related person from the user.", + Parameters = BinaryData.FromString(""" + { + "type": "object", + "properties": { + "relation": { + "type": "string", + "description": "The relation to the user to query, e.g. 
'mother' or 'father'" + } + }, + "required": [ "relation" ] + } + """), + }; + + #region Upload a mock file for use with file search + FileClient fileClient = new(Environment.GetEnvironmentVariable("OPENAI_API_KEY")); + OpenAIFileInfo favoriteNumberFile = fileClient.UploadFile( + BinaryData.FromString(""" + This file contains the favorite numbers for individuals. + + John Doe: 14 + Bob Doe: 32 + Jane Doe: 44 + """).ToStream(), + "favorite_numbers.txt", + FileUploadPurpose.Assistants); + #endregion + + #region Create an assistant with functions, file search, and code interpreter all enabled + AssistantClient client = new(Environment.GetEnvironmentVariable("OPENAI_API_KEY")); + Assistant assistant = client.CreateAssistant("gpt-4-turbo", new AssistantCreationOptions() + { + Instructions = "Use functions to resolve family relations into the names of people. Use file search to " + + " look up the favorite numbers of people. Use code interpreter to create graphs of lines.", + Tools = { getNameOfFamilyMemberTool, new FileSearchToolDefinition(), new CodeInterpreterToolDefinition() }, + ToolResources = new() + { + FileSearch = new() + { + NewVectorStores = + { + new VectorStoreCreationHelper([favoriteNumberFile.Id]), + }, + }, + }, + }); + #endregion + + #region Create a new thread and start a run + AssistantThread thread = client.CreateThread(new ThreadCreationOptions() + { + InitialMessages = + { + "Create a graph of a line with a slope that's my father's favorite number " + + "and an offset that's my mother's favorite number.", + "Include people's names in your response and cite where you found them." + } + }); + + ThreadRun run = client.CreateRun(thread, assistant); + #endregion + + #region Complete the run, calling functions as needed + // Poll the run until it is no longer queued or in progress. + while (!run.Status.IsTerminal) + { + Thread.Sleep(TimeSpan.FromSeconds(1)); + run = client.GetRun(run.ThreadId, run.Id); + + // If the run requires action, resolve them. 
+            if (run.Status == RunStatus.RequiresAction)
+            {
+                List<ToolOutput> toolOutputs = [];
+
+                foreach (RequiredAction action in run.RequiredActions)
+                {
+                    switch (action.FunctionName)
+                    {
+                        case nameof(GetNameOfFamilyMember):
+                            {
+                                using JsonDocument argumentsDocument = JsonDocument.Parse(action.FunctionArguments);
+                                string relation = argumentsDocument.RootElement.TryGetProperty("relation", out JsonElement relationProperty)
+                                    ? relationProperty.GetString()
+                                    : null;
+                                string toolResult = GetNameOfFamilyMember(relation);
+                                toolOutputs.Add(new ToolOutput(action.ToolCallId, toolResult));
+                                break;
+                            }
+
+                        default:
+                            {
+                                // Handle other or unexpected calls.
+                                throw new NotImplementedException();
+                            }
+                    }
+                }
+
+                // Submit the tool outputs to the assistant, which returns the run to the queued state.
+                run = client.SubmitToolOutputsToRun(run.ThreadId, run.Id, toolOutputs);
+            }
+        }
+        #endregion
+
+        #region
+        // With the run complete, list the messages and display their content.
+        if (run.Status == RunStatus.Completed)
+        {
+            PageCollection<ThreadMessage> messagePages
+                = client.GetMessages(run.ThreadId, new MessageCollectionOptions() { Order = ListOrder.OldestFirst });
+            IEnumerable<ThreadMessage> messages = messagePages.GetAllValues();
+
+            foreach (ThreadMessage message in messages)
+            {
+                Console.WriteLine($"[{message.Role.ToString().ToUpper()}]: ");
+                foreach (MessageContent contentItem in message.Content)
+                {
+                    Console.WriteLine($"{contentItem.Text}");
+
+                    if (contentItem.ImageFileId is not null)
+                    {
+                        Console.WriteLine($"<image: {contentItem.ImageFileId}>");
+                    }
+
+                    // Include annotations, if any.
+                    if (contentItem.TextAnnotations.Count > 0)
+                    {
+                        Console.WriteLine();
+                        foreach (TextAnnotation annotation in contentItem.TextAnnotations)
+                        {
+                            Console.WriteLine($"* File ID used by file_search: {annotation.InputFileId}");
+                            Console.WriteLine($"* File ID created by code_interpreter: {annotation.OutputFileId}");
+                            Console.WriteLine($"* Text to replace: {annotation.TextToReplace}");
+                            Console.WriteLine($"* Message content index range: {annotation.StartIndex}-{annotation.EndIndex}");
+                        }
+                    }
+
+                }
+                Console.WriteLine();
+            }
+            #endregion
+
+            #region List run steps for details about tool calls
+            PageCollection<RunStep> runSteps = client.GetRunSteps(
+                run, new RunStepCollectionOptions()
+                {
+                    Order = ListOrder.OldestFirst
+                });
+            foreach (RunStep step in runSteps.GetAllValues())
+            {
+                Console.WriteLine($"Run step: {step.Status}");
+                foreach (RunStepToolCall toolCall in step.Details.ToolCalls)
+                {
+                    Console.WriteLine($" --> Tool call: {toolCall.ToolKind}");
+                    foreach (RunStepCodeInterpreterOutput output in toolCall.CodeInterpreterOutputs)
+                    {
+                        Console.WriteLine($" --> Output: {output.ImageFileId}");
+                    }
+                }
+            }
+            #endregion
+        }
+        else
+        {
+            throw new NotImplementedException(run.Status.ToString());
+        }
+        #endregion
+
+        #region Clean up any temporary resources that are no longer needed
+        _ = client.DeleteThread(thread);
+        _ = client.DeleteAssistant(assistant);
+        _ = fileClient.DeleteFile(favoriteNumberFile.Id);
+        #endregion
+    }
+}
diff --git a/.dotnet/examples/Assistants/Example05_AssistantsWithVision.cs b/.dotnet/examples/Assistants/Example05_AssistantsWithVision.cs
new file mode 100644
index 000000000..99315cdfb
--- /dev/null
+++ b/.dotnet/examples/Assistants/Example05_AssistantsWithVision.cs
@@ -0,0 +1,72 @@
+using NUnit.Framework;
+using OpenAI.Assistants;
+using OpenAI.Files;
+using System;
+using System.ClientModel;
+
+namespace OpenAI.Examples;
+
+public partial class AssistantExamples
+{
+    [Test]
+    public void Example05_AssistantsWithVision()
+    {
+        // Assistants is a beta API and subject to change; acknowledge its experimental status by suppressing the matching warning.
+#pragma warning disable OPENAI001
+        OpenAIClient openAIClient = new(Environment.GetEnvironmentVariable("OPENAI_API_KEY"));
+        FileClient fileClient = openAIClient.GetFileClient();
+        AssistantClient assistantClient = openAIClient.GetAssistantClient();
+
+        OpenAIFileInfo pictureOfAppleFile = fileClient.UploadFile(
+            "picture-of-apple.jpg",
+            FileUploadPurpose.Vision);
+        Uri linkToPictureOfOrange = new("https://platform.openai.com/fictitious-files/picture-of-orange.png");
+
+        Assistant assistant = assistantClient.CreateAssistant(
+            "gpt-4o",
+            new AssistantCreationOptions()
+            {
+                Instructions = "When asked a question, attempt to answer very concisely. "
+                    + "Prefer one-sentence answers whenever feasible."
+            });
+
+        AssistantThread thread = assistantClient.CreateThread(new ThreadCreationOptions()
+        {
+            InitialMessages =
+            {
+                new ThreadInitializationMessage(
+                    MessageRole.User,
+                    [
+                        "Hello, assistant! Please compare these two images for me:",
+                        MessageContent.FromImageFileId(pictureOfAppleFile.Id),
+                        MessageContent.FromImageUrl(linkToPictureOfOrange),
+                    ]),
+            }
+        });
+
+        CollectionResult<StreamingUpdate> streamingUpdates = assistantClient.CreateRunStreaming(
+            thread,
+            assistant,
+            new RunCreationOptions()
+            {
+                AdditionalInstructions = "When possible, try to sneak in puns if you're asked to compare things.",
+            });
+
+        foreach (StreamingUpdate streamingUpdate in streamingUpdates)
+        {
+            if (streamingUpdate.UpdateKind == StreamingUpdateReason.RunCreated)
+            {
+                Console.WriteLine($"--- Run started!
---"); + } + if (streamingUpdate is MessageContentUpdate contentUpdate) + { + Console.Write(contentUpdate.Text); + } + } + + // Delete temporary resources, if desired + _ = fileClient.DeleteFile(pictureOfAppleFile.Id); + _ = assistantClient.DeleteThread(thread); + _ = assistantClient.DeleteAssistant(assistant); + } +} diff --git a/.dotnet/examples/Assistants/Example05_AssistantsWithVisionAsync.cs b/.dotnet/examples/Assistants/Example05_AssistantsWithVisionAsync.cs new file mode 100644 index 000000000..db23818dd --- /dev/null +++ b/.dotnet/examples/Assistants/Example05_AssistantsWithVisionAsync.cs @@ -0,0 +1,72 @@ +using NUnit.Framework; +using OpenAI.Assistants; +using OpenAI.Files; +using System; +using System.ClientModel; +using System.Threading.Tasks; + +namespace OpenAI.Examples; + +public partial class AssistantExamples +{ + [Test] + public async Task Example05_AssistantsWithVisionAsync() + { + // Assistants is a beta API and subject to change; acknowledge its experimental status by suppressing the matching warning. +#pragma warning disable OPENAI001 + OpenAIClient openAIClient = new(Environment.GetEnvironmentVariable("OPENAI_API_KEY")); + FileClient fileClient = openAIClient.GetFileClient(); + AssistantClient assistantClient = openAIClient.GetAssistantClient(); + + OpenAIFileInfo pictureOfAppleFile = await fileClient.UploadFileAsync( + "picture-of-apple.jpg", + FileUploadPurpose.Vision); + Uri linkToPictureOfOrange = new("https://platform.openai.com/fictitious-files/picture-of-orange.png"); + + Assistant assistant = await assistantClient.CreateAssistantAsync( + "gpt-4o", + new AssistantCreationOptions() + { + Instructions = "When asked a question, attempt to answer very concisely. " + + "Prefer one-sentence answers whenever feasible." + }); + + AssistantThread thread = await assistantClient.CreateThreadAsync(new ThreadCreationOptions() + { + InitialMessages = + { + new ThreadInitializationMessage( + MessageRole.User, + [ + "Hello, assistant! 
Please compare these two images for me:",
+                        MessageContent.FromImageFileId(pictureOfAppleFile.Id),
+                        MessageContent.FromImageUrl(linkToPictureOfOrange),
+                    ]),
+            }
+        });
+
+        AsyncCollectionResult<StreamingUpdate> streamingUpdates = assistantClient.CreateRunStreamingAsync(
+            thread,
+            assistant,
+            new RunCreationOptions()
+            {
+                AdditionalInstructions = "When possible, try to sneak in puns if you're asked to compare things.",
+            });
+
+        await foreach (StreamingUpdate streamingUpdate in streamingUpdates)
+        {
+            if (streamingUpdate.UpdateKind == StreamingUpdateReason.RunCreated)
+            {
+                Console.WriteLine($"--- Run started! ---");
+            }
+            if (streamingUpdate is MessageContentUpdate contentUpdate)
+            {
+                Console.Write(contentUpdate.Text);
+            }
+        }
+
+        _ = await fileClient.DeleteFileAsync(pictureOfAppleFile.Id);
+        _ = await assistantClient.DeleteThreadAsync(thread);
+        _ = await assistantClient.DeleteAssistantAsync(assistant);
+    }
+}
diff --git a/.dotnet/examples/Audio/Example01_SimpleTextToSpeech.cs b/.dotnet/examples/Audio/Example01_SimpleTextToSpeech.cs
new file mode 100644
index 000000000..8573152ff
--- /dev/null
+++ b/.dotnet/examples/Audio/Example01_SimpleTextToSpeech.cs
@@ -0,0 +1,26 @@
+using NUnit.Framework;
+using OpenAI.Audio;
+using System;
+using System.IO;
+
+namespace OpenAI.Examples;
+
+public partial class AudioExamples
+{
+    [Test]
+    public void Example01_SimpleTextToSpeech()
+    {
+        AudioClient client = new("tts-1", Environment.GetEnvironmentVariable("OPENAI_API_KEY"));
+
+        string input = "Overwatering is a common issue for those taking care of houseplants. To prevent it, it is"
+            + " crucial to allow the soil to dry out between waterings. Instead of watering on a fixed schedule,"
+            + " consider using a moisture meter to accurately gauge the soil’s wetness. Should the soil retain"
+            + " moisture, it is wise to postpone watering for a couple more days. 
When in doubt, it is often safer" + + " to water sparingly and maintain a less-is-more approach."; + + BinaryData speech = client.GenerateSpeech(input, GeneratedSpeechVoice.Alloy); + + using FileStream stream = File.OpenWrite($"{Guid.NewGuid()}.mp3"); + speech.ToStream().CopyTo(stream); + } +} diff --git a/.dotnet/examples/Audio/Example01_SimpleTextToSpeechAsync.cs b/.dotnet/examples/Audio/Example01_SimpleTextToSpeechAsync.cs new file mode 100644 index 000000000..7226ae739 --- /dev/null +++ b/.dotnet/examples/Audio/Example01_SimpleTextToSpeechAsync.cs @@ -0,0 +1,27 @@ +using NUnit.Framework; +using OpenAI.Audio; +using System; +using System.IO; +using System.Threading.Tasks; + +namespace OpenAI.Examples; + +public partial class AudioExamples +{ + [Test] + public async Task Example01_SimpleTextToSpeechAsync() + { + AudioClient client = new("tts-1", Environment.GetEnvironmentVariable("OPENAI_API_KEY")); + + string input = "Overwatering is a common issue for those taking care of houseplants. To prevent it, it is" + + " crucial to allow the soil to dry out between waterings. Instead of watering on a fixed schedule," + + " consider using a moisture meter to accurately gauge the soil’s wetness. Should the soil retain" + + " moisture, it is wise to postpone watering for a couple more days. 
When in doubt, it is often safer" + + " to water sparingly and maintain a less-is-more approach."; + + BinaryData speech = await client.GenerateSpeechAsync(input, GeneratedSpeechVoice.Alloy); + + using FileStream stream = File.OpenWrite($"{Guid.NewGuid()}.mp3"); + speech.ToStream().CopyTo(stream); + } +} diff --git a/.dotnet/examples/Audio/Example02_SimpleTranscription.cs b/.dotnet/examples/Audio/Example02_SimpleTranscription.cs new file mode 100644 index 000000000..d5596adcc --- /dev/null +++ b/.dotnet/examples/Audio/Example02_SimpleTranscription.cs @@ -0,0 +1,21 @@ +using NUnit.Framework; +using OpenAI.Audio; +using System; +using System.IO; + +namespace OpenAI.Examples; + +public partial class AudioExamples +{ + [Test] + public void Example02_SimpleTranscription() + { + AudioClient client = new("whisper-1", Environment.GetEnvironmentVariable("OPENAI_API_KEY")); + + string audioFilePath = Path.Combine("Assets", "audio_houseplant_care.mp3"); + + AudioTranscription transcription = client.TranscribeAudio(audioFilePath); + + Console.WriteLine($"{transcription.Text}"); + } +} diff --git a/.dotnet/examples/Audio/Example02_SimpleTranscriptionAsync.cs b/.dotnet/examples/Audio/Example02_SimpleTranscriptionAsync.cs new file mode 100644 index 000000000..8c46d947f --- /dev/null +++ b/.dotnet/examples/Audio/Example02_SimpleTranscriptionAsync.cs @@ -0,0 +1,22 @@ +using NUnit.Framework; +using OpenAI.Audio; +using System; +using System.IO; +using System.Threading.Tasks; + +namespace OpenAI.Examples; + +public partial class AudioExamples +{ + [Test] + public async Task Example02_SimpleTranscriptionAsync() + { + AudioClient client = new("whisper-1", Environment.GetEnvironmentVariable("OPENAI_API_KEY")); + + string audioFilePath = Path.Combine("Assets", "audio_houseplant_care.mp3"); + + AudioTranscription transcription = await client.TranscribeAudioAsync(audioFilePath); + + Console.WriteLine($"{transcription.Text}"); + } +} diff --git 
a/.dotnet/examples/Audio/Example03_VerboseTranscription.cs b/.dotnet/examples/Audio/Example03_VerboseTranscription.cs new file mode 100644 index 000000000..14e796c87 --- /dev/null +++ b/.dotnet/examples/Audio/Example03_VerboseTranscription.cs @@ -0,0 +1,42 @@ +using NUnit.Framework; +using OpenAI.Audio; +using System; +using System.IO; + +namespace OpenAI.Examples; + +public partial class AudioExamples +{ + [Test] + public void Example03_VerboseTranscription() + { + AudioClient client = new("whisper-1", Environment.GetEnvironmentVariable("OPENAI_API_KEY")); + + string audioFilePath = Path.Combine("Assets", "audio_houseplant_care.mp3"); + + AudioTranscriptionOptions options = new() + { + ResponseFormat = AudioTranscriptionFormat.Verbose, + Granularities = AudioTimestampGranularities.Word | AudioTimestampGranularities.Segment, + }; + + AudioTranscription transcription = client.TranscribeAudio(audioFilePath, options); + + Console.WriteLine("Transcription:"); + Console.WriteLine($"{transcription.Text}"); + + Console.WriteLine(); + Console.WriteLine($"Words:"); + foreach (TranscribedWord word in transcription.Words) + { + Console.WriteLine($" {word.Word,15} : {word.Start.TotalMilliseconds,5:0} - {word.End.TotalMilliseconds,5:0}"); + } + + Console.WriteLine(); + Console.WriteLine($"Segments:"); + foreach (TranscribedSegment segment in transcription.Segments) + { + Console.WriteLine($" {segment.Text,90} : {segment.Start.TotalMilliseconds,5:0} - {segment.End.TotalMilliseconds,5:0}"); + } + } +} diff --git a/.dotnet/examples/Audio/Example03_VerboseTrascriptionAsync.cs b/.dotnet/examples/Audio/Example03_VerboseTrascriptionAsync.cs new file mode 100644 index 000000000..570b82527 --- /dev/null +++ b/.dotnet/examples/Audio/Example03_VerboseTrascriptionAsync.cs @@ -0,0 +1,43 @@ +using NUnit.Framework; +using OpenAI.Audio; +using System; +using System.IO; +using System.Threading.Tasks; + +namespace OpenAI.Examples; + +public partial class AudioExamples +{ + [Test] + public async 
Task Example03_VerboseTranscriptionAsync() + { + AudioClient client = new("whisper-1", Environment.GetEnvironmentVariable("OPENAI_API_KEY")); + + string audioFilePath = Path.Combine("Assets", "audio_houseplant_care.mp3"); + + AudioTranscriptionOptions options = new() + { + ResponseFormat = AudioTranscriptionFormat.Verbose, + Granularities = AudioTimestampGranularities.Word | AudioTimestampGranularities.Segment, + }; + + AudioTranscription transcription = await client.TranscribeAudioAsync(audioFilePath, options); + + Console.WriteLine("Transcription:"); + Console.WriteLine($"{transcription.Text}"); + + Console.WriteLine(); + Console.WriteLine($"Words:"); + foreach (TranscribedWord word in transcription.Words) + { + Console.WriteLine($" {word.Word,15} : {word.Start.TotalMilliseconds,5:0} - {word.End.TotalMilliseconds,5:0}"); + } + + Console.WriteLine(); + Console.WriteLine($"Segments:"); + foreach (TranscribedSegment segment in transcription.Segments) + { + Console.WriteLine($" {segment.Text,90} : {segment.Start.TotalMilliseconds,5:0} - {segment.End.TotalMilliseconds,5:0}"); + } + } +} diff --git a/.dotnet/examples/Audio/Example04_SimpleTranslation.cs b/.dotnet/examples/Audio/Example04_SimpleTranslation.cs new file mode 100644 index 000000000..fa6fdebf5 --- /dev/null +++ b/.dotnet/examples/Audio/Example04_SimpleTranslation.cs @@ -0,0 +1,21 @@ +using NUnit.Framework; +using OpenAI.Audio; +using System; +using System.IO; + +namespace OpenAI.Examples; + +public partial class AudioExamples +{ + [Test] + public void Example04_SimpleTranslation() + { + AudioClient client = new("whisper-1", Environment.GetEnvironmentVariable("OPENAI_API_KEY")); + + string audioFilePath = Path.Combine("Assets", "audio_french.wav"); + + AudioTranslation translation = client.TranslateAudio(audioFilePath); + + Console.WriteLine($"{translation.Text}"); + } +} diff --git a/.dotnet/examples/Audio/Example04_SimpleTranslationAsync.cs b/.dotnet/examples/Audio/Example04_SimpleTranslationAsync.cs new 
file mode 100644 index 000000000..5578258d8 --- /dev/null +++ b/.dotnet/examples/Audio/Example04_SimpleTranslationAsync.cs @@ -0,0 +1,22 @@ +using NUnit.Framework; +using OpenAI.Audio; +using System; +using System.IO; +using System.Threading.Tasks; + +namespace OpenAI.Examples; + +public partial class AudioExamples +{ + [Test] + public async Task Example04_SimpleTranslationAsync() + { + AudioClient client = new("whisper-1", Environment.GetEnvironmentVariable("OPENAI_API_KEY")); + + string audioFilePath = Path.Combine("Assets", "audio_french.wav"); + + AudioTranslation translation = await client.TranslateAudioAsync(audioFilePath); + + Console.WriteLine($"{translation.Text}"); + } +} diff --git a/.dotnet/examples/Chat/Example01_SimpleChat.cs b/.dotnet/examples/Chat/Example01_SimpleChat.cs new file mode 100644 index 000000000..3715532d4 --- /dev/null +++ b/.dotnet/examples/Chat/Example01_SimpleChat.cs @@ -0,0 +1,18 @@ +using NUnit.Framework; +using OpenAI.Chat; +using System; + +namespace OpenAI.Examples; + +public partial class ChatExamples +{ + [Test] + public void Example01_SimpleChat() + { + ChatClient client = new(model: "gpt-4o", Environment.GetEnvironmentVariable("OPENAI_API_KEY")); + + ChatCompletion completion = client.CompleteChat("Say 'this is a test.'"); + + Console.WriteLine($"[ASSISTANT]: {completion}"); + } +} diff --git a/.dotnet/examples/Chat/Example01_SimpleChatAsync.cs b/.dotnet/examples/Chat/Example01_SimpleChatAsync.cs new file mode 100644 index 000000000..14b61d1b9 --- /dev/null +++ b/.dotnet/examples/Chat/Example01_SimpleChatAsync.cs @@ -0,0 +1,19 @@ +using NUnit.Framework; +using OpenAI.Chat; +using System; +using System.Threading.Tasks; + +namespace OpenAI.Examples; + +public partial class ChatExamples +{ + [Test] + public async Task Example01_SimpleChatAsync() + { + ChatClient client = new(model: "gpt-4o", Environment.GetEnvironmentVariable("OPENAI_API_KEY")); + + ChatCompletion completion = await client.CompleteChatAsync("Say 'this is a 
test.'"); + + Console.WriteLine($"{completion}"); + } +} diff --git a/.dotnet/examples/Chat/Example01_SimpleChat_Cancellations.cs b/.dotnet/examples/Chat/Example01_SimpleChat_Cancellations.cs new file mode 100644 index 000000000..aaf13df02 --- /dev/null +++ b/.dotnet/examples/Chat/Example01_SimpleChat_Cancellations.cs @@ -0,0 +1,40 @@ +using NUnit.Framework; +using OpenAI.Chat; +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Threading; + +namespace OpenAI.Examples; + +public partial class ChatExamples +{ + [Test] + public void Example01_SimpleChat_Cancellations() + { + ChatClient client = new(model: "gpt-4o", Environment.GetEnvironmentVariable("OPENAI_API_KEY")); + + CancellationTokenSource ct = new CancellationTokenSource(); + RequestOptions options = new() { CancellationToken = ct.Token }; + + ChatMessage message = ChatMessage.CreateUserMessage("Say 'this is a test.'"); + var body = new { + model = "gpt-4o", + messages = new[] { + new + { + role = "user", + content = "Say 'this is a test.'" + } + } + }; + + BinaryData json = BinaryData.FromObjectAsJson(body); + ClientResult result = client.CompleteChat(BinaryContent.Create(json), options); + + // The following code will be simplified in the future. 
+ var wireFormat = new ModelReaderWriterOptions("W"); + ChatCompletion completion = ModelReaderWriter.Read<ChatCompletion>(result.GetRawResponse().Content, wireFormat); + Console.WriteLine($"[ASSISTANT]: {completion}"); + } +} diff --git a/.dotnet/examples/Chat/Example02_SimpleChatStreaming.cs b/.dotnet/examples/Chat/Example02_SimpleChatStreaming.cs new file mode 100644 index 000000000..50b8938f0 --- /dev/null +++ b/.dotnet/examples/Chat/Example02_SimpleChatStreaming.cs @@ -0,0 +1,27 @@ +using NUnit.Framework; +using OpenAI.Chat; +using System; +using System.ClientModel; + +namespace OpenAI.Examples; + +public partial class ChatExamples +{ + [Test] + public void Example02_SimpleChatStreaming() + { + ChatClient client = new(model: "gpt-4o", Environment.GetEnvironmentVariable("OPENAI_API_KEY")); + + CollectionResult<StreamingChatCompletionUpdate> updates + = client.CompleteChatStreaming("Say 'this is a test.'"); + + Console.WriteLine($"[ASSISTANT]:"); + foreach (StreamingChatCompletionUpdate update in updates) + { + foreach (ChatMessageContentPart updatePart in update.ContentUpdate) + { + Console.Write(updatePart.Text); + } + } + } +} diff --git a/.dotnet/examples/Chat/Example02_SimpleChatStreamingAsync.cs b/.dotnet/examples/Chat/Example02_SimpleChatStreamingAsync.cs new file mode 100644 index 000000000..c22bb4d8f --- /dev/null +++ b/.dotnet/examples/Chat/Example02_SimpleChatStreamingAsync.cs @@ -0,0 +1,28 @@ +using NUnit.Framework; +using OpenAI.Chat; +using System; +using System.ClientModel; +using System.Threading.Tasks; + +namespace OpenAI.Examples; + +public partial class ChatExamples +{ + [Test] + public async Task Example02_SimpleChatStreamingAsync() + { + ChatClient client = new(model: "gpt-4o", Environment.GetEnvironmentVariable("OPENAI_API_KEY")); + + AsyncCollectionResult<StreamingChatCompletionUpdate> updates + = client.CompleteChatStreamingAsync("Say 'this is a test.'"); + + Console.WriteLine($"[ASSISTANT]:"); + await foreach (StreamingChatCompletionUpdate update in updates) + { + foreach (ChatMessageContentPart updatePart in 
update.ContentUpdate) + { + Console.Write(updatePart.Text); + } + } + } +} diff --git a/.dotnet/examples/Chat/Example03_FunctionCalling.cs b/.dotnet/examples/Chat/Example03_FunctionCalling.cs new file mode 100644 index 000000000..9f725a412 --- /dev/null +++ b/.dotnet/examples/Chat/Example03_FunctionCalling.cs @@ -0,0 +1,186 @@ +using NUnit.Framework; +using OpenAI.Chat; +using System; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Examples; + +public partial class ChatExamples +{ + #region + private static string GetCurrentLocation() + { + // Call the location API here. + return "San Francisco"; + } + + private static string GetCurrentWeather(string location, string unit = "celsius") + { + // Call the weather API here. + return $"31 {unit}"; + } + #endregion + + #region + private static readonly ChatTool getCurrentLocationTool = ChatTool.CreateFunctionTool( + functionName: nameof(GetCurrentLocation), + functionDescription: "Get the user's current location" + ); + + private static readonly ChatTool getCurrentWeatherTool = ChatTool.CreateFunctionTool( + functionName: nameof(GetCurrentWeather), + functionDescription: "Get the current weather in a given location", + functionParameters: BinaryData.FromString(""" + { + "type": "object", + "properties": { + "location": { + "type": "string", + "description": "The city and state, e.g. Boston, MA" + }, + "unit": { + "type": "string", + "enum": [ "celsius", "fahrenheit" ], + "description": "The temperature unit to use. Infer this from the specified location." 
+ } + }, + "required": [ "location" ] + } + """) + ); + #endregion + + [Test] + public void Example03_FunctionCalling() + { + ChatClient client = new("gpt-4-turbo", Environment.GetEnvironmentVariable("OPENAI_API_KEY")); + + #region + List<ChatMessage> messages = [ + new UserChatMessage("What's the weather like today?"), + ]; + + ChatCompletionOptions options = new() + { + Tools = { getCurrentLocationTool, getCurrentWeatherTool }, + }; + #endregion + + #region + bool requiresAction; + + do + { + requiresAction = false; + ChatCompletion chatCompletion = client.CompleteChat(messages, options); + + switch (chatCompletion.FinishReason) + { + case ChatFinishReason.Stop: + { + // Add the assistant message to the conversation history. + messages.Add(new AssistantChatMessage(chatCompletion)); + break; + } + + case ChatFinishReason.ToolCalls: + { + // First, add the assistant message with tool calls to the conversation history. + messages.Add(new AssistantChatMessage(chatCompletion)); + + // Then, add a new tool message for each tool call that is resolved. + foreach (ChatToolCall toolCall in chatCompletion.ToolCalls) + { + switch (toolCall.FunctionName) + { + case nameof(GetCurrentLocation): + { + string toolResult = GetCurrentLocation(); + messages.Add(new ToolChatMessage(toolCall.Id, toolResult)); + break; + } + + case nameof(GetCurrentWeather): + { + // The arguments that the model wants to use to call the function are specified as a + // stringified JSON object based on the schema defined in the tool definition. Note that + // the model may hallucinate arguments too. Consequently, it is important to do the + // appropriate parsing and validation before calling the function. 
+ using JsonDocument argumentsJson = JsonDocument.Parse(toolCall.FunctionArguments); + bool hasLocation = argumentsJson.RootElement.TryGetProperty("location", out JsonElement location); + bool hasUnit = argumentsJson.RootElement.TryGetProperty("unit", out JsonElement unit); + + if (!hasLocation) + { + throw new ArgumentNullException(nameof(location), "The location argument is required."); + } + + string toolResult = hasUnit + ? GetCurrentWeather(location.GetString(), unit.GetString()) + : GetCurrentWeather(location.GetString()); + messages.Add(new ToolChatMessage(toolCall.Id, toolResult)); + break; + } + + default: + { + // Handle other unexpected calls. + throw new NotImplementedException(); + } + } + } + + requiresAction = true; + break; + } + + case ChatFinishReason.Length: + throw new NotImplementedException("Incomplete model output due to MaxTokens parameter or token limit exceeded."); + + case ChatFinishReason.ContentFilter: + throw new NotImplementedException("Omitted content due to a content filter flag."); + + case ChatFinishReason.FunctionCall: + throw new NotImplementedException("Deprecated in favor of tool calls."); + + default: + throw new NotImplementedException(chatCompletion.FinishReason.ToString()); + } + } while (requiresAction); + #endregion + + #region + foreach (ChatMessage requestMessage in messages) + { + switch (requestMessage) + { + case SystemChatMessage systemMessage: + Console.WriteLine($"[SYSTEM]:"); + Console.WriteLine($"{systemMessage.Content[0].Text}"); + Console.WriteLine(); + break; + + case UserChatMessage userMessage: + Console.WriteLine($"[USER]:"); + Console.WriteLine($"{userMessage.Content[0].Text}"); + Console.WriteLine(); + break; + + case AssistantChatMessage assistantMessage when assistantMessage.Content.Count > 0: + Console.WriteLine($"[ASSISTANT]:"); + Console.WriteLine($"{assistantMessage.Content[0].Text}"); + Console.WriteLine(); + break; + + case ToolChatMessage: + // Do not print any tool messages; let the assistant 
summarize the tool results instead. + break; + + default: + break; + } + } + #endregion + } +} diff --git a/.dotnet/examples/Chat/Example03_FunctionCallingAsync.cs b/.dotnet/examples/Chat/Example03_FunctionCallingAsync.cs new file mode 100644 index 000000000..e66464be1 --- /dev/null +++ b/.dotnet/examples/Chat/Example03_FunctionCallingAsync.cs @@ -0,0 +1,146 @@ +using NUnit.Framework; +using OpenAI.Chat; +using System; +using System.Collections.Generic; +using System.Text.Json; +using System.Threading.Tasks; + +namespace OpenAI.Examples; + +public partial class ChatExamples +{ + // See Example03_FunctionCalling.cs for the tool and function definitions. + + [Test] + public async Task Example03_FunctionCallingAsync() + { + ChatClient client = new("gpt-4-turbo", Environment.GetEnvironmentVariable("OPENAI_API_KEY")); + + #region + List<ChatMessage> messages = [ + new UserChatMessage("What's the weather like today?"), + ]; + + ChatCompletionOptions options = new() + { + Tools = { getCurrentLocationTool, getCurrentWeatherTool }, + }; + #endregion + + #region + bool requiresAction; + + do + { + requiresAction = false; + ChatCompletion chatCompletion = await client.CompleteChatAsync(messages, options); + + switch (chatCompletion.FinishReason) + { + case ChatFinishReason.Stop: + { + // Add the assistant message to the conversation history. + messages.Add(new AssistantChatMessage(chatCompletion)); + break; + } + + case ChatFinishReason.ToolCalls: + { + // First, add the assistant message with tool calls to the conversation history. + messages.Add(new AssistantChatMessage(chatCompletion)); + + // Then, add a new tool message for each tool call that is resolved. 
+ foreach (ChatToolCall toolCall in chatCompletion.ToolCalls) + { + switch (toolCall.FunctionName) + { + case nameof(GetCurrentLocation): + { + string toolResult = GetCurrentLocation(); + messages.Add(new ToolChatMessage(toolCall.Id, toolResult)); + break; + } + + case nameof(GetCurrentWeather): + { + // The arguments that the model wants to use to call the function are specified as a + // stringified JSON object based on the schema defined in the tool definition. Note that + // the model may hallucinate arguments too. Consequently, it is important to do the + // appropriate parsing and validation before calling the function. + using JsonDocument argumentsJson = JsonDocument.Parse(toolCall.FunctionArguments); + bool hasLocation = argumentsJson.RootElement.TryGetProperty("location", out JsonElement location); + bool hasUnit = argumentsJson.RootElement.TryGetProperty("unit", out JsonElement unit); + + if (!hasLocation) + { + throw new ArgumentNullException(nameof(location), "The location argument is required."); + } + + string toolResult = hasUnit + ? GetCurrentWeather(location.GetString(), unit.GetString()) + : GetCurrentWeather(location.GetString()); + messages.Add(new ToolChatMessage(toolCall.Id, toolResult)); + break; + } + + default: + { + // Handle other unexpected calls. 
+ throw new NotImplementedException(); + } + } + } + + requiresAction = true; + break; + } + + case ChatFinishReason.Length: + throw new NotImplementedException("Incomplete model output due to MaxTokens parameter or token limit exceeded."); + + case ChatFinishReason.ContentFilter: + throw new NotImplementedException("Omitted content due to a content filter flag."); + + case ChatFinishReason.FunctionCall: + throw new NotImplementedException("Deprecated in favor of tool calls."); + + default: + throw new NotImplementedException(chatCompletion.FinishReason.ToString()); + } + } while (requiresAction); + #endregion + + #region + foreach (ChatMessage requestMessage in messages) + { + switch (requestMessage) + { + case SystemChatMessage systemMessage: + Console.WriteLine($"[SYSTEM]:"); + Console.WriteLine($"{systemMessage.Content[0].Text}"); + Console.WriteLine(); + break; + + case UserChatMessage userMessage: + Console.WriteLine($"[USER]:"); + Console.WriteLine($"{userMessage.Content[0].Text}"); + Console.WriteLine(); + break; + + case AssistantChatMessage assistantMessage when assistantMessage.Content.Count > 0: + Console.WriteLine($"[ASSISTANT]:"); + Console.WriteLine($"{assistantMessage.Content[0].Text}"); + Console.WriteLine(); + break; + + case ToolChatMessage: + // Do not print any tool messages; let the assistant summarize the tool results instead. 
+ break; + + default: + break; + } + } + #endregion + } +} diff --git a/.dotnet/examples/Chat/Example04_FunctionCallingStreaming.cs b/.dotnet/examples/Chat/Example04_FunctionCallingStreaming.cs new file mode 100644 index 000000000..3f0770692 --- /dev/null +++ b/.dotnet/examples/Chat/Example04_FunctionCallingStreaming.cs @@ -0,0 +1,202 @@ +using NUnit.Framework; +using OpenAI.Chat; +using System; +using System.ClientModel; +using System.Collections.Generic; +using System.Text; +using System.Text.Json; + +namespace OpenAI.Examples; + +public partial class ChatExamples +{ + // See Example03_FunctionCalling.cs for the tool and function definitions. + + [Test] + public void Example04_FunctionCallingStreaming() + { + ChatClient client = new("gpt-4-turbo", Environment.GetEnvironmentVariable("OPENAI_API_KEY")); + + #region + List<ChatMessage> messages = [ + new UserChatMessage("What's the weather like today?"), + ]; + + ChatCompletionOptions options = new() + { + Tools = { getCurrentLocationTool, getCurrentWeatherTool }, + }; + #endregion + + #region + bool requiresAction; + + do + { + requiresAction = false; + Dictionary<int, string> indexToToolCallId = []; + Dictionary<int, string> indexToFunctionName = []; + Dictionary<int, StringBuilder> indexToFunctionArguments = []; + StringBuilder contentBuilder = new(); + CollectionResult<StreamingChatCompletionUpdate> chatUpdates + = client.CompleteChatStreaming(messages, options); + + foreach (StreamingChatCompletionUpdate chatUpdate in chatUpdates) + { + // Accumulate the text content as new updates arrive. + foreach (ChatMessageContentPart contentPart in chatUpdate.ContentUpdate) + { + contentBuilder.Append(contentPart.Text); + } + + // Build the tool calls as new updates arrive. + foreach (StreamingChatToolCallUpdate toolCallUpdate in chatUpdate.ToolCallUpdates) + { + // Keep track of which tool call ID belongs to this update index. + if (toolCallUpdate.Id is not null) + { + indexToToolCallId[toolCallUpdate.Index] = toolCallUpdate.Id; + } + + // Keep track of which function name belongs to this update index. 
+ if (toolCallUpdate.FunctionName is not null) + { + indexToFunctionName[toolCallUpdate.Index] = toolCallUpdate.FunctionName; + } + + // Keep track of which function arguments belong to this update index, + // and accumulate the arguments string as new updates arrive. + if (toolCallUpdate.FunctionArgumentsUpdate is not null) + { + StringBuilder argumentsBuilder + = indexToFunctionArguments.TryGetValue(toolCallUpdate.Index, out StringBuilder existingBuilder) + ? existingBuilder + : new StringBuilder(); + argumentsBuilder.Append(toolCallUpdate.FunctionArgumentsUpdate); + indexToFunctionArguments[toolCallUpdate.Index] = argumentsBuilder; + } + } + + switch (chatUpdate.FinishReason) + { + case ChatFinishReason.Stop: + { + // Add the assistant message to the conversation history. + messages.Add(new AssistantChatMessage(contentBuilder.ToString())); + break; + } + + case ChatFinishReason.ToolCalls: + { + // First, collect the accumulated function arguments into complete tool calls to be processed + List<ChatToolCall> toolCalls = []; + foreach ((int index, string toolCallId) in indexToToolCallId) + { + ChatToolCall toolCall = ChatToolCall.CreateFunctionToolCall( + toolCallId, + indexToFunctionName[index], + indexToFunctionArguments[index].ToString()); + + toolCalls.Add(toolCall); + } + + // Next, add the assistant message with tool calls to the conversation history. + string content = contentBuilder.Length > 0 ? contentBuilder.ToString() : null; + messages.Add(new AssistantChatMessage(toolCalls, content)); + + // Then, add a new tool message for each tool call to be resolved. 
+ foreach (ChatToolCall toolCall in toolCalls) + { + switch (toolCall.FunctionName) + { + case nameof(GetCurrentLocation): + { + string toolResult = GetCurrentLocation(); + messages.Add(new ToolChatMessage(toolCall.Id, toolResult)); + break; + } + + case nameof(GetCurrentWeather): + { + // The arguments that the model wants to use to call the function are specified as a + // stringified JSON object based on the schema defined in the tool definition. Note that + // the model may hallucinate arguments too. Consequently, it is important to do the + // appropriate parsing and validation before calling the function. + using JsonDocument argumentsJson = JsonDocument.Parse(toolCall.FunctionArguments); + bool hasLocation = argumentsJson.RootElement.TryGetProperty("location", out JsonElement location); + bool hasUnit = argumentsJson.RootElement.TryGetProperty("unit", out JsonElement unit); + + if (!hasLocation) + { + throw new ArgumentNullException(nameof(location), "The location argument is required."); + } + + string toolResult = hasUnit + ? GetCurrentWeather(location.GetString(), unit.GetString()) + : GetCurrentWeather(location.GetString()); + messages.Add(new ToolChatMessage(toolCall.Id, toolResult)); + break; + } + + default: + { + // Handle other unexpected calls. 
+ throw new NotImplementedException(); + } + } + } + + requiresAction = true; + break; + } + + case ChatFinishReason.Length: + throw new NotImplementedException("Incomplete model output due to MaxTokens parameter or token limit exceeded."); + + case ChatFinishReason.ContentFilter: + throw new NotImplementedException("Omitted content due to a content filter flag."); + + case ChatFinishReason.FunctionCall: + throw new NotImplementedException("Deprecated in favor of tool calls."); + + case null: + break; + } + } + } while (requiresAction); + #endregion + + #region + foreach (ChatMessage requestMessage in messages) + { + switch (requestMessage) + { + case SystemChatMessage systemMessage: + Console.WriteLine($"[SYSTEM]:"); + Console.WriteLine($"{systemMessage.Content[0].Text}"); + Console.WriteLine(); + break; + + case UserChatMessage userMessage: + Console.WriteLine($"[USER]:"); + Console.WriteLine($"{userMessage.Content[0].Text}"); + Console.WriteLine(); + break; + + case AssistantChatMessage assistantMessage when assistantMessage.Content.Count > 0: + Console.WriteLine($"[ASSISTANT]:"); + Console.WriteLine($"{assistantMessage.Content[0].Text}"); + Console.WriteLine(); + break; + + case ToolChatMessage: + // Do not print any tool messages; let the assistant summarize the tool results instead. 
+ break; + + default: + break; + } + } + #endregion + } +} diff --git a/.dotnet/examples/Chat/Example04_FunctionCallingStreamingAsync.cs b/.dotnet/examples/Chat/Example04_FunctionCallingStreamingAsync.cs new file mode 100644 index 000000000..6fac2c494 --- /dev/null +++ b/.dotnet/examples/Chat/Example04_FunctionCallingStreamingAsync.cs @@ -0,0 +1,203 @@ +using NUnit.Framework; +using OpenAI.Chat; +using System; +using System.ClientModel; +using System.Collections.Generic; +using System.Text; +using System.Text.Json; +using System.Threading.Tasks; + +namespace OpenAI.Examples; + +public partial class ChatExamples +{ + // See Example03_FunctionCalling.cs for the tool and function definitions. + + [Test] + public async Task Example04_FunctionCallingStreamingAsync() + { + ChatClient client = new("gpt-4-turbo", Environment.GetEnvironmentVariable("OPENAI_API_KEY")); + + #region + List<ChatMessage> messages = [ + new UserChatMessage("What's the weather like today?"), + ]; + + ChatCompletionOptions options = new() + { + Tools = { getCurrentLocationTool, getCurrentWeatherTool }, + }; + #endregion + + #region + bool requiresAction; + + do + { + requiresAction = false; + Dictionary<int, string> indexToToolCallId = []; + Dictionary<int, string> indexToFunctionName = []; + Dictionary<int, StringBuilder> indexToFunctionArguments = []; + StringBuilder contentBuilder = new(); + AsyncCollectionResult<StreamingChatCompletionUpdate> chatUpdates + = client.CompleteChatStreamingAsync(messages, options); + + await foreach (StreamingChatCompletionUpdate chatUpdate in chatUpdates) + { + // Accumulate the text content as new updates arrive. + foreach (ChatMessageContentPart contentPart in chatUpdate.ContentUpdate) + { + contentBuilder.Append(contentPart.Text); + } + + // Build the tool calls as new updates arrive. + foreach (StreamingChatToolCallUpdate toolCallUpdate in chatUpdate.ToolCallUpdates) + { + // Keep track of which tool call ID belongs to this update index. 
+ if (toolCallUpdate.Id is not null) + { + indexToToolCallId[toolCallUpdate.Index] = toolCallUpdate.Id; + } + + // Keep track of which function name belongs to this update index. + if (toolCallUpdate.FunctionName is not null) + { + indexToFunctionName[toolCallUpdate.Index] = toolCallUpdate.FunctionName; + } + + // Keep track of which function arguments belong to this update index, + // and accumulate the arguments string as new updates arrive. + if (toolCallUpdate.FunctionArgumentsUpdate is not null) + { + StringBuilder argumentsBuilder + = indexToFunctionArguments.TryGetValue(toolCallUpdate.Index, out StringBuilder existingBuilder) + ? existingBuilder + : new StringBuilder(); + argumentsBuilder.Append(toolCallUpdate.FunctionArgumentsUpdate); + indexToFunctionArguments[toolCallUpdate.Index] = argumentsBuilder; + } + } + + switch (chatUpdate.FinishReason) + { + case ChatFinishReason.Stop: + { + // Add the assistant message to the conversation history. + messages.Add(new AssistantChatMessage(contentBuilder.ToString())); + break; + } + + case ChatFinishReason.ToolCalls: + { + // First, collect the accumulated function arguments into complete tool calls to be processed + List<ChatToolCall> toolCalls = []; + foreach ((int index, string toolCallId) in indexToToolCallId) + { + ChatToolCall toolCall = ChatToolCall.CreateFunctionToolCall( + toolCallId, + indexToFunctionName[index], + indexToFunctionArguments[index].ToString()); + + toolCalls.Add(toolCall); + } + + // Next, add the assistant message with tool calls to the conversation history. + string content = contentBuilder.Length > 0 ? contentBuilder.ToString() : null; + messages.Add(new AssistantChatMessage(toolCalls, content)); + + // Then, add a new tool message for each tool call to be resolved. 
+ foreach (ChatToolCall toolCall in toolCalls) + { + switch (toolCall.FunctionName) + { + case nameof(GetCurrentLocation): + { + string toolResult = GetCurrentLocation(); + messages.Add(new ToolChatMessage(toolCall.Id, toolResult)); + break; + } + + case nameof(GetCurrentWeather): + { + // The arguments that the model wants to use to call the function are specified as a + // stringified JSON object based on the schema defined in the tool definition. Note that + // the model may hallucinate arguments too. Consequently, it is important to do the + // appropriate parsing and validation before calling the function. + using JsonDocument argumentsJson = JsonDocument.Parse(toolCall.FunctionArguments); + bool hasLocation = argumentsJson.RootElement.TryGetProperty("location", out JsonElement location); + bool hasUnit = argumentsJson.RootElement.TryGetProperty("unit", out JsonElement unit); + + if (!hasLocation) + { + throw new ArgumentNullException(nameof(location), "The location argument is required."); + } + + string toolResult = hasUnit + ? GetCurrentWeather(location.GetString(), unit.GetString()) + : GetCurrentWeather(location.GetString()); + messages.Add(new ToolChatMessage(toolCall.Id, toolResult)); + break; + } + + default: + { + // Handle other unexpected calls. 
+ throw new NotImplementedException(); + } + } + } + + requiresAction = true; + break; + } + + case ChatFinishReason.Length: + throw new NotImplementedException("Incomplete model output due to MaxTokens parameter or token limit exceeded."); + + case ChatFinishReason.ContentFilter: + throw new NotImplementedException("Omitted content due to a content filter flag."); + + case ChatFinishReason.FunctionCall: + throw new NotImplementedException("Deprecated in favor of tool calls."); + + case null: + break; + } + } + } while (requiresAction); + #endregion + + #region + foreach (ChatMessage requestMessage in messages) + { + switch (requestMessage) + { + case SystemChatMessage systemMessage: + Console.WriteLine($"[SYSTEM]:"); + Console.WriteLine($"{systemMessage.Content[0].Text}"); + Console.WriteLine(); + break; + + case UserChatMessage userMessage: + Console.WriteLine($"[USER]:"); + Console.WriteLine($"{userMessage.Content[0].Text}"); + Console.WriteLine(); + break; + + case AssistantChatMessage assistantMessage when assistantMessage.Content.Count > 0: + Console.WriteLine($"[ASSISTANT]:"); + Console.WriteLine($"{assistantMessage.Content[0].Text}"); + Console.WriteLine(); + break; + + case ToolChatMessage: + // Do not print any tool messages; let the assistant summarize the tool results instead. 
+ break; + + default: + break; + } + } + #endregion + } +} diff --git a/.dotnet/examples/Chat/Example05_ChatWithVision.cs b/.dotnet/examples/Chat/Example05_ChatWithVision.cs new file mode 100644 index 000000000..dbe8543ea --- /dev/null +++ b/.dotnet/examples/Chat/Example05_ChatWithVision.cs @@ -0,0 +1,31 @@ +using NUnit.Framework; +using OpenAI.Chat; +using System; +using System.Collections.Generic; +using System.IO; + +namespace OpenAI.Examples; + +public partial class ChatExamples +{ + [Test] + public void Example05_ChatWithVision() + { + ChatClient client = new("gpt-4o", Environment.GetEnvironmentVariable("OPENAI_API_KEY")); + + string imageFilePath = Path.Combine("Assets", "images_dog_and_cat.png"); + using Stream imageStream = File.OpenRead(imageFilePath); + BinaryData imageBytes = BinaryData.FromStream(imageStream); + + List<ChatMessage> messages = [ + new UserChatMessage( + ChatMessageContentPart.CreateTextMessageContentPart("Please describe the following image."), + ChatMessageContentPart.CreateImageMessageContentPart(imageBytes, "image/png")) + ]; + + ChatCompletion chatCompletion = client.CompleteChat(messages); + + Console.WriteLine($"[ASSISTANT]:"); + Console.WriteLine($"{chatCompletion.Content[0].Text}"); + } +} \ No newline at end of file diff --git a/.dotnet/examples/Chat/Example05_ChatWithVisionAsync.cs b/.dotnet/examples/Chat/Example05_ChatWithVisionAsync.cs new file mode 100644 index 000000000..c8f4c05c0 --- /dev/null +++ b/.dotnet/examples/Chat/Example05_ChatWithVisionAsync.cs @@ -0,0 +1,32 @@ +using NUnit.Framework; +using OpenAI.Chat; +using System; +using System.Collections.Generic; +using System.IO; +using System.Threading.Tasks; + +namespace OpenAI.Examples; + +public partial class ChatExamples +{ + [Test] + public async Task Example05_ChatWithVisionAsync() + { + ChatClient client = new("gpt-4o", Environment.GetEnvironmentVariable("OPENAI_API_KEY")); + + string imageFilePath = Path.Combine("Assets", "images_dog_and_cat.png"); + using Stream imageStream = 
File.OpenRead(imageFilePath); + BinaryData imageBytes = BinaryData.FromStream(imageStream); + + List<ChatMessage> messages = [ + new UserChatMessage( + ChatMessageContentPart.CreateTextMessageContentPart("Please describe the following image."), + ChatMessageContentPart.CreateImageMessageContentPart(imageBytes, "image/png")) + ]; + + ChatCompletion chatCompletion = await client.CompleteChatAsync(messages); + + Console.WriteLine($"[ASSISTANT]:"); + Console.WriteLine($"{chatCompletion.Content[0].Text}"); + } +} \ No newline at end of file diff --git a/.dotnet/examples/Chat/Example06_SimpleChatProtocol.cs b/.dotnet/examples/Chat/Example06_SimpleChatProtocol.cs new file mode 100644 index 000000000..e8682bfd5 --- /dev/null +++ b/.dotnet/examples/Chat/Example06_SimpleChatProtocol.cs @@ -0,0 +1,42 @@ +using NUnit.Framework; +using OpenAI.Chat; +using System; +using System.ClientModel; +using System.Text.Json; + +namespace OpenAI.Examples; + +public partial class ChatExamples +{ + [Test] + public void Example06_SimpleChatProtocol() + { + ChatClient client = new("gpt-4o", Environment.GetEnvironmentVariable("OPENAI_API_KEY")); + + BinaryData input = BinaryData.FromBytes(""" + { + "model": "gpt-4o", + "messages": [ + { + "role": "user", + "content": "How does AI work? Explain it in simple terms." 
+ } + ] + } + """u8.ToArray()); + + using BinaryContent content = BinaryContent.Create(input); + ClientResult result = client.CompleteChat(content); + BinaryData output = result.GetRawResponse().Content; + + using JsonDocument outputAsJson = JsonDocument.Parse(output.ToString()); + string message = outputAsJson.RootElement + .GetProperty("choices"u8)[0] + .GetProperty("message"u8) + .GetProperty("content"u8) + .GetString(); + + Console.WriteLine($"[ASSISTANT]:"); + Console.WriteLine($"{message}"); + } +} diff --git a/.dotnet/examples/Chat/Example06_SimpleChatProtocolAsync.cs b/.dotnet/examples/Chat/Example06_SimpleChatProtocolAsync.cs new file mode 100644 index 000000000..8e363e965 --- /dev/null +++ b/.dotnet/examples/Chat/Example06_SimpleChatProtocolAsync.cs @@ -0,0 +1,43 @@ +using NUnit.Framework; +using OpenAI.Chat; +using System; +using System.ClientModel; +using System.Text.Json; +using System.Threading.Tasks; + +namespace OpenAI.Examples; + +public partial class ChatExamples +{ + [Test] + public async Task Example06_SimpleChatProtocolAsync() + { + ChatClient client = new("gpt-4o", Environment.GetEnvironmentVariable("OPENAI_API_KEY")); + + BinaryData input = BinaryData.FromBytes(""" + { + "model": "gpt-4o", + "messages": [ + { + "role": "user", + "content": "How does AI work? Explain it in simple terms." 
+ } + ] + } + """u8.ToArray()); + + using BinaryContent content = BinaryContent.Create(input); + ClientResult result = await client.CompleteChatAsync(content); + BinaryData output = result.GetRawResponse().Content; + + using JsonDocument outputAsJson = JsonDocument.Parse(output.ToString()); + string message = outputAsJson.RootElement + .GetProperty("choices"u8)[0] + .GetProperty("message"u8) + .GetProperty("content"u8) + .GetString(); + + Console.WriteLine($"[ASSISTANT]:"); + Console.WriteLine($"{message}"); + } +} diff --git a/.dotnet/examples/Chat/Example07_StructuredOutputs.cs b/.dotnet/examples/Chat/Example07_StructuredOutputs.cs new file mode 100644 index 000000000..52669ebe7 --- /dev/null +++ b/.dotnet/examples/Chat/Example07_StructuredOutputs.cs @@ -0,0 +1,59 @@ +using NUnit.Framework; +using OpenAI.Chat; +using System; +using System.Text.Json; + +namespace OpenAI.Examples; + +public partial class ChatExamples +{ + [Test] + public void Example07_StructuredOutputs() + { + ChatClient client = new("gpt-4o-mini", Environment.GetEnvironmentVariable("OPENAI_API_KEY")); + + ChatCompletionOptions options = new() + { + ResponseFormat = ChatResponseFormat.CreateJsonSchemaFormat( + name: "math_reasoning", + jsonSchema: BinaryData.FromString(""" + { + "type": "object", + "properties": { + "steps": { + "type": "array", + "items": { + "type": "object", + "properties": { + "explanation": { "type": "string" }, + "output": { "type": "string" } + }, + "required": ["explanation", "output"], + "additionalProperties": false + } + }, + "final_answer": { "type": "string" } + }, + "required": ["steps", "final_answer"], + "additionalProperties": false + } + """), + strictSchemaEnabled: true) + }; + + ChatCompletion chatCompletion = client.CompleteChat( + ["How can I solve 8x + 7 = -23?"], + options); + + using JsonDocument structuredJson = JsonDocument.Parse(chatCompletion.ToString()); + + Console.WriteLine($"Final answer: 
{structuredJson.RootElement.GetProperty("final_answer").GetString()}"); + Console.WriteLine("Reasoning steps:"); + + foreach (JsonElement stepElement in structuredJson.RootElement.GetProperty("steps").EnumerateArray()) + { + Console.WriteLine($" - Explanation: {stepElement.GetProperty("explanation").GetString()}"); + Console.WriteLine($" Output: {stepElement.GetProperty("output")}"); + } + } +} diff --git a/.dotnet/examples/Chat/Example07_StructuredOutputsAsync.cs b/.dotnet/examples/Chat/Example07_StructuredOutputsAsync.cs new file mode 100644 index 000000000..280b67bcd --- /dev/null +++ b/.dotnet/examples/Chat/Example07_StructuredOutputsAsync.cs @@ -0,0 +1,60 @@ +using NUnit.Framework; +using OpenAI.Chat; +using System; +using System.Text.Json; +using System.Threading.Tasks; + +namespace OpenAI.Examples; + +public partial class ChatExamples +{ + [Test] + public async Task Example07_StructuredOutputsAsync() + { + ChatClient client = new("gpt-4o-mini", Environment.GetEnvironmentVariable("OPENAI_API_KEY")); + + ChatCompletionOptions options = new() + { + ResponseFormat = ChatResponseFormat.CreateJsonSchemaFormat( + name: "math_reasoning", + jsonSchema: BinaryData.FromString(""" + { + "type": "object", + "properties": { + "steps": { + "type": "array", + "items": { + "type": "object", + "properties": { + "explanation": { "type": "string" }, + "output": { "type": "string" } + }, + "required": ["explanation", "output"], + "additionalProperties": false + } + }, + "final_answer": { "type": "string" } + }, + "required": ["steps", "final_answer"], + "additionalProperties": false + } + """), + strictSchemaEnabled: true) + }; + + ChatCompletion chatCompletion = await client.CompleteChatAsync( + ["How can I solve 8x + 7 = -23?"], + options); + + using JsonDocument structuredJson = JsonDocument.Parse(chatCompletion.ToString()); + + Console.WriteLine($"Final answer: {structuredJson.RootElement.GetProperty("final_answer").GetString()}"); + Console.WriteLine("Reasoning steps:"); + 
+ foreach (JsonElement stepElement in structuredJson.RootElement.GetProperty("steps").EnumerateArray()) + { + Console.WriteLine($" - Explanation: {stepElement.GetProperty("explanation").GetString()}"); + Console.WriteLine($" Output: {stepElement.GetProperty("output")}"); + } + } +} diff --git a/.dotnet/examples/ClientExamples.cs b/.dotnet/examples/ClientExamples.cs new file mode 100644 index 000000000..dff74f119 --- /dev/null +++ b/.dotnet/examples/ClientExamples.cs @@ -0,0 +1,49 @@ +using NUnit.Framework; +using OpenAI.Assistants; +using OpenAI.Audio; +using OpenAI.Chat; +using OpenAI.Embeddings; +using OpenAI.Files; +using OpenAI.Images; +using System; + +namespace OpenAI.Examples.Miscellaneous; + +public partial class ClientExamples +{ + [Test] + public void CreateChatClient() + { + ChatClient client = new("gpt-4o", Environment.GetEnvironmentVariable("OPENAI_API_KEY")); + } + + [Test] + public void CreateEmbeddingClient() + { + EmbeddingClient client = new("text-embedding-3-small", Environment.GetEnvironmentVariable("OPENAI_API_KEY")); + } + + [Test] + public void CreateImageClient() + { + ImageClient client = new("dall-e-3", Environment.GetEnvironmentVariable("OPENAI_API_KEY")); + } + + [Test] + public void CreateMultipleAudioClients() + { + OpenAIClient client = new(Environment.GetEnvironmentVariable("OPENAI_API_KEY")); + AudioClient ttsClient = client.GetAudioClient("tts-1"); + AudioClient whisperClient = client.GetAudioClient("whisper-1"); + } + + [Test] + public void CreateAssistantAndFileClients() + { + OpenAIClient openAIClient = new(Environment.GetEnvironmentVariable("OPENAI_API_KEY")); + FileClient fileClient = openAIClient.GetFileClient(); +#pragma warning disable OPENAI001 + AssistantClient assistantClient = openAIClient.GetAssistantClient(); +#pragma warning restore OPENAI001 + } +} diff --git a/.dotnet/examples/CombinationExamples.cs b/.dotnet/examples/CombinationExamples.cs new file mode 100644 index 000000000..7987d81f2 --- /dev/null +++ 
b/.dotnet/examples/CombinationExamples.cs @@ -0,0 +1,150 @@ +using NUnit.Framework; +using OpenAI.Audio; +using OpenAI.Chat; +using OpenAI.Images; +using System; +using System.ClientModel; +using System.IO; +using System.Threading.Tasks; + +namespace OpenAI.Examples.Miscellaneous; + +public partial class CombinationExamples +{ + [Test] + public void AlpacaArtAssessor() + { + // First, we create an image using dall-e-3: + ImageClient imageClient = new("dall-e-3", Environment.GetEnvironmentVariable("OPENAI_API_KEY")); + ClientResult<GeneratedImage> imageResult = imageClient.GenerateImage( + "a majestic alpaca on a mountain ridge, backed by an expansive blue sky accented with sparse clouds", + new() + { + Style = GeneratedImageStyle.Vivid, + Quality = GeneratedImageQuality.High, + Size = GeneratedImageSize.W1792xH1024, + }); + GeneratedImage imageGeneration = imageResult.Value; + Console.WriteLine($"Majestic alpaca available at:\n{imageGeneration.ImageUri.AbsoluteUri}"); + + // Now, we'll ask a cranky art critic to evaluate the image using gpt-4o-mini: + ChatClient chatClient = new("gpt-4o-mini", Environment.GetEnvironmentVariable("OPENAI_API_KEY")); + ChatCompletion chatCompletion = chatClient.CompleteChat( + [ + new SystemChatMessage("Assume the role of a cranky art critic. When asked to describe or " + + "evaluate imagery, focus on criticizing elements of subject, composition, and other details."), + new UserChatMessage( + ChatMessageContentPart.CreateTextMessageContentPart("describe the following image in a few sentences"), + ChatMessageContentPart.CreateImageMessageContentPart(imageGeneration.ImageUri)), + ], + new ChatCompletionOptions() + { + MaxTokens = 2048, + } + ); + + string chatResponseText = chatCompletion.Content[0].Text; + Console.WriteLine($"Art critique of majestic alpaca:\n{chatResponseText}"); + + // Finally, we'll get some text-to-speech for that critical evaluation using tts-1-hd: + AudioClient audioClient = new("tts-1-hd", Environment.GetEnvironmentVariable("OPENAI_API_KEY")); + ClientResult<BinaryData> ttsResult = audioClient.GenerateSpeech( + text: chatResponseText, + GeneratedSpeechVoice.Fable, + new SpeechGenerationOptions() + { + Speed = 0.9f, + ResponseFormat = GeneratedSpeechFormat.Opus, + }); + FileInfo ttsFileInfo = new($"{chatCompletion.Id}.opus"); + using (FileStream ttsFileStream = ttsFileInfo.Create()) + using (BinaryWriter ttsFileWriter = new(ttsFileStream)) + { + ttsFileWriter.Write(ttsResult.Value.ToArray()); + } + Console.WriteLine($"Alpaca evaluation audio available at:\n{new Uri(ttsFileInfo.FullName).AbsoluteUri}"); + } + + [Test] + public async Task CuriousCreatureCreator() + { + // First, we'll use gpt-4o-mini to have a creative helper imagine a twist on a household pet + ChatClient creativeWriterClient = new("gpt-4o-mini", Environment.GetEnvironmentVariable("OPENAI_API_KEY")); + ClientResult<ChatCompletion> creativeWriterResult = creativeWriterClient.CompleteChat( + [ + new SystemChatMessage("You're a creative helper that specializes in brainstorming designs for concepts that fuse ordinary, mundane items with a fantastical touch. In particular, you can provide good one-paragraph descriptions of concept images."), + new UserChatMessage("Imagine a household pet. Now add in a subtle touch of magic or 'different'. What do you imagine? Provide a one-paragraph description of a picture of this new creature, focusing on the details of the imagery such that it'd be suitable for creating a picture."), + ], + new ChatCompletionOptions() + { + MaxTokens = 2048, + }); + string description = creativeWriterResult.Value.Content[0].Text; + Console.WriteLine($"Creative helper's creature description:\n{description}"); + + // Asynchronously, in parallel to the next steps, we'll get the creative description in the voice of Onyx + AudioClient ttsClient = new("tts-1-hd", Environment.GetEnvironmentVariable("OPENAI_API_KEY")); + Task<ClientResult<BinaryData>> imageDescriptionAudioTask = ttsClient.GenerateSpeechAsync( + description, + GeneratedSpeechVoice.Onyx, + new SpeechGenerationOptions() + { + Speed = 1.1f, + ResponseFormat = GeneratedSpeechFormat.Opus, + }); + _ = Task.Run(async () => + { + ClientResult<BinaryData> audioResult = await imageDescriptionAudioTask; + FileInfo audioFileInfo = new FileInfo($"{creativeWriterResult.Value.Id}-description.opus"); + using FileStream fileStream = audioFileInfo.Create(); + using BinaryWriter fileWriter = new(fileStream); + fileWriter.Write(audioResult.Value.ToArray()); + Console.WriteLine($"Spoken description available at:\n{new Uri(audioFileInfo.FullName).AbsoluteUri}"); + }); + + // Meanwhile, we'll use dall-e-3 to generate a rendition of our LLM artist's vision + ImageClient imageGenerationClient = new("dall-e-3", Environment.GetEnvironmentVariable("OPENAI_API_KEY")); + ClientResult<GeneratedImage> imageGenerationResult = await imageGenerationClient.GenerateImageAsync( + description, + new ImageGenerationOptions() + { + Size = GeneratedImageSize.W1792xH1024, + Quality = GeneratedImageQuality.High, + }); + Uri imageLocation = imageGenerationResult.Value.ImageUri; + Console.WriteLine($"Creature image available at:\n{imageLocation.AbsoluteUri}"); + + // Now, we'll use gpt-4o-mini to get a hopelessly taken assessment from a usually exigent art connoisseur + ChatClient imageCriticClient = new("gpt-4o-mini", Environment.GetEnvironmentVariable("OPENAI_API_KEY")); + ClientResult<ChatCompletion> criticalAppraisalResult = await imageCriticClient.CompleteChatAsync( + [ + new SystemChatMessage("Assume the role of an art critic. Although usually cranky and occasionally even referred to as a 'curmudgeon', you're somehow entirely smitten with the subject presented to you and, despite your best efforts, can't help but lavish praise when you're asked to appraise a provided image."), + new UserChatMessage( + ChatMessageContentPart.CreateTextMessageContentPart("Evaluate this image for me. What is it, and what do you think of it?"), + ChatMessageContentPart.CreateImageMessageContentPart(imageLocation)), + ], + new ChatCompletionOptions() + { + MaxTokens = 2048, + }); + string appraisal = criticalAppraisalResult.Value.Content[0].Text; + Console.WriteLine($"Critic's appraisal:\n{appraisal}"); + + // Finally, we'll get that art expert's laudations in the voice of Fable + ClientResult<BinaryData> appraisalAudioResult = await ttsClient.GenerateSpeechAsync( + appraisal, + GeneratedSpeechVoice.Fable, + new SpeechGenerationOptions() + { + Speed = 0.9f, + ResponseFormat = GeneratedSpeechFormat.Opus, + }); + FileInfo criticAudioFileInfo = new($"{criticalAppraisalResult.Value.Id}-appraisal.opus"); + using (FileStream criticStream = criticAudioFileInfo.Create()) + using (BinaryWriter criticFileWriter = new(criticStream)) + { + criticFileWriter.Write(appraisalAudioResult.Value.ToArray()); + } + Console.WriteLine($"Critical appraisal available at:\n{new Uri(criticAudioFileInfo.FullName).AbsoluteUri}"); + } +} diff --git a/.dotnet/examples/Directory.Build.targets b/.dotnet/examples/Directory.Build.targets new file mode 100644 index 000000000..9108ca3f9 --- /dev/null +++ b/.dotnet/examples/Directory.Build.targets @@ -0,0 +1,7 @@ + + + + PreserveNewest + + + \ No newline at end of file diff --git a/.dotnet/examples/Embeddings/Example01_SimpleEmbedding.cs b/.dotnet/examples/Embeddings/Example01_SimpleEmbedding.cs new file mode 100644
index 000000000..0eaacb7c0 --- /dev/null +++ b/.dotnet/examples/Embeddings/Example01_SimpleEmbedding.cs @@ -0,0 +1,28 @@ +using NUnit.Framework; +using OpenAI.Embeddings; +using System; + +namespace OpenAI.Examples; + +public partial class EmbeddingExamples +{ + [Test] + public void Example01_SimpleEmbedding() + { + EmbeddingClient client = new("text-embedding-3-small", Environment.GetEnvironmentVariable("OPENAI_API_KEY")); + + string description = "Best hotel in town if you like luxury hotels. They have an amazing infinity pool, a spa," + + " and a really helpful concierge. The location is perfect -- right downtown, close to all the tourist" + + " attractions. We highly recommend this hotel."; + + Embedding embedding = client.GenerateEmbedding(description); + ReadOnlyMemory<float> vector = embedding.Vector; + + Console.WriteLine($"Dimension: {vector.Length}"); + Console.WriteLine($"Floats: "); + for (int i = 0; i < vector.Length; i++) + { + Console.WriteLine($" [{i,4}] = {vector.Span[i]}"); + } + } +} diff --git a/.dotnet/examples/Embeddings/Example01_SimpleEmbeddingAsync.cs b/.dotnet/examples/Embeddings/Example01_SimpleEmbeddingAsync.cs new file mode 100644 index 000000000..0a81b5875 --- /dev/null +++ b/.dotnet/examples/Embeddings/Example01_SimpleEmbeddingAsync.cs @@ -0,0 +1,29 @@ +using NUnit.Framework; +using OpenAI.Embeddings; +using System; +using System.Threading.Tasks; + +namespace OpenAI.Examples; + +public partial class EmbeddingExamples +{ + [Test] + public async Task Example01_SimpleEmbeddingAsync() + { + EmbeddingClient client = new("text-embedding-3-small", Environment.GetEnvironmentVariable("OPENAI_API_KEY")); + + string description = "Best hotel in town if you like luxury hotels. They have an amazing infinity pool, a spa," + + " and a really helpful concierge. The location is perfect -- right downtown, close to all the tourist" + + " attractions. We highly recommend this hotel."; + + Embedding embedding = await client.GenerateEmbeddingAsync(description); + ReadOnlyMemory<float> vector = embedding.Vector; + + Console.WriteLine($"Dimension: {vector.Length}"); + Console.WriteLine($"Floats: "); + for (int i = 0; i < vector.Length; i++) + { + Console.WriteLine($" [{i,4}] = {vector.Span[i]}"); + } + } +} diff --git a/.dotnet/examples/Embeddings/Example02_EmbeddingWithOptions.cs b/.dotnet/examples/Embeddings/Example02_EmbeddingWithOptions.cs new file mode 100644 index 000000000..16e744f28 --- /dev/null +++ b/.dotnet/examples/Embeddings/Example02_EmbeddingWithOptions.cs @@ -0,0 +1,30 @@ +using NUnit.Framework; +using OpenAI.Embeddings; +using System; + +namespace OpenAI.Examples; + +public partial class EmbeddingExamples +{ + [Test] + public void Example02_EmbeddingWithOptions() + { + EmbeddingClient client = new("text-embedding-3-small", Environment.GetEnvironmentVariable("OPENAI_API_KEY")); + + string description = "Best hotel in town if you like luxury hotels. They have an amazing infinity pool, a spa," + + " and a really helpful concierge. The location is perfect -- right downtown, close to all the tourist" + + " attractions. We highly recommend this hotel."; + + EmbeddingGenerationOptions options = new() { Dimensions = 512 }; + + Embedding embedding = client.GenerateEmbedding(description, options); + ReadOnlyMemory<float> vector = embedding.Vector; + + Console.WriteLine($"Dimension: {vector.Length}"); + Console.WriteLine($"Floats: "); + for (int i = 0; i < vector.Length; i++) + { + Console.WriteLine($" [{i,3}] = {vector.Span[i]}"); + } + } +} diff --git a/.dotnet/examples/Embeddings/Example02_EmbeddingWithOptionsAsync.cs b/.dotnet/examples/Embeddings/Example02_EmbeddingWithOptionsAsync.cs new file mode 100644 index 000000000..bbb12cb77 --- /dev/null +++ b/.dotnet/examples/Embeddings/Example02_EmbeddingWithOptionsAsync.cs @@ -0,0 +1,31 @@ +using NUnit.Framework; +using OpenAI.Embeddings; +using System; +using System.Threading.Tasks; + +namespace OpenAI.Examples; + +public partial class EmbeddingExamples +{ + [Test] + public async Task Example02_EmbeddingWithOptionsAsync() + { + EmbeddingClient client = new("text-embedding-3-small", Environment.GetEnvironmentVariable("OPENAI_API_KEY")); + + string description = "Best hotel in town if you like luxury hotels. They have an amazing infinity pool, a spa," + + " and a really helpful concierge. The location is perfect -- right downtown, close to all the tourist" + + " attractions. We highly recommend this hotel."; + + EmbeddingGenerationOptions options = new() { Dimensions = 512 }; + + Embedding embedding = await client.GenerateEmbeddingAsync(description, options); + ReadOnlyMemory<float> vector = embedding.Vector; + + Console.WriteLine($"Dimension: {vector.Length}"); + Console.WriteLine($"Floats: "); + for (int i = 0; i < vector.Length; i++) + { + Console.WriteLine($" [{i,3}] = {vector.Span[i]}"); + } + } +} diff --git a/.dotnet/examples/Embeddings/Example03_MultipleEmbeddings.cs b/.dotnet/examples/Embeddings/Example03_MultipleEmbeddings.cs new file mode 100644 index 000000000..a85dd2007 --- /dev/null +++ b/.dotnet/examples/Embeddings/Example03_MultipleEmbeddings.cs @@ -0,0 +1,37 @@ +using NUnit.Framework; +using OpenAI.Embeddings; +using System; +using System.Collections.Generic; + +namespace OpenAI.Examples; + +public partial class EmbeddingExamples +{ + [Test] + public void Example03_MultipleEmbeddings() + { + EmbeddingClient client = new("text-embedding-3-small", Environment.GetEnvironmentVariable("OPENAI_API_KEY")); + + string category = "Luxury"; + string description = "Best hotel in town if you like luxury hotels. They have an amazing infinity pool, a spa," + + " and a really helpful concierge. The location is perfect -- right downtown, close to all the tourist" + + " attractions. We highly recommend this hotel."; + List<string> inputs = [category, description]; + + EmbeddingCollection collection = client.GenerateEmbeddings(inputs); + + foreach (Embedding embedding in collection) + { + ReadOnlyMemory<float> vector = embedding.Vector; + + Console.WriteLine($"Dimension: {vector.Length}"); + Console.WriteLine($"Floats: "); + for (int i = 0; i < vector.Length; i++) + { + Console.WriteLine($" [{i,4}] = {vector.Span[i]}"); + } + + Console.WriteLine(); + } + } +} diff --git a/.dotnet/examples/Embeddings/Example03_MultipleEmbeddingsAsync.cs b/.dotnet/examples/Embeddings/Example03_MultipleEmbeddingsAsync.cs new file mode 100644 index 000000000..6f82d2d5d --- /dev/null +++ b/.dotnet/examples/Embeddings/Example03_MultipleEmbeddingsAsync.cs @@ -0,0 +1,38 @@ +using NUnit.Framework; +using OpenAI.Embeddings; +using System; +using System.Collections.Generic; +using System.Threading.Tasks; + +namespace OpenAI.Examples; + +public partial class EmbeddingExamples +{ + [Test] + public async Task Example03_MultipleEmbeddingsAsync() + { + EmbeddingClient client = new("text-embedding-3-small", Environment.GetEnvironmentVariable("OPENAI_API_KEY")); + + string category = "Luxury"; + string description = "Best hotel in town if you like luxury hotels. They have an amazing infinity pool, a spa," + + " and a really helpful concierge. The location is perfect -- right downtown, close to all the tourist" + + " attractions. We highly recommend this hotel."; + List<string> inputs = [category, description]; + + EmbeddingCollection collection = await client.GenerateEmbeddingsAsync(inputs); + + foreach (Embedding embedding in collection) + { + ReadOnlyMemory<float> vector = embedding.Vector; + + Console.WriteLine($"Dimension: {vector.Length}"); + Console.WriteLine($"Floats: "); + for (int i = 0; i < vector.Length; i++) + { + Console.WriteLine($" [{i,4}] = {vector.Span[i]}"); + } + + Console.WriteLine(); + } + } +} diff --git a/.dotnet/examples/Embeddings/Example04_SimpleEmbeddingProtocol.cs b/.dotnet/examples/Embeddings/Example04_SimpleEmbeddingProtocol.cs new file mode 100644 index 000000000..ea0992be8 --- /dev/null +++ b/.dotnet/examples/Embeddings/Example04_SimpleEmbeddingProtocol.cs @@ -0,0 +1,43 @@ +using NUnit.Framework; +using OpenAI.Embeddings; +using System; +using System.ClientModel; +using System.Text.Json; + +namespace OpenAI.Examples; + +public partial class EmbeddingExamples +{ + [Test] + public void Example04_SimpleEmbeddingProtocol() + { + EmbeddingClient client = new("text-embedding-3-small", Environment.GetEnvironmentVariable("OPENAI_API_KEY")); + + string description = "Best hotel in town if you like luxury hotels. They have an amazing infinity pool, a spa," + + " and a really helpful concierge. The location is perfect -- right downtown, close to all the tourist" + + " attractions. 
We highly recommend this hotel."; + + BinaryData input = BinaryData.FromObjectAsJson(new { + model = "text-embedding-3-small", + input = description, + encoding_format = "float" + }); + + using BinaryContent content = BinaryContent.Create(input); + ClientResult result = client.GenerateEmbeddings(content); + BinaryData output = result.GetRawResponse().Content; + + using JsonDocument outputAsJson = JsonDocument.Parse(output.ToString()); + JsonElement vector = outputAsJson.RootElement + .GetProperty("data"u8)[0] + .GetProperty("embedding"u8); + + Console.WriteLine($"Dimension: {vector.GetArrayLength()}"); + Console.WriteLine($"Floats: "); + int i = 0; + foreach (JsonElement element in vector.EnumerateArray()) + { + Console.WriteLine($" [{i++,4}] = {element.GetDouble()}"); + } + } +} diff --git a/.dotnet/examples/Embeddings/Example04_SimpleEmbeddingProtocolAsync.cs b/.dotnet/examples/Embeddings/Example04_SimpleEmbeddingProtocolAsync.cs new file mode 100644 index 000000000..e3be7b52f --- /dev/null +++ b/.dotnet/examples/Embeddings/Example04_SimpleEmbeddingProtocolAsync.cs @@ -0,0 +1,45 @@ +using NUnit.Framework; +using OpenAI.Embeddings; +using System; +using System.ClientModel; +using System.Text.Json; +using System.Threading.Tasks; + +namespace OpenAI.Examples; + +public partial class EmbeddingExamples +{ + [Test] + public async Task Example04_SimpleEmbeddingProtocolAsync() + { + EmbeddingClient client = new("text-embedding-3-small", Environment.GetEnvironmentVariable("OPENAI_API_KEY")); + + string description = "Best hotel in town if you like luxury hotels. They have an amazing infinity pool, a spa," + + " and a really helpful concierge. The location is perfect -- right downtown, close to all the tourist" + + " attractions. 
We highly recommend this hotel."; + + BinaryData input = BinaryData.FromObjectAsJson(new + { + model = "text-embedding-3-small", + input = description, + encoding_format = "float" + }); + + using BinaryContent content = BinaryContent.Create(input); + ClientResult result = await client.GenerateEmbeddingsAsync(content); + BinaryData output = result.GetRawResponse().Content; + + using JsonDocument outputAsJson = JsonDocument.Parse(output.ToString()); + JsonElement vector = outputAsJson.RootElement + .GetProperty("data"u8)[0] + .GetProperty("embedding"u8); + + Console.WriteLine($"Dimension: {vector.GetArrayLength()}"); + Console.WriteLine($"Floats: "); + int i = 0; + foreach (JsonElement element in vector.EnumerateArray()) + { + Console.WriteLine($" [{i++,4}] = {element.GetDouble()}"); + } + } +} diff --git a/.dotnet/examples/Images/Example01_SimpleImageGeneration.cs b/.dotnet/examples/Images/Example01_SimpleImageGeneration.cs new file mode 100644 index 000000000..f194348aa --- /dev/null +++ b/.dotnet/examples/Images/Example01_SimpleImageGeneration.cs @@ -0,0 +1,37 @@ +using NUnit.Framework; +using OpenAI.Images; +using System; +using System.IO; + +namespace OpenAI.Examples; + +public partial class ImageExamples +{ + [Test] + public void Example01_SimpleImageGeneration() + { + ImageClient client = new("dall-e-3", Environment.GetEnvironmentVariable("OPENAI_API_KEY")); + + string prompt = "The concept for a living room that blends Scandinavian simplicity with Japanese minimalism for" + + " a serene and cozy atmosphere. It's a space that invites relaxation and mindfulness, with natural light" + + " and fresh air. Using neutral tones, including colors like white, beige, gray, and black, that create a" + + " sense of harmony. Featuring sleek wood furniture with clean lines and subtle curves to add warmth and" + + " elegance. Plants and flowers in ceramic pots adding color and life to a space. They can serve as focal" + + " points, creating a connection with nature. 
Soft textiles and cushions in organic fabrics adding comfort" + + " and softness to a space. They can serve as accents, adding contrast and texture."; + + ImageGenerationOptions options = new() + { + Quality = GeneratedImageQuality.High, + Size = GeneratedImageSize.W1792xH1024, + Style = GeneratedImageStyle.Vivid, + ResponseFormat = GeneratedImageFormat.Bytes + }; + + GeneratedImage image = client.GenerateImage(prompt, options); + BinaryData bytes = image.ImageBytes; + + using FileStream stream = File.OpenWrite($"{Guid.NewGuid()}.png"); + bytes.ToStream().CopyTo(stream); + } +} diff --git a/.dotnet/examples/Images/Example01_SimpleImageGenerationAsync.cs b/.dotnet/examples/Images/Example01_SimpleImageGenerationAsync.cs new file mode 100644 index 000000000..20487d3a9 --- /dev/null +++ b/.dotnet/examples/Images/Example01_SimpleImageGenerationAsync.cs @@ -0,0 +1,38 @@ +using NUnit.Framework; +using OpenAI.Images; +using System; +using System.IO; +using System.Threading.Tasks; + +namespace OpenAI.Examples; + +public partial class ImageExamples +{ + [Test] + public async Task Example01_SimpleImageGenerationAsync() + { + ImageClient client = new("dall-e-3", Environment.GetEnvironmentVariable("OPENAI_API_KEY")); + + string prompt = "The concept for a living room that blends Scandinavian simplicity with Japanese minimalism for" + + " a serene and cozy atmosphere. It's a space that invites relaxation and mindfulness, with natural light" + + " and fresh air. Using neutral tones, including colors like white, beige, gray, and black, that create a" + + " sense of harmony. Featuring sleek wood furniture with clean lines and subtle curves to add warmth and" + + " elegance. Plants and flowers in ceramic pots adding color and life to a space. They can serve as focal" + + " points, creating a connection with nature. Soft textiles and cushions in organic fabrics adding comfort" + + " and softness to a space. 
They can serve as accents, adding contrast and texture."; + + ImageGenerationOptions options = new() + { + Quality = GeneratedImageQuality.High, + Size = GeneratedImageSize.W1792xH1024, + Style = GeneratedImageStyle.Vivid, + ResponseFormat = GeneratedImageFormat.Bytes + }; + + GeneratedImage image = await client.GenerateImageAsync(prompt, options); + BinaryData bytes = image.ImageBytes; + + using FileStream stream = File.OpenWrite($"{Guid.NewGuid()}.png"); + bytes.ToStream().CopyTo(stream); + } +} diff --git a/.dotnet/examples/Images/Example02_SimpleImageEdit.cs b/.dotnet/examples/Images/Example02_SimpleImageEdit.cs new file mode 100644 index 000000000..e3114dc3f --- /dev/null +++ b/.dotnet/examples/Images/Example02_SimpleImageEdit.cs @@ -0,0 +1,31 @@ +using NUnit.Framework; +using OpenAI.Images; +using System; +using System.IO; + +namespace OpenAI.Examples; + +public partial class ImageExamples +{ + [Test] + public void Example02_SimpleImageEdit() + { + ImageClient client = new("dall-e-2", Environment.GetEnvironmentVariable("OPENAI_API_KEY")); + + string imageFilePath = Path.Combine("Assets", "images_flower_vase.png"); + string prompt = "A vase full of beautiful flowers."; + string maskFilePath = Path.Combine("Assets", "images_flower_vase_with_mask.png"); + + ImageEditOptions options = new() + { + Size = GeneratedImageSize.W512xH512, + ResponseFormat = GeneratedImageFormat.Bytes + }; + + GeneratedImage edit = client.GenerateImageEdit(imageFilePath, prompt, maskFilePath, options); + BinaryData bytes = edit.ImageBytes; + + using FileStream stream = File.OpenWrite($"{Guid.NewGuid()}.png"); + bytes.ToStream().CopyTo(stream); + } +} diff --git a/.dotnet/examples/Images/Example02_SimpleImageEditAsync.cs b/.dotnet/examples/Images/Example02_SimpleImageEditAsync.cs new file mode 100644 index 000000000..84a4d3495 --- /dev/null +++ b/.dotnet/examples/Images/Example02_SimpleImageEditAsync.cs @@ -0,0 +1,32 @@ +using NUnit.Framework; +using OpenAI.Images; +using System; +using 
System.IO; +using System.Threading.Tasks; + +namespace OpenAI.Examples; + +public partial class ImageExamples +{ + [Test] + public async Task Example02_SimpleImageEditAsync() + { + ImageClient client = new("dall-e-2", Environment.GetEnvironmentVariable("OPENAI_API_KEY")); + + string imageFilePath = Path.Combine("Assets", "images_flower_vase.png"); + string prompt = "A vase full of beautiful flowers."; + string maskFilePath = Path.Combine("Assets", "images_flower_vase_with_mask.png"); + + ImageEditOptions options = new() + { + Size = GeneratedImageSize.W512xH512, + ResponseFormat = GeneratedImageFormat.Bytes + }; + + GeneratedImage edit = await client.GenerateImageEditAsync(imageFilePath, prompt, maskFilePath, options); + BinaryData bytes = edit.ImageBytes; + + using FileStream stream = File.OpenWrite($"{Guid.NewGuid()}.png"); + await bytes.ToStream().CopyToAsync(stream); + } +} diff --git a/.dotnet/examples/Images/Example03_SimpleImageVariation.cs b/.dotnet/examples/Images/Example03_SimpleImageVariation.cs new file mode 100644 index 000000000..04bd3a39f --- /dev/null +++ b/.dotnet/examples/Images/Example03_SimpleImageVariation.cs @@ -0,0 +1,29 @@ +using NUnit.Framework; +using OpenAI.Images; +using System; +using System.IO; + +namespace OpenAI.Examples; + +public partial class ImageExamples +{ + [Test] + public void Example03_SimpleImageVariation() + { + ImageClient client = new("dall-e-2", Environment.GetEnvironmentVariable("OPENAI_API_KEY")); + + string imageFilePath = Path.Combine("Assets", "images_dog_and_cat.png"); + + ImageVariationOptions options = new() + { + Size = GeneratedImageSize.W256xH256, + ResponseFormat = GeneratedImageFormat.Bytes + }; + + GeneratedImage variation = client.GenerateImageVariation(imageFilePath, options); + BinaryData bytes = variation.ImageBytes; + + using FileStream stream = File.OpenWrite($"{Guid.NewGuid()}.png"); + bytes.ToStream().CopyTo(stream); + } +} diff --git a/.dotnet/examples/Images/Example03_SimpleImageVariationAsync.cs 
b/.dotnet/examples/Images/Example03_SimpleImageVariationAsync.cs new file mode 100644 index 000000000..50c90a24e --- /dev/null +++ b/.dotnet/examples/Images/Example03_SimpleImageVariationAsync.cs @@ -0,0 +1,30 @@ +using NUnit.Framework; +using OpenAI.Images; +using System; +using System.IO; +using System.Threading.Tasks; + +namespace OpenAI.Examples; + +public partial class ImageExamples +{ + [Test] + public async Task Example03_SimpleImageVariationAsync() + { + ImageClient client = new("dall-e-2", Environment.GetEnvironmentVariable("OPENAI_API_KEY")); + + string imageFilePath = Path.Combine("Assets", "images_dog_and_cat.png"); + + ImageVariationOptions options = new() + { + Size = GeneratedImageSize.W256xH256, + ResponseFormat = GeneratedImageFormat.Bytes + }; + + GeneratedImage variation = await client.GenerateImageVariationAsync(imageFilePath, options); + BinaryData bytes = variation.ImageBytes; + + using FileStream stream = File.OpenWrite($"{Guid.NewGuid()}.png"); + bytes.ToStream().CopyTo(stream); + } +} diff --git a/.dotnet/examples/OpenAI.Examples.csproj b/.dotnet/examples/OpenAI.Examples.csproj new file mode 100644 index 000000000..b33b036ea --- /dev/null +++ b/.dotnet/examples/OpenAI.Examples.csproj @@ -0,0 +1,18 @@ + + + net8.0 + + $(NoWarn);CS1591 + latest + + + + + + + + + + + + \ No newline at end of file diff --git a/.dotnet/global.json b/.dotnet/global.json new file mode 100644 index 000000000..7eb9bc5b3 --- /dev/null +++ b/.dotnet/global.json @@ -0,0 +1,6 @@ +{ + "sdk": { + "version": "8.0.100", + "rollForward": "feature" + } +} \ No newline at end of file diff --git a/.dotnet/nuget.config b/.dotnet/nuget.config new file mode 100644 index 000000000..2805dff37 --- /dev/null +++ b/.dotnet/nuget.config @@ -0,0 +1,8 @@ + + + + + + + + diff --git a/.dotnet/src/Custom/Administration/Internal/GeneratorStubs.cs b/.dotnet/src/Custom/Administration/Internal/GeneratorStubs.cs new file mode 100644 index 000000000..7ed945616 --- /dev/null +++ 
b/.dotnet/src/Custom/Administration/Internal/GeneratorStubs.cs @@ -0,0 +1,36 @@ +namespace OpenAI.Administration; + +[CodeGenModel("AuditLogActorServiceAccount")] internal partial class InternalAuditLogActorServiceAccount { } +[CodeGenModel("AuditLogActorUser")] internal partial class InternalAuditLogActorUser { } +[CodeGenModel("AuditLogActorApiKey")] internal partial class InternalAuditLogActorApiKey { } +[CodeGenModel("AuditLogActorSession")] internal partial class InternalAuditLogActorSession { } +[CodeGenModel("AuditLogActor")] internal partial class InternalAuditLogActor { } +[CodeGenModel("AuditLog")] internal partial class InternalAuditLog { } +[CodeGenModel("ListAuditLogsResponse")] internal partial class InternalListAuditLogsResponse { } +[CodeGenModel("Invite")] internal partial class InternalInvite { } +[CodeGenModel("InviteListResponse")] internal partial class InternalInviteListResponse { } +[CodeGenModel("InviteRequest")] internal partial class InternalInviteRequest { } +[CodeGenModel("InviteDeleteResponse")] internal partial class InternalInviteDeleteResponse { } +[CodeGenModel("User")] internal partial class InternalUser { } +[CodeGenModel("UserListResponse")] internal partial class InternalUserListResponse { } +[CodeGenModel("UserRoleUpdateRequest")] internal partial class InternalUserRoleUpdateRequest { } +[CodeGenModel("UserDeleteResponse")] internal partial class InternalUserDeleteResponse { } +[CodeGenModel("Project")] internal partial class InternalProject { } +[CodeGenModel("ProjectListResponse")] internal partial class InternalProjectListResponse { } +[CodeGenModel("ProjectCreateRequest")] internal partial class InternalProjectCreateRequest { } +[CodeGenModel("ProjectUpdateRequest")] internal partial class InternalProjectUpdateRequest { } +[CodeGenModel("DefaultProjectErrorResponse")] internal partial class InternalDefaultProjectErrorResponse { } +[CodeGenModel("ProjectUser")] internal partial class InternalProjectUser { } 
+[CodeGenModel("ProjectUserListResponse")] internal partial class InternalProjectUserListResponse { } +[CodeGenModel("ProjectUserCreateRequest")] internal partial class InternalProjectUserCreateRequest { } +[CodeGenModel("ProjectUserUpdateRequest")] internal partial class InternalProjectUserUpdateRequest { } +[CodeGenModel("ProjectUserDeleteResponse")] internal partial class InternalProjectUserDeleteResponse { } +[CodeGenModel("ProjectServiceAccount")] internal partial class InternalProjectServiceAccount { } +[CodeGenModel("ProjectServiceAccountListResponse")] internal partial class InternalProjectServiceAccountListResponse { } +[CodeGenModel("ProjectServiceAccountCreateRequest")] internal partial class InternalProjectServiceAccountCreateRequest { } +[CodeGenModel("ProjectServiceAccountCreateResponse")] internal partial class InternalProjectServiceAccountCreateResponse { } +[CodeGenModel("ProjectServiceAccountApiKey")] internal partial class InternalProjectServiceAccountApiKey { } +[CodeGenModel("ProjectServiceAccountDeleteResponse")] internal partial class InternalProjectServiceAccountDeleteResponse { } +[CodeGenModel("ProjectApiKey")] internal partial class InternalProjectApiKey { } +[CodeGenModel("ProjectApiKeyListResponse")] internal partial class InternalProjectApiKeyListResponse { } +[CodeGenModel("ProjectApiKeyDeleteResponse")] internal partial class InternalProjectApiKeyDeleteResponse { } diff --git a/.dotnet/src/Custom/Assistants/Assistant.cs b/.dotnet/src/Custom/Assistants/Assistant.cs new file mode 100644 index 000000000..34ee40166 --- /dev/null +++ b/.dotnet/src/Custom/Assistants/Assistant.cs @@ -0,0 +1,21 @@ +namespace OpenAI.Assistants; + +[CodeGenModel("AssistantObject")] +public partial class Assistant +{ + // CUSTOM: Made internal. + /// The object type, which is always `assistant`. 
+ [CodeGenMember("Object")] + internal InternalAssistantObjectObject Object { get; } = InternalAssistantObjectObject.Assistant; + + /// + public AssistantResponseFormat ResponseFormat { get; } + + /// + /// An alternative to sampling with temperature, called nucleus sampling, where the model considers the results of the tokens with top_p probability mass. So 0.1 means only the tokens comprising the top 10% probability mass are considered. + /// + /// We generally recommend altering this or temperature but not both. + /// + [CodeGenMember("TopP")] + public float? NucleusSamplingFactor { get; } +} diff --git a/.dotnet/src/Custom/Assistants/AssistantClient.Convenience.cs b/.dotnet/src/Custom/Assistants/AssistantClient.Convenience.cs new file mode 100644 index 000000000..5eb15133d --- /dev/null +++ b/.dotnet/src/Custom/Assistants/AssistantClient.Convenience.cs @@ -0,0 +1,464 @@ +using System.ClientModel; +using System.Collections.Generic; +using System.Threading.Tasks; + +namespace OpenAI.Assistants; + +public partial class AssistantClient +{ + /// + /// Modifies an existing . + /// + /// The assistant to modify. + /// The changes to apply to the assistant. + /// + /// An updated instance that reflects the requested changes. + /// + public virtual Task> ModifyAssistantAsync(Assistant assistant, AssistantModificationOptions options) + => ModifyAssistantAsync(assistant?.Id, options); + + /// + /// Modifies an existing . + /// + /// The assistant to modify. + /// The changes to apply to the assistant. + /// + /// An updated instance that reflects the requested changes. + /// + public virtual ClientResult ModifyAssistant(Assistant assistant, AssistantModificationOptions options) + => ModifyAssistant(assistant?.Id, options); + + /// + /// Deletes an existing . + /// + /// The assistant to delete. + /// A value indicating whether the deletion was successful. 
+ public virtual Task> DeleteAssistantAsync(Assistant assistant) + => DeleteAssistantAsync(assistant?.Id); + + /// + /// Deletes an existing . + /// + /// The assistant to delete. + /// A value indicating whether the deletion was successful. + public virtual ClientResult DeleteAssistant(Assistant assistant) + => DeleteAssistant(assistant?.Id); + + /// + /// Gets an updated instance of an existing . + /// + /// The existing thread to refresh the state of. + /// An updated instance of the provided . + public virtual Task> GetThreadAsync(AssistantThread thread) + => GetThreadAsync(thread?.Id); + + /// + /// Gets an updated instance of an existing . + /// + /// The existing thread to refresh the state of. + /// An updated instance of the provided . + public virtual ClientResult GetThread(AssistantThread thread) + => GetThread(thread?.Id); + + /// + /// Modifies an existing . + /// + /// The thread to modify. + /// The modifications to apply to the thread. + /// The updated instance. + public virtual Task> ModifyThreadAsync(AssistantThread thread, ThreadModificationOptions options) + => ModifyThreadAsync(thread?.Id, options); + + /// + /// Modifies an existing . + /// + /// The thread to modify. + /// The modifications to apply to the thread. + /// The updated instance. + public virtual ClientResult ModifyThread(AssistantThread thread, ThreadModificationOptions options) + => ModifyThread(thread?.Id, options); + + /// + /// Deletes an existing . + /// + /// The thread to delete. + /// A value indicating whether the deletion was successful. + public virtual Task> DeleteThreadAsync(AssistantThread thread) + => DeleteThreadAsync(thread?.Id); + + /// + /// Deletes an existing . + /// + /// The thread to delete. + /// A value indicating whether the deletion was successful. + public virtual ClientResult DeleteThread(AssistantThread thread) + => DeleteThread(thread?.Id); + + /// + /// Creates a new on an existing . + /// + /// The thread to associate the new message with. 
+ /// The role to associate with the new message. + /// The collection of items for the message. + /// Additional options to apply to the new message. + /// A new . + public virtual Task> CreateMessageAsync( + AssistantThread thread, + MessageRole role, + IEnumerable content, + MessageCreationOptions options = null) + => CreateMessageAsync(thread?.Id, role, content, options); + + /// + /// Creates a new on an existing . + /// + /// The thread to associate the new message with. + /// The role to associate with the new message. + /// The collection of items for the message. + /// Additional options to apply to the new message. + /// A new . + public virtual ClientResult CreateMessage( + AssistantThread thread, + MessageRole role, + IEnumerable content, + MessageCreationOptions options = null) + => CreateMessage(thread?.Id, role, content, options); + + /// + /// Gets a page collection holding instances from an existing . + /// + /// The thread to list messages from. + /// Options describing the collection to return. + /// holds pages of values. To obtain a collection of values, call + /// . To obtain the current + /// page of values, call . + /// A collection of pages of . + public virtual AsyncPageCollection GetMessagesAsync( + AssistantThread thread, + MessageCollectionOptions options = default) + { + Argument.AssertNotNull(thread, nameof(thread)); + + return GetMessagesAsync(thread.Id, options); + } + + /// + /// Gets a page collection holding instances from an existing . + /// + /// The thread to list messages from. + /// Options describing the collection to return. + /// holds pages of values. To obtain a collection of values, call + /// . To obtain the current + /// page of values, call . + /// A collection of pages of . 
+ public virtual PageCollection GetMessages( + AssistantThread thread, + MessageCollectionOptions options = default) + { + Argument.AssertNotNull(thread, nameof(thread)); + + return GetMessages(thread.Id, options); + } + + /// + /// Gets an updated instance of an existing . + /// + /// The existing message to refresh the state of. + /// An updated instance of the provided . + public virtual Task> GetMessageAsync(ThreadMessage message) + => GetMessageAsync(message?.ThreadId, message?.Id); + + /// + /// Gets an updated instance of an existing . + /// + /// The existing message to refresh the state of. + /// An updated instance of the provided . + public virtual ClientResult GetMessage(ThreadMessage message) + => GetMessage(message?.ThreadId, message?.Id); + + /// + /// Modifies an existing . + /// + /// The message to modify. + /// The changes to apply to the message. + /// The updated . + public virtual Task> ModifyMessageAsync(ThreadMessage message, MessageModificationOptions options) + => ModifyMessageAsync(message?.ThreadId, message?.Id, options); + + /// + /// Modifies an existing . + /// + /// The message to modify. + /// The changes to apply to the message. + /// The updated . + public virtual ClientResult ModifyMessage(ThreadMessage message, MessageModificationOptions options) + => ModifyMessage(message?.ThreadId, message?.Id, options); + + /// + /// Deletes an existing . + /// + /// The message to delete. + /// A value indicating whether the deletion was successful. + public virtual Task> DeleteMessageAsync(ThreadMessage message) + => DeleteMessageAsync(message?.ThreadId, message?.Id); + + /// + /// Deletes an existing . + /// + /// The message to delete. + /// A value indicating whether the deletion was successful. + public virtual ClientResult DeleteMessage(ThreadMessage message) + => DeleteMessage(message?.ThreadId, message?.Id); + + /// + /// Begins a new that evaluates a using a specified + /// . + /// + /// The thread that the run should evaluate. 
+ /// The assistant that should be used when evaluating the thread. + /// Additional options for the run. + /// A new instance. + public virtual Task> CreateRunAsync(AssistantThread thread, Assistant assistant, RunCreationOptions options = null) + => CreateRunAsync(thread?.Id, assistant?.Id, options); + + /// + /// Begins a new that evaluates a using a specified + /// . + /// + /// The thread that the run should evaluate. + /// The assistant that should be used when evaluating the thread. + /// Additional options for the run. + /// A new instance. + public virtual ClientResult CreateRun(AssistantThread thread, Assistant assistant, RunCreationOptions options = null) + => CreateRun(thread?.Id, assistant?.Id, options); + + /// + /// Begins a new streaming that evaluates a using a specified + /// . + /// + /// The thread that the run should evaluate. + /// The assistant that should be used when evaluating the thread. + /// Additional options for the run. + public virtual AsyncCollectionResult CreateRunStreamingAsync( + AssistantThread thread, + Assistant assistant, + RunCreationOptions options = null) + => CreateRunStreamingAsync(thread?.Id, assistant?.Id, options); + + /// + /// Begins a new streaming that evaluates a using a specified + /// . + /// + /// The thread that the run should evaluate. + /// The assistant that should be used when evaluating the thread. + /// Additional options for the run. + public virtual CollectionResult CreateRunStreaming( + AssistantThread thread, + Assistant assistant, + RunCreationOptions options = null) + => CreateRunStreaming(thread?.Id, assistant?.Id, options); + + /// + /// Creates a new thread and immediately begins a run against it using the specified . + /// + /// The assistant that the new run should use. + /// Options for the new thread that will be created. + /// Additional options to apply to the run that will begin. + /// A new . 
+ public virtual Task> CreateThreadAndRunAsync( + Assistant assistant, + ThreadCreationOptions threadOptions = null, + RunCreationOptions runOptions = null) + => CreateThreadAndRunAsync(assistant?.Id, threadOptions, runOptions); + + /// + /// Creates a new thread and immediately begins a run against it using the specified . + /// + /// The assistant that the new run should use. + /// Options for the new thread that will be created. + /// Additional options to apply to the run that will begin. + /// A new . + public virtual ClientResult CreateThreadAndRun( + Assistant assistant, + ThreadCreationOptions threadOptions = null, + RunCreationOptions runOptions = null) + => CreateThreadAndRun(assistant?.Id, threadOptions, runOptions); + + /// + /// Creates a new thread and immediately begins a streaming run against it using the specified . + /// + /// The assistant that the new run should use. + /// Options for the new thread that will be created. + /// Additional options to apply to the run that will begin. + public virtual AsyncCollectionResult CreateThreadAndRunStreamingAsync( + Assistant assistant, + ThreadCreationOptions threadOptions = null, + RunCreationOptions runOptions = null) + => CreateThreadAndRunStreamingAsync(assistant?.Id, threadOptions, runOptions); + + /// + /// Creates a new thread and immediately begins a streaming run against it using the specified . + /// + /// The assistant that the new run should use. + /// Options for the new thread that will be created. + /// Additional options to apply to the run that will begin. + public virtual CollectionResult CreateThreadAndRunStreaming( + Assistant assistant, + ThreadCreationOptions threadOptions = null, + RunCreationOptions runOptions = null) + => CreateThreadAndRunStreaming(assistant?.Id, threadOptions, runOptions); + + /// + /// Gets a page collection holding instances associated with an existing . + /// + /// The thread that runs in the list should be associated with. 
+ /// Options describing the collection to return. + /// holds pages of values. To obtain a collection of values, call + /// . To obtain the current + /// page of values, call . + /// A collection of pages of . + public virtual AsyncPageCollection GetRunsAsync( + AssistantThread thread, + RunCollectionOptions options = default) + { + Argument.AssertNotNull(thread, nameof(thread)); + + return GetRunsAsync(thread.Id, options); + } + + /// + /// Gets a page collection holding instances associated with an existing . + /// + /// The thread that runs in the list should be associated with. + /// Options describing the collection to return. + /// holds pages of values. To obtain a collection of values, call + /// . To obtain the current + /// page of values, call . + /// A collection of pages of . + public virtual PageCollection GetRuns( + AssistantThread thread, + RunCollectionOptions options = default) + { + Argument.AssertNotNull(thread, nameof(thread)); + + return GetRuns(thread.Id, options); + } + + /// + /// Gets a refreshed instance of an existing . + /// + /// The run to get a refreshed instance of. + /// A new instance with updated information. + public virtual Task> GetRunAsync(ThreadRun run) + => GetRunAsync(run?.ThreadId, run?.Id); + + /// + /// Gets a refreshed instance of an existing . + /// + /// The run to get a refreshed instance of. + /// A new instance with updated information. + public virtual ClientResult GetRun(ThreadRun run) + => GetRun(run?.ThreadId, run?.Id); + + /// + /// Submits a collection of required tool call outputs to a run and resumes the run. + /// + /// The run that reached a requires_action status. + /// + /// The tool outputs, corresponding to instances from the run. + /// + /// The , updated after the submission was processed. 
+ public virtual Task> SubmitToolOutputsToRunAsync( + ThreadRun run, + IEnumerable toolOutputs) + => SubmitToolOutputsToRunAsync(run?.ThreadId, run?.Id, toolOutputs); + + /// + /// Submits a collection of required tool call outputs to a run and resumes the run. + /// + /// The run that reached a requires_action status. + /// + /// The tool outputs, corresponding to instances from the run. + /// + /// The , updated after the submission was processed. + public virtual ClientResult SubmitToolOutputsToRun( + ThreadRun run, + IEnumerable toolOutputs) + => SubmitToolOutputsToRun(run?.ThreadId, run?.Id, toolOutputs); + + /// + /// Submits a collection of required tool call outputs to a run and resumes the run with streaming enabled. + /// + /// The run that reached a requires_action status. + /// + /// The tool outputs, corresponding to instances from the run. + /// + public virtual AsyncCollectionResult SubmitToolOutputsToRunStreamingAsync( + ThreadRun run, + IEnumerable toolOutputs) + => SubmitToolOutputsToRunStreamingAsync(run?.ThreadId, run?.Id, toolOutputs); + + /// + /// Submits a collection of required tool call outputs to a run and resumes the run with streaming enabled. + /// + /// The run that reached a requires_action status. + /// + /// The tool outputs, corresponding to instances from the run. + /// + public virtual CollectionResult SubmitToolOutputsToRunStreaming( + ThreadRun run, + IEnumerable toolOutputs) + => SubmitToolOutputsToRunStreaming(run?.ThreadId, run?.Id, toolOutputs); + + /// + /// Cancels an in-progress . + /// + /// The run to cancel. + /// An updated instance, reflecting the new status of the run. + public virtual Task> CancelRunAsync(ThreadRun run) + => CancelRunAsync(run?.ThreadId, run?.Id); + + /// + /// Cancels an in-progress . + /// + /// The run to cancel. + /// An updated instance, reflecting the new status of the run. 
+ public virtual ClientResult CancelRun(ThreadRun run) + => CancelRun(run?.ThreadId, run?.Id); + + /// + /// Gets a page collection holding instances associated with a . + /// + /// The run to list run steps from. + /// Options describing the collection to return. + /// holds pages of values. To obtain a collection of values, call + /// . To obtain the current + /// page of values, call . + /// A collection of pages of . + public virtual AsyncPageCollection GetRunStepsAsync( + ThreadRun run, + RunStepCollectionOptions options = default) + { + Argument.AssertNotNull(run, nameof(run)); + + return GetRunStepsAsync(run.ThreadId, run.Id, options); + } + + /// + /// Gets a page collection holding instances associated with a . + /// + /// The run to list run steps from. + /// Options describing the collection to return. + /// holds pages of values. To obtain a collection of values, call + /// . To obtain the current + /// page of values, call . + /// A collection of pages of . + public virtual PageCollection GetRunSteps( + ThreadRun run, + RunStepCollectionOptions options = default) + { + Argument.AssertNotNull(run, nameof(run)); + + return GetRunSteps(run.ThreadId, run.Id, options); + } +} diff --git a/.dotnet/src/Custom/Assistants/AssistantClient.Protocol.cs b/.dotnet/src/Custom/Assistants/AssistantClient.Protocol.cs new file mode 100644 index 000000000..dace80d8f --- /dev/null +++ b/.dotnet/src/Custom/Assistants/AssistantClient.Protocol.cs @@ -0,0 +1,589 @@ +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.ComponentModel; +using System.Threading.Tasks; + +namespace OpenAI.Assistants; + +public partial class AssistantClient +{ + /// + /// [Protocol Method] Create an assistant with a model and instructions. + /// + /// The content to send as the body of the request. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// is null. 
/// Service returned a non-success status code. + /// The response returned from the service. + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual async Task CreateAssistantAsync(BinaryContent content, RequestOptions options = null) + { + Argument.AssertNotNull(content, nameof(content)); + + using PipelineMessage message = CreateCreateAssistantRequest(content, options); + return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + /// + /// [Protocol Method] Create an assistant with a model and instructions. + /// + /// The content to send as the body of the request. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// is null. + /// Service returned a non-success status code. + /// The response returned from the service. + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual ClientResult CreateAssistant(BinaryContent content, RequestOptions options = null) + { + Argument.AssertNotNull(content, nameof(content)); + + using PipelineMessage message = CreateCreateAssistantRequest(content, options); + return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options)); + } + + /// + /// [Protocol Method] Returns a paginated collection of assistants. + /// + /// + /// A limit on the number of objects to be returned. Limit can range between 1 and 100, and the + /// default is 20. + /// + /// + /// Sort order by the `created_at` timestamp of the objects. `asc` for ascending order and `desc` + /// for descending order. Allowed values: "asc" | "desc" + /// + /// + /// A cursor for use in pagination. `after` is an object ID that defines your place in the list. + /// For instance, if you make a list request and receive 100 objects, ending with obj_foo, your + /// subsequent call can include after=obj_foo in order to fetch the next page of the list. + /// + /// + /// A cursor for use in pagination.
`before` is an object ID that defines your place in the list. + /// For instance, if you make a list request and receive 100 objects, ending with obj_foo, your + /// subsequent call can include before=obj_foo in order to fetch the previous page of the list. + /// + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// Service returned a non-success status code. + /// A collection of service responses, each holding a page of values. + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual IAsyncEnumerable GetAssistantsAsync(int? limit, string order, string after, string before, RequestOptions options) + { + AssistantsPageEnumerator enumerator = new(_pipeline, _endpoint, limit, order, after, before, options); + return PageCollectionHelpers.CreateAsync(enumerator); + } + + /// + /// [Protocol Method] Returns a paginated collection of assistants. + /// + /// + /// A limit on the number of objects to be returned. Limit can range between 1 and 100, and the + /// default is 20. + /// + /// + /// Sort order by the `created_at` timestamp of the objects. `asc` for ascending order and `desc` + /// for descending order. Allowed values: "asc" | "desc" + /// + /// + /// A cursor for use in pagination. `after` is an object ID that defines your place in the list. + /// For instance, if you make a list request and receive 100 objects, ending with obj_foo, your + /// subsequent call can include after=obj_foo in order to fetch the next page of the list. + /// + /// + /// A cursor for use in pagination. `before` is an object ID that defines your place in the list. + /// For instance, if you make a list request and receive 100 objects, ending with obj_foo, your + /// subsequent call can include before=obj_foo in order to fetch the previous page of the list. + /// + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// Service returned a non-success status code.
+ /// A collection of service responses, each holding a page of values. + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual IEnumerable GetAssistants(int? limit, string order, string after, string before, RequestOptions options) + { + AssistantsPageEnumerator enumerator = new(_pipeline, _endpoint, limit, order, after, before, options); + return PageCollectionHelpers.Create(enumerator); + } + + /// + /// [Protocol Method] Retrieves an assistant. + /// + /// The ID of the assistant to retrieve. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// is null. + /// is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// The response returned from the service. + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual async Task GetAssistantAsync(string assistantId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(assistantId, nameof(assistantId)); + + using PipelineMessage message = CreateGetAssistantRequest(assistantId, options); + return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + /// + /// [Protocol Method] Retrieves an assistant. + /// + /// The ID of the assistant to retrieve. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// is null. + /// is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// The response returned from the service. 
+ [EditorBrowsable(EditorBrowsableState.Never)] + public virtual ClientResult GetAssistant(string assistantId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(assistantId, nameof(assistantId)); + + using PipelineMessage message = CreateGetAssistantRequest(assistantId, options); + return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options)); + } + + /// + /// [Protocol Method] Modifies an assistant. + /// + /// The ID of the assistant to modify. + /// The content to send as the body of the request. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// or is null. + /// is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// The response returned from the service. + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual async Task ModifyAssistantAsync(string assistantId, BinaryContent content, RequestOptions options = null) + { + Argument.AssertNotNullOrEmpty(assistantId, nameof(assistantId)); + Argument.AssertNotNull(content, nameof(content)); + + using PipelineMessage message = CreateModifyAssistantRequest(assistantId, content, options); + return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + /// + /// [Protocol Method] Modifies an assistant. + /// + /// The ID of the assistant to modify. + /// The content to send as the body of the request. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// or is null. + /// is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// The response returned from the service. 
+ [EditorBrowsable(EditorBrowsableState.Never)] + public virtual ClientResult ModifyAssistant(string assistantId, BinaryContent content, RequestOptions options = null) + { + Argument.AssertNotNullOrEmpty(assistantId, nameof(assistantId)); + Argument.AssertNotNull(content, nameof(content)); + + using PipelineMessage message = CreateModifyAssistantRequest(assistantId, content, options); + return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options)); + } + + /// + /// [Protocol Method] Delete an assistant. + /// + /// The ID of the assistant to delete. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// is null. + /// is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// The response returned from the service. + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual async Task DeleteAssistantAsync(string assistantId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(assistantId, nameof(assistantId)); + + using PipelineMessage message = CreateDeleteAssistantRequest(assistantId, options); + return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + /// + /// [Protocol Method] Delete an assistant. + /// + /// The ID of the assistant to delete. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// is null. + /// is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// The response returned from the service. 
+ [EditorBrowsable(EditorBrowsableState.Never)] + public virtual ClientResult DeleteAssistant(string assistantId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(assistantId, nameof(assistantId)); + + using PipelineMessage message = CreateDeleteAssistantRequest(assistantId, options); + return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options)); + } + + /// + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual Task CreateMessageAsync(string threadId, BinaryContent content, RequestOptions options = null) + => _messageSubClient.CreateMessageAsync(threadId, content, options); + + /// + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual ClientResult CreateMessage(string threadId, BinaryContent content, RequestOptions options = null) + => _messageSubClient.CreateMessage(threadId, content, options); + + /// + /// [Protocol Method] Returns a paginated collection of messages for a given thread. + /// + /// The ID of the [thread](/docs/api-reference/threads) the messages belong to. + /// + /// A limit on the number of objects to be returned. Limit can range between 1 and 100, and the + /// default is 20. + /// + /// + /// Sort order by the `created_at` timestamp of the objects. `asc` for ascending order and `desc` + /// for descending order. Allowed values: "asc" | "desc" + /// + /// + /// A cursor for use in pagination. `after` is an object ID that defines your place in the list. + /// For instance, if you make a list request and receive 100 objects, ending with obj_foo, your + /// subsequent call can include after=obj_foo in order to fetch the next page of the list. + /// + /// + /// A cursor for use in pagination. `before` is an object ID that defines your place in the list. + /// For instance, if you make a list request and receive 100 objects, ending with obj_foo, your + /// subsequent call can include before=obj_foo in order to fetch the previous page of the list.
+ /// + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// is null. + /// is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// A collection of service responses, each holding a page of values. + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual IAsyncEnumerable GetMessagesAsync(string threadId, int? limit, string order, string after, string before, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + + MessagesPageEnumerator enumerator = new(_pipeline, _endpoint, threadId, limit, order, after, before, options); + return PageCollectionHelpers.CreateAsync(enumerator); + } + + /// + /// [Protocol Method] Returns a paginated collection of messages for a given thread. + /// + /// The ID of the [thread](/docs/api-reference/threads) the messages belong to. + /// + /// A limit on the number of objects to be returned. Limit can range between 1 and 100, and the + /// default is 20. + /// + /// + /// Sort order by the `created_at` timestamp of the objects. `asc` for ascending order and `desc` + /// for descending order. Allowed values: "asc" | "desc" + /// + /// + /// A cursor for use in pagination. `after` is an object ID that defines your place in the list. + /// For instance, if you make a list request and receive 100 objects, ending with obj_foo, your + /// subsequent call can include after=obj_foo in order to fetch the next page of the list. + /// + /// + /// A cursor for use in pagination. `before` is an object ID that defines your place in the list. + /// For instance, if you make a list request and receive 100 objects, ending with obj_foo, your + /// subsequent call can include before=obj_foo in order to fetch the previous page of the list. + /// + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// is null.
+ /// is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// A collection of service responses, each holding a page of values. + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual IEnumerable GetMessages(string threadId, int? limit, string order, string after, string before, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + + MessagesPageEnumerator enumerator = new(_pipeline, _endpoint, threadId, limit, order, after, before, options); + return PageCollectionHelpers.Create(enumerator); + } + + /// + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual Task GetMessageAsync(string threadId, string messageId, RequestOptions options) + => _messageSubClient.GetMessageAsync(threadId, messageId, options); + + /// + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual ClientResult GetMessage(string threadId, string messageId, RequestOptions options) + => _messageSubClient.GetMessage(threadId, messageId, options); + + /// + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual Task ModifyMessageAsync(string threadId, string messageId, BinaryContent content, RequestOptions options = null) + => _messageSubClient.ModifyMessageAsync(threadId, messageId, content, options); + + /// + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual ClientResult ModifyMessage(string threadId, string messageId, BinaryContent content, RequestOptions options = null) + => _messageSubClient.ModifyMessage(threadId, messageId, content, options); + + /// + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual Task DeleteMessageAsync(string threadId, string messageId, RequestOptions options) + => _messageSubClient.DeleteMessageAsync(threadId, messageId, options); + + /// + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual ClientResult DeleteMessage(string threadId, string messageId, RequestOptions options) + => 
_messageSubClient.DeleteMessage(threadId, messageId, options); + + /// + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual Task CreateThreadAndRunAsync(BinaryContent content, RequestOptions options = null) + => _runSubClient.CreateThreadAndRunAsync(content, options); + + /// + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual ClientResult CreateThreadAndRun(BinaryContent content, RequestOptions options = null) + => _runSubClient.CreateThreadAndRun(content, options); + + /// + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual Task CreateRunAsync(string threadId, BinaryContent content, RequestOptions options = null) + => _runSubClient.CreateRunAsync(threadId, content, options); + + /// + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual ClientResult CreateRun(string threadId, BinaryContent content, RequestOptions options = null) + => _runSubClient.CreateRun(threadId, content, options); + + /// + /// [Protocol Method] Returns a paginated collection of runs belonging to a thread. + /// + /// The ID of the thread the run belongs to. + /// + /// A limit on the number of objects to be returned. Limit can range between 1 and 100, and the + /// default is 20. + /// + /// + /// Sort order by the `created_at` timestamp of the objects. `asc` for ascending order and `desc` + /// for descending order. Allowed values: "asc" | "desc" + /// + /// + /// A cursor for use in pagination. `after` is an object ID that defines your place in the list. + /// For instance, if you make a list request and receive 100 objects, ending with obj_foo, your + /// subsequent call can include after=obj_foo in order to fetch the next page of the list. + /// + /// + /// A cursor for use in pagination. `before` is an object ID that defines your place in the list.
+ /// For instance, if you make a list request and receive 100 objects, ending with obj_foo, your + /// subsequent call can include before=obj_foo in order to fetch the previous page of the list. + /// + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// is null. + /// is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// A collection of service responses, each holding a page of values. + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual IAsyncEnumerable GetRunsAsync(string threadId, int? limit, string order, string after, string before, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + + RunsPageEnumerator enumerator = new(_pipeline, _endpoint, threadId, limit, order, after, before, options); + return PageCollectionHelpers.CreateAsync(enumerator); + } + + /// + /// [Protocol Method] Returns a paginated collection of runs belonging to a thread. + /// + /// The ID of the thread the run belongs to. + /// + /// A limit on the number of objects to be returned. Limit can range between 1 and 100, and the + /// default is 20. + /// + /// + /// Sort order by the `created_at` timestamp of the objects. `asc` for ascending order and `desc` + /// for descending order. Allowed values: "asc" | "desc" + /// + /// + /// A cursor for use in pagination. `after` is an object ID that defines your place in the list. + /// For instance, if you make a list request and receive 100 objects, ending with obj_foo, your + /// subsequent call can include after=obj_foo in order to fetch the next page of the list. + /// + /// + /// A cursor for use in pagination. `before` is an object ID that defines your place in the list. + /// For instance, if you make a list request and receive 100 objects, ending with obj_foo, your + /// subsequent call can include before=obj_foo in order to fetch the previous page of the list.
+ /// + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// is null. + /// is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// A collection of service responses, each holding a page of values. + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual IEnumerable GetRuns(string threadId, int? limit, string order, string after, string before, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + + RunsPageEnumerator enumerator = new(_pipeline, _endpoint, threadId, limit, order, after, before, options); + return PageCollectionHelpers.Create(enumerator); + } + + /// + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual Task GetRunAsync(string threadId, string runId, RequestOptions options) + => _runSubClient.GetRunAsync(threadId, runId, options); + + /// + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual ClientResult GetRun(string threadId, string runId, RequestOptions options) + => _runSubClient.GetRun(threadId, runId, options); + + /// + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual Task ModifyRunAsync(string threadId, string runId, BinaryContent content, RequestOptions options = null) + => _runSubClient.ModifyRunAsync(threadId, runId, content, options); + + /// + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual ClientResult ModifyRun(string threadId, string runId, BinaryContent content, RequestOptions options = null) + => _runSubClient.ModifyRun(threadId, runId, content, options); + + /// + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual Task CancelRunAsync(string threadId, string runId, RequestOptions options) + => _runSubClient.CancelRunAsync(threadId, runId, options); + + /// + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual ClientResult CancelRun(string threadId, string runId, RequestOptions options) + => 
_runSubClient.CancelRun(threadId, runId, options); + + /// + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual Task SubmitToolOutputsToRunAsync(string threadId, string runId, BinaryContent content, RequestOptions options = null) + => _runSubClient.SubmitToolOutputsToRunAsync(threadId, runId, content, options); + + /// + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual ClientResult SubmitToolOutputsToRun(string threadId, string runId, BinaryContent content, RequestOptions options = null) + => _runSubClient.SubmitToolOutputsToRun(threadId, runId, content, options); + + /// + /// [Protocol Method] Returns a paginated collection of run steps belonging to a run. + /// + /// The ID of the thread the run and run steps belong to. + /// The ID of the run the run steps belong to. + /// + /// A limit on the number of objects to be returned. Limit can range between 1 and 100, and the + /// default is 20. + /// + /// + /// Sort order by the `created_at` timestamp of the objects. `asc` for ascending order and `desc` + /// for descending order. Allowed values: "asc" | "desc" + /// + /// + /// A cursor for use in pagination. `after` is an object ID that defines your place in the list. + /// For instance, if you make a list request and receive 100 objects, ending with obj_foo, your + /// subsequent call can include after=obj_foo in order to fetch the next page of the list. + /// + /// + /// A cursor for use in pagination. `before` is an object ID that defines your place in the list. + /// For instance, if you make a list request and receive 100 objects, ending with obj_foo, your + /// subsequent call can include before=obj_foo in order to fetch the previous page of the list. + /// + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// or is null. + /// or is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code.
+ /// A collection of service responses, each holding a page of values. + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual IAsyncEnumerable GetRunStepsAsync(string threadId, string runId, int? limit, string order, string after, string before, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + Argument.AssertNotNullOrEmpty(runId, nameof(runId)); + + RunStepsPageEnumerator enumerator = new(_pipeline, _endpoint, threadId, runId, limit, order, after, before, options); + return PageCollectionHelpers.CreateAsync(enumerator); + } + + /// + /// [Protocol Method] Returns a paginated collection of run steps belonging to a run. + /// + /// The ID of the thread the run and run steps belong to. + /// The ID of the run the run steps belong to. + /// + /// A limit on the number of objects to be returned. Limit can range between 1 and 100, and the + /// default is 20. + /// + /// + /// Sort order by the `created_at` timestamp of the objects. `asc` for ascending order and `desc` + /// for descending order. Allowed values: "asc" | "desc" + /// + /// + /// A cursor for use in pagination. `after` is an object ID that defines your place in the list. + /// For instance, if you make a list request and receive 100 objects, ending with obj_foo, your + /// subsequent call can include after=obj_foo in order to fetch the next page of the list. + /// + /// + /// A cursor for use in pagination. `before` is an object ID that defines your place in the list. + /// For instance, if you make a list request and receive 100 objects, ending with obj_foo, your + /// subsequent call can include before=obj_foo in order to fetch the previous page of the list. + /// + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// or is null. + /// or is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code.
+ /// A collection of service responses, each holding a page of values. + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual IEnumerable GetRunSteps(string threadId, string runId, int? limit, string order, string after, string before, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + Argument.AssertNotNullOrEmpty(runId, nameof(runId)); + + RunStepsPageEnumerator enumerator = new(_pipeline, _endpoint, threadId, runId, limit, order, after, before, options); + return PageCollectionHelpers.Create(enumerator); + } + + /// + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual Task GetRunStepAsync(string threadId, string runId, string stepId, RequestOptions options) + => _runSubClient.GetRunStepAsync(threadId, runId, stepId, options); + + /// + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual ClientResult GetRunStep(string threadId, string runId, string stepId, RequestOptions options) + => _runSubClient.GetRunStep(threadId, runId, stepId, options); + + /// + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual Task CreateThreadAsync(BinaryContent content, RequestOptions options = null) + => _threadSubClient.CreateThreadAsync(content, options); + + /// + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual ClientResult CreateThread(BinaryContent content, RequestOptions options = null) + => _threadSubClient.CreateThread(content, options); + + /// + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual Task GetThreadAsync(string threadId, RequestOptions options) + => _threadSubClient.GetThreadAsync(threadId, options); + + /// + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual ClientResult GetThread(string threadId, RequestOptions options) + => _threadSubClient.GetThread(threadId, options); + + /// + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual Task ModifyThreadAsync(string threadId, BinaryContent content, RequestOptions options = null) + 
=> _threadSubClient.ModifyThreadAsync(threadId, content, options); + + /// + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual ClientResult ModifyThread(string threadId, BinaryContent content, RequestOptions options = null) + => _threadSubClient.ModifyThread(threadId, content, options); + + /// + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual Task DeleteThreadAsync(string threadId, RequestOptions options) + => _threadSubClient.DeleteThreadAsync(threadId, options); + + /// + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual ClientResult DeleteThread(string threadId, RequestOptions options) + => _threadSubClient.DeleteThread(threadId, options); +} diff --git a/.dotnet/src/Custom/Assistants/AssistantClient.cs b/.dotnet/src/Custom/Assistants/AssistantClient.cs new file mode 100644 index 000000000..036f71be6 --- /dev/null +++ b/.dotnet/src/Custom/Assistants/AssistantClient.cs @@ -0,0 +1,1227 @@ +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Diagnostics.CodeAnalysis; +using System.Linq; +using System.Runtime.CompilerServices; +using System.Threading; +using System.Threading.Tasks; + +namespace OpenAI.Assistants; + +/// The service client for OpenAI assistants operations. 
+[Experimental("OPENAI001")] +[CodeGenClient("Assistants")] +[CodeGenSuppress("AssistantClient", typeof(ClientPipeline), typeof(ApiKeyCredential), typeof(Uri))] +[CodeGenSuppress("CreateAssistantAsync", typeof(AssistantCreationOptions))] +[CodeGenSuppress("CreateAssistant", typeof(AssistantCreationOptions))] +[CodeGenSuppress("GetAssistantAsync", typeof(string))] +[CodeGenSuppress("GetAssistant", typeof(string))] +[CodeGenSuppress("ModifyAssistantAsync", typeof(string), typeof(AssistantModificationOptions))] +[CodeGenSuppress("ModifyAssistant", typeof(string), typeof(AssistantModificationOptions))] +[CodeGenSuppress("DeleteAssistantAsync", typeof(string))] +[CodeGenSuppress("DeleteAssistant", typeof(string))] +[CodeGenSuppress("GetAssistantsAsync", typeof(int?), typeof(ListOrder?), typeof(string), typeof(string))] +[CodeGenSuppress("GetAssistants", typeof(int?), typeof(ListOrder?), typeof(string), typeof(string))] +public partial class AssistantClient +{ + private readonly InternalAssistantMessageClient _messageSubClient; + private readonly InternalAssistantRunClient _runSubClient; + private readonly InternalAssistantThreadClient _threadSubClient; + + // CUSTOM: + // - Used a custom pipeline. + // - Demoted the endpoint parameter to be a property in the options class. + /// Initializes a new instance of . + /// The API key to authenticate with the service. + /// is null. + public AssistantClient(ApiKeyCredential credential) : this(credential, new OpenAIClientOptions()) + { + } + + // CUSTOM: + // - Used a custom pipeline. + // - Demoted the endpoint parameter to be a property in the options class. + /// Initializes a new instance of . + /// The API key to authenticate with the service. + /// The options to configure the client. + /// is null. 
+ public AssistantClient(ApiKeyCredential credential, OpenAIClientOptions options) + { + Argument.AssertNotNull(credential, nameof(credential)); + options ??= new OpenAIClientOptions(); + + _pipeline = OpenAIClient.CreatePipeline(credential, options); + _endpoint = OpenAIClient.GetEndpoint(options); + _messageSubClient = new(_pipeline, options); + _runSubClient = new(_pipeline, options); + _threadSubClient = new(_pipeline, options); + } + + // CUSTOM: + // - Used a custom pipeline. + // - Demoted the endpoint parameter to be a property in the options class. + // - Made protected. + /// Initializes a new instance of . + /// The HTTP pipeline to send and receive REST requests and responses. + /// The options to configure the client. + /// is null. + protected internal AssistantClient(ClientPipeline pipeline, OpenAIClientOptions options) + { + Argument.AssertNotNull(pipeline, nameof(pipeline)); + options ??= new OpenAIClientOptions(); + + _pipeline = pipeline; + _endpoint = OpenAIClient.GetEndpoint(options); + _messageSubClient = new(_pipeline, options); + _runSubClient = new(_pipeline, options); + _threadSubClient = new(_pipeline, options); + } + + /// Creates a new assistant. + /// The default model that the assistant should use. + /// The additional to use. + /// A token that can be used to cancel this method call. + /// is null or empty. + public virtual async Task> CreateAssistantAsync(string model, AssistantCreationOptions options = null, CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(model, nameof(model)); + options ??= new(); + options.Model = model; + + ClientResult protocolResult = await CreateAssistantAsync(options?.ToBinaryContent(), cancellationToken.ToRequestOptions()).ConfigureAwait(false); + return CreateResultFromProtocol(protocolResult, Assistant.FromResponse); + } + + /// Creates a new assistant. + /// The default model that the assistant should use. + /// The additional to use. 
+ /// A token that can be used to cancel this method call. + /// is null or empty. + public virtual ClientResult CreateAssistant(string model, AssistantCreationOptions options = null, CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(model, nameof(model)); + options ??= new(); + options.Model = model; + + ClientResult protocolResult = CreateAssistant(options?.ToBinaryContent(), cancellationToken.ToRequestOptions()); + return CreateResultFromProtocol(protocolResult, Assistant.FromResponse); + } + + /// + /// Gets a page collection holding instances. + /// + /// Options describing the collection to return. + /// A token that can be used to cancel this method call. + /// holds pages of values. To obtain a collection of values, call + /// . To obtain the current + /// page of values, call . + /// A collection of pages of . + public virtual AsyncPageCollection GetAssistantsAsync( + AssistantCollectionOptions options = default, + CancellationToken cancellationToken = default) + { + return GetAssistantsAsync(options?.PageSize, options?.Order?.ToString(), options?.AfterId, options?.BeforeId, cancellationToken.ToRequestOptions()) + as AsyncPageCollection; + } + + /// + /// Rehydrates a page collection holding instances from a page token. + /// + /// Page token corresponding to the first page of the collection to rehydrate. + /// A token that can be used to cancel this method call. + /// holds pages of values. To obtain a collection of values, call + /// . To obtain the current + /// page of values, call . + /// A collection of pages of . 
+ public virtual AsyncPageCollection GetAssistantsAsync( + ContinuationToken firstPageToken, + CancellationToken cancellationToken = default) + { + Argument.AssertNotNull(firstPageToken, nameof(firstPageToken)); + + AssistantsPageToken pageToken = AssistantsPageToken.FromToken(firstPageToken); + return GetAssistantsAsync(pageToken?.Limit, pageToken?.Order, pageToken?.After, pageToken?.Before, cancellationToken.ToRequestOptions()) + as AsyncPageCollection; + } + + /// + /// Gets a page collection holding instances. + /// + /// Options describing the collection to return. + /// A token that can be used to cancel this method call. + /// holds pages of values. To obtain a collection of values, call + /// . To obtain the current + /// page of values, call . + /// A collection of pages of . + public virtual PageCollection GetAssistants( + AssistantCollectionOptions options = default, + CancellationToken cancellationToken = default) + { + return GetAssistants(options?.PageSize, options?.Order?.ToString(), options?.AfterId, options?.BeforeId, cancellationToken.ToRequestOptions()) + as PageCollection; + } + + /// + /// Rehydrates a page collection holding instances from a page token. + /// + /// Page token corresponding to the first page of the collection to rehydrate. + /// A token that can be used to cancel this method call. + /// holds pages of values. To obtain a collection of values, call + /// . To obtain the current + /// page of values, call . + /// A collection of pages of .
+ public virtual PageCollection GetAssistants( + ContinuationToken firstPageToken, + CancellationToken cancellationToken = default) + { + Argument.AssertNotNull(firstPageToken, nameof(firstPageToken)); + + AssistantsPageToken pageToken = AssistantsPageToken.FromToken(firstPageToken); + return GetAssistants(pageToken?.Limit, pageToken?.Order, pageToken?.After, pageToken?.Before, cancellationToken.ToRequestOptions()) + as PageCollection; + } + + /// + /// Gets an instance representing an existing based on its ID. + /// + /// The ID of the Assistant to retrieve. + /// A token that can be used to cancel this method call. + /// An instance representing the state of the Assistant with the provided ID. + public virtual async Task> GetAssistantAsync(string assistantId, CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(assistantId, nameof(assistantId)); + + ClientResult protocolResult = await GetAssistantAsync(assistantId, cancellationToken.ToRequestOptions()).ConfigureAwait(false); + return CreateResultFromProtocol(protocolResult, Assistant.FromResponse); + } + + /// + /// Gets an instance representing an existing based on its ID. + /// + /// The ID of the Assistant to retrieve. + /// A token that can be used to cancel this method call. + /// An instance representing the state of the Assistant with the provided ID. + public virtual ClientResult GetAssistant(string assistantId, CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(assistantId, nameof(assistantId)); + + ClientResult protocolResult = GetAssistant(assistantId, cancellationToken.ToRequestOptions()); + return CreateResultFromProtocol(protocolResult, Assistant.FromResponse); + } + + /// + /// Modifies an existing . + /// + /// The ID of the Assistant to modify. + /// The new options to apply to the existing Assistant. + /// A token that can be used to cancel this method call.
+ /// An updated instance representing the state of the Assistant with the provided ID. + public virtual async Task> ModifyAssistantAsync(string assistantId, AssistantModificationOptions options, CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(assistantId, nameof(assistantId)); + Argument.AssertNotNull(options, nameof(options)); + + using BinaryContent content = options.ToBinaryContent(); + ClientResult protocolResult + = await ModifyAssistantAsync(assistantId, content, cancellationToken.ToRequestOptions()).ConfigureAwait(false); + return CreateResultFromProtocol(protocolResult, Assistant.FromResponse); + } + + /// + /// Modifies an existing . + /// + /// The ID of the Assistant to modify. + /// The new options to apply to the existing Assistant. + /// A token that can be used to cancel this method call. + /// An updated instance representing the state of the Assistant with the provided ID. + public virtual ClientResult ModifyAssistant(string assistantId, AssistantModificationOptions options, CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(assistantId, nameof(assistantId)); + Argument.AssertNotNull(options, nameof(options)); + + using BinaryContent content = options.ToBinaryContent(); + ClientResult protocolResult = ModifyAssistant(assistantId, content, cancellationToken.ToRequestOptions()); + return CreateResultFromProtocol(protocolResult, Assistant.FromResponse); + } + + /// + /// Deletes an existing . + /// + /// The ID of the assistant to delete. + /// A token that can be used to cancel this method call. + /// A value indicating whether the deletion was successful.
+ public virtual async Task> DeleteAssistantAsync(string assistantId, CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(assistantId, nameof(assistantId)); + + ClientResult protocolResult = await DeleteAssistantAsync(assistantId, cancellationToken.ToRequestOptions()).ConfigureAwait(false); + return CreateResultFromProtocol(protocolResult, response + => InternalDeleteAssistantResponse.FromResponse(response).Deleted); + } + + /// + /// Deletes an existing . + /// + /// The ID of the assistant to delete. + /// A token that can be used to cancel this method call. + /// A value indicating whether the deletion was successful. + public virtual ClientResult DeleteAssistant(string assistantId, CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(assistantId, nameof(assistantId)); + + ClientResult protocolResult = DeleteAssistant(assistantId, cancellationToken.ToRequestOptions()); + return CreateResultFromProtocol(protocolResult, response + => InternalDeleteAssistantResponse.FromResponse(response).Deleted); + } + + /// + /// Creates a new . + /// + /// Additional options to use when creating the thread. + /// A token that can be used to cancel this method call. + /// A new thread. + public virtual async Task> CreateThreadAsync(ThreadCreationOptions options = null, CancellationToken cancellationToken = default) + { + ClientResult protocolResult = await CreateThreadAsync(options?.ToBinaryContent(), cancellationToken.ToRequestOptions()).ConfigureAwait(false); + return CreateResultFromProtocol(protocolResult, AssistantThread.FromResponse); + } + + /// + /// Creates a new . + /// + /// Additional options to use when creating the thread. + /// A token that can be used to cancel this method call. + /// A new thread. 
+ public virtual ClientResult CreateThread(ThreadCreationOptions options = null, CancellationToken cancellationToken = default) + { + ClientResult protocolResult = CreateThread(options?.ToBinaryContent(), cancellationToken.ToRequestOptions()); + return CreateResultFromProtocol(protocolResult, AssistantThread.FromResponse); + } + + /// + /// Gets an existing , retrieved via a known ID. + /// + /// The ID of the thread to retrieve. + /// A token that can be used to cancel this method call. + /// The existing thread instance. + public virtual async Task> GetThreadAsync(string threadId, CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + + ClientResult protocolResult = await GetThreadAsync(threadId, cancellationToken.ToRequestOptions()).ConfigureAwait(false); + return CreateResultFromProtocol(protocolResult, AssistantThread.FromResponse); + } + + /// + /// Gets an existing , retrieved via a known ID. + /// + /// The ID of the thread to retrieve. + /// A token that can be used to cancel this method call. + /// The existing thread instance. + public virtual ClientResult GetThread(string threadId, CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + + ClientResult protocolResult = GetThread(threadId, cancellationToken.ToRequestOptions()); + return CreateResultFromProtocol(protocolResult, AssistantThread.FromResponse); + } + + /// + /// Modifies an existing . + /// + /// The ID of the thread to modify. + /// The modifications to apply to the thread. + /// A token that can be used to cancel this method call. + /// The updated instance. 
+ public virtual async Task> ModifyThreadAsync(string threadId, ThreadModificationOptions options, CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + Argument.AssertNotNull(options, nameof(options)); + + ClientResult protocolResult = await ModifyThreadAsync(threadId, options?.ToBinaryContent(), cancellationToken.ToRequestOptions()).ConfigureAwait(false); + return CreateResultFromProtocol(protocolResult, AssistantThread.FromResponse); + } + + /// + /// Modifies an existing . + /// + /// The ID of the thread to modify. + /// The modifications to apply to the thread. + /// A token that can be used to cancel this method call. + /// The updated instance. + public virtual ClientResult ModifyThread(string threadId, ThreadModificationOptions options, CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + Argument.AssertNotNull(options, nameof(options)); + + ClientResult protocolResult = ModifyThread(threadId, options?.ToBinaryContent(), cancellationToken.ToRequestOptions()); + return CreateResultFromProtocol(protocolResult, AssistantThread.FromResponse); + } + + /// + /// Deletes an existing . + /// + /// The ID of the thread to delete. + /// A token that can be used to cancel this method call. + /// A value indicating whether the deletion was successful. + public virtual async Task> DeleteThreadAsync(string threadId, CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + + ClientResult protocolResult = await DeleteThreadAsync(threadId, cancellationToken.ToRequestOptions()).ConfigureAwait(false); + return CreateResultFromProtocol(protocolResult, response + => InternalDeleteThreadResponse.FromResponse(response).Deleted); + } + + /// + /// Deletes an existing . + /// + /// The ID of the thread to delete. + /// A token that can be used to cancel this method call. 
+ /// A value indicating whether the deletion was successful. + public virtual ClientResult DeleteThread(string threadId, CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + + ClientResult protocolResult = DeleteThread(threadId, cancellationToken.ToRequestOptions()); + return CreateResultFromProtocol(protocolResult, response + => InternalDeleteThreadResponse.FromResponse(response).Deleted); + } + + /// + /// Creates a new on an existing . + /// + /// The ID of the thread to associate the new message with. + /// The role to associate with the new message. + /// The collection of items for the message. + /// Additional options to apply to the new message. + /// A token that can be used to cancel this method call. + /// A new . + public virtual async Task> CreateMessageAsync( + string threadId, + MessageRole role, + IEnumerable content, + MessageCreationOptions options = null, + CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + options ??= new(); + options.Role = role; + options.Content.Clear(); + foreach (MessageContent contentItem in content) + { + options.Content.Add(contentItem); + } + + ClientResult protocolResult = await CreateMessageAsync(threadId, options?.ToBinaryContent(), cancellationToken.ToRequestOptions()) + .ConfigureAwait(false); + return CreateResultFromProtocol(protocolResult, ThreadMessage.FromResponse); + } + + /// + /// Creates a new on an existing . + /// + /// The ID of the thread to associate the new message with. + /// The role to associate with the new message. + /// The collection of items for the message. + /// Additional options to apply to the new message. + /// A token that can be used to cancel this method call. + /// A new . 
+ public virtual ClientResult CreateMessage( + string threadId, + MessageRole role, + IEnumerable content, + MessageCreationOptions options = null, + CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + options ??= new(); + options.Role = role; + options.Content.Clear(); + foreach (MessageContent contentItem in content) + { + options.Content.Add(contentItem); + } + + ClientResult protocolResult = CreateMessage(threadId, options?.ToBinaryContent(), cancellationToken.ToRequestOptions()); + return CreateResultFromProtocol(protocolResult, ThreadMessage.FromResponse); + } + + /// + /// Gets a page collection of instances from an existing . + /// + /// The ID of the thread to list messages from. + /// Options describing the collection to return. + /// A token that can be used to cancel this method call. + /// holds pages of values. To obtain a collection of values, call + /// . To obtain the current + /// page of values, call . + /// A collection of pages of . + public virtual AsyncPageCollection GetMessagesAsync( + string threadId, + MessageCollectionOptions options = default, + CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + + return GetMessagesAsync(threadId, options?.PageSize, options?.Order?.ToString(), options?.AfterId, options?.BeforeId, cancellationToken.ToRequestOptions()) + as AsyncPageCollection; + } + + /// + /// Rehydrates a page collection of instances from a page token. + /// + /// Page token corresponding to the first page of the collection to rehydrate. + /// A token that can be used to cancel this method call. + /// holds pages of values. To obtain a collection of values, call + /// . To obtain the current + /// page of values, call . + /// A collection of pages of . 
+ public virtual AsyncPageCollection GetMessagesAsync( + ContinuationToken firstPageToken, + CancellationToken cancellationToken = default) + { + Argument.AssertNotNull(firstPageToken, nameof(firstPageToken)); + + MessagesPageToken pageToken = MessagesPageToken.FromToken(firstPageToken); + return GetMessagesAsync(pageToken?.ThreadId, pageToken?.Limit, pageToken?.Order, pageToken?.After, pageToken?.Before, cancellationToken.ToRequestOptions()) + as AsyncPageCollection; + } + + /// + /// Gets a page collection holding instances from an existing . + /// + /// The ID of the thread to list messages from. + /// Options describing the collection to return. + /// A token that can be used to cancel this method call. + /// holds pages of values. To obtain a collection of values, call + /// . To obtain the current + /// page of values, call . + /// A collection of pages of . + public virtual PageCollection GetMessages( + string threadId, + MessageCollectionOptions options = default, + CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + + return GetMessages(threadId, options?.PageSize, options?.Order?.ToString(), options?.AfterId, options?.BeforeId, cancellationToken.ToRequestOptions()) + as PageCollection; + } + + /// + /// Rehydrates a page collection holding instances from a page token. + /// + /// Page token corresponding to the first page of the collection to rehydrate. + /// A token that can be used to cancel this method call. + /// holds pages of values. To obtain a collection of values, call + /// . To obtain the current + /// page of values, call . + /// A collection of pages of . 
+    public virtual PageCollection<ThreadMessage> GetMessages(
+        ContinuationToken firstPageToken,
+        CancellationToken cancellationToken = default)
+    {
+        Argument.AssertNotNull(firstPageToken, nameof(firstPageToken));
+
+        MessagesPageToken pageToken = MessagesPageToken.FromToken(firstPageToken);
+        return GetMessages(pageToken?.ThreadId, pageToken?.Limit, pageToken?.Order, pageToken?.After, pageToken?.Before, cancellationToken.ToRequestOptions())
+            as PageCollection<ThreadMessage>;
+    }
+
+    /// <summary>
+    /// Gets an existing <see cref="ThreadMessage"/> from a known <see cref="AssistantThread"/>.
+    /// </summary>
+    /// <param name="threadId"> The ID of the thread to retrieve the message from. </param>
+    /// <param name="messageId"> The ID of the message to retrieve. </param>
+    /// <param name="cancellationToken"> A token that can be used to cancel this method call. </param>
+    /// <returns> The existing <see cref="ThreadMessage"/> instance. </returns>
+    public virtual async Task<ClientResult<ThreadMessage>> GetMessageAsync(string threadId, string messageId, CancellationToken cancellationToken = default)
+    {
+        Argument.AssertNotNullOrEmpty(threadId, nameof(threadId));
+        Argument.AssertNotNullOrEmpty(messageId, nameof(messageId));
+
+        ClientResult protocolResult = await GetMessageAsync(threadId, messageId, cancellationToken.ToRequestOptions()).ConfigureAwait(false);
+        return CreateResultFromProtocol(protocolResult, ThreadMessage.FromResponse);
+    }
+
+    /// <summary>
+    /// Gets an existing <see cref="ThreadMessage"/> from a known <see cref="AssistantThread"/>.
+    /// </summary>
+    /// <param name="threadId"> The ID of the thread to retrieve the message from. </param>
+    /// <param name="messageId"> The ID of the message to retrieve. </param>
+    /// <param name="cancellationToken"> A token that can be used to cancel this method call. </param>
+    /// <returns> The existing <see cref="ThreadMessage"/> instance. </returns>
+    public virtual ClientResult<ThreadMessage> GetMessage(string threadId, string messageId, CancellationToken cancellationToken = default)
+    {
+        Argument.AssertNotNullOrEmpty(threadId, nameof(threadId));
+        Argument.AssertNotNullOrEmpty(messageId, nameof(messageId));
+
+        ClientResult protocolResult = GetMessage(threadId, messageId, cancellationToken.ToRequestOptions());
+        return CreateResultFromProtocol(protocolResult, ThreadMessage.FromResponse);
+    }
+
+    /// <summary>
+    /// Modifies an existing <see cref="ThreadMessage"/>.
+    /// </summary>
+    /// <param name="threadId"> The ID of the thread associated with the message to modify. </param>
+    /// <param name="messageId"> The ID of the message to modify. </param>
+ /// The changes to apply to the message. + /// A token that can be used to cancel this method call. + /// The updated . + public virtual async Task> ModifyMessageAsync(string threadId, string messageId, MessageModificationOptions options, CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + Argument.AssertNotNullOrEmpty(messageId, nameof(messageId)); + Argument.AssertNotNull(options, nameof(options)); + + ClientResult protocolResult = await ModifyMessageAsync(threadId, messageId, options?.ToBinaryContent(), cancellationToken.ToRequestOptions()) + .ConfigureAwait(false); + return CreateResultFromProtocol(protocolResult, ThreadMessage.FromResponse); + } + + /// + /// Modifies an existing . + /// + /// The ID of the thread associated with the message to modify. + /// The ID of the message to modify. + /// The changes to apply to the message. + /// A token that can be used to cancel this method call. + /// The updated . + public virtual ClientResult ModifyMessage(string threadId, string messageId, MessageModificationOptions options, CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + Argument.AssertNotNullOrEmpty(messageId, nameof(messageId)); + Argument.AssertNotNull(options, nameof(options)); + + ClientResult protocolResult = ModifyMessage(threadId, messageId, options?.ToBinaryContent(), cancellationToken.ToRequestOptions()); + return CreateResultFromProtocol(protocolResult, ThreadMessage.FromResponse); + } + + /// + /// Deletes an existing . + /// + /// The ID of the thread associated with the message. + /// The ID of the message. + /// A token that can be used to cancel this method call. + /// A value indicating whether the deletion was successful. 
+ public virtual async Task> DeleteMessageAsync(string threadId, string messageId, CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + Argument.AssertNotNullOrEmpty(messageId, nameof(messageId)); + + ClientResult protocolResult = await DeleteMessageAsync(threadId, messageId, cancellationToken.ToRequestOptions()).ConfigureAwait(false); + return CreateResultFromProtocol(protocolResult, response => + InternalDeleteMessageResponse.FromResponse(response).Deleted); + } + + /// + /// Deletes an existing . + /// + /// The ID of the thread associated with the message. + /// The ID of the message. + /// A token that can be used to cancel this method call. + /// A value indicating whether the deletion was successful. + public virtual ClientResult DeleteMessage(string threadId, string messageId, CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + Argument.AssertNotNullOrEmpty(messageId, nameof(messageId)); + + ClientResult protocolResult = DeleteMessage(threadId, messageId, cancellationToken.ToRequestOptions()); + return CreateResultFromProtocol(protocolResult, response => + InternalDeleteMessageResponse.FromResponse(response).Deleted); + } + + /// + /// Begins a new that evaluates a using a specified + /// . + /// + /// The ID of the thread that the run should evaluate. + /// The ID of the assistant that should be used when evaluating the thread. + /// Additional options for the run. + /// A token that can be used to cancel this method call. + /// A new instance. 
+ public virtual async Task> CreateRunAsync(string threadId, string assistantId, RunCreationOptions options = null, CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + Argument.AssertNotNullOrEmpty(assistantId, nameof(assistantId)); + options ??= new(); + options.AssistantId = assistantId; + options.Stream = null; + + ClientResult protocolResult = await CreateRunAsync(threadId, options.ToBinaryContent(), cancellationToken.ToRequestOptions()) + .ConfigureAwait(false); + return CreateResultFromProtocol(protocolResult, ThreadRun.FromResponse); + } + + /// + /// Begins a new that evaluates a using a specified + /// . + /// + /// The ID of the thread that the run should evaluate. + /// The ID of the assistant that should be used when evaluating the thread. + /// Additional options for the run. + /// A token that can be used to cancel this method call. + /// A new instance. + public virtual ClientResult CreateRun(string threadId, string assistantId, RunCreationOptions options = null, CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + Argument.AssertNotNullOrEmpty(assistantId, nameof(assistantId)); + options ??= new(); + options.AssistantId = assistantId; + options.Stream = null; + + ClientResult protocolResult = CreateRun(threadId, options.ToBinaryContent(), cancellationToken.ToRequestOptions()); + return CreateResultFromProtocol(protocolResult, ThreadRun.FromResponse); + } + + /// + /// Begins a new streaming that evaluates a using a specified + /// . + /// + /// The ID of the thread that the run should evaluate. + /// The ID of the assistant that should be used when evaluating the thread. + /// Additional options for the run. + /// A token that can be used to cancel this method call. 
+ public virtual AsyncCollectionResult CreateRunStreamingAsync( + string threadId, + string assistantId, + RunCreationOptions options = null, + CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + Argument.AssertNotNullOrEmpty(assistantId, nameof(assistantId)); + + options ??= new(); + options.AssistantId = assistantId; + options.Stream = true; + + async Task getResultAsync() => + await CreateRunAsync(threadId, options.ToBinaryContent(), cancellationToken.ToRequestOptions(streaming: true)) + .ConfigureAwait(false); + + return new AsyncStreamingUpdateCollection(getResultAsync); + } + + /// + /// Begins a new streaming that evaluates a using a specified + /// . + /// + /// The ID of the thread that the run should evaluate. + /// The ID of the assistant that should be used when evaluating the thread. + /// Additional options for the run. + /// A token that can be used to cancel this method call. + public virtual CollectionResult CreateRunStreaming( + string threadId, + string assistantId, + RunCreationOptions options = null, + CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + Argument.AssertNotNullOrEmpty(assistantId, nameof(assistantId)); + + options ??= new(); + options.AssistantId = assistantId; + options.Stream = true; + + ClientResult getResult() => CreateRun(threadId, options.ToBinaryContent(), cancellationToken.ToRequestOptions(streaming: true)); + + return new StreamingUpdateCollection(getResult); + } + + /// + /// Creates a new thread and immediately begins a run against it using the specified . + /// + /// The ID of the assistant that the new run should use. + /// Options for the new thread that will be created. + /// Additional options to apply to the run that will begin. + /// A token that can be used to cancel this method call. + /// A new . 
+ public virtual async Task> CreateThreadAndRunAsync( + string assistantId, + ThreadCreationOptions threadOptions = null, + RunCreationOptions runOptions = null, + CancellationToken cancellationToken = default) + { + runOptions ??= new(); + runOptions.Stream = null; + BinaryContent protocolContent = CreateThreadAndRunProtocolContent(assistantId, threadOptions, runOptions); + ClientResult protocolResult = await CreateThreadAndRunAsync(protocolContent, cancellationToken.ToRequestOptions()).ConfigureAwait(false); + return CreateResultFromProtocol(protocolResult, ThreadRun.FromResponse); + } + + /// + /// Creates a new thread and immediately begins a run against it using the specified . + /// + /// The ID of the assistant that the new run should use. + /// Options for the new thread that will be created. + /// Additional options to apply to the run that will begin. + /// A token that can be used to cancel this method call. + /// A new . + public virtual ClientResult CreateThreadAndRun( + string assistantId, + ThreadCreationOptions threadOptions = null, + RunCreationOptions runOptions = null, + CancellationToken cancellationToken = default) + { + runOptions ??= new(); + runOptions.Stream = null; + BinaryContent protocolContent = CreateThreadAndRunProtocolContent(assistantId, threadOptions, runOptions); + ClientResult protocolResult = CreateThreadAndRun(protocolContent, cancellationToken.ToRequestOptions()); + return CreateResultFromProtocol(protocolResult, ThreadRun.FromResponse); + } + + /// + /// Creates a new thread and immediately begins a streaming run against it using the specified . + /// + /// The ID of the assistant that the new run should use. + /// Options for the new thread that will be created. + /// Additional options to apply to the run that will begin. + /// A token that can be used to cancel this method call. 
+ public virtual AsyncCollectionResult CreateThreadAndRunStreamingAsync( + string assistantId, + ThreadCreationOptions threadOptions = null, + RunCreationOptions runOptions = null, + CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(assistantId, nameof(assistantId)); + + runOptions ??= new(); + runOptions.Stream = true; + BinaryContent protocolContent = CreateThreadAndRunProtocolContent(assistantId, threadOptions, runOptions); + + async Task getResultAsync() => + await CreateThreadAndRunAsync(protocolContent, cancellationToken.ToRequestOptions(streaming: true)) + .ConfigureAwait(false); + + return new AsyncStreamingUpdateCollection(getResultAsync); + } + + /// + /// Creates a new thread and immediately begins a streaming run against it using the specified . + /// + /// The ID of the assistant that the new run should use. + /// Options for the new thread that will be created. + /// Additional options to apply to the run that will begin. + /// A token that can be used to cancel this method call. + public virtual CollectionResult CreateThreadAndRunStreaming( + string assistantId, + ThreadCreationOptions threadOptions = null, + RunCreationOptions runOptions = null, + CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(assistantId, nameof(assistantId)); + + runOptions ??= new(); + runOptions.Stream = true; + BinaryContent protocolContent = CreateThreadAndRunProtocolContent(assistantId, threadOptions, runOptions); + + ClientResult getResult() => CreateThreadAndRun(protocolContent, cancellationToken.ToRequestOptions(streaming: true)); + + return new StreamingUpdateCollection(getResult); + } + + /// + /// Gets a page collection holding instances associated with an existing . + /// + /// The ID of the thread that runs in the list should be associated with. + /// Options describing the collection to return. + /// A token that can be used to cancel this method call. + /// holds pages of values. 
To obtain a collection of values, call + /// . To obtain the current + /// page of values, call . + /// A collection of pages of . + public virtual AsyncPageCollection GetRunsAsync( + string threadId, + RunCollectionOptions options = default, + CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + + return GetRunsAsync(threadId, options?.PageSize, options?.Order?.ToString(), options?.AfterId, options?.BeforeId, cancellationToken.ToRequestOptions()) + as AsyncPageCollection; + } + + /// + /// Rehydrates a page collection holding instances from a page token. + /// + /// Page token corresponding to the first page of the collection to rehydrate. + /// A token that can be used to cancel this method call. + /// holds pages of values. To obtain a collection of values, call + /// . To obtain the current + /// page of values, call . + /// A collection of pages of . + public virtual AsyncPageCollection GetRunsAsync( + ContinuationToken firstPageToken, + CancellationToken cancellationToken = default) + { + Argument.AssertNotNull(firstPageToken, nameof(firstPageToken)); + + RunsPageToken pageToken = RunsPageToken.FromToken(firstPageToken); + return GetRunsAsync(pageToken?.ThreadId, pageToken?.Limit, pageToken?.Order, pageToken?.After, pageToken?.Before, cancellationToken.ToRequestOptions()) + as AsyncPageCollection; + } + + /// + /// Gets a page collection holding instances associated with an existing . + /// + /// The ID of the thread that runs in the list should be associated with. + /// Options describing the collection to return. + /// A token that can be used to cancel this method call. + /// holds pages of values. To obtain a collection of values, call + /// . To obtain the current + /// page of values, call . + /// A collection of pages of . 
+ public virtual PageCollection GetRuns( + string threadId, + RunCollectionOptions options = default, + CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + + return GetRuns(threadId, options?.PageSize, options?.Order?.ToString(), options?.AfterId, options?.BeforeId, cancellationToken.ToRequestOptions()) + as PageCollection; + } + + /// + /// Rehydrates a page collection holding instances from a page token. + /// + /// Page token corresponding to the first page of the collection to rehydrate. + /// A token that can be used to cancel this method call. + /// holds pages of values. To obtain a collection of values, call + /// . To obtain the current + /// page of values, call . + /// A collection of pages of . + public virtual PageCollection GetRuns( + ContinuationToken firstPageToken, + CancellationToken cancellationToken = default) + { + Argument.AssertNotNull(firstPageToken, nameof(firstPageToken)); + + RunsPageToken pageToken = RunsPageToken.FromToken(firstPageToken); + return GetRuns(pageToken?.ThreadId, pageToken?.Limit, pageToken?.Order, pageToken?.After, pageToken?.Before, cancellationToken.ToRequestOptions()) + as PageCollection; + } + + /// + /// Gets an existing from a known . + /// + /// The ID of the thread to retrieve the run from. + /// The ID of the run to retrieve. + /// A token that can be used to cancel this method call. + /// The existing instance. + public virtual async Task> GetRunAsync(string threadId, string runId, CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + Argument.AssertNotNullOrEmpty(runId, nameof(runId)); + + ClientResult protocolResult = await GetRunAsync(threadId, runId, cancellationToken.ToRequestOptions()).ConfigureAwait(false); + return CreateResultFromProtocol(protocolResult, ThreadRun.FromResponse); + } + + /// + /// Gets an existing from a known . 
+ /// + /// The ID of the thread to retrieve the run from. + /// The ID of the run to retrieve. + /// A token that can be used to cancel this method call. + /// The existing instance. + public virtual ClientResult GetRun(string threadId, string runId, CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + Argument.AssertNotNullOrEmpty(runId, nameof(runId)); + + ClientResult protocolResult = GetRun(threadId, runId, cancellationToken.ToRequestOptions()); + return CreateResultFromProtocol(protocolResult, ThreadRun.FromResponse); + } + + /// + /// Submits a collection of required tool call outputs to a run and resumes the run. + /// + /// The thread ID of the thread being run. + /// The ID of the run that reached a requires_action status. + /// + /// The tool outputs, corresponding to instances from the run. + /// + /// A token that can be used to cancel this method call. + /// The , updated after the submission was processed. + public virtual async Task> SubmitToolOutputsToRunAsync( + string threadId, + string runId, + IEnumerable toolOutputs, + CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + Argument.AssertNotNullOrEmpty(runId, nameof(runId)); + + BinaryContent content = new InternalSubmitToolOutputsRunRequest(toolOutputs).ToBinaryContent(); + ClientResult protocolResult = await SubmitToolOutputsToRunAsync(threadId, runId, content, cancellationToken.ToRequestOptions()) + .ConfigureAwait(false); + return CreateResultFromProtocol(protocolResult, ThreadRun.FromResponse); + } + + /// + /// Submits a collection of required tool call outputs to a run and resumes the run. + /// + /// The thread ID of the thread being run. + /// The ID of the run that reached a requires_action status. + /// + /// The tool outputs, corresponding to instances from the run. + /// + /// A token that can be used to cancel this method call. 
+ /// The , updated after the submission was processed. + public virtual ClientResult SubmitToolOutputsToRun( + string threadId, + string runId, + IEnumerable toolOutputs, + CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + Argument.AssertNotNullOrEmpty(runId, nameof(runId)); + + BinaryContent content = new InternalSubmitToolOutputsRunRequest(toolOutputs).ToBinaryContent(); + ClientResult protocolResult = SubmitToolOutputsToRun(threadId, runId, content, cancellationToken.ToRequestOptions()); + return CreateResultFromProtocol(protocolResult, ThreadRun.FromResponse); + } + + /// + /// Submits a collection of required tool call outputs to a run and resumes the run with streaming enabled. + /// + /// The thread ID of the thread being run. + /// The ID of the run that reached a requires_action status. + /// + /// The tool outputs, corresponding to instances from the run. + /// + /// A token that can be used to cancel this method call. + public virtual AsyncCollectionResult SubmitToolOutputsToRunStreamingAsync( + string threadId, + string runId, + IEnumerable toolOutputs, + CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + Argument.AssertNotNullOrEmpty(runId, nameof(runId)); + + BinaryContent content = new InternalSubmitToolOutputsRunRequest(toolOutputs.ToList(), stream: true, null) + .ToBinaryContent(); + + async Task getResultAsync() => + await SubmitToolOutputsToRunAsync(threadId, runId, content, cancellationToken.ToRequestOptions(streaming: true)) + .ConfigureAwait(false); + + return new AsyncStreamingUpdateCollection(getResultAsync); + } + + /// + /// Submits a collection of required tool call outputs to a run and resumes the run with streaming enabled. + /// + /// The thread ID of the thread being run. + /// The ID of the run that reached a requires_action status. + /// + /// The tool outputs, corresponding to instances from the run. 
+ /// + /// A token that can be used to cancel this method call. + public virtual CollectionResult SubmitToolOutputsToRunStreaming( + string threadId, + string runId, + IEnumerable toolOutputs, + CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + Argument.AssertNotNullOrEmpty(runId, nameof(runId)); + + BinaryContent content = new InternalSubmitToolOutputsRunRequest(toolOutputs.ToList(), stream: true, null) + .ToBinaryContent(); + + ClientResult getResult() => SubmitToolOutputsToRun(threadId, runId, content, cancellationToken.ToRequestOptions(streaming: true)); + + return new StreamingUpdateCollection(getResult); + } + + /// + /// Cancels an in-progress . + /// + /// The ID of the thread associated with the run. + /// The ID of the run to cancel. + /// A token that can be used to cancel this method call. + /// An updated instance, reflecting the new status of the run. + public virtual async Task> CancelRunAsync(string threadId, string runId, CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + Argument.AssertNotNullOrEmpty(runId, nameof(runId)); + + ClientResult protocolResult = await CancelRunAsync(threadId, runId, cancellationToken.ToRequestOptions()).ConfigureAwait(false); + return CreateResultFromProtocol(protocolResult, ThreadRun.FromResponse); + } + + /// + /// Cancels an in-progress . + /// + /// The ID of the thread associated with the run. + /// The ID of the run to cancel. + /// A token that can be used to cancel this method call. + /// An updated instance, reflecting the new status of the run. 
+ public virtual ClientResult CancelRun(string threadId, string runId, CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + Argument.AssertNotNullOrEmpty(runId, nameof(runId)); + + ClientResult protocolResult = CancelRun(threadId, runId, cancellationToken.ToRequestOptions()); + return CreateResultFromProtocol(protocolResult, ThreadRun.FromResponse); + } + + /// + /// Gets a page collection holding instances associated with a . + /// + /// The ID of the thread associated with the run. + /// The ID of the run to list run steps from. + /// + /// A token that can be used to cancel this method call. + /// holds pages of values. To obtain a collection of values, call + /// . To obtain the current + /// page of values, call . + /// A collection of pages of . + public virtual AsyncPageCollection GetRunStepsAsync( + string threadId, + string runId, + RunStepCollectionOptions options = default, + CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + Argument.AssertNotNullOrEmpty(runId, nameof(runId)); + + return GetRunStepsAsync(threadId, runId, options?.PageSize, options?.Order?.ToString(), options?.AfterId, options?.BeforeId, cancellationToken.ToRequestOptions()) + as AsyncPageCollection; + } + + /// + /// Rehydrates a page collection holding instances from a page token. + /// + /// Page token corresponding to the first page of the collection to rehydrate. + /// A token that can be used to cancel this method call. + /// holds pages of values. To obtain a collection of values, call + /// . To obtain the current + /// page of values, call . + /// A collection of pages of . 
+ public virtual AsyncPageCollection GetRunStepsAsync( + ContinuationToken firstPageToken, + CancellationToken cancellationToken = default) + { + Argument.AssertNotNull(firstPageToken, nameof(firstPageToken)); + + RunStepsPageToken pageToken = RunStepsPageToken.FromToken(firstPageToken); + return GetRunStepsAsync(pageToken?.ThreadId, pageToken?.RunId, pageToken?.Limit, pageToken?.Order, pageToken?.After, pageToken?.Before, cancellationToken.ToRequestOptions()) + as AsyncPageCollection; + } + + /// + /// Gets a page collection holding instances associated with a . + /// + /// The ID of the thread associated with the run. + /// The ID of the run to list run steps from. + /// + /// A token that can be used to cancel this method call. + /// holds pages of values. To obtain a collection of values, call + /// . To obtain the current + /// page of values, call . + /// A collection of pages of . + public virtual PageCollection GetRunSteps( + string threadId, + string runId, + RunStepCollectionOptions options = default, + CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + Argument.AssertNotNullOrEmpty(runId, nameof(runId)); + + return GetRunSteps(threadId, runId, options?.PageSize, options?.Order?.ToString(), options?.AfterId, options?.BeforeId, cancellationToken.ToRequestOptions()) + as PageCollection; + } + + /// + /// Rehydrates a page collection holding instances from a page token. + /// + /// Page token corresponding to the first page of the collection to rehydrate. + /// A token that can be used to cancel this method call. + /// holds pages of values. To obtain a collection of values, call + /// . To obtain the current + /// page of values, call . + /// A collection of pages of . 
+    public virtual PageCollection<RunStep> GetRunSteps(
+        ContinuationToken firstPageToken,
+        CancellationToken cancellationToken = default)
+    {
+        Argument.AssertNotNull(firstPageToken, nameof(firstPageToken));
+
+        RunStepsPageToken pageToken = RunStepsPageToken.FromToken(firstPageToken);
+        return GetRunSteps(pageToken?.ThreadId, pageToken?.RunId, pageToken?.Limit, pageToken?.Order, pageToken?.After, pageToken?.Before, cancellationToken.ToRequestOptions())
+            as PageCollection<RunStep>;
+    }
+
+    /// <summary>
+    /// Gets a single run step from a run.
+    /// </summary>
+    /// <param name="threadId"> The ID of the thread associated with the run. </param>
+    /// <param name="runId"> The ID of the run. </param>
+    /// <param name="stepId"> The ID of the run step. </param>
+    /// <param name="cancellationToken"> A token that can be used to cancel this method call. </param>
+    /// <returns> A <see cref="RunStep"/> instance corresponding to the specified step. </returns>
+    public virtual async Task<ClientResult<RunStep>> GetRunStepAsync(string threadId, string runId, string stepId, CancellationToken cancellationToken = default)
+    {
+        ClientResult protocolResult = await GetRunStepAsync(threadId, runId, stepId, cancellationToken.ToRequestOptions()).ConfigureAwait(false);
+        return CreateResultFromProtocol(protocolResult, RunStep.FromResponse);
+    }
+
+    /// <summary>
+    /// Gets a single run step from a run.
+    /// </summary>
+    /// <param name="threadId"> The ID of the thread associated with the run. </param>
+    /// <param name="runId"> The ID of the run. </param>
+    /// <param name="stepId"> The ID of the run step. </param>
+    /// <param name="cancellationToken"> A token that can be used to cancel this method call. </param>
+    /// <returns> A <see cref="RunStep"/> instance corresponding to the specified step. </returns>
+    public virtual ClientResult<RunStep> GetRunStep(string threadId, string runId, string stepId, CancellationToken cancellationToken = default)
+    {
+        ClientResult protocolResult = GetRunStep(threadId, runId, stepId, cancellationToken.ToRequestOptions());
+        return CreateResultFromProtocol(protocolResult, RunStep.FromResponse);
+    }
+
+    private static BinaryContent CreateThreadAndRunProtocolContent(
+        string assistantId,
+        ThreadCreationOptions threadOptions,
+        RunCreationOptions runOptions)
+    {
+        Argument.AssertNotNullOrEmpty(assistantId, nameof(assistantId));
+        InternalCreateThreadAndRunRequest internalRequest = new(
+            assistantId,
+            threadOptions,
+            runOptions.ModelOverride,
+            runOptions.InstructionsOverride,
+            runOptions.ToolsOverride,
+            // TODO: reconcile exposure of the two different tool_resources, if needed
+            threadOptions?.ToolResources,
+            runOptions.Metadata,
+            runOptions.Temperature,
+            runOptions.NucleusSamplingFactor,
+            runOptions.Stream,
+            runOptions.MaxPromptTokens,
+            runOptions.MaxCompletionTokens,
+            runOptions.TruncationStrategy,
+            runOptions.ToolConstraint,
+            runOptions.ParallelToolCallsEnabled,
+            runOptions.ResponseFormat,
+            serializedAdditionalRawData: null);
+        return internalRequest.ToBinaryContent();
+    }
+
+    [MethodImpl(MethodImplOptions.AggressiveInlining)]
+    private static ClientResult<T> CreateResultFromProtocol<T>(ClientResult protocolResult, Func<PipelineResponse, T> responseDeserializer)
+    {
+        PipelineResponse pipelineResponse = protocolResult?.GetRawResponse();
+        T deserializedResultValue = responseDeserializer.Invoke(pipelineResponse);
+        return ClientResult.FromValue(deserializedResultValue, pipelineResponse);
+    }
+}
diff --git a/.dotnet/src/Custom/Assistants/AssistantCollectionOptions.cs b/.dotnet/src/Custom/Assistants/AssistantCollectionOptions.cs
new file mode 100644
index 000000000..c7baa96d1
--- /dev/null
+++ b/.dotnet/src/Custom/Assistants/AssistantCollectionOptions.cs
@@ -0,0 +1,33 @@
+namespace OpenAI.Assistants;
+
+/// <summary>
+/// Represents additional options available when requesting a collection of <see cref="Assistant"/> instances.
+/// </summary>
+public class AssistantCollectionOptions
+{
+    /// <summary>
+    /// Creates a new instance of <see cref="AssistantCollectionOptions"/>.
+    /// </summary>
+    public AssistantCollectionOptions() { }
+
+    /// <summary>
+    /// The order that results should appear in the list according to
+    /// their created_at timestamp.
+    /// </summary>
+    public ListOrder? Order { get; set; }
+
+    /// <summary>
+    /// The number of values to return in a page result.
+    /// </summary>
+    public int? PageSize { get; set; }
+
+    /// <summary>
+    /// The id of the item preceding the first item in the collection.
+    /// </summary>
+    public string AfterId { get; set; }
+
+    /// <summary>
+    /// The id of the item following the last item in the collection.
+    /// </summary>
+    public string BeforeId { get; set; }
+}
diff --git a/.dotnet/src/Custom/Assistants/AssistantCreationOptions.cs b/.dotnet/src/Custom/Assistants/AssistantCreationOptions.cs
new file mode 100644
index 000000000..c00b6e82e
--- /dev/null
+++ b/.dotnet/src/Custom/Assistants/AssistantCreationOptions.cs
@@ -0,0 +1,54 @@
+using System.Collections.Generic;
+
+namespace OpenAI.Assistants;
+
+/// <summary>
+/// Represents additional options available when creating a new <see cref="Assistant"/>.
+/// </summary>
+[CodeGenModel("CreateAssistantRequest")]
+[CodeGenSuppress(nameof(AssistantCreationOptions), typeof(string))]
+public partial class AssistantCreationOptions
+{
+    // CUSTOM: visibility hidden to promote required property to method parameter
+    internal string Model { get; set; }
+
+    ///
+    ///
+    /// A list of tool enabled on the assistant.
+    ///
+    /// There can be a maximum of 128 tools per assistant. Tools can be of types `code_interpreter`, `file_search`, or `function`.
+ /// + [CodeGenMember("Tools")] + public IList<ToolDefinition> Tools { get; } = new ChangeTrackingList<ToolDefinition>(); + + /// + [CodeGenMember("ToolResources")] + public ToolResources ToolResources { get; set; } + + /// + [CodeGenMember("ResponseFormat")] + public AssistantResponseFormat ResponseFormat { get; set; } + + /// + /// An alternative to sampling with temperature, called nucleus sampling, where the model considers the results of the tokens with top_p probability mass. So 0.1 means only the tokens comprising the top 10% probability mass are considered. + /// + /// We generally recommend altering this or temperature but not both. + /// + [CodeGenMember("TopP")] + public float? NucleusSamplingFactor { get; set; } + + internal AssistantCreationOptions(InternalCreateAssistantRequestModel model) + : this() + { + Model = model.ToString(); + } + + /// + /// Creates a new instance of <see cref="AssistantCreationOptions"/>. + /// + public AssistantCreationOptions() + { + Metadata = new ChangeTrackingDictionary<string, string>(); + Tools = new ChangeTrackingList<ToolDefinition>(); + } +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Assistants/AssistantModificationOptions.cs b/.dotnet/src/Custom/Assistants/AssistantModificationOptions.cs new file mode 100644 index 000000000..bbd8b2efb --- /dev/null +++ b/.dotnet/src/Custom/Assistants/AssistantModificationOptions.cs @@ -0,0 +1,43 @@ +using System.Collections.Generic; + +namespace OpenAI.Assistants; + +/// +/// Represents additional options available when modifying an existing <see cref="Assistant"/>. +/// +[CodeGenModel("ModifyAssistantRequest")] +public partial class AssistantModificationOptions +{ + /// + /// The replacement model that the assistant should use. + /// + public string Model { get; set; } + + /// + /// + /// A list of tools enabled on the assistant. + /// + /// There can be a maximum of 128 tools per assistant. Tools can be of types `code_interpreter`, `file_search`, or `function`.
+ /// + [CodeGenMember("Tools")] + public IList<ToolDefinition> DefaultTools { get; set; } = new ChangeTrackingList<ToolDefinition>(); + + // CUSTOM: reuse common request/response models for tool resources. Note that modification operations use the + // response models (which do not contain resource initialization helpers). + + /// + [CodeGenMember("ToolResources")] + public ToolResources ToolResources { get; set; } + + /// + [CodeGenMember("ResponseFormat")] + public AssistantResponseFormat ResponseFormat { get; set; } + + /// + /// An alternative to sampling with temperature, called nucleus sampling, where the model considers the results of the tokens with top_p probability mass. So 0.1 means only the tokens comprising the top 10% probability mass are considered. + /// + /// We generally recommend altering this or temperature but not both. + /// + [CodeGenMember("TopP")] + public float? NucleusSamplingFactor { get; set; } +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Assistants/AssistantResponseFormat.Serialization.cs b/.dotnet/src/Custom/Assistants/AssistantResponseFormat.Serialization.cs new file mode 100644 index 000000000..363c33eab --- /dev/null +++ b/.dotnet/src/Custom/Assistants/AssistantResponseFormat.Serialization.cs @@ -0,0 +1,60 @@ +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Text.Json; + +namespace OpenAI.Assistants; + +[CodeGenSuppress("global::System.ClientModel.Primitives.IJsonModel.Write", typeof(Utf8JsonWriter), typeof(ModelReaderWriterOptions))] +[CodeGenSuppress("global::System.ClientModel.Primitives.IJsonModel.Create", typeof(Utf8JsonReader), typeof(ModelReaderWriterOptions))] +[CodeGenSuppress("global::System.ClientModel.Primitives.IPersistableModel.Write", typeof(ModelReaderWriterOptions))] +[CodeGenSuppress("global::System.ClientModel.Primitives.IPersistableModel.Create", typeof(BinaryData), typeof(ModelReaderWriterOptions))]
+[CodeGenSuppress("global::System.ClientModel.Primitives.IPersistableModel.GetFormatFromOptions", typeof(ModelReaderWriterOptions))] +public partial class AssistantResponseFormat : IJsonModel<AssistantResponseFormat> +{ + internal static void SerializeAssistantResponseFormat(AssistantResponseFormat instance, Utf8JsonWriter writer, ModelReaderWriterOptions options = null) + { + throw new InvalidOperationException(); + } + + internal static AssistantResponseFormat DeserializeAssistantResponseFormat(JsonElement element, ModelReaderWriterOptions options = null) + { + return element.ValueKind switch + { + JsonValueKind.String => InternalAssistantResponseFormatPlainTextNoObject.DeserializeInternalAssistantResponseFormatPlainTextNoObject(element, options), + JsonValueKind.Object when element.TryGetProperty("type", out JsonElement discriminatorElement) + => discriminatorElement.GetString() switch + { + "json_object" => InternalAssistantResponseFormatJsonObject.DeserializeInternalAssistantResponseFormatJsonObject(element, options), + "json_schema" => InternalAssistantResponseFormatJsonSchema.DeserializeInternalAssistantResponseFormatJsonSchema(element, options), + "text" => InternalAssistantResponseFormatText.DeserializeInternalAssistantResponseFormatText(element, options), + _ => null, + }, + _ => null, + }; + } + + void IJsonModel<AssistantResponseFormat>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + => CustomSerializationHelpers.SerializeInstance(this, SerializeAssistantResponseFormat, writer, options); + + AssistantResponseFormat IJsonModel<AssistantResponseFormat>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + => CustomSerializationHelpers.DeserializeNewInstance(this, DeserializeAssistantResponseFormat, ref reader, options); + + BinaryData IPersistableModel<AssistantResponseFormat>.Write(ModelReaderWriterOptions options) + => CustomSerializationHelpers.SerializeInstance(this, options); + + AssistantResponseFormat IPersistableModel<AssistantResponseFormat>.Create(BinaryData data, ModelReaderWriterOptions options) + =>
CustomSerializationHelpers.DeserializeNewInstance(this, DeserializeAssistantResponseFormat, data, options); + + string IPersistableModel<AssistantResponseFormat>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static AssistantResponseFormat FromResponse(PipelineResponse response) + { + throw new InvalidOperationException(); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } +} diff --git a/.dotnet/src/Custom/Assistants/AssistantResponseFormat.cs b/.dotnet/src/Custom/Assistants/AssistantResponseFormat.cs new file mode 100644 index 000000000..c87b9b076 --- /dev/null +++ b/.dotnet/src/Custom/Assistants/AssistantResponseFormat.cs @@ -0,0 +1,109 @@ +using OpenAI.Internal; +using System; +using System.ClientModel.Primitives; +using System.ComponentModel; + +namespace OpenAI.Assistants; + +[CodeGenModel("AssistantResponseFormat")] +public partial class AssistantResponseFormat : IEquatable<AssistantResponseFormat>, IEquatable<string> +{ + public static AssistantResponseFormat Auto { get; } = CreateAutoFormat(); + public static AssistantResponseFormat Text { get; } = CreateTextFormat(); + public static AssistantResponseFormat JsonObject { get; } = CreateJsonObjectFormat(); + + public static AssistantResponseFormat CreateAutoFormat() + => new InternalAssistantResponseFormatPlainTextNoObject("auto"); + public static AssistantResponseFormat CreateTextFormat() + => new InternalAssistantResponseFormatText(); + public static AssistantResponseFormat CreateJsonObjectFormat() + => new InternalAssistantResponseFormatJsonObject(); + public static AssistantResponseFormat CreateJsonSchemaFormat( + string name, + BinaryData jsonSchema, + string description = null, + bool?
strictSchemaEnabled = null) + { + Argument.AssertNotNullOrEmpty(name, nameof(name)); + Argument.AssertNotNull(jsonSchema, nameof(jsonSchema)); + + InternalResponseFormatJsonSchemaJsonSchema internalSchema = new( + description, + name, + jsonSchema, + strictSchemaEnabled, + null); + return new InternalAssistantResponseFormatJsonSchema(internalSchema); + } + + [EditorBrowsable(EditorBrowsableState.Never)] + public static bool operator ==(AssistantResponseFormat first, AssistantResponseFormat second) + { + if (first is null) + { + return second is null; + } + return first.Equals(second); + } + + [EditorBrowsable(EditorBrowsableState.Never)] + public static bool operator !=(AssistantResponseFormat first, AssistantResponseFormat second) + => !(first == second); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) + { + return (this as IEquatable<AssistantResponseFormat>).Equals(obj as AssistantResponseFormat) + || ToString().Equals(obj?.ToString()); + } + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => ToString().GetHashCode(); + + [EditorBrowsable(EditorBrowsableState.Never)] + public static implicit operator AssistantResponseFormat(string plainTextFormat) + => new InternalAssistantResponseFormatPlainTextNoObject(plainTextFormat); + + [EditorBrowsable(EditorBrowsableState.Never)] + bool IEquatable<AssistantResponseFormat>.Equals(AssistantResponseFormat other) + { + if (other is null) + { + return false; + } + + if (Object.ReferenceEquals(this, other)) + { + return true; + } + + return + (this is InternalAssistantResponseFormatPlainTextNoObject thisPlainText + && other is InternalAssistantResponseFormatPlainTextNoObject otherPlainText + && thisPlainText.Value.Equals(otherPlainText.Value)) + || (this is InternalAssistantResponseFormatText && other is InternalAssistantResponseFormatText) + || (this is InternalAssistantResponseFormatJsonObject && other is InternalAssistantResponseFormatJsonObject) + || (this is
InternalAssistantResponseFormatJsonSchema thisJsonSchema + && other is InternalAssistantResponseFormatJsonSchema otherJsonSchema + && thisJsonSchema.JsonSchema.Name.Equals(otherJsonSchema.JsonSchema.Name)); + } + + [EditorBrowsable(EditorBrowsableState.Never)] + bool IEquatable<string>.Equals(string other) + { + return this is InternalAssistantResponseFormatPlainTextNoObject thisPlainText + && thisPlainText.Value.Equals(other); + } + + public override string ToString() + { + if (this is InternalAssistantResponseFormatPlainTextNoObject plainTextInstance) + { + return plainTextInstance.Value; + } + else + { + return ModelReaderWriter.Write(this).ToString(); + } + } +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Assistants/AssistantThread.cs b/.dotnet/src/Custom/Assistants/AssistantThread.cs new file mode 100644 index 000000000..376766668 --- /dev/null +++ b/.dotnet/src/Custom/Assistants/AssistantThread.cs @@ -0,0 +1,18 @@ +namespace OpenAI.Assistants; + +[CodeGenModel("ThreadObject")] +public partial class AssistantThread +{ + // CUSTOM: Made internal. + /// The object type, which is always `thread`. + [CodeGenMember("Object")] + internal InternalThreadObjectObject Object { get; } = InternalThreadObjectObject.Thread; + + + /// + /// The set of resources that are made available to the assistant's tools on this thread. + /// The resources are specific to the type of tool. + /// For example, the `code_interpreter` tool requires a list of file IDs, while the `file_search` tool requires a list of vector store IDs.
+ /// + public ToolResources ToolResources { get; } +} diff --git a/.dotnet/src/Custom/Assistants/CodeInterpreterToolDefinition.Serialization.cs b/.dotnet/src/Custom/Assistants/CodeInterpreterToolDefinition.Serialization.cs new file mode 100644 index 000000000..6318e87c7 --- /dev/null +++ b/.dotnet/src/Custom/Assistants/CodeInterpreterToolDefinition.Serialization.cs @@ -0,0 +1,26 @@ +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants; + +[CodeGenSuppress("global::System.ClientModel.Primitives.IJsonModel.Write", typeof(Utf8JsonWriter), typeof(ModelReaderWriterOptions))] +public partial class CodeInterpreterToolDefinition : IJsonModel<CodeInterpreterToolDefinition> +{ + void IJsonModel<CodeInterpreterToolDefinition>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + => CustomSerializationHelpers.SerializeInstance(this, SerializeCodeInterpreterToolDefinition, writer, options); + + internal static void SerializeCodeInterpreterToolDefinition(CodeInterpreterToolDefinition instance, Utf8JsonWriter writer, ModelReaderWriterOptions options) + => instance.WriteCore(writer, options); + + protected override void WriteCore(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + writer.WriteStartObject(); + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + writer.WriteSerializedAdditionalRawData(SerializedAdditionalRawData, options); + writer.WriteEndObject(); + } +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Assistants/CodeInterpreterToolResources.cs b/.dotnet/src/Custom/Assistants/CodeInterpreterToolResources.cs new file mode 100644 index 000000000..82f354922 --- /dev/null +++ b/.dotnet/src/Custom/Assistants/CodeInterpreterToolResources.cs @@ -0,0 +1,27 @@ +using System.Collections.Generic; + +namespace OpenAI.Assistants; + +/// The AssistantObjectToolResourcesCodeInterpreter.
+[CodeGenModel("AssistantObjectToolResourcesCodeInterpreter")] +public partial class CodeInterpreterToolResources +{ + private ChangeTrackingList<string> _fileIds = new ChangeTrackingList<string>(); + + /// A list of [file](/docs/api-reference/files) IDs made available to the `code_interpreter` tool. There can be a maximum of 20 files associated with the tool. + public IList<string> FileIds + { + get => _fileIds; + set + { + _fileIds = new ChangeTrackingList<string>(); + foreach (string fileId in value) + { + _fileIds.Add(fileId); + } + } + } + + public CodeInterpreterToolResources() + { } +} diff --git a/.dotnet/src/Custom/Assistants/FileSearchToolDefinition.Serialization.cs b/.dotnet/src/Custom/Assistants/FileSearchToolDefinition.Serialization.cs new file mode 100644 index 000000000..19cd98074 --- /dev/null +++ b/.dotnet/src/Custom/Assistants/FileSearchToolDefinition.Serialization.cs @@ -0,0 +1,34 @@ +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants; + +[CodeGenSuppress("global::System.ClientModel.Primitives.IJsonModel.Write", typeof(Utf8JsonWriter), typeof(ModelReaderWriterOptions))] +public partial class FileSearchToolDefinition : IJsonModel<FileSearchToolDefinition> +{ + void IJsonModel<FileSearchToolDefinition>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + => CustomSerializationHelpers.SerializeInstance(this, SerializeFileSearchToolDefinition, writer, options); + + internal static void SerializeFileSearchToolDefinition(FileSearchToolDefinition instance, Utf8JsonWriter writer, ModelReaderWriterOptions options) + => instance.WriteCore(writer, options); + + protected override void WriteCore(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + writer.WriteStartObject(); + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + if (Optional.IsDefined(MaxResults)) + { + writer.WritePropertyName("file_search"u8); + writer.WriteStartObject(); +
writer.WritePropertyName("max_num_results"u8); + writer.WriteNumberValue(MaxResults.Value); + writer.WriteEndObject(); + } + writer.WriteSerializedAdditionalRawData(SerializedAdditionalRawData, options); + writer.WriteEndObject(); + } +} diff --git a/.dotnet/src/Custom/Assistants/FileSearchToolDefinition.cs b/.dotnet/src/Custom/Assistants/FileSearchToolDefinition.cs new file mode 100644 index 000000000..2bc8efff7 --- /dev/null +++ b/.dotnet/src/Custom/Assistants/FileSearchToolDefinition.cs @@ -0,0 +1,28 @@ +using System; +using System.Collections.Generic; +using System.Diagnostics.CodeAnalysis; + +namespace OpenAI.Assistants; + +[CodeGenModel("AssistantToolsFileSearch")] +[CodeGenSuppress(nameof(FileSearchToolDefinition))] +public partial class FileSearchToolDefinition : ToolDefinition +{ + public int? MaxResults + { + get => _fileSearch?.InternalMaxNumResults; + set => _fileSearch.InternalMaxNumResults = value; + } + + /// + /// Creates a new instance of <see cref="FileSearchToolDefinition"/>. + /// + public FileSearchToolDefinition() + : base("file_search") + { + _fileSearch = new InternalAssistantToolsFileSearchFileSearch(); + } + + [CodeGenMember("FileSearch")] + private InternalAssistantToolsFileSearchFileSearch _fileSearch; +} diff --git a/.dotnet/src/Custom/Assistants/FileSearchToolResources.cs b/.dotnet/src/Custom/Assistants/FileSearchToolResources.cs new file mode 100644 index 000000000..53e0d6bcf --- /dev/null +++ b/.dotnet/src/Custom/Assistants/FileSearchToolResources.cs @@ -0,0 +1,36 @@ +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants; + +[CodeGenModel("ToolResourcesFileSearch")] +[CodeGenSerialization(nameof(NewVectorStores), "vector_stores", SerializationValueHook = nameof(SerializeNewVectorStores))] +public partial class FileSearchToolResources +{ + private ChangeTrackingList<string> _vectorStoreIds = new(); + + [CodeGenMember("VectorStoreIds")] + public IList<string> VectorStoreIds + { + get => _vectorStoreIds; + set + { +
_vectorStoreIds = new ChangeTrackingList<string>(); + foreach (string item in value) + { + _vectorStoreIds.Add(item); + } + } + } + + [CodeGenMember("VectorStores")] + public IList<VectorStoreCreationHelper> NewVectorStores { get; } = new ChangeTrackingList<VectorStoreCreationHelper>(); + + public FileSearchToolResources() + { } + + private void SerializeNewVectorStores(Utf8JsonWriter writer, ModelReaderWriterOptions options) + => writer.WriteObjectValue(NewVectorStores, options); +} + diff --git a/.dotnet/src/Custom/Assistants/FunctionToolDefinition.Serialization.cs b/.dotnet/src/Custom/Assistants/FunctionToolDefinition.Serialization.cs new file mode 100644 index 000000000..701725671 --- /dev/null +++ b/.dotnet/src/Custom/Assistants/FunctionToolDefinition.Serialization.cs @@ -0,0 +1,28 @@ +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants; + +[CodeGenSuppress("global::System.ClientModel.Primitives.IJsonModel.Write", typeof(Utf8JsonWriter), typeof(ModelReaderWriterOptions))] +public partial class FunctionToolDefinition : IJsonModel<FunctionToolDefinition> +{ + void IJsonModel<FunctionToolDefinition>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + => CustomSerializationHelpers.SerializeInstance(this, SerializeFunctionToolDefinition, writer, options); + + internal static void SerializeFunctionToolDefinition(FunctionToolDefinition instance, Utf8JsonWriter writer, ModelReaderWriterOptions options) + => instance.WriteCore(writer, options); + + protected override void WriteCore(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + writer.WriteStartObject(); + writer.WritePropertyName("function"u8); + writer.WriteObjectValue(_internalFunction, options); + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + writer.WriteSerializedAdditionalRawData(SerializedAdditionalRawData, options); + writer.WriteEndObject(); + } +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Assistants/FunctionToolDefinition.cs
b/.dotnet/src/Custom/Assistants/FunctionToolDefinition.cs new file mode 100644 index 000000000..70e2ba163 --- /dev/null +++ b/.dotnet/src/Custom/Assistants/FunctionToolDefinition.cs @@ -0,0 +1,69 @@ +using System; +using System.Collections.Generic; +using System.Diagnostics.CodeAnalysis; + +namespace OpenAI.Assistants; + +[CodeGenModel("AssistantToolsFunction")] +[CodeGenSuppress(nameof(FunctionToolDefinition), typeof(InternalFunctionDefinition))] +public partial class FunctionToolDefinition : ToolDefinition +{ + // CUSTOM: the visibility of the underlying function object is hidden to simplify the structure of the tool. + + [CodeGenMember("Function")] + private readonly InternalFunctionDefinition _internalFunction; + + /// + public required string FunctionName + { + get => _internalFunction.Name; + set => _internalFunction.Name = value; + } + + /// + public string Description + { + get => _internalFunction.Description; + set => _internalFunction.Description = value; + } + + /// + public BinaryData Parameters + { + get => _internalFunction.Parameters; + set => _internalFunction.Parameters = value; + } + + public bool? StrictParameterSchemaEnabled + { + get => _internalFunction.Strict; + set => _internalFunction.Strict = value; + } + + /// + /// Creates a new instance of <see cref="FunctionToolDefinition"/>. + /// + [SetsRequiredMembers] + public FunctionToolDefinition(string name) + : base("function") + { + Argument.AssertNotNullOrEmpty(name, nameof(name)); + _internalFunction = new(null, name, null, null, null); + } + + /// + /// Creates a new instance of <see cref="FunctionToolDefinition"/>.
+ /// + public FunctionToolDefinition() + : base("function") + { + _internalFunction = new InternalFunctionDefinition(); + } + + [SetsRequiredMembers] + internal FunctionToolDefinition(string type, IDictionary<string, BinaryData> serializedAdditionalRawData, InternalFunctionDefinition function) + : base(type, serializedAdditionalRawData) + { + _internalFunction = function; + } +} diff --git a/.dotnet/src/Custom/Assistants/GeneratorStubs.cs b/.dotnet/src/Custom/Assistants/GeneratorStubs.cs new file mode 100644 index 000000000..9efb3dfb5 --- /dev/null +++ b/.dotnet/src/Custom/Assistants/GeneratorStubs.cs @@ -0,0 +1,51 @@ +namespace OpenAI.Assistants; + +/* + * This file stubs and performs minimal customization to generated public types for the OpenAI.Assistants namespace + * that are not otherwise attributed elsewhere. + */ + +[CodeGenModel("AssistantToolsCode")] +public partial class CodeInterpreterToolDefinition : ToolDefinition { } + +[CodeGenModel("MessageObjectStatus")] +public readonly partial struct MessageStatus { } + +[CodeGenModel("MessageObjectIncompleteDetails")] +public partial class MessageFailureDetails { } + +[CodeGenModel("MessageObjectIncompleteDetailsReason")] +public readonly partial struct MessageFailureReason { } + +[CodeGenModel("RunCompletionUsage")] +public partial class RunTokenUsage { } + +[CodeGenModel("RunObjectLastError")] +public partial class RunError { } + +[CodeGenModel("RunObjectLastErrorCode")] +public readonly partial struct RunErrorCode { } + +[CodeGenModel("RunObjectIncompleteDetails")] +public partial class RunIncompleteDetails { } + +[CodeGenModel("RunObjectIncompleteDetailsReason")] +public readonly partial struct RunIncompleteReason { } + +[CodeGenModel("RunStepObjectType")] +public readonly partial struct RunStepType { } + +[CodeGenModel("RunStepObjectStatus")] +public readonly partial struct RunStepStatus { } + +[CodeGenModel("RunStepObjectLastError")] +public partial class RunStepError { } + +[CodeGenModel("RunStepObjectLastErrorCode")]
+public readonly partial struct RunStepErrorCode { } + +[CodeGenModel("RunStepCompletionUsage")] +public partial class RunStepTokenUsage { } + +[CodeGenModel("RunStepDetailsToolCallsCodeObjectCodeInterpreterOutputsObject")] +public partial class RunStepCodeInterpreterOutput { } diff --git a/.dotnet/src/Custom/Assistants/Internal/GeneratorStubs.Internal.cs b/.dotnet/src/Custom/Assistants/Internal/GeneratorStubs.Internal.cs new file mode 100644 index 000000000..08f753db0 --- /dev/null +++ b/.dotnet/src/Custom/Assistants/Internal/GeneratorStubs.Internal.cs @@ -0,0 +1,363 @@ +namespace OpenAI.Assistants; + +/* + * This file stubs and performs minimal customization to generated internal types for the OpenAI.Assistants namespace. + */ + +[CodeGenModel("SubmitToolOutputsRunRequest")] +internal partial class InternalSubmitToolOutputsRunRequest { } + +[CodeGenModel("CreateAssistantRequestModel")] +internal readonly partial struct InternalCreateAssistantRequestModel { } + +[CodeGenModel("MessageContentTextObjectAnnotation")] +internal partial class InternalMessageContentTextObjectAnnotation { } + +[CodeGenModel("MessageContentTextAnnotationsFileCitationObject")] +internal partial class InternalMessageContentTextAnnotationsFileCitationObject { } + +[CodeGenModel("MessageContentTextAnnotationsFilePathObject")] +internal partial class InternalMessageContentTextAnnotationsFilePathObject { } + +[CodeGenModel("MessageDeltaContentImageFileObjectImageFile")] +internal partial class InternalMessageDeltaContentImageFileObjectImageFile +{ + [CodeGenMember("Detail")] + internal string Detail { get; set; } +} + +[CodeGenModel("MessageDeltaContentImageUrlObjectImageUrl")] +internal partial class InternalMessageDeltaContentImageUrlObjectImageUrl +{ + [CodeGenMember("Detail")] + internal string Detail { get; } +} + +[CodeGenModel("MessageDeltaContentImageFileObject")] +internal partial class InternalMessageDeltaContentImageFileObject { } + 
+[CodeGenModel("MessageDeltaContentImageUrlObject")] +internal partial class InternalMessageDeltaContentImageUrlObject { } + +[CodeGenModel("MessageDeltaObjectDelta")] +internal partial class InternalMessageDeltaObjectDelta +{ + [CodeGenMember("Role")] + internal MessageRole Role { get; } +} + +[CodeGenModel("MessageDeltaContentTextObject")] +internal partial class InternalMessageDeltaContentTextObject { } + +[CodeGenModel("MessageDeltaContentTextObjectText")] +internal partial class InternalMessageDeltaContentTextObjectText { } + +[CodeGenModel("MessageDeltaContentTextAnnotationsFileCitationObject")] +internal partial class InternalMessageDeltaContentTextAnnotationsFileCitationObject { } + +[CodeGenModel("MessageDeltaTextContentAnnotation")] +internal partial class InternalMessageDeltaTextContentAnnotation { } + +[CodeGenModel("MessageDeltaContentTextAnnotationsFileCitationObjectFileCitation")] +internal partial class InternalMessageDeltaContentTextAnnotationsFileCitationObjectFileCitation { } + +[CodeGenModel("RunStepDeltaObject")] +internal partial class InternalRunStepDelta { private readonly object Object; } + +[CodeGenModel("RunStepDeltaObjectDelta")] +internal partial class InternalRunStepDeltaObjectDelta { } + +[CodeGenModel("MessageDeltaContentTextAnnotationsFilePathObject")] +internal partial class InternalMessageDeltaContentTextAnnotationsFilePathObject { } + +[CodeGenModel("MessageDeltaContentTextAnnotationsFilePathObjectFilePath")] +internal partial class InternalMessageDeltaContentTextAnnotationsFilePathObjectFilePath { } + +[CodeGenModel("MessageDeltaContent")] +internal partial class InternalMessageDeltaContent { } + +[CodeGenModel("DeleteAssistantResponse")] +internal partial class InternalDeleteAssistantResponse { } + +[CodeGenModel("DeleteAssistantResponseObject")] +internal readonly partial struct InternalDeleteAssistantResponseObject { } + +[CodeGenModel("DeleteThreadResponse")] +internal partial class InternalDeleteThreadResponse { } + 
+[CodeGenModel("DeleteThreadResponseObject")] +internal readonly partial struct InternalDeleteThreadResponseObject { } + +[CodeGenModel("DeleteMessageResponse")] +internal partial class InternalDeleteMessageResponse { } + +[CodeGenModel("DeleteMessageResponseObject")] +internal readonly partial struct InternalDeleteMessageResponseObject { } + +[CodeGenModel("CreateThreadAndRunRequest")] +internal partial class InternalCreateThreadAndRunRequest +{ + public string Model { get; set; } + public ToolResources ToolResources { get; set; } + public AssistantResponseFormat ResponseFormat { get; set; } + public ToolConstraint ToolChoice { get; set; } +} + +[CodeGenModel("MessageContentImageUrlObjectImageUrl")] +internal partial class InternalMessageContentImageUrlObjectImageUrl +{ + [CodeGenMember("Detail")] + internal string Detail { get; } +} + +[CodeGenModel("MessageContentImageFileObjectImageFile")] +internal partial class InternalMessageContentItemFileObjectImageFile +{ + [CodeGenMember("Detail")] + internal string Detail { get; set; } +} + +[CodeGenModel("MessageContentTextObjectText")] +internal partial class InternalMessageContentTextObjectText { } + +[CodeGenModel("MessageContentRefusalObjectType")] +internal readonly partial struct InternalMessageContentRefusalObjectType { } + +[CodeGenModel("RunStepDetailsMessageCreationObjectMessageCreation")] +internal partial class InternalRunStepDetailsMessageCreationObjectMessageCreation { } + +[CodeGenModel("RunStepDetailsToolCallsFunctionObjectFunction")] +internal partial class InternalRunStepDetailsToolCallsFunctionObjectFunction { } + +[CodeGenModel("RunStepDetailsToolCallsCodeObjectCodeInterpreter")] +internal partial class InternalRunStepDetailsToolCallsCodeObjectCodeInterpreter { } + +[CodeGenModel("RunStepDetailsToolCallsCodeOutputImageObjectImage")] +internal partial class InternalRunStepDetailsToolCallsCodeOutputImageObjectImage { } + +[CodeGenModel("MessageContentTextAnnotationsFileCitationObjectFileCitation")] 
+internal partial class InternalMessageContentTextAnnotationsFileCitationObjectFileCitation { } + +[CodeGenModel("MessageContentTextAnnotationsFilePathObjectFilePath")] +internal partial class InternalMessageContentTextAnnotationsFilePathObjectFilePath { } + +[CodeGenModel("RunObjectRequiredAction")] +internal partial class InternalRunRequiredAction { private readonly object Type; } + +[CodeGenModel("RunObjectRequiredActionSubmitToolOutputs")] +internal partial class InternalRunObjectRequiredActionSubmitToolOutputs { private readonly object Type; } + +[CodeGenModel("RunToolCallObjectFunction")] +internal partial class InternalRunToolCallObjectFunction { } + +[CodeGenModel("ListAssistantsResponse")] +internal partial class InternalListAssistantsResponse : IInternalListResponse<Assistant> { } + +[CodeGenModel("ListAssistantsResponseObject")] +internal readonly partial struct InternalListAssistantsResponseObject {} + +[CodeGenModel("ListThreadsResponse")] +internal partial class InternalListThreadsResponse : IInternalListResponse<AssistantThread> { } + +[CodeGenModel("ListThreadsResponseObject")] +internal readonly partial struct InternalListThreadsResponseObject {} + +[CodeGenModel("ListMessagesResponse")] +internal partial class InternalListMessagesResponse : IInternalListResponse<ThreadMessage> { } + +[CodeGenModel("ListMessagesResponseObject")] +internal readonly partial struct InternalListMessagesResponseObject {} + +[CodeGenModel("ListRunsResponse")] +internal partial class InternalListRunsResponse : IInternalListResponse<ThreadRun> { } + +[CodeGenModel("ListRunsResponseObject")] +internal readonly partial struct InternalListRunsResponseObject {} + +[CodeGenModel("ListRunStepsResponse")] +internal partial class InternalListRunStepsResponse : IInternalListResponse<RunStep> { } + +[CodeGenModel("ListRunStepsResponseObject")] +internal readonly partial struct InternalListRunStepsResponseObject {} + +[CodeGenModel("RunStepDetailsToolCallsFileSearchObject")] +internal partial class InternalRunStepFileSearchToolCallDetails { } +
+[CodeGenModel("TruncationObjectType")] +internal readonly partial struct InternalTruncationObjectType { } + +[CodeGenModel("AssistantsNamedToolChoiceType")] +internal readonly partial struct InternalAssistantsNamedToolChoiceType { } + +[CodeGenModel("RunStepDeltaStepDetailsToolCallsCodeObject")] +internal partial class InternalRunStepDeltaStepDetailsToolCallsCodeObject { } + +[CodeGenModel("RunStepDeltaStepDetailsToolCallsCodeOutputImageObject")] +internal partial class InternalRunStepDeltaStepDetailsToolCallsCodeOutputImageObject { } + +[CodeGenModel("RunStepDeltaStepDetailsToolCallsCodeOutputLogsObject")] +internal partial class InternalRunStepDeltaStepDetailsToolCallsCodeOutputLogsObject +{ + [CodeGenMember("Logs")] + public string InternalLogs { get; set; } +} + +[CodeGenModel("RunStepDeltaStepDetailsMessageCreationObject")] +internal partial class InternalRunStepDeltaStepDetailsMessageCreationObject { } + +[CodeGenModel("RunStepDeltaStepDetailsToolCallsObject")] +internal partial class InternalRunStepDeltaStepDetailsToolCallsObject { } + +[CodeGenModel("RunStepDeltaStepDetailsToolCallsFileSearchObject")] +internal partial class InternalRunStepDeltaStepDetailsToolCallsFileSearchObject { } + +[CodeGenModel("RunStepDeltaStepDetailsToolCallsFunctionObject")] +internal partial class InternalRunStepDeltaStepDetailsToolCallsFunctionObject { } + +[CodeGenModel("RunStepDeltaStepDetailsToolCallsObjectToolCallsObject")] +internal partial class InternalRunStepDeltaStepDetailsToolCallsObjectToolCallsObject { } + +[CodeGenModel("RunStepDeltaStepDetailsMessageCreationObjectMessageCreation")] +internal partial class InternalRunStepDeltaStepDetailsMessageCreationObjectMessageCreation { } + +[CodeGenModel("RunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreter")] +internal partial class InternalRunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreter { } + +[CodeGenModel("RunStepDeltaStepDetailsToolCallsCodeOutputImageObjectImage")] +internal partial class 
InternalRunStepDeltaStepDetailsToolCallsCodeOutputImageObjectImage { } + +[CodeGenModel("RunStepDeltaStepDetails")] +internal partial class InternalRunStepDeltaStepDetails { } + +[CodeGenModel("RunStepDeltaStepDetailsToolCallsFunctionObjectFunction")] +internal partial class InternalRunStepDeltaStepDetailsToolCallsFunctionObjectFunction { } + +[CodeGenModel("AssistantsNamedToolChoiceFunction")] +internal partial class InternalAssistantsNamedToolChoiceFunction { } + +[CodeGenModel("AssistantObjectObject")] +internal readonly partial struct InternalAssistantObjectObject { } + +[CodeGenModel("MessageObjectObject")] +internal readonly partial struct InternalMessageObjectObject { } + +[CodeGenModel("RunObjectObject")] +internal readonly partial struct InternalRunObjectObject { } + +[CodeGenModel("RunStepObjectObject")] +internal readonly partial struct InternalRunStepObjectObject { } + +[CodeGenModel("ThreadObjectObject")] +internal readonly partial struct InternalThreadObjectObject { } + +[CodeGenModel("MessageRequestContentTextObjectType")] +internal readonly partial struct InternalMessageRequestContentTextObjectType { } + +[CodeGenModel("MessageContentImageUrlObjectImageUrlDetail")] +internal readonly partial struct InternalMessageContentImageUrlObjectImageUrlDetail { } + +[CodeGenModel("MessageContentImageFileObjectImageFileDetail")] +internal readonly partial struct InternalMessageContentImageFileObjectImageFileDetail { } + +[CodeGenModel("MessageDeltaContentImageFileObjectImageFileDetail")] +internal readonly partial struct InternalMessageDeltaContentImageFileObjectImageFileDetail { } + +[CodeGenModel("MessageDeltaContentImageUrlObjectImageUrlDetail")] +internal readonly partial struct InternalMessageDeltaContentImageUrlObjectImageUrlDetail { } + +[CodeGenModel("MessageDeltaObject")] +internal partial class InternalMessageDeltaObject { } + +[CodeGenModel("MessageDeltaObjectDeltaRole")] +internal readonly partial struct InternalMessageDeltaObjectDeltaRole { } + 
+[CodeGenModel("MessageDeltaObjectObject")] +internal readonly partial struct InternalMessageDeltaObjectObject { } + +[CodeGenModel("MessageObjectAttachment")] +internal partial class InternalMessageObjectAttachment { } + +[CodeGenModel("MessageContentImageFileObjectType")] +internal readonly partial struct InternalMessageContentImageFileObjectType { } + +[CodeGenModel("MessageContentImageUrlObjectType")] +internal readonly partial struct InternalMessageContentImageUrlObjectType { } + +[CodeGenModel("MessageContentTextObjectType")] +internal readonly partial struct InternalMessageContentTextObjectType { } + +[CodeGenModel("RunObjectRequiredActionType")] +internal readonly partial struct InternalRunObjectRequiredActionType { } + +[CodeGenModel("RunStepDeltaObjectObject")] +internal readonly partial struct InternalRunStepDeltaObjectObject { } + +[CodeGenModel("RunToolCallObjectType")] +internal readonly partial struct InternalRunToolCallObjectType { } + +[CodeGenModel("MessageObjectRole")] +internal readonly partial struct InternalMessageObjectRole { } + +[CodeGenModel("CreateRunRequestModel")] +internal readonly partial struct InternalCreateRunRequestModel { } + +[CodeGenModel("CreateAssistantRequestToolResources")] +internal partial class InternalCreateAssistantRequestToolResources { } + +[CodeGenModel("CreateAssistantRequestToolResourcesCodeInterpreter")] +internal partial class InternalCreateAssistantRequestToolResourcesCodeInterpreter { } + +[CodeGenModel("CreateThreadAndRunRequestModel")] +internal readonly partial struct InternalCreateThreadAndRunRequestModel { } + +[CodeGenModel("CreateThreadAndRunRequestToolChoice")] +internal readonly partial struct InternalCreateThreadAndRunRequestToolChoice { } + +[CodeGenModel("CreateThreadAndRunRequestToolResources")] +internal partial class InternalCreateThreadAndRunRequestToolResources { } + +[CodeGenModel("CreateThreadAndRunRequestToolResourcesCodeInterpreter")] +internal partial class 
InternalCreateThreadAndRunRequestToolResourcesCodeInterpreter { } + +[CodeGenModel("CreateThreadRequestToolResources")] +internal partial class InternalCreateThreadRequestToolResources { } + +[CodeGenModel("CreateThreadRequestToolResourcesCodeInterpreter")] +internal partial class InternalCreateThreadRequestToolResourcesCodeInterpreter { } + +[CodeGenModel("CreateThreadRequestToolResourcesFileSearchBase")] +internal partial class InternalCreateThreadRequestToolResourcesFileSearchBase { } + +[CodeGenModel("ModifyAssistantRequestToolResources")] +internal partial class InternalModifyAssistantRequestToolResources { } + +[CodeGenModel("ModifyAssistantRequestToolResourcesCodeInterpreter")] +internal partial class InternalModifyAssistantRequestToolResourcesCodeInterpreter { } + +[CodeGenModel("ModifyThreadRequestToolResources")] +internal partial class InternalModifyThreadRequestToolResources { } + +[CodeGenModel("ModifyThreadRequestToolResourcesCodeInterpreter")] +internal partial class InternalModifyThreadRequestToolResourcesCodeInterpreter { } + +[CodeGenModel("ThreadObjectToolResources")] +internal partial class InternalThreadObjectToolResources { } + +[CodeGenModel("ThreadObjectToolResourcesCodeInterpreter")] +internal partial class InternalThreadObjectToolResourcesCodeInterpreter { } + +[CodeGenModel("ThreadObjectToolResourcesFileSearch")] +internal partial class InternalThreadObjectToolResourcesFileSearch { } + +[CodeGenModel("AssistantToolsFileSearchTypeOnly")] +internal partial class InternalAssistantToolsFileSearchTypeOnly { } + +[CodeGenModel("AssistantToolsFileSearchTypeOnlyType")] +internal readonly partial struct InternalAssistantToolsFileSearchTypeOnlyType { } + +[CodeGenModel("AssistantResponseFormatText")] internal partial class InternalAssistantResponseFormatText { } +[CodeGenModel("AssistantResponseFormatJsonObject")] internal partial class InternalAssistantResponseFormatJsonObject { } +[CodeGenModel("AssistantResponseFormatJsonSchema")] internal 
partial class InternalAssistantResponseFormatJsonSchema { }
+[CodeGenModel("UnknownAssistantResponseFormat")] internal partial class InternalUnknownAssistantResponseFormat { }
+[CodeGenModel("MessageDeltaContentRefusalObject")] internal partial class InternalMessageDeltaContentRefusalObject { }
+[CodeGenModel("ToolResourcesFileSearchIdsOnly")] internal partial class InternalToolResourcesFileSearchIdsOnly { }
diff --git a/.dotnet/src/Custom/Assistants/Internal/InternalAssistantMessageClient.Protocol.cs b/.dotnet/src/Custom/Assistants/Internal/InternalAssistantMessageClient.Protocol.cs
new file mode 100644
index 000000000..fb54903bc
--- /dev/null
+++ b/.dotnet/src/Custom/Assistants/Internal/InternalAssistantMessageClient.Protocol.cs
@@ -0,0 +1,165 @@
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Threading.Tasks;
+
+namespace OpenAI.Assistants;
+
+internal partial class InternalAssistantMessageClient
+{
+    /// <summary>
+    /// [Protocol Method] Create a message.
+    /// </summary>
+    /// <param name="threadId"> The ID of the [thread](/docs/api-reference/threads) to create a message for. </param>
+    /// <param name="content"> The content to send as the body of the request. </param>
+    /// <param name="options"> The request options, which can override default behaviors of the client pipeline on a per-call basis. </param>
+    /// <exception cref="ArgumentNullException"> <paramref name="threadId"/> or <paramref name="content"/> is null. </exception>
+    /// <exception cref="ArgumentException"> <paramref name="threadId"/> is an empty string, and was expected to be non-empty. </exception>
+    /// <exception cref="ClientResultException"> Service returned a non-success status code. </exception>
+    /// <returns> The response returned from the service. </returns>
+    public virtual async Task<ClientResult> CreateMessageAsync(string threadId, BinaryContent content, RequestOptions options = null)
+    {
+        Argument.AssertNotNullOrEmpty(threadId, nameof(threadId));
+        Argument.AssertNotNull(content, nameof(content));
+
+        using PipelineMessage message = CreateCreateMessageRequest(threadId, content, options);
+        return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false));
+    }
+
+    /// <summary>
+    /// [Protocol Method] Create a message.
+    /// </summary>
+    /// <param name="threadId"> The ID of the [thread](/docs/api-reference/threads) to create a message for. </param>
+    /// <param name="content"> The content to send as the body of the request. </param>
+    /// <param name="options"> The request options, which can override default behaviors of the client pipeline on a per-call basis. </param>
+    /// <exception cref="ArgumentNullException"> <paramref name="threadId"/> or <paramref name="content"/> is null. </exception>
+    /// <exception cref="ArgumentException"> <paramref name="threadId"/> is an empty string, and was expected to be non-empty. </exception>
+    /// <exception cref="ClientResultException"> Service returned a non-success status code. </exception>
+    /// <returns> The response returned from the service. </returns>
+    public virtual ClientResult CreateMessage(string threadId, BinaryContent content, RequestOptions options = null)
+    {
+        Argument.AssertNotNullOrEmpty(threadId, nameof(threadId));
+        Argument.AssertNotNull(content, nameof(content));
+
+        using PipelineMessage message = CreateCreateMessageRequest(threadId, content, options);
+        return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options));
+    }
+
+    /// <summary>
+    /// [Protocol Method] Retrieve a message.
+    /// </summary>
+    /// <param name="threadId"> The ID of the [thread](/docs/api-reference/threads) to which this message belongs. </param>
+    /// <param name="messageId"> The ID of the message to retrieve. </param>
+    /// <param name="options"> The request options, which can override default behaviors of the client pipeline on a per-call basis. </param>
+    /// <exception cref="ArgumentNullException"> <paramref name="threadId"/> or <paramref name="messageId"/> is null. </exception>
+    /// <exception cref="ArgumentException"> <paramref name="threadId"/> or <paramref name="messageId"/> is an empty string, and was expected to be non-empty. </exception>
+    /// <exception cref="ClientResultException"> Service returned a non-success status code. </exception>
+    /// <returns> The response returned from the service. </returns>
+    public virtual async Task<ClientResult> GetMessageAsync(string threadId, string messageId, RequestOptions options)
+    {
+        Argument.AssertNotNullOrEmpty(threadId, nameof(threadId));
+        Argument.AssertNotNullOrEmpty(messageId, nameof(messageId));
+
+        using PipelineMessage message = CreateGetMessageRequest(threadId, messageId, options);
+        return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false));
+    }
+
+    /// <summary>
+    /// [Protocol Method] Retrieve a message.
+    /// </summary>
+    /// <param name="threadId"> The ID of the [thread](/docs/api-reference/threads) to which this message belongs. </param>
+    /// <param name="messageId"> The ID of the message to retrieve. </param>
+    /// <param name="options"> The request options, which can override default behaviors of the client pipeline on a per-call basis. </param>
+    /// <exception cref="ArgumentNullException"> <paramref name="threadId"/> or <paramref name="messageId"/> is null. </exception>
+    /// <exception cref="ArgumentException"> <paramref name="threadId"/> or <paramref name="messageId"/> is an empty string, and was expected to be non-empty. </exception>
+    /// <exception cref="ClientResultException"> Service returned a non-success status code. </exception>
+    /// <returns> The response returned from the service. </returns>
+    public virtual ClientResult GetMessage(string threadId, string messageId, RequestOptions options)
+    {
+        Argument.AssertNotNullOrEmpty(threadId, nameof(threadId));
+        Argument.AssertNotNullOrEmpty(messageId, nameof(messageId));
+
+        using PipelineMessage message = CreateGetMessageRequest(threadId, messageId, options);
+        return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options));
+    }
+
+    /// <summary>
+    /// [Protocol Method] Modifies a message.
+    /// </summary>
+    /// <param name="threadId"> The ID of the thread to which this message belongs. </param>
+    /// <param name="messageId"> The ID of the message to modify. </param>
+    /// <param name="content"> The content to send as the body of the request. </param>
+    /// <param name="options"> The request options, which can override default behaviors of the client pipeline on a per-call basis. </param>
+    /// <exception cref="ArgumentNullException"> <paramref name="threadId"/>, <paramref name="messageId"/> or <paramref name="content"/> is null. </exception>
+    /// <exception cref="ArgumentException"> <paramref name="threadId"/> or <paramref name="messageId"/> is an empty string, and was expected to be non-empty. </exception>
+    /// <exception cref="ClientResultException"> Service returned a non-success status code. </exception>
+    /// <returns> The response returned from the service. </returns>
+    public virtual async Task<ClientResult> ModifyMessageAsync(string threadId, string messageId, BinaryContent content, RequestOptions options = null)
+    {
+        Argument.AssertNotNullOrEmpty(threadId, nameof(threadId));
+        Argument.AssertNotNullOrEmpty(messageId, nameof(messageId));
+        Argument.AssertNotNull(content, nameof(content));
+
+        using PipelineMessage message = CreateModifyMessageRequest(threadId, messageId, content, options);
+        return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false));
+    }
+
+    /// <summary>
+    /// [Protocol Method] Modifies a message.
+    /// </summary>
+    /// <param name="threadId"> The ID of the thread to which this message belongs. </param>
+    /// <param name="messageId"> The ID of the message to modify. </param>
+    /// <param name="content"> The content to send as the body of the request. </param>
+    /// <param name="options"> The request options, which can override default behaviors of the client pipeline on a per-call basis. </param>
+    /// <exception cref="ArgumentNullException"> <paramref name="threadId"/>, <paramref name="messageId"/> or <paramref name="content"/> is null. </exception>
+    /// <exception cref="ArgumentException"> <paramref name="threadId"/> or <paramref name="messageId"/> is an empty string, and was expected to be non-empty. </exception>
+    /// <exception cref="ClientResultException"> Service returned a non-success status code. </exception>
+    /// <returns> The response returned from the service. </returns>
+    public virtual ClientResult ModifyMessage(string threadId, string messageId, BinaryContent content, RequestOptions options = null)
+    {
+        Argument.AssertNotNullOrEmpty(threadId, nameof(threadId));
+        Argument.AssertNotNullOrEmpty(messageId, nameof(messageId));
+        Argument.AssertNotNull(content, nameof(content));
+
+        using PipelineMessage message = CreateModifyMessageRequest(threadId, messageId, content, options);
+        return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options));
+    }
+
+    /// <summary>
+    /// [Protocol Method] Deletes a message.
+    /// </summary>
+    /// <param name="threadId"> The ID of the thread to which this message belongs. </param>
+    /// <param name="messageId"> The ID of the message to delete. </param>
+    /// <param name="options"> The request options, which can override default behaviors of the client pipeline on a per-call basis. </param>
+    /// <exception cref="ArgumentNullException"> <paramref name="threadId"/> or <paramref name="messageId"/> is null. </exception>
+    /// <exception cref="ArgumentException"> <paramref name="threadId"/> or <paramref name="messageId"/> is an empty string, and was expected to be non-empty. </exception>
+    /// <exception cref="ClientResultException"> Service returned a non-success status code. </exception>
+    /// <returns> The response returned from the service. </returns>
+    public virtual async Task<ClientResult> DeleteMessageAsync(string threadId, string messageId, RequestOptions options)
+    {
+        Argument.AssertNotNullOrEmpty(threadId, nameof(threadId));
+        Argument.AssertNotNullOrEmpty(messageId, nameof(messageId));
+
+        using PipelineMessage message = CreateDeleteMessageRequest(threadId, messageId, options);
+        return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false));
+    }
+
+    /// <summary>
+    /// [Protocol Method] Deletes a message.
+    /// </summary>
+    /// <param name="threadId"> The ID of the thread to which this message belongs. </param>
+    /// <param name="messageId"> The ID of the message to delete. </param>
+    /// <param name="options"> The request options, which can override default behaviors of the client pipeline on a per-call basis. </param>
+    /// <exception cref="ArgumentNullException"> <paramref name="threadId"/> or <paramref name="messageId"/> is null. </exception>
+    /// <exception cref="ArgumentException"> <paramref name="threadId"/> or <paramref name="messageId"/> is an empty string, and was expected to be non-empty. </exception>
+    /// <exception cref="ClientResultException"> Service returned a non-success status code. </exception>
+    /// <returns> The response returned from the service. </returns>
+    public virtual ClientResult DeleteMessage(string threadId, string messageId, RequestOptions options)
+    {
+        Argument.AssertNotNullOrEmpty(threadId, nameof(threadId));
+        Argument.AssertNotNullOrEmpty(messageId, nameof(messageId));
+
+        using PipelineMessage message = CreateDeleteMessageRequest(threadId, messageId, options);
+        return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options));
+    }
+}
diff --git a/.dotnet/src/Custom/Assistants/Internal/InternalAssistantMessageClient.cs b/.dotnet/src/Custom/Assistants/Internal/InternalAssistantMessageClient.cs
new file mode 100644
index 000000000..625f36158
--- /dev/null
+++ b/.dotnet/src/Custom/Assistants/Internal/InternalAssistantMessageClient.cs
@@ -0,0 +1,63 @@
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+
+namespace OpenAI.Assistants;
+
+[CodeGenClient("Messages")]
+[CodeGenSuppress("InternalAssistantMessageClient", typeof(ClientPipeline), typeof(ApiKeyCredential), typeof(Uri))]
+[CodeGenSuppress("CreateMessageAsync", typeof(string), typeof(MessageCreationOptions))]
+[CodeGenSuppress("CreateMessage", typeof(string), typeof(MessageCreationOptions))]
+[CodeGenSuppress("GetMessagesAsync", typeof(string), typeof(int?), typeof(ListOrder?), typeof(string), typeof(string))]
+[CodeGenSuppress("GetMessages", typeof(string), typeof(int?), typeof(ListOrder?), typeof(string), typeof(string))]
+[CodeGenSuppress("GetMessageAsync", typeof(string), typeof(string))]
+[CodeGenSuppress("GetMessage", typeof(string), typeof(string))]
+[CodeGenSuppress("ModifyMessageAsync", typeof(string), typeof(string), typeof(MessageModificationOptions))]
+[CodeGenSuppress("ModifyMessage", typeof(string), typeof(string), typeof(MessageModificationOptions))]
+[CodeGenSuppress("DeleteMessageAsync", typeof(string), typeof(string))]
+[CodeGenSuppress("DeleteMessage", typeof(string), typeof(string))]
+internal partial class InternalAssistantMessageClient
+{
+    // CUSTOM:
+    // - Used a custom pipeline.
+    // - Demoted the endpoint parameter to be a property in the options class.
+    /// <summary> Initializes a new instance of <see cref="InternalAssistantMessageClient"/>. </summary>
+    /// <param name="credential"> The API key to authenticate with the service. </param>
+    /// <exception cref="ArgumentNullException"> <paramref name="credential"/> is null. </exception>
+    public InternalAssistantMessageClient(ApiKeyCredential credential) : this(credential, new OpenAIClientOptions())
+    {
+    }
+
+    // CUSTOM:
+    // - Used a custom pipeline.
+    // - Demoted the endpoint parameter to be a property in the options class.
+    /// <summary> Initializes a new instance of <see cref="InternalAssistantMessageClient"/>. </summary>
+    /// <param name="credential"> The API key to authenticate with the service. </param>
+    /// <param name="options"> The options to configure the client. </param>
+    /// <exception cref="ArgumentNullException"> <paramref name="credential"/> is null. </exception>
+    public InternalAssistantMessageClient(ApiKeyCredential credential, OpenAIClientOptions options)
+    {
+        Argument.AssertNotNull(credential, nameof(credential));
+        options ??= new OpenAIClientOptions();
+
+        _pipeline = OpenAIClient.CreatePipeline(credential, options);
+        _endpoint = OpenAIClient.GetEndpoint(options);
+    }
+
+    // CUSTOM:
+    // - Used a custom pipeline.
+    // - Demoted the endpoint parameter to be a property in the options class.
+    // - Made protected.
+    /// <summary> Initializes a new instance of <see cref="InternalAssistantMessageClient"/>. </summary>
+    /// <param name="pipeline"> The HTTP pipeline to send and receive REST requests and responses. </param>
+    /// <param name="options"> The options to configure the client. </param>
+    /// <exception cref="ArgumentNullException"> <paramref name="pipeline"/> is null. </exception>
+    protected internal InternalAssistantMessageClient(ClientPipeline pipeline, OpenAIClientOptions options)
+    {
+        Argument.AssertNotNull(pipeline, nameof(pipeline));
+        options ??= new OpenAIClientOptions();
+
+        _pipeline = pipeline;
+        _endpoint = OpenAIClient.GetEndpoint(options);
+    }
+}
diff --git a/.dotnet/src/Custom/Assistants/Internal/InternalAssistantResponseFormatPlainTextNoObject.Serialization.cs b/.dotnet/src/Custom/Assistants/Internal/InternalAssistantResponseFormatPlainTextNoObject.Serialization.cs
new file mode 100644
index 000000000..1d962611c
--- /dev/null
+++ b/.dotnet/src/Custom/Assistants/Internal/InternalAssistantResponseFormatPlainTextNoObject.Serialization.cs
@@ -0,0 +1,36 @@
+using System;
+using System.ClientModel.Primitives;
+using System.Text.Json;
+
+namespace OpenAI.Assistants;
+
+internal partial class InternalAssistantResponseFormatPlainTextNoObject : IJsonModel<InternalAssistantResponseFormatPlainTextNoObject>
+{
+    internal static void SerializeInternalAssistantResponseFormatPlainTextNoObject(InternalAssistantResponseFormatPlainTextNoObject instance, Utf8JsonWriter writer, ModelReaderWriterOptions options = null)
+    {
+        writer.WriteStringValue(instance.Value);
+    }
+
+    internal static InternalAssistantResponseFormatPlainTextNoObject DeserializeInternalAssistantResponseFormatPlainTextNoObject(JsonElement element, ModelReaderWriterOptions options = null)
+    {
+        if (element.ValueKind == JsonValueKind.String)
+        {
+            return new(element.GetString());
+        }
+        return null;
+    }
+
+    void IJsonModel<InternalAssistantResponseFormatPlainTextNoObject>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        => CustomSerializationHelpers.SerializeInstance(this, SerializeInternalAssistantResponseFormatPlainTextNoObject, writer, options);
+
+    InternalAssistantResponseFormatPlainTextNoObject IJsonModel<InternalAssistantResponseFormatPlainTextNoObject>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options)
+        => CustomSerializationHelpers.DeserializeNewInstance(this, DeserializeInternalAssistantResponseFormatPlainTextNoObject, ref reader, options);
+
+    BinaryData IPersistableModel<InternalAssistantResponseFormatPlainTextNoObject>.Write(ModelReaderWriterOptions options)
+        => CustomSerializationHelpers.SerializeInstance(this, options);
+
+    InternalAssistantResponseFormatPlainTextNoObject IPersistableModel<InternalAssistantResponseFormatPlainTextNoObject>.Create(BinaryData data, ModelReaderWriterOptions options)
+        => CustomSerializationHelpers.DeserializeNewInstance(this, DeserializeInternalAssistantResponseFormatPlainTextNoObject, data, options);
+
+    string IPersistableModel<InternalAssistantResponseFormatPlainTextNoObject>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J";
+}
diff --git a/.dotnet/src/Custom/Assistants/Internal/InternalAssistantResponseFormatPlainTextNoObject.cs b/.dotnet/src/Custom/Assistants/Internal/InternalAssistantResponseFormatPlainTextNoObject.cs
new file mode 100644
index 000000000..6c3e2ee7c
--- /dev/null
+++ b/.dotnet/src/Custom/Assistants/Internal/InternalAssistantResponseFormatPlainTextNoObject.cs
@@ -0,0 +1,11 @@
+namespace OpenAI.Assistants;
+
+internal partial class InternalAssistantResponseFormatPlainTextNoObject : AssistantResponseFormat
+{
+    public string Value { get; set; }
+
+    public InternalAssistantResponseFormatPlainTextNoObject(string value)
+    {
+        Value = value;
+    }
+}
\ No newline at end of file
diff --git a/.dotnet/src/Custom/Assistants/Internal/InternalAssistantRunClient.Protocol.cs b/.dotnet/src/Custom/Assistants/Internal/InternalAssistantRunClient.Protocol.cs
new file mode 100644
index 000000000..4ffef5d1b
--- /dev/null
+++ b/.dotnet/src/Custom/Assistants/Internal/InternalAssistantRunClient.Protocol.cs
@@ -0,0 +1,351 @@
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Threading.Tasks;
+
+namespace OpenAI.Assistants;
+
+internal partial class InternalAssistantRunClient
+{
+    /// <summary>
+    /// [Protocol Method] Create a thread and run it in one request.
+    /// </summary>
+    /// <param name="content"> The content to send as the body of the request. </param>
+    /// <param name="options"> The request options, which can override default behaviors of the client pipeline on a per-call basis. </param>
+    /// <exception cref="ArgumentNullException"> <paramref name="content"/> is null. </exception>
+    /// <exception cref="ClientResultException"> Service returned a non-success status code. </exception>
+ /// The response returned from the service. + public virtual async Task CreateThreadAndRunAsync(BinaryContent content, RequestOptions options = null) + { + Argument.AssertNotNull(content, nameof(content)); + + PipelineMessage message = null; + try + { + message = CreateCreateThreadAndRunRequest(content, options); + return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + finally + { + if (options?.BufferResponse != false) + { + message.Dispose(); + } + } + } + + /// + /// [Protocol Method] Create a thread and run it in one request. + /// + /// The content to send as the body of the request. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// is null. + /// Service returned a non-success status code. + /// The response returned from the service. + public virtual ClientResult CreateThreadAndRun(BinaryContent content, RequestOptions options = null) + { + Argument.AssertNotNull(content, nameof(content)); + + PipelineMessage message = null; + try + { + message = CreateCreateThreadAndRunRequest(content, options); + return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options)); + } + finally + { + if (options?.BufferResponse != false) + { + message.Dispose(); + } + } + } + + /// + /// [Protocol Method] Create a run. + /// + /// The ID of the thread to run. + /// The content to send as the body of the request. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// or is null. + /// is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// The response returned from the service. 
+ public virtual async Task CreateRunAsync(string threadId, BinaryContent content, RequestOptions options = null) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + Argument.AssertNotNull(content, nameof(content)); + + PipelineMessage message = null; + try + { + message = CreateCreateRunRequest(threadId, content, options); + return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + finally + { + if (options?.BufferResponse != false) + { + message.Dispose(); + } + } + } + + /// + /// [Protocol Method] Create a run. + /// + /// The ID of the thread to run. + /// The content to send as the body of the request. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// or is null. + /// is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// The response returned from the service. + public virtual ClientResult CreateRun(string threadId, BinaryContent content, RequestOptions options = null) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + Argument.AssertNotNull(content, nameof(content)); + + PipelineMessage message = null; + try + { + message = CreateCreateRunRequest(threadId, content, options); + return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options)); + } + finally + { + if (options?.BufferResponse != false) + { + message.Dispose(); + } + } + } + + /// + /// [Protocol Method] Retrieves a run. + /// + /// The ID of the [thread](/docs/api-reference/threads) that was run. + /// The ID of the run to retrieve. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// or is null. + /// or is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// The response returned from the service. 
+ public virtual async Task GetRunAsync(string threadId, string runId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + Argument.AssertNotNullOrEmpty(runId, nameof(runId)); + + using PipelineMessage message = CreateGetRunRequest(threadId, runId, options); + return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + /// + /// [Protocol Method] Retrieves a run. + /// + /// The ID of the [thread](/docs/api-reference/threads) that was run. + /// The ID of the run to retrieve. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// or is null. + /// or is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// The response returned from the service. + public virtual ClientResult GetRun(string threadId, string runId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + Argument.AssertNotNullOrEmpty(runId, nameof(runId)); + + using PipelineMessage message = CreateGetRunRequest(threadId, runId, options); + return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options)); + } + + /// + /// [Protocol Method] Modifies a run. + /// + /// The ID of the [thread](/docs/api-reference/threads) that was run. + /// The ID of the run to modify. + /// The content to send as the body of the request. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// , or is null. + /// or is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// The response returned from the service. 
+ public virtual async Task ModifyRunAsync(string threadId, string runId, BinaryContent content, RequestOptions options = null) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + Argument.AssertNotNullOrEmpty(runId, nameof(runId)); + Argument.AssertNotNull(content, nameof(content)); + + using PipelineMessage message = CreateModifyRunRequest(threadId, runId, content, options); + return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + /// + /// [Protocol Method] Modifies a run. + /// + /// The ID of the [thread](/docs/api-reference/threads) that was run. + /// The ID of the run to modify. + /// The content to send as the body of the request. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// , or is null. + /// or is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// The response returned from the service. + public virtual ClientResult ModifyRun(string threadId, string runId, BinaryContent content, RequestOptions options = null) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + Argument.AssertNotNullOrEmpty(runId, nameof(runId)); + Argument.AssertNotNull(content, nameof(content)); + + using PipelineMessage message = CreateModifyRunRequest(threadId, runId, content, options); + return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options)); + } + + /// + /// [Protocol Method] Cancels a run that is `in_progress`. + /// + /// The ID of the thread to which this run belongs. + /// The ID of the run to cancel. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// or is null. + /// or is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// The response returned from the service. 
+ public virtual async Task CancelRunAsync(string threadId, string runId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + Argument.AssertNotNullOrEmpty(runId, nameof(runId)); + + using PipelineMessage message = CreateCancelRunRequest(threadId, runId, options); + return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + /// + /// [Protocol Method] Cancels a run that is `in_progress`. + /// + /// The ID of the thread to which this run belongs. + /// The ID of the run to cancel. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// or is null. + /// or is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// The response returned from the service. + public virtual ClientResult CancelRun(string threadId, string runId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + Argument.AssertNotNullOrEmpty(runId, nameof(runId)); + + using PipelineMessage message = CreateCancelRunRequest(threadId, runId, options); + return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options)); + } + + /// + /// [Protocol Method] When a run has the `status: "requires_action"` and `required_action.type` is + /// `submit_tool_outputs`, this endpoint can be used to submit the outputs from the tool calls once + /// they're all completed. All outputs must be submitted in a single request. + /// + /// The ID of the [thread](/docs/api-reference/threads) to which this run belongs. + /// The ID of the run that requires the tool output submission. + /// The content to send as the body of the request. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// , or is null. + /// or is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. 
+ /// The response returned from the service. + public virtual async Task SubmitToolOutputsToRunAsync(string threadId, string runId, BinaryContent content, RequestOptions options = null) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + Argument.AssertNotNullOrEmpty(runId, nameof(runId)); + Argument.AssertNotNull(content, nameof(content)); + + PipelineMessage message = null; + try + { + message = CreateSubmitToolOutputsToRunRequest(threadId, runId, content, options); + return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + finally + { + if (options?.BufferResponse != false) + { + message.Dispose(); + } + } + } + + /// + /// [Protocol Method] When a run has the `status: "requires_action"` and `required_action.type` is + /// `submit_tool_outputs`, this endpoint can be used to submit the outputs from the tool calls once + /// they're all completed. All outputs must be submitted in a single request. + /// + /// The ID of the [thread](/docs/api-reference/threads) to which this run belongs. + /// The ID of the run that requires the tool output submission. + /// The content to send as the body of the request. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// , or is null. + /// or is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// The response returned from the service. 
+ public virtual ClientResult SubmitToolOutputsToRun(string threadId, string runId, BinaryContent content, RequestOptions options = null) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + Argument.AssertNotNullOrEmpty(runId, nameof(runId)); + Argument.AssertNotNull(content, nameof(content)); + + PipelineMessage message = null; + try + { + message = CreateSubmitToolOutputsToRunRequest(threadId, runId, content, options); + return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options)); + } + finally + { + if (options?.BufferResponse != false) + { + message.Dispose(); + } + } + } + + /// + /// [Protocol Method] Retrieves a run step. + /// + /// The ID of the thread to which the run and run step belongs. + /// The ID of the run to which the run step belongs. + /// The ID of the run step to retrieve. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// , or is null. + /// , or is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// The response returned from the service. + public virtual async Task GetRunStepAsync(string threadId, string runId, string stepId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + Argument.AssertNotNullOrEmpty(runId, nameof(runId)); + Argument.AssertNotNullOrEmpty(stepId, nameof(stepId)); + + using PipelineMessage message = CreateGetRunStepRequest(threadId, runId, stepId, options); + return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + /// + /// [Protocol Method] Retrieves a run step. + /// + /// The ID of the thread to which the run and run step belongs. + /// The ID of the run to which the run step belongs. + /// The ID of the run step to retrieve. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// , or is null. 
+    /// <exception cref="ArgumentException"> <paramref name="threadId"/>, <paramref name="runId"/> or <paramref name="stepId"/> is an empty string, and was expected to be non-empty. </exception>
+    /// <exception cref="ClientResultException"> Service returned a non-success status code. </exception>
+    /// <returns> The response returned from the service. </returns>
+    public virtual ClientResult GetRunStep(string threadId, string runId, string stepId, RequestOptions options)
+    {
+        Argument.AssertNotNullOrEmpty(threadId, nameof(threadId));
+        Argument.AssertNotNullOrEmpty(runId, nameof(runId));
+        Argument.AssertNotNullOrEmpty(stepId, nameof(stepId));
+
+        using PipelineMessage message = CreateGetRunStepRequest(threadId, runId, stepId, options);
+        return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options));
+    }
+}
diff --git a/.dotnet/src/Custom/Assistants/Internal/InternalAssistantRunClient.cs b/.dotnet/src/Custom/Assistants/Internal/InternalAssistantRunClient.cs
new file mode 100644
index 000000000..8b6474cfe
--- /dev/null
+++ b/.dotnet/src/Custom/Assistants/Internal/InternalAssistantRunClient.cs
@@ -0,0 +1,71 @@
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+
+namespace OpenAI.Assistants;
+
+[CodeGenClient("Runs")]
+[CodeGenSuppress("InternalAssistantRunClient", typeof(ClientPipeline), typeof(ApiKeyCredential), typeof(Uri))]
+[CodeGenSuppress("CreateThreadAndRunAsync", typeof(InternalCreateThreadAndRunRequest))]
+[CodeGenSuppress("CreateThreadAndRun", typeof(InternalCreateThreadAndRunRequest))]
+[CodeGenSuppress("CreateRunAsync", typeof(string), typeof(RunCreationOptions))]
+[CodeGenSuppress("CreateRun", typeof(string), typeof(RunCreationOptions))]
+[CodeGenSuppress("GetRunsAsync", typeof(string), typeof(int?), typeof(ListOrder?), typeof(string), typeof(string))]
+[CodeGenSuppress("GetRuns", typeof(string), typeof(int?), typeof(ListOrder?), typeof(string), typeof(string))]
+[CodeGenSuppress("GetRunAsync", typeof(string), typeof(string))]
+[CodeGenSuppress("GetRun", typeof(string), typeof(string))]
+[CodeGenSuppress("ModifyRunAsync", typeof(string), typeof(string), typeof(RunModificationOptions))]
+[CodeGenSuppress("ModifyRun", typeof(string), typeof(string), typeof(RunModificationOptions))]
+[CodeGenSuppress("CancelRunAsync", typeof(string), typeof(string))]
+[CodeGenSuppress("CancelRun", typeof(string), typeof(string))]
+[CodeGenSuppress("SubmitToolOutputsToRunAsync", typeof(string), typeof(string), typeof(InternalSubmitToolOutputsRunRequest))]
+[CodeGenSuppress("SubmitToolOutputsToRun", typeof(string), typeof(string), typeof(InternalSubmitToolOutputsRunRequest))]
+[CodeGenSuppress("GetRunStepsAsync", typeof(string), typeof(string), typeof(int?), typeof(ListOrder?), typeof(string), typeof(string))]
+[CodeGenSuppress("GetRunSteps", typeof(string), typeof(string), typeof(int?), typeof(ListOrder?), typeof(string), typeof(string))]
+[CodeGenSuppress("GetRunStepAsync", typeof(string), typeof(string), typeof(string))]
+[CodeGenSuppress("GetRunStep", typeof(string), typeof(string), typeof(string))]
+internal partial class InternalAssistantRunClient
+{
+    // CUSTOM:
+    // - Used a custom pipeline.
+    // - Demoted the endpoint parameter to be a property in the options class.
+    /// <summary> Initializes a new instance of <see cref="InternalAssistantRunClient"/>. </summary>
+    /// <param name="credential"> The API key to authenticate with the service. </param>
+    /// <exception cref="ArgumentNullException"> <paramref name="credential"/> is null. </exception>
+    public InternalAssistantRunClient(ApiKeyCredential credential) : this(credential, new OpenAIClientOptions())
+    {
+    }
+
+    // CUSTOM:
+    // - Used a custom pipeline.
+    // - Demoted the endpoint parameter to be a property in the options class.
+    /// <summary> Initializes a new instance of <see cref="InternalAssistantRunClient"/>. </summary>
+    /// <param name="credential"> The API key to authenticate with the service. </param>
+    /// <param name="options"> The options to configure the client. </param>
+    /// <exception cref="ArgumentNullException"> <paramref name="credential"/> is null. </exception>
+    public InternalAssistantRunClient(ApiKeyCredential credential, OpenAIClientOptions options)
+    {
+        Argument.AssertNotNull(credential, nameof(credential));
+        options ??= new OpenAIClientOptions();
+
+        _pipeline = OpenAIClient.CreatePipeline(credential, options);
+        _endpoint = OpenAIClient.GetEndpoint(options);
+    }
+
+    // CUSTOM:
+    // - Used a custom pipeline.
+    // - Demoted the endpoint parameter to be a property in the options class.
+    // - Made protected.
+    /// <summary> Initializes a new instance of <see cref="InternalAssistantRunClient"/>. </summary>
+    /// <param name="pipeline"> The HTTP pipeline to send and receive REST requests and responses. </param>
+    /// <param name="options"> The options to configure the client. </param>
+    /// <exception cref="ArgumentNullException"> <paramref name="pipeline"/> is null. </exception>
+    protected internal InternalAssistantRunClient(ClientPipeline pipeline, OpenAIClientOptions options)
+    {
+        Argument.AssertNotNull(pipeline, nameof(pipeline));
+        options ??= new OpenAIClientOptions();
+
+        _pipeline = pipeline;
+        _endpoint = OpenAIClient.GetEndpoint(options);
+    }
+}
diff --git a/.dotnet/src/Custom/Assistants/Internal/InternalAssistantThreadClient.Protocol.cs b/.dotnet/src/Custom/Assistants/Internal/InternalAssistantThreadClient.Protocol.cs
new file mode 100644
index 000000000..3986d26c8
--- /dev/null
+++ b/.dotnet/src/Custom/Assistants/Internal/InternalAssistantThreadClient.Protocol.cs
@@ -0,0 +1,143 @@
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Threading.Tasks;
+
+namespace OpenAI.Assistants;
+
+internal partial class InternalAssistantThreadClient
+{
+    /// <summary>
+    /// [Protocol Method] Create a thread.
+    /// </summary>
+    /// <param name="content"> The content to send as the body of the request. </param>
+    /// <param name="options"> The request options, which can override default behaviors of the client pipeline on a per-call basis. </param>
+    /// <exception cref="ArgumentNullException"> <paramref name="content"/> is null. </exception>
+    /// <exception cref="ClientResultException"> Service returned a non-success status code. </exception>
+    /// <returns> The response returned from the service. </returns>
+    public virtual async Task<ClientResult> CreateThreadAsync(BinaryContent content, RequestOptions options = null)
+    {
+        using PipelineMessage message = CreateCreateThreadRequest(content, options);
+        return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false));
+    }
+
+    /// <summary>
+    /// [Protocol Method] Create a thread.
+    /// </summary>
+    /// <param name="content"> The content to send as the body of the request. </param>
+    /// <param name="options"> The request options, which can override default behaviors of the client pipeline on a per-call basis. </param>
+    /// <exception cref="ArgumentNullException"> <paramref name="content"/> is null. </exception>
+    /// <exception cref="ClientResultException"> Service returned a non-success status code. </exception>
+    /// <returns> The response returned from the service. </returns>
+    public virtual ClientResult CreateThread(BinaryContent content, RequestOptions options = null)
+    {
+        using PipelineMessage message = CreateCreateThreadRequest(content, options);
+        return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options));
+    }
+
+    /// <summary>
+    /// [Protocol Method] Retrieves a thread.
+    /// </summary>
+    /// <param name="threadId"> The ID of the thread to retrieve. </param>
+    /// <param name="options"> The request options, which can override default behaviors of the client pipeline on a per-call basis. </param>
+    /// <exception cref="ArgumentNullException"> <paramref name="threadId"/> is null. </exception>
+    /// <exception cref="ArgumentException"> <paramref name="threadId"/> is an empty string, and was expected to be non-empty. </exception>
+    /// <exception cref="ClientResultException"> Service returned a non-success status code. </exception>
+    /// <returns> The response returned from the service. </returns>
+    public virtual async Task<ClientResult> GetThreadAsync(string threadId, RequestOptions options)
+    {
+        Argument.AssertNotNullOrEmpty(threadId, nameof(threadId));
+
+        using PipelineMessage message = CreateGetThreadRequest(threadId, options);
+        return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false));
+    }
+
+    /// <summary>
+    /// [Protocol Method] Retrieves a thread.
+    /// </summary>
+    /// <param name="threadId"> The ID of the thread to retrieve. </param>
+    /// <param name="options"> The request options, which can override default behaviors of the client pipeline on a per-call basis. </param>
+    /// <exception cref="ArgumentNullException"> <paramref name="threadId"/> is null. </exception>
+    /// <exception cref="ArgumentException"> <paramref name="threadId"/> is an empty string, and was expected to be non-empty. </exception>
+    /// <exception cref="ClientResultException"> Service returned a non-success status code. </exception>
+    /// <returns> The response returned from the service. </returns>
+    public virtual ClientResult GetThread(string threadId, RequestOptions options)
+    {
+        Argument.AssertNotNullOrEmpty(threadId, nameof(threadId));
+
+        using PipelineMessage message = CreateGetThreadRequest(threadId, options);
+        return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options));
+    }
+
+    /// <summary>
+    /// [Protocol Method] Modifies a thread.
+    /// </summary>
+    /// <param name="threadId"> The ID of the thread to modify. Only the `metadata` can be modified. </param>
+    /// <param name="content"> The content to send as the body of the request. </param>
+    /// <param name="options"> The request options, which can override default behaviors of the client pipeline on a per-call basis. </param>
+    /// <exception cref="ArgumentNullException"> <paramref name="threadId"/> or <paramref name="content"/> is null. </exception>
+    /// <exception cref="ArgumentException"> <paramref name="threadId"/> is an empty string, and was expected to be non-empty. </exception>
+    /// <exception cref="ClientResultException"> Service returned a non-success status code. </exception>
+    /// <returns> The response returned from the service. </returns>
+    public virtual async Task<ClientResult> ModifyThreadAsync(string threadId, BinaryContent content, RequestOptions options = null)
+    {
+        Argument.AssertNotNullOrEmpty(threadId, nameof(threadId));
+        Argument.AssertNotNull(content, nameof(content));
+
+        using PipelineMessage message = CreateModifyThreadRequest(threadId, content, options);
+        return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false));
+    }
+
+    /// <summary>
+    /// [Protocol Method] Modifies a thread.
+    /// </summary>
+    /// <param name="threadId"> The ID of the thread to modify. Only the `metadata` can be modified. </param>
+    /// <param name="content"> The content to send as the body of the request. </param>
+    /// <param name="options"> The request options, which can override default behaviors of the client pipeline on a per-call basis. </param>
+    /// <exception cref="ArgumentNullException"> <paramref name="threadId"/> or <paramref name="content"/> is null. </exception>
+    /// <exception cref="ArgumentException"> <paramref name="threadId"/> is an empty string, and was expected to be non-empty. </exception>
+    /// <exception cref="ClientResultException"> Service returned a non-success status code. </exception>
+    /// <returns> The response returned from the service. </returns>
+    public virtual ClientResult ModifyThread(string threadId, BinaryContent content, RequestOptions options = null)
+    {
+        Argument.AssertNotNullOrEmpty(threadId, nameof(threadId));
+        Argument.AssertNotNull(content, nameof(content));
+
+        using PipelineMessage message = CreateModifyThreadRequest(threadId, content, options);
+        return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options));
+    }
+
+    /// <summary>
+    /// [Protocol Method] Delete a thread.
+    /// </summary>
+    /// <param name="threadId"> The ID of the thread to delete. </param>
+    /// <param name="options"> The request options, which can override default behaviors of the client pipeline on a per-call basis. </param>
+    /// <exception cref="ArgumentNullException"> <paramref name="threadId"/> is null. </exception>
+    /// <exception cref="ArgumentException"> <paramref name="threadId"/> is an empty string, and was expected to be non-empty. </exception>
+    /// <exception cref="ClientResultException"> Service returned a non-success status code. </exception>
+    /// <returns> The response returned from the service. </returns>
+    public virtual async Task<ClientResult> DeleteThreadAsync(string threadId, RequestOptions options)
+    {
+        Argument.AssertNotNullOrEmpty(threadId, nameof(threadId));
+
+        using PipelineMessage message = CreateDeleteThreadRequest(threadId, options);
+        return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false));
+    }
+
+    /// <summary>
+    /// [Protocol Method] Delete a thread.
+    /// </summary>
+    /// <param name="threadId"> The ID of the thread to delete. </param>
+    /// <param name="options"> The request options, which can override default behaviors of the client pipeline on a per-call basis. </param>
+    /// <exception cref="ArgumentNullException"> <paramref name="threadId"/> is null. </exception>
+    /// <exception cref="ArgumentException"> <paramref name="threadId"/> is an empty string, and was expected to be non-empty. </exception>
+    /// <exception cref="ClientResultException"> Service returned a non-success status code. </exception>
+    /// <returns> The response returned from the service. </returns>
+    public virtual ClientResult DeleteThread(string threadId, RequestOptions options)
+    {
+        Argument.AssertNotNullOrEmpty(threadId, nameof(threadId));
+
+        using PipelineMessage message = CreateDeleteThreadRequest(threadId, options);
+        return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options));
+    }
+}
diff --git a/.dotnet/src/Custom/Assistants/Internal/InternalAssistantThreadClient.cs b/.dotnet/src/Custom/Assistants/Internal/InternalAssistantThreadClient.cs
new file mode 100644
index 000000000..295a8a492
--- /dev/null
+++ b/.dotnet/src/Custom/Assistants/Internal/InternalAssistantThreadClient.cs
@@ -0,0 +1,61 @@
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+
+namespace OpenAI.Assistants;
+
+[CodeGenClient("Threads")]
+[CodeGenSuppress("InternalAssistantThreadClient", typeof(ClientPipeline), typeof(ApiKeyCredential), typeof(Uri))]
+[CodeGenSuppress("CreateThreadAsync", typeof(ThreadCreationOptions))]
+[CodeGenSuppress("CreateThread", typeof(ThreadCreationOptions))]
+[CodeGenSuppress("GetThreadAsync", typeof(string))]
+[CodeGenSuppress("GetThread", typeof(string))]
+[CodeGenSuppress("ModifyThreadAsync", typeof(string), typeof(ThreadModificationOptions))]
+[CodeGenSuppress("ModifyThread", typeof(string), typeof(ThreadModificationOptions))]
+[CodeGenSuppress("DeleteThreadAsync", typeof(string))]
+[CodeGenSuppress("DeleteThread", typeof(string))]
+internal partial class InternalAssistantThreadClient
+{
+    // CUSTOM:
+    // - Used a custom pipeline.
+    // - Demoted the endpoint parameter to be a property in the options class.
+    /// <summary> Initializes a new instance of <see cref="InternalAssistantThreadClient"/>. </summary>
+    /// <param name="credential"> The API key to authenticate with the service. </param>
+    /// <exception cref="ArgumentNullException"> <paramref name="credential"/> is null. </exception>
+    public InternalAssistantThreadClient(ApiKeyCredential credential) : this(credential, new OpenAIClientOptions())
+    {
+    }
+
+    // CUSTOM:
+    // - Used a custom pipeline.
+    // - Demoted the endpoint parameter to be a property in the options class.
+    /// <summary> Initializes a new instance of <see cref="InternalAssistantThreadClient"/>. </summary>
+    /// <param name="credential"> The API key to authenticate with the service. </param>
+    /// <param name="options"> The options to configure the client. </param>
+    /// <exception cref="ArgumentNullException"> <paramref name="credential"/> is null. </exception>
+    public InternalAssistantThreadClient(ApiKeyCredential credential, OpenAIClientOptions options)
+    {
+        Argument.AssertNotNull(credential, nameof(credential));
+        options ??= new OpenAIClientOptions();
+
+        _pipeline = OpenAIClient.CreatePipeline(credential, options);
+        _endpoint = OpenAIClient.GetEndpoint(options);
+    }
+
+    // CUSTOM:
+    // - Used a custom pipeline.
+    // - Demoted the endpoint parameter to be a property in the options class.
+    // - Made protected.
+    /// <summary> Initializes a new instance of <see cref="InternalAssistantThreadClient"/>. </summary>
+    /// <param name="pipeline"> The HTTP pipeline to send and receive REST requests and responses. </param>
+    /// <param name="options"> The options to configure the client. </param>
+    /// <exception cref="ArgumentNullException"> <paramref name="pipeline"/> is null. </exception>
+    protected internal InternalAssistantThreadClient(ClientPipeline pipeline, OpenAIClientOptions options)
+    {
+        Argument.AssertNotNull(pipeline, nameof(pipeline));
+        options ??= new OpenAIClientOptions();
+
+        _pipeline = pipeline;
+        _endpoint = OpenAIClient.GetEndpoint(options);
+    }
+}
diff --git a/.dotnet/src/Custom/Assistants/Internal/InternalAssistantToolsFileSearchFileSearch.cs b/.dotnet/src/Custom/Assistants/Internal/InternalAssistantToolsFileSearchFileSearch.cs
new file mode 100644
index 000000000..781f98db2
--- /dev/null
+++ b/.dotnet/src/Custom/Assistants/Internal/InternalAssistantToolsFileSearchFileSearch.cs
@@ -0,0 +1,8 @@
+namespace OpenAI.Assistants;
+
+[CodeGenModel("AssistantToolsFileSearchFileSearch")]
+internal partial class InternalAssistantToolsFileSearchFileSearch
+{
+    [CodeGenMember("MaxNumResults")]
+    internal int? InternalMaxNumResults { get; set; }
+}
\ No newline at end of file
diff --git a/.dotnet/src/Custom/Assistants/Internal/InternalMessageImageFileContent.Serialization.cs b/.dotnet/src/Custom/Assistants/Internal/InternalMessageImageFileContent.Serialization.cs
new file mode 100644
index 000000000..c6dd038a1
--- /dev/null
+++ b/.dotnet/src/Custom/Assistants/Internal/InternalMessageImageFileContent.Serialization.cs
@@ -0,0 +1,28 @@
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace OpenAI.Assistants;
+
+[CodeGenSuppress("global::System.ClientModel.Primitives.IJsonModel.Write", typeof(Utf8JsonWriter), typeof(ModelReaderWriterOptions))]
+internal partial class InternalMessageImageFileContent : IJsonModel<InternalMessageImageFileContent>
+{
+    void IJsonModel<InternalMessageImageFileContent>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        => CustomSerializationHelpers.SerializeInstance(this, SerializeInternalMessageImageFileContent, writer, options);
+
+    internal static void SerializeInternalMessageImageFileContent(InternalMessageImageFileContent instance, Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        => instance.WriteCore(writer, options);
+
+    protected override void WriteCore(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+    {
+        writer.WriteStartObject();
+        writer.WritePropertyName("type"u8);
+        writer.WriteStringValue(_type);
+        writer.WritePropertyName("image_file"u8);
+        writer.WriteObjectValue(_imageFile, options);
+        writer.WriteSerializedAdditionalRawData(SerializedAdditionalRawData, options);
+        writer.WriteEndObject();
+    }
+}
diff --git a/.dotnet/src/Custom/Assistants/Internal/InternalMessageImageFileContent.cs b/.dotnet/src/Custom/Assistants/Internal/InternalMessageImageFileContent.cs
new file mode 100644
index 000000000..222a44dc8
--- /dev/null
+++ b/.dotnet/src/Custom/Assistants/Internal/InternalMessageImageFileContent.cs
@@ -0,0 +1,41 @@
+using System;
+
+namespace OpenAI.Assistants;
+
+/// <summary>
+/// Represents an item of image file content within an Assistants API message.
+/// </summary>
+/// <remarks>
+/// Use the method to
+/// create an instance of this type.
+/// </remarks>
+[CodeGenModel("MessageContentImageFileObject")]
+[CodeGenSuppress("InternalMessageImageFileContent", typeof(InternalMessageContentItemFileObjectImageFile))]
+internal partial class InternalMessageImageFileContent
+{
+    [CodeGenMember("Type")]
+    private string _type = "image_file";
+
+    [CodeGenMember("ImageFile")]
+    internal InternalMessageContentItemFileObjectImageFile _imageFile;
+
+    ///
+    public string InternalFileId => _imageFile.FileId;
+
+    ///
+    public MessageImageDetail? InternalDetail => _imageFile.Detail?.ToMessageImageDetail();
+
+    /// <summary> Initializes a new instance of <see cref="InternalMessageImageFileContent"/>. </summary>
+    internal InternalMessageImageFileContent(string imageFileId, MessageImageDetail? detail = null)
+        : this(new InternalMessageContentItemFileObjectImageFile(imageFileId, detail?.ToSerialString(), null))
+    {}
+
+    /// <summary> Initializes a new instance of <see cref="InternalMessageImageFileContent"/>. </summary>
+    /// <param name="imageFile"></param>
+    /// <exception cref="ArgumentNullException"> <paramref name="imageFile"/> is null. </exception>
+    internal InternalMessageImageFileContent(InternalMessageContentItemFileObjectImageFile imageFile)
+    {
+        Argument.AssertNotNull(imageFile, nameof(imageFile));
+        _imageFile = imageFile;
+    }
+}
diff --git a/.dotnet/src/Custom/Assistants/Internal/InternalMessageImageUrlContent.Serialization.cs b/.dotnet/src/Custom/Assistants/Internal/InternalMessageImageUrlContent.Serialization.cs
new file mode 100644
index 000000000..48d0f7751
--- /dev/null
+++ b/.dotnet/src/Custom/Assistants/Internal/InternalMessageImageUrlContent.Serialization.cs
@@ -0,0 +1,28 @@
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace OpenAI.Assistants;
+
+[CodeGenSuppress("global::System.ClientModel.Primitives.IJsonModel.Write", typeof(Utf8JsonWriter), typeof(ModelReaderWriterOptions))]
+internal partial class InternalMessageImageUrlContent : IJsonModel<InternalMessageImageUrlContent>
+{
+    void IJsonModel<InternalMessageImageUrlContent>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        => CustomSerializationHelpers.SerializeInstance(this, SerializeInternalMessageImageUrlContent, writer, options);
+
+    internal static void SerializeInternalMessageImageUrlContent(InternalMessageImageUrlContent instance, Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        => instance.WriteCore(writer, options);
+
+    protected override void WriteCore(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+    {
+        writer.WriteStartObject();
+        writer.WritePropertyName("type"u8);
+        writer.WriteStringValue(_type);
+        writer.WritePropertyName("image_url"u8);
+        writer.WriteObjectValue(_imageUrl, options);
+        writer.WriteSerializedAdditionalRawData(SerializedAdditionalRawData, options);
+        writer.WriteEndObject();
+    }
+}
\ No newline at end of file
diff --git a/.dotnet/src/Custom/Assistants/Internal/InternalMessageImageUrlContent.cs b/.dotnet/src/Custom/Assistants/Internal/InternalMessageImageUrlContent.cs
new file mode 100644
index 000000000..f007fec8c
--- /dev/null
+++ b/.dotnet/src/Custom/Assistants/Internal/InternalMessageImageUrlContent.cs
@@ -0,0 +1,41 @@
+using System;
+
+namespace OpenAI.Assistants;
+
+/// <summary>
+/// Represents an item of image URL content within an Assistants API message.
+/// </summary>
+/// <remarks>
+/// Use the method to
+/// create an instance of this type.
+/// </remarks>
+[CodeGenModel("MessageContentImageUrlObject")]
+[CodeGenSuppress("MessageImageUrlContent", typeof(InternalMessageContentImageUrlObjectImageUrl))]
+internal partial class InternalMessageImageUrlContent
+{
+    [CodeGenMember("Type")]
+    private string _type = "image_url";
+
+    [CodeGenMember("ImageUrl")]
+    internal InternalMessageContentImageUrlObjectImageUrl _imageUrl;
+
+    ///
+    public Uri InternalUrl => _imageUrl.Url;
+
+    ///
+    public MessageImageDetail? InternalDetail => _imageUrl.Detail?.ToMessageImageDetail();
+
+    /// <summary> Initializes a new instance of <see cref="InternalMessageImageUrlContent"/>. </summary>
+    internal InternalMessageImageUrlContent(Uri url, MessageImageDetail? detail = null)
+        : this(new InternalMessageContentImageUrlObjectImageUrl(url, detail?.ToSerialString(), null))
+    {}
+
+    /// <summary> Initializes a new instance of <see cref="InternalMessageImageUrlContent"/>. </summary>
+    /// <param name="imageUrl"></param>
+    /// <exception cref="ArgumentNullException"> <paramref name="imageUrl"/> is null. </exception>
+    internal InternalMessageImageUrlContent(InternalMessageContentImageUrlObjectImageUrl imageUrl)
+    {
+        Argument.AssertNotNull(imageUrl, nameof(imageUrl));
+        _imageUrl = imageUrl;
+    }
+}
diff --git a/.dotnet/src/Custom/Assistants/Internal/InternalMessageRefusalContent.Serialization.cs b/.dotnet/src/Custom/Assistants/Internal/InternalMessageRefusalContent.Serialization.cs
new file mode 100644
index 000000000..b14bf17a6
--- /dev/null
+++ b/.dotnet/src/Custom/Assistants/Internal/InternalMessageRefusalContent.Serialization.cs
@@ -0,0 +1,27 @@
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace OpenAI.Assistants;
+
+[CodeGenSuppress("global::System.ClientModel.Primitives.IJsonModel.Write", typeof(Utf8JsonWriter), typeof(ModelReaderWriterOptions))]
+internal partial class InternalMessageRefusalContent : IJsonModel<InternalMessageRefusalContent>
+{
+    void IJsonModel<InternalMessageRefusalContent>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        => CustomSerializationHelpers.SerializeInstance(this, SerializeInternalMessageRefusalContent, writer, options);
+
+    internal static void SerializeInternalMessageRefusalContent(InternalMessageRefusalContent instance, Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        => instance.WriteCore(writer, options);
+
+    protected override void WriteCore(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+    {
+        writer.WriteStartObject();
+        writer.WritePropertyName("type"u8);
+        writer.WriteStringValue(_type);
+        writer.WritePropertyName("refusal"u8);
+        writer.WriteStringValue(Refusal);
+        writer.WriteEndObject();
+    }
+}
\ No newline at end of file
diff --git a/.dotnet/src/Custom/Assistants/Internal/InternalMessageRefusalContent.cs b/.dotnet/src/Custom/Assistants/Internal/InternalMessageRefusalContent.cs
new file mode 100644
index 000000000..ed0188314
--- /dev/null
+++ b/.dotnet/src/Custom/Assistants/Internal/InternalMessageRefusalContent.cs
@@ -0,0 +1,20 @@
+using System;
+
+namespace OpenAI.Assistants;
+
+/// <summary>
+/// Represents an item of refusal content within an Assistants API message.
+/// </summary>
+/// <remarks>
+/// Use the method to
+/// create an instance of this type.
+/// </remarks>
+[CodeGenModel("MessageContentRefusalObject")]
+internal partial class InternalMessageRefusalContent
+{
+    [CodeGenMember("Type")]
+    private string _type = "refusal";
+
+    [CodeGenMember("Refusal")]
+    public string InternalRefusal { get; set; }
+}
diff --git a/.dotnet/src/Custom/Assistants/Internal/InternalRequestMessageTextContent.Serialization.cs b/.dotnet/src/Custom/Assistants/Internal/InternalRequestMessageTextContent.Serialization.cs
new file mode 100644
index 000000000..565c34ee4
--- /dev/null
+++ b/.dotnet/src/Custom/Assistants/Internal/InternalRequestMessageTextContent.Serialization.cs
@@ -0,0 +1,28 @@
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace OpenAI.Assistants;
+
+[CodeGenSuppress("global::System.ClientModel.Primitives.IJsonModel.Write", typeof(Utf8JsonWriter), typeof(ModelReaderWriterOptions))]
+internal partial class InternalRequestMessageTextContent : IJsonModel<InternalRequestMessageTextContent>
+{
+    void IJsonModel<InternalRequestMessageTextContent>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        => CustomSerializationHelpers.SerializeInstance(this, SerializeInternalRequestMessageTextContent, writer, options);
+
+    internal static void SerializeInternalRequestMessageTextContent(InternalRequestMessageTextContent instance, Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        => instance.WriteCore(writer, options);
+
+    protected override void WriteCore(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+    {
+        writer.WriteStartObject();
+        writer.WritePropertyName("type"u8);
+        writer.WriteStringValue(Type.ToString());
+        writer.WritePropertyName("text"u8);
+        writer.WriteStringValue(InternalText);
+        writer.WriteSerializedAdditionalRawData(SerializedAdditionalRawData, options);
+        writer.WriteEndObject();
+    }
+}
diff --git a/.dotnet/src/Custom/Assistants/Internal/InternalRequestMessageTextContent.cs b/.dotnet/src/Custom/Assistants/Internal/InternalRequestMessageTextContent.cs
new file mode 100644
index 000000000..9925844e2
--- /dev/null
+++ b/.dotnet/src/Custom/Assistants/Internal/InternalRequestMessageTextContent.cs
@@ -0,0 +1,15 @@
+namespace OpenAI.Assistants;
+
+/// <summary>
+/// Represents an item of text content within an Assistants API message.
+/// </summary>
+/// <remarks>
+/// Use the method to create an instance of this
+/// type.
+/// </remarks>
+[CodeGenModel("MessageRequestContentTextObject")]
+internal partial class InternalRequestMessageTextContent
+{
+    [CodeGenMember("Text")]
+    internal string InternalText { get; }
+}
diff --git a/.dotnet/src/Custom/Assistants/Internal/InternalRequiredFunctionToolCall.cs b/.dotnet/src/Custom/Assistants/Internal/InternalRequiredFunctionToolCall.cs
new file mode 100644
index 000000000..89ed0f097
--- /dev/null
+++ b/.dotnet/src/Custom/Assistants/Internal/InternalRequiredFunctionToolCall.cs
@@ -0,0 +1,25 @@
+namespace OpenAI.Assistants;
+
+/// <summary>
+/// A requested invocation of a defined function tool, needed by an Assistants API run to continue.
+/// </summary>
+[CodeGenModel("RunToolCallObject")]
+internal partial class InternalRequiredFunctionToolCall : InternalRequiredToolCall
+{
+    // CUSTOM:
+    // - 'Type' is hidden, as the object discriminator does not carry additional value to the caller in the context
+    //   of a strongly-typed object model
+    // - 'Function' is hidden and its constituent 'Name' and 'Arguments' members are promoted to direct visibility
+
+    [CodeGenMember("Type")]
+    private readonly object _type;
+    [CodeGenMember("Function")]
+    internal readonly InternalRunToolCallObjectFunction _internalFunction;
+
+    ///
+    public string InternalName => _internalFunction.Name;
+
+    ///
+    public string InternalArguments => _internalFunction.Arguments;
+
+}
\ No newline at end of file
diff --git a/.dotnet/src/Custom/Assistants/Internal/InternalRequiredToolCall.cs b/.dotnet/src/Custom/Assistants/Internal/InternalRequiredToolCall.cs
new file mode 100644
index 000000000..4d7fc35a1
--- /dev/null
+++ b/.dotnet/src/Custom/Assistants/Internal/InternalRequiredToolCall.cs
@@ -0,0 +1,14 @@
+namespace OpenAI.Assistants;
+
+/// <summary>
+/// An abstract, base representation for a tool call that an Assistants API run requires outputs
+/// from in order to continue.
+/// </summary>
+/// <remarks>
+/// <see cref="InternalRequiredToolCall"/> is the abstract base type for all required tool calls. Its
+/// concrete type can be one of:
+/// <list type="bullet">
+/// <item> <see cref="InternalRequiredFunctionToolCall"/> </item>
+/// </list>
+/// </remarks>
+internal abstract partial class InternalRequiredToolCall : RequiredAction { }
\ No newline at end of file
diff --git a/.dotnet/src/Custom/Assistants/Internal/InternalResponseMessageTextContent.Serialization.cs b/.dotnet/src/Custom/Assistants/Internal/InternalResponseMessageTextContent.Serialization.cs
new file mode 100644
index 000000000..5cb19bb15
--- /dev/null
+++ b/.dotnet/src/Custom/Assistants/Internal/InternalResponseMessageTextContent.Serialization.cs
@@ -0,0 +1,28 @@
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace OpenAI.Assistants;
+
+[CodeGenSuppress("global::System.ClientModel.Primitives.IJsonModel.Write", typeof(Utf8JsonWriter), typeof(ModelReaderWriterOptions))]
+internal partial class InternalResponseMessageTextContent : IJsonModel<InternalResponseMessageTextContent>
+{
+    void IJsonModel<InternalResponseMessageTextContent>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        => CustomSerializationHelpers.SerializeInstance(this, SerializeInternalResponseMessageTextContent, writer, options);
+
+    internal static void SerializeInternalResponseMessageTextContent(InternalResponseMessageTextContent instance, Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        => instance.WriteCore(writer, options);
+
+    protected override void WriteCore(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+    {
+        writer.WriteStartObject();
+        writer.WritePropertyName("type"u8);
+        writer.WriteStringValue(_type);
+        writer.WritePropertyName("text"u8);
+        writer.WriteObjectValue(_text, options);
+        writer.WriteSerializedAdditionalRawData(SerializedAdditionalRawData, options);
+        writer.WriteEndObject();
+    }
+}
diff --git a/.dotnet/src/Custom/Assistants/Internal/InternalResponseMessageTextContent.cs b/.dotnet/src/Custom/Assistants/Internal/InternalResponseMessageTextContent.cs
new file mode 100644
index 000000000..0d7b79794
--- /dev/null
+++ b/.dotnet/src/Custom/Assistants/Internal/InternalResponseMessageTextContent.cs
@@ -0,0 +1,45 @@
+using System.Collections.Generic;
+
+namespace OpenAI.Assistants;
+
+/// <summary>
+/// Represents an item of annotated text content within an Assistants API response message.
+/// </summary>
+[CodeGenModel("MessageContentTextObject")]
+internal partial class InternalResponseMessageTextContent
+{
+    ///
+    public string InternalText => _text.Value;
+
+    public IReadOnlyList<TextAnnotation> InternalAnnotations => _annotations ??= WrapAnnotations();
+
+    [CodeGenMember("Type")]
+    private readonly string _type;
+
+    [CodeGenMember("Text")]
+    private readonly InternalMessageContentTextObjectText _text;
+
+    private IReadOnlyList<TextAnnotation> _annotations;
+
+    /// <summary> Initializes a new instance of <see cref="InternalResponseMessageTextContent"/>. </summary>
+    /// <param name="internalText"></param>
+    /// <exception cref="ArgumentNullException"> <paramref name="internalText"/> is null. </exception>
+    internal InternalResponseMessageTextContent(InternalMessageContentTextObjectText internalText)
+    {
+        Argument.AssertNotNull(internalText, nameof(internalText));
+
+        _text = internalText;
+    }
+
+    public override string ToString() => Text;
+
+    private IReadOnlyList<TextAnnotation> WrapAnnotations()
+    {
+        List<TextAnnotation> annotations = [];
+        foreach (InternalMessageContentTextObjectAnnotation internalAnnotation in _text?.Annotations ?? [])
+        {
+            annotations.Add(new(internalAnnotation));
+        }
+        return annotations;
+    }
+}
diff --git a/.dotnet/src/Custom/Assistants/Internal/InternalRunStepCodeInterpreterLogOutput.cs b/.dotnet/src/Custom/Assistants/Internal/InternalRunStepCodeInterpreterLogOutput.cs
new file mode 100644
index 000000000..4d0a1d8e0
--- /dev/null
+++ b/.dotnet/src/Custom/Assistants/Internal/InternalRunStepCodeInterpreterLogOutput.cs
@@ -0,0 +1,11 @@
+namespace OpenAI.Assistants
+{
+    /// <summary> Text output from the Code Interpreter tool call as part of a run step. </summary>
+    [CodeGenModel("RunStepDetailsToolCallsCodeOutputLogsObject")]
+    internal partial class InternalRunStepCodeInterpreterLogOutput : RunStepCodeInterpreterOutput
+    {
+        /// <summary> The text output from the Code Interpreter tool call. </summary>
+        [CodeGenMember("Logs")]
+        public string InternalLogs { get; }
+    }
+}
diff --git a/.dotnet/src/Custom/Assistants/Internal/InternalRunStepCodeInterpreterToolCallDetails.cs b/.dotnet/src/Custom/Assistants/Internal/InternalRunStepCodeInterpreterToolCallDetails.cs
new file mode 100644
index 000000000..71f873f02
--- /dev/null
+++ b/.dotnet/src/Custom/Assistants/Internal/InternalRunStepCodeInterpreterToolCallDetails.cs
@@ -0,0 +1,16 @@
+using System.Collections.Generic;
+
+namespace OpenAI.Assistants;
+
+[CodeGenModel("RunStepDetailsToolCallsCodeObject")]
+internal partial class InternalRunStepCodeInterpreterToolCallDetails
+{
+    ///
+    public string Input => _codeInterpreter.Input;
+
+    ///
+    public IReadOnlyList<RunStepCodeInterpreterOutput> Outputs => _codeInterpreter.Outputs;
+
+    [CodeGenMember("CodeInterpreter")]
+    internal InternalRunStepDetailsToolCallsCodeObjectCodeInterpreter _codeInterpreter;
+}
\ No newline at end of file
diff --git a/.dotnet/src/Custom/Assistants/Internal/InternalRunStepDetailsMessageCreationObject.cs b/.dotnet/src/Custom/Assistants/Internal/InternalRunStepDetailsMessageCreationObject.cs
new file mode 100644
index 000000000..9c0ab12f5
--- /dev/null
+++ b/.dotnet/src/Custom/Assistants/Internal/InternalRunStepDetailsMessageCreationObject.cs
@@ -0,0 +1,11 @@
+namespace OpenAI.Assistants;
+
+[CodeGenModel("RunStepDetailsMessageCreationObject")]
+internal partial class InternalRunStepDetailsMessageCreationObject : RunStepDetails
+{
+    ///
+    public string InternalMessageId => _messageCreation.MessageId;
+
+    [CodeGenMember("MessageCreation")]
+    internal readonly InternalRunStepDetailsMessageCreationObjectMessageCreation _messageCreation;
+}
\ No newline at end of file
diff --git a/.dotnet/src/Custom/Assistants/Internal/InternalRunStepDetailsToolCallsCodeOutputImageObject.cs b/.dotnet/src/Custom/Assistants/Internal/InternalRunStepDetailsToolCallsCodeOutputImageObject.cs
new file mode 100644
index 000000000..15558d514
--- /dev/null
+++ b/.dotnet/src/Custom/Assistants/Internal/InternalRunStepDetailsToolCallsCodeOutputImageObject.cs
@@ -0,0 +1,11 @@
+namespace OpenAI.Assistants;
+
+[CodeGenModel("RunStepDetailsToolCallsCodeOutputImageObject")]
+internal partial class InternalRunStepDetailsToolCallsCodeOutputImageObject
+{
+    ///
+    public string FileId => _image.FileId;
+
+    [CodeGenMember("Image")]
+    internal InternalRunStepDetailsToolCallsCodeOutputImageObjectImage _image;
+}
\ No newline at end of file
diff --git a/.dotnet/src/Custom/Assistants/Internal/InternalRunStepFunctionToolCallDetails.cs b/.dotnet/src/Custom/Assistants/Internal/InternalRunStepFunctionToolCallDetails.cs
new file mode 100644
index 000000000..04db0b2ab
--- /dev/null
+++ b/.dotnet/src/Custom/Assistants/Internal/InternalRunStepFunctionToolCallDetails.cs
@@ -0,0 +1,12 @@
+namespace OpenAI.Assistants;
+
+[CodeGenModel("RunStepDetailsToolCallsFunctionObject")]
+internal partial class InternalRunStepFunctionToolCallDetails
+{
+    public string InternalName => _internalFunction.Name;
+    public string InternalArguments => _internalFunction.Arguments;
+    public string InternalOutput => _internalFunction.Output;
+
+    [CodeGenMember("Function")]
+    internal InternalRunStepDetailsToolCallsFunctionObjectFunction _internalFunction;
+}
diff --git a/.dotnet/src/Custom/Assistants/Internal/InternalRunStepToolCallDetailsCollection.cs b/.dotnet/src/Custom/Assistants/Internal/InternalRunStepToolCallDetailsCollection.cs
new file mode 100644
index 000000000..59e91f6e1
--- /dev/null
+++ b/.dotnet/src/Custom/Assistants/Internal/InternalRunStepToolCallDetailsCollection.cs
@@ -0,0 +1,22 @@
+using System.Collections;
+using System.Collections.Generic;
+
+namespace OpenAI.Assistants;
+
+[CodeGenModel("RunStepDetailsToolCallsObject")]
+internal partial class InternalRunStepDetailsToolCallsObject : IReadOnlyList<RunStepToolCall>
+{
+    [CodeGenMember("ToolCalls")]
+    private IReadOnlyList<RunStepToolCall> InternalToolCalls { get; } = [];
+
+    ///
+    public RunStepToolCall this[int index] => InternalToolCalls[index];
+
+    ///
+    public int Count => InternalToolCalls.Count;
+
+    ///
+    public IEnumerator<RunStepToolCall> GetEnumerator() => InternalToolCalls.GetEnumerator();
+
+    IEnumerator IEnumerable.GetEnumerator() => InternalToolCalls.GetEnumerator();
+}
diff --git a/.dotnet/src/Custom/Assistants/Internal/Pagination/AssistantsPageEnumerator.cs b/.dotnet/src/Custom/Assistants/Internal/Pagination/AssistantsPageEnumerator.cs
new file mode 100644
index 000000000..7f970feca
--- /dev/null
+++ b/.dotnet/src/Custom/Assistants/Internal/Pagination/AssistantsPageEnumerator.cs
@@ -0,0 +1,135 @@
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Text.Json;
+using System.Threading.Tasks;
+
+#nullable enable
+
+namespace OpenAI.Assistants;
+
+internal partial class AssistantsPageEnumerator : PageEnumerator<Assistant>
+{
+    private readonly ClientPipeline _pipeline;
+    private readonly Uri _endpoint;
+
+    private readonly int? _limit;
+    private readonly string _order;
+
+    private string _after;
+
+    private readonly string _before;
+    private readonly RequestOptions _options;
+
+    public virtual ClientPipeline Pipeline => _pipeline;
+
+    public AssistantsPageEnumerator(
+        ClientPipeline pipeline,
+        Uri endpoint,
+        int?
limit, string order, string after, string before, + RequestOptions options) + { + _pipeline = pipeline; + _endpoint = endpoint; + + _limit = limit; + _order = order; + _after = after; + _before = before; + _options = options; + } + + public override async Task GetFirstAsync() + => await GetAssistantsAsync(_limit, _order, _after, _before, _options).ConfigureAwait(false); + + public override ClientResult GetFirst() + => GetAssistants(_limit, _order, _after, _before, _options); + + public override async Task GetNextAsync(ClientResult result) + { + PipelineResponse response = result.GetRawResponse(); + + using JsonDocument doc = JsonDocument.Parse(response.Content); + _after = doc.RootElement.GetProperty("last_id"u8).GetString()!; + + return await GetAssistantsAsync(_limit, _order, _after, _before, _options).ConfigureAwait(false); + } + + public override ClientResult GetNext(ClientResult result) + { + PipelineResponse response = result.GetRawResponse(); + + using JsonDocument doc = JsonDocument.Parse(response.Content); + _after = doc.RootElement.GetProperty("last_id"u8).GetString()!; + + return GetAssistants(_limit, _order, _after, _before, _options); + } + + public override bool HasNext(ClientResult result) + { + PipelineResponse response = result.GetRawResponse(); + + using JsonDocument doc = JsonDocument.Parse(response.Content); + bool hasMore = doc.RootElement.GetProperty("has_more"u8).GetBoolean(); + + return hasMore; + } + + public override PageResult GetPageFromResult(ClientResult result) + { + PipelineResponse response = result.GetRawResponse(); + + InternalListAssistantsResponse list = ModelReaderWriter.Read(response.Content)!; + + AssistantsPageToken pageToken = AssistantsPageToken.FromOptions(_limit, _order, _after, _before); + AssistantsPageToken? nextPageToken = pageToken.GetNextPageToken(list.HasMore, list.LastId); + + return PageResult.Create(list.Data, pageToken, nextPageToken, response); + } + + internal virtual async Task GetAssistantsAsync(int? 
limit, string order, string after, string before, RequestOptions options) + { + using PipelineMessage message = CreateGetAssistantsRequest(limit, order, after, before, options); + return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + internal virtual ClientResult GetAssistants(int? limit, string order, string after, string before, RequestOptions options) + { + using PipelineMessage message = CreateGetAssistantsRequest(limit, order, after, before, options); + return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options)); + } + + private PipelineMessage CreateGetAssistantsRequest(int? limit, string order, string after, string before, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "GET"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/assistants", false); + if (limit != null) + { + uri.AppendQuery("limit", limit.Value, true); + } + if (order != null) + { + uri.AppendQuery("order", order, true); + } + if (after != null) + { + uri.AppendQuery("after", after, true); + } + if (before != null) + { + uri.AppendQuery("before", before, true); + } + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + message.Apply(options); + return message; + } + + private static PipelineMessageClassifier? 
_pipelineMessageClassifier200; + private static PipelineMessageClassifier PipelineMessageClassifier200 => _pipelineMessageClassifier200 ??= PipelineMessageClassifier.Create(stackalloc ushort[] { 200 }); +} diff --git a/.dotnet/src/Custom/Assistants/Internal/Pagination/AssistantsPageToken.cs b/.dotnet/src/Custom/Assistants/Internal/Pagination/AssistantsPageToken.cs new file mode 100644 index 000000000..5cb19ed54 --- /dev/null +++ b/.dotnet/src/Custom/Assistants/Internal/Pagination/AssistantsPageToken.cs @@ -0,0 +1,142 @@ +using System; +using System.ClientModel; +using System.Diagnostics; +using System.IO; +using System.Text.Json; + +#nullable enable + +namespace OpenAI.Assistants; + +internal class AssistantsPageToken : ContinuationToken +{ + protected AssistantsPageToken(int? limit, string? order, string? after, string? before) + { + Limit = limit; + Order = order; + After = after; + Before = before; + } + + public int? Limit { get; } + + public string? Order { get; } + + public string? After { get; } + + public string? Before { get; } + + public override BinaryData ToBytes() + { + using MemoryStream stream = new(); + using Utf8JsonWriter writer = new(stream); + + writer.WriteStartObject(); + + if (Limit.HasValue) + { + writer.WriteNumber("limit", Limit.Value); + } + + if (Order is not null) + { + writer.WriteString("order", Order); + } + + if (After is not null) + { + writer.WriteString("after", After); + } + + if (Before is not null) + { + writer.WriteString("before", Before); + } + + writer.WriteEndObject(); + + writer.Flush(); + stream.Position = 0; + + return BinaryData.FromStream(stream); + } + + public AssistantsPageToken? GetNextPageToken(bool hasMore, string? 
lastId) + { + if (!hasMore || lastId is null) + { + return null; + } + + return new AssistantsPageToken(Limit, Order, lastId, Before); + } + + public static AssistantsPageToken FromToken(ContinuationToken pageToken) + { + if (pageToken is AssistantsPageToken token) + { + return token; + } + + BinaryData data = pageToken.ToBytes(); + + if (data.ToMemory().Length == 0) + { + throw new ArgumentException("Failed to create AssistantsPageToken from provided pageToken.", nameof(pageToken)); + } + + Utf8JsonReader reader = new(data); + + int? limit = null; + string? order = null; + string? after = null; + string? before = null; + + reader.Read(); + + Debug.Assert(reader.TokenType == JsonTokenType.StartObject); + + while (reader.Read()) + { + if (reader.TokenType == JsonTokenType.EndObject) + { + break; + } + + Debug.Assert(reader.TokenType == JsonTokenType.PropertyName); + + string propertyName = reader.GetString()!; + + switch (propertyName) + { + case "limit": + reader.Read(); + Debug.Assert(reader.TokenType == JsonTokenType.Number); + limit = reader.GetInt32(); + break; + case "order": + reader.Read(); + Debug.Assert(reader.TokenType == JsonTokenType.String); + order = reader.GetString(); + break; + case "after": + reader.Read(); + Debug.Assert(reader.TokenType == JsonTokenType.String); + after = reader.GetString(); + break; + case "before": + reader.Read(); + Debug.Assert(reader.TokenType == JsonTokenType.String); + before = reader.GetString(); + break; + default: + throw new JsonException($"Unrecognized property '{propertyName}'."); + } + } + + return new(limit, order, after, before); + } + + public static AssistantsPageToken FromOptions(int? limit, string? order, string? after, string?
before) + => new AssistantsPageToken(limit, order, after, before); +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Assistants/Internal/Pagination/MessagesPageEnumerator.cs b/.dotnet/src/Custom/Assistants/Internal/Pagination/MessagesPageEnumerator.cs new file mode 100644 index 000000000..860bfbe0d --- /dev/null +++ b/.dotnet/src/Custom/Assistants/Internal/Pagination/MessagesPageEnumerator.cs @@ -0,0 +1,145 @@ +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Text.Json; +using System.Threading.Tasks; + +#nullable enable + +namespace OpenAI.Assistants; + +internal partial class MessagesPageEnumerator : PageEnumerator +{ + private readonly ClientPipeline _pipeline; + private readonly Uri _endpoint; + + private readonly string _threadId; + private readonly int? _limit; + private readonly string _order; + + private string _after; + + private readonly string _before; + private readonly RequestOptions _options; + + public virtual ClientPipeline Pipeline => _pipeline; + + public MessagesPageEnumerator( + ClientPipeline pipeline, + Uri endpoint, + string threadId, + int? 
limit, string order, string after, string before, + RequestOptions options) + { + _pipeline = pipeline; + _endpoint = endpoint; + + _threadId = threadId; + _limit = limit; + _order = order; + _after = after; + _before = before; + + _options = options; + } + + public override async Task GetFirstAsync() + => await GetMessagesAsync(_threadId, _limit, _order, _after, _before, _options).ConfigureAwait(false); + + public override ClientResult GetFirst() + => GetMessages(_threadId, _limit, _order, _after, _before, _options); + + public override async Task GetNextAsync(ClientResult result) + { + PipelineResponse response = result.GetRawResponse(); + + using JsonDocument doc = JsonDocument.Parse(response.Content); + _after = doc.RootElement.GetProperty("last_id"u8).GetString()!; + + return await GetMessagesAsync(_threadId, _limit, _order, _after, _before, _options).ConfigureAwait(false); + } + + public override ClientResult GetNext(ClientResult result) + { + PipelineResponse response = result.GetRawResponse(); + + using JsonDocument doc = JsonDocument.Parse(response.Content); + _after = doc.RootElement.GetProperty("last_id"u8).GetString()!; + + return GetMessages(_threadId, _limit, _order, _after, _before, _options); + } + + public override bool HasNext(ClientResult result) + { + PipelineResponse response = result.GetRawResponse(); + + using JsonDocument doc = JsonDocument.Parse(response.Content); + bool hasMore = doc.RootElement.GetProperty("has_more"u8).GetBoolean(); + + return hasMore; + } + + public override PageResult GetPageFromResult(ClientResult result) + { + PipelineResponse response = result.GetRawResponse(); + + InternalListMessagesResponse list = ModelReaderWriter.Read(response.Content)!; + + MessagesPageToken pageToken = MessagesPageToken.FromOptions(_threadId, _limit, _order, _after, _before); + MessagesPageToken? 
nextPageToken = pageToken.GetNextPageToken(list.HasMore, list.LastId); + + return PageResult.Create(list.Data, pageToken, nextPageToken, response); + } + + internal virtual async Task GetMessagesAsync(string threadId, int? limit, string order, string after, string before, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + + using PipelineMessage message = CreateGetMessagesRequest(threadId, limit, order, after, before, options); + return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + internal virtual ClientResult GetMessages(string threadId, int? limit, string order, string after, string before, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + + using PipelineMessage message = CreateGetMessagesRequest(threadId, limit, order, after, before, options); + return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options)); + } + + private PipelineMessage CreateGetMessagesRequest(string threadId, int? limit, string order, string after, string before, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "GET"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/threads/", false); + uri.AppendPath(threadId, true); + uri.AppendPath("/messages", false); + if (limit != null) + { + uri.AppendQuery("limit", limit.Value, true); + } + if (order != null) + { + uri.AppendQuery("order", order, true); + } + if (after != null) + { + uri.AppendQuery("after", after, true); + } + if (before != null) + { + uri.AppendQuery("before", before, true); + } + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + message.Apply(options); + return message; + } + + private static PipelineMessageClassifier? 
_pipelineMessageClassifier200; + private static PipelineMessageClassifier PipelineMessageClassifier200 => _pipelineMessageClassifier200 ??= PipelineMessageClassifier.Create(stackalloc ushort[] { 200 }); +} diff --git a/.dotnet/src/Custom/Assistants/Internal/Pagination/MessagesPageToken.cs b/.dotnet/src/Custom/Assistants/Internal/Pagination/MessagesPageToken.cs new file mode 100644 index 000000000..ff4bfa9fa --- /dev/null +++ b/.dotnet/src/Custom/Assistants/Internal/Pagination/MessagesPageToken.cs @@ -0,0 +1,158 @@ +using System; +using System.ClientModel; +using System.Diagnostics; +using System.IO; +using System.Text.Json; + +#nullable enable + +namespace OpenAI.Assistants; + +internal class MessagesPageToken : ContinuationToken +{ + protected MessagesPageToken(string threadId, int? limit, string? order, string? after, string? before) + { + ThreadId = threadId; + + Limit = limit; + Order = order; + After = after; + Before = before; + } + + public string ThreadId { get; } + + public int? Limit { get; } + + public string? Order { get; } + + public string? After { get; } + + public string? Before { get; } + + public override BinaryData ToBytes() + { + using MemoryStream stream = new(); + using Utf8JsonWriter writer = new(stream); + + writer.WriteStartObject(); + writer.WriteString("threadId", ThreadId); + + if (Limit.HasValue) + { + writer.WriteNumber("limit", Limit.Value); + } + + if (Order is not null) + { + writer.WriteString("order", Order); + } + + if (After is not null) + { + writer.WriteString("after", After); + } + + if (Before is not null) + { + writer.WriteString("before", Before); + } + + writer.WriteEndObject(); + + writer.Flush(); + stream.Position = 0; + + return BinaryData.FromStream(stream); + } + + public MessagesPageToken? GetNextPageToken(bool hasMore, string? 
lastId) + { + if (!hasMore || lastId is null) + { + return null; + } + + return new(ThreadId, Limit, Order, lastId, Before); + } + + public static MessagesPageToken FromToken(ContinuationToken pageToken) + { + if (pageToken is MessagesPageToken token) + { + return token; + } + + BinaryData data = pageToken.ToBytes(); + + if (data.ToMemory().Length == 0) + { + throw new ArgumentException("Failed to create MessagesPageToken from provided pageToken.", nameof(pageToken)); + } + + Utf8JsonReader reader = new(data); + + string threadId = null!; + int? limit = null; + string? order = null; + string? after = null; + string? before = null; + + reader.Read(); + + Debug.Assert(reader.TokenType == JsonTokenType.StartObject); + + while (reader.Read()) + { + if (reader.TokenType == JsonTokenType.EndObject) + { + break; + } + + Debug.Assert(reader.TokenType == JsonTokenType.PropertyName); + + string propertyName = reader.GetString()!; + + switch (propertyName) + { + case "threadId": + reader.Read(); + Debug.Assert(reader.TokenType == JsonTokenType.String); + threadId = reader.GetString()!; + break; + case "limit": + reader.Read(); + Debug.Assert(reader.TokenType == JsonTokenType.Number); + limit = reader.GetInt32(); + break; + case "order": + reader.Read(); + Debug.Assert(reader.TokenType == JsonTokenType.String); + order = reader.GetString(); + break; + case "after": + reader.Read(); + Debug.Assert(reader.TokenType == JsonTokenType.String); + after = reader.GetString(); + break; + case "before": + reader.Read(); + Debug.Assert(reader.TokenType == JsonTokenType.String); + before = reader.GetString(); + break; + default: + throw new JsonException($"Unrecognized property '{propertyName}'."); + } + } + + if (threadId is null) + { + throw new ArgumentException("Failed to create MessagesPageToken from provided pageToken.", nameof(pageToken)); + } + + return new(threadId, limit, order, after, before); + } + + public static MessagesPageToken FromOptions(string threadId, int? 
limit, string? order, string? after, string? before) + => new(threadId, limit, order, after, before); +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Assistants/Internal/Pagination/RunStepsPageEnumerator.cs b/.dotnet/src/Custom/Assistants/Internal/Pagination/RunStepsPageEnumerator.cs new file mode 100644 index 000000000..1245c4763 --- /dev/null +++ b/.dotnet/src/Custom/Assistants/Internal/Pagination/RunStepsPageEnumerator.cs @@ -0,0 +1,151 @@ +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Text.Json; +using System.Threading.Tasks; + +#nullable enable + +namespace OpenAI.Assistants; + +internal partial class RunStepsPageEnumerator : PageEnumerator +{ + private readonly ClientPipeline _pipeline; + private readonly Uri _endpoint; + + private readonly string _threadId; + private readonly string _runId; + + private readonly int? _limit; + private readonly string? _order; + private readonly string? _before; + private readonly RequestOptions _options; + + private string? _after; + + public virtual ClientPipeline Pipeline => _pipeline; + + public RunStepsPageEnumerator( + ClientPipeline pipeline, + Uri endpoint, + string threadId, string runId, + int? limit, string? order, string? after, string? 
before, + RequestOptions options) + { + _pipeline = pipeline; + _endpoint = endpoint; + + _threadId = threadId; + _runId = runId; + + _limit = limit; + _order = order; + _after = after; + _before = before; + _options = options; + } + + public override async Task GetFirstAsync() + => await GetRunStepsAsync(_threadId, _runId, _limit, _order, _after, _before, _options).ConfigureAwait(false); + + public override ClientResult GetFirst() + => GetRunSteps(_threadId, _runId, _limit, _order, _after, _before, _options); + + public override async Task GetNextAsync(ClientResult result) + { + PipelineResponse response = result.GetRawResponse(); + + using JsonDocument doc = JsonDocument.Parse(response.Content); + _after = doc.RootElement.GetProperty("last_id"u8).GetString()!; + + return await GetRunStepsAsync(_threadId, _runId, _limit, _order, _after, _before, _options).ConfigureAwait(false); + } + + public override ClientResult GetNext(ClientResult result) + { + PipelineResponse response = result.GetRawResponse(); + + using JsonDocument doc = JsonDocument.Parse(response.Content); + _after = doc.RootElement.GetProperty("last_id"u8).GetString()!; + + return GetRunSteps(_threadId, _runId, _limit, _order, _after, _before, _options); + } + + public override bool HasNext(ClientResult result) + { + PipelineResponse response = result.GetRawResponse(); + + using JsonDocument doc = JsonDocument.Parse(response.Content); + bool hasMore = doc.RootElement.GetProperty("has_more"u8).GetBoolean(); + + return hasMore; + } + + public override PageResult GetPageFromResult(ClientResult result) + { + PipelineResponse response = result.GetRawResponse(); + + InternalListRunStepsResponse list = ModelReaderWriter.Read(response.Content)!; + + RunStepsPageToken pageToken = RunStepsPageToken.FromOptions(_threadId, _runId, _limit, _order, _after, _before); + RunStepsPageToken? 
nextPageToken = pageToken.GetNextPageToken(list.HasMore, list.LastId); + + return PageResult.Create(list.Data, pageToken, nextPageToken, response); + } + + internal async virtual Task GetRunStepsAsync(string threadId, string runId, int? limit, string? order, string? after, string? before, RequestOptions? options) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + Argument.AssertNotNullOrEmpty(runId, nameof(runId)); + + using PipelineMessage message = CreateGetRunStepsRequest(threadId, runId, limit, order, after, before, options); + return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + internal virtual ClientResult GetRunSteps(string threadId, string runId, int? limit, string? order, string? after, string? before, RequestOptions? options) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + Argument.AssertNotNullOrEmpty(runId, nameof(runId)); + + using PipelineMessage message = CreateGetRunStepsRequest(threadId, runId, limit, order, after, before, options); + return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options)); + } + + private PipelineMessage CreateGetRunStepsRequest(string threadId, string runId, int? limit, string? order, string? after, string? before, RequestOptions? 
options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "GET"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/threads/", false); + uri.AppendPath(threadId, true); + uri.AppendPath("/runs/", false); + uri.AppendPath(runId, true); + uri.AppendPath("/steps", false); + if (limit != null) + { + uri.AppendQuery("limit", limit.Value, true); + } + if (order != null) + { + uri.AppendQuery("order", order, true); + } + if (after != null) + { + uri.AppendQuery("after", after, true); + } + if (before != null) + { + uri.AppendQuery("before", before, true); + } + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + message.Apply(options); + return message; + } + + private static PipelineMessageClassifier? _pipelineMessageClassifier200; + private static PipelineMessageClassifier PipelineMessageClassifier200 => _pipelineMessageClassifier200 ??= PipelineMessageClassifier.Create(stackalloc ushort[] { 200 }); +} diff --git a/.dotnet/src/Custom/Assistants/Internal/Pagination/RunStepsPageToken.cs b/.dotnet/src/Custom/Assistants/Internal/Pagination/RunStepsPageToken.cs new file mode 100644 index 000000000..2579e7442 --- /dev/null +++ b/.dotnet/src/Custom/Assistants/Internal/Pagination/RunStepsPageToken.cs @@ -0,0 +1,168 @@ +using System; +using System.ClientModel; +using System.Diagnostics; +using System.IO; +using System.Text.Json; + +#nullable enable + +namespace OpenAI.Assistants; + +internal class RunStepsPageToken : ContinuationToken +{ + protected RunStepsPageToken(string threadId, string runId, int? limit, string? order, string? after, string? before) + { + ThreadId = threadId; + RunId = runId; + + Limit = limit; + Order = order; + After = after; + Before = before; + } + + public string ThreadId { get; } + + public string RunId { get; } + + public int? Limit { get; } + + public string? 
Order { get; } + + public string? After { get; } + + public string? Before { get; } + + public override BinaryData ToBytes() + { + using MemoryStream stream = new(); + using Utf8JsonWriter writer = new(stream); + + writer.WriteStartObject(); + writer.WriteString("threadId", ThreadId); + writer.WriteString("runId", RunId); + + if (Limit.HasValue) + { + writer.WriteNumber("limit", Limit.Value); + } + + if (Order is not null) + { + writer.WriteString("order", Order); + } + + if (After is not null) + { + writer.WriteString("after", After); + } + + if (Before is not null) + { + writer.WriteString("before", Before); + } + + writer.WriteEndObject(); + + writer.Flush(); + stream.Position = 0; + + return BinaryData.FromStream(stream); + } + + public RunStepsPageToken? GetNextPageToken(bool hasMore, string? lastId) + { + if (!hasMore || lastId is null) + { + return null; + } + + return new RunStepsPageToken(ThreadId, RunId, Limit, Order, lastId, Before); + } + + public static RunStepsPageToken FromToken(ContinuationToken pageToken) + { + if (pageToken is RunStepsPageToken token) + { + return token; + } + + BinaryData data = pageToken.ToBytes(); + + if (data.ToMemory().Length == 0) + { + throw new ArgumentException("Failed to create RunStepsPageToken from provided pageToken.", nameof(pageToken)); + } + + Utf8JsonReader reader = new(data); + + string threadId = null!; + string runId = null!; + int? limit = null; + string? order = null; + string? after = null; + string?
before = null; + + reader.Read(); + + Debug.Assert(reader.TokenType == JsonTokenType.StartObject); + + while (reader.Read()) + { + if (reader.TokenType == JsonTokenType.EndObject) + { + break; + } + + Debug.Assert(reader.TokenType == JsonTokenType.PropertyName); + + string propertyName = reader.GetString()!; + + switch (propertyName) + { + case "threadId": + reader.Read(); + Debug.Assert(reader.TokenType == JsonTokenType.String); + threadId = reader.GetString()!; + break; + case "runId": + reader.Read(); + Debug.Assert(reader.TokenType == JsonTokenType.String); + runId = reader.GetString()!; + break; + case "limit": + reader.Read(); + Debug.Assert(reader.TokenType == JsonTokenType.Number); + limit = reader.GetInt32(); + break; + case "order": + reader.Read(); + Debug.Assert(reader.TokenType == JsonTokenType.String); + order = reader.GetString(); + break; + case "after": + reader.Read(); + Debug.Assert(reader.TokenType == JsonTokenType.String); + after = reader.GetString(); + break; + case "before": + reader.Read(); + Debug.Assert(reader.TokenType == JsonTokenType.String); + before = reader.GetString(); + break; + default: + throw new JsonException($"Unrecognized property '{propertyName}'."); + } + } + + if (threadId is null || runId is null) + { + throw new ArgumentException("Failed to create RunStepsPageToken from provided pageToken.", nameof(pageToken)); + } + + return new(threadId, runId, limit, order, after, before); + } + + public static RunStepsPageToken FromOptions(string threadId, string runId, int? limit, string? order, string? after, string? 
before) + => new RunStepsPageToken(threadId, runId, limit, order, after, before); +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Assistants/Internal/Pagination/RunsPageEnumerator.cs b/.dotnet/src/Custom/Assistants/Internal/Pagination/RunsPageEnumerator.cs new file mode 100644 index 000000000..879f389be --- /dev/null +++ b/.dotnet/src/Custom/Assistants/Internal/Pagination/RunsPageEnumerator.cs @@ -0,0 +1,143 @@ +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Text.Json; +using System.Threading.Tasks; + +#nullable enable + +namespace OpenAI.Assistants; + +internal partial class RunsPageEnumerator : PageEnumerator +{ + private readonly ClientPipeline _pipeline; + private readonly Uri _endpoint; + + private readonly string _threadId; + private readonly int? _limit; + private readonly string _order; + + private string _after; + + private readonly string _before; + private readonly RequestOptions _options; + + public virtual ClientPipeline Pipeline => _pipeline; + + public RunsPageEnumerator( + ClientPipeline pipeline, + Uri endpoint, + string threadId, int? 
limit, string order, string after, string before, + RequestOptions options) + { + _pipeline = pipeline; + _endpoint = endpoint; + + _threadId = threadId; + _limit = limit; + _order = order; + _after = after; + _before = before; + _options = options; + } + + public override async Task GetFirstAsync() + => await GetRunsAsync(_threadId, _limit, _order, _after, _before, _options).ConfigureAwait(false); + + public override ClientResult GetFirst() + => GetRuns(_threadId, _limit, _order, _after, _before, _options); + + public override async Task GetNextAsync(ClientResult result) + { + PipelineResponse response = result.GetRawResponse(); + + using JsonDocument doc = JsonDocument.Parse(response.Content); + _after = doc.RootElement.GetProperty("last_id"u8).GetString()!; + + return await GetRunsAsync(_threadId, _limit, _order, _after, _before, _options).ConfigureAwait(false); + } + + public override ClientResult GetNext(ClientResult result) + { + PipelineResponse response = result.GetRawResponse(); + + using JsonDocument doc = JsonDocument.Parse(response.Content); + _after = doc.RootElement.GetProperty("last_id"u8).GetString()!; + + return GetRuns(_threadId, _limit, _order, _after, _before, _options); + } + + public override bool HasNext(ClientResult result) + { + PipelineResponse response = result.GetRawResponse(); + + using JsonDocument doc = JsonDocument.Parse(response.Content); + bool hasMore = doc.RootElement.GetProperty("has_more"u8).GetBoolean(); + + return hasMore; + } + + public override PageResult GetPageFromResult(ClientResult result) + { + PipelineResponse response = result.GetRawResponse(); + + InternalListRunsResponse list = ModelReaderWriter.Read(response.Content)!; + + RunsPageToken pageToken = RunsPageToken.FromOptions(_threadId, _limit, _order, _after, _before); + RunsPageToken? 
nextPageToken = pageToken.GetNextPageToken(list.HasMore, list.LastId); + + return PageResult.Create(list.Data, pageToken, nextPageToken, response); + } + + internal async virtual Task GetRunsAsync(string threadId, int? limit, string order, string after, string before, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + + using PipelineMessage message = CreateGetRunsRequest(threadId, limit, order, after, before, options); + return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + internal virtual ClientResult GetRuns(string threadId, int? limit, string order, string after, string before, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + + using PipelineMessage message = CreateGetRunsRequest(threadId, limit, order, after, before, options); + return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options)); + } + + private PipelineMessage CreateGetRunsRequest(string threadId, int? limit, string order, string after, string before, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "GET"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/threads/", false); + uri.AppendPath(threadId, true); + uri.AppendPath("/runs", false); + if (limit != null) + { + uri.AppendQuery("limit", limit.Value, true); + } + if (order != null) + { + uri.AppendQuery("order", order, true); + } + if (after != null) + { + uri.AppendQuery("after", after, true); + } + if (before != null) + { + uri.AppendQuery("before", before, true); + } + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + message.Apply(options); + return message; + } + + private static PipelineMessageClassifier? 
_pipelineMessageClassifier200; + private static PipelineMessageClassifier PipelineMessageClassifier200 => _pipelineMessageClassifier200 ??= PipelineMessageClassifier.Create(stackalloc ushort[] { 200 }); +} diff --git a/.dotnet/src/Custom/Assistants/Internal/Pagination/RunsPageToken.cs b/.dotnet/src/Custom/Assistants/Internal/Pagination/RunsPageToken.cs new file mode 100644 index 000000000..28b27f475 --- /dev/null +++ b/.dotnet/src/Custom/Assistants/Internal/Pagination/RunsPageToken.cs @@ -0,0 +1,158 @@ +using System; +using System.ClientModel; +using System.Diagnostics; +using System.IO; +using System.Text.Json; + +#nullable enable + +namespace OpenAI.Assistants; + +internal class RunsPageToken : ContinuationToken +{ + protected RunsPageToken(string threadId, int? limit, string? order, string? after, string? before) + { + ThreadId = threadId; + + Limit = limit; + Order = order; + After = after; + Before = before; + } + + public string ThreadId { get; } + + public int? Limit { get; } + + public string? Order { get; } + + public string? After { get; } + + public string? Before { get; } + + public override BinaryData ToBytes() + { + using MemoryStream stream = new(); + using Utf8JsonWriter writer = new(stream); + + writer.WriteStartObject(); + writer.WriteString("threadId", ThreadId); + + if (Limit.HasValue) + { + writer.WriteNumber("limit", Limit.Value); + } + + if (Order is not null) + { + writer.WriteString("order", Order); + } + + if (After is not null) + { + writer.WriteString("after", After); + } + + if (Before is not null) + { + writer.WriteString("before", Before); + } + + writer.WriteEndObject(); + + writer.Flush(); + stream.Position = 0; + + return BinaryData.FromStream(stream); + } + + public RunsPageToken? GetNextPageToken(bool hasMore, string? 
lastId) + { + if (!hasMore || lastId is null) + { + return null; + } + + return new RunsPageToken(ThreadId, Limit, Order, lastId, Before); + } + + public static RunsPageToken FromToken(ContinuationToken pageToken) + { + if (pageToken is RunsPageToken token) + { + return token; + } + + BinaryData data = pageToken.ToBytes(); + + if (data.ToMemory().Length == 0) + { + throw new ArgumentException("Failed to create RunsPageToken from provided pageToken.", nameof(pageToken)); + } + + Utf8JsonReader reader = new(data); + + string threadId = null!; + int? limit = null; + string? order = null; + string? after = null; + string? before = null; + + reader.Read(); + + Debug.Assert(reader.TokenType == JsonTokenType.StartObject); + + while (reader.Read()) + { + if (reader.TokenType == JsonTokenType.EndObject) + { + break; + } + + Debug.Assert(reader.TokenType == JsonTokenType.PropertyName); + + string propertyName = reader.GetString()!; + + switch (propertyName) + { + case "threadId": + reader.Read(); + Debug.Assert(reader.TokenType == JsonTokenType.String); + threadId = reader.GetString()!; + break; + case "limit": + reader.Read(); + Debug.Assert(reader.TokenType == JsonTokenType.Number); + limit = reader.GetInt32(); + break; + case "order": + reader.Read(); + Debug.Assert(reader.TokenType == JsonTokenType.String); + order = reader.GetString(); + break; + case "after": + reader.Read(); + Debug.Assert(reader.TokenType == JsonTokenType.String); + after = reader.GetString(); + break; + case "before": + reader.Read(); + Debug.Assert(reader.TokenType == JsonTokenType.String); + before = reader.GetString(); + break; + default: + throw new JsonException($"Unrecognized property '{propertyName}'."); + } + } + + if (threadId is null) + { + throw new ArgumentException("Failed to create RunsPageToken from provided pageToken.", nameof(pageToken)); + } + + return new(threadId, limit, order, after, before); + } + + public static RunsPageToken FromOptions(string threadId, int? limit, string? 
order, string? after, string? before) + => new RunsPageToken(threadId, limit, order, after, before); +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Assistants/Internal/UnknownAssistantToolDefinition.Serialization.cs b/.dotnet/src/Custom/Assistants/Internal/UnknownAssistantToolDefinition.Serialization.cs new file mode 100644 index 000000000..3a80df55e --- /dev/null +++ b/.dotnet/src/Custom/Assistants/Internal/UnknownAssistantToolDefinition.Serialization.cs @@ -0,0 +1,26 @@ +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants; + +[CodeGenSuppress("global::System.ClientModel.Primitives.IJsonModel.Write", typeof(Utf8JsonWriter), typeof(ModelReaderWriterOptions))] +internal partial class UnknownAssistantToolDefinition : IJsonModel +{ + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + => CustomSerializationHelpers.SerializeInstance(this, SerializeUnknownAssistantToolDefinition, writer, options); + + internal static void SerializeUnknownAssistantToolDefinition(UnknownAssistantToolDefinition instance, Utf8JsonWriter writer, ModelReaderWriterOptions options) + => instance.WriteCore(writer, options); + + protected override void WriteCore(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + writer.WriteStartObject(); + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + writer.WriteSerializedAdditionalRawData(SerializedAdditionalRawData, options); + writer.WriteEndObject(); + } +} diff --git a/.dotnet/src/Custom/Assistants/Internal/UnknownAssistantToolDefinition.cs b/.dotnet/src/Custom/Assistants/Internal/UnknownAssistantToolDefinition.cs new file mode 100644 index 000000000..d9c826db6 --- /dev/null +++ b/.dotnet/src/Custom/Assistants/Internal/UnknownAssistantToolDefinition.cs @@ -0,0 +1,6 @@ +namespace OpenAI.Assistants; + +[CodeGenModel("UnknownAssistantToolDefinition")] +internal 
partial class UnknownAssistantToolDefinition +{ +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Assistants/Internal/UnknownMessageContentTextObjectAnnotation.cs b/.dotnet/src/Custom/Assistants/Internal/UnknownMessageContentTextObjectAnnotation.cs new file mode 100644 index 000000000..bf2ac05a0 --- /dev/null +++ b/.dotnet/src/Custom/Assistants/Internal/UnknownMessageContentTextObjectAnnotation.cs @@ -0,0 +1,6 @@ +namespace OpenAI.Assistants; + +[CodeGenModel("UnknownMessageContentTextObjectAnnotation")] +internal partial class UnknownMessageContentTextObjectAnnotation +{ +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Assistants/Internal/UnknownMessageDeltaContent.cs b/.dotnet/src/Custom/Assistants/Internal/UnknownMessageDeltaContent.cs new file mode 100644 index 000000000..3fcd7656c --- /dev/null +++ b/.dotnet/src/Custom/Assistants/Internal/UnknownMessageDeltaContent.cs @@ -0,0 +1,6 @@ +namespace OpenAI.Assistants; + +[CodeGenModel("UnknownMessageDeltaContent")] +internal partial class UnknownMessageDeltaContent +{ +} diff --git a/.dotnet/src/Custom/Assistants/Internal/UnknownMessageDeltaTextContentAnnotation.cs b/.dotnet/src/Custom/Assistants/Internal/UnknownMessageDeltaTextContentAnnotation.cs new file mode 100644 index 000000000..53016f480 --- /dev/null +++ b/.dotnet/src/Custom/Assistants/Internal/UnknownMessageDeltaTextContentAnnotation.cs @@ -0,0 +1,6 @@ +namespace OpenAI.Assistants; + +[CodeGenModel("UnknownMessageDeltaTextContentAnnotation")] +internal partial class UnknownMessageDeltaTextContentAnnotation +{ +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Assistants/Internal/UnknownRunStepDeltaStepDetails.cs b/.dotnet/src/Custom/Assistants/Internal/UnknownRunStepDeltaStepDetails.cs new file mode 100644 index 000000000..1d309757d --- /dev/null +++ b/.dotnet/src/Custom/Assistants/Internal/UnknownRunStepDeltaStepDetails.cs @@ -0,0 +1,6 @@ +namespace OpenAI.Assistants; + 
+[CodeGenModel("UnknownRunStepDeltaStepDetails")] +internal partial class UnknownRunStepDeltaStepDetails +{ +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Assistants/Internal/UnknownRunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreterOutputsObject.cs b/.dotnet/src/Custom/Assistants/Internal/UnknownRunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreterOutputsObject.cs new file mode 100644 index 000000000..30579f8f8 --- /dev/null +++ b/.dotnet/src/Custom/Assistants/Internal/UnknownRunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreterOutputsObject.cs @@ -0,0 +1,6 @@ +namespace OpenAI.Assistants; + +[CodeGenModel("UnknownRunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreterOutputsObject")] +internal partial class UnknownRunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreterOutputsObject +{ +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Assistants/Internal/UnknownRunStepDeltaStepDetailsToolCallsObjectToolCallsObject.cs b/.dotnet/src/Custom/Assistants/Internal/UnknownRunStepDeltaStepDetailsToolCallsObjectToolCallsObject.cs new file mode 100644 index 000000000..55ffc905e --- /dev/null +++ b/.dotnet/src/Custom/Assistants/Internal/UnknownRunStepDeltaStepDetailsToolCallsObjectToolCallsObject.cs @@ -0,0 +1,6 @@ +namespace OpenAI.Assistants; + +[CodeGenModel("UnknownRunStepDeltaStepDetailsToolCallsObjectToolCallsObject")] +internal partial class UnknownRunStepDeltaStepDetailsToolCallsObjectToolCallsObject +{ +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Assistants/Internal/UnknownRunStepDetails.cs b/.dotnet/src/Custom/Assistants/Internal/UnknownRunStepDetails.cs new file mode 100644 index 000000000..2fd75b988 --- /dev/null +++ b/.dotnet/src/Custom/Assistants/Internal/UnknownRunStepDetails.cs @@ -0,0 +1,6 @@ +namespace OpenAI.Assistants; + +[CodeGenModel("UnknownRunStepDetails")] +internal partial class UnknownRunStepDetails +{ +} \ No newline at end of file diff --git 
a/.dotnet/src/Custom/Assistants/Internal/UnknownRunStepDetailsToolCallsCodeObjectCodeInterpreterOutputsObject.cs b/.dotnet/src/Custom/Assistants/Internal/UnknownRunStepDetailsToolCallsCodeObjectCodeInterpreterOutputsObject.cs new file mode 100644 index 000000000..d6fbc0575 --- /dev/null +++ b/.dotnet/src/Custom/Assistants/Internal/UnknownRunStepDetailsToolCallsCodeObjectCodeInterpreterOutputsObject.cs @@ -0,0 +1,6 @@ +namespace OpenAI.Assistants; + +[CodeGenModel("UnknownRunStepDetailsToolCallsCodeObjectCodeInterpreterOutputsObject")] +internal partial class UnknownRunStepDetailsToolCallsCodeObjectCodeInterpreterOutputsObject +{ +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Assistants/Internal/UnknownRunStepDetailsToolCallsObjectToolCallsObject.cs b/.dotnet/src/Custom/Assistants/Internal/UnknownRunStepDetailsToolCallsObjectToolCallsObject.cs new file mode 100644 index 000000000..0540c25da --- /dev/null +++ b/.dotnet/src/Custom/Assistants/Internal/UnknownRunStepDetailsToolCallsObjectToolCallsObject.cs @@ -0,0 +1,6 @@ +namespace OpenAI.Assistants; + +[CodeGenModel("UnknownRunStepDetailsToolCallsObjectToolCallsObject")] +internal partial class UnknownRunStepDetailsToolCallsObjectToolCallsObject +{ +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Assistants/Internal/UnknownRunStepObjectStepDetails.cs b/.dotnet/src/Custom/Assistants/Internal/UnknownRunStepObjectStepDetails.cs new file mode 100644 index 000000000..d18908820 --- /dev/null +++ b/.dotnet/src/Custom/Assistants/Internal/UnknownRunStepObjectStepDetails.cs @@ -0,0 +1,6 @@ +namespace OpenAI.Assistants; + +[CodeGenModel("UnknownRunStepObjectStepDetails")] +internal partial class UnknownRunStepObjectStepDetails +{ +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Assistants/MessageCollectionOptions.cs b/.dotnet/src/Custom/Assistants/MessageCollectionOptions.cs new file mode 100644 index 000000000..ac4bf2215 --- /dev/null +++ 
b/.dotnet/src/Custom/Assistants/MessageCollectionOptions.cs @@ -0,0 +1,33 @@ +namespace OpenAI.Assistants; + +/// +/// Represents additional options available when requesting a collection of instances. +/// +public class MessageCollectionOptions +{ + /// + /// Creates a new instance of . + /// + public MessageCollectionOptions() { } + + /// + /// The order that results should appear in the list according to + /// their created_at timestamp. + /// + public ListOrder? Order { get; set; } + + /// + /// The number of values to return in a page result. + /// + public int? PageSize { get; set; } + + /// + /// The id of the item preceding the first item in the collection. + /// + public string AfterId { get; set; } + + /// + /// The id of the item following the last item in the collection. + /// + public string BeforeId { get; set; } +} diff --git a/.dotnet/src/Custom/Assistants/MessageContent.Serialization.cs b/.dotnet/src/Custom/Assistants/MessageContent.Serialization.cs new file mode 100644 index 000000000..77d646504 --- /dev/null +++ b/.dotnet/src/Custom/Assistants/MessageContent.Serialization.cs @@ -0,0 +1,41 @@ +using System.ClientModel.Primitives; +using System.Text.Json; + +namespace OpenAI.Assistants; + +[CodeGenSuppress("global::System.ClientModel.Primitives.IJsonModel.Write", typeof(Utf8JsonWriter), typeof(ModelReaderWriterOptions))] +public abstract partial class MessageContent : IJsonModel<MessageContent> + { + void IJsonModel<MessageContent>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + => CustomSerializationHelpers.SerializeInstance(this, WriteCore, writer, options); + + internal static void WriteCore(MessageContent instance, Utf8JsonWriter writer, ModelReaderWriterOptions options) + => instance.WriteCore(writer, options); + + protected abstract void WriteCore(Utf8JsonWriter writer, ModelReaderWriterOptions options); + + internal static MessageContent DeserializeMessageContent(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= 
ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + if (element.TryGetProperty("type", out JsonElement discriminator)) + { + switch (discriminator.GetString()) + { + case "image_file": return InternalMessageImageFileContent.DeserializeInternalMessageImageFileContent(element, options); + case "image_url": return InternalMessageImageUrlContent.DeserializeInternalMessageImageUrlContent(element, options); + case "text": return InternalResponseMessageTextContent.DeserializeInternalResponseMessageTextContent(element, options); + default: return null; + } + } + + return null; + } +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Assistants/MessageContent.cs b/.dotnet/src/Custom/Assistants/MessageContent.cs new file mode 100644 index 000000000..8043d79cb --- /dev/null +++ b/.dotnet/src/Custom/Assistants/MessageContent.cs @@ -0,0 +1,66 @@ +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants; + +[CodeGenModel("MessageContent")] +public abstract partial class MessageContent +{ + /// + /// Creates a new instance that refers to an uploaded image with a known file ID. + /// + /// + /// + /// + public static MessageContent FromImageFileId( + string imageFileId, + MessageImageDetail? detail = null) + => new InternalMessageImageFileContent(imageFileId, detail); + + /// + /// Creates a new instance of that refers to an image at a model-accessible + /// internet location. + /// + /// + /// + /// + public static MessageContent FromImageUrl(Uri imageUri, MessageImageDetail? detail = null) + => new InternalMessageImageUrlContent(imageUri, detail); + + /// + /// Creates a new instance that encapsulates a simple string input. 
+ /// + /// + /// + public static MessageContent FromText(string text) + => new InternalRequestMessageTextContent(text); + + /// + public Uri ImageUrl => AsInternalImageUrl?.InternalUrl; + /// + public string ImageFileId => AsInternalImageFile?.InternalFileId; + /// + public MessageImageDetail? ImageDetail => AsInternalImageFile?.InternalDetail ?? AsInternalImageUrl?.InternalDetail; + /// + public string Text => AsInternalRequestText?.InternalText ?? AsInternalResponseText?.InternalText; + /// + public IReadOnlyList TextAnnotations => AsInternalResponseText?.InternalAnnotations ?? []; + public string Refusal => AsRefusal?.InternalRefusal; + + private InternalMessageImageFileContent AsInternalImageFile => this as InternalMessageImageFileContent; + private InternalMessageImageUrlContent AsInternalImageUrl => this as InternalMessageImageUrlContent; + private InternalResponseMessageTextContent AsInternalResponseText => this as InternalResponseMessageTextContent; + private InternalRequestMessageTextContent AsInternalRequestText => this as InternalRequestMessageTextContent; + private InternalMessageRefusalContent AsRefusal => this as InternalMessageRefusalContent; + + /// + /// The implicit conversion operator that infers an equivalent + /// instance from a plain . + /// + /// The text for the message content. + public static implicit operator MessageContent(string value) => FromText(value); + + /// Creates a new instance of for mocking. 
+ protected MessageContent() + { } +} diff --git a/.dotnet/src/Custom/Assistants/MessageCreationAttachment.cs b/.dotnet/src/Custom/Assistants/MessageCreationAttachment.cs new file mode 100644 index 000000000..46827002a --- /dev/null +++ b/.dotnet/src/Custom/Assistants/MessageCreationAttachment.cs @@ -0,0 +1,45 @@ +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants; + +[CodeGenModel("CreateMessageRequestAttachment")] +[CodeGenSerialization(nameof(Tools), "tools", SerializationValueHook = nameof(SerializeTools), DeserializationValueHook = nameof(DeserializeTools))] +public partial class MessageCreationAttachment +{ + /// + /// The tools to which the attachment applies. + /// + /// + /// These are ToolDefinition instances that can be checked via downcast, e.g.: + /// + /// if (message.Attachments[0].Tools[0] is CodeInterpreterToolDefinition) + /// { + /// // The attachment applies to the code interpreter tool + /// } + /// + /// + [CodeGenMember("Tools")] + public IReadOnlyList<ToolDefinition> Tools { get; } = new ChangeTrackingList<ToolDefinition>(); + + private void SerializeTools(Utf8JsonWriter writer, ModelReaderWriterOptions options) + => writer.WriteObjectValue(Tools, options); + + private static void DeserializeTools(JsonProperty property, ref IReadOnlyList<ToolDefinition> tools) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + tools = null; + } + else + { + List<ToolDefinition> deserializedTools = []; + foreach (JsonElement toolElement in property.Value.EnumerateArray()) + { + deserializedTools.Add(ToolDefinition.DeserializeToolDefinition(toolElement)); + } + tools = deserializedTools; + } + } +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Assistants/MessageCreationOptions.Serialization.cs b/.dotnet/src/Custom/Assistants/MessageCreationOptions.Serialization.cs new file mode 100644 index 000000000..abbbc25d8 --- /dev/null +++ b/.dotnet/src/Custom/Assistants/MessageCreationOptions.Serialization.cs @@ -0,0 +1,25 @@ +using System.ClientModel.Primitives; +using 
System.Text.Json; + +namespace OpenAI.Assistants +{ + public partial class MessageCreationOptions : IJsonModel + { + private void SerializeContent(Utf8JsonWriter writer, ModelReaderWriterOptions options = null) + { + if (Content.Count == 1 && Content[0] is InternalRequestMessageTextContent textContent) + { + writer.WriteStringValue(textContent.Text); + } + else + { + writer.WriteStartArray(); + foreach (var item in Content) + { + writer.WriteObjectValue(item, options); + } + writer.WriteEndArray(); + } + } + } +} diff --git a/.dotnet/src/Custom/Assistants/MessageCreationOptions.cs b/.dotnet/src/Custom/Assistants/MessageCreationOptions.cs new file mode 100644 index 000000000..4e0e1bd89 --- /dev/null +++ b/.dotnet/src/Custom/Assistants/MessageCreationOptions.cs @@ -0,0 +1,41 @@ +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants; + +/// +/// Represents additional options available when creating a new . +/// +[CodeGenModel("CreateMessageRequest")] +[CodeGenSuppress("MessageCreationOptions", typeof(MessageRole), typeof(IEnumerable))] +[CodeGenSerialization(nameof(Content), SerializationValueHook=nameof(SerializeContent))] +public partial class MessageCreationOptions +{ + // CUSTOM: role is hidden, as this required property is promoted to a method parameter + + [CodeGenMember("Role")] + internal MessageRole Role { get; set; } + + // CUSTOM: content is hidden to allow the promotion of required request information into top-level + // method signatures. + + [CodeGenMember("Content")] + internal IList Content { get; } + + /// + /// Creates a new instance of . + /// + public MessageCreationOptions() + : this( + MessageRole.User, + new ChangeTrackingList(), + new ChangeTrackingList(), + new ChangeTrackingDictionary(), + new ChangeTrackingDictionary()) + {} + + internal MessageCreationOptions(IEnumerable content) : this() + { + Content = [.. 
content]; + } +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Assistants/MessageImageDetail.Serialization.cs b/.dotnet/src/Custom/Assistants/MessageImageDetail.Serialization.cs new file mode 100644 index 000000000..1aa776a84 --- /dev/null +++ b/.dotnet/src/Custom/Assistants/MessageImageDetail.Serialization.cs @@ -0,0 +1,21 @@ +using System; + +namespace OpenAI.Assistants; +internal static partial class MessageImageDetailExtensions +{ + public static string ToSerialString(this MessageImageDetail value) => value switch + { + MessageImageDetail.Auto => "auto", + MessageImageDetail.Low => "low", + MessageImageDetail.High => "high", + _ => throw new ArgumentOutOfRangeException(nameof(value), value, $"Unknown MessageImageDetail value: {value}") + }; + + public static MessageImageDetail ToMessageImageDetail(this string value) + { + if (StringComparer.OrdinalIgnoreCase.Equals(value, "auto")) return MessageImageDetail.Auto; + if (StringComparer.OrdinalIgnoreCase.Equals(value, "low")) return MessageImageDetail.Low; + if (StringComparer.OrdinalIgnoreCase.Equals(value, "high")) return MessageImageDetail.High; + throw new ArgumentOutOfRangeException(nameof(value), value, $"Unknown MessageImageDetail value: {value}"); + } +} diff --git a/.dotnet/src/Custom/Assistants/MessageImageDetail.cs b/.dotnet/src/Custom/Assistants/MessageImageDetail.cs new file mode 100644 index 000000000..17cc7827e --- /dev/null +++ b/.dotnet/src/Custom/Assistants/MessageImageDetail.cs @@ -0,0 +1,17 @@ +namespace OpenAI.Assistants; + +/// +/// The available detail settings to use when processing an image. +/// These settings balance token consumption and the resolution of evaluation performed. +/// +public enum MessageImageDetail +{ + /// Default. Allows the model to automatically select detail. + Auto, + + /// Reduced detail that uses fewer tokens than . + Low, + + /// Increased detail that uses more tokens than . 
+ High, +} diff --git a/.dotnet/src/Custom/Assistants/MessageModificationOptions.cs b/.dotnet/src/Custom/Assistants/MessageModificationOptions.cs new file mode 100644 index 000000000..3c43bea30 --- /dev/null +++ b/.dotnet/src/Custom/Assistants/MessageModificationOptions.cs @@ -0,0 +1,9 @@ +namespace OpenAI.Assistants; + +/// +/// Represents additional options available when modifying an existing . +/// +[CodeGenModel("ModifyMessageRequest")] +public partial class MessageModificationOptions +{ +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Assistants/MessageRole.cs b/.dotnet/src/Custom/Assistants/MessageRole.cs new file mode 100644 index 000000000..6aae4c418 --- /dev/null +++ b/.dotnet/src/Custom/Assistants/MessageRole.cs @@ -0,0 +1,17 @@ +namespace OpenAI.Assistants; + +[CodeGenModel("CreateMessageRequestRole")] +public enum MessageRole +{ + /// + /// Indicates the message is sent by an actual user. + /// + [CodeGenMember("User")] + User, + + /// + /// Indicates the message was generated by the assistant. + /// + [CodeGenMember("Assistant")] + Assistant, +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Assistants/RequiredAction.cs b/.dotnet/src/Custom/Assistants/RequiredAction.cs new file mode 100644 index 000000000..3de39cd43 --- /dev/null +++ b/.dotnet/src/Custom/Assistants/RequiredAction.cs @@ -0,0 +1,26 @@ +namespace OpenAI.Assistants; + +/// +/// An abstract, base representation for an action that an Assistants API run requires outputs +/// from in order to continue. +/// +/// +/// is the abstract base type for all required actions. 
Its +/// concrete type can be one of: +/// +/// +/// +public abstract partial class RequiredAction +{ + /// + public string FunctionName => AsFunction?.InternalName; + + /// + public string FunctionArguments => AsFunction?.InternalArguments; + + /// + public string ToolCallId => AsFunction?.Id; + + private InternalRequiredFunctionToolCall AsFunction => this as InternalRequiredFunctionToolCall; +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Assistants/RunCollectionOptions.cs b/.dotnet/src/Custom/Assistants/RunCollectionOptions.cs new file mode 100644 index 000000000..f64482328 --- /dev/null +++ b/.dotnet/src/Custom/Assistants/RunCollectionOptions.cs @@ -0,0 +1,33 @@ +namespace OpenAI.Assistants; + +/// +/// Represents additional options available when requesting a collection of instances. +/// +public class RunCollectionOptions +{ + /// + /// Creates a new instance of . + /// + public RunCollectionOptions() { } + + /// + /// The order that results should appear in the list according to + /// their created_at timestamp. + /// + public ListOrder? Order { get; set; } + + /// + /// The number of values to return in a page result. + /// + public int? PageSize { get; set; } + + /// + /// The id of the item preceding the first item in the collection. + /// + public string AfterId { get; set; } + + /// + /// The id of the item following the last item in the collection. + /// + public string BeforeId { get; set; } +} diff --git a/.dotnet/src/Custom/Assistants/RunCreationOptions.cs b/.dotnet/src/Custom/Assistants/RunCreationOptions.cs new file mode 100644 index 000000000..bcc45ee2f --- /dev/null +++ b/.dotnet/src/Custom/Assistants/RunCreationOptions.cs @@ -0,0 +1,138 @@ +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Linq; +using System.Text.Json; + +namespace OpenAI.Assistants; + +/// +/// Represents additional options available when creating a new . 
+/// +[CodeGenModel("CreateRunRequest")] +[CodeGenSuppress("RunCreationOptions", typeof(string))] +[CodeGenSerialization(nameof(ToolConstraint), "tool_choice", SerializationValueHook = nameof(SerializeToolConstraint))] +public partial class RunCreationOptions +{ + // CUSTOM: assistant_id/stream visibility hidden so that they can be promoted to required method parameters + [CodeGenMember("AssistantId")] + internal string AssistantId { get; set; } + + [CodeGenMember("Stream")] + internal bool? Stream { get; set; } + + /// + [CodeGenMember("ResponseFormat")] + public AssistantResponseFormat ResponseFormat { get; set; } + + /// + /// A run-specific model name that will override the assistant's defined model. If not provided, the assistant's + /// selection will be used. + /// + [CodeGenMember("Model")] + public string ModelOverride { get; set; } + + /// + /// A run specific replacement for the assistant's default instructions that will override the assistant-level + /// instructions. If not specified, the assistant's instructions will be used. + /// + [CodeGenMember("Instructions")] + public string InstructionsOverride { get; set; } + + /// + /// Run-specific additional instructions that will be appended to the assistant-level instructions solely for this + /// run. Unlike , the assistant's instructions are preserved and these additional + /// instructions are concatenated. + /// + [CodeGenMember("AdditionalInstructions")] + public string AdditionalInstructions { get; set; } + + /// Adds additional messages to the thread before creating the run. 
+ public IList<ThreadInitializationMessage> AdditionalMessages { get; } = new ChangeTrackingList<ThreadInitializationMessage>(); + + [CodeGenMember("AdditionalMessages")] + internal IList<MessageCreationOptions> InternalMessages + { + get => AdditionalMessages.Select(initializationMessage => initializationMessage as MessageCreationOptions).ToList(); + private set + { + // Note: this path is exclusively used in a test or deserialization case; here, we'll convert the + // underlying wire-friendly representation into the initialization message abstraction. + + AdditionalMessages.Clear(); + foreach (MessageCreationOptions baseMessageOptions in value) + { + AdditionalMessages.Add(new ThreadInitializationMessage(baseMessageOptions)); + } + } + } + + /// + /// Whether to enable parallel function calling during tool use. + /// + /// + /// Assumed true if not otherwise specified. + /// + [CodeGenMember("ParallelToolCalls")] + public bool? ParallelToolCallsEnabled { get; set; } + + /// + /// A run-specific collection of tool definitions that will override the assistant-level defaults. If not provided, + /// the assistant's defined tools will be used. Available tools include: + /// + /// + /// + /// code_interpreter - + /// - works with data, math, and computer code + /// + /// + /// file_search - + /// - dynamically enriches a Run's context with content from vector stores + /// + /// + /// function - + /// - enables caller-provided custom functions for actions and enrichment + /// + /// + /// + /// + [CodeGenMember("Tools")] + public IList<ToolDefinition> ToolsOverride { get; } = new ChangeTrackingList<ToolDefinition>(); + + /// Set of 16 key-value pairs that can be attached to an object. This can be useful for storing additional information about the object in a structured format. Keys can be a maximum of 64 characters long and values can be a maximum of 512 characters long. + public IDictionary<string, string> Metadata { get; } = new ChangeTrackingDictionary<string, string>(); + + /// What sampling temperature to use, between 0 and 2. 
Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic. + public float? Temperature { get; set; } + + /// + /// An alternative to sampling with temperature, called nucleus sampling, where the model considers the results of the tokens with top_p probability mass. So 0.1 means only the tokens comprising the top 10% probability mass are considered. + /// + /// We generally recommend altering this or temperature but not both. + /// + [CodeGenMember("TopP")] + public float? NucleusSamplingFactor { get; set; } + + /// The maximum number of prompt tokens that may be used over the course of the run. The run will make a best effort to use only the number of prompt tokens specified, across multiple turns of the run. If the run exceeds the number of prompt tokens specified, the run will end with status `incomplete`. See `incomplete_details` for more info. + public int? MaxPromptTokens { get; set; } + + /// The maximum number of completion tokens that may be used over the course of the run. The run will make a best effort to use only the number of completion tokens specified, across multiple turns of the run. If the run exceeds the number of completion tokens specified, the run will end with status `incomplete`. See `incomplete_details` for more info. + public int? MaxCompletionTokens { get; set; } + + /// Gets or sets the truncation strategy. + public RunTruncationStrategy TruncationStrategy { get; set; } + + /// + /// + /// + [CodeGenMember("ToolChoice")] + public ToolConstraint ToolConstraint { get; set; } + + /// + /// Creates a new instance of . 
+ /// + public RunCreationOptions() + { } + + private void SerializeToolConstraint(Utf8JsonWriter writer, ModelReaderWriterOptions options) + => writer.WriteObjectValue(ToolConstraint, options); +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Assistants/RunModificationOptions.cs b/.dotnet/src/Custom/Assistants/RunModificationOptions.cs new file mode 100644 index 000000000..561074b99 --- /dev/null +++ b/.dotnet/src/Custom/Assistants/RunModificationOptions.cs @@ -0,0 +1,9 @@ +namespace OpenAI.Assistants; + +/// +/// Represents additional options available when modifying an existing . +/// +[CodeGenModel("ModifyRunRequest")] +public partial class RunModificationOptions +{ +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Assistants/RunStatus.cs b/.dotnet/src/Custom/Assistants/RunStatus.cs new file mode 100644 index 000000000..b2b5da90e --- /dev/null +++ b/.dotnet/src/Custom/Assistants/RunStatus.cs @@ -0,0 +1,20 @@ +namespace OpenAI.Assistants; + +[CodeGenModel("RunObjectStatus")] +public readonly partial struct RunStatus +{ + /// + /// [Helper property] Gets a value indicating whether this run status represents a condition wherein the run can + /// no longer continue. + /// + /// + /// For more information, please refer to: + /// https://platform.openai.com/docs/assistants/how-it-works/run-lifecycle + /// + public bool IsTerminal + => _value == CompletedValue + || _value == ExpiredValue + || _value == FailedValue + || _value == IncompleteValue + || _value == CancelledValue; +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Assistants/RunStep.cs b/.dotnet/src/Custom/Assistants/RunStep.cs new file mode 100644 index 000000000..061cb3cd8 --- /dev/null +++ b/.dotnet/src/Custom/Assistants/RunStep.cs @@ -0,0 +1,28 @@ +namespace OpenAI.Assistants; + +[CodeGenModel("RunStepObject")] +public partial class RunStep +{ + // CUSTOM: Made internal. + /// The object type, which is always `thread.run.step`. 
+ [CodeGenMember("Object")] + internal InternalRunStepObjectObject Object { get; } = InternalRunStepObjectObject.ThreadRunStep; + + /// + /// The step_details associated with this run step. + /// + /// + /// + /// Please note is the base class. + /// + /// + /// According to the scenario, a derived class of the base class might need to be assigned here, or this property + /// needs to be cast to one of the possible derived classes. + /// + /// + /// The available derived classes include and . + /// + /// + [CodeGenMember("StepDetails")] + public RunStepDetails Details { get; } +} diff --git a/.dotnet/src/Custom/Assistants/RunStepCodeInterpreterOutput.cs b/.dotnet/src/Custom/Assistants/RunStepCodeInterpreterOutput.cs new file mode 100644 index 000000000..8c0b7c7fc --- /dev/null +++ b/.dotnet/src/Custom/Assistants/RunStepCodeInterpreterOutput.cs @@ -0,0 +1,12 @@ +namespace OpenAI.Assistants; + +public abstract partial class RunStepCodeInterpreterOutput +{ + /// + public string ImageFileId => AsInternalImage?.FileId; + /// + public string Logs => AsInternalLogs?.InternalLogs; + + private InternalRunStepDetailsToolCallsCodeOutputImageObject AsInternalImage => this as InternalRunStepDetailsToolCallsCodeOutputImageObject; + private InternalRunStepCodeInterpreterLogOutput AsInternalLogs => this as InternalRunStepCodeInterpreterLogOutput; +} diff --git a/.dotnet/src/Custom/Assistants/RunStepCollectionOptions.cs b/.dotnet/src/Custom/Assistants/RunStepCollectionOptions.cs new file mode 100644 index 000000000..d85097058 --- /dev/null +++ b/.dotnet/src/Custom/Assistants/RunStepCollectionOptions.cs @@ -0,0 +1,33 @@ +namespace OpenAI.Assistants; + +/// +/// Represents additional options available when requesting a collection of instances. +/// +public class RunStepCollectionOptions +{ + /// + /// Creates a new instance of . + /// + public RunStepCollectionOptions() { } + + /// + /// The order that results should appear in the list according to + /// their created_at timestamp. 
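As the remarks above note, `Details` is typed as the abstract `RunStepDetails` base; a hedged sketch of consuming it through the projected helper properties added in this diff (`step` is assumed to come from a run-step retrieval or listing call not shown here):

```csharp
using System;
using OpenAI.Assistants;

// Hedged sketch: RunStepDetails surfaces message-creation and tool-call
// projections directly, so most callers never need an explicit downcast.
void DescribeStep(RunStep step)
{
    if (step.Details?.CreatedMessageId is string messageId)
    {
        Console.WriteLine($"Step created message {messageId}");
    }
    foreach (RunStepToolCall toolCall in step.Details?.ToolCalls ?? [])
    {
        Console.WriteLine($"Tool call {toolCall.ToolCallId} ({toolCall.ToolKind})");
    }
}
```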
+ /// + public ListOrder? Order { get; set; } + + /// + /// The number of values to return in a page result. + /// + public int? PageSize { get; set; } + + /// + /// The id of the item preceding the first item in the collection. + /// + public string AfterId { get; set; } + + /// + /// The id of the item following the last item in the collection. + /// + public string BeforeId { get; set; } +} diff --git a/.dotnet/src/Custom/Assistants/RunStepDetails.cs b/.dotnet/src/Custom/Assistants/RunStepDetails.cs new file mode 100644 index 000000000..8ade16fc9 --- /dev/null +++ b/.dotnet/src/Custom/Assistants/RunStepDetails.cs @@ -0,0 +1,14 @@ +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + [CodeGenModel("RunStepObjectStepDetails")] + public abstract partial class RunStepDetails + { + public string CreatedMessageId => AsInternalMessageCreation?.InternalMessageId; + public IReadOnlyList ToolCalls => AsInternalToolCalls ?? []; + + private InternalRunStepDetailsMessageCreationObject AsInternalMessageCreation => this as InternalRunStepDetailsMessageCreationObject; + private InternalRunStepDetailsToolCallsObject AsInternalToolCalls => this as InternalRunStepDetailsToolCallsObject; + } +} diff --git a/.dotnet/src/Custom/Assistants/RunStepToolCall.cs b/.dotnet/src/Custom/Assistants/RunStepToolCall.cs new file mode 100644 index 000000000..3cb729b12 --- /dev/null +++ b/.dotnet/src/Custom/Assistants/RunStepToolCall.cs @@ -0,0 +1,34 @@ +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants; + +[CodeGenModel("RunStepDetailsToolCallsObjectToolCallsObject")] +public partial class RunStepToolCall +{ + public string ToolCallId + => AsCodeInterpreter?.Id + ?? AsFunction?.Id + ?? AsFileSearch?.Id + ?? (SerializedAdditionalRawData?.TryGetValue("id", out BinaryData idData) == true + ? 
idData.ToString() + : null); + + public string CodeInterpreterInput => AsCodeInterpreter?.Input; + public IReadOnlyList CodeInterpreterOutputs => AsCodeInterpreter?.Outputs ?? []; + + public string FunctionName => AsFunction?.InternalName; + public string FunctionArguments => AsFunction?.InternalArguments; + public string FunctionOutput => AsFunction?.InternalOutput; + + public RunStepToolCallKind ToolKind + => AsCodeInterpreter is not null ? RunStepToolCallKind.CodeInterpreter + : AsFileSearch is not null ? RunStepToolCallKind.FileSearch + : AsFunction is not null ? RunStepToolCallKind.Function + : RunStepToolCallKind.Unknown; + + private InternalRunStepCodeInterpreterToolCallDetails AsCodeInterpreter + => this as InternalRunStepCodeInterpreterToolCallDetails; + private InternalRunStepFunctionToolCallDetails AsFunction => this as InternalRunStepFunctionToolCallDetails; + private InternalRunStepFileSearchToolCallDetails AsFileSearch => this as InternalRunStepFileSearchToolCallDetails; +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Assistants/RunStepToolCallKind.Serialization.cs b/.dotnet/src/Custom/Assistants/RunStepToolCallKind.Serialization.cs new file mode 100644 index 000000000..869a9ad02 --- /dev/null +++ b/.dotnet/src/Custom/Assistants/RunStepToolCallKind.Serialization.cs @@ -0,0 +1,21 @@ +using System; + +namespace OpenAI.Assistants; +internal static partial class RunStepToolCallKindExtensions +{ + public static string ToSerialString(this RunStepToolCallKind value) => value switch + { + RunStepToolCallKind.CodeInterpreter => "code_interpreter", + RunStepToolCallKind.FileSearch => "file_search", + RunStepToolCallKind.Function => "function", + _ => throw new ArgumentOutOfRangeException(nameof(value), value, $"Unknown {nameof(RunStepToolCallKind)} value: {value}") + }; + + public static RunStepToolCallKind ToRunStepToolCallKind(this string value) + { + if (StringComparer.OrdinalIgnoreCase.Equals(value, "code_interpreter")) return 
RunStepToolCallKind.CodeInterpreter; + if (StringComparer.OrdinalIgnoreCase.Equals(value, "file_search")) return RunStepToolCallKind.FileSearch; + if (StringComparer.OrdinalIgnoreCase.Equals(value, "function")) return RunStepToolCallKind.Function; + throw new ArgumentOutOfRangeException(nameof(value), value, $"Unknown {nameof(RunStepToolCallKind)} value: {value}"); + } +} diff --git a/.dotnet/src/Custom/Assistants/RunStepToolCallKind.cs b/.dotnet/src/Custom/Assistants/RunStepToolCallKind.cs new file mode 100644 index 000000000..f1d89ab29 --- /dev/null +++ b/.dotnet/src/Custom/Assistants/RunStepToolCallKind.cs @@ -0,0 +1,9 @@ +namespace OpenAI.Assistants; + +public enum RunStepToolCallKind +{ + Unknown, + CodeInterpreter, + FileSearch, + Function, +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Assistants/RunTruncationStrategy.cs b/.dotnet/src/Custom/Assistants/RunTruncationStrategy.cs new file mode 100644 index 000000000..324723a0c --- /dev/null +++ b/.dotnet/src/Custom/Assistants/RunTruncationStrategy.cs @@ -0,0 +1,68 @@ +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + /// Controls for how a thread will be truncated prior to the run. Use this to control the initial context window of the run. + [CodeGenModel("TruncationObject")] + [CodeGenSuppress(nameof(RunTruncationStrategy), typeof(InternalTruncationObjectType))] + public partial class RunTruncationStrategy + { + /// + /// Keeps track of any properties unknown to the library. + /// + /// To assign an object to the value of this property use . + /// + /// + /// To assign an already formatted json string to this property use . + /// + /// + /// Examples: + /// + /// + /// BinaryData.FromObjectAsJson("foo") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromString("\"foo\"") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromObjectAsJson(new { key = "value" }) + /// Creates a payload of { "key": "value" }. 
+ /// + /// + /// BinaryData.FromString("{\"key\": \"value\"}") + /// Creates a payload of { "key": "value" }. + /// + /// + /// + /// + private IDictionary SerializedAdditionalRawData; + + /// The truncation strategy to use for the thread. The default is `auto`. If set to `last_messages`, the thread will be truncated to the n most recent messages in the thread. When set to `auto`, messages in the middle of the thread will be dropped to fit the context length of the model, `max_prompt_tokens`. + [CodeGenMember("Type")] + internal readonly InternalTruncationObjectType _type; + + /// The number of most recent messages from the thread when constructing the context for the run. + /// + /// + [CodeGenMember("LastMessages")] + public int? LastMessages { get; } + + /// + /// The default that will eliminate messages in the middle of the thread + /// to fit within the context length of the model or the max prompt tokens. + /// + public static RunTruncationStrategy Auto { get; } = new(InternalTruncationObjectType.Auto, 0, null); + + /// + /// Creates a new instance using the last_messages strategy type, + /// which will truncate the thread to specified count of preceding messages for the run. + /// + /// The count of last messages that the run should evaluate. 
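The two strategies described above, the `auto` default and `last_messages`, might be selected as in this hedged sketch:

```csharp
using OpenAI.Assistants;

// Hedged sketch: choosing a truncation strategy for a run.
RunCreationOptions autoRun = new()
{
    // Default: drops messages from the middle of the thread to fit the
    // model's context window (`max_prompt_tokens`).
    TruncationStrategy = RunTruncationStrategy.Auto,
};

RunCreationOptions recentOnlyRun = new()
{
    // Seeds the run's context from only the 10 most recent thread messages.
    TruncationStrategy = RunTruncationStrategy.CreateLastMessagesStrategy(10),
};
```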
+ /// + public static RunTruncationStrategy CreateLastMessagesStrategy(int lastMessageCount) + => new(InternalTruncationObjectType.LastMessages, lastMessageCount, null); + } +} diff --git a/.dotnet/src/Custom/Assistants/Streaming/AsyncStreamingUpdateCollection.cs b/.dotnet/src/Custom/Assistants/Streaming/AsyncStreamingUpdateCollection.cs new file mode 100644 index 000000000..1a44545a5 --- /dev/null +++ b/.dotnet/src/Custom/Assistants/Streaming/AsyncStreamingUpdateCollection.cs @@ -0,0 +1,145 @@ +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Diagnostics; +using System.Linq; +using System.Net.ServerSentEvents; +using System.Threading; +using System.Threading.Tasks; + +#nullable enable + +namespace OpenAI.Assistants; + +/// +/// Implementation of collection abstraction over streaming assistant updates. +/// +internal class AsyncStreamingUpdateCollection : AsyncCollectionResult +{ + private readonly Func> _getResultAsync; + + public AsyncStreamingUpdateCollection(Func> getResultAsync) : base() + { + Argument.AssertNotNull(getResultAsync, nameof(getResultAsync)); + + _getResultAsync = getResultAsync; + } + + public override IAsyncEnumerator GetAsyncEnumerator(CancellationToken cancellationToken = default) + { + return new AsyncStreamingUpdateEnumerator(_getResultAsync, this, cancellationToken); + } + + private sealed class AsyncStreamingUpdateEnumerator : IAsyncEnumerator + { + private static ReadOnlySpan TerminalData => "[DONE]"u8; + + private readonly Func> _getResultAsync; + private readonly AsyncStreamingUpdateCollection _enumerable; + private readonly CancellationToken _cancellationToken; + + // These enumerators represent what is effectively a doubly-nested + // loop over the outer event collection and the inner update collection, + // i.e.: + // foreach (var sse in _events) { + // // get _updates from sse event + // foreach (var update in _updates) { ... 
} + // } + private IAsyncEnumerator>? _events; + private IEnumerator? _updates; + + private StreamingUpdate? _current; + private bool _started; + + public AsyncStreamingUpdateEnumerator(Func> getResultAsync, + AsyncStreamingUpdateCollection enumerable, + CancellationToken cancellationToken) + { + Debug.Assert(getResultAsync is not null); + Debug.Assert(enumerable is not null); + + _getResultAsync = getResultAsync!; + _enumerable = enumerable!; + _cancellationToken = cancellationToken; + } + + StreamingUpdate IAsyncEnumerator.Current + => _current!; + + async ValueTask IAsyncEnumerator.MoveNextAsync() + { + if (_events is null && _started) + { + throw new ObjectDisposedException(nameof(AsyncStreamingUpdateEnumerator)); + } + + _cancellationToken.ThrowIfCancellationRequested(); + _events ??= await CreateEventEnumeratorAsync().ConfigureAwait(false); + _started = true; + + if (_updates is not null && _updates.MoveNext()) + { + _current = _updates.Current; + return true; + } + + if (await _events.MoveNextAsync().ConfigureAwait(false)) + { + if (_events.Current.Data.AsSpan().SequenceEqual(TerminalData)) + { + _current = default; + return false; + } + + var updates = StreamingUpdate.FromEvent(_events.Current); + _updates = updates.GetEnumerator(); + + if (_updates.MoveNext()) + { + _current = _updates.Current; + return true; + } + } + + _current = default; + return false; + } + + private async Task>> CreateEventEnumeratorAsync() + { + ClientResult result = await _getResultAsync().ConfigureAwait(false); + PipelineResponse response = result.GetRawResponse(); + _enumerable.SetRawResponse(response); + + if (response.ContentStream is null) + { + throw new InvalidOperationException("Unable to create result from response with null ContentStream"); + } + + IAsyncEnumerable> enumerable = SseParser.Create(response.ContentStream, (_, bytes) => bytes.ToArray()).EnumerateAsync(); + return enumerable.GetAsyncEnumerator(_cancellationToken); + } + + public async ValueTask DisposeAsync() 
+ { + await DisposeAsyncCore().ConfigureAwait(false); + + GC.SuppressFinalize(this); + } + + private async ValueTask DisposeAsyncCore() + { + if (_events is not null) + { + await _events.DisposeAsync().ConfigureAwait(false); + _events = null; + + // Dispose the response so we don't leave the unbuffered + // network stream open. + PipelineResponse response = _enumerable.GetRawResponse(); + response.Dispose(); + } + } + } +} diff --git a/.dotnet/src/Custom/Assistants/Streaming/MessageContentUpdate.cs b/.dotnet/src/Custom/Assistants/Streaming/MessageContentUpdate.cs new file mode 100644 index 000000000..ad6944617 --- /dev/null +++ b/.dotnet/src/Custom/Assistants/Streaming/MessageContentUpdate.cs @@ -0,0 +1,93 @@ +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants; + +/// +/// Represents a streaming update to content as part of the Assistants API. +/// +/// +/// Distinct instances will be generated for each part +/// and each content subcomponent, such as instances, even if this information +/// arrived in the same response chunk. +/// +public partial class MessageContentUpdate : StreamingUpdate +{ + /// + public string MessageId => _delta.Id; + + /// + public int MessageIndex => _textContent?.Index + ?? _imageFileContent?.Index + ?? _imageUrlContent?.Index + ?? _refusalContent?.Index + ?? TextAnnotation?.ContentIndex + ?? 0; + + /// + public MessageRole? Role => _delta.Delta?.Role; + + /// + public string ImageFileId => _imageFileContent?.ImageFile?.FileId; + + /// + public MessageImageDetail? ImageDetail => _imageFileContent?.ImageFile?.Detail?.ToMessageImageDetail() + ?? _imageUrlContent?.ImageUrl?.Detail?.ToMessageImageDetail(); + + /// + public string Text => _textContent?.Text?.Value; + + /// + /// An update to an annotation associated with a specific content item in the message's content items collection. 
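Because text deltas, annotations, and image content each arrive as distinct `MessageContentUpdate` instances, a consumer typically checks which member is populated; a hedged sketch (`updates` is assumed to be the streaming collection returned by a streaming run method outside this diff):

```csharp
using System;
using System.Collections.Generic;
using OpenAI.Assistants;

// Hedged sketch: rendering streamed message content incrementally.
void RenderStream(IEnumerable<StreamingUpdate> updates)
{
    foreach (StreamingUpdate update in updates)
    {
        if (update is MessageContentUpdate content)
        {
            // Text and annotation updates arrive as separate instances,
            // so inspect which member is populated before rendering.
            if (content.Text is not null)
            {
                Console.Write(content.Text);
            }
            else if (content.TextAnnotation is not null)
            {
                Console.Write($"[annotation @ {content.TextAnnotation.ContentIndex}]");
            }
        }
    }
}
```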
+ /// + public TextAnnotationUpdate TextAnnotation { get; } + + public string RefusalUpdate => _refusalContent?.Refusal; + + private readonly InternalMessageDeltaContentImageFileObject _imageFileContent; + private readonly InternalMessageDeltaContentTextObject _textContent; + private readonly InternalMessageDeltaContentImageUrlObject _imageUrlContent; + private readonly InternalMessageDeltaContentRefusalObject _refusalContent; + private readonly InternalMessageDeltaObject _delta; + + internal MessageContentUpdate(InternalMessageDeltaObject delta, InternalMessageDeltaContent content) + : base(StreamingUpdateReason.MessageUpdated) + { + _delta = delta; + _textContent = content as InternalMessageDeltaContentTextObject; + _imageFileContent = content as InternalMessageDeltaContentImageFileObject; + _imageUrlContent = content as InternalMessageDeltaContentImageUrlObject; + _refusalContent = content as InternalMessageDeltaContentRefusalObject; + } + + internal MessageContentUpdate(InternalMessageDeltaObject delta, TextAnnotationUpdate annotation) + : base(StreamingUpdateReason.MessageUpdated) + { + _delta = delta; + TextAnnotation = annotation; + } + + internal static IEnumerable DeserializeMessageContentUpdates( + JsonElement element, + StreamingUpdateReason _, + ModelReaderWriterOptions options = null) + { + InternalMessageDeltaObject deltaObject = InternalMessageDeltaObject.DeserializeInternalMessageDeltaObject(element, options); + List updates = []; + foreach (InternalMessageDeltaContent deltaContent in deltaObject.Delta.Content ?? 
[]) + { + updates.Add(new(deltaObject, deltaContent)); + if (deltaContent is InternalMessageDeltaContentTextObject textContent) + { + foreach (InternalMessageDeltaTextContentAnnotation internalAnnotation in textContent.Text.Annotations) + { + TextAnnotationUpdate annotation = new(internalAnnotation); + updates.Add(new(deltaObject, annotation)); + } + } + } + return updates; + } +} + diff --git a/.dotnet/src/Custom/Assistants/Streaming/MessageStatusUpdate.cs b/.dotnet/src/Custom/Assistants/Streaming/MessageStatusUpdate.cs new file mode 100644 index 000000000..d6b19cf29 --- /dev/null +++ b/.dotnet/src/Custom/Assistants/Streaming/MessageStatusUpdate.cs @@ -0,0 +1,27 @@ +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants; + +/// +/// The update type presented when the status of a message changes. +/// +public class MessageStatusUpdate : StreamingUpdate +{ + internal MessageStatusUpdate(ThreadMessage message, StreamingUpdateReason updateKind) + : base(message, updateKind) + { } + + internal static IEnumerable DeserializeMessageStatusUpdates( + JsonElement element, + StreamingUpdateReason updateKind, + ModelReaderWriterOptions options = null) + { + ThreadMessage message = ThreadMessage.DeserializeThreadMessage(element, options); + return updateKind switch + { + _ => [new MessageStatusUpdate(message, updateKind)], + }; + } +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Assistants/Streaming/RequiredActionUpdate.cs b/.dotnet/src/Custom/Assistants/Streaming/RequiredActionUpdate.cs new file mode 100644 index 000000000..2c1899236 --- /dev/null +++ b/.dotnet/src/Custom/Assistants/Streaming/RequiredActionUpdate.cs @@ -0,0 +1,52 @@ +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants; + +/// +/// The update type presented when the status of a has changed to requires_action, +/// indicating that tool output submission or another intervention is needed 
for the run to continue. +/// +/// +/// Distinct instances will be generated for each required action, meaning that +/// parallel function calling will present multiple updates even if the tool calls arrive at the same time. +/// +public class RequiredActionUpdate : RunUpdate +{ + /// + public string FunctionName => AsFunctionCall?.FunctionName; + + /// + public string FunctionArguments => AsFunctionCall?.FunctionArguments; + + /// + public string ToolCallId => AsFunctionCall?.Id; + + private InternalRequiredFunctionToolCall AsFunctionCall => _requiredAction as InternalRequiredFunctionToolCall; + + private readonly RequiredAction _requiredAction; + + internal RequiredActionUpdate(ThreadRun run, RequiredAction action) + : base(run, StreamingUpdateReason.RunRequiresAction) + { + _requiredAction = action; + } + + /// + /// Gets the full, deserialized instance associated with this streaming required action + /// update. + /// + /// + public ThreadRun GetThreadRun() => Value; + + internal static IEnumerable DeserializeRequiredActionUpdates(JsonElement element) + { + ThreadRun run = ThreadRun.DeserializeThreadRun(element); + List updates = []; + foreach (RequiredAction action in run.RequiredActions ?? []) + { + updates.Add(new(run, action)); + } + return updates; + } +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Assistants/Streaming/RunStepDetailsUpdate.cs b/.dotnet/src/Custom/Assistants/Streaming/RunStepDetailsUpdate.cs new file mode 100644 index 000000000..588d3fd7b --- /dev/null +++ b/.dotnet/src/Custom/Assistants/Streaming/RunStepDetailsUpdate.cs @@ -0,0 +1,87 @@ +using System; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants; + +/// +/// The update type presented when run step details, including tool call progress, have changed. 
+/// +public class RunStepDetailsUpdate : StreamingUpdate +{ + internal readonly InternalRunStepDelta _delta; + internal readonly InternalRunStepDeltaStepDetailsToolCallsObjectToolCallsObject _toolCall; + private readonly InternalRunStepDeltaStepDetailsMessageCreationObject _asMessageCreation; + private readonly InternalRunStepDeltaStepDetailsToolCallsCodeObject _asCodeCall; + private readonly InternalRunStepDeltaStepDetailsToolCallsFileSearchObject _asFileSearchCall; + private readonly InternalRunStepDeltaStepDetailsToolCallsFunctionObject _asFunctionCall; + + /// + public string StepId => _delta?.Id; + + /// + public string CreatedMessageId => _asMessageCreation?.MessageCreation?.MessageId; + + /// + public string ToolCallId + => _asCodeCall?.Id + ?? _asFileSearchCall?.Id + ?? _asFunctionCall?.Id + ?? (_toolCall?.SerializedAdditionalRawData?.TryGetValue("id", out BinaryData idData) == true + ? idData.ToString() + : null); + + /// + public int? ToolCallIndex => _asCodeCall?.Index ?? _asFileSearchCall?.Index ?? 
_asFunctionCall?.Index; + + /// + public string CodeInterpreterInput => _asCodeCall?.CodeInterpreter?.Input; + + /// + public IReadOnlyList CodeInterpreterOutputs + => _asCodeCall?.CodeInterpreter?.Outputs; + + /// + public string FunctionName => _asFunctionCall?.Function?.Name; + + /// + public string FunctionArguments => _asFunctionCall?.Function?.Arguments; + + /// + public string FunctionOutput => _asFunctionCall?.Function?.Output; + + internal RunStepDetailsUpdate( + InternalRunStepDelta stepDelta, + InternalRunStepDeltaStepDetailsToolCallsObjectToolCallsObject toolCall = null) + : base(StreamingUpdateReason.RunStepUpdated) + { + _asMessageCreation = stepDelta?.Delta?.StepDetails as InternalRunStepDeltaStepDetailsMessageCreationObject; + _asCodeCall = toolCall as InternalRunStepDeltaStepDetailsToolCallsCodeObject; + _asFileSearchCall = toolCall as InternalRunStepDeltaStepDetailsToolCallsFileSearchObject; + _asFunctionCall = toolCall as InternalRunStepDeltaStepDetailsToolCallsFunctionObject; + _delta = stepDelta; + _toolCall = toolCall; + } + + internal static IEnumerable DeserializeRunStepDetailsUpdates( + JsonElement element, + StreamingUpdateReason updateKind, + ModelReaderWriterOptions options = null) + { + InternalRunStepDelta stepDelta = InternalRunStepDelta.DeserializeInternalRunStepDelta(element, options); + List updates = []; + if (stepDelta?.Delta?.StepDetails is InternalRunStepDeltaStepDetailsMessageCreationObject) + { + updates.Add(new RunStepDetailsUpdate(stepDelta)); + } + else if (stepDelta?.Delta?.StepDetails is InternalRunStepDeltaStepDetailsToolCallsObject toolCalls) + { + foreach (InternalRunStepDeltaStepDetailsToolCallsObjectToolCallsObject toolCall in toolCalls.ToolCalls) + { + updates.Add(new RunStepDetailsUpdate(stepDelta, toolCall)); + } + } + return updates; + } +} diff --git a/.dotnet/src/Custom/Assistants/Streaming/RunStepDetailsUpdateCodeInterpreterOutput.cs 
b/.dotnet/src/Custom/Assistants/Streaming/RunStepDetailsUpdateCodeInterpreterOutput.cs new file mode 100644 index 000000000..c24ea9dd4 --- /dev/null +++ b/.dotnet/src/Custom/Assistants/Streaming/RunStepDetailsUpdateCodeInterpreterOutput.cs @@ -0,0 +1,19 @@ +namespace OpenAI.Assistants; + +[CodeGenModel("RunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreterOutputsObject")] +public partial class RunStepUpdateCodeInterpreterOutput +{ + /// + public int OutputIndex => AsLogs?.Index ?? AsImage?.Index ?? 0; + + /// + public string Logs => AsLogs?.InternalLogs; + + /// + public string ImageFileId => AsImage?.Image?.FileId; + + private InternalRunStepDeltaStepDetailsToolCallsCodeOutputLogsObject AsLogs + => this as InternalRunStepDeltaStepDetailsToolCallsCodeOutputLogsObject; + private InternalRunStepDeltaStepDetailsToolCallsCodeOutputImageObject AsImage + => this as InternalRunStepDeltaStepDetailsToolCallsCodeOutputImageObject; +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Assistants/Streaming/RunStepUpdate.cs b/.dotnet/src/Custom/Assistants/Streaming/RunStepUpdate.cs new file mode 100644 index 000000000..c34345a8e --- /dev/null +++ b/.dotnet/src/Custom/Assistants/Streaming/RunStepUpdate.cs @@ -0,0 +1,27 @@ +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants; + +/// +/// The update type presented when the status of a run step changes. 
+/// +public class RunStepUpdate : StreamingUpdate +{ + internal RunStepUpdate(RunStep runStep, StreamingUpdateReason updateKind) + : base(runStep, updateKind) + { } + + internal static IEnumerable> DeserializeRunStepUpdates( + JsonElement element, + StreamingUpdateReason updateKind, + ModelReaderWriterOptions options = null) + { + RunStep runStep = RunStep.DeserializeRunStep(element, options); + return updateKind switch + { + _ => [new RunStepUpdate(runStep, updateKind)], + }; + } +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Assistants/Streaming/RunUpdate.cs b/.dotnet/src/Custom/Assistants/Streaming/RunUpdate.cs new file mode 100644 index 000000000..77df3f678 --- /dev/null +++ b/.dotnet/src/Custom/Assistants/Streaming/RunUpdate.cs @@ -0,0 +1,26 @@ +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants; + +/// +/// The update type presented when the status of a has changed. +/// +public class RunUpdate : StreamingUpdate +{ + internal RunUpdate(ThreadRun run, StreamingUpdateReason updateKind) : base(run, updateKind) + { } + + internal static IEnumerable> DeserializeRunUpdates( + JsonElement element, + StreamingUpdateReason updateKind, + ModelReaderWriterOptions options = null) + { + ThreadRun run = ThreadRun.DeserializeThreadRun(element, options); + return updateKind switch + { + _ => [new RunUpdate(run, updateKind)], + }; + } +} diff --git a/.dotnet/src/Custom/Assistants/Streaming/StreamingUpdate.cs b/.dotnet/src/Custom/Assistants/Streaming/StreamingUpdate.cs new file mode 100644 index 000000000..7940993d3 --- /dev/null +++ b/.dotnet/src/Custom/Assistants/Streaming/StreamingUpdate.cs @@ -0,0 +1,102 @@ +using System.Collections.Generic; +using System.Net.ServerSentEvents; +using System.Text.Json; + +namespace OpenAI.Assistants; + +/// +/// Represents a single item of streamed Assistants API data. +/// +/// +/// Please note that this is the abstract base type. 
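Downcasting to the derived update types, as the remarks above describe, can be sketched like this (note that `RequiredActionUpdate` derives from `RunUpdate`, so it must be matched first):

```csharp
using System;
using OpenAI.Assistants;

// Hedged sketch: dispatching a streamed update to its derived type.
void HandleUpdate(StreamingUpdate update)
{
    switch (update)
    {
        case RequiredActionUpdate requiredAction:   // before RunUpdate: it is a RunUpdate subtype
            Console.WriteLine($"Tool output needed for {requiredAction.FunctionName}");
            break;
        case RunUpdate runUpdate:
            Console.WriteLine($"Run event: {runUpdate.UpdateKind}");
            break;
        case RunStepUpdate stepUpdate:
            Console.WriteLine($"Run step event: {stepUpdate.UpdateKind}");
            break;
        case MessageContentUpdate content:
            Console.Write(content.Text);
            break;
    }
}
```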
To access data, downcast an instance of this type to an +/// appropriate, derived update type: +/// +/// For messages: , +/// +/// +/// For runs and run steps: , , , +/// +/// +/// +/// For threads: +/// +/// +public abstract partial class StreamingUpdate +{ + /// + /// A value indicating what type of event this update represents. + /// + /// + /// Many events share the same response type. For example, and + /// are both associated with a instance. + /// You can use the value of to differentiate between these events when the type is not + /// sufficient to do so. + /// + public StreamingUpdateReason UpdateKind { get; } + + internal StreamingUpdate(StreamingUpdateReason updateKind) + { + UpdateKind = updateKind; + } + + internal static IEnumerable FromEvent(SseItem sseItem) + { + StreamingUpdateReason updateKind = StreamingUpdateReasonExtensions.FromSseEventLabel(sseItem.EventType); + using JsonDocument dataDocument = JsonDocument.Parse(sseItem.Data); + JsonElement e = dataDocument.RootElement; + + return updateKind switch + { + StreamingUpdateReason.ThreadCreated => ThreadUpdate.DeserializeThreadCreationUpdates(e, updateKind), + StreamingUpdateReason.RunCreated + or StreamingUpdateReason.RunQueued + or StreamingUpdateReason.RunInProgress + or StreamingUpdateReason.RunCompleted + or StreamingUpdateReason.RunIncomplete + or StreamingUpdateReason.RunFailed + or StreamingUpdateReason.RunCancelling + or StreamingUpdateReason.RunCancelled + or StreamingUpdateReason.RunExpired => RunUpdate.DeserializeRunUpdates(e, updateKind), + StreamingUpdateReason.RunRequiresAction => RequiredActionUpdate.DeserializeRequiredActionUpdates(e), + StreamingUpdateReason.RunStepCreated + or StreamingUpdateReason.RunStepInProgress + or StreamingUpdateReason.RunStepCompleted + or StreamingUpdateReason.RunStepFailed + or StreamingUpdateReason.RunStepCancelled + or StreamingUpdateReason.RunStepExpired => RunStepUpdate.DeserializeRunStepUpdates(e, updateKind), + 
StreamingUpdateReason.MessageCreated + or StreamingUpdateReason.MessageInProgress + or StreamingUpdateReason.MessageCompleted + or StreamingUpdateReason.MessageFailed => MessageStatusUpdate.DeserializeMessageStatusUpdates(e, updateKind), + StreamingUpdateReason.RunStepUpdated => RunStepDetailsUpdate.DeserializeRunStepDetailsUpdates(e, updateKind), + StreamingUpdateReason.MessageUpdated => MessageContentUpdate.DeserializeMessageContentUpdates(e, updateKind), + _ => null, + }; + } +} + +/// +/// Represents a single item of streamed data that encapsulates an underlying response value type. +/// +/// The response value type of the "delta" payload. +public partial class StreamingUpdate : StreamingUpdate + where T : class +{ + /// + /// The underlying response value received with the streaming event. + /// + public T Value { get; } + + internal StreamingUpdate(T value, StreamingUpdateReason updateKind) + : base(updateKind) + { + Value = value; + } + + /// + /// Implicit operator that allows the underlying value type of the to be used + /// directly. + /// + /// + public static implicit operator T(StreamingUpdate update) => update.Value; +} diff --git a/.dotnet/src/Custom/Assistants/Streaming/StreamingUpdateCollection.cs b/.dotnet/src/Custom/Assistants/Streaming/StreamingUpdateCollection.cs new file mode 100644 index 000000000..d0099d995 --- /dev/null +++ b/.dotnet/src/Custom/Assistants/Streaming/StreamingUpdateCollection.cs @@ -0,0 +1,145 @@ +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections; +using System.Collections.Generic; +using System.Diagnostics; +using System.Net.ServerSentEvents; + +#nullable enable + +namespace OpenAI.Assistants; + +/// +/// Implementation of collection abstraction over streaming assistant updates. 
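Both the synchronous and asynchronous collections implement the doubly-nested loop their comments describe; the control flow can be sketched as follows (`StreamingUpdate.FromEvent` is internal to the library and is used here only to illustrate the flow):

```csharp
using System;
using System.Collections.Generic;
using System.Net.ServerSentEvents;
using OpenAI.Assistants;

// Hedged sketch: outer loop over SSE events, inner loop over the updates
// parsed from each event; the literal "[DONE]" payload ends the stream.
static IEnumerable<StreamingUpdate> EnumerateUpdates(IEnumerable<SseItem<byte[]>> events)
{
    foreach (SseItem<byte[]> sse in events)
    {
        if (sse.Data.AsSpan().SequenceEqual("[DONE]"u8))
        {
            yield break;
        }
        foreach (StreamingUpdate update in StreamingUpdate.FromEvent(sse))
        {
            yield return update;
        }
    }
}
```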
+/// +internal class StreamingUpdateCollection : CollectionResult +{ + private readonly Func _getResult; + + public StreamingUpdateCollection(Func getResult) : base() + { + Argument.AssertNotNull(getResult, nameof(getResult)); + + _getResult = getResult; + } + + public override IEnumerator GetEnumerator() + { + return new StreamingUpdateEnumerator(_getResult, this); + } + + private sealed class StreamingUpdateEnumerator : IEnumerator + { + private static ReadOnlySpan TerminalData => "[DONE]"u8; + + private readonly Func _getResult; + private readonly StreamingUpdateCollection _enumerable; + + // These enumerators represent what is effectively a doubly-nested + // loop over the outer event collection and the inner update collection, + // i.e.: + // foreach (var sse in _events) { + // // get _updates from sse event + // foreach (var update in _updates) { ... } + // } + private IEnumerator>? _events; + private IEnumerator? _updates; + + private StreamingUpdate? _current; + private bool _started; + + public StreamingUpdateEnumerator(Func getResult, + StreamingUpdateCollection enumerable) + { + Debug.Assert(getResult is not null); + Debug.Assert(enumerable is not null); + + _getResult = getResult!; + _enumerable = enumerable!; + } + + StreamingUpdate IEnumerator.Current + => _current!; + + object IEnumerator.Current => throw new NotImplementedException(); + + public bool MoveNext() + { + if (_events is null && _started) + { + throw new ObjectDisposedException(nameof(StreamingUpdateEnumerator)); + } + + _events ??= CreateEventEnumerator(); + _started = true; + + if (_updates is not null && _updates.MoveNext()) + { + _current = _updates.Current; + return true; + } + + if (_events.MoveNext()) + { + if (_events.Current.Data.AsSpan().SequenceEqual(TerminalData)) + { + _current = default; + return false; + } + + var updates = StreamingUpdate.FromEvent(_events.Current); + _updates = updates.GetEnumerator(); + + if (_updates.MoveNext()) + { + _current = _updates.Current; + 
                    return true;
+                }
+            }
+
+            _current = default;
+            return false;
+        }
+
+        private IEnumerator<SseItem<byte[]>> CreateEventEnumerator()
+        {
+            ClientResult result = _getResult();
+            PipelineResponse response = result.GetRawResponse();
+            _enumerable.SetRawResponse(response);
+
+            if (response.ContentStream is null)
+            {
+                throw new InvalidOperationException("Unable to create result from response with null ContentStream");
+            }
+
+            IEnumerable<SseItem<byte[]>> enumerable = SseParser.Create(response.ContentStream, (_, bytes) => bytes.ToArray()).Enumerate();
+            return enumerable.GetEnumerator();
+        }
+
+        public void Reset()
+        {
+            throw new NotSupportedException("Cannot seek back in an SSE stream.");
+        }
+
+        public void Dispose()
+        {
+            Dispose(true);
+            GC.SuppressFinalize(this);
+        }
+
+        private void Dispose(bool disposing)
+        {
+            if (disposing && _events is not null)
+            {
+                _events.Dispose();
+                _events = null;
+
+                // Dispose the response so we don't leave the unbuffered
+                // network stream open.
+                PipelineResponse response = _enumerable.GetRawResponse();
+                response.Dispose();
+            }
+        }
+    }
+}
diff --git a/.dotnet/src/Custom/Assistants/Streaming/StreamingUpdateReason.Serialization.cs b/.dotnet/src/Custom/Assistants/Streaming/StreamingUpdateReason.Serialization.cs
new file mode 100644
index 000000000..6445528ad
--- /dev/null
+++ b/.dotnet/src/Custom/Assistants/Streaming/StreamingUpdateReason.Serialization.cs
@@ -0,0 +1,63 @@
+namespace OpenAI.Assistants;
+
+internal static class StreamingUpdateReasonExtensions
+{
+    internal static string ToSseEventLabel(this StreamingUpdateReason value) => value switch
+    {
+        StreamingUpdateReason.ThreadCreated => "thread.created",
+        StreamingUpdateReason.RunCreated => "thread.run.created",
+        StreamingUpdateReason.RunQueued => "thread.run.queued",
+        StreamingUpdateReason.RunInProgress => "thread.run.in_progress",
+        StreamingUpdateReason.RunRequiresAction => "thread.run.requires_action",
+        StreamingUpdateReason.RunCompleted => "thread.run.completed",
+        StreamingUpdateReason.RunIncomplete => "thread.run.incomplete",
+        StreamingUpdateReason.RunFailed =>
"thread.run.failed", + StreamingUpdateReason.RunCancelling => "thread.run.cancelling", + StreamingUpdateReason.RunCancelled => "thread.run.cancelled", + StreamingUpdateReason.RunExpired => "thread.run.expired", + StreamingUpdateReason.RunStepCreated => "thread.run.step.created", + StreamingUpdateReason.RunStepInProgress => "thread.run.step.in_progress", + StreamingUpdateReason.RunStepUpdated => "thread.run.step.delta", + StreamingUpdateReason.RunStepCompleted => "thread.run.step.completed", + StreamingUpdateReason.RunStepFailed => "thread.run.step.failed", + StreamingUpdateReason.RunStepCancelled => "thread.run.step.cancelled", + StreamingUpdateReason.RunStepExpired => "thread.run.step.expired", + StreamingUpdateReason.MessageCreated => "thread.message.created", + StreamingUpdateReason.MessageInProgress => "thread.message.in_progress", + StreamingUpdateReason.MessageUpdated => "thread.message.delta", + StreamingUpdateReason.MessageCompleted => "thread.message.completed", + StreamingUpdateReason.MessageFailed => "thread.message.incomplete", + StreamingUpdateReason.Error => "error", + StreamingUpdateReason.Done => "done", + _ => string.Empty + }; + + internal static StreamingUpdateReason FromSseEventLabel(string label) => label switch + { + "thread.created" => StreamingUpdateReason.ThreadCreated, + "thread.run.created" => StreamingUpdateReason.RunCreated, + "thread.run.queued" => StreamingUpdateReason.RunQueued, + "thread.run.in_progress" => StreamingUpdateReason.RunInProgress, + "thread.run.requires_action" => StreamingUpdateReason.RunRequiresAction, + "thread.run.completed" => StreamingUpdateReason.RunCompleted, + "thread.run.incomplete" => StreamingUpdateReason.RunIncomplete, + "thread.run.failed" => StreamingUpdateReason.RunFailed, + "thread.run.cancelling" => StreamingUpdateReason.RunCancelling, + "thread.run.cancelled" => StreamingUpdateReason.RunCancelled, + "thread.run.expired" => StreamingUpdateReason.RunExpired, + "thread.run.step.created" => 
StreamingUpdateReason.RunStepCreated, + "thread.run.step.in_progress" => StreamingUpdateReason.RunStepInProgress, + "thread.run.step.delta" => StreamingUpdateReason.RunStepUpdated, + "thread.run.step.completed" => StreamingUpdateReason.RunStepCompleted, + "thread.run.step.failed" => StreamingUpdateReason.RunStepFailed, + "thread.run.step.cancelled" => StreamingUpdateReason.RunStepCancelled, + "thread.run.step.expired" => StreamingUpdateReason.RunStepExpired, + "thread.message.created" => StreamingUpdateReason.MessageCreated, + "thread.message.in_progress" => StreamingUpdateReason.MessageInProgress, + "thread.message.delta" => StreamingUpdateReason.MessageUpdated, + "thread.message.completed" => StreamingUpdateReason.MessageCompleted, + "thread.message.incomplete" => StreamingUpdateReason.MessageFailed, + "error" => StreamingUpdateReason.Error, + "done" => StreamingUpdateReason.Done, + _ => StreamingUpdateReason.Unknown, + }; +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Assistants/Streaming/StreamingUpdateReason.cs b/.dotnet/src/Custom/Assistants/Streaming/StreamingUpdateReason.cs new file mode 100644 index 000000000..e23673342 --- /dev/null +++ b/.dotnet/src/Custom/Assistants/Streaming/StreamingUpdateReason.cs @@ -0,0 +1,127 @@ +namespace OpenAI.Assistants; + +/// +/// The collection of values associated with the event names of streaming update payloads. These correspond to the +/// expected downcast data type of the as well as to the expected data present in the +/// payload. +/// +public enum StreamingUpdateReason +{ + /// + /// Indicates that there is no known reason associated with the streaming update. + /// + Unknown, + /// + /// Indicates that an update was generated as part of a thread.created event. + /// + /// This reason is typically only associated with calls to + /// , + /// as other run-related methods operate on a thread that has previously been created. 
+ /// + ThreadCreated, + /// + /// Indicates that an update was generated as part of a thread.run.created event. + /// + RunCreated, + /// + /// Indicates that an update was generated as part of a thread.run.queued event. + /// + RunQueued, + /// + /// Indicates that an update was generated as part of a thread.run.in_progress event. + /// + RunInProgress, + /// + /// Indicates that an update was generated as part of a thread.run.requires_action event. + /// + /// + /// Note that, if multiple actions occur within a single event, as can be the case with the parallel tool calling, + /// distinct instances will be generated for each + /// . + /// + RunRequiresAction, + /// + /// Indicates that an update was generated as part of a thread.run.completed event. + /// + RunCompleted, + /// + /// Indicates that an update was generated as part of a thread.run.incomplete event. + /// + RunIncomplete, + /// + /// Indicates that an update was generated as part of a thread.run.failed event. + /// + RunFailed, + /// + /// Indicates that an update was generated as part of a thread.run.cancelling event. + /// + RunCancelling, + /// + /// Indicates that an update was generated as part of a thread.run.cancelled event. + /// + RunCancelled, + /// + /// Indicates that an update was generated as part of a thread.run.expired event. + /// + RunExpired, + /// + /// Indicates that an update was generated as part of a thread.run.step.created event. + /// + RunStepCreated, + /// + /// Indicates that an update was generated as part of a thread.run.step.in_progress event. + /// + RunStepInProgress, + /// + /// Indicates that an update was generated as part of a thread.run.step.delta event. + /// + RunStepUpdated, + /// + /// Indicates that an update was generated as part of a thread.run.step.completed event. + /// + RunStepCompleted, + /// + /// Indicates that an update was generated as part of a thread.run.step.failed event. 
+    ///
+    RunStepFailed,
+    ///
+    /// Indicates that an update was generated as part of a thread.run.step.cancelled event.
+    ///
+    RunStepCancelled,
+    ///
+    /// Indicates that an update was generated as part of a thread.run.step.expired event.
+    ///
+    RunStepExpired,
+    ///
+    /// Indicates that an update was generated as part of a thread.message.created event.
+    ///
+    MessageCreated,
+    ///
+    /// Indicates that an update was generated as part of a thread.message.in_progress event.
+    ///
+    MessageInProgress,
+    ///
+    /// Indicates that an update was generated as part of a thread.message.delta event.
+    ///
+    ///
+    /// Distinct instances will be created for each content update and/or content
+    /// annotation present on the event.
+    ///
+    MessageUpdated,
+    ///
+    /// Indicates that an update was generated as part of a thread.message.completed event.
+    ///
+    MessageCompleted,
+    ///
+    /// Indicates that an update was generated as part of a thread.message.incomplete event.
+    ///
+    MessageFailed,
+    ///
+    /// Indicates that an update was generated as part of an error event.
+    ///
+    Error,
+    ///
+    /// Indicates the end of streaming update events. This value should not typically be observed.
+    ///
+    Done,
+}
\ No newline at end of file
diff --git a/.dotnet/src/Custom/Assistants/Streaming/TextAnnotationUpdate.cs b/.dotnet/src/Custom/Assistants/Streaming/TextAnnotationUpdate.cs
new file mode 100644
index 000000000..e7abb4030
--- /dev/null
+++ b/.dotnet/src/Custom/Assistants/Streaming/TextAnnotationUpdate.cs
@@ -0,0 +1,67 @@
+using System;
+
+namespace OpenAI.Assistants;
+
+public class TextAnnotationUpdate
+{
+    ///
+    /// The index of the content item that this annotation applies to.
+    ///
+    public int ContentIndex
+        => _fileSearchCitation?.Index
+            ?? _codeCitation?.Index
+            ?? (_internalAnnotation?.SerializedAdditionalRawData.TryGetValue("index", out BinaryData indexData) == true
+                ?
int.Parse(indexData.ToString())
+                : -1);
+
+    ///
+    /// The index in the message content at which the citation begins.
+    ///
+    public int? StartIndex
+        => _fileSearchCitation?.StartIndex
+            ?? _codeCitation?.StartIndex
+            ?? (_internalAnnotation?.SerializedAdditionalRawData.TryGetValue("start_index", out BinaryData indexData) == true
+                ? int.Parse(indexData.ToString())
+                : null);
+
+    ///
+    /// The index in the message content at which the citation ends.
+    ///
+    public int? EndIndex
+        => _fileSearchCitation?.EndIndex
+            ?? _codeCitation?.EndIndex
+            ?? (_internalAnnotation?.SerializedAdditionalRawData.TryGetValue("end_index", out BinaryData indexData) == true
+                ? int.Parse(indexData.ToString())
+                : null);
+
+    ///
+    /// The text in the message content that should be replaced.
+    ///
+    public string TextToReplace
+        => _fileSearchCitation?.Text
+            ?? _codeCitation?.Text
+            ?? (_internalAnnotation?.SerializedAdditionalRawData.TryGetValue("text", out BinaryData textData) == true
+                ? textData.ToString()
+                : null);
+
+    ///
+    /// The ID of the file cited by the file_search tool for this annotation.
+    ///
+    public string InputFileId => _fileSearchCitation?.FileCitation?.FileId;
+
+    ///
+    /// The ID of the file that was generated by the code_interpreter tool for this citation.
+ /// + public string OutputFileId => _codeCitation?.FilePath?.FileId; + + internal readonly InternalMessageDeltaTextContentAnnotation _internalAnnotation; + private readonly InternalMessageDeltaContentTextAnnotationsFileCitationObject _fileSearchCitation; + private readonly InternalMessageDeltaContentTextAnnotationsFilePathObject _codeCitation; + + internal TextAnnotationUpdate(InternalMessageDeltaTextContentAnnotation internalAnnotation) + { + _internalAnnotation = internalAnnotation; + _fileSearchCitation = internalAnnotation as InternalMessageDeltaContentTextAnnotationsFileCitationObject; + _codeCitation = internalAnnotation as InternalMessageDeltaContentTextAnnotationsFilePathObject; + } +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Assistants/Streaming/ThreadUpdate.cs b/.dotnet/src/Custom/Assistants/Streaming/ThreadUpdate.cs new file mode 100644 index 000000000..c839281ff --- /dev/null +++ b/.dotnet/src/Custom/Assistants/Streaming/ThreadUpdate.cs @@ -0,0 +1,37 @@ +using System; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants; + +/// +/// The update type presented when a streamed event indicates a thread was created. 
+///
+public class ThreadUpdate : StreamingUpdate<AssistantThread>
+{
+    ///
+    public string Id => Value.Id;
+    ///
+    public IReadOnlyDictionary<string, string> Metadata => Value.Metadata;
+    ///
+    public DateTimeOffset CreatedAt => Value.CreatedAt;
+    ///
+    public ToolResources ToolResources => Value.ToolResources;
+
+    internal ThreadUpdate(AssistantThread thread) : base(thread, StreamingUpdateReason.ThreadCreated)
+    { }
+
+    internal static IEnumerable<StreamingUpdate<AssistantThread>> DeserializeThreadCreationUpdates(
+        JsonElement element,
+        StreamingUpdateReason updateKind,
+        ModelReaderWriterOptions options = null)
+    {
+        AssistantThread thread = AssistantThread.DeserializeAssistantThread(element, options);
+        return updateKind switch
+        {
+            StreamingUpdateReason.ThreadCreated => [new ThreadUpdate(thread)],
+            _ => [new StreamingUpdate<AssistantThread>(thread, updateKind)],
+        };
+    }
+}
diff --git a/.dotnet/src/Custom/Assistants/TextAnnotation.cs b/.dotnet/src/Custom/Assistants/TextAnnotation.cs
new file mode 100644
index 000000000..2a4652b54
--- /dev/null
+++ b/.dotnet/src/Custom/Assistants/TextAnnotation.cs
@@ -0,0 +1,57 @@
+using System;
+
+namespace OpenAI.Assistants;
+
+public class TextAnnotation
+{
+    internal readonly InternalMessageContentTextObjectAnnotation _internalAnnotation;
+    private readonly InternalMessageContentTextAnnotationsFileCitationObject _fileSearchCitation;
+    private readonly InternalMessageContentTextAnnotationsFilePathObject _codeCitation;
+
+    ///
+    /// The index in the message content at which the citation begins.
+    ///
+    public int StartIndex
+        => _fileSearchCitation?.StartIndex
+            ?? _codeCitation?.StartIndex
+            ?? (_internalAnnotation?.SerializedAdditionalRawData?.TryGetValue("start_index", out BinaryData indexData) == true
+                ? int.Parse(indexData.ToString())
+                : -1);
+
+    ///
+    /// The index in the message content at which the citation ends.
+    ///
+    public int EndIndex =>
+        _fileSearchCitation?.EndIndex
+            ?? _codeCitation?.EndIndex
+            ??
(_internalAnnotation?.SerializedAdditionalRawData?.TryGetValue("end_index", out BinaryData indexData) == true
+            ? int.Parse(indexData.ToString())
+            : -1);
+
+    ///
+    /// The text in the message content that should be replaced.
+    ///
+    public string TextToReplace =>
+        _fileSearchCitation?.Text
+            ?? _codeCitation?.Text
+            ?? (_internalAnnotation?.SerializedAdditionalRawData?.TryGetValue("text", out BinaryData textData) == true
+                ? textData.ToString()
+                : null);
+
+    ///
+    /// The ID of the file cited by the file_search tool for this annotation.
+    ///
+    public string InputFileId => _fileSearchCitation?.FileCitation?.FileId;
+
+    ///
+    /// The ID of the file that was generated by the code_interpreter tool for this citation.
+    ///
+    public string OutputFileId => _codeCitation?.FilePath?.FileId;
+
+    internal TextAnnotation(InternalMessageContentTextObjectAnnotation internalAnnotation)
+    {
+        _internalAnnotation = internalAnnotation;
+        _fileSearchCitation = internalAnnotation as InternalMessageContentTextAnnotationsFileCitationObject;
+        _codeCitation = internalAnnotation as InternalMessageContentTextAnnotationsFilePathObject;
+    }
+}
\ No newline at end of file
diff --git a/.dotnet/src/Custom/Assistants/ThreadCreationOptions.cs b/.dotnet/src/Custom/Assistants/ThreadCreationOptions.cs
new file mode 100644
index 000000000..f0ae040f7
--- /dev/null
+++ b/.dotnet/src/Custom/Assistants/ThreadCreationOptions.cs
@@ -0,0 +1,51 @@
+using System.Collections.Generic;
+using System.Linq;
+
+namespace OpenAI.Assistants;
+
+///
+/// Represents additional options available when creating a new .
+/// +[CodeGenModel("CreateThreadRequest")] +public partial class ThreadCreationOptions +{ + // CUSTOM: reuse a common type for request/response model representations of tool resources + + /// + [CodeGenMember("ToolResources")] + public ToolResources ToolResources { get; set; } + + // CUSTOM: the wire-oriented messages type list is hidden so that we can propagate top-level required semantics + // of message creation into the collection. + + [CodeGenMember("Messages")] + internal IList InternalMessages + { + get => InitialMessages.Select(initializationMessage => initializationMessage as MessageCreationOptions).ToList(); + private set + { + // Note: this path is exclusively used in a test or deserialization case; here, we'll convert the + // underlying wire-friendly representation into the initialization message abstraction. + + InitialMessages.Clear(); + foreach (MessageCreationOptions baseMessageOptions in value) + { + InitialMessages.Add(new ThreadInitializationMessage(baseMessageOptions)); + } + } + } + + /// + /// The collection of new message definitions that should be added to the thread immediately upon its creation. + /// + /// + /// Items may be inserted into this collection via list initializer, e.g.: + /// + /// options = new() + /// { + /// InitialMessages = { new (["Hello, world!"]) }, + /// } + /// + /// + public IList InitialMessages { get; } = new ChangeTrackingList(); +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Assistants/ThreadInitializationMessage.cs b/.dotnet/src/Custom/Assistants/ThreadInitializationMessage.cs new file mode 100644 index 000000000..60c443977 --- /dev/null +++ b/.dotnet/src/Custom/Assistants/ThreadInitializationMessage.cs @@ -0,0 +1,33 @@ +using System.Collections.Generic; + +namespace OpenAI.Assistants; + +public partial class ThreadInitializationMessage : MessageCreationOptions +{ + /// + /// Creates a new instance of . 
+ /// + /// + /// The content items that should be included in the message, added to the thread being created. + /// + public ThreadInitializationMessage(MessageRole role, IEnumerable content) : base(content) + { + Role = role; + } + + internal ThreadInitializationMessage(MessageCreationOptions baseOptions) + : base(baseOptions.Role, baseOptions.Content, baseOptions.Attachments, baseOptions.Metadata, null) + { } + + /// + /// Implicitly creates a new instance of from a single item of plain text + /// content, assuming the role of . + /// + /// + /// Using a in the position of a is equivalent to + /// using the constructor with + /// and a single content instance. + /// + public static implicit operator ThreadInitializationMessage(string initializationMessage) + => new(MessageRole.User, [initializationMessage]); +} diff --git a/.dotnet/src/Custom/Assistants/ThreadMessage.cs b/.dotnet/src/Custom/Assistants/ThreadMessage.cs new file mode 100644 index 000000000..1e831a070 --- /dev/null +++ b/.dotnet/src/Custom/Assistants/ThreadMessage.cs @@ -0,0 +1,20 @@ +using System.Collections.Generic; + +namespace OpenAI.Assistants; + +[CodeGenModel("MessageObject")] +public partial class ThreadMessage +{ + // CUSTOM: Made internal. + /// The object type, which is always `thread.message`. + [CodeGenMember("Object")] + internal InternalMessageObjectObject Object { get; } = InternalMessageObjectObject.ThreadMessage; + + + /// + [CodeGenMember("Role")] + public MessageRole Role { get; } + + /// A list of files attached to the message, and the tools they were added to. 
+ public IReadOnlyList Attachments { get; } +} diff --git a/.dotnet/src/Custom/Assistants/ThreadModificationOptions.cs b/.dotnet/src/Custom/Assistants/ThreadModificationOptions.cs new file mode 100644 index 000000000..5216abe20 --- /dev/null +++ b/.dotnet/src/Custom/Assistants/ThreadModificationOptions.cs @@ -0,0 +1,15 @@ +namespace OpenAI.Assistants; + +/// +/// Represents additional options available when modifying an existing . +/// +[CodeGenModel("ModifyThreadRequest")] +public partial class ThreadModificationOptions +{ + // CUSTOM: reuse common request/response models for tool resources. Note that modification operations use the + // response models (which do not contain resource initialization helpers). + + /// + [CodeGenMember("ToolResources")] + public ToolResources ToolResources { get; set; } +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Assistants/ThreadRun.cs b/.dotnet/src/Custom/Assistants/ThreadRun.cs new file mode 100644 index 000000000..b2a0f3f43 --- /dev/null +++ b/.dotnet/src/Custom/Assistants/ThreadRun.cs @@ -0,0 +1,96 @@ +using System; +using System.Collections.Generic; +using System.Linq; + +namespace OpenAI.Assistants; + +// CUSTOM: +// - Required actions are abstracted into a forward-compatible, strongly-typed conceptual +// hierarchy and formatted into a more intuitive collection for the consumer. + +[CodeGenModel("RunObject")] +public partial class ThreadRun +{ + // CUSTOM: Made internal. + /// The object type, which is always `thread.run`. + [CodeGenMember("Object")] + internal InternalRunObjectObject Object { get; } = InternalRunObjectObject.ThreadRun; + + + [CodeGenMember("RequiredAction")] + internal readonly InternalRunRequiredAction _internalRequiredAction; + + // CUSTOM: Removed null check for `toolConstraint` and `responseFormat`. 
+ internal ThreadRun(string id, DateTimeOffset createdAt, string threadId, string assistantId, RunStatus status, InternalRunRequiredAction internalRequiredAction, RunError lastError, DateTimeOffset? expiresAt, DateTimeOffset? startedAt, DateTimeOffset? cancelledAt, DateTimeOffset? failedAt, DateTimeOffset? completedAt, RunIncompleteDetails incompleteDetails, string model, string instructions, IEnumerable tools, IReadOnlyDictionary metadata, RunTokenUsage usage, int? maxPromptTokens, int? maxCompletionTokens, RunTruncationStrategy truncationStrategy, ToolConstraint toolConstraint, bool? parallelToolCallsEnabled, AssistantResponseFormat responseFormat) + { + Argument.AssertNotNull(id, nameof(id)); + Argument.AssertNotNull(threadId, nameof(threadId)); + Argument.AssertNotNull(assistantId, nameof(assistantId)); + Argument.AssertNotNull(model, nameof(model)); + Argument.AssertNotNull(instructions, nameof(instructions)); + Argument.AssertNotNull(tools, nameof(tools)); + + Id = id; + CreatedAt = createdAt; + ThreadId = threadId; + AssistantId = assistantId; + Status = status; + _internalRequiredAction = internalRequiredAction; + LastError = lastError; + ExpiresAt = expiresAt; + StartedAt = startedAt; + CancelledAt = cancelledAt; + FailedAt = failedAt; + CompletedAt = completedAt; + IncompleteDetails = incompleteDetails; + Model = model; + Instructions = instructions; + Tools = tools.ToList(); + Metadata = metadata; + Usage = usage; + MaxPromptTokens = maxPromptTokens; + MaxCompletionTokens = maxCompletionTokens; + TruncationStrategy = truncationStrategy; + ToolConstraint = toolConstraint; + ParallelToolCallsEnabled = parallelToolCallsEnabled; + ResponseFormat = responseFormat; + } + + + /// + /// The list of required actions that must have their results submitted for the run to continue. + /// + /// + /// is the abstract base type for all required actions. 
Its + /// concrete type can be one of: + /// + /// + /// + /// + public IReadOnlyList RequiredActions => _internalRequiredAction?.SubmitToolOutputs?.ToolCalls ?? []; + + /// + [CodeGenMember("ResponseFormat")] + public AssistantResponseFormat ResponseFormat { get; } + + [CodeGenMember("ToolChoice")] + public ToolConstraint ToolConstraint { get; } + + /// + /// An alternative to sampling with temperature, called nucleus sampling, where the model considers the results of the tokens with top_p probability mass. So 0.1 means only the tokens comprising the top 10% probability mass are considered. + /// + /// We generally recommend altering this or temperature but not both. + /// + [CodeGenMember("TopP")] + public float? NucleusSamplingFactor { get; } + + /// + /// Whether parallel function calling is enabled during tool use for the thread. + /// + /// + /// Assumed true if not otherwise specified. + /// + [CodeGenMember("ParallelToolCalls")] + public bool? ParallelToolCallsEnabled { get; } + +} diff --git a/.dotnet/src/Custom/Assistants/ToolConstraint.Serialization.cs b/.dotnet/src/Custom/Assistants/ToolConstraint.Serialization.cs new file mode 100644 index 000000000..7d2a335e4 --- /dev/null +++ b/.dotnet/src/Custom/Assistants/ToolConstraint.Serialization.cs @@ -0,0 +1,97 @@ +using System; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants; + +[CodeGenSuppress("global::System.ClientModel.Primitives.IJsonModel.Write", typeof(Utf8JsonWriter), typeof(ModelReaderWriterOptions))] +[CodeGenSuppress("global::System.ClientModel.Primitives.IJsonModel.Create", typeof(Utf8JsonReader), typeof(ModelReaderWriterOptions))] +[CodeGenSuppress("global::System.ClientModel.Primitives.IPersistableModel.Write", typeof(ModelReaderWriterOptions))] +[CodeGenSuppress("global::System.ClientModel.Primitives.IPersistableModel.Create", typeof(BinaryData), typeof(ModelReaderWriterOptions))] +public partial class 
ToolConstraint : IJsonModel<ToolConstraint>
+{
+    void IJsonModel<ToolConstraint>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        => CustomSerializationHelpers.SerializeInstance(this, SerializeToolConstraint, writer, options);
+
+    ToolConstraint IJsonModel<ToolConstraint>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options)
+        => CustomSerializationHelpers.DeserializeNewInstance(this, DeserializeToolConstraint, ref reader, options);
+
+    BinaryData IPersistableModel<ToolConstraint>.Write(ModelReaderWriterOptions options)
+        => CustomSerializationHelpers.SerializeInstance(this, options);
+
+    ToolConstraint IPersistableModel<ToolConstraint>.Create(BinaryData data, ModelReaderWriterOptions options)
+        => CustomSerializationHelpers.DeserializeNewInstance(this, DeserializeToolConstraint, data, options);
+
+    internal static void SerializeToolConstraint(ToolConstraint instance, Utf8JsonWriter writer, ModelReaderWriterOptions options)
+    {
+        if (instance._plainTextValue is not null)
+        {
+            writer.WriteStringValue(instance._plainTextValue);
+        }
+        else
+        {
+            writer.WriteStartObject();
+            writer.WritePropertyName("type"u8);
+            writer.WriteStringValue(instance._objectType.ToString());
+            if (Optional.IsDefined(instance._objectFunctionName))
+            {
+                writer.WritePropertyName("function"u8);
+                writer.WriteStartObject();
+                writer.WritePropertyName("name"u8);
+                writer.WriteStringValue(instance._objectFunctionName);
+                writer.WriteEndObject();
+            }
+            writer.WriteSerializedAdditionalRawData(instance.SerializedAdditionalRawData, options);
+            writer.WriteEndObject();
+        }
+    }
+
+    internal static ToolConstraint DeserializeToolConstraint(JsonElement element, ModelReaderWriterOptions options = null)
+    {
+        options ??= ModelSerializationExtensions.WireOptions;
+
+        string plainTextValue = null;
+        string objectType = null;
+        string objectFunctionName = null;
+        IDictionary<string, BinaryData> rawDataDictionary = new ChangeTrackingDictionary<string, BinaryData>();
+
+        if (element.ValueKind == JsonValueKind.Null)
+        {
+            return null;
+        }
+        if (element.ValueKind == JsonValueKind.String)
+        {
plainTextValue = element.GetString();
+        }
+        else
+        {
+            foreach (var property in element.EnumerateObject())
+            {
+                if (property.NameEquals("type"u8))
+                {
+                    objectType = property.Value.GetString();
+                    continue;
+                }
+                if (property.NameEquals("function"u8))
+                {
+                    foreach (JsonProperty functionProperty in property.Value.EnumerateObject())
+                    {
+                        if (functionProperty.NameEquals("name"u8))
+                        {
+                            objectFunctionName = functionProperty.Value.GetString();
+                            continue;
+                        }
+                    }
+                    continue;
+                }
+                rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText()));
+            }
+        }
+
+        return new ToolConstraint(plainTextValue, objectType, objectFunctionName, rawDataDictionary);
+    }
+}
\ No newline at end of file
diff --git a/.dotnet/src/Custom/Assistants/ToolConstraint.cs b/.dotnet/src/Custom/Assistants/ToolConstraint.cs
new file mode 100644
index 000000000..dc3de61ac
--- /dev/null
+++ b/.dotnet/src/Custom/Assistants/ToolConstraint.cs
@@ -0,0 +1,55 @@
+using System;
+using System.Collections.Generic;
+
+namespace OpenAI.Assistants;
+
+[CodeGenModel("AssistantsNamedToolChoice")]
+public partial class ToolConstraint
+{
+    private readonly string _plainTextValue;
+    [CodeGenMember("Type")]
+    private readonly string _objectType;
+    private readonly string _objectFunctionName;
+    private readonly IDictionary<string, BinaryData> SerializedAdditionalRawData;
+
+    // CUSTOM: Made internal.
+    /// Gets or sets the function.
+    [CodeGenMember("Function")]
+    internal InternalAssistantsNamedToolChoiceFunction Function { get; set; }
+
+    public static ToolConstraint None { get; } = new("none");
+    public static ToolConstraint Auto { get; } = new("auto");
+    public static ToolConstraint Required { get; } = new("required");
+
+    public ToolConstraint(ToolDefinition toolDefinition)
+    {
+        switch (toolDefinition)
+        {
+            case CodeInterpreterToolDefinition:
+                _objectType = "code_interpreter";
+                break;
+            case FileSearchToolDefinition:
+                _objectType = "file_search";
+                break;
+            case FunctionToolDefinition functionTool:
+                _objectType = "function";
+                _objectFunctionName = functionTool.FunctionName;
+                break;
+            default:
+                throw new ArgumentOutOfRangeException(nameof(toolDefinition));
+        }
+        SerializedAdditionalRawData = new ChangeTrackingDictionary<string, BinaryData>();
+    }
+
+    internal ToolConstraint(string plainTextValue)
+        : this(plainTextValue, null, null, null)
+    { }
+
+    internal ToolConstraint(string plainTextValue, string objectType, string objectFunctionName, IDictionary<string, BinaryData> serializedAdditionalRawData)
+    {
+        _plainTextValue = plainTextValue;
+        _objectType = objectType;
+        _objectFunctionName = objectFunctionName;
+        SerializedAdditionalRawData = serializedAdditionalRawData ??
new ChangeTrackingDictionary<string, BinaryData>();
+    }
+}
diff --git a/.dotnet/src/Custom/Assistants/ToolDefinition.Serialization.cs b/.dotnet/src/Custom/Assistants/ToolDefinition.Serialization.cs
new file mode 100644
index 000000000..9894fb1ed
--- /dev/null
+++ b/.dotnet/src/Custom/Assistants/ToolDefinition.Serialization.cs
@@ -0,0 +1,19 @@
+using System;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Runtime.CompilerServices;
+using System.Text.Json;
+
+namespace OpenAI.Assistants;
+
+[CodeGenSuppress("global::System.ClientModel.Primitives.IJsonModel<ToolDefinition>.Write", typeof(Utf8JsonWriter), typeof(ModelReaderWriterOptions))]
+public abstract partial class ToolDefinition : IJsonModel<ToolDefinition>
+{
+    void IJsonModel<ToolDefinition>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        => CustomSerializationHelpers.SerializeInstance(this, WriteCore, writer, options);
+
+    internal static void WriteCore(ToolDefinition instance, Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        => instance.WriteCore(writer, options);
+
+    protected abstract void WriteCore(Utf8JsonWriter writer, ModelReaderWriterOptions options);
+}
\ No newline at end of file
diff --git a/.dotnet/src/Custom/Assistants/ToolDefinition.cs b/.dotnet/src/Custom/Assistants/ToolDefinition.cs
new file mode 100644
index 000000000..0b876e1c1
--- /dev/null
+++ b/.dotnet/src/Custom/Assistants/ToolDefinition.cs
@@ -0,0 +1,29 @@
+using System;
+
+namespace OpenAI.Assistants;
+
+[CodeGenModel("AssistantToolDefinition")]
+public abstract partial class ToolDefinition
+{
+    public static CodeInterpreterToolDefinition CreateCodeInterpreter()
+        => new CodeInterpreterToolDefinition();
+    public static FileSearchToolDefinition CreateFileSearch(int? maxResults = null)
+    {
+        return new FileSearchToolDefinition()
+        {
+            MaxResults = maxResults
+        };
+    }
+    public static FunctionToolDefinition CreateFunction(string name, string description = null, BinaryData parameters = null, bool?
strictParameterSchemaEnabled = null) + => new FunctionToolDefinition(name) + { + Description = description, + Parameters = parameters, + StrictParameterSchemaEnabled = strictParameterSchemaEnabled, + }; + + protected ToolDefinition(string type) + { + Type = type; + } +} diff --git a/.dotnet/src/Custom/Assistants/ToolOutput.cs b/.dotnet/src/Custom/Assistants/ToolOutput.cs new file mode 100644 index 000000000..ad32112d0 --- /dev/null +++ b/.dotnet/src/Custom/Assistants/ToolOutput.cs @@ -0,0 +1,16 @@ +namespace OpenAI.Assistants; + +[CodeGenModel("SubmitToolOutputsRunRequestToolOutput")] +public partial class ToolOutput +{ + /// + /// Creates a new instance of . + /// + /// + /// The ID of that the provided output resolves. + /// + /// The output from the specified tool. + public ToolOutput(string toolCallId, string output) + : this(toolCallId, output, null) + {} +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Assistants/ToolResources.cs b/.dotnet/src/Custom/Assistants/ToolResources.cs new file mode 100644 index 000000000..1043eead2 --- /dev/null +++ b/.dotnet/src/Custom/Assistants/ToolResources.cs @@ -0,0 +1,20 @@ +using System.ClientModel.Primitives; +using System.Text.Json; + +namespace OpenAI.Assistants; + +[CodeGenModel("AssistantObjectToolResources")] +[CodeGenSerialization(nameof(FileSearch), "file_search", SerializationValueHook = nameof(SerializeFileSearch))] +public partial class ToolResources +{ + /// Gets the code interpreter. + public CodeInterpreterToolResources CodeInterpreter { get; set; } + /// Gets the file search. 
+ public FileSearchToolResources FileSearch { get; set; } + + public ToolResources() + {} + + private void SerializeFileSearch(Utf8JsonWriter writer, ModelReaderWriterOptions options) + => writer.WriteObjectValue(FileSearch, options); +} diff --git a/.dotnet/src/Custom/Assistants/VectorStoreCreationHelper.cs b/.dotnet/src/Custom/Assistants/VectorStoreCreationHelper.cs new file mode 100644 index 000000000..cba26fb8e --- /dev/null +++ b/.dotnet/src/Custom/Assistants/VectorStoreCreationHelper.cs @@ -0,0 +1,23 @@ +using OpenAI.Files; +using OpenAI.VectorStores; +using System.Collections.Generic; +using System.Linq; + +namespace OpenAI.Assistants; + +[CodeGenModel("ToolResourcesFileSearchVectorStore")] +public partial class VectorStoreCreationHelper +{ + [CodeGenMember("ChunkingStrategy")] + public FileChunkingStrategy ChunkingStrategy { get; set; } + + public VectorStoreCreationHelper(IEnumerable fileIds, IDictionary metadata = null) + { + FileIds = fileIds.ToList(); + Metadata = metadata ?? new ChangeTrackingDictionary(); + } + + public VectorStoreCreationHelper(IEnumerable files, IDictionary metadata = null) + : this(files?.Select(file => file.Id) ?? 
[], metadata) + {} +} diff --git a/.dotnet/src/Custom/Audio/AudioClient.Protocol.cs b/.dotnet/src/Custom/Audio/AudioClient.Protocol.cs new file mode 100644 index 000000000..0fd68c698 --- /dev/null +++ b/.dotnet/src/Custom/Audio/AudioClient.Protocol.cs @@ -0,0 +1,156 @@ +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.ComponentModel; +using System.Threading.Tasks; + +namespace OpenAI.Audio; + +[CodeGenSuppress("CreateSpeechAsync", typeof(BinaryContent), typeof(RequestOptions))] +[CodeGenSuppress("CreateSpeech", typeof(BinaryContent), typeof(RequestOptions))] +[CodeGenSuppress("CreateTranscriptionAsync", typeof(BinaryContent), typeof(string), typeof(RequestOptions))] +[CodeGenSuppress("CreateTranscription", typeof(BinaryContent), typeof(string), typeof(RequestOptions))] +[CodeGenSuppress("CreateTranslationAsync", typeof(BinaryContent), typeof(string), typeof(RequestOptions))] +[CodeGenSuppress("CreateTranslation", typeof(BinaryContent), typeof(string), typeof(RequestOptions))] +public partial class AudioClient +{ + // CUSTOM: + // - Renamed. + // - Edited the cref in the doc comment to point to the correct convenience overload after it was also renamed. + // - Added the EditorBrowsable attribute to hide protocol methods from IntelliSense when a convenience overload is available. + /// + /// [Protocol Method] Generates text-to-speech audio using the specified voice speaking the provided input text. + /// + /// The content to send as the body of the request. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// is null. + /// Service returned a non-success status code. + /// The response returned from the service. 
+ [EditorBrowsable(EditorBrowsableState.Never)] + public virtual async Task GenerateSpeechAsync(BinaryContent content, RequestOptions options = null) + { + Argument.AssertNotNull(content, nameof(content)); + + using PipelineMessage message = CreateCreateSpeechRequest(content, options); + return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + // CUSTOM: + // - Renamed. + // - Edited the cref in the doc comment to point to the correct convenience overload after it was also renamed. + // - Added the EditorBrowsable attribute to hide protocol methods from IntelliSense when a convenience overload is available. + /// + /// [Protocol Method] Generates text-to-speech audio using the specified voice speaking the provided input text. + /// + /// The content to send as the body of the request. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// is null. + /// Service returned a non-success status code. + /// The response returned from the service. + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual ClientResult GenerateSpeech(BinaryContent content, RequestOptions options = null) + { + Argument.AssertNotNull(content, nameof(content)); + + using PipelineMessage message = CreateCreateSpeechRequest(content, options); + return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options)); + } + + // CUSTOM: + // - Renamed. + // - Edited the cref in the doc comment to point to the correct convenience overload after it was also renamed. + // - Added the EditorBrowsable attribute to hide protocol methods from IntelliSense when a convenience overload is available. + // - Added "contentType" parameter. + /// + /// [Protocol Method] Transcribes audio. + /// + /// The content to send as the body of the request. + /// The content type of the request. 
+ /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// or is null. + /// is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// The response returned from the service. + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual async Task TranscribeAudioAsync(BinaryContent content, string contentType, RequestOptions options = null) + { + Argument.AssertNotNull(content, nameof(content)); + Argument.AssertNotNullOrEmpty(contentType, nameof(contentType)); + + using PipelineMessage message = CreateCreateTranscriptionRequest(content, contentType, options); + return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + // CUSTOM: + // - Renamed. + // - Edited the cref in the doc comment to point to the correct convenience overload after it was also renamed. + // - Added the EditorBrowsable attribute to hide protocol methods from IntelliSense when a convenience overload is available. + // - Added "contentType" parameter. + /// + /// [Protocol Method] Transcribes audio. + /// + /// The content to send as the body of the request. + /// The content type of the request. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// or is null. + /// is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// The response returned from the service. 
+ [EditorBrowsable(EditorBrowsableState.Never)] + public virtual ClientResult TranscribeAudio(BinaryContent content, string contentType, RequestOptions options = null) + { + Argument.AssertNotNull(content, nameof(content)); + Argument.AssertNotNullOrEmpty(contentType, nameof(contentType)); + + using PipelineMessage message = CreateCreateTranscriptionRequest(content, contentType, options); + return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options)); + } + + // CUSTOM: + // - Renamed. + // - Edited the cref in the doc comment to point to the correct convenience overload after it was also renamed. + // - Added the EditorBrowsable attribute to hide protocol methods from IntelliSense when a convenience overload is available. + // - Added "contentType" parameter. + /// + /// [Protocol Method] Translates audio into English. + /// + /// The content to send as the body of the request. + /// The content type of the request. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// or is null. + /// is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// The response returned from the service. + public virtual async Task TranslateAudioAsync(BinaryContent content, string contentType, RequestOptions options = null) + { + Argument.AssertNotNull(content, nameof(content)); + Argument.AssertNotNullOrEmpty(contentType, nameof(contentType)); + + using PipelineMessage message = CreateCreateTranslationRequest(content, contentType, options); + return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + // CUSTOM: + // - Renamed. + // - Edited the cref in the doc comment to point to the correct convenience overload after it was also renamed. + // - Added the EditorBrowsable attribute to hide protocol methods from IntelliSense when a convenience overload is available. 
+ // - Added "contentType" parameter. + /// + /// [Protocol Method] Translates audio into English. + /// + /// The content to send as the body of the request. + /// The content type of the request. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// or is null. + /// is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// The response returned from the service. + public virtual ClientResult TranslateAudio(BinaryContent content, string contentType, RequestOptions options = null) + { + Argument.AssertNotNull(content, nameof(content)); + Argument.AssertNotNullOrEmpty(contentType, nameof(contentType)); + + using PipelineMessage message = CreateCreateTranslationRequest(content, contentType, options); + return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options)); + } +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Audio/AudioClient.cs b/.dotnet/src/Custom/Audio/AudioClient.cs new file mode 100644 index 000000000..58c0e49a8 --- /dev/null +++ b/.dotnet/src/Custom/Audio/AudioClient.cs @@ -0,0 +1,321 @@ +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.IO; +using System.Threading; +using System.Threading.Tasks; + +namespace OpenAI.Audio; + +// CUSTOM: +// - Renamed. +// - Suppressed constructor that takes endpoint parameter; endpoint is now a property in the options class. +// - Suppressed methods that only take the options parameter. +/// The service client for OpenAI audio operations. 
+[CodeGenClient("Audio")] +[CodeGenSuppress("AudioClient", typeof(ClientPipeline), typeof(ApiKeyCredential), typeof(Uri))] +[CodeGenSuppress("CreateSpeechAsync", typeof(SpeechGenerationOptions))] +[CodeGenSuppress("CreateSpeech", typeof(SpeechGenerationOptions))] +[CodeGenSuppress("CreateTranscriptionAsync", typeof(AudioTranscriptionOptions))] +[CodeGenSuppress("CreateTranscription", typeof(AudioTranscriptionOptions))] +[CodeGenSuppress("CreateTranslationAsync", typeof(AudioTranslationOptions))] +[CodeGenSuppress("CreateTranslation", typeof(AudioTranslationOptions))] +public partial class AudioClient +{ + private readonly string _model; + + // CUSTOM: + // - Added `model` parameter. + // - Used a custom pipeline. + // - Demoted the endpoint parameter to be a property in the options class. + /// Initializes a new instance of . + /// The name of the model to use in requests sent to the service. To learn more about the available models, see . + /// The API key to authenticate with the service. + /// or is null. + /// is an empty string, and was expected to be non-empty. + public AudioClient(string model, ApiKeyCredential credential) : this(model, credential, new OpenAIClientOptions()) + { + } + + // CUSTOM: + // - Added `model` parameter. + // - Used a custom pipeline. + // - Demoted the endpoint parameter to be a property in the options class. + /// Initializes a new instance of . + /// The name of the model to use in requests sent to the service. To learn more about the available models, see . + /// The API key to authenticate with the service. + /// The options to configure the client. + /// or is null. + /// is an empty string, and was expected to be non-empty. 
+ public AudioClient(string model, ApiKeyCredential credential, OpenAIClientOptions options) + { + Argument.AssertNotNullOrEmpty(model, nameof(model)); + Argument.AssertNotNull(credential, nameof(credential)); + options ??= new OpenAIClientOptions(); + + _model = model; + _pipeline = OpenAIClient.CreatePipeline(credential, options); + _endpoint = OpenAIClient.GetEndpoint(options); + } + + // CUSTOM: + // - Added `model` parameter. + // - Used a custom pipeline. + // - Demoted the endpoint parameter to be a property in the options class. + // - Made protected. + /// Initializes a new instance of . + /// The HTTP pipeline to send and receive REST requests and responses. + /// The name of the model to use in requests sent to the service. To learn more about the available models, see . + /// The options to configure the client. + /// or is null. + /// is an empty string, and was expected to be non-empty. + protected internal AudioClient(ClientPipeline pipeline, string model, OpenAIClientOptions options) + { + Argument.AssertNotNull(pipeline, nameof(pipeline)); + Argument.AssertNotNullOrEmpty(model, nameof(model)); + options ??= new OpenAIClientOptions(); + + _model = model; + _pipeline = pipeline; + _endpoint = OpenAIClient.GetEndpoint(options); + } + + #region GenerateSpeech + + /// Generates a life-like, spoken audio recording of the input text. + /// + /// The default format of the generated audio is unless otherwise specified + /// via . + /// + /// The text to generate audio for. + /// The voice to use in the generated audio. + /// The options to configure the audio generation. + /// A token that can be used to cancel this method call. + /// is null. + /// The generated audio in the specified output format. 
+ public virtual async Task> GenerateSpeechAsync(string text, GeneratedSpeechVoice voice, SpeechGenerationOptions options = null, CancellationToken cancellationToken = default) + { + Argument.AssertNotNull(text, nameof(text)); + + options ??= new(); + CreateSpeechGenerationOptions(text, voice, ref options); + + using BinaryContent content = options.ToBinaryContent(); + ClientResult result = await GenerateSpeechAsync(content, cancellationToken.ToRequestOptions()).ConfigureAwait(false); + return ClientResult.FromValue(result.GetRawResponse().Content, result.GetRawResponse()); + } + + /// Generates a life-like, spoken audio recording of the input text. + /// + /// The default format of the generated audio is unless otherwise specified + /// via . + /// + /// The text to generate audio for. + /// The voice to use in the generated audio. + /// The options to configure the audio generation. + /// A token that can be used to cancel this method call. + /// is null. + /// The generated audio in the specified output format. + public virtual ClientResult GenerateSpeech(string text, GeneratedSpeechVoice voice, SpeechGenerationOptions options = null, CancellationToken cancellationToken = default) + { + Argument.AssertNotNull(text, nameof(text)); + + options ??= new(); + CreateSpeechGenerationOptions(text, voice, ref options); + + using BinaryContent content = options.ToBinaryContent(); + ClientResult result = GenerateSpeech(content, cancellationToken.ToRequestOptions()); + return ClientResult.FromValue(result.GetRawResponse().Content, result.GetRawResponse()); + } + + #endregion + + #region TranscribeAudio + + /// Transcribes the input audio. + /// The audio stream to transcribe. + /// + /// The filename associated with the audio stream. The filename's extension (for example: .mp3) will be used to + /// validate the format of the input audio. The request may fail if the filename's extension and the actual + /// format of the input audio do not match.
+ /// + /// The options to configure the audio transcription. + /// A token that can be used to cancel this method call. + /// or is null. + /// is an empty string, and was expected to be non-empty. + public virtual async Task> TranscribeAudioAsync(Stream audio, string audioFilename, AudioTranscriptionOptions options = null, CancellationToken cancellationToken = default) + { + Argument.AssertNotNull(audio, nameof(audio)); + Argument.AssertNotNullOrEmpty(audioFilename, nameof(audioFilename)); + + options ??= new(); + CreateAudioTranscriptionOptions(audio, audioFilename, ref options); + + using MultipartFormDataBinaryContent content = options.ToMultipartContent(audio, audioFilename); + ClientResult result = await TranscribeAudioAsync(content, content.ContentType, cancellationToken.ToRequestOptions()).ConfigureAwait(false); + return ClientResult.FromValue(AudioTranscription.FromResponse(result.GetRawResponse()), result.GetRawResponse()); + } + + /// Transcribes the input audio. + /// The audio stream to transcribe. + /// + /// The filename associated with the audio stream. The filename's extension (for example: .mp3) will be used to + /// validate the format of the input audio. The request may fail if the filename's extension and the actual + /// format of the input audio do not match. + /// + /// The options to configure the audio transcription. + /// A token that can be used to cancel this method call. + /// or is null. + /// is an empty string, and was expected to be non-empty. 
+ public virtual ClientResult TranscribeAudio(Stream audio, string audioFilename, AudioTranscriptionOptions options = null, CancellationToken cancellationToken = default) + { + Argument.AssertNotNull(audio, nameof(audio)); + Argument.AssertNotNullOrEmpty(audioFilename, nameof(audioFilename)); + + options ??= new(); + CreateAudioTranscriptionOptions(audio, audioFilename, ref options); + + using MultipartFormDataBinaryContent content = options.ToMultipartContent(audio, audioFilename); + ClientResult result = TranscribeAudio(content, content.ContentType, cancellationToken.ToRequestOptions()); + return ClientResult.FromValue(AudioTranscription.FromResponse(result.GetRawResponse()), result.GetRawResponse()); + } + + /// Transcribes the input audio. + /// + /// The path of the audio file to transcribe. The provided file path's extension (for example: .mp3) will be + /// used to validate the format of the input audio. The request may fail if the file path's extension and the + /// actual format of the input audio do not match. + /// + /// The options to configure the audio transcription. + /// is null. + /// is an empty string, and was expected to be non-empty. + public virtual async Task> TranscribeAudioAsync(string audioFilePath, AudioTranscriptionOptions options = null) + { + Argument.AssertNotNullOrEmpty(audioFilePath, nameof(audioFilePath)); + + using FileStream audioStream = File.OpenRead(audioFilePath); + return await TranscribeAudioAsync(audioStream, audioFilePath, options).ConfigureAwait(false); + } + + /// Transcribes the input audio. + /// + /// The path of the audio file to transcribe. The provided file path's extension (for example: .mp3) will be + /// used to validate the format of the input audio. The request may fail if the file path's extension and the + /// actual format of the input audio do not match. + /// + /// The options to configure the audio transcription. + /// is null. + /// is an empty string, and was expected to be non-empty. 
+ public virtual ClientResult TranscribeAudio(string audioFilePath, AudioTranscriptionOptions options = null) + { + Argument.AssertNotNullOrEmpty(audioFilePath, nameof(audioFilePath)); + + using FileStream audioStream = File.OpenRead(audioFilePath); + return TranscribeAudio(audioStream, audioFilePath, options); + } + + #endregion + + #region TranslateAudio + + /// Translates the input audio into English. + /// The audio stream to translate. + /// + /// The filename associated with the audio stream. The filename's extension (for example: .mp3) will be used to + /// validate the format of the input audio. The request may fail if the filename's extension and the actual + /// format of the input audio do not match. + /// + /// The options to configure the audio translation. + /// A token that can be used to cancel this method call. + /// or is null. + /// is an empty string, and was expected to be non-empty. + public virtual async Task> TranslateAudioAsync(Stream audio, string audioFilename, AudioTranslationOptions options = null, CancellationToken cancellationToken = default) + { + Argument.AssertNotNull(audio, nameof(audio)); + Argument.AssertNotNullOrEmpty(audioFilename, nameof(audioFilename)); + + options ??= new(); + CreateAudioTranslationOptions(audio, audioFilename, ref options); + + using MultipartFormDataBinaryContent content = options.ToMultipartContent(audio, audioFilename); + ClientResult result = await TranslateAudioAsync(content, content.ContentType, cancellationToken.ToRequestOptions()).ConfigureAwait(false); + return ClientResult.FromValue(AudioTranslation.FromResponse(result.GetRawResponse()), result.GetRawResponse()); + } + + /// Translates the input audio into English. + /// The audio stream to translate. + /// + /// The filename associated with the audio stream. The filename's extension (for example: .mp3) will be used to + /// validate the format of the input audio. 
The request may fail if the filename's extension and the actual + /// format of the input audio do not match. + /// + /// The options to configure the audio translation. + /// A token that can be used to cancel this method call. + /// or is null. + /// is an empty string, and was expected to be non-empty. + public virtual ClientResult TranslateAudio(Stream audio, string audioFilename, AudioTranslationOptions options = null, CancellationToken cancellationToken = default) + { + Argument.AssertNotNull(audio, nameof(audio)); + Argument.AssertNotNullOrEmpty(audioFilename, nameof(audioFilename)); + + options ??= new(); + CreateAudioTranslationOptions(audio, audioFilename, ref options); + + using MultipartFormDataBinaryContent content = options.ToMultipartContent(audio, audioFilename); + ClientResult result = TranslateAudio(content, content.ContentType, cancellationToken.ToRequestOptions()); + return ClientResult.FromValue(AudioTranslation.FromResponse(result.GetRawResponse()), result.GetRawResponse()); + } + + /// Translates the input audio into English. + /// + /// The path of the audio file to translate. The provided file path's extension (for example: .mp3) will be + /// used to validate the format of the input audio. The request may fail if the file path's extension and the + /// actual format of the input audio do not match. + /// + /// The options to configure the audio translation. + /// was null. + /// is an empty string, and was expected to be non-empty. + public virtual ClientResult TranslateAudio(string audioFilePath, AudioTranslationOptions options = null) + { + Argument.AssertNotNullOrEmpty(audioFilePath, nameof(audioFilePath)); + + using FileStream audioStream = File.OpenRead(audioFilePath); + return TranslateAudio(audioStream, audioFilePath, options); + } + + /// Translates the input audio into English. + /// + /// The path of the audio file to translate. 
The provided file path's extension (for example: .mp3) will be + /// used to validate the format of the input audio. The request may fail if the file path's extension and the + /// actual format of the input audio do not match. + /// + /// The options to configure the audio translation. + /// is null. + /// is an empty string, and was expected to be non-empty. + public virtual async Task> TranslateAudioAsync(string audioFilePath, AudioTranslationOptions options = null) + { + Argument.AssertNotNullOrEmpty(audioFilePath, nameof(audioFilePath)); + + using FileStream audioStream = File.OpenRead(audioFilePath); + return await TranslateAudioAsync(audioStream, audioFilePath, options).ConfigureAwait(false); + } + + #endregion + + private void CreateSpeechGenerationOptions(string text, GeneratedSpeechVoice voice, ref SpeechGenerationOptions options) + { + options.Input = text; + options.Voice = voice; + options.Model = _model; + } + + private void CreateAudioTranscriptionOptions(Stream audio, string audioFilename, ref AudioTranscriptionOptions options) + { + options.Model = _model; + } + + private void CreateAudioTranslationOptions(Stream audio, string audioFilename, ref AudioTranslationOptions options) + { + options.Model = _model; + } +} diff --git a/.dotnet/src/Custom/Audio/AudioTimestampGranularities.cs b/.dotnet/src/Custom/Audio/AudioTimestampGranularities.cs new file mode 100644 index 000000000..1deac3f0c --- /dev/null +++ b/.dotnet/src/Custom/Audio/AudioTimestampGranularities.cs @@ -0,0 +1,28 @@ +using System; + +namespace OpenAI.Audio; + +/// +/// Specifies the timestamp granularities to populate for a transcription. +/// +[Flags] +public enum AudioTimestampGranularities +{ + /// + /// The default value that, when equivalent to a request's flags, specifies no specific audio timestamp granularity + /// and defers to the default timestamp behavior.
+ /// + Default = 0, + + /// + /// The value that, when present in the request's flags, specifies that audio information should include word-level + /// timestamp information. + /// + Word = 1, + + /// + /// The value that, when present in the request's flags, specifies that audio information should include + /// segment-level timestamp information. + /// + Segment = 2, +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Audio/AudioTranscription.Serialization.cs b/.dotnet/src/Custom/Audio/AudioTranscription.Serialization.cs new file mode 100644 index 000000000..7f717d9cf --- /dev/null +++ b/.dotnet/src/Custom/Audio/AudioTranscription.Serialization.cs @@ -0,0 +1,28 @@ +using System; +using System.ClientModel.Primitives; +using System.Text.Json; + +namespace OpenAI.Audio; + +public partial class AudioTranscription +{ + internal static AudioTranscription FromResponse(PipelineResponse response) + { + // Customization: handle plain text responses (SRT/VTT formats) + if (response?.Headers?.TryGetValue("Content-Type", out string contentType) == true && + contentType.StartsWith("text/plain", StringComparison.Ordinal)) + { + return new AudioTranscription( + InternalCreateTranscriptionResponseVerboseJsonTask.Transcribe, + language: null, + duration: null, + text: response.Content?.ToString(), + words: [], + segments: [], + serializedAdditionalRawData: new ChangeTrackingDictionary()); + } + + using var document = JsonDocument.Parse(response.Content); + return DeserializeAudioTranscription(document.RootElement); + } +} diff --git a/.dotnet/src/Custom/Audio/AudioTranscription.cs b/.dotnet/src/Custom/Audio/AudioTranscription.cs new file mode 100644 index 000000000..a8483d33b --- /dev/null +++ b/.dotnet/src/Custom/Audio/AudioTranscription.cs @@ -0,0 +1,14 @@ +using System; + +namespace OpenAI.Audio; + +[CodeGenModel("CreateTranscriptionResponseVerboseJson")] +public partial class AudioTranscription +{ + // CUSTOM: Made private. 
This property does not add value in the context of a strongly-typed class. + private InternalCreateTranscriptionResponseVerboseJsonTask Task { get; } = InternalCreateTranscriptionResponseVerboseJsonTask.Transcribe; + + // CUSTOM: Made nullable because this is an optional property. + /// The duration of the input audio. + public TimeSpan? Duration { get; } +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Audio/AudioTranscriptionFormat.cs b/.dotnet/src/Custom/Audio/AudioTranscriptionFormat.cs new file mode 100644 index 000000000..77a9a721b --- /dev/null +++ b/.dotnet/src/Custom/Audio/AudioTranscriptionFormat.cs @@ -0,0 +1,31 @@ +using System.ComponentModel; + +namespace OpenAI.Audio; + +/// +/// Specifies the format of the audio transcription. +/// +[CodeGenModel("CreateTranscriptionRequestResponseFormat1")] +public enum AudioTranscriptionFormat +{ + /// Text. + [CodeGenMember("Text")] + [EditorBrowsable(EditorBrowsableState.Never)] + Text, + + /// Simple. + [CodeGenMember("Json")] + Simple, + + /// Verbose. + [CodeGenMember("VerboseJson")] + Verbose, + + /// SRT. + [CodeGenMember("Srt")] + Srt, + + /// VTT. + [CodeGenMember("Vtt")] + Vtt, +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Audio/AudioTranscriptionOptions.cs b/.dotnet/src/Custom/Audio/AudioTranscriptionOptions.cs new file mode 100644 index 000000000..b13679e0c --- /dev/null +++ b/.dotnet/src/Custom/Audio/AudioTranscriptionOptions.cs @@ -0,0 +1,136 @@ +using OpenAI.Internal; +using System; +using System.Collections.Generic; +using System.IO; + +namespace OpenAI.Audio; + +[CodeGenModel("CreateTranscriptionRequest")] +[CodeGenSuppress("AudioTranscriptionOptions", typeof(BinaryData), typeof(InternalCreateTranscriptionRequestModel))] +public partial class AudioTranscriptionOptions +{ + // CUSTOM: Made internal. This value comes from a parameter on the client method. 
+ /// + /// The audio file object (not file name) to transcribe, in one of these formats: flac, mp3, mp4, + /// mpeg, mpga, m4a, ogg, pcm, wav, or webm. + /// + /// To assign a byte[] to this property use . + /// The byte[] will be serialized to a Base64 encoded string. + /// + /// + /// Examples: + /// + /// + /// BinaryData.FromBytes(new byte[] { 1, 2, 3 }) + /// Creates a payload of "AQID". + /// + /// + /// + /// + internal BinaryData File { get; } + + // CUSTOM: + // - Made internal. The model is specified by the client. + // - Added setter. + /// + /// ID of the model to use. Only `whisper-1` (which is powered by our open source Whisper V2 model) + /// is currently available. + /// + internal InternalCreateTranscriptionRequestModel Model { get; set; } + + // CUSTOM: Made internal. The model is specified by the client. + /// + /// The timestamp granularities to populate for this transcription. `response_format` must be set + /// `verbose_json` to use timestamp granularities. Either or both of these options are supported: + /// `word`, or `segment`. Note: There is no additional latency for segment timestamps, but + /// generating word timestamps incurs additional latency. + /// + /// To assign an object to the element of this property use . + /// + /// + /// To assign an already formatted json string to this property use . + /// + /// + /// Examples: + /// + /// + /// BinaryData.FromObjectAsJson("foo") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromString("\"foo\"") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromObjectAsJson(new { key = "value" }) + /// Creates a payload of { "key": "value" }. + /// + /// + /// BinaryData.FromString("{\"key\": \"value\"}") + /// Creates a payload of { "key": "value" }. + /// + /// + /// + /// + internal IList TimestampGranularities { get; } + + // CUSTOM: Made public now that there are no required properties. + /// Initializes a new instance of . 
+ public AudioTranscriptionOptions() + { + } + + /// + /// The timestamp granularities to populate for this transcription. + /// + public AudioTimestampGranularities Granularities { get; set; } + + internal MultipartFormDataBinaryContent ToMultipartContent(Stream audio, string audioFilename) + { + MultipartFormDataBinaryContent content = new(); + + content.Add(audio, "file", audioFilename); + content.Add(Model.ToString(), "model"); + + if (Language is not null) + { + content.Add(Language, "language"); + } + + if (Prompt is not null) + { + content.Add(Prompt, "prompt"); + } + + if (ResponseFormat is not null) + { + string value = ResponseFormat switch + { + AudioTranscriptionFormat.Text => "text", + AudioTranscriptionFormat.Simple => "json", + AudioTranscriptionFormat.Verbose => "verbose_json", + AudioTranscriptionFormat.Srt => "srt", + AudioTranscriptionFormat.Vtt => "vtt", + _ => throw new ArgumentException($"Unsupported response format: {ResponseFormat}.", nameof(ResponseFormat)) + }; + + content.Add(value, "response_format"); + } + + if (Temperature is not null) + { + content.Add(Temperature.Value, "temperature"); + } + + if (Granularities.HasFlag(AudioTimestampGranularities.Word)) + { + content.Add("word", "timestamp_granularities[]"); + } + + if (Granularities.HasFlag(AudioTimestampGranularities.Segment)) + { + content.Add("segment", "timestamp_granularities[]"); + } + + return content; + } +} diff --git a/.dotnet/src/Custom/Audio/AudioTranslation.Serialization.cs b/.dotnet/src/Custom/Audio/AudioTranslation.Serialization.cs new file mode 100644 index 000000000..637099849 --- /dev/null +++ b/.dotnet/src/Custom/Audio/AudioTranslation.Serialization.cs @@ -0,0 +1,27 @@ +using System; +using System.ClientModel.Primitives; +using System.Text.Json; + +namespace OpenAI.Audio; + +public partial class AudioTranslation +{ + internal static AudioTranslation FromResponse(PipelineResponse response) + { + // Customization: handle plain text responses (SRT/VTT formats) + if (response?.Headers?.TryGetValue("Content-Type", out string contentType) == true && +
contentType.StartsWith("text/plain", StringComparison.Ordinal)) + { + return new AudioTranslation( + InternalCreateTranslationResponseVerboseJsonTask.Translate, + language: null, + duration: null, + text: response.Content?.ToString(), + segments: [], + serializedAdditionalRawData: new ChangeTrackingDictionary()); + } + + using var document = JsonDocument.Parse(response.Content); + return DeserializeAudioTranslation(document.RootElement); + } +} diff --git a/.dotnet/src/Custom/Audio/AudioTranslation.cs b/.dotnet/src/Custom/Audio/AudioTranslation.cs new file mode 100644 index 000000000..d5dc6f060 --- /dev/null +++ b/.dotnet/src/Custom/Audio/AudioTranslation.cs @@ -0,0 +1,14 @@ +using System; + +namespace OpenAI.Audio; + +[CodeGenModel("CreateTranslationResponseVerboseJson")] +public partial class AudioTranslation +{ + // CUSTOM: Made private. This property does not add value in the context of a strongly-typed class. + private InternalCreateTranslationResponseVerboseJsonTask Task { get; } = InternalCreateTranslationResponseVerboseJsonTask.Translate; + + // CUSTOM: Made nullable because this is an optional property. + /// The duration of the input audio. + public TimeSpan? Duration { get; } +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Audio/AudioTranslationFormat.cs b/.dotnet/src/Custom/Audio/AudioTranslationFormat.cs new file mode 100644 index 000000000..84fd6686a --- /dev/null +++ b/.dotnet/src/Custom/Audio/AudioTranslationFormat.cs @@ -0,0 +1,31 @@ +using System.ComponentModel; + +namespace OpenAI.Audio; + +/// +/// Specifies the format of the audio translation. +/// +[CodeGenModel("CreateTranslationRequestResponseFormat")] +public enum AudioTranslationFormat +{ + /// Text. + [CodeGenMember("Text")] + [EditorBrowsable(EditorBrowsableState.Never)] + Text, + + /// Simple. + [CodeGenMember("Json")] + Simple, + + /// Verbose. + [CodeGenMember("VerboseJson")] + Verbose, + + /// SRT. + [CodeGenMember("Srt")] + Srt, + + /// VTT. 
+ [CodeGenMember("Vtt")] + Vtt, +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Audio/AudioTranslationOptions.cs b/.dotnet/src/Custom/Audio/AudioTranslationOptions.cs new file mode 100644 index 000000000..7632ed9c7 --- /dev/null +++ b/.dotnet/src/Custom/Audio/AudioTranslationOptions.cs @@ -0,0 +1,76 @@ +using OpenAI.Embeddings; +using OpenAI.Images; +using OpenAI.Internal; +using System; +using System.IO; + +namespace OpenAI.Audio; + +[CodeGenModel("CreateTranslationRequest")] +[CodeGenSuppress("AudioTranslationOptions", typeof(BinaryData), typeof(InternalCreateTranslationRequestModel))] +public partial class AudioTranslationOptions +{ + // CUSTOM: Made internal. This value comes from a parameter on the client method. + /// + /// The audio file object (not file name) to transcribe, in one of these formats: flac, mp3, mp4, + /// mpeg, mpga, m4a, ogg, pcm, wav, or webm. + /// + /// To assign a byte[] to this property use . + /// The byte[] will be serialized to a Base64 encoded string. + /// + /// + /// Examples: + /// + /// + /// BinaryData.FromBytes(new byte[] { 1, 2, 3 }) + /// Creates a payload of "AQID". + /// + /// + /// + /// + internal BinaryData File { get; } + + // CUSTOM: + // - Made internal. The model is specified by the client. + // - Added setter. + /// + /// ID of the model to use. Only `whisper-1` (which is powered by our open source Whisper V2 model) + /// is currently available. + /// + internal InternalCreateTranslationRequestModel Model { get; set; } + + // CUSTOM: Made public now that there are no required properties. + /// Initializes a new instance of . 
+    public AudioTranslationOptions()
+    {
+    }
+
+    internal MultipartFormDataBinaryContent ToMultipartContent(Stream audio, string audioFilename)
+    {
+        MultipartFormDataBinaryContent content = new();
+
+        content.Add(audio, "file", audioFilename);
+        content.Add(Model.ToString(), "model");
+
+        if (Prompt is not null)
+        {
+            content.Add(Prompt, "prompt");
+        }
+
+        if (ResponseFormat is not null)
+        {
+            string value = ResponseFormat switch
+            {
+                AudioTranslationFormat.Simple => "json",
+                AudioTranslationFormat.Verbose => "verbose_json",
+                AudioTranslationFormat.Srt => "srt",
+                AudioTranslationFormat.Vtt => "vtt",
+                _ => throw new ArgumentException($"Unsupported {nameof(ResponseFormat)} value.", nameof(ResponseFormat))
+            };
+
+            content.Add(value, "response_format");
+        }
+
+        return content;
+    }
+}
\ No newline at end of file
diff --git a/.dotnet/src/Custom/Audio/GeneratedSpeechFormat.cs b/.dotnet/src/Custom/Audio/GeneratedSpeechFormat.cs
new file mode 100644
index 000000000..4559bca96
--- /dev/null
+++ b/.dotnet/src/Custom/Audio/GeneratedSpeechFormat.cs
@@ -0,0 +1,32 @@
+namespace OpenAI.Audio;
+
+/// <summary>
+/// Represents an audio data format available as either input or output into an audio operation.
+/// </summary>
+[CodeGenModel("CreateSpeechRequestResponseFormat")]
+public enum GeneratedSpeechFormat
+{
+    /// <summary> MP3. </summary>
+    [CodeGenMember("Mp3")]
+    Mp3,
+
+    /// <summary> Opus. </summary>
+    [CodeGenMember("Opus")]
+    Opus,
+
+    /// <summary> AAC (advanced audio coding). </summary>
+    [CodeGenMember("Aac")]
+    Aac,
+
+    /// <summary> FLAC (free lossless audio codec). </summary>
+    [CodeGenMember("Flac")]
+    Flac,
+
+    /// <summary> WAV. </summary>
+    [CodeGenMember("Wav")]
+    Wav,
+
+    /// <summary> PCM (pulse-code modulation). </summary>
+    [CodeGenMember("Pcm")]
+    Pcm,
+}
\ No newline at end of file
diff --git a/.dotnet/src/Custom/Audio/GeneratedSpeechVoice.cs b/.dotnet/src/Custom/Audio/GeneratedSpeechVoice.cs
new file mode 100644
index 000000000..d0aa0653c
--- /dev/null
+++ b/.dotnet/src/Custom/Audio/GeneratedSpeechVoice.cs
@@ -0,0 +1,34 @@
+using System;
+
+namespace OpenAI.Audio;
+
+/// <summary>
+/// Represents the available text-to-speech voices.
+/// </summary>
+[CodeGenModel("CreateSpeechRequestVoice")]
+public enum GeneratedSpeechVoice
+{
+    /// <summary> Alloy. </summary>
+    [CodeGenMember("Alloy")]
+    Alloy,
+
+    /// <summary> Echo. </summary>
+    [CodeGenMember("Echo")]
+    Echo,
+
+    /// <summary> Fable. </summary>
+    [CodeGenMember("Fable")]
+    Fable,
+
+    /// <summary> Onyx. </summary>
+    [CodeGenMember("Onyx")]
+    Onyx,
+
+    /// <summary> Nova. </summary>
+    [CodeGenMember("Nova")]
+    Nova,
+
+    /// <summary> Shimmer. </summary>
+    [CodeGenMember("Shimmer")]
+    Shimmer,
+}
\ No newline at end of file
diff --git a/.dotnet/src/Custom/Audio/Internal/GeneratorStubs.cs b/.dotnet/src/Custom/Audio/Internal/GeneratorStubs.cs
new file mode 100644
index 000000000..22859b0d0
--- /dev/null
+++ b/.dotnet/src/Custom/Audio/Internal/GeneratorStubs.cs
@@ -0,0 +1,27 @@
+namespace OpenAI.Audio;
+
+// CUSTOM: Made internal.
+ +[CodeGenModel("CreateSpeechRequestModel")] +internal readonly partial struct InternalCreateSpeechRequestModel { } + +[CodeGenModel("CreateTranscriptionRequestModel")] +internal readonly partial struct InternalCreateTranscriptionRequestModel { } + +[CodeGenModel("CreateTranscriptionRequestTimestampGranularity")] +internal readonly partial struct InternalCreateTranscriptionRequestTimestampGranularity { } + +[CodeGenModel("CreateTranscriptionResponseJson")] +internal partial class InternalCreateTranscriptionResponseJson { } + +[CodeGenModel("CreateTranscriptionResponseVerboseJsonTask")] +internal readonly partial struct InternalCreateTranscriptionResponseVerboseJsonTask { } + +[CodeGenModel("CreateTranslationRequestModel")] +internal readonly partial struct InternalCreateTranslationRequestModel { } + +[CodeGenModel("CreateTranslationResponseJson")] +internal partial class InternalCreateTranslationResponseJson { } + +[CodeGenModel("CreateTranslationResponseVerboseJsonTask")] +internal readonly partial struct InternalCreateTranslationResponseVerboseJsonTask { } \ No newline at end of file diff --git a/.dotnet/src/Custom/Audio/OpenAIAudioModelFactory.cs b/.dotnet/src/Custom/Audio/OpenAIAudioModelFactory.cs new file mode 100644 index 000000000..636b7768f --- /dev/null +++ b/.dotnet/src/Custom/Audio/OpenAIAudioModelFactory.cs @@ -0,0 +1,72 @@ +using System; +using System.Collections.Generic; +using System.Linq; + +namespace OpenAI.Audio; + +/// Model factory for models. +public static partial class OpenAIAudioModelFactory +{ + /// Initializes a new instance of . + /// A new instance for mocking. + public static AudioTranscription AudioTranscription(string language = null, TimeSpan? 
duration = null, string text = null, IEnumerable words = null, IEnumerable segments = null) + { + words ??= new List(); + segments ??= new List(); + + return new AudioTranscription( + InternalCreateTranscriptionResponseVerboseJsonTask.Transcribe, + language, + duration, + text, + words.ToList(), + segments.ToList(), + serializedAdditionalRawData: null); + } + + /// Initializes a new instance of . + /// A new instance for mocking. + public static AudioTranslation AudioTranslation(string language = null, TimeSpan? duration = null, string text = null, IEnumerable segments = null) + { + segments ??= new List(); + + return new AudioTranslation( + InternalCreateTranslationResponseVerboseJsonTask.Translate, + language, + duration, + text, + segments.ToList(), + serializedAdditionalRawData: null); + } + + /// Initializes a new instance of . + /// A new instance for mocking. + public static TranscribedSegment TranscribedSegment(int id = default, long seekOffset = default, TimeSpan start = default, TimeSpan end = default, string text = null, IEnumerable tokenIds = null, float temperature = default, double averageLogProbability = default, float compressionRatio = default, double noSpeechProbability = default) + { + tokenIds ??= new List(); + + return new TranscribedSegment( + id, + seekOffset, + start, + end, + text, + tokenIds.ToList(), + temperature, + averageLogProbability, + compressionRatio, + noSpeechProbability, + serializedAdditionalRawData: null); + } + + /// Initializes a new instance of . + /// A new instance for mocking. 
+ public static TranscribedWord TranscribedWord(string word = null, TimeSpan start = default, TimeSpan end = default) + { + return new TranscribedWord( + word, + start, + end, + serializedAdditionalRawData: null); + } +} diff --git a/.dotnet/src/Custom/Audio/SpeechGenerationOptions.cs b/.dotnet/src/Custom/Audio/SpeechGenerationOptions.cs new file mode 100644 index 000000000..0edbdc02f --- /dev/null +++ b/.dotnet/src/Custom/Audio/SpeechGenerationOptions.cs @@ -0,0 +1,38 @@ +namespace OpenAI.Audio; + +/// +/// A representation of additional options available to control the behavior of a text-to-speech audio generation +/// operation. +/// +[CodeGenModel("CreateSpeechRequest")] +[CodeGenSuppress("SpeechGenerationOptions", typeof(InternalCreateSpeechRequestModel), typeof(string), typeof(GeneratedSpeechVoice))] +public partial class SpeechGenerationOptions +{ + // CUSTOM: + // - Made internal. The model is specified by the client. + // - Added setter. + /// One of the available [TTS models](/docs/models/tts): `tts-1` or `tts-1-hd`. + internal InternalCreateSpeechRequestModel Model { get; set; } + + // CUSTOM: + // - Made internal. This value comes from a parameter on the client method. + // - Added setter. + /// The text to generate audio for. The maximum length is 4096 characters. + internal string Input { get; set; } + + // CUSTOM: + // - Made internal. This value comes from a parameter on the client method. + // - Added setter. + /// + /// The voice to use when generating the audio. Supported voices are `alloy`, `echo`, `fable`, + /// `onyx`, `nova`, and `shimmer`. Previews of the voices are available in the + /// [Text to speech guide](/docs/guides/text-to-speech/voice-options). + /// + internal GeneratedSpeechVoice Voice { get; set; } + + // CUSTOM: Made public now that there are no required properties. + /// Initializes a new instance of . 
+ public SpeechGenerationOptions() + { + } +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Audio/TranscribedSegment.cs b/.dotnet/src/Custom/Audio/TranscribedSegment.cs new file mode 100644 index 000000000..20f23300a --- /dev/null +++ b/.dotnet/src/Custom/Audio/TranscribedSegment.cs @@ -0,0 +1,29 @@ +using System; +using System.Collections.Generic; +using System.Runtime.InteropServices; + +namespace OpenAI.Audio; + +[CodeGenModel("TranscriptionSegment")] +[StructLayout(LayoutKind.Auto)] +public readonly partial struct TranscribedSegment +{ + // CUSTOM: Remove setter. + internal IDictionary SerializedAdditionalRawData { get; } + + // CUSTOM: Rename. + [CodeGenMember("Seek")] + public long SeekOffset { get; } + + // CUSTOM: Rename. + [CodeGenMember("Tokens")] + public IReadOnlyList TokenIds { get; } + + // CUSTOM: Rename. + [CodeGenMember("AvgLogprob")] + public double AverageLogProbability { get; } + + // CUSTOM: Rename. + [CodeGenMember("NoSpeechProb")] + public double NoSpeechProbability { get; } +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Audio/TranscribedWord.cs b/.dotnet/src/Custom/Audio/TranscribedWord.cs new file mode 100644 index 000000000..f3a534b67 --- /dev/null +++ b/.dotnet/src/Custom/Audio/TranscribedWord.cs @@ -0,0 +1,13 @@ +using System; +using System.Collections.Generic; +using System.Runtime.InteropServices; + +namespace OpenAI.Audio; + +[CodeGenModel("TranscriptionWord")] +[StructLayout(LayoutKind.Auto)] +public readonly partial struct TranscribedWord +{ + // CUSTOM: Remove setter. 
+ internal IDictionary SerializedAdditionalRawData { get; } +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Batch/BatchClient.Protocol.cs b/.dotnet/src/Custom/Batch/BatchClient.Protocol.cs new file mode 100644 index 000000000..1505718a8 --- /dev/null +++ b/.dotnet/src/Custom/Batch/BatchClient.Protocol.cs @@ -0,0 +1,140 @@ +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Threading.Tasks; + +namespace OpenAI.Batch; + +[CodeGenSuppress("RetrieveBatch", typeof(string), typeof(RequestOptions))] +[CodeGenSuppress("RetrieveBatchAsync", typeof(string), typeof(RequestOptions))] +public partial class BatchClient +{ + /// + /// [Protocol Method] Creates and executes a batch from an uploaded file of requests + /// + /// The content to send as the body of the request. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// is null. + /// Service returned a non-success status code. + /// The response returned from the service. + public virtual async Task CreateBatchAsync(BinaryContent content, RequestOptions options = null) + { + Argument.AssertNotNull(content, nameof(content)); + + using PipelineMessage message = CreateCreateBatchRequest(content, options); + return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + /// + /// [Protocol Method] Creates and executes a batch from an uploaded file of requests + /// + /// The content to send as the body of the request. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// is null. + /// Service returned a non-success status code. + /// The response returned from the service. 
+ public virtual ClientResult CreateBatch(BinaryContent content, RequestOptions options = null) + { + Argument.AssertNotNull(content, nameof(content)); + + using PipelineMessage message = CreateCreateBatchRequest(content, options); + return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options)); + } + + /// + /// [Protocol Method] List your organization's batches. + /// + /// A cursor for use in pagination. `after` is an object ID that defines your place in the list. For instance, if you make a list request and receive 100 objects, ending with obj_foo, your subsequent call can include after=obj_foo in order to fetch the next page of the list. + /// A limit on the number of objects to be returned. Limit can range between 1 and 100, and the default is 20. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// Service returned a non-success status code. + /// The response returned from the service. + public virtual IAsyncEnumerable GetBatchesAsync(string after, int? limit, RequestOptions options) + { + BatchesPageEnumerator enumerator = new BatchesPageEnumerator(_pipeline, _endpoint, after, limit, options); + return PageCollectionHelpers.CreateAsync(enumerator); + } + + /// + /// [Protocol Method] List your organization's batches. + /// + /// A cursor for use in pagination. `after` is an object ID that defines your place in the list. For instance, if you make a list request and receive 100 objects, ending with obj_foo, your subsequent call can include after=obj_foo in order to fetch the next page of the list. + /// A limit on the number of objects to be returned. Limit can range between 1 and 100, and the default is 20. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// Service returned a non-success status code. + /// The response returned from the service. + public virtual IEnumerable GetBatches(string after, int? 
limit, RequestOptions options) + { + BatchesPageEnumerator enumerator = new BatchesPageEnumerator(_pipeline, _endpoint, after, limit, options); + return PageCollectionHelpers.Create(enumerator); + } + + /// + /// [Protocol Method] Retrieves a batch. + /// + /// The ID of the batch to retrieve. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// is null. + /// is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// The response returned from the service. + public virtual async Task GetBatchAsync(string batchId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(batchId, nameof(batchId)); + + using PipelineMessage message = CreateRetrieveBatchRequest(batchId, options); + return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + /// + /// [Protocol Method] Retrieves a batch. + /// + /// The ID of the batch to retrieve. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// is null. + /// is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// The response returned from the service. + public virtual ClientResult GetBatch(string batchId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(batchId, nameof(batchId)); + + using PipelineMessage message = CreateRetrieveBatchRequest(batchId, options); + return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options)); + } + + /// + /// [Protocol Method] Cancels an in-progress batch. + /// + /// The ID of the batch to cancel. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// is null. + /// is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// The response returned from the service. 
+ public virtual async Task CancelBatchAsync(string batchId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(batchId, nameof(batchId)); + + using PipelineMessage message = CreateCancelBatchRequest(batchId, options); + return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + /// + /// [Protocol Method] Cancels an in-progress batch. + /// + /// The ID of the batch to cancel. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// is null. + /// is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// The response returned from the service. + public virtual ClientResult CancelBatch(string batchId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(batchId, nameof(batchId)); + + using PipelineMessage message = CreateCancelBatchRequest(batchId, options); + return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options)); + } +} diff --git a/.dotnet/src/Custom/Batch/BatchClient.cs b/.dotnet/src/Custom/Batch/BatchClient.cs new file mode 100644 index 000000000..f84fa5b74 --- /dev/null +++ b/.dotnet/src/Custom/Batch/BatchClient.cs @@ -0,0 +1,67 @@ +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; + +namespace OpenAI.Batch; + +// CUSTOM: +// - Renamed. +// - Suppressed constructor that takes endpoint parameter; endpoint is now a property in the options class. +// - Suppressed convenience methods for now. +/// The service client for OpenAI batch operations. 
+[CodeGenClient("Batches")] +[CodeGenSuppress("BatchClient", typeof(ClientPipeline), typeof(ApiKeyCredential), typeof(Uri))] +[CodeGenSuppress("CreateBatch", typeof(string), typeof(InternalCreateBatchRequestEndpoint), typeof(InternalBatchCompletionTimeframe), typeof(IDictionary))] +[CodeGenSuppress("CreateBatchAsync", typeof(string), typeof(InternalCreateBatchRequestEndpoint), typeof(InternalBatchCompletionTimeframe), typeof(IDictionary))] +[CodeGenSuppress("RetrieveBatch", typeof(string))] +[CodeGenSuppress("RetrieveBatchAsync", typeof(string))] +[CodeGenSuppress("CancelBatch", typeof(string))] +[CodeGenSuppress("CancelBatchAsync", typeof(string))] +[CodeGenSuppress("GetBatches", typeof(string), typeof(int?))] +[CodeGenSuppress("GetBatchesAsync", typeof(string), typeof(int?))] +public partial class BatchClient +{ + // CUSTOM: + // - Used a custom pipeline. + // - Demoted the endpoint parameter to be a property in the options class. + /// Initializes a new instance of . + /// The API key to authenticate with the service. + /// is null. + public BatchClient(ApiKeyCredential credential) : this(credential, new OpenAIClientOptions()) + { + } + + // CUSTOM: + // - Used a custom pipeline. + // - Demoted the endpoint parameter to be a property in the options class. + /// Initializes a new instance of . + /// The API key to authenticate with the service. + /// The options to configure the client. + /// is null. + public BatchClient(ApiKeyCredential credential, OpenAIClientOptions options) + { + Argument.AssertNotNull(credential, nameof(credential)); + options ??= new OpenAIClientOptions(); + + _pipeline = OpenAIClient.CreatePipeline(credential, options); + _endpoint = OpenAIClient.GetEndpoint(options); + } + + // CUSTOM: + // - Used a custom pipeline. + // - Demoted the endpoint parameter to be a property in the options class. + // - Made protected. + /// Initializes a new instance of . + /// The HTTP pipeline to send and receive REST requests and responses. 
+ /// The options to configure the client. + /// is null. + protected internal BatchClient(ClientPipeline pipeline, OpenAIClientOptions options) + { + Argument.AssertNotNull(pipeline, nameof(pipeline)); + options ??= new OpenAIClientOptions(); + + _pipeline = pipeline; + _endpoint = OpenAIClient.GetEndpoint(options); + } +} diff --git a/.dotnet/src/Custom/Batch/Internal/GeneratorStubs.cs b/.dotnet/src/Custom/Batch/Internal/GeneratorStubs.cs new file mode 100644 index 000000000..a3675b6f1 --- /dev/null +++ b/.dotnet/src/Custom/Batch/Internal/GeneratorStubs.cs @@ -0,0 +1,54 @@ +namespace OpenAI.Batch; + +// CUSTOM: Made internal. + +[CodeGenModel("CreateBatchRequestCompletionWindow")] +internal readonly partial struct InternalBatchCompletionTimeframe { } + +[CodeGenModel("BatchErrorsDatum")] +internal partial class InternalBatchError { } + +[CodeGenModel("BatchErrors")] +internal partial class InternalBatchErrors { } + +[CodeGenModel("BatchErrorsObject")] +internal readonly partial struct InternalBatchErrorsObject { } + +[CodeGenModel("Batch")] +internal partial class InternalBatchJob { } + +[CodeGenModel("BatchObject")] +internal readonly partial struct InternalBatchObject { } + +[CodeGenModel("BatchRequestCounts")] +internal partial class InternalBatchRequestCounts { } + +[CodeGenModel("BatchRequestInput")] +internal partial class InternalBatchRequestInput { } + +[CodeGenModel("BatchRequestInputMethod")] +internal readonly partial struct InternalBatchRequestInputMethod { } + +[CodeGenModel("BatchRequestOutput")] +internal partial class InternalBatchRequestOutput { } + +[CodeGenModel("BatchRequestOutputError")] +internal partial class InternalBatchRequestOutputError { } + +[CodeGenModel("BatchRequestOutputResponse")] +internal partial class InternalBatchRequestOutputResponse { } + +[CodeGenModel("BatchStatus")] +internal readonly partial struct InternalBatchStatus { } + +[CodeGenModel("CreateBatchRequest")] +internal partial class InternalCreateBatchRequest { } + 
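The batch stubs above include `InternalListBatchesResponse`, whose `last_id` and `has_more` fields drive the cursor-based pagination implemented by `BatchesPageEnumerator` later in this diff. A minimal, self-contained sketch of that continuation logic using only `System.Text.Json` (the JSON payload is illustrative, not a real service response):

```csharp
using System;
using System.Text.Json;

// Hypothetical list-batches page in the shape modeled by InternalListBatchesResponse.
string page = "{\"object\":\"list\",\"data\":[{\"id\":\"batch_1\"},{\"id\":\"batch_2\"}]," +
              "\"first_id\":\"batch_1\",\"last_id\":\"batch_2\",\"has_more\":true}";

using JsonDocument doc = JsonDocument.Parse(page);

// The enumerator reads the continuation state from the raw response like this:
// has_more decides whether another request is issued, and last_id becomes the
// next request's "after" query parameter.
bool hasMore = doc.RootElement.GetProperty("has_more").GetBoolean();
string after = doc.RootElement.GetProperty("last_id").GetString();

Console.WriteLine($"has_more={hasMore}, next after={after}");
```

Because the protocol methods are page-oriented, each element of the returned `IEnumerable` is one raw service response; callers repeat this parse-and-continue step until `has_more` is `false`.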
+[CodeGenModel("CreateBatchRequestEndpoint")] +internal readonly partial struct InternalCreateBatchRequestEndpoint { } + +[CodeGenModel("ListBatchesResponse")] +internal partial class InternalListBatchesResponse { } + +[CodeGenModel("ListBatchesResponseObject")] +internal readonly partial struct InternalListBatchesResponseObject { } \ No newline at end of file diff --git a/.dotnet/src/Custom/Batch/Internal/Pagination/BatchesPageEnumerator.cs b/.dotnet/src/Custom/Batch/Internal/Pagination/BatchesPageEnumerator.cs new file mode 100644 index 000000000..f8055d3a4 --- /dev/null +++ b/.dotnet/src/Custom/Batch/Internal/Pagination/BatchesPageEnumerator.cs @@ -0,0 +1,108 @@ +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Text.Json; +using System.Threading.Tasks; + +#nullable enable + +namespace OpenAI.Batch; + +internal partial class BatchesPageEnumerator : PageResultEnumerator +{ + private readonly ClientPipeline _pipeline; + private readonly Uri _endpoint; + + private readonly int? _limit; + private readonly RequestOptions _options; + + private string _after; + + public BatchesPageEnumerator( + ClientPipeline pipeline, + Uri endpoint, + string after, int? 
limit, + RequestOptions options) + { + _pipeline = pipeline; + _endpoint = endpoint; + + _after = after; + _limit = limit; + _options = options; + } + + public override async Task GetFirstAsync() + => await GetBatchesAsync(_after, _limit, _options).ConfigureAwait(false); + + public override ClientResult GetFirst() + => GetBatches(_after, _limit, _options); + + public override async Task GetNextAsync(ClientResult result) + { + PipelineResponse response = result.GetRawResponse(); + + using JsonDocument doc = JsonDocument.Parse(response.Content); + _after = doc.RootElement.GetProperty("last_id"u8).GetString()!; + + return await GetBatchesAsync(_after, _limit, _options).ConfigureAwait(false); + } + + public override ClientResult GetNext(ClientResult result) + { + PipelineResponse response = result.GetRawResponse(); + + using JsonDocument doc = JsonDocument.Parse(response.Content); + _after = doc.RootElement.GetProperty("last_id"u8).GetString()!; + + return GetBatches(_after, _limit, _options); + } + + public override bool HasNext(ClientResult result) + { + PipelineResponse response = result.GetRawResponse(); + + using JsonDocument doc = JsonDocument.Parse(response.Content); + bool hasMore = doc.RootElement.GetProperty("has_more"u8).GetBoolean(); + + return hasMore; + } + + internal virtual async Task GetBatchesAsync(string after, int? limit, RequestOptions options) + { + using PipelineMessage message = CreateGetBatchesRequest(after, limit, options); + return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + internal virtual ClientResult GetBatches(string after, int? limit, RequestOptions options) + { + using PipelineMessage message = CreateGetBatchesRequest(after, limit, options); + return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options)); + } + + internal PipelineMessage CreateGetBatchesRequest(string after, int? 
limit, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "GET"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/batches", false); + if (after != null) + { + uri.AppendQuery("after", after, true); + } + if (limit != null) + { + uri.AppendQuery("limit", limit.Value, true); + } + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + message.Apply(options); + return message; + } + + private static PipelineMessageClassifier? _pipelineMessageClassifier200; + private static PipelineMessageClassifier PipelineMessageClassifier200 => _pipelineMessageClassifier200 ??= PipelineMessageClassifier.Create(stackalloc ushort[] { 200 }); +} diff --git a/.dotnet/src/Custom/Chat/AssistantChatMessage.Serialization.cs b/.dotnet/src/Custom/Chat/AssistantChatMessage.Serialization.cs new file mode 100644 index 000000000..44749c211 --- /dev/null +++ b/.dotnet/src/Custom/Chat/AssistantChatMessage.Serialization.cs @@ -0,0 +1,30 @@ +using System.ClientModel.Primitives; +using System.Collections; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Chat; + +[CodeGenSuppress("global::System.ClientModel.Primitives.IJsonModel.Write", typeof(Utf8JsonWriter), typeof(ModelReaderWriterOptions))] +public partial class AssistantChatMessage : IJsonModel +{ + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + => CustomSerializationHelpers.SerializeInstance(this, SerializeAssistantChatMessage, writer, options); + + internal static void SerializeAssistantChatMessage(AssistantChatMessage instance, Utf8JsonWriter writer, ModelReaderWriterOptions options) + => instance.WriteCore(writer, options); + + protected override void WriteCore(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + writer.WriteStartObject(); + writer.WritePropertyName("role"u8); + 
writer.WriteStringValue(Role.ToSerialString()); + ChatMessageContentPart.WriteCoreContentPartList(Content, writer, options); + writer.WriteOptionalProperty("refusal"u8, Refusal, options); + writer.WriteOptionalProperty("name"u8, ParticipantName, options); + writer.WriteOptionalCollection("tool_calls"u8, ToolCalls, options); + writer.WriteOptionalProperty("function_call"u8, FunctionCall, options); + writer.WriteSerializedAdditionalRawData(SerializedAdditionalRawData, options); + writer.WriteEndObject(); + } +} diff --git a/.dotnet/src/Custom/Chat/AssistantChatMessage.cs b/.dotnet/src/Custom/Chat/AssistantChatMessage.cs new file mode 100644 index 000000000..46928d9a4 --- /dev/null +++ b/.dotnet/src/Custom/Chat/AssistantChatMessage.cs @@ -0,0 +1,128 @@ +using System; +using System.Collections.Generic; + +namespace OpenAI.Chat; + +/// +/// Represents a chat message of the assistant role as supplied to a chat completion request. As assistant +/// messages are originated by the model on responses, instances typically +/// represent chat history or example interactions to guide model behavior. +/// +[CodeGenModel("ChatCompletionRequestAssistantMessage")] +public partial class AssistantChatMessage : ChatMessage +{ + /// + /// Creates a new instance of using a collection of content items. + /// For assistant messages, this can be one or more of type text or exactly one of type refusal. + /// + /// + /// The collection of content items associated with the message. + /// + public AssistantChatMessage(IEnumerable contentParts) + : base(ChatMessageRole.Assistant, contentParts) + { + Argument.AssertNotNullOrEmpty(contentParts, nameof(contentParts)); + } + + /// + /// Creates a new instance of using a collection of content items. + /// For assistant messages, this can be one or more of type text or exactly one of type refusal. + /// + /// + /// The collection of text and image content items associated with the message. 
+ /// + public AssistantChatMessage(params ChatMessageContentPart[] contentParts) + : base(ChatMessageRole.Assistant, contentParts) + { + Argument.AssertNotNullOrEmpty(contentParts, nameof(contentParts)); + } + + /// + /// Creates a new instance of that represents ordinary text content and + /// does not feature tool or function calls. + /// + /// The text content of the message. + public AssistantChatMessage(string content) + : base(ChatMessageRole.Assistant, content) + { + Argument.AssertNotNull(content, nameof(content)); + } + + /// + /// Creates a new instance of that represents tool_calls that + /// were provided by the model. + /// + /// The tool_calls made by the model. + /// Optional text content associated with the message. + public AssistantChatMessage(IEnumerable toolCalls, string content = null) + : base(ChatMessageRole.Assistant, content) + { + Argument.AssertNotNull(toolCalls, nameof(toolCalls)); + + foreach (ChatToolCall toolCall in toolCalls) + { + ToolCalls.Add(toolCall); + } + } + + /// + /// Creates a new instance of that represents a function_call + /// (deprecated in favor of tool_calls) that was made by the model. + /// + /// The function_call made by the model. + /// Optional text content associated with the message. + public AssistantChatMessage(ChatFunctionCall functionCall, string content = null) + : base(ChatMessageRole.Assistant, content) + { + Argument.AssertNotNull(functionCall, nameof(functionCall)); + + FunctionCall = functionCall; + } + + /// + /// Creates a new instance of from a with + /// an assistant role response. + /// + /// + /// This constructor will copy the content, tool_calls, and function_call from a chat + /// completion response into a new assistant role request message. + /// + /// + /// The from which the conversation history request message should be created. + /// + /// + /// The role of the provided chat completion response was not . 
+ /// + public AssistantChatMessage(ChatCompletion chatCompletion) + : base(ChatMessageRole.Assistant, chatCompletion?.Content) + { + Argument.AssertNotNull(chatCompletion, nameof(chatCompletion)); + + if (chatCompletion.Role != ChatMessageRole.Assistant) + { + throw new NotSupportedException($"Cannot instantiate an {nameof(AssistantChatMessage)} from a {nameof(ChatCompletion)} with role: {chatCompletion.Role}."); + } + + Refusal = chatCompletion.Refusal; + FunctionCall = chatCompletion.FunctionCall; + foreach (ChatToolCall toolCall in chatCompletion.ToolCalls ?? []) + { + ToolCalls.Add(toolCall); + } + } + + // CUSTOM: Renamed. + /// + /// An optional name associated with the assistant message. This is typically defined with a system + /// message and is used to differentiate between multiple participants of the same role. + /// + [CodeGenMember("Name")] + public string ParticipantName { get; set; } + + // CUSTOM: Common initialization for input model collection property. + [CodeGenMember("ToolCalls")] + public IList ToolCalls { get; } = new ChangeTrackingList(); + + // CUSTOM: Made internal. + internal AssistantChatMessage() { } +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Chat/ChatClient.Protocol.cs b/.dotnet/src/Custom/Chat/ChatClient.Protocol.cs new file mode 100644 index 000000000..21a03c811 --- /dev/null +++ b/.dotnet/src/Custom/Chat/ChatClient.Protocol.cs @@ -0,0 +1,47 @@ +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.ComponentModel; +using System.Threading.Tasks; + +namespace OpenAI.Chat; + +/// The service client for the OpenAI Chat Completions endpoint. +[CodeGenSuppress("CreateChatCompletionAsync", typeof(BinaryContent), typeof(RequestOptions))] +[CodeGenSuppress("CreateChatCompletion", typeof(BinaryContent), typeof(RequestOptions))] +public partial class ChatClient +{ + /// + /// [Protocol Method] Creates a model response for the given chat conversation. 
+ /// + /// The content to send as the body of the request. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// is null. + /// Service returned a non-success status code. + /// The response returned from the service. + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual async Task CompleteChatAsync(BinaryContent content, RequestOptions options = null) + { + Argument.AssertNotNull(content, nameof(content)); + + using PipelineMessage message = CreateCreateChatCompletionRequest(content, options); + return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + /// + /// [Protocol Method] Creates a model response for the given chat conversation. + /// + /// The content to send as the body of the request. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// is null. + /// Service returned a non-success status code. + /// The response returned from the service. + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual ClientResult CompleteChat(BinaryContent content, RequestOptions options = null) + { + Argument.AssertNotNull(content, nameof(content)); + + using PipelineMessage message = CreateCreateChatCompletionRequest(content, options); + return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options)); + } +} diff --git a/.dotnet/src/Custom/Chat/ChatClient.cs b/.dotnet/src/Custom/Chat/ChatClient.cs new file mode 100644 index 000000000..1eb1c0e1a --- /dev/null +++ b/.dotnet/src/Custom/Chat/ChatClient.cs @@ -0,0 +1,249 @@ +using OpenAI.Telemetry; +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Linq; +using System.Threading; +using System.Threading.Tasks; + +namespace OpenAI.Chat; + +// CUSTOM: +// - Renamed. 
+// - Suppressed constructor that takes endpoint parameter; endpoint is now a property in the options class. +// - Suppressed methods that only take the options parameter. +/// The service client for OpenAI chat operations. +[CodeGenClient("Chat")] +[CodeGenSuppress("ChatClient", typeof(ClientPipeline), typeof(ApiKeyCredential), typeof(Uri))] +[CodeGenSuppress("CreateChatCompletionAsync", typeof(ChatCompletionOptions))] +[CodeGenSuppress("CreateChatCompletion", typeof(ChatCompletionOptions))] +public partial class ChatClient +{ + private readonly string _model; + private readonly OpenTelemetrySource _telemetry; + + // CUSTOM: + // - Added `model` parameter. + // - Used a custom pipeline. + // - Demoted the endpoint parameter to be a property in the options class. + /// Initializes a new instance of . + /// The name of the model to use in requests sent to the service. To learn more about the available models, see . + /// The API key to authenticate with the service. + /// or is null. + /// is an empty string, and was expected to be non-empty. + public ChatClient(string model, ApiKeyCredential credential) : this(model, credential, new OpenAIClientOptions()) + { + } + + // CUSTOM: + // - Added `model` parameter. + // - Used a custom pipeline. + // - Demoted the endpoint parameter to be a property in the options class. + // - Added telemetry support. + /// Initializes a new instance of . + /// The name of the model to use in requests sent to the service. To learn more about the available models, see . + /// The API key to authenticate with the service. + /// The options to configure the client. + /// or is null. + /// is an empty string, and was expected to be non-empty. 
+ public ChatClient(string model, ApiKeyCredential credential, OpenAIClientOptions options) + { + Argument.AssertNotNullOrEmpty(model, nameof(model)); + Argument.AssertNotNull(credential, nameof(credential)); + options ??= new OpenAIClientOptions(); + + _model = model; + _pipeline = OpenAIClient.CreatePipeline(credential, options); + _endpoint = OpenAIClient.GetEndpoint(options); + _telemetry = new OpenTelemetrySource(model, _endpoint); + } + + // CUSTOM: + // - Added `model` parameter. + // - Used a custom pipeline. + // - Demoted the endpoint parameter to be a property in the options class. + // - Added telemetry support. + // - Made protected. + /// Initializes a new instance of . + /// The HTTP pipeline to send and receive REST requests and responses. + /// The name of the model to use in requests sent to the service. To learn more about the available models, see . + /// The options to configure the client. + /// or is null. + /// is an empty string, and was expected to be non-empty. + protected internal ChatClient(ClientPipeline pipeline, string model, OpenAIClientOptions options) + { + Argument.AssertNotNull(pipeline, nameof(pipeline)); + Argument.AssertNotNullOrEmpty(model, nameof(model)); + options ??= new OpenAIClientOptions(); + + _model = model; + _pipeline = pipeline; + _endpoint = OpenAIClient.GetEndpoint(options); + _telemetry = new OpenTelemetrySource(model, _endpoint); + } + + /// Generates a completion for the given chat. + /// The messages comprising the chat so far. + /// The options to configure the chat completion. + /// A token that can be used to cancel this method call. + /// is null. + /// is an empty collection, and was expected to be non-empty. 
+ public virtual async Task> CompleteChatAsync(IEnumerable messages, ChatCompletionOptions options = null, CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(messages, nameof(messages)); + + options ??= new(); + CreateChatCompletionOptions(messages, ref options); + using OpenTelemetryScope scope = _telemetry.StartChatScope(options); + + try + { + using BinaryContent content = options.ToBinaryContent(); + + ClientResult result = await CompleteChatAsync(content, cancellationToken.ToRequestOptions()).ConfigureAwait(false); + ChatCompletion chatCompletion = ChatCompletion.FromResponse(result.GetRawResponse()); + scope?.RecordChatCompletion(chatCompletion); + return ClientResult.FromValue(chatCompletion, result.GetRawResponse()); + } + catch (Exception ex) + { + scope?.RecordException(ex); + throw; + } + } + + /// Generates a completion for the given chat. + /// The messages comprising the chat so far. + /// The options to configure the chat completion. + /// A token that can be used to cancel this method call. + /// is null. + /// is an empty collection, and was expected to be non-empty. + public virtual ClientResult CompleteChat(IEnumerable messages, ChatCompletionOptions options = null, CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(messages, nameof(messages)); + + options ??= new(); + CreateChatCompletionOptions(messages, ref options); + using OpenTelemetryScope scope = _telemetry.StartChatScope(options); + + try + { + using BinaryContent content = options.ToBinaryContent(); + ClientResult result = CompleteChat(content, cancellationToken.ToRequestOptions()); + ChatCompletion chatCompletion = ChatCompletion.FromResponse(result.GetRawResponse()); + + scope?.RecordChatCompletion(chatCompletion); + return ClientResult.FromValue(chatCompletion, result.GetRawResponse()); + } + catch (Exception ex) + { + scope?.RecordException(ex); + throw; + } + } + + /// Generates a completion for the given chat. 
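As a usage sketch of the convenience surface defined above (the model name and API key below are placeholders, and the exact generic return type may differ across preview versions):

```csharp
using System;
using System.ClientModel;
using OpenAI.Chat;

// Placeholder model name and key; substitute real values.
ChatClient client = new("gpt-4o-mini", new ApiKeyCredential("<api-key>"));

ClientResult<ChatCompletion> result = client.CompleteChat(
    new UserChatMessage("Say hello in one word."));

// ChatCompletion flattens the single returned choice, so content is directly accessible.
Console.WriteLine(result.Value.Content[0].Text);
```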
+ /// The messages comprising the chat so far. + /// is null. + /// is an empty collection, and was expected to be non-empty. + public virtual async Task> CompleteChatAsync(params ChatMessage[] messages) + => await CompleteChatAsync(messages, default(ChatCompletionOptions)).ConfigureAwait(false); + + /// Generates a completion for the given chat. + /// The messages comprising the chat so far. + /// is null. + /// is an empty collection, and was expected to be non-empty. + public virtual ClientResult CompleteChat(params ChatMessage[] messages) + => CompleteChat(messages, default(ChatCompletionOptions)); + + /// + /// Generates a completion for the given chat. The completion is streamed back token by token as it is being + /// generated by the model instead of waiting for it to be finished first. + /// + /// + /// implements the interface and can be + /// enumerated over using the await foreach pattern. + /// + /// The messages comprising the chat so far. + /// The options to configure the chat completion. + /// A token that can be used to cancel this method call. + /// is null. + /// is an empty collection, and was expected to be non-empty. + public virtual AsyncCollectionResult CompleteChatStreamingAsync(IEnumerable messages, ChatCompletionOptions options = null, CancellationToken cancellationToken = default) + { + Argument.AssertNotNull(messages, nameof(messages)); + + options ??= new(); + CreateChatCompletionOptions(messages, ref options, stream: true); + + using BinaryContent content = options.ToBinaryContent(); + + async Task getResultAsync() => + await CompleteChatAsync(content, cancellationToken.ToRequestOptions(streaming: true)).ConfigureAwait(false); + return new AsyncStreamingChatCompletionUpdateCollection(getResultAsync); + } + + /// + /// Generates a completion for the given chat. The completion is streamed back token by token as it is being + /// generated by the model instead of waiting for it to be finished first. 
+ /// + /// + /// implements the interface and can be + /// enumerated over using the await foreach pattern. + /// + /// The messages comprising the chat so far. + /// The options to configure the chat completion. + /// A token that can be used to cancel this method call. + /// is null. + /// is an empty collection, and was expected to be non-empty. + public virtual CollectionResult CompleteChatStreaming(IEnumerable messages, ChatCompletionOptions options = null, CancellationToken cancellationToken = default) + { + Argument.AssertNotNull(messages, nameof(messages)); + + options ??= new(); + CreateChatCompletionOptions(messages, ref options, stream: true); + + using BinaryContent content = options.ToBinaryContent(); + ClientResult getResult() => CompleteChat(content, cancellationToken.ToRequestOptions(streaming: true)); + return new StreamingChatCompletionUpdateCollection(getResult); + } + + /// + /// Generates a completion for the given chat. The completion is streamed back token by token as it is being + /// generated by the model instead of waiting for it to be finished first. + /// + /// + /// implements the interface and can be + /// enumerated over using the await foreach pattern. + /// + /// The messages comprising the chat so far. + /// is null. + /// is an empty collection, and was expected to be non-empty. + public virtual AsyncCollectionResult CompleteChatStreamingAsync(params ChatMessage[] messages) + => CompleteChatStreamingAsync(messages, default(ChatCompletionOptions)); + + /// + /// Generates a completion for the given chat. The completion is streamed back token by token as it is being + /// generated by the model instead of waiting for it to be finished first. + /// + /// + /// implements the interface and can be + /// enumerated over using the await foreach pattern. + /// + /// The messages comprising the chat so far. + /// is null. + /// is an empty collection, and was expected to be non-empty. 
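The streaming variants above can be consumed with the await foreach pattern; a sketch, assuming an existing `client` and that each streamed update exposes its content delta via a `ContentUpdate` collection:

```csharp
using System;
using OpenAI.Chat;

// Assumes `client` is a ChatClient constructed elsewhere.
await foreach (StreamingChatCompletionUpdate update
    in client.CompleteChatStreamingAsync(new UserChatMessage("Tell me a short story.")))
{
    foreach (ChatMessageContentPart part in update.ContentUpdate)
    {
        Console.Write(part.Text); // print tokens as they arrive
    }
}
```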
+ public virtual CollectionResult CompleteChatStreaming(params ChatMessage[] messages) + => CompleteChatStreaming(messages, default(ChatCompletionOptions)); + + private void CreateChatCompletionOptions(IEnumerable messages, ref ChatCompletionOptions options, bool stream = false) + { + options.Messages = messages.ToList(); + options.Model = _model; + options.Stream = stream + ? true + : null; + options.StreamOptions = stream ? options.StreamOptions : null; + } +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Chat/ChatCompletion.cs b/.dotnet/src/Custom/Chat/ChatCompletion.cs new file mode 100644 index 000000000..7f0681f3f --- /dev/null +++ b/.dotnet/src/Custom/Chat/ChatCompletion.cs @@ -0,0 +1,85 @@ +using System; +using System.ClientModel.Primitives; +using System.Collections.Generic; + +namespace OpenAI.Chat; + +[CodeGenModel("CreateChatCompletionResponse")] +public partial class ChatCompletion +{ + private IReadOnlyList _contentTokenLogProbabilities; + private IReadOnlyList _refusalTokenLogProbabilities; + + // CUSTOM: Made private. This property does not add value in the context of a strongly-typed class. + /// The object type, which is always `chat.completion`. + [CodeGenMember("Object")] + private InternalCreateChatCompletionResponseObject Object { get; } = InternalCreateChatCompletionResponseObject.ChatCompletion; + + // CUSTOM: Made internal. We only get back a single choice, and instead we flatten the structure for usability. + /// A list of chat completion choices. Can be more than one if `n` is greater than 1. + [CodeGenMember("Choices")] + internal IReadOnlyList Choices { get; } + + // CUSTOM: Renamed. + /// The Unix timestamp (in seconds) of when the chat completion was created. + [CodeGenMember("Created")] + public DateTimeOffset CreatedAt { get; } + + // CUSTOM: Flattened choice property. + /// + /// The reason the model stopped generating tokens. 
This will be `stop` if the model hit a natural stop point or a provided stop sequence, + /// `length` if the maximum number of tokens specified in the request was reached, + /// `content_filter` if content was omitted due to a flag from our content filters, + /// `tool_calls` if the model called a tool, or `function_call` (deprecated) if the model called a function. + /// + public ChatFinishReason FinishReason => Choices[0].FinishReason; + + // CUSTOM: Flattened choice logprobs property. + /// + /// Log probability information. + /// + public IReadOnlyList ContentTokenLogProbabilities => (Choices[0].Logprobs != null) + ? Choices[0].Logprobs.Content + : _contentTokenLogProbabilities ??= new ChangeTrackingList(); + + // CUSTOM: Flattened refusal logprobs property. + public IReadOnlyList RefusalTokenLogProbabilities => (Choices[0]?.Logprobs != null) + ? Choices[0].Logprobs.Refusal + : _refusalTokenLogProbabilities ??= new ChangeTrackingList(); + + // CUSTOM: Flattened choice message property. + /// + /// The role of the author of this message. + /// + public ChatMessageRole Role => Choices[0].Message.Role; + + // CUSTOM: Flattened choice message property. + /// + /// The contents of the message. + /// + public IReadOnlyList Content => Choices[0].Message.Content; + + // CUSTOM: Flattened choice message property. + /// + /// The tool calls. + /// + public IReadOnlyList ToolCalls => Choices[0].Message.ToolCalls; + + // CUSTOM: Flattened choice message property. + public ChatFunctionCall FunctionCall => Choices[0].Message.FunctionCall; + + // CUSTOM: Flattened choice message property. + public string Refusal => Choices[0].Message.Refusal; + + /// + /// Returns text representation of the first part of the first choice. + /// + /// + public override string ToString() => Content.Count > 0 ? Content[0].Text + : ToolCalls.Count > 0 ? ModelReaderWriter.Write(ToolCalls[0]).ToString() + : null; + + // CUSTOM: Made internal. 
+ [CodeGenMember("ServiceTier")] + internal InternalCreateChatCompletionResponseServiceTier? _serviceTier; +} diff --git a/.dotnet/src/Custom/Chat/ChatCompletionOptions.Serialization.cs b/.dotnet/src/Custom/Chat/ChatCompletionOptions.Serialization.cs new file mode 100644 index 000000000..30e23a50b --- /dev/null +++ b/.dotnet/src/Custom/Chat/ChatCompletionOptions.Serialization.cs @@ -0,0 +1,78 @@ +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Globalization; +using System.Runtime.CompilerServices; +using System.Text.Json; + +namespace OpenAI.Chat; + +public partial class ChatCompletionOptions +{ + // CUSTOM: Added custom serialization to treat a single string as a collection of strings with one item. + [MethodImpl(MethodImplOptions.AggressiveInlining)] + private void SerializeStopSequencesValue(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + writer.WriteStartArray(); + foreach (var item in StopSequences) + { + writer.WriteStringValue(item); + } + writer.WriteEndArray(); + } + + // CUSTOM: Added custom serialization to treat a single string as a collection of strings with one item. + [MethodImpl(MethodImplOptions.AggressiveInlining)] + private static void DeserializeStopSequencesValue(JsonProperty property, ref IList stop) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + stop = null; + } + else if (property.Value.ValueKind == JsonValueKind.String) + { + List array = [property.Value.GetString()]; + stop = array; + } + else + { + List array = []; + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(item.GetString()); + } + stop = array; + } + } + + // CUSTOM: Added custom serialization to represent tokens as integers instead of strings. 
+ [MethodImpl(MethodImplOptions.AggressiveInlining)] + private void SerializeLogitBiasesValue(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + writer.WriteStartObject(); + foreach (var item in LogitBiases) + { + writer.WritePropertyName(item.Key.ToString(CultureInfo.InvariantCulture)); + writer.WriteNumberValue(item.Value); + } + writer.WriteEndObject(); + } + + // CUSTOM: Added custom serialization to represent tokens as integers instead of strings. + [MethodImpl(MethodImplOptions.AggressiveInlining)] + private static void DeserializeLogitBiasesValue(JsonProperty property, ref IDictionary logitBias) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + logitBias = null; + } + else + { + Dictionary dictionary = new Dictionary(); + foreach (var property0 in property.Value.EnumerateObject()) + { + dictionary.Add(int.Parse(property0.Name, CultureInfo.InvariantCulture), property0.Value.GetInt32()); + } + logitBias = dictionary; + } + } +} diff --git a/.dotnet/src/Custom/Chat/ChatCompletionOptions.cs b/.dotnet/src/Custom/Chat/ChatCompletionOptions.cs new file mode 100644 index 000000000..212195db1 --- /dev/null +++ b/.dotnet/src/Custom/Chat/ChatCompletionOptions.cs @@ -0,0 +1,139 @@ +using System.Collections.Generic; + +namespace OpenAI.Chat; + +/// +/// Request-level options for chat completion. +/// +[CodeGenModel("CreateChatCompletionRequest")] +[CodeGenSuppress("ChatCompletionOptions", typeof(IEnumerable), typeof(InternalCreateChatCompletionRequestModel))] +[CodeGenSerialization(nameof(StopSequences), SerializationValueHook = nameof(SerializeStopSequencesValue), DeserializationValueHook = nameof(DeserializeStopSequencesValue))] +[CodeGenSerialization(nameof(LogitBiases), SerializationValueHook = nameof(SerializeLogitBiasesValue), DeserializationValueHook = nameof(DeserializeLogitBiasesValue))] +public partial class ChatCompletionOptions +{ + // CUSTOM: + // - Made internal. This value comes from a parameter on the client method. 
+ // - Added setter. + /// + /// A list of messages comprising the conversation so far. [Example Python code](https://cookbook.openai.com/examples/how_to_format_inputs_to_chatgpt_models). + /// Please note is the base class. Depending on the scenario, a derived class of the base class might need to be assigned here, or this property needs to be cast to one of the possible derived classes. + /// The available derived classes include , , , and . + /// + [CodeGenMember("Messages")] + internal IList Messages { get; set; } + + // CUSTOM: + // - Made internal. This value comes from a parameter on the client method. + // - Added setter. + /// + /// ID of the model to use. See the model endpoint compatibility table for details on which models work with the Chat API. + /// + [CodeGenMember("Model")] + internal InternalCreateChatCompletionRequestModel Model { get; set; } + + // CUSTOM: Made internal. We only ever request a single choice. + /// How many chat completion choices to generate for each input message. Note that you will be charged based on the number of generated tokens across all of the choices. Keep `n` as `1` to minimize costs. + [CodeGenMember("N")] + internal int? N { get; set; } + + // CUSTOM: Made internal. We set this manually based on the client method that is called. + /// If set, partial message deltas will be sent, like in ChatGPT. Tokens will be sent as data-only [server-sent events](https://developer.mozilla.org/en-US/docs/Web/API/Server-sent_events/Using_server-sent_events#Event_stream_format) as they become available, with the stream terminated by a `data: [DONE]` message. [Example Python code](https://cookbook.openai.com/examples/how_to_stream_completions). + [CodeGenMember("Stream")] + internal bool? Stream { get; set; } + + /// Gets or sets the stream options. 
+ [CodeGenMember("StreamOptions")] + internal InternalChatCompletionStreamOptions StreamOptions { get; set; } + = new() { IncludeUsage = true }; + + // CUSTOM: Made public now that there are no required properties. + /// Initializes a new instance of for deserialization. + public ChatCompletionOptions() + { + LogitBiases = new ChangeTrackingDictionary(); + StopSequences = new ChangeTrackingList(); + Tools = new ChangeTrackingList(); + Functions = new ChangeTrackingList(); + } + + // CUSTOM: Renamed. + /// Whether to return log probabilities of the output tokens or not. If true, returns the log probabilities of each output token returned in the message content. + [CodeGenMember("Logprobs")] + public bool? IncludeLogProbabilities { get; set; } + + // CUSTOM: Renamed. + /// An integer between 0 and 20 specifying the number of most likely tokens to return at each token position, each with an associated log probability. must be set to if this property is used. + [CodeGenMember("TopLogprobs")] + public int? TopLogProbabilityCount { get; set; } + + // CUSTOM: + // - Renamed. + // - Changed type to treat a single string as a collection of strings with one item. + /// Up to 4 sequences where the API will stop generating further tokens. + [CodeGenMember("Stop")] + public IList StopSequences { get; } + + // CUSTOM: + // - Renamed. + // - Changed type to treat tokens as integers instead of strings. + /// + /// Modifies the likelihood of specified tokens appearing in the completion. It maps tokens (specified by their token ID in the tokenizer) to an associated bias value from -100 to 100. Mathematically, the bias is added to the logits generated by the model prior to sampling. The exact effect will vary per model, but values between -1 and 1 should decrease or increase likelihood of selection; values like -100 or 100 should result in a ban or exclusive selection of the relevant token. 
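Given the serialization hooks registered on this class, `StopSequences` and `LogitBiases` translate into the `stop` and `logit_bias` request fields; a sketch of populating them (the token ID below is hypothetical):

```csharp
using OpenAI.Chat;

ChatCompletionOptions options = new();
options.StopSequences.Add("\nUser:");  // serialized as: "stop": ["\nUser:"]
options.LogitBiases[50256] = -100;     // serialized as: "logit_bias": { "50256": -100 }

// Pass `options` to ChatClient.CompleteChat alongside the message list.
```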
+ /// + [CodeGenMember("LogitBias")] + public IDictionary LogitBiases { get; } + + // CUSTOM: Changed type to avoid BinaryData. + /// + /// Specifies which tool is called by the model, if any. means the model will not call any tool and instead generates a message. means the model can pick between generating a message or calling one or more tools. + /// means the model must call one or more tools. The model can also be forced to call a specific tool by constructing a new instance of while passing the desired as a constructor parameter. + /// + /// is the default behavior when no tools are present, while is the default if tools are present. + /// + /// + [CodeGenMember("ToolChoice")] + public ChatToolChoice ToolChoice { get; set; } + + // CUSTOM: + // - Renamed. + // - Changed type to avoid BinaryData. + /// + /// Deprecated in favor of . + /// + [CodeGenMember("FunctionCall")] + public ChatFunctionChoice FunctionChoice { get; set; } + + // CUSTOM: Renamed. + /// + /// Whether to enable parallel function calling during tool use. + /// + /// + /// Assumed true if not otherwise specified. + /// + [CodeGenMember("ParallelToolCalls")] + public bool? ParallelToolCallsEnabled { get; set; } + + /// + /// An object specifying the format that the model must output. + /// + /// + ///
+ /// Compatible with GPT-4o, GPT-4o mini, GPT-4 Turbo and all GPT-3.5 Turbo models newer than gpt-3.5-turbo-1106. + /// Learn more in the Structured Outputs guide. + ///
+ //[CodeGenMember("ResponseFormat")] + //public ChatResponseFormat ResponseFormat { get; set; } + + [CodeGenMember("ServiceTier")] + internal InternalCreateChatCompletionRequestServiceTier? _serviceTier; + + // CUSTOM: Renamed. + /// + /// A unique identifier representing your end-user, which can help OpenAI to monitor and detect abuse. + /// Learn more. + /// + [CodeGenMember("User")] + public string EndUserId { get; set; } +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Chat/ChatFinishReason.cs b/.dotnet/src/Custom/Chat/ChatFinishReason.cs new file mode 100644 index 000000000..85690c961 --- /dev/null +++ b/.dotnet/src/Custom/Chat/ChatFinishReason.cs @@ -0,0 +1,87 @@ +namespace OpenAI.Chat; + +/// +/// The reason the model stopped generating tokens. This will be: +/// +/// +/// Property +/// REST +/// Condition +/// +/// +/// +/// stop +/// The model encountered a natural stop point or provided stop sequence. +/// +/// +/// +/// length +/// The maximum number of tokens specified in the request was reached. +/// +/// +/// +/// content_filter +/// Content was omitted due to a triggered content filter rule. +/// +/// +/// +/// tool_calls +/// +/// With no explicit tool_choice, the model called one or more tools that were defined in the request. +/// +/// +/// +/// +/// function_call +/// (Deprecated) The model called a function that was defined in the request. +/// +/// +/// +[CodeGenModel("CreateChatCompletionResponseChoiceFinishReason1")] +public enum ChatFinishReason +{ + /// + /// Indicates that the model encountered a natural stop point or provided stop sequence. + /// + [CodeGenMember("Stop")] + Stop, + + /// + /// Indicates that the model reached the maximum number of tokens allowed for the request. + /// + [CodeGenMember("Length")] + Length, + + /// + /// Indicates that content was omitted due to a triggered content filter rule. 
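A sketch of branching on the flattened `FinishReason` of a completed response. It assumes `completion` is a `ChatCompletion` from a prior call; the `ChatToolCall` member names used here are assumptions based on this library's naming conventions:

```csharp
using System;
using OpenAI.Chat;

void HandleCompletion(ChatCompletion completion)
{
    switch (completion.FinishReason)
    {
        case ChatFinishReason.Stop:
            Console.WriteLine(completion.Content[0].Text);
            break;
        case ChatFinishReason.ToolCalls:
            // Append the assistant message and one tool result message per call,
            // then perform another chat completion with the combined messages.
            foreach (ChatToolCall call in completion.ToolCalls)
            {
                Console.WriteLine($"{call.FunctionName}({call.FunctionArguments})");
            }
            break;
        case ChatFinishReason.Length:
            Console.WriteLine("Response truncated: token limit reached.");
            break;
    }
}
```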
+ /// + [CodeGenMember("ContentFilter")] + ContentFilter, + + /// + /// Indicates that the model called one or more tools that were defined in the request. + /// + /// + /// To resolve tool calls, append the message associated with the tool calls followed by matching instances of + /// for each tool call, then perform another chat completion with the combined + /// set of messages. + /// + /// Note: is not provided as the finish_reason if the model calls a + /// tool in response to an explicit tool_choice via . + /// In that case, calling the specified tool is assumed and the expected reason is . + /// + /// + [CodeGenMember("ToolCalls")] + ToolCalls, + + /// + /// Indicates that the model called a function that was defined in the request. + /// + /// + /// To resolve a function call, append the message associated with the function call followed by a + /// with the appropriate name and arguments, then perform another chat + /// completion with the combined set of messages. + /// + [CodeGenMember("FunctionCall")] + FunctionCall, +} diff --git a/.dotnet/src/Custom/Chat/ChatFunction.cs b/.dotnet/src/Custom/Chat/ChatFunction.cs new file mode 100644 index 000000000..d2b14ec49 --- /dev/null +++ b/.dotnet/src/Custom/Chat/ChatFunction.cs @@ -0,0 +1,42 @@ +using System; + +namespace OpenAI.Chat; + +/// +/// Represents the definition of a function that the model may call, as supplied in a chat completion request. +/// +[CodeGenModel("ChatCompletionFunctions")] +[CodeGenSuppress("ChatFunction", typeof(string))] +public partial class ChatFunction +{ + // CUSTOM: Added custom constructor. + /// + /// Creates a new instance of . + /// + /// The name of the function. + /// The description of the function. + /// The parameters the function accepts, in JSON Schema format. 
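The custom constructor here takes the function's JSON Schema as `BinaryData`; a hedged sketch of defining a function (the function name, description, and schema below are illustrative):

```csharp
using System;
using OpenAI.Chat;

// Hypothetical function definition; the schema is ordinary JSON Schema carried as BinaryData.
ChatFunction function = new(
    functionName: "get_current_weather",
    functionDescription: "Gets the current weather for a city.",
    functionParameters: BinaryData.FromString("""
        {
          "type": "object",
          "properties": {
            "location": { "type": "string", "description": "City name" }
          },
          "required": ["location"]
        }
        """));
```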
+ public ChatFunction(string functionName, string functionDescription = null, BinaryData functionParameters = null) + { + Argument.AssertNotNull(functionName, nameof(functionName)); + + FunctionName = functionName; + FunctionDescription = functionDescription; + FunctionParameters = functionParameters; + } + + // CUSTOM: Renamed. + /// The name of the function to be called. Must be a-z, A-Z, 0-9, or contain underscores and dashes, with a maximum length of 64. + [CodeGenMember("Name")] + public string FunctionName { get; } + + // CUSTOM: Renamed. + /// A description of what the function does, used by the model to choose when and how to call the function. + [CodeGenMember("Description")] + public string FunctionDescription { get; set; } + + // CUSTOM: Changed type to BinaryData. + /// Gets or sets the parameters. + [CodeGenMember("Parameters")] + public BinaryData FunctionParameters { get; set; } +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Chat/ChatFunctionCall.cs b/.dotnet/src/Custom/Chat/ChatFunctionCall.cs new file mode 100644 index 000000000..bfc84e23c --- /dev/null +++ b/.dotnet/src/Custom/Chat/ChatFunctionCall.cs @@ -0,0 +1,58 @@ +using System; + +namespace OpenAI.Chat; + +/// +/// Represents an assistant call against a supplied that is needed by the +/// model to continue the logical conversation. +/// +/// +/// +/// Note that functions are deprecated in favor of tools and using +/// instances with will enable the use of tool_calls via +/// instead of this type. +/// +/// +/// The model makes a function_call in response to evaluation of supplied name and +/// description information in functions and is resolved by providing a new +/// with matching function output on a subsequent chat completion +/// request. +/// +/// +[CodeGenModel("ChatCompletionRequestAssistantMessageFunctionCall")] +public partial class ChatFunctionCall +{ + // CUSTOM: Reordered parameters. + /// Initializes a new instance of . + /// The name of the function to call. 
+ /// + /// The arguments to call the function with, as generated by the model in JSON format. Note + /// that the model does not always generate valid JSON, and may hallucinate parameters not + /// defined by your function schema. Validate the arguments in your code before calling your + /// function. + /// + /// or is null. + public ChatFunctionCall(string functionName, string functionArguments) + { + Argument.AssertNotNull(functionName, nameof(functionName)); + Argument.AssertNotNull(functionArguments, nameof(functionArguments)); + + FunctionName = functionName; + FunctionArguments = functionArguments; + } + + // CUSTOM: Renamed. + /// The name of the function to call. + [CodeGenMember("Name")] + public string FunctionName { get; } + + // CUSTOM: Renamed. + /// + /// The arguments to call the function with, as generated by the model in JSON format. Note + /// that the model does not always generate valid JSON, and may hallucinate parameters not + /// defined by your function schema. Validate the arguments in your code before calling your + /// function. 
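Resolving a function_call means echoing the assistant turn and supplying matching function output before re-running the completion; a sketch, assuming `completion` carries a non-null `FunctionCall`, that a `FunctionChatMessage` type exists for the deprecated flow, and with a hypothetical JSON result:

```csharp
using System.Collections.Generic;
using OpenAI.Chat;

void ResolveFunctionCall(List<ChatMessage> messages, ChatCompletion completion)
{
    ChatFunctionCall call = completion.FunctionCall;

    // Echo the assistant turn, then answer it with matching function output.
    messages.Add(new AssistantChatMessage(completion));
    messages.Add(new FunctionChatMessage(call.FunctionName, "{\"temperature\": 72}"));

    // Re-run ChatClient.CompleteChat with the extended `messages` list.
}
```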
+ /// + [CodeGenMember("Arguments")] + public string FunctionArguments { get; } +} diff --git a/.dotnet/src/Custom/Chat/ChatFunctionChoice.Serialization.cs b/.dotnet/src/Custom/Chat/ChatFunctionChoice.Serialization.cs new file mode 100644 index 000000000..0ad21b27b --- /dev/null +++ b/.dotnet/src/Custom/Chat/ChatFunctionChoice.Serialization.cs @@ -0,0 +1,63 @@ +using System; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Chat; + +[CodeGenSuppress("global::System.ClientModel.Primitives.IJsonModel.Write", typeof(Utf8JsonWriter), typeof(ModelReaderWriterOptions))] +public partial class ChatFunctionChoice : IJsonModel +{ + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + => CustomSerializationHelpers.SerializeInstance(this, SerializeChatFunctionChoice, writer, options); + + internal static void SerializeChatFunctionChoice(ChatFunctionChoice instance, Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + if (instance._isPlainString) + { + writer.WriteStringValue(instance._string); + } + else + { + writer.WriteStartObject(); + writer.WritePropertyName("name"u8); + writer.WriteStringValue(instance._function.Name); + writer.WriteSerializedAdditionalRawData(instance.SerializedAdditionalRawData, options); + writer.WriteEndObject(); + } + } + + internal static ChatFunctionChoice DeserializeChatFunctionChoice(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + else if (element.ValueKind == JsonValueKind.String) + { + return new ChatFunctionChoice(element.ToString()); + } + else + { + string name = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("name"u8)) + { + name = 
property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new ChatFunctionChoice(name, serializedAdditionalRawData); + } + } +} diff --git a/.dotnet/src/Custom/Chat/ChatFunctionChoice.cs b/.dotnet/src/Custom/Chat/ChatFunctionChoice.cs new file mode 100644 index 000000000..5c9626e4c --- /dev/null +++ b/.dotnet/src/Custom/Chat/ChatFunctionChoice.cs @@ -0,0 +1,69 @@ +using System; +using System.Collections.Generic; + +namespace OpenAI.Chat; + +/// +/// Represents a desired manner in which the model should use the functions defined in a chat completion request. +/// +[CodeGenModel("ChatCompletionFunctionChoice")] +[CodeGenSuppress("ChatFunctionChoice", typeof(IDictionary))] +public partial class ChatFunctionChoice +{ + private readonly bool _isPlainString; + private readonly string _string; + private readonly InternalChatCompletionFunctionCallOption _function; + + private const string AutoValue = "auto"; + private const string NoneValue = "none"; + + // CUSTOM: Made internal. + internal ChatFunctionChoice() + { + } + + // CUSTOM: Added custom internal constructor to handle the plain string representation (e.g. "auto", "none", etc.). + internal ChatFunctionChoice(string predefinedFunctionChoice) + { + Argument.AssertNotNull(predefinedFunctionChoice, nameof(predefinedFunctionChoice)); + + _string = predefinedFunctionChoice; + _isPlainString = true; + } + + // CUSTOM: Added the function name parameter to the constructor that takes additional data to handle the object representation. + /// Initializes a new instance of . + /// The function name. + /// Keeps track of any properties unknown to the library. 
+    internal ChatFunctionChoice(string functionName, IDictionary<string, BinaryData> serializedAdditionalRawData)
+    {
+        Argument.AssertNotNull(functionName, nameof(functionName));
+
+        _function = new(functionName);
+        _isPlainString = false;
+
+        SerializedAdditionalRawData = serializedAdditionalRawData;
+    }
+
+    // CUSTOM: Added custom public constructor to handle the object representation.
+    /// 
+    /// Creates a new instance of .
+    /// 
+    public ChatFunctionChoice(ChatFunction chatFunction)
+    {
+        Argument.AssertNotNull(chatFunction, nameof(chatFunction));
+
+        _function = new(chatFunction.FunctionName);
+        _isPlainString = false;
+    }
+
+    /// 
+    /// Specifies that the model must freely pick between generating a message or calling one or more functions.
+    /// 
+    public static ChatFunctionChoice Auto { get; } = new ChatFunctionChoice(AutoValue);
+    /// 
+    /// Specifies that the model must not invoke any functions, and instead it must generate an ordinary message. Note
+    /// that the functions that were provided may still influence the model's behavior even if they are not called.
+ /// + public static ChatFunctionChoice None { get; } = new ChatFunctionChoice(NoneValue); +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Chat/ChatMessage.Serialization.cs b/.dotnet/src/Custom/Chat/ChatMessage.Serialization.cs new file mode 100644 index 000000000..007212010 --- /dev/null +++ b/.dotnet/src/Custom/Chat/ChatMessage.Serialization.cs @@ -0,0 +1,47 @@ +using System; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Runtime.CompilerServices; +using System.Text.Json; + +namespace OpenAI.Chat; + +[CodeGenSuppress("global::System.ClientModel.Primitives.IJsonModel.Write", typeof(Utf8JsonWriter), typeof(ModelReaderWriterOptions))] +public abstract partial class ChatMessage : IJsonModel +{ + [MethodImpl(MethodImplOptions.AggressiveInlining)] + internal void SerializeContentValue(Utf8JsonWriter writer, ModelReaderWriterOptions options = null) + { + throw new NotImplementedException(); + } + + [MethodImpl(MethodImplOptions.AggressiveInlining)] + internal static void DeserializeContentValue(JsonProperty property, ref IList content, ModelReaderWriterOptions options = null) + { + content ??= new ChangeTrackingList(); + + if (property.Value.ValueKind == JsonValueKind.Null) + { + return; + } + else if (property.Value.ValueKind == JsonValueKind.String) + { + content.Add(ChatMessageContentPart.CreateTextMessageContentPart(property.Value.GetString())); + } + else if (property.Value.ValueKind == JsonValueKind.Array) + { + foreach (var item in property.Value.EnumerateArray()) + { + content.Add(ChatMessageContentPart.DeserializeChatMessageContentPart(item, options)); + } + } + } + + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + => CustomSerializationHelpers.SerializeInstance(this, WriteCore, writer, options); + + internal static void WriteCore(ChatMessage instance, Utf8JsonWriter writer, ModelReaderWriterOptions options) + => instance.WriteCore(writer, options); + + protected abstract 
void WriteCore(Utf8JsonWriter writer, ModelReaderWriterOptions options); +} diff --git a/.dotnet/src/Custom/Chat/ChatMessage.cs b/.dotnet/src/Custom/Chat/ChatMessage.cs new file mode 100644 index 000000000..7278d5172 --- /dev/null +++ b/.dotnet/src/Custom/Chat/ChatMessage.cs @@ -0,0 +1,134 @@ +using System.Collections.Generic; +using System.Linq; + +namespace OpenAI.Chat; + +/// +/// A common, base representation of a message provided as input into a chat completion request. +/// +/// +/// +/// +/// Type - +/// Role - +/// Description +/// +/// +/// - +/// system - +/// Instructions to the model that guide the behavior of future assistant messages. +/// +/// +/// - +/// user - +/// Input messages from the caller, typically paired with assistant messages in a conversation. +/// +/// +/// - +/// assistant - +/// +/// Output messages from the model with responses to the user or calls to tools or functions that are +/// needed to continue the logical conversation. +/// +/// +/// +/// - +/// tool - +/// +/// Resolution information for a in an earlier +/// that was made against a supplied +/// . +/// +/// +/// +/// - +/// function - +/// +/// Resolution information for a in an earlier +/// that was made against a supplied +/// . Note that functions are deprecated in favor of +/// tool_calls. +/// +/// +/// +/// +[CodeGenModel("ChatCompletionRequestMessage")] +[CodeGenSerialization(nameof(Content), SerializationValueHook = nameof(SerializeContentValue), DeserializationValueHook = nameof(DeserializeContentValue))] +public abstract partial class ChatMessage +{ + protected internal ChatMessage(ChatMessageRole role, IEnumerable contentParts) + { + Role = role; + foreach (ChatMessageContentPart contentPart in contentParts ?? []) + { + Content.Add(contentPart); + } + } + + protected internal ChatMessage(ChatMessageRole role, string content) + : this(role, content is null ? 
null : [ChatMessageContentPart.CreateTextMessageContentPart(content)]) + { } + + /// + /// The content associated with the message. The interpretation of this content will vary depending on the message type. + /// + public IList Content { get; } = new ChangeTrackingList(); + + // CUSTOM: use strongly-typed role. + [CodeGenMember("Role")] + internal ChatMessageRole Role { get; set; } + + /// + public static SystemChatMessage CreateSystemMessage(string content) => new(content); + + /// + public static SystemChatMessage CreateSystemMessage(IEnumerable contentParts) => new(contentParts); + + /// + public static SystemChatMessage CreateSystemMessage(params ChatMessageContentPart[] contentParts) => new(contentParts); + + /// + public static UserChatMessage CreateUserMessage(string content) => new(content); + + /// + public static UserChatMessage CreateUserMessage(IEnumerable contentParts) => new(contentParts); + + /// + public static UserChatMessage CreateUserMessage(params ChatMessageContentPart[] contentParts) => new(contentParts); + + /// + public static AssistantChatMessage CreateAssistantMessage(string content) => new(content); + + /// + public static AssistantChatMessage CreateAssistantMessage(IEnumerable contentParts) => new(contentParts); + + /// + public static AssistantChatMessage CreateAssistantMessage(params ChatMessageContentPart[] contentParts) => new(contentParts); + + /// + public static AssistantChatMessage CreateAssistantMessage(IEnumerable toolCalls, string content = null) => new(toolCalls, content); + + /// + public static AssistantChatMessage CreateAssistantMessage(ChatFunctionCall functionCall, string content = null) => new(functionCall, content); + + /// + public static AssistantChatMessage CreateAssistantMessage(ChatCompletion chatCompletion) => new(chatCompletion); + + /// + public static ToolChatMessage CreateToolChatMessage(string toolCallId, string content) => new(toolCallId, content); + + /// + public static ToolChatMessage 
CreateToolChatMessage(string toolCallId, IEnumerable contentParts) => new(toolCallId, contentParts); + + /// + public static ToolChatMessage CreateToolChatMessage(string toolCallId, params ChatMessageContentPart[] contentParts) => new(toolCallId, contentParts); + + /// + public static FunctionChatMessage CreateFunctionMessage(string functionName, string content) => new(functionName, content); + + /// + /// Creates UserChatMessage. + /// + /// + public static implicit operator ChatMessage(string userMessage) => new UserChatMessage(userMessage); +} diff --git a/.dotnet/src/Custom/Chat/ChatMessageContentPart.Serialization.cs b/.dotnet/src/Custom/Chat/ChatMessageContentPart.Serialization.cs new file mode 100644 index 000000000..42d75589c --- /dev/null +++ b/.dotnet/src/Custom/Chat/ChatMessageContentPart.Serialization.cs @@ -0,0 +1,107 @@ +using System; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Chat; + +[CodeGenSuppress("global::System.ClientModel.Primitives.IJsonModel.Write", typeof(Utf8JsonWriter), typeof(ModelReaderWriterOptions))] +public partial class ChatMessageContentPart : IJsonModel +{ + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + => CustomSerializationHelpers.SerializeInstance(this, WriteCoreContentPart, writer, options); + + internal static void WriteCoreContentPart(ChatMessageContentPart instance, Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + writer.WriteStartObject(); + writer.WritePropertyName("type"u8); + writer.WriteStringValue(instance._kind.ToString()); + + if (instance._kind == ChatMessageContentPartKind.Text) + { + writer.WritePropertyName("text"u8); + writer.WriteStringValue(instance._text); + } + else if (instance._kind == ChatMessageContentPartKind.Refusal) + { + writer.WritePropertyName("refusal"u8); + writer.WriteStringValue(instance._refusal); + } + else if (instance._kind == ChatMessageContentPartKind.Image) + { + 
writer.WritePropertyName("image_url"u8); + writer.WriteObjectValue(instance._imageUrl, options); + } + writer.WriteSerializedAdditionalRawData(instance.SerializedAdditionalRawData, options); + writer.WriteEndObject(); + } + + internal static void WriteCoreContentPartList(IList instances, Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + if (!Optional.IsCollectionDefined(instances)) + { + return; + } + + writer.WritePropertyName("content"u8); + if (instances.Count == 1 && !string.IsNullOrEmpty(instances[0].Text)) + { + writer.WriteStringValue(instances[0].Text); + } + else + { + writer.WriteStartArray(); + foreach (var item in instances) + { + writer.WriteObjectValue(item, options); + } + writer.WriteEndArray(); + } + } + + internal static ChatMessageContentPart DeserializeChatMessageContentPart(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + + string kind = default; + string text = default; + string refusal = default; + InternalChatCompletionRequestMessageContentPartImageImageUrl imageUrl = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("type"u8)) + { + kind = property.Value.GetString(); + continue; + } + if (property.NameEquals("text"u8)) + { + text = property.Value.GetString(); + continue; + } + if (property.NameEquals("image_url"u8)) + { + imageUrl = InternalChatCompletionRequestMessageContentPartImageImageUrl.DeserializeInternalChatCompletionRequestMessageContentPartImageImageUrl(property.Value, options); + continue; + } + if (property.NameEquals("refusal"u8)) + { + refusal = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + 
serializedAdditionalRawData = rawDataDictionary; + return new ChatMessageContentPart(kind, text, refusal, imageUrl, serializedAdditionalRawData); + } +} diff --git a/.dotnet/src/Custom/Chat/ChatMessageContentPart.cs b/.dotnet/src/Custom/Chat/ChatMessageContentPart.cs new file mode 100644 index 000000000..2267b7877 --- /dev/null +++ b/.dotnet/src/Custom/Chat/ChatMessageContentPart.cs @@ -0,0 +1,173 @@ +using System; +using System.Collections.Generic; + +namespace OpenAI.Chat; + +/// +/// Represents the common base type for a piece of message content used for chat completions. +/// +[CodeGenModel("ChatMessageContentPart")] +[CodeGenSuppress("ChatMessageContentPart", typeof(IDictionary))] +public partial class ChatMessageContentPart +{ + private readonly ChatMessageContentPartKind _kind; + private readonly string _text; + private readonly string _refusal; + private readonly InternalChatCompletionRequestMessageContentPartImageImageUrl _imageUrl; + private readonly string _dataUri; + + internal ChatMessageContentPart(string text) + { + Argument.AssertNotNull(text, nameof(text)); + + _text = text; + _kind = ChatMessageContentPartKind.Text; + } + + // CUSTOM: Made internal. + internal ChatMessageContentPart() + { + } + + + internal ChatMessageContentPart(Uri imageUri, ImageChatMessageContentPartDetail? imageDetail = null) + { + Argument.AssertNotNull(imageUri, nameof(imageUri)); + + _imageUrl = new(imageUri) { Detail = imageDetail }; + _kind = ChatMessageContentPartKind.Image; + } + + internal ChatMessageContentPart(BinaryData imageBytes, string imageBytesMediaType, ImageChatMessageContentPartDetail? imageDetail = null) + { + Argument.AssertNotNull(imageBytes, nameof(imageBytes)); + Argument.AssertNotNull(imageBytesMediaType, nameof(imageBytesMediaType)); + + _imageUrl = new(imageBytes, imageBytesMediaType) { Detail = imageDetail }; + _kind = ChatMessageContentPartKind.Image; + } + + /// Initializes a new instance of . + /// The kind. + /// The text. + /// The image URI. 
+    /// Keeps track of any properties unknown to the library.
+    internal ChatMessageContentPart(string kind, string text, string refusal, InternalChatCompletionRequestMessageContentPartImageImageUrl imageUrl, IDictionary<string, BinaryData> serializedAdditionalRawData)
+    {
+        _kind = new ChatMessageContentPartKind(kind);
+        _text = text;
+        _refusal = refusal;
+        _imageUrl = imageUrl;
+        SerializedAdditionalRawData = serializedAdditionalRawData;
+    }
+
+    /// 
+    /// The content part kind.
+    /// 
+    public ChatMessageContentPartKind Kind => _kind;
+
+    /// 
+    /// The text content.
+    /// 
+    public string Text => _text;
+
+    /// 
+    /// The refusal message from the assistant.
+    /// 
+    public string Refusal => _refusal;
+
+    /// 
+    /// The image URI content.
+    /// 
+    public Uri ImageUri => _imageUrl?.ImageUri;
+
+    /// 
+    /// The binary image content.
+    /// 
+    public BinaryData ImageBytes => _imageUrl?.ImageBytes;
+
+    /// 
+    /// The media type of the binary image content.
+    /// 
+    public string ImageBytesMediaType => _imageUrl?.ImageBytesMediaType;
+
+    /// 
+    /// The image detail level.
+    /// 
+    public ImageChatMessageContentPartDetail? ImageDetail => _imageUrl?.Detail;
+
+    /// 
+    /// Creates a new instance of that encapsulates text content.
+    /// 
+    /// The content for the new instance.
+    /// A new instance of .
+    public static ChatMessageContentPart CreateTextMessageContentPart(string text)
+    {
+        Argument.AssertNotNull(text, nameof(text));
+
+        return new(text);
+    }
+
+    /// 
+    /// Creates a new instance of that encapsulates an assistant refusal message.
+    /// 
+    /// The refusal message from the assistant.
+    /// A new instance of .
+    public static ChatMessageContentPart CreateRefusalMessageContentPart(string refusal)
+    {
+        Argument.AssertNotNull(refusal, nameof(refusal));
+
+        return new ChatMessageContentPart(
+            ChatMessageContentPartKind.Refusal.ToString(),
+            text: null,
+            refusal: refusal,
+            imageUrl: null,
+            serializedAdditionalRawData: null);
+    }
+
+    /// 
+    /// Creates a new instance of that encapsulates image content obtained from
+    /// an internet location that will be accessible to the model when evaluating a message with this content.
+    /// 
+    /// An internet location pointing to an image. This must be accessible to the model.
+    /// The detail level of the image.
+    /// A new instance of .
+    public static ChatMessageContentPart CreateImageMessageContentPart(Uri imageUri, ImageChatMessageContentPartDetail? imageDetail = null)
+    {
+        Argument.AssertNotNull(imageUri, nameof(imageUri));
+
+        return new(imageUri, imageDetail);
+    }
+
+    /// 
+    /// Creates a new instance of that encapsulates image content provided as
+    /// binary data directly within the request.
+    /// 
+    /// The binary image data to use as content.
+    /// The MIME descriptor, like image/png, corresponding to the image data format of the provided data.
+    /// The detail level of the image.
+    /// A new instance of .
+    public static ChatMessageContentPart CreateImageMessageContentPart(BinaryData imageBytes, string imageBytesMediaType, ImageChatMessageContentPartDetail? imageDetail = null)
+    {
+        Argument.AssertNotNull(imageBytes, nameof(imageBytes));
+        Argument.AssertNotNull(imageBytesMediaType, nameof(imageBytesMediaType));
+
+        return new(imageBytes, imageBytesMediaType, imageDetail);
+    }
+
+    /// 
+    /// Returns the text representation of this part.
+    /// 
+    /// 
+    public override string ToString() => Text;
+
+    /// 
+    /// Implicitly creates a new instance from an item of plain text.
+    /// 
+    /// 
+    /// Using a in the position of a is equivalent to
+    /// calling the method.
+    /// 
+    /// The text content to use as this content part.
+    public static implicit operator ChatMessageContentPart(string content) => new(content);
+}
diff --git a/.dotnet/src/Custom/Chat/ChatMessageContentPartKind.cs b/.dotnet/src/Custom/Chat/ChatMessageContentPartKind.cs
new file mode 100644
index 000000000..73ba771ad
--- /dev/null
+++ b/.dotnet/src/Custom/Chat/ChatMessageContentPartKind.cs
@@ -0,0 +1,49 @@
+using System.ComponentModel;
+using System;
+
+namespace OpenAI.Chat;
+
+/// 
+/// Represents the possible kinds of underlying data for a chat message's content property.
+/// 
+public readonly partial struct ChatMessageContentPartKind : IEquatable<ChatMessageContentPartKind>
+{
+    private readonly string _value;
+
+    /// Initializes a new instance of .
+    /// is null.
+    public ChatMessageContentPartKind(string value)
+    {
+        _value = value ?? throw new ArgumentNullException(nameof(value));
+    }
+
+    private const string TextValue = "text";
+    private const string RefusalValue = "refusal";
+    private const string ImageValue = "image_url";
+
+    /// Text.
+    public static ChatMessageContentPartKind Text { get; } = new ChatMessageContentPartKind(TextValue);
+    /// Refusal.
+    public static ChatMessageContentPartKind Refusal { get; } = new ChatMessageContentPartKind(RefusalValue);
+    /// Image.
+    public static ChatMessageContentPartKind Image { get; } = new ChatMessageContentPartKind(ImageValue);
+
+    /// Determines if two values are the same.
+    public static bool operator ==(ChatMessageContentPartKind left, ChatMessageContentPartKind right) => left.Equals(right);
+    /// Determines if two values are not the same.
+    public static bool operator !=(ChatMessageContentPartKind left, ChatMessageContentPartKind right) => !left.Equals(right);
+    /// Converts a string to a .
+    public static implicit operator ChatMessageContentPartKind(string value) => new ChatMessageContentPartKind(value);
+
+    /// 
+    [EditorBrowsable(EditorBrowsableState.Never)]
+    public override bool Equals(object obj) => obj is ChatMessageContentPartKind other && Equals(other);
+    /// 
+    public bool Equals(ChatMessageContentPartKind other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase);
+
+    /// 
+    [EditorBrowsable(EditorBrowsableState.Never)]
+    public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0;
+    /// 
+    public override string ToString() => _value;
+}
\ No newline at end of file
diff --git a/.dotnet/src/Custom/Chat/ChatMessageRole.cs b/.dotnet/src/Custom/Chat/ChatMessageRole.cs
new file mode 100644
index 000000000..6611ec5d9
--- /dev/null
+++ b/.dotnet/src/Custom/Chat/ChatMessageRole.cs
@@ -0,0 +1,95 @@
+namespace OpenAI.Chat;
+
+/// 
+/// Represents the role of a chat completion message.
+/// 
+/// 
+/// 
+/// 
+/// Type -
+/// Role -
+/// Description
+/// 
+/// 
+/// -
+/// system -
+/// Instructions to the model that guide the behavior of future assistant messages.
+/// 
+/// 
+/// -
+/// user -
+/// Input messages from the caller, typically paired with assistant messages in a conversation.
+/// 
+/// 
+/// -
+/// assistant -
+/// 
+/// Output messages from the model with responses to the user or calls to tools or functions that are
+/// needed to continue the logical conversation.
+/// 
+/// 
+/// 
+/// -
+/// tool -
+/// 
+/// Resolution information for a in an earlier
+/// that was made against a supplied
+/// .
+/// 
+/// 
+/// 
+/// -
+/// function -
+/// 
+/// Resolution information for a in an earlier
+/// that was made against a supplied
+/// . Note that functions are deprecated in favor of
+/// tool_calls.
+/// +/// +/// +/// +[CodeGenModel("ChatCompletionRole")] +public enum ChatMessageRole +{ + /// + /// The system role, which provides instructions to the model that guide the behavior of future + /// assistant messages + /// + [CodeGenMember("System")] + System, + + /// + /// The user role that provides input from the caller as a prompt for model responses. + /// + [CodeGenMember("User")] + User, + + /// + /// The assistant role that provides output from the model that either issues completions in response to + /// user messages or calls provided tools or functions. + /// + [CodeGenMember("Assistant")] + Assistant, + + /// + /// The tool role that provides resolving information to prior tool_calls made by the model against + /// supplied tools. + /// + [CodeGenMember("Tool")] + Tool, + + /// + /// + /// The function role that provides resolving information to a prior function_call made by the model + /// against a definition supplied in functions. + /// + /// + /// + /// functions are deprecated in favor of tools and supplying tools will result in + /// tool_calls that must be resolved via the tool role rather than a function_call resolved + /// by a function role message. 
+ /// + [CodeGenMember("Function")] + Function, +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Chat/ChatResponseFormat.cs b/.dotnet/src/Custom/Chat/ChatResponseFormat.cs new file mode 100644 index 000000000..2b7feb1b8 --- /dev/null +++ b/.dotnet/src/Custom/Chat/ChatResponseFormat.cs @@ -0,0 +1,82 @@ +using OpenAI.Internal; +using System; +using System.ClientModel.Primitives; +using System.ComponentModel; + +namespace OpenAI.Chat; + +[CodeGenModel("ChatResponseFormat")] +public abstract partial class ChatResponseFormat : IEquatable +{ + public static ChatResponseFormat Text { get; } = new InternalChatResponseFormatText(); + public static ChatResponseFormat JsonObject { get; } = new InternalChatResponseFormatJsonObject(); + + public static ChatResponseFormat CreateTextFormat() => new InternalChatResponseFormatText(); + public static ChatResponseFormat CreateJsonObjectFormat() => new InternalChatResponseFormatJsonObject(); + public static ChatResponseFormat CreateJsonSchemaFormat( + string name, + BinaryData jsonSchema, + string description = null, + bool? 
strictSchemaEnabled = null) + { + Argument.AssertNotNullOrEmpty(name, nameof(name)); + Argument.AssertNotNull(jsonSchema, nameof(jsonSchema)); + + InternalResponseFormatJsonSchemaJsonSchema internalSchema = new( + description, + name, + jsonSchema, + strictSchemaEnabled, + null); + return new InternalChatResponseFormatJsonSchema(internalSchema); + } + + [EditorBrowsable(EditorBrowsableState.Never)] + public static bool operator ==(ChatResponseFormat first, ChatResponseFormat second) + { + if (first is null) + { + return second is null; + } + return first.Equals(second); + } + + [EditorBrowsable(EditorBrowsableState.Never)] + public static bool operator !=(ChatResponseFormat first, ChatResponseFormat second) + => !(first == second); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) + { + return (this as IEquatable).Equals(obj as ChatResponseFormat) + || ToString().Equals(obj?.ToString()); + } + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => ToString().GetHashCode(); + + [EditorBrowsable(EditorBrowsableState.Never)] + bool IEquatable.Equals(ChatResponseFormat other) + { + if (other is null) + { + return false; + } + + if (Object.ReferenceEquals(this, other)) + { + return true; + } + + return (this is InternalChatResponseFormatText && other is InternalChatResponseFormatText) + || (this is InternalChatResponseFormatJsonObject && other is InternalChatResponseFormatJsonObject) + || (this is InternalChatResponseFormatJsonSchema thisJsonSchema + && other is InternalChatResponseFormatJsonSchema otherJsonSchema + && thisJsonSchema.JsonSchema.Name.Equals(otherJsonSchema.JsonSchema.Name)); + } + + public override string ToString() + { + return ModelReaderWriter.Write(this).ToString(); + } +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Chat/ChatTokenLogProbabilityInfo.cs b/.dotnet/src/Custom/Chat/ChatTokenLogProbabilityInfo.cs new file mode 100644 index 000000000..570400079 --- 
/dev/null +++ b/.dotnet/src/Custom/Chat/ChatTokenLogProbabilityInfo.cs @@ -0,0 +1,34 @@ +namespace OpenAI.Chat; + +using System.Collections.Generic; + +/// +/// Represents a single token's log probability information, as requested via +/// . +/// +[CodeGenModel("ChatCompletionTokenLogprob")] +public partial class ChatTokenLogProbabilityInfo +{ + // CUSTOM: Renamed. + /// The log probability of this token, if it is within the top 20 most likely tokens. Otherwise, the value `-9999.0` is used to signify that the token is very unlikely. + [CodeGenMember("Logprob")] + public float LogProbability { get; } + + // CUSTOM: Renamed. + /// + /// A list of integers representing the UTF-8 bytes representation of the token. Useful in instances where + /// characters are represented by multiple tokens and their byte representations must be combined to generate + /// the correct text representation. Can be null if there is no bytes representation for the token. + /// + [CodeGenMember("Bytes")] + public IReadOnlyList Utf8ByteValues { get; } + + // CUSTOM: Renamed. + /// + /// List of the most likely tokens and their log probability at this token position. In rare cases, + /// there may be fewer than the number of requested top_logprobs returned, as supplied via + /// . + /// + [CodeGenMember("TopLogprobs")] + public IReadOnlyList TopLogProbabilities { get; } +} diff --git a/.dotnet/src/Custom/Chat/ChatTokenTopLogProbabilityInfo.cs b/.dotnet/src/Custom/Chat/ChatTokenTopLogProbabilityInfo.cs new file mode 100644 index 000000000..91d1fba8e --- /dev/null +++ b/.dotnet/src/Custom/Chat/ChatTokenTopLogProbabilityInfo.cs @@ -0,0 +1,26 @@ +using System.Collections.Generic; + +namespace OpenAI.Chat; + +/// +/// Represents a single item of log probability information as requested via +/// and +/// . +/// +[CodeGenModel("ChatCompletionTokenLogprobTopLogprob")] +public partial class ChatTokenTopLogProbabilityInfo +{ + // CUSTOM: Renamed. 
+    /// The log probability of this token, if it is within the top 20 most likely tokens. Otherwise, the value `-9999.0` is used to signify that the token is very unlikely.
+    [CodeGenMember("Logprob")]
+    public float LogProbability { get; }
+
+    // CUSTOM: Renamed.
+    /// 
+    /// A list of integers representing the UTF-8 bytes representation of the token. Useful in instances where
+    /// characters are represented by multiple tokens and their byte representations must be combined to generate
+    /// the correct text representation. Can be null if there is no bytes representation for the token.
+    /// 
+    [CodeGenMember("Bytes")]
+    public IReadOnlyList<int> Utf8ByteValues { get; }
+}
\ No newline at end of file
diff --git a/.dotnet/src/Custom/Chat/ChatTokenUsage.cs b/.dotnet/src/Custom/Chat/ChatTokenUsage.cs
new file mode 100644
index 000000000..c53222320
--- /dev/null
+++ b/.dotnet/src/Custom/Chat/ChatTokenUsage.cs
@@ -0,0 +1,18 @@
+namespace OpenAI.Chat;
+
+/// 
+/// Represents computed token consumption statistics for a chat completion request.
+/// 
+[CodeGenModel("CompletionUsage")]
+public partial class ChatTokenUsage
+{
+    // CUSTOM: Renamed.
+    /// Number of tokens in the generated completion.
+    [CodeGenMember("CompletionTokens")]
+    public int OutputTokens { get; }
+
+    // CUSTOM: Renamed.
+    /// Number of tokens in the prompt.
+    [CodeGenMember("PromptTokens")]
+    public int InputTokens { get; }
+}
\ No newline at end of file
diff --git a/.dotnet/src/Custom/Chat/ChatTool.cs b/.dotnet/src/Custom/Chat/ChatTool.cs
new file mode 100644
index 000000000..f1fc403ad
--- /dev/null
+++ b/.dotnet/src/Custom/Chat/ChatTool.cs
@@ -0,0 +1,93 @@
+using System;
+
+namespace OpenAI.Chat;
+
+/// 
+/// A base representation of a tool supplied to a chat completion request. Tools inform the model about additional,
+/// caller-provided behaviors that can be invoked to provide prompt enrichment or custom actions.
+/// +[CodeGenModel("ChatCompletionTool")] +public partial class ChatTool +{ + // CUSTOM: Made internal. + /// Gets the function. + [CodeGenMember("Function")] + internal InternalFunctionDefinition Function { get; } + + // CUSTOM: Made internal. + /// Initializes a new instance of . + /// + /// is null. + internal ChatTool(InternalFunctionDefinition function) + { + Kind = ChatToolKind.Function; + + Function = function; + } + + // CUSTOM: Renamed. + /// The type of the tool. Currently, only is supported. + [CodeGenMember("Type")] + public ChatToolKind Kind { get; } = ChatToolKind.Function; + + // CUSTOM: Flattened. + /// + /// The name of the function that the tool represents. + /// + public string FunctionName => Function?.Name; + + // CUSTOM: Flattened. + /// + /// A friendly description of the function. This supplements in informing the model about when + /// it should call the function. + /// + public string FunctionDescription => Function?.Description; + + // CUSTOM: Flattened. + /// + /// The parameter information for the function, provided in JSON Schema format. + /// + /// + /// The method provides + /// an easy definition interface using the dynamic type: + /// + /// Parameters = BinaryData.FromObjectAsJson(new + /// { + /// type = "object", + /// properties = new + /// { + /// your_function_argument = new + /// { + /// type = "string", + /// description = "the description of your function argument" + /// } + /// }, + /// required = new[] { "your_function_argument" } + /// }) + /// + /// + public BinaryData FunctionParameters => Function?.Parameters; + + public bool? StrictParameterSchemaEnabled => Function?.Strict; + + // CUSTOM: Added custom constructor. + /// + /// Creates a new instance of . + /// + /// The name of the function. + /// The description of the function. + /// The parameters into the function, in JSON Schema format. 
+ public static ChatTool CreateFunctionTool(string functionName, string functionDescription = null, BinaryData functionParameters = null, bool? strictParameterSchemaEnabled = null) + { + Argument.AssertNotNull(functionName, nameof(functionName)); + + InternalFunctionDefinition function = new(functionName) + { + Description = functionDescription, + Parameters = functionParameters, + Strict = strictParameterSchemaEnabled, + }; + + return new(function); + } +} diff --git a/.dotnet/src/Custom/Chat/ChatToolCall.cs b/.dotnet/src/Custom/Chat/ChatToolCall.cs new file mode 100644 index 000000000..7a508f743 --- /dev/null +++ b/.dotnet/src/Custom/Chat/ChatToolCall.cs @@ -0,0 +1,69 @@ +using System; + +namespace OpenAI.Chat; + +/// +/// A base representation of an item in an assistant role response's tool_calls that specifies +/// parameterized resolution against a previously defined tool that is needed for the model to continue the logical +/// conversation. +/// +[CodeGenModel("ChatCompletionMessageToolCall")] +public partial class ChatToolCall +{ + /// The function that the model called. + [CodeGenMember("Function")] + internal InternalChatCompletionMessageToolCallFunction Function { get; } + + // CUSTOM: Made internal. + /// Initializes a new instance of . + /// The ID of the tool call. + /// The function that the model called. + /// or is null. + internal ChatToolCall(string id, InternalChatCompletionMessageToolCallFunction function) + { + Argument.AssertNotNull(id, nameof(id)); + Argument.AssertNotNull(function, nameof(function)); + + Kind = ChatToolCallKind.Function; + + Id = id; + Function = function; + } + + // CUSTOM: Renamed. + /// The kind of tool. Currently, only is supported. + [CodeGenMember("Type")] + public ChatToolCallKind Kind { get; } = ChatToolCallKind.Function; + + // CUSTOM: Flattened. + /// + /// Gets the name of the function. + /// + public string FunctionName => Function?.Name; + + // CUSTOM: Flattened. + /// + /// Gets the arguments to the function. 
+ /// + public string FunctionArguments => Function?.Arguments; + + /// + /// Creates a new instance of . + /// + /// + /// The ID of the tool call, used when resolving the tool call with a future + /// . + /// + /// The name of the function. + /// The arguments to the function. + public static ChatToolCall CreateFunctionToolCall(string toolCallId, string functionName, string functionArguments) + { + Argument.AssertNotNull(toolCallId, nameof(toolCallId)); + Argument.AssertNotNull(functionName, nameof(functionName)); + Argument.AssertNotNull(functionArguments, nameof(functionArguments)); + + InternalChatCompletionMessageToolCallFunction function = new(functionName, functionArguments); + + return new(toolCallId, function); + } +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Chat/ChatToolCallKind.cs b/.dotnet/src/Custom/Chat/ChatToolCallKind.cs new file mode 100644 index 000000000..1a0b1716a --- /dev/null +++ b/.dotnet/src/Custom/Chat/ChatToolCallKind.cs @@ -0,0 +1,6 @@ +namespace OpenAI.Chat; + +[CodeGenModel("ChatCompletionMessageToolCallType")] +public readonly partial struct ChatToolCallKind +{ +} diff --git a/.dotnet/src/Custom/Chat/ChatToolChoice.Serialization.cs b/.dotnet/src/Custom/Chat/ChatToolChoice.Serialization.cs new file mode 100644 index 000000000..28ebbf323 --- /dev/null +++ b/.dotnet/src/Custom/Chat/ChatToolChoice.Serialization.cs @@ -0,0 +1,71 @@ +using System; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Chat; + +[CodeGenSuppress("global::System.ClientModel.Primitives.IJsonModel.Write", typeof(Utf8JsonWriter), typeof(ModelReaderWriterOptions))] +public partial class ChatToolChoice : IJsonModel +{ + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + => CustomSerializationHelpers.SerializeInstance(this, SerializeChatToolChoice, writer, options); + + internal static void SerializeChatToolChoice(ChatToolChoice instance, Utf8JsonWriter 
writer, ModelReaderWriterOptions options) + { + if (instance._isPlainString) + { + writer.WriteStringValue(instance._string); + } + else + { + writer.WriteStartObject(); + writer.WritePropertyName("type"u8); + writer.WriteStringValue(instance._type.ToString()); + writer.WritePropertyName("function"u8); + writer.WriteObjectValue(instance._function, options); + writer.WriteSerializedAdditionalRawData(instance.SerializedAdditionalRawData, options); + writer.WriteEndObject(); + } + } + + internal static ChatToolChoice DeserializeChatToolChoice(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + else if (element.ValueKind == JsonValueKind.String) + { + return new ChatToolChoice(element.ToString()); + } + else + { + InternalChatCompletionNamedToolChoiceType type = default; + InternalChatCompletionNamedToolChoiceFunction function = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("type"u8)) + { + type = new InternalChatCompletionNamedToolChoiceType(property.Value.GetString()); + continue; + } + if (property.NameEquals("function"u8)) + { + function = InternalChatCompletionNamedToolChoiceFunction.DeserializeInternalChatCompletionNamedToolChoiceFunction(property.Value, options); + continue; + } + if (true) + { + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new ChatToolChoice(function.Name, serializedAdditionalRawData); + } + } +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Chat/ChatToolChoice.cs b/.dotnet/src/Custom/Chat/ChatToolChoice.cs new file mode 100644 index 000000000..0fb3fa258 --- /dev/null +++ b/.dotnet/src/Custom/Chat/ChatToolChoice.cs @@ -0,0 +1,80 
@@ +using System; +using System.Collections.Generic; + +namespace OpenAI.Chat; + +/// +/// Represents tool_choice, the desired manner in which the model should use the tools defined in a +/// chat completion request. +/// +[CodeGenModel("ChatCompletionToolChoice")] +[CodeGenSuppress("ChatToolChoice", typeof(IDictionary))] +public partial class ChatToolChoice +{ + private readonly bool _isPlainString; + private readonly string _string; + private readonly InternalChatCompletionNamedToolChoiceType _type; + private readonly InternalChatCompletionNamedToolChoiceFunction _function; + + private const string AutoValue = "auto"; + private const string NoneValue = "none"; + private const string RequiredValue = "required"; + + // CUSTOM: Made internal. + internal ChatToolChoice() + { + } + + // CUSTOM: Added custom internal constructor to handle the plain string representation (e.g. "auto", "none", etc.). + internal ChatToolChoice(string predefinedToolChoice) + { + Argument.AssertNotNull(predefinedToolChoice, nameof(predefinedToolChoice)); + + _string = predefinedToolChoice; + _isPlainString = true; + } + + // CUSTOM: Added custom public constructor to handle the object representation. + /// + /// Creates a new instance of which requests that the model restricts its behavior + /// to calling the specified tool. + /// + /// The definition of the tool that the model should call. + public ChatToolChoice(ChatTool tool) + { + Argument.AssertNotNull(tool, nameof(tool)); + + _function = new(tool.FunctionName); + _type = InternalChatCompletionNamedToolChoiceType.Function; + _isPlainString = false; + } + + // CUSTOM: Added the function name parameter to the constructor that takes additional data to handle the object representation. + /// Initializes a new instance of . + /// The function name. + /// Keeps track of any properties unknown to the library. 
+ internal ChatToolChoice(string functionName, IDictionary serializedAdditionalRawData) + { + Argument.AssertNotNull(functionName, nameof(functionName)); + + _function = new(functionName); + _type = InternalChatCompletionNamedToolChoiceType.Function; + _isPlainString = false; + + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + /// + /// Specifies that the model must freely pick between generating a message or calling one or more tools. + /// + public static ChatToolChoice Auto { get; } = new ChatToolChoice(AutoValue); + /// + /// Specifies that the model must not invoke any tools, and instead it must generate an ordinary message. Note + /// that the tools that were provided may still influence the model's behavior even if they are not called. + /// + public static ChatToolChoice None { get; } = new ChatToolChoice(NoneValue); + /// + /// Specifies that the model must call one or more tools. + /// + public static ChatToolChoice Required { get; } = new ChatToolChoice(RequiredValue); +} diff --git a/.dotnet/src/Custom/Chat/ChatToolKind.cs b/.dotnet/src/Custom/Chat/ChatToolKind.cs new file mode 100644 index 000000000..3970f0f07 --- /dev/null +++ b/.dotnet/src/Custom/Chat/ChatToolKind.cs @@ -0,0 +1,6 @@ +namespace OpenAI.Chat; + +[CodeGenModel("ChatCompletionToolType")] +public readonly partial struct ChatToolKind +{ +} diff --git a/.dotnet/src/Custom/Chat/FunctionChatMessage.Serialization.cs b/.dotnet/src/Custom/Chat/FunctionChatMessage.Serialization.cs new file mode 100644 index 000000000..0bf7c4b18 --- /dev/null +++ b/.dotnet/src/Custom/Chat/FunctionChatMessage.Serialization.cs @@ -0,0 +1,34 @@ +using System; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Chat; + +[CodeGenSuppress("global::System.ClientModel.Primitives.IJsonModel.Write", typeof(Utf8JsonWriter), typeof(ModelReaderWriterOptions))] +public partial class FunctionChatMessage : IJsonModel +{ + void 
IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + => CustomSerializationHelpers.SerializeInstance(this, SerializeFunctionChatMessage, writer, options); + + internal static void SerializeFunctionChatMessage(FunctionChatMessage instance, Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + instance.WriteCore(writer, options); + } + + protected override void WriteCore(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + writer.WriteStartObject(); + writer.WritePropertyName("role"u8); + writer.WriteStringValue(Role.ToSerialString()); + writer.WritePropertyName("name"u8); + writer.WriteStringValue(FunctionName); + if (Optional.IsCollectionDefined(Content)) + { + writer.WritePropertyName("content"u8); + writer.WriteStringValue(Content?[0]?.Text); + } + writer.WriteSerializedAdditionalRawData(SerializedAdditionalRawData, options); + writer.WriteEndObject(); + } +} diff --git a/.dotnet/src/Custom/Chat/FunctionChatMessage.cs b/.dotnet/src/Custom/Chat/FunctionChatMessage.cs new file mode 100644 index 000000000..10468f427 --- /dev/null +++ b/.dotnet/src/Custom/Chat/FunctionChatMessage.cs @@ -0,0 +1,39 @@ +using System.Collections.Generic; + +namespace OpenAI.Chat; + +/// +/// Represents a chat message of the function role as provided to a chat completion request. A function message +/// resolves a prior function_call received from the model and correlates to both a supplied +/// instance as well as a made by the model on an +/// assistant response message. +/// +[CodeGenModel("ChatCompletionRequestFunctionMessage")] +[CodeGenSuppress("FunctionChatMessage", typeof(IEnumerable), typeof(string))] +public partial class FunctionChatMessage : ChatMessage +{ + /// + /// Creates a new instance of . + /// + /// + /// The name of the called function that this message provides information from. + /// + /// + /// The textual content that represents the output or result from the called function. There is no format + /// restriction (e.g. 
JSON) imposed on this content. + /// + public FunctionChatMessage(string functionName, string content = null) + : base(ChatMessageRole.Function, content) + { + Argument.AssertNotNull(functionName, nameof(functionName)); + + FunctionName = functionName; + } + + // CUSTOM: Renamed. + /// + /// The name of the called function that this message provides information from. + /// + [CodeGenMember("Name")] + public string FunctionName { get; } +} diff --git a/.dotnet/src/Custom/Chat/ImageChatMessageContentPartDetail.cs b/.dotnet/src/Custom/Chat/ImageChatMessageContentPartDetail.cs new file mode 100644 index 000000000..943400a8e --- /dev/null +++ b/.dotnet/src/Custom/Chat/ImageChatMessageContentPartDetail.cs @@ -0,0 +1,6 @@ +namespace OpenAI.Chat; + +[CodeGenModel("ChatCompletionRequestMessageContentPartImageImageUrlDetail")] +public readonly partial struct ImageChatMessageContentPartDetail +{ +} diff --git a/.dotnet/src/Custom/Chat/Internal/AsyncStreamingChatCompletionUpdateCollection.cs b/.dotnet/src/Custom/Chat/Internal/AsyncStreamingChatCompletionUpdateCollection.cs new file mode 100644 index 000000000..d3bc8ae90 --- /dev/null +++ b/.dotnet/src/Custom/Chat/Internal/AsyncStreamingChatCompletionUpdateCollection.cs @@ -0,0 +1,146 @@ +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Diagnostics; +using System.Net.ServerSentEvents; +using System.Text.Json; +using System.Threading; +using System.Threading.Tasks; + +#nullable enable + +namespace OpenAI.Chat; + +/// +/// Implementation of collection abstraction over streaming chat updates. 
+/// +internal class AsyncStreamingChatCompletionUpdateCollection : AsyncCollectionResult +{ + private readonly Func> _getResultAsync; + + public AsyncStreamingChatCompletionUpdateCollection(Func> getResultAsync) : base() + { + Argument.AssertNotNull(getResultAsync, nameof(getResultAsync)); + + _getResultAsync = getResultAsync; + } + + public override IAsyncEnumerator GetAsyncEnumerator(CancellationToken cancellationToken = default) + { + return new AsyncStreamingChatUpdateEnumerator(_getResultAsync, this, cancellationToken); + } + + private sealed class AsyncStreamingChatUpdateEnumerator : IAsyncEnumerator + { + private static ReadOnlySpan TerminalData => "[DONE]"u8; + + private readonly Func> _getResultAsync; + private readonly AsyncStreamingChatCompletionUpdateCollection _enumerable; + private readonly CancellationToken _cancellationToken; + + // These enumerators represent what is effectively a doubly-nested + // loop over the outer event collection and the inner update collection, + // i.e.: + // foreach (var sse in _events) { + // // get _updates from sse event + // foreach (var update in _updates) { ... } + // } + private IAsyncEnumerator>? _events; + private IEnumerator? _updates; + + private StreamingChatCompletionUpdate? 
_current; + private bool _started; + + public AsyncStreamingChatUpdateEnumerator(Func> getResultAsync, + AsyncStreamingChatCompletionUpdateCollection enumerable, + CancellationToken cancellationToken) + { + Debug.Assert(getResultAsync is not null); + Debug.Assert(enumerable is not null); + + _getResultAsync = getResultAsync!; + _enumerable = enumerable!; + _cancellationToken = cancellationToken; + } + + StreamingChatCompletionUpdate IAsyncEnumerator.Current + => _current!; + + async ValueTask IAsyncEnumerator.MoveNextAsync() + { + if (_events is null && _started) + { + throw new ObjectDisposedException(nameof(AsyncStreamingChatUpdateEnumerator)); + } + + _cancellationToken.ThrowIfCancellationRequested(); + _events ??= await CreateEventEnumeratorAsync().ConfigureAwait(false); + _started = true; + + if (_updates is not null && _updates.MoveNext()) + { + _current = _updates.Current; + return true; + } + + if (await _events.MoveNextAsync().ConfigureAwait(false)) + { + if (_events.Current.Data.AsSpan().SequenceEqual(TerminalData)) + { + _current = default; + return false; + } + + using JsonDocument doc = JsonDocument.Parse(_events.Current.Data); + var updates = StreamingChatCompletionUpdate.DeserializeStreamingChatCompletionUpdates(doc.RootElement); + _updates = updates.GetEnumerator(); + + if (_updates.MoveNext()) + { + _current = _updates.Current; + return true; + } + } + + _current = default; + return false; + } + + private async Task>> CreateEventEnumeratorAsync() + { + ClientResult result = await _getResultAsync().ConfigureAwait(false); + PipelineResponse response = result.GetRawResponse(); + _enumerable.SetRawResponse(response); + + if (response.ContentStream is null) + { + throw new InvalidOperationException("Unable to create result from response with null ContentStream"); + } + + IAsyncEnumerable> enumerable = SseParser.Create(response.ContentStream, (_, bytes) => bytes.ToArray()).EnumerateAsync(); + return enumerable.GetAsyncEnumerator(_cancellationToken); + } 
+ + public async ValueTask DisposeAsync() + { + await DisposeAsyncCore().ConfigureAwait(false); + + GC.SuppressFinalize(this); + } + + private async ValueTask DisposeAsyncCore() + { + if (_events is not null) + { + await _events.DisposeAsync().ConfigureAwait(false); + _events = null; + + // Dispose the response so we don't leave the unbuffered + // network stream open. + PipelineResponse response = _enumerable.GetRawResponse(); + response.Dispose(); + } + } + } +} diff --git a/.dotnet/src/Custom/Chat/Internal/GeneratorStubs.cs b/.dotnet/src/Custom/Chat/Internal/GeneratorStubs.cs new file mode 100644 index 000000000..f52ceaa79 --- /dev/null +++ b/.dotnet/src/Custom/Chat/Internal/GeneratorStubs.cs @@ -0,0 +1,101 @@ +namespace OpenAI.Chat; + +[CodeGenModel("ChatCompletionFunctionCallOption")] +internal partial class InternalChatCompletionFunctionCallOption { } + +[CodeGenModel("ChatCompletionMessageToolCallFunction")] +internal partial class InternalChatCompletionMessageToolCallFunction { } + +[CodeGenModel("ChatCompletionMessageToolCallChunkFunction")] +internal partial class InternalChatCompletionMessageToolCallChunkFunction { } + +[CodeGenModel("ChatCompletionMessageToolCallChunkType")] +internal readonly partial struct InternalChatCompletionMessageToolCallChunkType { } + +[CodeGenModel("ChatCompletionNamedToolChoice")] +internal partial class InternalChatCompletionNamedToolChoice { } + +[CodeGenModel("ChatCompletionNamedToolChoiceFunction")] +internal partial class InternalChatCompletionNamedToolChoiceFunction { } + +[CodeGenModel("ChatCompletionNamedToolChoiceType")] +internal readonly partial struct InternalChatCompletionNamedToolChoiceType { } + +[CodeGenModel("ChatCompletionRequestMessageContentPartImage")] +internal partial class InternalChatCompletionRequestMessageContentPartImage { } + +[CodeGenModel("ChatCompletionRequestMessageContentPartImageType")] +internal readonly partial struct InternalChatCompletionRequestMessageContentPartImageType { } + 
+[CodeGenModel("ChatCompletionRequestMessageContentPartText")] +internal partial class InternalChatCompletionRequestMessageContentPartText { } + +[CodeGenModel("ChatCompletionRequestMessageContentPartTextType")] +internal readonly partial struct InternalChatCompletionRequestMessageContentPartTextType { } + +[CodeGenModel("ChatCompletionResponseMessageFunctionCall")] +internal partial class InternalChatCompletionResponseMessageFunctionCall { } + +[CodeGenModel("ChatCompletionResponseMessageRole")] +internal readonly partial struct InternalChatCompletionResponseMessageRole { } + +[CodeGenModel("ChatCompletionStreamOptions")] +internal partial class InternalChatCompletionStreamOptions { } + +[CodeGenModel("ChatCompletionStreamResponseDeltaRole")] +internal readonly partial struct InternalChatCompletionStreamResponseDeltaRole { } + +[CodeGenModel("CreateChatCompletionFunctionResponse")] +internal partial class InternalCreateChatCompletionFunctionResponse { } + +[CodeGenModel("CreateChatCompletionFunctionResponseChoice")] +internal partial class InternalCreateChatCompletionFunctionResponseChoice { } + +[CodeGenModel("CreateChatCompletionFunctionResponseChoiceFinishReason")] +internal readonly partial struct InternalCreateChatCompletionFunctionResponseChoiceFinishReason { } + +[CodeGenModel("CreateChatCompletionFunctionResponseObject")] +internal readonly partial struct InternalCreateChatCompletionFunctionResponseObject { } + +[CodeGenModel("CreateChatCompletionRequestModel")] +internal readonly partial struct InternalCreateChatCompletionRequestModel { } + +[CodeGenModel("CreateChatCompletionRequestToolChoice")] +internal readonly partial struct InternalCreateChatCompletionRequestToolChoice { } + +[CodeGenModel("CreateChatCompletionResponseChoice")] +internal partial class InternalCreateChatCompletionResponseChoice { } + +[CodeGenModel("CreateChatCompletionResponseChoiceLogprobs")] +internal partial class InternalCreateChatCompletionResponseChoiceLogprobs { } + 
+[CodeGenModel("CreateChatCompletionResponseObject")] +internal readonly partial struct InternalCreateChatCompletionResponseObject { } + +[CodeGenModel("CreateChatCompletionStreamResponseChoice")] +internal partial class InternalCreateChatCompletionStreamResponseChoice { } + +[CodeGenModel("CreateChatCompletionStreamResponseChoiceFinishReason")] +internal readonly partial struct InternalCreateChatCompletionStreamResponseChoiceFinishReason { } + +[CodeGenModel("CreateChatCompletionStreamResponseChoiceLogprobs")] +internal partial class InternalCreateChatCompletionStreamResponseChoiceLogprobs { } + +[CodeGenModel("CreateChatCompletionStreamResponseObject")] +internal readonly partial struct InternalCreateChatCompletionStreamResponseObject { } + +[CodeGenModel("CreateChatCompletionStreamResponseUsage")] +internal partial class InternalCreateChatCompletionStreamResponseUsage { } + +[CodeGenModel("FunctionParameters")] +internal partial class InternalFunctionParameters { } + +[CodeGenModel("ChatResponseFormatText")] internal partial class InternalChatResponseFormatText { } +[CodeGenModel("ChatResponseFormatJsonObject")] internal partial class InternalChatResponseFormatJsonObject { } +[CodeGenModel("ChatResponseFormatJsonSchema")] internal partial class InternalChatResponseFormatJsonSchema { } +[CodeGenModel("UnknownChatResponseFormat")] internal partial class InternalUnknownChatResponseFormat { } +[CodeGenModel("ChatCompletionRequestMessageContentPartRefusal")] internal partial class InternalChatCompletionRequestMessageContentPartRefusal { } +[CodeGenModel("ChatCompletionRequestMessageContentPartRefusalType")] internal readonly partial struct InternalChatCompletionRequestMessageContentPartRefusalType { } +[CodeGenModel("CreateChatCompletionRequestServiceTier")] internal readonly partial struct InternalCreateChatCompletionRequestServiceTier { } +[CodeGenModel("CreateChatCompletionResponseServiceTier")] internal readonly partial struct 
InternalCreateChatCompletionResponseServiceTier { } +[CodeGenModel("CreateChatCompletionStreamResponseServiceTier")] internal readonly partial struct InternalCreateChatCompletionStreamResponseServiceTier { } diff --git a/.dotnet/src/Custom/Chat/Internal/InternalChatCompletionRequestMessageContentPartImageImageUrl.cs b/.dotnet/src/Custom/Chat/Internal/InternalChatCompletionRequestMessageContentPartImageImageUrl.cs new file mode 100644 index 000000000..be29738a8 --- /dev/null +++ b/.dotnet/src/Custom/Chat/Internal/InternalChatCompletionRequestMessageContentPartImageImageUrl.cs @@ -0,0 +1,80 @@ +using System; +using System.Collections.Generic; +using System.Text.RegularExpressions; + +namespace OpenAI.Chat; + +[CodeGenModel("ChatCompletionRequestMessageContentPartImageImageUrl")] +[CodeGenSuppress("InternalChatCompletionRequestMessageContentPartImageImageUrl", typeof(string))] +internal partial class InternalChatCompletionRequestMessageContentPartImageImageUrl +{ +#if NET8_0_OR_GREATER + [GeneratedRegex(@"^data:(?.+?);base64,(?.+)$")] + private static partial Regex ParseDataUriRegex(); +#else + private static Regex ParseDataUriRegex() => s_parseDataUriRegex; + private static readonly Regex s_parseDataUriRegex = new(@"^data:(?.+?);base64,(?.+)$", RegexOptions.Compiled); +#endif + + private readonly Uri _imageUri; + private readonly BinaryData _imageBytes; + private readonly string _imageBytesMediaType; + + // CUSTOM: Changed type from Uri to string to be able to support data URIs properly. + /// Either a URL of the image or the base64 encoded image data. + [CodeGenMember("Url")] + internal string Url { get; } + + /// Initializes a new instance of . + /// Either a URL of the image or the base64 encoded image data. + /// is null. 
+ public InternalChatCompletionRequestMessageContentPartImageImageUrl(Uri uri) + { + Argument.AssertNotNull(uri, nameof(uri)); + + _imageUri = uri; + + Url = uri.ToString(); + } + + public InternalChatCompletionRequestMessageContentPartImageImageUrl(BinaryData imageBytes, string imageBytesMediaType) + { + Argument.AssertNotNull(imageBytes, nameof(imageBytes)); + Argument.AssertNotNull(imageBytesMediaType, nameof(imageBytesMediaType)); + + _imageBytes = imageBytes; + _imageBytesMediaType = imageBytesMediaType; + + string base64EncodedData = Convert.ToBase64String(_imageBytes.ToArray()); + Url = $"data:{_imageBytesMediaType};base64,{base64EncodedData}"; + } + + /// Initializes a new instance of . + /// Either a URL of the image or the base64 encoded image data. + /// Specifies the detail level of the image. Learn more in the [Vision guide](/docs/guides/vision/low-or-high-fidelity-image-understanding). + /// Keeps track of any properties unknown to the library. + internal InternalChatCompletionRequestMessageContentPartImageImageUrl(string url, ImageChatMessageContentPartDetail? 
detail, IDictionary serializedAdditionalRawData) + { + Match parsedDataUri = ParseDataUriRegex().Match(url); + + if (parsedDataUri.Success) + { + _imageBytes = BinaryData.FromBytes(Convert.FromBase64String(parsedDataUri.Groups["data"].Value)); + _imageBytesMediaType = parsedDataUri.Groups["type"].Value; + } + else + { + _imageUri = new Uri(url); + } + + Url = url; + Detail = detail; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + public Uri ImageUri => _imageUri; + + public BinaryData ImageBytes => _imageBytes; + + public string ImageBytesMediaType => _imageBytesMediaType; +} diff --git a/.dotnet/src/Custom/Chat/Internal/InternalChatCompletionResponseMessage.Serialization.cs b/.dotnet/src/Custom/Chat/Internal/InternalChatCompletionResponseMessage.Serialization.cs new file mode 100644 index 000000000..4ed0cadda --- /dev/null +++ b/.dotnet/src/Custom/Chat/Internal/InternalChatCompletionResponseMessage.Serialization.cs @@ -0,0 +1,136 @@ +using System; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Runtime.CompilerServices; +using System.Text.Json; + +namespace OpenAI.Chat; + +[CodeGenSuppress("global::System.ClientModel.Primitives.IJsonModel.Write", typeof(Utf8JsonWriter), typeof(ModelReaderWriterOptions))] +internal partial class InternalChatCompletionResponseMessage : IJsonModel +{ + [MethodImpl(MethodImplOptions.AggressiveInlining)] + private void SerializeContentValue(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + throw new NotImplementedException(); + } + + [MethodImpl(MethodImplOptions.AggressiveInlining)] + private static void DeserializeContentValue(JsonProperty property, ref IReadOnlyList content, ModelReaderWriterOptions options = null) + { + throw new NotImplementedException(); + } + + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + => CustomSerializationHelpers.SerializeInstance(this, SerializeInternalChatCompletionResponseMessage, writer, 
options); + + internal static void SerializeInternalChatCompletionResponseMessage(InternalChatCompletionResponseMessage instance, Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + writer.WriteStartObject(); + if (Optional.IsCollectionDefined(instance.Content)) + { + if (instance.Content[0] != null) + { + writer.WritePropertyName("content"u8); + writer.WriteStringValue(instance.Content[0].Text); + } + else + { + writer.WriteNull("content"); + } + } + if (Optional.IsCollectionDefined(instance.ToolCalls)) + { + writer.WritePropertyName("tool_calls"u8); + writer.WriteStartArray(); + foreach (var item in instance.ToolCalls) + { + writer.WriteObjectValue(item, options); + } + writer.WriteEndArray(); + } + writer.WritePropertyName("role"u8); + writer.WriteStringValue(instance.Role.ToSerialString()); + if (Optional.IsDefined(instance.FunctionCall)) + { + writer.WritePropertyName("function_call"u8); + writer.WriteObjectValue(instance.FunctionCall, options); + } + writer.WriteSerializedAdditionalRawData(instance.SerializedAdditionalRawData, options); + writer.WriteEndObject(); + } + + internal static InternalChatCompletionResponseMessage DeserializeInternalChatCompletionResponseMessage(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + IReadOnlyList content = default; + string refusal = default; + IReadOnlyList toolCalls = default; + ChatMessageRole role = default; + ChatFunctionCall functionCall = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("content"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + List array = new List(); + array.Add(ChatMessageContentPart.CreateTextMessageContentPart(property.Value.GetString())); + content = array; 
+ continue; + } + if (property.NameEquals("refusal"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + refusal = property.Value.GetString(); + continue; + } + if (property.NameEquals("tool_calls"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + List array = new List(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(ChatToolCall.DeserializeChatToolCall(item, options)); + } + toolCalls = array; + continue; + } + if (property.NameEquals("role"u8)) + { + role = property.Value.GetString().ToChatMessageRole(); + continue; + } + if (property.NameEquals("function_call"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + functionCall = ChatFunctionCall.DeserializeChatFunctionCall(property.Value, options); + continue; + } + if (true) + { + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalChatCompletionResponseMessage(content ?? new ChangeTrackingList(), refusal, toolCalls ?? 
new ChangeTrackingList(), role, functionCall, serializedAdditionalRawData); + } +} diff --git a/.dotnet/src/Custom/Chat/Internal/InternalChatCompletionResponseMessage.cs b/.dotnet/src/Custom/Chat/Internal/InternalChatCompletionResponseMessage.cs new file mode 100644 index 000000000..92982f906 --- /dev/null +++ b/.dotnet/src/Custom/Chat/Internal/InternalChatCompletionResponseMessage.cs @@ -0,0 +1,25 @@ +using System.Collections.Generic; + +namespace OpenAI.Chat; + + +[CodeGenModel("ChatCompletionResponseMessage")] +[CodeGenSuppress("InternalChatCompletionResponseMessage", typeof(IEnumerable))] +[CodeGenSerialization(nameof(Content), SerializationValueHook = nameof(SerializeContentValue), DeserializationValueHook = nameof(DeserializeContentValue))] +internal partial class InternalChatCompletionResponseMessage +{ + // CUSTOM: Changed type from InternalChatCompletionResponseMessageRole. + /// The role of the author of this message. + [CodeGenMember("Role")] + public ChatMessageRole Role { get; } = ChatMessageRole.Assistant; + + // CUSTOM: Changed type from string. + /// The contents of the message. + [CodeGenMember("Content")] + public IReadOnlyList Content { get; } + + // CUSTOM: Changed type from InternalChatCompletionResponseMessageFunctionCall. + /// Deprecated and replaced by `tool_calls`. The name and arguments of a function that should be called, as generated by the model. 
+ [CodeGenMember("FunctionCall")] + public ChatFunctionCall FunctionCall { get; } +} diff --git a/.dotnet/src/Custom/Chat/Internal/InternalChatCompletionStreamResponseDelta.Serialization.cs b/.dotnet/src/Custom/Chat/Internal/InternalChatCompletionStreamResponseDelta.Serialization.cs new file mode 100644 index 000000000..0239d2571 --- /dev/null +++ b/.dotnet/src/Custom/Chat/Internal/InternalChatCompletionStreamResponseDelta.Serialization.cs @@ -0,0 +1,143 @@ +using System; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Runtime.CompilerServices; +using System.Text.Json; + +namespace OpenAI.Chat; + +[CodeGenSuppress("global::System.ClientModel.Primitives.IJsonModel.Write", typeof(Utf8JsonWriter), typeof(ModelReaderWriterOptions))] +internal partial class InternalChatCompletionStreamResponseDelta : IJsonModel +{ + [MethodImpl(MethodImplOptions.AggressiveInlining)] + private void SerializeContentValue(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + throw new NotImplementedException(); + } + + [MethodImpl(MethodImplOptions.AggressiveInlining)] + private static void DeserializeContentValue(JsonProperty property, ref IReadOnlyList content, ModelReaderWriterOptions options = null) + { + throw new NotImplementedException(); + } + + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + => CustomSerializationHelpers.SerializeInstance(this, SerializeInternalChatCompletionStreamResponseDelta, writer, options); + + internal static void SerializeInternalChatCompletionStreamResponseDelta(InternalChatCompletionStreamResponseDelta instance, Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + writer.WriteStartObject(); + if (Optional.IsCollectionDefined(instance.Content)) + { + if (instance.Content[0] != null) + { + writer.WritePropertyName("content"u8); + writer.WriteStringValue(instance.Content[0].Text); + } + else + { + writer.WriteNull("content"); + } + } + if 
(Optional.IsDefined(instance.FunctionCall)) + { + writer.WritePropertyName("function_call"u8); + writer.WriteObjectValue(instance.FunctionCall, options); + } + if (Optional.IsCollectionDefined(instance.ToolCalls)) + { + writer.WritePropertyName("tool_calls"u8); + writer.WriteStartArray(); + foreach (var item in instance.ToolCalls) + { + writer.WriteObjectValue(item, options); + } + writer.WriteEndArray(); + } + if (Optional.IsDefined(instance.Role)) + { + writer.WritePropertyName("role"u8); + writer.WriteStringValue(instance.Role.Value.ToSerialString()); + } + writer.WriteSerializedAdditionalRawData(instance.SerializedAdditionalRawData, options); + writer.WriteEndObject(); + } + + internal static InternalChatCompletionStreamResponseDelta DeserializeInternalChatCompletionStreamResponseDelta(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + IReadOnlyList content = default; + StreamingChatFunctionCallUpdate functionCall = default; + IReadOnlyList toolCalls = default; + ChatMessageRole? 
role = default; + string refusal = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("content"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + List array = new List(); + array.Add(ChatMessageContentPart.CreateTextMessageContentPart(property.Value.GetString())); + content = array; + continue; + } + if (property.NameEquals("function_call"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + functionCall = StreamingChatFunctionCallUpdate.DeserializeStreamingChatFunctionCallUpdate(property.Value, options); + continue; + } + if (property.NameEquals("tool_calls"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + List array = new List(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(StreamingChatToolCallUpdate.DeserializeStreamingChatToolCallUpdate(item, options)); + } + toolCalls = array; + continue; + } + if (property.NameEquals("role"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + role = property.Value.GetString().ToChatMessageRole(); + continue; + } + if (property.NameEquals("refusal"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + refusal = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalChatCompletionStreamResponseDelta(content ?? new ChangeTrackingList(), functionCall, toolCalls ?? 
new ChangeTrackingList(), role, refusal, serializedAdditionalRawData); + } +} diff --git a/.dotnet/src/Custom/Chat/Internal/InternalChatCompletionStreamResponseDelta.cs b/.dotnet/src/Custom/Chat/Internal/InternalChatCompletionStreamResponseDelta.cs new file mode 100644 index 000000000..215dc7b00 --- /dev/null +++ b/.dotnet/src/Custom/Chat/Internal/InternalChatCompletionStreamResponseDelta.cs @@ -0,0 +1,19 @@ +using System.Collections.Generic; + +namespace OpenAI.Chat; + +[CodeGenModel("ChatCompletionStreamResponseDelta")] +[CodeGenSuppress("InternalChatCompletionStreamResponseDelta")] +[CodeGenSerialization(nameof(Content), SerializationValueHook = nameof(SerializeContentValue), DeserializationValueHook = nameof(DeserializeContentValue))] +internal partial class InternalChatCompletionStreamResponseDelta +{ + // CUSTOM: Changed type from string. + /// The role of the author of this message. + [CodeGenMember("Role")] + public ChatMessageRole? Role { get; } + + // CUSTOM: Changed type from string. + /// The contents of the message. + [CodeGenMember("Content")] + public IReadOnlyList Content { get; } +} diff --git a/.dotnet/src/Custom/Chat/Internal/InternalCreateChatCompletionStreamResponseChoice.Serialization.cs b/.dotnet/src/Custom/Chat/Internal/InternalCreateChatCompletionStreamResponseChoice.Serialization.cs new file mode 100644 index 000000000..69b0c4c46 --- /dev/null +++ b/.dotnet/src/Custom/Chat/Internal/InternalCreateChatCompletionStreamResponseChoice.Serialization.cs @@ -0,0 +1,105 @@ +using System; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Chat; + +[CodeGenSuppress("global::System.ClientModel.Primitives.IJsonModel.Write", typeof(Utf8JsonWriter), typeof(ModelReaderWriterOptions))] +internal partial class InternalCreateChatCompletionStreamResponseChoice : IJsonModel +{ + // CUSTOM: + // - Made FinishReason nullable. 
+ void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + => CustomSerializationHelpers.SerializeInstance(this, SerializeInternalCreateChatCompletionStreamResponseChoice, writer, options); + + internal static void SerializeInternalCreateChatCompletionStreamResponseChoice(InternalCreateChatCompletionStreamResponseChoice instance, Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + writer.WriteStartObject(); + writer.WritePropertyName("delta"u8); + writer.WriteObjectValue(instance.Delta, options); + if (Optional.IsDefined(instance.Logprobs)) + { + if (instance.Logprobs != null) + { + writer.WritePropertyName("logprobs"u8); + writer.WriteObjectValue(instance.Logprobs, options); + } + else + { + writer.WriteNull("logprobs"); + } + } + if (Optional.IsDefined(instance.FinishReason)) + { + if (instance.FinishReason != null) + { + writer.WritePropertyName("finish_reason"u8); + writer.WriteStringValue(instance.FinishReason.Value.ToSerialString()); + } + else + { + writer.WriteNull("finish_reason"); + } + } + writer.WritePropertyName("index"u8); + writer.WriteNumberValue(instance.Index); + writer.WriteSerializedAdditionalRawData(instance.SerializedAdditionalRawData, options); + writer.WriteEndObject(); + } + + internal static InternalCreateChatCompletionStreamResponseChoice DeserializeInternalCreateChatCompletionStreamResponseChoice(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + InternalChatCompletionStreamResponseDelta delta = default; + InternalCreateChatCompletionStreamResponseChoiceLogprobs logprobs = default; + ChatFinishReason? 
finishReason = default; + int index = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("delta"u8)) + { + delta = InternalChatCompletionStreamResponseDelta.DeserializeInternalChatCompletionStreamResponseDelta(property.Value, options); + continue; + } + if (property.NameEquals("logprobs"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + logprobs = null; + continue; + } + logprobs = InternalCreateChatCompletionStreamResponseChoiceLogprobs.DeserializeInternalCreateChatCompletionStreamResponseChoiceLogprobs(property.Value, options); + continue; + } + if (property.NameEquals("finish_reason"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + finishReason = null; + continue; + } + finishReason = property.Value.GetString().ToChatFinishReason(); + continue; + } + if (property.NameEquals("index"u8)) + { + index = property.Value.GetInt32(); + continue; + } + if (true) + { + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalCreateChatCompletionStreamResponseChoice(delta, logprobs, finishReason, index, serializedAdditionalRawData); + } +} diff --git a/.dotnet/src/Custom/Chat/Internal/InternalCreateChatCompletionStreamResponseChoice.cs b/.dotnet/src/Custom/Chat/Internal/InternalCreateChatCompletionStreamResponseChoice.cs new file mode 100644 index 000000000..356812fbc --- /dev/null +++ b/.dotnet/src/Custom/Chat/Internal/InternalCreateChatCompletionStreamResponseChoice.cs @@ -0,0 +1,14 @@ +namespace OpenAI.Chat; + +internal partial class InternalCreateChatCompletionStreamResponseChoice +{ + // CUSTOM: Changed type from string. + /// + /// The reason the model stopped generating tokens. 
This will be `stop` if the model hit a natural stop point or a provided stop sequence, + /// `length` if the maximum number of tokens specified in the request was reached, + /// `content_filter` if content was omitted due to a flag from our content filters, + /// `tool_calls` if the model called a tool, or `function_call` (deprecated) if the model called a function. + /// + [CodeGenMember("FinishReason")] + public ChatFinishReason? FinishReason { get; } +} diff --git a/.dotnet/src/Custom/Chat/Internal/StreamingChatCompletionUpdateCollection.cs b/.dotnet/src/Custom/Chat/Internal/StreamingChatCompletionUpdateCollection.cs new file mode 100644 index 000000000..bc5d360d7 --- /dev/null +++ b/.dotnet/src/Custom/Chat/Internal/StreamingChatCompletionUpdateCollection.cs @@ -0,0 +1,147 @@ +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections; +using System.Collections.Generic; +using System.Diagnostics; +using System.Net.ServerSentEvents; +using System.Text.Json; + +#nullable enable + +namespace OpenAI.Chat; + +/// +/// Implementation of collection abstraction over streaming chat updates. 
+/// 
+internal class StreamingChatCompletionUpdateCollection : CollectionResult<StreamingChatCompletionUpdate>
+{
+    private readonly Func<ClientResult> _getResult;
+
+    public StreamingChatCompletionUpdateCollection(Func<ClientResult> getResult) : base()
+    {
+        Argument.AssertNotNull(getResult, nameof(getResult));
+
+        _getResult = getResult;
+    }
+
+    public override IEnumerator<StreamingChatCompletionUpdate> GetEnumerator()
+    {
+        return new StreamingChatUpdateEnumerator(_getResult, this);
+    }
+
+    private sealed class StreamingChatUpdateEnumerator : IEnumerator<StreamingChatCompletionUpdate>
+    {
+        private static ReadOnlySpan<byte> TerminalData => "[DONE]"u8;
+
+        private readonly Func<ClientResult> _getResult;
+        private readonly StreamingChatCompletionUpdateCollection _enumerable;
+
+        // These enumerators represent what is effectively a doubly-nested
+        // loop over the outer event collection and the inner update collection,
+        // i.e.:
+        //   foreach (var sse in _events) {
+        //       // get _updates from sse event
+        //       foreach (var update in _updates) { ... }
+        //   }
+        private IEnumerator<SseItem<byte[]>>? _events;
+        private IEnumerator<StreamingChatCompletionUpdate>? _updates;
+
+        private StreamingChatCompletionUpdate? _current;
+        private bool _started;
+
+        public StreamingChatUpdateEnumerator(Func<ClientResult> getResult,
+            StreamingChatCompletionUpdateCollection enumerable)
+        {
+            Debug.Assert(getResult is not null);
+            Debug.Assert(enumerable is not null);
+
+            _getResult = getResult!;
+            _enumerable = enumerable!;
+        }
+
+        StreamingChatCompletionUpdate IEnumerator<StreamingChatCompletionUpdate>.Current
+            => _current!;
+
+        object IEnumerator.Current => throw new NotImplementedException();
+
+        public bool MoveNext()
+        {
+            if (_events is null && _started)
+            {
+                throw new ObjectDisposedException(nameof(StreamingChatUpdateEnumerator));
+            }
+
+            _events ??= CreateEventEnumerator();
+            _started = true;
+
+            if (_updates is not null && _updates.MoveNext())
+            {
+                _current = _updates.Current;
+                return true;
+            }
+
+            if (_events.MoveNext())
+            {
+                if (_events.Current.Data.AsSpan().SequenceEqual(TerminalData))
+                {
+                    _current = default;
+                    return false;
+                }
+
+                using JsonDocument doc = JsonDocument.Parse(_events.Current.Data);
+                var updates = StreamingChatCompletionUpdate.DeserializeStreamingChatCompletionUpdates(doc.RootElement);
+                _updates = updates.GetEnumerator();
+
+                if (_updates.MoveNext())
+                {
+                    _current = _updates.Current;
+                    return true;
+                }
+            }
+
+            _current = default;
+            return false;
+        }
+
+        private IEnumerator<SseItem<byte[]>> CreateEventEnumerator()
+        {
+            ClientResult result = _getResult();
+            PipelineResponse response = result.GetRawResponse();
+            _enumerable.SetRawResponse(response);
+
+            if (response.ContentStream is null)
+            {
+                throw new InvalidOperationException("Unable to create result from response with null ContentStream");
+            }
+
+            IEnumerable<SseItem<byte[]>> enumerable = SseParser.Create(response.ContentStream, (_, bytes) => bytes.ToArray()).Enumerate();
+            return enumerable.GetEnumerator();
+        }
+
+        public void Reset()
+        {
+            throw new NotSupportedException("Cannot seek back in an SSE stream.");
+        }
+
+        public void Dispose()
+        {
+            Dispose(true);
+            GC.SuppressFinalize(this);
+        }
+
+        private void Dispose(bool disposing)
+        {
+            if (disposing && _events is not
null) + { + _events.Dispose(); + _events = null; + + // Dispose the response so we don't leave the unbuffered + // network stream open. + PipelineResponse response = _enumerable.GetRawResponse(); + response.Dispose(); + } + } + } +} diff --git a/.dotnet/src/Custom/Chat/Internal/UnknownChatMessage.Serialization.cs b/.dotnet/src/Custom/Chat/Internal/UnknownChatMessage.Serialization.cs new file mode 100644 index 000000000..02b8d2096 --- /dev/null +++ b/.dotnet/src/Custom/Chat/Internal/UnknownChatMessage.Serialization.cs @@ -0,0 +1,70 @@ +using System; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Chat; + +[CodeGenSuppress("global::System.ClientModel.Primitives.IJsonModel.Write", typeof(Utf8JsonWriter), typeof(ModelReaderWriterOptions))] +internal partial class UnknownChatMessage : IJsonModel +{ + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + => CustomSerializationHelpers.SerializeInstance(this, WriteCore, writer, options); + + internal static void WriteCore(UnknownChatMessage instance, Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + instance.WriteCore(writer, options); + } + + protected override void WriteCore(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + writer.WriteStartObject(); + writer.WritePropertyName("role"u8); + writer.WriteStringValue(Role.ToSerialString()); + ChatMessageContentPart.WriteCoreContentPartList(Content, writer, options); + writer.WriteSerializedAdditionalRawData(SerializedAdditionalRawData, options); + writer.WriteEndObject(); + } + + internal static UnknownChatMessage DeserializeUnknownChatMessage(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + ChatMessageRole? 
role = null; + IList content = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("role"u8)) + { + role = property.Value.GetString().ToChatMessageRole(); + continue; + } + if (property.NameEquals("content"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + List array = new List(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(ChatMessageContentPart.DeserializeChatMessageContentPart(item, options)); + } + content = array; + continue; + } + if (true) + { + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new UnknownChatMessage(role.Value, content ?? new ChangeTrackingList(), serializedAdditionalRawData); + } +} diff --git a/.dotnet/src/Custom/Chat/Internal/UnknownChatMessage.cs b/.dotnet/src/Custom/Chat/Internal/UnknownChatMessage.cs new file mode 100644 index 000000000..ac4faf00f --- /dev/null +++ b/.dotnet/src/Custom/Chat/Internal/UnknownChatMessage.cs @@ -0,0 +1,7 @@ +namespace OpenAI.Chat; + +[CodeGenModel("UnknownChatCompletionRequestMessage")] +internal partial class UnknownChatMessage : ChatMessage +{ + +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Chat/StreamingChatCompletionUpdate.cs b/.dotnet/src/Custom/Chat/StreamingChatCompletionUpdate.cs new file mode 100644 index 000000000..2ce62bd84 --- /dev/null +++ b/.dotnet/src/Custom/Chat/StreamingChatCompletionUpdate.cs @@ -0,0 +1,122 @@ +namespace OpenAI.Chat; + +using System; +using System.Collections.Generic; +using System.Text.Json; + +/// +/// Represents an incremental item of new data in a streaming response to a chat completion request. 
+/// 
+[CodeGenModel("CreateChatCompletionStreamResponse")]
+public partial class StreamingChatCompletionUpdate
+{
+    private IReadOnlyList _contentUpdate;
+    private IReadOnlyList _toolCallUpdates;
+    private IReadOnlyList _contentTokenLogProbabilities;
+    private IReadOnlyList _refusalTokenLogProbabilities;
+
+    // CUSTOM:
+    // - Made private. This property does not add value in the context of a strongly-typed class.
+    // - Changed type from string.
+    /// The object type, which is always `chat.completion.chunk`.
+    [CodeGenMember("Object")]
+    internal InternalCreateChatCompletionStreamResponseObject Object { get; } = InternalCreateChatCompletionStreamResponseObject.ChatCompletionChunk;
+
+    // CUSTOM: Made internal. We only get back a single choice, and instead we flatten the structure for usability.
+    /// 
+    /// A list of chat completion choices. Can contain more than one element if `n` is greater than 1. Can also be empty for the
+    /// last chunk if you set `stream_options: {"include_usage": true}`.
+    /// 
+    [CodeGenMember("Choices")]
+    internal IReadOnlyList Choices { get; }
+
+    // CUSTOM: Renamed.
+    /// The Unix timestamp (in seconds) of when the chat completion was created. Each chunk has the same timestamp.
+    [CodeGenMember("Created")]
+    public DateTimeOffset CreatedAt { get; }
+
+    // CUSTOM: Changed type from InternalCreateChatCompletionStreamResponseUsage.
+    /// 
+    /// An optional field that will only be present when you set `stream_options: {"include_usage": true}` in your request.
+    /// When present, it contains a null value except for the last chunk, which contains the token usage statistics for the entire request.
+    /// 
+    [CodeGenMember("Usage")]
+    public ChatTokenUsage Usage { get; }
+
+    // CUSTOM: Made internal.
+    [CodeGenMember("ServiceTier")]
+    internal InternalCreateChatCompletionStreamResponseServiceTier? ServiceTier { get; }
+
+    // CUSTOM: Flattened choice property.
+    /// 
+    /// Gets the associated with this update.
+    /// 
+    public ChatFinishReason?
FinishReason => (Choices.Count > 0)
+        ? Choices[0].FinishReason
+        : null;
+
+    // CUSTOM: Flattened choice logprobs property.
+    /// 
+    /// Log probability information.
+    /// 
+    public IReadOnlyList ContentTokenLogProbabilities => (Choices.Count > 0 && Choices[0].Logprobs != null)
+        ? Choices[0].Logprobs.Content
+        : _contentTokenLogProbabilities ??= new ChangeTrackingList();
+
+    // CUSTOM: Flattened refusal logprobs property.
+    public IReadOnlyList RefusalTokenLogProbabilities => (Choices.Count > 0 && Choices[0].Logprobs != null)
+        ? Choices[0].Logprobs.Refusal
+        : _refusalTokenLogProbabilities ??= new ChangeTrackingList();
+
+    // CUSTOM: Flattened choice delta property.
+    /// 
+    /// Gets the content fragment associated with this update.
+    /// 
+    /// 
+    /// 
+    /// Corresponds to e.g. $.choices[0].delta.content in the underlying REST schema.
+    /// 
+    /// Each update contains only a small number of tokens. When presenting or reconstituting a full, streamed
+    /// response, all values for the same chat completion should be combined.
+    /// 
+    public IReadOnlyList ContentUpdate => (Choices.Count > 0)
+        ? Choices[0].Delta.Content
+        : _contentUpdate ??= new ChangeTrackingList();
+
+    // CUSTOM: Flattened choice delta property.
+    /// 
+    /// Gets the associated with this update.
+    /// 
+    /// 
+    /// assignment typically occurs in a single update across a streamed Chat Completions choice,
+    /// and the value should be considered to persist for all subsequent updates.
+    /// 
+    public ChatMessageRole? Role => (Choices.Count > 0)
+        ? Choices[0].Delta.Role
+        : null;
+
+    // CUSTOM: Flattened choice delta property.
+    /// Gets the tool calls.
+    public IReadOnlyList ToolCallUpdates => (Choices.Count > 0)
+        ? Choices[0].Delta.ToolCalls
+        : _toolCallUpdates ??= new ChangeTrackingList();
+
+    // CUSTOM: Flattened choice delta property.
+    /// 
+    /// Deprecated and replaced by . The name and arguments of a function that
+    /// should be called, as generated by the model.
+ /// + public StreamingChatFunctionCallUpdate FunctionCallUpdate => (Choices.Count > 0) + ? Choices[0].Delta.FunctionCall + : null; + + // CUSTOM: Flattened choice delta property. + public string RefusalUpdate => (Choices.Count > 0) + ? Choices[0].Delta?.Refusal + : null; + + internal static List DeserializeStreamingChatCompletionUpdates(JsonElement element) + { + return [StreamingChatCompletionUpdate.DeserializeStreamingChatCompletionUpdate(element)]; + } +} diff --git a/.dotnet/src/Custom/Chat/StreamingChatFunctionCallUpdate.cs b/.dotnet/src/Custom/Chat/StreamingChatFunctionCallUpdate.cs new file mode 100644 index 000000000..ad6d8c36f --- /dev/null +++ b/.dotnet/src/Custom/Chat/StreamingChatFunctionCallUpdate.cs @@ -0,0 +1,28 @@ +namespace OpenAI.Chat; + +[CodeGenModel("ChatCompletionStreamResponseDeltaFunctionCall")] +public partial class StreamingChatFunctionCallUpdate +{ + // CUSTOM: Renamed. + /// The name of the function to call. + [CodeGenMember("Name")] + public string FunctionName { get; } + + // CUSTOM: Renamed. + /// + /// Gets a function arguments fragment associated with this update. + /// + /// + /// + /// Each update contains only a small number of tokens. When presenting or reconstituting a full, streamed + /// arguments body, all values should be combined. + /// + /// + /// As is the case for non-streaming , the content provided + /// for function arguments is not guaranteed to be well-formed JSON or to contain expected data. Callers + /// should validate function arguments before using them. 
+ /// + /// + [CodeGenMember("Arguments")] + public string FunctionArgumentsUpdate { get; } +} diff --git a/.dotnet/src/Custom/Chat/StreamingChatToolCallUpdate.cs b/.dotnet/src/Custom/Chat/StreamingChatToolCallUpdate.cs new file mode 100644 index 000000000..c53055a2f --- /dev/null +++ b/.dotnet/src/Custom/Chat/StreamingChatToolCallUpdate.cs @@ -0,0 +1,76 @@ +namespace OpenAI.Chat; + +using System; +using System.Text.Json; + +/// +/// A base representation of an incremental update to a streaming tool call that is part of a streaming chat completion +/// request. +/// +/// +/// +/// This type encapsulates the payload located in e.g. $.choices[0].delta.tool_calls[] in the REST API schema. +/// +/// +/// To differentiate between parallel streaming tool calls within a single streaming choice, use the value of the +/// property. +/// +/// +/// is the streaming, base class counterpart to . +/// +/// +[CodeGenModel("ChatCompletionMessageToolCallChunk")] +[CodeGenSuppress("StreamingChatToolCallUpdate", typeof(int))] +public partial class StreamingChatToolCallUpdate +{ + [CodeGenMember("Function")] + internal InternalChatCompletionMessageToolCallChunkFunction Function { get; } + + internal StreamingChatToolCallUpdate(int index, string id, InternalChatCompletionMessageToolCallChunkFunction function) + { + Argument.AssertNotNull(id, nameof(id)); + Argument.AssertNotNull(function, nameof(function)); + + Kind = ChatToolCallKind.Function; + + Index = index; + Id = id; + Function = function; + } + + // CUSTOM: + // - Renamed. + // - Changed type from string. + /// The kind of tool.Currently, only is supported. + [CodeGenMember("Type")] + public ChatToolCallKind Kind { get; } = ChatToolCallKind.Function; + + /// + /// The name of the function requested by the tool call. + /// + /// + /// + /// Corresponds to e.g. $.choices[0].delta.tool_calls[0].function.name in the REST API schema. 
+ /// + /// + /// For a streaming function tool call, this name will appear in a single streaming update payload, typically the + /// first. Use the property to differentiate between multiple, + /// parallel tool calls when streaming. + /// + /// + public string FunctionName => Function?.Name; + + /// + /// The next new segment of the function arguments for the function tool called by a streaming tool call. + /// These must be accumulated for the complete contents of the function arguments. + /// + /// + /// + /// Corresponds to e.g. $.choices[0].delta.tool_calls[0].function.arguments in the REST API schema. + /// + /// Note that the model does not always generate valid JSON and may hallucinate parameters + /// not defined by your function schema. Validate the arguments in your code before calling + /// your function. + /// + public string FunctionArgumentsUpdate => Function?.Arguments; +} diff --git a/.dotnet/src/Custom/Chat/SystemChatMessage.Serialization.cs b/.dotnet/src/Custom/Chat/SystemChatMessage.Serialization.cs new file mode 100644 index 000000000..b1063b03f --- /dev/null +++ b/.dotnet/src/Custom/Chat/SystemChatMessage.Serialization.cs @@ -0,0 +1,27 @@ +using System; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Chat; + +[CodeGenSuppress("global::System.ClientModel.Primitives.IJsonModel.Write", typeof(Utf8JsonWriter), typeof(ModelReaderWriterOptions))] +public partial class SystemChatMessage : IJsonModel +{ + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + => CustomSerializationHelpers.SerializeInstance(this, SerializeSystemChatMessage, writer, options); + + internal static void SerializeSystemChatMessage(SystemChatMessage instance, Utf8JsonWriter writer, ModelReaderWriterOptions options) + => instance.WriteCore(writer, options); + + protected override void WriteCore(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + 
writer.WriteStartObject(); + writer.WritePropertyName("role"u8); + writer.WriteStringValue(Role.ToSerialString()); + ChatMessageContentPart.WriteCoreContentPartList(Content, writer, options); + writer.WriteOptionalProperty("name"u8, ParticipantName, options); + writer.WriteSerializedAdditionalRawData(SerializedAdditionalRawData, options); + writer.WriteEndObject(); + } +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Chat/SystemChatMessage.cs b/.dotnet/src/Custom/Chat/SystemChatMessage.cs new file mode 100644 index 000000000..ed991f4d5 --- /dev/null +++ b/.dotnet/src/Custom/Chat/SystemChatMessage.cs @@ -0,0 +1,53 @@ +using System.Collections.Generic; + +namespace OpenAI.Chat; + +/// +/// Represents a chat message of the system role as supplied to a chat completion request. A system message is +/// generally supplied as the first message to a chat completion request and guides the model's behavior across future +/// assistant role response messages. These messages may help control behavior, style, tone, and +/// restrictions for a model-based assistant. +/// +[CodeGenModel("ChatCompletionRequestSystemMessage")] +public partial class SystemChatMessage : ChatMessage +{ + /// + /// Creates a new instance of using a collection of content items. + /// For system messages, these can only be of type text. + /// + /// + /// The collection of content items associated with the message. + /// + public SystemChatMessage(IEnumerable contentParts) + : base(ChatMessageRole.System, contentParts) + { } + + /// + /// Creates a new instance of using a collection of content items. + /// For system messages, these can only be of type text. + /// + /// + /// The collection of content items associated with the message. 
+ /// + public SystemChatMessage(params ChatMessageContentPart[] contentParts) + : base(ChatMessageRole.System, contentParts) + { + Argument.AssertNotNullOrEmpty(contentParts, nameof(contentParts)); + } + + /// + /// Creates a new instance of with a single item of text content. + /// + /// The text content of the message. + public SystemChatMessage(string content) + : base(ChatMessageRole.System, content) + { + Argument.AssertNotNull(content, nameof(content)); + } + + /// + /// An optional name for the participant. + /// + [CodeGenMember("Name")] + public string ParticipantName { get; set; } +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Chat/ToolChatMessage.Serialization.cs b/.dotnet/src/Custom/Chat/ToolChatMessage.Serialization.cs new file mode 100644 index 000000000..9eb64d17e --- /dev/null +++ b/.dotnet/src/Custom/Chat/ToolChatMessage.Serialization.cs @@ -0,0 +1,26 @@ +using System.ClientModel.Primitives; +using System.Text.Json; + +namespace OpenAI.Chat; + +[CodeGenSuppress("global::System.ClientModel.Primitives.IJsonModel.Write", typeof(Utf8JsonWriter), typeof(ModelReaderWriterOptions))] +public partial class ToolChatMessage : IJsonModel +{ + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + => CustomSerializationHelpers.SerializeInstance(this, SerializeToolChatMessage, writer, options); + + internal static void SerializeToolChatMessage(ToolChatMessage instance, Utf8JsonWriter writer, ModelReaderWriterOptions options) + => instance.WriteCore(writer, options); + + protected override void WriteCore(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + writer.WriteStartObject(); + writer.WritePropertyName("role"u8); + writer.WriteStringValue(Role.ToSerialString()); + writer.WritePropertyName("tool_call_id"u8); + writer.WriteStringValue(ToolCallId); + ChatMessageContentPart.WriteCoreContentPartList(Content, writer, options); + writer.WriteSerializedAdditionalRawData(SerializedAdditionalRawData, options); + 
writer.WriteEndObject(); + } +} diff --git a/.dotnet/src/Custom/Chat/ToolChatMessage.cs b/.dotnet/src/Custom/Chat/ToolChatMessage.cs new file mode 100644 index 000000000..c85f8efed --- /dev/null +++ b/.dotnet/src/Custom/Chat/ToolChatMessage.cs @@ -0,0 +1,79 @@ +using System.Collections.Generic; + +namespace OpenAI.Chat; + +/// +/// Represents a chat message of the tool role as supplied to a chat completion request. A tool message +/// encapsulates a resolution of a made by the model. The typical interaction flow featuring +/// tool messages is: +/// +/// A provides a on a request; +/// +/// Based on the name and description information of provided tools, the model responds with one or +/// more instances that need to be resolved to continue the logical conversation; +/// +/// +/// For each , the matching tool is invoked and its output is supplied back to the model +/// via a to resolve the tool call and allow the logical conversation to +/// continue. +/// +/// +/// +[CodeGenModel("ChatCompletionRequestToolMessage")] +[CodeGenSuppress("ToolChatMessage", typeof(IEnumerable), typeof(string))] +public partial class ToolChatMessage : ChatMessage +{ + /// + /// Creates a new instance of using a collection of content items. + /// For tool messages, these can only be of type text. + /// + /// + /// The ID of the tool call that this message responds to. + /// + /// + /// The collection of content items associated with the message. + /// + public ToolChatMessage(string toolCallId, IEnumerable contentParts) + : base(ChatMessageRole.Tool, contentParts) + { + Argument.AssertNotNullOrEmpty(toolCallId, nameof(toolCallId)); + Argument.AssertNotNullOrEmpty(contentParts, nameof(contentParts)); + + ToolCallId = toolCallId; + } + + /// + /// Creates a new instance of using a collection of content items. + /// For tool messages, these can only be of type text. + /// + /// + /// The ID of the tool call that this message responds to. 
+ /// + /// + /// The collection of content items associated with the message. + /// + public ToolChatMessage(string toolCallId, params ChatMessageContentPart[] contentParts) + : base(ChatMessageRole.Tool, contentParts) + { + Argument.AssertNotNullOrEmpty(toolCallId, nameof(toolCallId)); + Argument.AssertNotNullOrEmpty(contentParts, nameof(contentParts)); + + ToolCallId = toolCallId; + } + + /// + /// Creates a new instance of with a single item of text content. + /// + /// + /// The ID of the tool call that this message responds to. + /// + /// The text content of the message. + public ToolChatMessage(string toolCallId, string content) + : base(ChatMessageRole.Tool, content) + { + Argument.AssertNotNullOrEmpty(toolCallId, nameof(toolCallId)); + Argument.AssertNotNull(content, nameof(content)); + + ToolCallId = toolCallId; + } +} diff --git a/.dotnet/src/Custom/Chat/UserChatMessage.Serialization.cs b/.dotnet/src/Custom/Chat/UserChatMessage.Serialization.cs new file mode 100644 index 000000000..fcd98a834 --- /dev/null +++ b/.dotnet/src/Custom/Chat/UserChatMessage.Serialization.cs @@ -0,0 +1,27 @@ +using System; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Chat; + +[CodeGenSuppress("global::System.ClientModel.Primitives.IJsonModel.Write", typeof(Utf8JsonWriter), typeof(ModelReaderWriterOptions))] +public partial class UserChatMessage : IJsonModel +{ + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + => CustomSerializationHelpers.SerializeInstance(this, SerializeUserChatMessage, writer, options); + + internal static void SerializeUserChatMessage(UserChatMessage instance, Utf8JsonWriter writer, ModelReaderWriterOptions options) + => instance.WriteCore(writer, options); + + protected override void WriteCore(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + writer.WriteStartObject(); + writer.WritePropertyName("role"u8); + 
writer.WriteStringValue(Role.ToSerialString()); + ChatMessageContentPart.WriteCoreContentPartList(Content, writer, options); + writer.WriteOptionalProperty("name"u8, ParticipantName, options); + writer.WriteSerializedAdditionalRawData(SerializedAdditionalRawData, options); + writer.WriteEndObject(); + } +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Chat/UserChatMessage.cs b/.dotnet/src/Custom/Chat/UserChatMessage.cs new file mode 100644 index 000000000..18a881117 --- /dev/null +++ b/.dotnet/src/Custom/Chat/UserChatMessage.cs @@ -0,0 +1,57 @@ +using System; +using System.Collections.Generic; + +namespace OpenAI.Chat; + +/// +/// Represents a chat message of the user role as supplied to a chat completion request. A user message contains +/// information originating from the caller and serves as a prompt for the model to complete. User messages may result +/// in either direct assistant message responses or in calls to supplied tools or functions. +/// +[CodeGenModel("ChatCompletionRequestUserMessage")] +[CodeGenSuppress("UserChatMessage", typeof(ReadOnlyMemory))] +public partial class UserChatMessage : ChatMessage +{ + /// + /// Creates a new instance of using a collection of content items that can + /// include text and image information. This content format is currently only applicable to the + /// gpt-4o and later models and will not be accepted by older models. + /// + /// + /// The collection of text and image content items associated with the message. + /// + public UserChatMessage(IEnumerable content) + : base(ChatMessageRole.User, content) + { + Argument.AssertNotNullOrEmpty(content, nameof(content)); + } + + /// + /// Creates a new instance of using a collection of content items that can + /// include text and image information. This content format is currently only applicable to the + /// gpt-4o and later models and will not be accepted by older models. 
+ /// + /// + /// The collection of text and image content items associated with the message. + /// + public UserChatMessage(params ChatMessageContentPart[] content) + : base(ChatMessageRole.User, content) + { } + + /// + /// Creates a new instance of with ordinary text content. + /// + /// The textual content associated with the message. + public UserChatMessage(string content) + : base(ChatMessageRole.User, content) + { + Argument.AssertNotNull(content, nameof(content)); + } + + // CUSTOM: Rename. + /// + /// An optional name for the participant. + /// + [CodeGenMember("Name")] + public string ParticipantName { get; set; } +} diff --git a/.dotnet/src/Custom/Common/Internal/GeneratorStubs.cs b/.dotnet/src/Custom/Common/Internal/GeneratorStubs.cs new file mode 100644 index 000000000..d0c7de7f2 --- /dev/null +++ b/.dotnet/src/Custom/Common/Internal/GeneratorStubs.cs @@ -0,0 +1,11 @@ +using System; + +namespace OpenAI.Internal; + +[CodeGenModel("OmniTypedResponseFormat")] internal partial class InternalOmniTypedResponseFormat { } +[CodeGenModel("ResponseFormatJsonObject")] internal partial class InternalResponseFormatJsonObject { } +[CodeGenModel("ResponseFormatJsonSchema")] internal partial class InternalResponseFormatJsonSchema { } +[CodeGenModel("ResponseFormatJsonSchemaJsonSchema")] internal partial class InternalResponseFormatJsonSchemaJsonSchema { [CodeGenMember("Schema")] public BinaryData Schema { get; set; } } +[CodeGenModel("ResponseFormatJsonSchemaSchema")] internal partial class InternalResponseFormatJsonSchemaSchema { } +[CodeGenModel("ResponseFormatText")] internal partial class InternalResponseFormatText { } +[CodeGenModel("UnknownOmniTypedResponseFormat")] internal partial class InternalUnknownOmniTypedResponseFormat { } diff --git a/.dotnet/src/Custom/Common/ListOrder.cs b/.dotnet/src/Custom/Common/ListOrder.cs new file mode 100644 index 000000000..41d3e065c --- /dev/null +++ b/.dotnet/src/Custom/Common/ListOrder.cs @@ -0,0 +1,12 @@ +namespace 
OpenAI; + +[CodeGenModel("ListOrder")] +public readonly partial struct ListOrder +{ + // CUSTOM: Rename members. + + [CodeGenMember("Asc")] + public static ListOrder OldestFirst { get; } = new ListOrder(OldestFirstValue); + [CodeGenMember("Desc")] + public static ListOrder NewestFirst { get; } = new ListOrder(NewestFirstValue); +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Embeddings/Embedding.cs b/.dotnet/src/Custom/Embeddings/Embedding.cs new file mode 100644 index 000000000..bc21ad734 --- /dev/null +++ b/.dotnet/src/Custom/Embeddings/Embedding.cs @@ -0,0 +1,145 @@ +using System; +using System.Buffers; +using System.Buffers.Binary; +using System.Buffers.Text; +using System.Collections.Generic; +using System.Runtime.InteropServices; + +namespace OpenAI.Embeddings; + +/// +/// Represents an embedding vector returned by embedding endpoint. +/// +[CodeGenModel("Embedding")] +[CodeGenSuppress("Embedding", typeof(int), typeof(BinaryData))] +public partial class Embedding +{ + // CUSTOM: Made private. The value of the embedding is publicly exposed as ReadOnlyMemory instead of BinaryData. + /// + /// The embedding vector, which is a list of floats. The length of vector depends on the model as + /// listed in the [embedding guide](/docs/guides/embeddings). + /// + /// To assign an object to this property use . + /// + /// + /// To assign an already formatted json string to this property use . + /// + /// + /// + /// Supported types: + /// + /// + /// where T is of type + /// + /// + /// + /// + /// + /// + /// Examples: + /// + /// + /// BinaryData.FromObjectAsJson("foo") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromString("\"foo\"") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromObjectAsJson(new { key = "value" }) + /// Creates a payload of { "key": "value" }. + /// + /// + /// BinaryData.FromString("{\"key\": \"value\"}") + /// Creates a payload of { "key": "value" }. 
+ /// + /// + /// + /// + private BinaryData EmbeddingProperty { get; } + + // CUSTOM: Made private. This property does not add value in the context of a strongly-typed class. + /// The object type, which is always "embedding". + private InternalEmbeddingObject Object { get; } = InternalEmbeddingObject.Embedding; + + // CUSTOM: Added logic to handle additional custom properties. + /// Initializes a new instance of . + /// The index of the embedding in the list of embeddings. + /// + /// The embedding vector, which is a list of floats. The length of vector depends on the model as + /// listed in the [embedding guide](/docs/guides/embeddings). + /// + /// The object type, which is always "embedding". + /// Keeps track of any properties unknown to the library. + internal Embedding(int index, BinaryData embeddingProperty, InternalEmbeddingObject @object, IDictionary serializedAdditionalRawData) + { + Index = (int)index; + EmbeddingProperty = embeddingProperty; + Object = @object; + SerializedAdditionalRawData = serializedAdditionalRawData; + + // Handle additional custom properties. + Vector = ConvertToVectorOfFloats(embeddingProperty); + } + + // CUSTOM: Entirely custom constructor used by the Model Factory. + /// Initializes a new instance of . + /// The index of the embedding in the list of embeddings. + /// The embedding vector, which is a list of floats. + internal Embedding(int index, ReadOnlyMemory vector) + { + Index = index; + Vector = vector; + } + + // CUSTOM: Added as a public, custom property. For slightly better performance, the embedding is always requested as a base64-encoded + // string and then manually transformed into a more user-friendly ReadOnlyMemory. + /// + /// The embedding vector, which is a list of floats. + /// + public ReadOnlyMemory Vector { get; } + + // CUSTOM: Implemented custom logic to transform from BinaryData to ReadOnlyMemory. 
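+    // The wire value is a quoted base64 JSON string of little-endian IEEE 754 float32 bytes.
+    // Illustrative example (not taken from the source): "AACAPwAAAEA=" decodes to [1.0f, 2.0f].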
+ private static ReadOnlyMemory ConvertToVectorOfFloats(BinaryData binaryData) + { + ReadOnlySpan base64 = binaryData.ToMemory().Span; + + // Remove quotes around base64 string. + if (base64.Length < 2 || base64[0] != (byte)'"' || base64[base64.Length - 1] != (byte)'"') + { + ThrowInvalidData(); + } + base64 = base64.Slice(1, base64.Length - 2); + + // Decode base64 string to bytes. + byte[] bytes = ArrayPool.Shared.Rent(Base64.GetMaxDecodedFromUtf8Length(base64.Length)); + OperationStatus status = Base64.DecodeFromUtf8(base64, bytes.AsSpan(), out int bytesConsumed, out int bytesWritten); + if (status != OperationStatus.Done || bytesWritten % sizeof(float) != 0) + { + ThrowInvalidData(); + } + + // Interpret bytes as floats + float[] vector = new float[bytesWritten / sizeof(float)]; + bytes.AsSpan(0, bytesWritten).CopyTo(MemoryMarshal.AsBytes(vector.AsSpan())); + if (!BitConverter.IsLittleEndian) + { + Span ints = MemoryMarshal.Cast(vector.AsSpan()); +#if NET8_0_OR_GREATER + BinaryPrimitives.ReverseEndianness(ints, ints); +#else + for (int i = 0; i < ints.Length; i++) + { + ints[i] = BinaryPrimitives.ReverseEndianness(ints[i]); + } +#endif + } + + ArrayPool.Shared.Return(bytes); + return new ReadOnlyMemory(vector); + + static void ThrowInvalidData() => + throw new FormatException("The input is not a valid Base64 string of encoded floats."); + } +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Embeddings/EmbeddingClient.Protocol.cs b/.dotnet/src/Custom/Embeddings/EmbeddingClient.Protocol.cs new file mode 100644 index 000000000..94dfb9d6f --- /dev/null +++ b/.dotnet/src/Custom/Embeddings/EmbeddingClient.Protocol.cs @@ -0,0 +1,54 @@ +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.ComponentModel; +using System.Threading.Tasks; + +namespace OpenAI.Embeddings; + +[CodeGenSuppress("CreateEmbeddingAsync", typeof(BinaryContent), typeof(RequestOptions))] +[CodeGenSuppress("CreateEmbedding", typeof(BinaryContent), 
typeof(RequestOptions))] +public partial class EmbeddingClient +{ + // CUSTOM: + // - Renamed. + // - Edited the cref in the doc comment to point to the correct convenience overload after it was also renamed. + // - Added the EditorBrowsable attribute to hide protocol methods from IntelliSense when a convenience overload is available. + /// + /// [Protocol Method] Creates an embedding vector representing the input text. + /// + /// The content to send as the body of the request. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// is null. + /// Service returned a non-success status code. + /// The response returned from the service. + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual async Task GenerateEmbeddingsAsync(BinaryContent content, RequestOptions options = null) + { + Argument.AssertNotNull(content, nameof(content)); + + using PipelineMessage message = CreateCreateEmbeddingRequest(content, options); + return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + // CUSTOM: + // - Renamed. + // - Edited the cref in the doc comment to point to the correct convenience overload after it was also renamed. + // - Added the EditorBrowsable attribute to hide protocol methods from IntelliSense when a convenience overload is available. + /// + /// [Protocol Method] Creates an embedding vector representing the input text. + /// + /// The content to send as the body of the request. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// is null. + /// Service returned a non-success status code. + /// The response returned from the service. 
+ [EditorBrowsable(EditorBrowsableState.Never)] + public virtual ClientResult GenerateEmbeddings(BinaryContent content, RequestOptions options = null) + { + Argument.AssertNotNull(content, nameof(content)); + + using PipelineMessage message = CreateCreateEmbeddingRequest(content, options); + return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options)); + } +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Embeddings/EmbeddingClient.cs b/.dotnet/src/Custom/Embeddings/EmbeddingClient.cs new file mode 100644 index 000000000..eae8dad2c --- /dev/null +++ b/.dotnet/src/Custom/Embeddings/EmbeddingClient.cs @@ -0,0 +1,201 @@ +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Linq; +using System.Threading; +using System.Threading.Tasks; + +namespace OpenAI.Embeddings; + +// CUSTOM: +// - Renamed. +// - Suppressed constructor that takes endpoint parameter; endpoint is now a property in the options class. +// - Suppressed methods that only take the options parameter. +/// The service client for OpenAI embedding operations. +[CodeGenClient("Embeddings")] +[CodeGenSuppress("EmbeddingClient", typeof(ClientPipeline), typeof(ApiKeyCredential), typeof(Uri))] +[CodeGenSuppress("CreateEmbeddingAsync", typeof(EmbeddingGenerationOptions))] +[CodeGenSuppress("CreateEmbedding", typeof(EmbeddingGenerationOptions))] +public partial class EmbeddingClient +{ + private readonly string _model; + + // CUSTOM: + // - Added `model` parameter. + // - Used a custom pipeline. + // - Demoted the endpoint parameter to be a property in the options class. + /// Initializes a new instance of . + /// The name of the model to use in requests sent to the service. To learn more about the available models, see . + /// The API key to authenticate with the service. + /// or is null. + /// is an empty string, and was expected to be non-empty. 
+ public EmbeddingClient(string model, ApiKeyCredential credential) : this(model, credential, new OpenAIClientOptions()) + { + } + + // CUSTOM: + // - Added `model` parameter. + // - Used a custom pipeline. + // - Demoted the endpoint parameter to be a property in the options class. + /// Initializes a new instance of . + /// The name of the model to use in requests sent to the service. To learn more about the available models, see . + /// The API key to authenticate with the service. + /// The options to configure the client. + /// or is null. + /// is an empty string, and was expected to be non-empty. + public EmbeddingClient(string model, ApiKeyCredential credential, OpenAIClientOptions options) + { + Argument.AssertNotNullOrEmpty(model, nameof(model)); + Argument.AssertNotNull(credential, nameof(credential)); + options ??= new OpenAIClientOptions(); + + _model = model; + _pipeline = OpenAIClient.CreatePipeline(credential, options); + _endpoint = OpenAIClient.GetEndpoint(options); + } + + // CUSTOM: + // - Added `model` parameter. + // - Used a custom pipeline. + // - Demoted the endpoint parameter to be a property in the options class. + // - Made protected. + /// Initializes a new instance of . + /// The HTTP pipeline to send and receive REST requests and responses. + /// The name of the model to use in requests sent to the service. To learn more about the available models, see . + /// The options to configure the client. + /// or is null. + /// is an empty string, and was expected to be non-empty. + protected internal EmbeddingClient(ClientPipeline pipeline, string model, OpenAIClientOptions options) + { + Argument.AssertNotNull(pipeline, nameof(pipeline)); + Argument.AssertNotNullOrEmpty(model, nameof(model)); + options ??= new OpenAIClientOptions(); + + _model = model; + _pipeline = pipeline; + _endpoint = OpenAIClient.GetEndpoint(options); + } + + // CUSTOM: Added to simplify generating a single embedding from a string input. 
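+    // Illustrative call pattern; the model name and key variable below are placeholders,
+    // not values from this change:
+    //   EmbeddingClient client = new("text-embedding-3-small", new ApiKeyCredential(key));
+    //   ReadOnlyMemory<float> vector = client.GenerateEmbedding("hello world").Value.Vector;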
+ /// Generates an embedding representing the text input. + /// The text input to generate an embedding for. + /// The options to configure the embedding generation. + /// A token that can be used to cancel this method call. + /// is null. + /// is an empty string, and was expected to be non-empty. + public virtual async Task> GenerateEmbeddingAsync(string input, EmbeddingGenerationOptions options = null, CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(input, nameof(input)); + + options ??= new(); + CreateEmbeddingGenerationOptions(BinaryData.FromObjectAsJson(input), ref options); + + using BinaryContent content = options.ToBinaryContent(); + ClientResult result = await GenerateEmbeddingsAsync(content, cancellationToken.ToRequestOptions()).ConfigureAwait(false); + return ClientResult.FromValue(EmbeddingCollection.FromResponse(result.GetRawResponse()).FirstOrDefault(), result.GetRawResponse()); + } + + // CUSTOM: Added to simplify generating a single embedding from a string input. + /// Generates an embedding representing the text input. + /// The text input to generate an embedding for. + /// The options to configure the embedding generation. + /// A token that can be used to cancel this method call. + /// is null. + /// is an empty string, and was expected to be non-empty. 
+ public virtual ClientResult GenerateEmbedding(string input, EmbeddingGenerationOptions options = null, CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(input, nameof(input)); + + options ??= new(); + CreateEmbeddingGenerationOptions(BinaryData.FromObjectAsJson(input), ref options); + + using BinaryContent content = options.ToBinaryContent(); + ClientResult result = GenerateEmbeddings(content, cancellationToken.ToRequestOptions()); + return ClientResult.FromValue(EmbeddingCollection.FromResponse(result.GetRawResponse()).FirstOrDefault(), result.GetRawResponse()); + } + + // CUSTOM: Added to simplify passing the input as a collection of strings instead of BinaryData. + /// Generates embeddings representing the text inputs. + /// The text inputs to generate embeddings for. + /// The options to configure the embedding generation. + /// A token that can be used to cancel this method call. + /// is null. + /// is an empty collection, and was expected to be non-empty. + public virtual async Task> GenerateEmbeddingsAsync(IEnumerable inputs, EmbeddingGenerationOptions options = null, CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(inputs, nameof(inputs)); + + options ??= new(); + CreateEmbeddingGenerationOptions(BinaryData.FromObjectAsJson(inputs), ref options); + + using BinaryContent content = options.ToBinaryContent(); + ClientResult result = await GenerateEmbeddingsAsync(content, cancellationToken.ToRequestOptions()).ConfigureAwait(false); + return ClientResult.FromValue(EmbeddingCollection.FromResponse(result.GetRawResponse()), result.GetRawResponse()); + + } + + // CUSTOM: Added to simplify passing the input as a collection of strings instead of BinaryData. + /// Generates embeddings representing the text inputs. + /// The text inputs to generate embeddings for. + /// The options to configure the embedding generation. + /// A token that can be used to cancel this method call. + /// is null. 
+ /// is an empty collection, and was expected to be non-empty. + public virtual ClientResult GenerateEmbeddings(IEnumerable inputs, EmbeddingGenerationOptions options = null, CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(inputs, nameof(inputs)); + + options ??= new(); + CreateEmbeddingGenerationOptions(BinaryData.FromObjectAsJson(inputs), ref options); + + using BinaryContent content = options.ToBinaryContent(); + ClientResult result = GenerateEmbeddings(content, cancellationToken.ToRequestOptions()); + return ClientResult.FromValue(EmbeddingCollection.FromResponse(result.GetRawResponse()), result.GetRawResponse()); + } + + // CUSTOM: Added to simplify passing the input as a collection of a collection of tokens instead of BinaryData. + /// Generates embeddings representing the text inputs. + /// The text inputs to generate embeddings for. + /// The options to configure the embedding generation. + /// A token that can be used to cancel this method call. + /// is null. + /// is an empty collection, and was expected to be non-empty. + public virtual async Task> GenerateEmbeddingsAsync(IEnumerable> inputs, EmbeddingGenerationOptions options = null, CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(inputs, nameof(inputs)); + + options ??= new(); + CreateEmbeddingGenerationOptions(BinaryData.FromObjectAsJson(inputs), ref options); + + using BinaryContent content = options.ToBinaryContent(); + ClientResult result = await GenerateEmbeddingsAsync(content, cancellationToken.ToRequestOptions()).ConfigureAwait(false); + return ClientResult.FromValue(EmbeddingCollection.FromResponse(result.GetRawResponse()), result.GetRawResponse()); + } + + // CUSTOM: Added to simplify passing the input as a collection of a collection of tokens instead of BinaryData. + /// Generates embeddings representing the text inputs. + /// The text inputs to generate embeddings for. 
+ /// The options to configure the embedding generation. + /// A token that can be used to cancel this method call. + /// is null. + /// is an empty collection, and was expected to be non-empty. + public virtual ClientResult GenerateEmbeddings(IEnumerable> inputs, EmbeddingGenerationOptions options = null, CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(inputs, nameof(inputs)); + + options ??= new(); + CreateEmbeddingGenerationOptions(BinaryData.FromObjectAsJson(inputs), ref options); + + using BinaryContent content = options.ToBinaryContent(); + ClientResult result = GenerateEmbeddings(content, cancellationToken.ToRequestOptions()); + return ClientResult.FromValue(EmbeddingCollection.FromResponse(result.GetRawResponse()), result.GetRawResponse()); + } + + private void CreateEmbeddingGenerationOptions(BinaryData input, ref EmbeddingGenerationOptions options) + { + options.Input = input; + options.Model = _model; + options.EncodingFormat = InternalCreateEmbeddingRequestEncodingFormat.Base64; + } +} diff --git a/.dotnet/src/Custom/Embeddings/EmbeddingCollection.Serialization.cs b/.dotnet/src/Custom/Embeddings/EmbeddingCollection.Serialization.cs new file mode 100644 index 000000000..7bafe4a7b --- /dev/null +++ b/.dotnet/src/Custom/Embeddings/EmbeddingCollection.Serialization.cs @@ -0,0 +1,87 @@ +using System; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Embeddings; + +[CodeGenSuppress("global::System.ClientModel.Primitives.IJsonModel.Write", typeof(Utf8JsonWriter), typeof(ModelReaderWriterOptions))] +public partial class EmbeddingCollection : IJsonModel +{ + // CUSTOM: + // - Serialized the Items property. + // - Recovered the deserialization of SerializedAdditionalRawData. See https://github.com/Azure/autorest.csharp/issues/4636. 
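+    // Illustrative response payload this type round-trips (field values are placeholders):
+    //   { "object": "list", "model": "...",
+    //     "data": [ { "object": "embedding", "index": 0, "embedding": "<base64 floats>" } ],
+    //     "usage": { "prompt_tokens": 2, "total_tokens": 2 } }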
+ void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + => CustomSerializationHelpers.SerializeInstance(this, SerializeEmbeddingCollection, writer, options); + + internal static void SerializeEmbeddingCollection(EmbeddingCollection instance, Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + writer.WriteStartObject(); + writer.WritePropertyName("data"u8); + writer.WriteStartArray(); + foreach (var item in instance.Items) + { + writer.WriteObjectValue(item, options); + } + writer.WriteEndArray(); + writer.WritePropertyName("model"u8); + writer.WriteStringValue(instance.Model); + writer.WritePropertyName("object"u8); + writer.WriteStringValue(instance.Object.ToString()); + writer.WritePropertyName("usage"u8); + writer.WriteObjectValue(instance.Usage, options); + writer.WriteSerializedAdditionalRawData(instance.SerializedAdditionalRawData, options); + writer.WriteEndObject(); + } + + // CUSTOM: Recovered the deserialization of SerializedAdditionalRawData. See https://github.com/Azure/autorest.csharp/issues/4636. 
+ internal static EmbeddingCollection DeserializeEmbeddingCollection(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= new ModelReaderWriterOptions("W"); + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + IReadOnlyList data = default; + string model = default; + InternalCreateEmbeddingResponseObject @object = default; + EmbeddingTokenUsage usage = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("data"u8)) + { + List array = new List(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(Embedding.DeserializeEmbedding(item, options)); + } + data = array; + continue; + } + if (property.NameEquals("model"u8)) + { + model = property.Value.GetString(); + continue; + } + if (property.NameEquals("object"u8)) + { + @object = new InternalCreateEmbeddingResponseObject(property.Value.GetString()); + continue; + } + if (property.NameEquals("usage"u8)) + { + usage = EmbeddingTokenUsage.DeserializeEmbeddingTokenUsage(property.Value, options); + continue; + } + if (true) + { + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new EmbeddingCollection(data, model, @object, usage, serializedAdditionalRawData); + } +} diff --git a/.dotnet/src/Custom/Embeddings/EmbeddingCollection.cs b/.dotnet/src/Custom/Embeddings/EmbeddingCollection.cs new file mode 100644 index 000000000..7fe4ff179 --- /dev/null +++ b/.dotnet/src/Custom/Embeddings/EmbeddingCollection.cs @@ -0,0 +1,90 @@ +using System; +using System.Collections.Generic; +using System.Collections.ObjectModel; + +namespace OpenAI.Embeddings; + +[CodeGenModel("CreateEmbeddingResponse")] +[CodeGenSuppress("Data")] +[CodeGenSuppress(nameof(EmbeddingCollection))] +[CodeGenSuppress(nameof(EmbeddingCollection), 
typeof(IReadOnlyList), typeof(string), typeof(InternalCreateEmbeddingResponseObject), typeof(EmbeddingTokenUsage))] +public partial class EmbeddingCollection : ReadOnlyCollection +{ + // CUSTOM: Made private. This property does not add value in the context of a strongly-typed class. + /// The object type, which is always "list". + [CodeGenMember("Object")] + private InternalCreateEmbeddingResponseObject Object { get; } = InternalCreateEmbeddingResponseObject.List; + + // CUSTOM: Recovered this field. See https://github.com/Azure/autorest.csharp/issues/4636. + /// + /// Keeps track of any properties unknown to the library. + /// + /// To assign an object to the value of this property use . + /// + /// + /// To assign an already formatted json string to this property use . + /// + /// + /// Examples: + /// + /// + /// BinaryData.FromObjectAsJson("foo") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromString("\"foo\"") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromObjectAsJson(new { key = "value" }) + /// Creates a payload of { "key": "value" }. + /// + /// + /// BinaryData.FromString("{\"key\": \"value\"}") + /// Creates a payload of { "key": "value" }. + /// + /// + /// + /// + private IDictionary SerializedAdditionalRawData; + + // CUSTOM: Set the inherited Items property via the base constructor in favor of the suppressed Data property. + /// Initializes a new instance of . + /// The list of embeddings generated by the model. + /// The name of the model used to generate the embedding. + /// The usage information for the request. + /// , or is null. + internal EmbeddingCollection(IEnumerable data, string model, EmbeddingTokenUsage usage) + : base([.. 
data]) + { + Argument.AssertNotNull(data, nameof(data)); + Argument.AssertNotNull(model, nameof(model)); + Argument.AssertNotNull(usage, nameof(usage)); + + Model = model; + Usage = usage; + } + + // CUSTOM: Set the inherited Items property via the base constructor in favor of the suppressed Data property. + /// Initializes a new instance of . + /// The list of embeddings generated by the model. + /// The name of the model used to generate the embedding. + /// The object type, which is always "list". + /// The usage information for the request. + /// Keeps track of any properties unknown to the library. + internal EmbeddingCollection(IReadOnlyList data, string model, InternalCreateEmbeddingResponseObject @object, EmbeddingTokenUsage usage, IDictionary serializedAdditionalRawData) + : base([.. data]) + { + Model = model; + Object = @object; + Usage = usage; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + // CUSTOM: Set the inherited Items property via the base constructor in favor of the suppressed Data property. + /// Initializes a new instance of for deserialization. + internal EmbeddingCollection() + : base([]) + { + } +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Embeddings/EmbeddingGenerationOptions.cs b/.dotnet/src/Custom/Embeddings/EmbeddingGenerationOptions.cs new file mode 100644 index 000000000..0c3760797 --- /dev/null +++ b/.dotnet/src/Custom/Embeddings/EmbeddingGenerationOptions.cs @@ -0,0 +1,97 @@ +using System; +using System.Collections.Generic; + +namespace OpenAI.Embeddings; + +[CodeGenModel("CreateEmbeddingRequest")] +[CodeGenSuppress("EmbeddingGenerationOptions", typeof(BinaryData), typeof(InternalCreateEmbeddingRequestModel))] +public partial class EmbeddingGenerationOptions +{ + // CUSTOM: + // - Made internal. This value comes from a parameter on the client method. + // - Added setter. + /// + /// Input text to embed, encoded as a string or array of tokens. 
To embed multiple inputs in a + /// single request, pass an array of strings or array of token arrays. Each input must not exceed + /// the max input tokens for the model (8191 tokens for `text-embedding-ada-002`) and cannot be an + /// empty string. + /// [Example Python code](https://github.com/openai/openai-cookbook/blob/main/examples/How_to_count_tokens_with_tiktoken.ipynb) + /// for counting tokens. + /// + /// To assign an object to this property use . + /// + /// + /// To assign an already formatted json string to this property use . + /// + /// + /// + /// Supported types: + /// + /// + /// + /// + /// + /// where T is of type + /// + /// + /// where T is of type + /// + /// + /// where T is of type IList{long} + /// + /// + /// + /// Examples: + /// + /// + /// BinaryData.FromObjectAsJson("foo") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromString("\"foo\"") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromObjectAsJson(new { key = "value" }) + /// Creates a payload of { "key": "value" }. + /// + /// + /// BinaryData.FromString("{\"key\": \"value\"}") + /// Creates a payload of { "key": "value" }. + /// + /// + /// + /// + internal BinaryData Input { get; set; } + + // CUSTOM: + // - Made internal. The model is specified by the client. + // - Added setter. + /// + /// ID of the model to use. You can use the [List models](/docs/api-reference/models/list) API to + /// see all of your available models, or see our [Model overview](/docs/models/overview) for + /// descriptions of them. + /// + internal InternalCreateEmbeddingRequestModel Model { get; set; } + + // CUSTOM: Made internal. We always request the embedding as a base64-encoded string for better performance. + /// + /// The format to return the embeddings in. Can be either `float` or + /// [`base64`](https://pypi.org/project/pybase64/). + /// + internal InternalCreateEmbeddingRequestEncodingFormat? 
EncodingFormat { get; set; } + + // CUSTOM: Made public now that there are no required properties. + /// Initializes a new instance of . + public EmbeddingGenerationOptions() + { + } + + // CUSTOM: Renamed. + /// + /// A unique identifier representing your end-user, which can help OpenAI to monitor and detect abuse. + /// Learn more. + /// + [CodeGenMember("User")] + public string EndUserId { get; set; } +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Embeddings/EmbeddingTokenUsage.cs b/.dotnet/src/Custom/Embeddings/EmbeddingTokenUsage.cs new file mode 100644 index 000000000..54927ecd3 --- /dev/null +++ b/.dotnet/src/Custom/Embeddings/EmbeddingTokenUsage.cs @@ -0,0 +1,10 @@ +namespace OpenAI.Embeddings; + +[CodeGenModel("CreateEmbeddingResponseUsage")] +public partial class EmbeddingTokenUsage +{ + // CUSTOM: Renamed. + /// The number of tokens used by the input prompts. + [CodeGenMember("PromptTokens")] + public int InputTokens { get; } +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Embeddings/Internal/GeneratorStubs.cs b/.dotnet/src/Custom/Embeddings/Internal/GeneratorStubs.cs new file mode 100644 index 000000000..0b0502e87 --- /dev/null +++ b/.dotnet/src/Custom/Embeddings/Internal/GeneratorStubs.cs @@ -0,0 +1,15 @@ +namespace OpenAI.Embeddings; + +// CUSTOM: Made internal. 
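The `Input` remarks in the `EmbeddingGenerationOptions` hunk above enumerate the `BinaryData` payload shapes the internal property accepts. A minimal sketch of those shapes, assuming only the `System.Memory.Data` package that provides `BinaryData` (this illustration is not part of the diff):

```csharp
using System;

// A single string input serialized to the JSON payload "some text".
BinaryData single = BinaryData.FromObjectAsJson("some text");

// A batch of strings, serialized to ["a","b"].
BinaryData batch = BinaryData.FromObjectAsJson(new[] { "a", "b" });

// A single input given as pre-tokenized IDs, serialized to [1212,318].
BinaryData tokens = BinaryData.FromObjectAsJson(new long[] { 1212, 318 });

// Already-formatted JSON passed through verbatim.
BinaryData raw = BinaryData.FromString("\"pre-encoded json\"");
```

`FromObjectAsJson` runs the value through `System.Text.Json`, while `FromString` stores the text as-is, which is why the latter must already be valid JSON for the request body.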
+
+[CodeGenModel("CreateEmbeddingRequestEncodingFormat")]
+internal readonly partial struct InternalCreateEmbeddingRequestEncodingFormat { }
+
+[CodeGenModel("CreateEmbeddingRequestModel")]
+internal readonly partial struct InternalCreateEmbeddingRequestModel { }
+
+[CodeGenModel("CreateEmbeddingResponseObject")]
+internal readonly partial struct InternalCreateEmbeddingResponseObject { }
+
+[CodeGenModel("EmbeddingObject")]
+internal readonly partial struct InternalEmbeddingObject { }
\ No newline at end of file
diff --git a/.dotnet/src/Custom/Embeddings/OpenAIEmbeddingsModelFactory.cs b/.dotnet/src/Custom/Embeddings/OpenAIEmbeddingsModelFactory.cs
new file mode 100644
index 000000000..c2b94109c
--- /dev/null
+++ b/.dotnet/src/Custom/Embeddings/OpenAIEmbeddingsModelFactory.cs
@@ -0,0 +1,43 @@
+using System.Collections.Generic;
+using System.Linq;
+
+namespace OpenAI.Embeddings;
+
+/// <summary> Model factory for <see cref="OpenAI.Embeddings"/> models. </summary>
+public static partial class OpenAIEmbeddingsModelFactory
+{
+    /// <summary> Initializes a new instance of <see cref="OpenAI.Embeddings.Embedding"/>. </summary>
+    /// <returns> A new <see cref="OpenAI.Embeddings.Embedding"/> instance for mocking. </returns>
+    public static Embedding Embedding(int index = default, IEnumerable<float> vector = null)
+    {
+        vector ??= new List<float>();
+
+        return new Embedding(
+            index,
+            vector.ToArray());
+    }
+
+    /// <summary> Initializes a new instance of <see cref="OpenAI.Embeddings.EmbeddingCollection"/>. </summary>
+    /// <returns> A new <see cref="OpenAI.Embeddings.EmbeddingCollection"/> instance for mocking. </returns>
+    public static EmbeddingCollection EmbeddingCollection(IEnumerable<Embedding> items = null, string model = null, EmbeddingTokenUsage usage = null)
+    {
+        items ??= new List<Embedding>();
+
+        return new EmbeddingCollection(
+            items.ToList(),
+            model,
+            InternalCreateEmbeddingResponseObject.List,
+            usage,
+            serializedAdditionalRawData: null);
+    }
+
+    /// <summary> Initializes a new instance of <see cref="OpenAI.Embeddings.EmbeddingTokenUsage"/>. </summary>
+    /// <returns> A new <see cref="OpenAI.Embeddings.EmbeddingTokenUsage"/> instance for mocking. </returns>
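The model factory above exists so tests can fabricate read-only response models without calling the service. A sketch of how it might be used, assuming the `OpenAI` 2.x package (the model name and vector values here are arbitrary placeholders):

```csharp
using OpenAI.Embeddings;

// Fabricate a single embedding and wrap it in a collection, as a unit test
// double for a service response. No network call is made.
Embedding fake = OpenAIEmbeddingsModelFactory.Embedding(
    index: 0,
    vector: new float[] { 0.1f, 0.2f, 0.3f });

EmbeddingCollection page = OpenAIEmbeddingsModelFactory.EmbeddingCollection(
    items: new[] { fake },
    model: "text-embedding-3-small",
    usage: OpenAIEmbeddingsModelFactory.EmbeddingTokenUsage(inputTokens: 3, totalTokens: 3));
```

Because the generated model constructors are internal, these factory methods are the supported way to construct response types in mocks.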
+ public static EmbeddingTokenUsage EmbeddingTokenUsage(int inputTokens = default, int totalTokens = default) + { + return new EmbeddingTokenUsage( + inputTokens, + totalTokens, + serializedAdditionalRawData: null); + } +} diff --git a/.dotnet/src/Custom/Files/FileClient.Protocol.cs b/.dotnet/src/Custom/Files/FileClient.Protocol.cs new file mode 100644 index 000000000..f9784bfec --- /dev/null +++ b/.dotnet/src/Custom/Files/FileClient.Protocol.cs @@ -0,0 +1,212 @@ +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.ComponentModel; +using System.Threading.Tasks; + +namespace OpenAI.Files; + +[CodeGenSuppress("CreateFileAsync", typeof(BinaryContent), typeof(string), typeof(RequestOptions))] +[CodeGenSuppress("CreateFile", typeof(BinaryContent), typeof(string), typeof(RequestOptions))] +[CodeGenSuppress("GetFilesAsync", typeof(string), typeof(RequestOptions))] +[CodeGenSuppress("GetFiles", typeof(string), typeof(RequestOptions))] +[CodeGenSuppress("RetrieveFileAsync", typeof(string), typeof(RequestOptions))] +[CodeGenSuppress("RetrieveFile", typeof(string), typeof(RequestOptions))] +[CodeGenSuppress("DeleteFileAsync", typeof(string), typeof(RequestOptions))] +[CodeGenSuppress("DeleteFile", typeof(string), typeof(RequestOptions))] +[CodeGenSuppress("DownloadFileAsync", typeof(string), typeof(RequestOptions))] +[CodeGenSuppress("DownloadFile", typeof(string), typeof(RequestOptions))] +public partial class FileClient +{ + /// + /// [Protocol Method] Upload a file that can be used across various endpoints. The size of all the files uploaded by + /// one organization can be up to 100 GB. + /// + /// The size of individual files can be a maximum of 512 MB or 2 million tokens for Assistants. See + /// the Assistants Tools guide to + /// learn more about the types of files supported. The Fine-tuning API only supports `.jsonl` files. + /// + /// Please contact us if you need to increase these + /// storage limits. 
+ /// + /// The content to send as the body of the request. + /// The content type of the request. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// or is null. + /// is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// The response returned from the service. + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual async Task UploadFileAsync(BinaryContent content, string contentType, RequestOptions options = null) + { + Argument.AssertNotNull(content, nameof(content)); + Argument.AssertNotNullOrEmpty(contentType, nameof(contentType)); + + using PipelineMessage message = CreateCreateFileRequest(content, contentType, options); + return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + /// + /// [Protocol Method] Upload a file that can be used across various endpoints. The size of all the files uploaded by + /// one organization can be up to 100 GB. + /// + /// The size of individual files can be a maximum of 512 MB or 2 million tokens for Assistants. See + /// the Assistants Tools guide to + /// learn more about the types of files supported. The Fine-tuning API only supports `.jsonl` files. + /// + /// Please contact us if you need to increase these + /// storage limits. + /// + /// The content to send as the body of the request. + /// The content type of the request. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// or is null. + /// is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// The response returned from the service. 
+ [EditorBrowsable(EditorBrowsableState.Never)] + public virtual ClientResult UploadFile(BinaryContent content, string contentType, RequestOptions options = null) + { + Argument.AssertNotNull(content, nameof(content)); + Argument.AssertNotNullOrEmpty(contentType, nameof(contentType)); + + using PipelineMessage message = CreateCreateFileRequest(content, contentType, options); + return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options)); + } + + /// + /// [Protocol Method] Retrieves a list of files that belong to the user's organization. + /// + /// Only return files with the given purpose. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// Service returned a non-success status code. + /// The response returned from the service. + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual async Task GetFilesAsync(string purpose, RequestOptions options) + { + using PipelineMessage message = CreateGetFilesRequest(purpose, options); + return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + /// + /// [Protocol Method] Retrieves a list of files that belong to the user's organization. + /// + /// Only return files with the given purpose. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// Service returned a non-success status code. + /// The response returned from the service. + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual ClientResult GetFiles(string purpose, RequestOptions options) + { + using PipelineMessage message = CreateGetFilesRequest(purpose, options); + return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options)); + } + + /// + /// [Protocol Method] Retrieves information about a specified file. + /// + /// The ID of the file to retrieve. 
+ /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// is null. + /// is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// The response returned from the service. + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual async Task GetFileAsync(string fileId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(fileId, nameof(fileId)); + + using PipelineMessage message = CreateRetrieveFileRequest(fileId, options); + return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + /// + /// [Protocol Method] Retrieves information about a specified file. + /// + /// The ID of the file to retrieve. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// is null. + /// is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// The response returned from the service. + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual ClientResult GetFile(string fileId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(fileId, nameof(fileId)); + + using PipelineMessage message = CreateRetrieveFileRequest(fileId, options); + return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options)); + } + + /// + /// [Protocol Method] Deletes a previously uploaded file. + /// + /// The ID of the file to delete. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// is null. + /// is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// The response returned from the service. 
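The protocol methods in this hunk return a raw `ClientResult` rather than a strongly typed value, leaving JSON handling to the caller. A hedged sketch of that pattern, assuming the `OpenAI` 2.x package; the file ID and environment variable name are placeholders:

```csharp
using System;
using System.ClientModel;
using System.Text.Json;
using OpenAI.Files;

FileClient client = new(
    new ApiKeyCredential(Environment.GetEnvironmentVariable("OPENAI_API_KEY")));

// Call the protocol overload (the RequestOptions parameter selects it over the
// convenience overload) and read the raw response body as JSON.
ClientResult result = client.GetFile("file-abc123", (RequestOptions)null);
using JsonDocument json = JsonDocument.Parse(result.GetRawResponse().Content.ToString());
Console.WriteLine(json.RootElement.GetProperty("filename").GetString());
```

This is the escape hatch for request or response fields the typed surface does not yet expose.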
+ [EditorBrowsable(EditorBrowsableState.Never)] + public virtual async Task DeleteFileAsync(string fileId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(fileId, nameof(fileId)); + + using PipelineMessage message = CreateDeleteFileRequest(fileId, options); + return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + /// + /// [Protocol Method] Deletes a previously uploaded file. + /// + /// The ID of the file to delete. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// is null. + /// is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// The response returned from the service. + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual ClientResult DeleteFile(string fileId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(fileId, nameof(fileId)); + + using PipelineMessage message = CreateDeleteFileRequest(fileId, options); + return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options)); + } + + /// + /// [Protocol Method] Downloads the binary content of the specified file. + /// + /// The ID of the file to download. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// is null. + /// is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// The response returned from the service. 
+ [EditorBrowsable(EditorBrowsableState.Never)] + public virtual async Task DownloadFileAsync(string fileId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(fileId, nameof(fileId)); + + using PipelineMessage message = CreateDownloadFileRequest(fileId, options); + return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + /// + /// [Protocol Method] Downloads the binary content of the specified file. + /// + /// The ID of the file to download. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// is null. + /// is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// The response returned from the service. + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual ClientResult DownloadFile(string fileId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(fileId, nameof(fileId)); + + using PipelineMessage message = CreateDownloadFileRequest(fileId, options); + return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options)); + } +} diff --git a/.dotnet/src/Custom/Files/FileClient.cs b/.dotnet/src/Custom/Files/FileClient.cs new file mode 100644 index 000000000..1905aa714 --- /dev/null +++ b/.dotnet/src/Custom/Files/FileClient.cs @@ -0,0 +1,299 @@ +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.IO; +using System.Threading; +using System.Threading.Tasks; + +namespace OpenAI.Files; + +// CUSTOM: +// - Renamed. +// - Suppressed constructor that takes endpoint parameter; endpoint is now a property in the options class. +/// The service client for OpenAI file operations. 
+[CodeGenClient("Files")] +[CodeGenSuppress("FileClient", typeof(ClientPipeline), typeof(ApiKeyCredential), typeof(Uri))] +[CodeGenSuppress("CreateFileAsync", typeof(InternalFileUploadOptions))] +[CodeGenSuppress("CreateFile", typeof(InternalFileUploadOptions))] +[CodeGenSuppress("GetFilesAsync", typeof(string))] +[CodeGenSuppress("GetFiles", typeof(string))] +[CodeGenSuppress("RetrieveFileAsync", typeof(string))] +[CodeGenSuppress("RetrieveFile", typeof(string))] +[CodeGenSuppress("DeleteFileAsync", typeof(string))] +[CodeGenSuppress("DeleteFile", typeof(string))] +[CodeGenSuppress("DownloadFileAsync", typeof(string))] +[CodeGenSuppress("DownloadFile", typeof(string))] +public partial class FileClient +{ + private InternalUploadsClient _internalUploadsClient; + + // CUSTOM: + // - Used a custom pipeline. + // - Demoted the endpoint parameter to be a property in the options class. + /// Initializes a new instance of . + /// The API key to authenticate with the service. + /// is null. + public FileClient(ApiKeyCredential credential) : this(credential, new OpenAIClientOptions()) + { + } + + // CUSTOM: + // - Used a custom pipeline. + // - Demoted the endpoint parameter to be a property in the options class. + /// Initializes a new instance of . + /// The API key to authenticate with the service. + /// The options to configure the client. + /// is null. + public FileClient(ApiKeyCredential credential, OpenAIClientOptions options) + { + Argument.AssertNotNull(credential, nameof(credential)); + options ??= new OpenAIClientOptions(); + + _pipeline = OpenAIClient.CreatePipeline(credential, options); + _endpoint = OpenAIClient.GetEndpoint(options); + } + + // CUSTOM: + // - Used a custom pipeline. + // - Demoted the endpoint parameter to be a property in the options class. + // - Made protected. + /// Initializes a new instance of . + /// The HTTP pipeline to send and receive REST requests and responses. + /// The options to configure the client. + /// is null. 
+ protected internal FileClient(ClientPipeline pipeline, OpenAIClientOptions options) + { + Argument.AssertNotNull(pipeline, nameof(pipeline)); + options ??= new OpenAIClientOptions(); + + _pipeline = pipeline; + _endpoint = OpenAIClient.GetEndpoint(options); + _internalUploadsClient = new(pipeline, options); + } + + /// Uploads a file that can be used across various operations. + /// Individual files can be up to 512 MB, and the size of all files uploaded by one organization can be up to 100 GB. + /// The file stream to upload. + /// + /// The filename associated with the file stream. The filename's extension (for example: .json) will be used to + /// validate the file format. The request may fail if the filename's extension and the actual file format do + /// not match. + /// + /// The intended purpose of the uploaded file. + /// A token that can be used to cancel this method call. + /// or is null. + /// is an empty string, and was expected to be non-empty. + public virtual async Task> UploadFileAsync(Stream file, string filename, FileUploadPurpose purpose, CancellationToken cancellationToken = default) + { + Argument.AssertNotNull(file, nameof(file)); + Argument.AssertNotNullOrEmpty(filename, nameof(filename)); + + InternalFileUploadOptions options = new() + { + Purpose = purpose + }; + + using MultipartFormDataBinaryContent content = options.ToMultipartContent(file, filename); + ClientResult result = await UploadFileAsync(content, content.ContentType, cancellationToken.ToRequestOptions()).ConfigureAwait(false); + return ClientResult.FromValue(OpenAIFileInfo.FromResponse(result.GetRawResponse()), result.GetRawResponse()); + } + + /// Uploads a file that can be used across various operations. + /// Individual files can be up to 512 MB, and the size of all files uploaded by one organization can be up to 100 GB. + /// The file stream to upload. + /// + /// The filename associated with the file stream. 
The filename's extension (for example: .json) will be used to + /// validate the file format. The request may fail if the filename's extension and the actual file format do + /// not match. + /// + /// The intended purpose of the uploaded file. + /// A token that can be used to cancel this method call. + /// or is null. + /// is an empty string, and was expected to be non-empty. + public virtual ClientResult UploadFile(Stream file, string filename, FileUploadPurpose purpose, CancellationToken cancellationToken = default) + { + Argument.AssertNotNull(file, nameof(file)); + Argument.AssertNotNullOrEmpty(filename, nameof(filename)); + + InternalFileUploadOptions options = new() + { + Purpose = purpose + }; + + using MultipartFormDataBinaryContent content = options.ToMultipartContent(file, filename); + ClientResult result = UploadFile(content, content.ContentType, cancellationToken.ToRequestOptions()); + return ClientResult.FromValue(OpenAIFileInfo.FromResponse(result.GetRawResponse()), result.GetRawResponse()); + } + + /// Uploads a file that can be used across various operations. + /// Individual files can be up to 512 MB, and the size of all files uploaded by one organization can be up to 100 GB. + /// The file bytes to upload. + /// + /// The filename associated with the file bytes. The filename's extension (for example: .json) will be used to + /// validate the file format. The request may fail if the filename's extension and the actual file format do + /// not match. + /// + /// The intended purpose of the uploaded file. + /// or is null. + /// is an empty string, and was expected to be non-empty. + public virtual Task> UploadFileAsync(BinaryData file, string filename, FileUploadPurpose purpose) + { + Argument.AssertNotNull(file, nameof(file)); + Argument.AssertNotNullOrEmpty(filename, nameof(filename)); + + return UploadFileAsync(file?.ToStream(), filename, purpose); + } + + /// Uploads a file that can be used across various operations. 
+ /// Individual files can be up to 512 MB, and the size of all files uploaded by one organization can be up to 100 GB. + /// The file bytes to upload. + /// + /// The filename associated with the file bytes. The filename's extension (for example: .json) will be used to + /// validate the file format. The request may fail if the filename's extension and the actual file format do + /// not match. + /// + /// The intended purpose of the uploaded file. + /// or is null. + /// is an empty string, and was expected to be non-empty. + public virtual ClientResult UploadFile(BinaryData file, string filename, FileUploadPurpose purpose) + { + Argument.AssertNotNull(file, nameof(file)); + Argument.AssertNotNullOrEmpty(filename, nameof(filename)); + + return UploadFile(file?.ToStream(), filename, purpose); + } + + /// Uploads a file that can be used across various operations. + /// Individual files can be up to 512 MB, and the size of all files uploaded by one organization can be up to 100 GB. + /// + /// The path of the file to upload. The provided file path's extension (for example: .json) will be used to + /// validate the file format. The request may fail if the file path's extension and the actual file format do + /// not match. + /// + /// The intended purpose of the uploaded file. + /// was null. + /// is an empty string, and was expected to be non-empty. + public virtual async Task> UploadFileAsync(string filePath, FileUploadPurpose purpose) + { + Argument.AssertNotNullOrEmpty(filePath, nameof(filePath)); + + using FileStream stream = File.OpenRead(filePath); + return await UploadFileAsync(stream, filePath, purpose).ConfigureAwait(false); + } + + /// Uploads a file that can be used across various operations. + /// Individual files can be up to 512 MB, and the size of all files uploaded by one organization can be up to 100 GB. + /// + /// The path of the file to upload. 
The provided file path's extension (for example: .json) will be used to + /// validate the file format. The request may fail if the file path's extension and the actual file format do + /// not match. + /// + /// The intended purpose of the uploaded file. + /// was null. + /// is an empty string, and was expected to be non-empty. + public virtual ClientResult UploadFile(string filePath, FileUploadPurpose purpose) + { + Argument.AssertNotNullOrEmpty(filePath, nameof(filePath)); + + using FileStream stream = File.OpenRead(filePath); + return UploadFile(stream, filePath, purpose); + } + + /// Gets basic information about each of the files belonging to the user's organization. + /// Only return files with the given purpose. + /// A token that can be used to cancel this method call. + public virtual async Task> GetFilesAsync(OpenAIFilePurpose? purpose = null, CancellationToken cancellationToken = default) + { + ClientResult result = await GetFilesAsync(purpose?.ToString(), cancellationToken.ToRequestOptions()).ConfigureAwait(false); + return ClientResult.FromValue(OpenAIFileInfoCollection.FromResponse(result.GetRawResponse()), result.GetRawResponse()); + } + + /// Gets basic information about each of the files belonging to the user's organization. + /// Only return files with the given purpose. + /// A token that can be used to cancel this method call. + public virtual ClientResult GetFiles(OpenAIFilePurpose? purpose = null, CancellationToken cancellationToken = default) + { + ClientResult result = GetFiles(purpose?.ToString(), cancellationToken.ToRequestOptions()); + return ClientResult.FromValue(OpenAIFileInfoCollection.FromResponse(result.GetRawResponse()), result.GetRawResponse()); + } + + /// Gets basic information about the specified file. + /// The ID of the desired file. + /// A token that can be used to cancel this method call. + /// is null. + /// is an empty string, and was expected to be non-empty. 
+ public virtual async Task> GetFileAsync(string fileId, CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(fileId, nameof(fileId)); + + ClientResult result = await GetFileAsync(fileId, cancellationToken.ToRequestOptions()).ConfigureAwait(false); + return ClientResult.FromValue(OpenAIFileInfo.FromResponse(result.GetRawResponse()), result.GetRawResponse()); + } + + /// Gets basic information about the specified file. + /// The ID of the desired file. + /// A token that can be used to cancel this method call. + /// is null. + /// is an empty string, and was expected to be non-empty. + public virtual ClientResult GetFile(string fileId, CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(fileId, nameof(fileId)); + + ClientResult result = GetFile(fileId, cancellationToken.ToRequestOptions()); + return ClientResult.FromValue(OpenAIFileInfo.FromResponse(result.GetRawResponse()), result.GetRawResponse()); + } + + /// Deletes the specified file. + /// The ID of the file to delete. + /// A token that can be used to cancel this method call. + /// is null. + /// is an empty string, and was expected to be non-empty. + public virtual async Task> DeleteFileAsync(string fileId, CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(fileId, nameof(fileId)); + + ClientResult result = await DeleteFileAsync(fileId, cancellationToken.ToRequestOptions()).ConfigureAwait(false); + InternalDeleteFileResponse internalDeletion = InternalDeleteFileResponse.FromResponse(result.GetRawResponse()); + return ClientResult.FromValue(internalDeletion.Deleted, result.GetRawResponse()); + } + + /// Deletes the specified file. + /// The ID of the file to delete. + /// A token that can be used to cancel this method call. + /// is null. + /// is an empty string, and was expected to be non-empty. 
+ public virtual ClientResult DeleteFile(string fileId, CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(fileId, nameof(fileId)); + + ClientResult result = DeleteFile(fileId, cancellationToken.ToRequestOptions()); + InternalDeleteFileResponse internalDeletion = InternalDeleteFileResponse.FromResponse(result.GetRawResponse()); + return ClientResult.FromValue(internalDeletion.Deleted, result.GetRawResponse()); + } + + /// Downloads the content of the specified file. + /// The ID of the file to download. + /// A token that can be used to cancel this method call. + /// is null. + /// is an empty string, and was expected to be non-empty. + public virtual async Task> DownloadFileAsync(string fileId, CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(fileId, nameof(fileId)); + + ClientResult result = await DownloadFileAsync(fileId, cancellationToken.ToRequestOptions()).ConfigureAwait(false); + return ClientResult.FromValue(result.GetRawResponse().Content, result.GetRawResponse()); + } + + /// Downloads the content of the specified file. + /// The ID of the file to download. + /// A token that can be used to cancel this method call. + /// is null. + /// is an empty string, and was expected to be non-empty. 
+ public virtual ClientResult DownloadFile(string fileId, CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(fileId, nameof(fileId)); + + ClientResult result = DownloadFile(fileId, cancellationToken.ToRequestOptions()); + return ClientResult.FromValue(result.GetRawResponse().Content, result.GetRawResponse()); + } +} diff --git a/.dotnet/src/Custom/Files/FileUploadPurpose.cs b/.dotnet/src/Custom/Files/FileUploadPurpose.cs new file mode 100644 index 000000000..2f3e9d8ca --- /dev/null +++ b/.dotnet/src/Custom/Files/FileUploadPurpose.cs @@ -0,0 +1,6 @@ +namespace OpenAI.Files; + +[CodeGenModel("CreateFileRequestPurpose")] +public readonly partial struct FileUploadPurpose +{ +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Files/Internal/GeneratorStubs.cs b/.dotnet/src/Custom/Files/Internal/GeneratorStubs.cs new file mode 100644 index 000000000..e21e8755d --- /dev/null +++ b/.dotnet/src/Custom/Files/Internal/GeneratorStubs.cs @@ -0,0 +1,23 @@ +namespace OpenAI.Files; + +[CodeGenModel("DeleteFileResponse")] +internal partial class InternalDeleteFileResponse { } + +[CodeGenModel("DeleteFileResponseObject")] +internal readonly partial struct InternalDeleteFileResponseObject { } + +[CodeGenModel("ListFilesResponseObject")] +internal readonly partial struct InternalListFilesResponseObject { } + +[CodeGenModel("OpenAIFileObject")] +internal readonly partial struct InternalOpenAIFileObject { } + +[CodeGenModel("AddUploadPartRequest")] internal partial class InternalAddUploadPartRequest { } +[CodeGenModel("CompleteUploadRequest")] internal partial class InternalCompleteUploadRequest { } +[CodeGenModel("CreateUploadRequest")] internal partial class InternalCreateUploadRequest { } +[CodeGenModel("CreateUploadRequestPurpose")] internal readonly partial struct InternalCreateUploadRequestPurpose { } +[CodeGenModel("Upload")] internal partial class InternalUpload { } +[CodeGenModel("UploadObject")] internal readonly partial struct 
InternalUploadObject { } +[CodeGenModel("UploadPart")] internal partial class InternalUploadPart { } +[CodeGenModel("UploadPartObject")] internal readonly partial struct InternalUploadPartObject { } +[CodeGenModel("UploadStatus")] internal readonly partial struct InternalUploadStatus { } diff --git a/.dotnet/src/Custom/Files/Internal/InternalFileUploadOptions.cs b/.dotnet/src/Custom/Files/Internal/InternalFileUploadOptions.cs new file mode 100644 index 000000000..feec447c3 --- /dev/null +++ b/.dotnet/src/Custom/Files/Internal/InternalFileUploadOptions.cs @@ -0,0 +1,42 @@ +using System.IO; + +namespace OpenAI.Files; + +[CodeGenModel("CreateFileRequest")] +[CodeGenSuppress("InternalFileUploadOptions", typeof(Stream), typeof(FileUploadPurpose))] +internal partial class InternalFileUploadOptions +{ + // CUSTOM: + // - Made internal. This value comes from a parameter on the client method. + // - Added setter. + /// The file object (not file name) to be uploaded. + internal Stream File { get; set; } + + // CUSTOM: + // - Made internal. This value comes from a parameter on the client method. + // - Added setter. + /// + /// The intended purpose of the uploaded file. Use "fine-tune" for + /// [Fine-tuning](/docs/api-reference/fine-tuning) and "assistants" for + /// [Assistants](/docs/api-reference/assistants) and [Messages](/docs/api-reference/messages). This + /// allows us to validate the format of the uploaded file is correct for fine-tuning. + /// + internal FileUploadPurpose Purpose { get; set; } + + // CUSTOM: Made public now that there are no required properties. + /// Initializes a new instance of for deserialization. 
+ public InternalFileUploadOptions() + { + } + + internal MultipartFormDataBinaryContent ToMultipartContent(Stream file, string filename) + { + MultipartFormDataBinaryContent content = new(); + + content.Add(file, "file", filename); + + content.Add(Purpose.ToString(), "purpose"); + + return content; + } +} diff --git a/.dotnet/src/Custom/Files/Internal/InternalUploadsClient.cs b/.dotnet/src/Custom/Files/Internal/InternalUploadsClient.cs new file mode 100644 index 000000000..0e9b3915b --- /dev/null +++ b/.dotnet/src/Custom/Files/Internal/InternalUploadsClient.cs @@ -0,0 +1,53 @@ +using System; +using System.ClientModel; +using System.ClientModel.Primitives; + +namespace OpenAI.Files; + +[CodeGenClient("Uploads")] +[CodeGenSuppress("InternalUploadsClient", typeof(ClientPipeline), typeof(ApiKeyCredential), typeof(Uri))] +internal partial class InternalUploadsClient +{ + // CUSTOM: + // - Used a custom pipeline. + // - Demoted the endpoint parameter to be a property in the options class. + /// Initializes a new instance of . + /// The API key to authenticate with the service. + /// is null. + internal InternalUploadsClient(ApiKeyCredential credential) : this(credential, new OpenAIClientOptions()) + { + } + + // CUSTOM: + // - Used a custom pipeline. + // - Demoted the endpoint parameter to be a property in the options class. + /// Initializes a new instance of . + /// The API key to authenticate with the service. + /// The options to configure the client. + /// is null. + internal InternalUploadsClient(ApiKeyCredential credential, OpenAIClientOptions options) + { + Argument.AssertNotNull(credential, nameof(credential)); + options ??= new OpenAIClientOptions(); + + _pipeline = OpenAIClient.CreatePipeline(credential, options); + _endpoint = OpenAIClient.GetEndpoint(options); + } + + // CUSTOM: + // - Used a custom pipeline. + // - Demoted the endpoint parameter to be a property in the options class. + // - Made protected. + /// Initializes a new instance of . 
+ /// The HTTP pipeline to send and receive REST requests and responses. + /// The options to configure the client. + /// is null. + protected internal InternalUploadsClient(ClientPipeline pipeline, OpenAIClientOptions options) + { + Argument.AssertNotNull(pipeline, nameof(pipeline)); + options ??= new OpenAIClientOptions(); + + _pipeline = pipeline; + _endpoint = OpenAIClient.GetEndpoint(options); + } +} diff --git a/.dotnet/src/Custom/Files/OpenAIFileInfo.cs b/.dotnet/src/Custom/Files/OpenAIFileInfo.cs new file mode 100644 index 000000000..17f0d4796 --- /dev/null +++ b/.dotnet/src/Custom/Files/OpenAIFileInfo.cs @@ -0,0 +1,14 @@ +namespace OpenAI.Files; + +[CodeGenModel("OpenAIFile")] +public partial class OpenAIFileInfo +{ + // CUSTOM: Made private. This property does not add value in the context of a strongly-typed class. + /// The object type, which is always "file". + private InternalOpenAIFileObject Object { get; } = InternalOpenAIFileObject.File; + + // CUSTOM: Renamed. + /// The size of the file, in bytes. + [CodeGenMember("Bytes")] + public int? SizeInBytes { get; } +} diff --git a/.dotnet/src/Custom/Files/OpenAIFileInfoCollection.Serialization.cs b/.dotnet/src/Custom/Files/OpenAIFileInfoCollection.Serialization.cs new file mode 100644 index 000000000..751d8be5a --- /dev/null +++ b/.dotnet/src/Custom/Files/OpenAIFileInfoCollection.Serialization.cs @@ -0,0 +1,72 @@ +using System; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Files; + +[CodeGenSuppress("global::System.ClientModel.Primitives.IJsonModel.Write", typeof(Utf8JsonWriter), typeof(ModelReaderWriterOptions))] +public partial class OpenAIFileInfoCollection : IJsonModel +{ + // CUSTOM: + // - Serialized the Items property. + // - Recovered the serialization of SerializedAdditionalRawData. See https://github.com/Azure/autorest.csharp/issues/4636. 
+ void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + => CustomSerializationHelpers.SerializeInstance(this, SerializeOpenAIFileInfoCollection, writer, options); + + internal static void SerializeOpenAIFileInfoCollection(OpenAIFileInfoCollection instance, Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + writer.WriteStartObject(); + writer.WritePropertyName("data"u8); + writer.WriteStartArray(); + foreach (var item in instance.Items) + { + writer.WriteObjectValue(item, options); + } + writer.WriteEndArray(); + writer.WritePropertyName("object"u8); + writer.WriteStringValue(instance.Object.ToString()); + writer.WriteSerializedAdditionalRawData(instance.SerializedAdditionalRawData, options); + writer.WriteEndObject(); + } + + // CUSTOM: Recovered the deserialization of SerializedAdditionalRawData. See https://github.com/Azure/autorest.csharp/issues/4636. + internal static OpenAIFileInfoCollection DeserializeOpenAIFileInfoCollection(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + IReadOnlyList data = default; + InternalListFilesResponseObject @object = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("data"u8)) + { + List array = new List(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(OpenAIFileInfo.DeserializeOpenAIFileInfo(item, options)); + } + data = array; + continue; + } + if (property.NameEquals("object"u8)) + { + @object = new InternalListFilesResponseObject(property.Value.GetString()); + continue; + } + if (true) + { + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new 
OpenAIFileInfoCollection(data, @object, serializedAdditionalRawData); + } + +} diff --git a/.dotnet/src/Custom/Files/OpenAIFileInfoCollection.cs b/.dotnet/src/Custom/Files/OpenAIFileInfoCollection.cs new file mode 100644 index 000000000..9406205b1 --- /dev/null +++ b/.dotnet/src/Custom/Files/OpenAIFileInfoCollection.cs @@ -0,0 +1,75 @@ +using System; +using System.Collections.Generic; +using System.Collections.ObjectModel; + +namespace OpenAI.Files; + +[CodeGenModel("ListFilesResponse")] +[CodeGenSuppress("Data")] +[CodeGenSuppress(nameof(OpenAIFileInfoCollection))] +[CodeGenSuppress(nameof(OpenAIFileInfoCollection), typeof(IReadOnlyList), typeof(InternalListFilesResponseObject))] +public partial class OpenAIFileInfoCollection : ReadOnlyCollection +{ + // CUSTOM: Made private. This property does not add value in the context of a strongly-typed class. + /// Gets the object. + private InternalListFilesResponseObject Object { get; } = InternalListFilesResponseObject.List; + + // CUSTOM: Recovered this field. See https://github.com/Azure/autorest.csharp/issues/4636. + /// + /// Keeps track of any properties unknown to the library. + /// + /// To assign an object to the value of this property use . + /// + /// + /// To assign an already formatted json string to this property use . + /// + /// + /// Examples: + /// + /// + /// BinaryData.FromObjectAsJson("foo") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromString("\"foo\"") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromObjectAsJson(new { key = "value" }) + /// Creates a payload of { "key": "value" }. + /// + /// + /// BinaryData.FromString("{\"key\": \"value\"}") + /// Creates a payload of { "key": "value" }. + /// + /// + /// + /// + private IDictionary SerializedAdditionalRawData; + + /// Initializes a new instance of . + /// + /// is null. + internal OpenAIFileInfoCollection(IEnumerable data) + : base([.. 
data]) + { + Argument.AssertNotNull(data, nameof(data)); + } + + /// Initializes a new instance of . + /// + /// + /// Keeps track of any properties unknown to the library. + internal OpenAIFileInfoCollection(IReadOnlyList data, InternalListFilesResponseObject @object, IDictionary serializedAdditionalRawData) + : base([.. data]) + { + Object = @object; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + /// Initializes a new instance of for deserialization. + internal OpenAIFileInfoCollection() + : base([]) + { + } +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Files/OpenAIFilePurpose.cs b/.dotnet/src/Custom/Files/OpenAIFilePurpose.cs new file mode 100644 index 000000000..2d659b99d --- /dev/null +++ b/.dotnet/src/Custom/Files/OpenAIFilePurpose.cs @@ -0,0 +1,6 @@ +namespace OpenAI.Files; + +[CodeGenModel("OpenAIFilePurpose")] +public readonly partial struct OpenAIFilePurpose +{ +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Files/OpenAIFileStatus.cs b/.dotnet/src/Custom/Files/OpenAIFileStatus.cs new file mode 100644 index 000000000..b321e5e09 --- /dev/null +++ b/.dotnet/src/Custom/Files/OpenAIFileStatus.cs @@ -0,0 +1,6 @@ +namespace OpenAI.Files; + +[CodeGenModel("OpenAIFileStatus")] +public readonly partial struct OpenAIFileStatus +{ +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Files/OpenAIFilesModelFactory.cs b/.dotnet/src/Custom/Files/OpenAIFilesModelFactory.cs new file mode 100644 index 000000000..41c130905 --- /dev/null +++ b/.dotnet/src/Custom/Files/OpenAIFilesModelFactory.cs @@ -0,0 +1,37 @@ +using System; +using System.Collections.Generic; +using System.Linq; + +namespace OpenAI.Files; + +/// Model factory for models. +public static partial class OpenAIFilesModelFactory +{ + /// Initializes a new instance of . + /// A new instance for mocking. + public static OpenAIFileInfo OpenAIFileInfo(string id = null, int? 
sizeInBytes = null, DateTimeOffset createdAt = default, string filename = null, OpenAIFilePurpose purpose = default, OpenAIFileStatus status = default, string statusDetails = null) + { + return new OpenAIFileInfo( + id, + sizeInBytes, + createdAt, + filename, + @object: InternalOpenAIFileObject.File, + purpose, + status, + statusDetails, + serializedAdditionalRawData: null); + } + + /// Initializes a new instance of . + /// A new instance for mocking. + public static OpenAIFileInfoCollection OpenAIFileInfoCollection(IEnumerable items = null) + { + items ??= new List(); + + return new OpenAIFileInfoCollection( + items.ToList(), + InternalListFilesResponseObject.List, + serializedAdditionalRawData: null); + } +} diff --git a/.dotnet/src/Custom/FineTuning/FineTuningClient.Protocol.cs b/.dotnet/src/Custom/FineTuning/FineTuningClient.Protocol.cs new file mode 100644 index 000000000..7a304e896 --- /dev/null +++ b/.dotnet/src/Custom/FineTuning/FineTuningClient.Protocol.cs @@ -0,0 +1,305 @@ +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Linq; +using System.Text.Json; +using System.Threading.Tasks; + +namespace OpenAI.FineTuning; + +[CodeGenSuppress("CreateFineTuningJobAsync", typeof(BinaryContent), typeof(RequestOptions))] +[CodeGenSuppress("CreateFineTuningJob", typeof(BinaryContent), typeof(RequestOptions))] +[CodeGenSuppress("GetPaginatedFineTuningJobsAsync", typeof(string), typeof(int?), typeof(RequestOptions))] +[CodeGenSuppress("GetPaginatedFineTuningJobs", typeof(string), typeof(int?), typeof(RequestOptions))] +[CodeGenSuppress("RetrieveFineTuningJobAsync", typeof(string), typeof(RequestOptions))] +[CodeGenSuppress("RetrieveFineTuningJob", typeof(string), typeof(RequestOptions))] +[CodeGenSuppress("CancelFineTuningJobAsync", typeof(string), typeof(RequestOptions))] +[CodeGenSuppress("CancelFineTuningJob", typeof(string), typeof(RequestOptions))] 
+[CodeGenSuppress("GetFineTuningEventsAsync", typeof(string), typeof(string), typeof(int?), typeof(RequestOptions))] +[CodeGenSuppress("GetFineTuningEvents", typeof(string), typeof(string), typeof(int?), typeof(RequestOptions))] +[CodeGenSuppress("GetFineTuningJobCheckpointsAsync", typeof(string), typeof(string), typeof(int?), typeof(RequestOptions))] +[CodeGenSuppress("GetFineTuningJobCheckpoints", typeof(string), typeof(string), typeof(int?), typeof(RequestOptions))] +public partial class FineTuningClient +{ + // CUSTOM: + // - Renamed. + // - Edited doc comment. + /// + /// [Protocol Method] Creates a fine-tuning job which begins the process of creating a new model from a given dataset. + /// + /// Response includes details of the enqueued job including job status and the name of the fine-tuned models once complete. + /// + /// [Learn more about fine-tuning](/docs/guides/fine-tuning) + /// + /// The content to send as the body of the request. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// is null. + /// Service returned a non-success status code. + /// The response returned from the service. + public virtual async Task CreateJobAsync(BinaryContent content, RequestOptions options = null) + { + Argument.AssertNotNull(content, nameof(content)); + + using PipelineMessage message = CreateCreateFineTuningJobRequest(content, options); + return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + // CUSTOM: + // - Renamed. + // - Edited doc comment. + /// + /// [Protocol Method] Creates a fine-tuning job which begins the process of creating a new model from a given dataset. + /// + /// Response includes details of the enqueued job including job status and the name of the fine-tuned models once complete. + /// + /// [Learn more about fine-tuning](/docs/guides/fine-tuning) + /// + /// The content to send as the body of the request. 
+ /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// is null. + /// Service returned a non-success status code. + /// The response returned from the service. + public virtual ClientResult CreateJob(BinaryContent content, RequestOptions options = null) + { + Argument.AssertNotNull(content, nameof(content)); + + using PipelineMessage message = CreateCreateFineTuningJobRequest(content, options); + return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options)); + } + + // CUSTOM: + // - Renamed. + // - Edited doc comment. + /// + /// [Protocol Method] List your organization's fine-tuning jobs + /// + /// Identifier for the last job from the previous pagination request. + /// Number of fine-tuning jobs to retrieve. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// Service returned a non-success status code. + /// The response returned from the service. + public virtual IAsyncEnumerable GetJobsAsync(string after, int? limit, RequestOptions options) + { + FineTuningJobsPageEnumerator enumerator = new FineTuningJobsPageEnumerator(_pipeline, _endpoint, after, limit, options); + return PageCollectionHelpers.CreateAsync(enumerator); + } + + // CUSTOM: + // - Renamed. + // - Edited doc comment. + /// + /// [Protocol Method] List your organization's fine-tuning jobs + /// + /// Identifier for the last job from the previous pagination request. + /// Number of fine-tuning jobs to retrieve. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// Service returned a non-success status code. + /// The response returned from the service. + public virtual IEnumerable GetJobs(string after, int? 
limit, RequestOptions options) + { + FineTuningJobsPageEnumerator enumerator = new FineTuningJobsPageEnumerator(_pipeline, _endpoint, after, limit, options); + return PageCollectionHelpers.Create(enumerator); + } + + // CUSTOM: + // - Renamed. + // - Edited doc comment. + /// + /// [Protocol Method] Get info about a fine-tuning job. + /// + /// [Learn more about fine-tuning](/docs/guides/fine-tuning) + /// + /// The ID of the fine-tuning job. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// is null. + /// is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// The response returned from the service. + public virtual async Task GetJobAsync(string jobId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(jobId, nameof(jobId)); + + using PipelineMessage message = CreateRetrieveFineTuningJobRequest(jobId, options); + return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + // CUSTOM: + // - Renamed. + // - Edited doc comment. + /// + /// [Protocol Method] Get info about a fine-tuning job. + /// + /// [Learn more about fine-tuning](/docs/guides/fine-tuning) + /// + /// The ID of the fine-tuning job. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// is null. + /// is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// The response returned from the service. + public virtual ClientResult GetJob(string jobId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(jobId, nameof(jobId)); + + using PipelineMessage message = CreateRetrieveFineTuningJobRequest(jobId, options); + return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options)); + } + + // CUSTOM: + // - Renamed. + // - Edited doc comment. 
+ /// + /// [Protocol Method] Immediately cancel a fine-tune job. + /// + /// The ID of the fine-tuning job to cancel. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// is null. + /// is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// The response returned from the service. + public virtual async Task CancelJobAsync(string jobId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(jobId, nameof(jobId)); + + using PipelineMessage message = CreateCancelFineTuningJobRequest(jobId, options); + return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + // CUSTOM: + // - Renamed. + // - Edited doc comment. + /// + /// [Protocol Method] Immediately cancel a fine-tune job. + /// + /// The ID of the fine-tuning job to cancel. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// is null. + /// is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// The response returned from the service. + public virtual ClientResult CancelJob(string jobId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(jobId, nameof(jobId)); + + using PipelineMessage message = CreateCancelFineTuningJobRequest(jobId, options); + return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options)); + } + + // CUSTOM: + // - Renamed. + // - Edited doc comment. + /// + /// [Protocol Method] Get status updates for a fine-tuning job. + /// + /// The ID of the fine-tuning job to get events for. + /// Identifier for the last event from the previous pagination request. + /// Number of events to retrieve. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// is null. + /// is an empty string, and was expected to be non-empty. 
+ /// Service returned a non-success status code. + /// The response returned from the service. + public virtual IAsyncEnumerable GetJobEventsAsync(string jobId, string after, int? limit, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(jobId, nameof(jobId)); + + FineTuningJobEventsPageEnumerator enumerator = new FineTuningJobEventsPageEnumerator(_pipeline, _endpoint, jobId, after, limit, options); + return PageCollectionHelpers.CreateAsync(enumerator); + } + + // CUSTOM: + // - Renamed. + // - Edited doc comment. + /// + /// [Protocol Method] Get status updates for a fine-tuning job. + /// + /// The ID of the fine-tuning job to get events for. + /// Identifier for the last event from the previous pagination request. + /// Number of events to retrieve. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// is null. + /// is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// The response returned from the service. + public virtual IEnumerable GetJobEvents(string jobId, string after, int? limit, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(jobId, nameof(jobId)); + + FineTuningJobEventsPageEnumerator enumerator = new FineTuningJobEventsPageEnumerator(_pipeline, _endpoint, jobId, after, limit, options); + return PageCollectionHelpers.Create(enumerator); + } + + /// + /// [Protocol Method] List the checkpoints for a fine-tuning job. + /// + /// The ID of the fine-tuning job to get checkpoints for. + /// Identifier for the last checkpoint ID from the previous pagination request. + /// Number of checkpoints to retrieve. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// is null. + /// is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// The response returned from the service. 
+ public virtual IAsyncEnumerable GetJobCheckpointsAsync(string jobId, string after, int? limit, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(jobId, nameof(jobId)); + + FineTuningJobCheckpointsPageEnumerator enumerator = new FineTuningJobCheckpointsPageEnumerator(_pipeline, _endpoint, jobId, after, limit, options); + return PageCollectionHelpers.CreateAsync(enumerator); + } + + /// + /// [Protocol Method] List the checkpoints for a fine-tuning job. + /// + /// The ID of the fine-tuning job to get checkpoints for. + /// Identifier for the last checkpoint ID from the previous pagination request. + /// Number of checkpoints to retrieve. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// is null. + /// is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// The response returned from the service. + public virtual IEnumerable GetJobCheckpoints(string jobId, string after, int? limit, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(jobId, nameof(jobId)); + + FineTuningJobCheckpointsPageEnumerator enumerator = new FineTuningJobCheckpointsPageEnumerator(_pipeline, _endpoint, jobId, after, limit, options); + return PageCollectionHelpers.Create(enumerator); + } + + /// + /// Helper method to determine the starting point of the next page based on the previous page. + /// + /// The previous page. + /// The starting point of the next page. + /// If the starting point could not be determined. + internal static string GetStartOfNextPage(ClientResult previousPage) + { + PipelineResponse response = previousPage.GetRawResponse(); + + string after = null; + using JsonDocument doc = JsonDocument.Parse(response.Content); + + // For consistency with the existing code, we'll try to use the "last_id" field.
However, the OpenAI specification + // doesn't actually mention that this property exists, and it doesn't appear in the response, so this branch is + // not expected to match. + if (doc.RootElement.TryGetProperty("last_id"u8, out JsonElement lastId)) + { + after = lastId.GetString(); + } + + // This should work: we get the "data" property, check that it is an array, get its last element, and then read + // the "id" of that element. + if (after == null + && doc.RootElement.TryGetProperty("data"u8, out JsonElement dataArray) + && dataArray.ValueKind == JsonValueKind.Array + && dataArray.EnumerateArray().LastOrDefault().TryGetProperty("id"u8, out lastId)) + { + after = lastId.GetString(); + } + + return after + ?? throw new InvalidOperationException("Don't know how to determine the starting point for the next page"); + } +} diff --git a/.dotnet/src/Custom/FineTuning/FineTuningClient.cs b/.dotnet/src/Custom/FineTuning/FineTuningClient.cs new file mode 100644 index 000000000..0d4ff991f --- /dev/null +++ b/.dotnet/src/Custom/FineTuning/FineTuningClient.cs @@ -0,0 +1,70 @@ +using System; +using System.ClientModel; +using System.ClientModel.Primitives; + +namespace OpenAI.FineTuning; + +// CUSTOM: +// - Renamed. +// - Suppressed constructor that takes endpoint parameter; endpoint is now a property in the options class. +// - Suppressed convenience methods for now. +/// The service client for OpenAI fine-tuning operations.
+[CodeGenClient("FineTuning")] +[CodeGenSuppress("FineTuningClient", typeof(ClientPipeline), typeof(ApiKeyCredential), typeof(Uri))] +[CodeGenSuppress("CreateFineTuningJobAsync", typeof(InternalCreateFineTuningJobRequest))] +[CodeGenSuppress("CreateFineTuningJob", typeof(InternalCreateFineTuningJobRequest))] +[CodeGenSuppress("GetPaginatedFineTuningJobsAsync", typeof(string), typeof(int?))] +[CodeGenSuppress("GetPaginatedFineTuningJobs", typeof(string), typeof(int?))] +[CodeGenSuppress("RetrieveFineTuningJobAsync", typeof(string))] +[CodeGenSuppress("RetrieveFineTuningJob", typeof(string))] +[CodeGenSuppress("CancelFineTuningJobAsync", typeof(string))] +[CodeGenSuppress("CancelFineTuningJob", typeof(string))] +[CodeGenSuppress("GetFineTuningEventsAsync", typeof(string), typeof(string), typeof(int?))] +[CodeGenSuppress("GetFineTuningEvents", typeof(string), typeof(string), typeof(int?))] +[CodeGenSuppress("GetFineTuningJobCheckpointsAsync", typeof(string), typeof(string), typeof(int?))] +[CodeGenSuppress("GetFineTuningJobCheckpoints", typeof(string), typeof(string), typeof(int?))] +public partial class FineTuningClient +{ + // CUSTOM: + // - Used a custom pipeline. + // - Demoted the endpoint parameter to be a property in the options class. + /// Initializes a new instance of . + /// The API key to authenticate with the service. + /// is null. + public FineTuningClient(ApiKeyCredential credential) : this(credential, new OpenAIClientOptions()) + { + } + + // CUSTOM: + // - Used a custom pipeline. + // - Demoted the endpoint parameter to be a property in the options class. + /// Initializes a new instance of . + /// The API key to authenticate with the service. + /// The options to configure the client. + /// is null. 
+ public FineTuningClient(ApiKeyCredential credential, OpenAIClientOptions options) + { + Argument.AssertNotNull(credential, nameof(credential)); + options ??= new OpenAIClientOptions(); + + _pipeline = OpenAIClient.CreatePipeline(credential, options); + _endpoint = OpenAIClient.GetEndpoint(options); + } + + // CUSTOM: + // - Used a custom pipeline. + // - Demoted the endpoint parameter to be a property in the options class. + // - Made protected. + /// Initializes a new instance of . + /// The HTTP pipeline to send and receive REST requests and responses. + /// The options to configure the client. + /// is null. + protected internal FineTuningClient(ClientPipeline pipeline, OpenAIClientOptions options) + { + Argument.AssertNotNull(pipeline, nameof(pipeline)); + options ??= new OpenAIClientOptions(); + + _pipeline = pipeline; + _endpoint = OpenAIClient.GetEndpoint(options); + } +} diff --git a/.dotnet/src/Custom/FineTuning/Internal/GeneratorStubs.cs b/.dotnet/src/Custom/FineTuning/Internal/GeneratorStubs.cs new file mode 100644 index 000000000..833c958ff --- /dev/null +++ b/.dotnet/src/Custom/FineTuning/Internal/GeneratorStubs.cs @@ -0,0 +1,97 @@ +namespace OpenAI.FineTuning; + +// CUSTOM: Made internal. 
+ +[CodeGenModel("CreateFineTuningJobRequest")] +internal partial class InternalCreateFineTuningJobRequest { } + +[CodeGenModel("CreateFineTuningJobRequestHyperparameters")] +internal partial class InternalCreateFineTuningJobRequestHyperparameters { } + +[CodeGenModel("CreateFineTuningJobRequestIntegration")] +internal partial class InternalCreateFineTuningJobRequestIntegration { } + +[CodeGenModel("CreateFineTuningJobRequestIntegrationType")] +internal readonly partial struct InternalCreateFineTuningJobRequestIntegrationType { } + +[CodeGenModel("CreateFineTuningJobRequestIntegrationWandb")] +internal partial class InternalCreateFineTuningJobRequestIntegrationWandb { } + +[CodeGenModel("CreateFineTuningJobRequestModel")] +internal readonly partial struct InternalCreateFineTuningJobRequestModel { } + +[CodeGenModel("FineTuneChatCompletionRequestAssistantMessage")] +internal partial class InternalFineTuneChatCompletionRequestAssistantMessage { } + +[CodeGenModel("FinetuneChatRequestInput")] +internal partial class InternalFinetuneChatRequestInput { } + +[CodeGenModel("FinetuneCompletionRequestInput")] +internal partial class InternalFinetuneCompletionRequestInput { } + +[CodeGenModel("FineTuningIntegration")] +internal partial class InternalFineTuningIntegration { } + +[CodeGenModel("FineTuningIntegrationType")] +internal readonly partial struct InternalFineTuningIntegrationType { } + +[CodeGenModel("FineTuningIntegrationWandb")] +internal partial class InternalFineTuningIntegrationWandb { } + +[CodeGenModel("FineTuningJob")] +internal partial class InternalFineTuningJob { } + +[CodeGenModel("FineTuningJobCheckpoint")] +internal partial class InternalFineTuningJobCheckpoint { } + +[CodeGenModel("FineTuningJobCheckpointMetrics")] +internal partial class InternalFineTuningJobCheckpointMetrics { } + +[CodeGenModel("FineTuningJobCheckpointObject")] +internal readonly partial struct InternalFineTuningJobCheckpointObject { } + +[CodeGenModel("FineTuningJobError")] 
+internal partial class InternalFineTuningJobError { } + +[CodeGenModel("FineTuningJobEvent")] +internal partial class InternalFineTuningJobEvent { } + +[CodeGenModel("FineTuningJobEventLevel")] +internal readonly partial struct InternalFineTuningJobEventLevel { } + +[CodeGenModel("FineTuningJobEventObject")] +internal readonly partial struct InternalFineTuningJobEventObject { } + +[CodeGenModel("FineTuningJobHyperparameters")] +internal partial class InternalFineTuningJobHyperparameters { } + +[CodeGenModel("FineTuningJobObject")] +internal readonly partial struct InternalFineTuningJobObject { } + +[CodeGenModel("FineTuningJobStatus")] +internal readonly partial struct InternalFineTuningJobStatus { } + +[CodeGenModel("ListFineTuningJobCheckpointsResponse")] +internal partial class InternalListFineTuningJobCheckpointsResponse { } + +[CodeGenModel("ListFineTuningJobCheckpointsResponseObject")] +internal readonly partial struct InternalListFineTuningJobCheckpointsResponseObject { } + +[CodeGenModel("ListFineTuningJobEventsResponse")] +internal partial class InternalListFineTuningJobEventsResponse { } + +[CodeGenModel("ListFineTuningJobEventsResponseObject")] +internal readonly partial struct InternalListFineTuningJobEventsResponseObject { } + +[CodeGenModel("ListPaginatedFineTuningJobsResponse")] +internal partial class InternalListPaginatedFineTuningJobsResponse { } + +[CodeGenModel("ListPaginatedFineTuningJobsResponseObject")] +internal readonly partial struct InternalListPaginatedFineTuningJobsResponseObject { } + +[CodeGenModel("CreateFineTuningJobRequestHyperparametersBatchSizeChoiceEnum")] internal readonly partial struct InternalCreateFineTuningJobRequestHyperparametersBatchSizeChoiceEnum { } +[CodeGenModel("CreateFineTuningJobRequestHyperparametersLearningRateMultiplierChoiceEnum")] internal readonly partial struct InternalCreateFineTuningJobRequestHyperparametersLearningRateMultiplierChoiceEnum { } 
+[CodeGenModel("CreateFineTuningJobRequestHyperparametersNEpochsChoiceEnum")] internal readonly partial struct InternalCreateFineTuningJobRequestHyperparametersNEpochsChoiceEnum { } +[CodeGenModel("FineTuningJobHyperparametersBatchSizeChoiceEnum")] internal readonly partial struct InternalFineTuningJobHyperparametersBatchSizeChoiceEnum { } +[CodeGenModel("FineTuningJobHyperparametersLearningRateMultiplierChoiceEnum")] internal readonly partial struct InternalFineTuningJobHyperparametersLearningRateMultiplierChoiceEnum { } +[CodeGenModel("FineTuningJobHyperparametersNEpochsChoiceEnum")] internal readonly partial struct InternalFineTuningJobHyperparametersNEpochsChoiceEnum { } diff --git a/.dotnet/src/Custom/FineTuning/Internal/Pagination/FineTuningJobCheckpointsPageEnumerator.cs b/.dotnet/src/Custom/FineTuning/Internal/Pagination/FineTuningJobCheckpointsPageEnumerator.cs new file mode 100644 index 000000000..ccda60fdb --- /dev/null +++ b/.dotnet/src/Custom/FineTuning/Internal/Pagination/FineTuningJobCheckpointsPageEnumerator.cs @@ -0,0 +1,108 @@ +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Text.Json; +using System.Threading.Tasks; + +#nullable enable + +namespace OpenAI.FineTuning; + +internal partial class FineTuningJobCheckpointsPageEnumerator : PageResultEnumerator +{ + protected readonly ClientPipeline _pipeline; + protected readonly Uri _endpoint; + + private readonly string _jobId; + private readonly int? _limit; + private readonly RequestOptions _options; + + private string _after; + + public FineTuningJobCheckpointsPageEnumerator( + ClientPipeline pipeline, + Uri endpoint, + string jobId, string after, int? 
limit, + RequestOptions options) + { + _pipeline = pipeline; + _endpoint = endpoint; + + _jobId = jobId; + _after = after; + _limit = limit; + _options = options; + } + + public override async Task GetFirstAsync() + => await GetJobCheckpointsAsync(_jobId, _after, _limit, _options).ConfigureAwait(false); + + public override ClientResult GetFirst() + => GetJobCheckpoints(_jobId, _after, _limit, _options); + + public override async Task GetNextAsync(ClientResult result) + { + _after = FineTuningClient.GetStartOfNextPage(result); + return await GetJobCheckpointsAsync(_jobId, _after, _limit, _options).ConfigureAwait(false); + } + + public override ClientResult GetNext(ClientResult result) + { + _after = FineTuningClient.GetStartOfNextPage(result); + return GetJobCheckpoints(_jobId, _after, _limit, _options); + } + + public override bool HasNext(ClientResult result) + { + PipelineResponse response = result.GetRawResponse(); + + using JsonDocument doc = JsonDocument.Parse(response.Content); + bool hasMore = doc.RootElement.GetProperty("has_more"u8).GetBoolean(); + + return hasMore; + } + + internal virtual async Task GetJobCheckpointsAsync(string jobId, string after, int? limit, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(jobId, nameof(jobId)); + + using PipelineMessage message = CreateGetFineTuningJobCheckpointsRequest(jobId, after, limit, options); + return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + internal virtual ClientResult GetJobCheckpoints(string jobId, string after, int? limit, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(jobId, nameof(jobId)); + + using PipelineMessage message = CreateGetFineTuningJobCheckpointsRequest(jobId, after, limit, options); + return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options)); + } + + internal virtual PipelineMessage CreateGetFineTuningJobCheckpointsRequest(string fineTuningJobId, string after, int? 
limit, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "GET"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/fine_tuning/jobs/", false); + uri.AppendPath(fineTuningJobId, true); + uri.AppendPath("/checkpoints", false); + if (after != null) + { + uri.AppendQuery("after", after, true); + } + if (limit != null) + { + uri.AppendQuery("limit", limit.Value, true); + } + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + message.Apply(options); + return message; + } + + private static PipelineMessageClassifier? _pipelineMessageClassifier200; + private static PipelineMessageClassifier PipelineMessageClassifier200 => _pipelineMessageClassifier200 ??= PipelineMessageClassifier.Create(stackalloc ushort[] { 200 }); +} diff --git a/.dotnet/src/Custom/FineTuning/Internal/Pagination/FineTuningJobEventsPageEnumerator.cs b/.dotnet/src/Custom/FineTuning/Internal/Pagination/FineTuningJobEventsPageEnumerator.cs new file mode 100644 index 000000000..a69bc6273 --- /dev/null +++ b/.dotnet/src/Custom/FineTuning/Internal/Pagination/FineTuningJobEventsPageEnumerator.cs @@ -0,0 +1,108 @@ +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Text.Json; +using System.Threading.Tasks; + +#nullable enable + +namespace OpenAI.FineTuning; + +internal partial class FineTuningJobEventsPageEnumerator : PageResultEnumerator +{ + protected readonly ClientPipeline _pipeline; + protected readonly Uri _endpoint; + + private readonly string _jobId; + private readonly int? _limit; + private readonly RequestOptions _options; + + private string _after; + + public FineTuningJobEventsPageEnumerator( + ClientPipeline pipeline, + Uri endpoint, + string jobId, string after, int? 
limit, + RequestOptions options) + { + _pipeline = pipeline; + _endpoint = endpoint; + + _jobId = jobId; + _after = after; + _limit = limit; + _options = options; + } + + public override async Task<ClientResult> GetFirstAsync() + => await GetJobEventsAsync(_jobId, _after, _limit, _options).ConfigureAwait(false); + + public override ClientResult GetFirst() + => GetJobEvents(_jobId, _after, _limit, _options); + + public override async Task<ClientResult> GetNextAsync(ClientResult result) + { + _after = FineTuningClient.GetStartOfNextPage(result); + return await GetJobEventsAsync(_jobId, _after, _limit, _options).ConfigureAwait(false); + } + + public override ClientResult GetNext(ClientResult result) + { + _after = FineTuningClient.GetStartOfNextPage(result); + return GetJobEvents(_jobId, _after, _limit, _options); + } + + public override bool HasNext(ClientResult result) + { + PipelineResponse response = result.GetRawResponse(); + + using JsonDocument doc = JsonDocument.Parse(response.Content); + bool hasMore = doc.RootElement.GetProperty("has_more"u8).GetBoolean(); + + return hasMore; + } + + internal virtual async Task<ClientResult> GetJobEventsAsync(string jobId, string after, int? limit, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(jobId, nameof(jobId)); + + using PipelineMessage message = CreateGetFineTuningEventsRequest(jobId, after, limit, options); + return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + internal virtual ClientResult GetJobEvents(string jobId, string after, int? limit, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(jobId, nameof(jobId)); + + using PipelineMessage message = CreateGetFineTuningEventsRequest(jobId, after, limit, options); + return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options)); + } + + internal virtual PipelineMessage CreateGetFineTuningEventsRequest(string fineTuningJobId, string after, int? 
limit, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "GET"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/fine_tuning/jobs/", false); + uri.AppendPath(fineTuningJobId, true); + uri.AppendPath("/events", false); + if (after != null) + { + uri.AppendQuery("after", after, true); + } + if (limit != null) + { + uri.AppendQuery("limit", limit.Value, true); + } + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + message.Apply(options); + return message; + } + + private static PipelineMessageClassifier? _pipelineMessageClassifier200; + private static PipelineMessageClassifier PipelineMessageClassifier200 => _pipelineMessageClassifier200 ??= PipelineMessageClassifier.Create(stackalloc ushort[] { 200 }); +} diff --git a/.dotnet/src/Custom/FineTuning/Internal/Pagination/FineTuningJobsPageEnumerator.cs b/.dotnet/src/Custom/FineTuning/Internal/Pagination/FineTuningJobsPageEnumerator.cs new file mode 100644 index 000000000..8b81ff6dc --- /dev/null +++ b/.dotnet/src/Custom/FineTuning/Internal/Pagination/FineTuningJobsPageEnumerator.cs @@ -0,0 +1,100 @@ +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Text.Json; +using System.Threading.Tasks; + +#nullable enable + +namespace OpenAI.FineTuning; + +internal partial class FineTuningJobsPageEnumerator : PageResultEnumerator +{ + protected readonly ClientPipeline _pipeline; + protected readonly Uri _endpoint; + + private readonly int? _limit; + private readonly RequestOptions _options; + + private string? _after; + + public FineTuningJobsPageEnumerator( + ClientPipeline pipeline, + Uri endpoint, + string? after, int? 
limit, + RequestOptions options) + { + _pipeline = pipeline; + _endpoint = endpoint; + + _after = after; + _limit = limit; + _options = options; + } + + public override async Task<ClientResult> GetFirstAsync() + => await GetJobsAsync(_after, _limit, _options).ConfigureAwait(false); + + public override ClientResult GetFirst() + => GetJobs(_after, _limit, _options); + + public override async Task<ClientResult> GetNextAsync(ClientResult result) + { + _after = FineTuningClient.GetStartOfNextPage(result); + return await GetJobsAsync(_after, _limit, _options).ConfigureAwait(false); + } + + public override ClientResult GetNext(ClientResult result) + { + _after = FineTuningClient.GetStartOfNextPage(result); + return GetJobs(_after, _limit, _options); + } + + public override bool HasNext(ClientResult result) + { + PipelineResponse response = result.GetRawResponse(); + + using JsonDocument doc = JsonDocument.Parse(response.Content); + bool hasMore = doc.RootElement.GetProperty("has_more"u8).GetBoolean(); + + return hasMore; + } + + internal virtual async Task<ClientResult> GetJobsAsync(string? after, int? limit, RequestOptions options) + { + using PipelineMessage message = CreateGetFineTuningJobsRequest(after, limit, options); + return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + internal virtual ClientResult GetJobs(string? after, int? limit, RequestOptions options) + { + using PipelineMessage message = CreateGetFineTuningJobsRequest(after, limit, options); + return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options)); + } + + internal virtual PipelineMessage CreateGetFineTuningJobsRequest(string? after, int? 
limit, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "GET"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/fine_tuning/jobs", false); + if (after != null) + { + uri.AppendQuery("after", after, true); + } + if (limit != null) + { + uri.AppendQuery("limit", limit.Value, true); + } + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + message.Apply(options); + return message; + } + + private static PipelineMessageClassifier? _pipelineMessageClassifier200; + private static PipelineMessageClassifier PipelineMessageClassifier200 => _pipelineMessageClassifier200 ??= PipelineMessageClassifier.Create(stackalloc ushort[] { 200 }); +} diff --git a/.dotnet/src/Custom/Images/GeneratedImage.cs b/.dotnet/src/Custom/Images/GeneratedImage.cs new file mode 100644 index 000000000..5c8158c76 --- /dev/null +++ b/.dotnet/src/Custom/Images/GeneratedImage.cs @@ -0,0 +1,38 @@ +using System; + +namespace OpenAI.Images; + +/// +/// Represents the result data for an image generation request. +/// +[CodeGenModel("Image")] +public partial class GeneratedImage +{ + // CUSTOM: + // - Renamed. + // - Edited doc comment. + /// + /// The binary image data received from the response, provided when + /// is set to . + /// + /// + /// This property is mutually exclusive with and will be null when the other + /// is present. + /// + [CodeGenMember("B64Json")] + public BinaryData ImageBytes { get; } + + // CUSTOM: + // - Renamed. + // - Edited doc comment. + /// + /// A temporary internet location for an image, provided by default or when + /// is set to . + /// + /// + /// This property is mutually exclusive with and will be null when the other + /// is present. 
+ /// + [CodeGenMember("Url")] + public Uri ImageUri { get; } +} diff --git a/.dotnet/src/Custom/Images/GeneratedImageCollection.Serialization.cs b/.dotnet/src/Custom/Images/GeneratedImageCollection.Serialization.cs new file mode 100644 index 000000000..8dbfc48e8 --- /dev/null +++ b/.dotnet/src/Custom/Images/GeneratedImageCollection.Serialization.cs @@ -0,0 +1,71 @@ +using System; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Images; + +[CodeGenSuppress("global::System.ClientModel.Primitives.IJsonModel<OpenAI.Images.GeneratedImageCollection>.Write", typeof(Utf8JsonWriter), typeof(ModelReaderWriterOptions))] +public partial class GeneratedImageCollection : IJsonModel<GeneratedImageCollection> +{ + // CUSTOM: + // - Serialized the Items property. + // - Recovered the serialization of SerializedAdditionalRawData. See https://github.com/Azure/autorest.csharp/issues/4636. + void IJsonModel<GeneratedImageCollection>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + => CustomSerializationHelpers.SerializeInstance(this, SerializeGeneratedImageCollection, writer, options); + + internal static void SerializeGeneratedImageCollection(GeneratedImageCollection instance, Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + writer.WriteStartObject(); + writer.WritePropertyName("created"u8); + writer.WriteNumberValue(instance.CreatedAt, "U"); + writer.WritePropertyName("data"u8); + writer.WriteStartArray(); + foreach (var item in instance.Items) + { + writer.WriteObjectValue(item, options); + } + writer.WriteEndArray(); + writer.WriteSerializedAdditionalRawData(instance.SerializedAdditionalRawData, options); + writer.WriteEndObject(); + } + + // CUSTOM: Recovered the deserialization of SerializedAdditionalRawData. See https://github.com/Azure/autorest.csharp/issues/4636. 
+ internal static GeneratedImageCollection DeserializeGeneratedImageCollection(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + DateTimeOffset created = default; + IReadOnlyList<GeneratedImage> data = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("created"u8)) + { + created = DateTimeOffset.FromUnixTimeSeconds(property.Value.GetInt64()); + continue; + } + if (property.NameEquals("data"u8)) + { + List<GeneratedImage> array = new List<GeneratedImage>(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(GeneratedImage.DeserializeGeneratedImage(item, options)); + } + data = array; + continue; + } + if (true) + { + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new GeneratedImageCollection(created, data, serializedAdditionalRawData); + } +} diff --git a/.dotnet/src/Custom/Images/GeneratedImageCollection.cs b/.dotnet/src/Custom/Images/GeneratedImageCollection.cs new file mode 100644 index 000000000..1cd14a3ab --- /dev/null +++ b/.dotnet/src/Custom/Images/GeneratedImageCollection.cs @@ -0,0 +1,87 @@ +using System; +using System.Collections.Generic; +using System.Collections.ObjectModel; + +namespace OpenAI.Images; + +/// +/// Represents an image generation response payload that contains information for multiple generated images. +/// +[CodeGenModel("ImagesResponse")] +[CodeGenSuppress("Data")] +[CodeGenSuppress(nameof(GeneratedImageCollection))] +[CodeGenSuppress(nameof(GeneratedImageCollection), typeof(DateTimeOffset), typeof(IReadOnlyList<GeneratedImage>))] +public partial class GeneratedImageCollection : ReadOnlyCollection<GeneratedImage> +{ + // CUSTOM: Recovered this field. 
See https://github.com/Azure/autorest.csharp/issues/4636. + /// + /// Keeps track of any properties unknown to the library. + /// + /// To assign an object to the value of this property use . + /// + /// + /// To assign an already formatted json string to this property use . + /// + /// + /// Examples: + /// + /// + /// BinaryData.FromObjectAsJson("foo") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromString("\"foo\"") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromObjectAsJson(new { key = "value" }) + /// Creates a payload of { "key": "value" }. + /// + /// + /// BinaryData.FromString("{\"key\": \"value\"}") + /// Creates a payload of { "key": "value" }. + /// + /// + /// + /// + private IDictionary SerializedAdditionalRawData; + + // CUSTOM: Set the inherited Items property via the base constructor in favor of the suppressed Data property. + /// Initializes a new instance of . + /// + /// + /// is null. + internal GeneratedImageCollection(DateTimeOffset created, IEnumerable data) + : base([.. data]) + { + Argument.AssertNotNull(data, nameof(data)); + + CreatedAt = created; + } + + // CUSTOM: Set the inherited Items property via the base constructor in favor of the suppressed Data property. + /// Initializes a new instance of . + /// + /// + /// Keeps track of any properties unknown to the library. + internal GeneratedImageCollection(DateTimeOffset created, IReadOnlyList data, IDictionary serializedAdditionalRawData) + : base([.. data]) + { + CreatedAt = created; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + // CUSTOM: Set the inherited Items property via the base constructor in favor of the suppressed Data property. + /// Initializes a new instance of for deserialization. + internal GeneratedImageCollection() + : base([]) + { + } + + // CUSTOM: Renamed. + /// + /// The timestamp at which the result image was generated. 
+ /// + [CodeGenMember("Created")] + public DateTimeOffset CreatedAt { get; } +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Images/GeneratedImageFormat.cs b/.dotnet/src/Custom/Images/GeneratedImageFormat.cs new file mode 100644 index 000000000..93bdca9ec --- /dev/null +++ b/.dotnet/src/Custom/Images/GeneratedImageFormat.cs @@ -0,0 +1,37 @@ +namespace OpenAI.Images; + +// CUSTOM: +// - Renamed enum and its members. +// - Converted extensible enum into an enum. +// - Edited doc comment. +/// +/// Represents the available output methods for generated images. +/// +/// +/// url - - Default, provides a temporary internet location that +/// the generated image can be retrieved from. +/// +/// +/// b64_json - - Provides the full image data on the response, +/// encoded in the result as a base64 string. This offers the fastest round trip time but can drastically +/// increase the size of response payloads. +/// +/// +/// +[CodeGenModel("CreateImageRequestResponseFormat")] +public enum GeneratedImageFormat +{ + /// + /// Instructs the request to return image data directly on the response, encoded as a base64 string in the response + /// JSON. This minimizes availability time but drastically increases the size of responses, required bandwidth, and + /// immediate memory needs. This is equivalent to b64_json in the REST API. + /// + [CodeGenMember("B64Json")] + Bytes, + /// + /// The default setting that instructs the request to return a temporary internet location from which the image can + /// be retrieved. + /// + [CodeGenMember("Url")] + Uri, +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Images/GeneratedImageQuality.cs b/.dotnet/src/Custom/Images/GeneratedImageQuality.cs new file mode 100644 index 000000000..6f25f99e8 --- /dev/null +++ b/.dotnet/src/Custom/Images/GeneratedImageQuality.cs @@ -0,0 +1,31 @@ +namespace OpenAI.Images; + +// CUSTOM: +// - Renamed enum and its members. +// - Converted extensible enum into an enum. 
+// - Edited doc comment. +/// +/// A representation of the quality setting for image operations that controls the level of work that the model will +/// perform. +/// +/// +/// Available qualities consist of: +/// +/// - standard - The default setting that balances speed, detail, and consistency. +/// - hd - Better consistency and finer details, but may be slower. +/// +/// +[CodeGenModel("CreateImageRequestQuality")] +public enum GeneratedImageQuality +{ + /// + /// The hd image quality that provides finer details and greater consistency but may be slower. + /// + [CodeGenMember("Hd")] + High, + /// + /// The standard image quality that provides a balanced mix of detailing, consistency, and speed. + /// + [CodeGenMember("Standard")] + Standard, +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Images/GeneratedImageSize.cs b/.dotnet/src/Custom/Images/GeneratedImageSize.cs new file mode 100644 index 000000000..49acb361b --- /dev/null +++ b/.dotnet/src/Custom/Images/GeneratedImageSize.cs @@ -0,0 +1,81 @@ +using System; + +namespace OpenAI.Images; + +// CUSTOM: Added custom struct in favor of the generated extensible enum. +/// +/// Represents the available output dimensions for generated images. +/// +[CodeGenModel("CreateImageRequestSize")] +[CodeGenSuppress("GeneratedImageSize", typeof(string))] +[CodeGenSuppress("op_Implicit", typeof(string))] +public readonly partial struct GeneratedImageSize : IEquatable<GeneratedImageSize> +{ + private readonly string _value; + + /// Initializes a new instance of . + /// is null. + internal GeneratedImageSize(string value) + { + _value = value ?? throw new ArgumentNullException(nameof(value)); + } + + /// + /// Creates a new instance of . + /// + /// + /// Note: arbitrary dimensions are not supported and a given model will only support a set of predefined + /// sizes. If supported dimensions are not known, try using one of the static properties like . + /// + /// The desired width, in pixels, for an image. 
+ /// The desired height, in pixels, for an image. + public GeneratedImageSize(int width, int height) + { + _value = $"{width}x{height}"; + } + + /// + /// A small, square image with 256 pixels of both width and height. + /// + /// Supported only for the older dall-e-2 model. + /// + /// + [CodeGenMember("_256x256")] + public static readonly GeneratedImageSize W256xH256 = new(256, 256); + + /// + /// A medium-small, square image with 512 pixels of both width and height. + /// + /// Supported only for the older dall-e-2 model. + /// + /// + [CodeGenMember("_512x512")] + public static readonly GeneratedImageSize W512xH512 = new(512, 512); + + /// + /// A square image with 1024 pixels of both width and height. + /// + /// Supported and default for both dall-e-2 and dall-e-3 models. + /// + /// + [CodeGenMember("_1024x1024")] + public static readonly GeneratedImageSize W1024xH1024 = new(1024, 1024); + + /// + /// An extra tall image, 1024 pixels wide by 1792 pixels high. + /// + /// Supported only for the dall-e-3 model. + /// + /// + [CodeGenMember("_1024x1792")] + public static readonly GeneratedImageSize W1024xH1792 = new(1024, 1792); + + /// + /// An extra wide image, 1792 pixels wide by 1024 pixels high. + /// + /// Supported only for the dall-e-3 model. + /// + /// + [CodeGenMember("_1792x1024")] + public static readonly GeneratedImageSize W1792xH1024 = new(1792, 1024); +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Images/GeneratedImageStyle.cs b/.dotnet/src/Custom/Images/GeneratedImageStyle.cs new file mode 100644 index 000000000..1ea6c49d5 --- /dev/null +++ b/.dotnet/src/Custom/Images/GeneratedImageStyle.cs @@ -0,0 +1,23 @@ +namespace OpenAI.Images; + +// CUSTOM: +// - Renamed. +// - Converted extensible enum into an enum. +// - Edited doc comment. +/// +/// The style of the generated images. Must be one of vivid or natural. Vivid causes the model to lean towards +/// generating hyper-real and dramatic images. 
Natural causes the model to produce more natural, less hyper-real +/// looking images. This param is only supported for dall-e-3. +/// +[CodeGenModel("CreateImageRequestStyle")] +public enum GeneratedImageStyle +{ + /// + /// The vivid style, with which the model will tend towards hyper-realistic, dramatic imagery. + /// + Vivid, + /// + /// The natural style, with which the model will not tend towards hyper-realistic, dramatic imagery. + /// + Natural, +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Images/ImageClient.Protocol.cs b/.dotnet/src/Custom/Images/ImageClient.Protocol.cs new file mode 100644 index 000000000..b0ffff651 --- /dev/null +++ b/.dotnet/src/Custom/Images/ImageClient.Protocol.cs @@ -0,0 +1,162 @@ +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.ComponentModel; +using System.Threading.Tasks; + +namespace OpenAI.Images; + +[CodeGenSuppress("CreateImageAsync", typeof(BinaryContent), typeof(RequestOptions))] +[CodeGenSuppress("CreateImage", typeof(BinaryContent), typeof(RequestOptions))] +[CodeGenSuppress("CreateImageEditAsync", typeof(BinaryContent), typeof(string), typeof(RequestOptions))] +[CodeGenSuppress("CreateImageEdit", typeof(BinaryContent), typeof(string), typeof(RequestOptions))] +[CodeGenSuppress("CreateImageVariationAsync", typeof(BinaryContent), typeof(string), typeof(RequestOptions))] +[CodeGenSuppress("CreateImageVariation", typeof(BinaryContent), typeof(string), typeof(RequestOptions))] +public partial class ImageClient +{ + // CUSTOM: + // - Renamed. + // - Edited the cref in the doc comment to point to the correct convenience overload after it was also renamed. + // - Added the EditorBrowsable attribute to hide protocol methods from IntelliSense when a convenience overload is available. + /// + /// [Protocol Method] Generates images based on a given prompt. + /// + /// The content to send as the body of the request. 
+ /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// is null. + /// Service returned a non-success status code. + /// The response returned from the service. + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual async Task GenerateImagesAsync(BinaryContent content, RequestOptions options = null) + { + Argument.AssertNotNull(content, nameof(content)); + + using PipelineMessage message = CreateCreateImageRequest(content, options); + return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + // CUSTOM: + // - Renamed. + // - Edited the cref in the doc comment to point to the correct convenience overload after it was also renamed. + // - Added the EditorBrowsable attribute to hide protocol methods from IntelliSense when a convenience overload is available. + /// + /// [Protocol Method] Generates images based on a given prompt. + /// + /// The content to send as the body of the request. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// is null. + /// Service returned a non-success status code. + /// The response returned from the service. + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual ClientResult GenerateImages(BinaryContent content, RequestOptions options = null) + { + Argument.AssertNotNull(content, nameof(content)); + + using PipelineMessage message = CreateCreateImageRequest(content, options); + return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options)); + } + + // CUSTOM: + // - Renamed. + // - Edited the cref in the doc comment to point to the correct convenience overload after it was also renamed. + // - Added the EditorBrowsable attribute to hide protocol methods from IntelliSense when a convenience overload is available. + // - Parametrized the Content-Type header. + // - Added "contentType" parameter. 
+ /// + /// [Protocol Method] Generates edited or extended images given an original image and a prompt. + /// + /// The content to send as the body of the request. + /// The content type of the request. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// or is null. + /// is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// The response returned from the service. + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual async Task GenerateImageEditsAsync(BinaryContent content, string contentType, RequestOptions options = null) + { + Argument.AssertNotNull(content, nameof(content)); + Argument.AssertNotNullOrEmpty(contentType, nameof(contentType)); + + using PipelineMessage message = CreateCreateImageEditRequest(content, contentType, options); + return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + // CUSTOM: + // - Renamed. + // - Edited the cref in the doc comment to point to the correct convenience overload after it was also renamed. + // - Added the EditorBrowsable attribute to hide protocol methods from IntelliSense when a convenience overload is available. + // - Parametrized the Content-Type header. + // - Added "contentType" parameter. + /// + /// [Protocol Method] Generates edited or extended images given an original image and a prompt. + /// + /// The content to send as the body of the request. + /// The content type of the request. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// or is null. + /// is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// The response returned from the service. 
+ [EditorBrowsable(EditorBrowsableState.Never)] + public virtual ClientResult GenerateImageEdits(BinaryContent content, string contentType, RequestOptions options = null) + { + Argument.AssertNotNull(content, nameof(content)); + Argument.AssertNotNullOrEmpty(contentType, nameof(contentType)); + + using PipelineMessage message = CreateCreateImageEditRequest(content, contentType, options); + return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options)); + } + + // CUSTOM: + // - Renamed. + // - Edited the cref in the doc comment to point to the correct convenience overload after it was also renamed. + // - Added the EditorBrowsable attribute to hide protocol methods from IntelliSense when a convenience overload is available. + // - Parametrized the Content-Type header. + // - Added "contentType" parameter. + /// + /// [Protocol Method] Generates variations of a given image. + /// + /// The content to send as the body of the request. + /// The content type of the request. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// or is null. + /// is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// The response returned from the service. + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual async Task GenerateImageVariationsAsync(BinaryContent content, string contentType, RequestOptions options = null) + { + Argument.AssertNotNull(content, nameof(content)); + Argument.AssertNotNullOrEmpty(contentType, nameof(contentType)); + + using PipelineMessage message = CreateCreateImageVariationRequest(content, contentType, options); + return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + // CUSTOM: + // - Renamed. + // - Edited the cref in the doc comment to point to the correct convenience overload after it was also renamed. 
+ // - Added the EditorBrowsable attribute to hide protocol methods from IntelliSense when a convenience overload is available. + // - Parametrized the Content-Type header. + // - Added "contentType" parameter. + /// + /// [Protocol Method] Generates variations of a given image. + /// + /// The content to send as the body of the request. + /// The content type of the request. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// or is null. + /// is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// The response returned from the service. + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual ClientResult GenerateImageVariations(BinaryContent content, string contentType, RequestOptions options = null) + { + Argument.AssertNotNull(content, nameof(content)); + Argument.AssertNotNullOrEmpty(contentType, nameof(contentType)); + + using PipelineMessage message = CreateCreateImageVariationRequest(content, contentType, options); + return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options)); + } +} diff --git a/.dotnet/src/Custom/Images/ImageClient.cs b/.dotnet/src/Custom/Images/ImageClient.cs new file mode 100644 index 000000000..b7efdd384 --- /dev/null +++ b/.dotnet/src/Custom/Images/ImageClient.cs @@ -0,0 +1,825 @@ +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.IO; +using System.Linq; +using System.Threading; +using System.Threading.Tasks; + +namespace OpenAI.Images; + +// CUSTOM: +// - Renamed. +// - Suppressed constructor that takes endpoint parameter; endpoint is now a property in the options class. +// - Suppressed methods that only take the options parameter. +/// The service client for OpenAI image operations. 
+[CodeGenClient("Images")] +[CodeGenSuppress("ImageClient", typeof(ClientPipeline), typeof(ApiKeyCredential), typeof(Uri))] +[CodeGenSuppress("CreateImageAsync", typeof(ImageGenerationOptions))] +[CodeGenSuppress("CreateImage", typeof(ImageGenerationOptions))] +[CodeGenSuppress("CreateImageEditAsync", typeof(ImageEditOptions))] +[CodeGenSuppress("CreateImageEdit", typeof(ImageEditOptions))] +[CodeGenSuppress("CreateImageVariationAsync", typeof(ImageVariationOptions))] +[CodeGenSuppress("CreateImageVariation", typeof(ImageVariationOptions))] +public partial class ImageClient +{ + private readonly string _model; + + // CUSTOM: + // - Added `model` parameter. + // - Used a custom pipeline. + // - Demoted the endpoint parameter to be a property in the options class. + /// Initializes a new instance of . + /// The name of the model to use in requests sent to the service. To learn more about the available models, see . + /// The API key to authenticate with the service. + /// or is null. + /// is an empty string, and was expected to be non-empty. + public ImageClient(string model, ApiKeyCredential credential) : this(model, credential, new OpenAIClientOptions()) + { + } + + // CUSTOM: + // - Added `model` parameter. + // - Used a custom pipeline. + // - Demoted the endpoint parameter to be a property in the options class. + /// Initializes a new instance of . + /// The name of the model to use in requests sent to the service. To learn more about the available models, see . + /// The API key to authenticate with the service. + /// The options to configure the client. + /// or is null. + /// is an empty string, and was expected to be non-empty. 
+ public ImageClient(string model, ApiKeyCredential credential, OpenAIClientOptions options) + { + Argument.AssertNotNullOrEmpty(model, nameof(model)); + Argument.AssertNotNull(credential, nameof(credential)); + options ??= new OpenAIClientOptions(); + + _model = model; + _pipeline = OpenAIClient.CreatePipeline(credential, options); + _endpoint = OpenAIClient.GetEndpoint(options); + } + + // CUSTOM: + // - Added `model` parameter. + // - Used a custom pipeline. + // - Demoted the endpoint parameter to be a property in the options class. + // - Made protected. + /// Initializes a new instance of . + /// The HTTP pipeline to send and receive REST requests and responses. + /// The name of the model to use in requests sent to the service. To learn more about the available models, see . + /// The options to configure the client. + /// or is null. + /// is an empty string, and was expected to be non-empty. + protected internal ImageClient(ClientPipeline pipeline, string model, OpenAIClientOptions options) + { + Argument.AssertNotNull(pipeline, nameof(pipeline)); + Argument.AssertNotNullOrEmpty(model, nameof(model)); + options ??= new OpenAIClientOptions(); + + _model = model; + _pipeline = pipeline; + _endpoint = OpenAIClient.GetEndpoint(options); + } + + #region GenerateImages + + /// Generates an image based on a prompt. + /// A text description of the desired image. + /// The options to configure the image generation. + /// A token that can be used to cancel this method call. + /// is null. + /// is an empty string, and was expected to be non-empty. 
+ public virtual async Task> GenerateImageAsync(string prompt, ImageGenerationOptions options = null, CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(prompt, nameof(prompt)); + + options ??= new(); + CreateImageGenerationOptions(prompt, null, ref options); + + using BinaryContent content = options.ToBinaryContent(); + ClientResult result = await GenerateImagesAsync(content, cancellationToken.ToRequestOptions()).ConfigureAwait(false); + return ClientResult.FromValue(GeneratedImageCollection.FromResponse(result.GetRawResponse()).FirstOrDefault(), result.GetRawResponse()); + } + + /// Generates an image based on a prompt. + /// A text description of the desired image. + /// The options to configure the image generation. + /// A token that can be used to cancel this method call. + /// is null. + /// is an empty string, and was expected to be non-empty. + public virtual ClientResult GenerateImage(string prompt, ImageGenerationOptions options = null, CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(prompt, nameof(prompt)); + + options ??= new(); + CreateImageGenerationOptions(prompt, null, ref options); + + using BinaryContent content = options.ToBinaryContent(); + ClientResult result = GenerateImages(content, cancellationToken.ToRequestOptions()); + return ClientResult.FromValue(GeneratedImageCollection.FromResponse(result.GetRawResponse()).FirstOrDefault(), result.GetRawResponse()); + } + + /// Generates images based on a prompt. + /// A text description of the desired images. + /// The number of images to generate. + /// The options to configure the image generation. + /// A token that can be used to cancel this method call. + /// is null. + /// is an empty string, and was expected to be non-empty. 
+ public virtual async Task> GenerateImagesAsync(string prompt, int imageCount, ImageGenerationOptions options = null, CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(prompt, nameof(prompt)); + + options ??= new(); + CreateImageGenerationOptions(prompt, imageCount, ref options); + + using BinaryContent content = options.ToBinaryContent(); + ClientResult result = await GenerateImagesAsync(content, cancellationToken.ToRequestOptions()).ConfigureAwait(false); + return ClientResult.FromValue(GeneratedImageCollection.FromResponse(result.GetRawResponse()), result.GetRawResponse()); + } + + /// Generates images based on a prompt. + /// A text description of the desired images. + /// The number of images to generate. + /// The options to configure the image generation. + /// A token that can be used to cancel this method call. + /// is null. + /// is an empty string, and was expected to be non-empty. + public virtual ClientResult GenerateImages(string prompt, int imageCount, ImageGenerationOptions options = null, CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(prompt, nameof(prompt)); + + options ??= new(); + CreateImageGenerationOptions(prompt, imageCount, ref options); + + using BinaryContent content = options.ToBinaryContent(); + ClientResult result = GenerateImages(content, cancellationToken.ToRequestOptions()); + return ClientResult.FromValue(GeneratedImageCollection.FromResponse(result.GetRawResponse()), result.GetRawResponse()); + } + + #endregion + + #region GenerateImageEdits + + /// Generates an edited or extended image based on an original image and a prompt. + /// + /// The image stream to edit. Must be a valid PNG file, less than 4MB, and square. The image must have transparency, which + /// will be used as the mask. + /// + /// + /// The filename associated with the image stream. The filename's extension (for example: .png) will be used to + /// validate the format of the input image. 
The request may fail if the filename's extension and the actual + /// format of the input image do not match. + /// + /// A text description of the desired image. + /// The options to configure the image edit. + /// A token that can be used to cancel this method call. + /// , , or is null. + /// or is an empty string, and was expected to be non-empty. + public virtual async Task> GenerateImageEditAsync(Stream image, string imageFilename, string prompt, ImageEditOptions options = null, CancellationToken cancellationToken = default) + { + Argument.AssertNotNull(image, nameof(image)); + Argument.AssertNotNullOrEmpty(imageFilename, nameof(imageFilename)); + Argument.AssertNotNullOrEmpty(prompt, nameof(prompt)); + + options ??= new(); + CreateImageEditOptions(image, imageFilename, prompt, null, null, null, ref options); + + using MultipartFormDataBinaryContent content = options.ToMultipartContent(image, imageFilename, null, null); + ClientResult result = await GenerateImageEditsAsync(content, content.ContentType, cancellationToken.ToRequestOptions()).ConfigureAwait(false); + return ClientResult.FromValue(GeneratedImageCollection.FromResponse(result.GetRawResponse()).FirstOrDefault(), result.GetRawResponse()); + } + + /// Generates an edited or extended image based on an original image and a prompt. + /// + /// The image stream to edit. Must be a valid PNG file, less than 4MB, and square. The image must have transparency, which + /// will be used as the mask. + /// + /// + /// The filename associated with the image stream. The filename's extension (for example: .png) will be used to + /// validate the format of the input image. The request may fail if the filename's extension and the actual + /// format of the input image do not match. + /// + /// A text description of the desired image. + /// The options to configure the image edit. + /// A token that can be used to cancel this method call. + /// , , or is null. 
+ /// or is an empty string, and was expected to be non-empty. + public virtual ClientResult GenerateImageEdit(Stream image, string imageFilename, string prompt, ImageEditOptions options = null, CancellationToken cancellationToken = default) + { + Argument.AssertNotNull(image, nameof(image)); + Argument.AssertNotNullOrEmpty(imageFilename, nameof(imageFilename)); + Argument.AssertNotNullOrEmpty(prompt, nameof(prompt)); + + options ??= new(); + CreateImageEditOptions(image, imageFilename, prompt, null, null, null, ref options); + + using MultipartFormDataBinaryContent content = options.ToMultipartContent(image, imageFilename, null, null); + ClientResult result = GenerateImageEdits(content, content.ContentType, cancellationToken.ToRequestOptions()); + return ClientResult.FromValue(GeneratedImageCollection.FromResponse(result.GetRawResponse()).FirstOrDefault(), result.GetRawResponse()); + } + + /// Generates an edited or extended image based on an original image and a prompt. + /// + /// The path of the image file to edit. Must be a valid PNG file, less than 4MB, and square. The image must + /// have transparency, which will be used as the mask. The provided file path's extension (for example: .png) + /// will be used to validate the format of the input image. The request may fail if the file path's extension + /// and the actual format of the input image do not match. + /// + /// A text description of the desired image. + /// The options to configure the image edit. + /// or is null. + /// or is an empty string, and was expected to be non-empty. 
+ public virtual async Task> GenerateImageEditAsync(string imageFilePath, string prompt, ImageEditOptions options = null) + { + Argument.AssertNotNullOrEmpty(imageFilePath, nameof(imageFilePath)); + Argument.AssertNotNullOrEmpty(prompt, nameof(prompt)); + + using FileStream imageStream = File.OpenRead(imageFilePath); + return await GenerateImageEditAsync(imageStream, imageFilePath, prompt, options).ConfigureAwait(false); + } + + /// Generates an edited or extended image based on an original image and a prompt. + /// + /// The path of the image file to edit. Must be a valid PNG file, less than 4MB, and square. The image must + /// have transparency, which will be used as the mask. The provided file path's extension (for example: .png) + /// will be used to validate the format of the input image. The request may fail if the file path's extension + /// and the actual format of the input image do not match. + /// + /// A text description of the desired image. + /// The options to configure the image edit. + /// or is null. + /// or is an empty string, and was expected to be non-empty. + public virtual ClientResult GenerateImageEdit(string imageFilePath, string prompt, ImageEditOptions options = null) + { + Argument.AssertNotNullOrEmpty(imageFilePath, nameof(imageFilePath)); + Argument.AssertNotNullOrEmpty(prompt, nameof(prompt)); + + using FileStream imageStream = File.OpenRead(imageFilePath); + return GenerateImageEdit(imageStream, imageFilePath, prompt,options); + } + + /// Generates an edited or extended image based on an original image, a prompt, and a mask. + /// The image stream to edit. Must be a valid PNG file, less than 4MB, and square. + /// + /// The filename associated with the image stream. The filename's extension (for example: .png) will be used to + /// validate the format of the input image. The request may fail if the filename's extension and the actual + /// format of the input image do not match. + /// + /// A text description of the desired image. 
+ /// + /// An additional image whose fully transparent areas (i.e., where alpha is zero) indicate where the original image + /// should be edited. Must be a valid PNG file, less than 4MB, and have the same dimensions as image. + /// + /// + /// The filename associated with the mask image stream. The filename's extension (for example: .png) will be + /// used to validate the format of the mask image. The request may fail if the filename's extension and the + /// actual format of the mask image do not match. + /// + /// The options to configure the image edit. + /// A token that can be used to cancel this method call. + /// , , , , or is null. + /// , , or is an empty string, and was expected to be non-empty. + public virtual async Task> GenerateImageEditAsync(Stream image, string imageFilename, string prompt, Stream mask, string maskFilename, ImageEditOptions options = null, CancellationToken cancellationToken = default) + { + Argument.AssertNotNull(image, nameof(image)); + Argument.AssertNotNullOrEmpty(imageFilename, nameof(imageFilename)); + Argument.AssertNotNullOrEmpty(prompt, nameof(prompt)); + Argument.AssertNotNull(mask, nameof(mask)); + Argument.AssertNotNullOrEmpty(maskFilename, nameof(maskFilename)); + + options ??= new(); + CreateImageEditOptions(image, imageFilename, prompt, mask, maskFilename, null, ref options); + + using MultipartFormDataBinaryContent content = options.ToMultipartContent(image, imageFilename, mask, maskFilename); + ClientResult result = await GenerateImageEditsAsync(content, content.ContentType, cancellationToken.ToRequestOptions()).ConfigureAwait(false); + return ClientResult.FromValue(GeneratedImageCollection.FromResponse(result.GetRawResponse()).FirstOrDefault(), result.GetRawResponse()); + } + + /// Generates an edited or extended image based on an original image, a prompt, and a mask. + /// The image stream to edit. Must be a valid PNG file, less than 4MB, and square. + /// + /// The filename associated with the image stream. 
The filename's extension (for example: .png) will be used to + /// validate the format of the input image. The request may fail if the filename's extension and the actual + /// format of the input image do not match. + /// + /// A text description of the desired image. + /// + /// An additional image whose fully transparent areas (i.e., where alpha is zero) indicate where the original image + /// should be edited. Must be a valid PNG file, less than 4MB, and have the same dimensions as image. + /// + /// + /// The filename associated with the mask image stream. The filename's extension (for example: .png) will be + /// used to validate the format of the mask image. The request may fail if the filename's extension and the + /// actual format of the mask image do not match. + /// + /// The options to configure the image edit. + /// A token that can be used to cancel this method call. + /// , , , , or is null. + /// , , or is an empty string, and was expected to be non-empty. + public virtual ClientResult GenerateImageEdit(Stream image, string imageFilename, string prompt, Stream mask, string maskFilename, ImageEditOptions options = null, CancellationToken cancellationToken = default) + { + Argument.AssertNotNull(image, nameof(image)); + Argument.AssertNotNullOrEmpty(imageFilename, nameof(imageFilename)); + Argument.AssertNotNullOrEmpty(prompt, nameof(prompt)); + Argument.AssertNotNull(mask, nameof(mask)); + Argument.AssertNotNullOrEmpty(maskFilename, nameof(maskFilename)); + + options ??= new(); + CreateImageEditOptions(image, imageFilename, prompt, mask, maskFilename, null, ref options); + + using MultipartFormDataBinaryContent content = options.ToMultipartContent(image, imageFilename, mask, maskFilename); + ClientResult result = GenerateImageEdits(content, content.ContentType, cancellationToken.ToRequestOptions()); + return ClientResult.FromValue(GeneratedImageCollection.FromResponse(result.GetRawResponse()).FirstOrDefault(), result.GetRawResponse()); + } + + /// 
Generates an edited or extended image based on an original image, a prompt, and a mask. + /// + /// The path of the image file to edit. Must be a valid PNG file, less than 4MB, and square. The provided file + /// path's extension (for example: .png) will be used to validate the format of the input image. The request + /// may fail if the file path's extension and the actual format of the input image do not match. + /// + /// A text description of the desired image. + /// + /// The path of the mask image file whose fully transparent areas (i.e., where alpha is zero) indicate where + /// the original image should be edited. Must be a valid PNG file, less than 4MB, and have the same dimensions + /// as the original image. The provided file path's extension (for example: .png) will be used to validate the + /// format of the mask image. The request may fail if the file path's extension and the actual format of the + /// mask image do not match. + /// + /// The options to configure the image edit. + /// , or is null. + /// , , or is an empty string, and was expected to be non-empty. + public virtual async Task> GenerateImageEditAsync(string imageFilePath, string prompt, string maskFilePath, ImageEditOptions options = null) + { + Argument.AssertNotNullOrEmpty(imageFilePath, nameof(imageFilePath)); + Argument.AssertNotNullOrEmpty(prompt, nameof(prompt)); + Argument.AssertNotNullOrEmpty(maskFilePath, nameof(maskFilePath)); + + using FileStream imageStream = File.OpenRead(imageFilePath); + using FileStream maskStream = File.OpenRead(maskFilePath); + return await GenerateImageEditAsync(imageStream, imageFilePath, prompt, maskStream, maskFilePath, options).ConfigureAwait(false); + } + + /// Generates an edited or extended image based on an original image, a prompt, and a mask. + /// + /// The path of the image file to edit. Must be a valid PNG file, less than 4MB, and square. 
The provided file + /// path's extension (for example: .png) will be used to validate the format of the input image. The request + /// may fail if the file path's extension and the actual format of the input image do not match. + /// + /// A text description of the desired image. + /// + /// The path of the mask image file whose fully transparent areas (i.e., where alpha is zero) indicate where + /// the original image should be edited. Must be a valid PNG file, less than 4MB, and have the same dimensions + /// as the original image. The provided file path's extension (for example: .png) will be used to validate the + /// format of the mask image. The request may fail if the file path's extension and the actual format of the + /// mask image do not match. + /// + /// The options to configure the image edit. + /// , or is null. + /// , , or is an empty string, and was expected to be non-empty. + public virtual ClientResult GenerateImageEdit(string imageFilePath, string prompt, string maskFilePath, ImageEditOptions options = null) + { + Argument.AssertNotNullOrEmpty(imageFilePath, nameof(imageFilePath)); + Argument.AssertNotNullOrEmpty(prompt, nameof(prompt)); + Argument.AssertNotNullOrEmpty(maskFilePath, nameof(maskFilePath)); + + using FileStream imageStream = File.OpenRead(imageFilePath); + using FileStream maskStream = File.OpenRead(maskFilePath); + return GenerateImageEdit(imageStream, imageFilePath, prompt, maskStream, maskFilePath, options); + } + + /// Generates edited or extended images based on an original image and a prompt. + /// + /// The image stream to edit. Must be a valid PNG file, less than 4MB, and square. The image must have transparency, which + /// will be used as the mask. + /// + /// + /// The filename associated with the image stream. The filename's extension (for example: .png) will be used to + /// validate the format of the input image. 
The request may fail if the filename's extension and the actual + /// format of the input image do not match. + /// + /// A text description of the desired image. + /// The number of edited or extended images to generate. + /// The options to configure the image edit. + /// A token that can be used to cancel this method call. + /// , , or is null. + /// or is an empty string, and was expected to be non-empty. + public virtual async Task> GenerateImageEditsAsync(Stream image, string imageFilename, string prompt, int imageCount, ImageEditOptions options = null, CancellationToken cancellationToken = default) + { + Argument.AssertNotNull(image, nameof(image)); + Argument.AssertNotNullOrEmpty(imageFilename, nameof(imageFilename)); + Argument.AssertNotNullOrEmpty(prompt, nameof(prompt)); + + options ??= new(); + CreateImageEditOptions(image, imageFilename, prompt, null, null, imageCount, ref options); + + using MultipartFormDataBinaryContent content = options.ToMultipartContent(image, imageFilename, null, null); + ClientResult result = await GenerateImageEditsAsync(content, content.ContentType, cancellationToken.ToRequestOptions()).ConfigureAwait(false); + return ClientResult.FromValue(GeneratedImageCollection.FromResponse(result.GetRawResponse()), result.GetRawResponse()); + } + + /// Generates edited or extended images based on an original image and a prompt. + /// + /// The image stream to edit. Must be a valid PNG file, less than 4MB, and square. The image must have transparency, which + /// will be used as the mask. + /// + /// + /// The filename associated with the image stream. The filename's extension (for example: .png) will be used to + /// validate the format of the input image. The request may fail if the filename's extension and the actual + /// format of the input image do not match. + /// + /// A text description of the desired image. + /// The number of edited or extended images to generate. + /// The options to configure the image edit. 
+ /// A token that can be used to cancel this method call. + /// , , or is null. + /// or is an empty string, and was expected to be non-empty. + public virtual ClientResult GenerateImageEdits(Stream image, string imageFilename, string prompt, int imageCount, ImageEditOptions options = null, CancellationToken cancellationToken = default) + { + Argument.AssertNotNull(image, nameof(image)); + Argument.AssertNotNullOrEmpty(imageFilename, nameof(imageFilename)); + Argument.AssertNotNullOrEmpty(prompt, nameof(prompt)); + + options ??= new(); + CreateImageEditOptions(image, imageFilename, prompt, null, null, imageCount, ref options); + + using MultipartFormDataBinaryContent content = options.ToMultipartContent(image, imageFilename, null, null); + ClientResult result = GenerateImageEdits(content, content.ContentType, cancellationToken.ToRequestOptions()); + return ClientResult.FromValue(GeneratedImageCollection.FromResponse(result.GetRawResponse()), result.GetRawResponse()); + } + + /// Generates edited or extended images based on an original image and a prompt. + /// + /// The path of the image file to edit. Must be a valid PNG file, less than 4MB, and square. The image must + /// have transparency, which will be used as the mask. The provided file path's extension (for example: .png) + /// will be used to validate the format of the input image. The request may fail if the file path's extension + /// and the actual format of the input image do not match. + /// + /// A text description of the desired image. + /// The number of edited or extended images to generate. + /// The options to configure the image edit. + /// or is null. + /// or is an empty string, and was expected to be non-empty. 
+ public virtual async Task> GenerateImageEditsAsync(string imageFilePath, string prompt, int imageCount, ImageEditOptions options = null) + { + Argument.AssertNotNullOrEmpty(imageFilePath, nameof(imageFilePath)); + Argument.AssertNotNullOrEmpty(prompt, nameof(prompt)); + + using FileStream imageStream = File.OpenRead(imageFilePath); + return await GenerateImageEditsAsync(imageStream, imageFilePath, prompt, imageCount, options).ConfigureAwait(false); + } + + /// Generates edited or extended images based on an original image and a prompt. + /// + /// The path of the image file to edit. Must be a valid PNG file, less than 4MB, and square. The image must + /// have transparency, which will be used as the mask. The provided file path's extension (for example: .png) + /// will be used to validate the format of the input image. The request may fail if the file path's extension + /// and the actual format of the input image do not match. + /// + /// A text description of the desired image. + /// The number of edited or extended images to generate. + /// The options to configure the image edit. + /// or is null. + /// or is an empty string, and was expected to be non-empty. + public virtual ClientResult GenerateImageEdits(string imageFilePath, string prompt, int imageCount, ImageEditOptions options = null) + { + Argument.AssertNotNullOrEmpty(imageFilePath, nameof(imageFilePath)); + Argument.AssertNotNullOrEmpty(prompt, nameof(prompt)); + + using FileStream imageStream = File.OpenRead(imageFilePath); + return GenerateImageEdits(imageStream, imageFilePath, prompt, imageCount, options); + } + + /// Generates edited or extended images based on an original image, a prompt, and a mask. + /// The image stream to edit. Must be a valid PNG file, less than 4MB, and square. + /// + /// The filename associated with the image stream. The filename's extension (for example: .png) will be used to + /// validate the format of the input image. 
The request may fail if the filename's extension and the actual + /// format of the input image do not match. + /// + /// A text description of the desired image. + /// + /// An additional image whose fully transparent areas (i.e., where alpha is zero) indicate where the original image + /// should be edited. Must be a valid PNG file, less than 4MB, and have the same dimensions as image. + /// + /// + /// The filename associated with the mask image stream. The filename's extension (for example: .png) will be + /// used to validate the format of the mask image. The request may fail if the filename's extension and the + /// actual format of the mask image do not match. + /// + /// The number of edited or extended images to generate. + /// The options to configure the image edit. + /// A token that can be used to cancel this method call. + /// , , , , or is null. + /// , , or is an empty string, and was expected to be non-empty. + public virtual async Task> GenerateImageEditsAsync(Stream image, string imageFilename, string prompt, Stream mask, string maskFilename, int imageCount, ImageEditOptions options = null, CancellationToken cancellationToken = default) + { + Argument.AssertNotNull(image, nameof(image)); + Argument.AssertNotNullOrEmpty(imageFilename, nameof(imageFilename)); + Argument.AssertNotNullOrEmpty(prompt, nameof(prompt)); + Argument.AssertNotNull(mask, nameof(mask)); + Argument.AssertNotNullOrEmpty(maskFilename, nameof(maskFilename)); + + options ??= new(); + CreateImageEditOptions(image, imageFilename, prompt, mask, maskFilename, imageCount, ref options); + + using MultipartFormDataBinaryContent content = options.ToMultipartContent(image, imageFilename, mask, maskFilename); + ClientResult result = await GenerateImageEditsAsync(content, content.ContentType, cancellationToken.ToRequestOptions()).ConfigureAwait(false); + return ClientResult.FromValue(GeneratedImageCollection.FromResponse(result.GetRawResponse()), result.GetRawResponse()); + } + + /// 
Generates edited or extended images based on an original image, a prompt, and a mask. + /// The image stream to edit. Must be a valid PNG file, less than 4MB, and square. + /// + /// The filename associated with the image stream. The filename's extension (for example: .png) will be used to + /// validate the format of the input image. The request may fail if the filename's extension and the actual + /// format of the input image do not match. + /// + /// A text description of the desired image. + /// + /// An additional image whose fully transparent areas (i.e., where alpha is zero) indicate where the original image + /// should be edited. Must be a valid PNG file, less than 4MB, and have the same dimensions as image. + /// + /// + /// The filename associated with the mask image stream. The filename's extension (for example: .png) will be + /// used to validate the format of the mask image. The request may fail if the filename's extension and the + /// actual format of the mask image do not match. + /// + /// The number of edited or extended images to generate. + /// The options to configure the image edit. + /// A token that can be used to cancel this method call. + /// , , , , or is null. + /// , , or is an empty string, and was expected to be non-empty. 
+ public virtual ClientResult GenerateImageEdits(Stream image, string imageFilename, string prompt, Stream mask, string maskFilename, int imageCount, ImageEditOptions options = null, CancellationToken cancellationToken = default) + { + Argument.AssertNotNull(image, nameof(image)); + Argument.AssertNotNullOrEmpty(imageFilename, nameof(imageFilename)); + Argument.AssertNotNullOrEmpty(prompt, nameof(prompt)); + Argument.AssertNotNull(mask, nameof(mask)); + Argument.AssertNotNullOrEmpty(maskFilename, nameof(maskFilename)); + + options ??= new(); + CreateImageEditOptions(image, imageFilename, prompt, mask, maskFilename, imageCount, ref options); + + using MultipartFormDataBinaryContent content = options.ToMultipartContent(image, imageFilename, mask, maskFilename); + ClientResult result = GenerateImageEdits(content, content.ContentType, cancellationToken.ToRequestOptions()); + return ClientResult.FromValue(GeneratedImageCollection.FromResponse(result.GetRawResponse()), result.GetRawResponse()); + } + + /// Generates edited or extended images based on an original image, a prompt, and a mask. + /// + /// The path of the image file to edit. Must be a valid PNG file, less than 4MB, and square. The provided file + /// path's extension (for example: .png) will be used to validate the format of the input image. The request + /// may fail if the file path's extension and the actual format of the input image do not match. + /// + /// A text description of the desired image. + /// + /// The path of the mask image file whose fully transparent areas (i.e., where alpha is zero) indicate where + /// the original image should be edited. Must be a valid PNG file, less than 4MB, and have the same dimensions + /// as the original image. The provided file path's extension (for example: .png) will be used to validate the + /// format of the mask image. The request may fail if the file path's extension and the actual format of the + /// mask image do not match. 
+ /// + /// The number of edited or extended images to generate. + /// The options to configure the image edit. + /// , or is null. + /// , , or is an empty string, and was expected to be non-empty. + public virtual async Task> GenerateImageEditsAsync(string imageFilePath, string prompt, string maskFilePath, int imageCount, ImageEditOptions options = null) + { + Argument.AssertNotNullOrEmpty(imageFilePath, nameof(imageFilePath)); + Argument.AssertNotNullOrEmpty(prompt, nameof(prompt)); + Argument.AssertNotNullOrEmpty(maskFilePath, nameof(maskFilePath)); + + using FileStream imageStream = File.OpenRead(imageFilePath); + using FileStream maskStream = File.OpenRead(maskFilePath); + return await GenerateImageEditsAsync(imageStream, imageFilePath, prompt, maskStream, maskFilePath, imageCount, options).ConfigureAwait(false); + } + + /// Generates edited or extended images based on an original image, a prompt, and a mask. + /// + /// The path of the image file to edit. Must be a valid PNG file, less than 4MB, and square. The provided file + /// path's extension (for example: .png) will be used to validate the format of the input image. The request + /// may fail if the file path's extension and the actual format of the input image do not match. + /// + /// A text description of the desired image. + /// + /// The path of the mask image file whose fully transparent areas (i.e., where alpha is zero) indicate where + /// the original image should be edited. Must be a valid PNG file, less than 4MB, and have the same dimensions + /// as the original image. The provided file path's extension (for example: .png) will be used to validate the + /// format of the mask image. The request may fail if the file path's extension and the actual format of the + /// mask image do not match. + /// + /// The number of edited or extended images to generate. + /// The options to configure the image edit. + /// , or is null. + /// , , or is an empty string, and was expected to be non-empty. 
+ public virtual ClientResult GenerateImageEdits(string imageFilePath, string prompt, string maskFilePath, int imageCount, ImageEditOptions options = null) + { + Argument.AssertNotNullOrEmpty(imageFilePath, nameof(imageFilePath)); + Argument.AssertNotNullOrEmpty(prompt, nameof(prompt)); + Argument.AssertNotNullOrEmpty(maskFilePath, nameof(maskFilePath)); + + using FileStream imageStream = File.OpenRead(imageFilePath); + using FileStream maskStream = File.OpenRead(maskFilePath); + return GenerateImageEdits(imageStream, imageFilePath, prompt, maskStream, maskFilePath, imageCount, options); + } + + #endregion + + #region GenerateImageVariations + + /// Generates a variation of a given image. + /// The image stream to use as the basis for the variation. Must be a valid PNG file, less than 4MB, and square. + /// + /// The filename associated with the image stream. The filename's extension (for example: .png) will be used to + /// validate the format of the input image. The request may fail if the filename's extension and the actual + /// format of the input image do not match. + /// + /// The options to configure the image variation. + /// A token that can be used to cancel this method call. + /// or is null. + /// is an empty string, and was expected to be non-empty. 
+ public virtual async Task> GenerateImageVariationAsync(Stream image, string imageFilename, ImageVariationOptions options = null, CancellationToken cancellationToken = default) + { + Argument.AssertNotNull(image, nameof(image)); + Argument.AssertNotNullOrEmpty(imageFilename, nameof(imageFilename)); + + options ??= new(); + CreateImageVariationOptions(image, imageFilename, null, ref options); + + using MultipartFormDataBinaryContent content = options.ToMultipartContent(image, imageFilename); + ClientResult result = await GenerateImageVariationsAsync(content, content.ContentType, cancellationToken.ToRequestOptions()).ConfigureAwait(false); + return ClientResult.FromValue(GeneratedImageCollection.FromResponse(result.GetRawResponse()).FirstOrDefault(), result.GetRawResponse()); + } + + /// Generates a variation of a given image. + /// The image stream to use as the basis for the variation. Must be a valid PNG file, less than 4MB, and square. + /// + /// The filename associated with the image stream. The filename's extension (for example: .png) will be used to + /// validate the format of the input image. The request may fail if the filename's extension and the actual + /// format of the input image do not match. + /// + /// The options to configure the image variation. + /// A token that can be used to cancel this method call. + /// or is null. + /// is an empty string, and was expected to be non-empty. 
+    public virtual ClientResult<GeneratedImage> GenerateImageVariation(Stream image, string imageFilename, ImageVariationOptions options = null, CancellationToken cancellationToken = default)
+    {
+        Argument.AssertNotNull(image, nameof(image));
+        Argument.AssertNotNullOrEmpty(imageFilename, nameof(imageFilename));
+
+        options ??= new();
+        CreateImageVariationOptions(image, imageFilename, null, ref options);
+
+        using MultipartFormDataBinaryContent content = options.ToMultipartContent(image, imageFilename);
+        ClientResult result = GenerateImageVariations(content, content.ContentType, cancellationToken.ToRequestOptions());
+        return ClientResult.FromValue(GeneratedImageCollection.FromResponse(result.GetRawResponse()).FirstOrDefault(), result.GetRawResponse());
+    }
+
+    /// <summary> Generates a variation of a given image. </summary>
+    /// <param name="imageFilePath">
+    /// The path of the image file to use as the basis for the variation. Must be a valid PNG file, less than 4MB,
+    /// and square. The provided file path's extension (for example: .png) will be used to validate the format of
+    /// the input image. The request may fail if the file path's extension and the actual format of the input image
+    /// do not match.
+    /// </param>
+    /// <param name="options"> The options to configure the image variation. </param>
+    /// <exception cref="ArgumentNullException"> <paramref name="imageFilePath"/> is null. </exception>
+    /// <exception cref="ArgumentException"> <paramref name="imageFilePath"/> is an empty string, and was expected to be non-empty. </exception>
+    public virtual async Task<ClientResult<GeneratedImage>> GenerateImageVariationAsync(string imageFilePath, ImageVariationOptions options = null)
+    {
+        Argument.AssertNotNullOrEmpty(imageFilePath, nameof(imageFilePath));
+
+        using FileStream imageStream = File.OpenRead(imageFilePath);
+        return await GenerateImageVariationAsync(imageStream, imageFilePath, options).ConfigureAwait(false);
+    }
+
+    /// <summary> Generates a variation of a given image. </summary>
+    ///
+    /// The path of the image file to use as the basis for the variation. Must be a valid PNG file, less than 4MB,
+    /// and square. The provided file path's extension (for example: .png) will be used to validate the format of
+    /// the input image.
The request may fail if the file path's extension and the actual format of the input image
+    /// do not match.
+    ///
+    /// <param name="options"> The options to configure the image variation. </param>
+    /// <exception cref="ArgumentNullException"> <paramref name="imageFilePath"/> is null. </exception>
+    /// <exception cref="ArgumentException"> <paramref name="imageFilePath"/> is an empty string, and was expected to be non-empty. </exception>
+    public virtual ClientResult<GeneratedImage> GenerateImageVariation(string imageFilePath, ImageVariationOptions options = null)
+    {
+        Argument.AssertNotNullOrEmpty(imageFilePath, nameof(imageFilePath));
+
+        using FileStream imageStream = File.OpenRead(imageFilePath);
+        return GenerateImageVariation(imageStream, imageFilePath, options);
+    }
+
+    /// <summary> Generates variations of a given image. </summary>
+    /// <param name="image"> The image stream to use as the basis for the variation. Must be a valid PNG file, less than 4MB, and square. </param>
+    /// <param name="imageFilename">
+    /// The filename associated with the image stream. The filename's extension (for example: .png) will be used to
+    /// validate the format of the input image. The request may fail if the filename's extension and the actual
+    /// format of the input image do not match.
+    /// </param>
+    /// <param name="imageCount"> The number of image variations to generate. </param>
+    /// <param name="options"> The options to configure the image variation. </param>
+    /// <param name="cancellationToken"> A token that can be used to cancel this method call. </param>
+    /// <exception cref="ArgumentNullException"> <paramref name="image"/> or <paramref name="imageFilename"/> is null. </exception>
+    /// <exception cref="ArgumentException"> <paramref name="imageFilename"/> is an empty string, and was expected to be non-empty. </exception>
+    public virtual async Task<ClientResult<GeneratedImageCollection>> GenerateImageVariationsAsync(Stream image, string imageFilename, int imageCount, ImageVariationOptions options = null, CancellationToken cancellationToken = default)
+    {
+        Argument.AssertNotNull(image, nameof(image));
+        Argument.AssertNotNullOrEmpty(imageFilename, nameof(imageFilename));
+
+        options ??= new();
+        CreateImageVariationOptions(image, imageFilename, imageCount, ref options);
+
+        using MultipartFormDataBinaryContent content = options.ToMultipartContent(image, imageFilename);
+        ClientResult result = await GenerateImageVariationsAsync(content, content.ContentType, cancellationToken.ToRequestOptions()).ConfigureAwait(false);
+        return ClientResult.FromValue(GeneratedImageCollection.FromResponse(result.GetRawResponse()), result.GetRawResponse());
+    }
+
+    /// <summary> Generates variations of a given image. </summary>
+    /// <param name="image"> The image stream to use as the basis for the variation. Must be a valid PNG file, less than 4MB, and square. </param>
+    /// <param name="imageFilename">
+    /// The filename associated with the image stream. The filename's extension (for example: .png) will be used to
+    /// validate the format of the input image. The request may fail if the filename's extension and the actual
+    /// format of the input image do not match.
+    /// </param>
+    /// <param name="imageCount"> The number of image variations to generate. </param>
+    /// <param name="options"> The options to configure the image variation. </param>
+    /// <param name="cancellationToken"> A token that can be used to cancel this method call. </param>
+    /// <exception cref="ArgumentNullException"> <paramref name="image"/> or <paramref name="imageFilename"/> is null. </exception>
+    /// <exception cref="ArgumentException"> <paramref name="imageFilename"/> is an empty string, and was expected to be non-empty. </exception>
+    public virtual ClientResult<GeneratedImageCollection> GenerateImageVariations(Stream image, string imageFilename, int imageCount, ImageVariationOptions options = null, CancellationToken cancellationToken = default)
+    {
+        Argument.AssertNotNull(image, nameof(image));
+        Argument.AssertNotNullOrEmpty(imageFilename, nameof(imageFilename));
+
+        options ??= new();
+        CreateImageVariationOptions(image, imageFilename, imageCount, ref options);
+
+        using MultipartFormDataBinaryContent content = options.ToMultipartContent(image, imageFilename);
+        ClientResult result = GenerateImageVariations(content, content.ContentType, cancellationToken.ToRequestOptions());
+        return ClientResult.FromValue(GeneratedImageCollection.FromResponse(result.GetRawResponse()), result.GetRawResponse());
+    }
+
+    /// <summary> Generates variations of a given image. </summary>
+    /// <param name="imageFilePath">
+    /// The path of the image file to use as the basis for the variation. Must be a valid PNG file, less than 4MB,
+    /// and square. The provided file path's extension (for example: .png) will be used to validate the format of
+    /// the input image. The request may fail if the file path's extension and the actual format of the input image
+    /// do not match.
+    /// </param>
+    /// <param name="imageCount"> The number of image variations to generate. </param>
+    /// <param name="options"> The options to configure the image variation. </param>
+    /// <exception cref="ArgumentNullException"> <paramref name="imageFilePath"/> was null. </exception>
+    /// <exception cref="ArgumentException"> <paramref name="imageFilePath"/> is an empty string, and was expected to be non-empty. </exception>
+    public virtual async Task<ClientResult<GeneratedImageCollection>> GenerateImageVariationsAsync(string imageFilePath, int imageCount, ImageVariationOptions options = null)
+    {
+        Argument.AssertNotNullOrEmpty(imageFilePath, nameof(imageFilePath));
+
+        using FileStream imageStream = File.OpenRead(imageFilePath);
+        return await GenerateImageVariationsAsync(imageStream, imageFilePath, imageCount, options).ConfigureAwait(false);
+    }
+
+    /// <summary> Generates variations of a given image. </summary>
+    ///
+    /// The path of the image file to use as the basis for the variation. Must be a valid PNG file, less than 4MB,
+    /// and square.
The provided file path's extension (for example: .png) will be used to validate the format of
+    /// the input image. The request may fail if the file path's extension and the actual format of the input image
+    /// do not match.
+    ///
+    /// <param name="imageCount"> The number of image variations to generate. </param>
+    /// <param name="options"> The options to configure the image variation. </param>
+    /// <exception cref="ArgumentNullException"> <paramref name="imageFilePath"/> was null. </exception>
+    /// <exception cref="ArgumentException"> <paramref name="imageFilePath"/> is an empty string, and was expected to be non-empty. </exception>
+    public virtual ClientResult<GeneratedImageCollection> GenerateImageVariations(string imageFilePath, int imageCount, ImageVariationOptions options = null)
+    {
+        Argument.AssertNotNullOrEmpty(imageFilePath, nameof(imageFilePath));
+
+        using FileStream imageStream = File.OpenRead(imageFilePath);
+        return GenerateImageVariations(imageStream, imageFilePath, imageCount, options);
+    }
+
+    #endregion
+
+    private void CreateImageGenerationOptions(string prompt, int? imageCount, ref ImageGenerationOptions options)
+    {
+        options.Prompt = prompt;
+        options.N = imageCount;
+        options.Model = _model;
+    }
+
+    private void CreateImageEditOptions(Stream image, string imageFilename, string prompt, Stream mask, string maskFilename, int? imageCount, ref ImageEditOptions options)
+    {
+        options.Prompt = prompt;
+        options.N = imageCount;
+        options.Model = _model;
+    }
+
+    private void CreateImageVariationOptions(Stream image, string imageFilename, int? imageCount, ref ImageVariationOptions options)
+    {
+        options.N = imageCount;
+        options.Model = _model;
+    }
+}
diff --git a/.dotnet/src/Custom/Images/ImageEditOptions.cs b/.dotnet/src/Custom/Images/ImageEditOptions.cs
new file mode 100644
index 000000000..b8681a3de
--- /dev/null
+++ b/.dotnet/src/Custom/Images/ImageEditOptions.cs
@@ -0,0 +1,136 @@
+using System;
+using System.IO;
+
+namespace OpenAI.Images;
+
+///
+/// Represents additional options available to control the behavior of an image edit operation.
+/// +[CodeGenModel("CreateImageEditRequest")] +[CodeGenSuppress("ImageEditOptions", typeof(BinaryData), typeof(string))] +public partial class ImageEditOptions +{ + // CUSTOM: Made internal. The model is specified by the client. + /// The model to use for image generation. Only `dall-e-2` is supported at this time. + internal InternalCreateImageEditRequestModel? Model { get; set; } + + // CUSTOM: + // - Made internal. This value comes from a parameter on the client method. + // - Added setter. + /// + /// The image to edit. Must be a valid PNG file, less than 4MB, and square. If mask is not + /// provided, image must have transparency, which will be used as the mask. + /// + /// To assign a byte[] to this property use . + /// The byte[] will be serialized to a Base64 encoded string. + /// + /// + /// Examples: + /// + /// + /// BinaryData.FromBytes(new byte[] { 1, 2, 3 }) + /// Creates a payload of "AQID". + /// + /// + /// + /// + internal BinaryData Image { get; set; } + + // CUSTOM: + // - Made internal. This value comes from a parameter on the client method. + // - Added setter. + /// A text description of the desired image(s). The maximum length is 1000 characters. + internal string Prompt { get; set; } + + // CUSTOM: Made internal. This value comes from a parameter on the client method. + /// + /// An additional image whose fully transparent areas (e.g. where alpha is zero) indicate where + /// `image` should be edited. Must be a valid PNG file, less than 4MB, and have the same dimensions + /// as `image`. + /// + /// To assign a byte[] to this property use . + /// The byte[] will be serialized to a Base64 encoded string. + /// + /// + /// Examples: + /// + /// + /// BinaryData.FromBytes(new byte[] { 1, 2, 3 }) + /// Creates a payload of "AQID". + /// + /// + /// + /// + internal BinaryData Mask { get; set; } + + // CUSTOM: Made internal. This value comes from a parameter on the client method. + /// The number of images to generate. Must be between 1 and 10. 
+    internal long? N { get; set; }
+
+    // CUSTOM: Made public now that there are no required properties.
+    /// <summary> Initializes a new instance of <see cref="ImageEditOptions"/>. </summary>
+    public ImageEditOptions()
+    {
+    }
+
+    // CUSTOM: Changed property type.
+    /// <summary> The size of the generated images. Must be one of `256x256`, `512x512`, or `1024x1024`. </summary>
+    [CodeGenMember("Size")]
+    public GeneratedImageSize? Size { get; set; }
+
+    // CUSTOM: Changed property type.
+    /// <summary> The format in which the generated images are returned. Must be one of `url` or `b64_json`. </summary>
+    [CodeGenMember("ResponseFormat")]
+    public GeneratedImageFormat? ResponseFormat { get; set; }
+
+    // CUSTOM: Renamed.
+    /// <summary>
+    /// A unique identifier representing your end-user, which can help OpenAI to monitor and detect abuse.
+    /// Learn more.
+    /// </summary>
+    [CodeGenMember("User")]
+    public string EndUserId { get; set; }
+
+    internal MultipartFormDataBinaryContent ToMultipartContent(Stream image, string imageFilename, Stream mask, string maskFilename)
+    {
+        MultipartFormDataBinaryContent content = new();
+
+        content.Add(image, "image", imageFilename);
+        content.Add(Prompt, "prompt");
+        content.Add(Model.Value.ToString(), "model");
+
+        if (mask is not null)
+        {
+            content.Add(mask, "mask", maskFilename);
+        }
+
+        if (N is not null)
+        {
+            content.Add(N.Value, "n");
+        }
+
+        if (ResponseFormat is not null)
+        {
+            string format = ResponseFormat switch
+            {
+                GeneratedImageFormat.Uri => "url",
+                GeneratedImageFormat.Bytes => "b64_json",
+                _ => throw new ArgumentException($"Unsupported {nameof(ResponseFormat)} value."),
+            };
+
+            content.Add(format, "response_format");
+        }
+
+        if (Size is not null)
+        {
+            content.Add(Size.ToString(), "size");
+        }
+
+        if (EndUserId is not null)
+        {
+            content.Add(EndUserId, "user");
+        }
+
+        return content;
+    }
+}
\ No newline at end of file
diff --git a/.dotnet/src/Custom/Images/ImageGenerationOptions.cs b/.dotnet/src/Custom/Images/ImageGenerationOptions.cs
new file mode 100644
index 000000000..688e7e701
--- /dev/null
+++
b/.dotnet/src/Custom/Images/ImageGenerationOptions.cs @@ -0,0 +1,43 @@ +namespace OpenAI.Images; + +/// +/// Represents additional options available to control the behavior of an image generation operation. +/// +[CodeGenModel("CreateImageRequest")] +[CodeGenSuppress("ImageGenerationOptions", typeof(string))] +public partial class ImageGenerationOptions +{ + // CUSTOM: Made internal. The model is specified by the client. + /// The model to use for image generation. + internal InternalCreateImageRequestModel? Model { get; set; } + + // CUSTOM: + // - Made internal. This value comes from a parameter on the client method. + // - Added setter. + /// + /// A text description of the desired image(s). The maximum length is 1000 characters for + /// `dall-e-2` and 4000 characters for `dall-e-3`. + /// + internal string Prompt { get; set; } + + // CUSTOM: Made internal. This value comes from a parameter on the client method. + /// + /// The number of images to generate. Must be between 1 and 10. For `dall-e-3`, only `n=1` is + /// supported. + /// + internal long? N { get; set; } + + // CUSTOM: Made public now that there are no required properties. + /// Initializes a new instance of . + public ImageGenerationOptions() + { + } + + // CUSTOM: Renamed. + /// + /// A unique identifier representing your end-user, which can help OpenAI to monitor and detect abuse. + /// Learn more. + /// + [CodeGenMember("User")] + public string EndUserId { get; set; } +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Images/ImageVariationOptions.cs b/.dotnet/src/Custom/Images/ImageVariationOptions.cs new file mode 100644 index 000000000..0f71b265b --- /dev/null +++ b/.dotnet/src/Custom/Images/ImageVariationOptions.cs @@ -0,0 +1,103 @@ +using System; +using System.IO; + +namespace OpenAI.Images; + +/// +/// Represents additional options available to control the behavior of an image generation operation. 
+/// +[CodeGenModel("CreateImageVariationRequest")] +[CodeGenSuppress("ImageVariationOptions", typeof(BinaryData))] +public partial class ImageVariationOptions +{ + // CUSTOM: Made internal. The model is specified by the client. + /// The model to use for image generation. Only `dall-e-2` is supported at this time. + internal InternalCreateImageVariationRequestModel? Model { get; set; } + + // CUSTOM: + // - Made internal. This value comes from a parameter on the client method. + // - Added setter. + /// + /// The image to use as the basis for the variation(s). Must be a valid PNG file, less than 4MB, + /// and square. + /// + /// To assign a byte[] to this property use . + /// The byte[] will be serialized to a Base64 encoded string. + /// + /// + /// Examples: + /// + /// + /// BinaryData.FromBytes(new byte[] { 1, 2, 3 }) + /// Creates a payload of "AQID". + /// + /// + /// + /// + internal BinaryData Image { get; set; } + + // CUSTOM: Made internal. This value comes from a parameter on the client method. + /// The number of images to generate. Must be between 1 and 10. + internal long? N { get; set; } + + // CUSTOM: Made public now that there are no required properties. + /// Initializes a new instance of . + public ImageVariationOptions() + { + } + + // CUSTOM: Changed property type. + /// The size of the generated images. Must be one of `256x256`, `512x512`, or `1024x1024`. + [CodeGenMember("Size")] + public GeneratedImageSize? Size { get; set; } + + // CUSTOM: Changed property type. + /// The format in which the generated images are returned. Must be one of `url` or `b64_json`. + [CodeGenMember("ResponseFormat")] + public GeneratedImageFormat? ResponseFormat { get; set; } + + // CUSTOM: Renamed. + /// + /// A unique identifier representing your end-user, which can help OpenAI to monitor and detect abuse. + /// Learn more. 
+    ///
+    [CodeGenMember("User")]
+    public string EndUserId { get; set; }
+
+    internal MultipartFormDataBinaryContent ToMultipartContent(Stream image, string imageFilename)
+    {
+        MultipartFormDataBinaryContent content = new();
+
+        content.Add(image, "image", imageFilename);
+        content.Add(Model.Value.ToString(), "model");
+
+        if (N is not null)
+        {
+            content.Add(N.Value, "n");
+        }
+
+        if (ResponseFormat is not null)
+        {
+            string format = ResponseFormat switch
+            {
+                GeneratedImageFormat.Uri => "url",
+                GeneratedImageFormat.Bytes => "b64_json",
+                _ => throw new ArgumentException($"Unsupported {nameof(ResponseFormat)} value."),
+            };
+
+            content.Add(format, "response_format");
+        }
+
+        if (Size is not null)
+        {
+            content.Add(Size.ToString(), "size");
+        }
+
+        if (EndUserId is not null)
+        {
+            content.Add(EndUserId, "user");
+        }
+
+        return content;
+    }
+}
\ No newline at end of file
diff --git a/.dotnet/src/Custom/Images/Internal/GeneratorStubs.cs b/.dotnet/src/Custom/Images/Internal/GeneratorStubs.cs
new file mode 100644
index 000000000..3e85bb899
--- /dev/null
+++ b/.dotnet/src/Custom/Images/Internal/GeneratorStubs.cs
@@ -0,0 +1,24 @@
+namespace OpenAI.Images;
+
+// CUSTOM: Made internal.
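[Editorial note] As a point of reference for the `ToMultipartContent` helpers above: the fields they emit (`image`, `model`, `n`, `response_format`, `size`, `user`) mirror the REST API's multipart form. The following standalone sketch illustrates that wire shape using the public `System.Net.Http.MultipartFormDataContent` type instead of the library-internal `MultipartFormDataBinaryContent`; the byte values and filename are placeholders, not part of this diff.

```csharp
using System;
using System.Net.Http;

class MultipartShapeDemo
{
    static void Main()
    {
        // Placeholder payload: PNG magic bytes only, not a real image.
        byte[] imageBytes = { 0x89, 0x50, 0x4E, 0x47 };

        using var content = new MultipartFormDataContent();

        // Same field names the library's ToMultipartContent emits.
        content.Add(new ByteArrayContent(imageBytes), "image", "input.png");
        content.Add(new StringContent("dall-e-2"), "model");
        content.Add(new StringContent("2"), "n");
        content.Add(new StringContent("url"), "response_format");   // GeneratedImageFormat.Uri
        content.Add(new StringContent("1024x1024"), "size");

        // The boundary-bearing content type is what the client passes alongside the body.
        Console.WriteLine(content.Headers.ContentType);
    }
}
```

Optional fields (`n`, `response_format`, `size`, `user`) are only added when the corresponding option is non-null, which is why the library guards each with a null check.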
+ +[CodeGenModel("CreateImageEditRequestModel")] +internal readonly partial struct InternalCreateImageEditRequestModel { } + +[CodeGenModel("CreateImageEditRequestResponseFormat")] +internal readonly partial struct InternalCreateImageEditRequestResponseFormat { } + +[CodeGenModel("CreateImageEditRequestSize")] +internal readonly partial struct InternalCreateImageEditRequestSize { } + +[CodeGenModel("CreateImageRequestModel")] +internal readonly partial struct InternalCreateImageRequestModel { } + +[CodeGenModel("CreateImageVariationRequestModel")] +internal readonly partial struct InternalCreateImageVariationRequestModel { } + +[CodeGenModel("CreateImageVariationRequestResponseFormat")] +internal readonly partial struct InternalCreateImageVariationRequestResponseFormat { } + +[CodeGenModel("CreateImageVariationRequestSize")] +internal readonly partial struct InternalCreateImageVariationRequestSize { } diff --git a/.dotnet/src/Custom/Images/OpenAIImagesModelFactory.cs b/.dotnet/src/Custom/Images/OpenAIImagesModelFactory.cs new file mode 100644 index 000000000..c318ff416 --- /dev/null +++ b/.dotnet/src/Custom/Images/OpenAIImagesModelFactory.cs @@ -0,0 +1,32 @@ +using System; +using System.Collections.Generic; +using System.Linq; + +namespace OpenAI.Images; + +/// Model factory for models. +public static partial class OpenAIImagesModelFactory +{ + /// Initializes a new instance of . + /// A new instance for mocking. + public static GeneratedImage GeneratedImage(BinaryData imageBytes = null, Uri imageUri = null, string revisedPrompt = null) + { + return new GeneratedImage( + imageBytes, + imageUri, + revisedPrompt, + serializedAdditionalRawData: null); + } + + /// Initializes a new instance of . + /// A new instance for mocking. 
+    public static GeneratedImageCollection GeneratedImageCollection(DateTimeOffset createdAt = default, IEnumerable<GeneratedImage> items = null)
+    {
+        items ??= new List<GeneratedImage>();
+
+        return new GeneratedImageCollection(
+            createdAt,
+            items.ToList(),
+            serializedAdditionalRawData: null);
+    }
+}
diff --git a/.dotnet/src/Custom/Internal/CancellationTokenExtensions.cs b/.dotnet/src/Custom/Internal/CancellationTokenExtensions.cs
new file mode 100644
index 000000000..d0a5bc81f
--- /dev/null
+++ b/.dotnet/src/Custom/Internal/CancellationTokenExtensions.cs
@@ -0,0 +1,24 @@
+using System.ClientModel.Primitives;
+using System.Threading;
+
+namespace OpenAI;
+
+internal static class CancellationTokenExtensions
+{
+    public static RequestOptions ToRequestOptions(this CancellationToken cancellationToken, bool streaming = false)
+    {
+        if (cancellationToken == default)
+        {
+            if (!streaming) return null;
+            return StreamRequestOptions;
+        }
+
+        return new RequestOptions()
+        {
+            CancellationToken = cancellationToken,
+            BufferResponse = !streaming,
+        };
+    }
+
+    private static RequestOptions StreamRequestOptions => _streamRequestOptions ??= new() { BufferResponse = false };
+    private static RequestOptions _streamRequestOptions;
+}
diff --git a/.dotnet/src/Custom/Internal/ClientPipelineExtensions.cs b/.dotnet/src/Custom/Internal/ClientPipelineExtensions.cs
new file mode 100644
index 000000000..0b3cca3f4
--- /dev/null
+++ b/.dotnet/src/Custom/Internal/ClientPipelineExtensions.cs
@@ -0,0 +1,70 @@
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Threading.Tasks;
+
+namespace OpenAI;
+
+internal static partial class ClientPipelineExtensions
+{
+    // CUSTOM:
+    // - Supplemented exception body with deserialized OpenAI error details
+
+    public static async ValueTask<PipelineResponse> ProcessMessageAsync(
+        this ClientPipeline pipeline,
+        PipelineMessage message,
+        RequestOptions options)
+    {
+        await pipeline.SendAsync(message).ConfigureAwait(false);
+
+        if (message.Response.IsError && (options?.ErrorOptions & ClientErrorBehaviors.NoThrow) != ClientErrorBehaviors.NoThrow)
+        {
+            throw await TryBufferResponseAndCreateErrorAsync(message).ConfigureAwait(false) switch
+            {
+                string errorMessage when !string.IsNullOrEmpty(errorMessage)
+                    => new ClientResultException(errorMessage, message.Response),
+                _ => new ClientResultException(message.Response),
+            };
+        }
+
+        return message.BufferResponse ?
+            message.Response :
+            message.ExtractResponse();
+    }
+
+    public static PipelineResponse ProcessMessage(
+        this ClientPipeline pipeline,
+        PipelineMessage message,
+        RequestOptions options)
+    {
+        pipeline.Send(message);
+
+        if (message.Response.IsError && (options?.ErrorOptions & ClientErrorBehaviors.NoThrow) != ClientErrorBehaviors.NoThrow)
+        {
+            throw TryBufferResponseAndCreateError(message) switch
+            {
+                string errorMessage when !string.IsNullOrEmpty(errorMessage)
+                    => new ClientResultException(errorMessage, message.Response),
+                _ => new ClientResultException(message.Response),
+            };
+        }
+
+        return message.BufferResponse ?
+            message.Response :
+            message.ExtractResponse();
+    }
+
+    private static string TryBufferResponseAndCreateError(PipelineMessage message)
+    {
+        message.Response.BufferContent();
+        return TryCreateErrorMessageFromResponse(message.Response);
+    }
+
+    private static async Task<string> TryBufferResponseAndCreateErrorAsync(PipelineMessage message)
+    {
+        await message.Response.BufferContentAsync().ConfigureAwait(false);
+        return TryCreateErrorMessageFromResponse(message.Response);
+    }
+
+    private static string TryCreateErrorMessageFromResponse(PipelineResponse response)
+        => Internal.OpenAIError.TryCreateFromResponse(response)?.ToExceptionMessage(response.Status);
+}
diff --git a/.dotnet/src/Custom/Internal/IInternalListResponseOfT.cs b/.dotnet/src/Custom/Internal/IInternalListResponseOfT.cs
new file mode 100644
index 000000000..bc02a852a
--- /dev/null
+++ b/.dotnet/src/Custom/Internal/IInternalListResponseOfT.cs
@@ -0,0 +1,11 @@
+using System.Collections.Generic;
+
+namespace OpenAI;
+
+internal interface IInternalListResponse<T>
+{
+    IReadOnlyList<T> Data { get; }
+    string FirstId { get; }
+    string LastId { get; }
+    bool HasMore { get; }
+}
\ No newline at end of file
diff --git a/.dotnet/src/Custom/Internal/InternalFunctionDefinition.cs b/.dotnet/src/Custom/Internal/InternalFunctionDefinition.cs
new file mode 100644
index 000000000..0909a1c32
--- /dev/null
+++ b/.dotnet/src/Custom/Internal/InternalFunctionDefinition.cs
@@ -0,0 +1,13 @@
+using System;
+
+namespace OpenAI;
+
+[CodeGenModel("FunctionObject")]
+internal partial class InternalFunctionDefinition
+{
+    ///
+    /// The parameters to the function, formatted as a JSON Schema object.
+ /// + [CodeGenMember("Parameters")] + internal BinaryData Parameters; +} diff --git a/.dotnet/src/Custom/Internal/TelemetryDetails.cs b/.dotnet/src/Custom/Internal/TelemetryDetails.cs new file mode 100644 index 000000000..45f33ff4d --- /dev/null +++ b/.dotnet/src/Custom/Internal/TelemetryDetails.cs @@ -0,0 +1,150 @@ +using System; +using System.Net.Http.Headers; +using System.Reflection; +using System.Runtime.InteropServices; +using System.Text; + +#nullable enable + +namespace OpenAI; + +/// +/// Details about the package to be included in UserAgent telemetry +/// +internal class TelemetryDetails +{ + private const int MaxApplicationIdLength = 24; + private readonly string _userAgent; + + /// + /// The package type represented by this instance. + /// + public Assembly Assembly { get; } + + /// + /// The value of the applicationId used to initialize this instance. + /// + public string? ApplicationId { get; } + + /// + /// Initialize an instance of by extracting the name and version information from the associated with the . + /// + /// The used to generate the package name and version information for the value. + /// An optional value to be prepended to the . + internal TelemetryDetails(Assembly assembly, string? applicationId = null) + : this(assembly, applicationId, new RuntimeInformationWrapper()) + { } + + internal TelemetryDetails(Assembly assembly, string? applicationId = null, RuntimeInformationWrapper? runtimeInformation = default) + { + Argument.AssertNotNull(assembly, nameof(assembly)); + if (applicationId?.Length > MaxApplicationIdLength) + { + throw new ArgumentOutOfRangeException(nameof(applicationId), $"{nameof(applicationId)} must be shorter than {MaxApplicationIdLength + 1} characters"); + } + + Assembly = assembly; + ApplicationId = applicationId; + _userAgent = GenerateUserAgentString(assembly, applicationId, runtimeInformation); + } + + internal static string GenerateUserAgentString(Assembly clientAssembly, string? 
applicationId = null, RuntimeInformationWrapper? runtimeInformation = default)
+    {
+        AssemblyInformationalVersionAttribute? versionAttribute
+            = clientAssembly.GetCustomAttribute<AssemblyInformationalVersionAttribute>()
+            ?? throw new InvalidOperationException(
+                $"{nameof(AssemblyInformationalVersionAttribute)} is required on client SDK assembly '{clientAssembly.FullName}'.");
+
+        string version = versionAttribute.InformationalVersion;
+
+        string assemblyName = clientAssembly.GetName().Name!;
+
+        int hashSeparator = version.LastIndexOf('+');
+        if (hashSeparator != -1)
+        {
+            version = version.Substring(0, hashSeparator);
+        }
+        runtimeInformation ??= new RuntimeInformationWrapper();
+        var platformInformation = EscapeProductInformation($"({runtimeInformation.FrameworkDescription}; {runtimeInformation.OSDescription})");
+
+        return applicationId != null
+            ? $"{applicationId} {assemblyName}/{version} {platformInformation}"
+            : $"{assemblyName}/{version} {platformInformation}";
+    }
+
+    /// <summary>
+    /// The properly formatted UserAgent string based on this instance.
+    /// </summary>
+    public override string ToString() => _userAgent;
+
+    /// <summary>
+    /// If the ProductInformation is not in the proper format, this escapes any ')' , '(' or '\' characters per https://www.rfc-editor.org/rfc/rfc7230#section-3.2.6
+    /// </summary>
+    /// <param name="productInfo">The ProductInfo portion of the UserAgent</param>
+    /// <returns></returns>
+    private static string EscapeProductInformation(string productInfo)
+    {
+        // If the string is already valid, we don't need to escape anything
+        bool success = false;
+        try
+        {
+            success = ProductInfoHeaderValue.TryParse(productInfo, out var _);
+        }
+        catch (Exception)
+        {
+            // Invalid values can throw in Framework due to https://github.com/dotnet/runtime/issues/28558
+            // Treat this as a failure to parse.
+ } + if (success) + { + return productInfo; + } + + var sb = new StringBuilder(productInfo.Length + 2); + sb.Append('('); + // exclude the first and last characters, which are the enclosing parentheses + for (int i = 1; i < productInfo.Length - 1; i++) + { + char c = productInfo[i]; + if (c == ')' || c == '(') + { + sb.Append('\\'); + } + // If we see a \, we don't need to escape it if it's followed by a '\', '(', or ')', because it is already escaped. + else if (c == '\\') + { + if (i + 1 < (productInfo.Length - 1)) + { + char next = productInfo[i + 1]; + if (next == '\\' || next == '(' || next == ')') + { + sb.Append(c); + sb.Append(next); + i++; + continue; + } + else + { + sb.Append('\\'); + } + } + else + { + sb.Append('\\'); + } + } + sb.Append(c); + } + sb.Append(')'); + return sb.ToString(); + } + + internal class RuntimeInformationWrapper + { + public virtual string FrameworkDescription => RuntimeInformation.FrameworkDescription; + public virtual string OSDescription => RuntimeInformation.OSDescription; + public virtual Architecture OSArchitecture => RuntimeInformation.OSArchitecture; + public virtual Architecture ProcessArchitecture => RuntimeInformation.ProcessArchitecture; + public virtual bool IsOSPlatform(OSPlatform osPlatform) => RuntimeInformation.IsOSPlatform(osPlatform); + } +} \ No newline at end of file diff --git a/.dotnet/src/Custom/LegacyCompletions/Internal/GeneratorStubs.cs b/.dotnet/src/Custom/LegacyCompletions/Internal/GeneratorStubs.cs new file mode 100644 index 000000000..b184f9da0 --- /dev/null +++ b/.dotnet/src/Custom/LegacyCompletions/Internal/GeneratorStubs.cs @@ -0,0 +1,24 @@ +namespace OpenAI.LegacyCompletions; + +// CUSTOM: Made internal. 
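[Editorial note] For orientation, `GenerateUserAgentString` above composes `{applicationId} {assemblyName}/{version} {platformInformation}`, trimming any `+<commit-hash>` suffix from the informational version. A minimal standalone sketch of that composition follows; the concrete values are placeholders standing in for what the real code derives via reflection and `RuntimeInformation`, and it omits the escaping logic.

```csharp
using System;

class UserAgentDemo
{
    static void Main()
    {
        // Placeholder inputs; the library derives these from the client assembly at runtime.
        string applicationId = "MyApp";  // optional caller-supplied prefix, at most 24 characters
        string assemblyName = "OpenAI";
        string informationalVersion = "2.0.0+abc123";
        string platformInformation = "(.NET 8.0.0; Microsoft Windows 10.0.22631)";

        // Trim the '+<hash>' build-metadata suffix, as GenerateUserAgentString does.
        int hashSeparator = informationalVersion.LastIndexOf('+');
        string version = hashSeparator != -1
            ? informationalVersion.Substring(0, hashSeparator)
            : informationalVersion;

        string userAgent = applicationId != null
            ? $"{applicationId} {assemblyName}/{version} {platformInformation}"
            : $"{assemblyName}/{version} {platformInformation}";

        Console.WriteLine(userAgent);
        // Shape: "MyApp OpenAI/2.0.0 (.NET 8.0.0; Microsoft Windows 10.0.22631)"
    }
}
```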
+ +[CodeGenModel("CreateCompletionRequest")] +internal partial class InternalCreateCompletionRequest { } + +[CodeGenModel("CreateCompletionRequestModel")] +internal readonly partial struct InternalCreateCompletionRequestModel { } + +[CodeGenModel("CreateCompletionResponse")] +internal partial class InternalCreateCompletionResponse { } + +[CodeGenModel("CreateCompletionResponseChoice")] +internal partial class InternalCreateCompletionResponseChoice { } + +[CodeGenModel("CreateCompletionResponseChoiceFinishReason")] +internal readonly partial struct InternalCreateCompletionResponseChoiceFinishReason { } + +[CodeGenModel("CreateCompletionResponseChoiceLogprobs")] +internal partial class InternalCreateCompletionResponseChoiceLogprobs { } + +[CodeGenModel("CreateCompletionResponseObject")] +internal readonly partial struct InternalCreateCompletionResponseObject { } \ No newline at end of file diff --git a/.dotnet/src/Custom/LegacyCompletions/Internal/LegacyCompletionClient.cs b/.dotnet/src/Custom/LegacyCompletions/Internal/LegacyCompletionClient.cs new file mode 100644 index 000000000..7d50115ce --- /dev/null +++ b/.dotnet/src/Custom/LegacyCompletions/Internal/LegacyCompletionClient.cs @@ -0,0 +1,73 @@ +using System; +using System.ClientModel.Primitives; +using System.ClientModel; + +namespace OpenAI.LegacyCompletions; + +// CUSTOM: +// - Renamed. +// - Suppressed constructor that takes endpoint parameter; endpoint is now a property in the options class. +// - Suppressed methods that only take the options parameter. +/// The service client for OpenAI legacy completion operations. +[CodeGenClient("Completions")] +[CodeGenSuppress("LegacyCompletionClient", typeof(ClientPipeline), typeof(ApiKeyCredential), typeof(Uri))] +internal partial class LegacyCompletionClient +{ + private readonly string _model; + + // CUSTOM: + // - Added `model` parameter. + // - Used a custom pipeline. + // - Demoted the endpoint parameter to be a property in the options class. 
+ /// Initializes a new instance of . + /// The name of the model to use in requests sent to the service. To learn more about the available models, see . + /// The API key to authenticate with the service. + /// or is null. + /// is an empty string, and was expected to be non-empty. + public LegacyCompletionClient(string model, ApiKeyCredential credential) : this(model, credential, new OpenAIClientOptions()) + { + } + + // CUSTOM: + // - Added `model` parameter. + // - Used a custom pipeline. + // - Demoted the endpoint parameter to be a property in the options class. + /// Initializes a new instance of . + /// The name of the model to use in requests sent to the service. To learn more about the available models, see . + /// The API key to authenticate with the service. + /// The options to configure the client. + /// or is null. + /// is an empty string, and was expected to be non-empty. + public LegacyCompletionClient(string model, ApiKeyCredential credential, OpenAIClientOptions options) + { + Argument.AssertNotNullOrEmpty(model, nameof(model)); + Argument.AssertNotNull(credential, nameof(credential)); + options ??= new OpenAIClientOptions(); + + _model = model; + _pipeline = OpenAIClient.CreatePipeline(credential, options); + _endpoint = OpenAIClient.GetEndpoint(options); + } + + // CUSTOM: + // - Added `model` parameter. + // - Used a custom pipeline. + // - Demoted the endpoint parameter to be a property in the options class. + // - Made protected. + /// Initializes a new instance of . + /// The HTTP pipeline to send and receive REST requests and responses. + /// The name of the model to use in requests sent to the service. To learn more about the available models, see . + /// The options to configure the client. + /// or is null. + /// is an empty string, and was expected to be non-empty. 
+ protected internal LegacyCompletionClient(ClientPipeline pipeline, string model, OpenAIClientOptions options) + { + Argument.AssertNotNull(pipeline, nameof(pipeline)); + Argument.AssertNotNullOrEmpty(model, nameof(model)); + options ??= new OpenAIClientOptions(); + + _model = model; + _pipeline = pipeline; + _endpoint = OpenAIClient.GetEndpoint(options); + } +} diff --git a/.dotnet/src/Custom/Models/Internal/GeneratorStubs.cs b/.dotnet/src/Custom/Models/Internal/GeneratorStubs.cs new file mode 100644 index 000000000..2142aa5c1 --- /dev/null +++ b/.dotnet/src/Custom/Models/Internal/GeneratorStubs.cs @@ -0,0 +1,14 @@ + +namespace OpenAI.Models; + +[CodeGenModel("DeleteModelResponse")] +internal partial class InternalDeleteModelResponse { } + +[CodeGenModel("DeleteModelResponseObject")] +internal readonly partial struct InternalDeleteModelResponseObject { } + +[CodeGenModel("ListModelsResponseObject")] +internal readonly partial struct InternalListModelsResponseObject { } + +[CodeGenModel("ModelObject")] +internal readonly partial struct InternalModelObject { } diff --git a/.dotnet/src/Custom/Models/ModelClient.Protocol.cs b/.dotnet/src/Custom/Models/ModelClient.Protocol.cs new file mode 100644 index 000000000..7be879115 --- /dev/null +++ b/.dotnet/src/Custom/Models/ModelClient.Protocol.cs @@ -0,0 +1,118 @@ +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.ComponentModel; +using System.Threading.Tasks; + +namespace OpenAI.Models; + +[CodeGenSuppress("GetModelsAsync", typeof(RequestOptions))] +[CodeGenSuppress("GetModels", typeof(RequestOptions))] +[CodeGenSuppress("RetrieveAsync", typeof(string), typeof(RequestOptions))] +[CodeGenSuppress("Retrieve", typeof(string), typeof(RequestOptions))] +[CodeGenSuppress("DeleteAsync", typeof(string), typeof(RequestOptions))] +[CodeGenSuppress("Delete", typeof(string), typeof(RequestOptions))] +public partial class ModelClient +{ + /// + /// [Protocol Method] Lists the currently 
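As a usage sketch (not part of the diff): the constructor pattern above, with the endpoint demoted from a constructor parameter to a property on the options class, applies across these clients. This assumes an `Endpoint` property on `OpenAIClientOptions`; the key and gateway URL are placeholders.

```csharp
using System;
using System.ClientModel;
using OpenAI;
using OpenAI.Models;

// Sketch only: a non-default endpoint is now supplied via the options class,
// not via a constructor parameter. Key and URL below are placeholders.
ApiKeyCredential credential = new("sk-placeholder");
OpenAIClientOptions options = new()
{
    Endpoint = new Uri("https://my-gateway.example.com/v1") // hypothetical gateway
};
ModelClient client = new(credential, options);
```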
available models, and provides basic information about each one such as the + /// owner and availability. + /// + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// Service returned a non-success status code. + /// The response returned from the service. + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual async Task GetModelsAsync(RequestOptions options) + { + using PipelineMessage message = CreateGetModelsRequest(options); + return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + /// + /// [Protocol Method] Lists the currently available models, and provides basic information about each one such as the + /// owner and availability. + /// + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// Service returned a non-success status code. + /// The response returned from the service. + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual ClientResult GetModels(RequestOptions options) + { + using PipelineMessage message = CreateGetModelsRequest(options); + return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options)); + } + + /// + /// [Protocol Method] Retrieves a model instance, providing basic information about the model such as the owner and + /// permissioning. + /// + /// The ID of the model to use for this request. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// is null. + /// is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// The response returned from the service. 
+ [EditorBrowsable(EditorBrowsableState.Never)] + public virtual async Task GetModelAsync(string model, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(model, nameof(model)); + + using PipelineMessage message = CreateRetrieveRequest(model, options); + return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + /// + /// [Protocol Method] Retrieves a model instance, providing basic information about the model such as the owner and + /// permissioning. + /// + /// The ID of the model to use for this request. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// is null. + /// is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// The response returned from the service. + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual ClientResult GetModel(string model, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(model, nameof(model)); + + using PipelineMessage message = CreateRetrieveRequest(model, options); + return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options)); + } + + /// + /// [Protocol Method] Delete a fine-tuned model. You must have the Owner role in your organization to delete a model. + /// + /// The model to delete. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// is null. + /// is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// The response returned from the service. 
+ [EditorBrowsable(EditorBrowsableState.Never)] + public virtual async Task DeleteModelAsync(string model, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(model, nameof(model)); + + using PipelineMessage message = CreateDeleteRequest(model, options); + return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + /// + /// [Protocol Method] Delete a fine-tuned model. You must have the Owner role in your organization to delete a model. + /// + /// The model to delete. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// is null. + /// is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// The response returned from the service. + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual ClientResult DeleteModel(string model, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(model, nameof(model)); + + using PipelineMessage message = CreateDeleteRequest(model, options); + return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options)); + } +} diff --git a/.dotnet/src/Custom/Models/ModelClient.cs b/.dotnet/src/Custom/Models/ModelClient.cs new file mode 100644 index 000000000..efbadd6d7 --- /dev/null +++ b/.dotnet/src/Custom/Models/ModelClient.cs @@ -0,0 +1,131 @@ +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Threading.Tasks; + +namespace OpenAI.Models; + +// CUSTOM: +// - Renamed. +// - Suppressed constructor that takes endpoint parameter; endpoint is now a property in the options class. +// - Renamed convenience methods. +/// The service client for OpenAI model operations. 
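As a usage sketch (not part of the diff): the protocol methods above return the raw service response, leaving JSON parsing to the caller. This assumes an already-constructed `client` and the `data` array shape of the list-models response.

```csharp
using System;
using System.ClientModel;
using System.Text.Json;
using OpenAI.Models;

// Protocol-surface sketch: parse the raw response body directly.
ClientResult result = client.GetModels(options: null);
BinaryData json = result.GetRawResponse().Content;
using JsonDocument doc = JsonDocument.Parse(json);
foreach (JsonElement model in doc.RootElement.GetProperty("data").EnumerateArray())
{
    Console.WriteLine(model.GetProperty("id").GetString());
}
```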
+[CodeGenClient("ModelsOps")] +[CodeGenSuppress("ModelClient", typeof(ClientPipeline), typeof(ApiKeyCredential), typeof(Uri))] +[CodeGenSuppress("RetrieveAsync", typeof(string))] +[CodeGenSuppress("Retrieve", typeof(string))] +[CodeGenSuppress("DeleteAsync", typeof(string))] +[CodeGenSuppress("Delete", typeof(string))] +public partial class ModelClient +{ + // CUSTOM: + // - Used a custom pipeline. + // - Demoted the endpoint parameter to be a property in the options class. + /// Initializes a new instance of . + /// The API key to authenticate with the service. + /// is null. + public ModelClient(ApiKeyCredential credential) : this(credential, new OpenAIClientOptions()) + { + } + + // CUSTOM: + // - Used a custom pipeline. + // - Demoted the endpoint parameter to be a property in the options class. + /// Initializes a new instance of . + /// The API key to authenticate with the service. + /// The options to configure the client. + /// is null. + public ModelClient(ApiKeyCredential credential, OpenAIClientOptions options) + { + Argument.AssertNotNull(credential, nameof(credential)); + options ??= new OpenAIClientOptions(); + + _pipeline = OpenAIClient.CreatePipeline(credential, options); + _endpoint = OpenAIClient.GetEndpoint(options); + } + + // CUSTOM: + // - Used a custom pipeline. + // - Demoted the endpoint parameter to be a property in the options class. + // - Made protected. + /// Initializes a new instance of . + /// The HTTP pipeline to send and receive REST requests and responses. + /// The options to configure the client. + /// is null. + protected internal ModelClient(ClientPipeline pipeline, OpenAIClientOptions options) + { + Argument.AssertNotNull(pipeline, nameof(pipeline)); + options ??= new OpenAIClientOptions(); + + _pipeline = pipeline; + _endpoint = OpenAIClient.GetEndpoint(options); + } + + /// Gets basic information about each of the models that are currently available, such as their corresponding owner and availability. 
+ public virtual async Task> GetModelsAsync() + { + ClientResult result = await GetModelsAsync(null).ConfigureAwait(false); + return ClientResult.FromValue(OpenAIModelInfoCollection.FromResponse(result.GetRawResponse()), result.GetRawResponse()); + } + + /// Gets basic information about each of the models that are currently available, such as their corresponding owner and availability. + public virtual ClientResult GetModels() + { + ClientResult result = GetModels(null); + return ClientResult.FromValue(OpenAIModelInfoCollection.FromResponse(result.GetRawResponse()), result.GetRawResponse()); + } + + /// Gets basic information about the specified model, such as its owner and availability. + /// The name of the desired model. + /// is null. + /// is an empty string, and was expected to be non-empty. + public virtual async Task> GetModelAsync(string model) + { + Argument.AssertNotNullOrEmpty(model, nameof(model)); + + ClientResult result = await GetModelAsync(model, (RequestOptions)null).ConfigureAwait(false); + return ClientResult.FromValue(OpenAIModelInfo.FromResponse(result.GetRawResponse()), result.GetRawResponse()); + } + + /// Gets basic information about the specified model, such as its owner and availability. + /// The name of the desired model. + /// is null. + /// is an empty string, and was expected to be non-empty. + public virtual ClientResult GetModel(string model) + { + Argument.AssertNotNullOrEmpty(model, nameof(model)); + + ClientResult result = GetModel(model, (RequestOptions)null); + return ClientResult.FromValue(OpenAIModelInfo.FromResponse(result.GetRawResponse()), result.GetRawResponse()); + } + + /// Deletes the specified fine-tuned model. + /// You must have the role of "owner" within your organization in order to be able to delete a model. + /// The name of the model to delete. + /// is null. + /// is an empty string, and was expected to be non-empty. 
+ public virtual async Task> DeleteModelAsync(string model) + { + Argument.AssertNotNullOrEmpty(model, nameof(model)); + + ClientResult result = await DeleteModelAsync(model, null).ConfigureAwait(false); + PipelineResponse response = result?.GetRawResponse(); + InternalDeleteModelResponse value = InternalDeleteModelResponse.FromResponse(response); + return ClientResult.FromValue(value.Deleted, response); + } + + /// Deletes the specified fine-tuned model. + /// You must have the role of "owner" within your organization in order to be able to delete a model. + /// The name of the model to delete. + /// is null. + /// is an empty string, and was expected to be non-empty. + public virtual ClientResult DeleteModel(string model) + { + Argument.AssertNotNullOrEmpty(model, nameof(model)); + + ClientResult result = DeleteModel(model, null); + PipelineResponse response = result?.GetRawResponse(); + InternalDeleteModelResponse value = InternalDeleteModelResponse.FromResponse(response); + return ClientResult.FromValue(value.Deleted, response); + } +} diff --git a/.dotnet/src/Custom/Models/OpenAIModelInfo.cs b/.dotnet/src/Custom/Models/OpenAIModelInfo.cs new file mode 100644 index 000000000..1fbb8fddc --- /dev/null +++ b/.dotnet/src/Custom/Models/OpenAIModelInfo.cs @@ -0,0 +1,19 @@ +using System; + +namespace OpenAI.Models; + +/// +/// Represents information about a single available model entry. +/// +[CodeGenModel("Model")] +public partial class OpenAIModelInfo +{ + // CUSTOM: Made private. This property does not add value in the context of a strongly-typed class. + /// The object type, which is always "model". + private InternalModelObject Object { get; } = InternalModelObject.Model; + + // CUSTOM: Renamed. + /// The Unix timestamp (in seconds) when the model was created. 
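As a usage sketch (not part of the diff): the convenience surface above, assuming the strongly typed results implied by the method bodies (a collection of `OpenAIModelInfo` entries, and a `bool` taken from `InternalDeleteModelResponse.Deleted`). The key and fine-tuned model name are placeholders.

```csharp
using System;
using System.ClientModel;
using OpenAI.Models;

// Convenience-surface sketch.
ModelClient client = new(new ApiKeyCredential("sk-placeholder"));

ClientResult<OpenAIModelInfoCollection> models = client.GetModels();
foreach (OpenAIModelInfo info in models.Value)
{
    Console.WriteLine($"{info.Id} (owned by {info.OwnedBy}, created {info.CreatedAt})");
}

// "ft:..." is a hypothetical fine-tuned model name; only fine-tuned models can be deleted.
ClientResult<bool> deleted = client.DeleteModel("ft:gpt-3.5-turbo:my-org::abc123");
```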
+ [CodeGenMember("Created")] + public DateTimeOffset CreatedAt { get; } +} diff --git a/.dotnet/src/Custom/Models/OpenAIModelInfoCollection.Serialization.cs b/.dotnet/src/Custom/Models/OpenAIModelInfoCollection.Serialization.cs new file mode 100644 index 000000000..6cfc9383d --- /dev/null +++ b/.dotnet/src/Custom/Models/OpenAIModelInfoCollection.Serialization.cs @@ -0,0 +1,74 @@ +using OpenAI.Models; +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Reflection; +using System.Text.Json; + +namespace OpenAI.Models; + +[CodeGenSuppress("global::System.ClientModel.Primitives.IJsonModel.Write", typeof(Utf8JsonWriter), typeof(ModelReaderWriterOptions))] +public partial class OpenAIModelInfoCollection : IJsonModel +{ + // CUSTOM: + // - Serialized the Items property. + // - Recovered the serialization of SerializedAdditionalRawData. See https://github.com/Azure/autorest.csharp/issues/4636. + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + => CustomSerializationHelpers.SerializeInstance(this, SerializeOpenAIModelInfoCollection, writer, options); + + internal static void SerializeOpenAIModelInfoCollection(OpenAIModelInfoCollection instance, Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + writer.WriteStartObject(); + writer.WritePropertyName("object"u8); + writer.WriteStringValue(instance.Object.ToString()); + writer.WritePropertyName("data"u8); + writer.WriteStartArray(); + foreach (var item in instance.Items) + { + writer.WriteObjectValue(item, options); + } + writer.WriteEndArray(); + writer.WriteSerializedAdditionalRawData(instance.SerializedAdditionalRawData, options); + writer.WriteEndObject(); + } + + // CUSTOM: Recovered the deserialization of SerializedAdditionalRawData. See https://github.com/Azure/autorest.csharp/issues/4636. 
+ internal static OpenAIModelInfoCollection DeserializeOpenAIModelInfoCollection(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + InternalListModelsResponseObject @object = default; + IReadOnlyList data = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("object"u8)) + { + @object = new InternalListModelsResponseObject(property.Value.GetString()); + continue; + } + if (property.NameEquals("data"u8)) + { + List array = new List(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(OpenAIModelInfo.DeserializeOpenAIModelInfo(item, options)); + } + data = array; + continue; + } + if (true) + { + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new OpenAIModelInfoCollection(@object, data, serializedAdditionalRawData); + } +} \ No newline at end of file diff --git a/.dotnet/src/Custom/Models/OpenAIModelInfoCollection.cs b/.dotnet/src/Custom/Models/OpenAIModelInfoCollection.cs new file mode 100644 index 000000000..773bab8f9 --- /dev/null +++ b/.dotnet/src/Custom/Models/OpenAIModelInfoCollection.cs @@ -0,0 +1,79 @@ +using System; +using System.Collections.Generic; +using System.Collections.ObjectModel; +using System.Reflection; + +namespace OpenAI.Models; + +/// +/// Represents a collection of entries for available models. 
+/// +[CodeGenModel("ListModelsResponse")] +[CodeGenSuppress("Data")] +[CodeGenSuppress(nameof(OpenAIModelInfoCollection))] +[CodeGenSuppress(nameof(OpenAIModelInfoCollection), typeof(InternalListModelsResponseObject), typeof(IReadOnlyList))] +public partial class OpenAIModelInfoCollection : ReadOnlyCollection +{ + // CUSTOM: Made private. This property does not add value in the context of a strongly-typed class. + /// Gets the object. + private InternalListModelsResponseObject Object { get; } = InternalListModelsResponseObject.List; + + // CUSTOM: Recovered this field. See https://github.com/Azure/autorest.csharp/issues/4636. + /// + /// Keeps track of any properties unknown to the library. + /// + /// To assign an object to the value of this property use . + /// + /// + /// To assign an already formatted json string to this property use . + /// + /// + /// Examples: + /// + /// + /// BinaryData.FromObjectAsJson("foo") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromString("\"foo\"") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromObjectAsJson(new { key = "value" }) + /// Creates a payload of { "key": "value" }. + /// + /// + /// BinaryData.FromString("{\"key\": \"value\"}") + /// Creates a payload of { "key": "value" }. + /// + /// + /// + /// + private IDictionary SerializedAdditionalRawData; + + /// Initializes a new instance of . + /// + /// is null. + internal OpenAIModelInfoCollection(IEnumerable data) + : base([.. data]) + { + Argument.AssertNotNull(data, nameof(data)); + } + + /// Initializes a new instance of . + /// + /// + /// Keeps track of any properties unknown to the library. + internal OpenAIModelInfoCollection(InternalListModelsResponseObject @object, IReadOnlyList data, IDictionary serializedAdditionalRawData) + : base([.. data]) + { + Object = @object; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + /// Initializes a new instance of for deserialization. 
+ internal OpenAIModelInfoCollection() + : base([]) + { + } +} diff --git a/.dotnet/src/Custom/Models/OpenAIModelsModelFactory.cs b/.dotnet/src/Custom/Models/OpenAIModelsModelFactory.cs new file mode 100644 index 000000000..960f6b5e2 --- /dev/null +++ b/.dotnet/src/Custom/Models/OpenAIModelsModelFactory.cs @@ -0,0 +1,33 @@ +using System; +using System.Collections.Generic; +using System.Linq; + +namespace OpenAI.Models; + +/// Model factory for models. +public static partial class OpenAIModelsModelFactory +{ + /// Initializes a new instance of . + /// A new instance for mocking. + public static OpenAIModelInfo OpenAIModelInfo(string id = null, DateTimeOffset createdAt = default, string ownedBy = null) + { + return new OpenAIModelInfo( + id, + createdAt, + InternalModelObject.Model, + ownedBy, + serializedAdditionalRawData: null); + } + + /// Initializes a new instance of . + /// A new instance for mocking. + public static OpenAIModelInfoCollection OpenAIModelInfoCollection(IEnumerable items = null) + { + items ??= new List(); + + return new OpenAIModelInfoCollection( + InternalListModelsResponseObject.List, + items.ToList(), + serializedAdditionalRawData: null); + } +} diff --git a/.dotnet/src/Custom/Moderations/Internal/GeneratorStubs.cs b/.dotnet/src/Custom/Moderations/Internal/GeneratorStubs.cs new file mode 100644 index 000000000..4e11648b7 --- /dev/null +++ b/.dotnet/src/Custom/Moderations/Internal/GeneratorStubs.cs @@ -0,0 +1,4 @@ +namespace OpenAI.Moderations; + +[CodeGenModel("CreateModerationRequestModel")] +internal readonly partial struct InternalCreateModerationRequestModel { } \ No newline at end of file diff --git a/.dotnet/src/Custom/Moderations/ModerationCategories.cs b/.dotnet/src/Custom/Moderations/ModerationCategories.cs new file mode 100644 index 000000000..a5cbbe8bf --- /dev/null +++ b/.dotnet/src/Custom/Moderations/ModerationCategories.cs @@ -0,0 +1,6 @@ +namespace OpenAI.Moderations; + 
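As a usage sketch (not part of the diff): the model factory above lets tests fabricate read-only models without a service call, and because the collection implements the `IJsonModel` pattern it can be round-tripped with `ModelReaderWriter` from `System.ClientModel`. The id and owner values are placeholders.

```csharp
using System;
using System.ClientModel.Primitives;
using OpenAI.Models;

// Fabricate models for mocking/tests via the factory.
OpenAIModelInfo fake = OpenAIModelsModelFactory.OpenAIModelInfo(
    id: "model-for-tests",
    createdAt: DateTimeOffset.FromUnixTimeSeconds(1_700_000_000),
    ownedBy: "unit-tests");
OpenAIModelInfoCollection collection =
    OpenAIModelsModelFactory.OpenAIModelInfoCollection(new[] { fake });

// Round-trip through the wire format (exercises the custom serialization).
BinaryData wire = ModelReaderWriter.Write(collection);
OpenAIModelInfoCollection roundTripped =
    ModelReaderWriter.Read<OpenAIModelInfoCollection>(wire);
```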
+[CodeGenModel("CreateModerationResponseResultCategories")] +public partial class ModerationCategories +{ +} diff --git a/.dotnet/src/Custom/Moderations/ModerationCategoryScores.cs b/.dotnet/src/Custom/Moderations/ModerationCategoryScores.cs new file mode 100644 index 000000000..34b093f18 --- /dev/null +++ b/.dotnet/src/Custom/Moderations/ModerationCategoryScores.cs @@ -0,0 +1,6 @@ +namespace OpenAI.Moderations; + +[CodeGenModel("CreateModerationResponseResultCategoryScores")] +public partial class ModerationCategoryScores +{ +} diff --git a/.dotnet/src/Custom/Moderations/ModerationClient.Protocol.cs b/.dotnet/src/Custom/Moderations/ModerationClient.Protocol.cs new file mode 100644 index 000000000..676ddc229 --- /dev/null +++ b/.dotnet/src/Custom/Moderations/ModerationClient.Protocol.cs @@ -0,0 +1,46 @@ +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.ComponentModel; +using System.Threading.Tasks; + +namespace OpenAI.Moderations; + +[CodeGenSuppress("CreateModerationAsync", typeof(BinaryContent), typeof(RequestOptions))] +[CodeGenSuppress("CreateModeration", typeof(BinaryContent), typeof(RequestOptions))] +public partial class ModerationClient +{ + /// + /// [Protocol Method] Classifies if text is potentially harmful. + /// + /// The content to send as the body of the request. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// is null. + /// Service returned a non-success status code. + /// The response returned from the service. 
+ [EditorBrowsable(EditorBrowsableState.Never)] + public virtual async Task ClassifyTextInputsAsync(BinaryContent content, RequestOptions options = null) + { + Argument.AssertNotNull(content, nameof(content)); + + using PipelineMessage message = CreateCreateModerationRequest(content, options); + return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + /// + /// [Protocol Method] Classifies if text is potentially harmful. + /// + /// The content to send as the body of the request. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// is null. + /// Service returned a non-success status code. + /// The response returned from the service. + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual ClientResult ClassifyTextInputs(BinaryContent content, RequestOptions options = null) + { + Argument.AssertNotNull(content, nameof(content)); + + using PipelineMessage message = CreateCreateModerationRequest(content, options); + return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options)); + } +} diff --git a/.dotnet/src/Custom/Moderations/ModerationClient.cs b/.dotnet/src/Custom/Moderations/ModerationClient.cs new file mode 100644 index 000000000..d425f1ad9 --- /dev/null +++ b/.dotnet/src/Custom/Moderations/ModerationClient.cs @@ -0,0 +1,153 @@ +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Linq; +using System.Threading; +using System.Threading.Tasks; + +namespace OpenAI.Moderations; + +// CUSTOM: +// - Renamed. +// - Suppressed constructor that takes endpoint parameter; endpoint is now a property in the options class. +// - Suppressed methods that only take the options parameter. +/// The service client for OpenAI moderation operations. 
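As a usage sketch (not part of the diff): at the protocol level above, the request body is raw JSON wrapped in `BinaryContent`. The `input`/`model` shape matches what `CreateModerationOptions` assembles; "text-moderation-latest" is a commonly used moderation model alias, and `moderationClient` is assumed to be already constructed.

```csharp
using System;
using System.ClientModel;
using OpenAI.Moderations;

// Protocol-level sketch: hand-built JSON request body.
using BinaryContent body = BinaryContent.Create(BinaryData.FromString("""
    { "model": "text-moderation-latest", "input": "text to screen" }
    """));
ClientResult result = moderationClient.ClassifyTextInputs(body, options: null);
```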
+[CodeGenClient("Moderations")] +[CodeGenSuppress("ModerationClient", typeof(ClientPipeline), typeof(ApiKeyCredential), typeof(Uri))] +[CodeGenSuppress("CreateModerationAsync", typeof(ModerationOptions))] +[CodeGenSuppress("CreateModeration", typeof(ModerationOptions))] +public partial class ModerationClient +{ + private readonly string _model; + + // CUSTOM: + // - Added `model` parameter. + // - Used a custom pipeline. + // - Demoted the endpoint parameter to be a property in the options class. + /// Initializes a new instance of . + /// The name of the model to use in requests sent to the service. To learn more about the available models, see . + /// The API key to authenticate with the service. + /// or is null. + /// is an empty string, and was expected to be non-empty. + public ModerationClient(string model, ApiKeyCredential credential) : this(model, credential, new OpenAIClientOptions()) + { + } + + // CUSTOM: + // - Added `model` parameter. + // - Used a custom pipeline. + // - Demoted the endpoint parameter to be a property in the options class. + /// Initializes a new instance of . + /// The name of the model to use in requests sent to the service. To learn more about the available models, see . + /// The API key to authenticate with the service. + /// The options to configure the client. + /// or is null. + /// is an empty string, and was expected to be non-empty. + public ModerationClient(string model, ApiKeyCredential credential, OpenAIClientOptions options) + { + Argument.AssertNotNullOrEmpty(model, nameof(model)); + Argument.AssertNotNull(credential, nameof(credential)); + options ??= new OpenAIClientOptions(); + + _model = model; + _pipeline = OpenAIClient.CreatePipeline(credential, options); + _endpoint = OpenAIClient.GetEndpoint(options); + } + + // CUSTOM: + // - Added `model` parameter. + // - Used a custom pipeline. + // - Demoted the endpoint parameter to be a property in the options class. + // - Made protected. 
+ /// Initializes a new instance of . + /// The HTTP pipeline to send and receive REST requests and responses. + /// The name of the model to use in requests sent to the service. To learn more about the available models, see . + /// The options to configure the client. + /// or is null. + /// is an empty string, and was expected to be non-empty. + protected internal ModerationClient(ClientPipeline pipeline, string model, OpenAIClientOptions options) + { + Argument.AssertNotNull(pipeline, nameof(pipeline)); + Argument.AssertNotNullOrEmpty(model, nameof(model)); + options ??= new OpenAIClientOptions(); + + _model = model; + _pipeline = pipeline; + _endpoint = OpenAIClient.GetEndpoint(options); + } + + /// Classifies if the text input is potentially harmful across several categories. + /// The text input to classify. + /// A token that can be used to cancel this method call. + /// is null. + /// is an empty string, and was expected to be non-empty. + public virtual async Task> ClassifyTextInputAsync(string input, CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(input, nameof(input)); + + ModerationOptions options = new(); + CreateModerationOptions(BinaryData.FromObjectAsJson(input), ref options); + + using BinaryContent content = options.ToBinaryContent(); + ClientResult result = await ClassifyTextInputsAsync(content, cancellationToken.ToRequestOptions()).ConfigureAwait(false); + return ClientResult.FromValue(ModerationCollection.FromResponse(result.GetRawResponse()).FirstOrDefault(), result.GetRawResponse()); + } + + /// Classifies if the text input is potentially harmful across several categories. + /// The text input to classify. + /// A token that can be used to cancel this method call. + /// is null. + /// is an empty string, and was expected to be non-empty. 
+ public virtual ClientResult ClassifyTextInput(string input, CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(input, nameof(input)); + + ModerationOptions options = new(); + CreateModerationOptions(BinaryData.FromObjectAsJson(input), ref options); + + using BinaryContent content = options.ToBinaryContent(); + ClientResult result = ClassifyTextInputs(content, cancellationToken.ToRequestOptions()); + return ClientResult.FromValue(ModerationCollection.FromResponse(result.GetRawResponse()).FirstOrDefault(), result.GetRawResponse()); + } + + /// Classifies if the text inputs are potentially harmful across several categories. + /// The text inputs to classify. + /// A token that can be used to cancel this method call. + /// is null. + /// is an empty collection, and was expected to be non-empty. + public virtual async Task> ClassifyTextInputsAsync(IEnumerable inputs, CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(inputs, nameof(inputs)); + + ModerationOptions options = new(); + CreateModerationOptions(BinaryData.FromObjectAsJson(inputs), ref options); + + using BinaryContent content = options.ToBinaryContent(); + ClientResult result = await ClassifyTextInputsAsync(content, cancellationToken.ToRequestOptions()).ConfigureAwait(false); + return ClientResult.FromValue(ModerationCollection.FromResponse(result.GetRawResponse()), result.GetRawResponse()); + } + + /// Classifies if the text inputs are potentially harmful across several categories. + /// The text inputs to classify. + /// A token that can be used to cancel this method call. + /// is null. + /// is an empty collection, and was expected to be non-empty. 
+ public virtual ClientResult ClassifyTextInputs(IEnumerable inputs, CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(inputs, nameof(inputs)); + + ModerationOptions options = new(); + CreateModerationOptions(BinaryData.FromObjectAsJson(inputs), ref options); + + using BinaryContent content = options.ToBinaryContent(); + ClientResult result = ClassifyTextInputs(content, cancellationToken.ToRequestOptions()); + return ClientResult.FromValue(ModerationCollection.FromResponse(result.GetRawResponse()), result.GetRawResponse()); + } + + private void CreateModerationOptions(BinaryData input, ref ModerationOptions options) + { + options.Input = input; + options.Model = _model; + } +} diff --git a/.dotnet/src/Custom/Moderations/ModerationCollection.Serialization..cs b/.dotnet/src/Custom/Moderations/ModerationCollection.Serialization..cs new file mode 100644 index 000000000..1bf783f19 --- /dev/null +++ b/.dotnet/src/Custom/Moderations/ModerationCollection.Serialization..cs @@ -0,0 +1,83 @@ +// + +#nullable disable + +using System; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Moderations; + +[CodeGenSuppress("global::System.ClientModel.Primitives.IJsonModel.Write", typeof(Utf8JsonWriter), typeof(ModelReaderWriterOptions))] +public partial class ModerationCollection : IJsonModel +{ + // CUSTOM: + // - Serialized the Items property. + // - Recovered the deserialization of SerializedAdditionalRawData. See https://github.com/Azure/autorest.csharp/issues/4636. 
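As a usage sketch (not part of the diff): the convenience surface above, for a single input and a batch. This assumes a `Flagged` property on the strongly typed `ModerationResult`; the key, model name, and inputs are placeholders, and `ModerationCollection` exposes `Count` via `ReadOnlyCollection`.

```csharp
using System;
using System.ClientModel;
using OpenAI.Moderations;

// Convenience-surface sketch.
ModerationClient client = new("text-moderation-latest", new ApiKeyCredential("sk-placeholder"));

ClientResult<ModerationResult> single = client.ClassifyTextInput("some user text");
Console.WriteLine(single.Value.Flagged); // assumed property on the typed result

ClientResult<ModerationCollection> batch = client.ClassifyTextInputs(new[] { "first", "second" });
Console.WriteLine(batch.Value.Count);
```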
+ void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + => CustomSerializationHelpers.SerializeInstance(this, SerializeModerationCollection, writer, options); + + internal static void SerializeModerationCollection(ModerationCollection instance, Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + writer.WriteStartObject(); + writer.WritePropertyName("id"u8); + writer.WriteStringValue(instance.Id); + writer.WritePropertyName("model"u8); + writer.WriteStringValue(instance.Model); + writer.WritePropertyName("results"u8); + writer.WriteStartArray(); + foreach (var item in instance.Items) + { + writer.WriteObjectValue(item, options); + } + writer.WriteEndArray(); + writer.WriteSerializedAdditionalRawData(instance.SerializedAdditionalRawData, options); + writer.WriteEndObject(); + } + + // CUSTOM: Recovered the deserialization of SerializedAdditionalRawData. See https://github.com/Azure/autorest.csharp/issues/4636. + internal static ModerationCollection DeserializeModerationCollection(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string id = default; + string model = default; + IReadOnlyList results = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("id"u8)) + { + id = property.Value.GetString(); + continue; + } + if (property.NameEquals("model"u8)) + { + model = property.Value.GetString(); + continue; + } + if (property.NameEquals("results"u8)) + { + List array = new List(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(ModerationResult.DeserializeModerationResult(item, options)); + } + results = array; + continue; + } + if (true) + { + rawDataDictionary.Add(property.Name, 
BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new ModerationCollection(id, model, results, serializedAdditionalRawData); + } +} diff --git a/.dotnet/src/Custom/Moderations/ModerationCollection.cs b/.dotnet/src/Custom/Moderations/ModerationCollection.cs new file mode 100644 index 000000000..ed2a4c6d1 --- /dev/null +++ b/.dotnet/src/Custom/Moderations/ModerationCollection.cs @@ -0,0 +1,80 @@ +using System; +using System.Collections.Generic; +using System.Collections.ObjectModel; + +namespace OpenAI.Moderations; + +[CodeGenModel("CreateModerationResponse")] +[CodeGenSuppress("Results")] +[CodeGenSuppress(nameof(ModerationCollection))] +[CodeGenSuppress(nameof(ModerationCollection), typeof(string), typeof(string), typeof(IReadOnlyList))] +public partial class ModerationCollection : ReadOnlyCollection +{ + // CUSTOM: Recovered this field. See https://github.com/Azure/autorest.csharp/issues/4636. + /// + /// Keeps track of any properties unknown to the library. + /// + /// To assign an object to the value of this property use . + /// + /// + /// To assign an already formatted json string to this property use . + /// + /// + /// Examples: + /// + /// + /// BinaryData.FromObjectAsJson("foo") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromString("\"foo\"") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromObjectAsJson(new { key = "value" }) + /// Creates a payload of { "key": "value" }. + /// + /// + /// BinaryData.FromString("{\"key\": \"value\"}") + /// Creates a payload of { "key": "value" }. + /// + /// + /// + /// + private IDictionary SerializedAdditionalRawData; + + /// Initializes a new instance of . + /// The unique identifier for the moderation request. + /// The model used to generate the moderation results. + /// A list of moderation objects. + /// , or is null. + internal ModerationCollection(string id, string model, IEnumerable results) + : base([.. 
results]) + { + Argument.AssertNotNull(id, nameof(id)); + Argument.AssertNotNull(model, nameof(model)); + Argument.AssertNotNull(results, nameof(results)); + + Id = id; + Model = model; + } + + /// Initializes a new instance of . + /// The unique identifier for the moderation request. + /// The model used to generate the moderation results. + /// A list of moderation objects. + /// Keeps track of any properties unknown to the library. + internal ModerationCollection(string id, string model, IReadOnlyList results, IDictionary serializedAdditionalRawData) + : base([.. results]) + { + Id = id; + Model = model; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + /// Initializes a new instance of for deserialization. + internal ModerationCollection() + : base([]) + { + } +} diff --git a/.dotnet/src/Custom/Moderations/ModerationOptions.cs b/.dotnet/src/Custom/Moderations/ModerationOptions.cs new file mode 100644 index 000000000..a2561defe --- /dev/null +++ b/.dotnet/src/Custom/Moderations/ModerationOptions.cs @@ -0,0 +1,71 @@ +using System; +using System.Collections.Generic; + +namespace OpenAI.Moderations; + +[CodeGenModel("CreateModerationRequest")] +[CodeGenSuppress("ModerationOptions", typeof(BinaryData))] +internal partial class ModerationOptions +{ + // CUSTOM: + // - Made internal. This value comes from a parameter on the client method. + // - Added setter. + /// + /// The input text to classify + /// + /// To assign an object to this property use . + /// + /// + /// To assign an already formatted json string to this property use . + /// + /// + /// + /// Supported types: + /// + /// + /// + /// + /// + /// where T is of type + /// + /// + /// + /// Examples: + /// + /// + /// BinaryData.FromObjectAsJson("foo") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromString("\"foo\"") + /// Creates a payload of "foo". + /// + /// + /// BinaryData.FromObjectAsJson(new { key = "value" }) + /// Creates a payload of { "key": "value" }. 
+    ///
+    ///
+    /// BinaryData.FromString("{\"key\": \"value\"}")
+    /// Creates a payload of { "key": "value" }.
+    ///
+    ///
+    ///
+    ///
+    internal BinaryData Input { get; set; }
+
+    // CUSTOM: Made internal. The model is specified by the client.
+    ///
+    /// Two content moderation models are available: `text-moderation-stable` and
+    /// `text-moderation-latest`. The default is `text-moderation-latest`, which will be automatically
+    /// upgraded over time. This ensures you are always using our most accurate model. If you use
+    /// `text-moderation-stable`, we will provide advance notice before updating the model. Accuracy
+    /// of `text-moderation-stable` may be slightly lower than for `text-moderation-latest`.
+    ///
+    internal InternalCreateModerationRequestModel? Model { get; set; }
+
+    // CUSTOM: Made public now that there are no required properties.
+    /// Initializes a new instance of for deserialization.
+    public ModerationOptions()
+    {
+    }
+}
diff --git a/.dotnet/src/Custom/Moderations/ModerationResult.cs b/.dotnet/src/Custom/Moderations/ModerationResult.cs
new file mode 100644
index 000000000..78e95f255
--- /dev/null
+++ b/.dotnet/src/Custom/Moderations/ModerationResult.cs
@@ -0,0 +1,6 @@
+namespace OpenAI.Moderations;
+
+[CodeGenModel("CreateModerationResponseResult")]
+public partial class ModerationResult
+{
+}
diff --git a/.dotnet/src/Custom/Moderations/OpenAIModerationsModelFactory.cs b/.dotnet/src/Custom/Moderations/OpenAIModerationsModelFactory.cs
new file mode 100644
index 000000000..f81a2a59d
--- /dev/null
+++ b/.dotnet/src/Custom/Moderations/OpenAIModerationsModelFactory.cs
@@ -0,0 +1,70 @@
+using System.Collections.Generic;
+using System.Linq;
+
+namespace OpenAI.Moderations;
+
+/// Model factory for models.
+public static partial class OpenAIModerationsModelFactory
+{
+    /// Initializes a new instance of .
+    /// A new instance for mocking.
+    public static ModerationCategories ModerationCategories(bool hate = default, bool hateThreatening = default, bool harassment = default, bool harassmentThreatening = default, bool selfHarm = default, bool selfHarmIntent = default, bool selfHarmInstructions = default, bool sexual = default, bool sexualMinors = default, bool violence = default, bool violenceGraphic = default)
+    {
+        return new ModerationCategories(
+            hate,
+            hateThreatening,
+            harassment,
+            harassmentThreatening,
+            selfHarm,
+            selfHarmIntent,
+            selfHarmInstructions,
+            sexual,
+            sexualMinors,
+            violence,
+            violenceGraphic,
+            serializedAdditionalRawData: null);
+    }
+
+    /// Initializes a new instance of .
+    /// A new instance for mocking.
+    public static ModerationCategoryScores ModerationCategoryScores(float hate = default, float hateThreatening = default, float harassment = default, float harassmentThreatening = default, float selfHarm = default, float selfHarmIntent = default, float selfHarmInstructions = default, float sexual = default, float sexualMinors = default, float violence = default, float violenceGraphic = default)
+    {
+        return new ModerationCategoryScores(
+            hate,
+            hateThreatening,
+            harassment,
+            harassmentThreatening,
+            selfHarm,
+            selfHarmIntent,
+            selfHarmInstructions,
+            sexual,
+            sexualMinors,
+            violence,
+            violenceGraphic,
+            serializedAdditionalRawData: null);
+    }
+
+    /// Initializes a new instance of .
+    /// A new instance for mocking.
+    public static ModerationCollection ModerationCollection(string id = null, string model = null, IEnumerable<ModerationResult> items = null)
+    {
+        items ??= new List<ModerationResult>();
+
+        return new ModerationCollection(
+            id,
+            model,
+            items.ToList(),
+            serializedAdditionalRawData: null);
+    }
+
+    /// Initializes a new instance of .
+    /// A new instance for mocking.
+ public static ModerationResult ModerationResult(bool flagged = default, ModerationCategories categories = null, ModerationCategoryScores categoryScores = null) + { + return new ModerationResult( + flagged, + categories, + categoryScores, + serializedAdditionalRawData: null); + } +} diff --git a/.dotnet/src/Custom/OpenAIClient.cs b/.dotnet/src/Custom/OpenAIClient.cs new file mode 100644 index 000000000..ebc442386 --- /dev/null +++ b/.dotnet/src/Custom/OpenAIClient.cs @@ -0,0 +1,294 @@ +using OpenAI.Assistants; +using OpenAI.Audio; +using OpenAI.Batch; +using OpenAI.Chat; +using OpenAI.Embeddings; +using OpenAI.Files; +using OpenAI.FineTuning; +using OpenAI.Images; +using OpenAI.Models; +using OpenAI.Moderations; +using OpenAI.VectorStores; +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Diagnostics.CodeAnalysis; + +namespace OpenAI; + +// CUSTOM: +// - Suppressed constructor that takes endpoint parameter; endpoint is now a property in the options class. +// - Suppressed cached clients. Clients are not singletons, and users can create multiple clients of the same type +// if needed (e.g., to target different OpenAI models). The Get*Client methods return new client instances. +/// +/// A top-level client factory that enables convenient creation of scenario-specific sub-clients while reusing shared +/// configuration details like endpoint, authentication, and pipeline customization. 
+/// +[CodeGenModel("OpenAIClient")] +[CodeGenSuppress("OpenAIClient", typeof(ApiKeyCredential))] +[CodeGenSuppress("OpenAIClient", typeof(Uri), typeof(ApiKeyCredential), typeof(OpenAIClientOptions))] +[CodeGenSuppress("_cachedAssistantClient")] +[CodeGenSuppress("_cachedAudioClient")] +[CodeGenSuppress("_cachedBatchClient")] +[CodeGenSuppress("_cachedChatClient")] +[CodeGenSuppress("_cachedEmbeddingClient")] +[CodeGenSuppress("_cachedFileClient")] +[CodeGenSuppress("_cachedFineTuningClient")] +[CodeGenSuppress("_cachedImageClient")] +[CodeGenSuppress("_cachedInternalAssistantMessageClient")] +[CodeGenSuppress("_cachedInternalAssistantRunClient")] +[CodeGenSuppress("_cachedInternalAssistantThreadClient")] +[CodeGenSuppress("_cachedInternalUploadsClient")] +[CodeGenSuppress("_cachedLegacyCompletionClient")] +[CodeGenSuppress("_cachedModelClient")] +[CodeGenSuppress("_cachedModerationClient")] +[CodeGenSuppress("_cachedVectorStoreClient")] +[CodeGenSuppress("GetAssistantClientClient")] +[CodeGenSuppress("GetAudioClientClient")] +[CodeGenSuppress("GetBatchClientClient")] +[CodeGenSuppress("GetChatClientClient")] +[CodeGenSuppress("GetEmbeddingClientClient")] +[CodeGenSuppress("GetFileClientClient")] +[CodeGenSuppress("GetFineTuningClientClient")] +[CodeGenSuppress("GetImageClientClient")] +[CodeGenSuppress("GetInternalAssistantMessageClientClient")] +[CodeGenSuppress("GetInternalAssistantRunClientClient")] +[CodeGenSuppress("GetInternalAssistantThreadClientClient")] +[CodeGenSuppress("GetInternalUploadsClientClient")] +[CodeGenSuppress("GetLegacyCompletionClientClient")] +[CodeGenSuppress("GetModelClientClient")] +[CodeGenSuppress("GetModerationClientClient")] +[CodeGenSuppress("GetVectorStoreClientClient")] +public partial class OpenAIClient +{ + private const string OpenAIV1Endpoint = "https://api.openai.com"; + private const string OpenAIBetaHeaderValue = "assistants=v2"; + + private static class KnownHeaderNames + { + public const string OpenAIBeta = 
"OpenAI-Beta"; + public const string OpenAIOrganization = "OpenAI-Organization"; + public const string OpenAIProject = "OpenAI-Project"; + public const string UserAgent = "User-Agent"; + } + + private readonly OpenAIClientOptions _options; + + // CUSTOM: + // - Used a custom pipeline. + // - Demoted the endpoint parameter to be a property in the options class. + /// Initializes a new instance of . + /// The API key to authenticate with the service. + /// is null. + public OpenAIClient(ApiKeyCredential credential) : this(credential, new OpenAIClientOptions()) + { + } + + // CUSTOM: + // - Used a custom pipeline. + // - Demoted the endpoint parameter to be a property in the options class. + /// Initializes a new instance of . + /// The API key to authenticate with the service. + /// The options to configure the client. + /// is null. + public OpenAIClient(ApiKeyCredential credential, OpenAIClientOptions options) + { + Argument.AssertNotNull(credential, nameof(credential)); + options ??= new OpenAIClientOptions(); + + _pipeline = OpenAIClient.CreatePipeline(credential, options); + _endpoint = OpenAIClient.GetEndpoint(options); + _options = options; + } + + // CUSTOM: Added protected internal constructor that takes a ClientPipeline. + /// Initializes a new instance of . + /// The HTTP pipeline to send and receive REST requests and responses. + /// The options to configure the client. + /// is null. + protected internal OpenAIClient(ClientPipeline pipeline, OpenAIClientOptions options) + { + Argument.AssertNotNull(pipeline, nameof(pipeline)); + options ??= new OpenAIClientOptions(); + + _pipeline = pipeline; + _endpoint = OpenAIClient.GetEndpoint(options); + _options = options; + } + + /// + /// Gets a new instance of that reuses the client configuration details provided to + /// the instance. + /// + /// + /// This method is functionally equivalent to using the constructor directly with + /// the same configuration details. + /// + /// A new . 
+ [Experimental("OPENAI001")] + public virtual AssistantClient GetAssistantClient() => new(_pipeline, _options); + + /// + /// Gets a new instance of that reuses the client configuration details provided to + /// the instance. + /// + /// + /// This method is functionally equivalent to using the constructor directly with + /// the same configuration details. + /// + /// A new . + public virtual AudioClient GetAudioClient(string model) => new(_pipeline, model, _options); + + /// + /// Gets a new instance of that reuses the client configuration details provided to + /// the instance. + /// + /// + /// This method is functionally equivalent to using the constructor directly with + /// the same configuration details. + /// + /// A new . + public virtual BatchClient GetBatchClient() => new(_pipeline, _options); + + /// + /// Gets a new instance of that reuses the client configuration details provided to + /// the instance. + /// + /// + /// This method is functionally equivalent to using the constructor directly with + /// the same configuration details. + /// + /// A new . + public virtual ChatClient GetChatClient(string model) => new(_pipeline, model, _options); + + /// + /// Gets a new instance of that reuses the client configuration details provided to + /// the instance. + /// + /// + /// This method is functionally equivalent to using the constructor directly with + /// the same configuration details. + /// + /// A new . + public virtual EmbeddingClient GetEmbeddingClient(string model) => new(_pipeline, model, _options); + + /// + /// Gets a new instance of that reuses the client configuration details provided to + /// the instance. + /// + /// + /// This method is functionally equivalent to using the constructor directly with + /// the same configuration details. + /// + /// A new . 
+ public virtual FileClient GetFileClient() => new(_pipeline, _options); + + /// + /// Gets a new instance of that reuses the client configuration details provided to + /// the instance. + /// + /// + /// This method is functionally equivalent to using the constructor directly with + /// the same configuration details. + /// + /// A new . + public virtual FineTuningClient GetFineTuningClient() => new(_pipeline, _options); + + /// + /// Gets a new instance of that reuses the client configuration details provided to + /// the instance. + /// + /// + /// This method is functionally equivalent to using the constructor directly with + /// the same configuration details. + /// + /// A new . + public virtual ImageClient GetImageClient(string model) => new(_pipeline, model, _options); + + /// + /// Gets a new instance of that reuses the client configuration details provided to + /// the instance. + /// + /// + /// This method is functionally equivalent to using the constructor directly with + /// the same configuration details. + /// + /// A new . + public virtual ModelClient GetModelClient() => new(_pipeline, _options); + + /// + /// Gets a new instance of that reuses the client configuration details provided to + /// the instance. + /// + /// + /// This method is functionally equivalent to using the constructor directly with + /// the same configuration details. + /// + /// A new . + public virtual ModerationClient GetModerationClient(string model) => new(_pipeline, model, _options); + + /// + /// Gets a new instance of that reuses the client configuration details provided to + /// the instance. + /// + /// + /// This method is functionally equivalent to using the constructor directly with + /// the same configuration details. + /// + /// A new . 
+ [Experimental("OPENAI001")] + public virtual VectorStoreClient GetVectorStoreClient() => new(_pipeline, _options); + + internal static ClientPipeline CreatePipeline(ApiKeyCredential credential, OpenAIClientOptions options) + { + return ClientPipeline.Create( + options, + perCallPolicies: [ + CreateAddBetaFeatureHeaderPolicy(), + CreateAddCustomHeadersPolicy(options), + ], + perTryPolicies: [ + ApiKeyAuthenticationPolicy.CreateHeaderApiKeyPolicy(credential, AuthorizationHeader, AuthorizationApiKeyPrefix) + ], + beforeTransportPolicies: [ + ]); + } + + internal static Uri GetEndpoint(OpenAIClientOptions options = null) + { + return options?.Endpoint ?? new(OpenAIV1Endpoint); + } + + private static PipelinePolicy CreateAddBetaFeatureHeaderPolicy() + { + return new GenericActionPipelinePolicy((message) => + { + if (message?.Request?.Headers?.TryGetValue(KnownHeaderNames.OpenAIBeta, out string _) == false) + { + message.Request.Headers.Set(KnownHeaderNames.OpenAIBeta, OpenAIBetaHeaderValue); + } + }); + } + + private static PipelinePolicy CreateAddCustomHeadersPolicy(OpenAIClientOptions options = null) + { + TelemetryDetails telemetryDetails = new(typeof(OpenAIClientOptions).Assembly, options?.ApplicationId); + return new GenericActionPipelinePolicy((message) => + { + if (message?.Request?.Headers?.TryGetValue(KnownHeaderNames.UserAgent, out string _) == false) + { + message.Request.Headers.Set(KnownHeaderNames.UserAgent, telemetryDetails.ToString()); + } + + if (!string.IsNullOrEmpty(options?.OrganizationId)) + { + message.Request.Headers.Set(KnownHeaderNames.OpenAIOrganization, options.OrganizationId); + } + + if (!string.IsNullOrEmpty(options?.ProjectId)) + { + message.Request.Headers.Set(KnownHeaderNames.OpenAIProject, options.ProjectId); + } + }); + } +} diff --git a/.dotnet/src/Custom/OpenAIClientOptions.cs b/.dotnet/src/Custom/OpenAIClientOptions.cs new file mode 100644 index 000000000..ca7937575 --- /dev/null +++ b/.dotnet/src/Custom/OpenAIClientOptions.cs @@ 
-0,0 +1,73 @@ +using System; +using System.ClientModel.Primitives; + +namespace OpenAI; + +/// The options to configure the client. +[CodeGenModel("OpenAIClientOptions")] +public partial class OpenAIClientOptions : ClientPipelineOptions +{ + private Uri _endpoint; + private string _organizationId; + private string _projectId; + private string _applicationId; + + /// + /// The service endpoint that the client will send requests to. If not set, the default endpoint will be used. + /// + public Uri Endpoint + { + get => _endpoint; + set + { + AssertNotFrozen(); + _endpoint = value; + } + } + + /// + /// The value to use for the OpenAI-Organization request header. Users who belong to multiple organizations + /// can set this value to specify which organization is used for an API request. Usage from these API requests will + /// count against the specified organization's quota. If not set, the header will be omitted, and the default + /// organization will be billed. You can change your default organization in your user settings. + /// Learn more. + /// + public string OrganizationId + { + get => _organizationId; + set + { + AssertNotFrozen(); + _organizationId = value; + } + } + + /// + /// The value to use for the OpenAI-Project request header. Users who are accessing their projects through + /// their legacy user API key can set this value to specify which project is used for an API request. Usage from + /// these API requests will count as usage for the specified project. If not set, the header will be omitted, and + /// the default project will be accessed. + /// + public string ProjectId + { + get => _projectId; + set + { + AssertNotFrozen(); + _projectId = value; + } + } + + /// + /// An optional application ID to use as part of the request User-Agent header. 
+ /// + public string ApplicationId + { + get => _applicationId; + set + { + AssertNotFrozen(); + _applicationId = value; + } + } +} diff --git a/.dotnet/src/Custom/OpenAIError.cs b/.dotnet/src/Custom/OpenAIError.cs new file mode 100644 index 000000000..ce82f6d7c --- /dev/null +++ b/.dotnet/src/Custom/OpenAIError.cs @@ -0,0 +1,53 @@ +using System; +using System.ClientModel.Primitives; +using System.Text; +using System.Text.Json; + +namespace OpenAI.Internal; + +// Custom: +// - Renamed +// - 'FromResponse' added for convenience with parent type +// - 'ToExceptionMessage' added for encapsulated message formatting + +[CodeGenModel("Error")] +internal partial class OpenAIError +{ + internal static OpenAIError TryCreateFromResponse(PipelineResponse response) + { + try + { + using JsonDocument errorDocument = JsonDocument.Parse(response.Content); + OpenAIErrorResponse errorResponse + = OpenAIErrorResponse.DeserializeOpenAIErrorResponse(errorDocument.RootElement); + return errorResponse.Error; + } + catch (InvalidOperationException) + { + return null; + } + catch (JsonException) + { + return null; + } + } + + public string ToExceptionMessage(int httpStatus) + { + StringBuilder messageBuilder = new(); + messageBuilder.Append("HTTP ").Append(httpStatus).Append(" (").Append(Type).Append(": ").Append(Code).AppendLine(")"); + if (!string.IsNullOrEmpty(Param)) + { + messageBuilder.Append("Parameter: ").AppendLine(Param); + } + messageBuilder.AppendLine(); + messageBuilder.Append(Message); + return messageBuilder.ToString(); + } +} + +// Custom: +// - Renamed + +[CodeGenModel("ErrorResponse")] +internal partial class OpenAIErrorResponse { } diff --git a/.dotnet/src/Custom/OpenAIModelFactory.cs b/.dotnet/src/Custom/OpenAIModelFactory.cs new file mode 100644 index 000000000..bd049d8c4 --- /dev/null +++ b/.dotnet/src/Custom/OpenAIModelFactory.cs @@ -0,0 +1,36 @@ +using OpenAI.Embeddings; +using System; +using System.Collections.Generic; +using System.Linq; + +namespace OpenAI; + 
+[CodeGenModel("OpenAIModelFactory")]
+internal static partial class OpenAIModelFactory
+{
+    /// Initializes a new instance of .
+    /// The list of embeddings generated by the model.
+    /// The name of the model used to generate the embedding.
+    /// The usage information for the request.
+    /// A new instance for mocking.
+    public static EmbeddingCollection EmbeddingCollection(IEnumerable<Embedding> data = null, EmbeddingTokenUsage usage = null, string model = null)
+    {
+        data ??= new List<Embedding>();
+
+        return new EmbeddingCollection(data?.ToList(), model, InternalCreateEmbeddingResponseObject.List, usage, serializedAdditionalRawData: null);
+    }
+
+    /// Initializes a new instance of .
+    /// The index of the embedding in the list of embeddings.
+    ///
+    /// The embedding vector, which is a list of floats. The length of vector depends on the model as
+    /// listed in the [embedding guide](/docs/guides/embeddings).
+    ///
+    /// A new instance for mocking.
+    public static Embedding Embedding(ReadOnlyMemory<float> vector = default, int index = default)
+    {
+        // TODO: Vector must be converted to base64-encoded string.
+        return new Embedding(index, BinaryData.FromObjectAsJson(vector), InternalEmbeddingObject.Embedding, serializedAdditionalRawData: null);
+    }
+
+}
diff --git a/.dotnet/src/Custom/VectorStores/FileChunkingStrategy.cs b/.dotnet/src/Custom/VectorStores/FileChunkingStrategy.cs
new file mode 100644
index 000000000..0f642d66e
--- /dev/null
+++ b/.dotnet/src/Custom/VectorStores/FileChunkingStrategy.cs
@@ -0,0 +1,36 @@
+namespace OpenAI.VectorStores;
+
+[CodeGenModel("FileChunkingStrategyResponseParam")]
+public abstract partial class FileChunkingStrategy
+{
+    /// 
+    /// Gets a value representing the default, automatic selection for a file chunking strategy.
+    /// 
+    /// 
+    /// This value is only valid on vector store requests. response instances
+    /// will report the concrete chunking strategy applied after automatic selection.
+ /// + public static FileChunkingStrategy Auto => _autoValue ??= new(); + + /// + /// Gets a value representing the other, unknown strategy type. + /// + /// + /// This value is present on responses when no chunking strategy could be found. This is typically only true for + /// vector stores created earlier than file chunking strategy availability. + /// + public static FileChunkingStrategy Unknown => _unknownValue ??= new(); + + /// + public static FileChunkingStrategy CreateStaticStrategy( + int maxTokensPerChunk, + int overlappingTokenCount) + { + return new StaticFileChunkingStrategy( + maxTokensPerChunk, + overlappingTokenCount); + } + + private static InternalAutoChunkingStrategy _autoValue; + private static InternalUnknownChunkingStrategy _unknownValue; +} diff --git a/.dotnet/src/Custom/VectorStores/Internal/GeneratorStubs.cs b/.dotnet/src/Custom/VectorStores/Internal/GeneratorStubs.cs new file mode 100644 index 000000000..3fa00a75b --- /dev/null +++ b/.dotnet/src/Custom/VectorStores/Internal/GeneratorStubs.cs @@ -0,0 +1,69 @@ +namespace OpenAI.VectorStores; + +// CUSTOM: Made internal. 
+ +[CodeGenModel("CreateVectorStoreFileBatchRequest")] +internal partial class InternalCreateVectorStoreFileBatchRequest { } + +[CodeGenModel("CreateVectorStoreFileRequest")] +internal partial class InternalCreateVectorStoreFileRequest {} + +[CodeGenModel("DeleteVectorStoreFileResponse")] +internal partial class InternalDeleteVectorStoreFileResponse { } + +[CodeGenModel("DeleteVectorStoreFileResponseObject")] +internal readonly partial struct InternalDeleteVectorStoreFileResponseObject { } + +[CodeGenModel("DeleteVectorStoreResponse")] +internal partial class InternalDeleteVectorStoreResponse { } + +[CodeGenModel("DeleteVectorStoreResponseObject")] +internal readonly partial struct InternalDeleteVectorStoreResponseObject { } + +[CodeGenModel("ListVectorStoreFilesResponse")] +internal partial class InternalListVectorStoreFilesResponse : IInternalListResponse { } + +[CodeGenModel("ListVectorStoreFilesResponseObject")] +internal readonly partial struct InternalListVectorStoreFilesResponseObject { } + +[CodeGenModel("ListVectorStoresResponse")] +internal partial class InternalListVectorStoresResponse : IInternalListResponse { } + +[CodeGenModel("ListVectorStoresResponseObject")] +internal readonly partial struct InternalListVectorStoresResponseObject { } + +[CodeGenModel("VectorStoreFileBatchObjectFileCounts")] +internal partial class InternalVectorStoreFileBatchObjectFileCounts { } + +[CodeGenModel("VectorStoreFileBatchObjectObject")] +internal readonly partial struct InternalVectorStoreFileBatchObjectObject { } + +[CodeGenModel("VectorStoreFileObjectObject")] +internal readonly partial struct InternalVectorStoreFileObjectObject { } + +[CodeGenModel("VectorStoreObjectObject")] +internal readonly partial struct InternalVectorStoreObjectObject { } + +[CodeGenModel("StaticChunkingStrategy")] +internal partial class InternalStaticChunkingStrategyDetails { } + +[CodeGenModel("FileChunkingStrategyRequestParam")] +internal partial class 
InternalFileChunkingStrategyRequestParam { } + +[CodeGenModel("AutoChunkingStrategyRequestParam")] +internal partial class InternalAutoChunkingStrategyRequestParam { } + +[CodeGenModel("StaticChunkingStrategyRequestParam")] +internal partial class InternalStaticChunkingStrategyRequestParam { } + +[CodeGenModel("UnknownFileChunkingStrategyRequestParam")] +internal partial class InternalUnknownFileChunkingStrategyRequestParamProxy { } + +[CodeGenModel("AutoChunkingStrategyResponseParam")] +internal partial class InternalAutoChunkingStrategy { } + +[CodeGenModel("OtherChunkingStrategyResponseParam")] +internal partial class InternalUnknownChunkingStrategy { } + +[CodeGenModel("UnknownFileChunkingStrategyResponseParam")] +internal partial class InternalUnknownFileChunkingStrategyResponseParamProxy { } diff --git a/.dotnet/src/Custom/VectorStores/Internal/InternalCreateVectorStoreFileRequest.cs b/.dotnet/src/Custom/VectorStores/Internal/InternalCreateVectorStoreFileRequest.cs new file mode 100644 index 000000000..338a848e8 --- /dev/null +++ b/.dotnet/src/Custom/VectorStores/Internal/InternalCreateVectorStoreFileRequest.cs @@ -0,0 +1,10 @@ +using System; +using System.Collections.Generic; + +namespace OpenAI.VectorStores; + +internal partial class InternalCreateVectorStoreFileRequest +{ + [CodeGenMember("ChunkingStrategy")] + public FileChunkingStrategy ChunkingStrategy { get; set; } +} diff --git a/.dotnet/src/Custom/VectorStores/Internal/Pagination/VectorStoreFileBatchesPageEnumerator.cs b/.dotnet/src/Custom/VectorStores/Internal/Pagination/VectorStoreFileBatchesPageEnumerator.cs new file mode 100644 index 000000000..3212db4d7 --- /dev/null +++ b/.dotnet/src/Custom/VectorStores/Internal/Pagination/VectorStoreFileBatchesPageEnumerator.cs @@ -0,0 +1,157 @@ +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Text.Json; +using System.Threading.Tasks; + +#nullable enable + +namespace OpenAI.VectorStores; + +internal partial class 
VectorStoreFileBatchesPageEnumerator : PageEnumerator<VectorStoreFileAssociation>
+{
+    private readonly ClientPipeline _pipeline;
+    private readonly Uri _endpoint;
+
+    private readonly string _vectorStoreId;
+    private readonly string _batchId;
+    private readonly int? _limit;
+    private readonly string? _order;
+
+    private readonly string? _before;
+    private readonly string? _filter;
+    private readonly RequestOptions _options;
+
+    private string? _after;
+
+    public virtual ClientPipeline Pipeline => _pipeline;
+
+    public VectorStoreFileBatchesPageEnumerator(
+        ClientPipeline pipeline,
+        Uri endpoint,
+        string vectorStoreId, string batchId, int? limit, string? order, string? after, string? before, string? filter,
+        RequestOptions options)
+    {
+        _pipeline = pipeline;
+        _endpoint = endpoint;
+
+        _vectorStoreId = vectorStoreId;
+        _batchId = batchId;
+
+        _limit = limit;
+        _order = order;
+        _after = after;
+        _before = before;
+        _filter = filter;
+
+        _options = options;
+    }
+
+    public override async Task<ClientResult> GetFirstAsync()
+        => await GetFileAssociationsAsync(_vectorStoreId, _batchId, _limit, _order, _after, _before, _filter, _options).ConfigureAwait(false);
+
+    public override ClientResult GetFirst()
+        => GetFileAssociations(_vectorStoreId, _batchId, _limit, _order, _after, _before, _filter, _options);
+
+    public override async Task<ClientResult> GetNextAsync(ClientResult result)
+    {
+        PipelineResponse response = result.GetRawResponse();
+
+        using JsonDocument doc = JsonDocument.Parse(response.Content);
+        _after = doc.RootElement.GetProperty("last_id"u8).GetString()!;
+
+        return await GetFileAssociationsAsync(_vectorStoreId, _batchId, _limit, _order, _after, _before, _filter, _options).ConfigureAwait(false);
+    }
+
+    public override ClientResult GetNext(ClientResult result)
+    {
+        PipelineResponse response = result.GetRawResponse();
+
+        using JsonDocument doc = JsonDocument.Parse(response.Content);
+        _after = doc.RootElement.GetProperty("last_id"u8).GetString()!;
+
+        return GetFileAssociations(_vectorStoreId, _batchId, _limit, _order, _after, _before, _filter, _options);
+    }
+
+    public override bool HasNext(ClientResult result)
+    {
+        PipelineResponse response = result.GetRawResponse();
+
+        using JsonDocument doc = JsonDocument.Parse(response.Content);
+        bool hasMore = doc.RootElement.GetProperty("has_more"u8).GetBoolean();
+
+        return hasMore;
+    }
+
+    public override PageResult<VectorStoreFileAssociation> GetPageFromResult(ClientResult result)
+    {
+        PipelineResponse response = result.GetRawResponse();
+
+        InternalListVectorStoreFilesResponse list = ModelReaderWriter.Read<InternalListVectorStoreFilesResponse>(response.Content)!;
+
+        VectorStoreFileBatchesPageToken pageToken = VectorStoreFileBatchesPageToken.FromOptions(_vectorStoreId, _batchId, _limit, _order, _after, _before, _filter);
+        VectorStoreFileBatchesPageToken? nextPageToken = pageToken.GetNextPageToken(list.HasMore, list.LastId);
+
+        return PageResult<VectorStoreFileAssociation>.Create(list.Data, pageToken, nextPageToken, response);
+    }
+
+    internal virtual async Task<ClientResult> GetFileAssociationsAsync(string vectorStoreId, string batchId, int? limit, string? order, string? after, string? before, string? filter, RequestOptions options)
+    {
+        Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId));
+        Argument.AssertNotNullOrEmpty(batchId, nameof(batchId));
+
+        using PipelineMessage message = CreateGetFilesInVectorStoreBatchesRequest(vectorStoreId, batchId, limit, order, after, before, filter, options);
+        return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false));
+    }
+
+    internal virtual ClientResult GetFileAssociations(string vectorStoreId, string batchId, int? limit, string? order, string? after, string? before, string?
filter, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId)); + Argument.AssertNotNullOrEmpty(batchId, nameof(batchId)); + + using PipelineMessage message = CreateGetFilesInVectorStoreBatchesRequest(vectorStoreId, batchId, limit, order, after, before, filter, options); + return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options)); + } + + internal PipelineMessage CreateGetFilesInVectorStoreBatchesRequest(string vectorStoreId, string batchId, int? limit, string? order, string? after, string? before, string? filter, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "GET"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/vector_stores/", false); + uri.AppendPath(vectorStoreId, true); + uri.AppendPath("/file_batches/", false); + uri.AppendPath(batchId, true); + uri.AppendPath("/files", false); + if (limit != null) + { + uri.AppendQuery("limit", limit.Value, true); + } + if (order != null) + { + uri.AppendQuery("order", order, true); + } + if (after != null) + { + uri.AppendQuery("after", after, true); + } + if (before != null) + { + uri.AppendQuery("before", before, true); + } + if (filter != null) + { + uri.AppendQuery("filter", filter, true); + } + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + message.Apply(options); + return message; + } + + private static PipelineMessageClassifier? 
_pipelineMessageClassifier200; + private static PipelineMessageClassifier PipelineMessageClassifier200 => _pipelineMessageClassifier200 ??= PipelineMessageClassifier.Create(stackalloc ushort[] { 200 }); +} diff --git a/.dotnet/src/Custom/VectorStores/Internal/Pagination/VectorStoreFileBatchesPageToken.cs b/.dotnet/src/Custom/VectorStores/Internal/Pagination/VectorStoreFileBatchesPageToken.cs new file mode 100644 index 000000000..50f807901 --- /dev/null +++ b/.dotnet/src/Custom/VectorStores/Internal/Pagination/VectorStoreFileBatchesPageToken.cs @@ -0,0 +1,183 @@ +using System; +using System.ClientModel; +using System.Diagnostics; +using System.IO; +using System.Text.Json; + +#nullable enable + +namespace OpenAI.VectorStores; + +internal class VectorStoreFileBatchesPageToken : ContinuationToken +{ + protected VectorStoreFileBatchesPageToken(string vectorStoreId,string batchId, int? limit, string? order, string? after, string? before, string? filter) + { + VectorStoreId = vectorStoreId; + BatchId = batchId; + + Limit = limit; + Order = order; + After = after; + Before = before; + Filter = filter; + } + + public string VectorStoreId { get; } + + public string BatchId { get; } + + public int? Limit { get; } + + public string? Order { get; } + + public string? After { get; } + + public string? Before { get; } + + public string? 
Filter { get; } + + public override BinaryData ToBytes() + { + using MemoryStream stream = new(); + using Utf8JsonWriter writer = new(stream); + + writer.WriteStartObject(); + writer.WriteString("vectorStoreId", VectorStoreId); + writer.WriteString("batchId", BatchId); + + if (Limit.HasValue) + { + writer.WriteNumber("limit", Limit.Value); + } + + if (Order is not null) + { + writer.WriteString("order", Order); + } + + if (After is not null) + { + writer.WriteString("after", After); + } + + if (Before is not null) + { + writer.WriteString("before", Before); + } + + if (Filter is not null) + { + writer.WriteString("filter", Filter); + } + + writer.WriteEndObject(); + + writer.Flush(); + stream.Position = 0; + + return BinaryData.FromStream(stream); + } + + public VectorStoreFileBatchesPageToken? GetNextPageToken(bool hasMore, string? lastId) + { + if (!hasMore || lastId is null) + { + return null; + } + + return new(VectorStoreId, BatchId, Limit, Order, lastId, Before, Filter); + } + + public static VectorStoreFileBatchesPageToken FromToken(ContinuationToken pageToken) + { + if (pageToken is VectorStoreFileBatchesPageToken token) + { + return token; + } + + BinaryData data = pageToken.ToBytes(); + + if (data.ToMemory().Length == 0) + { + throw new ArgumentException("Failed to create VectorStoreFileBatchesPageToken from provided pageToken.", nameof(pageToken)); + } + + Utf8JsonReader reader = new(data); + + string vectorStoreId = null!; + string batchId = null!; + int? limit = null; + string? order = null; + string? after = null; + string? before = null; + string? 
filter = null; + + reader.Read(); + + Debug.Assert(reader.TokenType == JsonTokenType.StartObject); + + while (reader.Read()) + { + if (reader.TokenType == JsonTokenType.EndObject) + { + break; + } + + Debug.Assert(reader.TokenType == JsonTokenType.PropertyName); + + string propertyName = reader.GetString()!; + + switch (propertyName) + { + case "vectorStoreId": + reader.Read(); + Debug.Assert(reader.TokenType == JsonTokenType.String); + vectorStoreId = reader.GetString()!; + break; + case "batchId": + reader.Read(); + Debug.Assert(reader.TokenType == JsonTokenType.String); + batchId = reader.GetString()!; + break; + case "limit": + reader.Read(); + Debug.Assert(reader.TokenType == JsonTokenType.Number); + limit = reader.GetInt32(); + break; + case "order": + reader.Read(); + Debug.Assert(reader.TokenType == JsonTokenType.String); + order = reader.GetString(); + break; + case "after": + reader.Read(); + Debug.Assert(reader.TokenType == JsonTokenType.String); + after = reader.GetString(); + break; + case "before": + reader.Read(); + Debug.Assert(reader.TokenType == JsonTokenType.String); + before = reader.GetString(); + break; + case "filter": + reader.Read(); + Debug.Assert(reader.TokenType == JsonTokenType.String); + filter = reader.GetString(); + break; + default: + throw new JsonException($"Unrecognized property '{propertyName}'."); + } + } + + if (vectorStoreId is null || + batchId is null) + { + throw new ArgumentException("Failed to create VectorStoreFileBatchesPageToken from provided pageToken.", nameof(pageToken)); + } + + return new(vectorStoreId, batchId, limit, order, after, before, filter); + } + + public static VectorStoreFileBatchesPageToken FromOptions(string vectorStoreId, string batchId, int? limit, string? order, string? after, string? before, string? 
filter) + => new(vectorStoreId, batchId, limit, order, after, before, filter); +} \ No newline at end of file diff --git a/.dotnet/src/Custom/VectorStores/Internal/Pagination/VectorStoreFilesPageEnumerator.cs b/.dotnet/src/Custom/VectorStores/Internal/Pagination/VectorStoreFilesPageEnumerator.cs new file mode 100644 index 000000000..81506b05d --- /dev/null +++ b/.dotnet/src/Custom/VectorStores/Internal/Pagination/VectorStoreFilesPageEnumerator.cs @@ -0,0 +1,150 @@ +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Text.Json; +using System.Threading.Tasks; + +#nullable enable + +namespace OpenAI.VectorStores; + +internal partial class VectorStoreFilesPageEnumerator : PageEnumerator +{ + private readonly ClientPipeline _pipeline; + private readonly Uri _endpoint; + + private readonly string _vectorStoreId; + private readonly int? _limit; + private readonly string? _order; + + private readonly string? _before; + private readonly string? _filter; + private readonly RequestOptions _options; + + private string? _after; + + public virtual ClientPipeline Pipeline => _pipeline; + + public VectorStoreFilesPageEnumerator( + ClientPipeline pipeline, + Uri endpoint, + string vectorStoreId, + int? limit, string? order, string? after, string? before, string? 
filter, + RequestOptions options) + { + _pipeline = pipeline; + _endpoint = endpoint; + + _vectorStoreId = vectorStoreId; + _limit = limit; + _order = order; + _after = after; + _before = before; + _filter = filter; + _options = options; + } + + public override async Task GetFirstAsync() + => await GetFileAssociationsAsync(_vectorStoreId, _limit, _order, _after, _before, _filter, _options).ConfigureAwait(false); + + public override ClientResult GetFirst() + => GetFileAssociations(_vectorStoreId, _limit, _order, _after, _before, _filter, _options); + + public override async Task GetNextAsync(ClientResult result) + { + PipelineResponse response = result.GetRawResponse(); + + using JsonDocument doc = JsonDocument.Parse(response.Content); + _after = doc.RootElement.GetProperty("last_id"u8).GetString()!; + + return await GetFileAssociationsAsync(_vectorStoreId, _limit, _order, _after, _before, _filter, _options).ConfigureAwait(false); + } + + public override ClientResult GetNext(ClientResult result) + { + PipelineResponse response = result.GetRawResponse(); + + using JsonDocument doc = JsonDocument.Parse(response.Content); + _after = doc.RootElement.GetProperty("last_id"u8).GetString()!; + + return GetFileAssociations(_vectorStoreId, _limit, _order, _after, _before, _filter, _options); + } + + public override bool HasNext(ClientResult result) + { + PipelineResponse response = result.GetRawResponse(); + + using JsonDocument doc = JsonDocument.Parse(response.Content); + bool hasMore = doc.RootElement.GetProperty("has_more"u8).GetBoolean(); + + return hasMore; + } + + public override PageResult GetPageFromResult(ClientResult result) + { + PipelineResponse response = result.GetRawResponse(); + + InternalListVectorStoreFilesResponse list = ModelReaderWriter.Read(response.Content)!; + + VectorStoreFilesPageToken pageToken = VectorStoreFilesPageToken.FromOptions(_vectorStoreId, _limit, _order, _after, _before, _filter); + VectorStoreFilesPageToken? 
nextPageToken = pageToken.GetNextPageToken(list.HasMore, list.LastId); + + return PageResult.Create(list.Data, pageToken, nextPageToken, response); + } + + internal virtual async Task GetFileAssociationsAsync(string vectorStoreId, int? limit, string? order, string? after, string? before, string? filter, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId)); + + using PipelineMessage message = CreateGetVectorStoreFilesRequest(vectorStoreId, limit, order, after, before, filter, options); + return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + internal virtual ClientResult GetFileAssociations(string vectorStoreId, int? limit, string? order, string? after, string? before, string? filter, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId)); + + using PipelineMessage message = CreateGetVectorStoreFilesRequest(vectorStoreId, limit, order, after, before, filter, options); + return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options)); + } + + internal PipelineMessage CreateGetVectorStoreFilesRequest(string vectorStoreId, int? limit, string? order, string? after, string? before, string? 
filter, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "GET"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/vector_stores/", false); + uri.AppendPath(vectorStoreId, true); + uri.AppendPath("/files", false); + if (limit != null) + { + uri.AppendQuery("limit", limit.Value, true); + } + if (order != null) + { + uri.AppendQuery("order", order, true); + } + if (after != null) + { + uri.AppendQuery("after", after, true); + } + if (before != null) + { + uri.AppendQuery("before", before, true); + } + if (filter != null) + { + uri.AppendQuery("filter", filter, true); + } + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + message.Apply(options); + return message; + } + + private static PipelineMessageClassifier? _pipelineMessageClassifier200; + private static PipelineMessageClassifier PipelineMessageClassifier200 => _pipelineMessageClassifier200 ??= PipelineMessageClassifier.Create(stackalloc ushort[] { 200 }); +} diff --git a/.dotnet/src/Custom/VectorStores/Internal/Pagination/VectorStoreFilesPageToken.cs b/.dotnet/src/Custom/VectorStores/Internal/Pagination/VectorStoreFilesPageToken.cs new file mode 100644 index 000000000..c5112807f --- /dev/null +++ b/.dotnet/src/Custom/VectorStores/Internal/Pagination/VectorStoreFilesPageToken.cs @@ -0,0 +1,171 @@ +using System; +using System.ClientModel; +using System.Diagnostics; +using System.IO; +using System.Text.Json; + +#nullable enable + +namespace OpenAI.VectorStores; + +internal class VectorStoreFilesPageToken : ContinuationToken +{ + protected VectorStoreFilesPageToken(string vectorStoreId, int? limit, string? order, string? after, string? before, string? 
filter) + { + VectorStoreId = vectorStoreId; + + Limit = limit; + Order = order; + After = after; + Before = before; + Filter = filter; + } + public string VectorStoreId { get; } + + public int? Limit { get; } + + public string? Order { get; } + + public string? After { get; } + + public string? Before { get; } + + public string? Filter { get; } + + public override BinaryData ToBytes() + { + using MemoryStream stream = new(); + using Utf8JsonWriter writer = new(stream); + + writer.WriteStartObject(); + writer.WriteString("vectorStoreId", VectorStoreId); + + if (Limit.HasValue) + { + writer.WriteNumber("limit", Limit.Value); + } + + if (Order is not null) + { + writer.WriteString("order", Order); + } + + if (After is not null) + { + writer.WriteString("after", After); + } + + if (Before is not null) + { + writer.WriteString("before", Before); + } + + if (Filter is not null) + { + writer.WriteString("filter", Filter); + } + + writer.WriteEndObject(); + + writer.Flush(); + stream.Position = 0; + + return BinaryData.FromStream(stream); + } + + public VectorStoreFilesPageToken? GetNextPageToken(bool hasMore, string? lastId) + { + if (!hasMore || lastId is null) + { + return null; + } + + return new(VectorStoreId, Limit, Order, lastId, Before, Filter); + } + + public static VectorStoreFilesPageToken FromToken(ContinuationToken pageToken) + { + if (pageToken is VectorStoreFilesPageToken token) + { + return token; + } + + BinaryData data = pageToken.ToBytes(); + + if (data.ToMemory().Length == 0) + { + throw new ArgumentException("Failed to create VectorStoreFilesPageToken from provided pageToken.", nameof(pageToken)); + } + + Utf8JsonReader reader = new(data); + + string vectorStoreId = null!; + int? limit = null; + string? order = null; + string? after = null; + string? before = null; + string? 
filter = null; + + reader.Read(); + + Debug.Assert(reader.TokenType == JsonTokenType.StartObject); + + while (reader.Read()) + { + if (reader.TokenType == JsonTokenType.EndObject) + { + break; + } + + Debug.Assert(reader.TokenType == JsonTokenType.PropertyName); + + string propertyName = reader.GetString()!; + + switch (propertyName) + { + case "vectorStoreId": + reader.Read(); + Debug.Assert(reader.TokenType == JsonTokenType.String); + vectorStoreId = reader.GetString()!; + break; + case "limit": + reader.Read(); + Debug.Assert(reader.TokenType == JsonTokenType.Number); + limit = reader.GetInt32(); + break; + case "order": + reader.Read(); + Debug.Assert(reader.TokenType == JsonTokenType.String); + order = reader.GetString(); + break; + case "after": + reader.Read(); + Debug.Assert(reader.TokenType == JsonTokenType.String); + after = reader.GetString(); + break; + case "before": + reader.Read(); + Debug.Assert(reader.TokenType == JsonTokenType.String); + before = reader.GetString(); + break; + case "filter": + reader.Read(); + Debug.Assert(reader.TokenType == JsonTokenType.String); + filter = reader.GetString(); + break; + default: + throw new JsonException($"Unrecognized property '{propertyName}'."); + } + } + + if (vectorStoreId is null) + { + throw new ArgumentException("Failed to create VectorStoreFilesPageToken from provided pageToken.", nameof(pageToken)); + } + + return new(vectorStoreId, limit, order, after, before, filter); + } + + public static VectorStoreFilesPageToken FromOptions(string vectorStoreId, int? limit, string? order, string? after, string? before, string? 
filter) + => new(vectorStoreId, limit, order, after, before, filter); +} \ No newline at end of file diff --git a/.dotnet/src/Custom/VectorStores/Internal/Pagination/VectorStoresPageEnumerator.cs b/.dotnet/src/Custom/VectorStores/Internal/Pagination/VectorStoresPageEnumerator.cs new file mode 100644 index 000000000..d3416a3b5 --- /dev/null +++ b/.dotnet/src/Custom/VectorStores/Internal/Pagination/VectorStoresPageEnumerator.cs @@ -0,0 +1,134 @@ +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Text.Json; +using System.Threading.Tasks; + +#nullable enable + +namespace OpenAI.VectorStores; + +internal partial class VectorStoresPageEnumerator : PageEnumerator +{ + private readonly ClientPipeline _pipeline; + private readonly Uri _endpoint; + + private readonly int? _limit; + private readonly string _order; + private readonly string _before; + private readonly RequestOptions _options; + + private string _after; + + public virtual ClientPipeline Pipeline => _pipeline; + + public VectorStoresPageEnumerator( + ClientPipeline pipeline, + Uri endpoint, + int? 
limit, string order, string after, string before, + RequestOptions options) + { + _pipeline = pipeline; + _endpoint = endpoint; + + _limit = limit; + _order = order; + _after = after; + _before = before; + _options = options; + } + + public override async Task GetFirstAsync() + => await GetVectorStoresAsync(_limit, _order, _after, _before, _options).ConfigureAwait(false); + + public override ClientResult GetFirst() + => GetVectorStores(_limit, _order, _after, _before, _options); + + public override async Task GetNextAsync(ClientResult result) + { + PipelineResponse response = result.GetRawResponse(); + + using JsonDocument doc = JsonDocument.Parse(response.Content); + _after = doc.RootElement.GetProperty("last_id"u8).GetString()!; + + return await GetVectorStoresAsync(_limit, _order, _after, _before, _options).ConfigureAwait(false); + } + + public override ClientResult GetNext(ClientResult result) + { + PipelineResponse response = result.GetRawResponse(); + + using JsonDocument doc = JsonDocument.Parse(response.Content); + _after = doc.RootElement.GetProperty("last_id"u8).GetString()!; + + return GetVectorStores(_limit, _order, _after, _before, _options); + } + + public override bool HasNext(ClientResult result) + { + PipelineResponse response = result.GetRawResponse(); + + using JsonDocument doc = JsonDocument.Parse(response.Content); + bool hasMore = doc.RootElement.GetProperty("has_more"u8).GetBoolean(); + + return hasMore; + } + + public override PageResult GetPageFromResult(ClientResult result) + { + PipelineResponse response = result.GetRawResponse(); + + InternalListVectorStoresResponse list = ModelReaderWriter.Read(response.Content)!; + + VectorStoresPageToken pageToken = VectorStoresPageToken.FromOptions(_limit, _order, _after, _before); + VectorStoresPageToken? 
nextPageToken = pageToken.GetNextPageToken(list.HasMore, list.LastId); + + return PageResult.Create(list.Data, pageToken, nextPageToken, response); + } + + internal virtual async Task GetVectorStoresAsync(int? limit, string order, string after, string before, RequestOptions options) + { + using PipelineMessage message = CreateGetVectorStoresRequest(limit, order, after, before, options); + return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + internal virtual ClientResult GetVectorStores(int? limit, string order, string after, string before, RequestOptions options) + { + using PipelineMessage message = CreateGetVectorStoresRequest(limit, order, after, before, options); + return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options)); + } + + internal PipelineMessage CreateGetVectorStoresRequest(int? limit, string order, string after, string before, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "GET"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/vector_stores", false); + if (limit != null) + { + uri.AppendQuery("limit", limit.Value, true); + } + if (order != null) + { + uri.AppendQuery("order", order, true); + } + if (after != null) + { + uri.AppendQuery("after", after, true); + } + if (before != null) + { + uri.AppendQuery("before", before, true); + } + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + message.Apply(options); + return message; + } + + private static PipelineMessageClassifier? 
_pipelineMessageClassifier200; + private static PipelineMessageClassifier PipelineMessageClassifier200 => _pipelineMessageClassifier200 ??= PipelineMessageClassifier.Create(stackalloc ushort[] { 200 }); +} diff --git a/.dotnet/src/Custom/VectorStores/Internal/Pagination/VectorStoresPageToken.cs b/.dotnet/src/Custom/VectorStores/Internal/Pagination/VectorStoresPageToken.cs new file mode 100644 index 000000000..88134bfcd --- /dev/null +++ b/.dotnet/src/Custom/VectorStores/Internal/Pagination/VectorStoresPageToken.cs @@ -0,0 +1,142 @@ +using System; +using System.ClientModel; +using System.Diagnostics; +using System.IO; +using System.Text.Json; + +#nullable enable + +namespace OpenAI.VectorStores; + +internal class VectorStoresPageToken : ContinuationToken +{ + protected VectorStoresPageToken(int? limit, string? order, string? after, string? before) + { + Limit = limit; + Order = order; + After = after; + Before = before; + } + + public int? Limit { get; } + + public string? Order { get; } + + public string? After { get; } + + public string? Before { get; } + + public override BinaryData ToBytes() + { + using MemoryStream stream = new(); + using Utf8JsonWriter writer = new(stream); + + writer.WriteStartObject(); + + if (Limit.HasValue) + { + writer.WriteNumber("limit", Limit.Value); + } + + if (Order is not null) + { + writer.WriteString("order", Order); + } + + if (After is not null) + { + writer.WriteString("after", After); + } + + if (Before is not null) + { + writer.WriteString("before", Before); + } + + writer.WriteEndObject(); + + writer.Flush(); + stream.Position = 0; + + return BinaryData.FromStream(stream); + } + + public VectorStoresPageToken? GetNextPageToken(bool hasMore, string? 
lastId) + { + if (!hasMore || lastId is null) + { + return null; + } + + return new(Limit, Order, lastId, Before); + } + + public static VectorStoresPageToken FromToken(ContinuationToken pageToken) + { + if (pageToken is VectorStoresPageToken token) + { + return token; + } + + BinaryData data = pageToken.ToBytes(); + + if (data.ToMemory().Length == 0) + { + throw new ArgumentException("Failed to create VectorStoresPageToken from provided pageToken.", nameof(pageToken)); + } + + Utf8JsonReader reader = new(data); + + int? limit = null; + string? order = null; + string? after = null; + string? before = null; + + reader.Read(); + + Debug.Assert(reader.TokenType == JsonTokenType.StartObject); + + while (reader.Read()) + { + if (reader.TokenType == JsonTokenType.EndObject) + { + break; + } + + Debug.Assert(reader.TokenType == JsonTokenType.PropertyName); + + string propertyName = reader.GetString()!; + + switch (propertyName) + { + case "limit": + reader.Read(); + Debug.Assert(reader.TokenType == JsonTokenType.Number); + limit = reader.GetInt32(); + break; + case "order": + reader.Read(); + Debug.Assert(reader.TokenType == JsonTokenType.String); + order = reader.GetString(); + break; + case "after": + reader.Read(); + Debug.Assert(reader.TokenType == JsonTokenType.String); + after = reader.GetString(); + break; + case "before": + reader.Read(); + Debug.Assert(reader.TokenType == JsonTokenType.String); + before = reader.GetString(); + break; + default: + throw new JsonException($"Unrecognized property '{propertyName}'."); + } + } + + return new(limit, order, after, before); + } + + public static VectorStoresPageToken FromOptions(int? limit, string? order, string? after, string? 
before) + => new(limit, order, after, before); +} \ No newline at end of file diff --git a/.dotnet/src/Custom/VectorStores/StaticFileChunkingStrategy.cs b/.dotnet/src/Custom/VectorStores/StaticFileChunkingStrategy.cs new file mode 100644 index 000000000..035f5cc6b --- /dev/null +++ b/.dotnet/src/Custom/VectorStores/StaticFileChunkingStrategy.cs @@ -0,0 +1,39 @@ +using System; +using System.Collections.Generic; + +namespace OpenAI.VectorStores; + +[CodeGenModel("StaticChunkingStrategyResponseParam")] +public partial class StaticFileChunkingStrategy : FileChunkingStrategy +{ + [CodeGenMember("Static")] + private InternalStaticChunkingStrategyDetails _internalDetails; + + /// + /// The maximum size of a file chunk, in tokens. + /// + /// + /// If not otherwise specified, a default of 800 will be used. + /// + public int MaxTokensPerChunk => _internalDetails.MaxChunkSizeTokens; + /// + /// The number of shared, overlapping tokens allowed between chunks. + /// + /// + /// + /// This value may not exceed half of . + /// + /// If not otherwise specified, a default of 400 will be used. + /// + public int OverlappingTokenCount => _internalDetails.ChunkOverlapTokens; + + /// + /// Creates a new instance of , which allows for direct specification of + /// file chunk size and chunk overlap windows. + /// + /// + /// + public StaticFileChunkingStrategy(int maxTokensPerChunk, int overlappingTokenCount) + : this(new InternalStaticChunkingStrategyDetails(maxTokensPerChunk, overlappingTokenCount)) + {} +} diff --git a/.dotnet/src/Custom/VectorStores/VectorStore.cs b/.dotnet/src/Custom/VectorStores/VectorStore.cs new file mode 100644 index 000000000..04b8c79b7 --- /dev/null +++ b/.dotnet/src/Custom/VectorStores/VectorStore.cs @@ -0,0 +1,19 @@ +namespace OpenAI.VectorStores; + +/// +/// A representation of a file storage and indexing container used by the file_search tool for assistants. 
+/// +[CodeGenModel("VectorStoreObject")] +public partial class VectorStore +{ + // CUSTOM: Made internal. + /// The object type, which is always `vector_store`. + [CodeGenMember("Object")] + internal InternalVectorStoreObjectObject Object { get; } = InternalVectorStoreObjectObject.VectorStore; + + /// + /// Gets the policy that controls when this vector store will be automatically deleted. + /// + [CodeGenMember("ExpiresAfter")] + public VectorStoreExpirationPolicy ExpirationPolicy { get; } +} \ No newline at end of file diff --git a/.dotnet/src/Custom/VectorStores/VectorStoreBatchFileJob.cs b/.dotnet/src/Custom/VectorStores/VectorStoreBatchFileJob.cs new file mode 100644 index 000000000..507745bb3 --- /dev/null +++ b/.dotnet/src/Custom/VectorStores/VectorStoreBatchFileJob.cs @@ -0,0 +1,20 @@ +namespace OpenAI.VectorStores; + +/// +/// Represents information about a bulk ingestion job of files into a vector store. +/// +[CodeGenModel("VectorStoreFileBatchObject")] +public partial class VectorStoreBatchFileJob +{ + private readonly object Object; + + /// + /// The ID of the batch file ingestion job into the vector store corresponding to . + /// + [CodeGenMember("Id")] + public string BatchId { get; } + + /// Gets the file counts. 
+ [CodeGenMember("Counts")] + public VectorStoreFileCounts FileCounts { get; } +} \ No newline at end of file diff --git a/.dotnet/src/Custom/VectorStores/VectorStoreBatchFileJobStatus.cs b/.dotnet/src/Custom/VectorStores/VectorStoreBatchFileJobStatus.cs new file mode 100644 index 000000000..84eb5e3f6 --- /dev/null +++ b/.dotnet/src/Custom/VectorStores/VectorStoreBatchFileJobStatus.cs @@ -0,0 +1,6 @@ +namespace OpenAI.VectorStores; + +[CodeGenModel("VectorStoreFileBatchObjectStatus")] +public readonly partial struct VectorStoreBatchFileJobStatus +{ +} \ No newline at end of file diff --git a/.dotnet/src/Custom/VectorStores/VectorStoreClient.Convenience.cs b/.dotnet/src/Custom/VectorStores/VectorStoreClient.Convenience.cs new file mode 100644 index 000000000..ee48aaca4 --- /dev/null +++ b/.dotnet/src/Custom/VectorStores/VectorStoreClient.Convenience.cs @@ -0,0 +1,247 @@ +using OpenAI.Files; +using System.ClientModel; +using System.Collections.Generic; +using System.Linq; +using System.Threading.Tasks; + +namespace OpenAI.VectorStores; + +public partial class VectorStoreClient +{ + /// + /// Modifies an existing vector store. + /// + /// The vector store to modify. + /// The new options to apply to the vector store. + /// The modified vector store instance. + public virtual Task> ModifyVectorStoreAsync(VectorStore vectorStore, VectorStoreModificationOptions options) + => ModifyVectorStoreAsync(vectorStore?.Id, options); + + /// + /// Modifies an existing vector store. + /// + /// The vector store to modify. + /// The new options to apply to the vector store. + /// The modified vector store instance. + public virtual ClientResult ModifyVectorStore(VectorStore vectorStore, VectorStoreModificationOptions options) + => ModifyVectorStore(vectorStore?.Id, options); + + /// + /// Gets an up-to-date instance of an existing vector store. + /// + /// The existing vector store instance to get an updated instance of. + /// The refreshed vector store instance. 
+ public virtual Task> GetVectorStoreAsync(VectorStore vectorStore) + => GetVectorStoreAsync(vectorStore?.Id); + + /// + /// Gets an up-to-date instance of an existing vector store. + /// + /// The existing vector store instance to get an updated instance of. + /// The refreshed vector store instance. + public virtual ClientResult GetVectorStore(VectorStore vectorStore) + => GetVectorStore(vectorStore?.Id); + + /// + /// Deletes a vector store. + /// + /// The vector store to delete. + /// A value indicating whether the deletion operation was successful. + public virtual Task> DeleteVectorStoreAsync(VectorStore vectorStore) + => DeleteVectorStoreAsync(vectorStore?.Id); + + /// + /// Deletes a vector store. + /// + /// The vector store to delete. + /// A value indicating whether the deletion operation was successful. + public virtual ClientResult DeleteVectorStore(VectorStore vectorStore) + => DeleteVectorStore(vectorStore?.Id); + + /// + /// Associates an uploaded file with a vector store, beginning ingestion of the file into the vector store. + /// + /// The vector store to associate the file with. + /// The file to associate with the vector store. + /// + /// A instance that represents the new association. + /// + public virtual Task> AddFileToVectorStoreAsync(VectorStore vectorStore, OpenAIFileInfo file) + => AddFileToVectorStoreAsync(vectorStore?.Id, file?.Id); + + /// + /// Associates an uploaded file with a vector store, beginning ingestion of the file into the vector store. + /// + /// The vector store to associate the file with. + /// The file to associate with the vector store. + /// + /// A instance that represents the new association. + /// + public virtual ClientResult AddFileToVectorStore(VectorStore vectorStore, OpenAIFileInfo file) + => AddFileToVectorStore(vectorStore?.Id, file?.Id); + + /// + /// Gets a page collection holding instances that represent file inclusions in the + /// specified vector store. 
+ /// + /// + /// The vector store to enumerate the file associations of. + /// + /// Options describing the collection to return. + /// holds pages of values. To obtain a collection of values, call + /// . To obtain the current + /// page of values, call . + /// A collection of pages of . + public virtual AsyncPageCollection GetFileAssociationsAsync( + VectorStore vectorStore, + VectorStoreFileAssociationCollectionOptions options = default) + => GetFileAssociationsAsync(vectorStore?.Id, options); + + /// + /// Gets a page collection holding instances that represent file inclusions in the + /// specified vector store. + /// + /// + /// The vector store to enumerate the file associations of. + /// + /// Options describing the collection to return. + /// holds pages of values. To obtain a collection of values, call + /// . To obtain the current + /// page of values, call . + /// A collection of pages of . + public virtual PageCollection GetFileAssociations( + VectorStore vectorStore, + VectorStoreFileAssociationCollectionOptions options = default) + => GetFileAssociations(vectorStore?.Id, options); + + /// + /// Gets a instance representing an existing association between a known + /// vector store and file. + /// + /// The vector store associated with the file. + /// The file associated with the vector store. + /// A instance. + public virtual Task> GetFileAssociationAsync( + VectorStore vectorStore, + OpenAIFileInfo file) + => GetFileAssociationAsync(vectorStore?.Id, file?.Id); + + /// + /// Gets a instance representing an existing association between a known + /// vector store and file. + /// + /// The vector store associated with the file. + /// The file associated with the vector store. + /// A instance. 
+ public virtual ClientResult GetFileAssociation(
+ VectorStore vectorStore,
+ OpenAIFileInfo file)
+ => GetFileAssociation(vectorStore?.Id, file?.Id);
+
+ /// 
+ /// Removes the association between a file and vector store, which makes the file no longer available to the vector
+ /// store.
+ /// 
+ /// 
+ /// This does not delete the file. To delete the file, use .
+ /// 
+ /// The vector store that the file should be removed from.
+ /// The file to remove from the vector store.
+ /// A value indicating whether the removal operation was successful.
+ public virtual Task> RemoveFileFromStoreAsync(VectorStore vectorStore, OpenAIFileInfo file)
+ => RemoveFileFromStoreAsync(vectorStore?.Id, file?.Id);
+
+ /// 
+ /// Removes the association between a file and vector store, which makes the file no longer available to the vector
+ /// store.
+ /// 
+ /// 
+ /// This does not delete the file. To delete the file, use .
+ /// 
+ /// The vector store that the file should be removed from.
+ /// The file to remove from the vector store.
+ /// A value indicating whether the removal operation was successful.
+ public virtual ClientResult RemoveFileFromStore(VectorStore vectorStore, OpenAIFileInfo file)
+ => RemoveFileFromStore(vectorStore?.Id, file?.Id);
+
+ /// 
+ /// Begins a batch job to associate multiple files with a vector store, beginning the ingestion process.
+ /// 
+ /// The vector store to associate files with.
+ /// The files to associate with the vector store.
+ /// A instance representing the batch operation.
+ public virtual Task> CreateBatchFileJobAsync(VectorStore vectorStore, IEnumerable files)
+ => CreateBatchFileJobAsync(vectorStore?.Id, files?.Select(file => file.Id));
+
+ /// 
+ /// Begins a batch job to associate multiple files with a vector store, beginning the ingestion process.
+ /// 
+ /// The vector store to associate files with.
+ /// The files to associate with the vector store.
+ /// A instance representing the batch operation.
+ public virtual ClientResult CreateBatchFileJob(VectorStore vectorStore, IEnumerable files) + => CreateBatchFileJob(vectorStore?.Id, files?.Select(file => file.Id)); + + /// + /// Gets an updated instance of an existing , refreshing its status. + /// + /// The job to refresh. + /// The refreshed instance of . + public virtual Task> GetBatchFileJobAsync(VectorStoreBatchFileJob batchJob) + => GetBatchFileJobAsync(batchJob?.VectorStoreId, batchJob?.BatchId); + + /// + /// Gets an updated instance of an existing , refreshing its status. + /// + /// The job to refresh. + /// The refreshed instance of . + public virtual ClientResult GetBatchFileJob(VectorStoreBatchFileJob batchJob) + => GetBatchFileJob(batchJob?.VectorStoreId, batchJob?.BatchId); + + /// + /// Cancels an in-progress . + /// + /// The that should be canceled. + /// An updated instance. + public virtual Task> CancelBatchFileJobAsync(VectorStoreBatchFileJob batchJob) + => CancelBatchFileJobAsync(batchJob?.VectorStoreId, batchJob?.BatchId); + + /// + /// Cancels an in-progress . + /// + /// The that should be canceled. + /// An updated instance. + public virtual ClientResult CancelBatchFileJob(VectorStoreBatchFileJob batchJob) + => CancelBatchFileJob(batchJob?.VectorStoreId, batchJob?.BatchId); + + /// + /// Gets a page collection holding file associations associated with a vector store batch file job, representing the files + /// that were scheduled for ingestion into the vector store. + /// + /// The vector store batch file job to retrieve file associations from. + /// Options describing the collection to return. + /// holds pages of values. To obtain a collection of values, call + /// . To obtain the current + /// page of values, call . + /// A collection of pages of . 
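The batch file job methods above form a lifecycle: create the job, refresh its status with `GetBatchFileJob`, and optionally `CancelBatchFileJob` while it is still in progress. A hedged Python sketch of the refresh-until-terminal polling pattern this enables (the status strings match the filter values documented later in this diff; `get_status` is a hypothetical stand-in for a refresh call):

```python
import time

def wait_for_batch_file_job(get_status, poll_interval=0.0, max_polls=100):
    """Poll a batch file job until it leaves 'in_progress'.

    get_status: callable returning one of 'in_progress', 'completed',
    'cancelled', or 'failed' (a stand-in for refreshing the job).
    """
    for _ in range(max_polls):
        status = get_status()
        if status != "in_progress":
            return status  # terminal state reached
        time.sleep(poll_interval)
    raise TimeoutError("batch file job did not reach a terminal state")
```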
+ public virtual AsyncPageCollection GetFileAssociationsAsync( + VectorStoreBatchFileJob batchJob, + VectorStoreFileAssociationCollectionOptions options = default) + => GetFileAssociationsAsync(batchJob?.VectorStoreId, batchJob?.BatchId, options); + + /// + /// Gets a page collection holding file associations associated with a vector store batch file job, representing the files + /// that were scheduled for ingestion into the vector store. + /// + /// The vector store batch file job to retrieve file associations from. + /// Options describing the collection to return. + /// holds pages of values. To obtain a collection of values, call + /// . To obtain the current + /// page of values, call . + /// A collection of pages of . + public virtual PageCollection GetFileAssociations( + VectorStoreBatchFileJob batchJob, + VectorStoreFileAssociationCollectionOptions options = default) + => GetFileAssociations(batchJob?.VectorStoreId, batchJob?.BatchId, options); + +} diff --git a/.dotnet/src/Custom/VectorStores/VectorStoreClient.Protocol.cs b/.dotnet/src/Custom/VectorStores/VectorStoreClient.Protocol.cs new file mode 100644 index 000000000..ca3fa76bd --- /dev/null +++ b/.dotnet/src/Custom/VectorStores/VectorStoreClient.Protocol.cs @@ -0,0 +1,622 @@ +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.ComponentModel; +using System.Threading.Tasks; + +namespace OpenAI.VectorStores; + +[CodeGenSuppress("GetVectorStoreFilesAsync", typeof(string), typeof(int?), typeof(string), typeof(string), typeof(string), typeof(string), typeof(RequestOptions))] +[CodeGenSuppress("GetVectorStoreFiles", typeof(string), typeof(int?), typeof(string), typeof(string), typeof(string), typeof(string), typeof(RequestOptions))] +[CodeGenSuppress("CreateVectorStoreFileAsync", typeof(string), typeof(BinaryContent), typeof(RequestOptions))] +[CodeGenSuppress("CreateVectorStoreFile", typeof(string), typeof(BinaryContent), 
typeof(RequestOptions))] +[CodeGenSuppress("GetVectorStoreFileAsync", typeof(string), typeof(string), typeof(RequestOptions))] +[CodeGenSuppress("GetVectorStoreFile", typeof(string), typeof(string), typeof(RequestOptions))] +[CodeGenSuppress("DeleteVectorStoreFileAsync", typeof(string), typeof(string), typeof(RequestOptions))] +[CodeGenSuppress("DeleteVectorStoreFile", typeof(string), typeof(string), typeof(RequestOptions))] +[CodeGenSuppress("CreateVectorStoreFileBatchAsync", typeof(string), typeof(BinaryContent), typeof(RequestOptions))] +[CodeGenSuppress("CreateVectorStoreFileBatch", typeof(string), typeof(BinaryContent), typeof(RequestOptions))] +[CodeGenSuppress("GetVectorStoreFileBatchAsync", typeof(string), typeof(string), typeof(RequestOptions))] +[CodeGenSuppress("GetVectorStoreFileBatch", typeof(string), typeof(string), typeof(RequestOptions))] +[CodeGenSuppress("CancelVectorStoreFileBatchAsync", typeof(string), typeof(string), typeof(RequestOptions))] +[CodeGenSuppress("CancelVectorStoreFileBatch", typeof(string), typeof(string), typeof(RequestOptions))] +[CodeGenSuppress("GetFilesInVectorStoreBatchesAsync", typeof(string), typeof(string), typeof(int?), typeof(string), typeof(string), typeof(string), typeof(string), typeof(RequestOptions))] +[CodeGenSuppress("GetFilesInVectorStoreBatches", typeof(string), typeof(string), typeof(int?), typeof(string), typeof(string), typeof(string), typeof(string), typeof(RequestOptions))] +public partial class VectorStoreClient +{ + /// + /// [Protocol Method] Returns a paginated collection of vector-stores. + /// + /// + /// A limit on the number of objects to be returned. Limit can range between 1 and 100, and the + /// default is 20. + /// + /// + /// Sort order by the `created_at` timestamp of the objects. `asc` for ascending order and`desc` + /// for descending order. Allowed values: "asc" | "desc" + /// + /// + /// A cursor for use in pagination. `after` is an object ID that defines your place in the list. 
+ /// For instance, if you make a list request and receive 100 objects, ending with obj_foo, your + /// subsequent call can include after=obj_foo in order to fetch the next page of the list. + /// + /// + /// A cursor for use in pagination. `before` is an object ID that defines your place in the list. + /// For instance, if you make a list request and receive 100 objects, ending with obj_foo, your + /// subsequent call can include before=obj_foo in order to fetch the previous page of the list. + /// + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// Service returned a non-success status code. + /// A collection of service responses, each holding a page of values. + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual IAsyncEnumerable GetVectorStoresAsync(int? limit, string order, string after, string before, RequestOptions options) + { + VectorStoresPageEnumerator enumerator = new VectorStoresPageEnumerator(_pipeline, _endpoint, limit, order, after, before, options); + return PageCollectionHelpers.CreateAsync(enumerator); + } + + /// + /// [Protocol Method] Returns a paginated collection of vector-stores. + /// + /// + /// A limit on the number of objects to be returned. Limit can range between 1 and 100, and the + /// default is 20. + /// + /// + /// Sort order by the `created_at` timestamp of the objects. `asc` for ascending order and`desc` + /// for descending order. Allowed values: "asc" | "desc" + /// + /// + /// A cursor for use in pagination. `after` is an object ID that defines your place in the list. + /// For instance, if you make a list request and receive 100 objects, ending with obj_foo, your + /// subsequent call can include after=obj_foo in order to fetch the next page of the list. + /// + /// + /// A cursor for use in pagination. `before` is an object ID that defines your place in the list. 
+ /// For instance, if you make a list request and receive 100 objects, ending with obj_foo, your + /// subsequent call can include before=obj_foo in order to fetch the previous page of the list. + /// + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// Service returned a non-success status code. + /// A collection of service responses, each holding a page of values. + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual IEnumerable GetVectorStores(int? limit, string order, string after, string before, RequestOptions options) + { + VectorStoresPageEnumerator enumerator = new VectorStoresPageEnumerator(_pipeline, _endpoint, limit, order, after, before, options); + return PageCollectionHelpers.Create(enumerator); + } + + /// + /// [Protocol Method] Creates a vector store. + /// + /// The content to send as the body of the request. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// is null. + /// Service returned a non-success status code. + /// The response returned from the service. + public virtual async Task CreateVectorStoreAsync(BinaryContent content, RequestOptions options = null) + { + using PipelineMessage message = CreateCreateVectorStoreRequest(content, options); + return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + /// + /// [Protocol Method] Creates a vector store. + /// + /// The content to send as the body of the request. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// is null. + /// Service returned a non-success status code. + /// The response returned from the service. 
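The protocol-level methods take the request body as raw `BinaryContent` rather than a typed model. A minimal sketch of assembling such a body in Python; the field names (`name`, `file_ids`) follow the OpenAI vector store REST shape and are an assumption here, not taken from this diff:

```python
import json

def build_create_vector_store_body(name, file_ids):
    """Serialize a create-vector-store request body to UTF-8 bytes,
    the analog of the BinaryContent a protocol method accepts.
    Field names are assumed from the REST API, not this SDK."""
    body = {"name": name, "file_ids": list(file_ids)}
    return json.dumps(body).encode("utf-8")
```

The typed convenience methods build an equivalent payload internally; the protocol methods exist so callers can pass request shapes the typed layer does not yet model.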
+ [EditorBrowsable(EditorBrowsableState.Never)] + public virtual ClientResult CreateVectorStore(BinaryContent content, RequestOptions options = null) + { + using PipelineMessage message = CreateCreateVectorStoreRequest(content, options); + return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options)); + } + + /// + /// [Protocol Method] Retrieves a vector store. + /// + /// The ID of the vector store to retrieve. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// is null. + /// is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// The response returned from the service. + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual async Task GetVectorStoreAsync(string vectorStoreId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId)); + + using PipelineMessage message = CreateGetVectorStoreRequest(vectorStoreId, options); + return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + /// + /// [Protocol Method] Retrieves a vector store. + /// + /// The ID of the vector store to retrieve. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// is null. + /// is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// The response returned from the service. + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual ClientResult GetVectorStore(string vectorStoreId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId)); + + using PipelineMessage message = CreateGetVectorStoreRequest(vectorStoreId, options); + return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options)); + } + + /// + /// [Protocol Method] Modifies a vector store. 
+ /// + /// The ID of the vector store to modify. + /// The content to send as the body of the request. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// or is null. + /// is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// The response returned from the service. + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual async Task ModifyVectorStoreAsync(string vectorStoreId, BinaryContent content, RequestOptions options = null) + { + Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId)); + Argument.AssertNotNull(content, nameof(content)); + + using PipelineMessage message = CreateModifyVectorStoreRequest(vectorStoreId, content, options); + return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + /// + /// [Protocol Method] Modifies a vector store. + /// + /// The ID of the vector store to modify. + /// The content to send as the body of the request. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// or is null. + /// is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// The response returned from the service. + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual ClientResult ModifyVectorStore(string vectorStoreId, BinaryContent content, RequestOptions options = null) + { + Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId)); + Argument.AssertNotNull(content, nameof(content)); + + using PipelineMessage message = CreateModifyVectorStoreRequest(vectorStoreId, content, options); + return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options)); + } + + /// + /// [Protocol Method] Delete a vector store. + /// + /// The ID of the vector store to delete. 
+ /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// is null. + /// is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// The response returned from the service. + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual async Task DeleteVectorStoreAsync(string vectorStoreId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId)); + + using PipelineMessage message = CreateDeleteVectorStoreRequest(vectorStoreId, options); + return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + /// + /// [Protocol Method] Delete a vector store. + /// + /// The ID of the vector store to delete. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// is null. + /// is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// The response returned from the service. + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual ClientResult DeleteVectorStore(string vectorStoreId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId)); + + using PipelineMessage message = CreateDeleteVectorStoreRequest(vectorStoreId, options); + return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options)); + } + + /// + /// [Protocol Method] Returns a paginated collection of vector store files. + /// + /// The ID of the vector store that the files belong to. + /// + /// A limit on the number of objects to be returned. Limit can range between 1 and 100, and the + /// default is 20. + /// + /// + /// Sort order by the `created_at` timestamp of the objects. `asc` for ascending order and`desc` + /// for descending order. Allowed values: "asc" | "desc" + /// + /// + /// A cursor for use in pagination. 
`after` is an object ID that defines your place in the list. + /// For instance, if you make a list request and receive 100 objects, ending with obj_foo, your + /// subsequent call can include after=obj_foo in order to fetch the next page of the list. + /// + /// + /// A cursor for use in pagination. `before` is an object ID that defines your place in the list. + /// For instance, if you make a list request and receive 100 objects, ending with obj_foo, your + /// subsequent call can include before=obj_foo in order to fetch the previous page of the list. + /// + /// Filter by file status. One of `in_progress`, `completed`, `failed`, `cancelled`. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// is null. + /// is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// A collection of service responses, each holding a page of values. + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual IAsyncEnumerable GetFileAssociationsAsync(string vectorStoreId, int? limit, string order, string after, string before, string filter, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId)); + + VectorStoreFilesPageEnumerator enumerator = new VectorStoreFilesPageEnumerator(_pipeline, _endpoint, vectorStoreId, limit, order, after, before, filter, options); + return PageCollectionHelpers.CreateAsync(enumerator); + } + + /// + /// [Protocol Method] Returns a paginated collection of vector store files. + /// + /// The ID of the vector store that the files belong to. + /// + /// A limit on the number of objects to be returned. Limit can range between 1 and 100, and the + /// default is 20. + /// + /// + /// Sort order by the `created_at` timestamp of the objects. `asc` for ascending order and`desc` + /// for descending order. Allowed values: "asc" | "desc" + /// + /// + /// A cursor for use in pagination. 
`after` is an object ID that defines your place in the list. + /// For instance, if you make a list request and receive 100 objects, ending with obj_foo, your + /// subsequent call can include after=obj_foo in order to fetch the next page of the list. + /// + /// + /// A cursor for use in pagination. `before` is an object ID that defines your place in the list. + /// For instance, if you make a list request and receive 100 objects, ending with obj_foo, your + /// subsequent call can include before=obj_foo in order to fetch the previous page of the list. + /// + /// Filter by file status. One of `in_progress`, `completed`, `failed`, `cancelled`. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// is null. + /// is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// A collection of service responses, each holding a page of values. + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual IEnumerable GetFileAssociations(string vectorStoreId, int? limit, string order, string after, string before, string filter, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId)); + + VectorStoreFilesPageEnumerator enumerator = new VectorStoreFilesPageEnumerator(_pipeline, _endpoint, vectorStoreId, limit, order, after, before, filter, options); + return PageCollectionHelpers.Create(enumerator); + } + + /// + /// [Protocol Method] Create a vector store file by attaching a [File](/docs/api-reference/files) to a [vector store](/docs/api-reference/vector-stores/object). + /// + /// The ID of the vector store for which to create a File. + /// The content to send as the body of the request. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// or is null. + /// is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. 
+ /// The response returned from the service. + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual async Task AddFileToVectorStoreAsync(string vectorStoreId, BinaryContent content, RequestOptions options = null) + { + Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId)); + Argument.AssertNotNull(content, nameof(content)); + + using PipelineMessage message = CreateCreateVectorStoreFileRequest(vectorStoreId, content, options); + return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + /// + /// [Protocol Method] Create a vector store file by attaching a [File](/docs/api-reference/files) to a [vector store](/docs/api-reference/vector-stores/object). + /// + /// The ID of the vector store for which to create a File. + /// The content to send as the body of the request. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// or is null. + /// is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// The response returned from the service. + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual ClientResult AddFileToVectorStore(string vectorStoreId, BinaryContent content, RequestOptions options = null) + { + Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId)); + Argument.AssertNotNull(content, nameof(content)); + + using PipelineMessage message = CreateCreateVectorStoreFileRequest(vectorStoreId, content, options); + return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options)); + } + + /// + /// [Protocol Method] Retrieves a vector store file. + /// + /// The ID of the vector store that the file belongs to. + /// The ID of the file being retrieved. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// or is null. + /// or is an empty string, and was expected to be non-empty. 
+ /// Service returned a non-success status code. + /// The response returned from the service. + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual async Task GetFileAssociationAsync(string vectorStoreId, string fileId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId)); + Argument.AssertNotNullOrEmpty(fileId, nameof(fileId)); + + using PipelineMessage message = CreateGetVectorStoreFileRequest(vectorStoreId, fileId, options); + return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + /// + /// [Protocol Method] Retrieves a vector store file. + /// + /// The ID of the vector store that the file belongs to. + /// The ID of the file being retrieved. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// or is null. + /// or is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// The response returned from the service. + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual ClientResult GetFileAssociation(string vectorStoreId, string fileId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId)); + Argument.AssertNotNullOrEmpty(fileId, nameof(fileId)); + + using PipelineMessage message = CreateGetVectorStoreFileRequest(vectorStoreId, fileId, options); + return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options)); + } + + /// + /// [Protocol Method] Delete a vector store file. This will remove the file from the vector store but the file itself will not be deleted. To delete the file, use the [delete file](/docs/api-reference/files/delete) endpoint. + /// + /// The ID of the vector store that the file belongs to. + /// The ID of the file to delete. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// or is null. 
+ /// or is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// The response returned from the service. + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual async Task RemoveFileFromStoreAsync(string vectorStoreId, string fileId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId)); + Argument.AssertNotNullOrEmpty(fileId, nameof(fileId)); + + using PipelineMessage message = CreateDeleteVectorStoreFileRequest(vectorStoreId, fileId, options); + return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + /// + /// [Protocol Method] Delete a vector store file. This will remove the file from the vector store but the file itself will not be deleted. To delete the file, use the [delete file](/docs/api-reference/files/delete) endpoint. + /// + /// The ID of the vector store that the file belongs to. + /// The ID of the file to delete. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// or is null. + /// or is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// The response returned from the service. + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual ClientResult RemoveFileFromStore(string vectorStoreId, string fileId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId)); + Argument.AssertNotNullOrEmpty(fileId, nameof(fileId)); + + using PipelineMessage message = CreateDeleteVectorStoreFileRequest(vectorStoreId, fileId, options); + return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options)); + } + + /// + /// [Protocol Method] Create a vector store file batch. + /// + /// The ID of the vector store for which to create a file batch. + /// The content to send as the body of the request. 
+ /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// or is null. + /// is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// The response returned from the service. + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual async Task CreateBatchFileJobAsync(string vectorStoreId, BinaryContent content, RequestOptions options = null) + { + Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId)); + Argument.AssertNotNull(content, nameof(content)); + + using PipelineMessage message = CreateCreateVectorStoreFileBatchRequest(vectorStoreId, content, options); + return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + /// + /// [Protocol Method] Create a vector store file batch. + /// + /// The ID of the vector store for which to create a file batch. + /// The content to send as the body of the request. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// or is null. + /// is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// The response returned from the service. + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual ClientResult CreateBatchFileJob(string vectorStoreId, BinaryContent content, RequestOptions options = null) + { + Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId)); + Argument.AssertNotNull(content, nameof(content)); + + using PipelineMessage message = CreateCreateVectorStoreFileBatchRequest(vectorStoreId, content, options); + return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options)); + } + + /// + /// [Protocol Method] Retrieves a vector store file batch. + /// + /// The ID of the vector store that the file batch belongs to. + /// The ID of the file batch being retrieved. 
+ /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// or is null. + /// or is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// The response returned from the service. + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual async Task GetBatchFileJobAsync(string vectorStoreId, string batchId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId)); + Argument.AssertNotNullOrEmpty(batchId, nameof(batchId)); + + using PipelineMessage message = CreateGetVectorStoreFileBatchRequest(vectorStoreId, batchId, options); + return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + /// + /// [Protocol Method] Retrieves a vector store file batch. + /// + /// The ID of the vector store that the file batch belongs to. + /// The ID of the file batch being retrieved. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// or is null. + /// or is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// The response returned from the service. + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual ClientResult GetBatchFileJob(string vectorStoreId, string batchId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId)); + Argument.AssertNotNullOrEmpty(batchId, nameof(batchId)); + + using PipelineMessage message = CreateGetVectorStoreFileBatchRequest(vectorStoreId, batchId, options); + return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options)); + } + + /// + /// [Protocol Method] Cancel a vector store file batch. This attempts to cancel the processing of files in this batch as soon as possible. + /// + /// The ID of the vector store that the file batch belongs to. 
+ /// The ID of the file batch to cancel. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// or is null. + /// or is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// The response returned from the service. + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual async Task CancelBatchFileJobAsync(string vectorStoreId, string batchId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId)); + Argument.AssertNotNullOrEmpty(batchId, nameof(batchId)); + + using PipelineMessage message = CreateCancelVectorStoreFileBatchRequest(vectorStoreId, batchId, options); + return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + /// + /// [Protocol Method] Cancel a vector store file batch. This attempts to cancel the processing of files in this batch as soon as possible. + /// + /// The ID of the vector store that the file batch belongs to. + /// The ID of the file batch to cancel. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// or is null. + /// or is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// The response returned from the service. + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual ClientResult CancelBatchFileJob(string vectorStoreId, string batchId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId)); + Argument.AssertNotNullOrEmpty(batchId, nameof(batchId)); + + using PipelineMessage message = CreateCancelVectorStoreFileBatchRequest(vectorStoreId, batchId, options); + return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options)); + } + + /// + /// [Protocol Method] Returns a paginated collection of vector store files in a batch. 
+ /// + /// The ID of the vector store that the file batch belongs to. + /// The ID of the file batch that the files belong to. + /// + /// A limit on the number of objects to be returned. Limit can range between 1 and 100, and the + /// default is 20. + /// + /// + /// Sort order by the `created_at` timestamp of the objects. `asc` for ascending order and`desc` + /// for descending order. Allowed values: "asc" | "desc" + /// + /// + /// A cursor for use in pagination. `after` is an object ID that defines your place in the list. + /// For instance, if you make a list request and receive 100 objects, ending with obj_foo, your + /// subsequent call can include after=obj_foo in order to fetch the next page of the list. + /// + /// + /// A cursor for use in pagination. `before` is an object ID that defines your place in the list. + /// For instance, if you make a list request and receive 100 objects, ending with obj_foo, your + /// subsequent call can include before=obj_foo in order to fetch the previous page of the list. + /// + /// Filter by file status. One of `in_progress`, `completed`, `failed`, `cancelled`. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// or is null. + /// or is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. + /// A collection of service responses, each holding a page of values. + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual IAsyncEnumerable GetFileAssociationsAsync(string vectorStoreId, string batchId, int? 
limit, string order, string after, string before, string filter, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId)); + Argument.AssertNotNullOrEmpty(batchId, nameof(batchId)); + + VectorStoreFileBatchesPageEnumerator enumerator = new VectorStoreFileBatchesPageEnumerator(_pipeline, _endpoint, vectorStoreId, batchId, limit, order, after, before, filter, options); + return PageCollectionHelpers.CreateAsync(enumerator); + } + + /// + /// [Protocol Method] Returns a paginated collection of vector store files in a batch. + /// + /// The ID of the vector store that the file batch belongs to. + /// The ID of the file batch that the files belong to. + /// + /// A limit on the number of objects to be returned. Limit can range between 1 and 100, and the + /// default is 20. + /// + /// + /// Sort order by the `created_at` timestamp of the objects. `asc` for ascending order and`desc` + /// for descending order. Allowed values: "asc" | "desc" + /// + /// + /// A cursor for use in pagination. `after` is an object ID that defines your place in the list. + /// For instance, if you make a list request and receive 100 objects, ending with obj_foo, your + /// subsequent call can include after=obj_foo in order to fetch the next page of the list. + /// + /// + /// A cursor for use in pagination. `before` is an object ID that defines your place in the list. + /// For instance, if you make a list request and receive 100 objects, ending with obj_foo, your + /// subsequent call can include before=obj_foo in order to fetch the previous page of the list. + /// + /// Filter by file status. One of `in_progress`, `completed`, `failed`, `cancelled`. + /// The request options, which can override default behaviors of the client pipeline on a per-call basis. + /// or is null. + /// or is an empty string, and was expected to be non-empty. + /// Service returned a non-success status code. 
+ /// A collection of service responses, each holding a page of values. + [EditorBrowsable(EditorBrowsableState.Never)] + public virtual IEnumerable GetFileAssociations(string vectorStoreId, string batchId, int? limit, string order, string after, string before, string filter, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId)); + Argument.AssertNotNullOrEmpty(batchId, nameof(batchId)); + + VectorStoreFileBatchesPageEnumerator enumerator = new VectorStoreFileBatchesPageEnumerator(_pipeline, _endpoint, vectorStoreId, batchId, limit, order, after, before, filter, options); + return PageCollectionHelpers.Create(enumerator); + } +} diff --git a/.dotnet/src/Custom/VectorStores/VectorStoreClient.cs b/.dotnet/src/Custom/VectorStores/VectorStoreClient.cs new file mode 100644 index 000000000..5edeb0dfb --- /dev/null +++ b/.dotnet/src/Custom/VectorStores/VectorStoreClient.cs @@ -0,0 +1,751 @@ +using OpenAI.Files; +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Diagnostics.CodeAnalysis; +using System.Threading; +using System.Threading.Tasks; + +namespace OpenAI.VectorStores; + +/// +/// The service client for OpenAI vector store operations. 
+/// </summary>
+[CodeGenClient("VectorStores")]
+[CodeGenSuppress("VectorStoreClient", typeof(ClientPipeline), typeof(ApiKeyCredential), typeof(Uri))]
+[CodeGenSuppress("CreateVectorStoreAsync", typeof(VectorStoreCreationOptions))]
+[CodeGenSuppress("CreateVectorStore", typeof(VectorStoreCreationOptions))]
+[CodeGenSuppress("GetVectorStoreAsync", typeof(string))]
+[CodeGenSuppress("GetVectorStore", typeof(string))]
+[CodeGenSuppress("ModifyVectorStoreAsync", typeof(string), typeof(VectorStoreModificationOptions))]
+[CodeGenSuppress("ModifyVectorStore", typeof(string), typeof(VectorStoreModificationOptions))]
+[CodeGenSuppress("DeleteVectorStoreAsync", typeof(string))]
+[CodeGenSuppress("DeleteVectorStore", typeof(string))]
+[CodeGenSuppress("GetVectorStoresAsync", typeof(int?), typeof(ListOrder?), typeof(string), typeof(string))]
+[CodeGenSuppress("GetVectorStores", typeof(int?), typeof(ListOrder?), typeof(string), typeof(string))]
+[CodeGenSuppress("GetVectorStoreFilesAsync", typeof(string), typeof(int?), typeof(ListOrder?), typeof(string), typeof(string), typeof(VectorStoreFileStatusFilter?))]
+[CodeGenSuppress("GetVectorStoreFiles", typeof(string), typeof(int?), typeof(ListOrder?), typeof(string), typeof(string), typeof(VectorStoreFileStatusFilter?))]
+[CodeGenSuppress("CreateVectorStoreFileAsync", typeof(string), typeof(InternalCreateVectorStoreFileRequest))]
+[CodeGenSuppress("CreateVectorStoreFile", typeof(string), typeof(InternalCreateVectorStoreFileRequest))]
+[CodeGenSuppress("GetVectorStoreFileAsync", typeof(string), typeof(string))]
+[CodeGenSuppress("GetVectorStoreFile", typeof(string), typeof(string))]
+[CodeGenSuppress("DeleteVectorStoreFileAsync", typeof(string), typeof(string))]
+[CodeGenSuppress("DeleteVectorStoreFile", typeof(string), typeof(string))]
+[CodeGenSuppress("CreateVectorStoreFileBatchAsync", typeof(string), typeof(InternalCreateVectorStoreFileBatchRequest))]
+[CodeGenSuppress("CreateVectorStoreFileBatch", typeof(string), typeof(InternalCreateVectorStoreFileBatchRequest))]
+[CodeGenSuppress("GetVectorStoreFileBatchAsync", typeof(string), typeof(string))]
+[CodeGenSuppress("GetVectorStoreFileBatch", typeof(string), typeof(string))]
+[CodeGenSuppress("CancelVectorStoreFileBatchAsync", typeof(string), typeof(string))]
+[CodeGenSuppress("CancelVectorStoreFileBatch", typeof(string), typeof(string))]
+[CodeGenSuppress("GetFilesInVectorStoreBatchesAsync", typeof(string), typeof(string), typeof(int?), typeof(ListOrder?), typeof(string), typeof(string), typeof(VectorStoreFileStatusFilter?))]
+[CodeGenSuppress("GetFilesInVectorStoreBatches", typeof(string), typeof(string), typeof(int?), typeof(ListOrder?), typeof(string), typeof(string), typeof(VectorStoreFileStatusFilter?))]
+[Experimental("OPENAI001")]
+public partial class VectorStoreClient
+{
+    // CUSTOM:
+    // - Used a custom pipeline.
+    // - Demoted the endpoint parameter to be a property in the options class.
+    /// <summary> Initializes a new instance of <see cref="VectorStoreClient"/>. </summary>
+    /// <param name="credential"> The API key to authenticate with the service. </param>
+    /// <exception cref="ArgumentNullException"> <paramref name="credential"/> is null. </exception>
+    public VectorStoreClient(ApiKeyCredential credential) : this(credential, new OpenAIClientOptions())
+    {
+    }
+
+    // CUSTOM:
+    // - Used a custom pipeline.
+    // - Demoted the endpoint parameter to be a property in the options class.
+    /// <summary> Initializes a new instance of <see cref="VectorStoreClient"/>. </summary>
+    /// <param name="credential"> The API key to authenticate with the service. </param>
+    /// <param name="options"> The options to configure the client. </param>
+    /// <exception cref="ArgumentNullException"> <paramref name="credential"/> is null. </exception>
+    public VectorStoreClient(ApiKeyCredential credential, OpenAIClientOptions options)
+    {
+        Argument.AssertNotNull(credential, nameof(credential));
+        options ??= new OpenAIClientOptions();
+
+        _pipeline = OpenAIClient.CreatePipeline(credential, options);
+        _endpoint = OpenAIClient.GetEndpoint(options);
+    }
+
+    // CUSTOM:
+    // - Used a custom pipeline.
+    // - Demoted the endpoint parameter to be a property in the options class.
+    // - Made protected.
+    /// <summary> Initializes a new instance of <see cref="VectorStoreClient"/>. </summary>
+ /// The HTTP pipeline to send and receive REST requests and responses. + /// The options to configure the client. + /// is null. + protected internal VectorStoreClient(ClientPipeline pipeline, OpenAIClientOptions options) + { + Argument.AssertNotNull(pipeline, nameof(pipeline)); + options ??= new OpenAIClientOptions(); + + _pipeline = pipeline; + _endpoint = OpenAIClient.GetEndpoint(options); + } + + /// Creates a vector store. + /// The to use. + /// A token that can be used to cancel this method call. + /// is null. + /// Create vector store. + public virtual async Task> CreateVectorStoreAsync(VectorStoreCreationOptions vectorStore = null, CancellationToken cancellationToken = default) + { + using BinaryContent content = vectorStore?.ToBinaryContent(); + ClientResult result = await CreateVectorStoreAsync(content, cancellationToken.ToRequestOptions()).ConfigureAwait(false); + return ClientResult.FromValue(VectorStore.FromResponse(result.GetRawResponse()), result.GetRawResponse()); + } + + /// Creates a vector store. + /// The to use. + /// A token that can be used to cancel this method call. + /// is null. + /// Create vector store. + public virtual ClientResult CreateVectorStore(VectorStoreCreationOptions vectorStore = null, CancellationToken cancellationToken = default) + { + using BinaryContent content = vectorStore?.ToBinaryContent(); + ClientResult result = CreateVectorStore(content, cancellationToken.ToRequestOptions()); + return ClientResult.FromValue(VectorStore.FromResponse(result.GetRawResponse()), result.GetRawResponse()); + } + + /// + /// Gets an instance representing an existing based on its ID. + /// + /// The ID of the vector store to retrieve. + /// A token that can be used to cancel this method call. + /// A representation of an existing . 
+    public virtual async Task<ClientResult<VectorStore>> GetVectorStoreAsync(string vectorStoreId, CancellationToken cancellationToken = default)
+    {
+        Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId));
+
+        ClientResult result
+            = await GetVectorStoreAsync(vectorStoreId, cancellationToken.ToRequestOptions()).ConfigureAwait(false);
+        return ClientResult.FromValue(
+            VectorStore.FromResponse(result.GetRawResponse()), result.GetRawResponse());
+    }
+
+    /// <summary>
+    /// Gets an instance representing an existing <see cref="VectorStore"/> based on its ID.
+    /// </summary>
+    /// <param name="vectorStoreId"> The ID of the vector store to retrieve. </param>
+    /// <param name="cancellationToken"> A token that can be used to cancel this method call. </param>
+    /// <returns> A representation of an existing <see cref="VectorStore"/>. </returns>
+    public virtual ClientResult<VectorStore> GetVectorStore(string vectorStoreId, CancellationToken cancellationToken = default)
+    {
+        Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId));
+
+        ClientResult result = GetVectorStore(vectorStoreId, cancellationToken.ToRequestOptions());
+        return ClientResult.FromValue(VectorStore.FromResponse(result.GetRawResponse()), result.GetRawResponse());
+    }
+
+    /// <summary>
+    /// Modifies an existing <see cref="VectorStore"/>.
+    /// </summary>
+    /// <param name="vectorStoreId"> The ID of the <see cref="VectorStore"/> to modify. </param>
+    /// <param name="vectorStore"> The new options to apply to the <see cref="VectorStore"/>. </param>
+    /// <param name="cancellationToken"> A token that can be used to cancel this method call. </param>
+    /// <returns> An updated representation of the modified <see cref="VectorStore"/>. </returns>
+    public virtual async Task<ClientResult<VectorStore>> ModifyVectorStoreAsync(string vectorStoreId, VectorStoreModificationOptions vectorStore, CancellationToken cancellationToken = default)
+    {
+        Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId));
+        Argument.AssertNotNull(vectorStore, nameof(vectorStore));
+
+        using BinaryContent content = vectorStore.ToBinaryContent();
+        ClientResult result = await ModifyVectorStoreAsync(vectorStoreId, content, cancellationToken.ToRequestOptions()).ConfigureAwait(false);
+        return ClientResult.FromValue(VectorStore.FromResponse(result.GetRawResponse()), result.GetRawResponse());
+    }
+
+    /// <summary>
+    /// Modifies an existing <see cref="VectorStore"/>.
+    /// </summary>
+    /// <param name="vectorStoreId"> The ID of the <see cref="VectorStore"/> to modify. </param>
+ /// The new options to apply to the . + /// A token that can be used to cancel this method call. + /// An updated representation of the modified . + public virtual ClientResult ModifyVectorStore(string vectorStoreId, VectorStoreModificationOptions vectorStore, CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId)); + Argument.AssertNotNull(vectorStore, nameof(vectorStore)); + + using BinaryContent content = vectorStore.ToBinaryContent(); + ClientResult result = ModifyVectorStore(vectorStoreId, content, cancellationToken.ToRequestOptions()); + return ClientResult.FromValue(VectorStore.FromResponse(result.GetRawResponse()), result.GetRawResponse()); + } + + /// + /// Deletes a vector store. + /// + /// The ID of the vector store to delete. + /// A token that can be used to cancel this method call. + /// A value indicating whether the deletion operation was successful. + public virtual async Task> DeleteVectorStoreAsync(string vectorStoreId, CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId)); + + ClientResult protocolResult = await DeleteVectorStoreAsync(vectorStoreId, cancellationToken.ToRequestOptions()).ConfigureAwait(false); + PipelineResponse rawProtocolResponse = protocolResult?.GetRawResponse(); + InternalDeleteVectorStoreResponse internalResponse = InternalDeleteVectorStoreResponse.FromResponse(rawProtocolResponse); + return ClientResult.FromValue(internalResponse.Deleted, rawProtocolResponse); + } + + /// + /// Deletes a vector store. + /// + /// The ID of the vector store to delete. + /// A token that can be used to cancel this method call. + /// A value indicating whether the deletion operation was successful. 
+ public virtual ClientResult DeleteVectorStore(string vectorStoreId, CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId)); + + ClientResult protocolResult = DeleteVectorStore(vectorStoreId, cancellationToken.ToRequestOptions()); + PipelineResponse rawProtocolResponse = protocolResult?.GetRawResponse(); + InternalDeleteVectorStoreResponse internalResponse = InternalDeleteVectorStoreResponse.FromResponse(rawProtocolResponse); + return ClientResult.FromValue(internalResponse.Deleted, rawProtocolResponse); + } + + /// + /// Gets a page collection holding instances for the configured organization. + /// + /// Options describing the collection to return. + /// A token that can be used to cancel this method call. + /// holds pages of values. To obtain a collection of values, call + /// . To obtain the current + /// page of values, call . + /// A collection of pages of . + public virtual AsyncPageCollection GetVectorStoresAsync( + VectorStoreCollectionOptions options = default, + CancellationToken cancellationToken = default) + { + return GetVectorStoresAsync(options?.PageSize, options?.Order?.ToString(), options?.AfterId, options?.BeforeId, cancellationToken.ToRequestOptions()) + as AsyncPageCollection; + } + + /// + /// Rehydrates a page collection holding instances from a page token. + /// + /// Page token corresponding to the first page of the collection to rehydrate. + /// A token that can be used to cancel this method call. + /// holds pages of values. To obtain a collection of values, call + /// . To obtain the current + /// page of values, call . + /// A collection of pages of . 
+ public virtual AsyncPageCollection GetVectorStoresAsync( + ContinuationToken firstPageToken, + CancellationToken cancellationToken = default) + { + Argument.AssertNotNull(firstPageToken, nameof(firstPageToken)); + + VectorStoresPageToken pageToken = VectorStoresPageToken.FromToken(firstPageToken); + return GetVectorStoresAsync(pageToken?.Limit, pageToken?.Order, pageToken?.After, pageToken?.Before, cancellationToken.ToRequestOptions()) + as AsyncPageCollection; + } + + /// + /// Gets a page collection holding instances for the configured organization. + /// + /// Options describing the collection to return. + /// A token that can be used to cancel this method call. + /// holds pages of values. To obtain a collection of values, call + /// . To obtain the current + /// page of values, call . + /// A collection of pages of . + public virtual PageCollection GetVectorStores( + VectorStoreCollectionOptions options = default, + CancellationToken cancellationToken = default) + { + return GetVectorStores(options?.PageSize, options?.Order?.ToString(), options?.AfterId, options?.BeforeId, cancellationToken.ToRequestOptions()) + as PageCollection; + } + + /// + /// Rehydrates a page collection holding instances from a page token. + /// + /// Page token corresponding to the first page of the collection to rehydrate. + /// A token that can be used to cancel this method call. + /// holds pages of values. To obtain a collection of values, call + /// . To obtain the current + /// page of values, call . + /// A collection of pages of . 
+ public virtual PageCollection GetVectorStores( + ContinuationToken firstPageToken, + CancellationToken cancellationToken = default) + { + Argument.AssertNotNull(firstPageToken, nameof(firstPageToken)); + + VectorStoresPageToken pageToken = VectorStoresPageToken.FromToken(firstPageToken); + return GetVectorStores(pageToken?.Limit, pageToken?.Order, pageToken?.After, pageToken?.Before, cancellationToken.ToRequestOptions()) + as PageCollection; + } + + /// + /// Associates a single, uploaded file with a vector store, beginning ingestion of the file into the vector store. + /// + /// The ID of the vector store to associate the file with. + /// The ID of the file to associate with the vector store. + /// A token that can be used to cancel this method call. + /// + /// A instance that represents the new association. + /// + public virtual async Task> AddFileToVectorStoreAsync(string vectorStoreId, string fileId, CancellationToken cancellationToken = default) + { + InternalCreateVectorStoreFileRequest internalRequest = new(fileId); + ClientResult protocolResult = await AddFileToVectorStoreAsync(vectorStoreId, internalRequest.ToBinaryContent(), cancellationToken.ToRequestOptions()).ConfigureAwait(false); + PipelineResponse protocolResponse = protocolResult?.GetRawResponse(); + VectorStoreFileAssociation fileAssociation = VectorStoreFileAssociation.FromResponse(protocolResponse); + return ClientResult.FromValue(fileAssociation, protocolResponse); + } + + /// + /// Associates a single, uploaded file with a vector store, beginning ingestion of the file into the vector store. + /// + /// The ID of the vector store to associate the file with. + /// The ID of the file to associate with the vector store. + /// A token that can be used to cancel this method call. + /// + /// A instance that represents the new association. 
+ /// + public virtual ClientResult AddFileToVectorStore(string vectorStoreId, string fileId, CancellationToken cancellationToken = default) + { + InternalCreateVectorStoreFileRequest internalRequest = new(fileId); + ClientResult protocolResult = AddFileToVectorStore(vectorStoreId, internalRequest.ToBinaryContent(), cancellationToken.ToRequestOptions()); + PipelineResponse protocolResponse = protocolResult?.GetRawResponse(); + VectorStoreFileAssociation fileAssociation = VectorStoreFileAssociation.FromResponse(protocolResponse); + return ClientResult.FromValue(fileAssociation, protocolResponse); + } + + /// + /// Gets a page collection holding instances that represent file inclusions in the + /// specified vector store. + /// + /// + /// The ID of the vector store to enumerate the file associations of. + /// + /// Options describing the collection to return. + /// A token that can be used to cancel this method call. + /// holds pages of values. To obtain a collection of values, call + /// . To obtain the current + /// page of values, call . + /// A collection of pages of . + public virtual AsyncPageCollection GetFileAssociationsAsync( + string vectorStoreId, + VectorStoreFileAssociationCollectionOptions options = default, + CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId)); + + return GetFileAssociationsAsync(vectorStoreId, options?.PageSize, options?.Order?.ToString(), options?.AfterId, options?.BeforeId, options?.Filter?.ToString(), cancellationToken.ToRequestOptions()) + as AsyncPageCollection; + } + + /// + /// Rehydrates a page collection holding instances from a page token. + /// + /// Page token corresponding to the first page of the collection to rehydrate. + /// A token that can be used to cancel this method call. + /// holds pages of values. To obtain a collection of values, call + /// . To obtain the current + /// page of values, call . + /// A collection of pages of . 
+ public virtual AsyncPageCollection GetFileAssociationsAsync( + ContinuationToken firstPageToken, + CancellationToken cancellationToken = default) + { + Argument.AssertNotNull(firstPageToken, nameof(firstPageToken)); + + VectorStoreFilesPageToken pageToken = VectorStoreFilesPageToken.FromToken(firstPageToken); + return GetFileAssociationsAsync(pageToken?.VectorStoreId, pageToken?.Limit, pageToken?.Order, pageToken?.After, pageToken?.Before, pageToken?.Filter, cancellationToken.ToRequestOptions()) + as AsyncPageCollection; + } + + /// + /// Gets a page collection holding instances that represent file inclusions in the + /// specified vector store. + /// + /// + /// The ID of the vector store to enumerate the file associations of. + /// + /// Options describing the collection to return. + /// A token that can be used to cancel this method call. + /// holds pages of values. To obtain a collection of values, call + /// . To obtain the current + /// page of values, call . + /// A collection of pages of . + public virtual PageCollection GetFileAssociations( + string vectorStoreId, + VectorStoreFileAssociationCollectionOptions options = default, + CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId)); + + return GetFileAssociations(vectorStoreId, options?.PageSize, options?.Order?.ToString(), options?.AfterId, options?.BeforeId, options?.Filter?.ToString(), cancellationToken.ToRequestOptions()) + as PageCollection; + } + + /// + /// Rehydrates a page collection holding instances from a page token. + /// + /// Page token corresponding to the first page of the collection to rehydrate. + /// A token that can be used to cancel this method call. + /// holds pages of values. To obtain a collection of values, call + /// . To obtain the current + /// page of values, call . + /// A collection of pages of . 
+ public virtual PageCollection GetFileAssociations( + ContinuationToken firstPageToken, + CancellationToken cancellationToken = default) + { + Argument.AssertNotNull(firstPageToken, nameof(firstPageToken)); + + VectorStoreFilesPageToken pageToken = VectorStoreFilesPageToken.FromToken(firstPageToken); + return GetFileAssociations(pageToken?.VectorStoreId, pageToken?.Limit, pageToken?.Order, pageToken?.After, pageToken?.Before, pageToken?.Filter, cancellationToken.ToRequestOptions()) + as PageCollection; + } + + /// + /// Gets a instance representing an existing association between a known + /// vector store ID and file ID. + /// + /// The ID of the vector store associated with the file. + /// The ID of the file associated with the vector store. + /// A token that can be used to cancel this method call. + /// A instance. + public virtual async Task> GetFileAssociationAsync( + string vectorStoreId, + string fileId, + CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId)); + Argument.AssertNotNullOrEmpty(fileId, nameof(fileId)); + + ClientResult result = await GetFileAssociationAsync(vectorStoreId, fileId, cancellationToken.ToRequestOptions()).ConfigureAwait(false); + PipelineResponse response = result?.GetRawResponse(); + VectorStoreFileAssociation value = VectorStoreFileAssociation.FromResponse(response); + return ClientResult.FromValue(value, response); + } + + /// + /// Gets a instance representing an existing association between a known + /// vector store ID and file ID. + /// + /// The ID of the vector store associated with the file. + /// The ID of the file associated with the vector store. + /// A token that can be used to cancel this method call. + /// A instance. 
+ public virtual ClientResult GetFileAssociation( + string vectorStoreId, + string fileId, + CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId)); + Argument.AssertNotNullOrEmpty(fileId, nameof(fileId)); + + ClientResult result = GetFileAssociation(vectorStoreId, fileId, cancellationToken.ToRequestOptions()); + PipelineResponse response = result?.GetRawResponse(); + VectorStoreFileAssociation value = VectorStoreFileAssociation.FromResponse(response); + return ClientResult.FromValue(value, response); + } + + /// + /// Removes the association between a file and vector store, which makes the file no longer available to the vector + /// store. + /// + /// + /// This does not delete the file. To delete the file, use . + /// + /// The ID of the vector store that the file should be removed from. + /// The ID of the file to remove from the vector store. + /// A token that can be used to cancel this method call. + /// A value indicating whether the removal operation was successful. + public virtual async Task> RemoveFileFromStoreAsync(string vectorStoreId, string fileId, CancellationToken cancellationToken = default) + { + ClientResult protocolResult = await RemoveFileFromStoreAsync(vectorStoreId, fileId, cancellationToken.ToRequestOptions()).ConfigureAwait(false); + PipelineResponse protocolResponse = protocolResult?.GetRawResponse(); + InternalDeleteVectorStoreFileResponse internalDeletion = InternalDeleteVectorStoreFileResponse.FromResponse(protocolResponse); + return ClientResult.FromValue(internalDeletion.Deleted, protocolResponse); + } + + /// + /// Removes the association between a file and vector store, which makes the file no longer available to the vector + /// store. + /// + /// + /// This does not delete the file. To delete the file, use . + /// + /// The ID of the vector store that the file should be removed from. + /// The ID of the file to remove from the vector store. 
+    /// <param name="cancellationToken"> A token that can be used to cancel this method call. </param>
+    /// <returns> A value indicating whether the removal operation was successful. </returns>
+    public virtual ClientResult<bool> RemoveFileFromStore(string vectorStoreId, string fileId, CancellationToken cancellationToken = default)
+    {
+        ClientResult protocolResult = RemoveFileFromStore(vectorStoreId, fileId, cancellationToken.ToRequestOptions());
+        PipelineResponse protocolResponse = protocolResult?.GetRawResponse();
+        InternalDeleteVectorStoreFileResponse internalDeletion = InternalDeleteVectorStoreFileResponse.FromResponse(protocolResponse);
+        return ClientResult.FromValue(internalDeletion.Deleted, protocolResponse);
+    }
+
+    /// <summary>
+    /// Begins a batch job to associate multiple files with a vector store, beginning the ingestion process.
+    /// </summary>
+    /// <param name="vectorStoreId"> The ID of the vector store to associate files with. </param>
+    /// <param name="fileIds"> The IDs of the files to associate with the vector store. </param>
+    /// <param name="cancellationToken"> A token that can be used to cancel this method call. </param>
+    /// <returns> A <see cref="VectorStoreBatchFileJob"/> instance representing the batch operation. </returns>
+    public virtual async Task<ClientResult<VectorStoreBatchFileJob>> CreateBatchFileJobAsync(string vectorStoreId, IEnumerable<string> fileIds, CancellationToken cancellationToken = default)
+    {
+        Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId));
+        Argument.AssertNotNullOrEmpty(fileIds, nameof(fileIds));
+
+        BinaryContent content = new InternalCreateVectorStoreFileBatchRequest(fileIds).ToBinaryContent();
+        ClientResult result = await CreateBatchFileJobAsync(vectorStoreId, content, cancellationToken.ToRequestOptions()).ConfigureAwait(false);
+        PipelineResponse response = result?.GetRawResponse();
+        VectorStoreBatchFileJob value = VectorStoreBatchFileJob.FromResponse(response);
+        return ClientResult.FromValue(value, response);
+    }
+
+    /// <summary>
+    /// Begins a batch job to associate multiple files with a vector store, beginning the ingestion process.
+    /// </summary>
+    /// <param name="vectorStoreId"> The ID of the vector store to associate files with. </param>
+    /// <param name="fileIds"> The IDs of the files to associate with the vector store. </param>
+ /// A token that can be used to cancel this method call. + /// A instance representing the batch operation. + public virtual ClientResult CreateBatchFileJob(string vectorStoreId, IEnumerable fileIds, CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId)); + Argument.AssertNotNullOrEmpty(fileIds, nameof(fileIds)); + + BinaryContent content = new InternalCreateVectorStoreFileBatchRequest(fileIds).ToBinaryContent(); + ClientResult result = CreateBatchFileJob(vectorStoreId, content, cancellationToken.ToRequestOptions()); + PipelineResponse response = result?.GetRawResponse(); + VectorStoreBatchFileJob value = VectorStoreBatchFileJob.FromResponse(response); + return ClientResult.FromValue(value, response); + } + + /// + /// Gets an existing vector store batch file ingestion job from a known vector store ID and job ID. + /// + /// The ID of the vector store into which the batch of files was started. + /// The ID of the batch operation adding files to the vector store. + /// A token that can be used to cancel this method call. + /// A instance representing the ingestion operation. + public virtual async Task> GetBatchFileJobAsync(string vectorStoreId, string batchJobId, CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId)); + Argument.AssertNotNullOrEmpty(batchJobId, nameof(batchJobId)); + + ClientResult result = await GetBatchFileJobAsync(vectorStoreId, batchJobId, cancellationToken.ToRequestOptions()).ConfigureAwait(false); + PipelineResponse response = result?.GetRawResponse(); + VectorStoreBatchFileJob value = VectorStoreBatchFileJob.FromResponse(response); + return ClientResult.FromValue(value, response); + } + + /// + /// Gets an existing vector store batch file ingestion job from a known vector store ID and job ID. + /// + /// The ID of the vector store into which the batch of files was started. 
+ /// The ID of the batch operation adding files to the vector store. + /// A token that can be used to cancel this method call. + /// A instance representing the ingestion operation. + public virtual ClientResult GetBatchFileJob(string vectorStoreId, string batchJobId, CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId)); + Argument.AssertNotNullOrEmpty(batchJobId, nameof(batchJobId)); + + ClientResult result = GetBatchFileJob(vectorStoreId, batchJobId, cancellationToken.ToRequestOptions()); + PipelineResponse response = result?.GetRawResponse(); + VectorStoreBatchFileJob value = VectorStoreBatchFileJob.FromResponse(response); + return ClientResult.FromValue(value, response); + } + + /// + /// Cancels an in-progress . + /// + /// + /// The ID of the that is the ingestion target of the batch job being cancelled. + /// + /// + /// The ID of the that should be canceled. + /// + /// A token that can be used to cancel this method call. + /// An updated instance. + public virtual async Task> CancelBatchFileJobAsync(string vectorStoreId, string batchJobId, CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId)); + Argument.AssertNotNullOrEmpty(batchJobId, nameof(batchJobId)); + + ClientResult result = await CancelBatchFileJobAsync(vectorStoreId, batchJobId, cancellationToken.ToRequestOptions()).ConfigureAwait(false); + PipelineResponse response = result?.GetRawResponse(); + VectorStoreBatchFileJob value = VectorStoreBatchFileJob.FromResponse(response); + return ClientResult.FromValue(value, response); + } + + /// + /// Cancels an in-progress . + /// + /// + /// The ID of the that is the ingestion target of the batch job being cancelled. + /// + /// + /// The ID of the that should be canceled. + /// + /// A token that can be used to cancel this method call. + /// An updated instance. 
+ public virtual ClientResult CancelBatchFileJob(string vectorStoreId, string batchJobId, CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId)); + Argument.AssertNotNullOrEmpty(batchJobId, nameof(batchJobId)); + + ClientResult result = CancelBatchFileJob(vectorStoreId, batchJobId, cancellationToken.ToRequestOptions()); + PipelineResponse response = result?.GetRawResponse(); + VectorStoreBatchFileJob value = VectorStoreBatchFileJob.FromResponse(response); + return ClientResult.FromValue(value, response); + } + + /// + /// Gets a page collection of file associations associated with a vector store batch file job, representing the files + /// that were scheduled for ingestion into the vector store. + /// + /// + /// The ID of the vector store into which the file batch was scheduled for ingestion. + /// + /// + /// The ID of the batch file job that was previously scheduled. + /// + /// Options describing the collection to return. + /// A token that can be used to cancel this method call. + /// holds pages of values. To obtain a collection of values, call + /// . To obtain the current + /// page of values, call . + /// A collection of pages of . + public virtual AsyncPageCollection GetFileAssociationsAsync( + string vectorStoreId, + string batchJobId, + VectorStoreFileAssociationCollectionOptions options = default, + CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId)); + Argument.AssertNotNullOrEmpty(batchJobId, nameof(batchJobId)); + + return GetFileAssociationsAsync(vectorStoreId, batchJobId, options?.PageSize, options?.Order?.ToString(), options?.AfterId, options?.BeforeId, options?.Filter?.ToString(), cancellationToken.ToRequestOptions()) + as AsyncPageCollection; + } + + /// + /// Rehydrates a page collection of file associations from a page token. 
+ /// + /// + /// The ID of the vector store into which the file batch was scheduled for ingestion. + /// + /// + /// The ID of the batch file job that was previously scheduled. + /// + /// Page token corresponding to the first page of the collection to rehydrate. + /// A token that can be used to cancel this method call. + /// holds pages of values. To obtain a collection of values, call + /// . To obtain the current + /// page of values, call . + /// A collection of pages of . + public virtual AsyncPageCollection GetFileAssociationsAsync( + string vectorStoreId, + string batchJobId, + ContinuationToken firstPageToken, + CancellationToken cancellationToken = default) + { + Argument.AssertNotNull(firstPageToken, nameof(firstPageToken)); + + VectorStoreFileBatchesPageToken pageToken = VectorStoreFileBatchesPageToken.FromToken(firstPageToken); + + if (vectorStoreId != pageToken.VectorStoreId) + { + throw new ArgumentException( + "Invalid page token. 'vectorStoreId' value does not match page token value.", + nameof(vectorStoreId)); + } + + if (batchJobId != pageToken.BatchId) + { + throw new ArgumentException( + "Invalid page token. 'batchJobId' value does not match page token value.", + nameof(batchJobId)); + } + + return GetFileAssociationsAsync(vectorStoreId, batchJobId, pageToken?.Limit, pageToken?.Order, pageToken?.After, pageToken?.Before, pageToken?.Filter, cancellationToken.ToRequestOptions()) + as AsyncPageCollection; + } + + /// + /// Gets a page collection of file associations associated with a vector store batch file job, representing the files + /// that were scheduled for ingestion into the vector store. + /// + /// + /// The ID of the vector store into which the file batch was scheduled for ingestion. + /// + /// + /// The ID of the batch file job that was previously scheduled. + /// + /// Options describing the collection to return. + /// A token that can be used to cancel this method call. + /// holds pages of values. 
To obtain a collection of values, call + /// . To obtain the current + /// page of values, call . + /// A collection of pages of . + public virtual PageCollection GetFileAssociations( + string vectorStoreId, + string batchJobId, + VectorStoreFileAssociationCollectionOptions options = default, + CancellationToken cancellationToken = default) + { + Argument.AssertNotNullOrEmpty(vectorStoreId, nameof(vectorStoreId)); + Argument.AssertNotNullOrEmpty(batchJobId, nameof(batchJobId)); + + return GetFileAssociations(vectorStoreId, batchJobId, options?.PageSize, options?.Order?.ToString(), options?.AfterId, options?.BeforeId, options?.Filter?.ToString(), cancellationToken.ToRequestOptions()) + as PageCollection; + } + + /// + /// Rehydrates a page collection of file associations from a page token, representing the files + /// that were scheduled for ingestion into the vector store. + /// + /// + /// The ID of the vector store into which the file batch was scheduled for ingestion. + /// + /// + /// The ID of the batch file job that was previously scheduled. + /// + /// Page token corresponding to the first page of the collection to rehydrate. + /// A token that can be used to cancel this method call. + /// holds pages of values. To obtain a collection of values, call + /// . To obtain the current + /// page of values, call . + /// A collection of pages of . + public virtual PageCollection GetFileAssociations( + string vectorStoreId, + string batchJobId, + ContinuationToken firstPageToken, + CancellationToken cancellationToken = default) + { + Argument.AssertNotNull(firstPageToken, nameof(firstPageToken)); + + VectorStoreFileBatchesPageToken pageToken = VectorStoreFileBatchesPageToken.FromToken(firstPageToken); + + if (vectorStoreId != pageToken.VectorStoreId) + { + throw new ArgumentException( + "Invalid page token. 
'vectorStoreId' value does not match page token value.", + nameof(vectorStoreId)); + } + + if (batchJobId != pageToken.BatchId) + { + throw new ArgumentException( + "Invalid page token. 'batchJobId' value does not match page token value.", + nameof(batchJobId)); + } + + return GetFileAssociations(vectorStoreId, batchJobId, pageToken?.Limit, pageToken?.Order, pageToken?.After, pageToken?.Before, pageToken?.Filter, cancellationToken.ToRequestOptions()) + as PageCollection; + } +} diff --git a/.dotnet/src/Custom/VectorStores/VectorStoreCollectionOptions.cs b/.dotnet/src/Custom/VectorStores/VectorStoreCollectionOptions.cs new file mode 100644 index 000000000..82dfac7dc --- /dev/null +++ b/.dotnet/src/Custom/VectorStores/VectorStoreCollectionOptions.cs @@ -0,0 +1,33 @@ +namespace OpenAI.VectorStores; + +/// +/// Represents additional options available when requesting a collection of instances. +/// +public class VectorStoreCollectionOptions +{ + /// + /// Creates a new instance of . + /// + public VectorStoreCollectionOptions() { } + + /// + /// The order that results should appear in the list according to + /// their created_at timestamp. + /// + public ListOrder? Order { get; set; } + + /// + /// The number of values to return in a page result. + /// + public int? PageSize { get; set; } + + /// + /// The id of the item preceding the first item in the collection. + /// + public string AfterId { get; set; } + + /// + /// The id of the item following the last item in the collection. 
+ /// + public string BeforeId { get; set; } +} diff --git a/.dotnet/src/Custom/VectorStores/VectorStoreCreationOptions.cs b/.dotnet/src/Custom/VectorStores/VectorStoreCreationOptions.cs new file mode 100644 index 000000000..97514c9f4 --- /dev/null +++ b/.dotnet/src/Custom/VectorStores/VectorStoreCreationOptions.cs @@ -0,0 +1,17 @@ +using System.Collections.Generic; + +namespace OpenAI.VectorStores; + +[CodeGenModel("CreateVectorStoreRequest")] +public partial class VectorStoreCreationOptions +{ + /// A list of [File](/docs/api-reference/files) IDs that the vector store should use. Useful for tools like `file_search` that can access files. + public IList FileIds { get; set; } + + /// Gets or sets the policy that controls when the new vector store will be automatically deleted. + [CodeGenMember("ExpiresAfter")] + public VectorStoreExpirationPolicy ExpirationPolicy { get; set; } + + [CodeGenMember("ChunkingStrategy")] + public FileChunkingStrategy ChunkingStrategy { get; set; } +} diff --git a/.dotnet/src/Custom/VectorStores/VectorStoreExpirationAnchor.cs b/.dotnet/src/Custom/VectorStores/VectorStoreExpirationAnchor.cs new file mode 100644 index 000000000..482276c40 --- /dev/null +++ b/.dotnet/src/Custom/VectorStores/VectorStoreExpirationAnchor.cs @@ -0,0 +1,23 @@ +using System.ComponentModel; + +namespace OpenAI.VectorStores; + +/// +/// Represents the available timestamps to which the duration in a will apply. +/// +[CodeGenModel("VectorStoreExpirationAfterAnchor")] +public enum VectorStoreExpirationAnchor +{ + /// + /// An unknown anchor. + /// + [EditorBrowsable(EditorBrowsableState.Never)] + Unknown, + + /// + /// Specifies that the expiration policy should apply relative to the last timestamp at which the vector store was + /// used. 
+ /// + [CodeGenMember("LastActiveAt")] + LastActiveAt +} diff --git a/.dotnet/src/Custom/VectorStores/VectorStoreExpirationPolicy.cs b/.dotnet/src/Custom/VectorStores/VectorStoreExpirationPolicy.cs new file mode 100644 index 000000000..4357959e8 --- /dev/null +++ b/.dotnet/src/Custom/VectorStores/VectorStoreExpirationPolicy.cs @@ -0,0 +1,63 @@ +using System; +using System.Collections.Generic; +using System.Diagnostics.CodeAnalysis; + +namespace OpenAI.VectorStores; + +/// +/// Represents the configuration that controls when a vector store will be automatically deleted. +/// +[CodeGenModel("VectorStoreExpirationAfter")] +[CodeGenSuppress(nameof(VectorStoreExpirationPolicy))] +[CodeGenSuppress(nameof(VectorStoreExpirationPolicy), typeof(int))] +[CodeGenSuppress(nameof(VectorStoreExpirationPolicy), typeof(VectorStoreExpirationAnchor), typeof(int), typeof(IDictionary))] +public partial class VectorStoreExpirationPolicy +{ + private IDictionary SerializedAdditionalRawData; + + [CodeGenMember("Anchor")] + private VectorStoreExpirationAnchor _anchor; + [CodeGenMember("Days")] + private int _days; + + /// Anchor timestamp after which the expiration policy applies. Supported anchors: `last_active_at`. + public required VectorStoreExpirationAnchor Anchor + { + get => _anchor; + set => _anchor = value; + } + + /// The number of days after the anchor time that the vector store will expire. + public required int Days + { + get => _days; + set => _days = value; + } + + /// Initializes a new instance of . + [SetsRequiredMembers] + public VectorStoreExpirationPolicy(VectorStoreExpirationAnchor anchor, int days) + : this(anchor, days, null) + { + Days = days; + Anchor = anchor; + } + + /// Initializes a new instance of . + public VectorStoreExpirationPolicy() + { + SerializedAdditionalRawData = new ChangeTrackingDictionary(); + } + + /// Initializes a new instance of . + /// Anchor timestamp after which the expiration policy applies. Supported anchors: `last_active_at`. 
+ /// The number of days after the anchor time that the vector store will expire. + /// Keeps track of any properties unknown to the library. + [SetsRequiredMembers] + internal VectorStoreExpirationPolicy(VectorStoreExpirationAnchor anchor, int days, IDictionary serializedAdditionalRawData) + { + Anchor = anchor; + Days = days; + SerializedAdditionalRawData = serializedAdditionalRawData ?? new ChangeTrackingDictionary(); + } +} diff --git a/.dotnet/src/Custom/VectorStores/VectorStoreFileAssociation.cs b/.dotnet/src/Custom/VectorStores/VectorStoreFileAssociation.cs new file mode 100644 index 000000000..9c680ab46 --- /dev/null +++ b/.dotnet/src/Custom/VectorStores/VectorStoreFileAssociation.cs @@ -0,0 +1,29 @@ +namespace OpenAI.VectorStores; + +/// +/// A representation of a file association between an uploaded file and a vector store. +/// +[CodeGenModel("VectorStoreFileObject")] +public partial class VectorStoreFileAssociation +{ + // CUSTOM: Made internal. + /// The object type, which is always `vector_store.file`. + [CodeGenMember("Object")] + internal InternalVectorStoreFileObjectObject Object { get; } = InternalVectorStoreFileObjectObject.VectorStoreFile; + + /// + /// The ID of the file that is associated with the vector store. + /// + [CodeGenMember("Id")] + public string FileId { get; } + + /// + /// The total count of bytes used for vector storage of the file. Note that this may differ from the size of the + /// file. 
+ /// + [CodeGenMember("UsageBytes")] + public int Size { get; } + + [CodeGenMember("ChunkingStrategy")] + public FileChunkingStrategy ChunkingStrategy { get; } +} \ No newline at end of file diff --git a/.dotnet/src/Custom/VectorStores/VectorStoreFileAssociationCollectionOptions.cs b/.dotnet/src/Custom/VectorStores/VectorStoreFileAssociationCollectionOptions.cs new file mode 100644 index 000000000..9880b0afd --- /dev/null +++ b/.dotnet/src/Custom/VectorStores/VectorStoreFileAssociationCollectionOptions.cs @@ -0,0 +1,38 @@ +namespace OpenAI.VectorStores; + +/// +/// Represents additional options available when requesting a collection of instances. +/// +public class VectorStoreFileAssociationCollectionOptions
{ + /// + /// Creates a new instance of . + /// + public VectorStoreFileAssociationCollectionOptions() { } + + /// + /// The order that results should appear in the list according to + /// their created_at timestamp. + /// + public ListOrder? Order { get; set; } + + /// + /// The number of values to return in a page result. + /// + public int? PageSize { get; set; } + + /// + /// The id of the item preceding the first item in the collection. + /// + public string AfterId { get; set; } + + /// + /// The id of the item following the last item in the collection. + /// + public string BeforeId { get; set; } + + /// + /// A status filter that file associations must match to be included in the collection. + /// + public VectorStoreFileStatusFilter? 
Filter { get; set; } +} diff --git a/.dotnet/src/Custom/VectorStores/VectorStoreFileAssociationError.cs b/.dotnet/src/Custom/VectorStores/VectorStoreFileAssociationError.cs new file mode 100644 index 000000000..a682fe289 --- /dev/null +++ b/.dotnet/src/Custom/VectorStores/VectorStoreFileAssociationError.cs @@ -0,0 +1,6 @@ +namespace OpenAI.VectorStores; + +[CodeGenModel("VectorStoreFileObjectLastError")] +public partial class VectorStoreFileAssociationError +{ +} \ No newline at end of file diff --git a/.dotnet/src/Custom/VectorStores/VectorStoreFileAssociationErrorCode.cs b/.dotnet/src/Custom/VectorStores/VectorStoreFileAssociationErrorCode.cs new file mode 100644 index 000000000..8246d94f9 --- /dev/null +++ b/.dotnet/src/Custom/VectorStores/VectorStoreFileAssociationErrorCode.cs @@ -0,0 +1,6 @@ +namespace OpenAI.VectorStores; + +[CodeGenModel("VectorStoreFileObjectLastErrorCode")] +public readonly partial struct VectorStoreFileAssociationErrorCode +{ +} \ No newline at end of file diff --git a/.dotnet/src/Custom/VectorStores/VectorStoreFileAssociationStatus.cs b/.dotnet/src/Custom/VectorStores/VectorStoreFileAssociationStatus.cs new file mode 100644 index 000000000..a1bd0c55d --- /dev/null +++ b/.dotnet/src/Custom/VectorStores/VectorStoreFileAssociationStatus.cs @@ -0,0 +1,28 @@ +using System.ComponentModel; + +namespace OpenAI.VectorStores; + +/// +/// Represents the possible states for a vector store file association. +/// +[CodeGenModel("VectorStoreFileObjectStatus")] +public enum VectorStoreFileAssociationStatus +{ + /// + /// An unknown vector store file association status. 
+ /// + [EditorBrowsable(EditorBrowsableState.Never)] + Unknown, + + [CodeGenMember("InProgress")] + InProgress, + + [CodeGenMember("Completed")] + Completed, + + [CodeGenMember("Cancelled")] + Cancelled, + + [CodeGenMember("Failed")] + Failed, +} diff --git a/.dotnet/src/Custom/VectorStores/VectorStoreFileCounts.cs b/.dotnet/src/Custom/VectorStores/VectorStoreFileCounts.cs new file mode 100644 index 000000000..049de2351 --- /dev/null +++ b/.dotnet/src/Custom/VectorStores/VectorStoreFileCounts.cs @@ -0,0 +1,6 @@ +namespace OpenAI.VectorStores; + +[CodeGenModel("VectorStoreObjectFileCounts")] +public partial class VectorStoreFileCounts +{ +} \ No newline at end of file diff --git a/.dotnet/src/Custom/VectorStores/VectorStoreFileStatusFilter.cs b/.dotnet/src/Custom/VectorStores/VectorStoreFileStatusFilter.cs new file mode 100644 index 000000000..24f9fb964 --- /dev/null +++ b/.dotnet/src/Custom/VectorStores/VectorStoreFileStatusFilter.cs @@ -0,0 +1,6 @@ +namespace OpenAI.VectorStores; + +[CodeGenModel("ListVectorStoreFilesFilter")] +public readonly partial struct VectorStoreFileStatusFilter +{ +} \ No newline at end of file diff --git a/.dotnet/src/Custom/VectorStores/VectorStoreModificationOptions.cs b/.dotnet/src/Custom/VectorStores/VectorStoreModificationOptions.cs new file mode 100644 index 000000000..964a1a865 --- /dev/null +++ b/.dotnet/src/Custom/VectorStores/VectorStoreModificationOptions.cs @@ -0,0 +1,11 @@ +using System.Collections.Generic; + +namespace OpenAI.VectorStores; + +[CodeGenModel("UpdateVectorStoreRequest")] +public partial class VectorStoreModificationOptions +{ + /// Gets or sets the policy that controls when the new vector store will be automatically deleted. 
+ [CodeGenMember("ExpiresAfter")] + public VectorStoreExpirationPolicy ExpirationPolicy { get; set; } +} diff --git a/.dotnet/src/Custom/VectorStores/VectorStoreStatus.cs b/.dotnet/src/Custom/VectorStores/VectorStoreStatus.cs new file mode 100644 index 000000000..0f482a942 --- /dev/null +++ b/.dotnet/src/Custom/VectorStores/VectorStoreStatus.cs @@ -0,0 +1,25 @@ +using System.ComponentModel; + +namespace OpenAI.VectorStores; + +/// +/// Represents the possible states for a vector store. +/// +[CodeGenModel("VectorStoreObjectStatus")] +public enum VectorStoreStatus +{ + /// + /// An unknown vector store status. + /// + [EditorBrowsable(EditorBrowsableState.Never)] + Unknown, + + [CodeGenMember("InProgress")] + InProgress, + + [CodeGenMember("Completed")] + Completed, + + [CodeGenMember("Expired")] + Expired, +} diff --git a/.dotnet/src/Generated/AssistantClient.cs b/.dotnet/src/Generated/AssistantClient.cs new file mode 100644 index 000000000..3e5a40f5b --- /dev/null +++ b/.dotnet/src/Generated/AssistantClient.cs @@ -0,0 +1,128 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Threading.Tasks; + +namespace OpenAI.Assistants +{ + // Data plane generated sub-client. 
+ public partial class AssistantClient + { + private const string AuthorizationHeader = "Authorization"; + private readonly ApiKeyCredential _keyCredential; + private const string AuthorizationApiKeyPrefix = "Bearer"; + private readonly ClientPipeline _pipeline; + private readonly Uri _endpoint; + + public virtual ClientPipeline Pipeline => _pipeline; + + protected AssistantClient() + { + } + + internal PipelineMessage CreateCreateAssistantRequest(BinaryContent content, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "POST"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/assistants", false); + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + request.Headers.Set("Content-Type", "application/json"); + request.Content = content; + message.Apply(options); + return message; + } + + internal PipelineMessage CreateGetAssistantsRequest(int? 
limit, string order, string after, string before, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "GET"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/assistants", false); + if (limit != null) + { + uri.AppendQuery("limit", limit.Value, true); + } + if (order != null) + { + uri.AppendQuery("order", order, true); + } + if (after != null) + { + uri.AppendQuery("after", after, true); + } + if (before != null) + { + uri.AppendQuery("before", before, true); + } + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + message.Apply(options); + return message; + } + + internal PipelineMessage CreateGetAssistantRequest(string assistantId, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "GET"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/assistants/", false); + uri.AppendPath(assistantId, true); + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + message.Apply(options); + return message; + } + + internal PipelineMessage CreateModifyAssistantRequest(string assistantId, BinaryContent content, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "POST"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/assistants/", false); + uri.AppendPath(assistantId, true); + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + request.Headers.Set("Content-Type", "application/json"); + request.Content = content; + message.Apply(options); + return message; + } + + internal PipelineMessage 
CreateDeleteAssistantRequest(string assistantId, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "DELETE"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/assistants/", false); + uri.AppendPath(assistantId, true); + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + message.Apply(options); + return message; + } + + private static PipelineMessageClassifier _pipelineMessageClassifier200; + private static PipelineMessageClassifier PipelineMessageClassifier200 => _pipelineMessageClassifier200 ??= PipelineMessageClassifier.Create(stackalloc ushort[] { 200 }); + } +} diff --git a/.dotnet/src/Generated/AudioClient.cs b/.dotnet/src/Generated/AudioClient.cs new file mode 100644 index 000000000..6022edc5e --- /dev/null +++ b/.dotnet/src/Generated/AudioClient.cs @@ -0,0 +1,81 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Threading.Tasks; + +namespace OpenAI.Audio +{ + // Data plane generated sub-client. 
+ public partial class AudioClient + { + private const string AuthorizationHeader = "Authorization"; + private readonly ApiKeyCredential _keyCredential; + private const string AuthorizationApiKeyPrefix = "Bearer"; + private readonly ClientPipeline _pipeline; + private readonly Uri _endpoint; + + public virtual ClientPipeline Pipeline => _pipeline; + + protected AudioClient() + { + } + + internal PipelineMessage CreateCreateSpeechRequest(BinaryContent content, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "POST"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/audio/speech", false); + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/octet-stream"); + request.Headers.Set("Content-Type", "application/json"); + request.Content = content; + message.Apply(options); + return message; + } + + internal PipelineMessage CreateCreateTranscriptionRequest(BinaryContent content, string contentType, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "POST"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/audio/transcriptions", false); + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + request.Headers.Set("Content-Type", contentType); + request.Content = content; + message.Apply(options); + return message; + } + + internal PipelineMessage CreateCreateTranslationRequest(BinaryContent content, string contentType, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "POST"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + 
uri.AppendPath("/v1/audio/translations", false); + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + request.Headers.Set("Content-Type", contentType); + request.Content = content; + message.Apply(options); + return message; + } + + private static PipelineMessageClassifier _pipelineMessageClassifier200; + private static PipelineMessageClassifier PipelineMessageClassifier200 => _pipelineMessageClassifier200 ??= PipelineMessageClassifier.Create(stackalloc ushort[] { 200 }); + } +} diff --git a/.dotnet/src/Generated/BatchClient.cs b/.dotnet/src/Generated/BatchClient.cs new file mode 100644 index 000000000..2b43e1b0f --- /dev/null +++ b/.dotnet/src/Generated/BatchClient.cs @@ -0,0 +1,104 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Threading.Tasks; + +namespace OpenAI.Batch +{ + // Data plane generated sub-client. + public partial class BatchClient + { + private const string AuthorizationHeader = "Authorization"; + private readonly ApiKeyCredential _keyCredential; + private const string AuthorizationApiKeyPrefix = "Bearer"; + private readonly ClientPipeline _pipeline; + private readonly Uri _endpoint; + + public virtual ClientPipeline Pipeline => _pipeline; + + protected BatchClient() + { + } + + internal PipelineMessage CreateCreateBatchRequest(BinaryContent content, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "POST"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/batches", false); + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + request.Headers.Set("Content-Type", "application/json"); + request.Content = content; + message.Apply(options); + return message; + } + + internal PipelineMessage CreateGetBatchesRequest(string 
after, int? limit, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "GET"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/batches", false); + if (after != null) + { + uri.AppendQuery("after", after, true); + } + if (limit != null) + { + uri.AppendQuery("limit", limit.Value, true); + } + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + message.Apply(options); + return message; + } + + internal PipelineMessage CreateRetrieveBatchRequest(string batchId, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "GET"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/batches/", false); + uri.AppendPath(batchId, true); + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + message.Apply(options); + return message; + } + + internal PipelineMessage CreateCancelBatchRequest(string batchId, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "POST"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/batches/", false); + uri.AppendPath(batchId, true); + uri.AppendPath("/cancel", false); + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + message.Apply(options); + return message; + } + + private static PipelineMessageClassifier _pipelineMessageClassifier200; + private static PipelineMessageClassifier PipelineMessageClassifier200 => _pipelineMessageClassifier200 ??= PipelineMessageClassifier.Create(stackalloc ushort[] { 200 }); + } +} diff --git a/.dotnet/src/Generated/ChatClient.cs 
b/.dotnet/src/Generated/ChatClient.cs new file mode 100644 index 000000000..b27f0ede5 --- /dev/null +++ b/.dotnet/src/Generated/ChatClient.cs @@ -0,0 +1,47 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Threading.Tasks; + +namespace OpenAI.Chat +{ + // Data plane generated sub-client. + public partial class ChatClient + { + private const string AuthorizationHeader = "Authorization"; + private readonly ApiKeyCredential _keyCredential; + private const string AuthorizationApiKeyPrefix = "Bearer"; + private readonly ClientPipeline _pipeline; + private readonly Uri _endpoint; + + public virtual ClientPipeline Pipeline => _pipeline; + + protected ChatClient() + { + } + + internal PipelineMessage CreateCreateChatCompletionRequest(BinaryContent content, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "POST"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/chat/completions", false); + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + request.Headers.Set("Content-Type", "application/json"); + request.Content = content; + message.Apply(options); + return message; + } + + private static PipelineMessageClassifier _pipelineMessageClassifier200; + private static PipelineMessageClassifier PipelineMessageClassifier200 => _pipelineMessageClassifier200 ??= PipelineMessageClassifier.Create(stackalloc ushort[] { 200 }); + } +} diff --git a/.dotnet/src/Generated/EmbeddingClient.cs b/.dotnet/src/Generated/EmbeddingClient.cs new file mode 100644 index 000000000..00352b853 --- /dev/null +++ b/.dotnet/src/Generated/EmbeddingClient.cs @@ -0,0 +1,47 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Threading.Tasks; + +namespace OpenAI.Embeddings 
+{ + // Data plane generated sub-client. + public partial class EmbeddingClient + { + private const string AuthorizationHeader = "Authorization"; + private readonly ApiKeyCredential _keyCredential; + private const string AuthorizationApiKeyPrefix = "Bearer"; + private readonly ClientPipeline _pipeline; + private readonly Uri _endpoint; + + public virtual ClientPipeline Pipeline => _pipeline; + + protected EmbeddingClient() + { + } + + internal PipelineMessage CreateCreateEmbeddingRequest(BinaryContent content, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "POST"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/embeddings", false); + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + request.Headers.Set("Content-Type", "application/json"); + request.Content = content; + message.Apply(options); + return message; + } + + private static PipelineMessageClassifier _pipelineMessageClassifier200; + private static PipelineMessageClassifier PipelineMessageClassifier200 => _pipelineMessageClassifier200 ??= PipelineMessageClassifier.Create(stackalloc ushort[] { 200 }); + } +} diff --git a/.dotnet/src/Generated/FileClient.cs b/.dotnet/src/Generated/FileClient.cs new file mode 100644 index 000000000..bfb518a32 --- /dev/null +++ b/.dotnet/src/Generated/FileClient.cs @@ -0,0 +1,115 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Threading.Tasks; + +namespace OpenAI.Files +{ + // Data plane generated sub-client. 
+ public partial class FileClient + { + private const string AuthorizationHeader = "Authorization"; + private readonly ApiKeyCredential _keyCredential; + private const string AuthorizationApiKeyPrefix = "Bearer"; + private readonly ClientPipeline _pipeline; + private readonly Uri _endpoint; + + public virtual ClientPipeline Pipeline => _pipeline; + + protected FileClient() + { + } + + internal PipelineMessage CreateCreateFileRequest(BinaryContent content, string contentType, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "POST"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/files", false); + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + request.Headers.Set("Content-Type", contentType); + request.Content = content; + message.Apply(options); + return message; + } + + internal PipelineMessage CreateGetFilesRequest(string purpose, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "GET"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/files", false); + if (purpose != null) + { + uri.AppendQuery("purpose", purpose, true); + } + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + message.Apply(options); + return message; + } + + internal PipelineMessage CreateRetrieveFileRequest(string fileId, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "GET"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/files/", false); + uri.AppendPath(fileId, true); + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", 
"application/json"); + message.Apply(options); + return message; + } + + internal PipelineMessage CreateDeleteFileRequest(string fileId, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "DELETE"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/files/", false); + uri.AppendPath(fileId, true); + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + message.Apply(options); + return message; + } + + internal PipelineMessage CreateDownloadFileRequest(string fileId, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "GET"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/files/", false); + uri.AppendPath(fileId, true); + uri.AppendPath("/content", false); + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + message.Apply(options); + return message; + } + + private static PipelineMessageClassifier _pipelineMessageClassifier200; + private static PipelineMessageClassifier PipelineMessageClassifier200 => _pipelineMessageClassifier200 ??= PipelineMessageClassifier.Create(stackalloc ushort[] { 200 }); + } +} diff --git a/.dotnet/src/Generated/FineTuningClient.cs b/.dotnet/src/Generated/FineTuningClient.cs new file mode 100644 index 000000000..fe4998f43 --- /dev/null +++ b/.dotnet/src/Generated/FineTuningClient.cs @@ -0,0 +1,153 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Threading.Tasks; + +namespace OpenAI.FineTuning +{ + // Data plane generated sub-client. 
+ public partial class FineTuningClient + { + private const string AuthorizationHeader = "Authorization"; + private readonly ApiKeyCredential _keyCredential; + private const string AuthorizationApiKeyPrefix = "Bearer"; + private readonly ClientPipeline _pipeline; + private readonly Uri _endpoint; + + public virtual ClientPipeline Pipeline => _pipeline; + + protected FineTuningClient() + { + } + + internal PipelineMessage CreateCreateFineTuningJobRequest(BinaryContent content, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "POST"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/fine_tuning/jobs", false); + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + request.Headers.Set("Content-Type", "application/json"); + request.Content = content; + message.Apply(options); + return message; + } + + internal PipelineMessage CreateGetPaginatedFineTuningJobsRequest(string after, int? 
limit, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "GET"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/fine_tuning/jobs", false); + if (after != null) + { + uri.AppendQuery("after", after, true); + } + if (limit != null) + { + uri.AppendQuery("limit", limit.Value, true); + } + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + message.Apply(options); + return message; + } + + internal PipelineMessage CreateRetrieveFineTuningJobRequest(string fineTuningJobId, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "GET"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/fine_tuning/jobs/", false); + uri.AppendPath(fineTuningJobId, true); + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + message.Apply(options); + return message; + } + + internal PipelineMessage CreateCancelFineTuningJobRequest(string fineTuningJobId, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "POST"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/fine_tuning/jobs/", false); + uri.AppendPath(fineTuningJobId, true); + uri.AppendPath("/cancel", false); + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + message.Apply(options); + return message; + } + + internal PipelineMessage CreateGetFineTuningJobCheckpointsRequest(string fineTuningJobId, string after, int? 
limit, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "GET"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/fine_tuning/jobs/", false); + uri.AppendPath(fineTuningJobId, true); + uri.AppendPath("/checkpoints", false); + if (after != null) + { + uri.AppendQuery("after", after, true); + } + if (limit != null) + { + uri.AppendQuery("limit", limit.Value, true); + } + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + message.Apply(options); + return message; + } + + internal PipelineMessage CreateGetFineTuningEventsRequest(string fineTuningJobId, string after, int? limit, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "GET"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/fine_tuning/jobs/", false); + uri.AppendPath(fineTuningJobId, true); + uri.AppendPath("/events", false); + if (after != null) + { + uri.AppendQuery("after", after, true); + } + if (limit != null) + { + uri.AppendQuery("limit", limit.Value, true); + } + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + message.Apply(options); + return message; + } + + private static PipelineMessageClassifier _pipelineMessageClassifier200; + private static PipelineMessageClassifier PipelineMessageClassifier200 => _pipelineMessageClassifier200 ??= PipelineMessageClassifier.Create(stackalloc ushort[] { 200 }); + } +} diff --git a/.dotnet/src/Generated/ImageClient.cs b/.dotnet/src/Generated/ImageClient.cs new file mode 100644 index 000000000..cab97e7f5 --- /dev/null +++ b/.dotnet/src/Generated/ImageClient.cs @@ -0,0 +1,81 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using 
System.ClientModel.Primitives; +using System.Threading.Tasks; + +namespace OpenAI.Images +{ + // Data plane generated sub-client. + public partial class ImageClient + { + private const string AuthorizationHeader = "Authorization"; + private readonly ApiKeyCredential _keyCredential; + private const string AuthorizationApiKeyPrefix = "Bearer"; + private readonly ClientPipeline _pipeline; + private readonly Uri _endpoint; + + public virtual ClientPipeline Pipeline => _pipeline; + + protected ImageClient() + { + } + + internal PipelineMessage CreateCreateImageRequest(BinaryContent content, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "POST"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/images/generations", false); + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + request.Headers.Set("Content-Type", "application/json"); + request.Content = content; + message.Apply(options); + return message; + } + + internal PipelineMessage CreateCreateImageEditRequest(BinaryContent content, string contentType, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "POST"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/images/edits", false); + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + request.Headers.Set("Content-Type", contentType); + request.Content = content; + message.Apply(options); + return message; + } + + internal PipelineMessage CreateCreateImageVariationRequest(BinaryContent content, string contentType, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + 
request.Method = "POST";
+            var uri = new ClientUriBuilder();
+            uri.Reset(_endpoint);
+            uri.AppendPath("/v1/images/variations", false);
+            request.Uri = uri.ToUri();
+            request.Headers.Set("Accept", "application/json");
+            request.Headers.Set("Content-Type", contentType);
+            request.Content = content;
+            message.Apply(options);
+            return message;
+        }
+
+        private static PipelineMessageClassifier _pipelineMessageClassifier200;
+        private static PipelineMessageClassifier PipelineMessageClassifier200 => _pipelineMessageClassifier200 ??= PipelineMessageClassifier.Create(stackalloc ushort[] { 200 });
+    }
+}
diff --git a/.dotnet/src/Generated/Internal/Argument.cs b/.dotnet/src/Generated/Internal/Argument.cs
new file mode 100644
index 000000000..29cb5ac18
--- /dev/null
+++ b/.dotnet/src/Generated/Internal/Argument.cs
@@ -0,0 +1,126 @@
+//
+
+#nullable disable
+
+using System;
+using System.Collections;
+using System.Collections.Generic;
+
+namespace OpenAI
+{
+    internal static class Argument
+    {
+        public static void AssertNotNull<T>(T value, string name)
+        {
+            if (value is null)
+            {
+                throw new ArgumentNullException(name);
+            }
+        }
+
+        public static void AssertNotNull<T>(T? value, string name)
+            where T : struct
+        {
+            if (!value.HasValue)
+            {
+                throw new ArgumentNullException(name);
+            }
+        }
+
+        public static void AssertNotNullOrEmpty<T>(IEnumerable<T> value, string name)
+        {
+            if (value is null)
+            {
+                throw new ArgumentNullException(name);
+            }
+            if (value is ICollection<T> collectionOfT && collectionOfT.Count == 0)
+            {
+                throw new ArgumentException("Value cannot be an empty collection.", name);
+            }
+            if (value is ICollection collection && collection.Count == 0)
+            {
+                throw new ArgumentException("Value cannot be an empty collection.", name);
+            }
+            using IEnumerator<T> e = value.GetEnumerator();
+            if (!e.MoveNext())
+            {
+                throw new ArgumentException("Value cannot be an empty collection.", name);
+            }
+        }
+
+        public static void AssertNotNullOrEmpty(string value, string name)
+        {
+            if (value is null)
+            {
+                throw new ArgumentNullException(name);
+            }
+            if (value.Length == 0)
+            {
+                throw new ArgumentException("Value cannot be an empty string.", name);
+            }
+        }
+
+        public static void AssertNotNullOrWhiteSpace(string value, string name)
+        {
+            if (value is null)
+            {
+                throw new ArgumentNullException(name);
+            }
+            if (string.IsNullOrWhiteSpace(value))
+            {
+                throw new ArgumentException("Value cannot be empty or contain only white-space characters.", name);
+            }
+        }
+
+        public static void AssertNotDefault<T>(ref T value, string name)
+            where T : struct, IEquatable<T>
+        {
+            if (value.Equals(default))
+            {
+                throw new ArgumentException("Value cannot be empty.", name);
+            }
+        }
+
+        public static void AssertInRange<T>(T value, T minimum, T maximum, string name)
+            where T : notnull, IComparable<T>
+        {
+            if (minimum.CompareTo(value) > 0)
+            {
+                throw new ArgumentOutOfRangeException(name, "Value is less than the minimum allowed.");
+            }
+            if (maximum.CompareTo(value) < 0)
+            {
+                throw new ArgumentOutOfRangeException(name, "Value is greater than the maximum allowed.");
+            }
+        }
+
+        public static void AssertEnumDefined(Type enumType, object value, string name)
+        {
+            if (!Enum.IsDefined(enumType, value))
+            {
+                throw new ArgumentException($"Value not defined for {enumType.FullName}.", name);
+            }
+        }
+
+        public static T CheckNotNull<T>(T value, string name)
+            where T : class
+        {
+            AssertNotNull(value, name);
+            return value;
+        }
+
+        public static string CheckNotNullOrEmpty(string value, string name)
+        {
+            AssertNotNullOrEmpty(value, name);
+            return value;
+        }
+
+        public static void AssertNull<T>(T value, string name, string message = null)
+        {
+            if (value != null)
+            {
+                throw new ArgumentException(message ?? "Value must be null.", name);
+            }
+        }
+    }
+}
diff --git a/.dotnet/src/Generated/Internal/BinaryContentHelper.cs b/.dotnet/src/Generated/Internal/BinaryContentHelper.cs
new file mode 100644
index 000000000..52bcdbdf3
--- /dev/null
+++ b/.dotnet/src/Generated/Internal/BinaryContentHelper.cs
@@ -0,0 +1,133 @@
+//
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace OpenAI
+{
+    internal static class BinaryContentHelper
+    {
+        public static BinaryContent FromEnumerable<T>(IEnumerable<T> enumerable)
+            where T : notnull
+        {
+            Utf8JsonBinaryContent content = new Utf8JsonBinaryContent();
+            content.JsonWriter.WriteStartArray();
+            foreach (var item in enumerable)
+            {
+                content.JsonWriter.WriteObjectValue(item, ModelSerializationExtensions.WireOptions);
+            }
+            content.JsonWriter.WriteEndArray();
+
+            return content;
+        }
+
+        public static BinaryContent FromEnumerable(IEnumerable<BinaryData> enumerable)
+        {
+            Utf8JsonBinaryContent content = new Utf8JsonBinaryContent();
+            content.JsonWriter.WriteStartArray();
+            foreach (var item in enumerable)
+            {
+                if (item == null)
+                {
+                    content.JsonWriter.WriteNullValue();
+                }
+                else
+                {
+#if NET6_0_OR_GREATER
+                    content.JsonWriter.WriteRawValue(item);
+#else
+                    using (JsonDocument document = JsonDocument.Parse(item))
+                    {
+                        JsonSerializer.Serialize(content.JsonWriter, document.RootElement);
+                    }
+#endif
+                }
+            }
+            content.JsonWriter.WriteEndArray();
+
+            return content;
+        }
+
+        public static BinaryContent FromEnumerable<T>(ReadOnlySpan<T> span)
+            where T : notnull
+        {
+            Utf8JsonBinaryContent content = new Utf8JsonBinaryContent();
+            content.JsonWriter.WriteStartArray();
+            for (int i = 0; i < span.Length; i++)
+            {
+                content.JsonWriter.WriteObjectValue(span[i], ModelSerializationExtensions.WireOptions);
+            }
+            content.JsonWriter.WriteEndArray();
+
+            return content;
+        }
+
+        public static BinaryContent FromDictionary<TValue>(IDictionary<string, TValue> dictionary)
+            where TValue : notnull
+        {
+            Utf8JsonBinaryContent content = new Utf8JsonBinaryContent();
+            content.JsonWriter.WriteStartObject();
+            foreach (var item in dictionary)
+            {
+                content.JsonWriter.WritePropertyName(item.Key);
+                content.JsonWriter.WriteObjectValue(item.Value, ModelSerializationExtensions.WireOptions);
+            }
+            content.JsonWriter.WriteEndObject();
+
+            return content;
+        }
+
+        public static BinaryContent FromDictionary(IDictionary<string, BinaryData> dictionary)
+        {
+            Utf8JsonBinaryContent content = new Utf8JsonBinaryContent();
+            content.JsonWriter.WriteStartObject();
+            foreach (var item in dictionary)
+            {
+                content.JsonWriter.WritePropertyName(item.Key);
+                if (item.Value == null)
+                {
+                    content.JsonWriter.WriteNullValue();
+                }
+                else
+                {
+#if NET6_0_OR_GREATER
+                    content.JsonWriter.WriteRawValue(item.Value);
+#else
+                    using (JsonDocument document = JsonDocument.Parse(item.Value))
+                    {
+                        JsonSerializer.Serialize(content.JsonWriter, document.RootElement);
+                    }
+#endif
+                }
+            }
+            content.JsonWriter.WriteEndObject();
+
+            return content;
+        }
+
+        public static BinaryContent FromObject(object value)
+        {
+            Utf8JsonBinaryContent content = new Utf8JsonBinaryContent();
+            content.JsonWriter.WriteObjectValue(value, ModelSerializationExtensions.WireOptions);
+            return content;
+        }
+
+        public static BinaryContent FromObject(BinaryData value)
+        {
+            Utf8JsonBinaryContent content = new Utf8JsonBinaryContent();
+#if NET6_0_OR_GREATER
+            content.JsonWriter.WriteRawValue(value);
+#else
+            using (JsonDocument document =
JsonDocument.Parse(value))
+            {
+                JsonSerializer.Serialize(content.JsonWriter, document.RootElement);
+            }
+#endif
+            return content;
+        }
+    }
+}
diff --git a/.dotnet/src/Generated/Internal/ChangeTrackingDictionary.cs b/.dotnet/src/Generated/Internal/ChangeTrackingDictionary.cs
new file mode 100644
index 000000000..2eb05d04a
--- /dev/null
+++ b/.dotnet/src/Generated/Internal/ChangeTrackingDictionary.cs
@@ -0,0 +1,164 @@
+//
+
+#nullable disable
+
+using System;
+using System.Collections;
+using System.Collections.Generic;
+
+namespace OpenAI
+{
+    internal class ChangeTrackingDictionary<TKey, TValue> : IDictionary<TKey, TValue>, IReadOnlyDictionary<TKey, TValue> where TKey : notnull
+    {
+        private IDictionary<TKey, TValue> _innerDictionary;
+
+        public ChangeTrackingDictionary()
+        {
+        }
+
+        public ChangeTrackingDictionary(IDictionary<TKey, TValue> dictionary)
+        {
+            if (dictionary == null)
+            {
+                return;
+            }
+            _innerDictionary = new Dictionary<TKey, TValue>(dictionary);
+        }
+
+        public ChangeTrackingDictionary(IReadOnlyDictionary<TKey, TValue> dictionary)
+        {
+            if (dictionary == null)
+            {
+                return;
+            }
+            _innerDictionary = new Dictionary<TKey, TValue>();
+            foreach (var pair in dictionary)
+            {
+                _innerDictionary.Add(pair);
+            }
+        }
+
+        public bool IsUndefined => _innerDictionary == null;
+
+        public int Count => IsUndefined ? 0 : EnsureDictionary().Count;
+
+        public bool IsReadOnly => IsUndefined ? false : EnsureDictionary().IsReadOnly;
+
+        public ICollection<TKey> Keys => IsUndefined ? Array.Empty<TKey>() : EnsureDictionary().Keys;
+
+        public ICollection<TValue> Values => IsUndefined ? Array.Empty<TValue>() : EnsureDictionary().Values;
+
+        public TValue this[TKey key]
+        {
+            get
+            {
+                if (IsUndefined)
+                {
+                    throw new KeyNotFoundException(nameof(key));
+                }
+                return EnsureDictionary()[key];
+            }
+            set
+            {
+                EnsureDictionary()[key] = value;
+            }
+        }
+
+        IEnumerable<TKey> IReadOnlyDictionary<TKey, TValue>.Keys => Keys;
+
+        IEnumerable<TValue> IReadOnlyDictionary<TKey, TValue>.Values => Values;
+
+        public IEnumerator<KeyValuePair<TKey, TValue>> GetEnumerator()
+        {
+            if (IsUndefined)
+            {
+                IEnumerator<KeyValuePair<TKey, TValue>> enumerateEmpty()
+                {
+                    yield break;
+                }
+                return enumerateEmpty();
+            }
+            return EnsureDictionary().GetEnumerator();
+        }
+
+        IEnumerator IEnumerable.GetEnumerator()
+        {
+            return GetEnumerator();
+        }
+
+        public void Add(KeyValuePair<TKey, TValue> item)
+        {
+            EnsureDictionary().Add(item);
+        }
+
+        public void Clear()
+        {
+            EnsureDictionary().Clear();
+        }
+
+        public bool Contains(KeyValuePair<TKey, TValue> item)
+        {
+            if (IsUndefined)
+            {
+                return false;
+            }
+            return EnsureDictionary().Contains(item);
+        }
+
+        public void CopyTo(KeyValuePair<TKey, TValue>[] array, int index)
+        {
+            if (IsUndefined)
+            {
+                return;
+            }
+            EnsureDictionary().CopyTo(array, index);
+        }
+
+        public bool Remove(KeyValuePair<TKey, TValue> item)
+        {
+            if (IsUndefined)
+            {
+                return false;
+            }
+            return EnsureDictionary().Remove(item);
+        }
+
+        public void Add(TKey key, TValue value)
+        {
+            EnsureDictionary().Add(key, value);
+        }
+
+        public bool ContainsKey(TKey key)
+        {
+            if (IsUndefined)
+            {
+                return false;
+            }
+            return EnsureDictionary().ContainsKey(key);
+        }
+
+        public bool Remove(TKey key)
+        {
+            if (IsUndefined)
+            {
+                return false;
+            }
+            return EnsureDictionary().Remove(key);
+        }
+
+        public bool TryGetValue(TKey key, out TValue value)
+        {
+            if (IsUndefined)
+            {
+                value = default;
+                return false;
+            }
+            return EnsureDictionary().TryGetValue(key, out value);
+        }
+
+        public IDictionary<TKey, TValue> EnsureDictionary()
+        {
+            return _innerDictionary ??= new Dictionary<TKey, TValue>();
+        }
+    }
+}
diff --git a/.dotnet/src/Generated/Internal/ChangeTrackingList.cs b/.dotnet/src/Generated/Internal/ChangeTrackingList.cs
new file mode 100644
index 000000000..c64afc504
--- /dev/null
+++ b/.dotnet/src/Generated/Internal/ChangeTrackingList.cs
@@ -0,0 +1,150 @@
+//
+
+#nullable disable
+
+using System;
+using System.Collections;
+using System.Collections.Generic;
+using System.Linq;
+
+namespace OpenAI
+{
+    internal class ChangeTrackingList<T> : IList<T>, IReadOnlyList<T>
+    {
+        private IList<T> _innerList;
+
+        public ChangeTrackingList()
+        {
+        }
+
+        public ChangeTrackingList(IList<T> innerList)
+        {
+            if (innerList != null)
+            {
+                _innerList = innerList;
+            }
+        }
+
+        public ChangeTrackingList(IReadOnlyList<T> innerList)
+        {
+            if (innerList != null)
+            {
+                _innerList = innerList.ToList();
+            }
+        }
+
+        public bool IsUndefined => _innerList == null;
+
+        public int Count => IsUndefined ? 0 : EnsureList().Count;
+
+        public bool IsReadOnly => IsUndefined ? false : EnsureList().IsReadOnly;
+
+        public T this[int index]
+        {
+            get
+            {
+                if (IsUndefined)
+                {
+                    throw new ArgumentOutOfRangeException(nameof(index));
+                }
+                return EnsureList()[index];
+            }
+            set
+            {
+                if (IsUndefined)
+                {
+                    throw new ArgumentOutOfRangeException(nameof(index));
+                }
+                EnsureList()[index] = value;
+            }
+        }
+
+        public void Reset()
+        {
+            _innerList = null;
+        }
+
+        public IEnumerator<T> GetEnumerator()
+        {
+            if (IsUndefined)
+            {
+                IEnumerator<T> enumerateEmpty()
+                {
+                    yield break;
+                }
+                return enumerateEmpty();
+            }
+            return EnsureList().GetEnumerator();
+        }
+
+        IEnumerator IEnumerable.GetEnumerator()
+        {
+            return GetEnumerator();
+        }
+
+        public void Add(T item)
+        {
+            EnsureList().Add(item);
+        }
+
+        public void Clear()
+        {
+            EnsureList().Clear();
+        }
+
+        public bool Contains(T item)
+        {
+            if (IsUndefined)
+            {
+                return false;
+            }
+            return EnsureList().Contains(item);
+        }
+
+        public void CopyTo(T[] array, int arrayIndex)
+        {
+            if (IsUndefined)
+            {
+                return;
+            }
+            EnsureList().CopyTo(array, arrayIndex);
+        }
+
+        public bool Remove(T item)
+        {
+            if (IsUndefined)
+            {
+                return false;
+            }
+            return EnsureList().Remove(item);
+        }
+
+        public int IndexOf(T item)
+        {
+            if (IsUndefined)
+            {
+                return -1;
+            }
+            return EnsureList().IndexOf(item);
+        }
+
+        public void Insert(int index, T item)
+        {
+            EnsureList().Insert(index, item);
+        }
+
+        public void RemoveAt(int index)
+        {
+            if (IsUndefined)
+            {
+                throw new ArgumentOutOfRangeException(nameof(index));
+            }
+            EnsureList().RemoveAt(index);
+        }
+
+        public IList<T> EnsureList()
+        {
+            return _innerList ??= new List<T>();
+        }
+    }
+}
diff --git a/.dotnet/src/Generated/Internal/ClientPipelineExtensions.cs b/.dotnet/src/Generated/Internal/ClientPipelineExtensions.cs
new file mode 100644
index 000000000..c51dd675b
--- /dev/null
+++ b/.dotnet/src/Generated/Internal/ClientPipelineExtensions.cs
@@ -0,0 +1,41 @@
+//
+
+#nullable disable
+
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Threading.Tasks;
+
+namespace OpenAI
+{
+    internal static partial class ClientPipelineExtensions
+    {
+        public static async ValueTask<ClientResult<bool>> ProcessHeadAsBoolMessageAsync(this ClientPipeline pipeline, PipelineMessage message, RequestOptions options)
+        {
+            PipelineResponse response = await pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false);
+            switch (response.Status)
+            {
+                case >= 200 and < 300:
+                    return ClientResult.FromValue(true, response);
+                case >= 400 and < 500:
+                    return ClientResult.FromValue(false, response);
+                default:
+                    return new ErrorResult<bool>(response, new ClientResultException(response));
+            }
+        }
+
+        public static ClientResult<bool> ProcessHeadAsBoolMessage(this ClientPipeline pipeline, PipelineMessage message, RequestOptions options)
+        {
+            PipelineResponse response = pipeline.ProcessMessage(message, options);
+            switch (response.Status)
+            {
+                case >= 200 and < 300:
+                    return ClientResult.FromValue(true, response);
+                case >= 400 and < 500:
+                    return ClientResult.FromValue(false, response);
+                default:
+                    return new ErrorResult<bool>(response, new ClientResultException(response));
+            }
+        }
+    }
+}
diff --git a/.dotnet/src/Generated/Internal/ClientUriBuilder.cs b/.dotnet/src/Generated/Internal/ClientUriBuilder.cs
new file mode 100644
index 000000000..9406854da
--- /dev/null
+++ b/.dotnet/src/Generated/Internal/ClientUriBuilder.cs
@@ -0,0 +1,205 @@
+//
+
+#nullable disable
+
+using System;
+using System.Collections.Generic;
+using System.Linq;
+using System.Text;
+
+namespace OpenAI
+{
+    internal class ClientUriBuilder
+    {
+        private UriBuilder _uriBuilder;
+        private StringBuilder _pathBuilder;
+        private StringBuilder _queryBuilder;
+
+        public ClientUriBuilder()
+        {
+        }
+
+        private UriBuilder UriBuilder => _uriBuilder ??= new UriBuilder();
+
+        private StringBuilder PathBuilder => _pathBuilder ??= new StringBuilder(UriBuilder.Path);
+
+        private StringBuilder QueryBuilder => _queryBuilder ??= new StringBuilder(UriBuilder.Query);
+
+        public void Reset(Uri uri)
+        {
+            _uriBuilder = new UriBuilder(uri);
+            _pathBuilder = new StringBuilder(UriBuilder.Path);
+            _queryBuilder = new StringBuilder(UriBuilder.Query);
+        }
+
+        public void AppendPath(string value, bool escape)
+        {
+            Argument.AssertNotNullOrWhiteSpace(value, nameof(value));
+
+            if (escape)
+            {
+                value = Uri.EscapeDataString(value);
+            }
+
+            if (PathBuilder.Length > 0 && PathBuilder[PathBuilder.Length - 1] == '/' && value[0] == '/')
+            {
+                PathBuilder.Remove(PathBuilder.Length - 1, 1);
+            }
+
+            PathBuilder.Append(value);
+            UriBuilder.Path = PathBuilder.ToString();
+        }
+
+        public void AppendPath(bool value, bool escape = false)
+        {
+            AppendPath(ModelSerializationExtensions.TypeFormatters.ConvertToString(value), escape);
+        }
+
+        public void AppendPath(float value, bool escape = true)
+        {
+            AppendPath(ModelSerializationExtensions.TypeFormatters.ConvertToString(value), escape);
+        }
+
+        public void AppendPath(double value, bool escape = true)
+        {
+            AppendPath(ModelSerializationExtensions.TypeFormatters.ConvertToString(value), escape);
+        }
+
+        public void AppendPath(int value, bool escape = true)
+        {
+            AppendPath(ModelSerializationExtensions.TypeFormatters.ConvertToString(value), escape);
+        }
+
+        public void AppendPath(byte[] value, string format, bool escape = true)
+        {
+            AppendPath(ModelSerializationExtensions.TypeFormatters.ConvertToString(value, format), escape);
+        }
+
+        public void AppendPath(IEnumerable<string> value, bool escape = true)
+        {
+            AppendPath(ModelSerializationExtensions.TypeFormatters.ConvertToString(value), escape);
+        }
+
+        public void AppendPath(DateTimeOffset value, string format, bool escape = true)
+        {
+            AppendPath(ModelSerializationExtensions.TypeFormatters.ConvertToString(value, format), escape);
+        }
+
+        public void AppendPath(TimeSpan value, string format, bool escape = true)
+        {
+            AppendPath(ModelSerializationExtensions.TypeFormatters.ConvertToString(value, format), escape);
+        }
+
+        public void AppendPath(Guid value, bool escape = true)
+        {
+            AppendPath(ModelSerializationExtensions.TypeFormatters.ConvertToString(value), escape);
+        }
+
+        public void AppendPath(long value, bool escape = true)
+        {
+            AppendPath(ModelSerializationExtensions.TypeFormatters.ConvertToString(value), escape);
+        }
+
+        public void AppendQuery(string name, string value, bool escape)
+        {
+            Argument.AssertNotNullOrWhiteSpace(name, nameof(name));
+            Argument.AssertNotNullOrWhiteSpace(value, nameof(value));
+
+            if (QueryBuilder.Length > 0)
+            {
+                QueryBuilder.Append('&');
+            }
+
+            if (escape)
+            {
+                value = Uri.EscapeDataString(value);
+            }
+
+            QueryBuilder.Append(name);
+            QueryBuilder.Append('=');
+            QueryBuilder.Append(value);
+        }
+
+        public void AppendQuery(string name, bool value, bool escape = false)
+        {
+            AppendQuery(name, ModelSerializationExtensions.TypeFormatters.ConvertToString(value), escape);
+        }
+
+        public void AppendQuery(string name, float value, bool escape = true)
+        {
+            AppendQuery(name, ModelSerializationExtensions.TypeFormatters.ConvertToString(value), escape);
+        }
+
+        public void AppendQuery(string name, DateTimeOffset value, string format, bool escape = true)
+        {
+            AppendQuery(name, ModelSerializationExtensions.TypeFormatters.ConvertToString(value, format), escape);
+        }
+
+        public void AppendQuery(string name, TimeSpan value, string format, bool escape = true)
+        {
+            AppendQuery(name, ModelSerializationExtensions.TypeFormatters.ConvertToString(value, format), escape);
+        }
+
+        public void AppendQuery(string name, double value, bool escape = true)
+        {
+            AppendQuery(name, ModelSerializationExtensions.TypeFormatters.ConvertToString(value), escape);
+        }
+
+        public void AppendQuery(string name, decimal value, bool escape = true)
+        {
+            AppendQuery(name, ModelSerializationExtensions.TypeFormatters.ConvertToString(value), escape);
+        }
+
+        public void AppendQuery(string name, int value, bool escape = true)
+        {
+            AppendQuery(name, ModelSerializationExtensions.TypeFormatters.ConvertToString(value), escape);
+        }
+
+        public void AppendQuery(string name, long value, bool escape = true)
+        {
+            AppendQuery(name, ModelSerializationExtensions.TypeFormatters.ConvertToString(value), escape);
+        }
+
+        public void AppendQuery(string name, TimeSpan value, bool escape = true)
+        {
+            AppendQuery(name, ModelSerializationExtensions.TypeFormatters.ConvertToString(value), escape);
+        }
+
+        public void AppendQuery(string name, byte[] value, string format, bool escape = true)
+        {
+            AppendQuery(name, ModelSerializationExtensions.TypeFormatters.ConvertToString(value, format), escape);
+        }
+
+        public void AppendQuery(string name, Guid value, bool escape = true)
+        {
+            AppendQuery(name, ModelSerializationExtensions.TypeFormatters.ConvertToString(value), escape);
+        }
+
+        public void AppendQueryDelimited<T>(string name, IEnumerable<T> value, string delimiter, bool escape = true)
+        {
+            var stringValues = value.Select(v => ModelSerializationExtensions.TypeFormatters.ConvertToString(v));
+            AppendQuery(name, string.Join(delimiter, stringValues), escape);
+        }
+
+        public void AppendQueryDelimited<T>(string name, IEnumerable<T> value, string delimiter, string format, bool escape = true)
+        {
+            var stringValues = value.Select(v => ModelSerializationExtensions.TypeFormatters.ConvertToString(v, format));
+            AppendQuery(name, string.Join(delimiter, stringValues), escape);
+        }
+
+        public Uri ToUri()
+        {
+            if (_pathBuilder != null)
+            {
+                UriBuilder.Path = _pathBuilder.ToString();
+            }
+
+            if (_queryBuilder != null)
+            {
+                UriBuilder.Query = _queryBuilder.ToString();
+            }
+
+            return UriBuilder.Uri;
+        }
+    }
+}
diff --git a/.dotnet/src/Generated/Internal/ErrorResult.cs b/.dotnet/src/Generated/Internal/ErrorResult.cs
new file mode 100644
index 000000000..42938626f
--- /dev/null
+++ b/.dotnet/src/Generated/Internal/ErrorResult.cs
@@ -0,0 +1,23 @@
+//
+
+#nullable disable
+
+using System.ClientModel;
+using System.ClientModel.Primitives;
+
+namespace OpenAI
+{
+    internal class ErrorResult<T> : ClientResult<T>
+    {
+        private readonly PipelineResponse _response;
+        private readonly ClientResultException _exception;
+
+        public ErrorResult(PipelineResponse response, ClientResultException exception) : base(default, response)
+        {
+            _response = response;
+            _exception = exception;
+        }
+
+        public override T Value => throw _exception;
+    }
+}
diff --git a/.dotnet/src/Generated/Internal/ModelSerializationExtensions.cs b/.dotnet/src/Generated/Internal/ModelSerializationExtensions.cs
new file mode 100644
index 000000000..d1334d37b
--- /dev/null
+++ b/.dotnet/src/Generated/Internal/ModelSerializationExtensions.cs
@@ -0,0 +1,399 @@
+//
+
+#nullable disable
+
+using System;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Diagnostics;
+using System.Globalization;
+using System.Text.Json;
+using System.Xml;
+
+namespace OpenAI
+{
+    internal static class ModelSerializationExtensions
+    {
+        internal static readonly ModelReaderWriterOptions WireOptions = new ModelReaderWriterOptions("W");
+        internal static readonly BinaryData SentinelValue = BinaryData.FromObjectAsJson("__EMPTY__");
+
+        public static object GetObject(this JsonElement element)
+        {
+            switch (element.ValueKind)
+            {
+                case JsonValueKind.String:
+                    return element.GetString();
+                case JsonValueKind.Number:
+                    if
(element.TryGetInt32(out int intValue)) + { + return intValue; + } + if (element.TryGetInt64(out long longValue)) + { + return longValue; + } + return element.GetDouble(); + case JsonValueKind.True: + return true; + case JsonValueKind.False: + return false; + case JsonValueKind.Undefined: + case JsonValueKind.Null: + return null; + case JsonValueKind.Object: + var dictionary = new Dictionary<string, object>(); + foreach (var jsonProperty in element.EnumerateObject()) + { + dictionary.Add(jsonProperty.Name, jsonProperty.Value.GetObject()); + } + return dictionary; + case JsonValueKind.Array: + var list = new List<object>(); + foreach (var item in element.EnumerateArray()) + { + list.Add(item.GetObject()); + } + return list.ToArray(); + default: + throw new NotSupportedException($"Not supported value kind {element.ValueKind}"); + } + } + + public static byte[] GetBytesFromBase64(this JsonElement element, string format) + { + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + + return format switch + { + "U" => TypeFormatters.FromBase64UrlString(element.GetRequiredString()), + "D" => element.GetBytesFromBase64(), + _ => throw new ArgumentException($"Format is not supported: '{format}'", nameof(format)) + }; + } + + public static DateTimeOffset GetDateTimeOffset(this JsonElement element, string format) => format switch + { + "U" when element.ValueKind == JsonValueKind.Number => DateTimeOffset.FromUnixTimeSeconds(element.GetInt64()), + _ => TypeFormatters.ParseDateTimeOffset(element.GetString(), format) + }; + + public static TimeSpan GetTimeSpan(this JsonElement element, string format) => TypeFormatters.ParseTimeSpan(element.GetString(), format); + + public static char GetChar(this JsonElement element) + { + if (element.ValueKind == JsonValueKind.String) + { + var text = element.GetString(); + if (text == null || text.Length != 1) + { + throw new NotSupportedException($"Cannot convert \"{text}\" to a char"); + } + return text[0]; + } + else + { + throw new 
NotSupportedException($"Cannot convert {element.ValueKind} to a char"); + } + } + + [Conditional("DEBUG")] + public static void ThrowNonNullablePropertyIsNull(this JsonProperty property) + { + throw new JsonException($"A property '{property.Name}' defined as non-nullable but received as null from the service. This exception only happens in DEBUG builds of the library and would be ignored in the release build"); + } + + public static string GetRequiredString(this JsonElement element) + { + var value = element.GetString(); + if (value == null) + { + throw new InvalidOperationException($"The requested operation requires an element of type 'String', but the target element has type '{element.ValueKind}'."); + } + return value; + } + + public static void WriteStringValue(this Utf8JsonWriter writer, DateTimeOffset value, string format) + { + writer.WriteStringValue(TypeFormatters.ToString(value, format)); + } + + public static void WriteStringValue(this Utf8JsonWriter writer, DateTime value, string format) + { + writer.WriteStringValue(TypeFormatters.ToString(value, format)); + } + + public static void WriteStringValue(this Utf8JsonWriter writer, TimeSpan value, string format) + { + writer.WriteStringValue(TypeFormatters.ToString(value, format)); + } + + public static void WriteStringValue(this Utf8JsonWriter writer, char value) + { + writer.WriteStringValue(value.ToString(CultureInfo.InvariantCulture)); + } + + public static void WriteBase64StringValue(this Utf8JsonWriter writer, byte[] value, string format) + { + if (value == null) + { + writer.WriteNullValue(); + return; + } + switch (format) + { + case "U": + writer.WriteStringValue(TypeFormatters.ToBase64UrlString(value)); + break; + case "D": + writer.WriteBase64StringValue(value); + break; + default: + throw new ArgumentException($"Format is not supported: '{format}'", nameof(format)); + } + } + + public static void WriteNumberValue(this Utf8JsonWriter writer, DateTimeOffset value, string format) + { + if (format 
!= "U") + { + throw new ArgumentOutOfRangeException(nameof(format), "Only 'U' format is supported when writing a DateTimeOffset as a Number."); + } + writer.WriteNumberValue(value.ToUnixTimeSeconds()); + } + + public static void WriteObjectValue<T>(this Utf8JsonWriter writer, T value, ModelReaderWriterOptions options = null) + { + switch (value) + { + case null: + writer.WriteNullValue(); + break; + case IJsonModel<T> jsonModel: + jsonModel.Write(writer, options ?? WireOptions); + break; + case byte[] bytes: + writer.WriteBase64StringValue(bytes); + break; + case BinaryData bytes0: + writer.WriteBase64StringValue(bytes0); + break; + case JsonElement json: + json.WriteTo(writer); + break; + case int i: + writer.WriteNumberValue(i); + break; + case decimal d: + writer.WriteNumberValue(d); + break; + case double d0: + if (double.IsNaN(d0)) + { + writer.WriteStringValue("NaN"); + } + else + { + writer.WriteNumberValue(d0); + } + break; + case float f: + writer.WriteNumberValue(f); + break; + case long l: + writer.WriteNumberValue(l); + break; + case string s: + writer.WriteStringValue(s); + break; + case bool b: + writer.WriteBooleanValue(b); + break; + case Guid g: + writer.WriteStringValue(g); + break; + case DateTimeOffset dateTimeOffset: + writer.WriteStringValue(dateTimeOffset, "O"); + break; + case DateTime dateTime: + writer.WriteStringValue(dateTime, "O"); + break; + case IEnumerable<KeyValuePair<string, object>> enumerable: + writer.WriteStartObject(); + foreach (var pair in enumerable) + { + writer.WritePropertyName(pair.Key); + writer.WriteObjectValue<object>(pair.Value, options); + } + writer.WriteEndObject(); + break; + case IEnumerable<object> objectEnumerable: + writer.WriteStartArray(); + foreach (var item in objectEnumerable) + { + writer.WriteObjectValue<object>(item, options); + } + writer.WriteEndArray(); + break; + case TimeSpan timeSpan: + writer.WriteStringValue(timeSpan, "P"); + break; + default: + throw new NotSupportedException($"Not supported type {value.GetType()}"); + } + } + + public static void 
WriteObjectValue(this Utf8JsonWriter writer, object value, ModelReaderWriterOptions options = null) + { + writer.WriteObjectValue<object>(value, options); + } + + internal static bool IsSentinelValue(BinaryData value) + { + ReadOnlySpan<byte> sentinelSpan = SentinelValue.ToMemory().Span; + ReadOnlySpan<byte> valueSpan = value.ToMemory().Span; + return sentinelSpan.SequenceEqual(valueSpan); + } + + internal static class TypeFormatters + { + private const string RoundtripZFormat = "yyyy-MM-ddTHH:mm:ss.fffffffZ"; + public const string DefaultNumberFormat = "G"; + + public static string ToString(bool value) => value ? "true" : "false"; + + public static string ToString(DateTime value, string format) => value.Kind switch + { + DateTimeKind.Utc => ToString((DateTimeOffset)value, format), + _ => throw new NotSupportedException($"DateTime {value} has a Kind of {value.Kind}. Generated clients require it to be UTC. You can call DateTime.SpecifyKind to change Kind property value to DateTimeKind.Utc.") + }; + + public static string ToString(DateTimeOffset value, string format) => format switch + { + "D" => value.ToString("yyyy-MM-dd", CultureInfo.InvariantCulture), + "U" => value.ToUnixTimeSeconds().ToString(CultureInfo.InvariantCulture), + "O" => value.ToUniversalTime().ToString(RoundtripZFormat, CultureInfo.InvariantCulture), + "o" => value.ToUniversalTime().ToString(RoundtripZFormat, CultureInfo.InvariantCulture), + "R" => value.ToString("r", CultureInfo.InvariantCulture), + _ => value.ToString(format, CultureInfo.InvariantCulture) + }; + + public static string ToString(TimeSpan value, string format) => format switch + { + "P" => XmlConvert.ToString(value), + _ => value.ToString(format, CultureInfo.InvariantCulture) + }; + + public static string ToString(byte[] value, string format) => format switch + { + "U" => ToBase64UrlString(value), + "D" => Convert.ToBase64String(value), + _ => throw new ArgumentException($"Format is not supported: '{format}'", nameof(format)) + }; + + public static 
string ToBase64UrlString(byte[] value) + { + int numWholeOrPartialInputBlocks = checked(value.Length + 2) / 3; + int size = checked(numWholeOrPartialInputBlocks * 4); + char[] output = new char[size]; + + int numBase64Chars = Convert.ToBase64CharArray(value, 0, value.Length, output, 0); + + int i = 0; + for (; i < numBase64Chars; i++) + { + char ch = output[i]; + if (ch == '+') + { + output[i] = '-'; + } + else + { + if (ch == '/') + { + output[i] = '_'; + } + else + { + if (ch == '=') + { + break; + } + } + } + } + + return new string(output, 0, i); + } + + public static byte[] FromBase64UrlString(string value) + { + int paddingCharsToAdd = (value.Length % 4) switch + { + 0 => 0, + 2 => 2, + 3 => 1, + _ => throw new InvalidOperationException("Malformed input") + }; + char[] output = new char[(value.Length + paddingCharsToAdd)]; + int i = 0; + for (; i < value.Length; i++) + { + char ch = value[i]; + if (ch == '-') + { + output[i] = '+'; + } + else + { + if (ch == '_') + { + output[i] = '/'; + } + else + { + output[i] = ch; + } + } + } + + for (; i < output.Length; i++) + { + output[i] = '='; + } + + return Convert.FromBase64CharArray(output, 0, output.Length); + } + + public static DateTimeOffset ParseDateTimeOffset(string value, string format) => format switch + { + "U" => DateTimeOffset.FromUnixTimeSeconds(long.Parse(value, CultureInfo.InvariantCulture)), + _ => DateTimeOffset.Parse(value, CultureInfo.InvariantCulture, DateTimeStyles.AssumeUniversal) + }; + + public static TimeSpan ParseTimeSpan(string value, string format) => format switch + { + "P" => XmlConvert.ToTimeSpan(value), + _ => TimeSpan.ParseExact(value, format, CultureInfo.InvariantCulture) + }; + + public static string ConvertToString(object value, string format = null) => value switch + { + null => "null", + string s => s, + bool b => ToString(b), + int or float or double or long or decimal => ((IFormattable)value).ToString(DefaultNumberFormat, CultureInfo.InvariantCulture), + byte[] b0 when 
format != null => ToString(b0, format), + IEnumerable<string> s0 => string.Join(",", s0), + DateTimeOffset dateTime when format != null => ToString(dateTime, format), + TimeSpan timeSpan when format != null => ToString(timeSpan, format), + TimeSpan timeSpan0 => XmlConvert.ToString(timeSpan0), + Guid guid => guid.ToString(), + BinaryData binaryData => ConvertToString(binaryData.ToArray(), format), + _ => value.ToString() + }; + } + } +} diff --git a/.dotnet/src/Generated/Internal/Optional.cs b/.dotnet/src/Generated/Internal/Optional.cs new file mode 100644 index 000000000..7b3fe4806 --- /dev/null +++ b/.dotnet/src/Generated/Internal/Optional.cs @@ -0,0 +1,48 @@ +// + +#nullable disable + +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI +{ + internal static class Optional + { + public static bool IsCollectionDefined<T>(IEnumerable<T> collection) + { + return !(collection is ChangeTrackingList<T> changeTrackingList && changeTrackingList.IsUndefined); + } + + public static bool IsCollectionDefined<T>(IDictionary<string, T> collection) + { + return !(collection is ChangeTrackingDictionary<string, T> changeTrackingDictionary && changeTrackingDictionary.IsUndefined); + } + + public static bool IsCollectionDefined<T>(IReadOnlyDictionary<string, T> collection) + { + return !(collection is ChangeTrackingDictionary<string, T> changeTrackingDictionary && changeTrackingDictionary.IsUndefined); + } + + public static bool IsDefined<T>(T? value) + where T : struct + { + return value.HasValue; + } + + public static bool IsDefined(object value) + { + return value != null; + } + + public static bool IsDefined(JsonElement value) + { + return value.ValueKind != JsonValueKind.Undefined; + } + + public static bool IsDefined(string value) + { + return value != null; + } + } +} diff --git a/.dotnet/src/Generated/Internal/Utf8JsonBinaryContent.cs b/.dotnet/src/Generated/Internal/Utf8JsonBinaryContent.cs new file mode 100644 index 000000000..1f7d30685 --- /dev/null +++ b/.dotnet/src/Generated/Internal/Utf8JsonBinaryContent.cs @@ -0,0 +1,52 @@ +// + +#nullable disable + +using System.ClientModel; +using System.IO; +using System.Text.Json; +using System.Threading; +using System.Threading.Tasks; + +namespace OpenAI +{ + internal class Utf8JsonBinaryContent : BinaryContent + { + private readonly MemoryStream _stream; + private readonly BinaryContent _content; + + public Utf8JsonBinaryContent() + { + _stream = new MemoryStream(); + _content = Create(_stream); + JsonWriter = new Utf8JsonWriter(_stream); + } + + public Utf8JsonWriter JsonWriter { get; } + + public override async Task WriteToAsync(Stream stream, CancellationToken cancellationToken = default) + { + await JsonWriter.FlushAsync().ConfigureAwait(false); + await _content.WriteToAsync(stream, cancellationToken).ConfigureAwait(false); + } + + public override void WriteTo(Stream stream, CancellationToken cancellationToken = default) + { + JsonWriter.Flush(); + _content.WriteTo(stream, cancellationToken); + } + + public override bool TryComputeLength(out long length) + { + length = JsonWriter.BytesCommitted + JsonWriter.BytesPending; + return true; + } + + public override void Dispose() + { + JsonWriter.Dispose(); + _content.Dispose(); + _stream.Dispose(); + } + } +} diff --git a/.dotnet/src/Generated/InternalAssistantMessageClient.cs b/.dotnet/src/Generated/InternalAssistantMessageClient.cs new file mode 100644 index 000000000..5b211af7f --- /dev/null +++ 
b/.dotnet/src/Generated/InternalAssistantMessageClient.cs @@ -0,0 +1,154 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Threading.Tasks; + +namespace OpenAI.Assistants +{ + // Data plane generated sub-client. + internal partial class InternalAssistantMessageClient + { + private const string AuthorizationHeader = "Authorization"; + private readonly ApiKeyCredential _keyCredential; + private const string AuthorizationApiKeyPrefix = "Bearer"; + private readonly ClientPipeline _pipeline; + private readonly Uri _endpoint; + + public virtual ClientPipeline Pipeline => _pipeline; + + protected InternalAssistantMessageClient() + { + } + + public virtual async Task<ClientResult> GetMessagesAsync(string threadId, int? limit, string order, string after, string before, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + + using PipelineMessage message = CreateGetMessagesRequest(threadId, limit, order, after, before, options); + return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + public virtual ClientResult GetMessages(string threadId, int? 
limit, string order, string after, string before, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + + using PipelineMessage message = CreateGetMessagesRequest(threadId, limit, order, after, before, options); + return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options)); + } + + internal PipelineMessage CreateCreateMessageRequest(string threadId, BinaryContent content, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "POST"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/threads/", false); + uri.AppendPath(threadId, true); + uri.AppendPath("/messages", false); + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + request.Headers.Set("Content-Type", "application/json"); + request.Content = content; + message.Apply(options); + return message; + } + + internal PipelineMessage CreateGetMessagesRequest(string threadId, int? 
limit, string order, string after, string before, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "GET"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/threads/", false); + uri.AppendPath(threadId, true); + uri.AppendPath("/messages", false); + if (limit != null) + { + uri.AppendQuery("limit", limit.Value, true); + } + if (order != null) + { + uri.AppendQuery("order", order, true); + } + if (after != null) + { + uri.AppendQuery("after", after, true); + } + if (before != null) + { + uri.AppendQuery("before", before, true); + } + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + message.Apply(options); + return message; + } + + internal PipelineMessage CreateGetMessageRequest(string threadId, string messageId, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "GET"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/threads/", false); + uri.AppendPath(threadId, true); + uri.AppendPath("/messages/", false); + uri.AppendPath(messageId, true); + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + message.Apply(options); + return message; + } + + internal PipelineMessage CreateModifyMessageRequest(string threadId, string messageId, BinaryContent content, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "POST"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/threads/", false); + uri.AppendPath(threadId, true); + uri.AppendPath("/messages/", false); + uri.AppendPath(messageId, true); + request.Uri = uri.ToUri(); + 
request.Headers.Set("Accept", "application/json"); + request.Headers.Set("Content-Type", "application/json"); + request.Content = content; + message.Apply(options); + return message; + } + + internal PipelineMessage CreateDeleteMessageRequest(string threadId, string messageId, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "DELETE"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/threads/", false); + uri.AppendPath(threadId, true); + uri.AppendPath("/messages/", false); + uri.AppendPath(messageId, true); + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + message.Apply(options); + return message; + } + + private static PipelineMessageClassifier _pipelineMessageClassifier200; + private static PipelineMessageClassifier PipelineMessageClassifier200 => _pipelineMessageClassifier200 ??= PipelineMessageClassifier.Create(stackalloc ushort[] { 200 }); + } +} diff --git a/.dotnet/src/Generated/InternalAssistantRunClient.cs b/.dotnet/src/Generated/InternalAssistantRunClient.cs new file mode 100644 index 000000000..2a6b2de95 --- /dev/null +++ b/.dotnet/src/Generated/InternalAssistantRunClient.cs @@ -0,0 +1,266 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Threading.Tasks; + +namespace OpenAI.Assistants +{ + // Data plane generated sub-client. 
+ internal partial class InternalAssistantRunClient + { + private const string AuthorizationHeader = "Authorization"; + private readonly ApiKeyCredential _keyCredential; + private const string AuthorizationApiKeyPrefix = "Bearer"; + private readonly ClientPipeline _pipeline; + private readonly Uri _endpoint; + + public virtual ClientPipeline Pipeline => _pipeline; + + protected InternalAssistantRunClient() + { + } + + public virtual async Task<ClientResult> GetRunsAsync(string threadId, int? limit, string order, string after, string before, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + + using PipelineMessage message = CreateGetRunsRequest(threadId, limit, order, after, before, options); + return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + public virtual ClientResult GetRuns(string threadId, int? limit, string order, string after, string before, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + + using PipelineMessage message = CreateGetRunsRequest(threadId, limit, order, after, before, options); + return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options)); + } + + public virtual async Task<ClientResult> GetRunStepsAsync(string threadId, string runId, int? limit, string order, string after, string before, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + Argument.AssertNotNullOrEmpty(runId, nameof(runId)); + + using PipelineMessage message = CreateGetRunStepsRequest(threadId, runId, limit, order, after, before, options); + return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + public virtual ClientResult GetRunSteps(string threadId, string runId, int? 
limit, string order, string after, string before, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(threadId, nameof(threadId)); + Argument.AssertNotNullOrEmpty(runId, nameof(runId)); + + using PipelineMessage message = CreateGetRunStepsRequest(threadId, runId, limit, order, after, before, options); + return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options)); + } + + internal PipelineMessage CreateCreateThreadAndRunRequest(BinaryContent content, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "POST"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/threads/runs", false); + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + request.Headers.Set("Content-Type", "application/json"); + request.Content = content; + message.Apply(options); + return message; + } + + internal PipelineMessage CreateCreateRunRequest(string threadId, BinaryContent content, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "POST"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/threads/", false); + uri.AppendPath(threadId, true); + uri.AppendPath("/runs", false); + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + request.Headers.Set("Content-Type", "application/json"); + request.Content = content; + message.Apply(options); + return message; + } + + internal PipelineMessage CreateGetRunsRequest(string threadId, int? 
limit, string order, string after, string before, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "GET"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/threads/", false); + uri.AppendPath(threadId, true); + uri.AppendPath("/runs", false); + if (limit != null) + { + uri.AppendQuery("limit", limit.Value, true); + } + if (order != null) + { + uri.AppendQuery("order", order, true); + } + if (after != null) + { + uri.AppendQuery("after", after, true); + } + if (before != null) + { + uri.AppendQuery("before", before, true); + } + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + message.Apply(options); + return message; + } + + internal PipelineMessage CreateGetRunRequest(string threadId, string runId, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "GET"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/threads/", false); + uri.AppendPath(threadId, true); + uri.AppendPath("/runs/", false); + uri.AppendPath(runId, true); + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + message.Apply(options); + return message; + } + + internal PipelineMessage CreateModifyRunRequest(string threadId, string runId, BinaryContent content, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "POST"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/threads/", false); + uri.AppendPath(threadId, true); + uri.AppendPath("/runs/", false); + uri.AppendPath(runId, true); + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", 
"application/json"); + request.Headers.Set("Content-Type", "application/json"); + request.Content = content; + message.Apply(options); + return message; + } + + internal PipelineMessage CreateCancelRunRequest(string threadId, string runId, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "POST"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/threads/", false); + uri.AppendPath(threadId, true); + uri.AppendPath("/runs/", false); + uri.AppendPath(runId, true); + uri.AppendPath("/cancel", false); + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + message.Apply(options); + return message; + } + + internal PipelineMessage CreateSubmitToolOutputsToRunRequest(string threadId, string runId, BinaryContent content, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "POST"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/threads/", false); + uri.AppendPath(threadId, true); + uri.AppendPath("/runs/", false); + uri.AppendPath(runId, true); + uri.AppendPath("/submit_tool_outputs", false); + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + request.Headers.Set("Content-Type", "application/json"); + request.Content = content; + message.Apply(options); + return message; + } + + internal PipelineMessage CreateGetRunStepsRequest(string threadId, string runId, int? 
limit, string order, string after, string before, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "GET"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/threads/", false); + uri.AppendPath(threadId, true); + uri.AppendPath("/runs/", false); + uri.AppendPath(runId, true); + uri.AppendPath("/steps", false); + if (limit != null) + { + uri.AppendQuery("limit", limit.Value, true); + } + if (order != null) + { + uri.AppendQuery("order", order, true); + } + if (after != null) + { + uri.AppendQuery("after", after, true); + } + if (before != null) + { + uri.AppendQuery("before", before, true); + } + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + message.Apply(options); + return message; + } + + internal PipelineMessage CreateGetRunStepRequest(string threadId, string runId, string stepId, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "GET"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/threads/", false); + uri.AppendPath(threadId, true); + uri.AppendPath("/runs/", false); + uri.AppendPath(runId, true); + uri.AppendPath("/steps/", false); + uri.AppendPath(stepId, true); + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + message.Apply(options); + return message; + } + + private static PipelineMessageClassifier _pipelineMessageClassifier200; + private static PipelineMessageClassifier PipelineMessageClassifier200 => _pipelineMessageClassifier200 ??= PipelineMessageClassifier.Create(stackalloc ushort[] { 200 }); + } +} diff --git a/.dotnet/src/Generated/InternalAssistantThreadClient.cs b/.dotnet/src/Generated/InternalAssistantThreadClient.cs new file mode 100644 index 
000000000..9bba60ad5 --- /dev/null +++ b/.dotnet/src/Generated/InternalAssistantThreadClient.cs @@ -0,0 +1,97 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Threading.Tasks; + +namespace OpenAI.Assistants +{ + // Data plane generated sub-client. + internal partial class InternalAssistantThreadClient + { + private const string AuthorizationHeader = "Authorization"; + private readonly ApiKeyCredential _keyCredential; + private const string AuthorizationApiKeyPrefix = "Bearer"; + private readonly ClientPipeline _pipeline; + private readonly Uri _endpoint; + + public virtual ClientPipeline Pipeline => _pipeline; + + protected InternalAssistantThreadClient() + { + } + + internal PipelineMessage CreateCreateThreadRequest(BinaryContent content, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "POST"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/threads", false); + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + request.Headers.Set("Content-Type", "application/json"); + request.Content = content; + message.Apply(options); + return message; + } + + internal PipelineMessage CreateGetThreadRequest(string threadId, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "GET"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/threads/", false); + uri.AppendPath(threadId, true); + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + message.Apply(options); + return message; + } + + internal PipelineMessage CreateModifyThreadRequest(string threadId, BinaryContent content, RequestOptions options) + { + var message = 
_pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "POST"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/threads/", false); + uri.AppendPath(threadId, true); + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + request.Headers.Set("Content-Type", "application/json"); + request.Content = content; + message.Apply(options); + return message; + } + + internal PipelineMessage CreateDeleteThreadRequest(string threadId, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "DELETE"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/threads/", false); + uri.AppendPath(threadId, true); + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + message.Apply(options); + return message; + } + + private static PipelineMessageClassifier _pipelineMessageClassifier200; + private static PipelineMessageClassifier PipelineMessageClassifier200 => _pipelineMessageClassifier200 ??= PipelineMessageClassifier.Create(stackalloc ushort[] { 200 }); + } +} diff --git a/.dotnet/src/Generated/InternalUploadsClient.cs b/.dotnet/src/Generated/InternalUploadsClient.cs new file mode 100644 index 000000000..0e30e2f4d --- /dev/null +++ b/.dotnet/src/Generated/InternalUploadsClient.cs @@ -0,0 +1,244 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Threading.Tasks; + +namespace OpenAI.Files +{ + // Data plane generated sub-client. 
+ internal partial class InternalUploadsClient + { + private const string AuthorizationHeader = "Authorization"; + private readonly ApiKeyCredential _keyCredential; + private const string AuthorizationApiKeyPrefix = "Bearer"; + private readonly ClientPipeline _pipeline; + private readonly Uri _endpoint; + + public virtual ClientPipeline Pipeline => _pipeline; + + protected InternalUploadsClient() + { + } + + public virtual async Task<ClientResult<InternalUpload>> CreateUploadAsync(InternalCreateUploadRequest requestBody) + { + Argument.AssertNotNull(requestBody, nameof(requestBody)); + + using BinaryContent content = requestBody.ToBinaryContent(); + ClientResult result = await CreateUploadAsync(content, null).ConfigureAwait(false); + return ClientResult.FromValue(InternalUpload.FromResponse(result.GetRawResponse()), result.GetRawResponse()); + } + + public virtual ClientResult<InternalUpload> CreateUpload(InternalCreateUploadRequest requestBody) + { + Argument.AssertNotNull(requestBody, nameof(requestBody)); + + using BinaryContent content = requestBody.ToBinaryContent(); + ClientResult result = CreateUpload(content, null); + return ClientResult.FromValue(InternalUpload.FromResponse(result.GetRawResponse()), result.GetRawResponse()); + } + + public virtual async Task<ClientResult> CreateUploadAsync(BinaryContent content, RequestOptions options = null) + { + Argument.AssertNotNull(content, nameof(content)); + + using PipelineMessage message = CreateCreateUploadRequest(content, options); + return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + public virtual ClientResult CreateUpload(BinaryContent content, RequestOptions options = null) + { + Argument.AssertNotNull(content, nameof(content)); + + using PipelineMessage message = CreateCreateUploadRequest(content, options); + return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options)); + } + + public virtual async Task<ClientResult<InternalUploadPart>> AddUploadPartAsync(string uploadId, InternalAddUploadPartRequest requestBody) + 
{ + Argument.AssertNotNullOrEmpty(uploadId, nameof(uploadId)); + Argument.AssertNotNull(requestBody, nameof(requestBody)); + + using MultipartFormDataBinaryContent content = requestBody.ToMultipartBinaryBody(); + ClientResult result = await AddUploadPartAsync(uploadId, content, content.ContentType, (RequestOptions)null).ConfigureAwait(false); + return ClientResult.FromValue(InternalUploadPart.FromResponse(result.GetRawResponse()), result.GetRawResponse()); + } + + public virtual ClientResult<InternalUploadPart> AddUploadPart(string uploadId, InternalAddUploadPartRequest requestBody) + { + Argument.AssertNotNullOrEmpty(uploadId, nameof(uploadId)); + Argument.AssertNotNull(requestBody, nameof(requestBody)); + + using MultipartFormDataBinaryContent content = requestBody.ToMultipartBinaryBody(); + ClientResult result = AddUploadPart(uploadId, content, content.ContentType, (RequestOptions)null); + return ClientResult.FromValue(InternalUploadPart.FromResponse(result.GetRawResponse()), result.GetRawResponse()); + } + + public virtual async Task<ClientResult> AddUploadPartAsync(string uploadId, BinaryContent content, string contentType, RequestOptions options = null) + { + Argument.AssertNotNullOrEmpty(uploadId, nameof(uploadId)); + Argument.AssertNotNull(content, nameof(content)); + + using PipelineMessage message = CreateAddUploadPartRequest(uploadId, content, contentType, options); + return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + public virtual ClientResult AddUploadPart(string uploadId, BinaryContent content, string contentType, RequestOptions options = null) + { + Argument.AssertNotNullOrEmpty(uploadId, nameof(uploadId)); + Argument.AssertNotNull(content, nameof(content)); + + using PipelineMessage message = CreateAddUploadPartRequest(uploadId, content, contentType, options); + return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options)); + } + + public virtual async Task<ClientResult<InternalUpload>> CompleteUploadAsync(string uploadId, 
InternalCompleteUploadRequest requestBody) + { + Argument.AssertNotNullOrEmpty(uploadId, nameof(uploadId)); + Argument.AssertNotNull(requestBody, nameof(requestBody)); + + using BinaryContent content = requestBody.ToBinaryContent(); + ClientResult result = await CompleteUploadAsync(uploadId, content, null).ConfigureAwait(false); + return ClientResult.FromValue(InternalUpload.FromResponse(result.GetRawResponse()), result.GetRawResponse()); + } + + public virtual ClientResult<InternalUpload> CompleteUpload(string uploadId, InternalCompleteUploadRequest requestBody) + { + Argument.AssertNotNullOrEmpty(uploadId, nameof(uploadId)); + Argument.AssertNotNull(requestBody, nameof(requestBody)); + + using BinaryContent content = requestBody.ToBinaryContent(); + ClientResult result = CompleteUpload(uploadId, content, null); + return ClientResult.FromValue(InternalUpload.FromResponse(result.GetRawResponse()), result.GetRawResponse()); + } + + public virtual async Task<ClientResult> CompleteUploadAsync(string uploadId, BinaryContent content, RequestOptions options = null) + { + Argument.AssertNotNullOrEmpty(uploadId, nameof(uploadId)); + Argument.AssertNotNull(content, nameof(content)); + + using PipelineMessage message = CreateCompleteUploadRequest(uploadId, content, options); + return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + public virtual ClientResult CompleteUpload(string uploadId, BinaryContent content, RequestOptions options = null) + { + Argument.AssertNotNullOrEmpty(uploadId, nameof(uploadId)); + Argument.AssertNotNull(content, nameof(content)); + + using PipelineMessage message = CreateCompleteUploadRequest(uploadId, content, options); + return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options)); + } + + public virtual async Task<ClientResult<InternalUpload>> CancelUploadAsync(string uploadId) + { + Argument.AssertNotNullOrEmpty(uploadId, nameof(uploadId)); + + ClientResult result = await CancelUploadAsync(uploadId, 
null).ConfigureAwait(false); + return ClientResult.FromValue(InternalUpload.FromResponse(result.GetRawResponse()), result.GetRawResponse()); + } + + public virtual ClientResult<InternalUpload> CancelUpload(string uploadId) + { + Argument.AssertNotNullOrEmpty(uploadId, nameof(uploadId)); + + ClientResult result = CancelUpload(uploadId, null); + return ClientResult.FromValue(InternalUpload.FromResponse(result.GetRawResponse()), result.GetRawResponse()); + } + + public virtual async Task<ClientResult> CancelUploadAsync(string uploadId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(uploadId, nameof(uploadId)); + + using PipelineMessage message = CreateCancelUploadRequest(uploadId, options); + return ClientResult.FromResponse(await _pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + public virtual ClientResult CancelUpload(string uploadId, RequestOptions options) + { + Argument.AssertNotNullOrEmpty(uploadId, nameof(uploadId)); + + using PipelineMessage message = CreateCancelUploadRequest(uploadId, options); + return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options)); + } + + internal PipelineMessage CreateCreateUploadRequest(BinaryContent content, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "POST"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/uploads", false); + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + request.Headers.Set("Content-Type", "application/json"); + request.Content = content; + message.Apply(options); + return message; + } + + internal PipelineMessage CreateAddUploadPartRequest(string uploadId, BinaryContent content, string contentType, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method 
= "POST"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/uploads/", false); + uri.AppendPath(uploadId, true); + uri.AppendPath("/parts", false); + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + request.Headers.Set("Content-Type", contentType); + request.Content = content; + message.Apply(options); + return message; + } + + internal PipelineMessage CreateCompleteUploadRequest(string uploadId, BinaryContent content, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "POST"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/uploads/", false); + uri.AppendPath(uploadId, true); + uri.AppendPath("/complete", false); + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + request.Headers.Set("Content-Type", "application/json"); + request.Content = content; + message.Apply(options); + return message; + } + + internal PipelineMessage CreateCancelUploadRequest(string uploadId, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "POST"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/uploads/", false); + uri.AppendPath(uploadId, true); + uri.AppendPath("/cancel", false); + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + message.Apply(options); + return message; + } + + private static PipelineMessageClassifier _pipelineMessageClassifier200; + private static PipelineMessageClassifier PipelineMessageClassifier200 => _pipelineMessageClassifier200 ??= PipelineMessageClassifier.Create(stackalloc ushort[] { 200 }); + } +} diff --git a/.dotnet/src/Generated/LegacyCompletionClient.cs 
b/.dotnet/src/Generated/LegacyCompletionClient.cs new file mode 100644 index 000000000..4a0ad7e28 --- /dev/null +++ b/.dotnet/src/Generated/LegacyCompletionClient.cs @@ -0,0 +1,81 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Threading.Tasks; + +namespace OpenAI.LegacyCompletions +{ + // Data plane generated sub-client. + internal partial class LegacyCompletionClient + { + private const string AuthorizationHeader = "Authorization"; + private readonly ApiKeyCredential _keyCredential; + private const string AuthorizationApiKeyPrefix = "Bearer"; + private readonly ClientPipeline _pipeline; + private readonly Uri _endpoint; + + public virtual ClientPipeline Pipeline => _pipeline; + + protected LegacyCompletionClient() + { + } + + public virtual async Task<ClientResult<InternalCreateCompletionResponse>> CreateCompletionAsync(InternalCreateCompletionRequest requestBody) + { + Argument.AssertNotNull(requestBody, nameof(requestBody)); + + using BinaryContent content = requestBody.ToBinaryContent(); + ClientResult result = await CreateCompletionAsync(content, null).ConfigureAwait(false); + return ClientResult.FromValue(InternalCreateCompletionResponse.FromResponse(result.GetRawResponse()), result.GetRawResponse()); + } + + public virtual ClientResult<InternalCreateCompletionResponse> CreateCompletion(InternalCreateCompletionRequest requestBody) + { + Argument.AssertNotNull(requestBody, nameof(requestBody)); + + using BinaryContent content = requestBody.ToBinaryContent(); + ClientResult result = CreateCompletion(content, null); + return ClientResult.FromValue(InternalCreateCompletionResponse.FromResponse(result.GetRawResponse()), result.GetRawResponse()); + } + + public virtual async Task<ClientResult> CreateCompletionAsync(BinaryContent content, RequestOptions options = null) + { + Argument.AssertNotNull(content, nameof(content)); + + using PipelineMessage message = CreateCreateCompletionRequest(content, options); + return ClientResult.FromResponse(await 
_pipeline.ProcessMessageAsync(message, options).ConfigureAwait(false)); + } + + public virtual ClientResult CreateCompletion(BinaryContent content, RequestOptions options = null) + { + Argument.AssertNotNull(content, nameof(content)); + + using PipelineMessage message = CreateCreateCompletionRequest(content, options); + return ClientResult.FromResponse(_pipeline.ProcessMessage(message, options)); + } + + internal PipelineMessage CreateCreateCompletionRequest(BinaryContent content, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "POST"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/completions", false); + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + request.Headers.Set("Content-Type", "application/json"); + request.Content = content; + message.Apply(options); + return message; + } + + private static PipelineMessageClassifier _pipelineMessageClassifier200; + private static PipelineMessageClassifier PipelineMessageClassifier200 => _pipelineMessageClassifier200 ??= PipelineMessageClassifier.Create(stackalloc ushort[] { 200 }); + } +} diff --git a/.dotnet/src/Generated/ModelClient.cs b/.dotnet/src/Generated/ModelClient.cs new file mode 100644 index 000000000..62db6d531 --- /dev/null +++ b/.dotnet/src/Generated/ModelClient.cs @@ -0,0 +1,77 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Threading.Tasks; + +namespace OpenAI.Models +{ + // Data plane generated sub-client. 
+ public partial class ModelClient + { + private const string AuthorizationHeader = "Authorization"; + private readonly ApiKeyCredential _keyCredential; + private const string AuthorizationApiKeyPrefix = "Bearer"; + private readonly ClientPipeline _pipeline; + private readonly Uri _endpoint; + + public virtual ClientPipeline Pipeline => _pipeline; + + protected ModelClient() + { + } + + internal PipelineMessage CreateGetModelsRequest(RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "GET"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/models", false); + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + message.Apply(options); + return message; + } + + internal PipelineMessage CreateRetrieveRequest(string model, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "GET"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/models/", false); + uri.AppendPath(model, true); + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + message.Apply(options); + return message; + } + + internal PipelineMessage CreateDeleteRequest(string model, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "DELETE"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/models/", false); + uri.AppendPath(model, true); + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + message.Apply(options); + return message; + } + + private static PipelineMessageClassifier _pipelineMessageClassifier200; + private static 
PipelineMessageClassifier PipelineMessageClassifier200 => _pipelineMessageClassifier200 ??= PipelineMessageClassifier.Create(stackalloc ushort[] { 200 }); + } +} diff --git a/.dotnet/src/Generated/Models/Assistant.Serialization.cs b/.dotnet/src/Generated/Models/Assistant.Serialization.cs new file mode 100644 index 000000000..22f1b7a73 --- /dev/null +++ b/.dotnet/src/Generated/Models/Assistant.Serialization.cs @@ -0,0 +1,396 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + public partial class Assistant : IJsonModel<Assistant> + { + void IJsonModel<Assistant>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<Assistant>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(Assistant)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("id") != true) + { + writer.WritePropertyName("id"u8); + writer.WriteStringValue(Id); + } + if (SerializedAdditionalRawData?.ContainsKey("object") != true) + { + writer.WritePropertyName("object"u8); + writer.WriteStringValue(Object.ToString()); + } + if (SerializedAdditionalRawData?.ContainsKey("created_at") != true) + { + writer.WritePropertyName("created_at"u8); + writer.WriteNumberValue(CreatedAt, "U"); + } + if (SerializedAdditionalRawData?.ContainsKey("name") != true) + { + if (Name != null) + { + writer.WritePropertyName("name"u8); + writer.WriteStringValue(Name); + } + else + { + writer.WriteNull("name"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("description") != true) + { + if (Description != null) + { + writer.WritePropertyName("description"u8); + writer.WriteStringValue(Description); + } + else + { + writer.WriteNull("description"); + } + } + if 
(SerializedAdditionalRawData?.ContainsKey("model") != true) + { + writer.WritePropertyName("model"u8); + writer.WriteStringValue(Model); + } + if (SerializedAdditionalRawData?.ContainsKey("instructions") != true) + { + if (Instructions != null) + { + writer.WritePropertyName("instructions"u8); + writer.WriteStringValue(Instructions); + } + else + { + writer.WriteNull("instructions"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("tools") != true) + { + writer.WritePropertyName("tools"u8); + writer.WriteStartArray(); + foreach (var item in Tools) + { + writer.WriteObjectValue(item, options); + } + writer.WriteEndArray(); + } + if (SerializedAdditionalRawData?.ContainsKey("tool_resources") != true && Optional.IsDefined(ToolResources)) + { + if (ToolResources != null) + { + writer.WritePropertyName("tool_resources"u8); + writer.WriteObjectValue(ToolResources, options); + } + else + { + writer.WriteNull("tool_resources"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("metadata") != true) + { + if (Metadata != null && Optional.IsCollectionDefined(Metadata)) + { + writer.WritePropertyName("metadata"u8); + writer.WriteStartObject(); + foreach (var item in Metadata) + { + writer.WritePropertyName(item.Key); + writer.WriteStringValue(item.Value); + } + writer.WriteEndObject(); + } + else + { + writer.WriteNull("metadata"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("temperature") != true && Optional.IsDefined(Temperature)) + { + if (Temperature != null) + { + writer.WritePropertyName("temperature"u8); + writer.WriteNumberValue(Temperature.Value); + } + else + { + writer.WriteNull("temperature"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("top_p") != true && Optional.IsDefined(NucleusSamplingFactor)) + { + if (NucleusSamplingFactor != null) + { + writer.WritePropertyName("top_p"u8); + writer.WriteNumberValue(NucleusSamplingFactor.Value); + } + else + { + writer.WriteNull("top_p"); + } + } + if 
(SerializedAdditionalRawData?.ContainsKey("response_format") != true && Optional.IsDefined(ResponseFormat)) + { + if (ResponseFormat != null) + { + writer.WritePropertyName("response_format"u8); + writer.WriteObjectValue(ResponseFormat, options); + } + else + { + writer.WriteNull("response_format"); + } + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + Assistant IJsonModel<Assistant>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<Assistant>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(Assistant)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeAssistant(document.RootElement, options); + } + + internal static Assistant DeserializeAssistant(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string id = default; + InternalAssistantObjectObject @object = default; + DateTimeOffset createdAt = default; + string name = default; + string description = default; + string model = default; + string instructions = default; + IReadOnlyList<ToolDefinition> tools = default; + ToolResources toolResources = default; + IReadOnlyDictionary<string, string> metadata = default; + float? temperature = default; + float? 
topP = default; + AssistantResponseFormat responseFormat = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("id"u8)) + { + id = property.Value.GetString(); + continue; + } + if (property.NameEquals("object"u8)) + { + @object = new InternalAssistantObjectObject(property.Value.GetString()); + continue; + } + if (property.NameEquals("created_at"u8)) + { + createdAt = DateTimeOffset.FromUnixTimeSeconds(property.Value.GetInt64()); + continue; + } + if (property.NameEquals("name"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + name = null; + continue; + } + name = property.Value.GetString(); + continue; + } + if (property.NameEquals("description"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + description = null; + continue; + } + description = property.Value.GetString(); + continue; + } + if (property.NameEquals("model"u8)) + { + model = property.Value.GetString(); + continue; + } + if (property.NameEquals("instructions"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + instructions = null; + continue; + } + instructions = property.Value.GetString(); + continue; + } + if (property.NameEquals("tools"u8)) + { + List<ToolDefinition> array = new List<ToolDefinition>(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(ToolDefinition.DeserializeToolDefinition(item, options)); + } + tools = array; + continue; + } + if (property.NameEquals("tool_resources"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + toolResources = null; + continue; + } + toolResources = ToolResources.DeserializeToolResources(property.Value, options); + continue; + } + if (property.NameEquals("metadata"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + metadata = new ChangeTrackingDictionary<string, string>(); + continue; + } + Dictionary<string, string> dictionary = new Dictionary<string, string>(); + foreach (var 
property0 in property.Value.EnumerateObject()) + { + dictionary.Add(property0.Name, property0.Value.GetString()); + } + metadata = dictionary; + continue; + } + if (property.NameEquals("temperature"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + temperature = null; + continue; + } + temperature = property.Value.GetSingle(); + continue; + } + if (property.NameEquals("top_p"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + topP = null; + continue; + } + topP = property.Value.GetSingle(); + continue; + } + if (property.NameEquals("response_format"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + responseFormat = null; + continue; + } + responseFormat = AssistantResponseFormat.DeserializeAssistantResponseFormat(property.Value, options); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new Assistant( + id, + @object, + createdAt, + name, + description, + model, + instructions, + tools, + toolResources, + metadata, + temperature, + topP, + responseFormat, + serializedAdditionalRawData); + } + + BinaryData IPersistableModel<Assistant>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<Assistant>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(Assistant)} does not support writing '{options.Format}' format."); + } + } + + Assistant IPersistableModel<Assistant>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<Assistant>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeAssistant(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(Assistant)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<Assistant>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static Assistant FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeAssistant(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/Assistant.cs b/.dotnet/src/Generated/Models/Assistant.cs new file mode 100644 index 000000000..db507e169 --- /dev/null +++ b/.dotnet/src/Generated/Models/Assistant.cs @@ -0,0 +1,64 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; +using System.Linq; + +namespace OpenAI.Assistants +{ + public partial class Assistant + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + internal Assistant(string id, DateTimeOffset createdAt, string name, string description, string model, string instructions, IEnumerable<ToolDefinition> tools, IReadOnlyDictionary<string, string> metadata) + { + Argument.AssertNotNull(id, nameof(id)); + Argument.AssertNotNull(model, nameof(model)); + Argument.AssertNotNull(tools, nameof(tools)); + + Id = id; + CreatedAt = createdAt; + Name = name; + Description = description; + Model = model; + Instructions = instructions; + Tools = tools.ToList(); + Metadata = metadata; + } + + internal Assistant(string id, InternalAssistantObjectObject @object, DateTimeOffset createdAt, string name, string description, string model, string instructions, IReadOnlyList<ToolDefinition> tools, ToolResources toolResources, IReadOnlyDictionary<string, string> 
metadata, float? temperature, float? nucleusSamplingFactor, AssistantResponseFormat responseFormat, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + Id = id; + Object = @object; + CreatedAt = createdAt; + Name = name; + Description = description; + Model = model; + Instructions = instructions; + Tools = tools; + ToolResources = toolResources; + Metadata = metadata; + Temperature = temperature; + NucleusSamplingFactor = nucleusSamplingFactor; + ResponseFormat = responseFormat; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal Assistant() + { + } + + public string Id { get; } + + public DateTimeOffset CreatedAt { get; } + public string Name { get; } + public string Description { get; } + public string Model { get; } + public string Instructions { get; } + public IReadOnlyList<ToolDefinition> Tools { get; } + public ToolResources ToolResources { get; } + public IReadOnlyDictionary<string, string> Metadata { get; } + public float? Temperature { get; } + } +} diff --git a/.dotnet/src/Generated/Models/AssistantChatMessage.Serialization.cs b/.dotnet/src/Generated/Models/AssistantChatMessage.Serialization.cs new file mode 100644 index 000000000..89a7d76f5 --- /dev/null +++ b/.dotnet/src/Generated/Models/AssistantChatMessage.Serialization.cs @@ -0,0 +1,153 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Chat +{ + public partial class AssistantChatMessage : IJsonModel<AssistantChatMessage> + { + AssistantChatMessage IJsonModel<AssistantChatMessage>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<AssistantChatMessage>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(AssistantChatMessage)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeAssistantChatMessage(document.RootElement, options); + } + + internal static AssistantChatMessage DeserializeAssistantChatMessage(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string refusal = default; + string name = default; + IList<ChatToolCall> toolCalls = default; + ChatFunctionCall functionCall = default; + ChatMessageRole role = default; + IList<ChatMessageContentPart> content = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("refusal"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + refusal = null; + continue; + } + refusal = property.Value.GetString(); + continue; + } + if (property.NameEquals("name"u8)) + { + name = property.Value.GetString(); + continue; + } + if (property.NameEquals("tool_calls"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + List<ChatToolCall> array = new List<ChatToolCall>(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(ChatToolCall.DeserializeChatToolCall(item, options)); + } + toolCalls = array; + continue; + } + if (property.NameEquals("function_call"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + functionCall = null; + continue; + } + functionCall = ChatFunctionCall.DeserializeChatFunctionCall(property.Value, options); + continue; + } + if (property.NameEquals("role"u8)) + { + role = property.Value.GetString().ToChatMessageRole(); + continue; + } + if (property.NameEquals("content"u8)) + { + 
DeserializeContentValue(property, ref content); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new AssistantChatMessage( + role, + content ?? new ChangeTrackingList<ChatMessageContentPart>(), + serializedAdditionalRawData, + refusal, + name, + toolCalls ?? new ChangeTrackingList<ChatToolCall>(), + functionCall); + } + + BinaryData IPersistableModel<AssistantChatMessage>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<AssistantChatMessage>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(AssistantChatMessage)} does not support writing '{options.Format}' format."); + } + } + + AssistantChatMessage IPersistableModel<AssistantChatMessage>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<AssistantChatMessage>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeAssistantChatMessage(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(AssistantChatMessage)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<AssistantChatMessage>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static new AssistantChatMessage FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeAssistantChatMessage(document.RootElement); + } + + internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/AssistantChatMessage.cs b/.dotnet/src/Generated/Models/AssistantChatMessage.cs new file mode 100644 index 000000000..1d3cc6723 --- /dev/null +++ b/.dotnet/src/Generated/Models/AssistantChatMessage.cs @@ -0,0 +1,23 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Chat +{ + public partial class AssistantChatMessage : ChatMessage + { + internal AssistantChatMessage(ChatMessageRole role, IList<ChatMessageContentPart> content, IDictionary<string, BinaryData> serializedAdditionalRawData, string refusal, string participantName, IList<ChatToolCall> toolCalls, ChatFunctionCall functionCall) : base(role, content, serializedAdditionalRawData) + { + Refusal = refusal; + ParticipantName = participantName; + ToolCalls = toolCalls; + FunctionCall = functionCall; + } + + public string Refusal { get; set; } + public ChatFunctionCall FunctionCall { get; set; } + } +} diff --git a/.dotnet/src/Generated/Models/AssistantCreationOptions.Serialization.cs b/.dotnet/src/Generated/Models/AssistantCreationOptions.Serialization.cs new file mode 100644 index 000000000..31be626be --- /dev/null +++ 
b/.dotnet/src/Generated/Models/AssistantCreationOptions.Serialization.cs @@ -0,0 +1,363 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + public partial class AssistantCreationOptions : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(AssistantCreationOptions)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("model") != true) + { + writer.WritePropertyName("model"u8); + writer.WriteStringValue(Model); + } + if (SerializedAdditionalRawData?.ContainsKey("name") != true && Optional.IsDefined(Name)) + { + if (Name != null) + { + writer.WritePropertyName("name"u8); + writer.WriteStringValue(Name); + } + else + { + writer.WriteNull("name"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("description") != true && Optional.IsDefined(Description)) + { + if (Description != null) + { + writer.WritePropertyName("description"u8); + writer.WriteStringValue(Description); + } + else + { + writer.WriteNull("description"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("instructions") != true && Optional.IsDefined(Instructions)) + { + if (Instructions != null) + { + writer.WritePropertyName("instructions"u8); + writer.WriteStringValue(Instructions); + } + else + { + writer.WriteNull("instructions"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("tools") != true && Optional.IsCollectionDefined(Tools)) + { + writer.WritePropertyName("tools"u8); + writer.WriteStartArray(); + foreach (var item in Tools) + { + writer.WriteObjectValue(item, options); + } + writer.WriteEndArray(); + } + if 
(SerializedAdditionalRawData?.ContainsKey("tool_resources") != true && Optional.IsDefined(ToolResources)) + { + if (ToolResources != null) + { + writer.WritePropertyName("tool_resources"u8); + writer.WriteObjectValue(ToolResources, options); + } + else + { + writer.WriteNull("tool_resources"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("metadata") != true && Optional.IsCollectionDefined(Metadata)) + { + if (Metadata != null) + { + writer.WritePropertyName("metadata"u8); + writer.WriteStartObject(); + foreach (var item in Metadata) + { + writer.WritePropertyName(item.Key); + writer.WriteStringValue(item.Value); + } + writer.WriteEndObject(); + } + else + { + writer.WriteNull("metadata"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("temperature") != true && Optional.IsDefined(Temperature)) + { + if (Temperature != null) + { + writer.WritePropertyName("temperature"u8); + writer.WriteNumberValue(Temperature.Value); + } + else + { + writer.WriteNull("temperature"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("top_p") != true && Optional.IsDefined(NucleusSamplingFactor)) + { + if (NucleusSamplingFactor != null) + { + writer.WritePropertyName("top_p"u8); + writer.WriteNumberValue(NucleusSamplingFactor.Value); + } + else + { + writer.WriteNull("top_p"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("response_format") != true && Optional.IsDefined(ResponseFormat)) + { + if (ResponseFormat != null) + { + writer.WritePropertyName("response_format"u8); + writer.WriteObjectValue(ResponseFormat, options); + } + else + { + writer.WriteNull("response_format"); + } + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + 
JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + AssistantCreationOptions IJsonModel<AssistantCreationOptions>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<AssistantCreationOptions>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(AssistantCreationOptions)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeAssistantCreationOptions(document.RootElement, options); + } + + internal static AssistantCreationOptions DeserializeAssistantCreationOptions(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string model = default; + string name = default; + string description = default; + string instructions = default; + IList<ToolDefinition> tools = default; + ToolResources toolResources = default; + IDictionary<string, string> metadata = default; + float? temperature = default; + float? 
topP = default; + AssistantResponseFormat responseFormat = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("model"u8)) + { + model = property.Value.GetString(); + continue; + } + if (property.NameEquals("name"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + name = null; + continue; + } + name = property.Value.GetString(); + continue; + } + if (property.NameEquals("description"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + description = null; + continue; + } + description = property.Value.GetString(); + continue; + } + if (property.NameEquals("instructions"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + instructions = null; + continue; + } + instructions = property.Value.GetString(); + continue; + } + if (property.NameEquals("tools"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + List array = new List(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(ToolDefinition.DeserializeToolDefinition(item, options)); + } + tools = array; + continue; + } + if (property.NameEquals("tool_resources"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + toolResources = null; + continue; + } + toolResources = Assistants.ToolResources.DeserializeToolResources(property.Value, options); + continue; + } + if (property.NameEquals("metadata"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + Dictionary dictionary = new Dictionary(); + foreach (var property0 in property.Value.EnumerateObject()) + { + dictionary.Add(property0.Name, property0.Value.GetString()); + } + metadata = dictionary; + continue; + } + if (property.NameEquals("temperature"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + temperature = null; + continue; + } + temperature = 
property.Value.GetSingle(); + continue; + } + if (property.NameEquals("top_p"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + topP = null; + continue; + } + topP = property.Value.GetSingle(); + continue; + } + if (property.NameEquals("response_format"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + responseFormat = null; + continue; + } + responseFormat = AssistantResponseFormat.DeserializeAssistantResponseFormat(property.Value, options); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new AssistantCreationOptions( + model, + name, + description, + instructions, + tools ?? new ChangeTrackingList<ToolDefinition>(), + toolResources, + metadata ?? new ChangeTrackingDictionary<string, string>(), + temperature, + topP, + responseFormat, + serializedAdditionalRawData); + } + + BinaryData IPersistableModel<AssistantCreationOptions>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<AssistantCreationOptions>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(AssistantCreationOptions)} does not support writing '{options.Format}' format."); + } + } + + AssistantCreationOptions IPersistableModel<AssistantCreationOptions>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<AssistantCreationOptions>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeAssistantCreationOptions(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(AssistantCreationOptions)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<AssistantCreationOptions>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static AssistantCreationOptions FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeAssistantCreationOptions(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/AssistantCreationOptions.cs b/.dotnet/src/Generated/Models/AssistantCreationOptions.cs new file mode 100644 index 000000000..9e0ed5afc --- /dev/null +++ b/.dotnet/src/Generated/Models/AssistantCreationOptions.cs @@ -0,0 +1,34 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + public partial class AssistantCreationOptions + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + + internal AssistantCreationOptions(string model, string name, string description, string instructions, IList<ToolDefinition> tools, ToolResources toolResources, IDictionary<string, string> metadata, float? temperature, float? 
nucleusSamplingFactor, AssistantResponseFormat responseFormat, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + Model = model; + Name = name; + Description = description; + Instructions = instructions; + Tools = tools; + ToolResources = toolResources; + Metadata = metadata; + Temperature = temperature; + NucleusSamplingFactor = nucleusSamplingFactor; + ResponseFormat = responseFormat; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + public string Name { get; set; } + public string Description { get; set; } + public string Instructions { get; set; } + public IDictionary<string, string> Metadata { get; set; } + public float? Temperature { get; set; } + } +} diff --git a/.dotnet/src/Generated/Models/AssistantModificationOptions.Serialization.cs b/.dotnet/src/Generated/Models/AssistantModificationOptions.Serialization.cs new file mode 100644 index 000000000..185f0d9bc --- /dev/null +++ b/.dotnet/src/Generated/Models/AssistantModificationOptions.Serialization.cs @@ -0,0 +1,363 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + public partial class AssistantModificationOptions : IJsonModel<AssistantModificationOptions> + { + void IJsonModel<AssistantModificationOptions>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(AssistantModificationOptions)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("model") != true && Optional.IsDefined(Model)) + { + writer.WritePropertyName("model"u8); + writer.WriteStringValue(Model); + } + if (SerializedAdditionalRawData?.ContainsKey("name") != true && Optional.IsDefined(Name)) + { + if (Name != null) + { + writer.WritePropertyName("name"u8); + writer.WriteStringValue(Name); + } + else + { + writer.WriteNull("name"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("description") != true && Optional.IsDefined(Description)) + { + if (Description != null) + { + writer.WritePropertyName("description"u8); + writer.WriteStringValue(Description); + } + else + { + writer.WriteNull("description"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("instructions") != true && Optional.IsDefined(Instructions)) + { + if (Instructions != null) + { + writer.WritePropertyName("instructions"u8); + writer.WriteStringValue(Instructions); + } + else + { + writer.WriteNull("instructions"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("tools") != true && Optional.IsCollectionDefined(DefaultTools)) + { + writer.WritePropertyName("tools"u8); + writer.WriteStartArray(); + foreach (var item in DefaultTools) + { + writer.WriteObjectValue(item, options); + } + writer.WriteEndArray(); + } + if (SerializedAdditionalRawData?.ContainsKey("tool_resources") != true && Optional.IsDefined(ToolResources)) + { + if (ToolResources != null) + { + writer.WritePropertyName("tool_resources"u8); + writer.WriteObjectValue(ToolResources, options); + } + else + { + writer.WriteNull("tool_resources"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("metadata") != true && Optional.IsCollectionDefined(Metadata)) + { + if (Metadata != null) + { 
+ writer.WritePropertyName("metadata"u8); + writer.WriteStartObject(); + foreach (var item in Metadata) + { + writer.WritePropertyName(item.Key); + writer.WriteStringValue(item.Value); + } + writer.WriteEndObject(); + } + else + { + writer.WriteNull("metadata"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("temperature") != true && Optional.IsDefined(Temperature)) + { + if (Temperature != null) + { + writer.WritePropertyName("temperature"u8); + writer.WriteNumberValue(Temperature.Value); + } + else + { + writer.WriteNull("temperature"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("top_p") != true && Optional.IsDefined(NucleusSamplingFactor)) + { + if (NucleusSamplingFactor != null) + { + writer.WritePropertyName("top_p"u8); + writer.WriteNumberValue(NucleusSamplingFactor.Value); + } + else + { + writer.WriteNull("top_p"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("response_format") != true && Optional.IsDefined(ResponseFormat)) + { + if (ResponseFormat != null) + { + writer.WritePropertyName("response_format"u8); + writer.WriteObjectValue(ResponseFormat, options); + } + else + { + writer.WriteNull("response_format"); + } + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + AssistantModificationOptions IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(AssistantModificationOptions)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeAssistantModificationOptions(document.RootElement, options); + } + + internal static AssistantModificationOptions DeserializeAssistantModificationOptions(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string model = default; + string name = default; + string description = default; + string instructions = default; + IList tools = default; + ToolResources toolResources = default; + IDictionary metadata = default; + float? temperature = default; + float? topP = default; + AssistantResponseFormat responseFormat = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("model"u8)) + { + model = property.Value.GetString(); + continue; + } + if (property.NameEquals("name"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + name = null; + continue; + } + name = property.Value.GetString(); + continue; + } + if (property.NameEquals("description"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + description = null; + continue; + } + description = property.Value.GetString(); + continue; + } + if (property.NameEquals("instructions"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + instructions = null; + continue; + } + instructions = property.Value.GetString(); + continue; + } + if (property.NameEquals("tools"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + List array = new List(); + foreach 
(var item in property.Value.EnumerateArray()) + { + array.Add(ToolDefinition.DeserializeToolDefinition(item, options)); + } + tools = array; + continue; + } + if (property.NameEquals("tool_resources"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + toolResources = null; + continue; + } + toolResources = Assistants.ToolResources.DeserializeToolResources(property.Value, options); + continue; + } + if (property.NameEquals("metadata"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + Dictionary dictionary = new Dictionary(); + foreach (var property0 in property.Value.EnumerateObject()) + { + dictionary.Add(property0.Name, property0.Value.GetString()); + } + metadata = dictionary; + continue; + } + if (property.NameEquals("temperature"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + temperature = null; + continue; + } + temperature = property.Value.GetSingle(); + continue; + } + if (property.NameEquals("top_p"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + topP = null; + continue; + } + topP = property.Value.GetSingle(); + continue; + } + if (property.NameEquals("response_format"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + responseFormat = null; + continue; + } + responseFormat = AssistantResponseFormat.DeserializeAssistantResponseFormat(property.Value, options); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new AssistantModificationOptions( + model, + name, + description, + instructions, + tools ?? new ChangeTrackingList(), + toolResources, + metadata ?? 
new ChangeTrackingDictionary<string, string>(), + temperature, + topP, + responseFormat, + serializedAdditionalRawData); + } + + BinaryData IPersistableModel<AssistantModificationOptions>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<AssistantModificationOptions>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(AssistantModificationOptions)} does not support writing '{options.Format}' format."); + } + } + + AssistantModificationOptions IPersistableModel<AssistantModificationOptions>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<AssistantModificationOptions>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeAssistantModificationOptions(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(AssistantModificationOptions)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<AssistantModificationOptions>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static AssistantModificationOptions FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeAssistantModificationOptions(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/AssistantModificationOptions.cs b/.dotnet/src/Generated/Models/AssistantModificationOptions.cs new file mode 100644 index 000000000..baa7fed1a --- /dev/null +++ b/.dotnet/src/Generated/Models/AssistantModificationOptions.cs @@ -0,0 +1,39 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + public partial class 
AssistantModificationOptions + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + public AssistantModificationOptions() + { + DefaultTools = new ChangeTrackingList<ToolDefinition>(); + Metadata = new ChangeTrackingDictionary<string, string>(); + } + + internal AssistantModificationOptions(string model, string name, string description, string instructions, IList<ToolDefinition> defaultTools, ToolResources toolResources, IDictionary<string, string> metadata, float? temperature, float? nucleusSamplingFactor, AssistantResponseFormat responseFormat, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + Model = model; + Name = name; + Description = description; + Instructions = instructions; + DefaultTools = defaultTools; + ToolResources = toolResources; + Metadata = metadata; + Temperature = temperature; + NucleusSamplingFactor = nucleusSamplingFactor; + ResponseFormat = responseFormat; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + public string Name { get; set; } + public string Description { get; set; } + public string Instructions { get; set; } + public IDictionary<string, string> Metadata { get; set; } + public float? 
Temperature { get; set; } + } +} diff --git a/.dotnet/src/Generated/Models/AssistantResponseFormat.Serialization.cs b/.dotnet/src/Generated/Models/AssistantResponseFormat.Serialization.cs new file mode 100644 index 000000000..f2fa4d9f9 --- /dev/null +++ b/.dotnet/src/Generated/Models/AssistantResponseFormat.Serialization.cs @@ -0,0 +1,16 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + [PersistableModelProxy(typeof(InternalUnknownAssistantResponseFormat))] + public partial class AssistantResponseFormat : IJsonModel<AssistantResponseFormat> + { + } +} diff --git a/.dotnet/src/Generated/Models/AssistantResponseFormat.cs b/.dotnet/src/Generated/Models/AssistantResponseFormat.cs new file mode 100644 index 000000000..3490f8a5a --- /dev/null +++ b/.dotnet/src/Generated/Models/AssistantResponseFormat.cs @@ -0,0 +1,25 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + public abstract partial class AssistantResponseFormat + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + protected AssistantResponseFormat() + { + } + + internal AssistantResponseFormat(string type, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + Type = type; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal string Type { get; set; } + } +} diff --git a/.dotnet/src/Generated/Models/AssistantThread.Serialization.cs b/.dotnet/src/Generated/Models/AssistantThread.Serialization.cs new file mode 100644 index 000000000..45e72b9bf --- /dev/null +++ b/.dotnet/src/Generated/Models/AssistantThread.Serialization.cs @@ -0,0 +1,218 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + public partial class AssistantThread : IJsonModel<AssistantThread> + { + void 
IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(AssistantThread)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("id") != true) + { + writer.WritePropertyName("id"u8); + writer.WriteStringValue(Id); + } + if (SerializedAdditionalRawData?.ContainsKey("object") != true) + { + writer.WritePropertyName("object"u8); + writer.WriteStringValue(Object.ToString()); + } + if (SerializedAdditionalRawData?.ContainsKey("created_at") != true) + { + writer.WritePropertyName("created_at"u8); + writer.WriteNumberValue(CreatedAt, "U"); + } + if (SerializedAdditionalRawData?.ContainsKey("tool_resources") != true) + { + if (ToolResources != null) + { + writer.WritePropertyName("tool_resources"u8); + writer.WriteObjectValue(ToolResources, options); + } + else + { + writer.WriteNull("tool_resources"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("metadata") != true) + { + if (Metadata != null && Optional.IsCollectionDefined(Metadata)) + { + writer.WritePropertyName("metadata"u8); + writer.WriteStartObject(); + foreach (var item in Metadata) + { + writer.WritePropertyName(item.Key); + writer.WriteStringValue(item.Value); + } + writer.WriteEndObject(); + } + else + { + writer.WriteNull("metadata"); + } + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + AssistantThread 
IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(AssistantThread)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeAssistantThread(document.RootElement, options); + } + + internal static AssistantThread DeserializeAssistantThread(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string id = default; + InternalThreadObjectObject @object = default; + DateTimeOffset createdAt = default; + ToolResources toolResources = default; + IReadOnlyDictionary metadata = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("id"u8)) + { + id = property.Value.GetString(); + continue; + } + if (property.NameEquals("object"u8)) + { + @object = new InternalThreadObjectObject(property.Value.GetString()); + continue; + } + if (property.NameEquals("created_at"u8)) + { + createdAt = DateTimeOffset.FromUnixTimeSeconds(property.Value.GetInt64()); + continue; + } + if (property.NameEquals("tool_resources"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + toolResources = null; + continue; + } + toolResources = Assistants.ToolResources.DeserializeToolResources(property.Value, options); + continue; + } + if (property.NameEquals("metadata"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + metadata = new ChangeTrackingDictionary(); + continue; + } + Dictionary dictionary = new Dictionary(); + foreach (var property0 in property.Value.EnumerateObject()) + { + 
dictionary.Add(property0.Name, property0.Value.GetString()); + } + metadata = dictionary; + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new AssistantThread( + id, + @object, + createdAt, + toolResources, + metadata, + serializedAdditionalRawData); + } + + BinaryData IPersistableModel<AssistantThread>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<AssistantThread>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(AssistantThread)} does not support writing '{options.Format}' format."); + } + } + + AssistantThread IPersistableModel<AssistantThread>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<AssistantThread>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeAssistantThread(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(AssistantThread)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<AssistantThread>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static AssistantThread FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeAssistantThread(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/AssistantThread.cs b/.dotnet/src/Generated/Models/AssistantThread.cs new file mode 100644 index 000000000..6cec49e4b --- /dev/null +++ 
b/.dotnet/src/Generated/Models/AssistantThread.cs @@ -0,0 +1,42 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + public partial class AssistantThread + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + internal AssistantThread(string id, DateTimeOffset createdAt, ToolResources toolResources, IReadOnlyDictionary<string, string> metadata) + { + Argument.AssertNotNull(id, nameof(id)); + + Id = id; + CreatedAt = createdAt; + ToolResources = toolResources; + Metadata = metadata; + } + + internal AssistantThread(string id, InternalThreadObjectObject @object, DateTimeOffset createdAt, ToolResources toolResources, IReadOnlyDictionary<string, string> metadata, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + Id = id; + Object = @object; + CreatedAt = createdAt; + ToolResources = toolResources; + Metadata = metadata; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal AssistantThread() + { + } + + public string Id { get; } + + public DateTimeOffset CreatedAt { get; } + public IReadOnlyDictionary<string, string> Metadata { get; } + } +} diff --git a/.dotnet/src/Generated/Models/AudioTranscription.Serialization.cs b/.dotnet/src/Generated/Models/AudioTranscription.Serialization.cs new file mode 100644 index 000000000..d08be5d13 --- /dev/null +++ b/.dotnet/src/Generated/Models/AudioTranscription.Serialization.cs @@ -0,0 +1,217 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Audio +{ + public partial class AudioTranscription : IJsonModel<AudioTranscription> + { + void IJsonModel<AudioTranscription>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<AudioTranscription>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(AudioTranscription)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("task") != true) + { + writer.WritePropertyName("task"u8); + writer.WriteStringValue(Task.ToString()); + } + if (SerializedAdditionalRawData?.ContainsKey("language") != true) + { + writer.WritePropertyName("language"u8); + writer.WriteStringValue(Language); + } + if (SerializedAdditionalRawData?.ContainsKey("duration") != true) + { + writer.WritePropertyName("duration"u8); + writer.WriteNumberValue(Convert.ToDouble(Duration.Value.ToString("s\\.FFF"))); + } + if (SerializedAdditionalRawData?.ContainsKey("text") != true) + { + writer.WritePropertyName("text"u8); + writer.WriteStringValue(Text); + } + if (SerializedAdditionalRawData?.ContainsKey("words") != true && Optional.IsCollectionDefined(Words)) + { + writer.WritePropertyName("words"u8); + writer.WriteStartArray(); + foreach (var item in Words) + { + writer.WriteObjectValue(item, options); + } + writer.WriteEndArray(); + } + if (SerializedAdditionalRawData?.ContainsKey("segments") != true && Optional.IsCollectionDefined(Segments)) + { + writer.WritePropertyName("segments"u8); + writer.WriteStartArray(); + foreach (var item in Segments) + { + writer.WriteObjectValue(item, options); + } + writer.WriteEndArray(); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + AudioTranscription IJsonModel<AudioTranscription>.Create(ref 
Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<AudioTranscription>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(AudioTranscription)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeAudioTranscription(document.RootElement, options); + } + + internal static AudioTranscription DeserializeAudioTranscription(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + InternalCreateTranscriptionResponseVerboseJsonTask task = default; + string language = default; + TimeSpan? duration = default; + string text = default; + IReadOnlyList<TranscribedWord> words = default; + IReadOnlyList<TranscribedSegment> segments = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("task"u8)) + { + task = new InternalCreateTranscriptionResponseVerboseJsonTask(property.Value.GetString()); + continue; + } + if (property.NameEquals("language"u8)) + { + language = property.Value.GetString(); + continue; + } + if (property.NameEquals("duration"u8)) + { + duration = TimeSpan.FromSeconds(property.Value.GetDouble()); + continue; + } + if (property.NameEquals("text"u8)) + { + text = property.Value.GetString(); + continue; + } + if (property.NameEquals("words"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + List<TranscribedWord> array = new List<TranscribedWord>(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(TranscribedWord.DeserializeTranscribedWord(item, options)); + } + words = array; + continue; + } + if (property.NameEquals("segments"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { 
+ continue; + } + List<TranscribedSegment> array = new List<TranscribedSegment>(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(TranscribedSegment.DeserializeTranscribedSegment(item, options)); + } + segments = array; + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new AudioTranscription( + task, + language, + duration, + text, + words ?? new ChangeTrackingList<TranscribedWord>(), + segments ?? new ChangeTrackingList<TranscribedSegment>(), + serializedAdditionalRawData); + } + + BinaryData IPersistableModel<AudioTranscription>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<AudioTranscription>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(AudioTranscription)} does not support writing '{options.Format}' format."); + } + } + + AudioTranscription IPersistableModel<AudioTranscription>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<AudioTranscription>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeAudioTranscription(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(AudioTranscription)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<AudioTranscription>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/AudioTranscription.cs b/.dotnet/src/Generated/Models/AudioTranscription.cs new file mode 100644 index 000000000..ce9824521 --- /dev/null +++ b/.dotnet/src/Generated/Models/AudioTranscription.cs @@ -0,0 +1,45 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Audio +{ + public partial class AudioTranscription + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + internal AudioTranscription(string language, TimeSpan? duration, string text) + { + Argument.AssertNotNull(language, nameof(language)); + Argument.AssertNotNull(text, nameof(text)); + + Language = language; + Duration = duration; + Text = text; + Words = new ChangeTrackingList<TranscribedWord>(); + Segments = new ChangeTrackingList<TranscribedSegment>(); + } + + internal AudioTranscription(InternalCreateTranscriptionResponseVerboseJsonTask task, string language, TimeSpan? 
duration, string text, IReadOnlyList<TranscribedWord> words, IReadOnlyList<TranscribedSegment> segments, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + Task = task; + Language = language; + Duration = duration; + Text = text; + Words = words; + Segments = segments; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal AudioTranscription() + { + } + + public string Language { get; } + public string Text { get; } + public IReadOnlyList<TranscribedWord> Words { get; } + public IReadOnlyList<TranscribedSegment> Segments { get; } + } +} diff --git a/.dotnet/src/Generated/Models/AudioTranscriptionFormat.Serialization.cs b/.dotnet/src/Generated/Models/AudioTranscriptionFormat.Serialization.cs new file mode 100644 index 000000000..699760d2a --- /dev/null +++ b/.dotnet/src/Generated/Models/AudioTranscriptionFormat.Serialization.cs @@ -0,0 +1,31 @@ +// <auto-generated/> + +#nullable disable + +using System; + +namespace OpenAI.Audio +{ + internal static partial class AudioTranscriptionFormatExtensions + { + public static string ToSerialString(this AudioTranscriptionFormat value) => value switch + { + AudioTranscriptionFormat.Simple => "json", + AudioTranscriptionFormat.Text => "text", + AudioTranscriptionFormat.Srt => "srt", + AudioTranscriptionFormat.Verbose => "verbose_json", + AudioTranscriptionFormat.Vtt => "vtt", + _ => throw new ArgumentOutOfRangeException(nameof(value), value, "Unknown AudioTranscriptionFormat value.") + }; + + public static AudioTranscriptionFormat ToAudioTranscriptionFormat(this string value) + { + if (StringComparer.OrdinalIgnoreCase.Equals(value, "json")) return AudioTranscriptionFormat.Simple; + if (StringComparer.OrdinalIgnoreCase.Equals(value, "text")) return AudioTranscriptionFormat.Text; + if (StringComparer.OrdinalIgnoreCase.Equals(value, "srt")) return AudioTranscriptionFormat.Srt; + if (StringComparer.OrdinalIgnoreCase.Equals(value, "verbose_json")) return AudioTranscriptionFormat.Verbose; + if (StringComparer.OrdinalIgnoreCase.Equals(value, "vtt")) return AudioTranscriptionFormat.Vtt; + throw new 
ArgumentOutOfRangeException(nameof(value), value, "Unknown AudioTranscriptionFormat value."); + } + } +} diff --git a/.dotnet/src/Generated/Models/AudioTranscriptionOptions.Serialization.cs b/.dotnet/src/Generated/Models/AudioTranscriptionOptions.Serialization.cs new file mode 100644 index 000000000..cebd9176f --- /dev/null +++ b/.dotnet/src/Generated/Models/AudioTranscriptionOptions.Serialization.cs @@ -0,0 +1,304 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.IO; +using System.Text.Json; + +namespace OpenAI.Audio +{ + public partial class AudioTranscriptionOptions : IJsonModel<AudioTranscriptionOptions> + { + void IJsonModel<AudioTranscriptionOptions>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<AudioTranscriptionOptions>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(AudioTranscriptionOptions)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("file") != true) + { + writer.WritePropertyName("file"u8); +#if NET6_0_OR_GREATER + writer.WriteRawValue(File); +#else + using (JsonDocument document = JsonDocument.Parse(File)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + if (SerializedAdditionalRawData?.ContainsKey("model") != true) + { + writer.WritePropertyName("model"u8); + writer.WriteStringValue(Model.ToString()); + } + if (SerializedAdditionalRawData?.ContainsKey("language") != true && Optional.IsDefined(Language)) + { + writer.WritePropertyName("language"u8); + writer.WriteStringValue(Language); + } + if (SerializedAdditionalRawData?.ContainsKey("prompt") != true && Optional.IsDefined(Prompt)) + { + writer.WritePropertyName("prompt"u8); + writer.WriteStringValue(Prompt); + } + if (SerializedAdditionalRawData?.ContainsKey("response_format") != true && 
Optional.IsDefined(ResponseFormat)) + { + writer.WritePropertyName("response_format"u8); + writer.WriteStringValue(ResponseFormat.Value.ToSerialString()); + } + if (SerializedAdditionalRawData?.ContainsKey("temperature") != true && Optional.IsDefined(Temperature)) + { + writer.WritePropertyName("temperature"u8); + writer.WriteNumberValue(Temperature.Value); + } + if (SerializedAdditionalRawData?.ContainsKey("timestamp_granularities") != true && Optional.IsCollectionDefined(TimestampGranularities)) + { + writer.WritePropertyName("timestamp_granularities"u8); + writer.WriteStartArray(); + foreach (var item in TimestampGranularities) + { + if (item == null) + { + writer.WriteNullValue(); + continue; + } +#if NET6_0_OR_GREATER + writer.WriteRawValue(item); +#else + using (JsonDocument document = JsonDocument.Parse(item)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + writer.WriteEndArray(); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + AudioTranscriptionOptions IJsonModel<AudioTranscriptionOptions>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<AudioTranscriptionOptions>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(AudioTranscriptionOptions)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeAudioTranscriptionOptions(document.RootElement, options); + } + + internal static AudioTranscriptionOptions DeserializeAudioTranscriptionOptions(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + BinaryData file = default; + InternalCreateTranscriptionRequestModel model = default; + string language = default; + string prompt = default; + AudioTranscriptionFormat? responseFormat = default; + float? temperature = default; + IList<BinaryData> timestampGranularities = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("file"u8)) + { + file = BinaryData.FromString(property.Value.GetRawText()); + continue; + } + if (property.NameEquals("model"u8)) + { + model = new InternalCreateTranscriptionRequestModel(property.Value.GetString()); + continue; + } + if (property.NameEquals("language"u8)) + { + language = property.Value.GetString(); + continue; + } + if (property.NameEquals("prompt"u8)) + { + prompt = property.Value.GetString(); + continue; + } + if (property.NameEquals("response_format"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + responseFormat = property.Value.GetString().ToAudioTranscriptionFormat(); + continue; + } + if (property.NameEquals("temperature"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + temperature = property.Value.GetSingle(); + continue; + } + if 
(property.NameEquals("timestamp_granularities"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + List<BinaryData> array = new List<BinaryData>(); + foreach (var item in property.Value.EnumerateArray()) + { + if (item.ValueKind == JsonValueKind.Null) + { + array.Add(null); + } + else + { + array.Add(BinaryData.FromString(item.GetRawText())); + } + } + timestampGranularities = array; + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new AudioTranscriptionOptions( + file, + model, + language, + prompt, + responseFormat, + temperature, + timestampGranularities ?? new ChangeTrackingList<BinaryData>(), + serializedAdditionalRawData); + } + + private BinaryData SerializeMultipart(ModelReaderWriterOptions options) + { + using MultipartFormDataBinaryContent content = ToMultipartBinaryBody(); + using MemoryStream stream = new MemoryStream(); + content.WriteTo(stream); + if (stream.Position > int.MaxValue) + { + return BinaryData.FromStream(stream); + } + else + { + return new BinaryData(stream.GetBuffer().AsMemory(0, (int)stream.Position)); + } + } + + internal virtual MultipartFormDataBinaryContent ToMultipartBinaryBody() + { + MultipartFormDataBinaryContent content = new MultipartFormDataBinaryContent(); + content.Add(File, "file", "file"); + content.Add(Model.ToString(), "model"); + if (Optional.IsDefined(Language)) + { + content.Add(Language, "language"); + } + if (Optional.IsDefined(Prompt)) + { + content.Add(Prompt, "prompt"); + } + if (Optional.IsDefined(ResponseFormat)) + { + content.Add(ResponseFormat.Value.ToSerialString(), "response_format"); + } + if (Optional.IsDefined(Temperature)) + { + content.Add(Temperature.Value, "temperature"); + } + if (Optional.IsCollectionDefined(TimestampGranularities)) + { + foreach (BinaryData item in TimestampGranularities) + { + content.Add(item, 
"timestamp_granularities", "timestamp_granularities"); + } + } + return content; + } + + BinaryData IPersistableModel<AudioTranscriptionOptions>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<AudioTranscriptionOptions>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + case "MFD": + return SerializeMultipart(options); + default: + throw new FormatException($"The model {nameof(AudioTranscriptionOptions)} does not support writing '{options.Format}' format."); + } + } + + AudioTranscriptionOptions IPersistableModel<AudioTranscriptionOptions>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<AudioTranscriptionOptions>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeAudioTranscriptionOptions(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(AudioTranscriptionOptions)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<AudioTranscriptionOptions>.GetFormatFromOptions(ModelReaderWriterOptions options) => "MFD"; + + internal static AudioTranscriptionOptions FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeAudioTranscriptionOptions(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/AudioTranscriptionOptions.cs b/.dotnet/src/Generated/Models/AudioTranscriptionOptions.cs new file mode 100644 index 000000000..3640e680a --- /dev/null +++ b/.dotnet/src/Generated/Models/AudioTranscriptionOptions.cs @@ -0,0 +1,30 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Audio +{ + public partial class 
AudioTranscriptionOptions + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + + internal AudioTranscriptionOptions(BinaryData file, InternalCreateTranscriptionRequestModel model, string language, string prompt, AudioTranscriptionFormat? responseFormat, float? temperature, IList<BinaryData> timestampGranularities, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + File = file; + Model = model; + Language = language; + Prompt = prompt; + ResponseFormat = responseFormat; + Temperature = temperature; + TimestampGranularities = timestampGranularities; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + public string Language { get; set; } + public string Prompt { get; set; } + public AudioTranscriptionFormat? ResponseFormat { get; set; } + public float? Temperature { get; set; } + } +} diff --git a/.dotnet/src/Generated/Models/AudioTranslation.Serialization.cs b/.dotnet/src/Generated/Models/AudioTranslation.Serialization.cs new file mode 100644 index 000000000..ff7da0c0b --- /dev/null +++ b/.dotnet/src/Generated/Models/AudioTranslation.Serialization.cs @@ -0,0 +1,191 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Audio +{ + public partial class AudioTranslation : IJsonModel<AudioTranslation> + { + void IJsonModel<AudioTranslation>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<AudioTranslation>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(AudioTranslation)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("task") != true) + { + writer.WritePropertyName("task"u8); + writer.WriteStringValue(Task.ToString()); + } + if (SerializedAdditionalRawData?.ContainsKey("language") != true) + { + writer.WritePropertyName("language"u8); + writer.WriteStringValue(Language); + } + if (SerializedAdditionalRawData?.ContainsKey("duration") != true) + { + writer.WritePropertyName("duration"u8); + writer.WriteNumberValue(Convert.ToDouble(Duration.Value.ToString("s\\.FFF"))); + } + if (SerializedAdditionalRawData?.ContainsKey("text") != true) + { + writer.WritePropertyName("text"u8); + writer.WriteStringValue(Text); + } + if (SerializedAdditionalRawData?.ContainsKey("segments") != true && Optional.IsCollectionDefined(Segments)) + { + writer.WritePropertyName("segments"u8); + writer.WriteStartArray(); + foreach (var item in Segments) + { + writer.WriteObjectValue(item, options); + } + writer.WriteEndArray(); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + AudioTranslation IJsonModel<AudioTranslation>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<AudioTranslation>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(AudioTranslation)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeAudioTranslation(document.RootElement, options); + } + + internal static AudioTranslation DeserializeAudioTranslation(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + InternalCreateTranslationResponseVerboseJsonTask task = default; + string language = default; + TimeSpan? duration = default; + string text = default; + IReadOnlyList<TranscribedSegment> segments = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("task"u8)) + { + task = new InternalCreateTranslationResponseVerboseJsonTask(property.Value.GetString()); + continue; + } + if (property.NameEquals("language"u8)) + { + language = property.Value.GetString(); + continue; + } + if (property.NameEquals("duration"u8)) + { + duration = TimeSpan.FromSeconds(property.Value.GetDouble()); + continue; + } + if (property.NameEquals("text"u8)) + { + text = property.Value.GetString(); + continue; + } + if (property.NameEquals("segments"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + List<TranscribedSegment> array = new List<TranscribedSegment>(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(TranscribedSegment.DeserializeTranscribedSegment(item, options)); + } + segments = array; + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new 
AudioTranslation( + task, + language, + duration, + text, + segments ?? new ChangeTrackingList<TranscribedSegment>(), + serializedAdditionalRawData); + } + + BinaryData IPersistableModel<AudioTranslation>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<AudioTranslation>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(AudioTranslation)} does not support writing '{options.Format}' format."); + } + } + + AudioTranslation IPersistableModel<AudioTranslation>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<AudioTranslation>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeAudioTranslation(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(AudioTranslation)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<AudioTranslation>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/AudioTranslation.cs b/.dotnet/src/Generated/Models/AudioTranslation.cs new file mode 100644 index 000000000..6dfb907ca --- /dev/null +++ b/.dotnet/src/Generated/Models/AudioTranslation.cs @@ -0,0 +1,42 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Audio +{ + public partial class AudioTranslation + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + internal AudioTranslation(string language, TimeSpan? 
duration, string text) + { + Argument.AssertNotNull(language, nameof(language)); + Argument.AssertNotNull(text, nameof(text)); + + Language = language; + Duration = duration; + Text = text; + Segments = new ChangeTrackingList<TranscribedSegment>(); + } + + internal AudioTranslation(InternalCreateTranslationResponseVerboseJsonTask task, string language, TimeSpan? duration, string text, IReadOnlyList<TranscribedSegment> segments, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + Task = task; + Language = language; + Duration = duration; + Text = text; + Segments = segments; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal AudioTranslation() + { + } + + public string Language { get; } + public string Text { get; } + public IReadOnlyList<TranscribedSegment> Segments { get; } + } +} diff --git a/.dotnet/src/Generated/Models/AudioTranslationFormat.Serialization.cs b/.dotnet/src/Generated/Models/AudioTranslationFormat.Serialization.cs new file mode 100644 index 000000000..6c55e965d --- /dev/null +++ b/.dotnet/src/Generated/Models/AudioTranslationFormat.Serialization.cs @@ -0,0 +1,31 @@ +// <auto-generated/> + +#nullable disable + +using System; + +namespace OpenAI.Audio +{ + internal static partial class AudioTranslationFormatExtensions + { + public static string ToSerialString(this AudioTranslationFormat value) => value switch + { + AudioTranslationFormat.Simple => "json", + AudioTranslationFormat.Text => "text", + AudioTranslationFormat.Srt => "srt", + AudioTranslationFormat.Verbose => "verbose_json", + AudioTranslationFormat.Vtt => "vtt", + _ => throw new ArgumentOutOfRangeException(nameof(value), value, "Unknown AudioTranslationFormat value.") + }; + + public static AudioTranslationFormat ToAudioTranslationFormat(this string value) + { + if (StringComparer.OrdinalIgnoreCase.Equals(value, "json")) return AudioTranslationFormat.Simple; + if (StringComparer.OrdinalIgnoreCase.Equals(value, "text")) return AudioTranslationFormat.Text; + if (StringComparer.OrdinalIgnoreCase.Equals(value, "srt")) return 
AudioTranslationFormat.Srt; + if (StringComparer.OrdinalIgnoreCase.Equals(value, "verbose_json")) return AudioTranslationFormat.Verbose; + if (StringComparer.OrdinalIgnoreCase.Equals(value, "vtt")) return AudioTranslationFormat.Vtt; + throw new ArgumentOutOfRangeException(nameof(value), value, "Unknown AudioTranslationFormat value."); + } + } +} diff --git a/.dotnet/src/Generated/Models/AudioTranslationOptions.Serialization.cs b/.dotnet/src/Generated/Models/AudioTranslationOptions.Serialization.cs new file mode 100644 index 000000000..5881d9a41 --- /dev/null +++ b/.dotnet/src/Generated/Models/AudioTranslationOptions.Serialization.cs @@ -0,0 +1,236 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.IO; +using System.Text.Json; + +namespace OpenAI.Audio +{ + public partial class AudioTranslationOptions : IJsonModel<AudioTranslationOptions> + { + void IJsonModel<AudioTranslationOptions>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<AudioTranslationOptions>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(AudioTranslationOptions)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("file") != true) + { + writer.WritePropertyName("file"u8); +#if NET6_0_OR_GREATER + writer.WriteRawValue(File); +#else + using (JsonDocument document = JsonDocument.Parse(File)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + if (SerializedAdditionalRawData?.ContainsKey("model") != true) + { + writer.WritePropertyName("model"u8); + writer.WriteStringValue(Model.ToString()); + } + if (SerializedAdditionalRawData?.ContainsKey("prompt") != true && Optional.IsDefined(Prompt)) + { + writer.WritePropertyName("prompt"u8); + writer.WriteStringValue(Prompt); + } + if (SerializedAdditionalRawData?.ContainsKey("response_format") != true && Optional.IsDefined(ResponseFormat)) + { + writer.WritePropertyName("response_format"u8); + writer.WriteStringValue(ResponseFormat.Value.ToSerialString()); + } + if (SerializedAdditionalRawData?.ContainsKey("temperature") != true && Optional.IsDefined(Temperature)) + { + writer.WritePropertyName("temperature"u8); + writer.WriteNumberValue(Temperature.Value); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + AudioTranslationOptions IJsonModel<AudioTranslationOptions>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<AudioTranslationOptions>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(AudioTranslationOptions)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeAudioTranslationOptions(document.RootElement, options); + } + + internal static AudioTranslationOptions DeserializeAudioTranslationOptions(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + BinaryData file = default; + InternalCreateTranslationRequestModel model = default; + string prompt = default; + AudioTranslationFormat? responseFormat = default; + float? temperature = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("file"u8)) + { + file = BinaryData.FromString(property.Value.GetRawText()); + continue; + } + if (property.NameEquals("model"u8)) + { + model = new InternalCreateTranslationRequestModel(property.Value.GetString()); + continue; + } + if (property.NameEquals("prompt"u8)) + { + prompt = property.Value.GetString(); + continue; + } + if (property.NameEquals("response_format"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + responseFormat = property.Value.GetString().ToAudioTranslationFormat(); + continue; + } + if (property.NameEquals("temperature"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + temperature = property.Value.GetSingle(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new 
AudioTranslationOptions( + file, + model, + prompt, + responseFormat, + temperature, + serializedAdditionalRawData); + } + + private BinaryData SerializeMultipart(ModelReaderWriterOptions options) + { + using MultipartFormDataBinaryContent content = ToMultipartBinaryBody(); + using MemoryStream stream = new MemoryStream(); + content.WriteTo(stream); + if (stream.Position > int.MaxValue) + { + return BinaryData.FromStream(stream); + } + else + { + return new BinaryData(stream.GetBuffer().AsMemory(0, (int)stream.Position)); + } + } + + internal virtual MultipartFormDataBinaryContent ToMultipartBinaryBody() + { + MultipartFormDataBinaryContent content = new MultipartFormDataBinaryContent(); + content.Add(File, "file", "file"); + content.Add(Model.ToString(), "model"); + if (Optional.IsDefined(Prompt)) + { + content.Add(Prompt, "prompt"); + } + if (Optional.IsDefined(ResponseFormat)) + { + content.Add(ResponseFormat.Value.ToSerialString(), "response_format"); + } + if (Optional.IsDefined(Temperature)) + { + content.Add(Temperature.Value, "temperature"); + } + return content; + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + case "MFD": + return SerializeMultipart(options); + default: + throw new FormatException($"The model {nameof(AudioTranslationOptions)} does not support writing '{options.Format}' format."); + } + } + + AudioTranslationOptions IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel&lt;AudioTranslationOptions&gt;)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeAudioTranslationOptions(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(AudioTranslationOptions)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel&lt;AudioTranslationOptions&gt;.GetFormatFromOptions(ModelReaderWriterOptions options) => "MFD"; + + internal static AudioTranslationOptions FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeAudioTranslationOptions(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/AudioTranslationOptions.cs b/.dotnet/src/Generated/Models/AudioTranslationOptions.cs new file mode 100644 index 000000000..a19d2f1fa --- /dev/null +++ b/.dotnet/src/Generated/Models/AudioTranslationOptions.cs @@ -0,0 +1,27 @@ +// &lt;auto-generated/&gt; + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Audio +{ + public partial class AudioTranslationOptions + { + internal IDictionary&lt;string, BinaryData&gt; SerializedAdditionalRawData { get; set; } + + internal AudioTranslationOptions(BinaryData file, InternalCreateTranslationRequestModel model, string prompt, AudioTranslationFormat? responseFormat, float? temperature, IDictionary&lt;string, BinaryData&gt; serializedAdditionalRawData) + { + File = file; + Model = model; + Prompt = prompt; + ResponseFormat = responseFormat; + Temperature = temperature; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + public string Prompt { get; set; } + public AudioTranslationFormat? ResponseFormat { get; set; } + public float? 
Temperature { get; set; } + } +} diff --git a/.dotnet/src/Generated/Models/ChatCompletion.Serialization.cs b/.dotnet/src/Generated/Models/ChatCompletion.Serialization.cs new file mode 100644 index 000000000..87d34ee83 --- /dev/null +++ b/.dotnet/src/Generated/Models/ChatCompletion.Serialization.cs @@ -0,0 +1,245 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Chat +{ + public partial class ChatCompletion : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ChatCompletion)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("id") != true) + { + writer.WritePropertyName("id"u8); + writer.WriteStringValue(Id); + } + if (SerializedAdditionalRawData?.ContainsKey("choices") != true) + { + writer.WritePropertyName("choices"u8); + writer.WriteStartArray(); + foreach (var item in Choices) + { + writer.WriteObjectValue(item, options); + } + writer.WriteEndArray(); + } + if (SerializedAdditionalRawData?.ContainsKey("created") != true) + { + writer.WritePropertyName("created"u8); + writer.WriteNumberValue(CreatedAt, "U"); + } + if (SerializedAdditionalRawData?.ContainsKey("model") != true) + { + writer.WritePropertyName("model"u8); + writer.WriteStringValue(Model); + } + if (SerializedAdditionalRawData?.ContainsKey("service_tier") != true && Optional.IsDefined(_serviceTier)) + { + if (_serviceTier != null) + { + writer.WritePropertyName("service_tier"u8); + writer.WriteStringValue(_serviceTier.Value.ToString()); + } + else + { + writer.WriteNull("service_tier"); + } + } + if 
(SerializedAdditionalRawData?.ContainsKey("system_fingerprint") != true && Optional.IsDefined(SystemFingerprint)) + { + writer.WritePropertyName("system_fingerprint"u8); + writer.WriteStringValue(SystemFingerprint); + } + if (SerializedAdditionalRawData?.ContainsKey("object") != true) + { + writer.WritePropertyName("object"u8); + writer.WriteStringValue(Object.ToString()); + } + if (SerializedAdditionalRawData?.ContainsKey("usage") != true && Optional.IsDefined(Usage)) + { + writer.WritePropertyName("usage"u8); + writer.WriteObjectValue(Usage, options); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + ChatCompletion IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ChatCompletion)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeChatCompletion(document.RootElement, options); + } + + internal static ChatCompletion DeserializeChatCompletion(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string id = default; + IReadOnlyList choices = default; + DateTimeOffset created = default; + string model = default; + InternalCreateChatCompletionResponseServiceTier? 
serviceTier = default; + string systemFingerprint = default; + InternalCreateChatCompletionResponseObject @object = default; + ChatTokenUsage usage = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("id"u8)) + { + id = property.Value.GetString(); + continue; + } + if (property.NameEquals("choices"u8)) + { + List array = new List(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(InternalCreateChatCompletionResponseChoice.DeserializeInternalCreateChatCompletionResponseChoice(item, options)); + } + choices = array; + continue; + } + if (property.NameEquals("created"u8)) + { + created = DateTimeOffset.FromUnixTimeSeconds(property.Value.GetInt64()); + continue; + } + if (property.NameEquals("model"u8)) + { + model = property.Value.GetString(); + continue; + } + if (property.NameEquals("service_tier"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + serviceTier = null; + continue; + } + serviceTier = new InternalCreateChatCompletionResponseServiceTier(property.Value.GetString()); + continue; + } + if (property.NameEquals("system_fingerprint"u8)) + { + systemFingerprint = property.Value.GetString(); + continue; + } + if (property.NameEquals("object"u8)) + { + @object = new InternalCreateChatCompletionResponseObject(property.Value.GetString()); + continue; + } + if (property.NameEquals("usage"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + usage = ChatTokenUsage.DeserializeChatTokenUsage(property.Value, options); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new ChatCompletion( + id, + choices, + created, + model, + serviceTier, + systemFingerprint, + @object, + 
usage, + serializedAdditionalRawData); + } + + BinaryData IPersistableModel&lt;ChatCompletion&gt;.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel&lt;ChatCompletion&gt;)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(ChatCompletion)} does not support writing '{options.Format}' format."); + } + } + + ChatCompletion IPersistableModel&lt;ChatCompletion&gt;.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel&lt;ChatCompletion&gt;)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeChatCompletion(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(ChatCompletion)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel&lt;ChatCompletion&gt;.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static ChatCompletion FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeChatCompletion(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/ChatCompletion.cs b/.dotnet/src/Generated/Models/ChatCompletion.cs new file mode 100644 index 000000000..74549a43e --- /dev/null +++ b/.dotnet/src/Generated/Models/ChatCompletion.cs @@ -0,0 +1,49 @@ +// &lt;auto-generated/&gt; + +#nullable disable + +using System; +using System.Collections.Generic; +using System.Linq; + +namespace OpenAI.Chat +{ + public partial class ChatCompletion + { + internal IDictionary&lt;string, BinaryData&gt; SerializedAdditionalRawData { get; set; } + internal ChatCompletion(string id, IEnumerable&lt;InternalCreateChatCompletionResponseChoice&gt; choices, DateTimeOffset createdAt, string model) + { + 
Argument.AssertNotNull(id, nameof(id)); + Argument.AssertNotNull(choices, nameof(choices)); + Argument.AssertNotNull(model, nameof(model)); + + Id = id; + Choices = choices.ToList(); + CreatedAt = createdAt; + Model = model; + } + + internal ChatCompletion(string id, IReadOnlyList&lt;InternalCreateChatCompletionResponseChoice&gt; choices, DateTimeOffset createdAt, string model, InternalCreateChatCompletionResponseServiceTier? serviceTier, string systemFingerprint, InternalCreateChatCompletionResponseObject @object, ChatTokenUsage usage, IDictionary&lt;string, BinaryData&gt; serializedAdditionalRawData) + { + Id = id; + Choices = choices; + CreatedAt = createdAt; + Model = model; + _serviceTier = serviceTier; + SystemFingerprint = systemFingerprint; + Object = @object; + Usage = usage; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal ChatCompletion() + { + } + + public string Id { get; } + public string Model { get; } + public string SystemFingerprint { get; } + + public ChatTokenUsage Usage { get; } + } +} diff --git a/.dotnet/src/Generated/Models/ChatCompletionOptions.Serialization.cs b/.dotnet/src/Generated/Models/ChatCompletionOptions.Serialization.cs new file mode 100644 index 000000000..cae873c4a --- /dev/null +++ b/.dotnet/src/Generated/Models/ChatCompletionOptions.Serialization.cs @@ -0,0 +1,611 @@ +// &lt;auto-generated/&gt; + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Chat +{ + public partial class ChatCompletionOptions : IJsonModel&lt;ChatCompletionOptions&gt; + { + void IJsonModel&lt;ChatCompletionOptions&gt;.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ChatCompletionOptions)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("messages") != true) + { + writer.WritePropertyName("messages"u8); + writer.WriteStartArray(); + foreach (var item in Messages) + { + writer.WriteObjectValue(item, options); + } + writer.WriteEndArray(); + } + if (SerializedAdditionalRawData?.ContainsKey("model") != true) + { + writer.WritePropertyName("model"u8); + writer.WriteStringValue(Model.ToString()); + } + if (SerializedAdditionalRawData?.ContainsKey("frequency_penalty") != true && Optional.IsDefined(FrequencyPenalty)) + { + if (FrequencyPenalty != null) + { + writer.WritePropertyName("frequency_penalty"u8); + writer.WriteNumberValue(FrequencyPenalty.Value); + } + else + { + writer.WriteNull("frequency_penalty"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("logit_bias") != true && Optional.IsCollectionDefined(LogitBiases)) + { + if (LogitBiases != null) + { + writer.WritePropertyName("logit_bias"u8); + SerializeLogitBiasesValue(writer, options); + } + else + { + writer.WriteNull("logit_bias"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("logprobs") != true && Optional.IsDefined(IncludeLogProbabilities)) + { + if (IncludeLogProbabilities != null) + { + writer.WritePropertyName("logprobs"u8); + writer.WriteBooleanValue(IncludeLogProbabilities.Value); + } + else + { + writer.WriteNull("logprobs"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("top_logprobs") != true && Optional.IsDefined(TopLogProbabilityCount)) + { + if (TopLogProbabilityCount != null) + { + writer.WritePropertyName("top_logprobs"u8); + writer.WriteNumberValue(TopLogProbabilityCount.Value); + } + else + { + writer.WriteNull("top_logprobs"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("max_tokens") != true 
&& Optional.IsDefined(MaxTokens)) + { + if (MaxTokens != null) + { + writer.WritePropertyName("max_tokens"u8); + writer.WriteNumberValue(MaxTokens.Value); + } + else + { + writer.WriteNull("max_tokens"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("n") != true && Optional.IsDefined(N)) + { + if (N != null) + { + writer.WritePropertyName("n"u8); + writer.WriteNumberValue(N.Value); + } + else + { + writer.WriteNull("n"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("presence_penalty") != true && Optional.IsDefined(PresencePenalty)) + { + if (PresencePenalty != null) + { + writer.WritePropertyName("presence_penalty"u8); + writer.WriteNumberValue(PresencePenalty.Value); + } + else + { + writer.WriteNull("presence_penalty"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("response_format") != true && Optional.IsDefined(ResponseFormat)) + { + writer.WritePropertyName("response_format"u8); + writer.WriteObjectValue(ResponseFormat, options); + } + if (SerializedAdditionalRawData?.ContainsKey("seed") != true && Optional.IsDefined(Seed)) + { + if (Seed != null) + { + writer.WritePropertyName("seed"u8); + writer.WriteNumberValue(Seed.Value); + } + else + { + writer.WriteNull("seed"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("service_tier") != true && Optional.IsDefined(_serviceTier)) + { + if (_serviceTier != null) + { + writer.WritePropertyName("service_tier"u8); + writer.WriteStringValue(_serviceTier.Value.ToString()); + } + else + { + writer.WriteNull("service_tier"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("stop") != true && Optional.IsCollectionDefined(StopSequences)) + { + if (StopSequences != null) + { + writer.WritePropertyName("stop"u8); + SerializeStopSequencesValue(writer, options); + } + else + { + writer.WriteNull("stop"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("stream") != true && Optional.IsDefined(Stream)) + { + if (Stream != null) + { + writer.WritePropertyName("stream"u8); + 
writer.WriteBooleanValue(Stream.Value); + } + else + { + writer.WriteNull("stream"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("stream_options") != true && Optional.IsDefined(StreamOptions)) + { + if (StreamOptions != null) + { + writer.WritePropertyName("stream_options"u8); + writer.WriteObjectValue(StreamOptions, options); + } + else + { + writer.WriteNull("stream_options"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("temperature") != true && Optional.IsDefined(Temperature)) + { + if (Temperature != null) + { + writer.WritePropertyName("temperature"u8); + writer.WriteNumberValue(Temperature.Value); + } + else + { + writer.WriteNull("temperature"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("top_p") != true && Optional.IsDefined(TopP)) + { + if (TopP != null) + { + writer.WritePropertyName("top_p"u8); + writer.WriteNumberValue(TopP.Value); + } + else + { + writer.WriteNull("top_p"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("tools") != true && Optional.IsCollectionDefined(Tools)) + { + writer.WritePropertyName("tools"u8); + writer.WriteStartArray(); + foreach (var item in Tools) + { + writer.WriteObjectValue(item, options); + } + writer.WriteEndArray(); + } + if (SerializedAdditionalRawData?.ContainsKey("tool_choice") != true && Optional.IsDefined(ToolChoice)) + { + writer.WritePropertyName("tool_choice"u8); + writer.WriteObjectValue(ToolChoice, options); + } + if (SerializedAdditionalRawData?.ContainsKey("parallel_tool_calls") != true && Optional.IsDefined(ParallelToolCallsEnabled)) + { + writer.WritePropertyName("parallel_tool_calls"u8); + writer.WriteBooleanValue(ParallelToolCallsEnabled.Value); + } + if (SerializedAdditionalRawData?.ContainsKey("user") != true && Optional.IsDefined(EndUserId)) + { + writer.WritePropertyName("user"u8); + writer.WriteStringValue(EndUserId); + } + if (SerializedAdditionalRawData?.ContainsKey("function_call") != true && Optional.IsDefined(FunctionChoice)) + { + 
writer.WritePropertyName("function_call"u8); + writer.WriteObjectValue(FunctionChoice, options); + } + if (SerializedAdditionalRawData?.ContainsKey("functions") != true && Optional.IsCollectionDefined(Functions)) + { + writer.WritePropertyName("functions"u8); + writer.WriteStartArray(); + foreach (var item in Functions) + { + writer.WriteObjectValue(item, options); + } + writer.WriteEndArray(); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + ChatCompletionOptions IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ChatCompletionOptions)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeChatCompletionOptions(document.RootElement, options); + } + + internal static ChatCompletionOptions DeserializeChatCompletionOptions(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + IList messages = default; + InternalCreateChatCompletionRequestModel model = default; + float? frequencyPenalty = default; + IDictionary logitBias = default; + bool? logprobs = default; + int? topLogprobs = default; + int? maxTokens = default; + int? n = default; + float? 
presencePenalty = default; + ChatResponseFormat responseFormat = default; + long? seed = default; + InternalCreateChatCompletionRequestServiceTier? serviceTier = default; + IList stop = default; + bool? stream = default; + InternalChatCompletionStreamOptions streamOptions = default; + float? temperature = default; + float? topP = default; + IList tools = default; + ChatToolChoice toolChoice = default; + bool? parallelToolCalls = default; + string user = default; + ChatFunctionChoice functionCall = default; + IList functions = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("messages"u8)) + { + List array = new List(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(ChatMessage.DeserializeChatMessage(item, options)); + } + messages = array; + continue; + } + if (property.NameEquals("model"u8)) + { + model = new InternalCreateChatCompletionRequestModel(property.Value.GetString()); + continue; + } + if (property.NameEquals("frequency_penalty"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + frequencyPenalty = null; + continue; + } + frequencyPenalty = property.Value.GetSingle(); + continue; + } + if (property.NameEquals("logit_bias"u8)) + { + DeserializeLogitBiasesValue(property, ref logitBias); + continue; + } + if (property.NameEquals("logprobs"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + logprobs = null; + continue; + } + logprobs = property.Value.GetBoolean(); + continue; + } + if (property.NameEquals("top_logprobs"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + topLogprobs = null; + continue; + } + topLogprobs = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("max_tokens"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + maxTokens = null; + continue; + } + maxTokens = 
property.Value.GetInt32(); + continue; + } + if (property.NameEquals("n"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + n = null; + continue; + } + n = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("presence_penalty"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + presencePenalty = null; + continue; + } + presencePenalty = property.Value.GetSingle(); + continue; + } + if (property.NameEquals("response_format"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + responseFormat = ChatResponseFormat.DeserializeChatResponseFormat(property.Value, options); + continue; + } + if (property.NameEquals("seed"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + seed = null; + continue; + } + seed = property.Value.GetInt64(); + continue; + } + if (property.NameEquals("service_tier"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + serviceTier = null; + continue; + } + serviceTier = new InternalCreateChatCompletionRequestServiceTier(property.Value.GetString()); + continue; + } + if (property.NameEquals("stop"u8)) + { + DeserializeStopSequencesValue(property, ref stop); + continue; + } + if (property.NameEquals("stream"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + stream = null; + continue; + } + stream = property.Value.GetBoolean(); + continue; + } + if (property.NameEquals("stream_options"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + streamOptions = null; + continue; + } + streamOptions = InternalChatCompletionStreamOptions.DeserializeInternalChatCompletionStreamOptions(property.Value, options); + continue; + } + if (property.NameEquals("temperature"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + temperature = null; + continue; + } + temperature = property.Value.GetSingle(); + continue; + } + if (property.NameEquals("top_p"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) 
+ { + topP = null; + continue; + } + topP = property.Value.GetSingle(); + continue; + } + if (property.NameEquals("tools"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + List array = new List(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(ChatTool.DeserializeChatTool(item, options)); + } + tools = array; + continue; + } + if (property.NameEquals("tool_choice"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + toolChoice = ChatToolChoice.DeserializeChatToolChoice(property.Value, options); + continue; + } + if (property.NameEquals("parallel_tool_calls"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + parallelToolCalls = property.Value.GetBoolean(); + continue; + } + if (property.NameEquals("user"u8)) + { + user = property.Value.GetString(); + continue; + } + if (property.NameEquals("function_call"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + functionCall = ChatFunctionChoice.DeserializeChatFunctionChoice(property.Value, options); + continue; + } + if (property.NameEquals("functions"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + List array = new List(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(ChatFunction.DeserializeChatFunction(item, options)); + } + functions = array; + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new ChatCompletionOptions( + messages, + model, + frequencyPenalty, + logitBias ?? new ChangeTrackingDictionary(), + logprobs, + topLogprobs, + maxTokens, + n, + presencePenalty, + responseFormat, + seed, + serviceTier, + stop ?? new ChangeTrackingList(), + stream, + streamOptions, + temperature, + topP, + tools ?? 
new ChangeTrackingList(), + toolChoice, + parallelToolCalls, + user, + functionCall, + functions ?? new ChangeTrackingList(), + serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(ChatCompletionOptions)} does not support writing '{options.Format}' format."); + } + } + + ChatCompletionOptions IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeChatCompletionOptions(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(ChatCompletionOptions)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static ChatCompletionOptions FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeChatCompletionOptions(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/ChatCompletionOptions.cs b/.dotnet/src/Generated/Models/ChatCompletionOptions.cs new file mode 100644 index 000000000..040f1424c --- /dev/null +++ b/.dotnet/src/Generated/Models/ChatCompletionOptions.cs @@ -0,0 +1,52 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; +using System.Linq; + +namespace OpenAI.Chat +{ + public partial class 
ChatCompletionOptions + { + internal IDictionary&lt;string, BinaryData&gt; SerializedAdditionalRawData { get; set; } + + internal ChatCompletionOptions(IList&lt;ChatMessage&gt; messages, InternalCreateChatCompletionRequestModel model, float? frequencyPenalty, IDictionary&lt;int, int&gt; logitBiases, bool? includeLogProbabilities, int? topLogProbabilityCount, int? maxTokens, int? n, float? presencePenalty, ChatResponseFormat responseFormat, long? seed, InternalCreateChatCompletionRequestServiceTier? serviceTier, IList&lt;string&gt; stopSequences, bool? stream, InternalChatCompletionStreamOptions streamOptions, float? temperature, float? topP, IList&lt;ChatTool&gt; tools, ChatToolChoice toolChoice, bool? parallelToolCallsEnabled, string endUserId, ChatFunctionChoice functionChoice, IList&lt;ChatFunction&gt; functions, IDictionary&lt;string, BinaryData&gt; serializedAdditionalRawData) + { + Messages = messages; + Model = model; + FrequencyPenalty = frequencyPenalty; + LogitBiases = logitBiases; + IncludeLogProbabilities = includeLogProbabilities; + TopLogProbabilityCount = topLogProbabilityCount; + MaxTokens = maxTokens; + N = n; + PresencePenalty = presencePenalty; + ResponseFormat = responseFormat; + Seed = seed; + _serviceTier = serviceTier; + StopSequences = stopSequences; + Stream = stream; + StreamOptions = streamOptions; + Temperature = temperature; + TopP = topP; + Tools = tools; + ToolChoice = toolChoice; + ParallelToolCallsEnabled = parallelToolCallsEnabled; + EndUserId = endUserId; + FunctionChoice = functionChoice; + Functions = functions; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + public float? FrequencyPenalty { get; set; } + public int? MaxTokens { get; set; } + public float? PresencePenalty { get; set; } + public ChatResponseFormat ResponseFormat { get; set; } + public long? Seed { get; set; } + public float? Temperature { get; set; } + public float? 
TopP { get; set; } + public IList&lt;ChatTool&gt; Tools { get; } + public IList&lt;ChatFunction&gt; Functions { get; } + } +} diff --git a/.dotnet/src/Generated/Models/ChatFinishReason.Serialization.cs b/.dotnet/src/Generated/Models/ChatFinishReason.Serialization.cs new file mode 100644 index 000000000..3bfd77465 --- /dev/null +++ b/.dotnet/src/Generated/Models/ChatFinishReason.Serialization.cs @@ -0,0 +1,31 @@ +// &lt;auto-generated/&gt; + +#nullable disable + +using System; + +namespace OpenAI.Chat +{ + internal static partial class ChatFinishReasonExtensions + { + public static string ToSerialString(this ChatFinishReason value) => value switch + { + ChatFinishReason.Stop => "stop", + ChatFinishReason.Length => "length", + ChatFinishReason.ToolCalls => "tool_calls", + ChatFinishReason.ContentFilter => "content_filter", + ChatFinishReason.FunctionCall => "function_call", + _ => throw new ArgumentOutOfRangeException(nameof(value), value, "Unknown ChatFinishReason value.") + }; + + public static ChatFinishReason ToChatFinishReason(this string value) + { + if (StringComparer.OrdinalIgnoreCase.Equals(value, "stop")) return ChatFinishReason.Stop; + if (StringComparer.OrdinalIgnoreCase.Equals(value, "length")) return ChatFinishReason.Length; + if (StringComparer.OrdinalIgnoreCase.Equals(value, "tool_calls")) return ChatFinishReason.ToolCalls; + if (StringComparer.OrdinalIgnoreCase.Equals(value, "content_filter")) return ChatFinishReason.ContentFilter; + if (StringComparer.OrdinalIgnoreCase.Equals(value, "function_call")) return ChatFinishReason.FunctionCall; + throw new ArgumentOutOfRangeException(nameof(value), value, "Unknown ChatFinishReason value."); + } + } +} diff --git a/.dotnet/src/Generated/Models/ChatFunction.Serialization.cs b/.dotnet/src/Generated/Models/ChatFunction.Serialization.cs new file mode 100644 index 000000000..8974ea692 --- /dev/null +++ b/.dotnet/src/Generated/Models/ChatFunction.Serialization.cs @@ -0,0 +1,166 @@ +// &lt;auto-generated/&gt; + +#nullable disable + +using System; +using System.ClientModel; +using 
System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Chat +{ + public partial class ChatFunction : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ChatFunction)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("description") != true && Optional.IsDefined(FunctionDescription)) + { + writer.WritePropertyName("description"u8); + writer.WriteStringValue(FunctionDescription); + } + if (SerializedAdditionalRawData?.ContainsKey("name") != true) + { + writer.WritePropertyName("name"u8); + writer.WriteStringValue(FunctionName); + } + if (SerializedAdditionalRawData?.ContainsKey("parameters") != true && Optional.IsDefined(FunctionParameters)) + { + writer.WritePropertyName("parameters"u8); +#if NET6_0_OR_GREATER + writer.WriteRawValue(FunctionParameters); +#else + using (JsonDocument document = JsonDocument.Parse(FunctionParameters)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + ChatFunction IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ChatFunction)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeChatFunction(document.RootElement, options); + } + + internal static ChatFunction DeserializeChatFunction(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string description = default; + string name = default; + BinaryData parameters = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("description"u8)) + { + description = property.Value.GetString(); + continue; + } + if (property.NameEquals("name"u8)) + { + name = property.Value.GetString(); + continue; + } + if (property.NameEquals("parameters"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + parameters = BinaryData.FromString(property.Value.GetRawText()); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new ChatFunction(description, name, parameters, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(ChatFunction)} does not support writing '{options.Format}' format."); + } + } + + ChatFunction IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeChatFunction(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(ChatFunction)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static ChatFunction FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeChatFunction(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/ChatFunction.cs b/.dotnet/src/Generated/Models/ChatFunction.cs new file mode 100644 index 000000000..378eaae81 --- /dev/null +++ b/.dotnet/src/Generated/Models/ChatFunction.cs @@ -0,0 +1,27 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Chat +{ + [Obsolete("This field is marked as deprecated.")] + public partial class ChatFunction + { + internal IDictionary SerializedAdditionalRawData { get; set; } + + internal ChatFunction(string functionDescription, string functionName, BinaryData functionParameters, IDictionary serializedAdditionalRawData) + { + FunctionDescription = functionDescription; + FunctionName = functionName; + FunctionParameters = 
functionParameters; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal ChatFunction() + { + } + } +} diff --git a/.dotnet/src/Generated/Models/ChatFunctionCall.Serialization.cs b/.dotnet/src/Generated/Models/ChatFunctionCall.Serialization.cs new file mode 100644 index 000000000..f5f498a8f --- /dev/null +++ b/.dotnet/src/Generated/Models/ChatFunctionCall.Serialization.cs @@ -0,0 +1,144 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Chat +{ + public partial class ChatFunctionCall : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ChatFunctionCall)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("arguments") != true) + { + writer.WritePropertyName("arguments"u8); + writer.WriteStringValue(FunctionArguments); + } + if (SerializedAdditionalRawData?.ContainsKey("name") != true) + { + writer.WritePropertyName("name"u8); + writer.WriteStringValue(FunctionName); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + ChatFunctionCall IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ChatFunctionCall)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeChatFunctionCall(document.RootElement, options); + } + + internal static ChatFunctionCall DeserializeChatFunctionCall(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string arguments = default; + string name = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("arguments"u8)) + { + arguments = property.Value.GetString(); + continue; + } + if (property.NameEquals("name"u8)) + { + name = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new ChatFunctionCall(arguments, name, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(ChatFunctionCall)} does not support writing '{options.Format}' format."); + } + } + + ChatFunctionCall IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeChatFunctionCall(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(ChatFunctionCall)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static ChatFunctionCall FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeChatFunctionCall(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/ChatFunctionCall.cs b/.dotnet/src/Generated/Models/ChatFunctionCall.cs new file mode 100644 index 000000000..d10f638aa --- /dev/null +++ b/.dotnet/src/Generated/Models/ChatFunctionCall.cs @@ -0,0 +1,25 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Chat +{ + public partial class ChatFunctionCall + { + internal IDictionary SerializedAdditionalRawData { get; set; } + + internal ChatFunctionCall(string functionArguments, string functionName, IDictionary serializedAdditionalRawData) + { + FunctionArguments = functionArguments; + FunctionName = functionName; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal ChatFunctionCall() + { + } + } +} diff --git a/.dotnet/src/Generated/Models/ChatFunctionChoice.Serialization.cs b/.dotnet/src/Generated/Models/ChatFunctionChoice.Serialization.cs new file mode 100644 index 000000000..eb482e954 --- /dev/null +++ b/.dotnet/src/Generated/Models/ChatFunctionChoice.Serialization.cs @@ -0,0 +1,68 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; 
+using System.Text.Json; + +namespace OpenAI.Chat +{ + public partial class ChatFunctionChoice : IJsonModel + { + ChatFunctionChoice IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ChatFunctionChoice)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeChatFunctionChoice(document.RootElement, options); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(ChatFunctionChoice)} does not support writing '{options.Format}' format."); + } + } + + ChatFunctionChoice IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeChatFunctionChoice(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(ChatFunctionChoice)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static ChatFunctionChoice FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeChatFunctionChoice(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/ChatFunctionChoice.cs b/.dotnet/src/Generated/Models/ChatFunctionChoice.cs new file mode 100644 index 000000000..a103d4f41 --- /dev/null +++ b/.dotnet/src/Generated/Models/ChatFunctionChoice.cs @@ -0,0 +1,14 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Chat +{ + public partial class ChatFunctionChoice + { + internal IDictionary SerializedAdditionalRawData { get; set; } + } +} diff --git a/.dotnet/src/Generated/Models/ChatMessage.Serialization.cs b/.dotnet/src/Generated/Models/ChatMessage.Serialization.cs new file mode 100644 index 000000000..3d13375d1 --- /dev/null +++ b/.dotnet/src/Generated/Models/ChatMessage.Serialization.cs @@ -0,0 +1,93 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Text.Json; +using OpenAI.FineTuning; + +namespace OpenAI.Chat +{ + [PersistableModelProxy(typeof(UnknownChatMessage))] + public partial class ChatMessage : IJsonModel + { + ChatMessage IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = 
options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ChatMessage)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeChatMessage(document.RootElement, options); + } + + internal static ChatMessage DeserializeChatMessage(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + if (element.TryGetProperty("role", out JsonElement discriminator)) + { + switch (discriminator.GetString()) + { + case null: return InternalFineTuneChatCompletionRequestAssistantMessage.DeserializeInternalFineTuneChatCompletionRequestAssistantMessage(element, options); + case "assistant": return AssistantChatMessage.DeserializeAssistantChatMessage(element, options); + case "function": return FunctionChatMessage.DeserializeFunctionChatMessage(element, options); + case "system": return SystemChatMessage.DeserializeSystemChatMessage(element, options); + case "tool": return ToolChatMessage.DeserializeToolChatMessage(element, options); + case "user": return UserChatMessage.DeserializeUserChatMessage(element, options); + } + } + return UnknownChatMessage.DeserializeUnknownChatMessage(element, options); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(ChatMessage)} does not support writing '{options.Format}' format."); + } + } + + ChatMessage IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeChatMessage(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(ChatMessage)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static ChatMessage FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeChatMessage(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/ChatMessage.cs b/.dotnet/src/Generated/Models/ChatMessage.cs new file mode 100644 index 000000000..08314591b --- /dev/null +++ b/.dotnet/src/Generated/Models/ChatMessage.cs @@ -0,0 +1,25 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Chat +{ + public abstract partial class ChatMessage + { + internal IDictionary SerializedAdditionalRawData { get; set; } + protected ChatMessage() + { + Content = new ChangeTrackingList(); + } + + internal ChatMessage(ChatMessageRole role, IList content, IDictionary serializedAdditionalRawData) + { + Role = role; + Content = content; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + } +} diff --git a/.dotnet/src/Generated/Models/ChatMessageContentPart.Serialization.cs b/.dotnet/src/Generated/Models/ChatMessageContentPart.Serialization.cs new file mode 100644 index 000000000..aa068ae4d --- /dev/null +++ b/.dotnet/src/Generated/Models/ChatMessageContentPart.Serialization.cs @@ -0,0 +1,68 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Text.Json; + +namespace 
OpenAI.Chat +{ + public partial class ChatMessageContentPart : IJsonModel + { + ChatMessageContentPart IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ChatMessageContentPart)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeChatMessageContentPart(document.RootElement, options); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(ChatMessageContentPart)} does not support writing '{options.Format}' format."); + } + } + + ChatMessageContentPart IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeChatMessageContentPart(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(ChatMessageContentPart)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static ChatMessageContentPart FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeChatMessageContentPart(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/ChatMessageContentPart.cs b/.dotnet/src/Generated/Models/ChatMessageContentPart.cs new file mode 100644 index 000000000..ac4ef86e6 --- /dev/null +++ b/.dotnet/src/Generated/Models/ChatMessageContentPart.cs @@ -0,0 +1,14 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Chat +{ + public partial class ChatMessageContentPart + { + internal IDictionary SerializedAdditionalRawData { get; set; } + } +} diff --git a/.dotnet/src/Generated/Models/ChatMessageRole.Serialization.cs b/.dotnet/src/Generated/Models/ChatMessageRole.Serialization.cs new file mode 100644 index 000000000..4380b11e4 --- /dev/null +++ b/.dotnet/src/Generated/Models/ChatMessageRole.Serialization.cs @@ -0,0 +1,31 @@ +// + +#nullable disable + +using System; + +namespace OpenAI.Chat +{ + internal static partial class ChatMessageRoleExtensions + { + public static string ToSerialString(this ChatMessageRole value) => value switch + { + ChatMessageRole.System => "system", + ChatMessageRole.User => "user", + ChatMessageRole.Assistant => "assistant", + ChatMessageRole.Tool => 
"tool", + ChatMessageRole.Function => "function", + _ => throw new ArgumentOutOfRangeException(nameof(value), value, "Unknown ChatMessageRole value.") + }; + + public static ChatMessageRole ToChatMessageRole(this string value) + { + if (StringComparer.OrdinalIgnoreCase.Equals(value, "system")) return ChatMessageRole.System; + if (StringComparer.OrdinalIgnoreCase.Equals(value, "user")) return ChatMessageRole.User; + if (StringComparer.OrdinalIgnoreCase.Equals(value, "assistant")) return ChatMessageRole.Assistant; + if (StringComparer.OrdinalIgnoreCase.Equals(value, "tool")) return ChatMessageRole.Tool; + if (StringComparer.OrdinalIgnoreCase.Equals(value, "function")) return ChatMessageRole.Function; + throw new ArgumentOutOfRangeException(nameof(value), value, "Unknown ChatMessageRole value."); + } + } +} diff --git a/.dotnet/src/Generated/Models/ChatResponseFormat.Serialization.cs b/.dotnet/src/Generated/Models/ChatResponseFormat.Serialization.cs new file mode 100644 index 000000000..2c055865b --- /dev/null +++ b/.dotnet/src/Generated/Models/ChatResponseFormat.Serialization.cs @@ -0,0 +1,125 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Text.Json; + +namespace OpenAI.Chat +{ + [PersistableModelProxy(typeof(InternalUnknownChatResponseFormat))] + public partial class ChatResponseFormat : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ChatResponseFormat)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + ChatResponseFormat IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ChatResponseFormat)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeChatResponseFormat(document.RootElement, options); + } + + internal static ChatResponseFormat DeserializeChatResponseFormat(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + if (element.TryGetProperty("type", out JsonElement discriminator)) + { + switch (discriminator.GetString()) + { + case "json_object": return InternalChatResponseFormatJsonObject.DeserializeInternalChatResponseFormatJsonObject(element, options); + case "json_schema": return InternalChatResponseFormatJsonSchema.DeserializeInternalChatResponseFormatJsonSchema(element, options); + case "text": return InternalChatResponseFormatText.DeserializeInternalChatResponseFormatText(element, options); + } + } + return InternalUnknownChatResponseFormat.DeserializeInternalUnknownChatResponseFormat(element, options); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(ChatResponseFormat)} does not support writing '{options.Format}' format."); + } + } + + ChatResponseFormat IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeChatResponseFormat(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(ChatResponseFormat)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static ChatResponseFormat FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeChatResponseFormat(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/ChatResponseFormat.cs b/.dotnet/src/Generated/Models/ChatResponseFormat.cs new file mode 100644 index 000000000..54eb0a8bd --- /dev/null +++ b/.dotnet/src/Generated/Models/ChatResponseFormat.cs @@ -0,0 +1,25 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Chat +{ + public abstract partial class ChatResponseFormat + { + internal IDictionary SerializedAdditionalRawData { get; set; } + protected ChatResponseFormat() + { + } + + internal ChatResponseFormat(string type, IDictionary serializedAdditionalRawData) + { + Type = type; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal string Type { get; set; } + } +} diff --git a/.dotnet/src/Generated/Models/ChatTokenLogProbabilityInfo.Serialization.cs b/.dotnet/src/Generated/Models/ChatTokenLogProbabilityInfo.Serialization.cs new file mode 100644 index 000000000..b85a2daac --- /dev/null +++ b/.dotnet/src/Generated/Models/ChatTokenLogProbabilityInfo.Serialization.cs @@ -0,0 +1,198 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using 
System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Chat +{ + public partial class ChatTokenLogProbabilityInfo : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ChatTokenLogProbabilityInfo)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("token") != true) + { + writer.WritePropertyName("token"u8); + writer.WriteStringValue(Token); + } + if (SerializedAdditionalRawData?.ContainsKey("logprob") != true) + { + writer.WritePropertyName("logprob"u8); + writer.WriteNumberValue(LogProbability); + } + if (SerializedAdditionalRawData?.ContainsKey("bytes") != true) + { + if (Utf8ByteValues != null && Optional.IsCollectionDefined(Utf8ByteValues)) + { + writer.WritePropertyName("bytes"u8); + writer.WriteStartArray(); + foreach (var item in Utf8ByteValues) + { + writer.WriteNumberValue(item); + } + writer.WriteEndArray(); + } + else + { + writer.WriteNull("bytes"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("top_logprobs") != true) + { + writer.WritePropertyName("top_logprobs"u8); + writer.WriteStartArray(); + foreach (var item in TopLogProbabilities) + { + writer.WriteObjectValue(item, options); + } + writer.WriteEndArray(); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + 
ChatTokenLogProbabilityInfo IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ChatTokenLogProbabilityInfo)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeChatTokenLogProbabilityInfo(document.RootElement, options); + } + + internal static ChatTokenLogProbabilityInfo DeserializeChatTokenLogProbabilityInfo(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string token = default; + float logprob = default; + IReadOnlyList bytes = default; + IReadOnlyList topLogprobs = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("token"u8)) + { + token = property.Value.GetString(); + continue; + } + if (property.NameEquals("logprob"u8)) + { + logprob = property.Value.GetSingle(); + continue; + } + if (property.NameEquals("bytes"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + bytes = new ChangeTrackingList(); + continue; + } + List array = new List(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(item.GetInt32()); + } + bytes = array; + continue; + } + if (property.NameEquals("top_logprobs"u8)) + { + List array = new List(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(ChatTokenTopLogProbabilityInfo.DeserializeChatTokenTopLogProbabilityInfo(item, options)); + } + topLogprobs = array; + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, 
BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new ChatTokenLogProbabilityInfo(token, logprob, bytes, topLogprobs, serializedAdditionalRawData); + } + + BinaryData IPersistableModel<ChatTokenLogProbabilityInfo>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<ChatTokenLogProbabilityInfo>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(ChatTokenLogProbabilityInfo)} does not support writing '{options.Format}' format."); + } + } + + ChatTokenLogProbabilityInfo IPersistableModel<ChatTokenLogProbabilityInfo>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<ChatTokenLogProbabilityInfo>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeChatTokenLogProbabilityInfo(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(ChatTokenLogProbabilityInfo)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<ChatTokenLogProbabilityInfo>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static ChatTokenLogProbabilityInfo FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeChatTokenLogProbabilityInfo(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/ChatTokenLogProbabilityInfo.cs b/.dotnet/src/Generated/Models/ChatTokenLogProbabilityInfo.cs new file mode 100644 index 000000000..857fe0a88 --- /dev/null +++ b/.dotnet/src/Generated/Models/ChatTokenLogProbabilityInfo.cs @@ -0,0 +1,40 @@ +// + +#nullable disable + +using System; 
+using System.Collections.Generic; +using System.Linq; + +namespace OpenAI.Chat +{ + public partial class ChatTokenLogProbabilityInfo + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + internal ChatTokenLogProbabilityInfo(string token, float logProbability, IEnumerable<int> utf8ByteValues, IEnumerable<ChatTokenTopLogProbabilityInfo> topLogProbabilities) + { + Argument.AssertNotNull(token, nameof(token)); + Argument.AssertNotNull(topLogProbabilities, nameof(topLogProbabilities)); + + Token = token; + LogProbability = logProbability; + Utf8ByteValues = utf8ByteValues?.ToList(); + TopLogProbabilities = topLogProbabilities.ToList(); + } + + internal ChatTokenLogProbabilityInfo(string token, float logProbability, IReadOnlyList<int> utf8ByteValues, IReadOnlyList<ChatTokenTopLogProbabilityInfo> topLogProbabilities, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + Token = token; + LogProbability = logProbability; + Utf8ByteValues = utf8ByteValues; + TopLogProbabilities = topLogProbabilities; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal ChatTokenLogProbabilityInfo() + { + } + + public string Token { get; } + } +} diff --git a/.dotnet/src/Generated/Models/ChatTokenTopLogProbabilityInfo.Serialization.cs b/.dotnet/src/Generated/Models/ChatTokenTopLogProbabilityInfo.Serialization.cs new file mode 100644 index 000000000..14d36e1a3 --- /dev/null +++ b/.dotnet/src/Generated/Models/ChatTokenTopLogProbabilityInfo.Serialization.cs @@ -0,0 +1,177 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Chat +{ + public partial class ChatTokenTopLogProbabilityInfo : IJsonModel<ChatTokenTopLogProbabilityInfo> + { + void IJsonModel<ChatTokenTopLogProbabilityInfo>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ChatTokenTopLogProbabilityInfo)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("token") != true) + { + writer.WritePropertyName("token"u8); + writer.WriteStringValue(Token); + } + if (SerializedAdditionalRawData?.ContainsKey("logprob") != true) + { + writer.WritePropertyName("logprob"u8); + writer.WriteNumberValue(LogProbability); + } + if (SerializedAdditionalRawData?.ContainsKey("bytes") != true) + { + if (Utf8ByteValues != null && Optional.IsCollectionDefined(Utf8ByteValues)) + { + writer.WritePropertyName("bytes"u8); + writer.WriteStartArray(); + foreach (var item in Utf8ByteValues) + { + writer.WriteNumberValue(item); + } + writer.WriteEndArray(); + } + else + { + writer.WriteNull("bytes"); + } + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + ChatTokenTopLogProbabilityInfo IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ChatTokenTopLogProbabilityInfo)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeChatTokenTopLogProbabilityInfo(document.RootElement, options); + } + + internal static ChatTokenTopLogProbabilityInfo DeserializeChatTokenTopLogProbabilityInfo(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string token = default; + float logprob = default; + IReadOnlyList bytes = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("token"u8)) + { + token = property.Value.GetString(); + continue; + } + if (property.NameEquals("logprob"u8)) + { + logprob = property.Value.GetSingle(); + continue; + } + if (property.NameEquals("bytes"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + bytes = new ChangeTrackingList(); + continue; + } + List array = new List(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(item.GetInt32()); + } + bytes = array; + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new ChatTokenTopLogProbabilityInfo(token, logprob, bytes, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(ChatTokenTopLogProbabilityInfo)} does not support writing '{options.Format}' format."); + } + } + + ChatTokenTopLogProbabilityInfo IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeChatTokenTopLogProbabilityInfo(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(ChatTokenTopLogProbabilityInfo)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static ChatTokenTopLogProbabilityInfo FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeChatTokenTopLogProbabilityInfo(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/ChatTokenTopLogProbabilityInfo.cs b/.dotnet/src/Generated/Models/ChatTokenTopLogProbabilityInfo.cs new file mode 100644 index 000000000..f0ee70ff5 --- /dev/null +++ b/.dotnet/src/Generated/Models/ChatTokenTopLogProbabilityInfo.cs @@ -0,0 +1,37 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; +using System.Linq; + +namespace OpenAI.Chat +{ + public partial class ChatTokenTopLogProbabilityInfo + { + internal IDictionary SerializedAdditionalRawData { get; set; } + internal ChatTokenTopLogProbabilityInfo(string token, float logProbability, IEnumerable utf8ByteValues) + 
{ + Argument.AssertNotNull(token, nameof(token)); + + Token = token; + LogProbability = logProbability; + Utf8ByteValues = utf8ByteValues?.ToList(); + } + + internal ChatTokenTopLogProbabilityInfo(string token, float logProbability, IReadOnlyList<int> utf8ByteValues, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + Token = token; + LogProbability = logProbability; + Utf8ByteValues = utf8ByteValues; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal ChatTokenTopLogProbabilityInfo() + { + } + + public string Token { get; } + } +} diff --git a/.dotnet/src/Generated/Models/ChatTokenUsage.Serialization.cs b/.dotnet/src/Generated/Models/ChatTokenUsage.Serialization.cs new file mode 100644 index 000000000..2440ceca2 --- /dev/null +++ b/.dotnet/src/Generated/Models/ChatTokenUsage.Serialization.cs @@ -0,0 +1,155 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Chat +{ + public partial class ChatTokenUsage : IJsonModel<ChatTokenUsage> + { + void IJsonModel<ChatTokenUsage>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ChatTokenUsage)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("completion_tokens") != true) + { + writer.WritePropertyName("completion_tokens"u8); + writer.WriteNumberValue(OutputTokens); + } + if (SerializedAdditionalRawData?.ContainsKey("prompt_tokens") != true) + { + writer.WritePropertyName("prompt_tokens"u8); + writer.WriteNumberValue(InputTokens); + } + if (SerializedAdditionalRawData?.ContainsKey("total_tokens") != true) + { + writer.WritePropertyName("total_tokens"u8); + writer.WriteNumberValue(TotalTokens); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + ChatTokenUsage IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ChatTokenUsage)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeChatTokenUsage(document.RootElement, options); + } + + internal static ChatTokenUsage DeserializeChatTokenUsage(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + int completionTokens = default; + int promptTokens = default; + int totalTokens = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("completion_tokens"u8)) + { + completionTokens = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("prompt_tokens"u8)) + { + promptTokens = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("total_tokens"u8)) + { + totalTokens = property.Value.GetInt32(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new ChatTokenUsage(completionTokens, promptTokens, totalTokens, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(ChatTokenUsage)} does not support writing '{options.Format}' format."); + } + } + + ChatTokenUsage IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeChatTokenUsage(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(ChatTokenUsage)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static ChatTokenUsage FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeChatTokenUsage(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/ChatTokenUsage.cs b/.dotnet/src/Generated/Models/ChatTokenUsage.cs new file mode 100644 index 000000000..3c2507780 --- /dev/null +++ b/.dotnet/src/Generated/Models/ChatTokenUsage.cs @@ -0,0 +1,33 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Chat +{ + public partial class ChatTokenUsage + { + internal IDictionary SerializedAdditionalRawData { get; set; } + internal ChatTokenUsage(int outputTokens, int inputTokens, int totalTokens) + { + OutputTokens = outputTokens; + InputTokens = inputTokens; + TotalTokens = totalTokens; + } + + internal ChatTokenUsage(int outputTokens, int inputTokens, int totalTokens, IDictionary 
serializedAdditionalRawData) + { + OutputTokens = outputTokens; + InputTokens = inputTokens; + TotalTokens = totalTokens; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal ChatTokenUsage() + { + } + public int TotalTokens { get; } + } +} diff --git a/.dotnet/src/Generated/Models/ChatTool.Serialization.cs b/.dotnet/src/Generated/Models/ChatTool.Serialization.cs new file mode 100644 index 000000000..78fbf981a --- /dev/null +++ b/.dotnet/src/Generated/Models/ChatTool.Serialization.cs @@ -0,0 +1,144 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Chat +{ + public partial class ChatTool : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ChatTool)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Kind.ToString()); + } + if (SerializedAdditionalRawData?.ContainsKey("function") != true) + { + writer.WritePropertyName("function"u8); + writer.WriteObjectValue(Function, options); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + ChatTool IJsonModel.Create(ref Utf8JsonReader reader, 
ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ChatTool)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeChatTool(document.RootElement, options); + } + + internal static ChatTool DeserializeChatTool(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + ChatToolKind type = default; + InternalFunctionDefinition function = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("type"u8)) + { + type = new ChatToolKind(property.Value.GetString()); + continue; + } + if (property.NameEquals("function"u8)) + { + function = InternalFunctionDefinition.DeserializeInternalFunctionDefinition(property.Value, options); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new ChatTool(type, function, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(ChatTool)} does not support writing '{options.Format}' format."); + } + } + + ChatTool IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeChatTool(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(ChatTool)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static ChatTool FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeChatTool(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/ChatTool.cs b/.dotnet/src/Generated/Models/ChatTool.cs new file mode 100644 index 000000000..d10b8bc5f --- /dev/null +++ b/.dotnet/src/Generated/Models/ChatTool.cs @@ -0,0 +1,25 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Chat +{ + public partial class ChatTool + { + internal IDictionary SerializedAdditionalRawData { get; set; } + + internal ChatTool(ChatToolKind kind, InternalFunctionDefinition function, IDictionary serializedAdditionalRawData) + { + Kind = kind; + Function = function; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal ChatTool() + { + } + } +} diff --git a/.dotnet/src/Generated/Models/ChatToolCall.Serialization.cs 
b/.dotnet/src/Generated/Models/ChatToolCall.Serialization.cs new file mode 100644 index 000000000..0b17701a4 --- /dev/null +++ b/.dotnet/src/Generated/Models/ChatToolCall.Serialization.cs @@ -0,0 +1,155 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Chat +{ + public partial class ChatToolCall : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ChatToolCall)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("id") != true) + { + writer.WritePropertyName("id"u8); + writer.WriteStringValue(Id); + } + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Kind.ToString()); + } + if (SerializedAdditionalRawData?.ContainsKey("function") != true) + { + writer.WritePropertyName("function"u8); + writer.WriteObjectValue(Function, options); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + ChatToolCall IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ChatToolCall)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeChatToolCall(document.RootElement, options); + } + + internal static ChatToolCall DeserializeChatToolCall(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string id = default; + ChatToolCallKind type = default; + InternalChatCompletionMessageToolCallFunction function = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("id"u8)) + { + id = property.Value.GetString(); + continue; + } + if (property.NameEquals("type"u8)) + { + type = new ChatToolCallKind(property.Value.GetString()); + continue; + } + if (property.NameEquals("function"u8)) + { + function = InternalChatCompletionMessageToolCallFunction.DeserializeInternalChatCompletionMessageToolCallFunction(property.Value, options); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new ChatToolCall(id, type, function, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(ChatToolCall)} does not support writing '{options.Format}' format."); + } + } + + ChatToolCall IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeChatToolCall(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(ChatToolCall)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static ChatToolCall FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeChatToolCall(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/ChatToolCall.cs b/.dotnet/src/Generated/Models/ChatToolCall.cs new file mode 100644 index 000000000..0d909e905 --- /dev/null +++ b/.dotnet/src/Generated/Models/ChatToolCall.cs @@ -0,0 +1,28 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Chat +{ + public partial class ChatToolCall + { + internal IDictionary SerializedAdditionalRawData { get; set; } + + internal ChatToolCall(string id, ChatToolCallKind kind, InternalChatCompletionMessageToolCallFunction function, IDictionary serializedAdditionalRawData) + { + Id = id; + Kind = kind; + Function = function; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal ChatToolCall() + { 
+ } + + public string Id { get; set; } + } +} diff --git a/.dotnet/src/Generated/Models/ChatToolCallKind.cs b/.dotnet/src/Generated/Models/ChatToolCallKind.cs new file mode 100644 index 000000000..c75e67ca1 --- /dev/null +++ b/.dotnet/src/Generated/Models/ChatToolCallKind.cs @@ -0,0 +1,34 @@ +// + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Chat +{ + public readonly partial struct ChatToolCallKind : IEquatable<ChatToolCallKind> + { + private readonly string _value; + + public ChatToolCallKind(string value) + { + _value = value ?? throw new ArgumentNullException(nameof(value)); + } + + private const string FunctionValue = "function"; + + public static ChatToolCallKind Function { get; } = new ChatToolCallKind(FunctionValue); + public static bool operator ==(ChatToolCallKind left, ChatToolCallKind right) => left.Equals(right); + public static bool operator !=(ChatToolCallKind left, ChatToolCallKind right) => !left.Equals(right); + public static implicit operator ChatToolCallKind(string value) => new ChatToolCallKind(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is ChatToolCallKind other && Equals(other); + public bool Equals(ChatToolCallKind other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? 
StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/ChatToolChoice.Serialization.cs b/.dotnet/src/Generated/Models/ChatToolChoice.Serialization.cs new file mode 100644 index 000000000..9fd540965 --- /dev/null +++ b/.dotnet/src/Generated/Models/ChatToolChoice.Serialization.cs @@ -0,0 +1,68 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Text.Json; + +namespace OpenAI.Chat +{ + public partial class ChatToolChoice : IJsonModel<ChatToolChoice> + { + ChatToolChoice IJsonModel<ChatToolChoice>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<ChatToolChoice>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ChatToolChoice)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeChatToolChoice(document.RootElement, options); + } + + BinaryData IPersistableModel<ChatToolChoice>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<ChatToolChoice>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(ChatToolChoice)} does not support writing '{options.Format}' format."); + } + } + + ChatToolChoice IPersistableModel<ChatToolChoice>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<ChatToolChoice>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeChatToolChoice(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(ChatToolChoice)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<ChatToolChoice>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static ChatToolChoice FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeChatToolChoice(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/ChatToolChoice.cs b/.dotnet/src/Generated/Models/ChatToolChoice.cs new file mode 100644 index 000000000..5650611aa --- /dev/null +++ b/.dotnet/src/Generated/Models/ChatToolChoice.cs @@ -0,0 +1,14 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Chat +{ + public partial class ChatToolChoice + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + } +} diff --git a/.dotnet/src/Generated/Models/ChatToolKind.cs b/.dotnet/src/Generated/Models/ChatToolKind.cs new file mode 100644 index 000000000..793f2f7a0 --- /dev/null +++ b/.dotnet/src/Generated/Models/ChatToolKind.cs @@ -0,0 +1,34 @@ +// + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Chat +{ + public readonly partial struct ChatToolKind : IEquatable<ChatToolKind> + { + private readonly string _value; + + public ChatToolKind(string value) + { + _value = value ?? 
throw new ArgumentNullException(nameof(value)); + } + + private const string FunctionValue = "function"; + + public static ChatToolKind Function { get; } = new ChatToolKind(FunctionValue); + public static bool operator ==(ChatToolKind left, ChatToolKind right) => left.Equals(right); + public static bool operator !=(ChatToolKind left, ChatToolKind right) => !left.Equals(right); + public static implicit operator ChatToolKind(string value) => new ChatToolKind(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is ChatToolKind other && Equals(other); + public bool Equals(ChatToolKind other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/CodeInterpreterToolDefinition.Serialization.cs b/.dotnet/src/Generated/Models/CodeInterpreterToolDefinition.Serialization.cs new file mode 100644 index 000000000..c1f2296db --- /dev/null +++ b/.dotnet/src/Generated/Models/CodeInterpreterToolDefinition.Serialization.cs @@ -0,0 +1,97 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + public partial class CodeInterpreterToolDefinition : IJsonModel<CodeInterpreterToolDefinition> + { + CodeInterpreterToolDefinition IJsonModel<CodeInterpreterToolDefinition>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel&lt;CodeInterpreterToolDefinition&gt;)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(CodeInterpreterToolDefinition)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeCodeInterpreterToolDefinition(document.RootElement, options); + } + + internal static CodeInterpreterToolDefinition DeserializeCodeInterpreterToolDefinition(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string type = default; + IDictionary&lt;string, BinaryData&gt; serializedAdditionalRawData = default; + Dictionary&lt;string, BinaryData&gt; rawDataDictionary = new Dictionary&lt;string, BinaryData&gt;(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary&lt;string, BinaryData&gt;(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new CodeInterpreterToolDefinition(type, serializedAdditionalRawData); + } + + BinaryData IPersistableModel&lt;CodeInterpreterToolDefinition&gt;.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel&lt;CodeInterpreterToolDefinition&gt;)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(CodeInterpreterToolDefinition)} does not support writing '{options.Format}' format."); + } + } + + CodeInterpreterToolDefinition IPersistableModel&lt;CodeInterpreterToolDefinition&gt;.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ?
((IPersistableModel&lt;CodeInterpreterToolDefinition&gt;)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeCodeInterpreterToolDefinition(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(CodeInterpreterToolDefinition)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel&lt;CodeInterpreterToolDefinition&gt;.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static new CodeInterpreterToolDefinition FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeCodeInterpreterToolDefinition(document.RootElement); + } + + internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/CodeInterpreterToolDefinition.cs b/.dotnet/src/Generated/Models/CodeInterpreterToolDefinition.cs new file mode 100644 index 000000000..1b330ab0d --- /dev/null +++ b/.dotnet/src/Generated/Models/CodeInterpreterToolDefinition.cs @@ -0,0 +1,21 @@ +// &lt;auto-generated/&gt; + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + public partial class CodeInterpreterToolDefinition : ToolDefinition + { + public CodeInterpreterToolDefinition() + { + Type = "code_interpreter"; + } + + internal CodeInterpreterToolDefinition(string type, IDictionary&lt;string, BinaryData&gt; serializedAdditionalRawData) : base(type, serializedAdditionalRawData) + { + } + } +} diff --git a/.dotnet/src/Generated/Models/CodeInterpreterToolResources.Serialization.cs b/.dotnet/src/Generated/Models/CodeInterpreterToolResources.Serialization.cs new file mode 100644 index 000000000..467dad787 --- /dev/null +++ b/.dotnet/src/Generated/Models/CodeInterpreterToolResources.Serialization.cs @@ -0,0 +1,147 @@ +// &lt;auto-generated/&gt; + +#nullable disable + +using System; +using System.ClientModel; +using
System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + public partial class CodeInterpreterToolResources : IJsonModel&lt;CodeInterpreterToolResources&gt; + { + void IJsonModel&lt;CodeInterpreterToolResources&gt;.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel&lt;CodeInterpreterToolResources&gt;)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(CodeInterpreterToolResources)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("file_ids") != true && Optional.IsCollectionDefined(FileIds)) + { + writer.WritePropertyName("file_ids"u8); + writer.WriteStartArray(); + foreach (var item in FileIds) + { + writer.WriteStringValue(item); + } + writer.WriteEndArray(); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + CodeInterpreterToolResources IJsonModel&lt;CodeInterpreterToolResources&gt;.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ?
((IPersistableModel&lt;CodeInterpreterToolResources&gt;)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(CodeInterpreterToolResources)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeCodeInterpreterToolResources(document.RootElement, options); + } + + internal static CodeInterpreterToolResources DeserializeCodeInterpreterToolResources(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + IList&lt;string&gt; fileIds = default; + IDictionary&lt;string, BinaryData&gt; serializedAdditionalRawData = default; + Dictionary&lt;string, BinaryData&gt; rawDataDictionary = new Dictionary&lt;string, BinaryData&gt;(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("file_ids"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + List&lt;string&gt; array = new List&lt;string&gt;(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(item.GetString()); + } + fileIds = array; + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary&lt;string, BinaryData&gt;(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new CodeInterpreterToolResources(fileIds ?? new ChangeTrackingList&lt;string&gt;(), serializedAdditionalRawData); + } + + BinaryData IPersistableModel&lt;CodeInterpreterToolResources&gt;.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ?
((IPersistableModel&lt;CodeInterpreterToolResources&gt;)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(CodeInterpreterToolResources)} does not support writing '{options.Format}' format."); + } + } + + CodeInterpreterToolResources IPersistableModel&lt;CodeInterpreterToolResources&gt;.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel&lt;CodeInterpreterToolResources&gt;)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeCodeInterpreterToolResources(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(CodeInterpreterToolResources)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel&lt;CodeInterpreterToolResources&gt;.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static CodeInterpreterToolResources FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeCodeInterpreterToolResources(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/CodeInterpreterToolResources.cs b/.dotnet/src/Generated/Models/CodeInterpreterToolResources.cs new file mode 100644 index 000000000..1735ff0f9 --- /dev/null +++ b/.dotnet/src/Generated/Models/CodeInterpreterToolResources.cs @@ -0,0 +1,20 @@ +// &lt;auto-generated/&gt; + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + public partial class CodeInterpreterToolResources + { + internal IDictionary&lt;string, BinaryData&gt; SerializedAdditionalRawData { get; set; } + + internal CodeInterpreterToolResources(IList&lt;string&gt; fileIds, IDictionary&lt;string, BinaryData&gt; serializedAdditionalRawData) + { + FileIds = fileIds; +
SerializedAdditionalRawData = serializedAdditionalRawData; + } + } +} diff --git a/.dotnet/src/Generated/Models/Embedding.Serialization.cs b/.dotnet/src/Generated/Models/Embedding.Serialization.cs new file mode 100644 index 000000000..93a723fe5 --- /dev/null +++ b/.dotnet/src/Generated/Models/Embedding.Serialization.cs @@ -0,0 +1,162 @@ +// &lt;auto-generated/&gt; + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Embeddings +{ + public partial class Embedding : IJsonModel&lt;Embedding&gt; + { + void IJsonModel&lt;Embedding&gt;.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel&lt;Embedding&gt;)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(Embedding)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("index") != true) + { + writer.WritePropertyName("index"u8); + writer.WriteNumberValue(Index); + } + if (SerializedAdditionalRawData?.ContainsKey("embedding") != true) + { + writer.WritePropertyName("embedding"u8); +#if NET6_0_OR_GREATER + writer.WriteRawValue(EmbeddingProperty); +#else + using (JsonDocument document = JsonDocument.Parse(EmbeddingProperty)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + if (SerializedAdditionalRawData?.ContainsKey("object") != true) + { + writer.WritePropertyName("object"u8); + writer.WriteStringValue(Object.ToString()); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer,
document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + Embedding IJsonModel&lt;Embedding&gt;.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel&lt;Embedding&gt;)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(Embedding)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeEmbedding(document.RootElement, options); + } + + internal static Embedding DeserializeEmbedding(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + int index = default; + BinaryData embedding = default; + InternalEmbeddingObject @object = default; + IDictionary&lt;string, BinaryData&gt; serializedAdditionalRawData = default; + Dictionary&lt;string, BinaryData&gt; rawDataDictionary = new Dictionary&lt;string, BinaryData&gt;(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("index"u8)) + { + index = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("embedding"u8)) + { + embedding = BinaryData.FromString(property.Value.GetRawText()); + continue; + } + if (property.NameEquals("object"u8)) + { + @object = new InternalEmbeddingObject(property.Value.GetString()); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary&lt;string, BinaryData&gt;(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new Embedding(index, embedding, @object, serializedAdditionalRawData); + } + + BinaryData IPersistableModel&lt;Embedding&gt;.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ?
((IPersistableModel&lt;Embedding&gt;)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(Embedding)} does not support writing '{options.Format}' format."); + } + } + + Embedding IPersistableModel&lt;Embedding&gt;.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel&lt;Embedding&gt;)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeEmbedding(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(Embedding)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel&lt;Embedding&gt;.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static Embedding FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeEmbedding(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/Embedding.cs b/.dotnet/src/Generated/Models/Embedding.cs new file mode 100644 index 000000000..b8eb10978 --- /dev/null +++ b/.dotnet/src/Generated/Models/Embedding.cs @@ -0,0 +1,20 @@ +// &lt;auto-generated/&gt; + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Embeddings +{ + public partial class Embedding + { + internal IDictionary&lt;string, BinaryData&gt; SerializedAdditionalRawData { get; set; } + + internal Embedding() + { + } + + public int Index { get; } + } +} diff --git a/.dotnet/src/Generated/Models/EmbeddingCollection.Serialization.cs b/.dotnet/src/Generated/Models/EmbeddingCollection.Serialization.cs new file mode 100644 index 000000000..d569d5592 --- /dev/null +++
b/.dotnet/src/Generated/Models/EmbeddingCollection.Serialization.cs @@ -0,0 +1,68 @@ +// &lt;auto-generated/&gt; + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Text.Json; + +namespace OpenAI.Embeddings +{ + public partial class EmbeddingCollection : IJsonModel&lt;EmbeddingCollection&gt; + { + EmbeddingCollection IJsonModel&lt;EmbeddingCollection&gt;.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel&lt;EmbeddingCollection&gt;)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(EmbeddingCollection)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeEmbeddingCollection(document.RootElement, options); + } + + BinaryData IPersistableModel&lt;EmbeddingCollection&gt;.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel&lt;EmbeddingCollection&gt;)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(EmbeddingCollection)} does not support writing '{options.Format}' format."); + } + } + + EmbeddingCollection IPersistableModel&lt;EmbeddingCollection&gt;.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ?
((IPersistableModel&lt;EmbeddingCollection&gt;)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeEmbeddingCollection(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(EmbeddingCollection)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel&lt;EmbeddingCollection&gt;.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static EmbeddingCollection FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeEmbeddingCollection(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/EmbeddingCollection.cs b/.dotnet/src/Generated/Models/EmbeddingCollection.cs new file mode 100644 index 000000000..6afd1cf94 --- /dev/null +++ b/.dotnet/src/Generated/Models/EmbeddingCollection.cs @@ -0,0 +1,17 @@ +// &lt;auto-generated/&gt; + +#nullable disable + +using System.Collections.Generic; +using System.Collections.ObjectModel; +using System.Linq; + +namespace OpenAI.Embeddings +{ + public partial class EmbeddingCollection : ReadOnlyCollection&lt;Embedding&gt; + { + public string Model { get; } + + public EmbeddingTokenUsage Usage { get; } + } +} diff --git a/.dotnet/src/Generated/Models/EmbeddingGenerationOptions.Serialization.cs b/.dotnet/src/Generated/Models/EmbeddingGenerationOptions.Serialization.cs new file mode 100644 index 000000000..5a2ee5c11 --- /dev/null +++ b/.dotnet/src/Generated/Models/EmbeddingGenerationOptions.Serialization.cs @@ -0,0 +1,198 @@ +// &lt;auto-generated/&gt; + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Embeddings +{ + public partial class EmbeddingGenerationOptions : IJsonModel&lt;EmbeddingGenerationOptions&gt; + {
void IJsonModel&lt;EmbeddingGenerationOptions&gt;.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel&lt;EmbeddingGenerationOptions&gt;)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(EmbeddingGenerationOptions)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("input") != true) + { + writer.WritePropertyName("input"u8); +#if NET6_0_OR_GREATER + writer.WriteRawValue(Input); +#else + using (JsonDocument document = JsonDocument.Parse(Input)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + if (SerializedAdditionalRawData?.ContainsKey("model") != true) + { + writer.WritePropertyName("model"u8); + writer.WriteStringValue(Model.ToString()); + } + if (SerializedAdditionalRawData?.ContainsKey("encoding_format") != true && Optional.IsDefined(EncodingFormat)) + { + writer.WritePropertyName("encoding_format"u8); + writer.WriteStringValue(EncodingFormat.Value.ToString()); + } + if (SerializedAdditionalRawData?.ContainsKey("dimensions") != true && Optional.IsDefined(Dimensions)) + { + writer.WritePropertyName("dimensions"u8); + writer.WriteNumberValue(Dimensions.Value); + } + if (SerializedAdditionalRawData?.ContainsKey("user") != true && Optional.IsDefined(EndUserId)) + { + writer.WritePropertyName("user"u8); + writer.WriteStringValue(EndUserId); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + EmbeddingGenerationOptions IJsonModel&lt;EmbeddingGenerationOptions&gt;.Create(ref Utf8JsonReader
reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel&lt;EmbeddingGenerationOptions&gt;)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(EmbeddingGenerationOptions)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeEmbeddingGenerationOptions(document.RootElement, options); + } + + internal static EmbeddingGenerationOptions DeserializeEmbeddingGenerationOptions(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + BinaryData input = default; + InternalCreateEmbeddingRequestModel model = default; + InternalCreateEmbeddingRequestEncodingFormat? encodingFormat = default; + int? dimensions = default; + string user = default; + IDictionary&lt;string, BinaryData&gt; serializedAdditionalRawData = default; + Dictionary&lt;string, BinaryData&gt; rawDataDictionary = new Dictionary&lt;string, BinaryData&gt;(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("input"u8)) + { + input = BinaryData.FromString(property.Value.GetRawText()); + continue; + } + if (property.NameEquals("model"u8)) + { + model = new InternalCreateEmbeddingRequestModel(property.Value.GetString()); + continue; + } + if (property.NameEquals("encoding_format"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + encodingFormat = new InternalCreateEmbeddingRequestEncodingFormat(property.Value.GetString()); + continue; + } + if (property.NameEquals("dimensions"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + dimensions = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("user"u8)) + { + user = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary&lt;string, BinaryData&gt;(); + rawDataDictionary.Add(property.Name,
BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new EmbeddingGenerationOptions( + input, + model, + encodingFormat, + dimensions, + user, + serializedAdditionalRawData); + } + + BinaryData IPersistableModel&lt;EmbeddingGenerationOptions&gt;.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel&lt;EmbeddingGenerationOptions&gt;)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(EmbeddingGenerationOptions)} does not support writing '{options.Format}' format."); + } + } + + EmbeddingGenerationOptions IPersistableModel&lt;EmbeddingGenerationOptions&gt;.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel&lt;EmbeddingGenerationOptions&gt;)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeEmbeddingGenerationOptions(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(EmbeddingGenerationOptions)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel&lt;EmbeddingGenerationOptions&gt;.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static EmbeddingGenerationOptions FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeEmbeddingGenerationOptions(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/EmbeddingGenerationOptions.cs b/.dotnet/src/Generated/Models/EmbeddingGenerationOptions.cs new file mode 100644 index 000000000..92ffbca28 --- /dev/null +++ b/.dotnet/src/Generated/Models/EmbeddingGenerationOptions.cs @@ -0,0 +1,25 @@ +// &lt;auto-generated/&gt; + +#nullable disable +
+using System; +using System.Collections.Generic; + +namespace OpenAI.Embeddings +{ + public partial class EmbeddingGenerationOptions + { + internal IDictionary&lt;string, BinaryData&gt; SerializedAdditionalRawData { get; set; } + + internal EmbeddingGenerationOptions(BinaryData input, InternalCreateEmbeddingRequestModel model, InternalCreateEmbeddingRequestEncodingFormat? encodingFormat, int? dimensions, string endUserId, IDictionary&lt;string, BinaryData&gt; serializedAdditionalRawData) + { + Input = input; + Model = model; + EncodingFormat = encodingFormat; + Dimensions = dimensions; + EndUserId = endUserId; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + public int? Dimensions { get; set; } + } +} diff --git a/.dotnet/src/Generated/Models/EmbeddingTokenUsage.Serialization.cs b/.dotnet/src/Generated/Models/EmbeddingTokenUsage.Serialization.cs new file mode 100644 index 000000000..520379899 --- /dev/null +++ b/.dotnet/src/Generated/Models/EmbeddingTokenUsage.Serialization.cs @@ -0,0 +1,144 @@ +// &lt;auto-generated/&gt; + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Embeddings +{ + public partial class EmbeddingTokenUsage : IJsonModel&lt;EmbeddingTokenUsage&gt; + { + void IJsonModel&lt;EmbeddingTokenUsage&gt;.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ?
((IPersistableModel&lt;EmbeddingTokenUsage&gt;)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(EmbeddingTokenUsage)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("prompt_tokens") != true) + { + writer.WritePropertyName("prompt_tokens"u8); + writer.WriteNumberValue(InputTokens); + } + if (SerializedAdditionalRawData?.ContainsKey("total_tokens") != true) + { + writer.WritePropertyName("total_tokens"u8); + writer.WriteNumberValue(TotalTokens); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + EmbeddingTokenUsage IJsonModel&lt;EmbeddingTokenUsage&gt;.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ?
((IPersistableModel&lt;EmbeddingTokenUsage&gt;)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(EmbeddingTokenUsage)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeEmbeddingTokenUsage(document.RootElement, options); + } + + internal static EmbeddingTokenUsage DeserializeEmbeddingTokenUsage(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + int promptTokens = default; + int totalTokens = default; + IDictionary&lt;string, BinaryData&gt; serializedAdditionalRawData = default; + Dictionary&lt;string, BinaryData&gt; rawDataDictionary = new Dictionary&lt;string, BinaryData&gt;(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("prompt_tokens"u8)) + { + promptTokens = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("total_tokens"u8)) + { + totalTokens = property.Value.GetInt32(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary&lt;string, BinaryData&gt;(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new EmbeddingTokenUsage(promptTokens, totalTokens, serializedAdditionalRawData); + } + + BinaryData IPersistableModel&lt;EmbeddingTokenUsage&gt;.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel&lt;EmbeddingTokenUsage&gt;)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(EmbeddingTokenUsage)} does not support writing '{options.Format}' format."); + } + } + + EmbeddingTokenUsage IPersistableModel&lt;EmbeddingTokenUsage&gt;.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ?
((IPersistableModel&lt;EmbeddingTokenUsage&gt;)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeEmbeddingTokenUsage(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(EmbeddingTokenUsage)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel&lt;EmbeddingTokenUsage&gt;.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static EmbeddingTokenUsage FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeEmbeddingTokenUsage(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/EmbeddingTokenUsage.cs b/.dotnet/src/Generated/Models/EmbeddingTokenUsage.cs new file mode 100644 index 000000000..5234ccde5 --- /dev/null +++ b/.dotnet/src/Generated/Models/EmbeddingTokenUsage.cs @@ -0,0 +1,31 @@ +// &lt;auto-generated/&gt; + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Embeddings +{ + public partial class EmbeddingTokenUsage + { + internal IDictionary&lt;string, BinaryData&gt; SerializedAdditionalRawData { get; set; } + internal EmbeddingTokenUsage(int inputTokens, int totalTokens) + { + InputTokens = inputTokens; + TotalTokens = totalTokens; + } + + internal EmbeddingTokenUsage(int inputTokens, int totalTokens, IDictionary&lt;string, BinaryData&gt; serializedAdditionalRawData) + { + InputTokens = inputTokens; + TotalTokens = totalTokens; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal EmbeddingTokenUsage() + { + } + public int TotalTokens { get; } + } +} diff --git a/.dotnet/src/Generated/Models/FileChunkingStrategy.Serialization.cs b/.dotnet/src/Generated/Models/FileChunkingStrategy.Serialization.cs new file mode 100644 index 000000000..c56c5329a --- /dev/null +++
b/.dotnet/src/Generated/Models/FileChunkingStrategy.Serialization.cs @@ -0,0 +1,125 @@ +// &lt;auto-generated/&gt; + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Text.Json; + +namespace OpenAI.VectorStores +{ + [PersistableModelProxy(typeof(InternalUnknownFileChunkingStrategyResponseParamProxy))] + public partial class FileChunkingStrategy : IJsonModel&lt;FileChunkingStrategy&gt; + { + void IJsonModel&lt;FileChunkingStrategy&gt;.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel&lt;FileChunkingStrategy&gt;)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(FileChunkingStrategy)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + FileChunkingStrategy IJsonModel&lt;FileChunkingStrategy&gt;.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ?
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(FileChunkingStrategy)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeFileChunkingStrategy(document.RootElement, options); + } + + internal static FileChunkingStrategy DeserializeFileChunkingStrategy(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + if (element.TryGetProperty("type", out JsonElement discriminator)) + { + switch (discriminator.GetString()) + { + case "auto": return InternalAutoChunkingStrategy.DeserializeInternalAutoChunkingStrategy(element, options); + case "other": return InternalUnknownChunkingStrategy.DeserializeInternalUnknownChunkingStrategy(element, options); + case "static": return StaticFileChunkingStrategy.DeserializeStaticFileChunkingStrategy(element, options); + } + } + return InternalUnknownFileChunkingStrategyResponseParamProxy.DeserializeInternalUnknownFileChunkingStrategyResponseParamProxy(element, options); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(FileChunkingStrategy)} does not support writing '{options.Format}' format."); + } + } + + FileChunkingStrategy IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeFileChunkingStrategy(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(FileChunkingStrategy)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static FileChunkingStrategy FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeFileChunkingStrategy(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/FileChunkingStrategy.cs b/.dotnet/src/Generated/Models/FileChunkingStrategy.cs new file mode 100644 index 000000000..b93a33e99 --- /dev/null +++ b/.dotnet/src/Generated/Models/FileChunkingStrategy.cs @@ -0,0 +1,25 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.VectorStores +{ + public abstract partial class FileChunkingStrategy + { + internal IDictionary SerializedAdditionalRawData { get; set; } + protected FileChunkingStrategy() + { + } + + internal FileChunkingStrategy(string type, IDictionary serializedAdditionalRawData) + { + Type = type; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal string Type { get; set; } + } +} diff --git a/.dotnet/src/Generated/Models/FileSearchToolDefinition.Serialization.cs b/.dotnet/src/Generated/Models/FileSearchToolDefinition.Serialization.cs new file mode 100644 index 000000000..23868744b --- /dev/null +++ b/.dotnet/src/Generated/Models/FileSearchToolDefinition.Serialization.cs @@ -0,0 +1,107 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using 
System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + public partial class FileSearchToolDefinition : IJsonModel + { + FileSearchToolDefinition IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(FileSearchToolDefinition)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeFileSearchToolDefinition(document.RootElement, options); + } + + internal static FileSearchToolDefinition DeserializeFileSearchToolDefinition(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + InternalAssistantToolsFileSearchFileSearch fileSearch = default; + string type = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("file_search"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + fileSearch = InternalAssistantToolsFileSearchFileSearch.DeserializeInternalAssistantToolsFileSearchFileSearch(property.Value, options); + continue; + } + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new FileSearchToolDefinition(type, serializedAdditionalRawData, fileSearch); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var 
format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(FileSearchToolDefinition)} does not support writing '{options.Format}' format."); + } + } + + FileSearchToolDefinition IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeFileSearchToolDefinition(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(FileSearchToolDefinition)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static new FileSearchToolDefinition FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeFileSearchToolDefinition(document.RootElement); + } + + internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/FileSearchToolDefinition.cs b/.dotnet/src/Generated/Models/FileSearchToolDefinition.cs new file mode 100644 index 000000000..a91be72a4 --- /dev/null +++ b/.dotnet/src/Generated/Models/FileSearchToolDefinition.cs @@ -0,0 +1,17 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + public partial class FileSearchToolDefinition : ToolDefinition + { + internal FileSearchToolDefinition(string type, IDictionary serializedAdditionalRawData, InternalAssistantToolsFileSearchFileSearch fileSearch) : base(type, serializedAdditionalRawData) + { 
+ _fileSearch = fileSearch; + } + } +} diff --git a/.dotnet/src/Generated/Models/FileSearchToolResources.Serialization.cs b/.dotnet/src/Generated/Models/FileSearchToolResources.Serialization.cs new file mode 100644 index 000000000..15df3db90 --- /dev/null +++ b/.dotnet/src/Generated/Models/FileSearchToolResources.Serialization.cs @@ -0,0 +1,167 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + public partial class FileSearchToolResources : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(FileSearchToolResources)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("vector_store_ids") != true && Optional.IsCollectionDefined(VectorStoreIds)) + { + writer.WritePropertyName("vector_store_ids"u8); + writer.WriteStartArray(); + foreach (var item in VectorStoreIds) + { + writer.WriteStringValue(item); + } + writer.WriteEndArray(); + } + if (SerializedAdditionalRawData?.ContainsKey("vector_stores") != true && Optional.IsCollectionDefined(NewVectorStores)) + { + writer.WritePropertyName("vector_stores"u8); + SerializeNewVectorStores(writer, options); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + 
} + + FileSearchToolResources IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(FileSearchToolResources)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeFileSearchToolResources(document.RootElement, options); + } + + internal static FileSearchToolResources DeserializeFileSearchToolResources(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + IList vectorStoreIds = default; + IList vectorStores = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("vector_store_ids"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + List array = new List(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(item.GetString()); + } + vectorStoreIds = array; + continue; + } + if (property.NameEquals("vector_stores"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + List array = new List(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(VectorStoreCreationHelper.DeserializeVectorStoreCreationHelper(item, options)); + } + vectorStores = array; + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new FileSearchToolResources(vectorStoreIds ?? new ChangeTrackingList(), vectorStores ?? 
new ChangeTrackingList(), serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(FileSearchToolResources)} does not support writing '{options.Format}' format."); + } + } + + FileSearchToolResources IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeFileSearchToolResources(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(FileSearchToolResources)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static FileSearchToolResources FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeFileSearchToolResources(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/FileSearchToolResources.cs b/.dotnet/src/Generated/Models/FileSearchToolResources.cs new file mode 100644 index 000000000..21ddbd788 --- /dev/null +++ b/.dotnet/src/Generated/Models/FileSearchToolResources.cs @@ -0,0 +1,21 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + public partial class FileSearchToolResources + { + internal IDictionary SerializedAdditionalRawData { get; set; } + + internal 
FileSearchToolResources(IList<string> vectorStoreIds, IList<VectorStoreCreationHelper> newVectorStores, IDictionary<string, BinaryData> serializedAdditionalRawData)
+        {
+            VectorStoreIds = vectorStoreIds;
+            NewVectorStores = newVectorStores;
+            SerializedAdditionalRawData = serializedAdditionalRawData;
+        }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/FileUploadPurpose.cs b/.dotnet/src/Generated/Models/FileUploadPurpose.cs
new file mode 100644
index 000000000..9c08633af
--- /dev/null
+++ b/.dotnet/src/Generated/Models/FileUploadPurpose.cs
@@ -0,0 +1,40 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ComponentModel;
+
+namespace OpenAI.Files
+{
+    public readonly partial struct FileUploadPurpose : IEquatable<FileUploadPurpose>
+    {
+        private readonly string _value;
+
+        public FileUploadPurpose(string value)
+        {
+            _value = value ?? throw new ArgumentNullException(nameof(value));
+        }
+
+        private const string AssistantsValue = "assistants";
+        private const string BatchValue = "batch";
+        private const string FineTuneValue = "fine-tune";
+        private const string VisionValue = "vision";
+
+        public static FileUploadPurpose Assistants { get; } = new FileUploadPurpose(AssistantsValue);
+        public static FileUploadPurpose Batch { get; } = new FileUploadPurpose(BatchValue);
+        public static FileUploadPurpose FineTune { get; } = new FileUploadPurpose(FineTuneValue);
+        public static FileUploadPurpose Vision { get; } = new FileUploadPurpose(VisionValue);
+        public static bool operator ==(FileUploadPurpose left, FileUploadPurpose right) => left.Equals(right);
+        public static bool operator !=(FileUploadPurpose left, FileUploadPurpose right) => !left.Equals(right);
+        public static implicit operator FileUploadPurpose(string value) => new FileUploadPurpose(value);
+
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override bool Equals(object obj) => obj is FileUploadPurpose other && Equals(other);
+        public bool Equals(FileUploadPurpose other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase);
+
[EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/FunctionChatMessage.Serialization.cs b/.dotnet/src/Generated/Models/FunctionChatMessage.Serialization.cs new file mode 100644 index 000000000..398f7d5cc --- /dev/null +++ b/.dotnet/src/Generated/Models/FunctionChatMessage.Serialization.cs @@ -0,0 +1,109 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Chat +{ + public partial class FunctionChatMessage : IJsonModel + { + FunctionChatMessage IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(FunctionChatMessage)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeFunctionChatMessage(document.RootElement, options); + } + + internal static FunctionChatMessage DeserializeFunctionChatMessage(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string name = default; + ChatMessageRole role = default; + IList content = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("name"u8)) + { + name = property.Value.GetString(); + continue; + } + if (property.NameEquals("role"u8)) + { + role = property.Value.GetString().ToChatMessageRole(); + 
continue; + } + if (property.NameEquals("content"u8)) + { + DeserializeContentValue(property, ref content); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new FunctionChatMessage(role, content ?? new ChangeTrackingList(), serializedAdditionalRawData, name); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(FunctionChatMessage)} does not support writing '{options.Format}' format."); + } + } + + FunctionChatMessage IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeFunctionChatMessage(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(FunctionChatMessage)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static new FunctionChatMessage FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeFunctionChatMessage(document.RootElement); + } + + internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/FunctionChatMessage.cs b/.dotnet/src/Generated/Models/FunctionChatMessage.cs new file mode 100644 index 000000000..abcbb2170 --- /dev/null +++ b/.dotnet/src/Generated/Models/FunctionChatMessage.cs @@ -0,0 +1,22 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Chat +{ + [Obsolete("This field is marked as deprecated.")] + public partial class FunctionChatMessage : ChatMessage + { + internal FunctionChatMessage(ChatMessageRole role, IList content, IDictionary serializedAdditionalRawData, string functionName) : base(role, content, serializedAdditionalRawData) + { + FunctionName = functionName; + } + + internal FunctionChatMessage() + { + } + } +} diff --git a/.dotnet/src/Generated/Models/FunctionToolDefinition.Serialization.cs b/.dotnet/src/Generated/Models/FunctionToolDefinition.Serialization.cs new file mode 100644 index 000000000..e17ea82e9 --- /dev/null +++ b/.dotnet/src/Generated/Models/FunctionToolDefinition.Serialization.cs @@ -0,0 +1,103 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using 
System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + public partial class FunctionToolDefinition : IJsonModel + { + FunctionToolDefinition IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(FunctionToolDefinition)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeFunctionToolDefinition(document.RootElement, options); + } + + internal static FunctionToolDefinition DeserializeFunctionToolDefinition(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + InternalFunctionDefinition function = default; + string type = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("function"u8)) + { + function = InternalFunctionDefinition.DeserializeInternalFunctionDefinition(property.Value, options); + continue; + } + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new FunctionToolDefinition(type, serializedAdditionalRawData, function); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(FunctionToolDefinition)} does not support writing '{options.Format}' format."); + } + } + + FunctionToolDefinition IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeFunctionToolDefinition(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(FunctionToolDefinition)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static new FunctionToolDefinition FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeFunctionToolDefinition(document.RootElement); + } + + internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/FunctionToolDefinition.cs b/.dotnet/src/Generated/Models/FunctionToolDefinition.cs new file mode 100644 index 000000000..a1ed5ea67 --- /dev/null +++ b/.dotnet/src/Generated/Models/FunctionToolDefinition.cs @@ -0,0 +1,13 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + public partial class FunctionToolDefinition : ToolDefinition + { + } +} diff --git a/.dotnet/src/Generated/Models/GeneratedImage.Serialization.cs b/.dotnet/src/Generated/Models/GeneratedImage.Serialization.cs new file mode 100644 index 000000000..cdcba781c --- /dev/null +++ 
b/.dotnet/src/Generated/Models/GeneratedImage.Serialization.cs @@ -0,0 +1,163 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Images +{ + public partial class GeneratedImage : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(GeneratedImage)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("b64_json") != true && Optional.IsDefined(ImageBytes)) + { + writer.WritePropertyName("b64_json"u8); + writer.WriteBase64StringValue(ImageBytes.ToArray(), "D"); + } + if (SerializedAdditionalRawData?.ContainsKey("url") != true && Optional.IsDefined(ImageUri)) + { + writer.WritePropertyName("url"u8); + writer.WriteStringValue(ImageUri.AbsoluteUri); + } + if (SerializedAdditionalRawData?.ContainsKey("revised_prompt") != true && Optional.IsDefined(RevisedPrompt)) + { + writer.WritePropertyName("revised_prompt"u8); + writer.WriteStringValue(RevisedPrompt); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + GeneratedImage IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(GeneratedImage)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeGeneratedImage(document.RootElement, options); + } + + internal static GeneratedImage DeserializeGeneratedImage(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + BinaryData b64Json = default; + Uri url = default; + string revisedPrompt = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("b64_json"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + b64Json = BinaryData.FromBytes(property.Value.GetBytesFromBase64("D")); + continue; + } + if (property.NameEquals("url"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + url = new Uri(property.Value.GetString()); + continue; + } + if (property.NameEquals("revised_prompt"u8)) + { + revisedPrompt = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new GeneratedImage(b64Json, url, revisedPrompt, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(GeneratedImage)} does not support writing '{options.Format}' format."); + } + } + + GeneratedImage IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeGeneratedImage(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(GeneratedImage)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static GeneratedImage FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeGeneratedImage(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/GeneratedImage.cs b/.dotnet/src/Generated/Models/GeneratedImage.cs new file mode 100644 index 000000000..9177bfc96 --- /dev/null +++ b/.dotnet/src/Generated/Models/GeneratedImage.cs @@ -0,0 +1,26 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Images +{ + public partial class GeneratedImage + { + internal IDictionary SerializedAdditionalRawData { get; set; } + internal GeneratedImage() + { + } + + internal GeneratedImage(BinaryData imageBytes, Uri imageUri, string revisedPrompt, IDictionary serializedAdditionalRawData) + { + ImageBytes = imageBytes; + ImageUri = imageUri; + RevisedPrompt = revisedPrompt; + SerializedAdditionalRawData = 
serializedAdditionalRawData; + } + public string RevisedPrompt { get; } + } +} diff --git a/.dotnet/src/Generated/Models/GeneratedImageCollection.Serialization.cs b/.dotnet/src/Generated/Models/GeneratedImageCollection.Serialization.cs new file mode 100644 index 000000000..46bcc4d4e --- /dev/null +++ b/.dotnet/src/Generated/Models/GeneratedImageCollection.Serialization.cs @@ -0,0 +1,68 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Text.Json; + +namespace OpenAI.Images +{ + public partial class GeneratedImageCollection : IJsonModel + { + GeneratedImageCollection IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(GeneratedImageCollection)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeGeneratedImageCollection(document.RootElement, options); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(GeneratedImageCollection)} does not support writing '{options.Format}' format."); + } + } + + GeneratedImageCollection IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<GeneratedImageCollection>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeGeneratedImageCollection(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(GeneratedImageCollection)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<GeneratedImageCollection>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static GeneratedImageCollection FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeGeneratedImageCollection(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/GeneratedImageCollection.cs b/.dotnet/src/Generated/Models/GeneratedImageCollection.cs new file mode 100644 index 000000000..c61b6e824 --- /dev/null +++ b/.dotnet/src/Generated/Models/GeneratedImageCollection.cs @@ -0,0 +1,16 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; +using System.Collections.ObjectModel; +using System.Linq; + +namespace OpenAI.Images +{ + public partial class GeneratedImageCollection : ReadOnlyCollection<GeneratedImage> + { + public DateTimeOffset Created { get; } + } +} diff --git a/.dotnet/src/Generated/Models/GeneratedImageFormat.Serialization.cs b/.dotnet/src/Generated/Models/GeneratedImageFormat.Serialization.cs new file mode 100644 index 000000000..6cbd234d9 --- /dev/null +++ b/.dotnet/src/Generated/Models/GeneratedImageFormat.Serialization.cs @@ -0,0 +1,25 @@ +// + +#nullable disable + +using System; + +namespace OpenAI.Images +{ + internal static partial class GeneratedImageFormatExtensions + { + public static string ToSerialString(this GeneratedImageFormat value) => value switch + { + GeneratedImageFormat.Uri => "url", +
GeneratedImageFormat.Bytes => "b64_json", + _ => throw new ArgumentOutOfRangeException(nameof(value), value, "Unknown GeneratedImageFormat value.") + }; + + public static GeneratedImageFormat ToGeneratedImageFormat(this string value) + { + if (StringComparer.OrdinalIgnoreCase.Equals(value, "url")) return GeneratedImageFormat.Uri; + if (StringComparer.OrdinalIgnoreCase.Equals(value, "b64_json")) return GeneratedImageFormat.Bytes; + throw new ArgumentOutOfRangeException(nameof(value), value, "Unknown GeneratedImageFormat value."); + } + } +} diff --git a/.dotnet/src/Generated/Models/GeneratedImageQuality.Serialization.cs b/.dotnet/src/Generated/Models/GeneratedImageQuality.Serialization.cs new file mode 100644 index 000000000..96d00a50c --- /dev/null +++ b/.dotnet/src/Generated/Models/GeneratedImageQuality.Serialization.cs @@ -0,0 +1,25 @@ +// + +#nullable disable + +using System; + +namespace OpenAI.Images +{ + internal static partial class GeneratedImageQualityExtensions + { + public static string ToSerialString(this GeneratedImageQuality value) => value switch + { + GeneratedImageQuality.Standard => "standard", + GeneratedImageQuality.High => "hd", + _ => throw new ArgumentOutOfRangeException(nameof(value), value, "Unknown GeneratedImageQuality value.") + }; + + public static GeneratedImageQuality ToGeneratedImageQuality(this string value) + { + if (StringComparer.OrdinalIgnoreCase.Equals(value, "standard")) return GeneratedImageQuality.Standard; + if (StringComparer.OrdinalIgnoreCase.Equals(value, "hd")) return GeneratedImageQuality.High; + throw new ArgumentOutOfRangeException(nameof(value), value, "Unknown GeneratedImageQuality value."); + } + } +} diff --git a/.dotnet/src/Generated/Models/GeneratedImageSize.cs b/.dotnet/src/Generated/Models/GeneratedImageSize.cs new file mode 100644 index 000000000..8a16802ab --- /dev/null +++ b/.dotnet/src/Generated/Models/GeneratedImageSize.cs @@ -0,0 +1,28 @@ +// + +#nullable disable + +using System; +using 
System.ComponentModel; + +namespace OpenAI.Images +{ + public readonly partial struct GeneratedImageSize : IEquatable<GeneratedImageSize> + { + private readonly string _value; + + public GeneratedImageSize(string value) + { + _value = value ?? throw new ArgumentNullException(nameof(value)); + } + + private const string W256xH256Value = "256x256"; + private const string W512xH512Value = "512x512"; + private const string W1024xH1024Value = "1024x1024"; + private const string W1024xH1792Value = "1024x1792"; + private const string W1792xH1024Value = "1792x1024"; + public static bool operator ==(GeneratedImageSize left, GeneratedImageSize right) => left.Equals(right); + public static bool operator !=(GeneratedImageSize left, GeneratedImageSize right) => !left.Equals(right); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is GeneratedImageSize other && Equals(other); + public bool Equals(GeneratedImageSize other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/GeneratedImageStyle.Serialization.cs b/.dotnet/src/Generated/Models/GeneratedImageStyle.Serialization.cs new file mode 100644 index 000000000..ebb7cbd7f --- /dev/null +++ b/.dotnet/src/Generated/Models/GeneratedImageStyle.Serialization.cs @@ -0,0 +1,25 @@ +// + +#nullable disable + +using System; + +namespace OpenAI.Images +{ + internal static partial class GeneratedImageStyleExtensions + { + public static string ToSerialString(this GeneratedImageStyle value) => value switch + { + GeneratedImageStyle.Vivid => "vivid", + GeneratedImageStyle.Natural => "natural", + _ => throw new ArgumentOutOfRangeException(nameof(value), value, "Unknown GeneratedImageStyle value.") + }; + + public static GeneratedImageStyle ToGeneratedImageStyle(this string value) + { + if (StringComparer.OrdinalIgnoreCase.Equals(value, "vivid")) return
GeneratedImageStyle.Vivid; + if (StringComparer.OrdinalIgnoreCase.Equals(value, "natural")) return GeneratedImageStyle.Natural; + throw new ArgumentOutOfRangeException(nameof(value), value, "Unknown GeneratedImageStyle value."); + } + } +} diff --git a/.dotnet/src/Generated/Models/GeneratedSpeechFormat.Serialization.cs b/.dotnet/src/Generated/Models/GeneratedSpeechFormat.Serialization.cs new file mode 100644 index 000000000..ff1d45f85 --- /dev/null +++ b/.dotnet/src/Generated/Models/GeneratedSpeechFormat.Serialization.cs @@ -0,0 +1,33 @@ +// + +#nullable disable + +using System; + +namespace OpenAI.Audio +{ + internal static partial class GeneratedSpeechFormatExtensions + { + public static string ToSerialString(this GeneratedSpeechFormat value) => value switch + { + GeneratedSpeechFormat.Mp3 => "mp3", + GeneratedSpeechFormat.Opus => "opus", + GeneratedSpeechFormat.Aac => "aac", + GeneratedSpeechFormat.Flac => "flac", + GeneratedSpeechFormat.Wav => "wav", + GeneratedSpeechFormat.Pcm => "pcm", + _ => throw new ArgumentOutOfRangeException(nameof(value), value, "Unknown GeneratedSpeechFormat value.") + }; + + public static GeneratedSpeechFormat ToGeneratedSpeechFormat(this string value) + { + if (StringComparer.OrdinalIgnoreCase.Equals(value, "mp3")) return GeneratedSpeechFormat.Mp3; + if (StringComparer.OrdinalIgnoreCase.Equals(value, "opus")) return GeneratedSpeechFormat.Opus; + if (StringComparer.OrdinalIgnoreCase.Equals(value, "aac")) return GeneratedSpeechFormat.Aac; + if (StringComparer.OrdinalIgnoreCase.Equals(value, "flac")) return GeneratedSpeechFormat.Flac; + if (StringComparer.OrdinalIgnoreCase.Equals(value, "wav")) return GeneratedSpeechFormat.Wav; + if (StringComparer.OrdinalIgnoreCase.Equals(value, "pcm")) return GeneratedSpeechFormat.Pcm; + throw new ArgumentOutOfRangeException(nameof(value), value, "Unknown GeneratedSpeechFormat value."); + } + } +} diff --git a/.dotnet/src/Generated/Models/GeneratedSpeechVoice.Serialization.cs 
b/.dotnet/src/Generated/Models/GeneratedSpeechVoice.Serialization.cs new file mode 100644 index 000000000..b28f7c0d4 --- /dev/null +++ b/.dotnet/src/Generated/Models/GeneratedSpeechVoice.Serialization.cs @@ -0,0 +1,33 @@ +// + +#nullable disable + +using System; + +namespace OpenAI.Audio +{ + internal static partial class GeneratedSpeechVoiceExtensions + { + public static string ToSerialString(this GeneratedSpeechVoice value) => value switch + { + GeneratedSpeechVoice.Alloy => "alloy", + GeneratedSpeechVoice.Echo => "echo", + GeneratedSpeechVoice.Fable => "fable", + GeneratedSpeechVoice.Onyx => "onyx", + GeneratedSpeechVoice.Nova => "nova", + GeneratedSpeechVoice.Shimmer => "shimmer", + _ => throw new ArgumentOutOfRangeException(nameof(value), value, "Unknown GeneratedSpeechVoice value.") + }; + + public static GeneratedSpeechVoice ToGeneratedSpeechVoice(this string value) + { + if (StringComparer.OrdinalIgnoreCase.Equals(value, "alloy")) return GeneratedSpeechVoice.Alloy; + if (StringComparer.OrdinalIgnoreCase.Equals(value, "echo")) return GeneratedSpeechVoice.Echo; + if (StringComparer.OrdinalIgnoreCase.Equals(value, "fable")) return GeneratedSpeechVoice.Fable; + if (StringComparer.OrdinalIgnoreCase.Equals(value, "onyx")) return GeneratedSpeechVoice.Onyx; + if (StringComparer.OrdinalIgnoreCase.Equals(value, "nova")) return GeneratedSpeechVoice.Nova; + if (StringComparer.OrdinalIgnoreCase.Equals(value, "shimmer")) return GeneratedSpeechVoice.Shimmer; + throw new ArgumentOutOfRangeException(nameof(value), value, "Unknown GeneratedSpeechVoice value."); + } + } +} diff --git a/.dotnet/src/Generated/Models/ImageChatMessageContentPartDetail.cs b/.dotnet/src/Generated/Models/ImageChatMessageContentPartDetail.cs new file mode 100644 index 000000000..137c9a819 --- /dev/null +++ b/.dotnet/src/Generated/Models/ImageChatMessageContentPartDetail.cs @@ -0,0 +1,38 @@ +// + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Chat +{ + public 
readonly partial struct ImageChatMessageContentPartDetail : IEquatable<ImageChatMessageContentPartDetail> + { + private readonly string _value; + + public ImageChatMessageContentPartDetail(string value) + { + _value = value ?? throw new ArgumentNullException(nameof(value)); + } + + private const string AutoValue = "auto"; + private const string LowValue = "low"; + private const string HighValue = "high"; + + public static ImageChatMessageContentPartDetail Auto { get; } = new ImageChatMessageContentPartDetail(AutoValue); + public static ImageChatMessageContentPartDetail Low { get; } = new ImageChatMessageContentPartDetail(LowValue); + public static ImageChatMessageContentPartDetail High { get; } = new ImageChatMessageContentPartDetail(HighValue); + public static bool operator ==(ImageChatMessageContentPartDetail left, ImageChatMessageContentPartDetail right) => left.Equals(right); + public static bool operator !=(ImageChatMessageContentPartDetail left, ImageChatMessageContentPartDetail right) => !left.Equals(right); + public static implicit operator ImageChatMessageContentPartDetail(string value) => new ImageChatMessageContentPartDetail(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is ImageChatMessageContentPartDetail other && Equals(other); + public bool Equals(ImageChatMessageContentPartDetail other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ?
StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/ImageEditOptions.Serialization.cs b/.dotnet/src/Generated/Models/ImageEditOptions.Serialization.cs new file mode 100644 index 000000000..45e614018 --- /dev/null +++ b/.dotnet/src/Generated/Models/ImageEditOptions.Serialization.cs @@ -0,0 +1,347 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.IO; +using System.Text.Json; + +namespace OpenAI.Images +{ + public partial class ImageEditOptions : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ImageEditOptions)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("image") != true) + { + writer.WritePropertyName("image"u8); +#if NET6_0_OR_GREATER + writer.WriteRawValue(Image); +#else + using (JsonDocument document = JsonDocument.Parse(Image)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + if (SerializedAdditionalRawData?.ContainsKey("prompt") != true) + { + writer.WritePropertyName("prompt"u8); + writer.WriteStringValue(Prompt); + } + if (SerializedAdditionalRawData?.ContainsKey("mask") != true && Optional.IsDefined(Mask)) + { + writer.WritePropertyName("mask"u8); +#if NET6_0_OR_GREATER + writer.WriteRawValue(Mask); +#else + using (JsonDocument document = JsonDocument.Parse(Mask)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + if (SerializedAdditionalRawData?.ContainsKey("model") != true && Optional.IsDefined(Model)) + { + if (Model != null) + { + 
writer.WritePropertyName("model"u8); + writer.WriteStringValue(Model.Value.ToString()); + } + else + { + writer.WriteNull("model"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("n") != true && Optional.IsDefined(N)) + { + if (N != null) + { + writer.WritePropertyName("n"u8); + writer.WriteNumberValue(N.Value); + } + else + { + writer.WriteNull("n"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("size") != true && Optional.IsDefined(Size)) + { + if (Size != null) + { + writer.WritePropertyName("size"u8); + writer.WriteStringValue(Size.Value.ToString()); + } + else + { + writer.WriteNull("size"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("response_format") != true && Optional.IsDefined(ResponseFormat)) + { + if (ResponseFormat != null) + { + writer.WritePropertyName("response_format"u8); + writer.WriteStringValue(ResponseFormat.Value.ToSerialString()); + } + else + { + writer.WriteNull("response_format"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("user") != true && Optional.IsDefined(EndUserId)) + { + writer.WritePropertyName("user"u8); + writer.WriteStringValue(EndUserId); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + ImageEditOptions IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ImageEditOptions)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeImageEditOptions(document.RootElement, options); + } + + internal static ImageEditOptions DeserializeImageEditOptions(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + BinaryData image = default; + string prompt = default; + BinaryData mask = default; + InternalCreateImageEditRequestModel? model = default; + long? n = default; + GeneratedImageSize? size = default; + GeneratedImageFormat? responseFormat = default; + string user = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("image"u8)) + { + image = BinaryData.FromString(property.Value.GetRawText()); + continue; + } + if (property.NameEquals("prompt"u8)) + { + prompt = property.Value.GetString(); + continue; + } + if (property.NameEquals("mask"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + mask = BinaryData.FromString(property.Value.GetRawText()); + continue; + } + if (property.NameEquals("model"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + model = null; + continue; + } + model = new InternalCreateImageEditRequestModel(property.Value.GetString()); + continue; + } + if (property.NameEquals("n"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + n = null; + continue; + } + n = property.Value.GetInt64(); + continue; + } + if (property.NameEquals("size"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + size = null; + continue; + } + 
size = new GeneratedImageSize(property.Value.GetString()); + continue; + } + if (property.NameEquals("response_format"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + responseFormat = null; + continue; + } + responseFormat = property.Value.GetString().ToGeneratedImageFormat(); + continue; + } + if (property.NameEquals("user"u8)) + { + user = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new ImageEditOptions( + image, + prompt, + mask, + model, + n, + size, + responseFormat, + user, + serializedAdditionalRawData); + } + + private BinaryData SerializeMultipart(ModelReaderWriterOptions options) + { + using MultipartFormDataBinaryContent content = ToMultipartBinaryBody(); + using MemoryStream stream = new MemoryStream(); + content.WriteTo(stream); + if (stream.Position > int.MaxValue) + { + return BinaryData.FromStream(stream); + } + else + { + return new BinaryData(stream.GetBuffer().AsMemory(0, (int)stream.Position)); + } + } + + internal virtual MultipartFormDataBinaryContent ToMultipartBinaryBody() + { + MultipartFormDataBinaryContent content = new MultipartFormDataBinaryContent(); + content.Add(Image, "image", "image"); + content.Add(Prompt, "prompt"); + if (Optional.IsDefined(Mask)) + { + content.Add(Mask, "mask", "mask"); + } + if (Optional.IsDefined(Model)) + { + if (Model != null) + { + content.Add(Model.Value.ToString(), "model"); + } + } + if (Optional.IsDefined(N)) + { + if (N != null) + { + content.Add(N.Value, "n"); + } + } + if (Optional.IsDefined(Size)) + { + if (Size != null) + { + content.Add(Size.Value.ToString(), "size"); + } + } + if (Optional.IsDefined(ResponseFormat)) + { + if (ResponseFormat != null) + { + content.Add(ResponseFormat.Value.ToSerialString(), "response_format"); + } + } + if 
(Optional.IsDefined(EndUserId)) + { + content.Add(EndUserId, "user"); + } + return content; + } + + BinaryData IPersistableModel<ImageEditOptions>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<ImageEditOptions>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + case "MFD": + return SerializeMultipart(options); + default: + throw new FormatException($"The model {nameof(ImageEditOptions)} does not support writing '{options.Format}' format."); + } + } + + ImageEditOptions IPersistableModel<ImageEditOptions>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<ImageEditOptions>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeImageEditOptions(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(ImageEditOptions)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<ImageEditOptions>.GetFormatFromOptions(ModelReaderWriterOptions options) => "MFD"; + + internal static ImageEditOptions FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeImageEditOptions(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/ImageEditOptions.cs b/.dotnet/src/Generated/Models/ImageEditOptions.cs new file mode 100644 index 000000000..c46d7879c --- /dev/null +++ b/.dotnet/src/Generated/Models/ImageEditOptions.cs @@ -0,0 +1,27 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Images +{ + public partial class ImageEditOptions + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; }
+ + internal ImageEditOptions(BinaryData image, string prompt, BinaryData mask, InternalCreateImageEditRequestModel? model, long? n, GeneratedImageSize? size, GeneratedImageFormat? responseFormat, string endUserId, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + Image = image; + Prompt = prompt; + Mask = mask; + Model = model; + N = n; + Size = size; + ResponseFormat = responseFormat; + EndUserId = endUserId; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + } +} diff --git a/.dotnet/src/Generated/Models/ImageGenerationOptions.Serialization.cs b/.dotnet/src/Generated/Models/ImageGenerationOptions.Serialization.cs new file mode 100644 index 000000000..a007fd04a --- /dev/null +++ b/.dotnet/src/Generated/Models/ImageGenerationOptions.Serialization.cs @@ -0,0 +1,283 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Images +{ + public partial class ImageGenerationOptions : IJsonModel<ImageGenerationOptions> + { + void IJsonModel<ImageGenerationOptions>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ?
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ImageGenerationOptions)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("prompt") != true) + { + writer.WritePropertyName("prompt"u8); + writer.WriteStringValue(Prompt); + } + if (SerializedAdditionalRawData?.ContainsKey("model") != true && Optional.IsDefined(Model)) + { + if (Model != null) + { + writer.WritePropertyName("model"u8); + writer.WriteStringValue(Model.Value.ToString()); + } + else + { + writer.WriteNull("model"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("n") != true && Optional.IsDefined(N)) + { + if (N != null) + { + writer.WritePropertyName("n"u8); + writer.WriteNumberValue(N.Value); + } + else + { + writer.WriteNull("n"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("quality") != true && Optional.IsDefined(Quality)) + { + writer.WritePropertyName("quality"u8); + writer.WriteStringValue(Quality.Value.ToSerialString()); + } + if (SerializedAdditionalRawData?.ContainsKey("response_format") != true && Optional.IsDefined(ResponseFormat)) + { + if (ResponseFormat != null) + { + writer.WritePropertyName("response_format"u8); + writer.WriteStringValue(ResponseFormat.Value.ToSerialString()); + } + else + { + writer.WriteNull("response_format"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("size") != true && Optional.IsDefined(Size)) + { + if (Size != null) + { + writer.WritePropertyName("size"u8); + writer.WriteStringValue(Size.Value.ToString()); + } + else + { + writer.WriteNull("size"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("style") != true && Optional.IsDefined(Style)) + { + if (Style != null) + { + writer.WritePropertyName("style"u8); + writer.WriteStringValue(Style.Value.ToSerialString()); + } + else + { + writer.WriteNull("style"); + } + } + if 
(SerializedAdditionalRawData?.ContainsKey("user") != true && Optional.IsDefined(EndUserId)) + { + writer.WritePropertyName("user"u8); + writer.WriteStringValue(EndUserId); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + ImageGenerationOptions IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ImageGenerationOptions)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeImageGenerationOptions(document.RootElement, options); + } + + internal static ImageGenerationOptions DeserializeImageGenerationOptions(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string prompt = default; + InternalCreateImageRequestModel? model = default; + long? n = default; + GeneratedImageQuality? quality = default; + GeneratedImageFormat? responseFormat = default; + GeneratedImageSize? size = default; + GeneratedImageStyle? 
style = default; + string user = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("prompt"u8)) + { + prompt = property.Value.GetString(); + continue; + } + if (property.NameEquals("model"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + model = null; + continue; + } + model = new InternalCreateImageRequestModel(property.Value.GetString()); + continue; + } + if (property.NameEquals("n"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + n = null; + continue; + } + n = property.Value.GetInt64(); + continue; + } + if (property.NameEquals("quality"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + quality = property.Value.GetString().ToGeneratedImageQuality(); + continue; + } + if (property.NameEquals("response_format"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + responseFormat = null; + continue; + } + responseFormat = property.Value.GetString().ToGeneratedImageFormat(); + continue; + } + if (property.NameEquals("size"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + size = null; + continue; + } + size = new GeneratedImageSize(property.Value.GetString()); + continue; + } + if (property.NameEquals("style"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + style = null; + continue; + } + style = property.Value.GetString().ToGeneratedImageStyle(); + continue; + } + if (property.NameEquals("user"u8)) + { + user = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new ImageGenerationOptions( + prompt, + model, + n, + quality, + responseFormat, + size, + style, + user, + 
serializedAdditionalRawData); + } + + BinaryData IPersistableModel<ImageGenerationOptions>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<ImageGenerationOptions>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(ImageGenerationOptions)} does not support writing '{options.Format}' format."); + } + } + + ImageGenerationOptions IPersistableModel<ImageGenerationOptions>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<ImageGenerationOptions>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeImageGenerationOptions(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(ImageGenerationOptions)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<ImageGenerationOptions>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static ImageGenerationOptions FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeImageGenerationOptions(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/ImageGenerationOptions.cs b/.dotnet/src/Generated/Models/ImageGenerationOptions.cs new file mode 100644 index 000000000..e7d9311dc --- /dev/null +++ b/.dotnet/src/Generated/Models/ImageGenerationOptions.cs @@ -0,0 +1,31 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Images +{ + public partial class ImageGenerationOptions + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + + internal ImageGenerationOptions(string prompt,
InternalCreateImageRequestModel? model, long? n, GeneratedImageQuality? quality, GeneratedImageFormat? responseFormat, GeneratedImageSize? size, GeneratedImageStyle? style, string endUserId, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + Prompt = prompt; + Model = model; + N = n; + Quality = quality; + ResponseFormat = responseFormat; + Size = size; + Style = style; + EndUserId = endUserId; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + public GeneratedImageQuality? Quality { get; set; } + public GeneratedImageFormat? ResponseFormat { get; set; } + public GeneratedImageSize? Size { get; set; } + public GeneratedImageStyle? Style { get; set; } + } +} diff --git a/.dotnet/src/Generated/Models/ImageVariationOptions.Serialization.cs b/.dotnet/src/Generated/Models/ImageVariationOptions.Serialization.cs new file mode 100644 index 000000000..aea32c3d6 --- /dev/null +++ b/.dotnet/src/Generated/Models/ImageVariationOptions.Serialization.cs @@ -0,0 +1,307 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.IO; +using System.Text.Json; + +namespace OpenAI.Images +{ + public partial class ImageVariationOptions : IJsonModel<ImageVariationOptions> + { + void IJsonModel<ImageVariationOptions>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ?
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ImageVariationOptions)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("image") != true) + { + writer.WritePropertyName("image"u8); +#if NET6_0_OR_GREATER + writer.WriteRawValue(Image); +#else + using (JsonDocument document = JsonDocument.Parse(Image)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + if (SerializedAdditionalRawData?.ContainsKey("model") != true && Optional.IsDefined(Model)) + { + if (Model != null) + { + writer.WritePropertyName("model"u8); + writer.WriteStringValue(Model.Value.ToString()); + } + else + { + writer.WriteNull("model"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("n") != true && Optional.IsDefined(N)) + { + if (N != null) + { + writer.WritePropertyName("n"u8); + writer.WriteNumberValue(N.Value); + } + else + { + writer.WriteNull("n"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("response_format") != true && Optional.IsDefined(ResponseFormat)) + { + if (ResponseFormat != null) + { + writer.WritePropertyName("response_format"u8); + writer.WriteStringValue(ResponseFormat.Value.ToSerialString()); + } + else + { + writer.WriteNull("response_format"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("size") != true && Optional.IsDefined(Size)) + { + if (Size != null) + { + writer.WritePropertyName("size"u8); + writer.WriteStringValue(Size.Value.ToString()); + } + else + { + writer.WriteNull("size"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("user") != true && Optional.IsDefined(EndUserId)) + { + writer.WritePropertyName("user"u8); + writer.WriteStringValue(EndUserId); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + 
continue;
+                    }
+                    writer.WritePropertyName(item.Key);
+#if NET6_0_OR_GREATER
+                    writer.WriteRawValue(item.Value);
+#else
+                    using (JsonDocument document = JsonDocument.Parse(item.Value))
+                    {
+                        JsonSerializer.Serialize(writer, document.RootElement);
+                    }
+#endif
+                }
+            }
+            writer.WriteEndObject();
+        }
+
+        ImageVariationOptions IJsonModel<ImageVariationOptions>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<ImageVariationOptions>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(ImageVariationOptions)} does not support reading '{format}' format.");
+            }
+
+            using JsonDocument document = JsonDocument.ParseValue(ref reader);
+            return DeserializeImageVariationOptions(document.RootElement, options);
+        }
+
+        internal static ImageVariationOptions DeserializeImageVariationOptions(JsonElement element, ModelReaderWriterOptions options = null)
+        {
+            options ??= ModelSerializationExtensions.WireOptions;
+
+            if (element.ValueKind == JsonValueKind.Null)
+            {
+                return null;
+            }
+            BinaryData image = default;
+            InternalCreateImageVariationRequestModel? model = default;
+            long? n = default;
+            GeneratedImageFormat? responseFormat = default;
+            GeneratedImageSize?
size = default;
+            string user = default;
+            IDictionary<string, BinaryData> serializedAdditionalRawData = default;
+            Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>();
+            foreach (var property in element.EnumerateObject())
+            {
+                if (property.NameEquals("image"u8))
+                {
+                    image = BinaryData.FromString(property.Value.GetRawText());
+                    continue;
+                }
+                if (property.NameEquals("model"u8))
+                {
+                    if (property.Value.ValueKind == JsonValueKind.Null)
+                    {
+                        model = null;
+                        continue;
+                    }
+                    model = new InternalCreateImageVariationRequestModel(property.Value.GetString());
+                    continue;
+                }
+                if (property.NameEquals("n"u8))
+                {
+                    if (property.Value.ValueKind == JsonValueKind.Null)
+                    {
+                        n = null;
+                        continue;
+                    }
+                    n = property.Value.GetInt64();
+                    continue;
+                }
+                if (property.NameEquals("response_format"u8))
+                {
+                    if (property.Value.ValueKind == JsonValueKind.Null)
+                    {
+                        responseFormat = null;
+                        continue;
+                    }
+                    responseFormat = property.Value.GetString().ToGeneratedImageFormat();
+                    continue;
+                }
+                if (property.NameEquals("size"u8))
+                {
+                    if (property.Value.ValueKind == JsonValueKind.Null)
+                    {
+                        size = null;
+                        continue;
+                    }
+                    size = new GeneratedImageSize(property.Value.GetString());
+                    continue;
+                }
+                if (property.NameEquals("user"u8))
+                {
+                    user = property.Value.GetString();
+                    continue;
+                }
+                if (true)
+                {
+                    rawDataDictionary ??= new Dictionary<string, BinaryData>();
+                    rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText()));
+                }
+            }
+            serializedAdditionalRawData = rawDataDictionary;
+            return new ImageVariationOptions(
+                image,
+                model,
+                n,
+                responseFormat,
+                size,
+                user,
+                serializedAdditionalRawData);
+        }
+
+        private BinaryData SerializeMultipart(ModelReaderWriterOptions options)
+        {
+            using MultipartFormDataBinaryContent content = ToMultipartBinaryBody();
+            using MemoryStream stream = new MemoryStream();
+            content.WriteTo(stream);
+            if (stream.Position > int.MaxValue)
+            {
+                return BinaryData.FromStream(stream);
+            }
+            else
+            {
+                return new BinaryData(stream.GetBuffer().AsMemory(0,
(int)stream.Position));
+            }
+        }
+
+        internal virtual MultipartFormDataBinaryContent ToMultipartBinaryBody()
+        {
+            MultipartFormDataBinaryContent content = new MultipartFormDataBinaryContent();
+            content.Add(Image, "image", "image");
+            if (Optional.IsDefined(Model))
+            {
+                if (Model != null)
+                {
+                    content.Add(Model.Value.ToString(), "model");
+                }
+            }
+            if (Optional.IsDefined(N))
+            {
+                if (N != null)
+                {
+                    content.Add(N.Value, "n");
+                }
+            }
+            if (Optional.IsDefined(ResponseFormat))
+            {
+                if (ResponseFormat != null)
+                {
+                    content.Add(ResponseFormat.Value.ToSerialString(), "response_format");
+                }
+            }
+            if (Optional.IsDefined(Size))
+            {
+                if (Size != null)
+                {
+                    content.Add(Size.Value.ToString(), "size");
+                }
+            }
+            if (Optional.IsDefined(EndUserId))
+            {
+                content.Add(EndUserId, "user");
+            }
+            return content;
+        }
+
+        BinaryData IPersistableModel<ImageVariationOptions>.Write(ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<ImageVariationOptions>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    return ModelReaderWriter.Write(this, options);
+                case "MFD":
+                    return SerializeMultipart(options);
+                default:
+                    throw new FormatException($"The model {nameof(ImageVariationOptions)} does not support writing '{options.Format}' format.");
+            }
+        }
+
+        ImageVariationOptions IPersistableModel<ImageVariationOptions>.Create(BinaryData data, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ?
((IPersistableModel<ImageVariationOptions>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    {
+                        using JsonDocument document = JsonDocument.Parse(data);
+                        return DeserializeImageVariationOptions(document.RootElement, options);
+                    }
+                default:
+                    throw new FormatException($"The model {nameof(ImageVariationOptions)} does not support reading '{options.Format}' format.");
+            }
+        }
+
+        string IPersistableModel<ImageVariationOptions>.GetFormatFromOptions(ModelReaderWriterOptions options) => "MFD";
+
+        internal static ImageVariationOptions FromResponse(PipelineResponse response)
+        {
+            using var document = JsonDocument.Parse(response.Content);
+            return DeserializeImageVariationOptions(document.RootElement);
+        }
+
+        internal virtual BinaryContent ToBinaryContent()
+        {
+            return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions);
+        }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/ImageVariationOptions.cs b/.dotnet/src/Generated/Models/ImageVariationOptions.cs
new file mode 100644
index 000000000..6fac2f976
--- /dev/null
+++ b/.dotnet/src/Generated/Models/ImageVariationOptions.cs
@@ -0,0 +1,25 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.Collections.Generic;
+
+namespace OpenAI.Images
+{
+    public partial class ImageVariationOptions
+    {
+        internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; }
+
+        internal ImageVariationOptions(BinaryData image, InternalCreateImageVariationRequestModel? model, long? n, GeneratedImageFormat? responseFormat, GeneratedImageSize?
size, string endUserId, IDictionary<string, BinaryData> serializedAdditionalRawData)
+        {
+            Image = image;
+            Model = model;
+            N = n;
+            ResponseFormat = responseFormat;
+            Size = size;
+            EndUserId = endUserId;
+            SerializedAdditionalRawData = serializedAdditionalRawData;
+        }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalAddUploadPartRequest.Serialization.cs b/.dotnet/src/Generated/Models/InternalAddUploadPartRequest.Serialization.cs
new file mode 100644
index 000000000..f76b44afb
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalAddUploadPartRequest.Serialization.cs
@@ -0,0 +1,165 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.IO;
+using System.Text.Json;
+
+namespace OpenAI.Files
+{
+    internal partial class InternalAddUploadPartRequest : IJsonModel<InternalAddUploadPartRequest>
+    {
+        void IJsonModel<InternalAddUploadPartRequest>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalAddUploadPartRequest>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(InternalAddUploadPartRequest)} does not support writing '{format}' format.");
+            }
+
+            writer.WriteStartObject();
+            if (SerializedAdditionalRawData?.ContainsKey("data") != true)
+            {
+                writer.WritePropertyName("data"u8);
+#if NET6_0_OR_GREATER
+                writer.WriteRawValue(global::System.BinaryData.FromStream(Data));
+#else
+                using (JsonDocument document = JsonDocument.Parse(BinaryData.FromStream(Data)))
+                {
+                    JsonSerializer.Serialize(writer, document.RootElement);
+                }
+#endif
+            }
+            if (SerializedAdditionalRawData != null)
+            {
+                foreach (var item in SerializedAdditionalRawData)
+                {
+                    if (ModelSerializationExtensions.IsSentinelValue(item.Value))
+                    {
+                        continue;
+                    }
+                    writer.WritePropertyName(item.Key);
+#if NET6_0_OR_GREATER
+                    writer.WriteRawValue(item.Value);
+#else
+                    using (JsonDocument document = JsonDocument.Parse(item.Value))
+                    {
JsonSerializer.Serialize(writer, document.RootElement);
+                    }
+#endif
+                }
+            }
+            writer.WriteEndObject();
+        }
+
+        InternalAddUploadPartRequest IJsonModel<InternalAddUploadPartRequest>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalAddUploadPartRequest>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(InternalAddUploadPartRequest)} does not support reading '{format}' format.");
+            }
+
+            using JsonDocument document = JsonDocument.ParseValue(ref reader);
+            return DeserializeInternalAddUploadPartRequest(document.RootElement, options);
+        }
+
+        internal static InternalAddUploadPartRequest DeserializeInternalAddUploadPartRequest(JsonElement element, ModelReaderWriterOptions options = null)
+        {
+            options ??= ModelSerializationExtensions.WireOptions;
+
+            if (element.ValueKind == JsonValueKind.Null)
+            {
+                return null;
+            }
+            Stream data = default;
+            IDictionary<string, BinaryData> serializedAdditionalRawData = default;
+            Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>();
+            foreach (var property in element.EnumerateObject())
+            {
+                if (property.NameEquals("data"u8))
+                {
+                    data = BinaryData.FromString(property.Value.GetRawText()).ToStream();
+                    continue;
+                }
+                if (true)
+                {
+                    rawDataDictionary ??= new Dictionary<string, BinaryData>();
+                    rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText()));
+                }
+            }
+            serializedAdditionalRawData = rawDataDictionary;
+            return new InternalAddUploadPartRequest(data, serializedAdditionalRawData);
+        }
+
+        private BinaryData SerializeMultipart(ModelReaderWriterOptions options)
+        {
+            using MultipartFormDataBinaryContent content = ToMultipartBinaryBody();
+            using MemoryStream stream = new MemoryStream();
+            content.WriteTo(stream);
+            if (stream.Position > int.MaxValue)
+            {
+                return BinaryData.FromStream(stream);
+            }
+            else
+            {
+                return new BinaryData(stream.GetBuffer().AsMemory(0, (int)stream.Position));
+            }
+        }
+
+        internal virtual
MultipartFormDataBinaryContent ToMultipartBinaryBody()
+        {
+            MultipartFormDataBinaryContent content = new MultipartFormDataBinaryContent();
+            content.Add(Data, "data", "data", "application/octet-stream");
+            return content;
+        }
+
+        BinaryData IPersistableModel<InternalAddUploadPartRequest>.Write(ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalAddUploadPartRequest>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    return ModelReaderWriter.Write(this, options);
+                case "MFD":
+                    return SerializeMultipart(options);
+                default:
+                    throw new FormatException($"The model {nameof(InternalAddUploadPartRequest)} does not support writing '{options.Format}' format.");
+            }
+        }
+
+        InternalAddUploadPartRequest IPersistableModel<InternalAddUploadPartRequest>.Create(BinaryData data, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalAddUploadPartRequest>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    {
+                        using JsonDocument document = JsonDocument.Parse(data);
+                        return DeserializeInternalAddUploadPartRequest(document.RootElement, options);
+                    }
+                default:
+                    throw new FormatException($"The model {nameof(InternalAddUploadPartRequest)} does not support reading '{options.Format}' format.");
+            }
+        }
+
+        string IPersistableModel<InternalAddUploadPartRequest>.GetFormatFromOptions(ModelReaderWriterOptions options) => "MFD";
+
+        internal static InternalAddUploadPartRequest FromResponse(PipelineResponse response)
+        {
+            using var document = JsonDocument.Parse(response.Content);
+            return DeserializeInternalAddUploadPartRequest(document.RootElement);
+        }
+
+        internal virtual BinaryContent ToBinaryContent()
+        {
+            return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions);
+        }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalAddUploadPartRequest.cs b/.dotnet/src/Generated/Models/InternalAddUploadPartRequest.cs
new file mode 100644
index 000000000..f6035f34b
--- /dev/null
+++
b/.dotnet/src/Generated/Models/InternalAddUploadPartRequest.cs
@@ -0,0 +1,33 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.Collections.Generic;
+using System.IO;
+
+namespace OpenAI.Files
+{
+    internal partial class InternalAddUploadPartRequest
+    {
+        internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; }
+        public InternalAddUploadPartRequest(Stream data)
+        {
+            Argument.AssertNotNull(data, nameof(data));
+
+            Data = data;
+        }
+
+        internal InternalAddUploadPartRequest(Stream data, IDictionary<string, BinaryData> serializedAdditionalRawData)
+        {
+            Data = data;
+            SerializedAdditionalRawData = serializedAdditionalRawData;
+        }
+
+        internal InternalAddUploadPartRequest()
+        {
+        }
+
+        public Stream Data { get; }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalAssistantObjectObject.cs b/.dotnet/src/Generated/Models/InternalAssistantObjectObject.cs
new file mode 100644
index 000000000..bb627c2bc
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalAssistantObjectObject.cs
@@ -0,0 +1,34 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ComponentModel;
+
+namespace OpenAI.Assistants
+{
+    internal readonly partial struct InternalAssistantObjectObject : IEquatable<InternalAssistantObjectObject>
+    {
+        private readonly string _value;
+
+        public InternalAssistantObjectObject(string value)
+        {
+            _value = value ??
throw new ArgumentNullException(nameof(value));
+        }
+
+        private const string AssistantValue = "assistant";
+
+        public static InternalAssistantObjectObject Assistant { get; } = new InternalAssistantObjectObject(AssistantValue);
+        public static bool operator ==(InternalAssistantObjectObject left, InternalAssistantObjectObject right) => left.Equals(right);
+        public static bool operator !=(InternalAssistantObjectObject left, InternalAssistantObjectObject right) => !left.Equals(right);
+        public static implicit operator InternalAssistantObjectObject(string value) => new InternalAssistantObjectObject(value);
+
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override bool Equals(object obj) => obj is InternalAssistantObjectObject other && Equals(other);
+        public bool Equals(InternalAssistantObjectObject other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase);
+
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0;
+        public override string ToString() => _value;
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalAssistantResponseFormatJsonObject.Serialization.cs b/.dotnet/src/Generated/Models/InternalAssistantResponseFormatJsonObject.Serialization.cs
new file mode 100644
index 000000000..80afbf5f6
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalAssistantResponseFormatJsonObject.Serialization.cs
@@ -0,0 +1,122 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace OpenAI.Assistants
+{
+    internal partial class InternalAssistantResponseFormatJsonObject : IJsonModel<InternalAssistantResponseFormatJsonObject>
+    {
+        void IJsonModel<InternalAssistantResponseFormatJsonObject>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ?
((IPersistableModel<InternalAssistantResponseFormatJsonObject>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(InternalAssistantResponseFormatJsonObject)} does not support writing '{format}' format.");
+            }
+
+            writer.WriteStartObject();
+            if (SerializedAdditionalRawData?.ContainsKey("type") != true)
+            {
+                writer.WritePropertyName("type"u8);
+                writer.WriteStringValue(Type);
+            }
+            if (SerializedAdditionalRawData != null)
+            {
+                foreach (var item in SerializedAdditionalRawData)
+                {
+                    if (ModelSerializationExtensions.IsSentinelValue(item.Value))
+                    {
+                        continue;
+                    }
+                    writer.WritePropertyName(item.Key);
+#if NET6_0_OR_GREATER
+                    writer.WriteRawValue(item.Value);
+#else
+                    using (JsonDocument document = JsonDocument.Parse(item.Value))
+                    {
+                        JsonSerializer.Serialize(writer, document.RootElement);
+                    }
+#endif
+                }
+            }
+            writer.WriteEndObject();
+        }
+
+        InternalAssistantResponseFormatJsonObject IJsonModel<InternalAssistantResponseFormatJsonObject>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ?
((IPersistableModel<InternalAssistantResponseFormatJsonObject>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(InternalAssistantResponseFormatJsonObject)} does not support reading '{format}' format.");
+            }
+
+            using JsonDocument document = JsonDocument.ParseValue(ref reader);
+            return DeserializeInternalAssistantResponseFormatJsonObject(document.RootElement, options);
+        }
+
+        internal static InternalAssistantResponseFormatJsonObject DeserializeInternalAssistantResponseFormatJsonObject(JsonElement element, ModelReaderWriterOptions options = null)
+        {
+            options ??= ModelSerializationExtensions.WireOptions;
+
+            if (element.ValueKind == JsonValueKind.Null)
+            {
+                return null;
+            }
+            string type = default;
+            IDictionary<string, BinaryData> serializedAdditionalRawData = default;
+            Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>();
+            foreach (var property in element.EnumerateObject())
+            {
+                if (property.NameEquals("type"u8))
+                {
+                    type = property.Value.GetString();
+                    continue;
+                }
+                if (true)
+                {
+                    rawDataDictionary ??= new Dictionary<string, BinaryData>();
+                    rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText()));
+                }
+            }
+            serializedAdditionalRawData = rawDataDictionary;
+            return new InternalAssistantResponseFormatJsonObject(type, serializedAdditionalRawData);
+        }
+
+        BinaryData IPersistableModel<InternalAssistantResponseFormatJsonObject>.Write(ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalAssistantResponseFormatJsonObject>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    return ModelReaderWriter.Write(this, options);
+                default:
+                    throw new FormatException($"The model {nameof(InternalAssistantResponseFormatJsonObject)} does not support writing '{options.Format}' format.");
+            }
+        }
+
+        InternalAssistantResponseFormatJsonObject IPersistableModel<InternalAssistantResponseFormatJsonObject>.Create(BinaryData data, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ?
((IPersistableModel<InternalAssistantResponseFormatJsonObject>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    {
+                        using JsonDocument document = JsonDocument.Parse(data);
+                        return DeserializeInternalAssistantResponseFormatJsonObject(document.RootElement, options);
+                    }
+                default:
+                    throw new FormatException($"The model {nameof(InternalAssistantResponseFormatJsonObject)} does not support reading '{options.Format}' format.");
+            }
+        }
+
+        string IPersistableModel<InternalAssistantResponseFormatJsonObject>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J";
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalAssistantResponseFormatJsonObject.cs b/.dotnet/src/Generated/Models/InternalAssistantResponseFormatJsonObject.cs
new file mode 100644
index 000000000..b6caf3d33
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalAssistantResponseFormatJsonObject.cs
@@ -0,0 +1,21 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.Collections.Generic;
+
+namespace OpenAI.Assistants
+{
+    internal partial class InternalAssistantResponseFormatJsonObject : AssistantResponseFormat
+    {
+        internal InternalAssistantResponseFormatJsonObject()
+        {
+            Type = "json_object";
+        }
+
+        internal InternalAssistantResponseFormatJsonObject(string type, IDictionary<string, BinaryData> serializedAdditionalRawData) : base(type, serializedAdditionalRawData)
+        {
+        }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalAssistantResponseFormatJsonSchema.Serialization.cs b/.dotnet/src/Generated/Models/InternalAssistantResponseFormatJsonSchema.Serialization.cs
new file mode 100644
index 000000000..24865dbf9
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalAssistantResponseFormatJsonSchema.Serialization.cs
@@ -0,0 +1,134 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+using OpenAI.Internal;
+
+namespace OpenAI.Assistants
+{
+    internal partial class InternalAssistantResponseFormatJsonSchema : IJsonModel<InternalAssistantResponseFormatJsonSchema>
+    {
+        void
IJsonModel<InternalAssistantResponseFormatJsonSchema>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalAssistantResponseFormatJsonSchema>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(InternalAssistantResponseFormatJsonSchema)} does not support writing '{format}' format.");
+            }
+
+            writer.WriteStartObject();
+            if (SerializedAdditionalRawData?.ContainsKey("json_schema") != true)
+            {
+                writer.WritePropertyName("json_schema"u8);
+                writer.WriteObjectValue(JsonSchema, options);
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("type") != true)
+            {
+                writer.WritePropertyName("type"u8);
+                writer.WriteStringValue(Type);
+            }
+            if (SerializedAdditionalRawData != null)
+            {
+                foreach (var item in SerializedAdditionalRawData)
+                {
+                    if (ModelSerializationExtensions.IsSentinelValue(item.Value))
+                    {
+                        continue;
+                    }
+                    writer.WritePropertyName(item.Key);
+#if NET6_0_OR_GREATER
+                    writer.WriteRawValue(item.Value);
+#else
+                    using (JsonDocument document = JsonDocument.Parse(item.Value))
+                    {
+                        JsonSerializer.Serialize(writer, document.RootElement);
+                    }
+#endif
+                }
+            }
+            writer.WriteEndObject();
+        }
+
+        InternalAssistantResponseFormatJsonSchema IJsonModel<InternalAssistantResponseFormatJsonSchema>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ?
((IPersistableModel<InternalAssistantResponseFormatJsonSchema>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(InternalAssistantResponseFormatJsonSchema)} does not support reading '{format}' format.");
+            }
+
+            using JsonDocument document = JsonDocument.ParseValue(ref reader);
+            return DeserializeInternalAssistantResponseFormatJsonSchema(document.RootElement, options);
+        }
+
+        internal static InternalAssistantResponseFormatJsonSchema DeserializeInternalAssistantResponseFormatJsonSchema(JsonElement element, ModelReaderWriterOptions options = null)
+        {
+            options ??= ModelSerializationExtensions.WireOptions;
+
+            if (element.ValueKind == JsonValueKind.Null)
+            {
+                return null;
+            }
+            InternalResponseFormatJsonSchemaJsonSchema jsonSchema = default;
+            string type = default;
+            IDictionary<string, BinaryData> serializedAdditionalRawData = default;
+            Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>();
+            foreach (var property in element.EnumerateObject())
+            {
+                if (property.NameEquals("json_schema"u8))
+                {
+                    jsonSchema = InternalResponseFormatJsonSchemaJsonSchema.DeserializeInternalResponseFormatJsonSchemaJsonSchema(property.Value, options);
+                    continue;
+                }
+                if (property.NameEquals("type"u8))
+                {
+                    type = property.Value.GetString();
+                    continue;
+                }
+                if (true)
+                {
+                    rawDataDictionary ??= new Dictionary<string, BinaryData>();
+                    rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText()));
+                }
+            }
+            serializedAdditionalRawData = rawDataDictionary;
+            return new InternalAssistantResponseFormatJsonSchema(type, serializedAdditionalRawData, jsonSchema);
+        }
+
+        BinaryData IPersistableModel<InternalAssistantResponseFormatJsonSchema>.Write(ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ?
((IPersistableModel<InternalAssistantResponseFormatJsonSchema>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    return ModelReaderWriter.Write(this, options);
+                default:
+                    throw new FormatException($"The model {nameof(InternalAssistantResponseFormatJsonSchema)} does not support writing '{options.Format}' format.");
+            }
+        }
+
+        InternalAssistantResponseFormatJsonSchema IPersistableModel<InternalAssistantResponseFormatJsonSchema>.Create(BinaryData data, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalAssistantResponseFormatJsonSchema>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    {
+                        using JsonDocument document = JsonDocument.Parse(data);
+                        return DeserializeInternalAssistantResponseFormatJsonSchema(document.RootElement, options);
+                    }
+                default:
+                    throw new FormatException($"The model {nameof(InternalAssistantResponseFormatJsonSchema)} does not support reading '{options.Format}' format.");
+            }
+        }
+
+        string IPersistableModel<InternalAssistantResponseFormatJsonSchema>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J";
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalAssistantResponseFormatJsonSchema.cs b/.dotnet/src/Generated/Models/InternalAssistantResponseFormatJsonSchema.cs
new file mode 100644
index 000000000..d2b71a861
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalAssistantResponseFormatJsonSchema.cs
@@ -0,0 +1,32 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.Collections.Generic;
+using OpenAI.Internal;
+
+namespace OpenAI.Assistants
+{
+    internal partial class InternalAssistantResponseFormatJsonSchema : AssistantResponseFormat
+    {
+        internal InternalAssistantResponseFormatJsonSchema(InternalResponseFormatJsonSchemaJsonSchema jsonSchema)
+        {
+            Argument.AssertNotNull(jsonSchema, nameof(jsonSchema));
+
+            Type = "json_schema";
+            JsonSchema = jsonSchema;
+        }
+
+        internal InternalAssistantResponseFormatJsonSchema(string type, IDictionary<string, BinaryData> serializedAdditionalRawData, InternalResponseFormatJsonSchemaJsonSchema jsonSchema) : base(type,
serializedAdditionalRawData)
+        {
+            JsonSchema = jsonSchema;
+        }
+
+        internal InternalAssistantResponseFormatJsonSchema()
+        {
+        }
+
+        public InternalResponseFormatJsonSchemaJsonSchema JsonSchema { get; }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalAssistantResponseFormatText.Serialization.cs b/.dotnet/src/Generated/Models/InternalAssistantResponseFormatText.Serialization.cs
new file mode 100644
index 000000000..a2ae06eea
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalAssistantResponseFormatText.Serialization.cs
@@ -0,0 +1,122 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace OpenAI.Assistants
+{
+    internal partial class InternalAssistantResponseFormatText : IJsonModel<InternalAssistantResponseFormatText>
+    {
+        void IJsonModel<InternalAssistantResponseFormatText>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalAssistantResponseFormatText>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(InternalAssistantResponseFormatText)} does not support writing '{format}' format.");
+            }
+
+            writer.WriteStartObject();
+            if (SerializedAdditionalRawData?.ContainsKey("type") != true)
+            {
+                writer.WritePropertyName("type"u8);
+                writer.WriteStringValue(Type);
+            }
+            if (SerializedAdditionalRawData != null)
+            {
+                foreach (var item in SerializedAdditionalRawData)
+                {
+                    if (ModelSerializationExtensions.IsSentinelValue(item.Value))
+                    {
+                        continue;
+                    }
+                    writer.WritePropertyName(item.Key);
+#if NET6_0_OR_GREATER
+                    writer.WriteRawValue(item.Value);
+#else
+                    using (JsonDocument document = JsonDocument.Parse(item.Value))
+                    {
+                        JsonSerializer.Serialize(writer, document.RootElement);
+                    }
+#endif
+                }
+            }
+            writer.WriteEndObject();
+        }
+
+        InternalAssistantResponseFormatText IJsonModel<InternalAssistantResponseFormatText>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W"
? ((IPersistableModel<InternalAssistantResponseFormatText>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(InternalAssistantResponseFormatText)} does not support reading '{format}' format.");
+            }
+
+            using JsonDocument document = JsonDocument.ParseValue(ref reader);
+            return DeserializeInternalAssistantResponseFormatText(document.RootElement, options);
+        }
+
+        internal static InternalAssistantResponseFormatText DeserializeInternalAssistantResponseFormatText(JsonElement element, ModelReaderWriterOptions options = null)
+        {
+            options ??= ModelSerializationExtensions.WireOptions;
+
+            if (element.ValueKind == JsonValueKind.Null)
+            {
+                return null;
+            }
+            string type = default;
+            IDictionary<string, BinaryData> serializedAdditionalRawData = default;
+            Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>();
+            foreach (var property in element.EnumerateObject())
+            {
+                if (property.NameEquals("type"u8))
+                {
+                    type = property.Value.GetString();
+                    continue;
+                }
+                if (true)
+                {
+                    rawDataDictionary ??= new Dictionary<string, BinaryData>();
+                    rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText()));
+                }
+            }
+            serializedAdditionalRawData = rawDataDictionary;
+            return new InternalAssistantResponseFormatText(type, serializedAdditionalRawData);
+        }
+
+        BinaryData IPersistableModel<InternalAssistantResponseFormatText>.Write(ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalAssistantResponseFormatText>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    return ModelReaderWriter.Write(this, options);
+                default:
+                    throw new FormatException($"The model {nameof(InternalAssistantResponseFormatText)} does not support writing '{options.Format}' format.");
+            }
+        }
+
+        InternalAssistantResponseFormatText IPersistableModel<InternalAssistantResponseFormatText>.Create(BinaryData data, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ?
((IPersistableModel<InternalAssistantResponseFormatText>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    {
+                        using JsonDocument document = JsonDocument.Parse(data);
+                        return DeserializeInternalAssistantResponseFormatText(document.RootElement, options);
+                    }
+                default:
+                    throw new FormatException($"The model {nameof(InternalAssistantResponseFormatText)} does not support reading '{options.Format}' format.");
+            }
+        }
+
+        string IPersistableModel<InternalAssistantResponseFormatText>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J";
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalAssistantResponseFormatText.cs b/.dotnet/src/Generated/Models/InternalAssistantResponseFormatText.cs
new file mode 100644
index 000000000..899b65998
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalAssistantResponseFormatText.cs
@@ -0,0 +1,21 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.Collections.Generic;
+
+namespace OpenAI.Assistants
+{
+    internal partial class InternalAssistantResponseFormatText : AssistantResponseFormat
+    {
+        internal InternalAssistantResponseFormatText()
+        {
+            Type = "text";
+        }
+
+        internal InternalAssistantResponseFormatText(string type, IDictionary<string, BinaryData> serializedAdditionalRawData) : base(type, serializedAdditionalRawData)
+        {
+        }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalAssistantToolsFileSearchFileSearch.Serialization.cs b/.dotnet/src/Generated/Models/InternalAssistantToolsFileSearchFileSearch.Serialization.cs
new file mode 100644
index 000000000..321a00758
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalAssistantToolsFileSearchFileSearch.Serialization.cs
@@ -0,0 +1,137 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace OpenAI.Assistants
+{
+    internal partial class InternalAssistantToolsFileSearchFileSearch : IJsonModel<InternalAssistantToolsFileSearchFileSearch>
+    {
+        void IJsonModel<InternalAssistantToolsFileSearchFileSearch>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
{
+            var format = options.Format == "W" ? ((IPersistableModel<InternalAssistantToolsFileSearchFileSearch>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(InternalAssistantToolsFileSearchFileSearch)} does not support writing '{format}' format.");
+            }
+
+            writer.WriteStartObject();
+            if (SerializedAdditionalRawData?.ContainsKey("max_num_results") != true && Optional.IsDefined(InternalMaxNumResults))
+            {
+                writer.WritePropertyName("max_num_results"u8);
+                writer.WriteNumberValue(InternalMaxNumResults.Value);
+            }
+            if (SerializedAdditionalRawData != null)
+            {
+                foreach (var item in SerializedAdditionalRawData)
+                {
+                    if (ModelSerializationExtensions.IsSentinelValue(item.Value))
+                    {
+                        continue;
+                    }
+                    writer.WritePropertyName(item.Key);
+#if NET6_0_OR_GREATER
+                    writer.WriteRawValue(item.Value);
+#else
+                    using (JsonDocument document = JsonDocument.Parse(item.Value))
+                    {
+                        JsonSerializer.Serialize(writer, document.RootElement);
+                    }
+#endif
+                }
+            }
+            writer.WriteEndObject();
+        }
+
+        InternalAssistantToolsFileSearchFileSearch IJsonModel<InternalAssistantToolsFileSearchFileSearch>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalAssistantToolsFileSearchFileSearch>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(InternalAssistantToolsFileSearchFileSearch)} does not support reading '{format}' format.");
+            }
+
+            using JsonDocument document = JsonDocument.ParseValue(ref reader);
+            return DeserializeInternalAssistantToolsFileSearchFileSearch(document.RootElement, options);
+        }
+
+        internal static InternalAssistantToolsFileSearchFileSearch DeserializeInternalAssistantToolsFileSearchFileSearch(JsonElement element, ModelReaderWriterOptions options = null)
+        {
+            options ??= ModelSerializationExtensions.WireOptions;
+
+            if (element.ValueKind == JsonValueKind.Null)
+            {
+                return null;
+            }
+            int?
maxNumResults = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("max_num_results"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + maxNumResults = property.Value.GetInt32(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalAssistantToolsFileSearchFileSearch(maxNumResults, serializedAdditionalRawData); + } + + BinaryData IPersistableModel<InternalAssistantToolsFileSearchFileSearch>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalAssistantToolsFileSearchFileSearch>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalAssistantToolsFileSearchFileSearch)} does not support writing '{options.Format}' format."); + } + } + + InternalAssistantToolsFileSearchFileSearch IPersistableModel<InternalAssistantToolsFileSearchFileSearch>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ?
((IPersistableModel<InternalAssistantToolsFileSearchFileSearch>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalAssistantToolsFileSearchFileSearch(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalAssistantToolsFileSearchFileSearch)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalAssistantToolsFileSearchFileSearch>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalAssistantToolsFileSearchFileSearch FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalAssistantToolsFileSearchFileSearch(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalAssistantToolsFileSearchFileSearch.cs b/.dotnet/src/Generated/Models/InternalAssistantToolsFileSearchFileSearch.cs new file mode 100644 index 000000000..b65ac169e --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalAssistantToolsFileSearchFileSearch.cs @@ -0,0 +1,23 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + internal partial class InternalAssistantToolsFileSearchFileSearch + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + public InternalAssistantToolsFileSearchFileSearch() + { + } + + internal InternalAssistantToolsFileSearchFileSearch(int?
internalMaxNumResults, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + InternalMaxNumResults = internalMaxNumResults; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalAssistantToolsFileSearchTypeOnly.Serialization.cs b/.dotnet/src/Generated/Models/InternalAssistantToolsFileSearchTypeOnly.Serialization.cs new file mode 100644 index 000000000..ef306bf2a --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalAssistantToolsFileSearchTypeOnly.Serialization.cs @@ -0,0 +1,133 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class InternalAssistantToolsFileSearchTypeOnly : IJsonModel<InternalAssistantToolsFileSearchTypeOnly> + { + void IJsonModel<InternalAssistantToolsFileSearchTypeOnly>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalAssistantToolsFileSearchTypeOnly>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalAssistantToolsFileSearchTypeOnly)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type.ToString()); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalAssistantToolsFileSearchTypeOnly IJsonModel<InternalAssistantToolsFileSearchTypeOnly>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format =
options.Format == "W" ? ((IPersistableModel<InternalAssistantToolsFileSearchTypeOnly>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalAssistantToolsFileSearchTypeOnly)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalAssistantToolsFileSearchTypeOnly(document.RootElement, options); + } + + internal static InternalAssistantToolsFileSearchTypeOnly DeserializeInternalAssistantToolsFileSearchTypeOnly(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + InternalAssistantToolsFileSearchTypeOnlyType type = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("type"u8)) + { + type = new InternalAssistantToolsFileSearchTypeOnlyType(property.Value.GetString()); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalAssistantToolsFileSearchTypeOnly(type, serializedAdditionalRawData); + } + + BinaryData IPersistableModel<InternalAssistantToolsFileSearchTypeOnly>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ?
((IPersistableModel<InternalAssistantToolsFileSearchTypeOnly>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalAssistantToolsFileSearchTypeOnly)} does not support writing '{options.Format}' format."); + } + } + + InternalAssistantToolsFileSearchTypeOnly IPersistableModel<InternalAssistantToolsFileSearchTypeOnly>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalAssistantToolsFileSearchTypeOnly>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalAssistantToolsFileSearchTypeOnly(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalAssistantToolsFileSearchTypeOnly)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalAssistantToolsFileSearchTypeOnly>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalAssistantToolsFileSearchTypeOnly FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalAssistantToolsFileSearchTypeOnly(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalAssistantToolsFileSearchTypeOnly.cs b/.dotnet/src/Generated/Models/InternalAssistantToolsFileSearchTypeOnly.cs new file mode 100644 index 000000000..607f7fb44 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalAssistantToolsFileSearchTypeOnly.cs @@ -0,0 +1,25 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + internal partial class InternalAssistantToolsFileSearchTypeOnly + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + public
InternalAssistantToolsFileSearchTypeOnly() + { + } + + internal InternalAssistantToolsFileSearchTypeOnly(InternalAssistantToolsFileSearchTypeOnlyType type, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + Type = type; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + public InternalAssistantToolsFileSearchTypeOnlyType Type { get; } = InternalAssistantToolsFileSearchTypeOnlyType.FileSearch; + } +} diff --git a/.dotnet/src/Generated/Models/InternalAssistantToolsFileSearchTypeOnlyType.cs b/.dotnet/src/Generated/Models/InternalAssistantToolsFileSearchTypeOnlyType.cs new file mode 100644 index 000000000..c1378a9ae --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalAssistantToolsFileSearchTypeOnlyType.cs @@ -0,0 +1,34 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Assistants +{ + internal readonly partial struct InternalAssistantToolsFileSearchTypeOnlyType : IEquatable<InternalAssistantToolsFileSearchTypeOnlyType> + { + private readonly string _value; + + public InternalAssistantToolsFileSearchTypeOnlyType(string value) + { + _value = value ??
throw new ArgumentNullException(nameof(value)); + } + + private const string FileSearchValue = "file_search"; + + public static InternalAssistantToolsFileSearchTypeOnlyType FileSearch { get; } = new InternalAssistantToolsFileSearchTypeOnlyType(FileSearchValue); + public static bool operator ==(InternalAssistantToolsFileSearchTypeOnlyType left, InternalAssistantToolsFileSearchTypeOnlyType right) => left.Equals(right); + public static bool operator !=(InternalAssistantToolsFileSearchTypeOnlyType left, InternalAssistantToolsFileSearchTypeOnlyType right) => !left.Equals(right); + public static implicit operator InternalAssistantToolsFileSearchTypeOnlyType(string value) => new InternalAssistantToolsFileSearchTypeOnlyType(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalAssistantToolsFileSearchTypeOnlyType other && Equals(other); + public bool Equals(InternalAssistantToolsFileSearchTypeOnlyType other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? 
StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalAssistantsNamedToolChoiceFunction.Serialization.cs b/.dotnet/src/Generated/Models/InternalAssistantsNamedToolChoiceFunction.Serialization.cs new file mode 100644 index 000000000..0ef83db21 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalAssistantsNamedToolChoiceFunction.Serialization.cs @@ -0,0 +1,133 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class InternalAssistantsNamedToolChoiceFunction : IJsonModel<InternalAssistantsNamedToolChoiceFunction> + { + void IJsonModel<InternalAssistantsNamedToolChoiceFunction>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalAssistantsNamedToolChoiceFunction>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalAssistantsNamedToolChoiceFunction)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("name") != true) + { + writer.WritePropertyName("name"u8); + writer.WriteStringValue(Name); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalAssistantsNamedToolChoiceFunction IJsonModel<InternalAssistantsNamedToolChoiceFunction>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ?
((IPersistableModel<InternalAssistantsNamedToolChoiceFunction>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalAssistantsNamedToolChoiceFunction)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalAssistantsNamedToolChoiceFunction(document.RootElement, options); + } + + internal static InternalAssistantsNamedToolChoiceFunction DeserializeInternalAssistantsNamedToolChoiceFunction(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string name = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("name"u8)) + { + name = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalAssistantsNamedToolChoiceFunction(name, serializedAdditionalRawData); + } + + BinaryData IPersistableModel<InternalAssistantsNamedToolChoiceFunction>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalAssistantsNamedToolChoiceFunction>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalAssistantsNamedToolChoiceFunction)} does not support writing '{options.Format}' format."); + } + } + + InternalAssistantsNamedToolChoiceFunction IPersistableModel<InternalAssistantsNamedToolChoiceFunction>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ?
((IPersistableModel<InternalAssistantsNamedToolChoiceFunction>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalAssistantsNamedToolChoiceFunction(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalAssistantsNamedToolChoiceFunction)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalAssistantsNamedToolChoiceFunction>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalAssistantsNamedToolChoiceFunction FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalAssistantsNamedToolChoiceFunction(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalAssistantsNamedToolChoiceFunction.cs b/.dotnet/src/Generated/Models/InternalAssistantsNamedToolChoiceFunction.cs new file mode 100644 index 000000000..88e7c6d17 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalAssistantsNamedToolChoiceFunction.cs @@ -0,0 +1,32 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + internal partial class InternalAssistantsNamedToolChoiceFunction + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + public InternalAssistantsNamedToolChoiceFunction(string name) + { + Argument.AssertNotNull(name, nameof(name)); + + Name = name; + } + + internal InternalAssistantsNamedToolChoiceFunction(string name, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + Name = name; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal InternalAssistantsNamedToolChoiceFunction() + { + } + + public string Name { get; set; } + } +} diff --git
a/.dotnet/src/Generated/Models/InternalAssistantsNamedToolChoiceType.cs b/.dotnet/src/Generated/Models/InternalAssistantsNamedToolChoiceType.cs new file mode 100644 index 000000000..a71bdc9a0 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalAssistantsNamedToolChoiceType.cs @@ -0,0 +1,38 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Assistants +{ + internal readonly partial struct InternalAssistantsNamedToolChoiceType : IEquatable<InternalAssistantsNamedToolChoiceType> + { + private readonly string _value; + + public InternalAssistantsNamedToolChoiceType(string value) + { + _value = value ?? throw new ArgumentNullException(nameof(value)); + } + + private const string FunctionValue = "function"; + private const string CodeInterpreterValue = "code_interpreter"; + private const string FileSearchValue = "file_search"; + + public static InternalAssistantsNamedToolChoiceType Function { get; } = new InternalAssistantsNamedToolChoiceType(FunctionValue); + public static InternalAssistantsNamedToolChoiceType CodeInterpreter { get; } = new InternalAssistantsNamedToolChoiceType(CodeInterpreterValue); + public static InternalAssistantsNamedToolChoiceType FileSearch { get; } = new InternalAssistantsNamedToolChoiceType(FileSearchValue); + public static bool operator ==(InternalAssistantsNamedToolChoiceType left, InternalAssistantsNamedToolChoiceType right) => left.Equals(right); + public static bool operator !=(InternalAssistantsNamedToolChoiceType left, InternalAssistantsNamedToolChoiceType right) => !left.Equals(right); + public static implicit operator InternalAssistantsNamedToolChoiceType(string value) => new InternalAssistantsNamedToolChoiceType(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalAssistantsNamedToolChoiceType other && Equals(other); + public bool Equals(InternalAssistantsNamedToolChoiceType other) => string.Equals(_value, other._value,
StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalAutoChunkingStrategy.Serialization.cs b/.dotnet/src/Generated/Models/InternalAutoChunkingStrategy.Serialization.cs new file mode 100644 index 000000000..3f1028c9f --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalAutoChunkingStrategy.Serialization.cs @@ -0,0 +1,133 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.VectorStores +{ + internal partial class InternalAutoChunkingStrategy : IJsonModel<InternalAutoChunkingStrategy> + { + void IJsonModel<InternalAutoChunkingStrategy>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalAutoChunkingStrategy>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalAutoChunkingStrategy)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalAutoChunkingStrategy IJsonModel<InternalAutoChunkingStrategy>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format =
options.Format == "W" ? ((IPersistableModel<InternalAutoChunkingStrategy>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalAutoChunkingStrategy)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalAutoChunkingStrategy(document.RootElement, options); + } + + internal static InternalAutoChunkingStrategy DeserializeInternalAutoChunkingStrategy(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string type = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalAutoChunkingStrategy(type, serializedAdditionalRawData); + } + + BinaryData IPersistableModel<InternalAutoChunkingStrategy>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalAutoChunkingStrategy>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalAutoChunkingStrategy)} does not support writing '{options.Format}' format."); + } + } + + InternalAutoChunkingStrategy IPersistableModel<InternalAutoChunkingStrategy>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ?
((IPersistableModel<InternalAutoChunkingStrategy>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalAutoChunkingStrategy(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalAutoChunkingStrategy)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalAutoChunkingStrategy>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static new InternalAutoChunkingStrategy FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalAutoChunkingStrategy(document.RootElement); + } + + internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalAutoChunkingStrategy.cs b/.dotnet/src/Generated/Models/InternalAutoChunkingStrategy.cs new file mode 100644 index 000000000..09a132829 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalAutoChunkingStrategy.cs @@ -0,0 +1,21 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.VectorStores +{ + internal partial class InternalAutoChunkingStrategy : FileChunkingStrategy + { + public InternalAutoChunkingStrategy() + { + Type = "auto"; + } + + internal InternalAutoChunkingStrategy(string type, IDictionary<string, BinaryData> serializedAdditionalRawData) : base(type, serializedAdditionalRawData) + { + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalAutoChunkingStrategyRequestParam.Serialization.cs b/.dotnet/src/Generated/Models/InternalAutoChunkingStrategyRequestParam.Serialization.cs new file mode 100644 index 000000000..ba2aafdaa --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalAutoChunkingStrategyRequestParam.Serialization.cs @@ -0,0 +1,133 @@ +// <auto-generated/> + +#nullable disable + +using System; +using
System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.VectorStores +{ + internal partial class InternalAutoChunkingStrategyRequestParam : IJsonModel<InternalAutoChunkingStrategyRequestParam> + { + void IJsonModel<InternalAutoChunkingStrategyRequestParam>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalAutoChunkingStrategyRequestParam>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalAutoChunkingStrategyRequestParam)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalAutoChunkingStrategyRequestParam IJsonModel<InternalAutoChunkingStrategyRequestParam>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ?
((IPersistableModel<InternalAutoChunkingStrategyRequestParam>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalAutoChunkingStrategyRequestParam)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalAutoChunkingStrategyRequestParam(document.RootElement, options); + } + + internal static InternalAutoChunkingStrategyRequestParam DeserializeInternalAutoChunkingStrategyRequestParam(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string type = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalAutoChunkingStrategyRequestParam(type, serializedAdditionalRawData); + } + + BinaryData IPersistableModel<InternalAutoChunkingStrategyRequestParam>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalAutoChunkingStrategyRequestParam>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalAutoChunkingStrategyRequestParam)} does not support writing '{options.Format}' format."); + } + } + + InternalAutoChunkingStrategyRequestParam IPersistableModel<InternalAutoChunkingStrategyRequestParam>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ?
((IPersistableModel<InternalAutoChunkingStrategyRequestParam>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalAutoChunkingStrategyRequestParam(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalAutoChunkingStrategyRequestParam)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalAutoChunkingStrategyRequestParam>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static new InternalAutoChunkingStrategyRequestParam FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalAutoChunkingStrategyRequestParam(document.RootElement); + } + + internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalAutoChunkingStrategyRequestParam.cs b/.dotnet/src/Generated/Models/InternalAutoChunkingStrategyRequestParam.cs new file mode 100644 index 000000000..3b429d3e7 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalAutoChunkingStrategyRequestParam.cs @@ -0,0 +1,21 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.VectorStores +{ + internal partial class InternalAutoChunkingStrategyRequestParam : InternalFileChunkingStrategyRequestParam + { + public InternalAutoChunkingStrategyRequestParam() + { + Type = "auto"; + } + + internal InternalAutoChunkingStrategyRequestParam(string type, IDictionary<string, BinaryData> serializedAdditionalRawData) : base(type, serializedAdditionalRawData) + { + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalBatchCompletionTimeframe.cs b/.dotnet/src/Generated/Models/InternalBatchCompletionTimeframe.cs new file mode 100644 index 000000000..836a9c1e7 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalBatchCompletionTimeframe.cs @@
-0,0 +1,34 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Batch +{ + internal readonly partial struct InternalBatchCompletionTimeframe : IEquatable<InternalBatchCompletionTimeframe> + { + private readonly string _value; + + public InternalBatchCompletionTimeframe(string value) + { + _value = value ?? throw new ArgumentNullException(nameof(value)); + } + + private const string _24hValue = "24h"; + + public static InternalBatchCompletionTimeframe _24h { get; } = new InternalBatchCompletionTimeframe(_24hValue); + public static bool operator ==(InternalBatchCompletionTimeframe left, InternalBatchCompletionTimeframe right) => left.Equals(right); + public static bool operator !=(InternalBatchCompletionTimeframe left, InternalBatchCompletionTimeframe right) => !left.Equals(right); + public static implicit operator InternalBatchCompletionTimeframe(string value) => new InternalBatchCompletionTimeframe(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalBatchCompletionTimeframe other && Equals(other); + public bool Equals(InternalBatchCompletionTimeframe other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ?
StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalBatchError.Serialization.cs b/.dotnet/src/Generated/Models/InternalBatchError.Serialization.cs new file mode 100644 index 000000000..0e768f6aa --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalBatchError.Serialization.cs @@ -0,0 +1,190 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Batch +{ + internal partial class InternalBatchError : IJsonModel<InternalBatchError> + { + void IJsonModel<InternalBatchError>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalBatchError>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalBatchError)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("code") != true && Optional.IsDefined(Code)) + { + writer.WritePropertyName("code"u8); + writer.WriteStringValue(Code); + } + if (SerializedAdditionalRawData?.ContainsKey("message") != true && Optional.IsDefined(Message)) + { + writer.WritePropertyName("message"u8); + writer.WriteStringValue(Message); + } + if (SerializedAdditionalRawData?.ContainsKey("param") != true && Optional.IsDefined(Param)) + { + if (Param != null) + { + writer.WritePropertyName("param"u8); + writer.WriteStringValue(Param); + } + else + { + writer.WriteNull("param"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("line") != true && Optional.IsDefined(Line)) + { + if (Line != null) + { + writer.WritePropertyName("line"u8); + writer.WriteNumberValue(Line.Value); + } + else + { + writer.WriteNull("line"); + } + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) +
{ + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalBatchError IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalBatchError)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalBatchError(document.RootElement, options); + } + + internal static InternalBatchError DeserializeInternalBatchError(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string code = default; + string message = default; + string param = default; + int? 
line = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("code"u8)) + { + code = property.Value.GetString(); + continue; + } + if (property.NameEquals("message"u8)) + { + message = property.Value.GetString(); + continue; + } + if (property.NameEquals("param"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + param = null; + continue; + } + param = property.Value.GetString(); + continue; + } + if (property.NameEquals("line"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + line = null; + continue; + } + line = property.Value.GetInt32(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalBatchError(code, message, param, line, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalBatchError)} does not support writing '{options.Format}' format."); + } + } + + InternalBatchError IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalBatchError(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalBatchError)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalBatchError FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalBatchError(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalBatchError.cs b/.dotnet/src/Generated/Models/InternalBatchError.cs new file mode 100644 index 000000000..411b48c42 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalBatchError.cs @@ -0,0 +1,31 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Batch +{ + internal partial class InternalBatchError + { + internal IDictionary SerializedAdditionalRawData { get; set; } + internal InternalBatchError() + { + } + + internal InternalBatchError(string code, string message, string param, int? line, IDictionary serializedAdditionalRawData) + { + Code = code; + Message = message; + Param = param; + Line = line; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + public string Code { get; } + public string Message { get; } + public string Param { get; } + public int? 
Line { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalBatchErrors.Serialization.cs b/.dotnet/src/Generated/Models/InternalBatchErrors.Serialization.cs new file mode 100644 index 000000000..d75c78a08 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalBatchErrors.Serialization.cs @@ -0,0 +1,162 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Batch +{ + internal partial class InternalBatchErrors : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalBatchErrors)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("object") != true && Optional.IsDefined(Object)) + { + writer.WritePropertyName("object"u8); + writer.WriteStringValue(Object.Value.ToString()); + } + if (SerializedAdditionalRawData?.ContainsKey("data") != true && Optional.IsCollectionDefined(Data)) + { + writer.WritePropertyName("data"u8); + writer.WriteStartArray(); + foreach (var item in Data) + { + writer.WriteObjectValue(item, options); + } + writer.WriteEndArray(); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalBatchErrors IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions 
options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalBatchErrors)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalBatchErrors(document.RootElement, options); + } + + internal static InternalBatchErrors DeserializeInternalBatchErrors(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + InternalBatchErrorsObject? @object = default; + IReadOnlyList data = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("object"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + @object = new InternalBatchErrorsObject(property.Value.GetString()); + continue; + } + if (property.NameEquals("data"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + List array = new List(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(InternalBatchError.DeserializeInternalBatchError(item, options)); + } + data = array; + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalBatchErrors(@object, data ?? new ChangeTrackingList(), serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalBatchErrors)} does not support writing '{options.Format}' format."); + } + } + + InternalBatchErrors IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalBatchErrors(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalBatchErrors)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalBatchErrors FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalBatchErrors(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalBatchErrors.cs b/.dotnet/src/Generated/Models/InternalBatchErrors.cs new file mode 100644 index 000000000..ef20e9e07 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalBatchErrors.cs @@ -0,0 +1,28 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Batch +{ + internal partial class InternalBatchErrors + { + internal IDictionary SerializedAdditionalRawData { get; set; } + internal InternalBatchErrors() + { + Data = new ChangeTrackingList(); + } + + internal InternalBatchErrors(InternalBatchErrorsObject? 
@object, IReadOnlyList data, IDictionary serializedAdditionalRawData) + { + Object = @object; + Data = data; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + public InternalBatchErrorsObject? Object { get; } + public IReadOnlyList Data { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalBatchErrorsObject.cs b/.dotnet/src/Generated/Models/InternalBatchErrorsObject.cs new file mode 100644 index 000000000..0f98bd40d --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalBatchErrorsObject.cs @@ -0,0 +1,34 @@ +// + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Batch +{ + internal readonly partial struct InternalBatchErrorsObject : IEquatable + { + private readonly string _value; + + public InternalBatchErrorsObject(string value) + { + _value = value ?? throw new ArgumentNullException(nameof(value)); + } + + private const string ListValue = "list"; + + public static InternalBatchErrorsObject List { get; } = new InternalBatchErrorsObject(ListValue); + public static bool operator ==(InternalBatchErrorsObject left, InternalBatchErrorsObject right) => left.Equals(right); + public static bool operator !=(InternalBatchErrorsObject left, InternalBatchErrorsObject right) => !left.Equals(right); + public static implicit operator InternalBatchErrorsObject(string value) => new InternalBatchErrorsObject(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalBatchErrorsObject other && Equals(other); + public bool Equals(InternalBatchErrorsObject other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? 
StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalBatchJob.Serialization.cs b/.dotnet/src/Generated/Models/InternalBatchJob.Serialization.cs new file mode 100644 index 000000000..f0aa4f8f6 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalBatchJob.Serialization.cs @@ -0,0 +1,425 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Batch +{ + internal partial class InternalBatchJob : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalBatchJob)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("id") != true) + { + writer.WritePropertyName("id"u8); + writer.WriteStringValue(Id); + } + if (SerializedAdditionalRawData?.ContainsKey("object") != true) + { + writer.WritePropertyName("object"u8); + writer.WriteStringValue(Object.ToString()); + } + if (SerializedAdditionalRawData?.ContainsKey("endpoint") != true) + { + writer.WritePropertyName("endpoint"u8); + writer.WriteStringValue(Endpoint); + } + if (SerializedAdditionalRawData?.ContainsKey("errors") != true && Optional.IsDefined(Errors)) + { + writer.WritePropertyName("errors"u8); + writer.WriteObjectValue(Errors, options); + } + if (SerializedAdditionalRawData?.ContainsKey("input_file_id") != true) + { + writer.WritePropertyName("input_file_id"u8); + writer.WriteStringValue(InputFileId); + } + if (SerializedAdditionalRawData?.ContainsKey("completion_window") != true) + { + writer.WritePropertyName("completion_window"u8); + 
writer.WriteStringValue(CompletionWindow); + } + if (SerializedAdditionalRawData?.ContainsKey("status") != true) + { + writer.WritePropertyName("status"u8); + writer.WriteStringValue(Status.ToString()); + } + if (SerializedAdditionalRawData?.ContainsKey("output_file_id") != true && Optional.IsDefined(OutputFileId)) + { + writer.WritePropertyName("output_file_id"u8); + writer.WriteStringValue(OutputFileId); + } + if (SerializedAdditionalRawData?.ContainsKey("error_file_id") != true && Optional.IsDefined(ErrorFileId)) + { + writer.WritePropertyName("error_file_id"u8); + writer.WriteStringValue(ErrorFileId); + } + if (SerializedAdditionalRawData?.ContainsKey("created_at") != true) + { + writer.WritePropertyName("created_at"u8); + writer.WriteNumberValue(CreatedAt, "U"); + } + if (SerializedAdditionalRawData?.ContainsKey("in_progress_at") != true && Optional.IsDefined(InProgressAt)) + { + writer.WritePropertyName("in_progress_at"u8); + writer.WriteNumberValue(InProgressAt.Value, "U"); + } + if (SerializedAdditionalRawData?.ContainsKey("expires_at") != true && Optional.IsDefined(ExpiresAt)) + { + writer.WritePropertyName("expires_at"u8); + writer.WriteNumberValue(ExpiresAt.Value, "U"); + } + if (SerializedAdditionalRawData?.ContainsKey("finalizing_at") != true && Optional.IsDefined(FinalizingAt)) + { + writer.WritePropertyName("finalizing_at"u8); + writer.WriteNumberValue(FinalizingAt.Value, "U"); + } + if (SerializedAdditionalRawData?.ContainsKey("completed_at") != true && Optional.IsDefined(CompletedAt)) + { + writer.WritePropertyName("completed_at"u8); + writer.WriteNumberValue(CompletedAt.Value, "U"); + } + if (SerializedAdditionalRawData?.ContainsKey("failed_at") != true && Optional.IsDefined(FailedAt)) + { + writer.WritePropertyName("failed_at"u8); + writer.WriteNumberValue(FailedAt.Value, "U"); + } + if (SerializedAdditionalRawData?.ContainsKey("expired_at") != true && Optional.IsDefined(ExpiredAt)) + { + writer.WritePropertyName("expired_at"u8); + 
writer.WriteNumberValue(ExpiredAt.Value, "U"); + } + if (SerializedAdditionalRawData?.ContainsKey("cancelling_at") != true && Optional.IsDefined(CancellingAt)) + { + writer.WritePropertyName("cancelling_at"u8); + writer.WriteNumberValue(CancellingAt.Value, "U"); + } + if (SerializedAdditionalRawData?.ContainsKey("cancelled_at") != true && Optional.IsDefined(CancelledAt)) + { + writer.WritePropertyName("cancelled_at"u8); + writer.WriteNumberValue(CancelledAt.Value, "U"); + } + if (SerializedAdditionalRawData?.ContainsKey("request_counts") != true && Optional.IsDefined(RequestCounts)) + { + writer.WritePropertyName("request_counts"u8); + writer.WriteObjectValue(RequestCounts, options); + } + if (SerializedAdditionalRawData?.ContainsKey("metadata") != true && Optional.IsCollectionDefined(Metadata)) + { + if (Metadata != null) + { + writer.WritePropertyName("metadata"u8); + writer.WriteStartObject(); + foreach (var item in Metadata) + { + writer.WritePropertyName(item.Key); + writer.WriteStringValue(item.Value); + } + writer.WriteEndObject(); + } + else + { + writer.WriteNull("metadata"); + } + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalBatchJob IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalBatchJob)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalBatchJob(document.RootElement, options); + } + + internal static InternalBatchJob DeserializeInternalBatchJob(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string id = default; + InternalBatchObject @object = default; + string endpoint = default; + InternalBatchErrors errors = default; + string inputFileId = default; + string completionWindow = default; + InternalBatchStatus status = default; + string outputFileId = default; + string errorFileId = default; + DateTimeOffset createdAt = default; + DateTimeOffset? inProgressAt = default; + DateTimeOffset? expiresAt = default; + DateTimeOffset? finalizingAt = default; + DateTimeOffset? completedAt = default; + DateTimeOffset? failedAt = default; + DateTimeOffset? expiredAt = default; + DateTimeOffset? cancellingAt = default; + DateTimeOffset? 
cancelledAt = default; + InternalBatchRequestCounts requestCounts = default; + IReadOnlyDictionary metadata = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("id"u8)) + { + id = property.Value.GetString(); + continue; + } + if (property.NameEquals("object"u8)) + { + @object = new InternalBatchObject(property.Value.GetString()); + continue; + } + if (property.NameEquals("endpoint"u8)) + { + endpoint = property.Value.GetString(); + continue; + } + if (property.NameEquals("errors"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + errors = InternalBatchErrors.DeserializeInternalBatchErrors(property.Value, options); + continue; + } + if (property.NameEquals("input_file_id"u8)) + { + inputFileId = property.Value.GetString(); + continue; + } + if (property.NameEquals("completion_window"u8)) + { + completionWindow = property.Value.GetString(); + continue; + } + if (property.NameEquals("status"u8)) + { + status = new InternalBatchStatus(property.Value.GetString()); + continue; + } + if (property.NameEquals("output_file_id"u8)) + { + outputFileId = property.Value.GetString(); + continue; + } + if (property.NameEquals("error_file_id"u8)) + { + errorFileId = property.Value.GetString(); + continue; + } + if (property.NameEquals("created_at"u8)) + { + createdAt = DateTimeOffset.FromUnixTimeSeconds(property.Value.GetInt64()); + continue; + } + if (property.NameEquals("in_progress_at"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + inProgressAt = DateTimeOffset.FromUnixTimeSeconds(property.Value.GetInt64()); + continue; + } + if (property.NameEquals("expires_at"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + expiresAt = DateTimeOffset.FromUnixTimeSeconds(property.Value.GetInt64()); + continue; + } + if 
(property.NameEquals("finalizing_at"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + finalizingAt = DateTimeOffset.FromUnixTimeSeconds(property.Value.GetInt64()); + continue; + } + if (property.NameEquals("completed_at"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + completedAt = DateTimeOffset.FromUnixTimeSeconds(property.Value.GetInt64()); + continue; + } + if (property.NameEquals("failed_at"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + failedAt = DateTimeOffset.FromUnixTimeSeconds(property.Value.GetInt64()); + continue; + } + if (property.NameEquals("expired_at"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + expiredAt = DateTimeOffset.FromUnixTimeSeconds(property.Value.GetInt64()); + continue; + } + if (property.NameEquals("cancelling_at"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + cancellingAt = DateTimeOffset.FromUnixTimeSeconds(property.Value.GetInt64()); + continue; + } + if (property.NameEquals("cancelled_at"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + cancelledAt = DateTimeOffset.FromUnixTimeSeconds(property.Value.GetInt64()); + continue; + } + if (property.NameEquals("request_counts"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + requestCounts = InternalBatchRequestCounts.DeserializeInternalBatchRequestCounts(property.Value, options); + continue; + } + if (property.NameEquals("metadata"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + Dictionary dictionary = new Dictionary(); + foreach (var property0 in property.Value.EnumerateObject()) + { + dictionary.Add(property0.Name, property0.Value.GetString()); + } + metadata = dictionary; + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, 
BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalBatchJob( + id, + @object, + endpoint, + errors, + inputFileId, + completionWindow, + status, + outputFileId, + errorFileId, + createdAt, + inProgressAt, + expiresAt, + finalizingAt, + completedAt, + failedAt, + expiredAt, + cancellingAt, + cancelledAt, + requestCounts, + metadata ?? new ChangeTrackingDictionary(), + serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalBatchJob)} does not support writing '{options.Format}' format."); + } + } + + InternalBatchJob IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalBatchJob(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalBatchJob)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalBatchJob FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalBatchJob(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalBatchJob.cs b/.dotnet/src/Generated/Models/InternalBatchJob.cs new file mode 100644 index 000000000..bbd35aeac --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalBatchJob.cs @@ -0,0 +1,80 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Batch +{ + internal partial class InternalBatchJob + { + internal IDictionary SerializedAdditionalRawData { get; set; } + internal InternalBatchJob(string id, string endpoint, string inputFileId, string completionWindow, InternalBatchStatus status, DateTimeOffset createdAt) + { + Argument.AssertNotNull(id, nameof(id)); + Argument.AssertNotNull(endpoint, nameof(endpoint)); + Argument.AssertNotNull(inputFileId, nameof(inputFileId)); + Argument.AssertNotNull(completionWindow, nameof(completionWindow)); + + Id = id; + Endpoint = endpoint; + InputFileId = inputFileId; + CompletionWindow = completionWindow; + Status = status; + CreatedAt = createdAt; + Metadata = new ChangeTrackingDictionary(); + } + + internal InternalBatchJob(string id, InternalBatchObject @object, string endpoint, InternalBatchErrors errors, string 
inputFileId, string completionWindow, InternalBatchStatus status, string outputFileId, string errorFileId, DateTimeOffset createdAt, DateTimeOffset? inProgressAt, DateTimeOffset? expiresAt, DateTimeOffset? finalizingAt, DateTimeOffset? completedAt, DateTimeOffset? failedAt, DateTimeOffset? expiredAt, DateTimeOffset? cancellingAt, DateTimeOffset? cancelledAt, InternalBatchRequestCounts requestCounts, IReadOnlyDictionary metadata, IDictionary serializedAdditionalRawData) + { + Id = id; + Object = @object; + Endpoint = endpoint; + Errors = errors; + InputFileId = inputFileId; + CompletionWindow = completionWindow; + Status = status; + OutputFileId = outputFileId; + ErrorFileId = errorFileId; + CreatedAt = createdAt; + InProgressAt = inProgressAt; + ExpiresAt = expiresAt; + FinalizingAt = finalizingAt; + CompletedAt = completedAt; + FailedAt = failedAt; + ExpiredAt = expiredAt; + CancellingAt = cancellingAt; + CancelledAt = cancelledAt; + RequestCounts = requestCounts; + Metadata = metadata; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal InternalBatchJob() + { + } + + public string Id { get; } + public InternalBatchObject Object { get; } = InternalBatchObject.Batch; + + public string Endpoint { get; } + public InternalBatchErrors Errors { get; } + public string InputFileId { get; } + public string CompletionWindow { get; } + public InternalBatchStatus Status { get; } + public string OutputFileId { get; } + public string ErrorFileId { get; } + public DateTimeOffset CreatedAt { get; } + public DateTimeOffset? InProgressAt { get; } + public DateTimeOffset? ExpiresAt { get; } + public DateTimeOffset? FinalizingAt { get; } + public DateTimeOffset? CompletedAt { get; } + public DateTimeOffset? FailedAt { get; } + public DateTimeOffset? ExpiredAt { get; } + public DateTimeOffset? CancellingAt { get; } + public DateTimeOffset? 
CancelledAt { get; } + public InternalBatchRequestCounts RequestCounts { get; } + public IReadOnlyDictionary Metadata { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalBatchObject.cs b/.dotnet/src/Generated/Models/InternalBatchObject.cs new file mode 100644 index 000000000..96ff270ec --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalBatchObject.cs @@ -0,0 +1,34 @@ +// + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Batch +{ + internal readonly partial struct InternalBatchObject : IEquatable + { + private readonly string _value; + + public InternalBatchObject(string value) + { + _value = value ?? throw new ArgumentNullException(nameof(value)); + } + + private const string BatchValue = "batch"; + + public static InternalBatchObject Batch { get; } = new InternalBatchObject(BatchValue); + public static bool operator ==(InternalBatchObject left, InternalBatchObject right) => left.Equals(right); + public static bool operator !=(InternalBatchObject left, InternalBatchObject right) => !left.Equals(right); + public static implicit operator InternalBatchObject(string value) => new InternalBatchObject(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalBatchObject other && Equals(other); + public bool Equals(InternalBatchObject other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? 
StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0;
+        public override string ToString() => _value;
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalBatchRequestCounts.Serialization.cs b/.dotnet/src/Generated/Models/InternalBatchRequestCounts.Serialization.cs
new file mode 100644
index 000000000..ad7ad277b
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalBatchRequestCounts.Serialization.cs
@@ -0,0 +1,155 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace OpenAI.Batch
+{
+    internal partial class InternalBatchRequestCounts : IJsonModel<InternalBatchRequestCounts>
+    {
+        void IJsonModel<InternalBatchRequestCounts>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalBatchRequestCounts>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(InternalBatchRequestCounts)} does not support writing '{format}' format.");
+            }
+
+            writer.WriteStartObject();
+            if (SerializedAdditionalRawData?.ContainsKey("total") != true)
+            {
+                writer.WritePropertyName("total"u8);
+                writer.WriteNumberValue(Total);
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("completed") != true)
+            {
+                writer.WritePropertyName("completed"u8);
+                writer.WriteNumberValue(Completed);
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("failed") != true)
+            {
+                writer.WritePropertyName("failed"u8);
+                writer.WriteNumberValue(Failed);
+            }
+            if (SerializedAdditionalRawData != null)
+            {
+                foreach (var item in SerializedAdditionalRawData)
+                {
+                    if (ModelSerializationExtensions.IsSentinelValue(item.Value))
+                    {
+                        continue;
+                    }
+                    writer.WritePropertyName(item.Key);
+#if NET6_0_OR_GREATER
+                    writer.WriteRawValue(item.Value);
+#else
+                    using (JsonDocument document = JsonDocument.Parse(item.Value))
+                    {
+                        JsonSerializer.Serialize(writer, document.RootElement);
+                    }
+#endif
+                }
+            }
+            writer.WriteEndObject();
+        }
+
+        InternalBatchRequestCounts IJsonModel<InternalBatchRequestCounts>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalBatchRequestCounts>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(InternalBatchRequestCounts)} does not support reading '{format}' format.");
+            }
+
+            using JsonDocument document = JsonDocument.ParseValue(ref reader);
+            return DeserializeInternalBatchRequestCounts(document.RootElement, options);
+        }
+
+        internal static InternalBatchRequestCounts DeserializeInternalBatchRequestCounts(JsonElement element, ModelReaderWriterOptions options = null)
+        {
+            options ??= ModelSerializationExtensions.WireOptions;
+
+            if (element.ValueKind == JsonValueKind.Null)
+            {
+                return null;
+            }
+            int total = default;
+            int completed = default;
+            int failed = default;
+            IDictionary<string, BinaryData> serializedAdditionalRawData = default;
+            Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>();
+            foreach (var property in element.EnumerateObject())
+            {
+                if (property.NameEquals("total"u8))
+                {
+                    total = property.Value.GetInt32();
+                    continue;
+                }
+                if (property.NameEquals("completed"u8))
+                {
+                    completed = property.Value.GetInt32();
+                    continue;
+                }
+                if (property.NameEquals("failed"u8))
+                {
+                    failed = property.Value.GetInt32();
+                    continue;
+                }
+                if (true)
+                {
+                    rawDataDictionary ??= new Dictionary<string, BinaryData>();
+                    rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText()));
+                }
+            }
+            serializedAdditionalRawData = rawDataDictionary;
+            return new InternalBatchRequestCounts(total, completed, failed, serializedAdditionalRawData);
+        }
+
+        BinaryData IPersistableModel<InternalBatchRequestCounts>.Write(ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalBatchRequestCounts>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    return ModelReaderWriter.Write(this, options);
+                default:
+                    throw new FormatException($"The model {nameof(InternalBatchRequestCounts)} does not support writing '{options.Format}' format.");
+            }
+        }
+
+        InternalBatchRequestCounts IPersistableModel<InternalBatchRequestCounts>.Create(BinaryData data, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalBatchRequestCounts>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    {
+                        using JsonDocument document = JsonDocument.Parse(data);
+                        return DeserializeInternalBatchRequestCounts(document.RootElement, options);
+                    }
+                default:
+                    throw new FormatException($"The model {nameof(InternalBatchRequestCounts)} does not support reading '{options.Format}' format.");
+            }
+        }
+
+        string IPersistableModel<InternalBatchRequestCounts>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J";
+
+        internal static InternalBatchRequestCounts FromResponse(PipelineResponse response)
+        {
+            using var document = JsonDocument.Parse(response.Content);
+            return DeserializeInternalBatchRequestCounts(document.RootElement);
+        }
+
+        internal virtual BinaryContent ToBinaryContent()
+        {
+            return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions);
+        }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalBatchRequestCounts.cs b/.dotnet/src/Generated/Models/InternalBatchRequestCounts.cs
new file mode 100644
index 000000000..1e9731e07
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalBatchRequestCounts.cs
@@ -0,0 +1,36 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.Collections.Generic;
+
+namespace OpenAI.Batch
+{
+    internal partial class InternalBatchRequestCounts
+    {
+        internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; }
+        internal InternalBatchRequestCounts(int total, int completed, int failed)
+        {
+            Total = total;
+            Completed = completed;
+            Failed = failed;
+        }
+
+        internal InternalBatchRequestCounts(int total, int completed, int failed, IDictionary<string, BinaryData> serializedAdditionalRawData)
+        {
+            Total = total;
+            Completed = completed;
+            Failed = failed;
+            SerializedAdditionalRawData = serializedAdditionalRawData;
+        }
+
+        internal InternalBatchRequestCounts()
+        {
+        }
+
+        public int Total { get; }
+        public int Completed { get; }
+        public int Failed { get; }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalBatchRequestInput.Serialization.cs b/.dotnet/src/Generated/Models/InternalBatchRequestInput.Serialization.cs
new file mode 100644
index 000000000..fe753aed3
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalBatchRequestInput.Serialization.cs
@@ -0,0 +1,163 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace OpenAI.Batch
+{
+    internal partial class InternalBatchRequestInput : IJsonModel<InternalBatchRequestInput>
+    {
+        void IJsonModel<InternalBatchRequestInput>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalBatchRequestInput>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(InternalBatchRequestInput)} does not support writing '{format}' format.");
+            }
+
+            writer.WriteStartObject();
+            if (SerializedAdditionalRawData?.ContainsKey("custom_id") != true && Optional.IsDefined(CustomId))
+            {
+                writer.WritePropertyName("custom_id"u8);
+                writer.WriteStringValue(CustomId);
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("method") != true && Optional.IsDefined(Method))
+            {
+                writer.WritePropertyName("method"u8);
+                writer.WriteStringValue(Method.Value.ToString());
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("url") != true && Optional.IsDefined(Url))
+            {
+                writer.WritePropertyName("url"u8);
+                writer.WriteStringValue(Url.AbsoluteUri);
+            }
+            if (SerializedAdditionalRawData != null)
+            {
+                foreach (var item in SerializedAdditionalRawData)
+                {
+                    if (ModelSerializationExtensions.IsSentinelValue(item.Value))
+                    {
+                        continue;
+                    }
+                    writer.WritePropertyName(item.Key);
+#if NET6_0_OR_GREATER
+                    writer.WriteRawValue(item.Value);
+#else
+                    using (JsonDocument document = JsonDocument.Parse(item.Value))
+                    {
+                        JsonSerializer.Serialize(writer, document.RootElement);
+                    }
+#endif
+                }
+            }
+            writer.WriteEndObject();
+        }
+
+        InternalBatchRequestInput IJsonModel<InternalBatchRequestInput>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalBatchRequestInput>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(InternalBatchRequestInput)} does not support reading '{format}' format.");
+            }
+
+            using JsonDocument document = JsonDocument.ParseValue(ref reader);
+            return DeserializeInternalBatchRequestInput(document.RootElement, options);
+        }
+
+        internal static InternalBatchRequestInput DeserializeInternalBatchRequestInput(JsonElement element, ModelReaderWriterOptions options = null)
+        {
+            options ??= ModelSerializationExtensions.WireOptions;
+
+            if (element.ValueKind == JsonValueKind.Null)
+            {
+                return null;
+            }
+            string customId = default;
+            InternalBatchRequestInputMethod? method = default;
+            Uri url = default;
+            IDictionary<string, BinaryData> serializedAdditionalRawData = default;
+            Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>();
+            foreach (var property in element.EnumerateObject())
+            {
+                if (property.NameEquals("custom_id"u8))
+                {
+                    customId = property.Value.GetString();
+                    continue;
+                }
+                if (property.NameEquals("method"u8))
+                {
+                    if (property.Value.ValueKind == JsonValueKind.Null)
+                    {
+                        continue;
+                    }
+                    method = new InternalBatchRequestInputMethod(property.Value.GetString());
+                    continue;
+                }
+                if (property.NameEquals("url"u8))
+                {
+                    if (property.Value.ValueKind == JsonValueKind.Null)
+                    {
+                        continue;
+                    }
+                    url = new Uri(property.Value.GetString());
+                    continue;
+                }
+                if (true)
+                {
+                    rawDataDictionary ??= new Dictionary<string, BinaryData>();
+                    rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText()));
+                }
+            }
+            serializedAdditionalRawData = rawDataDictionary;
+            return new InternalBatchRequestInput(customId, method, url, serializedAdditionalRawData);
+        }
+
+        BinaryData IPersistableModel<InternalBatchRequestInput>.Write(ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalBatchRequestInput>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    return ModelReaderWriter.Write(this, options);
+                default:
+                    throw new FormatException($"The model {nameof(InternalBatchRequestInput)} does not support writing '{options.Format}' format.");
+            }
+        }
+
+        InternalBatchRequestInput IPersistableModel<InternalBatchRequestInput>.Create(BinaryData data, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalBatchRequestInput>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    {
+                        using JsonDocument document = JsonDocument.Parse(data);
+                        return DeserializeInternalBatchRequestInput(document.RootElement, options);
+                    }
+                default:
+                    throw new FormatException($"The model {nameof(InternalBatchRequestInput)} does not support reading '{options.Format}' format.");
+            }
+        }
+
+        string IPersistableModel<InternalBatchRequestInput>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J";
+
+        internal static InternalBatchRequestInput FromResponse(PipelineResponse response)
+        {
+            using var document = JsonDocument.Parse(response.Content);
+            return DeserializeInternalBatchRequestInput(document.RootElement);
+        }
+
+        internal virtual BinaryContent ToBinaryContent()
+        {
+            return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions);
+        }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalBatchRequestInput.cs b/.dotnet/src/Generated/Models/InternalBatchRequestInput.cs
new file mode 100644
index 000000000..7e4b58c93
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalBatchRequestInput.cs
@@ -0,0 +1,29 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.Collections.Generic;
+
+namespace OpenAI.Batch
+{
+    internal partial class InternalBatchRequestInput
+    {
+        internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; }
+        public InternalBatchRequestInput()
+        {
+        }
+
+        internal InternalBatchRequestInput(string customId, InternalBatchRequestInputMethod? method, Uri url, IDictionary<string, BinaryData> serializedAdditionalRawData)
+        {
+            CustomId = customId;
+            Method = method;
+            Url = url;
+            SerializedAdditionalRawData = serializedAdditionalRawData;
+        }
+
+        public string CustomId { get; set; }
+        public InternalBatchRequestInputMethod? Method { get; set; }
+        public Uri Url { get; set; }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalBatchRequestInputMethod.cs b/.dotnet/src/Generated/Models/InternalBatchRequestInputMethod.cs
new file mode 100644
index 000000000..daff5451d
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalBatchRequestInputMethod.cs
@@ -0,0 +1,34 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ComponentModel;
+
+namespace OpenAI.Batch
+{
+    internal readonly partial struct InternalBatchRequestInputMethod : IEquatable<InternalBatchRequestInputMethod>
+    {
+        private readonly string _value;
+
+        public InternalBatchRequestInputMethod(string value)
+        {
+            _value = value ?? throw new ArgumentNullException(nameof(value));
+        }
+
+        private const string POSTValue = "POST";
+
+        public static InternalBatchRequestInputMethod POST { get; } = new InternalBatchRequestInputMethod(POSTValue);
+        public static bool operator ==(InternalBatchRequestInputMethod left, InternalBatchRequestInputMethod right) => left.Equals(right);
+        public static bool operator !=(InternalBatchRequestInputMethod left, InternalBatchRequestInputMethod right) => !left.Equals(right);
+        public static implicit operator InternalBatchRequestInputMethod(string value) => new InternalBatchRequestInputMethod(value);
+
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override bool Equals(object obj) => obj is InternalBatchRequestInputMethod other && Equals(other);
+        public bool Equals(InternalBatchRequestInputMethod other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase);
+
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0;
+        public override string ToString() => _value;
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalBatchRequestOutput.Serialization.cs b/.dotnet/src/Generated/Models/InternalBatchRequestOutput.Serialization.cs
new file mode 100644
index 000000000..cd87dfd3b
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalBatchRequestOutput.Serialization.cs
@@ -0,0 +1,190 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace OpenAI.Batch
+{
+    internal partial class InternalBatchRequestOutput : IJsonModel<InternalBatchRequestOutput>
+    {
+        void IJsonModel<InternalBatchRequestOutput>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalBatchRequestOutput>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(InternalBatchRequestOutput)} does not support writing '{format}' format.");
+            }
+
+            writer.WriteStartObject();
+            if (SerializedAdditionalRawData?.ContainsKey("id") != true && Optional.IsDefined(Id))
+            {
+                writer.WritePropertyName("id"u8);
+                writer.WriteStringValue(Id);
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("custom_id") != true && Optional.IsDefined(CustomId))
+            {
+                writer.WritePropertyName("custom_id"u8);
+                writer.WriteStringValue(CustomId);
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("response") != true && Optional.IsDefined(Response))
+            {
+                if (Response != null)
+                {
+                    writer.WritePropertyName("response"u8);
+                    writer.WriteObjectValue(Response, options);
+                }
+                else
+                {
+                    writer.WriteNull("response");
+                }
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("error") != true && Optional.IsDefined(Error))
+            {
+                if (Error != null)
+                {
+                    writer.WritePropertyName("error"u8);
+                    writer.WriteObjectValue(Error, options);
+                }
+                else
+                {
+                    writer.WriteNull("error");
+                }
+            }
+            if (SerializedAdditionalRawData != null)
+            {
+                foreach (var item in SerializedAdditionalRawData)
+                {
+                    if (ModelSerializationExtensions.IsSentinelValue(item.Value))
+                    {
+                        continue;
+                    }
+                    writer.WritePropertyName(item.Key);
+#if NET6_0_OR_GREATER
+                    writer.WriteRawValue(item.Value);
+#else
+                    using (JsonDocument document = JsonDocument.Parse(item.Value))
+                    {
+                        JsonSerializer.Serialize(writer, document.RootElement);
+                    }
+#endif
+                }
+            }
+            writer.WriteEndObject();
+        }
+
+        InternalBatchRequestOutput IJsonModel<InternalBatchRequestOutput>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalBatchRequestOutput>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(InternalBatchRequestOutput)} does not support reading '{format}' format.");
+            }
+
+            using JsonDocument document = JsonDocument.ParseValue(ref reader);
+            return DeserializeInternalBatchRequestOutput(document.RootElement, options);
+        }
+
+        internal static InternalBatchRequestOutput DeserializeInternalBatchRequestOutput(JsonElement element, ModelReaderWriterOptions options = null)
+        {
+            options ??= ModelSerializationExtensions.WireOptions;
+
+            if (element.ValueKind == JsonValueKind.Null)
+            {
+                return null;
+            }
+            string id = default;
+            string customId = default;
+            InternalBatchRequestOutputResponse response = default;
+            InternalBatchRequestOutputError error = default;
+            IDictionary<string, BinaryData> serializedAdditionalRawData = default;
+            Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>();
+            foreach (var property in element.EnumerateObject())
+            {
+                if (property.NameEquals("id"u8))
+                {
+                    id = property.Value.GetString();
+                    continue;
+                }
+                if (property.NameEquals("custom_id"u8))
+                {
+                    customId = property.Value.GetString();
+                    continue;
+                }
+                if (property.NameEquals("response"u8))
+                {
+                    if (property.Value.ValueKind == JsonValueKind.Null)
+                    {
+                        response = null;
+                        continue;
+                    }
+                    response = InternalBatchRequestOutputResponse.DeserializeInternalBatchRequestOutputResponse(property.Value, options);
+                    continue;
+                }
+                if (property.NameEquals("error"u8))
+                {
+                    if (property.Value.ValueKind == JsonValueKind.Null)
+                    {
+                        error = null;
+                        continue;
+                    }
+                    error = InternalBatchRequestOutputError.DeserializeInternalBatchRequestOutputError(property.Value, options);
+                    continue;
+                }
+                if (true)
+                {
+                    rawDataDictionary ??= new Dictionary<string, BinaryData>();
+                    rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText()));
+                }
+            }
+            serializedAdditionalRawData = rawDataDictionary;
+            return new InternalBatchRequestOutput(id, customId, response, error, serializedAdditionalRawData);
+        }
+
+        BinaryData IPersistableModel<InternalBatchRequestOutput>.Write(ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalBatchRequestOutput>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    return ModelReaderWriter.Write(this, options);
+                default:
+                    throw new FormatException($"The model {nameof(InternalBatchRequestOutput)} does not support writing '{options.Format}' format.");
+            }
+        }
+
+        InternalBatchRequestOutput IPersistableModel<InternalBatchRequestOutput>.Create(BinaryData data, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalBatchRequestOutput>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    {
+                        using JsonDocument document = JsonDocument.Parse(data);
+                        return DeserializeInternalBatchRequestOutput(document.RootElement, options);
+                    }
+                default:
+                    throw new FormatException($"The model {nameof(InternalBatchRequestOutput)} does not support reading '{options.Format}' format.");
+            }
+        }
+
+        string IPersistableModel<InternalBatchRequestOutput>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J";
+
+        internal static InternalBatchRequestOutput FromResponse(PipelineResponse response)
+        {
+            using var document = JsonDocument.Parse(response.Content);
+            return DeserializeInternalBatchRequestOutput(document.RootElement);
+        }
+
+        internal virtual BinaryContent ToBinaryContent()
+        {
+            return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions);
+        }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalBatchRequestOutput.cs b/.dotnet/src/Generated/Models/InternalBatchRequestOutput.cs
new file mode 100644
index 000000000..ae119ecd7
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalBatchRequestOutput.cs
@@ -0,0 +1,31 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.Collections.Generic;
+
+namespace OpenAI.Batch
+{
+    internal partial class InternalBatchRequestOutput
+    {
+        internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; }
+        internal InternalBatchRequestOutput()
+        {
+        }
+
+        internal InternalBatchRequestOutput(string id, string customId, InternalBatchRequestOutputResponse response, InternalBatchRequestOutputError error, IDictionary<string, BinaryData> serializedAdditionalRawData)
+        {
+            Id = id;
+            CustomId = customId;
+            Response = response;
+            Error = error;
+            SerializedAdditionalRawData = serializedAdditionalRawData;
+        }
+
+        public string Id { get; }
+        public string CustomId { get; }
+        public InternalBatchRequestOutputResponse Response { get; }
+        public InternalBatchRequestOutputError Error { get; }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalBatchRequestOutputError.Serialization.cs b/.dotnet/src/Generated/Models/InternalBatchRequestOutputError.Serialization.cs
new file mode 100644
index 000000000..00f747f15
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalBatchRequestOutputError.Serialization.cs
@@ -0,0 +1,144 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace OpenAI.Batch
+{
+    internal partial class InternalBatchRequestOutputError : IJsonModel<InternalBatchRequestOutputError>
+    {
+        void IJsonModel<InternalBatchRequestOutputError>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalBatchRequestOutputError>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(InternalBatchRequestOutputError)} does not support writing '{format}' format.");
+            }
+
+            writer.WriteStartObject();
+            if (SerializedAdditionalRawData?.ContainsKey("code") != true && Optional.IsDefined(Code))
+            {
+                writer.WritePropertyName("code"u8);
+                writer.WriteStringValue(Code);
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("message") != true && Optional.IsDefined(Message))
+            {
+                writer.WritePropertyName("message"u8);
+                writer.WriteStringValue(Message);
+            }
+            if (SerializedAdditionalRawData != null)
+            {
+                foreach (var item in SerializedAdditionalRawData)
+                {
+                    if (ModelSerializationExtensions.IsSentinelValue(item.Value))
+                    {
+                        continue;
+                    }
+                    writer.WritePropertyName(item.Key);
+#if NET6_0_OR_GREATER
+                    writer.WriteRawValue(item.Value);
+#else
+                    using (JsonDocument document = JsonDocument.Parse(item.Value))
+                    {
+                        JsonSerializer.Serialize(writer, document.RootElement);
+                    }
+#endif
+                }
+            }
+            writer.WriteEndObject();
+        }
+
+        InternalBatchRequestOutputError IJsonModel<InternalBatchRequestOutputError>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalBatchRequestOutputError>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(InternalBatchRequestOutputError)} does not support reading '{format}' format.");
+            }
+
+            using JsonDocument document = JsonDocument.ParseValue(ref reader);
+            return DeserializeInternalBatchRequestOutputError(document.RootElement, options);
+        }
+
+        internal static InternalBatchRequestOutputError DeserializeInternalBatchRequestOutputError(JsonElement element, ModelReaderWriterOptions options = null)
+        {
+            options ??= ModelSerializationExtensions.WireOptions;
+
+            if (element.ValueKind == JsonValueKind.Null)
+            {
+                return null;
+            }
+            string code = default;
+            string message = default;
+            IDictionary<string, BinaryData> serializedAdditionalRawData = default;
+            Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>();
+            foreach (var property in element.EnumerateObject())
+            {
+                if (property.NameEquals("code"u8))
+                {
+                    code = property.Value.GetString();
+                    continue;
+                }
+                if (property.NameEquals("message"u8))
+                {
+                    message = property.Value.GetString();
+                    continue;
+                }
+                if (true)
+                {
+                    rawDataDictionary ??= new Dictionary<string, BinaryData>();
+                    rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText()));
+                }
+            }
+            serializedAdditionalRawData = rawDataDictionary;
+            return new InternalBatchRequestOutputError(code, message, serializedAdditionalRawData);
+        }
+
+        BinaryData IPersistableModel<InternalBatchRequestOutputError>.Write(ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalBatchRequestOutputError>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    return ModelReaderWriter.Write(this, options);
+                default:
+                    throw new FormatException($"The model {nameof(InternalBatchRequestOutputError)} does not support writing '{options.Format}' format.");
+            }
+        }
+
+        InternalBatchRequestOutputError IPersistableModel<InternalBatchRequestOutputError>.Create(BinaryData data, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalBatchRequestOutputError>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    {
+                        using JsonDocument document = JsonDocument.Parse(data);
+                        return DeserializeInternalBatchRequestOutputError(document.RootElement, options);
+                    }
+                default:
+                    throw new FormatException($"The model {nameof(InternalBatchRequestOutputError)} does not support reading '{options.Format}' format.");
+            }
+        }
+
+        string IPersistableModel<InternalBatchRequestOutputError>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J";
+
+        internal static InternalBatchRequestOutputError FromResponse(PipelineResponse response)
+        {
+            using var document = JsonDocument.Parse(response.Content);
+            return DeserializeInternalBatchRequestOutputError(document.RootElement);
+        }
+
+        internal virtual BinaryContent ToBinaryContent()
+        {
+            return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions);
+        }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalBatchRequestOutputError.cs b/.dotnet/src/Generated/Models/InternalBatchRequestOutputError.cs
new file mode 100644
index 000000000..b2ddd5bea
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalBatchRequestOutputError.cs
@@ -0,0 +1,27 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.Collections.Generic;
+
+namespace OpenAI.Batch
+{
+    internal partial class InternalBatchRequestOutputError
+    {
+        internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; }
+        internal InternalBatchRequestOutputError()
+        {
+        }
+
+        internal InternalBatchRequestOutputError(string code, string message, IDictionary<string, BinaryData> serializedAdditionalRawData)
+        {
+            Code = code;
+            Message = message;
+            SerializedAdditionalRawData = serializedAdditionalRawData;
+        }
+
+        public string Code { get; }
+        public string Message { get; }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalBatchRequestOutputResponse.Serialization.cs b/.dotnet/src/Generated/Models/InternalBatchRequestOutputResponse.Serialization.cs
new file mode 100644
index 000000000..54280475f
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalBatchRequestOutputResponse.Serialization.cs
@@ -0,0 +1,174 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace OpenAI.Batch
+{
+    internal partial class InternalBatchRequestOutputResponse : IJsonModel<InternalBatchRequestOutputResponse>
+    {
+        void IJsonModel<InternalBatchRequestOutputResponse>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalBatchRequestOutputResponse>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(InternalBatchRequestOutputResponse)} does not support writing '{format}' format.");
+            }
+
+            writer.WriteStartObject();
+            if (SerializedAdditionalRawData?.ContainsKey("status_code") != true && Optional.IsDefined(StatusCode))
+            {
+                writer.WritePropertyName("status_code"u8);
+                writer.WriteNumberValue(StatusCode.Value);
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("request_id") != true && Optional.IsDefined(RequestId))
+            {
+                writer.WritePropertyName("request_id"u8);
+                writer.WriteStringValue(RequestId);
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("body") != true && Optional.IsCollectionDefined(Body))
+            {
+                writer.WritePropertyName("body"u8);
+                writer.WriteStartObject();
+                foreach (var item in Body)
+                {
+                    writer.WritePropertyName(item.Key);
+                    writer.WriteStringValue(item.Value);
+                }
+                writer.WriteEndObject();
+            }
+            if (SerializedAdditionalRawData != null)
+            {
+                foreach (var item in SerializedAdditionalRawData)
+                {
+                    if (ModelSerializationExtensions.IsSentinelValue(item.Value))
+                    {
+                        continue;
+                    }
+                    writer.WritePropertyName(item.Key);
+#if NET6_0_OR_GREATER
+                    writer.WriteRawValue(item.Value);
+#else
+                    using (JsonDocument document = JsonDocument.Parse(item.Value))
+                    {
+                        JsonSerializer.Serialize(writer, document.RootElement);
+                    }
+#endif
+                }
+            }
+            writer.WriteEndObject();
+        }
+
+        InternalBatchRequestOutputResponse IJsonModel<InternalBatchRequestOutputResponse>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalBatchRequestOutputResponse>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(InternalBatchRequestOutputResponse)} does not support reading '{format}' format.");
+            }
+
+            using JsonDocument document = JsonDocument.ParseValue(ref reader);
+            return DeserializeInternalBatchRequestOutputResponse(document.RootElement, options);
+        }
+
+        internal static InternalBatchRequestOutputResponse DeserializeInternalBatchRequestOutputResponse(JsonElement element, ModelReaderWriterOptions options = null)
+        {
+            options ??= ModelSerializationExtensions.WireOptions;
+
+            if (element.ValueKind == JsonValueKind.Null)
+            {
+                return null;
+            }
+            int? statusCode = default;
+            string requestId = default;
+            IReadOnlyDictionary<string, string> body = default;
+            IDictionary<string, BinaryData> serializedAdditionalRawData = default;
+            Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>();
+            foreach (var property in element.EnumerateObject())
+            {
+                if (property.NameEquals("status_code"u8))
+                {
+                    if (property.Value.ValueKind == JsonValueKind.Null)
+                    {
+                        continue;
+                    }
+                    statusCode = property.Value.GetInt32();
+                    continue;
+                }
+                if (property.NameEquals("request_id"u8))
+                {
+                    requestId = property.Value.GetString();
+                    continue;
+                }
+                if (property.NameEquals("body"u8))
+                {
+                    if (property.Value.ValueKind == JsonValueKind.Null)
+                    {
+                        continue;
+                    }
+                    Dictionary<string, string> dictionary = new Dictionary<string, string>();
+                    foreach (var property0 in property.Value.EnumerateObject())
+                    {
+                        dictionary.Add(property0.Name, property0.Value.GetString());
+                    }
+                    body = dictionary;
+                    continue;
+                }
+                if (true)
+                {
+                    rawDataDictionary ??= new Dictionary<string, BinaryData>();
+                    rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText()));
+                }
+            }
+            serializedAdditionalRawData = rawDataDictionary;
+            return new InternalBatchRequestOutputResponse(statusCode, requestId, body ?? new ChangeTrackingDictionary<string, string>(), serializedAdditionalRawData);
+        }
+
+        BinaryData IPersistableModel<InternalBatchRequestOutputResponse>.Write(ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalBatchRequestOutputResponse>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    return ModelReaderWriter.Write(this, options);
+                default:
+                    throw new FormatException($"The model {nameof(InternalBatchRequestOutputResponse)} does not support writing '{options.Format}' format.");
+            }
+        }
+
+        InternalBatchRequestOutputResponse IPersistableModel<InternalBatchRequestOutputResponse>.Create(BinaryData data, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalBatchRequestOutputResponse>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    {
+                        using JsonDocument document = JsonDocument.Parse(data);
+                        return DeserializeInternalBatchRequestOutputResponse(document.RootElement, options);
+                    }
+                default:
+                    throw new FormatException($"The model {nameof(InternalBatchRequestOutputResponse)} does not support reading '{options.Format}' format.");
+            }
+        }
+
+        string IPersistableModel<InternalBatchRequestOutputResponse>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J";
+
+        internal static InternalBatchRequestOutputResponse FromResponse(PipelineResponse response)
+        {
+            using var document = JsonDocument.Parse(response.Content);
+            return DeserializeInternalBatchRequestOutputResponse(document.RootElement);
+        }
+
+        internal virtual BinaryContent ToBinaryContent()
+        {
+            return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions);
+        }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalBatchRequestOutputResponse.cs b/.dotnet/src/Generated/Models/InternalBatchRequestOutputResponse.cs
new file mode 100644
index 000000000..2919fec21
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalBatchRequestOutputResponse.cs
@@ -0,0 +1,30 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.Collections.Generic;
+
+namespace OpenAI.Batch
+{
+    internal partial class InternalBatchRequestOutputResponse
+    {
+        internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; }
+        internal InternalBatchRequestOutputResponse()
+        {
+            Body = new ChangeTrackingDictionary<string, string>();
+        }
+
+        internal InternalBatchRequestOutputResponse(int? statusCode, string requestId, IReadOnlyDictionary<string, string> body, IDictionary<string, BinaryData> serializedAdditionalRawData)
+        {
+            StatusCode = statusCode;
+            RequestId = requestId;
+            Body = body;
+            SerializedAdditionalRawData = serializedAdditionalRawData;
+        }
+
+        public int? StatusCode { get; }
+        public string RequestId { get; }
+        public IReadOnlyDictionary<string, string> Body { get; }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalBatchStatus.cs b/.dotnet/src/Generated/Models/InternalBatchStatus.cs
new file mode 100644
index 000000000..b34fe0bf8
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalBatchStatus.cs
@@ -0,0 +1,48 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ComponentModel;
+
+namespace OpenAI.Batch
+{
+    internal readonly partial struct InternalBatchStatus : IEquatable<InternalBatchStatus>
+    {
+        private readonly string _value;
+
+        public InternalBatchStatus(string value)
+        {
+            _value = value ?? throw new ArgumentNullException(nameof(value));
+        }
+
+        private const string ValidatingValue = "validating";
+        private const string FailedValue = "failed";
+        private const string InProgressValue = "in_progress";
+        private const string FinalizingValue = "finalizing";
+        private const string CompletedValue = "completed";
+        private const string ExpiredValue = "expired";
+        private const string CancellingValue = "cancelling";
+        private const string CancelledValue = "cancelled";
+
+        public static InternalBatchStatus Validating { get; } = new InternalBatchStatus(ValidatingValue);
+        public static InternalBatchStatus Failed { get; } = new InternalBatchStatus(FailedValue);
+        public static InternalBatchStatus InProgress { get; } = new InternalBatchStatus(InProgressValue);
+        public static InternalBatchStatus Finalizing { get; } = new InternalBatchStatus(FinalizingValue);
+        public static InternalBatchStatus Completed { get; } = new InternalBatchStatus(CompletedValue);
+        public static InternalBatchStatus Expired { get; } = new InternalBatchStatus(ExpiredValue);
+        public static InternalBatchStatus Cancelling { get; } = new InternalBatchStatus(CancellingValue);
+        public static InternalBatchStatus Cancelled { get; } = new InternalBatchStatus(CancelledValue);
+        public static bool operator ==(InternalBatchStatus left, InternalBatchStatus right) => left.Equals(right);
+        public static bool operator !=(InternalBatchStatus left, InternalBatchStatus right) => !left.Equals(right);
+        public static implicit operator InternalBatchStatus(string value) => new InternalBatchStatus(value);
+
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override bool Equals(object obj) => obj is InternalBatchStatus other && Equals(other);
+        public bool Equals(InternalBatchStatus other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase);
+
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0;
+        public override string ToString() => _value;
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalChatCompletionFunctionCallOption.Serialization.cs b/.dotnet/src/Generated/Models/InternalChatCompletionFunctionCallOption.Serialization.cs
new file mode 100644
index 000000000..b8f419482
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalChatCompletionFunctionCallOption.Serialization.cs
@@ -0,0 +1,133 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace OpenAI.Chat
+{
+    internal partial class InternalChatCompletionFunctionCallOption : IJsonModel<InternalChatCompletionFunctionCallOption>
+    {
+        void IJsonModel<InternalChatCompletionFunctionCallOption>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalChatCompletionFunctionCallOption>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(InternalChatCompletionFunctionCallOption)} does not support writing '{format}' format.");
+            }
+
+            writer.WriteStartObject();
+            if (SerializedAdditionalRawData?.ContainsKey("name") != true)
+            {
+                writer.WritePropertyName("name"u8);
+                writer.WriteStringValue(Name);
+            }
+            if (SerializedAdditionalRawData != null)
+            {
+                foreach (var item in SerializedAdditionalRawData)
+                {
+                    if (ModelSerializationExtensions.IsSentinelValue(item.Value))
+                    {
+                        continue;
+                    }
+                    writer.WritePropertyName(item.Key);
+#if NET6_0_OR_GREATER
+                    writer.WriteRawValue(item.Value);
+#else
+                    using (JsonDocument document = JsonDocument.Parse(item.Value))
+                    {
+                        JsonSerializer.Serialize(writer, document.RootElement);
+                    }
+#endif
+                }
+            }
+            writer.WriteEndObject();
+        }
+
+        InternalChatCompletionFunctionCallOption IJsonModel<InternalChatCompletionFunctionCallOption>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalChatCompletionFunctionCallOption>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(InternalChatCompletionFunctionCallOption)} does not support reading '{format}' format.");
+            }
+
+            using JsonDocument document = JsonDocument.ParseValue(ref reader);
+            return DeserializeInternalChatCompletionFunctionCallOption(document.RootElement, options);
+        }
+
+        internal static InternalChatCompletionFunctionCallOption DeserializeInternalChatCompletionFunctionCallOption(JsonElement element, ModelReaderWriterOptions options = null)
+        {
+            options ??= ModelSerializationExtensions.WireOptions;
+
+            if (element.ValueKind == JsonValueKind.Null)
+            {
+                return null;
+            }
+            string name = default;
+            IDictionary<string, BinaryData> serializedAdditionalRawData = default;
+            Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>();
+            foreach (var property in element.EnumerateObject())
+            {
+                if (property.NameEquals("name"u8))
+                {
+                    name = property.Value.GetString();
+                    continue;
+                }
+                if (true)
+                {
+                    rawDataDictionary ??= new Dictionary<string, BinaryData>();
+                    rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText()));
+                }
+            }
+            serializedAdditionalRawData = rawDataDictionary;
+            return new InternalChatCompletionFunctionCallOption(name, serializedAdditionalRawData);
+        }
+
+        BinaryData IPersistableModel<InternalChatCompletionFunctionCallOption>.Write(ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalChatCompletionFunctionCallOption>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    return ModelReaderWriter.Write(this, options);
+                default:
+                    throw new FormatException($"The model {nameof(InternalChatCompletionFunctionCallOption)} does not support writing '{options.Format}' format.");
+            }
+        }
+
+        InternalChatCompletionFunctionCallOption IPersistableModel<InternalChatCompletionFunctionCallOption>.Create(BinaryData data, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalChatCompletionFunctionCallOption(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalChatCompletionFunctionCallOption)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalChatCompletionFunctionCallOption FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalChatCompletionFunctionCallOption(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalChatCompletionFunctionCallOption.cs b/.dotnet/src/Generated/Models/InternalChatCompletionFunctionCallOption.cs new file mode 100644 index 000000000..2cb95446c --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalChatCompletionFunctionCallOption.cs @@ -0,0 +1,32 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Chat +{ + internal partial class InternalChatCompletionFunctionCallOption + { + internal IDictionary SerializedAdditionalRawData { get; set; } + public InternalChatCompletionFunctionCallOption(string name) + { + Argument.AssertNotNull(name, nameof(name)); + + Name = name; + } + + internal InternalChatCompletionFunctionCallOption(string name, IDictionary serializedAdditionalRawData) + { + Name = name; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal InternalChatCompletionFunctionCallOption() + { + } + + public string Name { get; } + } +} diff --git 
a/.dotnet/src/Generated/Models/InternalChatCompletionMessageToolCallChunkFunction.Serialization.cs b/.dotnet/src/Generated/Models/InternalChatCompletionMessageToolCallChunkFunction.Serialization.cs new file mode 100644 index 000000000..40c34b172 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalChatCompletionMessageToolCallChunkFunction.Serialization.cs @@ -0,0 +1,144 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Chat +{ + internal partial class InternalChatCompletionMessageToolCallChunkFunction : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalChatCompletionMessageToolCallChunkFunction)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("name") != true && Optional.IsDefined(Name)) + { + writer.WritePropertyName("name"u8); + writer.WriteStringValue(Name); + } + if (SerializedAdditionalRawData?.ContainsKey("arguments") != true && Optional.IsDefined(Arguments)) + { + writer.WritePropertyName("arguments"u8); + writer.WriteStringValue(Arguments); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalChatCompletionMessageToolCallChunkFunction IJsonModel.Create(ref Utf8JsonReader 
reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalChatCompletionMessageToolCallChunkFunction)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalChatCompletionMessageToolCallChunkFunction(document.RootElement, options); + } + + internal static InternalChatCompletionMessageToolCallChunkFunction DeserializeInternalChatCompletionMessageToolCallChunkFunction(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string name = default; + string arguments = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("name"u8)) + { + name = property.Value.GetString(); + continue; + } + if (property.NameEquals("arguments"u8)) + { + arguments = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalChatCompletionMessageToolCallChunkFunction(name, arguments, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalChatCompletionMessageToolCallChunkFunction)} does not support writing '{options.Format}' format."); + } + } + + InternalChatCompletionMessageToolCallChunkFunction IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalChatCompletionMessageToolCallChunkFunction(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalChatCompletionMessageToolCallChunkFunction)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalChatCompletionMessageToolCallChunkFunction FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalChatCompletionMessageToolCallChunkFunction(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalChatCompletionMessageToolCallChunkFunction.cs b/.dotnet/src/Generated/Models/InternalChatCompletionMessageToolCallChunkFunction.cs new file mode 100644 index 000000000..3119ca2e9 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalChatCompletionMessageToolCallChunkFunction.cs @@ -0,0 +1,27 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Chat +{ + internal partial class 
InternalChatCompletionMessageToolCallChunkFunction + { + internal IDictionary SerializedAdditionalRawData { get; set; } + internal InternalChatCompletionMessageToolCallChunkFunction() + { + } + + internal InternalChatCompletionMessageToolCallChunkFunction(string name, string arguments, IDictionary serializedAdditionalRawData) + { + Name = name; + Arguments = arguments; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + public string Name { get; } + public string Arguments { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalChatCompletionMessageToolCallChunkType.cs b/.dotnet/src/Generated/Models/InternalChatCompletionMessageToolCallChunkType.cs new file mode 100644 index 000000000..ea62bf0ab --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalChatCompletionMessageToolCallChunkType.cs @@ -0,0 +1,34 @@ +// + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Chat +{ + internal readonly partial struct InternalChatCompletionMessageToolCallChunkType : IEquatable + { + private readonly string _value; + + public InternalChatCompletionMessageToolCallChunkType(string value) + { + _value = value ?? 
throw new ArgumentNullException(nameof(value)); + } + + private const string FunctionValue = "function"; + + public static InternalChatCompletionMessageToolCallChunkType Function { get; } = new InternalChatCompletionMessageToolCallChunkType(FunctionValue); + public static bool operator ==(InternalChatCompletionMessageToolCallChunkType left, InternalChatCompletionMessageToolCallChunkType right) => left.Equals(right); + public static bool operator !=(InternalChatCompletionMessageToolCallChunkType left, InternalChatCompletionMessageToolCallChunkType right) => !left.Equals(right); + public static implicit operator InternalChatCompletionMessageToolCallChunkType(string value) => new InternalChatCompletionMessageToolCallChunkType(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalChatCompletionMessageToolCallChunkType other && Equals(other); + public bool Equals(InternalChatCompletionMessageToolCallChunkType other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? 
StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalChatCompletionMessageToolCallFunction.Serialization.cs b/.dotnet/src/Generated/Models/InternalChatCompletionMessageToolCallFunction.Serialization.cs new file mode 100644 index 000000000..bb71f2b44 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalChatCompletionMessageToolCallFunction.Serialization.cs @@ -0,0 +1,144 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Chat +{ + internal partial class InternalChatCompletionMessageToolCallFunction : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalChatCompletionMessageToolCallFunction)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("name") != true) + { + writer.WritePropertyName("name"u8); + writer.WriteStringValue(Name); + } + if (SerializedAdditionalRawData?.ContainsKey("arguments") != true) + { + writer.WritePropertyName("arguments"u8); + writer.WriteStringValue(Arguments); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + 
InternalChatCompletionMessageToolCallFunction IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalChatCompletionMessageToolCallFunction)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalChatCompletionMessageToolCallFunction(document.RootElement, options); + } + + internal static InternalChatCompletionMessageToolCallFunction DeserializeInternalChatCompletionMessageToolCallFunction(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string name = default; + string arguments = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("name"u8)) + { + name = property.Value.GetString(); + continue; + } + if (property.NameEquals("arguments"u8)) + { + arguments = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalChatCompletionMessageToolCallFunction(name, arguments, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalChatCompletionMessageToolCallFunction)} does not support writing '{options.Format}' format."); + } + } + + InternalChatCompletionMessageToolCallFunction IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalChatCompletionMessageToolCallFunction(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalChatCompletionMessageToolCallFunction)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalChatCompletionMessageToolCallFunction FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalChatCompletionMessageToolCallFunction(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalChatCompletionMessageToolCallFunction.cs b/.dotnet/src/Generated/Models/InternalChatCompletionMessageToolCallFunction.cs new file mode 100644 index 000000000..c725868a0 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalChatCompletionMessageToolCallFunction.cs @@ -0,0 +1,36 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Chat +{ + internal partial class InternalChatCompletionMessageToolCallFunction + { + internal IDictionary 
SerializedAdditionalRawData { get; set; } + public InternalChatCompletionMessageToolCallFunction(string name, string arguments) + { + Argument.AssertNotNull(name, nameof(name)); + Argument.AssertNotNull(arguments, nameof(arguments)); + + Name = name; + Arguments = arguments; + } + + internal InternalChatCompletionMessageToolCallFunction(string name, string arguments, IDictionary serializedAdditionalRawData) + { + Name = name; + Arguments = arguments; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal InternalChatCompletionMessageToolCallFunction() + { + } + + public string Name { get; set; } + public string Arguments { get; set; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalChatCompletionNamedToolChoice.Serialization.cs b/.dotnet/src/Generated/Models/InternalChatCompletionNamedToolChoice.Serialization.cs new file mode 100644 index 000000000..447e3509c --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalChatCompletionNamedToolChoice.Serialization.cs @@ -0,0 +1,144 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Chat +{ + internal partial class InternalChatCompletionNamedToolChoice : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalChatCompletionNamedToolChoice)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type.ToString()); + } + if (SerializedAdditionalRawData?.ContainsKey("function") != true) + { + writer.WritePropertyName("function"u8); + writer.WriteObjectValue(Function, options); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalChatCompletionNamedToolChoice IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalChatCompletionNamedToolChoice)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalChatCompletionNamedToolChoice(document.RootElement, options); + } + + internal static InternalChatCompletionNamedToolChoice DeserializeInternalChatCompletionNamedToolChoice(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + InternalChatCompletionNamedToolChoiceType type = default; + InternalChatCompletionNamedToolChoiceFunction function = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("type"u8)) + { + type = new InternalChatCompletionNamedToolChoiceType(property.Value.GetString()); + continue; + } + if (property.NameEquals("function"u8)) + { + function = InternalChatCompletionNamedToolChoiceFunction.DeserializeInternalChatCompletionNamedToolChoiceFunction(property.Value, options); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalChatCompletionNamedToolChoice(type, function, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalChatCompletionNamedToolChoice)} does not support writing '{options.Format}' format."); + } + } + + InternalChatCompletionNamedToolChoice IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalChatCompletionNamedToolChoice(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalChatCompletionNamedToolChoice)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalChatCompletionNamedToolChoice FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalChatCompletionNamedToolChoice(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalChatCompletionNamedToolChoice.cs b/.dotnet/src/Generated/Models/InternalChatCompletionNamedToolChoice.cs new file mode 100644 index 000000000..4538d8136 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalChatCompletionNamedToolChoice.cs @@ -0,0 +1,35 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Chat +{ + internal partial class InternalChatCompletionNamedToolChoice + { + internal IDictionary SerializedAdditionalRawData { get; set; } + public 
InternalChatCompletionNamedToolChoice(InternalChatCompletionNamedToolChoiceFunction function) + { + Argument.AssertNotNull(function, nameof(function)); + + Function = function; + } + + internal InternalChatCompletionNamedToolChoice(InternalChatCompletionNamedToolChoiceType type, InternalChatCompletionNamedToolChoiceFunction function, IDictionary serializedAdditionalRawData) + { + Type = type; + Function = function; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal InternalChatCompletionNamedToolChoice() + { + } + + public InternalChatCompletionNamedToolChoiceType Type { get; } = InternalChatCompletionNamedToolChoiceType.Function; + + public InternalChatCompletionNamedToolChoiceFunction Function { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalChatCompletionNamedToolChoiceFunction.Serialization.cs b/.dotnet/src/Generated/Models/InternalChatCompletionNamedToolChoiceFunction.Serialization.cs new file mode 100644 index 000000000..8e0e339e3 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalChatCompletionNamedToolChoiceFunction.Serialization.cs @@ -0,0 +1,133 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Chat +{ + internal partial class InternalChatCompletionNamedToolChoiceFunction : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalChatCompletionNamedToolChoiceFunction)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("name") != true) + { + writer.WritePropertyName("name"u8); + writer.WriteStringValue(Name); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalChatCompletionNamedToolChoiceFunction IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalChatCompletionNamedToolChoiceFunction)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalChatCompletionNamedToolChoiceFunction(document.RootElement, options); + } + + internal static InternalChatCompletionNamedToolChoiceFunction DeserializeInternalChatCompletionNamedToolChoiceFunction(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string name = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("name"u8)) + { + name = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalChatCompletionNamedToolChoiceFunction(name, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalChatCompletionNamedToolChoiceFunction)} does not support writing '{options.Format}' format."); + } + } + + InternalChatCompletionNamedToolChoiceFunction IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalChatCompletionNamedToolChoiceFunction(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalChatCompletionNamedToolChoiceFunction)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalChatCompletionNamedToolChoiceFunction FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalChatCompletionNamedToolChoiceFunction(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalChatCompletionNamedToolChoiceFunction.cs b/.dotnet/src/Generated/Models/InternalChatCompletionNamedToolChoiceFunction.cs new file mode 100644 index 000000000..5097bcb79 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalChatCompletionNamedToolChoiceFunction.cs @@ -0,0 +1,32 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Chat +{ + internal partial class InternalChatCompletionNamedToolChoiceFunction + { + internal IDictionary SerializedAdditionalRawData { get; set; } + public InternalChatCompletionNamedToolChoiceFunction(string name) + { + Argument.AssertNotNull(name, nameof(name)); + + Name = name; + } + + internal InternalChatCompletionNamedToolChoiceFunction(string name, IDictionary serializedAdditionalRawData) + { + Name = name; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal InternalChatCompletionNamedToolChoiceFunction() + { + } + + public string Name { get; } + } +} diff --git 
a/.dotnet/src/Generated/Models/InternalChatCompletionNamedToolChoiceType.cs b/.dotnet/src/Generated/Models/InternalChatCompletionNamedToolChoiceType.cs new file mode 100644 index 000000000..b5eda303b --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalChatCompletionNamedToolChoiceType.cs @@ -0,0 +1,34 @@ +// + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Chat +{ + internal readonly partial struct InternalChatCompletionNamedToolChoiceType : IEquatable + { + private readonly string _value; + + public InternalChatCompletionNamedToolChoiceType(string value) + { + _value = value ?? throw new ArgumentNullException(nameof(value)); + } + + private const string FunctionValue = "function"; + + public static InternalChatCompletionNamedToolChoiceType Function { get; } = new InternalChatCompletionNamedToolChoiceType(FunctionValue); + public static bool operator ==(InternalChatCompletionNamedToolChoiceType left, InternalChatCompletionNamedToolChoiceType right) => left.Equals(right); + public static bool operator !=(InternalChatCompletionNamedToolChoiceType left, InternalChatCompletionNamedToolChoiceType right) => !left.Equals(right); + public static implicit operator InternalChatCompletionNamedToolChoiceType(string value) => new InternalChatCompletionNamedToolChoiceType(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalChatCompletionNamedToolChoiceType other && Equals(other); + public bool Equals(InternalChatCompletionNamedToolChoiceType other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? 
StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalChatCompletionRequestMessageContentPartImage.Serialization.cs b/.dotnet/src/Generated/Models/InternalChatCompletionRequestMessageContentPartImage.Serialization.cs new file mode 100644 index 000000000..f348a091d --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalChatCompletionRequestMessageContentPartImage.Serialization.cs @@ -0,0 +1,144 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Chat +{ + internal partial class InternalChatCompletionRequestMessageContentPartImage : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalChatCompletionRequestMessageContentPartImage)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type.ToString()); + } + if (SerializedAdditionalRawData?.ContainsKey("image_url") != true) + { + writer.WritePropertyName("image_url"u8); + writer.WriteObjectValue(ImageUrl, options); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + 
writer.WriteEndObject(); + } + + InternalChatCompletionRequestMessageContentPartImage IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalChatCompletionRequestMessageContentPartImage)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalChatCompletionRequestMessageContentPartImage(document.RootElement, options); + } + + internal static InternalChatCompletionRequestMessageContentPartImage DeserializeInternalChatCompletionRequestMessageContentPartImage(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + InternalChatCompletionRequestMessageContentPartImageType type = default; + InternalChatCompletionRequestMessageContentPartImageImageUrl imageUrl = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("type"u8)) + { + type = new InternalChatCompletionRequestMessageContentPartImageType(property.Value.GetString()); + continue; + } + if (property.NameEquals("image_url"u8)) + { + imageUrl = InternalChatCompletionRequestMessageContentPartImageImageUrl.DeserializeInternalChatCompletionRequestMessageContentPartImageImageUrl(property.Value, options); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalChatCompletionRequestMessageContentPartImage(type, imageUrl, 
serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalChatCompletionRequestMessageContentPartImage)} does not support writing '{options.Format}' format."); + } + } + + InternalChatCompletionRequestMessageContentPartImage IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalChatCompletionRequestMessageContentPartImage(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalChatCompletionRequestMessageContentPartImage)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalChatCompletionRequestMessageContentPartImage FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalChatCompletionRequestMessageContentPartImage(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalChatCompletionRequestMessageContentPartImage.cs b/.dotnet/src/Generated/Models/InternalChatCompletionRequestMessageContentPartImage.cs new file mode 100644 index 000000000..97fb55045 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalChatCompletionRequestMessageContentPartImage.cs @@ -0,0 +1,35 @@ +// + 
+#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Chat +{ + internal partial class InternalChatCompletionRequestMessageContentPartImage + { + internal IDictionary SerializedAdditionalRawData { get; set; } + public InternalChatCompletionRequestMessageContentPartImage(InternalChatCompletionRequestMessageContentPartImageImageUrl imageUrl) + { + Argument.AssertNotNull(imageUrl, nameof(imageUrl)); + + ImageUrl = imageUrl; + } + + internal InternalChatCompletionRequestMessageContentPartImage(InternalChatCompletionRequestMessageContentPartImageType type, InternalChatCompletionRequestMessageContentPartImageImageUrl imageUrl, IDictionary serializedAdditionalRawData) + { + Type = type; + ImageUrl = imageUrl; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal InternalChatCompletionRequestMessageContentPartImage() + { + } + + public InternalChatCompletionRequestMessageContentPartImageType Type { get; } = InternalChatCompletionRequestMessageContentPartImageType.ImageUrl; + + public InternalChatCompletionRequestMessageContentPartImageImageUrl ImageUrl { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalChatCompletionRequestMessageContentPartImageImageUrl.Serialization.cs b/.dotnet/src/Generated/Models/InternalChatCompletionRequestMessageContentPartImageImageUrl.Serialization.cs new file mode 100644 index 000000000..f094173d0 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalChatCompletionRequestMessageContentPartImageImageUrl.Serialization.cs @@ -0,0 +1,148 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Chat +{ + internal partial class InternalChatCompletionRequestMessageContentPartImageImageUrl : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalChatCompletionRequestMessageContentPartImageImageUrl)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("url") != true) + { + writer.WritePropertyName("url"u8); + writer.WriteStringValue(Url); + } + if (SerializedAdditionalRawData?.ContainsKey("detail") != true && Optional.IsDefined(Detail)) + { + writer.WritePropertyName("detail"u8); + writer.WriteStringValue(Detail.Value.ToString()); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalChatCompletionRequestMessageContentPartImageImageUrl IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalChatCompletionRequestMessageContentPartImageImageUrl)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalChatCompletionRequestMessageContentPartImageImageUrl(document.RootElement, options); + } + + internal static InternalChatCompletionRequestMessageContentPartImageImageUrl DeserializeInternalChatCompletionRequestMessageContentPartImageImageUrl(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string url = default; + ImageChatMessageContentPartDetail? detail = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("url"u8)) + { + url = property.Value.GetString(); + continue; + } + if (property.NameEquals("detail"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + detail = new ImageChatMessageContentPartDetail(property.Value.GetString()); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalChatCompletionRequestMessageContentPartImageImageUrl(url, detail, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalChatCompletionRequestMessageContentPartImageImageUrl)} does not support writing '{options.Format}' format."); + } + } + + InternalChatCompletionRequestMessageContentPartImageImageUrl IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalChatCompletionRequestMessageContentPartImageImageUrl(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalChatCompletionRequestMessageContentPartImageImageUrl)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalChatCompletionRequestMessageContentPartImageImageUrl FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalChatCompletionRequestMessageContentPartImageImageUrl(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalChatCompletionRequestMessageContentPartImageImageUrl.cs b/.dotnet/src/Generated/Models/InternalChatCompletionRequestMessageContentPartImageImageUrl.cs new file mode 100644 index 000000000..1ff5dc332 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalChatCompletionRequestMessageContentPartImageImageUrl.cs @@ -0,0 +1,19 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + 
+namespace OpenAI.Chat +{ + internal partial class InternalChatCompletionRequestMessageContentPartImageImageUrl + { + internal IDictionary SerializedAdditionalRawData { get; set; } + + internal InternalChatCompletionRequestMessageContentPartImageImageUrl() + { + } + public ImageChatMessageContentPartDetail? Detail { get; set; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalChatCompletionRequestMessageContentPartImageType.cs b/.dotnet/src/Generated/Models/InternalChatCompletionRequestMessageContentPartImageType.cs new file mode 100644 index 000000000..2a1100017 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalChatCompletionRequestMessageContentPartImageType.cs @@ -0,0 +1,34 @@ +// + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Chat +{ + internal readonly partial struct InternalChatCompletionRequestMessageContentPartImageType : IEquatable + { + private readonly string _value; + + public InternalChatCompletionRequestMessageContentPartImageType(string value) + { + _value = value ?? 
throw new ArgumentNullException(nameof(value)); + } + + private const string ImageUrlValue = "image_url"; + + public static InternalChatCompletionRequestMessageContentPartImageType ImageUrl { get; } = new InternalChatCompletionRequestMessageContentPartImageType(ImageUrlValue); + public static bool operator ==(InternalChatCompletionRequestMessageContentPartImageType left, InternalChatCompletionRequestMessageContentPartImageType right) => left.Equals(right); + public static bool operator !=(InternalChatCompletionRequestMessageContentPartImageType left, InternalChatCompletionRequestMessageContentPartImageType right) => !left.Equals(right); + public static implicit operator InternalChatCompletionRequestMessageContentPartImageType(string value) => new InternalChatCompletionRequestMessageContentPartImageType(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalChatCompletionRequestMessageContentPartImageType other && Equals(other); + public bool Equals(InternalChatCompletionRequestMessageContentPartImageType other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? 
StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalChatCompletionRequestMessageContentPartRefusal.Serialization.cs b/.dotnet/src/Generated/Models/InternalChatCompletionRequestMessageContentPartRefusal.Serialization.cs new file mode 100644 index 000000000..093b83c78 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalChatCompletionRequestMessageContentPartRefusal.Serialization.cs @@ -0,0 +1,144 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Chat +{ + internal partial class InternalChatCompletionRequestMessageContentPartRefusal : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalChatCompletionRequestMessageContentPartRefusal)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type.ToString()); + } + if (SerializedAdditionalRawData?.ContainsKey("refusal") != true) + { + writer.WritePropertyName("refusal"u8); + writer.WriteStringValue(Refusal); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + 
} + + InternalChatCompletionRequestMessageContentPartRefusal IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalChatCompletionRequestMessageContentPartRefusal)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalChatCompletionRequestMessageContentPartRefusal(document.RootElement, options); + } + + internal static InternalChatCompletionRequestMessageContentPartRefusal DeserializeInternalChatCompletionRequestMessageContentPartRefusal(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + InternalChatCompletionRequestMessageContentPartRefusalType type = default; + string refusal = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("type"u8)) + { + type = new InternalChatCompletionRequestMessageContentPartRefusalType(property.Value.GetString()); + continue; + } + if (property.NameEquals("refusal"u8)) + { + refusal = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalChatCompletionRequestMessageContentPartRefusal(type, refusal, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalChatCompletionRequestMessageContentPartRefusal)} does not support writing '{options.Format}' format."); + } + } + + InternalChatCompletionRequestMessageContentPartRefusal IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalChatCompletionRequestMessageContentPartRefusal(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalChatCompletionRequestMessageContentPartRefusal)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalChatCompletionRequestMessageContentPartRefusal FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalChatCompletionRequestMessageContentPartRefusal(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalChatCompletionRequestMessageContentPartRefusal.cs b/.dotnet/src/Generated/Models/InternalChatCompletionRequestMessageContentPartRefusal.cs new file mode 100644 index 000000000..ea0eaa097 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalChatCompletionRequestMessageContentPartRefusal.cs @@ -0,0 +1,35 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Chat +{ + internal partial class 
InternalChatCompletionRequestMessageContentPartRefusal + { + internal IDictionary SerializedAdditionalRawData { get; set; } + public InternalChatCompletionRequestMessageContentPartRefusal(string refusal) + { + Argument.AssertNotNull(refusal, nameof(refusal)); + + Refusal = refusal; + } + + internal InternalChatCompletionRequestMessageContentPartRefusal(InternalChatCompletionRequestMessageContentPartRefusalType type, string refusal, IDictionary serializedAdditionalRawData) + { + Type = type; + Refusal = refusal; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal InternalChatCompletionRequestMessageContentPartRefusal() + { + } + + public InternalChatCompletionRequestMessageContentPartRefusalType Type { get; } = InternalChatCompletionRequestMessageContentPartRefusalType.Refusal; + + public string Refusal { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalChatCompletionRequestMessageContentPartRefusalType.cs b/.dotnet/src/Generated/Models/InternalChatCompletionRequestMessageContentPartRefusalType.cs new file mode 100644 index 000000000..d1c3e8498 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalChatCompletionRequestMessageContentPartRefusalType.cs @@ -0,0 +1,34 @@ +// + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Chat +{ + internal readonly partial struct InternalChatCompletionRequestMessageContentPartRefusalType : IEquatable + { + private readonly string _value; + + public InternalChatCompletionRequestMessageContentPartRefusalType(string value) + { + _value = value ?? 
throw new ArgumentNullException(nameof(value)); + } + + private const string RefusalValue = "refusal"; + + public static InternalChatCompletionRequestMessageContentPartRefusalType Refusal { get; } = new InternalChatCompletionRequestMessageContentPartRefusalType(RefusalValue); + public static bool operator ==(InternalChatCompletionRequestMessageContentPartRefusalType left, InternalChatCompletionRequestMessageContentPartRefusalType right) => left.Equals(right); + public static bool operator !=(InternalChatCompletionRequestMessageContentPartRefusalType left, InternalChatCompletionRequestMessageContentPartRefusalType right) => !left.Equals(right); + public static implicit operator InternalChatCompletionRequestMessageContentPartRefusalType(string value) => new InternalChatCompletionRequestMessageContentPartRefusalType(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalChatCompletionRequestMessageContentPartRefusalType other && Equals(other); + public bool Equals(InternalChatCompletionRequestMessageContentPartRefusalType other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? 
StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalChatCompletionRequestMessageContentPartText.Serialization.cs b/.dotnet/src/Generated/Models/InternalChatCompletionRequestMessageContentPartText.Serialization.cs new file mode 100644 index 000000000..bec3c465f --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalChatCompletionRequestMessageContentPartText.Serialization.cs @@ -0,0 +1,144 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Chat +{ + internal partial class InternalChatCompletionRequestMessageContentPartText : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalChatCompletionRequestMessageContentPartText)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type.ToString()); + } + if (SerializedAdditionalRawData?.ContainsKey("text") != true) + { + writer.WritePropertyName("text"u8); + writer.WriteStringValue(Text); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + 
InternalChatCompletionRequestMessageContentPartText IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalChatCompletionRequestMessageContentPartText)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalChatCompletionRequestMessageContentPartText(document.RootElement, options); + } + + internal static InternalChatCompletionRequestMessageContentPartText DeserializeInternalChatCompletionRequestMessageContentPartText(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + InternalChatCompletionRequestMessageContentPartTextType type = default; + string text = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("type"u8)) + { + type = new InternalChatCompletionRequestMessageContentPartTextType(property.Value.GetString()); + continue; + } + if (property.NameEquals("text"u8)) + { + text = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalChatCompletionRequestMessageContentPartText(type, text, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalChatCompletionRequestMessageContentPartText)} does not support writing '{options.Format}' format."); + } + } + + InternalChatCompletionRequestMessageContentPartText IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalChatCompletionRequestMessageContentPartText(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalChatCompletionRequestMessageContentPartText)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalChatCompletionRequestMessageContentPartText FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalChatCompletionRequestMessageContentPartText(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalChatCompletionRequestMessageContentPartText.cs b/.dotnet/src/Generated/Models/InternalChatCompletionRequestMessageContentPartText.cs new file mode 100644 index 000000000..09645f628 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalChatCompletionRequestMessageContentPartText.cs @@ -0,0 +1,35 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Chat +{ + internal partial class 
InternalChatCompletionRequestMessageContentPartText + { + internal IDictionary SerializedAdditionalRawData { get; set; } + public InternalChatCompletionRequestMessageContentPartText(string text) + { + Argument.AssertNotNull(text, nameof(text)); + + Text = text; + } + + internal InternalChatCompletionRequestMessageContentPartText(InternalChatCompletionRequestMessageContentPartTextType type, string text, IDictionary serializedAdditionalRawData) + { + Type = type; + Text = text; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal InternalChatCompletionRequestMessageContentPartText() + { + } + + public InternalChatCompletionRequestMessageContentPartTextType Type { get; } = InternalChatCompletionRequestMessageContentPartTextType.Text; + + public string Text { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalChatCompletionRequestMessageContentPartTextType.cs b/.dotnet/src/Generated/Models/InternalChatCompletionRequestMessageContentPartTextType.cs new file mode 100644 index 000000000..8c58795bb --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalChatCompletionRequestMessageContentPartTextType.cs @@ -0,0 +1,34 @@ +// + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Chat +{ + internal readonly partial struct InternalChatCompletionRequestMessageContentPartTextType : IEquatable + { + private readonly string _value; + + public InternalChatCompletionRequestMessageContentPartTextType(string value) + { + _value = value ?? 
throw new ArgumentNullException(nameof(value)); + } + + private const string TextValue = "text"; + + public static InternalChatCompletionRequestMessageContentPartTextType Text { get; } = new InternalChatCompletionRequestMessageContentPartTextType(TextValue); + public static bool operator ==(InternalChatCompletionRequestMessageContentPartTextType left, InternalChatCompletionRequestMessageContentPartTextType right) => left.Equals(right); + public static bool operator !=(InternalChatCompletionRequestMessageContentPartTextType left, InternalChatCompletionRequestMessageContentPartTextType right) => !left.Equals(right); + public static implicit operator InternalChatCompletionRequestMessageContentPartTextType(string value) => new InternalChatCompletionRequestMessageContentPartTextType(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalChatCompletionRequestMessageContentPartTextType other && Equals(other); + public bool Equals(InternalChatCompletionRequestMessageContentPartTextType other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? 
StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalChatCompletionResponseMessage.Serialization.cs b/.dotnet/src/Generated/Models/InternalChatCompletionResponseMessage.Serialization.cs new file mode 100644 index 000000000..637df497d --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalChatCompletionResponseMessage.Serialization.cs @@ -0,0 +1,68 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Text.Json; + +namespace OpenAI.Chat +{ + internal partial class InternalChatCompletionResponseMessage : IJsonModel<InternalChatCompletionResponseMessage> + { + InternalChatCompletionResponseMessage IJsonModel<InternalChatCompletionResponseMessage>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalChatCompletionResponseMessage>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalChatCompletionResponseMessage)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalChatCompletionResponseMessage(document.RootElement, options); + } + + BinaryData IPersistableModel<InternalChatCompletionResponseMessage>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalChatCompletionResponseMessage>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalChatCompletionResponseMessage)} does not support writing '{options.Format}' format."); + } + } + + InternalChatCompletionResponseMessage IPersistableModel<InternalChatCompletionResponseMessage>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ?
((IPersistableModel<InternalChatCompletionResponseMessage>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalChatCompletionResponseMessage(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalChatCompletionResponseMessage)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalChatCompletionResponseMessage>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalChatCompletionResponseMessage FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalChatCompletionResponseMessage(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalChatCompletionResponseMessage.cs b/.dotnet/src/Generated/Models/InternalChatCompletionResponseMessage.cs new file mode 100644 index 000000000..f29262217 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalChatCompletionResponseMessage.cs @@ -0,0 +1,37 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; +using System.Linq; + +namespace OpenAI.Chat +{ + internal partial class InternalChatCompletionResponseMessage + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + internal InternalChatCompletionResponseMessage(IEnumerable<ChatMessageContentPart> content, string refusal) + { + Content = content?.ToList(); + Refusal = refusal; + ToolCalls = new ChangeTrackingList<ChatToolCall>(); + } + + internal InternalChatCompletionResponseMessage(IReadOnlyList<ChatMessageContentPart> content, string refusal, IReadOnlyList<ChatToolCall> toolCalls, ChatMessageRole role, ChatFunctionCall functionCall, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + Content = content; + Refusal = refusal; + ToolCalls = toolCalls; + Role = role; + FunctionCall = functionCall; +
SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal InternalChatCompletionResponseMessage() + { + } + public string Refusal { get; } + public IReadOnlyList<ChatToolCall> ToolCalls { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalChatCompletionResponseMessageFunctionCall.Serialization.cs b/.dotnet/src/Generated/Models/InternalChatCompletionResponseMessageFunctionCall.Serialization.cs new file mode 100644 index 000000000..6cb2512cd --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalChatCompletionResponseMessageFunctionCall.Serialization.cs @@ -0,0 +1,144 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Chat +{ + internal partial class InternalChatCompletionResponseMessageFunctionCall : IJsonModel<InternalChatCompletionResponseMessageFunctionCall> + { + void IJsonModel<InternalChatCompletionResponseMessageFunctionCall>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ?
((IPersistableModel<InternalChatCompletionResponseMessageFunctionCall>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalChatCompletionResponseMessageFunctionCall)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("arguments") != true) + { + writer.WritePropertyName("arguments"u8); + writer.WriteStringValue(Arguments); + } + if (SerializedAdditionalRawData?.ContainsKey("name") != true) + { + writer.WritePropertyName("name"u8); + writer.WriteStringValue(Name); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalChatCompletionResponseMessageFunctionCall IJsonModel<InternalChatCompletionResponseMessageFunctionCall>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ?
((IPersistableModel<InternalChatCompletionResponseMessageFunctionCall>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalChatCompletionResponseMessageFunctionCall)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalChatCompletionResponseMessageFunctionCall(document.RootElement, options); + } + + internal static InternalChatCompletionResponseMessageFunctionCall DeserializeInternalChatCompletionResponseMessageFunctionCall(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string arguments = default; + string name = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("arguments"u8)) + { + arguments = property.Value.GetString(); + continue; + } + if (property.NameEquals("name"u8)) + { + name = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalChatCompletionResponseMessageFunctionCall(arguments, name, serializedAdditionalRawData); + } + + BinaryData IPersistableModel<InternalChatCompletionResponseMessageFunctionCall>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ?
((IPersistableModel<InternalChatCompletionResponseMessageFunctionCall>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalChatCompletionResponseMessageFunctionCall)} does not support writing '{options.Format}' format."); + } + } + + InternalChatCompletionResponseMessageFunctionCall IPersistableModel<InternalChatCompletionResponseMessageFunctionCall>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalChatCompletionResponseMessageFunctionCall>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalChatCompletionResponseMessageFunctionCall(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalChatCompletionResponseMessageFunctionCall)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalChatCompletionResponseMessageFunctionCall>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalChatCompletionResponseMessageFunctionCall FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalChatCompletionResponseMessageFunctionCall(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalChatCompletionResponseMessageFunctionCall.cs b/.dotnet/src/Generated/Models/InternalChatCompletionResponseMessageFunctionCall.cs new file mode 100644 index 000000000..6a070987f --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalChatCompletionResponseMessageFunctionCall.cs @@ -0,0 +1,36 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Chat +{ + internal partial class InternalChatCompletionResponseMessageFunctionCall + {
+ internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + internal InternalChatCompletionResponseMessageFunctionCall(string arguments, string name) + { + Argument.AssertNotNull(arguments, nameof(arguments)); + Argument.AssertNotNull(name, nameof(name)); + + Arguments = arguments; + Name = name; + } + + internal InternalChatCompletionResponseMessageFunctionCall(string arguments, string name, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + Arguments = arguments; + Name = name; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal InternalChatCompletionResponseMessageFunctionCall() + { + } + + public string Arguments { get; } + public string Name { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalChatCompletionResponseMessageRole.cs b/.dotnet/src/Generated/Models/InternalChatCompletionResponseMessageRole.cs new file mode 100644 index 000000000..0794f1019 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalChatCompletionResponseMessageRole.cs @@ -0,0 +1,34 @@ +// + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Chat +{ + internal readonly partial struct InternalChatCompletionResponseMessageRole : IEquatable<InternalChatCompletionResponseMessageRole> + { + private readonly string _value; + + public InternalChatCompletionResponseMessageRole(string value) + { + _value = value ??
throw new ArgumentNullException(nameof(value)); + } + + private const string AssistantValue = "assistant"; + + public static InternalChatCompletionResponseMessageRole Assistant { get; } = new InternalChatCompletionResponseMessageRole(AssistantValue); + public static bool operator ==(InternalChatCompletionResponseMessageRole left, InternalChatCompletionResponseMessageRole right) => left.Equals(right); + public static bool operator !=(InternalChatCompletionResponseMessageRole left, InternalChatCompletionResponseMessageRole right) => !left.Equals(right); + public static implicit operator InternalChatCompletionResponseMessageRole(string value) => new InternalChatCompletionResponseMessageRole(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalChatCompletionResponseMessageRole other && Equals(other); + public bool Equals(InternalChatCompletionResponseMessageRole other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalChatCompletionStreamOptions.Serialization.cs b/.dotnet/src/Generated/Models/InternalChatCompletionStreamOptions.Serialization.cs new file mode 100644 index 000000000..bd3f2eb23 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalChatCompletionStreamOptions.Serialization.cs @@ -0,0 +1,137 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Chat +{ + internal partial class InternalChatCompletionStreamOptions : IJsonModel<InternalChatCompletionStreamOptions> + { + void IJsonModel<InternalChatCompletionStreamOptions>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ?
((IPersistableModel<InternalChatCompletionStreamOptions>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalChatCompletionStreamOptions)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("include_usage") != true && Optional.IsDefined(IncludeUsage)) + { + writer.WritePropertyName("include_usage"u8); + writer.WriteBooleanValue(IncludeUsage.Value); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalChatCompletionStreamOptions IJsonModel<InternalChatCompletionStreamOptions>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalChatCompletionStreamOptions>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalChatCompletionStreamOptions)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalChatCompletionStreamOptions(document.RootElement, options); + } + + internal static InternalChatCompletionStreamOptions DeserializeInternalChatCompletionStreamOptions(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + bool?
includeUsage = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("include_usage"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + includeUsage = property.Value.GetBoolean(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalChatCompletionStreamOptions(includeUsage, serializedAdditionalRawData); + } + + BinaryData IPersistableModel<InternalChatCompletionStreamOptions>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalChatCompletionStreamOptions>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalChatCompletionStreamOptions)} does not support writing '{options.Format}' format."); + } + } + + InternalChatCompletionStreamOptions IPersistableModel<InternalChatCompletionStreamOptions>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ?
((IPersistableModel<InternalChatCompletionStreamOptions>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalChatCompletionStreamOptions(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalChatCompletionStreamOptions)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalChatCompletionStreamOptions>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalChatCompletionStreamOptions FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalChatCompletionStreamOptions(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalChatCompletionStreamOptions.cs b/.dotnet/src/Generated/Models/InternalChatCompletionStreamOptions.cs new file mode 100644 index 000000000..65d1cc3a1 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalChatCompletionStreamOptions.cs @@ -0,0 +1,25 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Chat +{ + internal partial class InternalChatCompletionStreamOptions + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + public InternalChatCompletionStreamOptions() + { + } + + internal InternalChatCompletionStreamOptions(bool? includeUsage, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + IncludeUsage = includeUsage; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + public bool?
IncludeUsage { get; set; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalChatCompletionStreamResponseDelta.Serialization.cs b/.dotnet/src/Generated/Models/InternalChatCompletionStreamResponseDelta.Serialization.cs new file mode 100644 index 000000000..2a58b56b2 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalChatCompletionStreamResponseDelta.Serialization.cs @@ -0,0 +1,68 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Text.Json; + +namespace OpenAI.Chat +{ + internal partial class InternalChatCompletionStreamResponseDelta : IJsonModel<InternalChatCompletionStreamResponseDelta> + { + InternalChatCompletionStreamResponseDelta IJsonModel<InternalChatCompletionStreamResponseDelta>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalChatCompletionStreamResponseDelta>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalChatCompletionStreamResponseDelta)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalChatCompletionStreamResponseDelta(document.RootElement, options); + } + + BinaryData IPersistableModel<InternalChatCompletionStreamResponseDelta>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalChatCompletionStreamResponseDelta>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalChatCompletionStreamResponseDelta)} does not support writing '{options.Format}' format."); + } + } + + InternalChatCompletionStreamResponseDelta IPersistableModel<InternalChatCompletionStreamResponseDelta>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ?
((IPersistableModel<InternalChatCompletionStreamResponseDelta>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalChatCompletionStreamResponseDelta(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalChatCompletionStreamResponseDelta)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalChatCompletionStreamResponseDelta>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalChatCompletionStreamResponseDelta FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalChatCompletionStreamResponseDelta(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalChatCompletionStreamResponseDelta.cs b/.dotnet/src/Generated/Models/InternalChatCompletionStreamResponseDelta.cs new file mode 100644 index 000000000..9bdd14f0a --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalChatCompletionStreamResponseDelta.cs @@ -0,0 +1,27 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Chat +{ + internal partial class InternalChatCompletionStreamResponseDelta + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + + internal InternalChatCompletionStreamResponseDelta(IReadOnlyList<ChatMessageContentPart> content, StreamingChatFunctionCallUpdate functionCall, IReadOnlyList<StreamingChatToolCallUpdate> toolCalls, ChatMessageRole?
role, string refusal, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + Content = content; + FunctionCall = functionCall; + ToolCalls = toolCalls; + Role = role; + Refusal = refusal; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + public StreamingChatFunctionCallUpdate FunctionCall { get; } + public IReadOnlyList<StreamingChatToolCallUpdate> ToolCalls { get; } + public string Refusal { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalChatCompletionStreamResponseDeltaRole.cs b/.dotnet/src/Generated/Models/InternalChatCompletionStreamResponseDeltaRole.cs new file mode 100644 index 000000000..07df77e42 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalChatCompletionStreamResponseDeltaRole.cs @@ -0,0 +1,40 @@ +// + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Chat +{ + internal readonly partial struct InternalChatCompletionStreamResponseDeltaRole : IEquatable<InternalChatCompletionStreamResponseDeltaRole> + { + private readonly string _value; + + public InternalChatCompletionStreamResponseDeltaRole(string value) + { + _value = value ??
throw new ArgumentNullException(nameof(value)); + } + + private const string SystemValue = "system"; + private const string UserValue = "user"; + private const string AssistantValue = "assistant"; + private const string ToolValue = "tool"; + + public static InternalChatCompletionStreamResponseDeltaRole System { get; } = new InternalChatCompletionStreamResponseDeltaRole(SystemValue); + public static InternalChatCompletionStreamResponseDeltaRole User { get; } = new InternalChatCompletionStreamResponseDeltaRole(UserValue); + public static InternalChatCompletionStreamResponseDeltaRole Assistant { get; } = new InternalChatCompletionStreamResponseDeltaRole(AssistantValue); + public static InternalChatCompletionStreamResponseDeltaRole Tool { get; } = new InternalChatCompletionStreamResponseDeltaRole(ToolValue); + public static bool operator ==(InternalChatCompletionStreamResponseDeltaRole left, InternalChatCompletionStreamResponseDeltaRole right) => left.Equals(right); + public static bool operator !=(InternalChatCompletionStreamResponseDeltaRole left, InternalChatCompletionStreamResponseDeltaRole right) => !left.Equals(right); + public static implicit operator InternalChatCompletionStreamResponseDeltaRole(string value) => new InternalChatCompletionStreamResponseDeltaRole(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalChatCompletionStreamResponseDeltaRole other && Equals(other); + public bool Equals(InternalChatCompletionStreamResponseDeltaRole other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? 
StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalChatResponseFormatJsonObject.Serialization.cs b/.dotnet/src/Generated/Models/InternalChatResponseFormatJsonObject.Serialization.cs new file mode 100644 index 000000000..9c63f887b --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalChatResponseFormatJsonObject.Serialization.cs @@ -0,0 +1,133 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Chat +{ + internal partial class InternalChatResponseFormatJsonObject : IJsonModel<InternalChatResponseFormatJsonObject> + { + void IJsonModel<InternalChatResponseFormatJsonObject>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalChatResponseFormatJsonObject>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalChatResponseFormatJsonObject)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalChatResponseFormatJsonObject IJsonModel<InternalChatResponseFormatJsonObject>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ?
((IPersistableModel<InternalChatResponseFormatJsonObject>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalChatResponseFormatJsonObject)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalChatResponseFormatJsonObject(document.RootElement, options); + } + + internal static InternalChatResponseFormatJsonObject DeserializeInternalChatResponseFormatJsonObject(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string type = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalChatResponseFormatJsonObject(type, serializedAdditionalRawData); + } + + BinaryData IPersistableModel<InternalChatResponseFormatJsonObject>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalChatResponseFormatJsonObject>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalChatResponseFormatJsonObject)} does not support writing '{options.Format}' format."); + } + } + + InternalChatResponseFormatJsonObject IPersistableModel<InternalChatResponseFormatJsonObject>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ?
((IPersistableModel<InternalChatResponseFormatJsonObject>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalChatResponseFormatJsonObject(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalChatResponseFormatJsonObject)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalChatResponseFormatJsonObject>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static new InternalChatResponseFormatJsonObject FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalChatResponseFormatJsonObject(document.RootElement); + } + + internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalChatResponseFormatJsonObject.cs b/.dotnet/src/Generated/Models/InternalChatResponseFormatJsonObject.cs new file mode 100644 index 000000000..a3d489404 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalChatResponseFormatJsonObject.cs @@ -0,0 +1,21 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Chat +{ + internal partial class InternalChatResponseFormatJsonObject : ChatResponseFormat + { + public InternalChatResponseFormatJsonObject() + { + Type = "json_object"; + } + + internal InternalChatResponseFormatJsonObject(string type, IDictionary<string, BinaryData> serializedAdditionalRawData) : base(type, serializedAdditionalRawData) + { + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalChatResponseFormatJsonSchema.Serialization.cs b/.dotnet/src/Generated/Models/InternalChatResponseFormatJsonSchema.Serialization.cs new file mode 100644 index 000000000..c5c0cca64 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalChatResponseFormatJsonSchema.Serialization.cs @@ -0,0
+1,145 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; +using OpenAI.Internal; + +namespace OpenAI.Chat +{ + internal partial class InternalChatResponseFormatJsonSchema : IJsonModel<InternalChatResponseFormatJsonSchema> + { + void IJsonModel<InternalChatResponseFormatJsonSchema>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalChatResponseFormatJsonSchema>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalChatResponseFormatJsonSchema)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("json_schema") != true) + { + writer.WritePropertyName("json_schema"u8); + writer.WriteObjectValue(JsonSchema, options); + } + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalChatResponseFormatJsonSchema IJsonModel<InternalChatResponseFormatJsonSchema>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ?
((IPersistableModel<InternalChatResponseFormatJsonSchema>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalChatResponseFormatJsonSchema)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalChatResponseFormatJsonSchema(document.RootElement, options); + } + + internal static InternalChatResponseFormatJsonSchema DeserializeInternalChatResponseFormatJsonSchema(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + InternalResponseFormatJsonSchemaJsonSchema jsonSchema = default; + string type = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("json_schema"u8)) + { + jsonSchema = InternalResponseFormatJsonSchemaJsonSchema.DeserializeInternalResponseFormatJsonSchemaJsonSchema(property.Value, options); + continue; + } + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalChatResponseFormatJsonSchema(type, serializedAdditionalRawData, jsonSchema); + } + + BinaryData IPersistableModel<InternalChatResponseFormatJsonSchema>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ?
((IPersistableModel<InternalChatResponseFormatJsonSchema>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalChatResponseFormatJsonSchema)} does not support writing '{options.Format}' format."); + } + } + + InternalChatResponseFormatJsonSchema IPersistableModel<InternalChatResponseFormatJsonSchema>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalChatResponseFormatJsonSchema>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalChatResponseFormatJsonSchema(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalChatResponseFormatJsonSchema)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalChatResponseFormatJsonSchema>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static new InternalChatResponseFormatJsonSchema FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalChatResponseFormatJsonSchema(document.RootElement); + } + + internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalChatResponseFormatJsonSchema.cs b/.dotnet/src/Generated/Models/InternalChatResponseFormatJsonSchema.cs new file mode 100644 index 000000000..d519acb14 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalChatResponseFormatJsonSchema.cs @@ -0,0 +1,32 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; +using OpenAI.Internal; + +namespace OpenAI.Chat +{ + internal partial class InternalChatResponseFormatJsonSchema : ChatResponseFormat + { + public
InternalChatResponseFormatJsonSchema(InternalResponseFormatJsonSchemaJsonSchema jsonSchema) + { + Argument.AssertNotNull(jsonSchema, nameof(jsonSchema)); + + Type = "json_schema"; + JsonSchema = jsonSchema; + } + + internal InternalChatResponseFormatJsonSchema(string type, IDictionary<string, BinaryData> serializedAdditionalRawData, InternalResponseFormatJsonSchemaJsonSchema jsonSchema) : base(type, serializedAdditionalRawData) + { + JsonSchema = jsonSchema; + } + + internal InternalChatResponseFormatJsonSchema() + { + } + + public InternalResponseFormatJsonSchemaJsonSchema JsonSchema { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalChatResponseFormatText.Serialization.cs b/.dotnet/src/Generated/Models/InternalChatResponseFormatText.Serialization.cs new file mode 100644 index 000000000..c0f373c9d --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalChatResponseFormatText.Serialization.cs @@ -0,0 +1,133 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Chat +{ + internal partial class InternalChatResponseFormatText : IJsonModel<InternalChatResponseFormatText> + { + void IJsonModel<InternalChatResponseFormatText>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ?
((IPersistableModel<InternalChatResponseFormatText>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalChatResponseFormatText)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalChatResponseFormatText IJsonModel<InternalChatResponseFormatText>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ?
((IPersistableModel<InternalChatResponseFormatText>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalChatResponseFormatText)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalChatResponseFormatText(document.RootElement, options); + } + + internal static InternalChatResponseFormatText DeserializeInternalChatResponseFormatText(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string type = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalChatResponseFormatText(type, serializedAdditionalRawData); + } + + BinaryData IPersistableModel<InternalChatResponseFormatText>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalChatResponseFormatText>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalChatResponseFormatText)} does not support writing '{options.Format}' format."); + } + } + + InternalChatResponseFormatText IPersistableModel<InternalChatResponseFormatText>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ?
((IPersistableModel<InternalChatResponseFormatText>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalChatResponseFormatText(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalChatResponseFormatText)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalChatResponseFormatText>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static new InternalChatResponseFormatText FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalChatResponseFormatText(document.RootElement); + } + + internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalChatResponseFormatText.cs b/.dotnet/src/Generated/Models/InternalChatResponseFormatText.cs new file mode 100644 index 000000000..49e7771f0 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalChatResponseFormatText.cs @@ -0,0 +1,21 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Chat +{ + internal partial class InternalChatResponseFormatText : ChatResponseFormat + { + public InternalChatResponseFormatText() + { + Type = "text"; + } + + internal InternalChatResponseFormatText(string type, IDictionary<string, BinaryData> serializedAdditionalRawData) : base(type, serializedAdditionalRawData) + { + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalCompleteUploadRequest.Serialization.cs b/.dotnet/src/Generated/Models/InternalCompleteUploadRequest.Serialization.cs new file mode 100644 index 000000000..30ee85a04 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCompleteUploadRequest.Serialization.cs @@ -0,0 +1,154 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using
System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Files +{ + internal partial class InternalCompleteUploadRequest : IJsonModel<InternalCompleteUploadRequest> + { + void IJsonModel<InternalCompleteUploadRequest>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalCompleteUploadRequest>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalCompleteUploadRequest)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("part_ids") != true) + { + writer.WritePropertyName("part_ids"u8); + writer.WriteStartArray(); + foreach (var item in PartIds) + { + writer.WriteStringValue(item); + } + writer.WriteEndArray(); + } + if (SerializedAdditionalRawData?.ContainsKey("md5") != true && Optional.IsDefined(Md5)) + { + writer.WritePropertyName("md5"u8); + writer.WriteStringValue(Md5); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalCompleteUploadRequest IJsonModel<InternalCompleteUploadRequest>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ?
((IPersistableModel<InternalCompleteUploadRequest>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalCompleteUploadRequest)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalCompleteUploadRequest(document.RootElement, options); + } + + internal static InternalCompleteUploadRequest DeserializeInternalCompleteUploadRequest(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + IList<string> partIds = default; + string md5 = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("part_ids"u8)) + { + List<string> array = new List<string>(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(item.GetString()); + } + partIds = array; + continue; + } + if (property.NameEquals("md5"u8)) + { + md5 = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalCompleteUploadRequest(partIds, md5, serializedAdditionalRawData); + } + + BinaryData IPersistableModel<InternalCompleteUploadRequest>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ?
((IPersistableModel<InternalCompleteUploadRequest>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalCompleteUploadRequest)} does not support writing '{options.Format}' format."); + } + } + + InternalCompleteUploadRequest IPersistableModel<InternalCompleteUploadRequest>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalCompleteUploadRequest>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalCompleteUploadRequest(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalCompleteUploadRequest)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalCompleteUploadRequest>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalCompleteUploadRequest FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalCompleteUploadRequest(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalCompleteUploadRequest.cs b/.dotnet/src/Generated/Models/InternalCompleteUploadRequest.cs new file mode 100644 index 000000000..74d4d5346 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCompleteUploadRequest.cs @@ -0,0 +1,35 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; +using System.Linq; + +namespace OpenAI.Files +{ + internal partial class InternalCompleteUploadRequest + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + public InternalCompleteUploadRequest(IEnumerable<string> partIds) + { + Argument.AssertNotNull(partIds,
nameof(partIds)); + + PartIds = partIds.ToList(); + } + + internal InternalCompleteUploadRequest(IList<string> partIds, string md5, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + PartIds = partIds; + Md5 = md5; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal InternalCompleteUploadRequest() + { + } + + public IList<string> PartIds { get; } + public string Md5 { get; set; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateAssistantRequestModel.cs b/.dotnet/src/Generated/Models/InternalCreateAssistantRequestModel.cs new file mode 100644 index 000000000..dff777d4f --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateAssistantRequestModel.cs @@ -0,0 +1,78 @@ +// + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Assistants +{ + internal readonly partial struct InternalCreateAssistantRequestModel : IEquatable<InternalCreateAssistantRequestModel> + { + private readonly string _value; + + public InternalCreateAssistantRequestModel(string value) + { + _value = value ??
throw new ArgumentNullException(nameof(value)); + } + + private const string Gpt4oValue = "gpt-4o"; + private const string Gpt4o20240806Value = "gpt-4o-2024-08-06"; + private const string Gpt4o20240513Value = "gpt-4o-2024-05-13"; + private const string Gpt4oMiniValue = "gpt-4o-mini"; + private const string Gpt4oMini20240718Value = "gpt-4o-mini-2024-07-18"; + private const string Gpt4TurboValue = "gpt-4-turbo"; + private const string Gpt4Turbo20240409Value = "gpt-4-turbo-2024-04-09"; + private const string Gpt40125PreviewValue = "gpt-4-0125-preview"; + private const string Gpt4TurboPreviewValue = "gpt-4-turbo-preview"; + private const string Gpt41106PreviewValue = "gpt-4-1106-preview"; + private const string Gpt4VisionPreviewValue = "gpt-4-vision-preview"; + private const string Gpt4Value = "gpt-4"; + private const string Gpt40314Value = "gpt-4-0314"; + private const string Gpt40613Value = "gpt-4-0613"; + private const string Gpt432kValue = "gpt-4-32k"; + private const string Gpt432k0314Value = "gpt-4-32k-0314"; + private const string Gpt432k0613Value = "gpt-4-32k-0613"; + private const string Gpt35TurboValue = "gpt-3.5-turbo"; + private const string Gpt35Turbo16kValue = "gpt-3.5-turbo-16k"; + private const string Gpt35Turbo0613Value = "gpt-3.5-turbo-0613"; + private const string Gpt35Turbo1106Value = "gpt-3.5-turbo-1106"; + private const string Gpt35Turbo0125Value = "gpt-3.5-turbo-0125"; + private const string Gpt35Turbo16k0613Value = "gpt-3.5-turbo-16k-0613"; + + public static InternalCreateAssistantRequestModel Gpt4o { get; } = new InternalCreateAssistantRequestModel(Gpt4oValue); + public static InternalCreateAssistantRequestModel Gpt4o20240806 { get; } = new InternalCreateAssistantRequestModel(Gpt4o20240806Value); + public static InternalCreateAssistantRequestModel Gpt4o20240513 { get; } = new InternalCreateAssistantRequestModel(Gpt4o20240513Value); + public static InternalCreateAssistantRequestModel Gpt4oMini { get; } = new 
InternalCreateAssistantRequestModel(Gpt4oMiniValue); + public static InternalCreateAssistantRequestModel Gpt4oMini20240718 { get; } = new InternalCreateAssistantRequestModel(Gpt4oMini20240718Value); + public static InternalCreateAssistantRequestModel Gpt4Turbo { get; } = new InternalCreateAssistantRequestModel(Gpt4TurboValue); + public static InternalCreateAssistantRequestModel Gpt4Turbo20240409 { get; } = new InternalCreateAssistantRequestModel(Gpt4Turbo20240409Value); + public static InternalCreateAssistantRequestModel Gpt40125Preview { get; } = new InternalCreateAssistantRequestModel(Gpt40125PreviewValue); + public static InternalCreateAssistantRequestModel Gpt4TurboPreview { get; } = new InternalCreateAssistantRequestModel(Gpt4TurboPreviewValue); + public static InternalCreateAssistantRequestModel Gpt41106Preview { get; } = new InternalCreateAssistantRequestModel(Gpt41106PreviewValue); + public static InternalCreateAssistantRequestModel Gpt4VisionPreview { get; } = new InternalCreateAssistantRequestModel(Gpt4VisionPreviewValue); + public static InternalCreateAssistantRequestModel Gpt4 { get; } = new InternalCreateAssistantRequestModel(Gpt4Value); + public static InternalCreateAssistantRequestModel Gpt40314 { get; } = new InternalCreateAssistantRequestModel(Gpt40314Value); + public static InternalCreateAssistantRequestModel Gpt40613 { get; } = new InternalCreateAssistantRequestModel(Gpt40613Value); + public static InternalCreateAssistantRequestModel Gpt432k { get; } = new InternalCreateAssistantRequestModel(Gpt432kValue); + public static InternalCreateAssistantRequestModel Gpt432k0314 { get; } = new InternalCreateAssistantRequestModel(Gpt432k0314Value); + public static InternalCreateAssistantRequestModel Gpt432k0613 { get; } = new InternalCreateAssistantRequestModel(Gpt432k0613Value); + public static InternalCreateAssistantRequestModel Gpt35Turbo { get; } = new InternalCreateAssistantRequestModel(Gpt35TurboValue); + public static 
InternalCreateAssistantRequestModel Gpt35Turbo16k { get; } = new InternalCreateAssistantRequestModel(Gpt35Turbo16kValue); + public static InternalCreateAssistantRequestModel Gpt35Turbo0613 { get; } = new InternalCreateAssistantRequestModel(Gpt35Turbo0613Value); + public static InternalCreateAssistantRequestModel Gpt35Turbo1106 { get; } = new InternalCreateAssistantRequestModel(Gpt35Turbo1106Value); + public static InternalCreateAssistantRequestModel Gpt35Turbo0125 { get; } = new InternalCreateAssistantRequestModel(Gpt35Turbo0125Value); + public static InternalCreateAssistantRequestModel Gpt35Turbo16k0613 { get; } = new InternalCreateAssistantRequestModel(Gpt35Turbo16k0613Value); + public static bool operator ==(InternalCreateAssistantRequestModel left, InternalCreateAssistantRequestModel right) => left.Equals(right); + public static bool operator !=(InternalCreateAssistantRequestModel left, InternalCreateAssistantRequestModel right) => !left.Equals(right); + public static implicit operator InternalCreateAssistantRequestModel(string value) => new InternalCreateAssistantRequestModel(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalCreateAssistantRequestModel other && Equals(other); + public bool Equals(InternalCreateAssistantRequestModel other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? 
StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateAssistantRequestToolResources.Serialization.cs b/.dotnet/src/Generated/Models/InternalCreateAssistantRequestToolResources.Serialization.cs new file mode 100644 index 000000000..d21e16129 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateAssistantRequestToolResources.Serialization.cs @@ -0,0 +1,152 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class InternalCreateAssistantRequestToolResources : IJsonModel<InternalCreateAssistantRequestToolResources> + { + void IJsonModel<InternalCreateAssistantRequestToolResources>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalCreateAssistantRequestToolResources>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalCreateAssistantRequestToolResources)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("code_interpreter") != true && Optional.IsDefined(CodeInterpreter)) + { + writer.WritePropertyName("code_interpreter"u8); + writer.WriteObjectValue(CodeInterpreter, options); + } + if (SerializedAdditionalRawData?.ContainsKey("file_search") != true && Optional.IsDefined(FileSearch)) + { + writer.WritePropertyName("file_search"u8); + writer.WriteObjectValue(FileSearch, options); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { +
JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalCreateAssistantRequestToolResources IJsonModel<InternalCreateAssistantRequestToolResources>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalCreateAssistantRequestToolResources>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalCreateAssistantRequestToolResources)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalCreateAssistantRequestToolResources(document.RootElement, options); + } + + internal static InternalCreateAssistantRequestToolResources DeserializeInternalCreateAssistantRequestToolResources(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + InternalCreateAssistantRequestToolResourcesCodeInterpreter codeInterpreter = default; + FileSearchToolResources fileSearch = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("code_interpreter"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + codeInterpreter = InternalCreateAssistantRequestToolResourcesCodeInterpreter.DeserializeInternalCreateAssistantRequestToolResourcesCodeInterpreter(property.Value, options); + continue; + } + if (property.NameEquals("file_search"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + fileSearch = FileSearchToolResources.DeserializeFileSearchToolResources(property.Value, options); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name,
BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalCreateAssistantRequestToolResources(codeInterpreter, fileSearch, serializedAdditionalRawData); + } + + BinaryData IPersistableModel<InternalCreateAssistantRequestToolResources>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalCreateAssistantRequestToolResources>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalCreateAssistantRequestToolResources)} does not support writing '{options.Format}' format."); + } + } + + InternalCreateAssistantRequestToolResources IPersistableModel<InternalCreateAssistantRequestToolResources>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalCreateAssistantRequestToolResources>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalCreateAssistantRequestToolResources(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalCreateAssistantRequestToolResources)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalCreateAssistantRequestToolResources>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalCreateAssistantRequestToolResources FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalCreateAssistantRequestToolResources(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateAssistantRequestToolResources.cs b/.dotnet/src/Generated/Models/InternalCreateAssistantRequestToolResources.cs new file mode 100644 index 000000000..3ac5aa737 ---
/dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateAssistantRequestToolResources.cs @@ -0,0 +1,27 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + internal partial class InternalCreateAssistantRequestToolResources + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + public InternalCreateAssistantRequestToolResources() + { + } + + internal InternalCreateAssistantRequestToolResources(InternalCreateAssistantRequestToolResourcesCodeInterpreter codeInterpreter, FileSearchToolResources fileSearch, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + CodeInterpreter = codeInterpreter; + FileSearch = fileSearch; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + public InternalCreateAssistantRequestToolResourcesCodeInterpreter CodeInterpreter { get; set; } + public FileSearchToolResources FileSearch { get; set; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateAssistantRequestToolResourcesCodeInterpreter.Serialization.cs b/.dotnet/src/Generated/Models/InternalCreateAssistantRequestToolResourcesCodeInterpreter.Serialization.cs new file mode 100644 index 000000000..ea159b987 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateAssistantRequestToolResourcesCodeInterpreter.Serialization.cs @@ -0,0 +1,147 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class InternalCreateAssistantRequestToolResourcesCodeInterpreter : IJsonModel<InternalCreateAssistantRequestToolResourcesCodeInterpreter> + { + void IJsonModel<InternalCreateAssistantRequestToolResourcesCodeInterpreter>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ?
((IPersistableModel<InternalCreateAssistantRequestToolResourcesCodeInterpreter>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalCreateAssistantRequestToolResourcesCodeInterpreter)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("file_ids") != true && Optional.IsCollectionDefined(FileIds)) + { + writer.WritePropertyName("file_ids"u8); + writer.WriteStartArray(); + foreach (var item in FileIds) + { + writer.WriteStringValue(item); + } + writer.WriteEndArray(); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalCreateAssistantRequestToolResourcesCodeInterpreter IJsonModel<InternalCreateAssistantRequestToolResourcesCodeInterpreter>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ?
((IPersistableModel<InternalCreateAssistantRequestToolResourcesCodeInterpreter>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalCreateAssistantRequestToolResourcesCodeInterpreter)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalCreateAssistantRequestToolResourcesCodeInterpreter(document.RootElement, options); + } + + internal static InternalCreateAssistantRequestToolResourcesCodeInterpreter DeserializeInternalCreateAssistantRequestToolResourcesCodeInterpreter(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + IList<string> fileIds = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("file_ids"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + List<string> array = new List<string>(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(item.GetString()); + } + fileIds = array; + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalCreateAssistantRequestToolResourcesCodeInterpreter(fileIds ?? new ChangeTrackingList<string>(), serializedAdditionalRawData); + } + + BinaryData IPersistableModel<InternalCreateAssistantRequestToolResourcesCodeInterpreter>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ?
((IPersistableModel<InternalCreateAssistantRequestToolResourcesCodeInterpreter>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalCreateAssistantRequestToolResourcesCodeInterpreter)} does not support writing '{options.Format}' format."); + } + } + + InternalCreateAssistantRequestToolResourcesCodeInterpreter IPersistableModel<InternalCreateAssistantRequestToolResourcesCodeInterpreter>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalCreateAssistantRequestToolResourcesCodeInterpreter>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalCreateAssistantRequestToolResourcesCodeInterpreter(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalCreateAssistantRequestToolResourcesCodeInterpreter)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalCreateAssistantRequestToolResourcesCodeInterpreter>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalCreateAssistantRequestToolResourcesCodeInterpreter FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalCreateAssistantRequestToolResourcesCodeInterpreter(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateAssistantRequestToolResourcesCodeInterpreter.cs b/.dotnet/src/Generated/Models/InternalCreateAssistantRequestToolResourcesCodeInterpreter.cs new file mode 100644 index 000000000..ee042f14f --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateAssistantRequestToolResourcesCodeInterpreter.cs @@ -0,0 +1,26 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace 
OpenAI.Assistants +{ + internal partial class InternalCreateAssistantRequestToolResourcesCodeInterpreter + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + public InternalCreateAssistantRequestToolResourcesCodeInterpreter() + { + FileIds = new ChangeTrackingList<string>(); + } + + internal InternalCreateAssistantRequestToolResourcesCodeInterpreter(IList<string> fileIds, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + FileIds = fileIds; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + public IList<string> FileIds { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateBatchRequest.Serialization.cs b/.dotnet/src/Generated/Models/InternalCreateBatchRequest.Serialization.cs new file mode 100644 index 000000000..d8386b86c --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateBatchRequest.Serialization.cs @@ -0,0 +1,188 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Batch +{ + internal partial class InternalCreateBatchRequest : IJsonModel<InternalCreateBatchRequest> + { + void IJsonModel<InternalCreateBatchRequest>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalCreateBatchRequest>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalCreateBatchRequest)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("input_file_id") != true) + { + writer.WritePropertyName("input_file_id"u8); + writer.WriteStringValue(InputFileId); + } + if (SerializedAdditionalRawData?.ContainsKey("endpoint") != true) + { + writer.WritePropertyName("endpoint"u8); + writer.WriteStringValue(Endpoint.ToString()); + } + if (SerializedAdditionalRawData?.ContainsKey("completion_window") != true) + { + writer.WritePropertyName("completion_window"u8); + writer.WriteStringValue(CompletionWindow.ToString()); + } + if (SerializedAdditionalRawData?.ContainsKey("metadata") != true && Optional.IsCollectionDefined(Metadata)) + { + if (Metadata != null) + { + writer.WritePropertyName("metadata"u8); + writer.WriteStartObject(); + foreach (var item in Metadata) + { + writer.WritePropertyName(item.Key); + writer.WriteStringValue(item.Value); + } + writer.WriteEndObject(); + } + else + { + writer.WriteNull("metadata"); + } + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalCreateBatchRequest IJsonModel<InternalCreateBatchRequest>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalCreateBatchRequest>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalCreateBatchRequest)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalCreateBatchRequest(document.RootElement, options); + } + + internal static InternalCreateBatchRequest DeserializeInternalCreateBatchRequest(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string inputFileId = default; + InternalCreateBatchRequestEndpoint endpoint = default; + InternalBatchCompletionTimeframe completionWindow = default; + IDictionary<string, string> metadata = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("input_file_id"u8)) + { + inputFileId = property.Value.GetString(); + continue; + } + if (property.NameEquals("endpoint"u8)) + { + endpoint = new InternalCreateBatchRequestEndpoint(property.Value.GetString()); + continue; + } + if (property.NameEquals("completion_window"u8)) + { + completionWindow = new InternalBatchCompletionTimeframe(property.Value.GetString()); + continue; + } + if (property.NameEquals("metadata"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + Dictionary<string, string> dictionary = new Dictionary<string, string>(); + foreach (var property0 in property.Value.EnumerateObject()) + { + dictionary.Add(property0.Name, property0.Value.GetString()); + } + metadata = dictionary; + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new 
InternalCreateBatchRequest(inputFileId, endpoint, completionWindow, metadata ?? new ChangeTrackingDictionary<string, string>(), serializedAdditionalRawData); + } + + BinaryData IPersistableModel<InternalCreateBatchRequest>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalCreateBatchRequest>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalCreateBatchRequest)} does not support writing '{options.Format}' format."); + } + } + + InternalCreateBatchRequest IPersistableModel<InternalCreateBatchRequest>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalCreateBatchRequest>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalCreateBatchRequest(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalCreateBatchRequest)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalCreateBatchRequest>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalCreateBatchRequest FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalCreateBatchRequest(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateBatchRequest.cs b/.dotnet/src/Generated/Models/InternalCreateBatchRequest.cs new file mode 100644 index 000000000..a50129fc8 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateBatchRequest.cs @@ -0,0 +1,41 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Batch +{ + internal partial 
class InternalCreateBatchRequest + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + public InternalCreateBatchRequest(string inputFileId, InternalCreateBatchRequestEndpoint endpoint) + { + Argument.AssertNotNull(inputFileId, nameof(inputFileId)); + + InputFileId = inputFileId; + Endpoint = endpoint; + Metadata = new ChangeTrackingDictionary<string, string>(); + } + + internal InternalCreateBatchRequest(string inputFileId, InternalCreateBatchRequestEndpoint endpoint, InternalBatchCompletionTimeframe completionWindow, IDictionary<string, string> metadata, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + InputFileId = inputFileId; + Endpoint = endpoint; + CompletionWindow = completionWindow; + Metadata = metadata; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal InternalCreateBatchRequest() + { + } + + public string InputFileId { get; } + public InternalCreateBatchRequestEndpoint Endpoint { get; } + public InternalBatchCompletionTimeframe CompletionWindow { get; } = InternalBatchCompletionTimeframe._24h; + + public IDictionary<string, string> Metadata { get; set; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateBatchRequestEndpoint.cs b/.dotnet/src/Generated/Models/InternalCreateBatchRequestEndpoint.cs new file mode 100644 index 000000000..9163b9de6 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateBatchRequestEndpoint.cs @@ -0,0 +1,36 @@ +// + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Batch +{ + internal readonly partial struct InternalCreateBatchRequestEndpoint : IEquatable<InternalCreateBatchRequestEndpoint> + { + private readonly string _value; + + public InternalCreateBatchRequestEndpoint(string value) + { + _value = value ?? 
throw new ArgumentNullException(nameof(value)); + } + + private const string V1ChatCompletionsValue = "/v1/chat/completions"; + private const string V1EmbeddingsValue = "/v1/embeddings"; + + public static InternalCreateBatchRequestEndpoint V1ChatCompletions { get; } = new InternalCreateBatchRequestEndpoint(V1ChatCompletionsValue); + public static InternalCreateBatchRequestEndpoint V1Embeddings { get; } = new InternalCreateBatchRequestEndpoint(V1EmbeddingsValue); + public static bool operator ==(InternalCreateBatchRequestEndpoint left, InternalCreateBatchRequestEndpoint right) => left.Equals(right); + public static bool operator !=(InternalCreateBatchRequestEndpoint left, InternalCreateBatchRequestEndpoint right) => !left.Equals(right); + public static implicit operator InternalCreateBatchRequestEndpoint(string value) => new InternalCreateBatchRequestEndpoint(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalCreateBatchRequestEndpoint other && Equals(other); + public bool Equals(InternalCreateBatchRequestEndpoint other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? 
StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateChatCompletionFunctionResponse.Serialization.cs b/.dotnet/src/Generated/Models/InternalCreateChatCompletionFunctionResponse.Serialization.cs new file mode 100644 index 000000000..a5999bf8d --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateChatCompletionFunctionResponse.Serialization.cs @@ -0,0 +1,221 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Chat +{ + internal partial class InternalCreateChatCompletionFunctionResponse : IJsonModel<InternalCreateChatCompletionFunctionResponse> + { + void IJsonModel<InternalCreateChatCompletionFunctionResponse>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalCreateChatCompletionFunctionResponse>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalCreateChatCompletionFunctionResponse)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("id") != true) + { + writer.WritePropertyName("id"u8); + writer.WriteStringValue(Id); + } + if (SerializedAdditionalRawData?.ContainsKey("choices") != true) + { + writer.WritePropertyName("choices"u8); + writer.WriteStartArray(); + foreach (var item in Choices) + { + writer.WriteObjectValue(item, options); + } + writer.WriteEndArray(); + } + if (SerializedAdditionalRawData?.ContainsKey("created") != true) + { + writer.WritePropertyName("created"u8); + writer.WriteNumberValue(Created, "U"); + } + if (SerializedAdditionalRawData?.ContainsKey("model") != true) + { + writer.WritePropertyName("model"u8); + writer.WriteStringValue(Model); + } + if (SerializedAdditionalRawData?.ContainsKey("system_fingerprint") != true && Optional.IsDefined(SystemFingerprint)) + { + 
writer.WritePropertyName("system_fingerprint"u8); + writer.WriteStringValue(SystemFingerprint); + } + if (SerializedAdditionalRawData?.ContainsKey("object") != true) + { + writer.WritePropertyName("object"u8); + writer.WriteStringValue(Object.ToString()); + } + if (SerializedAdditionalRawData?.ContainsKey("usage") != true && Optional.IsDefined(Usage)) + { + writer.WritePropertyName("usage"u8); + writer.WriteObjectValue(Usage, options); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalCreateChatCompletionFunctionResponse IJsonModel<InternalCreateChatCompletionFunctionResponse>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalCreateChatCompletionFunctionResponse>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalCreateChatCompletionFunctionResponse)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalCreateChatCompletionFunctionResponse(document.RootElement, options); + } + + internal static InternalCreateChatCompletionFunctionResponse DeserializeInternalCreateChatCompletionFunctionResponse(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string id = default; + IReadOnlyList<InternalCreateChatCompletionFunctionResponseChoice> choices = default; + DateTimeOffset created = default; + string model = default; + string systemFingerprint = default; + InternalCreateChatCompletionFunctionResponseObject @object = default; + ChatTokenUsage usage = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("id"u8)) + { + id = property.Value.GetString(); + continue; + } + if (property.NameEquals("choices"u8)) + { + List<InternalCreateChatCompletionFunctionResponseChoice> array = new List<InternalCreateChatCompletionFunctionResponseChoice>(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(InternalCreateChatCompletionFunctionResponseChoice.DeserializeInternalCreateChatCompletionFunctionResponseChoice(item, options)); + } + choices = array; + continue; + } + if (property.NameEquals("created"u8)) + { + created = DateTimeOffset.FromUnixTimeSeconds(property.Value.GetInt64()); + continue; + } + if (property.NameEquals("model"u8)) + { + model = property.Value.GetString(); + continue; + } + if (property.NameEquals("system_fingerprint"u8)) + { + systemFingerprint = property.Value.GetString(); + continue; + } + if (property.NameEquals("object"u8)) + { + @object = new 
InternalCreateChatCompletionFunctionResponseObject(property.Value.GetString()); + continue; + } + if (property.NameEquals("usage"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + usage = ChatTokenUsage.DeserializeChatTokenUsage(property.Value, options); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalCreateChatCompletionFunctionResponse( + id, + choices, + created, + model, + systemFingerprint, + @object, + usage, + serializedAdditionalRawData); + } + + BinaryData IPersistableModel<InternalCreateChatCompletionFunctionResponse>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalCreateChatCompletionFunctionResponse>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalCreateChatCompletionFunctionResponse)} does not support writing '{options.Format}' format."); + } + } + + InternalCreateChatCompletionFunctionResponse IPersistableModel<InternalCreateChatCompletionFunctionResponse>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalCreateChatCompletionFunctionResponse>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalCreateChatCompletionFunctionResponse(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalCreateChatCompletionFunctionResponse)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalCreateChatCompletionFunctionResponse>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalCreateChatCompletionFunctionResponse FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalCreateChatCompletionFunctionResponse(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateChatCompletionFunctionResponse.cs b/.dotnet/src/Generated/Models/InternalCreateChatCompletionFunctionResponse.cs new file mode 100644 index 000000000..19c621203 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateChatCompletionFunctionResponse.cs @@ -0,0 +1,51 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; +using System.Linq; + +namespace OpenAI.Chat +{ + internal partial class InternalCreateChatCompletionFunctionResponse + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + internal InternalCreateChatCompletionFunctionResponse(string id, IEnumerable<InternalCreateChatCompletionFunctionResponseChoice> choices, DateTimeOffset created, string model) + { + Argument.AssertNotNull(id, nameof(id)); + Argument.AssertNotNull(choices, nameof(choices)); + Argument.AssertNotNull(model, nameof(model)); + + Id = id; + Choices = choices.ToList(); + Created = created; + Model = model; + } + + internal InternalCreateChatCompletionFunctionResponse(string id, IReadOnlyList<InternalCreateChatCompletionFunctionResponseChoice> choices, 
DateTimeOffset created, string model, string systemFingerprint, InternalCreateChatCompletionFunctionResponseObject @object, ChatTokenUsage usage, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + Id = id; + Choices = choices; + Created = created; + Model = model; + SystemFingerprint = systemFingerprint; + Object = @object; + Usage = usage; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal InternalCreateChatCompletionFunctionResponse() + { + } + + public string Id { get; } + public IReadOnlyList<InternalCreateChatCompletionFunctionResponseChoice> Choices { get; } + public DateTimeOffset Created { get; } + public string Model { get; } + public string SystemFingerprint { get; } + public InternalCreateChatCompletionFunctionResponseObject Object { get; } = InternalCreateChatCompletionFunctionResponseObject.ChatCompletion; + + public ChatTokenUsage Usage { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateChatCompletionFunctionResponseChoice.Serialization.cs b/.dotnet/src/Generated/Models/InternalCreateChatCompletionFunctionResponseChoice.Serialization.cs new file mode 100644 index 000000000..55422fd8c --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateChatCompletionFunctionResponseChoice.Serialization.cs @@ -0,0 +1,155 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Chat +{ + internal partial class InternalCreateChatCompletionFunctionResponseChoice : IJsonModel<InternalCreateChatCompletionFunctionResponseChoice> + { + void IJsonModel<InternalCreateChatCompletionFunctionResponseChoice>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalCreateChatCompletionFunctionResponseChoice>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalCreateChatCompletionFunctionResponseChoice)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("finish_reason") != true) + { + writer.WritePropertyName("finish_reason"u8); + writer.WriteStringValue(FinishReason.ToString()); + } + if (SerializedAdditionalRawData?.ContainsKey("index") != true) + { + writer.WritePropertyName("index"u8); + writer.WriteNumberValue(Index); + } + if (SerializedAdditionalRawData?.ContainsKey("message") != true) + { + writer.WritePropertyName("message"u8); + writer.WriteObjectValue(Message, options); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalCreateChatCompletionFunctionResponseChoice IJsonModel<InternalCreateChatCompletionFunctionResponseChoice>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalCreateChatCompletionFunctionResponseChoice>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalCreateChatCompletionFunctionResponseChoice)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalCreateChatCompletionFunctionResponseChoice(document.RootElement, options); + } + + internal static InternalCreateChatCompletionFunctionResponseChoice DeserializeInternalCreateChatCompletionFunctionResponseChoice(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + InternalCreateChatCompletionFunctionResponseChoiceFinishReason finishReason = default; + int index = default; + InternalChatCompletionResponseMessage message = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("finish_reason"u8)) + { + finishReason = new InternalCreateChatCompletionFunctionResponseChoiceFinishReason(property.Value.GetString()); + continue; + } + if (property.NameEquals("index"u8)) + { + index = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("message"u8)) + { + message = InternalChatCompletionResponseMessage.DeserializeInternalChatCompletionResponseMessage(property.Value, options); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalCreateChatCompletionFunctionResponseChoice(finishReason, index, message, serializedAdditionalRawData); + } + + BinaryData IPersistableModel<InternalCreateChatCompletionFunctionResponseChoice>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" 
? ((IPersistableModel<InternalCreateChatCompletionFunctionResponseChoice>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalCreateChatCompletionFunctionResponseChoice)} does not support writing '{options.Format}' format."); + } + } + + InternalCreateChatCompletionFunctionResponseChoice IPersistableModel<InternalCreateChatCompletionFunctionResponseChoice>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalCreateChatCompletionFunctionResponseChoice>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalCreateChatCompletionFunctionResponseChoice(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalCreateChatCompletionFunctionResponseChoice)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalCreateChatCompletionFunctionResponseChoice>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalCreateChatCompletionFunctionResponseChoice FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalCreateChatCompletionFunctionResponseChoice(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateChatCompletionFunctionResponseChoice.cs b/.dotnet/src/Generated/Models/InternalCreateChatCompletionFunctionResponseChoice.cs new file mode 100644 index 000000000..6e0687ded --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateChatCompletionFunctionResponseChoice.cs @@ -0,0 +1,38 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Chat +{ + internal partial class 
InternalCreateChatCompletionFunctionResponseChoice + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + internal InternalCreateChatCompletionFunctionResponseChoice(InternalCreateChatCompletionFunctionResponseChoiceFinishReason finishReason, int index, InternalChatCompletionResponseMessage message) + { + Argument.AssertNotNull(message, nameof(message)); + + FinishReason = finishReason; + Index = index; + Message = message; + } + + internal InternalCreateChatCompletionFunctionResponseChoice(InternalCreateChatCompletionFunctionResponseChoiceFinishReason finishReason, int index, InternalChatCompletionResponseMessage message, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + FinishReason = finishReason; + Index = index; + Message = message; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal InternalCreateChatCompletionFunctionResponseChoice() + { + } + + public InternalCreateChatCompletionFunctionResponseChoiceFinishReason FinishReason { get; } + public int Index { get; } + public InternalChatCompletionResponseMessage Message { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateChatCompletionFunctionResponseChoiceFinishReason.cs b/.dotnet/src/Generated/Models/InternalCreateChatCompletionFunctionResponseChoiceFinishReason.cs new file mode 100644 index 000000000..679edf97e --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateChatCompletionFunctionResponseChoiceFinishReason.cs @@ -0,0 +1,40 @@ +// + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Chat +{ + internal readonly partial struct InternalCreateChatCompletionFunctionResponseChoiceFinishReason : IEquatable<InternalCreateChatCompletionFunctionResponseChoiceFinishReason> + { + private readonly string _value; + + public InternalCreateChatCompletionFunctionResponseChoiceFinishReason(string value) + { + _value = value ?? 
throw new ArgumentNullException(nameof(value)); + } + + private const string StopValue = "stop"; + private const string LengthValue = "length"; + private const string FunctionCallValue = "function_call"; + private const string ContentFilterValue = "content_filter"; + + public static InternalCreateChatCompletionFunctionResponseChoiceFinishReason Stop { get; } = new InternalCreateChatCompletionFunctionResponseChoiceFinishReason(StopValue); + public static InternalCreateChatCompletionFunctionResponseChoiceFinishReason Length { get; } = new InternalCreateChatCompletionFunctionResponseChoiceFinishReason(LengthValue); + public static InternalCreateChatCompletionFunctionResponseChoiceFinishReason FunctionCall { get; } = new InternalCreateChatCompletionFunctionResponseChoiceFinishReason(FunctionCallValue); + public static InternalCreateChatCompletionFunctionResponseChoiceFinishReason ContentFilter { get; } = new InternalCreateChatCompletionFunctionResponseChoiceFinishReason(ContentFilterValue); + public static bool operator ==(InternalCreateChatCompletionFunctionResponseChoiceFinishReason left, InternalCreateChatCompletionFunctionResponseChoiceFinishReason right) => left.Equals(right); + public static bool operator !=(InternalCreateChatCompletionFunctionResponseChoiceFinishReason left, InternalCreateChatCompletionFunctionResponseChoiceFinishReason right) => !left.Equals(right); + public static implicit operator InternalCreateChatCompletionFunctionResponseChoiceFinishReason(string value) => new InternalCreateChatCompletionFunctionResponseChoiceFinishReason(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalCreateChatCompletionFunctionResponseChoiceFinishReason other && Equals(other); + public bool Equals(InternalCreateChatCompletionFunctionResponseChoiceFinishReason other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + 
public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateChatCompletionFunctionResponseObject.cs b/.dotnet/src/Generated/Models/InternalCreateChatCompletionFunctionResponseObject.cs new file mode 100644 index 000000000..43c04df1d --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateChatCompletionFunctionResponseObject.cs @@ -0,0 +1,34 @@ +// + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Chat +{ + internal readonly partial struct InternalCreateChatCompletionFunctionResponseObject : IEquatable<InternalCreateChatCompletionFunctionResponseObject> + { + private readonly string _value; + + public InternalCreateChatCompletionFunctionResponseObject(string value) + { + _value = value ?? throw new ArgumentNullException(nameof(value)); + } + + private const string ChatCompletionValue = "chat.completion"; + + public static InternalCreateChatCompletionFunctionResponseObject ChatCompletion { get; } = new InternalCreateChatCompletionFunctionResponseObject(ChatCompletionValue); + public static bool operator ==(InternalCreateChatCompletionFunctionResponseObject left, InternalCreateChatCompletionFunctionResponseObject right) => left.Equals(right); + public static bool operator !=(InternalCreateChatCompletionFunctionResponseObject left, InternalCreateChatCompletionFunctionResponseObject right) => !left.Equals(right); + public static implicit operator InternalCreateChatCompletionFunctionResponseObject(string value) => new InternalCreateChatCompletionFunctionResponseObject(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalCreateChatCompletionFunctionResponseObject other && Equals(other); + public bool Equals(InternalCreateChatCompletionFunctionResponseObject other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); 
+ + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateChatCompletionRequestModel.cs b/.dotnet/src/Generated/Models/InternalCreateChatCompletionRequestModel.cs new file mode 100644 index 000000000..4b14c7809 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateChatCompletionRequestModel.cs @@ -0,0 +1,82 @@ +// + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Chat +{ + internal readonly partial struct InternalCreateChatCompletionRequestModel : IEquatable + { + private readonly string _value; + + public InternalCreateChatCompletionRequestModel(string value) + { + _value = value ?? throw new ArgumentNullException(nameof(value)); + } + + private const string Gpt4oValue = "gpt-4o"; + private const string Gpt4o20240513Value = "gpt-4o-2024-05-13"; + private const string Gpt4o20240806Value = "gpt-4o-2024-08-06"; + private const string Chatgpt4oLatestValue = "chatgpt-4o-latest"; + private const string Gpt4oMiniValue = "gpt-4o-mini"; + private const string Gpt4oMini20240718Value = "gpt-4o-mini-2024-07-18"; + private const string Gpt4TurboValue = "gpt-4-turbo"; + private const string Gpt4Turbo20240409Value = "gpt-4-turbo-2024-04-09"; + private const string Gpt40125PreviewValue = "gpt-4-0125-preview"; + private const string Gpt4TurboPreviewValue = "gpt-4-turbo-preview"; + private const string Gpt41106PreviewValue = "gpt-4-1106-preview"; + private const string Gpt4VisionPreviewValue = "gpt-4-vision-preview"; + private const string Gpt4Value = "gpt-4"; + private const string Gpt40314Value = "gpt-4-0314"; + private const string Gpt40613Value = "gpt-4-0613"; + private const string Gpt432kValue = "gpt-4-32k"; + private const string Gpt432k0314Value = "gpt-4-32k-0314"; + private const string Gpt432k0613Value = 
"gpt-4-32k-0613"; + private const string Gpt35TurboValue = "gpt-3.5-turbo"; + private const string Gpt35Turbo16kValue = "gpt-3.5-turbo-16k"; + private const string Gpt35Turbo0301Value = "gpt-3.5-turbo-0301"; + private const string Gpt35Turbo0613Value = "gpt-3.5-turbo-0613"; + private const string Gpt35Turbo1106Value = "gpt-3.5-turbo-1106"; + private const string Gpt35Turbo0125Value = "gpt-3.5-turbo-0125"; + private const string Gpt35Turbo16k0613Value = "gpt-3.5-turbo-16k-0613"; + + public static InternalCreateChatCompletionRequestModel Gpt4o { get; } = new InternalCreateChatCompletionRequestModel(Gpt4oValue); + public static InternalCreateChatCompletionRequestModel Gpt4o20240513 { get; } = new InternalCreateChatCompletionRequestModel(Gpt4o20240513Value); + public static InternalCreateChatCompletionRequestModel Gpt4o20240806 { get; } = new InternalCreateChatCompletionRequestModel(Gpt4o20240806Value); + public static InternalCreateChatCompletionRequestModel Chatgpt4oLatest { get; } = new InternalCreateChatCompletionRequestModel(Chatgpt4oLatestValue); + public static InternalCreateChatCompletionRequestModel Gpt4oMini { get; } = new InternalCreateChatCompletionRequestModel(Gpt4oMiniValue); + public static InternalCreateChatCompletionRequestModel Gpt4oMini20240718 { get; } = new InternalCreateChatCompletionRequestModel(Gpt4oMini20240718Value); + public static InternalCreateChatCompletionRequestModel Gpt4Turbo { get; } = new InternalCreateChatCompletionRequestModel(Gpt4TurboValue); + public static InternalCreateChatCompletionRequestModel Gpt4Turbo20240409 { get; } = new InternalCreateChatCompletionRequestModel(Gpt4Turbo20240409Value); + public static InternalCreateChatCompletionRequestModel Gpt40125Preview { get; } = new InternalCreateChatCompletionRequestModel(Gpt40125PreviewValue); + public static InternalCreateChatCompletionRequestModel Gpt4TurboPreview { get; } = new InternalCreateChatCompletionRequestModel(Gpt4TurboPreviewValue); + public static 
InternalCreateChatCompletionRequestModel Gpt41106Preview { get; } = new InternalCreateChatCompletionRequestModel(Gpt41106PreviewValue); + public static InternalCreateChatCompletionRequestModel Gpt4VisionPreview { get; } = new InternalCreateChatCompletionRequestModel(Gpt4VisionPreviewValue); + public static InternalCreateChatCompletionRequestModel Gpt4 { get; } = new InternalCreateChatCompletionRequestModel(Gpt4Value); + public static InternalCreateChatCompletionRequestModel Gpt40314 { get; } = new InternalCreateChatCompletionRequestModel(Gpt40314Value); + public static InternalCreateChatCompletionRequestModel Gpt40613 { get; } = new InternalCreateChatCompletionRequestModel(Gpt40613Value); + public static InternalCreateChatCompletionRequestModel Gpt432k { get; } = new InternalCreateChatCompletionRequestModel(Gpt432kValue); + public static InternalCreateChatCompletionRequestModel Gpt432k0314 { get; } = new InternalCreateChatCompletionRequestModel(Gpt432k0314Value); + public static InternalCreateChatCompletionRequestModel Gpt432k0613 { get; } = new InternalCreateChatCompletionRequestModel(Gpt432k0613Value); + public static InternalCreateChatCompletionRequestModel Gpt35Turbo { get; } = new InternalCreateChatCompletionRequestModel(Gpt35TurboValue); + public static InternalCreateChatCompletionRequestModel Gpt35Turbo16k { get; } = new InternalCreateChatCompletionRequestModel(Gpt35Turbo16kValue); + public static InternalCreateChatCompletionRequestModel Gpt35Turbo0301 { get; } = new InternalCreateChatCompletionRequestModel(Gpt35Turbo0301Value); + public static InternalCreateChatCompletionRequestModel Gpt35Turbo0613 { get; } = new InternalCreateChatCompletionRequestModel(Gpt35Turbo0613Value); + public static InternalCreateChatCompletionRequestModel Gpt35Turbo1106 { get; } = new InternalCreateChatCompletionRequestModel(Gpt35Turbo1106Value); + public static InternalCreateChatCompletionRequestModel Gpt35Turbo0125 { get; } = new 
InternalCreateChatCompletionRequestModel(Gpt35Turbo0125Value); + public static InternalCreateChatCompletionRequestModel Gpt35Turbo16k0613 { get; } = new InternalCreateChatCompletionRequestModel(Gpt35Turbo16k0613Value); + public static bool operator ==(InternalCreateChatCompletionRequestModel left, InternalCreateChatCompletionRequestModel right) => left.Equals(right); + public static bool operator !=(InternalCreateChatCompletionRequestModel left, InternalCreateChatCompletionRequestModel right) => !left.Equals(right); + public static implicit operator InternalCreateChatCompletionRequestModel(string value) => new InternalCreateChatCompletionRequestModel(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalCreateChatCompletionRequestModel other && Equals(other); + public bool Equals(InternalCreateChatCompletionRequestModel other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateChatCompletionRequestServiceTier.cs b/.dotnet/src/Generated/Models/InternalCreateChatCompletionRequestServiceTier.cs new file mode 100644 index 000000000..7d6f7f657 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateChatCompletionRequestServiceTier.cs @@ -0,0 +1,36 @@ +// + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Chat +{ + internal readonly partial struct InternalCreateChatCompletionRequestServiceTier : IEquatable + { + private readonly string _value; + + public InternalCreateChatCompletionRequestServiceTier(string value) + { + _value = value ?? 
throw new ArgumentNullException(nameof(value));
+        }
+
+        private const string AutoValue = "auto";
+        private const string DefaultValue = "default";
+
+        public static InternalCreateChatCompletionRequestServiceTier Auto { get; } = new InternalCreateChatCompletionRequestServiceTier(AutoValue);
+        public static InternalCreateChatCompletionRequestServiceTier Default { get; } = new InternalCreateChatCompletionRequestServiceTier(DefaultValue);
+        public static bool operator ==(InternalCreateChatCompletionRequestServiceTier left, InternalCreateChatCompletionRequestServiceTier right) => left.Equals(right);
+        public static bool operator !=(InternalCreateChatCompletionRequestServiceTier left, InternalCreateChatCompletionRequestServiceTier right) => !left.Equals(right);
+        public static implicit operator InternalCreateChatCompletionRequestServiceTier(string value) => new InternalCreateChatCompletionRequestServiceTier(value);
+
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override bool Equals(object obj) => obj is InternalCreateChatCompletionRequestServiceTier other && Equals(other);
+        public bool Equals(InternalCreateChatCompletionRequestServiceTier other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase);
+
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0;
+        public override string ToString() => _value;
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalCreateChatCompletionRequestToolChoice.cs b/.dotnet/src/Generated/Models/InternalCreateChatCompletionRequestToolChoice.cs
new file mode 100644
index 000000000..04328dced
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalCreateChatCompletionRequestToolChoice.cs
@@ -0,0 +1,38 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ComponentModel;
+
+namespace OpenAI.Chat
+{
+    internal readonly partial struct InternalCreateChatCompletionRequestToolChoice : IEquatable<InternalCreateChatCompletionRequestToolChoice>
+    {
+        private readonly string _value;
+
+        public InternalCreateChatCompletionRequestToolChoice(string value)
+        {
+            _value = value ?? throw new ArgumentNullException(nameof(value));
+        }
+
+        private const string NoneValue = "none";
+        private const string AutoValue = "auto";
+        private const string RequiredValue = "required";
+
+        public static InternalCreateChatCompletionRequestToolChoice None { get; } = new InternalCreateChatCompletionRequestToolChoice(NoneValue);
+        public static InternalCreateChatCompletionRequestToolChoice Auto { get; } = new InternalCreateChatCompletionRequestToolChoice(AutoValue);
+        public static InternalCreateChatCompletionRequestToolChoice Required { get; } = new InternalCreateChatCompletionRequestToolChoice(RequiredValue);
+        public static bool operator ==(InternalCreateChatCompletionRequestToolChoice left, InternalCreateChatCompletionRequestToolChoice right) => left.Equals(right);
+        public static bool operator !=(InternalCreateChatCompletionRequestToolChoice left, InternalCreateChatCompletionRequestToolChoice right) => !left.Equals(right);
+        public static implicit operator InternalCreateChatCompletionRequestToolChoice(string value) => new InternalCreateChatCompletionRequestToolChoice(value);
+
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override bool Equals(object obj) => obj is 
InternalCreateChatCompletionRequestToolChoice other && Equals(other);
+        public bool Equals(InternalCreateChatCompletionRequestToolChoice other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase);
+
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0;
+        public override string ToString() => _value;
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalCreateChatCompletionResponseChoice.Serialization.cs b/.dotnet/src/Generated/Models/InternalCreateChatCompletionResponseChoice.Serialization.cs
new file mode 100644
index 000000000..dc72c0a83
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalCreateChatCompletionResponseChoice.Serialization.cs
@@ -0,0 +1,178 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace OpenAI.Chat
+{
+    internal partial class InternalCreateChatCompletionResponseChoice : IJsonModel<InternalCreateChatCompletionResponseChoice>
+    {
+        void IJsonModel<InternalCreateChatCompletionResponseChoice>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalCreateChatCompletionResponseChoice>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(InternalCreateChatCompletionResponseChoice)} does not support writing '{format}' format.");
+            }
+
+            writer.WriteStartObject();
+            if (SerializedAdditionalRawData?.ContainsKey("finish_reason") != true)
+            {
+                writer.WritePropertyName("finish_reason"u8);
+                writer.WriteStringValue(FinishReason.ToSerialString());
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("index") != true)
+            {
+                writer.WritePropertyName("index"u8);
+                writer.WriteNumberValue(Index);
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("message") != true)
+            {
+                writer.WritePropertyName("message"u8);
+                writer.WriteObjectValue(Message, options);
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("logprobs") != true)
+            {
+                if (Logprobs != null)
+                {
+                    writer.WritePropertyName("logprobs"u8);
+                    writer.WriteObjectValue(Logprobs, options);
+                }
+                else
+                {
+                    writer.WriteNull("logprobs");
+                }
+            }
+            if (SerializedAdditionalRawData != null)
+            {
+                foreach (var item in SerializedAdditionalRawData)
+                {
+                    if (ModelSerializationExtensions.IsSentinelValue(item.Value))
+                    {
+                        continue;
+                    }
+                    writer.WritePropertyName(item.Key);
+#if NET6_0_OR_GREATER
+                    writer.WriteRawValue(item.Value);
+#else
+                    using (JsonDocument document = JsonDocument.Parse(item.Value))
+                    {
+                        JsonSerializer.Serialize(writer, document.RootElement);
+                    }
+#endif
+                }
+            }
+            writer.WriteEndObject();
+        }
+
+        InternalCreateChatCompletionResponseChoice IJsonModel<InternalCreateChatCompletionResponseChoice>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalCreateChatCompletionResponseChoice>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(InternalCreateChatCompletionResponseChoice)} does not support reading '{format}' format.");
+            }
+
+            using JsonDocument document = JsonDocument.ParseValue(ref reader);
+            return DeserializeInternalCreateChatCompletionResponseChoice(document.RootElement, options);
+        }
+
+        internal static InternalCreateChatCompletionResponseChoice DeserializeInternalCreateChatCompletionResponseChoice(JsonElement element, ModelReaderWriterOptions options = null)
+        {
+            options ??= ModelSerializationExtensions.WireOptions;
+
+            if (element.ValueKind == JsonValueKind.Null)
+            {
+                return null;
+            }
+            ChatFinishReason finishReason = default;
+            int index = default;
+            InternalChatCompletionResponseMessage message = default;
+            InternalCreateChatCompletionResponseChoiceLogprobs logprobs = default;
+            IDictionary<string, BinaryData> serializedAdditionalRawData = default;
+            Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>();
+            foreach (var property in element.EnumerateObject())
+            {
+                if (property.NameEquals("finish_reason"u8))
+                {
+                    finishReason = property.Value.GetString().ToChatFinishReason();
+                    continue;
+                }
+                if (property.NameEquals("index"u8))
+                {
+                    index = property.Value.GetInt32();
+                    continue;
+                }
+                if (property.NameEquals("message"u8))
+                {
+                    message = InternalChatCompletionResponseMessage.DeserializeInternalChatCompletionResponseMessage(property.Value, options);
+                    continue;
+                }
+                if (property.NameEquals("logprobs"u8))
+                {
+                    if (property.Value.ValueKind == JsonValueKind.Null)
+                    {
+                        logprobs = null;
+                        continue;
+                    }
+                    logprobs = InternalCreateChatCompletionResponseChoiceLogprobs.DeserializeInternalCreateChatCompletionResponseChoiceLogprobs(property.Value, options);
+                    continue;
+                }
+                if (true)
+                {
+                    rawDataDictionary ??= new Dictionary<string, BinaryData>();
+                    rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText()));
+                }
+            }
+            serializedAdditionalRawData = rawDataDictionary;
+            return new InternalCreateChatCompletionResponseChoice(finishReason, index, message, logprobs, serializedAdditionalRawData);
+        }
+
+        BinaryData IPersistableModel<InternalCreateChatCompletionResponseChoice>.Write(ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalCreateChatCompletionResponseChoice>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    return ModelReaderWriter.Write(this, options);
+                default:
+                    throw new FormatException($"The model {nameof(InternalCreateChatCompletionResponseChoice)} does not support writing '{options.Format}' format.");
+            }
+        }
+
+        InternalCreateChatCompletionResponseChoice IPersistableModel<InternalCreateChatCompletionResponseChoice>.Create(BinaryData data, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalCreateChatCompletionResponseChoice>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    {
+                        using JsonDocument document = JsonDocument.Parse(data);
+                        return DeserializeInternalCreateChatCompletionResponseChoice(document.RootElement, options);
+                    }
+                default:
+                    throw new FormatException($"The model {nameof(InternalCreateChatCompletionResponseChoice)} does not support reading '{options.Format}' format.");
+            }
+        }
+
+        string IPersistableModel<InternalCreateChatCompletionResponseChoice>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J";
+
+        internal static InternalCreateChatCompletionResponseChoice FromResponse(PipelineResponse response)
+        {
+            using var document = JsonDocument.Parse(response.Content);
+            return DeserializeInternalCreateChatCompletionResponseChoice(document.RootElement);
+        }
+
+        internal virtual BinaryContent ToBinaryContent()
+        {
+            return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions);
+        }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalCreateChatCompletionResponseChoice.cs b/.dotnet/src/Generated/Models/InternalCreateChatCompletionResponseChoice.cs
new file mode 100644
index 000000000..88a8051cf
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalCreateChatCompletionResponseChoice.cs
@@ -0,0 +1,41 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.Collections.Generic;
+
+namespace OpenAI.Chat
+{
+    internal partial class InternalCreateChatCompletionResponseChoice
+    {
+        internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; }
+        internal InternalCreateChatCompletionResponseChoice(ChatFinishReason finishReason, int index, InternalChatCompletionResponseMessage message, InternalCreateChatCompletionResponseChoiceLogprobs logprobs)
+        {
+            Argument.AssertNotNull(message, nameof(message));
+
+            FinishReason = finishReason;
+            Index = index;
+            Message = message;
+            Logprobs = logprobs;
+        }
+
+        internal InternalCreateChatCompletionResponseChoice(ChatFinishReason finishReason, int index, InternalChatCompletionResponseMessage message, InternalCreateChatCompletionResponseChoiceLogprobs logprobs, IDictionary<string, BinaryData> serializedAdditionalRawData)
+        {
+            FinishReason = finishReason;
+            Index = index;
+            Message = message;
+            Logprobs = logprobs;
+            SerializedAdditionalRawData = serializedAdditionalRawData;
+        }
+
+        internal InternalCreateChatCompletionResponseChoice()
+        {
+        }
+
+        public ChatFinishReason FinishReason { get; }
+        public int Index { get; }
+        public InternalChatCompletionResponseMessage Message { get; }
+        public InternalCreateChatCompletionResponseChoiceLogprobs Logprobs { get; }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalCreateChatCompletionResponseChoiceLogprobs.Serialization.cs b/.dotnet/src/Generated/Models/InternalCreateChatCompletionResponseChoiceLogprobs.Serialization.cs
new file mode 100644
index 000000000..1fee709bf
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalCreateChatCompletionResponseChoiceLogprobs.Serialization.cs
@@ -0,0 +1,188 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace OpenAI.Chat
+{
+    internal partial class InternalCreateChatCompletionResponseChoiceLogprobs : IJsonModel<InternalCreateChatCompletionResponseChoiceLogprobs>
+    {
+        
void IJsonModel<InternalCreateChatCompletionResponseChoiceLogprobs>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalCreateChatCompletionResponseChoiceLogprobs>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(InternalCreateChatCompletionResponseChoiceLogprobs)} does not support writing '{format}' format.");
+            }
+
+            writer.WriteStartObject();
+            if (SerializedAdditionalRawData?.ContainsKey("content") != true)
+            {
+                if (Content != null && Optional.IsCollectionDefined(Content))
+                {
+                    writer.WritePropertyName("content"u8);
+                    writer.WriteStartArray();
+                    foreach (var item in Content)
+                    {
+                        writer.WriteObjectValue(item, options);
+                    }
+                    writer.WriteEndArray();
+                }
+                else
+                {
+                    writer.WriteNull("content");
+                }
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("refusal") != true)
+            {
+                if (Refusal != null && Optional.IsCollectionDefined(Refusal))
+                {
+                    writer.WritePropertyName("refusal"u8);
+                    writer.WriteStartArray();
+                    foreach (var item in Refusal)
+                    {
+                        writer.WriteObjectValue(item, options);
+                    }
+                    writer.WriteEndArray();
+                }
+                else
+                {
+                    writer.WriteNull("refusal");
+                }
+            }
+            if (SerializedAdditionalRawData != null)
+            {
+                foreach (var item in SerializedAdditionalRawData)
+                {
+                    if (ModelSerializationExtensions.IsSentinelValue(item.Value))
+                    {
+                        continue;
+                    }
+                    writer.WritePropertyName(item.Key);
+#if NET6_0_OR_GREATER
+                    writer.WriteRawValue(item.Value);
+#else
+                    using (JsonDocument document = JsonDocument.Parse(item.Value))
+                    {
+                        JsonSerializer.Serialize(writer, document.RootElement);
+                    }
+#endif
+                }
+            }
+            writer.WriteEndObject();
+        }
+
+        InternalCreateChatCompletionResponseChoiceLogprobs IJsonModel<InternalCreateChatCompletionResponseChoiceLogprobs>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalCreateChatCompletionResponseChoiceLogprobs>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(InternalCreateChatCompletionResponseChoiceLogprobs)} does not support reading '{format}' format.");
+            }
+
+            using JsonDocument document = JsonDocument.ParseValue(ref reader);
+            return DeserializeInternalCreateChatCompletionResponseChoiceLogprobs(document.RootElement, options);
+        }
+
+        internal static InternalCreateChatCompletionResponseChoiceLogprobs DeserializeInternalCreateChatCompletionResponseChoiceLogprobs(JsonElement element, ModelReaderWriterOptions options = null)
+        {
+            options ??= ModelSerializationExtensions.WireOptions;
+
+            if (element.ValueKind == JsonValueKind.Null)
+            {
+                return null;
+            }
+            IReadOnlyList<ChatTokenLogProbabilityInfo> content = default;
+            IReadOnlyList<ChatTokenLogProbabilityInfo> refusal = default;
+            IDictionary<string, BinaryData> serializedAdditionalRawData = default;
+            Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>();
+            foreach (var property in element.EnumerateObject())
+            {
+                if (property.NameEquals("content"u8))
+                {
+                    if (property.Value.ValueKind == JsonValueKind.Null)
+                    {
+                        content = new ChangeTrackingList<ChatTokenLogProbabilityInfo>();
+                        continue;
+                    }
+                    List<ChatTokenLogProbabilityInfo> array = new List<ChatTokenLogProbabilityInfo>();
+                    foreach (var item in property.Value.EnumerateArray())
+                    {
+                        array.Add(ChatTokenLogProbabilityInfo.DeserializeChatTokenLogProbabilityInfo(item, options));
+                    }
+                    content = array;
+                    continue;
+                }
+                if (property.NameEquals("refusal"u8))
+                {
+                    if (property.Value.ValueKind == JsonValueKind.Null)
+                    {
+                        refusal = new ChangeTrackingList<ChatTokenLogProbabilityInfo>();
+                        continue;
+                    }
+                    List<ChatTokenLogProbabilityInfo> array = new List<ChatTokenLogProbabilityInfo>();
+                    foreach (var item in property.Value.EnumerateArray())
+                    {
+                        array.Add(ChatTokenLogProbabilityInfo.DeserializeChatTokenLogProbabilityInfo(item, options));
+                    }
+                    refusal = array;
+                    continue;
+                }
+                if (true)
+                {
+                    rawDataDictionary ??= new Dictionary<string, BinaryData>();
+                    rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText()));
+                }
+            }
+            serializedAdditionalRawData = rawDataDictionary;
+            return new InternalCreateChatCompletionResponseChoiceLogprobs(content, refusal, serializedAdditionalRawData);
+        }
+
+        BinaryData IPersistableModel<InternalCreateChatCompletionResponseChoiceLogprobs>.Write(ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalCreateChatCompletionResponseChoiceLogprobs>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    return ModelReaderWriter.Write(this, options);
+                default:
+                    throw new FormatException($"The model {nameof(InternalCreateChatCompletionResponseChoiceLogprobs)} does not support writing '{options.Format}' format.");
+            }
+        }
+
+        InternalCreateChatCompletionResponseChoiceLogprobs IPersistableModel<InternalCreateChatCompletionResponseChoiceLogprobs>.Create(BinaryData data, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalCreateChatCompletionResponseChoiceLogprobs>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    {
+                        using JsonDocument document = JsonDocument.Parse(data);
+                        return DeserializeInternalCreateChatCompletionResponseChoiceLogprobs(document.RootElement, options);
+                    }
+                default:
+                    throw new FormatException($"The model {nameof(InternalCreateChatCompletionResponseChoiceLogprobs)} does not support reading '{options.Format}' format.");
+            }
+        }
+
+        string IPersistableModel<InternalCreateChatCompletionResponseChoiceLogprobs>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J";
+
+        internal static InternalCreateChatCompletionResponseChoiceLogprobs FromResponse(PipelineResponse response)
+        {
+            using var document = JsonDocument.Parse(response.Content);
+            return DeserializeInternalCreateChatCompletionResponseChoiceLogprobs(document.RootElement);
+        }
+
+        internal virtual BinaryContent ToBinaryContent()
+        {
+            return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions);
+        }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalCreateChatCompletionResponseChoiceLogprobs.cs b/.dotnet/src/Generated/Models/InternalCreateChatCompletionResponseChoiceLogprobs.cs
new file mode 100644
index 000000000..a03030ec2
--- /dev/null
+++ 
b/.dotnet/src/Generated/Models/InternalCreateChatCompletionResponseChoiceLogprobs.cs
@@ -0,0 +1,34 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.Collections.Generic;
+using System.Linq;
+
+namespace OpenAI.Chat
+{
+    internal partial class InternalCreateChatCompletionResponseChoiceLogprobs
+    {
+        internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; }
+        internal InternalCreateChatCompletionResponseChoiceLogprobs(IEnumerable<ChatTokenLogProbabilityInfo> content, IEnumerable<ChatTokenLogProbabilityInfo> refusal)
+        {
+            Content = content?.ToList();
+            Refusal = refusal?.ToList();
+        }
+
+        internal InternalCreateChatCompletionResponseChoiceLogprobs(IReadOnlyList<ChatTokenLogProbabilityInfo> content, IReadOnlyList<ChatTokenLogProbabilityInfo> refusal, IDictionary<string, BinaryData> serializedAdditionalRawData)
+        {
+            Content = content;
+            Refusal = refusal;
+            SerializedAdditionalRawData = serializedAdditionalRawData;
+        }
+
+        internal InternalCreateChatCompletionResponseChoiceLogprobs()
+        {
+        }
+
+        public IReadOnlyList<ChatTokenLogProbabilityInfo> Content { get; }
+        public IReadOnlyList<ChatTokenLogProbabilityInfo> Refusal { get; }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalCreateChatCompletionResponseObject.cs b/.dotnet/src/Generated/Models/InternalCreateChatCompletionResponseObject.cs
new file mode 100644
index 000000000..9c1c590ed
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalCreateChatCompletionResponseObject.cs
@@ -0,0 +1,34 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ComponentModel;
+
+namespace OpenAI.Chat
+{
+    internal readonly partial struct InternalCreateChatCompletionResponseObject : IEquatable<InternalCreateChatCompletionResponseObject>
+    {
+        private readonly string _value;
+
+        public InternalCreateChatCompletionResponseObject(string value)
+        {
+            _value = value ?? throw new ArgumentNullException(nameof(value));
+        }
+
+        private const string ChatCompletionValue = "chat.completion";
+
+        public static InternalCreateChatCompletionResponseObject ChatCompletion { get; } = new InternalCreateChatCompletionResponseObject(ChatCompletionValue);
+        public static bool operator ==(InternalCreateChatCompletionResponseObject left, InternalCreateChatCompletionResponseObject right) => left.Equals(right);
+        public static bool operator !=(InternalCreateChatCompletionResponseObject left, InternalCreateChatCompletionResponseObject right) => !left.Equals(right);
+        public static implicit operator InternalCreateChatCompletionResponseObject(string value) => new InternalCreateChatCompletionResponseObject(value);
+
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override bool Equals(object obj) => obj is InternalCreateChatCompletionResponseObject other && Equals(other);
+        public bool Equals(InternalCreateChatCompletionResponseObject other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase);
+
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0;
+        public override string ToString() => _value;
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalCreateChatCompletionResponseServiceTier.cs b/.dotnet/src/Generated/Models/InternalCreateChatCompletionResponseServiceTier.cs
new file mode 100644
index 000000000..599297be0
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalCreateChatCompletionResponseServiceTier.cs
@@ -0,0 +1,36 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ComponentModel;
+
+namespace OpenAI.Chat
+{
+    internal readonly partial struct InternalCreateChatCompletionResponseServiceTier : IEquatable<InternalCreateChatCompletionResponseServiceTier>
+    {
+        private readonly string _value;
+
+        public InternalCreateChatCompletionResponseServiceTier(string value)
+        {
+            _value = value ?? 
throw new ArgumentNullException(nameof(value));
+        }
+
+        private const string ScaleValue = "scale";
+        private const string DefaultValue = "default";
+
+        public static InternalCreateChatCompletionResponseServiceTier Scale { get; } = new InternalCreateChatCompletionResponseServiceTier(ScaleValue);
+        public static InternalCreateChatCompletionResponseServiceTier Default { get; } = new InternalCreateChatCompletionResponseServiceTier(DefaultValue);
+        public static bool operator ==(InternalCreateChatCompletionResponseServiceTier left, InternalCreateChatCompletionResponseServiceTier right) => left.Equals(right);
+        public static bool operator !=(InternalCreateChatCompletionResponseServiceTier left, InternalCreateChatCompletionResponseServiceTier right) => !left.Equals(right);
+        public static implicit operator InternalCreateChatCompletionResponseServiceTier(string value) => new InternalCreateChatCompletionResponseServiceTier(value);
+
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override bool Equals(object obj) => obj is InternalCreateChatCompletionResponseServiceTier other && Equals(other);
+        public bool Equals(InternalCreateChatCompletionResponseServiceTier other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase);
+
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override int GetHashCode() => _value != null ? 
StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateChatCompletionStreamResponseChoice.Serialization.cs b/.dotnet/src/Generated/Models/InternalCreateChatCompletionStreamResponseChoice.Serialization.cs new file mode 100644 index 000000000..fd1dd8a10 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateChatCompletionStreamResponseChoice.Serialization.cs @@ -0,0 +1,68 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Text.Json; + +namespace OpenAI.Chat +{ + internal partial class InternalCreateChatCompletionStreamResponseChoice : IJsonModel + { + InternalCreateChatCompletionStreamResponseChoice IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalCreateChatCompletionStreamResponseChoice)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalCreateChatCompletionStreamResponseChoice(document.RootElement, options); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalCreateChatCompletionStreamResponseChoice)} does not support writing '{options.Format}' format."); + } + } + + InternalCreateChatCompletionStreamResponseChoice IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalCreateChatCompletionStreamResponseChoice>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    {
+                        using JsonDocument document = JsonDocument.Parse(data);
+                        return DeserializeInternalCreateChatCompletionStreamResponseChoice(document.RootElement, options);
+                    }
+                default:
+                    throw new FormatException($"The model {nameof(InternalCreateChatCompletionStreamResponseChoice)} does not support reading '{options.Format}' format.");
+            }
+        }
+
+        string IPersistableModel<InternalCreateChatCompletionStreamResponseChoice>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J";
+
+        internal static InternalCreateChatCompletionStreamResponseChoice FromResponse(PipelineResponse response)
+        {
+            using var document = JsonDocument.Parse(response.Content);
+            return DeserializeInternalCreateChatCompletionStreamResponseChoice(document.RootElement);
+        }
+
+        internal virtual BinaryContent ToBinaryContent()
+        {
+            return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions);
+        }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalCreateChatCompletionStreamResponseChoice.cs b/.dotnet/src/Generated/Models/InternalCreateChatCompletionStreamResponseChoice.cs
new file mode 100644
index 000000000..6bb376d9f
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalCreateChatCompletionStreamResponseChoice.cs
@@ -0,0 +1,39 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.Collections.Generic;
+
+namespace OpenAI.Chat
+{
+    internal partial class InternalCreateChatCompletionStreamResponseChoice
+    {
+        internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; }
+        internal InternalCreateChatCompletionStreamResponseChoice(InternalChatCompletionStreamResponseDelta delta, ChatFinishReason? finishReason, int index)
+        {
+            Argument.AssertNotNull(delta, nameof(delta));
+
+            Delta = delta;
+            FinishReason = finishReason;
+            Index = index;
+        }
+
+        internal InternalCreateChatCompletionStreamResponseChoice(InternalChatCompletionStreamResponseDelta delta, InternalCreateChatCompletionStreamResponseChoiceLogprobs logprobs, ChatFinishReason? finishReason, int index, IDictionary<string, BinaryData> serializedAdditionalRawData)
+        {
+            Delta = delta;
+            Logprobs = logprobs;
+            FinishReason = finishReason;
+            Index = index;
+            SerializedAdditionalRawData = serializedAdditionalRawData;
+        }
+
+        internal InternalCreateChatCompletionStreamResponseChoice()
+        {
+        }
+
+        public InternalChatCompletionStreamResponseDelta Delta { get; }
+        public InternalCreateChatCompletionStreamResponseChoiceLogprobs Logprobs { get; }
+        public int Index { get; }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalCreateChatCompletionStreamResponseChoiceFinishReason.cs b/.dotnet/src/Generated/Models/InternalCreateChatCompletionStreamResponseChoiceFinishReason.cs
new file mode 100644
index 000000000..76fdf5a27
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalCreateChatCompletionStreamResponseChoiceFinishReason.cs
@@ -0,0 +1,42 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ComponentModel;
+
+namespace OpenAI.Chat
+{
+    internal readonly partial struct InternalCreateChatCompletionStreamResponseChoiceFinishReason : IEquatable<InternalCreateChatCompletionStreamResponseChoiceFinishReason>
+    {
+        private readonly string _value;
+
+        public InternalCreateChatCompletionStreamResponseChoiceFinishReason(string value)
+        {
+            _value = value ??
throw new ArgumentNullException(nameof(value)); + } + + private const string StopValue = "stop"; + private const string LengthValue = "length"; + private const string ToolCallsValue = "tool_calls"; + private const string ContentFilterValue = "content_filter"; + private const string FunctionCallValue = "function_call"; + + public static InternalCreateChatCompletionStreamResponseChoiceFinishReason Stop { get; } = new InternalCreateChatCompletionStreamResponseChoiceFinishReason(StopValue); + public static InternalCreateChatCompletionStreamResponseChoiceFinishReason Length { get; } = new InternalCreateChatCompletionStreamResponseChoiceFinishReason(LengthValue); + public static InternalCreateChatCompletionStreamResponseChoiceFinishReason ToolCalls { get; } = new InternalCreateChatCompletionStreamResponseChoiceFinishReason(ToolCallsValue); + public static InternalCreateChatCompletionStreamResponseChoiceFinishReason ContentFilter { get; } = new InternalCreateChatCompletionStreamResponseChoiceFinishReason(ContentFilterValue); + public static InternalCreateChatCompletionStreamResponseChoiceFinishReason FunctionCall { get; } = new InternalCreateChatCompletionStreamResponseChoiceFinishReason(FunctionCallValue); + public static bool operator ==(InternalCreateChatCompletionStreamResponseChoiceFinishReason left, InternalCreateChatCompletionStreamResponseChoiceFinishReason right) => left.Equals(right); + public static bool operator !=(InternalCreateChatCompletionStreamResponseChoiceFinishReason left, InternalCreateChatCompletionStreamResponseChoiceFinishReason right) => !left.Equals(right); + public static implicit operator InternalCreateChatCompletionStreamResponseChoiceFinishReason(string value) => new InternalCreateChatCompletionStreamResponseChoiceFinishReason(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalCreateChatCompletionStreamResponseChoiceFinishReason other && Equals(other); + public bool 
Equals(InternalCreateChatCompletionStreamResponseChoiceFinishReason other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase);
+
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0;
+        public override string ToString() => _value;
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalCreateChatCompletionStreamResponseChoiceLogprobs.Serialization.cs b/.dotnet/src/Generated/Models/InternalCreateChatCompletionStreamResponseChoiceLogprobs.Serialization.cs
new file mode 100644
index 000000000..43ebedfc9
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalCreateChatCompletionStreamResponseChoiceLogprobs.Serialization.cs
@@ -0,0 +1,188 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace OpenAI.Chat
+{
+    internal partial class InternalCreateChatCompletionStreamResponseChoiceLogprobs : IJsonModel<InternalCreateChatCompletionStreamResponseChoiceLogprobs>
+    {
+        void IJsonModel<InternalCreateChatCompletionStreamResponseChoiceLogprobs>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ?
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalCreateChatCompletionStreamResponseChoiceLogprobs)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("content") != true) + { + if (Content != null && Optional.IsCollectionDefined(Content)) + { + writer.WritePropertyName("content"u8); + writer.WriteStartArray(); + foreach (var item in Content) + { + writer.WriteObjectValue(item, options); + } + writer.WriteEndArray(); + } + else + { + writer.WriteNull("content"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("refusal") != true) + { + if (Refusal != null && Optional.IsCollectionDefined(Refusal)) + { + writer.WritePropertyName("refusal"u8); + writer.WriteStartArray(); + foreach (var item in Refusal) + { + writer.WriteObjectValue(item, options); + } + writer.WriteEndArray(); + } + else + { + writer.WriteNull("refusal"); + } + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalCreateChatCompletionStreamResponseChoiceLogprobs IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalCreateChatCompletionStreamResponseChoiceLogprobs>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(InternalCreateChatCompletionStreamResponseChoiceLogprobs)} does not support reading '{format}' format.");
+            }
+
+            using JsonDocument document = JsonDocument.ParseValue(ref reader);
+            return DeserializeInternalCreateChatCompletionStreamResponseChoiceLogprobs(document.RootElement, options);
+        }
+
+        internal static InternalCreateChatCompletionStreamResponseChoiceLogprobs DeserializeInternalCreateChatCompletionStreamResponseChoiceLogprobs(JsonElement element, ModelReaderWriterOptions options = null)
+        {
+            options ??= ModelSerializationExtensions.WireOptions;
+
+            if (element.ValueKind == JsonValueKind.Null)
+            {
+                return null;
+            }
+            IReadOnlyList<ChatTokenLogProbabilityInfo> content = default;
+            IReadOnlyList<ChatTokenLogProbabilityInfo> refusal = default;
+            IDictionary<string, BinaryData> serializedAdditionalRawData = default;
+            Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>();
+            foreach (var property in element.EnumerateObject())
+            {
+                if (property.NameEquals("content"u8))
+                {
+                    if (property.Value.ValueKind == JsonValueKind.Null)
+                    {
+                        content = new ChangeTrackingList<ChatTokenLogProbabilityInfo>();
+                        continue;
+                    }
+                    List<ChatTokenLogProbabilityInfo> array = new List<ChatTokenLogProbabilityInfo>();
+                    foreach (var item in property.Value.EnumerateArray())
+                    {
+                        array.Add(ChatTokenLogProbabilityInfo.DeserializeChatTokenLogProbabilityInfo(item, options));
+                    }
+                    content = array;
+                    continue;
+                }
+                if (property.NameEquals("refusal"u8))
+                {
+                    if (property.Value.ValueKind == JsonValueKind.Null)
+                    {
+                        refusal = new ChangeTrackingList<ChatTokenLogProbabilityInfo>();
+                        continue;
+                    }
+                    List<ChatTokenLogProbabilityInfo> array = new List<ChatTokenLogProbabilityInfo>();
+                    foreach (var item in property.Value.EnumerateArray())
+                    {
+                        array.Add(ChatTokenLogProbabilityInfo.DeserializeChatTokenLogProbabilityInfo(item, options));
+                    }
+                    refusal = array;
+                    continue;
+                }
+                if (true)
+                {
+                    rawDataDictionary ??= new Dictionary<string, BinaryData>();
+                    rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText()));
+                }
+            }
+            serializedAdditionalRawData = rawDataDictionary;
+            return new InternalCreateChatCompletionStreamResponseChoiceLogprobs(content, refusal, serializedAdditionalRawData);
+        }
+
+        BinaryData IPersistableModel<InternalCreateChatCompletionStreamResponseChoiceLogprobs>.Write(ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalCreateChatCompletionStreamResponseChoiceLogprobs>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    return ModelReaderWriter.Write(this, options);
+                default:
+                    throw new FormatException($"The model {nameof(InternalCreateChatCompletionStreamResponseChoiceLogprobs)} does not support writing '{options.Format}' format.");
+            }
+        }
+
+        InternalCreateChatCompletionStreamResponseChoiceLogprobs IPersistableModel<InternalCreateChatCompletionStreamResponseChoiceLogprobs>.Create(BinaryData data, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalCreateChatCompletionStreamResponseChoiceLogprobs>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    {
+                        using JsonDocument document = JsonDocument.Parse(data);
+                        return DeserializeInternalCreateChatCompletionStreamResponseChoiceLogprobs(document.RootElement, options);
+                    }
+                default:
+                    throw new FormatException($"The model {nameof(InternalCreateChatCompletionStreamResponseChoiceLogprobs)} does not support reading '{options.Format}' format.");
+            }
+        }
+
+        string IPersistableModel<InternalCreateChatCompletionStreamResponseChoiceLogprobs>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J";
+
+        internal static InternalCreateChatCompletionStreamResponseChoiceLogprobs FromResponse(PipelineResponse response)
+        {
+            using var document = JsonDocument.Parse(response.Content);
+            return DeserializeInternalCreateChatCompletionStreamResponseChoiceLogprobs(document.RootElement);
+        }
+
+        internal virtual BinaryContent ToBinaryContent()
+        {
+            return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions);
+        }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalCreateChatCompletionStreamResponseChoiceLogprobs.cs b/.dotnet/src/Generated/Models/InternalCreateChatCompletionStreamResponseChoiceLogprobs.cs
new file mode 100644
index 000000000..03a852399
--- /dev/null
+++
b/.dotnet/src/Generated/Models/InternalCreateChatCompletionStreamResponseChoiceLogprobs.cs
@@ -0,0 +1,34 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.Collections.Generic;
+using System.Linq;
+
+namespace OpenAI.Chat
+{
+    internal partial class InternalCreateChatCompletionStreamResponseChoiceLogprobs
+    {
+        internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; }
+        internal InternalCreateChatCompletionStreamResponseChoiceLogprobs(IEnumerable<ChatTokenLogProbabilityInfo> content, IEnumerable<ChatTokenLogProbabilityInfo> refusal)
+        {
+            Content = content?.ToList();
+            Refusal = refusal?.ToList();
+        }
+
+        internal InternalCreateChatCompletionStreamResponseChoiceLogprobs(IReadOnlyList<ChatTokenLogProbabilityInfo> content, IReadOnlyList<ChatTokenLogProbabilityInfo> refusal, IDictionary<string, BinaryData> serializedAdditionalRawData)
+        {
+            Content = content;
+            Refusal = refusal;
+            SerializedAdditionalRawData = serializedAdditionalRawData;
+        }
+
+        internal InternalCreateChatCompletionStreamResponseChoiceLogprobs()
+        {
+        }
+
+        public IReadOnlyList<ChatTokenLogProbabilityInfo> Content { get; }
+        public IReadOnlyList<ChatTokenLogProbabilityInfo> Refusal { get; }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalCreateChatCompletionStreamResponseObject.cs b/.dotnet/src/Generated/Models/InternalCreateChatCompletionStreamResponseObject.cs
new file mode 100644
index 000000000..65aca720e
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalCreateChatCompletionStreamResponseObject.cs
@@ -0,0 +1,34 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ComponentModel;
+
+namespace OpenAI.Chat
+{
+    internal readonly partial struct InternalCreateChatCompletionStreamResponseObject : IEquatable<InternalCreateChatCompletionStreamResponseObject>
+    {
+        private readonly string _value;
+
+        public InternalCreateChatCompletionStreamResponseObject(string value)
+        {
+            _value = value ??
throw new ArgumentNullException(nameof(value)); + } + + private const string ChatCompletionChunkValue = "chat.completion.chunk"; + + public static InternalCreateChatCompletionStreamResponseObject ChatCompletionChunk { get; } = new InternalCreateChatCompletionStreamResponseObject(ChatCompletionChunkValue); + public static bool operator ==(InternalCreateChatCompletionStreamResponseObject left, InternalCreateChatCompletionStreamResponseObject right) => left.Equals(right); + public static bool operator !=(InternalCreateChatCompletionStreamResponseObject left, InternalCreateChatCompletionStreamResponseObject right) => !left.Equals(right); + public static implicit operator InternalCreateChatCompletionStreamResponseObject(string value) => new InternalCreateChatCompletionStreamResponseObject(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalCreateChatCompletionStreamResponseObject other && Equals(other); + public bool Equals(InternalCreateChatCompletionStreamResponseObject other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? 
StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0;
+        public override string ToString() => _value;
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalCreateChatCompletionStreamResponseServiceTier.cs b/.dotnet/src/Generated/Models/InternalCreateChatCompletionStreamResponseServiceTier.cs
new file mode 100644
index 000000000..0f2eb1326
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalCreateChatCompletionStreamResponseServiceTier.cs
@@ -0,0 +1,36 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ComponentModel;
+
+namespace OpenAI.Chat
+{
+    internal readonly partial struct InternalCreateChatCompletionStreamResponseServiceTier : IEquatable<InternalCreateChatCompletionStreamResponseServiceTier>
+    {
+        private readonly string _value;
+
+        public InternalCreateChatCompletionStreamResponseServiceTier(string value)
+        {
+            _value = value ?? throw new ArgumentNullException(nameof(value));
+        }
+
+        private const string ScaleValue = "scale";
+        private const string DefaultValue = "default";
+
+        public static InternalCreateChatCompletionStreamResponseServiceTier Scale { get; } = new InternalCreateChatCompletionStreamResponseServiceTier(ScaleValue);
+        public static InternalCreateChatCompletionStreamResponseServiceTier Default { get; } = new InternalCreateChatCompletionStreamResponseServiceTier(DefaultValue);
+        public static bool operator ==(InternalCreateChatCompletionStreamResponseServiceTier left, InternalCreateChatCompletionStreamResponseServiceTier right) => left.Equals(right);
+        public static bool operator !=(InternalCreateChatCompletionStreamResponseServiceTier left, InternalCreateChatCompletionStreamResponseServiceTier right) => !left.Equals(right);
+        public static implicit operator InternalCreateChatCompletionStreamResponseServiceTier(string value) => new InternalCreateChatCompletionStreamResponseServiceTier(value);
+
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override bool Equals(object obj) => obj is InternalCreateChatCompletionStreamResponseServiceTier other && Equals(other);
+        public bool Equals(InternalCreateChatCompletionStreamResponseServiceTier other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase);
+
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0;
+        public override string ToString() => _value;
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalCreateChatCompletionStreamResponseUsage.Serialization.cs b/.dotnet/src/Generated/Models/InternalCreateChatCompletionStreamResponseUsage.Serialization.cs
new file mode 100644
index 000000000..300a01f0c
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalCreateChatCompletionStreamResponseUsage.Serialization.cs
@@ -0,0 +1,155 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace OpenAI.Chat
+{
+    internal partial class InternalCreateChatCompletionStreamResponseUsage : IJsonModel<InternalCreateChatCompletionStreamResponseUsage>
+    {
+        void IJsonModel<InternalCreateChatCompletionStreamResponseUsage>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ?
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalCreateChatCompletionStreamResponseUsage)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("completion_tokens") != true) + { + writer.WritePropertyName("completion_tokens"u8); + writer.WriteNumberValue(CompletionTokens); + } + if (SerializedAdditionalRawData?.ContainsKey("prompt_tokens") != true) + { + writer.WritePropertyName("prompt_tokens"u8); + writer.WriteNumberValue(PromptTokens); + } + if (SerializedAdditionalRawData?.ContainsKey("total_tokens") != true) + { + writer.WritePropertyName("total_tokens"u8); + writer.WriteNumberValue(TotalTokens); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalCreateChatCompletionStreamResponseUsage IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalCreateChatCompletionStreamResponseUsage>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(InternalCreateChatCompletionStreamResponseUsage)} does not support reading '{format}' format.");
+            }
+
+            using JsonDocument document = JsonDocument.ParseValue(ref reader);
+            return DeserializeInternalCreateChatCompletionStreamResponseUsage(document.RootElement, options);
+        }
+
+        internal static InternalCreateChatCompletionStreamResponseUsage DeserializeInternalCreateChatCompletionStreamResponseUsage(JsonElement element, ModelReaderWriterOptions options = null)
+        {
+            options ??= ModelSerializationExtensions.WireOptions;
+
+            if (element.ValueKind == JsonValueKind.Null)
+            {
+                return null;
+            }
+            int completionTokens = default;
+            int promptTokens = default;
+            int totalTokens = default;
+            IDictionary<string, BinaryData> serializedAdditionalRawData = default;
+            Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>();
+            foreach (var property in element.EnumerateObject())
+            {
+                if (property.NameEquals("completion_tokens"u8))
+                {
+                    completionTokens = property.Value.GetInt32();
+                    continue;
+                }
+                if (property.NameEquals("prompt_tokens"u8))
+                {
+                    promptTokens = property.Value.GetInt32();
+                    continue;
+                }
+                if (property.NameEquals("total_tokens"u8))
+                {
+                    totalTokens = property.Value.GetInt32();
+                    continue;
+                }
+                if (true)
+                {
+                    rawDataDictionary ??= new Dictionary<string, BinaryData>();
+                    rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText()));
+                }
+            }
+            serializedAdditionalRawData = rawDataDictionary;
+            return new InternalCreateChatCompletionStreamResponseUsage(completionTokens, promptTokens, totalTokens, serializedAdditionalRawData);
+        }
+
+        BinaryData IPersistableModel<InternalCreateChatCompletionStreamResponseUsage>.Write(ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ?
((IPersistableModel<InternalCreateChatCompletionStreamResponseUsage>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    return ModelReaderWriter.Write(this, options);
+                default:
+                    throw new FormatException($"The model {nameof(InternalCreateChatCompletionStreamResponseUsage)} does not support writing '{options.Format}' format.");
+            }
+        }
+
+        InternalCreateChatCompletionStreamResponseUsage IPersistableModel<InternalCreateChatCompletionStreamResponseUsage>.Create(BinaryData data, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalCreateChatCompletionStreamResponseUsage>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    {
+                        using JsonDocument document = JsonDocument.Parse(data);
+                        return DeserializeInternalCreateChatCompletionStreamResponseUsage(document.RootElement, options);
+                    }
+                default:
+                    throw new FormatException($"The model {nameof(InternalCreateChatCompletionStreamResponseUsage)} does not support reading '{options.Format}' format.");
+            }
+        }
+
+        string IPersistableModel<InternalCreateChatCompletionStreamResponseUsage>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J";
+
+        internal static InternalCreateChatCompletionStreamResponseUsage FromResponse(PipelineResponse response)
+        {
+            using var document = JsonDocument.Parse(response.Content);
+            return DeserializeInternalCreateChatCompletionStreamResponseUsage(document.RootElement);
+        }
+
+        internal virtual BinaryContent ToBinaryContent()
+        {
+            return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions);
+        }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalCreateChatCompletionStreamResponseUsage.cs b/.dotnet/src/Generated/Models/InternalCreateChatCompletionStreamResponseUsage.cs
new file mode 100644
index 000000000..922ce4cae
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalCreateChatCompletionStreamResponseUsage.cs
@@ -0,0 +1,36 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.Collections.Generic;
+
+namespace OpenAI.Chat
+{
+    internal partial class InternalCreateChatCompletionStreamResponseUsage
+    {
+        internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; }
+        internal InternalCreateChatCompletionStreamResponseUsage(int completionTokens, int promptTokens, int totalTokens)
+        {
+            CompletionTokens = completionTokens;
+            PromptTokens = promptTokens;
+            TotalTokens = totalTokens;
+        }
+
+        internal InternalCreateChatCompletionStreamResponseUsage(int completionTokens, int promptTokens, int totalTokens, IDictionary<string, BinaryData> serializedAdditionalRawData)
+        {
+            CompletionTokens = completionTokens;
+            PromptTokens = promptTokens;
+            TotalTokens = totalTokens;
+            SerializedAdditionalRawData = serializedAdditionalRawData;
+        }
+
+        internal InternalCreateChatCompletionStreamResponseUsage()
+        {
+        }
+
+        public int CompletionTokens { get; }
+        public int PromptTokens { get; }
+        public int TotalTokens { get; }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalCreateCompletionRequest.Serialization.cs b/.dotnet/src/Generated/Models/InternalCreateCompletionRequest.Serialization.cs
new file mode 100644
index 000000000..6658d5c48
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalCreateCompletionRequest.Serialization.cs
@@ -0,0 +1,556 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+using OpenAI.Chat;
+
+namespace OpenAI.LegacyCompletions
+{
+    internal partial class InternalCreateCompletionRequest : IJsonModel<InternalCreateCompletionRequest>
+    {
+        void IJsonModel<InternalCreateCompletionRequest>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ?
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalCreateCompletionRequest)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("model") != true) + { + writer.WritePropertyName("model"u8); + writer.WriteStringValue(Model.ToString()); + } + if (SerializedAdditionalRawData?.ContainsKey("prompt") != true) + { + if (Prompt != null) + { + writer.WritePropertyName("prompt"u8); +#if NET6_0_OR_GREATER + writer.WriteRawValue(Prompt); +#else + using (JsonDocument document = JsonDocument.Parse(Prompt)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + else + { + writer.WriteNull("prompt"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("best_of") != true && Optional.IsDefined(BestOf)) + { + if (BestOf != null) + { + writer.WritePropertyName("best_of"u8); + writer.WriteNumberValue(BestOf.Value); + } + else + { + writer.WriteNull("best_of"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("echo") != true && Optional.IsDefined(Echo)) + { + if (Echo != null) + { + writer.WritePropertyName("echo"u8); + writer.WriteBooleanValue(Echo.Value); + } + else + { + writer.WriteNull("echo"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("frequency_penalty") != true && Optional.IsDefined(FrequencyPenalty)) + { + if (FrequencyPenalty != null) + { + writer.WritePropertyName("frequency_penalty"u8); + writer.WriteNumberValue(FrequencyPenalty.Value); + } + else + { + writer.WriteNull("frequency_penalty"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("logit_bias") != true && Optional.IsCollectionDefined(LogitBias)) + { + if (LogitBias != null) + { + writer.WritePropertyName("logit_bias"u8); + writer.WriteStartObject(); + foreach (var item in LogitBias) + { + writer.WritePropertyName(item.Key); + writer.WriteNumberValue(item.Value); + } + 
writer.WriteEndObject(); + } + else + { + writer.WriteNull("logit_bias"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("logprobs") != true && Optional.IsDefined(Logprobs)) + { + if (Logprobs != null) + { + writer.WritePropertyName("logprobs"u8); + writer.WriteNumberValue(Logprobs.Value); + } + else + { + writer.WriteNull("logprobs"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("max_tokens") != true && Optional.IsDefined(MaxTokens)) + { + if (MaxTokens != null) + { + writer.WritePropertyName("max_tokens"u8); + writer.WriteNumberValue(MaxTokens.Value); + } + else + { + writer.WriteNull("max_tokens"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("n") != true && Optional.IsDefined(N)) + { + if (N != null) + { + writer.WritePropertyName("n"u8); + writer.WriteNumberValue(N.Value); + } + else + { + writer.WriteNull("n"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("presence_penalty") != true && Optional.IsDefined(PresencePenalty)) + { + if (PresencePenalty != null) + { + writer.WritePropertyName("presence_penalty"u8); + writer.WriteNumberValue(PresencePenalty.Value); + } + else + { + writer.WriteNull("presence_penalty"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("seed") != true && Optional.IsDefined(Seed)) + { + if (Seed != null) + { + writer.WritePropertyName("seed"u8); + writer.WriteNumberValue(Seed.Value); + } + else + { + writer.WriteNull("seed"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("stop") != true && Optional.IsDefined(Stop)) + { + if (Stop != null) + { + writer.WritePropertyName("stop"u8); +#if NET6_0_OR_GREATER + writer.WriteRawValue(Stop); +#else + using (JsonDocument document = JsonDocument.Parse(Stop)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + else + { + writer.WriteNull("stop"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("stream") != true && Optional.IsDefined(Stream)) + { + if (Stream != null) + { + writer.WritePropertyName("stream"u8); + 
writer.WriteBooleanValue(Stream.Value); + } + else + { + writer.WriteNull("stream"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("stream_options") != true && Optional.IsDefined(StreamOptions)) + { + if (StreamOptions != null) + { + writer.WritePropertyName("stream_options"u8); + writer.WriteObjectValue(StreamOptions, options); + } + else + { + writer.WriteNull("stream_options"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("suffix") != true && Optional.IsDefined(Suffix)) + { + if (Suffix != null) + { + writer.WritePropertyName("suffix"u8); + writer.WriteStringValue(Suffix); + } + else + { + writer.WriteNull("suffix"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("temperature") != true && Optional.IsDefined(Temperature)) + { + if (Temperature != null) + { + writer.WritePropertyName("temperature"u8); + writer.WriteNumberValue(Temperature.Value); + } + else + { + writer.WriteNull("temperature"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("top_p") != true && Optional.IsDefined(TopP)) + { + if (TopP != null) + { + writer.WritePropertyName("top_p"u8); + writer.WriteNumberValue(TopP.Value); + } + else + { + writer.WriteNull("top_p"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("user") != true && Optional.IsDefined(User)) + { + writer.WritePropertyName("user"u8); + writer.WriteStringValue(User); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalCreateCompletionRequest IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalCreateCompletionRequest>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(InternalCreateCompletionRequest)} does not support reading '{format}' format.");
+            }
+
+            using JsonDocument document = JsonDocument.ParseValue(ref reader);
+            return DeserializeInternalCreateCompletionRequest(document.RootElement, options);
+        }
+
+        internal static InternalCreateCompletionRequest DeserializeInternalCreateCompletionRequest(JsonElement element, ModelReaderWriterOptions options = null)
+        {
+            options ??= ModelSerializationExtensions.WireOptions;
+
+            if (element.ValueKind == JsonValueKind.Null)
+            {
+                return null;
+            }
+            InternalCreateCompletionRequestModel model = default;
+            BinaryData prompt = default;
+            int? bestOf = default;
+            bool? echo = default;
+            float? frequencyPenalty = default;
+            IDictionary<string, int> logitBias = default;
+            int? logprobs = default;
+            int? maxTokens = default;
+            int? n = default;
+            float? presencePenalty = default;
+            long? seed = default;
+            BinaryData stop = default;
+            bool? stream = default;
+            InternalChatCompletionStreamOptions streamOptions = default;
+            string suffix = default;
+            float? temperature = default;
+            float? topP = default;
+            string user = default;
+            IDictionary<string, BinaryData> serializedAdditionalRawData = default;
+            Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>();
+            foreach (var property in element.EnumerateObject())
+            {
+                if (property.NameEquals("model"u8))
+                {
+                    model = new InternalCreateCompletionRequestModel(property.Value.GetString());
+                    continue;
+                }
+                if (property.NameEquals("prompt"u8))
+                {
+                    if (property.Value.ValueKind == JsonValueKind.Null)
+                    {
+                        prompt = null;
+                        continue;
+                    }
+                    prompt = BinaryData.FromString(property.Value.GetRawText());
+                    continue;
+                }
+                if (property.NameEquals("best_of"u8))
+                {
+                    if (property.Value.ValueKind == JsonValueKind.Null)
+                    {
+                        bestOf = null;
+                        continue;
+                    }
+                    bestOf = property.Value.GetInt32();
+                    continue;
+                }
+                if (property.NameEquals("echo"u8))
+                {
+                    if (property.Value.ValueKind == JsonValueKind.Null)
+                    {
+                        echo = null;
+                        continue;
+                    }
+                    echo = property.Value.GetBoolean();
+                    continue;
+                }
+                if (property.NameEquals("frequency_penalty"u8))
+                {
+                    if (property.Value.ValueKind == JsonValueKind.Null)
+                    {
+                        frequencyPenalty = null;
+                        continue;
+                    }
+                    frequencyPenalty = property.Value.GetSingle();
+                    continue;
+                }
+                if (property.NameEquals("logit_bias"u8))
+                {
+                    if (property.Value.ValueKind == JsonValueKind.Null)
+                    {
+                        continue;
+                    }
+                    Dictionary<string, int> dictionary = new Dictionary<string, int>();
+                    foreach (var property0 in property.Value.EnumerateObject())
+                    {
+                        dictionary.Add(property0.Name, property0.Value.GetInt32());
+                    }
+                    logitBias = dictionary;
+                    continue;
+                }
+                if (property.NameEquals("logprobs"u8))
+                {
+                    if (property.Value.ValueKind == JsonValueKind.Null)
+                    {
+                        logprobs = null;
+                        continue;
+                    }
+                    logprobs = property.Value.GetInt32();
+                    continue;
+                }
+                if (property.NameEquals("max_tokens"u8))
+                {
+                    if (property.Value.ValueKind == JsonValueKind.Null)
+                    {
+                        maxTokens = null;
+                        continue;
+                    }
+                    maxTokens = property.Value.GetInt32();
+                    continue;
+                }
+                if (property.NameEquals("n"u8))
+                {
+                    if (property.Value.ValueKind == JsonValueKind.Null)
+                    {
+                        n = null;
+                        continue;
} + n = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("presence_penalty"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + presencePenalty = null; + continue; + } + presencePenalty = property.Value.GetSingle(); + continue; + } + if (property.NameEquals("seed"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + seed = null; + continue; + } + seed = property.Value.GetInt64(); + continue; + } + if (property.NameEquals("stop"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + stop = null; + continue; + } + stop = BinaryData.FromString(property.Value.GetRawText()); + continue; + } + if (property.NameEquals("stream"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + stream = null; + continue; + } + stream = property.Value.GetBoolean(); + continue; + } + if (property.NameEquals("stream_options"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + streamOptions = null; + continue; + } + streamOptions = InternalChatCompletionStreamOptions.DeserializeInternalChatCompletionStreamOptions(property.Value, options); + continue; + } + if (property.NameEquals("suffix"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + suffix = null; + continue; + } + suffix = property.Value.GetString(); + continue; + } + if (property.NameEquals("temperature"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + temperature = null; + continue; + } + temperature = property.Value.GetSingle(); + continue; + } + if (property.NameEquals("top_p"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + topP = null; + continue; + } + topP = property.Value.GetSingle(); + continue; + } + if (property.NameEquals("user"u8)) + { + user = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = 
rawDataDictionary; + return new InternalCreateCompletionRequest( + model, + prompt, + bestOf, + echo, + frequencyPenalty, + logitBias ?? new ChangeTrackingDictionary(), + logprobs, + maxTokens, + n, + presencePenalty, + seed, + stop, + stream, + streamOptions, + suffix, + temperature, + topP, + user, + serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalCreateCompletionRequest)} does not support writing '{options.Format}' format."); + } + } + + InternalCreateCompletionRequest IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalCreateCompletionRequest(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalCreateCompletionRequest)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalCreateCompletionRequest FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalCreateCompletionRequest(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateCompletionRequest.cs b/.dotnet/src/Generated/Models/InternalCreateCompletionRequest.cs new file mode 100644 index 
000000000..25e584d3b --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateCompletionRequest.cs @@ -0,0 +1,67 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.Collections.Generic; +using OpenAI.Chat; + +namespace OpenAI.LegacyCompletions +{ + internal partial class InternalCreateCompletionRequest + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + public InternalCreateCompletionRequest(InternalCreateCompletionRequestModel model, BinaryData prompt) + { + Model = model; + Prompt = prompt; + LogitBias = new ChangeTrackingDictionary<string, int>(); + } + + internal InternalCreateCompletionRequest(InternalCreateCompletionRequestModel model, BinaryData prompt, int? bestOf, bool? echo, float? frequencyPenalty, IDictionary<string, int> logitBias, int? logprobs, int? maxTokens, int? n, float? presencePenalty, long? seed, BinaryData stop, bool? stream, InternalChatCompletionStreamOptions streamOptions, string suffix, float? temperature, float? topP, string user, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + Model = model; + Prompt = prompt; + BestOf = bestOf; + Echo = echo; + FrequencyPenalty = frequencyPenalty; + LogitBias = logitBias; + Logprobs = logprobs; + MaxTokens = maxTokens; + N = n; + PresencePenalty = presencePenalty; + Seed = seed; + Stop = stop; + Stream = stream; + StreamOptions = streamOptions; + Suffix = suffix; + Temperature = temperature; + TopP = topP; + User = user; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal InternalCreateCompletionRequest() + { + } + + public InternalCreateCompletionRequestModel Model { get; } + public BinaryData Prompt { get; } + public int? BestOf { get; set; } + public bool? Echo { get; set; } + public float? FrequencyPenalty { get; set; } + public IDictionary<string, int> LogitBias { get; set; } + public int? Logprobs { get; set; } + public int? MaxTokens { get; set; } + public int? N { get; set; } + public float? PresencePenalty { get; set; } + public long?
Seed { get; set; } + public BinaryData Stop { get; set; } + public bool? Stream { get; set; } + public InternalChatCompletionStreamOptions StreamOptions { get; set; } + public string Suffix { get; set; } + public float? Temperature { get; set; } + public float? TopP { get; set; } + public string User { get; set; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateCompletionRequestModel.cs b/.dotnet/src/Generated/Models/InternalCreateCompletionRequestModel.cs new file mode 100644 index 000000000..46b5e9d0f --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateCompletionRequestModel.cs @@ -0,0 +1,38 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.LegacyCompletions +{ + internal readonly partial struct InternalCreateCompletionRequestModel : IEquatable<InternalCreateCompletionRequestModel> + { + private readonly string _value; + + public InternalCreateCompletionRequestModel(string value) + { + _value = value ?? throw new ArgumentNullException(nameof(value)); + } + + private const string Gpt35TurboInstructValue = "gpt-3.5-turbo-instruct"; + private const string Davinci002Value = "davinci-002"; + private const string Babbage002Value = "babbage-002"; + + public static InternalCreateCompletionRequestModel Gpt35TurboInstruct { get; } = new InternalCreateCompletionRequestModel(Gpt35TurboInstructValue); + public static InternalCreateCompletionRequestModel Davinci002 { get; } = new InternalCreateCompletionRequestModel(Davinci002Value); + public static InternalCreateCompletionRequestModel Babbage002 { get; } = new InternalCreateCompletionRequestModel(Babbage002Value); + public static bool operator ==(InternalCreateCompletionRequestModel left, InternalCreateCompletionRequestModel right) => left.Equals(right); + public static bool operator !=(InternalCreateCompletionRequestModel left, InternalCreateCompletionRequestModel right) => !left.Equals(right); + public static implicit operator InternalCreateCompletionRequestModel(string value) => new InternalCreateCompletionRequestModel(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalCreateCompletionRequestModel other && Equals(other); + public bool Equals(InternalCreateCompletionRequestModel other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateCompletionResponse.Serialization.cs b/.dotnet/src/Generated/Models/InternalCreateCompletionResponse.Serialization.cs new file mode 100644 index 000000000..7f641dc9b --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateCompletionResponse.Serialization.cs @@ -0,0 +1,222 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; +using OpenAI.Chat; + +namespace OpenAI.LegacyCompletions +{ + internal partial class InternalCreateCompletionResponse : IJsonModel<InternalCreateCompletionResponse> + { + void IJsonModel<InternalCreateCompletionResponse>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ?
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalCreateCompletionResponse)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("id") != true) + { + writer.WritePropertyName("id"u8); + writer.WriteStringValue(Id); + } + if (SerializedAdditionalRawData?.ContainsKey("choices") != true) + { + writer.WritePropertyName("choices"u8); + writer.WriteStartArray(); + foreach (var item in Choices) + { + writer.WriteObjectValue(item, options); + } + writer.WriteEndArray(); + } + if (SerializedAdditionalRawData?.ContainsKey("created") != true) + { + writer.WritePropertyName("created"u8); + writer.WriteNumberValue(Created, "U"); + } + if (SerializedAdditionalRawData?.ContainsKey("model") != true) + { + writer.WritePropertyName("model"u8); + writer.WriteStringValue(Model); + } + if (SerializedAdditionalRawData?.ContainsKey("system_fingerprint") != true && Optional.IsDefined(SystemFingerprint)) + { + writer.WritePropertyName("system_fingerprint"u8); + writer.WriteStringValue(SystemFingerprint); + } + if (SerializedAdditionalRawData?.ContainsKey("object") != true) + { + writer.WritePropertyName("object"u8); + writer.WriteStringValue(Object.ToString()); + } + if (SerializedAdditionalRawData?.ContainsKey("usage") != true && Optional.IsDefined(Usage)) + { + writer.WritePropertyName("usage"u8); + writer.WriteObjectValue(Usage, options); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + 
InternalCreateCompletionResponse IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalCreateCompletionResponse)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalCreateCompletionResponse(document.RootElement, options); + } + + internal static InternalCreateCompletionResponse DeserializeInternalCreateCompletionResponse(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string id = default; + IReadOnlyList choices = default; + DateTimeOffset created = default; + string model = default; + string systemFingerprint = default; + InternalCreateCompletionResponseObject @object = default; + ChatTokenUsage usage = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("id"u8)) + { + id = property.Value.GetString(); + continue; + } + if (property.NameEquals("choices"u8)) + { + List array = new List(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(InternalCreateCompletionResponseChoice.DeserializeInternalCreateCompletionResponseChoice(item, options)); + } + choices = array; + continue; + } + if (property.NameEquals("created"u8)) + { + created = DateTimeOffset.FromUnixTimeSeconds(property.Value.GetInt64()); + continue; + } + if (property.NameEquals("model"u8)) + { + model = property.Value.GetString(); + continue; + } + if (property.NameEquals("system_fingerprint"u8)) + { + systemFingerprint = property.Value.GetString(); + continue; + } + if 
(property.NameEquals("object"u8)) + { + @object = new InternalCreateCompletionResponseObject(property.Value.GetString()); + continue; + } + if (property.NameEquals("usage"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + usage = ChatTokenUsage.DeserializeChatTokenUsage(property.Value, options); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalCreateCompletionResponse( + id, + choices, + created, + model, + systemFingerprint, + @object, + usage, + serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalCreateCompletionResponse)} does not support writing '{options.Format}' format."); + } + } + + InternalCreateCompletionResponse IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalCreateCompletionResponse(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalCreateCompletionResponse)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalCreateCompletionResponse FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalCreateCompletionResponse(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateCompletionResponse.cs b/.dotnet/src/Generated/Models/InternalCreateCompletionResponse.cs new file mode 100644 index 000000000..b7dd5c2a4 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateCompletionResponse.cs @@ -0,0 +1,52 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; +using System.Linq; +using OpenAI.Chat; + +namespace OpenAI.LegacyCompletions +{ + internal partial class InternalCreateCompletionResponse + { + internal IDictionary SerializedAdditionalRawData { get; set; } + internal InternalCreateCompletionResponse(string id, IEnumerable choices, DateTimeOffset created, string model) + { + Argument.AssertNotNull(id, nameof(id)); + Argument.AssertNotNull(choices, nameof(choices)); + Argument.AssertNotNull(model, nameof(model)); + + Id = id; + Choices = choices.ToList(); + Created = created; + Model = model; + } + + internal InternalCreateCompletionResponse(string id, IReadOnlyList choices, DateTimeOffset created, string model, string systemFingerprint, 
InternalCreateCompletionResponseObject @object, ChatTokenUsage usage, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + Id = id; + Choices = choices; + Created = created; + Model = model; + SystemFingerprint = systemFingerprint; + Object = @object; + Usage = usage; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal InternalCreateCompletionResponse() + { + } + + public string Id { get; } + public IReadOnlyList<InternalCreateCompletionResponseChoice> Choices { get; } + public DateTimeOffset Created { get; } + public string Model { get; } + public string SystemFingerprint { get; } + public InternalCreateCompletionResponseObject Object { get; } = InternalCreateCompletionResponseObject.TextCompletion; + + public ChatTokenUsage Usage { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateCompletionResponseChoice.Serialization.cs b/.dotnet/src/Generated/Models/InternalCreateCompletionResponseChoice.Serialization.cs new file mode 100644 index 000000000..b49151000 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateCompletionResponseChoice.Serialization.cs @@ -0,0 +1,178 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.LegacyCompletions +{ + internal partial class InternalCreateCompletionResponseChoice : IJsonModel<InternalCreateCompletionResponseChoice> + { + void IJsonModel<InternalCreateCompletionResponseChoice>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ?
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalCreateCompletionResponseChoice)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("finish_reason") != true) + { + writer.WritePropertyName("finish_reason"u8); + writer.WriteStringValue(FinishReason.ToString()); + } + if (SerializedAdditionalRawData?.ContainsKey("index") != true) + { + writer.WritePropertyName("index"u8); + writer.WriteNumberValue(Index); + } + if (SerializedAdditionalRawData?.ContainsKey("logprobs") != true) + { + if (Logprobs != null) + { + writer.WritePropertyName("logprobs"u8); + writer.WriteObjectValue(Logprobs, options); + } + else + { + writer.WriteNull("logprobs"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("text") != true) + { + writer.WritePropertyName("text"u8); + writer.WriteStringValue(Text); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalCreateCompletionResponseChoice IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalCreateCompletionResponseChoice)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalCreateCompletionResponseChoice(document.RootElement, options); + } + + internal static InternalCreateCompletionResponseChoice DeserializeInternalCreateCompletionResponseChoice(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + InternalCreateCompletionResponseChoiceFinishReason finishReason = default; + int index = default; + InternalCreateCompletionResponseChoiceLogprobs logprobs = default; + string text = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("finish_reason"u8)) + { + finishReason = new InternalCreateCompletionResponseChoiceFinishReason(property.Value.GetString()); + continue; + } + if (property.NameEquals("index"u8)) + { + index = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("logprobs"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + logprobs = null; + continue; + } + logprobs = InternalCreateCompletionResponseChoiceLogprobs.DeserializeInternalCreateCompletionResponseChoiceLogprobs(property.Value, options); + continue; + } + if (property.NameEquals("text"u8)) + { + text = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalCreateCompletionResponseChoice(finishReason, index, 
logprobs, text, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalCreateCompletionResponseChoice)} does not support writing '{options.Format}' format."); + } + } + + InternalCreateCompletionResponseChoice IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalCreateCompletionResponseChoice(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalCreateCompletionResponseChoice)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalCreateCompletionResponseChoice FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalCreateCompletionResponseChoice(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateCompletionResponseChoice.cs b/.dotnet/src/Generated/Models/InternalCreateCompletionResponseChoice.cs new file mode 100644 index 000000000..5aa7f0cca --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateCompletionResponseChoice.cs @@ -0,0 +1,41 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.LegacyCompletions +{ 
+ internal partial class InternalCreateCompletionResponseChoice + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + internal InternalCreateCompletionResponseChoice(InternalCreateCompletionResponseChoiceFinishReason finishReason, int index, InternalCreateCompletionResponseChoiceLogprobs logprobs, string text) + { + Argument.AssertNotNull(text, nameof(text)); + + FinishReason = finishReason; + Index = index; + Logprobs = logprobs; + Text = text; + } + + internal InternalCreateCompletionResponseChoice(InternalCreateCompletionResponseChoiceFinishReason finishReason, int index, InternalCreateCompletionResponseChoiceLogprobs logprobs, string text, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + FinishReason = finishReason; + Index = index; + Logprobs = logprobs; + Text = text; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal InternalCreateCompletionResponseChoice() + { + } + + public InternalCreateCompletionResponseChoiceFinishReason FinishReason { get; } + public int Index { get; } + public InternalCreateCompletionResponseChoiceLogprobs Logprobs { get; } + public string Text { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateCompletionResponseChoiceFinishReason.cs b/.dotnet/src/Generated/Models/InternalCreateCompletionResponseChoiceFinishReason.cs new file mode 100644 index 000000000..e10b1ef37 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateCompletionResponseChoiceFinishReason.cs @@ -0,0 +1,38 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.LegacyCompletions +{ + internal readonly partial struct InternalCreateCompletionResponseChoiceFinishReason : IEquatable<InternalCreateCompletionResponseChoiceFinishReason> + { + private readonly string _value; + + public InternalCreateCompletionResponseChoiceFinishReason(string value) + { + _value = value ??
throw new ArgumentNullException(nameof(value)); + } + + private const string StopValue = "stop"; + private const string LengthValue = "length"; + private const string ContentFilterValue = "content_filter"; + + public static InternalCreateCompletionResponseChoiceFinishReason Stop { get; } = new InternalCreateCompletionResponseChoiceFinishReason(StopValue); + public static InternalCreateCompletionResponseChoiceFinishReason Length { get; } = new InternalCreateCompletionResponseChoiceFinishReason(LengthValue); + public static InternalCreateCompletionResponseChoiceFinishReason ContentFilter { get; } = new InternalCreateCompletionResponseChoiceFinishReason(ContentFilterValue); + public static bool operator ==(InternalCreateCompletionResponseChoiceFinishReason left, InternalCreateCompletionResponseChoiceFinishReason right) => left.Equals(right); + public static bool operator !=(InternalCreateCompletionResponseChoiceFinishReason left, InternalCreateCompletionResponseChoiceFinishReason right) => !left.Equals(right); + public static implicit operator InternalCreateCompletionResponseChoiceFinishReason(string value) => new InternalCreateCompletionResponseChoiceFinishReason(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalCreateCompletionResponseChoiceFinishReason other && Equals(other); + public bool Equals(InternalCreateCompletionResponseChoiceFinishReason other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? 
StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateCompletionResponseChoiceLogprobs.Serialization.cs b/.dotnet/src/Generated/Models/InternalCreateCompletionResponseChoiceLogprobs.Serialization.cs new file mode 100644 index 000000000..9117d8366 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateCompletionResponseChoiceLogprobs.Serialization.cs @@ -0,0 +1,245 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.LegacyCompletions +{ + internal partial class InternalCreateCompletionResponseChoiceLogprobs : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalCreateCompletionResponseChoiceLogprobs)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("text_offset") != true && Optional.IsCollectionDefined(TextOffset)) + { + writer.WritePropertyName("text_offset"u8); + writer.WriteStartArray(); + foreach (var item in TextOffset) + { + writer.WriteNumberValue(item); + } + writer.WriteEndArray(); + } + if (SerializedAdditionalRawData?.ContainsKey("token_logprobs") != true && Optional.IsCollectionDefined(TokenLogprobs)) + { + writer.WritePropertyName("token_logprobs"u8); + writer.WriteStartArray(); + foreach (var item in TokenLogprobs) + { + writer.WriteNumberValue(item); + } + writer.WriteEndArray(); + } + if (SerializedAdditionalRawData?.ContainsKey("tokens") != true && Optional.IsCollectionDefined(Tokens)) + { + writer.WritePropertyName("tokens"u8); + writer.WriteStartArray(); + foreach 
(var item in Tokens) + { + writer.WriteStringValue(item); + } + writer.WriteEndArray(); + } + if (SerializedAdditionalRawData?.ContainsKey("top_logprobs") != true && Optional.IsCollectionDefined(TopLogprobs)) + { + writer.WritePropertyName("top_logprobs"u8); + writer.WriteStartArray(); + foreach (var item in TopLogprobs) + { + if (item == null) + { + writer.WriteNullValue(); + continue; + } + writer.WriteStartObject(); + foreach (var item0 in item) + { + writer.WritePropertyName(item0.Key); + writer.WriteNumberValue(item0.Value); + } + writer.WriteEndObject(); + } + writer.WriteEndArray(); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalCreateCompletionResponseChoiceLogprobs IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalCreateCompletionResponseChoiceLogprobs>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalCreateCompletionResponseChoiceLogprobs)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalCreateCompletionResponseChoiceLogprobs(document.RootElement, options); + } + + internal static InternalCreateCompletionResponseChoiceLogprobs DeserializeInternalCreateCompletionResponseChoiceLogprobs(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + IReadOnlyList<int> textOffset = default; + IReadOnlyList<float> tokenLogprobs = default; + IReadOnlyList<string> tokens = default; + IReadOnlyList<IDictionary<string, float>> topLogprobs = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("text_offset"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + List<int> array = new List<int>(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(item.GetInt32()); + } + textOffset = array; + continue; + } + if (property.NameEquals("token_logprobs"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + List<float> array = new List<float>(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(item.GetSingle()); + } + tokenLogprobs = array; + continue; + } + if (property.NameEquals("tokens"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + List<string> array = new List<string>(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(item.GetString()); + } + tokens = array; + continue; + } + if (property.NameEquals("top_logprobs"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null)
+ { + continue; + } + List<IDictionary<string, float>> array = new List<IDictionary<string, float>>(); + foreach (var item in property.Value.EnumerateArray()) + { + if (item.ValueKind == JsonValueKind.Null) + { + array.Add(null); + } + else + { + Dictionary<string, float> dictionary = new Dictionary<string, float>(); + foreach (var property0 in item.EnumerateObject()) + { + dictionary.Add(property0.Name, property0.Value.GetSingle()); + } + array.Add(dictionary); + } + } + topLogprobs = array; + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalCreateCompletionResponseChoiceLogprobs(textOffset ?? new ChangeTrackingList<int>(), tokenLogprobs ?? new ChangeTrackingList<float>(), tokens ?? new ChangeTrackingList<string>(), topLogprobs ?? new ChangeTrackingList<IDictionary<string, float>>(), serializedAdditionalRawData); + } + + BinaryData IPersistableModel<InternalCreateCompletionResponseChoiceLogprobs>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalCreateCompletionResponseChoiceLogprobs>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalCreateCompletionResponseChoiceLogprobs)} does not support writing '{options.Format}' format."); + } + } + + InternalCreateCompletionResponseChoiceLogprobs IPersistableModel<InternalCreateCompletionResponseChoiceLogprobs>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ?
((IPersistableModel<InternalCreateCompletionResponseChoiceLogprobs>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalCreateCompletionResponseChoiceLogprobs(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalCreateCompletionResponseChoiceLogprobs)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalCreateCompletionResponseChoiceLogprobs>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalCreateCompletionResponseChoiceLogprobs FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalCreateCompletionResponseChoiceLogprobs(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateCompletionResponseChoiceLogprobs.cs b/.dotnet/src/Generated/Models/InternalCreateCompletionResponseChoiceLogprobs.cs new file mode 100644 index 000000000..6f5de82a3 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateCompletionResponseChoiceLogprobs.cs @@ -0,0 +1,35 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.LegacyCompletions +{ + internal partial class InternalCreateCompletionResponseChoiceLogprobs + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + internal InternalCreateCompletionResponseChoiceLogprobs() + { + TextOffset = new ChangeTrackingList<int>(); + TokenLogprobs = new ChangeTrackingList<float>(); + Tokens = new ChangeTrackingList<string>(); + TopLogprobs = new ChangeTrackingList<IDictionary<string, float>>(); + } + + internal InternalCreateCompletionResponseChoiceLogprobs(IReadOnlyList<int> textOffset, IReadOnlyList<float> tokenLogprobs, IReadOnlyList<string> tokens, IReadOnlyList<IDictionary<string, float>> topLogprobs, IDictionary<string, BinaryData> serializedAdditionalRawData) + { +
TextOffset = textOffset; + TokenLogprobs = tokenLogprobs; + Tokens = tokens; + TopLogprobs = topLogprobs; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + public IReadOnlyList<int> TextOffset { get; } + public IReadOnlyList<float> TokenLogprobs { get; } + public IReadOnlyList<string> Tokens { get; } + public IReadOnlyList<IDictionary<string, float>> TopLogprobs { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateCompletionResponseObject.cs b/.dotnet/src/Generated/Models/InternalCreateCompletionResponseObject.cs new file mode 100644 index 000000000..8dfbd3475 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateCompletionResponseObject.cs @@ -0,0 +1,34 @@ +// + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.LegacyCompletions +{ + internal readonly partial struct InternalCreateCompletionResponseObject : IEquatable<InternalCreateCompletionResponseObject> + { + private readonly string _value; + + public InternalCreateCompletionResponseObject(string value) + { + _value = value ?? throw new ArgumentNullException(nameof(value)); + } + + private const string TextCompletionValue = "text_completion"; + + public static InternalCreateCompletionResponseObject TextCompletion { get; } = new InternalCreateCompletionResponseObject(TextCompletionValue); + public static bool operator ==(InternalCreateCompletionResponseObject left, InternalCreateCompletionResponseObject right) => left.Equals(right); + public static bool operator !=(InternalCreateCompletionResponseObject left, InternalCreateCompletionResponseObject right) => !left.Equals(right); + public static implicit operator InternalCreateCompletionResponseObject(string value) => new InternalCreateCompletionResponseObject(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalCreateCompletionResponseObject other && Equals(other); + public bool Equals(InternalCreateCompletionResponseObject other) => string.Equals(_value, other._value,
StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateEmbeddingRequestEncodingFormat.cs b/.dotnet/src/Generated/Models/InternalCreateEmbeddingRequestEncodingFormat.cs new file mode 100644 index 000000000..6f59eb8e5 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateEmbeddingRequestEncodingFormat.cs @@ -0,0 +1,36 @@ +// + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Embeddings +{ + internal readonly partial struct InternalCreateEmbeddingRequestEncodingFormat : IEquatable<InternalCreateEmbeddingRequestEncodingFormat> + { + private readonly string _value; + + public InternalCreateEmbeddingRequestEncodingFormat(string value) + { + _value = value ?? throw new ArgumentNullException(nameof(value)); + } + + private const string FloatValue = "float"; + private const string Base64Value = "base64"; + + public static InternalCreateEmbeddingRequestEncodingFormat Float { get; } = new InternalCreateEmbeddingRequestEncodingFormat(FloatValue); + public static InternalCreateEmbeddingRequestEncodingFormat Base64 { get; } = new InternalCreateEmbeddingRequestEncodingFormat(Base64Value); + public static bool operator ==(InternalCreateEmbeddingRequestEncodingFormat left, InternalCreateEmbeddingRequestEncodingFormat right) => left.Equals(right); + public static bool operator !=(InternalCreateEmbeddingRequestEncodingFormat left, InternalCreateEmbeddingRequestEncodingFormat right) => !left.Equals(right); + public static implicit operator InternalCreateEmbeddingRequestEncodingFormat(string value) => new InternalCreateEmbeddingRequestEncodingFormat(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalCreateEmbeddingRequestEncodingFormat other &&
Equals(other); + public bool Equals(InternalCreateEmbeddingRequestEncodingFormat other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateEmbeddingRequestModel.cs b/.dotnet/src/Generated/Models/InternalCreateEmbeddingRequestModel.cs new file mode 100644 index 000000000..98fa114a0 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateEmbeddingRequestModel.cs @@ -0,0 +1,38 @@ +// + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Embeddings +{ + internal readonly partial struct InternalCreateEmbeddingRequestModel : IEquatable<InternalCreateEmbeddingRequestModel> + { + private readonly string _value; + + public InternalCreateEmbeddingRequestModel(string value) + { + _value = value ??
throw new ArgumentNullException(nameof(value)); + } + + private const string TextEmbeddingAda002Value = "text-embedding-ada-002"; + private const string TextEmbedding3SmallValue = "text-embedding-3-small"; + private const string TextEmbedding3LargeValue = "text-embedding-3-large"; + + public static InternalCreateEmbeddingRequestModel TextEmbeddingAda002 { get; } = new InternalCreateEmbeddingRequestModel(TextEmbeddingAda002Value); + public static InternalCreateEmbeddingRequestModel TextEmbedding3Small { get; } = new InternalCreateEmbeddingRequestModel(TextEmbedding3SmallValue); + public static InternalCreateEmbeddingRequestModel TextEmbedding3Large { get; } = new InternalCreateEmbeddingRequestModel(TextEmbedding3LargeValue); + public static bool operator ==(InternalCreateEmbeddingRequestModel left, InternalCreateEmbeddingRequestModel right) => left.Equals(right); + public static bool operator !=(InternalCreateEmbeddingRequestModel left, InternalCreateEmbeddingRequestModel right) => !left.Equals(right); + public static implicit operator InternalCreateEmbeddingRequestModel(string value) => new InternalCreateEmbeddingRequestModel(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalCreateEmbeddingRequestModel other && Equals(other); + public bool Equals(InternalCreateEmbeddingRequestModel other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? 
StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateEmbeddingResponseObject.cs b/.dotnet/src/Generated/Models/InternalCreateEmbeddingResponseObject.cs new file mode 100644 index 000000000..dc8b38632 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateEmbeddingResponseObject.cs @@ -0,0 +1,34 @@ +// + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Embeddings +{ + internal readonly partial struct InternalCreateEmbeddingResponseObject : IEquatable<InternalCreateEmbeddingResponseObject> + { + private readonly string _value; + + public InternalCreateEmbeddingResponseObject(string value) + { + _value = value ?? throw new ArgumentNullException(nameof(value)); + } + + private const string ListValue = "list"; + + public static InternalCreateEmbeddingResponseObject List { get; } = new InternalCreateEmbeddingResponseObject(ListValue); + public static bool operator ==(InternalCreateEmbeddingResponseObject left, InternalCreateEmbeddingResponseObject right) => left.Equals(right); + public static bool operator !=(InternalCreateEmbeddingResponseObject left, InternalCreateEmbeddingResponseObject right) => !left.Equals(right); + public static implicit operator InternalCreateEmbeddingResponseObject(string value) => new InternalCreateEmbeddingResponseObject(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalCreateEmbeddingResponseObject other && Equals(other); + public bool Equals(InternalCreateEmbeddingResponseObject other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ?
StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateFineTuningJobRequest.Serialization.cs b/.dotnet/src/Generated/Models/InternalCreateFineTuningJobRequest.Serialization.cs new file mode 100644 index 000000000..8277ed4ca --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateFineTuningJobRequest.Serialization.cs @@ -0,0 +1,268 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.FineTuning +{ + internal partial class InternalCreateFineTuningJobRequest : IJsonModel<InternalCreateFineTuningJobRequest> + { + void IJsonModel<InternalCreateFineTuningJobRequest>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalCreateFineTuningJobRequest>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalCreateFineTuningJobRequest)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("model") != true) + { + writer.WritePropertyName("model"u8); + writer.WriteStringValue(Model.ToString()); + } + if (SerializedAdditionalRawData?.ContainsKey("training_file") != true) + { + writer.WritePropertyName("training_file"u8); + writer.WriteStringValue(TrainingFile); + } + if (SerializedAdditionalRawData?.ContainsKey("hyperparameters") != true && Optional.IsDefined(Hyperparameters)) + { + writer.WritePropertyName("hyperparameters"u8); + writer.WriteObjectValue(Hyperparameters, options); + } + if (SerializedAdditionalRawData?.ContainsKey("suffix") != true && Optional.IsDefined(Suffix)) + { + if (Suffix != null) + { + writer.WritePropertyName("suffix"u8); + writer.WriteStringValue(Suffix); + } + else + { + writer.WriteNull("suffix"); + } + } + if
(SerializedAdditionalRawData?.ContainsKey("validation_file") != true && Optional.IsDefined(ValidationFile)) + { + if (ValidationFile != null) + { + writer.WritePropertyName("validation_file"u8); + writer.WriteStringValue(ValidationFile); + } + else + { + writer.WriteNull("validation_file"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("integrations") != true && Optional.IsCollectionDefined(Integrations)) + { + if (Integrations != null) + { + writer.WritePropertyName("integrations"u8); + writer.WriteStartArray(); + foreach (var item in Integrations) + { + writer.WriteObjectValue(item, options); + } + writer.WriteEndArray(); + } + else + { + writer.WriteNull("integrations"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("seed") != true && Optional.IsDefined(Seed)) + { + if (Seed != null) + { + writer.WritePropertyName("seed"u8); + writer.WriteNumberValue(Seed.Value); + } + else + { + writer.WriteNull("seed"); + } + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalCreateFineTuningJobRequest IJsonModel<InternalCreateFineTuningJobRequest>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ?
((IPersistableModel<InternalCreateFineTuningJobRequest>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalCreateFineTuningJobRequest)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalCreateFineTuningJobRequest(document.RootElement, options); + } + + internal static InternalCreateFineTuningJobRequest DeserializeInternalCreateFineTuningJobRequest(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + InternalCreateFineTuningJobRequestModel model = default; + string trainingFile = default; + InternalCreateFineTuningJobRequestHyperparameters hyperparameters = default; + string suffix = default; + string validationFile = default; + IList<InternalCreateFineTuningJobRequestIntegration> integrations = default; + int? seed = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("model"u8)) + { + model = new InternalCreateFineTuningJobRequestModel(property.Value.GetString()); + continue; + } + if (property.NameEquals("training_file"u8)) + { + trainingFile = property.Value.GetString(); + continue; + } + if (property.NameEquals("hyperparameters"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + hyperparameters = InternalCreateFineTuningJobRequestHyperparameters.DeserializeInternalCreateFineTuningJobRequestHyperparameters(property.Value, options); + continue; + } + if (property.NameEquals("suffix"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + suffix = null; + continue; + } + suffix = property.Value.GetString(); + continue; + } + if (property.NameEquals("validation_file"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { +
validationFile = null; + continue; + } + validationFile = property.Value.GetString(); + continue; + } + if (property.NameEquals("integrations"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + List<InternalCreateFineTuningJobRequestIntegration> array = new List<InternalCreateFineTuningJobRequestIntegration>(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(InternalCreateFineTuningJobRequestIntegration.DeserializeInternalCreateFineTuningJobRequestIntegration(item, options)); + } + integrations = array; + continue; + } + if (property.NameEquals("seed"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + seed = null; + continue; + } + seed = property.Value.GetInt32(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalCreateFineTuningJobRequest( + model, + trainingFile, + hyperparameters, + suffix, + validationFile, + integrations ?? new ChangeTrackingList<InternalCreateFineTuningJobRequestIntegration>(), + seed, + serializedAdditionalRawData); + } + + BinaryData IPersistableModel<InternalCreateFineTuningJobRequest>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalCreateFineTuningJobRequest>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalCreateFineTuningJobRequest)} does not support writing '{options.Format}' format."); + } + } + + InternalCreateFineTuningJobRequest IPersistableModel<InternalCreateFineTuningJobRequest>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ?
((IPersistableModel<InternalCreateFineTuningJobRequest>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalCreateFineTuningJobRequest(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalCreateFineTuningJobRequest)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalCreateFineTuningJobRequest>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalCreateFineTuningJobRequest FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalCreateFineTuningJobRequest(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateFineTuningJobRequest.cs b/.dotnet/src/Generated/Models/InternalCreateFineTuningJobRequest.cs new file mode 100644 index 000000000..6c183c752 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateFineTuningJobRequest.cs @@ -0,0 +1,46 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.FineTuning +{ + internal partial class InternalCreateFineTuningJobRequest + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + public InternalCreateFineTuningJobRequest(InternalCreateFineTuningJobRequestModel model, string trainingFile) + { + Argument.AssertNotNull(trainingFile, nameof(trainingFile)); + + Model = model; + TrainingFile = trainingFile; + Integrations = new ChangeTrackingList<InternalCreateFineTuningJobRequestIntegration>(); + } + + internal InternalCreateFineTuningJobRequest(InternalCreateFineTuningJobRequestModel model, string trainingFile, InternalCreateFineTuningJobRequestHyperparameters hyperparameters, string suffix, string validationFile, IList<InternalCreateFineTuningJobRequestIntegration> integrations, int?
seed, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + Model = model; + TrainingFile = trainingFile; + Hyperparameters = hyperparameters; + Suffix = suffix; + ValidationFile = validationFile; + Integrations = integrations; + Seed = seed; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal InternalCreateFineTuningJobRequest() + { + } + + public InternalCreateFineTuningJobRequestModel Model { get; } + public string TrainingFile { get; } + public InternalCreateFineTuningJobRequestHyperparameters Hyperparameters { get; set; } + public string Suffix { get; set; } + public string ValidationFile { get; set; } + public IList<InternalCreateFineTuningJobRequestIntegration> Integrations { get; set; } + public int? Seed { get; set; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateFineTuningJobRequestHyperparameters.Serialization.cs b/.dotnet/src/Generated/Models/InternalCreateFineTuningJobRequestHyperparameters.Serialization.cs new file mode 100644 index 000000000..a86de3cef --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateFineTuningJobRequestHyperparameters.Serialization.cs @@ -0,0 +1,188 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.FineTuning +{ + internal partial class InternalCreateFineTuningJobRequestHyperparameters : IJsonModel<InternalCreateFineTuningJobRequestHyperparameters> + { + void IJsonModel<InternalCreateFineTuningJobRequestHyperparameters>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ?
((IPersistableModel<InternalCreateFineTuningJobRequestHyperparameters>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalCreateFineTuningJobRequestHyperparameters)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("batch_size") != true && Optional.IsDefined(BatchSize)) + { + writer.WritePropertyName("batch_size"u8); +#if NET6_0_OR_GREATER + writer.WriteRawValue(BatchSize); +#else + using (JsonDocument document = JsonDocument.Parse(BatchSize)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + if (SerializedAdditionalRawData?.ContainsKey("learning_rate_multiplier") != true && Optional.IsDefined(LearningRateMultiplier)) + { + writer.WritePropertyName("learning_rate_multiplier"u8); +#if NET6_0_OR_GREATER + writer.WriteRawValue(LearningRateMultiplier); +#else + using (JsonDocument document = JsonDocument.Parse(LearningRateMultiplier)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + if (SerializedAdditionalRawData?.ContainsKey("n_epochs") != true && Optional.IsDefined(NEpochs)) + { + writer.WritePropertyName("n_epochs"u8); +#if NET6_0_OR_GREATER + writer.WriteRawValue(NEpochs); +#else + using (JsonDocument document = JsonDocument.Parse(NEpochs)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalCreateFineTuningJobRequestHyperparameters IJsonModel<InternalCreateFineTuningJobRequestHyperparameters>.Create(ref Utf8JsonReader reader,
ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalCreateFineTuningJobRequestHyperparameters>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalCreateFineTuningJobRequestHyperparameters)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalCreateFineTuningJobRequestHyperparameters(document.RootElement, options); + } + + internal static InternalCreateFineTuningJobRequestHyperparameters DeserializeInternalCreateFineTuningJobRequestHyperparameters(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + BinaryData batchSize = default; + BinaryData learningRateMultiplier = default; + BinaryData nEpochs = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("batch_size"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + batchSize = BinaryData.FromString(property.Value.GetRawText()); + continue; + } + if (property.NameEquals("learning_rate_multiplier"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + learningRateMultiplier = BinaryData.FromString(property.Value.GetRawText()); + continue; + } + if (property.NameEquals("n_epochs"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + nEpochs = BinaryData.FromString(property.Value.GetRawText()); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new
InternalCreateFineTuningJobRequestHyperparameters(batchSize, learningRateMultiplier, nEpochs, serializedAdditionalRawData); + } + + BinaryData IPersistableModel<InternalCreateFineTuningJobRequestHyperparameters>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalCreateFineTuningJobRequestHyperparameters>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalCreateFineTuningJobRequestHyperparameters)} does not support writing '{options.Format}' format."); + } + } + + InternalCreateFineTuningJobRequestHyperparameters IPersistableModel<InternalCreateFineTuningJobRequestHyperparameters>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalCreateFineTuningJobRequestHyperparameters>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalCreateFineTuningJobRequestHyperparameters(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalCreateFineTuningJobRequestHyperparameters)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalCreateFineTuningJobRequestHyperparameters>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalCreateFineTuningJobRequestHyperparameters FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalCreateFineTuningJobRequestHyperparameters(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateFineTuningJobRequestHyperparameters.cs b/.dotnet/src/Generated/Models/InternalCreateFineTuningJobRequestHyperparameters.cs new file mode 100644 index 000000000..83694fba6 --- /dev/null +++
b/.dotnet/src/Generated/Models/InternalCreateFineTuningJobRequestHyperparameters.cs @@ -0,0 +1,29 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.FineTuning +{ + internal partial class InternalCreateFineTuningJobRequestHyperparameters + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + public InternalCreateFineTuningJobRequestHyperparameters() + { + } + + internal InternalCreateFineTuningJobRequestHyperparameters(BinaryData batchSize, BinaryData learningRateMultiplier, BinaryData nEpochs, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + BatchSize = batchSize; + LearningRateMultiplier = learningRateMultiplier; + NEpochs = nEpochs; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + public BinaryData BatchSize { get; set; } + public BinaryData LearningRateMultiplier { get; set; } + public BinaryData NEpochs { get; set; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateFineTuningJobRequestHyperparametersBatchSizeChoiceEnum.cs b/.dotnet/src/Generated/Models/InternalCreateFineTuningJobRequestHyperparametersBatchSizeChoiceEnum.cs new file mode 100644 index 000000000..f104417d9 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateFineTuningJobRequestHyperparametersBatchSizeChoiceEnum.cs @@ -0,0 +1,34 @@ +// + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.FineTuning +{ + internal readonly partial struct InternalCreateFineTuningJobRequestHyperparametersBatchSizeChoiceEnum : IEquatable<InternalCreateFineTuningJobRequestHyperparametersBatchSizeChoiceEnum> + { + private readonly string _value; + + public InternalCreateFineTuningJobRequestHyperparametersBatchSizeChoiceEnum(string value) + { + _value = value ??
throw new ArgumentNullException(nameof(value)); + } + + private const string AutoValue = "auto"; + + public static InternalCreateFineTuningJobRequestHyperparametersBatchSizeChoiceEnum Auto { get; } = new InternalCreateFineTuningJobRequestHyperparametersBatchSizeChoiceEnum(AutoValue); + public static bool operator ==(InternalCreateFineTuningJobRequestHyperparametersBatchSizeChoiceEnum left, InternalCreateFineTuningJobRequestHyperparametersBatchSizeChoiceEnum right) => left.Equals(right); + public static bool operator !=(InternalCreateFineTuningJobRequestHyperparametersBatchSizeChoiceEnum left, InternalCreateFineTuningJobRequestHyperparametersBatchSizeChoiceEnum right) => !left.Equals(right); + public static implicit operator InternalCreateFineTuningJobRequestHyperparametersBatchSizeChoiceEnum(string value) => new InternalCreateFineTuningJobRequestHyperparametersBatchSizeChoiceEnum(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalCreateFineTuningJobRequestHyperparametersBatchSizeChoiceEnum other && Equals(other); + public bool Equals(InternalCreateFineTuningJobRequestHyperparametersBatchSizeChoiceEnum other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? 
StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateFineTuningJobRequestHyperparametersLearningRateMultiplierChoiceEnum.cs b/.dotnet/src/Generated/Models/InternalCreateFineTuningJobRequestHyperparametersLearningRateMultiplierChoiceEnum.cs new file mode 100644 index 000000000..9ff39cc7d --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateFineTuningJobRequestHyperparametersLearningRateMultiplierChoiceEnum.cs @@ -0,0 +1,34 @@ +// + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.FineTuning +{ + internal readonly partial struct InternalCreateFineTuningJobRequestHyperparametersLearningRateMultiplierChoiceEnum : IEquatable<InternalCreateFineTuningJobRequestHyperparametersLearningRateMultiplierChoiceEnum> + { + private readonly string _value; + + public InternalCreateFineTuningJobRequestHyperparametersLearningRateMultiplierChoiceEnum(string value) + { + _value = value ?? throw new ArgumentNullException(nameof(value)); + } + + private const string AutoValue = "auto"; + + public static InternalCreateFineTuningJobRequestHyperparametersLearningRateMultiplierChoiceEnum Auto { get; } = new InternalCreateFineTuningJobRequestHyperparametersLearningRateMultiplierChoiceEnum(AutoValue); + public static bool operator ==(InternalCreateFineTuningJobRequestHyperparametersLearningRateMultiplierChoiceEnum left, InternalCreateFineTuningJobRequestHyperparametersLearningRateMultiplierChoiceEnum right) => left.Equals(right); + public static bool operator !=(InternalCreateFineTuningJobRequestHyperparametersLearningRateMultiplierChoiceEnum left, InternalCreateFineTuningJobRequestHyperparametersLearningRateMultiplierChoiceEnum right) => !left.Equals(right); + public static implicit operator InternalCreateFineTuningJobRequestHyperparametersLearningRateMultiplierChoiceEnum(string value) => new InternalCreateFineTuningJobRequestHyperparametersLearningRateMultiplierChoiceEnum(value); + + 
[EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalCreateFineTuningJobRequestHyperparametersLearningRateMultiplierChoiceEnum other && Equals(other); + public bool Equals(InternalCreateFineTuningJobRequestHyperparametersLearningRateMultiplierChoiceEnum other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateFineTuningJobRequestHyperparametersNEpochsChoiceEnum.cs b/.dotnet/src/Generated/Models/InternalCreateFineTuningJobRequestHyperparametersNEpochsChoiceEnum.cs new file mode 100644 index 000000000..8fad11a4b --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateFineTuningJobRequestHyperparametersNEpochsChoiceEnum.cs @@ -0,0 +1,34 @@ +// + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.FineTuning +{ + internal readonly partial struct InternalCreateFineTuningJobRequestHyperparametersNEpochsChoiceEnum : IEquatable<InternalCreateFineTuningJobRequestHyperparametersNEpochsChoiceEnum> + { + private readonly string _value; + + public InternalCreateFineTuningJobRequestHyperparametersNEpochsChoiceEnum(string value) + { + _value = value ?? 
throw new ArgumentNullException(nameof(value)); + } + + private const string AutoValue = "auto"; + + public static InternalCreateFineTuningJobRequestHyperparametersNEpochsChoiceEnum Auto { get; } = new InternalCreateFineTuningJobRequestHyperparametersNEpochsChoiceEnum(AutoValue); + public static bool operator ==(InternalCreateFineTuningJobRequestHyperparametersNEpochsChoiceEnum left, InternalCreateFineTuningJobRequestHyperparametersNEpochsChoiceEnum right) => left.Equals(right); + public static bool operator !=(InternalCreateFineTuningJobRequestHyperparametersNEpochsChoiceEnum left, InternalCreateFineTuningJobRequestHyperparametersNEpochsChoiceEnum right) => !left.Equals(right); + public static implicit operator InternalCreateFineTuningJobRequestHyperparametersNEpochsChoiceEnum(string value) => new InternalCreateFineTuningJobRequestHyperparametersNEpochsChoiceEnum(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalCreateFineTuningJobRequestHyperparametersNEpochsChoiceEnum other && Equals(other); + public bool Equals(InternalCreateFineTuningJobRequestHyperparametersNEpochsChoiceEnum other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? 
StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateFineTuningJobRequestIntegration.Serialization.cs b/.dotnet/src/Generated/Models/InternalCreateFineTuningJobRequestIntegration.Serialization.cs new file mode 100644 index 000000000..d5485ec70 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateFineTuningJobRequestIntegration.Serialization.cs @@ -0,0 +1,144 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.FineTuning +{ + internal partial class InternalCreateFineTuningJobRequestIntegration : IJsonModel<InternalCreateFineTuningJobRequestIntegration> + { + void IJsonModel<InternalCreateFineTuningJobRequestIntegration>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalCreateFineTuningJobRequestIntegration>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalCreateFineTuningJobRequestIntegration)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type.ToString()); + } + if (SerializedAdditionalRawData?.ContainsKey("wandb") != true) + { + writer.WritePropertyName("wandb"u8); + writer.WriteObjectValue(Wandb, options); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + 
InternalCreateFineTuningJobRequestIntegration IJsonModel<InternalCreateFineTuningJobRequestIntegration>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalCreateFineTuningJobRequestIntegration>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalCreateFineTuningJobRequestIntegration)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalCreateFineTuningJobRequestIntegration(document.RootElement, options); + } + + internal static InternalCreateFineTuningJobRequestIntegration DeserializeInternalCreateFineTuningJobRequestIntegration(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + InternalCreateFineTuningJobRequestIntegrationType type = default; + InternalCreateFineTuningJobRequestIntegrationWandb wandb = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("type"u8)) + { + type = new InternalCreateFineTuningJobRequestIntegrationType(property.Value.GetString()); + continue; + } + if (property.NameEquals("wandb"u8)) + { + wandb = InternalCreateFineTuningJobRequestIntegrationWandb.DeserializeInternalCreateFineTuningJobRequestIntegrationWandb(property.Value, options); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalCreateFineTuningJobRequestIntegration(type, wandb, serializedAdditionalRawData); + } + + BinaryData IPersistableModel<InternalCreateFineTuningJobRequestIntegration>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalCreateFineTuningJobRequestIntegration>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalCreateFineTuningJobRequestIntegration)} does not support writing '{options.Format}' format."); + } + } + + InternalCreateFineTuningJobRequestIntegration IPersistableModel<InternalCreateFineTuningJobRequestIntegration>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalCreateFineTuningJobRequestIntegration>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalCreateFineTuningJobRequestIntegration(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalCreateFineTuningJobRequestIntegration)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalCreateFineTuningJobRequestIntegration>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalCreateFineTuningJobRequestIntegration FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalCreateFineTuningJobRequestIntegration(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateFineTuningJobRequestIntegration.cs b/.dotnet/src/Generated/Models/InternalCreateFineTuningJobRequestIntegration.cs new file mode 100644 index 000000000..1e614b3d4 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateFineTuningJobRequestIntegration.cs @@ -0,0 +1,35 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.FineTuning +{ + internal partial class InternalCreateFineTuningJobRequestIntegration + { + internal IDictionary<string, BinaryData> 
SerializedAdditionalRawData { get; set; } + public InternalCreateFineTuningJobRequestIntegration(InternalCreateFineTuningJobRequestIntegrationWandb wandb) + { + Argument.AssertNotNull(wandb, nameof(wandb)); + + Wandb = wandb; + } + + internal InternalCreateFineTuningJobRequestIntegration(InternalCreateFineTuningJobRequestIntegrationType type, InternalCreateFineTuningJobRequestIntegrationWandb wandb, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + Type = type; + Wandb = wandb; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal InternalCreateFineTuningJobRequestIntegration() + { + } + + public InternalCreateFineTuningJobRequestIntegrationType Type { get; } = InternalCreateFineTuningJobRequestIntegrationType.Wandb; + + public InternalCreateFineTuningJobRequestIntegrationWandb Wandb { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateFineTuningJobRequestIntegrationType.cs b/.dotnet/src/Generated/Models/InternalCreateFineTuningJobRequestIntegrationType.cs new file mode 100644 index 000000000..1d2308974 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateFineTuningJobRequestIntegrationType.cs @@ -0,0 +1,34 @@ +// + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.FineTuning +{ + internal readonly partial struct InternalCreateFineTuningJobRequestIntegrationType : IEquatable<InternalCreateFineTuningJobRequestIntegrationType> + { + private readonly string _value; + + public InternalCreateFineTuningJobRequestIntegrationType(string value) + { + _value = value ?? 
throw new ArgumentNullException(nameof(value)); + } + + private const string WandbValue = "wandb"; + + public static InternalCreateFineTuningJobRequestIntegrationType Wandb { get; } = new InternalCreateFineTuningJobRequestIntegrationType(WandbValue); + public static bool operator ==(InternalCreateFineTuningJobRequestIntegrationType left, InternalCreateFineTuningJobRequestIntegrationType right) => left.Equals(right); + public static bool operator !=(InternalCreateFineTuningJobRequestIntegrationType left, InternalCreateFineTuningJobRequestIntegrationType right) => !left.Equals(right); + public static implicit operator InternalCreateFineTuningJobRequestIntegrationType(string value) => new InternalCreateFineTuningJobRequestIntegrationType(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalCreateFineTuningJobRequestIntegrationType other && Equals(other); + public bool Equals(InternalCreateFineTuningJobRequestIntegrationType other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? 
StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateFineTuningJobRequestIntegrationWandb.Serialization.cs b/.dotnet/src/Generated/Models/InternalCreateFineTuningJobRequestIntegrationWandb.Serialization.cs new file mode 100644 index 000000000..603442e6d --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateFineTuningJobRequestIntegrationWandb.Serialization.cs @@ -0,0 +1,204 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.FineTuning +{ + internal partial class InternalCreateFineTuningJobRequestIntegrationWandb : IJsonModel<InternalCreateFineTuningJobRequestIntegrationWandb> + { + void IJsonModel<InternalCreateFineTuningJobRequestIntegrationWandb>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalCreateFineTuningJobRequestIntegrationWandb>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalCreateFineTuningJobRequestIntegrationWandb)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("project") != true) + { + writer.WritePropertyName("project"u8); + writer.WriteStringValue(Project); + } + if (SerializedAdditionalRawData?.ContainsKey("name") != true && Optional.IsDefined(Name)) + { + if (Name != null) + { + writer.WritePropertyName("name"u8); + writer.WriteStringValue(Name); + } + else + { + writer.WriteNull("name"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("entity") != true && Optional.IsDefined(Entity)) + { + if (Entity != null) + { + writer.WritePropertyName("entity"u8); + writer.WriteStringValue(Entity); + } + else + { + writer.WriteNull("entity"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("tags") != true && Optional.IsCollectionDefined(Tags)) + { + 
writer.WritePropertyName("tags"u8); + writer.WriteStartArray(); + foreach (var item in Tags) + { + writer.WriteStringValue(item); + } + writer.WriteEndArray(); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalCreateFineTuningJobRequestIntegrationWandb IJsonModel<InternalCreateFineTuningJobRequestIntegrationWandb>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalCreateFineTuningJobRequestIntegrationWandb>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalCreateFineTuningJobRequestIntegrationWandb)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalCreateFineTuningJobRequestIntegrationWandb(document.RootElement, options); + } + + internal static InternalCreateFineTuningJobRequestIntegrationWandb DeserializeInternalCreateFineTuningJobRequestIntegrationWandb(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string project = default; + string name = default; + string entity = default; + IList<string> tags = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("project"u8)) + { + project = property.Value.GetString(); + continue; + } + if (property.NameEquals("name"u8)) + { + if 
(property.Value.ValueKind == JsonValueKind.Null) + { + name = null; + continue; + } + name = property.Value.GetString(); + continue; + } + if (property.NameEquals("entity"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + entity = null; + continue; + } + entity = property.Value.GetString(); + continue; + } + if (property.NameEquals("tags"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + List<string> array = new List<string>(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(item.GetString()); + } + tags = array; + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalCreateFineTuningJobRequestIntegrationWandb(project, name, entity, tags ?? new ChangeTrackingList<string>(), serializedAdditionalRawData); + } + + BinaryData IPersistableModel<InternalCreateFineTuningJobRequestIntegrationWandb>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalCreateFineTuningJobRequestIntegrationWandb>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalCreateFineTuningJobRequestIntegrationWandb)} does not support writing '{options.Format}' format."); + } + } + + InternalCreateFineTuningJobRequestIntegrationWandb IPersistableModel<InternalCreateFineTuningJobRequestIntegrationWandb>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalCreateFineTuningJobRequestIntegrationWandb>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalCreateFineTuningJobRequestIntegrationWandb(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalCreateFineTuningJobRequestIntegrationWandb)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalCreateFineTuningJobRequestIntegrationWandb>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalCreateFineTuningJobRequestIntegrationWandb FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalCreateFineTuningJobRequestIntegrationWandb(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateFineTuningJobRequestIntegrationWandb.cs b/.dotnet/src/Generated/Models/InternalCreateFineTuningJobRequestIntegrationWandb.cs new file mode 100644 index 000000000..859e7295c --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateFineTuningJobRequestIntegrationWandb.cs @@ -0,0 +1,39 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.FineTuning +{ + internal partial class InternalCreateFineTuningJobRequestIntegrationWandb + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + public InternalCreateFineTuningJobRequestIntegrationWandb(string project) + { + Argument.AssertNotNull(project, nameof(project)); + + Project = project; + Tags = new ChangeTrackingList<string>(); + } + + internal InternalCreateFineTuningJobRequestIntegrationWandb(string project, string name, string entity, IList<string> tags, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + Project = project; + Name = name; + Entity = entity; + Tags = 
tags; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal InternalCreateFineTuningJobRequestIntegrationWandb() + { + } + + public string Project { get; } + public string Name { get; set; } + public string Entity { get; set; } + public IList<string> Tags { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateFineTuningJobRequestModel.cs b/.dotnet/src/Generated/Models/InternalCreateFineTuningJobRequestModel.cs new file mode 100644 index 000000000..e85d5c334 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateFineTuningJobRequestModel.cs @@ -0,0 +1,40 @@ +// + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.FineTuning +{ + internal readonly partial struct InternalCreateFineTuningJobRequestModel : IEquatable<InternalCreateFineTuningJobRequestModel> + { + private readonly string _value; + + public InternalCreateFineTuningJobRequestModel(string value) + { + _value = value ?? throw new ArgumentNullException(nameof(value)); + } + + private const string Babbage002Value = "babbage-002"; + private const string Davinci002Value = "davinci-002"; + private const string Gpt35TurboValue = "gpt-3.5-turbo"; + private const string Gpt4oMiniValue = "gpt-4o-mini"; + + public static InternalCreateFineTuningJobRequestModel Babbage002 { get; } = new InternalCreateFineTuningJobRequestModel(Babbage002Value); + public static InternalCreateFineTuningJobRequestModel Davinci002 { get; } = new InternalCreateFineTuningJobRequestModel(Davinci002Value); + public static InternalCreateFineTuningJobRequestModel Gpt35Turbo { get; } = new InternalCreateFineTuningJobRequestModel(Gpt35TurboValue); + public static InternalCreateFineTuningJobRequestModel Gpt4oMini { get; } = new InternalCreateFineTuningJobRequestModel(Gpt4oMiniValue); + public static bool operator ==(InternalCreateFineTuningJobRequestModel left, InternalCreateFineTuningJobRequestModel right) => left.Equals(right); + public static bool operator !=(InternalCreateFineTuningJobRequestModel left, 
InternalCreateFineTuningJobRequestModel right) => !left.Equals(right); + public static implicit operator InternalCreateFineTuningJobRequestModel(string value) => new InternalCreateFineTuningJobRequestModel(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalCreateFineTuningJobRequestModel other && Equals(other); + public bool Equals(InternalCreateFineTuningJobRequestModel other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateImageEditRequestModel.cs b/.dotnet/src/Generated/Models/InternalCreateImageEditRequestModel.cs new file mode 100644 index 000000000..c1ae8ba57 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateImageEditRequestModel.cs @@ -0,0 +1,34 @@ +// + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Images +{ + internal readonly partial struct InternalCreateImageEditRequestModel : IEquatable<InternalCreateImageEditRequestModel> + { + private readonly string _value; + + public InternalCreateImageEditRequestModel(string value) + { + _value = value ?? 
throw new ArgumentNullException(nameof(value)); + } + + private const string DallE2Value = "dall-e-2"; + + public static InternalCreateImageEditRequestModel DallE2 { get; } = new InternalCreateImageEditRequestModel(DallE2Value); + public static bool operator ==(InternalCreateImageEditRequestModel left, InternalCreateImageEditRequestModel right) => left.Equals(right); + public static bool operator !=(InternalCreateImageEditRequestModel left, InternalCreateImageEditRequestModel right) => !left.Equals(right); + public static implicit operator InternalCreateImageEditRequestModel(string value) => new InternalCreateImageEditRequestModel(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalCreateImageEditRequestModel other && Equals(other); + public bool Equals(InternalCreateImageEditRequestModel other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateImageEditRequestResponseFormat.cs b/.dotnet/src/Generated/Models/InternalCreateImageEditRequestResponseFormat.cs new file mode 100644 index 000000000..7da85fee8 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateImageEditRequestResponseFormat.cs @@ -0,0 +1,36 @@ +// + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Images +{ + internal readonly partial struct InternalCreateImageEditRequestResponseFormat : IEquatable<InternalCreateImageEditRequestResponseFormat> + { + private readonly string _value; + + public InternalCreateImageEditRequestResponseFormat(string value) + { + _value = value ?? 
throw new ArgumentNullException(nameof(value)); + } + + private const string UrlValue = "url"; + private const string B64JsonValue = "b64_json"; + + public static InternalCreateImageEditRequestResponseFormat Url { get; } = new InternalCreateImageEditRequestResponseFormat(UrlValue); + public static InternalCreateImageEditRequestResponseFormat B64Json { get; } = new InternalCreateImageEditRequestResponseFormat(B64JsonValue); + public static bool operator ==(InternalCreateImageEditRequestResponseFormat left, InternalCreateImageEditRequestResponseFormat right) => left.Equals(right); + public static bool operator !=(InternalCreateImageEditRequestResponseFormat left, InternalCreateImageEditRequestResponseFormat right) => !left.Equals(right); + public static implicit operator InternalCreateImageEditRequestResponseFormat(string value) => new InternalCreateImageEditRequestResponseFormat(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalCreateImageEditRequestResponseFormat other && Equals(other); + public bool Equals(InternalCreateImageEditRequestResponseFormat other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? 
StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateImageEditRequestSize.cs b/.dotnet/src/Generated/Models/InternalCreateImageEditRequestSize.cs new file mode 100644 index 000000000..122d3fc3b --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateImageEditRequestSize.cs @@ -0,0 +1,38 @@ +// + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Images +{ + internal readonly partial struct InternalCreateImageEditRequestSize : IEquatable<InternalCreateImageEditRequestSize> + { + private readonly string _value; + + public InternalCreateImageEditRequestSize(string value) + { + _value = value ?? throw new ArgumentNullException(nameof(value)); + } + + private const string _256x256Value = "256x256"; + private const string _512x512Value = "512x512"; + private const string _1024x1024Value = "1024x1024"; + + public static InternalCreateImageEditRequestSize _256x256 { get; } = new InternalCreateImageEditRequestSize(_256x256Value); + public static InternalCreateImageEditRequestSize _512x512 { get; } = new InternalCreateImageEditRequestSize(_512x512Value); + public static InternalCreateImageEditRequestSize _1024x1024 { get; } = new InternalCreateImageEditRequestSize(_1024x1024Value); + public static bool operator ==(InternalCreateImageEditRequestSize left, InternalCreateImageEditRequestSize right) => left.Equals(right); + public static bool operator !=(InternalCreateImageEditRequestSize left, InternalCreateImageEditRequestSize right) => !left.Equals(right); + public static implicit operator InternalCreateImageEditRequestSize(string value) => new InternalCreateImageEditRequestSize(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalCreateImageEditRequestSize other && Equals(other); + public bool Equals(InternalCreateImageEditRequestSize other) => string.Equals(_value, other._value, 
StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateImageRequestModel.cs b/.dotnet/src/Generated/Models/InternalCreateImageRequestModel.cs new file mode 100644 index 000000000..a43ec9fc4 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateImageRequestModel.cs @@ -0,0 +1,36 @@ +// + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Images +{ + internal readonly partial struct InternalCreateImageRequestModel : IEquatable<InternalCreateImageRequestModel> + { + private readonly string _value; + + public InternalCreateImageRequestModel(string value) + { + _value = value ?? throw new ArgumentNullException(nameof(value)); + } + + private const string DallE2Value = "dall-e-2"; + private const string DallE3Value = "dall-e-3"; + + public static InternalCreateImageRequestModel DallE2 { get; } = new InternalCreateImageRequestModel(DallE2Value); + public static InternalCreateImageRequestModel DallE3 { get; } = new InternalCreateImageRequestModel(DallE3Value); + public static bool operator ==(InternalCreateImageRequestModel left, InternalCreateImageRequestModel right) => left.Equals(right); + public static bool operator !=(InternalCreateImageRequestModel left, InternalCreateImageRequestModel right) => !left.Equals(right); + public static implicit operator InternalCreateImageRequestModel(string value) => new InternalCreateImageRequestModel(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalCreateImageRequestModel other && Equals(other); + public bool Equals(InternalCreateImageRequestModel other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] 
+ public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateImageVariationRequestModel.cs b/.dotnet/src/Generated/Models/InternalCreateImageVariationRequestModel.cs new file mode 100644 index 000000000..2f57438b8 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateImageVariationRequestModel.cs @@ -0,0 +1,34 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Images +{ + internal readonly partial struct InternalCreateImageVariationRequestModel : IEquatable<InternalCreateImageVariationRequestModel> + { + private readonly string _value; + + public InternalCreateImageVariationRequestModel(string value) + { + _value = value ?? throw new ArgumentNullException(nameof(value)); + } + + private const string DallE2Value = "dall-e-2"; + + public static InternalCreateImageVariationRequestModel DallE2 { get; } = new InternalCreateImageVariationRequestModel(DallE2Value); + public static bool operator ==(InternalCreateImageVariationRequestModel left, InternalCreateImageVariationRequestModel right) => left.Equals(right); + public static bool operator !=(InternalCreateImageVariationRequestModel left, InternalCreateImageVariationRequestModel right) => !left.Equals(right); + public static implicit operator InternalCreateImageVariationRequestModel(string value) => new InternalCreateImageVariationRequestModel(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalCreateImageVariationRequestModel other && Equals(other); + public bool Equals(InternalCreateImageVariationRequestModel other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? 
StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateImageVariationRequestResponseFormat.cs b/.dotnet/src/Generated/Models/InternalCreateImageVariationRequestResponseFormat.cs new file mode 100644 index 000000000..bb1ed33e1 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateImageVariationRequestResponseFormat.cs @@ -0,0 +1,36 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Images +{ + internal readonly partial struct InternalCreateImageVariationRequestResponseFormat : IEquatable<InternalCreateImageVariationRequestResponseFormat> + { + private readonly string _value; + + public InternalCreateImageVariationRequestResponseFormat(string value) + { + _value = value ?? throw new ArgumentNullException(nameof(value)); + } + + private const string UrlValue = "url"; + private const string B64JsonValue = "b64_json"; + + public static InternalCreateImageVariationRequestResponseFormat Url { get; } = new InternalCreateImageVariationRequestResponseFormat(UrlValue); + public static InternalCreateImageVariationRequestResponseFormat B64Json { get; } = new InternalCreateImageVariationRequestResponseFormat(B64JsonValue); + public static bool operator ==(InternalCreateImageVariationRequestResponseFormat left, InternalCreateImageVariationRequestResponseFormat right) => left.Equals(right); + public static bool operator !=(InternalCreateImageVariationRequestResponseFormat left, InternalCreateImageVariationRequestResponseFormat right) => !left.Equals(right); + public static implicit operator InternalCreateImageVariationRequestResponseFormat(string value) => new InternalCreateImageVariationRequestResponseFormat(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalCreateImageVariationRequestResponseFormat other && Equals(other); + public bool Equals(InternalCreateImageVariationRequestResponseFormat 
other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateImageVariationRequestSize.cs b/.dotnet/src/Generated/Models/InternalCreateImageVariationRequestSize.cs new file mode 100644 index 000000000..8a60593b0 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateImageVariationRequestSize.cs @@ -0,0 +1,38 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Images +{ + internal readonly partial struct InternalCreateImageVariationRequestSize : IEquatable<InternalCreateImageVariationRequestSize> + { + private readonly string _value; + + public InternalCreateImageVariationRequestSize(string value) + { + _value = value ?? throw new ArgumentNullException(nameof(value)); + } + + private const string _256x256Value = "256x256"; + private const string _512x512Value = "512x512"; + private const string _1024x1024Value = "1024x1024"; + + public static InternalCreateImageVariationRequestSize _256x256 { get; } = new InternalCreateImageVariationRequestSize(_256x256Value); + public static InternalCreateImageVariationRequestSize _512x512 { get; } = new InternalCreateImageVariationRequestSize(_512x512Value); + public static InternalCreateImageVariationRequestSize _1024x1024 { get; } = new InternalCreateImageVariationRequestSize(_1024x1024Value); + public static bool operator ==(InternalCreateImageVariationRequestSize left, InternalCreateImageVariationRequestSize right) => left.Equals(right); + public static bool operator !=(InternalCreateImageVariationRequestSize left, InternalCreateImageVariationRequestSize right) => !left.Equals(right); + public static implicit operator InternalCreateImageVariationRequestSize(string value) => new 
InternalCreateImageVariationRequestSize(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalCreateImageVariationRequestSize other && Equals(other); + public bool Equals(InternalCreateImageVariationRequestSize other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateModerationRequestModel.cs b/.dotnet/src/Generated/Models/InternalCreateModerationRequestModel.cs new file mode 100644 index 000000000..f58696815 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateModerationRequestModel.cs @@ -0,0 +1,36 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Moderations +{ + internal readonly partial struct InternalCreateModerationRequestModel : IEquatable<InternalCreateModerationRequestModel> + { + private readonly string _value; + + public InternalCreateModerationRequestModel(string value) + { + _value = value ?? 
throw new ArgumentNullException(nameof(value)); + } + + private const string TextModerationLatestValue = "text-moderation-latest"; + private const string TextModerationStableValue = "text-moderation-stable"; + + public static InternalCreateModerationRequestModel TextModerationLatest { get; } = new InternalCreateModerationRequestModel(TextModerationLatestValue); + public static InternalCreateModerationRequestModel TextModerationStable { get; } = new InternalCreateModerationRequestModel(TextModerationStableValue); + public static bool operator ==(InternalCreateModerationRequestModel left, InternalCreateModerationRequestModel right) => left.Equals(right); + public static bool operator !=(InternalCreateModerationRequestModel left, InternalCreateModerationRequestModel right) => !left.Equals(right); + public static implicit operator InternalCreateModerationRequestModel(string value) => new InternalCreateModerationRequestModel(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalCreateModerationRequestModel other && Equals(other); + public bool Equals(InternalCreateModerationRequestModel other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? 
StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateRunRequestModel.cs b/.dotnet/src/Generated/Models/InternalCreateRunRequestModel.cs new file mode 100644 index 000000000..27b73a31b --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateRunRequestModel.cs @@ -0,0 +1,78 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Assistants +{ + internal readonly partial struct InternalCreateRunRequestModel : IEquatable<InternalCreateRunRequestModel> + { + private readonly string _value; + + public InternalCreateRunRequestModel(string value) + { + _value = value ?? throw new ArgumentNullException(nameof(value)); + } + + private const string Gpt4oValue = "gpt-4o"; + private const string Gpt4o20240806Value = "gpt-4o-2024-08-06"; + private const string Gpt4o20240513Value = "gpt-4o-2024-05-13"; + private const string Gpt4oMiniValue = "gpt-4o-mini"; + private const string Gpt4oMini20240718Value = "gpt-4o-mini-2024-07-18"; + private const string Gpt4TurboValue = "gpt-4-turbo"; + private const string Gpt4Turbo20240409Value = "gpt-4-turbo-2024-04-09"; + private const string Gpt40125PreviewValue = "gpt-4-0125-preview"; + private const string Gpt4TurboPreviewValue = "gpt-4-turbo-preview"; + private const string Gpt41106PreviewValue = "gpt-4-1106-preview"; + private const string Gpt4VisionPreviewValue = "gpt-4-vision-preview"; + private const string Gpt4Value = "gpt-4"; + private const string Gpt40314Value = "gpt-4-0314"; + private const string Gpt40613Value = "gpt-4-0613"; + private const string Gpt432kValue = "gpt-4-32k"; + private const string Gpt432k0314Value = "gpt-4-32k-0314"; + private const string Gpt432k0613Value = "gpt-4-32k-0613"; + private const string Gpt35TurboValue = "gpt-3.5-turbo"; + private const string Gpt35Turbo16kValue = "gpt-3.5-turbo-16k"; + private const string Gpt35Turbo0613Value = "gpt-3.5-turbo-0613"; + private 
const string Gpt35Turbo1106Value = "gpt-3.5-turbo-1106"; + private const string Gpt35Turbo0125Value = "gpt-3.5-turbo-0125"; + private const string Gpt35Turbo16k0613Value = "gpt-3.5-turbo-16k-0613"; + + public static InternalCreateRunRequestModel Gpt4o { get; } = new InternalCreateRunRequestModel(Gpt4oValue); + public static InternalCreateRunRequestModel Gpt4o20240806 { get; } = new InternalCreateRunRequestModel(Gpt4o20240806Value); + public static InternalCreateRunRequestModel Gpt4o20240513 { get; } = new InternalCreateRunRequestModel(Gpt4o20240513Value); + public static InternalCreateRunRequestModel Gpt4oMini { get; } = new InternalCreateRunRequestModel(Gpt4oMiniValue); + public static InternalCreateRunRequestModel Gpt4oMini20240718 { get; } = new InternalCreateRunRequestModel(Gpt4oMini20240718Value); + public static InternalCreateRunRequestModel Gpt4Turbo { get; } = new InternalCreateRunRequestModel(Gpt4TurboValue); + public static InternalCreateRunRequestModel Gpt4Turbo20240409 { get; } = new InternalCreateRunRequestModel(Gpt4Turbo20240409Value); + public static InternalCreateRunRequestModel Gpt40125Preview { get; } = new InternalCreateRunRequestModel(Gpt40125PreviewValue); + public static InternalCreateRunRequestModel Gpt4TurboPreview { get; } = new InternalCreateRunRequestModel(Gpt4TurboPreviewValue); + public static InternalCreateRunRequestModel Gpt41106Preview { get; } = new InternalCreateRunRequestModel(Gpt41106PreviewValue); + public static InternalCreateRunRequestModel Gpt4VisionPreview { get; } = new InternalCreateRunRequestModel(Gpt4VisionPreviewValue); + public static InternalCreateRunRequestModel Gpt4 { get; } = new InternalCreateRunRequestModel(Gpt4Value); + public static InternalCreateRunRequestModel Gpt40314 { get; } = new InternalCreateRunRequestModel(Gpt40314Value); + public static InternalCreateRunRequestModel Gpt40613 { get; } = new InternalCreateRunRequestModel(Gpt40613Value); + public static InternalCreateRunRequestModel Gpt432k { get; } = 
new InternalCreateRunRequestModel(Gpt432kValue); + public static InternalCreateRunRequestModel Gpt432k0314 { get; } = new InternalCreateRunRequestModel(Gpt432k0314Value); + public static InternalCreateRunRequestModel Gpt432k0613 { get; } = new InternalCreateRunRequestModel(Gpt432k0613Value); + public static InternalCreateRunRequestModel Gpt35Turbo { get; } = new InternalCreateRunRequestModel(Gpt35TurboValue); + public static InternalCreateRunRequestModel Gpt35Turbo16k { get; } = new InternalCreateRunRequestModel(Gpt35Turbo16kValue); + public static InternalCreateRunRequestModel Gpt35Turbo0613 { get; } = new InternalCreateRunRequestModel(Gpt35Turbo0613Value); + public static InternalCreateRunRequestModel Gpt35Turbo1106 { get; } = new InternalCreateRunRequestModel(Gpt35Turbo1106Value); + public static InternalCreateRunRequestModel Gpt35Turbo0125 { get; } = new InternalCreateRunRequestModel(Gpt35Turbo0125Value); + public static InternalCreateRunRequestModel Gpt35Turbo16k0613 { get; } = new InternalCreateRunRequestModel(Gpt35Turbo16k0613Value); + public static bool operator ==(InternalCreateRunRequestModel left, InternalCreateRunRequestModel right) => left.Equals(right); + public static bool operator !=(InternalCreateRunRequestModel left, InternalCreateRunRequestModel right) => !left.Equals(right); + public static implicit operator InternalCreateRunRequestModel(string value) => new InternalCreateRunRequestModel(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalCreateRunRequestModel other && Equals(other); + public bool Equals(InternalCreateRunRequestModel other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? 
StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateSpeechRequestModel.cs b/.dotnet/src/Generated/Models/InternalCreateSpeechRequestModel.cs new file mode 100644 index 000000000..795344fd1 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateSpeechRequestModel.cs @@ -0,0 +1,36 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Audio +{ + internal readonly partial struct InternalCreateSpeechRequestModel : IEquatable<InternalCreateSpeechRequestModel> + { + private readonly string _value; + + public InternalCreateSpeechRequestModel(string value) + { + _value = value ?? throw new ArgumentNullException(nameof(value)); + } + + private const string Tts1Value = "tts-1"; + private const string Tts1HdValue = "tts-1-hd"; + + public static InternalCreateSpeechRequestModel Tts1 { get; } = new InternalCreateSpeechRequestModel(Tts1Value); + public static InternalCreateSpeechRequestModel Tts1Hd { get; } = new InternalCreateSpeechRequestModel(Tts1HdValue); + public static bool operator ==(InternalCreateSpeechRequestModel left, InternalCreateSpeechRequestModel right) => left.Equals(right); + public static bool operator !=(InternalCreateSpeechRequestModel left, InternalCreateSpeechRequestModel right) => !left.Equals(right); + public static implicit operator InternalCreateSpeechRequestModel(string value) => new InternalCreateSpeechRequestModel(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalCreateSpeechRequestModel other && Equals(other); + public bool Equals(InternalCreateSpeechRequestModel other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? 
StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateThreadAndRunRequest.Serialization.cs b/.dotnet/src/Generated/Models/InternalCreateThreadAndRunRequest.Serialization.cs new file mode 100644 index 000000000..20185da5d --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateThreadAndRunRequest.Serialization.cs @@ -0,0 +1,498 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class InternalCreateThreadAndRunRequest : IJsonModel<InternalCreateThreadAndRunRequest> + { + void IJsonModel<InternalCreateThreadAndRunRequest>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalCreateThreadAndRunRequest>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalCreateThreadAndRunRequest)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("assistant_id") != true) + { + writer.WritePropertyName("assistant_id"u8); + writer.WriteStringValue(AssistantId); + } + if (SerializedAdditionalRawData?.ContainsKey("thread") != true && Optional.IsDefined(Thread)) + { + writer.WritePropertyName("thread"u8); + writer.WriteObjectValue(Thread, options); + } + if (SerializedAdditionalRawData?.ContainsKey("model") != true && Optional.IsDefined(Model)) + { + if (Model != null) + { + writer.WritePropertyName("model"u8); + writer.WriteStringValue(Model); + } + else + { + writer.WriteNull("model"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("instructions") != true && Optional.IsDefined(Instructions)) + { + if (Instructions != null) + { + writer.WritePropertyName("instructions"u8); + writer.WriteStringValue(Instructions); + } + else + { + 
writer.WriteNull("instructions"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("tools") != true && Optional.IsCollectionDefined(Tools)) + { + if (Tools != null) + { + writer.WritePropertyName("tools"u8); + writer.WriteStartArray(); + foreach (var item in Tools) + { + writer.WriteObjectValue(item, options); + } + writer.WriteEndArray(); + } + else + { + writer.WriteNull("tools"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("tool_resources") != true && Optional.IsDefined(ToolResources)) + { + if (ToolResources != null) + { + writer.WritePropertyName("tool_resources"u8); + writer.WriteObjectValue(ToolResources, options); + } + else + { + writer.WriteNull("tool_resources"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("metadata") != true && Optional.IsCollectionDefined(Metadata)) + { + if (Metadata != null) + { + writer.WritePropertyName("metadata"u8); + writer.WriteStartObject(); + foreach (var item in Metadata) + { + writer.WritePropertyName(item.Key); + writer.WriteStringValue(item.Value); + } + writer.WriteEndObject(); + } + else + { + writer.WriteNull("metadata"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("temperature") != true && Optional.IsDefined(Temperature)) + { + if (Temperature != null) + { + writer.WritePropertyName("temperature"u8); + writer.WriteNumberValue(Temperature.Value); + } + else + { + writer.WriteNull("temperature"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("top_p") != true && Optional.IsDefined(TopP)) + { + if (TopP != null) + { + writer.WritePropertyName("top_p"u8); + writer.WriteNumberValue(TopP.Value); + } + else + { + writer.WriteNull("top_p"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("stream") != true && Optional.IsDefined(Stream)) + { + if (Stream != null) + { + writer.WritePropertyName("stream"u8); + writer.WriteBooleanValue(Stream.Value); + } + else + { + writer.WriteNull("stream"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("max_prompt_tokens") != true 
&& Optional.IsDefined(MaxPromptTokens)) + { + if (MaxPromptTokens != null) + { + writer.WritePropertyName("max_prompt_tokens"u8); + writer.WriteNumberValue(MaxPromptTokens.Value); + } + else + { + writer.WriteNull("max_prompt_tokens"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("max_completion_tokens") != true && Optional.IsDefined(MaxCompletionTokens)) + { + if (MaxCompletionTokens != null) + { + writer.WritePropertyName("max_completion_tokens"u8); + writer.WriteNumberValue(MaxCompletionTokens.Value); + } + else + { + writer.WriteNull("max_completion_tokens"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("truncation_strategy") != true && Optional.IsDefined(TruncationStrategy)) + { + if (TruncationStrategy != null) + { + writer.WritePropertyName("truncation_strategy"u8); + writer.WriteObjectValue(TruncationStrategy, options); + } + else + { + writer.WriteNull("truncation_strategy"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("tool_choice") != true && Optional.IsDefined(ToolChoice)) + { + if (ToolChoice != null) + { + writer.WritePropertyName("tool_choice"u8); + writer.WriteObjectValue(ToolChoice, options); + } + else + { + writer.WriteNull("tool_choice"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("parallel_tool_calls") != true && Optional.IsDefined(ParallelToolCalls)) + { + writer.WritePropertyName("parallel_tool_calls"u8); + writer.WriteBooleanValue(ParallelToolCalls.Value); + } + if (SerializedAdditionalRawData?.ContainsKey("response_format") != true && Optional.IsDefined(ResponseFormat)) + { + if (ResponseFormat != null) + { + writer.WritePropertyName("response_format"u8); + writer.WriteObjectValue(ResponseFormat, options); + } + else + { + writer.WriteNull("response_format"); + } + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if 
NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalCreateThreadAndRunRequest IJsonModel<InternalCreateThreadAndRunRequest>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalCreateThreadAndRunRequest>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalCreateThreadAndRunRequest)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalCreateThreadAndRunRequest(document.RootElement, options); + } + + internal static InternalCreateThreadAndRunRequest DeserializeInternalCreateThreadAndRunRequest(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string assistantId = default; + ThreadCreationOptions thread = default; + string model = default; + string instructions = default; + IList<ToolDefinition> tools = default; + ToolResources toolResources = default; + IDictionary<string, string> metadata = default; + float? temperature = default; + float? topP = default; + bool? stream = default; + int? maxPromptTokens = default; + int? maxCompletionTokens = default; + RunTruncationStrategy truncationStrategy = default; + ToolConstraint toolChoice = default; + bool? 
parallelToolCalls = default; + AssistantResponseFormat responseFormat = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("assistant_id"u8)) + { + assistantId = property.Value.GetString(); + continue; + } + if (property.NameEquals("thread"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + thread = ThreadCreationOptions.DeserializeThreadCreationOptions(property.Value, options); + continue; + } + if (property.NameEquals("model"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + model = null; + continue; + } + model = property.Value.GetString(); + continue; + } + if (property.NameEquals("instructions"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + instructions = null; + continue; + } + instructions = property.Value.GetString(); + continue; + } + if (property.NameEquals("tools"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + List<ToolDefinition> array = new List<ToolDefinition>(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(ToolDefinition.DeserializeToolDefinition(item, options)); + } + tools = array; + continue; + } + if (property.NameEquals("tool_resources"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + toolResources = null; + continue; + } + toolResources = Assistants.ToolResources.DeserializeToolResources(property.Value, options); + continue; + } + if (property.NameEquals("metadata"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + Dictionary<string, string> dictionary = new Dictionary<string, string>(); + foreach (var property0 in property.Value.EnumerateObject()) + { + dictionary.Add(property0.Name, property0.Value.GetString()); + } + metadata = dictionary; + continue; + } + if (property.NameEquals("temperature"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + temperature = 
null; + continue; + } + temperature = property.Value.GetSingle(); + continue; + } + if (property.NameEquals("top_p"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + topP = null; + continue; + } + topP = property.Value.GetSingle(); + continue; + } + if (property.NameEquals("stream"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + stream = null; + continue; + } + stream = property.Value.GetBoolean(); + continue; + } + if (property.NameEquals("max_prompt_tokens"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + maxPromptTokens = null; + continue; + } + maxPromptTokens = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("max_completion_tokens"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + maxCompletionTokens = null; + continue; + } + maxCompletionTokens = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("truncation_strategy"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + truncationStrategy = null; + continue; + } + truncationStrategy = RunTruncationStrategy.DeserializeRunTruncationStrategy(property.Value, options); + continue; + } + if (property.NameEquals("tool_choice"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + toolChoice = null; + continue; + } + toolChoice = ToolConstraint.DeserializeToolConstraint(property.Value, options); + continue; + } + if (property.NameEquals("parallel_tool_calls"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + parallelToolCalls = property.Value.GetBoolean(); + continue; + } + if (property.NameEquals("response_format"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + responseFormat = null; + continue; + } + responseFormat = AssistantResponseFormat.DeserializeAssistantResponseFormat(property.Value, options); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, 
BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalCreateThreadAndRunRequest( + assistantId, + thread, + model, + instructions, + tools ?? new ChangeTrackingList<ToolDefinition>(), + toolResources, + metadata ?? new ChangeTrackingDictionary<string, string>(), + temperature, + topP, + stream, + maxPromptTokens, + maxCompletionTokens, + truncationStrategy, + toolChoice, + parallelToolCalls, + responseFormat, + serializedAdditionalRawData); + } + + BinaryData IPersistableModel<InternalCreateThreadAndRunRequest>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalCreateThreadAndRunRequest>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalCreateThreadAndRunRequest)} does not support writing '{options.Format}' format."); + } + } + + InternalCreateThreadAndRunRequest IPersistableModel<InternalCreateThreadAndRunRequest>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalCreateThreadAndRunRequest>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalCreateThreadAndRunRequest(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalCreateThreadAndRunRequest)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalCreateThreadAndRunRequest>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalCreateThreadAndRunRequest FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalCreateThreadAndRunRequest(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateThreadAndRunRequest.cs b/.dotnet/src/Generated/Models/InternalCreateThreadAndRunRequest.cs new file mode 100644 index 000000000..5e50cf643 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateThreadAndRunRequest.cs @@ -0,0 +1,60 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + internal partial class InternalCreateThreadAndRunRequest + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + public InternalCreateThreadAndRunRequest(string assistantId) + { + Argument.AssertNotNull(assistantId, nameof(assistantId)); + + AssistantId = assistantId; + Tools = new ChangeTrackingList<ToolDefinition>(); + Metadata = new ChangeTrackingDictionary<string, string>(); + } + + internal InternalCreateThreadAndRunRequest(string assistantId, ThreadCreationOptions thread, string model, string instructions, IList<ToolDefinition> tools, ToolResources toolResources, IDictionary<string, string> metadata, float? temperature, float? topP, bool? stream, int? maxPromptTokens, int? 
maxCompletionTokens, RunTruncationStrategy truncationStrategy, ToolConstraint toolChoice, bool? parallelToolCalls, AssistantResponseFormat responseFormat, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + AssistantId = assistantId; + Thread = thread; + Model = model; + Instructions = instructions; + Tools = tools; + ToolResources = toolResources; + Metadata = metadata; + Temperature = temperature; + TopP = topP; + Stream = stream; + MaxPromptTokens = maxPromptTokens; + MaxCompletionTokens = maxCompletionTokens; + TruncationStrategy = truncationStrategy; + ToolChoice = toolChoice; + ParallelToolCalls = parallelToolCalls; + ResponseFormat = responseFormat; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal InternalCreateThreadAndRunRequest() + { + } + + public string AssistantId { get; } + public ThreadCreationOptions Thread { get; set; } + public string Instructions { get; set; } + public IList<ToolDefinition> Tools { get; set; } + public IDictionary<string, string> Metadata { get; set; } + public float? Temperature { get; set; } + public float? TopP { get; set; } + public bool? Stream { get; set; } + public int? MaxPromptTokens { get; set; } + public int? MaxCompletionTokens { get; set; } + public RunTruncationStrategy TruncationStrategy { get; set; } + public bool? ParallelToolCalls { get; set; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateThreadAndRunRequestModel.cs b/.dotnet/src/Generated/Models/InternalCreateThreadAndRunRequestModel.cs new file mode 100644 index 000000000..04d84a623 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateThreadAndRunRequestModel.cs @@ -0,0 +1,78 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Assistants +{ + internal readonly partial struct InternalCreateThreadAndRunRequestModel : IEquatable<InternalCreateThreadAndRunRequestModel> + { + private readonly string _value; + + public InternalCreateThreadAndRunRequestModel(string value) + { + _value = value ?? 
throw new ArgumentNullException(nameof(value)); + } + + private const string Gpt4oValue = "gpt-4o"; + private const string Gpt4o20240806Value = "gpt-4o-2024-08-06"; + private const string Gpt4o20240513Value = "gpt-4o-2024-05-13"; + private const string Gpt4oMiniValue = "gpt-4o-mini"; + private const string Gpt4oMini20240718Value = "gpt-4o-mini-2024-07-18"; + private const string Gpt4TurboValue = "gpt-4-turbo"; + private const string Gpt4Turbo20240409Value = "gpt-4-turbo-2024-04-09"; + private const string Gpt40125PreviewValue = "gpt-4-0125-preview"; + private const string Gpt4TurboPreviewValue = "gpt-4-turbo-preview"; + private const string Gpt41106PreviewValue = "gpt-4-1106-preview"; + private const string Gpt4VisionPreviewValue = "gpt-4-vision-preview"; + private const string Gpt4Value = "gpt-4"; + private const string Gpt40314Value = "gpt-4-0314"; + private const string Gpt40613Value = "gpt-4-0613"; + private const string Gpt432kValue = "gpt-4-32k"; + private const string Gpt432k0314Value = "gpt-4-32k-0314"; + private const string Gpt432k0613Value = "gpt-4-32k-0613"; + private const string Gpt35TurboValue = "gpt-3.5-turbo"; + private const string Gpt35Turbo16kValue = "gpt-3.5-turbo-16k"; + private const string Gpt35Turbo0613Value = "gpt-3.5-turbo-0613"; + private const string Gpt35Turbo1106Value = "gpt-3.5-turbo-1106"; + private const string Gpt35Turbo0125Value = "gpt-3.5-turbo-0125"; + private const string Gpt35Turbo16k0613Value = "gpt-3.5-turbo-16k-0613"; + + public static InternalCreateThreadAndRunRequestModel Gpt4o { get; } = new InternalCreateThreadAndRunRequestModel(Gpt4oValue); + public static InternalCreateThreadAndRunRequestModel Gpt4o20240806 { get; } = new InternalCreateThreadAndRunRequestModel(Gpt4o20240806Value); + public static InternalCreateThreadAndRunRequestModel Gpt4o20240513 { get; } = new InternalCreateThreadAndRunRequestModel(Gpt4o20240513Value); + public static InternalCreateThreadAndRunRequestModel Gpt4oMini { get; } = new 
InternalCreateThreadAndRunRequestModel(Gpt4oMiniValue); + public static InternalCreateThreadAndRunRequestModel Gpt4oMini20240718 { get; } = new InternalCreateThreadAndRunRequestModel(Gpt4oMini20240718Value); + public static InternalCreateThreadAndRunRequestModel Gpt4Turbo { get; } = new InternalCreateThreadAndRunRequestModel(Gpt4TurboValue); + public static InternalCreateThreadAndRunRequestModel Gpt4Turbo20240409 { get; } = new InternalCreateThreadAndRunRequestModel(Gpt4Turbo20240409Value); + public static InternalCreateThreadAndRunRequestModel Gpt40125Preview { get; } = new InternalCreateThreadAndRunRequestModel(Gpt40125PreviewValue); + public static InternalCreateThreadAndRunRequestModel Gpt4TurboPreview { get; } = new InternalCreateThreadAndRunRequestModel(Gpt4TurboPreviewValue); + public static InternalCreateThreadAndRunRequestModel Gpt41106Preview { get; } = new InternalCreateThreadAndRunRequestModel(Gpt41106PreviewValue); + public static InternalCreateThreadAndRunRequestModel Gpt4VisionPreview { get; } = new InternalCreateThreadAndRunRequestModel(Gpt4VisionPreviewValue); + public static InternalCreateThreadAndRunRequestModel Gpt4 { get; } = new InternalCreateThreadAndRunRequestModel(Gpt4Value); + public static InternalCreateThreadAndRunRequestModel Gpt40314 { get; } = new InternalCreateThreadAndRunRequestModel(Gpt40314Value); + public static InternalCreateThreadAndRunRequestModel Gpt40613 { get; } = new InternalCreateThreadAndRunRequestModel(Gpt40613Value); + public static InternalCreateThreadAndRunRequestModel Gpt432k { get; } = new InternalCreateThreadAndRunRequestModel(Gpt432kValue); + public static InternalCreateThreadAndRunRequestModel Gpt432k0314 { get; } = new InternalCreateThreadAndRunRequestModel(Gpt432k0314Value); + public static InternalCreateThreadAndRunRequestModel Gpt432k0613 { get; } = new InternalCreateThreadAndRunRequestModel(Gpt432k0613Value); + public static InternalCreateThreadAndRunRequestModel Gpt35Turbo { get; } = new 
InternalCreateThreadAndRunRequestModel(Gpt35TurboValue); + public static InternalCreateThreadAndRunRequestModel Gpt35Turbo16k { get; } = new InternalCreateThreadAndRunRequestModel(Gpt35Turbo16kValue); + public static InternalCreateThreadAndRunRequestModel Gpt35Turbo0613 { get; } = new InternalCreateThreadAndRunRequestModel(Gpt35Turbo0613Value); + public static InternalCreateThreadAndRunRequestModel Gpt35Turbo1106 { get; } = new InternalCreateThreadAndRunRequestModel(Gpt35Turbo1106Value); + public static InternalCreateThreadAndRunRequestModel Gpt35Turbo0125 { get; } = new InternalCreateThreadAndRunRequestModel(Gpt35Turbo0125Value); + public static InternalCreateThreadAndRunRequestModel Gpt35Turbo16k0613 { get; } = new InternalCreateThreadAndRunRequestModel(Gpt35Turbo16k0613Value); + public static bool operator ==(InternalCreateThreadAndRunRequestModel left, InternalCreateThreadAndRunRequestModel right) => left.Equals(right); + public static bool operator !=(InternalCreateThreadAndRunRequestModel left, InternalCreateThreadAndRunRequestModel right) => !left.Equals(right); + public static implicit operator InternalCreateThreadAndRunRequestModel(string value) => new InternalCreateThreadAndRunRequestModel(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalCreateThreadAndRunRequestModel other && Equals(other); + public bool Equals(InternalCreateThreadAndRunRequestModel other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? 
StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateThreadAndRunRequestToolChoice.cs b/.dotnet/src/Generated/Models/InternalCreateThreadAndRunRequestToolChoice.cs new file mode 100644 index 000000000..45a467b3c --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateThreadAndRunRequestToolChoice.cs @@ -0,0 +1,38 @@ +// + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Assistants +{ + internal readonly partial struct InternalCreateThreadAndRunRequestToolChoice : IEquatable<InternalCreateThreadAndRunRequestToolChoice> + { + private readonly string _value; + + public InternalCreateThreadAndRunRequestToolChoice(string value) + { + _value = value ?? throw new ArgumentNullException(nameof(value)); + } + + private const string NoneValue = "none"; + private const string AutoValue = "auto"; + private const string RequiredValue = "required"; + + public static InternalCreateThreadAndRunRequestToolChoice None { get; } = new InternalCreateThreadAndRunRequestToolChoice(NoneValue); + public static InternalCreateThreadAndRunRequestToolChoice Auto { get; } = new InternalCreateThreadAndRunRequestToolChoice(AutoValue); + public static InternalCreateThreadAndRunRequestToolChoice Required { get; } = new InternalCreateThreadAndRunRequestToolChoice(RequiredValue); + public static bool operator ==(InternalCreateThreadAndRunRequestToolChoice left, InternalCreateThreadAndRunRequestToolChoice right) => left.Equals(right); + public static bool operator !=(InternalCreateThreadAndRunRequestToolChoice left, InternalCreateThreadAndRunRequestToolChoice right) => !left.Equals(right); + public static implicit operator InternalCreateThreadAndRunRequestToolChoice(string value) => new InternalCreateThreadAndRunRequestToolChoice(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalCreateThreadAndRunRequestToolChoice 
other && Equals(other); + public bool Equals(InternalCreateThreadAndRunRequestToolChoice other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateThreadAndRunRequestToolResources.Serialization.cs b/.dotnet/src/Generated/Models/InternalCreateThreadAndRunRequestToolResources.Serialization.cs new file mode 100644 index 000000000..aff6e9e50 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateThreadAndRunRequestToolResources.Serialization.cs @@ -0,0 +1,152 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class InternalCreateThreadAndRunRequestToolResources : IJsonModel<InternalCreateThreadAndRunRequestToolResources> + { + void IJsonModel<InternalCreateThreadAndRunRequestToolResources>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalCreateThreadAndRunRequestToolResources>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalCreateThreadAndRunRequestToolResources)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("code_interpreter") != true && Optional.IsDefined(CodeInterpreter)) + { + writer.WritePropertyName("code_interpreter"u8); + writer.WriteObjectValue(CodeInterpreter, options); + } + if (SerializedAdditionalRawData?.ContainsKey("file_search") != true && Optional.IsDefined(FileSearch)) + { + writer.WritePropertyName("file_search"u8); + writer.WriteObjectValue(FileSearch, options); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalCreateThreadAndRunRequestToolResources IJsonModel<InternalCreateThreadAndRunRequestToolResources>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalCreateThreadAndRunRequestToolResources>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalCreateThreadAndRunRequestToolResources)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalCreateThreadAndRunRequestToolResources(document.RootElement, options); + } + + internal static InternalCreateThreadAndRunRequestToolResources DeserializeInternalCreateThreadAndRunRequestToolResources(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + InternalCreateThreadAndRunRequestToolResourcesCodeInterpreter codeInterpreter = default; + InternalToolResourcesFileSearchIdsOnly fileSearch = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("code_interpreter"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + codeInterpreter = InternalCreateThreadAndRunRequestToolResourcesCodeInterpreter.DeserializeInternalCreateThreadAndRunRequestToolResourcesCodeInterpreter(property.Value, options); + continue; + } + if (property.NameEquals("file_search"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + fileSearch = InternalToolResourcesFileSearchIdsOnly.DeserializeInternalToolResourcesFileSearchIdsOnly(property.Value, options); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalCreateThreadAndRunRequestToolResources(codeInterpreter, fileSearch, serializedAdditionalRawData); + } + + BinaryData 
IPersistableModel<InternalCreateThreadAndRunRequestToolResources>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalCreateThreadAndRunRequestToolResources>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalCreateThreadAndRunRequestToolResources)} does not support writing '{options.Format}' format."); + } + } + + InternalCreateThreadAndRunRequestToolResources IPersistableModel<InternalCreateThreadAndRunRequestToolResources>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalCreateThreadAndRunRequestToolResources>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalCreateThreadAndRunRequestToolResources(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalCreateThreadAndRunRequestToolResources)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalCreateThreadAndRunRequestToolResources>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalCreateThreadAndRunRequestToolResources FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalCreateThreadAndRunRequestToolResources(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateThreadAndRunRequestToolResources.cs b/.dotnet/src/Generated/Models/InternalCreateThreadAndRunRequestToolResources.cs new file mode 100644 index 000000000..a07a484a3 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateThreadAndRunRequestToolResources.cs @@ -0,0 +1,27 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ 
+ internal partial class InternalCreateThreadAndRunRequestToolResources + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + public InternalCreateThreadAndRunRequestToolResources() + { + } + + internal InternalCreateThreadAndRunRequestToolResources(InternalCreateThreadAndRunRequestToolResourcesCodeInterpreter codeInterpreter, InternalToolResourcesFileSearchIdsOnly fileSearch, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + CodeInterpreter = codeInterpreter; + FileSearch = fileSearch; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + public InternalCreateThreadAndRunRequestToolResourcesCodeInterpreter CodeInterpreter { get; set; } + public InternalToolResourcesFileSearchIdsOnly FileSearch { get; set; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateThreadAndRunRequestToolResourcesCodeInterpreter.Serialization.cs b/.dotnet/src/Generated/Models/InternalCreateThreadAndRunRequestToolResourcesCodeInterpreter.Serialization.cs new file mode 100644 index 000000000..73c8d2c30 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateThreadAndRunRequestToolResourcesCodeInterpreter.Serialization.cs @@ -0,0 +1,147 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class InternalCreateThreadAndRunRequestToolResourcesCodeInterpreter : IJsonModel<InternalCreateThreadAndRunRequestToolResourcesCodeInterpreter> + { + void IJsonModel<InternalCreateThreadAndRunRequestToolResourcesCodeInterpreter>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalCreateThreadAndRunRequestToolResourcesCodeInterpreter>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalCreateThreadAndRunRequestToolResourcesCodeInterpreter)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("file_ids") != true && Optional.IsCollectionDefined(FileIds)) + { + writer.WritePropertyName("file_ids"u8); + writer.WriteStartArray(); + foreach (var item in FileIds) + { + writer.WriteStringValue(item); + } + writer.WriteEndArray(); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalCreateThreadAndRunRequestToolResourcesCodeInterpreter IJsonModel<InternalCreateThreadAndRunRequestToolResourcesCodeInterpreter>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalCreateThreadAndRunRequestToolResourcesCodeInterpreter>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalCreateThreadAndRunRequestToolResourcesCodeInterpreter)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalCreateThreadAndRunRequestToolResourcesCodeInterpreter(document.RootElement, options); + } + + internal static InternalCreateThreadAndRunRequestToolResourcesCodeInterpreter DeserializeInternalCreateThreadAndRunRequestToolResourcesCodeInterpreter(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + IList<string> fileIds = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("file_ids"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + List<string> array = new List<string>(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(item.GetString()); + } + fileIds = array; + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalCreateThreadAndRunRequestToolResourcesCodeInterpreter(fileIds ?? new ChangeTrackingList<string>(), serializedAdditionalRawData); + } + + BinaryData IPersistableModel<InternalCreateThreadAndRunRequestToolResourcesCodeInterpreter>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalCreateThreadAndRunRequestToolResourcesCodeInterpreter>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalCreateThreadAndRunRequestToolResourcesCodeInterpreter)} does not support writing '{options.Format}' format."); + } + } + + InternalCreateThreadAndRunRequestToolResourcesCodeInterpreter IPersistableModel<InternalCreateThreadAndRunRequestToolResourcesCodeInterpreter>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalCreateThreadAndRunRequestToolResourcesCodeInterpreter>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalCreateThreadAndRunRequestToolResourcesCodeInterpreter(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalCreateThreadAndRunRequestToolResourcesCodeInterpreter)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalCreateThreadAndRunRequestToolResourcesCodeInterpreter>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalCreateThreadAndRunRequestToolResourcesCodeInterpreter FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalCreateThreadAndRunRequestToolResourcesCodeInterpreter(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateThreadAndRunRequestToolResourcesCodeInterpreter.cs b/.dotnet/src/Generated/Models/InternalCreateThreadAndRunRequestToolResourcesCodeInterpreter.cs new file mode 100644 index 000000000..ed362ab6b --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateThreadAndRunRequestToolResourcesCodeInterpreter.cs @@ -0,0 +1,26 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; 
+ +namespace OpenAI.Assistants +{ + internal partial class InternalCreateThreadAndRunRequestToolResourcesCodeInterpreter + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + public InternalCreateThreadAndRunRequestToolResourcesCodeInterpreter() + { + FileIds = new ChangeTrackingList<string>(); + } + + internal InternalCreateThreadAndRunRequestToolResourcesCodeInterpreter(IList<string> fileIds, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + FileIds = fileIds; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + public IList<string> FileIds { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateThreadRequestToolResources.Serialization.cs b/.dotnet/src/Generated/Models/InternalCreateThreadRequestToolResources.Serialization.cs new file mode 100644 index 000000000..39b6689b2 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateThreadRequestToolResources.Serialization.cs @@ -0,0 +1,152 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class InternalCreateThreadRequestToolResources : IJsonModel<InternalCreateThreadRequestToolResources> + { + void IJsonModel<InternalCreateThreadRequestToolResources>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalCreateThreadRequestToolResources>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalCreateThreadRequestToolResources)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("code_interpreter") != true && Optional.IsDefined(CodeInterpreter)) + { + writer.WritePropertyName("code_interpreter"u8); + writer.WriteObjectValue(CodeInterpreter, options); + } + if (SerializedAdditionalRawData?.ContainsKey("file_search") != true && Optional.IsDefined(FileSearch)) + { + writer.WritePropertyName("file_search"u8); + writer.WriteObjectValue(FileSearch, options); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalCreateThreadRequestToolResources IJsonModel<InternalCreateThreadRequestToolResources>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalCreateThreadRequestToolResources>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalCreateThreadRequestToolResources)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalCreateThreadRequestToolResources(document.RootElement, options); + } + + internal static InternalCreateThreadRequestToolResources DeserializeInternalCreateThreadRequestToolResources(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + InternalCreateThreadRequestToolResourcesCodeInterpreter codeInterpreter = default; + FileSearchToolResources fileSearch = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("code_interpreter"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + codeInterpreter = InternalCreateThreadRequestToolResourcesCodeInterpreter.DeserializeInternalCreateThreadRequestToolResourcesCodeInterpreter(property.Value, options); + continue; + } + if (property.NameEquals("file_search"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + fileSearch = FileSearchToolResources.DeserializeFileSearchToolResources(property.Value, options); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalCreateThreadRequestToolResources(codeInterpreter, fileSearch, serializedAdditionalRawData); + } + + BinaryData IPersistableModel<InternalCreateThreadRequestToolResources>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == 
"W" ? ((IPersistableModel<InternalCreateThreadRequestToolResources>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalCreateThreadRequestToolResources)} does not support writing '{options.Format}' format."); + } + } + + InternalCreateThreadRequestToolResources IPersistableModel<InternalCreateThreadRequestToolResources>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalCreateThreadRequestToolResources>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalCreateThreadRequestToolResources(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalCreateThreadRequestToolResources)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalCreateThreadRequestToolResources>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalCreateThreadRequestToolResources FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalCreateThreadRequestToolResources(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateThreadRequestToolResources.cs b/.dotnet/src/Generated/Models/InternalCreateThreadRequestToolResources.cs new file mode 100644 index 000000000..08a68fce1 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateThreadRequestToolResources.cs @@ -0,0 +1,27 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + internal partial class InternalCreateThreadRequestToolResources + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + public 
InternalCreateThreadRequestToolResources() + { + } + + internal InternalCreateThreadRequestToolResources(InternalCreateThreadRequestToolResourcesCodeInterpreter codeInterpreter, FileSearchToolResources fileSearch, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + CodeInterpreter = codeInterpreter; + FileSearch = fileSearch; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + public InternalCreateThreadRequestToolResourcesCodeInterpreter CodeInterpreter { get; set; } + public FileSearchToolResources FileSearch { get; set; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateThreadRequestToolResourcesCodeInterpreter.Serialization.cs b/.dotnet/src/Generated/Models/InternalCreateThreadRequestToolResourcesCodeInterpreter.Serialization.cs new file mode 100644 index 000000000..55825459e --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateThreadRequestToolResourcesCodeInterpreter.Serialization.cs @@ -0,0 +1,147 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class InternalCreateThreadRequestToolResourcesCodeInterpreter : IJsonModel<InternalCreateThreadRequestToolResourcesCodeInterpreter> + { + void IJsonModel<InternalCreateThreadRequestToolResourcesCodeInterpreter>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalCreateThreadRequestToolResourcesCodeInterpreter>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalCreateThreadRequestToolResourcesCodeInterpreter)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("file_ids") != true && Optional.IsCollectionDefined(FileIds)) + { + writer.WritePropertyName("file_ids"u8); + writer.WriteStartArray(); + foreach (var item in FileIds) + { + writer.WriteStringValue(item); + } + writer.WriteEndArray(); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalCreateThreadRequestToolResourcesCodeInterpreter IJsonModel<InternalCreateThreadRequestToolResourcesCodeInterpreter>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalCreateThreadRequestToolResourcesCodeInterpreter>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalCreateThreadRequestToolResourcesCodeInterpreter)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalCreateThreadRequestToolResourcesCodeInterpreter(document.RootElement, options); + } + + internal static InternalCreateThreadRequestToolResourcesCodeInterpreter DeserializeInternalCreateThreadRequestToolResourcesCodeInterpreter(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + IList<string> fileIds = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("file_ids"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + List<string> array = new List<string>(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(item.GetString()); + } + fileIds = array; + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalCreateThreadRequestToolResourcesCodeInterpreter(fileIds ?? new ChangeTrackingList<string>(), serializedAdditionalRawData); + } + + BinaryData IPersistableModel<InternalCreateThreadRequestToolResourcesCodeInterpreter>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalCreateThreadRequestToolResourcesCodeInterpreter>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalCreateThreadRequestToolResourcesCodeInterpreter)} does not support writing '{options.Format}' format."); + } + } + + InternalCreateThreadRequestToolResourcesCodeInterpreter IPersistableModel<InternalCreateThreadRequestToolResourcesCodeInterpreter>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalCreateThreadRequestToolResourcesCodeInterpreter>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalCreateThreadRequestToolResourcesCodeInterpreter(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalCreateThreadRequestToolResourcesCodeInterpreter)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalCreateThreadRequestToolResourcesCodeInterpreter>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalCreateThreadRequestToolResourcesCodeInterpreter FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalCreateThreadRequestToolResourcesCodeInterpreter(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateThreadRequestToolResourcesCodeInterpreter.cs b/.dotnet/src/Generated/Models/InternalCreateThreadRequestToolResourcesCodeInterpreter.cs new file mode 100644 index 000000000..df61d7ce5 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateThreadRequestToolResourcesCodeInterpreter.cs @@ -0,0 +1,26 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + internal partial 
class InternalCreateThreadRequestToolResourcesCodeInterpreter + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + public InternalCreateThreadRequestToolResourcesCodeInterpreter() + { + FileIds = new ChangeTrackingList<string>(); + } + + internal InternalCreateThreadRequestToolResourcesCodeInterpreter(IList<string> fileIds, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + FileIds = fileIds; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + public IList<string> FileIds { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateThreadRequestToolResourcesFileSearchBase.Serialization.cs b/.dotnet/src/Generated/Models/InternalCreateThreadRequestToolResourcesFileSearchBase.Serialization.cs new file mode 100644 index 000000000..bc7fd659d --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateThreadRequestToolResourcesFileSearchBase.Serialization.cs @@ -0,0 +1,122 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class InternalCreateThreadRequestToolResourcesFileSearchBase : IJsonModel<InternalCreateThreadRequestToolResourcesFileSearchBase> + { + void IJsonModel<InternalCreateThreadRequestToolResourcesFileSearchBase>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ?
((IPersistableModel<InternalCreateThreadRequestToolResourcesFileSearchBase>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalCreateThreadRequestToolResourcesFileSearchBase)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalCreateThreadRequestToolResourcesFileSearchBase IJsonModel<InternalCreateThreadRequestToolResourcesFileSearchBase>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalCreateThreadRequestToolResourcesFileSearchBase>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalCreateThreadRequestToolResourcesFileSearchBase)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalCreateThreadRequestToolResourcesFileSearchBase(document.RootElement, options); + } + + internal static InternalCreateThreadRequestToolResourcesFileSearchBase DeserializeInternalCreateThreadRequestToolResourcesFileSearchBase(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name,
BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalCreateThreadRequestToolResourcesFileSearchBase(serializedAdditionalRawData); + } + + BinaryData IPersistableModel<InternalCreateThreadRequestToolResourcesFileSearchBase>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalCreateThreadRequestToolResourcesFileSearchBase>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalCreateThreadRequestToolResourcesFileSearchBase)} does not support writing '{options.Format}' format."); + } + } + + InternalCreateThreadRequestToolResourcesFileSearchBase IPersistableModel<InternalCreateThreadRequestToolResourcesFileSearchBase>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalCreateThreadRequestToolResourcesFileSearchBase>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalCreateThreadRequestToolResourcesFileSearchBase(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalCreateThreadRequestToolResourcesFileSearchBase)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalCreateThreadRequestToolResourcesFileSearchBase>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalCreateThreadRequestToolResourcesFileSearchBase FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalCreateThreadRequestToolResourcesFileSearchBase(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateThreadRequestToolResourcesFileSearchBase.cs
b/.dotnet/src/Generated/Models/InternalCreateThreadRequestToolResourcesFileSearchBase.cs new file mode 100644 index 000000000..4a84b73f5 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateThreadRequestToolResourcesFileSearchBase.cs @@ -0,0 +1,22 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + internal partial class InternalCreateThreadRequestToolResourcesFileSearchBase + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + public InternalCreateThreadRequestToolResourcesFileSearchBase() + { + } + + internal InternalCreateThreadRequestToolResourcesFileSearchBase(IDictionary<string, BinaryData> serializedAdditionalRawData) + { + SerializedAdditionalRawData = serializedAdditionalRawData; + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateTranscriptionRequestModel.cs b/.dotnet/src/Generated/Models/InternalCreateTranscriptionRequestModel.cs new file mode 100644 index 000000000..a04004f24 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateTranscriptionRequestModel.cs @@ -0,0 +1,34 @@ +// + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Audio +{ + internal readonly partial struct InternalCreateTranscriptionRequestModel : IEquatable<InternalCreateTranscriptionRequestModel> + { + private readonly string _value; + + public InternalCreateTranscriptionRequestModel(string value) + { + _value = value ??
throw new ArgumentNullException(nameof(value)); + } + + private const string Whisper1Value = "whisper-1"; + + public static InternalCreateTranscriptionRequestModel Whisper1 { get; } = new InternalCreateTranscriptionRequestModel(Whisper1Value); + public static bool operator ==(InternalCreateTranscriptionRequestModel left, InternalCreateTranscriptionRequestModel right) => left.Equals(right); + public static bool operator !=(InternalCreateTranscriptionRequestModel left, InternalCreateTranscriptionRequestModel right) => !left.Equals(right); + public static implicit operator InternalCreateTranscriptionRequestModel(string value) => new InternalCreateTranscriptionRequestModel(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalCreateTranscriptionRequestModel other && Equals(other); + public bool Equals(InternalCreateTranscriptionRequestModel other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateTranscriptionRequestTimestampGranularity.cs b/.dotnet/src/Generated/Models/InternalCreateTranscriptionRequestTimestampGranularity.cs new file mode 100644 index 000000000..5a13a25c6 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateTranscriptionRequestTimestampGranularity.cs @@ -0,0 +1,36 @@ +// + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Audio +{ + internal readonly partial struct InternalCreateTranscriptionRequestTimestampGranularity : IEquatable<InternalCreateTranscriptionRequestTimestampGranularity> + { + private readonly string _value; + + public InternalCreateTranscriptionRequestTimestampGranularity(string value) + { + _value = value ??
throw new ArgumentNullException(nameof(value)); + } + + private const string WordValue = "word"; + private const string SegmentValue = "segment"; + + public static InternalCreateTranscriptionRequestTimestampGranularity Word { get; } = new InternalCreateTranscriptionRequestTimestampGranularity(WordValue); + public static InternalCreateTranscriptionRequestTimestampGranularity Segment { get; } = new InternalCreateTranscriptionRequestTimestampGranularity(SegmentValue); + public static bool operator ==(InternalCreateTranscriptionRequestTimestampGranularity left, InternalCreateTranscriptionRequestTimestampGranularity right) => left.Equals(right); + public static bool operator !=(InternalCreateTranscriptionRequestTimestampGranularity left, InternalCreateTranscriptionRequestTimestampGranularity right) => !left.Equals(right); + public static implicit operator InternalCreateTranscriptionRequestTimestampGranularity(string value) => new InternalCreateTranscriptionRequestTimestampGranularity(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalCreateTranscriptionRequestTimestampGranularity other && Equals(other); + public bool Equals(InternalCreateTranscriptionRequestTimestampGranularity other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? 
StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateTranscriptionResponseJson.Serialization.cs b/.dotnet/src/Generated/Models/InternalCreateTranscriptionResponseJson.Serialization.cs new file mode 100644 index 000000000..a5e4863a8 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateTranscriptionResponseJson.Serialization.cs @@ -0,0 +1,133 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Audio +{ + internal partial class InternalCreateTranscriptionResponseJson : IJsonModel<InternalCreateTranscriptionResponseJson> + { + void IJsonModel<InternalCreateTranscriptionResponseJson>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalCreateTranscriptionResponseJson>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalCreateTranscriptionResponseJson)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("text") != true) + { + writer.WritePropertyName("text"u8); + writer.WriteStringValue(Text); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalCreateTranscriptionResponseJson IJsonModel<InternalCreateTranscriptionResponseJson>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ?
((IPersistableModel<InternalCreateTranscriptionResponseJson>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalCreateTranscriptionResponseJson)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalCreateTranscriptionResponseJson(document.RootElement, options); + } + + internal static InternalCreateTranscriptionResponseJson DeserializeInternalCreateTranscriptionResponseJson(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string text = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("text"u8)) + { + text = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalCreateTranscriptionResponseJson(text, serializedAdditionalRawData); + } + + BinaryData IPersistableModel<InternalCreateTranscriptionResponseJson>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalCreateTranscriptionResponseJson>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalCreateTranscriptionResponseJson)} does not support writing '{options.Format}' format."); + } + } + + InternalCreateTranscriptionResponseJson IPersistableModel<InternalCreateTranscriptionResponseJson>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ?
((IPersistableModel<InternalCreateTranscriptionResponseJson>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalCreateTranscriptionResponseJson(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalCreateTranscriptionResponseJson)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalCreateTranscriptionResponseJson>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalCreateTranscriptionResponseJson FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalCreateTranscriptionResponseJson(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateTranscriptionResponseJson.cs b/.dotnet/src/Generated/Models/InternalCreateTranscriptionResponseJson.cs new file mode 100644 index 000000000..19818f1d1 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateTranscriptionResponseJson.cs @@ -0,0 +1,32 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Audio +{ + internal partial class InternalCreateTranscriptionResponseJson + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + internal InternalCreateTranscriptionResponseJson(string text) + { + Argument.AssertNotNull(text, nameof(text)); + + Text = text; + } + + internal InternalCreateTranscriptionResponseJson(string text, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + Text = text; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal InternalCreateTranscriptionResponseJson() + { + } + + public string Text { get; } + } +} diff --git
a/.dotnet/src/Generated/Models/InternalCreateTranscriptionResponseVerboseJsonTask.cs b/.dotnet/src/Generated/Models/InternalCreateTranscriptionResponseVerboseJsonTask.cs new file mode 100644 index 000000000..a94a2f567 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateTranscriptionResponseVerboseJsonTask.cs @@ -0,0 +1,34 @@ +// + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Audio +{ + internal readonly partial struct InternalCreateTranscriptionResponseVerboseJsonTask : IEquatable<InternalCreateTranscriptionResponseVerboseJsonTask> + { + private readonly string _value; + + public InternalCreateTranscriptionResponseVerboseJsonTask(string value) + { + _value = value ?? throw new ArgumentNullException(nameof(value)); + } + + private const string TranscribeValue = "transcribe"; + + public static InternalCreateTranscriptionResponseVerboseJsonTask Transcribe { get; } = new InternalCreateTranscriptionResponseVerboseJsonTask(TranscribeValue); + public static bool operator ==(InternalCreateTranscriptionResponseVerboseJsonTask left, InternalCreateTranscriptionResponseVerboseJsonTask right) => left.Equals(right); + public static bool operator !=(InternalCreateTranscriptionResponseVerboseJsonTask left, InternalCreateTranscriptionResponseVerboseJsonTask right) => !left.Equals(right); + public static implicit operator InternalCreateTranscriptionResponseVerboseJsonTask(string value) => new InternalCreateTranscriptionResponseVerboseJsonTask(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalCreateTranscriptionResponseVerboseJsonTask other && Equals(other); + public bool Equals(InternalCreateTranscriptionResponseVerboseJsonTask other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ?
StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateTranslationRequestModel.cs b/.dotnet/src/Generated/Models/InternalCreateTranslationRequestModel.cs new file mode 100644 index 000000000..67effb4f6 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateTranslationRequestModel.cs @@ -0,0 +1,34 @@ +// + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Audio +{ + internal readonly partial struct InternalCreateTranslationRequestModel : IEquatable<InternalCreateTranslationRequestModel> + { + private readonly string _value; + + public InternalCreateTranslationRequestModel(string value) + { + _value = value ?? throw new ArgumentNullException(nameof(value)); + } + + private const string Whisper1Value = "whisper-1"; + + public static InternalCreateTranslationRequestModel Whisper1 { get; } = new InternalCreateTranslationRequestModel(Whisper1Value); + public static bool operator ==(InternalCreateTranslationRequestModel left, InternalCreateTranslationRequestModel right) => left.Equals(right); + public static bool operator !=(InternalCreateTranslationRequestModel left, InternalCreateTranslationRequestModel right) => !left.Equals(right); + public static implicit operator InternalCreateTranslationRequestModel(string value) => new InternalCreateTranslationRequestModel(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalCreateTranslationRequestModel other && Equals(other); + public bool Equals(InternalCreateTranslationRequestModel other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ?
StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateTranslationResponseJson.Serialization.cs b/.dotnet/src/Generated/Models/InternalCreateTranslationResponseJson.Serialization.cs new file mode 100644 index 000000000..d0d1fcd37 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateTranslationResponseJson.Serialization.cs @@ -0,0 +1,133 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Audio +{ + internal partial class InternalCreateTranslationResponseJson : IJsonModel<InternalCreateTranslationResponseJson> + { + void IJsonModel<InternalCreateTranslationResponseJson>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalCreateTranslationResponseJson>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalCreateTranslationResponseJson)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("text") != true) + { + writer.WritePropertyName("text"u8); + writer.WriteStringValue(Text); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalCreateTranslationResponseJson IJsonModel<InternalCreateTranslationResponseJson>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ?
((IPersistableModel<InternalCreateTranslationResponseJson>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalCreateTranslationResponseJson)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalCreateTranslationResponseJson(document.RootElement, options); + } + + internal static InternalCreateTranslationResponseJson DeserializeInternalCreateTranslationResponseJson(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string text = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("text"u8)) + { + text = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalCreateTranslationResponseJson(text, serializedAdditionalRawData); + } + + BinaryData IPersistableModel<InternalCreateTranslationResponseJson>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalCreateTranslationResponseJson>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalCreateTranslationResponseJson)} does not support writing '{options.Format}' format."); + } + } + + InternalCreateTranslationResponseJson IPersistableModel<InternalCreateTranslationResponseJson>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ?
((IPersistableModel<InternalCreateTranslationResponseJson>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalCreateTranslationResponseJson(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalCreateTranslationResponseJson)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalCreateTranslationResponseJson>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalCreateTranslationResponseJson FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalCreateTranslationResponseJson(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateTranslationResponseJson.cs b/.dotnet/src/Generated/Models/InternalCreateTranslationResponseJson.cs new file mode 100644 index 000000000..e6e9dae14 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateTranslationResponseJson.cs @@ -0,0 +1,32 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Audio +{ + internal partial class InternalCreateTranslationResponseJson + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + internal InternalCreateTranslationResponseJson(string text) + { + Argument.AssertNotNull(text, nameof(text)); + + Text = text; + } + + internal InternalCreateTranslationResponseJson(string text, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + Text = text; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal InternalCreateTranslationResponseJson() + { + } + + public string Text { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateTranslationResponseVerboseJsonTask.cs
b/.dotnet/src/Generated/Models/InternalCreateTranslationResponseVerboseJsonTask.cs new file mode 100644 index 000000000..089261285 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateTranslationResponseVerboseJsonTask.cs @@ -0,0 +1,34 @@ +// + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Audio +{ + internal readonly partial struct InternalCreateTranslationResponseVerboseJsonTask : IEquatable<InternalCreateTranslationResponseVerboseJsonTask> + { + private readonly string _value; + + public InternalCreateTranslationResponseVerboseJsonTask(string value) + { + _value = value ?? throw new ArgumentNullException(nameof(value)); + } + + private const string TranslateValue = "translate"; + + public static InternalCreateTranslationResponseVerboseJsonTask Translate { get; } = new InternalCreateTranslationResponseVerboseJsonTask(TranslateValue); + public static bool operator ==(InternalCreateTranslationResponseVerboseJsonTask left, InternalCreateTranslationResponseVerboseJsonTask right) => left.Equals(right); + public static bool operator !=(InternalCreateTranslationResponseVerboseJsonTask left, InternalCreateTranslationResponseVerboseJsonTask right) => !left.Equals(right); + public static implicit operator InternalCreateTranslationResponseVerboseJsonTask(string value) => new InternalCreateTranslationResponseVerboseJsonTask(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalCreateTranslationResponseVerboseJsonTask other && Equals(other); + public bool Equals(InternalCreateTranslationResponseVerboseJsonTask other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ?
StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateUploadRequest.Serialization.cs b/.dotnet/src/Generated/Models/InternalCreateUploadRequest.Serialization.cs new file mode 100644 index 000000000..e2ed769ba --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateUploadRequest.Serialization.cs @@ -0,0 +1,166 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Files +{ + internal partial class InternalCreateUploadRequest : IJsonModel<InternalCreateUploadRequest> + { + void IJsonModel<InternalCreateUploadRequest>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalCreateUploadRequest>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalCreateUploadRequest)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("filename") != true) + { + writer.WritePropertyName("filename"u8); + writer.WriteStringValue(Filename); + } + if (SerializedAdditionalRawData?.ContainsKey("purpose") != true) + { + writer.WritePropertyName("purpose"u8); + writer.WriteStringValue(Purpose.ToString()); + } + if (SerializedAdditionalRawData?.ContainsKey("bytes") != true) + { + writer.WritePropertyName("bytes"u8); + writer.WriteNumberValue(Bytes); + } + if (SerializedAdditionalRawData?.ContainsKey("mime_type") != true) + { + writer.WritePropertyName("mime_type"u8); + writer.WriteStringValue(MimeType); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value);
+#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalCreateUploadRequest IJsonModel<InternalCreateUploadRequest>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalCreateUploadRequest>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalCreateUploadRequest)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalCreateUploadRequest(document.RootElement, options); + } + + internal static InternalCreateUploadRequest DeserializeInternalCreateUploadRequest(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string filename = default; + InternalCreateUploadRequestPurpose purpose = default; + int bytes = default; + string mimeType = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("filename"u8)) + { + filename = property.Value.GetString(); + continue; + } + if (property.NameEquals("purpose"u8)) + { + purpose = new InternalCreateUploadRequestPurpose(property.Value.GetString()); + continue; + } + if (property.NameEquals("bytes"u8)) + { + bytes = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("mime_type"u8)) + { + mimeType = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new
InternalCreateUploadRequest(filename, purpose, bytes, mimeType, serializedAdditionalRawData); + } + + BinaryData IPersistableModel<InternalCreateUploadRequest>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalCreateUploadRequest>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalCreateUploadRequest)} does not support writing '{options.Format}' format."); + } + } + + InternalCreateUploadRequest IPersistableModel<InternalCreateUploadRequest>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalCreateUploadRequest>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalCreateUploadRequest(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalCreateUploadRequest)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalCreateUploadRequest>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalCreateUploadRequest FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalCreateUploadRequest(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateUploadRequest.cs b/.dotnet/src/Generated/Models/InternalCreateUploadRequest.cs new file mode 100644 index 000000000..f580307a6 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateUploadRequest.cs @@ -0,0 +1,42 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Files +{ + internal partial class InternalCreateUploadRequest + { +
internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + public InternalCreateUploadRequest(string filename, InternalCreateUploadRequestPurpose purpose, int bytes, string mimeType) + { + Argument.AssertNotNull(filename, nameof(filename)); + Argument.AssertNotNull(mimeType, nameof(mimeType)); + + Filename = filename; + Purpose = purpose; + Bytes = bytes; + MimeType = mimeType; + } + + internal InternalCreateUploadRequest(string filename, InternalCreateUploadRequestPurpose purpose, int bytes, string mimeType, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + Filename = filename; + Purpose = purpose; + Bytes = bytes; + MimeType = mimeType; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal InternalCreateUploadRequest() + { + } + + public string Filename { get; } + public InternalCreateUploadRequestPurpose Purpose { get; } + public int Bytes { get; } + public string MimeType { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateUploadRequestPurpose.cs b/.dotnet/src/Generated/Models/InternalCreateUploadRequestPurpose.cs new file mode 100644 index 000000000..6b76958f0 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateUploadRequestPurpose.cs @@ -0,0 +1,40 @@ +// + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Files +{ + internal readonly partial struct InternalCreateUploadRequestPurpose : IEquatable<InternalCreateUploadRequestPurpose> + { + private readonly string _value; + + public InternalCreateUploadRequestPurpose(string value) + { + _value = value ?? 
throw new ArgumentNullException(nameof(value)); + } + + private const string AssistantsValue = "assistants"; + private const string BatchValue = "batch"; + private const string FineTuneValue = "fine-tune"; + private const string VisionValue = "vision"; + + public static InternalCreateUploadRequestPurpose Assistants { get; } = new InternalCreateUploadRequestPurpose(AssistantsValue); + public static InternalCreateUploadRequestPurpose Batch { get; } = new InternalCreateUploadRequestPurpose(BatchValue); + public static InternalCreateUploadRequestPurpose FineTune { get; } = new InternalCreateUploadRequestPurpose(FineTuneValue); + public static InternalCreateUploadRequestPurpose Vision { get; } = new InternalCreateUploadRequestPurpose(VisionValue); + public static bool operator ==(InternalCreateUploadRequestPurpose left, InternalCreateUploadRequestPurpose right) => left.Equals(right); + public static bool operator !=(InternalCreateUploadRequestPurpose left, InternalCreateUploadRequestPurpose right) => !left.Equals(right); + public static implicit operator InternalCreateUploadRequestPurpose(string value) => new InternalCreateUploadRequestPurpose(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalCreateUploadRequestPurpose other && Equals(other); + public bool Equals(InternalCreateUploadRequestPurpose other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? 
StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateVectorStoreFileBatchRequest.Serialization.cs b/.dotnet/src/Generated/Models/InternalCreateVectorStoreFileBatchRequest.Serialization.cs new file mode 100644 index 000000000..90585d41f --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateVectorStoreFileBatchRequest.Serialization.cs @@ -0,0 +1,165 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.VectorStores +{ + internal partial class InternalCreateVectorStoreFileBatchRequest : IJsonModel<InternalCreateVectorStoreFileBatchRequest> + { + void IJsonModel<InternalCreateVectorStoreFileBatchRequest>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalCreateVectorStoreFileBatchRequest>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalCreateVectorStoreFileBatchRequest)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("file_ids") != true) + { + writer.WritePropertyName("file_ids"u8); + writer.WriteStartArray(); + foreach (var item in FileIds) + { + writer.WriteStringValue(item); + } + writer.WriteEndArray(); + } + if (SerializedAdditionalRawData?.ContainsKey("chunking_strategy") != true && Optional.IsDefined(ChunkingStrategy)) + { + writer.WritePropertyName("chunking_strategy"u8); +#if NET6_0_OR_GREATER + writer.WriteRawValue(ChunkingStrategy); +#else + using (JsonDocument document = JsonDocument.Parse(ChunkingStrategy)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + 
writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalCreateVectorStoreFileBatchRequest IJsonModel<InternalCreateVectorStoreFileBatchRequest>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalCreateVectorStoreFileBatchRequest>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalCreateVectorStoreFileBatchRequest)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalCreateVectorStoreFileBatchRequest(document.RootElement, options); + } + + internal static InternalCreateVectorStoreFileBatchRequest DeserializeInternalCreateVectorStoreFileBatchRequest(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + IList<string> fileIds = default; + BinaryData chunkingStrategy = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("file_ids"u8)) + { + List<string> array = new List<string>(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(item.GetString()); + } + fileIds = array; + continue; + } + if (property.NameEquals("chunking_strategy"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + chunkingStrategy = BinaryData.FromString(property.Value.GetRawText()); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + 
serializedAdditionalRawData = rawDataDictionary; + return new InternalCreateVectorStoreFileBatchRequest(fileIds, chunkingStrategy, serializedAdditionalRawData); + } + + BinaryData IPersistableModel<InternalCreateVectorStoreFileBatchRequest>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalCreateVectorStoreFileBatchRequest>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalCreateVectorStoreFileBatchRequest)} does not support writing '{options.Format}' format."); + } + } + + InternalCreateVectorStoreFileBatchRequest IPersistableModel<InternalCreateVectorStoreFileBatchRequest>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalCreateVectorStoreFileBatchRequest>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalCreateVectorStoreFileBatchRequest(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalCreateVectorStoreFileBatchRequest)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalCreateVectorStoreFileBatchRequest>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalCreateVectorStoreFileBatchRequest FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalCreateVectorStoreFileBatchRequest(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateVectorStoreFileBatchRequest.cs b/.dotnet/src/Generated/Models/InternalCreateVectorStoreFileBatchRequest.cs new file mode 100644 index 000000000..3064e3505 --- /dev/null +++ 
b/.dotnet/src/Generated/Models/InternalCreateVectorStoreFileBatchRequest.cs @@ -0,0 +1,35 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; +using System.Linq; + +namespace OpenAI.VectorStores +{ + internal partial class InternalCreateVectorStoreFileBatchRequest + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + public InternalCreateVectorStoreFileBatchRequest(IEnumerable<string> fileIds) + { + Argument.AssertNotNull(fileIds, nameof(fileIds)); + + FileIds = fileIds.ToList(); + } + + internal InternalCreateVectorStoreFileBatchRequest(IList<string> fileIds, BinaryData chunkingStrategy, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + FileIds = fileIds; + ChunkingStrategy = chunkingStrategy; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal InternalCreateVectorStoreFileBatchRequest() + { + } + + public IList<string> FileIds { get; } + public BinaryData ChunkingStrategy { get; set; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateVectorStoreFileRequest.Serialization.cs b/.dotnet/src/Generated/Models/InternalCreateVectorStoreFileRequest.Serialization.cs new file mode 100644 index 000000000..4e124f18f --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateVectorStoreFileRequest.Serialization.cs @@ -0,0 +1,148 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.VectorStores +{ + internal partial class InternalCreateVectorStoreFileRequest : IJsonModel<InternalCreateVectorStoreFileRequest> + { + void IJsonModel<InternalCreateVectorStoreFileRequest>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalCreateVectorStoreFileRequest>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalCreateVectorStoreFileRequest)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("file_id") != true) + { + writer.WritePropertyName("file_id"u8); + writer.WriteStringValue(FileId); + } + if (SerializedAdditionalRawData?.ContainsKey("chunking_strategy") != true && Optional.IsDefined(ChunkingStrategy)) + { + writer.WritePropertyName("chunking_strategy"u8); + writer.WriteObjectValue(ChunkingStrategy, options); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalCreateVectorStoreFileRequest IJsonModel<InternalCreateVectorStoreFileRequest>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalCreateVectorStoreFileRequest>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalCreateVectorStoreFileRequest)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalCreateVectorStoreFileRequest(document.RootElement, options); + } + + internal static InternalCreateVectorStoreFileRequest DeserializeInternalCreateVectorStoreFileRequest(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string fileId = default; + FileChunkingStrategy chunkingStrategy = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("file_id"u8)) + { + fileId = property.Value.GetString(); + continue; + } + if (property.NameEquals("chunking_strategy"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + chunkingStrategy = FileChunkingStrategy.DeserializeFileChunkingStrategy(property.Value, options); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalCreateVectorStoreFileRequest(fileId, chunkingStrategy, serializedAdditionalRawData); + } + + BinaryData IPersistableModel<InternalCreateVectorStoreFileRequest>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalCreateVectorStoreFileRequest>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalCreateVectorStoreFileRequest)} does not support writing '{options.Format}' format."); + } + } + + InternalCreateVectorStoreFileRequest IPersistableModel<InternalCreateVectorStoreFileRequest>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalCreateVectorStoreFileRequest>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalCreateVectorStoreFileRequest(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalCreateVectorStoreFileRequest)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalCreateVectorStoreFileRequest>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalCreateVectorStoreFileRequest FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalCreateVectorStoreFileRequest(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalCreateVectorStoreFileRequest.cs b/.dotnet/src/Generated/Models/InternalCreateVectorStoreFileRequest.cs new file mode 100644 index 000000000..4224f5fca --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalCreateVectorStoreFileRequest.cs @@ -0,0 +1,33 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.VectorStores +{ + internal partial class InternalCreateVectorStoreFileRequest + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + public InternalCreateVectorStoreFileRequest(string 
fileId) + { + Argument.AssertNotNull(fileId, nameof(fileId)); + + FileId = fileId; + } + + internal InternalCreateVectorStoreFileRequest(string fileId, FileChunkingStrategy chunkingStrategy, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + FileId = fileId; + ChunkingStrategy = chunkingStrategy; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal InternalCreateVectorStoreFileRequest() + { + } + + public string FileId { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalDeleteAssistantResponse.Serialization.cs b/.dotnet/src/Generated/Models/InternalDeleteAssistantResponse.Serialization.cs new file mode 100644 index 000000000..7c9cd5ae1 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalDeleteAssistantResponse.Serialization.cs @@ -0,0 +1,155 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class InternalDeleteAssistantResponse : IJsonModel<InternalDeleteAssistantResponse> + { + void IJsonModel<InternalDeleteAssistantResponse>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalDeleteAssistantResponse>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalDeleteAssistantResponse)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("id") != true) + { + writer.WritePropertyName("id"u8); + writer.WriteStringValue(Id); + } + if (SerializedAdditionalRawData?.ContainsKey("deleted") != true) + { + writer.WritePropertyName("deleted"u8); + writer.WriteBooleanValue(Deleted); + } + if (SerializedAdditionalRawData?.ContainsKey("object") != true) + { + writer.WritePropertyName("object"u8); + writer.WriteStringValue(Object.ToString()); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalDeleteAssistantResponse IJsonModel<InternalDeleteAssistantResponse>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalDeleteAssistantResponse>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalDeleteAssistantResponse)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalDeleteAssistantResponse(document.RootElement, options); + } + + internal static InternalDeleteAssistantResponse DeserializeInternalDeleteAssistantResponse(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string id = default; + bool deleted = default; + InternalDeleteAssistantResponseObject @object = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("id"u8)) + { + id = property.Value.GetString(); + continue; + } + if (property.NameEquals("deleted"u8)) + { + deleted = property.Value.GetBoolean(); + continue; + } + if (property.NameEquals("object"u8)) + { + @object = new InternalDeleteAssistantResponseObject(property.Value.GetString()); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalDeleteAssistantResponse(id, deleted, @object, serializedAdditionalRawData); + } + + BinaryData IPersistableModel<InternalDeleteAssistantResponse>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalDeleteAssistantResponse>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalDeleteAssistantResponse)} does not support writing '{options.Format}' format."); + } + } + + InternalDeleteAssistantResponse IPersistableModel<InternalDeleteAssistantResponse>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalDeleteAssistantResponse>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalDeleteAssistantResponse(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalDeleteAssistantResponse)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalDeleteAssistantResponse>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalDeleteAssistantResponse FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalDeleteAssistantResponse(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalDeleteAssistantResponse.cs b/.dotnet/src/Generated/Models/InternalDeleteAssistantResponse.cs new file mode 100644 index 000000000..804997b70 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalDeleteAssistantResponse.cs @@ -0,0 +1,37 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + internal partial class InternalDeleteAssistantResponse + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + internal InternalDeleteAssistantResponse(string id, bool deleted) + { + Argument.AssertNotNull(id, 
nameof(id)); + + Id = id; + Deleted = deleted; + } + + internal InternalDeleteAssistantResponse(string id, bool deleted, InternalDeleteAssistantResponseObject @object, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + Id = id; + Deleted = deleted; + Object = @object; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal InternalDeleteAssistantResponse() + { + } + + public string Id { get; } + public bool Deleted { get; } + public InternalDeleteAssistantResponseObject Object { get; } = InternalDeleteAssistantResponseObject.AssistantDeleted; + } +} diff --git a/.dotnet/src/Generated/Models/InternalDeleteAssistantResponseObject.cs b/.dotnet/src/Generated/Models/InternalDeleteAssistantResponseObject.cs new file mode 100644 index 000000000..1ee139041 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalDeleteAssistantResponseObject.cs @@ -0,0 +1,34 @@ +// + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Assistants +{ + internal readonly partial struct InternalDeleteAssistantResponseObject : IEquatable<InternalDeleteAssistantResponseObject> + { + private readonly string _value; + + public InternalDeleteAssistantResponseObject(string value) + { + _value = value ?? 
throw new ArgumentNullException(nameof(value)); + } + + private const string AssistantDeletedValue = "assistant.deleted"; + + public static InternalDeleteAssistantResponseObject AssistantDeleted { get; } = new InternalDeleteAssistantResponseObject(AssistantDeletedValue); + public static bool operator ==(InternalDeleteAssistantResponseObject left, InternalDeleteAssistantResponseObject right) => left.Equals(right); + public static bool operator !=(InternalDeleteAssistantResponseObject left, InternalDeleteAssistantResponseObject right) => !left.Equals(right); + public static implicit operator InternalDeleteAssistantResponseObject(string value) => new InternalDeleteAssistantResponseObject(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalDeleteAssistantResponseObject other && Equals(other); + public bool Equals(InternalDeleteAssistantResponseObject other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalDeleteFileResponse.Serialization.cs b/.dotnet/src/Generated/Models/InternalDeleteFileResponse.Serialization.cs new file mode 100644 index 000000000..a176702f0 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalDeleteFileResponse.Serialization.cs @@ -0,0 +1,155 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Files +{ + internal partial class InternalDeleteFileResponse : IJsonModel<InternalDeleteFileResponse> + { + void IJsonModel<InternalDeleteFileResponse>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalDeleteFileResponse>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalDeleteFileResponse)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("id") != true) + { + writer.WritePropertyName("id"u8); + writer.WriteStringValue(Id); + } + if (SerializedAdditionalRawData?.ContainsKey("object") != true) + { + writer.WritePropertyName("object"u8); + writer.WriteStringValue(Object.ToString()); + } + if (SerializedAdditionalRawData?.ContainsKey("deleted") != true) + { + writer.WritePropertyName("deleted"u8); + writer.WriteBooleanValue(Deleted); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalDeleteFileResponse IJsonModel<InternalDeleteFileResponse>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalDeleteFileResponse>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalDeleteFileResponse)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalDeleteFileResponse(document.RootElement, options); + } + + internal static InternalDeleteFileResponse DeserializeInternalDeleteFileResponse(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string id = default; + InternalDeleteFileResponseObject @object = default; + bool deleted = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("id"u8)) + { + id = property.Value.GetString(); + continue; + } + if (property.NameEquals("object"u8)) + { + @object = new InternalDeleteFileResponseObject(property.Value.GetString()); + continue; + } + if (property.NameEquals("deleted"u8)) + { + deleted = property.Value.GetBoolean(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalDeleteFileResponse(id, @object, deleted, serializedAdditionalRawData); + } + + BinaryData IPersistableModel<InternalDeleteFileResponse>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalDeleteFileResponse>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalDeleteFileResponse)} does not support writing '{options.Format}' format."); + } + } + + InternalDeleteFileResponse IPersistableModel<InternalDeleteFileResponse>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalDeleteFileResponse>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalDeleteFileResponse(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalDeleteFileResponse)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalDeleteFileResponse>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalDeleteFileResponse FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalDeleteFileResponse(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalDeleteFileResponse.cs b/.dotnet/src/Generated/Models/InternalDeleteFileResponse.cs new file mode 100644 index 000000000..20d559dcd --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalDeleteFileResponse.cs @@ -0,0 +1,38 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Files +{ + internal partial class InternalDeleteFileResponse + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + internal InternalDeleteFileResponse(string id, bool deleted) + { + Argument.AssertNotNull(id, nameof(id)); + + Id = id; + Deleted = deleted; + } + + internal 
InternalDeleteFileResponse(string id, InternalDeleteFileResponseObject @object, bool deleted, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + Id = id; + Object = @object; + Deleted = deleted; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal InternalDeleteFileResponse() + { + } + + public string Id { get; } + public InternalDeleteFileResponseObject Object { get; } = InternalDeleteFileResponseObject.File; + + public bool Deleted { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalDeleteFileResponseObject.cs b/.dotnet/src/Generated/Models/InternalDeleteFileResponseObject.cs new file mode 100644 index 000000000..f6fbfc918 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalDeleteFileResponseObject.cs @@ -0,0 +1,34 @@ +// + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Files +{ + internal readonly partial struct InternalDeleteFileResponseObject : IEquatable<InternalDeleteFileResponseObject> + { + private readonly string _value; + + public InternalDeleteFileResponseObject(string value) + { + _value = value ?? 
throw new ArgumentNullException(nameof(value)); + } + + private const string FileValue = "file"; + + public static InternalDeleteFileResponseObject File { get; } = new InternalDeleteFileResponseObject(FileValue); + public static bool operator ==(InternalDeleteFileResponseObject left, InternalDeleteFileResponseObject right) => left.Equals(right); + public static bool operator !=(InternalDeleteFileResponseObject left, InternalDeleteFileResponseObject right) => !left.Equals(right); + public static implicit operator InternalDeleteFileResponseObject(string value) => new InternalDeleteFileResponseObject(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalDeleteFileResponseObject other && Equals(other); + public bool Equals(InternalDeleteFileResponseObject other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalDeleteMessageResponse.Serialization.cs b/.dotnet/src/Generated/Models/InternalDeleteMessageResponse.Serialization.cs new file mode 100644 index 000000000..991ff8f40 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalDeleteMessageResponse.Serialization.cs @@ -0,0 +1,155 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class InternalDeleteMessageResponse : IJsonModel<InternalDeleteMessageResponse> + { + void IJsonModel<InternalDeleteMessageResponse>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalDeleteMessageResponse>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(InternalDeleteMessageResponse)} does not support writing '{format}' format.");
+            }
+
+            writer.WriteStartObject();
+            if (SerializedAdditionalRawData?.ContainsKey("id") != true)
+            {
+                writer.WritePropertyName("id"u8);
+                writer.WriteStringValue(Id);
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("deleted") != true)
+            {
+                writer.WritePropertyName("deleted"u8);
+                writer.WriteBooleanValue(Deleted);
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("object") != true)
+            {
+                writer.WritePropertyName("object"u8);
+                writer.WriteStringValue(Object.ToString());
+            }
+            if (SerializedAdditionalRawData != null)
+            {
+                foreach (var item in SerializedAdditionalRawData)
+                {
+                    if (ModelSerializationExtensions.IsSentinelValue(item.Value))
+                    {
+                        continue;
+                    }
+                    writer.WritePropertyName(item.Key);
+#if NET6_0_OR_GREATER
+                    writer.WriteRawValue(item.Value);
+#else
+                    using (JsonDocument document = JsonDocument.Parse(item.Value))
+                    {
+                        JsonSerializer.Serialize(writer, document.RootElement);
+                    }
+#endif
+                }
+            }
+            writer.WriteEndObject();
+        }
+
+        InternalDeleteMessageResponse IJsonModel<InternalDeleteMessageResponse>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ?
((IPersistableModel<InternalDeleteMessageResponse>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(InternalDeleteMessageResponse)} does not support reading '{format}' format.");
+            }
+
+            using JsonDocument document = JsonDocument.ParseValue(ref reader);
+            return DeserializeInternalDeleteMessageResponse(document.RootElement, options);
+        }
+
+        internal static InternalDeleteMessageResponse DeserializeInternalDeleteMessageResponse(JsonElement element, ModelReaderWriterOptions options = null)
+        {
+            options ??= ModelSerializationExtensions.WireOptions;
+
+            if (element.ValueKind == JsonValueKind.Null)
+            {
+                return null;
+            }
+            string id = default;
+            bool deleted = default;
+            InternalDeleteMessageResponseObject @object = default;
+            IDictionary<string, BinaryData> serializedAdditionalRawData = default;
+            Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>();
+            foreach (var property in element.EnumerateObject())
+            {
+                if (property.NameEquals("id"u8))
+                {
+                    id = property.Value.GetString();
+                    continue;
+                }
+                if (property.NameEquals("deleted"u8))
+                {
+                    deleted = property.Value.GetBoolean();
+                    continue;
+                }
+                if (property.NameEquals("object"u8))
+                {
+                    @object = new InternalDeleteMessageResponseObject(property.Value.GetString());
+                    continue;
+                }
+                if (true)
+                {
+                    rawDataDictionary ??= new Dictionary<string, BinaryData>();
+                    rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText()));
+                }
+            }
+            serializedAdditionalRawData = rawDataDictionary;
+            return new InternalDeleteMessageResponse(id, deleted, @object, serializedAdditionalRawData);
+        }
+
+        BinaryData IPersistableModel<InternalDeleteMessageResponse>.Write(ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ?
((IPersistableModel<InternalDeleteMessageResponse>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    return ModelReaderWriter.Write(this, options);
+                default:
+                    throw new FormatException($"The model {nameof(InternalDeleteMessageResponse)} does not support writing '{options.Format}' format.");
+            }
+        }
+
+        InternalDeleteMessageResponse IPersistableModel<InternalDeleteMessageResponse>.Create(BinaryData data, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalDeleteMessageResponse>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    {
+                        using JsonDocument document = JsonDocument.Parse(data);
+                        return DeserializeInternalDeleteMessageResponse(document.RootElement, options);
+                    }
+                default:
+                    throw new FormatException($"The model {nameof(InternalDeleteMessageResponse)} does not support reading '{options.Format}' format.");
+            }
+        }
+
+        string IPersistableModel<InternalDeleteMessageResponse>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J";
+
+        internal static InternalDeleteMessageResponse FromResponse(PipelineResponse response)
+        {
+            using var document = JsonDocument.Parse(response.Content);
+            return DeserializeInternalDeleteMessageResponse(document.RootElement);
+        }
+
+        internal virtual BinaryContent ToBinaryContent()
+        {
+            return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions);
+        }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalDeleteMessageResponse.cs b/.dotnet/src/Generated/Models/InternalDeleteMessageResponse.cs
new file mode 100644
index 000000000..98cad7c8d
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalDeleteMessageResponse.cs
@@ -0,0 +1,37 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.Collections.Generic;
+
+namespace OpenAI.Assistants
+{
+    internal partial class InternalDeleteMessageResponse
+    {
+        internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; }
+        internal InternalDeleteMessageResponse(string id, bool deleted)
+        {
+            Argument.AssertNotNull(id, nameof(id));
+
+            Id = id;
+
Deleted = deleted;
+        }
+
+        internal InternalDeleteMessageResponse(string id, bool deleted, InternalDeleteMessageResponseObject @object, IDictionary<string, BinaryData> serializedAdditionalRawData)
+        {
+            Id = id;
+            Deleted = deleted;
+            Object = @object;
+            SerializedAdditionalRawData = serializedAdditionalRawData;
+        }
+
+        internal InternalDeleteMessageResponse()
+        {
+        }
+
+        public string Id { get; }
+        public bool Deleted { get; }
+        public InternalDeleteMessageResponseObject Object { get; } = InternalDeleteMessageResponseObject.ThreadMessageDeleted;
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalDeleteMessageResponseObject.cs b/.dotnet/src/Generated/Models/InternalDeleteMessageResponseObject.cs
new file mode 100644
index 000000000..772ffbcb3
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalDeleteMessageResponseObject.cs
@@ -0,0 +1,34 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ComponentModel;
+
+namespace OpenAI.Assistants
+{
+    internal readonly partial struct InternalDeleteMessageResponseObject : IEquatable<InternalDeleteMessageResponseObject>
+    {
+        private readonly string _value;
+
+        public InternalDeleteMessageResponseObject(string value)
+        {
+            _value = value ??
throw new ArgumentNullException(nameof(value));
+        }
+
+        private const string ThreadMessageDeletedValue = "thread.message.deleted";
+
+        public static InternalDeleteMessageResponseObject ThreadMessageDeleted { get; } = new InternalDeleteMessageResponseObject(ThreadMessageDeletedValue);
+        public static bool operator ==(InternalDeleteMessageResponseObject left, InternalDeleteMessageResponseObject right) => left.Equals(right);
+        public static bool operator !=(InternalDeleteMessageResponseObject left, InternalDeleteMessageResponseObject right) => !left.Equals(right);
+        public static implicit operator InternalDeleteMessageResponseObject(string value) => new InternalDeleteMessageResponseObject(value);
+
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override bool Equals(object obj) => obj is InternalDeleteMessageResponseObject other && Equals(other);
+        public bool Equals(InternalDeleteMessageResponseObject other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase);
+
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0;
+        public override string ToString() => _value;
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalDeleteModelResponse.Serialization.cs b/.dotnet/src/Generated/Models/InternalDeleteModelResponse.Serialization.cs
new file mode 100644
index 000000000..1ee7b8cf7
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalDeleteModelResponse.Serialization.cs
@@ -0,0 +1,155 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace OpenAI.Models
+{
+    internal partial class InternalDeleteModelResponse : IJsonModel<InternalDeleteModelResponse>
+    {
+        void IJsonModel<InternalDeleteModelResponse>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ?
((IPersistableModel<InternalDeleteModelResponse>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(InternalDeleteModelResponse)} does not support writing '{format}' format.");
+            }
+
+            writer.WriteStartObject();
+            if (SerializedAdditionalRawData?.ContainsKey("id") != true)
+            {
+                writer.WritePropertyName("id"u8);
+                writer.WriteStringValue(Id);
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("deleted") != true)
+            {
+                writer.WritePropertyName("deleted"u8);
+                writer.WriteBooleanValue(Deleted);
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("object") != true)
+            {
+                writer.WritePropertyName("object"u8);
+                writer.WriteStringValue(Object.ToString());
+            }
+            if (SerializedAdditionalRawData != null)
+            {
+                foreach (var item in SerializedAdditionalRawData)
+                {
+                    if (ModelSerializationExtensions.IsSentinelValue(item.Value))
+                    {
+                        continue;
+                    }
+                    writer.WritePropertyName(item.Key);
+#if NET6_0_OR_GREATER
+                    writer.WriteRawValue(item.Value);
+#else
+                    using (JsonDocument document = JsonDocument.Parse(item.Value))
+                    {
+                        JsonSerializer.Serialize(writer, document.RootElement);
+                    }
+#endif
+                }
+            }
+            writer.WriteEndObject();
+        }
+
+        InternalDeleteModelResponse IJsonModel<InternalDeleteModelResponse>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ?
((IPersistableModel<InternalDeleteModelResponse>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(InternalDeleteModelResponse)} does not support reading '{format}' format.");
+            }
+
+            using JsonDocument document = JsonDocument.ParseValue(ref reader);
+            return DeserializeInternalDeleteModelResponse(document.RootElement, options);
+        }
+
+        internal static InternalDeleteModelResponse DeserializeInternalDeleteModelResponse(JsonElement element, ModelReaderWriterOptions options = null)
+        {
+            options ??= ModelSerializationExtensions.WireOptions;
+
+            if (element.ValueKind == JsonValueKind.Null)
+            {
+                return null;
+            }
+            string id = default;
+            bool deleted = default;
+            InternalDeleteModelResponseObject @object = default;
+            IDictionary<string, BinaryData> serializedAdditionalRawData = default;
+            Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>();
+            foreach (var property in element.EnumerateObject())
+            {
+                if (property.NameEquals("id"u8))
+                {
+                    id = property.Value.GetString();
+                    continue;
+                }
+                if (property.NameEquals("deleted"u8))
+                {
+                    deleted = property.Value.GetBoolean();
+                    continue;
+                }
+                if (property.NameEquals("object"u8))
+                {
+                    @object = new InternalDeleteModelResponseObject(property.Value.GetString());
+                    continue;
+                }
+                if (true)
+                {
+                    rawDataDictionary ??= new Dictionary<string, BinaryData>();
+                    rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText()));
+                }
+            }
+            serializedAdditionalRawData = rawDataDictionary;
+            return new InternalDeleteModelResponse(id, deleted, @object, serializedAdditionalRawData);
+        }
+
+        BinaryData IPersistableModel<InternalDeleteModelResponse>.Write(ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ?
((IPersistableModel<InternalDeleteModelResponse>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    return ModelReaderWriter.Write(this, options);
+                default:
+                    throw new FormatException($"The model {nameof(InternalDeleteModelResponse)} does not support writing '{options.Format}' format.");
+            }
+        }
+
+        InternalDeleteModelResponse IPersistableModel<InternalDeleteModelResponse>.Create(BinaryData data, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalDeleteModelResponse>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    {
+                        using JsonDocument document = JsonDocument.Parse(data);
+                        return DeserializeInternalDeleteModelResponse(document.RootElement, options);
+                    }
+                default:
+                    throw new FormatException($"The model {nameof(InternalDeleteModelResponse)} does not support reading '{options.Format}' format.");
+            }
+        }
+
+        string IPersistableModel<InternalDeleteModelResponse>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J";
+
+        internal static InternalDeleteModelResponse FromResponse(PipelineResponse response)
+        {
+            using var document = JsonDocument.Parse(response.Content);
+            return DeserializeInternalDeleteModelResponse(document.RootElement);
+        }
+
+        internal virtual BinaryContent ToBinaryContent()
+        {
+            return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions);
+        }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalDeleteModelResponse.cs b/.dotnet/src/Generated/Models/InternalDeleteModelResponse.cs
new file mode 100644
index 000000000..2b9c6561d
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalDeleteModelResponse.cs
@@ -0,0 +1,37 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.Collections.Generic;
+
+namespace OpenAI.Models
+{
+    internal partial class InternalDeleteModelResponse
+    {
+        internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; }
+        internal InternalDeleteModelResponse(string id, bool deleted)
+        {
+            Argument.AssertNotNull(id, nameof(id));
+
+            Id = id;
+            Deleted = deleted;
+        }
+
+
internal InternalDeleteModelResponse(string id, bool deleted, InternalDeleteModelResponseObject @object, IDictionary<string, BinaryData> serializedAdditionalRawData)
+        {
+            Id = id;
+            Deleted = deleted;
+            Object = @object;
+            SerializedAdditionalRawData = serializedAdditionalRawData;
+        }
+
+        internal InternalDeleteModelResponse()
+        {
+        }
+
+        public string Id { get; }
+        public bool Deleted { get; }
+        public InternalDeleteModelResponseObject Object { get; } = InternalDeleteModelResponseObject.Model;
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalDeleteModelResponseObject.cs b/.dotnet/src/Generated/Models/InternalDeleteModelResponseObject.cs
new file mode 100644
index 000000000..967d11d7a
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalDeleteModelResponseObject.cs
@@ -0,0 +1,34 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ComponentModel;
+
+namespace OpenAI.Models
+{
+    internal readonly partial struct InternalDeleteModelResponseObject : IEquatable<InternalDeleteModelResponseObject>
+    {
+        private readonly string _value;
+
+        public InternalDeleteModelResponseObject(string value)
+        {
+            _value = value ??
throw new ArgumentNullException(nameof(value));
+        }
+
+        private const string ModelValue = "model";
+
+        public static InternalDeleteModelResponseObject Model { get; } = new InternalDeleteModelResponseObject(ModelValue);
+        public static bool operator ==(InternalDeleteModelResponseObject left, InternalDeleteModelResponseObject right) => left.Equals(right);
+        public static bool operator !=(InternalDeleteModelResponseObject left, InternalDeleteModelResponseObject right) => !left.Equals(right);
+        public static implicit operator InternalDeleteModelResponseObject(string value) => new InternalDeleteModelResponseObject(value);
+
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override bool Equals(object obj) => obj is InternalDeleteModelResponseObject other && Equals(other);
+        public bool Equals(InternalDeleteModelResponseObject other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase);
+
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0;
+        public override string ToString() => _value;
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalDeleteThreadResponse.Serialization.cs b/.dotnet/src/Generated/Models/InternalDeleteThreadResponse.Serialization.cs
new file mode 100644
index 000000000..3c0fd36ad
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalDeleteThreadResponse.Serialization.cs
@@ -0,0 +1,155 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace OpenAI.Assistants
+{
+    internal partial class InternalDeleteThreadResponse : IJsonModel<InternalDeleteThreadResponse>
+    {
+        void IJsonModel<InternalDeleteThreadResponse>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ?
((IPersistableModel<InternalDeleteThreadResponse>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(InternalDeleteThreadResponse)} does not support writing '{format}' format.");
+            }
+
+            writer.WriteStartObject();
+            if (SerializedAdditionalRawData?.ContainsKey("id") != true)
+            {
+                writer.WritePropertyName("id"u8);
+                writer.WriteStringValue(Id);
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("deleted") != true)
+            {
+                writer.WritePropertyName("deleted"u8);
+                writer.WriteBooleanValue(Deleted);
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("object") != true)
+            {
+                writer.WritePropertyName("object"u8);
+                writer.WriteStringValue(Object.ToString());
+            }
+            if (SerializedAdditionalRawData != null)
+            {
+                foreach (var item in SerializedAdditionalRawData)
+                {
+                    if (ModelSerializationExtensions.IsSentinelValue(item.Value))
+                    {
+                        continue;
+                    }
+                    writer.WritePropertyName(item.Key);
+#if NET6_0_OR_GREATER
+                    writer.WriteRawValue(item.Value);
+#else
+                    using (JsonDocument document = JsonDocument.Parse(item.Value))
+                    {
+                        JsonSerializer.Serialize(writer, document.RootElement);
+                    }
+#endif
+                }
+            }
+            writer.WriteEndObject();
+        }
+
+        InternalDeleteThreadResponse IJsonModel<InternalDeleteThreadResponse>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ?
((IPersistableModel<InternalDeleteThreadResponse>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(InternalDeleteThreadResponse)} does not support reading '{format}' format.");
+            }
+
+            using JsonDocument document = JsonDocument.ParseValue(ref reader);
+            return DeserializeInternalDeleteThreadResponse(document.RootElement, options);
+        }
+
+        internal static InternalDeleteThreadResponse DeserializeInternalDeleteThreadResponse(JsonElement element, ModelReaderWriterOptions options = null)
+        {
+            options ??= ModelSerializationExtensions.WireOptions;
+
+            if (element.ValueKind == JsonValueKind.Null)
+            {
+                return null;
+            }
+            string id = default;
+            bool deleted = default;
+            InternalDeleteThreadResponseObject @object = default;
+            IDictionary<string, BinaryData> serializedAdditionalRawData = default;
+            Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>();
+            foreach (var property in element.EnumerateObject())
+            {
+                if (property.NameEquals("id"u8))
+                {
+                    id = property.Value.GetString();
+                    continue;
+                }
+                if (property.NameEquals("deleted"u8))
+                {
+                    deleted = property.Value.GetBoolean();
+                    continue;
+                }
+                if (property.NameEquals("object"u8))
+                {
+                    @object = new InternalDeleteThreadResponseObject(property.Value.GetString());
+                    continue;
+                }
+                if (true)
+                {
+                    rawDataDictionary ??= new Dictionary<string, BinaryData>();
+                    rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText()));
+                }
+            }
+            serializedAdditionalRawData = rawDataDictionary;
+            return new InternalDeleteThreadResponse(id, deleted, @object, serializedAdditionalRawData);
+        }
+
+        BinaryData IPersistableModel<InternalDeleteThreadResponse>.Write(ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ?
((IPersistableModel<InternalDeleteThreadResponse>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    return ModelReaderWriter.Write(this, options);
+                default:
+                    throw new FormatException($"The model {nameof(InternalDeleteThreadResponse)} does not support writing '{options.Format}' format.");
+            }
+        }
+
+        InternalDeleteThreadResponse IPersistableModel<InternalDeleteThreadResponse>.Create(BinaryData data, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalDeleteThreadResponse>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    {
+                        using JsonDocument document = JsonDocument.Parse(data);
+                        return DeserializeInternalDeleteThreadResponse(document.RootElement, options);
+                    }
+                default:
+                    throw new FormatException($"The model {nameof(InternalDeleteThreadResponse)} does not support reading '{options.Format}' format.");
+            }
+        }
+
+        string IPersistableModel<InternalDeleteThreadResponse>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J";
+
+        internal static InternalDeleteThreadResponse FromResponse(PipelineResponse response)
+        {
+            using var document = JsonDocument.Parse(response.Content);
+            return DeserializeInternalDeleteThreadResponse(document.RootElement);
+        }
+
+        internal virtual BinaryContent ToBinaryContent()
+        {
+            return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions);
+        }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalDeleteThreadResponse.cs b/.dotnet/src/Generated/Models/InternalDeleteThreadResponse.cs
new file mode 100644
index 000000000..983dc8adb
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalDeleteThreadResponse.cs
@@ -0,0 +1,37 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.Collections.Generic;
+
+namespace OpenAI.Assistants
+{
+    internal partial class InternalDeleteThreadResponse
+    {
+        internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; }
+        internal InternalDeleteThreadResponse(string id, bool deleted)
+        {
+            Argument.AssertNotNull(id, nameof(id));
+
+            Id = id;
+            Deleted =
deleted;
+        }
+
+        internal InternalDeleteThreadResponse(string id, bool deleted, InternalDeleteThreadResponseObject @object, IDictionary<string, BinaryData> serializedAdditionalRawData)
+        {
+            Id = id;
+            Deleted = deleted;
+            Object = @object;
+            SerializedAdditionalRawData = serializedAdditionalRawData;
+        }
+
+        internal InternalDeleteThreadResponse()
+        {
+        }
+
+        public string Id { get; }
+        public bool Deleted { get; }
+        public InternalDeleteThreadResponseObject Object { get; } = InternalDeleteThreadResponseObject.ThreadDeleted;
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalDeleteThreadResponseObject.cs b/.dotnet/src/Generated/Models/InternalDeleteThreadResponseObject.cs
new file mode 100644
index 000000000..bc843faac
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalDeleteThreadResponseObject.cs
@@ -0,0 +1,34 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ComponentModel;
+
+namespace OpenAI.Assistants
+{
+    internal readonly partial struct InternalDeleteThreadResponseObject : IEquatable<InternalDeleteThreadResponseObject>
+    {
+        private readonly string _value;
+
+        public InternalDeleteThreadResponseObject(string value)
+        {
+            _value = value ??
throw new ArgumentNullException(nameof(value));
+        }
+
+        private const string ThreadDeletedValue = "thread.deleted";
+
+        public static InternalDeleteThreadResponseObject ThreadDeleted { get; } = new InternalDeleteThreadResponseObject(ThreadDeletedValue);
+        public static bool operator ==(InternalDeleteThreadResponseObject left, InternalDeleteThreadResponseObject right) => left.Equals(right);
+        public static bool operator !=(InternalDeleteThreadResponseObject left, InternalDeleteThreadResponseObject right) => !left.Equals(right);
+        public static implicit operator InternalDeleteThreadResponseObject(string value) => new InternalDeleteThreadResponseObject(value);
+
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override bool Equals(object obj) => obj is InternalDeleteThreadResponseObject other && Equals(other);
+        public bool Equals(InternalDeleteThreadResponseObject other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase);
+
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0;
+        public override string ToString() => _value;
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalDeleteVectorStoreFileResponse.Serialization.cs b/.dotnet/src/Generated/Models/InternalDeleteVectorStoreFileResponse.Serialization.cs
new file mode 100644
index 000000000..2363aeb24
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalDeleteVectorStoreFileResponse.Serialization.cs
@@ -0,0 +1,155 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace OpenAI.VectorStores
+{
+    internal partial class InternalDeleteVectorStoreFileResponse : IJsonModel<InternalDeleteVectorStoreFileResponse>
+    {
+        void IJsonModel<InternalDeleteVectorStoreFileResponse>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ?
((IPersistableModel<InternalDeleteVectorStoreFileResponse>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(InternalDeleteVectorStoreFileResponse)} does not support writing '{format}' format.");
+            }
+
+            writer.WriteStartObject();
+            if (SerializedAdditionalRawData?.ContainsKey("id") != true)
+            {
+                writer.WritePropertyName("id"u8);
+                writer.WriteStringValue(Id);
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("deleted") != true)
+            {
+                writer.WritePropertyName("deleted"u8);
+                writer.WriteBooleanValue(Deleted);
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("object") != true)
+            {
+                writer.WritePropertyName("object"u8);
+                writer.WriteStringValue(Object.ToString());
+            }
+            if (SerializedAdditionalRawData != null)
+            {
+                foreach (var item in SerializedAdditionalRawData)
+                {
+                    if (ModelSerializationExtensions.IsSentinelValue(item.Value))
+                    {
+                        continue;
+                    }
+                    writer.WritePropertyName(item.Key);
+#if NET6_0_OR_GREATER
+                    writer.WriteRawValue(item.Value);
+#else
+                    using (JsonDocument document = JsonDocument.Parse(item.Value))
+                    {
+                        JsonSerializer.Serialize(writer, document.RootElement);
+                    }
+#endif
+                }
+            }
+            writer.WriteEndObject();
+        }
+
+        InternalDeleteVectorStoreFileResponse IJsonModel<InternalDeleteVectorStoreFileResponse>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ?
((IPersistableModel<InternalDeleteVectorStoreFileResponse>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(InternalDeleteVectorStoreFileResponse)} does not support reading '{format}' format.");
+            }
+
+            using JsonDocument document = JsonDocument.ParseValue(ref reader);
+            return DeserializeInternalDeleteVectorStoreFileResponse(document.RootElement, options);
+        }
+
+        internal static InternalDeleteVectorStoreFileResponse DeserializeInternalDeleteVectorStoreFileResponse(JsonElement element, ModelReaderWriterOptions options = null)
+        {
+            options ??= ModelSerializationExtensions.WireOptions;
+
+            if (element.ValueKind == JsonValueKind.Null)
+            {
+                return null;
+            }
+            string id = default;
+            bool deleted = default;
+            InternalDeleteVectorStoreFileResponseObject @object = default;
+            IDictionary<string, BinaryData> serializedAdditionalRawData = default;
+            Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>();
+            foreach (var property in element.EnumerateObject())
+            {
+                if (property.NameEquals("id"u8))
+                {
+                    id = property.Value.GetString();
+                    continue;
+                }
+                if (property.NameEquals("deleted"u8))
+                {
+                    deleted = property.Value.GetBoolean();
+                    continue;
+                }
+                if (property.NameEquals("object"u8))
+                {
+                    @object = new InternalDeleteVectorStoreFileResponseObject(property.Value.GetString());
+                    continue;
+                }
+                if (true)
+                {
+                    rawDataDictionary ??= new Dictionary<string, BinaryData>();
+                    rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText()));
+                }
+            }
+            serializedAdditionalRawData = rawDataDictionary;
+            return new InternalDeleteVectorStoreFileResponse(id, deleted, @object, serializedAdditionalRawData);
+        }
+
+        BinaryData IPersistableModel<InternalDeleteVectorStoreFileResponse>.Write(ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ?
((IPersistableModel<InternalDeleteVectorStoreFileResponse>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    return ModelReaderWriter.Write(this, options);
+                default:
+                    throw new FormatException($"The model {nameof(InternalDeleteVectorStoreFileResponse)} does not support writing '{options.Format}' format.");
+            }
+        }
+
+        InternalDeleteVectorStoreFileResponse IPersistableModel<InternalDeleteVectorStoreFileResponse>.Create(BinaryData data, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalDeleteVectorStoreFileResponse>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    {
+                        using JsonDocument document = JsonDocument.Parse(data);
+                        return DeserializeInternalDeleteVectorStoreFileResponse(document.RootElement, options);
+                    }
+                default:
+                    throw new FormatException($"The model {nameof(InternalDeleteVectorStoreFileResponse)} does not support reading '{options.Format}' format.");
+            }
+        }
+
+        string IPersistableModel<InternalDeleteVectorStoreFileResponse>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J";
+
+        internal static InternalDeleteVectorStoreFileResponse FromResponse(PipelineResponse response)
+        {
+            using var document = JsonDocument.Parse(response.Content);
+            return DeserializeInternalDeleteVectorStoreFileResponse(document.RootElement);
+        }
+
+        internal virtual BinaryContent ToBinaryContent()
+        {
+            return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions);
+        }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalDeleteVectorStoreFileResponse.cs b/.dotnet/src/Generated/Models/InternalDeleteVectorStoreFileResponse.cs
new file mode 100644
index 000000000..bf3150d47
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalDeleteVectorStoreFileResponse.cs
@@ -0,0 +1,37 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.Collections.Generic;
+
+namespace OpenAI.VectorStores
+{
+    internal partial class InternalDeleteVectorStoreFileResponse
+    {
+        internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; }
+        internal
InternalDeleteVectorStoreFileResponse(string id, bool deleted)
+        {
+            Argument.AssertNotNull(id, nameof(id));
+
+            Id = id;
+            Deleted = deleted;
+        }
+
+        internal InternalDeleteVectorStoreFileResponse(string id, bool deleted, InternalDeleteVectorStoreFileResponseObject @object, IDictionary<string, BinaryData> serializedAdditionalRawData)
+        {
+            Id = id;
+            Deleted = deleted;
+            Object = @object;
+            SerializedAdditionalRawData = serializedAdditionalRawData;
+        }
+
+        internal InternalDeleteVectorStoreFileResponse()
+        {
+        }
+
+        public string Id { get; }
+        public bool Deleted { get; }
+        public InternalDeleteVectorStoreFileResponseObject Object { get; } = InternalDeleteVectorStoreFileResponseObject.VectorStoreFileDeleted;
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalDeleteVectorStoreFileResponseObject.cs b/.dotnet/src/Generated/Models/InternalDeleteVectorStoreFileResponseObject.cs
new file mode 100644
index 000000000..6716460af
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalDeleteVectorStoreFileResponseObject.cs
@@ -0,0 +1,34 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ComponentModel;
+
+namespace OpenAI.VectorStores
+{
+    internal readonly partial struct InternalDeleteVectorStoreFileResponseObject : IEquatable<InternalDeleteVectorStoreFileResponseObject>
+    {
+        private readonly string _value;
+
+        public InternalDeleteVectorStoreFileResponseObject(string value)
+        {
+            _value = value ??
throw new ArgumentNullException(nameof(value));
+        }
+
+        private const string VectorStoreFileDeletedValue = "vector_store.file.deleted";
+
+        public static InternalDeleteVectorStoreFileResponseObject VectorStoreFileDeleted { get; } = new InternalDeleteVectorStoreFileResponseObject(VectorStoreFileDeletedValue);
+        public static bool operator ==(InternalDeleteVectorStoreFileResponseObject left, InternalDeleteVectorStoreFileResponseObject right) => left.Equals(right);
+        public static bool operator !=(InternalDeleteVectorStoreFileResponseObject left, InternalDeleteVectorStoreFileResponseObject right) => !left.Equals(right);
+        public static implicit operator InternalDeleteVectorStoreFileResponseObject(string value) => new InternalDeleteVectorStoreFileResponseObject(value);
+
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override bool Equals(object obj) => obj is InternalDeleteVectorStoreFileResponseObject other && Equals(other);
+        public bool Equals(InternalDeleteVectorStoreFileResponseObject other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase);
+
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override int GetHashCode() => _value != null ?
StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0;
+        public override string ToString() => _value;
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalDeleteVectorStoreResponse.Serialization.cs b/.dotnet/src/Generated/Models/InternalDeleteVectorStoreResponse.Serialization.cs
new file mode 100644
index 000000000..10f0a4b80
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalDeleteVectorStoreResponse.Serialization.cs
@@ -0,0 +1,155 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace OpenAI.VectorStores
+{
+    internal partial class InternalDeleteVectorStoreResponse : IJsonModel<InternalDeleteVectorStoreResponse>
+    {
+        void IJsonModel<InternalDeleteVectorStoreResponse>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalDeleteVectorStoreResponse>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(InternalDeleteVectorStoreResponse)} does not support writing '{format}' format.");
+            }
+
+            writer.WriteStartObject();
+            if (SerializedAdditionalRawData?.ContainsKey("id") != true)
+            {
+                writer.WritePropertyName("id"u8);
+                writer.WriteStringValue(Id);
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("deleted") != true)
+            {
+                writer.WritePropertyName("deleted"u8);
+                writer.WriteBooleanValue(Deleted);
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("object") != true)
+            {
+                writer.WritePropertyName("object"u8);
+                writer.WriteStringValue(Object.ToString());
+            }
+            if (SerializedAdditionalRawData != null)
+            {
+                foreach (var item in SerializedAdditionalRawData)
+                {
+                    if (ModelSerializationExtensions.IsSentinelValue(item.Value))
+                    {
+                        continue;
+                    }
+                    writer.WritePropertyName(item.Key);
+#if NET6_0_OR_GREATER
+                    writer.WriteRawValue(item.Value);
+#else
+                    using (JsonDocument document = JsonDocument.Parse(item.Value))
+                    {
+                        JsonSerializer.Serialize(writer, document.RootElement);
+
} +#endif + } + } + writer.WriteEndObject(); + } + + InternalDeleteVectorStoreResponse IJsonModel<InternalDeleteVectorStoreResponse>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalDeleteVectorStoreResponse>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalDeleteVectorStoreResponse)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalDeleteVectorStoreResponse(document.RootElement, options); + } + + internal static InternalDeleteVectorStoreResponse DeserializeInternalDeleteVectorStoreResponse(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string id = default; + bool deleted = default; + InternalDeleteVectorStoreResponseObject @object = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("id"u8)) + { + id = property.Value.GetString(); + continue; + } + if (property.NameEquals("deleted"u8)) + { + deleted = property.Value.GetBoolean(); + continue; + } + if (property.NameEquals("object"u8)) + { + @object = new InternalDeleteVectorStoreResponseObject(property.Value.GetString()); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalDeleteVectorStoreResponse(id, deleted, @object, serializedAdditionalRawData); + } + + BinaryData IPersistableModel<InternalDeleteVectorStoreResponse>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalDeleteVectorStoreResponse>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalDeleteVectorStoreResponse)} does not support writing '{options.Format}' format."); + } + } + + InternalDeleteVectorStoreResponse IPersistableModel<InternalDeleteVectorStoreResponse>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalDeleteVectorStoreResponse>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalDeleteVectorStoreResponse(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalDeleteVectorStoreResponse)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalDeleteVectorStoreResponse>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalDeleteVectorStoreResponse FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalDeleteVectorStoreResponse(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalDeleteVectorStoreResponse.cs b/.dotnet/src/Generated/Models/InternalDeleteVectorStoreResponse.cs new file mode 100644 index 000000000..d90d30171 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalDeleteVectorStoreResponse.cs @@ -0,0 +1,37 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.VectorStores +{ + internal partial class InternalDeleteVectorStoreResponse + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + internal InternalDeleteVectorStoreResponse(string id, bool deleted) + { + 
Argument.AssertNotNull(id, nameof(id)); + + Id = id; + Deleted = deleted; + } + + internal InternalDeleteVectorStoreResponse(string id, bool deleted, InternalDeleteVectorStoreResponseObject @object, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + Id = id; + Deleted = deleted; + Object = @object; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal InternalDeleteVectorStoreResponse() + { + } + + public string Id { get; } + public bool Deleted { get; } + public InternalDeleteVectorStoreResponseObject Object { get; } = InternalDeleteVectorStoreResponseObject.VectorStoreDeleted; + } +} diff --git a/.dotnet/src/Generated/Models/InternalDeleteVectorStoreResponseObject.cs b/.dotnet/src/Generated/Models/InternalDeleteVectorStoreResponseObject.cs new file mode 100644 index 000000000..c13ec4b2f --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalDeleteVectorStoreResponseObject.cs @@ -0,0 +1,34 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.VectorStores +{ + internal readonly partial struct InternalDeleteVectorStoreResponseObject : IEquatable<InternalDeleteVectorStoreResponseObject> + { + private readonly string _value; + + public InternalDeleteVectorStoreResponseObject(string value) + { + _value = value ?? 
throw new ArgumentNullException(nameof(value)); + } + + private const string VectorStoreDeletedValue = "vector_store.deleted"; + + public static InternalDeleteVectorStoreResponseObject VectorStoreDeleted { get; } = new InternalDeleteVectorStoreResponseObject(VectorStoreDeletedValue); + public static bool operator ==(InternalDeleteVectorStoreResponseObject left, InternalDeleteVectorStoreResponseObject right) => left.Equals(right); + public static bool operator !=(InternalDeleteVectorStoreResponseObject left, InternalDeleteVectorStoreResponseObject right) => !left.Equals(right); + public static implicit operator InternalDeleteVectorStoreResponseObject(string value) => new InternalDeleteVectorStoreResponseObject(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalDeleteVectorStoreResponseObject other && Equals(other); + public bool Equals(InternalDeleteVectorStoreResponseObject other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalEmbeddingObject.cs b/.dotnet/src/Generated/Models/InternalEmbeddingObject.cs new file mode 100644 index 000000000..5a0c0e4d3 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalEmbeddingObject.cs @@ -0,0 +1,34 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Embeddings +{ + internal readonly partial struct InternalEmbeddingObject : IEquatable<InternalEmbeddingObject> + { + private readonly string _value; + + public InternalEmbeddingObject(string value) + { + _value = value ?? 
throw new ArgumentNullException(nameof(value)); + } + + private const string EmbeddingValue = "embedding"; + + public static InternalEmbeddingObject Embedding { get; } = new InternalEmbeddingObject(EmbeddingValue); + public static bool operator ==(InternalEmbeddingObject left, InternalEmbeddingObject right) => left.Equals(right); + public static bool operator !=(InternalEmbeddingObject left, InternalEmbeddingObject right) => !left.Equals(right); + public static implicit operator InternalEmbeddingObject(string value) => new InternalEmbeddingObject(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalEmbeddingObject other && Equals(other); + public bool Equals(InternalEmbeddingObject other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalFileChunkingStrategyRequestParam.Serialization.cs b/.dotnet/src/Generated/Models/InternalFileChunkingStrategyRequestParam.Serialization.cs new file mode 100644 index 000000000..4a61d2d17 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalFileChunkingStrategyRequestParam.Serialization.cs @@ -0,0 +1,124 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Text.Json; + +namespace OpenAI.VectorStores +{ + [PersistableModelProxy(typeof(InternalUnknownFileChunkingStrategyRequestParamProxy))] + internal partial class InternalFileChunkingStrategyRequestParam : IJsonModel<InternalFileChunkingStrategyRequestParam> + { + void IJsonModel<InternalFileChunkingStrategyRequestParam>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalFileChunkingStrategyRequestParam>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalFileChunkingStrategyRequestParam)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalFileChunkingStrategyRequestParam IJsonModel<InternalFileChunkingStrategyRequestParam>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalFileChunkingStrategyRequestParam>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalFileChunkingStrategyRequestParam)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalFileChunkingStrategyRequestParam(document.RootElement, options); + } + + internal static InternalFileChunkingStrategyRequestParam DeserializeInternalFileChunkingStrategyRequestParam(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + if (element.TryGetProperty("type", out JsonElement discriminator)) + { + switch (discriminator.GetString()) + { + case "auto": return InternalAutoChunkingStrategyRequestParam.DeserializeInternalAutoChunkingStrategyRequestParam(element, options); + case "static": return InternalStaticChunkingStrategyRequestParam.DeserializeInternalStaticChunkingStrategyRequestParam(element, options); + } + } + return InternalUnknownFileChunkingStrategyRequestParamProxy.DeserializeInternalUnknownFileChunkingStrategyRequestParamProxy(element, options); + } + + BinaryData IPersistableModel<InternalFileChunkingStrategyRequestParam>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalFileChunkingStrategyRequestParam>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalFileChunkingStrategyRequestParam)} does not support writing '{options.Format}' format."); + } + } + + InternalFileChunkingStrategyRequestParam IPersistableModel<InternalFileChunkingStrategyRequestParam>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalFileChunkingStrategyRequestParam>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalFileChunkingStrategyRequestParam(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalFileChunkingStrategyRequestParam)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalFileChunkingStrategyRequestParam>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalFileChunkingStrategyRequestParam FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalFileChunkingStrategyRequestParam(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalFileChunkingStrategyRequestParam.cs b/.dotnet/src/Generated/Models/InternalFileChunkingStrategyRequestParam.cs new file mode 100644 index 000000000..275a076ed --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalFileChunkingStrategyRequestParam.cs @@ -0,0 +1,25 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.VectorStores +{ + internal abstract partial class InternalFileChunkingStrategyRequestParam + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + protected InternalFileChunkingStrategyRequestParam() + { + } + + internal InternalFileChunkingStrategyRequestParam(string type, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + Type = type; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal string Type { get; set; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalFileUploadOptions.Serialization.cs b/.dotnet/src/Generated/Models/InternalFileUploadOptions.Serialization.cs new file mode 100644 index 
000000000..8a86ad75b --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalFileUploadOptions.Serialization.cs @@ -0,0 +1,177 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.IO; +using System.Text.Json; + +namespace OpenAI.Files +{ + internal partial class InternalFileUploadOptions : IJsonModel<InternalFileUploadOptions> + { + void IJsonModel<InternalFileUploadOptions>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalFileUploadOptions>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalFileUploadOptions)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("file") != true) + { + writer.WritePropertyName("file"u8); +#if NET6_0_OR_GREATER + writer.WriteRawValue(global::System.BinaryData.FromStream(File)); +#else + using (JsonDocument document = JsonDocument.Parse(BinaryData.FromStream(File))) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + if (SerializedAdditionalRawData?.ContainsKey("purpose") != true) + { + writer.WritePropertyName("purpose"u8); + writer.WriteStringValue(Purpose.ToString()); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalFileUploadOptions IJsonModel<InternalFileUploadOptions>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalFileUploadOptions>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalFileUploadOptions)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalFileUploadOptions(document.RootElement, options); + } + + internal static InternalFileUploadOptions DeserializeInternalFileUploadOptions(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + Stream file = default; + FileUploadPurpose purpose = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("file"u8)) + { + file = BinaryData.FromString(property.Value.GetRawText()).ToStream(); + continue; + } + if (property.NameEquals("purpose"u8)) + { + purpose = new FileUploadPurpose(property.Value.GetString()); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalFileUploadOptions(file, purpose, serializedAdditionalRawData); + } + + private BinaryData SerializeMultipart(ModelReaderWriterOptions options) + { + using MultipartFormDataBinaryContent content = ToMultipartBinaryBody(); + using MemoryStream stream = new MemoryStream(); + content.WriteTo(stream); + if (stream.Position > int.MaxValue) + { + return BinaryData.FromStream(stream); + } + else + { + return new BinaryData(stream.GetBuffer().AsMemory(0, (int)stream.Position)); + } + } + + internal virtual MultipartFormDataBinaryContent ToMultipartBinaryBody() + { + MultipartFormDataBinaryContent content = new 
MultipartFormDataBinaryContent(); + content.Add(File, "file", "file", "application/octet-stream"); + content.Add(Purpose.ToString(), "purpose"); + return content; + } + + BinaryData IPersistableModel<InternalFileUploadOptions>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalFileUploadOptions>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + case "MFD": + return SerializeMultipart(options); + default: + throw new FormatException($"The model {nameof(InternalFileUploadOptions)} does not support writing '{options.Format}' format."); + } + } + + InternalFileUploadOptions IPersistableModel<InternalFileUploadOptions>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalFileUploadOptions>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalFileUploadOptions(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalFileUploadOptions)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalFileUploadOptions>.GetFormatFromOptions(ModelReaderWriterOptions options) => "MFD"; + + internal static InternalFileUploadOptions FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalFileUploadOptions(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalFileUploadOptions.cs b/.dotnet/src/Generated/Models/InternalFileUploadOptions.cs new file mode 100644 index 000000000..7873ae62e --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalFileUploadOptions.cs @@ -0,0 +1,22 @@ +// <auto-generated/> + +#nullable disable + +using System; +using 
System.Collections.Generic; +using System.IO; + +namespace OpenAI.Files +{ + internal partial class InternalFileUploadOptions + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + + internal InternalFileUploadOptions(Stream file, FileUploadPurpose purpose, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + File = file; + Purpose = purpose; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalFineTuneChatCompletionRequestAssistantMessage.Serialization.cs b/.dotnet/src/Generated/Models/InternalFineTuneChatCompletionRequestAssistantMessage.Serialization.cs new file mode 100644 index 000000000..9c1712f3c --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalFineTuneChatCompletionRequestAssistantMessage.Serialization.cs @@ -0,0 +1,234 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; +using OpenAI.Chat; + +namespace OpenAI.FineTuning +{ + internal partial class InternalFineTuneChatCompletionRequestAssistantMessage : IJsonModel<InternalFineTuneChatCompletionRequestAssistantMessage> + { + void IJsonModel<InternalFineTuneChatCompletionRequestAssistantMessage>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalFineTuneChatCompletionRequestAssistantMessage>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalFineTuneChatCompletionRequestAssistantMessage)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("refusal") != true && Optional.IsDefined(Refusal)) + { + if (Refusal != null) + { + writer.WritePropertyName("refusal"u8); + writer.WriteStringValue(Refusal); + } + else + { + writer.WriteNull("refusal"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("name") != true && Optional.IsDefined(ParticipantName)) + { + writer.WritePropertyName("name"u8); + writer.WriteStringValue(ParticipantName); + } + if (SerializedAdditionalRawData?.ContainsKey("tool_calls") != true && Optional.IsCollectionDefined(ToolCalls)) + { + writer.WritePropertyName("tool_calls"u8); + writer.WriteStartArray(); + foreach (var item in ToolCalls) + { + writer.WriteObjectValue(item, options); + } + writer.WriteEndArray(); + } + if (SerializedAdditionalRawData?.ContainsKey("function_call") != true && Optional.IsDefined(FunctionCall)) + { + if (FunctionCall != null) + { + writer.WritePropertyName("function_call"u8); + writer.WriteObjectValue(FunctionCall, options); + } + else + { + writer.WriteNull("function_call"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("role") != true) + { + writer.WritePropertyName("role"u8); + writer.WriteStringValue(Role.ToSerialString()); + } + if (SerializedAdditionalRawData?.ContainsKey("content") != true && Optional.IsCollectionDefined(Content)) + { + writer.WritePropertyName("content"u8); + SerializeContentValue(writer, options); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + 
using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalFineTuneChatCompletionRequestAssistantMessage IJsonModel<InternalFineTuneChatCompletionRequestAssistantMessage>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalFineTuneChatCompletionRequestAssistantMessage>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalFineTuneChatCompletionRequestAssistantMessage)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalFineTuneChatCompletionRequestAssistantMessage(document.RootElement, options); + } + + internal static InternalFineTuneChatCompletionRequestAssistantMessage DeserializeInternalFineTuneChatCompletionRequestAssistantMessage(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string refusal = default; + string name = default; + IList<ChatToolCall> toolCalls = default; + ChatFunctionCall functionCall = default; + ChatMessageRole role = default; + IList<ChatMessageContentPart> content = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("refusal"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + refusal = null; + continue; + } + refusal = property.Value.GetString(); + continue; + } + if (property.NameEquals("name"u8)) + { + name = property.Value.GetString(); + continue; + } + if (property.NameEquals("tool_calls"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + List<ChatToolCall> array = new List<ChatToolCall>(); + foreach (var item in property.Value.EnumerateArray()) + { + 
array.Add(ChatToolCall.DeserializeChatToolCall(item, options)); + } + toolCalls = array; + continue; + } + if (property.NameEquals("function_call"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + functionCall = null; + continue; + } + functionCall = ChatFunctionCall.DeserializeChatFunctionCall(property.Value, options); + continue; + } + if (property.NameEquals("role"u8)) + { + role = property.Value.GetString().ToChatMessageRole(); + continue; + } + if (property.NameEquals("content"u8)) + { + DeserializeContentValue(property, ref content); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalFineTuneChatCompletionRequestAssistantMessage( + role, + content ?? new ChangeTrackingList<ChatMessageContentPart>(), + serializedAdditionalRawData, + refusal, + name, + toolCalls ?? new ChangeTrackingList<ChatToolCall>(), + functionCall); + } + + BinaryData IPersistableModel<InternalFineTuneChatCompletionRequestAssistantMessage>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalFineTuneChatCompletionRequestAssistantMessage>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalFineTuneChatCompletionRequestAssistantMessage)} does not support writing '{options.Format}' format."); + } + } + + InternalFineTuneChatCompletionRequestAssistantMessage IPersistableModel<InternalFineTuneChatCompletionRequestAssistantMessage>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalFineTuneChatCompletionRequestAssistantMessage>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalFineTuneChatCompletionRequestAssistantMessage(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalFineTuneChatCompletionRequestAssistantMessage)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalFineTuneChatCompletionRequestAssistantMessage>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static new InternalFineTuneChatCompletionRequestAssistantMessage FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalFineTuneChatCompletionRequestAssistantMessage(document.RootElement); + } + + internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalFineTuneChatCompletionRequestAssistantMessage.cs b/.dotnet/src/Generated/Models/InternalFineTuneChatCompletionRequestAssistantMessage.cs new file mode 100644 index 000000000..e7f33bc58 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalFineTuneChatCompletionRequestAssistantMessage.cs @@ -0,0 +1,21 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.Collections.Generic; +using OpenAI.Chat; + +namespace OpenAI.FineTuning +{ + internal partial class InternalFineTuneChatCompletionRequestAssistantMessage : AssistantChatMessage + { + public InternalFineTuneChatCompletionRequestAssistantMessage() + { + } + + internal InternalFineTuneChatCompletionRequestAssistantMessage(ChatMessageRole role, IList<ChatMessageContentPart> content, IDictionary<string, BinaryData> serializedAdditionalRawData, string refusal, string participantName, IList<ChatToolCall> toolCalls, ChatFunctionCall functionCall) : base(role, content, serializedAdditionalRawData, refusal, participantName, toolCalls, functionCall) + { + } + } 
+} diff --git a/.dotnet/src/Generated/Models/InternalFineTuningIntegration.Serialization.cs b/.dotnet/src/Generated/Models/InternalFineTuningIntegration.Serialization.cs new file mode 100644 index 000000000..74ed6bdde --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalFineTuningIntegration.Serialization.cs @@ -0,0 +1,144 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.FineTuning +{ + internal partial class InternalFineTuningIntegration : IJsonModel<InternalFineTuningIntegration> + { + void IJsonModel<InternalFineTuningIntegration>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalFineTuningIntegration>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalFineTuningIntegration)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type.ToString()); + } + if (SerializedAdditionalRawData?.ContainsKey("wandb") != true) + { + writer.WritePropertyName("wandb"u8); + writer.WriteObjectValue(Wandb, options); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalFineTuningIntegration IJsonModel<InternalFineTuningIntegration>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalFineTuningIntegration>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalFineTuningIntegration)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalFineTuningIntegration(document.RootElement, options); + } + + internal static InternalFineTuningIntegration DeserializeInternalFineTuningIntegration(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + InternalFineTuningIntegrationType type = default; + InternalFineTuningIntegrationWandb wandb = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("type"u8)) + { + type = new InternalFineTuningIntegrationType(property.Value.GetString()); + continue; + } + if (property.NameEquals("wandb"u8)) + { + wandb = InternalFineTuningIntegrationWandb.DeserializeInternalFineTuningIntegrationWandb(property.Value, options); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalFineTuningIntegration(type, wandb, serializedAdditionalRawData); + } + + BinaryData IPersistableModel<InternalFineTuningIntegration>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalFineTuningIntegration>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalFineTuningIntegration)} does not support writing '{options.Format}' format."); + } + } + + InternalFineTuningIntegration IPersistableModel<InternalFineTuningIntegration>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalFineTuningIntegration>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalFineTuningIntegration(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalFineTuningIntegration)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalFineTuningIntegration>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalFineTuningIntegration FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalFineTuningIntegration(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalFineTuningIntegration.cs b/.dotnet/src/Generated/Models/InternalFineTuningIntegration.cs new file mode 100644 index 000000000..36288d789 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalFineTuningIntegration.cs @@ -0,0 +1,35 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.FineTuning +{ + internal partial class InternalFineTuningIntegration + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + internal InternalFineTuningIntegration(InternalFineTuningIntegrationWandb wandb) + { + Argument.AssertNotNull(wandb, 
nameof(wandb)); + + Wandb = wandb; + } + + internal InternalFineTuningIntegration(InternalFineTuningIntegrationType type, InternalFineTuningIntegrationWandb wandb, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + Type = type; + Wandb = wandb; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal InternalFineTuningIntegration() + { + } + + public InternalFineTuningIntegrationType Type { get; } = InternalFineTuningIntegrationType.Wandb; + + public InternalFineTuningIntegrationWandb Wandb { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalFineTuningIntegrationType.cs b/.dotnet/src/Generated/Models/InternalFineTuningIntegrationType.cs new file mode 100644 index 000000000..fdb20df5f --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalFineTuningIntegrationType.cs @@ -0,0 +1,34 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.FineTuning +{ + internal readonly partial struct InternalFineTuningIntegrationType : IEquatable<InternalFineTuningIntegrationType> + { + private readonly string _value; + + public InternalFineTuningIntegrationType(string value) + { + _value = value ?? 
throw new ArgumentNullException(nameof(value)); + } + + private const string WandbValue = "wandb"; + + public static InternalFineTuningIntegrationType Wandb { get; } = new InternalFineTuningIntegrationType(WandbValue); + public static bool operator ==(InternalFineTuningIntegrationType left, InternalFineTuningIntegrationType right) => left.Equals(right); + public static bool operator !=(InternalFineTuningIntegrationType left, InternalFineTuningIntegrationType right) => !left.Equals(right); + public static implicit operator InternalFineTuningIntegrationType(string value) => new InternalFineTuningIntegrationType(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalFineTuningIntegrationType other && Equals(other); + public bool Equals(InternalFineTuningIntegrationType other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalFineTuningIntegrationWandb.Serialization.cs b/.dotnet/src/Generated/Models/InternalFineTuningIntegrationWandb.Serialization.cs new file mode 100644 index 000000000..9246babd7 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalFineTuningIntegrationWandb.Serialization.cs @@ -0,0 +1,204 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.FineTuning +{ + internal partial class InternalFineTuningIntegrationWandb : IJsonModel<InternalFineTuningIntegrationWandb> + { + void IJsonModel<InternalFineTuningIntegrationWandb>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalFineTuningIntegrationWandb)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("project") != true) + { + writer.WritePropertyName("project"u8); + writer.WriteStringValue(Project); + } + if (SerializedAdditionalRawData?.ContainsKey("name") != true && Optional.IsDefined(Name)) + { + if (Name != null) + { + writer.WritePropertyName("name"u8); + writer.WriteStringValue(Name); + } + else + { + writer.WriteNull("name"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("entity") != true && Optional.IsDefined(Entity)) + { + if (Entity != null) + { + writer.WritePropertyName("entity"u8); + writer.WriteStringValue(Entity); + } + else + { + writer.WriteNull("entity"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("tags") != true && Optional.IsCollectionDefined(Tags)) + { + writer.WritePropertyName("tags"u8); + writer.WriteStartArray(); + foreach (var item in Tags) + { + writer.WriteStringValue(item); + } + writer.WriteEndArray(); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalFineTuningIntegrationWandb IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalFineTuningIntegrationWandb)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalFineTuningIntegrationWandb(document.RootElement, options); + } + + internal static InternalFineTuningIntegrationWandb DeserializeInternalFineTuningIntegrationWandb(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string project = default; + string name = default; + string entity = default; + IReadOnlyList tags = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("project"u8)) + { + project = property.Value.GetString(); + continue; + } + if (property.NameEquals("name"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + name = null; + continue; + } + name = property.Value.GetString(); + continue; + } + if (property.NameEquals("entity"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + entity = null; + continue; + } + entity = property.Value.GetString(); + continue; + } + if (property.NameEquals("tags"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + List array = new List(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(item.GetString()); + } + tags = array; + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalFineTuningIntegrationWandb(project, name, entity, tags ?? 
new ChangeTrackingList(), serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalFineTuningIntegrationWandb)} does not support writing '{options.Format}' format."); + } + } + + InternalFineTuningIntegrationWandb IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalFineTuningIntegrationWandb(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalFineTuningIntegrationWandb)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalFineTuningIntegrationWandb FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalFineTuningIntegrationWandb(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalFineTuningIntegrationWandb.cs b/.dotnet/src/Generated/Models/InternalFineTuningIntegrationWandb.cs new file mode 100644 index 000000000..c7dba57b6 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalFineTuningIntegrationWandb.cs @@ -0,0 +1,39 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.FineTuning +{ + internal partial class 
InternalFineTuningIntegrationWandb + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + internal InternalFineTuningIntegrationWandb(string project) + { + Argument.AssertNotNull(project, nameof(project)); + + Project = project; + Tags = new ChangeTrackingList<string>(); + } + + internal InternalFineTuningIntegrationWandb(string project, string name, string entity, IReadOnlyList<string> tags, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + Project = project; + Name = name; + Entity = entity; + Tags = tags; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal InternalFineTuningIntegrationWandb() + { + } + + public string Project { get; } + public string Name { get; } + public string Entity { get; } + public IReadOnlyList<string> Tags { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalFineTuningJob.Serialization.cs b/.dotnet/src/Generated/Models/InternalFineTuningJob.Serialization.cs new file mode 100644 index 000000000..5dc596a2d --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalFineTuningJob.Serialization.cs @@ -0,0 +1,430 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.FineTuning +{ + internal partial class InternalFineTuningJob : IJsonModel<InternalFineTuningJob> + { + void IJsonModel<InternalFineTuningJob>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalFineTuningJob)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("id") != true) + { + writer.WritePropertyName("id"u8); + writer.WriteStringValue(Id); + } + if (SerializedAdditionalRawData?.ContainsKey("created_at") != true) + { + writer.WritePropertyName("created_at"u8); + writer.WriteNumberValue(CreatedAt, "U"); + } + if (SerializedAdditionalRawData?.ContainsKey("error") != true) + { + if (Error != null) + { + writer.WritePropertyName("error"u8); + writer.WriteObjectValue(Error, options); + } + else + { + writer.WriteNull("error"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("fine_tuned_model") != true) + { + if (FineTunedModel != null) + { + writer.WritePropertyName("fine_tuned_model"u8); + writer.WriteStringValue(FineTunedModel); + } + else + { + writer.WriteNull("fine_tuned_model"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("finished_at") != true) + { + if (FinishedAt != null) + { + writer.WritePropertyName("finished_at"u8); + writer.WriteNumberValue(FinishedAt.Value, "U"); + } + else + { + writer.WriteNull("finished_at"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("hyperparameters") != true) + { + writer.WritePropertyName("hyperparameters"u8); + writer.WriteObjectValue(Hyperparameters, options); + } + if (SerializedAdditionalRawData?.ContainsKey("model") != true) + { + writer.WritePropertyName("model"u8); + writer.WriteStringValue(Model); + } + if (SerializedAdditionalRawData?.ContainsKey("object") != true) + { + writer.WritePropertyName("object"u8); + writer.WriteStringValue(Object.ToString()); + } + if (SerializedAdditionalRawData?.ContainsKey("organization_id") != true) + { + writer.WritePropertyName("organization_id"u8); + writer.WriteStringValue(OrganizationId); + } + if 
(SerializedAdditionalRawData?.ContainsKey("result_files") != true) + { + writer.WritePropertyName("result_files"u8); + writer.WriteStartArray(); + foreach (var item in ResultFiles) + { + writer.WriteStringValue(item); + } + writer.WriteEndArray(); + } + if (SerializedAdditionalRawData?.ContainsKey("status") != true) + { + writer.WritePropertyName("status"u8); + writer.WriteStringValue(Status.ToString()); + } + if (SerializedAdditionalRawData?.ContainsKey("trained_tokens") != true) + { + if (TrainedTokens != null) + { + writer.WritePropertyName("trained_tokens"u8); + writer.WriteNumberValue(TrainedTokens.Value); + } + else + { + writer.WriteNull("trained_tokens"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("training_file") != true) + { + writer.WritePropertyName("training_file"u8); + writer.WriteStringValue(TrainingFile); + } + if (SerializedAdditionalRawData?.ContainsKey("validation_file") != true) + { + if (ValidationFile != null) + { + writer.WritePropertyName("validation_file"u8); + writer.WriteStringValue(ValidationFile); + } + else + { + writer.WriteNull("validation_file"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("integrations") != true && Optional.IsCollectionDefined(Integrations)) + { + if (Integrations != null) + { + writer.WritePropertyName("integrations"u8); + writer.WriteStartArray(); + foreach (var item in Integrations) + { + writer.WriteObjectValue(item, options); + } + writer.WriteEndArray(); + } + else + { + writer.WriteNull("integrations"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("seed") != true) + { + writer.WritePropertyName("seed"u8); + writer.WriteNumberValue(Seed); + } + if (SerializedAdditionalRawData?.ContainsKey("estimated_finish") != true && Optional.IsDefined(EstimatedFinish)) + { + if (EstimatedFinish != null) + { + writer.WritePropertyName("estimated_finish"u8); + writer.WriteNumberValue(EstimatedFinish.Value, "U"); + } + else + { + writer.WriteNull("estimated_finish"); + } + } + if 
(SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalFineTuningJob IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalFineTuningJob)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalFineTuningJob(document.RootElement, options); + } + + internal static InternalFineTuningJob DeserializeInternalFineTuningJob(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string id = default; + DateTimeOffset createdAt = default; + InternalFineTuningJobError error = default; + string fineTunedModel = default; + DateTimeOffset? finishedAt = default; + InternalFineTuningJobHyperparameters hyperparameters = default; + string model = default; + InternalFineTuningJobObject @object = default; + string organizationId = default; + IReadOnlyList resultFiles = default; + InternalFineTuningJobStatus status = default; + int? trainedTokens = default; + string trainingFile = default; + string validationFile = default; + IReadOnlyList integrations = default; + int seed = default; + DateTimeOffset? 
estimatedFinish = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("id"u8)) + { + id = property.Value.GetString(); + continue; + } + if (property.NameEquals("created_at"u8)) + { + createdAt = DateTimeOffset.FromUnixTimeSeconds(property.Value.GetInt64()); + continue; + } + if (property.NameEquals("error"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + error = null; + continue; + } + error = InternalFineTuningJobError.DeserializeInternalFineTuningJobError(property.Value, options); + continue; + } + if (property.NameEquals("fine_tuned_model"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + fineTunedModel = null; + continue; + } + fineTunedModel = property.Value.GetString(); + continue; + } + if (property.NameEquals("finished_at"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + finishedAt = null; + continue; + } + finishedAt = DateTimeOffset.FromUnixTimeSeconds(property.Value.GetInt64()); + continue; + } + if (property.NameEquals("hyperparameters"u8)) + { + hyperparameters = InternalFineTuningJobHyperparameters.DeserializeInternalFineTuningJobHyperparameters(property.Value, options); + continue; + } + if (property.NameEquals("model"u8)) + { + model = property.Value.GetString(); + continue; + } + if (property.NameEquals("object"u8)) + { + @object = new InternalFineTuningJobObject(property.Value.GetString()); + continue; + } + if (property.NameEquals("organization_id"u8)) + { + organizationId = property.Value.GetString(); + continue; + } + if (property.NameEquals("result_files"u8)) + { + List array = new List(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(item.GetString()); + } + resultFiles = array; + continue; + } + if (property.NameEquals("status"u8)) + { + status = new InternalFineTuningJobStatus(property.Value.GetString()); + continue; 
+ } + if (property.NameEquals("trained_tokens"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + trainedTokens = null; + continue; + } + trainedTokens = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("training_file"u8)) + { + trainingFile = property.Value.GetString(); + continue; + } + if (property.NameEquals("validation_file"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + validationFile = null; + continue; + } + validationFile = property.Value.GetString(); + continue; + } + if (property.NameEquals("integrations"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + List array = new List(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(InternalFineTuningIntegration.DeserializeInternalFineTuningIntegration(item, options)); + } + integrations = array; + continue; + } + if (property.NameEquals("seed"u8)) + { + seed = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("estimated_finish"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + estimatedFinish = null; + continue; + } + estimatedFinish = DateTimeOffset.FromUnixTimeSeconds(property.Value.GetInt64()); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalFineTuningJob( + id, + createdAt, + error, + fineTunedModel, + finishedAt, + hyperparameters, + model, + @object, + organizationId, + resultFiles, + status, + trainedTokens, + trainingFile, + validationFile, + integrations ?? new ChangeTrackingList(), + seed, + estimatedFinish, + serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalFineTuningJob)} does not support writing '{options.Format}' format."); + } + } + + InternalFineTuningJob IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalFineTuningJob(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalFineTuningJob)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalFineTuningJob FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalFineTuningJob(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalFineTuningJob.cs b/.dotnet/src/Generated/Models/InternalFineTuningJob.cs new file mode 100644 index 000000000..ce30e5cdd --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalFineTuningJob.cs @@ -0,0 +1,85 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; +using System.Linq; + +namespace OpenAI.FineTuning +{ + internal partial class InternalFineTuningJob + { + internal IDictionary SerializedAdditionalRawData { get; set; } + internal InternalFineTuningJob(string id, DateTimeOffset createdAt, InternalFineTuningJobError error, string fineTunedModel, DateTimeOffset? 
finishedAt, InternalFineTuningJobHyperparameters hyperparameters, string model, string organizationId, IEnumerable<string> resultFiles, InternalFineTuningJobStatus status, int? trainedTokens, string trainingFile, string validationFile, int seed) + { + Argument.AssertNotNull(id, nameof(id)); + Argument.AssertNotNull(hyperparameters, nameof(hyperparameters)); + Argument.AssertNotNull(model, nameof(model)); + Argument.AssertNotNull(organizationId, nameof(organizationId)); + Argument.AssertNotNull(resultFiles, nameof(resultFiles)); + Argument.AssertNotNull(trainingFile, nameof(trainingFile)); + + Id = id; + CreatedAt = createdAt; + Error = error; + FineTunedModel = fineTunedModel; + FinishedAt = finishedAt; + Hyperparameters = hyperparameters; + Model = model; + OrganizationId = organizationId; + ResultFiles = resultFiles.ToList(); + Status = status; + TrainedTokens = trainedTokens; + TrainingFile = trainingFile; + ValidationFile = validationFile; + Integrations = new ChangeTrackingList<InternalFineTuningIntegration>(); + Seed = seed; + } + + internal InternalFineTuningJob(string id, DateTimeOffset createdAt, InternalFineTuningJobError error, string fineTunedModel, DateTimeOffset? finishedAt, InternalFineTuningJobHyperparameters hyperparameters, string model, InternalFineTuningJobObject @object, string organizationId, IReadOnlyList<string> resultFiles, InternalFineTuningJobStatus status, int? trainedTokens, string trainingFile, string validationFile, IReadOnlyList<InternalFineTuningIntegration> integrations, int seed, DateTimeOffset? 
estimatedFinish, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + Id = id; + CreatedAt = createdAt; + Error = error; + FineTunedModel = fineTunedModel; + FinishedAt = finishedAt; + Hyperparameters = hyperparameters; + Model = model; + Object = @object; + OrganizationId = organizationId; + ResultFiles = resultFiles; + Status = status; + TrainedTokens = trainedTokens; + TrainingFile = trainingFile; + ValidationFile = validationFile; + Integrations = integrations; + Seed = seed; + EstimatedFinish = estimatedFinish; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal InternalFineTuningJob() + { + } + + public string Id { get; } + public DateTimeOffset CreatedAt { get; } + public InternalFineTuningJobError Error { get; } + public string FineTunedModel { get; } + public DateTimeOffset? FinishedAt { get; } + public InternalFineTuningJobHyperparameters Hyperparameters { get; } + public string Model { get; } + public InternalFineTuningJobObject Object { get; } = InternalFineTuningJobObject.FineTuningJob; + + public string OrganizationId { get; } + public IReadOnlyList<string> ResultFiles { get; } + public InternalFineTuningJobStatus Status { get; } + public int? TrainedTokens { get; } + public string TrainingFile { get; } + public string ValidationFile { get; } + public IReadOnlyList<InternalFineTuningIntegration> Integrations { get; } + public int Seed { get; } + public DateTimeOffset? 
EstimatedFinish { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalFineTuningJobCheckpoint.Serialization.cs b/.dotnet/src/Generated/Models/InternalFineTuningJobCheckpoint.Serialization.cs new file mode 100644 index 000000000..d562b0cbd --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalFineTuningJobCheckpoint.Serialization.cs @@ -0,0 +1,207 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.FineTuning +{ + internal partial class InternalFineTuningJobCheckpoint : IJsonModel<InternalFineTuningJobCheckpoint> + { + void IJsonModel<InternalFineTuningJobCheckpoint>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalFineTuningJobCheckpoint>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalFineTuningJobCheckpoint)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("id") != true) + { + writer.WritePropertyName("id"u8); + writer.WriteStringValue(Id); + } + if (SerializedAdditionalRawData?.ContainsKey("created_at") != true) + { + writer.WritePropertyName("created_at"u8); + writer.WriteNumberValue(CreatedAt, "U"); + } + if (SerializedAdditionalRawData?.ContainsKey("fine_tuned_model_checkpoint") != true) + { + writer.WritePropertyName("fine_tuned_model_checkpoint"u8); + writer.WriteStringValue(FineTunedModelCheckpoint); + } + if (SerializedAdditionalRawData?.ContainsKey("step_number") != true) + { + writer.WritePropertyName("step_number"u8); + writer.WriteNumberValue(StepNumber); + } + if (SerializedAdditionalRawData?.ContainsKey("metrics") != true) + { + writer.WritePropertyName("metrics"u8); + writer.WriteObjectValue(Metrics, options); + } + if (SerializedAdditionalRawData?.ContainsKey("fine_tuning_job_id") != true) + { + writer.WritePropertyName("fine_tuning_job_id"u8); + 
writer.WriteStringValue(FineTuningJobId); + } + if (SerializedAdditionalRawData?.ContainsKey("object") != true) + { + writer.WritePropertyName("object"u8); + writer.WriteStringValue(Object.ToString()); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalFineTuningJobCheckpoint IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalFineTuningJobCheckpoint)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalFineTuningJobCheckpoint(document.RootElement, options); + } + + internal static InternalFineTuningJobCheckpoint DeserializeInternalFineTuningJobCheckpoint(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string id = default; + DateTimeOffset createdAt = default; + string fineTunedModelCheckpoint = default; + int stepNumber = default; + InternalFineTuningJobCheckpointMetrics metrics = default; + string fineTuningJobId = default; + InternalFineTuningJobCheckpointObject @object = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("id"u8)) 
+ { + id = property.Value.GetString(); + continue; + } + if (property.NameEquals("created_at"u8)) + { + createdAt = DateTimeOffset.FromUnixTimeSeconds(property.Value.GetInt64()); + continue; + } + if (property.NameEquals("fine_tuned_model_checkpoint"u8)) + { + fineTunedModelCheckpoint = property.Value.GetString(); + continue; + } + if (property.NameEquals("step_number"u8)) + { + stepNumber = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("metrics"u8)) + { + metrics = InternalFineTuningJobCheckpointMetrics.DeserializeInternalFineTuningJobCheckpointMetrics(property.Value, options); + continue; + } + if (property.NameEquals("fine_tuning_job_id"u8)) + { + fineTuningJobId = property.Value.GetString(); + continue; + } + if (property.NameEquals("object"u8)) + { + @object = new InternalFineTuningJobCheckpointObject(property.Value.GetString()); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalFineTuningJobCheckpoint( + id, + createdAt, + fineTunedModelCheckpoint, + stepNumber, + metrics, + fineTuningJobId, + @object, + serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalFineTuningJobCheckpoint)} does not support writing '{options.Format}' format."); + } + } + + InternalFineTuningJobCheckpoint IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalFineTuningJobCheckpoint>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalFineTuningJobCheckpoint(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalFineTuningJobCheckpoint)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalFineTuningJobCheckpoint>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalFineTuningJobCheckpoint FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalFineTuningJobCheckpoint(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalFineTuningJobCheckpoint.cs b/.dotnet/src/Generated/Models/InternalFineTuningJobCheckpoint.cs new file mode 100644 index 000000000..463cfd4fb --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalFineTuningJobCheckpoint.cs @@ -0,0 +1,52 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.FineTuning +{ + internal partial class InternalFineTuningJobCheckpoint + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + internal InternalFineTuningJobCheckpoint(string id, DateTimeOffset createdAt, string fineTunedModelCheckpoint, int stepNumber, InternalFineTuningJobCheckpointMetrics metrics, string fineTuningJobId) + { + Argument.AssertNotNull(id, nameof(id)); + Argument.AssertNotNull(fineTunedModelCheckpoint, nameof(fineTunedModelCheckpoint)); + Argument.AssertNotNull(metrics, nameof(metrics)); + Argument.AssertNotNull(fineTuningJobId, nameof(fineTuningJobId)); + + Id = id; + CreatedAt = createdAt; + FineTunedModelCheckpoint = fineTunedModelCheckpoint; + StepNumber = stepNumber; +
Metrics = metrics; + FineTuningJobId = fineTuningJobId; + } + + internal InternalFineTuningJobCheckpoint(string id, DateTimeOffset createdAt, string fineTunedModelCheckpoint, int stepNumber, InternalFineTuningJobCheckpointMetrics metrics, string fineTuningJobId, InternalFineTuningJobCheckpointObject @object, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + Id = id; + CreatedAt = createdAt; + FineTunedModelCheckpoint = fineTunedModelCheckpoint; + StepNumber = stepNumber; + Metrics = metrics; + FineTuningJobId = fineTuningJobId; + Object = @object; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal InternalFineTuningJobCheckpoint() + { + } + + public string Id { get; } + public DateTimeOffset CreatedAt { get; } + public string FineTunedModelCheckpoint { get; } + public int StepNumber { get; } + public InternalFineTuningJobCheckpointMetrics Metrics { get; } + public string FineTuningJobId { get; } + public InternalFineTuningJobCheckpointObject Object { get; } = InternalFineTuningJobCheckpointObject.FineTuningJobCheckpoint; + } +} diff --git a/.dotnet/src/Generated/Models/InternalFineTuningJobCheckpointMetrics.Serialization.cs b/.dotnet/src/Generated/Models/InternalFineTuningJobCheckpointMetrics.Serialization.cs new file mode 100644 index 000000000..3c411f901 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalFineTuningJobCheckpointMetrics.Serialization.cs @@ -0,0 +1,235 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.FineTuning +{ + internal partial class InternalFineTuningJobCheckpointMetrics : IJsonModel<InternalFineTuningJobCheckpointMetrics> + { + void IJsonModel<InternalFineTuningJobCheckpointMetrics>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalFineTuningJobCheckpointMetrics)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("step") != true && Optional.IsDefined(Step)) + { + writer.WritePropertyName("step"u8); + writer.WriteNumberValue(Step.Value); + } + if (SerializedAdditionalRawData?.ContainsKey("train_loss") != true && Optional.IsDefined(TrainLoss)) + { + writer.WritePropertyName("train_loss"u8); + writer.WriteNumberValue(TrainLoss.Value); + } + if (SerializedAdditionalRawData?.ContainsKey("train_mean_token_accuracy") != true && Optional.IsDefined(TrainMeanTokenAccuracy)) + { + writer.WritePropertyName("train_mean_token_accuracy"u8); + writer.WriteNumberValue(TrainMeanTokenAccuracy.Value); + } + if (SerializedAdditionalRawData?.ContainsKey("valid_loss") != true && Optional.IsDefined(ValidLoss)) + { + writer.WritePropertyName("valid_loss"u8); + writer.WriteNumberValue(ValidLoss.Value); + } + if (SerializedAdditionalRawData?.ContainsKey("valid_mean_token_accuracy") != true && Optional.IsDefined(ValidMeanTokenAccuracy)) + { + writer.WritePropertyName("valid_mean_token_accuracy"u8); + writer.WriteNumberValue(ValidMeanTokenAccuracy.Value); + } + if (SerializedAdditionalRawData?.ContainsKey("full_valid_loss") != true && Optional.IsDefined(FullValidLoss)) + { + writer.WritePropertyName("full_valid_loss"u8); + writer.WriteNumberValue(FullValidLoss.Value); + } + if (SerializedAdditionalRawData?.ContainsKey("full_valid_mean_token_accuracy") != true && Optional.IsDefined(FullValidMeanTokenAccuracy)) + { + writer.WritePropertyName("full_valid_mean_token_accuracy"u8); + writer.WriteNumberValue(FullValidMeanTokenAccuracy.Value); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if 
(ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalFineTuningJobCheckpointMetrics IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalFineTuningJobCheckpointMetrics)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalFineTuningJobCheckpointMetrics(document.RootElement, options); + } + + internal static InternalFineTuningJobCheckpointMetrics DeserializeInternalFineTuningJobCheckpointMetrics(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + float? step = default; + float? trainLoss = default; + float? trainMeanTokenAccuracy = default; + float? validLoss = default; + float? validMeanTokenAccuracy = default; + float? fullValidLoss = default; + float? 
fullValidMeanTokenAccuracy = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("step"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + step = property.Value.GetSingle(); + continue; + } + if (property.NameEquals("train_loss"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + trainLoss = property.Value.GetSingle(); + continue; + } + if (property.NameEquals("train_mean_token_accuracy"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + trainMeanTokenAccuracy = property.Value.GetSingle(); + continue; + } + if (property.NameEquals("valid_loss"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + validLoss = property.Value.GetSingle(); + continue; + } + if (property.NameEquals("valid_mean_token_accuracy"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + validMeanTokenAccuracy = property.Value.GetSingle(); + continue; + } + if (property.NameEquals("full_valid_loss"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + fullValidLoss = property.Value.GetSingle(); + continue; + } + if (property.NameEquals("full_valid_mean_token_accuracy"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + fullValidMeanTokenAccuracy = property.Value.GetSingle(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalFineTuningJobCheckpointMetrics( + step, + trainLoss, + trainMeanTokenAccuracy, + validLoss, + validMeanTokenAccuracy, + fullValidLoss, + fullValidMeanTokenAccuracy, + serializedAdditionalRawData); + } + + BinaryData 
IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalFineTuningJobCheckpointMetrics)} does not support writing '{options.Format}' format."); + } + } + + InternalFineTuningJobCheckpointMetrics IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalFineTuningJobCheckpointMetrics(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalFineTuningJobCheckpointMetrics)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalFineTuningJobCheckpointMetrics FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalFineTuningJobCheckpointMetrics(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalFineTuningJobCheckpointMetrics.cs b/.dotnet/src/Generated/Models/InternalFineTuningJobCheckpointMetrics.cs new file mode 100644 index 000000000..63c7256ec --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalFineTuningJobCheckpointMetrics.cs @@ -0,0 +1,37 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.FineTuning +{ + internal partial class InternalFineTuningJobCheckpointMetrics + { + 
internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + internal InternalFineTuningJobCheckpointMetrics() + { + } + + internal InternalFineTuningJobCheckpointMetrics(float? step, float? trainLoss, float? trainMeanTokenAccuracy, float? validLoss, float? validMeanTokenAccuracy, float? fullValidLoss, float? fullValidMeanTokenAccuracy, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + Step = step; + TrainLoss = trainLoss; + TrainMeanTokenAccuracy = trainMeanTokenAccuracy; + ValidLoss = validLoss; + ValidMeanTokenAccuracy = validMeanTokenAccuracy; + FullValidLoss = fullValidLoss; + FullValidMeanTokenAccuracy = fullValidMeanTokenAccuracy; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + public float? Step { get; } + public float? TrainLoss { get; } + public float? TrainMeanTokenAccuracy { get; } + public float? ValidLoss { get; } + public float? ValidMeanTokenAccuracy { get; } + public float? FullValidLoss { get; } + public float? FullValidMeanTokenAccuracy { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalFineTuningJobCheckpointObject.cs b/.dotnet/src/Generated/Models/InternalFineTuningJobCheckpointObject.cs new file mode 100644 index 000000000..5cdd24ad8 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalFineTuningJobCheckpointObject.cs @@ -0,0 +1,34 @@ +// + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.FineTuning +{ + internal readonly partial struct InternalFineTuningJobCheckpointObject : IEquatable<InternalFineTuningJobCheckpointObject> + { + private readonly string _value; + + public InternalFineTuningJobCheckpointObject(string value) + { + _value = value ?? 
throw new ArgumentNullException(nameof(value)); + } + + private const string FineTuningJobCheckpointValue = "fine_tuning.job.checkpoint"; + + public static InternalFineTuningJobCheckpointObject FineTuningJobCheckpoint { get; } = new InternalFineTuningJobCheckpointObject(FineTuningJobCheckpointValue); + public static bool operator ==(InternalFineTuningJobCheckpointObject left, InternalFineTuningJobCheckpointObject right) => left.Equals(right); + public static bool operator !=(InternalFineTuningJobCheckpointObject left, InternalFineTuningJobCheckpointObject right) => !left.Equals(right); + public static implicit operator InternalFineTuningJobCheckpointObject(string value) => new InternalFineTuningJobCheckpointObject(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalFineTuningJobCheckpointObject other && Equals(other); + public bool Equals(InternalFineTuningJobCheckpointObject other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalFineTuningJobError.Serialization.cs b/.dotnet/src/Generated/Models/InternalFineTuningJobError.Serialization.cs new file mode 100644 index 000000000..19c21f184 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalFineTuningJobError.Serialization.cs @@ -0,0 +1,167 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.FineTuning +{ + internal partial class InternalFineTuningJobError : IJsonModel<InternalFineTuningJobError> + { + void IJsonModel<InternalFineTuningJobError>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalFineTuningJobError)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("code") != true) + { + writer.WritePropertyName("code"u8); + writer.WriteStringValue(Code); + } + if (SerializedAdditionalRawData?.ContainsKey("message") != true) + { + writer.WritePropertyName("message"u8); + writer.WriteStringValue(Message); + } + if (SerializedAdditionalRawData?.ContainsKey("param") != true) + { + if (Param != null) + { + writer.WritePropertyName("param"u8); + writer.WriteStringValue(Param); + } + else + { + writer.WriteNull("param"); + } + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalFineTuningJobError IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalFineTuningJobError)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalFineTuningJobError(document.RootElement, options); + } + + internal static InternalFineTuningJobError DeserializeInternalFineTuningJobError(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string code = default; + string message = default; + string param = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("code"u8)) + { + code = property.Value.GetString(); + continue; + } + if (property.NameEquals("message"u8)) + { + message = property.Value.GetString(); + continue; + } + if (property.NameEquals("param"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + param = null; + continue; + } + param = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalFineTuningJobError(code, message, param, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalFineTuningJobError>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalFineTuningJobError)} does not support writing '{options.Format}' format."); + } + } + + InternalFineTuningJobError IPersistableModel<InternalFineTuningJobError>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalFineTuningJobError>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalFineTuningJobError(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalFineTuningJobError)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalFineTuningJobError>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalFineTuningJobError FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalFineTuningJobError(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalFineTuningJobError.cs b/.dotnet/src/Generated/Models/InternalFineTuningJobError.cs new file mode 100644 index 000000000..17325bf89 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalFineTuningJobError.cs @@ -0,0 +1,39 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.FineTuning +{ + internal partial class InternalFineTuningJobError + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + internal InternalFineTuningJobError(string code, string message, string param) + { + Argument.AssertNotNull(code, nameof(code)); + 
Argument.AssertNotNull(message, nameof(message)); + + Code = code; + Message = message; + Param = param; + } + + internal InternalFineTuningJobError(string code, string message, string param, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + Code = code; + Message = message; + Param = param; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal InternalFineTuningJobError() + { + } + + public string Code { get; } + public string Message { get; } + public string Param { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalFineTuningJobEvent.Serialization.cs b/.dotnet/src/Generated/Models/InternalFineTuningJobEvent.Serialization.cs new file mode 100644 index 000000000..5402198b5 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalFineTuningJobEvent.Serialization.cs @@ -0,0 +1,183 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.FineTuning +{ + internal partial class InternalFineTuningJobEvent : IJsonModel<InternalFineTuningJobEvent> + { + void IJsonModel<InternalFineTuningJobEvent>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalFineTuningJobEvent)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("id") != true) + { + writer.WritePropertyName("id"u8); + writer.WriteStringValue(Id); + } + if (SerializedAdditionalRawData?.ContainsKey("created_at") != true) + { + writer.WritePropertyName("created_at"u8); + writer.WriteNumberValue(CreatedAt, "U"); + } + if (SerializedAdditionalRawData?.ContainsKey("level") != true) + { + writer.WritePropertyName("level"u8); + writer.WriteStringValue(Level.ToString()); + } + if (SerializedAdditionalRawData?.ContainsKey("message") != true) + { + writer.WritePropertyName("message"u8); + writer.WriteStringValue(Message); + } + if (SerializedAdditionalRawData?.ContainsKey("object") != true) + { + writer.WritePropertyName("object"u8); + writer.WriteStringValue(Object.ToString()); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalFineTuningJobEvent IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalFineTuningJobEvent)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalFineTuningJobEvent(document.RootElement, options); + } + + internal static InternalFineTuningJobEvent DeserializeInternalFineTuningJobEvent(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string id = default; + DateTimeOffset createdAt = default; + InternalFineTuningJobEventLevel level = default; + string message = default; + InternalFineTuningJobEventObject @object = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("id"u8)) + { + id = property.Value.GetString(); + continue; + } + if (property.NameEquals("created_at"u8)) + { + createdAt = DateTimeOffset.FromUnixTimeSeconds(property.Value.GetInt64()); + continue; + } + if (property.NameEquals("level"u8)) + { + level = new InternalFineTuningJobEventLevel(property.Value.GetString()); + continue; + } + if (property.NameEquals("message"u8)) + { + message = property.Value.GetString(); + continue; + } + if (property.NameEquals("object"u8)) + { + @object = new InternalFineTuningJobEventObject(property.Value.GetString()); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalFineTuningJobEvent( + id, + createdAt, + level, + message, + @object, + serializedAdditionalRawData); + } + + BinaryData 
IPersistableModel<InternalFineTuningJobEvent>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalFineTuningJobEvent>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalFineTuningJobEvent)} does not support writing '{options.Format}' format."); + } + } + + InternalFineTuningJobEvent IPersistableModel<InternalFineTuningJobEvent>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalFineTuningJobEvent>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalFineTuningJobEvent(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalFineTuningJobEvent)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalFineTuningJobEvent>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalFineTuningJobEvent FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalFineTuningJobEvent(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalFineTuningJobEvent.cs b/.dotnet/src/Generated/Models/InternalFineTuningJobEvent.cs new file mode 100644 index 000000000..500b4ecf1 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalFineTuningJobEvent.cs @@ -0,0 +1,44 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.FineTuning +{ + internal partial class InternalFineTuningJobEvent + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + internal InternalFineTuningJobEvent(string id, 
DateTimeOffset createdAt, InternalFineTuningJobEventLevel level, string message) + { + Argument.AssertNotNull(id, nameof(id)); + Argument.AssertNotNull(message, nameof(message)); + + Id = id; + CreatedAt = createdAt; + Level = level; + Message = message; + } + + internal InternalFineTuningJobEvent(string id, DateTimeOffset createdAt, InternalFineTuningJobEventLevel level, string message, InternalFineTuningJobEventObject @object, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + Id = id; + CreatedAt = createdAt; + Level = level; + Message = message; + Object = @object; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal InternalFineTuningJobEvent() + { + } + + public string Id { get; } + public DateTimeOffset CreatedAt { get; } + public InternalFineTuningJobEventLevel Level { get; } + public string Message { get; } + public InternalFineTuningJobEventObject Object { get; } = InternalFineTuningJobEventObject.FineTuningJobEvent; + } +} diff --git a/.dotnet/src/Generated/Models/InternalFineTuningJobEventLevel.cs b/.dotnet/src/Generated/Models/InternalFineTuningJobEventLevel.cs new file mode 100644 index 000000000..4a9fd45d6 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalFineTuningJobEventLevel.cs @@ -0,0 +1,38 @@ +// + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.FineTuning +{ + internal readonly partial struct InternalFineTuningJobEventLevel : IEquatable<InternalFineTuningJobEventLevel> + { + private readonly string _value; + + public InternalFineTuningJobEventLevel(string value) + { + _value = value ?? 
throw new ArgumentNullException(nameof(value)); + } + + private const string InfoValue = "info"; + private const string WarnValue = "warn"; + private const string ErrorValue = "error"; + + public static InternalFineTuningJobEventLevel Info { get; } = new InternalFineTuningJobEventLevel(InfoValue); + public static InternalFineTuningJobEventLevel Warn { get; } = new InternalFineTuningJobEventLevel(WarnValue); + public static InternalFineTuningJobEventLevel Error { get; } = new InternalFineTuningJobEventLevel(ErrorValue); + public static bool operator ==(InternalFineTuningJobEventLevel left, InternalFineTuningJobEventLevel right) => left.Equals(right); + public static bool operator !=(InternalFineTuningJobEventLevel left, InternalFineTuningJobEventLevel right) => !left.Equals(right); + public static implicit operator InternalFineTuningJobEventLevel(string value) => new InternalFineTuningJobEventLevel(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalFineTuningJobEventLevel other && Equals(other); + public bool Equals(InternalFineTuningJobEventLevel other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? 
StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalFineTuningJobEventObject.cs b/.dotnet/src/Generated/Models/InternalFineTuningJobEventObject.cs new file mode 100644 index 000000000..04fdff4dc --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalFineTuningJobEventObject.cs @@ -0,0 +1,34 @@ +// + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.FineTuning +{ + internal readonly partial struct InternalFineTuningJobEventObject : IEquatable<InternalFineTuningJobEventObject> + { + private readonly string _value; + + public InternalFineTuningJobEventObject(string value) + { + _value = value ?? throw new ArgumentNullException(nameof(value)); + } + + private const string FineTuningJobEventValue = "fine_tuning.job.event"; + + public static InternalFineTuningJobEventObject FineTuningJobEvent { get; } = new InternalFineTuningJobEventObject(FineTuningJobEventValue); + public static bool operator ==(InternalFineTuningJobEventObject left, InternalFineTuningJobEventObject right) => left.Equals(right); + public static bool operator !=(InternalFineTuningJobEventObject left, InternalFineTuningJobEventObject right) => !left.Equals(right); + public static implicit operator InternalFineTuningJobEventObject(string value) => new InternalFineTuningJobEventObject(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalFineTuningJobEventObject other && Equals(other); + public bool Equals(InternalFineTuningJobEventObject other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? 
StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalFineTuningJobHyperparameters.Serialization.cs b/.dotnet/src/Generated/Models/InternalFineTuningJobHyperparameters.Serialization.cs new file mode 100644 index 000000000..9d7aa8f60 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalFineTuningJobHyperparameters.Serialization.cs @@ -0,0 +1,140 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.FineTuning +{ + internal partial class InternalFineTuningJobHyperparameters : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalFineTuningJobHyperparameters)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("n_epochs") != true) + { + writer.WritePropertyName("n_epochs"u8); +#if NET6_0_OR_GREATER + writer.WriteRawValue(NEpochs); +#else + using (JsonDocument document = JsonDocument.Parse(NEpochs)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalFineTuningJobHyperparameters IJsonModel.Create(ref 
Utf8JsonReader reader, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalFineTuningJobHyperparameters>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(InternalFineTuningJobHyperparameters)} does not support reading '{format}' format.");
+            }
+
+            using JsonDocument document = JsonDocument.ParseValue(ref reader);
+            return DeserializeInternalFineTuningJobHyperparameters(document.RootElement, options);
+        }
+
+        internal static InternalFineTuningJobHyperparameters DeserializeInternalFineTuningJobHyperparameters(JsonElement element, ModelReaderWriterOptions options = null)
+        {
+            options ??= ModelSerializationExtensions.WireOptions;
+
+            if (element.ValueKind == JsonValueKind.Null)
+            {
+                return null;
+            }
+            BinaryData nEpochs = default;
+            IDictionary<string, BinaryData> serializedAdditionalRawData = default;
+            Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>();
+            foreach (var property in element.EnumerateObject())
+            {
+                if (property.NameEquals("n_epochs"u8))
+                {
+                    nEpochs = BinaryData.FromString(property.Value.GetRawText());
+                    continue;
+                }
+                if (true)
+                {
+                    rawDataDictionary ??= new Dictionary<string, BinaryData>();
+                    rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText()));
+                }
+            }
+            serializedAdditionalRawData = rawDataDictionary;
+            return new InternalFineTuningJobHyperparameters(nEpochs, serializedAdditionalRawData);
+        }
+
+        BinaryData IPersistableModel<InternalFineTuningJobHyperparameters>.Write(ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ?
((IPersistableModel<InternalFineTuningJobHyperparameters>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    return ModelReaderWriter.Write(this, options);
+                default:
+                    throw new FormatException($"The model {nameof(InternalFineTuningJobHyperparameters)} does not support writing '{options.Format}' format.");
+            }
+        }
+
+        InternalFineTuningJobHyperparameters IPersistableModel<InternalFineTuningJobHyperparameters>.Create(BinaryData data, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalFineTuningJobHyperparameters>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    {
+                        using JsonDocument document = JsonDocument.Parse(data);
+                        return DeserializeInternalFineTuningJobHyperparameters(document.RootElement, options);
+                    }
+                default:
+                    throw new FormatException($"The model {nameof(InternalFineTuningJobHyperparameters)} does not support reading '{options.Format}' format.");
+            }
+        }
+
+        string IPersistableModel<InternalFineTuningJobHyperparameters>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J";
+
+        internal static InternalFineTuningJobHyperparameters FromResponse(PipelineResponse response)
+        {
+            using var document = JsonDocument.Parse(response.Content);
+            return DeserializeInternalFineTuningJobHyperparameters(document.RootElement);
+        }
+
+        internal virtual BinaryContent ToBinaryContent()
+        {
+            return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions);
+        }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalFineTuningJobHyperparameters.cs b/.dotnet/src/Generated/Models/InternalFineTuningJobHyperparameters.cs
new file mode 100644
index 000000000..c0855f274
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalFineTuningJobHyperparameters.cs
@@ -0,0 +1,32 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.Collections.Generic;
+
+namespace OpenAI.FineTuning
+{
+    internal partial class InternalFineTuningJobHyperparameters
+    {
+        internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; }
+        internal InternalFineTuningJobHyperparameters(BinaryData
nEpochs)
+        {
+            Argument.AssertNotNull(nEpochs, nameof(nEpochs));
+
+            NEpochs = nEpochs;
+        }
+
+        internal InternalFineTuningJobHyperparameters(BinaryData nEpochs, IDictionary<string, BinaryData> serializedAdditionalRawData)
+        {
+            NEpochs = nEpochs;
+            SerializedAdditionalRawData = serializedAdditionalRawData;
+        }
+
+        internal InternalFineTuningJobHyperparameters()
+        {
+        }
+
+        public BinaryData NEpochs { get; }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalFineTuningJobHyperparametersNEpochsChoiceEnum.cs b/.dotnet/src/Generated/Models/InternalFineTuningJobHyperparametersNEpochsChoiceEnum.cs
new file mode 100644
index 000000000..02f0f0754
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalFineTuningJobHyperparametersNEpochsChoiceEnum.cs
@@ -0,0 +1,34 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ComponentModel;
+
+namespace OpenAI.FineTuning
+{
+    internal readonly partial struct InternalFineTuningJobHyperparametersNEpochsChoiceEnum : IEquatable<InternalFineTuningJobHyperparametersNEpochsChoiceEnum>
+    {
+        private readonly string _value;
+
+        public InternalFineTuningJobHyperparametersNEpochsChoiceEnum(string value)
+        {
+            _value = value ??
throw new ArgumentNullException(nameof(value));
+        }
+
+        private const string AutoValue = "auto";
+
+        public static InternalFineTuningJobHyperparametersNEpochsChoiceEnum Auto { get; } = new InternalFineTuningJobHyperparametersNEpochsChoiceEnum(AutoValue);
+        public static bool operator ==(InternalFineTuningJobHyperparametersNEpochsChoiceEnum left, InternalFineTuningJobHyperparametersNEpochsChoiceEnum right) => left.Equals(right);
+        public static bool operator !=(InternalFineTuningJobHyperparametersNEpochsChoiceEnum left, InternalFineTuningJobHyperparametersNEpochsChoiceEnum right) => !left.Equals(right);
+        public static implicit operator InternalFineTuningJobHyperparametersNEpochsChoiceEnum(string value) => new InternalFineTuningJobHyperparametersNEpochsChoiceEnum(value);
+
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override bool Equals(object obj) => obj is InternalFineTuningJobHyperparametersNEpochsChoiceEnum other && Equals(other);
+        public bool Equals(InternalFineTuningJobHyperparametersNEpochsChoiceEnum other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase);
+
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0;
+        public override string ToString() => _value;
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalFineTuningJobObject.cs b/.dotnet/src/Generated/Models/InternalFineTuningJobObject.cs
new file mode 100644
index 000000000..b627e2945
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalFineTuningJobObject.cs
@@ -0,0 +1,34 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ComponentModel;
+
+namespace OpenAI.FineTuning
+{
+    internal readonly partial struct InternalFineTuningJobObject : IEquatable<InternalFineTuningJobObject>
+    {
+        private readonly string _value;
+
+        public InternalFineTuningJobObject(string value)
+        {
+            _value = value ??
throw new ArgumentNullException(nameof(value));
+        }
+
+        private const string FineTuningJobValue = "fine_tuning.job";
+
+        public static InternalFineTuningJobObject FineTuningJob { get; } = new InternalFineTuningJobObject(FineTuningJobValue);
+        public static bool operator ==(InternalFineTuningJobObject left, InternalFineTuningJobObject right) => left.Equals(right);
+        public static bool operator !=(InternalFineTuningJobObject left, InternalFineTuningJobObject right) => !left.Equals(right);
+        public static implicit operator InternalFineTuningJobObject(string value) => new InternalFineTuningJobObject(value);
+
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override bool Equals(object obj) => obj is InternalFineTuningJobObject other && Equals(other);
+        public bool Equals(InternalFineTuningJobObject other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase);
+
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0;
+        public override string ToString() => _value;
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalFineTuningJobStatus.cs b/.dotnet/src/Generated/Models/InternalFineTuningJobStatus.cs
new file mode 100644
index 000000000..c04e6835a
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalFineTuningJobStatus.cs
@@ -0,0 +1,44 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ComponentModel;
+
+namespace OpenAI.FineTuning
+{
+    internal readonly partial struct InternalFineTuningJobStatus : IEquatable<InternalFineTuningJobStatus>
+    {
+        private readonly string _value;
+
+        public InternalFineTuningJobStatus(string value)
+        {
+            _value = value ??
throw new ArgumentNullException(nameof(value));
+        }
+
+        private const string ValidatingFilesValue = "validating_files";
+        private const string QueuedValue = "queued";
+        private const string RunningValue = "running";
+        private const string SucceededValue = "succeeded";
+        private const string FailedValue = "failed";
+        private const string CancelledValue = "cancelled";
+
+        public static InternalFineTuningJobStatus ValidatingFiles { get; } = new InternalFineTuningJobStatus(ValidatingFilesValue);
+        public static InternalFineTuningJobStatus Queued { get; } = new InternalFineTuningJobStatus(QueuedValue);
+        public static InternalFineTuningJobStatus Running { get; } = new InternalFineTuningJobStatus(RunningValue);
+        public static InternalFineTuningJobStatus Succeeded { get; } = new InternalFineTuningJobStatus(SucceededValue);
+        public static InternalFineTuningJobStatus Failed { get; } = new InternalFineTuningJobStatus(FailedValue);
+        public static InternalFineTuningJobStatus Cancelled { get; } = new InternalFineTuningJobStatus(CancelledValue);
+        public static bool operator ==(InternalFineTuningJobStatus left, InternalFineTuningJobStatus right) => left.Equals(right);
+        public static bool operator !=(InternalFineTuningJobStatus left, InternalFineTuningJobStatus right) => !left.Equals(right);
+        public static implicit operator InternalFineTuningJobStatus(string value) => new InternalFineTuningJobStatus(value);
+
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override bool Equals(object obj) => obj is InternalFineTuningJobStatus other && Equals(other);
+        public bool Equals(InternalFineTuningJobStatus other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase);
+
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override int GetHashCode() => _value != null ?
StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalFinetuneChatRequestInput.Serialization.cs b/.dotnet/src/Generated/Models/InternalFinetuneChatRequestInput.Serialization.cs new file mode 100644 index 000000000..946bb2d93 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalFinetuneChatRequestInput.Serialization.cs @@ -0,0 +1,232 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; +using OpenAI.Chat; + +namespace OpenAI.FineTuning +{ + internal partial class InternalFinetuneChatRequestInput : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalFinetuneChatRequestInput)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("messages") != true && Optional.IsCollectionDefined(Messages)) + { + writer.WritePropertyName("messages"u8); + writer.WriteStartArray(); + foreach (var item in Messages) + { + if (item == null) + { + writer.WriteNullValue(); + continue; + } +#if NET6_0_OR_GREATER + writer.WriteRawValue(item); +#else + using (JsonDocument document = JsonDocument.Parse(item)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + writer.WriteEndArray(); + } + if (SerializedAdditionalRawData?.ContainsKey("tools") != true && Optional.IsCollectionDefined(Tools)) + { + writer.WritePropertyName("tools"u8); + writer.WriteStartArray(); + foreach (var item in Tools) + { + writer.WriteObjectValue(item, options); + } + writer.WriteEndArray(); + } + if 
(SerializedAdditionalRawData?.ContainsKey("parallel_tool_calls") != true && Optional.IsDefined(ParallelToolCalls)) + { + writer.WritePropertyName("parallel_tool_calls"u8); + writer.WriteBooleanValue(ParallelToolCalls.Value); + } + if (SerializedAdditionalRawData?.ContainsKey("functions") != true && Optional.IsCollectionDefined(Functions)) + { + writer.WritePropertyName("functions"u8); + writer.WriteStartArray(); + foreach (var item in Functions) + { + writer.WriteObjectValue(item, options); + } + writer.WriteEndArray(); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalFinetuneChatRequestInput IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalFinetuneChatRequestInput)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalFinetuneChatRequestInput(document.RootElement, options); + } + + internal static InternalFinetuneChatRequestInput DeserializeInternalFinetuneChatRequestInput(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + IList messages = default; + IList tools = default; + bool? 
parallelToolCalls = default; + IList functions = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("messages"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + List array = new List(); + foreach (var item in property.Value.EnumerateArray()) + { + if (item.ValueKind == JsonValueKind.Null) + { + array.Add(null); + } + else + { + array.Add(BinaryData.FromString(item.GetRawText())); + } + } + messages = array; + continue; + } + if (property.NameEquals("tools"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + List array = new List(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(ChatTool.DeserializeChatTool(item, options)); + } + tools = array; + continue; + } + if (property.NameEquals("parallel_tool_calls"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + parallelToolCalls = property.Value.GetBoolean(); + continue; + } + if (property.NameEquals("functions"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + List array = new List(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(ChatFunction.DeserializeChatFunction(item, options)); + } + functions = array; + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalFinetuneChatRequestInput(messages ?? new ChangeTrackingList(), tools ?? new ChangeTrackingList(), parallelToolCalls, functions ?? new ChangeTrackingList(), serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalFinetuneChatRequestInput)} does not support writing '{options.Format}' format."); + } + } + + InternalFinetuneChatRequestInput IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalFinetuneChatRequestInput(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalFinetuneChatRequestInput)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalFinetuneChatRequestInput FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalFinetuneChatRequestInput(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalFinetuneChatRequestInput.cs b/.dotnet/src/Generated/Models/InternalFinetuneChatRequestInput.cs new file mode 100644 index 000000000..b325ed536 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalFinetuneChatRequestInput.cs @@ -0,0 +1,35 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; +using OpenAI.Chat; + +namespace OpenAI.FineTuning +{ + internal partial class InternalFinetuneChatRequestInput + { + internal IDictionary SerializedAdditionalRawData { get; set; } + public InternalFinetuneChatRequestInput() + { + Messages = new 
ChangeTrackingList<BinaryData>();
+            Tools = new ChangeTrackingList<ChatTool>();
+            Functions = new ChangeTrackingList<ChatFunction>();
+        }
+
+        internal InternalFinetuneChatRequestInput(IList<BinaryData> messages, IList<ChatTool> tools, bool? parallelToolCalls, IList<ChatFunction> functions, IDictionary<string, BinaryData> serializedAdditionalRawData)
+        {
+            Messages = messages;
+            Tools = tools;
+            ParallelToolCalls = parallelToolCalls;
+            Functions = functions;
+            SerializedAdditionalRawData = serializedAdditionalRawData;
+        }
+
+        public IList<BinaryData> Messages { get; }
+        public IList<ChatTool> Tools { get; }
+        public bool? ParallelToolCalls { get; set; }
+        public IList<ChatFunction> Functions { get; }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalFinetuneCompletionRequestInput.Serialization.cs b/.dotnet/src/Generated/Models/InternalFinetuneCompletionRequestInput.Serialization.cs
new file mode 100644
index 000000000..5fbbb7472
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalFinetuneCompletionRequestInput.Serialization.cs
@@ -0,0 +1,144 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace OpenAI.FineTuning
+{
+    internal partial class InternalFinetuneCompletionRequestInput : IJsonModel<InternalFinetuneCompletionRequestInput>
+    {
+        void IJsonModel<InternalFinetuneCompletionRequestInput>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ?
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalFinetuneCompletionRequestInput)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("prompt") != true && Optional.IsDefined(Prompt)) + { + writer.WritePropertyName("prompt"u8); + writer.WriteStringValue(Prompt); + } + if (SerializedAdditionalRawData?.ContainsKey("completion") != true && Optional.IsDefined(Completion)) + { + writer.WritePropertyName("completion"u8); + writer.WriteStringValue(Completion); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalFinetuneCompletionRequestInput IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalFinetuneCompletionRequestInput)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalFinetuneCompletionRequestInput(document.RootElement, options); + } + + internal static InternalFinetuneCompletionRequestInput DeserializeInternalFinetuneCompletionRequestInput(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string prompt = default; + string completion = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("prompt"u8)) + { + prompt = property.Value.GetString(); + continue; + } + if (property.NameEquals("completion"u8)) + { + completion = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalFinetuneCompletionRequestInput(prompt, completion, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalFinetuneCompletionRequestInput)} does not support writing '{options.Format}' format."); + } + } + + InternalFinetuneCompletionRequestInput IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalFinetuneCompletionRequestInput(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalFinetuneCompletionRequestInput)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalFinetuneCompletionRequestInput FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalFinetuneCompletionRequestInput(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalFinetuneCompletionRequestInput.cs b/.dotnet/src/Generated/Models/InternalFinetuneCompletionRequestInput.cs new file mode 100644 index 000000000..3e30640c6 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalFinetuneCompletionRequestInput.cs @@ -0,0 +1,27 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.FineTuning +{ + internal partial class InternalFinetuneCompletionRequestInput + { + internal IDictionary SerializedAdditionalRawData { get; set; } + public 
InternalFinetuneCompletionRequestInput()
+        {
+        }
+
+        internal InternalFinetuneCompletionRequestInput(string prompt, string completion, IDictionary<string, BinaryData> serializedAdditionalRawData)
+        {
+            Prompt = prompt;
+            Completion = completion;
+            SerializedAdditionalRawData = serializedAdditionalRawData;
+        }
+
+        public string Prompt { get; set; }
+        public string Completion { get; set; }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalFunctionDefinition.Serialization.cs b/.dotnet/src/Generated/Models/InternalFunctionDefinition.Serialization.cs
new file mode 100644
index 000000000..57335523d
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalFunctionDefinition.Serialization.cs
@@ -0,0 +1,189 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace OpenAI
+{
+    internal partial class InternalFunctionDefinition : IJsonModel<InternalFunctionDefinition>
+    {
+        void IJsonModel<InternalFunctionDefinition>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ?
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalFunctionDefinition)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("description") != true && Optional.IsDefined(Description)) + { + writer.WritePropertyName("description"u8); + writer.WriteStringValue(Description); + } + if (SerializedAdditionalRawData?.ContainsKey("name") != true) + { + writer.WritePropertyName("name"u8); + writer.WriteStringValue(Name); + } + if (SerializedAdditionalRawData?.ContainsKey("parameters") != true && Optional.IsDefined(Parameters)) + { + writer.WritePropertyName("parameters"u8); +#if NET6_0_OR_GREATER + writer.WriteRawValue(Parameters); +#else + using (JsonDocument document = JsonDocument.Parse(Parameters)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + if (SerializedAdditionalRawData?.ContainsKey("strict") != true && Optional.IsDefined(Strict)) + { + if (Strict != null) + { + writer.WritePropertyName("strict"u8); + writer.WriteBooleanValue(Strict.Value); + } + else + { + writer.WriteNull("strict"); + } + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalFunctionDefinition IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalFunctionDefinition)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalFunctionDefinition(document.RootElement, options); + } + + internal static InternalFunctionDefinition DeserializeInternalFunctionDefinition(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string description = default; + string name = default; + BinaryData parameters = default; + bool? strict = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("description"u8)) + { + description = property.Value.GetString(); + continue; + } + if (property.NameEquals("name"u8)) + { + name = property.Value.GetString(); + continue; + } + if (property.NameEquals("parameters"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + parameters = BinaryData.FromString(property.Value.GetRawText()); + continue; + } + if (property.NameEquals("strict"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + strict = null; + continue; + } + strict = property.Value.GetBoolean(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalFunctionDefinition(description, name, parameters, strict, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalFunctionDefinition)} does not support writing '{options.Format}' format."); + } + } + + InternalFunctionDefinition IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalFunctionDefinition(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalFunctionDefinition)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalFunctionDefinition FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalFunctionDefinition(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalFunctionDefinition.cs b/.dotnet/src/Generated/Models/InternalFunctionDefinition.cs new file mode 100644 index 000000000..9e385deb8 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalFunctionDefinition.cs @@ -0,0 +1,37 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI +{ + internal partial class InternalFunctionDefinition + { + internal IDictionary SerializedAdditionalRawData { get; set; } + public InternalFunctionDefinition(string name) + { + Argument.AssertNotNull(name, nameof(name)); + + Name = name; + } + + internal InternalFunctionDefinition(string 
description, string name, BinaryData parameters, bool? strict, IDictionary<string, BinaryData> serializedAdditionalRawData)
+        {
+            Description = description;
+            Name = name;
+            Parameters = parameters;
+            Strict = strict;
+            SerializedAdditionalRawData = serializedAdditionalRawData;
+        }
+
+        internal InternalFunctionDefinition()
+        {
+        }
+
+        public string Description { get; set; }
+        public string Name { get; set; }
+        public bool? Strict { get; set; }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalFunctionParameters.Serialization.cs b/.dotnet/src/Generated/Models/InternalFunctionParameters.Serialization.cs
new file mode 100644
index 000000000..4ab8c6f4c
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalFunctionParameters.Serialization.cs
@@ -0,0 +1,111 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace OpenAI.Chat
+{
+    internal partial class InternalFunctionParameters : IJsonModel<InternalFunctionParameters>
+    {
+        void IJsonModel<InternalFunctionParameters>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalFunctionParameters>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(InternalFunctionParameters)} does not support writing '{format}' format.");
+            }
+
+            writer.WriteStartObject();
+            foreach (var item in AdditionalProperties)
+            {
+                writer.WritePropertyName(item.Key);
+#if NET6_0_OR_GREATER
+                writer.WriteRawValue(item.Value);
+#else
+                using (JsonDocument document = JsonDocument.Parse(item.Value))
+                {
+                    JsonSerializer.Serialize(writer, document.RootElement);
+                }
+#endif
+            }
+            writer.WriteEndObject();
+        }
+
+        InternalFunctionParameters IJsonModel<InternalFunctionParameters>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ?
((IPersistableModel<InternalFunctionParameters>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalFunctionParameters)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalFunctionParameters(document.RootElement, options); + } + + internal static InternalFunctionParameters DeserializeInternalFunctionParameters(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + IDictionary<string, BinaryData> additionalProperties = default; + Dictionary<string, BinaryData> additionalPropertiesDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + additionalPropertiesDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + additionalProperties = additionalPropertiesDictionary; + return new InternalFunctionParameters(additionalProperties); + } + + BinaryData IPersistableModel<InternalFunctionParameters>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalFunctionParameters>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalFunctionParameters)} does not support writing '{options.Format}' format."); + } + } + + InternalFunctionParameters IPersistableModel<InternalFunctionParameters>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalFunctionParameters>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalFunctionParameters(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalFunctionParameters)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalFunctionParameters>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalFunctionParameters FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalFunctionParameters(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalFunctionParameters.cs b/.dotnet/src/Generated/Models/InternalFunctionParameters.cs new file mode 100644 index 000000000..0320cf60c --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalFunctionParameters.cs @@ -0,0 +1,24 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Chat +{ + internal partial class InternalFunctionParameters + { + public InternalFunctionParameters() + { + AdditionalProperties = new ChangeTrackingDictionary<string, BinaryData>(); + } + + internal InternalFunctionParameters(IDictionary<string, BinaryData> additionalProperties) + { + AdditionalProperties = additionalProperties; + } + + public IDictionary<string, BinaryData> AdditionalProperties { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalListAssistantsResponse.Serialization.cs b/.dotnet/src/Generated/Models/InternalListAssistantsResponse.Serialization.cs new file mode 100644 index 000000000..edbc8c764 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalListAssistantsResponse.Serialization.cs @@ -0,0 +1,193 @@ +// <auto-generated/> + +#nullable disable + +using System; +using 
System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class InternalListAssistantsResponse : IJsonModel<InternalListAssistantsResponse> + { + void IJsonModel<InternalListAssistantsResponse>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalListAssistantsResponse>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalListAssistantsResponse)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("object") != true) + { + writer.WritePropertyName("object"u8); + writer.WriteStringValue(Object.ToString()); + } + if (SerializedAdditionalRawData?.ContainsKey("data") != true) + { + writer.WritePropertyName("data"u8); + writer.WriteStartArray(); + foreach (var item in Data) + { + writer.WriteObjectValue(item, options); + } + writer.WriteEndArray(); + } + if (SerializedAdditionalRawData?.ContainsKey("first_id") != true) + { + writer.WritePropertyName("first_id"u8); + writer.WriteStringValue(FirstId); + } + if (SerializedAdditionalRawData?.ContainsKey("last_id") != true) + { + writer.WritePropertyName("last_id"u8); + writer.WriteStringValue(LastId); + } + if (SerializedAdditionalRawData?.ContainsKey("has_more") != true) + { + writer.WritePropertyName("has_more"u8); + writer.WriteBooleanValue(HasMore); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalListAssistantsResponse IJsonModel<InternalListAssistantsResponse>.Create(ref 
Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalListAssistantsResponse>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalListAssistantsResponse)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalListAssistantsResponse(document.RootElement, options); + } + + internal static InternalListAssistantsResponse DeserializeInternalListAssistantsResponse(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + InternalListAssistantsResponseObject @object = default; + IReadOnlyList<Assistant> data = default; + string firstId = default; + string lastId = default; + bool hasMore = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("object"u8)) + { + @object = new InternalListAssistantsResponseObject(property.Value.GetString()); + continue; + } + if (property.NameEquals("data"u8)) + { + List<Assistant> array = new List<Assistant>(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(Assistant.DeserializeAssistant(item, options)); + } + data = array; + continue; + } + if (property.NameEquals("first_id"u8)) + { + firstId = property.Value.GetString(); + continue; + } + if (property.NameEquals("last_id"u8)) + { + lastId = property.Value.GetString(); + continue; + } + if (property.NameEquals("has_more"u8)) + { + hasMore = property.Value.GetBoolean(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new 
InternalListAssistantsResponse( + @object, + data, + firstId, + lastId, + hasMore, + serializedAdditionalRawData); + } + + BinaryData IPersistableModel<InternalListAssistantsResponse>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalListAssistantsResponse>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalListAssistantsResponse)} does not support writing '{options.Format}' format."); + } + } + + InternalListAssistantsResponse IPersistableModel<InternalListAssistantsResponse>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalListAssistantsResponse>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalListAssistantsResponse(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalListAssistantsResponse)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalListAssistantsResponse>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalListAssistantsResponse FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalListAssistantsResponse(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalListAssistantsResponse.cs b/.dotnet/src/Generated/Models/InternalListAssistantsResponse.cs new file mode 100644 index 000000000..6ee4c4ccd --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalListAssistantsResponse.cs @@ -0,0 +1,47 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.Collections.Generic; +using System.Linq; + +namespace 
OpenAI.Assistants +{ + internal partial class InternalListAssistantsResponse + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + internal InternalListAssistantsResponse(IEnumerable<Assistant> data, string firstId, string lastId, bool hasMore) + { + Argument.AssertNotNull(data, nameof(data)); + Argument.AssertNotNull(firstId, nameof(firstId)); + Argument.AssertNotNull(lastId, nameof(lastId)); + + Data = data.ToList(); + FirstId = firstId; + LastId = lastId; + HasMore = hasMore; + } + + internal InternalListAssistantsResponse(InternalListAssistantsResponseObject @object, IReadOnlyList<Assistant> data, string firstId, string lastId, bool hasMore, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + Object = @object; + Data = data; + FirstId = firstId; + LastId = lastId; + HasMore = hasMore; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal InternalListAssistantsResponse() + { + } + + public InternalListAssistantsResponseObject Object { get; } = InternalListAssistantsResponseObject.List; + + public IReadOnlyList<Assistant> Data { get; } + public string FirstId { get; } + public string LastId { get; } + public bool HasMore { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalListAssistantsResponseObject.cs b/.dotnet/src/Generated/Models/InternalListAssistantsResponseObject.cs new file mode 100644 index 000000000..6d7350e78 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalListAssistantsResponseObject.cs @@ -0,0 +1,34 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Assistants +{ + internal readonly partial struct InternalListAssistantsResponseObject : IEquatable<InternalListAssistantsResponseObject> + { + private readonly string _value; + + public InternalListAssistantsResponseObject(string value) + { + _value = value ?? 
throw new ArgumentNullException(nameof(value)); + } + + private const string ListValue = "list"; + + public static InternalListAssistantsResponseObject List { get; } = new InternalListAssistantsResponseObject(ListValue); + public static bool operator ==(InternalListAssistantsResponseObject left, InternalListAssistantsResponseObject right) => left.Equals(right); + public static bool operator !=(InternalListAssistantsResponseObject left, InternalListAssistantsResponseObject right) => !left.Equals(right); + public static implicit operator InternalListAssistantsResponseObject(string value) => new InternalListAssistantsResponseObject(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalListAssistantsResponseObject other && Equals(other); + public bool Equals(InternalListAssistantsResponseObject other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalListBatchesResponse.Serialization.cs b/.dotnet/src/Generated/Models/InternalListBatchesResponse.Serialization.cs new file mode 100644 index 000000000..432479b4b --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalListBatchesResponse.Serialization.cs @@ -0,0 +1,193 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Batch +{ + internal partial class InternalListBatchesResponse : IJsonModel<InternalListBatchesResponse> + { + void IJsonModel<InternalListBatchesResponse>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalListBatchesResponse>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalListBatchesResponse)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("data") != true) + { + writer.WritePropertyName("data"u8); + writer.WriteStartArray(); + foreach (var item in Data) + { + writer.WriteObjectValue(item, options); + } + writer.WriteEndArray(); + } + if (SerializedAdditionalRawData?.ContainsKey("first_id") != true && Optional.IsDefined(FirstId)) + { + writer.WritePropertyName("first_id"u8); + writer.WriteStringValue(FirstId); + } + if (SerializedAdditionalRawData?.ContainsKey("last_id") != true && Optional.IsDefined(LastId)) + { + writer.WritePropertyName("last_id"u8); + writer.WriteStringValue(LastId); + } + if (SerializedAdditionalRawData?.ContainsKey("has_more") != true) + { + writer.WritePropertyName("has_more"u8); + writer.WriteBooleanValue(HasMore); + } + if (SerializedAdditionalRawData?.ContainsKey("object") != true) + { + writer.WritePropertyName("object"u8); + writer.WriteStringValue(Object.ToString()); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalListBatchesResponse IJsonModel<InternalListBatchesResponse>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalListBatchesResponse>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalListBatchesResponse)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalListBatchesResponse(document.RootElement, options); + } + + internal static InternalListBatchesResponse DeserializeInternalListBatchesResponse(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + IReadOnlyList<InternalBatchJob> data = default; + string firstId = default; + string lastId = default; + bool hasMore = default; + InternalListBatchesResponseObject @object = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("data"u8)) + { + List<InternalBatchJob> array = new List<InternalBatchJob>(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(InternalBatchJob.DeserializeInternalBatchJob(item, options)); + } + data = array; + continue; + } + if (property.NameEquals("first_id"u8)) + { + firstId = property.Value.GetString(); + continue; + } + if (property.NameEquals("last_id"u8)) + { + lastId = property.Value.GetString(); + continue; + } + if (property.NameEquals("has_more"u8)) + { + hasMore = property.Value.GetBoolean(); + continue; + } + if (property.NameEquals("object"u8)) + { + @object = new InternalListBatchesResponseObject(property.Value.GetString()); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalListBatchesResponse( + data, + firstId, + lastId, + hasMore, + @object, + 
serializedAdditionalRawData); + } + + BinaryData IPersistableModel<InternalListBatchesResponse>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalListBatchesResponse>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalListBatchesResponse)} does not support writing '{options.Format}' format."); + } + } + + InternalListBatchesResponse IPersistableModel<InternalListBatchesResponse>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalListBatchesResponse>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalListBatchesResponse(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalListBatchesResponse)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalListBatchesResponse>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalListBatchesResponse FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalListBatchesResponse(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalListBatchesResponse.cs b/.dotnet/src/Generated/Models/InternalListBatchesResponse.cs new file mode 100644 index 000000000..c81c672cc --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalListBatchesResponse.cs @@ -0,0 +1,42 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.Collections.Generic; +using System.Linq; + +namespace OpenAI.Batch +{ + internal partial class InternalListBatchesResponse + { + internal IDictionary<string, BinaryData> 
SerializedAdditionalRawData { get; set; } + internal InternalListBatchesResponse(IEnumerable<InternalBatchJob> data, bool hasMore) + { + Argument.AssertNotNull(data, nameof(data)); + + Data = data.ToList(); + HasMore = hasMore; + } + + internal InternalListBatchesResponse(IReadOnlyList<InternalBatchJob> data, string firstId, string lastId, bool hasMore, InternalListBatchesResponseObject @object, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + Data = data; + FirstId = firstId; + LastId = lastId; + HasMore = hasMore; + Object = @object; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal InternalListBatchesResponse() + { + } + + public IReadOnlyList<InternalBatchJob> Data { get; } + public string FirstId { get; } + public string LastId { get; } + public bool HasMore { get; } + public InternalListBatchesResponseObject Object { get; } = InternalListBatchesResponseObject.List; + } +} diff --git a/.dotnet/src/Generated/Models/InternalListBatchesResponseObject.cs b/.dotnet/src/Generated/Models/InternalListBatchesResponseObject.cs new file mode 100644 index 000000000..3f562b70d --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalListBatchesResponseObject.cs @@ -0,0 +1,34 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Batch +{ + internal readonly partial struct InternalListBatchesResponseObject : IEquatable<InternalListBatchesResponseObject> + { + private readonly string _value; + + public InternalListBatchesResponseObject(string value) + { + _value = value ?? 
throw new ArgumentNullException(nameof(value)); + } + + private const string ListValue = "list"; + + public static InternalListBatchesResponseObject List { get; } = new InternalListBatchesResponseObject(ListValue); + public static bool operator ==(InternalListBatchesResponseObject left, InternalListBatchesResponseObject right) => left.Equals(right); + public static bool operator !=(InternalListBatchesResponseObject left, InternalListBatchesResponseObject right) => !left.Equals(right); + public static implicit operator InternalListBatchesResponseObject(string value) => new InternalListBatchesResponseObject(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalListBatchesResponseObject other && Equals(other); + public bool Equals(InternalListBatchesResponseObject other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalListFilesResponseObject.cs b/.dotnet/src/Generated/Models/InternalListFilesResponseObject.cs new file mode 100644 index 000000000..2b578ea18 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalListFilesResponseObject.cs @@ -0,0 +1,34 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Files +{ + internal readonly partial struct InternalListFilesResponseObject : IEquatable<InternalListFilesResponseObject> + { + private readonly string _value; + + public InternalListFilesResponseObject(string value) + { + _value = value ?? 
throw new ArgumentNullException(nameof(value)); + } + + private const string ListValue = "list"; + + public static InternalListFilesResponseObject List { get; } = new InternalListFilesResponseObject(ListValue); + public static bool operator ==(InternalListFilesResponseObject left, InternalListFilesResponseObject right) => left.Equals(right); + public static bool operator !=(InternalListFilesResponseObject left, InternalListFilesResponseObject right) => !left.Equals(right); + public static implicit operator InternalListFilesResponseObject(string value) => new InternalListFilesResponseObject(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalListFilesResponseObject other && Equals(other); + public bool Equals(InternalListFilesResponseObject other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalListFineTuningJobCheckpointsResponse.Serialization.cs b/.dotnet/src/Generated/Models/InternalListFineTuningJobCheckpointsResponse.Serialization.cs new file mode 100644 index 000000000..9078810a3 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalListFineTuningJobCheckpointsResponse.Serialization.cs @@ -0,0 +1,217 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.FineTuning +{ + internal partial class InternalListFineTuningJobCheckpointsResponse : IJsonModel<InternalListFineTuningJobCheckpointsResponse> + { + void IJsonModel<InternalListFineTuningJobCheckpointsResponse>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalListFineTuningJobCheckpointsResponse>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalListFineTuningJobCheckpointsResponse)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("data") != true) + { + writer.WritePropertyName("data"u8); + writer.WriteStartArray(); + foreach (var item in Data) + { + writer.WriteObjectValue(item, options); + } + writer.WriteEndArray(); + } + if (SerializedAdditionalRawData?.ContainsKey("object") != true) + { + writer.WritePropertyName("object"u8); + writer.WriteStringValue(Object.ToString()); + } + if (SerializedAdditionalRawData?.ContainsKey("first_id") != true && Optional.IsDefined(FirstId)) + { + if (FirstId != null) + { + writer.WritePropertyName("first_id"u8); + writer.WriteStringValue(FirstId); + } + else + { + writer.WriteNull("first_id"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("last_id") != true && Optional.IsDefined(LastId)) + { + if (LastId != null) + { + writer.WritePropertyName("last_id"u8); + writer.WriteStringValue(LastId); + } + else + { + writer.WriteNull("last_id"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("has_more") != true) + { + writer.WritePropertyName("has_more"u8); + writer.WriteBooleanValue(HasMore); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalListFineTuningJobCheckpointsResponse IJsonModel<InternalListFineTuningJobCheckpointsResponse>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalListFineTuningJobCheckpointsResponse>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalListFineTuningJobCheckpointsResponse)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalListFineTuningJobCheckpointsResponse(document.RootElement, options); + } + + internal static InternalListFineTuningJobCheckpointsResponse DeserializeInternalListFineTuningJobCheckpointsResponse(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + IReadOnlyList<InternalFineTuningJobCheckpoint> data = default; + InternalListFineTuningJobCheckpointsResponseObject @object = default; + string firstId = default; + string lastId = default; + bool hasMore = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("data"u8)) + { + List<InternalFineTuningJobCheckpoint> array = new List<InternalFineTuningJobCheckpoint>(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(InternalFineTuningJobCheckpoint.DeserializeInternalFineTuningJobCheckpoint(item, options)); + } + data = array; + continue; + } + if (property.NameEquals("object"u8)) + { + @object = new InternalListFineTuningJobCheckpointsResponseObject(property.Value.GetString()); + continue; + } + if (property.NameEquals("first_id"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + firstId = null; + continue; + } + firstId = property.Value.GetString(); + continue; + } + if (property.NameEquals("last_id"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + lastId = null; + continue; + } + lastId = property.Value.GetString(); + continue; + } + if (property.NameEquals("has_more"u8)) + { + hasMore = property.Value.GetBoolean(); + continue; + } + if (true) 
+ { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalListFineTuningJobCheckpointsResponse( + data, + @object, + firstId, + lastId, + hasMore, + serializedAdditionalRawData); + } + + BinaryData IPersistableModel<InternalListFineTuningJobCheckpointsResponse>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalListFineTuningJobCheckpointsResponse>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalListFineTuningJobCheckpointsResponse)} does not support writing '{options.Format}' format."); + } + } + + InternalListFineTuningJobCheckpointsResponse IPersistableModel<InternalListFineTuningJobCheckpointsResponse>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalListFineTuningJobCheckpointsResponse>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalListFineTuningJobCheckpointsResponse(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalListFineTuningJobCheckpointsResponse)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalListFineTuningJobCheckpointsResponse>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalListFineTuningJobCheckpointsResponse FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalListFineTuningJobCheckpointsResponse(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalListFineTuningJobCheckpointsResponse.cs 
b/.dotnet/src/Generated/Models/InternalListFineTuningJobCheckpointsResponse.cs new file mode 100644 index 000000000..f96a3d96a --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalListFineTuningJobCheckpointsResponse.cs @@ -0,0 +1,43 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; +using System.Linq; + +namespace OpenAI.FineTuning +{ + internal partial class InternalListFineTuningJobCheckpointsResponse + { + internal IDictionary SerializedAdditionalRawData { get; set; } + internal InternalListFineTuningJobCheckpointsResponse(IEnumerable data, bool hasMore) + { + Argument.AssertNotNull(data, nameof(data)); + + Data = data.ToList(); + HasMore = hasMore; + } + + internal InternalListFineTuningJobCheckpointsResponse(IReadOnlyList data, InternalListFineTuningJobCheckpointsResponseObject @object, string firstId, string lastId, bool hasMore, IDictionary serializedAdditionalRawData) + { + Data = data; + Object = @object; + FirstId = firstId; + LastId = lastId; + HasMore = hasMore; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal InternalListFineTuningJobCheckpointsResponse() + { + } + + public IReadOnlyList Data { get; } + public InternalListFineTuningJobCheckpointsResponseObject Object { get; } = InternalListFineTuningJobCheckpointsResponseObject.List; + + public string FirstId { get; } + public string LastId { get; } + public bool HasMore { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalListFineTuningJobCheckpointsResponseObject.cs b/.dotnet/src/Generated/Models/InternalListFineTuningJobCheckpointsResponseObject.cs new file mode 100644 index 000000000..8a9ca04a3 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalListFineTuningJobCheckpointsResponseObject.cs @@ -0,0 +1,34 @@ +// + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.FineTuning +{ + internal readonly partial struct InternalListFineTuningJobCheckpointsResponseObject : IEquatable + { + 
+        private readonly string _value;
+
+        public InternalListFineTuningJobCheckpointsResponseObject(string value)
+        {
+            _value = value ?? throw new ArgumentNullException(nameof(value));
+        }
+
+        private const string ListValue = "list";
+
+        public static InternalListFineTuningJobCheckpointsResponseObject List { get; } = new InternalListFineTuningJobCheckpointsResponseObject(ListValue);
+        public static bool operator ==(InternalListFineTuningJobCheckpointsResponseObject left, InternalListFineTuningJobCheckpointsResponseObject right) => left.Equals(right);
+        public static bool operator !=(InternalListFineTuningJobCheckpointsResponseObject left, InternalListFineTuningJobCheckpointsResponseObject right) => !left.Equals(right);
+        public static implicit operator InternalListFineTuningJobCheckpointsResponseObject(string value) => new InternalListFineTuningJobCheckpointsResponseObject(value);
+
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override bool Equals(object obj) => obj is InternalListFineTuningJobCheckpointsResponseObject other && Equals(other);
+        public bool Equals(InternalListFineTuningJobCheckpointsResponseObject other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase);
+
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override int GetHashCode() => _value != null ?
+            StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0;
+        public override string ToString() => _value;
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalListFineTuningJobEventsResponse.Serialization.cs b/.dotnet/src/Generated/Models/InternalListFineTuningJobEventsResponse.Serialization.cs
new file mode 100644
index 000000000..6dfd21c8f
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalListFineTuningJobEventsResponse.Serialization.cs
@@ -0,0 +1,154 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace OpenAI.FineTuning
+{
+    internal partial class InternalListFineTuningJobEventsResponse : IJsonModel<InternalListFineTuningJobEventsResponse>
+    {
+        void IJsonModel<InternalListFineTuningJobEventsResponse>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalListFineTuningJobEventsResponse>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(InternalListFineTuningJobEventsResponse)} does not support writing '{format}' format.");
+            }
+
+            writer.WriteStartObject();
+            if (SerializedAdditionalRawData?.ContainsKey("data") != true)
+            {
+                writer.WritePropertyName("data"u8);
+                writer.WriteStartArray();
+                foreach (var item in Data)
+                {
+                    writer.WriteObjectValue(item, options);
+                }
+                writer.WriteEndArray();
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("object") != true)
+            {
+                writer.WritePropertyName("object"u8);
+                writer.WriteStringValue(Object.ToString());
+            }
+            if (SerializedAdditionalRawData != null)
+            {
+                foreach (var item in SerializedAdditionalRawData)
+                {
+                    if (ModelSerializationExtensions.IsSentinelValue(item.Value))
+                    {
+                        continue;
+                    }
+                    writer.WritePropertyName(item.Key);
+#if NET6_0_OR_GREATER
+                    writer.WriteRawValue(item.Value);
+#else
+                    using (JsonDocument document = JsonDocument.Parse(item.Value))
+                    {
+                        JsonSerializer.Serialize(writer, document.RootElement);
+                    }
+#endif
+                }
+            }
+            writer.WriteEndObject();
+        }
+
+        InternalListFineTuningJobEventsResponse IJsonModel<InternalListFineTuningJobEventsResponse>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalListFineTuningJobEventsResponse>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(InternalListFineTuningJobEventsResponse)} does not support reading '{format}' format.");
+            }
+
+            using JsonDocument document = JsonDocument.ParseValue(ref reader);
+            return DeserializeInternalListFineTuningJobEventsResponse(document.RootElement, options);
+        }
+
+        internal static InternalListFineTuningJobEventsResponse DeserializeInternalListFineTuningJobEventsResponse(JsonElement element, ModelReaderWriterOptions options = null)
+        {
+            options ??= ModelSerializationExtensions.WireOptions;
+
+            if (element.ValueKind == JsonValueKind.Null)
+            {
+                return null;
+            }
+            IReadOnlyList<InternalFineTuningJobEvent> data = default;
+            InternalListFineTuningJobEventsResponseObject @object = default;
+            IDictionary<string, BinaryData> serializedAdditionalRawData = default;
+            Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>();
+            foreach (var property in element.EnumerateObject())
+            {
+                if (property.NameEquals("data"u8))
+                {
+                    List<InternalFineTuningJobEvent> array = new List<InternalFineTuningJobEvent>();
+                    foreach (var item in property.Value.EnumerateArray())
+                    {
+                        array.Add(InternalFineTuningJobEvent.DeserializeInternalFineTuningJobEvent(item, options));
+                    }
+                    data = array;
+                    continue;
+                }
+                if (property.NameEquals("object"u8))
+                {
+                    @object = new InternalListFineTuningJobEventsResponseObject(property.Value.GetString());
+                    continue;
+                }
+                if (true)
+                {
+                    rawDataDictionary ??= new Dictionary<string, BinaryData>();
+                    rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText()));
+                }
+            }
+            serializedAdditionalRawData = rawDataDictionary;
+            return new InternalListFineTuningJobEventsResponse(data, @object, serializedAdditionalRawData);
+        }
+
+        BinaryData IPersistableModel<InternalListFineTuningJobEventsResponse>.Write(ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalListFineTuningJobEventsResponse>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    return ModelReaderWriter.Write(this, options);
+                default:
+                    throw new FormatException($"The model {nameof(InternalListFineTuningJobEventsResponse)} does not support writing '{options.Format}' format.");
+            }
+        }
+
+        InternalListFineTuningJobEventsResponse IPersistableModel<InternalListFineTuningJobEventsResponse>.Create(BinaryData data, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalListFineTuningJobEventsResponse>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    {
+                        using JsonDocument document = JsonDocument.Parse(data);
+                        return DeserializeInternalListFineTuningJobEventsResponse(document.RootElement, options);
+                    }
+                default:
+                    throw new FormatException($"The model {nameof(InternalListFineTuningJobEventsResponse)} does not support reading '{options.Format}' format.");
+            }
+        }
+
+        string IPersistableModel<InternalListFineTuningJobEventsResponse>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J";
+
+        internal static InternalListFineTuningJobEventsResponse FromResponse(PipelineResponse response)
+        {
+            using var document = JsonDocument.Parse(response.Content);
+            return DeserializeInternalListFineTuningJobEventsResponse(document.RootElement);
+        }
+
+        internal virtual BinaryContent ToBinaryContent()
+        {
+            return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions);
+        }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalListFineTuningJobEventsResponse.cs b/.dotnet/src/Generated/Models/InternalListFineTuningJobEventsResponse.cs
new file mode 100644
index 000000000..48fa4db44
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalListFineTuningJobEventsResponse.cs
@@ -0,0 +1,35 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.Collections.Generic;
+using System.Linq;
+
+namespace OpenAI.FineTuning
+{
+    internal partial class InternalListFineTuningJobEventsResponse
+    {
+        internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; }
+        internal InternalListFineTuningJobEventsResponse(IEnumerable<InternalFineTuningJobEvent> data)
+        {
+            Argument.AssertNotNull(data, nameof(data));
+
+            Data = data.ToList();
+        }
+
+        internal InternalListFineTuningJobEventsResponse(IReadOnlyList<InternalFineTuningJobEvent> data, InternalListFineTuningJobEventsResponseObject @object, IDictionary<string, BinaryData> serializedAdditionalRawData)
+        {
+            Data = data;
+            Object = @object;
+            SerializedAdditionalRawData = serializedAdditionalRawData;
+        }
+
+        internal InternalListFineTuningJobEventsResponse()
+        {
+        }
+
+        public IReadOnlyList<InternalFineTuningJobEvent> Data { get; }
+        public InternalListFineTuningJobEventsResponseObject Object { get; } = InternalListFineTuningJobEventsResponseObject.List;
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalListFineTuningJobEventsResponseObject.cs b/.dotnet/src/Generated/Models/InternalListFineTuningJobEventsResponseObject.cs
new file mode 100644
index 000000000..5f4aa8afb
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalListFineTuningJobEventsResponseObject.cs
@@ -0,0 +1,34 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ComponentModel;
+
+namespace OpenAI.FineTuning
+{
+    internal readonly partial struct InternalListFineTuningJobEventsResponseObject : IEquatable<InternalListFineTuningJobEventsResponseObject>
+    {
+        private readonly string _value;
+
+        public InternalListFineTuningJobEventsResponseObject(string value)
+        {
+            _value = value ?? throw new ArgumentNullException(nameof(value));
+        }
+
+        private const string ListValue = "list";
+
+        public static InternalListFineTuningJobEventsResponseObject List { get; } = new InternalListFineTuningJobEventsResponseObject(ListValue);
+        public static bool operator ==(InternalListFineTuningJobEventsResponseObject left, InternalListFineTuningJobEventsResponseObject right) => left.Equals(right);
+        public static bool operator !=(InternalListFineTuningJobEventsResponseObject left, InternalListFineTuningJobEventsResponseObject right) => !left.Equals(right);
+        public static implicit operator InternalListFineTuningJobEventsResponseObject(string value) => new InternalListFineTuningJobEventsResponseObject(value);
+
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override bool Equals(object obj) => obj is InternalListFineTuningJobEventsResponseObject other && Equals(other);
+        public bool Equals(InternalListFineTuningJobEventsResponseObject other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase);
+
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0;
+        public override string ToString() => _value;
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalListMessagesResponse.Serialization.cs b/.dotnet/src/Generated/Models/InternalListMessagesResponse.Serialization.cs
new file mode 100644
index 000000000..f91fc9dec
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalListMessagesResponse.Serialization.cs
@@ -0,0 +1,193 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace OpenAI.Assistants
+{
+    internal partial class InternalListMessagesResponse : IJsonModel<InternalListMessagesResponse>
+    {
+        void IJsonModel<InternalListMessagesResponse>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ?
+                ((IPersistableModel<InternalListMessagesResponse>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(InternalListMessagesResponse)} does not support writing '{format}' format.");
+            }
+
+            writer.WriteStartObject();
+            if (SerializedAdditionalRawData?.ContainsKey("object") != true)
+            {
+                writer.WritePropertyName("object"u8);
+                writer.WriteStringValue(Object.ToString());
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("data") != true)
+            {
+                writer.WritePropertyName("data"u8);
+                writer.WriteStartArray();
+                foreach (var item in Data)
+                {
+                    writer.WriteObjectValue(item, options);
+                }
+                writer.WriteEndArray();
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("first_id") != true)
+            {
+                writer.WritePropertyName("first_id"u8);
+                writer.WriteStringValue(FirstId);
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("last_id") != true)
+            {
+                writer.WritePropertyName("last_id"u8);
+                writer.WriteStringValue(LastId);
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("has_more") != true)
+            {
+                writer.WritePropertyName("has_more"u8);
+                writer.WriteBooleanValue(HasMore);
+            }
+            if (SerializedAdditionalRawData != null)
+            {
+                foreach (var item in SerializedAdditionalRawData)
+                {
+                    if (ModelSerializationExtensions.IsSentinelValue(item.Value))
+                    {
+                        continue;
+                    }
+                    writer.WritePropertyName(item.Key);
+#if NET6_0_OR_GREATER
+                    writer.WriteRawValue(item.Value);
+#else
+                    using (JsonDocument document = JsonDocument.Parse(item.Value))
+                    {
+                        JsonSerializer.Serialize(writer, document.RootElement);
+                    }
+#endif
+                }
+            }
+            writer.WriteEndObject();
+        }
+
+        InternalListMessagesResponse IJsonModel<InternalListMessagesResponse>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalListMessagesResponse>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(InternalListMessagesResponse)} does not support reading '{format}' format.");
+            }
+
+            using JsonDocument document = JsonDocument.ParseValue(ref reader);
+            return DeserializeInternalListMessagesResponse(document.RootElement, options);
+        }
+
+        internal static InternalListMessagesResponse DeserializeInternalListMessagesResponse(JsonElement element, ModelReaderWriterOptions options = null)
+        {
+            options ??= ModelSerializationExtensions.WireOptions;
+
+            if (element.ValueKind == JsonValueKind.Null)
+            {
+                return null;
+            }
+            InternalListMessagesResponseObject @object = default;
+            IReadOnlyList<ThreadMessage> data = default;
+            string firstId = default;
+            string lastId = default;
+            bool hasMore = default;
+            IDictionary<string, BinaryData> serializedAdditionalRawData = default;
+            Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>();
+            foreach (var property in element.EnumerateObject())
+            {
+                if (property.NameEquals("object"u8))
+                {
+                    @object = new InternalListMessagesResponseObject(property.Value.GetString());
+                    continue;
+                }
+                if (property.NameEquals("data"u8))
+                {
+                    List<ThreadMessage> array = new List<ThreadMessage>();
+                    foreach (var item in property.Value.EnumerateArray())
+                    {
+                        array.Add(ThreadMessage.DeserializeThreadMessage(item, options));
+                    }
+                    data = array;
+                    continue;
+                }
+                if (property.NameEquals("first_id"u8))
+                {
+                    firstId = property.Value.GetString();
+                    continue;
+                }
+                if (property.NameEquals("last_id"u8))
+                {
+                    lastId = property.Value.GetString();
+                    continue;
+                }
+                if (property.NameEquals("has_more"u8))
+                {
+                    hasMore = property.Value.GetBoolean();
+                    continue;
+                }
+                if (true)
+                {
+                    rawDataDictionary ??= new Dictionary<string, BinaryData>();
+                    rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText()));
+                }
+            }
+            serializedAdditionalRawData = rawDataDictionary;
+            return new InternalListMessagesResponse(
+                @object,
+                data,
+                firstId,
+                lastId,
+                hasMore,
+                serializedAdditionalRawData);
+        }
+
+        BinaryData IPersistableModel<InternalListMessagesResponse>.Write(ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalListMessagesResponse>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    return ModelReaderWriter.Write(this, options);
+                default:
+                    throw new FormatException($"The model {nameof(InternalListMessagesResponse)} does not support writing '{options.Format}' format.");
+            }
+        }
+
+        InternalListMessagesResponse IPersistableModel<InternalListMessagesResponse>.Create(BinaryData data, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalListMessagesResponse>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    {
+                        using JsonDocument document = JsonDocument.Parse(data);
+                        return DeserializeInternalListMessagesResponse(document.RootElement, options);
+                    }
+                default:
+                    throw new FormatException($"The model {nameof(InternalListMessagesResponse)} does not support reading '{options.Format}' format.");
+            }
+        }
+
+        string IPersistableModel<InternalListMessagesResponse>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J";
+
+        internal static InternalListMessagesResponse FromResponse(PipelineResponse response)
+        {
+            using var document = JsonDocument.Parse(response.Content);
+            return DeserializeInternalListMessagesResponse(document.RootElement);
+        }
+
+        internal virtual BinaryContent ToBinaryContent()
+        {
+            return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions);
+        }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalListMessagesResponse.cs b/.dotnet/src/Generated/Models/InternalListMessagesResponse.cs
new file mode 100644
index 000000000..8fd5be212
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalListMessagesResponse.cs
@@ -0,0 +1,47 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.Collections.Generic;
+using System.Linq;
+
+namespace OpenAI.Assistants
+{
+    internal partial class InternalListMessagesResponse
+    {
+        internal IDictionary<string, BinaryData>
+            SerializedAdditionalRawData { get; set; }
+        internal InternalListMessagesResponse(IEnumerable<ThreadMessage> data, string firstId, string lastId, bool hasMore)
+        {
+            Argument.AssertNotNull(data, nameof(data));
+            Argument.AssertNotNull(firstId, nameof(firstId));
+            Argument.AssertNotNull(lastId, nameof(lastId));
+
+            Data = data.ToList();
+            FirstId = firstId;
+            LastId = lastId;
+            HasMore = hasMore;
+        }
+
+        internal InternalListMessagesResponse(InternalListMessagesResponseObject @object, IReadOnlyList<ThreadMessage> data, string firstId, string lastId, bool hasMore, IDictionary<string, BinaryData> serializedAdditionalRawData)
+        {
+            Object = @object;
+            Data = data;
+            FirstId = firstId;
+            LastId = lastId;
+            HasMore = hasMore;
+            SerializedAdditionalRawData = serializedAdditionalRawData;
+        }
+
+        internal InternalListMessagesResponse()
+        {
+        }
+
+        public InternalListMessagesResponseObject Object { get; } = InternalListMessagesResponseObject.List;
+
+        public IReadOnlyList<ThreadMessage> Data { get; }
+        public string FirstId { get; }
+        public string LastId { get; }
+        public bool HasMore { get; }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalListMessagesResponseObject.cs b/.dotnet/src/Generated/Models/InternalListMessagesResponseObject.cs
new file mode 100644
index 000000000..44e0d2aba
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalListMessagesResponseObject.cs
@@ -0,0 +1,34 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ComponentModel;
+
+namespace OpenAI.Assistants
+{
+    internal readonly partial struct InternalListMessagesResponseObject : IEquatable<InternalListMessagesResponseObject>
+    {
+        private readonly string _value;
+
+        public InternalListMessagesResponseObject(string value)
+        {
+            _value = value ?? throw new ArgumentNullException(nameof(value));
+        }
+
+        private const string ListValue = "list";
+
+        public static InternalListMessagesResponseObject List { get; } = new InternalListMessagesResponseObject(ListValue);
+        public static bool operator ==(InternalListMessagesResponseObject left, InternalListMessagesResponseObject right) => left.Equals(right);
+        public static bool operator !=(InternalListMessagesResponseObject left, InternalListMessagesResponseObject right) => !left.Equals(right);
+        public static implicit operator InternalListMessagesResponseObject(string value) => new InternalListMessagesResponseObject(value);
+
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override bool Equals(object obj) => obj is InternalListMessagesResponseObject other && Equals(other);
+        public bool Equals(InternalListMessagesResponseObject other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase);
+
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0;
+        public override string ToString() => _value;
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalListModelsResponseObject.cs b/.dotnet/src/Generated/Models/InternalListModelsResponseObject.cs
new file mode 100644
index 000000000..29abbc2c7
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalListModelsResponseObject.cs
@@ -0,0 +1,34 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ComponentModel;
+
+namespace OpenAI.Models
+{
+    internal readonly partial struct InternalListModelsResponseObject : IEquatable<InternalListModelsResponseObject>
+    {
+        private readonly string _value;
+
+        public InternalListModelsResponseObject(string value)
+        {
+            _value = value ??
+                throw new ArgumentNullException(nameof(value));
+        }
+
+        private const string ListValue = "list";
+
+        public static InternalListModelsResponseObject List { get; } = new InternalListModelsResponseObject(ListValue);
+        public static bool operator ==(InternalListModelsResponseObject left, InternalListModelsResponseObject right) => left.Equals(right);
+        public static bool operator !=(InternalListModelsResponseObject left, InternalListModelsResponseObject right) => !left.Equals(right);
+        public static implicit operator InternalListModelsResponseObject(string value) => new InternalListModelsResponseObject(value);
+
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override bool Equals(object obj) => obj is InternalListModelsResponseObject other && Equals(other);
+        public bool Equals(InternalListModelsResponseObject other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase);
+
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0;
+        public override string ToString() => _value;
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalListPaginatedFineTuningJobsResponse.Serialization.cs b/.dotnet/src/Generated/Models/InternalListPaginatedFineTuningJobsResponse.Serialization.cs
new file mode 100644
index 000000000..0383ddb8f
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalListPaginatedFineTuningJobsResponse.Serialization.cs
@@ -0,0 +1,165 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace OpenAI.FineTuning
+{
+    internal partial class InternalListPaginatedFineTuningJobsResponse : IJsonModel<InternalListPaginatedFineTuningJobsResponse>
+    {
+        void IJsonModel<InternalListPaginatedFineTuningJobsResponse>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ?
+                ((IPersistableModel<InternalListPaginatedFineTuningJobsResponse>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(InternalListPaginatedFineTuningJobsResponse)} does not support writing '{format}' format.");
+            }
+
+            writer.WriteStartObject();
+            if (SerializedAdditionalRawData?.ContainsKey("data") != true)
+            {
+                writer.WritePropertyName("data"u8);
+                writer.WriteStartArray();
+                foreach (var item in Data)
+                {
+                    writer.WriteObjectValue(item, options);
+                }
+                writer.WriteEndArray();
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("has_more") != true)
+            {
+                writer.WritePropertyName("has_more"u8);
+                writer.WriteBooleanValue(HasMore);
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("object") != true)
+            {
+                writer.WritePropertyName("object"u8);
+                writer.WriteStringValue(Object.ToString());
+            }
+            if (SerializedAdditionalRawData != null)
+            {
+                foreach (var item in SerializedAdditionalRawData)
+                {
+                    if (ModelSerializationExtensions.IsSentinelValue(item.Value))
+                    {
+                        continue;
+                    }
+                    writer.WritePropertyName(item.Key);
+#if NET6_0_OR_GREATER
+                    writer.WriteRawValue(item.Value);
+#else
+                    using (JsonDocument document = JsonDocument.Parse(item.Value))
+                    {
+                        JsonSerializer.Serialize(writer, document.RootElement);
+                    }
+#endif
+                }
+            }
+            writer.WriteEndObject();
+        }
+
+        InternalListPaginatedFineTuningJobsResponse IJsonModel<InternalListPaginatedFineTuningJobsResponse>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalListPaginatedFineTuningJobsResponse>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(InternalListPaginatedFineTuningJobsResponse)} does not support reading '{format}' format.");
+            }
+
+            using JsonDocument document = JsonDocument.ParseValue(ref reader);
+            return DeserializeInternalListPaginatedFineTuningJobsResponse(document.RootElement, options);
+        }
+
+        internal static InternalListPaginatedFineTuningJobsResponse DeserializeInternalListPaginatedFineTuningJobsResponse(JsonElement element, ModelReaderWriterOptions options = null)
+        {
+            options ??= ModelSerializationExtensions.WireOptions;
+
+            if (element.ValueKind == JsonValueKind.Null)
+            {
+                return null;
+            }
+            IReadOnlyList<InternalFineTuningJob> data = default;
+            bool hasMore = default;
+            InternalListPaginatedFineTuningJobsResponseObject @object = default;
+            IDictionary<string, BinaryData> serializedAdditionalRawData = default;
+            Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>();
+            foreach (var property in element.EnumerateObject())
+            {
+                if (property.NameEquals("data"u8))
+                {
+                    List<InternalFineTuningJob> array = new List<InternalFineTuningJob>();
+                    foreach (var item in property.Value.EnumerateArray())
+                    {
+                        array.Add(InternalFineTuningJob.DeserializeInternalFineTuningJob(item, options));
+                    }
+                    data = array;
+                    continue;
+                }
+                if (property.NameEquals("has_more"u8))
+                {
+                    hasMore = property.Value.GetBoolean();
+                    continue;
+                }
+                if (property.NameEquals("object"u8))
+                {
+                    @object = new InternalListPaginatedFineTuningJobsResponseObject(property.Value.GetString());
+                    continue;
+                }
+                if (true)
+                {
+                    rawDataDictionary ??= new Dictionary<string, BinaryData>();
+                    rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText()));
+                }
+            }
+            serializedAdditionalRawData = rawDataDictionary;
+            return new InternalListPaginatedFineTuningJobsResponse(data, hasMore, @object, serializedAdditionalRawData);
+        }
+
+        BinaryData IPersistableModel<InternalListPaginatedFineTuningJobsResponse>.Write(ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalListPaginatedFineTuningJobsResponse>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    return ModelReaderWriter.Write(this, options);
+                default:
+                    throw new FormatException($"The model {nameof(InternalListPaginatedFineTuningJobsResponse)} does not support writing '{options.Format}' format.");
+            }
+        }
+
+        InternalListPaginatedFineTuningJobsResponse IPersistableModel<InternalListPaginatedFineTuningJobsResponse>.Create(BinaryData data, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalListPaginatedFineTuningJobsResponse>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    {
+                        using JsonDocument document = JsonDocument.Parse(data);
+                        return DeserializeInternalListPaginatedFineTuningJobsResponse(document.RootElement, options);
+                    }
+                default:
+                    throw new FormatException($"The model {nameof(InternalListPaginatedFineTuningJobsResponse)} does not support reading '{options.Format}' format.");
+            }
+        }
+
+        string IPersistableModel<InternalListPaginatedFineTuningJobsResponse>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J";
+
+        internal static InternalListPaginatedFineTuningJobsResponse FromResponse(PipelineResponse response)
+        {
+            using var document = JsonDocument.Parse(response.Content);
+            return DeserializeInternalListPaginatedFineTuningJobsResponse(document.RootElement);
+        }
+
+        internal virtual BinaryContent ToBinaryContent()
+        {
+            return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions);
+        }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalListPaginatedFineTuningJobsResponse.cs b/.dotnet/src/Generated/Models/InternalListPaginatedFineTuningJobsResponse.cs
new file mode 100644
index 000000000..28ba3af0b
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalListPaginatedFineTuningJobsResponse.cs
@@ -0,0 +1,38 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.Collections.Generic;
+using System.Linq;
+
+namespace OpenAI.FineTuning
+{
+    internal partial class InternalListPaginatedFineTuningJobsResponse
+    {
+        internal IDictionary<string, BinaryData>
+            SerializedAdditionalRawData { get; set; }
+        internal InternalListPaginatedFineTuningJobsResponse(IEnumerable<InternalFineTuningJob> data, bool hasMore)
+        {
+            Argument.AssertNotNull(data, nameof(data));
+
+            Data = data.ToList();
+            HasMore = hasMore;
+        }
+
+        internal InternalListPaginatedFineTuningJobsResponse(IReadOnlyList<InternalFineTuningJob> data, bool hasMore, InternalListPaginatedFineTuningJobsResponseObject @object, IDictionary<string, BinaryData> serializedAdditionalRawData)
+        {
+            Data = data;
+            HasMore = hasMore;
+            Object = @object;
+            SerializedAdditionalRawData = serializedAdditionalRawData;
+        }
+
+        internal InternalListPaginatedFineTuningJobsResponse()
+        {
+        }
+
+        public IReadOnlyList<InternalFineTuningJob> Data { get; }
+        public bool HasMore { get; }
+        public InternalListPaginatedFineTuningJobsResponseObject Object { get; } = InternalListPaginatedFineTuningJobsResponseObject.List;
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalListPaginatedFineTuningJobsResponseObject.cs b/.dotnet/src/Generated/Models/InternalListPaginatedFineTuningJobsResponseObject.cs
new file mode 100644
index 000000000..b9dcd5fb9
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalListPaginatedFineTuningJobsResponseObject.cs
@@ -0,0 +1,34 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ComponentModel;
+
+namespace OpenAI.FineTuning
+{
+    internal readonly partial struct InternalListPaginatedFineTuningJobsResponseObject : IEquatable<InternalListPaginatedFineTuningJobsResponseObject>
+    {
+        private readonly string _value;
+
+        public InternalListPaginatedFineTuningJobsResponseObject(string value)
+        {
+            _value = value ?? throw new ArgumentNullException(nameof(value));
+        }
+
+        private const string ListValue = "list";
+
+        public static InternalListPaginatedFineTuningJobsResponseObject List { get; } = new InternalListPaginatedFineTuningJobsResponseObject(ListValue);
+        public static bool operator ==(InternalListPaginatedFineTuningJobsResponseObject left, InternalListPaginatedFineTuningJobsResponseObject right) => left.Equals(right);
+        public static bool operator !=(InternalListPaginatedFineTuningJobsResponseObject left, InternalListPaginatedFineTuningJobsResponseObject right) => !left.Equals(right);
+        public static implicit operator InternalListPaginatedFineTuningJobsResponseObject(string value) => new InternalListPaginatedFineTuningJobsResponseObject(value);
+
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override bool Equals(object obj) => obj is InternalListPaginatedFineTuningJobsResponseObject other && Equals(other);
+        public bool Equals(InternalListPaginatedFineTuningJobsResponseObject other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase);
+
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0;
+        public override string ToString() => _value;
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalListRunStepsResponse.Serialization.cs b/.dotnet/src/Generated/Models/InternalListRunStepsResponse.Serialization.cs
new file mode 100644
index 000000000..74e2033af
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalListRunStepsResponse.Serialization.cs
@@ -0,0 +1,193 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace OpenAI.Assistants
+{
+    internal partial class InternalListRunStepsResponse : IJsonModel<InternalListRunStepsResponse>
+    {
+        void IJsonModel<InternalListRunStepsResponse>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalListRunStepsResponse>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(InternalListRunStepsResponse)} does not support writing '{format}' format.");
+            }
+
+            writer.WriteStartObject();
+            if (SerializedAdditionalRawData?.ContainsKey("object") != true)
+            {
+                writer.WritePropertyName("object"u8);
+                writer.WriteStringValue(Object.ToString());
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("data") != true)
+            {
+                writer.WritePropertyName("data"u8);
+                writer.WriteStartArray();
+                foreach (var item in Data)
+                {
+                    writer.WriteObjectValue(item, options);
+                }
+                writer.WriteEndArray();
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("first_id") != true)
+            {
+                writer.WritePropertyName("first_id"u8);
+                writer.WriteStringValue(FirstId);
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("last_id") != true)
+            {
+                writer.WritePropertyName("last_id"u8);
+                writer.WriteStringValue(LastId);
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("has_more") != true)
+            {
+                writer.WritePropertyName("has_more"u8);
+                writer.WriteBooleanValue(HasMore);
+            }
+            if (SerializedAdditionalRawData
!= null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalListRunStepsResponse IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalListRunStepsResponse)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalListRunStepsResponse(document.RootElement, options); + } + + internal static InternalListRunStepsResponse DeserializeInternalListRunStepsResponse(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + InternalListRunStepsResponseObject @object = default; + IReadOnlyList data = default; + string firstId = default; + string lastId = default; + bool hasMore = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("object"u8)) + { + @object = new InternalListRunStepsResponseObject(property.Value.GetString()); + continue; + } + if (property.NameEquals("data"u8)) + { + List array = new List(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(RunStep.DeserializeRunStep(item, options)); + } + data = array; + continue; + } + if (property.NameEquals("first_id"u8)) + { + 
firstId = property.Value.GetString(); + continue; + } + if (property.NameEquals("last_id"u8)) + { + lastId = property.Value.GetString(); + continue; + } + if (property.NameEquals("has_more"u8)) + { + hasMore = property.Value.GetBoolean(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalListRunStepsResponse( + @object, + data, + firstId, + lastId, + hasMore, + serializedAdditionalRawData); + } + + BinaryData IPersistableModel<InternalListRunStepsResponse>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalListRunStepsResponse>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalListRunStepsResponse)} does not support writing '{options.Format}' format."); + } + } + + InternalListRunStepsResponse IPersistableModel<InternalListRunStepsResponse>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ?
((IPersistableModel<InternalListRunStepsResponse>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalListRunStepsResponse(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalListRunStepsResponse)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalListRunStepsResponse>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalListRunStepsResponse FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalListRunStepsResponse(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalListRunStepsResponse.cs b/.dotnet/src/Generated/Models/InternalListRunStepsResponse.cs new file mode 100644 index 000000000..4ef3eb223 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalListRunStepsResponse.cs @@ -0,0 +1,47 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.Collections.Generic; +using System.Linq; + +namespace OpenAI.Assistants +{ + internal partial class InternalListRunStepsResponse + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + internal InternalListRunStepsResponse(IEnumerable<RunStep> data, string firstId, string lastId, bool hasMore) + { + Argument.AssertNotNull(data, nameof(data)); + Argument.AssertNotNull(firstId, nameof(firstId)); + Argument.AssertNotNull(lastId, nameof(lastId)); + + Data = data.ToList(); + FirstId = firstId; + LastId = lastId; + HasMore = hasMore; + } + + internal InternalListRunStepsResponse(InternalListRunStepsResponseObject @object, IReadOnlyList<RunStep> data, string firstId, string lastId, bool hasMore, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + Object = @object; + Data = data; +
FirstId = firstId; + LastId = lastId; + HasMore = hasMore; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal InternalListRunStepsResponse() + { + } + + public InternalListRunStepsResponseObject Object { get; } = InternalListRunStepsResponseObject.List; + + public IReadOnlyList<RunStep> Data { get; } + public string FirstId { get; } + public string LastId { get; } + public bool HasMore { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalListRunStepsResponseObject.cs b/.dotnet/src/Generated/Models/InternalListRunStepsResponseObject.cs new file mode 100644 index 000000000..c554f8ae4 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalListRunStepsResponseObject.cs @@ -0,0 +1,34 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Assistants +{ + internal readonly partial struct InternalListRunStepsResponseObject : IEquatable<InternalListRunStepsResponseObject> + { + private readonly string _value; + + public InternalListRunStepsResponseObject(string value) + { + _value = value ??
throw new ArgumentNullException(nameof(value)); + } + + private const string ListValue = "list"; + + public static InternalListRunStepsResponseObject List { get; } = new InternalListRunStepsResponseObject(ListValue); + public static bool operator ==(InternalListRunStepsResponseObject left, InternalListRunStepsResponseObject right) => left.Equals(right); + public static bool operator !=(InternalListRunStepsResponseObject left, InternalListRunStepsResponseObject right) => !left.Equals(right); + public static implicit operator InternalListRunStepsResponseObject(string value) => new InternalListRunStepsResponseObject(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalListRunStepsResponseObject other && Equals(other); + public bool Equals(InternalListRunStepsResponseObject other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalListRunsResponse.Serialization.cs b/.dotnet/src/Generated/Models/InternalListRunsResponse.Serialization.cs new file mode 100644 index 000000000..049905e5d --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalListRunsResponse.Serialization.cs @@ -0,0 +1,193 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class InternalListRunsResponse : IJsonModel<InternalListRunsResponse> + { + void IJsonModel<InternalListRunsResponse>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ?
((IPersistableModel<InternalListRunsResponse>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalListRunsResponse)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("object") != true) + { + writer.WritePropertyName("object"u8); + writer.WriteStringValue(Object.ToString()); + } + if (SerializedAdditionalRawData?.ContainsKey("data") != true) + { + writer.WritePropertyName("data"u8); + writer.WriteStartArray(); + foreach (var item in Data) + { + writer.WriteObjectValue(item, options); + } + writer.WriteEndArray(); + } + if (SerializedAdditionalRawData?.ContainsKey("first_id") != true) + { + writer.WritePropertyName("first_id"u8); + writer.WriteStringValue(FirstId); + } + if (SerializedAdditionalRawData?.ContainsKey("last_id") != true) + { + writer.WritePropertyName("last_id"u8); + writer.WriteStringValue(LastId); + } + if (SerializedAdditionalRawData?.ContainsKey("has_more") != true) + { + writer.WritePropertyName("has_more"u8); + writer.WriteBooleanValue(HasMore); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalListRunsResponse IJsonModel<InternalListRunsResponse>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ?
((IPersistableModel<InternalListRunsResponse>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalListRunsResponse)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalListRunsResponse(document.RootElement, options); + } + + internal static InternalListRunsResponse DeserializeInternalListRunsResponse(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + InternalListRunsResponseObject @object = default; + IReadOnlyList<ThreadRun> data = default; + string firstId = default; + string lastId = default; + bool hasMore = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("object"u8)) + { + @object = new InternalListRunsResponseObject(property.Value.GetString()); + continue; + } + if (property.NameEquals("data"u8)) + { + List<ThreadRun> array = new List<ThreadRun>(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(ThreadRun.DeserializeThreadRun(item, options)); + } + data = array; + continue; + } + if (property.NameEquals("first_id"u8)) + { + firstId = property.Value.GetString(); + continue; + } + if (property.NameEquals("last_id"u8)) + { + lastId = property.Value.GetString(); + continue; + } + if (property.NameEquals("has_more"u8)) + { + hasMore = property.Value.GetBoolean(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalListRunsResponse( + @object, + data, + firstId, + lastId, + hasMore, + serializedAdditionalRawData); + } + + BinaryData
IPersistableModel<InternalListRunsResponse>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalListRunsResponse>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalListRunsResponse)} does not support writing '{options.Format}' format."); + } + } + + InternalListRunsResponse IPersistableModel<InternalListRunsResponse>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalListRunsResponse>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalListRunsResponse(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalListRunsResponse)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalListRunsResponse>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalListRunsResponse FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalListRunsResponse(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalListRunsResponse.cs b/.dotnet/src/Generated/Models/InternalListRunsResponse.cs new file mode 100644 index 000000000..4f10784e6 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalListRunsResponse.cs @@ -0,0 +1,47 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.Collections.Generic; +using System.Linq; + +namespace OpenAI.Assistants +{ + internal partial class InternalListRunsResponse + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + internal InternalListRunsResponse(IEnumerable<ThreadRun> data,
string firstId, string lastId, bool hasMore) + { + Argument.AssertNotNull(data, nameof(data)); + Argument.AssertNotNull(firstId, nameof(firstId)); + Argument.AssertNotNull(lastId, nameof(lastId)); + + Data = data.ToList(); + FirstId = firstId; + LastId = lastId; + HasMore = hasMore; + } + + internal InternalListRunsResponse(InternalListRunsResponseObject @object, IReadOnlyList<ThreadRun> data, string firstId, string lastId, bool hasMore, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + Object = @object; + Data = data; + FirstId = firstId; + LastId = lastId; + HasMore = hasMore; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal InternalListRunsResponse() + { + } + + public InternalListRunsResponseObject Object { get; } = InternalListRunsResponseObject.List; + + public IReadOnlyList<ThreadRun> Data { get; } + public string FirstId { get; } + public string LastId { get; } + public bool HasMore { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalListRunsResponseObject.cs b/.dotnet/src/Generated/Models/InternalListRunsResponseObject.cs new file mode 100644 index 000000000..68d723f1c --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalListRunsResponseObject.cs @@ -0,0 +1,34 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Assistants +{ + internal readonly partial struct InternalListRunsResponseObject : IEquatable<InternalListRunsResponseObject> + { + private readonly string _value; + + public InternalListRunsResponseObject(string value) + { + _value = value ??
throw new ArgumentNullException(nameof(value)); + } + + private const string ListValue = "list"; + + public static InternalListRunsResponseObject List { get; } = new InternalListRunsResponseObject(ListValue); + public static bool operator ==(InternalListRunsResponseObject left, InternalListRunsResponseObject right) => left.Equals(right); + public static bool operator !=(InternalListRunsResponseObject left, InternalListRunsResponseObject right) => !left.Equals(right); + public static implicit operator InternalListRunsResponseObject(string value) => new InternalListRunsResponseObject(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalListRunsResponseObject other && Equals(other); + public bool Equals(InternalListRunsResponseObject other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalListThreadsResponse.Serialization.cs b/.dotnet/src/Generated/Models/InternalListThreadsResponse.Serialization.cs new file mode 100644 index 000000000..9a4ec1998 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalListThreadsResponse.Serialization.cs @@ -0,0 +1,193 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class InternalListThreadsResponse : IJsonModel<InternalListThreadsResponse> + { + void IJsonModel<InternalListThreadsResponse>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ?
((IPersistableModel<InternalListThreadsResponse>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalListThreadsResponse)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("object") != true) + { + writer.WritePropertyName("object"u8); + writer.WriteStringValue(Object.ToString()); + } + if (SerializedAdditionalRawData?.ContainsKey("data") != true) + { + writer.WritePropertyName("data"u8); + writer.WriteStartArray(); + foreach (var item in Data) + { + writer.WriteObjectValue(item, options); + } + writer.WriteEndArray(); + } + if (SerializedAdditionalRawData?.ContainsKey("first_id") != true) + { + writer.WritePropertyName("first_id"u8); + writer.WriteStringValue(FirstId); + } + if (SerializedAdditionalRawData?.ContainsKey("last_id") != true) + { + writer.WritePropertyName("last_id"u8); + writer.WriteStringValue(LastId); + } + if (SerializedAdditionalRawData?.ContainsKey("has_more") != true) + { + writer.WritePropertyName("has_more"u8); + writer.WriteBooleanValue(HasMore); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalListThreadsResponse IJsonModel<InternalListThreadsResponse>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ?
((IPersistableModel<InternalListThreadsResponse>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalListThreadsResponse)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalListThreadsResponse(document.RootElement, options); + } + + internal static InternalListThreadsResponse DeserializeInternalListThreadsResponse(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + InternalListThreadsResponseObject @object = default; + IReadOnlyList<AssistantThread> data = default; + string firstId = default; + string lastId = default; + bool hasMore = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("object"u8)) + { + @object = new InternalListThreadsResponseObject(property.Value.GetString()); + continue; + } + if (property.NameEquals("data"u8)) + { + List<AssistantThread> array = new List<AssistantThread>(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(AssistantThread.DeserializeAssistantThread(item, options)); + } + data = array; + continue; + } + if (property.NameEquals("first_id"u8)) + { + firstId = property.Value.GetString(); + continue; + } + if (property.NameEquals("last_id"u8)) + { + lastId = property.Value.GetString(); + continue; + } + if (property.NameEquals("has_more"u8)) + { + hasMore = property.Value.GetBoolean(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalListThreadsResponse( + @object, + data, + firstId, + lastId, + hasMore, +
serializedAdditionalRawData); + } + + BinaryData IPersistableModel<InternalListThreadsResponse>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalListThreadsResponse>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalListThreadsResponse)} does not support writing '{options.Format}' format."); + } + } + + InternalListThreadsResponse IPersistableModel<InternalListThreadsResponse>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalListThreadsResponse>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalListThreadsResponse(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalListThreadsResponse)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalListThreadsResponse>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalListThreadsResponse FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalListThreadsResponse(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalListThreadsResponse.cs b/.dotnet/src/Generated/Models/InternalListThreadsResponse.cs new file mode 100644 index 000000000..0424339fa --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalListThreadsResponse.cs @@ -0,0 +1,47 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.Collections.Generic; +using System.Linq; + +namespace OpenAI.Assistants +{ + internal partial class InternalListThreadsResponse + { + internal IDictionary<string, BinaryData>
SerializedAdditionalRawData { get; set; } + internal InternalListThreadsResponse(IEnumerable<AssistantThread> data, string firstId, string lastId, bool hasMore) + { + Argument.AssertNotNull(data, nameof(data)); + Argument.AssertNotNull(firstId, nameof(firstId)); + Argument.AssertNotNull(lastId, nameof(lastId)); + + Data = data.ToList(); + FirstId = firstId; + LastId = lastId; + HasMore = hasMore; + } + + internal InternalListThreadsResponse(InternalListThreadsResponseObject @object, IReadOnlyList<AssistantThread> data, string firstId, string lastId, bool hasMore, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + Object = @object; + Data = data; + FirstId = firstId; + LastId = lastId; + HasMore = hasMore; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal InternalListThreadsResponse() + { + } + + public InternalListThreadsResponseObject Object { get; } = InternalListThreadsResponseObject.List; + + public IReadOnlyList<AssistantThread> Data { get; } + public string FirstId { get; } + public string LastId { get; } + public bool HasMore { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalListThreadsResponseObject.cs b/.dotnet/src/Generated/Models/InternalListThreadsResponseObject.cs new file mode 100644 index 000000000..8fc2b64bf --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalListThreadsResponseObject.cs @@ -0,0 +1,34 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Assistants +{ + internal readonly partial struct InternalListThreadsResponseObject : IEquatable<InternalListThreadsResponseObject> + { + private readonly string _value; + + public InternalListThreadsResponseObject(string value) + { + _value = value ??
throw new ArgumentNullException(nameof(value)); + } + + private const string ListValue = "list"; + + public static InternalListThreadsResponseObject List { get; } = new InternalListThreadsResponseObject(ListValue); + public static bool operator ==(InternalListThreadsResponseObject left, InternalListThreadsResponseObject right) => left.Equals(right); + public static bool operator !=(InternalListThreadsResponseObject left, InternalListThreadsResponseObject right) => !left.Equals(right); + public static implicit operator InternalListThreadsResponseObject(string value) => new InternalListThreadsResponseObject(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalListThreadsResponseObject other && Equals(other); + public bool Equals(InternalListThreadsResponseObject other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalListVectorStoreFilesResponse.Serialization.cs b/.dotnet/src/Generated/Models/InternalListVectorStoreFilesResponse.Serialization.cs new file mode 100644 index 000000000..3b5f44ef5 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalListVectorStoreFilesResponse.Serialization.cs @@ -0,0 +1,193 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.VectorStores +{ + internal partial class InternalListVectorStoreFilesResponse : IJsonModel<InternalListVectorStoreFilesResponse> + { + void IJsonModel<InternalListVectorStoreFilesResponse>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ?
((IPersistableModel<InternalListVectorStoreFilesResponse>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalListVectorStoreFilesResponse)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("object") != true) + { + writer.WritePropertyName("object"u8); + writer.WriteStringValue(Object.ToString()); + } + if (SerializedAdditionalRawData?.ContainsKey("data") != true) + { + writer.WritePropertyName("data"u8); + writer.WriteStartArray(); + foreach (var item in Data) + { + writer.WriteObjectValue(item, options); + } + writer.WriteEndArray(); + } + if (SerializedAdditionalRawData?.ContainsKey("first_id") != true) + { + writer.WritePropertyName("first_id"u8); + writer.WriteStringValue(FirstId); + } + if (SerializedAdditionalRawData?.ContainsKey("last_id") != true) + { + writer.WritePropertyName("last_id"u8); + writer.WriteStringValue(LastId); + } + if (SerializedAdditionalRawData?.ContainsKey("has_more") != true) + { + writer.WritePropertyName("has_more"u8); + writer.WriteBooleanValue(HasMore); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalListVectorStoreFilesResponse IJsonModel<InternalListVectorStoreFilesResponse>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ?
((IPersistableModel<InternalListVectorStoreFilesResponse>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalListVectorStoreFilesResponse)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalListVectorStoreFilesResponse(document.RootElement, options); + } + + internal static InternalListVectorStoreFilesResponse DeserializeInternalListVectorStoreFilesResponse(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + InternalListVectorStoreFilesResponseObject @object = default; + IReadOnlyList<VectorStoreFileAssociation> data = default; + string firstId = default; + string lastId = default; + bool hasMore = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("object"u8)) + { + @object = new InternalListVectorStoreFilesResponseObject(property.Value.GetString()); + continue; + } + if (property.NameEquals("data"u8)) + { + List<VectorStoreFileAssociation> array = new List<VectorStoreFileAssociation>(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(VectorStoreFileAssociation.DeserializeVectorStoreFileAssociation(item, options)); + } + data = array; + continue; + } + if (property.NameEquals("first_id"u8)) + { + firstId = property.Value.GetString(); + continue; + } + if (property.NameEquals("last_id"u8)) + { + lastId = property.Value.GetString(); + continue; + } + if (property.NameEquals("has_more"u8)) + { + hasMore = property.Value.GetBoolean(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new
InternalListVectorStoreFilesResponse( + @object, + data, + firstId, + lastId, + hasMore, + serializedAdditionalRawData); + } + + BinaryData IPersistableModel<InternalListVectorStoreFilesResponse>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalListVectorStoreFilesResponse>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalListVectorStoreFilesResponse)} does not support writing '{options.Format}' format."); + } + } + + InternalListVectorStoreFilesResponse IPersistableModel<InternalListVectorStoreFilesResponse>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalListVectorStoreFilesResponse>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalListVectorStoreFilesResponse(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalListVectorStoreFilesResponse)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalListVectorStoreFilesResponse>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalListVectorStoreFilesResponse FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalListVectorStoreFilesResponse(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalListVectorStoreFilesResponse.cs b/.dotnet/src/Generated/Models/InternalListVectorStoreFilesResponse.cs new file mode 100644 index 000000000..cef502099 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalListVectorStoreFilesResponse.cs @@ -0,0 +1,47 @@ +// <auto-generated/> + +#nullable disable + +using System; +using
System.Collections.Generic; +using System.Linq; + +namespace OpenAI.VectorStores +{ + internal partial class InternalListVectorStoreFilesResponse + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + internal InternalListVectorStoreFilesResponse(IEnumerable<VectorStoreFileAssociation> data, string firstId, string lastId, bool hasMore) + { + Argument.AssertNotNull(data, nameof(data)); + Argument.AssertNotNull(firstId, nameof(firstId)); + Argument.AssertNotNull(lastId, nameof(lastId)); + + Data = data.ToList(); + FirstId = firstId; + LastId = lastId; + HasMore = hasMore; + } + + internal InternalListVectorStoreFilesResponse(InternalListVectorStoreFilesResponseObject @object, IReadOnlyList<VectorStoreFileAssociation> data, string firstId, string lastId, bool hasMore, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + Object = @object; + Data = data; + FirstId = firstId; + LastId = lastId; + HasMore = hasMore; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal InternalListVectorStoreFilesResponse() + { + } + + public InternalListVectorStoreFilesResponseObject Object { get; } = InternalListVectorStoreFilesResponseObject.List; + + public IReadOnlyList<VectorStoreFileAssociation> Data { get; } + public string FirstId { get; } + public string LastId { get; } + public bool HasMore { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalListVectorStoreFilesResponseObject.cs b/.dotnet/src/Generated/Models/InternalListVectorStoreFilesResponseObject.cs new file mode 100644 index 000000000..bc320af37 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalListVectorStoreFilesResponseObject.cs @@ -0,0 +1,34 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.VectorStores +{ + internal readonly partial struct InternalListVectorStoreFilesResponseObject : IEquatable<InternalListVectorStoreFilesResponseObject> + { + private readonly string _value; + + public InternalListVectorStoreFilesResponseObject(string value) + { + _value = value ??
throw new ArgumentNullException(nameof(value)); + } + + private const string ListValue = "list"; + + public static InternalListVectorStoreFilesResponseObject List { get; } = new InternalListVectorStoreFilesResponseObject(ListValue); + public static bool operator ==(InternalListVectorStoreFilesResponseObject left, InternalListVectorStoreFilesResponseObject right) => left.Equals(right); + public static bool operator !=(InternalListVectorStoreFilesResponseObject left, InternalListVectorStoreFilesResponseObject right) => !left.Equals(right); + public static implicit operator InternalListVectorStoreFilesResponseObject(string value) => new InternalListVectorStoreFilesResponseObject(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalListVectorStoreFilesResponseObject other && Equals(other); + public bool Equals(InternalListVectorStoreFilesResponseObject other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalListVectorStoresResponse.Serialization.cs b/.dotnet/src/Generated/Models/InternalListVectorStoresResponse.Serialization.cs new file mode 100644 index 000000000..e01ffb73e --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalListVectorStoresResponse.Serialization.cs @@ -0,0 +1,193 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.VectorStores +{ + internal partial class InternalListVectorStoresResponse : IJsonModel<InternalListVectorStoresResponse> + { + void IJsonModel<InternalListVectorStoresResponse>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalListVectorStoresResponse>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalListVectorStoresResponse)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("object") != true) + { + writer.WritePropertyName("object"u8); + writer.WriteStringValue(Object.ToString()); + } + if (SerializedAdditionalRawData?.ContainsKey("data") != true) + { + writer.WritePropertyName("data"u8); + writer.WriteStartArray(); + foreach (var item in Data) + { + writer.WriteObjectValue(item, options); + } + writer.WriteEndArray(); + } + if (SerializedAdditionalRawData?.ContainsKey("first_id") != true) + { + writer.WritePropertyName("first_id"u8); + writer.WriteStringValue(FirstId); + } + if (SerializedAdditionalRawData?.ContainsKey("last_id") != true) + { + writer.WritePropertyName("last_id"u8); + writer.WriteStringValue(LastId); + } + if (SerializedAdditionalRawData?.ContainsKey("has_more") != true) + { + writer.WritePropertyName("has_more"u8); + writer.WriteBooleanValue(HasMore); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalListVectorStoresResponse IJsonModel<InternalListVectorStoresResponse>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalListVectorStoresResponse>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalListVectorStoresResponse)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalListVectorStoresResponse(document.RootElement, options); + } + + internal static InternalListVectorStoresResponse DeserializeInternalListVectorStoresResponse(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + InternalListVectorStoresResponseObject @object = default; + IReadOnlyList<VectorStore> data = default; + string firstId = default; + string lastId = default; + bool hasMore = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("object"u8)) + { + @object = new InternalListVectorStoresResponseObject(property.Value.GetString()); + continue; + } + if (property.NameEquals("data"u8)) + { + List<VectorStore> array = new List<VectorStore>(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(VectorStore.DeserializeVectorStore(item, options)); + } + data = array; + continue; + } + if (property.NameEquals("first_id"u8)) + { + firstId = property.Value.GetString(); + continue; + } + if (property.NameEquals("last_id"u8)) + { + lastId = property.Value.GetString(); + continue; + } + if (property.NameEquals("has_more"u8)) + { + hasMore = property.Value.GetBoolean(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalListVectorStoresResponse( + @object, + data, + firstId, + lastId, + hasMore, 
+ serializedAdditionalRawData); + } + + BinaryData IPersistableModel<InternalListVectorStoresResponse>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalListVectorStoresResponse>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalListVectorStoresResponse)} does not support writing '{options.Format}' format."); + } + } + + InternalListVectorStoresResponse IPersistableModel<InternalListVectorStoresResponse>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalListVectorStoresResponse>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalListVectorStoresResponse(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalListVectorStoresResponse)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalListVectorStoresResponse>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalListVectorStoresResponse FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalListVectorStoresResponse(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalListVectorStoresResponse.cs b/.dotnet/src/Generated/Models/InternalListVectorStoresResponse.cs new file mode 100644 index 000000000..98135e24b --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalListVectorStoresResponse.cs @@ -0,0 +1,47 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.Collections.Generic; +using System.Linq; + +namespace OpenAI.VectorStores +{ + internal partial class 
InternalListVectorStoresResponse + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + internal InternalListVectorStoresResponse(IEnumerable<VectorStore> data, string firstId, string lastId, bool hasMore) + { + Argument.AssertNotNull(data, nameof(data)); + Argument.AssertNotNull(firstId, nameof(firstId)); + Argument.AssertNotNull(lastId, nameof(lastId)); + + Data = data.ToList(); + FirstId = firstId; + LastId = lastId; + HasMore = hasMore; + } + + internal InternalListVectorStoresResponse(InternalListVectorStoresResponseObject @object, IReadOnlyList<VectorStore> data, string firstId, string lastId, bool hasMore, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + Object = @object; + Data = data; + FirstId = firstId; + LastId = lastId; + HasMore = hasMore; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal InternalListVectorStoresResponse() + { + } + + public InternalListVectorStoresResponseObject Object { get; } = InternalListVectorStoresResponseObject.List; + + public IReadOnlyList<VectorStore> Data { get; } + public string FirstId { get; } + public string LastId { get; } + public bool HasMore { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalListVectorStoresResponseObject.cs b/.dotnet/src/Generated/Models/InternalListVectorStoresResponseObject.cs new file mode 100644 index 000000000..1f3c8a3f2 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalListVectorStoresResponseObject.cs @@ -0,0 +1,34 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.VectorStores +{ + internal readonly partial struct InternalListVectorStoresResponseObject : IEquatable<InternalListVectorStoresResponseObject> + { + private readonly string _value; + + public InternalListVectorStoresResponseObject(string value) + { + _value = value ?? 
throw new ArgumentNullException(nameof(value)); + } + + private const string ListValue = "list"; + + public static InternalListVectorStoresResponseObject List { get; } = new InternalListVectorStoresResponseObject(ListValue); + public static bool operator ==(InternalListVectorStoresResponseObject left, InternalListVectorStoresResponseObject right) => left.Equals(right); + public static bool operator !=(InternalListVectorStoresResponseObject left, InternalListVectorStoresResponseObject right) => !left.Equals(right); + public static implicit operator InternalListVectorStoresResponseObject(string value) => new InternalListVectorStoresResponseObject(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalListVectorStoresResponseObject other && Equals(other); + public bool Equals(InternalListVectorStoresResponseObject other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalMessageContentImageFileObjectImageFileDetail.cs b/.dotnet/src/Generated/Models/InternalMessageContentImageFileObjectImageFileDetail.cs new file mode 100644 index 000000000..c6099b4c4 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalMessageContentImageFileObjectImageFileDetail.cs @@ -0,0 +1,38 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Assistants +{ + internal readonly partial struct InternalMessageContentImageFileObjectImageFileDetail : IEquatable<InternalMessageContentImageFileObjectImageFileDetail> + { + private readonly string _value; + + public InternalMessageContentImageFileObjectImageFileDetail(string value) + { + _value = value ?? 
throw new ArgumentNullException(nameof(value)); + } + + private const string AutoValue = "auto"; + private const string LowValue = "low"; + private const string HighValue = "high"; + + public static InternalMessageContentImageFileObjectImageFileDetail Auto { get; } = new InternalMessageContentImageFileObjectImageFileDetail(AutoValue); + public static InternalMessageContentImageFileObjectImageFileDetail Low { get; } = new InternalMessageContentImageFileObjectImageFileDetail(LowValue); + public static InternalMessageContentImageFileObjectImageFileDetail High { get; } = new InternalMessageContentImageFileObjectImageFileDetail(HighValue); + public static bool operator ==(InternalMessageContentImageFileObjectImageFileDetail left, InternalMessageContentImageFileObjectImageFileDetail right) => left.Equals(right); + public static bool operator !=(InternalMessageContentImageFileObjectImageFileDetail left, InternalMessageContentImageFileObjectImageFileDetail right) => !left.Equals(right); + public static implicit operator InternalMessageContentImageFileObjectImageFileDetail(string value) => new InternalMessageContentImageFileObjectImageFileDetail(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalMessageContentImageFileObjectImageFileDetail other && Equals(other); + public bool Equals(InternalMessageContentImageFileObjectImageFileDetail other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? 
StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalMessageContentImageFileObjectType.cs b/.dotnet/src/Generated/Models/InternalMessageContentImageFileObjectType.cs new file mode 100644 index 000000000..716165ec3 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalMessageContentImageFileObjectType.cs @@ -0,0 +1,34 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Assistants +{ + internal readonly partial struct InternalMessageContentImageFileObjectType : IEquatable<InternalMessageContentImageFileObjectType> + { + private readonly string _value; + + public InternalMessageContentImageFileObjectType(string value) + { + _value = value ?? throw new ArgumentNullException(nameof(value)); + } + + private const string ImageFileValue = "image_file"; + + public static InternalMessageContentImageFileObjectType ImageFile { get; } = new InternalMessageContentImageFileObjectType(ImageFileValue); + public static bool operator ==(InternalMessageContentImageFileObjectType left, InternalMessageContentImageFileObjectType right) => left.Equals(right); + public static bool operator !=(InternalMessageContentImageFileObjectType left, InternalMessageContentImageFileObjectType right) => !left.Equals(right); + public static implicit operator InternalMessageContentImageFileObjectType(string value) => new InternalMessageContentImageFileObjectType(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalMessageContentImageFileObjectType other && Equals(other); + public bool Equals(InternalMessageContentImageFileObjectType other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? 
StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalMessageContentImageUrlObjectImageUrl.Serialization.cs b/.dotnet/src/Generated/Models/InternalMessageContentImageUrlObjectImageUrl.Serialization.cs new file mode 100644 index 000000000..376726676 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalMessageContentImageUrlObjectImageUrl.Serialization.cs @@ -0,0 +1,144 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class InternalMessageContentImageUrlObjectImageUrl : IJsonModel<InternalMessageContentImageUrlObjectImageUrl> + { + void IJsonModel<InternalMessageContentImageUrlObjectImageUrl>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalMessageContentImageUrlObjectImageUrl>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalMessageContentImageUrlObjectImageUrl)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("url") != true) + { + writer.WritePropertyName("url"u8); + writer.WriteStringValue(Url.AbsoluteUri); + } + if (SerializedAdditionalRawData?.ContainsKey("detail") != true && Optional.IsDefined(Detail)) + { + writer.WritePropertyName("detail"u8); + writer.WriteStringValue(Detail); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + 
InternalMessageContentImageUrlObjectImageUrl IJsonModel<InternalMessageContentImageUrlObjectImageUrl>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalMessageContentImageUrlObjectImageUrl>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalMessageContentImageUrlObjectImageUrl)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalMessageContentImageUrlObjectImageUrl(document.RootElement, options); + } + + internal static InternalMessageContentImageUrlObjectImageUrl DeserializeInternalMessageContentImageUrlObjectImageUrl(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + Uri url = default; + string detail = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("url"u8)) + { + url = new Uri(property.Value.GetString()); + continue; + } + if (property.NameEquals("detail"u8)) + { + detail = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalMessageContentImageUrlObjectImageUrl(url, detail, serializedAdditionalRawData); + } + + BinaryData IPersistableModel<InternalMessageContentImageUrlObjectImageUrl>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalMessageContentImageUrlObjectImageUrl>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalMessageContentImageUrlObjectImageUrl)} does not support writing '{options.Format}' format."); + } + } + + InternalMessageContentImageUrlObjectImageUrl IPersistableModel<InternalMessageContentImageUrlObjectImageUrl>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalMessageContentImageUrlObjectImageUrl>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalMessageContentImageUrlObjectImageUrl(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalMessageContentImageUrlObjectImageUrl)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalMessageContentImageUrlObjectImageUrl>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalMessageContentImageUrlObjectImageUrl FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalMessageContentImageUrlObjectImageUrl(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalMessageContentImageUrlObjectImageUrl.cs b/.dotnet/src/Generated/Models/InternalMessageContentImageUrlObjectImageUrl.cs new file mode 100644 index 000000000..ec3f49b7f --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalMessageContentImageUrlObjectImageUrl.cs @@ -0,0 +1,33 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + internal partial class InternalMessageContentImageUrlObjectImageUrl + { + internal IDictionary<string, BinaryData> 
SerializedAdditionalRawData { get; set; } + public InternalMessageContentImageUrlObjectImageUrl(Uri url) + { + Argument.AssertNotNull(url, nameof(url)); + + Url = url; + } + + internal InternalMessageContentImageUrlObjectImageUrl(Uri url, string detail, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + Url = url; + Detail = detail; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal InternalMessageContentImageUrlObjectImageUrl() + { + } + + public Uri Url { get; set; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalMessageContentImageUrlObjectImageUrlDetail.cs b/.dotnet/src/Generated/Models/InternalMessageContentImageUrlObjectImageUrlDetail.cs new file mode 100644 index 000000000..1ae9181d0 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalMessageContentImageUrlObjectImageUrlDetail.cs @@ -0,0 +1,38 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Assistants +{ + internal readonly partial struct InternalMessageContentImageUrlObjectImageUrlDetail : IEquatable<InternalMessageContentImageUrlObjectImageUrlDetail> + { + private readonly string _value; + + public InternalMessageContentImageUrlObjectImageUrlDetail(string value) + { + _value = value ?? 
throw new ArgumentNullException(nameof(value)); + } + + private const string AutoValue = "auto"; + private const string LowValue = "low"; + private const string HighValue = "high"; + + public static InternalMessageContentImageUrlObjectImageUrlDetail Auto { get; } = new InternalMessageContentImageUrlObjectImageUrlDetail(AutoValue); + public static InternalMessageContentImageUrlObjectImageUrlDetail Low { get; } = new InternalMessageContentImageUrlObjectImageUrlDetail(LowValue); + public static InternalMessageContentImageUrlObjectImageUrlDetail High { get; } = new InternalMessageContentImageUrlObjectImageUrlDetail(HighValue); + public static bool operator ==(InternalMessageContentImageUrlObjectImageUrlDetail left, InternalMessageContentImageUrlObjectImageUrlDetail right) => left.Equals(right); + public static bool operator !=(InternalMessageContentImageUrlObjectImageUrlDetail left, InternalMessageContentImageUrlObjectImageUrlDetail right) => !left.Equals(right); + public static implicit operator InternalMessageContentImageUrlObjectImageUrlDetail(string value) => new InternalMessageContentImageUrlObjectImageUrlDetail(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalMessageContentImageUrlObjectImageUrlDetail other && Equals(other); + public bool Equals(InternalMessageContentImageUrlObjectImageUrlDetail other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? 
StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalMessageContentImageUrlObjectType.cs b/.dotnet/src/Generated/Models/InternalMessageContentImageUrlObjectType.cs new file mode 100644 index 000000000..741398764 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalMessageContentImageUrlObjectType.cs @@ -0,0 +1,34 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Assistants +{ + internal readonly partial struct InternalMessageContentImageUrlObjectType : IEquatable<InternalMessageContentImageUrlObjectType> + { + private readonly string _value; + + public InternalMessageContentImageUrlObjectType(string value) + { + _value = value ?? throw new ArgumentNullException(nameof(value)); + } + + private const string ImageUrlValue = "image_url"; + + public static InternalMessageContentImageUrlObjectType ImageUrl { get; } = new InternalMessageContentImageUrlObjectType(ImageUrlValue); + public static bool operator ==(InternalMessageContentImageUrlObjectType left, InternalMessageContentImageUrlObjectType right) => left.Equals(right); + public static bool operator !=(InternalMessageContentImageUrlObjectType left, InternalMessageContentImageUrlObjectType right) => !left.Equals(right); + public static implicit operator InternalMessageContentImageUrlObjectType(string value) => new InternalMessageContentImageUrlObjectType(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalMessageContentImageUrlObjectType other && Equals(other); + public bool Equals(InternalMessageContentImageUrlObjectType other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? 
StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalMessageContentItemFileObjectImageFile.Serialization.cs b/.dotnet/src/Generated/Models/InternalMessageContentItemFileObjectImageFile.Serialization.cs new file mode 100644 index 000000000..a05a67384 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalMessageContentItemFileObjectImageFile.Serialization.cs @@ -0,0 +1,144 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class InternalMessageContentItemFileObjectImageFile : IJsonModel<InternalMessageContentItemFileObjectImageFile> + { + void IJsonModel<InternalMessageContentItemFileObjectImageFile>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalMessageContentItemFileObjectImageFile>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalMessageContentItemFileObjectImageFile)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("file_id") != true) + { + writer.WritePropertyName("file_id"u8); + writer.WriteStringValue(FileId); + } + if (SerializedAdditionalRawData?.ContainsKey("detail") != true && Optional.IsDefined(Detail)) + { + writer.WritePropertyName("detail"u8); + writer.WriteStringValue(Detail); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + 
InternalMessageContentItemFileObjectImageFile IJsonModel<InternalMessageContentItemFileObjectImageFile>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalMessageContentItemFileObjectImageFile>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalMessageContentItemFileObjectImageFile)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalMessageContentItemFileObjectImageFile(document.RootElement, options); + } + + internal static InternalMessageContentItemFileObjectImageFile DeserializeInternalMessageContentItemFileObjectImageFile(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string fileId = default; + string detail = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("file_id"u8)) + { + fileId = property.Value.GetString(); + continue; + } + if (property.NameEquals("detail"u8)) + { + detail = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalMessageContentItemFileObjectImageFile(fileId, detail, serializedAdditionalRawData); + } + + BinaryData IPersistableModel<InternalMessageContentItemFileObjectImageFile>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalMessageContentItemFileObjectImageFile>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalMessageContentItemFileObjectImageFile)} does not support writing '{options.Format}' format."); + } + } + + InternalMessageContentItemFileObjectImageFile IPersistableModel<InternalMessageContentItemFileObjectImageFile>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalMessageContentItemFileObjectImageFile>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalMessageContentItemFileObjectImageFile(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalMessageContentItemFileObjectImageFile)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalMessageContentItemFileObjectImageFile>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalMessageContentItemFileObjectImageFile FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalMessageContentItemFileObjectImageFile(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalMessageContentItemFileObjectImageFile.cs b/.dotnet/src/Generated/Models/InternalMessageContentItemFileObjectImageFile.cs new file mode 100644 index 000000000..b002cac9f --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalMessageContentItemFileObjectImageFile.cs @@ -0,0 +1,33 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + internal partial class InternalMessageContentItemFileObjectImageFile + { + internal IDictionary<string, BinaryData> 
SerializedAdditionalRawData { get; set; } + public InternalMessageContentItemFileObjectImageFile(string fileId) + { + Argument.AssertNotNull(fileId, nameof(fileId)); + + FileId = fileId; + } + + internal InternalMessageContentItemFileObjectImageFile(string fileId, string detail, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + FileId = fileId; + Detail = detail; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal InternalMessageContentItemFileObjectImageFile() + { + } + + public string FileId { get; set; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalMessageContentRefusalObjectType.cs b/.dotnet/src/Generated/Models/InternalMessageContentRefusalObjectType.cs new file mode 100644 index 000000000..7a127f292 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalMessageContentRefusalObjectType.cs @@ -0,0 +1,34 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Assistants +{ + internal readonly partial struct InternalMessageContentRefusalObjectType : IEquatable<InternalMessageContentRefusalObjectType> + { + private readonly string _value; + + public InternalMessageContentRefusalObjectType(string value) + { + _value = value ?? 
throw new ArgumentNullException(nameof(value)); + } + + private const string RefusalValue = "refusal"; + + public static InternalMessageContentRefusalObjectType Refusal { get; } = new InternalMessageContentRefusalObjectType(RefusalValue); + public static bool operator ==(InternalMessageContentRefusalObjectType left, InternalMessageContentRefusalObjectType right) => left.Equals(right); + public static bool operator !=(InternalMessageContentRefusalObjectType left, InternalMessageContentRefusalObjectType right) => !left.Equals(right); + public static implicit operator InternalMessageContentRefusalObjectType(string value) => new InternalMessageContentRefusalObjectType(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalMessageContentRefusalObjectType other && Equals(other); + public bool Equals(InternalMessageContentRefusalObjectType other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? 
StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalMessageContentTextAnnotationsFileCitationObject.Serialization.cs b/.dotnet/src/Generated/Models/InternalMessageContentTextAnnotationsFileCitationObject.Serialization.cs new file mode 100644 index 000000000..a2d9393c1 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalMessageContentTextAnnotationsFileCitationObject.Serialization.cs @@ -0,0 +1,183 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class InternalMessageContentTextAnnotationsFileCitationObject : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalMessageContentTextAnnotationsFileCitationObject)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("text") != true) + { + writer.WritePropertyName("text"u8); + writer.WriteStringValue(Text); + } + if (SerializedAdditionalRawData?.ContainsKey("file_citation") != true) + { + writer.WritePropertyName("file_citation"u8); + writer.WriteObjectValue(FileCitation, options); + } + if (SerializedAdditionalRawData?.ContainsKey("start_index") != true) + { + writer.WritePropertyName("start_index"u8); + writer.WriteNumberValue(StartIndex); + } + if (SerializedAdditionalRawData?.ContainsKey("end_index") != true) + { + writer.WritePropertyName("end_index"u8); + writer.WriteNumberValue(EndIndex); + } + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + 
writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalMessageContentTextAnnotationsFileCitationObject IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalMessageContentTextAnnotationsFileCitationObject)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalMessageContentTextAnnotationsFileCitationObject(document.RootElement, options); + } + + internal static InternalMessageContentTextAnnotationsFileCitationObject DeserializeInternalMessageContentTextAnnotationsFileCitationObject(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string text = default; + InternalMessageContentTextAnnotationsFileCitationObjectFileCitation fileCitation = default; + int startIndex = default; + int endIndex = default; + string type = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("text"u8)) + { + text = property.Value.GetString(); + continue; + } + if (property.NameEquals("file_citation"u8)) + { + fileCitation = 
InternalMessageContentTextAnnotationsFileCitationObjectFileCitation.DeserializeInternalMessageContentTextAnnotationsFileCitationObjectFileCitation(property.Value, options); + continue; + } + if (property.NameEquals("start_index"u8)) + { + startIndex = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("end_index"u8)) + { + endIndex = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalMessageContentTextAnnotationsFileCitationObject( + type, + serializedAdditionalRawData, + text, + fileCitation, + startIndex, + endIndex); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalMessageContentTextAnnotationsFileCitationObject)} does not support writing '{options.Format}' format."); + } + } + + InternalMessageContentTextAnnotationsFileCitationObject IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalMessageContentTextAnnotationsFileCitationObject>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalMessageContentTextAnnotationsFileCitationObject(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalMessageContentTextAnnotationsFileCitationObject)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalMessageContentTextAnnotationsFileCitationObject>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static new InternalMessageContentTextAnnotationsFileCitationObject FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalMessageContentTextAnnotationsFileCitationObject(document.RootElement); + } + + internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalMessageContentTextAnnotationsFileCitationObject.cs b/.dotnet/src/Generated/Models/InternalMessageContentTextAnnotationsFileCitationObject.cs new file mode 100644 index 000000000..f1a6d8ea9 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalMessageContentTextAnnotationsFileCitationObject.cs @@ -0,0 +1,41 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + internal partial class InternalMessageContentTextAnnotationsFileCitationObject : InternalMessageContentTextObjectAnnotation + { + public InternalMessageContentTextAnnotationsFileCitationObject(string text, InternalMessageContentTextAnnotationsFileCitationObjectFileCitation fileCitation, int startIndex, int endIndex) + { + Argument.AssertNotNull(text, nameof(text)); + Argument.AssertNotNull(fileCitation, nameof(fileCitation)); + + Type = "file_citation"; + Text = text; + FileCitation = fileCitation; + StartIndex = startIndex; 
+ EndIndex = endIndex; + } + + internal InternalMessageContentTextAnnotationsFileCitationObject(string type, IDictionary<string, BinaryData> serializedAdditionalRawData, string text, InternalMessageContentTextAnnotationsFileCitationObjectFileCitation fileCitation, int startIndex, int endIndex) : base(type, serializedAdditionalRawData) + { + Text = text; + FileCitation = fileCitation; + StartIndex = startIndex; + EndIndex = endIndex; + } + + internal InternalMessageContentTextAnnotationsFileCitationObject() + { + } + + public string Text { get; set; } + public InternalMessageContentTextAnnotationsFileCitationObjectFileCitation FileCitation { get; set; } + public int StartIndex { get; set; } + public int EndIndex { get; set; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalMessageContentTextAnnotationsFileCitationObjectFileCitation.Serialization.cs b/.dotnet/src/Generated/Models/InternalMessageContentTextAnnotationsFileCitationObjectFileCitation.Serialization.cs new file mode 100644 index 000000000..b18974701 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalMessageContentTextAnnotationsFileCitationObjectFileCitation.Serialization.cs @@ -0,0 +1,133 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class InternalMessageContentTextAnnotationsFileCitationObjectFileCitation : IJsonModel<InternalMessageContentTextAnnotationsFileCitationObjectFileCitation> + { + void IJsonModel<InternalMessageContentTextAnnotationsFileCitationObjectFileCitation>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalMessageContentTextAnnotationsFileCitationObjectFileCitation)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("file_id") != true) + { + writer.WritePropertyName("file_id"u8); + writer.WriteStringValue(FileId); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalMessageContentTextAnnotationsFileCitationObjectFileCitation IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalMessageContentTextAnnotationsFileCitationObjectFileCitation)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalMessageContentTextAnnotationsFileCitationObjectFileCitation(document.RootElement, options); + } + + internal static InternalMessageContentTextAnnotationsFileCitationObjectFileCitation DeserializeInternalMessageContentTextAnnotationsFileCitationObjectFileCitation(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string fileId = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("file_id"u8)) + { + fileId = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalMessageContentTextAnnotationsFileCitationObjectFileCitation(fileId, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalMessageContentTextAnnotationsFileCitationObjectFileCitation>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalMessageContentTextAnnotationsFileCitationObjectFileCitation)} does not support writing '{options.Format}' format."); + } + } + + InternalMessageContentTextAnnotationsFileCitationObjectFileCitation IPersistableModel<InternalMessageContentTextAnnotationsFileCitationObjectFileCitation>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalMessageContentTextAnnotationsFileCitationObjectFileCitation>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalMessageContentTextAnnotationsFileCitationObjectFileCitation(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalMessageContentTextAnnotationsFileCitationObjectFileCitation)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalMessageContentTextAnnotationsFileCitationObjectFileCitation>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalMessageContentTextAnnotationsFileCitationObjectFileCitation FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalMessageContentTextAnnotationsFileCitationObjectFileCitation(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalMessageContentTextAnnotationsFileCitationObjectFileCitation.cs b/.dotnet/src/Generated/Models/InternalMessageContentTextAnnotationsFileCitationObjectFileCitation.cs new file mode 100644 index 000000000..1c255cb2c --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalMessageContentTextAnnotationsFileCitationObjectFileCitation.cs @@ -0,0 +1,32 @@ +// + +#nullable 
disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + internal partial class InternalMessageContentTextAnnotationsFileCitationObjectFileCitation + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + public InternalMessageContentTextAnnotationsFileCitationObjectFileCitation(string fileId) + { + Argument.AssertNotNull(fileId, nameof(fileId)); + + FileId = fileId; + } + + internal InternalMessageContentTextAnnotationsFileCitationObjectFileCitation(string fileId, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + FileId = fileId; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal InternalMessageContentTextAnnotationsFileCitationObjectFileCitation() + { + } + + public string FileId { get; set; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalMessageContentTextAnnotationsFilePathObject.Serialization.cs b/.dotnet/src/Generated/Models/InternalMessageContentTextAnnotationsFilePathObject.Serialization.cs new file mode 100644 index 000000000..c06a4f2d7 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalMessageContentTextAnnotationsFilePathObject.Serialization.cs @@ -0,0 +1,183 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class InternalMessageContentTextAnnotationsFilePathObject : IJsonModel<InternalMessageContentTextAnnotationsFilePathObject> + { + void IJsonModel<InternalMessageContentTextAnnotationsFilePathObject>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalMessageContentTextAnnotationsFilePathObject)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("text") != true) + { + writer.WritePropertyName("text"u8); + writer.WriteStringValue(Text); + } + if (SerializedAdditionalRawData?.ContainsKey("file_path") != true) + { + writer.WritePropertyName("file_path"u8); + writer.WriteObjectValue(FilePath, options); + } + if (SerializedAdditionalRawData?.ContainsKey("start_index") != true) + { + writer.WritePropertyName("start_index"u8); + writer.WriteNumberValue(StartIndex); + } + if (SerializedAdditionalRawData?.ContainsKey("end_index") != true) + { + writer.WritePropertyName("end_index"u8); + writer.WriteNumberValue(EndIndex); + } + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalMessageContentTextAnnotationsFilePathObject IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalMessageContentTextAnnotationsFilePathObject)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalMessageContentTextAnnotationsFilePathObject(document.RootElement, options); + } + + internal static InternalMessageContentTextAnnotationsFilePathObject DeserializeInternalMessageContentTextAnnotationsFilePathObject(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string text = default; + InternalMessageContentTextAnnotationsFilePathObjectFilePath filePath = default; + int startIndex = default; + int endIndex = default; + string type = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("text"u8)) + { + text = property.Value.GetString(); + continue; + } + if (property.NameEquals("file_path"u8)) + { + filePath = InternalMessageContentTextAnnotationsFilePathObjectFilePath.DeserializeInternalMessageContentTextAnnotationsFilePathObjectFilePath(property.Value, options); + continue; + } + if (property.NameEquals("start_index"u8)) + { + startIndex = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("end_index"u8)) + { + endIndex = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new 
InternalMessageContentTextAnnotationsFilePathObject( + type, + serializedAdditionalRawData, + text, + filePath, + startIndex, + endIndex); + } + + BinaryData IPersistableModel<InternalMessageContentTextAnnotationsFilePathObject>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalMessageContentTextAnnotationsFilePathObject>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalMessageContentTextAnnotationsFilePathObject)} does not support writing '{options.Format}' format."); + } + } + + InternalMessageContentTextAnnotationsFilePathObject IPersistableModel<InternalMessageContentTextAnnotationsFilePathObject>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalMessageContentTextAnnotationsFilePathObject>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalMessageContentTextAnnotationsFilePathObject(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalMessageContentTextAnnotationsFilePathObject)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalMessageContentTextAnnotationsFilePathObject>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static new InternalMessageContentTextAnnotationsFilePathObject FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalMessageContentTextAnnotationsFilePathObject(document.RootElement); + } + + internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalMessageContentTextAnnotationsFilePathObject.cs b/.dotnet/src/Generated/Models/InternalMessageContentTextAnnotationsFilePathObject.cs new file mode 100644 index 000000000..c0098007c --- /dev/null +++ 
b/.dotnet/src/Generated/Models/InternalMessageContentTextAnnotationsFilePathObject.cs @@ -0,0 +1,41 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + internal partial class InternalMessageContentTextAnnotationsFilePathObject : InternalMessageContentTextObjectAnnotation + { + public InternalMessageContentTextAnnotationsFilePathObject(string text, InternalMessageContentTextAnnotationsFilePathObjectFilePath filePath, int startIndex, int endIndex) + { + Argument.AssertNotNull(text, nameof(text)); + Argument.AssertNotNull(filePath, nameof(filePath)); + + Type = "file_path"; + Text = text; + FilePath = filePath; + StartIndex = startIndex; + EndIndex = endIndex; + } + + internal InternalMessageContentTextAnnotationsFilePathObject(string type, IDictionary serializedAdditionalRawData, string text, InternalMessageContentTextAnnotationsFilePathObjectFilePath filePath, int startIndex, int endIndex) : base(type, serializedAdditionalRawData) + { + Text = text; + FilePath = filePath; + StartIndex = startIndex; + EndIndex = endIndex; + } + + internal InternalMessageContentTextAnnotationsFilePathObject() + { + } + + public string Text { get; set; } + public InternalMessageContentTextAnnotationsFilePathObjectFilePath FilePath { get; set; } + public int StartIndex { get; set; } + public int EndIndex { get; set; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalMessageContentTextAnnotationsFilePathObjectFilePath.Serialization.cs b/.dotnet/src/Generated/Models/InternalMessageContentTextAnnotationsFilePathObjectFilePath.Serialization.cs new file mode 100644 index 000000000..e398f1480 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalMessageContentTextAnnotationsFilePathObjectFilePath.Serialization.cs @@ -0,0 +1,133 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace 
OpenAI.Assistants +{ + internal partial class InternalMessageContentTextAnnotationsFilePathObjectFilePath : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalMessageContentTextAnnotationsFilePathObjectFilePath)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("file_id") != true) + { + writer.WritePropertyName("file_id"u8); + writer.WriteStringValue(FileId); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalMessageContentTextAnnotationsFilePathObjectFilePath IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalMessageContentTextAnnotationsFilePathObjectFilePath)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalMessageContentTextAnnotationsFilePathObjectFilePath(document.RootElement, options); + } + + internal static InternalMessageContentTextAnnotationsFilePathObjectFilePath DeserializeInternalMessageContentTextAnnotationsFilePathObjectFilePath(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string fileId = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("file_id"u8)) + { + fileId = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalMessageContentTextAnnotationsFilePathObjectFilePath(fileId, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalMessageContentTextAnnotationsFilePathObjectFilePath>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalMessageContentTextAnnotationsFilePathObjectFilePath)} does not support writing '{options.Format}' format."); + } + } + + InternalMessageContentTextAnnotationsFilePathObjectFilePath IPersistableModel<InternalMessageContentTextAnnotationsFilePathObjectFilePath>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalMessageContentTextAnnotationsFilePathObjectFilePath>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalMessageContentTextAnnotationsFilePathObjectFilePath(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalMessageContentTextAnnotationsFilePathObjectFilePath)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalMessageContentTextAnnotationsFilePathObjectFilePath>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalMessageContentTextAnnotationsFilePathObjectFilePath FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalMessageContentTextAnnotationsFilePathObjectFilePath(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalMessageContentTextAnnotationsFilePathObjectFilePath.cs b/.dotnet/src/Generated/Models/InternalMessageContentTextAnnotationsFilePathObjectFilePath.cs new file mode 100644 index 000000000..de47c05b1 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalMessageContentTextAnnotationsFilePathObjectFilePath.cs @@ -0,0 +1,32 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace 
OpenAI.Assistants +{ + internal partial class InternalMessageContentTextAnnotationsFilePathObjectFilePath + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + public InternalMessageContentTextAnnotationsFilePathObjectFilePath(string fileId) + { + Argument.AssertNotNull(fileId, nameof(fileId)); + + FileId = fileId; + } + + internal InternalMessageContentTextAnnotationsFilePathObjectFilePath(string fileId, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + FileId = fileId; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal InternalMessageContentTextAnnotationsFilePathObjectFilePath() + { + } + + public string FileId { get; set; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalMessageContentTextObjectAnnotation.Serialization.cs b/.dotnet/src/Generated/Models/InternalMessageContentTextObjectAnnotation.Serialization.cs new file mode 100644 index 000000000..92f8d5b79 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalMessageContentTextObjectAnnotation.Serialization.cs @@ -0,0 +1,124 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + [PersistableModelProxy(typeof(UnknownMessageContentTextObjectAnnotation))] + internal partial class InternalMessageContentTextObjectAnnotation : IJsonModel<InternalMessageContentTextObjectAnnotation> + { + void IJsonModel<InternalMessageContentTextObjectAnnotation>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalMessageContentTextObjectAnnotation)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalMessageContentTextObjectAnnotation IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalMessageContentTextObjectAnnotation)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalMessageContentTextObjectAnnotation(document.RootElement, options); + } + + internal static InternalMessageContentTextObjectAnnotation DeserializeInternalMessageContentTextObjectAnnotation(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + if (element.TryGetProperty("type", out JsonElement discriminator)) + { + switch (discriminator.GetString()) + { + case "file_citation": return InternalMessageContentTextAnnotationsFileCitationObject.DeserializeInternalMessageContentTextAnnotationsFileCitationObject(element, options); + case "file_path": return InternalMessageContentTextAnnotationsFilePathObject.DeserializeInternalMessageContentTextAnnotationsFilePathObject(element, options); + } + } + return UnknownMessageContentTextObjectAnnotation.DeserializeUnknownMessageContentTextObjectAnnotation(element, options); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalMessageContentTextObjectAnnotation)} does not support writing '{options.Format}' format."); + } + } + + InternalMessageContentTextObjectAnnotation IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalMessageContentTextObjectAnnotation>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalMessageContentTextObjectAnnotation(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalMessageContentTextObjectAnnotation)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalMessageContentTextObjectAnnotation>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalMessageContentTextObjectAnnotation FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalMessageContentTextObjectAnnotation(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalMessageContentTextObjectAnnotation.cs b/.dotnet/src/Generated/Models/InternalMessageContentTextObjectAnnotation.cs new file mode 100644 index 000000000..4e2df613a --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalMessageContentTextObjectAnnotation.cs @@ -0,0 +1,25 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + internal abstract partial class InternalMessageContentTextObjectAnnotation + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + protected InternalMessageContentTextObjectAnnotation() + { + } + + internal InternalMessageContentTextObjectAnnotation(string type, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + Type = type; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal string Type { get; set; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalMessageContentTextObjectText.Serialization.cs 
b/.dotnet/src/Generated/Models/InternalMessageContentTextObjectText.Serialization.cs new file mode 100644 index 000000000..0c951aee1 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalMessageContentTextObjectText.Serialization.cs @@ -0,0 +1,154 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class InternalMessageContentTextObjectText : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalMessageContentTextObjectText)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("value") != true) + { + writer.WritePropertyName("value"u8); + writer.WriteStringValue(Value); + } + if (SerializedAdditionalRawData?.ContainsKey("annotations") != true) + { + writer.WritePropertyName("annotations"u8); + writer.WriteStartArray(); + foreach (var item in Annotations) + { + writer.WriteObjectValue(item, options); + } + writer.WriteEndArray(); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalMessageContentTextObjectText IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalMessageContentTextObjectText)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalMessageContentTextObjectText(document.RootElement, options); + } + + internal static InternalMessageContentTextObjectText DeserializeInternalMessageContentTextObjectText(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string value = default; + IList annotations = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("value"u8)) + { + value = property.Value.GetString(); + continue; + } + if (property.NameEquals("annotations"u8)) + { + List array = new List(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(InternalMessageContentTextObjectAnnotation.DeserializeInternalMessageContentTextObjectAnnotation(item, options)); + } + annotations = array; + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalMessageContentTextObjectText(value, annotations, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalMessageContentTextObjectText)} does not support writing '{options.Format}' format."); + } + } + + InternalMessageContentTextObjectText IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalMessageContentTextObjectText(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalMessageContentTextObjectText)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalMessageContentTextObjectText FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalMessageContentTextObjectText(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalMessageContentTextObjectText.cs b/.dotnet/src/Generated/Models/InternalMessageContentTextObjectText.cs new file mode 100644 index 000000000..3ebc901f8 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalMessageContentTextObjectText.cs @@ -0,0 +1,37 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; +using System.Linq; + +namespace OpenAI.Assistants +{ + internal partial class InternalMessageContentTextObjectText + { + internal IDictionary SerializedAdditionalRawData { get; set; } + public 
InternalMessageContentTextObjectText(string value, IEnumerable<InternalMessageContentTextObjectAnnotation> annotations) + { + Argument.AssertNotNull(value, nameof(value)); + Argument.AssertNotNull(annotations, nameof(annotations)); + + Value = value; + Annotations = annotations.ToList(); + } + + internal InternalMessageContentTextObjectText(string value, IList<InternalMessageContentTextObjectAnnotation> annotations, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + Value = value; + Annotations = annotations; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal InternalMessageContentTextObjectText() + { + } + + public string Value { get; set; } + public IList<InternalMessageContentTextObjectAnnotation> Annotations { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalMessageContentTextObjectType.cs b/.dotnet/src/Generated/Models/InternalMessageContentTextObjectType.cs new file mode 100644 index 000000000..daf657e15 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalMessageContentTextObjectType.cs @@ -0,0 +1,34 @@ +// + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Assistants +{ + internal readonly partial struct InternalMessageContentTextObjectType : IEquatable<InternalMessageContentTextObjectType> + { + private readonly string _value; + + public InternalMessageContentTextObjectType(string value) + { + _value = value ?? 
throw new ArgumentNullException(nameof(value)); + } + + private const string TextValue = "text"; + + public static InternalMessageContentTextObjectType Text { get; } = new InternalMessageContentTextObjectType(TextValue); + public static bool operator ==(InternalMessageContentTextObjectType left, InternalMessageContentTextObjectType right) => left.Equals(right); + public static bool operator !=(InternalMessageContentTextObjectType left, InternalMessageContentTextObjectType right) => !left.Equals(right); + public static implicit operator InternalMessageContentTextObjectType(string value) => new InternalMessageContentTextObjectType(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalMessageContentTextObjectType other && Equals(other); + public bool Equals(InternalMessageContentTextObjectType other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalMessageDeltaContent.Serialization.cs b/.dotnet/src/Generated/Models/InternalMessageDeltaContent.Serialization.cs new file mode 100644 index 000000000..7b0d037e3 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalMessageDeltaContent.Serialization.cs @@ -0,0 +1,126 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + [PersistableModelProxy(typeof(UnknownMessageDeltaContent))] + internal partial class InternalMessageDeltaContent : IJsonModel<InternalMessageDeltaContent> + { + void IJsonModel<InternalMessageDeltaContent>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalMessageDeltaContent)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalMessageDeltaContent IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalMessageDeltaContent)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalMessageDeltaContent(document.RootElement, options); + } + + internal static InternalMessageDeltaContent DeserializeInternalMessageDeltaContent(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + if (element.TryGetProperty("type", out JsonElement discriminator)) + { + switch (discriminator.GetString()) + { + case "image_file": return InternalMessageDeltaContentImageFileObject.DeserializeInternalMessageDeltaContentImageFileObject(element, options); + case "image_url": return InternalMessageDeltaContentImageUrlObject.DeserializeInternalMessageDeltaContentImageUrlObject(element, options); + case "refusal": return InternalMessageDeltaContentRefusalObject.DeserializeInternalMessageDeltaContentRefusalObject(element, options); + case "text": return InternalMessageDeltaContentTextObject.DeserializeInternalMessageDeltaContentTextObject(element, options); + } + } + return UnknownMessageDeltaContent.DeserializeUnknownMessageDeltaContent(element, options); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalMessageDeltaContent>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalMessageDeltaContent)} does not support writing '{options.Format}' format."); + } + } + + InternalMessageDeltaContent IPersistableModel<InternalMessageDeltaContent>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalMessageDeltaContent>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalMessageDeltaContent(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalMessageDeltaContent)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalMessageDeltaContent>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalMessageDeltaContent FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalMessageDeltaContent(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalMessageDeltaContent.cs b/.dotnet/src/Generated/Models/InternalMessageDeltaContent.cs new file mode 100644 index 000000000..861574e93 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalMessageDeltaContent.cs @@ -0,0 +1,25 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + internal abstract partial class InternalMessageDeltaContent + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + protected InternalMessageDeltaContent() + { + } + + internal InternalMessageDeltaContent(string type, IDictionary<string, BinaryData> 
serializedAdditionalRawData) + { + Type = type; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal string Type { get; set; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalMessageDeltaContentImageFileObject.Serialization.cs b/.dotnet/src/Generated/Models/InternalMessageDeltaContentImageFileObject.Serialization.cs new file mode 100644 index 000000000..56a49b626 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalMessageDeltaContentImageFileObject.Serialization.cs @@ -0,0 +1,159 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class InternalMessageDeltaContentImageFileObject : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalMessageDeltaContentImageFileObject)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("index") != true) + { + writer.WritePropertyName("index"u8); + writer.WriteNumberValue(Index); + } + if (SerializedAdditionalRawData?.ContainsKey("image_file") != true && Optional.IsDefined(ImageFile)) + { + writer.WritePropertyName("image_file"u8); + writer.WriteObjectValue(ImageFile, options); + } + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + 
using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalMessageDeltaContentImageFileObject IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalMessageDeltaContentImageFileObject)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalMessageDeltaContentImageFileObject(document.RootElement, options); + } + + internal static InternalMessageDeltaContentImageFileObject DeserializeInternalMessageDeltaContentImageFileObject(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + int index = default; + InternalMessageDeltaContentImageFileObjectImageFile imageFile = default; + string type = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("index"u8)) + { + index = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("image_file"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + imageFile = InternalMessageDeltaContentImageFileObjectImageFile.DeserializeInternalMessageDeltaContentImageFileObjectImageFile(property.Value, options); + continue; + } + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + 
serializedAdditionalRawData = rawDataDictionary; + return new InternalMessageDeltaContentImageFileObject(type, serializedAdditionalRawData, index, imageFile); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalMessageDeltaContentImageFileObject)} does not support writing '{options.Format}' format."); + } + } + + InternalMessageDeltaContentImageFileObject IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalMessageDeltaContentImageFileObject(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalMessageDeltaContentImageFileObject)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static new InternalMessageDeltaContentImageFileObject FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalMessageDeltaContentImageFileObject(document.RootElement); + } + + internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalMessageDeltaContentImageFileObject.cs b/.dotnet/src/Generated/Models/InternalMessageDeltaContentImageFileObject.cs new file mode 100644 index 000000000..09615bdae --- /dev/null +++ 
b/.dotnet/src/Generated/Models/InternalMessageDeltaContentImageFileObject.cs @@ -0,0 +1,31 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + internal partial class InternalMessageDeltaContentImageFileObject : InternalMessageDeltaContent + { + internal InternalMessageDeltaContentImageFileObject(int index) + { + Type = "image_file"; + Index = index; + } + + internal InternalMessageDeltaContentImageFileObject(string type, IDictionary serializedAdditionalRawData, int index, InternalMessageDeltaContentImageFileObjectImageFile imageFile) : base(type, serializedAdditionalRawData) + { + Index = index; + ImageFile = imageFile; + } + + internal InternalMessageDeltaContentImageFileObject() + { + } + + public int Index { get; } + public InternalMessageDeltaContentImageFileObjectImageFile ImageFile { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalMessageDeltaContentImageFileObjectImageFile.Serialization.cs b/.dotnet/src/Generated/Models/InternalMessageDeltaContentImageFileObjectImageFile.Serialization.cs new file mode 100644 index 000000000..aeba95b5b --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalMessageDeltaContentImageFileObjectImageFile.Serialization.cs @@ -0,0 +1,144 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class InternalMessageDeltaContentImageFileObjectImageFile : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalMessageDeltaContentImageFileObjectImageFile)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("file_id") != true && Optional.IsDefined(FileId)) + { + writer.WritePropertyName("file_id"u8); + writer.WriteStringValue(FileId); + } + if (SerializedAdditionalRawData?.ContainsKey("detail") != true && Optional.IsDefined(Detail)) + { + writer.WritePropertyName("detail"u8); + writer.WriteStringValue(Detail); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalMessageDeltaContentImageFileObjectImageFile IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalMessageDeltaContentImageFileObjectImageFile)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalMessageDeltaContentImageFileObjectImageFile(document.RootElement, options); + } + + internal static InternalMessageDeltaContentImageFileObjectImageFile DeserializeInternalMessageDeltaContentImageFileObjectImageFile(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string fileId = default; + string detail = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("file_id"u8)) + { + fileId = property.Value.GetString(); + continue; + } + if (property.NameEquals("detail"u8)) + { + detail = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalMessageDeltaContentImageFileObjectImageFile(fileId, detail, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalMessageDeltaContentImageFileObjectImageFile)} does not support writing '{options.Format}' format."); + } + } + + InternalMessageDeltaContentImageFileObjectImageFile IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalMessageDeltaContentImageFileObjectImageFile(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalMessageDeltaContentImageFileObjectImageFile)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalMessageDeltaContentImageFileObjectImageFile FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalMessageDeltaContentImageFileObjectImageFile(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalMessageDeltaContentImageFileObjectImageFile.cs b/.dotnet/src/Generated/Models/InternalMessageDeltaContentImageFileObjectImageFile.cs new file mode 100644 index 000000000..ce44aab8c --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalMessageDeltaContentImageFileObjectImageFile.cs @@ -0,0 +1,26 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + internal partial class 
InternalMessageDeltaContentImageFileObjectImageFile + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + internal InternalMessageDeltaContentImageFileObjectImageFile() + { + } + + internal InternalMessageDeltaContentImageFileObjectImageFile(string fileId, string detail, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + FileId = fileId; + Detail = detail; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + public string FileId { get; } + public string Detail { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalMessageDeltaContentImageFileObjectImageFileDetail.cs b/.dotnet/src/Generated/Models/InternalMessageDeltaContentImageFileObjectImageFileDetail.cs new file mode 100644 index 000000000..3de4b3ed9 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalMessageDeltaContentImageFileObjectImageFileDetail.cs @@ -0,0 +1,38 @@ +// + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Assistants +{ + internal readonly partial struct InternalMessageDeltaContentImageFileObjectImageFileDetail : IEquatable<InternalMessageDeltaContentImageFileObjectImageFileDetail> + { + private readonly string _value; + + public InternalMessageDeltaContentImageFileObjectImageFileDetail(string value) + { + _value = value ?? 
throw new ArgumentNullException(nameof(value));
+        }
+
+        private const string AutoValue = "auto";
+        private const string LowValue = "low";
+        private const string HighValue = "high";
+
+        public static InternalMessageDeltaContentImageFileObjectImageFileDetail Auto { get; } = new InternalMessageDeltaContentImageFileObjectImageFileDetail(AutoValue);
+        public static InternalMessageDeltaContentImageFileObjectImageFileDetail Low { get; } = new InternalMessageDeltaContentImageFileObjectImageFileDetail(LowValue);
+        public static InternalMessageDeltaContentImageFileObjectImageFileDetail High { get; } = new InternalMessageDeltaContentImageFileObjectImageFileDetail(HighValue);
+        public static bool operator ==(InternalMessageDeltaContentImageFileObjectImageFileDetail left, InternalMessageDeltaContentImageFileObjectImageFileDetail right) => left.Equals(right);
+        public static bool operator !=(InternalMessageDeltaContentImageFileObjectImageFileDetail left, InternalMessageDeltaContentImageFileObjectImageFileDetail right) => !left.Equals(right);
+        public static implicit operator InternalMessageDeltaContentImageFileObjectImageFileDetail(string value) => new InternalMessageDeltaContentImageFileObjectImageFileDetail(value);
+
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override bool Equals(object obj) => obj is InternalMessageDeltaContentImageFileObjectImageFileDetail other && Equals(other);
+        public bool Equals(InternalMessageDeltaContentImageFileObjectImageFileDetail other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase);
+
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0;
+        public override string ToString() => _value;
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalMessageDeltaContentImageUrlObject.Serialization.cs b/.dotnet/src/Generated/Models/InternalMessageDeltaContentImageUrlObject.Serialization.cs
new file mode 100644
index 000000000..40317ea42
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalMessageDeltaContentImageUrlObject.Serialization.cs
@@ -0,0 +1,159 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace OpenAI.Assistants
+{
+    internal partial class InternalMessageDeltaContentImageUrlObject : IJsonModel<InternalMessageDeltaContentImageUrlObject>
+    {
+        void IJsonModel<InternalMessageDeltaContentImageUrlObject>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalMessageDeltaContentImageUrlObject>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(InternalMessageDeltaContentImageUrlObject)} does not support writing '{format}' format.");
+            }
+
+            writer.WriteStartObject();
+            if (SerializedAdditionalRawData?.ContainsKey("index") != true)
+            {
+                writer.WritePropertyName("index"u8);
+                writer.WriteNumberValue(Index);
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("image_url") != true && Optional.IsDefined(ImageUrl))
+            {
+                writer.WritePropertyName("image_url"u8);
+                writer.WriteObjectValue(ImageUrl, options);
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("type") != true)
+            {
+                writer.WritePropertyName("type"u8);
+                writer.WriteStringValue(Type);
+            }
+            if (SerializedAdditionalRawData != null)
+            {
+                foreach (var item in SerializedAdditionalRawData)
+                {
+                    if (ModelSerializationExtensions.IsSentinelValue(item.Value))
+                    {
+                        continue;
+                    }
+                    writer.WritePropertyName(item.Key);
+#if NET6_0_OR_GREATER
+                    writer.WriteRawValue(item.Value);
+#else
+                    using (JsonDocument document = JsonDocument.Parse(item.Value))
+                    {
+                        JsonSerializer.Serialize(writer, document.RootElement);
+                    }
+#endif
+                }
+            }
+            writer.WriteEndObject();
+        }
+
+        InternalMessageDeltaContentImageUrlObject IJsonModel<InternalMessageDeltaContentImageUrlObject>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalMessageDeltaContentImageUrlObject>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(InternalMessageDeltaContentImageUrlObject)} does not support reading '{format}' format.");
+            }
+
+            using JsonDocument document = JsonDocument.ParseValue(ref reader);
+            return DeserializeInternalMessageDeltaContentImageUrlObject(document.RootElement, options);
+        }
+
+        internal static InternalMessageDeltaContentImageUrlObject DeserializeInternalMessageDeltaContentImageUrlObject(JsonElement element, ModelReaderWriterOptions options = null)
+        {
+            options ??= ModelSerializationExtensions.WireOptions;
+
+            if (element.ValueKind == JsonValueKind.Null)
+            {
+                return null;
+            }
+            int index = default;
+            InternalMessageDeltaContentImageUrlObjectImageUrl imageUrl = default;
+            string type = default;
+            IDictionary<string, BinaryData> serializedAdditionalRawData = default;
+            Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>();
+            foreach (var property in element.EnumerateObject())
+            {
+                if (property.NameEquals("index"u8))
+                {
+                    index = property.Value.GetInt32();
+                    continue;
+                }
+                if (property.NameEquals("image_url"u8))
+                {
+                    if (property.Value.ValueKind == JsonValueKind.Null)
+                    {
+                        continue;
+                    }
+                    imageUrl = InternalMessageDeltaContentImageUrlObjectImageUrl.DeserializeInternalMessageDeltaContentImageUrlObjectImageUrl(property.Value, options);
+                    continue;
+                }
+                if (property.NameEquals("type"u8))
+                {
+                    type = property.Value.GetString();
+                    continue;
+                }
+                if (true)
+                {
+                    rawDataDictionary ??= new Dictionary<string, BinaryData>();
+                    rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText()));
+                }
+            }
+            serializedAdditionalRawData = rawDataDictionary;
+            return new InternalMessageDeltaContentImageUrlObject(type, serializedAdditionalRawData, index, imageUrl);
+        }
+
+        BinaryData IPersistableModel<InternalMessageDeltaContentImageUrlObject>.Write(ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalMessageDeltaContentImageUrlObject>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    return ModelReaderWriter.Write(this, options);
+                default:
+                    throw new FormatException($"The model {nameof(InternalMessageDeltaContentImageUrlObject)} does not support writing '{options.Format}' format.");
+            }
+        }
+
+        InternalMessageDeltaContentImageUrlObject IPersistableModel<InternalMessageDeltaContentImageUrlObject>.Create(BinaryData data, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalMessageDeltaContentImageUrlObject>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    {
+                        using JsonDocument document = JsonDocument.Parse(data);
+                        return DeserializeInternalMessageDeltaContentImageUrlObject(document.RootElement, options);
+                    }
+                default:
+                    throw new FormatException($"The model {nameof(InternalMessageDeltaContentImageUrlObject)} does not support reading '{options.Format}' format.");
+            }
+        }
+
+        string IPersistableModel<InternalMessageDeltaContentImageUrlObject>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J";
+
+        internal static new InternalMessageDeltaContentImageUrlObject FromResponse(PipelineResponse response)
+        {
+            using var document = JsonDocument.Parse(response.Content);
+            return DeserializeInternalMessageDeltaContentImageUrlObject(document.RootElement);
+        }
+
+        internal override BinaryContent ToBinaryContent()
+        {
+            return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions);
+        }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalMessageDeltaContentImageUrlObject.cs b/.dotnet/src/Generated/Models/InternalMessageDeltaContentImageUrlObject.cs
new file mode 100644
index 000000000..07fa17e8e
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalMessageDeltaContentImageUrlObject.cs
@@ -0,0 +1,31 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.Collections.Generic;
+
+namespace OpenAI.Assistants
+{
+    internal partial class InternalMessageDeltaContentImageUrlObject : InternalMessageDeltaContent
+    {
+        internal InternalMessageDeltaContentImageUrlObject(int index)
+        {
+            Type = "image_url";
+            Index = index;
+        }
+
+        internal InternalMessageDeltaContentImageUrlObject(string type, IDictionary<string, BinaryData> serializedAdditionalRawData, int index, InternalMessageDeltaContentImageUrlObjectImageUrl imageUrl) : base(type, serializedAdditionalRawData)
+        {
+            Index = index;
+            ImageUrl = imageUrl;
+        }
+
+        internal InternalMessageDeltaContentImageUrlObject()
+        {
+        }
+
+        public int Index { get; }
+        public InternalMessageDeltaContentImageUrlObjectImageUrl ImageUrl { get; }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalMessageDeltaContentImageUrlObjectImageUrl.Serialization.cs b/.dotnet/src/Generated/Models/InternalMessageDeltaContentImageUrlObjectImageUrl.Serialization.cs
new file mode 100644
index 000000000..17fdc9aaa
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalMessageDeltaContentImageUrlObjectImageUrl.Serialization.cs
@@ -0,0 +1,148 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace OpenAI.Assistants
+{
+    internal partial class InternalMessageDeltaContentImageUrlObjectImageUrl : IJsonModel<InternalMessageDeltaContentImageUrlObjectImageUrl>
+    {
+        void IJsonModel<InternalMessageDeltaContentImageUrlObjectImageUrl>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ?
((IPersistableModel<InternalMessageDeltaContentImageUrlObjectImageUrl>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(InternalMessageDeltaContentImageUrlObjectImageUrl)} does not support writing '{format}' format.");
+            }
+
+            writer.WriteStartObject();
+            if (SerializedAdditionalRawData?.ContainsKey("url") != true && Optional.IsDefined(Url))
+            {
+                writer.WritePropertyName("url"u8);
+                writer.WriteStringValue(Url.AbsoluteUri);
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("detail") != true && Optional.IsDefined(Detail))
+            {
+                writer.WritePropertyName("detail"u8);
+                writer.WriteStringValue(Detail);
+            }
+            if (SerializedAdditionalRawData != null)
+            {
+                foreach (var item in SerializedAdditionalRawData)
+                {
+                    if (ModelSerializationExtensions.IsSentinelValue(item.Value))
+                    {
+                        continue;
+                    }
+                    writer.WritePropertyName(item.Key);
+#if NET6_0_OR_GREATER
+                    writer.WriteRawValue(item.Value);
+#else
+                    using (JsonDocument document = JsonDocument.Parse(item.Value))
+                    {
+                        JsonSerializer.Serialize(writer, document.RootElement);
+                    }
+#endif
+                }
+            }
+            writer.WriteEndObject();
+        }
+
+        InternalMessageDeltaContentImageUrlObjectImageUrl IJsonModel<InternalMessageDeltaContentImageUrlObjectImageUrl>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalMessageDeltaContentImageUrlObjectImageUrl>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(InternalMessageDeltaContentImageUrlObjectImageUrl)} does not support reading '{format}' format.");
+            }
+
+            using JsonDocument document = JsonDocument.ParseValue(ref reader);
+            return DeserializeInternalMessageDeltaContentImageUrlObjectImageUrl(document.RootElement, options);
+        }
+
+        internal static InternalMessageDeltaContentImageUrlObjectImageUrl DeserializeInternalMessageDeltaContentImageUrlObjectImageUrl(JsonElement element, ModelReaderWriterOptions options = null)
+        {
+            options ??= ModelSerializationExtensions.WireOptions;
+
+            if (element.ValueKind == JsonValueKind.Null)
+            {
+                return null;
+            }
+            Uri url = default;
+            string detail = default;
+            IDictionary<string, BinaryData> serializedAdditionalRawData = default;
+            Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>();
+            foreach (var property in element.EnumerateObject())
+            {
+                if (property.NameEquals("url"u8))
+                {
+                    if (property.Value.ValueKind == JsonValueKind.Null)
+                    {
+                        continue;
+                    }
+                    url = new Uri(property.Value.GetString());
+                    continue;
+                }
+                if (property.NameEquals("detail"u8))
+                {
+                    detail = property.Value.GetString();
+                    continue;
+                }
+                if (true)
+                {
+                    rawDataDictionary ??= new Dictionary<string, BinaryData>();
+                    rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText()));
+                }
+            }
+            serializedAdditionalRawData = rawDataDictionary;
+            return new InternalMessageDeltaContentImageUrlObjectImageUrl(url, detail, serializedAdditionalRawData);
+        }
+
+        BinaryData IPersistableModel<InternalMessageDeltaContentImageUrlObjectImageUrl>.Write(ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalMessageDeltaContentImageUrlObjectImageUrl>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    return ModelReaderWriter.Write(this, options);
+                default:
+                    throw new FormatException($"The model {nameof(InternalMessageDeltaContentImageUrlObjectImageUrl)} does not support writing '{options.Format}' format.");
+            }
+        }
+
+        InternalMessageDeltaContentImageUrlObjectImageUrl IPersistableModel<InternalMessageDeltaContentImageUrlObjectImageUrl>.Create(BinaryData data, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalMessageDeltaContentImageUrlObjectImageUrl>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    {
+                        using JsonDocument document = JsonDocument.Parse(data);
+                        return DeserializeInternalMessageDeltaContentImageUrlObjectImageUrl(document.RootElement, options);
+                    }
+                default:
+                    throw new FormatException($"The model {nameof(InternalMessageDeltaContentImageUrlObjectImageUrl)} does not support reading '{options.Format}' format.");
+            }
+        }
+
+        string IPersistableModel<InternalMessageDeltaContentImageUrlObjectImageUrl>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J";
+
+        internal static InternalMessageDeltaContentImageUrlObjectImageUrl FromResponse(PipelineResponse response)
+        {
+            using var document = JsonDocument.Parse(response.Content);
+            return DeserializeInternalMessageDeltaContentImageUrlObjectImageUrl(document.RootElement);
+        }
+
+        internal virtual BinaryContent ToBinaryContent()
+        {
+            return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions);
+        }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalMessageDeltaContentImageUrlObjectImageUrl.cs b/.dotnet/src/Generated/Models/InternalMessageDeltaContentImageUrlObjectImageUrl.cs
new file mode 100644
index 000000000..886c78999
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalMessageDeltaContentImageUrlObjectImageUrl.cs
@@ -0,0 +1,26 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.Collections.Generic;
+
+namespace OpenAI.Assistants
+{
+    internal partial class
InternalMessageDeltaContentImageUrlObjectImageUrl
+    {
+        internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; }
+        internal InternalMessageDeltaContentImageUrlObjectImageUrl()
+        {
+        }
+
+        internal InternalMessageDeltaContentImageUrlObjectImageUrl(Uri url, string detail, IDictionary<string, BinaryData> serializedAdditionalRawData)
+        {
+            Url = url;
+            Detail = detail;
+            SerializedAdditionalRawData = serializedAdditionalRawData;
+        }
+
+        public Uri Url { get; }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalMessageDeltaContentImageUrlObjectImageUrlDetail.cs b/.dotnet/src/Generated/Models/InternalMessageDeltaContentImageUrlObjectImageUrlDetail.cs
new file mode 100644
index 000000000..ffd7b7bb4
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalMessageDeltaContentImageUrlObjectImageUrlDetail.cs
@@ -0,0 +1,38 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ComponentModel;
+
+namespace OpenAI.Assistants
+{
+    internal readonly partial struct InternalMessageDeltaContentImageUrlObjectImageUrlDetail : IEquatable<InternalMessageDeltaContentImageUrlObjectImageUrlDetail>
+    {
+        private readonly string _value;
+
+        public InternalMessageDeltaContentImageUrlObjectImageUrlDetail(string value)
+        {
+            _value = value ?? throw new ArgumentNullException(nameof(value));
+        }
+
+        private const string AutoValue = "auto";
+        private const string LowValue = "low";
+        private const string HighValue = "high";
+
+        public static InternalMessageDeltaContentImageUrlObjectImageUrlDetail Auto { get; } = new InternalMessageDeltaContentImageUrlObjectImageUrlDetail(AutoValue);
+        public static InternalMessageDeltaContentImageUrlObjectImageUrlDetail Low { get; } = new InternalMessageDeltaContentImageUrlObjectImageUrlDetail(LowValue);
+        public static InternalMessageDeltaContentImageUrlObjectImageUrlDetail High { get; } = new InternalMessageDeltaContentImageUrlObjectImageUrlDetail(HighValue);
+        public static bool operator ==(InternalMessageDeltaContentImageUrlObjectImageUrlDetail left, InternalMessageDeltaContentImageUrlObjectImageUrlDetail right) => left.Equals(right);
+        public static bool operator !=(InternalMessageDeltaContentImageUrlObjectImageUrlDetail left, InternalMessageDeltaContentImageUrlObjectImageUrlDetail right) => !left.Equals(right);
+        public static implicit operator InternalMessageDeltaContentImageUrlObjectImageUrlDetail(string value) => new InternalMessageDeltaContentImageUrlObjectImageUrlDetail(value);
+
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override bool Equals(object obj) => obj is InternalMessageDeltaContentImageUrlObjectImageUrlDetail other && Equals(other);
+        public bool Equals(InternalMessageDeltaContentImageUrlObjectImageUrlDetail other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase);
+
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0;
+        public override string ToString() => _value;
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalMessageDeltaContentRefusalObject.Serialization.cs b/.dotnet/src/Generated/Models/InternalMessageDeltaContentRefusalObject.Serialization.cs
new file mode 100644
index 000000000..1668fc35c
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalMessageDeltaContentRefusalObject.Serialization.cs
@@ -0,0 +1,155 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace OpenAI.Assistants
+{
+    internal partial class InternalMessageDeltaContentRefusalObject : IJsonModel<InternalMessageDeltaContentRefusalObject>
+    {
+        void IJsonModel<InternalMessageDeltaContentRefusalObject>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalMessageDeltaContentRefusalObject>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(InternalMessageDeltaContentRefusalObject)} does not support writing '{format}' format.");
+            }
+
+            writer.WriteStartObject();
+            if (SerializedAdditionalRawData?.ContainsKey("index") != true)
+            {
+                writer.WritePropertyName("index"u8);
+                writer.WriteNumberValue(Index);
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("refusal") != true && Optional.IsDefined(Refusal))
+            {
+                writer.WritePropertyName("refusal"u8);
+                writer.WriteStringValue(Refusal);
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("type") != true)
+            {
+                writer.WritePropertyName("type"u8);
+                writer.WriteStringValue(Type);
+            }
+            if (SerializedAdditionalRawData != null)
+            {
+                foreach (var item in SerializedAdditionalRawData)
+                {
+                    if (ModelSerializationExtensions.IsSentinelValue(item.Value))
+                    {
+                        continue;
+                    }
+                    writer.WritePropertyName(item.Key);
+#if NET6_0_OR_GREATER
+                    writer.WriteRawValue(item.Value);
+#else
+                    using (JsonDocument document = JsonDocument.Parse(item.Value))
+                    {
JsonSerializer.Serialize(writer, document.RootElement);
+                    }
+#endif
+                }
+            }
+            writer.WriteEndObject();
+        }
+
+        InternalMessageDeltaContentRefusalObject IJsonModel<InternalMessageDeltaContentRefusalObject>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalMessageDeltaContentRefusalObject>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(InternalMessageDeltaContentRefusalObject)} does not support reading '{format}' format.");
+            }
+
+            using JsonDocument document = JsonDocument.ParseValue(ref reader);
+            return DeserializeInternalMessageDeltaContentRefusalObject(document.RootElement, options);
+        }
+
+        internal static InternalMessageDeltaContentRefusalObject DeserializeInternalMessageDeltaContentRefusalObject(JsonElement element, ModelReaderWriterOptions options = null)
+        {
+            options ??= ModelSerializationExtensions.WireOptions;
+
+            if (element.ValueKind == JsonValueKind.Null)
+            {
+                return null;
+            }
+            int index = default;
+            string refusal = default;
+            string type = default;
+            IDictionary<string, BinaryData> serializedAdditionalRawData = default;
+            Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>();
+            foreach (var property in element.EnumerateObject())
+            {
+                if (property.NameEquals("index"u8))
+                {
+                    index = property.Value.GetInt32();
+                    continue;
+                }
+                if (property.NameEquals("refusal"u8))
+                {
+                    refusal = property.Value.GetString();
+                    continue;
+                }
+                if (property.NameEquals("type"u8))
+                {
+                    type = property.Value.GetString();
+                    continue;
+                }
+                if (true)
+                {
+                    rawDataDictionary ??= new Dictionary<string, BinaryData>();
+                    rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText()));
+                }
+            }
+            serializedAdditionalRawData = rawDataDictionary;
+            return new InternalMessageDeltaContentRefusalObject(type, serializedAdditionalRawData, index, refusal);
+        }
+
+        BinaryData IPersistableModel<InternalMessageDeltaContentRefusalObject>.Write(ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalMessageDeltaContentRefusalObject>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    return ModelReaderWriter.Write(this, options);
+                default:
+                    throw new FormatException($"The model {nameof(InternalMessageDeltaContentRefusalObject)} does not support writing '{options.Format}' format.");
+            }
+        }
+
+        InternalMessageDeltaContentRefusalObject IPersistableModel<InternalMessageDeltaContentRefusalObject>.Create(BinaryData data, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalMessageDeltaContentRefusalObject>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    {
+                        using JsonDocument document = JsonDocument.Parse(data);
+                        return DeserializeInternalMessageDeltaContentRefusalObject(document.RootElement, options);
+                    }
+                default:
+                    throw new FormatException($"The model {nameof(InternalMessageDeltaContentRefusalObject)} does not support reading '{options.Format}' format.");
+            }
+        }
+
+        string IPersistableModel<InternalMessageDeltaContentRefusalObject>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J";
+
+        internal static new InternalMessageDeltaContentRefusalObject FromResponse(PipelineResponse response)
+        {
+            using var document = JsonDocument.Parse(response.Content);
+            return DeserializeInternalMessageDeltaContentRefusalObject(document.RootElement);
+        }
+
+        internal override BinaryContent ToBinaryContent()
+        {
+            return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions);
+        }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalMessageDeltaContentRefusalObject.cs b/.dotnet/src/Generated/Models/InternalMessageDeltaContentRefusalObject.cs
new file mode 100644
index 000000000..d7b88a830
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalMessageDeltaContentRefusalObject.cs
@@ -0,0 +1,31 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.Collections.Generic;
+
+namespace OpenAI.Assistants
+{
+    internal partial class InternalMessageDeltaContentRefusalObject : InternalMessageDeltaContent
+    {
+        internal InternalMessageDeltaContentRefusalObject(int index)
+        {
+            Type = "refusal";
+            Index = index;
+        }
+
+        internal InternalMessageDeltaContentRefusalObject(string type, IDictionary<string, BinaryData> serializedAdditionalRawData, int index, string refusal) : base(type, serializedAdditionalRawData)
+        {
+            Index = index;
+            Refusal = refusal;
+        }
+
+        internal InternalMessageDeltaContentRefusalObject()
+        {
+        }
+
+        public int Index { get; }
+        public string Refusal { get; }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalMessageDeltaContentTextAnnotationsFileCitationObject.Serialization.cs b/.dotnet/src/Generated/Models/InternalMessageDeltaContentTextAnnotationsFileCitationObject.Serialization.cs
new file mode 100644
index 000000000..f35ad0979
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalMessageDeltaContentTextAnnotationsFileCitationObject.Serialization.cs
@@ -0,0 +1,207 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace OpenAI.Assistants
+{
+    internal partial class InternalMessageDeltaContentTextAnnotationsFileCitationObject : IJsonModel<InternalMessageDeltaContentTextAnnotationsFileCitationObject>
+    {
+        void IJsonModel<InternalMessageDeltaContentTextAnnotationsFileCitationObject>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ?
((IPersistableModel<InternalMessageDeltaContentTextAnnotationsFileCitationObject>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(InternalMessageDeltaContentTextAnnotationsFileCitationObject)} does not support writing '{format}' format.");
+            }
+
+            writer.WriteStartObject();
+            if (SerializedAdditionalRawData?.ContainsKey("index") != true)
+            {
+                writer.WritePropertyName("index"u8);
+                writer.WriteNumberValue(Index);
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("text") != true && Optional.IsDefined(Text))
+            {
+                writer.WritePropertyName("text"u8);
+                writer.WriteStringValue(Text);
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("file_citation") != true && Optional.IsDefined(FileCitation))
+            {
+                writer.WritePropertyName("file_citation"u8);
+                writer.WriteObjectValue(FileCitation, options);
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("start_index") != true && Optional.IsDefined(StartIndex))
+            {
+                writer.WritePropertyName("start_index"u8);
+                writer.WriteNumberValue(StartIndex.Value);
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("end_index") != true && Optional.IsDefined(EndIndex))
+            {
+                writer.WritePropertyName("end_index"u8);
+                writer.WriteNumberValue(EndIndex.Value);
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("type") != true)
+            {
+                writer.WritePropertyName("type"u8);
+                writer.WriteStringValue(Type);
+            }
+            if (SerializedAdditionalRawData != null)
+            {
+                foreach (var item in SerializedAdditionalRawData)
+                {
+                    if (ModelSerializationExtensions.IsSentinelValue(item.Value))
+                    {
+                        continue;
+                    }
+                    writer.WritePropertyName(item.Key);
+#if NET6_0_OR_GREATER
+                    writer.WriteRawValue(item.Value);
+#else
+                    using (JsonDocument document = JsonDocument.Parse(item.Value))
+                    {
+                        JsonSerializer.Serialize(writer, document.RootElement);
+                    }
+#endif
+                }
+            }
+            writer.WriteEndObject();
+        }
+
+        InternalMessageDeltaContentTextAnnotationsFileCitationObject IJsonModel<InternalMessageDeltaContentTextAnnotationsFileCitationObject>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalMessageDeltaContentTextAnnotationsFileCitationObject>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(InternalMessageDeltaContentTextAnnotationsFileCitationObject)} does not support reading '{format}' format.");
+            }
+
+            using JsonDocument document = JsonDocument.ParseValue(ref reader);
+            return DeserializeInternalMessageDeltaContentTextAnnotationsFileCitationObject(document.RootElement, options);
+        }
+
+        internal static InternalMessageDeltaContentTextAnnotationsFileCitationObject DeserializeInternalMessageDeltaContentTextAnnotationsFileCitationObject(JsonElement element, ModelReaderWriterOptions options = null)
+        {
+            options ??= ModelSerializationExtensions.WireOptions;
+
+            if (element.ValueKind == JsonValueKind.Null)
+            {
+                return null;
+            }
+            int index = default;
+            string text = default;
+            InternalMessageDeltaContentTextAnnotationsFileCitationObjectFileCitation fileCitation = default;
+            int? startIndex = default;
+            int? endIndex = default;
+            string type = default;
+            IDictionary<string, BinaryData> serializedAdditionalRawData = default;
+            Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>();
+            foreach (var property in element.EnumerateObject())
+            {
+                if (property.NameEquals("index"u8))
+                {
+                    index = property.Value.GetInt32();
+                    continue;
+                }
+                if (property.NameEquals("text"u8))
+                {
+                    text = property.Value.GetString();
+                    continue;
+                }
+                if (property.NameEquals("file_citation"u8))
+                {
+                    if (property.Value.ValueKind == JsonValueKind.Null)
+                    {
+                        continue;
+                    }
+                    fileCitation = InternalMessageDeltaContentTextAnnotationsFileCitationObjectFileCitation.DeserializeInternalMessageDeltaContentTextAnnotationsFileCitationObjectFileCitation(property.Value, options);
+                    continue;
+                }
+                if (property.NameEquals("start_index"u8))
+                {
+                    if (property.Value.ValueKind == JsonValueKind.Null)
+                    {
+                        continue;
+                    }
+                    startIndex = property.Value.GetInt32();
+                    continue;
+                }
+                if (property.NameEquals("end_index"u8))
+                {
+                    if (property.Value.ValueKind == JsonValueKind.Null)
+                    {
+                        continue;
+                    }
+                    endIndex = property.Value.GetInt32();
+                    continue;
+                }
+                if (property.NameEquals("type"u8))
+                {
+                    type = property.Value.GetString();
+                    continue;
+                }
+                if (true)
+                {
+                    rawDataDictionary ??= new Dictionary<string, BinaryData>();
+                    rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText()));
+                }
+            }
+            serializedAdditionalRawData = rawDataDictionary;
+            return new InternalMessageDeltaContentTextAnnotationsFileCitationObject(
+                type,
+                serializedAdditionalRawData,
+                index,
+                text,
+                fileCitation,
+                startIndex,
+                endIndex);
+        }
+
+        BinaryData IPersistableModel<InternalMessageDeltaContentTextAnnotationsFileCitationObject>.Write(ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalMessageDeltaContentTextAnnotationsFileCitationObject>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    return ModelReaderWriter.Write(this, options);
+                default:
+                    throw new FormatException($"The model {nameof(InternalMessageDeltaContentTextAnnotationsFileCitationObject)} does not support writing '{options.Format}' format.");
+            }
+        }
+
+        InternalMessageDeltaContentTextAnnotationsFileCitationObject IPersistableModel<InternalMessageDeltaContentTextAnnotationsFileCitationObject>.Create(BinaryData data, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<InternalMessageDeltaContentTextAnnotationsFileCitationObject>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    {
+                        using JsonDocument document = JsonDocument.Parse(data);
+                        return DeserializeInternalMessageDeltaContentTextAnnotationsFileCitationObject(document.RootElement, options);
+                    }
+                default:
+                    throw new FormatException($"The model {nameof(InternalMessageDeltaContentTextAnnotationsFileCitationObject)} does not support reading '{options.Format}' format.");
+            }
+        }
+
+        string IPersistableModel<InternalMessageDeltaContentTextAnnotationsFileCitationObject>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J";
+
+        internal static new InternalMessageDeltaContentTextAnnotationsFileCitationObject FromResponse(PipelineResponse response)
+        {
+            using var document = JsonDocument.Parse(response.Content);
+            return DeserializeInternalMessageDeltaContentTextAnnotationsFileCitationObject(document.RootElement);
+        }
+
+        internal override BinaryContent ToBinaryContent()
+        {
+            return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions);
+        }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalMessageDeltaContentTextAnnotationsFileCitationObject.cs b/.dotnet/src/Generated/Models/InternalMessageDeltaContentTextAnnotationsFileCitationObject.cs
new file mode 100644
index 000000000..a96c66517
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalMessageDeltaContentTextAnnotationsFileCitationObject.cs
@@ -0,0 +1,37 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.Collections.Generic;
+
+namespace OpenAI.Assistants +{ + internal partial class InternalMessageDeltaContentTextAnnotationsFileCitationObject : InternalMessageDeltaTextContentAnnotation + { + internal InternalMessageDeltaContentTextAnnotationsFileCitationObject(int index) + { + Type = "file_citation"; + Index = index; + } + + internal InternalMessageDeltaContentTextAnnotationsFileCitationObject(string type, IDictionary serializedAdditionalRawData, int index, string text, InternalMessageDeltaContentTextAnnotationsFileCitationObjectFileCitation fileCitation, int? startIndex, int? endIndex) : base(type, serializedAdditionalRawData) + { + Index = index; + Text = text; + FileCitation = fileCitation; + StartIndex = startIndex; + EndIndex = endIndex; + } + + internal InternalMessageDeltaContentTextAnnotationsFileCitationObject() + { + } + + public int Index { get; } + public string Text { get; } + public InternalMessageDeltaContentTextAnnotationsFileCitationObjectFileCitation FileCitation { get; } + public int? StartIndex { get; } + public int? EndIndex { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalMessageDeltaContentTextAnnotationsFileCitationObjectFileCitation.Serialization.cs b/.dotnet/src/Generated/Models/InternalMessageDeltaContentTextAnnotationsFileCitationObjectFileCitation.Serialization.cs new file mode 100644 index 000000000..61580bfdb --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalMessageDeltaContentTextAnnotationsFileCitationObjectFileCitation.Serialization.cs @@ -0,0 +1,144 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class InternalMessageDeltaContentTextAnnotationsFileCitationObjectFileCitation : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalMessageDeltaContentTextAnnotationsFileCitationObjectFileCitation)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("file_id") != true && Optional.IsDefined(FileId)) + { + writer.WritePropertyName("file_id"u8); + writer.WriteStringValue(FileId); + } + if (SerializedAdditionalRawData?.ContainsKey("quote") != true && Optional.IsDefined(Quote)) + { + writer.WritePropertyName("quote"u8); + writer.WriteStringValue(Quote); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalMessageDeltaContentTextAnnotationsFileCitationObjectFileCitation IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalMessageDeltaContentTextAnnotationsFileCitationObjectFileCitation)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalMessageDeltaContentTextAnnotationsFileCitationObjectFileCitation(document.RootElement, options); + } + + internal static InternalMessageDeltaContentTextAnnotationsFileCitationObjectFileCitation DeserializeInternalMessageDeltaContentTextAnnotationsFileCitationObjectFileCitation(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string fileId = default; + string quote = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("file_id"u8)) + { + fileId = property.Value.GetString(); + continue; + } + if (property.NameEquals("quote"u8)) + { + quote = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalMessageDeltaContentTextAnnotationsFileCitationObjectFileCitation(fileId, quote, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalMessageDeltaContentTextAnnotationsFileCitationObjectFileCitation)} does not support writing '{options.Format}' format."); + } + } + + InternalMessageDeltaContentTextAnnotationsFileCitationObjectFileCitation IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalMessageDeltaContentTextAnnotationsFileCitationObjectFileCitation(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalMessageDeltaContentTextAnnotationsFileCitationObjectFileCitation)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalMessageDeltaContentTextAnnotationsFileCitationObjectFileCitation FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalMessageDeltaContentTextAnnotationsFileCitationObjectFileCitation(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalMessageDeltaContentTextAnnotationsFileCitationObjectFileCitation.cs b/.dotnet/src/Generated/Models/InternalMessageDeltaContentTextAnnotationsFileCitationObjectFileCitation.cs new file mode 100644 index 000000000..8451e2157 --- /dev/null +++ 
b/.dotnet/src/Generated/Models/InternalMessageDeltaContentTextAnnotationsFileCitationObjectFileCitation.cs
@@ -0,0 +1,27 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.Collections.Generic;
+
+namespace OpenAI.Assistants
+{
+    internal partial class InternalMessageDeltaContentTextAnnotationsFileCitationObjectFileCitation
+    {
+        internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; }
+        internal InternalMessageDeltaContentTextAnnotationsFileCitationObjectFileCitation()
+        {
+        }
+
+        internal InternalMessageDeltaContentTextAnnotationsFileCitationObjectFileCitation(string fileId, string quote, IDictionary<string, BinaryData> serializedAdditionalRawData)
+        {
+            FileId = fileId;
+            Quote = quote;
+            SerializedAdditionalRawData = serializedAdditionalRawData;
+        }
+
+        public string FileId { get; }
+        public string Quote { get; }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalMessageDeltaContentTextAnnotationsFilePathObject.Serialization.cs b/.dotnet/src/Generated/Models/InternalMessageDeltaContentTextAnnotationsFilePathObject.Serialization.cs
new file mode 100644
index 000000000..1ed21a80d
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalMessageDeltaContentTextAnnotationsFilePathObject.Serialization.cs
@@ -0,0 +1,207 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace OpenAI.Assistants
+{
+    internal partial class InternalMessageDeltaContentTextAnnotationsFilePathObject : IJsonModel<InternalMessageDeltaContentTextAnnotationsFilePathObject>
+    {
+        void IJsonModel<InternalMessageDeltaContentTextAnnotationsFilePathObject>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ?
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalMessageDeltaContentTextAnnotationsFilePathObject)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("index") != true) + { + writer.WritePropertyName("index"u8); + writer.WriteNumberValue(Index); + } + if (SerializedAdditionalRawData?.ContainsKey("text") != true && Optional.IsDefined(Text)) + { + writer.WritePropertyName("text"u8); + writer.WriteStringValue(Text); + } + if (SerializedAdditionalRawData?.ContainsKey("file_path") != true && Optional.IsDefined(FilePath)) + { + writer.WritePropertyName("file_path"u8); + writer.WriteObjectValue(FilePath, options); + } + if (SerializedAdditionalRawData?.ContainsKey("start_index") != true && Optional.IsDefined(StartIndex)) + { + writer.WritePropertyName("start_index"u8); + writer.WriteNumberValue(StartIndex.Value); + } + if (SerializedAdditionalRawData?.ContainsKey("end_index") != true && Optional.IsDefined(EndIndex)) + { + writer.WritePropertyName("end_index"u8); + writer.WriteNumberValue(EndIndex.Value); + } + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalMessageDeltaContentTextAnnotationsFilePathObject IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalMessageDeltaContentTextAnnotationsFilePathObject)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalMessageDeltaContentTextAnnotationsFilePathObject(document.RootElement, options); + } + + internal static InternalMessageDeltaContentTextAnnotationsFilePathObject DeserializeInternalMessageDeltaContentTextAnnotationsFilePathObject(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + int index = default; + string text = default; + InternalMessageDeltaContentTextAnnotationsFilePathObjectFilePath filePath = default; + int? startIndex = default; + int? endIndex = default; + string type = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("index"u8)) + { + index = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("text"u8)) + { + text = property.Value.GetString(); + continue; + } + if (property.NameEquals("file_path"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + filePath = InternalMessageDeltaContentTextAnnotationsFilePathObjectFilePath.DeserializeInternalMessageDeltaContentTextAnnotationsFilePathObjectFilePath(property.Value, options); + continue; + } + if (property.NameEquals("start_index"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + startIndex = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("end_index"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + endIndex = 
property.Value.GetInt32(); + continue; + } + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalMessageDeltaContentTextAnnotationsFilePathObject( + type, + serializedAdditionalRawData, + index, + text, + filePath, + startIndex, + endIndex); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalMessageDeltaContentTextAnnotationsFilePathObject)} does not support writing '{options.Format}' format."); + } + } + + InternalMessageDeltaContentTextAnnotationsFilePathObject IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalMessageDeltaContentTextAnnotationsFilePathObject(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalMessageDeltaContentTextAnnotationsFilePathObject)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static new InternalMessageDeltaContentTextAnnotationsFilePathObject FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalMessageDeltaContentTextAnnotationsFilePathObject(document.RootElement); + } + + internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalMessageDeltaContentTextAnnotationsFilePathObject.cs b/.dotnet/src/Generated/Models/InternalMessageDeltaContentTextAnnotationsFilePathObject.cs new file mode 100644 index 000000000..92320c4f5 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalMessageDeltaContentTextAnnotationsFilePathObject.cs @@ -0,0 +1,37 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + internal partial class InternalMessageDeltaContentTextAnnotationsFilePathObject : InternalMessageDeltaTextContentAnnotation + { + internal InternalMessageDeltaContentTextAnnotationsFilePathObject(int index) + { + Type = "file_path"; + Index = index; + } + + internal InternalMessageDeltaContentTextAnnotationsFilePathObject(string type, IDictionary serializedAdditionalRawData, int index, string text, InternalMessageDeltaContentTextAnnotationsFilePathObjectFilePath filePath, int? startIndex, int? 
endIndex) : base(type, serializedAdditionalRawData)
+        {
+            Index = index;
+            Text = text;
+            FilePath = filePath;
+            StartIndex = startIndex;
+            EndIndex = endIndex;
+        }
+
+        internal InternalMessageDeltaContentTextAnnotationsFilePathObject()
+        {
+        }
+
+        public int Index { get; }
+        public string Text { get; }
+        public InternalMessageDeltaContentTextAnnotationsFilePathObjectFilePath FilePath { get; }
+        public int? StartIndex { get; }
+        public int? EndIndex { get; }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalMessageDeltaContentTextAnnotationsFilePathObjectFilePath.Serialization.cs b/.dotnet/src/Generated/Models/InternalMessageDeltaContentTextAnnotationsFilePathObjectFilePath.Serialization.cs
new file mode 100644
index 000000000..2cabf1597
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalMessageDeltaContentTextAnnotationsFilePathObjectFilePath.Serialization.cs
@@ -0,0 +1,133 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace OpenAI.Assistants
+{
+    internal partial class InternalMessageDeltaContentTextAnnotationsFilePathObjectFilePath : IJsonModel<InternalMessageDeltaContentTextAnnotationsFilePathObjectFilePath>
+    {
+        void IJsonModel<InternalMessageDeltaContentTextAnnotationsFilePathObjectFilePath>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ?
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalMessageDeltaContentTextAnnotationsFilePathObjectFilePath)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("file_id") != true && Optional.IsDefined(FileId)) + { + writer.WritePropertyName("file_id"u8); + writer.WriteStringValue(FileId); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalMessageDeltaContentTextAnnotationsFilePathObjectFilePath IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalMessageDeltaContentTextAnnotationsFilePathObjectFilePath)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalMessageDeltaContentTextAnnotationsFilePathObjectFilePath(document.RootElement, options); + } + + internal static InternalMessageDeltaContentTextAnnotationsFilePathObjectFilePath DeserializeInternalMessageDeltaContentTextAnnotationsFilePathObjectFilePath(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string fileId = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("file_id"u8)) + { + fileId = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalMessageDeltaContentTextAnnotationsFilePathObjectFilePath(fileId, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalMessageDeltaContentTextAnnotationsFilePathObjectFilePath)} does not support writing '{options.Format}' format."); + } + } + + InternalMessageDeltaContentTextAnnotationsFilePathObjectFilePath IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalMessageDeltaContentTextAnnotationsFilePathObjectFilePath(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalMessageDeltaContentTextAnnotationsFilePathObjectFilePath)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalMessageDeltaContentTextAnnotationsFilePathObjectFilePath FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalMessageDeltaContentTextAnnotationsFilePathObjectFilePath(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalMessageDeltaContentTextAnnotationsFilePathObjectFilePath.cs b/.dotnet/src/Generated/Models/InternalMessageDeltaContentTextAnnotationsFilePathObjectFilePath.cs new file mode 100644 index 000000000..3e41f99b0 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalMessageDeltaContentTextAnnotationsFilePathObjectFilePath.cs @@ -0,0 +1,25 @@ +// + +#nullable disable + +using System; +using 
System.Collections.Generic;
+
+namespace OpenAI.Assistants
+{
+    internal partial class InternalMessageDeltaContentTextAnnotationsFilePathObjectFilePath
+    {
+        internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; }
+        internal InternalMessageDeltaContentTextAnnotationsFilePathObjectFilePath()
+        {
+        }
+
+        internal InternalMessageDeltaContentTextAnnotationsFilePathObjectFilePath(string fileId, IDictionary<string, BinaryData> serializedAdditionalRawData)
+        {
+            FileId = fileId;
+            SerializedAdditionalRawData = serializedAdditionalRawData;
+        }
+
+        public string FileId { get; }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalMessageDeltaContentTextObject.Serialization.cs b/.dotnet/src/Generated/Models/InternalMessageDeltaContentTextObject.Serialization.cs
new file mode 100644
index 000000000..75983efa0
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalMessageDeltaContentTextObject.Serialization.cs
@@ -0,0 +1,159 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace OpenAI.Assistants
+{
+    internal partial class InternalMessageDeltaContentTextObject : IJsonModel<InternalMessageDeltaContentTextObject>
+    {
+        void IJsonModel<InternalMessageDeltaContentTextObject>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ?
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalMessageDeltaContentTextObject)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("index") != true) + { + writer.WritePropertyName("index"u8); + writer.WriteNumberValue(Index); + } + if (SerializedAdditionalRawData?.ContainsKey("text") != true && Optional.IsDefined(Text)) + { + writer.WritePropertyName("text"u8); + writer.WriteObjectValue(Text, options); + } + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalMessageDeltaContentTextObject IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalMessageDeltaContentTextObject)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalMessageDeltaContentTextObject(document.RootElement, options); + } + + internal static InternalMessageDeltaContentTextObject DeserializeInternalMessageDeltaContentTextObject(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + int index = default; + InternalMessageDeltaContentTextObjectText text = default; + string type = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("index"u8)) + { + index = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("text"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + text = InternalMessageDeltaContentTextObjectText.DeserializeInternalMessageDeltaContentTextObjectText(property.Value, options); + continue; + } + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalMessageDeltaContentTextObject(type, serializedAdditionalRawData, index, text); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalMessageDeltaContentTextObject)} does not support writing '{options.Format}' format."); + } + } + + InternalMessageDeltaContentTextObject IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalMessageDeltaContentTextObject(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalMessageDeltaContentTextObject)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static new InternalMessageDeltaContentTextObject FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalMessageDeltaContentTextObject(document.RootElement); + } + + internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalMessageDeltaContentTextObject.cs b/.dotnet/src/Generated/Models/InternalMessageDeltaContentTextObject.cs new file mode 100644 index 000000000..a4154f431 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalMessageDeltaContentTextObject.cs @@ -0,0 +1,31 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + internal partial class InternalMessageDeltaContentTextObject : InternalMessageDeltaContent + { + internal InternalMessageDeltaContentTextObject(int index) + { + Type = 
"text";
+            Index = index;
+        }
+
+        internal InternalMessageDeltaContentTextObject(string type, IDictionary<string, BinaryData> serializedAdditionalRawData, int index, InternalMessageDeltaContentTextObjectText text) : base(type, serializedAdditionalRawData)
+        {
+            Index = index;
+            Text = text;
+        }
+
+        internal InternalMessageDeltaContentTextObject()
+        {
+        }
+
+        public int Index { get; }
+        public InternalMessageDeltaContentTextObjectText Text { get; }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalMessageDeltaContentTextObjectText.Serialization.cs b/.dotnet/src/Generated/Models/InternalMessageDeltaContentTextObjectText.Serialization.cs
new file mode 100644
index 000000000..fbc18edf5
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalMessageDeltaContentTextObjectText.Serialization.cs
@@ -0,0 +1,158 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace OpenAI.Assistants
+{
+    internal partial class InternalMessageDeltaContentTextObjectText : IJsonModel<InternalMessageDeltaContentTextObjectText>
+    {
+        void IJsonModel<InternalMessageDeltaContentTextObjectText>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ?
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalMessageDeltaContentTextObjectText)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("value") != true && Optional.IsDefined(Value)) + { + writer.WritePropertyName("value"u8); + writer.WriteStringValue(Value); + } + if (SerializedAdditionalRawData?.ContainsKey("annotations") != true && Optional.IsCollectionDefined(Annotations)) + { + writer.WritePropertyName("annotations"u8); + writer.WriteStartArray(); + foreach (var item in Annotations) + { + writer.WriteObjectValue(item, options); + } + writer.WriteEndArray(); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalMessageDeltaContentTextObjectText IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalMessageDeltaContentTextObjectText)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalMessageDeltaContentTextObjectText(document.RootElement, options); + } + + internal static InternalMessageDeltaContentTextObjectText DeserializeInternalMessageDeltaContentTextObjectText(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string value = default; + IReadOnlyList annotations = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("value"u8)) + { + value = property.Value.GetString(); + continue; + } + if (property.NameEquals("annotations"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + List array = new List(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(InternalMessageDeltaTextContentAnnotation.DeserializeInternalMessageDeltaTextContentAnnotation(item, options)); + } + annotations = array; + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalMessageDeltaContentTextObjectText(value, annotations ?? new ChangeTrackingList(), serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalMessageDeltaContentTextObjectText)} does not support writing '{options.Format}' format."); + } + } + + InternalMessageDeltaContentTextObjectText IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalMessageDeltaContentTextObjectText(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalMessageDeltaContentTextObjectText)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalMessageDeltaContentTextObjectText FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalMessageDeltaContentTextObjectText(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalMessageDeltaContentTextObjectText.cs b/.dotnet/src/Generated/Models/InternalMessageDeltaContentTextObjectText.cs new file mode 100644 index 000000000..f2b651436 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalMessageDeltaContentTextObjectText.cs @@ -0,0 +1,28 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + internal partial class InternalMessageDeltaContentTextObjectText + { + internal IDictionary SerializedAdditionalRawData { get; set; } + 
internal InternalMessageDeltaContentTextObjectText()
+        {
+            Annotations = new ChangeTrackingList<InternalMessageDeltaTextContentAnnotation>();
+        }
+
+        internal InternalMessageDeltaContentTextObjectText(string value, IReadOnlyList<InternalMessageDeltaTextContentAnnotation> annotations, IDictionary<string, BinaryData> serializedAdditionalRawData)
+        {
+            Value = value;
+            Annotations = annotations;
+            SerializedAdditionalRawData = serializedAdditionalRawData;
+        }
+
+        public string Value { get; }
+        public IReadOnlyList<InternalMessageDeltaTextContentAnnotation> Annotations { get; }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalMessageDeltaObject.Serialization.cs b/.dotnet/src/Generated/Models/InternalMessageDeltaObject.Serialization.cs
new file mode 100644
index 000000000..1bba65403
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalMessageDeltaObject.Serialization.cs
@@ -0,0 +1,155 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace OpenAI.Assistants
+{
+    internal partial class InternalMessageDeltaObject : IJsonModel<InternalMessageDeltaObject>
+    {
+        void IJsonModel<InternalMessageDeltaObject>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ?
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalMessageDeltaObject)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("id") != true) + { + writer.WritePropertyName("id"u8); + writer.WriteStringValue(Id); + } + if (SerializedAdditionalRawData?.ContainsKey("object") != true) + { + writer.WritePropertyName("object"u8); + writer.WriteStringValue(Object.ToString()); + } + if (SerializedAdditionalRawData?.ContainsKey("delta") != true) + { + writer.WritePropertyName("delta"u8); + writer.WriteObjectValue(Delta, options); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalMessageDeltaObject IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalMessageDeltaObject)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalMessageDeltaObject(document.RootElement, options); + } + + internal static InternalMessageDeltaObject DeserializeInternalMessageDeltaObject(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string id = default; + InternalMessageDeltaObjectObject @object = default; + InternalMessageDeltaObjectDelta delta = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("id"u8)) + { + id = property.Value.GetString(); + continue; + } + if (property.NameEquals("object"u8)) + { + @object = new InternalMessageDeltaObjectObject(property.Value.GetString()); + continue; + } + if (property.NameEquals("delta"u8)) + { + delta = InternalMessageDeltaObjectDelta.DeserializeInternalMessageDeltaObjectDelta(property.Value, options); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalMessageDeltaObject(id, @object, delta, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalMessageDeltaObject)} does not support writing '{options.Format}' format."); + } + } + + InternalMessageDeltaObject IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalMessageDeltaObject(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalMessageDeltaObject)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalMessageDeltaObject FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalMessageDeltaObject(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalMessageDeltaObject.cs b/.dotnet/src/Generated/Models/InternalMessageDeltaObject.cs new file mode 100644 index 000000000..92dad3abb --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalMessageDeltaObject.cs @@ -0,0 +1,39 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + internal partial class InternalMessageDeltaObject + { + internal IDictionary SerializedAdditionalRawData { get; set; } + internal InternalMessageDeltaObject(string id, InternalMessageDeltaObjectDelta delta) + { + Argument.AssertNotNull(id, nameof(id)); + 
Argument.AssertNotNull(delta, nameof(delta));
+
+            Id = id;
+            Delta = delta;
+        }
+
+        internal InternalMessageDeltaObject(string id, InternalMessageDeltaObjectObject @object, InternalMessageDeltaObjectDelta delta, IDictionary<string, BinaryData> serializedAdditionalRawData)
+        {
+            Id = id;
+            Object = @object;
+            Delta = delta;
+            SerializedAdditionalRawData = serializedAdditionalRawData;
+        }
+
+        internal InternalMessageDeltaObject()
+        {
+        }
+
+        public string Id { get; }
+        public InternalMessageDeltaObjectObject Object { get; } = InternalMessageDeltaObjectObject.ThreadMessageDelta;
+
+        public InternalMessageDeltaObjectDelta Delta { get; }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalMessageDeltaObjectDelta.Serialization.cs b/.dotnet/src/Generated/Models/InternalMessageDeltaObjectDelta.Serialization.cs
new file mode 100644
index 000000000..93e19568b
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalMessageDeltaObjectDelta.Serialization.cs
@@ -0,0 +1,162 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace OpenAI.Assistants
+{
+    internal partial class InternalMessageDeltaObjectDelta : IJsonModel<InternalMessageDeltaObjectDelta>
+    {
+        void IJsonModel<InternalMessageDeltaObjectDelta>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ?
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalMessageDeltaObjectDelta)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("role") != true) + { + writer.WritePropertyName("role"u8); + writer.WriteStringValue(Role.ToSerialString()); + } + if (SerializedAdditionalRawData?.ContainsKey("content") != true && Optional.IsCollectionDefined(Content)) + { + writer.WritePropertyName("content"u8); + writer.WriteStartArray(); + foreach (var item in Content) + { + writer.WriteObjectValue(item, options); + } + writer.WriteEndArray(); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalMessageDeltaObjectDelta IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalMessageDeltaObjectDelta)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalMessageDeltaObjectDelta(document.RootElement, options); + } + + internal static InternalMessageDeltaObjectDelta DeserializeInternalMessageDeltaObjectDelta(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + MessageRole role = default; + IReadOnlyList content = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("role"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + role = property.Value.GetString().ToMessageRole(); + continue; + } + if (property.NameEquals("content"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + List array = new List(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(InternalMessageDeltaContent.DeserializeInternalMessageDeltaContent(item, options)); + } + content = array; + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalMessageDeltaObjectDelta(role, content ?? new ChangeTrackingList(), serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalMessageDeltaObjectDelta)} does not support writing '{options.Format}' format."); + } + } + + InternalMessageDeltaObjectDelta IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalMessageDeltaObjectDelta(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalMessageDeltaObjectDelta)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalMessageDeltaObjectDelta FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalMessageDeltaObjectDelta(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalMessageDeltaObjectDelta.cs b/.dotnet/src/Generated/Models/InternalMessageDeltaObjectDelta.cs new file mode 100644 index 000000000..63a7af934 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalMessageDeltaObjectDelta.cs @@ -0,0 +1,26 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + internal partial class InternalMessageDeltaObjectDelta + { + internal IDictionary SerializedAdditionalRawData { get; set; } + internal InternalMessageDeltaObjectDelta() + { + Content = new ChangeTrackingList(); + } + + internal 
InternalMessageDeltaObjectDelta(MessageRole role, IReadOnlyList<InternalMessageDeltaContent> content, IDictionary<string, BinaryData> serializedAdditionalRawData)
+        {
+            Role = role;
+            Content = content;
+            SerializedAdditionalRawData = serializedAdditionalRawData;
+        }
+
+        public IReadOnlyList<InternalMessageDeltaContent> Content { get; }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalMessageDeltaObjectDeltaRole.cs b/.dotnet/src/Generated/Models/InternalMessageDeltaObjectDeltaRole.cs
new file mode 100644
index 000000000..d477fcb93
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalMessageDeltaObjectDeltaRole.cs
@@ -0,0 +1,36 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ComponentModel;
+
+namespace OpenAI.Assistants
+{
+    internal readonly partial struct InternalMessageDeltaObjectDeltaRole : IEquatable<InternalMessageDeltaObjectDeltaRole>
+    {
+        private readonly string _value;
+
+        public InternalMessageDeltaObjectDeltaRole(string value)
+        {
+            _value = value ?? throw new ArgumentNullException(nameof(value));
+        }
+
+        private const string UserValue = "user";
+        private const string AssistantValue = "assistant";
+
+        public static InternalMessageDeltaObjectDeltaRole User { get; } = new InternalMessageDeltaObjectDeltaRole(UserValue);
+        public static InternalMessageDeltaObjectDeltaRole Assistant { get; } = new InternalMessageDeltaObjectDeltaRole(AssistantValue);
+        public static bool operator ==(InternalMessageDeltaObjectDeltaRole left, InternalMessageDeltaObjectDeltaRole right) => left.Equals(right);
+        public static bool operator !=(InternalMessageDeltaObjectDeltaRole left, InternalMessageDeltaObjectDeltaRole right) => !left.Equals(right);
+        public static implicit operator InternalMessageDeltaObjectDeltaRole(string value) => new InternalMessageDeltaObjectDeltaRole(value);
+
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override bool Equals(object obj) => obj is InternalMessageDeltaObjectDeltaRole other && Equals(other);
+        public bool Equals(InternalMessageDeltaObjectDeltaRole other) => string.Equals(_value, other._value,
StringComparison.InvariantCultureIgnoreCase);
+
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0;
+        public override string ToString() => _value;
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalMessageDeltaObjectObject.cs b/.dotnet/src/Generated/Models/InternalMessageDeltaObjectObject.cs
new file mode 100644
index 000000000..a420957ec
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalMessageDeltaObjectObject.cs
@@ -0,0 +1,34 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ComponentModel;
+
+namespace OpenAI.Assistants
+{
+    internal readonly partial struct InternalMessageDeltaObjectObject : IEquatable<InternalMessageDeltaObjectObject>
+    {
+        private readonly string _value;
+
+        public InternalMessageDeltaObjectObject(string value)
+        {
+            _value = value ?? throw new ArgumentNullException(nameof(value));
+        }
+
+        private const string ThreadMessageDeltaValue = "thread.message.delta";
+
+        public static InternalMessageDeltaObjectObject ThreadMessageDelta { get; } = new InternalMessageDeltaObjectObject(ThreadMessageDeltaValue);
+        public static bool operator ==(InternalMessageDeltaObjectObject left, InternalMessageDeltaObjectObject right) => left.Equals(right);
+        public static bool operator !=(InternalMessageDeltaObjectObject left, InternalMessageDeltaObjectObject right) => !left.Equals(right);
+        public static implicit operator InternalMessageDeltaObjectObject(string value) => new InternalMessageDeltaObjectObject(value);
+
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override bool Equals(object obj) => obj is InternalMessageDeltaObjectObject other && Equals(other);
+        public bool Equals(InternalMessageDeltaObjectObject other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase);
+
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override int GetHashCode() => _value != null ?
StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalMessageDeltaTextContentAnnotation.Serialization.cs b/.dotnet/src/Generated/Models/InternalMessageDeltaTextContentAnnotation.Serialization.cs new file mode 100644 index 000000000..1382bacd8 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalMessageDeltaTextContentAnnotation.Serialization.cs @@ -0,0 +1,124 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + [PersistableModelProxy(typeof(UnknownMessageDeltaTextContentAnnotation))] + internal partial class InternalMessageDeltaTextContentAnnotation : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalMessageDeltaTextContentAnnotation)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalMessageDeltaTextContentAnnotation IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalMessageDeltaTextContentAnnotation)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalMessageDeltaTextContentAnnotation(document.RootElement, options); + } + + internal static InternalMessageDeltaTextContentAnnotation DeserializeInternalMessageDeltaTextContentAnnotation(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + if (element.TryGetProperty("type", out JsonElement discriminator)) + { + switch (discriminator.GetString()) + { + case "file_citation": return InternalMessageDeltaContentTextAnnotationsFileCitationObject.DeserializeInternalMessageDeltaContentTextAnnotationsFileCitationObject(element, options); + case "file_path": return InternalMessageDeltaContentTextAnnotationsFilePathObject.DeserializeInternalMessageDeltaContentTextAnnotationsFilePathObject(element, options); + } + } + return UnknownMessageDeltaTextContentAnnotation.DeserializeUnknownMessageDeltaTextContentAnnotation(element, options); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalMessageDeltaTextContentAnnotation)} does not support writing '{options.Format}' format."); + } + } + + InternalMessageDeltaTextContentAnnotation IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalMessageDeltaTextContentAnnotation(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalMessageDeltaTextContentAnnotation)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalMessageDeltaTextContentAnnotation FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalMessageDeltaTextContentAnnotation(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalMessageDeltaTextContentAnnotation.cs b/.dotnet/src/Generated/Models/InternalMessageDeltaTextContentAnnotation.cs new file mode 100644 index 000000000..34640b4dc --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalMessageDeltaTextContentAnnotation.cs @@ -0,0 +1,25 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + internal abstract partial class InternalMessageDeltaTextContentAnnotation + { + internal IDictionary SerializedAdditionalRawData { get; set; } + protected InternalMessageDeltaTextContentAnnotation() + { + } + + internal InternalMessageDeltaTextContentAnnotation(string type, IDictionary serializedAdditionalRawData) + { + Type = type; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal string Type { get; set; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalMessageImageFileContent.Serialization.cs b/.dotnet/src/Generated/Models/InternalMessageImageFileContent.Serialization.cs new file mode 
100644 index 000000000..8a176fdbf --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalMessageImageFileContent.Serialization.cs @@ -0,0 +1,103 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class InternalMessageImageFileContent : IJsonModel + { + InternalMessageImageFileContent IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalMessageImageFileContent)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalMessageImageFileContent(document.RootElement, options); + } + + internal static InternalMessageImageFileContent DeserializeInternalMessageImageFileContent(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string type = default; + InternalMessageContentItemFileObjectImageFile imageFile = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (property.NameEquals("image_file"u8)) + { + imageFile = InternalMessageContentItemFileObjectImageFile.DeserializeInternalMessageContentItemFileObjectImageFile(property.Value, options); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + 
serializedAdditionalRawData = rawDataDictionary; + return new InternalMessageImageFileContent(serializedAdditionalRawData, type, imageFile); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalMessageImageFileContent)} does not support writing '{options.Format}' format."); + } + } + + InternalMessageImageFileContent IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalMessageImageFileContent(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalMessageImageFileContent)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static new InternalMessageImageFileContent FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalMessageImageFileContent(document.RootElement); + } + + internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalMessageImageFileContent.cs b/.dotnet/src/Generated/Models/InternalMessageImageFileContent.cs new file mode 100644 index 000000000..0daac586c --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalMessageImageFileContent.cs @@ -0,0 +1,22 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + 
+namespace OpenAI.Assistants
+{
+    internal partial class InternalMessageImageFileContent : MessageContent
+    {
+        internal InternalMessageImageFileContent(IDictionary<string, BinaryData> serializedAdditionalRawData, string type, InternalMessageContentItemFileObjectImageFile imageFile) : base(serializedAdditionalRawData)
+        {
+            _type = type;
+            _imageFile = imageFile;
+        }
+
+        internal InternalMessageImageFileContent()
+        {
+        }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalMessageImageUrlContent.Serialization.cs b/.dotnet/src/Generated/Models/InternalMessageImageUrlContent.Serialization.cs
new file mode 100644
index 000000000..bbf8977d0
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalMessageImageUrlContent.Serialization.cs
@@ -0,0 +1,103 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace OpenAI.Assistants
+{
+    internal partial class InternalMessageImageUrlContent : IJsonModel<InternalMessageImageUrlContent>
+    {
+        InternalMessageImageUrlContent IJsonModel<InternalMessageImageUrlContent>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ?
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalMessageImageUrlContent)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalMessageImageUrlContent(document.RootElement, options); + } + + internal static InternalMessageImageUrlContent DeserializeInternalMessageImageUrlContent(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string type = default; + InternalMessageContentImageUrlObjectImageUrl imageUrl = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (property.NameEquals("image_url"u8)) + { + imageUrl = InternalMessageContentImageUrlObjectImageUrl.DeserializeInternalMessageContentImageUrlObjectImageUrl(property.Value, options); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalMessageImageUrlContent(serializedAdditionalRawData, type, imageUrl); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalMessageImageUrlContent)} does not support writing '{options.Format}' format."); + } + } + + InternalMessageImageUrlContent IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalMessageImageUrlContent(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalMessageImageUrlContent)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static new InternalMessageImageUrlContent FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalMessageImageUrlContent(document.RootElement); + } + + internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalMessageImageUrlContent.cs b/.dotnet/src/Generated/Models/InternalMessageImageUrlContent.cs new file mode 100644 index 000000000..beb249485 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalMessageImageUrlContent.cs @@ -0,0 +1,22 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + internal partial class InternalMessageImageUrlContent : MessageContent + { + internal InternalMessageImageUrlContent(IDictionary serializedAdditionalRawData, string type, InternalMessageContentImageUrlObjectImageUrl imageUrl) : 
base(serializedAdditionalRawData) + { + _type = type; + _imageUrl = imageUrl; + } + + internal InternalMessageImageUrlContent() + { + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalMessageObjectAttachment.Serialization.cs b/.dotnet/src/Generated/Models/InternalMessageObjectAttachment.Serialization.cs new file mode 100644 index 000000000..3d0849f9a --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalMessageObjectAttachment.Serialization.cs @@ -0,0 +1,177 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class InternalMessageObjectAttachment : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalMessageObjectAttachment)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("file_id") != true && Optional.IsDefined(FileId)) + { + writer.WritePropertyName("file_id"u8); + writer.WriteStringValue(FileId); + } + if (SerializedAdditionalRawData?.ContainsKey("tools") != true && Optional.IsCollectionDefined(Tools)) + { + writer.WritePropertyName("tools"u8); + writer.WriteStartArray(); + foreach (var item in Tools) + { + if (item == null) + { + writer.WriteNullValue(); + continue; + } +#if NET6_0_OR_GREATER + writer.WriteRawValue(item); +#else + using (JsonDocument document = JsonDocument.Parse(item)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + writer.WriteEndArray(); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + 
continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalMessageObjectAttachment IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalMessageObjectAttachment)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalMessageObjectAttachment(document.RootElement, options); + } + + internal static InternalMessageObjectAttachment DeserializeInternalMessageObjectAttachment(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string fileId = default; + IReadOnlyList tools = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("file_id"u8)) + { + fileId = property.Value.GetString(); + continue; + } + if (property.NameEquals("tools"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + List array = new List(); + foreach (var item in property.Value.EnumerateArray()) + { + if (item.ValueKind == JsonValueKind.Null) + { + array.Add(null); + } + else + { + array.Add(BinaryData.FromString(item.GetRawText())); + } + } + tools = array; + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + 
serializedAdditionalRawData = rawDataDictionary; + return new InternalMessageObjectAttachment(fileId, tools ?? new ChangeTrackingList<BinaryData>(), serializedAdditionalRawData); + } + + BinaryData IPersistableModel<InternalMessageObjectAttachment>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalMessageObjectAttachment>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalMessageObjectAttachment)} does not support writing '{options.Format}' format."); + } + } + + InternalMessageObjectAttachment IPersistableModel<InternalMessageObjectAttachment>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalMessageObjectAttachment>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalMessageObjectAttachment(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalMessageObjectAttachment)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalMessageObjectAttachment>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalMessageObjectAttachment FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalMessageObjectAttachment(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalMessageObjectAttachment.cs b/.dotnet/src/Generated/Models/InternalMessageObjectAttachment.cs new file mode 100644 index 000000000..1e92848e3 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalMessageObjectAttachment.cs @@ -0,0 +1,28 @@ +// <auto-generated/> + +#nullable disable + +using System; +using 
System.Collections.Generic; + +namespace OpenAI.Assistants +{ + internal partial class InternalMessageObjectAttachment + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + internal InternalMessageObjectAttachment() + { + Tools = new ChangeTrackingList<BinaryData>(); + } + + internal InternalMessageObjectAttachment(string fileId, IReadOnlyList<BinaryData> tools, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + FileId = fileId; + Tools = tools; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + public string FileId { get; } + public IReadOnlyList<BinaryData> Tools { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalMessageObjectObject.cs b/.dotnet/src/Generated/Models/InternalMessageObjectObject.cs new file mode 100644 index 000000000..e419910c5 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalMessageObjectObject.cs @@ -0,0 +1,34 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Assistants +{ + internal readonly partial struct InternalMessageObjectObject : IEquatable<InternalMessageObjectObject> + { + private readonly string _value; + + public InternalMessageObjectObject(string value) + { + _value = value ?? 
throw new ArgumentNullException(nameof(value)); + } + + private const string ThreadMessageValue = "thread.message"; + + public static InternalMessageObjectObject ThreadMessage { get; } = new InternalMessageObjectObject(ThreadMessageValue); + public static bool operator ==(InternalMessageObjectObject left, InternalMessageObjectObject right) => left.Equals(right); + public static bool operator !=(InternalMessageObjectObject left, InternalMessageObjectObject right) => !left.Equals(right); + public static implicit operator InternalMessageObjectObject(string value) => new InternalMessageObjectObject(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalMessageObjectObject other && Equals(other); + public bool Equals(InternalMessageObjectObject other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalMessageObjectRole.cs b/.dotnet/src/Generated/Models/InternalMessageObjectRole.cs new file mode 100644 index 000000000..91e68a9de --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalMessageObjectRole.cs @@ -0,0 +1,36 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Assistants +{ + internal readonly partial struct InternalMessageObjectRole : IEquatable<InternalMessageObjectRole> + { + private readonly string _value; + + public InternalMessageObjectRole(string value) + { + _value = value ?? 
throw new ArgumentNullException(nameof(value)); + } + + private const string UserValue = "user"; + private const string AssistantValue = "assistant"; + + public static InternalMessageObjectRole User { get; } = new InternalMessageObjectRole(UserValue); + public static InternalMessageObjectRole Assistant { get; } = new InternalMessageObjectRole(AssistantValue); + public static bool operator ==(InternalMessageObjectRole left, InternalMessageObjectRole right) => left.Equals(right); + public static bool operator !=(InternalMessageObjectRole left, InternalMessageObjectRole right) => !left.Equals(right); + public static implicit operator InternalMessageObjectRole(string value) => new InternalMessageObjectRole(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalMessageObjectRole other && Equals(other); + public bool Equals(InternalMessageObjectRole other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalMessageRefusalContent.Serialization.cs b/.dotnet/src/Generated/Models/InternalMessageRefusalContent.Serialization.cs new file mode 100644 index 000000000..6e3c9846f --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalMessageRefusalContent.Serialization.cs @@ -0,0 +1,103 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class InternalMessageRefusalContent : IJsonModel<InternalMessageRefusalContent> + { + InternalMessageRefusalContent IJsonModel<InternalMessageRefusalContent>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalMessageRefusalContent>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalMessageRefusalContent)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalMessageRefusalContent(document.RootElement, options); + } + + internal static InternalMessageRefusalContent DeserializeInternalMessageRefusalContent(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string type = default; + string refusal = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (property.NameEquals("refusal"u8)) + { + refusal = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalMessageRefusalContent(serializedAdditionalRawData, type, refusal); + } + + BinaryData IPersistableModel<InternalMessageRefusalContent>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalMessageRefusalContent>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalMessageRefusalContent)} does not support writing '{options.Format}' format."); + } + } + + InternalMessageRefusalContent IPersistableModel<InternalMessageRefusalContent>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalMessageRefusalContent>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalMessageRefusalContent(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalMessageRefusalContent)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalMessageRefusalContent>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static new InternalMessageRefusalContent FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalMessageRefusalContent(document.RootElement); + } + + internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalMessageRefusalContent.cs b/.dotnet/src/Generated/Models/InternalMessageRefusalContent.cs new file mode 100644 index 000000000..dc6f0e0af --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalMessageRefusalContent.cs @@ -0,0 +1,29 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + internal partial class InternalMessageRefusalContent : MessageContent + { + public InternalMessageRefusalContent(string internalRefusal) + { + Argument.AssertNotNull(internalRefusal, nameof(internalRefusal)); + + InternalRefusal = internalRefusal; + } + + internal InternalMessageRefusalContent(IDictionary<string, BinaryData> serializedAdditionalRawData, string type, string internalRefusal) : base(serializedAdditionalRawData) + { + _type = type; + InternalRefusal = internalRefusal; + } + + internal InternalMessageRefusalContent() + { + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalMessageRequestContentTextObjectType.cs b/.dotnet/src/Generated/Models/InternalMessageRequestContentTextObjectType.cs new file mode 
100644 index 000000000..f41f3c083 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalMessageRequestContentTextObjectType.cs @@ -0,0 +1,34 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Assistants +{ + internal readonly partial struct InternalMessageRequestContentTextObjectType : IEquatable<InternalMessageRequestContentTextObjectType> + { + private readonly string _value; + + public InternalMessageRequestContentTextObjectType(string value) + { + _value = value ?? throw new ArgumentNullException(nameof(value)); + } + + private const string TextValue = "text"; + + public static InternalMessageRequestContentTextObjectType Text { get; } = new InternalMessageRequestContentTextObjectType(TextValue); + public static bool operator ==(InternalMessageRequestContentTextObjectType left, InternalMessageRequestContentTextObjectType right) => left.Equals(right); + public static bool operator !=(InternalMessageRequestContentTextObjectType left, InternalMessageRequestContentTextObjectType right) => !left.Equals(right); + public static implicit operator InternalMessageRequestContentTextObjectType(string value) => new InternalMessageRequestContentTextObjectType(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalMessageRequestContentTextObjectType other && Equals(other); + public bool Equals(InternalMessageRequestContentTextObjectType other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? 
StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalModelObject.cs b/.dotnet/src/Generated/Models/InternalModelObject.cs new file mode 100644 index 000000000..79b9cefab --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalModelObject.cs @@ -0,0 +1,34 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Models +{ + internal readonly partial struct InternalModelObject : IEquatable<InternalModelObject> + { + private readonly string _value; + + public InternalModelObject(string value) + { + _value = value ?? throw new ArgumentNullException(nameof(value)); + } + + private const string ModelValue = "model"; + + public static InternalModelObject Model { get; } = new InternalModelObject(ModelValue); + public static bool operator ==(InternalModelObject left, InternalModelObject right) => left.Equals(right); + public static bool operator !=(InternalModelObject left, InternalModelObject right) => !left.Equals(right); + public static implicit operator InternalModelObject(string value) => new InternalModelObject(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalModelObject other && Equals(other); + public bool Equals(InternalModelObject other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? 
StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalModifyAssistantRequestToolResources.Serialization.cs b/.dotnet/src/Generated/Models/InternalModifyAssistantRequestToolResources.Serialization.cs new file mode 100644 index 000000000..bb4a89610 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalModifyAssistantRequestToolResources.Serialization.cs @@ -0,0 +1,152 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class InternalModifyAssistantRequestToolResources : IJsonModel<InternalModifyAssistantRequestToolResources> + { + void IJsonModel<InternalModifyAssistantRequestToolResources>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalModifyAssistantRequestToolResources>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalModifyAssistantRequestToolResources)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("code_interpreter") != true && Optional.IsDefined(CodeInterpreter)) + { + writer.WritePropertyName("code_interpreter"u8); + writer.WriteObjectValue(CodeInterpreter, options); + } + if (SerializedAdditionalRawData?.ContainsKey("file_search") != true && Optional.IsDefined(FileSearch)) + { + writer.WritePropertyName("file_search"u8); + writer.WriteObjectValue(FileSearch, options); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + 
JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalModifyAssistantRequestToolResources IJsonModel<InternalModifyAssistantRequestToolResources>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalModifyAssistantRequestToolResources>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalModifyAssistantRequestToolResources)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalModifyAssistantRequestToolResources(document.RootElement, options); + } + + internal static InternalModifyAssistantRequestToolResources DeserializeInternalModifyAssistantRequestToolResources(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + InternalModifyAssistantRequestToolResourcesCodeInterpreter codeInterpreter = default; + InternalToolResourcesFileSearchIdsOnly fileSearch = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("code_interpreter"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + codeInterpreter = InternalModifyAssistantRequestToolResourcesCodeInterpreter.DeserializeInternalModifyAssistantRequestToolResourcesCodeInterpreter(property.Value, options); + continue; + } + if (property.NameEquals("file_search"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + fileSearch = InternalToolResourcesFileSearchIdsOnly.DeserializeInternalToolResourcesFileSearchIdsOnly(property.Value, options); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + 
rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalModifyAssistantRequestToolResources(codeInterpreter, fileSearch, serializedAdditionalRawData); + } + + BinaryData IPersistableModel<InternalModifyAssistantRequestToolResources>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalModifyAssistantRequestToolResources>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalModifyAssistantRequestToolResources)} does not support writing '{options.Format}' format."); + } + } + + InternalModifyAssistantRequestToolResources IPersistableModel<InternalModifyAssistantRequestToolResources>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalModifyAssistantRequestToolResources>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalModifyAssistantRequestToolResources(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalModifyAssistantRequestToolResources)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalModifyAssistantRequestToolResources>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalModifyAssistantRequestToolResources FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalModifyAssistantRequestToolResources(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalModifyAssistantRequestToolResources.cs b/.dotnet/src/Generated/Models/InternalModifyAssistantRequestToolResources.cs new file mode 
100644 index 000000000..3bcf4c24d --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalModifyAssistantRequestToolResources.cs @@ -0,0 +1,27 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + internal partial class InternalModifyAssistantRequestToolResources + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + public InternalModifyAssistantRequestToolResources() + { + } + + internal InternalModifyAssistantRequestToolResources(InternalModifyAssistantRequestToolResourcesCodeInterpreter codeInterpreter, InternalToolResourcesFileSearchIdsOnly fileSearch, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + CodeInterpreter = codeInterpreter; + FileSearch = fileSearch; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + public InternalModifyAssistantRequestToolResourcesCodeInterpreter CodeInterpreter { get; set; } + public InternalToolResourcesFileSearchIdsOnly FileSearch { get; set; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalModifyAssistantRequestToolResourcesCodeInterpreter.Serialization.cs b/.dotnet/src/Generated/Models/InternalModifyAssistantRequestToolResourcesCodeInterpreter.Serialization.cs new file mode 100644 index 000000000..2b3ea1d6a --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalModifyAssistantRequestToolResourcesCodeInterpreter.Serialization.cs @@ -0,0 +1,147 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class InternalModifyAssistantRequestToolResourcesCodeInterpreter : IJsonModel<InternalModifyAssistantRequestToolResourcesCodeInterpreter> + { + void IJsonModel<InternalModifyAssistantRequestToolResourcesCodeInterpreter>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalModifyAssistantRequestToolResourcesCodeInterpreter>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalModifyAssistantRequestToolResourcesCodeInterpreter)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("file_ids") != true && Optional.IsCollectionDefined(FileIds)) + { + writer.WritePropertyName("file_ids"u8); + writer.WriteStartArray(); + foreach (var item in FileIds) + { + writer.WriteStringValue(item); + } + writer.WriteEndArray(); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalModifyAssistantRequestToolResourcesCodeInterpreter IJsonModel<InternalModifyAssistantRequestToolResourcesCodeInterpreter>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalModifyAssistantRequestToolResourcesCodeInterpreter>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalModifyAssistantRequestToolResourcesCodeInterpreter)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalModifyAssistantRequestToolResourcesCodeInterpreter(document.RootElement, options); + } + + internal static InternalModifyAssistantRequestToolResourcesCodeInterpreter DeserializeInternalModifyAssistantRequestToolResourcesCodeInterpreter(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + IList<string> fileIds = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("file_ids"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + List<string> array = new List<string>(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(item.GetString()); + } + fileIds = array; + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalModifyAssistantRequestToolResourcesCodeInterpreter(fileIds ?? new ChangeTrackingList<string>(), serializedAdditionalRawData); + } + + BinaryData IPersistableModel<InternalModifyAssistantRequestToolResourcesCodeInterpreter>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalModifyAssistantRequestToolResourcesCodeInterpreter>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalModifyAssistantRequestToolResourcesCodeInterpreter)} does not support writing '{options.Format}' format."); + } + } + + InternalModifyAssistantRequestToolResourcesCodeInterpreter IPersistableModel<InternalModifyAssistantRequestToolResourcesCodeInterpreter>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalModifyAssistantRequestToolResourcesCodeInterpreter>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalModifyAssistantRequestToolResourcesCodeInterpreter(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalModifyAssistantRequestToolResourcesCodeInterpreter)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalModifyAssistantRequestToolResourcesCodeInterpreter>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalModifyAssistantRequestToolResourcesCodeInterpreter FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalModifyAssistantRequestToolResourcesCodeInterpreter(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalModifyAssistantRequestToolResourcesCodeInterpreter.cs b/.dotnet/src/Generated/Models/InternalModifyAssistantRequestToolResourcesCodeInterpreter.cs new file mode 100644 index 000000000..1b2538796 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalModifyAssistantRequestToolResourcesCodeInterpreter.cs @@ -0,0 +1,26 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace 
OpenAI.Assistants +{ + internal partial class InternalModifyAssistantRequestToolResourcesCodeInterpreter + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + public InternalModifyAssistantRequestToolResourcesCodeInterpreter() + { + FileIds = new ChangeTrackingList<string>(); + } + + internal InternalModifyAssistantRequestToolResourcesCodeInterpreter(IList<string> fileIds, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + FileIds = fileIds; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + public IList<string> FileIds { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalModifyThreadRequestToolResources.Serialization.cs b/.dotnet/src/Generated/Models/InternalModifyThreadRequestToolResources.Serialization.cs new file mode 100644 index 000000000..3354accf3 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalModifyThreadRequestToolResources.Serialization.cs @@ -0,0 +1,152 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class InternalModifyThreadRequestToolResources : IJsonModel<InternalModifyThreadRequestToolResources> + { + void IJsonModel<InternalModifyThreadRequestToolResources>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalModifyThreadRequestToolResources>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalModifyThreadRequestToolResources)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("code_interpreter") != true && Optional.IsDefined(CodeInterpreter)) + { + writer.WritePropertyName("code_interpreter"u8); + writer.WriteObjectValue(CodeInterpreter, options); + } + if (SerializedAdditionalRawData?.ContainsKey("file_search") != true && Optional.IsDefined(FileSearch)) + { + writer.WritePropertyName("file_search"u8); + writer.WriteObjectValue(FileSearch, options); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalModifyThreadRequestToolResources IJsonModel<InternalModifyThreadRequestToolResources>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalModifyThreadRequestToolResources>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalModifyThreadRequestToolResources)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalModifyThreadRequestToolResources(document.RootElement, options); + } + + internal static InternalModifyThreadRequestToolResources DeserializeInternalModifyThreadRequestToolResources(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + InternalModifyThreadRequestToolResourcesCodeInterpreter codeInterpreter = default; + InternalToolResourcesFileSearchIdsOnly fileSearch = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("code_interpreter"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + codeInterpreter = InternalModifyThreadRequestToolResourcesCodeInterpreter.DeserializeInternalModifyThreadRequestToolResourcesCodeInterpreter(property.Value, options); + continue; + } + if (property.NameEquals("file_search"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + fileSearch = InternalToolResourcesFileSearchIdsOnly.DeserializeInternalToolResourcesFileSearchIdsOnly(property.Value, options); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalModifyThreadRequestToolResources(codeInterpreter, fileSearch, serializedAdditionalRawData); + } + + BinaryData IPersistableModel<InternalModifyThreadRequestToolResources>.Write(ModelReaderWriterOptions 
options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalModifyThreadRequestToolResources>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalModifyThreadRequestToolResources)} does not support writing '{options.Format}' format."); + } + } + + InternalModifyThreadRequestToolResources IPersistableModel<InternalModifyThreadRequestToolResources>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalModifyThreadRequestToolResources>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalModifyThreadRequestToolResources(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalModifyThreadRequestToolResources)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalModifyThreadRequestToolResources>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalModifyThreadRequestToolResources FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalModifyThreadRequestToolResources(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalModifyThreadRequestToolResources.cs b/.dotnet/src/Generated/Models/InternalModifyThreadRequestToolResources.cs new file mode 100644 index 000000000..0876ba2de --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalModifyThreadRequestToolResources.cs @@ -0,0 +1,27 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + internal partial class InternalModifyThreadRequestToolResources + { + internal IDictionary<string, BinaryData> 
SerializedAdditionalRawData { get; set; } + public InternalModifyThreadRequestToolResources() + { + } + + internal InternalModifyThreadRequestToolResources(InternalModifyThreadRequestToolResourcesCodeInterpreter codeInterpreter, InternalToolResourcesFileSearchIdsOnly fileSearch, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + CodeInterpreter = codeInterpreter; + FileSearch = fileSearch; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + public InternalModifyThreadRequestToolResourcesCodeInterpreter CodeInterpreter { get; set; } + public InternalToolResourcesFileSearchIdsOnly FileSearch { get; set; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalModifyThreadRequestToolResourcesCodeInterpreter.Serialization.cs b/.dotnet/src/Generated/Models/InternalModifyThreadRequestToolResourcesCodeInterpreter.Serialization.cs new file mode 100644 index 000000000..cc3aafd77 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalModifyThreadRequestToolResourcesCodeInterpreter.Serialization.cs @@ -0,0 +1,147 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class InternalModifyThreadRequestToolResourcesCodeInterpreter : IJsonModel<InternalModifyThreadRequestToolResourcesCodeInterpreter> + { + void IJsonModel<InternalModifyThreadRequestToolResourcesCodeInterpreter>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalModifyThreadRequestToolResourcesCodeInterpreter)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("file_ids") != true && Optional.IsCollectionDefined(FileIds)) + { + writer.WritePropertyName("file_ids"u8); + writer.WriteStartArray(); + foreach (var item in FileIds) + { + writer.WriteStringValue(item); + } + writer.WriteEndArray(); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalModifyThreadRequestToolResourcesCodeInterpreter IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalModifyThreadRequestToolResourcesCodeInterpreter)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalModifyThreadRequestToolResourcesCodeInterpreter(document.RootElement, options); + } + + internal static InternalModifyThreadRequestToolResourcesCodeInterpreter DeserializeInternalModifyThreadRequestToolResourcesCodeInterpreter(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + IList fileIds = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("file_ids"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + List array = new List(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(item.GetString()); + } + fileIds = array; + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalModifyThreadRequestToolResourcesCodeInterpreter(fileIds ?? new ChangeTrackingList(), serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalModifyThreadRequestToolResourcesCodeInterpreter)} does not support writing '{options.Format}' format."); + } + } + + InternalModifyThreadRequestToolResourcesCodeInterpreter IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalModifyThreadRequestToolResourcesCodeInterpreter(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalModifyThreadRequestToolResourcesCodeInterpreter)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalModifyThreadRequestToolResourcesCodeInterpreter FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalModifyThreadRequestToolResourcesCodeInterpreter(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalModifyThreadRequestToolResourcesCodeInterpreter.cs b/.dotnet/src/Generated/Models/InternalModifyThreadRequestToolResourcesCodeInterpreter.cs new file mode 100644 index 000000000..df0ce6bf8 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalModifyThreadRequestToolResourcesCodeInterpreter.cs @@ -0,0 +1,26 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + internal partial 
class InternalModifyThreadRequestToolResourcesCodeInterpreter + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + public InternalModifyThreadRequestToolResourcesCodeInterpreter() + { + FileIds = new ChangeTrackingList<string>(); + } + + internal InternalModifyThreadRequestToolResourcesCodeInterpreter(IList<string> fileIds, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + FileIds = fileIds; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + public IList<string> FileIds { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalOmniTypedResponseFormat.Serialization.cs b/.dotnet/src/Generated/Models/InternalOmniTypedResponseFormat.Serialization.cs new file mode 100644 index 000000000..6ccf09de3 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalOmniTypedResponseFormat.Serialization.cs @@ -0,0 +1,125 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Text.Json; + +namespace OpenAI.Internal +{ + [PersistableModelProxy(typeof(InternalUnknownOmniTypedResponseFormat))] + internal partial class InternalOmniTypedResponseFormat : IJsonModel<InternalOmniTypedResponseFormat> + { + void IJsonModel<InternalOmniTypedResponseFormat>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalOmniTypedResponseFormat)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalOmniTypedResponseFormat IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalOmniTypedResponseFormat)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalOmniTypedResponseFormat(document.RootElement, options); + } + + internal static InternalOmniTypedResponseFormat DeserializeInternalOmniTypedResponseFormat(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + if (element.TryGetProperty("type", out JsonElement discriminator)) + { + switch (discriminator.GetString()) + { + case "json_object": return InternalResponseFormatJsonObject.DeserializeInternalResponseFormatJsonObject(element, options); + case "json_schema": return InternalResponseFormatJsonSchema.DeserializeInternalResponseFormatJsonSchema(element, options); + case "text": return InternalResponseFormatText.DeserializeInternalResponseFormatText(element, options); + } + } + return InternalUnknownOmniTypedResponseFormat.DeserializeInternalUnknownOmniTypedResponseFormat(element, options); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalOmniTypedResponseFormat)} does not support writing '{options.Format}' format."); + } + } + + InternalOmniTypedResponseFormat IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalOmniTypedResponseFormat(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalOmniTypedResponseFormat)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalOmniTypedResponseFormat FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalOmniTypedResponseFormat(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalOmniTypedResponseFormat.cs b/.dotnet/src/Generated/Models/InternalOmniTypedResponseFormat.cs new file mode 100644 index 000000000..da255f506 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalOmniTypedResponseFormat.cs @@ -0,0 +1,25 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Internal +{ + internal abstract partial class InternalOmniTypedResponseFormat + { + internal IDictionary SerializedAdditionalRawData { get; set; } + protected InternalOmniTypedResponseFormat() + { + } + + internal InternalOmniTypedResponseFormat(string type, IDictionary serializedAdditionalRawData) + { + Type = type; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal string Type { get; set; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalOpenAIFileObject.cs b/.dotnet/src/Generated/Models/InternalOpenAIFileObject.cs new file mode 100644 index 000000000..d32ba1770 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalOpenAIFileObject.cs @@ -0,0 +1,34 @@ +// + +#nullable 
disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Files +{ + internal readonly partial struct InternalOpenAIFileObject : IEquatable<InternalOpenAIFileObject> + { + private readonly string _value; + + public InternalOpenAIFileObject(string value) + { + _value = value ?? throw new ArgumentNullException(nameof(value)); + } + + private const string FileValue = "file"; + + public static InternalOpenAIFileObject File { get; } = new InternalOpenAIFileObject(FileValue); + public static bool operator ==(InternalOpenAIFileObject left, InternalOpenAIFileObject right) => left.Equals(right); + public static bool operator !=(InternalOpenAIFileObject left, InternalOpenAIFileObject right) => !left.Equals(right); + public static implicit operator InternalOpenAIFileObject(string value) => new InternalOpenAIFileObject(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalOpenAIFileObject other && Equals(other); + public bool Equals(InternalOpenAIFileObject other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? 
StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalRequestMessageTextContent.Serialization.cs b/.dotnet/src/Generated/Models/InternalRequestMessageTextContent.Serialization.cs new file mode 100644 index 000000000..abecb82df --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalRequestMessageTextContent.Serialization.cs @@ -0,0 +1,103 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class InternalRequestMessageTextContent : IJsonModel + { + InternalRequestMessageTextContent IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalRequestMessageTextContent)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalRequestMessageTextContent(document.RootElement, options); + } + + internal static InternalRequestMessageTextContent DeserializeInternalRequestMessageTextContent(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + InternalMessageRequestContentTextObjectType type = default; + string text = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("type"u8)) + { + type = new InternalMessageRequestContentTextObjectType(property.Value.GetString()); + continue; + } + if 
(property.NameEquals("text"u8)) + { + text = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalRequestMessageTextContent(serializedAdditionalRawData, type, text); + } + + BinaryData IPersistableModel<InternalRequestMessageTextContent>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalRequestMessageTextContent>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalRequestMessageTextContent)} does not support writing '{options.Format}' format."); + } + } + + InternalRequestMessageTextContent IPersistableModel<InternalRequestMessageTextContent>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalRequestMessageTextContent>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalRequestMessageTextContent(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalRequestMessageTextContent)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalRequestMessageTextContent>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static new InternalRequestMessageTextContent FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalRequestMessageTextContent(document.RootElement); + } + + internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalRequestMessageTextContent.cs 
b/.dotnet/src/Generated/Models/InternalRequestMessageTextContent.cs new file mode 100644 index 000000000..7fb763cb2 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalRequestMessageTextContent.cs @@ -0,0 +1,31 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + internal partial class InternalRequestMessageTextContent : MessageContent + { + public InternalRequestMessageTextContent(string internalText) + { + Argument.AssertNotNull(internalText, nameof(internalText)); + + InternalText = internalText; + } + + internal InternalRequestMessageTextContent(IDictionary<string, BinaryData> serializedAdditionalRawData, InternalMessageRequestContentTextObjectType type, string internalText) : base(serializedAdditionalRawData) + { + Type = type; + InternalText = internalText; + } + + internal InternalRequestMessageTextContent() + { + } + + public InternalMessageRequestContentTextObjectType Type { get; } = InternalMessageRequestContentTextObjectType.Text; + } +} diff --git a/.dotnet/src/Generated/Models/InternalRequiredFunctionToolCall.Serialization.cs b/.dotnet/src/Generated/Models/InternalRequiredFunctionToolCall.Serialization.cs new file mode 100644 index 000000000..ab8bbc81c --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalRequiredFunctionToolCall.Serialization.cs @@ -0,0 +1,155 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class InternalRequiredFunctionToolCall : IJsonModel<InternalRequiredFunctionToolCall> + { + void IJsonModel<InternalRequiredFunctionToolCall>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalRequiredFunctionToolCall)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("id") != true) + { + writer.WritePropertyName("id"u8); + writer.WriteStringValue(Id); + } + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteObjectValue(_type, options); + } + if (SerializedAdditionalRawData?.ContainsKey("function") != true) + { + writer.WritePropertyName("function"u8); + writer.WriteObjectValue(_internalFunction, options); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalRequiredFunctionToolCall IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalRequiredFunctionToolCall)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalRequiredFunctionToolCall(document.RootElement, options); + } + + internal static InternalRequiredFunctionToolCall DeserializeInternalRequiredFunctionToolCall(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string id = default; + object type = default; + InternalRunToolCallObjectFunction function = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("id"u8)) + { + id = property.Value.GetString(); + continue; + } + if (property.NameEquals("type"u8)) + { + type = property.Value.GetObject(); + continue; + } + if (property.NameEquals("function"u8)) + { + function = InternalRunToolCallObjectFunction.DeserializeInternalRunToolCallObjectFunction(property.Value, options); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalRequiredFunctionToolCall(id, type, function, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalRequiredFunctionToolCall)} does not support writing '{options.Format}' format."); + } + } + + InternalRequiredFunctionToolCall IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalRequiredFunctionToolCall(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalRequiredFunctionToolCall)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalRequiredFunctionToolCall FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalRequiredFunctionToolCall(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalRequiredFunctionToolCall.cs b/.dotnet/src/Generated/Models/InternalRequiredFunctionToolCall.cs new file mode 100644 index 000000000..3d44b809d --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalRequiredFunctionToolCall.cs @@ -0,0 +1,36 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + internal partial class InternalRequiredFunctionToolCall + { + internal IDictionary SerializedAdditionalRawData { get; set; } + internal InternalRequiredFunctionToolCall(string id, InternalRunToolCallObjectFunction 
internalFunction) + { + Argument.AssertNotNull(id, nameof(id)); + Argument.AssertNotNull(internalFunction, nameof(internalFunction)); + + Id = id; + _internalFunction = internalFunction; + } + + internal InternalRequiredFunctionToolCall(string id, object type, InternalRunToolCallObjectFunction internalFunction, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + Id = id; + _type = type; + _internalFunction = internalFunction; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal InternalRequiredFunctionToolCall() + { + } + + public string Id { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalResponseFormatJsonObject.Serialization.cs b/.dotnet/src/Generated/Models/InternalResponseFormatJsonObject.Serialization.cs new file mode 100644 index 000000000..66e2b97f1 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalResponseFormatJsonObject.Serialization.cs @@ -0,0 +1,133 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Internal +{ + internal partial class InternalResponseFormatJsonObject : IJsonModel<InternalResponseFormatJsonObject> + { + void IJsonModel<InternalResponseFormatJsonObject>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalResponseFormatJsonObject)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalResponseFormatJsonObject IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalResponseFormatJsonObject)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalResponseFormatJsonObject(document.RootElement, options); + } + + internal static InternalResponseFormatJsonObject DeserializeInternalResponseFormatJsonObject(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string type = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalResponseFormatJsonObject(type, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalResponseFormatJsonObject)} does not support writing '{options.Format}' format."); + } + } + + InternalResponseFormatJsonObject IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalResponseFormatJsonObject(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalResponseFormatJsonObject)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static new InternalResponseFormatJsonObject FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalResponseFormatJsonObject(document.RootElement); + } + + internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalResponseFormatJsonObject.cs b/.dotnet/src/Generated/Models/InternalResponseFormatJsonObject.cs new file mode 100644 index 000000000..b62b1891f --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalResponseFormatJsonObject.cs @@ -0,0 +1,21 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Internal +{ + internal partial class InternalResponseFormatJsonObject : InternalOmniTypedResponseFormat + { + public InternalResponseFormatJsonObject() + { + Type = "json_object"; + } + + internal InternalResponseFormatJsonObject(string type, IDictionary serializedAdditionalRawData) : base(type, serializedAdditionalRawData) + { + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalResponseFormatJsonSchema.Serialization.cs b/.dotnet/src/Generated/Models/InternalResponseFormatJsonSchema.Serialization.cs new file mode 100644 index 000000000..1746508f2 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalResponseFormatJsonSchema.Serialization.cs @@ -0,0 +1,144 @@ +// + +#nullable disable + 
+using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Internal +{ + internal partial class InternalResponseFormatJsonSchema : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalResponseFormatJsonSchema)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("json_schema") != true) + { + writer.WritePropertyName("json_schema"u8); + writer.WriteObjectValue(JsonSchema, options); + } + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalResponseFormatJsonSchema IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalResponseFormatJsonSchema)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalResponseFormatJsonSchema(document.RootElement, options); + } + + internal static InternalResponseFormatJsonSchema DeserializeInternalResponseFormatJsonSchema(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + InternalResponseFormatJsonSchemaJsonSchema jsonSchema = default; + string type = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("json_schema"u8)) + { + jsonSchema = InternalResponseFormatJsonSchemaJsonSchema.DeserializeInternalResponseFormatJsonSchemaJsonSchema(property.Value, options); + continue; + } + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalResponseFormatJsonSchema(type, serializedAdditionalRawData, jsonSchema); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalResponseFormatJsonSchema)} does not support writing '{options.Format}' format."); + } + } + + InternalResponseFormatJsonSchema IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalResponseFormatJsonSchema(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalResponseFormatJsonSchema)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static new InternalResponseFormatJsonSchema FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalResponseFormatJsonSchema(document.RootElement); + } + + internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalResponseFormatJsonSchema.cs b/.dotnet/src/Generated/Models/InternalResponseFormatJsonSchema.cs new file mode 100644 index 000000000..739eb22b9 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalResponseFormatJsonSchema.cs @@ -0,0 +1,31 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Internal +{ + internal partial class InternalResponseFormatJsonSchema : InternalOmniTypedResponseFormat + { + public InternalResponseFormatJsonSchema(InternalResponseFormatJsonSchemaJsonSchema jsonSchema) + { + 
Argument.AssertNotNull(jsonSchema, nameof(jsonSchema)); + + Type = "json_schema"; + JsonSchema = jsonSchema; + } + + internal InternalResponseFormatJsonSchema(string type, IDictionary<string, BinaryData> serializedAdditionalRawData, InternalResponseFormatJsonSchemaJsonSchema jsonSchema) : base(type, serializedAdditionalRawData) + { + JsonSchema = jsonSchema; + } + + internal InternalResponseFormatJsonSchema() + { + } + + public InternalResponseFormatJsonSchemaJsonSchema JsonSchema { get; set; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalResponseFormatJsonSchemaJsonSchema.Serialization.cs b/.dotnet/src/Generated/Models/InternalResponseFormatJsonSchemaJsonSchema.Serialization.cs new file mode 100644 index 000000000..5b9c4d930 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalResponseFormatJsonSchemaJsonSchema.Serialization.cs @@ -0,0 +1,189 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Internal +{ + internal partial class InternalResponseFormatJsonSchemaJsonSchema : IJsonModel<InternalResponseFormatJsonSchemaJsonSchema> + { + void IJsonModel<InternalResponseFormatJsonSchemaJsonSchema>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalResponseFormatJsonSchemaJsonSchema)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("description") != true && Optional.IsDefined(Description)) + { + writer.WritePropertyName("description"u8); + writer.WriteStringValue(Description); + } + if (SerializedAdditionalRawData?.ContainsKey("name") != true) + { + writer.WritePropertyName("name"u8); + writer.WriteStringValue(Name); + } + if (SerializedAdditionalRawData?.ContainsKey("schema") != true && Optional.IsDefined(Schema)) + { + writer.WritePropertyName("schema"u8); +#if NET6_0_OR_GREATER + writer.WriteRawValue(Schema); +#else + using (JsonDocument document = JsonDocument.Parse(Schema)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + if (SerializedAdditionalRawData?.ContainsKey("strict") != true && Optional.IsDefined(Strict)) + { + if (Strict != null) + { + writer.WritePropertyName("strict"u8); + writer.WriteBooleanValue(Strict.Value); + } + else + { + writer.WriteNull("strict"); + } + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalResponseFormatJsonSchemaJsonSchema IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalResponseFormatJsonSchemaJsonSchema)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalResponseFormatJsonSchemaJsonSchema(document.RootElement, options); + } + + internal static InternalResponseFormatJsonSchemaJsonSchema DeserializeInternalResponseFormatJsonSchemaJsonSchema(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string description = default; + string name = default; + BinaryData schema = default; + bool? strict = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("description"u8)) + { + description = property.Value.GetString(); + continue; + } + if (property.NameEquals("name"u8)) + { + name = property.Value.GetString(); + continue; + } + if (property.NameEquals("schema"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + schema = BinaryData.FromString(property.Value.GetRawText()); + continue; + } + if (property.NameEquals("strict"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + strict = null; + continue; + } + strict = property.Value.GetBoolean(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalResponseFormatJsonSchemaJsonSchema(description, name, schema, strict, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { 
+ var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalResponseFormatJsonSchemaJsonSchema)} does not support writing '{options.Format}' format."); + } + } + + InternalResponseFormatJsonSchemaJsonSchema IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalResponseFormatJsonSchemaJsonSchema(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalResponseFormatJsonSchemaJsonSchema)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalResponseFormatJsonSchemaJsonSchema FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalResponseFormatJsonSchemaJsonSchema(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalResponseFormatJsonSchemaJsonSchema.cs b/.dotnet/src/Generated/Models/InternalResponseFormatJsonSchemaJsonSchema.cs new file mode 100644 index 000000000..533d36db0 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalResponseFormatJsonSchemaJsonSchema.cs @@ -0,0 +1,37 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Internal +{ + internal partial class InternalResponseFormatJsonSchemaJsonSchema + { + internal IDictionary 
SerializedAdditionalRawData { get; set; } + public InternalResponseFormatJsonSchemaJsonSchema(string name) + { + Argument.AssertNotNull(name, nameof(name)); + + Name = name; + } + + internal InternalResponseFormatJsonSchemaJsonSchema(string description, string name, BinaryData schema, bool? strict, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + Description = description; + Name = name; + Schema = schema; + Strict = strict; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal InternalResponseFormatJsonSchemaJsonSchema() + { + } + + public string Description { get; set; } + public string Name { get; set; } + public BinaryData Schema { get; set; } + public bool? Strict { get; set; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalResponseFormatJsonSchemaSchema.Serialization.cs b/.dotnet/src/Generated/Models/InternalResponseFormatJsonSchemaSchema.Serialization.cs new file mode 100644 index 000000000..d77918543 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalResponseFormatJsonSchemaSchema.Serialization.cs @@ -0,0 +1,111 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Internal +{ + internal partial class InternalResponseFormatJsonSchemaSchema : IJsonModel<InternalResponseFormatJsonSchemaSchema> + { + void IJsonModel<InternalResponseFormatJsonSchemaSchema>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalResponseFormatJsonSchemaSchema)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + foreach (var item in AdditionalProperties) + { + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + writer.WriteEndObject(); + } + + InternalResponseFormatJsonSchemaSchema IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalResponseFormatJsonSchemaSchema)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalResponseFormatJsonSchemaSchema(document.RootElement, options); + } + + internal static InternalResponseFormatJsonSchemaSchema DeserializeInternalResponseFormatJsonSchemaSchema(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + IDictionary additionalProperties = default; + Dictionary additionalPropertiesDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + additionalPropertiesDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + additionalProperties = additionalPropertiesDictionary; + return new InternalResponseFormatJsonSchemaSchema(additionalProperties); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalResponseFormatJsonSchemaSchema)} does not support writing '{options.Format}' format."); + } + } + + InternalResponseFormatJsonSchemaSchema IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalResponseFormatJsonSchemaSchema(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalResponseFormatJsonSchemaSchema)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalResponseFormatJsonSchemaSchema FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalResponseFormatJsonSchemaSchema(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalResponseFormatJsonSchemaSchema.cs b/.dotnet/src/Generated/Models/InternalResponseFormatJsonSchemaSchema.cs new file mode 100644 index 000000000..d00500adb --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalResponseFormatJsonSchemaSchema.cs @@ -0,0 +1,24 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Internal +{ + internal partial class InternalResponseFormatJsonSchemaSchema + { + public InternalResponseFormatJsonSchemaSchema() + { + AdditionalProperties = new 
ChangeTrackingDictionary<string, BinaryData>(); + } + + internal InternalResponseFormatJsonSchemaSchema(IDictionary<string, BinaryData> additionalProperties) + { + AdditionalProperties = additionalProperties; + } + + public IDictionary<string, BinaryData> AdditionalProperties { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalResponseFormatText.Serialization.cs b/.dotnet/src/Generated/Models/InternalResponseFormatText.Serialization.cs new file mode 100644 index 000000000..b213da2e7 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalResponseFormatText.Serialization.cs @@ -0,0 +1,133 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Internal +{ + internal partial class InternalResponseFormatText : IJsonModel<InternalResponseFormatText> + { + void IJsonModel<InternalResponseFormatText>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalResponseFormatText>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalResponseFormatText)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalResponseFormatText IJsonModel<InternalResponseFormatText>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalResponseFormatText)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalResponseFormatText(document.RootElement, options); + } + + internal static InternalResponseFormatText DeserializeInternalResponseFormatText(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string type = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalResponseFormatText(type, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalResponseFormatText)} does not support writing '{options.Format}' format."); + } + } + + InternalResponseFormatText IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalResponseFormatText(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalResponseFormatText)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static new InternalResponseFormatText FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalResponseFormatText(document.RootElement); + } + + internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalResponseFormatText.cs b/.dotnet/src/Generated/Models/InternalResponseFormatText.cs new file mode 100644 index 000000000..efe4d6fce --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalResponseFormatText.cs @@ -0,0 +1,21 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Internal +{ + internal partial class InternalResponseFormatText : InternalOmniTypedResponseFormat + { + public InternalResponseFormatText() + { + Type = "text"; + } + + internal InternalResponseFormatText(string type, IDictionary serializedAdditionalRawData) : base(type, serializedAdditionalRawData) + { + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalResponseMessageTextContent.Serialization.cs b/.dotnet/src/Generated/Models/InternalResponseMessageTextContent.Serialization.cs new file mode 100644 index 000000000..4a89d0389 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalResponseMessageTextContent.Serialization.cs @@ -0,0 +1,103 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using 
System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class InternalResponseMessageTextContent : IJsonModel + { + InternalResponseMessageTextContent IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalResponseMessageTextContent)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalResponseMessageTextContent(document.RootElement, options); + } + + internal static InternalResponseMessageTextContent DeserializeInternalResponseMessageTextContent(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string type = default; + InternalMessageContentTextObjectText text = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (property.NameEquals("text"u8)) + { + text = InternalMessageContentTextObjectText.DeserializeInternalMessageContentTextObjectText(property.Value, options); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalResponseMessageTextContent(serializedAdditionalRawData, type, text); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalResponseMessageTextContent)} does not support writing '{options.Format}' format."); + } + } + + InternalResponseMessageTextContent IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalResponseMessageTextContent(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalResponseMessageTextContent)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static new InternalResponseMessageTextContent FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalResponseMessageTextContent(document.RootElement); + } + + internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalResponseMessageTextContent.cs b/.dotnet/src/Generated/Models/InternalResponseMessageTextContent.cs new file mode 100644 index 000000000..83930eb9b --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalResponseMessageTextContent.cs @@ -0,0 +1,22 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + internal partial class InternalResponseMessageTextContent : MessageContent + { + internal InternalResponseMessageTextContent(IDictionary serializedAdditionalRawData, string type, 
InternalMessageContentTextObjectText text) : base(serializedAdditionalRawData) + { + _type = type; + _text = text; + } + + internal InternalResponseMessageTextContent() + { + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalRunObjectObject.cs b/.dotnet/src/Generated/Models/InternalRunObjectObject.cs new file mode 100644 index 000000000..4f2c77d99 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalRunObjectObject.cs @@ -0,0 +1,34 @@ +// + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Assistants +{ + internal readonly partial struct InternalRunObjectObject : IEquatable<InternalRunObjectObject> + { + private readonly string _value; + + public InternalRunObjectObject(string value) + { + _value = value ?? throw new ArgumentNullException(nameof(value)); + } + + private const string ThreadRunValue = "thread.run"; + + public static InternalRunObjectObject ThreadRun { get; } = new InternalRunObjectObject(ThreadRunValue); + public static bool operator ==(InternalRunObjectObject left, InternalRunObjectObject right) => left.Equals(right); + public static bool operator !=(InternalRunObjectObject left, InternalRunObjectObject right) => !left.Equals(right); + public static implicit operator InternalRunObjectObject(string value) => new InternalRunObjectObject(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalRunObjectObject other && Equals(other); + public bool Equals(InternalRunObjectObject other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? 
StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalRunObjectRequiredActionSubmitToolOutputs.Serialization.cs b/.dotnet/src/Generated/Models/InternalRunObjectRequiredActionSubmitToolOutputs.Serialization.cs new file mode 100644 index 000000000..cd9245941 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalRunObjectRequiredActionSubmitToolOutputs.Serialization.cs @@ -0,0 +1,143 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class InternalRunObjectRequiredActionSubmitToolOutputs : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalRunObjectRequiredActionSubmitToolOutputs)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("tool_calls") != true) + { + writer.WritePropertyName("tool_calls"u8); + writer.WriteStartArray(); + foreach (var item in ToolCalls) + { + writer.WriteObjectValue(item, options); + } + writer.WriteEndArray(); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalRunObjectRequiredActionSubmitToolOutputs 
IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalRunObjectRequiredActionSubmitToolOutputs)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalRunObjectRequiredActionSubmitToolOutputs(document.RootElement, options); + } + + internal static InternalRunObjectRequiredActionSubmitToolOutputs DeserializeInternalRunObjectRequiredActionSubmitToolOutputs(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + IReadOnlyList toolCalls = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("tool_calls"u8)) + { + List array = new List(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(InternalRequiredFunctionToolCall.DeserializeInternalRequiredFunctionToolCall(item, options)); + } + toolCalls = array; + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalRunObjectRequiredActionSubmitToolOutputs(toolCalls, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalRunObjectRequiredActionSubmitToolOutputs)} does not support writing '{options.Format}' format."); + } + } + + InternalRunObjectRequiredActionSubmitToolOutputs IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalRunObjectRequiredActionSubmitToolOutputs(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalRunObjectRequiredActionSubmitToolOutputs)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalRunObjectRequiredActionSubmitToolOutputs FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalRunObjectRequiredActionSubmitToolOutputs(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalRunObjectRequiredActionSubmitToolOutputs.cs b/.dotnet/src/Generated/Models/InternalRunObjectRequiredActionSubmitToolOutputs.cs new file mode 100644 index 000000000..323fc341c --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalRunObjectRequiredActionSubmitToolOutputs.cs @@ -0,0 +1,33 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; +using System.Linq; + +namespace OpenAI.Assistants +{ + internal partial class 
InternalRunObjectRequiredActionSubmitToolOutputs + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + internal InternalRunObjectRequiredActionSubmitToolOutputs(IEnumerable<InternalRequiredFunctionToolCall> toolCalls) + { + Argument.AssertNotNull(toolCalls, nameof(toolCalls)); + + ToolCalls = toolCalls.ToList(); + } + + internal InternalRunObjectRequiredActionSubmitToolOutputs(IReadOnlyList<InternalRequiredFunctionToolCall> toolCalls, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + ToolCalls = toolCalls; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal InternalRunObjectRequiredActionSubmitToolOutputs() + { + } + + public IReadOnlyList<InternalRequiredFunctionToolCall> ToolCalls { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalRunObjectRequiredActionType.cs b/.dotnet/src/Generated/Models/InternalRunObjectRequiredActionType.cs new file mode 100644 index 000000000..bd53c3266 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalRunObjectRequiredActionType.cs @@ -0,0 +1,34 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Assistants +{ + internal readonly partial struct InternalRunObjectRequiredActionType : IEquatable<InternalRunObjectRequiredActionType> + { + private readonly string _value; + + public InternalRunObjectRequiredActionType(string value) + { + _value = value ?? 
throw new ArgumentNullException(nameof(value)); + } + + private const string SubmitToolOutputsValue = "submit_tool_outputs"; + + public static InternalRunObjectRequiredActionType SubmitToolOutputs { get; } = new InternalRunObjectRequiredActionType(SubmitToolOutputsValue); + public static bool operator ==(InternalRunObjectRequiredActionType left, InternalRunObjectRequiredActionType right) => left.Equals(right); + public static bool operator !=(InternalRunObjectRequiredActionType left, InternalRunObjectRequiredActionType right) => !left.Equals(right); + public static implicit operator InternalRunObjectRequiredActionType(string value) => new InternalRunObjectRequiredActionType(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalRunObjectRequiredActionType other && Equals(other); + public bool Equals(InternalRunObjectRequiredActionType other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalRunRequiredAction.Serialization.cs b/.dotnet/src/Generated/Models/InternalRunRequiredAction.Serialization.cs new file mode 100644 index 000000000..67fe40abc --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalRunRequiredAction.Serialization.cs @@ -0,0 +1,144 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class InternalRunRequiredAction : IJsonModel<InternalRunRequiredAction> + { + void IJsonModel<InternalRunRequiredAction>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalRunRequiredAction>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalRunRequiredAction)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteObjectValue(Type, options); + } + if (SerializedAdditionalRawData?.ContainsKey("submit_tool_outputs") != true) + { + writer.WritePropertyName("submit_tool_outputs"u8); + writer.WriteObjectValue(SubmitToolOutputs, options); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalRunRequiredAction IJsonModel<InternalRunRequiredAction>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalRunRequiredAction>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalRunRequiredAction)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalRunRequiredAction(document.RootElement, options); + } + + internal static InternalRunRequiredAction DeserializeInternalRunRequiredAction(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + object type = default; + InternalRunObjectRequiredActionSubmitToolOutputs submitToolOutputs = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("type"u8)) + { + type = property.Value.GetObject(); + continue; + } + if (property.NameEquals("submit_tool_outputs"u8)) + { + submitToolOutputs = InternalRunObjectRequiredActionSubmitToolOutputs.DeserializeInternalRunObjectRequiredActionSubmitToolOutputs(property.Value, options); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalRunRequiredAction(type, submitToolOutputs, serializedAdditionalRawData); + } + + BinaryData IPersistableModel<InternalRunRequiredAction>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalRunRequiredAction>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalRunRequiredAction)} does not support writing '{options.Format}' format."); + } + } + + InternalRunRequiredAction IPersistableModel<InternalRunRequiredAction>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalRunRequiredAction>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalRunRequiredAction(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalRunRequiredAction)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalRunRequiredAction>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalRunRequiredAction FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalRunRequiredAction(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalRunRequiredAction.cs b/.dotnet/src/Generated/Models/InternalRunRequiredAction.cs new file mode 100644 index 000000000..627dd8b84 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalRunRequiredAction.cs @@ -0,0 +1,33 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + internal partial class InternalRunRequiredAction + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + internal InternalRunRequiredAction(InternalRunObjectRequiredActionSubmitToolOutputs submitToolOutputs) + { + Argument.AssertNotNull(submitToolOutputs, 
nameof(submitToolOutputs)); + + SubmitToolOutputs = submitToolOutputs; + } + + internal InternalRunRequiredAction(object type, InternalRunObjectRequiredActionSubmitToolOutputs submitToolOutputs, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + Type = type; + SubmitToolOutputs = submitToolOutputs; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal InternalRunRequiredAction() + { + } + + public InternalRunObjectRequiredActionSubmitToolOutputs SubmitToolOutputs { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalRunStepCodeInterpreterLogOutput.Serialization.cs b/.dotnet/src/Generated/Models/InternalRunStepCodeInterpreterLogOutput.Serialization.cs new file mode 100644 index 000000000..84313365a --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalRunStepCodeInterpreterLogOutput.Serialization.cs @@ -0,0 +1,144 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class InternalRunStepCodeInterpreterLogOutput : IJsonModel<InternalRunStepCodeInterpreterLogOutput> + { + void IJsonModel<InternalRunStepCodeInterpreterLogOutput>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalRunStepCodeInterpreterLogOutput>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalRunStepCodeInterpreterLogOutput)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("logs") != true) + { + writer.WritePropertyName("logs"u8); + writer.WriteStringValue(InternalLogs); + } + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalRunStepCodeInterpreterLogOutput IJsonModel<InternalRunStepCodeInterpreterLogOutput>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalRunStepCodeInterpreterLogOutput>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalRunStepCodeInterpreterLogOutput)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalRunStepCodeInterpreterLogOutput(document.RootElement, options); + } + + internal static InternalRunStepCodeInterpreterLogOutput DeserializeInternalRunStepCodeInterpreterLogOutput(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string logs = default; + string type = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("logs"u8)) + { + logs = property.Value.GetString(); + continue; + } + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalRunStepCodeInterpreterLogOutput(type, serializedAdditionalRawData, logs); + } + + BinaryData IPersistableModel<InternalRunStepCodeInterpreterLogOutput>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalRunStepCodeInterpreterLogOutput>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalRunStepCodeInterpreterLogOutput)} does not support writing '{options.Format}' format."); + } + } + + InternalRunStepCodeInterpreterLogOutput IPersistableModel<InternalRunStepCodeInterpreterLogOutput>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalRunStepCodeInterpreterLogOutput>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalRunStepCodeInterpreterLogOutput(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalRunStepCodeInterpreterLogOutput)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalRunStepCodeInterpreterLogOutput>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static new InternalRunStepCodeInterpreterLogOutput FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalRunStepCodeInterpreterLogOutput(document.RootElement); + } + + internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalRunStepCodeInterpreterLogOutput.cs b/.dotnet/src/Generated/Models/InternalRunStepCodeInterpreterLogOutput.cs new file mode 100644 index 000000000..a44f25f41 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalRunStepCodeInterpreterLogOutput.cs @@ -0,0 +1,29 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + internal partial class InternalRunStepCodeInterpreterLogOutput : RunStepCodeInterpreterOutput + { + internal InternalRunStepCodeInterpreterLogOutput(string 
internalLogs) + { + Argument.AssertNotNull(internalLogs, nameof(internalLogs)); + + Type = "logs"; + InternalLogs = internalLogs; + } + + internal InternalRunStepCodeInterpreterLogOutput(string type, IDictionary<string, BinaryData> serializedAdditionalRawData, string internalLogs) : base(type, serializedAdditionalRawData) + { + InternalLogs = internalLogs; + } + + internal InternalRunStepCodeInterpreterLogOutput() + { + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalRunStepCodeInterpreterToolCallDetails.Serialization.cs b/.dotnet/src/Generated/Models/InternalRunStepCodeInterpreterToolCallDetails.Serialization.cs new file mode 100644 index 000000000..af3e3f3fd --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalRunStepCodeInterpreterToolCallDetails.Serialization.cs @@ -0,0 +1,155 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class InternalRunStepCodeInterpreterToolCallDetails : IJsonModel<InternalRunStepCodeInterpreterToolCallDetails> + { + void IJsonModel<InternalRunStepCodeInterpreterToolCallDetails>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalRunStepCodeInterpreterToolCallDetails>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalRunStepCodeInterpreterToolCallDetails)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("id") != true) + { + writer.WritePropertyName("id"u8); + writer.WriteStringValue(Id); + } + if (SerializedAdditionalRawData?.ContainsKey("code_interpreter") != true) + { + writer.WritePropertyName("code_interpreter"u8); + writer.WriteObjectValue(_codeInterpreter, options); + } + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalRunStepCodeInterpreterToolCallDetails IJsonModel<InternalRunStepCodeInterpreterToolCallDetails>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalRunStepCodeInterpreterToolCallDetails>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalRunStepCodeInterpreterToolCallDetails)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalRunStepCodeInterpreterToolCallDetails(document.RootElement, options); + } + + internal static InternalRunStepCodeInterpreterToolCallDetails DeserializeInternalRunStepCodeInterpreterToolCallDetails(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string id = default; + InternalRunStepDetailsToolCallsCodeObjectCodeInterpreter codeInterpreter = default; + string type = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("id"u8)) + { + id = property.Value.GetString(); + continue; + } + if (property.NameEquals("code_interpreter"u8)) + { + codeInterpreter = InternalRunStepDetailsToolCallsCodeObjectCodeInterpreter.DeserializeInternalRunStepDetailsToolCallsCodeObjectCodeInterpreter(property.Value, options); + continue; + } + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalRunStepCodeInterpreterToolCallDetails(type, serializedAdditionalRawData, id, codeInterpreter); + } + + BinaryData IPersistableModel<InternalRunStepCodeInterpreterToolCallDetails>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalRunStepCodeInterpreterToolCallDetails>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalRunStepCodeInterpreterToolCallDetails)} does not support writing '{options.Format}' format."); + } + } + + InternalRunStepCodeInterpreterToolCallDetails IPersistableModel<InternalRunStepCodeInterpreterToolCallDetails>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalRunStepCodeInterpreterToolCallDetails>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalRunStepCodeInterpreterToolCallDetails(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalRunStepCodeInterpreterToolCallDetails)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalRunStepCodeInterpreterToolCallDetails>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static new InternalRunStepCodeInterpreterToolCallDetails FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalRunStepCodeInterpreterToolCallDetails(document.RootElement); + } + + internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalRunStepCodeInterpreterToolCallDetails.cs b/.dotnet/src/Generated/Models/InternalRunStepCodeInterpreterToolCallDetails.cs new file mode 100644 index 000000000..0b72cde80 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalRunStepCodeInterpreterToolCallDetails.cs @@ -0,0 +1,34 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + internal partial class InternalRunStepCodeInterpreterToolCallDetails : RunStepToolCall + { + internal 
InternalRunStepCodeInterpreterToolCallDetails(string id, InternalRunStepDetailsToolCallsCodeObjectCodeInterpreter codeInterpreter) + { + Argument.AssertNotNull(id, nameof(id)); + Argument.AssertNotNull(codeInterpreter, nameof(codeInterpreter)); + + Type = "code_interpreter"; + Id = id; + _codeInterpreter = codeInterpreter; + } + + internal InternalRunStepCodeInterpreterToolCallDetails(string type, IDictionary<string, BinaryData> serializedAdditionalRawData, string id, InternalRunStepDetailsToolCallsCodeObjectCodeInterpreter codeInterpreter) : base(type, serializedAdditionalRawData) + { + Id = id; + _codeInterpreter = codeInterpreter; + } + + internal InternalRunStepCodeInterpreterToolCallDetails() + { + } + + public string Id { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalRunStepDelta.Serialization.cs b/.dotnet/src/Generated/Models/InternalRunStepDelta.Serialization.cs new file mode 100644 index 000000000..3b6d44463 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalRunStepDelta.Serialization.cs @@ -0,0 +1,155 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class InternalRunStepDelta : IJsonModel<InternalRunStepDelta> + { + void IJsonModel<InternalRunStepDelta>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalRunStepDelta>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalRunStepDelta)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("id") != true) + { + writer.WritePropertyName("id"u8); + writer.WriteStringValue(Id); + } + if (SerializedAdditionalRawData?.ContainsKey("object") != true) + { + writer.WritePropertyName("object"u8); + writer.WriteObjectValue(Object, options); + } + if (SerializedAdditionalRawData?.ContainsKey("delta") != true) + { + writer.WritePropertyName("delta"u8); + writer.WriteObjectValue(Delta, options); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalRunStepDelta IJsonModel<InternalRunStepDelta>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalRunStepDelta>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalRunStepDelta)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalRunStepDelta(document.RootElement, options); + } + + internal static InternalRunStepDelta DeserializeInternalRunStepDelta(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string id = default; + object @object = default; + InternalRunStepDeltaObjectDelta delta = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("id"u8)) + { + id = property.Value.GetString(); + continue; + } + if (property.NameEquals("object"u8)) + { + @object = property.Value.GetObject(); + continue; + } + if (property.NameEquals("delta"u8)) + { + delta = InternalRunStepDeltaObjectDelta.DeserializeInternalRunStepDeltaObjectDelta(property.Value, options); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalRunStepDelta(id, @object, delta, serializedAdditionalRawData); + } + + BinaryData IPersistableModel<InternalRunStepDelta>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalRunStepDelta>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalRunStepDelta)} does not support writing '{options.Format}' format."); + } + } + + InternalRunStepDelta IPersistableModel<InternalRunStepDelta>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalRunStepDelta>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalRunStepDelta(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalRunStepDelta)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalRunStepDelta>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalRunStepDelta FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalRunStepDelta(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalRunStepDelta.cs b/.dotnet/src/Generated/Models/InternalRunStepDelta.cs new file mode 100644 index 000000000..55d73464a --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalRunStepDelta.cs @@ -0,0 +1,38 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + internal partial class InternalRunStepDelta + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + internal InternalRunStepDelta(string id, InternalRunStepDeltaObjectDelta delta) + { + Argument.AssertNotNull(id, nameof(id)); + Argument.AssertNotNull(delta, nameof(delta)); + + Id = id; + Delta = delta; + } + + 
internal InternalRunStepDelta(string id, object @object, InternalRunStepDeltaObjectDelta delta, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + Id = id; + Object = @object; + Delta = delta; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal InternalRunStepDelta() + { + } + + public string Id { get; } + + public InternalRunStepDeltaObjectDelta Delta { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalRunStepDeltaObjectDelta.Serialization.cs b/.dotnet/src/Generated/Models/InternalRunStepDeltaObjectDelta.Serialization.cs new file mode 100644 index 000000000..b3430a127 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalRunStepDeltaObjectDelta.Serialization.cs @@ -0,0 +1,137 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class InternalRunStepDeltaObjectDelta : IJsonModel<InternalRunStepDeltaObjectDelta> + { + void IJsonModel<InternalRunStepDeltaObjectDelta>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalRunStepDeltaObjectDelta>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalRunStepDeltaObjectDelta)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("step_details") != true && Optional.IsDefined(StepDetails)) + { + writer.WritePropertyName("step_details"u8); + writer.WriteObjectValue(StepDetails, options); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalRunStepDeltaObjectDelta IJsonModel<InternalRunStepDeltaObjectDelta>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalRunStepDeltaObjectDelta>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalRunStepDeltaObjectDelta)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalRunStepDeltaObjectDelta(document.RootElement, options); + } + + internal static InternalRunStepDeltaObjectDelta DeserializeInternalRunStepDeltaObjectDelta(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + InternalRunStepDeltaStepDetails stepDetails = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("step_details"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + stepDetails = InternalRunStepDeltaStepDetails.DeserializeInternalRunStepDeltaStepDetails(property.Value, options); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalRunStepDeltaObjectDelta(stepDetails, serializedAdditionalRawData); + } + + BinaryData IPersistableModel<InternalRunStepDeltaObjectDelta>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalRunStepDeltaObjectDelta>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalRunStepDeltaObjectDelta)} does not support writing '{options.Format}' format."); + } + } + + InternalRunStepDeltaObjectDelta IPersistableModel<InternalRunStepDeltaObjectDelta>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalRunStepDeltaObjectDelta>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalRunStepDeltaObjectDelta(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalRunStepDeltaObjectDelta)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalRunStepDeltaObjectDelta>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalRunStepDeltaObjectDelta FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalRunStepDeltaObjectDelta(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalRunStepDeltaObjectDelta.cs b/.dotnet/src/Generated/Models/InternalRunStepDeltaObjectDelta.cs new file mode 100644 index 000000000..1faf845db --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalRunStepDeltaObjectDelta.cs @@ -0,0 +1,25 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + internal partial class InternalRunStepDeltaObjectDelta + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + internal InternalRunStepDeltaObjectDelta() + { + } + + internal 
InternalRunStepDeltaObjectDelta(InternalRunStepDeltaStepDetails stepDetails, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + StepDetails = stepDetails; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + public InternalRunStepDeltaStepDetails StepDetails { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalRunStepDeltaObjectObject.cs b/.dotnet/src/Generated/Models/InternalRunStepDeltaObjectObject.cs new file mode 100644 index 000000000..ea17969d2 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalRunStepDeltaObjectObject.cs @@ -0,0 +1,34 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Assistants +{ + internal readonly partial struct InternalRunStepDeltaObjectObject : IEquatable<InternalRunStepDeltaObjectObject> + { + private readonly string _value; + + public InternalRunStepDeltaObjectObject(string value) + { + _value = value ?? throw new ArgumentNullException(nameof(value)); + } + + private const string ThreadRunStepDeltaValue = "thread.run.step.delta"; + + public static InternalRunStepDeltaObjectObject ThreadRunStepDelta { get; } = new InternalRunStepDeltaObjectObject(ThreadRunStepDeltaValue); + public static bool operator ==(InternalRunStepDeltaObjectObject left, InternalRunStepDeltaObjectObject right) => left.Equals(right); + public static bool operator !=(InternalRunStepDeltaObjectObject left, InternalRunStepDeltaObjectObject right) => !left.Equals(right); + public static implicit operator InternalRunStepDeltaObjectObject(string value) => new InternalRunStepDeltaObjectObject(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalRunStepDeltaObjectObject other && Equals(other); + public bool Equals(InternalRunStepDeltaObjectObject other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? 
StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetails.Serialization.cs b/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetails.Serialization.cs new file mode 100644 index 000000000..6ba369d00 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetails.Serialization.cs @@ -0,0 +1,124 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + [PersistableModelProxy(typeof(UnknownRunStepDeltaStepDetails))] + internal partial class InternalRunStepDeltaStepDetails : IJsonModel<InternalRunStepDeltaStepDetails> + { + void IJsonModel<InternalRunStepDeltaStepDetails>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalRunStepDeltaStepDetails>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalRunStepDeltaStepDetails)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalRunStepDeltaStepDetails IJsonModel<InternalRunStepDeltaStepDetails>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalRunStepDeltaStepDetails>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalRunStepDeltaStepDetails)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalRunStepDeltaStepDetails(document.RootElement, options); + } + + internal static InternalRunStepDeltaStepDetails DeserializeInternalRunStepDeltaStepDetails(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + if (element.TryGetProperty("type", out JsonElement discriminator)) + { + switch (discriminator.GetString()) + { + case "message_creation": return InternalRunStepDeltaStepDetailsMessageCreationObject.DeserializeInternalRunStepDeltaStepDetailsMessageCreationObject(element, options); + case "tool_calls": return InternalRunStepDeltaStepDetailsToolCallsObject.DeserializeInternalRunStepDeltaStepDetailsToolCallsObject(element, options); + } + } + return UnknownRunStepDeltaStepDetails.DeserializeUnknownRunStepDeltaStepDetails(element, options); + } + + BinaryData IPersistableModel<InternalRunStepDeltaStepDetails>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalRunStepDeltaStepDetails>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalRunStepDeltaStepDetails)} does not support writing '{options.Format}' format."); + } + } + + InternalRunStepDeltaStepDetails IPersistableModel<InternalRunStepDeltaStepDetails>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalRunStepDeltaStepDetails>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalRunStepDeltaStepDetails(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalRunStepDeltaStepDetails)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalRunStepDeltaStepDetails>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalRunStepDeltaStepDetails FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalRunStepDeltaStepDetails(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetails.cs b/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetails.cs new file mode 100644 index 000000000..987886798 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetails.cs @@ -0,0 +1,25 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + internal abstract partial class InternalRunStepDeltaStepDetails + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + protected InternalRunStepDeltaStepDetails() + { + } + + internal InternalRunStepDeltaStepDetails(string type, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + Type = type; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal string Type { get; set; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsMessageCreationObject.Serialization.cs b/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsMessageCreationObject.Serialization.cs new file mode 100644 index 000000000..0c0825062 --- /dev/null +++ 
b/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsMessageCreationObject.Serialization.cs @@ -0,0 +1,148 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class InternalRunStepDeltaStepDetailsMessageCreationObject : IJsonModel<InternalRunStepDeltaStepDetailsMessageCreationObject> + { + void IJsonModel<InternalRunStepDeltaStepDetailsMessageCreationObject>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalRunStepDeltaStepDetailsMessageCreationObject>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalRunStepDeltaStepDetailsMessageCreationObject)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("message_creation") != true && Optional.IsDefined(MessageCreation)) + { + writer.WritePropertyName("message_creation"u8); + writer.WriteObjectValue(MessageCreation, options); + } + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalRunStepDeltaStepDetailsMessageCreationObject IJsonModel<InternalRunStepDeltaStepDetailsMessageCreationObject>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalRunStepDeltaStepDetailsMessageCreationObject>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalRunStepDeltaStepDetailsMessageCreationObject)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalRunStepDeltaStepDetailsMessageCreationObject(document.RootElement, options); + } + + internal static InternalRunStepDeltaStepDetailsMessageCreationObject DeserializeInternalRunStepDeltaStepDetailsMessageCreationObject(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + InternalRunStepDeltaStepDetailsMessageCreationObjectMessageCreation messageCreation = default; + string type = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("message_creation"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + messageCreation = InternalRunStepDeltaStepDetailsMessageCreationObjectMessageCreation.DeserializeInternalRunStepDeltaStepDetailsMessageCreationObjectMessageCreation(property.Value, options); + continue; + } + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalRunStepDeltaStepDetailsMessageCreationObject(type, serializedAdditionalRawData, messageCreation); + } + + BinaryData IPersistableModel<InternalRunStepDeltaStepDetailsMessageCreationObject>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalRunStepDeltaStepDetailsMessageCreationObject>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalRunStepDeltaStepDetailsMessageCreationObject)} does not support writing '{options.Format}' format."); + } + } + + InternalRunStepDeltaStepDetailsMessageCreationObject IPersistableModel<InternalRunStepDeltaStepDetailsMessageCreationObject>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalRunStepDeltaStepDetailsMessageCreationObject>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalRunStepDeltaStepDetailsMessageCreationObject(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalRunStepDeltaStepDetailsMessageCreationObject)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalRunStepDeltaStepDetailsMessageCreationObject>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static new InternalRunStepDeltaStepDetailsMessageCreationObject FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalRunStepDeltaStepDetailsMessageCreationObject(document.RootElement); + } + + internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsMessageCreationObject.cs b/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsMessageCreationObject.cs new file mode 100644 index 000000000..0ee0b50a4 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsMessageCreationObject.cs @@ -0,0 +1,24 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + internal partial class 
InternalRunStepDeltaStepDetailsMessageCreationObject : InternalRunStepDeltaStepDetails + { + internal InternalRunStepDeltaStepDetailsMessageCreationObject() + { + Type = "message_creation"; + } + + internal InternalRunStepDeltaStepDetailsMessageCreationObject(string type, IDictionary<string, BinaryData> serializedAdditionalRawData, InternalRunStepDeltaStepDetailsMessageCreationObjectMessageCreation messageCreation) : base(type, serializedAdditionalRawData) + { + MessageCreation = messageCreation; + } + + public InternalRunStepDeltaStepDetailsMessageCreationObjectMessageCreation MessageCreation { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsMessageCreationObjectMessageCreation.Serialization.cs b/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsMessageCreationObjectMessageCreation.Serialization.cs new file mode 100644 index 000000000..24e660350 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsMessageCreationObjectMessageCreation.Serialization.cs @@ -0,0 +1,133 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class InternalRunStepDeltaStepDetailsMessageCreationObjectMessageCreation : IJsonModel<InternalRunStepDeltaStepDetailsMessageCreationObjectMessageCreation> + { + void IJsonModel<InternalRunStepDeltaStepDetailsMessageCreationObjectMessageCreation>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalRunStepDeltaStepDetailsMessageCreationObjectMessageCreation>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalRunStepDeltaStepDetailsMessageCreationObjectMessageCreation)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("message_id") != true && Optional.IsDefined(MessageId)) + { + writer.WritePropertyName("message_id"u8); + writer.WriteStringValue(MessageId); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalRunStepDeltaStepDetailsMessageCreationObjectMessageCreation IJsonModel<InternalRunStepDeltaStepDetailsMessageCreationObjectMessageCreation>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalRunStepDeltaStepDetailsMessageCreationObjectMessageCreation>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalRunStepDeltaStepDetailsMessageCreationObjectMessageCreation)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalRunStepDeltaStepDetailsMessageCreationObjectMessageCreation(document.RootElement, options); + } + + internal static InternalRunStepDeltaStepDetailsMessageCreationObjectMessageCreation DeserializeInternalRunStepDeltaStepDetailsMessageCreationObjectMessageCreation(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string messageId = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("message_id"u8)) + { + messageId = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalRunStepDeltaStepDetailsMessageCreationObjectMessageCreation(messageId, serializedAdditionalRawData); + } + + BinaryData IPersistableModel<InternalRunStepDeltaStepDetailsMessageCreationObjectMessageCreation>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalRunStepDeltaStepDetailsMessageCreationObjectMessageCreation>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalRunStepDeltaStepDetailsMessageCreationObjectMessageCreation)} does not support writing '{options.Format}' format."); + } + } + + InternalRunStepDeltaStepDetailsMessageCreationObjectMessageCreation IPersistableModel<InternalRunStepDeltaStepDetailsMessageCreationObjectMessageCreation>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalRunStepDeltaStepDetailsMessageCreationObjectMessageCreation>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalRunStepDeltaStepDetailsMessageCreationObjectMessageCreation(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalRunStepDeltaStepDetailsMessageCreationObjectMessageCreation)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalRunStepDeltaStepDetailsMessageCreationObjectMessageCreation>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalRunStepDeltaStepDetailsMessageCreationObjectMessageCreation FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalRunStepDeltaStepDetailsMessageCreationObjectMessageCreation(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsMessageCreationObjectMessageCreation.cs b/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsMessageCreationObjectMessageCreation.cs new file mode 100644 index 000000000..812f249c3 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsMessageCreationObjectMessageCreation.cs @@ -0,0 +1,25 @@ +// <auto-generated/> + +#nullable 
disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + internal partial class InternalRunStepDeltaStepDetailsMessageCreationObjectMessageCreation + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + internal InternalRunStepDeltaStepDetailsMessageCreationObjectMessageCreation() + { + } + + internal InternalRunStepDeltaStepDetailsMessageCreationObjectMessageCreation(string messageId, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + MessageId = messageId; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + public string MessageId { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsToolCallsCodeObject.Serialization.cs b/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsToolCallsCodeObject.Serialization.cs new file mode 100644 index 000000000..0b67792c5 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsToolCallsCodeObject.Serialization.cs @@ -0,0 +1,170 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class InternalRunStepDeltaStepDetailsToolCallsCodeObject : IJsonModel<InternalRunStepDeltaStepDetailsToolCallsCodeObject> + { + void IJsonModel<InternalRunStepDeltaStepDetailsToolCallsCodeObject>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalRunStepDeltaStepDetailsToolCallsCodeObject>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalRunStepDeltaStepDetailsToolCallsCodeObject)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("index") != true) + { + writer.WritePropertyName("index"u8); + writer.WriteNumberValue(Index); + } + if (SerializedAdditionalRawData?.ContainsKey("id") != true && Optional.IsDefined(Id)) + { + writer.WritePropertyName("id"u8); + writer.WriteStringValue(Id); + } + if (SerializedAdditionalRawData?.ContainsKey("code_interpreter") != true && Optional.IsDefined(CodeInterpreter)) + { + writer.WritePropertyName("code_interpreter"u8); + writer.WriteObjectValue(CodeInterpreter, options); + } + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalRunStepDeltaStepDetailsToolCallsCodeObject IJsonModel<InternalRunStepDeltaStepDetailsToolCallsCodeObject>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalRunStepDeltaStepDetailsToolCallsCodeObject>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalRunStepDeltaStepDetailsToolCallsCodeObject)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalRunStepDeltaStepDetailsToolCallsCodeObject(document.RootElement, options); + } + + internal static InternalRunStepDeltaStepDetailsToolCallsCodeObject DeserializeInternalRunStepDeltaStepDetailsToolCallsCodeObject(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + int index = default; + string id = default; + InternalRunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreter codeInterpreter = default; + string type = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("index"u8)) + { + index = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("id"u8)) + { + id = property.Value.GetString(); + continue; + } + if (property.NameEquals("code_interpreter"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + codeInterpreter = InternalRunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreter.DeserializeInternalRunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreter(property.Value, options); + continue; + } + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalRunStepDeltaStepDetailsToolCallsCodeObject(type, 
serializedAdditionalRawData, index, id, codeInterpreter); + } + + BinaryData IPersistableModel<InternalRunStepDeltaStepDetailsToolCallsCodeObject>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalRunStepDeltaStepDetailsToolCallsCodeObject>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalRunStepDeltaStepDetailsToolCallsCodeObject)} does not support writing '{options.Format}' format."); + } + } + + InternalRunStepDeltaStepDetailsToolCallsCodeObject IPersistableModel<InternalRunStepDeltaStepDetailsToolCallsCodeObject>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalRunStepDeltaStepDetailsToolCallsCodeObject>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalRunStepDeltaStepDetailsToolCallsCodeObject(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalRunStepDeltaStepDetailsToolCallsCodeObject)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalRunStepDeltaStepDetailsToolCallsCodeObject>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static new InternalRunStepDeltaStepDetailsToolCallsCodeObject FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalRunStepDeltaStepDetailsToolCallsCodeObject(document.RootElement); + } + + internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsToolCallsCodeObject.cs b/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsToolCallsCodeObject.cs new file mode 100644 index 000000000..ccd8ec23b --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsToolCallsCodeObject.cs @@ -0,0 
+1,33 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + internal partial class InternalRunStepDeltaStepDetailsToolCallsCodeObject : InternalRunStepDeltaStepDetailsToolCallsObjectToolCallsObject + { + internal InternalRunStepDeltaStepDetailsToolCallsCodeObject(int index) + { + Type = "code_interpreter"; + Index = index; + } + + internal InternalRunStepDeltaStepDetailsToolCallsCodeObject(string type, IDictionary<string, BinaryData> serializedAdditionalRawData, int index, string id, InternalRunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreter codeInterpreter) : base(type, serializedAdditionalRawData) + { + Index = index; + Id = id; + CodeInterpreter = codeInterpreter; + } + + internal InternalRunStepDeltaStepDetailsToolCallsCodeObject() + { + } + + public int Index { get; } + public string Id { get; } + public InternalRunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreter CodeInterpreter { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreter.Serialization.cs b/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreter.Serialization.cs new file mode 100644 index 000000000..a95a27852 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreter.Serialization.cs @@ -0,0 +1,158 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class InternalRunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreter : IJsonModel<InternalRunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreter> + { + void IJsonModel<InternalRunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreter>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalRunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreter>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalRunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreter)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("input") != true && Optional.IsDefined(Input)) + { + writer.WritePropertyName("input"u8); + writer.WriteStringValue(Input); + } + if (SerializedAdditionalRawData?.ContainsKey("outputs") != true && Optional.IsCollectionDefined(Outputs)) + { + writer.WritePropertyName("outputs"u8); + writer.WriteStartArray(); + foreach (var item in Outputs) + { + writer.WriteObjectValue(item, options); + } + writer.WriteEndArray(); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalRunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreter IJsonModel<InternalRunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreter>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalRunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreter>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalRunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreter)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalRunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreter(document.RootElement, options); + } + + internal static InternalRunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreter DeserializeInternalRunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreter(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string input = default; + IReadOnlyList<RunStepUpdateCodeInterpreterOutput> outputs = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("input"u8)) + { + input = property.Value.GetString(); + continue; + } + if (property.NameEquals("outputs"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + List<RunStepUpdateCodeInterpreterOutput> array = new List<RunStepUpdateCodeInterpreterOutput>(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(RunStepUpdateCodeInterpreterOutput.DeserializeRunStepUpdateCodeInterpreterOutput(item, options)); + } + outputs = array; + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalRunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreter(input, outputs ?? new ChangeTrackingList<RunStepUpdateCodeInterpreterOutput>(), serializedAdditionalRawData); + } + + BinaryData IPersistableModel<InternalRunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreter>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalRunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreter>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalRunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreter)} does not support writing '{options.Format}' format."); + } + } + + InternalRunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreter IPersistableModel<InternalRunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreter>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalRunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreter>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalRunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreter(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalRunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreter)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalRunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreter>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalRunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreter FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalRunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreter(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreter.cs b/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreter.cs new file mode 100644 index 000000000..2330d340a --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreter.cs @@ -0,0 +1,28 @@ +// <auto-generated/> + +#nullable disable + +using 
System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + internal partial class InternalRunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreter + { + internal IDictionary SerializedAdditionalRawData { get; set; } + internal InternalRunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreter() + { + Outputs = new ChangeTrackingList(); + } + + internal InternalRunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreter(string input, IReadOnlyList outputs, IDictionary serializedAdditionalRawData) + { + Input = input; + Outputs = outputs; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + public string Input { get; } + public IReadOnlyList Outputs { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsToolCallsCodeOutputImageObject.Serialization.cs b/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsToolCallsCodeOutputImageObject.Serialization.cs new file mode 100644 index 000000000..2e1725b51 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsToolCallsCodeOutputImageObject.Serialization.cs @@ -0,0 +1,159 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class InternalRunStepDeltaStepDetailsToolCallsCodeOutputImageObject : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalRunStepDeltaStepDetailsToolCallsCodeOutputImageObject)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("index") != true) + { + writer.WritePropertyName("index"u8); + writer.WriteNumberValue(Index); + } + if (SerializedAdditionalRawData?.ContainsKey("image") != true && Optional.IsDefined(Image)) + { + writer.WritePropertyName("image"u8); + writer.WriteObjectValue(Image, options); + } + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalRunStepDeltaStepDetailsToolCallsCodeOutputImageObject IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalRunStepDeltaStepDetailsToolCallsCodeOutputImageObject)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalRunStepDeltaStepDetailsToolCallsCodeOutputImageObject(document.RootElement, options); + } + + internal static InternalRunStepDeltaStepDetailsToolCallsCodeOutputImageObject DeserializeInternalRunStepDeltaStepDetailsToolCallsCodeOutputImageObject(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + int index = default; + InternalRunStepDeltaStepDetailsToolCallsCodeOutputImageObjectImage image = default; + string type = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("index"u8)) + { + index = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("image"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + image = InternalRunStepDeltaStepDetailsToolCallsCodeOutputImageObjectImage.DeserializeInternalRunStepDeltaStepDetailsToolCallsCodeOutputImageObjectImage(property.Value, options); + continue; + } + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalRunStepDeltaStepDetailsToolCallsCodeOutputImageObject(type, serializedAdditionalRawData, index, image); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions 
options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalRunStepDeltaStepDetailsToolCallsCodeOutputImageObject)} does not support writing '{options.Format}' format."); + } + } + + InternalRunStepDeltaStepDetailsToolCallsCodeOutputImageObject IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalRunStepDeltaStepDetailsToolCallsCodeOutputImageObject(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalRunStepDeltaStepDetailsToolCallsCodeOutputImageObject)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static new InternalRunStepDeltaStepDetailsToolCallsCodeOutputImageObject FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalRunStepDeltaStepDetailsToolCallsCodeOutputImageObject(document.RootElement); + } + + internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsToolCallsCodeOutputImageObject.cs b/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsToolCallsCodeOutputImageObject.cs new file mode 100644 index 000000000..26acd5b75 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsToolCallsCodeOutputImageObject.cs @@ -0,0 +1,31 @@ +// + +#nullable 
disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + internal partial class InternalRunStepDeltaStepDetailsToolCallsCodeOutputImageObject : RunStepUpdateCodeInterpreterOutput + { + internal InternalRunStepDeltaStepDetailsToolCallsCodeOutputImageObject(int index) + { + Type = "image"; + Index = index; + } + + internal InternalRunStepDeltaStepDetailsToolCallsCodeOutputImageObject(string type, IDictionary serializedAdditionalRawData, int index, InternalRunStepDeltaStepDetailsToolCallsCodeOutputImageObjectImage image) : base(type, serializedAdditionalRawData) + { + Index = index; + Image = image; + } + + internal InternalRunStepDeltaStepDetailsToolCallsCodeOutputImageObject() + { + } + + public int Index { get; } + public InternalRunStepDeltaStepDetailsToolCallsCodeOutputImageObjectImage Image { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsToolCallsCodeOutputImageObjectImage.Serialization.cs b/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsToolCallsCodeOutputImageObjectImage.Serialization.cs new file mode 100644 index 000000000..9ad45a123 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsToolCallsCodeOutputImageObjectImage.Serialization.cs @@ -0,0 +1,133 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class InternalRunStepDeltaStepDetailsToolCallsCodeOutputImageObjectImage : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalRunStepDeltaStepDetailsToolCallsCodeOutputImageObjectImage)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("file_id") != true && Optional.IsDefined(FileId)) + { + writer.WritePropertyName("file_id"u8); + writer.WriteStringValue(FileId); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalRunStepDeltaStepDetailsToolCallsCodeOutputImageObjectImage IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalRunStepDeltaStepDetailsToolCallsCodeOutputImageObjectImage)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalRunStepDeltaStepDetailsToolCallsCodeOutputImageObjectImage(document.RootElement, options); + } + + internal static InternalRunStepDeltaStepDetailsToolCallsCodeOutputImageObjectImage DeserializeInternalRunStepDeltaStepDetailsToolCallsCodeOutputImageObjectImage(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string fileId = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("file_id"u8)) + { + fileId = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalRunStepDeltaStepDetailsToolCallsCodeOutputImageObjectImage(fileId, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalRunStepDeltaStepDetailsToolCallsCodeOutputImageObjectImage)} does not support writing '{options.Format}' format."); + } + } + + InternalRunStepDeltaStepDetailsToolCallsCodeOutputImageObjectImage IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalRunStepDeltaStepDetailsToolCallsCodeOutputImageObjectImage(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalRunStepDeltaStepDetailsToolCallsCodeOutputImageObjectImage)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalRunStepDeltaStepDetailsToolCallsCodeOutputImageObjectImage FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalRunStepDeltaStepDetailsToolCallsCodeOutputImageObjectImage(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsToolCallsCodeOutputImageObjectImage.cs b/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsToolCallsCodeOutputImageObjectImage.cs new file mode 100644 index 000000000..cd41fb9fa --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsToolCallsCodeOutputImageObjectImage.cs @@ -0,0 +1,25 @@ +// + +#nullable disable + 
+using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + internal partial class InternalRunStepDeltaStepDetailsToolCallsCodeOutputImageObjectImage + { + internal IDictionary SerializedAdditionalRawData { get; set; } + internal InternalRunStepDeltaStepDetailsToolCallsCodeOutputImageObjectImage() + { + } + + internal InternalRunStepDeltaStepDetailsToolCallsCodeOutputImageObjectImage(string fileId, IDictionary serializedAdditionalRawData) + { + FileId = fileId; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + public string FileId { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsToolCallsCodeOutputLogsObject.Serialization.cs b/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsToolCallsCodeOutputLogsObject.Serialization.cs new file mode 100644 index 000000000..0047ba1a9 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsToolCallsCodeOutputLogsObject.Serialization.cs @@ -0,0 +1,155 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class InternalRunStepDeltaStepDetailsToolCallsCodeOutputLogsObject : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalRunStepDeltaStepDetailsToolCallsCodeOutputLogsObject)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("index") != true) + { + writer.WritePropertyName("index"u8); + writer.WriteNumberValue(Index); + } + if (SerializedAdditionalRawData?.ContainsKey("logs") != true && Optional.IsDefined(InternalLogs)) + { + writer.WritePropertyName("logs"u8); + writer.WriteStringValue(InternalLogs); + } + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalRunStepDeltaStepDetailsToolCallsCodeOutputLogsObject IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalRunStepDeltaStepDetailsToolCallsCodeOutputLogsObject)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalRunStepDeltaStepDetailsToolCallsCodeOutputLogsObject(document.RootElement, options); + } + + internal static InternalRunStepDeltaStepDetailsToolCallsCodeOutputLogsObject DeserializeInternalRunStepDeltaStepDetailsToolCallsCodeOutputLogsObject(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + int index = default; + string logs = default; + string type = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("index"u8)) + { + index = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("logs"u8)) + { + logs = property.Value.GetString(); + continue; + } + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalRunStepDeltaStepDetailsToolCallsCodeOutputLogsObject(type, serializedAdditionalRawData, index, logs); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalRunStepDeltaStepDetailsToolCallsCodeOutputLogsObject)} does not support writing '{options.Format}' format."); + } + } + + InternalRunStepDeltaStepDetailsToolCallsCodeOutputLogsObject IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalRunStepDeltaStepDetailsToolCallsCodeOutputLogsObject(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalRunStepDeltaStepDetailsToolCallsCodeOutputLogsObject)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static new InternalRunStepDeltaStepDetailsToolCallsCodeOutputLogsObject FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalRunStepDeltaStepDetailsToolCallsCodeOutputLogsObject(document.RootElement); + } + + internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsToolCallsCodeOutputLogsObject.cs b/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsToolCallsCodeOutputLogsObject.cs new file mode 100644 index 000000000..aa0e9501d --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsToolCallsCodeOutputLogsObject.cs @@ -0,0 +1,30 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + 
+namespace OpenAI.Assistants +{ + internal partial class InternalRunStepDeltaStepDetailsToolCallsCodeOutputLogsObject : RunStepUpdateCodeInterpreterOutput + { + internal InternalRunStepDeltaStepDetailsToolCallsCodeOutputLogsObject(int index) + { + Type = "logs"; + Index = index; + } + + internal InternalRunStepDeltaStepDetailsToolCallsCodeOutputLogsObject(string type, IDictionary serializedAdditionalRawData, int index, string internalLogs) : base(type, serializedAdditionalRawData) + { + Index = index; + InternalLogs = internalLogs; + } + + internal InternalRunStepDeltaStepDetailsToolCallsCodeOutputLogsObject() + { + } + + public int Index { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsToolCallsFileSearchObject.Serialization.cs b/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsToolCallsFileSearchObject.Serialization.cs new file mode 100644 index 000000000..e9dd186c0 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsToolCallsFileSearchObject.Serialization.cs @@ -0,0 +1,177 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class InternalRunStepDeltaStepDetailsToolCallsFileSearchObject : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalRunStepDeltaStepDetailsToolCallsFileSearchObject)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("index") != true) + { + writer.WritePropertyName("index"u8); + writer.WriteNumberValue(Index); + } + if (SerializedAdditionalRawData?.ContainsKey("id") != true && Optional.IsDefined(Id)) + { + writer.WritePropertyName("id"u8); + writer.WriteStringValue(Id); + } + if (SerializedAdditionalRawData?.ContainsKey("file_search") != true) + { + writer.WritePropertyName("file_search"u8); + writer.WriteStartObject(); + foreach (var item in FileSearch) + { + writer.WritePropertyName(item.Key); + writer.WriteStringValue(item.Value); + } + writer.WriteEndObject(); + } + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalRunStepDeltaStepDetailsToolCallsFileSearchObject IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalRunStepDeltaStepDetailsToolCallsFileSearchObject)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalRunStepDeltaStepDetailsToolCallsFileSearchObject(document.RootElement, options); + } + + internal static InternalRunStepDeltaStepDetailsToolCallsFileSearchObject DeserializeInternalRunStepDeltaStepDetailsToolCallsFileSearchObject(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + int index = default; + string id = default; + IReadOnlyDictionary fileSearch = default; + string type = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("index"u8)) + { + index = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("id"u8)) + { + id = property.Value.GetString(); + continue; + } + if (property.NameEquals("file_search"u8)) + { + Dictionary dictionary = new Dictionary(); + foreach (var property0 in property.Value.EnumerateObject()) + { + dictionary.Add(property0.Name, property0.Value.GetString()); + } + fileSearch = dictionary; + continue; + } + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalRunStepDeltaStepDetailsToolCallsFileSearchObject(type, serializedAdditionalRawData, index, id, fileSearch); + } + + BinaryData 
IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalRunStepDeltaStepDetailsToolCallsFileSearchObject)} does not support writing '{options.Format}' format."); + } + } + + InternalRunStepDeltaStepDetailsToolCallsFileSearchObject IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalRunStepDeltaStepDetailsToolCallsFileSearchObject(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalRunStepDeltaStepDetailsToolCallsFileSearchObject)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static new InternalRunStepDeltaStepDetailsToolCallsFileSearchObject FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalRunStepDeltaStepDetailsToolCallsFileSearchObject(document.RootElement); + } + + internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsToolCallsFileSearchObject.cs b/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsToolCallsFileSearchObject.cs new file mode 100644 index 000000000..6fcfc013c --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsToolCallsFileSearchObject.cs @@ -0,0 +1,36 @@ +// + 
+#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + internal partial class InternalRunStepDeltaStepDetailsToolCallsFileSearchObject : InternalRunStepDeltaStepDetailsToolCallsObjectToolCallsObject + { + internal InternalRunStepDeltaStepDetailsToolCallsFileSearchObject(int index, IReadOnlyDictionary fileSearch) + { + Argument.AssertNotNull(fileSearch, nameof(fileSearch)); + + Type = "file_search"; + Index = index; + FileSearch = fileSearch; + } + + internal InternalRunStepDeltaStepDetailsToolCallsFileSearchObject(string type, IDictionary serializedAdditionalRawData, int index, string id, IReadOnlyDictionary fileSearch) : base(type, serializedAdditionalRawData) + { + Index = index; + Id = id; + FileSearch = fileSearch; + } + + internal InternalRunStepDeltaStepDetailsToolCallsFileSearchObject() + { + } + + public int Index { get; } + public string Id { get; } + public IReadOnlyDictionary FileSearch { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsToolCallsFunctionObject.Serialization.cs b/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsToolCallsFunctionObject.Serialization.cs new file mode 100644 index 000000000..66776a57d --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsToolCallsFunctionObject.Serialization.cs @@ -0,0 +1,170 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class InternalRunStepDeltaStepDetailsToolCallsFunctionObject : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalRunStepDeltaStepDetailsToolCallsFunctionObject)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("index") != true) + { + writer.WritePropertyName("index"u8); + writer.WriteNumberValue(Index); + } + if (SerializedAdditionalRawData?.ContainsKey("id") != true && Optional.IsDefined(Id)) + { + writer.WritePropertyName("id"u8); + writer.WriteStringValue(Id); + } + if (SerializedAdditionalRawData?.ContainsKey("function") != true && Optional.IsDefined(Function)) + { + writer.WritePropertyName("function"u8); + writer.WriteObjectValue(Function, options); + } + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalRunStepDeltaStepDetailsToolCallsFunctionObject IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalRunStepDeltaStepDetailsToolCallsFunctionObject)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalRunStepDeltaStepDetailsToolCallsFunctionObject(document.RootElement, options); + } + + internal static InternalRunStepDeltaStepDetailsToolCallsFunctionObject DeserializeInternalRunStepDeltaStepDetailsToolCallsFunctionObject(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + int index = default; + string id = default; + InternalRunStepDeltaStepDetailsToolCallsFunctionObjectFunction function = default; + string type = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("index"u8)) + { + index = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("id"u8)) + { + id = property.Value.GetString(); + continue; + } + if (property.NameEquals("function"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + function = InternalRunStepDeltaStepDetailsToolCallsFunctionObjectFunction.DeserializeInternalRunStepDeltaStepDetailsToolCallsFunctionObjectFunction(property.Value, options); + continue; + } + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalRunStepDeltaStepDetailsToolCallsFunctionObject(type, serializedAdditionalRawData, index, 
id, function); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalRunStepDeltaStepDetailsToolCallsFunctionObject)} does not support writing '{options.Format}' format."); + } + } + + InternalRunStepDeltaStepDetailsToolCallsFunctionObject IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalRunStepDeltaStepDetailsToolCallsFunctionObject(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalRunStepDeltaStepDetailsToolCallsFunctionObject)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static new InternalRunStepDeltaStepDetailsToolCallsFunctionObject FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalRunStepDeltaStepDetailsToolCallsFunctionObject(document.RootElement); + } + + internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsToolCallsFunctionObject.cs b/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsToolCallsFunctionObject.cs new file mode 100644 index 000000000..e45ecae14 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsToolCallsFunctionObject.cs @@ -0,0 +1,33 @@ 
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.Collections.Generic;
+
+namespace OpenAI.Assistants
+{
+    internal partial class InternalRunStepDeltaStepDetailsToolCallsFunctionObject : InternalRunStepDeltaStepDetailsToolCallsObjectToolCallsObject
+    {
+        internal InternalRunStepDeltaStepDetailsToolCallsFunctionObject(int index)
+        {
+            Type = "function";
+            Index = index;
+        }
+
+        internal InternalRunStepDeltaStepDetailsToolCallsFunctionObject(string type, IDictionary<string, BinaryData> serializedAdditionalRawData, int index, string id, InternalRunStepDeltaStepDetailsToolCallsFunctionObjectFunction function) : base(type, serializedAdditionalRawData)
+        {
+            Index = index;
+            Id = id;
+            Function = function;
+        }
+
+        internal InternalRunStepDeltaStepDetailsToolCallsFunctionObject()
+        {
+        }
+
+        public int Index { get; }
+        public string Id { get; }
+        public InternalRunStepDeltaStepDetailsToolCallsFunctionObjectFunction Function { get; }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsToolCallsFunctionObjectFunction.Serialization.cs b/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsToolCallsFunctionObjectFunction.Serialization.cs
new file mode 100644
index 000000000..f96531c16
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsToolCallsFunctionObjectFunction.Serialization.cs
@@ -0,0 +1,167 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace OpenAI.Assistants
+{
+    internal partial class InternalRunStepDeltaStepDetailsToolCallsFunctionObjectFunction : IJsonModel<InternalRunStepDeltaStepDetailsToolCallsFunctionObjectFunction>
+    {
+        void IJsonModel<InternalRunStepDeltaStepDetailsToolCallsFunctionObjectFunction>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ?
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalRunStepDeltaStepDetailsToolCallsFunctionObjectFunction)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("name") != true && Optional.IsDefined(Name)) + { + writer.WritePropertyName("name"u8); + writer.WriteStringValue(Name); + } + if (SerializedAdditionalRawData?.ContainsKey("arguments") != true && Optional.IsDefined(Arguments)) + { + writer.WritePropertyName("arguments"u8); + writer.WriteStringValue(Arguments); + } + if (SerializedAdditionalRawData?.ContainsKey("output") != true && Optional.IsDefined(Output)) + { + if (Output != null) + { + writer.WritePropertyName("output"u8); + writer.WriteStringValue(Output); + } + else + { + writer.WriteNull("output"); + } + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalRunStepDeltaStepDetailsToolCallsFunctionObjectFunction IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalRunStepDeltaStepDetailsToolCallsFunctionObjectFunction)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalRunStepDeltaStepDetailsToolCallsFunctionObjectFunction(document.RootElement, options); + } + + internal static InternalRunStepDeltaStepDetailsToolCallsFunctionObjectFunction DeserializeInternalRunStepDeltaStepDetailsToolCallsFunctionObjectFunction(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string name = default; + string arguments = default; + string output = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("name"u8)) + { + name = property.Value.GetString(); + continue; + } + if (property.NameEquals("arguments"u8)) + { + arguments = property.Value.GetString(); + continue; + } + if (property.NameEquals("output"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + output = null; + continue; + } + output = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalRunStepDeltaStepDetailsToolCallsFunctionObjectFunction(name, arguments, output, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalRunStepDeltaStepDetailsToolCallsFunctionObjectFunction)} does not support writing '{options.Format}' format."); + } + } + + InternalRunStepDeltaStepDetailsToolCallsFunctionObjectFunction IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalRunStepDeltaStepDetailsToolCallsFunctionObjectFunction(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalRunStepDeltaStepDetailsToolCallsFunctionObjectFunction)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalRunStepDeltaStepDetailsToolCallsFunctionObjectFunction FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalRunStepDeltaStepDetailsToolCallsFunctionObjectFunction(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsToolCallsFunctionObjectFunction.cs b/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsToolCallsFunctionObjectFunction.cs new file mode 100644 index 000000000..e546c4f89 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsToolCallsFunctionObjectFunction.cs @@ -0,0 +1,29 @@ +// + +#nullable disable + +using System; +using 
System.Collections.Generic;
+
+namespace OpenAI.Assistants
+{
+    internal partial class InternalRunStepDeltaStepDetailsToolCallsFunctionObjectFunction
+    {
+        internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; }
+        internal InternalRunStepDeltaStepDetailsToolCallsFunctionObjectFunction()
+        {
+        }
+
+        internal InternalRunStepDeltaStepDetailsToolCallsFunctionObjectFunction(string name, string arguments, string output, IDictionary<string, BinaryData> serializedAdditionalRawData)
+        {
+            Name = name;
+            Arguments = arguments;
+            Output = output;
+            SerializedAdditionalRawData = serializedAdditionalRawData;
+        }
+
+        public string Name { get; }
+        public string Arguments { get; }
+        public string Output { get; }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsToolCallsObject.Serialization.cs b/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsToolCallsObject.Serialization.cs
new file mode 100644
index 000000000..b1ba9cfad
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsToolCallsObject.Serialization.cs
@@ -0,0 +1,158 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace OpenAI.Assistants
+{
+    internal partial class InternalRunStepDeltaStepDetailsToolCallsObject : IJsonModel<InternalRunStepDeltaStepDetailsToolCallsObject>
+    {
+        void IJsonModel<InternalRunStepDeltaStepDetailsToolCallsObject>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ?
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalRunStepDeltaStepDetailsToolCallsObject)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("tool_calls") != true && Optional.IsCollectionDefined(ToolCalls)) + { + writer.WritePropertyName("tool_calls"u8); + writer.WriteStartArray(); + foreach (var item in ToolCalls) + { + writer.WriteObjectValue(item, options); + } + writer.WriteEndArray(); + } + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalRunStepDeltaStepDetailsToolCallsObject IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalRunStepDeltaStepDetailsToolCallsObject)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalRunStepDeltaStepDetailsToolCallsObject(document.RootElement, options); + } + + internal static InternalRunStepDeltaStepDetailsToolCallsObject DeserializeInternalRunStepDeltaStepDetailsToolCallsObject(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + IReadOnlyList toolCalls = default; + string type = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("tool_calls"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + List array = new List(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(InternalRunStepDeltaStepDetailsToolCallsObjectToolCallsObject.DeserializeInternalRunStepDeltaStepDetailsToolCallsObjectToolCallsObject(item, options)); + } + toolCalls = array; + continue; + } + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalRunStepDeltaStepDetailsToolCallsObject(type, serializedAdditionalRawData, toolCalls ?? new ChangeTrackingList()); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalRunStepDeltaStepDetailsToolCallsObject)} does not support writing '{options.Format}' format."); + } + } + + InternalRunStepDeltaStepDetailsToolCallsObject IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalRunStepDeltaStepDetailsToolCallsObject(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalRunStepDeltaStepDetailsToolCallsObject)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static new InternalRunStepDeltaStepDetailsToolCallsObject FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalRunStepDeltaStepDetailsToolCallsObject(document.RootElement); + } + + internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsToolCallsObject.cs b/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsToolCallsObject.cs new file mode 100644 index 000000000..bae1ef100 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsToolCallsObject.cs @@ -0,0 +1,25 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + internal partial class InternalRunStepDeltaStepDetailsToolCallsObject : 
InternalRunStepDeltaStepDetails
+    {
+        internal InternalRunStepDeltaStepDetailsToolCallsObject()
+        {
+            Type = "tool_calls";
+            ToolCalls = new ChangeTrackingList<InternalRunStepDeltaStepDetailsToolCallsObjectToolCallsObject>();
+        }
+
+        internal InternalRunStepDeltaStepDetailsToolCallsObject(string type, IDictionary<string, BinaryData> serializedAdditionalRawData, IReadOnlyList<InternalRunStepDeltaStepDetailsToolCallsObjectToolCallsObject> toolCalls) : base(type, serializedAdditionalRawData)
+        {
+            ToolCalls = toolCalls;
+        }
+
+        public IReadOnlyList<InternalRunStepDeltaStepDetailsToolCallsObjectToolCallsObject> ToolCalls { get; }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsToolCallsObjectToolCallsObject.Serialization.cs b/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsToolCallsObjectToolCallsObject.Serialization.cs
new file mode 100644
index 000000000..f9aa10e1a
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsToolCallsObjectToolCallsObject.Serialization.cs
@@ -0,0 +1,125 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Text.Json;
+
+namespace OpenAI.Assistants
+{
+    [PersistableModelProxy(typeof(UnknownRunStepDeltaStepDetailsToolCallsObjectToolCallsObject))]
+    internal partial class InternalRunStepDeltaStepDetailsToolCallsObjectToolCallsObject : IJsonModel<InternalRunStepDeltaStepDetailsToolCallsObjectToolCallsObject>
+    {
+        void IJsonModel<InternalRunStepDeltaStepDetailsToolCallsObjectToolCallsObject>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ?
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalRunStepDeltaStepDetailsToolCallsObjectToolCallsObject)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalRunStepDeltaStepDetailsToolCallsObjectToolCallsObject IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalRunStepDeltaStepDetailsToolCallsObjectToolCallsObject)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalRunStepDeltaStepDetailsToolCallsObjectToolCallsObject(document.RootElement, options); + } + + internal static InternalRunStepDeltaStepDetailsToolCallsObjectToolCallsObject DeserializeInternalRunStepDeltaStepDetailsToolCallsObjectToolCallsObject(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + if (element.TryGetProperty("type", out JsonElement discriminator)) + { + switch (discriminator.GetString()) + { + case "code_interpreter": return InternalRunStepDeltaStepDetailsToolCallsCodeObject.DeserializeInternalRunStepDeltaStepDetailsToolCallsCodeObject(element, options); + case "file_search": return InternalRunStepDeltaStepDetailsToolCallsFileSearchObject.DeserializeInternalRunStepDeltaStepDetailsToolCallsFileSearchObject(element, options); + case "function": return InternalRunStepDeltaStepDetailsToolCallsFunctionObject.DeserializeInternalRunStepDeltaStepDetailsToolCallsFunctionObject(element, options); + } + } + return UnknownRunStepDeltaStepDetailsToolCallsObjectToolCallsObject.DeserializeUnknownRunStepDeltaStepDetailsToolCallsObjectToolCallsObject(element, options); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalRunStepDeltaStepDetailsToolCallsObjectToolCallsObject)} does not support writing '{options.Format}' format."); + } + } + + InternalRunStepDeltaStepDetailsToolCallsObjectToolCallsObject IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalRunStepDeltaStepDetailsToolCallsObjectToolCallsObject(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalRunStepDeltaStepDetailsToolCallsObjectToolCallsObject)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalRunStepDeltaStepDetailsToolCallsObjectToolCallsObject FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalRunStepDeltaStepDetailsToolCallsObjectToolCallsObject(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsToolCallsObjectToolCallsObject.cs b/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsToolCallsObjectToolCallsObject.cs new file mode 100644 index 000000000..55b7f0b45 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalRunStepDeltaStepDetailsToolCallsObjectToolCallsObject.cs @@ -0,0 +1,25 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; 
+
+namespace OpenAI.Assistants
+{
+    internal abstract partial class InternalRunStepDeltaStepDetailsToolCallsObjectToolCallsObject
+    {
+        internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; }
+        protected InternalRunStepDeltaStepDetailsToolCallsObjectToolCallsObject()
+        {
+        }
+
+        internal InternalRunStepDeltaStepDetailsToolCallsObjectToolCallsObject(string type, IDictionary<string, BinaryData> serializedAdditionalRawData)
+        {
+            Type = type;
+            SerializedAdditionalRawData = serializedAdditionalRawData;
+        }
+
+        internal string Type { get; set; }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalRunStepDetailsMessageCreationObject.Serialization.cs b/.dotnet/src/Generated/Models/InternalRunStepDetailsMessageCreationObject.Serialization.cs
new file mode 100644
index 000000000..8f037505f
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalRunStepDetailsMessageCreationObject.Serialization.cs
@@ -0,0 +1,144 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace OpenAI.Assistants
+{
+    internal partial class InternalRunStepDetailsMessageCreationObject : IJsonModel<InternalRunStepDetailsMessageCreationObject>
+    {
+        void IJsonModel<InternalRunStepDetailsMessageCreationObject>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ?
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalRunStepDetailsMessageCreationObject)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("message_creation") != true) + { + writer.WritePropertyName("message_creation"u8); + writer.WriteObjectValue(_messageCreation, options); + } + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalRunStepDetailsMessageCreationObject IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalRunStepDetailsMessageCreationObject)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalRunStepDetailsMessageCreationObject(document.RootElement, options); + } + + internal static InternalRunStepDetailsMessageCreationObject DeserializeInternalRunStepDetailsMessageCreationObject(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + InternalRunStepDetailsMessageCreationObjectMessageCreation messageCreation = default; + string type = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("message_creation"u8)) + { + messageCreation = InternalRunStepDetailsMessageCreationObjectMessageCreation.DeserializeInternalRunStepDetailsMessageCreationObjectMessageCreation(property.Value, options); + continue; + } + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalRunStepDetailsMessageCreationObject(type, serializedAdditionalRawData, messageCreation); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalRunStepDetailsMessageCreationObject)} does not support writing '{options.Format}' format."); + } + } + + InternalRunStepDetailsMessageCreationObject IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalRunStepDetailsMessageCreationObject(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalRunStepDetailsMessageCreationObject)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static new InternalRunStepDetailsMessageCreationObject FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalRunStepDetailsMessageCreationObject(document.RootElement); + } + + internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalRunStepDetailsMessageCreationObject.cs b/.dotnet/src/Generated/Models/InternalRunStepDetailsMessageCreationObject.cs new file mode 100644 index 000000000..e36008494 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalRunStepDetailsMessageCreationObject.cs @@ -0,0 +1,29 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + internal partial class InternalRunStepDetailsMessageCreationObject : RunStepDetails + { + internal 
InternalRunStepDetailsMessageCreationObject(InternalRunStepDetailsMessageCreationObjectMessageCreation messageCreation)
+        {
+            Argument.AssertNotNull(messageCreation, nameof(messageCreation));
+
+            Type = "message_creation";
+            _messageCreation = messageCreation;
+        }
+
+        internal InternalRunStepDetailsMessageCreationObject(string type, IDictionary<string, BinaryData> serializedAdditionalRawData, InternalRunStepDetailsMessageCreationObjectMessageCreation messageCreation) : base(type, serializedAdditionalRawData)
+        {
+            _messageCreation = messageCreation;
+        }
+
+        internal InternalRunStepDetailsMessageCreationObject()
+        {
+        }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/InternalRunStepDetailsMessageCreationObjectMessageCreation.Serialization.cs b/.dotnet/src/Generated/Models/InternalRunStepDetailsMessageCreationObjectMessageCreation.Serialization.cs
new file mode 100644
index 000000000..4f0ef2047
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalRunStepDetailsMessageCreationObjectMessageCreation.Serialization.cs
@@ -0,0 +1,133 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace OpenAI.Assistants
+{
+    internal partial class InternalRunStepDetailsMessageCreationObjectMessageCreation : IJsonModel<InternalRunStepDetailsMessageCreationObjectMessageCreation>
+    {
+        void IJsonModel<InternalRunStepDetailsMessageCreationObjectMessageCreation>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ?
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalRunStepDetailsMessageCreationObjectMessageCreation)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("message_id") != true) + { + writer.WritePropertyName("message_id"u8); + writer.WriteStringValue(MessageId); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalRunStepDetailsMessageCreationObjectMessageCreation IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalRunStepDetailsMessageCreationObjectMessageCreation)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalRunStepDetailsMessageCreationObjectMessageCreation(document.RootElement, options); + } + + internal static InternalRunStepDetailsMessageCreationObjectMessageCreation DeserializeInternalRunStepDetailsMessageCreationObjectMessageCreation(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string messageId = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("message_id"u8)) + { + messageId = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalRunStepDetailsMessageCreationObjectMessageCreation(messageId, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalRunStepDetailsMessageCreationObjectMessageCreation)} does not support writing '{options.Format}' format."); + } + } + + InternalRunStepDetailsMessageCreationObjectMessageCreation IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalRunStepDetailsMessageCreationObjectMessageCreation(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalRunStepDetailsMessageCreationObjectMessageCreation)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalRunStepDetailsMessageCreationObjectMessageCreation FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalRunStepDetailsMessageCreationObjectMessageCreation(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalRunStepDetailsMessageCreationObjectMessageCreation.cs b/.dotnet/src/Generated/Models/InternalRunStepDetailsMessageCreationObjectMessageCreation.cs new file mode 100644 index 000000000..b876ff8c4 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalRunStepDetailsMessageCreationObjectMessageCreation.cs @@ -0,0 +1,32 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace 
OpenAI.Assistants +{ + internal partial class InternalRunStepDetailsMessageCreationObjectMessageCreation + { + internal IDictionary SerializedAdditionalRawData { get; set; } + internal InternalRunStepDetailsMessageCreationObjectMessageCreation(string messageId) + { + Argument.AssertNotNull(messageId, nameof(messageId)); + + MessageId = messageId; + } + + internal InternalRunStepDetailsMessageCreationObjectMessageCreation(string messageId, IDictionary serializedAdditionalRawData) + { + MessageId = messageId; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal InternalRunStepDetailsMessageCreationObjectMessageCreation() + { + } + + public string MessageId { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalRunStepDetailsToolCallsCodeObjectCodeInterpreter.Serialization.cs b/.dotnet/src/Generated/Models/InternalRunStepDetailsToolCallsCodeObjectCodeInterpreter.Serialization.cs new file mode 100644 index 000000000..d1550479d --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalRunStepDetailsToolCallsCodeObjectCodeInterpreter.Serialization.cs @@ -0,0 +1,154 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class InternalRunStepDetailsToolCallsCodeObjectCodeInterpreter : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalRunStepDetailsToolCallsCodeObjectCodeInterpreter)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("input") != true) + { + writer.WritePropertyName("input"u8); + writer.WriteStringValue(Input); + } + if (SerializedAdditionalRawData?.ContainsKey("outputs") != true) + { + writer.WritePropertyName("outputs"u8); + writer.WriteStartArray(); + foreach (var item in Outputs) + { + writer.WriteObjectValue(item, options); + } + writer.WriteEndArray(); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalRunStepDetailsToolCallsCodeObjectCodeInterpreter IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalRunStepDetailsToolCallsCodeObjectCodeInterpreter)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalRunStepDetailsToolCallsCodeObjectCodeInterpreter(document.RootElement, options); + } + + internal static InternalRunStepDetailsToolCallsCodeObjectCodeInterpreter DeserializeInternalRunStepDetailsToolCallsCodeObjectCodeInterpreter(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string input = default; + IReadOnlyList outputs = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("input"u8)) + { + input = property.Value.GetString(); + continue; + } + if (property.NameEquals("outputs"u8)) + { + List array = new List(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(RunStepCodeInterpreterOutput.DeserializeRunStepCodeInterpreterOutput(item, options)); + } + outputs = array; + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalRunStepDetailsToolCallsCodeObjectCodeInterpreter(input, outputs, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalRunStepDetailsToolCallsCodeObjectCodeInterpreter)} does not support writing '{options.Format}' format."); + } + } + + InternalRunStepDetailsToolCallsCodeObjectCodeInterpreter IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalRunStepDetailsToolCallsCodeObjectCodeInterpreter(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalRunStepDetailsToolCallsCodeObjectCodeInterpreter)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalRunStepDetailsToolCallsCodeObjectCodeInterpreter FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalRunStepDetailsToolCallsCodeObjectCodeInterpreter(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalRunStepDetailsToolCallsCodeObjectCodeInterpreter.cs b/.dotnet/src/Generated/Models/InternalRunStepDetailsToolCallsCodeObjectCodeInterpreter.cs new file mode 100644 index 000000000..2c309b663 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalRunStepDetailsToolCallsCodeObjectCodeInterpreter.cs @@ -0,0 +1,37 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; +using System.Linq; + +namespace 
OpenAI.Assistants +{ + internal partial class InternalRunStepDetailsToolCallsCodeObjectCodeInterpreter + { + internal IDictionary SerializedAdditionalRawData { get; set; } + internal InternalRunStepDetailsToolCallsCodeObjectCodeInterpreter(string input, IEnumerable outputs) + { + Argument.AssertNotNull(input, nameof(input)); + Argument.AssertNotNull(outputs, nameof(outputs)); + + Input = input; + Outputs = outputs.ToList(); + } + + internal InternalRunStepDetailsToolCallsCodeObjectCodeInterpreter(string input, IReadOnlyList outputs, IDictionary serializedAdditionalRawData) + { + Input = input; + Outputs = outputs; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal InternalRunStepDetailsToolCallsCodeObjectCodeInterpreter() + { + } + + public string Input { get; } + public IReadOnlyList Outputs { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalRunStepDetailsToolCallsCodeOutputImageObject.Serialization.cs b/.dotnet/src/Generated/Models/InternalRunStepDetailsToolCallsCodeOutputImageObject.Serialization.cs new file mode 100644 index 000000000..e0de2eb55 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalRunStepDetailsToolCallsCodeOutputImageObject.Serialization.cs @@ -0,0 +1,144 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class InternalRunStepDetailsToolCallsCodeOutputImageObject : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalRunStepDetailsToolCallsCodeOutputImageObject)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("image") != true) + { + writer.WritePropertyName("image"u8); + writer.WriteObjectValue(_image, options); + } + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalRunStepDetailsToolCallsCodeOutputImageObject IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalRunStepDetailsToolCallsCodeOutputImageObject)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalRunStepDetailsToolCallsCodeOutputImageObject(document.RootElement, options); + } + + internal static InternalRunStepDetailsToolCallsCodeOutputImageObject DeserializeInternalRunStepDetailsToolCallsCodeOutputImageObject(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + InternalRunStepDetailsToolCallsCodeOutputImageObjectImage image = default; + string type = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("image"u8)) + { + image = InternalRunStepDetailsToolCallsCodeOutputImageObjectImage.DeserializeInternalRunStepDetailsToolCallsCodeOutputImageObjectImage(property.Value, options); + continue; + } + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalRunStepDetailsToolCallsCodeOutputImageObject(type, serializedAdditionalRawData, image); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalRunStepDetailsToolCallsCodeOutputImageObject)} does not support writing '{options.Format}' format."); + } + } + + InternalRunStepDetailsToolCallsCodeOutputImageObject IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalRunStepDetailsToolCallsCodeOutputImageObject(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalRunStepDetailsToolCallsCodeOutputImageObject)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static new InternalRunStepDetailsToolCallsCodeOutputImageObject FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalRunStepDetailsToolCallsCodeOutputImageObject(document.RootElement); + } + + internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalRunStepDetailsToolCallsCodeOutputImageObject.cs b/.dotnet/src/Generated/Models/InternalRunStepDetailsToolCallsCodeOutputImageObject.cs new file mode 100644 index 000000000..2d12da18d --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalRunStepDetailsToolCallsCodeOutputImageObject.cs @@ -0,0 +1,29 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + internal partial class 
InternalRunStepDetailsToolCallsCodeOutputImageObject : RunStepCodeInterpreterOutput + { + internal InternalRunStepDetailsToolCallsCodeOutputImageObject(InternalRunStepDetailsToolCallsCodeOutputImageObjectImage image) + { + Argument.AssertNotNull(image, nameof(image)); + + Type = "image"; + _image = image; + } + + internal InternalRunStepDetailsToolCallsCodeOutputImageObject(string type, IDictionary serializedAdditionalRawData, InternalRunStepDetailsToolCallsCodeOutputImageObjectImage image) : base(type, serializedAdditionalRawData) + { + _image = image; + } + + internal InternalRunStepDetailsToolCallsCodeOutputImageObject() + { + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalRunStepDetailsToolCallsCodeOutputImageObjectImage.Serialization.cs b/.dotnet/src/Generated/Models/InternalRunStepDetailsToolCallsCodeOutputImageObjectImage.Serialization.cs new file mode 100644 index 000000000..6e6ffba6a --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalRunStepDetailsToolCallsCodeOutputImageObjectImage.Serialization.cs @@ -0,0 +1,133 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class InternalRunStepDetailsToolCallsCodeOutputImageObjectImage : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalRunStepDetailsToolCallsCodeOutputImageObjectImage)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("file_id") != true) + { + writer.WritePropertyName("file_id"u8); + writer.WriteStringValue(FileId); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalRunStepDetailsToolCallsCodeOutputImageObjectImage IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalRunStepDetailsToolCallsCodeOutputImageObjectImage)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalRunStepDetailsToolCallsCodeOutputImageObjectImage(document.RootElement, options); + } + + internal static InternalRunStepDetailsToolCallsCodeOutputImageObjectImage DeserializeInternalRunStepDetailsToolCallsCodeOutputImageObjectImage(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string fileId = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("file_id"u8)) + { + fileId = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalRunStepDetailsToolCallsCodeOutputImageObjectImage(fileId, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalRunStepDetailsToolCallsCodeOutputImageObjectImage)} does not support writing '{options.Format}' format."); + } + } + + InternalRunStepDetailsToolCallsCodeOutputImageObjectImage IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalRunStepDetailsToolCallsCodeOutputImageObjectImage(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalRunStepDetailsToolCallsCodeOutputImageObjectImage)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalRunStepDetailsToolCallsCodeOutputImageObjectImage FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalRunStepDetailsToolCallsCodeOutputImageObjectImage(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalRunStepDetailsToolCallsCodeOutputImageObjectImage.cs b/.dotnet/src/Generated/Models/InternalRunStepDetailsToolCallsCodeOutputImageObjectImage.cs new file mode 100644 index 000000000..17afb0073 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalRunStepDetailsToolCallsCodeOutputImageObjectImage.cs @@ -0,0 +1,32 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + 
internal partial class InternalRunStepDetailsToolCallsCodeOutputImageObjectImage + { + internal IDictionary SerializedAdditionalRawData { get; set; } + internal InternalRunStepDetailsToolCallsCodeOutputImageObjectImage(string fileId) + { + Argument.AssertNotNull(fileId, nameof(fileId)); + + FileId = fileId; + } + + internal InternalRunStepDetailsToolCallsCodeOutputImageObjectImage(string fileId, IDictionary serializedAdditionalRawData) + { + FileId = fileId; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal InternalRunStepDetailsToolCallsCodeOutputImageObjectImage() + { + } + + public string FileId { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalRunStepDetailsToolCallsFunctionObjectFunction.Serialization.cs b/.dotnet/src/Generated/Models/InternalRunStepDetailsToolCallsFunctionObjectFunction.Serialization.cs new file mode 100644 index 000000000..d72f701a2 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalRunStepDetailsToolCallsFunctionObjectFunction.Serialization.cs @@ -0,0 +1,167 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class InternalRunStepDetailsToolCallsFunctionObjectFunction : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalRunStepDetailsToolCallsFunctionObjectFunction)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("name") != true) + { + writer.WritePropertyName("name"u8); + writer.WriteStringValue(Name); + } + if (SerializedAdditionalRawData?.ContainsKey("arguments") != true) + { + writer.WritePropertyName("arguments"u8); + writer.WriteStringValue(Arguments); + } + if (SerializedAdditionalRawData?.ContainsKey("output") != true) + { + if (Output != null) + { + writer.WritePropertyName("output"u8); + writer.WriteStringValue(Output); + } + else + { + writer.WriteNull("output"); + } + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalRunStepDetailsToolCallsFunctionObjectFunction IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalRunStepDetailsToolCallsFunctionObjectFunction)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalRunStepDetailsToolCallsFunctionObjectFunction(document.RootElement, options); + } + + internal static InternalRunStepDetailsToolCallsFunctionObjectFunction DeserializeInternalRunStepDetailsToolCallsFunctionObjectFunction(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string name = default; + string arguments = default; + string output = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("name"u8)) + { + name = property.Value.GetString(); + continue; + } + if (property.NameEquals("arguments"u8)) + { + arguments = property.Value.GetString(); + continue; + } + if (property.NameEquals("output"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + output = null; + continue; + } + output = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalRunStepDetailsToolCallsFunctionObjectFunction(name, arguments, output, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalRunStepDetailsToolCallsFunctionObjectFunction)} does not support writing '{options.Format}' format."); + } + } + + InternalRunStepDetailsToolCallsFunctionObjectFunction IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalRunStepDetailsToolCallsFunctionObjectFunction(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalRunStepDetailsToolCallsFunctionObjectFunction)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalRunStepDetailsToolCallsFunctionObjectFunction FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalRunStepDetailsToolCallsFunctionObjectFunction(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalRunStepDetailsToolCallsFunctionObjectFunction.cs b/.dotnet/src/Generated/Models/InternalRunStepDetailsToolCallsFunctionObjectFunction.cs new file mode 100644 index 000000000..588377cde --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalRunStepDetailsToolCallsFunctionObjectFunction.cs @@ -0,0 +1,39 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + internal partial class 
InternalRunStepDetailsToolCallsFunctionObjectFunction + { + internal IDictionary SerializedAdditionalRawData { get; set; } + internal InternalRunStepDetailsToolCallsFunctionObjectFunction(string name, string arguments, string output) + { + Argument.AssertNotNull(name, nameof(name)); + Argument.AssertNotNull(arguments, nameof(arguments)); + + Name = name; + Arguments = arguments; + Output = output; + } + + internal InternalRunStepDetailsToolCallsFunctionObjectFunction(string name, string arguments, string output, IDictionary serializedAdditionalRawData) + { + Name = name; + Arguments = arguments; + Output = output; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal InternalRunStepDetailsToolCallsFunctionObjectFunction() + { + } + + public string Name { get; } + public string Arguments { get; } + public string Output { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalRunStepDetailsToolCallsObject.Serialization.cs b/.dotnet/src/Generated/Models/InternalRunStepDetailsToolCallsObject.Serialization.cs new file mode 100644 index 000000000..122837da8 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalRunStepDetailsToolCallsObject.Serialization.cs @@ -0,0 +1,154 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class InternalRunStepDetailsToolCallsObject : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalRunStepDetailsToolCallsObject)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("tool_calls") != true) + { + writer.WritePropertyName("tool_calls"u8); + writer.WriteStartArray(); + foreach (var item in InternalToolCalls) + { + writer.WriteObjectValue(item, options); + } + writer.WriteEndArray(); + } + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalRunStepDetailsToolCallsObject IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalRunStepDetailsToolCallsObject)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalRunStepDetailsToolCallsObject(document.RootElement, options); + } + + internal static InternalRunStepDetailsToolCallsObject DeserializeInternalRunStepDetailsToolCallsObject(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + IReadOnlyList toolCalls = default; + string type = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("tool_calls"u8)) + { + List array = new List(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(RunStepToolCall.DeserializeRunStepToolCall(item, options)); + } + toolCalls = array; + continue; + } + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalRunStepDetailsToolCallsObject(type, serializedAdditionalRawData, toolCalls); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalRunStepDetailsToolCallsObject>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalRunStepDetailsToolCallsObject)} does not support writing '{options.Format}' format."); + } + } + + InternalRunStepDetailsToolCallsObject IPersistableModel<InternalRunStepDetailsToolCallsObject>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalRunStepDetailsToolCallsObject>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalRunStepDetailsToolCallsObject(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalRunStepDetailsToolCallsObject)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalRunStepDetailsToolCallsObject>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static new InternalRunStepDetailsToolCallsObject FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalRunStepDetailsToolCallsObject(document.RootElement); + } + + internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalRunStepDetailsToolCallsObject.cs b/.dotnet/src/Generated/Models/InternalRunStepDetailsToolCallsObject.cs new file mode 100644 index 000000000..5b428be16 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalRunStepDetailsToolCallsObject.cs @@ -0,0 +1,30 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; +using System.Linq; + +namespace OpenAI.Assistants +{ + internal partial class InternalRunStepDetailsToolCallsObject : RunStepDetails + { + internal InternalRunStepDetailsToolCallsObject(IEnumerable<RunStepToolCall> 
internalToolCalls) + { + Argument.AssertNotNull(internalToolCalls, nameof(internalToolCalls)); + + Type = "tool_calls"; + InternalToolCalls = internalToolCalls.ToList(); + } + + internal InternalRunStepDetailsToolCallsObject(string type, IDictionary<string, BinaryData> serializedAdditionalRawData, IReadOnlyList<RunStepToolCall> internalToolCalls) : base(type, serializedAdditionalRawData) + { + InternalToolCalls = internalToolCalls; + } + + internal InternalRunStepDetailsToolCallsObject() + { + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalRunStepFileSearchToolCallDetails.Serialization.cs b/.dotnet/src/Generated/Models/InternalRunStepFileSearchToolCallDetails.Serialization.cs new file mode 100644 index 000000000..e89d929f4 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalRunStepFileSearchToolCallDetails.Serialization.cs @@ -0,0 +1,166 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class InternalRunStepFileSearchToolCallDetails : IJsonModel<InternalRunStepFileSearchToolCallDetails> + { + void IJsonModel<InternalRunStepFileSearchToolCallDetails>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalRunStepFileSearchToolCallDetails)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("id") != true) + { + writer.WritePropertyName("id"u8); + writer.WriteStringValue(Id); + } + if (SerializedAdditionalRawData?.ContainsKey("file_search") != true) + { + writer.WritePropertyName("file_search"u8); + writer.WriteStartObject(); + foreach (var item in FileSearch) + { + writer.WritePropertyName(item.Key); + writer.WriteStringValue(item.Value); + } + writer.WriteEndObject(); + } + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalRunStepFileSearchToolCallDetails IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalRunStepFileSearchToolCallDetails)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalRunStepFileSearchToolCallDetails(document.RootElement, options); + } + + internal static InternalRunStepFileSearchToolCallDetails DeserializeInternalRunStepFileSearchToolCallDetails(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string id = default; + IReadOnlyDictionary fileSearch = default; + string type = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("id"u8)) + { + id = property.Value.GetString(); + continue; + } + if (property.NameEquals("file_search"u8)) + { + Dictionary dictionary = new Dictionary(); + foreach (var property0 in property.Value.EnumerateObject()) + { + dictionary.Add(property0.Name, property0.Value.GetString()); + } + fileSearch = dictionary; + continue; + } + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalRunStepFileSearchToolCallDetails(type, serializedAdditionalRawData, id, fileSearch); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalRunStepFileSearchToolCallDetails>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalRunStepFileSearchToolCallDetails)} does not support writing '{options.Format}' format."); + } + } + + InternalRunStepFileSearchToolCallDetails IPersistableModel<InternalRunStepFileSearchToolCallDetails>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalRunStepFileSearchToolCallDetails>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalRunStepFileSearchToolCallDetails(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalRunStepFileSearchToolCallDetails)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalRunStepFileSearchToolCallDetails>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static new InternalRunStepFileSearchToolCallDetails FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalRunStepFileSearchToolCallDetails(document.RootElement); + } + + internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalRunStepFileSearchToolCallDetails.cs b/.dotnet/src/Generated/Models/InternalRunStepFileSearchToolCallDetails.cs new file mode 100644 index 000000000..6be9fec00 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalRunStepFileSearchToolCallDetails.cs @@ -0,0 +1,35 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + internal partial class InternalRunStepFileSearchToolCallDetails : RunStepToolCall + { + internal InternalRunStepFileSearchToolCallDetails(string 
id, IReadOnlyDictionary<string, string> fileSearch) + { + Argument.AssertNotNull(id, nameof(id)); + Argument.AssertNotNull(fileSearch, nameof(fileSearch)); + + Type = "file_search"; + Id = id; + FileSearch = fileSearch; + } + + internal InternalRunStepFileSearchToolCallDetails(string type, IDictionary<string, BinaryData> serializedAdditionalRawData, string id, IReadOnlyDictionary<string, string> fileSearch) : base(type, serializedAdditionalRawData) + { + Id = id; + FileSearch = fileSearch; + } + + internal InternalRunStepFileSearchToolCallDetails() + { + } + + public string Id { get; } + public IReadOnlyDictionary<string, string> FileSearch { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalRunStepFunctionToolCallDetails.Serialization.cs b/.dotnet/src/Generated/Models/InternalRunStepFunctionToolCallDetails.Serialization.cs new file mode 100644 index 000000000..263ace277 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalRunStepFunctionToolCallDetails.Serialization.cs @@ -0,0 +1,155 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class InternalRunStepFunctionToolCallDetails : IJsonModel<InternalRunStepFunctionToolCallDetails> + { + void IJsonModel<InternalRunStepFunctionToolCallDetails>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalRunStepFunctionToolCallDetails)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("id") != true) + { + writer.WritePropertyName("id"u8); + writer.WriteStringValue(Id); + } + if (SerializedAdditionalRawData?.ContainsKey("function") != true) + { + writer.WritePropertyName("function"u8); + writer.WriteObjectValue(_internalFunction, options); + } + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalRunStepFunctionToolCallDetails IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalRunStepFunctionToolCallDetails)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalRunStepFunctionToolCallDetails(document.RootElement, options); + } + + internal static InternalRunStepFunctionToolCallDetails DeserializeInternalRunStepFunctionToolCallDetails(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string id = default; + InternalRunStepDetailsToolCallsFunctionObjectFunction function = default; + string type = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("id"u8)) + { + id = property.Value.GetString(); + continue; + } + if (property.NameEquals("function"u8)) + { + function = InternalRunStepDetailsToolCallsFunctionObjectFunction.DeserializeInternalRunStepDetailsToolCallsFunctionObjectFunction(property.Value, options); + continue; + } + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalRunStepFunctionToolCallDetails(type, serializedAdditionalRawData, id, function); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalRunStepFunctionToolCallDetails>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalRunStepFunctionToolCallDetails)} does not support writing '{options.Format}' format."); + } + } + + InternalRunStepFunctionToolCallDetails IPersistableModel<InternalRunStepFunctionToolCallDetails>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalRunStepFunctionToolCallDetails>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalRunStepFunctionToolCallDetails(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalRunStepFunctionToolCallDetails)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalRunStepFunctionToolCallDetails>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static new InternalRunStepFunctionToolCallDetails FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalRunStepFunctionToolCallDetails(document.RootElement); + } + + internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalRunStepFunctionToolCallDetails.cs b/.dotnet/src/Generated/Models/InternalRunStepFunctionToolCallDetails.cs new file mode 100644 index 000000000..ad318c7da --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalRunStepFunctionToolCallDetails.cs @@ -0,0 +1,34 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + internal partial class InternalRunStepFunctionToolCallDetails : RunStepToolCall + { + internal InternalRunStepFunctionToolCallDetails(string id, 
InternalRunStepDetailsToolCallsFunctionObjectFunction internalFunction) + { + Argument.AssertNotNull(id, nameof(id)); + Argument.AssertNotNull(internalFunction, nameof(internalFunction)); + + Type = "function"; + Id = id; + _internalFunction = internalFunction; + } + + internal InternalRunStepFunctionToolCallDetails(string type, IDictionary<string, BinaryData> serializedAdditionalRawData, string id, InternalRunStepDetailsToolCallsFunctionObjectFunction internalFunction) : base(type, serializedAdditionalRawData) + { + Id = id; + _internalFunction = internalFunction; + } + + internal InternalRunStepFunctionToolCallDetails() + { + } + + public string Id { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalRunStepObjectObject.cs b/.dotnet/src/Generated/Models/InternalRunStepObjectObject.cs new file mode 100644 index 000000000..c587668ab --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalRunStepObjectObject.cs @@ -0,0 +1,34 @@ +// + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Assistants +{ + internal readonly partial struct InternalRunStepObjectObject : IEquatable<InternalRunStepObjectObject> + { + private readonly string _value; + + public InternalRunStepObjectObject(string value) + { + _value = value ?? 
throw new ArgumentNullException(nameof(value)); + } + + private const string ThreadRunStepValue = "thread.run.step"; + + public static InternalRunStepObjectObject ThreadRunStep { get; } = new InternalRunStepObjectObject(ThreadRunStepValue); + public static bool operator ==(InternalRunStepObjectObject left, InternalRunStepObjectObject right) => left.Equals(right); + public static bool operator !=(InternalRunStepObjectObject left, InternalRunStepObjectObject right) => !left.Equals(right); + public static implicit operator InternalRunStepObjectObject(string value) => new InternalRunStepObjectObject(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalRunStepObjectObject other && Equals(other); + public bool Equals(InternalRunStepObjectObject other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalRunToolCallObjectFunction.Serialization.cs b/.dotnet/src/Generated/Models/InternalRunToolCallObjectFunction.Serialization.cs new file mode 100644 index 000000000..820fb48cc --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalRunToolCallObjectFunction.Serialization.cs @@ -0,0 +1,144 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class InternalRunToolCallObjectFunction : IJsonModel<InternalRunToolCallObjectFunction> + { + void IJsonModel<InternalRunToolCallObjectFunction>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalRunToolCallObjectFunction)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("name") != true) + { + writer.WritePropertyName("name"u8); + writer.WriteStringValue(Name); + } + if (SerializedAdditionalRawData?.ContainsKey("arguments") != true) + { + writer.WritePropertyName("arguments"u8); + writer.WriteStringValue(Arguments); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalRunToolCallObjectFunction IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalRunToolCallObjectFunction)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalRunToolCallObjectFunction(document.RootElement, options); + } + + internal static InternalRunToolCallObjectFunction DeserializeInternalRunToolCallObjectFunction(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string name = default; + string arguments = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("name"u8)) + { + name = property.Value.GetString(); + continue; + } + if (property.NameEquals("arguments"u8)) + { + arguments = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalRunToolCallObjectFunction(name, arguments, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalRunToolCallObjectFunction)} does not support writing '{options.Format}' format."); + } + } + + InternalRunToolCallObjectFunction IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalRunToolCallObjectFunction>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalRunToolCallObjectFunction(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalRunToolCallObjectFunction)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalRunToolCallObjectFunction>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalRunToolCallObjectFunction FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalRunToolCallObjectFunction(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalRunToolCallObjectFunction.cs b/.dotnet/src/Generated/Models/InternalRunToolCallObjectFunction.cs new file mode 100644 index 000000000..a9f72778e --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalRunToolCallObjectFunction.cs @@ -0,0 +1,36 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + internal partial class InternalRunToolCallObjectFunction + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + internal InternalRunToolCallObjectFunction(string name, string arguments) + { + Argument.AssertNotNull(name, nameof(name)); + Argument.AssertNotNull(arguments, nameof(arguments)); + + Name = name; + Arguments = arguments; + } + + internal InternalRunToolCallObjectFunction(string name, string arguments, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + Name = name; + Arguments = arguments; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal InternalRunToolCallObjectFunction() + { + } + + public string Name { get; } + public string 
Arguments { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalRunToolCallObjectType.cs b/.dotnet/src/Generated/Models/InternalRunToolCallObjectType.cs new file mode 100644 index 000000000..55e7b1d45 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalRunToolCallObjectType.cs @@ -0,0 +1,34 @@ +// + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Assistants +{ + internal readonly partial struct InternalRunToolCallObjectType : IEquatable<InternalRunToolCallObjectType> + { + private readonly string _value; + + public InternalRunToolCallObjectType(string value) + { + _value = value ?? throw new ArgumentNullException(nameof(value)); + } + + private const string FunctionValue = "function"; + + public static InternalRunToolCallObjectType Function { get; } = new InternalRunToolCallObjectType(FunctionValue); + public static bool operator ==(InternalRunToolCallObjectType left, InternalRunToolCallObjectType right) => left.Equals(right); + public static bool operator !=(InternalRunToolCallObjectType left, InternalRunToolCallObjectType right) => !left.Equals(right); + public static implicit operator InternalRunToolCallObjectType(string value) => new InternalRunToolCallObjectType(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalRunToolCallObjectType other && Equals(other); + public bool Equals(InternalRunToolCallObjectType other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? 
StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalStaticChunkingStrategyDetails.Serialization.cs b/.dotnet/src/Generated/Models/InternalStaticChunkingStrategyDetails.Serialization.cs new file mode 100644 index 000000000..f40eef604 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalStaticChunkingStrategyDetails.Serialization.cs @@ -0,0 +1,144 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.VectorStores +{ + internal partial class InternalStaticChunkingStrategyDetails : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalStaticChunkingStrategyDetails)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("max_chunk_size_tokens") != true) + { + writer.WritePropertyName("max_chunk_size_tokens"u8); + writer.WriteNumberValue(MaxChunkSizeTokens); + } + if (SerializedAdditionalRawData?.ContainsKey("chunk_overlap_tokens") != true) + { + writer.WritePropertyName("chunk_overlap_tokens"u8); + writer.WriteNumberValue(ChunkOverlapTokens); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } 
+ + InternalStaticChunkingStrategyDetails IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalStaticChunkingStrategyDetails)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalStaticChunkingStrategyDetails(document.RootElement, options); + } + + internal static InternalStaticChunkingStrategyDetails DeserializeInternalStaticChunkingStrategyDetails(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + int maxChunkSizeTokens = default; + int chunkOverlapTokens = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("max_chunk_size_tokens"u8)) + { + maxChunkSizeTokens = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("chunk_overlap_tokens"u8)) + { + chunkOverlapTokens = property.Value.GetInt32(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalStaticChunkingStrategyDetails(maxChunkSizeTokens, chunkOverlapTokens, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalStaticChunkingStrategyDetails)} does not support writing '{options.Format}' format."); + } + } + + InternalStaticChunkingStrategyDetails IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalStaticChunkingStrategyDetails(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalStaticChunkingStrategyDetails)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalStaticChunkingStrategyDetails FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalStaticChunkingStrategyDetails(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalStaticChunkingStrategyDetails.cs b/.dotnet/src/Generated/Models/InternalStaticChunkingStrategyDetails.cs new file mode 100644 index 000000000..26a6aa2e1 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalStaticChunkingStrategyDetails.cs @@ -0,0 +1,33 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.VectorStores +{ + internal partial class InternalStaticChunkingStrategyDetails + { + internal IDictionary SerializedAdditionalRawData { get; set; } + public 
InternalStaticChunkingStrategyDetails(int maxChunkSizeTokens, int chunkOverlapTokens) + { + MaxChunkSizeTokens = maxChunkSizeTokens; + ChunkOverlapTokens = chunkOverlapTokens; + } + + internal InternalStaticChunkingStrategyDetails(int maxChunkSizeTokens, int chunkOverlapTokens, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + MaxChunkSizeTokens = maxChunkSizeTokens; + ChunkOverlapTokens = chunkOverlapTokens; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal InternalStaticChunkingStrategyDetails() + { + } + + public int MaxChunkSizeTokens { get; set; } + public int ChunkOverlapTokens { get; set; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalStaticChunkingStrategyRequestParam.Serialization.cs b/.dotnet/src/Generated/Models/InternalStaticChunkingStrategyRequestParam.Serialization.cs new file mode 100644 index 000000000..47c705b9e --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalStaticChunkingStrategyRequestParam.Serialization.cs @@ -0,0 +1,144 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.VectorStores +{ + internal partial class InternalStaticChunkingStrategyRequestParam : IJsonModel<InternalStaticChunkingStrategyRequestParam> + { + void IJsonModel<InternalStaticChunkingStrategyRequestParam>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalStaticChunkingStrategyRequestParam)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("static") != true) + { + writer.WritePropertyName("static"u8); + writer.WriteObjectValue(Static, options); + } + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalStaticChunkingStrategyRequestParam IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalStaticChunkingStrategyRequestParam)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalStaticChunkingStrategyRequestParam(document.RootElement, options); + } + + internal static InternalStaticChunkingStrategyRequestParam DeserializeInternalStaticChunkingStrategyRequestParam(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + InternalStaticChunkingStrategyDetails @static = default; + string type = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("static"u8)) + { + @static = InternalStaticChunkingStrategyDetails.DeserializeInternalStaticChunkingStrategyDetails(property.Value, options); + continue; + } + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalStaticChunkingStrategyRequestParam(type, serializedAdditionalRawData, @static); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalStaticChunkingStrategyRequestParam)} does not support writing '{options.Format}' format."); + } + } + + InternalStaticChunkingStrategyRequestParam IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalStaticChunkingStrategyRequestParam(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalStaticChunkingStrategyRequestParam)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static new InternalStaticChunkingStrategyRequestParam FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalStaticChunkingStrategyRequestParam(document.RootElement); + } + + internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalStaticChunkingStrategyRequestParam.cs b/.dotnet/src/Generated/Models/InternalStaticChunkingStrategyRequestParam.cs new file mode 100644 index 000000000..2b96bd6ea --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalStaticChunkingStrategyRequestParam.cs @@ -0,0 +1,31 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.VectorStores +{ + internal partial class InternalStaticChunkingStrategyRequestParam : InternalFileChunkingStrategyRequestParam + { + public 
InternalStaticChunkingStrategyRequestParam(InternalStaticChunkingStrategyDetails @static) + { + Argument.AssertNotNull(@static, nameof(@static)); + + Type = "static"; + Static = @static; + } + + internal InternalStaticChunkingStrategyRequestParam(string type, IDictionary<string, BinaryData> serializedAdditionalRawData, InternalStaticChunkingStrategyDetails @static) : base(type, serializedAdditionalRawData) + { + Static = @static; + } + + internal InternalStaticChunkingStrategyRequestParam() + { + } + + public InternalStaticChunkingStrategyDetails Static { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalSubmitToolOutputsRunRequest.Serialization.cs b/.dotnet/src/Generated/Models/InternalSubmitToolOutputsRunRequest.Serialization.cs new file mode 100644 index 000000000..d6c8b0d3f --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalSubmitToolOutputsRunRequest.Serialization.cs @@ -0,0 +1,166 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class InternalSubmitToolOutputsRunRequest : IJsonModel<InternalSubmitToolOutputsRunRequest> + { + void IJsonModel<InternalSubmitToolOutputsRunRequest>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalSubmitToolOutputsRunRequest)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("tool_outputs") != true) + { + writer.WritePropertyName("tool_outputs"u8); + writer.WriteStartArray(); + foreach (var item in ToolOutputs) + { + writer.WriteObjectValue(item, options); + } + writer.WriteEndArray(); + } + if (SerializedAdditionalRawData?.ContainsKey("stream") != true && Optional.IsDefined(Stream)) + { + if (Stream != null) + { + writer.WritePropertyName("stream"u8); + writer.WriteBooleanValue(Stream.Value); + } + else + { + writer.WriteNull("stream"); + } + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalSubmitToolOutputsRunRequest IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalSubmitToolOutputsRunRequest)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalSubmitToolOutputsRunRequest(document.RootElement, options); + } + + internal static InternalSubmitToolOutputsRunRequest DeserializeInternalSubmitToolOutputsRunRequest(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + IList toolOutputs = default; + bool? stream = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("tool_outputs"u8)) + { + List array = new List(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(ToolOutput.DeserializeToolOutput(item, options)); + } + toolOutputs = array; + continue; + } + if (property.NameEquals("stream"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + stream = null; + continue; + } + stream = property.Value.GetBoolean(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalSubmitToolOutputsRunRequest(toolOutputs, stream, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalSubmitToolOutputsRunRequest)} does not support writing '{options.Format}' format."); + } + } + + InternalSubmitToolOutputsRunRequest IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalSubmitToolOutputsRunRequest(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalSubmitToolOutputsRunRequest)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalSubmitToolOutputsRunRequest FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalSubmitToolOutputsRunRequest(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalSubmitToolOutputsRunRequest.cs b/.dotnet/src/Generated/Models/InternalSubmitToolOutputsRunRequest.cs new file mode 100644 index 000000000..13a442914 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalSubmitToolOutputsRunRequest.cs @@ -0,0 +1,35 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; +using System.Linq; + +namespace OpenAI.Assistants +{ + internal partial class InternalSubmitToolOutputsRunRequest + { + internal IDictionary SerializedAdditionalRawData { get; set; } + public 
InternalSubmitToolOutputsRunRequest(IEnumerable<ToolOutput> toolOutputs) + { + Argument.AssertNotNull(toolOutputs, nameof(toolOutputs)); + + ToolOutputs = toolOutputs.ToList(); + } + + internal InternalSubmitToolOutputsRunRequest(IList<ToolOutput> toolOutputs, bool? stream, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + ToolOutputs = toolOutputs; + Stream = stream; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal InternalSubmitToolOutputsRunRequest() + { + } + + public IList<ToolOutput> ToolOutputs { get; } + public bool? Stream { get; set; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalThreadObjectObject.cs b/.dotnet/src/Generated/Models/InternalThreadObjectObject.cs new file mode 100644 index 000000000..1ae268486 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalThreadObjectObject.cs @@ -0,0 +1,34 @@ +// + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Assistants +{ + internal readonly partial struct InternalThreadObjectObject : IEquatable<InternalThreadObjectObject> + { + private readonly string _value; + + public InternalThreadObjectObject(string value) + { + _value = value ?? 
throw new ArgumentNullException(nameof(value)); + } + + private const string ThreadValue = "thread"; + + public static InternalThreadObjectObject Thread { get; } = new InternalThreadObjectObject(ThreadValue); + public static bool operator ==(InternalThreadObjectObject left, InternalThreadObjectObject right) => left.Equals(right); + public static bool operator !=(InternalThreadObjectObject left, InternalThreadObjectObject right) => !left.Equals(right); + public static implicit operator InternalThreadObjectObject(string value) => new InternalThreadObjectObject(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalThreadObjectObject other && Equals(other); + public bool Equals(InternalThreadObjectObject other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalThreadObjectToolResources.Serialization.cs b/.dotnet/src/Generated/Models/InternalThreadObjectToolResources.Serialization.cs new file mode 100644 index 000000000..d33c9da5a --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalThreadObjectToolResources.Serialization.cs @@ -0,0 +1,152 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class InternalThreadObjectToolResources : IJsonModel<InternalThreadObjectToolResources> + { + void IJsonModel<InternalThreadObjectToolResources>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalThreadObjectToolResources)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("code_interpreter") != true && Optional.IsDefined(CodeInterpreter)) + { + writer.WritePropertyName("code_interpreter"u8); + writer.WriteObjectValue(CodeInterpreter, options); + } + if (SerializedAdditionalRawData?.ContainsKey("file_search") != true && Optional.IsDefined(FileSearch)) + { + writer.WritePropertyName("file_search"u8); + writer.WriteObjectValue(FileSearch, options); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalThreadObjectToolResources IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalThreadObjectToolResources)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalThreadObjectToolResources(document.RootElement, options); + } + + internal static InternalThreadObjectToolResources DeserializeInternalThreadObjectToolResources(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + InternalThreadObjectToolResourcesCodeInterpreter codeInterpreter = default; + InternalThreadObjectToolResourcesFileSearch fileSearch = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("code_interpreter"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + codeInterpreter = InternalThreadObjectToolResourcesCodeInterpreter.DeserializeInternalThreadObjectToolResourcesCodeInterpreter(property.Value, options); + continue; + } + if (property.NameEquals("file_search"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + fileSearch = InternalThreadObjectToolResourcesFileSearch.DeserializeInternalThreadObjectToolResourcesFileSearch(property.Value, options); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalThreadObjectToolResources(codeInterpreter, fileSearch, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format 
== "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalThreadObjectToolResources)} does not support writing '{options.Format}' format."); + } + } + + InternalThreadObjectToolResources IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalThreadObjectToolResources(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalThreadObjectToolResources)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalThreadObjectToolResources FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalThreadObjectToolResources(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalThreadObjectToolResources.cs b/.dotnet/src/Generated/Models/InternalThreadObjectToolResources.cs new file mode 100644 index 000000000..5d766bb62 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalThreadObjectToolResources.cs @@ -0,0 +1,27 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + internal partial class InternalThreadObjectToolResources + { + internal IDictionary SerializedAdditionalRawData { get; set; } + internal InternalThreadObjectToolResources() + { + } + + internal 
InternalThreadObjectToolResources(InternalThreadObjectToolResourcesCodeInterpreter codeInterpreter, InternalThreadObjectToolResourcesFileSearch fileSearch, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + CodeInterpreter = codeInterpreter; + FileSearch = fileSearch; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + public InternalThreadObjectToolResourcesCodeInterpreter CodeInterpreter { get; } + public InternalThreadObjectToolResourcesFileSearch FileSearch { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalThreadObjectToolResourcesCodeInterpreter.Serialization.cs b/.dotnet/src/Generated/Models/InternalThreadObjectToolResourcesCodeInterpreter.Serialization.cs new file mode 100644 index 000000000..a65e19205 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalThreadObjectToolResourcesCodeInterpreter.Serialization.cs @@ -0,0 +1,147 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class InternalThreadObjectToolResourcesCodeInterpreter : IJsonModel<InternalThreadObjectToolResourcesCodeInterpreter> + { + void IJsonModel<InternalThreadObjectToolResourcesCodeInterpreter>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalThreadObjectToolResourcesCodeInterpreter)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("file_ids") != true && Optional.IsCollectionDefined(FileIds)) + { + writer.WritePropertyName("file_ids"u8); + writer.WriteStartArray(); + foreach (var item in FileIds) + { + writer.WriteStringValue(item); + } + writer.WriteEndArray(); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalThreadObjectToolResourcesCodeInterpreter IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalThreadObjectToolResourcesCodeInterpreter)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalThreadObjectToolResourcesCodeInterpreter(document.RootElement, options); + } + + internal static InternalThreadObjectToolResourcesCodeInterpreter DeserializeInternalThreadObjectToolResourcesCodeInterpreter(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + IReadOnlyList fileIds = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("file_ids"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + List array = new List(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(item.GetString()); + } + fileIds = array; + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalThreadObjectToolResourcesCodeInterpreter(fileIds ?? new ChangeTrackingList(), serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalThreadObjectToolResourcesCodeInterpreter)} does not support writing '{options.Format}' format."); + } + } + + InternalThreadObjectToolResourcesCodeInterpreter IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalThreadObjectToolResourcesCodeInterpreter(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalThreadObjectToolResourcesCodeInterpreter)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalThreadObjectToolResourcesCodeInterpreter FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalThreadObjectToolResourcesCodeInterpreter(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalThreadObjectToolResourcesCodeInterpreter.cs b/.dotnet/src/Generated/Models/InternalThreadObjectToolResourcesCodeInterpreter.cs new file mode 100644 index 000000000..ddcfa61ee --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalThreadObjectToolResourcesCodeInterpreter.cs @@ -0,0 +1,26 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + internal partial class InternalThreadObjectToolResourcesCodeInterpreter + { + 
internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + internal InternalThreadObjectToolResourcesCodeInterpreter() + { + FileIds = new ChangeTrackingList<string>(); + } + + internal InternalThreadObjectToolResourcesCodeInterpreter(IReadOnlyList<string> fileIds, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + FileIds = fileIds; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + public IReadOnlyList<string> FileIds { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalThreadObjectToolResourcesFileSearch.Serialization.cs b/.dotnet/src/Generated/Models/InternalThreadObjectToolResourcesFileSearch.Serialization.cs new file mode 100644 index 000000000..f2697edf2 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalThreadObjectToolResourcesFileSearch.Serialization.cs @@ -0,0 +1,147 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class InternalThreadObjectToolResourcesFileSearch : IJsonModel<InternalThreadObjectToolResourcesFileSearch> + { + void IJsonModel<InternalThreadObjectToolResourcesFileSearch>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalThreadObjectToolResourcesFileSearch)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("vector_store_ids") != true && Optional.IsCollectionDefined(VectorStoreIds)) + { + writer.WritePropertyName("vector_store_ids"u8); + writer.WriteStartArray(); + foreach (var item in VectorStoreIds) + { + writer.WriteStringValue(item); + } + writer.WriteEndArray(); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalThreadObjectToolResourcesFileSearch IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalThreadObjectToolResourcesFileSearch)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalThreadObjectToolResourcesFileSearch(document.RootElement, options); + } + + internal static InternalThreadObjectToolResourcesFileSearch DeserializeInternalThreadObjectToolResourcesFileSearch(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + IReadOnlyList vectorStoreIds = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("vector_store_ids"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + List array = new List(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(item.GetString()); + } + vectorStoreIds = array; + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalThreadObjectToolResourcesFileSearch(vectorStoreIds ?? new ChangeTrackingList(), serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalThreadObjectToolResourcesFileSearch)} does not support writing '{options.Format}' format."); + } + } + + InternalThreadObjectToolResourcesFileSearch IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalThreadObjectToolResourcesFileSearch(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalThreadObjectToolResourcesFileSearch)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalThreadObjectToolResourcesFileSearch FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalThreadObjectToolResourcesFileSearch(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalThreadObjectToolResourcesFileSearch.cs b/.dotnet/src/Generated/Models/InternalThreadObjectToolResourcesFileSearch.cs new file mode 100644 index 000000000..4214562af --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalThreadObjectToolResourcesFileSearch.cs @@ -0,0 +1,26 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + internal partial class InternalThreadObjectToolResourcesFileSearch + { + internal IDictionary SerializedAdditionalRawData { 
get; set; } + internal InternalThreadObjectToolResourcesFileSearch() + { + VectorStoreIds = new ChangeTrackingList<string>(); + } + + internal InternalThreadObjectToolResourcesFileSearch(IReadOnlyList<string> vectorStoreIds, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + VectorStoreIds = vectorStoreIds; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + public IReadOnlyList<string> VectorStoreIds { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalToolResourcesFileSearchIdsOnly.Serialization.cs b/.dotnet/src/Generated/Models/InternalToolResourcesFileSearchIdsOnly.Serialization.cs new file mode 100644 index 000000000..3b188f3b2 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalToolResourcesFileSearchIdsOnly.Serialization.cs @@ -0,0 +1,147 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class InternalToolResourcesFileSearchIdsOnly : IJsonModel<InternalToolResourcesFileSearchIdsOnly> + { + void IJsonModel<InternalToolResourcesFileSearchIdsOnly>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalToolResourcesFileSearchIdsOnly>)this).GetFormatFromOptions(options) : options.Format;
+ if (format != "J")
+ {
+ throw new FormatException($"The model {nameof(InternalToolResourcesFileSearchIdsOnly)} does not support writing '{format}' format.");
+ }
+
+ writer.WriteStartObject();
+ if (SerializedAdditionalRawData?.ContainsKey("vector_store_ids") != true && Optional.IsCollectionDefined(VectorStoreIds))
+ {
+ writer.WritePropertyName("vector_store_ids"u8);
+ writer.WriteStartArray();
+ foreach (var item in VectorStoreIds)
+ {
+ writer.WriteStringValue(item);
+ }
+ writer.WriteEndArray();
+ }
+ if (SerializedAdditionalRawData != null)
+ {
+ foreach (var item in SerializedAdditionalRawData)
+ {
+ if (ModelSerializationExtensions.IsSentinelValue(item.Value))
+ {
+ continue;
+ }
+ writer.WritePropertyName(item.Key);
+#if NET6_0_OR_GREATER
+ writer.WriteRawValue(item.Value);
+#else
+ using (JsonDocument document = JsonDocument.Parse(item.Value))
+ {
+ JsonSerializer.Serialize(writer, document.RootElement);
+ }
+#endif
+ }
+ }
+ writer.WriteEndObject();
+ }
+
+ InternalToolResourcesFileSearchIdsOnly IJsonModel<InternalToolResourcesFileSearchIdsOnly>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options)
+ {
+ var format = options.Format == "W" ?
((IPersistableModel<InternalToolResourcesFileSearchIdsOnly>)this).GetFormatFromOptions(options) : options.Format;
+ if (format != "J")
+ {
+ throw new FormatException($"The model {nameof(InternalToolResourcesFileSearchIdsOnly)} does not support reading '{format}' format.");
+ }
+
+ using JsonDocument document = JsonDocument.ParseValue(ref reader);
+ return DeserializeInternalToolResourcesFileSearchIdsOnly(document.RootElement, options);
+ }
+
+ internal static InternalToolResourcesFileSearchIdsOnly DeserializeInternalToolResourcesFileSearchIdsOnly(JsonElement element, ModelReaderWriterOptions options = null)
+ {
+ options ??= ModelSerializationExtensions.WireOptions;
+
+ if (element.ValueKind == JsonValueKind.Null)
+ {
+ return null;
+ }
+ IList<string> vectorStoreIds = default;
+ IDictionary<string, BinaryData> serializedAdditionalRawData = default;
+ Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>();
+ foreach (var property in element.EnumerateObject())
+ {
+ if (property.NameEquals("vector_store_ids"u8))
+ {
+ if (property.Value.ValueKind == JsonValueKind.Null)
+ {
+ continue;
+ }
+ List<string> array = new List<string>();
+ foreach (var item in property.Value.EnumerateArray())
+ {
+ array.Add(item.GetString());
+ }
+ vectorStoreIds = array;
+ continue;
+ }
+ if (true)
+ {
+ rawDataDictionary ??= new Dictionary<string, BinaryData>();
+ rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText()));
+ }
+ }
+ serializedAdditionalRawData = rawDataDictionary;
+ return new InternalToolResourcesFileSearchIdsOnly(vectorStoreIds ?? new ChangeTrackingList<string>(), serializedAdditionalRawData);
+ }
+
+ BinaryData IPersistableModel<InternalToolResourcesFileSearchIdsOnly>.Write(ModelReaderWriterOptions options)
+ {
+ var format = options.Format == "W" ?
((IPersistableModel<InternalToolResourcesFileSearchIdsOnly>)this).GetFormatFromOptions(options) : options.Format;
+
+ switch (format)
+ {
+ case "J":
+ return ModelReaderWriter.Write(this, options);
+ default:
+ throw new FormatException($"The model {nameof(InternalToolResourcesFileSearchIdsOnly)} does not support writing '{options.Format}' format.");
+ }
+ }
+
+ InternalToolResourcesFileSearchIdsOnly IPersistableModel<InternalToolResourcesFileSearchIdsOnly>.Create(BinaryData data, ModelReaderWriterOptions options)
+ {
+ var format = options.Format == "W" ? ((IPersistableModel<InternalToolResourcesFileSearchIdsOnly>)this).GetFormatFromOptions(options) : options.Format;
+
+ switch (format)
+ {
+ case "J":
+ {
+ using JsonDocument document = JsonDocument.Parse(data);
+ return DeserializeInternalToolResourcesFileSearchIdsOnly(document.RootElement, options);
+ }
+ default:
+ throw new FormatException($"The model {nameof(InternalToolResourcesFileSearchIdsOnly)} does not support reading '{options.Format}' format.");
+ }
+ }
+
+ string IPersistableModel<InternalToolResourcesFileSearchIdsOnly>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J";
+
+ internal static InternalToolResourcesFileSearchIdsOnly FromResponse(PipelineResponse response)
+ {
+ using var document = JsonDocument.Parse(response.Content);
+ return DeserializeInternalToolResourcesFileSearchIdsOnly(document.RootElement);
+ }
+
+ internal virtual BinaryContent ToBinaryContent()
+ {
+ return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions);
+ }
+ }
+}
diff --git a/.dotnet/src/Generated/Models/InternalToolResourcesFileSearchIdsOnly.cs b/.dotnet/src/Generated/Models/InternalToolResourcesFileSearchIdsOnly.cs
new file mode 100644
index 000000000..33ed0185e
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalToolResourcesFileSearchIdsOnly.cs
@@ -0,0 +1,26 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.Collections.Generic;
+
+namespace OpenAI.Assistants
+{
+ internal partial class InternalToolResourcesFileSearchIdsOnly
+ {
+ internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; }
+ public
InternalToolResourcesFileSearchIdsOnly()
+ {
+ VectorStoreIds = new ChangeTrackingList<string>();
+ }
+
+ internal InternalToolResourcesFileSearchIdsOnly(IList<string> vectorStoreIds, IDictionary<string, BinaryData> serializedAdditionalRawData)
+ {
+ VectorStoreIds = vectorStoreIds;
+ SerializedAdditionalRawData = serializedAdditionalRawData;
+ }
+
+ public IList<string> VectorStoreIds { get; }
+ }
+}
diff --git a/.dotnet/src/Generated/Models/InternalTruncationObjectType.cs b/.dotnet/src/Generated/Models/InternalTruncationObjectType.cs
new file mode 100644
index 000000000..a6a78832e
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalTruncationObjectType.cs
@@ -0,0 +1,36 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ComponentModel;
+
+namespace OpenAI.Assistants
+{
+ internal readonly partial struct InternalTruncationObjectType : IEquatable<InternalTruncationObjectType>
+ {
+ private readonly string _value;
+
+ public InternalTruncationObjectType(string value)
+ {
+ _value = value ?? throw new ArgumentNullException(nameof(value));
+ }
+
+ private const string AutoValue = "auto";
+ private const string LastMessagesValue = "last_messages";
+
+ public static InternalTruncationObjectType Auto { get; } = new InternalTruncationObjectType(AutoValue);
+ public static InternalTruncationObjectType LastMessages { get; } = new InternalTruncationObjectType(LastMessagesValue);
+ public static bool operator ==(InternalTruncationObjectType left, InternalTruncationObjectType right) => left.Equals(right);
+ public static bool operator !=(InternalTruncationObjectType left, InternalTruncationObjectType right) => !left.Equals(right);
+ public static implicit operator InternalTruncationObjectType(string value) => new InternalTruncationObjectType(value);
+
+ [EditorBrowsable(EditorBrowsableState.Never)]
+ public override bool Equals(object obj) => obj is InternalTruncationObjectType other && Equals(other);
+ public bool Equals(InternalTruncationObjectType other) => string.Equals(_value, other._value,
StringComparison.InvariantCultureIgnoreCase);
+
+ [EditorBrowsable(EditorBrowsableState.Never)]
+ public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0;
+ public override string ToString() => _value;
+ }
+}
diff --git a/.dotnet/src/Generated/Models/InternalUnknownAssistantResponseFormat.Serialization.cs b/.dotnet/src/Generated/Models/InternalUnknownAssistantResponseFormat.Serialization.cs
new file mode 100644
index 000000000..ff9e4c05e
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalUnknownAssistantResponseFormat.Serialization.cs
@@ -0,0 +1,122 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace OpenAI.Assistants
+{
+ internal partial class InternalUnknownAssistantResponseFormat : IJsonModel<AssistantResponseFormat>
+ {
+ void IJsonModel<AssistantResponseFormat>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+ {
+ var format = options.Format == "W" ?
((IPersistableModel<AssistantResponseFormat>)this).GetFormatFromOptions(options) : options.Format;
+ if (format != "J")
+ {
+ throw new FormatException($"The model {nameof(AssistantResponseFormat)} does not support writing '{format}' format.");
+ }
+
+ writer.WriteStartObject();
+ if (SerializedAdditionalRawData?.ContainsKey("type") != true)
+ {
+ writer.WritePropertyName("type"u8);
+ writer.WriteStringValue(Type);
+ }
+ if (SerializedAdditionalRawData != null)
+ {
+ foreach (var item in SerializedAdditionalRawData)
+ {
+ if (ModelSerializationExtensions.IsSentinelValue(item.Value))
+ {
+ continue;
+ }
+ writer.WritePropertyName(item.Key);
+#if NET6_0_OR_GREATER
+ writer.WriteRawValue(item.Value);
+#else
+ using (JsonDocument document = JsonDocument.Parse(item.Value))
+ {
+ JsonSerializer.Serialize(writer, document.RootElement);
+ }
+#endif
+ }
+ }
+ writer.WriteEndObject();
+ }
+
+ AssistantResponseFormat IJsonModel<AssistantResponseFormat>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options)
+ {
+ var format = options.Format == "W" ?
((IPersistableModel<AssistantResponseFormat>)this).GetFormatFromOptions(options) : options.Format;
+ if (format != "J")
+ {
+ throw new FormatException($"The model {nameof(AssistantResponseFormat)} does not support reading '{format}' format.");
+ }
+
+ using JsonDocument document = JsonDocument.ParseValue(ref reader);
+ return DeserializeAssistantResponseFormat(document.RootElement, options);
+ }
+
+ internal static InternalUnknownAssistantResponseFormat DeserializeInternalUnknownAssistantResponseFormat(JsonElement element, ModelReaderWriterOptions options = null)
+ {
+ options ??= ModelSerializationExtensions.WireOptions;
+
+ if (element.ValueKind == JsonValueKind.Null)
+ {
+ return null;
+ }
+ string type = "Unknown";
+ IDictionary<string, BinaryData> serializedAdditionalRawData = default;
+ Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>();
+ foreach (var property in element.EnumerateObject())
+ {
+ if (property.NameEquals("type"u8))
+ {
+ type = property.Value.GetString();
+ continue;
+ }
+ if (true)
+ {
+ rawDataDictionary ??= new Dictionary<string, BinaryData>();
+ rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText()));
+ }
+ }
+ serializedAdditionalRawData = rawDataDictionary;
+ return new InternalUnknownAssistantResponseFormat(type, serializedAdditionalRawData);
+ }
+
+ BinaryData IPersistableModel<AssistantResponseFormat>.Write(ModelReaderWriterOptions options)
+ {
+ var format = options.Format == "W" ? ((IPersistableModel<AssistantResponseFormat>)this).GetFormatFromOptions(options) : options.Format;
+
+ switch (format)
+ {
+ case "J":
+ return ModelReaderWriter.Write(this, options);
+ default:
+ throw new FormatException($"The model {nameof(AssistantResponseFormat)} does not support writing '{options.Format}' format.");
+ }
+ }
+
+ AssistantResponseFormat IPersistableModel<AssistantResponseFormat>.Create(BinaryData data, ModelReaderWriterOptions options)
+ {
+ var format = options.Format == "W" ?
((IPersistableModel<AssistantResponseFormat>)this).GetFormatFromOptions(options) : options.Format;
+
+ switch (format)
+ {
+ case "J":
+ {
+ using JsonDocument document = JsonDocument.Parse(data);
+ return DeserializeAssistantResponseFormat(document.RootElement, options);
+ }
+ default:
+ throw new FormatException($"The model {nameof(AssistantResponseFormat)} does not support reading '{options.Format}' format.");
+ }
+ }
+
+ string IPersistableModel<AssistantResponseFormat>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J";
+ }
+}
diff --git a/.dotnet/src/Generated/Models/InternalUnknownAssistantResponseFormat.cs b/.dotnet/src/Generated/Models/InternalUnknownAssistantResponseFormat.cs
new file mode 100644
index 000000000..13ee16c7d
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalUnknownAssistantResponseFormat.cs
@@ -0,0 +1,20 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.Collections.Generic;
+
+namespace OpenAI.Assistants
+{
+ internal partial class InternalUnknownAssistantResponseFormat : AssistantResponseFormat
+ {
+ internal InternalUnknownAssistantResponseFormat(string type, IDictionary<string, BinaryData> serializedAdditionalRawData) : base(type, serializedAdditionalRawData)
+ {
+ }
+
+ internal InternalUnknownAssistantResponseFormat()
+ {
+ }
+ }
+}
diff --git a/.dotnet/src/Generated/Models/InternalUnknownChatResponseFormat.Serialization.cs b/.dotnet/src/Generated/Models/InternalUnknownChatResponseFormat.Serialization.cs
new file mode 100644
index 000000000..0bc64d31b
--- /dev/null
+++ b/.dotnet/src/Generated/Models/InternalUnknownChatResponseFormat.Serialization.cs
@@ -0,0 +1,133 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace OpenAI.Chat
+{
+ internal partial class InternalUnknownChatResponseFormat : IJsonModel<ChatResponseFormat>
+ {
+ void IJsonModel<ChatResponseFormat>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+ {
+ var format = options.Format == "W" ?
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ChatResponseFormat)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + ChatResponseFormat IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ChatResponseFormat)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeChatResponseFormat(document.RootElement, options); + } + + internal static InternalUnknownChatResponseFormat DeserializeInternalUnknownChatResponseFormat(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string type = "Unknown"; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalUnknownChatResponseFormat(type, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(ChatResponseFormat)} does not support writing '{options.Format}' format."); + } + } + + ChatResponseFormat IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeChatResponseFormat(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(ChatResponseFormat)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static new InternalUnknownChatResponseFormat FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalUnknownChatResponseFormat(document.RootElement); + } + + internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalUnknownChatResponseFormat.cs b/.dotnet/src/Generated/Models/InternalUnknownChatResponseFormat.cs new file mode 100644 index 000000000..e3def713f --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalUnknownChatResponseFormat.cs @@ -0,0 +1,20 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Chat +{ + internal partial class InternalUnknownChatResponseFormat : ChatResponseFormat + { + internal InternalUnknownChatResponseFormat(string type, IDictionary serializedAdditionalRawData) : base(type, serializedAdditionalRawData) + { + } + + internal InternalUnknownChatResponseFormat() + { + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalUnknownChunkingStrategy.Serialization.cs b/.dotnet/src/Generated/Models/InternalUnknownChunkingStrategy.Serialization.cs new file mode 100644 index 000000000..6ac1d2f89 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalUnknownChunkingStrategy.Serialization.cs @@ -0,0 +1,133 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using 
System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.VectorStores +{ + internal partial class InternalUnknownChunkingStrategy : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalUnknownChunkingStrategy)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalUnknownChunkingStrategy IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalUnknownChunkingStrategy)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalUnknownChunkingStrategy(document.RootElement, options); + } + + internal static InternalUnknownChunkingStrategy DeserializeInternalUnknownChunkingStrategy(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string type = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalUnknownChunkingStrategy(type, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalUnknownChunkingStrategy)} does not support writing '{options.Format}' format."); + } + } + + InternalUnknownChunkingStrategy IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalUnknownChunkingStrategy(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalUnknownChunkingStrategy)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static new InternalUnknownChunkingStrategy FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalUnknownChunkingStrategy(document.RootElement); + } + + internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalUnknownChunkingStrategy.cs b/.dotnet/src/Generated/Models/InternalUnknownChunkingStrategy.cs new file mode 100644 index 000000000..c67c751d4 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalUnknownChunkingStrategy.cs @@ -0,0 +1,21 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.VectorStores +{ + internal partial class InternalUnknownChunkingStrategy : FileChunkingStrategy + { + internal InternalUnknownChunkingStrategy() + { + Type = "other"; + } + + internal InternalUnknownChunkingStrategy(string type, IDictionary serializedAdditionalRawData) : base(type, serializedAdditionalRawData) + { + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalUnknownFileChunkingStrategyRequestParamProxy.Serialization.cs b/.dotnet/src/Generated/Models/InternalUnknownFileChunkingStrategyRequestParamProxy.Serialization.cs new file mode 100644 index 000000000..5a958dfeb --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalUnknownFileChunkingStrategyRequestParamProxy.Serialization.cs @@ -0,0 
+1,133 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.VectorStores +{ + internal partial class InternalUnknownFileChunkingStrategyRequestParamProxy : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalFileChunkingStrategyRequestParam)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalFileChunkingStrategyRequestParam IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalFileChunkingStrategyRequestParam)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalFileChunkingStrategyRequestParam(document.RootElement, options); + } + + internal static InternalUnknownFileChunkingStrategyRequestParamProxy DeserializeInternalUnknownFileChunkingStrategyRequestParamProxy(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string type = "Unknown"; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalUnknownFileChunkingStrategyRequestParamProxy(type, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalFileChunkingStrategyRequestParam)} does not support writing '{options.Format}' format."); + } + } + + InternalFileChunkingStrategyRequestParam IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalFileChunkingStrategyRequestParam(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalFileChunkingStrategyRequestParam)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static new InternalUnknownFileChunkingStrategyRequestParamProxy FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalUnknownFileChunkingStrategyRequestParamProxy(document.RootElement); + } + + internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalUnknownFileChunkingStrategyRequestParamProxy.cs b/.dotnet/src/Generated/Models/InternalUnknownFileChunkingStrategyRequestParamProxy.cs new file mode 100644 index 000000000..91f50cabb --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalUnknownFileChunkingStrategyRequestParamProxy.cs @@ -0,0 +1,20 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.VectorStores +{ + internal partial class InternalUnknownFileChunkingStrategyRequestParamProxy : InternalFileChunkingStrategyRequestParam + { + internal InternalUnknownFileChunkingStrategyRequestParamProxy(string type, IDictionary serializedAdditionalRawData) : base(type, serializedAdditionalRawData) + { + } + + internal InternalUnknownFileChunkingStrategyRequestParamProxy() + { + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalUnknownFileChunkingStrategyResponseParamProxy.Serialization.cs 
b/.dotnet/src/Generated/Models/InternalUnknownFileChunkingStrategyResponseParamProxy.Serialization.cs new file mode 100644 index 000000000..c98a69cd1 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalUnknownFileChunkingStrategyResponseParamProxy.Serialization.cs @@ -0,0 +1,133 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.VectorStores +{ + internal partial class InternalUnknownFileChunkingStrategyResponseParamProxy : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(FileChunkingStrategy)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + FileChunkingStrategy IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(FileChunkingStrategy)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeFileChunkingStrategy(document.RootElement, options); + } + + internal static InternalUnknownFileChunkingStrategyResponseParamProxy DeserializeInternalUnknownFileChunkingStrategyResponseParamProxy(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string type = "Unknown"; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalUnknownFileChunkingStrategyResponseParamProxy(type, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(FileChunkingStrategy)} does not support writing '{options.Format}' format."); + } + } + + FileChunkingStrategy IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeFileChunkingStrategy(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(FileChunkingStrategy)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static new InternalUnknownFileChunkingStrategyResponseParamProxy FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalUnknownFileChunkingStrategyResponseParamProxy(document.RootElement); + } + + internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalUnknownFileChunkingStrategyResponseParamProxy.cs b/.dotnet/src/Generated/Models/InternalUnknownFileChunkingStrategyResponseParamProxy.cs new file mode 100644 index 000000000..db5bbec83 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalUnknownFileChunkingStrategyResponseParamProxy.cs @@ -0,0 +1,20 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.VectorStores +{ + internal partial class InternalUnknownFileChunkingStrategyResponseParamProxy : FileChunkingStrategy + { + internal InternalUnknownFileChunkingStrategyResponseParamProxy(string type, IDictionary serializedAdditionalRawData) : base(type, serializedAdditionalRawData) + { + } + + internal InternalUnknownFileChunkingStrategyResponseParamProxy() + { + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalUnknownOmniTypedResponseFormat.Serialization.cs b/.dotnet/src/Generated/Models/InternalUnknownOmniTypedResponseFormat.Serialization.cs new file mode 100644 index 000000000..43b17f909 --- /dev/null +++ 
b/.dotnet/src/Generated/Models/InternalUnknownOmniTypedResponseFormat.Serialization.cs @@ -0,0 +1,133 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Internal +{ + internal partial class InternalUnknownOmniTypedResponseFormat : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalOmniTypedResponseFormat)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalOmniTypedResponseFormat IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalOmniTypedResponseFormat)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalOmniTypedResponseFormat(document.RootElement, options); + } + + internal static InternalUnknownOmniTypedResponseFormat DeserializeInternalUnknownOmniTypedResponseFormat(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string type = "Unknown"; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalUnknownOmniTypedResponseFormat(type, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalOmniTypedResponseFormat)} does not support writing '{options.Format}' format."); + } + } + + InternalOmniTypedResponseFormat IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalOmniTypedResponseFormat(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalOmniTypedResponseFormat)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static new InternalUnknownOmniTypedResponseFormat FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalUnknownOmniTypedResponseFormat(document.RootElement); + } + + internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalUnknownOmniTypedResponseFormat.cs b/.dotnet/src/Generated/Models/InternalUnknownOmniTypedResponseFormat.cs new file mode 100644 index 000000000..8ae4808ff --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalUnknownOmniTypedResponseFormat.cs @@ -0,0 +1,20 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Internal +{ + internal partial class InternalUnknownOmniTypedResponseFormat : InternalOmniTypedResponseFormat + { + internal InternalUnknownOmniTypedResponseFormat(string type, IDictionary serializedAdditionalRawData) : base(type, serializedAdditionalRawData) + { + } + + internal InternalUnknownOmniTypedResponseFormat() + { + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalUpload.Serialization.cs b/.dotnet/src/Generated/Models/InternalUpload.Serialization.cs new file mode 100644 index 000000000..6b2b8604d --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalUpload.Serialization.cs @@ -0,0 +1,247 @@ +// + +#nullable disable + +using System; +using 
System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Files +{ + internal partial class InternalUpload : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalUpload)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("id") != true) + { + writer.WritePropertyName("id"u8); + writer.WriteStringValue(Id); + } + if (SerializedAdditionalRawData?.ContainsKey("created_at") != true) + { + writer.WritePropertyName("created_at"u8); + writer.WriteNumberValue(CreatedAt, "U"); + } + if (SerializedAdditionalRawData?.ContainsKey("filename") != true) + { + writer.WritePropertyName("filename"u8); + writer.WriteStringValue(Filename); + } + if (SerializedAdditionalRawData?.ContainsKey("bytes") != true) + { + writer.WritePropertyName("bytes"u8); + writer.WriteNumberValue(Bytes); + } + if (SerializedAdditionalRawData?.ContainsKey("purpose") != true) + { + writer.WritePropertyName("purpose"u8); + writer.WriteStringValue(Purpose); + } + if (SerializedAdditionalRawData?.ContainsKey("status") != true) + { + writer.WritePropertyName("status"u8); + writer.WriteStringValue(Status.ToString()); + } + if (SerializedAdditionalRawData?.ContainsKey("expires_at") != true) + { + writer.WritePropertyName("expires_at"u8); + writer.WriteNumberValue(ExpiresAt, "U"); + } + if (SerializedAdditionalRawData?.ContainsKey("object") != true && Optional.IsDefined(Object)) + { + writer.WritePropertyName("object"u8); + writer.WriteStringValue(Object.Value.ToString()); + } + if (SerializedAdditionalRawData?.ContainsKey("file") != true && Optional.IsDefined(File)) + { + if (File != null) + { + 
writer.WritePropertyName("file"u8); + writer.WriteObjectValue(File, options); + } + else + { + writer.WriteNull("file"); + } + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalUpload IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalUpload)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalUpload(document.RootElement, options); + } + + internal static InternalUpload DeserializeInternalUpload(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string id = default; + DateTimeOffset createdAt = default; + string filename = default; + int bytes = default; + string purpose = default; + InternalUploadStatus status = default; + DateTimeOffset expiresAt = default; + InternalUploadObject? 
@object = default; + OpenAIFileInfo file = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("id"u8)) + { + id = property.Value.GetString(); + continue; + } + if (property.NameEquals("created_at"u8)) + { + createdAt = DateTimeOffset.FromUnixTimeSeconds(property.Value.GetInt64()); + continue; + } + if (property.NameEquals("filename"u8)) + { + filename = property.Value.GetString(); + continue; + } + if (property.NameEquals("bytes"u8)) + { + bytes = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("purpose"u8)) + { + purpose = property.Value.GetString(); + continue; + } + if (property.NameEquals("status"u8)) + { + status = new InternalUploadStatus(property.Value.GetString()); + continue; + } + if (property.NameEquals("expires_at"u8)) + { + expiresAt = DateTimeOffset.FromUnixTimeSeconds(property.Value.GetInt64()); + continue; + } + if (property.NameEquals("object"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + @object = new InternalUploadObject(property.Value.GetString()); + continue; + } + if (property.NameEquals("file"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + file = null; + continue; + } + file = OpenAIFileInfo.DeserializeOpenAIFileInfo(property.Value, options); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalUpload( + id, + createdAt, + filename, + bytes, + purpose, + status, + expiresAt, + @object, + file, + serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<InternalUpload>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalUpload)} does not support writing '{options.Format}' format."); + } + } + + InternalUpload IPersistableModel<InternalUpload>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalUpload>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalUpload(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalUpload)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalUpload>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalUpload FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalUpload(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalUpload.cs b/.dotnet/src/Generated/Models/InternalUpload.cs new file mode 100644 index 000000000..fb464dedd --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalUpload.cs @@ -0,0 +1,56 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Files +{ + internal partial class InternalUpload + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + internal InternalUpload(string id, DateTimeOffset createdAt, string filename, int bytes, string purpose, InternalUploadStatus status, DateTimeOffset expiresAt) + { + Argument.AssertNotNull(id, nameof(id)); + Argument.AssertNotNull(filename, nameof(filename)); +
Argument.AssertNotNull(purpose, nameof(purpose)); + + Id = id; + CreatedAt = createdAt; + Filename = filename; + Bytes = bytes; + Purpose = purpose; + Status = status; + ExpiresAt = expiresAt; + } + + internal InternalUpload(string id, DateTimeOffset createdAt, string filename, int bytes, string purpose, InternalUploadStatus status, DateTimeOffset expiresAt, InternalUploadObject? @object, OpenAIFileInfo file, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + Id = id; + CreatedAt = createdAt; + Filename = filename; + Bytes = bytes; + Purpose = purpose; + Status = status; + ExpiresAt = expiresAt; + Object = @object; + File = file; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal InternalUpload() + { + } + + public string Id { get; } + public DateTimeOffset CreatedAt { get; } + public string Filename { get; } + public int Bytes { get; } + public string Purpose { get; } + public InternalUploadStatus Status { get; } + public DateTimeOffset ExpiresAt { get; } + public InternalUploadObject? Object { get; } + public OpenAIFileInfo File { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalUploadObject.cs b/.dotnet/src/Generated/Models/InternalUploadObject.cs new file mode 100644 index 000000000..9dacd622a --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalUploadObject.cs @@ -0,0 +1,34 @@ +// + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Files +{ + internal readonly partial struct InternalUploadObject : IEquatable<InternalUploadObject> + { + private readonly string _value; + + public InternalUploadObject(string value) + { + _value = value ??
throw new ArgumentNullException(nameof(value)); + } + + private const string UploadValue = "upload"; + + public static InternalUploadObject Upload { get; } = new InternalUploadObject(UploadValue); + public static bool operator ==(InternalUploadObject left, InternalUploadObject right) => left.Equals(right); + public static bool operator !=(InternalUploadObject left, InternalUploadObject right) => !left.Equals(right); + public static implicit operator InternalUploadObject(string value) => new InternalUploadObject(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalUploadObject other && Equals(other); + public bool Equals(InternalUploadObject other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalUploadPart.Serialization.cs b/.dotnet/src/Generated/Models/InternalUploadPart.Serialization.cs new file mode 100644 index 000000000..c6f4c77f8 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalUploadPart.Serialization.cs @@ -0,0 +1,166 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Files +{ + internal partial class InternalUploadPart : IJsonModel<InternalUploadPart> + { + void IJsonModel<InternalUploadPart>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ?
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalUploadPart)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("id") != true) + { + writer.WritePropertyName("id"u8); + writer.WriteStringValue(Id); + } + if (SerializedAdditionalRawData?.ContainsKey("created_at") != true) + { + writer.WritePropertyName("created_at"u8); + writer.WriteNumberValue(CreatedAt, "U"); + } + if (SerializedAdditionalRawData?.ContainsKey("upload_id") != true) + { + writer.WritePropertyName("upload_id"u8); + writer.WriteStringValue(UploadId); + } + if (SerializedAdditionalRawData?.ContainsKey("object") != true) + { + writer.WritePropertyName("object"u8); + writer.WriteStringValue(Object.ToString()); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalUploadPart IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalUploadPart)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalUploadPart(document.RootElement, options); + } + + internal static InternalUploadPart DeserializeInternalUploadPart(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string id = default; + DateTimeOffset createdAt = default; + string uploadId = default; + InternalUploadPartObject @object = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("id"u8)) + { + id = property.Value.GetString(); + continue; + } + if (property.NameEquals("created_at"u8)) + { + createdAt = DateTimeOffset.FromUnixTimeSeconds(property.Value.GetInt64()); + continue; + } + if (property.NameEquals("upload_id"u8)) + { + uploadId = property.Value.GetString(); + continue; + } + if (property.NameEquals("object"u8)) + { + @object = new InternalUploadPartObject(property.Value.GetString()); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalUploadPart(id, createdAt, uploadId, @object, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalUploadPart)} does not support writing '{options.Format}' format."); + } + } + + InternalUploadPart IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalUploadPart(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalUploadPart)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalUploadPart FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalUploadPart(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalUploadPart.cs b/.dotnet/src/Generated/Models/InternalUploadPart.cs new file mode 100644 index 000000000..5a7239276 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalUploadPart.cs @@ -0,0 +1,41 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Files +{ + internal partial class InternalUploadPart + { + internal IDictionary SerializedAdditionalRawData { get; set; } + internal InternalUploadPart(string id, DateTimeOffset createdAt, string uploadId) + { + Argument.AssertNotNull(id, nameof(id)); + Argument.AssertNotNull(uploadId, nameof(uploadId)); + + Id = id; + CreatedAt = createdAt; + UploadId = uploadId; 
+ } + + internal InternalUploadPart(string id, DateTimeOffset createdAt, string uploadId, InternalUploadPartObject @object, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + Id = id; + CreatedAt = createdAt; + UploadId = uploadId; + Object = @object; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal InternalUploadPart() + { + } + + public string Id { get; } + public DateTimeOffset CreatedAt { get; } + public string UploadId { get; } + public InternalUploadPartObject Object { get; } = InternalUploadPartObject.UploadPart; + } +} diff --git a/.dotnet/src/Generated/Models/InternalUploadPartObject.cs b/.dotnet/src/Generated/Models/InternalUploadPartObject.cs new file mode 100644 index 000000000..b790ad181 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalUploadPartObject.cs @@ -0,0 +1,34 @@ +// + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Files +{ + internal readonly partial struct InternalUploadPartObject : IEquatable<InternalUploadPartObject> + { + private readonly string _value; + + public InternalUploadPartObject(string value) + { + _value = value ??
throw new ArgumentNullException(nameof(value)); + } + + private const string UploadPartValue = "upload.part"; + + public static InternalUploadPartObject UploadPart { get; } = new InternalUploadPartObject(UploadPartValue); + public static bool operator ==(InternalUploadPartObject left, InternalUploadPartObject right) => left.Equals(right); + public static bool operator !=(InternalUploadPartObject left, InternalUploadPartObject right) => !left.Equals(right); + public static implicit operator InternalUploadPartObject(string value) => new InternalUploadPartObject(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalUploadPartObject other && Equals(other); + public bool Equals(InternalUploadPartObject other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalUploadStatus.cs b/.dotnet/src/Generated/Models/InternalUploadStatus.cs new file mode 100644 index 000000000..4ac5876d4 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalUploadStatus.cs @@ -0,0 +1,40 @@ +// + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Files +{ + internal readonly partial struct InternalUploadStatus : IEquatable<InternalUploadStatus> + { + private readonly string _value; + + public InternalUploadStatus(string value) + { + _value = value ??
throw new ArgumentNullException(nameof(value)); + } + + private const string PendingValue = "pending"; + private const string CompletedValue = "completed"; + private const string CancelledValue = "cancelled"; + private const string ExpiredValue = "expired"; + + public static InternalUploadStatus Pending { get; } = new InternalUploadStatus(PendingValue); + public static InternalUploadStatus Completed { get; } = new InternalUploadStatus(CompletedValue); + public static InternalUploadStatus Cancelled { get; } = new InternalUploadStatus(CancelledValue); + public static InternalUploadStatus Expired { get; } = new InternalUploadStatus(ExpiredValue); + public static bool operator ==(InternalUploadStatus left, InternalUploadStatus right) => left.Equals(right); + public static bool operator !=(InternalUploadStatus left, InternalUploadStatus right) => !left.Equals(right); + public static implicit operator InternalUploadStatus(string value) => new InternalUploadStatus(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalUploadStatus other && Equals(other); + public bool Equals(InternalUploadStatus other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? 
StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalVectorStoreFileBatchObjectFileCounts.Serialization.cs b/.dotnet/src/Generated/Models/InternalVectorStoreFileBatchObjectFileCounts.Serialization.cs new file mode 100644 index 000000000..fa75b3b88 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalVectorStoreFileBatchObjectFileCounts.Serialization.cs @@ -0,0 +1,183 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.VectorStores +{ + internal partial class InternalVectorStoreFileBatchObjectFileCounts : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalVectorStoreFileBatchObjectFileCounts)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("in_progress") != true) + { + writer.WritePropertyName("in_progress"u8); + writer.WriteNumberValue(InProgress); + } + if (SerializedAdditionalRawData?.ContainsKey("completed") != true) + { + writer.WritePropertyName("completed"u8); + writer.WriteNumberValue(Completed); + } + if (SerializedAdditionalRawData?.ContainsKey("failed") != true) + { + writer.WritePropertyName("failed"u8); + writer.WriteNumberValue(Failed); + } + if (SerializedAdditionalRawData?.ContainsKey("cancelled") != true) + { + writer.WritePropertyName("cancelled"u8); + writer.WriteNumberValue(Cancelled); + } + if (SerializedAdditionalRawData?.ContainsKey("total") != true) + { + writer.WritePropertyName("total"u8); + writer.WriteNumberValue(Total); + } + if (SerializedAdditionalRawData != 
null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalVectorStoreFileBatchObjectFileCounts IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalVectorStoreFileBatchObjectFileCounts)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalVectorStoreFileBatchObjectFileCounts(document.RootElement, options); + } + + internal static InternalVectorStoreFileBatchObjectFileCounts DeserializeInternalVectorStoreFileBatchObjectFileCounts(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + int inProgress = default; + int completed = default; + int failed = default; + int cancelled = default; + int total = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("in_progress"u8)) + { + inProgress = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("completed"u8)) + { + completed = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("failed"u8)) + { + failed = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("cancelled"u8)) + { + cancelled = 
property.Value.GetInt32(); + continue; + } + if (property.NameEquals("total"u8)) + { + total = property.Value.GetInt32(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new InternalVectorStoreFileBatchObjectFileCounts( + inProgress, + completed, + failed, + cancelled, + total, + serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalVectorStoreFileBatchObjectFileCounts)} does not support writing '{options.Format}' format."); + } + } + + InternalVectorStoreFileBatchObjectFileCounts IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalVectorStoreFileBatchObjectFileCounts(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalVectorStoreFileBatchObjectFileCounts)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static InternalVectorStoreFileBatchObjectFileCounts FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeInternalVectorStoreFileBatchObjectFileCounts(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/InternalVectorStoreFileBatchObjectFileCounts.cs b/.dotnet/src/Generated/Models/InternalVectorStoreFileBatchObjectFileCounts.cs new file mode 100644 index 000000000..03849395d --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalVectorStoreFileBatchObjectFileCounts.cs @@ -0,0 +1,42 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.VectorStores +{ + internal partial class InternalVectorStoreFileBatchObjectFileCounts + { + internal IDictionary SerializedAdditionalRawData { get; set; } + internal InternalVectorStoreFileBatchObjectFileCounts(int inProgress, int completed, int failed, int cancelled, int total) + { + InProgress = inProgress; + Completed = completed; + Failed = failed; + Cancelled = cancelled; + Total = total; + } + + internal InternalVectorStoreFileBatchObjectFileCounts(int inProgress, int completed, int failed, int cancelled, int total, IDictionary serializedAdditionalRawData) + { + InProgress = inProgress; + Completed = completed; + 
Failed = failed; + Cancelled = cancelled; + Total = total; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal InternalVectorStoreFileBatchObjectFileCounts() + { + } + + public int InProgress { get; } + public int Completed { get; } + public int Failed { get; } + public int Cancelled { get; } + public int Total { get; } + } +} diff --git a/.dotnet/src/Generated/Models/InternalVectorStoreFileBatchObjectObject.cs b/.dotnet/src/Generated/Models/InternalVectorStoreFileBatchObjectObject.cs new file mode 100644 index 000000000..baa95d6a2 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalVectorStoreFileBatchObjectObject.cs @@ -0,0 +1,34 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.VectorStores +{ + internal readonly partial struct InternalVectorStoreFileBatchObjectObject : IEquatable<InternalVectorStoreFileBatchObjectObject> + { + private readonly string _value; + + public InternalVectorStoreFileBatchObjectObject(string value) + { + _value = value ?? throw new ArgumentNullException(nameof(value)); + } + + private const string VectorStoreFilesBatchValue = "vector_store.files_batch"; + + public static InternalVectorStoreFileBatchObjectObject VectorStoreFilesBatch { get; } = new InternalVectorStoreFileBatchObjectObject(VectorStoreFilesBatchValue); + public static bool operator ==(InternalVectorStoreFileBatchObjectObject left, InternalVectorStoreFileBatchObjectObject right) => left.Equals(right); + public static bool operator !=(InternalVectorStoreFileBatchObjectObject left, InternalVectorStoreFileBatchObjectObject right) => !left.Equals(right); + public static implicit operator InternalVectorStoreFileBatchObjectObject(string value) => new InternalVectorStoreFileBatchObjectObject(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalVectorStoreFileBatchObjectObject other && Equals(other); + public bool Equals(InternalVectorStoreFileBatchObjectObject other) => 
string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalVectorStoreFileObjectObject.cs b/.dotnet/src/Generated/Models/InternalVectorStoreFileObjectObject.cs new file mode 100644 index 000000000..89fbddec0 --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalVectorStoreFileObjectObject.cs @@ -0,0 +1,34 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.VectorStores +{ + internal readonly partial struct InternalVectorStoreFileObjectObject : IEquatable<InternalVectorStoreFileObjectObject> + { + private readonly string _value; + + public InternalVectorStoreFileObjectObject(string value) + { + _value = value ?? throw new ArgumentNullException(nameof(value)); + } + + private const string VectorStoreFileValue = "vector_store.file"; + + public static InternalVectorStoreFileObjectObject VectorStoreFile { get; } = new InternalVectorStoreFileObjectObject(VectorStoreFileValue); + public static bool operator ==(InternalVectorStoreFileObjectObject left, InternalVectorStoreFileObjectObject right) => left.Equals(right); + public static bool operator !=(InternalVectorStoreFileObjectObject left, InternalVectorStoreFileObjectObject right) => !left.Equals(right); + public static implicit operator InternalVectorStoreFileObjectObject(string value) => new InternalVectorStoreFileObjectObject(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalVectorStoreFileObjectObject other && Equals(other); + public bool Equals(InternalVectorStoreFileObjectObject other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int 
GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/InternalVectorStoreObjectObject.cs b/.dotnet/src/Generated/Models/InternalVectorStoreObjectObject.cs new file mode 100644 index 000000000..f65258a8c --- /dev/null +++ b/.dotnet/src/Generated/Models/InternalVectorStoreObjectObject.cs @@ -0,0 +1,34 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.VectorStores +{ + internal readonly partial struct InternalVectorStoreObjectObject : IEquatable<InternalVectorStoreObjectObject> + { + private readonly string _value; + + public InternalVectorStoreObjectObject(string value) + { + _value = value ?? throw new ArgumentNullException(nameof(value)); + } + + private const string VectorStoreValue = "vector_store"; + + public static InternalVectorStoreObjectObject VectorStore { get; } = new InternalVectorStoreObjectObject(VectorStoreValue); + public static bool operator ==(InternalVectorStoreObjectObject left, InternalVectorStoreObjectObject right) => left.Equals(right); + public static bool operator !=(InternalVectorStoreObjectObject left, InternalVectorStoreObjectObject right) => !left.Equals(right); + public static implicit operator InternalVectorStoreObjectObject(string value) => new InternalVectorStoreObjectObject(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is InternalVectorStoreObjectObject other && Equals(other); + public bool Equals(InternalVectorStoreObjectObject other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? 
StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/ListOrder.cs b/.dotnet/src/Generated/Models/ListOrder.cs new file mode 100644 index 000000000..31221d29b --- /dev/null +++ b/.dotnet/src/Generated/Models/ListOrder.cs @@ -0,0 +1,33 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI +{ + public readonly partial struct ListOrder : IEquatable<ListOrder> + { + private readonly string _value; + + public ListOrder(string value) + { + _value = value ?? throw new ArgumentNullException(nameof(value)); + } + + private const string OldestFirstValue = "asc"; + private const string NewestFirstValue = "desc"; + + public static ListOrder OldestFirst { get; } = new ListOrder(OldestFirstValue); + public static ListOrder NewestFirst { get; } = new ListOrder(NewestFirstValue); + public static bool operator ==(ListOrder left, ListOrder right) => left.Equals(right); + public static bool operator !=(ListOrder left, ListOrder right) => !left.Equals(right); + public static implicit operator ListOrder(string value) => new ListOrder(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is ListOrder other && Equals(other); + public bool Equals(ListOrder other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? 
StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/MessageContent.Serialization.cs b/.dotnet/src/Generated/Models/MessageContent.Serialization.cs new file mode 100644 index 000000000..8f0030793 --- /dev/null +++ b/.dotnet/src/Generated/Models/MessageContent.Serialization.cs @@ -0,0 +1,68 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + public partial class MessageContent : IJsonModel<MessageContent> + { + MessageContent IJsonModel<MessageContent>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<MessageContent>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(MessageContent)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeMessageContent(document.RootElement, options); + } + + BinaryData IPersistableModel<MessageContent>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<MessageContent>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(MessageContent)} does not support writing '{options.Format}' format."); + } + } + + MessageContent IPersistableModel<MessageContent>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<MessageContent>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeMessageContent(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(MessageContent)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<MessageContent>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static MessageContent FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeMessageContent(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/MessageContent.cs b/.dotnet/src/Generated/Models/MessageContent.cs new file mode 100644 index 000000000..cee79aaec --- /dev/null +++ b/.dotnet/src/Generated/Models/MessageContent.cs @@ -0,0 +1,19 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + public abstract partial class MessageContent + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + + internal MessageContent(IDictionary<string, BinaryData> serializedAdditionalRawData) + { + SerializedAdditionalRawData = serializedAdditionalRawData; + } + } +} diff --git a/.dotnet/src/Generated/Models/MessageCreationAttachment.Serialization.cs b/.dotnet/src/Generated/Models/MessageCreationAttachment.Serialization.cs new file mode 100644 index 000000000..8ab9a3be3 --- /dev/null +++ b/.dotnet/src/Generated/Models/MessageCreationAttachment.Serialization.cs @@ -0,0 +1,144 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + public partial class 
MessageCreationAttachment : IJsonModel<MessageCreationAttachment> + { + void IJsonModel<MessageCreationAttachment>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<MessageCreationAttachment>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(MessageCreationAttachment)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("file_id") != true) + { + writer.WritePropertyName("file_id"u8); + writer.WriteStringValue(FileId); + } + if (SerializedAdditionalRawData?.ContainsKey("tools") != true) + { + writer.WritePropertyName("tools"u8); + SerializeTools(writer, options); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + MessageCreationAttachment IJsonModel<MessageCreationAttachment>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<MessageCreationAttachment>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(MessageCreationAttachment)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeMessageCreationAttachment(document.RootElement, options); + } + + internal static MessageCreationAttachment DeserializeMessageCreationAttachment(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string fileId = default; + IReadOnlyList<ToolDefinition> tools = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("file_id"u8)) + { + fileId = property.Value.GetString(); + continue; + } + if (property.NameEquals("tools"u8)) + { + DeserializeTools(property, ref tools); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new MessageCreationAttachment(fileId, tools, serializedAdditionalRawData); + } + + BinaryData IPersistableModel<MessageCreationAttachment>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<MessageCreationAttachment>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(MessageCreationAttachment)} does not support writing '{options.Format}' format."); + } + } + + MessageCreationAttachment IPersistableModel<MessageCreationAttachment>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<MessageCreationAttachment>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeMessageCreationAttachment(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(MessageCreationAttachment)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<MessageCreationAttachment>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static MessageCreationAttachment FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeMessageCreationAttachment(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/MessageCreationAttachment.cs b/.dotnet/src/Generated/Models/MessageCreationAttachment.cs new file mode 100644 index 000000000..3def69b58 --- /dev/null +++ b/.dotnet/src/Generated/Models/MessageCreationAttachment.cs @@ -0,0 +1,36 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.Collections.Generic; +using System.Linq; + +namespace OpenAI.Assistants +{ + public partial class MessageCreationAttachment + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + public MessageCreationAttachment(string fileId, IEnumerable<ToolDefinition> tools) + { + Argument.AssertNotNull(fileId, nameof(fileId)); + Argument.AssertNotNull(tools, nameof(tools)); + + FileId = fileId; + Tools = tools.ToList(); + } + + internal MessageCreationAttachment(string fileId, IReadOnlyList<ToolDefinition> tools, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + FileId = fileId; + Tools = tools; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal MessageCreationAttachment() + { + } + + public string FileId { get; } + } +} diff --git 
a/.dotnet/src/Generated/Models/MessageCreationOptions.Serialization.cs b/.dotnet/src/Generated/Models/MessageCreationOptions.Serialization.cs new file mode 100644 index 000000000..cec07156b --- /dev/null +++ b/.dotnet/src/Generated/Models/MessageCreationOptions.Serialization.cs @@ -0,0 +1,214 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + public partial class MessageCreationOptions : IJsonModel<MessageCreationOptions> + { + void IJsonModel<MessageCreationOptions>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<MessageCreationOptions>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(MessageCreationOptions)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("role") != true) + { + writer.WritePropertyName("role"u8); + writer.WriteStringValue(Role.ToSerialString()); + } + if (SerializedAdditionalRawData?.ContainsKey("content") != true) + { + writer.WritePropertyName("content"u8); + SerializeContent(writer, options); + } + if (SerializedAdditionalRawData?.ContainsKey("attachments") != true && Optional.IsCollectionDefined(Attachments)) + { + if (Attachments != null) + { + writer.WritePropertyName("attachments"u8); + writer.WriteStartArray(); + foreach (var item in Attachments) + { + writer.WriteObjectValue<MessageCreationAttachment>(item, options); + } + writer.WriteEndArray(); + } + else + { + writer.WriteNull("attachments"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("metadata") != true && Optional.IsCollectionDefined(Metadata)) + { + if (Metadata != null) + { + writer.WritePropertyName("metadata"u8); + writer.WriteStartObject(); + foreach (var item in Metadata) + { + writer.WritePropertyName(item.Key); + writer.WriteStringValue(item.Value); + } + 
writer.WriteEndObject(); + } + else + { + writer.WriteNull("metadata"); + } + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + MessageCreationOptions IJsonModel<MessageCreationOptions>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<MessageCreationOptions>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(MessageCreationOptions)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeMessageCreationOptions(document.RootElement, options); + } + + internal static MessageCreationOptions DeserializeMessageCreationOptions(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + MessageRole role = default; + IList<MessageContent> content = default; + IList<MessageCreationAttachment> attachments = default; + IDictionary<string, string> metadata = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("role"u8)) + { + role = property.Value.GetString().ToMessageRole(); + continue; + } + if (property.NameEquals("content"u8)) + { + List<MessageContent> array = new List<MessageContent>(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(MessageContent.DeserializeMessageContent(item, options)); + } + content = array; + continue; + } + if 
(property.NameEquals("attachments"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + List<MessageCreationAttachment> array = new List<MessageCreationAttachment>(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(MessageCreationAttachment.DeserializeMessageCreationAttachment(item, options)); + } + attachments = array; + continue; + } + if (property.NameEquals("metadata"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + Dictionary<string, string> dictionary = new Dictionary<string, string>(); + foreach (var property0 in property.Value.EnumerateObject()) + { + dictionary.Add(property0.Name, property0.Value.GetString()); + } + metadata = dictionary; + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new MessageCreationOptions(role, content, attachments ?? new ChangeTrackingList<MessageCreationAttachment>(), metadata ?? new ChangeTrackingDictionary<string, string>(), serializedAdditionalRawData); + } + + BinaryData IPersistableModel<MessageCreationOptions>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<MessageCreationOptions>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(MessageCreationOptions)} does not support writing '{options.Format}' format."); + } + } + + MessageCreationOptions IPersistableModel<MessageCreationOptions>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<MessageCreationOptions>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeMessageCreationOptions(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(MessageCreationOptions)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<MessageCreationOptions>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static MessageCreationOptions FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeMessageCreationOptions(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/MessageCreationOptions.cs b/.dotnet/src/Generated/Models/MessageCreationOptions.cs new file mode 100644 index 000000000..123344624 --- /dev/null +++ b/.dotnet/src/Generated/Models/MessageCreationOptions.cs @@ -0,0 +1,26 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.Collections.Generic; +using System.Linq; + +namespace OpenAI.Assistants +{ + public partial class MessageCreationOptions + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + + internal MessageCreationOptions(MessageRole role, IList<MessageContent> content, IList<MessageCreationAttachment> attachments, IDictionary<string, string> metadata, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + Role = role; + Content = content; + Attachments = attachments; + Metadata = metadata; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + public IList<MessageCreationAttachment> Attachments { get; set; } + public IDictionary<string, string> Metadata { get; set; } + } +} diff --git a/.dotnet/src/Generated/Models/MessageFailureDetails.Serialization.cs b/.dotnet/src/Generated/Models/MessageFailureDetails.Serialization.cs new file mode 100644 index 000000000..50740a3cd --- /dev/null +++ 
b/.dotnet/src/Generated/Models/MessageFailureDetails.Serialization.cs @@ -0,0 +1,133 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + public partial class MessageFailureDetails : IJsonModel<MessageFailureDetails> + { + void IJsonModel<MessageFailureDetails>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<MessageFailureDetails>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(MessageFailureDetails)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("reason") != true) + { + writer.WritePropertyName("reason"u8); + writer.WriteStringValue(Reason.ToString()); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + MessageFailureDetails IJsonModel<MessageFailureDetails>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<MessageFailureDetails>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(MessageFailureDetails)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeMessageFailureDetails(document.RootElement, options); + } + + internal static MessageFailureDetails DeserializeMessageFailureDetails(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + MessageFailureReason reason = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("reason"u8)) + { + reason = new MessageFailureReason(property.Value.GetString()); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new MessageFailureDetails(reason, serializedAdditionalRawData); + } + + BinaryData IPersistableModel<MessageFailureDetails>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<MessageFailureDetails>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(MessageFailureDetails)} does not support writing '{options.Format}' format."); + } + } + + MessageFailureDetails IPersistableModel<MessageFailureDetails>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<MessageFailureDetails>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeMessageFailureDetails(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(MessageFailureDetails)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<MessageFailureDetails>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static MessageFailureDetails FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeMessageFailureDetails(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/MessageFailureDetails.cs b/.dotnet/src/Generated/Models/MessageFailureDetails.cs new file mode 100644 index 000000000..9c6b6e8e6 --- /dev/null +++ b/.dotnet/src/Generated/Models/MessageFailureDetails.cs @@ -0,0 +1,30 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + public partial class MessageFailureDetails + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + internal MessageFailureDetails(MessageFailureReason reason) + { + Reason = reason; + } + + internal MessageFailureDetails(MessageFailureReason reason, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + Reason = reason; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal MessageFailureDetails() + { + } + + public MessageFailureReason Reason { get; } + } +} diff --git a/.dotnet/src/Generated/Models/MessageFailureReason.cs b/.dotnet/src/Generated/Models/MessageFailureReason.cs new file mode 100644 index 000000000..ac0b0a02f --- /dev/null +++ b/.dotnet/src/Generated/Models/MessageFailureReason.cs @@ -0,0 +1,42 @@ +// <auto-generated/> + +#nullable 
disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Assistants +{ + public readonly partial struct MessageFailureReason : IEquatable<MessageFailureReason> + { + private readonly string _value; + + public MessageFailureReason(string value) + { + _value = value ?? throw new ArgumentNullException(nameof(value)); + } + + private const string ContentFilterValue = "content_filter"; + private const string MaxTokensValue = "max_tokens"; + private const string RunCancelledValue = "run_cancelled"; + private const string RunExpiredValue = "run_expired"; + private const string RunFailedValue = "run_failed"; + + public static MessageFailureReason ContentFilter { get; } = new MessageFailureReason(ContentFilterValue); + public static MessageFailureReason MaxTokens { get; } = new MessageFailureReason(MaxTokensValue); + public static MessageFailureReason RunCancelled { get; } = new MessageFailureReason(RunCancelledValue); + public static MessageFailureReason RunExpired { get; } = new MessageFailureReason(RunExpiredValue); + public static MessageFailureReason RunFailed { get; } = new MessageFailureReason(RunFailedValue); + public static bool operator ==(MessageFailureReason left, MessageFailureReason right) => left.Equals(right); + public static bool operator !=(MessageFailureReason left, MessageFailureReason right) => !left.Equals(right); + public static implicit operator MessageFailureReason(string value) => new MessageFailureReason(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is MessageFailureReason other && Equals(other); + public bool Equals(MessageFailureReason other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? 
StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/MessageModificationOptions.Serialization.cs b/.dotnet/src/Generated/Models/MessageModificationOptions.Serialization.cs new file mode 100644 index 000000000..f5b28713b --- /dev/null +++ b/.dotnet/src/Generated/Models/MessageModificationOptions.Serialization.cs @@ -0,0 +1,155 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + public partial class MessageModificationOptions : IJsonModel<MessageModificationOptions> + { + void IJsonModel<MessageModificationOptions>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<MessageModificationOptions>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(MessageModificationOptions)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("metadata") != true && Optional.IsCollectionDefined(Metadata)) + { + if (Metadata != null) + { + writer.WritePropertyName("metadata"u8); + writer.WriteStartObject(); + foreach (var item in Metadata) + { + writer.WritePropertyName(item.Key); + writer.WriteStringValue(item.Value); + } + writer.WriteEndObject(); + } + else + { + writer.WriteNull("metadata"); + } + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + 
MessageModificationOptions IJsonModel<MessageModificationOptions>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<MessageModificationOptions>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(MessageModificationOptions)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeMessageModificationOptions(document.RootElement, options); + } + + internal static MessageModificationOptions DeserializeMessageModificationOptions(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + IDictionary<string, string> metadata = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("metadata"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + Dictionary<string, string> dictionary = new Dictionary<string, string>(); + foreach (var property0 in property.Value.EnumerateObject()) + { + dictionary.Add(property0.Name, property0.Value.GetString()); + } + metadata = dictionary; + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new MessageModificationOptions(metadata ?? new ChangeTrackingDictionary<string, string>(), serializedAdditionalRawData); + } + + BinaryData IPersistableModel<MessageModificationOptions>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<MessageModificationOptions>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(MessageModificationOptions)} does not support writing '{options.Format}' format."); + } + } + + MessageModificationOptions IPersistableModel<MessageModificationOptions>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<MessageModificationOptions>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeMessageModificationOptions(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(MessageModificationOptions)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<MessageModificationOptions>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static MessageModificationOptions FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeMessageModificationOptions(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/MessageModificationOptions.cs b/.dotnet/src/Generated/Models/MessageModificationOptions.cs new file mode 100644 index 000000000..5c53d2234 --- /dev/null +++ b/.dotnet/src/Generated/Models/MessageModificationOptions.cs @@ -0,0 +1,26 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + public partial class MessageModificationOptions + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + public MessageModificationOptions() + { + Metadata = new ChangeTrackingDictionary<string, string>(); + } + + internal MessageModificationOptions(IDictionary<string, string> metadata, 
IDictionary<string, BinaryData> serializedAdditionalRawData) + { + Metadata = metadata; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + public IDictionary<string, string> Metadata { get; set; } + } +} diff --git a/.dotnet/src/Generated/Models/MessageRole.Serialization.cs b/.dotnet/src/Generated/Models/MessageRole.Serialization.cs new file mode 100644 index 000000000..6d77ee639 --- /dev/null +++ b/.dotnet/src/Generated/Models/MessageRole.Serialization.cs @@ -0,0 +1,25 @@ +// <auto-generated/> + +#nullable disable + +using System; + +namespace OpenAI.Assistants +{ + internal static partial class MessageRoleExtensions + { + public static string ToSerialString(this MessageRole value) => value switch + { + MessageRole.User => "user", + MessageRole.Assistant => "assistant", + _ => throw new ArgumentOutOfRangeException(nameof(value), value, "Unknown MessageRole value.") + }; + + public static MessageRole ToMessageRole(this string value) + { + if (StringComparer.OrdinalIgnoreCase.Equals(value, "user")) return MessageRole.User; + if (StringComparer.OrdinalIgnoreCase.Equals(value, "assistant")) return MessageRole.Assistant; + throw new ArgumentOutOfRangeException(nameof(value), value, "Unknown MessageRole value."); + } + } +} diff --git a/.dotnet/src/Generated/Models/MessageStatus.cs b/.dotnet/src/Generated/Models/MessageStatus.cs new file mode 100644 index 000000000..86f1d414c --- /dev/null +++ b/.dotnet/src/Generated/Models/MessageStatus.cs @@ -0,0 +1,38 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Assistants +{ + public readonly partial struct MessageStatus : IEquatable<MessageStatus> + { + private readonly string _value; + + public MessageStatus(string value) + { + _value = value ?? 
throw new ArgumentNullException(nameof(value)); + } + + private const string InProgressValue = "in_progress"; + private const string IncompleteValue = "incomplete"; + private const string CompletedValue = "completed"; + + public static MessageStatus InProgress { get; } = new MessageStatus(InProgressValue); + public static MessageStatus Incomplete { get; } = new MessageStatus(IncompleteValue); + public static MessageStatus Completed { get; } = new MessageStatus(CompletedValue); + public static bool operator ==(MessageStatus left, MessageStatus right) => left.Equals(right); + public static bool operator !=(MessageStatus left, MessageStatus right) => !left.Equals(right); + public static implicit operator MessageStatus(string value) => new MessageStatus(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is MessageStatus other && Equals(other); + public bool Equals(MessageStatus other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/ModerationCategories.Serialization.cs b/.dotnet/src/Generated/Models/ModerationCategories.Serialization.cs new file mode 100644 index 000000000..f8bc1c570 --- /dev/null +++ b/.dotnet/src/Generated/Models/ModerationCategories.Serialization.cs @@ -0,0 +1,255 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Moderations +{ + public partial class ModerationCategories : IJsonModel<ModerationCategories> + { + void IJsonModel<ModerationCategories>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<ModerationCategories>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ModerationCategories)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("hate") != true) + { + writer.WritePropertyName("hate"u8); + writer.WriteBooleanValue(Hate); + } + if (SerializedAdditionalRawData?.ContainsKey("hate/threatening") != true) + { + writer.WritePropertyName("hate/threatening"u8); + writer.WriteBooleanValue(HateThreatening); + } + if (SerializedAdditionalRawData?.ContainsKey("harassment") != true) + { + writer.WritePropertyName("harassment"u8); + writer.WriteBooleanValue(Harassment); + } + if (SerializedAdditionalRawData?.ContainsKey("harassment/threatening") != true) + { + writer.WritePropertyName("harassment/threatening"u8); + writer.WriteBooleanValue(HarassmentThreatening); + } + if (SerializedAdditionalRawData?.ContainsKey("self-harm") != true) + { + writer.WritePropertyName("self-harm"u8); + writer.WriteBooleanValue(SelfHarm); + } + if (SerializedAdditionalRawData?.ContainsKey("self-harm/intent") != true) + { + writer.WritePropertyName("self-harm/intent"u8); + writer.WriteBooleanValue(SelfHarmIntent); + } + if (SerializedAdditionalRawData?.ContainsKey("self-harm/instructions") != true) + { + writer.WritePropertyName("self-harm/instructions"u8); + writer.WriteBooleanValue(SelfHarmInstructions); + } + if (SerializedAdditionalRawData?.ContainsKey("sexual") != true) + { + writer.WritePropertyName("sexual"u8); + writer.WriteBooleanValue(Sexual); + } + if (SerializedAdditionalRawData?.ContainsKey("sexual/minors") != true) + { + writer.WritePropertyName("sexual/minors"u8); + writer.WriteBooleanValue(SexualMinors); + } + if (SerializedAdditionalRawData?.ContainsKey("violence") != true) + { + writer.WritePropertyName("violence"u8); + writer.WriteBooleanValue(Violence); + } + if 
(SerializedAdditionalRawData?.ContainsKey("violence/graphic") != true) + { + writer.WritePropertyName("violence/graphic"u8); + writer.WriteBooleanValue(ViolenceGraphic); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + ModerationCategories IJsonModel<ModerationCategories>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<ModerationCategories>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ModerationCategories)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeModerationCategories(document.RootElement, options); + } + + internal static ModerationCategories DeserializeModerationCategories(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + bool hate = default; + bool hateThreatening = default; + bool harassment = default; + bool harassmentThreatening = default; + bool selfHarm = default; + bool selfHarmIntent = default; + bool selfHarmInstructions = default; + bool sexual = default; + bool sexualMinors = default; + bool violence = default; + bool violenceGraphic = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("hate"u8)) + { + hate = 
property.Value.GetBoolean(); + continue; + } + if (property.NameEquals("hate/threatening"u8)) + { + hateThreatening = property.Value.GetBoolean(); + continue; + } + if (property.NameEquals("harassment"u8)) + { + harassment = property.Value.GetBoolean(); + continue; + } + if (property.NameEquals("harassment/threatening"u8)) + { + harassmentThreatening = property.Value.GetBoolean(); + continue; + } + if (property.NameEquals("self-harm"u8)) + { + selfHarm = property.Value.GetBoolean(); + continue; + } + if (property.NameEquals("self-harm/intent"u8)) + { + selfHarmIntent = property.Value.GetBoolean(); + continue; + } + if (property.NameEquals("self-harm/instructions"u8)) + { + selfHarmInstructions = property.Value.GetBoolean(); + continue; + } + if (property.NameEquals("sexual"u8)) + { + sexual = property.Value.GetBoolean(); + continue; + } + if (property.NameEquals("sexual/minors"u8)) + { + sexualMinors = property.Value.GetBoolean(); + continue; + } + if (property.NameEquals("violence"u8)) + { + violence = property.Value.GetBoolean(); + continue; + } + if (property.NameEquals("violence/graphic"u8)) + { + violenceGraphic = property.Value.GetBoolean(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new ModerationCategories( + hate, + hateThreatening, + harassment, + harassmentThreatening, + selfHarm, + selfHarmIntent, + selfHarmInstructions, + sexual, + sexualMinors, + violence, + violenceGraphic, + serializedAdditionalRawData); + } + + BinaryData IPersistableModel<ModerationCategories>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<ModerationCategories>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(ModerationCategories)} does not support writing '{options.Format}' format."); + } + } + + ModerationCategories IPersistableModel<ModerationCategories>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<ModerationCategories>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeModerationCategories(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(ModerationCategories)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<ModerationCategories>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static ModerationCategories FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeModerationCategories(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/ModerationCategories.cs b/.dotnet/src/Generated/Models/ModerationCategories.cs new file mode 100644 index 000000000..978eaf4e7 --- /dev/null +++ b/.dotnet/src/Generated/Models/ModerationCategories.cs @@ -0,0 +1,60 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Moderations +{ + public partial class ModerationCategories + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + internal ModerationCategories(bool hate, bool hateThreatening, bool harassment, bool harassmentThreatening, bool selfHarm, bool selfHarmIntent, bool selfHarmInstructions, bool sexual, bool sexualMinors, bool violence, 
bool violenceGraphic) + { + Hate = hate; + HateThreatening = hateThreatening; + Harassment = harassment; + HarassmentThreatening = harassmentThreatening; + SelfHarm = selfHarm; + SelfHarmIntent = selfHarmIntent; + SelfHarmInstructions = selfHarmInstructions; + Sexual = sexual; + SexualMinors = sexualMinors; + Violence = violence; + ViolenceGraphic = violenceGraphic; + } + + internal ModerationCategories(bool hate, bool hateThreatening, bool harassment, bool harassmentThreatening, bool selfHarm, bool selfHarmIntent, bool selfHarmInstructions, bool sexual, bool sexualMinors, bool violence, bool violenceGraphic, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + Hate = hate; + HateThreatening = hateThreatening; + Harassment = harassment; + HarassmentThreatening = harassmentThreatening; + SelfHarm = selfHarm; + SelfHarmIntent = selfHarmIntent; + SelfHarmInstructions = selfHarmInstructions; + Sexual = sexual; + SexualMinors = sexualMinors; + Violence = violence; + ViolenceGraphic = violenceGraphic; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal ModerationCategories() + { + } + + public bool Hate { get; } + public bool HateThreatening { get; } + public bool Harassment { get; } + public bool HarassmentThreatening { get; } + public bool SelfHarm { get; } + public bool SelfHarmIntent { get; } + public bool SelfHarmInstructions { get; } + public bool Sexual { get; } + public bool SexualMinors { get; } + public bool Violence { get; } + public bool ViolenceGraphic { get; } + } +} diff --git a/.dotnet/src/Generated/Models/ModerationCategoryScores.Serialization.cs b/.dotnet/src/Generated/Models/ModerationCategoryScores.Serialization.cs new file mode 100644 index 000000000..52df6f3f9 --- /dev/null +++ b/.dotnet/src/Generated/Models/ModerationCategoryScores.Serialization.cs @@ -0,0 +1,255 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using 
System.Text.Json; + +namespace OpenAI.Moderations +{ + public partial class ModerationCategoryScores : IJsonModel<ModerationCategoryScores> + { + void IJsonModel<ModerationCategoryScores>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<ModerationCategoryScores>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ModerationCategoryScores)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("hate") != true) + { + writer.WritePropertyName("hate"u8); + writer.WriteNumberValue(Hate); + } + if (SerializedAdditionalRawData?.ContainsKey("hate/threatening") != true) + { + writer.WritePropertyName("hate/threatening"u8); + writer.WriteNumberValue(HateThreatening); + } + if (SerializedAdditionalRawData?.ContainsKey("harassment") != true) + { + writer.WritePropertyName("harassment"u8); + writer.WriteNumberValue(Harassment); + } + if (SerializedAdditionalRawData?.ContainsKey("harassment/threatening") != true) + { + writer.WritePropertyName("harassment/threatening"u8); + writer.WriteNumberValue(HarassmentThreatening); + } + if (SerializedAdditionalRawData?.ContainsKey("self-harm") != true) + { + writer.WritePropertyName("self-harm"u8); + writer.WriteNumberValue(SelfHarm); + } + if (SerializedAdditionalRawData?.ContainsKey("self-harm/intent") != true) + { + writer.WritePropertyName("self-harm/intent"u8); + writer.WriteNumberValue(SelfHarmIntent); + } + if (SerializedAdditionalRawData?.ContainsKey("self-harm/instructions") != true) + { + writer.WritePropertyName("self-harm/instructions"u8); + writer.WriteNumberValue(SelfHarmInstructions); + } + if (SerializedAdditionalRawData?.ContainsKey("sexual") != true) + { + writer.WritePropertyName("sexual"u8); + writer.WriteNumberValue(Sexual); + } + if (SerializedAdditionalRawData?.ContainsKey("sexual/minors") != true) + { + writer.WritePropertyName("sexual/minors"u8); + 
writer.WriteNumberValue(SexualMinors); + } + if (SerializedAdditionalRawData?.ContainsKey("violence") != true) + { + writer.WritePropertyName("violence"u8); + writer.WriteNumberValue(Violence); + } + if (SerializedAdditionalRawData?.ContainsKey("violence/graphic") != true) + { + writer.WritePropertyName("violence/graphic"u8); + writer.WriteNumberValue(ViolenceGraphic); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + ModerationCategoryScores IJsonModel<ModerationCategoryScores>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<ModerationCategoryScores>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ModerationCategoryScores)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeModerationCategoryScores(document.RootElement, options); + } + + internal static ModerationCategoryScores DeserializeModerationCategoryScores(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + float hate = default; + float hateThreatening = default; + float harassment = default; + float harassmentThreatening = default; + float selfHarm = default; + float selfHarmIntent = default; + float selfHarmInstructions = default; + float sexual = default; + float sexualMinors = default; + float violence = default; + float violenceGraphic = 
default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("hate"u8)) + { + hate = property.Value.GetSingle(); + continue; + } + if (property.NameEquals("hate/threatening"u8)) + { + hateThreatening = property.Value.GetSingle(); + continue; + } + if (property.NameEquals("harassment"u8)) + { + harassment = property.Value.GetSingle(); + continue; + } + if (property.NameEquals("harassment/threatening"u8)) + { + harassmentThreatening = property.Value.GetSingle(); + continue; + } + if (property.NameEquals("self-harm"u8)) + { + selfHarm = property.Value.GetSingle(); + continue; + } + if (property.NameEquals("self-harm/intent"u8)) + { + selfHarmIntent = property.Value.GetSingle(); + continue; + } + if (property.NameEquals("self-harm/instructions"u8)) + { + selfHarmInstructions = property.Value.GetSingle(); + continue; + } + if (property.NameEquals("sexual"u8)) + { + sexual = property.Value.GetSingle(); + continue; + } + if (property.NameEquals("sexual/minors"u8)) + { + sexualMinors = property.Value.GetSingle(); + continue; + } + if (property.NameEquals("violence"u8)) + { + violence = property.Value.GetSingle(); + continue; + } + if (property.NameEquals("violence/graphic"u8)) + { + violenceGraphic = property.Value.GetSingle(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new ModerationCategoryScores( + hate, + hateThreatening, + harassment, + harassmentThreatening, + selfHarm, + selfHarmIntent, + selfHarmInstructions, + sexual, + sexualMinors, + violence, + violenceGraphic, + serializedAdditionalRawData); + } + + BinaryData IPersistableModel<ModerationCategoryScores>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<ModerationCategoryScores>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(ModerationCategoryScores)} does not support writing '{options.Format}' format."); + } + } + + ModerationCategoryScores IPersistableModel<ModerationCategoryScores>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<ModerationCategoryScores>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeModerationCategoryScores(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(ModerationCategoryScores)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<ModerationCategoryScores>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static ModerationCategoryScores FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeModerationCategoryScores(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/ModerationCategoryScores.cs b/.dotnet/src/Generated/Models/ModerationCategoryScores.cs new file mode 100644 index 000000000..a734260ce --- /dev/null +++ b/.dotnet/src/Generated/Models/ModerationCategoryScores.cs @@ -0,0 +1,60 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Moderations +{ + public partial class ModerationCategoryScores + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + internal ModerationCategoryScores(float hate, float hateThreatening, float harassment, float harassmentThreatening, float selfHarm, float selfHarmIntent, float 
selfHarmInstructions, float sexual, float sexualMinors, float violence, float violenceGraphic) + { + Hate = hate; + HateThreatening = hateThreatening; + Harassment = harassment; + HarassmentThreatening = harassmentThreatening; + SelfHarm = selfHarm; + SelfHarmIntent = selfHarmIntent; + SelfHarmInstructions = selfHarmInstructions; + Sexual = sexual; + SexualMinors = sexualMinors; + Violence = violence; + ViolenceGraphic = violenceGraphic; + } + + internal ModerationCategoryScores(float hate, float hateThreatening, float harassment, float harassmentThreatening, float selfHarm, float selfHarmIntent, float selfHarmInstructions, float sexual, float sexualMinors, float violence, float violenceGraphic, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + Hate = hate; + HateThreatening = hateThreatening; + Harassment = harassment; + HarassmentThreatening = harassmentThreatening; + SelfHarm = selfHarm; + SelfHarmIntent = selfHarmIntent; + SelfHarmInstructions = selfHarmInstructions; + Sexual = sexual; + SexualMinors = sexualMinors; + Violence = violence; + ViolenceGraphic = violenceGraphic; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal ModerationCategoryScores() + { + } + + public float Hate { get; } + public float HateThreatening { get; } + public float Harassment { get; } + public float HarassmentThreatening { get; } + public float SelfHarm { get; } + public float SelfHarmIntent { get; } + public float SelfHarmInstructions { get; } + public float Sexual { get; } + public float SexualMinors { get; } + public float Violence { get; } + public float ViolenceGraphic { get; } + } +} diff --git a/.dotnet/src/Generated/Models/ModerationCollection.Serialization.cs b/.dotnet/src/Generated/Models/ModerationCollection.Serialization.cs new file mode 100644 index 000000000..cb0f039f8 --- /dev/null +++ b/.dotnet/src/Generated/Models/ModerationCollection.Serialization.cs @@ -0,0 +1,68 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ClientModel; +using 
System.ClientModel.Primitives; +using System.Text.Json; + +namespace OpenAI.Moderations +{ + public partial class ModerationCollection : IJsonModel<ModerationCollection> + { + ModerationCollection IJsonModel<ModerationCollection>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<ModerationCollection>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ModerationCollection)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeModerationCollection(document.RootElement, options); + } + + BinaryData IPersistableModel<ModerationCollection>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<ModerationCollection>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(ModerationCollection)} does not support writing '{options.Format}' format."); + } + } + + ModerationCollection IPersistableModel<ModerationCollection>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<ModerationCollection>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeModerationCollection(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(ModerationCollection)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<ModerationCollection>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static ModerationCollection FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeModerationCollection(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/ModerationCollection.cs b/.dotnet/src/Generated/Models/ModerationCollection.cs new file mode 100644 index 000000000..c8436d807 --- /dev/null +++ b/.dotnet/src/Generated/Models/ModerationCollection.cs @@ -0,0 +1,16 @@ +// <auto-generated/> + +#nullable disable + +using System.Collections.Generic; +using System.Collections.ObjectModel; +using System.Linq; + +namespace OpenAI.Moderations +{ + public partial class ModerationCollection : ReadOnlyCollection<ModerationResult> + { + public string Id { get; } + public string Model { get; } + } +} diff --git a/.dotnet/src/Generated/Models/ModerationOptions.Serialization.cs b/.dotnet/src/Generated/Models/ModerationOptions.Serialization.cs new file mode 100644 index 000000000..591577235 --- /dev/null +++ b/.dotnet/src/Generated/Models/ModerationOptions.Serialization.cs @@ -0,0 +1,155 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Moderations +{ + internal partial class ModerationOptions : IJsonModel<ModerationOptions> + { + void IJsonModel<ModerationOptions>.Write(Utf8JsonWriter writer, 
ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<ModerationOptions>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ModerationOptions)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("input") != true) + { + writer.WritePropertyName("input"u8); +#if NET6_0_OR_GREATER + writer.WriteRawValue(Input); +#else + using (JsonDocument document = JsonDocument.Parse(Input)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + if (SerializedAdditionalRawData?.ContainsKey("model") != true && Optional.IsDefined(Model)) + { + writer.WritePropertyName("model"u8); + writer.WriteStringValue(Model.Value.ToString()); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + ModerationOptions IJsonModel<ModerationOptions>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<ModerationOptions>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ModerationOptions)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeModerationOptions(document.RootElement, options); + } + + internal static ModerationOptions DeserializeModerationOptions(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + BinaryData input = default; + InternalCreateModerationRequestModel? model = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("input"u8)) + { + input = BinaryData.FromString(property.Value.GetRawText()); + continue; + } + if (property.NameEquals("model"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + model = new InternalCreateModerationRequestModel(property.Value.GetString()); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new ModerationOptions(input, model, serializedAdditionalRawData); + } + + BinaryData IPersistableModel<ModerationOptions>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(ModerationOptions)} does not support writing '{options.Format}' format."); + } + } + + ModerationOptions IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeModerationOptions(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(ModerationOptions)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static ModerationOptions FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeModerationOptions(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/ModerationOptions.cs b/.dotnet/src/Generated/Models/ModerationOptions.cs new file mode 100644 index 000000000..e21defe31 --- /dev/null +++ b/.dotnet/src/Generated/Models/ModerationOptions.cs @@ -0,0 +1,21 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Moderations +{ + internal partial class ModerationOptions + { + internal IDictionary SerializedAdditionalRawData { get; set; } + + internal ModerationOptions(BinaryData input, InternalCreateModerationRequestModel? 
model, IDictionary<string, BinaryData> serializedAdditionalRawData)
+        {
+            Input = input;
+            Model = model;
+            SerializedAdditionalRawData = serializedAdditionalRawData;
+        }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/ModerationResult.Serialization.cs b/.dotnet/src/Generated/Models/ModerationResult.Serialization.cs
new file mode 100644
index 000000000..4e5f4ee28
--- /dev/null
+++ b/.dotnet/src/Generated/Models/ModerationResult.Serialization.cs
@@ -0,0 +1,155 @@
+//
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace OpenAI.Moderations
+{
+    public partial class ModerationResult : IJsonModel<ModerationResult>
+    {
+        void IJsonModel<ModerationResult>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<ModerationResult>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(ModerationResult)} does not support writing '{format}' format.");
+            }
+
+            writer.WriteStartObject();
+            if (SerializedAdditionalRawData?.ContainsKey("flagged") != true)
+            {
+                writer.WritePropertyName("flagged"u8);
+                writer.WriteBooleanValue(Flagged);
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("categories") != true)
+            {
+                writer.WritePropertyName("categories"u8);
+                writer.WriteObjectValue(Categories, options);
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("category_scores") != true)
+            {
+                writer.WritePropertyName("category_scores"u8);
+                writer.WriteObjectValue(CategoryScores, options);
+            }
+            if (SerializedAdditionalRawData != null)
+            {
+                foreach (var item in SerializedAdditionalRawData)
+                {
+                    if (ModelSerializationExtensions.IsSentinelValue(item.Value))
+                    {
+                        continue;
+                    }
+                    writer.WritePropertyName(item.Key);
+#if NET6_0_OR_GREATER
+                    writer.WriteRawValue(item.Value);
+#else
+                    using (JsonDocument document = JsonDocument.Parse(item.Value))
+                    {
+                        JsonSerializer.Serialize(writer, 
document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + ModerationResult IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ModerationResult)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeModerationResult(document.RootElement, options); + } + + internal static ModerationResult DeserializeModerationResult(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + bool flagged = default; + ModerationCategories categories = default; + ModerationCategoryScores categoryScores = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("flagged"u8)) + { + flagged = property.Value.GetBoolean(); + continue; + } + if (property.NameEquals("categories"u8)) + { + categories = ModerationCategories.DeserializeModerationCategories(property.Value, options); + continue; + } + if (property.NameEquals("category_scores"u8)) + { + categoryScores = ModerationCategoryScores.DeserializeModerationCategoryScores(property.Value, options); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new ModerationResult(flagged, categories, categoryScores, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(ModerationResult)} does not support writing '{options.Format}' format."); + } + } + + ModerationResult IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeModerationResult(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(ModerationResult)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static ModerationResult FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeModerationResult(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/ModerationResult.cs b/.dotnet/src/Generated/Models/ModerationResult.cs new file mode 100644 index 000000000..ae015e2eb --- /dev/null +++ b/.dotnet/src/Generated/Models/ModerationResult.cs @@ -0,0 +1,39 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Moderations +{ + public partial class ModerationResult + { + internal IDictionary SerializedAdditionalRawData { get; set; } + internal ModerationResult(bool flagged, ModerationCategories categories, ModerationCategoryScores categoryScores) + { + Argument.AssertNotNull(categories, nameof(categories)); + Argument.AssertNotNull(categoryScores, nameof(categoryScores)); + + Flagged = 
flagged;
+            Categories = categories;
+            CategoryScores = categoryScores;
+        }
+
+        internal ModerationResult(bool flagged, ModerationCategories categories, ModerationCategoryScores categoryScores, IDictionary<string, BinaryData> serializedAdditionalRawData)
+        {
+            Flagged = flagged;
+            Categories = categories;
+            CategoryScores = categoryScores;
+            SerializedAdditionalRawData = serializedAdditionalRawData;
+        }
+
+        internal ModerationResult()
+        {
+        }
+
+        public bool Flagged { get; }
+        public ModerationCategories Categories { get; }
+        public ModerationCategoryScores CategoryScores { get; }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/OpenAIError.Serialization.cs b/.dotnet/src/Generated/Models/OpenAIError.Serialization.cs
new file mode 100644
index 000000000..6bc1f6fc5
--- /dev/null
+++ b/.dotnet/src/Generated/Models/OpenAIError.Serialization.cs
@@ -0,0 +1,190 @@
+//
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace OpenAI.Internal
+{
+    internal partial class OpenAIError : IJsonModel<OpenAIError>
+    {
+        void IJsonModel<OpenAIError>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(OpenAIError)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("code") != true) + { + if (Code != null) + { + writer.WritePropertyName("code"u8); + writer.WriteStringValue(Code); + } + else + { + writer.WriteNull("code"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("message") != true) + { + writer.WritePropertyName("message"u8); + writer.WriteStringValue(Message); + } + if (SerializedAdditionalRawData?.ContainsKey("param") != true) + { + if (Param != null) + { + writer.WritePropertyName("param"u8); + writer.WriteStringValue(Param); + } + else + { + writer.WriteNull("param"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + OpenAIError IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(OpenAIError)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeOpenAIError(document.RootElement, options); + } + + internal static OpenAIError DeserializeOpenAIError(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string code = default; + string message = default; + string param = default; + string type = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("code"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + code = null; + continue; + } + code = property.Value.GetString(); + continue; + } + if (property.NameEquals("message"u8)) + { + message = property.Value.GetString(); + continue; + } + if (property.NameEquals("param"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + param = null; + continue; + } + param = property.Value.GetString(); + continue; + } + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new OpenAIError(code, message, param, type, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
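+            // Usage sketch (assumed, based on the System.ClientModel ModelReaderWriter
+            // pattern; "error" is a hypothetical OpenAIError instance, not part of this file):
+            //   BinaryData data = ModelReaderWriter.Write(error);
+            //   OpenAIError roundTripped = ModelReaderWriter.Read<OpenAIError>(data);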
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(OpenAIError)} does not support writing '{options.Format}' format."); + } + } + + OpenAIError IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeOpenAIError(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(OpenAIError)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static OpenAIError FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeOpenAIError(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/OpenAIError.cs b/.dotnet/src/Generated/Models/OpenAIError.cs new file mode 100644 index 000000000..7b94f06f3 --- /dev/null +++ b/.dotnet/src/Generated/Models/OpenAIError.cs @@ -0,0 +1,42 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Internal +{ + internal partial class OpenAIError + { + internal IDictionary SerializedAdditionalRawData { get; set; } + internal OpenAIError(string code, string message, string param, string type) + { + Argument.AssertNotNull(message, nameof(message)); + Argument.AssertNotNull(type, nameof(type)); + + Code = code; + Message = message; + Param = param; + Type = type; + } + + internal OpenAIError(string code, string message, 
string param, string type, IDictionary<string, BinaryData> serializedAdditionalRawData)
+        {
+            Code = code;
+            Message = message;
+            Param = param;
+            Type = type;
+            SerializedAdditionalRawData = serializedAdditionalRawData;
+        }
+
+        internal OpenAIError()
+        {
+        }
+
+        public string Code { get; }
+        public string Message { get; }
+        public string Param { get; }
+        public string Type { get; }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/OpenAIErrorResponse.Serialization.cs b/.dotnet/src/Generated/Models/OpenAIErrorResponse.Serialization.cs
new file mode 100644
index 000000000..e78630aea
--- /dev/null
+++ b/.dotnet/src/Generated/Models/OpenAIErrorResponse.Serialization.cs
@@ -0,0 +1,133 @@
+//
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace OpenAI.Internal
+{
+    internal partial class OpenAIErrorResponse : IJsonModel<OpenAIErrorResponse>
+    {
+        void IJsonModel<OpenAIErrorResponse>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(OpenAIErrorResponse)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("error") != true) + { + writer.WritePropertyName("error"u8); + writer.WriteObjectValue(Error, options); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + OpenAIErrorResponse IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(OpenAIErrorResponse)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeOpenAIErrorResponse(document.RootElement, options); + } + + internal static OpenAIErrorResponse DeserializeOpenAIErrorResponse(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + OpenAIError error = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("error"u8)) + { + error = OpenAIError.DeserializeOpenAIError(property.Value, options); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new OpenAIErrorResponse(error, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(OpenAIErrorResponse)} does not support writing '{options.Format}' format."); + } + } + + OpenAIErrorResponse IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<OpenAIErrorResponse>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    {
+                        using JsonDocument document = JsonDocument.Parse(data);
+                        return DeserializeOpenAIErrorResponse(document.RootElement, options);
+                    }
+                default:
+                    throw new FormatException($"The model {nameof(OpenAIErrorResponse)} does not support reading '{options.Format}' format.");
+            }
+        }
+
+        string IPersistableModel<OpenAIErrorResponse>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J";
+
+        internal static OpenAIErrorResponse FromResponse(PipelineResponse response)
+        {
+            using var document = JsonDocument.Parse(response.Content);
+            return DeserializeOpenAIErrorResponse(document.RootElement);
+        }
+
+        internal virtual BinaryContent ToBinaryContent()
+        {
+            return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions);
+        }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/OpenAIErrorResponse.cs b/.dotnet/src/Generated/Models/OpenAIErrorResponse.cs
new file mode 100644
index 000000000..764dd3854
--- /dev/null
+++ b/.dotnet/src/Generated/Models/OpenAIErrorResponse.cs
@@ -0,0 +1,32 @@
+//
+
+#nullable disable
+
+using System;
+using System.Collections.Generic;
+
+namespace OpenAI.Internal
+{
+    internal partial class OpenAIErrorResponse
+    {
+        internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; }
+        internal OpenAIErrorResponse(OpenAIError error)
+        {
+            Argument.AssertNotNull(error, nameof(error));
+
+            Error = error;
+        }
+
+        internal OpenAIErrorResponse(OpenAIError error, IDictionary<string, BinaryData> serializedAdditionalRawData)
+        {
+            Error = error;
+            SerializedAdditionalRawData = serializedAdditionalRawData;
+        }
+
+        internal OpenAIErrorResponse()
+        {
+        }
+
+        public OpenAIError Error { get; }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/OpenAIFileInfo.Serialization.cs b/.dotnet/src/Generated/Models/OpenAIFileInfo.Serialization.cs
new file mode 100644
index 000000000..fcd6544a3
--- /dev/null
+++ b/.dotnet/src/Generated/Models/OpenAIFileInfo.Serialization.cs
@@ -0,0 +1,231 @@
+// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Files +{ + public partial class OpenAIFileInfo : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(OpenAIFileInfo)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("id") != true) + { + writer.WritePropertyName("id"u8); + writer.WriteStringValue(Id); + } + if (SerializedAdditionalRawData?.ContainsKey("bytes") != true) + { + if (SizeInBytes != null) + { + writer.WritePropertyName("bytes"u8); + writer.WriteNumberValue(SizeInBytes.Value); + } + else + { + writer.WriteNull("bytes"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("created_at") != true) + { + writer.WritePropertyName("created_at"u8); + writer.WriteNumberValue(CreatedAt, "U"); + } + if (SerializedAdditionalRawData?.ContainsKey("filename") != true) + { + writer.WritePropertyName("filename"u8); + writer.WriteStringValue(Filename); + } + if (SerializedAdditionalRawData?.ContainsKey("object") != true) + { + writer.WritePropertyName("object"u8); + writer.WriteStringValue(Object.ToString()); + } + if (SerializedAdditionalRawData?.ContainsKey("purpose") != true) + { + writer.WritePropertyName("purpose"u8); + writer.WriteStringValue(Purpose.ToString()); + } + if (SerializedAdditionalRawData?.ContainsKey("status") != true) + { + writer.WritePropertyName("status"u8); + writer.WriteStringValue(Status.ToString()); + } + if (SerializedAdditionalRawData?.ContainsKey("status_details") != true && Optional.IsDefined(StatusDetails)) + { + writer.WritePropertyName("status_details"u8); + 
writer.WriteStringValue(StatusDetails); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + OpenAIFileInfo IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(OpenAIFileInfo)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeOpenAIFileInfo(document.RootElement, options); + } + + internal static OpenAIFileInfo DeserializeOpenAIFileInfo(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string id = default; + int? 
bytes = default; + DateTimeOffset createdAt = default; + string filename = default; + InternalOpenAIFileObject @object = default; + OpenAIFilePurpose purpose = default; + OpenAIFileStatus status = default; + string statusDetails = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("id"u8)) + { + id = property.Value.GetString(); + continue; + } + if (property.NameEquals("bytes"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + bytes = null; + continue; + } + bytes = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("created_at"u8)) + { + createdAt = DateTimeOffset.FromUnixTimeSeconds(property.Value.GetInt64()); + continue; + } + if (property.NameEquals("filename"u8)) + { + filename = property.Value.GetString(); + continue; + } + if (property.NameEquals("object"u8)) + { + @object = new InternalOpenAIFileObject(property.Value.GetString()); + continue; + } + if (property.NameEquals("purpose"u8)) + { + purpose = new OpenAIFilePurpose(property.Value.GetString()); + continue; + } + if (property.NameEquals("status"u8)) + { + status = new OpenAIFileStatus(property.Value.GetString()); + continue; + } + if (property.NameEquals("status_details"u8)) + { + statusDetails = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new OpenAIFileInfo( + id, + bytes, + createdAt, + filename, + @object, + purpose, + status, + statusDetails, + serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(OpenAIFileInfo)} does not support writing '{options.Format}' format."); + } + } + + OpenAIFileInfo IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeOpenAIFileInfo(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(OpenAIFileInfo)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static OpenAIFileInfo FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeOpenAIFileInfo(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/OpenAIFileInfo.cs b/.dotnet/src/Generated/Models/OpenAIFileInfo.cs new file mode 100644 index 000000000..6643d5ab8 --- /dev/null +++ b/.dotnet/src/Generated/Models/OpenAIFileInfo.cs @@ -0,0 +1,51 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Files +{ + public partial class OpenAIFileInfo + { + internal IDictionary SerializedAdditionalRawData { get; set; } + internal OpenAIFileInfo(string id, int? 
sizeInBytes, DateTimeOffset createdAt, string filename, OpenAIFilePurpose purpose, OpenAIFileStatus status)
+        {
+            Argument.AssertNotNull(id, nameof(id));
+            Argument.AssertNotNull(filename, nameof(filename));
+
+            Id = id;
+            SizeInBytes = sizeInBytes;
+            CreatedAt = createdAt;
+            Filename = filename;
+            Purpose = purpose;
+            Status = status;
+        }
+
+        internal OpenAIFileInfo(string id, int? sizeInBytes, DateTimeOffset createdAt, string filename, InternalOpenAIFileObject @object, OpenAIFilePurpose purpose, OpenAIFileStatus status, string statusDetails, IDictionary<string, BinaryData> serializedAdditionalRawData)
+        {
+            Id = id;
+            SizeInBytes = sizeInBytes;
+            CreatedAt = createdAt;
+            Filename = filename;
+            Object = @object;
+            Purpose = purpose;
+            Status = status;
+            StatusDetails = statusDetails;
+            SerializedAdditionalRawData = serializedAdditionalRawData;
+        }
+
+        internal OpenAIFileInfo()
+        {
+        }
+
+        public string Id { get; }
+        public DateTimeOffset CreatedAt { get; }
+        public string Filename { get; }
+
+        public OpenAIFilePurpose Purpose { get; }
+        public OpenAIFileStatus Status { get; }
+        public string StatusDetails { get; }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/OpenAIFileInfoCollection.Serialization.cs b/.dotnet/src/Generated/Models/OpenAIFileInfoCollection.Serialization.cs
new file mode 100644
index 000000000..5cd5d039a
--- /dev/null
+++ b/.dotnet/src/Generated/Models/OpenAIFileInfoCollection.Serialization.cs
@@ -0,0 +1,68 @@
+//
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Text.Json;
+
+namespace OpenAI.Files
+{
+    public partial class OpenAIFileInfoCollection : IJsonModel<OpenAIFileInfoCollection>
+    {
+        OpenAIFileInfoCollection IJsonModel<OpenAIFileInfoCollection>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(OpenAIFileInfoCollection)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeOpenAIFileInfoCollection(document.RootElement, options); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(OpenAIFileInfoCollection)} does not support writing '{options.Format}' format."); + } + } + + OpenAIFileInfoCollection IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeOpenAIFileInfoCollection(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(OpenAIFileInfoCollection)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static OpenAIFileInfoCollection FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeOpenAIFileInfoCollection(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/OpenAIFileInfoCollection.cs b/.dotnet/src/Generated/Models/OpenAIFileInfoCollection.cs new file mode 100644 index 000000000..c1887f232 --- 
/dev/null
+++ b/.dotnet/src/Generated/Models/OpenAIFileInfoCollection.cs
@@ -0,0 +1,14 @@
+//
+
+#nullable disable
+
+using System.Collections.Generic;
+using System.Collections.ObjectModel;
+using System.Linq;
+
+namespace OpenAI.Files
+{
+    public partial class OpenAIFileInfoCollection : ReadOnlyCollection<OpenAIFileInfo>
+    {
+    }
+}
diff --git a/.dotnet/src/Generated/Models/OpenAIFilePurpose.cs b/.dotnet/src/Generated/Models/OpenAIFilePurpose.cs
new file mode 100644
index 000000000..71c6cfbf6
--- /dev/null
+++ b/.dotnet/src/Generated/Models/OpenAIFilePurpose.cs
@@ -0,0 +1,46 @@
+//
+
+#nullable disable
+
+using System;
+using System.ComponentModel;
+
+namespace OpenAI.Files
+{
+    public readonly partial struct OpenAIFilePurpose : IEquatable<OpenAIFilePurpose>
+    {
+        private readonly string _value;
+
+        public OpenAIFilePurpose(string value)
+        {
+            _value = value ?? throw new ArgumentNullException(nameof(value));
+        }
+
+        private const string AssistantsValue = "assistants";
+        private const string AssistantsOutputValue = "assistants_output";
+        private const string BatchValue = "batch";
+        private const string BatchOutputValue = "batch_output";
+        private const string FineTuneValue = "fine-tune";
+        private const string FineTuneResultsValue = "fine-tune-results";
+        private const string VisionValue = "vision";
+
+        public static OpenAIFilePurpose Assistants { get; } = new OpenAIFilePurpose(AssistantsValue);
+        public static OpenAIFilePurpose AssistantsOutput { get; } = new OpenAIFilePurpose(AssistantsOutputValue);
+        public static OpenAIFilePurpose Batch { get; } = new OpenAIFilePurpose(BatchValue);
+        public static OpenAIFilePurpose BatchOutput { get; } = new OpenAIFilePurpose(BatchOutputValue);
+        public static OpenAIFilePurpose FineTune { get; } = new OpenAIFilePurpose(FineTuneValue);
+        public static OpenAIFilePurpose FineTuneResults { get; } = new OpenAIFilePurpose(FineTuneResultsValue);
+        public static OpenAIFilePurpose Vision { get; } = new OpenAIFilePurpose(VisionValue);
+        public static bool operator 
==(OpenAIFilePurpose left, OpenAIFilePurpose right) => left.Equals(right); + public static bool operator !=(OpenAIFilePurpose left, OpenAIFilePurpose right) => !left.Equals(right); + public static implicit operator OpenAIFilePurpose(string value) => new OpenAIFilePurpose(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is OpenAIFilePurpose other && Equals(other); + public bool Equals(OpenAIFilePurpose other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/OpenAIFileStatus.cs b/.dotnet/src/Generated/Models/OpenAIFileStatus.cs new file mode 100644 index 000000000..9176c8618 --- /dev/null +++ b/.dotnet/src/Generated/Models/OpenAIFileStatus.cs @@ -0,0 +1,38 @@ +// + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Files +{ + public readonly partial struct OpenAIFileStatus : IEquatable + { + private readonly string _value; + + public OpenAIFileStatus(string value) + { + _value = value ?? 
throw new ArgumentNullException(nameof(value)); + } + + private const string UploadedValue = "uploaded"; + private const string ProcessedValue = "processed"; + private const string ErrorValue = "error"; + + public static OpenAIFileStatus Uploaded { get; } = new OpenAIFileStatus(UploadedValue); + public static OpenAIFileStatus Processed { get; } = new OpenAIFileStatus(ProcessedValue); + public static OpenAIFileStatus Error { get; } = new OpenAIFileStatus(ErrorValue); + public static bool operator ==(OpenAIFileStatus left, OpenAIFileStatus right) => left.Equals(right); + public static bool operator !=(OpenAIFileStatus left, OpenAIFileStatus right) => !left.Equals(right); + public static implicit operator OpenAIFileStatus(string value) => new OpenAIFileStatus(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is OpenAIFileStatus other && Equals(other); + public bool Equals(OpenAIFileStatus other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/OpenAIModelInfo.Serialization.cs b/.dotnet/src/Generated/Models/OpenAIModelInfo.Serialization.cs new file mode 100644 index 000000000..575c7e962 --- /dev/null +++ b/.dotnet/src/Generated/Models/OpenAIModelInfo.Serialization.cs @@ -0,0 +1,166 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Models +{ + public partial class OpenAIModelInfo : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(OpenAIModelInfo)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("id") != true) + { + writer.WritePropertyName("id"u8); + writer.WriteStringValue(Id); + } + if (SerializedAdditionalRawData?.ContainsKey("created") != true) + { + writer.WritePropertyName("created"u8); + writer.WriteNumberValue(CreatedAt, "U"); + } + if (SerializedAdditionalRawData?.ContainsKey("object") != true) + { + writer.WritePropertyName("object"u8); + writer.WriteStringValue(Object.ToString()); + } + if (SerializedAdditionalRawData?.ContainsKey("owned_by") != true) + { + writer.WritePropertyName("owned_by"u8); + writer.WriteStringValue(OwnedBy); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + OpenAIModelInfo IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(OpenAIModelInfo)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeOpenAIModelInfo(document.RootElement, options); + } + + internal static OpenAIModelInfo DeserializeOpenAIModelInfo(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string id = default; + DateTimeOffset created = default; + InternalModelObject @object = default; + string ownedBy = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("id"u8)) + { + id = property.Value.GetString(); + continue; + } + if (property.NameEquals("created"u8)) + { + created = DateTimeOffset.FromUnixTimeSeconds(property.Value.GetInt64()); + continue; + } + if (property.NameEquals("object"u8)) + { + @object = new InternalModelObject(property.Value.GetString()); + continue; + } + if (property.NameEquals("owned_by"u8)) + { + ownedBy = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new OpenAIModelInfo(id, created, @object, ownedBy, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(OpenAIModelInfo)} does not support writing '{options.Format}' format."); + } + } + + OpenAIModelInfo IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeOpenAIModelInfo(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(OpenAIModelInfo)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static OpenAIModelInfo FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeOpenAIModelInfo(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/OpenAIModelInfo.cs b/.dotnet/src/Generated/Models/OpenAIModelInfo.cs new file mode 100644 index 000000000..c838e75ab --- /dev/null +++ b/.dotnet/src/Generated/Models/OpenAIModelInfo.cs @@ -0,0 +1,40 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Models +{ + public partial class OpenAIModelInfo + { + internal IDictionary SerializedAdditionalRawData { get; set; } + internal OpenAIModelInfo(string id, DateTimeOffset createdAt, string ownedBy) + { + Argument.AssertNotNull(id, nameof(id)); + Argument.AssertNotNull(ownedBy, nameof(ownedBy)); + + Id = id; + CreatedAt = createdAt; + OwnedBy = ownedBy; + } + + internal 
OpenAIModelInfo(string id, DateTimeOffset createdAt, InternalModelObject @object, string ownedBy, IDictionary serializedAdditionalRawData) + { + Id = id; + CreatedAt = createdAt; + Object = @object; + OwnedBy = ownedBy; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal OpenAIModelInfo() + { + } + + public string Id { get; } + + public string OwnedBy { get; } + } +} diff --git a/.dotnet/src/Generated/Models/OpenAIModelInfoCollection.Serialization.cs b/.dotnet/src/Generated/Models/OpenAIModelInfoCollection.Serialization.cs new file mode 100644 index 000000000..f9790a97a --- /dev/null +++ b/.dotnet/src/Generated/Models/OpenAIModelInfoCollection.Serialization.cs @@ -0,0 +1,68 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Text.Json; + +namespace OpenAI.Models +{ + public partial class OpenAIModelInfoCollection : IJsonModel + { + OpenAIModelInfoCollection IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(OpenAIModelInfoCollection)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeOpenAIModelInfoCollection(document.RootElement, options); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<OpenAIModelInfoCollection>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    return ModelReaderWriter.Write(this, options);
+                default:
+                    throw new FormatException($"The model {nameof(OpenAIModelInfoCollection)} does not support writing '{options.Format}' format.");
+            }
+        }
+
+        OpenAIModelInfoCollection IPersistableModel<OpenAIModelInfoCollection>.Create(BinaryData data, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<OpenAIModelInfoCollection>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    {
+                        using JsonDocument document = JsonDocument.Parse(data);
+                        return DeserializeOpenAIModelInfoCollection(document.RootElement, options);
+                    }
+                default:
+                    throw new FormatException($"The model {nameof(OpenAIModelInfoCollection)} does not support reading '{options.Format}' format.");
+            }
+        }
+
+        string IPersistableModel<OpenAIModelInfoCollection>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J";
+
+        internal static OpenAIModelInfoCollection FromResponse(PipelineResponse response)
+        {
+            using var document = JsonDocument.Parse(response.Content);
+            return DeserializeOpenAIModelInfoCollection(document.RootElement);
+        }
+
+        internal virtual BinaryContent ToBinaryContent()
+        {
+            return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions);
+        }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/OpenAIModelInfoCollection.cs b/.dotnet/src/Generated/Models/OpenAIModelInfoCollection.cs
new file mode 100644
index 000000000..f917c20de
--- /dev/null
+++ b/.dotnet/src/Generated/Models/OpenAIModelInfoCollection.cs
@@ -0,0 +1,14 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System.Collections.Generic;
+using System.Collections.ObjectModel;
+using System.Linq;
+
+namespace OpenAI.Models
+{
+    public partial class OpenAIModelInfoCollection : ReadOnlyCollection<OpenAIModelInfo>
+    {
+    }
+}
diff --git a/.dotnet/src/Generated/Models/RunCreationOptions.Serialization.cs b/.dotnet/src/Generated/Models/RunCreationOptions.Serialization.cs
new file mode 100644
index 000000000..2c0dc45c8 --- /dev/null +++ b/.dotnet/src/Generated/Models/RunCreationOptions.Serialization.cs @@ -0,0 +1,515 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + public partial class RunCreationOptions : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(RunCreationOptions)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("assistant_id") != true) + { + writer.WritePropertyName("assistant_id"u8); + writer.WriteStringValue(AssistantId); + } + if (SerializedAdditionalRawData?.ContainsKey("model") != true && Optional.IsDefined(ModelOverride)) + { + if (ModelOverride != null) + { + writer.WritePropertyName("model"u8); + writer.WriteStringValue(ModelOverride); + } + else + { + writer.WriteNull("model"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("instructions") != true && Optional.IsDefined(InstructionsOverride)) + { + if (InstructionsOverride != null) + { + writer.WritePropertyName("instructions"u8); + writer.WriteStringValue(InstructionsOverride); + } + else + { + writer.WriteNull("instructions"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("additional_instructions") != true && Optional.IsDefined(AdditionalInstructions)) + { + if (AdditionalInstructions != null) + { + writer.WritePropertyName("additional_instructions"u8); + writer.WriteStringValue(AdditionalInstructions); + } + else + { + writer.WriteNull("additional_instructions"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("additional_messages") != true && Optional.IsCollectionDefined(InternalMessages)) + 
{ + if (InternalMessages != null) + { + writer.WritePropertyName("additional_messages"u8); + writer.WriteStartArray(); + foreach (var item in InternalMessages) + { + writer.WriteObjectValue(item, options); + } + writer.WriteEndArray(); + } + else + { + writer.WriteNull("additional_messages"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("tools") != true && Optional.IsCollectionDefined(ToolsOverride)) + { + if (ToolsOverride != null) + { + writer.WritePropertyName("tools"u8); + writer.WriteStartArray(); + foreach (var item in ToolsOverride) + { + writer.WriteObjectValue(item, options); + } + writer.WriteEndArray(); + } + else + { + writer.WriteNull("tools"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("metadata") != true && Optional.IsCollectionDefined(Metadata)) + { + if (Metadata != null) + { + writer.WritePropertyName("metadata"u8); + writer.WriteStartObject(); + foreach (var item in Metadata) + { + writer.WritePropertyName(item.Key); + writer.WriteStringValue(item.Value); + } + writer.WriteEndObject(); + } + else + { + writer.WriteNull("metadata"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("temperature") != true && Optional.IsDefined(Temperature)) + { + if (Temperature != null) + { + writer.WritePropertyName("temperature"u8); + writer.WriteNumberValue(Temperature.Value); + } + else + { + writer.WriteNull("temperature"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("top_p") != true && Optional.IsDefined(NucleusSamplingFactor)) + { + if (NucleusSamplingFactor != null) + { + writer.WritePropertyName("top_p"u8); + writer.WriteNumberValue(NucleusSamplingFactor.Value); + } + else + { + writer.WriteNull("top_p"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("stream") != true && Optional.IsDefined(Stream)) + { + if (Stream != null) + { + writer.WritePropertyName("stream"u8); + writer.WriteBooleanValue(Stream.Value); + } + else + { + writer.WriteNull("stream"); + } + } + if 
(SerializedAdditionalRawData?.ContainsKey("max_prompt_tokens") != true && Optional.IsDefined(MaxPromptTokens)) + { + if (MaxPromptTokens != null) + { + writer.WritePropertyName("max_prompt_tokens"u8); + writer.WriteNumberValue(MaxPromptTokens.Value); + } + else + { + writer.WriteNull("max_prompt_tokens"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("max_completion_tokens") != true && Optional.IsDefined(MaxCompletionTokens)) + { + if (MaxCompletionTokens != null) + { + writer.WritePropertyName("max_completion_tokens"u8); + writer.WriteNumberValue(MaxCompletionTokens.Value); + } + else + { + writer.WriteNull("max_completion_tokens"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("truncation_strategy") != true && Optional.IsDefined(TruncationStrategy)) + { + if (TruncationStrategy != null) + { + writer.WritePropertyName("truncation_strategy"u8); + writer.WriteObjectValue(TruncationStrategy, options); + } + else + { + writer.WriteNull("truncation_strategy"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("tool_choice") != true && Optional.IsDefined(ToolConstraint)) + { + if (ToolConstraint != null) + { + writer.WritePropertyName("tool_choice"u8); + SerializeToolConstraint(writer, options); + } + else + { + writer.WriteNull("tool_choice"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("parallel_tool_calls") != true && Optional.IsDefined(ParallelToolCallsEnabled)) + { + writer.WritePropertyName("parallel_tool_calls"u8); + writer.WriteBooleanValue(ParallelToolCallsEnabled.Value); + } + if (SerializedAdditionalRawData?.ContainsKey("response_format") != true && Optional.IsDefined(ResponseFormat)) + { + if (ResponseFormat != null) + { + writer.WritePropertyName("response_format"u8); + writer.WriteObjectValue(ResponseFormat, options); + } + else + { + writer.WriteNull("response_format"); + } + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if 
(ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + RunCreationOptions IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(RunCreationOptions)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeRunCreationOptions(document.RootElement, options); + } + + internal static RunCreationOptions DeserializeRunCreationOptions(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string assistantId = default; + string model = default; + string instructions = default; + string additionalInstructions = default; + IList additionalMessages = default; + IList tools = default; + IDictionary metadata = default; + float? temperature = default; + float? topP = default; + bool? stream = default; + int? maxPromptTokens = default; + int? maxCompletionTokens = default; + RunTruncationStrategy truncationStrategy = default; + ToolConstraint toolChoice = default; + bool? 
parallelToolCalls = default; + AssistantResponseFormat responseFormat = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("assistant_id"u8)) + { + assistantId = property.Value.GetString(); + continue; + } + if (property.NameEquals("model"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + model = null; + continue; + } + model = property.Value.GetString(); + continue; + } + if (property.NameEquals("instructions"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + instructions = null; + continue; + } + instructions = property.Value.GetString(); + continue; + } + if (property.NameEquals("additional_instructions"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + additionalInstructions = null; + continue; + } + additionalInstructions = property.Value.GetString(); + continue; + } + if (property.NameEquals("additional_messages"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + List array = new List(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(MessageCreationOptions.DeserializeMessageCreationOptions(item, options)); + } + additionalMessages = array; + continue; + } + if (property.NameEquals("tools"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + List array = new List(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(ToolDefinition.DeserializeToolDefinition(item, options)); + } + tools = array; + continue; + } + if (property.NameEquals("metadata"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + Dictionary dictionary = new Dictionary(); + foreach (var property0 in property.Value.EnumerateObject()) + { + dictionary.Add(property0.Name, property0.Value.GetString()); + } + metadata = dictionary; + continue; + } + if 
(property.NameEquals("temperature"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + temperature = null; + continue; + } + temperature = property.Value.GetSingle(); + continue; + } + if (property.NameEquals("top_p"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + topP = null; + continue; + } + topP = property.Value.GetSingle(); + continue; + } + if (property.NameEquals("stream"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + stream = null; + continue; + } + stream = property.Value.GetBoolean(); + continue; + } + if (property.NameEquals("max_prompt_tokens"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + maxPromptTokens = null; + continue; + } + maxPromptTokens = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("max_completion_tokens"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + maxCompletionTokens = null; + continue; + } + maxCompletionTokens = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("truncation_strategy"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + truncationStrategy = null; + continue; + } + truncationStrategy = RunTruncationStrategy.DeserializeRunTruncationStrategy(property.Value, options); + continue; + } + if (property.NameEquals("tool_choice"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + toolChoice = null; + continue; + } + toolChoice = Assistants.ToolConstraint.DeserializeToolConstraint(property.Value, options); + continue; + } + if (property.NameEquals("parallel_tool_calls"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + parallelToolCalls = property.Value.GetBoolean(); + continue; + } + if (property.NameEquals("response_format"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + responseFormat = null; + continue; + } + responseFormat = AssistantResponseFormat.DeserializeAssistantResponseFormat(property.Value, 
options); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new RunCreationOptions( + assistantId, + model, + instructions, + additionalInstructions, + additionalMessages ?? new ChangeTrackingList(), + tools ?? new ChangeTrackingList(), + metadata ?? new ChangeTrackingDictionary(), + temperature, + topP, + stream, + maxPromptTokens, + maxCompletionTokens, + truncationStrategy, + toolChoice, + parallelToolCalls, + responseFormat, + serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(RunCreationOptions)} does not support writing '{options.Format}' format."); + } + } + + RunCreationOptions IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeRunCreationOptions(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(RunCreationOptions)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static RunCreationOptions FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeRunCreationOptions(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/RunCreationOptions.cs b/.dotnet/src/Generated/Models/RunCreationOptions.cs new file mode 100644 index 000000000..9e0d88fa7 --- /dev/null +++ b/.dotnet/src/Generated/Models/RunCreationOptions.cs @@ -0,0 +1,35 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + public partial class RunCreationOptions + { + internal IDictionary SerializedAdditionalRawData { get; set; } + + internal RunCreationOptions(string assistantId, string modelOverride, string instructionsOverride, string additionalInstructions, IList internalMessages, IList toolsOverride, IDictionary metadata, float? temperature, float? nucleusSamplingFactor, bool? stream, int? maxPromptTokens, int? maxCompletionTokens, RunTruncationStrategy truncationStrategy, ToolConstraint toolConstraint, bool? 
parallelToolCallsEnabled, AssistantResponseFormat responseFormat, IDictionary serializedAdditionalRawData) + { + AssistantId = assistantId; + ModelOverride = modelOverride; + InstructionsOverride = instructionsOverride; + AdditionalInstructions = additionalInstructions; + InternalMessages = internalMessages; + ToolsOverride = toolsOverride; + Metadata = metadata; + Temperature = temperature; + NucleusSamplingFactor = nucleusSamplingFactor; + Stream = stream; + MaxPromptTokens = maxPromptTokens; + MaxCompletionTokens = maxCompletionTokens; + TruncationStrategy = truncationStrategy; + ToolConstraint = toolConstraint; + ParallelToolCallsEnabled = parallelToolCallsEnabled; + ResponseFormat = responseFormat; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + } +} diff --git a/.dotnet/src/Generated/Models/RunError.Serialization.cs b/.dotnet/src/Generated/Models/RunError.Serialization.cs new file mode 100644 index 000000000..df0a592e1 --- /dev/null +++ b/.dotnet/src/Generated/Models/RunError.Serialization.cs @@ -0,0 +1,144 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + public partial class RunError : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(RunError)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("code") != true) + { + writer.WritePropertyName("code"u8); + writer.WriteStringValue(Code.ToString()); + } + if (SerializedAdditionalRawData?.ContainsKey("message") != true) + { + writer.WritePropertyName("message"u8); + writer.WriteStringValue(Message); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + RunError IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(RunError)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeRunError(document.RootElement, options); + } + + internal static RunError DeserializeRunError(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + RunErrorCode code = default; + string message = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("code"u8)) + { + code = new RunErrorCode(property.Value.GetString()); + continue; + } + if (property.NameEquals("message"u8)) + { + message = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new RunError(code, message, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(RunError)} does not support writing '{options.Format}' format."); + } + } + + RunError IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeRunError(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(RunError)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static RunError FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeRunError(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/RunError.cs b/.dotnet/src/Generated/Models/RunError.cs new file mode 100644 index 000000000..8404fbbe2 --- /dev/null +++ b/.dotnet/src/Generated/Models/RunError.cs @@ -0,0 +1,35 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + public partial class RunError + { + internal IDictionary SerializedAdditionalRawData { get; set; } + internal RunError(RunErrorCode code, string message) + { + Argument.AssertNotNull(message, nameof(message)); + + Code = code; + Message = message; + } + + internal RunError(RunErrorCode code, string message, IDictionary serializedAdditionalRawData) + { + Code = code; + Message = message; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal RunError() + { + } + + public RunErrorCode Code { get; } + public string Message { get; } + } +} diff --git a/.dotnet/src/Generated/Models/RunErrorCode.cs b/.dotnet/src/Generated/Models/RunErrorCode.cs new file mode 100644 index 000000000..815942d5e --- /dev/null +++ b/.dotnet/src/Generated/Models/RunErrorCode.cs @@ -0,0 +1,38 @@ +// + +#nullable disable + +using System; +using 
System.ComponentModel;
+
+namespace OpenAI.Assistants
+{
+    public readonly partial struct RunErrorCode : IEquatable<RunErrorCode>
+    {
+        private readonly string _value;
+
+        public RunErrorCode(string value)
+        {
+            _value = value ?? throw new ArgumentNullException(nameof(value));
+        }
+
+        private const string ServerErrorValue = "server_error";
+        private const string RateLimitExceededValue = "rate_limit_exceeded";
+        private const string InvalidPromptValue = "invalid_prompt";
+
+        public static RunErrorCode ServerError { get; } = new RunErrorCode(ServerErrorValue);
+        public static RunErrorCode RateLimitExceeded { get; } = new RunErrorCode(RateLimitExceededValue);
+        public static RunErrorCode InvalidPrompt { get; } = new RunErrorCode(InvalidPromptValue);
+        public static bool operator ==(RunErrorCode left, RunErrorCode right) => left.Equals(right);
+        public static bool operator !=(RunErrorCode left, RunErrorCode right) => !left.Equals(right);
+        public static implicit operator RunErrorCode(string value) => new RunErrorCode(value);
+
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override bool Equals(object obj) => obj is RunErrorCode other && Equals(other);
+        public bool Equals(RunErrorCode other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase);
+
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override int GetHashCode() => _value != null ? 
StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0;
+        public override string ToString() => _value;
+    }
+}
diff --git a/.dotnet/src/Generated/Models/RunIncompleteDetails.Serialization.cs b/.dotnet/src/Generated/Models/RunIncompleteDetails.Serialization.cs
new file mode 100644
index 000000000..097376a17
--- /dev/null
+++ b/.dotnet/src/Generated/Models/RunIncompleteDetails.Serialization.cs
@@ -0,0 +1,137 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace OpenAI.Assistants
+{
+    public partial class RunIncompleteDetails : IJsonModel<RunIncompleteDetails>
+    {
+        void IJsonModel<RunIncompleteDetails>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<RunIncompleteDetails>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(RunIncompleteDetails)} does not support writing '{format}' format.");
+            }
+
+            writer.WriteStartObject();
+            if (SerializedAdditionalRawData?.ContainsKey("reason") != true && Optional.IsDefined(Reason))
+            {
+                writer.WritePropertyName("reason"u8);
+                writer.WriteStringValue(Reason.Value.ToString());
+            }
+            if (SerializedAdditionalRawData != null)
+            {
+                foreach (var item in SerializedAdditionalRawData)
+                {
+                    if (ModelSerializationExtensions.IsSentinelValue(item.Value))
+                    {
+                        continue;
+                    }
+                    writer.WritePropertyName(item.Key);
+#if NET6_0_OR_GREATER
+                    writer.WriteRawValue(item.Value);
+#else
+                    using (JsonDocument document = JsonDocument.Parse(item.Value))
+                    {
+                        JsonSerializer.Serialize(writer, document.RootElement);
+                    }
+#endif
+                }
+            }
+            writer.WriteEndObject();
+        }
+
+        RunIncompleteDetails IJsonModel<RunIncompleteDetails>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? 
((IPersistableModel<RunIncompleteDetails>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(RunIncompleteDetails)} does not support reading '{format}' format.");
+            }
+
+            using JsonDocument document = JsonDocument.ParseValue(ref reader);
+            return DeserializeRunIncompleteDetails(document.RootElement, options);
+        }
+
+        internal static RunIncompleteDetails DeserializeRunIncompleteDetails(JsonElement element, ModelReaderWriterOptions options = null)
+        {
+            options ??= ModelSerializationExtensions.WireOptions;
+
+            if (element.ValueKind == JsonValueKind.Null)
+            {
+                return null;
+            }
+            RunIncompleteReason? reason = default;
+            IDictionary<string, BinaryData> serializedAdditionalRawData = default;
+            Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>();
+            foreach (var property in element.EnumerateObject())
+            {
+                if (property.NameEquals("reason"u8))
+                {
+                    if (property.Value.ValueKind == JsonValueKind.Null)
+                    {
+                        continue;
+                    }
+                    reason = new RunIncompleteReason(property.Value.GetString());
+                    continue;
+                }
+                if (true)
+                {
+                    rawDataDictionary ??= new Dictionary<string, BinaryData>();
+                    rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText()));
+                }
+            }
+            serializedAdditionalRawData = rawDataDictionary;
+            return new RunIncompleteDetails(reason, serializedAdditionalRawData);
+        }
+
+        BinaryData IPersistableModel<RunIncompleteDetails>.Write(ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<RunIncompleteDetails>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    return ModelReaderWriter.Write(this, options);
+                default:
+                    throw new FormatException($"The model {nameof(RunIncompleteDetails)} does not support writing '{options.Format}' format.");
+            }
+        }
+
+        RunIncompleteDetails IPersistableModel<RunIncompleteDetails>.Create(BinaryData data, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? 
((IPersistableModel<RunIncompleteDetails>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    {
+                        using JsonDocument document = JsonDocument.Parse(data);
+                        return DeserializeRunIncompleteDetails(document.RootElement, options);
+                    }
+                default:
+                    throw new FormatException($"The model {nameof(RunIncompleteDetails)} does not support reading '{options.Format}' format.");
+            }
+        }
+
+        string IPersistableModel<RunIncompleteDetails>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J";
+
+        internal static RunIncompleteDetails FromResponse(PipelineResponse response)
+        {
+            using var document = JsonDocument.Parse(response.Content);
+            return DeserializeRunIncompleteDetails(document.RootElement);
+        }
+
+        internal virtual BinaryContent ToBinaryContent()
+        {
+            return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions);
+        }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/RunIncompleteDetails.cs b/.dotnet/src/Generated/Models/RunIncompleteDetails.cs
new file mode 100644
index 000000000..5c063d7c0
--- /dev/null
+++ b/.dotnet/src/Generated/Models/RunIncompleteDetails.cs
@@ -0,0 +1,25 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.Collections.Generic;
+
+namespace OpenAI.Assistants
+{
+    public partial class RunIncompleteDetails
+    {
+        internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; }
+        internal RunIncompleteDetails()
+        {
+        }
+
+        internal RunIncompleteDetails(RunIncompleteReason? reason, IDictionary<string, BinaryData> serializedAdditionalRawData)
+        {
+            Reason = reason;
+            SerializedAdditionalRawData = serializedAdditionalRawData;
+        }
+
+        public RunIncompleteReason? 
Reason { get; }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/RunIncompleteReason.cs b/.dotnet/src/Generated/Models/RunIncompleteReason.cs
new file mode 100644
index 000000000..b0baa9e8e
--- /dev/null
+++ b/.dotnet/src/Generated/Models/RunIncompleteReason.cs
@@ -0,0 +1,36 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ComponentModel;
+
+namespace OpenAI.Assistants
+{
+    public readonly partial struct RunIncompleteReason : IEquatable<RunIncompleteReason>
+    {
+        private readonly string _value;
+
+        public RunIncompleteReason(string value)
+        {
+            _value = value ?? throw new ArgumentNullException(nameof(value));
+        }
+
+        private const string MaxCompletionTokensValue = "max_completion_tokens";
+        private const string MaxPromptTokensValue = "max_prompt_tokens";
+
+        public static RunIncompleteReason MaxCompletionTokens { get; } = new RunIncompleteReason(MaxCompletionTokensValue);
+        public static RunIncompleteReason MaxPromptTokens { get; } = new RunIncompleteReason(MaxPromptTokensValue);
+        public static bool operator ==(RunIncompleteReason left, RunIncompleteReason right) => left.Equals(right);
+        public static bool operator !=(RunIncompleteReason left, RunIncompleteReason right) => !left.Equals(right);
+        public static implicit operator RunIncompleteReason(string value) => new RunIncompleteReason(value);
+
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override bool Equals(object obj) => obj is RunIncompleteReason other && Equals(other);
+        public bool Equals(RunIncompleteReason other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase);
+
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override int GetHashCode() => _value != null ? 
StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0;
+        public override string ToString() => _value;
+    }
+}
diff --git a/.dotnet/src/Generated/Models/RunModificationOptions.Serialization.cs b/.dotnet/src/Generated/Models/RunModificationOptions.Serialization.cs
new file mode 100644
index 000000000..f13f9b1a8
--- /dev/null
+++ b/.dotnet/src/Generated/Models/RunModificationOptions.Serialization.cs
@@ -0,0 +1,155 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace OpenAI.Assistants
+{
+    public partial class RunModificationOptions : IJsonModel<RunModificationOptions>
+    {
+        void IJsonModel<RunModificationOptions>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<RunModificationOptions>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(RunModificationOptions)} does not support writing '{format}' format.");
+            }
+
+            writer.WriteStartObject();
+            if (SerializedAdditionalRawData?.ContainsKey("metadata") != true && Optional.IsCollectionDefined(Metadata))
+            {
+                if (Metadata != null)
+                {
+                    writer.WritePropertyName("metadata"u8);
+                    writer.WriteStartObject();
+                    foreach (var item in Metadata)
+                    {
+                        writer.WritePropertyName(item.Key);
+                        writer.WriteStringValue(item.Value);
+                    }
+                    writer.WriteEndObject();
+                }
+                else
+                {
+                    writer.WriteNull("metadata");
+                }
+            }
+            if (SerializedAdditionalRawData != null)
+            {
+                foreach (var item in SerializedAdditionalRawData)
+                {
+                    if (ModelSerializationExtensions.IsSentinelValue(item.Value))
+                    {
+                        continue;
+                    }
+                    writer.WritePropertyName(item.Key);
+#if NET6_0_OR_GREATER
+                    writer.WriteRawValue(item.Value);
+#else
+                    using (JsonDocument document = JsonDocument.Parse(item.Value))
+                    {
+                        JsonSerializer.Serialize(writer, document.RootElement);
+                    }
+#endif
+                }
+            }
+            writer.WriteEndObject();
+        }
+
+        RunModificationOptions 
IJsonModel<RunModificationOptions>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<RunModificationOptions>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(RunModificationOptions)} does not support reading '{format}' format.");
+            }
+
+            using JsonDocument document = JsonDocument.ParseValue(ref reader);
+            return DeserializeRunModificationOptions(document.RootElement, options);
+        }
+
+        internal static RunModificationOptions DeserializeRunModificationOptions(JsonElement element, ModelReaderWriterOptions options = null)
+        {
+            options ??= ModelSerializationExtensions.WireOptions;
+
+            if (element.ValueKind == JsonValueKind.Null)
+            {
+                return null;
+            }
+            IDictionary<string, string> metadata = default;
+            IDictionary<string, BinaryData> serializedAdditionalRawData = default;
+            Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>();
+            foreach (var property in element.EnumerateObject())
+            {
+                if (property.NameEquals("metadata"u8))
+                {
+                    if (property.Value.ValueKind == JsonValueKind.Null)
+                    {
+                        continue;
+                    }
+                    Dictionary<string, string> dictionary = new Dictionary<string, string>();
+                    foreach (var property0 in property.Value.EnumerateObject())
+                    {
+                        dictionary.Add(property0.Name, property0.Value.GetString());
+                    }
+                    metadata = dictionary;
+                    continue;
+                }
+                if (true)
+                {
+                    rawDataDictionary ??= new Dictionary<string, BinaryData>();
+                    rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText()));
+                }
+            }
+            serializedAdditionalRawData = rawDataDictionary;
+            return new RunModificationOptions(metadata ?? new ChangeTrackingDictionary<string, string>(), serializedAdditionalRawData);
+        }
+
+        BinaryData IPersistableModel<RunModificationOptions>.Write(ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? 
((IPersistableModel<RunModificationOptions>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    return ModelReaderWriter.Write(this, options);
+                default:
+                    throw new FormatException($"The model {nameof(RunModificationOptions)} does not support writing '{options.Format}' format.");
+            }
+        }
+
+        RunModificationOptions IPersistableModel<RunModificationOptions>.Create(BinaryData data, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<RunModificationOptions>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    {
+                        using JsonDocument document = JsonDocument.Parse(data);
+                        return DeserializeRunModificationOptions(document.RootElement, options);
+                    }
+                default:
+                    throw new FormatException($"The model {nameof(RunModificationOptions)} does not support reading '{options.Format}' format.");
+            }
+        }
+
+        string IPersistableModel<RunModificationOptions>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J";
+
+        internal static RunModificationOptions FromResponse(PipelineResponse response)
+        {
+            using var document = JsonDocument.Parse(response.Content);
+            return DeserializeRunModificationOptions(document.RootElement);
+        }
+
+        internal virtual BinaryContent ToBinaryContent()
+        {
+            return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions);
+        }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/RunModificationOptions.cs b/.dotnet/src/Generated/Models/RunModificationOptions.cs
new file mode 100644
index 000000000..b9c29be48
--- /dev/null
+++ b/.dotnet/src/Generated/Models/RunModificationOptions.cs
@@ -0,0 +1,26 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.Collections.Generic;
+
+namespace OpenAI.Assistants
+{
+    public partial class RunModificationOptions
+    {
+        internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; }
+        public RunModificationOptions()
+        {
+            Metadata = new ChangeTrackingDictionary<string, string>();
+        }
+
+        internal RunModificationOptions(IDictionary<string, string> metadata, IDictionary<string, BinaryData> serializedAdditionalRawData)
+        {
+            Metadata 
= metadata;
+            SerializedAdditionalRawData = serializedAdditionalRawData;
+        }
+
+        public IDictionary<string, string> Metadata { get; set; }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/RunStatus.cs b/.dotnet/src/Generated/Models/RunStatus.cs
new file mode 100644
index 000000000..08c6af672
--- /dev/null
+++ b/.dotnet/src/Generated/Models/RunStatus.cs
@@ -0,0 +1,50 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ComponentModel;
+
+namespace OpenAI.Assistants
+{
+    public readonly partial struct RunStatus : IEquatable<RunStatus>
+    {
+        private readonly string _value;
+
+        public RunStatus(string value)
+        {
+            _value = value ?? throw new ArgumentNullException(nameof(value));
+        }
+
+        private const string QueuedValue = "queued";
+        private const string InProgressValue = "in_progress";
+        private const string RequiresActionValue = "requires_action";
+        private const string CancellingValue = "cancelling";
+        private const string CancelledValue = "cancelled";
+        private const string FailedValue = "failed";
+        private const string CompletedValue = "completed";
+        private const string IncompleteValue = "incomplete";
+        private const string ExpiredValue = "expired";
+
+        public static RunStatus Queued { get; } = new RunStatus(QueuedValue);
+        public static RunStatus InProgress { get; } = new RunStatus(InProgressValue);
+        public static RunStatus RequiresAction { get; } = new RunStatus(RequiresActionValue);
+        public static RunStatus Cancelling { get; } = new RunStatus(CancellingValue);
+        public static RunStatus Cancelled { get; } = new RunStatus(CancelledValue);
+        public static RunStatus Failed { get; } = new RunStatus(FailedValue);
+        public static RunStatus Completed { get; } = new RunStatus(CompletedValue);
+        public static RunStatus Incomplete { get; } = new RunStatus(IncompleteValue);
+        public static RunStatus Expired { get; } = new RunStatus(ExpiredValue);
+        public static bool operator ==(RunStatus left, RunStatus right) => left.Equals(right);
+        public static bool operator !=(RunStatus left, RunStatus 
right) => !left.Equals(right);
+        public static implicit operator RunStatus(string value) => new RunStatus(value);
+
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override bool Equals(object obj) => obj is RunStatus other && Equals(other);
+        public bool Equals(RunStatus other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase);
+
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0;
+        public override string ToString() => _value;
+    }
+}
diff --git a/.dotnet/src/Generated/Models/RunStep.Serialization.cs b/.dotnet/src/Generated/Models/RunStep.Serialization.cs
new file mode 100644
index 000000000..c31d89f57
--- /dev/null
+++ b/.dotnet/src/Generated/Models/RunStep.Serialization.cs
@@ -0,0 +1,410 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace OpenAI.Assistants
+{
+    public partial class RunStep : IJsonModel<RunStep>
+    {
+        void IJsonModel<RunStep>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? 
((IPersistableModel<RunStep>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(RunStep)} does not support writing '{format}' format.");
+            }
+
+            writer.WriteStartObject();
+            if (SerializedAdditionalRawData?.ContainsKey("id") != true)
+            {
+                writer.WritePropertyName("id"u8);
+                writer.WriteStringValue(Id);
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("object") != true)
+            {
+                writer.WritePropertyName("object"u8);
+                writer.WriteStringValue(Object.ToString());
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("created_at") != true)
+            {
+                writer.WritePropertyName("created_at"u8);
+                writer.WriteNumberValue(CreatedAt, "U");
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("assistant_id") != true)
+            {
+                writer.WritePropertyName("assistant_id"u8);
+                writer.WriteStringValue(AssistantId);
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("thread_id") != true)
+            {
+                writer.WritePropertyName("thread_id"u8);
+                writer.WriteStringValue(ThreadId);
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("run_id") != true)
+            {
+                writer.WritePropertyName("run_id"u8);
+                writer.WriteStringValue(RunId);
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("type") != true)
+            {
+                writer.WritePropertyName("type"u8);
+                writer.WriteStringValue(Type.ToString());
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("status") != true)
+            {
+                writer.WritePropertyName("status"u8);
+                writer.WriteStringValue(Status.ToString());
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("step_details") != true)
+            {
+                writer.WritePropertyName("step_details"u8);
+                writer.WriteObjectValue(Details, options);
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("last_error") != true)
+            {
+                if (LastError != null)
+                {
+                    writer.WritePropertyName("last_error"u8);
+                    writer.WriteObjectValue(LastError, options);
+                }
+                else
+                {
+                    writer.WriteNull("last_error");
+                }
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("expired_at") != true)
+            {
+                if (ExpiredAt 
!= null)
+                {
+                    writer.WritePropertyName("expired_at"u8);
+                    writer.WriteNumberValue(ExpiredAt.Value, "U");
+                }
+                else
+                {
+                    writer.WriteNull("expired_at");
+                }
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("cancelled_at") != true)
+            {
+                if (CancelledAt != null)
+                {
+                    writer.WritePropertyName("cancelled_at"u8);
+                    writer.WriteNumberValue(CancelledAt.Value, "U");
+                }
+                else
+                {
+                    writer.WriteNull("cancelled_at");
+                }
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("failed_at") != true)
+            {
+                if (FailedAt != null)
+                {
+                    writer.WritePropertyName("failed_at"u8);
+                    writer.WriteNumberValue(FailedAt.Value, "U");
+                }
+                else
+                {
+                    writer.WriteNull("failed_at");
+                }
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("completed_at") != true)
+            {
+                if (CompletedAt != null)
+                {
+                    writer.WritePropertyName("completed_at"u8);
+                    writer.WriteNumberValue(CompletedAt.Value, "U");
+                }
+                else
+                {
+                    writer.WriteNull("completed_at");
+                }
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("metadata") != true)
+            {
+                if (Metadata != null && Optional.IsCollectionDefined(Metadata))
+                {
+                    writer.WritePropertyName("metadata"u8);
+                    writer.WriteStartObject();
+                    foreach (var item in Metadata)
+                    {
+                        writer.WritePropertyName(item.Key);
+                        writer.WriteStringValue(item.Value);
+                    }
+                    writer.WriteEndObject();
+                }
+                else
+                {
+                    writer.WriteNull("metadata");
+                }
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("usage") != true)
+            {
+                if (Usage != null)
+                {
+                    writer.WritePropertyName("usage"u8);
+                    writer.WriteObjectValue(Usage, options);
+                }
+                else
+                {
+                    writer.WriteNull("usage");
+                }
+            }
+            if (SerializedAdditionalRawData != null)
+            {
+                foreach (var item in SerializedAdditionalRawData)
+                {
+                    if (ModelSerializationExtensions.IsSentinelValue(item.Value))
+                    {
+                        continue;
+                    }
+                    writer.WritePropertyName(item.Key);
+#if NET6_0_OR_GREATER
+                    writer.WriteRawValue(item.Value);
+#else
+                    using (JsonDocument document = JsonDocument.Parse(item.Value))
+                    {
+                        JsonSerializer.Serialize(writer, document.RootElement);
+                    }
+#endif
+                }
+            }
+            writer.WriteEndObject();
+        }
+
+        RunStep IJsonModel<RunStep>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<RunStep>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(RunStep)} does not support reading '{format}' format.");
+            }
+
+            using JsonDocument document = JsonDocument.ParseValue(ref reader);
+            return DeserializeRunStep(document.RootElement, options);
+        }
+
+        internal static RunStep DeserializeRunStep(JsonElement element, ModelReaderWriterOptions options = null)
+        {
+            options ??= ModelSerializationExtensions.WireOptions;
+
+            if (element.ValueKind == JsonValueKind.Null)
+            {
+                return null;
+            }
+            string id = default;
+            InternalRunStepObjectObject @object = default;
+            DateTimeOffset createdAt = default;
+            string assistantId = default;
+            string threadId = default;
+            string runId = default;
+            RunStepType type = default;
+            RunStepStatus status = default;
+            RunStepDetails stepDetails = default;
+            RunStepError lastError = default;
+            DateTimeOffset? expiredAt = default;
+            DateTimeOffset? cancelledAt = default;
+            DateTimeOffset? failedAt = default;
+            DateTimeOffset? 
completedAt = default;
+            IReadOnlyDictionary<string, string> metadata = default;
+            RunStepTokenUsage usage = default;
+            IDictionary<string, BinaryData> serializedAdditionalRawData = default;
+            Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>();
+            foreach (var property in element.EnumerateObject())
+            {
+                if (property.NameEquals("id"u8))
+                {
+                    id = property.Value.GetString();
+                    continue;
+                }
+                if (property.NameEquals("object"u8))
+                {
+                    @object = new InternalRunStepObjectObject(property.Value.GetString());
+                    continue;
+                }
+                if (property.NameEquals("created_at"u8))
+                {
+                    createdAt = DateTimeOffset.FromUnixTimeSeconds(property.Value.GetInt64());
+                    continue;
+                }
+                if (property.NameEquals("assistant_id"u8))
+                {
+                    assistantId = property.Value.GetString();
+                    continue;
+                }
+                if (property.NameEquals("thread_id"u8))
+                {
+                    threadId = property.Value.GetString();
+                    continue;
+                }
+                if (property.NameEquals("run_id"u8))
+                {
+                    runId = property.Value.GetString();
+                    continue;
+                }
+                if (property.NameEquals("type"u8))
+                {
+                    type = new RunStepType(property.Value.GetString());
+                    continue;
+                }
+                if (property.NameEquals("status"u8))
+                {
+                    status = new RunStepStatus(property.Value.GetString());
+                    continue;
+                }
+                if (property.NameEquals("step_details"u8))
+                {
+                    stepDetails = RunStepDetails.DeserializeRunStepDetails(property.Value, options);
+                    continue;
+                }
+                if (property.NameEquals("last_error"u8))
+                {
+                    if (property.Value.ValueKind == JsonValueKind.Null)
+                    {
+                        lastError = null;
+                        continue;
+                    }
+                    lastError = RunStepError.DeserializeRunStepError(property.Value, options);
+                    continue;
+                }
+                if (property.NameEquals("expired_at"u8))
+                {
+                    if (property.Value.ValueKind == JsonValueKind.Null)
+                    {
+                        expiredAt = null;
+                        continue;
+                    }
+                    expiredAt = DateTimeOffset.FromUnixTimeSeconds(property.Value.GetInt64());
+                    continue;
+                }
+                if (property.NameEquals("cancelled_at"u8))
+                {
+                    if (property.Value.ValueKind == JsonValueKind.Null)
+                    {
+                        cancelledAt = null;
+                        continue;
+                    }
+                    cancelledAt = 
DateTimeOffset.FromUnixTimeSeconds(property.Value.GetInt64());
+                    continue;
+                }
+                if (property.NameEquals("failed_at"u8))
+                {
+                    if (property.Value.ValueKind == JsonValueKind.Null)
+                    {
+                        failedAt = null;
+                        continue;
+                    }
+                    failedAt = DateTimeOffset.FromUnixTimeSeconds(property.Value.GetInt64());
+                    continue;
+                }
+                if (property.NameEquals("completed_at"u8))
+                {
+                    if (property.Value.ValueKind == JsonValueKind.Null)
+                    {
+                        completedAt = null;
+                        continue;
+                    }
+                    completedAt = DateTimeOffset.FromUnixTimeSeconds(property.Value.GetInt64());
+                    continue;
+                }
+                if (property.NameEquals("metadata"u8))
+                {
+                    if (property.Value.ValueKind == JsonValueKind.Null)
+                    {
+                        metadata = new ChangeTrackingDictionary<string, string>();
+                        continue;
+                    }
+                    Dictionary<string, string> dictionary = new Dictionary<string, string>();
+                    foreach (var property0 in property.Value.EnumerateObject())
+                    {
+                        dictionary.Add(property0.Name, property0.Value.GetString());
+                    }
+                    metadata = dictionary;
+                    continue;
+                }
+                if (property.NameEquals("usage"u8))
+                {
+                    if (property.Value.ValueKind == JsonValueKind.Null)
+                    {
+                        usage = null;
+                        continue;
+                    }
+                    usage = RunStepTokenUsage.DeserializeRunStepTokenUsage(property.Value, options);
+                    continue;
+                }
+                if (true)
+                {
+                    rawDataDictionary ??= new Dictionary<string, BinaryData>();
+                    rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText()));
+                }
+            }
+            serializedAdditionalRawData = rawDataDictionary;
+            return new RunStep(
+                id,
+                @object,
+                createdAt,
+                assistantId,
+                threadId,
+                runId,
+                type,
+                status,
+                stepDetails,
+                lastError,
+                expiredAt,
+                cancelledAt,
+                failedAt,
+                completedAt,
+                metadata,
+                usage,
+                serializedAdditionalRawData);
+        }
+
+        BinaryData IPersistableModel<RunStep>.Write(ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? 
((IPersistableModel<RunStep>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    return ModelReaderWriter.Write(this, options);
+                default:
+                    throw new FormatException($"The model {nameof(RunStep)} does not support writing '{options.Format}' format.");
+            }
+        }
+
+        RunStep IPersistableModel<RunStep>.Create(BinaryData data, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<RunStep>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    {
+                        using JsonDocument document = JsonDocument.Parse(data);
+                        return DeserializeRunStep(document.RootElement, options);
+                    }
+                default:
+                    throw new FormatException($"The model {nameof(RunStep)} does not support reading '{options.Format}' format.");
+            }
+        }
+
+        string IPersistableModel<RunStep>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J";
+
+        internal static RunStep FromResponse(PipelineResponse response)
+        {
+            using var document = JsonDocument.Parse(response.Content);
+            return DeserializeRunStep(document.RootElement);
+        }
+
+        internal virtual BinaryContent ToBinaryContent()
+        {
+            return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions);
+        }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/RunStep.cs b/.dotnet/src/Generated/Models/RunStep.cs
new file mode 100644
index 000000000..e6d00c191
--- /dev/null
+++ b/.dotnet/src/Generated/Models/RunStep.cs
@@ -0,0 +1,79 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.Collections.Generic;
+
+namespace OpenAI.Assistants
+{
+    public partial class RunStep
+    {
+        internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; }
+        internal RunStep(string id, DateTimeOffset createdAt, string assistantId, string threadId, string runId, RunStepType type, RunStepStatus status, RunStepDetails details, RunStepError lastError, DateTimeOffset? expiredAt, DateTimeOffset? cancelledAt, DateTimeOffset? failedAt, DateTimeOffset? 
completedAt, IReadOnlyDictionary<string, string> metadata, RunStepTokenUsage usage)
+        {
+            Argument.AssertNotNull(id, nameof(id));
+            Argument.AssertNotNull(assistantId, nameof(assistantId));
+            Argument.AssertNotNull(threadId, nameof(threadId));
+            Argument.AssertNotNull(runId, nameof(runId));
+            Argument.AssertNotNull(details, nameof(details));
+
+            Id = id;
+            CreatedAt = createdAt;
+            AssistantId = assistantId;
+            ThreadId = threadId;
+            RunId = runId;
+            Type = type;
+            Status = status;
+            Details = details;
+            LastError = lastError;
+            ExpiredAt = expiredAt;
+            CancelledAt = cancelledAt;
+            FailedAt = failedAt;
+            CompletedAt = completedAt;
+            Metadata = metadata;
+            Usage = usage;
+        }
+
+        internal RunStep(string id, InternalRunStepObjectObject @object, DateTimeOffset createdAt, string assistantId, string threadId, string runId, RunStepType type, RunStepStatus status, RunStepDetails details, RunStepError lastError, DateTimeOffset? expiredAt, DateTimeOffset? cancelledAt, DateTimeOffset? failedAt, DateTimeOffset? completedAt, IReadOnlyDictionary<string, string> metadata, RunStepTokenUsage usage, IDictionary<string, BinaryData> serializedAdditionalRawData)
+        {
+            Id = id;
+            Object = @object;
+            CreatedAt = createdAt;
+            AssistantId = assistantId;
+            ThreadId = threadId;
+            RunId = runId;
+            Type = type;
+            Status = status;
+            Details = details;
+            LastError = lastError;
+            ExpiredAt = expiredAt;
+            CancelledAt = cancelledAt;
+            FailedAt = failedAt;
+            CompletedAt = completedAt;
+            Metadata = metadata;
+            Usage = usage;
+            SerializedAdditionalRawData = serializedAdditionalRawData;
+        }
+
+        internal RunStep()
+        {
+        }
+
+        public string Id { get; }
+
+        public DateTimeOffset CreatedAt { get; }
+        public string AssistantId { get; }
+        public string ThreadId { get; }
+        public string RunId { get; }
+        public RunStepType Type { get; }
+        public RunStepStatus Status { get; }
+        public RunStepError LastError { get; }
+        public DateTimeOffset? ExpiredAt { get; }
+        public DateTimeOffset? CancelledAt { get; }
+        public DateTimeOffset? 
FailedAt { get; }
+        public DateTimeOffset? CompletedAt { get; }
+        public IReadOnlyDictionary<string, string> Metadata { get; }
+        public RunStepTokenUsage Usage { get; }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/RunStepCodeInterpreterOutput.Serialization.cs b/.dotnet/src/Generated/Models/RunStepCodeInterpreterOutput.Serialization.cs
new file mode 100644
index 000000000..f7ef07662
--- /dev/null
+++ b/.dotnet/src/Generated/Models/RunStepCodeInterpreterOutput.Serialization.cs
@@ -0,0 +1,124 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Text.Json;
+
+namespace OpenAI.Assistants
+{
+    [PersistableModelProxy(typeof(UnknownRunStepDetailsToolCallsCodeObjectCodeInterpreterOutputsObject))]
+    public partial class RunStepCodeInterpreterOutput : IJsonModel<RunStepCodeInterpreterOutput>
+    {
+        void IJsonModel<RunStepCodeInterpreterOutput>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<RunStepCodeInterpreterOutput>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(RunStepCodeInterpreterOutput)} does not support writing '{format}' format.");
+            }
+
+            writer.WriteStartObject();
+            if (SerializedAdditionalRawData?.ContainsKey("type") != true)
+            {
+                writer.WritePropertyName("type"u8);
+                writer.WriteStringValue(Type);
+            }
+            if (SerializedAdditionalRawData != null)
+            {
+                foreach (var item in SerializedAdditionalRawData)
+                {
+                    if (ModelSerializationExtensions.IsSentinelValue(item.Value))
+                    {
+                        continue;
+                    }
+                    writer.WritePropertyName(item.Key);
+#if NET6_0_OR_GREATER
+                    writer.WriteRawValue(item.Value);
+#else
+                    using (JsonDocument document = JsonDocument.Parse(item.Value))
+                    {
+                        JsonSerializer.Serialize(writer, document.RootElement);
+                    }
+#endif
+                }
+            }
+            writer.WriteEndObject();
+        }
+
+        RunStepCodeInterpreterOutput IJsonModel<RunStepCodeInterpreterOutput>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(RunStepCodeInterpreterOutput)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeRunStepCodeInterpreterOutput(document.RootElement, options); + } + + internal static RunStepCodeInterpreterOutput DeserializeRunStepCodeInterpreterOutput(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + if (element.TryGetProperty("type", out JsonElement discriminator)) + { + switch (discriminator.GetString()) + { + case "image": return InternalRunStepDetailsToolCallsCodeOutputImageObject.DeserializeInternalRunStepDetailsToolCallsCodeOutputImageObject(element, options); + case "logs": return InternalRunStepCodeInterpreterLogOutput.DeserializeInternalRunStepCodeInterpreterLogOutput(element, options); + } + } + return UnknownRunStepDetailsToolCallsCodeObjectCodeInterpreterOutputsObject.DeserializeUnknownRunStepDetailsToolCallsCodeObjectCodeInterpreterOutputsObject(element, options); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(RunStepCodeInterpreterOutput)} does not support writing '{options.Format}' format."); + } + } + + RunStepCodeInterpreterOutput IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeRunStepCodeInterpreterOutput(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(RunStepCodeInterpreterOutput)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static RunStepCodeInterpreterOutput FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeRunStepCodeInterpreterOutput(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/RunStepCodeInterpreterOutput.cs b/.dotnet/src/Generated/Models/RunStepCodeInterpreterOutput.cs new file mode 100644 index 000000000..1419ce55c --- /dev/null +++ b/.dotnet/src/Generated/Models/RunStepCodeInterpreterOutput.cs @@ -0,0 +1,25 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + public abstract partial class RunStepCodeInterpreterOutput + { + internal IDictionary SerializedAdditionalRawData { get; set; } + protected RunStepCodeInterpreterOutput() + { + } + + internal RunStepCodeInterpreterOutput(string type, IDictionary serializedAdditionalRawData) + { + Type = type; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal string Type { get; set; } + } +} diff --git a/.dotnet/src/Generated/Models/RunStepDetails.Serialization.cs b/.dotnet/src/Generated/Models/RunStepDetails.Serialization.cs new file mode 100644 index 000000000..35e15ea14 --- /dev/null +++ b/.dotnet/src/Generated/Models/RunStepDetails.Serialization.cs @@ -0,0 +1,124 @@ +// + +#nullable disable + +using 
System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + [PersistableModelProxy(typeof(UnknownRunStepObjectStepDetails))] + public partial class RunStepDetails : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(RunStepDetails)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + RunStepDetails IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(RunStepDetails)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeRunStepDetails(document.RootElement, options); + } + + internal static RunStepDetails DeserializeRunStepDetails(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + if (element.TryGetProperty("type", out JsonElement discriminator)) + { + switch (discriminator.GetString()) + { + case "message_creation": return InternalRunStepDetailsMessageCreationObject.DeserializeInternalRunStepDetailsMessageCreationObject(element, options); + case "tool_calls": return InternalRunStepDetailsToolCallsObject.DeserializeInternalRunStepDetailsToolCallsObject(element, options); + } + } + return UnknownRunStepObjectStepDetails.DeserializeUnknownRunStepObjectStepDetails(element, options); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(RunStepDetails)} does not support writing '{options.Format}' format."); + } + } + + RunStepDetails IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeRunStepDetails(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(RunStepDetails)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static RunStepDetails FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeRunStepDetails(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/RunStepDetails.cs b/.dotnet/src/Generated/Models/RunStepDetails.cs new file mode 100644 index 000000000..29a98a672 --- /dev/null +++ b/.dotnet/src/Generated/Models/RunStepDetails.cs @@ -0,0 +1,25 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + public abstract partial class RunStepDetails + { + internal IDictionary SerializedAdditionalRawData { get; set; } + protected RunStepDetails() + { + } + + internal RunStepDetails(string type, IDictionary serializedAdditionalRawData) + { + Type = type; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal string Type { get; set; } + } +} diff --git a/.dotnet/src/Generated/Models/RunStepError.Serialization.cs b/.dotnet/src/Generated/Models/RunStepError.Serialization.cs new file mode 100644 index 000000000..2e84d4c87 --- /dev/null +++ b/.dotnet/src/Generated/Models/RunStepError.Serialization.cs @@ -0,0 +1,144 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace 
OpenAI.Assistants +{ + public partial class RunStepError : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(RunStepError)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("code") != true) + { + writer.WritePropertyName("code"u8); + writer.WriteStringValue(Code.ToString()); + } + if (SerializedAdditionalRawData?.ContainsKey("message") != true) + { + writer.WritePropertyName("message"u8); + writer.WriteStringValue(Message); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + RunStepError IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<RunStepError>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(RunStepError)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeRunStepError(document.RootElement, options); + } + + internal static RunStepError DeserializeRunStepError(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + RunStepErrorCode code = default; + string message = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("code"u8)) + { + code = new RunStepErrorCode(property.Value.GetString()); + continue; + } + if (property.NameEquals("message"u8)) + { + message = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new RunStepError(code, message, serializedAdditionalRawData); + } + + BinaryData IPersistableModel<RunStepError>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<RunStepError>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(RunStepError)} does not support writing '{options.Format}' format."); + } + } + + RunStepError IPersistableModel<RunStepError>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeRunStepError(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(RunStepError)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static RunStepError FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeRunStepError(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/RunStepError.cs b/.dotnet/src/Generated/Models/RunStepError.cs new file mode 100644 index 000000000..780003cbb --- /dev/null +++ b/.dotnet/src/Generated/Models/RunStepError.cs @@ -0,0 +1,35 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + public partial class RunStepError + { + internal IDictionary SerializedAdditionalRawData { get; set; } + internal RunStepError(RunStepErrorCode code, string message) + { + Argument.AssertNotNull(message, nameof(message)); + + Code = code; + Message = message; + } + + internal RunStepError(RunStepErrorCode code, string message, IDictionary serializedAdditionalRawData) + { + Code = code; + Message = message; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal RunStepError() + { + } + + public RunStepErrorCode Code { get; } + public string Message { get; } + } +} diff --git a/.dotnet/src/Generated/Models/RunStepErrorCode.cs b/.dotnet/src/Generated/Models/RunStepErrorCode.cs new file mode 100644 index 000000000..3feed9ad8 --- /dev/null +++ b/.dotnet/src/Generated/Models/RunStepErrorCode.cs @@ -0,0 +1,36 
@@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Assistants +{ + public readonly partial struct RunStepErrorCode : IEquatable<RunStepErrorCode> + { + private readonly string _value; + + public RunStepErrorCode(string value) + { + _value = value ?? throw new ArgumentNullException(nameof(value)); + } + + private const string ServerErrorValue = "server_error"; + private const string RateLimitExceededValue = "rate_limit_exceeded"; + + public static RunStepErrorCode ServerError { get; } = new RunStepErrorCode(ServerErrorValue); + public static RunStepErrorCode RateLimitExceeded { get; } = new RunStepErrorCode(RateLimitExceededValue); + public static bool operator ==(RunStepErrorCode left, RunStepErrorCode right) => left.Equals(right); + public static bool operator !=(RunStepErrorCode left, RunStepErrorCode right) => !left.Equals(right); + public static implicit operator RunStepErrorCode(string value) => new RunStepErrorCode(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is RunStepErrorCode other && Equals(other); + public bool Equals(RunStepErrorCode other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/RunStepStatus.cs b/.dotnet/src/Generated/Models/RunStepStatus.cs new file mode 100644 index 000000000..b328305d0 --- /dev/null +++ b/.dotnet/src/Generated/Models/RunStepStatus.cs @@ -0,0 +1,42 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Assistants +{ + public readonly partial struct RunStepStatus : IEquatable<RunStepStatus> + { + private readonly string _value; + + public RunStepStatus(string value) + { + _value = value ?? 
throw new ArgumentNullException(nameof(value)); + } + + private const string InProgressValue = "in_progress"; + private const string CancelledValue = "cancelled"; + private const string FailedValue = "failed"; + private const string CompletedValue = "completed"; + private const string ExpiredValue = "expired"; + + public static RunStepStatus InProgress { get; } = new RunStepStatus(InProgressValue); + public static RunStepStatus Cancelled { get; } = new RunStepStatus(CancelledValue); + public static RunStepStatus Failed { get; } = new RunStepStatus(FailedValue); + public static RunStepStatus Completed { get; } = new RunStepStatus(CompletedValue); + public static RunStepStatus Expired { get; } = new RunStepStatus(ExpiredValue); + public static bool operator ==(RunStepStatus left, RunStepStatus right) => left.Equals(right); + public static bool operator !=(RunStepStatus left, RunStepStatus right) => !left.Equals(right); + public static implicit operator RunStepStatus(string value) => new RunStepStatus(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is RunStepStatus other && Equals(other); + public bool Equals(RunStepStatus other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? 
StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/RunStepTokenUsage.Serialization.cs b/.dotnet/src/Generated/Models/RunStepTokenUsage.Serialization.cs new file mode 100644 index 000000000..eed20e75e --- /dev/null +++ b/.dotnet/src/Generated/Models/RunStepTokenUsage.Serialization.cs @@ -0,0 +1,155 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + public partial class RunStepTokenUsage : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(RunStepTokenUsage)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("completion_tokens") != true) + { + writer.WritePropertyName("completion_tokens"u8); + writer.WriteNumberValue(CompletionTokens); + } + if (SerializedAdditionalRawData?.ContainsKey("prompt_tokens") != true) + { + writer.WritePropertyName("prompt_tokens"u8); + writer.WriteNumberValue(PromptTokens); + } + if (SerializedAdditionalRawData?.ContainsKey("total_tokens") != true) + { + writer.WritePropertyName("total_tokens"u8); + writer.WriteNumberValue(TotalTokens); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } 
+ writer.WriteEndObject(); + } + + RunStepTokenUsage IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(RunStepTokenUsage)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeRunStepTokenUsage(document.RootElement, options); + } + + internal static RunStepTokenUsage DeserializeRunStepTokenUsage(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + int completionTokens = default; + int promptTokens = default; + int totalTokens = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("completion_tokens"u8)) + { + completionTokens = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("prompt_tokens"u8)) + { + promptTokens = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("total_tokens"u8)) + { + totalTokens = property.Value.GetInt32(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new RunStepTokenUsage(completionTokens, promptTokens, totalTokens, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(RunStepTokenUsage)} does not support writing '{options.Format}' format."); + } + } + + RunStepTokenUsage IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeRunStepTokenUsage(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(RunStepTokenUsage)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static RunStepTokenUsage FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeRunStepTokenUsage(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/RunStepTokenUsage.cs b/.dotnet/src/Generated/Models/RunStepTokenUsage.cs new file mode 100644 index 000000000..3e1e81939 --- /dev/null +++ b/.dotnet/src/Generated/Models/RunStepTokenUsage.cs @@ -0,0 +1,36 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + public partial class RunStepTokenUsage + { + internal IDictionary SerializedAdditionalRawData { get; set; } + internal RunStepTokenUsage(int completionTokens, int promptTokens, int totalTokens) + { + CompletionTokens = completionTokens; + PromptTokens = promptTokens; + TotalTokens = totalTokens; + } + + internal RunStepTokenUsage(int completionTokens, int 
promptTokens, int totalTokens, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + CompletionTokens = completionTokens; + PromptTokens = promptTokens; + TotalTokens = totalTokens; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal RunStepTokenUsage() + { + } + + public int CompletionTokens { get; } + public int PromptTokens { get; } + public int TotalTokens { get; } + } +} diff --git a/.dotnet/src/Generated/Models/RunStepToolCall.Serialization.cs b/.dotnet/src/Generated/Models/RunStepToolCall.Serialization.cs new file mode 100644 index 000000000..322ca021d --- /dev/null +++ b/.dotnet/src/Generated/Models/RunStepToolCall.Serialization.cs @@ -0,0 +1,125 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + [PersistableModelProxy(typeof(UnknownRunStepDetailsToolCallsObjectToolCallsObject))] + public partial class RunStepToolCall : IJsonModel<RunStepToolCall> + { + void IJsonModel<RunStepToolCall>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(RunStepToolCall)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + RunStepToolCall IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(RunStepToolCall)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeRunStepToolCall(document.RootElement, options); + } + + internal static RunStepToolCall DeserializeRunStepToolCall(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + if (element.TryGetProperty("type", out JsonElement discriminator)) + { + switch (discriminator.GetString()) + { + case "code_interpreter": return InternalRunStepCodeInterpreterToolCallDetails.DeserializeInternalRunStepCodeInterpreterToolCallDetails(element, options); + case "file_search": return 
InternalRunStepFileSearchToolCallDetails.DeserializeInternalRunStepFileSearchToolCallDetails(element, options); + case "function": return InternalRunStepFunctionToolCallDetails.DeserializeInternalRunStepFunctionToolCallDetails(element, options); + } + } + return UnknownRunStepDetailsToolCallsObjectToolCallsObject.DeserializeUnknownRunStepDetailsToolCallsObjectToolCallsObject(element, options); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(RunStepToolCall)} does not support writing '{options.Format}' format."); + } + } + + RunStepToolCall IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeRunStepToolCall(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(RunStepToolCall)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static RunStepToolCall FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeRunStepToolCall(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/RunStepToolCall.cs b/.dotnet/src/Generated/Models/RunStepToolCall.cs new file mode 100644 index 000000000..7c21afff2 --- /dev/null +++ 
b/.dotnet/src/Generated/Models/RunStepToolCall.cs @@ -0,0 +1,25 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + public abstract partial class RunStepToolCall + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + protected RunStepToolCall() + { + } + + internal RunStepToolCall(string type, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + Type = type; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal string Type { get; set; } + } +} diff --git a/.dotnet/src/Generated/Models/RunStepType.cs b/.dotnet/src/Generated/Models/RunStepType.cs new file mode 100644 index 000000000..699bc37a5 --- /dev/null +++ b/.dotnet/src/Generated/Models/RunStepType.cs @@ -0,0 +1,36 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.Assistants +{ + public readonly partial struct RunStepType : IEquatable<RunStepType> + { + private readonly string _value; + + public RunStepType(string value) + { + _value = value ?? 
throw new ArgumentNullException(nameof(value)); + } + + private const string MessageCreationValue = "message_creation"; + private const string ToolCallsValue = "tool_calls"; + + public static RunStepType MessageCreation { get; } = new RunStepType(MessageCreationValue); + public static RunStepType ToolCalls { get; } = new RunStepType(ToolCallsValue); + public static bool operator ==(RunStepType left, RunStepType right) => left.Equals(right); + public static bool operator !=(RunStepType left, RunStepType right) => !left.Equals(right); + public static implicit operator RunStepType(string value) => new RunStepType(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is RunStepType other && Equals(other); + public bool Equals(RunStepType other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/RunStepUpdateCodeInterpreterOutput.Serialization.cs b/.dotnet/src/Generated/Models/RunStepUpdateCodeInterpreterOutput.Serialization.cs new file mode 100644 index 000000000..15407f30d --- /dev/null +++ b/.dotnet/src/Generated/Models/RunStepUpdateCodeInterpreterOutput.Serialization.cs @@ -0,0 +1,124 @@ +// <auto-generated/> + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + [PersistableModelProxy(typeof(UnknownRunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreterOutputsObject))] + public partial class RunStepUpdateCodeInterpreterOutput : IJsonModel<RunStepUpdateCodeInterpreterOutput> + { + void IJsonModel<RunStepUpdateCodeInterpreterOutput>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(RunStepUpdateCodeInterpreterOutput)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + RunStepUpdateCodeInterpreterOutput IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(RunStepUpdateCodeInterpreterOutput)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeRunStepUpdateCodeInterpreterOutput(document.RootElement, options); + } + + internal static RunStepUpdateCodeInterpreterOutput DeserializeRunStepUpdateCodeInterpreterOutput(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + if (element.TryGetProperty("type", out JsonElement discriminator)) + { + switch (discriminator.GetString()) + { + case "image": return InternalRunStepDeltaStepDetailsToolCallsCodeOutputImageObject.DeserializeInternalRunStepDeltaStepDetailsToolCallsCodeOutputImageObject(element, options); + case "logs": return InternalRunStepDeltaStepDetailsToolCallsCodeOutputLogsObject.DeserializeInternalRunStepDeltaStepDetailsToolCallsCodeOutputLogsObject(element, options); + } + } + return UnknownRunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreterOutputsObject.DeserializeUnknownRunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreterOutputsObject(element, options); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(RunStepUpdateCodeInterpreterOutput)} does not support writing '{options.Format}' format."); + } + } + + RunStepUpdateCodeInterpreterOutput IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<RunStepUpdateCodeInterpreterOutput>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    {
+                        using JsonDocument document = JsonDocument.Parse(data);
+                        return DeserializeRunStepUpdateCodeInterpreterOutput(document.RootElement, options);
+                    }
+                default:
+                    throw new FormatException($"The model {nameof(RunStepUpdateCodeInterpreterOutput)} does not support reading '{options.Format}' format.");
+            }
+        }
+
+        string IPersistableModel<RunStepUpdateCodeInterpreterOutput>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J";
+
+        internal static RunStepUpdateCodeInterpreterOutput FromResponse(PipelineResponse response)
+        {
+            using var document = JsonDocument.Parse(response.Content);
+            return DeserializeRunStepUpdateCodeInterpreterOutput(document.RootElement);
+        }
+
+        internal virtual BinaryContent ToBinaryContent()
+        {
+            return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions);
+        }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/RunStepUpdateCodeInterpreterOutput.cs b/.dotnet/src/Generated/Models/RunStepUpdateCodeInterpreterOutput.cs
new file mode 100644
index 000000000..637332102
--- /dev/null
+++ b/.dotnet/src/Generated/Models/RunStepUpdateCodeInterpreterOutput.cs
@@ -0,0 +1,25 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.Collections.Generic;
+
+namespace OpenAI.Assistants
+{
+    public abstract partial class RunStepUpdateCodeInterpreterOutput
+    {
+        internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; }
+        protected RunStepUpdateCodeInterpreterOutput()
+        {
+        }
+
+        internal RunStepUpdateCodeInterpreterOutput(string type, IDictionary<string, BinaryData> serializedAdditionalRawData)
+        {
+            Type = type;
+            SerializedAdditionalRawData = serializedAdditionalRawData;
+        }
+
+        internal string Type { get; set; }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/RunTokenUsage.Serialization.cs b/.dotnet/src/Generated/Models/RunTokenUsage.Serialization.cs
new file mode 100644
index 000000000..7f50758cc
--- /dev/null
+++
b/.dotnet/src/Generated/Models/RunTokenUsage.Serialization.cs @@ -0,0 +1,155 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + public partial class RunTokenUsage : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(RunTokenUsage)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("completion_tokens") != true) + { + writer.WritePropertyName("completion_tokens"u8); + writer.WriteNumberValue(CompletionTokens); + } + if (SerializedAdditionalRawData?.ContainsKey("prompt_tokens") != true) + { + writer.WritePropertyName("prompt_tokens"u8); + writer.WriteNumberValue(PromptTokens); + } + if (SerializedAdditionalRawData?.ContainsKey("total_tokens") != true) + { + writer.WritePropertyName("total_tokens"u8); + writer.WriteNumberValue(TotalTokens); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + RunTokenUsage IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(RunTokenUsage)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeRunTokenUsage(document.RootElement, options); + } + + internal static RunTokenUsage DeserializeRunTokenUsage(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + int completionTokens = default; + int promptTokens = default; + int totalTokens = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("completion_tokens"u8)) + { + completionTokens = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("prompt_tokens"u8)) + { + promptTokens = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("total_tokens"u8)) + { + totalTokens = property.Value.GetInt32(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new RunTokenUsage(completionTokens, promptTokens, totalTokens, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(RunTokenUsage)} does not support writing '{options.Format}' format."); + } + } + + RunTokenUsage IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeRunTokenUsage(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(RunTokenUsage)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static RunTokenUsage FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeRunTokenUsage(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/RunTokenUsage.cs b/.dotnet/src/Generated/Models/RunTokenUsage.cs new file mode 100644 index 000000000..c961964df --- /dev/null +++ b/.dotnet/src/Generated/Models/RunTokenUsage.cs @@ -0,0 +1,36 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + public partial class RunTokenUsage + { + internal IDictionary SerializedAdditionalRawData { get; set; } + internal RunTokenUsage(int completionTokens, int promptTokens, int totalTokens) + { + CompletionTokens = completionTokens; + PromptTokens = promptTokens; + TotalTokens = totalTokens; + } + + internal RunTokenUsage(int completionTokens, int promptTokens, int totalTokens, IDictionary 
serializedAdditionalRawData)
+        {
+            CompletionTokens = completionTokens;
+            PromptTokens = promptTokens;
+            TotalTokens = totalTokens;
+            SerializedAdditionalRawData = serializedAdditionalRawData;
+        }
+
+        internal RunTokenUsage()
+        {
+        }
+
+        public int CompletionTokens { get; }
+        public int PromptTokens { get; }
+        public int TotalTokens { get; }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/RunTruncationStrategy.Serialization.cs b/.dotnet/src/Generated/Models/RunTruncationStrategy.Serialization.cs
new file mode 100644
index 000000000..7aad14eb4
--- /dev/null
+++ b/.dotnet/src/Generated/Models/RunTruncationStrategy.Serialization.cs
@@ -0,0 +1,156 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace OpenAI.Assistants
+{
+    public partial class RunTruncationStrategy : IJsonModel<RunTruncationStrategy>
+    {
+        void IJsonModel<RunTruncationStrategy>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ?
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(RunTruncationStrategy)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(_type.ToString()); + } + if (SerializedAdditionalRawData?.ContainsKey("last_messages") != true && Optional.IsDefined(LastMessages)) + { + if (LastMessages != null) + { + writer.WritePropertyName("last_messages"u8); + writer.WriteNumberValue(LastMessages.Value); + } + else + { + writer.WriteNull("last_messages"); + } + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + RunTruncationStrategy IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(RunTruncationStrategy)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeRunTruncationStrategy(document.RootElement, options); + } + + internal static RunTruncationStrategy DeserializeRunTruncationStrategy(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + InternalTruncationObjectType type = default; + int? lastMessages = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("type"u8)) + { + type = new InternalTruncationObjectType(property.Value.GetString()); + continue; + } + if (property.NameEquals("last_messages"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + lastMessages = null; + continue; + } + lastMessages = property.Value.GetInt32(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new RunTruncationStrategy(type, lastMessages, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(RunTruncationStrategy)} does not support writing '{options.Format}' format."); + } + } + + RunTruncationStrategy IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeRunTruncationStrategy(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(RunTruncationStrategy)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static RunTruncationStrategy FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeRunTruncationStrategy(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/RunTruncationStrategy.cs b/.dotnet/src/Generated/Models/RunTruncationStrategy.cs new file mode 100644 index 000000000..3fb0b2da8 --- /dev/null +++ b/.dotnet/src/Generated/Models/RunTruncationStrategy.cs @@ -0,0 +1,23 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + public partial class RunTruncationStrategy + { + internal RunTruncationStrategy(InternalTruncationObjectType type, int? 
lastMessages, IDictionary<string, BinaryData> serializedAdditionalRawData)
+        {
+            _type = type;
+            LastMessages = lastMessages;
+            SerializedAdditionalRawData = serializedAdditionalRawData;
+        }
+
+        internal RunTruncationStrategy()
+        {
+        }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/SpeechGenerationOptions.Serialization.cs b/.dotnet/src/Generated/Models/SpeechGenerationOptions.Serialization.cs
new file mode 100644
index 000000000..1bd301308
--- /dev/null
+++ b/.dotnet/src/Generated/Models/SpeechGenerationOptions.Serialization.cs
@@ -0,0 +1,191 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace OpenAI.Audio
+{
+    public partial class SpeechGenerationOptions : IJsonModel<SpeechGenerationOptions>
+    {
+        void IJsonModel<SpeechGenerationOptions>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<SpeechGenerationOptions>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(SpeechGenerationOptions)} does not support writing '{format}' format.");
+            }
+
+            writer.WriteStartObject();
+            if (SerializedAdditionalRawData?.ContainsKey("model") != true)
+            {
+                writer.WritePropertyName("model"u8);
+                writer.WriteStringValue(Model.ToString());
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("input") != true)
+            {
+                writer.WritePropertyName("input"u8);
+                writer.WriteStringValue(Input);
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("voice") != true)
+            {
+                writer.WritePropertyName("voice"u8);
+                writer.WriteStringValue(Voice.ToSerialString());
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("response_format") != true && Optional.IsDefined(ResponseFormat))
+            {
+                writer.WritePropertyName("response_format"u8);
+                writer.WriteStringValue(ResponseFormat.Value.ToSerialString());
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("speed") != true && Optional.IsDefined(Speed))
+            {
writer.WritePropertyName("speed"u8); + writer.WriteNumberValue(Speed.Value); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + SpeechGenerationOptions IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(SpeechGenerationOptions)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeSpeechGenerationOptions(document.RootElement, options); + } + + internal static SpeechGenerationOptions DeserializeSpeechGenerationOptions(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + InternalCreateSpeechRequestModel model = default; + string input = default; + GeneratedSpeechVoice voice = default; + GeneratedSpeechFormat? responseFormat = default; + float? 
speed = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("model"u8)) + { + model = new InternalCreateSpeechRequestModel(property.Value.GetString()); + continue; + } + if (property.NameEquals("input"u8)) + { + input = property.Value.GetString(); + continue; + } + if (property.NameEquals("voice"u8)) + { + voice = property.Value.GetString().ToGeneratedSpeechVoice(); + continue; + } + if (property.NameEquals("response_format"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + responseFormat = property.Value.GetString().ToGeneratedSpeechFormat(); + continue; + } + if (property.NameEquals("speed"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + speed = property.Value.GetSingle(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new SpeechGenerationOptions( + model, + input, + voice, + responseFormat, + speed, + serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(SpeechGenerationOptions)} does not support writing '{options.Format}' format."); + } + } + + SpeechGenerationOptions IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeSpeechGenerationOptions(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(SpeechGenerationOptions)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static SpeechGenerationOptions FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeSpeechGenerationOptions(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/SpeechGenerationOptions.cs b/.dotnet/src/Generated/Models/SpeechGenerationOptions.cs new file mode 100644 index 000000000..4a5b48c18 --- /dev/null +++ b/.dotnet/src/Generated/Models/SpeechGenerationOptions.cs @@ -0,0 +1,26 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Audio +{ + public partial class SpeechGenerationOptions + { + internal IDictionary SerializedAdditionalRawData { get; set; } + + internal SpeechGenerationOptions(InternalCreateSpeechRequestModel model, string input, GeneratedSpeechVoice voice, GeneratedSpeechFormat? responseFormat, float? speed, IDictionary serializedAdditionalRawData) + { + Model = model; + Input = input; + Voice = voice; + ResponseFormat = responseFormat; + Speed = speed; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + public GeneratedSpeechFormat? ResponseFormat { get; set; } + public float? 
Speed { get; set; }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/StaticFileChunkingStrategy.Serialization.cs b/.dotnet/src/Generated/Models/StaticFileChunkingStrategy.Serialization.cs
new file mode 100644
index 000000000..94b1916af
--- /dev/null
+++ b/.dotnet/src/Generated/Models/StaticFileChunkingStrategy.Serialization.cs
@@ -0,0 +1,144 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace OpenAI.VectorStores
+{
+    public partial class StaticFileChunkingStrategy : IJsonModel<StaticFileChunkingStrategy>
+    {
+        void IJsonModel<StaticFileChunkingStrategy>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<StaticFileChunkingStrategy>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(StaticFileChunkingStrategy)} does not support writing '{format}' format.");
+            }
+
+            writer.WriteStartObject();
+            if (SerializedAdditionalRawData?.ContainsKey("static") != true)
+            {
+                writer.WritePropertyName("static"u8);
+                writer.WriteObjectValue(_internalDetails, options);
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("type") != true)
+            {
+                writer.WritePropertyName("type"u8);
+                writer.WriteStringValue(Type);
+            }
+            if (SerializedAdditionalRawData != null)
+            {
+                foreach (var item in SerializedAdditionalRawData)
+                {
+                    if (ModelSerializationExtensions.IsSentinelValue(item.Value))
+                    {
+                        continue;
+                    }
+                    writer.WritePropertyName(item.Key);
+#if NET6_0_OR_GREATER
+                    writer.WriteRawValue(item.Value);
+#else
+                    using (JsonDocument document = JsonDocument.Parse(item.Value))
+                    {
+                        JsonSerializer.Serialize(writer, document.RootElement);
+                    }
+#endif
+                }
+            }
+            writer.WriteEndObject();
+        }
+
+        StaticFileChunkingStrategy IJsonModel<StaticFileChunkingStrategy>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ?
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(StaticFileChunkingStrategy)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeStaticFileChunkingStrategy(document.RootElement, options); + } + + internal static StaticFileChunkingStrategy DeserializeStaticFileChunkingStrategy(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + InternalStaticChunkingStrategyDetails @static = default; + string type = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("static"u8)) + { + @static = InternalStaticChunkingStrategyDetails.DeserializeInternalStaticChunkingStrategyDetails(property.Value, options); + continue; + } + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new StaticFileChunkingStrategy(type, serializedAdditionalRawData, @static); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(StaticFileChunkingStrategy)} does not support writing '{options.Format}' format."); + } + } + + StaticFileChunkingStrategy IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeStaticFileChunkingStrategy(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(StaticFileChunkingStrategy)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static new StaticFileChunkingStrategy FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeStaticFileChunkingStrategy(document.RootElement); + } + + internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/StaticFileChunkingStrategy.cs b/.dotnet/src/Generated/Models/StaticFileChunkingStrategy.cs new file mode 100644 index 000000000..09f4736fc --- /dev/null +++ b/.dotnet/src/Generated/Models/StaticFileChunkingStrategy.cs @@ -0,0 +1,29 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.VectorStores +{ + public partial class StaticFileChunkingStrategy : FileChunkingStrategy + { + internal StaticFileChunkingStrategy(InternalStaticChunkingStrategyDetails internalDetails) + { + Argument.AssertNotNull(internalDetails, nameof(internalDetails)); + + Type = "static"; + 
_internalDetails = internalDetails; + } + + internal StaticFileChunkingStrategy(string type, IDictionary serializedAdditionalRawData, InternalStaticChunkingStrategyDetails internalDetails) : base(type, serializedAdditionalRawData) + { + _internalDetails = internalDetails; + } + + internal StaticFileChunkingStrategy() + { + } + } +} diff --git a/.dotnet/src/Generated/Models/StreamingChatCompletionUpdate.Serialization.cs b/.dotnet/src/Generated/Models/StreamingChatCompletionUpdate.Serialization.cs new file mode 100644 index 000000000..6a39741da --- /dev/null +++ b/.dotnet/src/Generated/Models/StreamingChatCompletionUpdate.Serialization.cs @@ -0,0 +1,245 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Chat +{ + public partial class StreamingChatCompletionUpdate : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<StreamingChatCompletionUpdate>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(StreamingChatCompletionUpdate)} does not support writing '{format}' format.");
+            }
+
+            writer.WriteStartObject();
+            if (SerializedAdditionalRawData?.ContainsKey("id") != true)
+            {
+                writer.WritePropertyName("id"u8);
+                writer.WriteStringValue(Id);
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("choices") != true)
+            {
+                writer.WritePropertyName("choices"u8);
+                writer.WriteStartArray();
+                foreach (var item in Choices)
+                {
+                    writer.WriteObjectValue(item, options);
+                }
+                writer.WriteEndArray();
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("created") != true)
+            {
+                writer.WritePropertyName("created"u8);
+                writer.WriteNumberValue(CreatedAt, "U");
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("model") != true)
+            {
+                writer.WritePropertyName("model"u8);
+                writer.WriteStringValue(Model);
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("service_tier") != true && Optional.IsDefined(ServiceTier))
+            {
+                if (ServiceTier != null)
+                {
+                    writer.WritePropertyName("service_tier"u8);
+                    writer.WriteStringValue(ServiceTier.Value.ToString());
+                }
+                else
+                {
+                    writer.WriteNull("service_tier");
+                }
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("system_fingerprint") != true && Optional.IsDefined(SystemFingerprint))
+            {
+                writer.WritePropertyName("system_fingerprint"u8);
+                writer.WriteStringValue(SystemFingerprint);
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("object") != true)
+            {
+                writer.WritePropertyName("object"u8);
+                writer.WriteStringValue(Object.ToString());
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("usage") != true && Optional.IsDefined(Usage))
+            {
+                writer.WritePropertyName("usage"u8);
+                writer.WriteObjectValue(Usage, options);
+            }
+            if (SerializedAdditionalRawData != null)
+            {
+                foreach (var item in SerializedAdditionalRawData)
+                {
+                    if (ModelSerializationExtensions.IsSentinelValue(item.Value))
+                    {
+                        continue;
+                    }
+                    writer.WritePropertyName(item.Key);
+#if NET6_0_OR_GREATER
+                    writer.WriteRawValue(item.Value);
+#else
+                    using (JsonDocument document = JsonDocument.Parse(item.Value))
+                    {
+                        JsonSerializer.Serialize(writer, document.RootElement);
+                    }
+#endif
+                }
+            }
+            writer.WriteEndObject();
+        }
+
+        StreamingChatCompletionUpdate IJsonModel<StreamingChatCompletionUpdate>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<StreamingChatCompletionUpdate>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(StreamingChatCompletionUpdate)} does not support reading '{format}' format.");
+            }
+
+            using JsonDocument document = JsonDocument.ParseValue(ref reader);
+            return DeserializeStreamingChatCompletionUpdate(document.RootElement, options);
+        }
+
+        internal static StreamingChatCompletionUpdate DeserializeStreamingChatCompletionUpdate(JsonElement element, ModelReaderWriterOptions options = null)
+        {
+            options ??= ModelSerializationExtensions.WireOptions;
+
+            if (element.ValueKind == JsonValueKind.Null)
+            {
+                return null;
+            }
+            string id = default;
+            IReadOnlyList<InternalCreateChatCompletionStreamResponseChoice> choices = default;
+            DateTimeOffset created = default;
+            string model = default;
+            InternalCreateChatCompletionStreamResponseServiceTier? serviceTier = default;
+            string systemFingerprint = default;
+            InternalCreateChatCompletionStreamResponseObject @object = default;
+            ChatTokenUsage usage = default;
+            IDictionary<string, BinaryData> serializedAdditionalRawData = default;
+            Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>();
+            foreach (var property in element.EnumerateObject())
+            {
+                if (property.NameEquals("id"u8))
+                {
+                    id = property.Value.GetString();
+                    continue;
+                }
+                if (property.NameEquals("choices"u8))
+                {
+                    List<InternalCreateChatCompletionStreamResponseChoice> array = new List<InternalCreateChatCompletionStreamResponseChoice>();
+                    foreach (var item in property.Value.EnumerateArray())
+                    {
+                        array.Add(InternalCreateChatCompletionStreamResponseChoice.DeserializeInternalCreateChatCompletionStreamResponseChoice(item, options));
+                    }
+                    choices = array;
+                    continue;
+                }
+                if (property.NameEquals("created"u8))
+                {
+                    created = DateTimeOffset.FromUnixTimeSeconds(property.Value.GetInt64());
+                    continue;
+                }
+                if (property.NameEquals("model"u8))
+                {
+                    model = property.Value.GetString();
+                    continue;
+                }
+                if (property.NameEquals("service_tier"u8))
+                {
+                    if (property.Value.ValueKind == JsonValueKind.Null)
+                    {
+                        serviceTier = null;
+                        continue;
+                    }
+                    serviceTier = new InternalCreateChatCompletionStreamResponseServiceTier(property.Value.GetString());
+                    continue;
+                }
+                if (property.NameEquals("system_fingerprint"u8))
+                {
+                    systemFingerprint = property.Value.GetString();
+                    continue;
+                }
+                if (property.NameEquals("object"u8))
+                {
+                    @object = new InternalCreateChatCompletionStreamResponseObject(property.Value.GetString());
+                    continue;
+                }
+                if (property.NameEquals("usage"u8))
+                {
+                    if (property.Value.ValueKind == JsonValueKind.Null)
+                    {
+                        continue;
+                    }
+                    usage = ChatTokenUsage.DeserializeChatTokenUsage(property.Value, options);
+                    continue;
+                }
+                if (true)
+                {
+                    rawDataDictionary ??= new Dictionary<string, BinaryData>();
+                    rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText()));
+                }
+            }
+            serializedAdditionalRawData = rawDataDictionary;
+            return new StreamingChatCompletionUpdate(
+                id,
+                choices,
+                created,
+                model,
+                serviceTier,
+                systemFingerprint,
+                @object,
+                usage,
+                serializedAdditionalRawData);
+        }
+
+        BinaryData IPersistableModel<StreamingChatCompletionUpdate>.Write(ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<StreamingChatCompletionUpdate>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    return ModelReaderWriter.Write(this, options);
+                default:
+                    throw new FormatException($"The model {nameof(StreamingChatCompletionUpdate)} does not support writing '{options.Format}' format.");
+            }
+        }
+
+        StreamingChatCompletionUpdate IPersistableModel<StreamingChatCompletionUpdate>.Create(BinaryData data, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<StreamingChatCompletionUpdate>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    {
+                        using JsonDocument document = JsonDocument.Parse(data);
+                        return DeserializeStreamingChatCompletionUpdate(document.RootElement, options);
+                    }
+                default:
+                    throw new FormatException($"The model {nameof(StreamingChatCompletionUpdate)} does not support reading '{options.Format}' format.");
+            }
+        }
+
+        string IPersistableModel<StreamingChatCompletionUpdate>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J";
+
+        internal static StreamingChatCompletionUpdate FromResponse(PipelineResponse response)
+        {
+            using var document = JsonDocument.Parse(response.Content);
+            return DeserializeStreamingChatCompletionUpdate(document.RootElement);
+        }
+
+        internal virtual BinaryContent ToBinaryContent()
+        {
+            return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions);
+        }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/StreamingChatCompletionUpdate.cs b/.dotnet/src/Generated/Models/StreamingChatCompletionUpdate.cs
new file mode 100644
index 000000000..b72279024
--- /dev/null
+++ b/.dotnet/src/Generated/Models/StreamingChatCompletionUpdate.cs
@@ -0,0 +1,47 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.Collections.Generic;
+using System.Linq;
+
+namespace OpenAI.Chat
+{
+    public partial class
StreamingChatCompletionUpdate
+    {
+        internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; }
+        internal StreamingChatCompletionUpdate(string id, IEnumerable<InternalCreateChatCompletionStreamResponseChoice> choices, DateTimeOffset createdAt, string model)
+        {
+            Argument.AssertNotNull(id, nameof(id));
+            Argument.AssertNotNull(choices, nameof(choices));
+            Argument.AssertNotNull(model, nameof(model));
+
+            Id = id;
+            Choices = choices.ToList();
+            CreatedAt = createdAt;
+            Model = model;
+        }
+
+        internal StreamingChatCompletionUpdate(string id, IReadOnlyList<InternalCreateChatCompletionStreamResponseChoice> choices, DateTimeOffset createdAt, string model, InternalCreateChatCompletionStreamResponseServiceTier? serviceTier, string systemFingerprint, InternalCreateChatCompletionStreamResponseObject @object, ChatTokenUsage usage, IDictionary<string, BinaryData> serializedAdditionalRawData)
+        {
+            Id = id;
+            Choices = choices;
+            CreatedAt = createdAt;
+            Model = model;
+            ServiceTier = serviceTier;
+            SystemFingerprint = systemFingerprint;
+            Object = @object;
+            Usage = usage;
+            SerializedAdditionalRawData = serializedAdditionalRawData;
+        }
+
+        internal StreamingChatCompletionUpdate()
+        {
+        }
+
+        public string Id { get; }
+        public string Model { get; }
+        public string SystemFingerprint { get; }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/StreamingChatFunctionCallUpdate.Serialization.cs b/.dotnet/src/Generated/Models/StreamingChatFunctionCallUpdate.Serialization.cs
new file mode 100644
index 000000000..8a47d0a0b
--- /dev/null
+++ b/.dotnet/src/Generated/Models/StreamingChatFunctionCallUpdate.Serialization.cs
@@ -0,0 +1,144 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace OpenAI.Chat
+{
+    public partial class StreamingChatFunctionCallUpdate : IJsonModel<StreamingChatFunctionCallUpdate>
+    {
+        void IJsonModel<StreamingChatFunctionCallUpdate>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<StreamingChatFunctionCallUpdate>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(StreamingChatFunctionCallUpdate)} does not support writing '{format}' format.");
+            }
+
+            writer.WriteStartObject();
+            if (SerializedAdditionalRawData?.ContainsKey("arguments") != true && Optional.IsDefined(FunctionArgumentsUpdate))
+            {
+                writer.WritePropertyName("arguments"u8);
+                writer.WriteStringValue(FunctionArgumentsUpdate);
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("name") != true && Optional.IsDefined(FunctionName))
+            {
+                writer.WritePropertyName("name"u8);
+                writer.WriteStringValue(FunctionName);
+            }
+            if (SerializedAdditionalRawData != null)
+            {
+                foreach (var item in SerializedAdditionalRawData)
+                {
+                    if (ModelSerializationExtensions.IsSentinelValue(item.Value))
+                    {
+                        continue;
+                    }
+                    writer.WritePropertyName(item.Key);
+#if NET6_0_OR_GREATER
+                    writer.WriteRawValue(item.Value);
+#else
+                    using (JsonDocument document = JsonDocument.Parse(item.Value))
+                    {
+                        JsonSerializer.Serialize(writer, document.RootElement);
+                    }
+#endif
+                }
+            }
+            writer.WriteEndObject();
+        }
+
+        StreamingChatFunctionCallUpdate IJsonModel<StreamingChatFunctionCallUpdate>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<StreamingChatFunctionCallUpdate>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(StreamingChatFunctionCallUpdate)} does not support reading '{format}' format.");
+            }
+
+            using JsonDocument document = JsonDocument.ParseValue(ref reader);
+            return DeserializeStreamingChatFunctionCallUpdate(document.RootElement, options);
+        }
+
+        internal static StreamingChatFunctionCallUpdate DeserializeStreamingChatFunctionCallUpdate(JsonElement element, ModelReaderWriterOptions options = null)
+        {
+            options ??= ModelSerializationExtensions.WireOptions;
+
+            if (element.ValueKind == JsonValueKind.Null)
+            {
+                return null;
+            }
+            string arguments = default;
+            string name = default;
+            IDictionary<string, BinaryData> serializedAdditionalRawData = default;
+            Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>();
+            foreach (var property in element.EnumerateObject())
+            {
+                if (property.NameEquals("arguments"u8))
+                {
+                    arguments = property.Value.GetString();
+                    continue;
+                }
+                if (property.NameEquals("name"u8))
+                {
+                    name = property.Value.GetString();
+                    continue;
+                }
+                if (true)
+                {
+                    rawDataDictionary ??= new Dictionary<string, BinaryData>();
+                    rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText()));
+                }
+            }
+            serializedAdditionalRawData = rawDataDictionary;
+            return new StreamingChatFunctionCallUpdate(arguments, name, serializedAdditionalRawData);
+        }
+
+        BinaryData IPersistableModel<StreamingChatFunctionCallUpdate>.Write(ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<StreamingChatFunctionCallUpdate>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    return ModelReaderWriter.Write(this, options);
+                default:
+                    throw new FormatException($"The model {nameof(StreamingChatFunctionCallUpdate)} does not support writing '{options.Format}' format.");
+            }
+        }
+
+        StreamingChatFunctionCallUpdate IPersistableModel<StreamingChatFunctionCallUpdate>.Create(BinaryData data, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<StreamingChatFunctionCallUpdate>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    {
+                        using JsonDocument document = JsonDocument.Parse(data);
+                        return DeserializeStreamingChatFunctionCallUpdate(document.RootElement, options);
+                    }
+                default:
+                    throw new FormatException($"The model {nameof(StreamingChatFunctionCallUpdate)} does not support reading '{options.Format}' format.");
+            }
+        }
+
+        string IPersistableModel<StreamingChatFunctionCallUpdate>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J";
+
+        internal static StreamingChatFunctionCallUpdate FromResponse(PipelineResponse response)
+        {
+            using var document = JsonDocument.Parse(response.Content);
+            return DeserializeStreamingChatFunctionCallUpdate(document.RootElement);
+        }
+
+        internal virtual BinaryContent ToBinaryContent()
+        {
+            return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions);
+        }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/StreamingChatFunctionCallUpdate.cs b/.dotnet/src/Generated/Models/StreamingChatFunctionCallUpdate.cs
new file mode 100644
index 000000000..cdf403d42
--- /dev/null
+++ b/.dotnet/src/Generated/Models/StreamingChatFunctionCallUpdate.cs
@@ -0,0 +1,24 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.Collections.Generic;
+
+namespace OpenAI.Chat
+{
+    public partial class StreamingChatFunctionCallUpdate
+    {
+        internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; }
+        internal StreamingChatFunctionCallUpdate()
+        {
+        }
+
+        internal StreamingChatFunctionCallUpdate(string functionArgumentsUpdate, string functionName, IDictionary<string, BinaryData> serializedAdditionalRawData)
+        {
+            FunctionArgumentsUpdate = functionArgumentsUpdate;
+            FunctionName = functionName;
+            SerializedAdditionalRawData = serializedAdditionalRawData;
+        }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/StreamingChatToolCallUpdate.Serialization.cs b/.dotnet/src/Generated/Models/StreamingChatToolCallUpdate.Serialization.cs
new file mode 100644
index 000000000..123891d29
--- /dev/null
+++
b/.dotnet/src/Generated/Models/StreamingChatToolCallUpdate.Serialization.cs
@@ -0,0 +1,174 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace OpenAI.Chat
+{
+    public partial class StreamingChatToolCallUpdate : IJsonModel<StreamingChatToolCallUpdate>
+    {
+        void IJsonModel<StreamingChatToolCallUpdate>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<StreamingChatToolCallUpdate>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(StreamingChatToolCallUpdate)} does not support writing '{format}' format.");
+            }
+
+            writer.WriteStartObject();
+            if (SerializedAdditionalRawData?.ContainsKey("index") != true)
+            {
+                writer.WritePropertyName("index"u8);
+                writer.WriteNumberValue(Index);
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("id") != true && Optional.IsDefined(Id))
+            {
+                writer.WritePropertyName("id"u8);
+                writer.WriteStringValue(Id);
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("type") != true)
+            {
+                writer.WritePropertyName("type"u8);
+                writer.WriteStringValue(Kind.ToString());
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("function") != true && Optional.IsDefined(Function))
+            {
+                writer.WritePropertyName("function"u8);
+                writer.WriteObjectValue(Function, options);
+            }
+            if (SerializedAdditionalRawData != null)
+            {
+                foreach (var item in SerializedAdditionalRawData)
+                {
+                    if (ModelSerializationExtensions.IsSentinelValue(item.Value))
+                    {
+                        continue;
+                    }
+                    writer.WritePropertyName(item.Key);
+#if NET6_0_OR_GREATER
+                    writer.WriteRawValue(item.Value);
+#else
+                    using (JsonDocument document = JsonDocument.Parse(item.Value))
+                    {
+                        JsonSerializer.Serialize(writer, document.RootElement);
+                    }
+#endif
+                }
+            }
+            writer.WriteEndObject();
+        }
+
+        StreamingChatToolCallUpdate IJsonModel<StreamingChatToolCallUpdate>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<StreamingChatToolCallUpdate>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(StreamingChatToolCallUpdate)} does not support reading '{format}' format.");
+            }
+
+            using JsonDocument document = JsonDocument.ParseValue(ref reader);
+            return DeserializeStreamingChatToolCallUpdate(document.RootElement, options);
+        }
+
+        internal static StreamingChatToolCallUpdate DeserializeStreamingChatToolCallUpdate(JsonElement element, ModelReaderWriterOptions options = null)
+        {
+            options ??= ModelSerializationExtensions.WireOptions;
+
+            if (element.ValueKind == JsonValueKind.Null)
+            {
+                return null;
+            }
+            int index = default;
+            string id = default;
+            ChatToolCallKind type = default;
+            InternalChatCompletionMessageToolCallChunkFunction function = default;
+            IDictionary<string, BinaryData> serializedAdditionalRawData = default;
+            Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>();
+            foreach (var property in element.EnumerateObject())
+            {
+                if (property.NameEquals("index"u8))
+                {
+                    index = property.Value.GetInt32();
+                    continue;
+                }
+                if (property.NameEquals("id"u8))
+                {
+                    id = property.Value.GetString();
+                    continue;
+                }
+                if (property.NameEquals("type"u8))
+                {
+                    if (property.Value.ValueKind == JsonValueKind.Null)
+                    {
+                        continue;
+                    }
+                    type = new ChatToolCallKind(property.Value.GetString());
+                    continue;
+                }
+                if (property.NameEquals("function"u8))
+                {
+                    if (property.Value.ValueKind == JsonValueKind.Null)
+                    {
+                        continue;
+                    }
+                    function = InternalChatCompletionMessageToolCallChunkFunction.DeserializeInternalChatCompletionMessageToolCallChunkFunction(property.Value, options);
+                    continue;
+                }
+                if (true)
+                {
+                    rawDataDictionary ??= new Dictionary<string, BinaryData>();
+                    rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText()));
+                }
+            }
+            serializedAdditionalRawData = rawDataDictionary;
+            return new StreamingChatToolCallUpdate(index, id, type, function, serializedAdditionalRawData);
+        }
+
+        BinaryData IPersistableModel<StreamingChatToolCallUpdate>.Write(ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<StreamingChatToolCallUpdate>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    return ModelReaderWriter.Write(this, options);
+                default:
+                    throw new FormatException($"The model {nameof(StreamingChatToolCallUpdate)} does not support writing '{options.Format}' format.");
+            }
+        }
+
+        StreamingChatToolCallUpdate IPersistableModel<StreamingChatToolCallUpdate>.Create(BinaryData data, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<StreamingChatToolCallUpdate>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    {
+                        using JsonDocument document = JsonDocument.Parse(data);
+                        return DeserializeStreamingChatToolCallUpdate(document.RootElement, options);
+                    }
+                default:
+                    throw new FormatException($"The model {nameof(StreamingChatToolCallUpdate)} does not support reading '{options.Format}' format.");
+            }
+        }
+
+        string IPersistableModel<StreamingChatToolCallUpdate>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J";
+
+        internal static StreamingChatToolCallUpdate FromResponse(PipelineResponse response)
+        {
+            using var document = JsonDocument.Parse(response.Content);
+            return DeserializeStreamingChatToolCallUpdate(document.RootElement);
+        }
+
+        internal virtual BinaryContent ToBinaryContent()
+        {
+            return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions);
+        }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/StreamingChatToolCallUpdate.cs b/.dotnet/src/Generated/Models/StreamingChatToolCallUpdate.cs
new file mode 100644
index 000000000..5d606fc1b
--- /dev/null
+++ b/.dotnet/src/Generated/Models/StreamingChatToolCallUpdate.cs
@@ -0,0 +1,30 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.Collections.Generic;
+
+namespace OpenAI.Chat
+{
+    public partial class StreamingChatToolCallUpdate
+    {
+        internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; }
+
+        internal StreamingChatToolCallUpdate(int index,
string id, ChatToolCallKind kind, InternalChatCompletionMessageToolCallChunkFunction function, IDictionary<string, BinaryData> serializedAdditionalRawData)
+        {
+            Index = index;
+            Id = id;
+            Kind = kind;
+            Function = function;
+            SerializedAdditionalRawData = serializedAdditionalRawData;
+        }
+
+        internal StreamingChatToolCallUpdate()
+        {
+        }
+
+        public int Index { get; }
+        public string Id { get; }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/SystemChatMessage.Serialization.cs b/.dotnet/src/Generated/Models/SystemChatMessage.Serialization.cs
new file mode 100644
index 000000000..71a6676c5
--- /dev/null
+++ b/.dotnet/src/Generated/Models/SystemChatMessage.Serialization.cs
@@ -0,0 +1,109 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace OpenAI.Chat
+{
+    public partial class SystemChatMessage : IJsonModel<SystemChatMessage>
+    {
+        SystemChatMessage IJsonModel<SystemChatMessage>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<SystemChatMessage>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(SystemChatMessage)} does not support reading '{format}' format.");
+            }
+
+            using JsonDocument document = JsonDocument.ParseValue(ref reader);
+            return DeserializeSystemChatMessage(document.RootElement, options);
+        }
+
+        internal static SystemChatMessage DeserializeSystemChatMessage(JsonElement element, ModelReaderWriterOptions options = null)
+        {
+            options ??= ModelSerializationExtensions.WireOptions;
+
+            if (element.ValueKind == JsonValueKind.Null)
+            {
+                return null;
+            }
+            string name = default;
+            ChatMessageRole role = default;
+            IList<ChatMessageContentPart> content = default;
+            IDictionary<string, BinaryData> serializedAdditionalRawData = default;
+            Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>();
+            foreach (var property in element.EnumerateObject())
+            {
+                if (property.NameEquals("name"u8))
+                {
+                    name = property.Value.GetString();
+                    continue;
+                }
+                if (property.NameEquals("role"u8))
+                {
+                    role = property.Value.GetString().ToChatMessageRole();
+                    continue;
+                }
+                if (property.NameEquals("content"u8))
+                {
+                    DeserializeContentValue(property, ref content);
+                    continue;
+                }
+                if (true)
+                {
+                    rawDataDictionary ??= new Dictionary<string, BinaryData>();
+                    rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText()));
+                }
+            }
+            serializedAdditionalRawData = rawDataDictionary;
+            return new SystemChatMessage(role, content ?? new ChangeTrackingList<ChatMessageContentPart>(), serializedAdditionalRawData, name);
+        }
+
+        BinaryData IPersistableModel<SystemChatMessage>.Write(ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<SystemChatMessage>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    return ModelReaderWriter.Write(this, options);
+                default:
+                    throw new FormatException($"The model {nameof(SystemChatMessage)} does not support writing '{options.Format}' format.");
+            }
+        }
+
+        SystemChatMessage IPersistableModel<SystemChatMessage>.Create(BinaryData data, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<SystemChatMessage>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    {
+                        using JsonDocument document = JsonDocument.Parse(data);
+                        return DeserializeSystemChatMessage(document.RootElement, options);
+                    }
+                default:
+                    throw new FormatException($"The model {nameof(SystemChatMessage)} does not support reading '{options.Format}' format.");
+            }
+        }
+
+        string IPersistableModel<SystemChatMessage>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J";
+
+        internal static new SystemChatMessage FromResponse(PipelineResponse response)
+        {
+            using var document = JsonDocument.Parse(response.Content);
+            return DeserializeSystemChatMessage(document.RootElement);
+        }
+
+        internal override BinaryContent ToBinaryContent()
+        {
+            return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions);
+        }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/SystemChatMessage.cs b/.dotnet/src/Generated/Models/SystemChatMessage.cs
new file mode 100644
index 000000000..962eee34b
--- /dev/null
+++ b/.dotnet/src/Generated/Models/SystemChatMessage.cs
@@ -0,0 +1,21 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.Collections.Generic;
+
+namespace OpenAI.Chat
+{
+    public partial class SystemChatMessage : ChatMessage
+    {
+        internal SystemChatMessage(ChatMessageRole role, IList<ChatMessageContentPart> content, IDictionary<string, BinaryData> serializedAdditionalRawData, string participantName) : base(role, content, serializedAdditionalRawData)
+        {
+            ParticipantName = participantName;
+        }
+
+        internal SystemChatMessage()
+        {
+        }
+    }
+}
diff --git
a/.dotnet/src/Generated/Models/ThreadCreationOptions.Serialization.cs b/.dotnet/src/Generated/Models/ThreadCreationOptions.Serialization.cs
new file mode 100644
index 000000000..a415df841
--- /dev/null
+++ b/.dotnet/src/Generated/Models/ThreadCreationOptions.Serialization.cs
@@ -0,0 +1,203 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace OpenAI.Assistants
+{
+    public partial class ThreadCreationOptions : IJsonModel<ThreadCreationOptions>
+    {
+        void IJsonModel<ThreadCreationOptions>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<ThreadCreationOptions>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(ThreadCreationOptions)} does not support writing '{format}' format.");
+            }
+
+            writer.WriteStartObject();
+            if (SerializedAdditionalRawData?.ContainsKey("messages") != true && Optional.IsCollectionDefined(InternalMessages))
+            {
+                writer.WritePropertyName("messages"u8);
+                writer.WriteStartArray();
+                foreach (var item in InternalMessages)
+                {
+                    writer.WriteObjectValue(item, options);
+                }
+                writer.WriteEndArray();
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("tool_resources") != true && Optional.IsDefined(ToolResources))
+            {
+                if (ToolResources != null)
+                {
+                    writer.WritePropertyName("tool_resources"u8);
+                    writer.WriteObjectValue(ToolResources, options);
+                }
+                else
+                {
+                    writer.WriteNull("tool_resources");
+                }
+            }
+            if (SerializedAdditionalRawData?.ContainsKey("metadata") != true && Optional.IsCollectionDefined(Metadata))
+            {
+                if (Metadata != null)
+                {
+                    writer.WritePropertyName("metadata"u8);
+                    writer.WriteStartObject();
+                    foreach (var item in Metadata)
+                    {
+                        writer.WritePropertyName(item.Key);
+                        writer.WriteStringValue(item.Value);
+                    }
+                    writer.WriteEndObject();
+                }
+                else
+                {
+                    writer.WriteNull("metadata");
+                }
+            }
+            if (SerializedAdditionalRawData != null)
+            {
+                foreach (var item in SerializedAdditionalRawData)
+                {
+                    if (ModelSerializationExtensions.IsSentinelValue(item.Value))
+                    {
+                        continue;
+                    }
+                    writer.WritePropertyName(item.Key);
+#if NET6_0_OR_GREATER
+                    writer.WriteRawValue(item.Value);
+#else
+                    using (JsonDocument document = JsonDocument.Parse(item.Value))
+                    {
+                        JsonSerializer.Serialize(writer, document.RootElement);
+                    }
+#endif
+                }
+            }
+            writer.WriteEndObject();
+        }
+
+        ThreadCreationOptions IJsonModel<ThreadCreationOptions>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<ThreadCreationOptions>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(ThreadCreationOptions)} does not support reading '{format}' format.");
+            }
+
+            using JsonDocument document = JsonDocument.ParseValue(ref reader);
+            return DeserializeThreadCreationOptions(document.RootElement, options);
+        }
+
+        internal static ThreadCreationOptions DeserializeThreadCreationOptions(JsonElement element, ModelReaderWriterOptions options = null)
+        {
+            options ??= ModelSerializationExtensions.WireOptions;
+
+            if (element.ValueKind == JsonValueKind.Null)
+            {
+                return null;
+            }
+            IList<MessageCreationOptions> messages = default;
+            ToolResources toolResources = default;
+            IDictionary<string, string> metadata = default;
+            IDictionary<string, BinaryData> serializedAdditionalRawData = default;
+            Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>();
+            foreach (var property in element.EnumerateObject())
+            {
+                if (property.NameEquals("messages"u8))
+                {
+                    if (property.Value.ValueKind == JsonValueKind.Null)
+                    {
+                        continue;
+                    }
+                    List<MessageCreationOptions> array = new List<MessageCreationOptions>();
+                    foreach (var item in property.Value.EnumerateArray())
+                    {
+                        array.Add(MessageCreationOptions.DeserializeMessageCreationOptions(item, options));
+                    }
+                    messages = array;
+                    continue;
+                }
+                if (property.NameEquals("tool_resources"u8))
+                {
+                    if (property.Value.ValueKind == JsonValueKind.Null)
+                    {
+                        toolResources = null;
+                        continue;
+                    }
+                    toolResources = Assistants.ToolResources.DeserializeToolResources(property.Value, options);
+                    continue;
+                }
+                if (property.NameEquals("metadata"u8))
+                {
+                    if (property.Value.ValueKind == JsonValueKind.Null)
+                    {
+                        continue;
+                    }
+                    Dictionary<string, string> dictionary = new Dictionary<string, string>();
+                    foreach (var property0 in property.Value.EnumerateObject())
+                    {
+                        dictionary.Add(property0.Name, property0.Value.GetString());
+                    }
+                    metadata = dictionary;
+                    continue;
+                }
+                if (true)
+                {
+                    rawDataDictionary ??= new Dictionary<string, BinaryData>();
+                    rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText()));
+                }
+            }
+            serializedAdditionalRawData = rawDataDictionary;
+            return new ThreadCreationOptions(messages ?? new ChangeTrackingList<MessageCreationOptions>(), toolResources, metadata ?? new ChangeTrackingDictionary<string, string>(), serializedAdditionalRawData);
+        }
+
+        BinaryData IPersistableModel<ThreadCreationOptions>.Write(ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<ThreadCreationOptions>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    return ModelReaderWriter.Write(this, options);
+                default:
+                    throw new FormatException($"The model {nameof(ThreadCreationOptions)} does not support writing '{options.Format}' format.");
+            }
+        }
+
+        ThreadCreationOptions IPersistableModel<ThreadCreationOptions>.Create(BinaryData data, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<ThreadCreationOptions>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    {
+                        using JsonDocument document = JsonDocument.Parse(data);
+                        return DeserializeThreadCreationOptions(document.RootElement, options);
+                    }
+                default:
+                    throw new FormatException($"The model {nameof(ThreadCreationOptions)} does not support reading '{options.Format}' format.");
+            }
+        }
+
+        string IPersistableModel<ThreadCreationOptions>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J";
+
+        internal static ThreadCreationOptions FromResponse(PipelineResponse response)
+        {
+            using var document = JsonDocument.Parse(response.Content);
+            return DeserializeThreadCreationOptions(document.RootElement);
+        }
+
+        internal virtual BinaryContent ToBinaryContent()
+        {
+            return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions);
+        }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/ThreadCreationOptions.cs b/.dotnet/src/Generated/Models/ThreadCreationOptions.cs
new file mode 100644
index 000000000..450cf612a
--- /dev/null
+++ b/.dotnet/src/Generated/Models/ThreadCreationOptions.cs
@@ -0,0 +1,28 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.Collections.Generic;
+
+namespace OpenAI.Assistants
+{
+    public partial class ThreadCreationOptions
+    {
+        internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; }
+        public ThreadCreationOptions()
+        {
+            InternalMessages = new ChangeTrackingList<MessageCreationOptions>();
+            Metadata = new ChangeTrackingDictionary<string, string>();
+        }
+
+        internal ThreadCreationOptions(IList<MessageCreationOptions> internalMessages, ToolResources toolResources, IDictionary<string, string> metadata, IDictionary<string, BinaryData> serializedAdditionalRawData)
+        {
+            InternalMessages = internalMessages;
+            ToolResources = toolResources;
+            Metadata = metadata;
+            SerializedAdditionalRawData = serializedAdditionalRawData;
+        }
+        public IDictionary<string, string> Metadata { get; set; }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/ThreadMessage.Serialization.cs b/.dotnet/src/Generated/Models/ThreadMessage.Serialization.cs
new file mode 100644
index
000000000..333213907 --- /dev/null +++ b/.dotnet/src/Generated/Models/ThreadMessage.Serialization.cs @@ -0,0 +1,406 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + public partial class ThreadMessage : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ThreadMessage)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("id") != true) + { + writer.WritePropertyName("id"u8); + writer.WriteStringValue(Id); + } + if (SerializedAdditionalRawData?.ContainsKey("object") != true) + { + writer.WritePropertyName("object"u8); + writer.WriteStringValue(Object.ToString()); + } + if (SerializedAdditionalRawData?.ContainsKey("created_at") != true) + { + writer.WritePropertyName("created_at"u8); + writer.WriteNumberValue(CreatedAt, "U"); + } + if (SerializedAdditionalRawData?.ContainsKey("thread_id") != true) + { + writer.WritePropertyName("thread_id"u8); + writer.WriteStringValue(ThreadId); + } + if (SerializedAdditionalRawData?.ContainsKey("status") != true) + { + writer.WritePropertyName("status"u8); + writer.WriteStringValue(Status.ToString()); + } + if (SerializedAdditionalRawData?.ContainsKey("incomplete_details") != true) + { + if (IncompleteDetails != null) + { + writer.WritePropertyName("incomplete_details"u8); + writer.WriteObjectValue(IncompleteDetails, options); + } + else + { + writer.WriteNull("incomplete_details"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("completed_at") != true) + { + if (CompletedAt != null) + { + writer.WritePropertyName("completed_at"u8); + 
writer.WriteNumberValue(CompletedAt.Value, "U"); + } + else + { + writer.WriteNull("completed_at"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("incomplete_at") != true) + { + if (IncompleteAt != null) + { + writer.WritePropertyName("incomplete_at"u8); + writer.WriteNumberValue(IncompleteAt.Value, "U"); + } + else + { + writer.WriteNull("incomplete_at"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("role") != true) + { + writer.WritePropertyName("role"u8); + writer.WriteStringValue(Role.ToSerialString()); + } + if (SerializedAdditionalRawData?.ContainsKey("content") != true) + { + writer.WritePropertyName("content"u8); + writer.WriteStartArray(); + foreach (var item in Content) + { + writer.WriteObjectValue(item, options); + } + writer.WriteEndArray(); + } + if (SerializedAdditionalRawData?.ContainsKey("assistant_id") != true) + { + if (AssistantId != null) + { + writer.WritePropertyName("assistant_id"u8); + writer.WriteStringValue(AssistantId); + } + else + { + writer.WriteNull("assistant_id"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("run_id") != true) + { + if (RunId != null) + { + writer.WritePropertyName("run_id"u8); + writer.WriteStringValue(RunId); + } + else + { + writer.WriteNull("run_id"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("attachments") != true) + { + if (Attachments != null && Optional.IsCollectionDefined(Attachments)) + { + writer.WritePropertyName("attachments"u8); + writer.WriteStartArray(); + foreach (var item in Attachments) + { + writer.WriteObjectValue(item, options); + } + writer.WriteEndArray(); + } + else + { + writer.WriteNull("attachments"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("metadata") != true) + { + if (Metadata != null && Optional.IsCollectionDefined(Metadata)) + { + writer.WritePropertyName("metadata"u8); + writer.WriteStartObject(); + foreach (var item in Metadata) + { + writer.WritePropertyName(item.Key); + writer.WriteStringValue(item.Value); + } + 
writer.WriteEndObject(); + } + else + { + writer.WriteNull("metadata"); + } + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + ThreadMessage IJsonModel<ThreadMessage>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<ThreadMessage>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ThreadMessage)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeThreadMessage(document.RootElement, options); + } + + internal static ThreadMessage DeserializeThreadMessage(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string id = default; + InternalMessageObjectObject @object = default; + DateTimeOffset createdAt = default; + string threadId = default; + MessageStatus status = default; + MessageFailureDetails incompleteDetails = default; + DateTimeOffset? completedAt = default; + DateTimeOffset?
incompleteAt = default; + MessageRole role = default; + IReadOnlyList<MessageContent> content = default; + string assistantId = default; + string runId = default; + IReadOnlyList<MessageCreationAttachment> attachments = default; + IReadOnlyDictionary<string, string> metadata = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("id"u8)) + { + id = property.Value.GetString(); + continue; + } + if (property.NameEquals("object"u8)) + { + @object = new InternalMessageObjectObject(property.Value.GetString()); + continue; + } + if (property.NameEquals("created_at"u8)) + { + createdAt = DateTimeOffset.FromUnixTimeSeconds(property.Value.GetInt64()); + continue; + } + if (property.NameEquals("thread_id"u8)) + { + threadId = property.Value.GetString(); + continue; + } + if (property.NameEquals("status"u8)) + { + status = new MessageStatus(property.Value.GetString()); + continue; + } + if (property.NameEquals("incomplete_details"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + incompleteDetails = null; + continue; + } + incompleteDetails = MessageFailureDetails.DeserializeMessageFailureDetails(property.Value, options); + continue; + } + if (property.NameEquals("completed_at"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + completedAt = null; + continue; + } + completedAt = DateTimeOffset.FromUnixTimeSeconds(property.Value.GetInt64()); + continue; + } + if (property.NameEquals("incomplete_at"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + incompleteAt = null; + continue; + } + incompleteAt = DateTimeOffset.FromUnixTimeSeconds(property.Value.GetInt64()); + continue; + } + if (property.NameEquals("role"u8)) + { + role = property.Value.GetString().ToMessageRole(); + continue; + } + if (property.NameEquals("content"u8)) + { + List<MessageContent> array = new List<MessageContent>(); + foreach (var item in property.Value.EnumerateArray()) + {
array.Add(MessageContent.DeserializeMessageContent(item, options)); + } + content = array; + continue; + } + if (property.NameEquals("assistant_id"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + assistantId = null; + continue; + } + assistantId = property.Value.GetString(); + continue; + } + if (property.NameEquals("run_id"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + runId = null; + continue; + } + runId = property.Value.GetString(); + continue; + } + if (property.NameEquals("attachments"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + attachments = new ChangeTrackingList<MessageCreationAttachment>(); + continue; + } + List<MessageCreationAttachment> array = new List<MessageCreationAttachment>(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(MessageCreationAttachment.DeserializeMessageCreationAttachment(item, options)); + } + attachments = array; + continue; + } + if (property.NameEquals("metadata"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + metadata = new ChangeTrackingDictionary<string, string>(); + continue; + } + Dictionary<string, string> dictionary = new Dictionary<string, string>(); + foreach (var property0 in property.Value.EnumerateObject()) + { + dictionary.Add(property0.Name, property0.Value.GetString()); + } + metadata = dictionary; + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new ThreadMessage( + id, + @object, + createdAt, + threadId, + status, + incompleteDetails, + completedAt, + incompleteAt, + role, + content, + assistantId, + runId, + attachments, + metadata, + serializedAdditionalRawData); + } + + BinaryData IPersistableModel<ThreadMessage>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ?
((IPersistableModel<ThreadMessage>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(ThreadMessage)} does not support writing '{options.Format}' format."); + } + } + + ThreadMessage IPersistableModel<ThreadMessage>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<ThreadMessage>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeThreadMessage(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(ThreadMessage)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<ThreadMessage>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static ThreadMessage FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeThreadMessage(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/ThreadMessage.cs b/.dotnet/src/Generated/Models/ThreadMessage.cs new file mode 100644 index 000000000..ba5967d55 --- /dev/null +++ b/.dotnet/src/Generated/Models/ThreadMessage.cs @@ -0,0 +1,71 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; +using System.Linq; + +namespace OpenAI.Assistants +{ + public partial class ThreadMessage + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + internal ThreadMessage(string id, DateTimeOffset createdAt, string threadId, MessageStatus status, MessageFailureDetails incompleteDetails, DateTimeOffset? completedAt, DateTimeOffset?
incompleteAt, MessageRole role, IEnumerable<MessageContent> content, string assistantId, string runId, IEnumerable<MessageCreationAttachment> attachments, IReadOnlyDictionary<string, string> metadata) + { + Argument.AssertNotNull(id, nameof(id)); + Argument.AssertNotNull(threadId, nameof(threadId)); + Argument.AssertNotNull(content, nameof(content)); + + Id = id; + CreatedAt = createdAt; + ThreadId = threadId; + Status = status; + IncompleteDetails = incompleteDetails; + CompletedAt = completedAt; + IncompleteAt = incompleteAt; + Role = role; + Content = content.ToList(); + AssistantId = assistantId; + RunId = runId; + Attachments = attachments?.ToList(); + Metadata = metadata; + } + + internal ThreadMessage(string id, InternalMessageObjectObject @object, DateTimeOffset createdAt, string threadId, MessageStatus status, MessageFailureDetails incompleteDetails, DateTimeOffset? completedAt, DateTimeOffset? incompleteAt, MessageRole role, IReadOnlyList<MessageContent> content, string assistantId, string runId, IReadOnlyList<MessageCreationAttachment> attachments, IReadOnlyDictionary<string, string> metadata, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + Id = id; + Object = @object; + CreatedAt = createdAt; + ThreadId = threadId; + Status = status; + IncompleteDetails = incompleteDetails; + CompletedAt = completedAt; + IncompleteAt = incompleteAt; + Role = role; + Content = content; + AssistantId = assistantId; + RunId = runId; + Attachments = attachments; + Metadata = metadata; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal ThreadMessage() + { + } + + public string Id { get; } + + public DateTimeOffset CreatedAt { get; } + public string ThreadId { get; } + public MessageStatus Status { get; } + public MessageFailureDetails IncompleteDetails { get; } + public DateTimeOffset? CompletedAt { get; } + public DateTimeOffset?
IncompleteAt { get; } + public IReadOnlyList<MessageContent> Content { get; } + public string AssistantId { get; } + public string RunId { get; } + public IReadOnlyDictionary<string, string> Metadata { get; } + } +} diff --git a/.dotnet/src/Generated/Models/ThreadModificationOptions.Serialization.cs b/.dotnet/src/Generated/Models/ThreadModificationOptions.Serialization.cs new file mode 100644 index 000000000..e53955c26 --- /dev/null +++ b/.dotnet/src/Generated/Models/ThreadModificationOptions.Serialization.cs @@ -0,0 +1,178 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + public partial class ThreadModificationOptions : IJsonModel<ThreadModificationOptions> + { + void IJsonModel<ThreadModificationOptions>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<ThreadModificationOptions>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ThreadModificationOptions)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("tool_resources") != true && Optional.IsDefined(ToolResources)) + { + if (ToolResources != null) + { + writer.WritePropertyName("tool_resources"u8); + writer.WriteObjectValue(ToolResources, options); + } + else + { + writer.WriteNull("tool_resources"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("metadata") != true && Optional.IsCollectionDefined(Metadata)) + { + if (Metadata != null) + { + writer.WritePropertyName("metadata"u8); + writer.WriteStartObject(); + foreach (var item in Metadata) + { + writer.WritePropertyName(item.Key); + writer.WriteStringValue(item.Value); + } + writer.WriteEndObject(); + } + else + { + writer.WriteNull("metadata"); + } + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if
(ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + ThreadModificationOptions IJsonModel<ThreadModificationOptions>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<ThreadModificationOptions>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ThreadModificationOptions)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeThreadModificationOptions(document.RootElement, options); + } + + internal static ThreadModificationOptions DeserializeThreadModificationOptions(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + ToolResources toolResources = default; + IDictionary<string, string> metadata = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("tool_resources"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + toolResources = null; + continue; + } + toolResources = Assistants.ToolResources.DeserializeToolResources(property.Value, options); + continue; + } + if (property.NameEquals("metadata"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + Dictionary<string, string> dictionary = new Dictionary<string, string>(); + foreach (var property0 in property.Value.EnumerateObject()) + { + dictionary.Add(property0.Name, property0.Value.GetString()); + } + metadata = dictionary; + continue; +
} + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new ThreadModificationOptions(toolResources, metadata ?? new ChangeTrackingDictionary<string, string>(), serializedAdditionalRawData); + } + + BinaryData IPersistableModel<ThreadModificationOptions>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<ThreadModificationOptions>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(ThreadModificationOptions)} does not support writing '{options.Format}' format."); + } + } + + ThreadModificationOptions IPersistableModel<ThreadModificationOptions>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<ThreadModificationOptions>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeThreadModificationOptions(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(ThreadModificationOptions)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<ThreadModificationOptions>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static ThreadModificationOptions FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeThreadModificationOptions(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/ThreadModificationOptions.cs b/.dotnet/src/Generated/Models/ThreadModificationOptions.cs new file mode 100644 index 000000000..b26090b10 --- /dev/null +++
b/.dotnet/src/Generated/Models/ThreadModificationOptions.cs @@ -0,0 +1,26 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + public partial class ThreadModificationOptions + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + public ThreadModificationOptions() + { + Metadata = new ChangeTrackingDictionary<string, string>(); + } + + internal ThreadModificationOptions(ToolResources toolResources, IDictionary<string, string> metadata, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + ToolResources = toolResources; + Metadata = metadata; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + public IDictionary<string, string> Metadata { get; set; } + } +} diff --git a/.dotnet/src/Generated/Models/ThreadRun.Serialization.cs b/.dotnet/src/Generated/Models/ThreadRun.Serialization.cs new file mode 100644 index 000000000..4c8844894 --- /dev/null +++ b/.dotnet/src/Generated/Models/ThreadRun.Serialization.cs @@ -0,0 +1,672 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + public partial class ThreadRun : IJsonModel<ThreadRun> + { + void IJsonModel<ThreadRun>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ?
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ThreadRun)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("id") != true) + { + writer.WritePropertyName("id"u8); + writer.WriteStringValue(Id); + } + if (SerializedAdditionalRawData?.ContainsKey("object") != true) + { + writer.WritePropertyName("object"u8); + writer.WriteStringValue(Object.ToString()); + } + if (SerializedAdditionalRawData?.ContainsKey("created_at") != true) + { + writer.WritePropertyName("created_at"u8); + writer.WriteNumberValue(CreatedAt, "U"); + } + if (SerializedAdditionalRawData?.ContainsKey("thread_id") != true) + { + writer.WritePropertyName("thread_id"u8); + writer.WriteStringValue(ThreadId); + } + if (SerializedAdditionalRawData?.ContainsKey("assistant_id") != true) + { + writer.WritePropertyName("assistant_id"u8); + writer.WriteStringValue(AssistantId); + } + if (SerializedAdditionalRawData?.ContainsKey("status") != true) + { + writer.WritePropertyName("status"u8); + writer.WriteStringValue(Status.ToString()); + } + if (SerializedAdditionalRawData?.ContainsKey("required_action") != true) + { + if (_internalRequiredAction != null) + { + writer.WritePropertyName("required_action"u8); + writer.WriteObjectValue(_internalRequiredAction, options); + } + else + { + writer.WriteNull("required_action"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("last_error") != true) + { + if (LastError != null) + { + writer.WritePropertyName("last_error"u8); + writer.WriteObjectValue(LastError, options); + } + else + { + writer.WriteNull("last_error"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("expires_at") != true) + { + if (ExpiresAt != null) + { + writer.WritePropertyName("expires_at"u8); + writer.WriteNumberValue(ExpiresAt.Value, "U"); + } + else + { + writer.WriteNull("expires_at"); + } + } + if 
(SerializedAdditionalRawData?.ContainsKey("started_at") != true) + { + if (StartedAt != null) + { + writer.WritePropertyName("started_at"u8); + writer.WriteNumberValue(StartedAt.Value, "U"); + } + else + { + writer.WriteNull("started_at"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("cancelled_at") != true) + { + if (CancelledAt != null) + { + writer.WritePropertyName("cancelled_at"u8); + writer.WriteNumberValue(CancelledAt.Value, "U"); + } + else + { + writer.WriteNull("cancelled_at"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("failed_at") != true) + { + if (FailedAt != null) + { + writer.WritePropertyName("failed_at"u8); + writer.WriteNumberValue(FailedAt.Value, "U"); + } + else + { + writer.WriteNull("failed_at"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("completed_at") != true) + { + if (CompletedAt != null) + { + writer.WritePropertyName("completed_at"u8); + writer.WriteNumberValue(CompletedAt.Value, "U"); + } + else + { + writer.WriteNull("completed_at"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("incomplete_details") != true) + { + if (IncompleteDetails != null) + { + writer.WritePropertyName("incomplete_details"u8); + writer.WriteObjectValue(IncompleteDetails, options); + } + else + { + writer.WriteNull("incomplete_details"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("model") != true) + { + writer.WritePropertyName("model"u8); + writer.WriteStringValue(Model); + } + if (SerializedAdditionalRawData?.ContainsKey("instructions") != true) + { + writer.WritePropertyName("instructions"u8); + writer.WriteStringValue(Instructions); + } + if (SerializedAdditionalRawData?.ContainsKey("tools") != true) + { + writer.WritePropertyName("tools"u8); + writer.WriteStartArray(); + foreach (var item in Tools) + { + writer.WriteObjectValue(item, options); + } + writer.WriteEndArray(); + } + if (SerializedAdditionalRawData?.ContainsKey("metadata") != true) + { + if (Metadata != null && 
Optional.IsCollectionDefined(Metadata)) + { + writer.WritePropertyName("metadata"u8); + writer.WriteStartObject(); + foreach (var item in Metadata) + { + writer.WritePropertyName(item.Key); + writer.WriteStringValue(item.Value); + } + writer.WriteEndObject(); + } + else + { + writer.WriteNull("metadata"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("usage") != true) + { + if (Usage != null) + { + writer.WritePropertyName("usage"u8); + writer.WriteObjectValue(Usage, options); + } + else + { + writer.WriteNull("usage"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("temperature") != true && Optional.IsDefined(Temperature)) + { + if (Temperature != null) + { + writer.WritePropertyName("temperature"u8); + writer.WriteNumberValue(Temperature.Value); + } + else + { + writer.WriteNull("temperature"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("top_p") != true && Optional.IsDefined(NucleusSamplingFactor)) + { + if (NucleusSamplingFactor != null) + { + writer.WritePropertyName("top_p"u8); + writer.WriteNumberValue(NucleusSamplingFactor.Value); + } + else + { + writer.WriteNull("top_p"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("max_prompt_tokens") != true) + { + if (MaxPromptTokens != null) + { + writer.WritePropertyName("max_prompt_tokens"u8); + writer.WriteNumberValue(MaxPromptTokens.Value); + } + else + { + writer.WriteNull("max_prompt_tokens"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("max_completion_tokens") != true) + { + if (MaxCompletionTokens != null) + { + writer.WritePropertyName("max_completion_tokens"u8); + writer.WriteNumberValue(MaxCompletionTokens.Value); + } + else + { + writer.WriteNull("max_completion_tokens"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("truncation_strategy") != true) + { + if (TruncationStrategy != null) + { + writer.WritePropertyName("truncation_strategy"u8); + writer.WriteObjectValue(TruncationStrategy, options); + } + else + { + 
writer.WriteNull("truncation_strategy"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("tool_choice") != true) + { + if (ToolConstraint != null) + { + writer.WritePropertyName("tool_choice"u8); + writer.WriteObjectValue(ToolConstraint, options); + } + else + { + writer.WriteNull("tool_choice"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("parallel_tool_calls") != true) + { + writer.WritePropertyName("parallel_tool_calls"u8); + writer.WriteBooleanValue(ParallelToolCallsEnabled.Value); + } + if (SerializedAdditionalRawData?.ContainsKey("response_format") != true) + { + if (ResponseFormat != null) + { + writer.WritePropertyName("response_format"u8); + writer.WriteObjectValue(ResponseFormat, options); + } + else + { + writer.WriteNull("response_format"); + } + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + ThreadRun IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<ThreadRun>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ThreadRun)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeThreadRun(document.RootElement, options); + } + + internal static ThreadRun DeserializeThreadRun(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string id = default; + InternalRunObjectObject @object = default; + DateTimeOffset createdAt = default; + string threadId = default; + string assistantId = default; + RunStatus status = default; + InternalRunRequiredAction requiredAction = default; + RunError lastError = default; + DateTimeOffset? expiresAt = default; + DateTimeOffset? startedAt = default; + DateTimeOffset? cancelledAt = default; + DateTimeOffset? failedAt = default; + DateTimeOffset? completedAt = default; + RunIncompleteDetails incompleteDetails = default; + string model = default; + string instructions = default; + IReadOnlyList<ToolDefinition> tools = default; + IReadOnlyDictionary<string, string> metadata = default; + RunTokenUsage usage = default; + float? temperature = default; + float? topP = default; + int? maxPromptTokens = default; + int? maxCompletionTokens = default; + RunTruncationStrategy truncationStrategy = default; + ToolConstraint toolChoice = default; + bool?
parallelToolCalls = default; + AssistantResponseFormat responseFormat = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("id"u8)) + { + id = property.Value.GetString(); + continue; + } + if (property.NameEquals("object"u8)) + { + @object = new InternalRunObjectObject(property.Value.GetString()); + continue; + } + if (property.NameEquals("created_at"u8)) + { + createdAt = DateTimeOffset.FromUnixTimeSeconds(property.Value.GetInt64()); + continue; + } + if (property.NameEquals("thread_id"u8)) + { + threadId = property.Value.GetString(); + continue; + } + if (property.NameEquals("assistant_id"u8)) + { + assistantId = property.Value.GetString(); + continue; + } + if (property.NameEquals("status"u8)) + { + status = new RunStatus(property.Value.GetString()); + continue; + } + if (property.NameEquals("required_action"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + requiredAction = null; + continue; + } + requiredAction = InternalRunRequiredAction.DeserializeInternalRunRequiredAction(property.Value, options); + continue; + } + if (property.NameEquals("last_error"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + lastError = null; + continue; + } + lastError = RunError.DeserializeRunError(property.Value, options); + continue; + } + if (property.NameEquals("expires_at"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + expiresAt = null; + continue; + } + expiresAt = DateTimeOffset.FromUnixTimeSeconds(property.Value.GetInt64()); + continue; + } + if (property.NameEquals("started_at"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + startedAt = null; + continue; + } + startedAt = DateTimeOffset.FromUnixTimeSeconds(property.Value.GetInt64()); + continue; + } + if (property.NameEquals("cancelled_at"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) +
{ + cancelledAt = null; + continue; + } + cancelledAt = DateTimeOffset.FromUnixTimeSeconds(property.Value.GetInt64()); + continue; + } + if (property.NameEquals("failed_at"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + failedAt = null; + continue; + } + failedAt = DateTimeOffset.FromUnixTimeSeconds(property.Value.GetInt64()); + continue; + } + if (property.NameEquals("completed_at"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + completedAt = null; + continue; + } + completedAt = DateTimeOffset.FromUnixTimeSeconds(property.Value.GetInt64()); + continue; + } + if (property.NameEquals("incomplete_details"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + incompleteDetails = null; + continue; + } + incompleteDetails = RunIncompleteDetails.DeserializeRunIncompleteDetails(property.Value, options); + continue; + } + if (property.NameEquals("model"u8)) + { + model = property.Value.GetString(); + continue; + } + if (property.NameEquals("instructions"u8)) + { + instructions = property.Value.GetString(); + continue; + } + if (property.NameEquals("tools"u8)) + { + List array = new List(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(ToolDefinition.DeserializeToolDefinition(item, options)); + } + tools = array; + continue; + } + if (property.NameEquals("metadata"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + metadata = new ChangeTrackingDictionary(); + continue; + } + Dictionary dictionary = new Dictionary(); + foreach (var property0 in property.Value.EnumerateObject()) + { + dictionary.Add(property0.Name, property0.Value.GetString()); + } + metadata = dictionary; + continue; + } + if (property.NameEquals("usage"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + usage = null; + continue; + } + usage = RunTokenUsage.DeserializeRunTokenUsage(property.Value, options); + continue; + } + if (property.NameEquals("temperature"u8)) + { + if 
(property.Value.ValueKind == JsonValueKind.Null) + { + temperature = null; + continue; + } + temperature = property.Value.GetSingle(); + continue; + } + if (property.NameEquals("top_p"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + topP = null; + continue; + } + topP = property.Value.GetSingle(); + continue; + } + if (property.NameEquals("max_prompt_tokens"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + maxPromptTokens = null; + continue; + } + maxPromptTokens = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("max_completion_tokens"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + maxCompletionTokens = null; + continue; + } + maxCompletionTokens = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("truncation_strategy"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + truncationStrategy = null; + continue; + } + truncationStrategy = RunTruncationStrategy.DeserializeRunTruncationStrategy(property.Value, options); + continue; + } + if (property.NameEquals("tool_choice"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + toolChoice = null; + continue; + } + toolChoice = Assistants.ToolConstraint.DeserializeToolConstraint(property.Value, options); + continue; + } + if (property.NameEquals("parallel_tool_calls"u8)) + { + parallelToolCalls = property.Value.GetBoolean(); + continue; + } + if (property.NameEquals("response_format"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + responseFormat = null; + continue; + } + responseFormat = AssistantResponseFormat.DeserializeAssistantResponseFormat(property.Value, options); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new ThreadRun( + id, + @object, + createdAt, + threadId, + assistantId, + 
status, + requiredAction, + lastError, + expiresAt, + startedAt, + cancelledAt, + failedAt, + completedAt, + incompleteDetails, + model, + instructions, + tools, + metadata, + usage, + temperature, + topP, + maxPromptTokens, + maxCompletionTokens, + truncationStrategy, + toolChoice, + parallelToolCalls, + responseFormat, + serializedAdditionalRawData); + } + + BinaryData IPersistableModel<ThreadRun>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<ThreadRun>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(ThreadRun)} does not support writing '{options.Format}' format."); + } + } + + ThreadRun IPersistableModel<ThreadRun>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<ThreadRun>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeThreadRun(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(ThreadRun)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<ThreadRun>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static ThreadRun FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeThreadRun(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/ThreadRun.cs b/.dotnet/src/Generated/Models/ThreadRun.cs new file mode 100644 index 000000000..cc4a44572 --- /dev/null +++ b/.dotnet/src/Generated/Models/ThreadRun.cs @@ -0,0 +1,74 @@ +// + +#nullable disable + +using System; +using
System.Collections.Generic; +using System.Linq; + +namespace OpenAI.Assistants +{ + public partial class ThreadRun + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + + internal ThreadRun(string id, InternalRunObjectObject @object, DateTimeOffset createdAt, string threadId, string assistantId, RunStatus status, InternalRunRequiredAction internalRequiredAction, RunError lastError, DateTimeOffset? expiresAt, DateTimeOffset? startedAt, DateTimeOffset? cancelledAt, DateTimeOffset? failedAt, DateTimeOffset? completedAt, RunIncompleteDetails incompleteDetails, string model, string instructions, IReadOnlyList<ToolDefinition> tools, IReadOnlyDictionary<string, string> metadata, RunTokenUsage usage, float? temperature, float? nucleusSamplingFactor, int? maxPromptTokens, int? maxCompletionTokens, RunTruncationStrategy truncationStrategy, ToolConstraint toolConstraint, bool? parallelToolCallsEnabled, AssistantResponseFormat responseFormat, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + Id = id; + Object = @object; + CreatedAt = createdAt; + ThreadId = threadId; + AssistantId = assistantId; + Status = status; + _internalRequiredAction = internalRequiredAction; + LastError = lastError; + ExpiresAt = expiresAt; + StartedAt = startedAt; + CancelledAt = cancelledAt; + FailedAt = failedAt; + CompletedAt = completedAt; + IncompleteDetails = incompleteDetails; + Model = model; + Instructions = instructions; + Tools = tools; + Metadata = metadata; + Usage = usage; + Temperature = temperature; + NucleusSamplingFactor = nucleusSamplingFactor; + MaxPromptTokens = maxPromptTokens; + MaxCompletionTokens = maxCompletionTokens; + TruncationStrategy = truncationStrategy; + ToolConstraint = toolConstraint; + ParallelToolCallsEnabled = parallelToolCallsEnabled; + ResponseFormat = responseFormat; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal ThreadRun() + { + } + + public string Id { get; } + + public DateTimeOffset CreatedAt { get; } + public string ThreadId { get; } + public
string AssistantId { get; } + public RunStatus Status { get; } + public RunError LastError { get; } + public DateTimeOffset? ExpiresAt { get; } + public DateTimeOffset? StartedAt { get; } + public DateTimeOffset? CancelledAt { get; } + public DateTimeOffset? FailedAt { get; } + public DateTimeOffset? CompletedAt { get; } + public RunIncompleteDetails IncompleteDetails { get; } + public string Model { get; } + public string Instructions { get; } + public IReadOnlyList Tools { get; } + public IReadOnlyDictionary Metadata { get; } + public RunTokenUsage Usage { get; } + public float? Temperature { get; } + public int? MaxPromptTokens { get; } + public int? MaxCompletionTokens { get; } + public RunTruncationStrategy TruncationStrategy { get; } + } +} diff --git a/.dotnet/src/Generated/Models/ToolChatMessage.Serialization.cs b/.dotnet/src/Generated/Models/ToolChatMessage.Serialization.cs new file mode 100644 index 000000000..fb4921699 --- /dev/null +++ b/.dotnet/src/Generated/Models/ToolChatMessage.Serialization.cs @@ -0,0 +1,109 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Chat +{ + public partial class ToolChatMessage : IJsonModel + { + ToolChatMessage IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ToolChatMessage)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeToolChatMessage(document.RootElement, options); + } + + internal static ToolChatMessage DeserializeToolChatMessage(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string toolCallId = default; + ChatMessageRole role = default; + IList content = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("tool_call_id"u8)) + { + toolCallId = property.Value.GetString(); + continue; + } + if (property.NameEquals("role"u8)) + { + role = property.Value.GetString().ToChatMessageRole(); + continue; + } + if (property.NameEquals("content"u8)) + { + DeserializeContentValue(property, ref content); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new ToolChatMessage(role, content ?? new ChangeTrackingList(), serializedAdditionalRawData, toolCallId); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(ToolChatMessage)} does not support writing '{options.Format}' format."); + } + } + + ToolChatMessage IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeToolChatMessage(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(ToolChatMessage)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static new ToolChatMessage FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeToolChatMessage(document.RootElement); + } + + internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/ToolChatMessage.cs b/.dotnet/src/Generated/Models/ToolChatMessage.cs new file mode 100644 index 000000000..d0d4c210b --- /dev/null +++ b/.dotnet/src/Generated/Models/ToolChatMessage.cs @@ -0,0 +1,23 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Chat +{ + public partial class ToolChatMessage : ChatMessage + { + internal ToolChatMessage(ChatMessageRole role, IList content, IDictionary serializedAdditionalRawData, string toolCallId) : base(role, content, serializedAdditionalRawData) + { + ToolCallId = toolCallId; + } + + internal ToolChatMessage() + { + } + + public string ToolCallId { get; } + } +} diff --git 
a/.dotnet/src/Generated/Models/ToolConstraint.Serialization.cs b/.dotnet/src/Generated/Models/ToolConstraint.Serialization.cs new file mode 100644 index 000000000..2282c137b --- /dev/null +++ b/.dotnet/src/Generated/Models/ToolConstraint.Serialization.cs @@ -0,0 +1,27 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + public partial class ToolConstraint : IJsonModel<ToolConstraint> + { + string IPersistableModel<ToolConstraint>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static ToolConstraint FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeToolConstraint(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/ToolConstraint.cs b/.dotnet/src/Generated/Models/ToolConstraint.cs new file mode 100644 index 000000000..519953a5f --- /dev/null +++ b/.dotnet/src/Generated/Models/ToolConstraint.cs @@ -0,0 +1,23 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + public partial class ToolConstraint + { + internal ToolConstraint(string objectType, InternalAssistantsNamedToolChoiceFunction function, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + _objectType = objectType; + Function = function; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal ToolConstraint() + { + } + } +} diff --git a/.dotnet/src/Generated/Models/ToolDefinition.Serialization.cs b/.dotnet/src/Generated/Models/ToolDefinition.Serialization.cs new file mode 100644 index 000000000..443582baf --- /dev/null +++ b/.dotnet/src/Generated/Models/ToolDefinition.Serialization.cs @@ -0,0 +1,89 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using
System.ClientModel.Primitives; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + [PersistableModelProxy(typeof(UnknownAssistantToolDefinition))] + public partial class ToolDefinition : IJsonModel + { + ToolDefinition IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ToolDefinition)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeToolDefinition(document.RootElement, options); + } + + internal static ToolDefinition DeserializeToolDefinition(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + if (element.TryGetProperty("type", out JsonElement discriminator)) + { + switch (discriminator.GetString()) + { + case "code_interpreter": return CodeInterpreterToolDefinition.DeserializeCodeInterpreterToolDefinition(element, options); + case "file_search": return FileSearchToolDefinition.DeserializeFileSearchToolDefinition(element, options); + case "function": return FunctionToolDefinition.DeserializeFunctionToolDefinition(element, options); + } + } + return UnknownAssistantToolDefinition.DeserializeUnknownAssistantToolDefinition(element, options); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(ToolDefinition)} does not support writing '{options.Format}' format."); + } + } + + ToolDefinition IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeToolDefinition(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(ToolDefinition)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static ToolDefinition FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeToolDefinition(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/ToolDefinition.cs b/.dotnet/src/Generated/Models/ToolDefinition.cs new file mode 100644 index 000000000..8f3a99763 --- /dev/null +++ b/.dotnet/src/Generated/Models/ToolDefinition.cs @@ -0,0 +1,25 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + public abstract partial class ToolDefinition + { + internal IDictionary SerializedAdditionalRawData { get; set; } + protected ToolDefinition() + { + } + + internal ToolDefinition(string type, IDictionary serializedAdditionalRawData) + { + Type = type; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal string Type { get; set; } + } +} diff --git 
a/.dotnet/src/Generated/Models/ToolOutput.Serialization.cs b/.dotnet/src/Generated/Models/ToolOutput.Serialization.cs new file mode 100644 index 000000000..d4b3f7f17 --- /dev/null +++ b/.dotnet/src/Generated/Models/ToolOutput.Serialization.cs @@ -0,0 +1,144 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + public partial class ToolOutput : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ToolOutput)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("tool_call_id") != true && Optional.IsDefined(ToolCallId)) + { + writer.WritePropertyName("tool_call_id"u8); + writer.WriteStringValue(ToolCallId); + } + if (SerializedAdditionalRawData?.ContainsKey("output") != true && Optional.IsDefined(Output)) + { + writer.WritePropertyName("output"u8); + writer.WriteStringValue(Output); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + ToolOutput IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ToolOutput)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeToolOutput(document.RootElement, options); + } + + internal static ToolOutput DeserializeToolOutput(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string toolCallId = default; + string output = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("tool_call_id"u8)) + { + toolCallId = property.Value.GetString(); + continue; + } + if (property.NameEquals("output"u8)) + { + output = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new ToolOutput(toolCallId, output, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(ToolOutput)} does not support writing '{options.Format}' format."); + } + } + + ToolOutput IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeToolOutput(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(ToolOutput)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static ToolOutput FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeToolOutput(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/ToolOutput.cs b/.dotnet/src/Generated/Models/ToolOutput.cs new file mode 100644 index 000000000..0e7b6441b --- /dev/null +++ b/.dotnet/src/Generated/Models/ToolOutput.cs @@ -0,0 +1,27 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + public partial class ToolOutput + { + internal IDictionary SerializedAdditionalRawData { get; set; } + public ToolOutput() + { + } + + internal ToolOutput(string toolCallId, string output, IDictionary serializedAdditionalRawData) + { + ToolCallId = toolCallId; + Output = output; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + public string ToolCallId { get; set; } + public string Output { get; set; } + } +} diff --git a/.dotnet/src/Generated/Models/ToolResources.Serialization.cs b/.dotnet/src/Generated/Models/ToolResources.Serialization.cs new file mode 100644 index 000000000..75ceb9212 --- /dev/null +++ b/.dotnet/src/Generated/Models/ToolResources.Serialization.cs @@ -0,0 +1,152 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using 
System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + public partial class ToolResources : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ToolResources)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("code_interpreter") != true && Optional.IsDefined(CodeInterpreter)) + { + writer.WritePropertyName("code_interpreter"u8); + writer.WriteObjectValue(CodeInterpreter, options); + } + if (SerializedAdditionalRawData?.ContainsKey("file_search") != true && Optional.IsDefined(FileSearch)) + { + writer.WritePropertyName("file_search"u8); + SerializeFileSearch(writer, options); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + ToolResources IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ToolResources)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeToolResources(document.RootElement, options); + } + + internal static ToolResources DeserializeToolResources(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + CodeInterpreterToolResources codeInterpreter = default; + FileSearchToolResources fileSearch = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("code_interpreter"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + codeInterpreter = CodeInterpreterToolResources.DeserializeCodeInterpreterToolResources(property.Value, options); + continue; + } + if (property.NameEquals("file_search"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + fileSearch = FileSearchToolResources.DeserializeFileSearchToolResources(property.Value, options); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new ToolResources(codeInterpreter, fileSearch, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(ToolResources)} does not support writing '{options.Format}' format."); + } + } + + ToolResources IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeToolResources(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(ToolResources)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static ToolResources FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeToolResources(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/ToolResources.cs b/.dotnet/src/Generated/Models/ToolResources.cs new file mode 100644 index 000000000..67b190971 --- /dev/null +++ b/.dotnet/src/Generated/Models/ToolResources.cs @@ -0,0 +1,21 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + public partial class ToolResources + { + internal IDictionary SerializedAdditionalRawData { get; set; } + + internal ToolResources(CodeInterpreterToolResources codeInterpreter, FileSearchToolResources fileSearch, IDictionary serializedAdditionalRawData) + { + CodeInterpreter = codeInterpreter; + FileSearch = fileSearch; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + } +} 
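Every generated model in this diff follows the same unknown-property passthrough convention: on deserialization, any wire property the model does not recognize is captured into a `SerializedAdditionalRawData` dictionary keyed by property name, and on serialization those captured values are re-emitted verbatim after the known properties. A minimal standalone sketch of that convention, using only `System.Text.Json` and plain raw-text strings in place of the SDK's `BinaryData` (the `SampleToolOutput` type and its members are illustrative, not actual SDK API):

```csharp
using System.Collections.Generic;
using System.Text.Json;

// Hypothetical model illustrating the SerializedAdditionalRawData pattern
// used by the generated types above (not an actual SDK type).
internal sealed class SampleToolOutput
{
    public string ToolCallId { get; set; }
    public string Output { get; set; }

    // Unknown wire properties captured on read, replayed verbatim on write.
    internal IDictionary<string, string> SerializedAdditionalRawData { get; set; }

    internal static SampleToolOutput Deserialize(JsonElement element)
    {
        var result = new SampleToolOutput();
        Dictionary<string, string> rawData = null;
        foreach (JsonProperty property in element.EnumerateObject())
        {
            if (property.NameEquals("tool_call_id")) { result.ToolCallId = property.Value.GetString(); continue; }
            if (property.NameEquals("output")) { result.Output = property.Value.GetString(); continue; }
            // Not a known property: stash its raw JSON for round-tripping.
            rawData ??= new Dictionary<string, string>();
            rawData.Add(property.Name, property.Value.GetRawText());
        }
        result.SerializedAdditionalRawData = rawData;
        return result;
    }

    internal void Write(Utf8JsonWriter writer)
    {
        writer.WriteStartObject();
        writer.WriteString("tool_call_id", ToolCallId);
        writer.WriteString("output", Output);
        if (SerializedAdditionalRawData != null)
        {
            foreach (var item in SerializedAdditionalRawData)
            {
                writer.WritePropertyName(item.Key);
                // WriteRawValue requires .NET 6+; the generated code falls
                // back to re-parsing with JsonDocument on older targets.
                writer.WriteRawValue(item.Value);
            }
        }
        writer.WriteEndObject();
    }
}
```

Round-tripping a payload such as `{"tool_call_id":"t1","output":"ok","extra":{"a":1}}` through `Deserialize` and `Write` preserves the unrecognized `extra` property, which is why a newer service API version can add fields without breaking an older client build.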
diff --git a/.dotnet/src/Generated/Models/TranscribedSegment.Serialization.cs b/.dotnet/src/Generated/Models/TranscribedSegment.Serialization.cs new file mode 100644 index 000000000..c099ab406 --- /dev/null +++ b/.dotnet/src/Generated/Models/TranscribedSegment.Serialization.cs @@ -0,0 +1,259 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Audio +{ + public partial struct TranscribedSegment : IJsonModel, IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(TranscribedSegment)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("id") != true) + { + writer.WritePropertyName("id"u8); + writer.WriteNumberValue(Id); + } + if (SerializedAdditionalRawData?.ContainsKey("seek") != true) + { + writer.WritePropertyName("seek"u8); + writer.WriteNumberValue(SeekOffset); + } + if (SerializedAdditionalRawData?.ContainsKey("start") != true) + { + writer.WritePropertyName("start"u8); + writer.WriteNumberValue(Convert.ToDouble(Start.ToString("s\\.FFF"))); + } + if (SerializedAdditionalRawData?.ContainsKey("end") != true) + { + writer.WritePropertyName("end"u8); + writer.WriteNumberValue(Convert.ToDouble(End.ToString("s\\.FFF"))); + } + if (SerializedAdditionalRawData?.ContainsKey("text") != true) + { + writer.WritePropertyName("text"u8); + writer.WriteStringValue(Text); + } + if (SerializedAdditionalRawData?.ContainsKey("tokens") != true) + { + writer.WritePropertyName("tokens"u8); + writer.WriteStartArray(); + foreach (var item in TokenIds) + { + writer.WriteNumberValue(item); + } + writer.WriteEndArray(); + } + if 
(SerializedAdditionalRawData?.ContainsKey("temperature") != true) + { + writer.WritePropertyName("temperature"u8); + writer.WriteNumberValue(Temperature); + } + if (SerializedAdditionalRawData?.ContainsKey("avg_logprob") != true) + { + writer.WritePropertyName("avg_logprob"u8); + writer.WriteNumberValue(AverageLogProbability); + } + if (SerializedAdditionalRawData?.ContainsKey("compression_ratio") != true) + { + writer.WritePropertyName("compression_ratio"u8); + writer.WriteNumberValue(CompressionRatio); + } + if (SerializedAdditionalRawData?.ContainsKey("no_speech_prob") != true) + { + writer.WritePropertyName("no_speech_prob"u8); + writer.WriteNumberValue(NoSpeechProbability); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + TranscribedSegment IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<TranscribedSegment>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(TranscribedSegment)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeTranscribedSegment(document.RootElement, options); + } + + void IJsonModel<object>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) => ((IJsonModel<TranscribedSegment>)this).Write(writer, options); + + object IJsonModel<object>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) => ((IJsonModel<TranscribedSegment>)this).Create(ref reader, options); + + internal static TranscribedSegment DeserializeTranscribedSegment(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + int id = default; + long seek = default; + TimeSpan start = default; + TimeSpan end = default; + string text = default; + IReadOnlyList<long> tokens = default; + float temperature = default; + double avgLogprob = default; + float compressionRatio = default; + double noSpeechProb = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("id"u8)) + { + id = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("seek"u8)) + { + seek = property.Value.GetInt64(); + continue; + } + if (property.NameEquals("start"u8)) + { + start = TimeSpan.FromSeconds(property.Value.GetDouble()); + continue; + } + if (property.NameEquals("end"u8)) + { + end = TimeSpan.FromSeconds(property.Value.GetDouble()); + continue; + } + if (property.NameEquals("text"u8)) + { + text = property.Value.GetString(); + continue; + } + if (property.NameEquals("tokens"u8)) + { + List<long> array = new List<long>(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(item.GetInt64()); + } + tokens = array; + continue; + } + if
(property.NameEquals("temperature"u8)) + { + temperature = property.Value.GetSingle(); + continue; + } + if (property.NameEquals("avg_logprob"u8)) + { + avgLogprob = property.Value.GetDouble(); + continue; + } + if (property.NameEquals("compression_ratio"u8)) + { + compressionRatio = property.Value.GetSingle(); + continue; + } + if (property.NameEquals("no_speech_prob"u8)) + { + noSpeechProb = property.Value.GetDouble(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new TranscribedSegment( + id, + seek, + start, + end, + text, + tokens, + temperature, + avgLogprob, + compressionRatio, + noSpeechProb, + serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(TranscribedSegment)} does not support writing '{options.Format}' format."); + } + } + + TranscribedSegment IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<TranscribedSegment>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeTranscribedSegment(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(TranscribedSegment)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<TranscribedSegment>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + BinaryData IPersistableModel<object>.Write(ModelReaderWriterOptions options) => ((IPersistableModel<TranscribedSegment>)this).Write(options); + + object IPersistableModel<object>.Create(BinaryData data, ModelReaderWriterOptions options) => ((IPersistableModel<TranscribedSegment>)this).Create(data, options); + + string IPersistableModel<object>.GetFormatFromOptions(ModelReaderWriterOptions options) => ((IPersistableModel<TranscribedSegment>)this).GetFormatFromOptions(options); + + internal static TranscribedSegment FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeTranscribedSegment(document.RootElement); + } + + internal BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/TranscribedSegment.cs b/.dotnet/src/Generated/Models/TranscribedSegment.cs new file mode 100644 index 000000000..965788cc6 --- /dev/null +++ b/.dotnet/src/Generated/Models/TranscribedSegment.cs @@ -0,0 +1,56 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; +using System.Linq; + +namespace OpenAI.Audio +{ + public readonly partial struct TranscribedSegment + { + internal TranscribedSegment(int id, long seekOffset, TimeSpan start, TimeSpan end, string text, IEnumerable<int> tokenIds, float temperature, double averageLogProbability, float compressionRatio, double noSpeechProbability) + { + Argument.AssertNotNull(text, nameof(text)); + Argument.AssertNotNull(tokenIds, nameof(tokenIds)); + + Id = id; + SeekOffset = seekOffset; + Start = start; + End = end; + Text = text; + TokenIds = tokenIds.ToList(); + Temperature = temperature; + AverageLogProbability = averageLogProbability; + CompressionRatio = compressionRatio; + NoSpeechProbability = noSpeechProbability; + } + + internal TranscribedSegment(int id, long seekOffset, TimeSpan start, TimeSpan end, string text, IReadOnlyList<int> tokenIds, float temperature, double averageLogProbability, float compressionRatio, double noSpeechProbability, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + Id = id; + SeekOffset = seekOffset; + Start = start; + End = end; + Text = text; + TokenIds = tokenIds; + Temperature = temperature; + AverageLogProbability = averageLogProbability; + CompressionRatio = compressionRatio; + NoSpeechProbability = noSpeechProbability; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + public TranscribedSegment() + { + } + + public int Id { get; } + public long SeekOffset { get; } + public TimeSpan Start { get; } + public TimeSpan End { get; } + public string Text { get; } + public IReadOnlyList<int> TokenIds { get; } + public float Temperature { get; } + public double AverageLogProbability { get; } + public float CompressionRatio { get; } + public double NoSpeechProbability { get; } + } +} diff --git a/.dotnet/src/Generated/Models/TranscribedWord.Serialization.cs b/.dotnet/src/Generated/Models/TranscribedWord.Serialization.cs new file mode 100644 index 000000000..bd0738fac --- /dev/null +++ b/.dotnet/src/Generated/Models/TranscribedWord.Serialization.cs @@ -0,0 +1,161 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Audio +{ + public partial struct TranscribedWord : IJsonModel<TranscribedWord>, IJsonModel<object> + { + void IJsonModel<TranscribedWord>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ?
((IPersistableModel<TranscribedWord>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(TranscribedWord)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("word") != true) + { + writer.WritePropertyName("word"u8); + writer.WriteStringValue(Word); + } + if (SerializedAdditionalRawData?.ContainsKey("start") != true) + { + writer.WritePropertyName("start"u8); + writer.WriteNumberValue(Convert.ToDouble(Start.ToString("s\\.FFF"))); + } + if (SerializedAdditionalRawData?.ContainsKey("end") != true) + { + writer.WritePropertyName("end"u8); + writer.WriteNumberValue(Convert.ToDouble(End.ToString("s\\.FFF"))); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + TranscribedWord IJsonModel<TranscribedWord>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<TranscribedWord>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(TranscribedWord)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeTranscribedWord(document.RootElement, options); + } + + void IJsonModel<object>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) => ((IJsonModel<TranscribedWord>)this).Write(writer, options); + + object IJsonModel<object>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) => ((IJsonModel<TranscribedWord>)this).Create(ref reader, options); + + internal static TranscribedWord DeserializeTranscribedWord(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + string word = default; + TimeSpan start = default; + TimeSpan end = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("word"u8)) + { + word = property.Value.GetString(); + continue; + } + if (property.NameEquals("start"u8)) + { + start = TimeSpan.FromSeconds(property.Value.GetDouble()); + continue; + } + if (property.NameEquals("end"u8)) + { + end = TimeSpan.FromSeconds(property.Value.GetDouble()); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new TranscribedWord(word, start, end, serializedAdditionalRawData); + } + + BinaryData IPersistableModel<TranscribedWord>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<TranscribedWord>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(TranscribedWord)} does not support writing '{options.Format}' format."); + } + } + + TranscribedWord IPersistableModel<TranscribedWord>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<TranscribedWord>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeTranscribedWord(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(TranscribedWord)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<TranscribedWord>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + BinaryData IPersistableModel<object>.Write(ModelReaderWriterOptions options) => ((IPersistableModel<TranscribedWord>)this).Write(options); + + object IPersistableModel<object>.Create(BinaryData data, ModelReaderWriterOptions options) => ((IPersistableModel<TranscribedWord>)this).Create(data, options); + + string IPersistableModel<object>.GetFormatFromOptions(ModelReaderWriterOptions options) => ((IPersistableModel<TranscribedWord>)this).GetFormatFromOptions(options); + + internal static TranscribedWord FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeTranscribedWord(document.RootElement); + } + + internal BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/TranscribedWord.cs b/.dotnet/src/Generated/Models/TranscribedWord.cs new file mode 100644 index 000000000..0426c1992 --- /dev/null +++ b/.dotnet/src/Generated/Models/TranscribedWord.cs @@ -0,0 +1,37 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace
OpenAI.Audio +{ + public readonly partial struct TranscribedWord + { + internal TranscribedWord(string word, TimeSpan start, TimeSpan end) + { + Argument.AssertNotNull(word, nameof(word)); + + Word = word; + Start = start; + End = end; + } + + internal TranscribedWord(string word, TimeSpan start, TimeSpan end, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + Word = word; + Start = start; + End = end; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + public TranscribedWord() + { + } + + public string Word { get; } + public TimeSpan Start { get; } + public TimeSpan End { get; } + } +} diff --git a/.dotnet/src/Generated/Models/UnknownAssistantToolDefinition.Serialization.cs b/.dotnet/src/Generated/Models/UnknownAssistantToolDefinition.Serialization.cs new file mode 100644 index 000000000..b26d26d05 --- /dev/null +++ b/.dotnet/src/Generated/Models/UnknownAssistantToolDefinition.Serialization.cs @@ -0,0 +1,97 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class UnknownAssistantToolDefinition : IJsonModel<ToolDefinition> + { + ToolDefinition IJsonModel<ToolDefinition>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<ToolDefinition>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ToolDefinition)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeToolDefinition(document.RootElement, options); + } + + internal static UnknownAssistantToolDefinition DeserializeUnknownAssistantToolDefinition(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string type = "Unknown"; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new UnknownAssistantToolDefinition(type, serializedAdditionalRawData); + } + + BinaryData IPersistableModel<ToolDefinition>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<ToolDefinition>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(ToolDefinition)} does not support writing '{options.Format}' format."); + } + } + + ToolDefinition IPersistableModel<ToolDefinition>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<ToolDefinition>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeToolDefinition(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(ToolDefinition)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<ToolDefinition>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static new UnknownAssistantToolDefinition FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeUnknownAssistantToolDefinition(document.RootElement); + } + + internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/UnknownAssistantToolDefinition.cs b/.dotnet/src/Generated/Models/UnknownAssistantToolDefinition.cs new file mode 100644 index 000000000..ca5c335e1 --- /dev/null +++ b/.dotnet/src/Generated/Models/UnknownAssistantToolDefinition.cs @@ -0,0 +1,20 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + internal partial class UnknownAssistantToolDefinition : ToolDefinition + { + internal UnknownAssistantToolDefinition(string type, IDictionary<string, BinaryData> serializedAdditionalRawData) : base(type, serializedAdditionalRawData) + { + } + + internal UnknownAssistantToolDefinition() + { + } + } +} diff --git a/.dotnet/src/Generated/Models/UnknownChatMessage.Serialization.cs b/.dotnet/src/Generated/Models/UnknownChatMessage.Serialization.cs new file mode 100644 index 000000000..665911fca --- /dev/null +++ b/.dotnet/src/Generated/Models/UnknownChatMessage.Serialization.cs @@ -0,0 +1,68 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Text.Json; + +namespace OpenAI.Chat +{
+ internal partial class UnknownChatMessage : IJsonModel<ChatMessage> + { + ChatMessage IJsonModel<ChatMessage>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<ChatMessage>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(ChatMessage)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeChatMessage(document.RootElement, options); + } + + BinaryData IPersistableModel<ChatMessage>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<ChatMessage>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(ChatMessage)} does not support writing '{options.Format}' format."); + } + } + + ChatMessage IPersistableModel<ChatMessage>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<ChatMessage>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeChatMessage(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(ChatMessage)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<ChatMessage>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static new UnknownChatMessage FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeUnknownChatMessage(document.RootElement); + } + + internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/UnknownChatMessage.cs b/.dotnet/src/Generated/Models/UnknownChatMessage.cs new file mode 100644 index 000000000..431558af6 --- /dev/null +++ b/.dotnet/src/Generated/Models/UnknownChatMessage.cs @@ -0,0 +1,20 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Chat +{ + internal partial class UnknownChatMessage : ChatMessage + { + internal UnknownChatMessage(ChatMessageRole role, IList<ChatMessageContentPart> content, IDictionary<string, BinaryData> serializedAdditionalRawData) : base(role, content, serializedAdditionalRawData) + { + } + + internal UnknownChatMessage() + { + } + } +} diff --git a/.dotnet/src/Generated/Models/UnknownMessageContentTextObjectAnnotation.Serialization.cs b/.dotnet/src/Generated/Models/UnknownMessageContentTextObjectAnnotation.Serialization.cs new file mode 100644 index 000000000..b0a6d9584 --- /dev/null +++ b/.dotnet/src/Generated/Models/UnknownMessageContentTextObjectAnnotation.Serialization.cs @@ -0,0 +1,133 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; +
+namespace OpenAI.Assistants +{ + internal partial class UnknownMessageContentTextObjectAnnotation : IJsonModel<InternalMessageContentTextObjectAnnotation> + { + void IJsonModel<InternalMessageContentTextObjectAnnotation>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalMessageContentTextObjectAnnotation>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalMessageContentTextObjectAnnotation)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalMessageContentTextObjectAnnotation IJsonModel<InternalMessageContentTextObjectAnnotation>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalMessageContentTextObjectAnnotation>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalMessageContentTextObjectAnnotation)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalMessageContentTextObjectAnnotation(document.RootElement, options); + } + + internal static UnknownMessageContentTextObjectAnnotation DeserializeUnknownMessageContentTextObjectAnnotation(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string type = "Unknown"; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new UnknownMessageContentTextObjectAnnotation(type, serializedAdditionalRawData); + } + + BinaryData IPersistableModel<InternalMessageContentTextObjectAnnotation>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalMessageContentTextObjectAnnotation>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalMessageContentTextObjectAnnotation)} does not support writing '{options.Format}' format."); + } + } + + InternalMessageContentTextObjectAnnotation IPersistableModel<InternalMessageContentTextObjectAnnotation>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalMessageContentTextObjectAnnotation>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalMessageContentTextObjectAnnotation(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalMessageContentTextObjectAnnotation)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalMessageContentTextObjectAnnotation>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static new UnknownMessageContentTextObjectAnnotation FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeUnknownMessageContentTextObjectAnnotation(document.RootElement); + } + + internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/UnknownMessageContentTextObjectAnnotation.cs b/.dotnet/src/Generated/Models/UnknownMessageContentTextObjectAnnotation.cs new file mode 100644 index 000000000..b5561f7cd --- /dev/null +++ b/.dotnet/src/Generated/Models/UnknownMessageContentTextObjectAnnotation.cs @@ -0,0 +1,20 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + internal partial class UnknownMessageContentTextObjectAnnotation : InternalMessageContentTextObjectAnnotation + { + internal UnknownMessageContentTextObjectAnnotation(string type, IDictionary<string, BinaryData> serializedAdditionalRawData) : base(type, serializedAdditionalRawData) + { + } + + internal UnknownMessageContentTextObjectAnnotation() + { + } + } +} diff --git a/.dotnet/src/Generated/Models/UnknownMessageDeltaContent.Serialization.cs b/.dotnet/src/Generated/Models/UnknownMessageDeltaContent.Serialization.cs new file mode 100644 index 000000000..f4570019f --- /dev/null +++
b/.dotnet/src/Generated/Models/UnknownMessageDeltaContent.Serialization.cs @@ -0,0 +1,133 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class UnknownMessageDeltaContent : IJsonModel<InternalMessageDeltaContent> + { + void IJsonModel<InternalMessageDeltaContent>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalMessageDeltaContent>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalMessageDeltaContent)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalMessageDeltaContent IJsonModel<InternalMessageDeltaContent>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalMessageDeltaContent>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalMessageDeltaContent)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalMessageDeltaContent(document.RootElement, options); + } + + internal static UnknownMessageDeltaContent DeserializeUnknownMessageDeltaContent(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string type = "Unknown"; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new UnknownMessageDeltaContent(type, serializedAdditionalRawData); + } + + BinaryData IPersistableModel<InternalMessageDeltaContent>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalMessageDeltaContent>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalMessageDeltaContent)} does not support writing '{options.Format}' format."); + } + } + + InternalMessageDeltaContent IPersistableModel<InternalMessageDeltaContent>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalMessageDeltaContent>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalMessageDeltaContent(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalMessageDeltaContent)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalMessageDeltaContent>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static new UnknownMessageDeltaContent FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeUnknownMessageDeltaContent(document.RootElement); + } + + internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/UnknownMessageDeltaContent.cs b/.dotnet/src/Generated/Models/UnknownMessageDeltaContent.cs new file mode 100644 index 000000000..a33b74454 --- /dev/null +++ b/.dotnet/src/Generated/Models/UnknownMessageDeltaContent.cs @@ -0,0 +1,20 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + internal partial class UnknownMessageDeltaContent : InternalMessageDeltaContent + { + internal UnknownMessageDeltaContent(string type, IDictionary<string, BinaryData> serializedAdditionalRawData) : base(type, serializedAdditionalRawData) + { + } + + internal UnknownMessageDeltaContent() + { + } + } +} diff --git a/.dotnet/src/Generated/Models/UnknownMessageDeltaTextContentAnnotation.Serialization.cs b/.dotnet/src/Generated/Models/UnknownMessageDeltaTextContentAnnotation.Serialization.cs new file mode 100644 index 000000000..1fda24c5c --- /dev/null +++ b/.dotnet/src/Generated/Models/UnknownMessageDeltaTextContentAnnotation.Serialization.cs @@ -0,0 +1,133 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using
System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class UnknownMessageDeltaTextContentAnnotation : IJsonModel<InternalMessageDeltaTextContentAnnotation> + { + void IJsonModel<InternalMessageDeltaTextContentAnnotation>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalMessageDeltaTextContentAnnotation>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalMessageDeltaTextContentAnnotation)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalMessageDeltaTextContentAnnotation IJsonModel<InternalMessageDeltaTextContentAnnotation>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalMessageDeltaTextContentAnnotation>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalMessageDeltaTextContentAnnotation)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalMessageDeltaTextContentAnnotation(document.RootElement, options); + } + + internal static UnknownMessageDeltaTextContentAnnotation DeserializeUnknownMessageDeltaTextContentAnnotation(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string type = "Unknown"; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new UnknownMessageDeltaTextContentAnnotation(type, serializedAdditionalRawData); + } + + BinaryData IPersistableModel<InternalMessageDeltaTextContentAnnotation>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalMessageDeltaTextContentAnnotation>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalMessageDeltaTextContentAnnotation)} does not support writing '{options.Format}' format."); + } + } + + InternalMessageDeltaTextContentAnnotation IPersistableModel<InternalMessageDeltaTextContentAnnotation>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalMessageDeltaTextContentAnnotation>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalMessageDeltaTextContentAnnotation(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalMessageDeltaTextContentAnnotation)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalMessageDeltaTextContentAnnotation>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static new UnknownMessageDeltaTextContentAnnotation FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeUnknownMessageDeltaTextContentAnnotation(document.RootElement); + } + + internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/UnknownMessageDeltaTextContentAnnotation.cs b/.dotnet/src/Generated/Models/UnknownMessageDeltaTextContentAnnotation.cs new file mode 100644 index 000000000..d6f17c1ab --- /dev/null +++ b/.dotnet/src/Generated/Models/UnknownMessageDeltaTextContentAnnotation.cs @@ -0,0 +1,20 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + internal partial class UnknownMessageDeltaTextContentAnnotation : InternalMessageDeltaTextContentAnnotation + { + internal UnknownMessageDeltaTextContentAnnotation(string type, IDictionary<string, BinaryData> serializedAdditionalRawData) : base(type, serializedAdditionalRawData) + { + } + + internal UnknownMessageDeltaTextContentAnnotation() + { + } + } +} diff --git a/.dotnet/src/Generated/Models/UnknownRunStepDeltaStepDetails.Serialization.cs b/.dotnet/src/Generated/Models/UnknownRunStepDeltaStepDetails.Serialization.cs new file mode 100644 index 000000000..287960a64 --- /dev/null +++
b/.dotnet/src/Generated/Models/UnknownRunStepDeltaStepDetails.Serialization.cs @@ -0,0 +1,133 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class UnknownRunStepDeltaStepDetails : IJsonModel<InternalRunStepDeltaStepDetails> + { + void IJsonModel<InternalRunStepDeltaStepDetails>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalRunStepDeltaStepDetails>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalRunStepDeltaStepDetails)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalRunStepDeltaStepDetails IJsonModel<InternalRunStepDeltaStepDetails>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalRunStepDeltaStepDetails>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalRunStepDeltaStepDetails)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalRunStepDeltaStepDetails(document.RootElement, options); + } + + internal static UnknownRunStepDeltaStepDetails DeserializeUnknownRunStepDeltaStepDetails(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string type = "Unknown"; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new UnknownRunStepDeltaStepDetails(type, serializedAdditionalRawData); + } + + BinaryData IPersistableModel<InternalRunStepDeltaStepDetails>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalRunStepDeltaStepDetails>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalRunStepDeltaStepDetails)} does not support writing '{options.Format}' format."); + } + } + + InternalRunStepDeltaStepDetails IPersistableModel<InternalRunStepDeltaStepDetails>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<InternalRunStepDeltaStepDetails>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalRunStepDeltaStepDetails(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalRunStepDeltaStepDetails)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<InternalRunStepDeltaStepDetails>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static new UnknownRunStepDeltaStepDetails FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeUnknownRunStepDeltaStepDetails(document.RootElement); + } + + internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/UnknownRunStepDeltaStepDetails.cs b/.dotnet/src/Generated/Models/UnknownRunStepDeltaStepDetails.cs new file mode 100644 index 000000000..d56091482 --- /dev/null +++ b/.dotnet/src/Generated/Models/UnknownRunStepDeltaStepDetails.cs @@ -0,0 +1,20 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + internal partial class UnknownRunStepDeltaStepDetails : InternalRunStepDeltaStepDetails + { + internal UnknownRunStepDeltaStepDetails(string type, IDictionary<string, BinaryData> serializedAdditionalRawData) : base(type, serializedAdditionalRawData) + { + } + + internal UnknownRunStepDeltaStepDetails() + { + } + } +} diff --git a/.dotnet/src/Generated/Models/UnknownRunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreterOutputsObject.Serialization.cs b/.dotnet/src/Generated/Models/UnknownRunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreterOutputsObject.Serialization.cs new file mode 100644 index 000000000..4790ac0f2 --- /dev/null +++
b/.dotnet/src/Generated/Models/UnknownRunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreterOutputsObject.Serialization.cs @@ -0,0 +1,133 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class UnknownRunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreterOutputsObject : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(RunStepUpdateCodeInterpreterOutput)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + RunStepUpdateCodeInterpreterOutput IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(RunStepUpdateCodeInterpreterOutput)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeRunStepUpdateCodeInterpreterOutput(document.RootElement, options); + } + + internal static UnknownRunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreterOutputsObject DeserializeUnknownRunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreterOutputsObject(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string type = "Unknown"; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new UnknownRunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreterOutputsObject(type, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(RunStepUpdateCodeInterpreterOutput)} does not support writing '{options.Format}' format."); + } + } + + RunStepUpdateCodeInterpreterOutput IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeRunStepUpdateCodeInterpreterOutput(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(RunStepUpdateCodeInterpreterOutput)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static new UnknownRunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreterOutputsObject FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeUnknownRunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreterOutputsObject(document.RootElement); + } + + internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/UnknownRunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreterOutputsObject.cs b/.dotnet/src/Generated/Models/UnknownRunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreterOutputsObject.cs new file mode 100644 index 000000000..d57ff0397 --- /dev/null +++ b/.dotnet/src/Generated/Models/UnknownRunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreterOutputsObject.cs @@ -0,0 +1,20 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + internal partial class UnknownRunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreterOutputsObject : RunStepUpdateCodeInterpreterOutput + { + internal UnknownRunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreterOutputsObject(string type, IDictionary serializedAdditionalRawData) : base(type, serializedAdditionalRawData) + { + } + + internal UnknownRunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreterOutputsObject() + { + } + } +} diff --git 
a/.dotnet/src/Generated/Models/UnknownRunStepDeltaStepDetailsToolCallsObjectToolCallsObject.Serialization.cs b/.dotnet/src/Generated/Models/UnknownRunStepDeltaStepDetailsToolCallsObjectToolCallsObject.Serialization.cs new file mode 100644 index 000000000..b5bcbda2c --- /dev/null +++ b/.dotnet/src/Generated/Models/UnknownRunStepDeltaStepDetailsToolCallsObjectToolCallsObject.Serialization.cs @@ -0,0 +1,133 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class UnknownRunStepDeltaStepDetailsToolCallsObjectToolCallsObject : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalRunStepDeltaStepDetailsToolCallsObjectToolCallsObject)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + InternalRunStepDeltaStepDetailsToolCallsObjectToolCallsObject IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(InternalRunStepDeltaStepDetailsToolCallsObjectToolCallsObject)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeInternalRunStepDeltaStepDetailsToolCallsObjectToolCallsObject(document.RootElement, options); + } + + internal static UnknownRunStepDeltaStepDetailsToolCallsObjectToolCallsObject DeserializeUnknownRunStepDeltaStepDetailsToolCallsObjectToolCallsObject(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string type = "Unknown"; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new UnknownRunStepDeltaStepDetailsToolCallsObjectToolCallsObject(type, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(InternalRunStepDeltaStepDetailsToolCallsObjectToolCallsObject)} does not support writing '{options.Format}' format."); + } + } + + InternalRunStepDeltaStepDetailsToolCallsObjectToolCallsObject IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeInternalRunStepDeltaStepDetailsToolCallsObjectToolCallsObject(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(InternalRunStepDeltaStepDetailsToolCallsObjectToolCallsObject)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static new UnknownRunStepDeltaStepDetailsToolCallsObjectToolCallsObject FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeUnknownRunStepDeltaStepDetailsToolCallsObjectToolCallsObject(document.RootElement); + } + + internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/UnknownRunStepDeltaStepDetailsToolCallsObjectToolCallsObject.cs b/.dotnet/src/Generated/Models/UnknownRunStepDeltaStepDetailsToolCallsObjectToolCallsObject.cs new file mode 100644 index 000000000..5a0c3e670 --- /dev/null +++ b/.dotnet/src/Generated/Models/UnknownRunStepDeltaStepDetailsToolCallsObjectToolCallsObject.cs @@ -0,0 +1,20 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; 
+ +namespace OpenAI.Assistants +{ + internal partial class UnknownRunStepDeltaStepDetailsToolCallsObjectToolCallsObject : InternalRunStepDeltaStepDetailsToolCallsObjectToolCallsObject + { + internal UnknownRunStepDeltaStepDetailsToolCallsObjectToolCallsObject(string type, IDictionary serializedAdditionalRawData) : base(type, serializedAdditionalRawData) + { + } + + internal UnknownRunStepDeltaStepDetailsToolCallsObjectToolCallsObject() + { + } + } +} diff --git a/.dotnet/src/Generated/Models/UnknownRunStepDetailsToolCallsCodeObjectCodeInterpreterOutputsObject.Serialization.cs b/.dotnet/src/Generated/Models/UnknownRunStepDetailsToolCallsCodeObjectCodeInterpreterOutputsObject.Serialization.cs new file mode 100644 index 000000000..86648d23b --- /dev/null +++ b/.dotnet/src/Generated/Models/UnknownRunStepDetailsToolCallsCodeObjectCodeInterpreterOutputsObject.Serialization.cs @@ -0,0 +1,133 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class UnknownRunStepDetailsToolCallsCodeObjectCodeInterpreterOutputsObject : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(RunStepCodeInterpreterOutput)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + RunStepCodeInterpreterOutput IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(RunStepCodeInterpreterOutput)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeRunStepCodeInterpreterOutput(document.RootElement, options); + } + + internal static UnknownRunStepDetailsToolCallsCodeObjectCodeInterpreterOutputsObject DeserializeUnknownRunStepDetailsToolCallsCodeObjectCodeInterpreterOutputsObject(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string type = "Unknown"; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new UnknownRunStepDetailsToolCallsCodeObjectCodeInterpreterOutputsObject(type, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(RunStepCodeInterpreterOutput)} does not support writing '{options.Format}' format."); + } + } + + RunStepCodeInterpreterOutput IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeRunStepCodeInterpreterOutput(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(RunStepCodeInterpreterOutput)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static new UnknownRunStepDetailsToolCallsCodeObjectCodeInterpreterOutputsObject FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeUnknownRunStepDetailsToolCallsCodeObjectCodeInterpreterOutputsObject(document.RootElement); + } + + internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/UnknownRunStepDetailsToolCallsCodeObjectCodeInterpreterOutputsObject.cs b/.dotnet/src/Generated/Models/UnknownRunStepDetailsToolCallsCodeObjectCodeInterpreterOutputsObject.cs new file mode 100644 index 000000000..210c6b0af --- /dev/null +++ b/.dotnet/src/Generated/Models/UnknownRunStepDetailsToolCallsCodeObjectCodeInterpreterOutputsObject.cs @@ -0,0 +1,20 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + internal partial class UnknownRunStepDetailsToolCallsCodeObjectCodeInterpreterOutputsObject : RunStepCodeInterpreterOutput + { + internal UnknownRunStepDetailsToolCallsCodeObjectCodeInterpreterOutputsObject(string type, IDictionary serializedAdditionalRawData) : base(type, serializedAdditionalRawData) + { + } + + internal UnknownRunStepDetailsToolCallsCodeObjectCodeInterpreterOutputsObject() + { + } + } +} diff --git 
a/.dotnet/src/Generated/Models/UnknownRunStepDetailsToolCallsObjectToolCallsObject.Serialization.cs b/.dotnet/src/Generated/Models/UnknownRunStepDetailsToolCallsObjectToolCallsObject.Serialization.cs new file mode 100644 index 000000000..79e5d73db --- /dev/null +++ b/.dotnet/src/Generated/Models/UnknownRunStepDetailsToolCallsObjectToolCallsObject.Serialization.cs @@ -0,0 +1,133 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class UnknownRunStepDetailsToolCallsObjectToolCallsObject : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(RunStepToolCall)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + RunStepToolCall IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(RunStepToolCall)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeRunStepToolCall(document.RootElement, options); + } + + internal static UnknownRunStepDetailsToolCallsObjectToolCallsObject DeserializeUnknownRunStepDetailsToolCallsObjectToolCallsObject(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string type = "Unknown"; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new UnknownRunStepDetailsToolCallsObjectToolCallsObject(type, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(RunStepToolCall)} does not support writing '{options.Format}' format."); + } + } + + RunStepToolCall IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeRunStepToolCall(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(RunStepToolCall)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static new UnknownRunStepDetailsToolCallsObjectToolCallsObject FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeUnknownRunStepDetailsToolCallsObjectToolCallsObject(document.RootElement); + } + + internal override BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/UnknownRunStepDetailsToolCallsObjectToolCallsObject.cs b/.dotnet/src/Generated/Models/UnknownRunStepDetailsToolCallsObjectToolCallsObject.cs new file mode 100644 index 000000000..fa779e444 --- /dev/null +++ b/.dotnet/src/Generated/Models/UnknownRunStepDetailsToolCallsObjectToolCallsObject.cs @@ -0,0 +1,20 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.Assistants +{ + internal partial class UnknownRunStepDetailsToolCallsObjectToolCallsObject : RunStepToolCall + { + internal UnknownRunStepDetailsToolCallsObjectToolCallsObject(string type, IDictionary serializedAdditionalRawData) : base(type, serializedAdditionalRawData) + { + } + + internal UnknownRunStepDetailsToolCallsObjectToolCallsObject() + { + } + } +} diff --git a/.dotnet/src/Generated/Models/UnknownRunStepObjectStepDetails.Serialization.cs b/.dotnet/src/Generated/Models/UnknownRunStepObjectStepDetails.Serialization.cs new file mode 100644 index 000000000..142bdc28b --- /dev/null +++ 
b/.dotnet/src/Generated/Models/UnknownRunStepObjectStepDetails.Serialization.cs @@ -0,0 +1,133 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.Assistants +{ + internal partial class UnknownRunStepObjectStepDetails : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(RunStepDetails)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("type") != true) + { + writer.WritePropertyName("type"u8); + writer.WriteStringValue(Type); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + RunStepDetails IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(RunStepDetails)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeRunStepDetails(document.RootElement, options); + } + + internal static UnknownRunStepObjectStepDetails DeserializeUnknownRunStepObjectStepDetails(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string type = "Unknown"; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("type"u8)) + { + type = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new UnknownRunStepObjectStepDetails(type, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(RunStepDetails)} does not support writing '{options.Format}' format."); + } + } + + RunStepDetails IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<RunStepDetails>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    {
+                        using JsonDocument document = JsonDocument.Parse(data);
+                        return DeserializeRunStepDetails(document.RootElement, options);
+                    }
+                default:
+                    throw new FormatException($"The model {nameof(RunStepDetails)} does not support reading '{options.Format}' format.");
+            }
+        }
+
+        string IPersistableModel<RunStepDetails>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J";
+
+        internal static new UnknownRunStepObjectStepDetails FromResponse(PipelineResponse response)
+        {
+            using var document = JsonDocument.Parse(response.Content);
+            return DeserializeUnknownRunStepObjectStepDetails(document.RootElement);
+        }
+
+        internal override BinaryContent ToBinaryContent()
+        {
+            return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions);
+        }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/UnknownRunStepObjectStepDetails.cs b/.dotnet/src/Generated/Models/UnknownRunStepObjectStepDetails.cs
new file mode 100644
index 000000000..e91259e4e
--- /dev/null
+++ b/.dotnet/src/Generated/Models/UnknownRunStepObjectStepDetails.cs
@@ -0,0 +1,20 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.Collections.Generic;
+
+namespace OpenAI.Assistants
+{
+    internal partial class UnknownRunStepObjectStepDetails : RunStepDetails
+    {
+        internal UnknownRunStepObjectStepDetails(string type, IDictionary<string, BinaryData> serializedAdditionalRawData) : base(type, serializedAdditionalRawData)
+        {
+        }
+
+        internal UnknownRunStepObjectStepDetails()
+        {
+        }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/UserChatMessage.Serialization.cs b/.dotnet/src/Generated/Models/UserChatMessage.Serialization.cs
new file mode 100644
index 000000000..c6fdb45b8
--- /dev/null
+++ b/.dotnet/src/Generated/Models/UserChatMessage.Serialization.cs
@@ -0,0 +1,109 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace OpenAI.Chat
+{
+    public partial class UserChatMessage : IJsonModel<UserChatMessage>
+    {
+        UserChatMessage IJsonModel<UserChatMessage>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<UserChatMessage>)this).GetFormatFromOptions(options) : options.Format;
+            if (format != "J")
+            {
+                throw new FormatException($"The model {nameof(UserChatMessage)} does not support reading '{format}' format.");
+            }
+
+            using JsonDocument document = JsonDocument.ParseValue(ref reader);
+            return DeserializeUserChatMessage(document.RootElement, options);
+        }
+
+        internal static UserChatMessage DeserializeUserChatMessage(JsonElement element, ModelReaderWriterOptions options = null)
+        {
+            options ??= ModelSerializationExtensions.WireOptions;
+
+            if (element.ValueKind == JsonValueKind.Null)
+            {
+                return null;
+            }
+            string name = default;
+            ChatMessageRole role = default;
+            IList<ChatMessageContentPart> content = default;
+            IDictionary<string, BinaryData> serializedAdditionalRawData = default;
+            Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>();
+            foreach (var property in element.EnumerateObject())
+            {
+                if (property.NameEquals("name"u8))
+                {
+                    name = property.Value.GetString();
+                    continue;
+                }
+                if (property.NameEquals("role"u8))
+                {
+                    role = property.Value.GetString().ToChatMessageRole();
+                    continue;
+                }
+                if (property.NameEquals("content"u8))
+                {
+                    DeserializeContentValue(property, ref content);
+                    continue;
+                }
+                if (true)
+                {
+                    rawDataDictionary ??= new Dictionary<string, BinaryData>();
+                    rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText()));
+                }
+            }
+            serializedAdditionalRawData = rawDataDictionary;
+            return new UserChatMessage(role, content ?? new ChangeTrackingList<ChatMessageContentPart>(), serializedAdditionalRawData, name);
+        }
+
+        BinaryData IPersistableModel<UserChatMessage>.Write(ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<UserChatMessage>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    return ModelReaderWriter.Write(this, options);
+                default:
+                    throw new FormatException($"The model {nameof(UserChatMessage)} does not support writing '{options.Format}' format.");
+            }
+        }
+
+        UserChatMessage IPersistableModel<UserChatMessage>.Create(BinaryData data, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<UserChatMessage>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    {
+                        using JsonDocument document = JsonDocument.Parse(data);
+                        return DeserializeUserChatMessage(document.RootElement, options);
+                    }
+                default:
+                    throw new FormatException($"The model {nameof(UserChatMessage)} does not support reading '{options.Format}' format.");
+            }
+        }
+
+        string IPersistableModel<UserChatMessage>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J";
+
+        internal static new UserChatMessage FromResponse(PipelineResponse response)
+        {
+            using var document = JsonDocument.Parse(response.Content);
+            return DeserializeUserChatMessage(document.RootElement);
+        }
+
+        internal override BinaryContent ToBinaryContent()
+        {
+            return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions);
+        }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/UserChatMessage.cs b/.dotnet/src/Generated/Models/UserChatMessage.cs
new file mode 100644
index 000000000..65a986ebd
--- /dev/null
+++ b/.dotnet/src/Generated/Models/UserChatMessage.cs
@@ -0,0 +1,21 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.Collections.Generic;
+
+namespace OpenAI.Chat
+{
+    public partial class UserChatMessage : ChatMessage
+    {
+        internal UserChatMessage(ChatMessageRole role, IList<ChatMessageContentPart> content, IDictionary<string, BinaryData> serializedAdditionalRawData, string participantName) : base(role, content, serializedAdditionalRawData)
+        {
+            ParticipantName = participantName;
+        }
+
+        internal UserChatMessage()
+        {
+        }
+    }
+}
diff --git
a/.dotnet/src/Generated/Models/VectorStore.Serialization.cs b/.dotnet/src/Generated/Models/VectorStore.Serialization.cs new file mode 100644 index 000000000..d2aacb781 --- /dev/null +++ b/.dotnet/src/Generated/Models/VectorStore.Serialization.cs @@ -0,0 +1,306 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.VectorStores +{ + public partial class VectorStore : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(VectorStore)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("id") != true) + { + writer.WritePropertyName("id"u8); + writer.WriteStringValue(Id); + } + if (SerializedAdditionalRawData?.ContainsKey("object") != true) + { + writer.WritePropertyName("object"u8); + writer.WriteStringValue(Object.ToString()); + } + if (SerializedAdditionalRawData?.ContainsKey("created_at") != true) + { + writer.WritePropertyName("created_at"u8); + writer.WriteNumberValue(CreatedAt, "U"); + } + if (SerializedAdditionalRawData?.ContainsKey("name") != true) + { + writer.WritePropertyName("name"u8); + writer.WriteStringValue(Name); + } + if (SerializedAdditionalRawData?.ContainsKey("usage_bytes") != true) + { + writer.WritePropertyName("usage_bytes"u8); + writer.WriteNumberValue(UsageBytes); + } + if (SerializedAdditionalRawData?.ContainsKey("file_counts") != true) + { + writer.WritePropertyName("file_counts"u8); + writer.WriteObjectValue(FileCounts, options); + } + if (SerializedAdditionalRawData?.ContainsKey("status") != true) + { + writer.WritePropertyName("status"u8); + writer.WriteStringValue(Status.ToSerialString()); + } + if 
(SerializedAdditionalRawData?.ContainsKey("expires_after") != true && Optional.IsDefined(ExpirationPolicy)) + { + writer.WritePropertyName("expires_after"u8); + writer.WriteObjectValue(ExpirationPolicy, options); + } + if (SerializedAdditionalRawData?.ContainsKey("expires_at") != true && Optional.IsDefined(ExpiresAt)) + { + if (ExpiresAt != null) + { + writer.WritePropertyName("expires_at"u8); + writer.WriteNumberValue(ExpiresAt.Value, "U"); + } + else + { + writer.WriteNull("expires_at"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("last_active_at") != true) + { + if (LastActiveAt != null) + { + writer.WritePropertyName("last_active_at"u8); + writer.WriteNumberValue(LastActiveAt.Value, "U"); + } + else + { + writer.WriteNull("last_active_at"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("metadata") != true) + { + if (Metadata != null && Optional.IsCollectionDefined(Metadata)) + { + writer.WritePropertyName("metadata"u8); + writer.WriteStartObject(); + foreach (var item in Metadata) + { + writer.WritePropertyName(item.Key); + writer.WriteStringValue(item.Value); + } + writer.WriteEndObject(); + } + else + { + writer.WriteNull("metadata"); + } + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + VectorStore IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(VectorStore)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeVectorStore(document.RootElement, options); + } + + internal static VectorStore DeserializeVectorStore(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string id = default; + InternalVectorStoreObjectObject @object = default; + DateTimeOffset createdAt = default; + string name = default; + int usageBytes = default; + VectorStoreFileCounts fileCounts = default; + VectorStoreStatus status = default; + VectorStoreExpirationPolicy expiresAfter = default; + DateTimeOffset? expiresAt = default; + DateTimeOffset? lastActiveAt = default; + IReadOnlyDictionary metadata = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("id"u8)) + { + id = property.Value.GetString(); + continue; + } + if (property.NameEquals("object"u8)) + { + @object = new InternalVectorStoreObjectObject(property.Value.GetString()); + continue; + } + if (property.NameEquals("created_at"u8)) + { + createdAt = DateTimeOffset.FromUnixTimeSeconds(property.Value.GetInt64()); + continue; + } + if (property.NameEquals("name"u8)) + { + name = property.Value.GetString(); + continue; + } + if (property.NameEquals("usage_bytes"u8)) + { + usageBytes = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("file_counts"u8)) + { + fileCounts = VectorStoreFileCounts.DeserializeVectorStoreFileCounts(property.Value, options); + continue; + } + if (property.NameEquals("status"u8)) + { + status = 
property.Value.GetString().ToVectorStoreStatus(); + continue; + } + if (property.NameEquals("expires_after"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + expiresAfter = VectorStoreExpirationPolicy.DeserializeVectorStoreExpirationPolicy(property.Value, options); + continue; + } + if (property.NameEquals("expires_at"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + expiresAt = null; + continue; + } + expiresAt = DateTimeOffset.FromUnixTimeSeconds(property.Value.GetInt64()); + continue; + } + if (property.NameEquals("last_active_at"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + lastActiveAt = null; + continue; + } + lastActiveAt = DateTimeOffset.FromUnixTimeSeconds(property.Value.GetInt64()); + continue; + } + if (property.NameEquals("metadata"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + metadata = new ChangeTrackingDictionary(); + continue; + } + Dictionary dictionary = new Dictionary(); + foreach (var property0 in property.Value.EnumerateObject()) + { + dictionary.Add(property0.Name, property0.Value.GetString()); + } + metadata = dictionary; + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new VectorStore( + id, + @object, + createdAt, + name, + usageBytes, + fileCounts, + status, + expiresAfter, + expiresAt, + lastActiveAt, + metadata, + serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<VectorStore>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    return ModelReaderWriter.Write(this, options);
+                default:
+                    throw new FormatException($"The model {nameof(VectorStore)} does not support writing '{options.Format}' format.");
+            }
+        }
+
+        VectorStore IPersistableModel<VectorStore>.Create(BinaryData data, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ? ((IPersistableModel<VectorStore>)this).GetFormatFromOptions(options) : options.Format;
+
+            switch (format)
+            {
+                case "J":
+                    {
+                        using JsonDocument document = JsonDocument.Parse(data);
+                        return DeserializeVectorStore(document.RootElement, options);
+                    }
+                default:
+                    throw new FormatException($"The model {nameof(VectorStore)} does not support reading '{options.Format}' format.");
+            }
+        }
+
+        string IPersistableModel<VectorStore>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J";
+
+        internal static VectorStore FromResponse(PipelineResponse response)
+        {
+            using var document = JsonDocument.Parse(response.Content);
+            return DeserializeVectorStore(document.RootElement);
+        }
+
+        internal virtual BinaryContent ToBinaryContent()
+        {
+            return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions);
+        }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/VectorStore.cs b/.dotnet/src/Generated/Models/VectorStore.cs
new file mode 100644
index 000000000..3f39a41e2
--- /dev/null
+++ b/.dotnet/src/Generated/Models/VectorStore.cs
@@ -0,0 +1,60 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.Collections.Generic;
+
+namespace OpenAI.VectorStores
+{
+    public partial class VectorStore
+    {
+        internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; }
+        internal VectorStore(string id, DateTimeOffset createdAt, string name, int usageBytes, VectorStoreFileCounts fileCounts, VectorStoreStatus status, DateTimeOffset? lastActiveAt, IReadOnlyDictionary<string, string> metadata)
+        {
+            Argument.AssertNotNull(id, nameof(id));
+            Argument.AssertNotNull(name, nameof(name));
+            Argument.AssertNotNull(fileCounts, nameof(fileCounts));
+
+            Id = id;
+            CreatedAt = createdAt;
+            Name = name;
+            UsageBytes = usageBytes;
+            FileCounts = fileCounts;
+            Status = status;
+            LastActiveAt = lastActiveAt;
+            Metadata = metadata;
+        }
+
+        internal VectorStore(string id, InternalVectorStoreObjectObject @object, DateTimeOffset createdAt, string name, int usageBytes, VectorStoreFileCounts fileCounts, VectorStoreStatus status, VectorStoreExpirationPolicy expirationPolicy, DateTimeOffset? expiresAt, DateTimeOffset? lastActiveAt, IReadOnlyDictionary<string, string> metadata, IDictionary<string, BinaryData> serializedAdditionalRawData)
+        {
+            Id = id;
+            Object = @object;
+            CreatedAt = createdAt;
+            Name = name;
+            UsageBytes = usageBytes;
+            FileCounts = fileCounts;
+            Status = status;
+            ExpirationPolicy = expirationPolicy;
+            ExpiresAt = expiresAt;
+            LastActiveAt = lastActiveAt;
+            Metadata = metadata;
+            SerializedAdditionalRawData = serializedAdditionalRawData;
+        }
+
+        internal VectorStore()
+        {
+        }
+
+        public string Id { get; }
+
+        public DateTimeOffset CreatedAt { get; }
+        public string Name { get; }
+        public int UsageBytes { get; }
+        public VectorStoreFileCounts FileCounts { get; }
+        public VectorStoreStatus Status { get; }
+        public DateTimeOffset? ExpiresAt { get; }
+        public DateTimeOffset?
LastActiveAt { get; } + public IReadOnlyDictionary Metadata { get; } + } +} diff --git a/.dotnet/src/Generated/Models/VectorStoreBatchFileJob.Serialization.cs b/.dotnet/src/Generated/Models/VectorStoreBatchFileJob.Serialization.cs new file mode 100644 index 000000000..bdfa594c4 --- /dev/null +++ b/.dotnet/src/Generated/Models/VectorStoreBatchFileJob.Serialization.cs @@ -0,0 +1,195 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.VectorStores +{ + public partial class VectorStoreBatchFileJob : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(VectorStoreBatchFileJob)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("id") != true) + { + writer.WritePropertyName("id"u8); + writer.WriteStringValue(BatchId); + } + if (SerializedAdditionalRawData?.ContainsKey("object") != true) + { + writer.WritePropertyName("object"u8); + writer.WriteObjectValue(Object, options); + } + if (SerializedAdditionalRawData?.ContainsKey("created_at") != true) + { + writer.WritePropertyName("created_at"u8); + writer.WriteNumberValue(CreatedAt, "U"); + } + if (SerializedAdditionalRawData?.ContainsKey("vector_store_id") != true) + { + writer.WritePropertyName("vector_store_id"u8); + writer.WriteStringValue(VectorStoreId); + } + if (SerializedAdditionalRawData?.ContainsKey("status") != true) + { + writer.WritePropertyName("status"u8); + writer.WriteStringValue(Status.ToString()); + } + if (SerializedAdditionalRawData?.ContainsKey("file_counts") != true) + { + writer.WritePropertyName("file_counts"u8); + writer.WriteObjectValue(FileCounts, options); 
+ } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + VectorStoreBatchFileJob IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(VectorStoreBatchFileJob)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeVectorStoreBatchFileJob(document.RootElement, options); + } + + internal static VectorStoreBatchFileJob DeserializeVectorStoreBatchFileJob(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string id = default; + object @object = default; + DateTimeOffset createdAt = default; + string vectorStoreId = default; + VectorStoreBatchFileJobStatus status = default; + VectorStoreFileCounts fileCounts = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("id"u8)) + { + id = property.Value.GetString(); + continue; + } + if (property.NameEquals("object"u8)) + { + @object = property.Value.GetObject(); + continue; + } + if (property.NameEquals("created_at"u8)) + { + createdAt = DateTimeOffset.FromUnixTimeSeconds(property.Value.GetInt64()); + continue; + } + if 
(property.NameEquals("vector_store_id"u8)) + { + vectorStoreId = property.Value.GetString(); + continue; + } + if (property.NameEquals("status"u8)) + { + status = new VectorStoreBatchFileJobStatus(property.Value.GetString()); + continue; + } + if (property.NameEquals("file_counts"u8)) + { + fileCounts = VectorStoreFileCounts.DeserializeVectorStoreFileCounts(property.Value, options); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new VectorStoreBatchFileJob( + id, + @object, + createdAt, + vectorStoreId, + status, + fileCounts, + serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(VectorStoreBatchFileJob)} does not support writing '{options.Format}' format."); + } + } + + VectorStoreBatchFileJob IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeVectorStoreBatchFileJob(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(VectorStoreBatchFileJob)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static VectorStoreBatchFileJob FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeVectorStoreBatchFileJob(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/VectorStoreBatchFileJob.cs b/.dotnet/src/Generated/Models/VectorStoreBatchFileJob.cs new file mode 100644 index 000000000..a584a5e60 --- /dev/null +++ b/.dotnet/src/Generated/Models/VectorStoreBatchFileJob.cs @@ -0,0 +1,45 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.VectorStores +{ + public partial class VectorStoreBatchFileJob + { + internal IDictionary SerializedAdditionalRawData { get; set; } + internal VectorStoreBatchFileJob(string batchId, DateTimeOffset createdAt, string vectorStoreId, VectorStoreBatchFileJobStatus status, VectorStoreFileCounts fileCounts) + { + Argument.AssertNotNull(batchId, nameof(batchId)); + Argument.AssertNotNull(vectorStoreId, nameof(vectorStoreId)); + Argument.AssertNotNull(fileCounts, nameof(fileCounts)); + + BatchId = batchId; + CreatedAt = createdAt; + VectorStoreId = vectorStoreId; + Status = status; + FileCounts = fileCounts; + } + + internal VectorStoreBatchFileJob(string batchId, object @object, DateTimeOffset createdAt, string vectorStoreId, VectorStoreBatchFileJobStatus status, 
VectorStoreFileCounts fileCounts, IDictionary<string, BinaryData> serializedAdditionalRawData)
+        {
+            BatchId = batchId;
+            Object = @object;
+            CreatedAt = createdAt;
+            VectorStoreId = vectorStoreId;
+            Status = status;
+            FileCounts = fileCounts;
+            SerializedAdditionalRawData = serializedAdditionalRawData;
+        }
+
+        internal VectorStoreBatchFileJob()
+        {
+        }
+
+        public DateTimeOffset CreatedAt { get; }
+        public string VectorStoreId { get; }
+        public VectorStoreBatchFileJobStatus Status { get; }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/VectorStoreBatchFileJobStatus.cs b/.dotnet/src/Generated/Models/VectorStoreBatchFileJobStatus.cs
new file mode 100644
index 000000000..c47559446
--- /dev/null
+++ b/.dotnet/src/Generated/Models/VectorStoreBatchFileJobStatus.cs
@@ -0,0 +1,40 @@
+// <auto-generated/>
+
+#nullable disable
+
+using System;
+using System.ComponentModel;
+
+namespace OpenAI.VectorStores
+{
+    public readonly partial struct VectorStoreBatchFileJobStatus : IEquatable<VectorStoreBatchFileJobStatus>
+    {
+        private readonly string _value;
+
+        public VectorStoreBatchFileJobStatus(string value)
+        {
+            _value = value ?? throw new ArgumentNullException(nameof(value));
+        }
+
+        private const string InProgressValue = "in_progress";
+        private const string CompletedValue = "completed";
+        private const string CancelledValue = "cancelled";
+        private const string FailedValue = "failed";
+
+        public static VectorStoreBatchFileJobStatus InProgress { get; } = new VectorStoreBatchFileJobStatus(InProgressValue);
+        public static VectorStoreBatchFileJobStatus Completed { get; } = new VectorStoreBatchFileJobStatus(CompletedValue);
+        public static VectorStoreBatchFileJobStatus Cancelled { get; } = new VectorStoreBatchFileJobStatus(CancelledValue);
+        public static VectorStoreBatchFileJobStatus Failed { get; } = new VectorStoreBatchFileJobStatus(FailedValue);
+        public static bool operator ==(VectorStoreBatchFileJobStatus left, VectorStoreBatchFileJobStatus right) => left.Equals(right);
+        public static bool operator !=(VectorStoreBatchFileJobStatus left, VectorStoreBatchFileJobStatus right) => !left.Equals(right);
+        public static implicit operator VectorStoreBatchFileJobStatus(string value) => new VectorStoreBatchFileJobStatus(value);
+
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override bool Equals(object obj) => obj is VectorStoreBatchFileJobStatus other && Equals(other);
+        public bool Equals(VectorStoreBatchFileJobStatus other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase);
+
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override int GetHashCode() => _value != null ?
StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/VectorStoreCreationHelper.Serialization.cs b/.dotnet/src/Generated/Models/VectorStoreCreationHelper.Serialization.cs new file mode 100644 index 000000000..3004632e0 --- /dev/null +++ b/.dotnet/src/Generated/Models/VectorStoreCreationHelper.Serialization.cs @@ -0,0 +1,189 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; +using OpenAI.VectorStores; + +namespace OpenAI.Assistants +{ + public partial class VectorStoreCreationHelper : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(VectorStoreCreationHelper)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("file_ids") != true && Optional.IsCollectionDefined(FileIds)) + { + writer.WritePropertyName("file_ids"u8); + writer.WriteStartArray(); + foreach (var item in FileIds) + { + writer.WriteStringValue(item); + } + writer.WriteEndArray(); + } + if (SerializedAdditionalRawData?.ContainsKey("chunking_strategy") != true && Optional.IsDefined(ChunkingStrategy)) + { + writer.WritePropertyName("chunking_strategy"u8); + writer.WriteObjectValue(ChunkingStrategy, options); + } + if (SerializedAdditionalRawData?.ContainsKey("metadata") != true && Optional.IsCollectionDefined(Metadata)) + { + writer.WritePropertyName("metadata"u8); + writer.WriteStartObject(); + foreach (var item in Metadata) + { + writer.WritePropertyName(item.Key); + writer.WriteStringValue(item.Value); + } + writer.WriteEndObject(); + } + if (SerializedAdditionalRawData 
!= null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + VectorStoreCreationHelper IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(VectorStoreCreationHelper)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeVectorStoreCreationHelper(document.RootElement, options); + } + + internal static VectorStoreCreationHelper DeserializeVectorStoreCreationHelper(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + IList fileIds = default; + FileChunkingStrategy chunkingStrategy = default; + IDictionary metadata = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("file_ids"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + List array = new List(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(item.GetString()); + } + fileIds = array; + continue; + } + if (property.NameEquals("chunking_strategy"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + chunkingStrategy = 
FileChunkingStrategy.DeserializeFileChunkingStrategy(property.Value, options); + continue; + } + if (property.NameEquals("metadata"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + Dictionary dictionary = new Dictionary(); + foreach (var property0 in property.Value.EnumerateObject()) + { + dictionary.Add(property0.Name, property0.Value.GetString()); + } + metadata = dictionary; + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new VectorStoreCreationHelper(fileIds ?? new ChangeTrackingList(), chunkingStrategy, metadata ?? new ChangeTrackingDictionary(), serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(VectorStoreCreationHelper)} does not support writing '{options.Format}' format."); + } + } + + VectorStoreCreationHelper IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeVectorStoreCreationHelper(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(VectorStoreCreationHelper)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static VectorStoreCreationHelper FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeVectorStoreCreationHelper(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/VectorStoreCreationHelper.cs b/.dotnet/src/Generated/Models/VectorStoreCreationHelper.cs new file mode 100644 index 000000000..6871ab916 --- /dev/null +++ b/.dotnet/src/Generated/Models/VectorStoreCreationHelper.cs @@ -0,0 +1,31 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; +using OpenAI.VectorStores; + +namespace OpenAI.Assistants +{ + public partial class VectorStoreCreationHelper + { + internal IDictionary SerializedAdditionalRawData { get; set; } + public VectorStoreCreationHelper() + { + FileIds = new ChangeTrackingList(); + Metadata = new ChangeTrackingDictionary(); + } + + internal VectorStoreCreationHelper(IList fileIds, FileChunkingStrategy chunkingStrategy, IDictionary metadata, IDictionary serializedAdditionalRawData) + { + FileIds = fileIds; + ChunkingStrategy = chunkingStrategy; + Metadata = metadata; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + public IList FileIds { get; } + public IDictionary Metadata { get; } + } +} diff --git a/.dotnet/src/Generated/Models/VectorStoreCreationOptions.Serialization.cs 
b/.dotnet/src/Generated/Models/VectorStoreCreationOptions.Serialization.cs new file mode 100644 index 000000000..4b08b506a --- /dev/null +++ b/.dotnet/src/Generated/Models/VectorStoreCreationOptions.Serialization.cs @@ -0,0 +1,227 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.VectorStores +{ + public partial class VectorStoreCreationOptions : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(VectorStoreCreationOptions)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("file_ids") != true && Optional.IsCollectionDefined(FileIds)) + { + writer.WritePropertyName("file_ids"u8); + writer.WriteStartArray(); + foreach (var item in FileIds) + { + writer.WriteStringValue(item); + } + writer.WriteEndArray(); + } + if (SerializedAdditionalRawData?.ContainsKey("name") != true && Optional.IsDefined(Name)) + { + writer.WritePropertyName("name"u8); + writer.WriteStringValue(Name); + } + if (SerializedAdditionalRawData?.ContainsKey("expires_after") != true && Optional.IsDefined(ExpirationPolicy)) + { + writer.WritePropertyName("expires_after"u8); + writer.WriteObjectValue(ExpirationPolicy, options); + } + if (SerializedAdditionalRawData?.ContainsKey("chunking_strategy") != true && Optional.IsDefined(ChunkingStrategy)) + { + writer.WritePropertyName("chunking_strategy"u8); + writer.WriteObjectValue(ChunkingStrategy, options); + } + if (SerializedAdditionalRawData?.ContainsKey("metadata") != true && Optional.IsCollectionDefined(Metadata)) + { + if (Metadata != null) + { + writer.WritePropertyName("metadata"u8); + 
writer.WriteStartObject(); + foreach (var item in Metadata) + { + writer.WritePropertyName(item.Key); + writer.WriteStringValue(item.Value); + } + writer.WriteEndObject(); + } + else + { + writer.WriteNull("metadata"); + } + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + VectorStoreCreationOptions IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(VectorStoreCreationOptions)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeVectorStoreCreationOptions(document.RootElement, options); + } + + internal static VectorStoreCreationOptions DeserializeVectorStoreCreationOptions(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + IList fileIds = default; + string name = default; + VectorStoreExpirationPolicy expiresAfter = default; + FileChunkingStrategy chunkingStrategy = default; + IDictionary metadata = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("file_ids"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + List array = new 
List(); + foreach (var item in property.Value.EnumerateArray()) + { + array.Add(item.GetString()); + } + fileIds = array; + continue; + } + if (property.NameEquals("name"u8)) + { + name = property.Value.GetString(); + continue; + } + if (property.NameEquals("expires_after"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + expiresAfter = VectorStoreExpirationPolicy.DeserializeVectorStoreExpirationPolicy(property.Value, options); + continue; + } + if (property.NameEquals("chunking_strategy"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + chunkingStrategy = FileChunkingStrategy.DeserializeFileChunkingStrategy(property.Value, options); + continue; + } + if (property.NameEquals("metadata"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + Dictionary dictionary = new Dictionary(); + foreach (var property0 in property.Value.EnumerateObject()) + { + dictionary.Add(property0.Name, property0.Value.GetString()); + } + metadata = dictionary; + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new VectorStoreCreationOptions( + fileIds ?? new ChangeTrackingList(), + name, + expiresAfter, + chunkingStrategy, + metadata ?? new ChangeTrackingDictionary(), + serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(VectorStoreCreationOptions)} does not support writing '{options.Format}' format."); + } + } + + VectorStoreCreationOptions IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeVectorStoreCreationOptions(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(VectorStoreCreationOptions)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static VectorStoreCreationOptions FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeVectorStoreCreationOptions(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/VectorStoreCreationOptions.cs b/.dotnet/src/Generated/Models/VectorStoreCreationOptions.cs new file mode 100644 index 000000000..6c26058e0 --- /dev/null +++ b/.dotnet/src/Generated/Models/VectorStoreCreationOptions.cs @@ -0,0 +1,31 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.VectorStores +{ + public partial class VectorStoreCreationOptions + { + internal IDictionary SerializedAdditionalRawData { get; set; } + public VectorStoreCreationOptions() + { + FileIds = new ChangeTrackingList(); + Metadata = new ChangeTrackingDictionary(); + } + + internal 
VectorStoreCreationOptions(IList<string> fileIds, string name, VectorStoreExpirationPolicy expirationPolicy, FileChunkingStrategy chunkingStrategy, IDictionary<string, string> metadata, IDictionary<string, BinaryData> serializedAdditionalRawData)
+        {
+            FileIds = fileIds;
+            Name = name;
+            ExpirationPolicy = expirationPolicy;
+            ChunkingStrategy = chunkingStrategy;
+            Metadata = metadata;
+            SerializedAdditionalRawData = serializedAdditionalRawData;
+        }
+        public string Name { get; set; }
+        public IDictionary<string, string> Metadata { get; set; }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/VectorStoreExpirationAnchor.Serialization.cs b/.dotnet/src/Generated/Models/VectorStoreExpirationAnchor.Serialization.cs
new file mode 100644
index 000000000..90af403e1
--- /dev/null
+++ b/.dotnet/src/Generated/Models/VectorStoreExpirationAnchor.Serialization.cs
@@ -0,0 +1,23 @@
+//
+
+#nullable disable
+
+using System;
+
+namespace OpenAI.VectorStores
+{
+    internal static partial class VectorStoreExpirationAnchorExtensions
+    {
+        public static string ToSerialString(this VectorStoreExpirationAnchor value) => value switch
+        {
+            VectorStoreExpirationAnchor.LastActiveAt => "last_active_at",
+            _ => throw new ArgumentOutOfRangeException(nameof(value), value, "Unknown VectorStoreExpirationAnchor value.")
+        };
+
+        public static VectorStoreExpirationAnchor ToVectorStoreExpirationAnchor(this string value)
+        {
+            if (StringComparer.OrdinalIgnoreCase.Equals(value, "last_active_at")) return VectorStoreExpirationAnchor.LastActiveAt;
+            throw new ArgumentOutOfRangeException(nameof(value), value, "Unknown VectorStoreExpirationAnchor value.");
+        }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/VectorStoreExpirationPolicy.Serialization.cs b/.dotnet/src/Generated/Models/VectorStoreExpirationPolicy.Serialization.cs
new file mode 100644
index 000000000..76b1d5d3c
--- /dev/null
+++ b/.dotnet/src/Generated/Models/VectorStoreExpirationPolicy.Serialization.cs
@@ -0,0 +1,144 @@
+//
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using
System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.VectorStores +{ + public partial class VectorStoreExpirationPolicy : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(VectorStoreExpirationPolicy)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("anchor") != true) + { + writer.WritePropertyName("anchor"u8); + writer.WriteStringValue(_anchor.ToSerialString()); + } + if (SerializedAdditionalRawData?.ContainsKey("days") != true) + { + writer.WritePropertyName("days"u8); + writer.WriteNumberValue(_days); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + VectorStoreExpirationPolicy IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(VectorStoreExpirationPolicy)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeVectorStoreExpirationPolicy(document.RootElement, options); + } + + internal static VectorStoreExpirationPolicy DeserializeVectorStoreExpirationPolicy(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + VectorStoreExpirationAnchor anchor = default; + int days = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("anchor"u8)) + { + anchor = property.Value.GetString().ToVectorStoreExpirationAnchor(); + continue; + } + if (property.NameEquals("days"u8)) + { + days = property.Value.GetInt32(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new VectorStoreExpirationPolicy(anchor, days, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(VectorStoreExpirationPolicy)} does not support writing '{options.Format}' format."); + } + } + + VectorStoreExpirationPolicy IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeVectorStoreExpirationPolicy(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(VectorStoreExpirationPolicy)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static VectorStoreExpirationPolicy FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeVectorStoreExpirationPolicy(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/VectorStoreExpirationPolicy.cs b/.dotnet/src/Generated/Models/VectorStoreExpirationPolicy.cs new file mode 100644 index 000000000..3fd71c7e7 --- /dev/null +++ b/.dotnet/src/Generated/Models/VectorStoreExpirationPolicy.cs @@ -0,0 +1,13 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.VectorStores +{ + public partial class VectorStoreExpirationPolicy + { + } +} diff --git a/.dotnet/src/Generated/Models/VectorStoreFileAssociation.Serialization.cs b/.dotnet/src/Generated/Models/VectorStoreFileAssociation.Serialization.cs new file mode 100644 index 000000000..ede8e0d5b --- /dev/null +++ b/.dotnet/src/Generated/Models/VectorStoreFileAssociation.Serialization.cs @@ -0,0 +1,235 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.VectorStores +{ + public partial class VectorStoreFileAssociation : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions 
options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(VectorStoreFileAssociation)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("id") != true) + { + writer.WritePropertyName("id"u8); + writer.WriteStringValue(FileId); + } + if (SerializedAdditionalRawData?.ContainsKey("object") != true) + { + writer.WritePropertyName("object"u8); + writer.WriteStringValue(Object.ToString()); + } + if (SerializedAdditionalRawData?.ContainsKey("usage_bytes") != true) + { + writer.WritePropertyName("usage_bytes"u8); + writer.WriteNumberValue(Size); + } + if (SerializedAdditionalRawData?.ContainsKey("created_at") != true) + { + writer.WritePropertyName("created_at"u8); + writer.WriteNumberValue(CreatedAt, "U"); + } + if (SerializedAdditionalRawData?.ContainsKey("vector_store_id") != true) + { + writer.WritePropertyName("vector_store_id"u8); + writer.WriteStringValue(VectorStoreId); + } + if (SerializedAdditionalRawData?.ContainsKey("status") != true) + { + writer.WritePropertyName("status"u8); + writer.WriteStringValue(Status.ToSerialString()); + } + if (SerializedAdditionalRawData?.ContainsKey("last_error") != true) + { + if (LastError != null) + { + writer.WritePropertyName("last_error"u8); + writer.WriteObjectValue(LastError, options); + } + else + { + writer.WriteNull("last_error"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("chunking_strategy") != true && Optional.IsDefined(ChunkingStrategy)) + { + writer.WritePropertyName("chunking_strategy"u8); + writer.WriteObjectValue(ChunkingStrategy, options); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if 
NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + VectorStoreFileAssociation IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(VectorStoreFileAssociation)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeVectorStoreFileAssociation(document.RootElement, options); + } + + internal static VectorStoreFileAssociation DeserializeVectorStoreFileAssociation(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string id = default; + InternalVectorStoreFileObjectObject @object = default; + int usageBytes = default; + DateTimeOffset createdAt = default; + string vectorStoreId = default; + VectorStoreFileAssociationStatus status = default; + VectorStoreFileAssociationError lastError = default; + FileChunkingStrategy chunkingStrategy = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("id"u8)) + { + id = property.Value.GetString(); + continue; + } + if (property.NameEquals("object"u8)) + { + @object = new InternalVectorStoreFileObjectObject(property.Value.GetString()); + continue; + } + if (property.NameEquals("usage_bytes"u8)) + { + usageBytes = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("created_at"u8)) + { + createdAt = 
DateTimeOffset.FromUnixTimeSeconds(property.Value.GetInt64()); + continue; + } + if (property.NameEquals("vector_store_id"u8)) + { + vectorStoreId = property.Value.GetString(); + continue; + } + if (property.NameEquals("status"u8)) + { + status = property.Value.GetString().ToVectorStoreFileAssociationStatus(); + continue; + } + if (property.NameEquals("last_error"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + lastError = null; + continue; + } + lastError = VectorStoreFileAssociationError.DeserializeVectorStoreFileAssociationError(property.Value, options); + continue; + } + if (property.NameEquals("chunking_strategy"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + chunkingStrategy = FileChunkingStrategy.DeserializeFileChunkingStrategy(property.Value, options); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new VectorStoreFileAssociation( + id, + @object, + usageBytes, + createdAt, + vectorStoreId, + status, + lastError, + chunkingStrategy, + serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(VectorStoreFileAssociation)} does not support writing '{options.Format}' format."); + } + } + + VectorStoreFileAssociation IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeVectorStoreFileAssociation(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(VectorStoreFileAssociation)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static VectorStoreFileAssociation FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeVectorStoreFileAssociation(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/VectorStoreFileAssociation.cs b/.dotnet/src/Generated/Models/VectorStoreFileAssociation.cs new file mode 100644 index 000000000..c7da0b027 --- /dev/null +++ b/.dotnet/src/Generated/Models/VectorStoreFileAssociation.cs @@ -0,0 +1,47 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.VectorStores +{ + public partial class VectorStoreFileAssociation + { + internal IDictionary SerializedAdditionalRawData { get; set; } + internal VectorStoreFileAssociation(string fileId, int size, DateTimeOffset createdAt, string vectorStoreId, VectorStoreFileAssociationStatus status, VectorStoreFileAssociationError lastError) + { + Argument.AssertNotNull(fileId, nameof(fileId)); + Argument.AssertNotNull(vectorStoreId, nameof(vectorStoreId)); + + FileId = fileId; + Size = size; + CreatedAt = createdAt; + VectorStoreId = vectorStoreId; + Status = status; + LastError = lastError; + } + + internal VectorStoreFileAssociation(string fileId, InternalVectorStoreFileObjectObject @object, int size, DateTimeOffset createdAt, string vectorStoreId, 
VectorStoreFileAssociationStatus status, VectorStoreFileAssociationError lastError, FileChunkingStrategy chunkingStrategy, IDictionary<string, BinaryData> serializedAdditionalRawData)
+        {
+            FileId = fileId;
+            Object = @object;
+            Size = size;
+            CreatedAt = createdAt;
+            VectorStoreId = vectorStoreId;
+            Status = status;
+            LastError = lastError;
+            ChunkingStrategy = chunkingStrategy;
+            SerializedAdditionalRawData = serializedAdditionalRawData;
+        }
+
+        internal VectorStoreFileAssociation()
+        {
+        }
+        public DateTimeOffset CreatedAt { get; }
+        public string VectorStoreId { get; }
+        public VectorStoreFileAssociationStatus Status { get; }
+        public VectorStoreFileAssociationError LastError { get; }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/VectorStoreFileAssociationError.Serialization.cs b/.dotnet/src/Generated/Models/VectorStoreFileAssociationError.Serialization.cs
new file mode 100644
index 000000000..9bc7e2126
--- /dev/null
+++ b/.dotnet/src/Generated/Models/VectorStoreFileAssociationError.Serialization.cs
@@ -0,0 +1,144 @@
+//
+
+#nullable disable
+
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Text.Json;
+
+namespace OpenAI.VectorStores
+{
+    public partial class VectorStoreFileAssociationError : IJsonModel<VectorStoreFileAssociationError>
+    {
+        void IJsonModel<VectorStoreFileAssociationError>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options)
+        {
+            var format = options.Format == "W" ?
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(VectorStoreFileAssociationError)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("code") != true) + { + writer.WritePropertyName("code"u8); + writer.WriteStringValue(Code.ToString()); + } + if (SerializedAdditionalRawData?.ContainsKey("message") != true) + { + writer.WritePropertyName("message"u8); + writer.WriteStringValue(Message); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + VectorStoreFileAssociationError IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(VectorStoreFileAssociationError)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeVectorStoreFileAssociationError(document.RootElement, options); + } + + internal static VectorStoreFileAssociationError DeserializeVectorStoreFileAssociationError(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + VectorStoreFileAssociationErrorCode code = default; + string message = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("code"u8)) + { + code = new VectorStoreFileAssociationErrorCode(property.Value.GetString()); + continue; + } + if (property.NameEquals("message"u8)) + { + message = property.Value.GetString(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new VectorStoreFileAssociationError(code, message, serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(VectorStoreFileAssociationError)} does not support writing '{options.Format}' format."); + } + } + + VectorStoreFileAssociationError IPersistableModel.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeVectorStoreFileAssociationError(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(VectorStoreFileAssociationError)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static VectorStoreFileAssociationError FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeVectorStoreFileAssociationError(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/VectorStoreFileAssociationError.cs b/.dotnet/src/Generated/Models/VectorStoreFileAssociationError.cs new file mode 100644 index 000000000..638e7f527 --- /dev/null +++ b/.dotnet/src/Generated/Models/VectorStoreFileAssociationError.cs @@ -0,0 +1,35 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.VectorStores +{ + public partial class VectorStoreFileAssociationError + { + internal IDictionary SerializedAdditionalRawData { get; set; } + internal VectorStoreFileAssociationError(VectorStoreFileAssociationErrorCode code, string message) + { + 
Argument.AssertNotNull(message, nameof(message));
+
+            Code = code;
+            Message = message;
+        }
+
+        internal VectorStoreFileAssociationError(VectorStoreFileAssociationErrorCode code, string message, IDictionary<string, BinaryData> serializedAdditionalRawData)
+        {
+            Code = code;
+            Message = message;
+            SerializedAdditionalRawData = serializedAdditionalRawData;
+        }
+
+        internal VectorStoreFileAssociationError()
+        {
+        }
+
+        public VectorStoreFileAssociationErrorCode Code { get; }
+        public string Message { get; }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/VectorStoreFileAssociationErrorCode.cs b/.dotnet/src/Generated/Models/VectorStoreFileAssociationErrorCode.cs
new file mode 100644
index 000000000..167ad6b2a
--- /dev/null
+++ b/.dotnet/src/Generated/Models/VectorStoreFileAssociationErrorCode.cs
@@ -0,0 +1,38 @@
+//
+
+#nullable disable
+
+using System;
+using System.ComponentModel;
+
+namespace OpenAI.VectorStores
+{
+    public readonly partial struct VectorStoreFileAssociationErrorCode : IEquatable<VectorStoreFileAssociationErrorCode>
+    {
+        private readonly string _value;
+
+        public VectorStoreFileAssociationErrorCode(string value)
+        {
+            _value = value ?? throw new ArgumentNullException(nameof(value));
+        }
+
+        private const string ServerErrorValue = "server_error";
+        private const string UnsupportedFileValue = "unsupported_file";
+        private const string InvalidFileValue = "invalid_file";
+
+        public static VectorStoreFileAssociationErrorCode ServerError { get; } = new VectorStoreFileAssociationErrorCode(ServerErrorValue);
+        public static VectorStoreFileAssociationErrorCode UnsupportedFile { get; } = new VectorStoreFileAssociationErrorCode(UnsupportedFileValue);
+        public static VectorStoreFileAssociationErrorCode InvalidFile { get; } = new VectorStoreFileAssociationErrorCode(InvalidFileValue);
+        public static bool operator ==(VectorStoreFileAssociationErrorCode left, VectorStoreFileAssociationErrorCode right) => left.Equals(right);
+        public static bool operator !=(VectorStoreFileAssociationErrorCode left, VectorStoreFileAssociationErrorCode right) => !left.Equals(right);
+        public static implicit operator VectorStoreFileAssociationErrorCode(string value) => new VectorStoreFileAssociationErrorCode(value);
+
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override bool Equals(object obj) => obj is VectorStoreFileAssociationErrorCode other && Equals(other);
+        public bool Equals(VectorStoreFileAssociationErrorCode other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase);
+
+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public override int GetHashCode() => _value != null ? StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0;
+        public override string ToString() => _value;
+    }
+}
diff --git a/.dotnet/src/Generated/Models/VectorStoreFileAssociationStatus.Serialization.cs b/.dotnet/src/Generated/Models/VectorStoreFileAssociationStatus.Serialization.cs
new file mode 100644
index 000000000..fc493ac6a
--- /dev/null
+++ b/.dotnet/src/Generated/Models/VectorStoreFileAssociationStatus.Serialization.cs
@@ -0,0 +1,29 @@
+//
+
+#nullable disable
+
+using System;
+
+namespace OpenAI.VectorStores
+{
+    internal static partial class VectorStoreFileAssociationStatusExtensions
+    {
+        public static string ToSerialString(this VectorStoreFileAssociationStatus value) => value switch
+        {
+            VectorStoreFileAssociationStatus.InProgress => "in_progress",
+            VectorStoreFileAssociationStatus.Completed => "completed",
+            VectorStoreFileAssociationStatus.Cancelled => "cancelled",
+            VectorStoreFileAssociationStatus.Failed => "failed",
+            _ => throw new ArgumentOutOfRangeException(nameof(value), value, "Unknown VectorStoreFileAssociationStatus value.")
+        };
+
+        public static VectorStoreFileAssociationStatus ToVectorStoreFileAssociationStatus(this string value)
+        {
+            if (StringComparer.OrdinalIgnoreCase.Equals(value, "in_progress")) return VectorStoreFileAssociationStatus.InProgress;
+            if (StringComparer.OrdinalIgnoreCase.Equals(value, "completed")) return VectorStoreFileAssociationStatus.Completed;
+            if (StringComparer.OrdinalIgnoreCase.Equals(value, "cancelled")) return VectorStoreFileAssociationStatus.Cancelled;
+            if (StringComparer.OrdinalIgnoreCase.Equals(value, "failed")) return VectorStoreFileAssociationStatus.Failed;
+            throw new ArgumentOutOfRangeException(nameof(value), value, "Unknown VectorStoreFileAssociationStatus value.");
+        }
+    }
+}
diff --git a/.dotnet/src/Generated/Models/VectorStoreFileCounts.Serialization.cs b/.dotnet/src/Generated/Models/VectorStoreFileCounts.Serialization.cs
new file mode 100644
index 000000000..0a4998a0f
--- /dev/null
+++ b/.dotnet/src/Generated/Models/VectorStoreFileCounts.Serialization.cs @@ -0,0 +1,183 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.VectorStores +{ + public partial class VectorStoreFileCounts : IJsonModel + { + void IJsonModel.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(VectorStoreFileCounts)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("in_progress") != true) + { + writer.WritePropertyName("in_progress"u8); + writer.WriteNumberValue(InProgress); + } + if (SerializedAdditionalRawData?.ContainsKey("completed") != true) + { + writer.WritePropertyName("completed"u8); + writer.WriteNumberValue(Completed); + } + if (SerializedAdditionalRawData?.ContainsKey("failed") != true) + { + writer.WritePropertyName("failed"u8); + writer.WriteNumberValue(Failed); + } + if (SerializedAdditionalRawData?.ContainsKey("cancelled") != true) + { + writer.WritePropertyName("cancelled"u8); + writer.WriteNumberValue(Cancelled); + } + if (SerializedAdditionalRawData?.ContainsKey("total") != true) + { + writer.WritePropertyName("total"u8); + writer.WriteNumberValue(Total); + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + 
VectorStoreFileCounts IJsonModel.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(VectorStoreFileCounts)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeVectorStoreFileCounts(document.RootElement, options); + } + + internal static VectorStoreFileCounts DeserializeVectorStoreFileCounts(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + int inProgress = default; + int completed = default; + int failed = default; + int cancelled = default; + int total = default; + IDictionary serializedAdditionalRawData = default; + Dictionary rawDataDictionary = new Dictionary(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("in_progress"u8)) + { + inProgress = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("completed"u8)) + { + completed = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("failed"u8)) + { + failed = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("cancelled"u8)) + { + cancelled = property.Value.GetInt32(); + continue; + } + if (property.NameEquals("total"u8)) + { + total = property.Value.GetInt32(); + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new VectorStoreFileCounts( + inProgress, + completed, + failed, + cancelled, + total, + serializedAdditionalRawData); + } + + BinaryData IPersistableModel.Write(ModelReaderWriterOptions options) + { + 
var format = options.Format == "W" ? ((IPersistableModel<VectorStoreFileCounts>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(VectorStoreFileCounts)} does not support writing '{options.Format}' format."); + } + } + + VectorStoreFileCounts IPersistableModel<VectorStoreFileCounts>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<VectorStoreFileCounts>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeVectorStoreFileCounts(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(VectorStoreFileCounts)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<VectorStoreFileCounts>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static VectorStoreFileCounts FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeVectorStoreFileCounts(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/VectorStoreFileCounts.cs b/.dotnet/src/Generated/Models/VectorStoreFileCounts.cs new file mode 100644 index 000000000..7d35eacce --- /dev/null +++ b/.dotnet/src/Generated/Models/VectorStoreFileCounts.cs @@ -0,0 +1,42 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.VectorStores +{ + public partial class VectorStoreFileCounts + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + internal VectorStoreFileCounts(int inProgress, int completed, int failed, int cancelled, int total) + { + InProgress = inProgress; + Completed = completed; + Failed = 
failed; + Cancelled = cancelled; + Total = total; + } + + internal VectorStoreFileCounts(int inProgress, int completed, int failed, int cancelled, int total, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + InProgress = inProgress; + Completed = completed; + Failed = failed; + Cancelled = cancelled; + Total = total; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + internal VectorStoreFileCounts() + { + } + + public int InProgress { get; } + public int Completed { get; } + public int Failed { get; } + public int Cancelled { get; } + public int Total { get; } + } +} diff --git a/.dotnet/src/Generated/Models/VectorStoreFileStatusFilter.cs b/.dotnet/src/Generated/Models/VectorStoreFileStatusFilter.cs new file mode 100644 index 000000000..762b7f106 --- /dev/null +++ b/.dotnet/src/Generated/Models/VectorStoreFileStatusFilter.cs @@ -0,0 +1,40 @@ +// + +#nullable disable + +using System; +using System.ComponentModel; + +namespace OpenAI.VectorStores +{ + public readonly partial struct VectorStoreFileStatusFilter : IEquatable<VectorStoreFileStatusFilter> + { + private readonly string _value; + + public VectorStoreFileStatusFilter(string value) + { + _value = value ?? 
throw new ArgumentNullException(nameof(value)); + } + + private const string InProgressValue = "in_progress"; + private const string CompletedValue = "completed"; + private const string FailedValue = "failed"; + private const string CancelledValue = "cancelled"; + + public static VectorStoreFileStatusFilter InProgress { get; } = new VectorStoreFileStatusFilter(InProgressValue); + public static VectorStoreFileStatusFilter Completed { get; } = new VectorStoreFileStatusFilter(CompletedValue); + public static VectorStoreFileStatusFilter Failed { get; } = new VectorStoreFileStatusFilter(FailedValue); + public static VectorStoreFileStatusFilter Cancelled { get; } = new VectorStoreFileStatusFilter(CancelledValue); + public static bool operator ==(VectorStoreFileStatusFilter left, VectorStoreFileStatusFilter right) => left.Equals(right); + public static bool operator !=(VectorStoreFileStatusFilter left, VectorStoreFileStatusFilter right) => !left.Equals(right); + public static implicit operator VectorStoreFileStatusFilter(string value) => new VectorStoreFileStatusFilter(value); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override bool Equals(object obj) => obj is VectorStoreFileStatusFilter other && Equals(other); + public bool Equals(VectorStoreFileStatusFilter other) => string.Equals(_value, other._value, StringComparison.InvariantCultureIgnoreCase); + + [EditorBrowsable(EditorBrowsableState.Never)] + public override int GetHashCode() => _value != null ? 
StringComparer.InvariantCultureIgnoreCase.GetHashCode(_value) : 0; + public override string ToString() => _value; + } +} diff --git a/.dotnet/src/Generated/Models/VectorStoreModificationOptions.Serialization.cs b/.dotnet/src/Generated/Models/VectorStoreModificationOptions.Serialization.cs new file mode 100644 index 000000000..16e2ec490 --- /dev/null +++ b/.dotnet/src/Generated/Models/VectorStoreModificationOptions.Serialization.cs @@ -0,0 +1,201 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI.VectorStores +{ + public partial class VectorStoreModificationOptions : IJsonModel<VectorStoreModificationOptions> + { + void IJsonModel<VectorStoreModificationOptions>.Write(Utf8JsonWriter writer, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<VectorStoreModificationOptions>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(VectorStoreModificationOptions)} does not support writing '{format}' format."); + } + + writer.WriteStartObject(); + if (SerializedAdditionalRawData?.ContainsKey("name") != true && Optional.IsDefined(Name)) + { + if (Name != null) + { + writer.WritePropertyName("name"u8); + writer.WriteStringValue(Name); + } + else + { + writer.WriteNull("name"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("expires_after") != true && Optional.IsDefined(ExpirationPolicy)) + { + if (ExpirationPolicy != null) + { + writer.WritePropertyName("expires_after"u8); + writer.WriteObjectValue(ExpirationPolicy, options); + } + else + { + writer.WriteNull("expires_after"); + } + } + if (SerializedAdditionalRawData?.ContainsKey("metadata") != true && Optional.IsCollectionDefined(Metadata)) + { + if (Metadata != null) + { + writer.WritePropertyName("metadata"u8); + writer.WriteStartObject(); + foreach (var item in Metadata) + { + writer.WritePropertyName(item.Key); + writer.WriteStringValue(item.Value); + } 
+ writer.WriteEndObject(); + } + else + { + writer.WriteNull("metadata"); + } + } + if (SerializedAdditionalRawData != null) + { + foreach (var item in SerializedAdditionalRawData) + { + if (ModelSerializationExtensions.IsSentinelValue(item.Value)) + { + continue; + } + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using (JsonDocument document = JsonDocument.Parse(item.Value)) + { + JsonSerializer.Serialize(writer, document.RootElement); + } +#endif + } + } + writer.WriteEndObject(); + } + + VectorStoreModificationOptions IJsonModel<VectorStoreModificationOptions>.Create(ref Utf8JsonReader reader, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<VectorStoreModificationOptions>)this).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(VectorStoreModificationOptions)} does not support reading '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return DeserializeVectorStoreModificationOptions(document.RootElement, options); + } + + internal static VectorStoreModificationOptions DeserializeVectorStoreModificationOptions(JsonElement element, ModelReaderWriterOptions options = null) + { + options ??= ModelSerializationExtensions.WireOptions; + + if (element.ValueKind == JsonValueKind.Null) + { + return null; + } + string name = default; + VectorStoreExpirationPolicy expiresAfter = default; + IDictionary<string, string> metadata = default; + IDictionary<string, BinaryData> serializedAdditionalRawData = default; + Dictionary<string, BinaryData> rawDataDictionary = new Dictionary<string, BinaryData>(); + foreach (var property in element.EnumerateObject()) + { + if (property.NameEquals("name"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + name = null; + continue; + } + name = property.Value.GetString(); + continue; + } + if (property.NameEquals("expires_after"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + expiresAfter = null; + continue; + } + 
expiresAfter = VectorStoreExpirationPolicy.DeserializeVectorStoreExpirationPolicy(property.Value, options); + continue; + } + if (property.NameEquals("metadata"u8)) + { + if (property.Value.ValueKind == JsonValueKind.Null) + { + continue; + } + Dictionary<string, string> dictionary = new Dictionary<string, string>(); + foreach (var property0 in property.Value.EnumerateObject()) + { + dictionary.Add(property0.Name, property0.Value.GetString()); + } + metadata = dictionary; + continue; + } + if (true) + { + rawDataDictionary ??= new Dictionary<string, BinaryData>(); + rawDataDictionary.Add(property.Name, BinaryData.FromString(property.Value.GetRawText())); + } + } + serializedAdditionalRawData = rawDataDictionary; + return new VectorStoreModificationOptions(name, expiresAfter, metadata ?? new ChangeTrackingDictionary<string, string>(), serializedAdditionalRawData); + } + + BinaryData IPersistableModel<VectorStoreModificationOptions>.Write(ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? ((IPersistableModel<VectorStoreModificationOptions>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + return ModelReaderWriter.Write(this, options); + default: + throw new FormatException($"The model {nameof(VectorStoreModificationOptions)} does not support writing '{options.Format}' format."); + } + } + + VectorStoreModificationOptions IPersistableModel<VectorStoreModificationOptions>.Create(BinaryData data, ModelReaderWriterOptions options) + { + var format = options.Format == "W" ? 
((IPersistableModel<VectorStoreModificationOptions>)this).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return DeserializeVectorStoreModificationOptions(document.RootElement, options); + } + default: + throw new FormatException($"The model {nameof(VectorStoreModificationOptions)} does not support reading '{options.Format}' format."); + } + } + + string IPersistableModel<VectorStoreModificationOptions>.GetFormatFromOptions(ModelReaderWriterOptions options) => "J"; + + internal static VectorStoreModificationOptions FromResponse(PipelineResponse response) + { + using var document = JsonDocument.Parse(response.Content); + return DeserializeVectorStoreModificationOptions(document.RootElement); + } + + internal virtual BinaryContent ToBinaryContent() + { + return BinaryContent.Create(this, ModelSerializationExtensions.WireOptions); + } + } +} diff --git a/.dotnet/src/Generated/Models/VectorStoreModificationOptions.cs b/.dotnet/src/Generated/Models/VectorStoreModificationOptions.cs new file mode 100644 index 000000000..8597db68c --- /dev/null +++ b/.dotnet/src/Generated/Models/VectorStoreModificationOptions.cs @@ -0,0 +1,29 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; + +namespace OpenAI.VectorStores +{ + public partial class VectorStoreModificationOptions + { + internal IDictionary<string, BinaryData> SerializedAdditionalRawData { get; set; } + public VectorStoreModificationOptions() + { + Metadata = new ChangeTrackingDictionary<string, string>(); + } + + internal VectorStoreModificationOptions(string name, VectorStoreExpirationPolicy expirationPolicy, IDictionary<string, string> metadata, IDictionary<string, BinaryData> serializedAdditionalRawData) + { + Name = name; + ExpirationPolicy = expirationPolicy; + Metadata = metadata; + SerializedAdditionalRawData = serializedAdditionalRawData; + } + + public string Name { get; set; } + public IDictionary<string, string> Metadata { get; set; } + } +} diff --git a/.dotnet/src/Generated/Models/VectorStoreStatus.Serialization.cs 
b/.dotnet/src/Generated/Models/VectorStoreStatus.Serialization.cs new file mode 100644 index 000000000..c4417aef5 --- /dev/null +++ b/.dotnet/src/Generated/Models/VectorStoreStatus.Serialization.cs @@ -0,0 +1,27 @@ +// + +#nullable disable + +using System; + +namespace OpenAI.VectorStores +{ + internal static partial class VectorStoreStatusExtensions + { + public static string ToSerialString(this VectorStoreStatus value) => value switch + { + VectorStoreStatus.Expired => "expired", + VectorStoreStatus.InProgress => "in_progress", + VectorStoreStatus.Completed => "completed", + _ => throw new ArgumentOutOfRangeException(nameof(value), value, "Unknown VectorStoreStatus value.") + }; + + public static VectorStoreStatus ToVectorStoreStatus(this string value) + { + if (StringComparer.OrdinalIgnoreCase.Equals(value, "expired")) return VectorStoreStatus.Expired; + if (StringComparer.OrdinalIgnoreCase.Equals(value, "in_progress")) return VectorStoreStatus.InProgress; + if (StringComparer.OrdinalIgnoreCase.Equals(value, "completed")) return VectorStoreStatus.Completed; + throw new ArgumentOutOfRangeException(nameof(value), value, "Unknown VectorStoreStatus value."); + } + } +} diff --git a/.dotnet/src/Generated/ModerationClient.cs b/.dotnet/src/Generated/ModerationClient.cs new file mode 100644 index 000000000..73c07013a --- /dev/null +++ b/.dotnet/src/Generated/ModerationClient.cs @@ -0,0 +1,47 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Threading.Tasks; + +namespace OpenAI.Moderations +{ + // Data plane generated sub-client. 
+ public partial class ModerationClient + { + private const string AuthorizationHeader = "Authorization"; + private readonly ApiKeyCredential _keyCredential; + private const string AuthorizationApiKeyPrefix = "Bearer"; + private readonly ClientPipeline _pipeline; + private readonly Uri _endpoint; + + public virtual ClientPipeline Pipeline => _pipeline; + + protected ModerationClient() + { + } + + internal PipelineMessage CreateCreateModerationRequest(BinaryContent content, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "POST"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/moderations", false); + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + request.Headers.Set("Content-Type", "application/json"); + request.Content = content; + message.Apply(options); + return message; + } + + private static PipelineMessageClassifier _pipelineMessageClassifier200; + private static PipelineMessageClassifier PipelineMessageClassifier200 => _pipelineMessageClassifier200 ??= PipelineMessageClassifier.Create(stackalloc ushort[] { 200 }); + } +} diff --git a/.dotnet/src/Generated/OpenAIClient.cs b/.dotnet/src/Generated/OpenAIClient.cs new file mode 100644 index 000000000..6b71023a8 --- /dev/null +++ b/.dotnet/src/Generated/OpenAIClient.cs @@ -0,0 +1,39 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Threading; +using OpenAI.Assistants; +using OpenAI.Audio; +using OpenAI.Batch; +using OpenAI.Chat; +using OpenAI.Embeddings; +using OpenAI.Files; +using OpenAI.FineTuning; +using OpenAI.Images; +using OpenAI.LegacyCompletions; +using OpenAI.Models; +using OpenAI.Moderations; +using OpenAI.VectorStores; + +namespace OpenAI +{ + // Data plane generated client. 
+ public partial class OpenAIClient + { + private const string AuthorizationHeader = "Authorization"; + private readonly ApiKeyCredential _keyCredential; + private const string AuthorizationApiKeyPrefix = "Bearer"; + private readonly ClientPipeline _pipeline; + private readonly Uri _endpoint; + + public virtual ClientPipeline Pipeline => _pipeline; + + protected OpenAIClient() + { + } + } +} diff --git a/.dotnet/src/Generated/OpenAIClientOptions.cs b/.dotnet/src/Generated/OpenAIClientOptions.cs new file mode 100644 index 000000000..fd56113eb --- /dev/null +++ b/.dotnet/src/Generated/OpenAIClientOptions.cs @@ -0,0 +1,12 @@ +// + +#nullable disable + +using System.ClientModel.Primitives; + +namespace OpenAI +{ + public partial class OpenAIClientOptions : ClientPipelineOptions + { + } +} diff --git a/.dotnet/src/Generated/OpenAIModelFactory.cs b/.dotnet/src/Generated/OpenAIModelFactory.cs new file mode 100644 index 000000000..496894c70 --- /dev/null +++ b/.dotnet/src/Generated/OpenAIModelFactory.cs @@ -0,0 +1,196 @@ +// + +#nullable disable + +using System; +using System.Collections.Generic; +using System.Linq; +using OpenAI.Assistants; +using OpenAI.Audio; +using OpenAI.Chat; +using OpenAI.Embeddings; +using OpenAI.Images; +using OpenAI.Moderations; +using OpenAI.VectorStores; + +namespace OpenAI +{ + internal static partial class OpenAIModelFactory + { + public static TranscribedWord TranscribedWord(string word = null, TimeSpan start = default, TimeSpan end = default) + { + return new TranscribedWord(word, start, end, serializedAdditionalRawData: null); + } + + public static TranscribedSegment TranscribedSegment(int id = default, long seekOffset = default, TimeSpan start = default, TimeSpan end = default, string text = null, IEnumerable<int> tokenIds = null, float temperature = default, double averageLogProbability = default, float compressionRatio = default, double noSpeechProbability = default) + { + tokenIds ??= new List<int>(); + + return new TranscribedSegment( + id, + 
seekOffset, + start, + end, + text, + tokenIds?.ToList(), + temperature, + averageLogProbability, + compressionRatio, + noSpeechProbability, + serializedAdditionalRawData: null); + } + + public static ToolChatMessage ToolChatMessage(IEnumerable<ChatMessageContentPart> content = null, string toolCallId = null) + { + content ??= new List<ChatMessageContentPart>(); + + return new ToolChatMessage(ChatMessageRole.Tool, content?.ToList(), serializedAdditionalRawData: null, toolCallId); + } + + public static FunctionChatMessage FunctionChatMessage(IEnumerable<ChatMessageContentPart> content = null, string functionName = null) + { + content ??= new List<ChatMessageContentPart>(); + + return new FunctionChatMessage(ChatMessageRole.Function, content?.ToList(), serializedAdditionalRawData: null, functionName); + } + + public static ChatFunction ChatFunction(string functionDescription = null, string functionName = null, BinaryData functionParameters = null) + { + return new ChatFunction(functionDescription, functionName, functionParameters, serializedAdditionalRawData: null); + } + + public static ChatTokenLogProbabilityInfo ChatTokenLogProbabilityInfo(string token = null, float logProbability = default, IEnumerable<int> utf8ByteValues = null, IEnumerable<ChatTokenTopLogProbabilityInfo> topLogProbabilities = null) + { + utf8ByteValues ??= new List<int>(); + topLogProbabilities ??= new List<ChatTokenTopLogProbabilityInfo>(); + + return new ChatTokenLogProbabilityInfo(token, logProbability, utf8ByteValues?.ToList(), topLogProbabilities?.ToList(), serializedAdditionalRawData: null); + } + + public static ChatTokenTopLogProbabilityInfo ChatTokenTopLogProbabilityInfo(string token = null, float logProbability = default, IEnumerable<int> utf8ByteValues = null) + { + utf8ByteValues ??= new List<int>(); + + return new ChatTokenTopLogProbabilityInfo(token, logProbability, utf8ByteValues?.ToList(), serializedAdditionalRawData: null); + } + + public static ChatTokenUsage ChatTokenUsage(int outputTokens = default, int inputTokens = default, int totalTokens = default) + { + return new ChatTokenUsage(outputTokens, inputTokens, totalTokens, serializedAdditionalRawData: 
null); + } + + public static EmbeddingTokenUsage EmbeddingTokenUsage(int inputTokens = default, int totalTokens = default) + { + return new EmbeddingTokenUsage(inputTokens, totalTokens, serializedAdditionalRawData: null); + } + + public static GeneratedImageCollection GeneratedImageCollection(DateTimeOffset created = default, IEnumerable<GeneratedImage> data = null) + { + data ??= new List<GeneratedImage>(); + + return new GeneratedImageCollection(created, data?.ToList()); + } + + public static GeneratedImage GeneratedImage(BinaryData imageBytes = null, Uri imageUri = null, string revisedPrompt = null) + { + return new GeneratedImage(imageBytes, imageUri, revisedPrompt, serializedAdditionalRawData: null); + } + + public static MessageFailureDetails MessageFailureDetails(MessageFailureReason reason = default) + { + return new MessageFailureDetails(reason, serializedAdditionalRawData: null); + } + + public static ModerationCollection ModerationCollection(string id = null, string model = null, IEnumerable<ModerationResult> results = null) + { + results ??= new List<ModerationResult>(); + + return new ModerationCollection(id, model, results?.ToList()); + } + + public static ModerationResult ModerationResult(bool flagged = default, ModerationCategories categories = null, ModerationCategoryScores categoryScores = null) + { + return new ModerationResult(flagged, categories, categoryScores, serializedAdditionalRawData: null); + } + + public static ModerationCategories ModerationCategories(bool hate = default, bool hateThreatening = default, bool harassment = default, bool harassmentThreatening = default, bool selfHarm = default, bool selfHarmIntent = default, bool selfHarmInstructions = default, bool sexual = default, bool sexualMinors = default, bool violence = default, bool violenceGraphic = default) + { + return new ModerationCategories( + hate, + hateThreatening, + harassment, + harassmentThreatening, + selfHarm, + selfHarmIntent, + selfHarmInstructions, + sexual, + sexualMinors, + violence, + violenceGraphic, + 
serializedAdditionalRawData: null); + } + + public static ModerationCategoryScores ModerationCategoryScores(float hate = default, float hateThreatening = default, float harassment = default, float harassmentThreatening = default, float selfHarm = default, float selfHarmIntent = default, float selfHarmInstructions = default, float sexual = default, float sexualMinors = default, float violence = default, float violenceGraphic = default) + { + return new ModerationCategoryScores( + hate, + hateThreatening, + harassment, + harassmentThreatening, + selfHarm, + selfHarmIntent, + selfHarmInstructions, + sexual, + sexualMinors, + violence, + violenceGraphic, + serializedAdditionalRawData: null); + } + + public static RunError RunError(RunErrorCode code = default, string message = null) + { + return new RunError(code, message, serializedAdditionalRawData: null); + } + + public static RunIncompleteDetails RunIncompleteDetails(RunIncompleteReason? reason = null) + { + return new RunIncompleteDetails(reason, serializedAdditionalRawData: null); + } + + public static RunTokenUsage RunTokenUsage(int completionTokens = default, int promptTokens = default, int totalTokens = default) + { + return new RunTokenUsage(completionTokens, promptTokens, totalTokens, serializedAdditionalRawData: null); + } + + public static RunStepError RunStepError(RunStepErrorCode code = default, string message = null) + { + return new RunStepError(code, message, serializedAdditionalRawData: null); + } + + public static RunStepTokenUsage RunStepTokenUsage(int completionTokens = default, int promptTokens = default, int totalTokens = default) + { + return new RunStepTokenUsage(completionTokens, promptTokens, totalTokens, serializedAdditionalRawData: null); + } + + public static VectorStoreFileCounts VectorStoreFileCounts(int inProgress = default, int completed = default, int failed = default, int cancelled = default, int total = default) + { + return new VectorStoreFileCounts( + inProgress, + completed, + 
failed, + cancelled, + total, + serializedAdditionalRawData: null); + } + + public static VectorStoreFileAssociationError VectorStoreFileAssociationError(VectorStoreFileAssociationErrorCode code = default, string message = null) + { + return new VectorStoreFileAssociationError(code, message, serializedAdditionalRawData: null); + } + + public static StreamingChatFunctionCallUpdate StreamingChatFunctionCallUpdate(string functionArgumentsUpdate = null, string functionName = null) + { + return new StreamingChatFunctionCallUpdate(functionArgumentsUpdate, functionName, serializedAdditionalRawData: null); + } + } +} diff --git a/.dotnet/src/Generated/VectorStoreClient.cs b/.dotnet/src/Generated/VectorStoreClient.cs new file mode 100644 index 000000000..14b48e96d --- /dev/null +++ b/.dotnet/src/Generated/VectorStoreClient.cs @@ -0,0 +1,315 @@ +// + +#nullable disable + +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Threading.Tasks; + +namespace OpenAI.VectorStores +{ + // Data plane generated sub-client. + public partial class VectorStoreClient + { + private const string AuthorizationHeader = "Authorization"; + private readonly ApiKeyCredential _keyCredential; + private const string AuthorizationApiKeyPrefix = "Bearer"; + private readonly ClientPipeline _pipeline; + private readonly Uri _endpoint; + + public virtual ClientPipeline Pipeline => _pipeline; + + protected VectorStoreClient() + { + } + + internal PipelineMessage CreateGetVectorStoresRequest(int? 
limit, string order, string after, string before, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "GET"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/vector_stores", false); + if (limit != null) + { + uri.AppendQuery("limit", limit.Value, true); + } + if (order != null) + { + uri.AppendQuery("order", order, true); + } + if (after != null) + { + uri.AppendQuery("after", after, true); + } + if (before != null) + { + uri.AppendQuery("before", before, true); + } + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + message.Apply(options); + return message; + } + + internal PipelineMessage CreateCreateVectorStoreRequest(BinaryContent content, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "POST"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/vector_stores", false); + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + request.Headers.Set("Content-Type", "application/json"); + request.Content = content; + message.Apply(options); + return message; + } + + internal PipelineMessage CreateGetVectorStoreRequest(string vectorStoreId, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "GET"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/vector_stores/", false); + uri.AppendPath(vectorStoreId, true); + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + message.Apply(options); + return message; + } + + internal PipelineMessage CreateModifyVectorStoreRequest(string vectorStoreId, 
BinaryContent content, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "POST"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/vector_stores/", false); + uri.AppendPath(vectorStoreId, true); + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + request.Headers.Set("Content-Type", "application/json"); + request.Content = content; + message.Apply(options); + return message; + } + + internal PipelineMessage CreateDeleteVectorStoreRequest(string vectorStoreId, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "DELETE"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/vector_stores/", false); + uri.AppendPath(vectorStoreId, true); + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + message.Apply(options); + return message; + } + + internal PipelineMessage CreateGetVectorStoreFilesRequest(string vectorStoreId, int? 
limit, string order, string after, string before, string filter, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "GET"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/vector_stores/", false); + uri.AppendPath(vectorStoreId, true); + uri.AppendPath("/files", false); + if (limit != null) + { + uri.AppendQuery("limit", limit.Value, true); + } + if (order != null) + { + uri.AppendQuery("order", order, true); + } + if (after != null) + { + uri.AppendQuery("after", after, true); + } + if (before != null) + { + uri.AppendQuery("before", before, true); + } + if (filter != null) + { + uri.AppendQuery("filter", filter, true); + } + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + message.Apply(options); + return message; + } + + internal PipelineMessage CreateCreateVectorStoreFileRequest(string vectorStoreId, BinaryContent content, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "POST"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/vector_stores/", false); + uri.AppendPath(vectorStoreId, true); + uri.AppendPath("/files", false); + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + request.Headers.Set("Content-Type", "application/json"); + request.Content = content; + message.Apply(options); + return message; + } + + internal PipelineMessage CreateGetVectorStoreFileRequest(string vectorStoreId, string fileId, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "GET"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + 
uri.AppendPath("/v1/vector_stores/", false); + uri.AppendPath(vectorStoreId, true); + uri.AppendPath("/files/", false); + uri.AppendPath(fileId, true); + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + message.Apply(options); + return message; + } + + internal PipelineMessage CreateDeleteVectorStoreFileRequest(string vectorStoreId, string fileId, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "DELETE"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/vector_stores/", false); + uri.AppendPath(vectorStoreId, true); + uri.AppendPath("/files/", false); + uri.AppendPath(fileId, true); + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + message.Apply(options); + return message; + } + + internal PipelineMessage CreateCreateVectorStoreFileBatchRequest(string vectorStoreId, BinaryContent content, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "POST"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/vector_stores/", false); + uri.AppendPath(vectorStoreId, true); + uri.AppendPath("/file_batches", false); + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + request.Headers.Set("Content-Type", "application/json"); + request.Content = content; + message.Apply(options); + return message; + } + + internal PipelineMessage CreateGetVectorStoreFileBatchRequest(string vectorStoreId, string batchId, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "GET"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + 
uri.AppendPath("/v1/vector_stores/", false); + uri.AppendPath(vectorStoreId, true); + uri.AppendPath("/file_batches/", false); + uri.AppendPath(batchId, true); + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + message.Apply(options); + return message; + } + + internal PipelineMessage CreateCancelVectorStoreFileBatchRequest(string vectorStoreId, string batchId, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "POST"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/vector_stores/", false); + uri.AppendPath(vectorStoreId, true); + uri.AppendPath("/file_batches/", false); + uri.AppendPath(batchId, true); + uri.AppendPath("/cancel", false); + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + message.Apply(options); + return message; + } + + internal PipelineMessage CreateGetFilesInVectorStoreBatchesRequest(string vectorStoreId, string batchId, int? 
limit, string order, string after, string before, string filter, RequestOptions options) + { + var message = _pipeline.CreateMessage(); + message.ResponseClassifier = PipelineMessageClassifier200; + var request = message.Request; + request.Method = "GET"; + var uri = new ClientUriBuilder(); + uri.Reset(_endpoint); + uri.AppendPath("/v1/vector_stores/", false); + uri.AppendPath(vectorStoreId, true); + uri.AppendPath("/file_batches/", false); + uri.AppendPath(batchId, true); + uri.AppendPath("/files", false); + if (limit != null) + { + uri.AppendQuery("limit", limit.Value, true); + } + if (order != null) + { + uri.AppendQuery("order", order, true); + } + if (after != null) + { + uri.AppendQuery("after", after, true); + } + if (before != null) + { + uri.AppendQuery("before", before, true); + } + if (filter != null) + { + uri.AppendQuery("filter", filter, true); + } + request.Uri = uri.ToUri(); + request.Headers.Set("Accept", "application/json"); + message.Apply(options); + return message; + } + + private static PipelineMessageClassifier _pipelineMessageClassifier200; + private static PipelineMessageClassifier PipelineMessageClassifier200 => _pipelineMessageClassifier200 ??= PipelineMessageClassifier.Create(stackalloc ushort[] { 200 }); + } +} diff --git a/.dotnet/src/OpenAI.csproj b/.dotnet/src/OpenAI.csproj new file mode 100644 index 000000000..2a1dd21b7 --- /dev/null +++ b/.dotnet/src/OpenAI.csproj @@ -0,0 +1,77 @@ + + + This is the OpenAI client library for developing .NET applications with rich experience. 
+ SDK Code Generation OpenAI + OpenAI + + 2.0.0 + beta.10 + + netstandard2.0;net6.0 + latest + + + true + + + true + OpenAI.png + README.md + + + true + snupkg + + + true + + true + + + $(NoWarn),1570,1573,1574,1591 + + + $(NoWarn),0618 + + + $(NoWarn),0169 + + Debug;Release;Unsigned + + + + + true + OpenAI.snk + + + + + 0024000004800000940000000602000000240000525341310004000001000100097ad52abbeaa2e1a1982747cc0106534f65cfea6707eaed696a3a63daea80de2512746801a7e47f88e7781e71af960d89ba2e25561f70b0e2dbc93319e0af1961a719ccf5a4d28709b2b57a5d29b7c09dc8d269a490ebe2651c4b6e6738c27c5fb2c02469fe9757f0a3479ac310d6588a50a28d7dd431b907fd325e18b9e8ed + + + + + + + true + + + + + true + + + + + + + + + + + + + diff --git a/.dotnet/src/OpenAI.png b/.dotnet/src/OpenAI.png new file mode 100644 index 000000000..a142a664d Binary files /dev/null and b/.dotnet/src/OpenAI.png differ diff --git a/.dotnet/src/OpenAI.snk b/.dotnet/src/OpenAI.snk new file mode 100644 index 000000000..fa09b4dee Binary files /dev/null and b/.dotnet/src/OpenAI.snk differ diff --git a/.dotnet/src/Utility/AppContextSwitchHelper.cs b/.dotnet/src/Utility/AppContextSwitchHelper.cs new file mode 100644 index 000000000..34d985290 --- /dev/null +++ b/.dotnet/src/Utility/AppContextSwitchHelper.cs @@ -0,0 +1,33 @@ +using System; + +namespace OpenAI; + +internal static class AppContextSwitchHelper +{ + /// + /// Determines if either an AppContext switch or its corresponding Environment Variable is set + /// + /// Name of the AppContext switch. + /// Name of the Environment variable. + /// If the AppContext switch has been set, returns the value of the switch. + /// If the AppContext switch has not been set, returns the value of the environment variable. + /// False if neither is set. + /// + public static bool GetConfigValue(string appContexSwitchName, string environmentVariableName) + { + // First check for the AppContext switch, giving it priority over the environment variable. 
+ if (AppContext.TryGetSwitch(appContexSwitchName, out bool value)) + { + return value; + } + // AppContext switch wasn't used. Check the environment variable. + string envVar = Environment.GetEnvironmentVariable(environmentVariableName); + if (envVar != null && (envVar.Equals("true", StringComparison.OrdinalIgnoreCase) || envVar.Equals("1"))) + { + return true; + } + + // Default to false. + return false; + } +} diff --git a/.dotnet/src/Utility/CustomSerializationHelpers.cs b/.dotnet/src/Utility/CustomSerializationHelpers.cs new file mode 100644 index 000000000..9badd7261 --- /dev/null +++ b/.dotnet/src/Utility/CustomSerializationHelpers.cs @@ -0,0 +1,153 @@ +#nullable enable + +using System; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Text.Json; + +namespace OpenAI; + +internal static partial class CustomSerializationHelpers +{ + internal static TOutput DeserializeNewInstance( + UInstanceInput existingInstance, + Func deserializationFunc, + ref Utf8JsonReader reader, + ModelReaderWriterOptions options) + where UInstanceInput : IJsonModel + { + options ??= new("W"); + var format = options.Format == "W" ? ((IJsonModel)existingInstance).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(UInstanceInput)} does not support '{format}' format."); + } + + using JsonDocument document = JsonDocument.ParseValue(ref reader); + return deserializationFunc.Invoke(document.RootElement, options); + } + + internal static TOutput DeserializeNewInstance( + UInstanceInput existingInstance, + Func deserializationFunc, + BinaryData data, + ModelReaderWriterOptions options) + where UInstanceInput : IPersistableModel + { + options ??= new("W"); + var format = options.Format == "W" ? 
((IPersistableModel)existingInstance).GetFormatFromOptions(options) : options.Format; + + switch (format) + { + case "J": + { + using JsonDocument document = JsonDocument.Parse(data); + return deserializationFunc.Invoke(document.RootElement, options)!; + } + default: + throw new FormatException($"The model {nameof(UInstanceInput)} does not support '{format}' format."); + } + } + + internal static void SerializeInstance( + UInstanceInput instance, + Action serializationFunc, + Utf8JsonWriter writer, + ModelReaderWriterOptions options) + where UInstanceInput : IJsonModel + { + options ??= new ModelReaderWriterOptions("W"); + AssertSupportedJsonWriteFormat(instance, options); + serializationFunc.Invoke(instance, writer, options); + } + + internal static void SerializeInstance( + T instance, + Action serializationFunc, + Utf8JsonWriter writer, + ModelReaderWriterOptions options) + where T : IJsonModel + => SerializeInstance(instance, serializationFunc, writer, options); + + internal static BinaryData SerializeInstance( + UInstanceInput instance, + ModelReaderWriterOptions options) + where UInstanceInput : IPersistableModel + { + options ??= new("W"); + AssertSupportedPersistableWriteFormat(instance, options); + return ModelReaderWriter.Write(instance, options); + } + + internal static BinaryData SerializeInstance(T instance, ModelReaderWriterOptions options) + where T : IPersistableModel + => SerializeInstance(instance, options); + + internal static void AssertSupportedJsonWriteFormat(T instance, ModelReaderWriterOptions options) + where T : IJsonModel + => AssertSupportedJsonWriteFormat(instance, options); + + internal static void AssertSupportedJsonWriteFormat(UInstanceInput instance, ModelReaderWriterOptions options) + where UInstanceInput : IJsonModel + { + var format = options.Format == "W" ? 
((IJsonModel)instance).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(UInstanceInput)} does not support '{format}' format."); + } + } + + internal static void AssertSupportedPersistableWriteFormat(T instance, ModelReaderWriterOptions options) + where T : IPersistableModel + => AssertSupportedPersistableWriteFormat(instance, options); + + internal static void AssertSupportedPersistableWriteFormat(UInstanceInput instance, ModelReaderWriterOptions options) + where UInstanceInput : IPersistableModel + { + var format = options.Format == "W" ? ((IPersistableModel)instance).GetFormatFromOptions(options) : options.Format; + if (format != "J") + { + throw new FormatException($"The model {nameof(UInstanceInput)} does not support '{format}' format."); + } + } + + internal static void WriteSerializedAdditionalRawData(this Utf8JsonWriter writer, IDictionary dictionary, ModelReaderWriterOptions options) + { + if (dictionary != null) + { + foreach (var item in dictionary) + { + writer.WritePropertyName(item.Key); +#if NET6_0_OR_GREATER + writer.WriteRawValue(item.Value); +#else + using JsonDocument document = JsonDocument.Parse(item.Value); + JsonSerializer.Serialize(writer, document.RootElement); +#endif + } + } + } + + internal static void WriteOptionalProperty(this Utf8JsonWriter writer, ReadOnlySpan name, T value, ModelReaderWriterOptions options) + { + if (Optional.IsDefined(value)) + { + writer.WritePropertyName(name); + writer.WriteObjectValue(value, options); + } + } + + internal static void WriteOptionalCollection(this Utf8JsonWriter writer, ReadOnlySpan name, IEnumerable values, ModelReaderWriterOptions options) + { + if (Optional.IsCollectionDefined(values)) + { + writer.WritePropertyName(name); + writer.WriteStartArray(); + foreach (T item in values) + { + writer.WriteObjectValue(item, options); + } + writer.WriteEndArray(); + } + } +} \ No newline at end of file diff --git
a/.dotnet/src/Utility/Generator/CodeGenClientAttribute.cs b/.dotnet/src/Utility/Generator/CodeGenClientAttribute.cs new file mode 100644 index 000000000..e46edc05f --- /dev/null +++ b/.dotnet/src/Utility/Generator/CodeGenClientAttribute.cs @@ -0,0 +1,15 @@ +#nullable enable + +using System; + +namespace OpenAI; + +[AttributeUsage(AttributeTargets.Class)] +internal sealed class CodeGenClientAttribute : CodeGenTypeAttribute +{ + public Type? ParentClient { get; set; } + + public CodeGenClientAttribute(string originalName) : base(originalName) + { + } +} \ No newline at end of file diff --git a/.dotnet/src/Utility/Generator/CodeGenMemberAttribute.cs b/.dotnet/src/Utility/Generator/CodeGenMemberAttribute.cs new file mode 100644 index 000000000..00bee60c4 --- /dev/null +++ b/.dotnet/src/Utility/Generator/CodeGenMemberAttribute.cs @@ -0,0 +1,17 @@ +#nullable enable + +using System; + +namespace OpenAI; + +[AttributeUsage(AttributeTargets.Property | AttributeTargets.Field)] +internal sealed class CodeGenMemberAttribute : CodeGenTypeAttribute +{ + public CodeGenMemberAttribute() : base(null) + { + } + + public CodeGenMemberAttribute(string originalName) : base(originalName) + { + } +} \ No newline at end of file diff --git a/.dotnet/src/Utility/Generator/CodeGenModelAttribute.cs b/.dotnet/src/Utility/Generator/CodeGenModelAttribute.cs new file mode 100644 index 000000000..a015f0b6a --- /dev/null +++ b/.dotnet/src/Utility/Generator/CodeGenModelAttribute.cs @@ -0,0 +1,27 @@ +#nullable enable + +using System; + +namespace OpenAI; + +[AttributeUsage(AttributeTargets.Class | AttributeTargets.Enum | AttributeTargets.Struct)] +internal sealed class CodeGenModelAttribute : CodeGenTypeAttribute +{ + /// + /// Gets or sets a comma-separated list of additional model usage modes. Allowed values: model, error, input, output. + /// + public string[]? Usage { get; set; } + + /// + /// Gets or sets a comma-separated list of additional model serialization formats. + /// + public string[]?
Formats { get; set; } + + public CodeGenModelAttribute() : base(null) + { + } + + public CodeGenModelAttribute(string originalName) : base(originalName) + { + } +} \ No newline at end of file diff --git a/.dotnet/src/Utility/Generator/CodeGenSerializationAttribute.cs b/.dotnet/src/Utility/Generator/CodeGenSerializationAttribute.cs new file mode 100644 index 000000000..a9693b571 --- /dev/null +++ b/.dotnet/src/Utility/Generator/CodeGenSerializationAttribute.cs @@ -0,0 +1,53 @@ +#nullable enable + +using System; + +namespace OpenAI; + +[AttributeUsage(AttributeTargets.Class | AttributeTargets.Struct, AllowMultiple = true, Inherited = true)] +internal sealed class CodeGenSerializationAttribute : Attribute +{ + /// + /// Gets or sets the property name which these hooks should apply to + /// + public string? PropertyName { get; set; } + /// + /// Gets or sets the serialization path of the property in the JSON + /// + public string[]? SerializationPath { get; } + /// + /// Gets or sets the method name to use when serializing the property value (property name excluded) + /// The signature of the serialization hook method must be or compatible with when invoking: + /// private void SerializeHook(Utf8JsonWriter writer); + /// + public string? SerializationValueHook { get; set; } + /// + /// Gets or sets the method name to use when deserializing the property value from the JSON + /// private static void DeserializationHook(JsonProperty property, ref TypeOfTheProperty propertyValue); // if the property is required + /// private static void DeserializationHook(JsonProperty property, ref Optional<TypeOfTheProperty> propertyValue); // if the property is optional + /// + public string? 
DeserializationValueHook { get; set; } + /// + /// Gets or sets the method name to use when serializing the property value (property name excluded) + /// The signature of the serialization hook method must be or compatible with when invoking: + /// private void SerializeHook(StringBuilder builder); + /// + public string? BicepSerializationValueHook { get; set; } + + public CodeGenSerializationAttribute(string propertyName) + { + PropertyName = propertyName; + } + + public CodeGenSerializationAttribute(string propertyName, string serializationName) + { + PropertyName = propertyName; + SerializationPath = new[] { serializationName }; + } + + public CodeGenSerializationAttribute(string propertyName, string[] serializationPath) + { + PropertyName = propertyName; + SerializationPath = serializationPath; + } +} \ No newline at end of file diff --git a/.dotnet/src/Utility/Generator/CodeGenSuppressAttribute.cs b/.dotnet/src/Utility/Generator/CodeGenSuppressAttribute.cs new file mode 100644 index 000000000..8852cbc89 --- /dev/null +++ b/.dotnet/src/Utility/Generator/CodeGenSuppressAttribute.cs @@ -0,0 +1,16 @@ +using System; + +namespace OpenAI; + +[AttributeUsage(AttributeTargets.Class | AttributeTargets.Enum | AttributeTargets.Struct, AllowMultiple = true)] +internal sealed class CodeGenSuppressAttribute : Attribute +{ + public string Member { get; } + public Type[] Parameters { get; } + + public CodeGenSuppressAttribute(string member, params Type[] parameters) + { + Member = member; + Parameters = parameters; + } +} \ No newline at end of file diff --git a/.dotnet/src/Utility/Generator/CodeGenTypeAttribute.cs b/.dotnet/src/Utility/Generator/CodeGenTypeAttribute.cs new file mode 100644 index 000000000..65aee7afc --- /dev/null +++ b/.dotnet/src/Utility/Generator/CodeGenTypeAttribute.cs @@ -0,0 +1,16 @@ +#nullable enable + +using System; + +namespace OpenAI; + +[AttributeUsage(AttributeTargets.Class)] +internal class CodeGenTypeAttribute : Attribute +{ + public string? 
OriginalName { get; } + + public CodeGenTypeAttribute(string? originalName) + { + OriginalName = originalName; + } +} \ No newline at end of file diff --git a/.dotnet/src/Utility/GenericActionPipelinePolicy.cs b/.dotnet/src/Utility/GenericActionPipelinePolicy.cs new file mode 100644 index 000000000..5b7212f86 --- /dev/null +++ b/.dotnet/src/Utility/GenericActionPipelinePolicy.cs @@ -0,0 +1,34 @@ +using System; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Threading.Tasks; + +namespace OpenAI; + +internal partial class GenericActionPipelinePolicy : PipelinePolicy +{ + private Action _processMessageAction; + + public GenericActionPipelinePolicy(Action processMessageAction) + { + _processMessageAction = processMessageAction; + } + + public override void Process(PipelineMessage message, IReadOnlyList pipeline, int currentIndex) + { + _processMessageAction(message); + if (currentIndex < pipeline.Count - 1) + { + pipeline[currentIndex + 1].Process(message, pipeline, currentIndex + 1); + } + } + + public override async ValueTask ProcessAsync(PipelineMessage message, IReadOnlyList pipeline, int currentIndex) + { + _processMessageAction(message); + if (currentIndex < pipeline.Count - 1) + { + await pipeline[currentIndex + 1].ProcessAsync(message, pipeline, currentIndex + 1); + } + } +} \ No newline at end of file diff --git a/.dotnet/src/Utility/MultipartFormDataBinaryContent.cs b/.dotnet/src/Utility/MultipartFormDataBinaryContent.cs new file mode 100644 index 000000000..a1d04afdc --- /dev/null +++ b/.dotnet/src/Utility/MultipartFormDataBinaryContent.cs @@ -0,0 +1,170 @@ +using System; +using System.ClientModel; +using System.Diagnostics; +using System.Globalization; +using System.IO; +using System.Net.Http; +using System.Net.Http.Headers; +using System.Threading; +using System.Threading.Tasks; + +namespace OpenAI; + +internal class MultipartFormDataBinaryContent : BinaryContent +{ + private readonly MultipartFormDataContent 
_multipartContent; + + private const int BoundaryLength = 70; + private const string BoundaryValues = "0123456789=ABCDEFGHIJKLMNOPQRSTUVWXYZ_abcdefghijklmnopqrstuvwxyz"; + + public MultipartFormDataBinaryContent() + { + _multipartContent = new MultipartFormDataContent(CreateBoundary()); + } + + public string ContentType + { + get + { + Debug.Assert(_multipartContent.Headers.ContentType is not null); + + return _multipartContent.Headers.ContentType!.ToString(); + } + } + + internal HttpContent HttpContent => _multipartContent; + + public void Add(Stream stream, string name, string fileName = default, string contentType = null) + { + Argument.AssertNotNull(stream, nameof(stream)); + + StreamContent content = new(stream); + if (contentType is not null) + { + content.Headers.ContentType = MediaTypeHeaderValue.Parse(contentType); + } + Add(content, name, fileName); + } + + public void Add(string content, string name, string fileName = default) + { + Add(new StringContent(content), name, fileName); + } + + public void Add(int content, string name, string fileName = default) + { + // https://learn.microsoft.com/en-us/dotnet/standard/base-types/standard-numeric-format-strings#GFormatString + string value = content.ToString("G", CultureInfo.InvariantCulture); + Add(new StringContent(value), name, fileName); + } + + public void Add(double content, string name, string fileName = default) + { + // https://learn.microsoft.com/en-us/dotnet/standard/base-types/standard-numeric-format-strings#GFormatString + string value = content.ToString("G", CultureInfo.InvariantCulture); + Add(new StringContent(value), name, fileName); + } + + public void Add(byte[] content, string name, string fileName = default) + { + Add(new ByteArrayContent(content), name, fileName); + } + + public void Add(BinaryData content, string name, string fileName = default) + { + Add(new ByteArrayContent(content.ToArray()), name, fileName); + } + + private void Add(HttpContent content, string name, string 
fileName) + { + Argument.AssertNotNull(content, nameof(content)); + Argument.AssertNotNull(name, nameof(name)); + + if (fileName is not null) + { + _multipartContent.Add(content, name, fileName); + } + else + { + _multipartContent.Add(content, name); + } + } + +#if NET6_0_OR_GREATER + private static string CreateBoundary() => + string.Create(BoundaryLength, 0, (chars, _) => + { + Span random = stackalloc byte[BoundaryLength]; + Random.Shared.NextBytes(random); + + for (int i = 0; i < chars.Length; i++) + { + chars[i] = BoundaryValues[random[i] % BoundaryValues.Length]; + } + }); +#else + private static readonly Random _random = new(); + + private static string CreateBoundary() + { + Span chars = stackalloc char[BoundaryLength]; + + byte[] random = new byte[BoundaryLength]; + lock (_random) + { + _random.NextBytes(random); + } + + // Instead of `% BoundaryValues.Length` as is used above, use a mask to achieve the same result. + // `% BoundaryValues.Length` is optimized to the equivalent on .NET Core but not on .NET Framework. + const int Mask = 255 >> 2; + Debug.Assert(BoundaryValues.Length - 1 == Mask); + + for (int i = 0; i < chars.Length; i++) + { + chars[i] = BoundaryValues[random[i] & Mask]; + } + + return chars.ToString(); + } +#endif + + public override bool TryComputeLength(out long length) + { + // We can't call the protected method on HttpContent + + if (_multipartContent.Headers.ContentLength is long contentLength) + { + length = contentLength; + return true; + } + + length = 0; + return false; + } + + public override void WriteTo(Stream stream, CancellationToken cancellationToken = default) + { +#if NET5_0_OR_GREATER + _multipartContent.CopyTo(stream, default, cancellationToken); +#else + // TODO: polyfill sync-over-async for netstandard2.0 for Azure clients. 
+ // Tracked by https://github.com/Azure/azure-sdk-for-net/issues/42674 + _multipartContent.CopyToAsync(stream).GetAwaiter().GetResult(); +#endif + } + + public override async Task WriteToAsync(Stream stream, CancellationToken cancellationToken = default) + { +#if NET5_0_OR_GREATER + await _multipartContent.CopyToAsync(stream, cancellationToken).ConfigureAwait(false); +#else + await _multipartContent.CopyToAsync(stream).ConfigureAwait(false); +#endif + } + + public override void Dispose() + { + _multipartContent.Dispose(); + } +} diff --git a/.dotnet/src/Utility/PageCollectionHelpers.cs b/.dotnet/src/Utility/PageCollectionHelpers.cs new file mode 100644 index 000000000..b5be39a6e --- /dev/null +++ b/.dotnet/src/Utility/PageCollectionHelpers.cs @@ -0,0 +1,65 @@ +using System.ClientModel; +using System.Collections.Generic; +using System.Threading; +using System.Threading.Tasks; + +#nullable enable + +namespace OpenAI; + +internal class PageCollectionHelpers +{ + public static PageCollection Create(PageEnumerator enumerator) + => new EnumeratorPageCollection(enumerator); + + public static AsyncPageCollection CreateAsync(PageEnumerator enumerator) + => new AsyncEnumeratorPageCollection(enumerator); + + public static IEnumerable Create(PageResultEnumerator enumerator) + { + while (enumerator.MoveNext()) + { + yield return enumerator.Current; + } + } + + public static async IAsyncEnumerable CreateAsync(PageResultEnumerator enumerator) + { + while (await enumerator.MoveNextAsync().ConfigureAwait(false)) + { + yield return enumerator.Current; + } + } + + private class EnumeratorPageCollection : PageCollection + { + private readonly PageEnumerator _enumerator; + + public EnumeratorPageCollection(PageEnumerator enumerator) + { + _enumerator = enumerator; + } + + protected override PageResult GetCurrentPageCore() + => _enumerator.GetCurrentPage(); + + protected override IEnumerator> GetEnumeratorCore() + => _enumerator; + } + + private class AsyncEnumeratorPageCollection : 
AsyncPageCollection + { + private readonly PageEnumerator _enumerator; + + public AsyncEnumeratorPageCollection(PageEnumerator enumerator) + { + _enumerator = enumerator; + } + + protected override async Task> GetCurrentPageAsyncCore() + => await _enumerator.GetCurrentPageAsync().ConfigureAwait(false); + + protected override IAsyncEnumerator> GetAsyncEnumeratorCore(CancellationToken cancellationToken = default) + => _enumerator; + } +} diff --git a/.dotnet/src/Utility/PageEnumerator.cs b/.dotnet/src/Utility/PageEnumerator.cs new file mode 100644 index 000000000..6bf35810e --- /dev/null +++ b/.dotnet/src/Utility/PageEnumerator.cs @@ -0,0 +1,60 @@ +using System.ClientModel; +using System.Collections.Generic; +using System.Threading.Tasks; + +#nullable enable + +namespace OpenAI; + +internal abstract class PageEnumerator : PageResultEnumerator, + IAsyncEnumerator>, + IEnumerator> +{ + public abstract PageResult GetPageFromResult(ClientResult result); + + public PageResult GetCurrentPage() + { + if (Current is null) + { + return GetPageFromResult(GetFirst()); + } + + return ((IEnumerator>)this).Current; + } + + public async Task> GetCurrentPageAsync() + { + if (Current is null) + { + return GetPageFromResult(await GetFirstAsync().ConfigureAwait(false)); + } + + return ((IEnumerator>)this).Current; + } + + PageResult IEnumerator>.Current + { + get + { + if (Current is null) + { + return default!; + } + + return GetPageFromResult(Current); + } + } + + PageResult IAsyncEnumerator>.Current + { + get + { + if (Current is null) + { + return default!; + } + + return GetPageFromResult(Current); + } + } +} diff --git a/.dotnet/src/Utility/PageResultEnumerator.cs b/.dotnet/src/Utility/PageResultEnumerator.cs new file mode 100644 index 000000000..27629b171 --- /dev/null +++ b/.dotnet/src/Utility/PageResultEnumerator.cs @@ -0,0 +1,75 @@ +using System; +using System.ClientModel; +using System.Collections; +using System.Collections.Generic; +using System.Threading.Tasks; + 
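`PageCollectionHelpers` above adapts a `PageEnumerator` into the `PageCollection` / `AsyncPageCollection` shapes that client methods return. Per the changelog, callers enumerate the collection of pages, or call `GetAllValues` to flatten pages into individual items. A hypothetical consumption sketch — the client method name and `FineTuningJob` model are illustrative assumptions only:

```csharp
// Hypothetical consumption sketch; client method and model names are illustrative.
PageCollection<FineTuningJob> pages = client.GetJobs();

// Enumerate page by page, where each page corresponds to one service response...
foreach (PageResult<FineTuningJob> page in pages)
{
    Console.WriteLine($"Got {page.Values.Count} jobs in this page");
}

// ...or flatten to individual items via GetAllValues, as described in the changelog.
foreach (FineTuningJob job in pages.GetAllValues())
{
    Console.WriteLine(job.Id);
}
```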
+#nullable enable + +namespace OpenAI; + +internal abstract class PageResultEnumerator : IAsyncEnumerator, IEnumerator +{ + private ClientResult? _current; + private bool _hasNext = true; + + public ClientResult Current => _current!; + + public abstract Task GetFirstAsync(); + + public abstract ClientResult GetFirst(); + + public abstract Task GetNextAsync(ClientResult result); + + public abstract ClientResult GetNext(ClientResult result); + + public abstract bool HasNext(ClientResult result); + + object IEnumerator.Current => ((IEnumerator)this).Current; + + public bool MoveNext() + { + if (!_hasNext) + { + return false; + } + + if (_current == null) + { + _current = GetFirst(); + } + else + { + _current = GetNext(_current); + } + + _hasNext = HasNext(_current); + return true; + } + + void IEnumerator.Reset() => _current = null; + + void IDisposable.Dispose() { } + + public async ValueTask MoveNextAsync() + { + if (!_hasNext) + { + return false; + } + + if (_current == null) + { + _current = await GetFirstAsync().ConfigureAwait(false); + } + else + { + _current = await GetNextAsync(_current).ConfigureAwait(false); + } + + _hasNext = HasNext(_current); + return true; + } + + ValueTask IAsyncDisposable.DisposeAsync() => default; +} \ No newline at end of file diff --git a/.dotnet/src/Utility/Polyfill/System.Diagnostics.CodeAnalysis.ExperimentalAttribute.cs b/.dotnet/src/Utility/Polyfill/System.Diagnostics.CodeAnalysis.ExperimentalAttribute.cs new file mode 100644 index 000000000..45eed223b --- /dev/null +++ b/.dotnet/src/Utility/Polyfill/System.Diagnostics.CodeAnalysis.ExperimentalAttribute.cs @@ -0,0 +1,60 @@ +#if !NET8_0_OR_GREATER +#nullable enable + +// Licensed to the .NET Foundation under one or more agreements. +// The .NET Foundation licenses this file to you under the MIT license. + +namespace System.Diagnostics.CodeAnalysis +{ + /// + /// Indicates that an API is experimental and it may change in the future. 
+ /// + /// + /// This attribute allows call sites to be flagged with a diagnostic that indicates that an experimental + /// feature is used. Authors can use this attribute to ship preview features in their assemblies. + /// + [AttributeUsage(AttributeTargets.Assembly | + AttributeTargets.Module | + AttributeTargets.Class | + AttributeTargets.Struct | + AttributeTargets.Enum | + AttributeTargets.Constructor | + AttributeTargets.Method | + AttributeTargets.Property | + AttributeTargets.Field | + AttributeTargets.Event | + AttributeTargets.Interface | + AttributeTargets.Delegate, Inherited = false)] + internal sealed class ExperimentalAttribute : Attribute + { + /// + /// Initializes a new instance of the class, specifying the ID that the compiler will use + /// when reporting a use of the API the attribute applies to. + /// + /// The ID that the compiler will use when reporting a use of the API the attribute applies to. + public ExperimentalAttribute(string diagnosticId) + { + DiagnosticId = diagnosticId; + } + + /// + /// Gets the ID that the compiler will use when reporting a use of the API the attribute applies to. + /// + /// The unique diagnostic ID. + /// + /// The diagnostic ID is shown in build output for warnings and errors. + /// This property represents the unique ID that can be used to suppress the warnings or errors, if needed. + /// + public string DiagnosticId { get; } + + /// + /// Gets or sets the URL for corresponding documentation. + /// The API accepts a format string instead of an actual URL, creating a generic URL that includes the diagnostic ID. + /// + /// The format string that represents a URL to corresponding documentation. + /// An example format string is https://contoso.com/obsoletion-warnings/{0}. + public string? 
UrlFormat { get; set; } + } +} + +#endif // !NET8_0_OR_GREATER \ No newline at end of file diff --git a/.dotnet/src/Utility/Polyfill/System.Diagnostics.CodeAnalysis.SetsRequiredMembersAttribute.cs b/.dotnet/src/Utility/Polyfill/System.Diagnostics.CodeAnalysis.SetsRequiredMembersAttribute.cs new file mode 100644 index 000000000..26cd00bce --- /dev/null +++ b/.dotnet/src/Utility/Polyfill/System.Diagnostics.CodeAnalysis.SetsRequiredMembersAttribute.cs @@ -0,0 +1,8 @@ +#if !NET7_0_OR_GREATER + +namespace System.Diagnostics.CodeAnalysis; + +[AttributeUsage(AttributeTargets.Constructor, AllowMultiple = false, Inherited = false)] +internal sealed class SetsRequiredMembersAttribute : Attribute { } + +#endif // !NET7_0_OR_GREATER diff --git a/.dotnet/src/Utility/Polyfill/System.Runtime.CompilerServices.CompilerFeatureRequiredAttribute.cs b/.dotnet/src/Utility/Polyfill/System.Runtime.CompilerServices.CompilerFeatureRequiredAttribute.cs new file mode 100644 index 000000000..b4aa9161f --- /dev/null +++ b/.dotnet/src/Utility/Polyfill/System.Runtime.CompilerServices.CompilerFeatureRequiredAttribute.cs @@ -0,0 +1,15 @@ +#if !NET7_0_OR_GREATER + +namespace System.Runtime.CompilerServices; + +[AttributeUsage(AttributeTargets.All, AllowMultiple = true, Inherited = false)] +internal sealed class CompilerFeatureRequiredAttribute(string featureName) : Attribute +{ + public string FeatureName { get; } = featureName; + public bool IsOptional { get; set; } + + public const string RefStructs = nameof(RefStructs); + public const string RequiredMembers = nameof(RequiredMembers); +} + +#endif // !NET7_0_OR_GREATER diff --git a/.dotnet/src/Utility/Polyfill/System.Runtime.CompilerServices.IsExternalInit.cs b/.dotnet/src/Utility/Polyfill/System.Runtime.CompilerServices.IsExternalInit.cs new file mode 100644 index 000000000..f4b6d744f --- /dev/null +++ b/.dotnet/src/Utility/Polyfill/System.Runtime.CompilerServices.IsExternalInit.cs @@ -0,0 +1,9 @@ +#if !NET5_0_OR_GREATER + +using
System.ComponentModel; +namespace System.Runtime.CompilerServices; + +[EditorBrowsable(EditorBrowsableState.Never)] +internal static class IsExternalInit { } + +#endif // !NET5_0_OR_GREATER diff --git a/.dotnet/src/Utility/Polyfill/System.Runtime.CompilerServices.RequiredMemberAttribute.cs b/.dotnet/src/Utility/Polyfill/System.Runtime.CompilerServices.RequiredMemberAttribute.cs new file mode 100644 index 000000000..216d76910 --- /dev/null +++ b/.dotnet/src/Utility/Polyfill/System.Runtime.CompilerServices.RequiredMemberAttribute.cs @@ -0,0 +1,8 @@ +#if !NET7_0_OR_GREATER + +namespace System.Runtime.CompilerServices; + +[AttributeUsage(AttributeTargets.Class | AttributeTargets.Struct | AttributeTargets.Field | AttributeTargets.Property, AllowMultiple = false, Inherited = false)] +internal sealed class RequiredMemberAttribute : Attribute { } + +#endif // !NET7_0_OR_GREATER diff --git a/.dotnet/src/Utility/System.Net.ServerSentEvents.cs b/.dotnet/src/Utility/System.Net.ServerSentEvents.cs new file mode 100644 index 000000000..ea0b4f818 --- /dev/null +++ b/.dotnet/src/Utility/System.Net.ServerSentEvents.cs @@ -0,0 +1,623 @@ +// Licensed to the .NET Foundation under one or more agreements. +// The .NET Foundation licenses this file to you under the MIT license. + +// This file contains a source copy of: +// https://github.com/dotnet/runtime/tree/2bd15868f12ace7cee9999af61d5c130b2603f04/src/libraries/System.Net.ServerSentEvents/src/System/Net/ServerSentEvents +// Once the System.Net.ServerSentEvents package is available, this file should be removed and replaced with a package reference. +// +// The only changes made to this code from the original are: +// - Enabled nullable reference types at file scope, and use a few null suppression operators to work around the lack of [NotNull] +// - Put into a single file for ease of management (it should not be edited in this repo). +// - Changed public types to be internal. 
+// - Removed a use of a [NotNull] attribute to assist in netstandard2.0 compilation.
+// - Replaced a reference to a .resx string with an inline constant.
+
+#nullable enable
+
+using System.Buffers;
+using System.Collections.Generic;
+using System.Diagnostics;
+using System.Globalization;
+using System.IO;
+using System.Runtime.CompilerServices;
+using System.Text;
+using System.Threading.Tasks;
+using System.Threading;
+
+namespace System.Net.ServerSentEvents
+{
+    /// <summary>Represents a server-sent event.</summary>
+    /// <typeparam name="T">Specifies the type of data payload in the event.</typeparam>
+    internal readonly struct SseItem<T>
+    {
+        /// <summary>Initializes the server-sent event.</summary>
+        /// <param name="data">The event's payload.</param>
+        /// <param name="eventType">The event's type.</param>
+        public SseItem(T data, string eventType)
+        {
+            Data = data;
+            EventType = eventType;
+        }
+
+        /// <summary>Gets the event's payload.</summary>
+        public T Data { get; }
+
+        /// <summary>Gets the event's type.</summary>
+        public string EventType { get; }
+    }
+
+    /// <summary>Encapsulates a method for parsing the bytes payload of a server-sent event.</summary>
+    /// <typeparam name="T">Specifies the type of the return value of the parser.</typeparam>
+    /// <param name="eventType">The event's type.</param>
+    /// <param name="data">The event's payload bytes.</param>
+    /// <returns>The parsed <typeparamref name="T"/>.</returns>
+    internal delegate T SseItemParser<out T>(string eventType, ReadOnlySpan<byte> data);
+
+    /// <summary>Provides a parser for parsing server-sent events.</summary>
+    internal static class SseParser
+    {
+        /// <summary>The default <see cref="SseItem{T}.EventType"/> ("message") for an event that did not explicitly specify a type.</summary>
+        public const string EventTypeDefault = "message";
+
+        /// <summary>Creates a parser for parsing a <see cref="Stream"/> of server-sent events into a sequence of <see cref="string"/> values.</summary>
+        /// <param name="sseStream">The stream containing the data to parse.</param>
+        /// <returns>
+        /// The enumerable of strings, which may be enumerated synchronously or asynchronously. The strings
+        /// are decoded from the UTF8-encoded bytes of the payload of each event.
+        /// </returns>
+        /// <exception cref="ArgumentNullException"><paramref name="sseStream"/> is null.</exception>
+        /// This overload has behavior equivalent to calling <see cref="Create{T}(Stream, SseItemParser{T})"/> with a delegate
+        /// that decodes the data of each event using <see cref="Encoding.UTF8"/>'s GetString method.
+ /// + public static SseParser Create(Stream sseStream) => + Create(sseStream, static (_, bytes) => Utf8GetString(bytes)); + + /// Creates a parser for parsing a of server-sent events into a sequence of values. + /// Specifies the type of data in each event. + /// The stream containing the data to parse. + /// The parser to use to transform each payload of bytes into a data element. + /// The enumerable, which may be enumerated synchronously or asynchronously. + /// is null. + /// is null. + public static SseParser Create(Stream sseStream, SseItemParser itemParser) => + new SseParser( + sseStream ?? throw new ArgumentNullException(nameof(sseStream)), + itemParser ?? throw new ArgumentNullException(nameof(itemParser))); + + /// Encoding.UTF8.GetString(bytes) + internal static string Utf8GetString(ReadOnlySpan bytes) + { +#if NET + return Encoding.UTF8.GetString(bytes); +#else + unsafe + { + fixed (byte* ptr = bytes) + { + return ptr is null ? + string.Empty : + Encoding.UTF8.GetString(ptr, bytes.Length); + } + } +#endif + } + } + + /// Provides a parser for server-sent events information. + /// Specifies the type of data parsed from an event. + internal sealed class SseParser + { + // For reference: + // Specification: https://html.spec.whatwg.org/multipage/server-sent-events.html#server-sent-events + + /// Carriage Return. + private const byte CR = (byte)'\r'; + /// Line Feed. + private const byte LF = (byte)'\n'; + /// Carriage Return Line Feed. + private static ReadOnlySpan CRLF => "\r\n"u8; + + /// The default size of an ArrayPool buffer to rent. + /// Larger size used by default to minimize number of reads. Smaller size used in debug to stress growth/shifting logic. + private const int DefaultArrayPoolRentSize = +#if DEBUG + 16; +#else + 1024; +#endif + + /// The stream to be parsed. + private readonly Stream _stream; + /// The parser delegate used to transform bytes into a . 
+ private readonly SseItemParser _itemParser; + + /// Indicates whether the enumerable has already been used for enumeration. + private int _used; + + /// Buffer, either empty or rented, containing the data being read from the stream while looking for the next line. + private byte[] _lineBuffer = []; + /// The starting offset of valid data in . + private int _lineOffset; + /// The length of valid data in , starting from . + private int _lineLength; + /// The index in where a newline ('\r', '\n', or "\r\n") was found. + private int _newlineIndex; + /// The index in of characters already checked for newlines. + /// + /// This is to avoid O(LineLength^2) behavior in the rare case where we have long lines that are built-up over multiple reads. + /// We want to avoid re-checking the same characters we've already checked over and over again. + /// + private int _lastSearchedForNewline; + /// Set when eof has been reached in the stream. + private bool _eof; + + /// Rented buffer containing buffered data for the next event. + private byte[]? _dataBuffer; + /// The length of valid data in , starting from index 0. + private int _dataLength; + /// Whether data has been appended to . + /// This can be different than != 0 if empty data was appended. + private bool _dataAppended; + + /// The event type for the next event. + private string _eventType = SseParser.EventTypeDefault; + + /// Initialize the enumerable. + /// The stream to parse. + /// The function to use to parse payload bytes into a . + internal SseParser(Stream stream, SseItemParser itemParser) + { + _stream = stream; + _itemParser = itemParser; + } + + /// Gets an enumerable of the server-sent events from this parser. + /// The parser has already been enumerated. Such an exception may propagate out of a call to . + public IEnumerable> Enumerate() + { + // Validate that the parser is only used for one enumeration. + ThrowIfNotFirstEnumeration(); + + // Rent a line buffer. This will grow as needed. 
The line buffer is what's passed to the stream, + // so we want it to be large enough to reduce the number of reads we need to do when data is + // arriving quickly. (In debug, we use a smaller buffer to stress the growth and shifting logic.) + _lineBuffer = ArrayPool.Shared.Rent(DefaultArrayPoolRentSize); + try + { + // Spec: "Event streams in this format must always be encoded as UTF-8". + // Skip a UTF8 BOM if it exists at the beginning of the stream. (The BOM is defined as optional in the SSE grammar.) + while (FillLineBuffer() != 0 && _lineLength < Utf8Bom.Length) ; + SkipBomIfPresent(); + + // Process all events in the stream. + while (true) + { + // See if there's a complete line in data already read from the stream. Lines are permitted to + // end with CR, LF, or CRLF. Look for all of them and if we find one, process the line. However, + // if we only find a CR and it's at the end of the read data, don't process it now, as we want + // to process it together with an LF that might immediately follow, rather than treating them + // as two separate characters, in which case we'd incorrectly process the CR as a line by itself. + GetNextSearchOffsetAndLength(out int searchOffset, out int searchLength); + _newlineIndex = _lineBuffer.AsSpan(searchOffset, searchLength).IndexOfAny(CR, LF); + if (_newlineIndex >= 0) + { + _lastSearchedForNewline = -1; + _newlineIndex += searchOffset; + if (_lineBuffer[_newlineIndex] is LF || // the newline is LF + _newlineIndex - _lineOffset + 1 < _lineLength || // we must have CR and we have whatever comes after it + _eof) // if we get here, we know we have a CR at the end of the buffer, so it's definitely the whole newline if we've hit EOF + { + // Process the line. + if (ProcessLine(out SseItem sseItem, out int advance)) + { + yield return sseItem; + } + + // Move past the line. + _lineOffset += advance; + _lineLength -= advance; + continue; + } + } + else + { + // Record the last position searched for a newline. 
The next time we search, + // we'll search from here rather than from _lineOffset, in order to avoid searching + // the same characters again. + _lastSearchedForNewline = _lineOffset + _lineLength; + } + + // We've processed everything in the buffer we currently can, so if we've already read EOF, we're done. + if (_eof) + { + // Spec: "Once the end of the file is reached, any pending data must be discarded. (If the file ends in the middle of an + // event, before the final empty line, the incomplete event is not dispatched.)" + break; + } + + // Read more data into the buffer. + FillLineBuffer(); + } + } + finally + { + ArrayPool.Shared.Return(_lineBuffer); + if (_dataBuffer is not null) + { + ArrayPool.Shared.Return(_dataBuffer); + } + } + } + + /// Gets an asynchronous enumerable of the server-sent events from this parser. + /// The cancellation token to use to cancel the enumeration. + /// The parser has already been enumerated. Such an exception may propagate out of a call to . + /// The enumeration was canceled. Such an exception may propagate out of a call to . + public async IAsyncEnumerable> EnumerateAsync([EnumeratorCancellation] CancellationToken cancellationToken = default) + { + // Validate that the parser is only used for one enumeration. + ThrowIfNotFirstEnumeration(); + + // Rent a line buffer. This will grow as needed. The line buffer is what's passed to the stream, + // so we want it to be large enough to reduce the number of reads we need to do when data is + // arriving quickly. (In debug, we use a smaller buffer to stress the growth and shifting logic.) + _lineBuffer = ArrayPool.Shared.Rent(DefaultArrayPoolRentSize); + try + { + // Spec: "Event streams in this format must always be encoded as UTF-8". + // Skip a UTF8 BOM if it exists at the beginning of the stream. (The BOM is defined as optional in the SSE grammar.) 
+ while (await FillLineBufferAsync(cancellationToken).ConfigureAwait(false) != 0 && _lineLength < Utf8Bom.Length) ; + SkipBomIfPresent(); + + // Process all events in the stream. + while (true) + { + // See if there's a complete line in data already read from the stream. Lines are permitted to + // end with CR, LF, or CRLF. Look for all of them and if we find one, process the line. However, + // if we only find a CR and it's at the end of the read data, don't process it now, as we want + // to process it together with an LF that might immediately follow, rather than treating them + // as two separate characters, in which case we'd incorrectly process the CR as a line by itself. + GetNextSearchOffsetAndLength(out int searchOffset, out int searchLength); + _newlineIndex = _lineBuffer.AsSpan(searchOffset, searchLength).IndexOfAny(CR, LF); + if (_newlineIndex >= 0) + { + _lastSearchedForNewline = -1; + _newlineIndex += searchOffset; + if (_lineBuffer[_newlineIndex] is LF || // newline is LF + _newlineIndex - _lineOffset + 1 < _lineLength || // newline is CR, and we have whatever comes after it + _eof) // if we get here, we know we have a CR at the end of the buffer, so it's definitely the whole newline if we've hit EOF + { + // Process the line. + if (ProcessLine(out SseItem sseItem, out int advance)) + { + yield return sseItem; + } + + // Move past the line. + _lineOffset += advance; + _lineLength -= advance; + continue; + } + } + else + { + // Record the last position searched for a newline. The next time we search, + // we'll search from here rather than from _lineOffset, in order to avoid searching + // the same characters again. + _lastSearchedForNewline = searchOffset + searchLength; + } + + // We've processed everything in the buffer we currently can, so if we've already read EOF, we're done. + if (_eof) + { + // Spec: "Once the end of the file is reached, any pending data must be discarded. 
(If the file ends in the middle of an + // event, before the final empty line, the incomplete event is not dispatched.)" + break; + } + + // Read more data into the buffer. + await FillLineBufferAsync(cancellationToken).ConfigureAwait(false); + } + } + finally + { + ArrayPool.Shared.Return(_lineBuffer); + if (_dataBuffer is not null) + { + ArrayPool.Shared.Return(_dataBuffer); + } + } + } + + /// Gets the next index and length with which to perform a newline search. + private void GetNextSearchOffsetAndLength(out int searchOffset, out int searchLength) + { + if (_lastSearchedForNewline > _lineOffset) + { + searchOffset = _lastSearchedForNewline; + searchLength = _lineLength - (_lastSearchedForNewline - _lineOffset); + } + else + { + searchOffset = _lineOffset; + searchLength = _lineLength; + } + + Debug.Assert(searchOffset >= _lineOffset, $"{searchOffset}, {_lineLength}"); + Debug.Assert(searchOffset <= _lineOffset + _lineLength, $"{searchOffset}, {_lineOffset}, {_lineLength}"); + Debug.Assert(searchOffset <= _lineBuffer.Length, $"{searchOffset}, {_lineBuffer.Length}"); + + Debug.Assert(searchLength >= 0, $"{searchLength}"); + Debug.Assert(searchLength <= _lineLength, $"{searchLength}, {_lineLength}"); + } + + private int GetNewLineLength() + { + Debug.Assert(_newlineIndex - _lineOffset < _lineLength, "Expected to be positioned at a non-empty newline"); + return _lineBuffer.AsSpan(_newlineIndex, _lineLength - (_newlineIndex - _lineOffset)).StartsWith(CRLF) ? 2 : 1; + } + + /// + /// If there's no room remaining in the line buffer, either shifts the contents + /// left or grows the buffer in order to make room for the next read. + /// + private void ShiftOrGrowLineBufferIfNecessary() + { + // If data we've read is butting up against the end of the buffer and + // it's not taking up the entire buffer, slide what's there down to + // the beginning, making room to read more data into the buffer (since + // there's no newline in the data that's there). 
Otherwise, if the whole + // buffer is full, grow the buffer to accommodate more data, since, again, + // what's there doesn't contain a newline and thus a line is longer than + // the current buffer accommodates. + if (_lineOffset + _lineLength == _lineBuffer.Length) + { + if (_lineOffset != 0) + { + _lineBuffer.AsSpan(_lineOffset, _lineLength).CopyTo(_lineBuffer); + if (_lastSearchedForNewline >= 0) + { + _lastSearchedForNewline -= _lineOffset; + } + _lineOffset = 0; + } + else if (_lineLength == _lineBuffer.Length) + { + GrowBuffer(ref _lineBuffer!, _lineBuffer.Length * 2); + } + } + } + + /// Processes a complete line from the SSE stream. + /// The parsed item if the method returns true. + /// How many characters to advance in the line buffer. + /// true if an SSE item was successfully parsed; otherwise, false. + private bool ProcessLine(out SseItem sseItem, out int advance) + { + ReadOnlySpan line = _lineBuffer.AsSpan(_lineOffset, _newlineIndex - _lineOffset); + + // Spec: "If the line is empty (a blank line) Dispatch the event" + if (line.IsEmpty) + { + advance = GetNewLineLength(); + + if (_dataAppended) + { + sseItem = new SseItem(_itemParser(_eventType, _dataBuffer.AsSpan(0, _dataLength)), _eventType); + _eventType = SseParser.EventTypeDefault; + _dataLength = 0; + _dataAppended = false; + return true; + } + + sseItem = default; + return false; + } + + // Find the colon separating the field name and value. + int colonPos = line.IndexOf((byte)':'); + ReadOnlySpan fieldName; + ReadOnlySpan fieldValue; + if (colonPos >= 0) + { + // Spec: "Collect the characters on the line before the first U+003A COLON character (:), and let field be that string." + fieldName = line.Slice(0, colonPos); + + // Spec: "Collect the characters on the line after the first U+003A COLON character (:), and let value be that string. + // If value starts with a U+0020 SPACE character, remove it from value." 
fieldValue = line.Slice(colonPos + 1);
+                if (!fieldValue.IsEmpty && fieldValue[0] == (byte)' ')
+                {
+                    fieldValue = fieldValue.Slice(1);
+                }
+            }
+            else
+            {
+                // Spec: "using the whole line as the field name, and the empty string as the field value."
+                fieldName = line;
+                fieldValue = [];
+            }
+
+            if (fieldName.SequenceEqual("data"u8))
+            {
+                // Spec: "Append the field value to the data buffer, then append a single U+000A LINE FEED (LF) character to the data buffer."
+                // Spec: "If the data buffer's last character is a U+000A LINE FEED (LF) character, then remove the last character from the data buffer."
+
+                // If there's nothing currently in the data buffer and we can easily detect that this line is immediately followed by
+                // an empty line, we can optimize it to just handle the data directly from the line buffer, rather than first copying
+                // into the data buffer and dispatching from there.
+                if (!_dataAppended)
+                {
+                    int newlineLength = GetNewLineLength();
+                    ReadOnlySpan<byte> remainder = _lineBuffer.AsSpan(_newlineIndex + newlineLength, _lineLength - line.Length - newlineLength);
+                    if (!remainder.IsEmpty &&
+                        (remainder[0] is LF || (remainder[0] is CR && remainder.Length > 1)))
+                    {
+                        advance = line.Length + newlineLength + (remainder.StartsWith(CRLF) ? 2 : 1);
+                        sseItem = new SseItem<T>(_itemParser(_eventType, fieldValue), _eventType);
+                        _eventType = SseParser.EventTypeDefault;
+                        return true;
+                    }
+                }
+
+                // We need to copy the data from the line buffer to the data buffer. Make sure there's enough room.
+                if (_dataBuffer is null || _dataLength + _lineLength + 1 > _dataBuffer.Length)
+                {
+                    GrowBuffer(ref _dataBuffer, _dataLength + _lineLength + 1);
+                }
+
+                // Append a newline if there's already content in the buffer.
+ // Then copy the field value to the data buffer + if (_dataAppended) + { + _dataBuffer![_dataLength++] = LF; + } + fieldValue.CopyTo(_dataBuffer.AsSpan(_dataLength)); + _dataLength += fieldValue.Length; + _dataAppended = true; + } + else if (fieldName.SequenceEqual("event"u8)) + { + // Spec: "Set the event type buffer to field value." + _eventType = SseParser.Utf8GetString(fieldValue); + } + else if (fieldName.SequenceEqual("id"u8)) + { + // Spec: "If the field value does not contain U+0000 NULL, then set the last event ID buffer to the field value. Otherwise, ignore the field." + if (fieldValue.IndexOf((byte)'\0') < 0) + { + // Note that fieldValue might be empty, in which case LastEventId will naturally be reset to the empty string. This is per spec. + LastEventId = SseParser.Utf8GetString(fieldValue); + } + } + else if (fieldName.SequenceEqual("retry"u8)) + { + // Spec: "If the field value consists of only ASCII digits, then interpret the field value as an integer in base ten, + // and set the event stream's reconnection time to that integer. Otherwise, ignore the field." + if (long.TryParse( +#if NET7_0_OR_GREATER + fieldValue, +#else + SseParser.Utf8GetString(fieldValue), +#endif + NumberStyles.None, CultureInfo.InvariantCulture, out long milliseconds)) + { + ReconnectionInterval = TimeSpan.FromMilliseconds(milliseconds); + } + } + else + { + // We'll end up here if the line starts with a colon, producing an empty field name, or if the field name is otherwise unrecognized. + // Spec: "If the line starts with a U+003A COLON character (:) Ignore the line." + // Spec: "Otherwise, The field is ignored" + } + + advance = line.Length + GetNewLineLength(); + sseItem = default; + return false; + } + + /// Gets the last event ID. + /// This value is updated any time a new last event ID is parsed. It is not reset between SSE items. 
public string LastEventId { get; private set; } = string.Empty; // Spec: "must be initialized to the empty string"
+
+        /// <summary>Gets the reconnection interval.</summary>
+        /// <remarks>
+        /// If no retry event was received, this defaults to <see cref="Timeout.InfiniteTimeSpan"/>, and it will only
+        /// ever be <see cref="Timeout.InfiniteTimeSpan"/> in that situation. If a client wishes to retry, the server-sent
+        /// events specification states that the interval may then be decided by the client implementation and should be a
+        /// few seconds.
+        /// </remarks>
+        public TimeSpan ReconnectionInterval { get; private set; } = Timeout.InfiniteTimeSpan;
+
+        /// <summary>Transitions the object to a used state, throwing if it's already been used.</summary>
+        private void ThrowIfNotFirstEnumeration()
+        {
+            if (Interlocked.Exchange(ref _used, 1) != 0)
+            {
+                throw new InvalidOperationException("The enumerable may be enumerated only once.");
+            }
+        }
+
+        /// <summary>Reads data from the stream into the line buffer.</summary>
+        private int FillLineBuffer()
+        {
+            ShiftOrGrowLineBufferIfNecessary();
+
+            int offset = _lineOffset + _lineLength;
+            int bytesRead = _stream.Read(
+#if NET
+                _lineBuffer.AsSpan(offset));
+#else
+                _lineBuffer, offset, _lineBuffer.Length - offset);
+#endif
+
+            if (bytesRead > 0)
+            {
+                _lineLength += bytesRead;
+            }
+            else
+            {
+                _eof = true;
+                bytesRead = 0;
+            }
+
+            return bytesRead;
+        }
+
+        /// <summary>Reads data asynchronously from the stream into the line buffer.</summary>
+        private async ValueTask<int> FillLineBufferAsync(CancellationToken cancellationToken)
+        {
+            ShiftOrGrowLineBufferIfNecessary();
+
+            int offset = _lineOffset + _lineLength;
+            int bytesRead = await
+#if NET
+                _stream.ReadAsync(_lineBuffer.AsMemory(offset), cancellationToken)
+#else
+                new ValueTask<int>(_stream.ReadAsync(_lineBuffer, offset, _lineBuffer.Length - offset, cancellationToken))
+#endif
+                .ConfigureAwait(false);
+
+            if (bytesRead > 0)
+            {
+                _lineLength += bytesRead;
+            }
+            else
+            {
+                _eof = true;
+                bytesRead = 0;
+            }
+
+            return bytesRead;
+        }
+
+        /// <summary>Gets the UTF8 BOM.</summary>
+ private static ReadOnlySpan Utf8Bom => [0xEF, 0xBB, 0xBF]; + + /// Called at the beginning of processing to skip over an optional UTF8 byte order mark. + private void SkipBomIfPresent() + { + Debug.Assert(_lineOffset == 0, $"Expected _lineOffset == 0, got {_lineOffset}"); + + if (_lineBuffer.AsSpan(0, _lineLength).StartsWith(Utf8Bom)) + { + _lineOffset += 3; + _lineLength -= 3; + } + } + + /// Grows the buffer, returning the existing one to the ArrayPool and renting an ArrayPool replacement. + private static void GrowBuffer(ref byte[]? buffer, int minimumLength) + { + byte[]? toReturn = buffer; + buffer = ArrayPool.Shared.Rent(Math.Max(minimumLength, DefaultArrayPoolRentSize)); + if (toReturn is not null) + { + Array.Copy(toReturn, buffer, toReturn.Length); + ArrayPool.Shared.Return(toReturn); + } + } + } +} \ No newline at end of file diff --git a/.dotnet/src/Utility/Telemetry/OpenTelemetryConstants.cs b/.dotnet/src/Utility/Telemetry/OpenTelemetryConstants.cs new file mode 100644 index 000000000..d5a0906a7 --- /dev/null +++ b/.dotnet/src/Utility/Telemetry/OpenTelemetryConstants.cs @@ -0,0 +1,33 @@ +namespace OpenAI.Telemetry; + +internal class OpenTelemetryConstants +{ + // follow OpenTelemetry GenAI semantic conventions: + // https://github.com/open-telemetry/semantic-conventions/tree/v1.27.0/docs/gen-ai + + public const string ErrorTypeKey = "error.type"; + public const string ServerAddressKey = "server.address"; + public const string ServerPortKey = "server.port"; + + public const string GenAiClientOperationDurationMetricName = "gen_ai.client.operation.duration"; + public const string GenAiClientTokenUsageMetricName = "gen_ai.client.token.usage"; + + public const string GenAiOperationNameKey = "gen_ai.operation.name"; + + public const string GenAiRequestMaxTokensKey = "gen_ai.request.max_tokens"; + public const string GenAiRequestModelKey = "gen_ai.request.model"; + public const string GenAiRequestTemperatureKey = "gen_ai.request.temperature"; + public const 
string GenAiRequestTopPKey = "gen_ai.request.top_p";
+
+    public const string GenAiResponseIdKey = "gen_ai.response.id";
+    public const string GenAiResponseFinishReasonKey = "gen_ai.response.finish_reasons";
+    public const string GenAiResponseModelKey = "gen_ai.response.model";
+
+    public const string GenAiSystemKey = "gen_ai.system";
+    public const string GenAiSystemValue = "openai";
+
+    public const string GenAiTokenTypeKey = "gen_ai.token.type";
+
+    public const string GenAiUsageInputTokensKey = "gen_ai.usage.input_tokens";
+    public const string GenAiUsageOutputTokensKey = "gen_ai.usage.output_tokens";
+}
diff --git a/.dotnet/src/Utility/Telemetry/OpenTelemetryScope.cs b/.dotnet/src/Utility/Telemetry/OpenTelemetryScope.cs
new file mode 100644
index 000000000..8cdda7e21
--- /dev/null
+++ b/.dotnet/src/Utility/Telemetry/OpenTelemetryScope.cs
@@ -0,0 +1,222 @@
+using OpenAI.Chat;
+using System;
+using System.ClientModel;
+using System.Diagnostics;
+using System.Diagnostics.Metrics;
+
+using static OpenAI.Telemetry.OpenTelemetryConstants;
+
+namespace OpenAI.Telemetry;
+
+internal class OpenTelemetryScope : IDisposable
+{
+    private static readonly ActivitySource s_chatSource = new ActivitySource("OpenAI.ChatClient");
+    private static readonly Meter s_chatMeter = new Meter("OpenAI.ChatClient");
+
+    // TODO: add explicit histogram buckets once System.Diagnostics.DiagnosticSource 9.0 is used
+    private static readonly Histogram<double> s_duration = s_chatMeter.CreateHistogram<double>(GenAiClientOperationDurationMetricName, "s", "Measures GenAI operation duration.");
+    private static readonly Histogram<long> s_tokens = s_chatMeter.CreateHistogram<long>(GenAiClientTokenUsageMetricName, "{token}", "Measures the number of input and output tokens used.");
+
+    private readonly string _operationName;
+    private readonly string _serverAddress;
+    private readonly int _serverPort;
+    private readonly string _requestModel;
+
+    private Stopwatch _duration;
+    private Activity _activity;
+    private TagList
_commonTags; + + private OpenTelemetryScope( + string model, string operationName, + string serverAddress, int serverPort) + { + _requestModel = model; + _operationName = operationName; + _serverAddress = serverAddress; + _serverPort = serverPort; + } + + private static bool IsChatEnabled => s_chatSource.HasListeners() || s_tokens.Enabled || s_duration.Enabled; + + public static OpenTelemetryScope StartChat(string model, string operationName, + string serverAddress, int serverPort, ChatCompletionOptions options) + { + if (IsChatEnabled) + { + var scope = new OpenTelemetryScope(model, operationName, serverAddress, serverPort); + scope.StartChat(options); + return scope; + } + + return null; + } + + private void StartChat(ChatCompletionOptions options) + { + _duration = Stopwatch.StartNew(); + _commonTags = new TagList + { + { GenAiSystemKey, GenAiSystemValue }, + { GenAiRequestModelKey, _requestModel }, + { ServerAddressKey, _serverAddress }, + { ServerPortKey, _serverPort }, + { GenAiOperationNameKey, _operationName }, + }; + + _activity = s_chatSource.StartActivity(string.Concat(_operationName, " ", _requestModel), ActivityKind.Client); + if (_activity?.IsAllDataRequested == true) + { + RecordCommonAttributes(); + SetActivityTagIfNotNull(GenAiRequestMaxTokensKey, options?.MaxTokens); + SetActivityTagIfNotNull(GenAiRequestTemperatureKey, options?.Temperature); + SetActivityTagIfNotNull(GenAiRequestTopPKey, options?.TopP); + } + + return; + } + + public void RecordChatCompletion(ChatCompletion completion) + { + RecordMetrics(completion.Model, null, completion.Usage?.InputTokens, completion.Usage?.OutputTokens); + + if (_activity?.IsAllDataRequested == true) + { + RecordResponseAttributes(completion.Id, completion.Model, completion.FinishReason, completion.Usage); + } + } + + public void RecordException(Exception ex) + { + var errorType = GetErrorType(ex); + RecordMetrics(null, errorType, null, null); + if (_activity?.IsAllDataRequested == true) + { + 
_activity?.SetTag(OpenTelemetryConstants.ErrorTypeKey, errorType); + _activity?.SetStatus(ActivityStatusCode.Error, ex?.Message ?? errorType); + } + } + + public void Dispose() + { + _activity?.Stop(); + } + + private void RecordCommonAttributes() + { + _activity.SetTag(GenAiSystemKey, GenAiSystemValue); + _activity.SetTag(GenAiRequestModelKey, _requestModel); + _activity.SetTag(ServerAddressKey, _serverAddress); + _activity.SetTag(ServerPortKey, _serverPort); + _activity.SetTag(GenAiOperationNameKey, _operationName); + } + + private void RecordMetrics(string responseModel, string errorType, int? inputTokensUsage, int? outputTokensUsage) + { + // tags is a struct, let's copy and modify them + var tags = _commonTags; + + if (responseModel != null) + { + tags.Add(GenAiResponseModelKey, responseModel); + } + + if (inputTokensUsage != null) + { + var inputUsageTags = tags; + inputUsageTags.Add(GenAiTokenTypeKey, "input"); + s_tokens.Record(inputTokensUsage.Value, inputUsageTags); + } + + if (outputTokensUsage != null) + { + var outputUsageTags = tags; + outputUsageTags.Add(GenAiTokenTypeKey, "output"); + s_tokens.Record(outputTokensUsage.Value, outputUsageTags); + } + + if (errorType != null) + { + tags.Add(ErrorTypeKey, errorType); + } + + s_duration.Record(_duration.Elapsed.TotalSeconds, tags); + } + + private void RecordResponseAttributes(string responseId, string model, ChatFinishReason? finishReason, ChatTokenUsage usage) + { + SetActivityTagIfNotNull(GenAiResponseIdKey, responseId); + SetActivityTagIfNotNull(GenAiResponseModelKey, model); + SetActivityTagIfNotNull(GenAiUsageInputTokensKey, usage?.InputTokens); + SetActivityTagIfNotNull(GenAiUsageOutputTokensKey, usage?.OutputTokens); + SetFinishReasonAttribute(finishReason); + } + + private void SetFinishReasonAttribute(ChatFinishReason? 
finishReason)
+    {
+        if (finishReason == null)
+        {
+            return;
+        }
+
+        var reasonStr = finishReason switch
+        {
+            ChatFinishReason.ContentFilter => "content_filter",
+            ChatFinishReason.FunctionCall => "function_call",
+            ChatFinishReason.Length => "length",
+            ChatFinishReason.Stop => "stop",
+            ChatFinishReason.ToolCalls => "tool_calls",
+            _ => finishReason.ToString(),
+        };
+
+        // There could be multiple finish reasons, so semantic conventions use an array type for the corresponding attribute.
+        // It's likely to change, but for now let's report it as an array.
+        _activity.SetTag(GenAiResponseFinishReasonKey, new[] { reasonStr });
+    }
+
+    private string GetChatMessageRole(ChatMessageRole? role) =>
+        role switch
+        {
+            ChatMessageRole.Assistant => "assistant",
+            ChatMessageRole.Function => "function",
+            ChatMessageRole.System => "system",
+            ChatMessageRole.Tool => "tool",
+            ChatMessageRole.User => "user",
+            _ => role?.ToString(),
+        };
+
+    private string GetErrorType(Exception exception)
+    {
+        if (exception is ClientResultException requestFailedException)
+        {
+            // TODO (lmolkova) when we start targeting .NET 8 we should put
+            // requestFailedException.InnerException.HttpRequestError into error.type
+            return requestFailedException.Status.ToString();
+        }
+
+        return exception?.GetType()?.FullName;
+    }
+
+    private void SetActivityTagIfNotNull(string name, object value)
+    {
+        if (value != null)
+        {
+            _activity.SetTag(name, value);
+        }
+    }
+
+    private void SetActivityTagIfNotNull(string name, int? value)
+    {
+        if (value.HasValue)
+        {
+            _activity.SetTag(name, value.Value);
+        }
+    }
+
+    private void SetActivityTagIfNotNull(string name, float?
value) + { + if (value.HasValue) + { + _activity.SetTag(name, value.Value); + } + } +} diff --git a/.dotnet/src/Utility/Telemetry/OpenTelemetrySource.cs b/.dotnet/src/Utility/Telemetry/OpenTelemetrySource.cs new file mode 100644 index 000000000..a0ac1fe47 --- /dev/null +++ b/.dotnet/src/Utility/Telemetry/OpenTelemetrySource.cs @@ -0,0 +1,30 @@ +using OpenAI.Chat; +using System; + +namespace OpenAI.Telemetry; + +internal class OpenTelemetrySource +{ + private const string ChatOperationName = "chat"; + private readonly bool IsOTelEnabled = AppContextSwitchHelper + .GetConfigValue("OpenAI.Experimental.EnableOpenTelemetry", "OPENAI_EXPERIMENTAL_ENABLE_OPEN_TELEMETRY"); + + private readonly string _serverAddress; + private readonly int _serverPort; + private readonly string _model; + + public OpenTelemetrySource(string model, Uri endpoint) + { + _serverAddress = endpoint.Host; + _serverPort = endpoint.Port; + _model = model; + } + + public OpenTelemetryScope StartChatScope(ChatCompletionOptions completionsOptions) + { + return IsOTelEnabled + ? 
OpenTelemetryScope.StartChat(_model, ChatOperationName, _serverAddress, _serverPort, completionsOptions) + : null; + } + +} diff --git a/.dotnet/tests/Assets/audio_french.wav b/.dotnet/tests/Assets/audio_french.wav new file mode 100644 index 000000000..847f3463a Binary files /dev/null and b/.dotnet/tests/Assets/audio_french.wav differ diff --git a/.dotnet/tests/Assets/audio_hello_world.mp3 b/.dotnet/tests/Assets/audio_hello_world.mp3 new file mode 100644 index 000000000..4186dcd09 Binary files /dev/null and b/.dotnet/tests/Assets/audio_hello_world.mp3 differ diff --git a/.dotnet/tests/Assets/images_dog_and_cat.png b/.dotnet/tests/Assets/images_dog_and_cat.png new file mode 100644 index 000000000..063f3e4d6 Binary files /dev/null and b/.dotnet/tests/Assets/images_dog_and_cat.png differ diff --git a/.dotnet/tests/Assets/images_empty_room.png b/.dotnet/tests/Assets/images_empty_room.png new file mode 100644 index 000000000..4c9610d0e Binary files /dev/null and b/.dotnet/tests/Assets/images_empty_room.png differ diff --git a/.dotnet/tests/Assets/images_empty_room_with_mask.png b/.dotnet/tests/Assets/images_empty_room_with_mask.png new file mode 100644 index 000000000..bb457f7ae Binary files /dev/null and b/.dotnet/tests/Assets/images_empty_room_with_mask.png differ diff --git a/.dotnet/tests/Assistants/AssistantSmokeTests.cs b/.dotnet/tests/Assistants/AssistantSmokeTests.cs new file mode 100644 index 000000000..4dc5f4984 --- /dev/null +++ b/.dotnet/tests/Assistants/AssistantSmokeTests.cs @@ -0,0 +1,109 @@ +using NUnit.Framework; +using OpenAI.Assistants; +using OpenAI.Chat; +using OpenAI.Files; +using OpenAI.VectorStores; +using System; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using static OpenAI.Tests.TestHelpers; + +namespace OpenAI.Tests.Assistants; + +#pragma warning disable OPENAI001 + +[Parallelizable(ParallelScope.Fixtures)] +[Category("Assistants")] +[Category("Smoke")] +public partial class AssistantSmokeTests +{ + [Test] + 
public void RunStepDeserialization() + { + BinaryData runStepData = BinaryData.FromString( + """ + { + "id": "step_Ksdfr5ooy26sayKbIQu2d2Vb", + "object": "thread.run.step", + "created_at": 1718906747, + "run_id": "run_vvuLqtPTte9qCnRb7a5MQPgB", + "assistant_id": "asst_UyBYTjqlwhSOdHOEzwwGZM6d", + "thread_id": "thread_lIk2yQzSGHzXrzA4K6N8uPae", + "type": "tool_calls", + "status": "completed", + "cancelled_at": null, + "completed_at": 1718906749, + "expires_at": null, + "failed_at": null, + "last_error": null, + "step_details": { + "type": "tool_calls", + "tool_calls": [ + { + "id": "call_DUP8WOybwaxKcMoxtr6cJDw1", + "type": "code_interpreter", + "code_interpreter": { + "input": "# Let's read the content of the uploaded file to understand its content.\r\nfile_path = '/mnt/data/assistant-SvXXKd0VKpGbVq9rBDlvZTn0'\r\nwith open(file_path, 'r') as file:\r\n content = file.read()\r\n\r\n# Output the first few lines of the file to understand its structure and content\r\ncontent[:2000]", + "outputs": [ + { + "type": "logs", + "logs": "'Index,Value\\nIndex #1,1\\nIndex #2,4\\nIndex #3,9\\nIndex #4,16\\nIndex #5,25\\nIndex #6,36\\nIndex #7,49\\nIndex #8,64\\nIndex #9,81\\nIndex #10,100\\nIndex #11,121\\nIndex #12,144\\nIndex #13,169\\nIndex #14,196\\nIndex #15,225\\nIndex #16,256\\nIndex #17,289\\nIndex #18,324\\nIndex #19,361\\nIndex #20,400\\nIndex #21,441\\nIndex #22,484\\nIndex #23,529\\nIndex #24,576\\nIndex #25,625\\nIndex #26,676\\nIndex #27,729\\nIndex #28,784\\nIndex #29,841\\nIndex #30,900\\nIndex #31,961\\nIndex #32,1024\\nIndex #33,1089\\nIndex #34,1156\\nIndex #35,1225\\nIndex #36,1296\\nIndex #37,1369\\nIndex #38,1444\\nIndex #39,1521\\nIndex #40,1600\\nIndex #41,1681\\nIndex #42,1764\\nIndex #43,1849\\nIndex #44,1936\\nIndex #45,2025\\nIndex #46,2116\\nIndex #47,2209\\nIndex #48,2304\\nIndex #49,2401\\nIndex #50,2500\\nIndex #51,2601\\nIndex #52,2704\\nIndex #53,2809\\nIndex #54,2916\\nIndex #55,3025\\nIndex #56,3136\\nIndex #57,3249\\nIndex #58,3364\\nIndex 
#59,3481\\nIndex #60,3600\\nIndex #61,3721\\nIndex #62,3844\\nIndex #63,3969\\nIndex #64,4096\\nIndex #65,4225\\nIndex #66,4356\\nIndex #67,4489\\nIndex #68,4624\\nIndex #69,4761\\nIndex #70,4900\\nIndex #71,5041\\nIndex #72,5184\\nIndex #73,5329\\nIndex #74,5476\\nIndex #75,5625\\nIndex #76,5776\\nIndex #77,5929\\nIndex #78,6084\\nIndex #79,6241\\nIndex #80,6400\\nIndex #81,6561\\nIndex #82,6724\\nIndex #83,6889\\nIndex #84,7056\\nIndex #85,7225\\nIndex #86,7396\\nIndex #87,7569\\nIndex #88,7744\\nIndex #89,7921\\nIndex #90,8100\\nIndex #91,8281\\nIndex #92,8464\\nIndex #93,8649\\nIndex #94,8836\\nIndex #95,9025\\nIndex #96,9216\\nIndex #97,9409\\nIndex #98,9604\\nIndex #99,9801\\nIndex #100,10000\\nIndex #101,10201\\nIndex #102,10404\\nIndex #103,10609\\nIndex #104,10816\\nIndex #105,11025\\nIndex #106,11236\\nIndex #107,11449\\nIndex #108,11664\\nIndex #109,11881\\nIndex #110,12100\\nIndex #111,12321\\nIndex #112,12544\\nIndex #113,12769\\nIndex #114,12996\\nIndex #115,13225\\nIndex #116,13456\\nIndex #117,13689\\nIndex #118,13924\\nIndex #119,14161\\nIndex #120,14400\\nIndex #121,14641\\nIndex #122,14884\\nIndex #123,15129\\nIndex #124,15376\\nIndex #125,15625\\nIndex #126,15876\\nIndex #127,16129\\nIndex #128,16384\\nIndex #129,16641\\nIndex #130,16900\\nIndex #131,17161\\nIndex #132,'" + } + ] + } + } + ] + }, + "usage": { + "prompt_tokens": 201, + "completion_tokens": 84, + "total_tokens": 285 + } + } + """); + RunStep deserializedRunStep = ModelReaderWriter.Read<RunStep>(runStepData); + Assert.That(deserializedRunStep.Id, Is.Not.Null.And.Not.Empty); + Assert.That(deserializedRunStep.AssistantId, Is.Not.Null.And.Not.Empty); + Assert.That(deserializedRunStep.Details, Is.Not.Null); + Assert.That(deserializedRunStep.Details.ToolCalls, Has.Count.EqualTo(1)); + Assert.That(deserializedRunStep.Details.ToolCalls[0].CodeInterpreterOutputs, Has.Count.EqualTo(1)); + Assert.That(deserializedRunStep.Details.ToolCalls[0].CodeInterpreterOutputs[0].Logs, Is.Not.Null.And.Not.Empty);
+ } + + [Test] + public void ResponseFormatEquality() + { + Assert.That(AssistantResponseFormat.CreateAutoFormat() == "auto"); + Assert.That(AssistantResponseFormat.CreateAutoFormat(), Is.EqualTo("auto")); + Assert.That(AssistantResponseFormat.CreateAutoFormat(), Is.Not.EqualTo("automatic")); + Assert.That(AssistantResponseFormat.CreateAutoFormat() == AssistantResponseFormat.CreateAutoFormat()); + Assert.That(AssistantResponseFormat.CreateTextFormat() == AssistantResponseFormat.CreateTextFormat()); + Assert.That(AssistantResponseFormat.CreateTextFormat(), Is.EqualTo(AssistantResponseFormat.CreateTextFormat())); + Assert.That(AssistantResponseFormat.CreateAutoFormat() != AssistantResponseFormat.CreateTextFormat()); + Assert.That(AssistantResponseFormat.CreateAutoFormat(), Is.Not.EqualTo(AssistantResponseFormat.CreateTextFormat())); + Assert.That((AssistantResponseFormat)null == (AssistantResponseFormat)null); + Assert.That((AssistantResponseFormat)null != AssistantResponseFormat.CreateTextFormat()); + Assert.That(AssistantResponseFormat.CreateTextFormat() != null); + Assert.That(AssistantResponseFormat.CreateTextFormat(), Is.Not.EqualTo(null)); + Assert.That(null, Is.Not.EqualTo(AssistantResponseFormat.CreateTextFormat())); + + AssistantResponseFormat jsonSchemaFormat = AssistantResponseFormat.CreateJsonSchemaFormat( + name: "test_schema", + description: "A description of the schema", + jsonSchema: BinaryData.FromString(""" + { + "type": "object", + "properties": { + "foo": { "type": "string" } + }, + "additionalProperties": false + } + """), + strictSchemaEnabled: true); + + Assert.That(jsonSchemaFormat == AssistantResponseFormat.CreateJsonSchemaFormat("test_schema", BinaryData.FromObjectAsJson(new { }))); + Assert.That(jsonSchemaFormat != AssistantResponseFormat.CreateJsonSchemaFormat("not_test_schema", BinaryData.FromObjectAsJson(new { }))); + } +} + +#pragma warning restore OPENAI001 diff --git a/.dotnet/tests/Assistants/AssistantTests.cs 
b/.dotnet/tests/Assistants/AssistantTests.cs new file mode 100644 index 000000000..898c4299b --- /dev/null +++ b/.dotnet/tests/Assistants/AssistantTests.cs @@ -0,0 +1,1131 @@ +using NUnit.Framework; +using OpenAI.Assistants; +using OpenAI.Chat; +using OpenAI.Files; +using OpenAI.VectorStores; +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Diagnostics; +using System.Linq; +using System.Numerics; +using System.Threading; +using System.Threading.Tasks; +using static OpenAI.Tests.TestHelpers; + +namespace OpenAI.Tests.Assistants; + +#pragma warning disable OPENAI001 + +[Parallelizable(ParallelScope.Fixtures)] +[Category("Assistants")] +public partial class AssistantTests +{ + [OneTimeTearDown] + protected void Cleanup() + { + // Skip cleanup if there is no API key (e.g., if we are not running live tests). + if (string.IsNullOrEmpty(Environment.GetEnvironmentVariable("OPENAI_API_KEY"))) + { + return; + } + + AssistantClient client = GetTestClient(TestScenario.Assistants); + FileClient fileClient = GetTestClient(TestScenario.Files); + VectorStoreClient vectorStoreClient = GetTestClient(TestScenario.VectorStores); + RequestOptions requestOptions = new() + { + ErrorOptions = ClientErrorBehaviors.NoThrow, + }; + foreach (ThreadMessage message in _messagesToDelete) + { + Console.WriteLine($"Cleanup: {message.Id} -> {client.DeleteMessage(message.ThreadId, message.Id, requestOptions)?.GetRawResponse().Status}"); + } + foreach (Assistant assistant in _assistantsToDelete) + { + Console.WriteLine($"Cleanup: {assistant.Id} -> {client.DeleteAssistant(assistant.Id, requestOptions)?.GetRawResponse().Status}"); + } + foreach (AssistantThread thread in _threadsToDelete) + { + Console.WriteLine($"Cleanup: {thread.Id} -> {client.DeleteThread(thread.Id, requestOptions)?.GetRawResponse().Status}"); + } + foreach (OpenAIFileInfo file in _filesToDelete) + { + Console.WriteLine($"Cleanup: {file.Id} -> 
{fileClient.DeleteFile(file.Id, requestOptions)?.GetRawResponse().Status}"); + } + foreach (string vectorStoreId in _vectorStoreIdsToDelete) + { + Console.WriteLine($"Cleanup: {vectorStoreId} => {vectorStoreClient.DeleteVectorStore(vectorStoreId, requestOptions)?.GetRawResponse().Status}"); + } + _messagesToDelete.Clear(); + _assistantsToDelete.Clear(); + _threadsToDelete.Clear(); + _filesToDelete.Clear(); + _vectorStoreIdsToDelete.Clear(); + } + + [Test] + public void BasicAssistantOperationsWork() + { + AssistantClient client = GetTestClient(); + Assistant assistant = client.CreateAssistant("gpt-4o-mini"); + Validate(assistant); + Assert.That(assistant.Name, Is.Null.Or.Empty); + assistant = client.ModifyAssistant(assistant.Id, new AssistantModificationOptions() + { + Name = "test assistant name", + }); + Assert.That(assistant.Name, Is.EqualTo("test assistant name")); + bool deleted = client.DeleteAssistant(assistant.Id); + Assert.That(deleted, Is.True); + _assistantsToDelete.Remove(assistant); + assistant = client.CreateAssistant("gpt-4o-mini", new AssistantCreationOptions() + { + Metadata = + { + [s_cleanupMetadataKey] = "hello!" 
+ }, + }); + Validate(assistant); + Assistant retrievedAssistant = client.GetAssistant(assistant.Id); + Assert.That(retrievedAssistant.Id, Is.EqualTo(assistant.Id)); + Assert.That(retrievedAssistant.Metadata.TryGetValue(s_cleanupMetadataKey, out string metadataValue) && metadataValue == "hello!"); + Assistant modifiedAssistant = client.ModifyAssistant(assistant.Id, new AssistantModificationOptions() + { + Metadata = + { + [s_cleanupMetadataKey] = "goodbye!", + }, + }); + Assert.That(modifiedAssistant.Id, Is.EqualTo(assistant.Id)); + PageCollection<Assistant> pages = client.GetAssistants(); + IEnumerable<Assistant> recentAssistants = pages.GetAllValues(); + Assistant listedAssistant = recentAssistants.FirstOrDefault(pageItem => pageItem.Id == assistant.Id); + Assert.That(listedAssistant, Is.Not.Null); + Assert.That(listedAssistant.Metadata.TryGetValue(s_cleanupMetadataKey, out string newMetadataValue) && newMetadataValue == "goodbye!"); + } + + [Test] + public void BasicThreadOperationsWork() + { + AssistantClient client = GetTestClient(); + AssistantThread thread = client.CreateThread(); + Validate(thread); + Assert.That(thread.CreatedAt, Is.GreaterThan(s_2024)); + bool deleted = client.DeleteThread(thread.Id); + Assert.That(deleted, Is.True); + _threadsToDelete.Remove(thread); + + ThreadCreationOptions options = new() + { + Metadata = + { + ["threadMetadata"] = "threadMetadataValue", + } + }; + thread = client.CreateThread(options); + Validate(thread); + Assert.That(thread.Metadata.TryGetValue("threadMetadata", out string threadMetadataValue) && threadMetadataValue == "threadMetadataValue"); + AssistantThread retrievedThread = client.GetThread(thread.Id); + Assert.That(retrievedThread.Id, Is.EqualTo(thread.Id)); + thread = client.ModifyThread(thread, new ThreadModificationOptions() + { + Metadata = + { + ["threadMetadata"] = "newThreadMetadataValue", + }, + }); + Assert.That(thread.Metadata.TryGetValue("threadMetadata", out threadMetadataValue) && threadMetadataValue == "newThreadMetadataValue"); + } + + [Test] + public void BasicMessageOperationsWork() + { + AssistantClient client = GetTestClient(); + AssistantThread thread = client.CreateThread(); + Validate(thread); + ThreadMessage message = client.CreateMessage(thread, MessageRole.User, ["Hello, world!"]); + Validate(message); + Assert.That(message.CreatedAt, Is.GreaterThan(s_2024)); + Assert.That(message.Content?.Count, Is.EqualTo(1)); + Assert.That(message.Content[0], Is.Not.Null); + Assert.That(message.Content[0].Text, Is.EqualTo("Hello, world!")); + bool deleted = client.DeleteMessage(message); + Assert.That(deleted, Is.True); + _messagesToDelete.Remove(message); + + message = client.CreateMessage(thread, MessageRole.User, ["Goodbye, world!"], new MessageCreationOptions() + { + Metadata = + { + ["messageMetadata"] = "messageMetadataValue", + }, + }); + Validate(message); + Assert.That(message.Metadata.TryGetValue("messageMetadata", out string metadataValue) && metadataValue == "messageMetadataValue"); + + ThreadMessage retrievedMessage = client.GetMessage(thread.Id, message.Id); + Assert.That(retrievedMessage.Id, Is.EqualTo(message.Id)); + + message = client.ModifyMessage(message, new MessageModificationOptions() + { + Metadata = + { + ["messageMetadata"] = "newValue", + } + }); + Assert.That(message.Metadata.TryGetValue("messageMetadata", out metadataValue) && metadataValue == "newValue"); + + PageResult<ThreadMessage> messagePage = client.GetMessages(thread).GetCurrentPage(); + Assert.That(messagePage.Values.Count, Is.EqualTo(1)); + Assert.That(messagePage.Values[0].Id, Is.EqualTo(message.Id)); + Assert.That(messagePage.Values[0].Metadata.TryGetValue("messageMetadata", out metadataValue) && metadataValue == "newValue"); + } + + [Test] + public void ThreadWithInitialMessagesWorks() + { + AssistantClient client = GetTestClient(); + ThreadCreationOptions options = new() + { + InitialMessages = + { + "Hello, world!", + new( + MessageRole.User, + [ + "Can you describe this image for me?", + MessageContent.FromImageUrl(new Uri("https://test.openai.com/image.png")) + ]) + { + Metadata = + { + ["messageMetadata"] = "messageMetadataValue", + }, + }, + }, + }; + AssistantThread thread = client.CreateThread(options); + Validate(thread); + PageResult<ThreadMessage> messagesPage = client.GetMessages(thread, new MessageCollectionOptions() { Order = ListOrder.OldestFirst }).GetCurrentPage(); + Assert.That(messagesPage.Values.Count, Is.EqualTo(2)); + Assert.That(messagesPage.Values[0].Role, Is.EqualTo(MessageRole.User)); + Assert.That(messagesPage.Values[0].Content?.Count, Is.EqualTo(1)); + Assert.That(messagesPage.Values[0].Content[0].Text, Is.EqualTo("Hello, world!")); + Assert.That(messagesPage.Values[1].Content?.Count, Is.EqualTo(2)); + Assert.That(messagesPage.Values[1].Content[0], Is.Not.Null); + Assert.That(messagesPage.Values[1].Content[0].Text, Is.EqualTo("Can you describe this image for me?")); + Assert.That(messagesPage.Values[1].Content[1], Is.Not.Null); + Assert.That(messagesPage.Values[1].Content[1].ImageUrl.AbsoluteUri, Is.EqualTo("https://test.openai.com/image.png")); + } + + [Test] + public void BasicRunOperationsWork() + { + AssistantClient client = GetTestClient(); + Assistant assistant = client.CreateAssistant("gpt-4o-mini"); + Validate(assistant); + AssistantThread thread = client.CreateThread(); + Validate(thread); + PageResult<ThreadRun> runsPage = client.GetRuns(thread).GetCurrentPage(); + Assert.That(runsPage.Values.Count, Is.EqualTo(0)); + ThreadMessage message = client.CreateMessage(thread.Id, MessageRole.User, ["Hello, assistant!"]); + Validate(message); + ThreadRun run = client.CreateRun(thread.Id, assistant.Id); + Validate(run); + Assert.That(run.Status, Is.EqualTo(RunStatus.Queued)); + Assert.That(run.CreatedAt, Is.GreaterThan(s_2024)); + ThreadRun retrievedRun = client.GetRun(thread.Id, run.Id); + Assert.That(retrievedRun.Id, Is.EqualTo(run.Id)); + runsPage = client.GetRuns(thread).GetCurrentPage(); + Assert.That(runsPage.Values.Count, Is.EqualTo(1)); + Assert.That(runsPage.Values[0].Id, Is.EqualTo(run.Id)); + + PageResult<ThreadMessage> messagesPage = client.GetMessages(thread).GetCurrentPage(); + Assert.That(messagesPage.Values.Count, Is.GreaterThanOrEqualTo(1)); + for (int i = 0; i < 10 && !run.Status.IsTerminal; i++) + { + Thread.Sleep(1000); + run = client.GetRun(run); + } + Assert.That(run.Status, Is.EqualTo(RunStatus.Completed)); + Assert.That(run.CompletedAt, Is.GreaterThan(s_2024)); + Assert.That(run.RequiredActions.Count, Is.EqualTo(0)); + Assert.That(run.AssistantId, Is.EqualTo(assistant.Id)); + Assert.That(run.FailedAt, Is.Null); + Assert.That(run.IncompleteDetails, Is.Null); + + messagesPage = client.GetMessages(thread).GetCurrentPage(); + Assert.That(messagesPage.Values.Count, Is.EqualTo(2)); + + Assert.That(messagesPage.Values[0].Role, Is.EqualTo(MessageRole.Assistant)); + Assert.That(messagesPage.Values[1].Role, Is.EqualTo(MessageRole.User)); + Assert.That(messagesPage.Values[1].Id, Is.EqualTo(message.Id)); + } + + [Test] + public void BasicRunStepFunctionalityWorks() + { + AssistantClient client = GetTestClient(); + Assistant assistant = client.CreateAssistant("gpt-4o-mini", new AssistantCreationOptions() + { + Tools = { new CodeInterpreterToolDefinition() }, + Instructions = "You help the user with mathematical descriptions and visualizations.", + }); + Validate(assistant); + + FileClient fileClient = GetTestClient(TestScenario.Files); + OpenAIFileInfo equationFile = fileClient.UploadFile( + BinaryData.FromString(""" + x,y + 2,5 + 7,14, + 8,22 + """).ToStream(), + "text/csv", + FileUploadPurpose.Assistants); + Validate(equationFile); + + AssistantThread thread = client.CreateThread(new ThreadCreationOptions() + { + InitialMessages = + { + "Describe the contents of any available tool resource file." + + " Graph a linear regression and provide the coefficient of correlation." + + " Explain any code executed to evaluate.", + }, + ToolResources = new() + { + CodeInterpreter = new() + { + FileIds = { equationFile.Id }, + } + } + }); + Validate(thread); + + ThreadRun run = client.CreateRun(thread, assistant); + Validate(run); + + while (!run.Status.IsTerminal) + { + Thread.Sleep(1000); + run = client.GetRun(run); + } + Assert.That(run.Status, Is.EqualTo(RunStatus.Completed)); + Assert.That(run.Usage?.TotalTokens, Is.GreaterThan(0)); + + PageCollection<RunStep> pages = client.GetRunSteps(run); + PageResult<RunStep> firstPage = pages.GetCurrentPage(); + RunStep firstStep = firstPage.Values[0]; + RunStep secondStep = firstPage.Values[1]; + + Assert.That(firstPage.Values.Count, Is.GreaterThan(1)); + Assert.Multiple(() => + { + Assert.That(firstStep.AssistantId, Is.EqualTo(assistant.Id)); + Assert.That(firstStep.ThreadId, Is.EqualTo(thread.Id)); + Assert.That(firstStep.RunId, Is.EqualTo(run.Id)); + Assert.That(firstStep.CreatedAt, Is.GreaterThan(s_2024)); + Assert.That(firstStep.CompletedAt, Is.GreaterThan(s_2024)); + }); + RunStepDetails details = firstStep.Details; + Assert.That(details?.CreatedMessageId, Is.Not.Null.And.Not.Empty); + + string rawContent = firstPage.GetRawResponse().Content.ToString(); + + details = secondStep.Details; + Assert.Multiple(() => + { + Assert.That(details?.ToolCalls.Count, Is.GreaterThan(0)); + Assert.That(details.ToolCalls[0].ToolKind, Is.EqualTo(RunStepToolCallKind.CodeInterpreter)); + Assert.That(details.ToolCalls[0].ToolCallId, Is.Not.Null.And.Not.Empty); + Assert.That(details.ToolCalls[0].CodeInterpreterInput, Is.Not.Null.And.Not.Empty); + Assert.That(details.ToolCalls[0].CodeInterpreterOutputs?.Count, Is.GreaterThan(0)); + Assert.That(details.ToolCalls[0].CodeInterpreterOutputs[0].ImageFileId, Is.Not.Null.And.Not.Empty); + }); + } + + [Test] + public void SettingResponseFormatWorks() + { + AssistantClient client = GetTestClient(); + Assistant assistant = client.CreateAssistant("gpt-4o-mini", new() + { + ResponseFormat =
AssistantResponseFormat.CreateAutoFormat(), + }); + Validate(assistant); + Assert.That(assistant.ResponseFormat == "auto"); + assistant = client.ModifyAssistant(assistant, new() + { + ResponseFormat = AssistantResponseFormat.CreateTextFormat(), + }); + Assert.That(assistant.ResponseFormat == AssistantResponseFormat.CreateTextFormat()); + AssistantThread thread = client.CreateThread(); + Validate(thread); + ThreadMessage message = client.CreateMessage(thread, MessageRole.User, ["Write some JSON for me!"]); + Validate(message); + ThreadRun run = client.CreateRun(thread, assistant, new() + { + ResponseFormat = AssistantResponseFormat.CreateJsonObjectFormat(), + }); + Assert.That(run.ResponseFormat == AssistantResponseFormat.CreateJsonObjectFormat()); + } + + [Test] + public void FunctionToolsWork() + { + AssistantClient client = GetTestClient(); + Assistant assistant = client.CreateAssistant("gpt-4o-mini", new AssistantCreationOptions() + { + Tools = + { + new FunctionToolDefinition() + { + FunctionName = "get_favorite_food_for_day_of_week", + Description = "gets the user's favorite food for a given day of the week, like Tuesday", + Parameters = BinaryData.FromObjectAsJson(new + { + type = "object", + properties = new + { + day_of_week = new + { + type = "string", + description = "a day of the week, like Tuesday or Saturday", + } + } + }), + }, + }, + }); + Validate(assistant); + Assert.That(assistant.Tools?.Count, Is.EqualTo(1)); + + FunctionToolDefinition responseToolDefinition = assistant.Tools[0] as FunctionToolDefinition; + Assert.That(responseToolDefinition?.FunctionName, Is.EqualTo("get_favorite_food_for_day_of_week")); + Assert.That(responseToolDefinition?.Parameters, Is.Not.Null); + + ThreadRun run = client.CreateThreadAndRun( + assistant, + new ThreadCreationOptions() + { + InitialMessages = { "What should I eat on Thursday?" 
}, + }, + new RunCreationOptions() + { + AdditionalInstructions = "Call provided tools when appropriate.", + }); + Validate(run); + + for (int i = 0; i < 10 && !run.Status.IsTerminal; i++) + { + Thread.Sleep(1000); + run = client.GetRun(run); + } + Assert.That(run.Status, Is.EqualTo(RunStatus.RequiresAction)); + Assert.That(run.RequiredActions?.Count, Is.EqualTo(1)); + Assert.That(run.RequiredActions[0].ToolCallId, Is.Not.Null.And.Not.Empty); + Assert.That(run.RequiredActions[0].FunctionName, Is.EqualTo("get_favorite_food_for_day_of_week")); + Assert.That(run.RequiredActions[0].FunctionArguments, Is.Not.Null.And.Not.Empty); + + run = client.SubmitToolOutputsToRun(run, [new(run.RequiredActions[0].ToolCallId, "tacos")]); + Assert.That(run.Status.IsTerminal, Is.False); + + for (int i = 0; i < 10 && !run.Status.IsTerminal; i++) + { + Thread.Sleep(1000); + run = client.GetRun(run); + } + Assert.That(run.Status, Is.EqualTo(RunStatus.Completed)); + + PageCollection<ThreadMessage> messagePages = client.GetMessages(run.ThreadId, new MessageCollectionOptions() { Order = ListOrder.NewestFirst }); + PageResult<ThreadMessage> firstPage = messagePages.GetCurrentPage(); + Assert.That(firstPage.Values.Count, Is.GreaterThan(1)); + Assert.That(firstPage.Values[0].Role, Is.EqualTo(MessageRole.Assistant)); + Assert.That(firstPage.Values[0].Content?[0], Is.Not.Null); + Assert.That(firstPage.Values[0].Content[0].Text.ToLowerInvariant(), Does.Contain("tacos")); + } + + [Test] + public async Task StreamingRunWorks() + { + AssistantClient client = GetTestClient(); + Assistant assistant = await client.CreateAssistantAsync("gpt-4o-mini"); + Validate(assistant); + + AssistantThread thread = await client.CreateThreadAsync(new ThreadCreationOptions() + { + InitialMessages = { "Hello there, assistant! How are you today?", }, + }); + Validate(thread); + + Stopwatch stopwatch = Stopwatch.StartNew(); + void Print(string message) => Console.WriteLine($"[{stopwatch.ElapsedMilliseconds,6}] {message}"); + + AsyncCollectionResult<StreamingUpdate> streamingResult + = client.CreateRunStreamingAsync(thread.Id, assistant.Id); + + Print(">>> Connected <<<"); + + await foreach (StreamingUpdate update in streamingResult) + { + string message = $"{update.UpdateKind} "; + if (update is RunUpdate runUpdate) + { + message += $"at {update.UpdateKind switch + { + StreamingUpdateReason.RunCreated => runUpdate.Value.CreatedAt, + StreamingUpdateReason.RunQueued => runUpdate.Value.StartedAt, + StreamingUpdateReason.RunInProgress => runUpdate.Value.StartedAt, + StreamingUpdateReason.RunCompleted => runUpdate.Value.CompletedAt, + _ => "???", + }}"; + } + if (update is MessageContentUpdate contentUpdate) + { + if (contentUpdate.Role.HasValue) + { + message += $"[{contentUpdate.Role}]"; + } + message += $"[{contentUpdate.MessageIndex}] {contentUpdate.Text}"; + } + Print(message); + } + Print(">>> Done <<<"); + } + + [TestCase] + public async Task StreamingToolCall() + { + AssistantClient client = GetTestClient(); + FunctionToolDefinition getWeatherTool = new() + { + FunctionName = "get_current_weather", + Description = "Gets the user's current weather", + }; + Assistant assistant = await client.CreateAssistantAsync("gpt-4o-mini", new() + { + Tools = { getWeatherTool } + }); + Validate(assistant); + + Stopwatch stopwatch = Stopwatch.StartNew(); + void Print(string message) => Console.WriteLine($"[{stopwatch.ElapsedMilliseconds,6}] {message}"); + + Print(" >>> Beginning call ... "); + AsyncCollectionResult<StreamingUpdate> asyncResults = client.CreateThreadAndRunStreamingAsync( + assistant, + new() + { + InitialMessages = { "What should I wear outside right now?", }, + }); + Print(" >>> Starting enumeration ..."); + + ThreadRun run = null; + + do + { + run = null; + List<ToolOutput> toolOutputs = []; + await foreach (StreamingUpdate update in asyncResults) + { + string message = update.UpdateKind.ToString(); + + if (update is RunUpdate runUpdate) + { + message += $" run_id:{runUpdate.Value.Id}"; + run = runUpdate.Value; + } + if (update is RequiredActionUpdate requiredActionUpdate) + { + Assert.That(requiredActionUpdate.FunctionName, Is.EqualTo(getWeatherTool.FunctionName)); + Assert.That(requiredActionUpdate.GetThreadRun().Status, Is.EqualTo(RunStatus.RequiresAction)); + message += $" {requiredActionUpdate.FunctionName}"; + toolOutputs.Add(new(requiredActionUpdate.ToolCallId, "warm and sunny")); + } + if (update is MessageContentUpdate contentUpdate) + { + message += $" {contentUpdate.Text}"; + } + Print(message); + } + if (toolOutputs.Count > 0) + { + asyncResults = client.SubmitToolOutputsToRunStreamingAsync(run, toolOutputs); + } + } while (run?.Status.IsTerminal == false); + } + + [Test] + public void BasicFileSearchWorks() + { + // First, we need to upload a simple test file. + FileClient fileClient = GetTestClient(TestScenario.Files); + OpenAIFileInfo testFile = fileClient.UploadFile( + BinaryData.FromString(""" + This file describes the favorite foods of several people.
+ + Summanus Ferdinand: tacos + Tekakwitha Effie: pizza + Filip Carola: cake + """).ToStream(), + "favorite_foods.txt", + FileUploadPurpose.Assistants); + Validate(testFile); + + AssistantClient client = GetTestClient(); + + // Create an assistant, using the creation helper to make a new vector store + Assistant assistant = client.CreateAssistant("gpt-4o-mini", new() + { + Tools = { new FileSearchToolDefinition() }, + ToolResources = new() + { + FileSearch = new() + { + NewVectorStores = + { + new VectorStoreCreationHelper([testFile.Id]), + } + } + } + }); + Validate(assistant); + Assert.That(assistant.ToolResources?.FileSearch?.VectorStoreIds, Has.Count.EqualTo(1)); + string createdVectorStoreId = assistant.ToolResources.FileSearch.VectorStoreIds[0]; + _vectorStoreIdsToDelete.Add(createdVectorStoreId); + + // Modify an assistant to use the existing vector store + assistant = client.ModifyAssistant(assistant, new AssistantModificationOptions() + { + ToolResources = new() + { + FileSearch = new() + { + VectorStoreIds = { assistant.ToolResources.FileSearch.VectorStoreIds[0] }, + }, + }, + }); + Assert.That(assistant.ToolResources?.FileSearch?.VectorStoreIds, Has.Count.EqualTo(1)); + Assert.That(assistant.ToolResources.FileSearch.VectorStoreIds[0], Is.EqualTo(createdVectorStoreId)); + + // Create a thread with an override vector store + AssistantThread thread = client.CreateThread(new ThreadCreationOptions() + { + InitialMessages = { "Using the files you have available, what's Filip's favorite food?" 
}, + ToolResources = new() + { + FileSearch = new() + { + NewVectorStores = + { + new VectorStoreCreationHelper([testFile.Id]) + } + } + } + }); + Validate(thread); + Assert.That(thread.ToolResources?.FileSearch?.VectorStoreIds, Has.Count.EqualTo(1)); + createdVectorStoreId = thread.ToolResources.FileSearch.VectorStoreIds[0]; + _vectorStoreIdsToDelete.Add(createdVectorStoreId); + + // Ensure that modifying the thread with an existing vector store works + thread = client.ModifyThread(thread, new ThreadModificationOptions() + { + ToolResources = new() + { + FileSearch = new() + { + VectorStoreIds = { createdVectorStoreId }, + } + } + }); + Assert.That(thread.ToolResources?.FileSearch?.VectorStoreIds, Has.Count.EqualTo(1)); + Assert.That(thread.ToolResources.FileSearch.VectorStoreIds[0], Is.EqualTo(createdVectorStoreId)); + + ThreadRun run = client.CreateRun(thread, assistant); + Validate(run); + do + { + Thread.Sleep(1000); + run = client.GetRun(run); + } while (run?.Status.IsTerminal == false); + Assert.That(run.Status, Is.EqualTo(RunStatus.Completed)); + + IEnumerable<ThreadMessage> messages = client.GetMessages(thread, new() { Order = ListOrder.NewestFirst }).GetAllValues(); + int messageCount = 0; + bool hasCake = false; + foreach (ThreadMessage message in messages) + { + messageCount++; + + foreach (MessageContent content in message.Content) + { + Console.WriteLine(content.Text); + foreach (TextAnnotation annotation in content.TextAnnotations) + { + Console.WriteLine($" --> From file: {annotation.InputFileId}, replacement: {annotation.TextToReplace}"); + } + + if (!hasCake) + { + hasCake = content.Text.ToLower().Contains("cake"); + } + } + } + Assert.That(messageCount > 1); + Assert.That(hasCake, Is.True); + } + + [Test] + public async Task Pagination_CanEnumerateAssistants() + { + AssistantClient client = GetTestClient(); + + // Create assistant collection + for (int i = 0; i < 10; i++) + { + Assistant assistant = client.CreateAssistant("gpt-4o-mini", new
AssistantCreationOptions() + { + Name = $"Test Assistant {i}", + }); + Validate(assistant); + Assert.That(assistant.Name, Is.EqualTo($"Test Assistant {i}")); + } + + // Page through collection + int count = 0; + IAsyncEnumerable<Assistant> assistants = client.GetAssistantsAsync(new AssistantCollectionOptions() { Order = ListOrder.NewestFirst }).GetAllValuesAsync(); + + int lastIdSeen = int.MaxValue; + + await foreach (Assistant assistant in assistants) + { + Console.WriteLine($"[{count,3}] {assistant.Id} {assistant.CreatedAt:s} {assistant.Name}"); + if (assistant.Name?.StartsWith("Test Assistant ") == true) + { + Assert.That(int.TryParse(assistant.Name["Test Assistant ".Length..], out int seenId), Is.True); + Assert.That(seenId, Is.LessThan(lastIdSeen)); + lastIdSeen = seenId; + } + count++; + if (lastIdSeen == 0 || count > 100) + { + break; + } + } + + Assert.That(count, Is.GreaterThanOrEqualTo(10)); + } + + [Test] + public async Task Pagination_CanPageThroughAssistantCollection() + { + AssistantClient client = GetTestClient(); + + // Create assistant collection + for (int i = 0; i < 10; i++) + { + Assistant assistant = client.CreateAssistant("gpt-4o-mini", new AssistantCreationOptions() + { + Name = $"Test Assistant {i}" + }); + Validate(assistant); + Assert.That(assistant.Name, Is.EqualTo($"Test Assistant {i}")); + } + + // Page through collection + int count = 0; + int pageCount = 0; + AsyncPageCollection<Assistant> pages = client.GetAssistantsAsync( + new AssistantCollectionOptions() + { + Order = ListOrder.NewestFirst, + PageSize = 2 + }); + + int lastIdSeen = int.MaxValue; + + await foreach (PageResult<Assistant> page in pages) + { + foreach (Assistant assistant in page.Values) + { + Console.WriteLine($"[{count,3}] {assistant.Id} {assistant.CreatedAt:s} {assistant.Name}"); + if (assistant.Name?.StartsWith("Test Assistant ") == true) + { + Assert.That(int.TryParse(assistant.Name["Test Assistant ".Length..], out int seenId), Is.True); + Assert.That(seenId, Is.LessThan(lastIdSeen)); + lastIdSeen = seenId; + } + count++; + } + + pageCount++; + if (lastIdSeen == 0 || count > 100) + { + break; + } + } + + Assert.That(count, Is.GreaterThanOrEqualTo(10)); + Assert.That(pageCount, Is.GreaterThanOrEqualTo(5)); + } + + [Test] + public async Task Pagination_CanRehydrateAssistantPageCollectionFromBytes() + { + AssistantClient client = GetTestClient(); + + // Create assistant collection + for (int i = 0; i < 10; i++) + { + Assistant assistant = client.CreateAssistant("gpt-4o-mini", new AssistantCreationOptions() + { + Name = $"Test Assistant {i}" + }); + Validate(assistant); + Assert.That(assistant.Name, Is.EqualTo($"Test Assistant {i}")); + } + + AsyncPageCollection<Assistant> pages = client.GetAssistantsAsync( + new AssistantCollectionOptions() + { + Order = ListOrder.NewestFirst, + PageSize = 2 + }); + + // Simulate rehydration of the collection + BinaryData rehydrationBytes = (await pages.GetCurrentPageAsync().ConfigureAwait(false)).PageToken.ToBytes(); + ContinuationToken rehydrationToken = ContinuationToken.FromBytes(rehydrationBytes); + + AsyncPageCollection<Assistant> rehydratedPages = client.GetAssistantsAsync(rehydrationToken); + + int count = 0; + int pageCount = 0; + int lastIdSeen = int.MaxValue; + + await foreach (PageResult<Assistant> page in rehydratedPages) + { + foreach (Assistant assistant in page.Values) + { + Console.WriteLine($"[{count,3}] {assistant.Id} {assistant.CreatedAt:s} {assistant.Name}"); + if (assistant.Name?.StartsWith("Test Assistant ") == true) + { + Assert.That(int.TryParse(assistant.Name["Test Assistant ".Length..], out int seenId), Is.True); + Assert.That(seenId, Is.LessThan(lastIdSeen)); + lastIdSeen = seenId; + } + count++; + } + + pageCount++; + if (lastIdSeen == 0 || count > 100) + { + break; + } + } + + Assert.That(count, Is.GreaterThanOrEqualTo(10)); + Assert.That(pageCount, Is.GreaterThanOrEqualTo(5)); + } + + [Test] + public async Task Pagination_CanRehydrateAssistantPageCollectionFromPageToken() + { + AssistantClient client = GetTestClient(); + + // Create assistant collection + for (int i = 0; i < 10; i++) + { + Assistant assistant = client.CreateAssistant("gpt-4o-mini", new AssistantCreationOptions() + { + Name = $"Test Assistant {i}" + }); + Validate(assistant); + Assert.That(assistant.Name, Is.EqualTo($"Test Assistant {i}")); + } + + AsyncPageCollection<Assistant> pages = client.GetAssistantsAsync( + new AssistantCollectionOptions() + { + Order = ListOrder.NewestFirst, + PageSize = 2 + }); + + // Call the rehydration method, passing a typed OpenAIPageToken + PageResult<Assistant> firstPage = await pages.GetCurrentPageAsync().ConfigureAwait(false); + AsyncPageCollection<Assistant> rehydratedPages = client.GetAssistantsAsync(firstPage.PageToken); + + int count = 0; + int pageCount = 0; + int lastIdSeen = int.MaxValue; + + await foreach (PageResult<Assistant> page in rehydratedPages) + { + foreach (Assistant assistant in page.Values) + { + Console.WriteLine($"[{count,3}] {assistant.Id} {assistant.CreatedAt:s} {assistant.Name}"); + if (assistant.Name?.StartsWith("Test Assistant ") == true) + { + Assert.That(int.TryParse(assistant.Name["Test Assistant ".Length..], out int seenId), Is.True); + Assert.That(seenId, Is.LessThan(lastIdSeen)); + lastIdSeen = seenId; + } + count++; + } + + pageCount++; + if (lastIdSeen == 0 || count > 100) + { + break; + } + } + + Assert.That(count, Is.GreaterThanOrEqualTo(10)); + Assert.That(pageCount, Is.GreaterThanOrEqualTo(5)); + } + + [Test] + public async Task Pagination_CanCastAssistantPageCollectionToConvenienceFromProtocol() + { + AssistantClient client = GetTestClient(); + + // Create assistant collection + for (int i = 0; i < 10; i++) + { + Assistant assistant = client.CreateAssistant("gpt-4o-mini", new AssistantCreationOptions() + { + Name = $"Test Assistant {i}" + }); + Validate(assistant); + Assert.That(assistant.Name, Is.EqualTo($"Test Assistant {i}")); + } + + // Call the protocol method + IAsyncEnumerable<ClientResult> pages = client.GetAssistantsAsync(limit: 2, order: "desc", after: null, before: null, options: default); + + // Cast to the convenience type + AsyncPageCollection<Assistant> assistantPages = (AsyncPageCollection<Assistant>)pages; + + int count = 0; + int pageCount = 0; + int lastIdSeen = int.MaxValue; + + await foreach (PageResult<Assistant> page in assistantPages) + { + foreach (Assistant assistant in page.Values) + { + Console.WriteLine($"[{count,3}] {assistant.Id} {assistant.CreatedAt:s} {assistant.Name}"); + if (assistant.Name?.StartsWith("Test Assistant ") == true) + { + Assert.That(int.TryParse(assistant.Name["Test Assistant ".Length..], out int seenId), Is.True); + Assert.That(seenId, Is.LessThan(lastIdSeen)); + lastIdSeen = seenId; + } + count++; + } + + pageCount++; + if (lastIdSeen == 0 || count > 100) + { + break; + } + } + + Assert.That(count, Is.GreaterThanOrEqualTo(10)); + Assert.That(pageCount, Is.GreaterThanOrEqualTo(5)); + } + + [Test] + public void Pagination_CanRehydrateRunStepPageCollectionFromBytes() + { + AssistantClient client = GetTestClient(); + Assistant assistant = client.CreateAssistant("gpt-4o", new AssistantCreationOptions() + { + Tools = { new CodeInterpreterToolDefinition() }, + Instructions = "You help the user with mathematical descriptions and visualizations.", + }); + Validate(assistant); + + FileClient fileClient = GetTestClient(TestScenario.Files); + OpenAIFileInfo equationFile = fileClient.UploadFile( + BinaryData.FromString(""" + x,y + 2,5 + 7,14, + 8,22 + """).ToStream(), + "text/csv", + FileUploadPurpose.Assistants); + Validate(equationFile); + + AssistantThread thread = client.CreateThread(new ThreadCreationOptions() + { + InitialMessages = + { + "Describe the contents of any available tool resource file." + + " Graph a linear regression and provide the coefficient of correlation."
+ " Explain any code executed to evaluate.",
+ },
+ ToolResources = new()
+ {
+ CodeInterpreter = new()
+ {
+ FileIds = { equationFile.Id },
+ }
+ }
+ });
+ Validate(thread);
+
+ ThreadRun run = client.CreateRun(thread, assistant);
+ Validate(run);
+
+ while (!run.Status.IsTerminal)
+ {
+ Thread.Sleep(1000);
+ run = client.GetRun(run);
+ }
+ Assert.That(run.Status, Is.EqualTo(RunStatus.Completed));
+ Assert.That(run.Usage?.TotalTokens, Is.GreaterThan(0));
+
+ PageCollection<RunStep> pages = client.GetRunSteps(run);
+ IEnumerator<PageResult<RunStep>> pageEnumerator = ((IEnumerable<PageResult<RunStep>>)pages).GetEnumerator();
+
+ // Simulate rehydration of the collection
+ BinaryData rehydrationBytes = pages.GetCurrentPage().PageToken.ToBytes();
+ ContinuationToken rehydrationToken = ContinuationToken.FromBytes(rehydrationBytes);
+
+ PageCollection<RunStep> rehydratedPages = client.GetRunSteps(rehydrationToken);
+ IEnumerator<PageResult<RunStep>> rehydratedPageEnumerator = ((IEnumerable<PageResult<RunStep>>)rehydratedPages).GetEnumerator();
+
+ int pageCount = 0;
+
+ while (pageEnumerator.MoveNext() && rehydratedPageEnumerator.MoveNext())
+ {
+ PageResult<RunStep> page = pageEnumerator.Current;
+ PageResult<RunStep> rehydratedPage = rehydratedPageEnumerator.Current;
+
+ Assert.AreEqual(page.Values.Count, rehydratedPage.Values.Count);
+
+ for (int i = 0; i < page.Values.Count; i++)
+ {
+ Assert.AreEqual(page.Values[i].Id, rehydratedPage.Values[i].Id);
+ }
+
+ pageCount++;
+ }
+
+ Assert.That(pageCount, Is.GreaterThanOrEqualTo(1));
+ }
+
+ [Test]
+ public async Task MessagesWithRoles()
+ {
+ AssistantClient client = GetTestClient();
+ const string userMessageText = "Hello, assistant!";
+ const string assistantMessageText = "Hi there, user.";
+ AssistantThread thread = await client.CreateThreadAsync(new ThreadCreationOptions()
+ {
+ InitialMessages =
+ {
+ new ThreadInitializationMessage(MessageRole.User, [userMessageText]),
+ new ThreadInitializationMessage(MessageRole.Assistant, [assistantMessageText]),
+ }
+ });
+ Validate(thread);
+ List<ThreadMessage> messages = [];
+ async Task
RefreshMessageListAsync()
+ {
+ messages.Clear();
+ await foreach (ThreadMessage message in client.GetMessagesAsync(thread).GetAllValuesAsync())
+ {
+ messages.Add(message);
+ }
+ }
+ await RefreshMessageListAsync();
+ Assert.That(messages.Count, Is.EqualTo(2));
+ Assert.That(messages[1].Role, Is.EqualTo(MessageRole.User));
+ Assert.That(messages[1].Content[0].Text, Is.EqualTo(userMessageText));
+ Assert.That(messages[0].Role, Is.EqualTo(MessageRole.Assistant));
+ Assert.That(messages[0].Content[0].Text, Is.EqualTo(assistantMessageText));
+ ThreadMessage userMessage = await client.CreateMessageAsync(
+ thread,
+ MessageRole.User,
+ [
+ MessageContent.FromText(userMessageText)
+ ]);
+ ThreadMessage assistantMessage = await client.CreateMessageAsync(
+ thread,
+ MessageRole.Assistant,
+ [assistantMessageText]);
+ await RefreshMessageListAsync();
+ Assert.That(messages.Count, Is.EqualTo(4));
+ Assert.That(messages[3].Role, Is.EqualTo(MessageRole.User));
+ Assert.That(messages[3].Content[0].Text, Is.EqualTo(userMessageText));
+ Assert.That(messages[2].Role, Is.EqualTo(MessageRole.Assistant));
+ Assert.That(messages[2].Content[0].Text, Is.EqualTo(assistantMessageText));
+ }
+
+ /// <summary>
+ /// Performs basic, invariant validation of a target that was just instantiated from its corresponding origination
+ /// mechanism. If applicable, the instance is recorded into the test run for cleanup of persistent resources.
+ /// </summary>
+ /// <typeparam name="T">Instance type being validated.</typeparam>
+ /// <param name="target">The instance to validate.</param>
+ /// <exception cref="NotImplementedException">The provided instance type isn't supported.</exception>
+ private void Validate<T>(T target)
+ {
+ if (target is Assistant assistant)
+ {
+ Assert.That(assistant?.Id, Is.Not.Null);
+ _assistantsToDelete.Add(assistant);
+ }
+ else if (target is AssistantThread thread)
+ {
+ Assert.That(thread?.Id, Is.Not.Null);
+ _threadsToDelete.Add(thread);
+ }
+ else if (target is ThreadMessage message)
+ {
+ Assert.That(message?.Id, Is.Not.Null);
+ _messagesToDelete.Add(message);
+ }
+ else if (target is ThreadRun run)
+ {
+ Assert.That(run?.Id, Is.Not.Null);
+ }
+ else if (target is OpenAIFileInfo file)
+ {
+ Assert.That(file?.Id, Is.Not.Null);
+ _filesToDelete.Add(file);
+ }
+ else
+ {
+ throw new NotImplementedException($"{nameof(Validate)} helper not implemented for: {typeof(T)}");
+ }
+ }
+
+ private readonly List<Assistant> _assistantsToDelete = [];
+ private readonly List<AssistantThread> _threadsToDelete = [];
+ private readonly List<ThreadMessage> _messagesToDelete = [];
+ private readonly List<OpenAIFileInfo> _filesToDelete = [];
+ private readonly List<string> _vectorStoreIdsToDelete = [];
+
+ private static AssistantClient GetTestClient() => GetTestClient<AssistantClient>(TestScenario.Assistants);
+
+ private static readonly DateTimeOffset s_2024 = new(2024, 1, 1, 0, 0, 0, TimeSpan.Zero);
+ private static readonly string s_testAssistantName = $".NET SDK Test Assistant - Please Delete Me";
+ private static readonly string s_cleanupMetadataKey = $"test_metadata_cleanup_eligible";
+}
+
+#pragma warning restore OPENAI001
diff --git a/.dotnet/tests/Assistants/VectorStoreTests.cs b/.dotnet/tests/Assistants/VectorStoreTests.cs
new file mode 100644
index 000000000..d13b1b081
--- /dev/null
+++ b/.dotnet/tests/Assistants/VectorStoreTests.cs
@@ -0,0 +1,453 @@
+using NUnit.Framework;
+using OpenAI.Files;
+using OpenAI.VectorStores;
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Linq;
+using System.Threading;
+using
System.Threading.Tasks; +using static OpenAI.Tests.TestHelpers; + +namespace OpenAI.Tests.VectorStores; + +#pragma warning disable OPENAI001 + +[Parallelizable(ParallelScope.Fixtures)] +[Category("Assistants")] +public partial class VectorStoreTests +{ + [Test] + public void CanCreateGetAndDeleteVectorStores() + { + VectorStoreClient client = GetTestClient(); + + VectorStore vectorStore = client.CreateVectorStore(); + Validate(vectorStore); + bool deleted = client.DeleteVectorStore(vectorStore); + Assert.That(deleted, Is.True); + _vectorStoresToDelete.RemoveAt(_vectorStoresToDelete.Count - 1); + + IReadOnlyList testFiles = GetNewTestFiles(5); + + vectorStore = client.CreateVectorStore(new VectorStoreCreationOptions() + { + FileIds = { testFiles[0].Id }, + Name = "test vector store", + ExpirationPolicy = new VectorStoreExpirationPolicy() + { + Anchor = VectorStoreExpirationAnchor.LastActiveAt, + Days = 3, + }, + Metadata = + { + ["test-key"] = "test-value", + }, + }); + Validate(vectorStore); + Assert.Multiple(() => + { + Assert.That(vectorStore.Name, Is.EqualTo("test vector store")); + Assert.That(vectorStore.ExpirationPolicy?.Anchor, Is.EqualTo(VectorStoreExpirationAnchor.LastActiveAt)); + Assert.That(vectorStore.ExpirationPolicy?.Days, Is.EqualTo(3)); + Assert.That(vectorStore.FileCounts.Total, Is.EqualTo(1)); + Assert.That(vectorStore.CreatedAt, Is.GreaterThan(s_2024)); + Assert.That(vectorStore.ExpiresAt, Is.GreaterThan(s_2024)); + Assert.That(vectorStore.Status, Is.EqualTo(VectorStoreStatus.InProgress)); + Assert.That(vectorStore.Metadata?.TryGetValue("test-key", out string metadataValue) == true && metadataValue == "test-value"); + }); + vectorStore = client.GetVectorStore(vectorStore); + Assert.Multiple(() => + { + Assert.That(vectorStore.Name, Is.EqualTo("test vector store")); + Assert.That(vectorStore.ExpirationPolicy?.Anchor, Is.EqualTo(VectorStoreExpirationAnchor.LastActiveAt)); + Assert.That(vectorStore.ExpirationPolicy?.Days, Is.EqualTo(3)); + 
Assert.That(vectorStore.FileCounts.Total, Is.EqualTo(1)); + Assert.That(vectorStore.CreatedAt, Is.GreaterThan(s_2024)); + Assert.That(vectorStore.ExpiresAt, Is.GreaterThan(s_2024)); + Assert.That(vectorStore.Metadata?.TryGetValue("test-key", out string metadataValue) == true && metadataValue == "test-value"); + }); + + deleted = client.DeleteVectorStore(vectorStore.Id); + Assert.That(deleted, Is.True); + _vectorStoresToDelete.RemoveAt(_vectorStoresToDelete.Count - 1); + + vectorStore = client.CreateVectorStore(new VectorStoreCreationOptions() + { + FileIds = testFiles.Select(file => file.Id).ToList() + }); + Validate(vectorStore); + Assert.Multiple(() => + { + Assert.That(vectorStore.Name, Is.Null.Or.Empty); + Assert.That(vectorStore.FileCounts.Total, Is.EqualTo(5)); + }); + } + + [Test] + public void CanEnumerateVectorStores() + { + VectorStoreClient client = GetTestClient(); + for (int i = 0; i < 10; i++) + { + VectorStore vectorStore = client.CreateVectorStore(new VectorStoreCreationOptions() + { + Name = $"Test Vector Store {i}", + }); + Validate(vectorStore); + Assert.That(vectorStore.Name, Is.EqualTo($"Test Vector Store {i}")); + } + + int lastIdSeen = int.MaxValue; + int count = 0; + + foreach (VectorStore vectorStore in client.GetVectorStores(new VectorStoreCollectionOptions() { Order = ListOrder.NewestFirst }).GetAllValues()) + { + Assert.That(vectorStore.Id, Is.Not.Null); + if (vectorStore.Name?.StartsWith("Test Vector Store ") == true) + { + string idString = vectorStore.Name["Test Vector Store ".Length..]; + + Assert.That(int.TryParse(idString, out int seenId), Is.True); + Assert.That(seenId, Is.LessThan(lastIdSeen)); + lastIdSeen = seenId; + } + if (lastIdSeen == 0 || ++count >= 100) + { + break; + } + } + + Assert.That(lastIdSeen, Is.EqualTo(0)); + } + + [Test] + public async Task CanEnumerateVectorStoresAsync() + { + VectorStoreClient client = GetTestClient(); + for (int i = 0; i < 10; i++) + { + VectorStore vectorStore = await 
client.CreateVectorStoreAsync(new VectorStoreCreationOptions() + { + Name = $"Test Vector Store {i}", + }); + Validate(vectorStore); + Assert.That(vectorStore.Name, Is.EqualTo($"Test Vector Store {i}")); + } + + int lastIdSeen = int.MaxValue; + int count = 0; + + await foreach (VectorStore vectorStore in client.GetVectorStoresAsync(new VectorStoreCollectionOptions() { Order = ListOrder.NewestFirst }).GetAllValuesAsync()) + { + Assert.That(vectorStore.Id, Is.Not.Null); + if (vectorStore.Name?.StartsWith("Test Vector Store ") == true) + { + string idString = vectorStore.Name["Test Vector Store ".Length..]; + + Assert.That(int.TryParse(idString, out int seenId), Is.True); + Assert.That(seenId, Is.LessThan(lastIdSeen)); + lastIdSeen = seenId; + } + if (lastIdSeen == 0 || ++count >= 100) + { + break; + } + } + + Assert.That(lastIdSeen, Is.EqualTo(0)); + } + + [Test] + public void CanAssociateFiles() + { + VectorStoreClient client = GetTestClient(); + VectorStore vectorStore = client.CreateVectorStore(); + Validate(vectorStore); + + IReadOnlyList files = GetNewTestFiles(3); + + foreach (OpenAIFileInfo file in files) + { + VectorStoreFileAssociation association = client.AddFileToVectorStore(vectorStore, file); + Validate(association); + Assert.Multiple(() => + { + Assert.That(association.FileId, Is.EqualTo(file.Id)); + Assert.That(association.VectorStoreId, Is.EqualTo(vectorStore.Id)); + Assert.That(association.LastError, Is.Null); + Assert.That(association.CreatedAt, Is.GreaterThan(s_2024)); + Assert.That(association.Status, Is.EqualTo(VectorStoreFileAssociationStatus.InProgress)); + }); + } + + bool removed = client.RemoveFileFromStore(vectorStore, files[0]); + Assert.True(removed); + _associationsToRemove.RemoveAt(0); + + // Errata: removals aren't immediately reflected when requesting the list + Thread.Sleep(2000); + + int count = 0; + foreach (VectorStoreFileAssociation association in client.GetFileAssociations(vectorStore).GetAllValues()) + { + count++; + 
Assert.That(association.FileId, Is.Not.EqualTo(files[0].Id));
+ Assert.That(association.VectorStoreId, Is.EqualTo(vectorStore.Id));
+ }
+ Assert.That(count, Is.EqualTo(2));
+ }
+
+ [Test]
+ public void Pagination_CanRehydrateFileAssociationCollection()
+ {
+ VectorStoreClient client = GetTestClient();
+ VectorStore vectorStore = client.CreateVectorStore();
+ Validate(vectorStore);
+
+ IReadOnlyList<OpenAIFileInfo> files = GetNewTestFiles(3);
+
+ foreach (OpenAIFileInfo file in files)
+ {
+ VectorStoreFileAssociation association = client.AddFileToVectorStore(vectorStore, file);
+ Validate(association);
+ Assert.Multiple(() =>
+ {
+ Assert.That(association.FileId, Is.EqualTo(file.Id));
+ Assert.That(association.VectorStoreId, Is.EqualTo(vectorStore.Id));
+ Assert.That(association.LastError, Is.Null);
+ Assert.That(association.CreatedAt, Is.GreaterThan(s_2024));
+ Assert.That(association.Status, Is.EqualTo(VectorStoreFileAssociationStatus.InProgress));
+ });
+ }
+
+ bool removed = client.RemoveFileFromStore(vectorStore, files[0]);
+ Assert.True(removed);
+ _associationsToRemove.RemoveAt(0);
+
+ // Errata: removals aren't immediately reflected when requesting the list
+ Thread.Sleep(2000);
+
+ PageCollection<VectorStoreFileAssociation> pages = client.GetFileAssociations(vectorStore);
+ IEnumerator<PageResult<VectorStoreFileAssociation>> pageEnumerator = ((IEnumerable<PageResult<VectorStoreFileAssociation>>)pages).GetEnumerator();
+
+ // Simulate rehydration of the collection
+ BinaryData rehydrationBytes = pages.GetCurrentPage().PageToken.ToBytes();
+ ContinuationToken rehydrationToken = ContinuationToken.FromBytes(rehydrationBytes);
+
+ PageCollection<VectorStoreFileAssociation> rehydratedPages = client.GetFileAssociations(rehydrationToken);
+ IEnumerator<PageResult<VectorStoreFileAssociation>> rehydratedPageEnumerator = ((IEnumerable<PageResult<VectorStoreFileAssociation>>)rehydratedPages).GetEnumerator();
+
+ int pageCount = 0;
+
+ while (pageEnumerator.MoveNext() && rehydratedPageEnumerator.MoveNext())
+ {
+ PageResult<VectorStoreFileAssociation> page = pageEnumerator.Current;
+ PageResult<VectorStoreFileAssociation> rehydratedPage = rehydratedPageEnumerator.Current;
+
+ Assert.AreEqual(page.Values.Count, rehydratedPage.Values.Count);
+
+ for
(int i = 0; i < page.Values.Count; i++)
+ {
+ Assert.AreEqual(page.Values[i].FileId, rehydratedPage.Values[i].FileId);
+ Assert.AreEqual(page.Values[i].VectorStoreId, rehydratedPage.Values[i].VectorStoreId);
+ Assert.AreEqual(page.Values[i].CreatedAt, rehydratedPage.Values[i].CreatedAt);
+ Assert.AreEqual(page.Values[i].Size, rehydratedPage.Values[i].Size);
+ }
+
+ pageCount++;
+ }
+
+ Assert.That(pageCount, Is.GreaterThanOrEqualTo(1));
+ }
+
+ [Test]
+ public void CanUseBatchIngestion()
+ {
+ VectorStoreClient client = GetTestClient();
+ VectorStore vectorStore = client.CreateVectorStore();
+ Validate(vectorStore);
+
+ IReadOnlyList<OpenAIFileInfo> testFiles = GetNewTestFiles(5);
+
+ VectorStoreBatchFileJob batchJob = client.CreateBatchFileJob(vectorStore, testFiles);
+ Validate(batchJob);
+
+ Assert.Multiple(() =>
+ {
+ Assert.That(batchJob.BatchId, Is.Not.Null);
+ Assert.That(batchJob.VectorStoreId, Is.EqualTo(vectorStore.Id));
+ Assert.That(batchJob.Status, Is.EqualTo(VectorStoreBatchFileJobStatus.InProgress));
+ });
+
+ for (int i = 0; i < 10 && client.GetBatchFileJob(batchJob).Value.Status != VectorStoreBatchFileJobStatus.Completed; i++)
+ {
+ Thread.Sleep(500);
+ }
+
+ foreach (VectorStoreFileAssociation association in client.GetFileAssociations(batchJob).GetAllValues())
+ {
+ Assert.Multiple(() =>
+ {
+ Assert.That(association.FileId, Is.Not.Null);
+ Assert.That(association.VectorStoreId, Is.EqualTo(vectorStore.Id));
+ Assert.That(association.Status, Is.EqualTo(VectorStoreFileAssociationStatus.Completed));
+ // Assert.That(association.Size, Is.GreaterThan(0));
+ Assert.That(association.CreatedAt, Is.GreaterThan(s_2024));
+ Assert.That(association.LastError, Is.Null);
+ });
+ }
+ }
+
+ public enum ChunkingStrategyKind { Auto, Static }
+
+ [Test]
+ [TestCase(ChunkingStrategyKind.Auto)]
+ [TestCase(ChunkingStrategyKind.Static)]
+ public async Task CanApplyChunkingStrategy(ChunkingStrategyKind strategyKind)
+ {
+ IReadOnlyList<OpenAIFileInfo> testFiles = GetNewTestFiles(5);
+
VectorStoreClient client = GetTestClient();
+
+ FileChunkingStrategy chunkingStrategy = strategyKind switch
+ {
+ ChunkingStrategyKind.Auto => FileChunkingStrategy.Auto,
+ ChunkingStrategyKind.Static => FileChunkingStrategy.CreateStaticStrategy(1200, 250),
+ _ => throw new NotImplementedException(),
+ };
+
+ if (chunkingStrategy is StaticFileChunkingStrategy inputStaticStrategy)
+ {
+ Assert.That(inputStaticStrategy.MaxTokensPerChunk, Is.EqualTo(1200));
+ Assert.That(inputStaticStrategy.OverlappingTokenCount, Is.EqualTo(250));
+ }
+
+ VectorStore vectorStore = await client.CreateVectorStoreAsync(new VectorStoreCreationOptions()
+ {
+ FileIds = testFiles.Select(file => file.Id).ToList(),
+ ChunkingStrategy = chunkingStrategy,
+ });
+ Validate(vectorStore);
+ Assert.That(vectorStore.FileCounts.Total, Is.EqualTo(5));
+
+ AsyncPageCollection<VectorStoreFileAssociation> associations = client.GetFileAssociationsAsync(vectorStore);
+
+ await foreach (VectorStoreFileAssociation association in associations.GetAllValuesAsync())
+ {
+ Assert.That(testFiles.Any(file => file.Id == association.FileId), Is.True);
+ Assert.That(association.ChunkingStrategy, Is.InstanceOf<StaticFileChunkingStrategy>());
+ StaticFileChunkingStrategy staticStrategy = association.ChunkingStrategy as StaticFileChunkingStrategy;
+
+ Assert.That(staticStrategy.MaxTokensPerChunk, Is.EqualTo(strategyKind switch
+ {
+ ChunkingStrategyKind.Auto => 800,
+ ChunkingStrategyKind.Static => 1200,
+ _ => throw new NotImplementedException()
+ }));
+ Assert.That(staticStrategy.OverlappingTokenCount, Is.EqualTo(strategyKind switch
+ {
+ ChunkingStrategyKind.Auto => 400,
+ ChunkingStrategyKind.Static => 250,
+ _ => throw new NotImplementedException()
+ }));
+ }
+ }
+
+ private IReadOnlyList<OpenAIFileInfo> GetNewTestFiles(int count)
+ {
+ List<OpenAIFileInfo> files = [];
+
+ FileClient client = GetTestClient<FileClient>(TestScenario.Files);
+ for (int i = 0; i < count; i++)
+ {
+ OpenAIFileInfo file = client.UploadFile(
+ BinaryData.FromString("This is a test file").ToStream(),
+ $"test_file_{i.ToString().PadLeft(3,
'0')}.txt",
+ FileUploadPurpose.Assistants);
+ Validate(file);
+ files.Add(file);
+ }
+
+ return files;
+ }
+
+ [TearDown]
+ protected void Cleanup()
+ {
+ FileClient fileClient = GetTestClient<FileClient>(TestScenario.Files);
+ VectorStoreClient vectorStoreClient = GetTestClient<VectorStoreClient>(TestScenario.VectorStores);
+ RequestOptions requestOptions = new()
+ {
+ ErrorOptions = ClientErrorBehaviors.NoThrow,
+ };
+ foreach (VectorStoreBatchFileJob job in _jobsToCancel)
+ {
+ ClientResult protocolResult = vectorStoreClient.CancelBatchFileJob(job.VectorStoreId, job.BatchId, requestOptions);
+ Console.WriteLine($"Cleanup: {job.BatchId} => {protocolResult?.GetRawResponse()?.Status}");
+ }
+ foreach (VectorStoreFileAssociation association in _associationsToRemove)
+ {
+ ClientResult protocolResult = vectorStoreClient.RemoveFileFromStore(association.VectorStoreId, association.FileId, requestOptions);
+ Console.WriteLine($"Cleanup: {association.FileId}<->{association.VectorStoreId} => {protocolResult?.GetRawResponse()?.Status}");
+ }
+ foreach (OpenAIFileInfo file in _filesToDelete)
+ {
+ Console.WriteLine($"Cleanup: {file.Id} -> {fileClient.DeleteFile(file.Id, requestOptions)?.GetRawResponse()?.Status}");
+ }
+ foreach (VectorStore vectorStore in _vectorStoresToDelete)
+ {
+ Console.WriteLine($"Cleanup: {vectorStore.Id} => {vectorStoreClient.DeleteVectorStore(vectorStore.Id, requestOptions)?.GetRawResponse()?.Status}");
+ }
+ _filesToDelete.Clear();
+ _vectorStoresToDelete.Clear();
+ }
+
+ /// <summary>
+ /// Performs basic, invariant validation of a target that was just instantiated from its corresponding origination
+ /// mechanism. If applicable, the instance is recorded into the test run for cleanup of persistent resources.
+ /// </summary>
+ /// <typeparam name="T">Instance type being validated.</typeparam>
+ /// <param name="target">The instance to validate.</param>
+ /// <exception cref="NotImplementedException">The provided instance type isn't supported.</exception>
+ private void Validate<T>(T target)
+ {
+ if (target is VectorStoreBatchFileJob job)
+ {
+ Assert.That(job.BatchId, Is.Not.Null);
+ _jobsToCancel.Add(job);
+ }
+ else if (target is VectorStoreFileAssociation association)
+ {
+ Assert.That(association?.FileId, Is.Not.Null);
+ Assert.That(association?.VectorStoreId, Is.Not.Null);
+ _associationsToRemove.Add(association);
+ }
+ else if (target is OpenAIFileInfo file)
+ {
+ Assert.That(file?.Id, Is.Not.Null);
+ _filesToDelete.Add(file);
+ }
+ else if (target is VectorStore vectorStore)
+ {
+ Assert.That(vectorStore?.Id, Is.Not.Null);
+ _vectorStoresToDelete.Add(vectorStore);
+ }
+ else
+ {
+ throw new NotImplementedException($"{nameof(Validate)} helper not implemented for: {typeof(T)}");
+ }
+ }
+
+ private readonly List<VectorStoreBatchFileJob> _jobsToCancel = [];
+ private readonly List<VectorStoreFileAssociation> _associationsToRemove = [];
+ private readonly List<OpenAIFileInfo> _filesToDelete = [];
+ private readonly List<VectorStore> _vectorStoresToDelete = [];
+
+ private static VectorStoreClient GetTestClient() => GetTestClient<VectorStoreClient>(TestScenario.VectorStores);
+
+ private static readonly DateTimeOffset s_2024 = new(2024, 1, 1, 0, 0, 0, TimeSpan.Zero);
+}
+
+#pragma warning restore OPENAI001
diff --git a/.dotnet/tests/Audio/OpenAIAudioModelFactoryTests.cs b/.dotnet/tests/Audio/OpenAIAudioModelFactoryTests.cs
new file mode 100644
index 000000000..717509f3c
--- /dev/null
+++ b/.dotnet/tests/Audio/OpenAIAudioModelFactoryTests.cs
@@ -0,0 +1,397 @@
+using System;
+using System.Collections.Generic;
+using System.Linq;
+using NUnit.Framework;
+using OpenAI.Audio;
+
+namespace OpenAI.Tests.Audio;
+
+[Parallelizable(ParallelScope.All)]
+[Category("Smoke")]
+public partial class OpenAIAudioModelFactoryTests
+{
+ [Test]
+ public void AudioTranscriptionWithNoPropertiesWorks()
+ {
+ AudioTranscription audioTranscription = OpenAIAudioModelFactory.AudioTranscription();
+
+ Assert.That(audioTranscription.Language, Is.Null);
+ Assert.That(audioTranscription.Duration, Is.Null);
+ Assert.That(audioTranscription.Text,
Is.Null); + Assert.That(audioTranscription.Words, Is.Not.Null.And.Empty); + Assert.That(audioTranscription.Segments, Is.Not.Null.And.Empty); + } + + [Test] + public void AudioTranscriptionWithLanguageWorks() + { + string language = "esperanto"; + AudioTranscription audioTranscription = OpenAIAudioModelFactory.AudioTranscription(language: language); + + Assert.That(audioTranscription.Language, Is.EqualTo(language)); + Assert.That(audioTranscription.Duration, Is.Null); + Assert.That(audioTranscription.Text, Is.Null); + Assert.That(audioTranscription.Words, Is.Not.Null.And.Empty); + Assert.That(audioTranscription.Segments, Is.Not.Null.And.Empty); + } + + [Test] + public void AudioTranscriptionWithDurationWorks() + { + TimeSpan duration = TimeSpan.FromSeconds(45); + AudioTranscription audioTranscription = OpenAIAudioModelFactory.AudioTranscription(duration: duration); + + Assert.That(audioTranscription.Language, Is.Null); + Assert.That(audioTranscription.Duration, Is.EqualTo(duration)); + Assert.That(audioTranscription.Text, Is.Null); + Assert.That(audioTranscription.Words, Is.Not.Null.And.Empty); + Assert.That(audioTranscription.Segments, Is.Not.Null.And.Empty); + } + + [Test] + public void AudioTranscriptionWithTextWorks() + { + string text = "I've been transcripted."; + AudioTranscription audioTranscription = OpenAIAudioModelFactory.AudioTranscription(text: text); + + Assert.That(audioTranscription.Language, Is.Null); + Assert.That(audioTranscription.Duration, Is.Null); + Assert.That(audioTranscription.Text, Is.EqualTo(text)); + Assert.That(audioTranscription.Words, Is.Not.Null.And.Empty); + Assert.That(audioTranscription.Segments, Is.Not.Null.And.Empty); + } + + [Test] + public void AudioTranscriptionWithWordsWorks() + { + IEnumerable words = [ + OpenAIAudioModelFactory.TranscribedWord(word: "Apple"), + OpenAIAudioModelFactory.TranscribedWord(word: "pie") + ]; + AudioTranscription audioTranscription = OpenAIAudioModelFactory.AudioTranscription(words: words); + + 
Assert.That(audioTranscription.Language, Is.Null); + Assert.That(audioTranscription.Duration, Is.Null); + Assert.That(audioTranscription.Text, Is.Null); + Assert.That(audioTranscription.Words.SequenceEqual(words), Is.True); + Assert.That(audioTranscription.Segments, Is.Not.Null.And.Empty); + } + + [Test] + public void AudioTranscriptionWithSegmentsWorks() + { + IEnumerable segments = [ + OpenAIAudioModelFactory.TranscribedSegment(id: 1), + OpenAIAudioModelFactory.TranscribedSegment(id: 2) + ]; + AudioTranscription audioTranscription = OpenAIAudioModelFactory.AudioTranscription(segments: segments); + + Assert.That(audioTranscription.Language, Is.Null); + Assert.That(audioTranscription.Duration, Is.Null); + Assert.That(audioTranscription.Text, Is.Null); + Assert.That(audioTranscription.Words, Is.Not.Null.And.Empty); + Assert.That(audioTranscription.Segments.SequenceEqual(segments), Is.True); + } + + [Test] + public void AudioTranslationWithNoPropertiesWorks() + { + AudioTranslation audioTranslation = OpenAIAudioModelFactory.AudioTranslation(); + + Assert.That(audioTranslation.Language, Is.Null); + Assert.That(audioTranslation.Duration, Is.Null); + Assert.That(audioTranslation.Text, Is.Null); + Assert.That(audioTranslation.Segments, Is.Not.Null.And.Empty); + } + + [Test] + public void AudioTranslationWithLanguageWorks() + { + string language = "esperanto"; + AudioTranslation audioTranslation = OpenAIAudioModelFactory.AudioTranslation(language: language); + + Assert.That(audioTranslation.Language, Is.EqualTo(language)); + Assert.That(audioTranslation.Duration, Is.Null); + Assert.That(audioTranslation.Text, Is.Null); + Assert.That(audioTranslation.Segments, Is.Not.Null.And.Empty); + } + + [Test] + public void AudioTranslationWithDurationWorks() + { + TimeSpan duration = TimeSpan.FromSeconds(45); + AudioTranslation audioTranslation = OpenAIAudioModelFactory.AudioTranslation(duration: duration); + + Assert.That(audioTranslation.Language, Is.Null); + 
Assert.That(audioTranslation.Duration, Is.EqualTo(duration)); + Assert.That(audioTranslation.Text, Is.Null); + Assert.That(audioTranslation.Segments, Is.Not.Null.And.Empty); + } + + [Test] + public void AudioTranslationWithTextWorks() + { + string text = "I've been transcripted."; + AudioTranslation audioTranslation = OpenAIAudioModelFactory.AudioTranslation(text: text); + + Assert.That(audioTranslation.Language, Is.Null); + Assert.That(audioTranslation.Duration, Is.Null); + Assert.That(audioTranslation.Text, Is.EqualTo(text)); + Assert.That(audioTranslation.Segments, Is.Not.Null.And.Empty); + } + + [Test] + public void AudioTranslationWithSegmentsWorks() + { + IEnumerable segments = [ + OpenAIAudioModelFactory.TranscribedSegment(id: 1), + OpenAIAudioModelFactory.TranscribedSegment(id: 2) + ]; + AudioTranslation audioTranslation = OpenAIAudioModelFactory.AudioTranslation(segments: segments); + + Assert.That(audioTranslation.Language, Is.Null); + Assert.That(audioTranslation.Duration, Is.Null); + Assert.That(audioTranslation.Text, Is.Null); + Assert.That(audioTranslation.Segments.SequenceEqual(segments), Is.True); + } + + [Test] + public void TranscribedSegmentWithNoPropertiesWorks() + { + TranscribedSegment transcribedSegment = OpenAIAudioModelFactory.TranscribedSegment(); + + Assert.That(transcribedSegment.Id, Is.EqualTo(default(int))); + Assert.That(transcribedSegment.SeekOffset, Is.EqualTo(default(long))); + Assert.That(transcribedSegment.Start, Is.EqualTo(default(TimeSpan))); + Assert.That(transcribedSegment.End, Is.EqualTo(default(TimeSpan))); + Assert.That(transcribedSegment.Text, Is.Null); + Assert.That(transcribedSegment.TokenIds, Is.Not.Null.And.Empty); + Assert.That(transcribedSegment.Temperature, Is.EqualTo(default(float))); + Assert.That(transcribedSegment.AverageLogProbability, Is.EqualTo(default(double))); + Assert.That(transcribedSegment.CompressionRatio, Is.EqualTo(default(float))); + Assert.That(transcribedSegment.NoSpeechProbability, 
Is.EqualTo(default(double))); + } + + [Test] + public void TranscribedSegmentWithIdWorks() + { + int id = 10; + TranscribedSegment transcribedSegment = OpenAIAudioModelFactory.TranscribedSegment(id: id); + + Assert.That(transcribedSegment.Id, Is.EqualTo(id)); + Assert.That(transcribedSegment.SeekOffset, Is.EqualTo(default(long))); + Assert.That(transcribedSegment.Start, Is.EqualTo(default(TimeSpan))); + Assert.That(transcribedSegment.End, Is.EqualTo(default(TimeSpan))); + Assert.That(transcribedSegment.Text, Is.Null); + Assert.That(transcribedSegment.TokenIds, Is.Not.Null.And.Empty); + Assert.That(transcribedSegment.Temperature, Is.EqualTo(default(float))); + Assert.That(transcribedSegment.AverageLogProbability, Is.EqualTo(default(double))); + Assert.That(transcribedSegment.CompressionRatio, Is.EqualTo(default(float))); + Assert.That(transcribedSegment.NoSpeechProbability, Is.EqualTo(default(double))); + } + + [Test] + public void TranscribedSegmentWithSeekOffsetWorks() + { + long seekOffset = 9000000000; + TranscribedSegment transcribedSegment = OpenAIAudioModelFactory.TranscribedSegment(seekOffset: seekOffset); + + Assert.That(transcribedSegment.Id, Is.EqualTo(default(int))); + Assert.That(transcribedSegment.SeekOffset, Is.EqualTo(seekOffset)); + Assert.That(transcribedSegment.Start, Is.EqualTo(default(TimeSpan))); + Assert.That(transcribedSegment.End, Is.EqualTo(default(TimeSpan))); + Assert.That(transcribedSegment.Text, Is.Null); + Assert.That(transcribedSegment.TokenIds, Is.Not.Null.And.Empty); + Assert.That(transcribedSegment.Temperature, Is.EqualTo(default(float))); + Assert.That(transcribedSegment.AverageLogProbability, Is.EqualTo(default(double))); + Assert.That(transcribedSegment.CompressionRatio, Is.EqualTo(default(float))); + Assert.That(transcribedSegment.NoSpeechProbability, Is.EqualTo(default(double))); + } + + [Test] + public void TranscribedSegmentWithStartWorks() + { + TimeSpan start = TimeSpan.FromSeconds(45); + TranscribedSegment 
transcribedSegment = OpenAIAudioModelFactory.TranscribedSegment(start: start); + + Assert.That(transcribedSegment.Id, Is.EqualTo(default(int))); + Assert.That(transcribedSegment.SeekOffset, Is.EqualTo(default(long))); + Assert.That(transcribedSegment.Start, Is.EqualTo(start)); + Assert.That(transcribedSegment.End, Is.EqualTo(default(TimeSpan))); + Assert.That(transcribedSegment.Text, Is.Null); + Assert.That(transcribedSegment.TokenIds, Is.Not.Null.And.Empty); + Assert.That(transcribedSegment.Temperature, Is.EqualTo(default(float))); + Assert.That(transcribedSegment.AverageLogProbability, Is.EqualTo(default(double))); + Assert.That(transcribedSegment.CompressionRatio, Is.EqualTo(default(float))); + Assert.That(transcribedSegment.NoSpeechProbability, Is.EqualTo(default(double))); + } + + [Test] + public void TranscribedSegmentWithEndWorks() + { + TimeSpan end = TimeSpan.FromSeconds(45); + TranscribedSegment transcribedSegment = OpenAIAudioModelFactory.TranscribedSegment(end: end); + + Assert.That(transcribedSegment.Id, Is.EqualTo(default(int))); + Assert.That(transcribedSegment.SeekOffset, Is.EqualTo(default(long))); + Assert.That(transcribedSegment.Start, Is.EqualTo(default(TimeSpan))); + Assert.That(transcribedSegment.End, Is.EqualTo(end)); + Assert.That(transcribedSegment.Text, Is.Null); + Assert.That(transcribedSegment.TokenIds, Is.Not.Null.And.Empty); + Assert.That(transcribedSegment.Temperature, Is.EqualTo(default(float))); + Assert.That(transcribedSegment.AverageLogProbability, Is.EqualTo(default(double))); + Assert.That(transcribedSegment.CompressionRatio, Is.EqualTo(default(float))); + Assert.That(transcribedSegment.NoSpeechProbability, Is.EqualTo(default(double))); + } + + [Test] + public void TranscribedSegmentWithTextWorks() + { + string text = "A segment text"; + TranscribedSegment transcribedSegment = OpenAIAudioModelFactory.TranscribedSegment(text: text); + + Assert.That(transcribedSegment.Id, Is.EqualTo(default(int))); + 
Assert.That(transcribedSegment.SeekOffset, Is.EqualTo(default(long))); + Assert.That(transcribedSegment.Start, Is.EqualTo(default(TimeSpan))); + Assert.That(transcribedSegment.End, Is.EqualTo(default(TimeSpan))); + Assert.That(transcribedSegment.Text, Is.EqualTo(text)); + Assert.That(transcribedSegment.TokenIds, Is.Not.Null.And.Empty); + Assert.That(transcribedSegment.Temperature, Is.EqualTo(default(float))); + Assert.That(transcribedSegment.AverageLogProbability, Is.EqualTo(default(double))); + Assert.That(transcribedSegment.CompressionRatio, Is.EqualTo(default(float))); + Assert.That(transcribedSegment.NoSpeechProbability, Is.EqualTo(default(double))); + } + + [Test] + public void TranscribedSegmentWithTokenIdsWorks() + { + IEnumerable<long> tokenIds = [ 9000000000, 9000000010 ]; + TranscribedSegment transcribedSegment = OpenAIAudioModelFactory.TranscribedSegment(tokenIds: tokenIds); + + Assert.That(transcribedSegment.Id, Is.EqualTo(default(int))); + Assert.That(transcribedSegment.SeekOffset, Is.EqualTo(default(long))); + Assert.That(transcribedSegment.Start, Is.EqualTo(default(TimeSpan))); + Assert.That(transcribedSegment.End, Is.EqualTo(default(TimeSpan))); + Assert.That(transcribedSegment.Text, Is.Null); + Assert.That(transcribedSegment.TokenIds.SequenceEqual(tokenIds), Is.True); + Assert.That(transcribedSegment.Temperature, Is.EqualTo(default(float))); + Assert.That(transcribedSegment.AverageLogProbability, Is.EqualTo(default(double))); + Assert.That(transcribedSegment.CompressionRatio, Is.EqualTo(default(float))); + Assert.That(transcribedSegment.NoSpeechProbability, Is.EqualTo(default(double))); + } + + [Test] + public void TranscribedSegmentWithTemperatureWorks() + { + float temperature = 0.232f; + TranscribedSegment transcribedSegment = OpenAIAudioModelFactory.TranscribedSegment(temperature: temperature); + + Assert.That(transcribedSegment.Id, Is.EqualTo(default(int))); + Assert.That(transcribedSegment.SeekOffset, Is.EqualTo(default(long))); + 
Assert.That(transcribedSegment.Start, Is.EqualTo(default(TimeSpan))); + Assert.That(transcribedSegment.End, Is.EqualTo(default(TimeSpan))); + Assert.That(transcribedSegment.Text, Is.Null); + Assert.That(transcribedSegment.TokenIds, Is.Not.Null.And.Empty); + Assert.That(transcribedSegment.Temperature, Is.EqualTo(temperature)); + Assert.That(transcribedSegment.AverageLogProbability, Is.EqualTo(default(double))); + Assert.That(transcribedSegment.CompressionRatio, Is.EqualTo(default(float))); + Assert.That(transcribedSegment.NoSpeechProbability, Is.EqualTo(default(double))); + } + + [Test] + public void TranscribedSegmentWithAverageLogProbabilityWorks() + { + double averageLogProbability = -3.1415; + TranscribedSegment transcribedSegment = OpenAIAudioModelFactory.TranscribedSegment(averageLogProbability: averageLogProbability); + + Assert.That(transcribedSegment.Id, Is.EqualTo(default(int))); + Assert.That(transcribedSegment.SeekOffset, Is.EqualTo(default(long))); + Assert.That(transcribedSegment.Start, Is.EqualTo(default(TimeSpan))); + Assert.That(transcribedSegment.End, Is.EqualTo(default(TimeSpan))); + Assert.That(transcribedSegment.Text, Is.Null); + Assert.That(transcribedSegment.TokenIds, Is.Not.Null.And.Empty); + Assert.That(transcribedSegment.Temperature, Is.EqualTo(default(float))); + Assert.That(transcribedSegment.AverageLogProbability, Is.EqualTo(averageLogProbability)); + Assert.That(transcribedSegment.CompressionRatio, Is.EqualTo(default(float))); + Assert.That(transcribedSegment.NoSpeechProbability, Is.EqualTo(default(double))); + } + + [Test] + public void TranscribedSegmentWithCompressionRatioWorks() + { + float compressionRatio = 1.33f; + TranscribedSegment transcribedSegment = OpenAIAudioModelFactory.TranscribedSegment(compressionRatio: compressionRatio); + + Assert.That(transcribedSegment.Id, Is.EqualTo(default(int))); + Assert.That(transcribedSegment.SeekOffset, Is.EqualTo(default(long))); + Assert.That(transcribedSegment.Start, 
Is.EqualTo(default(TimeSpan))); + Assert.That(transcribedSegment.End, Is.EqualTo(default(TimeSpan))); + Assert.That(transcribedSegment.Text, Is.Null); + Assert.That(transcribedSegment.TokenIds, Is.Not.Null.And.Empty); + Assert.That(transcribedSegment.Temperature, Is.EqualTo(default(float))); + Assert.That(transcribedSegment.AverageLogProbability, Is.EqualTo(default(double))); + Assert.That(transcribedSegment.CompressionRatio, Is.EqualTo(compressionRatio)); + Assert.That(transcribedSegment.NoSpeechProbability, Is.EqualTo(default(double))); + } + + [Test] + public void TranscribedSegmentWithNoSpeechProbabilityWorks() + { + double noSpeechProbability = 0.02; + TranscribedSegment transcribedSegment = OpenAIAudioModelFactory.TranscribedSegment(noSpeechProbability: noSpeechProbability); + + Assert.That(transcribedSegment.Id, Is.EqualTo(default(int))); + Assert.That(transcribedSegment.SeekOffset, Is.EqualTo(default(long))); + Assert.That(transcribedSegment.Start, Is.EqualTo(default(TimeSpan))); + Assert.That(transcribedSegment.End, Is.EqualTo(default(TimeSpan))); + Assert.That(transcribedSegment.Text, Is.Null); + Assert.That(transcribedSegment.TokenIds, Is.Not.Null.And.Empty); + Assert.That(transcribedSegment.Temperature, Is.EqualTo(default(float))); + Assert.That(transcribedSegment.AverageLogProbability, Is.EqualTo(default(double))); + Assert.That(transcribedSegment.CompressionRatio, Is.EqualTo(default(float))); + Assert.That(transcribedSegment.NoSpeechProbability, Is.EqualTo(noSpeechProbability)); + } + + [Test] + public void TranscribedWordWithNoPropertiesWorks() + { + TranscribedWord transcribedWord = OpenAIAudioModelFactory.TranscribedWord(); + + Assert.That(transcribedWord.Word, Is.Null); + Assert.That(transcribedWord.Start, Is.EqualTo(default(TimeSpan))); + Assert.That(transcribedWord.End, Is.EqualTo(default(TimeSpan))); + } + + [Test] + public void TranscribedWordWithIdWorks() + { + string word = "croissant"; + TranscribedWord transcribedWord = 
OpenAIAudioModelFactory.TranscribedWord(word: word); + + Assert.That(transcribedWord.Word, Is.EqualTo(word)); + Assert.That(transcribedWord.Start, Is.EqualTo(default(TimeSpan))); + Assert.That(transcribedWord.End, Is.EqualTo(default(TimeSpan))); + } + + [Test] + public void TranscribedWordWithStartWorks() + { + TimeSpan start = TimeSpan.FromSeconds(45); + TranscribedWord transcribedWord = OpenAIAudioModelFactory.TranscribedWord(start: start); + + Assert.That(transcribedWord.Word, Is.Null); + Assert.That(transcribedWord.Start, Is.EqualTo(start)); + Assert.That(transcribedWord.End, Is.EqualTo(default(TimeSpan))); + } + + [Test] + public void TranscribedWordWithEndWorks() + { + TimeSpan end = TimeSpan.FromSeconds(45); + TranscribedWord transcribedWord = OpenAIAudioModelFactory.TranscribedWord(end: end); + + Assert.That(transcribedWord.Word, Is.Null); + Assert.That(transcribedWord.Start, Is.EqualTo(default(TimeSpan))); + Assert.That(transcribedWord.End, Is.EqualTo(end)); + } +} diff --git a/.dotnet/tests/Audio/TextToSpeechTests.cs b/.dotnet/tests/Audio/TextToSpeechTests.cs new file mode 100644 index 000000000..cbbd748d1 --- /dev/null +++ b/.dotnet/tests/Audio/TextToSpeechTests.cs @@ -0,0 +1,63 @@ +using NUnit.Framework; +using OpenAI.Audio; +using OpenAI.Tests.Utility; +using System; +using System.Threading.Tasks; +using static OpenAI.Tests.TestHelpers; + +namespace OpenAI.Tests.Audio; + +[TestFixture(true)] +[TestFixture(false)] +[Parallelizable(ParallelScope.All)] +[Category("Audio")] +public partial class TextToSpeechTests : SyncAsyncTestBase +{ + public TextToSpeechTests(bool isAsync) : base(isAsync) + { + } + + [Test] + public async Task BasicTextToSpeechWorks() + { + AudioClient client = GetTestClient<AudioClient>(TestScenario.Audio_TTS); + + BinaryData audio = IsAsync + ? await client.GenerateSpeechAsync("Hello, world! This is a test.", GeneratedSpeechVoice.Shimmer) + : client.GenerateSpeech("Hello, world! 
This is a test.", GeneratedSpeechVoice.Shimmer); + + Assert.That(audio, Is.Not.Null); + ValidateGeneratedAudio(audio, "hello"); + } + + [Test] + [TestCase(null)] + [TestCase(GeneratedSpeechFormat.Mp3)] + [TestCase(GeneratedSpeechFormat.Opus)] + [TestCase(GeneratedSpeechFormat.Aac)] + [TestCase(GeneratedSpeechFormat.Flac)] + [TestCase(GeneratedSpeechFormat.Wav)] + [TestCase(GeneratedSpeechFormat.Pcm)] + public async Task OutputFormatWorks(GeneratedSpeechFormat? responseFormat) + { + AudioClient client = GetTestClient<AudioClient>(TestScenario.Audio_TTS); + + SpeechGenerationOptions options = responseFormat == null + ? new() + : new() { ResponseFormat = responseFormat }; + + BinaryData audio = IsAsync + ? await client.GenerateSpeechAsync("Hello, world!", GeneratedSpeechVoice.Alloy, options) + : client.GenerateSpeech("Hello, world!", GeneratedSpeechVoice.Alloy, options); + + Assert.That(audio, Is.Not.Null); + } + + private void ValidateGeneratedAudio(BinaryData audio, string expectedSubstring) + { + AudioClient client = GetTestClient<AudioClient>(TestScenario.Audio_Whisper); + AudioTranscription transcription = client.TranscribeAudio(audio.ToStream(), "hello_world.wav"); + + Assert.That(transcription.Text.ToLowerInvariant(), Contains.Substring(expectedSubstring)); + } +} diff --git a/.dotnet/tests/Audio/TranscriptionTests.cs b/.dotnet/tests/Audio/TranscriptionTests.cs new file mode 100644 index 000000000..7fead9b24 --- /dev/null +++ b/.dotnet/tests/Audio/TranscriptionTests.cs @@ -0,0 +1,183 @@ +using NUnit.Framework; +using OpenAI.Audio; +using OpenAI.Tests.Utility; +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.IO; +using System.Threading.Tasks; +using static OpenAI.Tests.TestHelpers; + +namespace OpenAI.Tests.Audio; + +[TestFixture(true)] +[TestFixture(false)] +[Parallelizable(ParallelScope.All)] +[Category("Audio")] +public partial class TranscriptionTests : SyncAsyncTestBase +{ + public 
TranscriptionTests(bool isAsync) : base(isAsync) + { + } + + public enum AudioSourceKind + { + UsingStream, + UsingFilePath, + } + + [Test] + [TestCase(AudioSourceKind.UsingStream)] + [TestCase(AudioSourceKind.UsingFilePath)] + public async Task TranscriptionWorks(AudioSourceKind audioSourceKind) + { + AudioClient client = GetTestClient<AudioClient>(TestScenario.Audio_Whisper); + string filename = "audio_hello_world.mp3"; + string path = Path.Combine("Assets", filename); + AudioTranscription transcription = null; + + if (audioSourceKind == AudioSourceKind.UsingStream) + { + using FileStream inputStream = File.OpenRead(path); + + transcription = IsAsync + ? await client.TranscribeAudioAsync(inputStream, filename) + : client.TranscribeAudio(inputStream, filename); + } + else if (audioSourceKind == AudioSourceKind.UsingFilePath) + { + transcription = IsAsync + ? await client.TranscribeAudioAsync(path) + : client.TranscribeAudio(path); + } + + Assert.That(transcription, Is.Not.Null); + Assert.That(transcription.Text.ToLowerInvariant(), Contains.Substring("hello")); + } + + [Test] + [TestCase(AudioTimestampGranularities.Default)] + [TestCase(AudioTimestampGranularities.Word)] + [TestCase(AudioTimestampGranularities.Segment)] + [TestCase(AudioTimestampGranularities.Word | AudioTimestampGranularities.Segment)] + public async Task TimestampsWork(AudioTimestampGranularities granularityFlags) + { + AudioClient client = GetTestClient<AudioClient>(TestScenario.Audio_Whisper); + + using FileStream inputStream = File.OpenRead(Path.Combine("Assets", "audio_hello_world.mp3")); + + AudioTranscriptionOptions options = new() + { + ResponseFormat = AudioTranscriptionFormat.Verbose, + Temperature = 0.4f, + Granularities = granularityFlags, + }; + + ClientResult<AudioTranscription> transcriptionResult = IsAsync + ? 
await client.TranscribeAudioAsync(inputStream, "audio_hello_world.mp3", options) + : client.TranscribeAudio(inputStream, "audio_hello_world.mp3", options); + + PipelineResponse rawResponse = transcriptionResult.GetRawResponse(); + Assert.That(rawResponse.Content.ToString(), Is.Not.Null.And.Not.Empty); + + AudioTranscription transcription = transcriptionResult; + Assert.That(transcription, Is.Not.Null); + + IReadOnlyList<TranscribedWord> words = transcription.Words; + IReadOnlyList<TranscribedSegment> segments = transcription.Segments; + + bool wordTimestampsPresent = words?.Count > 0; + bool segmentTimestampsPresent = segments?.Count > 0; + + bool wordTimestampsExpected = granularityFlags.HasFlag(AudioTimestampGranularities.Word); + bool segmentTimestampsExpected = granularityFlags.HasFlag(AudioTimestampGranularities.Segment) + || granularityFlags == AudioTimestampGranularities.Default; + + Assert.That(wordTimestampsPresent, Is.EqualTo(wordTimestampsExpected)); + Assert.That(segmentTimestampsPresent, Is.EqualTo(segmentTimestampsExpected)); + + for (int i = 0; i < (words?.Count ?? 0); i++) + { + if (i > 0) + { + Assert.That(words[i].Start, Is.GreaterThanOrEqualTo(words[i - 1].End)); + } + Assert.That(words[i].End, Is.GreaterThan(words[i].Start)); + Assert.That(string.IsNullOrEmpty(words[i].Word), Is.False); + } + + for (int i = 0; i < (segments?.Count ?? 
0); i++) + { + if (i > 0) + { + Assert.That(segments[i].Id, Is.GreaterThan(segments[i - 1].Id)); + Assert.That(segments[i].SeekOffset, Is.GreaterThan(0)); + Assert.That(segments[i].Start, Is.GreaterThanOrEqualTo(segments[i - 1].End)); + } + Assert.That(segments[i].End, Is.GreaterThan(segments[i].Start)); + Assert.That(string.IsNullOrEmpty(segments[i].Text), Is.False); + Assert.That(segments[i].TokenIds, Is.Not.Null.And.Not.Empty); + foreach (int tokenId in segments[i].TokenIds) + { + Assert.That(tokenId, Is.GreaterThanOrEqualTo(0)); + } + Assert.That(segments[i].Temperature, Is.LessThan(-0.001f).Or.GreaterThan(0.001f)); + Assert.That(segments[i].AverageLogProbability, Is.LessThan(-0.001f).Or.GreaterThan(0.001f)); + Assert.That(segments[i].CompressionRatio, Is.LessThan(-0.001f).Or.GreaterThan(0.001f)); + Assert.That(segments[i].NoSpeechProbability, Is.LessThan(-0.001f).Or.GreaterThan(0.001f)); + } + } + + [Test] + [TestCase(AudioTranscriptionFormat.Simple)] + [TestCase(AudioTranscriptionFormat.Verbose)] + [TestCase(AudioTranscriptionFormat.Srt)] + [TestCase(AudioTranscriptionFormat.Vtt)] + public async Task TranscriptionFormatsWork(AudioTranscriptionFormat formatToTest) + { + AudioClient client = GetTestClient<AudioClient>(TestScenario.Audio_Whisper); + string path = Path.Combine("Assets", "audio_hello_world.mp3"); + + AudioTranscriptionOptions options = new() + { + ResponseFormat = formatToTest, + }; + + AudioTranscription transcription = IsAsync + ? 
await client.TranscribeAudioAsync(path, options) + : client.TranscribeAudio(path, options); + Assert.That(transcription, Is.Not.Null); + Assert.That(transcription.Text, Is.Not.Null.And.Not.Empty); + Assert.That(transcription.Text.ToLowerInvariant(), Does.Contain("hello")); + } + + [Test] + public async Task BadTranscriptionRequest() + { + AudioClient client = GetTestClient<AudioClient>(TestScenario.Audio_Whisper); + + string path = Path.Combine("Assets", "audio_hello_world.mp3"); + + AudioTranscriptionOptions options = new AudioTranscriptionOptions() + { + Language = "this should cause an error" + }; + + Exception caughtException = null; + + try + { + _ = IsAsync + ? await client.TranscribeAudioAsync(path, options) + : client.TranscribeAudio(path, options); + } + catch (Exception ex) + { + caughtException = ex; + } + + Assert.That(caughtException, Is.InstanceOf<ClientResultException>()); + Assert.That(caughtException.Message?.ToLower(), Contains.Substring("invalid language")); + } +} diff --git a/.dotnet/tests/Audio/TranslationTests.cs b/.dotnet/tests/Audio/TranslationTests.cs new file mode 100644 index 000000000..ccbeae4d9 --- /dev/null +++ b/.dotnet/tests/Audio/TranslationTests.cs @@ -0,0 +1,76 @@ +using NUnit.Framework; +using OpenAI.Audio; +using OpenAI.Tests.Utility; +using System.IO; +using System.Threading.Tasks; +using static OpenAI.Tests.TestHelpers; + +namespace OpenAI.Tests.Audio; + +[TestFixture(true)] +[TestFixture(false)] +[Parallelizable(ParallelScope.All)] +[Category("Audio")] +public partial class TranslationTests : SyncAsyncTestBase +{ + public TranslationTests(bool isAsync) : base(isAsync) + { + } + + public enum AudioSourceKind + { + UsingStream, + UsingFilePath, + } + + [Test] + [TestCase(AudioSourceKind.UsingStream)] + [TestCase(AudioSourceKind.UsingFilePath)] + public async Task TranslationWorks(AudioSourceKind audioSourceKind) + { + AudioClient client = GetTestClient<AudioClient>(TestScenario.Audio_Whisper); + + string filename = "audio_french.wav"; + string path = Path.Combine("Assets", 
filename); + AudioTranslation translation = null; + + if (audioSourceKind == AudioSourceKind.UsingStream) + { + using FileStream audio = File.OpenRead(path); + + translation = IsAsync + ? await client.TranslateAudioAsync(audio, filename) + : client.TranslateAudio(audio, filename); + } + else if (audioSourceKind == AudioSourceKind.UsingFilePath) + { + translation = IsAsync + ? await client.TranslateAudioAsync(path) + : client.TranslateAudio(path); + } + Assert.That(translation?.Text, Is.Not.Null); + Assert.That(translation.Text.ToLowerInvariant(), Contains.Substring("whisper")); + } + + [Test] + [TestCase(AudioTranslationFormat.Simple)] + [TestCase(AudioTranslationFormat.Verbose)] + [TestCase(AudioTranslationFormat.Srt)] + [TestCase(AudioTranslationFormat.Vtt)] + public async Task TranslationFormatsWork(AudioTranslationFormat formatToTest) + { + AudioClient client = GetTestClient<AudioClient>(TestScenario.Audio_Whisper); + string path = Path.Combine("Assets", "audio_french.wav"); + + AudioTranslationOptions options = new() + { + ResponseFormat = formatToTest, + }; + + AudioTranslation translation = IsAsync + ? 
await client.TranslateAudioAsync(path, options) + : client.TranslateAudio(path, options); + + Assert.That(translation?.Text?.ToLowerInvariant(), Does.Contain("recognition")); + } +} diff --git a/.dotnet/tests/Batch/BatchTests.cs b/.dotnet/tests/Batch/BatchTests.cs new file mode 100644 index 000000000..8e51201ae --- /dev/null +++ b/.dotnet/tests/Batch/BatchTests.cs @@ -0,0 +1,175 @@ +using NUnit.Framework; +using OpenAI.Batch; +using OpenAI.Files; +using OpenAI.Tests.Utility; +using System; +using System.ClientModel; +using System.Collections.Generic; +using System.IO; +using System.Text.Json; +using System.Threading.Tasks; +using static OpenAI.Tests.TestHelpers; + +namespace OpenAI.Tests.Batch; + +[TestFixture(true)] +[TestFixture(false)] +[Parallelizable(ParallelScope.All)] +[Category("Batch")] +public partial class BatchTests : SyncAsyncTestBase +{ + public BatchTests(bool isAsync) : base(isAsync) + { + } + + [Test] + public void ListBatchesProtocol() + { + BatchClient client = GetTestClient(); + IEnumerable<ClientResult> pageResults = client.GetBatches(after: null, limit: null, options: null); + + int pageCount = 0; + foreach (ClientResult pageResult in pageResults) + { + BinaryData response = pageResult.GetRawResponse().Content; + using JsonDocument jsonDocument = JsonDocument.Parse(response); + JsonElement dataElement = jsonDocument.RootElement.GetProperty("data"); + + Assert.That(dataElement.GetArrayLength(), Is.GreaterThan(0)); + + long unixTime2024 = (new DateTimeOffset(2024, 01, 01, 0, 0, 0, TimeSpan.Zero)).ToUnixTimeSeconds(); + + foreach (JsonElement batchElement in dataElement.EnumerateArray()) + { + JsonElement createdAtElement = batchElement.GetProperty("created_at"); + long createdAt = createdAtElement.GetInt64(); + + Assert.That(createdAt, Is.GreaterThan(unixTime2024)); + } + pageCount++; + + //var dynamicResult = result.GetRawResponse().Content.ToDynamicFromJson(); + //Assert.That(dynamicResult.data.Count, Is.GreaterThan(0)); + 
//Assert.That(dynamicResult.data[0].createdAt, Is.GreaterThan(new DateTimeOffset(2024, 01, 01, 0, 0, 0, TimeSpan.Zero))); + } + + Assert.GreaterOrEqual(pageCount, 1); + } + + [Test] + public async Task ListBatchesProtocolAsync() + { + BatchClient client = GetTestClient(); + IAsyncEnumerable<ClientResult> pageResults = client.GetBatchesAsync(after: null, limit: null, options: null); + + int pageCount = 0; + await foreach (ClientResult pageResult in pageResults) + { + BinaryData response = pageResult.GetRawResponse().Content; + using JsonDocument jsonDocument = JsonDocument.Parse(response); + JsonElement dataElement = jsonDocument.RootElement.GetProperty("data"); + + Assert.That(dataElement.GetArrayLength(), Is.GreaterThan(0)); + + long unixTime2024 = (new DateTimeOffset(2024, 01, 01, 0, 0, 0, TimeSpan.Zero)).ToUnixTimeSeconds(); + + foreach (JsonElement batchElement in dataElement.EnumerateArray()) + { + JsonElement createdAtElement = batchElement.GetProperty("created_at"); + long createdAt = createdAtElement.GetInt64(); + + Assert.That(createdAt, Is.GreaterThan(unixTime2024)); + } + pageCount++; + + //var dynamicResult = result.GetRawResponse().Content.ToDynamicFromJson(); + //Assert.That(dynamicResult.data.Count, Is.GreaterThan(0)); + //Assert.That(dynamicResult.data[0].createdAt, Is.GreaterThan(new DateTimeOffset(2024, 01, 01, 0, 0, 0, TimeSpan.Zero))); + } + + Assert.GreaterOrEqual(pageCount, 1); + } + + [Test] + public async Task CreateGetAndCancelBatchProtocol() + { + using MemoryStream testFileStream = new(); + using StreamWriter streamWriter = new (testFileStream); + string input = @"{""custom_id"": ""request-1"", ""method"": ""POST"", ""url"": ""/v1/chat/completions"", ""body"": {""model"": ""gpt-4o-mini"", ""messages"": [{""role"": ""system"", ""content"": ""You are a helpful assistant.""}, {""role"": ""user"", ""content"": ""What is 2+2?""}]}}"; + streamWriter.WriteLine(input); + streamWriter.Flush(); + testFileStream.Position = 0; + + FileClient fileClient = 
GetTestClient<FileClient>(TestScenario.Files); + OpenAIFileInfo inputFile = await fileClient.UploadFileAsync(testFileStream, "test-batch-file", FileUploadPurpose.Batch); + Assert.That(inputFile.Id, Is.Not.Null.And.Not.Empty); + + BatchClient client = GetTestClient(); + BinaryContent content = BinaryContent.Create(BinaryData.FromObjectAsJson(new + { + input_file_id = inputFile.Id, + endpoint = "/v1/chat/completions", + completion_window = "24h", + metadata = new + { + testMetadataKey = "test metadata value", + }, + })); + ClientResult batchResult = IsAsync + ? await client.CreateBatchAsync(content) + : client.CreateBatch(content); + + BinaryData response = batchResult.GetRawResponse().Content; + JsonDocument jsonDocument = JsonDocument.Parse(response); + + JsonElement idElement = jsonDocument.RootElement.GetProperty("id"); + JsonElement createdAtElement = jsonDocument.RootElement.GetProperty("created_at"); + JsonElement statusElement = jsonDocument.RootElement.GetProperty("status"); + JsonElement metadataElement = jsonDocument.RootElement.GetProperty("metadata"); + JsonElement testMetadataKeyElement = metadataElement.GetProperty("testMetadataKey"); + + string id = idElement.GetString(); + long createdAt = createdAtElement.GetInt64(); + string status = statusElement.GetString(); + string testMetadataKey = testMetadataKeyElement.GetString(); + + long unixTime2024 = (new DateTimeOffset(2024, 01, 01, 0, 0, 0, TimeSpan.Zero)).ToUnixTimeSeconds(); + + Assert.That(id, Is.Not.Null.And.Not.Empty); + Assert.That(createdAt, Is.GreaterThan(unixTime2024)); + Assert.That(status, Is.EqualTo("validating")); + Assert.That(testMetadataKey, Is.EqualTo("test metadata value")); + + batchResult = IsAsync + ? 
await client.GetBatchAsync(id, options: null) + : client.GetBatch(id, options: null); + + JsonElement endpointElement = jsonDocument.RootElement.GetProperty("endpoint"); + string endpoint = endpointElement.GetString(); + + Assert.That(endpoint, Is.EqualTo("/v1/chat/completions")); + + batchResult = IsAsync + ? await client.CancelBatchAsync(id, options: null) + : client.CancelBatch(id, options: null); + + statusElement = jsonDocument.RootElement.GetProperty("status"); + status = statusElement.GetString(); + + Assert.That(status, Is.EqualTo("validating")); + + //var newBatchDynamic = batchResult.GetRawResponse().Content.ToDynamicFromJson(); + + //Assert.That(newBatchDynamic?.createdAt, Is.GreaterThan(new DateTimeOffset(2024, 01, 01, 0, 0, 0, TimeSpan.Zero))); + //Assert.That(newBatchDynamic.status, Is.EqualTo("validating")); + //Assert.That(newBatchDynamic.metadata["testMetadataKey"], Is.EqualTo("test metadata value")); + //batchResult = await client.GetBatchAsync(newBatchDynamic.id, options: null); + //newBatchDynamic = batchResult.GetRawResponse().Content.ToObjectFromJson(); + //Assert.That(newBatchDynamic.endpoint, Is.EqualTo("/v1/chat/completions")); + //batchResult = await client.CancelBatchAsync(newBatchDynamic.id, options: null); + //newBatchDynamic = batchResult.GetRawResponse().Content.ToObjectFromJson(); + //Assert.That(newBatchDynamic.status, Is.EqualTo("cancelling")); + } + + private static BatchClient GetTestClient() => GetTestClient<BatchClient>(TestScenario.Batch); +} \ No newline at end of file diff --git a/.dotnet/tests/Chat/ChatSmokeTests.cs b/.dotnet/tests/Chat/ChatSmokeTests.cs new file mode 100644 index 000000000..f06d97a98 --- /dev/null +++ b/.dotnet/tests/Chat/ChatSmokeTests.cs @@ -0,0 +1,613 @@ +using Microsoft.VisualStudio.TestPlatform.ObjectModel; +using NUnit.Framework; +using OpenAI.Chat; +using OpenAI.Tests.Telemetry; +using OpenAI.Tests.Utility; +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using 
System.Collections.Generic; +using System.IO; +using System.Linq; +using System.Net; +using System.Reflection; +using System.Text.Json; +using System.Threading.Tasks; +using static OpenAI.Tests.Telemetry.TestMeterListener; +using static OpenAI.Tests.TestHelpers; +using static System.Runtime.InteropServices.JavaScript.JSType; + +namespace OpenAI.Tests.Chat; + +[TestFixture(true)] +[TestFixture(false)] +[Parallelizable(ParallelScope.All)] +[Category("Chat")] +[Category("Smoke")] +public partial class ChatSmokeTests : SyncAsyncTestBase +{ + public ChatSmokeTests(bool isAsync) : base(isAsync) + { + } + + [Test] + public async Task SmokeTest() + { + string mockResponseId = Guid.NewGuid().ToString(); + long mockCreated = DateTimeOffset.UtcNow.ToUnixTimeSeconds(); + + BinaryData mockRequest = BinaryData.FromString($$""" + { + "model": "gpt-4o-mini", + "messages": [ + { "role": "user", "content": "Hello, assistant!" } + ] + } + """); + BinaryData mockResponse = BinaryData.FromString($$""" + { + "id": "{{mockResponseId}}", + "created": {{mockCreated}}, + "choices": [ + { + "finish_reason": "stop", + "message": { "role": "assistant", "content": "Hi there, user!" } + } + ], + "additional_property": "hello, additional world!" + } + """); + MockPipelineTransport mockTransport = new(mockRequest, mockResponse); + + OpenAIClientOptions options = new() + { + Transport = mockTransport + }; + ChatClient client = new("model_name_replaced", new ApiKeyCredential("sk-not-a-real-key"), options); + + ClientResult<ChatCompletion> completionResult = IsAsync + ? 
await client.CompleteChatAsync(["Mock me!"]) + : client.CompleteChat(["Mock me!"]); + Assert.That(completionResult?.GetRawResponse(), Is.Not.Null); + Assert.That(completionResult.GetRawResponse().Content?.ToString(), Does.Contain("additional world")); + + ChatCompletion completion = completionResult; + + Assert.That(completion.Id, Is.EqualTo(mockResponseId)); + Assert.That(completion.CreatedAt.ToUnixTimeSeconds, Is.EqualTo(mockCreated)); + Assert.That(completion.Role, Is.EqualTo(ChatMessageRole.Assistant)); + Assert.That(completion.Content[0].Text, Is.EqualTo("Hi there, user!")); + + var data = (IDictionary<string, BinaryData>) + typeof(ChatCompletion) + .GetProperty("SerializedAdditionalRawData", BindingFlags.Instance | BindingFlags.NonPublic) + .GetValue(completion); + Assert.That(data, Is.Not.Null); + Assert.That(data.Count, Is.GreaterThan(0)); + } + + [Test] + public void CanCreateClients() + { + Uri fakeUri = new("https://127.0.0.1"); + ApiKeyCredential fakeCredential = new("sk-not-a-real-credential"); + + { + OpenAIClient topLevelClient = new(fakeCredential); + Assert.That(topLevelClient, Is.Not.Null); + ChatClient chatClient = topLevelClient.GetChatClient("model"); + Assert.That(chatClient, Is.Not.Null); + } + { + OpenAIClient topLevelClient = new(fakeCredential, new OpenAIClientOptions() + { + Endpoint = fakeUri + }); + Assert.That(topLevelClient, Is.Not.Null); + ChatClient chatClient = topLevelClient.GetChatClient("model"); + Assert.That(chatClient, Is.Not.Null); + } + { + ChatClient chatClient = new("model", fakeCredential); + Assert.That(chatClient, Is.Not.Null); + } + { + ChatClient chatClient = new("model", fakeCredential, new OpenAIClientOptions() + { + Endpoint = fakeUri + }); + Assert.That(chatClient, Is.Not.Null); + } + } + + [Test] + public void AuthFailureStreaming() + { + string fakeApiKey = "not-a-real-key-but-should-be-sanitized"; + ChatClient client = new("gpt-4o-mini", new ApiKeyCredential(fakeApiKey)); + Exception caughtException = null; + try + { + foreach 
(var _ in client.CompleteChatStreaming( + [new UserChatMessage("Uh oh, this isn't going to work with that key")])) + { } + } + catch (Exception ex) + { + caughtException = ex; + } + var clientResultException = caughtException as ClientResultException; + Assert.That(clientResultException, Is.Not.Null); + Assert.That(clientResultException.Status, Is.EqualTo((int)HttpStatusCode.Unauthorized)); + Assert.That(clientResultException.Message, Does.Contain("API key")); + Assert.That(clientResultException.Message, Does.Not.Contain(fakeApiKey)); + } + + [Test] + [TestCase(true)] + [TestCase(false)] + public void SerializeChatToolChoiceAsString(bool fromRawJson) + { + ChatToolChoice choice; + + if (fromRawJson) + { + BinaryData data = BinaryData.FromString($"\"auto\""); + + // We deserialize the raw JSON. Later, we serialize it back and confirm nothing was lost in the process. + choice = ModelReaderWriter.Read<ChatToolChoice>(data); + } + else + { + // We construct a new instance. Later, we serialize it and confirm it was constructed correctly. + choice = ChatToolChoice.Auto; + } + + BinaryData serializedChoice = ModelReaderWriter.Write(choice); + using JsonDocument choiceAsJson = JsonDocument.Parse(serializedChoice); + Assert.That(choiceAsJson.RootElement, Is.Not.Null); + Assert.That(choiceAsJson.RootElement.ValueKind, Is.EqualTo(JsonValueKind.String)); + Assert.That(choiceAsJson.RootElement.ToString(), Is.EqualTo("auto")); + } + + [Test] + [TestCase(true)] + [TestCase(false)] + public void SerializeChatToolChoiceAsObject(bool fromRawJson) + { + const string functionName = "my_function_name"; + ChatToolChoice choice; + + if (fromRawJson) + { + BinaryData data = BinaryData.FromString($$""" + { + "type": "function", + "function": { + "name": "{{functionName}}" + }, + "additional_property": true + } + """); + + // We deserialize the raw JSON. Later, we serialize it back and confirm nothing was lost in the process. 
+            choice = ModelReaderWriter.Read<ChatToolChoice>(data);
+        }
+        else
+        {
+            // We construct a new instance. Later, we serialize it and confirm it was constructed correctly.
+            choice = new ChatToolChoice(ChatTool.CreateFunctionTool(functionName));
+        }
+
+        BinaryData serializedChoice = ModelReaderWriter.Write(choice);
+        using JsonDocument choiceAsJson = JsonDocument.Parse(serializedChoice);
+        Assert.That(choiceAsJson.RootElement, Is.Not.Null);
+        Assert.That(choiceAsJson.RootElement.ValueKind, Is.EqualTo(JsonValueKind.Object));
+
+        Assert.That(choiceAsJson.RootElement.TryGetProperty("type", out JsonElement typeProperty), Is.True);
+        Assert.That(typeProperty, Is.Not.Null);
+        Assert.That(typeProperty.ValueKind, Is.EqualTo(JsonValueKind.String));
+        Assert.That(typeProperty.ToString(), Is.EqualTo("function"));
+
+        Assert.That(choiceAsJson.RootElement.TryGetProperty("function", out JsonElement functionProperty), Is.True);
+        Assert.That(functionProperty, Is.Not.Null);
+        Assert.That(functionProperty.ValueKind, Is.EqualTo(JsonValueKind.Object));
+
+        Assert.That(functionProperty.TryGetProperty("name", out JsonElement functionNameProperty), Is.True);
+        Assert.That(functionNameProperty, Is.Not.Null);
+        Assert.That(functionNameProperty.ValueKind, Is.EqualTo(JsonValueKind.String));
+        Assert.That(functionNameProperty.ToString(), Is.EqualTo(functionName));
+
+        if (fromRawJson)
+        {
+            // Confirm that we also have the additional data.
+            Assert.That(choiceAsJson.RootElement.TryGetProperty("additional_property", out JsonElement additionalPropertyProperty), Is.True);
+            Assert.That(additionalPropertyProperty, Is.Not.Null);
+            Assert.That(additionalPropertyProperty.ValueKind, Is.EqualTo(JsonValueKind.True));
+        }
+    }
+
+    [Test]
+    [TestCase(true)]
+    [TestCase(false)]
+    public void SerializeChatFunctionChoiceAsString(bool fromRawJson)
+    {
+        ChatFunctionChoice choice;
+
+        if (fromRawJson)
+        {
+            BinaryData data = BinaryData.FromString($"\"auto\"");
+
+            // We deserialize the raw JSON.
Later, we serialize it back and confirm nothing was lost in the process.
+            choice = ModelReaderWriter.Read<ChatFunctionChoice>(data);
+        }
+        else
+        {
+            // We construct a new instance. Later, we serialize it and confirm it was constructed correctly.
+            choice = ChatFunctionChoice.Auto;
+        }
+
+        BinaryData serializedChoice = ModelReaderWriter.Write(choice);
+        using JsonDocument choiceAsJson = JsonDocument.Parse(serializedChoice);
+        Assert.That(choiceAsJson.RootElement, Is.Not.Null);
+        Assert.That(choiceAsJson.RootElement.ValueKind, Is.EqualTo(JsonValueKind.String));
+        Assert.That(choiceAsJson.RootElement.ToString(), Is.EqualTo("auto"));
+    }
+
+    [Test]
+    [TestCase(true)]
+    [TestCase(false)]
+    public void SerializeChatFunctionChoiceAsObject(bool fromRawJson)
+    {
+        const string functionName = "my_function_name";
+        ChatFunctionChoice choice;
+
+        if (fromRawJson)
+        {
+            BinaryData data = BinaryData.FromString($$"""
+                {
+                    "name": "{{functionName}}",
+                    "additional_property": true
+                }
+                """);
+
+            // We deserialize the raw JSON. Later, we serialize it back and confirm nothing was lost in the process.
+            choice = ModelReaderWriter.Read<ChatFunctionChoice>(data);
+        }
+        else
+        {
+            // We construct a new instance. Later, we serialize it and confirm it was constructed correctly.
+#pragma warning disable CS0618
+            choice = new ChatFunctionChoice(new ChatFunction(functionName));
+#pragma warning restore CS0618
+        }
+
+        BinaryData serializedChoice = ModelReaderWriter.Write(choice);
+        using JsonDocument choiceAsJson = JsonDocument.Parse(serializedChoice);
+        Assert.That(choiceAsJson.RootElement, Is.Not.Null);
+        Assert.That(choiceAsJson.RootElement.ValueKind, Is.EqualTo(JsonValueKind.Object));
+
+        Assert.That(choiceAsJson.RootElement.TryGetProperty("name", out JsonElement nameProperty), Is.True);
+        Assert.That(nameProperty, Is.Not.Null);
+        Assert.That(nameProperty.ValueKind, Is.EqualTo(JsonValueKind.String));
+        Assert.That(nameProperty.ToString(), Is.EqualTo(functionName));
+
+        if (fromRawJson)
+        {
+            // Confirm that we also have the additional data.
+            Assert.That(choiceAsJson.RootElement.TryGetProperty("additional_property", out JsonElement additionalPropertyProperty), Is.True);
+            Assert.That(additionalPropertyProperty, Is.Not.Null);
+            Assert.That(additionalPropertyProperty.ValueKind, Is.EqualTo(JsonValueKind.True));
+        }
+    }
+
+    [Test]
+    [TestCase(true)]
+    [TestCase(false)]
+    public void SerializeChatMessageContentPartAsText(bool fromRawJson)
+    {
+        const string text = "Hello, world!";
+        ChatMessageContentPart part;
+
+        if (fromRawJson)
+        {
+            BinaryData data = BinaryData.FromString($$"""
+                {
+                    "type": "text",
+                    "text": "{{text}}",
+                    "additional_property": true
+                }
+                """);
+
+            // We deserialize the raw JSON. Later, we serialize it back and confirm nothing was lost in the process.
+            part = ModelReaderWriter.Read<ChatMessageContentPart>(data);
+        }
+        else
+        {
+            // We construct a new instance. Later, we serialize it and confirm it was constructed correctly.
+            part = ChatMessageContentPart.CreateTextMessageContentPart(text);
+        }
+
+        BinaryData serializedPart = ModelReaderWriter.Write(part);
+        using JsonDocument partAsJson = JsonDocument.Parse(serializedPart);
+        Assert.That(partAsJson.RootElement, Is.Not.Null);
+        Assert.That(partAsJson.RootElement.ValueKind, Is.EqualTo(JsonValueKind.Object));
+
+        Assert.That(partAsJson.RootElement.TryGetProperty("type", out JsonElement typeProperty), Is.True);
+        Assert.That(typeProperty, Is.Not.Null);
+        Assert.That(typeProperty.ValueKind, Is.EqualTo(JsonValueKind.String));
+        Assert.That(typeProperty.ToString(), Is.EqualTo("text"));
+
+        Assert.That(partAsJson.RootElement.TryGetProperty("text", out JsonElement textProperty), Is.True);
+        Assert.That(textProperty, Is.Not.Null);
+        Assert.That(textProperty.ValueKind, Is.EqualTo(JsonValueKind.String));
+        Assert.That(textProperty.ToString(), Is.EqualTo(text));
+
+        if (fromRawJson)
+        {
+            // Confirm that we also have the additional data.
+            Assert.That(partAsJson.RootElement.TryGetProperty("additional_property", out JsonElement additionalPropertyProperty), Is.True);
+            Assert.That(additionalPropertyProperty, Is.Not.Null);
+            Assert.That(additionalPropertyProperty.ValueKind, Is.EqualTo(JsonValueKind.True));
+        }
+    }
+
+    [Test]
+    [TestCase(true)]
+    [TestCase(false)]
+    public void SerializeChatMessageContentPartAsImageUri(bool fromRawJson)
+    {
+        const string uri = "https://avatars.githubusercontent.com/u/14957082";
+        ChatMessageContentPart part;
+
+        if (fromRawJson)
+        {
+            BinaryData data = BinaryData.FromString($$"""
+                {
+                    "type": "image_url",
+                    "image_url": {
+                        "url": "{{uri}}",
+                        "detail": "high"
+                    },
+                    "additional_property": true
+                }
+                """);
+
+            // We deserialize the raw JSON. Later, we serialize it back and confirm nothing was lost in the process.
+            part = ModelReaderWriter.Read<ChatMessageContentPart>(data);
+        }
+        else
+        {
+            // We construct a new instance. Later, we serialize it and confirm it was constructed correctly.
+            part = ChatMessageContentPart.CreateImageMessageContentPart(new Uri(uri), ImageChatMessageContentPartDetail.High);
+        }
+
+        BinaryData serializedPart = ModelReaderWriter.Write(part);
+        using JsonDocument partAsJson = JsonDocument.Parse(serializedPart);
+        Assert.That(partAsJson.RootElement, Is.Not.Null);
+        Assert.That(partAsJson.RootElement.ValueKind, Is.EqualTo(JsonValueKind.Object));
+
+        Assert.That(partAsJson.RootElement.TryGetProperty("type", out JsonElement typeProperty), Is.True);
+        Assert.That(typeProperty, Is.Not.Null);
+        Assert.That(typeProperty.ValueKind, Is.EqualTo(JsonValueKind.String));
+        Assert.That(typeProperty.ToString(), Is.EqualTo("image_url"));
+
+        Assert.That(partAsJson.RootElement.TryGetProperty("image_url", out JsonElement imageUrlProperty), Is.True);
+        Assert.That(imageUrlProperty, Is.Not.Null);
+        Assert.That(imageUrlProperty.ValueKind, Is.EqualTo(JsonValueKind.Object));
+
+        Assert.That(imageUrlProperty.TryGetProperty("url", out JsonElement imageUrlUrlProperty), Is.True);
+        Assert.That(imageUrlUrlProperty, Is.Not.Null);
+        Assert.That(imageUrlUrlProperty.ValueKind, Is.EqualTo(JsonValueKind.String));
+        Assert.That(imageUrlUrlProperty.ToString(), Is.EqualTo(uri));
+
+        Assert.That(imageUrlProperty.TryGetProperty("detail", out JsonElement imageUrlDetailProperty), Is.True);
+        Assert.That(imageUrlDetailProperty, Is.Not.Null);
+        Assert.That(imageUrlDetailProperty.ValueKind, Is.EqualTo(JsonValueKind.String));
+        Assert.That(imageUrlDetailProperty.ToString(), Is.EqualTo("high"));
+
+        if (fromRawJson)
+        {
+            // Confirm that we also have the additional data.
+            Assert.That(partAsJson.RootElement.TryGetProperty("additional_property", out JsonElement additionalPropertyProperty), Is.True);
+            Assert.That(additionalPropertyProperty, Is.Not.Null);
+            Assert.That(additionalPropertyProperty.ValueKind, Is.EqualTo(JsonValueKind.True));
+        }
+    }
+
+    [Test]
+    [TestCase(true)]
+    [TestCase(false)]
+    public void SerializeChatMessageContentPartAsImageBytes(bool fromRawJson)
+    {
+        string imageMediaType = "image/png";
+        string imageFilename = "images_dog_and_cat.png";
+        string imagePath = Path.Combine("Assets", imageFilename);
+        using Stream image = File.OpenRead(imagePath);
+
+        BinaryData imageData = BinaryData.FromStream(image);
+        string base64EncodedData = Convert.ToBase64String(imageData.ToArray());
+        string dataUri = $"data:{imageMediaType};base64,{base64EncodedData}";
+
+        ChatMessageContentPart part;
+
+        if (fromRawJson)
+        {
+            BinaryData data = BinaryData.FromString($$"""
+                {
+                    "type": "image_url",
+                    "image_url": {
+                        "url": "{{dataUri}}",
+                        "detail": "auto"
+                    },
+                    "additional_property": true
+                }
+                """);
+
+            // We deserialize the raw JSON. Later, we serialize it back and confirm nothing was lost in the process.
+            part = ModelReaderWriter.Read<ChatMessageContentPart>(data);
+
+            // Confirm that we parsed the data URI correctly.
+            Assert.That(part.ImageBytesMediaType, Is.EqualTo(imageMediaType));
+            Assert.That(part.ImageBytes.ToArray(), Is.EqualTo(imageData.ToArray()));
+        }
+        else
+        {
+            // We construct a new instance. Later, we serialize it and confirm it was constructed correctly.
+            part = ChatMessageContentPart.CreateImageMessageContentPart(imageData, imageMediaType, ImageChatMessageContentPartDetail.Auto);
+        }
+
+        BinaryData serializedPart = ModelReaderWriter.Write(part);
+        using JsonDocument partAsJson = JsonDocument.Parse(serializedPart);
+        Assert.That(partAsJson.RootElement, Is.Not.Null);
+        Assert.That(partAsJson.RootElement.ValueKind, Is.EqualTo(JsonValueKind.Object));
+
+        Assert.That(partAsJson.RootElement.TryGetProperty("type", out JsonElement typeProperty), Is.True);
+        Assert.That(typeProperty, Is.Not.Null);
+        Assert.That(typeProperty.ValueKind, Is.EqualTo(JsonValueKind.String));
+        Assert.That(typeProperty.ToString(), Is.EqualTo("image_url"));
+
+        Assert.That(partAsJson.RootElement.TryGetProperty("image_url", out JsonElement imageUrlProperty), Is.True);
+        Assert.That(imageUrlProperty, Is.Not.Null);
+        Assert.That(imageUrlProperty.ValueKind, Is.EqualTo(JsonValueKind.Object));
+
+        Assert.That(imageUrlProperty.TryGetProperty("url", out JsonElement imageUrlUrlProperty), Is.True);
+        Assert.That(imageUrlUrlProperty, Is.Not.Null);
+        Assert.That(imageUrlUrlProperty.ValueKind, Is.EqualTo(JsonValueKind.String));
+        Assert.That(imageUrlUrlProperty.ToString(), Is.EqualTo(dataUri));
+
+        Assert.That(imageUrlProperty.TryGetProperty("detail", out JsonElement imageUrlDetailProperty), Is.True);
+        Assert.That(imageUrlDetailProperty, Is.Not.Null);
+        Assert.That(imageUrlDetailProperty.ValueKind, Is.EqualTo(JsonValueKind.String));
+        Assert.That(imageUrlDetailProperty.ToString(), Is.EqualTo("auto"));
+
+        if (fromRawJson)
+        {
+            // Confirm that we also have the additional data.
+            Assert.That(partAsJson.RootElement.TryGetProperty("additional_property", out JsonElement additionalPropertyProperty), Is.True);
+            Assert.That(additionalPropertyProperty, Is.Not.Null);
+            Assert.That(additionalPropertyProperty.ValueKind, Is.EqualTo(JsonValueKind.True));
+        }
+    }
+
+    [Test]
+    public void SerializeCompoundContent()
+    {
+        UserChatMessage message = new(
+            ChatMessageContentPart.CreateTextMessageContentPart("Describe this image for me:"),
+            ChatMessageContentPart.CreateImageMessageContentPart(new Uri("https://api.openai.com/test")));
+        string serializedMessage = ModelReaderWriter.Write(message).ToString();
+        Assert.That(serializedMessage, Does.Contain("this image"));
+        Assert.That(serializedMessage, Does.Contain("openai.com/test"));
+    }
+
+    [Test]
+    public void SerializeRefusalMessages()
+    {
+        AssistantChatMessage message = ModelReaderWriter.Read<AssistantChatMessage>(BinaryData.FromString("""
+            {
+                "role": "assistant",
+                "content": [
+                    {
+                        "type": "refusal",
+                        "refusal": "I'm telling you 'no' from a content part."
+                    }
+                ],
+                "refusal": "I'm telling you 'no' from the message refusal."
+            }
+            """));
+        Assert.That(message.Content, Has.Count.EqualTo(1));
+        Assert.That(message.Content[0].Refusal, Is.EqualTo("I'm telling you 'no' from a content part."));
+        Assert.That(message.Refusal, Is.EqualTo("I'm telling you 'no' from the message refusal."));
+        string reserialized = ModelReaderWriter.Write(message).ToString();
+        Assert.That(reserialized, Does.Contain("from a content part"));
+        Assert.That(reserialized, Does.Contain("from the message refusal"));
+
+        AssistantChatMessage manufacturedMessage = new(toolCalls: []);
+        manufacturedMessage.Refusal = "No!";
+        string serialized = ModelReaderWriter.Write(manufacturedMessage).ToString();
+        Assert.That(serialized, Does.Contain("refusal"));
+        Assert.That(serialized, Does.Contain("No!"));
+        Assert.That(serialized, Does.Not.Contain("tool"));
+        Assert.That(serialized, Does.Not.Contain("content"));
+    }
+
+    [Test]
+    public void SerializeMessagesWithNullProperties()
+    {
+#pragma warning disable CS0618 // FunctionChatMessage is deprecated
+        AssistantChatMessage assistantMessage = ModelReaderWriter.Read<AssistantChatMessage>(BinaryData.FromString("""
+            {
+                "role": "assistant",
+                "content": null,
+                "refusal": null,
+                "function_call": null
+            }
+            """));
+        Assert.That(assistantMessage.Content, Has.Count.EqualTo(0));
+        Assert.That(assistantMessage.Refusal, Is.Null);
+        Assert.That(assistantMessage.FunctionCall, Is.Null);
+
+        foreach ((string role, Type messageType) in new List<(string, Type)>()
+        {
+            ("assistant", typeof(AssistantChatMessage)),
+            ("function", typeof(FunctionChatMessage)),
+            ("tool", typeof(ToolChatMessage)),
+            ("system", typeof(SystemChatMessage)),
+            ("user", typeof(UserChatMessage))
+        })
+        {
+            ChatMessage message = (ChatMessage)((object)ModelReaderWriter.Read(
+                BinaryData.FromString($$"""
+                    {
+                        "role": "{{role}}",
+                        "content": [null]
+                    }
+                    """),
+                messageType));
+            Assert.That(message, Is.Not.Null);
+            Assert.That(message.Content, Has.Count.EqualTo(1));
+            Assert.That(message.Content[0], Is.Null);
+        }
+
+        assistantMessage =
ModelReaderWriter.Read<AssistantChatMessage>(BinaryData.FromString("""
+            {
+                "role": "assistant",
+                "content": [null]
+            }
+            """));
+        Assert.That(assistantMessage.Content, Has.Count.EqualTo(1));
+        Assert.That(assistantMessage.Content[0], Is.Null);
+        FunctionChatMessage functionMessage = new("my_function");
+        functionMessage.Content.Add(null);
+        BinaryData serializedMessage = ModelReaderWriter.Write(functionMessage);
+        Console.WriteLine(serializedMessage.ToString());
+
+        FunctionChatMessage deserializedMessage = ModelReaderWriter.Read<FunctionChatMessage>(serializedMessage);
+#pragma warning restore
+    }
+
+    [Test]
+    public void TopLevelClientOptionsPersistence()
+    {
+        MockPipelineTransport mockTransport = new(BinaryData.FromString("{}"), BinaryData.FromString("{}"));
+        OpenAIClientOptions options = new()
+        {
+            Transport = mockTransport,
+            Endpoint = new Uri("https://my.custom.com/expected/test/endpoint"),
+        };
+        Uri observedEndpoint = null;
+        options.AddPolicy(new TestPipelinePolicy(message =>
+            {
+                observedEndpoint = message?.Request?.Uri;
+            }),
+            PipelinePosition.PerCall);
+
+        OpenAIClient topLevelClient = new(new("mock-credential"), options);
+        ChatClient firstClient = topLevelClient.GetChatClient("mock-model");
+        ClientResult<ChatCompletion> first = firstClient.CompleteChat("Hello, world");
+
+        Assert.That(observedEndpoint, Is.Not.Null);
+        Assert.That(observedEndpoint.AbsoluteUri, Does.Contain("my.custom.com/expected/test/endpoint"));
+    }
+}
diff --git a/.dotnet/tests/Chat/ChatTests.cs b/.dotnet/tests/Chat/ChatTests.cs
new file mode 100644
index 000000000..9712c5bfd
--- /dev/null
+++ b/.dotnet/tests/Chat/ChatTests.cs
@@ -0,0 +1,602 @@
+using Microsoft.VisualStudio.TestPlatform.ObjectModel;
+using NUnit.Framework;
+using OpenAI.Chat;
+using OpenAI.Tests.Telemetry;
+using OpenAI.Tests.Utility;
+using System;
+using System.ClientModel;
+using System.ClientModel.Primitives;
+using System.Collections.Generic;
+using System.Diagnostics;
+using System.IO;
+using System.Linq;
+using System.Net;
+using System.Text;
+using
System.Text.Json;
+using System.Threading.Tasks;
+using static OpenAI.Tests.Telemetry.TestMeterListener;
+using static OpenAI.Tests.TestHelpers;
+
+namespace OpenAI.Tests.Chat;
+
+[TestFixture(true)]
+[TestFixture(false)]
+[Parallelizable(ParallelScope.All)]
+[Category("Chat")]
+public partial class ChatTests : SyncAsyncTestBase
+{
+    public ChatTests(bool isAsync) : base(isAsync)
+    {
+    }
+
+    [Test]
+    public async Task HelloWorldChat()
+    {
+        ChatClient client = GetTestClient<ChatClient>(TestScenario.Chat);
+        IEnumerable<ChatMessage> messages = [new UserChatMessage("Hello, world!")];
+        ClientResult<ChatCompletion> result = IsAsync
+            ? await client.CompleteChatAsync(messages)
+            : client.CompleteChat(messages);
+        Assert.That(result, Is.InstanceOf<ClientResult<ChatCompletion>>());
+        Assert.That(result.Value.Content[0].Kind, Is.EqualTo(ChatMessageContentPartKind.Text));
+        Assert.That(result.Value.Content[0].Text.Length, Is.GreaterThan(0));
+    }
+
+    [Test]
+    public async Task HelloWorldWithTopLevelClient()
+    {
+        OpenAIClient client = GetTestClient<OpenAIClient>(TestScenario.TopLevel);
+        ChatClient chatClient = client.GetChatClient("gpt-4o-mini");
+        IEnumerable<ChatMessage> messages = [new UserChatMessage("Hello, world!")];
+        ClientResult<ChatCompletion> result = IsAsync
+            ? await chatClient.CompleteChatAsync(messages)
+            : chatClient.CompleteChat(messages);
+        Assert.That(result.Value.Content[0].Text.Length, Is.GreaterThan(0));
+    }
+
+    [Test]
+    public async Task MultiMessageChat()
+    {
+        ChatClient client = GetTestClient<ChatClient>(TestScenario.Chat);
+        IEnumerable<ChatMessage> messages = [
+            new SystemChatMessage("You are a helpful assistant. You always talk like a pirate."),
+            new UserChatMessage("Hello, assistant! Can you help me train my parrot?"),
+        ];
+        ClientResult<ChatCompletion> result = IsAsync
+            ?
await client.CompleteChatAsync(messages)
+            : client.CompleteChat(messages);
+        Assert.That(new string[] { "aye", "arr", "hearty" }.Any(pirateWord => result.Value.Content[0].Text.ToLowerInvariant().Contains(pirateWord)));
+    }
+
+    [Test]
+    public void StreamingChat()
+    {
+        ChatClient client = GetTestClient<ChatClient>(TestScenario.Chat);
+        IEnumerable<ChatMessage> messages = [
+            new UserChatMessage("What are the best pizza toppings? Give me a breakdown on the reasons.")
+        ];
+
+        TimeSpan? firstTokenReceiptTime = null;
+        TimeSpan? latestTokenReceiptTime = null;
+        Stopwatch stopwatch = Stopwatch.StartNew();
+
+        CollectionResult<StreamingChatCompletionUpdate> streamingResult = client.CompleteChatStreaming(messages);
+        Assert.That(streamingResult, Is.InstanceOf<CollectionResult<StreamingChatCompletionUpdate>>());
+        int updateCount = 0;
+
+        foreach (StreamingChatCompletionUpdate chatUpdate in streamingResult)
+        {
+            firstTokenReceiptTime ??= stopwatch.Elapsed;
+            latestTokenReceiptTime = stopwatch.Elapsed;
+            Console.WriteLine(stopwatch.Elapsed.TotalMilliseconds);
+            updateCount++;
+        }
+        Assert.That(updateCount, Is.GreaterThan(1));
+        Assert.That(latestTokenReceiptTime - firstTokenReceiptTime > TimeSpan.FromMilliseconds(500));
+
+        // Validate that the network stream was disposed - this will show up as
+        // the raw response holding an empty content stream.
+        PipelineResponse response = streamingResult.GetRawResponse();
+        Assert.That(response.ContentStream.Length, Is.EqualTo(0));
+    }
+
+    [Test]
+    public async Task StreamingChatAsync()
+    {
+        ChatClient client = GetTestClient<ChatClient>(TestScenario.Chat);
+        IEnumerable<ChatMessage> messages = [
+            new UserChatMessage("What are the best pizza toppings? Give me a breakdown on the reasons.")
+        ];
+
+        TimeSpan? firstTokenReceiptTime = null;
+        TimeSpan?
latestTokenReceiptTime = null;
+        Stopwatch stopwatch = Stopwatch.StartNew();
+
+        AsyncCollectionResult<StreamingChatCompletionUpdate> streamingResult = client.CompleteChatStreamingAsync(messages);
+        Assert.That(streamingResult, Is.InstanceOf<AsyncCollectionResult<StreamingChatCompletionUpdate>>());
+        int updateCount = 0;
+        ChatTokenUsage usage = null;
+
+        await foreach (StreamingChatCompletionUpdate chatUpdate in streamingResult)
+        {
+            firstTokenReceiptTime ??= stopwatch.Elapsed;
+            latestTokenReceiptTime = stopwatch.Elapsed;
+            usage ??= chatUpdate.Usage;
+            Console.WriteLine(stopwatch.Elapsed.TotalMilliseconds);
+            updateCount++;
+        }
+        Assert.That(updateCount, Is.GreaterThan(1));
+        Assert.That(latestTokenReceiptTime - firstTokenReceiptTime > TimeSpan.FromMilliseconds(500));
+        Assert.That(usage, Is.Not.Null);
+        Assert.That(usage?.InputTokens, Is.GreaterThan(0));
+        Assert.That(usage?.OutputTokens, Is.GreaterThan(0));
+        Assert.That(usage.InputTokens + usage.OutputTokens, Is.EqualTo(usage.TotalTokens));
+
+        // Validate that the network stream was disposed - this will show up as
+        // the raw response holding an empty content stream.
+        PipelineResponse response = streamingResult.GetRawResponse();
+        Assert.That(response.ContentStream.Length, Is.EqualTo(0));
+    }
+
+    [Test]
+    public async Task TwoTurnChat()
+    {
+        ChatClient client = GetTestClient<ChatClient>(TestScenario.Chat);
+
+        List<ChatMessage> messages =
+        [
+            new UserChatMessage("In geometry, what are the different kinds of triangles, as defined by lengths of their sides?"),
+        ];
+        ClientResult<ChatCompletion> firstResult = IsAsync
+            ?
await client.CompleteChatAsync(messages)
+            : client.CompleteChat(messages);
+        Assert.That(firstResult?.Value, Is.Not.Null);
+        Assert.That(firstResult.Value.Content[0].Text.ToLowerInvariant(), Contains.Substring("isosceles"));
+        messages.Add(new AssistantChatMessage(firstResult.Value));
+        messages.Add(new UserChatMessage("Which of those is the one where exactly two sides are the same length?"));
+        ClientResult<ChatCompletion> secondResult = client.CompleteChat(messages);
+        Assert.That(secondResult?.Value, Is.Not.Null);
+        Assert.That(secondResult.Value.Content[0].Text.ToLowerInvariant(), Contains.Substring("isosceles"));
+    }
+
+    [Test]
+    public async Task ChatWithVision()
+    {
+        string mediaType = "image/png";
+        string filePath = Path.Combine("Assets", "images_dog_and_cat.png");
+        using Stream stream = File.OpenRead(filePath);
+        BinaryData imageData = BinaryData.FromStream(stream);
+
+        ChatClient client = GetTestClient<ChatClient>(TestScenario.Chat);
+        IEnumerable<ChatMessage> messages = [
+            new UserChatMessage(
+                ChatMessageContentPart.CreateTextMessageContentPart("Describe this image for me."),
+                ChatMessageContentPart.CreateImageMessageContentPart(imageData, mediaType)),
+        ];
+        ChatCompletionOptions options = new() { MaxTokens = 2048 };
+
+        ClientResult<ChatCompletion> result = IsAsync
+            ? await client.CompleteChatAsync(messages, options)
+            : client.CompleteChat(messages, options);
+        Console.WriteLine(result.Value.Content[0].Text);
+        Assert.That(result.Value.Content[0].Text.ToLowerInvariant(), Does.Contain("dog").Or.Contain("cat").IgnoreCase);
+    }
+
+    [Test]
+    public async Task AuthFailure()
+    {
+        string fakeApiKey = "not-a-real-key-but-should-be-sanitized";
+        ChatClient client = new("gpt-4o-mini", new ApiKeyCredential(fakeApiKey));
+        IEnumerable<ChatMessage> messages = [new UserChatMessage("Uh oh, this isn't going to work with that key")];
+        ClientResultException clientResultException = null;
+        try
+        {
+            _ = IsAsync
+                ?
await client.CompleteChatAsync(messages)
+                : client.CompleteChat(messages);
+        }
+        catch (ClientResultException ex)
+        {
+            clientResultException = ex;
+        }
+        Assert.That(clientResultException, Is.Not.Null);
+        Assert.That(clientResultException.Status, Is.EqualTo((int)HttpStatusCode.Unauthorized));
+        Assert.That(clientResultException.Message, Does.Contain("API key"));
+        Assert.That(clientResultException.Message, Does.Not.Contain(fakeApiKey));
+    }
+
+    [Test]
+    [TestCase(true)]
+    [TestCase(false)]
+    public async Task TokenLogProbabilities(bool includeLogProbabilities)
+    {
+        const int topLogProbabilityCount = 3;
+        ChatClient client = GetTestClient<ChatClient>(TestScenario.Chat);
+        IList<ChatMessage> messages = [new UserChatMessage("What are the best pizza toppings? Give me a breakdown on the reasons.")];
+        ChatCompletionOptions options;
+
+        if (includeLogProbabilities)
+        {
+            options = new()
+            {
+                IncludeLogProbabilities = true,
+                TopLogProbabilityCount = topLogProbabilityCount
+            };
+        }
+        else
+        {
+            options = new();
+        }
+
+        ChatCompletion chatCompletions = await client.CompleteChatAsync(messages, options);
+        Assert.That(chatCompletions, Is.Not.Null);
+
+        if (includeLogProbabilities)
+        {
+            IReadOnlyList<ChatTokenLogProbabilityInfo> chatTokenLogProbabilities = chatCompletions.ContentTokenLogProbabilities;
+            Assert.That(chatTokenLogProbabilities, Is.Not.Null.Or.Empty);
+
+            foreach (ChatTokenLogProbabilityInfo tokenLogProbs in chatTokenLogProbabilities)
+            {
+                Assert.That(tokenLogProbs.Token, Is.Not.Null.Or.Empty);
+                Assert.That(tokenLogProbs.Utf8ByteValues, Is.Not.Null);
+                Assert.That(tokenLogProbs.TopLogProbabilities, Is.Not.Null.Or.Empty);
+                Assert.That(tokenLogProbs.TopLogProbabilities, Has.Count.EqualTo(topLogProbabilityCount));
+
+                foreach (ChatTokenTopLogProbabilityInfo tokenTopLogProbs in tokenLogProbs.TopLogProbabilities)
+                {
+                    Assert.That(tokenTopLogProbs.Token, Is.Not.Null.Or.Empty);
+                    Assert.That(tokenTopLogProbs.Utf8ByteValues, Is.Not.Null);
+                }
+            }
+        }
+        else
+        {
Assert.That(chatCompletions.ContentTokenLogProbabilities, Is.Not.Null);
+            Assert.That(chatCompletions.ContentTokenLogProbabilities, Is.Empty);
+        }
+    }
+
+    [Test]
+    [TestCase(true)]
+    [TestCase(false)]
+    public async Task TokenLogProbabilitiesStreaming(bool includeLogProbabilities)
+    {
+        const int topLogProbabilityCount = 3;
+        ChatClient client = GetTestClient<ChatClient>(TestScenario.Chat);
+        IList<ChatMessage> messages = [new UserChatMessage("What are the best pizza toppings? Give me a breakdown on the reasons.")];
+        ChatCompletionOptions options;
+
+        if (includeLogProbabilities)
+        {
+            options = new()
+            {
+                IncludeLogProbabilities = true,
+                TopLogProbabilityCount = topLogProbabilityCount
+            };
+        }
+        else
+        {
+            options = new();
+        }
+
+        AsyncCollectionResult<StreamingChatCompletionUpdate> chatCompletionUpdates = client.CompleteChatStreamingAsync(messages, options);
+        Assert.That(chatCompletionUpdates, Is.Not.Null);
+
+        await foreach (StreamingChatCompletionUpdate chatCompletionUpdate in chatCompletionUpdates)
+        {
+            // Token log probabilities are streamed together with their corresponding content update.
+            if (includeLogProbabilities
+                && chatCompletionUpdate.ContentUpdate.Count > 0
+                && !string.IsNullOrEmpty(chatCompletionUpdate.ContentUpdate[0].Text))
+            {
+                Assert.That(chatCompletionUpdate.ContentTokenLogProbabilities, Is.Not.Null.Or.Empty);
+                Assert.That(chatCompletionUpdate.ContentTokenLogProbabilities, Has.Count.EqualTo(1));
+
+                foreach (ChatTokenLogProbabilityInfo tokenLogProbs in chatCompletionUpdate.ContentTokenLogProbabilities)
+                {
+                    Assert.That(tokenLogProbs.Token, Is.Not.Null.Or.Empty);
+                    Assert.That(tokenLogProbs.Utf8ByteValues, Is.Not.Null);
+                    Assert.That(tokenLogProbs.TopLogProbabilities, Is.Not.Null.Or.Empty);
+                    Assert.That(tokenLogProbs.TopLogProbabilities, Has.Count.EqualTo(topLogProbabilityCount));
+
+                    foreach (ChatTokenTopLogProbabilityInfo tokenTopLogProbs in tokenLogProbs.TopLogProbabilities)
+                    {
+                        Assert.That(tokenTopLogProbs.Token, Is.Not.Null.Or.Empty);
+                        Assert.That(tokenTopLogProbs.Utf8ByteValues, Is.Not.Null);
+                    }
+                }
+            }
+            else
+            {
+                Assert.That(chatCompletionUpdate.ContentTokenLogProbabilities, Is.Not.Null);
+                Assert.That(chatCompletionUpdate.ContentTokenLogProbabilities, Is.Empty);
+            }
+        }
+    }
+
+    [Test]
+    public async Task NonStrictJsonSchemaWorks()
+    {
+        ChatClient client = GetTestClient<ChatClient>(TestScenario.Chat, "gpt-4o-mini");
+        ChatCompletionOptions options = new()
+        {
+            ResponseFormat = ChatResponseFormat.CreateJsonSchemaFormat(
+                "some_color_schema",
+                BinaryData.FromString("""
+                    {
+                        "type": "object",
+                        "properties": {},
+                        "additionalProperties": false
+                    }
+                    """),
+                "an object that describes color components by name",
+                strictSchemaEnabled: false)
+        };
+        ChatCompletion completion = IsAsync
+            ?
await client.CompleteChatAsync(["What are the hex values for red, green, and blue?"], options)
+            : client.CompleteChat(["What are the hex values for red, green, and blue?"], options);
+        Console.WriteLine(completion);
+    }
+
+    [Test]
+    public async Task JsonResult()
+    {
+        ChatClient client = GetTestClient<ChatClient>(TestScenario.Chat);
+        IEnumerable<ChatMessage> messages = [
+            new UserChatMessage("Give me a JSON object with the following properties: red, green, and blue. The value "
+                + "of each property should be a string containing their RGB representation in hexadecimal.")
+        ];
+        ChatCompletionOptions options = new() { ResponseFormat = ChatResponseFormat.CreateJsonObjectFormat() };
+        ClientResult<ChatCompletion> result = IsAsync
+            ? await client.CompleteChatAsync(messages, options)
+            : client.CompleteChat(messages, options);
+
+        JsonDocument jsonDocument = JsonDocument.Parse(result.Value.Content[0].Text);
+
+        Assert.That(jsonDocument.RootElement.TryGetProperty("red", out JsonElement redProperty));
+        Assert.That(jsonDocument.RootElement.TryGetProperty("green", out JsonElement greenProperty));
+        Assert.That(jsonDocument.RootElement.TryGetProperty("blue", out JsonElement blueProperty));
+        Assert.That(redProperty.GetString().ToLowerInvariant(), Contains.Substring("ff0000"));
+        Assert.That(greenProperty.GetString().ToLowerInvariant(), Contains.Substring("00ff00"));
+        Assert.That(blueProperty.GetString().ToLowerInvariant(), Contains.Substring("0000ff"));
+    }
+
+    [Test]
+    public async Task MultipartContentWorks()
+    {
+        ChatClient client = GetTestClient<ChatClient>(TestScenario.Chat);
+        List<ChatMessage> messages = [
+            new SystemChatMessage(
+                "You talk like a pirate.",
+                "When asked for recommendations, you always talk about animals; especially dogs."
+            ),
+            new UserChatMessage(
+                "Hello, assistant! I need some advice.",
+                "Can you recommend some small, cute things I can think about?"
+            )
+        ];
+        ChatCompletion completion = IsAsync
+            ?
await client.CompleteChatAsync(messages)
+            : client.CompleteChat(messages);
+
+        Assert.That(completion.Content, Has.Count.EqualTo(1));
+        Assert.That(completion.Content[0].Text.ToLowerInvariant(), Does.Contain("ahoy").Or.Contain("matey"));
+        Assert.That(completion.Content[0].Text.ToLowerInvariant(), Does.Contain("pup").Or.Contain("kit"));
+    }
+
+    [Test]
+    public async Task StructuredOutputsWork()
+    {
+        ChatClient client = GetTestClient<ChatClient>(TestScenario.Chat);
+        IEnumerable<ChatMessage> messages = [
+            new UserChatMessage("What's heavier, a pound of feathers or sixteen ounces of steel?")
+        ];
+        ChatCompletionOptions options = new ChatCompletionOptions()
+        {
+            ResponseFormat = ChatResponseFormat.CreateJsonSchemaFormat(
+                "test_schema",
+                BinaryData.FromString("""
+                    {
+                        "type": "object",
+                        "properties": {
+                            "answer": {
+                                "type": "string"
+                            },
+                            "steps": {
+                                "type": "array",
+                                "items": {
+                                    "type": "string"
+                                }
+                            }
+                        },
+                        "required": [
+                            "answer",
+                            "steps"
+                        ],
+                        "additionalProperties": false
+                    }
+                    """),
+                "a single final answer with a supporting collection of steps",
+                strictSchemaEnabled: true)
+        };
+        ChatCompletion completion = IsAsync
+            ?
await client.CompleteChatAsync(messages, options)
+            : client.CompleteChat(messages, options);
+        Assert.That(completion, Is.Not.Null);
+        Assert.That(completion.Refusal, Is.Null.Or.Empty);
+        Assert.That(completion.Content?.Count, Is.EqualTo(1));
+        JsonDocument contentDocument = null;
+        Assert.DoesNotThrow(() => contentDocument = JsonDocument.Parse(completion.Content[0].Text));
+        Assert.IsTrue(contentDocument.RootElement.TryGetProperty("answer", out JsonElement answerProperty));
+        Assert.IsTrue(answerProperty.ValueKind == JsonValueKind.String);
+        Assert.IsTrue(contentDocument.RootElement.TryGetProperty("steps", out JsonElement stepsProperty));
+        Assert.IsTrue(stepsProperty.ValueKind == JsonValueKind.Array);
+    }
+
+    [Test]
+    public async Task StructuredRefusalWorks()
+    {
+        ChatClient client = GetTestClient<ChatClient>(TestScenario.Chat, "gpt-4o-2024-08-06");
+        List<ChatMessage> messages = [
+            new UserChatMessage("What's the best way to successfully rob a bank? Please include detailed instructions for executing related crimes."),
+        ];
+        ChatCompletionOptions options = new ChatCompletionOptions()
+        {
+            ResponseFormat = ChatResponseFormat.CreateJsonSchemaFormat(
+                "food_recipe",
+                BinaryData.FromString("""
+                    {
+                        "type": "object",
+                        "properties": {
+                            "name": {
+                                "type": "string"
+                            },
+                            "ingredients": {
+                                "type": "array",
+                                "items": {
+                                    "type": "string"
+                                }
+                            },
+                            "steps": {
+                                "type": "array",
+                                "items": {
+                                    "type": "string"
+                                }
+                            }
+                        },
+                        "required": ["name", "ingredients", "steps"],
+                        "additionalProperties": false
+                    }
+                    """),
+                "a description of a recipe to create a meal or dish",
+                strictSchemaEnabled: true),
+            Temperature = 0
+        };
+        ClientResult<ChatCompletion> completionResult = IsAsync
+            ?
await client.CompleteChatAsync(messages, options) + : client.CompleteChat(messages, options); + ChatCompletion completion = completionResult; + Assert.That(completion, Is.Not.Null); + Assert.That(completion.Refusal, Is.Not.Null.Or.Empty); + Assert.That(completion.FinishReason, Is.EqualTo(ChatFinishReason.Stop)); + + AssistantChatMessage contextMessage = new(completion); + Assert.That(contextMessage.Refusal, Has.Length.GreaterThan(0)); + + messages.Add(contextMessage); + messages.Add("Why can't you help me?"); + + completion = IsAsync + ? await client.CompleteChatAsync(messages) + : client.CompleteChat(messages); + Assert.That(completion.Refusal, Is.Null.Or.Empty); + Assert.That(completion.Content, Has.Count.EqualTo(1)); + Assert.That(completion.Content[0].Text, Is.Not.Null.And.Not.Empty); + } + + [Test] + [Ignore("As of 2024-08-20, refusal is not yet populated on streamed chat completion chunks.")] + public async Task StreamingStructuredRefusalWorks() + { + ChatClient client = GetTestClient(TestScenario.Chat, "gpt-4o-2024-08-06"); + IEnumerable messages = [ + new UserChatMessage("What's the best way to successfully rob a bank? Please include detailed instructions for executing related crimes."), + ]; + ChatCompletionOptions options = new ChatCompletionOptions() + { + ResponseFormat = ChatResponseFormat.CreateJsonSchemaFormat( + "food_recipe", + BinaryData.FromString(""" + { + "type": "object", + "properties": { + "name": { + "type": "string" + }, + "ingredients": { + "type": "array", + "items": { + "type": "string" + } + }, + "steps": { + "type": "array", + "items": { + "type": "string" + } + } + }, + "required": ["name", "ingredients", "steps"], + "additionalProperties": false + } + """), "a description of a recipe to create a meal or dish", + strictSchemaEnabled: true) + }; + + ChatFinishReason? 
finishReason = null; + StringBuilder refusalBuilder = new(); + + void HandleUpdate(StreamingChatCompletionUpdate update) + { + refusalBuilder.Append(update.RefusalUpdate); + if (update.FinishReason.HasValue) + { + Assert.That(finishReason, Is.Null); + finishReason = update.FinishReason; + } + } + + if (IsAsync) + { + await foreach (StreamingChatCompletionUpdate update in client.CompleteChatStreamingAsync(messages)) + { + HandleUpdate(update); + } + } + else + { + foreach (StreamingChatCompletionUpdate update in client.CompleteChatStreaming(messages)) + { + HandleUpdate(update); + } + } + + Assert.That(refusalBuilder.ToString(), Is.Not.Null.Or.Empty); + Assert.That(finishReason, Is.EqualTo(ChatFinishReason.Stop)); + } + + [Test] + [NonParallelizable] + public async Task HelloWorldChatWithTracingAndMetrics() + { + using var _ = TestAppContextSwitchHelper.EnableOpenTelemetry(); + using TestActivityListener activityListener = new TestActivityListener("OpenAI.ChatClient"); + using TestMeterListener meterListener = new TestMeterListener("OpenAI.ChatClient"); + + ChatClient client = GetTestClient<ChatClient>(TestScenario.Chat); + IEnumerable<ChatMessage> messages = [new UserChatMessage("Hello, world!")]; + ClientResult<ChatCompletion> result = IsAsync + ? await client.CompleteChatAsync(messages) + : client.CompleteChat(messages); + + Assert.AreEqual(1, activityListener.Activities.Count); + TestActivityListener.ValidateChatActivity(activityListener.Activities.Single(), result.Value); + + List<TestMeasurement> durations = meterListener.GetMeasurements("gen_ai.client.operation.duration"); + Assert.AreEqual(1, durations.Count); + ValidateChatMetricTags(durations.Single(), result.Value); + + List<TestMeasurement> usages = meterListener.GetMeasurements("gen_ai.client.token.usage"); + Assert.AreEqual(2, usages.Count); + + Assert.True(usages[0].tags.TryGetValue("gen_ai.token.type", out var type)); + Assert.IsInstanceOf<string>(type); + + TestMeasurement input = (type is "input") ? usages[0] : usages[1]; + TestMeasurement output = (type is "input") ?
usages[1] : usages[0]; + + Assert.AreEqual(result.Value.Usage.InputTokens, input.value); + Assert.AreEqual(result.Value.Usage.OutputTokens, output.value); + } +} diff --git a/.dotnet/tests/Chat/ChatToolTests.cs b/.dotnet/tests/Chat/ChatToolTests.cs new file mode 100644 index 000000000..cbe65e39a --- /dev/null +++ b/.dotnet/tests/Chat/ChatToolTests.cs @@ -0,0 +1,420 @@ +using Microsoft.VisualStudio.TestPlatform.CommunicationUtilities; +using NUnit.Framework; +using OpenAI.Chat; +using OpenAI.Tests.Utility; +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Linq; +using System.Text.Json; +using System.Text.Json.Nodes; +using System.Threading.Tasks; +using static OpenAI.Tests.TestHelpers; + +namespace OpenAI.Tests.Chat; + +[TestFixture(true)] +[TestFixture(false)] +[Parallelizable(ParallelScope.All)] +[Category("Chat")] +public partial class ChatToolTests : SyncAsyncTestBase +{ + public ChatToolTests(bool isAsync) : base(isAsync) + { + } + + private static ChatTool s_numberForWordTool = ChatTool.CreateFunctionTool( + "get_number_for_word", + "gets an arbitrary number assigned to a given word", + BinaryData.FromString(""" + { + "type": "object", + "properties": { + "word": { + "type": "string" + } + } + } + """) + ); + + private const string GetFavoriteColorToolFunctionName = "get_favorite_color"; + + private static ChatTool s_getFavoriteColorTool = ChatTool.CreateFunctionTool( + GetFavoriteColorToolFunctionName, + "gets the favorite color of the caller" + ); + + private const string GetFavoriteColorForMonthToolFunctionName = "get_favorite_color_for_month"; + + private static ChatTool s_getFavoriteColorForMonthTool = ChatTool.CreateFunctionTool( + GetFavoriteColorForMonthToolFunctionName, + "gets the caller's favorite color for a given month", + BinaryData.FromString(""" + { + "type": "object", + "properties": { + "month_name": { + "type": "string", + "description": "the name of a calendar 
month, e.g. February or October." + } + }, + "required": [ "month_name" ] + } + """) + ); + + private const string GetFavoriteColorForMonthFunctionName = "get_favorite_color_for_month"; + +#pragma warning disable CS0618 + private static ChatFunction s_getFavoriteColorForMonthFunction = new ChatFunction( + GetFavoriteColorForMonthToolFunctionName, + "gets the caller's favorite color for a given month", + BinaryData.FromString(""" + { + "type": "object", + "properties": { + "month_name": { + "type": "string", + "description": "the name of a calendar month, e.g. February or October." + } + }, + "required": [ "month_name" ] + } + """) + ); +#pragma warning restore CS0618 + + private const string GetWeatherForCityToolName = "get_weather_for_city"; + + private static ChatTool s_getWeatherForCityTool = ChatTool.CreateFunctionTool( + GetWeatherForCityToolName, + "gets the current weather for a given city", + BinaryData.FromString(""" + { + "type": "object", + "properties": { + "city_name": { + "type": "string", + "description": "the name of a city, e.g. Johannesburg or Ho Chi Minh City." + } + }, + "required": [ "city_name" ] + } + """) + ); + + private const string GetMoodForWeatherToolName = "get_mood_for_weather"; + + private static ChatTool s_getMoodForWeatherTool = ChatTool.CreateFunctionTool( + GetMoodForWeatherToolName, + "gets the caller's mood for a given weather", + BinaryData.FromString(""" + { + "type": "object", + "properties": { + "weather": { + "type": "string", + "description": "the current weather of where the caller is located, e.g. sunny or cloudy." 
+ } + }, + "required": [ "weather" ] + } + """) + ); + + [Test] + public async Task ConstraintsWork() + { + ChatClient client = GetTestClient(TestScenario.Chat); + IEnumerable messages = [new UserChatMessage("What's the number for the word 'banana'?")]; + + foreach (var (choice, reason) in new (ChatToolChoice, ChatFinishReason)[] + { + (null, ChatFinishReason.ToolCalls), + (ChatToolChoice.None, ChatFinishReason.Stop), + (new ChatToolChoice(s_numberForWordTool), ChatFinishReason.Stop), + (ChatToolChoice.Auto, ChatFinishReason.ToolCalls), + // TODO: Add test for ChatToolChoice.Required + }) + { + ChatCompletionOptions options = new() + { + Tools = { s_numberForWordTool }, + ToolChoice = choice, + }; + ClientResult result = IsAsync + ? await client.CompleteChatAsync(messages, options) + : client.CompleteChat(messages, options); + Assert.That(result.Value.FinishReason, Is.EqualTo(reason)); + } + } + + [Test] + public async Task NoParameterToolWorks() + { + ChatClient client = GetTestClient(TestScenario.Chat); + ICollection messages = [new UserChatMessage("What's my favorite color?")]; + ChatCompletionOptions options = new() + { + Tools = { s_getFavoriteColorTool }, + }; + ClientResult result = IsAsync + ? await client.CompleteChatAsync(messages, options) + : client.CompleteChat(messages, options); + + Assert.That(result.Value.FinishReason, Is.EqualTo(ChatFinishReason.ToolCalls)); + Assert.That(result.Value.ToolCalls.Count, Is.EqualTo(1)); + var toolCall = result.Value.ToolCalls[0]; + var toolCallArguments = BinaryData.FromString(toolCall.FunctionArguments).ToObjectFromJson>(); + Assert.That(toolCall.FunctionName, Is.EqualTo(GetFavoriteColorToolFunctionName)); + Assert.That(toolCall.Id, Is.Not.Null.And.Not.Empty); + Assert.That(toolCallArguments.Count, Is.EqualTo(0)); + + messages.Add(new AssistantChatMessage(result.Value)); + messages.Add(new ToolChatMessage(toolCall.Id, "green")); + result = IsAsync + ? 
await client.CompleteChatAsync(messages, options) + : client.CompleteChat(messages, options); + + Assert.That(result.Value.FinishReason, Is.EqualTo(ChatFinishReason.Stop)); + Assert.That(result.Value.Content[0].Text.ToLowerInvariant(), Contains.Substring("green")); + } + + [Test] + public async Task ParametersWork() + { + ChatClient client = GetTestClient(TestScenario.Chat); + ChatCompletionOptions options = new() + { + Tools = { s_getFavoriteColorForMonthTool }, + }; + List messages = + [ + new UserChatMessage("What's my favorite color in February?"), + ]; + ClientResult result = IsAsync + ? await client.CompleteChatAsync(messages, options) + : client.CompleteChat(messages, options); + Assert.That(result.Value.FinishReason, Is.EqualTo(ChatFinishReason.ToolCalls)); + Assert.That(result.Value.ToolCalls?.Count, Is.EqualTo(1)); + var toolCall = result.Value.ToolCalls[0]; + Assert.That(toolCall.FunctionName, Is.EqualTo(GetFavoriteColorForMonthToolFunctionName)); + JsonObject argumentsJson = JsonSerializer.Deserialize(toolCall.FunctionArguments); + Assert.That(argumentsJson.Count, Is.EqualTo(1)); + Assert.That(argumentsJson.ContainsKey("month_name")); + Assert.That(argumentsJson["month_name"].ToString().ToLowerInvariant(), Is.EqualTo("february")); + messages.Add(new AssistantChatMessage(result.Value)); + messages.Add(new ToolChatMessage(toolCall.Id, "chartreuse")); + result = IsAsync + ? await client.CompleteChatAsync(messages, options) + : client.CompleteChat(messages, options); + Assert.That(result.Value.Content[0].Text.ToLowerInvariant(), Contains.Substring("chartreuse")); + } + + [Test] + public async Task FunctionsWork() + { + ChatClient client = GetTestClient(TestScenario.Chat); + ChatCompletionOptions options = new() + { + Functions = { s_getFavoriteColorForMonthFunction }, + }; + List messages = + [ + new UserChatMessage("What's my favorite color in February?"), + ]; + ClientResult result = IsAsync + ? 
await client.CompleteChatAsync(messages, options) + : client.CompleteChat(messages, options); + Assert.That(result.Value.FinishReason, Is.EqualTo(ChatFinishReason.FunctionCall)); + var functionCall = result.Value.FunctionCall; + Assert.That(functionCall, Is.Not.Null); + Assert.That(functionCall.FunctionName, Is.EqualTo(GetFavoriteColorForMonthFunctionName)); + JsonObject argumentsJson = JsonSerializer.Deserialize(functionCall.FunctionArguments); + Assert.That(argumentsJson.Count, Is.EqualTo(1)); + Assert.That(argumentsJson.ContainsKey("month_name")); + Assert.That(argumentsJson["month_name"].ToString().ToLowerInvariant(), Is.EqualTo("february")); + messages.Add(new AssistantChatMessage(result.Value)); +#pragma warning disable CS0618 + messages.Add(new FunctionChatMessage(GetFavoriteColorForMonthFunctionName, "chartreuse")); +#pragma warning restore CS0618 + result = IsAsync + ? await client.CompleteChatAsync(messages, options) + : client.CompleteChat(messages, options); + Assert.That(result.Value.Content[0].Text.ToLowerInvariant(), Contains.Substring("chartreuse")); + } + + [Test] + public async Task ParallelToolCalls() + { + ChatClient client = GetTestClient(TestScenario.Chat); + ChatCompletionOptions options = new() + { + Tools = { s_getWeatherForCityTool }, + }; + List messages = [ + new UserChatMessage("Tell me what's the current weather in the following cities: Santiago and Karachi."), + ]; + ClientResult result = IsAsync + ? 
await client.CompleteChatAsync(messages, options) + : client.CompleteChat(messages, options); + + Assert.That(result.Value.FinishReason, Is.EqualTo(ChatFinishReason.ToolCalls)); + Assert.That(result.Value.ToolCalls.Count, Is.EqualTo(2)); + + var santiagoToolCall = result.Value.ToolCalls.Single(call => call.FunctionArguments.ToLowerInvariant().Contains("santiago")); + var karachiToolCall = result.Value.ToolCalls.Single(call => call.FunctionArguments.ToLowerInvariant().Contains("karachi")); + + JsonObject argumentsJson = JsonSerializer.Deserialize(santiagoToolCall.FunctionArguments); + Assert.That(argumentsJson.Count, Is.EqualTo(1)); + Assert.That(argumentsJson.ContainsKey("city_name")); + Assert.That(argumentsJson["city_name"].ToString().ToLowerInvariant(), Is.EqualTo("santiago")); + + argumentsJson = JsonSerializer.Deserialize(karachiToolCall.FunctionArguments); + Assert.That(argumentsJson.Count, Is.EqualTo(1)); + Assert.That(argumentsJson.ContainsKey("city_name")); + Assert.That(argumentsJson["city_name"].ToString().ToLowerInvariant(), Is.EqualTo("karachi")); + + messages.Add(new AssistantChatMessage(result.Value)); + messages.Add(new ToolChatMessage(santiagoToolCall.Id, "rainy")); + messages.Add(new ToolChatMessage(karachiToolCall.Id, "sunny")); + + result = IsAsync + ? 
await client.CompleteChatAsync(messages, options) + : client.CompleteChat(messages, options); + + Assert.That(result.Value.FinishReason, Is.EqualTo(ChatFinishReason.Stop)); + Assert.That(result.Value.Content[0].Text.ToLowerInvariant(), Contains.Substring("rainy")); + Assert.That(result.Value.Content[0].Text.ToLowerInvariant(), Contains.Substring("sunny")); + } + + [Test] + public async Task ConsecutiveToolCalls() + { + ChatClient client = GetTestClient(TestScenario.Chat); + ChatCompletionOptions options = new() + { + Tools = { s_getWeatherForCityTool, s_getMoodForWeatherTool }, + }; + List messages = [ + new UserChatMessage("Can you guess my mood given that I'm currently located in Osaka?"), + ]; + ClientResult result = IsAsync + ? await client.CompleteChatAsync(messages, options) + : client.CompleteChat(messages, options); + + Assert.That(result.Value.ToolCalls?.Count, Is.EqualTo(1)); + var toolCall = result.Value.ToolCalls[0]; + Assert.That(toolCall.FunctionName, Is.EqualTo(GetWeatherForCityToolName)); + + JsonObject argumentsJson = JsonSerializer.Deserialize(toolCall.FunctionArguments); + Assert.That(argumentsJson.Count, Is.EqualTo(1)); + Assert.That(argumentsJson.ContainsKey("city_name")); + Assert.That(argumentsJson["city_name"].ToString().ToLowerInvariant(), Is.EqualTo("osaka")); + + messages.Add(new AssistantChatMessage(result.Value)); + messages.Add(new ToolChatMessage(toolCall.Id, "rainy")); + result = IsAsync + ? 
await client.CompleteChatAsync(messages, options) + : client.CompleteChat(messages, options); + + Assert.That(result.Value.ToolCalls?.Count, Is.EqualTo(1)); + toolCall = result.Value.ToolCalls[0]; + Assert.That(toolCall.FunctionName, Is.EqualTo(GetMoodForWeatherToolName)); + + argumentsJson = JsonSerializer.Deserialize(toolCall.FunctionArguments); + Assert.That(argumentsJson.Count, Is.EqualTo(1)); + Assert.That(argumentsJson.ContainsKey("weather")); + Assert.That(argumentsJson["weather"].ToString().ToLowerInvariant(), Is.EqualTo("rainy")); + + messages.Add(new AssistantChatMessage(result.Value)); + messages.Add(new ToolChatMessage(toolCall.Id, "bored")); + result = IsAsync + ? await client.CompleteChatAsync(messages, options) + : client.CompleteChat(messages, options); + + Assert.That(result.Value.Content[0].Text.ToLowerInvariant(), Contains.Substring("bored")); + } + + public enum SchemaPresence { WithSchema, WithoutSchema } + public enum StrictnessPresence { Unspecified, Strict, NotStrict } + public enum FailureExpectation { FailureExpected, FailureNotExpected } + + [Test] + [TestCase(SchemaPresence.WithoutSchema, StrictnessPresence.Unspecified)] + [TestCase(SchemaPresence.WithoutSchema, StrictnessPresence.NotStrict)] + [TestCase(SchemaPresence.WithoutSchema, StrictnessPresence.Strict, FailureExpectation.FailureExpected)] + [TestCase(SchemaPresence.WithSchema, StrictnessPresence.Unspecified)] + [TestCase(SchemaPresence.WithSchema, StrictnessPresence.NotStrict)] + [TestCase(SchemaPresence.WithSchema, StrictnessPresence.Strict)] + public async Task StructuredOutputs( + SchemaPresence schemaPresence, + StrictnessPresence strictnessPresence, + FailureExpectation failureExpectation = FailureExpectation.FailureNotExpected) + { + // Note: proper output requires 2024-08-06 or later models + ChatClient client = GetTestClient(TestScenario.Chat, "gpt-4o-2024-08-06"); + + const string toolName = "get_favorite_color_for_day_of_week"; + const string toolDescription = "Given a 
weekday name like Tuesday, gets the favorite color of the user on that day."; + BinaryData toolSchema = schemaPresence == SchemaPresence.WithSchema + ? BinaryData.FromObjectAsJson(new + { + type = "object", + properties = new + { + the_day_of_the_week = new + { + type = "string" + } + }, + required = new[] { "the_day_of_the_week" }, + additionalProperties = !(strictnessPresence == StrictnessPresence.Strict), + }) + : null; + bool? useStrictSchema = strictnessPresence switch + { + StrictnessPresence.Strict => true, + StrictnessPresence.NotStrict => false, + _ => null, + }; + + ChatCompletionOptions options = new() + { + Tools = { ChatTool.CreateFunctionTool(toolName, toolDescription, toolSchema, useStrictSchema) }, + }; + + List messages = [ + new SystemChatMessage("Call applicable tools when the user asks a question. Prefer JSON output when possible."), + new UserChatMessage("What's my favorite color on Tuesday?"), + ]; + + if (failureExpectation == FailureExpectation.FailureExpected) + { + ClientResultException thrownException = Assert.ThrowsAsync(async () => + { + ChatCompletion completion = IsAsync + ? await client.CompleteChatAsync(messages, options) + : client.CompleteChat(messages, options); + }); + Assert.That(thrownException.Message, Does.Contain("function.parameters")); + } + else + { + ChatCompletion completion = IsAsync + ? 
await client.CompleteChatAsync(messages, options) + : client.CompleteChat(messages, options); + Assert.That(completion.FinishReason, Is.EqualTo(ChatFinishReason.ToolCalls)); + Assert.That(completion.ToolCalls, Has.Count.EqualTo(1)); + Assert.That(completion.ToolCalls[0].FunctionArguments, Is.Not.Null.And.Not.Empty); + + if (schemaPresence == SchemaPresence.WithSchema && strictnessPresence == StrictnessPresence.Strict) + { + using JsonDocument argumentsDocument = JsonDocument.Parse(completion.ToolCalls[0].FunctionArguments); + Assert.That(argumentsDocument.RootElement.GetProperty("the_day_of_the_week").GetString(), Is.EqualTo("Tuesday")); + } + } + } +} diff --git a/.dotnet/tests/Directory.Build.targets b/.dotnet/tests/Directory.Build.targets new file mode 100644 index 000000000..9108ca3f9 --- /dev/null +++ b/.dotnet/tests/Directory.Build.targets @@ -0,0 +1,7 @@ + + + + PreserveNewest + + + \ No newline at end of file diff --git a/.dotnet/tests/Embeddings/EmbeddingTests.cs b/.dotnet/tests/Embeddings/EmbeddingTests.cs new file mode 100644 index 000000000..94ebfe8b5 --- /dev/null +++ b/.dotnet/tests/Embeddings/EmbeddingTests.cs @@ -0,0 +1,141 @@ +using NUnit.Framework; +using OpenAI.Embeddings; +using OpenAI.Tests.Utility; +using System; +using System.ClientModel; +using System.Collections.Generic; +using System.Threading.Tasks; +using static OpenAI.Tests.TestHelpers; + +namespace OpenAI.Tests.Embeddings; + +[TestFixture(true)] +[TestFixture(false)] +[Parallelizable(ParallelScope.All)] +[Category("Embeddings")] +public partial class EmbeddingTests : SyncAsyncTestBase +{ + public EmbeddingTests(bool isAsync) : base(isAsync) + { + } + + public enum EmbeddingsInputKind + { + UsingStrings, + UsingIntegers, + } + + [Test] + public async Task GenerateSingleEmbedding() + { + EmbeddingClient client = new("text-embedding-3-small", Environment.GetEnvironmentVariable("OPENAI_API_KEY")); + + string input = "Hello, world!"; + + Embedding embedding = IsAsync + ? 
await client.GenerateEmbeddingAsync(input) + : client.GenerateEmbedding(input); + Assert.That(embedding, Is.Not.Null); + Assert.That(embedding.Index, Is.EqualTo(0)); + Assert.That(embedding.Vector, Is.Not.Null); + Assert.That(embedding.Vector.Span.Length, Is.EqualTo(1536)); + + float[] array = embedding.Vector.ToArray(); + Assert.That(array.Length, Is.EqualTo(1536)); + } + + [Test] + [TestCase(EmbeddingsInputKind.UsingStrings)] + [TestCase(EmbeddingsInputKind.UsingIntegers)] + public async Task GenerateMultipleEmbeddings(EmbeddingsInputKind embeddingsInputKind) + { + EmbeddingClient client = new("text-embedding-3-small", Environment.GetEnvironmentVariable("OPENAI_API_KEY")); + + const int Dimensions = 456; + + EmbeddingGenerationOptions options = new() + { + Dimensions = Dimensions, + }; + + EmbeddingCollection embeddings = null; + + if (embeddingsInputKind == EmbeddingsInputKind.UsingStrings) + { + List<string> prompts = + [ + "Hello, world!", + "This is a test.", + "Goodbye!" + ]; + + embeddings = IsAsync + ? await client.GenerateEmbeddingsAsync(prompts, options) + : client.GenerateEmbeddings(prompts, options); + } + else if (embeddingsInputKind == EmbeddingsInputKind.UsingIntegers) + { + List<List<int>> prompts = + [ + [104, 101, 108, 108, 111], + [119, 111, 114, 108, 100], + [84, 69, 83, 84] + ]; + + embeddings = IsAsync + ?
await client.GenerateEmbeddingsAsync(prompts, options) + : client.GenerateEmbeddings(prompts, options); + } + + Assert.That(embeddings, Is.Not.Null); + Assert.That(embeddings.Count, Is.EqualTo(3)); + Assert.That(embeddings.Model, Is.EqualTo("text-embedding-3-small")); + Assert.That(embeddings.Usage.InputTokens, Is.GreaterThan(0)); + Assert.That(embeddings.Usage.TotalTokens, Is.GreaterThan(0)); + + for (int i = 0; i < embeddings.Count; i++) + { + Assert.That(embeddings[i].Index, Is.EqualTo(i)); + Assert.That(embeddings[i].Vector, Is.Not.Null); + Assert.That(embeddings[i].Vector.Span.Length, Is.EqualTo(Dimensions)); + + float[] array = embeddings[i].Vector.ToArray(); + Assert.That(array.Length, Is.EqualTo(Dimensions)); + } + } + + [Test] + public async Task BadOptions() + { + EmbeddingClient client = GetTestClient(); + + EmbeddingGenerationOptions options = new() + { + Dimensions = -42, + }; + + Exception caughtException = null; + + try + { + _ = IsAsync + ? await client.GenerateEmbeddingAsync("foo", options) + : client.GenerateEmbedding("foo", options); + } + catch (Exception ex) + { + caughtException = ex; + } + + Assert.That(caughtException, Is.InstanceOf<ClientResultException>()); + Assert.That(caughtException.Message, Contains.Substring("dimensions")); + } + + [Test] + public void SerializeEmbeddingCollection() + { + // TODO: Add this test.
+ } + + private static EmbeddingClient GetTestClient() => GetTestClient(TestScenario.Embeddings); +} diff --git a/.dotnet/tests/Embeddings/OpenAIEmbeddingsModelFactoryTests.cs b/.dotnet/tests/Embeddings/OpenAIEmbeddingsModelFactoryTests.cs new file mode 100644 index 000000000..b57e80f54 --- /dev/null +++ b/.dotnet/tests/Embeddings/OpenAIEmbeddingsModelFactoryTests.cs @@ -0,0 +1,115 @@ +using System.Collections.Generic; +using System.Linq; +using NUnit.Framework; +using OpenAI.Embeddings; + +namespace OpenAI.Tests.Embeddings; + +[Parallelizable(ParallelScope.All)] +[Category("Smoke")] +public partial class OpenAIEmbeddingsModelFactoryTests +{ + [Test] + public void EmbeddingWithNoPropertiesWorks() + { + Embedding embedding = OpenAIEmbeddingsModelFactory.Embedding(); + + Assert.That(embedding.Index, Is.EqualTo(default(int))); + Assert.That(embedding.Vector.ToArray(), Is.Not.Null.And.Empty); + } + + [Test] + public void EmbeddingWithIndexWorks() + { + int index = 10; + Embedding embedding = OpenAIEmbeddingsModelFactory.Embedding(index: index); + + Assert.That(embedding.Index, Is.EqualTo(index)); + Assert.That(embedding.Vector.ToArray(), Is.Not.Null.And.Empty); + } + + [Test] + public void EmbeddingWithVectorWorks() + { + IEnumerable vector = [ 1f, 2f, 3f ]; + Embedding embedding = OpenAIEmbeddingsModelFactory.Embedding(vector: vector); + + Assert.That(embedding.Index, Is.EqualTo(default(int))); + Assert.That(embedding.Vector.ToArray().SequenceEqual(vector), Is.True); + } + + [Test] + public void EmbeddingCollectionWithNoPropertiesWorks() + { + EmbeddingCollection embeddingCollection = OpenAIEmbeddingsModelFactory.EmbeddingCollection(); + + Assert.That(embeddingCollection.Count, Is.EqualTo(0)); + Assert.That(embeddingCollection.Model, Is.Null); + Assert.That(embeddingCollection.Usage, Is.Null); + } + + [Test] + public void EmbeddingCollectionWithItemsWorks() + { + IEnumerable items = [ + OpenAIEmbeddingsModelFactory.Embedding(index: 10), + 
OpenAIEmbeddingsModelFactory.Embedding(index: 20) + ]; + EmbeddingCollection embeddingCollection = OpenAIEmbeddingsModelFactory.EmbeddingCollection(items: items); + + Assert.That(embeddingCollection.SequenceEqual(items), Is.True); + Assert.That(embeddingCollection.Model, Is.Null); + Assert.That(embeddingCollection.Usage, Is.Null); + } + + [Test] + public void EmbeddingCollectionWithModelWorks() + { + string model = "supermodel"; + EmbeddingCollection embeddingCollection = OpenAIEmbeddingsModelFactory.EmbeddingCollection(model: model); + + Assert.That(embeddingCollection.Count, Is.EqualTo(0)); + Assert.That(embeddingCollection.Model, Is.EqualTo(model)); + Assert.That(embeddingCollection.Usage, Is.Null); + } + + [Test] + public void EmbeddingCollectionWithUsageWorks() + { + EmbeddingTokenUsage usage = OpenAIEmbeddingsModelFactory.EmbeddingTokenUsage(inputTokens: 10); + EmbeddingCollection embeddingCollection = OpenAIEmbeddingsModelFactory.EmbeddingCollection(usage: usage); + + Assert.That(embeddingCollection.Count, Is.EqualTo(0)); + Assert.That(embeddingCollection.Model, Is.Null); + Assert.That(embeddingCollection.Usage, Is.EqualTo(usage)); + } + + [Test] + public void EmbeddingTokenUsageWithNoPropertiesWorks() + { + EmbeddingTokenUsage embeddingTokenUsage = OpenAIEmbeddingsModelFactory.EmbeddingTokenUsage(); + + Assert.That(embeddingTokenUsage.InputTokens, Is.EqualTo(default(int))); + Assert.That(embeddingTokenUsage.TotalTokens, Is.EqualTo(default(int))); + } + + [Test] + public void EmbeddingTokenUsageWithInputTokensWorks() + { + int inputTokens = 10; + EmbeddingTokenUsage embeddingTokenUsage = OpenAIEmbeddingsModelFactory.EmbeddingTokenUsage(inputTokens: inputTokens); + + Assert.That(embeddingTokenUsage.InputTokens, Is.EqualTo(10)); + Assert.That(embeddingTokenUsage.TotalTokens, Is.EqualTo(default(int))); + } + + [Test] + public void EmbeddingTokenUsageWithTotalTokensWorks() + { + int totalTokens = 10; + EmbeddingTokenUsage embeddingTokenUsage = 
OpenAIEmbeddingsModelFactory.EmbeddingTokenUsage(totalTokens: totalTokens); + + Assert.That(embeddingTokenUsage.InputTokens, Is.EqualTo(default(int))); + Assert.That(embeddingTokenUsage.TotalTokens, Is.EqualTo(totalTokens)); + } +} diff --git a/.dotnet/tests/Files/FileTests.cs b/.dotnet/tests/Files/FileTests.cs new file mode 100644 index 000000000..0484c9ee3 --- /dev/null +++ b/.dotnet/tests/Files/FileTests.cs @@ -0,0 +1,127 @@ +using NUnit.Framework; +using OpenAI.Files; +using OpenAI.Tests.Utility; +using System; +using System.IO; +using System.Threading.Tasks; +using static OpenAI.Tests.TestHelpers; + +namespace OpenAI.Tests.Files; + +[TestFixture(true)] +[TestFixture(false)] +[Parallelizable(ParallelScope.Fixtures)] +[Category("Files")] +public partial class FileTests : SyncAsyncTestBase +{ + public FileTests(bool isAsync) : base(isAsync) + { + } + + [Test] + public async Task ListFiles() + { + FileClient client = GetTestClient(); + + OpenAIFileInfoCollection allFiles = IsAsync + ? await client.GetFilesAsync() + : client.GetFiles(); + Assert.That(allFiles.Count, Is.GreaterThan(0)); + Console.WriteLine($"Total files count: {allFiles.Count}"); + + OpenAIFileInfoCollection assistantsFiles = IsAsync + ? await client.GetFilesAsync(OpenAIFilePurpose.Assistants) + : client.GetFiles(OpenAIFilePurpose.Assistants); + Assert.That(assistantsFiles.Count, Is.GreaterThan(0).And.LessThan(allFiles.Count)); + Console.WriteLine($"Assistant files count: {assistantsFiles.Count}"); + } + + [Test] + public async Task UploadAndRetrieve() + { + FileClient client = GetTestClient(); + string fileContent = "Hello! This is a test text file. Please delete me."; + using Stream file = BinaryData.FromString(fileContent).ToStream(); + string filename = "test-file-delete-me.txt"; + + // Upload file. + OpenAIFileInfo uploadedFile = IsAsync + ? 
await client.UploadFileAsync(file, filename, FileUploadPurpose.Assistants) + : client.UploadFile(file, filename, FileUploadPurpose.Assistants); + Assert.That(uploadedFile, Is.Not.Null); + + try + { + Assert.That(uploadedFile.Filename, Is.EqualTo(filename)); + Assert.That(uploadedFile.Purpose, Is.EqualTo(OpenAIFilePurpose.Assistants)); + + // Retrieve file. + OpenAIFileInfo retrievedFile = IsAsync + ? await client.GetFileAsync(uploadedFile.Id) + : client.GetFile(uploadedFile.Id); + Assert.That(retrievedFile.Id, Is.EqualTo(uploadedFile.Id)); + Assert.That(retrievedFile.Filename, Is.EqualTo(uploadedFile.Filename)); + } + finally + { + // Delete file. + bool deleted = IsAsync + ? await client.DeleteFileAsync(uploadedFile.Id) + : client.DeleteFile(uploadedFile.Id); + Assert.That(deleted, Is.True); + } + } + + [Test] + public async Task UploadAndDownloadContent() + { + FileClient client = GetTestClient(); + string imagePath = Path.Combine("Assets", "images_dog_and_cat.png"); + + // Upload file. + OpenAIFileInfo uploadedFile = IsAsync + ? await client.UploadFileAsync(imagePath, FileUploadPurpose.Vision) + : client.UploadFile(imagePath, FileUploadPurpose.Vision); + Assert.That(uploadedFile, Is.Not.Null); + + try + { + Assert.That(uploadedFile.Filename, Is.EqualTo(imagePath)); + Assert.That(uploadedFile.Purpose, Is.EqualTo(OpenAIFilePurpose.Vision)); + + // Download file content. + BinaryData downloadedContent = IsAsync + ? await client.DownloadFileAsync(uploadedFile.Id) + : client.DownloadFile(uploadedFile.Id); + Assert.That(downloadedContent, Is.Not.Null); + } + finally + { + // Delete file. + bool deleted = IsAsync + ? await client.DeleteFileAsync(uploadedFile.Id) + : client.DeleteFile(uploadedFile.Id); + Assert.That(deleted, Is.True); + } + } + + [Test] + public void SerializeFileCollection() + { + // TODO: Add this test. 
+ } + + [Test] + public async Task NonAsciiFilename() + { + FileClient client = GetTestClient(); + string filename = "你好.txt"; + BinaryData fileContent = BinaryData.FromString("世界您好!这是个测试。"); + OpenAIFileInfo uploadedFile = IsAsync + ? await client.UploadFileAsync(fileContent, filename, FileUploadPurpose.Assistants) + : client.UploadFile(fileContent, filename, FileUploadPurpose.Assistants); + Assert.That(uploadedFile?.Filename, Is.EqualTo(filename)); + } + + private static FileClient GetTestClient() => GetTestClient(TestScenario.Files); +} \ No newline at end of file diff --git a/.dotnet/tests/Files/OpenAIFilesModelFactoryTests.cs b/.dotnet/tests/Files/OpenAIFilesModelFactoryTests.cs new file mode 100644 index 000000000..4badc7a85 --- /dev/null +++ b/.dotnet/tests/Files/OpenAIFilesModelFactoryTests.cs @@ -0,0 +1,151 @@ +using System; +using System.Collections.Generic; +using System.Linq; +using NUnit.Framework; +using OpenAI.Files; + +namespace OpenAI.Tests.Files; + +[Parallelizable(ParallelScope.All)] +[Category("Smoke")] +public partial class OpenAIFilesModelFactoryTests +{ + [Test] + public void OpenAIFileInfoWithNoPropertiesWorks() + { + OpenAIFileInfo openAIFileInfo = OpenAIFilesModelFactory.OpenAIFileInfo(); + + Assert.That(openAIFileInfo.Id, Is.Null); + Assert.That(openAIFileInfo.SizeInBytes, Is.Null); + Assert.That(openAIFileInfo.CreatedAt, Is.EqualTo(default(DateTimeOffset))); + Assert.That(openAIFileInfo.Filename, Is.Null); + Assert.That(openAIFileInfo.Purpose, Is.EqualTo(default(OpenAIFilePurpose))); + Assert.That(openAIFileInfo.Status, Is.EqualTo(default(OpenAIFileStatus))); + Assert.That(openAIFileInfo.StatusDetails, Is.Null); + } + + [Test] + public void OpenAIFileInfoWithIdWorks() + { + string id = "fileId"; + OpenAIFileInfo openAIFileInfo = OpenAIFilesModelFactory.OpenAIFileInfo(id: id); + + Assert.That(openAIFileInfo.Id, Is.EqualTo(id)); + Assert.That(openAIFileInfo.SizeInBytes, Is.Null); + Assert.That(openAIFileInfo.CreatedAt, 
Is.EqualTo(default(DateTimeOffset))); + Assert.That(openAIFileInfo.Filename, Is.Null); + Assert.That(openAIFileInfo.Purpose, Is.EqualTo(default(OpenAIFilePurpose))); + Assert.That(openAIFileInfo.Status, Is.EqualTo(default(OpenAIFileStatus))); + Assert.That(openAIFileInfo.StatusDetails, Is.Null); + } + + [Test] + public void OpenAIFileInfoWithSizeInBytesWorks() + { + int sizeInBytes = 1025; + OpenAIFileInfo openAIFileInfo = OpenAIFilesModelFactory.OpenAIFileInfo(sizeInBytes: sizeInBytes); + + Assert.That(openAIFileInfo.Id, Is.Null); + Assert.That(openAIFileInfo.SizeInBytes, Is.EqualTo(sizeInBytes)); + Assert.That(openAIFileInfo.CreatedAt, Is.EqualTo(default(DateTimeOffset))); + Assert.That(openAIFileInfo.Filename, Is.Null); + Assert.That(openAIFileInfo.Purpose, Is.EqualTo(default(OpenAIFilePurpose))); + Assert.That(openAIFileInfo.Status, Is.EqualTo(default(OpenAIFileStatus))); + Assert.That(openAIFileInfo.StatusDetails, Is.Null); + } + + [Test] + public void OpenAIFileInfoWithCreatedAtWorks() + { + DateTimeOffset createdAt = DateTimeOffset.UtcNow; + OpenAIFileInfo openAIFileInfo = OpenAIFilesModelFactory.OpenAIFileInfo(createdAt: createdAt); + + Assert.That(openAIFileInfo.Id, Is.Null); + Assert.That(openAIFileInfo.SizeInBytes, Is.Null); + Assert.That(openAIFileInfo.CreatedAt, Is.EqualTo(createdAt)); + Assert.That(openAIFileInfo.Filename, Is.Null); + Assert.That(openAIFileInfo.Purpose, Is.EqualTo(default(OpenAIFilePurpose))); + Assert.That(openAIFileInfo.Status, Is.EqualTo(default(OpenAIFileStatus))); + Assert.That(openAIFileInfo.StatusDetails, Is.Null); + } + + [Test] + public void OpenAIFileInfoWithFilenameWorks() + { + string filename = "file.png"; + OpenAIFileInfo openAIFileInfo = OpenAIFilesModelFactory.OpenAIFileInfo(filename: filename); + + Assert.That(openAIFileInfo.Id, Is.Null); + Assert.That(openAIFileInfo.SizeInBytes, Is.Null); + Assert.That(openAIFileInfo.CreatedAt, Is.EqualTo(default(DateTimeOffset))); + Assert.That(openAIFileInfo.Filename, 
Is.EqualTo(filename)); + Assert.That(openAIFileInfo.Purpose, Is.EqualTo(default(OpenAIFilePurpose))); + Assert.That(openAIFileInfo.Status, Is.EqualTo(default(OpenAIFileStatus))); + Assert.That(openAIFileInfo.StatusDetails, Is.Null); + } + + [Test] + public void OpenAIFileInfoWithPurposeWorks() + { + OpenAIFilePurpose purpose = OpenAIFilePurpose.Vision; + OpenAIFileInfo openAIFileInfo = OpenAIFilesModelFactory.OpenAIFileInfo(purpose: purpose); + + Assert.That(openAIFileInfo.Id, Is.Null); + Assert.That(openAIFileInfo.SizeInBytes, Is.Null); + Assert.That(openAIFileInfo.CreatedAt, Is.EqualTo(default(DateTimeOffset))); + Assert.That(openAIFileInfo.Filename, Is.Null); + Assert.That(openAIFileInfo.Purpose, Is.EqualTo(purpose)); + Assert.That(openAIFileInfo.Status, Is.EqualTo(default(OpenAIFileStatus))); + Assert.That(openAIFileInfo.StatusDetails, Is.Null); + } + + [Test] + public void OpenAIFileInfoWithStatusWorks() + { + OpenAIFileStatus status = OpenAIFileStatus.Uploaded; + OpenAIFileInfo openAIFileInfo = OpenAIFilesModelFactory.OpenAIFileInfo(status: status); + + Assert.That(openAIFileInfo.Id, Is.Null); + Assert.That(openAIFileInfo.SizeInBytes, Is.Null); + Assert.That(openAIFileInfo.CreatedAt, Is.EqualTo(default(DateTimeOffset))); + Assert.That(openAIFileInfo.Filename, Is.Null); + Assert.That(openAIFileInfo.Purpose, Is.EqualTo(default(OpenAIFilePurpose))); + Assert.That(openAIFileInfo.Status, Is.EqualTo(status)); + Assert.That(openAIFileInfo.StatusDetails, Is.Null); + } + + [Test] + public void OpenAIFileInfoWithStatusDetailsWorks() + { + string statusDetails = "There's something off about this file."; + OpenAIFileInfo openAIFileInfo = OpenAIFilesModelFactory.OpenAIFileInfo(statusDetails: statusDetails); + + Assert.That(openAIFileInfo.Id, Is.Null); + Assert.That(openAIFileInfo.SizeInBytes, Is.Null); + Assert.That(openAIFileInfo.CreatedAt, Is.EqualTo(default(DateTimeOffset))); + Assert.That(openAIFileInfo.Filename, Is.Null); + Assert.That(openAIFileInfo.Purpose, 
Is.EqualTo(default(OpenAIFilePurpose))); + Assert.That(openAIFileInfo.Status, Is.EqualTo(default(OpenAIFileStatus))); + Assert.That(openAIFileInfo.StatusDetails, Is.EqualTo(statusDetails)); + } + + [Test] + public void OpenAIFileInfoCollectionWithNoPropertiesWorks() + { + OpenAIFileInfoCollection openAIFileInfoCollection = OpenAIFilesModelFactory.OpenAIFileInfoCollection(); + + Assert.That(openAIFileInfoCollection.Count, Is.EqualTo(0)); + } + + [Test] + public void OpenAIFileInfoCollectionWithItemsWorks() + { + IEnumerable<OpenAIFileInfo> items = [ + OpenAIFilesModelFactory.OpenAIFileInfo(id: "firstFile"), + OpenAIFilesModelFactory.OpenAIFileInfo(id: "secondFile") + ]; + OpenAIFileInfoCollection openAIFileInfoCollection = OpenAIFilesModelFactory.OpenAIFileInfoCollection(items: items); + + Assert.That(openAIFileInfoCollection.SequenceEqual(items), Is.True); + } +} diff --git a/.dotnet/tests/GitHubTests.cs b/.dotnet/tests/GitHubTests.cs new file mode 100644 index 000000000..cdd84d29e --- /dev/null +++ b/.dotnet/tests/GitHubTests.cs @@ -0,0 +1,25 @@ +using NUnit.Framework; +using System; + +namespace OpenAI.Tests.Miscellaneous; + +public partial class GitHubTests +{ + [Test(Description = "Test that we can use a GitHub secret")] + [Category("Online")] + [Ignore("Placeholder")] + public void CanUseGitHubSecret() + { + string gitHubSecretString = Environment.GetEnvironmentVariable("SECRET_VALUE"); + Assert.That(gitHubSecretString, Is.Not.Null.And.Not.Empty); + } + + [Test(Description = "Test that we can run some tests without secrets")] + [Category("Offline")] + [Ignore("Placeholder")] + public void CanTestWithoutSecretAccess() + { + int result = 2 + 1; + Assert.That(result, Is.EqualTo(3)); + } +} diff --git a/.dotnet/tests/Images/ImageGenerationTests.cs b/.dotnet/tests/Images/ImageGenerationTests.cs new file mode 100644 index 000000000..efc95c544 --- /dev/null +++ b/.dotnet/tests/Images/ImageGenerationTests.cs @@ -0,0 +1,206 @@ +using NUnit.Framework; +using OpenAI.Chat; +using
OpenAI.Images; +using OpenAI.Tests.Utility; +using System; +using System.ClientModel; +using System.Collections.Generic; +using System.IO; +using System.Threading.Tasks; +using static OpenAI.Tests.TestHelpers; + +namespace OpenAI.Tests.Images; + +[TestFixture(true)] +[TestFixture(false)] +[Parallelizable(ParallelScope.All)] +[Category("Images")] +public partial class ImageGenerationTests : SyncAsyncTestBase +{ + public ImageGenerationTests(bool isAsync) : base(isAsync) + { + } + + [Test] + public async Task BasicGenerationWorks() + { + ImageClient client = GetTestClient(TestScenario.Images); + + string prompt = "An isolated stop sign."; + + GeneratedImage image = IsAsync + ? await client.GenerateImageAsync(prompt) + : client.GenerateImage(prompt); + Assert.That(image.ImageUri, Is.Not.Null); + Assert.That(image.ImageBytes, Is.Null); + + Console.WriteLine(image.ImageUri.AbsoluteUri); + ValidateGeneratedImage(image.ImageUri, "stop"); + } + + [Test] + public async Task GenerationWithOptionsWorks() + { + ImageClient client = GetTestClient(TestScenario.Images); + + string prompt = "An isolated stop sign."; + + ImageGenerationOptions options = new() + { + Quality = GeneratedImageQuality.Standard, + Style = GeneratedImageStyle.Natural, + }; + + GeneratedImage image = IsAsync + ? await client.GenerateImageAsync(prompt, options) + : client.GenerateImage(prompt, options); + Assert.That(image.ImageUri, Is.Not.Null); + } + + [Test] + public async Task GenerationWithBytesResponseWorks() + { + ImageClient client = GetTestClient(TestScenario.Images); + + string prompt = "An isolated stop sign."; + + ImageGenerationOptions options = new() + { + ResponseFormat = GeneratedImageFormat.Bytes + }; + + GeneratedImage image = IsAsync + ? 
await client.GenerateImageAsync(prompt, options) + : client.GenerateImage(prompt, options); + Assert.That(image.ImageUri, Is.Null); + Assert.That(image.ImageBytes, Is.Not.Null); + + ValidateGeneratedImage(image.ImageBytes, "stop"); + } + + [Test] + public async Task GenerateImageEditWorks() + { + ImageClient client = GetTestClient(TestScenario.Images, "dall-e-2"); + + string prompt = "A big cat with big, round eyes sitting in an empty room and looking at the camera."; + string maskImagePath = Path.Combine("Assets", "images_empty_room_with_mask.png"); + + GeneratedImage image = IsAsync + ? await client.GenerateImageEditAsync(maskImagePath, prompt) + : client.GenerateImageEdit(maskImagePath, prompt); + Assert.That(image.ImageUri, Is.Not.Null); + Assert.That(image.ImageBytes, Is.Null); + + Console.WriteLine(image.ImageUri.AbsoluteUri); + + ValidateGeneratedImage(image.ImageUri, "cat"); + } + + [Test] + public async Task GenerateImageEditWithMaskFileWorks() + { + ImageClient client = GetTestClient(TestScenario.Images, "dall-e-2"); + + string prompt = "A big cat with big, round eyes sitting in an empty room and looking at the camera."; + string originalImagePath = Path.Combine("Assets", "images_empty_room.png"); + string maskImagePath = Path.Combine("Assets", "images_empty_room_with_mask.png"); + + GeneratedImage image = IsAsync + ? 
await client.GenerateImageEditAsync(originalImagePath, prompt, maskImagePath) + : client.GenerateImageEdit(originalImagePath, prompt, maskImagePath); + Assert.That(image.ImageUri, Is.Not.Null); + Assert.That(image.ImageBytes, Is.Null); + + Console.WriteLine(image.ImageUri.AbsoluteUri); + + ValidateGeneratedImage(image.ImageUri, "cat"); + } + + [Test] + public async Task GenerateImageEditWithBytesResponseWorks() + { + ImageClient client = GetTestClient(TestScenario.Images, "dall-e-2"); + + string prompt = "A big cat with big, round eyes sitting in an empty room and looking at the camera."; + string maskImagePath = Path.Combine("Assets", "images_empty_room_with_mask.png"); + + ImageEditOptions options = new() + { + ResponseFormat = GeneratedImageFormat.Bytes + }; + + GeneratedImage image = IsAsync + ? await client.GenerateImageEditAsync(maskImagePath, prompt, options) + : client.GenerateImageEdit(maskImagePath, prompt, options); + Assert.That(image.ImageUri, Is.Null); + Assert.That(image.ImageBytes, Is.Not.Null); + + ValidateGeneratedImage(image.ImageBytes, "cat", "Note that it likely depicts some sort of animal."); + } + + [Test] + public async Task GenerateImageVariationWorks() + { + ImageClient client = GetTestClient(TestScenario.Images, "dall-e-2"); + string imagePath = Path.Combine("Assets", "images_dog_and_cat.png"); + + GeneratedImage image = IsAsync + ? 
await client.GenerateImageVariationAsync(imagePath) + : client.GenerateImageVariation(imagePath); + Assert.That(image.ImageUri, Is.Not.Null); + Assert.That(image.ImageBytes, Is.Null); + + Console.WriteLine(image.ImageUri.AbsoluteUri); + + ValidateGeneratedImage(image.ImageUri, "cat", "Note that it likely depicts some sort of animal."); + } + + [Test] + public async Task GenerateImageVariationWithBytesResponseWorks() + { + ImageClient client = GetTestClient(TestScenario.Images, "dall-e-2"); + string imagePath = Path.Combine("Assets", "images_dog_and_cat.png"); + + ImageVariationOptions options = new() + { + ResponseFormat = GeneratedImageFormat.Bytes + }; + + GeneratedImage image = IsAsync + ? await client.GenerateImageVariationAsync(imagePath, options) + : client.GenerateImageVariation(imagePath, options); + Assert.That(image.ImageUri, Is.Null); + Assert.That(image.ImageBytes, Is.Not.Null); + + ValidateGeneratedImage(image.ImageBytes, "cat", "Note that it likely depicts some sort of animal."); + } + + private void ValidateGeneratedImage(Uri imageUri, string expectedSubstring, string descriptionHint = null) + { + ChatClient chatClient = GetTestClient(TestScenario.Chat); + IEnumerable<ChatMessage> messages = [ + new UserChatMessage( + ChatMessageContentPart.CreateTextMessageContentPart($"Describe this image for me. {descriptionHint}"), + ChatMessageContentPart.CreateImageMessageContentPart(imageUri)), + ]; + ChatCompletionOptions chatOptions = new() { MaxTokens = 2048 }; + ClientResult<ChatCompletion> result = chatClient.CompleteChat(messages, chatOptions); + + Assert.That(result.Value.Content[0].Text.ToLowerInvariant(), Contains.Substring(expectedSubstring)); + } + + private void ValidateGeneratedImage(BinaryData imageBytes, string expectedSubstring, string descriptionHint = null) + { + ChatClient chatClient = GetTestClient(TestScenario.Chat); + IEnumerable<ChatMessage> messages = [ + new UserChatMessage( + ChatMessageContentPart.CreateTextMessageContentPart($"Describe this image for me. 
{descriptionHint}"), + ChatMessageContentPart.CreateImageMessageContentPart(imageBytes, "image/png")), + ]; + ChatCompletionOptions chatOptions = new() { MaxTokens = 2048 }; + ClientResult<ChatCompletion> result = chatClient.CompleteChat(messages, chatOptions); + + Assert.That(result.Value.Content[0].Text.ToLowerInvariant(), Contains.Substring(expectedSubstring)); + } +} diff --git a/.dotnet/tests/Images/OpenAIImagesModelFactoryTests.cs b/.dotnet/tests/Images/OpenAIImagesModelFactoryTests.cs new file mode 100644 index 000000000..11d702776 --- /dev/null +++ b/.dotnet/tests/Images/OpenAIImagesModelFactoryTests.cs @@ -0,0 +1,87 @@ +using System; +using System.Collections.Generic; +using System.Linq; +using NUnit.Framework; +using OpenAI.Images; + +namespace OpenAI.Tests.Images; + +[Parallelizable(ParallelScope.All)] +[Category("Smoke")] +public partial class OpenAIImagesModelFactoryTests +{ + [Test] + public void GeneratedImageWithNoPropertiesWorks() + { + GeneratedImage generatedImage = OpenAIImagesModelFactory.GeneratedImage(); + + Assert.That(generatedImage.ImageBytes, Is.Null); + Assert.That(generatedImage.ImageUri, Is.Null); + Assert.That(generatedImage.RevisedPrompt, Is.Null); + } + + [Test] + public void GeneratedImageWithImageBytesWorks() + { + BinaryData imageBytes = BinaryData.FromString("Definitely an image."); + GeneratedImage generatedImage = OpenAIImagesModelFactory.GeneratedImage(imageBytes: imageBytes); + + Assert.That(generatedImage.ImageBytes, Is.EqualTo(imageBytes)); + Assert.That(generatedImage.ImageUri, Is.Null); + Assert.That(generatedImage.RevisedPrompt, Is.Null); + } + + [Test] + public void GeneratedImageWithImageUriWorks() + { + Uri imageUri = new Uri("https://definitely-a-website.com/"); + GeneratedImage generatedImage = OpenAIImagesModelFactory.GeneratedImage(imageUri: imageUri); + + Assert.That(generatedImage.ImageBytes, Is.Null); + Assert.That(generatedImage.ImageUri, Is.EqualTo(imageUri)); + Assert.That(generatedImage.RevisedPrompt, Is.Null); + } + + 
[Test] + public void GeneratedImageWithRevisedPromptWorks() + { + string revisedPrompt = "I've been revised."; + GeneratedImage generatedImage = OpenAIImagesModelFactory.GeneratedImage(revisedPrompt: revisedPrompt); + + Assert.That(generatedImage.ImageBytes, Is.Null); + Assert.That(generatedImage.ImageUri, Is.Null); + Assert.That(generatedImage.RevisedPrompt, Is.EqualTo(revisedPrompt)); + } + + [Test] + public void GeneratedImageCollectionWithNoPropertiesWorks() + { + GeneratedImageCollection generatedImageCollection = OpenAIImagesModelFactory.GeneratedImageCollection(); + + Assert.That(generatedImageCollection.CreatedAt, Is.EqualTo(default(DateTimeOffset))); + Assert.That(generatedImageCollection.Count, Is.EqualTo(0)); + } + + [Test] + public void GeneratedImageCollectionWithCreatedAtWorks() + { + DateTimeOffset createdAt = DateTimeOffset.UtcNow; + GeneratedImageCollection generatedImageCollection = OpenAIImagesModelFactory.GeneratedImageCollection(createdAt: createdAt); + + Assert.That(generatedImageCollection.CreatedAt, Is.EqualTo(createdAt)); + Assert.That(generatedImageCollection.Count, Is.EqualTo(0)); + } + + [Test] + public void GeneratedImageCollectionWithItemsWorks() + { + IEnumerable<GeneratedImage> items = [ + OpenAIImagesModelFactory.GeneratedImage(revisedPrompt: "This is the first prompt."), + OpenAIImagesModelFactory.GeneratedImage(revisedPrompt: "This is not the first prompt.") + ]; + GeneratedImageCollection generatedImageCollection = OpenAIImagesModelFactory.GeneratedImageCollection(items: items); + + Assert.That(generatedImageCollection.CreatedAt, Is.EqualTo(default(DateTimeOffset))); + Assert.That(generatedImageCollection.SequenceEqual(items), Is.True); + } +} diff --git a/.dotnet/tests/Models/ModelTests.cs b/.dotnet/tests/Models/ModelTests.cs new file mode 100644 index 000000000..aa5078490 --- /dev/null +++ b/.dotnet/tests/Models/ModelTests.cs @@ -0,0 +1,53 @@ +using NUnit.Framework; +using OpenAI.Models; +using OpenAI.Tests.Utility; +using System; +using 
System.Linq; +using System.Threading.Tasks; +using static OpenAI.Tests.TestHelpers; + +namespace OpenAI.Tests.Models; + +[TestFixture(true)] +[TestFixture(false)] +[Parallelizable(ParallelScope.All)] +[Category("Models")] +public partial class ModelTests : SyncAsyncTestBase +{ + public ModelTests(bool isAsync) : base(isAsync) + { + } + + [Test] + public async Task ListModels() + { + ModelClient client = GetTestClient(TestScenario.Models); + + OpenAIModelInfoCollection allModels = IsAsync + ? await client.GetModelsAsync() + : client.GetModels(); + Assert.That(allModels, Is.Not.Null.And.Not.Empty); + Assert.That(allModels.Any(modelInfo => modelInfo.Id.Contains("whisper", StringComparison.InvariantCultureIgnoreCase))); + Console.WriteLine($"Total model count: {allModels.Count}"); + } + + [Test] + public async Task GetModelInfo() + { + ModelClient client = GetTestClient(TestScenario.Models); + + string modelName = "gpt-4o-mini"; + + OpenAIModelInfo model = IsAsync + ? await client.GetModelAsync(modelName) + : client.GetModel(modelName); + Assert.That(model, Is.Not.Null); + Assert.That(model.OwnedBy.ToLowerInvariant(), Does.Contain("system")); + } + + [Test] + public void SerializeModelCollection() + { + // TODO: Add this test. 
+ } +} diff --git a/.dotnet/tests/Models/OpenAIModelsModelFactoryTests.cs b/.dotnet/tests/Models/OpenAIModelsModelFactoryTests.cs new file mode 100644 index 000000000..f7be3e02a --- /dev/null +++ b/.dotnet/tests/Models/OpenAIModelsModelFactoryTests.cs @@ -0,0 +1,75 @@ +using System; +using System.Collections.Generic; +using System.Linq; +using NUnit.Framework; +using OpenAI.Models; + +namespace OpenAI.Tests.Models; + +[Parallelizable(ParallelScope.All)] +[Category("Smoke")] +public partial class OpenAIModelsModelFactoryTests +{ + [Test] + public void OpenAIModelInfoWithNoPropertiesWorks() + { + OpenAIModelInfo openAIModelInfo = OpenAIModelsModelFactory.OpenAIModelInfo(); + + Assert.That(openAIModelInfo.Id, Is.Null); + Assert.That(openAIModelInfo.CreatedAt, Is.EqualTo(default(DateTimeOffset))); + Assert.That(openAIModelInfo.OwnedBy, Is.Null); + } + + [Test] + public void OpenAIModelInfoWithIdWorks() + { + string id = "modelId"; + OpenAIModelInfo openAIModelInfo = OpenAIModelsModelFactory.OpenAIModelInfo(id: id); + + Assert.That(openAIModelInfo.Id, Is.EqualTo(id)); + Assert.That(openAIModelInfo.CreatedAt, Is.EqualTo(default(DateTimeOffset))); + Assert.That(openAIModelInfo.OwnedBy, Is.Null); + } + + [Test] + public void OpenAIModelInfoWithCreatedAtWorks() + { + DateTimeOffset createdAt = DateTimeOffset.UtcNow; + OpenAIModelInfo openAIModelInfo = OpenAIModelsModelFactory.OpenAIModelInfo(createdAt: createdAt); + + Assert.That(openAIModelInfo.Id, Is.Null); + Assert.That(openAIModelInfo.CreatedAt, Is.EqualTo(createdAt)); + Assert.That(openAIModelInfo.OwnedBy, Is.Null); + } + + [Test] + public void OpenAIModelInfoWithOwnedByWorks() + { + string ownedBy = "The people"; + OpenAIModelInfo openAIModelInfo = OpenAIModelsModelFactory.OpenAIModelInfo(ownedBy: ownedBy); + + Assert.That(openAIModelInfo.Id, Is.Null); + Assert.That(openAIModelInfo.CreatedAt, Is.EqualTo(default(DateTimeOffset))); + Assert.That(openAIModelInfo.OwnedBy, Is.EqualTo(ownedBy)); + } + + [Test] + public void 
OpenAIModelInfoCollectionWithNoPropertiesWorks() + { + OpenAIModelInfoCollection openAIModelInfoCollection = OpenAIModelsModelFactory.OpenAIModelInfoCollection(); + + Assert.That(openAIModelInfoCollection.Count, Is.EqualTo(0)); + } + + [Test] + public void OpenAIModelInfoCollectionWithItemsWorks() + { + IEnumerable<OpenAIModelInfo> items = [ + OpenAIModelsModelFactory.OpenAIModelInfo(id: "firstModel"), + OpenAIModelsModelFactory.OpenAIModelInfo(id: "secondModel") + ]; + OpenAIModelInfoCollection openAIModelInfoCollection = OpenAIModelsModelFactory.OpenAIModelInfoCollection(items: items); + + Assert.That(openAIModelInfoCollection.SequenceEqual(items), Is.True); + } +} diff --git a/.dotnet/tests/Moderations/ModerationTests.cs b/.dotnet/tests/Moderations/ModerationTests.cs new file mode 100644 index 000000000..bb5b0e25c --- /dev/null +++ b/.dotnet/tests/Moderations/ModerationTests.cs @@ -0,0 +1,69 @@ +using NUnit.Framework; +using OpenAI.Moderations; +using OpenAI.Tests.Utility; +using System.Collections.Generic; +using System.Threading.Tasks; +using static OpenAI.Tests.TestHelpers; + +namespace OpenAI.Tests.Moderations; + +[TestFixture(true)] +[TestFixture(false)] +[Parallelizable(ParallelScope.All)] +[Category("Moderations")] +public partial class ModerationTests : SyncAsyncTestBase +{ + public ModerationTests(bool isAsync) : base(isAsync) + { + } + + [Test] + public async Task ClassifySingleInput() + { + ModerationClient client = GetTestClient(TestScenario.Moderations); + + const string input = "I am killing all my houseplants!"; + + ModerationResult moderation = IsAsync + ? 
await client.ClassifyTextInputAsync(input) + : client.ClassifyTextInput(input); + Assert.That(moderation, Is.Not.Null); + Assert.That(moderation.Flagged, Is.True); + Assert.That(moderation.Categories.Violence, Is.True); + Assert.That(moderation.CategoryScores.Violence, Is.GreaterThan(0.5)); + } + + [Test] + public async Task ClassifyMultipleInputs() + { + ModerationClient client = GetTestClient(TestScenario.Moderations); + + List<string> inputs = + [ + "I forgot to water my houseplants!", + "I am killing all my houseplants!" + ]; + + ModerationCollection moderations = IsAsync + ? await client.ClassifyTextInputsAsync(inputs) + : client.ClassifyTextInputs(inputs); + Assert.That(moderations, Is.Not.Null); + Assert.That(moderations.Count, Is.EqualTo(2)); + Assert.That(moderations.Model, Does.StartWith("text-moderation")); + Assert.That(moderations.Id, Is.Not.Null.And.Not.Empty); + + Assert.That(moderations[0], Is.Not.Null); + Assert.That(moderations[0].Flagged, Is.False); + + Assert.That(moderations[1], Is.Not.Null); + Assert.That(moderations[1].Flagged, Is.True); + Assert.That(moderations[1].Categories.Violence, Is.True); + Assert.That(moderations[1].CategoryScores.Violence, Is.GreaterThan(0.5)); + } + + [Test] + public void SerializeModerationCollection() + { + // TODO: Add this test. 
+ } +} diff --git a/.dotnet/tests/Moderations/OpenAIModerationsModelFactoryTests.cs b/.dotnet/tests/Moderations/OpenAIModerationsModelFactoryTests.cs new file mode 100644 index 000000000..2b5660b8e --- /dev/null +++ b/.dotnet/tests/Moderations/OpenAIModerationsModelFactoryTests.cs @@ -0,0 +1,543 @@ +using System; +using System.Collections.Generic; +using System.Linq; +using NUnit.Framework; +using OpenAI.Moderations; + +namespace OpenAI.Tests.Moderations; + +[Parallelizable(ParallelScope.All)] +[Category("Smoke")] +public partial class OpenAIModerationsModelFactoryTests +{ + [Test] + public void ModerationCategoriesWithNoPropertiesWorks() + { + ModerationCategories moderationCategories = OpenAIModerationsModelFactory.ModerationCategories(); + + Assert.That(moderationCategories.Hate, Is.False); + Assert.That(moderationCategories.HateThreatening, Is.False); + Assert.That(moderationCategories.Harassment, Is.False); + Assert.That(moderationCategories.HarassmentThreatening, Is.False); + Assert.That(moderationCategories.SelfHarm, Is.False); + Assert.That(moderationCategories.SelfHarmIntent, Is.False); + Assert.That(moderationCategories.SelfHarmInstructions, Is.False); + Assert.That(moderationCategories.Sexual, Is.False); + Assert.That(moderationCategories.SexualMinors, Is.False); + Assert.That(moderationCategories.Violence, Is.False); + Assert.That(moderationCategories.ViolenceGraphic, Is.False); + } + + [Test] + public void ModerationCategoriesWithHateWorks() + { + ModerationCategories moderationCategories = OpenAIModerationsModelFactory.ModerationCategories(hate: true); + + Assert.That(moderationCategories.Hate, Is.True); + Assert.That(moderationCategories.HateThreatening, Is.False); + Assert.That(moderationCategories.Harassment, Is.False); + Assert.That(moderationCategories.HarassmentThreatening, Is.False); + Assert.That(moderationCategories.SelfHarm, Is.False); + Assert.That(moderationCategories.SelfHarmIntent, Is.False); + 
Assert.That(moderationCategories.SelfHarmInstructions, Is.False); + Assert.That(moderationCategories.Sexual, Is.False); + Assert.That(moderationCategories.SexualMinors, Is.False); + Assert.That(moderationCategories.Violence, Is.False); + Assert.That(moderationCategories.ViolenceGraphic, Is.False); + } + + [Test] + public void ModerationCategoriesWithHateThreateningWorks() + { + ModerationCategories moderationCategories = OpenAIModerationsModelFactory.ModerationCategories(hateThreatening: true); + + Assert.That(moderationCategories.Hate, Is.False); + Assert.That(moderationCategories.HateThreatening, Is.True); + Assert.That(moderationCategories.Harassment, Is.False); + Assert.That(moderationCategories.HarassmentThreatening, Is.False); + Assert.That(moderationCategories.SelfHarm, Is.False); + Assert.That(moderationCategories.SelfHarmIntent, Is.False); + Assert.That(moderationCategories.SelfHarmInstructions, Is.False); + Assert.That(moderationCategories.Sexual, Is.False); + Assert.That(moderationCategories.SexualMinors, Is.False); + Assert.That(moderationCategories.Violence, Is.False); + Assert.That(moderationCategories.ViolenceGraphic, Is.False); + } + + [Test] + public void ModerationCategoriesWithHarassmentWorks() + { + ModerationCategories moderationCategories = OpenAIModerationsModelFactory.ModerationCategories(harassment: true); + + Assert.That(moderationCategories.Hate, Is.False); + Assert.That(moderationCategories.HateThreatening, Is.False); + Assert.That(moderationCategories.Harassment, Is.True); + Assert.That(moderationCategories.HarassmentThreatening, Is.False); + Assert.That(moderationCategories.SelfHarm, Is.False); + Assert.That(moderationCategories.SelfHarmIntent, Is.False); + Assert.That(moderationCategories.SelfHarmInstructions, Is.False); + Assert.That(moderationCategories.Sexual, Is.False); + Assert.That(moderationCategories.SexualMinors, Is.False); + Assert.That(moderationCategories.Violence, Is.False); + 
Assert.That(moderationCategories.ViolenceGraphic, Is.False); + } + + [Test] + public void ModerationCategoriesWithHarassmentThreateningWorks() + { + ModerationCategories moderationCategories = OpenAIModerationsModelFactory.ModerationCategories(harassmentThreatening: true); + + Assert.That(moderationCategories.Hate, Is.False); + Assert.That(moderationCategories.HateThreatening, Is.False); + Assert.That(moderationCategories.Harassment, Is.False); + Assert.That(moderationCategories.HarassmentThreatening, Is.True); + Assert.That(moderationCategories.SelfHarm, Is.False); + Assert.That(moderationCategories.SelfHarmIntent, Is.False); + Assert.That(moderationCategories.SelfHarmInstructions, Is.False); + Assert.That(moderationCategories.Sexual, Is.False); + Assert.That(moderationCategories.SexualMinors, Is.False); + Assert.That(moderationCategories.Violence, Is.False); + Assert.That(moderationCategories.ViolenceGraphic, Is.False); + } + + [Test] + public void ModerationCategoriesWithSelfHarmWorks() + { + ModerationCategories moderationCategories = OpenAIModerationsModelFactory.ModerationCategories(selfHarm: true); + + Assert.That(moderationCategories.Hate, Is.False); + Assert.That(moderationCategories.HateThreatening, Is.False); + Assert.That(moderationCategories.Harassment, Is.False); + Assert.That(moderationCategories.HarassmentThreatening, Is.False); + Assert.That(moderationCategories.SelfHarm, Is.True); + Assert.That(moderationCategories.SelfHarmIntent, Is.False); + Assert.That(moderationCategories.SelfHarmInstructions, Is.False); + Assert.That(moderationCategories.Sexual, Is.False); + Assert.That(moderationCategories.SexualMinors, Is.False); + Assert.That(moderationCategories.Violence, Is.False); + Assert.That(moderationCategories.ViolenceGraphic, Is.False); + } + + [Test] + public void ModerationCategoriesWithSelfHarmIntentWorks() + { + ModerationCategories moderationCategories = OpenAIModerationsModelFactory.ModerationCategories(selfHarmIntent: true); + + 
Assert.That(moderationCategories.Hate, Is.False); + Assert.That(moderationCategories.HateThreatening, Is.False); + Assert.That(moderationCategories.Harassment, Is.False); + Assert.That(moderationCategories.HarassmentThreatening, Is.False); + Assert.That(moderationCategories.SelfHarm, Is.False); + Assert.That(moderationCategories.SelfHarmIntent, Is.True); + Assert.That(moderationCategories.SelfHarmInstructions, Is.False); + Assert.That(moderationCategories.Sexual, Is.False); + Assert.That(moderationCategories.SexualMinors, Is.False); + Assert.That(moderationCategories.Violence, Is.False); + Assert.That(moderationCategories.ViolenceGraphic, Is.False); + } + + [Test] + public void ModerationCategoriesWithSelfHarmInstructionWorks() + { + ModerationCategories moderationCategories = OpenAIModerationsModelFactory.ModerationCategories(selfHarmInstructions: true); + + Assert.That(moderationCategories.Hate, Is.False); + Assert.That(moderationCategories.HateThreatening, Is.False); + Assert.That(moderationCategories.Harassment, Is.False); + Assert.That(moderationCategories.HarassmentThreatening, Is.False); + Assert.That(moderationCategories.SelfHarm, Is.False); + Assert.That(moderationCategories.SelfHarmIntent, Is.False); + Assert.That(moderationCategories.SelfHarmInstructions, Is.True); + Assert.That(moderationCategories.Sexual, Is.False); + Assert.That(moderationCategories.SexualMinors, Is.False); + Assert.That(moderationCategories.Violence, Is.False); + Assert.That(moderationCategories.ViolenceGraphic, Is.False); + } + + [Test] + public void ModerationCategoriesWithSexualWorks() + { + ModerationCategories moderationCategories = OpenAIModerationsModelFactory.ModerationCategories(sexual: true); + + Assert.That(moderationCategories.Hate, Is.False); + Assert.That(moderationCategories.HateThreatening, Is.False); + Assert.That(moderationCategories.Harassment, Is.False); + Assert.That(moderationCategories.HarassmentThreatening, Is.False); + 
Assert.That(moderationCategories.SelfHarm, Is.False); + Assert.That(moderationCategories.SelfHarmIntent, Is.False); + Assert.That(moderationCategories.SelfHarmInstructions, Is.False); + Assert.That(moderationCategories.Sexual, Is.True); + Assert.That(moderationCategories.SexualMinors, Is.False); + Assert.That(moderationCategories.Violence, Is.False); + Assert.That(moderationCategories.ViolenceGraphic, Is.False); + } + + [Test] + public void ModerationCategoriesWithSexualMinorsWorks() + { + ModerationCategories moderationCategories = OpenAIModerationsModelFactory.ModerationCategories(sexualMinors: true); + + Assert.That(moderationCategories.Hate, Is.False); + Assert.That(moderationCategories.HateThreatening, Is.False); + Assert.That(moderationCategories.Harassment, Is.False); + Assert.That(moderationCategories.HarassmentThreatening, Is.False); + Assert.That(moderationCategories.SelfHarm, Is.False); + Assert.That(moderationCategories.SelfHarmIntent, Is.False); + Assert.That(moderationCategories.SelfHarmInstructions, Is.False); + Assert.That(moderationCategories.Sexual, Is.False); + Assert.That(moderationCategories.SexualMinors, Is.True); + Assert.That(moderationCategories.Violence, Is.False); + Assert.That(moderationCategories.ViolenceGraphic, Is.False); + } + + [Test] + public void ModerationCategoriesWithViolenceWorks() + { + ModerationCategories moderationCategories = OpenAIModerationsModelFactory.ModerationCategories(violence: true); + + Assert.That(moderationCategories.Hate, Is.False); + Assert.That(moderationCategories.HateThreatening, Is.False); + Assert.That(moderationCategories.Harassment, Is.False); + Assert.That(moderationCategories.HarassmentThreatening, Is.False); + Assert.That(moderationCategories.SelfHarm, Is.False); + Assert.That(moderationCategories.SelfHarmIntent, Is.False); + Assert.That(moderationCategories.SelfHarmInstructions, Is.False); + Assert.That(moderationCategories.Sexual, Is.False); + Assert.That(moderationCategories.SexualMinors, 
Is.False); + Assert.That(moderationCategories.Violence, Is.True); + Assert.That(moderationCategories.ViolenceGraphic, Is.False); + } + + [Test] + public void ModerationCategoriesWithViolenceGraphicWorks() + { + ModerationCategories moderationCategories = OpenAIModerationsModelFactory.ModerationCategories(violenceGraphic: true); + + Assert.That(moderationCategories.Hate, Is.False); + Assert.That(moderationCategories.HateThreatening, Is.False); + Assert.That(moderationCategories.Harassment, Is.False); + Assert.That(moderationCategories.HarassmentThreatening, Is.False); + Assert.That(moderationCategories.SelfHarm, Is.False); + Assert.That(moderationCategories.SelfHarmIntent, Is.False); + Assert.That(moderationCategories.SelfHarmInstructions, Is.False); + Assert.That(moderationCategories.Sexual, Is.False); + Assert.That(moderationCategories.SexualMinors, Is.False); + Assert.That(moderationCategories.Violence, Is.False); + Assert.That(moderationCategories.ViolenceGraphic, Is.True); + } + + [Test] + public void ModerationCategoryScoresWithNoPropertiesWorks() + { + ModerationCategoryScores moderationCategoryScores = OpenAIModerationsModelFactory.ModerationCategoryScores(); + + Assert.That(moderationCategoryScores.Hate, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.HateThreatening, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.Harassment, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.HarassmentThreatening, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.SelfHarm, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.SelfHarmIntent, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.SelfHarmInstructions, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.Sexual, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.SexualMinors, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.Violence, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.ViolenceGraphic, Is.EqualTo(0f)); + } + + [Test] + public void 
ModerationCategoryScoresWithHateWorks() + { + float hate = 0.85f; + ModerationCategoryScores moderationCategoryScores = OpenAIModerationsModelFactory.ModerationCategoryScores(hate: hate); + + Assert.That(moderationCategoryScores.Hate, Is.EqualTo(hate)); + Assert.That(moderationCategoryScores.HateThreatening, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.Harassment, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.HarassmentThreatening, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.SelfHarm, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.SelfHarmIntent, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.SelfHarmInstructions, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.Sexual, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.SexualMinors, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.Violence, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.ViolenceGraphic, Is.EqualTo(0f)); + } + + [Test] + public void ModerationCategoryScoresWithHateThreateningWorks() + { + float hateThreatening = 0.85f; + ModerationCategoryScores moderationCategoryScores = OpenAIModerationsModelFactory.ModerationCategoryScores(hateThreatening: hateThreatening); + + Assert.That(moderationCategoryScores.Hate, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.HateThreatening, Is.EqualTo(hateThreatening)); + Assert.That(moderationCategoryScores.Harassment, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.HarassmentThreatening, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.SelfHarm, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.SelfHarmIntent, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.SelfHarmInstructions, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.Sexual, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.SexualMinors, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.Violence, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.ViolenceGraphic, 
Is.EqualTo(0f)); + } + + [Test] + public void ModerationCategoryScoresWithHarassmentWorks() + { + float harassment = 0.85f; + ModerationCategoryScores moderationCategoryScores = OpenAIModerationsModelFactory.ModerationCategoryScores(harassment: harassment); + + Assert.That(moderationCategoryScores.Hate, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.HateThreatening, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.Harassment, Is.EqualTo(harassment)); + Assert.That(moderationCategoryScores.HarassmentThreatening, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.SelfHarm, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.SelfHarmIntent, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.SelfHarmInstructions, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.Sexual, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.SexualMinors, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.Violence, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.ViolenceGraphic, Is.EqualTo(0f)); + } + + [Test] + public void ModerationCategoryScoresWithHarassmentThreateningWorks() + { + float harassmentThreatening = 0.85f; + ModerationCategoryScores moderationCategoryScores = OpenAIModerationsModelFactory.ModerationCategoryScores(harassmentThreatening: harassmentThreatening); + + Assert.That(moderationCategoryScores.Hate, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.HateThreatening, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.Harassment, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.HarassmentThreatening, Is.EqualTo(harassmentThreatening)); + Assert.That(moderationCategoryScores.SelfHarm, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.SelfHarmIntent, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.SelfHarmInstructions, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.Sexual, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.SexualMinors, Is.EqualTo(0f)); + 
Assert.That(moderationCategoryScores.Violence, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.ViolenceGraphic, Is.EqualTo(0f)); + } + + [Test] + public void ModerationCategoryScoresWithSelfHarmWorks() + { + float selfHarm = 0.85f; + ModerationCategoryScores moderationCategoryScores = OpenAIModerationsModelFactory.ModerationCategoryScores(selfHarm: selfHarm); + + Assert.That(moderationCategoryScores.Hate, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.HateThreatening, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.Harassment, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.HarassmentThreatening, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.SelfHarm, Is.EqualTo(selfHarm)); + Assert.That(moderationCategoryScores.SelfHarmIntent, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.SelfHarmInstructions, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.Sexual, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.SexualMinors, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.Violence, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.ViolenceGraphic, Is.EqualTo(0f)); + } + + [Test] + public void ModerationCategoryScoresWithSelfHarmIntentWorks() + { + float selfHarmIntent = 0.85f; + ModerationCategoryScores moderationCategoryScores = OpenAIModerationsModelFactory.ModerationCategoryScores(selfHarmIntent: selfHarmIntent); + + Assert.That(moderationCategoryScores.Hate, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.HateThreatening, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.Harassment, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.HarassmentThreatening, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.SelfHarm, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.SelfHarmIntent, Is.EqualTo(selfHarmIntent)); + Assert.That(moderationCategoryScores.SelfHarmInstructions, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.Sexual, Is.EqualTo(0f)); + 
Assert.That(moderationCategoryScores.SexualMinors, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.Violence, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.ViolenceGraphic, Is.EqualTo(0f)); + } + + [Test] + public void ModerationCategoryScoresWithSelfHarmInstructionWorks() + { + float selfHarmInstructions = 0.85f; + ModerationCategoryScores moderationCategoryScores = OpenAIModerationsModelFactory.ModerationCategoryScores(selfHarmInstructions: selfHarmInstructions); + + Assert.That(moderationCategoryScores.Hate, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.HateThreatening, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.Harassment, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.HarassmentThreatening, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.SelfHarm, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.SelfHarmIntent, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.SelfHarmInstructions, Is.EqualTo(selfHarmInstructions)); + Assert.That(moderationCategoryScores.Sexual, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.SexualMinors, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.Violence, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.ViolenceGraphic, Is.EqualTo(0f)); + } + + [Test] + public void ModerationCategoryScoresWithSexualWorks() + { + float sexual = 0.85f; + ModerationCategoryScores moderationCategoryScores = OpenAIModerationsModelFactory.ModerationCategoryScores(sexual: sexual); + + Assert.That(moderationCategoryScores.Hate, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.HateThreatening, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.Harassment, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.HarassmentThreatening, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.SelfHarm, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.SelfHarmIntent, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.SelfHarmInstructions, 
Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.Sexual, Is.EqualTo(sexual)); + Assert.That(moderationCategoryScores.SexualMinors, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.Violence, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.ViolenceGraphic, Is.EqualTo(0f)); + } + + [Test] + public void ModerationCategoryScoresWithSexualMinorsWorks() + { + float sexualMinors = 0.85f; + ModerationCategoryScores moderationCategoryScores = OpenAIModerationsModelFactory.ModerationCategoryScores(sexualMinors: sexualMinors); + + Assert.That(moderationCategoryScores.Hate, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.HateThreatening, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.Harassment, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.HarassmentThreatening, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.SelfHarm, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.SelfHarmIntent, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.SelfHarmInstructions, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.Sexual, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.SexualMinors, Is.EqualTo(sexualMinors)); + Assert.That(moderationCategoryScores.Violence, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.ViolenceGraphic, Is.EqualTo(0f)); + } + + [Test] + public void ModerationCategoryScoresWithViolenceWorks() + { + float violence = 0.85f; + ModerationCategoryScores moderationCategoryScores = OpenAIModerationsModelFactory.ModerationCategoryScores(violence: violence); + + Assert.That(moderationCategoryScores.Hate, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.HateThreatening, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.Harassment, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.HarassmentThreatening, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.SelfHarm, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.SelfHarmIntent, Is.EqualTo(0f)); + 
Assert.That(moderationCategoryScores.SelfHarmInstructions, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.Sexual, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.SexualMinors, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.Violence, Is.EqualTo(violence)); + Assert.That(moderationCategoryScores.ViolenceGraphic, Is.EqualTo(0f)); + } + + [Test] + public void ModerationCategoryScoresWithViolenceGraphicWorks() + { + float violenceGraphic = 0.85f; + ModerationCategoryScores moderationCategoryScores = OpenAIModerationsModelFactory.ModerationCategoryScores(violenceGraphic: violenceGraphic); + + Assert.That(moderationCategoryScores.Hate, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.HateThreatening, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.Harassment, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.HarassmentThreatening, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.SelfHarm, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.SelfHarmIntent, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.SelfHarmInstructions, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.Sexual, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.SexualMinors, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.Violence, Is.EqualTo(0f)); + Assert.That(moderationCategoryScores.ViolenceGraphic, Is.EqualTo(violenceGraphic)); + } + + [Test] + public void ModerationCollectionWithNoPropertiesWorks() + { + ModerationCollection moderationCollection = OpenAIModerationsModelFactory.ModerationCollection(); + + Assert.That(moderationCollection.Id, Is.Null); + Assert.That(moderationCollection.Model, Is.Null); + Assert.That(moderationCollection.Count, Is.EqualTo(0)); + } + + [Test] + public void ModerationCollectionWithIdWorks() + { + string id = "moderationId"; + ModerationCollection moderationCollection = OpenAIModerationsModelFactory.ModerationCollection(id: id); + + Assert.That(moderationCollection.Id, 
Is.EqualTo(id)); + Assert.That(moderationCollection.Model, Is.Null); + Assert.That(moderationCollection.Count, Is.EqualTo(0)); + } + + [Test] + public void ModerationCollectionWithModelWorks() + { + string model = "supermodel"; + ModerationCollection moderationCollection = OpenAIModerationsModelFactory.ModerationCollection(model: model); + + Assert.That(moderationCollection.Id, Is.Null); + Assert.That(moderationCollection.Model, Is.EqualTo(model)); + Assert.That(moderationCollection.Count, Is.EqualTo(0)); + } + + [Test] + public void ModerationCollectionWithItemsWorks() + { + IEnumerable<ModerationResult> items = [ + OpenAIModerationsModelFactory.ModerationResult(flagged: true), + OpenAIModerationsModelFactory.ModerationResult(flagged: false) + ]; + ModerationCollection moderationCollection = OpenAIModerationsModelFactory.ModerationCollection(items: items); + + Assert.That(moderationCollection.Id, Is.Null); + Assert.That(moderationCollection.Model, Is.Null); + Assert.That(moderationCollection.SequenceEqual(items), Is.True); + } + + [Test] + public void ModerationResultWithNoPropertiesWorks() + { + ModerationResult moderationResult = OpenAIModerationsModelFactory.ModerationResult(); + + Assert.That(moderationResult.Flagged, Is.False); + Assert.That(moderationResult.Categories, Is.Null); + Assert.That(moderationResult.CategoryScores, Is.Null); + } + + [Test] + public void ModerationResultWithFlaggedWorks() + { + ModerationResult moderationResult = OpenAIModerationsModelFactory.ModerationResult(flagged: true); + + Assert.That(moderationResult.Flagged, Is.True); + Assert.That(moderationResult.Categories, Is.Null); + Assert.That(moderationResult.CategoryScores, Is.Null); + } + + [Test] + public void ModerationResultWithCategoriesWorks() + { + ModerationCategories categories = OpenAIModerationsModelFactory.ModerationCategories(hate: true); + ModerationResult moderationResult = OpenAIModerationsModelFactory.ModerationResult(categories: categories); + + Assert.That(moderationResult.Flagged, 
Is.False); + Assert.That(moderationResult.Categories, Is.EqualTo(categories)); + Assert.That(moderationResult.CategoryScores, Is.Null); + } + + [Test] + public void ModerationResultWithCategoryScoresWorks() + { + ModerationCategoryScores categoryScores = OpenAIModerationsModelFactory.ModerationCategoryScores(hate: 0.85f); + ModerationResult moderationResult = OpenAIModerationsModelFactory.ModerationResult(categoryScores: categoryScores); + + Assert.That(moderationResult.Flagged, Is.False); + Assert.That(moderationResult.Categories, Is.Null); + Assert.That(moderationResult.CategoryScores, Is.EqualTo(categoryScores)); + } +} diff --git a/.dotnet/tests/OpenAI.Tests.csproj b/.dotnet/tests/OpenAI.Tests.csproj new file mode 100644 index 000000000..c05ab9f8d --- /dev/null +++ b/.dotnet/tests/OpenAI.Tests.csproj @@ -0,0 +1,23 @@ + + + net8.0 + + $(NoWarn);CS1591 + latest + + + + + + + + + + + + + + + + + \ No newline at end of file diff --git a/.dotnet/tests/Telemetry/ChatTelemetryTests.cs b/.dotnet/tests/Telemetry/ChatTelemetryTests.cs new file mode 100644 index 000000000..d3b043a7c --- /dev/null +++ b/.dotnet/tests/Telemetry/ChatTelemetryTests.cs @@ -0,0 +1,315 @@ +using NUnit.Framework; +using OpenAI.Chat; +using OpenAI.Telemetry; +using System; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Diagnostics; +using System.Diagnostics.Metrics; +using System.Linq; +using System.Net.Sockets; +using System.Reflection; +using System.Threading; +using System.Threading.Tasks; +using static OpenAI.Tests.Telemetry.TestMeterListener; +using static OpenAI.Tests.Telemetry.TestActivityListener; + +namespace OpenAI.Tests.Telemetry; + +[TestFixture] +[NonParallelizable] +[Category("Smoke")] +public class ChatTelemetryTests +{ + private const string RequestModel = "requestModel"; + private const string Host = "host"; + private const int Port = 42; + private static readonly string Endpoint = $"https://{Host}:{Port}/path"; + private const string 
CompletionId = "chatcmpl-9fG9OILMJnKZARXDwxoCnLcvDsDDX"; + private const string CompletionContent = "hello world"; + private const string ResponseModel = "responseModel"; + private const string FinishReason = "stop"; + private const int PromptTokens = 2; + private const int CompletionTokens = 42; + + [Test] + public void AllTelemetryOff() + { + var telemetry = new OpenTelemetrySource(RequestModel, new Uri(Endpoint)); + Assert.IsNull(telemetry.StartChatScope(new ChatCompletionOptions())); + Assert.IsNull(Activity.Current); + } + + [Test] + public void SwitchOffAllTelemetryOn() + { + using var activityListener = new TestActivityListener("OpenAI.ChatClient"); + using var meterListener = new TestMeterListener("OpenAI.ChatClient"); + var telemetry = new OpenTelemetrySource(RequestModel, new Uri(Endpoint)); + Assert.IsNull(telemetry.StartChatScope(new ChatCompletionOptions())); + Assert.IsNull(Activity.Current); + } + + [Test] + public void MetricsOnTracingOff() + { + using var _ = TestAppContextSwitchHelper.EnableOpenTelemetry(); + + var telemetry = new OpenTelemetrySource(RequestModel, new Uri(Endpoint)); + + using var meterListener = new TestMeterListener("OpenAI.ChatClient"); + + var elapsedMax = Stopwatch.StartNew(); + using var scope = telemetry.StartChatScope(new ChatCompletionOptions()); + var elapsedMin = Stopwatch.StartNew(); + + Assert.Null(Activity.Current); + Assert.NotNull(scope); + + // so we have some duration to measure + Thread.Sleep(20); + + elapsedMin.Stop(); + + var response = CreateChatCompletion(); + scope.RecordChatCompletion(response); + scope.Dispose(); + + ValidateDuration(meterListener, response, elapsedMin.Elapsed, elapsedMax.Elapsed); + ValidateUsage(meterListener, response, PromptTokens, CompletionTokens); + } + + [Test] + public void MetricsOnTracingOffException() + { + using var _ = TestAppContextSwitchHelper.EnableOpenTelemetry(); + + var telemetry = new OpenTelemetrySource(RequestModel, new Uri(Endpoint)); + using var meterListener = 
new TestMeterListener("OpenAI.ChatClient"); + + using (var scope = telemetry.StartChatScope(new ChatCompletionOptions())) + { + scope.RecordException(new TaskCanceledException()); + } + + ValidateDuration(meterListener, null, TimeSpan.MinValue, TimeSpan.MaxValue); + Assert.IsNull(meterListener.GetMeasurements("gen_ai.client.token.usage")); + } + + [Test] + public void TracingOnMetricsOff() + { + using var _ = TestAppContextSwitchHelper.EnableOpenTelemetry(); + + var telemetry = new OpenTelemetrySource(RequestModel, new Uri(Endpoint)); + using var listener = new TestActivityListener("OpenAI.ChatClient"); + + var chatCompletion = CreateChatCompletion(); + + Activity activity = null; + using (var scope = telemetry.StartChatScope(new ChatCompletionOptions())) + { + activity = Activity.Current; + Assert.IsNull(activity.GetTagItem("gen_ai.request.temperature")); + Assert.IsNull(activity.GetTagItem("gen_ai.request.top_p")); + Assert.IsNull(activity.GetTagItem("gen_ai.request.max_tokens")); + + Assert.NotNull(scope); + + scope.RecordChatCompletion(chatCompletion); + } + + Assert.Null(Activity.Current); + Assert.AreEqual(1, listener.Activities.Count); + + ValidateChatActivity(listener.Activities.Single(), chatCompletion, RequestModel, Host, Port); + } + + [Test] + public void ChatTracingAllAttributes() + { + using var _ = TestAppContextSwitchHelper.EnableOpenTelemetry(); + var telemetry = new OpenTelemetrySource(RequestModel, new Uri(Endpoint)); + using var listener = new TestActivityListener("OpenAI.ChatClient"); + var options = new ChatCompletionOptions() + { + Temperature = 0.42f, + MaxTokens = 200, + TopP = 0.9f + }; + SetMessages(options, new UserChatMessage("hello")); + + var chatCompletion = CreateChatCompletion(); + + using (var scope = telemetry.StartChatScope(options)) + { + Assert.AreEqual(options.Temperature.Value, (float)Activity.Current.GetTagItem("gen_ai.request.temperature"), 0.01); + Assert.AreEqual(options.TopP.Value, 
(float)Activity.Current.GetTagItem("gen_ai.request.top_p"), 0.01); + Assert.AreEqual(options.MaxTokens.Value, Activity.Current.GetTagItem("gen_ai.request.max_tokens")); + scope.RecordChatCompletion(chatCompletion); + } + Assert.Null(Activity.Current); + + ValidateChatActivity(listener.Activities.Single(), chatCompletion, RequestModel, Host, Port); + } + + [Test] + public void ChatTracingException() + { + using var _ = TestAppContextSwitchHelper.EnableOpenTelemetry(); + + var telemetry = new OpenTelemetrySource(RequestModel, new Uri(Endpoint)); + using var listener = new TestActivityListener("OpenAI.ChatClient"); + + var error = new SocketException(42, "test error"); + using (var scope = telemetry.StartChatScope(new ChatCompletionOptions())) + { + scope.RecordException(error); + } + + Assert.Null(Activity.Current); + + ValidateChatActivity(listener.Activities.Single(), error, RequestModel, Host, Port); + } + + [Test] + public async Task ChatTracingAndMetricsMultiple() + { + using var _ = TestAppContextSwitchHelper.EnableOpenTelemetry(); + var source = new OpenTelemetrySource(RequestModel, new Uri(Endpoint)); + + using var activityListener = new TestActivityListener("OpenAI.ChatClient"); + using var meterListener = new TestMeterListener("OpenAI.ChatClient"); + + var options = new ChatCompletionOptions(); + + var tasks = new Task[5]; + int numberOfSuccessfulResponses = 3; + int totalPromptTokens = 0, totalCompletionTokens = 0; + for (int i = 0; i < tasks.Length; i ++) + { + int t = i; + // don't let Activity.Current escape the scope + tasks[i] = Task.Run(async () => + { + using var scope = source.StartChatScope(options); + await Task.Delay(10); + if (t < numberOfSuccessfulResponses) + { + var promptTokens = Random.Shared.Next(100); + var completionTokens = Random.Shared.Next(100); + + var completion = CreateChatCompletion(promptTokens, completionTokens); + totalPromptTokens += promptTokens; + totalCompletionTokens += completionTokens; + 
scope.RecordChatCompletion(completion); + } + else + { + scope.RecordException(new TaskCanceledException()); + } + }); + } + + await Task.WhenAll(tasks); + + Assert.AreEqual(tasks.Length, activityListener.Activities.Count); + + var durations = meterListener.GetMeasurements("gen_ai.client.operation.duration"); + Assert.AreEqual(tasks.Length, durations.Count); + Assert.AreEqual(numberOfSuccessfulResponses, durations.Count(d => !d.tags.ContainsKey("error.type"))); + + var usages = meterListener.GetMeasurements("gen_ai.client.token.usage"); + // we don't report usage if there was no response + Assert.AreEqual(numberOfSuccessfulResponses * 2, usages.Count); + Assert.IsEmpty(usages.Where(u => u.tags.ContainsKey("error.type"))); + + Assert.AreEqual(totalPromptTokens, usages + .Where(u => u.tags.Contains(new KeyValuePair<string, object>("gen_ai.token.type", "input"))) + .Sum(u => (long)u.value)); + Assert.AreEqual(totalCompletionTokens, usages + .Where(u => u.tags.Contains(new KeyValuePair<string, object>("gen_ai.token.type", "output"))) + .Sum(u => (long)u.value)); + } + + private void SetMessages(ChatCompletionOptions options, params ChatMessage[] messages) + { + var messagesProperty = typeof(ChatCompletionOptions).GetProperty("Messages", BindingFlags.Instance | BindingFlags.NonPublic); + messagesProperty.SetValue(options, messages.ToList()); + } + + private void ValidateDuration(TestMeterListener listener, ChatCompletion response, TimeSpan durationMin, TimeSpan durationMax) + { + var duration = listener.GetInstrument("gen_ai.client.operation.duration"); + Assert.IsNotNull(duration); + Assert.IsInstanceOf<Histogram<double>>(duration); + + var measurements = listener.GetMeasurements("gen_ai.client.operation.duration"); + Assert.IsNotNull(measurements); + Assert.AreEqual(1, measurements.Count); + + var measurement = measurements[0]; + Assert.IsInstanceOf<double>(measurement.value); + Assert.GreaterOrEqual((double)measurement.value, durationMin.TotalSeconds); + Assert.LessOrEqual((double)measurement.value, 
durationMax.TotalSeconds); + + ValidateChatMetricTags(measurement, response, RequestModel, Host, Port); + } + + private void ValidateUsage(TestMeterListener listener, ChatCompletion response, int inputTokens, int outputTokens) + { + var usage = listener.GetInstrument("gen_ai.client.token.usage"); + Assert.IsNotNull(usage); + Assert.IsInstanceOf<Histogram<long>>(usage); + + var measurements = listener.GetMeasurements("gen_ai.client.token.usage"); + Assert.IsNotNull(measurements); + Assert.AreEqual(2, measurements.Count); + + foreach (var measurement in measurements) + { + Assert.IsInstanceOf<long>(measurement.value); + ValidateChatMetricTags(measurement, response, RequestModel, Host, Port); + } + + Assert.True(measurements[0].tags.TryGetValue("gen_ai.token.type", out var type)); + Assert.IsInstanceOf<string>(type); + + TestMeasurement input = (type is "input") ? measurements[0] : measurements[1]; + TestMeasurement output = (type is "input") ? measurements[1] : measurements[0]; + + Assert.AreEqual(inputTokens, input.value); + Assert.AreEqual(outputTokens, output.value); + } + + private static ChatCompletion CreateChatCompletion(int promptTokens = PromptTokens, int completionTokens = CompletionTokens) + { + var completion = BinaryData.FromString( + $$""" + { + "id": "{{CompletionId}}", + "created": 1719621282, + "choices": [ + { + "message": { + "role": "assistant", + "content": "{{CompletionContent}}" + }, + "logprobs": null, + "index": 0, + "finish_reason": "{{FinishReason}}" + } + ], + "model": "{{ResponseModel}}", + "system_fingerprint": "fp_7ec89fabc6", + "usage": { + "completion_tokens": {{completionTokens}}, + "prompt_tokens": {{promptTokens}}, + "total_tokens": 42 + } + } + """); + + return ModelReaderWriter.Read<ChatCompletion>(completion); + } +} diff --git a/.dotnet/tests/Telemetry/TestActivityListener.cs b/.dotnet/tests/Telemetry/TestActivityListener.cs new file mode 100644 index 000000000..f7a5208a3 --- /dev/null +++ b/.dotnet/tests/Telemetry/TestActivityListener.cs @@ -0,0 +1,72 @@ +// Copyright (c) 
Microsoft Corporation. All rights reserved. +// Licensed under the MIT License. + +using NUnit.Framework; +using OpenAI.Chat; +using System; +using System.Collections.Concurrent; +using System.Collections.Generic; +using System.Diagnostics; +using System.Linq; + +namespace OpenAI.Tests.Telemetry; + +internal class TestActivityListener : IDisposable +{ + private readonly ActivityListener _listener; + private readonly ConcurrentQueue<Activity> stoppedActivities = new ConcurrentQueue<Activity>(); + + public TestActivityListener(string sourceName) + { + _listener = new ActivityListener() + { + ActivityStopped = stoppedActivities.Enqueue, + ShouldListenTo = s => s.Name == sourceName, + Sample = (ref ActivityCreationOptions<ActivityContext> _) => ActivitySamplingResult.AllDataAndRecorded, + }; + + ActivitySource.AddActivityListener(_listener); + } + + public List<Activity> Activities => stoppedActivities.ToList(); + + public void Dispose() + { + _listener.Dispose(); + } + + public static void ValidateChatActivity(Activity activity, ChatCompletion response, string requestModel = "gpt-4o-mini", string host = "api.openai.com", int port = 443) + { + Assert.NotNull(activity); + Assert.AreEqual($"chat {requestModel}", activity.DisplayName); + Assert.AreEqual("chat", activity.GetTagItem("gen_ai.operation.name")); + Assert.AreEqual("openai", activity.GetTagItem("gen_ai.system")); + Assert.AreEqual(requestModel, activity.GetTagItem("gen_ai.request.model")); + + Assert.AreEqual(host, activity.GetTagItem("server.address")); + Assert.AreEqual(port, activity.GetTagItem("server.port")); + + if (response != null) + { + Assert.AreEqual(response.Model, activity.GetTagItem("gen_ai.response.model")); + Assert.AreEqual(response.Id, activity.GetTagItem("gen_ai.response.id")); + Assert.AreEqual(new[] { response.FinishReason.ToString().ToLower() }, activity.GetTagItem("gen_ai.response.finish_reasons")); + Assert.AreEqual(response.Usage.OutputTokens, activity.GetTagItem("gen_ai.usage.output_tokens")); + 
Assert.AreEqual(response.Usage.InputTokens, activity.GetTagItem("gen_ai.usage.input_tokens")); + Assert.AreEqual(ActivityStatusCode.Unset, activity.Status); + Assert.Null(activity.StatusDescription); + Assert.Null(activity.GetTagItem("error.type")); + } + else + { + Assert.AreEqual(ActivityStatusCode.Error, activity.Status); + Assert.NotNull(activity.GetTagItem("error.type")); + } + } + + public static void ValidateChatActivity(Activity activity, Exception ex, string requestModel = "gpt-4o-mini", string host = "api.openai.com", int port = 443) + { + ValidateChatActivity(activity, (ChatCompletion)null, requestModel, host, port); + Assert.AreEqual(ex.GetType().FullName, activity.GetTagItem("error.type")); + } +} diff --git a/.dotnet/tests/Telemetry/TestAppContextSwitchHelper.cs b/.dotnet/tests/Telemetry/TestAppContextSwitchHelper.cs new file mode 100644 index 000000000..5faf5eca0 --- /dev/null +++ b/.dotnet/tests/Telemetry/TestAppContextSwitchHelper.cs @@ -0,0 +1,25 @@ +using System; + +namespace OpenAI.Tests.Telemetry; + +internal class TestAppContextSwitchHelper : IDisposable +{ + private const string OpenTelemetrySwitchName = "OpenAI.Experimental.EnableOpenTelemetry"; + + private string _switchName; + private TestAppContextSwitchHelper(string switchName) + { + _switchName = switchName; + AppContext.SetSwitch(_switchName, true); + } + + public static IDisposable EnableOpenTelemetry() + { + return new TestAppContextSwitchHelper(OpenTelemetrySwitchName); + } + + public void Dispose() + { + AppContext.SetSwitch(_switchName, false); + } +} diff --git a/.dotnet/tests/Telemetry/TestMeterListener.cs b/.dotnet/tests/Telemetry/TestMeterListener.cs new file mode 100644 index 000000000..b918beb7f --- /dev/null +++ b/.dotnet/tests/Telemetry/TestMeterListener.cs @@ -0,0 +1,84 @@ +using NUnit.Framework; +using OpenAI.Chat; +using System; +using System.Collections.Concurrent; +using System.Collections.Generic; +using System.Diagnostics.Metrics; + +namespace 
OpenAI.Tests.Telemetry; + +internal class TestMeterListener : IDisposable +{ + public record TestMeasurement(object value, Dictionary<string, object> tags); + + private readonly ConcurrentDictionary<string, List<TestMeasurement>> _measurements = new (); + private readonly ConcurrentDictionary<string, Instrument> _instruments = new (); + private readonly MeterListener _listener; + public TestMeterListener(string meterName) + { + _listener = new MeterListener(); + _listener.InstrumentPublished = (i, l) => + { + if (i.Meter.Name == meterName) + { + l.EnableMeasurementEvents(i); + } + }; + _listener.SetMeasurementEventCallback<double>(OnMeasurementRecorded); + _listener.SetMeasurementEventCallback<long>(OnMeasurementRecorded); + _listener.Start(); + } + + public List<TestMeasurement> GetMeasurements(string instrumentName) + { + _measurements.TryGetValue(instrumentName, out var list); + return list; + } + + public Instrument GetInstrument(string instrumentName) + { + _instruments.TryGetValue(instrumentName, out var instrument); + return instrument; + } + + private void OnMeasurementRecorded<T>(Instrument instrument, T measurement, ReadOnlySpan<KeyValuePair<string, object>> tags, object state) + { + _instruments.TryAdd(instrument.Name, instrument); + + var testMeasurement = new TestMeasurement(measurement, new Dictionary<string, object>(tags.ToArray())); + _measurements.AddOrUpdate(instrument.Name, + k => new() { testMeasurement }, + (k, l) => + { + l.Add(testMeasurement); + return l; + }); + } + + public void Dispose() + { + _listener.Dispose(); + } + + public static void ValidateChatMetricTags(TestMeasurement measurement, ChatCompletion response, string requestModel = "gpt-4o-mini", string host = "api.openai.com", int port = 443) + { + Assert.AreEqual("openai", measurement.tags["gen_ai.system"]); + Assert.AreEqual("chat", measurement.tags["gen_ai.operation.name"]); + Assert.AreEqual(host, measurement.tags["server.address"]); + Assert.AreEqual(requestModel, measurement.tags["gen_ai.request.model"]); + Assert.AreEqual(port, measurement.tags["server.port"]); + + if (response != null) + { + 
Assert.AreEqual(response.Model, measurement.tags["gen_ai.response.model"]); + Assert.False(measurement.tags.ContainsKey("error.type")); + } + } + + public static void ValidateChatMetricTags(TestMeasurement measurement, Exception ex, string requestModel = "gpt-4o-mini", string host = "api.openai.com", int port = 443) + { + ValidateChatMetricTags(measurement, (ChatCompletion)null, requestModel, host, port); + Assert.True(measurement.tags.ContainsKey("error.type")); + Assert.AreEqual(ex.GetType().FullName, measurement.tags["error.type"]); + } +} diff --git a/.dotnet/tests/UserAgentTests.cs b/.dotnet/tests/UserAgentTests.cs new file mode 100644 index 000000000..31b74e11d --- /dev/null +++ b/.dotnet/tests/UserAgentTests.cs @@ -0,0 +1,46 @@ +using NUnit.Framework; +using OpenAI.Chat; +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.IO; + +namespace OpenAI.Tests.Miscellaneous; + +public partial class UserAgentTests +{ + [Test] + public void DefaultUserAgentStringWorks() => UserAgentStringWorks(useApplicationId: false); + + [Test] + public void UserAgentWithApplicationIdWorks() => UserAgentStringWorks(useApplicationId: true); + + private void UserAgentStringWorks(bool useApplicationId) + { + ApiKeyCredential mockKeyCredential = new("no-real-key-needed"); + string userAgent = null; + TestPipelinePolicy policy = new((m) => + { + _ = m?.Request?.Headers?.TryGetValue("User-Agent", out userAgent); + }); + + OpenAIClientOptions options = useApplicationId ? 
new() + { + ApplicationId = "test-application-id", + } : new(); + options.AddPolicy(policy, PipelinePosition.BeforeTransport); + + ChatClient client = new("no-real-model-needed", Environment.GetEnvironmentVariable("OPENAI_API_KEY"), options); + RequestOptions noThrowOptions = new() { ErrorOptions = ClientErrorBehaviors.NoThrow, }; + using BinaryContent emptyContent = BinaryContent.Create(new MemoryStream()); + _ = client.CompleteChat(emptyContent, noThrowOptions); + + Assert.That(userAgent, Is.Not.Null.Or.Empty); + + if (useApplicationId) + { + Assert.That(userAgent, Does.Contain("test-application-id")); + } + Assert.That(userAgent, Does.Contain("OpenAI/")); + } +} diff --git a/.dotnet/tests/Utility/MockPipelineMessage.cs b/.dotnet/tests/Utility/MockPipelineMessage.cs new file mode 100644 index 000000000..651e59301 --- /dev/null +++ b/.dotnet/tests/Utility/MockPipelineMessage.cs @@ -0,0 +1,17 @@ +using System.ClientModel.Primitives; + +#nullable enable + +namespace OpenAI.Tests; + +public class MockPipelineMessage : PipelineMessage +{ + protected internal MockPipelineMessage(PipelineRequest request) : base(request) + { + } + + public void SetResponse(MockPipelineResponse response) + { + Response = response; + } +} \ No newline at end of file diff --git a/.dotnet/tests/Utility/MockPipelineRequest.cs b/.dotnet/tests/Utility/MockPipelineRequest.cs new file mode 100644 index 000000000..b97660d16 --- /dev/null +++ b/.dotnet/tests/Utility/MockPipelineRequest.cs @@ -0,0 +1,26 @@ +using System; +using System.ClientModel; +using System.ClientModel.Primitives; + +namespace OpenAI.Tests; + +public class MockPipelineRequest : PipelineRequest +{ + protected override string MethodCore { get; set; } = "POST"; + + protected override Uri UriCore { get; set; } + + protected override PipelineRequestHeaders HeadersCore { get; } = new MockPipelineRequestHeaders(); + + protected override BinaryContent ContentCore { get; set; } + + public MockPipelineRequest(BinaryData requestData) + { + 
ContentCore = BinaryContent.Create(requestData); + } + + public override void Dispose() + { + ContentCore?.Dispose(); + } +} \ No newline at end of file diff --git a/.dotnet/tests/Utility/MockPipelineRequestHeaders.cs b/.dotnet/tests/Utility/MockPipelineRequestHeaders.cs new file mode 100644 index 000000000..b8cfe1e36 --- /dev/null +++ b/.dotnet/tests/Utility/MockPipelineRequestHeaders.cs @@ -0,0 +1,21 @@ +using System; +using System.ClientModel.Primitives; +using System.Collections.Generic; + +#nullable enable + +namespace OpenAI.Tests; + +public class MockPipelineRequestHeaders : PipelineRequestHeaders +{ + private readonly Dictionary<string, string> _headers = []; + public override void Add(string name, string value) => _headers[name] = value; + public override IEnumerator<KeyValuePair<string, string>> GetEnumerator() => _headers.GetEnumerator(); + public override bool Remove(string name) => _headers.Remove(name); + public override void Set(string name, string value) => _headers[name] = value; + public override bool TryGetValue(string name, out string value) => _headers.TryGetValue(name, out value!); + public override bool TryGetValues(string name, out IEnumerable<string> values) + { + throw new NotImplementedException(); + } +} \ No newline at end of file diff --git a/.dotnet/tests/Utility/MockPipelineResponse.cs b/.dotnet/tests/Utility/MockPipelineResponse.cs new file mode 100644 index 000000000..031913979 --- /dev/null +++ b/.dotnet/tests/Utility/MockPipelineResponse.cs @@ -0,0 +1,166 @@ +using System; +using System.ClientModel.Primitives; +using System.IO; +using System.Text; +using System.Threading; +using System.Threading.Tasks; + +#nullable enable + +namespace OpenAI.Tests; + +public class MockPipelineResponse : PipelineResponse +{ + private int _status; + private string _reasonPhrase; + private Stream? _contentStream; + private BinaryData?
_bufferedContent; + + private bool _disposed; + + public MockPipelineResponse(int status = 0, string reasonPhrase = "") + { + _status = status; + _reasonPhrase = reasonPhrase; + } + + public override int Status => _status; + + public void SetStatus(int value) => _status = value; + + public override string ReasonPhrase => _reasonPhrase; + + public void SetReasonPhrase(string value) => _reasonPhrase = value; + + public void SetContent(byte[] content, bool bufferImmediately = false) + { + ContentStream = new MemoryStream(content, 0, content.Length, false, true); + if (bufferImmediately) + { + _ = BufferContent(); + } + } + + public MockPipelineResponse SetContent(string content) + { + SetContent(Encoding.UTF8.GetBytes(content)); + return this; + } + + public override Stream? ContentStream + { + get => _contentStream; + set => _contentStream = value; + } + + public override BinaryData Content + { + get + { + if (_bufferedContent is not null) + { + return _bufferedContent; + } + + if (_contentStream is null) + { + return new BinaryData(Array.Empty<byte>()); + } + + if (ContentStream is not MemoryStream memoryContent) + { + throw new InvalidOperationException("The response is not buffered."); + } + + if (memoryContent.TryGetBuffer(out ArraySegment<byte> segment)) + { + return new BinaryData(segment.AsMemory()); + } + else + { + return new BinaryData(memoryContent.ToArray()); + } + } + } + + protected override PipelineResponseHeaders HeadersCore + => throw new NotImplementedException(); + + public sealed override void Dispose() + { + Dispose(true); + + GC.SuppressFinalize(this); + } + + protected void Dispose(bool disposing) + { + if (disposing && !_disposed) + { + Stream?
content = _contentStream; + if (content != null) + { + _contentStream = null; + content.Dispose(); + } + + _disposed = true; + } + } + + public override BinaryData BufferContent(CancellationToken cancellationToken = default) + { + if (_bufferedContent is not null) + { + return _bufferedContent; + } + + if (_contentStream is null) + { + _bufferedContent = new BinaryData(Array.Empty<byte>()); + return _bufferedContent; + } + + MemoryStream bufferStream = new(); + _contentStream.CopyTo(bufferStream); + _contentStream.Dispose(); + _contentStream = bufferStream; + + // Less efficient FromStream method called here because it is a mock. + // For intended production implementation, see HttpClientTransportResponse. + _contentStream.Position = 0; + _bufferedContent = BinaryData.FromStream(bufferStream); + return _bufferedContent; + } + + public override async ValueTask<BinaryData> BufferContentAsync(CancellationToken cancellationToken = default) + { + if (_bufferedContent is not null) + { + return _bufferedContent; + } + + if (_contentStream is null) + { + _bufferedContent = new BinaryData(Array.Empty<byte>()); + return _bufferedContent; + } + + MemoryStream bufferStream = new(); + +#if NETSTANDARD2_0 || NETFRAMEWORK + await _contentStream.CopyToAsync(bufferStream).ConfigureAwait(false); + _contentStream.Dispose(); +#else + await _contentStream.CopyToAsync(bufferStream, cancellationToken).ConfigureAwait(false); + await _contentStream.DisposeAsync().ConfigureAwait(false); +#endif + + _contentStream = bufferStream; + + // Less efficient FromStream method called here because it is a mock. + // For intended production implementation, see HttpClientTransportResponse. + _contentStream.Position = 0;
+ _bufferedContent = BinaryData.FromStream(bufferStream); + return _bufferedContent; + } +} diff --git a/.dotnet/tests/Utility/MockPipelineTransport.cs b/.dotnet/tests/Utility/MockPipelineTransport.cs new file mode 100644 index 000000000..8e44f4f4e --- /dev/null +++ b/.dotnet/tests/Utility/MockPipelineTransport.cs @@ -0,0 +1,37 @@ +using System; +using System.ClientModel.Primitives; +using System.Threading.Tasks; + +#nullable enable + +namespace OpenAI.Tests; + +public class MockPipelineTransport : PipelineTransport +{ + public MockPipelineRequest MockRequest { get; set; } + public MockPipelineResponse MockResponse { get; set; } + + public MockPipelineTransport(BinaryData requestData, BinaryData responseData) + { + MockRequest = new MockPipelineRequest(requestData); + MockResponse = new MockPipelineResponse(200); + MockResponse.SetContent(responseData.ToArray(), bufferImmediately: true); + } + + protected override PipelineMessage CreateMessageCore() + { + return new MockPipelineMessage(MockRequest); + } + + protected override void ProcessCore(PipelineMessage message) + { + (message as MockPipelineMessage)!.SetResponse(MockResponse); + } + + protected override ValueTask ProcessCoreAsync(PipelineMessage message) + { + (message as MockPipelineMessage)!.SetResponse(MockResponse); + return ValueTask.CompletedTask; + } +} + diff --git a/.dotnet/tests/Utility/SyncAsyncTestBase.cs b/.dotnet/tests/Utility/SyncAsyncTestBase.cs new file mode 100644 index 000000000..53036732e --- /dev/null +++ b/.dotnet/tests/Utility/SyncAsyncTestBase.cs @@ -0,0 +1,12 @@ +namespace OpenAI.Tests.Utility +{ + public class SyncAsyncTestBase + { + public bool IsAsync { get; } + + public SyncAsyncTestBase(bool isAsync) + { + IsAsync = isAsync; + } + } +} diff --git a/.dotnet/tests/Utility/TestHelpers.cs b/.dotnet/tests/Utility/TestHelpers.cs new file mode 100644 index 000000000..9fb30f472 --- /dev/null +++ b/.dotnet/tests/Utility/TestHelpers.cs @@ -0,0 +1,110 @@ +using NUnit.Framework; +using 
OpenAI.Assistants; +using OpenAI.Audio; +using OpenAI.Batch; +using OpenAI.Chat; +using OpenAI.Embeddings; +using OpenAI.Files; +using OpenAI.Images; +using OpenAI.Models; +using OpenAI.Moderations; +using OpenAI.VectorStores; +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.IO; +using System.Linq; + +[assembly: LevelOfParallelism(8)] + +namespace OpenAI.Tests; + +internal static class TestHelpers +{ + public enum TestScenario + { + Assistants, + Audio_TTS, + Audio_Whisper, + Batch, + Chat, + Embeddings, + Files, + FineTuning, + Images, + LegacyCompletions, + Models, + Moderations, + VectorStores, + TopLevel, + } + + public static OpenAIClient GetTestTopLevelClient() => GetTestClient<OpenAIClient>(TestScenario.TopLevel); + + public static T GetTestClient<T>(TestScenario scenario, string overrideModel = null) + { + OpenAIClientOptions options = new(); + ApiKeyCredential credential = new(Environment.GetEnvironmentVariable("OPENAI_API_KEY")); + options.AddPolicy(GetDumpPolicy(), PipelinePosition.PerTry); + object clientObject = scenario switch + { +#pragma warning disable OPENAI001 + TestScenario.Assistants => new AssistantClient(credential, options), +#pragma warning restore OPENAI001 + TestScenario.Audio_TTS => new AudioClient(overrideModel ?? "tts-1", credential, options), + TestScenario.Audio_Whisper => new AudioClient(overrideModel ?? "whisper-1", credential, options), + TestScenario.Batch => new BatchClient(credential, options), + TestScenario.Chat => new ChatClient(overrideModel ?? "gpt-4o-mini", credential, options), + TestScenario.Embeddings => new EmbeddingClient(overrideModel ?? "text-embedding-3-small", credential, options), + TestScenario.Files => new FileClient(credential, options), + TestScenario.Images => new ImageClient(overrideModel ??
"dall-e-3", credential, options), + TestScenario.Models => new ModelClient(credential, options), + TestScenario.Moderations => new ModerationClient(overrideModel ?? "text-moderation-stable", credential, options), +#pragma warning disable OPENAI001 + TestScenario.VectorStores => new VectorStoreClient(credential, options), +#pragma warning restore OPENAI001 + TestScenario.TopLevel => new OpenAIClient(credential, options), + _ => throw new NotImplementedException(), + }; + return (T)clientObject; + } + + private static PipelinePolicy GetDumpPolicy() + { + return new TestPipelinePolicy((message) => + { + Console.WriteLine($"--- New request ---"); + IEnumerable<string> headerPairs = message?.Request?.Headers?.Select(header => $"{header.Key}={(header.Key.ToLower().Contains("auth") ? "***" : header.Value)}"); + string headers = string.Join(',', headerPairs); + Console.WriteLine($"Headers: {headers}"); + Console.WriteLine($"{message?.Request?.Method} URI: {message?.Request?.Uri}"); + if (message.Request?.Content != null) + { + string contentType = "Unknown Content Type"; + if (message.Request.Headers?.TryGetValue("Content-Type", out contentType) == true + && contentType == "application/json") + { + using MemoryStream stream = new(); + message.Request.Content.WriteTo(stream, default); + stream.Position = 0; + using StreamReader reader = new(stream); + Console.WriteLine(reader.ReadToEnd()); + } + else + { + string length = message.Request.Content.TryComputeLength(out long numberLength) + ?
$"{numberLength} bytes" + : "unknown length"; + Console.WriteLine($"<< Non-JSON content: {contentType} >> {length}"); + } + } + if (message.Response != null) + { + Console.WriteLine("--- Begin response content ---"); + Console.WriteLine(message.Response.Content?.ToString()); + Console.WriteLine("--- End of response content ---"); + } + }); + } +} \ No newline at end of file diff --git a/.dotnet/tests/Utility/TestPipelinePolicy.cs b/.dotnet/tests/Utility/TestPipelinePolicy.cs new file mode 100644 index 000000000..688c407ed --- /dev/null +++ b/.dotnet/tests/Utility/TestPipelinePolicy.cs @@ -0,0 +1,35 @@ +using System; +using System.ClientModel; +using System.ClientModel.Primitives; +using System.Collections.Generic; +using System.Threading.Tasks; + +namespace OpenAI.Tests; + +internal partial class TestPipelinePolicy : PipelinePolicy +{ + private Action<PipelineMessage> _processMessageAction; + + public TestPipelinePolicy(Action<PipelineMessage> processMessageAction) + { + _processMessageAction = processMessageAction; + } + + public override void Process(PipelineMessage message, IReadOnlyList<PipelinePolicy> pipeline, int currentIndex) + { + _processMessageAction(message); + if (currentIndex < pipeline.Count - 1) + { + pipeline[currentIndex + 1].Process(message, pipeline, currentIndex + 1); + } + } + + public override async ValueTask ProcessAsync(PipelineMessage message, IReadOnlyList<PipelinePolicy> pipeline, int currentIndex) + { + _processMessageAction(message); + if (currentIndex < pipeline.Count - 1) + { + await pipeline[currentIndex + 1].ProcessAsync(message, pipeline, currentIndex + 1); + } + } +} \ No newline at end of file diff --git a/.github/README.md b/.github/README.md new file mode 100644 index 000000000..df5b3494c --- /dev/null +++ b/.github/README.md @@ -0,0 +1,17 @@ +The workflows in this repository follow existing, basic samples with little customization.
+ +## main.yml +We use a basic dotnet build/test/pack workflow +https://docs.github.com/en/actions/automating-builds-and-tests/building-and-testing-net + +- Build the solution using the dotnet cli + - Strong name the assemblies using a key stored in the repository + https://github.com/dotnet/runtime/blob/main/docs/project/strong-name-signing.md +- Test the built libraries + - Use a repository secret to hold the OpenAI token used for live testing + https://docs.github.com/en/actions/security-guides/using-secrets-in-github-actions +- Package the built libraries +- Publish the package as a GitHub Release +- Publish the package to a GitHub NuGet registry + https://docs.github.com/en/packages/working-with-a-github-packages-registry/working-with-the-nuget-registry +- Publish a single build artifact containing test results and a NuGet package \ No newline at end of file diff --git a/.github/workflows/main.yml b/.github/workflows/main.yml index bbe92ae53..ba2c7d825 100644 --- a/.github/workflows/main.yml +++ b/.github/workflows/main.yml @@ -1,27 +1,92 @@ -name: Validate OpenAPI definition +name: Build and Test + on: + workflow_dispatch: + push: + branches: + - main pull_request: types: [opened, reopened, synchronize] jobs: - test_swagger_editor_validator_service: + build: # Test, pack and publish the OpenAI NuGet package as a build artifact runs-on: ubuntu-latest - name: Swagger Editor Validator Service - - # Service containers to run with `runner-job` - services: - # Label used to access the service container - swagger-editor: - # Docker Hub image - image: swaggerapi/swagger-editor - ports: - # Maps port 8080 on service container to the host 80 - - 80:8080 + env: + version_suffix_args: ${{ format('--version-suffix="alpha.{0}"', github.run_number) }} + steps: + - name: Setup .NET + uses: actions/setup-dotnet@v1 + with: + dotnet-version: 8.x + + - name: Checkout code + uses: actions/checkout@v2 + + - name: Build + run: dotnet build + -c Release + ${{
env.version_suffix_args }} + working-directory: .dotnet + + - name: Test + run: dotnet test + --no-build + --configuration Release + --filter="TestCategory~${{ github.event_name == 'pull_request' && 'Offline' || 'Online' }}|TestCategory~smoke" + --logger "trx;LogFileName=${{github.workspace}}/artifacts/test-results/full.trx" + env: + SECRET_VALUE: ${{ secrets.OPENAI_TOKEN }} + working-directory: .dotnet + + - name: Run additional code checks + shell: pwsh + run: ./Run-Checks.ps1 + working-directory: .scripts + + - name: Pack + run: dotnet pack + --no-build + --configuration Release + --output "${{github.workspace}}/artifacts/packages" + ${{ env.version_suffix_args }} + working-directory: .dotnet + - name: Upload artifact + uses: actions/upload-artifact@v2 + with: + name: build-artifacts + path: ${{github.workspace}}/artifacts + + - name: NuGet Authenticate + if: github.event_name != 'pull_request' + run: dotnet nuget add source + "https://nuget.pkg.github.com/${{ github.repository_owner }}/index.json" + --name "github" + --username ${{ github.actor }} + --password ${{ secrets.GITHUB_TOKEN }} + --store-password-in-clear-text + working-directory: .dotnet + + - name: Publish + if: github.event_name != 'pull_request' + run: dotnet nuget push + ${{github.workspace}}/artifacts/packages/*.nupkg + --source "github" + --api-key ${{ secrets.GITHUB_TOKEN }} + --skip-duplicate + working-directory: .dotnet + + azure_build: # Development mirror only; validate AOAI compilation + runs-on: ubuntu-latest steps: - - uses: actions/checkout@v2 - - name: Validate OpenAPI definition - uses: char0n/swagger-editor-validate@v1 + - name: Setup .NET + uses: actions/setup-dotnet@v1 with: - swagger-editor-url: http://localhost/ - definition-file: openapi.yaml + dotnet-version: 8.x + + - name: Checkout code + uses: actions/checkout@v2 + + - name: Build + run: dotnet build + working-directory: .dotnet.azure \ No newline at end of file diff --git a/.github/workflows/release.yml
b/.github/workflows/release.yml new file mode 100644 index 000000000..8e25a5651 --- /dev/null +++ b/.github/workflows/release.yml @@ -0,0 +1,74 @@ +name: Release package + +on: + release: + types: [published] + + +jobs: + deploy: + runs-on: ubuntu-latest + permissions: + packages: write + contents: write + steps: + - uses: actions/checkout@v4 + - uses: actions/setup-dotnet@v3 + with: + dotnet-version: '8.x' # SDK Version to use. + + - name: Build + run: dotnet build + -c Release + working-directory: .dotnet + + - name: Test + run: dotnet test + --no-build + --configuration Release + --filter="TestCategory~Online" + --logger "trx;LogFileName=${{github.workspace}}/artifacts/test-results/full.trx" + env: + SECRET_VALUE: ${{ secrets.OPENAI_TOKEN }} + working-directory: .dotnet + + # Pack the client NuGet package and include URLs back to the repository and release tag + - name: Pack + run: dotnet pack + --no-build + --configuration Release + --output "${{github.workspace}}/artifacts/packages" + /p:RepositoryUrl="https://github.com/${{ github.repository }}" + /p:PackageProjectUrl="https://github.com/${{ github.repository }}/tree/${{ github.event.release.tag_name }}" + working-directory: .dotnet + + # Append the NuGet package to the GitHub release that triggered this workflow + - name: Upload release asset + run: gh release upload ${{ github.event.release.tag_name }} + ${{github.workspace}}/artifacts/packages/*.nupkg + env: + GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} + + - name: Upload artifact + uses: actions/upload-artifact@v2 + with: + name: build-artifacts + path: ${{github.workspace}}/artifacts + + - name: NuGet Authenticate + run: dotnet nuget add source + "https://nuget.pkg.github.com/${{ github.repository_owner }}/index.json" + --name "github" + --username ${{ github.actor }} + --password ${{ secrets.GITHUB_TOKEN }} + --store-password-in-clear-text + working-directory: .dotnet + + - name: Publish + run: dotnet nuget push +
${{github.workspace}}/artifacts/packages/*.nupkg + --source "github" + --api-key ${{ secrets.GITHUB_TOKEN }} + --skip-duplicate + working-directory: .dotnet + diff --git a/.gitignore b/.gitignore new file mode 100644 index 000000000..f71e3dc8e --- /dev/null +++ b/.gitignore @@ -0,0 +1,179 @@ +## Ignore Visual Studio temporary files, build results, and +## files generated by popular Visual Studio add-ons. + +# User-specific files +*.suo +*.user +*.sln.docstates +.vs/ +*.lock.json +developer/ +launch.json +launchSettings.json + +# Default Assets restore directory +.assets + +# Build results +/artifacts +binaries/ +[Dd]ebug*/ +[Rr]elease/ +build/ +restoredPackages/ +PolicheckOutput/ +tools/net46/ +tools/SdkBuildTools/ +tools/Microsoft.WindowsAzure.Build.Tasks/packages/ +PublishedNugets/ +src/NuGet.Config +tools/7-zip/ +#tools/LocalNugetFeed/Microsoft.Internal.NetSdkBuild.Mgmt.Tools.*.nupkg + +[Tt]est[Rr]esult +[Bb]uild[Ll]og.* + +*_i.c +*_p.c +*.ilk +*.meta +*.obj +*.pch +*.pdb +*.pgc +*.pgd +*.rsp +*.sbr +*.tlb +*.tli +*.tlh +*.tmp +*.vspscc +*.vssscc +.builds + +*.pidb + +*.log +*.scc +# Visual C++ cache files +ipch/ +*.aps +*.ncb +*.opensdf +*.sdf + +# Visual Studio profiler +*.psess +*.vsp + +# VS Code +**/.vscode/* +!.vscode/cspell.json + +# Code analysis +*.CodeAnalysisLog.xml + +# Guidance Automation Toolkit +*.gpState + +# ReSharper is a .NET coding add-in +_ReSharper*/ + +*.[Rr]e[Ss]harper + +# Rider IDE +.idea + +# NCrunch +*.ncrunch* +.*crunch*.local.xml + +# Installshield output folder +[Ee]xpress + +# DocProject is a documentation generator add-in +DocProject/buildhelp/ +DocProject/Help/*.HxT +DocProject/Help/*.HxC +DocProject/Help/*.hhc +DocProject/Help/*.hhk +DocProject/Help/*.hhp +DocProject/Help/Html2 +DocProject/Help/html + +# Click-Once directory +publish + +# Publish Web Output +*.Publish.xml + +# Others +[Bb]in +[Oo]bj +TestResults +[Tt]est[Rr]esult* +*.Cache +ClientBin +~$* +*.dbmdl + +*.[Pp]ublish.xml + +Generated_Code #added for RIA/Silverlight 
projects + +# Build tasks +tools/*.dll + +# Sensitive files +*.keys +!Azure.Extensions.AspNetCore.DataProtection.Keys +!Azure.Security.KeyVault.Keys +*.pfx +TestConfigurations.xml +*.json.env +*.bicep.env + +# Backup & report files from converting an old project file to a newer +# Visual Studio version. Backup files are not needed, because we have git ;-) +_UpgradeReport_Files/ +Backup*/ +UpgradeLog*.XML + +# NuGet +packages +packages/repositories.config +testPackages + +# Mac development +.DS_Store + +# Specification DLLs +*.Specification.dll + +# Generated readme.txt files # +src/*/readme.txt + +build.out +.nuget/ + +# Azure Project +csx/ +*.GhostDoc.xml +pingme.txt + +# TS/Node files +dist/ +node_modules/ + +# MSBuild binary log files +msbuild.binlog + +# BenchmarkDotNet +BenchmarkDotNet.Artifacts + +artifacts +.assets + +# Temporary typespec folders for typespec generation +TempTypeSpecFiles/ diff --git a/.openapi3.azure/openapi3-azure-openai.yaml b/.openapi3.azure/openapi3-azure-openai.yaml new file mode 100644 index 000000000..94c0b0d77 --- /dev/null +++ b/.openapi3.azure/openapi3-azure-openai.yaml @@ -0,0 +1,2727 @@ +openapi: 3.0.0 +info: + title: Azure OpenAI Service + version: 0.0.0 +tags: + - name: Chat + - name: Images + - name: Assistants +paths: + /chat/completions: + post: + tags: + - Chat + operationId: createChatCompletion + parameters: [] + responses: + '200': + description: The request has succeeded. 
+ content: + application/json: + schema: + anyOf: + - $ref: '#/components/schemas/AzureCreateChatCompletionResponse' + - $ref: '#/components/schemas/AzureOpenAIChatErrorResponse' + requestBody: + required: true + content: + application/json: + schema: + $ref: '#/components/schemas/AzureCreateChatCompletionRequest' + /deployments/{deploymentId}/images/generations: + post: + tags: + - Images + operationId: ImageGenerations_Create + parameters: + - name: deploymentId + in: path + required: true + schema: + type: string + responses: + '200': + description: The request has succeeded. + content: + application/json: + schema: + anyOf: + - $ref: '#/components/schemas/OpenAI.ImagesResponse' + - $ref: '#/components/schemas/AzureOpenAIDalleErrorResponse' + requestBody: + required: true + content: + application/json: + schema: + $ref: '#/components/schemas/OpenAI.CreateImageRequest' + /threads/{thread_id}/messages: + post: + tags: + - Assistants + operationId: createMessage + summary: Create a message. + parameters: + - name: thread_id + in: path + required: true + description: The ID of the [thread](/docs/api-reference/threads) to create a message for. + schema: + type: string + responses: + '200': + description: The request has succeeded. + content: + application/json: + schema: + anyOf: + - $ref: '#/components/schemas/OpenAI.MessageObject' + - $ref: '#/components/schemas/OpenAI.ErrorResponse' + requestBody: + required: true + content: + application/json: + schema: + $ref: '#/components/schemas/OpenAI.CreateMessageRequest' +security: + - ApiKeyAuth: [] + - OAuth2Auth: + - https://cognitiveservices.azure.com/.default +components: + schemas: + AzureChatCompletionResponseMessage: + type: object + properties: + context: + allOf: + - $ref: '#/components/schemas/AzureChatMessageContext' + description: The Azure-specific context information associated with the chat completion response message. 
+ allOf: + - $ref: '#/components/schemas/OpenAI.ChatCompletionResponseMessage' + description: |- + The extended response model component for chat completion response messages on the Azure OpenAI service. + This model adds support for chat message context, used by the On Your Data feature for intent, citations, and other + information related to retrieval-augmented generation performed. + AzureChatCompletionStreamResponseDelta: + type: object + properties: + context: + allOf: + - $ref: '#/components/schemas/AzureChatMessageContext' + description: The Azure-specific context information associated with the chat completion response message. + allOf: + - $ref: '#/components/schemas/OpenAI.ChatCompletionStreamResponseDelta' + description: |- + The extended response model for a streaming chat response message on the Azure OpenAI service. + This model adds support for chat message context, used by the On Your Data feature for intent, citations, and other + information related to retrieval-augmented generation performed. + AzureChatDataSource: + type: object + required: + - type + properties: + type: + type: string + description: The differentiating type identifier for the data source. + discriminator: + propertyName: type + mapping: + azure_search: '#/components/schemas/AzureSearchChatDataSource' + azure_ml_index: '#/components/schemas/AzureMachineLearningIndexChatDataSource' + azure_cosmos_db: '#/components/schemas/AzureCosmosDBChatDataSource' + elasticsearch: '#/components/schemas/ElasticsearchChatDataSource' + pinecone: '#/components/schemas/PineconeChatDataSource' + description: |- + A representation of configuration data for a single Azure OpenAI chat data source. + This will be used by a chat completions request that should use Azure OpenAI chat extensions to augment the + response behavior. + The use of this configuration is compatible only with Azure OpenAI. 
+ AzureChatDataSourceAccessTokenAuthenticationOptions: + type: object + required: + - type + - access_token + properties: + type: + type: string + enum: + - access_token + access_token: + type: string + allOf: + - $ref: '#/components/schemas/AzureChatDataSourceAuthenticationOptions' + AzureChatDataSourceApiKeyAuthenticationOptions: + type: object + required: + - type + - key + properties: + type: + type: string + enum: + - api_key + key: + type: string + allOf: + - $ref: '#/components/schemas/AzureChatDataSourceAuthenticationOptions' + AzureChatDataSourceAuthenticationOptions: + type: object + required: + - type + properties: + type: + type: string + discriminator: + propertyName: type + mapping: + connection_string: '#/components/schemas/AzureChatDataSourceConnectionStringAuthenticationOptions' + key_and_key_id: '#/components/schemas/AzureChatDataSourceKeyAndKeyIdAuthenticationOptions' + encoded_api_key: '#/components/schemas/AzureChatDataSourceEncodedApiKeyAuthenticationOptions' + access_token: '#/components/schemas/AzureChatDataSourceAccessTokenAuthenticationOptions' + system_assigned_managed_identity: '#/components/schemas/AzureChatDataSourceSystemAssignedManagedIdentityAuthenticationOptions' + user_assigned_managed_identity: '#/components/schemas/AzureChatDataSourceUserAssignedManagedIdentityAuthenticationOptions' + AzureChatDataSourceConnectionStringAuthenticationOptions: + type: object + required: + - type + - connection_string + properties: + type: + type: string + enum: + - connection_string + connection_string: + type: string + allOf: + - $ref: '#/components/schemas/AzureChatDataSourceAuthenticationOptions' + AzureChatDataSourceDeploymentNameVectorizationSource: + type: object + required: + - type + - deployment_name + properties: + type: + type: string + enum: + - deployment_name + description: The type identifier, always 'deployment_name' for this vectorization source type. 
+ deployment_name: + type: string + description: |- + The embedding model deployment to use for vectorization. This deployment must exist within the same Azure OpenAI + resource as the model deployment being used for chat completions. + dimensions: + type: integer + format: int32 + description: |- + The number of dimensions to request on embeddings. + Only supported in 'text-embedding-3' and later models. + allOf: + - $ref: '#/components/schemas/AzureChatDataSourceVectorizationSource' + description: |- + Represents a vectorization source that makes internal service calls against an Azure OpenAI embedding model + deployment. In contrast with the endpoint-based vectorization source, a deployment-name-based vectorization source + must be part of the same Azure OpenAI resource but can be used even in private networks. + AzureChatDataSourceEncodedApiKeyAuthenticationOptions: + type: object + required: + - type + - encoded_api_key + properties: + type: + type: string + enum: + - encoded_api_key + encoded_api_key: + type: string + allOf: + - $ref: '#/components/schemas/AzureChatDataSourceAuthenticationOptions' + AzureChatDataSourceEndpointVectorizationSource: + type: object + required: + - type + - endpoint + - authentication + properties: + type: + type: string + enum: + - endpoint + description: The type identifier, always 'endpoint' for this vectorization source type. + endpoint: + type: string + format: uri + description: |- + Specifies the resource endpoint URL from which embeddings should be retrieved. + It should be in the format of: + https://YOUR_RESOURCE_NAME.openai.azure.com/openai/deployments/YOUR_DEPLOYMENT_NAME/embeddings. + The api-version query parameter is not allowed. 
+ authentication: + anyOf: + - $ref: '#/components/schemas/AzureChatDataSourceApiKeyAuthenticationOptions' + - $ref: '#/components/schemas/AzureChatDataSourceAccessTokenAuthenticationOptions' + description: |- + The authentication mechanism to use with the endpoint-based vectorization source. + Endpoint authentication supports API key and access token mechanisms. + dimensions: + type: integer + format: int32 + description: |- + The number of dimensions to request on embeddings. + Only supported in 'text-embedding-3' and later models. + allOf: + - $ref: '#/components/schemas/AzureChatDataSourceVectorizationSource' + description: Represents a vectorization source that makes public service calls against an Azure OpenAI embedding model deployment. + AzureChatDataSourceKeyAndKeyIdAuthenticationOptions: + type: object + required: + - type + - key + - key_id + properties: + type: + type: string + enum: + - key_and_key_id + key: + type: string + key_id: + type: string + allOf: + - $ref: '#/components/schemas/AzureChatDataSourceAuthenticationOptions' + AzureChatDataSourceModelIdVectorizationSource: + type: object + required: + - type + - model_id + properties: + type: + type: string + enum: + - model_id + description: The type identifier, always 'model_id' for this vectorization source type. + model_id: + type: string + description: The embedding model build ID to use for vectorization. + allOf: + - $ref: '#/components/schemas/AzureChatDataSourceVectorizationSource' + description: |- + Represents a vectorization source that makes service calls based on a search service model ID. + This source type is currently only supported by Elasticsearch. 
+ AzureChatDataSourceSystemAssignedManagedIdentityAuthenticationOptions: + type: object + required: + - type + properties: + type: + type: string + enum: + - system_assigned_managed_identity + allOf: + - $ref: '#/components/schemas/AzureChatDataSourceAuthenticationOptions' + AzureChatDataSourceUserAssignedManagedIdentityAuthenticationOptions: + type: object + required: + - type + - managed_identity_resource_id + properties: + type: + type: string + enum: + - user_assigned_managed_identity + managed_identity_resource_id: + type: string + allOf: + - $ref: '#/components/schemas/AzureChatDataSourceAuthenticationOptions' + AzureChatDataSourceVectorizationSource: + type: object + required: + - type + properties: + type: + type: string + description: The differentiating identifier for the concrete vectorization source. + discriminator: + propertyName: type + mapping: + deployment_name: '#/components/schemas/AzureChatDataSourceDeploymentNameVectorizationSource' + model_id: '#/components/schemas/AzureChatDataSourceModelIdVectorizationSource' + description: A representation of a data vectorization source usable as an embedding resource with a data source. + AzureChatMessageContext: + type: object + properties: + intent: + type: string + description: The detected intent from the chat history, which is used to carry conversation context between interactions + citations: + type: array + items: + type: object + properties: + content: + type: string + description: The content of the citation. + title: + type: string + description: The title for the citation. + url: + type: string + description: The URL of the citation. + filepath: + type: string + description: The file path for the citation. + chunk_id: + type: string + description: The chunk ID for the citation. + required: + - content + description: The citations produced by the data retrieval. + all_retrieved_documents: + type: object + properties: + content: + type: string + description: The content of the citation. 
+ title: + type: string + description: The title for the citation. + url: + type: string + description: The URL of the citation. + filepath: + type: string + description: The file path for the citation. + chunk_id: + type: string + description: The chunk ID for the citation. + search_queries: + type: array + items: + type: string + description: The search queries executed to retrieve documents. + data_source_index: + type: integer + format: int32 + description: The index of the data source used for retrieval. + original_search_score: + type: number + format: double + description: The original search score for the retrieval. + rerank_score: + type: number + format: double + description: The rerank score for the retrieval. + filter_reason: + type: string + enum: + - score + - rerank + description: If applicable, an indication of why the document was filtered. + required: + - content + - search_queries + - data_source_index + description: Summary information about documents retrieved by the data retrieval operation. + description: |- + An additional property, added to chat completion response messages, produced by the Azure OpenAI service when using + extension behavior. This includes intent and citation information from the On Your Data feature. + AzureContentFilterBlocklistIdResult: + type: object + required: + - id + - filtered + properties: + id: + type: string + description: The ID of the custom blocklist associated with the filtered status. + filtered: + type: boolean + description: Whether the associated blocklist resulted in the content being filtered. + description: |- + A content filter result item that associates an existing custom blocklist ID with a value indicating whether or not + the corresponding blocklist resulted in content being filtered. 
+    AzureContentFilterBlocklistResult:
+      type: object
+      required:
+        - filtered
+      properties:
+        filtered:
+          type: boolean
+          description: A value indicating whether any of the detailed blocklists resulted in a filtering action.
+        details:
+          type: array
+          items:
+            type: object
+            properties:
+              filtered:
+                type: boolean
+                description: A value indicating whether the blocklist produced a filtering action.
+              id:
+                type: string
+                description: The ID of the custom blocklist evaluated.
+            required:
+              - filtered
+              - id
+          description: The pairs of individual blocklist IDs and whether they resulted in a filtering action.
+      description: A collection of true/false filtering results for configured custom blocklists.
+    AzureContentFilterDetectionResult:
+      type: object
+      required:
+        - filtered
+        - detected
+      properties:
+        filtered:
+          type: boolean
+          description: Whether the content detection resulted in a content filtering action.
+        detected:
+          type: boolean
+          description: Whether the labeled content category was detected in the content.
+      description: |-
+        A labeled content filter result item that indicates whether the content was detected and whether the content was
+        filtered.
+    AzureContentFilterImagePromptResults:
+      type: object
+      required:
+        - jailbreak
+      properties:
+        profanity:
+          allOf:
+            - $ref: '#/components/schemas/AzureContentFilterDetectionResult'
+          description: |-
+            A detection result that identifies whether crude, vulgar, or otherwise objectionable language is present in the
+            content.
+        custom_blocklists:
+          allOf:
+            - $ref: '#/components/schemas/AzureContentFilterBlocklistResult'
+          description: A collection of binary filtering outcomes for configured custom blocklists.
+        jailbreak:
+          allOf:
+            - $ref: '#/components/schemas/AzureContentFilterDetectionResult'
+          description: |-
+            A detection result that describes user prompt injection attacks, where malicious users deliberately exploit
+            system vulnerabilities to elicit unauthorized behavior from the LLM.
This could lead to inappropriate content
+            generation or violations of system-imposed restrictions.
+      allOf:
+        - $ref: '#/components/schemas/AzureContentFilterImageResponseResults'
+      description: A content filter result for an image generation operation's input request content.
+    AzureContentFilterImageResponseResults:
+      type: object
+      properties:
+        sexual:
+          allOf:
+            - $ref: '#/components/schemas/AzureContentFilterSeverityResult'
+          description: |-
+            A content filter category for language related to anatomical organs and genitals, romantic relationships, acts
+            portrayed in erotic or affectionate terms, pregnancy, physical sexual acts, including those portrayed as an
+            assault or a forced sexual violent act against one's will, prostitution, pornography, and abuse.
+        violence:
+          allOf:
+            - $ref: '#/components/schemas/AzureContentFilterSeverityResult'
+          description: |-
+            A content filter category for language related to physical actions intended to hurt, injure, damage, or kill
+            someone or something; describes weapons, guns and related entities, such as manufacturers, associations,
+            legislation, and so on.
+        hate:
+          allOf:
+            - $ref: '#/components/schemas/AzureContentFilterSeverityResult'
+          description: |-
+            A content filter category that can refer to any content that attacks or uses pejorative or discriminatory
+            language with reference to a person or identity group based on certain differentiating attributes of these groups
+            including but not limited to race, ethnicity, nationality, gender identity and expression, sexual orientation,
+            religion, immigration status, ability status, personal appearance, and body size.
+        self_harm:
+          allOf:
+            - $ref: '#/components/schemas/AzureContentFilterSeverityResult'
+          description: |-
+            A content filter category that describes language related to physical actions intended to purposely hurt, injure,
+            damage one's body or kill oneself.
+      description: A content filter result for an image generation operation's output response content.
+    AzureContentFilterResultForChoice:
+      type: object
+      properties:
+        sexual:
+          allOf:
+            - $ref: '#/components/schemas/AzureContentFilterSeverityResult'
+          description: |-
+            A content filter category for language related to anatomical organs and genitals, romantic relationships, acts
+            portrayed in erotic or affectionate terms, pregnancy, physical sexual acts, including those portrayed as an
+            assault or a forced sexual violent act against one's will, prostitution, pornography, and abuse.
+        hate:
+          allOf:
+            - $ref: '#/components/schemas/AzureContentFilterSeverityResult'
+          description: |-
+            A content filter category that can refer to any content that attacks or uses pejorative or discriminatory
+            language with reference to a person or identity group based on certain differentiating attributes of these groups
+            including but not limited to race, ethnicity, nationality, gender identity and expression, sexual orientation,
+            religion, immigration status, ability status, personal appearance, and body size.
+        violence:
+          allOf:
+            - $ref: '#/components/schemas/AzureContentFilterSeverityResult'
+          description: |-
+            A content filter category for language related to physical actions intended to hurt, injure, damage, or kill
+            someone or something; describes weapons, guns and related entities, such as manufacturers, associations,
+            legislation, and so on.
+        self_harm:
+          allOf:
+            - $ref: '#/components/schemas/AzureContentFilterSeverityResult'
+          description: |-
+            A content filter category that describes language related to physical actions intended to purposely hurt, injure,
+            damage one's body or kill oneself.
+        profanity:
+          allOf:
+            - $ref: '#/components/schemas/AzureContentFilterDetectionResult'
+          description: |-
+            A detection result that identifies whether crude, vulgar, or otherwise objectionable language is present in the
+            content.
+ custom_blocklists: + allOf: + - $ref: '#/components/schemas/AzureContentFilterBlocklistResult' + description: A collection of binary filtering outcomes for configured custom blocklists. + error: + type: object + properties: + code: + type: integer + format: int32 + description: A distinct, machine-readable code associated with the error. + message: + type: string + description: A human-readable message associated with the error. + required: + - code + - message + description: If present, details about an error that prevented content filtering from completing its evaluation. + protected_material_text: + allOf: + - $ref: '#/components/schemas/AzureContentFilterDetectionResult' + description: A detection result that describes a match against text protected under copyright or other status. + protected_material_code: + type: object + properties: + filtered: + type: boolean + description: Whether the content detection resulted in a content filtering action. + detected: + type: boolean + description: Whether the labeled content category was detected in the content. + citation: + type: object + properties: + license: + type: string + description: The name or identifier of the license associated with the detection. + URL: + type: string + format: uri + description: The URL associated with the license. + description: If available, the citation details describing the associated license and its location. + required: + - filtered + - detected + description: A detection result that describes a match against licensed code or other protected source material. + description: A content filter result for a single response item produced by a generative AI system. + AzureContentFilterResultForPrompt: + type: object + properties: + prompt_index: + type: integer + format: int32 + description: The index of the input prompt associated with the accompanying content filter result categories. 
+        content_filter_results:
+          type: object
+          properties:
+            sexual:
+              allOf:
+                - $ref: '#/components/schemas/AzureContentFilterSeverityResult'
+              description: |-
+                A content filter category for language related to anatomical organs and genitals, romantic relationships, acts
+                portrayed in erotic or affectionate terms, pregnancy, physical sexual acts, including those portrayed as an
+                assault or a forced sexual violent act against one's will, prostitution, pornography, and abuse.
+            hate:
+              allOf:
+                - $ref: '#/components/schemas/AzureContentFilterSeverityResult'
+              description: |-
+                A content filter category that can refer to any content that attacks or uses pejorative or discriminatory
+                language with reference to a person or identity group based on certain differentiating attributes of these groups
+                including but not limited to race, ethnicity, nationality, gender identity and expression, sexual orientation,
+                religion, immigration status, ability status, personal appearance, and body size.
+            violence:
+              allOf:
+                - $ref: '#/components/schemas/AzureContentFilterSeverityResult'
+              description: |-
+                A content filter category for language related to physical actions intended to hurt, injure, damage, or kill
+                someone or something; describes weapons, guns and related entities, such as manufacturers, associations,
+                legislation, and so on.
+            self_harm:
+              allOf:
+                - $ref: '#/components/schemas/AzureContentFilterSeverityResult'
+              description: |-
+                A content filter category that describes language related to physical actions intended to purposely hurt, injure,
+                damage one's body or kill oneself.
+            profanity:
+              allOf:
+                - $ref: '#/components/schemas/AzureContentFilterDetectionResult'
+              description: |-
+                A detection result that identifies whether crude, vulgar, or otherwise objectionable language is present in the
+                content.
+ custom_blocklists: + allOf: + - $ref: '#/components/schemas/AzureContentFilterBlocklistResult' + description: A collection of binary filtering outcomes for configured custom blocklists. + error: + type: object + properties: + code: + type: integer + format: int32 + description: A distinct, machine-readable code associated with the error. + message: + type: string + description: A human-readable message associated with the error. + required: + - code + - message + description: If present, details about an error that prevented content filtering from completing its evaluation. + jailbreak: + allOf: + - $ref: '#/components/schemas/AzureContentFilterDetectionResult' + description: |- + A detection result that describes user prompt injection attacks, where malicious users deliberately exploit + system vulnerabilities to elicit unauthorized behavior from the LLM. This could lead to inappropriate content + generation or violations of system-imposed restrictions. + indirect_attack: + allOf: + - $ref: '#/components/schemas/AzureContentFilterDetectionResult' + description: |- + A detection result that describes attacks on systems powered by Generative AI models that can happen every time + an application processes information that wasn’t directly authored by either the developer of the application or + the user. + required: + - jailbreak + - indirect_attack + description: The content filter category details for the result. + description: A content filter result associated with a single input prompt item into a generative AI system. + AzureContentFilterSeverityResult: + type: object + required: + - filtered + - severity + properties: + filtered: + type: boolean + description: Whether the content severity resulted in a content filtering action. + severity: + type: string + enum: + - safe + - low + - medium + - high + description: The labeled severity of the content. 
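+    # Illustrative example (not part of the service contract): a hypothetical
+    # AzureContentFilterSeverityResult instance as it might appear in a response payload:
+    #   { "filtered": false, "severity": "safe" }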
+ description: |- + A labeled content filter result item that indicates whether the content was filtered and what the qualitative + severity level of the content was, as evaluated against content filter configuration for the category. + AzureCosmosDBChatDataSource: + type: object + required: + - type + - parameters + properties: + type: + type: string + enum: + - azure_cosmos_db + description: The discriminated type identifier, which is always 'azure_cosmos_db'. + parameters: + type: object + properties: + top_n_documents: + type: integer + format: int32 + description: The configured number of documents to feature in the query. + in_scope: + type: boolean + description: Whether queries should be restricted to use of the indexed data. + strictness: + type: integer + format: int32 + minimum: 1 + maximum: 5 + description: |- + The configured strictness of the search relevance filtering. + Higher strictness will increase precision but lower recall of the answer. + role_information: + type: string + description: |- + Additional instructions for the model to inform how it should behave and any context it should reference when + generating a response. You can describe the assistant's personality and tell it how to format responses. + This is limited to 100 tokens and counts against the overall token limit. + max_search_queries: + type: integer + format: int32 + description: |- + The maximum number of rewritten queries that should be sent to the search provider for a single user message. + By default, the system will make an automatic determination. + allow_partial_result: + type: boolean + description: |- + If set to true, the system will allow partial search results to be used and the request will fail if all + partial queries fail. If not specified or specified as false, the request will fail if any search query fails. 
+ default: false + include_contexts: + type: array + items: + type: string + enum: + - citations + - intent + - all_retrieved_documents + maxItems: 3 + description: |- + The output context properties to include on the response. + By default, citations and intent will be requested. + default: + - citations + - intent + container_name: + type: string + database_name: + type: string + embedding_dependency: + $ref: '#/components/schemas/AzureChatDataSourceVectorizationSource' + index_name: + type: string + authentication: + $ref: '#/components/schemas/AzureChatDataSourceConnectionStringAuthenticationOptions' + fields_mapping: + type: object + properties: + content_fields: + type: array + items: + type: string + vector_fields: + type: array + items: + type: string + title_field: + type: string + url_field: + type: string + filepath_field: + type: string + content_fields_separator: + type: string + required: + - content_fields + - vector_fields + required: + - container_name + - database_name + - embedding_dependency + - index_name + - authentication + - fields_mapping + description: The parameter information to control the use of the Azure CosmosDB data source. + allOf: + - $ref: '#/components/schemas/AzureChatDataSource' + description: Represents a data source configuration that will use an Azure CosmosDB resource. + AzureCreateChatCompletionRequest: + type: object + properties: + data_sources: + type: array + items: + $ref: '#/components/schemas/AzureChatDataSource' + description: The data sources to use for the On Your Data feature, exclusive to Azure OpenAI. + allOf: + - $ref: '#/components/schemas/OpenAI.CreateChatCompletionRequest' + description: |- + The extended request model for chat completions against the Azure OpenAI service. + This adds the ability to provide data sources for the On Your Data feature. 
+ AzureCreateChatCompletionResponse: + type: object + properties: + prompt_filter_results: + type: array + items: + type: object + properties: + prompt_index: + type: integer + format: int32 + description: The index of the input prompt that this content filter result corresponds to. + content_filter_results: + allOf: + - $ref: '#/components/schemas/AzureContentFilterResultForPrompt' + description: The content filter results associated with the indexed input prompt. + required: + - prompt_index + - content_filter_results + description: The Responsible AI content filter annotations associated with prompt inputs into chat completions. + allOf: + - $ref: '#/components/schemas/OpenAI.CreateChatCompletionResponse' + description: |- + The extended top-level chat completion response model for the Azure OpenAI service. + This model adds Responsible AI content filter annotations for prompt input. + AzureImage: + type: object + required: + - prompt_filter_results + - content_filter_results + properties: + prompt_filter_results: + $ref: '#/components/schemas/AzureContentFilterImagePromptResults' + content_filter_results: + $ref: '#/components/schemas/AzureContentFilterImageResponseResults' + allOf: + - $ref: '#/components/schemas/OpenAI.Image' + AzureMachineLearningIndexChatDataSource: + type: object + required: + - type + - parameters + properties: + type: + type: string + enum: + - azure_ml_index + description: The discriminated type identifier, which is always 'azure_ml_index'. + parameters: + type: object + properties: + top_n_documents: + type: integer + format: int32 + description: The configured number of documents to feature in the query. + in_scope: + type: boolean + description: Whether queries should be restricted to use of the indexed data. + strictness: + type: integer + format: int32 + minimum: 1 + maximum: 5 + description: |- + The configured strictness of the search relevance filtering. + Higher strictness will increase precision but lower recall of the answer. 
+ role_information: + type: string + description: |- + Additional instructions for the model to inform how it should behave and any context it should reference when + generating a response. You can describe the assistant's personality and tell it how to format responses. + This is limited to 100 tokens and counts against the overall token limit. + max_search_queries: + type: integer + format: int32 + description: |- + The maximum number of rewritten queries that should be sent to the search provider for a single user message. + By default, the system will make an automatic determination. + allow_partial_result: + type: boolean + description: |- + If set to true, the system will allow partial search results to be used and the request will fail if all + partial queries fail. If not specified or specified as false, the request will fail if any search query fails. + default: false + include_contexts: + type: array + items: + type: string + enum: + - citations + - intent + - all_retrieved_documents + maxItems: 3 + description: |- + The output context properties to include on the response. + By default, citations and intent will be requested. + default: + - citations + - intent + authentication: + anyOf: + - $ref: '#/components/schemas/AzureChatDataSourceAccessTokenAuthenticationOptions' + - $ref: '#/components/schemas/AzureChatDataSourceSystemAssignedManagedIdentityAuthenticationOptions' + - $ref: '#/components/schemas/AzureChatDataSourceUserAssignedManagedIdentityAuthenticationOptions' + project_resource_id: + type: string + description: The ID of the Azure Machine Learning index project to use. + name: + type: string + description: The name of the Azure Machine Learning index to use. + version: + type: string + description: The version of the vector index to use. + filter: + type: string + description: A search filter, which is only applicable if the vector index is of the 'AzureSearch' type. 
+          required:
+            - authentication
+            - project_resource_id
+            - name
+            - version
+          description: The parameter information to control the use of the Azure Machine Learning Index data source.
+      allOf:
+        - $ref: '#/components/schemas/AzureChatDataSource'
+      description: Represents a data source configuration that will use an Azure Machine Learning vector index.
+    AzureOpenAIChatError:
+      type: object
+      properties:
+        code:
+          type: string
+          description: The distinct, machine-generated identifier for the error.
+        message:
+          type: string
+          description: A human-readable message associated with the error.
+        param:
+          type: string
+          description: If applicable, the request input parameter associated with the error.
+        type:
+          type: string
+          description: If applicable, the input line number associated with the error.
+        inner_error:
+          type: object
+          properties:
+            code:
+              type: string
+              enum:
+                - ResponsibleAIPolicyViolation
+              description: The code associated with the inner error.
+            revised_prompt:
+              type: string
+              description: If applicable, the modified prompt used for generation.
+            content_filter_results:
+              allOf:
+                - $ref: '#/components/schemas/AzureContentFilterResultForPrompt'
+              description: The content filter result details associated with the inner error.
+          description: If applicable, an upstream error that originated this error.
+      description: The structured representation of an error from an Azure OpenAI chat completion request.
+    AzureOpenAIChatErrorResponse:
+      type: object
+      properties:
+        error:
+          $ref: '#/components/schemas/AzureOpenAIChatError'
+      description: A structured representation of an error from an Azure OpenAI request.
+    AzureOpenAIDalleError:
+      type: object
+      properties:
+        code:
+          type: string
+          description: The distinct, machine-generated identifier for the error.
+        message:
+          type: string
+          description: A human-readable message associated with the error.
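+    # Illustrative example (not part of the service contract): a hypothetical
+    # AzureOpenAIChatErrorResponse payload, with placeholder values:
+    #   { "error": { "code": "<machine-generated code>", "message": "<human-readable message>" } }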
+        param:
+          type: string
+          description: If applicable, the request input parameter associated with the error.
+        type:
+          type: string
+          description: If applicable, the input line number associated with the error.
+        inner_error:
+          type: object
+          properties:
+            code:
+              type: string
+              enum:
+                - ResponsibleAIPolicyViolation
+              description: The code associated with the inner error.
+            revised_prompt:
+              type: string
+              description: If applicable, the modified prompt used for generation.
+            content_filter_results:
+              allOf:
+                - $ref: '#/components/schemas/AzureContentFilterImagePromptResults'
+              description: The content filter result details associated with the inner error.
+          description: If applicable, an upstream error that originated this error.
+      description: The structured representation of an error from an Azure OpenAI image generation request.
+    AzureOpenAIDalleErrorResponse:
+      type: object
+      properties:
+        error:
+          $ref: '#/components/schemas/AzureOpenAIDalleError'
+      description: A structured representation of an error from an Azure OpenAI request.
+    AzureSearchChatDataSource:
+      type: object
+      required:
+        - type
+        - parameters
+      properties:
+        type:
+          type: string
+          enum:
+            - azure_search
+          description: The discriminated type identifier, which is always 'azure_search'.
+        parameters:
+          type: object
+          properties:
+            top_n_documents:
+              type: integer
+              format: int32
+              description: The configured number of documents to feature in the query.
+            in_scope:
+              type: boolean
+              description: Whether queries should be restricted to use of the indexed data.
+            strictness:
+              type: integer
+              format: int32
+              minimum: 1
+              maximum: 5
+              description: |-
+                The configured strictness of the search relevance filtering.
+                Higher strictness will increase precision but lower recall of the answer.
+            role_information:
+              type: string
+              description: |-
+                Additional instructions for the model to inform how it should behave and any context it should reference when
+                generating a response.
You can describe the assistant's personality and tell it how to format responses. + This is limited to 100 tokens and counts against the overall token limit. + max_search_queries: + type: integer + format: int32 + description: |- + The maximum number of rewritten queries that should be sent to the search provider for a single user message. + By default, the system will make an automatic determination. + allow_partial_result: + type: boolean + description: |- + If set to true, the system will allow partial search results to be used and the request will fail if all + partial queries fail. If not specified or specified as false, the request will fail if any search query fails. + default: false + include_contexts: + type: array + items: + type: string + enum: + - citations + - intent + - all_retrieved_documents + maxItems: 3 + description: |- + The output context properties to include on the response. + By default, citations and intent will be requested. + default: + - citations + - intent + endpoint: + type: string + format: uri + description: The absolute endpoint path for the Azure Search resource to use. + index_name: + type: string + description: The name of the index to use, as specified in the Azure Search resource. + authentication: + anyOf: + - $ref: '#/components/schemas/AzureChatDataSourceApiKeyAuthenticationOptions' + - $ref: '#/components/schemas/AzureChatDataSourceSystemAssignedManagedIdentityAuthenticationOptions' + - $ref: '#/components/schemas/AzureChatDataSourceUserAssignedManagedIdentityAuthenticationOptions' + - $ref: '#/components/schemas/AzureChatDataSourceAccessTokenAuthenticationOptions' + description: The authentication mechanism to use with Azure Search. + fields_mapping: + type: object + properties: + title_field: + type: string + description: The name of the index field to use as a title. + url_field: + type: string + description: The name of the index field to use as a URL. 
+ filepath_field: + type: string + description: The name of the index field to use as a filepath. + content_fields: + type: array + items: + type: string + description: The names of index fields that should be treated as content. + content_fields_separator: + type: string + description: The separator pattern that content fields should use. + vector_fields: + type: array + items: + type: string + description: The names of fields that represent vector data. + image_vector_fields: + type: array + items: + type: string + description: The names of fields that represent image vector data. + description: The field mappings to use with the Azure Search resource. + query_type: + type: string + enum: + - simple + - semantic + - vector + - vector_simple_hybrid + - vector_semantic_hybrid + description: The query type for the Azure Search resource to use. + semantic_configuration: + type: string + description: Additional semantic configuration for the query. + filter: + type: string + description: A filter to apply to the search. + embedding_dependency: + anyOf: + - $ref: '#/components/schemas/AzureChatDataSourceEndpointVectorizationSource' + - $ref: '#/components/schemas/AzureChatDataSourceDeploymentNameVectorizationSource' + description: |- + The vectorization source to use with Azure Search. + Supported sources for Azure Search include endpoint and deployment name. + required: + - endpoint + - index_name + - authentication + description: The parameter information to control the use of the Azure Search data source. + allOf: + - $ref: '#/components/schemas/AzureChatDataSource' + description: Represents a data source configuration that will use an Azure Search resource. + ChatCompletionMessageToolCallsItem: + type: array + items: + $ref: '#/components/schemas/OpenAI.ChatCompletionMessageToolCall' + description: The tool calls generated by the model, such as function calls. 
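+    # Illustrative example (not part of the service contract): a minimal, hypothetical
+    # AzureSearchChatDataSource instance as it could appear in a request's `data_sources`
+    # array; the endpoint and index name are placeholders:
+    #   {
+    #     "type": "azure_search",
+    #     "parameters": {
+    #       "endpoint": "https://{your-search-resource}.search.windows.net",
+    #       "index_name": "{your-index}",
+    #       "authentication": { "type": "system_assigned_managed_identity" }
+    #     }
+    #   }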
+ ChatCompletionRequestMessageContentPart: + anyOf: + - $ref: '#/components/schemas/OpenAI.ChatCompletionRequestMessageContentPartText' + - $ref: '#/components/schemas/OpenAI.ChatCompletionRequestMessageContentPartImage' + x-oaiExpandable: true + ChatCompletionTokenLogprobBytes: + type: array + items: + type: integer + format: int32 + ChatCompletionToolChoiceOption: + anyOf: + - type: string + enum: + - none + - auto + - required + - $ref: '#/components/schemas/OpenAI.ChatCompletionNamedToolChoice' + description: |- + Controls which (if any) tool is called by the model. + `none` means the model will not call any tool and instead generates a message. + `auto` means the model can pick between generating a message or calling one or more tools. + `required` means the model must call one or more tools. + Specifying a particular tool via `{"type": "function", "function": {"name": "my_function"}}` forces the model to call that tool. + + `none` is the default when no tools are present. `auto` is the default if tools are present. + x-oaiExpandable: true + CreateMessageRequestAttachments: + type: array + items: + type: object + properties: + file_id: + type: string + description: The ID of the file to attach to the message. + tools: + type: array + items: + anyOf: + - $ref: '#/components/schemas/OpenAI.AssistantToolsCode' + - $ref: '#/components/schemas/OpenAI.AssistantToolsFileSearchTypeOnly' + description: The tools to add this file to. + x-oaiExpandable: true + required: + - file_id + - tools + ElasticsearchChatDataSource: + type: object + required: + - type + - parameters + properties: + type: + type: string + enum: + - elasticsearch + description: The discriminated type identifier, which is always 'elasticsearch'. + parameters: + type: object + properties: + top_n_documents: + type: integer + format: int32 + description: The configured number of documents to feature in the query. 
+ in_scope: + type: boolean + description: Whether queries should be restricted to use of the indexed data. + strictness: + type: integer + format: int32 + minimum: 1 + maximum: 5 + description: |- + The configured strictness of the search relevance filtering. + Higher strictness will increase precision but lower recall of the answer. + role_information: + type: string + description: |- + Additional instructions for the model to inform how it should behave and any context it should reference when + generating a response. You can describe the assistant's personality and tell it how to format responses. + This is limited to 100 tokens and counts against the overall token limit. + max_search_queries: + type: integer + format: int32 + description: |- + The maximum number of rewritten queries that should be sent to the search provider for a single user message. + By default, the system will make an automatic determination. + allow_partial_result: + type: boolean + description: |- + If set to true, the system will allow partial search results to be used and the request will fail if all + partial queries fail. If not specified or specified as false, the request will fail if any search query fails. + default: false + include_contexts: + type: array + items: + type: string + enum: + - citations + - intent + - all_retrieved_documents + maxItems: 3 + description: |- + The output context properties to include on the response. + By default, citations and intent will be requested. 
+ default: + - citations + - intent + endpoint: + type: string + format: uri + index_name: + type: string + authentication: + anyOf: + - $ref: '#/components/schemas/AzureChatDataSourceKeyAndKeyIdAuthenticationOptions' + - $ref: '#/components/schemas/AzureChatDataSourceEncodedApiKeyAuthenticationOptions' + fields_mapping: + type: object + properties: + title_field: + type: string + url_field: + type: string + filepath_field: + type: string + content_fields: + type: array + items: + type: string + content_fields_separator: + type: string + vector_fields: + type: array + items: + type: string + query_type: + type: string + enum: + - simple + - vector + embedding_dependency: + $ref: '#/components/schemas/AzureChatDataSourceVectorizationSource' + required: + - endpoint + - index_name + - authentication + description: The parameter information to control the use of the Elasticsearch data source. + allOf: + - $ref: '#/components/schemas/AzureChatDataSource' + MessageObjectAttachments: + type: array + items: + type: object + properties: + file_id: + type: string + description: The ID of the file to attach to the message. + tools: + type: array + items: + anyOf: + - $ref: '#/components/schemas/OpenAI.AssistantToolsCode' + - $ref: '#/components/schemas/OpenAI.AssistantToolsFileSearchTypeOnly' + description: The tools to add this file to. 
+ x-oaiExpandable: true + OpenAI.AssistantToolDefinition: + type: object + required: + - type + properties: + type: + type: string + discriminator: + propertyName: type + mapping: + file_search: '#/components/schemas/OpenAI.AssistantToolsFileSearch' + function: '#/components/schemas/OpenAI.AssistantToolsFunction' + OpenAI.AssistantToolsCode: + type: object + required: + - type + properties: + type: + type: string + enum: + - code_interpreter + description: 'The type of tool being defined: `code_interpreter`' + allOf: + - $ref: '#/components/schemas/OpenAI.AssistantToolDefinition' + OpenAI.AssistantToolsFileSearch: + type: object + required: + - type + properties: + type: + type: string + enum: + - file_search + description: 'The type of tool being defined: `file_search`' + file_search: + type: object + properties: + max_num_results: + type: integer + format: int32 + minimum: 1 + maximum: 50 + description: |- + The maximum number of results the file search tool should output. The default is 20 for gpt-4* models and 5 for gpt-3.5-turbo. This number should be between 1 and 50 inclusive. + + Note that the file search tool may output fewer than `max_num_results` results. See the [file search tool documentation](/docs/assistants/tools/file-search/number-of-chunks-returned) for more information. + description: Overrides for the file search tool. 
+ allOf: + - $ref: '#/components/schemas/OpenAI.AssistantToolDefinition' + OpenAI.AssistantToolsFileSearchTypeOnly: + type: object + required: + - type + properties: + type: + type: string + enum: + - file_search + description: 'The type of tool being defined: `file_search`' + OpenAI.AssistantToolsFunction: + type: object + required: + - type + - function + properties: + type: + type: string + enum: + - function + description: 'The type of tool being defined: `function`' + function: + $ref: '#/components/schemas/OpenAI.FunctionObject' + allOf: + - $ref: '#/components/schemas/OpenAI.AssistantToolDefinition' + OpenAI.ChatCompletionFunctionCallOption: + type: object + required: + - name + properties: + name: + type: string + description: The name of the function to call. + description: 'Specifying a particular function via `{"name": "my_function"}` forces the model to call that function.' + OpenAI.ChatCompletionFunctions: + type: object + required: + - name + properties: + description: + type: string + description: A description of what the function does, used by the model to choose when and how to call the function. + name: + type: string + description: The name of the function to be called. Must be a-z, A-Z, 0-9, or contain underscores and dashes, with a maximum length of 64. + parameters: + $ref: '#/components/schemas/OpenAI.FunctionParameters' + deprecated: true + OpenAI.ChatCompletionMessageToolCall: + type: object + required: + - id + - type + - function + properties: + id: + type: string + description: The ID of the tool call. + type: + type: string + enum: + - function + description: The type of the tool. Currently, only `function` is supported. + function: + type: object + properties: + name: + type: string + description: The name of the function to call. + arguments: + type: string + description: The arguments to call the function with, as generated by the model in JSON format. 
Note that the model does not always generate valid JSON, and may hallucinate parameters not defined by your function schema. Validate the arguments in your code before calling your function. + required: + - name + - arguments + description: The function that the model called. + OpenAI.ChatCompletionMessageToolCallChunk: + type: object + required: + - index + properties: + index: + type: integer + format: int32 + id: + type: string + description: The ID of the tool call. + type: + type: string + enum: + - function + description: The type of the tool. Currently, only `function` is supported. + function: + type: object + properties: + name: + type: string + description: The name of the function to call. + arguments: + type: string + description: The arguments to call the function with, as generated by the model in JSON format. Note that the model does not always generate valid JSON, and may hallucinate parameters not defined by your function schema. Validate the arguments in your code before calling your function. + OpenAI.ChatCompletionNamedToolChoice: + type: object + required: + - type + - function + properties: + type: + type: string + enum: + - function + description: The type of the tool. Currently, only `function` is supported. + function: + type: object + properties: + name: + type: string + description: The name of the function to call. + required: + - name + description: Specifies a tool the model should use. Use to force the model to call a specific function. + OpenAI.ChatCompletionRequestAssistantMessage: + type: object + required: + - role + properties: + content: + type: string + nullable: true + description: The contents of the assistant message. Required unless `tool_calls` or `function_call` is specified. + role: + type: string + enum: + - assistant + description: The role of the messages author, in this case `assistant`. + name: + type: string + description: An optional name for the participant. 
Provides the model information to differentiate between participants of the same role. + tool_calls: + $ref: '#/components/schemas/ChatCompletionMessageToolCallsItem' + function_call: + type: object + properties: + arguments: + type: string + description: The arguments to call the function with, as generated by the model in JSON format. Note that the model does not always generate valid JSON, and may hallucinate parameters not defined by your function schema. Validate the arguments in your code before calling your function. + name: + type: string + description: The name of the function to call. + required: + - arguments + - name + nullable: true + description: Deprecated and replaced by `tool_calls`. The name and arguments of a function that should be called, as generated by the model. + deprecated: true + allOf: + - $ref: '#/components/schemas/OpenAI.ChatCompletionRequestMessage' + OpenAI.ChatCompletionRequestFunctionMessage: + type: object + required: + - role + - content + - name + properties: + role: + type: string + enum: + - function + description: The role of the messages author, in this case `function`. + content: + type: string + nullable: true + description: The contents of the function message. + name: + type: string + description: The name of the function to call. + allOf: + - $ref: '#/components/schemas/OpenAI.ChatCompletionRequestMessage' + deprecated: true + OpenAI.ChatCompletionRequestMessage: + type: object + required: + - role + properties: + role: + type: string + description: The role of the author of this message. 
+ discriminator: + propertyName: role + mapping: + system: '#/components/schemas/OpenAI.ChatCompletionRequestSystemMessage' + user: '#/components/schemas/OpenAI.ChatCompletionRequestUserMessage' + assistant: '#/components/schemas/OpenAI.ChatCompletionRequestAssistantMessage' + tool: '#/components/schemas/OpenAI.ChatCompletionRequestToolMessage' + function: '#/components/schemas/OpenAI.ChatCompletionRequestFunctionMessage' + x-oaiExpandable: true + OpenAI.ChatCompletionRequestMessageContentPartImage: + type: object + required: + - type + - image_url + properties: + type: + type: string + enum: + - image_url + description: The type of the content part. + image_url: + type: object + properties: + url: + type: string + format: uri + description: Either a URL of the image or the base64 encoded image data. + detail: + type: string + enum: + - auto + - low + - high + description: Specifies the detail level of the image. Learn more in the [Vision guide](/docs/guides/vision/low-or-high-fidelity-image-understanding). + default: auto + required: + - url + OpenAI.ChatCompletionRequestMessageContentPartText: + type: object + required: + - type + - text + properties: + type: + type: string + enum: + - text + description: The type of the content part. + text: + type: string + description: The text content. + OpenAI.ChatCompletionRequestSystemMessage: + type: object + required: + - content + - role + properties: + content: + type: string + description: The contents of the system message. + role: + type: string + enum: + - system + description: The role of the messages author, in this case `system`. + name: + type: string + description: An optional name for the participant. Provides the model information to differentiate between participants of the same role. 
+ allOf: + - $ref: '#/components/schemas/OpenAI.ChatCompletionRequestMessage' + OpenAI.ChatCompletionRequestToolMessage: + type: object + required: + - role + - content + - tool_call_id + properties: + role: + type: string + enum: + - tool + description: The role of the messages author, in this case `tool`. + content: + type: string + description: The contents of the tool message. + tool_call_id: + type: string + description: Tool call that this message is responding to. + allOf: + - $ref: '#/components/schemas/OpenAI.ChatCompletionRequestMessage' + OpenAI.ChatCompletionRequestUserMessage: + type: object + required: + - content + - role + properties: + content: + anyOf: + - type: string + - type: array + items: + $ref: '#/components/schemas/ChatCompletionRequestMessageContentPart' + description: The contents of the user message. + x-oaiExpandable: true + role: + type: string + enum: + - user + description: The role of the messages author, in this case `user`. + name: + type: string + description: An optional name for the participant. Provides the model information to differentiate between participants of the same role. + allOf: + - $ref: '#/components/schemas/OpenAI.ChatCompletionRequestMessage' + OpenAI.ChatCompletionResponseMessage: + type: object + required: + - content + - role + properties: + content: + type: string + nullable: true + description: The contents of the message. + tool_calls: + $ref: '#/components/schemas/ChatCompletionMessageToolCallsItem' + role: + type: string + enum: + - assistant + description: The role of the author of this message. + function_call: + type: object + properties: + arguments: + type: string + description: The arguments to call the function with, as generated by the model in JSON format. Note that the model does not always generate valid JSON, and may hallucinate parameters not defined by your function schema. Validate the arguments in your code before calling your function. 
+ name: + type: string + description: The name of the function to call. + required: + - arguments + - name + description: Deprecated and replaced by `tool_calls`. The name and arguments of a function that should be called, as generated by the model. + deprecated: true + description: A chat completion message generated by the model. + OpenAI.ChatCompletionStreamOptions: + type: object + properties: + include_usage: + type: boolean + description: 'If set, an additional chunk will be streamed before the `data: [DONE]` message. The `usage` field on this chunk shows the token usage statistics for the entire request, and the `choices` field will always be an empty array. All other chunks will also include a `usage` field, but with a null value.' + description: 'Options for streaming response. Only set this when you set `stream: true`.' + OpenAI.ChatCompletionStreamResponseDelta: + type: object + properties: + content: + type: string + nullable: true + description: The contents of the chunk message. + function_call: + type: object + properties: + arguments: + type: string + description: The arguments to call the function with, as generated by the model in JSON format. Note that the model does not always generate valid JSON, and may hallucinate parameters not defined by your function schema. Validate the arguments in your code before calling your function. + name: + type: string + description: The name of the function to call. + description: Deprecated and replaced by `tool_calls`. The name and arguments of a function that should be called, as generated by the model. + deprecated: true + tool_calls: + type: array + items: + $ref: '#/components/schemas/OpenAI.ChatCompletionMessageToolCallChunk' + role: + type: string + enum: + - system + - user + - assistant + - tool + description: The role of the author of this message. + description: A chat completion delta generated by streamed model responses. 
+ OpenAI.ChatCompletionTokenLogprob: + type: object + required: + - token + - logprob + - bytes + - top_logprobs + properties: + token: + type: string + description: The token. + logprob: + type: number + format: float + description: The log probability of this token, if it is within the top 20 most likely tokens. Otherwise, the value `-9999.0` is used to signify that the token is very unlikely. + bytes: + type: object + allOf: + - $ref: '#/components/schemas/ChatCompletionTokenLogprobBytes' + nullable: true + description: A list of integers representing the UTF-8 bytes representation of the token. Useful in instances where characters are represented by multiple tokens and their byte representations must be combined to generate the correct text representation. Can be `null` if there is no bytes representation for the token. + top_logprobs: + type: array + items: + type: object + properties: + token: + type: string + description: The token. + logprob: + type: number + format: float + description: The log probability of this token, if it is within the top 20 most likely tokens. Otherwise, the value `-9999.0` is used to signify that the token is very unlikely. + bytes: + type: array + items: + type: integer + format: int32 + nullable: true + description: A list of integers representing the UTF-8 bytes representation of the token. Useful in instances where characters are represented by multiple tokens and their byte representations must be combined to generate the correct text representation. Can be `null` if there is no bytes representation for the token. + required: + - token + - logprob + - bytes + description: List of the most likely tokens and their log probability, at this token position. In rare cases, there may be fewer than the number of requested `top_logprobs` returned. + OpenAI.ChatCompletionTool: + type: object + required: + - type + - function + properties: + type: + type: string + enum: + - function + description: The type of the tool. 
Currently, only `function` is supported. + function: + $ref: '#/components/schemas/OpenAI.FunctionObject' + OpenAI.CompletionUsage: + type: object + required: + - completion_tokens + - prompt_tokens + - total_tokens + properties: + completion_tokens: + type: integer + format: int32 + description: Number of tokens in the generated completion. + prompt_tokens: + type: integer + format: int32 + description: Number of tokens in the prompt. + total_tokens: + type: integer + format: int32 + description: Total number of tokens used in the request (prompt + completion). + description: Usage statistics for the completion request. + OpenAI.CreateChatCompletionRequest: + type: object + required: + - messages + - model + properties: + messages: + type: array + items: + $ref: '#/components/schemas/OpenAI.ChatCompletionRequestMessage' + minItems: 1 + description: A list of messages comprising the conversation so far. [Example Python code](https://cookbook.openai.com/examples/how_to_format_inputs_to_chatgpt_models). + model: + anyOf: + - type: string + - type: string + enum: + - gpt-4o + - gpt-4o-2024-05-13 + - gpt-4-turbo + - gpt-4-turbo-2024-04-09 + - gpt-4-0125-preview + - gpt-4-turbo-preview + - gpt-4-1106-preview + - gpt-4-vision-preview + - gpt-4 + - gpt-4-0314 + - gpt-4-0613 + - gpt-4-32k + - gpt-4-32k-0314 + - gpt-4-32k-0613 + - gpt-3.5-turbo + - gpt-3.5-turbo-16k + - gpt-3.5-turbo-0301 + - gpt-3.5-turbo-0613 + - gpt-3.5-turbo-1106 + - gpt-3.5-turbo-0125 + - gpt-3.5-turbo-16k-0613 + description: ID of the model to use. See the [model endpoint compatibility](/docs/models/model-endpoint-compatibility) table for details on which models work with the Chat API. + x-oaiTypeLabel: string + frequency_penalty: + type: number + format: float + nullable: true + minimum: -2 + maximum: 2 + description: |- + Number between -2.0 and 2.0. 
Positive values penalize new tokens based on their existing frequency in the text so far, decreasing the model's likelihood to repeat the same line verbatim. + + [See more information about frequency and presence penalties.](/docs/guides/text-generation/parameter-details) + default: 0 + logit_bias: + type: object + additionalProperties: + type: integer + format: int32 + nullable: true + description: |- + Modify the likelihood of specified tokens appearing in the completion. + + Accepts a JSON object that maps tokens (specified by their token ID in the tokenizer) to an associated bias value from -100 to 100. Mathematically, the bias is added to the logits generated by the model prior to sampling. The exact effect will vary per model, but values between -1 and 1 should decrease or increase likelihood of selection; values like -100 or 100 should result in a ban or exclusive selection of the relevant token. + x-oaiTypeLabel: map + default: null + logprobs: + type: boolean + nullable: true + description: Whether to return log probabilities of the output tokens or not. If true, returns the log probabilities of each output token returned in the `content` of `message`. + default: false + top_logprobs: + type: integer + format: int32 + nullable: true + minimum: 0 + maximum: 20 + description: An integer between 0 and 20 specifying the number of most likely tokens to return at each token position, each with an associated log probability. `logprobs` must be set to `true` if this parameter is used. + max_tokens: + type: integer + format: int32 + nullable: true + description: |- + The maximum number of [tokens](/tokenizer) that can be generated in the chat completion. + + The total length of input tokens and generated tokens is limited by the model's context length. [Example Python code](https://cookbook.openai.com/examples/how_to_count_tokens_with_tiktoken) for counting tokens. 
+ n: + type: integer + format: int32 + nullable: true + minimum: 1 + maximum: 128 + description: How many chat completion choices to generate for each input message. Note that you will be charged based on the number of generated tokens across all of the choices. Keep `n` as `1` to minimize costs. + default: 1 + presence_penalty: + type: number + format: float + nullable: true + minimum: -2 + maximum: 2 + description: |- + Number between -2.0 and 2.0. Positive values penalize new tokens based on whether they appear in the text so far, increasing the model's likelihood to talk about new topics. + + [See more information about frequency and presence penalties.](/docs/guides/text-generation/parameter-details) + default: 0 + response_format: + type: object + properties: + type: + type: string + enum: + - text + - json_object + description: Must be one of `text` or `json_object`. + default: text + description: |- + An object specifying the format that the model must output. Compatible with [GPT-4 Turbo](/docs/models/gpt-4-and-gpt-4-turbo) and all GPT-3.5 Turbo models newer than `gpt-3.5-turbo-1106`. + + Setting to `{ "type": "json_object" }` enables JSON mode, which guarantees the message the model generates is valid JSON. + + **Important:** when using JSON mode, you **must** also instruct the model to produce JSON yourself via a system or user message. Without this, the model may generate an unending stream of whitespace until the generation reaches the token limit, resulting in a long-running and seemingly "stuck" request. Also note that the message content may be partially cut off if `finish_reason="length"`, which indicates the generation exceeded `max_tokens` or the conversation exceeded the max context length. + seed: + type: integer + format: int64 + nullable: true + description: |- + This feature is in Beta. 
+ If specified, our system will make a best effort to sample deterministically, such that repeated requests with the same `seed` and parameters should return the same result. + Determinism is not guaranteed, and you should refer to the `system_fingerprint` response parameter to monitor changes in the backend. + stop: + anyOf: + - type: string + - type: array + items: + type: string + nullable: true + description: Up to 4 sequences where the API will stop generating further tokens. + default: null + stream: + type: boolean + nullable: true + description: 'If set, partial message deltas will be sent, like in ChatGPT. Tokens will be sent as data-only [server-sent events](https://developer.mozilla.org/en-US/docs/Web/API/Server-sent_events/Using_server-sent_events#Event_stream_format) as they become available, with the stream terminated by a `data: [DONE]` message. [Example Python code](https://cookbook.openai.com/examples/how_to_stream_completions).' + default: false + stream_options: + type: object + allOf: + - $ref: '#/components/schemas/OpenAI.ChatCompletionStreamOptions' + nullable: true + default: null + temperature: + type: number + format: float + nullable: true + minimum: 0 + maximum: 2 + description: |- + What sampling temperature to use, between 0 and 2. Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic. + + We generally recommend altering this or `top_p` but not both. + default: 1 + top_p: + type: number + format: float + nullable: true + minimum: 0 + maximum: 1 + description: |- + An alternative to sampling with temperature, called nucleus sampling, where the model considers the results of the tokens with top_p probability mass. So 0.1 means only the tokens comprising the top 10% probability mass are considered. + + We generally recommend altering this or `temperature` but not both. 
+ default: 1 + tools: + type: array + items: + $ref: '#/components/schemas/OpenAI.ChatCompletionTool' + description: A list of tools the model may call. Currently, only functions are supported as a tool. Use this to provide a list of functions the model may generate JSON inputs for. A max of 128 functions are supported. + tool_choice: + $ref: '#/components/schemas/ChatCompletionToolChoiceOption' + parallel_tool_calls: + type: boolean + default: true + user: + type: string + description: A unique identifier representing your end-user, which can help OpenAI to monitor and detect abuse. [Learn more](/docs/guides/safety-best-practices/end-user-ids). + function_call: + anyOf: + - type: string + enum: + - none + - auto + - $ref: '#/components/schemas/OpenAI.ChatCompletionFunctionCallOption' + description: |- + Deprecated in favor of `tool_choice`. + + Controls which (if any) function is called by the model. + `none` means the model will not call a function and instead generates a message. + `auto` means the model can pick between generating a message or calling a function. + Specifying a particular function via `{"name": "my_function"}` forces the model to call that function. + + `none` is the default when no functions are present. `auto` is the default if functions are present. + deprecated: true + x-oaiExpandable: true + functions: + type: array + items: + $ref: '#/components/schemas/OpenAI.ChatCompletionFunctions' + minItems: 1 + maxItems: 128 + description: |- + Deprecated in favor of `tools`. + + A list of functions the model may generate JSON inputs for. + deprecated: true + OpenAI.CreateChatCompletionResponse: + type: object + required: + - id + - choices + - created + - model + - object + properties: + id: + type: string + description: A unique identifier for the chat completion. 
+ choices: + type: array + items: + type: object + properties: + finish_reason: + type: string + enum: + - stop + - length + - tool_calls + - content_filter + - function_call + description: |- + The reason the model stopped generating tokens. This will be `stop` if the model hit a natural stop point or a provided stop sequence, + `length` if the maximum number of tokens specified in the request was reached, + `content_filter` if content was omitted due to a flag from our content filters, + `tool_calls` if the model called a tool, or `function_call` (deprecated) if the model called a function. + index: + type: integer + format: int32 + description: The index of the choice in the list of choices. + message: + $ref: '#/components/schemas/OpenAI.ChatCompletionResponseMessage' + logprobs: + type: object + properties: + content: + type: array + items: + $ref: '#/components/schemas/OpenAI.ChatCompletionTokenLogprob' + nullable: true + description: A list of message content tokens with log probability information. + required: + - content + nullable: true + description: Log probability information for the choice. + required: + - finish_reason + - index + - message + - logprobs + description: A list of chat completion choices. Can be more than one if `n` is greater than 1. + created: + type: integer + format: unixtime + description: The Unix timestamp (in seconds) of when the chat completion was created. + model: + type: string + description: The model used for the chat completion. + system_fingerprint: + type: string + description: |- + This fingerprint represents the backend configuration that the model runs with. + + Can be used in conjunction with the `seed` request parameter to understand when backend changes have been made that might impact determinism. + object: + type: string + enum: + - chat.completion + description: The object type, which is always `chat.completion`. 
+ usage: + $ref: '#/components/schemas/OpenAI.CompletionUsage' + description: Represents a chat completion response returned by the model, based on the provided input. + OpenAI.CreateImageRequest: + type: object + required: + - prompt + properties: + prompt: + type: string + description: A text description of the desired image(s). The maximum length is 1000 characters for `dall-e-2` and 4000 characters for `dall-e-3`. + model: + anyOf: + - type: string + - type: string + enum: + - dall-e-2 + - dall-e-3 + nullable: true + description: The model to use for image generation. + x-oaiTypeLabel: string + default: dall-e-2 + n: + type: integer + format: int32 + nullable: true + minimum: 1 + maximum: 10 + description: The number of images to generate. Must be between 1 and 10. For `dall-e-3`, only `n=1` is supported. + default: 1 + quality: + type: string + enum: + - standard + - hd + description: The quality of the image that will be generated. `hd` creates images with finer details and greater consistency across the image. This param is only supported for `dall-e-3`. + default: standard + response_format: + type: string + enum: + - url + - b64_json + nullable: true + description: The format in which the generated images are returned. Must be one of `url` or `b64_json`. URLs are only valid for 60 minutes after the image has been generated. + default: url + size: + type: string + enum: + - 256x256 + - 512x512 + - 1024x1024 + - 1792x1024 + - 1024x1792 + nullable: true + description: The size of the generated images. Must be one of `256x256`, `512x512`, or `1024x1024` for `dall-e-2`. Must be one of `1024x1024`, `1792x1024`, or `1024x1792` for `dall-e-3` models. + default: 1024x1024 + style: + type: string + enum: + - vivid + - natural + nullable: true + description: The style of the generated images. Must be one of `vivid` or `natural`. Vivid causes the model to lean towards generating hyper-real and dramatic images.
Natural causes the model to produce more natural, less hyper-real looking images. This param is only supported for `dall-e-3`. + default: vivid + user: + type: string + description: A unique identifier representing your end-user, which can help OpenAI to monitor and detect abuse. [Learn more](/docs/guides/safety-best-practices/end-user-ids). + OpenAI.CreateMessageRequest: + type: object + required: + - role + - content + properties: + role: + type: string + enum: + - user + - assistant + description: |- + The role of the entity that is creating the message. Allowed values include: + - `user`: Indicates the message is sent by an actual user and should be used in most cases to represent user-generated messages. + - `assistant`: Indicates the message is generated by the assistant. Use this value to insert messages from the assistant into the conversation. + content: + type: array + items: + $ref: '#/components/schemas/OpenAI.MessageContent' + x-oaiExpandable: true + attachments: + type: object + allOf: + - $ref: '#/components/schemas/CreateMessageRequestAttachments' + nullable: true + description: A list of files attached to the message, and the tools they should be added to. + metadata: + type: object + additionalProperties: + type: string + nullable: true + description: Set of 16 key-value pairs that can be attached to an object. This can be useful for storing additional information about the object in a structured format. Keys can be a maximum of 64 characters long and values can be a maximum of 512 characters long. 
+ x-oaiTypeLabel: map + OpenAI.Error: + type: object + required: + - code + - message + - param + - type + properties: + code: + type: string + nullable: true + message: + type: string + param: + type: string + nullable: true + type: + type: string + OpenAI.ErrorResponse: + type: object + required: + - error + properties: + error: + $ref: '#/components/schemas/OpenAI.Error' + OpenAI.FunctionObject: + type: object + required: + - name + properties: + description: + type: string + description: A description of what the function does, used by the model to choose when and how to call the function. + name: + type: string + description: The name of the function to be called. Must be a-z, A-Z, 0-9, or contain underscores and dashes, with a maximum length of 64. + parameters: + $ref: '#/components/schemas/OpenAI.FunctionParameters' + OpenAI.FunctionParameters: + type: object + additionalProperties: {} + description: |- + The parameters the function accepts, described as a JSON Schema object. See the [guide](/docs/guides/function-calling) for examples, and the [JSON Schema reference](https://json-schema.org/understanding-json-schema/) for documentation about the format. + + Omitting `parameters` defines a function with an empty parameter list. + OpenAI.Image: + type: object + properties: + b64_json: + type: string + format: base64 + description: The base64-encoded JSON of the generated image, if `response_format` is `b64_json`. + url: + type: string + format: uri + description: The URL of the generated image, if `response_format` is `url` (default). + revised_prompt: + type: string + description: The prompt that was used to generate the image, if there was any revision to the prompt. + description: Represents the URL or the content of an image generated by the OpenAI API. 
+ OpenAI.ImagesResponse: + type: object + required: + - created + - data + properties: + created: + type: integer + format: unixtime + data: + type: array + items: + $ref: '#/components/schemas/OpenAI.Image' + OpenAI.MessageContent: + type: object + description: Represents a single piece of content in an Assistants API message. + OpenAI.MessageContentImageFileObject: + type: object + required: + - type + - image_file + properties: + type: + type: string + enum: + - image_file + description: Always `image_file`. + image_file: + type: object + properties: + file_id: + type: string + description: The [File](/docs/api-reference/files) ID of the image in the message content. Set `purpose="vision"` when uploading the File if you need to later display the file content. + detail: + type: string + enum: + - auto + - low + - high + description: Specifies the detail level of the image if specified by the user. `low` uses fewer tokens, you can opt in to high resolution using `high`. + default: auto + required: + - file_id + allOf: + - $ref: '#/components/schemas/OpenAI.MessageContent' + description: References an image [File](/docs/api-reference/files) in the content of a message. + OpenAI.MessageContentImageUrlObject: + type: object + required: + - type + - image_url + properties: + type: + type: string + enum: + - image_url + description: The type of the content part. + image_url: + type: object + properties: + url: + type: string + format: uri + description: 'The external URL of the image. Must be one of the supported image types: jpeg, jpg, png, gif, webp.' + detail: + type: string + enum: + - auto + - low + - high + description: Specifies the detail level of the image. `low` uses fewer tokens, you can opt in to high resolution using `high`. Default value is `auto`. + default: auto + required: + - url + allOf: + - $ref: '#/components/schemas/OpenAI.MessageContent' + description: References an image URL in the content of a message. 
+ OpenAI.MessageContentTextAnnotationsFileCitationObject: + type: object + required: + - type + - text + - file_citation + - start_index + - end_index + properties: + type: + type: string + enum: + - file_citation + description: Always `file_citation`. + text: + type: string + description: The text in the message content that needs to be replaced. + file_citation: + type: object + properties: + file_id: + type: string + description: The ID of the specific File the citation is from. + required: + - file_id + start_index: + type: integer + format: int32 + minimum: 0 + end_index: + type: integer + format: int32 + minimum: 0 + allOf: + - $ref: '#/components/schemas/OpenAI.MessageContentTextObjectAnnotation' + description: A citation within the message that points to a specific quote from a specific File associated with the assistant or the message. Generated when the assistant uses the "file_search" tool to search files. + OpenAI.MessageContentTextAnnotationsFilePathObject: + type: object + required: + - type + - text + - file_path + - start_index + - end_index + properties: + type: + type: string + enum: + - file_path + description: Always `file_path`. + text: + type: string + description: The text in the message content that needs to be replaced. + file_path: + type: object + properties: + file_id: + type: string + description: The ID of the file that was generated. + required: + - file_id + start_index: + type: integer + format: int32 + minimum: 0 + end_index: + type: integer + format: int32 + minimum: 0 + allOf: + - $ref: '#/components/schemas/OpenAI.MessageContentTextObjectAnnotation' + description: A URL for the file that's generated when the assistant used the `code_interpreter` tool to generate a file. + OpenAI.MessageContentTextObject: + type: object + required: + - type + - text + properties: + type: + type: string + enum: + - text + description: Always `text`. 
+ text: + type: object + properties: + value: + type: string + description: The data that makes up the text. + annotations: + type: array + items: + $ref: '#/components/schemas/OpenAI.MessageContentTextObjectAnnotation' + x-oaiExpandable: true + required: + - value + - annotations + allOf: + - $ref: '#/components/schemas/OpenAI.MessageContent' + description: The text content that is part of a message. + OpenAI.MessageContentTextObjectAnnotation: + type: object + required: + - type + properties: + type: + type: string + description: The discriminated type identifier for the content item. + discriminator: + propertyName: type + mapping: + file_citation: '#/components/schemas/OpenAI.MessageContentTextAnnotationsFileCitationObject' + file_path: '#/components/schemas/OpenAI.MessageContentTextAnnotationsFilePathObject' + OpenAI.MessageObject: + type: object + required: + - id + - object + - created_at + - thread_id + - status + - incomplete_details + - completed_at + - incomplete_at + - role + - content + - assistant_id + - run_id + - attachments + - metadata + properties: + id: + type: string + description: The identifier, which can be referenced in API endpoints. + object: + type: string + enum: + - thread.message + description: The object type, which is always `thread.message`. + created_at: + type: integer + format: unixtime + description: The Unix timestamp (in seconds) for when the message was created. + thread_id: + type: string + description: The [thread](/docs/api-reference/threads) ID that this message belongs to. + status: + type: string + enum: + - in_progress + - incomplete + - completed + description: The status of the message, which can be either `in_progress`, `incomplete`, or `completed`. + incomplete_details: + type: object + properties: + reason: + type: string + enum: + - content_filter + - max_tokens + - run_cancelled + - run_expired + - run_failed + description: The reason the message is incomplete. 
+ required: + - reason + nullable: true + description: On an incomplete message, details about why the message is incomplete. + completed_at: + type: integer + format: unixtime + nullable: true + description: The Unix timestamp (in seconds) for when the message was completed. + incomplete_at: + type: integer + format: unixtime + nullable: true + description: The Unix timestamp (in seconds) for when the message was marked as incomplete. + role: + type: string + enum: + - user + - assistant + description: The entity that produced the message. One of `user` or `assistant`. + content: + type: array + items: + $ref: '#/components/schemas/OpenAI.MessageContent' + description: The content of the message in an array of text and/or images. + x-oaiExpandable: true + assistant_id: + type: string + nullable: true + description: If applicable, the ID of the [assistant](/docs/api-reference/assistants) that authored this message. + run_id: + type: string + nullable: true + description: The ID of the [run](/docs/api-reference/runs) associated with the creation of this message. Value is `null` when messages are created manually using the create message or create thread endpoints. + attachments: + type: object + allOf: + - $ref: '#/components/schemas/MessageObjectAttachments' + nullable: true + description: A list of files attached to the message, and the tools they were added to. + metadata: + type: object + additionalProperties: + type: string + nullable: true + description: Set of 16 key-value pairs that can be attached to an object. This can be useful for storing additional information about the object in a structured format. Keys can be a maximum of 64 characters long and values can be a maximum of 512 characters long. + x-oaiTypeLabel: map + description: Represents a message within a [thread](/docs/api-reference/threads). 
+ OpenAI.MessageRequestContentTextObject: + type: object + required: + - type + - text + properties: + type: + type: string + enum: + - text + description: Always `text`. + text: + type: string + description: Text content to be sent to the model + allOf: + - $ref: '#/components/schemas/OpenAI.MessageContent' + description: The text content that is part of a message. + PineconeChatDataSource: + type: object + required: + - type + - parameters + properties: + type: + type: string + enum: + - pinecone + description: The discriminated type identifier, which is always 'pinecone'. + parameters: + type: object + properties: + top_n_documents: + type: integer + format: int32 + description: The configured number of documents to feature in the query. + in_scope: + type: boolean + description: Whether queries should be restricted to use of the indexed data. + strictness: + type: integer + format: int32 + minimum: 1 + maximum: 5 + description: |- + The configured strictness of the search relevance filtering. + Higher strictness will increase precision but lower recall of the answer. + role_information: + type: string + description: |- + Additional instructions for the model to inform how it should behave and any context it should reference when + generating a response. You can describe the assistant's personality and tell it how to format responses. + This is limited to 100 tokens and counts against the overall token limit. + max_search_queries: + type: integer + format: int32 + description: |- + The maximum number of rewritten queries that should be sent to the search provider for a single user message. + By default, the system will make an automatic determination. + allow_partial_result: + type: boolean + description: |- + If set to true, the system will allow partial search results to be used and the request will fail if all + partial queries fail. If not specified or specified as false, the request will fail if any search query fails. 
+ default: false + include_contexts: + type: array + items: + type: string + enum: + - citations + - intent + - all_retrieved_documents + maxItems: 3 + description: |- + The output context properties to include on the response. + By default, citations and intent will be requested. + default: + - citations + - intent + environment: + type: string + description: The environment name to use with Pinecone. + index_name: + type: string + description: The name of the Pinecone database index to use. + authentication: + allOf: + - $ref: '#/components/schemas/AzureChatDataSourceApiKeyAuthenticationOptions' + description: |- + The authentication mechanism to use with Pinecone. + Supported authentication mechanisms for Pinecone include: API key. + embedding_dependency: + allOf: + - $ref: '#/components/schemas/AzureChatDataSourceVectorizationSource' + description: |- + The vectorization source to use as an embedding dependency for the Pinecone data source. + Supported vectorization sources for Pinecone include: deployment name. + fields_mapping: + type: object + properties: + content_fields: + type: array + items: + type: string + title_field: + type: string + url_field: + type: string + filepath_field: + type: string + content_fields_separator: + type: string + required: + - content_fields + description: |- + Field mappings to apply to data used by the Pinecone data source. + Note that content field mappings are required for Pinecone. + required: + - environment + - index_name + - authentication + - embedding_dependency + - fields_mapping + description: The parameter information to control the use of the Pinecone data source. 
+ allOf: + - $ref: '#/components/schemas/AzureChatDataSource' + securitySchemes: + ApiKeyAuth: + type: apiKey + in: header + name: api-key + OAuth2Auth: + type: oauth2 + flows: + implicit: + authorizationUrl: https://login.microsoftonline.com/common/oauth2/v2.0/authorize + scopes: + https://cognitiveservices.azure.com/.default: '' +servers: + - url: '{endpoint}/openai' + description: Azure OpenAI APIs for completions and search + variables: + endpoint: + default: '' + description: |- + Supported Cognitive Services endpoints (protocol and hostname, for example: + https://westus.api.cognitive.microsoft.com). diff --git a/.openapi3/openapi3-openai.yaml b/.openapi3/openapi3-openai.yaml new file mode 100644 index 000000000..96af29c49 --- /dev/null +++ b/.openapi3/openapi3-openai.yaml @@ -0,0 +1,9416 @@ +openapi: 3.0.0 +info: + title: OpenAI API + description: The OpenAI REST API. Please see https://platform.openai.com/docs/api-reference for more details. + version: 0.0.0 +tags: + - name: Audio + - name: Assistants + - name: Batch + - name: Chat + - name: Completions + - name: Embeddings + - name: Files + - name: Fine-tuning + - name: Images + - name: Models + - name: Moderations + - name: Vector Stores + - name: Uploads +paths: + /v1/assistants: + post: + tags: + - Assistants + operationId: createAssistant + summary: Create an assistant with a model and instructions. + parameters: [] + responses: + '200': + description: The request has succeeded. + content: + application/json: + schema: + anyOf: + - $ref: '#/components/schemas/AssistantObject' + - $ref: '#/components/schemas/ErrorResponse' + requestBody: + required: true + content: + application/json: + schema: + $ref: '#/components/schemas/CreateAssistantRequest' + get: + tags: + - Assistants + operationId: listAssistants + summary: Returns a list of assistants. + parameters: + - name: limit + in: query + required: false + description: |- + A limit on the number of objects to be returned. 
Limit can range between 1 and 100, and the + default is 20. + schema: + type: integer + format: int32 + default: 20 + - name: order + in: query + required: false + description: |- + Sort order by the `created_at` timestamp of the objects. `asc` for ascending order and `desc` + for descending order. + schema: + $ref: '#/components/schemas/ListOrder' + default: desc + - name: after + in: query + required: false + description: |- + A cursor for use in pagination. `after` is an object ID that defines your place in the list. + For instance, if you make a list request and receive 100 objects, ending with obj_foo, your + subsequent call can include after=obj_foo in order to fetch the next page of the list. + schema: + type: string + - name: before + in: query + required: false + description: |- + A cursor for use in pagination. `before` is an object ID that defines your place in the list. + For instance, if you make a list request and receive 100 objects, ending with obj_foo, your + subsequent call can include before=obj_foo in order to fetch the previous page of the list. + schema: + type: string + responses: + '200': + description: The request has succeeded. + content: + application/json: + schema: + anyOf: + - $ref: '#/components/schemas/ListAssistantsResponse' + - $ref: '#/components/schemas/ErrorResponse' + /v1/assistants/{assistant_id}: + get: + tags: + - Assistants + operationId: getAssistant + summary: Retrieves an assistant. + parameters: + - name: assistant_id + in: path + required: true + description: The ID of the assistant to retrieve. + schema: + type: string + responses: + '200': + description: The request has succeeded. + content: + application/json: + schema: + anyOf: + - $ref: '#/components/schemas/AssistantObject' + - $ref: '#/components/schemas/ErrorResponse' + post: + tags: + - Assistants + operationId: modifyAssistant + summary: Modifies an assistant. 
+ parameters: + - name: assistant_id + in: path + required: true + description: The ID of the assistant to modify. + schema: + type: string + responses: + '200': + description: The request has succeeded. + content: + application/json: + schema: + anyOf: + - $ref: '#/components/schemas/AssistantObject' + - $ref: '#/components/schemas/ErrorResponse' + requestBody: + required: true + content: + application/json: + schema: + $ref: '#/components/schemas/ModifyAssistantRequest' + delete: + tags: + - Assistants + operationId: deleteAssistant + summary: Delete an assistant. + parameters: + - name: assistant_id + in: path + required: true + description: The ID of the assistant to delete. + schema: + type: string + responses: + '200': + description: The request has succeeded. + content: + application/json: + schema: + anyOf: + - $ref: '#/components/schemas/DeleteAssistantResponse' + - $ref: '#/components/schemas/ErrorResponse' + /v1/audio/speech: + post: + tags: + - Audio + operationId: createSpeech + summary: Generates audio from the input text. + parameters: [] + responses: + '200': + description: The request has succeeded. + headers: + Transfer-Encoding: + required: false + description: chunked + schema: + type: string + content: + application/octet-stream: + schema: + type: string + format: binary + requestBody: + required: true + content: + application/json: + schema: + $ref: '#/components/schemas/CreateSpeechRequest' + /v1/audio/transcriptions: + post: + tags: + - Audio + operationId: createTranscription + summary: Transcribes audio into the input language. + parameters: [] + responses: + '200': + description: The request has succeeded. 
+ content: + application/json: + schema: + anyOf: + - $ref: '#/components/schemas/CreateTranscriptionResponseVerboseJson' + - $ref: '#/components/schemas/CreateTranscriptionResponseJson' + - $ref: '#/components/schemas/ErrorResponse' + text/plain: + schema: + type: string + requestBody: + required: true + content: + multipart/form-data: + schema: + $ref: '#/components/schemas/CreateTranscriptionRequestMultiPart' + /v1/audio/translations: + post: + tags: + - Audio + operationId: createTranslation + summary: Translates audio into English. + parameters: [] + responses: + '200': + description: The request has succeeded. + content: + application/json: + schema: + anyOf: + - $ref: '#/components/schemas/CreateTranslationResponseVerboseJson' + - $ref: '#/components/schemas/CreateTranslationResponseJson' + - $ref: '#/components/schemas/ErrorResponse' + text/plain: + schema: + type: string + requestBody: + required: true + content: + multipart/form-data: + schema: + $ref: '#/components/schemas/CreateTranslationRequestMultiPart' + /v1/batches: + post: + tags: + - Batch + operationId: createBatch + summary: Creates and executes a batch from an uploaded file of requests. + parameters: [] + responses: + '200': + description: The request has succeeded. + content: + application/json: + schema: + anyOf: + - $ref: '#/components/schemas/Batch' + - $ref: '#/components/schemas/ErrorResponse' + requestBody: + required: true + content: + application/json: + schema: + type: object + properties: + input_file_id: + type: string + description: |- + The ID of an uploaded file that contains requests for the new batch. + + See [upload file](/docs/api-reference/files/create) for how to upload a file. + + Your input file must be formatted as a [JSONL file](/docs/api-reference/batch/requestInput), and must be uploaded with the purpose `batch`. + endpoint: + type: string + enum: + - /v1/chat/completions + - /v1/embeddings + description: The endpoint to be used for all requests in the batch. 
Currently `/v1/chat/completions` and `/v1/embeddings` are supported. + completion_window: + type: string + enum: + - 24h + description: The time frame within which the batch should be processed. Currently only `24h` is supported. + metadata: + type: object + additionalProperties: + type: string + nullable: true + description: Optional custom metadata for the batch. + required: + - input_file_id + - endpoint + - completion_window + get: + tags: + - Batch + operationId: listBatches + summary: List your organization's batches. + parameters: + - name: after + in: query + required: false + description: A cursor for use in pagination. `after` is an object ID that defines your place in the list. For instance, if you make a list request and receive 100 objects, ending with obj_foo, your subsequent call can include after=obj_foo in order to fetch the next page of the list. + schema: + type: string + - name: limit + in: query + required: false + description: A limit on the number of objects to be returned. Limit can range between 1 and 100, and the default is 20. + schema: + type: integer + format: int32 + default: 20 + responses: + '200': + description: The request has succeeded. + content: + application/json: + schema: + anyOf: + - $ref: '#/components/schemas/ListBatchesResponse' + - $ref: '#/components/schemas/ErrorResponse' + /v1/batches/{batch_id}: + get: + tags: + - Batch + operationId: retrieveBatch + summary: Retrieves a batch. + parameters: + - name: batch_id + in: path + required: true + description: The ID of the batch to retrieve. + schema: + type: string + responses: + '200': + description: The request has succeeded. + content: + application/json: + schema: + anyOf: + - $ref: '#/components/schemas/Batch' + - $ref: '#/components/schemas/ErrorResponse' + /v1/batches/{batch_id}/cancel: + post: + tags: + - Batch + operationId: cancelBatch + summary: Cancels an in-progress batch. 
+ parameters: + - name: batch_id + in: path + required: true + description: The ID of the batch to cancel. + schema: + type: string + responses: + '200': + description: The request has succeeded. + content: + application/json: + schema: + anyOf: + - $ref: '#/components/schemas/Batch' + - $ref: '#/components/schemas/ErrorResponse' + /v1/chat/completions: + post: + tags: + - Chat + operationId: createChatCompletion + summary: Creates a model response for the given chat conversation. + parameters: [] + responses: + '200': + description: The request has succeeded. + content: + application/json: + schema: + anyOf: + - $ref: '#/components/schemas/CreateChatCompletionResponse' + - $ref: '#/components/schemas/ErrorResponse' + requestBody: + required: true + content: + application/json: + schema: + $ref: '#/components/schemas/CreateChatCompletionRequest' + /v1/completions: + post: + tags: + - Completions + operationId: createCompletion + summary: Creates a completion for the provided prompt and parameters. + parameters: [] + responses: + '200': + description: The request has succeeded. + content: + application/json: + schema: + anyOf: + - $ref: '#/components/schemas/CreateCompletionResponse' + - $ref: '#/components/schemas/ErrorResponse' + requestBody: + required: true + content: + application/json: + schema: + $ref: '#/components/schemas/CreateCompletionRequest' + /v1/embeddings: + post: + tags: + - Embeddings + operationId: createEmbedding + summary: Creates an embedding vector representing the input text. + parameters: [] + responses: + '200': + description: The request has succeeded. 
+ content: + application/json: + schema: + anyOf: + - $ref: '#/components/schemas/CreateEmbeddingResponse' + - $ref: '#/components/schemas/ErrorResponse' + requestBody: + required: true + content: + application/json: + schema: + $ref: '#/components/schemas/CreateEmbeddingRequest' + /v1/files: + post: + tags: + - Files + operationId: createFile + summary: |- + Upload a file that can be used across various endpoints. The size of all the files uploaded by + one organization can be up to 100 GB. + + The size of individual files can be a maximum of 512 MB or 2 million tokens for Assistants. See + the [Assistants Tools guide](/docs/assistants/tools) to learn more about the types of files + supported. The Fine-tuning API only supports `.jsonl` files. + + Please [contact us](https://help.openai.com/) if you need to increase these storage limits. + parameters: [] + responses: + '200': + description: The request has succeeded. + content: + application/json: + schema: + anyOf: + - $ref: '#/components/schemas/OpenAIFile' + - $ref: '#/components/schemas/ErrorResponse' + requestBody: + required: true + content: + multipart/form-data: + schema: + $ref: '#/components/schemas/CreateFileRequestMultiPart' + get: + tags: + - Files + operationId: listFiles + summary: Returns a list of files that belong to the user's organization. + parameters: + - name: purpose + in: query + required: false + description: Only return files with the given purpose. + schema: + type: string + responses: + '200': + description: The request has succeeded. + content: + application/json: + schema: + anyOf: + - $ref: '#/components/schemas/ListFilesResponse' + - $ref: '#/components/schemas/ErrorResponse' + /v1/files/{file_id}: + get: + tags: + - Files + operationId: retrieveFile + summary: Returns information about a specific file. + parameters: + - name: file_id + in: path + required: true + description: The ID of the file to use for this request. 
+ schema: + type: string + responses: + '200': + description: The request has succeeded. + content: + application/json: + schema: + anyOf: + - $ref: '#/components/schemas/OpenAIFile' + - $ref: '#/components/schemas/ErrorResponse' + delete: + tags: + - Files + operationId: deleteFile + summary: Delete a file + parameters: + - name: file_id + in: path + required: true + description: The ID of the file to use for this request. + schema: + type: string + responses: + '200': + description: The request has succeeded. + content: + application/json: + schema: + anyOf: + - $ref: '#/components/schemas/DeleteFileResponse' + - $ref: '#/components/schemas/ErrorResponse' + /v1/files/{file_id}/content: + get: + tags: + - Files + operationId: downloadFile + summary: Returns the contents of the specified file. + parameters: + - name: file_id + in: path + required: true + description: The ID of the file to use for this request. + schema: + type: string + responses: + '200': + description: The request has succeeded. + content: + application/json: + schema: + anyOf: + - type: string + format: byte + - $ref: '#/components/schemas/ErrorResponse' + /v1/fine_tuning/jobs: + post: + tags: + - Fine-tuning + operationId: createFineTuningJob + summary: |- + Creates a fine-tuning job which begins the process of creating a new model from a given dataset. + + Response includes details of the enqueued job including job status and the name of the fine-tuned models once complete. + + [Learn more about fine-tuning](/docs/guides/fine-tuning) + parameters: [] + responses: + '200': + description: The request has succeeded. 
+ content: + application/json: + schema: + anyOf: + - $ref: '#/components/schemas/FineTuningJob' + - $ref: '#/components/schemas/ErrorResponse' + requestBody: + required: true + content: + application/json: + schema: + $ref: '#/components/schemas/CreateFineTuningJobRequest' + get: + tags: + - Fine-tuning + operationId: listPaginatedFineTuningJobs + summary: List your organization's fine-tuning jobs + parameters: + - name: after + in: query + required: false + description: Identifier for the last job from the previous pagination request. + schema: + type: string + - name: limit + in: query + required: false + description: Number of fine-tuning jobs to retrieve. + schema: + type: integer + format: int32 + default: 20 + responses: + '200': + description: The request has succeeded. + content: + application/json: + schema: + anyOf: + - $ref: '#/components/schemas/ListPaginatedFineTuningJobsResponse' + - $ref: '#/components/schemas/ErrorResponse' + /v1/fine_tuning/jobs/{fine_tuning_job_id}: + get: + tags: + - Fine-tuning + operationId: retrieveFineTuningJob + summary: |- + Get info about a fine-tuning job. + + [Learn more about fine-tuning](/docs/guides/fine-tuning) + parameters: + - name: fine_tuning_job_id + in: path + required: true + description: The ID of the fine-tuning job. + schema: + type: string + responses: + '200': + description: The request has succeeded. + content: + application/json: + schema: + anyOf: + - $ref: '#/components/schemas/FineTuningJob' + - $ref: '#/components/schemas/ErrorResponse' + /v1/fine_tuning/jobs/{fine_tuning_job_id}/cancel: + post: + tags: + - Fine-tuning + operationId: cancelFineTuningJob + summary: Immediately cancel a fine-tune job. + parameters: + - name: fine_tuning_job_id + in: path + required: true + description: The ID of the fine-tuning job to cancel. + schema: + type: string + responses: + '200': + description: The request has succeeded. 
+ content: + application/json: + schema: + anyOf: + - $ref: '#/components/schemas/FineTuningJob' + - $ref: '#/components/schemas/ErrorResponse' + /v1/fine_tuning/jobs/{fine_tuning_job_id}/checkpoints: + get: + tags: + - Fine-tuning + operationId: listFineTuningJobCheckpoints + summary: List the checkpoints for a fine-tuning job. + parameters: + - name: fine_tuning_job_id + in: path + required: true + description: The ID of the fine-tuning job to get checkpoints for. + schema: + type: string + - name: after + in: query + required: false + description: Identifier for the last checkpoint ID from the previous pagination request. + schema: + type: string + - name: limit + in: query + required: false + description: Number of checkpoints to retrieve. + schema: + type: integer + format: int32 + default: 10 + responses: + '200': + description: The request has succeeded. + content: + application/json: + schema: + anyOf: + - $ref: '#/components/schemas/ListFineTuningJobCheckpointsResponse' + - $ref: '#/components/schemas/ErrorResponse' + /v1/fine_tuning/jobs/{fine_tuning_job_id}/events: + get: + tags: + - Fine-tuning + operationId: listFineTuningEvents + summary: Get status updates for a fine-tuning job. + parameters: + - name: fine_tuning_job_id + in: path + required: true + description: The ID of the fine-tuning job to get events for. + schema: + type: string + - name: after + in: query + required: false + description: Identifier for the last event from the previous pagination request. + schema: + type: string + - name: limit + in: query + required: false + description: Number of events to retrieve. + schema: + type: integer + format: int32 + default: 20 + responses: + '200': + description: The request has succeeded. 
+          content:
+            application/json:
+              schema:
+                anyOf:
+                  - $ref: '#/components/schemas/ListFineTuningJobEventsResponse'
+                  - $ref: '#/components/schemas/ErrorResponse'
+  /v1/images/edits:
+    post:
+      tags:
+        - Images
+      operationId: createImageEdit
+      summary: Creates an edited or extended image given an original image and a prompt.
+      parameters: []
+      responses:
+        '200':
+          description: The request has succeeded.
+          content:
+            application/json:
+              schema:
+                anyOf:
+                  - $ref: '#/components/schemas/ImagesResponse'
+                  - $ref: '#/components/schemas/ErrorResponse'
+      requestBody:
+        required: true
+        content:
+          multipart/form-data:
+            schema:
+              $ref: '#/components/schemas/CreateImageEditRequestMultiPart'
+  /v1/images/generations:
+    post:
+      tags:
+        - Images
+      operationId: createImage
+      summary: Creates an image given a prompt.
+      parameters: []
+      responses:
+        '200':
+          description: The request has succeeded.
+          content:
+            application/json:
+              schema:
+                anyOf:
+                  - $ref: '#/components/schemas/ImagesResponse'
+                  - $ref: '#/components/schemas/ErrorResponse'
+      requestBody:
+        required: true
+        content:
+          application/json:
+            schema:
+              $ref: '#/components/schemas/CreateImageRequest'
+  /v1/images/variations:
+    post:
+      tags:
+        - Images
+      operationId: createImageVariation
+      summary: Creates a variation of a given image.
+      parameters: []
+      responses:
+        '200':
+          description: The request has succeeded.
+          content:
+            application/json:
+              schema:
+                anyOf:
+                  - $ref: '#/components/schemas/ImagesResponse'
+                  - $ref: '#/components/schemas/ErrorResponse'
+      requestBody:
+        required: true
+        content:
+          multipart/form-data:
+            schema:
+              $ref: '#/components/schemas/CreateImageVariationRequestMultiPart'
+  /v1/models:
+    get:
+      tags:
+        - Models
+      operationId: listModels
+      summary: |-
+        Lists the currently available models, and provides basic information about each one such as the
+        owner and availability.
+      parameters: []
+      responses:
+        '200':
+          description: The request has succeeded.
+ content: + application/json: + schema: + anyOf: + - $ref: '#/components/schemas/ListModelsResponse' + - $ref: '#/components/schemas/ErrorResponse' + /v1/models/{model}: + get: + tags: + - Models + operationId: retrieveModel + summary: |- + Retrieves a model instance, providing basic information about the model such as the owner and + permissioning. + parameters: + - name: model + in: path + required: true + description: The ID of the model to use for this request. + schema: + type: string + responses: + '200': + description: The request has succeeded. + content: + application/json: + schema: + anyOf: + - $ref: '#/components/schemas/Model' + - $ref: '#/components/schemas/ErrorResponse' + delete: + tags: + - Models + operationId: deleteModel + summary: Delete a fine-tuned model. You must have the Owner role in your organization to delete a model. + parameters: + - name: model + in: path + required: true + description: The model to delete + schema: + type: string + responses: + '200': + description: The request has succeeded. + content: + application/json: + schema: + anyOf: + - $ref: '#/components/schemas/DeleteModelResponse' + - $ref: '#/components/schemas/ErrorResponse' + /v1/moderations: + post: + tags: + - Moderations + operationId: createModeration + summary: Classifies if text is potentially harmful. + parameters: [] + responses: + '200': + description: The request has succeeded. + content: + application/json: + schema: + anyOf: + - $ref: '#/components/schemas/CreateModerationResponse' + - $ref: '#/components/schemas/ErrorResponse' + requestBody: + required: true + content: + application/json: + schema: + $ref: '#/components/schemas/CreateModerationRequest' + /v1/threads: + post: + tags: + - Assistants + operationId: createThread + summary: Create a thread. + parameters: [] + responses: + '200': + description: The request has succeeded. 
+ content: + application/json: + schema: + anyOf: + - $ref: '#/components/schemas/ThreadObject' + - $ref: '#/components/schemas/ErrorResponse' + requestBody: + required: true + content: + application/json: + schema: + $ref: '#/components/schemas/CreateThreadRequest' + /v1/threads/runs: + post: + tags: + - Assistants + operationId: createThreadAndRun + summary: Create a thread and run it in one request. + parameters: [] + responses: + '200': + description: The request has succeeded. + content: + application/json: + schema: + anyOf: + - $ref: '#/components/schemas/RunObject' + - $ref: '#/components/schemas/ErrorResponse' + requestBody: + required: true + content: + application/json: + schema: + $ref: '#/components/schemas/CreateThreadAndRunRequest' + /v1/threads/{thread_id}: + get: + tags: + - Assistants + operationId: getThread + summary: Retrieves a thread. + parameters: + - name: thread_id + in: path + required: true + description: The ID of the thread to retrieve. + schema: + type: string + responses: + '200': + description: The request has succeeded. + content: + application/json: + schema: + anyOf: + - $ref: '#/components/schemas/ThreadObject' + - $ref: '#/components/schemas/ErrorResponse' + post: + tags: + - Assistants + operationId: modifyThread + summary: Modifies a thread. + parameters: + - name: thread_id + in: path + required: true + description: The ID of the thread to modify. Only the `metadata` can be modified. + schema: + type: string + responses: + '200': + description: The request has succeeded. + content: + application/json: + schema: + anyOf: + - $ref: '#/components/schemas/ThreadObject' + - $ref: '#/components/schemas/ErrorResponse' + requestBody: + required: true + content: + application/json: + schema: + $ref: '#/components/schemas/ModifyThreadRequest' + delete: + tags: + - Assistants + operationId: deleteThread + summary: Delete a thread. 
+      parameters:
+        - name: thread_id
+          in: path
+          required: true
+          description: The ID of the thread to delete.
+          schema:
+            type: string
+      responses:
+        '200':
+          description: The request has succeeded.
+          content:
+            application/json:
+              schema:
+                anyOf:
+                  - $ref: '#/components/schemas/DeleteThreadResponse'
+                  - $ref: '#/components/schemas/ErrorResponse'
+  /v1/threads/{thread_id}/messages:
+    post:
+      tags:
+        - Assistants
+      operationId: createMessage
+      summary: Create a message.
+      parameters:
+        - name: thread_id
+          in: path
+          required: true
+          description: The ID of the [thread](/docs/api-reference/threads) to create a message for.
+          schema:
+            type: string
+      responses:
+        '200':
+          description: The request has succeeded.
+          content:
+            application/json:
+              schema:
+                anyOf:
+                  - $ref: '#/components/schemas/MessageObject'
+                  - $ref: '#/components/schemas/ErrorResponse'
+      requestBody:
+        required: true
+        content:
+          application/json:
+            schema:
+              $ref: '#/components/schemas/CreateMessageRequest'
+    get:
+      tags:
+        - Assistants
+      operationId: listMessages
+      summary: Returns a list of messages for a given thread.
+      parameters:
+        - name: thread_id
+          in: path
+          required: true
+          description: The ID of the [thread](/docs/api-reference/threads) the messages belong to.
+          schema:
+            type: string
+        - name: limit
+          in: query
+          required: false
+          description: |-
+            A limit on the number of objects to be returned. Limit can range between 1 and 100, and the
+            default is 20.
+          schema:
+            type: integer
+            format: int32
+            default: 20
+        - name: order
+          in: query
+          required: false
+          description: |-
+            Sort order by the `created_at` timestamp of the objects. `asc` for ascending order and `desc`
+            for descending order.
+          schema:
+            $ref: '#/components/schemas/ListOrder'
+            default: desc
+        - name: after
+          in: query
+          required: false
+          description: |-
+            A cursor for use in pagination. `after` is an object ID that defines your place in the list.
+ For instance, if you make a list request and receive 100 objects, ending with obj_foo, your + subsequent call can include after=obj_foo in order to fetch the next page of the list. + schema: + type: string + - name: before + in: query + required: false + description: |- + A cursor for use in pagination. `before` is an object ID that defines your place in the list. + For instance, if you make a list request and receive 100 objects, ending with obj_foo, your + subsequent call can include before=obj_foo in order to fetch the previous page of the list. + schema: + type: string + responses: + '200': + description: The request has succeeded. + content: + application/json: + schema: + anyOf: + - $ref: '#/components/schemas/ListMessagesResponse' + - $ref: '#/components/schemas/ErrorResponse' + /v1/threads/{thread_id}/messages/{message_id}: + get: + tags: + - Assistants + operationId: getMessage + summary: Retrieve a message. + parameters: + - name: thread_id + in: path + required: true + description: The ID of the [thread](/docs/api-reference/threads) to which this message belongs. + schema: + type: string + - name: message_id + in: path + required: true + description: The ID of the message to retrieve. + schema: + type: string + responses: + '200': + description: The request has succeeded. + content: + application/json: + schema: + anyOf: + - $ref: '#/components/schemas/MessageObject' + - $ref: '#/components/schemas/ErrorResponse' + post: + tags: + - Assistants + operationId: modifyMessage + summary: Modifies a message. + parameters: + - name: thread_id + in: path + required: true + description: The ID of the thread to which this message belongs. + schema: + type: string + - name: message_id + in: path + required: true + description: The ID of the message to modify. + schema: + type: string + responses: + '200': + description: The request has succeeded. 
+ content: + application/json: + schema: + anyOf: + - $ref: '#/components/schemas/MessageObject' + - $ref: '#/components/schemas/ErrorResponse' + requestBody: + required: true + content: + application/json: + schema: + $ref: '#/components/schemas/ModifyMessageRequest' + delete: + tags: + - Assistants + operationId: deleteMessage + summary: Deletes a message. + parameters: + - name: thread_id + in: path + required: true + description: The ID of the thread to which this message belongs. + schema: + type: string + - name: message_id + in: path + required: true + description: The ID of the message to delete. + schema: + type: string + responses: + '200': + description: The request has succeeded. + content: + application/json: + schema: + anyOf: + - $ref: '#/components/schemas/DeleteMessageResponse' + - $ref: '#/components/schemas/ErrorResponse' + /v1/threads/{thread_id}/runs: + post: + tags: + - Assistants + operationId: createRun + summary: Create a run. + parameters: + - name: thread_id + in: path + required: true + description: The ID of the thread to run. + schema: + type: string + responses: + '200': + description: The request has succeeded. + content: + application/json: + schema: + anyOf: + - $ref: '#/components/schemas/RunObject' + - $ref: '#/components/schemas/ErrorResponse' + requestBody: + required: true + content: + application/json: + schema: + $ref: '#/components/schemas/CreateRunRequest' + get: + tags: + - Assistants + operationId: listRuns + summary: Returns a list of runs belonging to a thread. + parameters: + - name: thread_id + in: path + required: true + description: The ID of the thread the run belongs to. + schema: + type: string + - name: limit + in: query + required: false + description: |- + A limit on the number of objects to be returned. Limit can range between 1 and 100, and the + default is 20. 
+          schema:
+            type: integer
+            format: int32
+            default: 20
+        - name: order
+          in: query
+          required: false
+          description: |-
+            Sort order by the `created_at` timestamp of the objects. `asc` for ascending order and `desc`
+            for descending order.
+          schema:
+            $ref: '#/components/schemas/ListOrder'
+            default: desc
+        - name: after
+          in: query
+          required: false
+          description: |-
+            A cursor for use in pagination. `after` is an object ID that defines your place in the list.
+            For instance, if you make a list request and receive 100 objects, ending with obj_foo, your
+            subsequent call can include after=obj_foo in order to fetch the next page of the list.
+          schema:
+            type: string
+        - name: before
+          in: query
+          required: false
+          description: |-
+            A cursor for use in pagination. `before` is an object ID that defines your place in the list.
+            For instance, if you make a list request and receive 100 objects, ending with obj_foo, your
+            subsequent call can include before=obj_foo in order to fetch the previous page of the list.
+          schema:
+            type: string
+      responses:
+        '200':
+          description: The request has succeeded.
+          content:
+            application/json:
+              schema:
+                anyOf:
+                  - $ref: '#/components/schemas/ListRunsResponse'
+                  - $ref: '#/components/schemas/ErrorResponse'
+  /v1/threads/{thread_id}/runs/{run_id}:
+    get:
+      tags:
+        - Assistants
+      operationId: getRun
+      summary: Retrieves a run.
+      parameters:
+        - name: thread_id
+          in: path
+          required: true
+          description: The ID of the [thread](/docs/api-reference/threads) that was run.
+          schema:
+            type: string
+        - name: run_id
+          in: path
+          required: true
+          description: The ID of the run to retrieve.
+          schema:
+            type: string
+      responses:
+        '200':
+          description: The request has succeeded.
+          content:
+            application/json:
+              schema:
+                anyOf:
+                  - $ref: '#/components/schemas/RunObject'
+                  - $ref: '#/components/schemas/ErrorResponse'
+    post:
+      tags:
+        - Assistants
+      operationId: modifyRun
+      summary: Modifies a run.
+ parameters: + - name: thread_id + in: path + required: true + description: The ID of the [thread](/docs/api-reference/threads) that was run. + schema: + type: string + - name: run_id + in: path + required: true + description: The ID of the run to modify. + schema: + type: string + responses: + '200': + description: The request has succeeded. + content: + application/json: + schema: + anyOf: + - $ref: '#/components/schemas/RunObject' + - $ref: '#/components/schemas/ErrorResponse' + requestBody: + required: true + content: + application/json: + schema: + $ref: '#/components/schemas/ModifyRunRequest' + /v1/threads/{thread_id}/runs/{run_id}/cancel: + post: + tags: + - Assistants + operationId: cancelRun + summary: Cancels a run that is `in_progress`. + parameters: + - name: thread_id + in: path + required: true + description: The ID of the thread to which this run belongs. + schema: + type: string + - name: run_id + in: path + required: true + description: The ID of the run to cancel. + schema: + type: string + responses: + '200': + description: The request has succeeded. + content: + application/json: + schema: + anyOf: + - $ref: '#/components/schemas/RunObject' + - $ref: '#/components/schemas/ErrorResponse' + /v1/threads/{thread_id}/runs/{run_id}/steps: + get: + tags: + - Assistants + operationId: listRunSteps + summary: Returns a list of run steps belonging to a run. + parameters: + - name: thread_id + in: path + required: true + description: The ID of the thread the run and run steps belong to. + schema: + type: string + - name: run_id + in: path + required: true + description: The ID of the run the run steps belong to. + schema: + type: string + - name: limit + in: query + required: false + description: |- + A limit on the number of objects to be returned. Limit can range between 1 and 100, and the + default is 20. 
+          schema:
+            type: integer
+            format: int32
+            default: 20
+        - name: order
+          in: query
+          required: false
+          description: |-
+            Sort order by the `created_at` timestamp of the objects. `asc` for ascending order and `desc`
+            for descending order.
+          schema:
+            $ref: '#/components/schemas/ListOrder'
+            default: desc
+        - name: after
+          in: query
+          required: false
+          description: |-
+            A cursor for use in pagination. `after` is an object ID that defines your place in the list.
+            For instance, if you make a list request and receive 100 objects, ending with obj_foo, your
+            subsequent call can include after=obj_foo in order to fetch the next page of the list.
+          schema:
+            type: string
+        - name: before
+          in: query
+          required: false
+          description: |-
+            A cursor for use in pagination. `before` is an object ID that defines your place in the list.
+            For instance, if you make a list request and receive 100 objects, ending with obj_foo, your
+            subsequent call can include before=obj_foo in order to fetch the previous page of the list.
+          schema:
+            type: string
+      responses:
+        '200':
+          description: The request has succeeded.
+          content:
+            application/json:
+              schema:
+                anyOf:
+                  - $ref: '#/components/schemas/ListRunStepsResponse'
+                  - $ref: '#/components/schemas/ErrorResponse'
+  /v1/threads/{thread_id}/runs/{run_id}/steps/{step_id}:
+    get:
+      tags:
+        - Assistants
+      operationId: getRunStep
+      summary: Retrieves a run step.
+      parameters:
+        - name: thread_id
+          in: path
+          required: true
+          description: The ID of the thread to which the run and run step belong.
+          schema:
+            type: string
+        - name: run_id
+          in: path
+          required: true
+          description: The ID of the run to which the run step belongs.
+          schema:
+            type: string
+        - name: step_id
+          in: path
+          required: true
+          description: The ID of the run step to retrieve.
+          schema:
+            type: string
+      responses:
+        '200':
+          description: The request has succeeded.
+          content:
+            application/json:
+              schema:
+                anyOf:
+                  - $ref: '#/components/schemas/RunStepObject'
+                  - $ref: '#/components/schemas/ErrorResponse'
+  /v1/threads/{thread_id}/runs/{run_id}/submit_tool_outputs:
+    post:
+      tags:
+        - Assistants
+      operationId: submitToolOutputsToRun
+      summary: |-
+        When a run has the `status: "requires_action"` and `required_action.type` is
+        `submit_tool_outputs`, this endpoint can be used to submit the outputs from the tool calls once
+        they're all completed. All outputs must be submitted in a single request.
+      parameters:
+        - name: thread_id
+          in: path
+          required: true
+          description: The ID of the [thread](/docs/api-reference/threads) to which this run belongs.
+          schema:
+            type: string
+        - name: run_id
+          in: path
+          required: true
+          description: The ID of the run that requires the tool output submission.
+          schema:
+            type: string
+      responses:
+        '200':
+          description: The request has succeeded.
+          content:
+            application/json:
+              schema:
+                anyOf:
+                  - $ref: '#/components/schemas/RunObject'
+                  - $ref: '#/components/schemas/ErrorResponse'
+      requestBody:
+        required: true
+        content:
+          application/json:
+            schema:
+              $ref: '#/components/schemas/SubmitToolOutputsRunRequest'
+  /v1/uploads:
+    post:
+      tags:
+        - Uploads
+      operationId: createUpload
+      summary: |-
+        Creates an intermediate [Upload](/docs/api-reference/uploads/object) object that you can add [Parts](/docs/api-reference/uploads/part-object) to. Currently, an Upload can accept at most 8 GB in total and expires an hour after you create it.
+
+        Once you complete the Upload, we will create a [File](/docs/api-reference/files/object) object that contains all the parts you uploaded. This File is usable in the rest of our platform as a regular File object.
+
+        For certain `purpose`s, the correct `mime_type` must be specified. Please refer to the documentation for the supported MIME types for your use case:
+        - [Assistants](/docs/assistants/tools/file-search/supported-files)
+
+        For guidance on the proper filename extensions for each purpose, please follow the documentation on [creating a File](/docs/api-reference/files/create).
+      parameters: []
+      responses:
+        '200':
+          description: The request has succeeded.
+          content:
+            application/json:
+              schema:
+                anyOf:
+                  - $ref: '#/components/schemas/Upload'
+                  - $ref: '#/components/schemas/ErrorResponse'
+      requestBody:
+        required: true
+        content:
+          application/json:
+            schema:
+              $ref: '#/components/schemas/CreateUploadRequest'
+  /v1/uploads/{upload_id}/cancel:
+    post:
+      tags:
+        - Uploads
+      operationId: cancelUpload
+      summary: Cancels the Upload. No Parts may be added after an Upload is cancelled.
+      parameters:
+        - name: upload_id
+          in: path
+          required: true
+          schema:
+            type: string
+      responses:
+        '200':
+          description: The request has succeeded.
+          content:
+            application/json:
+              schema:
+                anyOf:
+                  - $ref: '#/components/schemas/Upload'
+                  - $ref: '#/components/schemas/ErrorResponse'
+  /v1/uploads/{upload_id}/complete:
+    post:
+      tags:
+        - Uploads
+      operationId: completeUpload
+      summary: |-
+        Completes the [Upload](/docs/api-reference/uploads/object).
+
+        Within the returned Upload object, there is a nested [File](/docs/api-reference/files/object) object that is ready to use in the rest of the platform.
+
+        You can specify the order of the Parts by passing in an ordered list of the Part IDs.
+
+        The number of bytes uploaded upon completion must match the number of bytes initially specified when creating the Upload object. No Parts may be added after an Upload is completed.
+      parameters:
+        - name: upload_id
+          in: path
+          required: true
+          schema:
+            type: string
+      responses:
+        '200':
+          description: The request has succeeded.
+          content:
+            application/json:
+              schema:
+                anyOf:
+                  - $ref: '#/components/schemas/Upload'
+                  - $ref: '#/components/schemas/ErrorResponse'
+      requestBody:
+        required: true
+        content:
+          application/json:
+            schema:
+              $ref: '#/components/schemas/CompleteUploadRequest'
+  /v1/uploads/{upload_id}/parts:
+    post:
+      tags:
+        - Uploads
+      operationId: addUploadPart
+      summary: |-
+        Adds a [Part](/docs/api-reference/uploads/part-object) to an [Upload](/docs/api-reference/uploads/object) object. A Part represents a chunk of bytes from the file you are trying to upload.
+
+        Each Part can be at most 64 MB, and you can add Parts until you hit the Upload maximum of 8 GB.
+
+        It is possible to add multiple Parts in parallel. You can decide the intended order of the Parts when you [complete the Upload](/docs/api-reference/uploads/complete).
+      parameters:
+        - name: upload_id
+          in: path
+          required: true
+          schema:
+            type: string
+      responses:
+        '200':
+          description: The request has succeeded.
+          content:
+            application/json:
+              schema:
+                anyOf:
+                  - $ref: '#/components/schemas/UploadPart'
+                  - $ref: '#/components/schemas/ErrorResponse'
+      requestBody:
+        required: true
+        content:
+          multipart/form-data:
+            schema:
+              $ref: '#/components/schemas/AddUploadPartRequestMultiPart'
+  /v1/vector_stores:
+    get:
+      tags:
+        - Vector Stores
+      operationId: listVectorStores
+      summary: Returns a list of vector stores.
+      parameters:
+        - name: limit
+          in: query
+          required: false
+          description: |-
+            A limit on the number of objects to be returned. Limit can range between 1 and 100, and the
+            default is 20.
+          schema:
+            type: integer
+            format: int32
+            default: 20
+        - name: order
+          in: query
+          required: false
+          description: |-
+            Sort order by the `created_at` timestamp of the objects. `asc` for ascending order and `desc`
+            for descending order.
+          schema:
+            $ref: '#/components/schemas/ListOrder'
+            default: desc
+        - name: after
+          in: query
+          required: false
+          description: |-
+            A cursor for use in pagination. `after` is an object ID that defines your place in the list.
+            For instance, if you make a list request and receive 100 objects, ending with obj_foo, your
+            subsequent call can include after=obj_foo in order to fetch the next page of the list.
+          schema:
+            type: string
+        - name: before
+          in: query
+          required: false
+          description: |-
+            A cursor for use in pagination. `before` is an object ID that defines your place in the list.
+            For instance, if you make a list request and receive 100 objects, ending with obj_foo, your
+            subsequent call can include before=obj_foo in order to fetch the previous page of the list.
+          schema:
+            type: string
+      responses:
+        '200':
+          description: The request has succeeded.
+          content:
+            application/json:
+              schema:
+                anyOf:
+                  - $ref: '#/components/schemas/ListVectorStoresResponse'
+                  - $ref: '#/components/schemas/ErrorResponse'
+    post:
+      tags:
+        - Vector Stores
+      operationId: createVectorStore
+      summary: Creates a vector store.
+      parameters: []
+      responses:
+        '200':
+          description: The request has succeeded.
+          content:
+            application/json:
+              schema:
+                anyOf:
+                  - $ref: '#/components/schemas/VectorStoreObject'
+                  - $ref: '#/components/schemas/ErrorResponse'
+      requestBody:
+        required: true
+        content:
+          application/json:
+            schema:
+              $ref: '#/components/schemas/CreateVectorStoreRequest'
+  /v1/vector_stores/{vector_store_id}:
+    get:
+      tags:
+        - Vector Stores
+      operationId: getVectorStore
+      summary: Retrieves a vector store.
+      parameters:
+        - name: vector_store_id
+          in: path
+          required: true
+          description: The ID of the vector store to retrieve.
+          schema:
+            type: string
+      responses:
+        '200':
+          description: The request has succeeded.
+          content:
+            application/json:
+              schema:
+                anyOf:
+                  - $ref: '#/components/schemas/VectorStoreObject'
+                  - $ref: '#/components/schemas/ErrorResponse'
+    post:
+      tags:
+        - Vector Stores
+      operationId: modifyVectorStore
+      summary: Modifies a vector store.
+ parameters: + - name: vector_store_id + in: path + required: true + description: The ID of the vector store to modify. + schema: + type: string + responses: + '200': + description: The request has succeeded. + content: + application/json: + schema: + anyOf: + - $ref: '#/components/schemas/VectorStoreObject' + - $ref: '#/components/schemas/ErrorResponse' + requestBody: + required: true + content: + application/json: + schema: + $ref: '#/components/schemas/UpdateVectorStoreRequest' + delete: + tags: + - Vector Stores + operationId: deleteVectorStore + summary: Delete a vector store. + parameters: + - name: vector_store_id + in: path + required: true + description: The ID of the vector store to delete. + schema: + type: string + responses: + '200': + description: The request has succeeded. + content: + application/json: + schema: + anyOf: + - $ref: '#/components/schemas/DeleteVectorStoreResponse' + - $ref: '#/components/schemas/ErrorResponse' + /v1/vector_stores/{vector_store_id}/file_batches: + post: + tags: + - Vector Stores + operationId: createVectorStoreFileBatch + summary: Create a vector store file batch. + parameters: + - name: vector_store_id + in: path + required: true + description: The ID of the vector store for which to create a file batch. + schema: + type: string + responses: + '200': + description: The request has succeeded. + content: + application/json: + schema: + anyOf: + - $ref: '#/components/schemas/VectorStoreFileBatchObject' + - $ref: '#/components/schemas/ErrorResponse' + requestBody: + required: true + content: + application/json: + schema: + $ref: '#/components/schemas/CreateVectorStoreFileBatchRequest' + /v1/vector_stores/{vector_store_id}/file_batches/{batch_id}: + get: + tags: + - Vector Stores + operationId: getVectorStoreFileBatch + summary: Retrieves a vector store file batch. + parameters: + - name: vector_store_id + in: path + required: true + description: The ID of the vector store that the file batch belongs to. 
+ schema: + type: string + - name: batch_id + in: path + required: true + description: The ID of the file batch being retrieved. + schema: + type: string + responses: + '200': + description: The request has succeeded. + content: + application/json: + schema: + anyOf: + - $ref: '#/components/schemas/VectorStoreFileBatchObject' + - $ref: '#/components/schemas/ErrorResponse' + /v1/vector_stores/{vector_store_id}/file_batches/{batch_id}/cancel: + post: + tags: + - Vector Stores + operationId: cancelVectorStoreFileBatch + summary: Cancel a vector store file batch. This attempts to cancel the processing of files in this batch as soon as possible. + parameters: + - name: vector_store_id + in: path + required: true + description: The ID of the vector store that the file batch belongs to. + schema: + type: string + - name: batch_id + in: path + required: true + description: The ID of the file batch to cancel. + schema: + type: string + responses: + '200': + description: The request has succeeded. + content: + application/json: + schema: + anyOf: + - $ref: '#/components/schemas/VectorStoreFileBatchObject' + - $ref: '#/components/schemas/ErrorResponse' + /v1/vector_stores/{vector_store_id}/file_batches/{batch_id}/files: + get: + tags: + - Vector Stores + operationId: listFilesInVectorStoreBatch + summary: Returns a list of vector store files in a batch. + parameters: + - name: vector_store_id + in: path + required: true + description: The ID of the vector store that the file batch belongs to. + schema: + type: string + - name: batch_id + in: path + required: true + description: The ID of the file batch that the files belong to. + schema: + type: string + - name: limit + in: query + required: false + description: |- + A limit on the number of objects to be returned. Limit can range between 1 and 100, and the + default is 20. 
+          schema:
+            type: integer
+            format: int32
+            default: 20
+        - name: order
+          in: query
+          required: false
+          description: |-
+            Sort order by the `created_at` timestamp of the objects. `asc` for ascending order and `desc`
+            for descending order.
+          schema:
+            $ref: '#/components/schemas/ListOrder'
+            default: desc
+        - name: after
+          in: query
+          required: false
+          description: |-
+            A cursor for use in pagination. `after` is an object ID that defines your place in the list.
+            For instance, if you make a list request and receive 100 objects, ending with obj_foo, your
+            subsequent call can include after=obj_foo in order to fetch the next page of the list.
+          schema:
+            type: string
+        - name: before
+          in: query
+          required: false
+          description: |-
+            A cursor for use in pagination. `before` is an object ID that defines your place in the list.
+            For instance, if you make a list request and receive 100 objects, ending with obj_foo, your
+            subsequent call can include before=obj_foo in order to fetch the previous page of the list.
+          schema:
+            type: string
+        - name: filter
+          in: query
+          required: false
+          description: Filter by file status. One of `in_progress`, `completed`, `failed`, `cancelled`.
+          schema:
+            $ref: '#/components/schemas/ListVectorStoreFilesFilter'
+      responses:
+        '200':
+          description: The request has succeeded.
+          content:
+            application/json:
+              schema:
+                anyOf:
+                  - $ref: '#/components/schemas/ListVectorStoreFilesResponse'
+                  - $ref: '#/components/schemas/ErrorResponse'
+  /v1/vector_stores/{vector_store_id}/files:
+    get:
+      tags:
+        - Vector Stores
+      operationId: listVectorStoreFiles
+      summary: Returns a list of vector store files.
+      parameters:
+        - name: vector_store_id
+          in: path
+          required: true
+          description: The ID of the vector store that the files belong to.
+          schema:
+            type: string
+        - name: limit
+          in: query
+          required: false
+          description: |-
+            A limit on the number of objects to be returned. Limit can range between 1 and 100, and the
+            default is 20.
+          schema:
+            type: integer
+            format: int32
+            default: 20
+        - name: order
+          in: query
+          required: false
+          description: |-
+            Sort order by the `created_at` timestamp of the objects. `asc` for ascending order and `desc`
+            for descending order.
+          schema:
+            $ref: '#/components/schemas/ListOrder'
+            default: desc
+        - name: after
+          in: query
+          required: false
+          description: |-
+            A cursor for use in pagination. `after` is an object ID that defines your place in the list.
+            For instance, if you make a list request and receive 100 objects, ending with obj_foo, your
+            subsequent call can include after=obj_foo in order to fetch the next page of the list.
+          schema:
+            type: string
+        - name: before
+          in: query
+          required: false
+          description: |-
+            A cursor for use in pagination. `before` is an object ID that defines your place in the list.
+            For instance, if you make a list request and receive 100 objects, ending with obj_foo, your
+            subsequent call can include before=obj_foo in order to fetch the previous page of the list.
+          schema:
+            type: string
+        - name: filter
+          in: query
+          required: false
+          description: Filter by file status. One of `in_progress`, `completed`, `failed`, `cancelled`.
+          schema:
+            $ref: '#/components/schemas/ListVectorStoreFilesFilter'
+      responses:
+        '200':
+          description: The request has succeeded.
+          content:
+            application/json:
+              schema:
+                anyOf:
+                  - $ref: '#/components/schemas/ListVectorStoreFilesResponse'
+                  - $ref: '#/components/schemas/ErrorResponse'
+    post:
+      tags:
+        - Vector Stores
+      operationId: createVectorStoreFile
+      summary: Create a vector store file by attaching a [File](/docs/api-reference/files) to a [vector store](/docs/api-reference/vector-stores/object).
+      parameters:
+        - name: vector_store_id
+          in: path
+          required: true
+          description: The ID of the vector store for which to create a File.
+          schema:
+            type: string
+      responses:
+        '200':
+          description: The request has succeeded.
+ content: + application/json: + schema: + anyOf: + - $ref: '#/components/schemas/VectorStoreFileObject' + - $ref: '#/components/schemas/ErrorResponse' + requestBody: + required: true + content: + application/json: + schema: + $ref: '#/components/schemas/CreateVectorStoreFileRequest' + /v1/vector_stores/{vector_store_id}/files/{file_id}: + get: + tags: + - Vector Stores + operationId: getVectorStoreFile + summary: Retrieves a vector store file. + parameters: + - name: vector_store_id + in: path + required: true + description: The ID of the vector store that the file belongs to. + schema: + type: string + - name: file_id + in: path + required: true + description: The ID of the file being retrieved. + schema: + type: string + responses: + '200': + description: The request has succeeded. + content: + application/json: + schema: + anyOf: + - $ref: '#/components/schemas/VectorStoreFileObject' + - $ref: '#/components/schemas/ErrorResponse' + delete: + tags: + - Vector Stores + operationId: deleteVectorStoreFile + summary: Delete a vector store file. This will remove the file from the vector store but the file itself will not be deleted. To delete the file, use the [delete file](/docs/api-reference/files/delete) endpoint. + parameters: + - name: vector_store_id + in: path + required: true + description: The ID of the vector store that the file belongs to. + schema: + type: string + - name: file_id + in: path + required: true + description: The ID of the file to delete. + schema: + type: string + responses: + '200': + description: The request has succeeded. 
+ content: + application/json: + schema: + anyOf: + - $ref: '#/components/schemas/DeleteVectorStoreFileResponse' + - $ref: '#/components/schemas/ErrorResponse' +security: + - BearerAuth: [] +components: + schemas: + AddUploadPartRequestMultiPart: + type: object + required: + - data + properties: + data: + type: string + format: binary + AssistantObject: + type: object + required: + - id + - object + - created_at + - name + - description + - model + - instructions + - tools + - metadata + properties: + id: + type: string + description: The identifier, which can be referenced in API endpoints. + object: + type: string + enum: + - assistant + description: The object type, which is always `assistant`. + created_at: + type: integer + format: unixtime + description: The Unix timestamp (in seconds) for when the assistant was created. + name: + type: string + nullable: true + maxLength: 256 + description: The name of the assistant. The maximum length is 256 characters. + description: + type: string + nullable: true + maxLength: 512 + description: The description of the assistant. The maximum length is 512 characters. + model: + type: string + description: ID of the model to use. You can use the [List models](/docs/api-reference/models/list) API to see all of your available models, or see our [Model overview](/docs/models/overview) for descriptions of them. + instructions: + type: string + nullable: true + maxLength: 256000 + description: The system instructions that the assistant uses. The maximum length is 256,000 characters. + tools: + type: array + items: + $ref: '#/components/schemas/AssistantToolDefinition' + maxItems: 128 + description: A list of tools enabled on the assistant. There can be a maximum of 128 tools per assistant. Tools can be of types `code_interpreter`, `file_search`, or `function`. 
+ x-oaiExpandable: true + default: [] + tool_resources: + type: object + properties: + code_interpreter: + type: object + properties: + file_ids: + type: array + items: + type: string + maxItems: 20 + description: A list of [file](/docs/api-reference/files) IDs made available to the `code_interpreter` tool. There can be a maximum of 20 files associated with the tool. + default: [] + file_search: + $ref: '#/components/schemas/ToolResourcesFileSearchIdsOnly' + nullable: true + description: A set of resources that are used by the assistant's tools. The resources are specific to the type of tool. For example, the `code_interpreter` tool requires a list of file IDs, while the `file_search` tool requires a list of vector store IDs. + metadata: + type: object + additionalProperties: + type: string + nullable: true + description: Set of 16 key-value pairs that can be attached to an object. This can be useful for storing additional information about the object in a structured format. Keys can be a maximum of 64 characters long and values can be a maximum of 512 characters long. + x-oaiTypeLabel: map + temperature: + type: number + format: float + nullable: true + minimum: 0 + maximum: 2 + description: What sampling temperature to use, between 0 and 2. Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic. + default: 1 + top_p: + type: number + format: float + nullable: true + minimum: 0 + maximum: 1 + description: |- + An alternative to sampling with temperature, called nucleus sampling, where the model considers the results of the tokens with top_p probability mass. So 0.1 means only the tokens comprising the top 10% probability mass are considered. + + We generally recommend altering this or temperature but not both. 
+ default: 1 + response_format: + oneOf: + - $ref: '#/components/schemas/AssistantsApiResponseFormatOption' + nullable: true + description: Represents an `assistant` that can call the model and use tools. + AssistantResponseFormat: + type: object + required: + - type + properties: + type: + type: string + discriminator: + propertyName: type + mapping: + text: '#/components/schemas/AssistantResponseFormatText' + json_object: '#/components/schemas/AssistantResponseFormatJsonObject' + json_schema: '#/components/schemas/AssistantResponseFormatJsonSchema' + AssistantResponseFormatJsonObject: + type: object + required: + - type + properties: + type: + type: string + enum: + - json_object + description: 'The type of response format being defined: `json_object`' + allOf: + - $ref: '#/components/schemas/AssistantResponseFormat' + AssistantResponseFormatJsonSchema: + type: object + required: + - type + - json_schema + properties: + type: + type: string + enum: + - json_schema + description: 'The type of response format being defined: `json_schema`' + json_schema: + type: object + properties: + description: + type: string + description: A description of what the response format is for, used by the model to determine how to respond in the format. + name: + type: string + description: The name of the response format. Must be a-z, A-Z, 0-9, or contain underscores and dashes, with a maximum length of 64. + schema: + $ref: '#/components/schemas/ResponseFormatJsonSchemaSchema' + strict: + type: boolean + nullable: true + description: Whether to enable strict schema adherence when generating the output. If set to true, the model will always follow the exact schema defined in the `schema` field. Only a subset of JSON Schema is supported when `strict` is `true`. To learn more, read the [Structured Outputs guide](/docs/guides/structured-outputs). 
+ default: false + required: + - name + allOf: + - $ref: '#/components/schemas/AssistantResponseFormat' + AssistantResponseFormatText: + type: object + required: + - type + properties: + type: + type: string + enum: + - text + description: 'The type of response format being defined: `text`' + allOf: + - $ref: '#/components/schemas/AssistantResponseFormat' + AssistantToolDefinition: + type: object + required: + - type + properties: + type: + type: string + discriminator: + propertyName: type + mapping: + code_interpreter: '#/components/schemas/AssistantToolsCode' + file_search: '#/components/schemas/AssistantToolsFileSearch' + function: '#/components/schemas/AssistantToolsFunction' + AssistantToolsCode: + type: object + required: + - type + properties: + type: + type: string + enum: + - code_interpreter + description: 'The type of tool being defined: `code_interpreter`' + allOf: + - $ref: '#/components/schemas/AssistantToolDefinition' + AssistantToolsFileSearch: + type: object + required: + - type + properties: + type: + type: string + enum: + - file_search + description: 'The type of tool being defined: `file_search`' + file_search: + type: object + properties: + max_num_results: + type: integer + format: int32 + minimum: 1 + maximum: 50 + description: |- + The maximum number of results the file search tool should output. The default is 20 for `gpt-4*` models and 5 for `gpt-3.5-turbo`. This number should be between 1 and 50 inclusive. + + Note that the file search tool may output fewer than `max_num_results` results. See the [file search tool documentation](/docs/assistants/tools/file-search/number-of-chunks-returned) for more information. + description: Overrides for the file search tool. 
+ allOf: + - $ref: '#/components/schemas/AssistantToolDefinition' + AssistantToolsFileSearchTypeOnly: + type: object + required: + - type + properties: + type: + type: string + enum: + - file_search + description: 'The type of tool being defined: `file_search`' + AssistantToolsFunction: + type: object + required: + - type + - function + properties: + type: + type: string + enum: + - function + description: 'The type of tool being defined: `function`' + function: + $ref: '#/components/schemas/FunctionObject' + allOf: + - $ref: '#/components/schemas/AssistantToolDefinition' + AssistantsApiResponseFormatOption: + anyOf: + - type: string + enum: + - auto + - $ref: '#/components/schemas/ResponseFormatText' + - $ref: '#/components/schemas/ResponseFormatJsonObject' + - $ref: '#/components/schemas/ResponseFormatJsonSchema' + description: |- + Specifies the format that the model must output. Compatible with [GPT-4o](/docs/models/gpt-4o), [GPT-4 Turbo](/docs/models/gpt-4-turbo-and-gpt-4), and all GPT-3.5 Turbo models since `gpt-3.5-turbo-1106`. + + Setting to `{ "type": "json_schema", "json_schema": {...} }` enables Structured Outputs which guarantees the model will match your supplied JSON schema. Learn more in the [Structured Outputs guide](/docs/guides/structured-outputs). + + Setting to `{ "type": "json_object" }` enables JSON mode, which guarantees the message the model generates is valid JSON. + + **Important:** when using JSON mode, you **must** also instruct the model to produce JSON yourself via a system or user message. Without this, the model may generate an unending stream of whitespace until the generation reaches the token limit, resulting in a long-running and seemingly "stuck" request. Also note that the message content may be partially cut off if `finish_reason="length"`, which indicates the generation exceeded `max_tokens` or the conversation exceeded the max context length. 
+ x-oaiExpandable: true + AssistantsApiToolChoiceOption: + anyOf: + - type: string + enum: + - none + - auto + - required + - $ref: '#/components/schemas/AssistantsNamedToolChoice' + description: |- + Controls which (if any) tool is called by the model. + `none` means the model will not call any tools and instead generates a message. + `auto` is the default value and means the model can pick between generating a message or calling one or more tools. + `required` means the model must call one or more tools before responding to the user. + Specifying a particular tool like `{"type": "file_search"}` or `{"type": "function", "function": {"name": "my_function"}}` forces the model to call that tool. + x-oaiExpandable: true + AssistantsNamedToolChoice: + type: object + required: + - type + properties: + type: + type: string + enum: + - function + - code_interpreter + - file_search + description: The type of the tool. If type is `function`, the function name must be set + function: + type: object + properties: + name: + type: string + description: The name of the function to call. + required: + - name + description: Specifies a tool the model should use. Use to force the model to call a specific tool. + AuditLog: + type: object + required: + - id + - type + - effective_at + - actor + properties: + id: + type: string + description: The ID of this log. + type: + $ref: '#/components/schemas/AuditLogEventType' + effective_at: + type: integer + format: unixtime + description: The Unix timestamp (in seconds) of the event. + project: + type: object + properties: + id: + type: string + description: The project ID. + name: + type: string + description: The project title. + description: The project that the action was scoped to. Absent for actions not scoped to projects. + actor: + $ref: '#/components/schemas/AuditLogActor' + api_key.created: + type: object + properties: + id: + type: string + description: The tracking ID of the API key. 
+ data: + type: object + properties: + scopes: + type: array + items: + type: string + description: A list of scopes allowed for the API key, e.g. `["api.model.request"]` + description: The payload used to create the API key. + description: The details for events with this `type`. + api_key.updated: + type: object + properties: + id: + type: string + description: The tracking ID of the API key. + changes_requested: + type: object + properties: + scopes: + type: array + items: + type: string + description: A list of scopes allowed for the API key, e.g. `["api.model.request"]` + description: The payload used to update the API key. + description: The details for events with this `type`. + api_key.deleted: + type: object + properties: + id: + type: string + description: The tracking ID of the API key. + description: The details for events with this `type`. + invite.sent: + type: object + properties: + id: + type: string + description: The ID of the invite. + data: + type: object + properties: + email: + type: string + description: The email invited to the organization. + role: + type: string + description: The role the email was invited to be. Is either `owner` or `member`. + description: The payload used to create the invite. + description: The details for events with this `type`. + invite.accepted: + type: object + properties: + id: + type: string + description: The ID of the invite. + description: The details for events with this `type`. + invite.deleted: + type: object + properties: + id: + type: string + description: The ID of the invite. + description: The details for events with this `type`. + login.failed: + type: object + properties: + error_code: + type: string + description: The error code of the failure. + error_message: + type: string + description: The error message of the failure. + description: The details for events with this `type`. + logout.failed: + type: object + properties: + error_code: + type: string + description: The error code of the failure. 
+ error_message: + type: string + description: The error message of the failure. + description: The details for events with this `type`. + organization.updated: + type: object + properties: + id: + type: string + description: The organization ID. + changes_requested: + type: object + properties: + title: + type: string + description: The organization title. + description: + type: string + description: The organization description. + name: + type: string + description: The organization name. + settings: + type: object + properties: + threads_ui_visibility: + type: string + description: Visibility of the threads page which shows messages created with the Assistants API and Playground. One of `ANY_ROLE`, `OWNERS`, or `NONE`. + usage_dashboard_visibility: + type: string + description: Visibility of the usage dashboard which shows activity and costs for your organization. One of `ANY_ROLE` or `OWNERS`. + description: The payload used to update the organization settings. + description: The details for events with this `type`. + project.created: + type: object + properties: + id: + type: string + description: The project ID. + data: + type: object + properties: + name: + type: string + description: The project name. + title: + type: string + description: The title of the project as seen on the dashboard. + description: The payload used to create the project. + description: The details for events with this `type`. + project.updated: + type: object + properties: + id: + type: string + description: The project ID. + changes_requested: + type: object + properties: + title: + type: string + description: The title of the project as seen on the dashboard. + description: The payload used to update the project. + description: The details for events with this `type`. + project.archived: + type: object + properties: + id: + type: string + description: The project ID. + description: The details for events with this `type`. 
+ service_account.created: + type: object + properties: + id: + type: string + description: The service account ID. + data: + type: object + properties: + role: + type: string + description: The role of the service account. Is either `owner` or `member`. + description: The payload used to create the service account. + description: The details for events with this `type`. + service_account.updated: + type: object + properties: + id: + type: string + description: The service account ID. + changes_requested: + type: object + properties: + role: + type: string + description: The role of the service account. Is either `owner` or `member`. + description: The payload used to update the service account. + description: The details for events with this `type`. + service_account.deleted: + type: object + properties: + id: + type: string + description: The service account ID. + description: The details for events with this `type`. + user.added: + type: object + properties: + id: + type: string + description: The user ID. + data: + type: object + properties: + role: + type: string + description: The role of the user. Is either `owner` or `member`. + description: The payload used to add the user to the project. + description: The details for events with this `type`. + user.updated: + type: object + properties: + id: + type: string + description: The project ID. + changes_requested: + type: object + properties: + role: + type: string + description: The role of the user. Is either `owner` or `member`. + description: The payload used to update the user. + description: The details for events with this `type`. + user.deleted: + type: object + properties: + id: + type: string + description: The user ID. + description: The details for events with this `type`. + description: A log of a user action or configuration change within this organization. + AuditLogActor: + type: object + properties: + type: + type: string + enum: + - session + - api_key + description: The type of actor. 
Is either `session` or `api_key`. + session: + $ref: '#/components/schemas/AuditLogActorSession' + api_key: + $ref: '#/components/schemas/AuditLogActorApiKey' + description: The actor who performed the audit logged action. + AuditLogActorApiKey: + type: object + properties: + id: + type: string + description: The tracking id of the API key. + type: + type: string + enum: + - user + - service_account + description: The type of API key. Can be either `user` or `service_account`. + user: + $ref: '#/components/schemas/AuditLogActorUser' + service_account: + $ref: '#/components/schemas/AuditLogActorServiceAccount' + description: The API Key used to perform the audit logged action. + AuditLogActorServiceAccount: + type: object + properties: + id: + type: string + description: The service account id. + description: The service account that performed the audit logged action. + AuditLogActorSession: + type: object + properties: + user: + $ref: '#/components/schemas/AuditLogActorUser' + ip_address: + type: string + description: The IP address from which the action was performed. + description: The session in which the audit logged action was performed. + AuditLogActorUser: + type: object + properties: + id: + type: string + description: The user id. + email: + type: string + description: The user email. + description: The user who performed the audit logged action. + AuditLogEventType: + type: string + enum: + - api_key.created + - api_key.updated + - api_key.deleted + - invite.sent + - invite.accepted + - invite.deleted + - login.succeeded + - login.failed + - logout.succeeded + - logout.failed + - organization.updated + - project.created + - project.updated + - project.archived + - service_account.created + - service_account.updated + - service_account.deleted + - user.added + - user.updated + - user.deleted + description: The event type. 
+ x-oaiExpandable: true + AutoChunkingStrategyRequestParam: + type: object + required: + - type + properties: + type: + type: string + enum: + - auto + description: Always `auto`. + allOf: + - $ref: '#/components/schemas/FileChunkingStrategyRequestParam' + description: The default strategy. This strategy currently uses a `max_chunk_size_tokens` of `800` and `chunk_overlap_tokens` of `400`. + AutoChunkingStrategyResponseParam: + type: object + required: + - type + properties: + type: + type: string + enum: + - auto + allOf: + - $ref: '#/components/schemas/FileChunkingStrategyResponseParam' + Batch: + type: object + required: + - id + - object + - endpoint + - input_file_id + - completion_window + - status + - created_at + properties: + id: + type: string + object: + type: string + enum: + - batch + description: The object type, which is always `batch`. + endpoint: + type: string + description: The OpenAI API endpoint used by the batch. + errors: + type: object + properties: + object: + type: string + enum: + - list + description: The object type, which is always `list`. + data: + type: array + items: + type: object + properties: + code: + type: string + description: An error code identifying the error type. + message: + type: string + description: A human-readable message providing more details about the error. + param: + type: string + nullable: true + description: The name of the parameter that caused the error, if applicable. + line: + type: integer + format: int32 + nullable: true + description: The line number of the input file where the error occurred, if applicable. + input_file_id: + type: string + description: The ID of the input file for the batch. + completion_window: + type: string + description: The time frame within which the batch should be processed. + status: + type: string + enum: + - validating + - failed + - in_progress + - finalizing + - completed + - expired + - cancelling + - cancelled + description: The current status of the batch. 
+ output_file_id: + type: string + description: The ID of the file containing the outputs of successfully executed requests. + error_file_id: + type: string + description: The ID of the file containing the outputs of requests with errors. + created_at: + type: integer + format: unixtime + description: The Unix timestamp (in seconds) for when the batch was created. + in_progress_at: + type: integer + format: unixtime + description: The Unix timestamp (in seconds) for when the batch started processing. + expires_at: + type: integer + format: unixtime + description: The Unix timestamp (in seconds) for when the batch will expire. + finalizing_at: + type: integer + format: unixtime + description: The Unix timestamp (in seconds) for when the batch started finalizing. + completed_at: + type: integer + format: unixtime + description: The Unix timestamp (in seconds) for when the batch was completed. + failed_at: + type: integer + format: unixtime + description: The Unix timestamp (in seconds) for when the batch failed. + expired_at: + type: integer + format: unixtime + description: The Unix timestamp (in seconds) for when the batch expired. + cancelling_at: + type: integer + format: unixtime + description: The Unix timestamp (in seconds) for when the batch started cancelling. + cancelled_at: + type: integer + format: unixtime + description: The Unix timestamp (in seconds) for when the batch was cancelled. + request_counts: + type: object + properties: + total: + type: integer + format: int32 + description: Total number of requests in the batch. + completed: + type: integer + format: int32 + description: Number of requests that have been completed successfully. + failed: + type: integer + format: int32 + description: Number of requests that have failed. + required: + - total + - completed + - failed + description: The request counts for different statuses within the batch. 
+ metadata: + type: object + additionalProperties: + type: string + nullable: true + description: Set of 16 key-value pairs that can be attached to an object. This can be useful for storing additional information about the object in a structured format. Keys can be a maximum of 64 characters long and values can be a maximum of 512 characters long. + x-oaiTypeLabel: map + BatchRequestInput: + type: object + properties: + custom_id: + type: string + description: A developer-provided per-request id that will be used to match outputs to inputs. Must be unique for each request in a batch. + method: + type: string + enum: + - POST + description: The HTTP method to be used for the request. Currently only `POST` is supported. + url: + type: string + format: uri + description: The OpenAI API relative URL to be used for the request. Currently `/v1/chat/completions`, `/v1/embeddings`, and `/v1/completions` are supported. + description: The per-line object of the batch input file + BatchRequestOutput: + type: object + properties: + id: + type: string + custom_id: + type: string + description: A developer-provided per-request id that will be used to match outputs to inputs. + response: + type: object + properties: + status_code: + type: integer + format: int32 + description: The HTTP status code of the response + request_id: + type: string + description: A unique identifier for the OpenAI API request. Please include this request ID when contacting support. + body: + type: object + additionalProperties: + type: string + description: The JSON body of the response + x-oaiTypeLabel: map + nullable: true + error: + type: object + properties: + code: + type: string + description: A machine-readable error code. + message: + type: string + description: A human-readable error message. + nullable: true + description: For requests that failed with a non-HTTP error, this will contain more information on the cause of the failure. 
+ description: The per-line object of the batch output and error files + ChatCompletionFunctionCallOption: + type: object + required: + - name + properties: + name: + type: string + description: The name of the function to call. + description: 'Specifying a particular function via `{"name": "my_function"}` forces the model to call that function.' + ChatCompletionFunctionChoice: + type: object + ChatCompletionFunctions: + type: object + required: + - name + properties: + description: + type: string + description: A description of what the function does, used by the model to choose when and how to call the function. + name: + type: string + description: The name of the function to be called. Must be a-z, A-Z, 0-9, or contain underscores and dashes, with a maximum length of 64. + parameters: + $ref: '#/components/schemas/FunctionParameters' + deprecated: true + ChatCompletionMessageToolCall: + type: object + required: + - id + - type + - function + properties: + id: + type: string + description: The ID of the tool call. + type: + type: string + enum: + - function + description: The type of the tool. Currently, only `function` is supported. + function: + type: object + properties: + name: + type: string + description: The name of the function to call. + arguments: + type: string + description: The arguments to call the function with, as generated by the model in JSON format. Note that the model does not always generate valid JSON, and may hallucinate parameters not defined by your function schema. Validate the arguments in your code before calling your function. + required: + - name + - arguments + description: The function that the model called. + ChatCompletionMessageToolCallChunk: + type: object + required: + - index + properties: + index: + type: integer + format: int32 + id: + type: string + description: The ID of the tool call. + type: + type: string + enum: + - function + description: The type of the tool. Currently, only `function` is supported. 
+ function: + type: object + properties: + name: + type: string + description: The name of the function to call. + arguments: + type: string + description: The arguments to call the function with, as generated by the model in JSON format. Note that the model does not always generate valid JSON, and may hallucinate parameters not defined by your function schema. Validate the arguments in your code before calling your function. + ChatCompletionMessageToolCallsItem: + type: array + items: + $ref: '#/components/schemas/ChatCompletionMessageToolCall' + description: The tool calls generated by the model, such as function calls. + ChatCompletionNamedToolChoice: + type: object + required: + - type + - function + properties: + type: + type: string + enum: + - function + description: The type of the tool. Currently, only `function` is supported. + function: + type: object + properties: + name: + type: string + description: The name of the function to call. + required: + - name + description: Specifies a tool the model should use. Use to force the model to call a specific function. + ChatCompletionRequestAssistantMessage: + type: object + required: + - role + properties: + content: + anyOf: + - type: string + - type: array + items: + $ref: '#/components/schemas/ChatCompletionRequestAssistantMessageContentPart' + nullable: true + description: The contents of the assistant message. Required unless `tool_calls` or `function_call` is specified. + refusal: + type: string + nullable: true + description: The refusal message by the assistant. + role: + type: string + enum: + - assistant + description: The role of the messages author, in this case `assistant`. + name: + type: string + description: An optional name for the participant. Provides the model information to differentiate between participants of the same role. 
+ tool_calls: + $ref: '#/components/schemas/ChatCompletionMessageToolCallsItem' + function_call: + type: object + properties: + arguments: + type: string + description: The arguments to call the function with, as generated by the model in JSON format. Note that the model does not always generate valid JSON, and may hallucinate parameters not defined by your function schema. Validate the arguments in your code before calling your function. + name: + type: string + description: The name of the function to call. + required: + - arguments + - name + nullable: true + description: Deprecated and replaced by `tool_calls`. The name and arguments of a function that should be called, as generated by the model. + deprecated: true + allOf: + - $ref: '#/components/schemas/ChatCompletionRequestMessage' + ChatCompletionRequestAssistantMessageContentPart: + anyOf: + - $ref: '#/components/schemas/ChatCompletionRequestMessageContentPartText' + - $ref: '#/components/schemas/ChatCompletionRequestMessageContentPartRefusal' + x-oaiExpandable: true + ChatCompletionRequestFunctionMessage: + type: object + required: + - role + - content + - name + properties: + role: + type: string + enum: + - function + description: The role of the messages author, in this case `function`. + content: + type: string + nullable: true + description: The contents of the function message. + name: + type: string + description: The name of the function to call. + allOf: + - $ref: '#/components/schemas/ChatCompletionRequestMessage' + deprecated: true + ChatCompletionRequestMessage: + type: object + required: + - role + properties: + role: + type: string + description: The role of the author of this message. 
+ discriminator: + propertyName: role + mapping: + system: '#/components/schemas/ChatCompletionRequestSystemMessage' + user: '#/components/schemas/ChatCompletionRequestUserMessage' + assistant: '#/components/schemas/ChatCompletionRequestAssistantMessage' + tool: '#/components/schemas/ChatCompletionRequestToolMessage' + function: '#/components/schemas/ChatCompletionRequestFunctionMessage' + x-oaiExpandable: true + ChatCompletionRequestMessageContentPartImage: + type: object + required: + - type + - image_url + properties: + type: + type: string + enum: + - image_url + description: The type of the content part. + image_url: + type: object + properties: + url: + type: string + format: uri + description: Either a URL of the image or the base64 encoded image data. + detail: + type: string + enum: + - auto + - low + - high + description: Specifies the detail level of the image. Learn more in the [Vision guide](/docs/guides/vision/low-or-high-fidelity-image-understanding). + default: auto + required: + - url + ChatCompletionRequestMessageContentPartRefusal: + type: object + required: + - type + - refusal + properties: + type: + type: string + enum: + - refusal + description: The type of the content part. + refusal: + type: string + description: The refusal message generated by the model. + ChatCompletionRequestMessageContentPartText: + type: object + required: + - type + - text + properties: + type: + type: string + enum: + - text + description: The type of the content part. + text: + type: string + description: The text content. + ChatCompletionRequestSystemMessage: + type: object + required: + - content + - role + properties: + content: + anyOf: + - type: string + - type: array + items: + $ref: '#/components/schemas/ChatCompletionRequestSystemMessageContentPart' + description: The contents of the system message. + role: + type: string + enum: + - system + description: The role of the messages author, in this case `system`. 
+ name: + type: string + description: An optional name for the participant. Provides the model information to differentiate between participants of the same role. + allOf: + - $ref: '#/components/schemas/ChatCompletionRequestMessage' + ChatCompletionRequestSystemMessageContentPart: + type: object + allOf: + - $ref: '#/components/schemas/ChatCompletionRequestMessageContentPartText' + x-oaiExpandable: true + ChatCompletionRequestToolMessage: + type: object + required: + - role + - content + - tool_call_id + properties: + role: + type: string + enum: + - tool + description: The role of the messages author, in this case `tool`. + content: + anyOf: + - type: string + - type: array + items: + $ref: '#/components/schemas/ChatCompletionRequestToolMessageContentPart' + description: The contents of the tool message. + tool_call_id: + type: string + description: Tool call that this message is responding to. + allOf: + - $ref: '#/components/schemas/ChatCompletionRequestMessage' + ChatCompletionRequestToolMessageContentPart: + type: object + allOf: + - $ref: '#/components/schemas/ChatCompletionRequestMessageContentPartText' + x-oaiExpandable: true + ChatCompletionRequestUserMessage: + type: object + required: + - content + - role + properties: + content: + anyOf: + - type: string + - type: array + items: + $ref: '#/components/schemas/ChatCompletionRequestUserMessageContentPart' + description: The contents of the user message. + x-oaiExpandable: true + role: + type: string + enum: + - user + description: The role of the messages author, in this case `user`. + name: + type: string + description: An optional name for the participant. Provides the model information to differentiate between participants of the same role. 
+ allOf: + - $ref: '#/components/schemas/ChatCompletionRequestMessage' + ChatCompletionRequestUserMessageContentPart: + anyOf: + - $ref: '#/components/schemas/ChatCompletionRequestMessageContentPartText' + - $ref: '#/components/schemas/ChatCompletionRequestMessageContentPartImage' + x-oaiExpandable: true + ChatCompletionResponseMessage: + type: object + required: + - content + - refusal + - role + properties: + content: + type: string + nullable: true + description: The contents of the message. + refusal: + type: string + nullable: true + description: The refusal message generated by the model. + tool_calls: + $ref: '#/components/schemas/ChatCompletionMessageToolCallsItem' + role: + type: string + enum: + - assistant + description: The role of the author of this message. + function_call: + type: object + properties: + arguments: + type: string + description: The arguments to call the function with, as generated by the model in JSON format. Note that the model does not always generate valid JSON, and may hallucinate parameters not defined by your function schema. Validate the arguments in your code before calling your function. + name: + type: string + description: The name of the function to call. + required: + - arguments + - name + description: Deprecated and replaced by `tool_calls`. The name and arguments of a function that should be called, as generated by the model. + deprecated: true + description: A chat completion message generated by the model. + ChatCompletionRole: + type: string + enum: + - system + - user + - assistant + - tool + - function + description: The role of the author of a message + ChatCompletionStreamOptions: + type: object + properties: + include_usage: + type: boolean + description: 'If set, an additional chunk will be streamed before the `data: [DONE]` message. The `usage` field on this chunk shows the token usage statistics for the entire request, and the `choices` field will always be an empty array. 
All other chunks will also include a `usage` field, but with a null value.' + description: 'Options for streaming response. Only set this when you set `stream: true`.' + ChatCompletionStreamResponseDelta: + type: object + properties: + content: + type: string + nullable: true + description: The contents of the chunk message. + function_call: + type: object + properties: + arguments: + type: string + description: The arguments to call the function with, as generated by the model in JSON format. Note that the model does not always generate valid JSON, and may hallucinate parameters not defined by your function schema. Validate the arguments in your code before calling your function. + name: + type: string + description: The name of the function to call. + description: Deprecated and replaced by `tool_calls`. The name and arguments of a function that should be called, as generated by the model. + deprecated: true + tool_calls: + type: array + items: + $ref: '#/components/schemas/ChatCompletionMessageToolCallChunk' + role: + type: string + enum: + - system + - user + - assistant + - tool + description: The role of the author of this message. + refusal: + type: string + nullable: true + description: The refusal message generated by the model. + description: A chat completion delta generated by streamed model responses. + ChatCompletionTokenLogprob: + type: object + required: + - token + - logprob + - bytes + - top_logprobs + properties: + token: + type: string + description: The token. + logprob: + type: number + format: float + description: The log probability of this token, if it is within the top 20 most likely tokens. Otherwise, the value `-9999.0` is used to signify that the token is very unlikely. + bytes: + type: object + allOf: + - $ref: '#/components/schemas/ChatCompletionTokenLogprobBytesItem' + nullable: true + description: A list of integers representing the UTF-8 bytes representation of the token. 
Useful in instances where characters are represented by multiple tokens and their byte representations must be combined to generate the correct text representation. Can be `null` if there is no bytes representation for the token. + top_logprobs: + type: array + items: + type: object + properties: + token: + type: string + description: The token. + logprob: + type: number + format: float + description: The log probability of this token, if it is within the top 20 most likely tokens. Otherwise, the value `-9999.0` is used to signify that the token is very unlikely. + bytes: + type: array + items: + type: integer + format: int32 + nullable: true + description: A list of integers representing the UTF-8 bytes representation of the token. Useful in instances where characters are represented by multiple tokens and their byte representations must be combined to generate the correct text representation. Can be `null` if there is no bytes representation for the token. + required: + - token + - logprob + - bytes + description: List of the most likely tokens and their log probability, at this token position. In rare cases, there may be fewer than the number of requested `top_logprobs` returned. + ChatCompletionTokenLogprobBytesItem: + type: array + items: + type: integer + format: int32 + ChatCompletionTool: + type: object + required: + - type + - function + properties: + type: + type: string + enum: + - function + description: The type of the tool. Currently, only `function` is supported. + function: + $ref: '#/components/schemas/FunctionObject' + ChatCompletionToolChoice: + type: object + ChatCompletionToolChoiceOption: + anyOf: + - type: string + enum: + - none + - auto + - required + - $ref: '#/components/schemas/ChatCompletionNamedToolChoice' + description: |- + Controls which (if any) tool is called by the model. + `none` means the model will not call any tool and instead generates a message. 
+ `auto` means the model can pick between generating a message or calling one or more tools. + `required` means the model must call one or more tools. + Specifying a particular tool via `{"type": "function", "function": {"name": "my_function"}}` forces the model to call that tool. + + `none` is the default when no tools are present. `auto` is the default if tools are present. + x-oaiExpandable: true + ChatMessageContentPart: + type: object + ChatResponseFormat: + type: object + required: + - type + properties: + type: + type: string + discriminator: + propertyName: type + mapping: + text: '#/components/schemas/ChatResponseFormatText' + json_object: '#/components/schemas/ChatResponseFormatJsonObject' + json_schema: '#/components/schemas/ChatResponseFormatJsonSchema' + ChatResponseFormatJsonObject: + type: object + required: + - type + properties: + type: + type: string + enum: + - json_object + description: 'The type of response format being defined: `json_object`' + allOf: + - $ref: '#/components/schemas/ChatResponseFormat' + ChatResponseFormatJsonSchema: + type: object + required: + - type + - json_schema + properties: + type: + type: string + enum: + - json_schema + description: 'The type of response format being defined: `json_schema`' + json_schema: + type: object + properties: + description: + type: string + description: A description of what the response format is for, used by the model to determine how to respond in the format. + name: + type: string + description: The name of the response format. Must be a-z, A-Z, 0-9, or contain underscores and dashes, with a maximum length of 64. + schema: + $ref: '#/components/schemas/ResponseFormatJsonSchemaSchema' + strict: + type: boolean + nullable: true + description: Whether to enable strict schema adherence when generating the output. If set to true, the model will always follow the exact schema defined in the `schema` field. Only a subset of JSON Schema is supported when `strict` is `true`. 
+ To learn more, read the [Structured Outputs guide](/docs/guides/structured-outputs). + default: false + required: + - name + allOf: + - $ref: '#/components/schemas/ChatResponseFormat' + ChatResponseFormatText: + type: object + required: + - type + properties: + type: + type: string + enum: + - text + description: 'The type of response format being defined: `text`' + allOf: + - $ref: '#/components/schemas/ChatResponseFormat' + ChunkingStrategyRequestParam: + anyOf: + - $ref: '#/components/schemas/AutoChunkingStrategyRequestParam' + - $ref: '#/components/schemas/StaticChunkingStrategyRequestParam' + description: The chunking strategy used to chunk the file(s). If not set, will use the `auto` strategy. + x-oaiExpandable: true + CompleteUploadRequest: + type: object + required: + - part_ids + properties: + part_ids: + type: array + items: + type: string + description: The ordered list of Part IDs. + md5: + type: string + description: The optional md5 checksum for the file contents to verify if the bytes uploaded match what you expect. + CompletionUsage: + type: object + required: + - completion_tokens + - prompt_tokens + - total_tokens + properties: + completion_tokens: + type: integer + format: int32 + description: Number of tokens in the generated completion. + prompt_tokens: + type: integer + format: int32 + description: Number of tokens in the prompt. + total_tokens: + type: integer + format: int32 + description: Total number of tokens used in the request (prompt + completion). + description: Usage statistics for the completion request. 
+ CreateAssistantRequest: + type: object + required: + - model + properties: + model: + anyOf: + - type: string + - type: string + enum: + - gpt-4o + - gpt-4o-2024-08-06 + - gpt-4o-2024-05-13 + - gpt-4o-mini + - gpt-4o-mini-2024-07-18 + - gpt-4-turbo + - gpt-4-turbo-2024-04-09 + - gpt-4-0125-preview + - gpt-4-turbo-preview + - gpt-4-1106-preview + - gpt-4-vision-preview + - gpt-4 + - gpt-4-0314 + - gpt-4-0613 + - gpt-4-32k + - gpt-4-32k-0314 + - gpt-4-32k-0613 + - gpt-3.5-turbo + - gpt-3.5-turbo-16k + - gpt-3.5-turbo-0613 + - gpt-3.5-turbo-1106 + - gpt-3.5-turbo-0125 + - gpt-3.5-turbo-16k-0613 + description: ID of the model to use. You can use the [List models](/docs/api-reference/models/list) API to see all of your available models, or see our [Model overview](/docs/models/overview) for descriptions of them. + x-oaiTypeLabel: string + name: + type: string + nullable: true + maxLength: 256 + description: The name of the assistant. The maximum length is 256 characters. + description: + type: string + nullable: true + maxLength: 512 + description: The description of the assistant. The maximum length is 512 characters. + instructions: + type: string + nullable: true + maxLength: 256000 + description: The system instructions that the assistant uses. The maximum length is 256,000 characters. + tools: + type: array + items: + $ref: '#/components/schemas/AssistantToolDefinition' + maxItems: 128 + description: A list of tools enabled on the assistant. There can be a maximum of 128 tools per assistant. Tools can be of types `code_interpreter`, `file_search`, or `function`. + x-oaiExpandable: true + default: [] + tool_resources: + type: object + properties: + code_interpreter: + type: object + properties: + file_ids: + type: array + items: + type: string + maxItems: 20 + description: A list of [file](/docs/api-reference/files) IDs made available to the `code_interpreter` tool. There can be a maximum of 20 files associated with the tool. 
+ default: [] + file_search: + $ref: '#/components/schemas/ToolResourcesFileSearch' + nullable: true + description: A set of resources that are used by the assistant's tools. The resources are specific to the type of tool. For example, the `code_interpreter` tool requires a list of file IDs, while the `file_search` tool requires a list of vector store IDs. + metadata: + type: object + additionalProperties: + type: string + nullable: true + description: Set of 16 key-value pairs that can be attached to an object. This can be useful for storing additional information about the object in a structured format. Keys can be a maximum of 64 characters long and values can be a maximum of 512 characters long. + x-oaiTypeLabel: map + temperature: + type: number + format: float + nullable: true + minimum: 0 + maximum: 2 + description: What sampling temperature to use, between 0 and 2. Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic. + default: 1 + top_p: + type: number + format: float + nullable: true + minimum: 0 + maximum: 1 + description: |- + An alternative to sampling with temperature, called nucleus sampling, where the model considers the results of the tokens with top_p probability mass. So 0.1 means only the tokens comprising the top 10% probability mass are considered. + + We generally recommend altering this or temperature but not both. + default: 1 + response_format: + oneOf: + - $ref: '#/components/schemas/AssistantsApiResponseFormatOption' + nullable: true + CreateChatCompletionFunctionResponse: + type: object + required: + - id + - choices + - created + - model + - object + properties: + id: + type: string + description: A unique identifier for the chat completion. + choices: + type: array + items: + type: object + properties: + finish_reason: + type: string + enum: + - stop + - length + - function_call + - content_filter + description: The reason the model stopped generating tokens. 
+ This will be `stop` if the model hit a natural stop point or a provided stop sequence, `length` if the maximum number of tokens specified in the request was reached, `content_filter` if content was omitted due to a flag from our content filters, or `function_call` if the model called a function. + index: + type: integer + format: int32 + description: The index of the choice in the list of choices. + message: + $ref: '#/components/schemas/ChatCompletionResponseMessage' + required: + - finish_reason + - index + - message + description: A list of chat completion choices. Can be more than one if `n` is greater than 1. + created: + type: integer + format: unixtime + description: The Unix timestamp (in seconds) of when the chat completion was created. + model: + type: string + description: The model used for the chat completion. + system_fingerprint: + type: string + description: |- + This fingerprint represents the backend configuration that the model runs with. + + Can be used in conjunction with the `seed` request parameter to understand when backend changes have been made that might impact determinism. + object: + type: string + enum: + - chat.completion + description: The object type, which is always `chat.completion`. + usage: + $ref: '#/components/schemas/CompletionUsage' + description: Represents a chat completion response returned by the model, based on the provided input. + CreateChatCompletionRequest: + type: object + required: + - messages + - model + properties: + messages: + type: array + items: + $ref: '#/components/schemas/ChatCompletionRequestMessage' + minItems: 1 + description: A list of messages comprising the conversation so far. [Example Python code](https://cookbook.openai.com/examples/how_to_format_inputs_to_chatgpt_models). 
+ model: + anyOf: + - type: string + - type: string + enum: + - gpt-4o + - gpt-4o-2024-05-13 + - gpt-4o-2024-08-06 + - chatgpt-4o-latest + - gpt-4o-mini + - gpt-4o-mini-2024-07-18 + - gpt-4-turbo + - gpt-4-turbo-2024-04-09 + - gpt-4-0125-preview + - gpt-4-turbo-preview + - gpt-4-1106-preview + - gpt-4-vision-preview + - gpt-4 + - gpt-4-0314 + - gpt-4-0613 + - gpt-4-32k + - gpt-4-32k-0314 + - gpt-4-32k-0613 + - gpt-3.5-turbo + - gpt-3.5-turbo-16k + - gpt-3.5-turbo-0301 + - gpt-3.5-turbo-0613 + - gpt-3.5-turbo-1106 + - gpt-3.5-turbo-0125 + - gpt-3.5-turbo-16k-0613 + description: ID of the model to use. See the [model endpoint compatibility](/docs/models/model-endpoint-compatibility) table for details on which models work with the Chat API. + x-oaiTypeLabel: string + frequency_penalty: + type: number + format: float + nullable: true + minimum: -2 + maximum: 2 + description: |- + Number between -2.0 and 2.0. Positive values penalize new tokens based on their existing frequency in the text so far, decreasing the model's likelihood to repeat the same line verbatim. + + [See more information about frequency and presence penalties.](/docs/guides/text-generation/parameter-details) + default: 0 + logit_bias: + type: object + additionalProperties: + type: integer + format: int32 + nullable: true + description: |- + Modify the likelihood of specified tokens appearing in the completion. + + Accepts a JSON object that maps tokens (specified by their token ID in the tokenizer) to an associated bias value from -100 to 100. Mathematically, the bias is added to the logits generated by the model prior to sampling. The exact effect will vary per model, but values between -1 and 1 should decrease or increase likelihood of selection; values like -100 or 100 should result in a ban or exclusive selection of the relevant token. + x-oaiTypeLabel: map + default: null + logprobs: + type: boolean + nullable: true + description: Whether to return log probabilities of the output tokens or not. 
If true, returns the log probabilities of each output token returned in the `content` of `message`. + default: false + top_logprobs: + type: integer + format: int32 + nullable: true + minimum: 0 + maximum: 20 + description: An integer between 0 and 20 specifying the number of most likely tokens to return at each token position, each with an associated log probability. `logprobs` must be set to `true` if this parameter is used. + max_tokens: + type: integer + format: int32 + nullable: true + description: |- + The maximum number of [tokens](/tokenizer) that can be generated in the chat completion. + + The total length of input tokens and generated tokens is limited by the model's context length. [Example Python code](https://cookbook.openai.com/examples/how_to_count_tokens_with_tiktoken) for counting tokens. + n: + type: integer + format: int32 + nullable: true + minimum: 1 + maximum: 128 + description: How many chat completion choices to generate for each input message. Note that you will be charged based on the number of generated tokens across all of the choices. Keep `n` as `1` to minimize costs. + default: 1 + presence_penalty: + type: number + format: float + nullable: true + minimum: -2 + maximum: 2 + description: |- + Number between -2.0 and 2.0. Positive values penalize new tokens based on whether they appear in the text so far, increasing the model's likelihood to talk about new topics. + + [See more information about frequency and presence penalties.](/docs/guides/text-generation/parameter-details) + default: 0 + response_format: + allOf: + - $ref: '#/components/schemas/ChatResponseFormat' + description: |- + An object specifying the format that the model must output. Compatible with [GPT-4o](/docs/models/gpt-4o), [GPT-4o mini](/docs/models/gpt-4o-mini), [GPT-4 Turbo](/docs/models/gpt-4-and-gpt-4-turbo) and all GPT-3.5 Turbo models newer than `gpt-3.5-turbo-1106`. 
+ + Setting to `{ "type": "json_schema", "json_schema": {...} }` enables Structured Outputs which guarantees the model will match your supplied JSON schema. Learn more in the [Structured Outputs guide](/docs/guides/structured-outputs). + + Setting to `{ "type": "json_object" }` enables JSON mode, which guarantees the message the model generates is valid JSON. + + **Important:** when using JSON mode, you **must** also instruct the model to produce JSON yourself via a system or user message. Without this, the model may generate an unending stream of whitespace until the generation reaches the token limit, resulting in a long-running and seemingly "stuck" request. Also note that the message content may be partially cut off if `finish_reason="length"`, which indicates the generation exceeded `max_tokens` or the conversation exceeded the max context length. + x-oaiExpandable: true + seed: + type: integer + format: int64 + nullable: true + description: |- + This feature is in Beta. + If specified, our system will make a best effort to sample deterministically, such that repeated requests with the same `seed` and parameters should return the same result. + Determinism is not guaranteed, and you should refer to the `system_fingerprint` response parameter to monitor changes in the backend. + service_tier: + type: string + enum: + - auto + - default + nullable: true + description: |- + Specifies the latency tier to use for processing the request. This parameter is relevant for customers subscribed to the scale tier service: + - If set to 'auto', the system will utilize scale tier credits until they are exhausted. + - If set to 'default', the request will be processed using the default service tier with a lower uptime SLA and no latency guarantee. + - When not set, the default behavior is 'auto'. + + When this parameter is set, the response body will include the `service_tier` utilized. 
+ default: null + stop: + anyOf: + - type: string + - type: array + items: + type: string + nullable: true + description: Up to 4 sequences where the API will stop generating further tokens. + default: null + stream: + type: boolean + nullable: true + description: 'If set, partial message deltas will be sent, like in ChatGPT. Tokens will be sent as data-only [server-sent events](https://developer.mozilla.org/en-US/docs/Web/API/Server-sent_events/Using_server-sent_events#Event_stream_format) as they become available, with the stream terminated by a `data: [DONE]` message. [Example Python code](https://cookbook.openai.com/examples/how_to_stream_completions).' + default: false + stream_options: + type: object + allOf: + - $ref: '#/components/schemas/ChatCompletionStreamOptions' + nullable: true + default: null + temperature: + type: number + format: float + nullable: true + minimum: 0 + maximum: 2 + description: |- + What sampling temperature to use, between 0 and 2. Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic. + + We generally recommend altering this or `top_p` but not both. + default: 1 + top_p: + type: number + format: float + nullable: true + minimum: 0 + maximum: 1 + description: |- + An alternative to sampling with temperature, called nucleus sampling, where the model considers the results of the tokens with top_p probability mass. So 0.1 means only the tokens comprising the top 10% probability mass are considered. + + We generally recommend altering this or `temperature` but not both. + default: 1 + tools: + type: array + items: + $ref: '#/components/schemas/ChatCompletionTool' + description: A list of tools the model may call. Currently, only functions are supported as a tool. Use this to provide a list of functions the model may generate JSON inputs for. A max of 128 functions are supported. 
+ tool_choice: + $ref: '#/components/schemas/ChatCompletionToolChoiceOption' + parallel_tool_calls: + type: boolean + default: true + user: + type: string + description: A unique identifier representing your end-user, which can help OpenAI to monitor and detect abuse. [Learn more](/docs/guides/safety-best-practices/end-user-ids). + function_call: + anyOf: + - type: string + enum: + - none + - auto + - $ref: '#/components/schemas/ChatCompletionFunctionCallOption' + description: |- + Deprecated in favor of `tool_choice`. + + Controls which (if any) function is called by the model. + `none` means the model will not call a function and instead generates a message. + `auto` means the model can pick between generating a message or calling a function. + Specifying a particular function via `{"name": "my_function"}` forces the model to call that function. + + `none` is the default when no functions are present. `auto` is the default if functions are present. + deprecated: true + x-oaiExpandable: true + functions: + type: array + items: + $ref: '#/components/schemas/ChatCompletionFunctions' + minItems: 1 + maxItems: 128 + description: |- + Deprecated in favor of `tools`. + + A list of functions the model may generate JSON inputs for. + deprecated: true + CreateChatCompletionResponse: + type: object + required: + - id + - choices + - created + - model + - object + properties: + id: + type: string + description: A unique identifier for the chat completion. + choices: + type: array + items: + type: object + properties: + finish_reason: + type: string + enum: + - stop + - length + - tool_calls + - content_filter + - function_call + description: |- + The reason the model stopped generating tokens. 
This will be `stop` if the model hit a natural stop point or a provided stop sequence, + `length` if the maximum number of tokens specified in the request was reached, + `content_filter` if content was omitted due to a flag from our content filters, + `tool_calls` if the model called a tool, or `function_call` (deprecated) if the model called a function. + index: + type: integer + format: int32 + description: The index of the choice in the list of choices. + message: + $ref: '#/components/schemas/ChatCompletionResponseMessage' + logprobs: + type: object + properties: + content: + type: array + items: + $ref: '#/components/schemas/ChatCompletionTokenLogprob' + nullable: true + description: A list of message content tokens with log probability information. + refusal: + type: array + items: + $ref: '#/components/schemas/ChatCompletionTokenLogprob' + nullable: true + description: A list of message refusal tokens with log probability information. + required: + - content + - refusal + nullable: true + description: Log probability information for the choice. + required: + - finish_reason + - index + - message + - logprobs + description: A list of chat completion choices. Can be more than one if `n` is greater than 1. + created: + type: integer + format: unixtime + description: The Unix timestamp (in seconds) of when the chat completion was created. + model: + type: string + description: The model used for the chat completion. + service_tier: + type: string + enum: + - scale + - default + nullable: true + description: The service tier used for processing the request. This field is only included if the `service_tier` parameter is specified in the request. + system_fingerprint: + type: string + description: |- + This fingerprint represents the backend configuration that the model runs with. + + Can be used in conjunction with the `seed` request parameter to understand when backend changes have been made that might impact determinism. 
+ object: + type: string + enum: + - chat.completion + description: The object type, which is always `chat.completion`. + usage: + $ref: '#/components/schemas/CompletionUsage' + description: Represents a chat completion response returned by the model, based on the provided input. + CreateChatCompletionStreamResponse: + type: object + required: + - id + - choices + - created + - model + - object + properties: + id: + type: string + description: A unique identifier for the chat completion. Each chunk has the same ID. + choices: + type: array + items: + type: object + properties: + delta: + $ref: '#/components/schemas/ChatCompletionStreamResponseDelta' + logprobs: + type: object + properties: + content: + type: array + items: + $ref: '#/components/schemas/ChatCompletionTokenLogprob' + nullable: true + description: A list of message content tokens with log probability information. + refusal: + type: array + items: + $ref: '#/components/schemas/ChatCompletionTokenLogprob' + nullable: true + description: A list of message refusal tokens with log probability information. + required: + - content + - refusal + nullable: true + description: Log probability information for the choice. + finish_reason: + type: string + enum: + - stop + - length + - tool_calls + - content_filter + - function_call + nullable: true + description: |- + The reason the model stopped generating tokens. This will be `stop` if the model hit a natural stop point or a provided stop sequence, + `length` if the maximum number of tokens specified in the request was reached, + `content_filter` if content was omitted due to a flag from our content filters, + `tool_calls` if the model called a tool, or `function_call` (deprecated) if the model called a function. + index: + type: integer + format: int32 + description: The index of the choice in the list of choices. + required: + - delta + - finish_reason + - index + description: |- + A list of chat completion choices. 
+ Can contain more than one element if `n` is greater than 1. Can also be empty for the + last chunk if you set `stream_options: {"include_usage": true}`. + created: + type: integer + format: unixtime + description: The Unix timestamp (in seconds) of when the chat completion was created. Each chunk has the same timestamp. + model: + type: string + description: The model used to generate the completion. + service_tier: + type: string + enum: + - scale + - default + nullable: true + description: The service tier used for processing the request. This field is only included if the `service_tier` parameter is specified in the request. + system_fingerprint: + type: string + description: |- + This fingerprint represents the backend configuration that the model runs with. + Can be used in conjunction with the `seed` request parameter to understand when backend changes have been made that might impact determinism. + object: + type: string + enum: + - chat.completion.chunk + description: The object type, which is always `chat.completion.chunk`. + usage: + type: object + properties: + completion_tokens: + type: integer + format: int32 + description: Number of tokens in the generated completion. + prompt_tokens: + type: integer + format: int32 + description: Number of tokens in the prompt. + total_tokens: + type: integer + format: int32 + description: Total number of tokens used in the request (prompt + completion). + required: + - completion_tokens + - prompt_tokens + - total_tokens + description: |- + An optional field that will only be present when you set `stream_options: {"include_usage": true}` in your request. + When present, it contains a null value except for the last chunk which contains the token usage statistics for the entire request. + description: Represents a streamed chunk of a chat completion response returned by the model, based on the provided input. 
+ CreateCompletionRequest: + type: object + required: + - model + - prompt + properties: + model: + anyOf: + - type: string + - type: string + enum: + - gpt-3.5-turbo-instruct + - davinci-002 + - babbage-002 + description: ID of the model to use. You can use the [List models](/docs/api-reference/models/list) API to see all of your available models, or see our [Model overview](/docs/models/overview) for descriptions of them. + x-oaiTypeLabel: string + prompt: + anyOf: + - type: string + - type: array + items: + type: string + - type: array + items: + type: integer + format: int32 + - type: array + items: + type: array + items: + type: integer + format: int32 + nullable: true + description: |- + The prompt(s) to generate completions for, encoded as a string, array of strings, array of tokens, or array of token arrays. + + Note that <|endoftext|> is the document separator that the model sees during training, so if a prompt is not specified the model will generate as if from the beginning of a new document. + default: <|endoftext|> + best_of: + type: integer + format: int32 + nullable: true + minimum: 0 + maximum: 20 + description: |- + Generates `best_of` completions server-side and returns the "best" (the one with the highest log probability per token). Results cannot be streamed. + + When used with `n`, `best_of` controls the number of candidate completions and `n` specifies how many to return – `best_of` must be greater than `n`. + + **Note:** Because this parameter generates many completions, it can quickly consume your token quota. Use carefully and ensure that you have reasonable settings for `max_tokens` and `stop`. + default: 1 + echo: + type: boolean + nullable: true + description: Echo back the prompt in addition to the completion + default: false + frequency_penalty: + type: number + format: float + nullable: true + minimum: -2 + maximum: 2 + description: |- + Number between -2.0 and 2.0. 
Positive values penalize new tokens based on their existing frequency in the text so far, decreasing the model's likelihood to repeat the same line verbatim.
+
+ [See more information about frequency and presence penalties.](/docs/guides/text-generation/parameter-details)
+ default: 0
+ logit_bias:
+ type: object
+ additionalProperties:
+ type: integer
+ format: int32
+ nullable: true
+ description: |-
+ Modify the likelihood of specified tokens appearing in the completion.
+
+ Accepts a JSON object that maps tokens (specified by their token ID in the GPT tokenizer) to an associated bias value from -100 to 100. You can use this [tokenizer tool](/tokenizer?view=bpe) to convert text to token IDs. Mathematically, the bias is added to the logits generated by the model prior to sampling. The exact effect will vary per model, but values between -1 and 1 should decrease or increase likelihood of selection; values like -100 or 100 should result in a ban or exclusive selection of the relevant token.
+
+ As an example, you can pass `{"50256": -100}` to prevent the <|endoftext|> token from being generated.
+ x-oaiTypeLabel: map
+ default: null
+ logprobs:
+ type: integer
+ format: int32
+ nullable: true
+ minimum: 0
+ maximum: 5
+ description: |-
+ Include the log probabilities on the `logprobs` most likely output tokens, as well as the chosen tokens. For example, if `logprobs` is 5, the API will return a list of the 5 most likely tokens. The API will always return the `logprob` of the sampled token, so there may be up to `logprobs+1` elements in the response.
+
+ The maximum value for `logprobs` is 5.
+ default: null
+ max_tokens:
+ type: integer
+ format: int32
+ nullable: true
+ minimum: 0
+ description: |-
+ The maximum number of [tokens](/tokenizer) that can be generated in the completion.
+
+ The token count of your prompt plus `max_tokens` cannot exceed the model's context length.
[Example Python code](https://cookbook.openai.com/examples/how_to_count_tokens_with_tiktoken) for counting tokens. + default: 16 + n: + type: integer + format: int32 + nullable: true + minimum: 1 + maximum: 128 + description: |- + How many completions to generate for each prompt. + + **Note:** Because this parameter generates many completions, it can quickly consume your token quota. Use carefully and ensure that you have reasonable settings for `max_tokens` and `stop`. + default: 1 + presence_penalty: + type: number + format: float + nullable: true + minimum: -2 + maximum: 2 + description: |- + Number between -2.0 and 2.0. Positive values penalize new tokens based on whether they appear in the text so far, increasing the model's likelihood to talk about new topics. + + [See more information about frequency and presence penalties.](/docs/guides/text-generation/parameter-details) + default: 0 + seed: + type: integer + format: int64 + nullable: true + description: |- + If specified, our system will make a best effort to sample deterministically, such that repeated requests with the same `seed` and parameters should return the same result. + + Determinism is not guaranteed, and you should refer to the `system_fingerprint` response parameter to monitor changes in the backend. + stop: + anyOf: + - type: string + - type: array + items: + type: string + nullable: true + description: Up to 4 sequences where the API will stop generating further tokens. The returned text will not contain the stop sequence. + default: null + stream: + type: boolean + nullable: true + description: 'Whether to stream back partial progress. If set, tokens will be sent as data-only [server-sent events](https://developer.mozilla.org/en-US/docs/Web/API/Server-sent_events/Using_server-sent_events#Event_stream_format) as they become available, with the stream terminated by a `data: [DONE]` message. [Example Python code](https://cookbook.openai.com/examples/how_to_stream_completions).' 
+ default: false + stream_options: + type: object + allOf: + - $ref: '#/components/schemas/ChatCompletionStreamOptions' + nullable: true + default: null + suffix: + type: string + nullable: true + description: |- + The suffix that comes after a completion of inserted text. + + This parameter is only supported for `gpt-3.5-turbo-instruct`. + default: null + temperature: + type: number + format: float + nullable: true + minimum: 0 + maximum: 2 + description: |- + What sampling temperature to use, between 0 and 2. Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic. + + We generally recommend altering this or `top_p` but not both. + default: 1 + top_p: + type: number + format: float + nullable: true + minimum: 0 + maximum: 1 + description: |- + An alternative to sampling with temperature, called nucleus sampling, where the model considers the results of the tokens with top_p probability mass. So 0.1 means only the tokens comprising the top 10% probability mass are considered. + + We generally recommend altering this or `temperature` but not both. + default: 1 + user: + type: string + description: A unique identifier representing your end-user, which can help OpenAI to monitor and detect abuse. [Learn more](/docs/guides/safety-best-practices/end-user-ids). + CreateCompletionResponse: + type: object + required: + - id + - choices + - created + - model + - object + properties: + id: + type: string + description: A unique identifier for the completion. + choices: + type: array + items: + type: object + properties: + finish_reason: + type: string + enum: + - stop + - length + - content_filter + description: |- + The reason the model stopped generating tokens. 
This will be `stop` if the model hit a natural stop point or a provided stop sequence, + `length` if the maximum number of tokens specified in the request was reached, + or `content_filter` if content was omitted due to a flag from our content filters. + index: + type: integer + format: int32 + logprobs: + type: object + properties: + text_offset: + type: array + items: + type: integer + format: int32 + token_logprobs: + type: array + items: + type: number + format: float + tokens: + type: array + items: + type: string + top_logprobs: + type: array + items: + type: object + additionalProperties: + type: number + format: float + nullable: true + text: + type: string + required: + - finish_reason + - index + - logprobs + - text + description: The list of completion choices the model generated for the input prompt. + created: + type: integer + format: unixtime + description: The Unix timestamp (in seconds) of when the completion was created. + model: + type: string + description: The model used for completion. + system_fingerprint: + type: string + description: |- + This fingerprint represents the backend configuration that the model runs with. + + Can be used in conjunction with the `seed` request parameter to understand when backend changes have been made that might impact determinism. + object: + type: string + enum: + - text_completion + description: The object type, which is always "text_completion" + usage: + $ref: '#/components/schemas/CompletionUsage' + description: 'Represents a completion response from the API. Note: both the streamed and non-streamed response objects share the same shape (unlike the chat endpoint).' 
+ CreateEmbeddingRequest: + type: object + required: + - input + - model + properties: + input: + anyOf: + - type: string + - type: array + items: + type: string + - type: array + items: + type: integer + format: int32 + - type: array + items: + type: array + items: + type: integer + format: int32 + description: Input text to embed, encoded as a string or array of tokens. To embed multiple inputs in a single request, pass an array of strings or array of token arrays. The input must not exceed the max input tokens for the model (8192 tokens for `text-embedding-ada-002`), cannot be an empty string, and any array must be 2048 dimensions or less. [Example Python code](https://cookbook.openai.com/examples/how_to_count_tokens_with_tiktoken) for counting tokens. + x-oaiExpandable: true + model: + anyOf: + - type: string + - type: string + enum: + - text-embedding-ada-002 + - text-embedding-3-small + - text-embedding-3-large + description: ID of the model to use. You can use the [List models](/docs/api-reference/models/list) API to see all of your available models, or see our [Model overview](/docs/models/overview) for descriptions of them. + x-oaiTypeLabel: string + encoding_format: + type: string + enum: + - float + - base64 + description: The format to return the embeddings in. Can be either `float` or [`base64`](https://pypi.org/project/pybase64/). + default: float + dimensions: + type: integer + format: int32 + minimum: 1 + description: The number of dimensions the resulting output embeddings should have. Only supported in `text-embedding-3` and later models. + user: + type: string + description: A unique identifier representing your end-user, which can help OpenAI to monitor and detect abuse. [Learn more](/docs/guides/safety-best-practices/end-user-ids). 
+ CreateEmbeddingResponse: + type: object + required: + - data + - model + - object + - usage + properties: + data: + type: array + items: + $ref: '#/components/schemas/Embedding' + description: The list of embeddings generated by the model. + model: + type: string + description: The name of the model used to generate the embedding. + object: + type: string + enum: + - list + description: The object type, which is always "list". + usage: + type: object + properties: + prompt_tokens: + type: integer + format: int32 + description: The number of tokens used by the prompt. + total_tokens: + type: integer + format: int32 + description: The total number of tokens used by the request. + required: + - prompt_tokens + - total_tokens + description: The usage information for the request. + CreateFileRequestMultiPart: + type: object + required: + - file + - purpose + properties: + file: + type: string + format: binary + description: The File object (not file name) to be uploaded. + purpose: + type: string + enum: + - assistants + - batch + - fine-tune + - vision + description: |- + The intended purpose of the uploaded file. + + Use "assistants" for [Assistants](/docs/api-reference/assistants) and [Message](/docs/api-reference/messages) files, "vision" for Assistants image file inputs, "batch" for [Batch API](/docs/guides/batch), and "fine-tune" for [Fine-tuning](/docs/api-reference/fine-tuning). + CreateFineTuningJobRequest: + type: object + required: + - model + - training_file + properties: + model: + anyOf: + - type: string + - type: string + enum: + - babbage-002 + - davinci-002 + - gpt-3.5-turbo + - gpt-4o-mini + description: |- + The name of the model to fine-tune. You can select one of the + [supported models](/docs/guides/fine-tuning/which-models-can-be-fine-tuned). + x-oaiTypeLabel: string + training_file: + type: string + description: |- + The ID of an uploaded file that contains training data. 
+
+ See [upload file](/docs/api-reference/files/create) for how to upload a file.
+
+ Your dataset must be formatted as a JSONL file. Additionally, you must upload your file with the purpose `fine-tune`.
+
+ The contents of the file should differ depending on whether the model uses the [chat](/docs/api-reference/fine-tuning/chat-input) or [completions](/docs/api-reference/fine-tuning/completions-input) format.
+
+ See the [fine-tuning guide](/docs/guides/fine-tuning) for more details.
+ hyperparameters:
+ allOf:
+ - $ref: '#/components/schemas/CreateFineTuningJobRequestHyperparameters'
+ description: The hyperparameters used for the fine-tuning job.
+ suffix:
+ type: string
+ nullable: true
+ minLength: 1
+ maxLength: 40
+ description: |-
+ A string of up to 40 characters that will be added to your fine-tuned model name.
+
+ For example, a `suffix` of "custom-model-name" would produce a model name like `ft:gpt-4o-mini:openai:custom-model-name:7p4lURel`.
+ default: null
+ validation_file:
+ type: string
+ nullable: true
+ description: |-
+ The ID of an uploaded file that contains validation data.
+
+ If you provide this file, the data is used to generate validation
+ metrics periodically during fine-tuning. These metrics can be viewed in
+ the fine-tuning results file.
+ The same data should not be present in both train and validation files.
+
+ Your dataset must be formatted as a JSONL file. You must upload your file with the purpose `fine-tune`.
+
+ See the [fine-tuning guide](/docs/guides/fine-tuning) for more details.
+ integrations:
+ type: object
+ allOf:
+ - $ref: '#/components/schemas/CreateFineTuningJobRequestIntegrations'
+ nullable: true
+ description: A list of integrations to enable for your fine-tuning job.
+ seed:
+ type: integer
+ format: int32
+ nullable: true
+ minimum: 0
+ maximum: 2147483647
+ description: |-
+ The seed controls the reproducibility of the job.
Passing in the same seed and job parameters should produce the same results, but may differ in rare cases. + If a seed is not specified, one will be generated for you. + CreateFineTuningJobRequestHyperparameters: + type: object + properties: + batch_size: + anyOf: + - $ref: '#/components/schemas/CreateFineTuningJobRequestHyperparametersBatchSizeChoiceEnum' + - type: integer + format: int32 + minimum: 1 + maximum: 256 + description: |- + Number of examples in each batch. A larger batch size means that model parameters + are updated less frequently, but with lower variance. + default: auto + learning_rate_multiplier: + anyOf: + - $ref: '#/components/schemas/CreateFineTuningJobRequestHyperparametersLearningRateMultiplierChoiceEnum' + - type: number + format: float + minimum: 0 + description: |- + Scaling factor for the learning rate. A smaller learning rate may be useful to avoid + overfitting. + default: auto + n_epochs: + anyOf: + - $ref: '#/components/schemas/CreateFineTuningJobRequestHyperparametersNEpochsChoiceEnum' + - type: integer + format: int32 + minimum: 1 + maximum: 50 + description: |- + The number of epochs to train the model for. An epoch refers to one full cycle + through the training dataset. + default: auto + CreateFineTuningJobRequestHyperparametersBatchSizeChoiceEnum: + type: string + enum: + - auto + CreateFineTuningJobRequestHyperparametersLearningRateMultiplierChoiceEnum: + type: string + enum: + - auto + CreateFineTuningJobRequestHyperparametersNEpochsChoiceEnum: + type: string + enum: + - auto + CreateFineTuningJobRequestIntegrations: + type: array + items: + type: object + properties: + type: + type: string + enum: + - wandb + description: The type of integration to enable. Currently, only "wandb" (Weights and Biases) is supported. + wandb: + type: object + properties: + project: + type: string + description: The name of the project that the new run will be created under. 
+ name: + type: string + nullable: true + description: A display name to set for the run. If not set, we will use the Job ID as the name. + entity: + type: string + nullable: true + description: |- + The entity to use for the run. This allows you to set the team or username of the WandB user that you would + like associated with the run. If not set, the default entity for the registered WandB API key is used. + tags: + type: array + items: + type: string + description: |- + A list of tags to be attached to the newly created run. These tags are passed through directly to WandB. Some + default tags are generated by OpenAI: "openai/finetune", "openai/{base-model}", "openai/{ftjob-abcdef}". + required: + - project + description: |- + The settings for your integration with Weights and Biases. This payload specifies the project that + metrics will be sent to. Optionally, you can set an explicit display name for your run, add tags + to your run, and set a default entity (team, username, etc) to be associated with your run. + required: + - type + - wandb + CreateImageEditRequestMultiPart: + type: object + required: + - image + - prompt + properties: + image: + type: string + format: binary + description: The image to edit. Must be a valid PNG file, less than 4MB, and square. If mask is not provided, image must have transparency, which will be used as the mask. + prompt: + type: string + description: A text description of the desired image(s). The maximum length is 1000 characters. + mask: + type: string + format: binary + description: An additional image whose fully transparent areas (e.g. where alpha is zero) indicate where `image` should be edited. Must be a valid PNG file, less than 4MB, and have the same dimensions as `image`. + model: + anyOf: + - type: string + - type: string + enum: + - dall-e-2 + nullable: true + description: The model to use for image generation. Only `dall-e-2` is supported at this time. 
+ x-oaiTypeLabel: string + default: dall-e-2 + n: + type: integer + format: int32 + nullable: true + minimum: 1 + maximum: 10 + description: The number of images to generate. Must be between 1 and 10. + default: 1 + size: + type: string + enum: + - 256x256 + - 512x512 + - 1024x1024 + nullable: true + description: The size of the generated images. Must be one of `256x256`, `512x512`, or `1024x1024`. + default: 1024x1024 + response_format: + type: string + enum: + - url + - b64_json + nullable: true + description: The format in which the generated images are returned. Must be one of `url` or `b64_json`. URLs are only valid for 60 minutes after the image has been generated. + default: url + user: + type: string + description: A unique identifier representing your end-user, which can help OpenAI to monitor and detect abuse. [Learn more](/docs/guides/safety-best-practices/end-user-ids). + CreateImageRequest: + type: object + required: + - prompt + properties: + prompt: + type: string + description: A text description of the desired image(s). The maximum length is 1000 characters for `dall-e-2` and 4000 characters for `dall-e-3`. + model: + anyOf: + - type: string + - type: string + enum: + - dall-e-2 + - dall-e-3 + nullable: true + description: The model to use for image generation. + x-oaiTypeLabel: string + default: dall-e-2 + n: + type: integer + format: int32 + nullable: true + minimum: 1 + maximum: 10 + description: The number of images to generate. Must be between 1 and 10. For `dall-e-3`, only `n=1` is supported. + default: 1 + quality: + type: string + enum: + - standard + - hd + description: The quality of the image that will be generated. `hd` creates images with finer details and greater consistency across the image. This param is only supported for `dall-e-3`. + default: standard + response_format: + type: string + enum: + - url + - b64_json + nullable: true + description: The format in which the generated images are returned. 
Must be one of `url` or `b64_json`. URLs are only valid for 60 minutes after the image has been generated. + default: url + size: + type: string + enum: + - 256x256 + - 512x512 + - 1024x1024 + - 1792x1024 + - 1024x1792 + nullable: true + description: The size of the generated images. Must be one of `256x256`, `512x512`, or `1024x1024` for `dall-e-2`. Must be one of `1024x1024`, `1792x1024`, or `1024x1792` for `dall-e-3` models. + default: 1024x1024 + style: + type: string + enum: + - vivid + - natural + nullable: true + description: The style of the generated images. Must be one of `vivid` or `natural`. Vivid causes the model to lean towards generating hyper-real and dramatic images. Natural causes the model to produce more natural, less hyper-real looking images. This param is only supported for `dall-e-3`. + default: vivid + user: + type: string + description: A unique identifier representing your end-user, which can help OpenAI to monitor and detect abuse. [Learn more](/docs/guides/safety-best-practices/end-user-ids). + CreateImageVariationRequestMultiPart: + type: object + required: + - image + properties: + image: + type: string + format: binary + model: + anyOf: + - type: string + - type: string + enum: + - dall-e-2 + nullable: true + description: The model to use for image generation. Only `dall-e-2` is supported at this time. + x-oaiTypeLabel: string + default: dall-e-2 + n: + type: integer + format: int32 + nullable: true + minimum: 1 + maximum: 10 + description: The number of images to generate. Must be between 1 and 10. For `dall-e-3`, only `n=1` is supported. + default: 1 + response_format: + type: string + enum: + - url + - b64_json + nullable: true + description: The format in which the generated images are returned. Must be one of `url` or `b64_json`. URLs are only valid for 60 minutes after the image has been generated. 
+ default: url
+ size:
+ type: string
+ enum:
+ - 256x256
+ - 512x512
+ - 1024x1024
+ nullable: true
+ description: The size of the generated images. Must be one of `256x256`, `512x512`, or `1024x1024`.
+ default: 1024x1024
+ user:
+ type: string
+ description: A unique identifier representing your end-user, which can help OpenAI to monitor and detect abuse. [Learn more](/docs/guides/safety-best-practices/end-user-ids).
+ CreateMessageRequest:
+ type: object
+ required:
+ - role
+ - content
+ properties:
+ role:
+ type: string
+ enum:
+ - user
+ - assistant
+ description: |-
+ The role of the entity that is creating the message. Allowed values include:
+ - `user`: Indicates the message is sent by an actual user and should be used in most cases to represent user-generated messages.
+ - `assistant`: Indicates the message is generated by the assistant. Use this value to insert messages from the assistant into the conversation.
+ content:
+ type: array
+ items:
+ $ref: '#/components/schemas/MessageContent'
+ x-oaiExpandable: true
+ attachments:
+ type: object
+ allOf:
+ - $ref: '#/components/schemas/CreateMessageRequestAttachmentsItem'
+ nullable: true
+ description: A list of files attached to the message, and the tools they should be added to.
+ metadata:
+ type: object
+ additionalProperties:
+ type: string
+ nullable: true
+ description: Set of 16 key-value pairs that can be attached to an object. This can be useful for storing additional information about the object in a structured format. Keys can be a maximum of 64 characters long and values can be a maximum of 512 characters long.
+ x-oaiTypeLabel: map
+ CreateMessageRequestAttachmentsItem:
+ type: array
+ items:
+ type: object
+ properties:
+ file_id:
+ type: string
+ description: The ID of the file to attach to the message.
+ tools:
+ type: array
+ items:
+ anyOf:
+ - $ref: '#/components/schemas/AssistantToolsCode'
+ - $ref: '#/components/schemas/AssistantToolsFileSearchTypeOnly'
+ description: The tools to add this file to.
+ x-oaiExpandable: true
+ required:
+ - file_id
+ - tools
+ CreateModerationRequest:
+ type: object
+ required:
+ - input
+ properties:
+ input:
+ anyOf:
+ - type: string
+ - type: array
+ items:
+ type: string
+ description: The input text to classify
+ model:
+ anyOf:
+ - type: string
+ - type: string
+ enum:
+ - text-moderation-latest
+ - text-moderation-stable
+ description: |-
+ Two content moderation models are available: `text-moderation-stable` and `text-moderation-latest`.
+
+ The default is `text-moderation-latest` which will be automatically upgraded over time. This ensures you are always using our most accurate model. If you use `text-moderation-stable`, we will provide advance notice before updating the model. Accuracy of `text-moderation-stable` may be slightly lower than for `text-moderation-latest`.
+ x-oaiTypeLabel: string
+ default: text-moderation-latest
+ CreateModerationResponse:
+ type: object
+ required:
+ - id
+ - model
+ - results
+ properties:
+ id:
+ type: string
+ description: The unique identifier for the moderation request.
+ model:
+ type: string
+ description: The model used to generate the moderation results.
+ results:
+ type: array
+ items:
+ type: object
+ properties:
+ flagged:
+ type: boolean
+ description: Whether any of the below categories are flagged.
+ categories:
+ type: object
+ properties:
+ hate:
+ type: boolean
+ description: Content that expresses, incites, or promotes hate based on race, gender, ethnicity, religion, nationality, sexual orientation, disability status, or caste. Hateful content aimed at non-protected groups (e.g., chess players) is harassment.
+ hate/threatening: + type: boolean + description: Hateful content that also includes violence or serious harm towards the targeted group based on race, gender, ethnicity, religion, nationality, sexual orientation, disability status, or caste. + harassment: + type: boolean + description: Content that expresses, incites, or promotes harassing language towards any target. + harassment/threatening: + type: boolean + description: Harassment content that also includes violence or serious harm towards any target. + self-harm: + type: boolean + description: Content that promotes, encourages, or depicts acts of self-harm, such as suicide, cutting, and eating disorders. + self-harm/intent: + type: boolean + description: Content where the speaker expresses that they are engaging or intend to engage in acts of self-harm, such as suicide, cutting, and eating disorders. + self-harm/instructions: + type: boolean + description: Content that encourages performing acts of self-harm, such as suicide, cutting, and eating disorders, or that gives instructions or advice on how to commit such acts. + sexual: + type: boolean + description: Content meant to arouse sexual excitement, such as the description of sexual activity, or that promotes sexual services (excluding sex education and wellness). + sexual/minors: + type: boolean + description: Sexual content that includes an individual who is under 18 years old. + violence: + type: boolean + description: Content that depicts death, violence, or physical injury. + violence/graphic: + type: boolean + description: Content that depicts death, violence, or physical injury in graphic detail. + required: + - hate + - hate/threatening + - harassment + - harassment/threatening + - self-harm + - self-harm/intent + - self-harm/instructions + - sexual + - sexual/minors + - violence + - violence/graphic + description: A list of the categories, and whether they are flagged or not. 
+ category_scores:
+ type: object
+ properties:
+ hate:
+ type: number
+ format: float
+ description: The score for the category 'hate'.
+ hate/threatening:
+ type: number
+ format: float
+ description: The score for the category 'hate/threatening'.
+ harassment:
+ type: number
+ format: float
+ description: The score for the category 'harassment'.
+ harassment/threatening:
+ type: number
+ format: float
+ description: The score for the category 'harassment/threatening'.
+ self-harm:
+ type: number
+ format: float
+ description: The score for the category 'self-harm'.
+ self-harm/intent:
+ type: number
+ format: float
+ description: The score for the category 'self-harm/intent'.
+ self-harm/instructions:
+ type: number
+ format: float
+ description: The score for the category 'self-harm/instructions'.
+ sexual:
+ type: number
+ format: float
+ description: The score for the category 'sexual'.
+ sexual/minors:
+ type: number
+ format: float
+ description: The score for the category 'sexual/minors'.
+ violence:
+ type: number
+ format: float
+ description: The score for the category 'violence'.
+ violence/graphic:
+ type: number
+ format: float
+ description: The score for the category 'violence/graphic'.
+ required:
+ - hate
+ - hate/threatening
+ - harassment
+ - harassment/threatening
+ - self-harm
+ - self-harm/intent
+ - self-harm/instructions
+ - sexual
+ - sexual/minors
+ - violence
+ - violence/graphic
+ description: A list of the categories along with their scores as predicted by the model.
+ required:
+ - flagged
+ - categories
+ - category_scores
+ description: A list of moderation objects.
+ description: Represents whether a given text input is potentially harmful.
+ CreateRunRequest:
+ type: object
+ required:
+ - assistant_id
+ properties:
+ assistant_id:
+ type: string
+ description: The ID of the [assistant](/docs/api-reference/assistants) to use to execute this run.
+ model: + anyOf: + - type: string + - type: string + enum: + - gpt-4o + - gpt-4o-2024-08-06 + - gpt-4o-2024-05-13 + - gpt-4o-mini + - gpt-4o-mini-2024-07-18 + - gpt-4-turbo + - gpt-4-turbo-2024-04-09 + - gpt-4-0125-preview + - gpt-4-turbo-preview + - gpt-4-1106-preview + - gpt-4-vision-preview + - gpt-4 + - gpt-4-0314 + - gpt-4-0613 + - gpt-4-32k + - gpt-4-32k-0314 + - gpt-4-32k-0613 + - gpt-3.5-turbo + - gpt-3.5-turbo-16k + - gpt-3.5-turbo-0613 + - gpt-3.5-turbo-1106 + - gpt-3.5-turbo-0125 + - gpt-3.5-turbo-16k-0613 + nullable: true + description: The ID of the [Model](/docs/api-reference/models) to be used to execute this run. If a value is provided here, it will override the model associated with the assistant. If not, the model associated with the assistant will be used. + x-oaiTypeLabel: string + instructions: + type: string + nullable: true + description: Overrides the [instructions](/docs/api-reference/assistants/createAssistant) of the assistant. This is useful for modifying the behavior on a per-run basis. + additional_instructions: + type: string + nullable: true + description: Appends additional instructions at the end of the instructions for the run. This is useful for modifying the behavior on a per-run basis without overriding other instructions. + additional_messages: + type: object + allOf: + - $ref: '#/components/schemas/CreateRunRequestAdditional_messages' + nullable: true + description: Adds additional messages to the thread before creating the run. + tools: + type: object + allOf: + - $ref: '#/components/schemas/CreateRunRequestTools' + nullable: true + description: Override the tools the assistant can use for this run. This is useful for modifying the behavior on a per-run basis. + metadata: + type: object + additionalProperties: + type: string + nullable: true + description: Set of 16 key-value pairs that can be attached to an object. This can be useful for storing additional information about the object in a structured format. 
Keys can be a maximum of 64 characters long and values can be a maximum of 512 characters long. + x-oaiTypeLabel: map + temperature: + type: number + format: float + nullable: true + minimum: 0 + maximum: 2 + description: What sampling temperature to use, between 0 and 2. Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic. + default: 1 + top_p: + type: number + format: float + nullable: true + minimum: 0 + maximum: 1 + description: |- + An alternative to sampling with temperature, called nucleus sampling, where the model considers the results of the tokens with top_p probability mass. So 0.1 means only the tokens comprising the top 10% probability mass are considered. + + We generally recommend altering this or temperature but not both. + default: 1 + stream: + type: boolean + nullable: true + description: 'If `true`, returns a stream of events that happen during the Run as server-sent events, terminating when the Run enters a terminal state with a `data: [DONE]` message.' + max_prompt_tokens: + type: integer + format: int32 + nullable: true + minimum: 256 + description: The maximum number of prompt tokens that may be used over the course of the run. The run will make a best effort to use only the number of prompt tokens specified, across multiple turns of the run. If the run exceeds the number of prompt tokens specified, the run will end with status `incomplete`. See `incomplete_details` for more info. + max_completion_tokens: + type: integer + format: int32 + nullable: true + minimum: 256 + description: The maximum number of completion tokens that may be used over the course of the run. The run will make a best effort to use only the number of completion tokens specified, across multiple turns of the run. If the run exceeds the number of completion tokens specified, the run will end with status `incomplete`. See `incomplete_details` for more info. 
+ truncation_strategy: + type: object + allOf: + - $ref: '#/components/schemas/TruncationObject' + nullable: true + tool_choice: + oneOf: + - $ref: '#/components/schemas/AssistantsApiToolChoiceOption' + nullable: true + parallel_tool_calls: + type: boolean + default: true + response_format: + oneOf: + - $ref: '#/components/schemas/AssistantsApiResponseFormatOption' + nullable: true + CreateRunRequestAdditional_messages: + type: array + items: + $ref: '#/components/schemas/CreateMessageRequest' + CreateRunRequestTools: + type: array + items: + $ref: '#/components/schemas/AssistantToolDefinition' + maxItems: 20 + x-oaiExpandable: true + CreateSpeechRequest: + type: object + required: + - model + - input + - voice + properties: + model: + anyOf: + - type: string + - type: string + enum: + - tts-1 + - tts-1-hd + description: 'One of the available [TTS models](/docs/models/tts): `tts-1` or `tts-1-hd`' + x-oaiTypeLabel: string + input: + type: string + maxLength: 4096 + description: The text to generate audio for. The maximum length is 4096 characters. + voice: + type: string + enum: + - alloy + - echo + - fable + - onyx + - nova + - shimmer + description: The voice to use when generating the audio. Supported voices are `alloy`, `echo`, `fable`, `onyx`, `nova`, and `shimmer`. Previews of the voices are available in the [Text to speech guide](/docs/guides/text-to-speech/voice-options). + response_format: + type: string + enum: + - mp3 + - opus + - aac + - flac + - wav + - pcm + description: The format to generate audio in. Supported formats are `mp3`, `opus`, `aac`, `flac`, `wav`, and `pcm`. + default: mp3 + speed: + type: number + format: float + minimum: 0.25 + maximum: 4 + description: The speed of the generated audio. Select a value from `0.25` to `4.0`. `1.0` is the default. 
+ default: 1 + CreateThreadAndRunRequest: + type: object + required: + - assistant_id + properties: + assistant_id: + type: string + description: The ID of the [assistant](/docs/api-reference/assistants) to use to execute this run. + thread: + allOf: + - $ref: '#/components/schemas/CreateThreadRequest' + description: If no thread is provided, an empty thread will be created. + model: + anyOf: + - type: string + - type: string + enum: + - gpt-4o + - gpt-4o-2024-08-06 + - gpt-4o-2024-05-13 + - gpt-4o-mini + - gpt-4o-mini-2024-07-18 + - gpt-4-turbo + - gpt-4-turbo-2024-04-09 + - gpt-4-0125-preview + - gpt-4-turbo-preview + - gpt-4-1106-preview + - gpt-4-vision-preview + - gpt-4 + - gpt-4-0314 + - gpt-4-0613 + - gpt-4-32k + - gpt-4-32k-0314 + - gpt-4-32k-0613 + - gpt-3.5-turbo + - gpt-3.5-turbo-16k + - gpt-3.5-turbo-0613 + - gpt-3.5-turbo-1106 + - gpt-3.5-turbo-0125 + - gpt-3.5-turbo-16k-0613 + nullable: true + description: The ID of the [Model](/docs/api-reference/models) to be used to execute this run. If a value is provided here, it will override the model associated with the assistant. If not, the model associated with the assistant will be used. + x-oaiTypeLabel: string + instructions: + type: string + nullable: true + description: Override the default system message of the assistant. This is useful for modifying the behavior on a per-run basis. + tools: + type: object + allOf: + - $ref: '#/components/schemas/CreateThreadAndRunRequestTools' + nullable: true + description: Override the tools the assistant can use for this run. This is useful for modifying the behavior on a per-run basis. + tool_resources: + type: object + properties: + code_interpreter: + type: object + properties: + file_ids: + type: array + items: + type: string + maxItems: 20 + description: A list of [file](/docs/api-reference/files) IDs made available to the `code_interpreter` tool. There can be a maximum of 20 files associated with the tool. 
+ default: [] + file_search: + $ref: '#/components/schemas/ToolResourcesFileSearchIdsOnly' + nullable: true + description: A set of resources that are used by the assistant's tools. The resources are specific to the type of tool. For example, the `code_interpreter` tool requires a list of file IDs, while the `file_search` tool requires a list of vector store IDs. + metadata: + type: object + additionalProperties: + type: string + nullable: true + description: Set of 16 key-value pairs that can be attached to an object. This can be useful for storing additional information about the object in a structured format. Keys can be a maximum of 64 characters long and values can be a maximum of 512 characters long. + x-oaiTypeLabel: map + temperature: + type: number + format: float + nullable: true + minimum: 0 + maximum: 2 + description: What sampling temperature to use, between 0 and 2. Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic. + default: 1 + top_p: + type: number + format: float + nullable: true + minimum: 0 + maximum: 1 + description: |- + An alternative to sampling with temperature, called nucleus sampling, where the model considers the results of the tokens with top_p probability mass. So 0.1 means only the tokens comprising the top 10% probability mass are considered. + + We generally recommend altering this or temperature but not both. + default: 1 + stream: + type: boolean + nullable: true + description: 'If `true`, returns a stream of events that happen during the Run as server-sent events, terminating when the Run enters a terminal state with a `data: [DONE]` message.' + max_prompt_tokens: + type: integer + format: int32 + nullable: true + minimum: 256 + description: The maximum number of prompt tokens that may be used over the course of the run. The run will make a best effort to use only the number of prompt tokens specified, across multiple turns of the run. 
If the run exceeds the number of prompt tokens specified, the run will end with status `incomplete`. See `incomplete_details` for more info. + max_completion_tokens: + type: integer + format: int32 + nullable: true + minimum: 256 + description: The maximum number of completion tokens that may be used over the course of the run. The run will make a best effort to use only the number of completion tokens specified, across multiple turns of the run. If the run exceeds the number of completion tokens specified, the run will end with status `incomplete`. See `incomplete_details` for more info. + truncation_strategy: + type: object + allOf: + - $ref: '#/components/schemas/TruncationObject' + nullable: true + tool_choice: + oneOf: + - $ref: '#/components/schemas/AssistantsApiToolChoiceOption' + nullable: true + parallel_tool_calls: + type: boolean + default: true + response_format: + oneOf: + - $ref: '#/components/schemas/AssistantsApiResponseFormatOption' + nullable: true + CreateThreadAndRunRequestTools: + type: array + items: + $ref: '#/components/schemas/AssistantToolDefinition' + maxItems: 20 + CreateThreadRequest: + type: object + properties: + messages: + type: array + items: + $ref: '#/components/schemas/CreateMessageRequest' + description: A list of [messages](/docs/api-reference/messages) to start the thread with. + tool_resources: + type: object + properties: + code_interpreter: + type: object + properties: + file_ids: + type: array + items: + type: string + maxItems: 20 + description: A list of [file](/docs/api-reference/files) IDs made available to the `code_interpreter` tool. There can be a maximum of 20 files associated with the tool. + default: [] + file_search: + $ref: '#/components/schemas/ToolResourcesFileSearch' + nullable: true + description: A set of resources that are made available to the assistant's tools in this thread. The resources are specific to the type of tool. 
For example, the `code_interpreter` tool requires a list of file IDs, while the `file_search` tool requires a list of vector store IDs. + metadata: + type: object + additionalProperties: + type: string + nullable: true + description: Set of 16 key-value pairs that can be attached to an object. This can be useful for storing additional information about the object in a structured format. Keys can be a maximum of 64 characters long and values can be a maximum of 512 characters long. + x-oaiTypeLabel: map + CreateThreadRequestToolResourcesFileSearchBase: + type: object + CreateThreadRequestToolResourcesFileSearchVectorStoreCreationHelpers: + type: object + properties: + vector_stores: + type: array + items: + type: object + properties: + file_ids: + type: array + items: + type: string + maxItems: 10000 + description: |- + A list of [file](/docs/api-reference/files) IDs to add to the vector store. There can be + a maximum of 10000 files in a vector store. + metadata: + type: object + additionalProperties: + type: string + description: |- + Set of 16 key-value pairs that can be attached to a vector store. This can be useful for + storing additional information about the vector store in a structured format. Keys can + be a maximum of 64 characters long and values can be a maximum of 512 characters long. + x-oaiTypeLabel: map + maxItems: 1 + description: |- + A helper to create a [vector store](/docs/api-reference/vector-stores/object) with + file_ids and attach it to this thread. There can be a maximum of 1 vector store attached + to the thread. + CreateThreadRequestToolResourcesFileSearchVectorStoreIdReferences: + type: object + properties: + vector_store_ids: + type: array + items: + type: string + maxItems: 1 + description: |- + The [vector store](/docs/api-reference/vector-stores/object) attached to this thread. + There can be a maximum of 1 vector store attached to the thread. 
+ CreateTranscriptionRequestMultiPart: + type: object + required: + - file + - model + properties: + file: + type: string + format: binary + description: 'The audio file object (not file name) to transcribe, in one of these formats: flac, mp3, mp4, mpeg, mpga, m4a, ogg, wav, or webm.' + x-oaiTypeLabel: file + model: + anyOf: + - type: string + - type: string + enum: + - whisper-1 + description: ID of the model to use. Only `whisper-1` (which is powered by our open source Whisper V2 model) is currently available. + x-oaiTypeLabel: string + language: + type: string + description: The language of the input audio. Supplying the input language in [ISO-639-1](https://en.wikipedia.org/wiki/List_of_ISO_639-1_codes) format will improve accuracy and latency. + prompt: + type: string + description: An optional text to guide the model's style or continue a previous audio segment. The [prompt](/docs/guides/speech-to-text/prompting) should match the audio language. + response_format: + type: string + enum: + - json + - text + - srt + - verbose_json + - vtt + description: 'The format of the transcript output, in one of these options: `json`, `text`, `srt`, `verbose_json`, or `vtt`.' + default: json + temperature: + type: number + format: float + minimum: 0 + maximum: 1 + description: The sampling temperature, between 0 and 1. Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic. If set to 0, the model will use [log probability](https://en.wikipedia.org/wiki/Log_probability) to automatically increase the temperature until certain thresholds are hit. + default: 0 + timestamp_granularities: + type: array + items: + type: string + enum: + - word + - segment + description: 'The timestamp granularities to populate for this transcription. `response_format` must be set to `verbose_json` to use timestamp granularities. Either or both of these options are supported: `word` or `segment`. 
Note: There is no additional latency for segment timestamps, but generating word timestamps incurs additional latency.' + default: + - segment + CreateTranscriptionResponseJson: + type: object + required: + - text + properties: + text: + type: string + description: The transcribed text. + description: Represents a transcription response returned by the model, based on the provided input. + CreateTranscriptionResponseVerboseJson: + type: object + required: + - task + - language + - duration + - text + properties: + task: + type: string + enum: + - transcribe + description: The task label. + language: + type: string + description: The language of the input audio. + duration: + type: number + format: float + description: The duration of the input audio. + text: + type: string + description: The transcribed text. + words: + type: array + items: + $ref: '#/components/schemas/TranscriptionWord' + description: Extracted words and their corresponding timestamps. + segments: + type: array + items: + $ref: '#/components/schemas/TranscriptionSegment' + description: Segments of the transcribed text and their corresponding details. + description: Represents a verbose json transcription response returned by the model, based on the provided input. + CreateTranslationRequestMultiPart: + type: object + required: + - file + - model + properties: + file: + type: string + format: binary + description: 'The audio file object (not file name) to translate, in one of these formats: flac, mp3, mp4, mpeg, mpga, m4a, ogg, wav, or webm.' + x-oaiTypeLabel: file + model: + anyOf: + - type: string + - type: string + enum: + - whisper-1 + description: ID of the model to use. Only `whisper-1` (which is powered by our open source Whisper V2 model) is currently available. + x-oaiTypeLabel: string + prompt: + type: string + description: An optional text to guide the model's style or continue a previous audio segment. The [prompt](/docs/guides/speech-to-text/prompting) should be in English. 
+ response_format: + type: string + enum: + - json + - text + - srt + - verbose_json + - vtt + description: 'The format of the transcript output, in one of these options: `json`, `text`, `srt`, `verbose_json`, or `vtt`.' + default: json + temperature: + type: number + format: float + minimum: 0 + maximum: 1 + description: The sampling temperature, between 0 and 1. Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic. If set to 0, the model will use [log probability](https://en.wikipedia.org/wiki/Log_probability) to automatically increase the temperature until certain thresholds are hit. + default: 0 + CreateTranslationResponseJson: + type: object + required: + - text + properties: + text: + type: string + CreateTranslationResponseVerboseJson: + type: object + required: + - task + - language + - duration + - text + properties: + task: + type: string + enum: + - translate + description: The task label. + language: + type: string + description: The language of the output translation (always `english`). + duration: + type: number + format: float + description: The duration of the input audio. + text: + type: string + description: The translated text. + segments: + type: array + items: + $ref: '#/components/schemas/TranscriptionSegment' + description: Segments of the translated text and their corresponding details. + CreateUploadRequest: + type: object + required: + - filename + - purpose + - bytes + - mime_type + properties: + filename: + type: string + description: The name of the file to upload. + purpose: + type: string + enum: + - assistants + - batch + - fine-tune + - vision + description: |- + The intended purpose of the uploaded file. + + See the [documentation on File purposes](/docs/api-reference/files/create#files-create-purpose). + bytes: + type: integer + format: int32 + description: The number of bytes in the file you are uploading. 
+ mime_type: + type: string + description: |- + The MIME type of the file. + + This must fall within the supported MIME types for your file purpose. See the supported MIME types for assistants and vision. + CreateVectorStoreFileBatchRequest: + type: object + required: + - file_ids + properties: + file_ids: + type: array + items: + type: string + minItems: 1 + maxItems: 500 + description: A list of [File](/docs/api-reference/files) IDs that the vector store should use. Useful for tools like `file_search` that can access files. + chunking_strategy: + $ref: '#/components/schemas/ChunkingStrategyRequestParam' + CreateVectorStoreFileRequest: + type: object + required: + - file_id + properties: + file_id: + type: string + description: A [File](/docs/api-reference/files) ID that the vector store should use. Useful for tools like `file_search` that can access files. + chunking_strategy: + $ref: '#/components/schemas/ChunkingStrategyRequestParam' + CreateVectorStoreRequest: + type: object + properties: + file_ids: + type: array + items: + type: string + maxItems: 500 + description: A list of [File](/docs/api-reference/files) IDs that the vector store should use. Useful for tools like `file_search` that can access files. + name: + type: string + description: The name of the vector store. + expires_after: + $ref: '#/components/schemas/VectorStoreExpirationAfter' + chunking_strategy: + anyOf: + - $ref: '#/components/schemas/AutoChunkingStrategyRequestParam' + - $ref: '#/components/schemas/StaticChunkingStrategyRequestParam' + description: The chunking strategy used to chunk the file(s). If not set, will use the `auto` strategy. Only applicable if `file_ids` is non-empty. + x-oaiExpandable: true + metadata: + type: object + additionalProperties: + type: string + nullable: true + description: Set of 16 key-value pairs that can be attached to an object. This can be useful for storing additional information about the object in a structured format. 
Keys can be a maximum of 64 characters long and values can be a maximum of 512 characters long. + x-oaiTypeLabel: map + DefaultProjectErrorResponse: + type: object + required: + - code + - message + properties: + code: + type: integer + format: int32 + message: + type: string + DeleteAssistantResponse: + type: object + required: + - id + - deleted + - object + properties: + id: + type: string + deleted: + type: boolean + object: + type: string + enum: + - assistant.deleted + DeleteFileResponse: + type: object + required: + - id + - object + - deleted + properties: + id: + type: string + object: + type: string + enum: + - file + deleted: + type: boolean + DeleteMessageResponse: + type: object + required: + - id + - deleted + - object + properties: + id: + type: string + deleted: + type: boolean + object: + type: string + enum: + - thread.message.deleted + DeleteModelResponse: + type: object + required: + - id + - deleted + - object + properties: + id: + type: string + deleted: + type: boolean + object: + type: string + enum: + - model + DeleteThreadResponse: + type: object + required: + - id + - deleted + - object + properties: + id: + type: string + deleted: + type: boolean + object: + type: string + enum: + - thread.deleted + DeleteVectorStoreFileResponse: + type: object + required: + - id + - deleted + - object + properties: + id: + type: string + deleted: + type: boolean + object: + type: string + enum: + - vector_store.file.deleted + DeleteVectorStoreResponse: + type: object + required: + - id + - deleted + - object + properties: + id: + type: string + deleted: + type: boolean + object: + type: string + enum: + - vector_store.deleted + Embedding: + type: object + required: + - index + - embedding + - object + properties: + index: + type: integer + format: int32 + description: The index of the embedding in the list of embeddings. 
+ embedding: + anyOf: + - type: array + items: + type: number + - type: string + description: The embedding vector, which is a list of floats. The length of the vector depends on the model as listed in the [embedding guide](/docs/guides/embeddings). + object: + type: string + enum: + - embedding + description: The object type, which is always "embedding". + description: Represents an embedding vector returned by the embedding endpoint. + Error: + type: object + required: + - code + - message + - param + - type + properties: + code: + type: string + nullable: true + message: + type: string + param: + type: string + nullable: true + type: + type: string + ErrorResponse: + type: object + required: + - error + properties: + error: + $ref: '#/components/schemas/Error' + FileChunkingStrategyRequestParam: + type: object + required: + - type + properties: + type: + type: string + discriminator: + propertyName: type + mapping: + static: '#/components/schemas/StaticChunkingStrategyRequestParam' + FileChunkingStrategyResponseParam: + type: object + required: + - type + properties: + type: + type: string + discriminator: + propertyName: type + mapping: + other: '#/components/schemas/OtherChunkingStrategyResponseParam' + auto: '#/components/schemas/AutoChunkingStrategyResponseParam' + FineTuneChatCompletionRequestAssistantMessage: + type: object + allOf: + - $ref: '#/components/schemas/ChatCompletionRequestAssistantMessage' + FineTuningIntegration: + type: object + required: + - type + - wandb + properties: + type: + type: string + enum: + - wandb + description: The type of the integration being enabled for the fine-tuning job + wandb: + type: object + properties: + project: + type: string + description: The name of the project that the new run will be created under. + name: + type: string + nullable: true + description: A display name to set for the run. If not set, we will use the Job ID as the name. 
+ entity: + type: string + nullable: true + description: |- + The entity to use for the run. This allows you to set the team or username of the WandB user that you would + like associated with the run. If not set, the default entity for the registered WandB API key is used. + tags: + type: array + items: + type: string + description: |- + A list of tags to be attached to the newly created run. These tags are passed through directly to WandB. Some + default tags are generated by OpenAI: "openai/finetune", "openai/{base-model}", "openai/{ftjob-abcdef}". + required: + - project + description: |- + The settings for your integration with Weights and Biases. This payload specifies the project that + metrics will be sent to. Optionally, you can set an explicit display name for your run, add tags + to your run, and set a default entity (team, username, etc) to be associated with your run. + FineTuningJob: + type: object + required: + - id + - created_at + - error + - fine_tuned_model + - finished_at + - hyperparameters + - model + - object + - organization_id + - result_files + - status + - trained_tokens + - training_file + - validation_file + - seed + properties: + id: + type: string + description: The object identifier, which can be referenced in the API endpoints. + created_at: + type: integer + format: unixtime + description: The Unix timestamp (in seconds) for when the fine-tuning job was created. + error: + type: object + properties: + code: + type: string + description: A machine-readable error code. + message: + type: string + description: A human-readable error message. + param: + type: string + nullable: true + description: The parameter that was invalid, usually `training_file` or `validation_file`. This field will be null if the failure was not parameter-specific. + required: + - code + - message + - param + nullable: true + description: For fine-tuning jobs that have `failed`, this will contain more information on the cause of the failure. 
+ fine_tuned_model: + type: string + nullable: true + description: The name of the fine-tuned model that is being created. The value will be null if the fine-tuning job is still running. + finished_at: + type: integer + format: unixtime + nullable: true + description: The Unix timestamp (in seconds) for when the fine-tuning job was finished. The value will be null if the fine-tuning job is still running. + hyperparameters: + allOf: + - $ref: '#/components/schemas/FineTuningJobHyperparameters' + description: The hyperparameters used for the fine-tuning job. See the [fine-tuning guide](/docs/guides/fine-tuning) for more details. + model: + type: string + description: The base model that is being fine-tuned. + object: + type: string + enum: + - fine_tuning.job + description: The object type, which is always "fine_tuning.job". + organization_id: + type: string + description: The organization that owns the fine-tuning job. + result_files: + type: array + items: + type: string + description: The compiled results file ID(s) for the fine-tuning job. You can retrieve the results with the [Files API](/docs/api-reference/files/retrieve-contents). + status: + type: string + enum: + - validating_files + - queued + - running + - succeeded + - failed + - cancelled + description: The current status of the fine-tuning job, which can be either `validating_files`, `queued`, `running`, `succeeded`, `failed`, or `cancelled`. + trained_tokens: + type: integer + format: int32 + nullable: true + description: The total number of billable tokens processed by this fine-tuning job. The value will be null if the fine-tuning job is still running. + training_file: + type: string + description: The file ID used for training. You can retrieve the training data with the [Files API](/docs/api-reference/files/retrieve-contents). + validation_file: + type: string + nullable: true + description: The file ID used for validation. 
You can retrieve the validation results with the [Files API](/docs/api-reference/files/retrieve-contents). + integrations: + type: object + allOf: + - $ref: '#/components/schemas/FineTuningJobIntegrationsItem' + nullable: true + description: A list of integrations to enable for this fine-tuning job. + seed: + type: integer + format: int32 + description: The seed used for the fine-tuning job. + estimated_finish: + type: integer + format: unixtime + nullable: true + description: The Unix timestamp (in seconds) for when the fine-tuning job is estimated to finish. The value will be null if the fine-tuning job is not running. + description: The `fine_tuning.job` object represents a fine-tuning job that has been created through the API. + FineTuningJobCheckpoint: + type: object + required: + - id + - created_at + - fine_tuned_model_checkpoint + - step_number + - metrics + - fine_tuning_job_id + - object + properties: + id: + type: string + description: The checkpoint identifier, which can be referenced in the API endpoints. + created_at: + type: integer + format: unixtime + description: The Unix timestamp (in seconds) for when the checkpoint was created. + fine_tuned_model_checkpoint: + type: string + description: The name of the fine-tuned checkpoint model that is created. + step_number: + type: integer + format: int32 + description: The step number that the checkpoint was created at. + metrics: + type: object + properties: + step: + type: number + format: float + train_loss: + type: number + format: float + train_mean_token_accuracy: + type: number + format: float + valid_loss: + type: number + format: float + valid_mean_token_accuracy: + type: number + format: float + full_valid_loss: + type: number + format: float + full_valid_mean_token_accuracy: + type: number + format: float + description: Metrics at the step number during the fine-tuning job. + fine_tuning_job_id: + type: string + description: The name of the fine-tuning job that this checkpoint was created from. 
+ object: + type: string + enum: + - fine_tuning.job.checkpoint + description: The object type, which is always "fine_tuning.job.checkpoint". + description: The `fine_tuning.job.checkpoint` object represents a model checkpoint for a fine-tuning job that is ready to use. + FineTuningJobEvent: + type: object + required: + - id + - created_at + - level + - message + - object + properties: + id: + type: string + created_at: + type: integer + format: unixtime + level: + type: string + enum: + - info + - warn + - error + message: + type: string + object: + type: string + enum: + - fine_tuning.job.event + description: Fine-tuning job event object + FineTuningJobHyperparameters: + type: object + required: + - n_epochs + properties: + n_epochs: + anyOf: + - $ref: '#/components/schemas/FineTuningJobHyperparametersNEpochsChoiceEnum' + - type: integer + format: int32 + description: |- + The number of epochs to train the model for. An epoch refers to one full cycle + through the training dataset. + default: auto + FineTuningJobHyperparametersBatchSizeChoiceEnum: + type: string + enum: + - auto + FineTuningJobHyperparametersLearningRateMultiplierChoiceEnum: + type: string + enum: + - auto + FineTuningJobHyperparametersNEpochsChoiceEnum: + type: string + enum: + - auto + FineTuningJobIntegrationsItem: + type: array + items: + $ref: '#/components/schemas/FineTuningIntegration' + maxItems: 5 + x-oaiExpandable: true + FinetuneChatRequestInput: + type: object + properties: + messages: + type: array + items: + anyOf: + - $ref: '#/components/schemas/ChatCompletionRequestSystemMessage' + - $ref: '#/components/schemas/ChatCompletionRequestUserMessage' + - $ref: '#/components/schemas/FineTuneChatCompletionRequestAssistantMessage' + - $ref: '#/components/schemas/ChatCompletionRequestToolMessage' + - $ref: '#/components/schemas/ChatCompletionRequestFunctionMessage' + minItems: 1 + x-oaiExpandable: true + tools: + type: array + items: + $ref: '#/components/schemas/ChatCompletionTool' + 
description: A list of tools the model may generate JSON inputs for. + parallel_tool_calls: + type: boolean + default: true + functions: + type: array + items: + $ref: '#/components/schemas/ChatCompletionFunctions' + minItems: 1 + maxItems: 128 + description: A list of functions the model may generate JSON inputs for. + deprecated: true + description: The per-line training example of a fine-tuning input file for chat models + FinetuneCompletionRequestInput: + type: object + properties: + prompt: + type: string + description: The input prompt for this training example. + completion: + type: string + description: The desired completion for this training example. + description: The per-line training example of a fine-tuning input file for completions models + FunctionObject: + type: object + required: + - name + properties: + description: + type: string + description: A description of what the function does, used by the model to choose when and how to call the function. + name: + type: string + description: The name of the function to be called. Must be a-z, A-Z, 0-9, or contain underscores and dashes, with a maximum length of 64. + parameters: + $ref: '#/components/schemas/FunctionParameters' + strict: + type: boolean + nullable: true + description: Whether to enable strict schema adherence when generating the function call. If set to true, the model will follow the exact schema defined in the `parameters` field. Only a subset of JSON Schema is supported when `strict` is `true`. Learn more about Structured Outputs in the [function calling guide](/docs/guides/function-calling). + default: false + FunctionParameters: + type: object + additionalProperties: {} + description: |- + The parameters the function accepts, described as a JSON Schema object. See the [guide](/docs/guides/function-calling) for examples, and the [JSON Schema reference](https://json-schema.org/understanding-json-schema/) for documentation about the format. 
+ + Omitting `parameters` defines a function with an empty parameter list. + Image: + type: object + properties: + b64_json: + type: string + format: base64 + description: The base64-encoded JSON of the generated image, if `response_format` is `b64_json`. + url: + type: string + format: uri + description: The URL of the generated image, if `response_format` is `url` (default). + revised_prompt: + type: string + description: The prompt that was used to generate the image, if there was any revision to the prompt. + description: Represents the URL or the content of an image generated by the OpenAI API. + ImagesResponse: + type: object + required: + - created + - data + properties: + created: + type: integer + format: unixtime + data: + type: array + items: + $ref: '#/components/schemas/Image' + Invite: + type: object + required: + - object + - id + - email + - role + - status + - invited_at + - expires_at + properties: + object: + type: string + enum: + - organization.invite + description: The object type, which is always `organization.invite` + id: + type: string + description: The identifier, which can be referenced in API endpoints + email: + type: string + description: The email address of the individual to whom the invite was sent + role: + type: string + enum: + - owner + - reader + description: '`owner` or `reader`' + status: + type: string + enum: + - accepted + - expired + - pending + description: '`accepted`, `expired`, or `pending`' + invited_at: + type: integer + format: unixtime + description: The Unix timestamp (in seconds) of when the invite was sent. + expires_at: + type: integer + format: unixtime + description: The Unix timestamp (in seconds) of when the invite expires. + accepted_at: + type: integer + format: unixtime + description: The Unix timestamp (in seconds) of when the invite was accepted. + description: Represents an individual `invite` to the organization. 
+ InviteDeleteResponse: + type: object + required: + - object + - id + - deleted + properties: + object: + type: string + enum: + - organization.invite.deleted + description: The object type, which is always `organization.invite.deleted` + id: + type: string + deleted: + type: boolean + InviteListResponse: + type: object + required: + - object + - data + properties: + object: + type: string + enum: + - list + description: The object type, which is always `list` + data: + type: array + items: + $ref: '#/components/schemas/Invite' + first_id: + type: string + description: The first `invite_id` in the retrieved `list` + last_id: + type: string + description: The last `invite_id` in the retrieved `list` + has_more: + type: boolean + description: The `has_more` property is used for pagination to indicate there are additional results. + InviteRequest: + type: object + required: + - email + - role + properties: + email: + type: string + description: Send an email to this address + role: + type: string + enum: + - reader + - owner + description: '`owner` or `reader`' + ListAssistantsResponse: + type: object + required: + - object + - data + - first_id + - last_id + - has_more + properties: + object: + type: string + enum: + - list + data: + type: array + items: + $ref: '#/components/schemas/AssistantObject' + first_id: + type: string + last_id: + type: string + has_more: + type: boolean + ListAuditLogsResponse: + type: object + required: + - object + - data + - first_id + - last_id + - has_more + properties: + object: + type: string + enum: + - list + data: + type: array + items: + $ref: '#/components/schemas/AuditLog' + first_id: + type: string + last_id: + type: string + has_more: + type: boolean + ListBatchesResponse: + type: object + required: + - data + - has_more + - object + properties: + data: + type: array + items: + $ref: '#/components/schemas/Batch' + first_id: + type: string + last_id: + type: string + has_more: + type: boolean + object: + type: string + enum: 
+ - list + ListFilesResponse: + type: object + required: + - data + - object + properties: + data: + type: array + items: + $ref: '#/components/schemas/OpenAIFile' + object: + type: string + enum: + - list + ListFineTuningJobCheckpointsResponse: + type: object + required: + - data + - object + - has_more + properties: + data: + type: array + items: + $ref: '#/components/schemas/FineTuningJobCheckpoint' + object: + type: string + enum: + - list + first_id: + type: string + nullable: true + last_id: + type: string + nullable: true + has_more: + type: boolean + ListFineTuningJobEventsResponse: + type: object + required: + - data + - object + properties: + data: + type: array + items: + $ref: '#/components/schemas/FineTuningJobEvent' + object: + type: string + enum: + - list + ListMessagesResponse: + type: object + required: + - object + - data + - first_id + - last_id + - has_more + properties: + object: + type: string + enum: + - list + data: + type: array + items: + $ref: '#/components/schemas/MessageObject' + first_id: + type: string + last_id: + type: string + has_more: + type: boolean + ListModelsResponse: + type: object + required: + - object + - data + properties: + object: + type: string + enum: + - list + data: + type: array + items: + $ref: '#/components/schemas/Model' + ListOrder: + anyOf: + - type: string + - type: string + enum: + - asc + - desc + ListPaginatedFineTuningJobsResponse: + type: object + required: + - data + - has_more + - object + properties: + data: + type: array + items: + $ref: '#/components/schemas/FineTuningJob' + has_more: + type: boolean + object: + type: string + enum: + - list + ListRunStepsResponse: + type: object + required: + - object + - data + - first_id + - last_id + - has_more + properties: + object: + type: string + enum: + - list + data: + type: array + items: + $ref: '#/components/schemas/RunStepObject' + first_id: + type: string + last_id: + type: string + has_more: + type: boolean + ListRunsResponse: + type: object + 
required: + - object + - data + - first_id + - last_id + - has_more + properties: + object: + type: string + enum: + - list + data: + type: array + items: + $ref: '#/components/schemas/RunObject' + first_id: + type: string + last_id: + type: string + has_more: + type: boolean + ListThreadsResponse: + type: object + required: + - object + - data + - first_id + - last_id + - has_more + properties: + object: + type: string + enum: + - list + data: + type: array + items: + $ref: '#/components/schemas/ThreadObject' + first_id: + type: string + last_id: + type: string + has_more: + type: boolean + ListVectorStoreFilesFilter: + anyOf: + - type: string + - type: string + enum: + - in_progress + - completed + - failed + - cancelled + ListVectorStoreFilesResponse: + type: object + required: + - object + - data + - first_id + - last_id + - has_more + properties: + object: + type: string + enum: + - list + data: + type: array + items: + $ref: '#/components/schemas/VectorStoreFileObject' + first_id: + type: string + last_id: + type: string + has_more: + type: boolean + ListVectorStoresResponse: + type: object + required: + - object + - data + - first_id + - last_id + - has_more + properties: + object: + type: string + enum: + - list + data: + type: array + items: + $ref: '#/components/schemas/VectorStoreObject' + first_id: + type: string + last_id: + type: string + has_more: + type: boolean + MessageContent: + type: object + description: Represents a single piece of content in an Assistants API message. + MessageContentImageFileObject: + type: object + required: + - type + - image_file + properties: + type: + type: string + enum: + - image_file + description: Always `image_file`. + image_file: + type: object + properties: + file_id: + type: string + description: The [File](/docs/api-reference/files) ID of the image in the message content. Set `purpose="vision"` when uploading the File if you need to later display the file content. 
detail: + type: string + enum: + - auto + - low + - high + description: Specifies the detail level of the image if specified by the user. `low` uses fewer tokens, you can opt in to high resolution using `high`. + default: auto + required: + - file_id + allOf: + - $ref: '#/components/schemas/MessageContent' + description: References an image [File](/docs/api-reference/files) in the content of a message. + MessageContentImageUrlObject: + type: object + required: + - type + - image_url + properties: + type: + type: string + enum: + - image_url + description: The type of the content part. + image_url: + type: object + properties: + url: + type: string + format: uri + description: 'The external URL of the image; must be one of the supported image types: jpeg, jpg, png, gif, webp.' + detail: + type: string + enum: + - auto + - low + - high + description: Specifies the detail level of the image. `low` uses fewer tokens, you can opt in to high resolution using `high`. Default value is `auto`. + default: auto + required: + - url + allOf: + - $ref: '#/components/schemas/MessageContent' + description: References an image URL in the content of a message. + MessageContentRefusalObject: + type: object + required: + - type + - refusal + properties: + type: + type: string + enum: + - refusal + description: Always `refusal`. + refusal: + type: string + allOf: + - $ref: '#/components/schemas/MessageContent' + description: The refusal content generated by the assistant. + MessageContentTextAnnotationsFileCitationObject: + type: object + required: + - type + - text + - file_citation + - start_index + - end_index + properties: + type: + type: string + enum: + - file_citation + description: Always `file_citation`. + text: + type: string + description: The text in the message content that needs to be replaced. + file_citation: + type: object + properties: + file_id: + type: string + description: The ID of the specific File the citation is from. 
+ required: + - file_id + start_index: + type: integer + format: int32 + minimum: 0 + end_index: + type: integer + format: int32 + minimum: 0 + allOf: + - $ref: '#/components/schemas/MessageContentTextObjectAnnotation' + description: A citation within the message that points to a specific quote from a specific File associated with the assistant or the message. Generated when the assistant uses the "file_search" tool to search files. + MessageContentTextAnnotationsFilePathObject: + type: object + required: + - type + - text + - file_path + - start_index + - end_index + properties: + type: + type: string + enum: + - file_path + description: Always `file_path`. + text: + type: string + description: The text in the message content that needs to be replaced. + file_path: + type: object + properties: + file_id: + type: string + description: The ID of the file that was generated. + required: + - file_id + start_index: + type: integer + format: int32 + minimum: 0 + end_index: + type: integer + format: int32 + minimum: 0 + allOf: + - $ref: '#/components/schemas/MessageContentTextObjectAnnotation' + description: A URL for the file that's generated when the assistant used the `code_interpreter` tool to generate a file. + MessageContentTextObject: + type: object + required: + - type + - text + properties: + type: + type: string + enum: + - text + description: Always `text`. + text: + type: object + properties: + value: + type: string + description: The data that makes up the text. + annotations: + type: array + items: + $ref: '#/components/schemas/MessageContentTextObjectAnnotation' + x-oaiExpandable: true + required: + - value + - annotations + allOf: + - $ref: '#/components/schemas/MessageContent' + description: The text content that is part of a message. + MessageContentTextObjectAnnotation: + type: object + required: + - type + properties: + type: + type: string + description: The discriminated type identifier for the content item. 
+ discriminator: + propertyName: type + mapping: + file_citation: '#/components/schemas/MessageContentTextAnnotationsFileCitationObject' + file_path: '#/components/schemas/MessageContentTextAnnotationsFilePathObject' + MessageDeltaContent: + type: object + required: + - type + properties: + type: + type: string + description: The discriminated type identifier for the content item. + discriminator: + propertyName: type + mapping: + image_file: '#/components/schemas/MessageDeltaContentImageFileObject' + image_url: '#/components/schemas/MessageDeltaContentImageUrlObject' + text: '#/components/schemas/MessageDeltaContentTextObject' + refusal: '#/components/schemas/MessageDeltaContentRefusalObject' + description: Represents a single piece of incremental content in an Assistants API streaming response. + MessageDeltaContentImageFileObject: + type: object + required: + - index + - type + properties: + index: + type: integer + format: int32 + description: The index of the content part in the message. + type: + type: string + enum: + - image_file + description: Always `image_file`. + image_file: + type: object + properties: + file_id: + type: string + description: The [File](/docs/api-reference/files) ID of the image in the message content. Set `purpose="vision"` when uploading the File if you need to later display the file content. + detail: + type: string + enum: + - auto + - low + - high + description: Specifies the detail level of the image if specified by the user. `low` uses fewer tokens, you can opt in to high resolution using `high`. + default: auto + allOf: + - $ref: '#/components/schemas/MessageDeltaContent' + description: References an image [File](/docs/api-reference/files) in the content of a message. + MessageDeltaContentImageUrlObject: + type: object + required: + - index + - type + properties: + index: + type: integer + format: int32 + description: The index of the content part in the message. 
type: + type: string + enum: + - image_url + description: Always `image_url`. + image_url: + type: object + properties: + url: + type: string + format: uri + description: 'The URL of the image; must be one of the supported image types: jpeg, jpg, png, gif, webp.' + detail: + type: string + enum: + - auto + - low + - high + description: Specifies the detail level of the image. `low` uses fewer tokens, you can opt in to high resolution using `high`. + default: auto + allOf: + - $ref: '#/components/schemas/MessageDeltaContent' + description: References an image URL in the content of a message. + MessageDeltaContentRefusalObject: + type: object + required: + - index + - type + properties: + index: + type: integer + format: int32 + description: The index of the refusal part in the message. + type: + type: string + enum: + - refusal + description: Always `refusal`. + refusal: + type: string + allOf: + - $ref: '#/components/schemas/MessageDeltaContent' + description: The refusal content that is part of a message. + MessageDeltaContentTextAnnotationsFileCitationObject: + type: object + required: + - index + - type + properties: + index: + type: integer + format: int32 + description: The index of the annotation in the text content part. + type: + type: string + enum: + - file_citation + description: Always `file_citation`. + text: + type: string + description: The text in the message content that needs to be replaced. + file_citation: + type: object + properties: + file_id: + type: string + description: The ID of the specific File the citation is from. + quote: + type: string + description: The specific quote in the file. + start_index: + type: integer + format: int32 + minimum: 0 + end_index: + type: integer + format: int32 + minimum: 0 + allOf: + - $ref: '#/components/schemas/MessageDeltaTextContentAnnotation' + description: A citation within the message that points to a specific quote from a specific File associated with the assistant or the message. 
Generated when the assistant uses the "file_search" tool to search files. + MessageDeltaContentTextAnnotationsFilePathObject: + type: object + required: + - index + - type + properties: + index: + type: integer + format: int32 + description: The index of the annotation in the text content part. + type: + type: string + enum: + - file_path + description: Always `file_path`. + text: + type: string + description: The text in the message content that needs to be replaced. + file_path: + type: object + properties: + file_id: + type: string + description: The ID of the file that was generated. + start_index: + type: integer + format: int32 + minimum: 0 + end_index: + type: integer + format: int32 + minimum: 0 + allOf: + - $ref: '#/components/schemas/MessageDeltaTextContentAnnotation' + description: A URL for the file that's generated when the assistant used the `code_interpreter` tool to generate a file. + MessageDeltaContentTextObject: + type: object + required: + - index + - type + properties: + index: + type: integer + format: int32 + description: The index of the content part in the message. + type: + type: string + enum: + - text + description: Always `text`. + text: + type: object + properties: + value: + type: string + description: The data that makes up the text. + annotations: + type: array + items: + $ref: '#/components/schemas/MessageDeltaTextContentAnnotation' + x-oaiExpandable: true + allOf: + - $ref: '#/components/schemas/MessageDeltaContent' + description: The text content that is part of a message. + MessageDeltaObject: + type: object + required: + - id + - object + - delta + properties: + id: + type: string + description: The identifier of the message, which can be referenced in API endpoints. + object: + type: string + enum: + - thread.message.delta + description: The object type, which is always `thread.message.delta`. 
delta: + type: object + properties: + role: + type: string + enum: + - user + - assistant + description: The entity that produced the message. One of `user` or `assistant`. + content: + type: array + items: + $ref: '#/components/schemas/MessageDeltaContent' + description: The content of the message as an array of text and/or images. + x-oaiExpandable: true + description: The delta containing the fields that have changed on the Message. + description: Represents a message delta, i.e. any changed fields on a message during streaming. + MessageDeltaTextContentAnnotation: + type: object + required: + - type + properties: + type: + type: string + description: The discriminated type identifier for the content item. + discriminator: + propertyName: type + mapping: + file_citation: '#/components/schemas/MessageDeltaContentTextAnnotationsFileCitationObject' + file_path: '#/components/schemas/MessageDeltaContentTextAnnotationsFilePathObject' + MessageObject: + type: object + required: + - id + - object + - created_at + - thread_id + - status + - incomplete_details + - completed_at + - incomplete_at + - role + - content + - assistant_id + - run_id + - attachments + - metadata + properties: + id: + type: string + description: The identifier, which can be referenced in API endpoints. + object: + type: string + enum: + - thread.message + description: The object type, which is always `thread.message`. + created_at: + type: integer + format: unixtime + description: The Unix timestamp (in seconds) for when the message was created. + thread_id: + type: string + description: The [thread](/docs/api-reference/threads) ID that this message belongs to. + status: + type: string + enum: + - in_progress + - incomplete + - completed + description: The status of the message, which can be either `in_progress`, `incomplete`, or `completed`. 
incomplete_details: + type: object + properties: + reason: + type: string + enum: + - content_filter + - max_tokens + - run_cancelled + - run_expired + - run_failed + description: The reason the message is incomplete. + required: + - reason + nullable: true + description: On an incomplete message, details about why the message is incomplete. + completed_at: + type: integer + format: unixtime + nullable: true + description: The Unix timestamp (in seconds) for when the message was completed. + incomplete_at: + type: integer + format: unixtime + nullable: true + description: The Unix timestamp (in seconds) for when the message was marked as incomplete. + role: + type: string + enum: + - user + - assistant + description: The entity that produced the message. One of `user` or `assistant`. + content: + type: array + items: + $ref: '#/components/schemas/MessageContent' + description: The content of the message as an array of text and/or images. + x-oaiExpandable: true + assistant_id: + type: string + nullable: true + description: If applicable, the ID of the [assistant](/docs/api-reference/assistants) that authored this message. + run_id: + type: string + nullable: true + description: The ID of the [run](/docs/api-reference/runs) associated with the creation of this message. Value is `null` when messages are created manually using the create message or create thread endpoints. + attachments: + type: object + allOf: + - $ref: '#/components/schemas/MessageObjectAttachmentsItem' + nullable: true + description: A list of files attached to the message, and the tools they were added to. + metadata: + type: object + additionalProperties: + type: string + nullable: true + description: Set of 16 key-value pairs that can be attached to an object. This can be useful for storing additional information about the object in a structured format. Keys can be a maximum of 64 characters long and values can be a maximum of 512 characters long. 
+ x-oaiTypeLabel: map + description: Represents a message within a [thread](/docs/api-reference/threads). + MessageObjectAttachmentsItem: + type: array + items: + type: object + properties: + file_id: + type: string + description: The ID of the file to attach to the message. + tools: + type: array + items: + anyOf: + - $ref: '#/components/schemas/AssistantToolsCode' + - $ref: '#/components/schemas/AssistantToolsFileSearchTypeOnly' + description: The tools to add this file to. + x-oaiExpandable: true + MessageRequestContentTextObject: + type: object + required: + - type + - text + properties: + type: + type: string + enum: + - text + description: Always `text`. + text: + type: string + description: Text content to be sent to the model + allOf: + - $ref: '#/components/schemas/MessageContent' + description: The text content that is part of a message. + Model: + type: object + required: + - id + - created + - object + - owned_by + properties: + id: + type: string + description: The model identifier, which can be referenced in the API endpoints. + created: + type: integer + format: unixtime + description: The Unix timestamp (in seconds) when the model was created. + object: + type: string + enum: + - model + description: The object type, which is always "model". + owned_by: + type: string + description: The organization that owns the model. + description: Describes an OpenAI model offering that can be used with the API. + ModifyAssistantRequest: + type: object + properties: + model: + type: string + description: ID of the model to use. You can use the [List models](/docs/api-reference/models/list) API to see all of your available models, or see our [Model overview](/docs/models/overview) for descriptions of them. + name: + type: string + nullable: true + maxLength: 256 + description: The name of the assistant. The maximum length is 256 characters. + description: + type: string + nullable: true + maxLength: 512 + description: The description of the assistant. 
The maximum length is 512 characters. + instructions: + type: string + nullable: true + maxLength: 256000 + description: The system instructions that the assistant uses. The maximum length is 256,000 characters. + tools: + type: array + items: + $ref: '#/components/schemas/AssistantToolDefinition' + maxItems: 128 + description: A list of tools enabled on the assistant. There can be a maximum of 128 tools per assistant. Tools can be of types `code_interpreter`, `file_search`, or `function`. + x-oaiExpandable: true + default: [] + tool_resources: + type: object + properties: + code_interpreter: + type: object + properties: + file_ids: + type: array + items: + type: string + maxItems: 20 + description: Overrides the list of [file](/docs/api-reference/files) IDs made available to the `code_interpreter` tool. There can be a maximum of 20 files associated with the tool. + default: [] + file_search: + $ref: '#/components/schemas/ToolResourcesFileSearchIdsOnly' + nullable: true + description: A set of resources that are used by the assistant's tools. The resources are specific to the type of tool. For example, the `code_interpreter` tool requires a list of file IDs, while the `file_search` tool requires a list of vector store IDs. + metadata: + type: object + additionalProperties: + type: string + nullable: true + description: Set of 16 key-value pairs that can be attached to an object. This can be useful for storing additional information about the object in a structured format. Keys can be a maximum of 64 characters long and values can be a maximum of 512 characters long. + x-oaiTypeLabel: map + temperature: + type: number + format: float + nullable: true + minimum: 0 + maximum: 2 + description: What sampling temperature to use, between 0 and 2. Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic. 
+ default: 1 + top_p: + type: number + format: float + nullable: true + minimum: 0 + maximum: 1 + description: |- + An alternative to sampling with temperature, called nucleus sampling, where the model considers the results of the tokens with top_p probability mass. So 0.1 means only the tokens comprising the top 10% probability mass are considered. + + We generally recommend altering this or temperature but not both. + default: 1 + response_format: + oneOf: + - $ref: '#/components/schemas/AssistantsApiResponseFormatOption' + nullable: true + ModifyMessageRequest: + type: object + properties: + metadata: + type: object + additionalProperties: + type: string + nullable: true + description: Set of 16 key-value pairs that can be attached to an object. This can be useful for storing additional information about the object in a structured format. Keys can be a maximum of 64 characters long and values can be a maximum of 512 characters long. + x-oaiTypeLabel: map + ModifyRunRequest: + type: object + properties: + metadata: + type: object + additionalProperties: + type: string + nullable: true + description: Set of 16 key-value pairs that can be attached to an object. This can be useful for storing additional information about the object in a structured format. Keys can be a maximum of 64 characters long and values can be a maximum of 512 characters long. + x-oaiTypeLabel: map + ModifyThreadRequest: + type: object + properties: + tool_resources: + type: object + properties: + code_interpreter: + type: object + properties: + file_ids: + type: array + items: + type: string + maxItems: 20 + description: A list of [file](/docs/api-reference/files) IDs made available to the `code_interpreter` tool. There can be a maximum of 20 files associated with the tool. + default: [] + file_search: + $ref: '#/components/schemas/ToolResourcesFileSearchIdsOnly' + nullable: true + description: A set of resources that are made available to the assistant's tools in this thread. 
The resources are specific to the type of tool. For example, the `code_interpreter` tool requires a list of file IDs, while the `file_search` tool requires a list of vector store IDs. + metadata: + type: object + additionalProperties: + type: string + nullable: true + description: Set of 16 key-value pairs that can be attached to an object. This can be useful for storing additional information about the object in a structured format. Keys can be a maximum of 64 characters long and values can be a maximum of 512 characters long. + x-oaiTypeLabel: map + OmniTypedResponseFormat: + type: object + required: + - type + properties: + type: + type: string + discriminator: + propertyName: type + mapping: + json_object: '#/components/schemas/ResponseFormatJsonObject' + json_schema: '#/components/schemas/ResponseFormatJsonSchema' + OpenAIFile: + type: object + required: + - id + - bytes + - created_at + - filename + - object + - purpose + - status + properties: + id: + type: string + description: The file identifier, which can be referenced in the API endpoints. + bytes: + type: integer + format: int32 + nullable: true + description: The size of the file, in bytes. + created_at: + type: integer + format: unixtime + description: The Unix timestamp (in seconds) for when the file was created. + filename: + type: string + description: The name of the file. + object: + type: string + enum: + - file + description: The object type, which is always `file`. + purpose: + type: string + enum: + - assistants + - assistants_output + - batch + - batch_output + - fine-tune + - fine-tune-results + - vision + description: The intended purpose of the file. Supported values are `assistants`, `assistants_output`, `batch`, `batch_output`, `fine-tune`, `fine-tune-results` and `vision`. + status: + type: string + enum: + - uploaded + - processed + - error + description: Deprecated. The current status of the file, which can be either `uploaded`, `processed`, or `error`. 
+ deprecated: true + status_details: + type: string + description: Deprecated. For details on why a fine-tuning training file failed validation, see the `error` field on `fine_tuning.job`. + deprecated: true + description: The `File` object represents a document that has been uploaded to OpenAI. + OtherChunkingStrategyResponseParam: + type: object + required: + - type + properties: + type: + type: string + enum: + - other + description: Always `other`. + allOf: + - $ref: '#/components/schemas/FileChunkingStrategyResponseParam' + description: This is returned when the chunking strategy is unknown. Typically, this is because the file was indexed before the `chunking_strategy` concept was introduced in the API. + Project: + type: object + required: + - id + - object + - name + - created_at + - status + properties: + id: + type: string + description: The identifier, which can be referenced in API endpoints + object: + type: string + enum: + - organization.project + description: The object type, which is always `organization.project` + name: + type: string + description: The name of the project. This appears in reporting. + created_at: + type: integer + format: unixtime + description: The Unix timestamp (in seconds) of when the project was created. + archived_at: + type: integer + format: unixtime + nullable: true + description: The Unix timestamp (in seconds) of when the project was archived or `null`. + status: + type: string + enum: + - active + - archived + description: '`active` or `archived`' + description: Represents an individual project. 
+ ProjectApiKey: + type: object + required: + - object + - redacted_value + - name + - created_at + - id + - owner + properties: + object: + type: string + enum: + - organization.project.api_key + description: The object type, which is always `organization.project.api_key` + redacted_value: + type: string + description: The redacted value of the API key + name: + type: string + description: The name of the API key + created_at: + type: integer + format: unixtime + description: The Unix timestamp (in seconds) of when the API key was created + id: + type: string + description: The identifier, which can be referenced in API endpoints + owner: + type: object + properties: + type: + type: string + enum: + - user + - service_account + description: '`user` or `service_account`' + user: + $ref: '#/components/schemas/ProjectUser' + service_account: + $ref: '#/components/schemas/ProjectServiceAccount' + description: Represents an individual API key in a project. + ProjectApiKeyDeleteResponse: + type: object + required: + - object + - id + - deleted + properties: + object: + type: string + enum: + - organization.project.api_key.deleted + id: + type: string + deleted: + type: boolean + ProjectApiKeyListResponse: + type: object + required: + - object + - data + - first_id + - last_id + - has_more + properties: + object: + type: string + enum: + - list + data: + type: array + items: + $ref: '#/components/schemas/ProjectApiKey' + first_id: + type: string + last_id: + type: string + has_more: + type: boolean + ProjectCreateRequest: + type: object + required: + - name + properties: + name: + type: string + description: The friendly name of the project, this name appears in reports. 
+ ProjectListResponse: + type: object + required: + - object + - data + - first_id + - last_id + - has_more + properties: + object: + type: string + enum: + - list + data: + type: array + items: + $ref: '#/components/schemas/Project' + first_id: + type: string + last_id: + type: string + has_more: + type: boolean + ProjectServiceAccount: + type: object + required: + - object + - id + - name + - role + - created_at + properties: + object: + type: string + enum: + - organization.project.service_account + description: The object type, which is always `organization.project.service_account` + id: + type: string + description: The identifier, which can be referenced in API endpoints + name: + type: string + description: The name of the service account + role: + type: string + enum: + - owner + - member + description: '`owner` or `member`' + created_at: + type: integer + format: unixtime + description: The Unix timestamp (in seconds) of when the service account was created + description: Represents an individual service account in a project. + ProjectServiceAccountApiKey: + type: object + required: + - object + - value + - name + - created_at + - id + properties: + object: + type: string + enum: + - organization.project.service_account.api_key + description: The object type, which is always `organization.project.service_account.api_key` + value: + type: string + name: + type: string + created_at: + type: integer + format: unixtime + id: + type: string + ProjectServiceAccountCreateRequest: + type: object + required: + - name + properties: + name: + type: string + description: The name of the service account being created. 
+ ProjectServiceAccountCreateResponse: + type: object + required: + - object + - id + - name + - role + - created_at + - api_key + properties: + object: + type: string + enum: + - organization.project.service_account + id: + type: string + name: + type: string + role: + type: string + enum: + - member + description: Service accounts can only have one role of type `member` + created_at: + type: integer + format: unixtime + api_key: + $ref: '#/components/schemas/ProjectServiceAccountApiKey' + ProjectServiceAccountDeleteResponse: + type: object + required: + - object + - id + - deleted + properties: + object: + type: string + enum: + - organization.project.service_account.deleted + id: + type: string + deleted: + type: boolean + ProjectServiceAccountListResponse: + type: object + required: + - object + - data + - first_id + - last_id + - has_more + properties: + object: + type: string + enum: + - list + data: + type: array + items: + $ref: '#/components/schemas/ProjectServiceAccount' + first_id: + type: string + last_id: + type: string + has_more: + type: boolean + ProjectUpdateRequest: + type: object + required: + - name + properties: + name: + type: string + description: The updated name of the project; this name appears in reports. + ProjectUser: + type: object + required: + - object + - id + - name + - email + - role + - added_at + properties: + object: + type: string + enum: + - organization.project.user + description: The object type, which is always `organization.project.user` + id: + type: string + description: The identifier, which can be referenced in API endpoints + name: + type: string + description: The name of the user + email: + type: string + description: The email address of the user + role: + type: string + enum: + - owner + - member + description: '`owner` or `member`' + added_at: + type: integer + format: unixtime + description: The Unix timestamp (in seconds) of when the user was added to the project. + description: Represents an individual user in a project. 
+ ProjectUserCreateRequest: + type: object + required: + - user_id + - role + properties: + user_id: + type: string + description: The ID of the user. + role: + type: string + enum: + - owner + - member + description: '`owner` or `member`' + ProjectUserDeleteResponse: + type: object + required: + - object + - id + - deleted + properties: + object: + type: string + enum: + - organization.project.user.deleted + id: + type: string + deleted: + type: boolean + ProjectUserListResponse: + type: object + required: + - object + - data + - first_id + - last_id + - has_more + properties: + object: + type: string + data: + type: array + items: + $ref: '#/components/schemas/ProjectUser' + first_id: + type: string + last_id: + type: string + has_more: + type: boolean + ProjectUserUpdateRequest: + type: object + required: + - role + properties: + role: + type: string + enum: + - owner + - member + description: '`owner` or `member`' + ResponseFormatJsonObject: + type: object + required: + - type + properties: + type: + type: string + enum: + - json_object + description: 'The type of response format being defined: `json_object`' + allOf: + - $ref: '#/components/schemas/OmniTypedResponseFormat' + ResponseFormatJsonSchema: + type: object + required: + - type + - json_schema + properties: + type: + type: string + enum: + - json_schema + description: 'The type of response format being defined: `json_schema`' + json_schema: + type: object + properties: + description: + type: string + description: A description of what the response format is for, used by the model to determine how to respond in the format. + name: + type: string + description: The name of the response format. Must be a-z, A-Z, 0-9, or contain underscores and dashes, with a maximum length of 64. + schema: + $ref: '#/components/schemas/ResponseFormatJsonSchemaSchema' + strict: + type: boolean + nullable: true + description: Whether to enable strict schema adherence when generating the output. 
If set to true, the model will always follow the exact schema defined in the `schema` field. Only a subset of JSON Schema is supported when `strict` is `true`. To learn more, read the [Structured Outputs guide](/docs/guides/structured-outputs). + default: false + required: + - name + allOf: + - $ref: '#/components/schemas/OmniTypedResponseFormat' + ResponseFormatJsonSchemaSchema: + type: object + additionalProperties: {} + description: The schema for the response format, described as a JSON Schema object. + ResponseFormatText: + type: object + required: + - type + properties: + type: + type: string + enum: + - text + description: 'The type of response format being defined: `text`' + allOf: + - $ref: '#/components/schemas/OmniTypedResponseFormat' + RunCompletionUsage: + type: object + required: + - completion_tokens + - prompt_tokens + - total_tokens + properties: + completion_tokens: + type: integer + format: int32 + description: Number of completion tokens used over the course of the run. + prompt_tokens: + type: integer + format: int32 + description: Number of prompt tokens used over the course of the run. + total_tokens: + type: integer + format: int32 + description: Total number of tokens used (prompt + completion). + description: Usage statistics related to the run. This value will be `null` if the run is not in a terminal state (i.e. `in_progress`, `queued`, etc.). + RunObject: + type: object + required: + - id + - object + - created_at + - thread_id + - assistant_id + - status + - required_action + - last_error + - expires_at + - started_at + - cancelled_at + - failed_at + - completed_at + - incomplete_details + - model + - instructions + - tools + - metadata + - usage + - max_prompt_tokens + - max_completion_tokens + - truncation_strategy + - tool_choice + - parallel_tool_calls + - response_format + properties: + id: + type: string + description: The identifier, which can be referenced in API endpoints. 
+ object: + type: string + enum: + - thread.run + description: The object type, which is always `thread.run`. + created_at: + type: integer + format: unixtime + description: The Unix timestamp (in seconds) for when the run was created. + thread_id: + type: string + description: The ID of the [thread](/docs/api-reference/threads) that was executed on as a part of this run. + assistant_id: + type: string + description: The ID of the [assistant](/docs/api-reference/assistants) used for execution of this run. + status: + type: string + enum: + - queued + - in_progress + - requires_action + - cancelling + - cancelled + - failed + - completed + - incomplete + - expired + description: The status of the run, which can be either `queued`, `in_progress`, `requires_action`, `cancelling`, `cancelled`, `failed`, `completed`, `incomplete`, or `expired`. + required_action: + type: object + properties: + type: + type: string + enum: + - submit_tool_outputs + description: For now, this is always `submit_tool_outputs`. + submit_tool_outputs: + type: object + properties: + tool_calls: + type: array + items: + $ref: '#/components/schemas/RunToolCallObject' + description: A list of the relevant tool calls. + required: + - tool_calls + description: Details on the tool outputs needed for this run to continue. + required: + - type + - submit_tool_outputs + nullable: true + description: Details on the action required to continue the run. Will be `null` if no action is required. + last_error: + type: object + properties: + code: + type: string + enum: + - server_error + - rate_limit_exceeded + - invalid_prompt + description: One of `server_error`, `rate_limit_exceeded`, or `invalid_prompt`. + message: + type: string + description: A human-readable description of the error. + required: + - code + - message + nullable: true + description: The last error associated with this run. Will be `null` if there are no errors. 
+ expires_at: + type: integer + format: unixtime + nullable: true + description: The Unix timestamp (in seconds) for when the run will expire. + started_at: + type: integer + format: unixtime + nullable: true + description: The Unix timestamp (in seconds) for when the run was started. + cancelled_at: + type: integer + format: unixtime + nullable: true + description: The Unix timestamp (in seconds) for when the run was cancelled. + failed_at: + type: integer + format: unixtime + nullable: true + description: The Unix timestamp (in seconds) for when the run failed. + completed_at: + type: integer + format: unixtime + nullable: true + description: The Unix timestamp (in seconds) for when the run was completed. + incomplete_details: + type: object + properties: + reason: + type: string + enum: + - max_completion_tokens + - max_prompt_tokens + description: The reason why the run is incomplete. This will point to which specific token limit was reached over the course of the run. + nullable: true + description: Details on why the run is incomplete. Will be `null` if the run is not incomplete. + model: + type: string + description: The model that the [assistant](/docs/api-reference/assistants) used for this run. + instructions: + type: string + description: The instructions that the [assistant](/docs/api-reference/assistants) used for this run. + tools: + type: array + items: + $ref: '#/components/schemas/AssistantToolDefinition' + maxItems: 20 + description: The list of tools that the [assistant](/docs/api-reference/assistants) used for this run. + x-oaiExpandable: true + default: [] + metadata: + type: object + additionalProperties: + type: string + nullable: true + description: Set of 16 key-value pairs that can be attached to an object. This can be useful for storing additional information about the object in a structured format. Keys can be a maximum of 64 characters long and values can be a maximum of 512 characters long. 
+ x-oaiTypeLabel: map + usage: + type: object + allOf: + - $ref: '#/components/schemas/RunCompletionUsage' + nullable: true + temperature: + type: number + format: float + nullable: true + description: The sampling temperature used for this run. If not set, defaults to 1. + top_p: + type: number + format: float + nullable: true + description: The nucleus sampling value used for this run. If not set, defaults to 1. + max_prompt_tokens: + type: integer + format: int32 + nullable: true + minimum: 256 + description: The maximum number of prompt tokens specified to have been used over the course of the run. + max_completion_tokens: + type: integer + format: int32 + nullable: true + minimum: 256 + description: The maximum number of completion tokens specified to have been used over the course of the run. + truncation_strategy: + type: object + allOf: + - $ref: '#/components/schemas/TruncationObject' + nullable: true + tool_choice: + oneOf: + - $ref: '#/components/schemas/AssistantsApiToolChoiceOption' + nullable: true + parallel_tool_calls: + type: boolean + default: true + response_format: + oneOf: + - $ref: '#/components/schemas/AssistantsApiResponseFormatOption' + nullable: true + description: Represents an execution run on a [thread](/docs/api-reference/threads). + RunStepCompletionUsage: + type: object + required: + - completion_tokens + - prompt_tokens + - total_tokens + properties: + completion_tokens: + type: integer + format: int32 + description: Number of completion tokens used over the course of the run step. + prompt_tokens: + type: integer + format: int32 + description: Number of prompt tokens used over the course of the run step. + total_tokens: + type: integer + format: int32 + description: Total number of tokens used (prompt + completion). + description: Usage statistics related to the run step. This value will be `null` while the run step's status is `in_progress`. 
+ RunStepDeltaObject: + type: object + required: + - id + - object + - delta + properties: + id: + type: string + description: The identifier of the run step, which can be referenced in API endpoints. + object: + type: string + enum: + - thread.run.step.delta + description: The object type, which is always `thread.run.step.delta`. + delta: + type: object + properties: + step_details: + allOf: + - $ref: '#/components/schemas/RunStepDeltaStepDetails' + description: The details of the run step. + x-oaiExpandable: true + description: The delta containing the fields that have changed on the run step. + description: Represents a run step delta i.e. any changed fields on a run step during streaming. + RunStepDeltaStepDetails: + type: object + required: + - type + properties: + type: + type: string + description: The discriminated type identifier for the details object. + discriminator: + propertyName: type + mapping: + message_creation: '#/components/schemas/RunStepDeltaStepDetailsMessageCreationObject' + tool_calls: '#/components/schemas/RunStepDeltaStepDetailsToolCallsObject' + RunStepDeltaStepDetailsMessageCreationObject: + type: object + required: + - type + properties: + type: + type: string + enum: + - message_creation + description: Always `message_creation`. + message_creation: + type: object + properties: + message_id: + type: string + description: The ID of the message that was created by this run step. + allOf: + - $ref: '#/components/schemas/RunStepDeltaStepDetails' + description: Details of the message creation by the run step. + RunStepDeltaStepDetailsToolCallsCodeObject: + type: object + required: + - index + - type + properties: + index: + type: integer + format: int32 + description: The index of the tool call in the tool calls array. + id: + type: string + description: The ID of the tool call. + type: + type: string + enum: + - code_interpreter + description: The type of tool call. This is always going to be `code_interpreter` for this type of tool call. 
+ code_interpreter: + type: object + properties: + input: + type: string + description: The input to the Code Interpreter tool call. + outputs: + type: array + items: + $ref: '#/components/schemas/RunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreterOutputsObject' + description: The outputs from the Code Interpreter tool call. Code Interpreter can output one or more items, including text (`logs`) or images (`image`). Each of these are represented by a different object type. + x-oaiExpandable: true + description: The Code Interpreter tool call definition. + allOf: + - $ref: '#/components/schemas/RunStepDeltaStepDetailsToolCallsObjectToolCallsObject' + description: Details of the Code Interpreter tool call the run step was involved in. + RunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreterOutputsObject: + type: object + required: + - type + properties: + type: + type: string + description: The discriminated type identifier for the details object. + discriminator: + propertyName: type + mapping: + logs: '#/components/schemas/RunStepDeltaStepDetailsToolCallsCodeOutputLogsObject' + image: '#/components/schemas/RunStepDeltaStepDetailsToolCallsCodeOutputImageObject' + description: Abstractly represents a run step tool call details code interpreter output. + RunStepDeltaStepDetailsToolCallsCodeOutputImageObject: + type: object + required: + - index + - type + properties: + index: + type: integer + format: int32 + description: The index of the output in the outputs array. + type: + type: string + enum: + - image + description: Always `image`. + image: + type: object + properties: + file_id: + type: string + description: The [file](/docs/api-reference/files) ID of the image. 
+ allOf: + - $ref: '#/components/schemas/RunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreterOutputsObject' + RunStepDeltaStepDetailsToolCallsCodeOutputLogsObject: + type: object + required: + - index + - type + properties: + index: + type: integer + format: int32 + description: The index of the output in the outputs array. + type: + type: string + enum: + - logs + description: Always `logs`. + logs: + type: string + description: The text output from the Code Interpreter tool call. + allOf: + - $ref: '#/components/schemas/RunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreterOutputsObject' + description: Text output from the Code Interpreter tool call as part of a run step. + RunStepDeltaStepDetailsToolCallsFileSearchObject: + type: object + required: + - index + - type + - file_search + properties: + index: + type: integer + format: int32 + description: The index of the tool call in the tool calls array. + id: + type: string + description: The ID of the tool call object. + type: + type: string + enum: + - file_search + description: The type of tool call. This is always going to be `file_search` for this type of tool call. + file_search: + type: object + additionalProperties: + type: string + description: For now, this is always going to be an empty object. + x-oaiTypeLabel: map + allOf: + - $ref: '#/components/schemas/RunStepDeltaStepDetailsToolCallsObjectToolCallsObject' + RunStepDeltaStepDetailsToolCallsFunctionObject: + type: object + required: + - index + - type + properties: + index: + type: integer + format: int32 + description: The index of the tool call in the tool calls array. + id: + type: string + description: The ID of the tool call object. + type: + type: string + enum: + - function + description: The type of tool call. This is always going to be `function` for this type of tool call. + function: + type: object + properties: + name: + type: string + description: The name of the function. 
+ arguments: + type: string + description: The arguments passed to the function. + output: + type: string + nullable: true + description: The output of the function. This will be `null` if the outputs have not been [submitted](/docs/api-reference/runs/submitToolOutputs) yet. + description: The definition of the function that was called. + allOf: + - $ref: '#/components/schemas/RunStepDeltaStepDetailsToolCallsObjectToolCallsObject' + RunStepDeltaStepDetailsToolCallsObject: + type: object + required: + - type + properties: + type: + type: string + enum: + - tool_calls + description: Always `tool_calls`. + tool_calls: + type: array + items: + $ref: '#/components/schemas/RunStepDeltaStepDetailsToolCallsObjectToolCallsObject' + description: 'An array of tool calls the run step was involved in. These can be associated with one of three types of tools: `code_interpreter`, `file_search`, or `function`.' + x-oaiExpandable: true + allOf: + - $ref: '#/components/schemas/RunStepDeltaStepDetails' + description: Details of the tool call. + RunStepDeltaStepDetailsToolCallsObjectToolCallsObject: + type: object + required: + - type + properties: + type: + type: string + description: The discriminated type identifier for the details object. + discriminator: + propertyName: type + mapping: + code_interpreter: '#/components/schemas/RunStepDeltaStepDetailsToolCallsCodeObject' + file_search: '#/components/schemas/RunStepDeltaStepDetailsToolCallsFileSearchObject' + function: '#/components/schemas/RunStepDeltaStepDetailsToolCallsFunctionObject' + description: Abstractly represents a run step tool call details inner object. + RunStepDetailsMessageCreationObject: + type: object + required: + - type + - message_creation + properties: + type: + type: string + enum: + - message_creation + description: Always `message_creation`. + message_creation: + type: object + properties: + message_id: + type: string + description: The ID of the message that was created by this run step. 
+ required: + - message_id + allOf: + - $ref: '#/components/schemas/RunStepObjectStepDetails' + description: Details of the message creation by the run step. + RunStepDetailsToolCallsCodeObject: + type: object + required: + - id + - type + - code_interpreter + properties: + id: + type: string + description: The ID of the tool call. + type: + type: string + enum: + - code_interpreter + description: The type of tool call. This is always going to be `code_interpreter` for this type of tool call. + code_interpreter: + type: object + properties: + input: + type: string + description: The input to the Code Interpreter tool call. + outputs: + type: array + items: + $ref: '#/components/schemas/RunStepDetailsToolCallsCodeObjectCodeInterpreterOutputsObject' + description: The outputs from the Code Interpreter tool call. Code Interpreter can output one or more items, including text (`logs`) or images (`image`). Each of these are represented by a different object type. + x-oaiExpandable: true + required: + - input + - outputs + description: The Code Interpreter tool call definition. + allOf: + - $ref: '#/components/schemas/RunStepDetailsToolCallsObjectToolCallsObject' + description: Details of the Code Interpreter tool call the run step was involved in. + RunStepDetailsToolCallsCodeObjectCodeInterpreterOutputsObject: + type: object + required: + - type + properties: + type: + type: string + description: The discriminated type identifier for the details object. + discriminator: + propertyName: type + mapping: + logs: '#/components/schemas/RunStepDetailsToolCallsCodeOutputLogsObject' + image: '#/components/schemas/RunStepDetailsToolCallsCodeOutputImageObject' + description: Abstractly represents a run step tool call details code interpreter output. + RunStepDetailsToolCallsCodeOutputImageObject: + type: object + required: + - type + - image + properties: + type: + type: string + enum: + - image + description: Always `image`. 
+ image: + type: object + properties: + file_id: + type: string + description: The [file](/docs/api-reference/files) ID of the image. + required: + - file_id + allOf: + - $ref: '#/components/schemas/RunStepDetailsToolCallsCodeObjectCodeInterpreterOutputsObject' + RunStepDetailsToolCallsCodeOutputLogsObject: + type: object + required: + - type + - logs + properties: + type: + type: string + enum: + - logs + description: Always `logs`. + logs: + type: string + description: The text output from the Code Interpreter tool call. + allOf: + - $ref: '#/components/schemas/RunStepDetailsToolCallsCodeObjectCodeInterpreterOutputsObject' + description: Text output from the Code Interpreter tool call as part of a run step. + RunStepDetailsToolCallsFileSearchObject: + type: object + required: + - id + - type + - file_search + properties: + id: + type: string + description: The ID of the tool call object. + type: + type: string + enum: + - file_search + description: The type of tool call. This is always going to be `file_search` for this type of tool call. + file_search: + type: object + additionalProperties: + type: string + description: For now, this is always going to be an empty object. + x-oaiTypeLabel: map + allOf: + - $ref: '#/components/schemas/RunStepDetailsToolCallsObjectToolCallsObject' + RunStepDetailsToolCallsFunctionObject: + type: object + required: + - id + - type + - function + properties: + id: + type: string + description: The ID of the tool call object. + type: + type: string + enum: + - function + description: The type of tool call. This is always going to be `function` for this type of tool call. + function: + type: object + properties: + name: + type: string + description: The name of the function. + arguments: + type: string + description: The arguments passed to the function. + output: + type: string + nullable: true + description: The output of the function. 
This will be `null` if the outputs have not been [submitted](/docs/api-reference/runs/submitToolOutputs) yet. + required: + - name + - arguments + - output + description: The definition of the function that was called. + allOf: + - $ref: '#/components/schemas/RunStepDetailsToolCallsObjectToolCallsObject' + RunStepDetailsToolCallsObject: + type: object + required: + - type + - tool_calls + properties: + type: + type: string + enum: + - tool_calls + description: Always `tool_calls`. + tool_calls: + type: array + items: + $ref: '#/components/schemas/RunStepDetailsToolCallsObjectToolCallsObject' + description: 'An array of tool calls the run step was involved in. These can be associated with one of three types of tools: `code_interpreter`, `file_search`, or `function`.' + x-oaiExpandable: true + allOf: + - $ref: '#/components/schemas/RunStepObjectStepDetails' + description: Details of the tool call. + RunStepDetailsToolCallsObjectToolCallsObject: + type: object + required: + - type + properties: + type: + type: string + description: The discriminated type identifier for the details object. + discriminator: + propertyName: type + mapping: + code_interpreter: '#/components/schemas/RunStepDetailsToolCallsCodeObject' + file_search: '#/components/schemas/RunStepDetailsToolCallsFileSearchObject' + function: '#/components/schemas/RunStepDetailsToolCallsFunctionObject' + description: Abstractly represents a run step tool call details inner object. + RunStepObject: + type: object + required: + - id + - object + - created_at + - assistant_id + - thread_id + - run_id + - type + - status + - step_details + - last_error + - expired_at + - cancelled_at + - failed_at + - completed_at + - metadata + - usage + properties: + id: + type: string + description: The identifier of the run step, which can be referenced in API endpoints. + object: + type: string + enum: + - thread.run.step + description: The object type, which is always `thread.run.step`. 
+ created_at: + type: integer + format: unixtime + description: The Unix timestamp (in seconds) for when the run step was created. + assistant_id: + type: string + description: The ID of the [assistant](/docs/api-reference/assistants) associated with the run step. + thread_id: + type: string + description: The ID of the [thread](/docs/api-reference/threads) that was run. + run_id: + type: string + description: The ID of the [run](/docs/api-reference/runs) that this run step is a part of. + type: + type: string + enum: + - message_creation + - tool_calls + description: The type of run step, which can be either `message_creation` or `tool_calls`. + status: + type: string + enum: + - in_progress + - cancelled + - failed + - completed + - expired + description: The status of the run step, which can be either `in_progress`, `cancelled`, `failed`, `completed`, or `expired`. + step_details: + allOf: + - $ref: '#/components/schemas/RunStepObjectStepDetails' + description: The details of the run step. + x-oaiExpandable: true + last_error: + type: object + properties: + code: + type: string + enum: + - server_error + - rate_limit_exceeded + description: One of `server_error` or `rate_limit_exceeded`. + message: + type: string + description: A human-readable description of the error. + required: + - code + - message + nullable: true + description: The last error associated with this run step. Will be `null` if there are no errors. + expired_at: + type: integer + format: unixtime + nullable: true + description: The Unix timestamp (in seconds) for when the run step expired. A step is considered expired if the parent run is expired. + cancelled_at: + type: integer + format: unixtime + nullable: true + description: The Unix timestamp (in seconds) for when the run step was cancelled. + failed_at: + type: integer + format: unixtime + nullable: true + description: The Unix timestamp (in seconds) for when the run step failed. 
+ completed_at: + type: integer + format: unixtime + nullable: true + description: The Unix timestamp (in seconds) for when the run step completed. + metadata: + type: object + additionalProperties: + type: string + nullable: true + description: Set of 16 key-value pairs that can be attached to an object. This can be useful for storing additional information about the object in a structured format. Keys can be a maximum of 64 characters long and values can be a maximum of 512 characters long. + x-oaiTypeLabel: map + usage: + type: object + allOf: + - $ref: '#/components/schemas/RunStepCompletionUsage' + nullable: true + description: Represents a step in execution of a run. + RunStepObjectStepDetails: + type: object + required: + - type + properties: + type: + type: string + description: The discriminated type identifier for the details object. + discriminator: + propertyName: type + mapping: + message_creation: '#/components/schemas/RunStepDetailsMessageCreationObject' + tool_calls: '#/components/schemas/RunStepDetailsToolCallsObject' + description: Abstractly represents a run step details object. + RunToolCallObject: + type: object + required: + - id + - type + - function + properties: + id: + type: string + description: The ID of the tool call. This ID must be referenced when you submit the tool outputs using the [Submit tool outputs to run](/docs/api-reference/runs/submitToolOutputs) endpoint. + type: + type: string + enum: + - function + description: The type of tool call the output is required for. For now, this is always `function`. + function: + type: object + properties: + name: + type: string + description: The name of the function. + arguments: + type: string + description: The arguments that the model expects you to pass to the function. + required: + - name + - arguments + description: The function definition. 
+ description: Tool call objects + StaticChunkingStrategy: + type: object + required: + - max_chunk_size_tokens + - chunk_overlap_tokens + properties: + max_chunk_size_tokens: + type: integer + format: int32 + minimum: 100 + maximum: 4096 + description: The maximum number of tokens in each chunk. The default value is `800`. The minimum value is `100` and the maximum value is `4096`. + chunk_overlap_tokens: + type: integer + format: int32 + description: |- + The number of tokens that overlap between chunks. The default value is `400`. + + Note that the overlap must not exceed half of `max_chunk_size_tokens`. + StaticChunkingStrategyRequestParam: + type: object + required: + - type + - static + properties: + type: + type: string + enum: + - static + description: Always `static`. + static: + $ref: '#/components/schemas/StaticChunkingStrategy' + allOf: + - $ref: '#/components/schemas/FileChunkingStrategyRequestParam' + StaticChunkingStrategyResponseParam: + type: object + required: + - type + - static + properties: + type: + type: string + enum: + - static + description: Always `static`. + static: + $ref: '#/components/schemas/StaticChunkingStrategy' + allOf: + - $ref: '#/components/schemas/FileChunkingStrategyResponseParam' + SubmitToolOutputsRunRequest: + type: object + required: + - tool_outputs + properties: + tool_outputs: + type: array + items: + type: object + properties: + tool_call_id: + type: string + description: The ID of the tool call in the `required_action` object within the run object the output is being submitted for. + output: + type: string + description: The output of the tool call to be submitted to continue the run. + description: A list of tools for which the outputs are being submitted. + stream: + type: boolean + nullable: true + description: 'If `true`, returns a stream of events that happen during the Run as server-sent events, terminating when the Run enters a terminal state with a `data: [DONE]` message.' 
+ ThreadObject: + type: object + required: + - id + - object + - created_at + - tool_resources + - metadata + properties: + id: + type: string + description: The identifier, which can be referenced in API endpoints. + object: + type: string + enum: + - thread + description: The object type, which is always `thread`. + created_at: + type: integer + format: unixtime + description: The Unix timestamp (in seconds) for when the thread was created. + tool_resources: + type: object + properties: + code_interpreter: + type: object + properties: + file_ids: + type: array + items: + type: string + maxItems: 20 + description: A list of [file](/docs/api-reference/files) IDs made available to the `code_interpreter` tool. There can be a maximum of 20 files associated with the tool. + default: [] + file_search: + type: object + properties: + vector_store_ids: + type: array + items: + type: string + maxItems: 1 + description: The [vector store](/docs/api-reference/vector-stores/object) attached to this thread. There can be a maximum of 1 vector store attached to the thread. + nullable: true + description: A set of resources that are made available to the assistant's tools in this thread. The resources are specific to the type of tool. For example, the `code_interpreter` tool requires a list of file IDs, while the `file_search` tool requires a list of vector store IDs. + metadata: + type: object + additionalProperties: + type: string + nullable: true + description: Set of 16 key-value pairs that can be attached to an object. This can be useful for storing additional information about the object in a structured format. Keys can be a maximum of 64 characters long and values can be a maximum of 512 characters long. + x-oaiTypeLabel: map + description: Represents a thread that contains [messages](/docs/api-reference/messages). 
+ ToolResourcesFileSearch: + type: object + properties: + vector_store_ids: + type: array + items: + type: string + maxItems: 1 + description: |- + The [vector store](/docs/api-reference/vector-stores/object) attached to this assistant. + There can be a maximum of 1 vector store attached to the assistant. + vector_stores: + type: array + items: + type: object + properties: + file_ids: + type: array + items: + type: string + maxItems: 10000 + description: |- + A list of [file](/docs/api-reference/files) IDs to add to the vector store. There can be + a maximum of 10000 files in a vector store. + chunking_strategy: + anyOf: + - $ref: '#/components/schemas/AutoChunkingStrategyRequestParam' + - $ref: '#/components/schemas/StaticChunkingStrategyRequestParam' + description: The chunking strategy used to chunk the file(s). If not set, will use the `auto` strategy. Only applicable if `file_ids` is non-empty. + x-oaiExpandable: true + metadata: + type: object + additionalProperties: + type: string + description: |- + Set of 16 key-value pairs that can be attached to a vector store. This can be useful for + storing additional information about the vector store in a structured format. Keys can + be a maximum of 64 characters long and values can be a maximum of 512 characters long. + x-oaiTypeLabel: map + maxItems: 1 + description: |- + A helper to create a [vector store](/docs/api-reference/vector-stores/object) with + file_ids and attach it to this assistant. There can be a maximum of 1 vector store + attached to the assistant. + ToolResourcesFileSearchIdsOnly: + type: object + properties: + vector_store_ids: + type: array + items: + type: string + maxItems: 1 + description: |- + The [vector store](/docs/api-reference/vector-stores/object) attached to this assistant. + There can be a maximum of 1 vector store attached to the assistant. 
+ ToolResourcesFileSearchVectorStoreCreationHelpers: + type: object + properties: + vector_stores: + type: array + items: + type: object + properties: + file_ids: + type: array + items: + type: string + maxItems: 10000 + description: |- + A list of [file](/docs/api-reference/files) IDs to add to the vector store. There can be + a maximum of 10000 files in a vector store. + chunking_strategy: + anyOf: + - $ref: '#/components/schemas/AutoChunkingStrategyRequestParam' + - $ref: '#/components/schemas/StaticChunkingStrategyRequestParam' + description: The chunking strategy used to chunk the file(s). If not set, will use the `auto` strategy. Only applicable if `file_ids` is non-empty. + x-oaiExpandable: true + metadata: + type: object + additionalProperties: + type: string + description: |- + Set of 16 key-value pairs that can be attached to a vector store. This can be useful for + storing additional information about the vector store in a structured format. Keys can + be a maximum of 64 characters long and values can be a maximum of 512 characters long. + x-oaiTypeLabel: map + maxItems: 1 + description: |- + A helper to create a [vector store](/docs/api-reference/vector-stores/object) with + file_ids and attach it to this assistant. There can be a maximum of 1 vector store + attached to the assistant. + TranscriptionSegment: + type: object + required: + - id + - seek + - start + - end + - text + - tokens + - temperature + - avg_logprob + - compression_ratio + - no_speech_prob + properties: + id: + type: integer + format: int32 + description: Unique identifier of the segment. + seek: + type: integer + format: int32 + description: Seek offset of the segment. + start: + type: number + format: float + description: Start time of the segment in seconds. + end: + type: number + format: float + description: End time of the segment in seconds. + text: + type: string + description: Text content of the segment. 
+ tokens: + type: array + items: + type: integer + format: int32 + description: Array of token IDs for the text content. + temperature: + type: number + format: float + description: Temperature parameter used for generating the segment. + avg_logprob: + type: number + format: float + description: Average logprob of the segment. If the value is lower than -1, consider the logprobs failed. + compression_ratio: + type: number + format: float + description: Compression ratio of the segment. If the value is greater than 2.4, consider the compression failed. + no_speech_prob: + type: number + format: float + description: Probability of no speech in the segment. If the value is higher than 1.0 and the `avg_logprob` is below -1, consider this segment silent. + TranscriptionWord: + type: object + required: + - word + - start + - end + properties: + word: + type: string + description: The text content of the word. + start: + type: number + format: float + description: Start time of the word in seconds. + end: + type: number + format: float + description: End time of the word in seconds. + TruncationObject: + type: object + required: + - type + properties: + type: + type: string + enum: + - auto + - last_messages + description: The truncation strategy to use for the thread. The default is `auto`. If set to `last_messages`, the thread will be truncated to the n most recent messages in the thread. When set to `auto`, messages in the middle of the thread will be dropped to fit the context length of the model, `max_prompt_tokens`. + last_messages: + type: integer + format: int32 + nullable: true + minimum: 1 + description: The number of most recent messages from the thread when constructing the context for the run. + description: Controls for how a thread will be truncated prior to the run. Use this to control the initial context window of the run. 
+ UpdateVectorStoreRequest: + type: object + properties: + name: + type: string + nullable: true + description: The name of the vector store. + expires_after: + type: object + allOf: + - $ref: '#/components/schemas/VectorStoreExpirationAfter' + nullable: true + metadata: + type: object + additionalProperties: + type: string + nullable: true + description: Set of 16 key-value pairs that can be attached to an object. This can be useful for storing additional information about the object in a structured format. Keys can be a maximum of 64 characters long and values can be a maximum of 512 characters long. + x-oaiTypeLabel: map + Upload: + type: object + required: + - id + - created_at + - filename + - bytes + - purpose + - status + - expires_at + properties: + id: + type: string + description: The Upload unique identifier, which can be referenced in API endpoints. + created_at: + type: integer + format: unixtime + description: The Unix timestamp (in seconds) for when the Upload was created. + filename: + type: string + description: The name of the file to be uploaded. + bytes: + type: integer + format: int32 + description: The intended number of bytes to be uploaded. + purpose: + type: string + description: The intended purpose of the file. [Please refer here](/docs/api-reference/files/object#files/object-purpose) for acceptable values. + status: + type: string + enum: + - pending + - completed + - cancelled + - expired + description: The status of the Upload. + expires_at: + type: integer + format: unixtime + description: The Unix timestamp (in seconds) for when the Upload will expire. + object: + type: string + enum: + - upload + description: The object type, which is always "upload". + file: + type: object + allOf: + - $ref: '#/components/schemas/OpenAIFile' + nullable: true + description: The ready File object after the Upload is completed. + description: The Upload object can accept byte chunks in the form of Parts. 
+ UploadPart: + type: object + required: + - id + - created_at + - upload_id + - object + properties: + id: + type: string + description: The upload Part unique identifier, which can be referenced in API endpoints. + created_at: + type: integer + format: unixtime + description: The Unix timestamp (in seconds) for when the Part was created. + upload_id: + type: string + description: The ID of the Upload object that this Part was added to. + object: + type: string + enum: + - upload.part + description: The object type, which is always `upload.part`. + description: The upload Part represents a chunk of bytes we can add to an Upload object. + User: + type: object + required: + - object + - id + - name + - email + - role + - added_at + properties: + object: + type: string + enum: + - organization.user + description: The object type, which is always `organization.user` + id: + type: string + description: The identifier, which can be referenced in API endpoints + name: + type: string + description: The name of the user + email: + type: string + description: The email address of the user + role: + type: string + enum: + - owner + - reader + description: '`owner` or `reader`' + added_at: + type: integer + format: unixtime + description: The Unix timestamp (in seconds) of when the user was added. + description: Represents an individual `user` within an organization. 
+ UserDeleteResponse: + type: object + required: + - object + - id + - deleted + properties: + object: + type: string + enum: + - organization.user.deleted + id: + type: string + deleted: + type: boolean + UserListResponse: + type: object + required: + - object + - data + - first_id + - last_id + - has_more + properties: + object: + type: string + enum: + - list + data: + type: array + items: + $ref: '#/components/schemas/User' + first_id: + type: string + last_id: + type: string + has_more: + type: boolean + UserRoleUpdateRequest: + type: object + required: + - role + properties: + role: + type: string + enum: + - owner + - reader + description: '`owner` or `reader`' + VectorStoreExpirationAfter: + type: object + required: + - anchor + - days + properties: + anchor: + type: string + enum: + - last_active_at + description: 'Anchor timestamp after which the expiration policy applies. Supported anchors: `last_active_at`.' + days: + type: integer + format: int32 + minimum: 1 + maximum: 365 + description: The number of days after the anchor time that the vector store will expire. + description: The expiration policy for a vector store. + VectorStoreFileBatchObject: + type: object + required: + - id + - object + - created_at + - vector_store_id + - status + - file_counts + properties: + id: + type: string + description: The identifier, which can be referenced in API endpoints. + object: + type: string + enum: + - vector_store.files_batch + description: The object type, which is always `vector_store.files_batch`. + created_at: + type: integer + format: unixtime + description: The Unix timestamp (in seconds) for when the vector store files batch was created. + vector_store_id: + type: string + description: The ID of the [vector store](/docs/api-reference/vector-stores/object) that the [File](/docs/api-reference/files) is attached to. 
+ status: + type: string + enum: + - in_progress + - completed + - cancelled + - failed + description: The status of the vector store files batch, which can be either `in_progress`, `completed`, `cancelled` or `failed`. + file_counts: + type: object + properties: + in_progress: + type: integer + format: int32 + description: The number of files that are currently being processed. + completed: + type: integer + format: int32 + description: The number of files that have been processed. + failed: + type: integer + format: int32 + description: The number of files that have failed to process. + cancelled: + type: integer + format: int32 + description: The number of files that were cancelled. + total: + type: integer + format: int32 + description: The total number of files. + required: + - in_progress + - completed + - failed + - cancelled + - total + description: A batch of files attached to a vector store. + VectorStoreFileObject: + type: object + required: + - id + - object + - usage_bytes + - created_at + - vector_store_id + - status + - last_error + properties: + id: + type: string + description: The identifier, which can be referenced in API endpoints. + object: + type: string + enum: + - vector_store.file + description: The object type, which is always `vector_store.file`. + usage_bytes: + type: integer + format: int32 + description: The total vector store usage in bytes. Note that this may be different from the original file size. + created_at: + type: integer + format: unixtime + description: The Unix timestamp (in seconds) for when the vector store file was created. + vector_store_id: + type: string + description: The ID of the [vector store](/docs/api-reference/vector-stores/object) that the [File](/docs/api-reference/files) is attached to. + status: + type: string + enum: + - in_progress + - completed + - cancelled + - failed + description: The status of the vector store file, which can be either `in_progress`, `completed`, `cancelled`, or `failed`. 
The status `completed` indicates that the vector store file is ready for use. + last_error: + type: object + properties: + code: + type: string + enum: + - server_error + - unsupported_file + - invalid_file + description: One of `server_error`, `unsupported_file`, or `invalid_file`. + message: + type: string + description: A human-readable description of the error. + required: + - code + - message + nullable: true + description: The last error associated with this vector store file. Will be `null` if there are no errors. + chunking_strategy: + anyOf: + - $ref: '#/components/schemas/StaticChunkingStrategyResponseParam' + - $ref: '#/components/schemas/OtherChunkingStrategyResponseParam' + description: The strategy used to chunk the file. + x-oaiExpandable: true + description: A file attached to a vector store. + VectorStoreObject: + type: object + required: + - id + - object + - created_at + - name + - usage_bytes + - file_counts + - status + - last_active_at + - metadata + properties: + id: + type: string + description: The identifier, which can be referenced in API endpoints. + object: + type: string + enum: + - vector_store + description: The object type, which is always `vector_store`. + created_at: + type: integer + format: unixtime + description: The Unix timestamp (in seconds) for when the vector store was created. + name: + type: string + description: The name of the vector store. + usage_bytes: + type: integer + format: int32 + description: The total number of bytes used by the files in the vector store. + file_counts: + type: object + properties: + in_progress: + type: integer + format: int32 + description: The number of files that are currently being processed. + completed: + type: integer + format: int32 + description: The number of files that have been successfully processed. + failed: + type: integer + format: int32 + description: The number of files that have failed to process. 
+ cancelled: + type: integer + format: int32 + description: The number of files that were cancelled. + total: + type: integer + format: int32 + description: The total number of files. + required: + - in_progress + - completed + - failed + - cancelled + - total + status: + type: string + enum: + - expired + - in_progress + - completed + description: The status of the vector store, which can be either `expired`, `in_progress`, or `completed`. A status of `completed` indicates that the vector store is ready for use. + expires_after: + $ref: '#/components/schemas/VectorStoreExpirationAfter' + expires_at: + type: integer + format: unixtime + nullable: true + description: The Unix timestamp (in seconds) for when the vector store will expire. + last_active_at: + type: integer + format: unixtime + nullable: true + description: The Unix timestamp (in seconds) for when the vector store was last active. + metadata: + type: object + additionalProperties: + type: string + nullable: true + description: Set of 16 key-value pairs that can be attached to an object. This can be useful for storing additional information about the object in a structured format. Keys can be a maximum of 64 characters long and values can be a maximum of 512 characters long. + x-oaiTypeLabel: map + description: A vector store is a collection of processed files that can be used by the `file_search` tool. + securitySchemes: + BearerAuth: + type: http + scheme: bearer +servers: + - url: https://api.openai.com + description: OpenAI Endpoint + variables: {} diff --git a/.scripts.azure/Invoke-CodeGen.ps1 b/.scripts.azure/Invoke-CodeGen.ps1 new file mode 100644 index 000000000..c179e18a9 --- /dev/null +++ b/.scripts.azure/Invoke-CodeGen.ps1 @@ -0,0 +1,87 @@ +$repoRoot = Join-Path $PSScriptRoot .. 
-Resolve +$dotnetAzureFolder = Join-Path $repoRoot .dotnet.azure + +function Invoke([scriptblock]$script) { + $scriptString = $script | Out-String + Write-Host "--------------------------------------------------------------------------------`n> $scriptString" + & $script +} + +function Make-Internals-Settable { + Get-ChildItem "$dotnetAzureFolder\src\Generated" -File -Filter "Internal*.cs" | ForEach-Object { + $content = Get-Content $_.FullName -Raw + $newContent = $content -replace 'public(.*?)\{ get; \}', 'internal$1{ get; set; }' + Set-Content -Path $_.FullName -Value $newContent + } +} + +function Partialize-ClientPipelineExtensions { + $file = Get-ChildItem -Path "$dotnetAzureFolder\src\Generated\Internal\ClientPipelineExtensions.cs" + $content = Get-Content -Path $file -Raw + Write-Output "Editing $($file.FullName)" + $content = $content -creplace "internal static class ClientPipelineExtensions", "internal static partial class ClientPipelineExtensions" + $content | Set-Content -Path $file.FullName -NoNewline +} + +function Partialize-ClientUriBuilder { + $file = Get-ChildItem -Path "$dotnetAzureFolder\src\Generated\Internal\ClientUriBuilder.cs" + $content = Get-Content -Path $file -Raw + Write-Output "Editing $($file.FullName)" + $content = $content -creplace "internal class ClientUriBuilder", "internal partial class ClientUriBuilder" + $content | Set-Content -Path $file.FullName -NoNewline +} + +function Prune-Generated-Files { + $patternsToKeep = @( + "*BingSearchTool*", + "*DataSource*", + "*ContentFilter*", + "*OpenAI*Error*", + "*Context*", + "*RetrievedDoc*", + "*Citation*" + ) + $patternsToDelete = @( + "BingSearchToolDefinition.cs", + "*Elasticsearch*QueryType*", + "*FieldsMapping*", + "*ContentTextAnnotationsFileCitation*" + ) + + Get-ChildItem "$dotnetAzureFolder\src\Generated" -File | ForEach-Object { + $generatedFile = $_; + $generatedFilename = $_.Name; + $keepFile = $false + foreach ($pattern in $patternsToKeep) { + if ($generatedFilename -like 
"$pattern") { + $keepFile = $true + foreach ($deletePattern in $patternsToDelete) { + if ($generatedFilename -like $deletePattern) { + $keepFile = $false + break + } + } + break + } + } + if (-not $keepFile) { + Write-Output "Removing: $generatedFilename" + Remove-Item $generatedFile + } + } +} + +Push-Location $repoRoot/.typespec.azure +try { + Invoke { npm ci } + Invoke { npm exec --no -- tsp format **/*tsp } + Invoke { npm exec --no -- tsp compile . } +# Invoke { npm exec --no -- tsp compile main.tsp --emit @azure-tools/typespec-csharp --option @azure-tools/typespec-csharp.emitter-output-dir="$dotnetAzureFolder/src" } + Prune-Generated-Files + Make-Internals-Settable + Partialize-ClientPipelineExtensions + Partialize-ClientUriBuilder +} +finally { + Pop-Location +} diff --git a/.scripts/Edit-Deserialization.ps1 b/.scripts/Edit-Deserialization.ps1 new file mode 100644 index 000000000..04ef67266 --- /dev/null +++ b/.scripts/Edit-Deserialization.ps1 @@ -0,0 +1,21 @@ +$repoRoot = Join-Path $PSScriptRoot .. -Resolve +$generatedModelFolder = Join-Path $repoRoot .dotnet\src\Generated\Models + +$files = Get-ChildItem -Path $generatedModelFolder -Filter "*Serialization.cs" + +$editedFilesCount = 0 + +foreach ($file in $files) { + $statusText = "{0:D3}/{1:D3} : Processing codegen fixup for response deserialization..." -f $editedFilesCount, $files.Count + $percentComplete = [math]::Round(($editedFilesCount / $files.Count) * 100) + Write-Progress -Activity "Editing" -Status $statusText -PercentComplete $percentComplete + $content = Get-Content -Path $file.FullName + $updatedContent = $content -replace "options.Format != `"W`"", "true" + if ($content -ne $updatedContent) { + Set-Content -Path $file.FullName -Value $updatedContent + } + $editedFilesCount++ +} + +Write-Progress -Activity "Editing" -Status "Complete" -Completed +Write-Output "Complete: deserialization edited." 
\ No newline at end of file diff --git a/.scripts/Export-API.ps1 b/.scripts/Export-API.ps1 new file mode 100644 index 000000000..45bd8502e --- /dev/null +++ b/.scripts/Export-API.ps1 @@ -0,0 +1,106 @@ +$repoRoot = Join-Path $PSScriptRoot .. -Resolve +$sourceFolder = Join-Path $repoRoot .dotnet\src +$apiFolder = Join-Path $repoRoot .dotnet\api + +$platformTarget = "netstandard2.0" +$projectPath = Join-Path $sourceFolder OpenAI.csproj +$assemblyPath = Join-Path $sourceFolder bin\Debug $platformTarget OpenAI.dll +$outputPath = Join-Path $apiFolder "OpenAI.$($platformTarget).cs" + +Write-Output "Building OpenAI.dll..." +Write-Output "" + +dotnet build $projectPath +Write-Output "" + +Write-Output "Generating OpenAI.netstandard2.0.cs..." +Write-Output "" + +$net80ref = Get-ChildItem -Recurse ` + -Path "$($env:ProgramFiles)\dotnet\packs\Microsoft.NETCore.App.Ref" ` + -Include "net8.0" | Select-Object -Last 1 +$systemClientModelRef = Get-ChildItem -Recurse ` + -Path "$($env:UserProfile)\.nuget\packages\system.clientmodel\1.1.0-beta.5" ` + -Include "netstandard2.0" | Select-Object -Last 1 +$systemMemoryDataRef = Get-ChildItem -Recurse ` + -Path "$($env:UserProfile)\.nuget\packages\system.memory.data\1.0.2" ` + -Include "netstandard2.0" | Select-Object -Last 1 +$systemDiagnosticsDiagnosticSourceRef = Get-ChildItem -Recurse ` + -Path "$($env:UserProfile)\.nuget\packages\system.diagnostics.diagnosticsource\6.0.1" ` + -Include "netstandard2.0" | Select-Object -Last 1 +$microsoftBclAsyncInterfacesRef = Get-ChildItem -Recurse ` + -Path "$($env:UserProfile)\.nuget\packages\microsoft.bcl.asyncinterfaces\1.1.0" ` + -Include "netstandard2.0" | Select-Object -Last 1 + +Write-Output "Assembly reference paths:" +Write-Output "* NETCore:" +Write-Output " $($net80ref)" +Write-Output "" +Write-Output "* System.ClientModel:" +Write-Output " $($systemClientModelRef)" +Write-Output "" +Write-Output "* System.Memory.Data:" +Write-Output " $($systemMemoryDataRef)" +Write-Output "" 
+Write-Output "* System.Diagnostics.DiagnosticSource:" +Write-Output " $($systemDiagnosticsDiagnosticSourceRef)" +Write-Output "" +Write-Output "* Microsoft.Bcl.AsyncInterfaces:" +Write-Output " $($microsoftBclAsyncInterfacesRef)" +Write-Output "" +Write-Output "NOTE: if any of the above are empty, tool output may be inaccurate." +Write-Output "" + +genapi --assembly $assemblyPath --output-path $outputPath ` + --assembly-reference $net80ref ` + --assembly-reference $systemClientModelRef ` + --assembly-reference $systemMemoryDataRef ` + --assembly-reference $systemDiagnosticsDiagnosticSourceRef ` + --assembly-reference $microsoftBclAsyncInterfacesRef + +Write-Output "Cleaning up OpenAI.netstandard2.0.cs..." +Write-Output "" + +$content = Get-Content $outputPath -Raw + +# Remove empty lines. +$content = $content -creplace '//.*\r?\n', '' +$content = $content -creplace '\r?\n\r?\n', "`n" +$content = $content -creplace '\r?\n *{', " {" + +# Remove fully-qualified names. +$content = $content -creplace "System\.ComponentModel\.", "" +$content = $content -creplace "System\.ClientModel.Primitives\.", "" +$content = $content -creplace "System\.ClientModel\.", "" +$content = $content -creplace "System\.Collections\.Generic\.", "" +$content = $content -creplace "System\.Collections\.", "" +$content = $content -creplace "System\.Threading.Tasks\.", "" +$content = $content -creplace "System\.Threading\.", "" +$content = $content -creplace "System\.Text.Json\.", "" +$content = $content -creplace "System\.Text\.", "" +$content = $content -creplace "System\.IO\.", "" +$content = $content -creplace "System\.", "" +$content = $content -creplace "Assistants\.", "" +$content = $content -creplace "Audio\.", "" +$content = $content -creplace "Batch\.", "" +$content = $content -creplace "Chat\.", "" +$content = $content -creplace "Embeddings\.", "" +$content = $content -creplace "Files\.", "" +$content = $content -creplace "FineTuning\.", "" +$content = $content -creplace "Images\.", "" 
+$content = $content -creplace "Models\.", "" +$content = $content -creplace "Moderations\.", "" +$content = $content -creplace "VectorStores\.", "" + +# Remove Diagnostics.DebuggerStepThrough attribute. +$content = $content -creplace ".*Diagnostics.DebuggerStepThrough.*\n", "" + +# Remove internal APIs. +$content = $content -creplace " * internal.*`n", "" + +# Other cosmetic simplifications. +$content = $content -creplace "partial class", "class" +$content = $content -creplace " { throw null; }", ";" +$content = $content -creplace " { }", ";" + +Set-Content -Path $outputPath -Value $content -NoNewline \ No newline at end of file diff --git a/.scripts/Invoke-CodeGen.ps1 b/.scripts/Invoke-CodeGen.ps1 new file mode 100644 index 000000000..1e999a5e7 --- /dev/null +++ b/.scripts/Invoke-CodeGen.ps1 @@ -0,0 +1,29 @@ +$repoRoot = Join-Path $PSScriptRoot .. -Resolve +$dotnetFolder = Join-Path $repoRoot .dotnet\src + +function Invoke([scriptblock]$script) { + $scriptString = $script | Out-String + Write-Host "--------------------------------------------------------------------------------`n> $scriptString" + & $script +} + +$scriptStartTime = Get-Date + +Push-Location $repoRoot/.typespec +try { + Invoke { npm ci } + Invoke { npm exec --no -- tsp format **/*tsp } + Invoke { npm exec --no -- tsp compile . } + Invoke { .$PSScriptRoot\Update-ClientModel.ps1 } + Invoke { .$PSScriptRoot\Edit-Deserialization.ps1 } + Invoke { .$PSScriptRoot\Run-Checks.ps1 } +} +finally { + Pop-Location +} + +$scriptElapsed = $(Get-Date) - $scriptStartTime +$scriptElapsedSeconds = [math]::Round($scriptElapsed.TotalSeconds, 1) +$scriptName = $MyInvocation.MyCommand.Name + +Write-Host "${scriptName} complete. 
Time: ${scriptElapsedSeconds}s" \ No newline at end of file diff --git a/.scripts/Run-Checks.ps1 b/.scripts/Run-Checks.ps1 new file mode 100644 index 000000000..8d82507c3 --- /dev/null +++ b/.scripts/Run-Checks.ps1 @@ -0,0 +1,141 @@ +function Run-ModelsSubnamespaceCheck { + Write-Output "" + Write-Output "Checking for unknown files using the OpenAI.Models namespace..." + + $root = Split-Path $PSScriptRoot -Parent + $directory = Join-Path -Path $root -ChildPath ".dotnet\src" + $files = Get-ChildItem -Path $($directory + "\*") -Include "*.cs" -Recurse + + $exclusions = @( + "GeneratorStubs.cs", + "InternalDeleteModelResponse.cs", + "InternalDeleteModelResponse.Serialization.cs", + "InternalDeleteModelResponseObject.cs", + "InternalListModelsResponseObject.cs", + "InternalModelObject.cs", + "ModelClient.cs", + "ModelClient.Protocol.cs", + "OpenAIModelInfo.cs", + "OpenAIModelInfo.Serialization.cs", + "OpenAIModelInfoCollection.cs", + "OpenAIModelInfoCollection.Serialization.cs", + "OpenAIModelsModelFactory.cs" + ) + + $failures = @() + + foreach ($file in $files) { + $content = Get-Content -Path $file -Raw + + if ($file.Name -in $exclusions) { + Write-Output "Skipped $($file.FullName)" + continue + } + + if ($content -cmatch '(?m)^namespace OpenAI\.Models(;)?(\r)?$') { + $failures += $file + } + } + + if ($failures.Length -gt 0) { + $message = "" + + foreach ($failure in $failures) { + $message += "$failure " + } + + $exception = ("One or more unknown files with the OpenAI.Models namespace were detected." + + " The OpenAI.Models namespace is reserved for OpenAI's Models API." + + " If this change is intentional, please add the filenames to the exclusion list:" + + " $($message)") + + throw $exception + } + + Write-Output "" + Write-Output "The check was successful." +} + +function Run-TopLevelNamespaceCheck { + Write-Output "" + Write-Output "Checking for unknown files using the OpenAI namespace..." 
+ + $root = Split-Path $PSScriptRoot -Parent + $directory = Join-Path -Path $root -ChildPath ".dotnet\src" + $files = Get-ChildItem -Path $($directory + "\*") -Include "*.cs" -Recurse + + $exclusions = @( + # Public types + "ListOrder.cs", + "OpenAIClient.cs", + "OpenAIClientOptions.cs", + "OpenAIModelFactory.cs", + + # Internal types + "Argument.cs", + "BinaryContentHelper.cs", + "CancellationTokenExtensions.cs", + "ChangeTrackingDictionary.cs", + "ChangeTrackingList.cs", + "ClientPipelineExtensions.cs", + "ClientUriBuilder.cs", + "ErrorResult.cs", + "IInternalListResponseOfT.cs", + "InternalFunctionDefinition.cs", + "InternalFunctionDefinition.Serialization.cs", + "ModelSerializationExtensions.cs", + "Optional.cs", + "TelemetryDetails.cs", + "Utf8JsonBinaryContent.cs", + + # Utilities + "AppContextSwitchHelper.cs", + "CodeGenClientAttribute.cs", + "CodeGenMemberAttribute.cs", + "CodeGenModelAttribute.cs", + "CodeGenSerializationAttribute.cs", + "CodeGenSuppressAttribute.cs", + "CodeGenTypeAttribute.cs", + "CustomSerializationHelpers.cs", + "GenericActionPipelinePolicy.cs", + "MultipartFormDataBinaryContent.cs", + "PageCollectionHelpers.cs", + "PageEnumerator.cs", + "PageResultEnumerator.cs" + ) + + $failures = @() + + foreach ($file in $files) { + $content = Get-Content -Path $file -Raw + + if ($file.Name -in $exclusions) { + Write-Output "Skipped $($file.FullName)" + continue + } + + if ($content -cmatch '(?m)^namespace OpenAI(;)?(\r)?$') { + $failures += $file + } + } + + if ($failures.Length -gt 0) { + $message = "" + + foreach ($failure in $failures) { + $message += "$failure " + } + + $exception = ("One or more unknown files with the OpenAI namespace were detected." + + " If this change is intentional, please add the filenames to the exclusion list:" + + " $($message)") + + throw $exception + } + + Write-Output "" + Write-Output "The check was successful." 
+} + +Run-ModelsSubnamespaceCheck +Run-TopLevelNamespaceCheck \ No newline at end of file diff --git a/.scripts/Update-ClientModel.ps1 b/.scripts/Update-ClientModel.ps1 new file mode 100644 index 000000000..bdb95cb15 --- /dev/null +++ b/.scripts/Update-ClientModel.ps1 @@ -0,0 +1,25 @@ +function Remove-MultipartFormDataBinaryContent { + $root = Split-Path $PSScriptRoot -Parent + $filePath = Join-Path -Path $root -ChildPath ".dotnet\src\Generated\Internal\MultipartFormDataBinaryContent.cs" + $file = Get-ChildItem -Path $filePath + + Write-Output "Removing $($file.FullName)" + + Remove-Item $file +} + +function Partialize-ClientPipelineExtensions { + $root = Split-Path $PSScriptRoot -Parent + $directory = Join-Path -Path $root -ChildPath ".dotnet\src\Generated\Internal" + $file = Get-ChildItem -Path $directory -Filter "ClientPipelineExtensions.cs" + $content = Get-Content -Path $file -Raw + + Write-Output "Editing $($file.FullName)" + + $content = $content -creplace "internal static class ClientPipelineExtensions", "internal static partial class ClientPipelineExtensions" + + $content | Set-Content -Path $file.FullName -NoNewline +} + +Remove-MultipartFormDataBinaryContent +Partialize-ClientPipelineExtensions diff --git a/.typespec.azure/assistants/client.tsp b/.typespec.azure/assistants/client.tsp new file mode 100644 index 000000000..5471c7065 --- /dev/null +++ b/.typespec.azure/assistants/client.tsp @@ -0,0 +1,5 @@ +import "@azure-tools/typespec-client-generator-core"; +import "./models.tsp"; + +using Azure.ClientGenerator.Core; +using AzureOpenAI; diff --git a/audio/main.tsp b/.typespec.azure/assistants/main.tsp similarity index 54% rename from audio/main.tsp rename to .typespec.azure/assistants/main.tsp index c6458821f..e7af5325f 100644 --- a/audio/main.tsp +++ b/.typespec.azure/assistants/main.tsp @@ -1,2 +1,2 @@ +import "./client.tsp"; import "./operations.tsp"; -import "./models.tsp"; diff --git a/.typespec.azure/assistants/models.tsp 
b/.typespec.azure/assistants/models.tsp new file mode 100644 index 000000000..4da4bf27f --- /dev/null +++ b/.typespec.azure/assistants/models.tsp @@ -0,0 +1,6 @@ +import "../common"; +import "../../.typespec/assistants"; + +using OpenAI; + +namespace AzureOpenAI; diff --git a/.typespec.azure/assistants/operations.tsp b/.typespec.azure/assistants/operations.tsp new file mode 100644 index 000000000..3a2df0910 --- /dev/null +++ b/.typespec.azure/assistants/operations.tsp @@ -0,0 +1,11 @@ +import "@typespec/http"; +import "@typespec/openapi"; + +import "../common"; +import "./models.tsp"; + +using TypeSpec.Http; +using TypeSpec.OpenAPI; +using OpenAI; + +namespace AzureOpenAI; diff --git a/.typespec.azure/chat/client.tsp b/.typespec.azure/chat/client.tsp new file mode 100644 index 000000000..b03530295 --- /dev/null +++ b/.typespec.azure/chat/client.tsp @@ -0,0 +1,43 @@ +import "@azure-tools/typespec-client-generator-core"; +import "./models.oyd.auth.tsp"; +import "./models.oyd.vectorization.tsp"; +import "./models.request.tsp"; +import "./models.response.tsp"; + +using Azure.ClientGenerator.Core; +using AzureOpenAI; + +@@access(AzureChatMessageContext, Access.public); +@@usage(AzureChatMessageContext, Usage.output); + +// +@@access(AzureChatDataSource, Access.public); +@@usage(AzureChatDataSource, Usage.input); + +@@access(AzureCosmosDBChatDataSource, Access.public); +@@usage(AzureCosmosDBChatDataSource, Usage.input); + +@@access(AzureMachineLearningIndexChatDataSource, Access.public); +@@usage(AzureMachineLearningIndexChatDataSource, Usage.input); + +@@access(AzureSearchChatDataSource, Access.public); +@@usage(AzureSearchChatDataSource, Usage.input); + +@@access(ElasticsearchChatDataSource, Access.public); +@@usage(ElasticsearchChatDataSource, Usage.input); + +@@access(PineconeChatDataSource, Access.public); +@@usage(PineconeChatDataSource, Usage.input); + +// +@@access(AzureChatDataSourceVectorizationSource, Access.public); 
+@@usage(AzureChatDataSourceVectorizationSource, Usage.output); + +@@access(AzureChatDataSourceEndpointVectorizationSource, Access.public); +@@usage(AzureChatDataSourceEndpointVectorizationSource, Usage.output); + +@@access(AzureChatDataSourceDeploymentNameVectorizationSource, Access.public); +@@usage(AzureChatDataSourceDeploymentNameVectorizationSource, Usage.output); + +@@access(AzureChatDataSourceModelIdVectorizationSource, Access.public); +@@usage(AzureChatDataSourceModelIdVectorizationSource, Usage.output); diff --git a/.typespec.azure/chat/main.tsp b/.typespec.azure/chat/main.tsp new file mode 100644 index 000000000..e7af5325f --- /dev/null +++ b/.typespec.azure/chat/main.tsp @@ -0,0 +1,2 @@ +import "./client.tsp"; +import "./operations.tsp"; diff --git a/.typespec.azure/chat/models.oyd.auth.tsp b/.typespec.azure/chat/models.oyd.auth.tsp new file mode 100644 index 000000000..a2a858329 --- /dev/null +++ b/.typespec.azure/chat/models.oyd.auth.tsp @@ -0,0 +1,50 @@ +import "../../.typespec/chat/models.tsp"; + +namespace AzureOpenAI; + +@discriminator("type") +model AzureChatDataSourceAuthenticationOptions { + type: string; +} + +model AzureChatDataSourceApiKeyAuthenticationOptions + extends AzureChatDataSourceAuthenticationOptions { + type: "api_key"; + key: string; +} + +model AzureChatDataSourceConnectionStringAuthenticationOptions + extends AzureChatDataSourceAuthenticationOptions { + type: "connection_string"; + connection_string: string; +} + +model AzureChatDataSourceKeyAndKeyIdAuthenticationOptions + extends AzureChatDataSourceAuthenticationOptions { + type: "key_and_key_id"; + key: string; + key_id: string; +} + +model AzureChatDataSourceEncodedApiKeyAuthenticationOptions + extends AzureChatDataSourceAuthenticationOptions { + type: "encoded_api_key"; + encoded_api_key: string; +} + +model AzureChatDataSourceAccessTokenAuthenticationOptions + extends AzureChatDataSourceAuthenticationOptions { + type: "access_token"; + access_token: string; +} + +model 
AzureChatDataSourceSystemAssignedManagedIdentityAuthenticationOptions + extends AzureChatDataSourceAuthenticationOptions { + type: "system_assigned_managed_identity"; +} + +model AzureChatDataSourceUserAssignedManagedIdentityAuthenticationOptions + extends AzureChatDataSourceAuthenticationOptions { + type: "user_assigned_managed_identity"; + managed_identity_resource_id: string; +} diff --git a/.typespec.azure/chat/models.oyd.vectorization.tsp b/.typespec.azure/chat/models.oyd.vectorization.tsp new file mode 100644 index 000000000..3bcbfa12d --- /dev/null +++ b/.typespec.azure/chat/models.oyd.vectorization.tsp @@ -0,0 +1,78 @@ +import "../../.typespec/chat/models.tsp"; +import "./models.oyd.auth.tsp"; + +namespace AzureOpenAI; + +/** + * A representation of a data vectorization source usable as an embedding resource with a data source. + */ +@discriminator("type") +model AzureChatDataSourceVectorizationSource { + /** The differentiating identifier for the concrete vectorization source. */ + type: string; +} + +/** + * Represents a vectorization source that makes public service calls against an Azure OpenAI embedding model deployment. + */ +model AzureChatDataSourceEndpointVectorizationSource + extends AzureChatDataSourceVectorizationSource { + /** The type identifier, always 'endpoint' for this vectorization source type. */ + type: "endpoint"; + + /** + * Specifies the resource endpoint URL from which embeddings should be retrieved. + * It should be in the format of: + * https://YOUR_RESOURCE_NAME.openai.azure.com/openai/deployments/YOUR_DEPLOYMENT_NAME/embeddings. + * The api-version query parameter is not allowed. + */ + endpoint: url; + + /** + * The authentication mechanism to use with the endpoint-based vectorization source. + * Endpoint authentication supports API key and access token mechanisms. 
+ */ + authentication: AzureChatDataSourceApiKeyAuthenticationOptions | AzureChatDataSourceAccessTokenAuthenticationOptions; + + /** + * The number of dimensions to request on embeddings. + * Only supported in 'text-embedding-3' and later models. + */ + dimensions?: int32; +} + +/** + * Represents a vectorization source that makes internal service calls against an Azure OpenAI embedding model + * deployment. In contrast with the endpoint-based vectorization source, a deployment-name-based vectorization source + * must be part of the same Azure OpenAI resource but can be used even in private networks. + */ +model AzureChatDataSourceDeploymentNameVectorizationSource + extends AzureChatDataSourceVectorizationSource { + /** The type identifier, always 'deployment_name' for this vectorization source type. */ + type: "deployment_name"; + + /** + * The embedding model deployment to use for vectorization. This deployment must exist within the same Azure OpenAI + * resource as the model deployment being used for chat completions. + */ + deployment_name: string; + + /** + * The number of dimensions to request on embeddings. + * Only supported in 'text-embedding-3' and later models. + */ + dimensions?: int32; +} + +/** + * Represents a vectorization source that makes service calls based on a search service model ID. + * This source type is currently only supported by Elasticsearch. + */ +model AzureChatDataSourceModelIdVectorizationSource + extends AzureChatDataSourceVectorizationSource { + /** The type identifier, always 'model_id' for this vectorization source type. */ + type: "model_id"; + + /** The embedding model build ID to use for vectorization. 
*/ + model_id: string; +} diff --git a/.typespec.azure/chat/models.request.tsp b/.typespec.azure/chat/models.request.tsp new file mode 100644 index 000000000..11d85bb50 --- /dev/null +++ b/.typespec.azure/chat/models.request.tsp @@ -0,0 +1,255 @@ +import "../../.typespec/chat/models.tsp"; +import "./models.oyd.auth.tsp"; +import "./models.oyd.vectorization.tsp"; + +namespace AzureOpenAI; + +/** + * The extended request model for chat completions against the Azure OpenAI service. + * This adds the ability to provide data sources for the On Your Data feature. + */ +model AzureCreateChatCompletionRequest + extends OpenAI.CreateChatCompletionRequest { + /** + * The data sources to use for the On Your Data feature, exclusive to Azure OpenAI. + */ + data_sources?: AzureChatDataSource[]; +} + +/** + * A representation of configuration data for a single Azure OpenAI chat data source. + * This will be used by a chat completions request that should use Azure OpenAI chat extensions to augment the + * response behavior. + * The use of this configuration is compatible only with Azure OpenAI. + */ +@discriminator("type") +model AzureChatDataSource { + /** The differentiating type identifier for the data source. */ + type: string; +} + +/** Represents a data source configuration that will use an Azure Search resource. */ +model AzureSearchChatDataSource extends AzureChatDataSource { + /** The discriminated type identifier, which is always 'azure_search'. */ + type: "azure_search"; + + /** The parameter information to control the use of the Azure Search data source. */ + parameters: { + ...AzureChatDataSourceCommonParameters; + + /** The absolute endpoint path for the Azure Search resource to use. */ + endpoint: url; + + /** The name of the index to use, as specified in the Azure Search resource. */ + index_name: string; + + /** + * The authentication mechanism to use with Azure Search. 
+ */ + authentication: + | AzureChatDataSourceApiKeyAuthenticationOptions + | AzureChatDataSourceSystemAssignedManagedIdentityAuthenticationOptions + | AzureChatDataSourceUserAssignedManagedIdentityAuthenticationOptions + | AzureChatDataSourceAccessTokenAuthenticationOptions; + + /** The field mappings to use with the Azure Search resource. */ + fields_mapping?: { + /** The name of the index field to use as a title. */ + title_field?: string; + + /** The name of the index field to use as a URL. */ + url_field?: string; + + /** The name of the index field to use as a filepath. */ + filepath_field?: string; + + /** The names of index fields that should be treated as content. */ + content_fields?: string[]; + + /** The separator pattern that content fields should use. */ + content_fields_separator?: string; + + /** The names of fields that represent vector data. */ + vector_fields?: string[]; + + /** The names of fields that represent image vector data. */ + image_vector_fields?: string[]; + }; + + /** The query type for the Azure Search resource to use. */ + query_type?: + | "simple" + | "semantic" + | "vector" + | "vector_simple_hybrid" + | "vector_semantic_hybrid"; + + /** Additional semantic configuration for the query. */ + semantic_configuration?: string; + + /** A filter to apply to the search. */ + filter?: string; + + /** + * The vectorization source to use with Azure Search. + * Supported sources for Azure Search include endpoint and deployment name. + */ + embedding_dependency?: AzureChatDataSourceEndpointVectorizationSource | AzureChatDataSourceDeploymentNameVectorizationSource; + }; +} + +/** Represents a data source configuration that will use an Azure Machine Learning vector index. */ +model AzureMachineLearningIndexChatDataSource extends AzureChatDataSource { + /** The discriminated type identifier, which is always 'azure_ml_index'. 
*/ + type: "azure_ml_index"; + + /** The parameter information to control the use of the Azure Machine Learning Index data source. */ + parameters: { + ...AzureChatDataSourceCommonParameters; + authentication: AzureChatDataSourceAccessTokenAuthenticationOptions | AzureChatDataSourceSystemAssignedManagedIdentityAuthenticationOptions | AzureChatDataSourceUserAssignedManagedIdentityAuthenticationOptions; + + /** The ID of the Azure Machine Learning index project to use. */ + project_resource_id: string; + + /** The name of the Azure Machine Learning index to use. */ + name: string; + + /** The version of the vector index to use. */ + version: string; + + /** A search filter, which is only applicable if the vector index is of the 'AzureSearch' type. */ + filter?: string; + }; +} + +/** Represents a data source configuration that will use an Azure CosmosDB resource. */ +model AzureCosmosDBChatDataSource extends AzureChatDataSource { + /** The discriminated type identifier, which is always 'azure_cosmos_db'. */ + type: "azure_cosmos_db"; + + /** The parameter information to control the use of the Azure CosmosDB data source. */ + parameters: { + ...AzureChatDataSourceCommonParameters; + container_name: string; + database_name: string; + embedding_dependency: AzureChatDataSourceVectorizationSource; + index_name: string; + authentication: AzureChatDataSourceConnectionStringAuthenticationOptions; + fields_mapping: { + content_fields: string[]; + vector_fields: string[]; + title_field?: string; + url_field?: string; + filepath_field?: string; + content_fields_separator?: string; + }; + }; +} + +model ElasticsearchChatDataSource extends AzureChatDataSource { + /** The discriminated type identifier, which is always 'elasticsearch'. */ + type: "elasticsearch"; + + /** The parameter information to control the use of the Elasticsearch data source. 
*/ + parameters: { + ...AzureChatDataSourceCommonParameters; + endpoint: url; + index_name: string; + authentication: AzureChatDataSourceKeyAndKeyIdAuthenticationOptions | AzureChatDataSourceEncodedApiKeyAuthenticationOptions; + fields_mapping?: { + title_field?: string; + url_field?: string; + filepath_field?: string; + content_fields?: string[]; + content_fields_separator?: string; + vector_fields?: string[]; + }; + query_type?: "simple" | "vector"; + embedding_dependency?: AzureChatDataSourceVectorizationSource; + }; +} + +model PineconeChatDataSource extends AzureChatDataSource { + /** The discriminated type identifier, which is always 'pinecone'. */ + type: "pinecone"; + + /** The parameter information to control the use of the Pinecone data source. */ + parameters: { + ...AzureChatDataSourceCommonParameters; + + /** The environment name to use with Pinecone. */ + environment: string; + + /** The name of the Pinecone database index to use. */ + index_name: string; + + /** + * The authentication mechanism to use with Pinecone. + * Supported authentication mechanisms for Pinecone include: API key. + */ + authentication: AzureChatDataSourceApiKeyAuthenticationOptions; + + /** + * The vectorization source to use as an embedding dependency for the Pinecone data source. + * Supported vectorization sources for Pinecone include: deployment name. + */ + embedding_dependency: AzureChatDataSourceVectorizationSource; + + /** + * Field mappings to apply to data used by the Pinecone data source. + * Note that content field mappings are required for Pinecone. + */ + fields_mapping: { + content_fields: string[]; + title_field?: string; + url_field?: string; + filepath_field?: string; + content_fields_separator?: string; + }; + }; +} + +alias AzureChatDataSourceCommonParameters = { + /** The configured number of documents to feature in the query. */ + top_n_documents?: int32; + + /** Whether queries should be restricted to use of the indexed data. 
*/ + in_scope?: boolean; + + /** + * The configured strictness of the search relevance filtering. + * Higher strictness will increase precision but lower recall of the answer. + */ + @minValue(1) + @maxValue(5) + strictness?: int32; + + /** + * Additional instructions for the model to inform how it should behave and any context it should reference when + * generating a response. You can describe the assistant's personality and tell it how to format responses. + * This is limited to 100 tokens and counts against the overall token limit. + */ + role_information?: string; + + /** + * The maximum number of rewritten queries that should be sent to the search provider for a single user message. + * By default, the system will make an automatic determination. + */ + max_search_queries?: int32; + + /** + * If set to true, the system will allow partial search results to be used and the request will fail if all + * partial queries fail. If not specified or specified as false, the request will fail if any search query fails. + */ + allow_partial_result?: boolean = false; + + /** + * The output context properties to include on the response. + * By default, citations and intent will be requested. + */ + @maxItems(3) + include_contexts?: ("citations" | "intent" | "all_retrieved_documents")[] = [ + "citations", + "intent" + ]; +}; diff --git a/.typespec.azure/chat/models.response.tsp b/.typespec.azure/chat/models.response.tsp new file mode 100644 index 000000000..c0c09d209 --- /dev/null +++ b/.typespec.azure/chat/models.response.tsp @@ -0,0 +1,101 @@ +import "../../.typespec/chat/models.tsp"; +import "../common"; + +namespace AzureOpenAI; + +/** + * The extended top-level chat completion response model for the Azure OpenAI service. + * This model adds Responsible AI content filter annotations for prompt input. 
+ */ +model AzureCreateChatCompletionResponse + extends OpenAI.CreateChatCompletionResponse { + /** + * The Responsible AI content filter annotations associated with prompt inputs into chat completions. + */ + prompt_filter_results?: { + /** + * The index of the input prompt that this content filter result corresponds to. + */ + prompt_index: int32; + + /** + * The content filter results associated with the indexed input prompt. + */ + content_filter_results: AzureContentFilterResultForPrompt; + }[]; +} + +/** + * An additional property, added to chat completion response messages, produced by the Azure OpenAI service when using + * extension behavior. This includes intent and citation information from the On Your Data feature. + */ +model AzureChatMessageContext { + /** The detected intent from the chat history, which is used to carry conversation context between interactions */ + intent?: string; + + /** The citations produced by the data retrieval. */ + citations?: AzureChatDataSourceCitation[]; + + /** Summary information about documents retrieved by the data retrieval operation. */ + all_retrieved_documents?: { + ...AzureChatDataSourceCitation; + + /** The search queries executed to retrieve documents. */ + search_queries: string[]; + + /** The index of the data source used for retrieval. */ + data_source_index: int32; + + /** The original search score for the retrieval. */ + original_search_score?: float64; + + /** The rerank score for the retrieval. */ + rerank_score?: float64; + + /** If applicable, an indication of why the document was filtered. */ + filter_reason?: "score" | "rerank"; + }; +} + +/** + * The extended response model component for chat completion response messages on the Azure OpenAI service. + * This model adds support for chat message context, used by the On Your Data feature for intent, citations, and other + * information related to retrieval-augmented generation performed. 
+ */ +model AzureChatCompletionResponseMessage + extends OpenAI.ChatCompletionResponseMessage { + /** + * The Azure-specific context information associated with the chat completion response message. + */ + context?: AzureChatMessageContext; +} + +/** + * The extended response model for a streaming chat response message on the Azure OpenAI service. + * This model adds support for chat message context, used by the On Your Data feature for intent, citations, and other + * information related to retrieval-augmented generation performed. + */ +model AzureChatCompletionStreamResponseDelta + extends OpenAI.ChatCompletionStreamResponseDelta { + /** + * The Azure-specific context information associated with the chat completion response message. + */ + context?: AzureChatMessageContext; +} + +alias AzureChatDataSourceCitation = { + /** The content of the citation. */ + content: string; + + /** The title for the citation. */ + title?: string; + + /** The URL of the citation. */ + url?: string; + + /** The file path for the citation. */ + filepath?: string; + + /** The chunk ID for the citation. 
*/ + chunk_id?: string; +}; diff --git a/.typespec.azure/chat/operations.tsp b/.typespec.azure/chat/operations.tsp new file mode 100644 index 000000000..138a7283d --- /dev/null +++ b/.typespec.azure/chat/operations.tsp @@ -0,0 +1,23 @@ +import "@typespec/http"; +import "@typespec/openapi"; + +import "../common"; +import "./models.request.tsp"; +import "./models.response.tsp"; + +using TypeSpec.Http; +using TypeSpec.OpenAPI; +using OpenAI; + +namespace AzureOpenAI; + +@route("/chat") +interface AzureChat { + @route("completions") + @post + @operationId("createChatCompletion") + @tag("Chat") + createChatCompletion( + ...AzureCreateChatCompletionRequest, + ): AzureCreateChatCompletionResponse | AzureOpenAIChatErrorResponse; +} diff --git a/.typespec.azure/common/client.tsp b/.typespec.azure/common/client.tsp new file mode 100644 index 000000000..0895990a6 --- /dev/null +++ b/.typespec.azure/common/client.tsp @@ -0,0 +1,33 @@ +import "@azure-tools/typespec-client-generator-core"; +import "./models.errors.tsp"; +import "./models.rai.tsp"; + +using Azure.ClientGenerator.Core; +using AzureOpenAI; + +@@access(AzureContentFilterBlocklistResult, Access.public); +@@usage(AzureContentFilterBlocklistResult, Usage.output); + +@@access(AzureContentFilterResultForChoice, Access.public); +@@usage(AzureContentFilterResultForChoice, Usage.output); + +@@access(AzureContentFilterBlocklistIdResult, Access.public); +@@usage(AzureContentFilterBlocklistIdResult, Usage.output); + +@@access(AzureOpenAIError, Access.public); +@@usage(AzureOpenAIError, Usage.output); + +@@access(AzureOpenAIErrorResponse, Access.public); +@@usage(AzureOpenAIErrorResponse, Usage.output); + +@@access(AzureOpenAIChatError, Access.public); +@@usage(AzureOpenAIChatError, Usage.output); + +@@access(AzureOpenAIDalleError, Access.public); +@@usage(AzureOpenAIDalleError, Usage.output); + +@@access(AzureOpenAIChatErrorResponse, Access.public); +@@usage(AzureOpenAIChatErrorResponse, Usage.output); + 
+@@access(AzureOpenAIDalleErrorResponse, Access.public); +@@usage(AzureOpenAIDalleErrorResponse, Usage.output); diff --git a/.typespec.azure/common/main.tsp b/.typespec.azure/common/main.tsp new file mode 100644 index 000000000..2faab4e73 --- /dev/null +++ b/.typespec.azure/common/main.tsp @@ -0,0 +1,3 @@ +import "./client.tsp"; +import "./models.rai.tsp"; +import "./models.errors.tsp"; diff --git a/.typespec.azure/common/models.errors.tsp b/.typespec.azure/common/models.errors.tsp new file mode 100644 index 000000000..22476f174 --- /dev/null +++ b/.typespec.azure/common/models.errors.tsp @@ -0,0 +1,49 @@ +import "./models.rai.tsp"; + +namespace AzureOpenAI; + +/** The details of an error resulting from an Azure OpenAI service request. */ +model AzureOpenAIError<T> { + /** The distinct, machine-generated identifier for the error. */ + code?: string; + + /** A human-readable message associated with the error. */ + message?: string; + + /** If applicable, the request input parameter associated with the error. */ + param?: string; + + /** If applicable, the type of the error. */ + type?: string; + + /** If applicable, an upstream error that originated this error. */ + inner_error?: { + /** The code associated with the inner error. */ + code?: "ResponsibleAIPolicyViolation"; + + /** If applicable, the modified prompt used for generation. */ + revised_prompt?: string; + + /** The content filter result details associated with the inner error. */ + content_filter_results?: T; + }; +} + +/** A structured representation of an error from an Azure OpenAI request. */ +model AzureOpenAIErrorResponse<T> { + error?: T; +} + +/** The structured representation of an error from an Azure OpenAI chat completion request. */ +model AzureOpenAIChatError + is AzureOpenAIError<AzureContentFilterResultForChoice>; + +/** The structured representation of an error from an Azure OpenAI image generation request. 
 */ +model AzureOpenAIDalleError + is AzureOpenAIError<AzureContentFilterImagePromptResults>; + +model AzureOpenAIChatErrorResponse + is AzureOpenAIErrorResponse<AzureOpenAIChatError>; + +model AzureOpenAIDalleErrorResponse + is AzureOpenAIErrorResponse<AzureOpenAIDalleError>; diff --git a/.typespec.azure/common/models.rai.tsp b/.typespec.azure/common/models.rai.tsp new file mode 100644 index 000000000..facab0100 --- /dev/null +++ b/.typespec.azure/common/models.rai.tsp @@ -0,0 +1,217 @@ +namespace AzureOpenAI; + +/** + * A labeled content filter result item that indicates whether the content was filtered and what the qualitative + * severity level of the content was, as evaluated against content filter configuration for the category. + */ +model AzureContentFilterSeverityResult { + /** Whether the content severity resulted in a content filtering action. */ + filtered: boolean; + + /** The labeled severity of the content. */ + severity: "safe" | "low" | "medium" | "high"; +} + +/** + * A labeled content filter result item that indicates whether the content was detected and whether the content was + * filtered. + */ +model AzureContentFilterDetectionResult { + /** Whether the content detection resulted in a content filtering action. */ + filtered: boolean; + + /** Whether the labeled content category was detected in the content. */ + detected: boolean; +} + +/** + * A content filter result item that associates an existing custom blocklist ID with a value indicating whether or not + * the corresponding blocklist resulted in content being filtered. + */ +model AzureContentFilterBlocklistIdResult { + /** The ID of the custom blocklist associated with the filtered status. */ + id: string; + + /** Whether the associated blocklist resulted in the content being filtered. 
*/ + filtered: boolean; +} + +alias AzureContentFilterCategoriesBase = { + ...SexualSeverityCategory; + ...HateSeverityCategory; + ...ViolenceSeverityCategory; + ...SelfHarmSeverityCategory; + ...ProfanityCategory; + ...CustomBlocklistsResult; + + /** + * If present, details about an error that prevented content filtering from completing its evaluation. + */ + error?: { + /** + * A distinct, machine-readable code associated with the error. + */ + code: int32; + + /** + * A human-readable message associated with the error. + */ + message: string; + }; +}; + +/** + * A collection of true/false filtering results for configured custom blocklists. + */ +model AzureContentFilterBlocklistResult { + /** A value indicating whether any of the detailed blocklists resulted in a filtering action. */ + filtered: boolean; + + /** The pairs of individual blocklist IDs and whether they resulted in a filtering action. */ + details?: { + /** A value indicating whether the blocklist produced a filtering action. */ + filtered: boolean; + + /** The ID of the custom blocklist evaluated. */ + id: string; + }[]; +} + +/** + * A content filter result associated with a single input prompt item into a generative AI system. + */ +model AzureContentFilterResultForPrompt { + /** + * The index of the input prompt associated with the accompanying content filter result categories. + */ + prompt_index?: int32; + + /** + * The content filter category details for the result. + */ + content_filter_results?: { + ...AzureContentFilterCategoriesBase; + ...JailbreakResult; + + /** + * A detection result that describes attacks on systems powered by Generative AI models that can happen every time + * an application processes information that wasn’t directly authored by either the developer of the application or + * the user. + */ + indirect_attack: AzureContentFilterDetectionResult; + }; +} + +/** + * A content filter result for a single response item produced by a generative AI system. 
+ */ +model AzureContentFilterResultForChoice { + ...AzureContentFilterCategoriesBase; + + /** + * A detection result that describes a match against text protected under copyright or other status. + */ + protected_material_text?: AzureContentFilterDetectionResult; + + /** + * A detection result that describes a match against licensed code or other protected source material. + */ + protected_material_code?: { + ...AzureContentFilterDetectionResult; + + /** + * If available, the citation details describing the associated license and its location. + */ + citation?: { + /** + * The name or identifier of the license associated with the detection. + */ + license?: string; + + /** + * The URL associated with the license. + */ + URL?: url; + }; + }; +} + +/** + * A content filter result for an image generation operation's output response content. + */ +model AzureContentFilterImageResponseResults { + ...SexualSeverityCategory; + ...ViolenceSeverityCategory; + ...HateSeverityCategory; + ...SelfHarmSeverityCategory; +} + +/** + * A content filter result for an image generation operation's input request content. + */ +model AzureContentFilterImagePromptResults + extends AzureContentFilterImageResponseResults { + ...ProfanityCategory; + ...CustomBlocklistsResult; + ...JailbreakResult; +} + +alias SexualSeverityCategory = { + /** + * A content filter category for language related to anatomical organs and genitals, romantic relationships, acts + * portrayed in erotic or affectionate terms, pregnancy, physical sexual acts, including those portrayed as an + * assault or a forced sexual violent act against one's will, prostitution, pornography, and abuse. 
+ */ + sexual?: AzureContentFilterSeverityResult; +}; + +alias ViolenceSeverityCategory = { + /** + * A content filter category for language related to physical actions intended to hurt, injure, damage, or kill + * someone or something; describes weapons, guns and related entities, such as manufacturers, associations, + * legislation, and so on. + */ + violence?: AzureContentFilterSeverityResult; +}; + +alias HateSeverityCategory = { + /** + * A content filter category that can refer to any content that attacks or uses pejorative or discriminatory + * language with reference to a person or identity group based on certain differentiating attributes of these groups + * including but not limited to race, ethnicity, nationality, gender identity and expression, sexual orientation, + * religion, immigration status, ability status, personal appearance, and body size. + */ + hate?: AzureContentFilterSeverityResult; +}; + +alias SelfHarmSeverityCategory = { + /** + * A content filter category that describes language related to physical actions intended to purposely hurt, injure, + * damage one's body or kill oneself. + */ + self_harm?: AzureContentFilterSeverityResult; +}; + +alias ProfanityCategory = { + /** + * A detection result that identifies whether crude, vulgar, or otherwise objectionable language is present in the + * content. + */ + profanity?: AzureContentFilterDetectionResult; +}; + +alias JailbreakResult = { + /** + * A detection result that describes user prompt injection attacks, where malicious users deliberately exploit + * system vulnerabilities to elicit unauthorized behavior from the LLM. This could lead to inappropriate content + * generation or violations of system-imposed restrictions. + */ + jailbreak: AzureContentFilterDetectionResult; +}; + +alias CustomBlocklistsResult = { + /** + * A collection of binary filtering outcomes for configured custom blocklists. 
+ */ + custom_blocklists?: AzureContentFilterBlocklistResult; +}; diff --git a/.typespec.azure/images/client.tsp b/.typespec.azure/images/client.tsp new file mode 100644 index 000000000..eb5e99366 --- /dev/null +++ b/.typespec.azure/images/client.tsp @@ -0,0 +1,8 @@ +import "@azure-tools/typespec-client-generator-core"; +import "./models.tsp"; + +using Azure.ClientGenerator.Core; +using AzureOpenAI; + +@@access(AzureImage, Access.public); +@@usage(AzureImage, Usage.output); diff --git a/.typespec.azure/images/main.tsp b/.typespec.azure/images/main.tsp new file mode 100644 index 000000000..e7af5325f --- /dev/null +++ b/.typespec.azure/images/main.tsp @@ -0,0 +1,2 @@ +import "./client.tsp"; +import "./operations.tsp"; diff --git a/.typespec.azure/images/models.tsp b/.typespec.azure/images/models.tsp new file mode 100644 index 000000000..ed1794052 --- /dev/null +++ b/.typespec.azure/images/models.tsp @@ -0,0 +1,11 @@ +import "../common"; +import "../../.typespec/images"; + +using OpenAI; + +namespace AzureOpenAI; + +model AzureImage extends Image { + prompt_filter_results: AzureContentFilterImagePromptResults; + content_filter_results: AzureContentFilterImageResponseResults; +} diff --git a/.typespec.azure/images/operations.tsp b/.typespec.azure/images/operations.tsp new file mode 100644 index 000000000..3b4e79062 --- /dev/null +++ b/.typespec.azure/images/operations.tsp @@ -0,0 +1,24 @@ +import "@typespec/http"; +import "@typespec/openapi"; + +import "../common"; +import "./models.tsp"; + +using TypeSpec.Http; +using TypeSpec.OpenAPI; +using OpenAI; + +namespace AzureOpenAI; + +@route("/deployments/{deploymentId}/images") +interface AzureImages { + @route("/generations") + @post + @operationId("ImageGenerations_Create") + @tag("Images") + createImage( + @path deploymentId: string, + @body image: CreateImageRequest, + // @query api-version: string; + ): ImagesResponse | AzureOpenAIDalleErrorResponse; +} diff --git a/.typespec.azure/main.tsp b/.typespec.azure/main.tsp 
new file mode 100644 index 000000000..cc5320f68 --- /dev/null +++ b/.typespec.azure/main.tsp @@ -0,0 +1,45 @@ +import "@typespec/http"; +import "@typespec/openapi3"; +import "@typespec/openapi"; + +import "./assistants"; +import "./chat"; +import "./images"; +import "./runs"; +import "./messages"; + +using TypeSpec.Http; + +@service({ + title: "Azure OpenAI Service", + termsOfService: "https://openai.com/policies/terms-of-use", + contact: { + name: "OpenAI Support", + url: "https://help.openai.com", + }, + license: { + name: "MIT", + url: "https://github.com/openai/openai-openapi/blob/master/LICENSE", + }, +}) +@server( + "{endpoint}/openai", + "Azure OpenAI APIs for completions and search", + { + @doc(""" + Supported Cognitive Services endpoints (protocol and hostname, for example: + https://westus.api.cognitive.microsoft.com). + """) + endpoint: string, + } +) +@useAuth( + ApiKeyAuth | OAuth2Auth<[ + { + type: OAuth2FlowType.implicit, + authorizationUrl: "https://login.microsoftonline.com/common/oauth2/v2.0/authorize", + scopes: ["https://cognitiveservices.azure.com/.default"], + } + ]> +) +namespace AzureOpenAI; diff --git a/.typespec.azure/messages/client.tsp b/.typespec.azure/messages/client.tsp new file mode 100644 index 000000000..b7a2d6b71 --- /dev/null +++ b/.typespec.azure/messages/client.tsp @@ -0,0 +1,6 @@ +import "@azure-tools/typespec-client-generator-core"; +import "./models.tsp"; + +using Azure.ClientGenerator.Core; + +namespace AzureOpenAI; diff --git a/.typespec.azure/messages/main.tsp b/.typespec.azure/messages/main.tsp new file mode 100644 index 000000000..e7af5325f --- /dev/null +++ b/.typespec.azure/messages/main.tsp @@ -0,0 +1,2 @@ +import "./client.tsp"; +import "./operations.tsp"; diff --git a/.typespec.azure/messages/models.tsp b/.typespec.azure/messages/models.tsp new file mode 100644 index 000000000..4c8d4fa60 --- /dev/null +++ b/.typespec.azure/messages/models.tsp @@ -0,0 +1,5 @@ +import "../../.typespec/messages"; + +using 
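As a sketch of how the pieces in `main.tsp` fit together: the `@server` template prefixes every route with `{endpoint}/openai`, and `@useAuth` admits either an API key or an OAuth2 bearer token scoped to `https://cognitiveservices.azure.com/.default`. The deployment name, api-version value, and header contents below are illustrative placeholders, not part of the spec:

```python
from typing import Optional

# Hypothetical client-side sketch: expand the "{endpoint}/openai" server
# template into the image-generation route declared by AzureImages, and build
# headers for either of the two declared auth schemes. All concrete values
# (deployment name, api-version, key, token) are invented placeholders.
def generation_url(endpoint: str, deployment_id: str, api_version: str) -> str:
    base = f"{endpoint.rstrip('/')}/openai"  # from the @server template
    return f"{base}/deployments/{deployment_id}/images/generations?api-version={api_version}"

def auth_headers(api_key: Optional[str] = None, bearer_token: Optional[str] = None) -> dict:
    if api_key is not None:
        return {"api-key": api_key}  # header-based ApiKeyAuth
    if bearer_token is not None:
        # OAuth2 token issued for https://cognitiveservices.azure.com/.default
        return {"Authorization": f"Bearer {bearer_token}"}
    raise ValueError("one of api_key or bearer_token is required")

print(generation_url("https://westus.api.cognitive.microsoft.com",
                     "my-dalle-deployment", "2024-07-01-preview"))
```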
TypeSpec.OpenAPI; + +namespace AzureOpenAI; diff --git a/.typespec.azure/messages/operations.tsp b/.typespec.azure/messages/operations.tsp new file mode 100644 index 000000000..d468ffbda --- /dev/null +++ b/.typespec.azure/messages/operations.tsp @@ -0,0 +1,26 @@ +import "@typespec/http"; +import "@typespec/openapi"; + +import "../common"; +import "./models.tsp"; + +using TypeSpec.Http; +using TypeSpec.OpenAPI; + +using OpenAI; + +namespace AzureOpenAI; + +@route("threads/{thread_id}/messages") +interface AzureMessages { + @post + @operationId("createMessage") + @tag("Assistants") + @summary("Create a message.") + createMessage( + /** The ID of the [thread](/docs/api-reference/threads) to create a message for. */ + @path thread_id: string, + + @body requestBody: OpenAI.CreateMessageRequest, + ): OpenAI.MessageObject | OpenAI.ErrorResponse; +} diff --git a/.typespec.azure/runs/client.tsp b/.typespec.azure/runs/client.tsp new file mode 100644 index 000000000..b7a2d6b71 --- /dev/null +++ b/.typespec.azure/runs/client.tsp @@ -0,0 +1,6 @@ +import "@azure-tools/typespec-client-generator-core"; +import "./models.tsp"; + +using Azure.ClientGenerator.Core; + +namespace AzureOpenAI; diff --git a/.typespec.azure/runs/main.tsp b/.typespec.azure/runs/main.tsp new file mode 100644 index 000000000..e7af5325f --- /dev/null +++ b/.typespec.azure/runs/main.tsp @@ -0,0 +1,2 @@ +import "./client.tsp"; +import "./operations.tsp"; diff --git a/.typespec.azure/runs/models.tsp b/.typespec.azure/runs/models.tsp new file mode 100644 index 000000000..22bd3ba03 --- /dev/null +++ b/.typespec.azure/runs/models.tsp @@ -0,0 +1,5 @@ +import "../../.typespec/runs"; + +using TypeSpec.OpenAPI; + +namespace AzureOpenAI; diff --git a/.typespec.azure/runs/operations.tsp b/.typespec.azure/runs/operations.tsp new file mode 100644 index 000000000..2f5c07af8 --- /dev/null +++ b/.typespec.azure/runs/operations.tsp @@ -0,0 +1,10 @@ +import "@typespec/http"; +import "@typespec/openapi"; + +import 
"../../.typespec/runs"; +import "../../.typespec/streaming/models.tsp"; + +using TypeSpec.Http; +using TypeSpec.OpenAPI; + +namespace AzureOpenAI; diff --git a/.typespec.azure/tspconfig.yaml b/.typespec.azure/tspconfig.yaml new file mode 100644 index 000000000..aa6b14917 --- /dev/null +++ b/.typespec.azure/tspconfig.yaml @@ -0,0 +1,17 @@ +emit: + - "@typespec/openapi3" + - "@azure-tools/typespec-csharp" +options: + "@typespec/openapi3": + output-file: "{project-root}/../.openapi3.azure/openapi3-azure-openai.yaml" + "@azure-tools/typespec-csharp": + namespace: "Azure.AI.OpenAI" + emitter-output-dir: "{project-root}/../.dotnet.azure/src" + generate-convenience-methods: false + unreferenced-types-handling: keepAll + model-namespace: false + generate-test-project: false + single-top-level-client: true + new-project: false + flavor: "unbranded" + enable-internal-raw-data: true diff --git a/.typespec/administration/main.tsp b/.typespec/administration/main.tsp new file mode 100644 index 000000000..5ad1d3a2b --- /dev/null +++ b/.typespec/administration/main.tsp @@ -0,0 +1 @@ +import "./models.tsp"; diff --git a/.typespec/administration/models.tsp b/.typespec/administration/models.tsp new file mode 100644 index 000000000..f429a8929 --- /dev/null +++ b/.typespec/administration/models.tsp @@ -0,0 +1,748 @@ +/* + * This file was automatically generated from an OpenAPI .yaml file. + * Edits made directly to this file will be lost. + */ + +using TypeSpec.OpenAPI; + +namespace OpenAI; + +/** The service account that performed the audit logged action. */ +model AuditLogActorServiceAccount { + /** The service account id. */ + id?: string; +} + +/** The user who performed the audit logged action. */ +model AuditLogActorUser { + /** The user id. */ + id?: string; + + /** The user email. */ + email?: string; +} + +/** The API Key used to perform the audit logged action. */ +model AuditLogActorApiKey { + /** The tracking id of the API key. 
*/ + id?: string; + + @doc(""" + The type of API key. Can be either `user` or `service_account`. + """) + type?: "user" | "service_account"; + + user?: AuditLogActorUser; + service_account?: AuditLogActorServiceAccount; +} + +/** The session in which the audit logged action was performed. */ +model AuditLogActorSession { + user?: AuditLogActorUser; + + /** The IP address from which the action was performed. */ + ip_address?: string; +} + +/** The actor who performed the audit logged action. */ +model AuditLogActor { + @doc(""" + The type of actor. Is either `session` or `api_key`. + """) + type?: "session" | "api_key"; + + session?: AuditLogActorSession; + api_key?: AuditLogActorApiKey; +} + +/** The event type. */ +@extension("x-oaiExpandable", true) +union AuditLogEventType { + "api_key.created", + "api_key.updated", + "api_key.deleted", + "invite.sent", + "invite.accepted", + "invite.deleted", + "login.succeeded", + "login.failed", + "logout.succeeded", + "logout.failed", + "organization.updated", + "project.created", + "project.updated", + "project.archived", + "service_account.created", + "service_account.updated", + "service_account.deleted", + "user.added", + "user.updated", + "user.deleted", +} + +/** A log of a user action or configuration change within this organization. */ +model AuditLog { + /** The ID of this log. */ + id: string; + + type: AuditLogEventType; + + // Tool customization: 'created' and fields ending in '_at' are Unix encoded utcDateTime + /** The Unix timestamp (in seconds) of the event. */ + @encode("unixTimestamp", int32) + effective_at: utcDateTime; + + /** The project that the action was scoped to. Absent for actions not scoped to projects. */ + project?: { + /** The project ID. */ + id?: string; + + /** The project title. */ + name?: string; + }; + + actor: AuditLogActor; + + @doc(""" + The details for events with this `type`. 
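Since every field on `AuditLogActor` is optional, consumers discriminate on `type` to pick the populated branch. A small sketch with a hand-written sample payload; all identifier values are invented for illustration:

```python
import json

# Sample payload shaped like AuditLogActor: the optional `type` field says
# which of the `session` / `api_key` branches is present. IDs are invented.
sample_actor = json.loads("""
{
  "type": "api_key",
  "api_key": {
    "id": "key_abc",
    "type": "service_account",
    "service_account": { "id": "svc_123" }
  }
}
""")

def describe_actor(actor: dict) -> str:
    # Mirrors the model's `session` | `api_key` discrimination; every field is
    # optional, so fall back gracefully when branches are absent.
    if actor.get("type") == "session":
        return "session from " + actor.get("session", {}).get("ip_address", "unknown ip")
    if actor.get("type") == "api_key":
        return "api key " + actor.get("api_key", {}).get("id", "?")
    return "unknown actor"

print(describe_actor(sample_actor))  # -> api key key_abc
```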
+ """) + @encodedName("application/json", "api_key.created") + api_key_created?: { + /** The tracking ID of the API key. */ + id?: string; + + /** The payload used to create the API key. */ + data?: { + @doc(""" + A list of scopes allowed for the API key, e.g. `["api.model.request"]` + """) + scopes?: string[]; + }; + }; + + @doc(""" + The details for events with this `type`. + """) + @encodedName("application/json", "api_key.updated") + api_key_updated?: { + /** The tracking ID of the API key. */ + id?: string; + + /** The payload used to update the API key. */ + changes_requested?: { + @doc(""" + A list of scopes allowed for the API key, e.g. `["api.model.request"]` + """) + scopes?: string[]; + }; + }; + + @doc(""" + The details for events with this `type`. + """) + @encodedName("application/json", "api_key.deleted") + api_key_deleted?: { + /** The tracking ID of the API key. */ + id?: string; + }; + + @doc(""" + The details for events with this `type`. + """) + @encodedName("application/json", "invite.sent") + invite_sent?: { + /** The ID of the invite. */ + id?: string; + + /** The payload used to create the invite. */ + data?: { + /** The email invited to the organization. */ + email?: string; + + @doc(""" + The role the email was invited to be. Is either `owner` or `member`. + """) + role?: string; + }; + }; + + @doc(""" + The details for events with this `type`. + """) + @encodedName("application/json", "invite.accepted") + invite_accepted?: { + /** The ID of the invite. */ + id?: string; + }; + + @doc(""" + The details for events with this `type`. + """) + @encodedName("application/json", "invite.deleted") + invite_deleted?: { + /** The ID of the invite. */ + id?: string; + }; + + @doc(""" + The details for events with this `type`. + """) + @encodedName("application/json", "login.failed") + login_failed?: { + /** The error code of the failure. */ + error_code?: string; + + /** The error message of the failure. 
*/ + error_message?: string; + }; + + @doc(""" + The details for events with this `type`. + """) + @encodedName("application/json", "logout.failed") + logout_failed?: { + /** The error code of the failure. */ + error_code?: string; + + /** The error message of the failure. */ + error_message?: string; + }; + + @doc(""" + The details for events with this `type`. + """) + @encodedName("application/json", "organization.updated") + organization_updated?: { + /** The organization ID. */ + id?: string; + + /** The payload used to update the organization settings. */ + changes_requested?: { + /** The organization title. */ + title?: string; + + /** The organization description. */ + description?: string; + + /** The organization name. */ + name?: string; + + settings?: { + @doc(""" + Visibility of the threads page which shows messages created with the Assistants API and Playground. One of `ANY_ROLE`, `OWNERS`, or `NONE`. + """) + threads_ui_visibility?: string; + + @doc(""" + Visibility of the usage dashboard which shows activity and costs for your organization. One of `ANY_ROLE` or `OWNERS`. + """) + usage_dashboard_visibility?: string; + }; + }; + }; + + @doc(""" + The details for events with this `type`. + """) + @encodedName("application/json", "project.created") + project_created?: { + /** The project ID. */ + id?: string; + + /** The payload used to create the project. */ + data?: { + /** The project name. */ + name?: string; + + /** The title of the project as seen on the dashboard. */ + title?: string; + }; + }; + + @doc(""" + The details for events with this `type`. + """) + @encodedName("application/json", "project.updated") + project_updated?: { + /** The project ID. */ + id?: string; + + /** The payload used to update the project. */ + changes_requested?: { + /** The title of the project as seen on the dashboard. */ + title?: string; + }; + }; + + @doc(""" + The details for events with this `type`. 
+ """) + @encodedName("application/json", "project.archived") + project_archived?: { + /** The project ID. */ + id?: string; + }; + + @doc(""" + The details for events with this `type`. + """) + @encodedName("application/json", "service_account.created") + service_account_created?: { + /** The service account ID. */ + id?: string; + + /** The payload used to create the service account. */ + data?: { + @doc(""" + The role of the service account. Is either `owner` or `member`. + """) + role?: string; + }; + }; + + @doc(""" + The details for events with this `type`. + """) + @encodedName("application/json", "service_account.updated") + service_account_updated?: { + /** The service account ID. */ + id?: string; + + /** The payload used to updated the service account. */ + changes_requested?: { + @doc(""" + The role of the service account. Is either `owner` or `member`. + """) + role?: string; + }; + }; + + @doc(""" + The details for events with this `type`. + """) + @encodedName("application/json", "service_account.deleted") + service_account_deleted?: { + /** The service account ID. */ + id?: string; + }; + + @doc(""" + The details for events with this `type`. + """) + @encodedName("application/json", "user.added") + user_added?: { + /** The user ID. */ + id?: string; + + /** The payload used to add the user to the project. */ + data?: { + @doc(""" + The role of the user. Is either `owner` or `member`. + """) + role?: string; + }; + }; + + @doc(""" + The details for events with this `type`. + """) + @encodedName("application/json", "user.updated") + user_updated?: { + /** The project ID. */ + id?: string; + + /** The payload used to update the user. */ + changes_requested?: { + @doc(""" + The role of the user. Is either `owner` or `member`. + """) + role?: string; + }; + }; + + @doc(""" + The details for events with this `type`. + """) + @encodedName("application/json", "user.deleted") + user_deleted?: { + /** The user ID. 
*/ + id?: string; + }; +} + +model ListAuditLogsResponse { + object: "list"; + data: AuditLog[]; + first_id: string; + last_id: string; + has_more: boolean; +} + +@doc(""" + Represents an individual `invite` to the organization. + """) +model Invite { + @doc(""" + The object type, which is always `organization.invite` + """) + object: "organization.invite"; + + /** The identifier, which can be referenced in API endpoints */ + id: string; + + /** The email address of the individual to whom the invite was sent */ + email: string; + + @doc(""" + `owner` or `reader` + """) + role: "owner" | "reader"; + + @doc(""" + `accepted`,`expired`, or `pending` + """) + status: "accepted" | "expired" | "pending"; + + // Tool customization: 'created' and fields ending in '_at' are Unix encoded utcDateTime + /** The Unix timestamp (in seconds) of when the invite was sent. */ + @encode("unixTimestamp", int32) + invited_at: utcDateTime; + + // Tool customization: 'created' and fields ending in '_at' are Unix encoded utcDateTime + /** The Unix timestamp (in seconds) of when the invite expires. */ + @encode("unixTimestamp", int32) + expires_at: utcDateTime; + + // Tool customization: 'created' and fields ending in '_at' are Unix encoded utcDateTime + /** The Unix timestamp (in seconds) of when the invite was accepted. */ + @encode("unixTimestamp", int32) + accepted_at?: utcDateTime; +} + +model InviteListResponse { + @doc(""" + The object type, which is always `list` + """) + object: "list"; + + data: Invite[]; + + @doc(""" + The first `invite_id` in the retrieved `list` + """) + first_id?: string; + + @doc(""" + The last `invite_id` in the retrieved `list` + """) + last_id?: string; + + @doc(""" + The `has_more` property is used for pagination to indicate there are additional results. 
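`ListAuditLogsResponse` and `InviteListResponse` share the same list envelope: `data` plus `first_id`/`last_id` cursors and a `has_more` flag. A sketch of how a client might walk it, with `fetch_page` standing in for the HTTP call (the cursor-forwarding behavior is an assumption, not spelled out by these models):

```python
# Walk the paginated list envelope: feed `last_id` back as the cursor while
# `has_more` is true. `fetch_page` is a stand-in for the real HTTP request.
def list_all(fetch_page):
    items, after = [], None
    while True:
        page = fetch_page(after)  # {"object": "list", "data": [...], "first_id": ..., "last_id": ..., "has_more": ...}
        items.extend(page["data"])
        if not page.get("has_more"):
            return items
        after = page["last_id"]

# Tiny in-memory fake standing in for the service:
pages = [
    {"object": "list", "data": [1, 2], "first_id": "a", "last_id": "b", "has_more": True},
    {"object": "list", "data": [3], "first_id": "c", "last_id": "c", "has_more": False},
]
def fake_fetch(after):
    return pages[0] if after is None else pages[1]

print(list_all(fake_fetch))  # -> [1, 2, 3]
```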
+ """) + has_more?: boolean; +} + +model InviteRequest { + /** Send an email to this address */ + email: string; + + @doc(""" + `owner` or `reader` + """) + role: "reader" | "owner"; +} + +model InviteDeleteResponse { + @doc(""" + The object type, which is always `organization.invite.deleted` + """) + object: "organization.invite.deleted"; + + id: string; + deleted: boolean; +} + +@doc(""" + Represents an individual `user` within an organization. + """) +model User { + @doc(""" + The object type, which is always `organization.user` + """) + object: "organization.user"; + + /** The identifier, which can be referenced in API endpoints */ + id: string; + + /** The name of the user */ + name: string; + + /** The email address of the user */ + email: string; + + @doc(""" + `owner` or `reader` + """) + role: "owner" | "reader"; + + // Tool customization: 'created' and fields ending in '_at' are Unix encoded utcDateTime + /** The Unix timestamp (in seconds) of when the user was added. */ + @encode("unixTimestamp", int32) + added_at: utcDateTime; +} + +model UserListResponse { + object: "list"; + data: User[]; + first_id: string; + last_id: string; + has_more: boolean; +} + +model UserRoleUpdateRequest { + @doc(""" + `owner` or `reader` + """) + role: "owner" | "reader"; +} + +model UserDeleteResponse { + object: "organization.user.deleted"; + id: string; + deleted: boolean; +} + +/** Represents an individual project. */ +model Project { + /** The identifier, which can be referenced in API endpoints */ + id: string; + + @doc(""" + The object type, which is always `organization.project` + """) + object: "organization.project"; + + /** The name of the project. This appears in reporting. */ + name: string; + + // Tool customization: 'created' and fields ending in '_at' are Unix encoded utcDateTime + /** The Unix timestamp (in seconds) of when the project was created. 
*/ + @encode("unixTimestamp", int32) + created_at: utcDateTime; + + // Tool customization: 'created' and fields ending in '_at' are Unix encoded utcDateTime + @doc(""" + The Unix timestamp (in seconds) of when the project was archived or `null`. + """) + @encode("unixTimestamp", int32) + archived_at?: utcDateTime | null; + + @doc(""" + `active` or `archived` + """) + status: "active" | "archived"; +} + +model ProjectListResponse { + object: "list"; + data: Project[]; + first_id: string; + last_id: string; + has_more: boolean; +} + +model ProjectCreateRequest { + /** The friendly name of the project, this name appears in reports. */ + name: string; +} + +model ProjectUpdateRequest { + /** The updated name of the project, this name appears in reports. */ + name: string; +} + +model DefaultProjectErrorResponse { + code: int32; + message: string; +} + +/** Represents an individual user in a project. */ +model ProjectUser { + @doc(""" + The object type, which is always `organization.project.user` + """) + object: "organization.project.user"; + + /** The identifier, which can be referenced in API endpoints */ + id: string; + + /** The name of the user */ + name: string; + + /** The email address of the user */ + email: string; + + @doc(""" + `owner` or `member` + """) + role: "owner" | "member"; + + // Tool customization: 'created' and fields ending in '_at' are Unix encoded utcDateTime + /** The Unix timestamp (in seconds) of when the project was added. */ + @encode("unixTimestamp", int32) + added_at: utcDateTime; +} + +model ProjectUserListResponse { + object: string; + data: ProjectUser[]; + first_id: string; + last_id: string; + has_more: boolean; +} + +model ProjectUserCreateRequest { + /** The ID of the user. 
*/ + user_id: string; + + @doc(""" + `owner` or `member` + """) + role: "owner" | "member"; +} + +model ProjectUserUpdateRequest { + @doc(""" + `owner` or `member` + """) + role: "owner" | "member"; +} + +model ProjectUserDeleteResponse { + object: "organization.project.user.deleted"; + id: string; + deleted: boolean; +} + +/** Represents an individual service account in a project. */ +model ProjectServiceAccount { + @doc(""" + The object type, which is always `organization.project.service_account` + """) + object: "organization.project.service_account"; + + /** The identifier, which can be referenced in API endpoints */ + id: string; + + /** The name of the service account */ + name: string; + + @doc(""" + `owner` or `member` + """) + role: "owner" | "member"; + + // Tool customization: 'created' and fields ending in '_at' are Unix encoded utcDateTime + /** The Unix timestamp (in seconds) of when the service account was created */ + @encode("unixTimestamp", int32) + created_at: utcDateTime; +} + +model ProjectServiceAccountListResponse { + object: "list"; + data: ProjectServiceAccount[]; + first_id: string; + last_id: string; + has_more: boolean; +} + +model ProjectServiceAccountCreateRequest { + /** The name of the service account being created. 
*/ + name: string; +} + +model ProjectServiceAccountCreateResponse { + object: "organization.project.service_account"; + id: string; + name: string; + + @doc(""" + Service accounts can only have one role of type `member` + """) + role: "member"; + + // Tool customization: 'created' and fields ending in '_at' are Unix encoded utcDateTime + @encode("unixTimestamp", int32) + created_at: utcDateTime; + + api_key: ProjectServiceAccountApiKey; +} + +model ProjectServiceAccountApiKey { + @doc(""" + The object type, which is always `organization.project.service_account.api_key` + """) + object: "organization.project.service_account.api_key"; + + value: string; + name: string; + + // Tool customization: 'created' and fields ending in '_at' are Unix encoded utcDateTime + @encode("unixTimestamp", int32) + created_at: utcDateTime; + + id: string; +} + +model ProjectServiceAccountDeleteResponse { + object: "organization.project.service_account.deleted"; + id: string; + deleted: boolean; +} + +/** Represents an individual API key in a project. 
*/ +model ProjectApiKey { + @doc(""" + The object type, which is always `organization.project.api_key` + """) + object: "organization.project.api_key"; + + /** The redacted value of the API key */ + redacted_value: string; + + /** The name of the API key */ + name: string; + + // Tool customization: 'created' and fields ending in '_at' are Unix encoded utcDateTime + /** The Unix timestamp (in seconds) of when the API key was created */ + @encode("unixTimestamp", int32) + created_at: utcDateTime; + + /** The identifier, which can be referenced in API endpoints */ + id: string; + + owner: { + @doc(""" + `user` or `service_account` + """) + type?: "user" | "service_account"; + + user?: ProjectUser; + service_account?: ProjectServiceAccount; + }; +} + +model ProjectApiKeyListResponse { + object: "list"; + data: ProjectApiKey[]; + first_id: string; + last_id: string; + has_more: boolean; +} + +model ProjectApiKeyDeleteResponse { + object: "organization.project.api_key.deleted"; + id: string; + deleted: boolean; +} diff --git a/.typespec/assistants/client.tsp b/.typespec/assistants/client.tsp new file mode 100644 index 000000000..1e238b5ec --- /dev/null +++ b/.typespec/assistants/client.tsp @@ -0,0 +1,12 @@ +import "@azure-tools/typespec-client-generator-core"; +import "./custom.tsp"; +import "./models.tsp"; + +using Azure.ClientGenerator.Core; +using OpenAI; + +@@access(AssistantResponseFormat, Access.public); +@@usage(AssistantResponseFormat, Usage.input); + +@@access(ToolResourcesFileSearchVectorStoreCreationHelper, Access.public); +@@usage(ToolResourcesFileSearchVectorStoreCreationHelper, Usage.input); diff --git a/.typespec/assistants/custom.tsp b/.typespec/assistants/custom.tsp new file mode 100644 index 000000000..e557bac28 --- /dev/null +++ b/.typespec/assistants/custom.tsp @@ -0,0 +1,83 @@ +import "@azure-tools/typespec-client-generator-core"; +import "@typespec/http"; +import "../common/models.tsp"; +import "../vector-stores/models.tsp"; + +using 
Azure.ClientGenerator.Core; +using TypeSpec.OpenAPI; +using TypeSpec.Http; + +namespace OpenAI; + +// This customization allows us to concretely specify that the file_search object must provide +// either ID references --or-- in-line creation helpers, but not both. + +model ToolResourcesFileSearch { + ...ToolResourcesFileSearchIdsOnly; + ...ToolResourcesFileSearchVectorStoreCreationHelpers; +} + +model ToolResourcesFileSearchIdsOnly { + /** + * The [vector store](/docs/api-reference/vector-stores/object) attached to this assistant. + * There can be a maximum of 1 vector store attached to the assistant. + */ + @maxItems(1) + vector_store_ids?: string[]; +} + +model ToolResourcesFileSearchVectorStoreCreationHelpers { + /** + * A helper to create a [vector store](/docs/api-reference/vector-stores/object) with + * file_ids and attach it to this assistant. There can be a maximum of 1 vector store + * attached to the assistant. + */ + @maxItems(1) + vector_stores?: ToolResourcesFileSearchVectorStoreCreationHelper[]; +} + +alias ToolResourcesFileSearchVectorStoreCreationHelper = { + /** + * A list of [file](/docs/api-reference/files) IDs to add to the vector store. There can be + * a maximum of 10000 files in a vector store. + */ + @maxItems(10000) + file_ids?: string[]; + + @doc(""" + The chunking strategy used to chunk the file(s). If not set, will use the `auto` strategy. Only applicable if `file_ids` is non-empty. + """) + @extension("x-oaiExpandable", true) + chunking_strategy?: AutoChunkingStrategyRequestParam | StaticChunkingStrategyRequestParam; + + /** + * Set of 16 key-value pairs that can be attached to a vector store. This can be useful for + * storing additional information about the vector store in a structured format. Keys can + * be a maximum of 64 characters long and values can be a maximum of 512 characters long.
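The comment in `custom.tsp` above states the intent: a `file_search` resource should carry either `vector_store_ids` or the in-line `vector_stores` creation helpers, never both, and `@maxItems(1)` caps each list at one entry. A client-side sketch of that validation rule (the function is illustrative, not part of any SDK):

```python
# Enforce the either/or contract of ToolResourcesFileSearch: ID references
# or in-line creation helpers, not both, and at most one of each (@maxItems(1)).
def validate_file_search(resource: dict) -> None:
    ids = resource.get("vector_store_ids")
    helpers = resource.get("vector_stores")
    if ids is not None and helpers is not None:
        raise ValueError("provide vector_store_ids or vector_stores, not both")
    for name, value in (("vector_store_ids", ids), ("vector_stores", helpers)):
        if value is not None and len(value) > 1:
            raise ValueError(f"{name} allows at most one entry")

validate_file_search({"vector_store_ids": ["vs_123"]})       # ok: IDs only
validate_file_search({"vector_stores": [{"file_ids": []}]})  # ok: helper only
```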
+ */ + @extension("x-oaiTypeLabel", "map") + metadata?: Record; +}; + +@discriminator("type") +model AssistantToolDefinition { + type: string; +} + +@encodedName("application/json", "") +@discriminator("type") +model AssistantResponseFormat { + ...OmniTypedResponseFormat; +} + +model AssistantResponseFormatText extends AssistantResponseFormat { + ...ResponseFormatText; +} + +model AssistantResponseFormatJsonObject extends AssistantResponseFormat { + ...ResponseFormatJsonObject; +} + +model AssistantResponseFormatJsonSchema extends AssistantResponseFormat { + ...ResponseFormatJsonSchema; +} diff --git a/.typespec/assistants/main.tsp b/.typespec/assistants/main.tsp new file mode 100644 index 000000000..e7af5325f --- /dev/null +++ b/.typespec/assistants/main.tsp @@ -0,0 +1,2 @@ +import "./client.tsp"; +import "./operations.tsp"; diff --git a/.typespec/assistants/models.tsp b/.typespec/assistants/models.tsp new file mode 100644 index 000000000..9eda287ad --- /dev/null +++ b/.typespec/assistants/models.tsp @@ -0,0 +1,351 @@ +/* + * This file was automatically generated from an OpenAPI .yaml file. + * Edits made directly to this file will be lost. + */ + +import "../common"; +import "./custom.tsp"; + +using TypeSpec.OpenAPI; + +namespace OpenAI; + +@doc(""" + Specifies the format that the model must output. Compatible with [GPT-4o](/docs/models/gpt-4o), [GPT-4 Turbo](/docs/models/gpt-4-turbo-and-gpt-4), and all GPT-3.5 Turbo models since `gpt-3.5-turbo-1106`. + + Setting to `{ "type": "json_schema", "json_schema": {...} }` enables Structured Outputs which guarantees the model will match your supplied JSON schema. Learn more in the [Structured Outputs guide](/docs/guides/structured-outputs). + + Setting to `{ "type": "json_object" }` enables JSON mode, which guarantees the message the model generates is valid JSON. + + **Important:** when using JSON mode, you **must** also instruct the model to produce JSON yourself via a system or user message. 
Without this, the model may generate an unending stream of whitespace until the generation reaches the token limit, resulting in a long-running and seemingly "stuck" request. Also note that the message content may be partially cut off if `finish_reason="length"`, which indicates the generation exceeded `max_tokens` or the conversation exceeded the max context length. + """) +@extension("x-oaiExpandable", true) +union AssistantsApiResponseFormatOption { + "auto", + ResponseFormatText, + ResponseFormatJsonObject, + ResponseFormatJsonSchema, +} + +model CreateAssistantRequest { + /** ID of the model to use. You can use the [List models](/docs/api-reference/models/list) API to see all of your available models, or see our [Model overview](/docs/models/overview) for descriptions of them. */ + @extension("x-oaiTypeLabel", "string") + `model`: + | string + | "gpt-4o" + | "gpt-4o-2024-08-06" + | "gpt-4o-2024-05-13" + | "gpt-4o-mini" + | "gpt-4o-mini-2024-07-18" + | "gpt-4-turbo" + | "gpt-4-turbo-2024-04-09" + | "gpt-4-0125-preview" + | "gpt-4-turbo-preview" + | "gpt-4-1106-preview" + | "gpt-4-vision-preview" + | "gpt-4" + | "gpt-4-0314" + | "gpt-4-0613" + | "gpt-4-32k" + | "gpt-4-32k-0314" + | "gpt-4-32k-0613" + | "gpt-3.5-turbo" + | "gpt-3.5-turbo-16k" + | "gpt-3.5-turbo-0613" + | "gpt-3.5-turbo-1106" + | "gpt-3.5-turbo-0125" + | "gpt-3.5-turbo-16k-0613"; + + /** The name of the assistant. The maximum length is 256 characters. */ + @maxLength(256) + name?: string | null; + + /** The description of the assistant. The maximum length is 512 characters. */ + @maxLength(512) + description?: string | null; + + /** The system instructions that the assistant uses. The maximum length is 256,000 characters. */ + @maxLength(256000) + instructions?: string | null; + + // Tool customization: use common model base for tool definitions + @doc(""" + A list of tools enabled on the assistant. There can be a maximum of 128 tools per assistant.
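The `AssistantsApiResponseFormatOption` union above admits the bare literal `"auto"` or a typed object. Illustrative request fragments for two variants, serialized as a `response_format` field; the schema contents are invented for the example:

```python
import json

# Two shapes a response_format value can take per the union above: the "auto"
# literal, or a json_schema object. The schema payload itself is invented.
auto_response_format = "auto"

json_schema_response_format = {
    "type": "json_schema",
    "json_schema": {
        "name": "math_reply",
        "strict": True,
        "schema": {
            "type": "object",
            "properties": {"answer": {"type": "number"}},
            "required": ["answer"],
            "additionalProperties": False,
        },
    },
}

# Both serialize cleanly as the request's `response_format` field:
print(json.dumps({"response_format": auto_response_format}))
```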
Tools can be of types `code_interpreter`, `file_search`, or `function`. + """) + @maxItems(128) + @extension("x-oaiExpandable", true) + tools?: AssistantToolDefinition[] = #[]; + + @doc(""" + A set of resources that are used by the assistant's tools. The resources are specific to the type of tool. For example, the `code_interpreter` tool requires a list of file IDs, while the `file_search` tool requires a list of vector store IDs. + """) + tool_resources?: { + code_interpreter?: { + @doc(""" + A list of [file](/docs/api-reference/files) IDs made available to the `code_interpreter` tool. There can be a maximum of 20 files associated with the tool. + """) + @maxItems(20) + file_ids?: string[] = #[]; + }; + + // Tool customization: use custom type for sophisticated union + file_search?: ToolResourcesFileSearch; + } | null; + + /** Set of 16 key-value pairs that can be attached to an object. This can be useful for storing additional information about the object in a structured format. Keys can be a maximum of 64 characters long and values can be a maximum of 512 characters long. */ + @extension("x-oaiTypeLabel", "map") + metadata?: Record | null; + + /** What sampling temperature to use, between 0 and 2. Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic. */ + @minValue(0) + @maxValue(2) + temperature?: float32 | null = 1; + + /** + * An alternative to sampling with temperature, called nucleus sampling, where the model considers the results of the tokens with top_p probability mass. So 0.1 means only the tokens comprising the top 10% probability mass are considered. + * + * We generally recommend altering this or temperature but not both. + */ + @minValue(0) + @maxValue(1) + top_p?: float32 | null = 1; + + response_format?: AssistantsApiResponseFormatOption | null; +} + +model ModifyAssistantRequest { + /** ID of the model to use.
You can use the [List models](/docs/api-reference/models/list) API to see all of your available models, or see our [Model overview](/docs/models/overview) for descriptions of them. */ + `model`?: string; + + /** The name of the assistant. The maximum length is 256 characters. */ + @maxLength(256) + name?: string | null; + + /** The description of the assistant. The maximum length is 512 characters. */ + @maxLength(512) + description?: string | null; + + /** The system instructions that the assistant uses. The maximum length is 256,000 characters. */ + @maxLength(256000) + instructions?: string | null; + + // Tool customization: use common model base for tool definitions + @doc(""" + A list of tools enabled on the assistant. There can be a maximum of 128 tools per assistant. Tools can be of types `code_interpreter`, `file_search`, or `function`. + """) + @maxItems(128) + @extension("x-oaiExpandable", true) + tools?: AssistantToolDefinition[] = #[]; + + @doc(""" + A set of resources that are used by the assistant's tools. The resources are specific to the type of tool. For example, the `code_interpreter` tool requires a list of file IDs, while the `file_search` tool requires a list of vector store IDs. + """) + tool_resources?: { + code_interpreter?: { + @doc(""" + Overrides the list of [file](/docs/api-reference/files) IDs made available to the `code_interpreter` tool. There can be a maximum of 20 files associated with the tool. + """) + @maxItems(20) + file_ids?: string[] = #[]; + }; + + // Tool customization: use custom type for sophisticated union + file_search?: ToolResourcesFileSearchIdsOnly; + } | null; + + /** Set of 16 key-value pairs that can be attached to an object. This can be useful for storing additional information about the object in a structured format. Keys can be a maximum of 64 characters long and values can be a maximum of 512 characters long.
*/ + @extension("x-oaiTypeLabel", "map") + metadata?: Record | null; + + /** What sampling temperature to use, between 0 and 2. Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic. */ + @minValue(0) + @maxValue(2) + temperature?: float32 | null = 1; + + /** + * An alternative to sampling with temperature, called nucleus sampling, where the model considers the results of the tokens with top_p probability mass. So 0.1 means only the tokens comprising the top 10% probability mass are considered. + * + * We generally recommend altering this or temperature but not both. + */ + @minValue(0) + @maxValue(1) + top_p?: float32 | null = 1; + + response_format?: AssistantsApiResponseFormatOption | null; +} + +model ListAssistantsResponse { + // Tool customization: add a clear enum enforcement of constrained 'object' label + object: "list"; + + data: AssistantObject[]; + first_id: string; + last_id: string; + has_more: boolean; +} + +model DeleteAssistantResponse { + id: string; + deleted: boolean; + object: "assistant.deleted"; +} + +// Tool customization: apply a common model base for tool definitions +model AssistantToolsCode extends AssistantToolDefinition { + @doc(""" + The type of tool being defined: `code_interpreter` + """) + type: "code_interpreter"; +} + +// Tool customization: apply a common model base for tool definitions +model AssistantToolsFileSearch extends AssistantToolDefinition { + @doc(""" + The type of tool being defined: `file_search` + """) + type: "file_search"; + + /** Overrides for the file search tool. */ + file_search?: { + @doc(""" + The maximum number of results the file search tool should output. The default is 20 for `gpt-4*` models and 5 for `gpt-3.5-turbo`. This number should be between 1 and 50 inclusive. + + Note that the file search tool may output fewer than `max_num_results` results. 
See the [file search tool documentation](/docs/assistants/tools/file-search/number-of-chunks-returned) for more information. + """) + @minValue(1) + @maxValue(50) + max_num_results?: int32; + }; +} + +model AssistantToolsFileSearchTypeOnly { + @doc(""" + The type of tool being defined: `file_search` + """) + type: "file_search"; +} + +// Tool customization: apply a common model base for tool definitions +model AssistantToolsFunction extends AssistantToolDefinition { + @doc(""" + The type of tool being defined: `function` + """) + type: "function"; + + function: FunctionObject; +} + +@doc(""" + Represents an `assistant` that can call the model and use tools. + """) +model AssistantObject { + /** The identifier, which can be referenced in API endpoints. */ + id: string; + + @doc(""" + The object type, which is always `assistant`. + """) + object: "assistant"; + + // Tool customization: 'created' and fields ending in '_at' are Unix encoded utcDateTime + /** The Unix timestamp (in seconds) for when the assistant was created. */ + @encode("unixTimestamp", int32) + created_at: utcDateTime; + + /** The name of the assistant. The maximum length is 256 characters. */ + @maxLength(256) + name: string | null; + + /** The description of the assistant. The maximum length is 512 characters. */ + @maxLength(512) + description: string | null; + + /** ID of the model to use. You can use the [List models](/docs/api-reference/models/list) API to see all of your available models, or see our [Model overview](/docs/models/overview) for descriptions of them. */ + `model`: string; + + /** The system instructions that the assistant uses. The maximum length is 256,000 characters. */ + @maxLength(256000) + instructions: string | null; + + // Tool customization: use common model base for tool definitions + @doc(""" + A list of tools enabled on the assistant. There can be a maximum of 128 tools per assistant. Tools can be of types `code_interpreter`, `file_search`, or `function`.
+ """) + @maxItems(128) + @extension("x-oaiExpandable", true) + tools: AssistantToolDefinition[] = #[]; + + @doc(""" + A set of resources that are used by the assistant's tools. The resources are specific to the type of tool. For example, the `code_interpreter` tool requires a list of file IDs, while the `file_search` tool requires a list of vector store IDs. + """) + tool_resources?: { + code_interpreter?: { + @doc(""" + A list of [file](/docs/api-reference/files) IDs made available to the `code_interpreter`` tool. There can be a maximum of 20 files associated with the tool. + """) + @maxItems(20) + file_ids?: string[] = #[]; + }; + + // Tool customization: use custom type for sophisticated union + file_search?: ToolResourcesFileSearchIdsOnly; + } | null; + + /** Set of 16 key-value pairs that can be attached to an object. This can be useful for storing additional information about the object in a structured format. Keys can be a maximum of 64 characters long and values can be a maxium of 512 characters long. */ + @extension("x-oaiTypeLabel", "map") + metadata: Record | null; + + /** What sampling temperature to use, between 0 and 2. Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic. */ + @minValue(0) + @maxValue(2) + temperature?: float32 | null = 1; + + /** + * An alternative to sampling with temperature, called nucleus sampling, where the model considers the results of the tokens with top_p probability mass. So 0.1 means only the tokens comprising the top 10% probability mass are considered. + * + * We generally recommend altering this or temperature but not both. + */ + @minValue(0) + @maxValue(1) + top_p?: float32 | null = 1; + + response_format?: AssistantsApiResponseFormatOption | null; +} + +/** Controls for how a thread will be truncated prior to the run. Use this to control the intial context window of the run. 
*/ +model TruncationObject { + @doc(""" + The truncation strategy to use for the thread. The default is `auto`. If set to `last_messages`, the thread will be truncated to the n most recent messages in the thread. When set to `auto`, messages in the middle of the thread will be dropped to fit the context length of the model, `max_prompt_tokens`. + """) + type: "auto" | "last_messages"; + + /** The number of most recent messages from the thread when constructing the context for the run. */ + @minValue(1) + last_messages?: int32 | null; +} + +@doc(""" + Controls which (if any) tool is called by the model. + `none` means the model will not call any tools and instead generates a message. + `auto` is the default value and means the model can pick between generating a message or calling one or more tools. + `required` means the model must call one or more tools before responding to the user. + Specifying a particular tool like `{"type": "file_search"}` or `{"type": "function", "function": {"name": "my_function"}}` forces the model to call that tool. + """) +@extension("x-oaiExpandable", true) +union AssistantsApiToolChoiceOption { + "none" | "auto" | "required", + AssistantsNamedToolChoice, +} + +/** Specifies a tool the model should use. Use to force the model to call a specific tool. */ +model AssistantsNamedToolChoice { + @doc(""" + The type of the tool. If type is `function`, the function name must be set + """) + type: "function" | "code_interpreter" | "file_search"; + + function?: { + /** The name of the function to call. 
*/ + name: string; + }; +} diff --git a/.typespec/assistants/operations.tsp b/.typespec/assistants/operations.tsp new file mode 100644 index 000000000..b7d1a11bb --- /dev/null +++ b/.typespec/assistants/operations.tsp @@ -0,0 +1,85 @@ +import "@typespec/http"; +import "@typespec/openapi"; + +import "../common"; +import "./models.tsp"; + +using TypeSpec.Http; +using TypeSpec.OpenAPI; + +namespace OpenAI; + +@route("/v1/assistants") +interface Assistants { + @post + @operationId("createAssistant") + @tag("Assistants") + @summary("Create an assistant with a model and instructions.") + createAssistant( + @body requestBody: CreateAssistantRequest, + ): AssistantObject | ErrorResponse; + + @get + @operationId("listAssistants") + @tag("Assistants") + @summary("Returns a list of assistants.") + listAssistants( + /** + * A limit on the number of objects to be returned. Limit can range between 1 and 100, and the + * default is 20. + */ + @query limit?: int32 = 20, + + /** + * Sort order by the `created_at` timestamp of the objects. `asc` for ascending order and `desc` + * for descending order. + */ + @query order?: ListOrder = ListOrder.desc, + + /** + * A cursor for use in pagination. `after` is an object ID that defines your place in the list. + * For instance, if you make a list request and receive 100 objects, ending with obj_foo, your + * subsequent call can include after=obj_foo in order to fetch the next page of the list. + */ + @query after?: string, + + /** + * A cursor for use in pagination. `before` is an object ID that defines your place in the list. + * For instance, if you make a list request and receive 100 objects, ending with obj_foo, your + * subsequent call can include before=obj_foo in order to fetch the previous page of the list.
+ */ + @query before?: string, + ): ListAssistantsResponse | ErrorResponse; + + @route("{assistant_id}") + @get + @operationId("getAssistant") + @tag("Assistants") + @summary("Retrieves an assistant.") + getAssistant( + /** The ID of the assistant to retrieve. */ + @path assistant_id: string, + ): AssistantObject | ErrorResponse; + + @route("{assistant_id}") + @post + @operationId("modifyAssistant") + @tag("Assistants") + @summary("Modifies an assistant.") + modifyAssistant( + /** The ID of the assistant to modify. */ + @path assistant_id: string, + + @body requestBody: ModifyAssistantRequest, + ): AssistantObject | ErrorResponse; + + @route("{assistant_id}") + @delete + @operationId("deleteAssistant") + @tag("Assistants") + @summary("Delete an assistant.") + deleteAssistant( + /** The ID of the assistant to delete. */ + @path assistant_id: string, + ): DeleteAssistantResponse | ErrorResponse; +} diff --git a/completions/main.tsp b/.typespec/audio/main.tsp similarity index 100% rename from completions/main.tsp rename to .typespec/audio/main.tsp diff --git a/.typespec/audio/models.tsp b/.typespec/audio/models.tsp new file mode 100644 index 000000000..46d50d08a --- /dev/null +++ b/.typespec/audio/models.tsp @@ -0,0 +1,211 @@ +/* + * This file was automatically generated from an OpenAPI .yaml file. + * Edits made directly to this file will be lost. + */ + +using TypeSpec.OpenAPI; + +namespace OpenAI; + +model CreateSpeechRequest { + @doc(""" + One of the available [TTS models](/docs/models/tts): `tts-1` or `tts-1-hd` + """) + @extension("x-oaiTypeLabel", "string") + `model`: string | "tts-1" | "tts-1-hd"; + + /** The text to generate audio for. The maximum length is 4096 characters. */ + @maxLength(4096) + input: string; + + @doc(""" + The voice to use when generating the audio. Supported voices are `alloy`, `echo`, `fable`, `onyx`, `nova`, and `shimmer`. Previews of the voices are available in the [Text to speech guide](/docs/guides/text-to-speech/voice-options). 
+ voice: "alloy" | "echo" | "fable" | "onyx" | "nova" | "shimmer"; + + @doc(""" + The format to return audio in. Supported formats are `mp3`, `opus`, `aac`, `flac`, `wav`, and `pcm`. + """) + response_format?: "mp3" | "opus" | "aac" | "flac" | "wav" | "pcm" = "mp3"; + + @doc(""" + The speed of the generated audio. Select a value from `0.25` to `4.0`. `1.0` is the default. + """) + @minValue(0.25) + @maxValue(4.0) + speed?: float32 = 1.0; +} + +model CreateTranscriptionRequest { + // Tool customization: binary payloads are encoded bytes + /** The audio file object (not file name) to transcribe, in one of these formats: flac, mp3, mp4, mpeg, mpga, m4a, ogg, wav, or webm. */ + @extension("x-oaiTypeLabel", "file") + @encode("binary") + file: bytes; + + @doc(""" + ID of the model to use. Only `whisper-1` (which is powered by our open source Whisper V2 model) is currently available. + """) + @extension("x-oaiTypeLabel", "string") + `model`: string | "whisper-1"; + + /** The language of the input audio. Supplying the input language in [ISO-639-1](https://en.wikipedia.org/wiki/List_of_ISO_639-1_codes) format will improve accuracy and latency. */ + language?: string; + + /** An optional text to guide the model's style or continue a previous audio segment. The [prompt](/docs/guides/speech-to-text/prompting) should match the audio language. */ + prompt?: string; + + @doc(""" + The format of the transcript output, in one of these options: `json`, `text`, `srt`, `verbose_json`, or `vtt`. + """) + response_format?: "json" | "text" | "srt" | "verbose_json" | "vtt" = "json"; + + // Tool customization: add missing but documented min/max for temperature + /** The sampling temperature, between 0 and 1. Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic.
If set to 0, the model will use [log probability](https://en.wikipedia.org/wiki/Log_probability) to automatically increase the temperature until certain thresholds are hit. */ + @minValue(0) + @maxValue(1) + temperature?: float32 = 0; + + @doc(""" + The timestamp granularities to populate for this transcription. `response_format` must be set to `verbose_json` to use timestamp granularities. Either or both of these options are supported: `word`, or `segment`. Note: There is no additional latency for segment timestamps, but generating word timestamps incurs additional latency. + """) + timestamp_granularities?: ("word" | "segment")[] = #["segment"]; +} + +model CreateTranslationRequest { + // Tool customization: binary payloads are encoded bytes + /** The audio file object (not file name) to translate, in one of these formats: flac, mp3, mp4, mpeg, mpga, m4a, ogg, wav, or webm. */ + @extension("x-oaiTypeLabel", "file") + @encode("binary") + file: bytes; + + @doc(""" + ID of the model to use. Only `whisper-1` (which is powered by our open source Whisper V2 model) is currently available. + """) + @extension("x-oaiTypeLabel", "string") + `model`: string | "whisper-1"; + + /** An optional text to guide the model's style or continue a previous audio segment. The [prompt](/docs/guides/speech-to-text/prompting) should be in English. */ + prompt?: string; + + // Tool customization: add enum types + @doc(""" + The format of the transcript output, in one of these options: `json`, `text`, `srt`, `verbose_json`, or `vtt`. + """) + response_format?: "json" | "text" | "srt" | "verbose_json" | "vtt" = "json"; + + // Tool customization: add missing but documented min/max for temperature + /** The sampling temperature, between 0 and 1. Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic.
If set to 0, the model will use [log probability](https://en.wikipedia.org/wiki/Log_probability) to automatically increase the temperature until certain thresholds are hit. */ + @minValue(0) + @maxValue(1) + temperature?: float32 = 0; +} + +/** Represents a transcription response returned by model, based on the provided input. */ +model CreateTranscriptionResponseJson { + /** The transcribed text. */ + text: string; +} + +// Tool customization: Add a missing 'task' field, present on the wire but not in the spec +/** Represents a verbose json transcription response returned by model, based on the provided input. */ +model CreateTranscriptionResponseVerboseJson { + /** The task label. */ + task: "transcribe"; + + /** The language of the input audio. */ + language: string; + + // Tool customization: correct erroneous spec representation of duration as string + /** The duration of the input audio. */ + @encode("seconds", float32) + duration: duration; + + /** The transcribed text. */ + text: string; + + /** Extracted words and their corresponding timestamps. */ + words?: TranscriptionWord[]; + + /** Segments of the transcribed text and their corresponding details. */ + segments?: TranscriptionSegment[]; +} + +model CreateTranslationResponseJson { + text: string; +} + +// Tool customization: Add a missing 'task' field, present on the wire but not in the spec +model CreateTranslationResponseVerboseJson { + /** The task label. */ + task: "translate"; + + @doc(""" + The language of the output translation (always `english`). + """) + language: string; + + // Tool customization: correct erroneous spec representation of duration as string + /** The duration of the input audio. */ + @encode("seconds", float32) + duration: duration; + + /** The translated text. */ + text: string; + + /** Segments of the translated text and their corresponding details. */ + segments?: TranscriptionSegment[]; +} + +model TranscriptionSegment { + /** Unique identifier of the segment. 
*/ + id: int32; + + /** Seek offset of the segment. */ + seek: int32; + + // Tool customization: timespans are encoded durations + /** Start time of the segment in seconds. */ + @encode("seconds", float32) + start: duration; + + // Tool customization: timespans are encoded durations + /** End time of the segment in seconds. */ + @encode("seconds", float32) + end: duration; + + /** Text content of the segment. */ + text: string; + + /** Array of token IDs for the text content. */ + tokens: int32[]; + + /** Temperature parameter used for generating the segment. */ + temperature: float32; + + /** Average logprob of the segment. If the value is lower than -1, consider the logprobs failed. */ + avg_logprob: float32; + + /** Compression ratio of the segment. If the value is greater than 2.4, consider the compression failed. */ + compression_ratio: float32; + + @doc(""" + Probability of no speech in the segment. If the value is higher than 1.0 and the `avg_logprob` is below -1, consider this segment silent. + """) + no_speech_prob: float32; +} + +model TranscriptionWord { + /** The text content of the word. */ + word: string; + + // Tool customization: timespans are encoded durations + /** Start time of the word in seconds. */ + @encode("seconds", float32) + start: duration; + + // Tool customization: timespans are encoded durations + /** End time of the word in seconds. 
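The reliability heuristics documented on `TranscriptionSegment` (logprobs considered failed below a `-1` average, compression considered failed above a `2.4` ratio, and a silence check combining `no_speech_prob` with `avg_logprob`) can be collected into one sketch. The helper name is ours, not an API, and the thresholds mirror the doc text above as written:

```python
def segment_quality_flags(segment: dict) -> dict:
    """Apply the documented TranscriptionSegment heuristics."""
    logprob_failed = segment["avg_logprob"] < -1
    return {
        "logprob_failed": logprob_failed,
        # A compression ratio above 2.4 suggests compression failed.
        "compression_failed": segment["compression_ratio"] > 2.4,
        # Per the doc text: high no_speech_prob together with avg_logprob
        # below -1 marks the segment as likely silent.
        "likely_silent": segment["no_speech_prob"] > 1.0 and logprob_failed,
    }
```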
*/ + @encode("seconds", float32) + end: duration; +} diff --git a/.typespec/audio/operations.tsp b/.typespec/audio/operations.tsp new file mode 100644 index 000000000..b69a34d40 --- /dev/null +++ b/.typespec/audio/operations.tsp @@ -0,0 +1,64 @@ +import "@typespec/http"; +import "@typespec/openapi"; + +import "../common"; +import "./models.tsp"; + +using TypeSpec.Http; +using TypeSpec.OpenAPI; + +namespace OpenAI; + +@route("/v1/audio") +interface Audio { + @route("speech") + @post + @operationId("createSpeech") + @tag("Audio") + @summary("Generates audio from the input text.") + createSpeech(@body requestBody: CreateSpeechRequest): { + /** chunked */ + @header("Transfer-Encoding") transferEncoding?: string; + + @header contentType: "application/octet-stream"; + @body @encode("binary") responseBody: bytes; + }; + + @route("transcriptions") + @post + @operationId("createTranscription") + @tag("Audio") + @summary("Transcribes audio into the input language.") + createTranscription( + @header contentType: "multipart/form-data", + @body requestBody: CreateTranscriptionRequest, + ): + | CreateTranscriptionResponseVerboseJson + | CreateTranscriptionResponseJson + | { + // TODO: This response is not defined in the OpenAPI spec. + @header contentType: "text/plain"; + + @body responseBody: string; + } + | ErrorResponse; + + @route("translations") + @post + @operationId("createTranslation") + @tag("Audio") + @summary("Translates audio into English.") + createTranslation( + @header contentType: "multipart/form-data", + @body requestBody: CreateTranslationRequest, + ): + | CreateTranslationResponseVerboseJson + | CreateTranslationResponseJson + | { + // TODO: This response is not defined in the OpenAPI spec.
+ @header contentType: "text/plain"; + + @body responseBody: string; + } + | ErrorResponse; +} diff --git a/.typespec/batch/client.tsp b/.typespec/batch/client.tsp new file mode 100644 index 000000000..f75ec4253 --- /dev/null +++ b/.typespec/batch/client.tsp @@ -0,0 +1,11 @@ +import "@azure-tools/typespec-client-generator-core"; +import "./models.tsp"; + +using Azure.ClientGenerator.Core; +using OpenAI; + +@@access(BatchRequestInput, Access.public); +@@usage(BatchRequestInput, Usage.input); + +@@access(BatchRequestOutput, Access.public); +@@usage(BatchRequestOutput, Usage.output); diff --git a/.typespec/batch/main.tsp b/.typespec/batch/main.tsp new file mode 100644 index 000000000..e7af5325f --- /dev/null +++ b/.typespec/batch/main.tsp @@ -0,0 +1,2 @@ +import "./client.tsp"; +import "./operations.tsp"; diff --git a/.typespec/batch/models.tsp b/.typespec/batch/models.tsp new file mode 100644 index 000000000..b2fa2e7fe --- /dev/null +++ b/.typespec/batch/models.tsp @@ -0,0 +1,180 @@ +/* + * This file was automatically generated from an OpenAPI .yaml file. + * Edits made directly to this file will be lost. + */ + +using TypeSpec.OpenAPI; + +namespace OpenAI; + +model Batch { + id: string; + + @doc(""" + The object type, which is always `batch`. + """) + object: "batch"; + + /** The OpenAI API endpoint used by the batch. */ + endpoint: string; + + errors?: { + // Tool customization: add a clear enum enforcement of constrained 'object' label + @doc(""" + The object type, which is always `list`. + """) + object?: "list"; + + data?: { + /** An error code identifying the error type. */ + code?: string; + + /** A human-readable message providing more details about the error. */ + message?: string; + + /** The name of the parameter that caused the error, if applicable. */ + param?: string | null; + + /** The line number of the input file where the error occurred, if applicable. */ + line?: int32 | null; + }[]; + }; + + /** The ID of the input file for the batch. 
*/ + input_file_id: string; + + /** The time frame within which the batch should be processed. */ + completion_window: string; + + /** The current status of the batch. */ + status: + | "validating" + | "failed" + | "in_progress" + | "finalizing" + | "completed" + | "expired" + | "cancelling" + | "cancelled"; + + /** The ID of the file containing the outputs of successfully executed requests. */ + output_file_id?: string; + + /** The ID of the file containing the outputs of requests with errors. */ + error_file_id?: string; + + // Tool customization: 'created' and fields ending in '_at' are Unix encoded utcDateTime + /** The Unix timestamp (in seconds) for when the batch was created. */ + @encode("unixTimestamp", int32) + created_at: utcDateTime; + + // Tool customization: 'created' and fields ending in '_at' are Unix encoded utcDateTime + /** The Unix timestamp (in seconds) for when the batch started processing. */ + @encode("unixTimestamp", int32) + in_progress_at?: utcDateTime; + + // Tool customization: 'created' and fields ending in '_at' are Unix encoded utcDateTime + /** The Unix timestamp (in seconds) for when the batch will expire. */ + @encode("unixTimestamp", int32) + expires_at?: utcDateTime; + + // Tool customization: 'created' and fields ending in '_at' are Unix encoded utcDateTime + /** The Unix timestamp (in seconds) for when the batch started finalizing. */ + @encode("unixTimestamp", int32) + finalizing_at?: utcDateTime; + + // Tool customization: 'created' and fields ending in '_at' are Unix encoded utcDateTime + /** The Unix timestamp (in seconds) for when the batch was completed. */ + @encode("unixTimestamp", int32) + completed_at?: utcDateTime; + + // Tool customization: 'created' and fields ending in '_at' are Unix encoded utcDateTime + /** The Unix timestamp (in seconds) for when the batch failed. 
*/ + @encode("unixTimestamp", int32) + failed_at?: utcDateTime; + + // Tool customization: 'created' and fields ending in '_at' are Unix encoded utcDateTime + /** The Unix timestamp (in seconds) for when the batch expired. */ + @encode("unixTimestamp", int32) + expired_at?: utcDateTime; + + // Tool customization: 'created' and fields ending in '_at' are Unix encoded utcDateTime + /** The Unix timestamp (in seconds) for when the batch started cancelling. */ + @encode("unixTimestamp", int32) + cancelling_at?: utcDateTime; + + // Tool customization: 'created' and fields ending in '_at' are Unix encoded utcDateTime + /** The Unix timestamp (in seconds) for when the batch was cancelled. */ + @encode("unixTimestamp", int32) + cancelled_at?: utcDateTime; + + /** The request counts for different statuses within the batch. */ + request_counts?: { + /** Total number of requests in the batch. */ + total: int32; + + /** Number of requests that have been completed successfully. */ + completed: int32; + + /** Number of requests that have failed. */ + failed: int32; + }; + + /** Set of 16 key-value pairs that can be attached to an object. This can be useful for storing additional information about the object in a structured format. Keys can be a maximum of 64 characters long and values can be a maximum of 512 characters long. */ + @extension("x-oaiTypeLabel", "map") + metadata?: Record | null; +} + +/** The per-line object of the batch input file */ +model BatchRequestInput { + /** A developer-provided per-request id that will be used to match outputs to inputs. Must be unique for each request in a batch. */ + custom_id?: string; + + @doc(""" + The HTTP method to be used for the request. Currently only `POST` is supported. + """) + method?: "POST"; + + // Tool customization: url uses the url type + @doc(""" + The OpenAI API relative URL to be used for the request. Currently `/v1/chat/completions`, `/v1/embeddings`, and `/v1/completions` are supported.
+ """) + url?: url; +} + +/** The per-line object of the batch output and error files */ +model BatchRequestOutput { + id?: string; + + /** A developer-provided per-request id that will be used to match outputs to inputs. */ + custom_id?: string; + + response?: { + /** The HTTP status code of the response */ + status_code?: int32; + + /** An unique identifier for the OpenAI API request. Please include this request ID when contacting support. */ + request_id?: string; + + /** The JSON body of the response */ + @extension("x-oaiTypeLabel", "map") + body?: Record; + } | null; + + /** For requests that failed with a non-HTTP error, this will contain more information on the cause of the failure. */ + error?: { + /** A machine-readable error code. */ + code?: string; + + /** A human-readable error message. */ + message?: string; + } | null; +} + +model ListBatchesResponse { + data: Batch[]; + first_id?: string; + last_id?: string; + has_more: boolean; + object: "list"; +} diff --git a/.typespec/batch/operations.tsp b/.typespec/batch/operations.tsp new file mode 100644 index 000000000..fa39ee04a --- /dev/null +++ b/.typespec/batch/operations.tsp @@ -0,0 +1,83 @@ +import "@typespec/http"; +import "@typespec/openapi"; + +import "../common"; +import "./models.tsp"; + +using TypeSpec.Http; +using TypeSpec.OpenAPI; + +namespace OpenAI; + +@route("/v1/batches") +interface Batches { + @post + @operationId("createBatch") + @tag("Batch") + @summary("Creates and executes a batch from an uploaded file of requests") + createBatch( + /** + * The ID of an uploaded file that contains requests for the new batch. + * + * See [upload file](/docs/api-reference/files/create) for how to upload a file. + * + * Your input file must be formatted as a [JSONL file](/docs/api-reference/batch/requestInput), and must be uploaded with the purpose `batch`. + */ + input_file_id: string, + + /** + * The endpoint to be used for all requests in the batch. 
Currently `/v1/chat/completions` and `/v1/embeddings` are supported. + */ + endpoint: "/v1/chat/completions" | "/v1/embeddings", + + /** + * The time frame within which the batch should be processed. Currently only `24h` is supported. + */ + completion_window: "24h", + + /** + * Optional custom metadata for the batch. + */ + metadata?: Record | null, + ): Batch | ErrorResponse; + + @get + @operationId("listBatches") + @tag("Batch") + @summary("List your organization's batches.") + listBatches( + /** + * A cursor for use in pagination. `after` is an object ID that defines your place in the list. For instance, if you make a list request and receive 100 objects, ending with obj_foo, your subsequent call can include after=obj_foo in order to fetch the next page of the list. + */ + @query after?: string, + + /** + * A limit on the number of objects to be returned. Limit can range between 1 and 100, and the default is 20. + */ + @query limit?: int32 = 20, + ): ListBatchesResponse | ErrorResponse; + + @route("{batch_id}") + @get + @operationId("retrieveBatch") + @tag("Batch") + @summary("Retrieves a batch.") + retrieveBatch( + /** + * The ID of the batch to retrieve. + */ + @path batch_id: string, + ): Batch | ErrorResponse; + + @route("{batch_id}/cancel") + @post + @operationId("cancelBatch") + @tag("Batch") + @summary("Cancels an in-progress batch.") + cancelBatch( + /** + * The ID of the batch to cancel. 
+ */ + @path batch_id: string, + ): Batch | ErrorResponse; +} diff --git a/.typespec/chat/client.tsp b/.typespec/chat/client.tsp new file mode 100644 index 000000000..a0a6d9bc8 --- /dev/null +++ b/.typespec/chat/client.tsp @@ -0,0 +1,27 @@ +import "@azure-tools/typespec-client-generator-core"; +import "./custom.tsp"; +import "./models.tsp"; + +using Azure.ClientGenerator.Core; +using OpenAI; + +@@access(ChatCompletionFunctionChoice, Access.public); +@@usage(ChatCompletionFunctionChoice, Usage.input); + +@@access(ChatCompletionToolChoice, Access.public); +@@usage(ChatCompletionToolChoice, Usage.input); + +@@access(ChatMessageContentPart, Access.public); +@@usage(ChatMessageContentPart, Usage.input | Usage.output); + +@@access(ChatCompletionRole, Access.public); +@@usage(ChatCompletionRole, Usage.input | Usage.output); + +@@access(CreateChatCompletionFunctionResponse, Access.public); +@@usage(CreateChatCompletionFunctionResponse, Usage.output); + +@@access(CreateChatCompletionStreamResponse, Access.public); +@@usage(CreateChatCompletionStreamResponse, Usage.output); + +@@access(ChatResponseFormatJsonSchema, Access.public); +@@usage(ChatResponseFormatJsonSchema, Usage.input); diff --git a/.typespec/chat/custom.tsp b/.typespec/chat/custom.tsp new file mode 100644 index 000000000..9e98e5c90 --- /dev/null +++ b/.typespec/chat/custom.tsp @@ -0,0 +1,29 @@ +import "../common"; + +using TypeSpec.OpenAPI; + +namespace OpenAI; + +@encodedName("application/json", "") +@discriminator("type") +model ChatResponseFormat { + ...OmniTypedResponseFormat; +} + +model ChatResponseFormatText extends ChatResponseFormat { + ...ResponseFormatText; +} + +model ChatResponseFormatJsonObject extends ChatResponseFormat { + ...ResponseFormatJsonObject; +} + +model ChatResponseFormatJsonSchema extends ChatResponseFormat { + ...ResponseFormatJsonSchema; +} + +model ChatCompletionFunctionChoice {} + +model ChatCompletionToolChoice {} + +model ChatMessageContentPart {} diff --git 
a/.typespec/chat/main.tsp b/.typespec/chat/main.tsp new file mode 100644 index 000000000..e7af5325f --- /dev/null +++ b/.typespec/chat/main.tsp @@ -0,0 +1,2 @@ +import "./client.tsp"; +import "./operations.tsp"; diff --git a/.typespec/chat/models.tsp b/.typespec/chat/models.tsp new file mode 100644 index 000000000..84f8e2b39 --- /dev/null +++ b/.typespec/chat/models.tsp @@ -0,0 +1,731 @@ +/* + * This file was automatically generated from an OpenAPI .yaml file. + * Edits made directly to this file will be lost. + */ + +import "../common"; +import "./custom.tsp"; + +using TypeSpec.OpenAPI; + +namespace OpenAI; + +// Tool generated type. Extracts ChatCompletionTokenLogprob.bytes +model ChatCompletionTokenLogprobBytes is int32[]; + +model CreateChatCompletionRequest { + /** A list of messages comprising the conversation so far. [Example Python code](https://cookbook.openai.com/examples/how_to_format_inputs_to_chatgpt_models). */ + @minItems(1) + messages: ChatCompletionRequestMessage[]; + + /** ID of the model to use. See the [model endpoint compatibility](/docs/models/model-endpoint-compatibility) table for details on which models work with the Chat API. */ + @extension("x-oaiTypeLabel", "string") + `model`: + | string + | "gpt-4o" + | "gpt-4o-2024-05-13" + | "gpt-4o-2024-08-06" + | "chatgpt-4o-latest" + | "gpt-4o-mini" + | "gpt-4o-mini-2024-07-18" + | "gpt-4-turbo" + | "gpt-4-turbo-2024-04-09" + | "gpt-4-0125-preview" + | "gpt-4-turbo-preview" + | "gpt-4-1106-preview" + | "gpt-4-vision-preview" + | "gpt-4" + | "gpt-4-0314" + | "gpt-4-0613" + | "gpt-4-32k" + | "gpt-4-32k-0314" + | "gpt-4-32k-0613" + | "gpt-3.5-turbo" + | "gpt-3.5-turbo-16k" + | "gpt-3.5-turbo-0301" + | "gpt-3.5-turbo-0613" + | "gpt-3.5-turbo-1106" + | "gpt-3.5-turbo-0125" + | "gpt-3.5-turbo-16k-0613"; + + /** + * Number between -2.0 and 2.0. Positive values penalize new tokens based on their existing frequency in the text so far, decreasing the model's likelihood to repeat the same line verbatim. 
+ * + * [See more information about frequency and presence penalties.](/docs/guides/text-generation/parameter-details) + */ + @minValue(-2) + @maxValue(2) + frequency_penalty?: float32 | null = 0; + + /** + * Modify the likelihood of specified tokens appearing in the completion. + * + * Accepts a JSON object that maps tokens (specified by their token ID in the tokenizer) to an associated bias value from -100 to 100. Mathematically, the bias is added to the logits generated by the model prior to sampling. The exact effect will vary per model, but values between -1 and 1 should decrease or increase likelihood of selection; values like -100 or 100 should result in a ban or exclusive selection of the relevant token. + */ + @extension("x-oaiTypeLabel", "map") + logit_bias?: Record | null = null; + + @doc(""" + Whether to return log probabilities of the output tokens or not. If true, returns the log probabilities of each output token returned in the `content` of `message`. + """) + logprobs?: boolean | null = false; + + @doc(""" + An integer between 0 and 20 specifying the number of most likely tokens to return at each token position, each with an associated log probability. `logprobs` must be set to `true` if this parameter is used. + """) + @minValue(0) + @maxValue(20) + top_logprobs?: int32 | null; + + /** + * The maximum number of [tokens](/tokenizer) that can be generated in the chat completion. + * + * The total length of input tokens and generated tokens is limited by the model's context length. [Example Python code](https://cookbook.openai.com/examples/how_to_count_tokens_with_tiktoken) for counting tokens. + */ + max_tokens?: int32 | null; + + @doc(""" + How many chat completion choices to generate for each input message. Note that you will be charged based on the number of generated tokens across all of the choices. Keep `n` as `1` to minimize costs. + """) + @minValue(1) + @maxValue(128) + n?: int32 | null = 1; + + /** + * Number between -2.0 and 2.0. 
Positive values penalize new tokens based on whether they appear in the text so far, increasing the model's likelihood to talk about new topics. + * + * [See more information about frequency and presence penalties.](/docs/guides/text-generation/parameter-details) + */ + @minValue(-2) + @maxValue(2) + presence_penalty?: float32 | null = 0; + + // Tool customization: apply a named union type + @doc(""" + An object specifying the format that the model must output. Compatible with [GPT-4o](/docs/models/gpt-4o), [GPT-4o mini](/docs/models/gpt-4o-mini), [GPT-4 Turbo](/docs/models/gpt-4-and-gpt-4-turbo) and all GPT-3.5 Turbo models newer than `gpt-3.5-turbo-1106`. + + Setting to `{ "type": "json_schema", "json_schema": {...} }` enables Structured Outputs which guarantees the model will match your supplied JSON schema. Learn more in the [Structured Outputs guide](/docs/guides/structured-outputs). + + Setting to `{ "type": "json_object" }` enables JSON mode, which guarantees the message the model generates is valid JSON. + + **Important:** when using JSON mode, you **must** also instruct the model to produce JSON yourself via a system or user message. Without this, the model may generate an unending stream of whitespace until the generation reaches the token limit, resulting in a long-running and seemingly "stuck" request. Also note that the message content may be partially cut off if `finish_reason="length"`, which indicates the generation exceeded `max_tokens` or the conversation exceeded the max context length. + """) + @extension("x-oaiExpandable", true) + response_format?: ChatResponseFormat; + + @doc(""" + This feature is in Beta. + If specified, our system will make a best effort to sample deterministically, such that repeated requests with the same `seed` and parameters should return the same result. + Determinism is not guaranteed, and you should refer to the `system_fingerprint` response parameter to monitor changes in the backend. 
+ """) + @minValue(-9223372036854775808) + @maxValue(9223372036854775807) + seed?: int64 | null; + + @doc(""" + Specifies the latency tier to use for processing the request. This parameter is relevant for customers subscribed to the scale tier service: + - If set to 'auto', the system will utilize scale tier credits until they are exhausted. + - If set to 'default', the request will be processed using the default service tier with a lower uptime SLA and no latency guarentee. + - When not set, the default behavior is 'auto'. + + When this parameter is set, the response body will include the `service_tier` utilized. + """) + service_tier?: "auto" | "default" | null = null; + + /** Up to 4 sequences where the API will stop generating further tokens. */ + stop?: string | string[] | null = null; + + @doc(""" + If set, partial message deltas will be sent, like in ChatGPT. Tokens will be sent as data-only [server-sent events](https://developer.mozilla.org/en-US/docs/Web/API/Server-sent_events/Using_server-sent_events#Event_stream_format) as they become available, with the stream terminated by a `data: [DONE]` message. [Example Python code](https://cookbook.openai.com/examples/how_to_stream_completions). + """) + stream?: boolean | null = false; + + stream_options?: ChatCompletionStreamOptions | null = null; + + @doc(""" + What sampling temperature to use, between 0 and 2. Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic. + + We generally recommend altering this or `top_p` but not both. + """) + @minValue(0) + @maxValue(2) + temperature?: float32 | null = 1; + + @doc(""" + An alternative to sampling with temperature, called nucleus sampling, where the model considers the results of the tokens with top_p probability mass. So 0.1 means only the tokens comprising the top 10% probability mass are considered. + + We generally recommend altering this or `temperature` but not both. 
+ """) + @minValue(0) + @maxValue(1) + top_p?: float32 | null = 1; + + /** A list of tools the model may call. Currently, only functions are supported as a tool. Use this to provide a list of functions the model may generate JSON inputs for. A max of 128 functions are supported. */ + tools?: ChatCompletionTool[]; + + tool_choice?: ChatCompletionToolChoiceOption; + parallel_tool_calls?: ParallelToolCalls = true; + + /** A unique identifier representing your end-user, which can help OpenAI to monitor and detect abuse. [Learn more](/docs/guides/safety-best-practices/end-user-ids). */ + user?: string; + + #deprecated "This field is marked as deprecated." + @doc(""" + Deprecated in favor of `tool_choice`. + + Controls which (if any) function is called by the model. + `none` means the model will not call a function and instead generates a message. + `auto` means the model can pick between generating a message or calling a function. + Specifying a particular function via `{"name": "my_function"}` forces the model to call that function. + + `none` is the default when no functions are present. `auto` is the default if functions are present. + """) + @extension("x-oaiExpandable", true) + function_call?: "none" | "auto" | ChatCompletionFunctionCallOption; + + #deprecated "This field is marked as deprecated." + @doc(""" + Deprecated in favor of `tools`. + + A list of functions the model may generate JSON inputs for. + """) + @minItems(1) + @maxItems(128) + functions?: ChatCompletionFunctions[]; +} + +/** Represents a chat completion response returned by model, based on the provided input. */ +model CreateChatCompletionResponse { + /** A unique identifier for the chat completion. */ + id: string; + + @doc(""" + A list of chat completion choices. Can be more than one if `n` is greater than 1. + """) + choices: { + @doc(""" + The reason the model stopped generating tokens. 
This will be `stop` if the model hit a natural stop point or a provided stop sequence, + `length` if the maximum number of tokens specified in the request was reached, + `content_filter` if content was omitted due to a flag from our content filters, + `tool_calls` if the model called a tool, or `function_call` (deprecated) if the model called a function. + """) + finish_reason: + | "stop" + | "length" + | "tool_calls" + | "content_filter" + | "function_call"; + + /** The index of the choice in the list of choices. */ + index: int32; + + message: ChatCompletionResponseMessage; + + /** Log probability information for the choice. */ + logprobs: { + /** A list of message content tokens with log probability information. */ + content: ChatCompletionTokenLogprob[] | null; + + /** A list of message refusal tokens with log probability information. */ + refusal: ChatCompletionTokenLogprob[] | null; + } | null; + }[]; + + // Tool customization: 'created' and fields ending in '_at' are Unix encoded utcDateTime + /** The Unix timestamp (in seconds) of when the chat completion was created. */ + @encode("unixTimestamp", int32) + created: utcDateTime; + + /** The model used for the chat completion. */ + `model`: string; + + @doc(""" + The service tier used for processing the request. This field is only included if the `service_tier` parameter is specified in the request. + """) + service_tier?: "scale" | "default" | null; + + @doc(""" + This fingerprint represents the backend configuration that the model runs with. + + Can be used in conjunction with the `seed` request parameter to understand when backend changes have been made that might impact determinism. + """) + system_fingerprint?: string; + + @doc(""" + The object type, which is always `chat.completion`. + """) + object: "chat.completion"; + + usage?: CompletionUsage; +} + +model ChatCompletionTool { + @doc(""" + The type of the tool. Currently, only `function` is supported. 
+ """) + type: "function"; + + function: FunctionObject; +} + +/** Specifies a tool the model should use. Use to force the model to call a specific function. */ +model ChatCompletionNamedToolChoice { + @doc(""" + The type of the tool. Currently, only `function` is supported. + """) + type: "function"; + + function: { + /** The name of the function to call. */ + name: string; + }; +} + +@doc(""" + Controls which (if any) tool is called by the model. + `none` means the model will not call any tool and instead generates a message. + `auto` means the model can pick between generating a message or calling one or more tools. + `required` means the model must call one or more tools. + Specifying a particular tool via `{"type": "function", "function": {"name": "my_function"}}` forces the model to call that tool. + + `none` is the default when no tools are present. `auto` is the default if tools are present. + """) +@extension("x-oaiExpandable", true) +union ChatCompletionToolChoiceOption { + "none" | "auto" | "required", + ChatCompletionNamedToolChoice, +} + +model ChatCompletionRequestMessageContentPartText { + /** The type of the content part. */ + type: "text"; + + /** The text content. */ + text: string; +} + +model ChatCompletionRequestMessageContentPartImage { + /** The type of the content part. */ + type: "image_url"; + + image_url: { + // Tool customization: url uses the url type + /** Either a URL of the image or the base64 encoded image data. */ + url: url; + + /** Specifies the detail level of the image. Learn more in the [Vision guide](/docs/guides/vision/low-or-high-fidelity-image-understanding). */ + detail?: "auto" | "low" | "high" = "auto"; + }; +} + +model ChatCompletionRequestMessageContentPartRefusal { + /** The type of the content part. */ + type: "refusal"; + + /** The refusal message generated by the model. */ + refusal: string; +} + +model ChatCompletionMessageToolCall { + /** The ID of the tool call. 
*/ + id: string; + + @doc(""" + The type of the tool. Currently, only `function` is supported. + """) + type: "function"; + + /** The function that the model called. */ + function: { + /** The name of the function to call. */ + name: string; + + /** The arguments to call the function with, as generated by the model in JSON format. Note that the model does not always generate valid JSON, and may hallucinate parameters not defined by your function schema. Validate the arguments in your code before calling your function. */ + arguments: string; + }; +} + +// Tool customization: convert to discriminated type +@extension("x-oaiExpandable", true) +@discriminator("role") +model ChatCompletionRequestMessage { + /** The role of the author of this message. */ + role: string; +} + +@extension("x-oaiExpandable", true) +union ChatCompletionRequestSystemMessageContentPart { + ChatCompletionRequestMessageContentPartText, +} + +@extension("x-oaiExpandable", true) +union ChatCompletionRequestUserMessageContentPart { + ChatCompletionRequestMessageContentPartText, + ChatCompletionRequestMessageContentPartImage, +} + +@extension("x-oaiExpandable", true) +union ChatCompletionRequestAssistantMessageContentPart { + ChatCompletionRequestMessageContentPartText, + ChatCompletionRequestMessageContentPartRefusal, +} + +@extension("x-oaiExpandable", true) +union ChatCompletionRequestToolMessageContentPart { + ChatCompletionRequestMessageContentPartText, +} + +// Tool customization: apply discriminated type base +model ChatCompletionRequestSystemMessage extends ChatCompletionRequestMessage { + /** The contents of the system message. */ + content: string | ChatCompletionRequestSystemMessageContentPart[]; + + @doc(""" + The role of the messages author, in this case `system`. + """) + role: "system"; + + /** An optional name for the participant. Provides the model information to differentiate between participants of the same role. 
*/ + name?: string; +} + +// Tool customization: apply discriminated type base +model ChatCompletionRequestUserMessage extends ChatCompletionRequestMessage { + /** The contents of the user message. */ + @extension("x-oaiExpandable", true) + content: string | ChatCompletionRequestUserMessageContentPart[]; + + @doc(""" + The role of the messages author, in this case `user`. + """) + role: "user"; + + /** An optional name for the participant. Provides the model information to differentiate between participants of the same role. */ + name?: string; +} + +// Tool customization: apply discriminated type base +model ChatCompletionRequestAssistantMessage + extends ChatCompletionRequestMessage { + @doc(""" + The contents of the assistant message. Required unless `tool_calls` or `function_call` is specified. + """) + content?: string | ChatCompletionRequestAssistantMessageContentPart[] | null; + + /** The refusal message by the assistant. */ + refusal?: string | null; + + @doc(""" + The role of the messages author, in this case `assistant`. + """) + role: "assistant"; + + /** An optional name for the participant. Provides the model information to differentiate between participants of the same role. */ + name?: string; + + tool_calls?: ChatCompletionMessageToolCalls; + + #deprecated "This field is marked as deprecated." + @doc(""" + Deprecated and replaced by `tool_calls`. The name and arguments of a function that should be called, as generated by the model. + """) + function_call?: { + /** The arguments to call the function with, as generated by the model in JSON format. Note that the model does not always generate valid JSON, and may hallucinate parameters not defined by your function schema. Validate the arguments in your code before calling your function. */ + arguments: string; + + /** The name of the function to call. 
*/ + name: string; + } | null; +} + +// Tool customization: apply discriminated type base +model ChatCompletionRequestToolMessage extends ChatCompletionRequestMessage { + @doc(""" + The role of the messages author, in this case `tool`. + """) + role: "tool"; + + /** The contents of the tool message. */ + content: string | ChatCompletionRequestToolMessageContentPart[]; + + /** Tool call that this message is responding to. */ + tool_call_id: string; +} + +// Tool customization: apply discriminated type base +#deprecated "This field is marked as deprecated." +model ChatCompletionRequestFunctionMessage + extends ChatCompletionRequestMessage { + @doc(""" + The role of the messages author, in this case `function`. + """) + role: "function"; + + /** The contents of the function message. */ + content: string | null; + + /** The name of the function to call. */ + name: string; +} + +/** The tool calls generated by the model, such as function calls. */ +model ChatCompletionMessageToolCalls is ChatCompletionMessageToolCall[]; + +// Tool customization: convert to enum +/** The role of the author of a message */ +enum ChatCompletionRole { + system, + user, + assistant, + tool, + function, +} + +/** Represents a chat completion response returned by model, based on the provided input. */ +model CreateChatCompletionFunctionResponse { + /** A unique identifier for the chat completion. */ + id: string; + + @doc(""" + A list of chat completion choices. Can be more than one if `n` is greater than 1. + """) + choices: { + @doc(""" + The reason the model stopped generating tokens. This will be `stop` if the model hit a natural stop point or a provided stop sequence, `length` if the maximum number of tokens specified in the request was reached, `content_filter` if content was omitted due to a flag from our content filters, or `function_call` if the model called a function. 
+ """) + finish_reason: "stop" | "length" | "function_call" | "content_filter"; + + /** The index of the choice in the list of choices. */ + index: int32; + + message: ChatCompletionResponseMessage; + }[]; + + // Tool customization: 'created' and fields ending in '_at' are Unix encoded utcDateTime + /** The Unix timestamp (in seconds) of when the chat completion was created. */ + @encode("unixTimestamp", int32) + created: utcDateTime; + + /** The model used for the chat completion. */ + `model`: string; + + @doc(""" + This fingerprint represents the backend configuration that the model runs with. + + Can be used in conjunction with the `seed` request parameter to understand when backend changes have been made that might impact determinism. + """) + system_fingerprint?: string; + + @doc(""" + The object type, which is always `chat.completion`. + """) + object: "chat.completion"; + + usage?: CompletionUsage; +} + +/** A chat completion message generated by the model. */ +model ChatCompletionResponseMessage { + /** The contents of the message. */ + content: string | null; + + /** The refusal message generated by the model. */ + refusal: string | null; + + tool_calls?: ChatCompletionMessageToolCalls; + + /** The role of the author of this message. */ + role: "assistant"; + + #deprecated "This field is marked as deprecated." + @doc(""" + Deprecated and replaced by `tool_calls`. The name and arguments of a function that should be called, as generated by the model. + """) + function_call?: { + /** The arguments to call the function with, as generated by the model in JSON format. Note that the model does not always generate valid JSON, and may hallucinate parameters not defined by your function schema. Validate the arguments in your code before calling your function. */ + arguments: string; + + /** The name of the function to call. */ + name: string; + }; +} + +model ChatCompletionTokenLogprob { + /** The token. 
*/ + token: string; + + @doc(""" + The log probability of this token, if it is within the top 20 most likely tokens. Otherwise, the value `-9999.0` is used to signify that the token is very unlikely. + """) + logprob: float32; + + @doc(""" + A list of integers representing the UTF-8 bytes representation of the token. Useful in instances where characters are represented by multiple tokens and their byte representations must be combined to generate the correct text representation. Can be `null` if there is no bytes representation for the token. + """) + bytes: ChatCompletionTokenLogprobBytes | null; + + @doc(""" + List of the most likely tokens and their log probability, at this token position. In rare cases, there may be fewer than the number of requested `top_logprobs` returned. + """) + top_logprobs: { + /** The token. */ + token: string; + + @doc(""" + The log probability of this token, if it is within the top 20 most likely tokens. Otherwise, the value `-9999.0` is used to signify that the token is very unlikely. + """) + logprob: float32; + + @doc(""" + A list of integers representing the UTF-8 bytes representation of the token. Useful in instances where characters are represented by multiple tokens and their byte representations must be combined to generate the correct text representation. Can be `null` if there is no bytes representation for the token. + """) + bytes: int32[] | null; + }[]; +} + +@doc(""" + Specifying a particular function via `{"name": "my_function"}` forces the model to call that function. + """) +model ChatCompletionFunctionCallOption { + /** The name of the function to call. */ + name: string; +} + +#deprecated "This field is marked as deprecated." +model ChatCompletionFunctions { + /** A description of what the function does, used by the model to choose when and how to call the function. */ + description?: string; + + /** The name of the function to be called. 
Must be a-z, A-Z, 0-9, or contain underscores and dashes, with a maximum length of 64. */ + name: string; + + parameters?: FunctionParameters; +} + +/** A chat completion delta generated by streamed model responses. */ +model ChatCompletionStreamResponseDelta { + /** The contents of the chunk message. */ + content?: string | null; + + #deprecated "This field is marked as deprecated." + @doc(""" + Deprecated and replaced by `tool_calls`. The name and arguments of a function that should be called, as generated by the model. + """) + function_call?: { + /** The arguments to call the function with, as generated by the model in JSON format. Note that the model does not always generate valid JSON, and may hallucinate parameters not defined by your function schema. Validate the arguments in your code before calling your function. */ + arguments?: string; + + /** The name of the function to call. */ + name?: string; + }; + + tool_calls?: ChatCompletionMessageToolCallChunk[]; + + /** The role of the author of this message. */ + role?: "system" | "user" | "assistant" | "tool"; + + /** The refusal message generated by the model. */ + refusal?: string | null; +} + +model ChatCompletionMessageToolCallChunk { + index: int32; + + /** The ID of the tool call. */ + id?: string; + + @doc(""" + The type of the tool. Currently, only `function` is supported. + """) + type?: "function"; + + function?: { + /** The name of the function to call. */ + name?: string; + + /** The arguments to call the function with, as generated by the model in JSON format. Note that the model does not always generate valid JSON, and may hallucinate parameters not defined by your function schema. Validate the arguments in your code before calling your function. */ + arguments?: string; + }; +} + +/** Represents a streamed chunk of a chat completion response returned by model, based on the provided input. */ +model CreateChatCompletionStreamResponse { + /** A unique identifier for the chat completion. 
+ Each chunk has the same ID. */ + id: string; + + @doc(""" + A list of chat completion choices. Can contain more than one element if `n` is greater than 1. Can also be empty for the + last chunk if you set `stream_options: {"include_usage": true}`. + """) + choices: { + delta: ChatCompletionStreamResponseDelta; + + /** Log probability information for the choice. */ + logprobs?: { + /** A list of message content tokens with log probability information. */ + content: ChatCompletionTokenLogprob[] | null; + + /** A list of message refusal tokens with log probability information. */ + refusal: ChatCompletionTokenLogprob[] | null; + } | null; + + @doc(""" + The reason the model stopped generating tokens. This will be `stop` if the model hit a natural stop point or a provided stop sequence, + `length` if the maximum number of tokens specified in the request was reached, + `content_filter` if content was omitted due to a flag from our content filters, + `tool_calls` if the model called a tool, or `function_call` (deprecated) if the model called a function. + """) + finish_reason: + | "stop" + | "length" + | "tool_calls" + | "content_filter" + | "function_call" + | null; + + /** The index of the choice in the list of choices. */ + index: int32; + }[]; + + // Tool customization: 'created' and fields ending in '_at' are Unix encoded utcDateTime + /** The Unix timestamp (in seconds) of when the chat completion was created. Each chunk has the same timestamp. */ + @encode("unixTimestamp", int32) + created: utcDateTime; + + /** The model used to generate the completion. */ + `model`: string; + + @doc(""" + The service tier used for processing the request. This field is only included if the `service_tier` parameter is specified in the request. + """) + service_tier?: "scale" | "default" | null; + + @doc(""" + This fingerprint represents the backend configuration that the model runs with.
+ Can be used in conjunction with the `seed` request parameter to understand when backend changes have been made that might impact determinism. + """) + system_fingerprint?: string; + + @doc(""" + The object type, which is always `chat.completion.chunk`. + """) + object: "chat.completion.chunk"; + + @doc(""" + An optional field that will only be present when you set `stream_options: {"include_usage": true}` in your request. + When present, it contains a null value except for the last chunk which contains the token usage statistics for the entire request. + """) + usage?: { + /** Number of tokens in the generated completion. */ + completion_tokens: int32; + + /** Number of tokens in the prompt. */ + prompt_tokens: int32; + + /** Total number of tokens used in the request (prompt + completion). */ + total_tokens: int32; + }; +} + +/** Represents a streamed chunk of a chat completion response returned by model, based on the provided input. */ +alias CreateChatCompletionImageResponse = unknown; diff --git a/.typespec/chat/operations.tsp b/.typespec/chat/operations.tsp new file mode 100644 index 000000000..ecb068c73 --- /dev/null +++ b/.typespec/chat/operations.tsp @@ -0,0 +1,22 @@ +import "@typespec/http"; +import "@typespec/openapi"; + +import "../common"; +import "./models.tsp"; + +using TypeSpec.Http; +using TypeSpec.OpenAPI; + +namespace OpenAI; + +@route("/v1/chat") +interface Chat { + @route("completions") + @post + @operationId("createChatCompletion") + @tag("Chat") + @summary("Creates a model response for the given chat conversation.") + createChatCompletion( + @body requestBody: CreateChatCompletionRequest, + ): CreateChatCompletionResponse | ErrorResponse; +} diff --git a/.typespec/common/custom.tsp b/.typespec/common/custom.tsp new file mode 100644 index 000000000..57386d659 --- /dev/null +++ b/.typespec/common/custom.tsp @@ -0,0 +1,14 @@ +using TypeSpec.OpenAPI; + +namespace OpenAI; + +union ListOrder { + string, + asc: "asc", + desc: "desc", +} + 
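The `CreateChatCompletionStreamResponse` model above specifies how streamed chunks compose: `delta.content` fragments are concatenated across chunks, and when `stream_options: {"include_usage": true}` is set, a final chunk arrives with an empty `choices` array and the aggregate `usage` (all earlier chunks carry a null `usage`). A minimal Python sketch of that accumulation logic, assuming the chunks have already been parsed from SSE `data:` lines into dicts (`accumulate_chunks` is a hypothetical helper, not part of any generated client):

```python
def accumulate_chunks(chunks):
    """Combine CreateChatCompletionStreamResponse chunks into (text, usage)."""
    text_parts = []
    usage = None
    for chunk in chunks:
        # `choices` may be empty on the final usage-only chunk
        for choice in chunk.get("choices", []):
            delta = choice.get("delta", {})
            if delta.get("content"):
                text_parts.append(delta["content"])
        # per the spec, only the last chunk carries a non-null `usage`
        if chunk.get("usage") is not None:
            usage = chunk["usage"]
    return "".join(text_parts), usage
```

In a real client the same loop would also merge `tool_calls` deltas by `index`, per the `ChatCompletionMessageToolCallChunk` model.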
+@discriminator("type") +model OmniTypedResponseFormat { + type: string; +} diff --git a/.typespec/common/main.tsp b/.typespec/common/main.tsp new file mode 100644 index 000000000..223114fb0 --- /dev/null +++ b/.typespec/common/main.tsp @@ -0,0 +1,2 @@ +import "./custom.tsp"; +import "./models.tsp"; diff --git a/.typespec/common/models.tsp b/.typespec/common/models.tsp new file mode 100644 index 000000000..4afa9bdca --- /dev/null +++ b/.typespec/common/models.tsp @@ -0,0 +1,110 @@ +/* + * This file was automatically generated from an OpenAPI .yaml file. + * Edits made directly to this file will be lost. + */ + +import "./custom.tsp"; + +using TypeSpec.OpenAPI; + +namespace OpenAI; + +model Error { + code: string | null; + message: string; + param: string | null; + type: string; +} + +model ErrorResponse { + error: Error; +} + +@doc(""" + The parameters the functions accepts, described as a JSON Schema object. See the [guide](/docs/guides/function-calling) for examples, and the [JSON Schema reference](https://json-schema.org/understanding-json-schema/) for documentation about the format. + + Omitting `parameters` defines a function with an empty parameter list. + """) +model FunctionParameters is Record; + +model FunctionObject { + /** A description of what the function does, used by the model to choose when and how to call the function. */ + description?: string; + + /** The name of the function to be called. Must be a-z, A-Z, 0-9, or contain underscores and dashes, with a maximum length of 64. */ + name: string; + + parameters?: FunctionParameters; + + @doc(""" + Whether to enable strict schema adherence when generating the function call. If set to true, the model will follow the exact schema defined in the `parameters` field. Only a subset of JSON Schema is supported when `strict` is `true`. Learn more about Structured Outputs in the [function calling guide](docs/guides/function-calling). 
+ """) + strict?: boolean | null = false; +} + +// Tool customization: establish a common, discriminated union +model ResponseFormatText extends OmniTypedResponseFormat { + @doc(""" + The type of response format being defined: `text` + """) + type: "text"; +} + +// Tool customization: establish a common, discriminated union +model ResponseFormatJsonObject extends OmniTypedResponseFormat { + @doc(""" + The type of response format being defined: `json_object` + """) + type: "json_object"; +} + +/** The schema for the response format, described as a JSON Schema object. */ +model ResponseFormatJsonSchemaSchema is Record; + +// Tool customization: establish a common, discriminated union +model ResponseFormatJsonSchema extends OmniTypedResponseFormat { + @doc(""" + The type of response format being defined: `json_schema` + """) + type: "json_schema"; + + json_schema: { + /** A description of what the response format is for, used by the model to determine how to respond in the format. */ + description?: string; + + /** The name of the response format. Must be a-z, A-Z, 0-9, or contain underscores and dashes, with a maximum length of 64. */ + name: string; + + schema?: ResponseFormatJsonSchemaSchema; + + @doc(""" + Whether to enable strict schema adherence when generating the output. If set to true, the model will always follow the exact schema defined in the `schema` field. Only a subset of JSON Schema is supported when `strict` is `true`. To learn more, read the [Structured Outputs guide](/docs/guides/structured-outputs). + """) + strict?: boolean | null = false; + }; +} + +/** Whether to enable [parallel function calling](/docs/guides/function-calling/parallel-function-calling) during tool use. */ +alias ParallelToolCalls = boolean; + +/** Usage statistics for the completion request. */ +model CompletionUsage { + /** Number of tokens in the generated completion. */ + completion_tokens: int32; + + /** Number of tokens in the prompt. 
*/ + prompt_tokens: int32; + + /** Total number of tokens used in the request (prompt + completion). */ + total_tokens: int32; +} + +@doc(""" + Options for streaming response. Only set this when you set `stream: true`. + """) +model ChatCompletionStreamOptions { + @doc(""" + If set, an additional chunk will be streamed before the `data: [DONE]` message. The `usage` field on this chunk shows the token usage statistics for the entire request, and the `choices` field will always be an empty array. All other chunks will also include a `usage` field, but with a null value. + """) + include_usage?: boolean; +} diff --git a/edits/main.tsp b/.typespec/completions/main.tsp similarity index 100% rename from edits/main.tsp rename to .typespec/completions/main.tsp diff --git a/.typespec/completions/models.tsp b/.typespec/completions/models.tsp new file mode 100644 index 000000000..3ace4c240 --- /dev/null +++ b/.typespec/completions/models.tsp @@ -0,0 +1,183 @@ +/* + * This file was automatically generated from an OpenAPI .yaml file. + * Edits made directly to this file will be lost. + */ + +import "../common"; + +using TypeSpec.OpenAPI; + +namespace OpenAI; + +model CreateCompletionRequest { + /** ID of the model to use. You can use the [List models](/docs/api-reference/models/list) API to see all of your available models, or see our [Model overview](/docs/models/overview) for descriptions of them. */ + @extension("x-oaiTypeLabel", "string") + `model`: string | "gpt-3.5-turbo-instruct" | "davinci-002" | "babbage-002"; + + /** + * The prompt(s) to generate completions for, encoded as a string, array of strings, array of tokens, or array of token arrays. + * + * Note that <|endoftext|> is the document separator that the model sees during training, so if a prompt is not specified the model will generate as if from the beginning of a new document. 
+ */ + prompt: string | string[] | int32[] | int32[][] | null = "<|endoftext|>"; + + @doc(""" + Generates `best_of` completions server-side and returns the "best" (the one with the highest log probability per token). Results cannot be streamed. + + When used with `n`, `best_of` controls the number of candidate completions and `n` specifies how many to return – `best_of` must be greater than `n`. + + **Note:** Because this parameter generates many completions, it can quickly consume your token quota. Use carefully and ensure that you have reasonable settings for `max_tokens` and `stop`. + """) + @minValue(0) + @maxValue(20) + best_of?: int32 | null = 1; + + /** Echo back the prompt in addition to the completion */ + echo?: boolean | null = false; + + /** + * Number between -2.0 and 2.0. Positive values penalize new tokens based on their existing frequency in the text so far, decreasing the model's likelihood to repeat the same line verbatim. + * + * [See more information about frequency and presence penalties.](/docs/guides/text-generation/parameter-details) + */ + @minValue(-2) + @maxValue(2) + frequency_penalty?: float32 | null = 0; + + @doc(""" + Modify the likelihood of specified tokens appearing in the completion. + + Accepts a JSON object that maps tokens (specified by their token ID in the GPT tokenizer) to an associated bias value from -100 to 100. You can use this [tokenizer tool](/tokenizer?view=bpe) to convert text to token IDs. Mathematically, the bias is added to the logits generated by the model prior to sampling. The exact effect will vary per model, but values between -1 and 1 should decrease or increase likelihood of selection; values like -100 or 100 should result in a ban or exclusive selection of the relevant token. + + As an example, you can pass `{"50256": -100}` to prevent the <|endoftext|> token from being generated. 
+ """)
+ @extension("x-oaiTypeLabel", "map")
+ logit_bias?: Record<int32> | null = null;
+
+ @doc("""
+ Include the log probabilities on the `logprobs` most likely output tokens, as well as the chosen tokens. For example, if `logprobs` is 5, the API will return a list of the 5 most likely tokens. The API will always return the `logprob` of the sampled token, so there may be up to `logprobs+1` elements in the response.
+
+ The maximum value for `logprobs` is 5.
+ """)
+ @minValue(0)
+ @maxValue(5)
+ logprobs?: int32 | null = null;
+
+ @doc("""
+ The maximum number of [tokens](/tokenizer) that can be generated in the completion.
+
+ The token count of your prompt plus `max_tokens` cannot exceed the model's context length. [Example Python code](https://cookbook.openai.com/examples/how_to_count_tokens_with_tiktoken) for counting tokens.
+ """)
+ @minValue(0)
+ max_tokens?: int32 | null = 16;
+
+ @doc("""
+ How many completions to generate for each prompt.
+
+ **Note:** Because this parameter generates many completions, it can quickly consume your token quota. Use carefully and ensure that you have reasonable settings for `max_tokens` and `stop`.
+ """)
+ @minValue(1)
+ @maxValue(128)
+ n?: int32 | null = 1;
+
+ /**
+ * Number between -2.0 and 2.0. Positive values penalize new tokens based on whether they appear in the text so far, increasing the model's likelihood to talk about new topics.
+ *
+ * [See more information about frequency and presence penalties.](/docs/guides/text-generation/parameter-details)
+ */
+ @minValue(-2)
+ @maxValue(2)
+ presence_penalty?: float32 | null = 0;
+
+ @doc("""
+ If specified, our system will make a best effort to sample deterministically, such that repeated requests with the same `seed` and parameters should return the same result.
+
+ Determinism is not guaranteed, and you should refer to the `system_fingerprint` response parameter to monitor changes in the backend. 
+ """) + @minValue(-9223372036854775808) + @maxValue(9223372036854775807) + seed?: int64 | null; + + /** Up to 4 sequences where the API will stop generating further tokens. The returned text will not contain the stop sequence. */ + stop?: string | string[] | null = null; + + @doc(""" + Whether to stream back partial progress. If set, tokens will be sent as data-only [server-sent events](https://developer.mozilla.org/en-US/docs/Web/API/Server-sent_events/Using_server-sent_events#Event_stream_format) as they become available, with the stream terminated by a `data: [DONE]` message. [Example Python code](https://cookbook.openai.com/examples/how_to_stream_completions). + """) + stream?: boolean | null = false; + + stream_options?: ChatCompletionStreamOptions | null = null; + + @doc(""" + The suffix that comes after a completion of inserted text. + + This parameter is only supported for `gpt-3.5-turbo-instruct`. + """) + suffix?: string | null = null; + + @doc(""" + What sampling temperature to use, between 0 and 2. Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic. + + We generally recommend altering this or `top_p` but not both. + """) + @minValue(0) + @maxValue(2) + temperature?: float32 | null = 1; + + @doc(""" + An alternative to sampling with temperature, called nucleus sampling, where the model considers the results of the tokens with top_p probability mass. So 0.1 means only the tokens comprising the top 10% probability mass are considered. + + We generally recommend altering this or `temperature` but not both. + """) + @minValue(0) + @maxValue(1) + top_p?: float32 | null = 1; + + /** A unique identifier representing your end-user, which can help OpenAI to monitor and detect abuse. [Learn more](/docs/guides/safety-best-practices/end-user-ids). */ + user?: string; +} + +/** Represents a completion response from the API. 
Note: both the streamed and non-streamed response objects share the same shape (unlike the chat endpoint). */
+model CreateCompletionResponse {
+ /** A unique identifier for the completion. */
+ id: string;
+
+ /** The list of completion choices the model generated for the input prompt. */
+ choices: {
+ @doc("""
+ The reason the model stopped generating tokens. This will be `stop` if the model hit a natural stop point or a provided stop sequence,
+ `length` if the maximum number of tokens specified in the request was reached,
+ or `content_filter` if content was omitted due to a flag from our content filters.
+ """)
+ finish_reason: "stop" | "length" | "content_filter";
+
+ index: int32;
+ logprobs: {
+ text_offset?: int32[];
+ token_logprobs?: float32[];
+ tokens?: string[];
+ top_logprobs?: Record<float32>[];
+ } | null;
+ text: string;
+ }[];
+
+ // Tool customization: 'created' and fields ending in '_at' are Unix encoded utcDateTime
+ /** The Unix timestamp (in seconds) of when the completion was created. */
+ @encode("unixTimestamp", int32)
+ created: utcDateTime;
+
+ /** The model used for completion. */
+ `model`: string;
+
+ @doc("""
+ This fingerprint represents the backend configuration that the model runs with.
+
+ Can be used in conjunction with the `seed` request parameter to understand when backend changes have been made that might impact determinism. 
+ """) + system_fingerprint?: string; + + /** The object type, which is always "text_completion" */ + object: "text_completion"; + + usage?: CompletionUsage; +} diff --git a/.typespec/completions/operations.tsp b/.typespec/completions/operations.tsp new file mode 100644 index 000000000..e82c5b90a --- /dev/null +++ b/.typespec/completions/operations.tsp @@ -0,0 +1,21 @@ +import "@typespec/http"; +import "@typespec/openapi"; + +import "../common"; +import "./models.tsp"; + +using TypeSpec.Http; +using TypeSpec.OpenAPI; + +namespace OpenAI; + +@route("/v1/completions") +interface Completions { + @post + @operationId("createCompletion") + @tag("Completions") + @summary("Creates a completion for the provided prompt and parameters.") + createCompletion( + @body requestBody: CreateCompletionRequest, + ): CreateCompletionResponse | ErrorResponse; +} diff --git a/embeddings/main.tsp b/.typespec/embeddings/main.tsp similarity index 100% rename from embeddings/main.tsp rename to .typespec/embeddings/main.tsp diff --git a/.typespec/embeddings/models.tsp b/.typespec/embeddings/models.tsp new file mode 100644 index 000000000..7d391ad34 --- /dev/null +++ b/.typespec/embeddings/models.tsp @@ -0,0 +1,71 @@ +/* + * This file was automatically generated from an OpenAPI .yaml file. + * Edits made directly to this file will be lost. + */ + +using TypeSpec.OpenAPI; + +namespace OpenAI; + +model CreateEmbeddingRequest { + @doc(""" + Input text to embed, encoded as a string or array of tokens. To embed multiple inputs in a single request, pass an array of strings or array of token arrays. The input must not exceed the max input tokens for the model (8192 tokens for `text-embedding-ada-002`), cannot be an empty string, and any array must be 2048 dimensions or less. [Example Python code](https://cookbook.openai.com/examples/how_to_count_tokens_with_tiktoken) for counting tokens. 
+ """)
+ @extension("x-oaiExpandable", true)
+ input: string | string[] | int32[] | int32[][];
+
+ /** ID of the model to use. You can use the [List models](/docs/api-reference/models/list) API to see all of your available models, or see our [Model overview](/docs/models/overview) for descriptions of them. */
+ @extension("x-oaiTypeLabel", "string")
+ `model`:
+ | string
+ | "text-embedding-ada-002"
+ | "text-embedding-3-small"
+ | "text-embedding-3-large";
+
+ @doc("""
+ The format to return the embeddings in. Can be either `float` or [`base64`](https://pypi.org/project/pybase64/).
+ """)
+ encoding_format?: "float" | "base64" = "float";
+
+ @doc("""
+ The number of dimensions the resulting output embeddings should have. Only supported in `text-embedding-3` and later models.
+ """)
+ @minValue(1)
+ dimensions?: int32;
+
+ /** A unique identifier representing your end-user, which can help OpenAI to monitor and detect abuse. [Learn more](/docs/guides/safety-best-practices/end-user-ids). */
+ user?: string;
+}
+
+model CreateEmbeddingResponse {
+ /** The list of embeddings generated by the model. */
+ data: Embedding[];
+
+ /** The name of the model used to generate the embedding. */
+ `model`: string;
+
+ /** The object type, which is always "list". */
+ object: "list";
+
+ /** The usage information for the request. */
+ usage: {
+ /** The number of tokens used by the prompt. */
+ prompt_tokens: int32;
+
+ /** The total number of tokens used by the request. */
+ total_tokens: int32;
+ };
+}
+
+/** Represents an embedding vector returned by the embedding endpoint. */
+model Embedding {
+ /** The index of the embedding in the list of embeddings. */
+ index: int32;
+
+ // Tool customization: apply missing string union for embedding response values
+ /** The embedding vector, which is a list of floats. The length of the vector depends on the model as listed in the [embedding guide](/docs/guides/embeddings). 
*/ + embedding: float[] | string; + + /** The object type, which is always "embedding". */ + object: "embedding"; +} diff --git a/embeddings/operations.tsp b/.typespec/embeddings/operations.tsp similarity index 75% rename from embeddings/operations.tsp rename to .typespec/embeddings/operations.tsp index 012d97c58..da892213e 100644 --- a/embeddings/operations.tsp +++ b/.typespec/embeddings/operations.tsp @@ -1,7 +1,7 @@ import "@typespec/http"; import "@typespec/openapi"; -import "../common/errors.tsp"; +import "../common"; import "./models.tsp"; using TypeSpec.Http; @@ -9,13 +9,13 @@ using TypeSpec.OpenAPI; namespace OpenAI; -@route("/embeddings") +@route("/v1/embeddings") interface Embeddings { - @tag("OpenAI") - @summary("Creates an embedding vector representing the input text.") @post @operationId("createEmbedding") + @tag("Embeddings") + @summary("Creates an embedding vector representing the input text.") createEmbedding( - @body embedding: CreateEmbeddingRequest, + @body requestBody: CreateEmbeddingRequest, ): CreateEmbeddingResponse | ErrorResponse; } diff --git a/files/main.tsp b/.typespec/files/main.tsp similarity index 100% rename from files/main.tsp rename to .typespec/files/main.tsp diff --git a/.typespec/files/models.tsp b/.typespec/files/models.tsp new file mode 100644 index 000000000..ba684dd3a --- /dev/null +++ b/.typespec/files/models.tsp @@ -0,0 +1,82 @@ +/* + * This file was automatically generated from an OpenAPI .yaml file. + * Edits made directly to this file will be lost. + */ + +using TypeSpec.OpenAPI; + +namespace OpenAI; + +model CreateFileRequest { + // Tool customization: binary payloads are encoded bytes + /** The File object (not file name) to be uploaded. */ + @encode("binary") + file: bytes; + + /** + * The intended purpose of the uploaded file. 
+ * + * Use "assistants" for [Assistants](/docs/api-reference/assistants) and [Message](/docs/api-reference/messages) files, "vision" for Assistants image file inputs, "batch" for [Batch API](/docs/guides/batch), and "fine-tune" for [Fine-tuning](/docs/api-reference/fine-tuning). + */ + purpose: "assistants" | "batch" | "fine-tune" | "vision"; +} + +model ListFilesResponse { + data: OpenAIFile[]; + object: "list"; +} + +model DeleteFileResponse { + id: string; + object: "file"; + deleted: boolean; +} + +@doc(""" + The `File` object represents a document that has been uploaded to OpenAI. + """) +model OpenAIFile { + /** The file identifier, which can be referenced in the API endpoints. */ + id: string; + + // Tool customization: file response values can have null byte counts + /** The size of the file, in bytes. */ + bytes: int32 | null; + + // Tool customization: 'created' and fields ending in '_at' are Unix encoded utcDateTime + /** The Unix timestamp (in seconds) for when the file was created. */ + @encode("unixTimestamp", int32) + created_at: utcDateTime; + + /** The name of the file. */ + filename: string; + + @doc(""" + The object type, which is always `file`. + """) + object: "file"; + + @doc(""" + The intended purpose of the file. Supported values are `assistants`, `assistants_output`, `batch`, `batch_output`, `fine-tune`, `fine-tune-results` and `vision`. + """) + purpose: + | "assistants" + | "assistants_output" + | "batch" + | "batch_output" + | "fine-tune" + | "fine-tune-results" + | "vision"; + + #deprecated "This field is marked as deprecated." + @doc(""" + Deprecated. The current status of the file, which can be either `uploaded`, `processed`, or `error`. + """) + status: "uploaded" | "processed" | "error"; + + #deprecated "This field is marked as deprecated." + @doc(""" + Deprecated. For details on why a fine-tuning training file failed validation, see the `error` field on `fine_tuning.job`. 
+ """) + status_details?: string; +} diff --git a/.typespec/files/operations.tsp b/.typespec/files/operations.tsp new file mode 100644 index 000000000..a4a51afec --- /dev/null +++ b/.typespec/files/operations.tsp @@ -0,0 +1,73 @@ +import "@typespec/http"; +import "@typespec/openapi"; + +import "../common"; +import "./models.tsp"; + +using TypeSpec.Http; +using TypeSpec.OpenAPI; + +namespace OpenAI; + +@route("/v1/files") +interface Files { + @post + @operationId("createFile") + @tag("Files") + @summary(""" + Upload a file that can be used across various endpoints. The size of all the files uploaded by + one organization can be up to 100 GB. + + The size of individual files can be a maximum of 512 MB or 2 million tokens for Assistants. See + the [Assistants Tools guide](/docs/assistants/tools) to learn more about the types of files + supported. The Fine-tuning API only supports `.jsonl` files. + + Please [contact us](https://help.openai.com/) if you need to increase these storage limits. + """) + createFile( + @header contentType: "multipart/form-data", + @body requestBody: CreateFileRequest, + ): OpenAIFile | ErrorResponse; + + @get + @operationId("listFiles") + @tag("Files") + @summary("Returns a list of files that belong to the user's organization.") + listFiles( + // TODO: This is just a string in the OpenAPI spec. + /** Only return files with the given purpose. */ + @query purpose?: string, + ): ListFilesResponse | ErrorResponse; + + @route("{file_id}") + @get + @operationId("retrieveFile") + @tag("Files") + @summary("Returns information about a specific file.") + retrieveFile( + /** The ID of the file to use for this request. */ + @path file_id: string, + ): OpenAIFile | ErrorResponse; + + @route("{file_id}") + @delete + @operationId("deleteFile") + @tag("Files") + @summary("Delete a file") + deleteFile( + /** The ID of the file to use for this request. 
*/ + @path file_id: string, + ): DeleteFileResponse | ErrorResponse; + + @route("{file_id}/content") + @get + @operationId("downloadFile") + @tag("Files") + @summary("Returns the contents of the specified file.") + // TODO: The OpenAPI spec says this returns a plain string but in reality it is returning + // the file content as bytes. + downloadFile( + /** The ID of the file to use for this request. */ + @path file_id: string, + ): bytes | ErrorResponse; +} diff --git a/.typespec/fine-tuning/client.tsp b/.typespec/fine-tuning/client.tsp new file mode 100644 index 000000000..911876ab9 --- /dev/null +++ b/.typespec/fine-tuning/client.tsp @@ -0,0 +1,11 @@ +import "@azure-tools/typespec-client-generator-core"; +import "./models.tsp"; + +using Azure.ClientGenerator.Core; +using OpenAI; + +@@access(FinetuneChatRequestInput, Access.public); +@@usage(FinetuneChatRequestInput, Usage.input); + +@@access(FinetuneCompletionRequestInput, Access.public); +@@usage(FinetuneCompletionRequestInput, Usage.input); diff --git a/.typespec/fine-tuning/custom.tsp b/.typespec/fine-tuning/custom.tsp new file mode 100644 index 000000000..11dd998ac --- /dev/null +++ b/.typespec/fine-tuning/custom.tsp @@ -0,0 +1,56 @@ +using TypeSpec.OpenAPI; + +namespace OpenAI; + +model CreateFineTuningJobRequestHyperparameters { + /** + * Number of examples in each batch. A larger batch size means that model parameters + * are updated less frequently, but with lower variance. + */ + @minValue(1) + @maxValue(256) + batch_size?: CreateFineTuningJobRequestHyperparametersBatchSizeChoiceEnum | int32 = CreateFineTuningJobRequestHyperparametersBatchSizeChoiceEnum.auto; + + /** + * Scaling factor for the learning rate. A smaller learning rate may be useful to avoid + * overfitting. 
+ */ + @minValue(0) + learning_rate_multiplier?: CreateFineTuningJobRequestHyperparametersLearningRateMultiplierChoiceEnum | float32 = CreateFineTuningJobRequestHyperparametersLearningRateMultiplierChoiceEnum.auto; + + /** + * The number of epochs to train the model for. An epoch refers to one full cycle + * through the training dataset. + */ + @minValue(1) + @maxValue(50) + n_epochs?: CreateFineTuningJobRequestHyperparametersNEpochsChoiceEnum | int32 = CreateFineTuningJobRequestHyperparametersNEpochsChoiceEnum.auto; +} + +union CreateFineTuningJobRequestHyperparametersBatchSizeChoiceEnum { + auto: "auto", +} +union CreateFineTuningJobRequestHyperparametersLearningRateMultiplierChoiceEnum { + auto: "auto", +} +union CreateFineTuningJobRequestHyperparametersNEpochsChoiceEnum { + auto: "auto", +} + +model FineTuningJobHyperparameters { + /** + * The number of epochs to train the model for. An epoch refers to one full cycle + * through the training dataset. + */ + n_epochs: FineTuningJobHyperparametersNEpochsChoiceEnum | int32 = FineTuningJobHyperparametersNEpochsChoiceEnum.auto; +} + +union FineTuningJobHyperparametersBatchSizeChoiceEnum { + auto: "auto", +} +union FineTuningJobHyperparametersLearningRateMultiplierChoiceEnum { + auto: "auto", +} +union FineTuningJobHyperparametersNEpochsChoiceEnum { + auto: "auto", +} diff --git a/.typespec/fine-tuning/main.tsp b/.typespec/fine-tuning/main.tsp new file mode 100644 index 000000000..e7af5325f --- /dev/null +++ b/.typespec/fine-tuning/main.tsp @@ -0,0 +1,2 @@ +import "./client.tsp"; +import "./operations.tsp"; diff --git a/.typespec/fine-tuning/models.tsp b/.typespec/fine-tuning/models.tsp new file mode 100644 index 000000000..e11a1e611 --- /dev/null +++ b/.typespec/fine-tuning/models.tsp @@ -0,0 +1,331 @@ +/* + * This file was automatically generated from an OpenAPI .yaml file. + * Edits made directly to this file will be lost. 
+ */ + +import "../chat"; +import "../common"; +import "./custom.tsp"; + +using TypeSpec.OpenAPI; + +namespace OpenAI; + +// Tool generated type. Extracts CreateFineTuningJobRequest.integrations +model CreateFineTuningJobRequestIntegrations + is { + /** The type of integration to enable. Currently, only "wandb" (Weights and Biases) is supported. */ + type: "wandb"; + + /** + * The settings for your integration with Weights and Biases. This payload specifies the project that + * metrics will be sent to. Optionally, you can set an explicit display name for your run, add tags + * to your run, and set a default entity (team, username, etc) to be associated with your run. + */ + wandb: { + /** The name of the project that the new run will be created under. */ + project: string; + + /** A display name to set for the run. If not set, we will use the Job ID as the name. */ + name?: string | null; + + /** + * The entity to use for the run. This allows you to set the team or username of the WandB user that you would + * like associated with the run. If not set, the default entity for the registered WandB API key is used. + */ + entity?: string | null; + + /** + * A list of tags to be attached to the newly created run. These tags are passed through directly to WandB. Some + * default tags are generated by OpenAI: "openai/finetune", "openai/{base-model}", "openai/{ftjob-abcdef}". + */ + tags?: string[]; + }; + }[]; + +// Tool generated type. Extracts FineTuningJob.integrations +@maxItems(5) +@extension("x-oaiExpandable", true) +model FineTuningJobIntegrations is FineTuningIntegration[]; + +model CreateFineTuningJobRequest { + /** + * The name of the model to fine-tune. You can select one of the + * [supported models](/docs/guides/fine-tuning/which-models-can-be-fine-tuned). 
+ */ + @extension("x-oaiTypeLabel", "string") + `model`: + | string + | "babbage-002" + | "davinci-002" + | "gpt-3.5-turbo" + | "gpt-4o-mini"; + + @doc(""" + The ID of an uploaded file that contains training data. + + See [upload file](/docs/api-reference/files/create) for how to upload a file. + + Your dataset must be formatted as a JSONL file. Additionally, you must upload your file with the purpose `fine-tune`. + + The contents of the file should differ depending on if the model uses the [chat](/docs/api-reference/fine-tuning/chat-input) or [completions](/docs/api-reference/fine-tuning/completions-input) format. + + See the [fine-tuning guide](/docs/guides/fine-tuning) for more details. + """) + training_file: string; + + // Tool customization: reflect observed wire truth (learning_rate_multiplier, n_epochs) for hyperparameters in ft responses + /** The hyperparameters used for the fine-tuning job. */ + hyperparameters?: CreateFineTuningJobRequestHyperparameters; + + @doc(""" + A string of up to 18 characters that will be added to your fine-tuned model name. + + For example, a `suffix` of "custom-model-name" would produce a model name like `ft:gpt-4o-mini:openai:custom-model-name:7p4lURel`. + """) + @minLength(1) + @maxLength(40) + suffix?: string | null = null; + + @doc(""" + The ID of an uploaded file that contains validation data. + + If you provide this file, the data is used to generate validation + metrics periodically during fine-tuning. These metrics can be viewed in + the fine-tuning results file. + The same data should not be present in both train and validation files. + + Your dataset must be formatted as a JSONL file. You must upload your file with the purpose `fine-tune`. + + See the [fine-tuning guide](/docs/guides/fine-tuning) for more details. + """) + validation_file?: string | null; + + /** A list of integrations to enable for your fine-tuning job. 
*/ + integrations?: CreateFineTuningJobRequestIntegrations | null; + + /** + * The seed controls the reproducibility of the job. Passing in the same seed and job parameters should produce the same results, but may differ in rare cases. + * If a seed is not specified, one will be generated for you. + */ + @minValue(0) + @maxValue(2147483647) + seed?: int32 | null; +} + +model ListPaginatedFineTuningJobsResponse { + data: FineTuningJob[]; + has_more: boolean; + object: "list"; +} + +model ListFineTuningJobEventsResponse { + data: FineTuningJobEvent[]; + object: "list"; +} + +model ListFineTuningJobCheckpointsResponse { + data: FineTuningJobCheckpoint[]; + object: "list"; + first_id?: string | null; + last_id?: string | null; + has_more: boolean; +} + +@doc(""" + The `fine_tuning.job` object represents a fine-tuning job that has been created through the API. + """) +model FineTuningJob { + /** The object identifier, which can be referenced in the API endpoints. */ + id: string; + + // Tool customization: 'created' and fields ending in '_at' are Unix encoded utcDateTime + /** The Unix timestamp (in seconds) for when the fine-tuning job was created. */ + @encode("unixTimestamp", int32) + created_at: utcDateTime; + + @doc(""" + For fine-tuning jobs that have `failed`, this will contain more information on the cause of the failure. + """) + error: { + /** A machine-readable error code. */ + code: string; + + /** A human-readable error message. */ + message: string; + + @doc(""" + The parameter that was invalid, usually `training_file` or `validation_file`. This field will be null if the failure was not parameter-specific. + """) + param: string | null; + } | null; + + /** The name of the fine-tuned model that is being created. The value will be null if the fine-tuning job is still running. 
*/ + fine_tuned_model: string | null; + + // Tool customization: 'created' and fields ending in '_at' are Unix encoded utcDateTime + /** The Unix timestamp (in seconds) for when the fine-tuning job was finished. The value will be null if the fine-tuning job is still running. */ + @encode("unixTimestamp", int32) + finished_at: utcDateTime | null; + + // Tool customization: reflect observed wire truth (learning_rate_multiplier, n_epochs) for hyperparameters in ft responses + /** The hyperparameters used for the fine-tuning job. See the [fine-tuning guide](/docs/guides/fine-tuning) for more details. */ + hyperparameters: FineTuningJobHyperparameters; + + /** The base model that is being fine-tuned. */ + `model`: string; + + /** The object type, which is always "fine_tuning.job". */ + object: "fine_tuning.job"; + + /** The organization that owns the fine-tuning job. */ + organization_id: string; + + /** The compiled results file ID(s) for the fine-tuning job. You can retrieve the results with the [Files API](/docs/api-reference/files/retrieve-contents). */ + result_files: string[]; + + @doc(""" + The current status of the fine-tuning job, which can be either `validating_files`, `queued`, `running`, `succeeded`, `failed`, or `cancelled`. + """) + status: + | "validating_files" + | "queued" + | "running" + | "succeeded" + | "failed" + | "cancelled"; + + /** The total number of billable tokens processed by this fine-tuning job. The value will be null if the fine-tuning job is still running. */ + trained_tokens: int32 | null; + + /** The file ID used for training. You can retrieve the training data with the [Files API](/docs/api-reference/files/retrieve-contents). */ + training_file: string; + + /** The file ID used for validation. You can retrieve the validation results with the [Files API](/docs/api-reference/files/retrieve-contents). */ + validation_file: string | null; + + /** A list of integrations to enable for this fine-tuning job. 
*/ + integrations?: FineTuningJobIntegrations | null; + + /** The seed used for the fine-tuning job. */ + seed: int32; + + // Tool customization: 'created' and fields ending in '_at' are Unix encoded utcDateTime + /** The Unix timestamp (in seconds) for when the fine-tuning job is estimated to finish. The value will be null if the fine-tuning job is not running. */ + @encode("unixTimestamp", int32) + estimated_finish?: utcDateTime | null; +} + +model FineTuningIntegration { + /** The type of the integration being enabled for the fine-tuning job */ + type: "wandb"; + + /** + * The settings for your integration with Weights and Biases. This payload specifies the project that + * metrics will be sent to. Optionally, you can set an explicit display name for your run, add tags + * to your run, and set a default entity (team, username, etc) to be associated with your run. + */ + wandb: { + /** The name of the project that the new run will be created under. */ + project: string; + + /** A display name to set for the run. If not set, we will use the Job ID as the name. */ + name?: string | null; + + /** + * The entity to use for the run. This allows you to set the team or username of the WandB user that you would + * like associated with the run. If not set, the default entity for the registered WandB API key is used. + */ + entity?: string | null; + + /** + * A list of tags to be attached to the newly created run. These tags are passed through directly to WandB. Some + * default tags are generated by OpenAI: "openai/finetune", "openai/{base-model}", "openai/{ftjob-abcdef}". 
+ */ + tags?: string[]; + }; +} + +/** Fine-tuning job event object */ +model FineTuningJobEvent { + id: string; + + // Tool customization: 'created' and fields ending in '_at' are Unix encoded utcDateTime + @encode("unixTimestamp", int32) + created_at: utcDateTime; + + level: "info" | "warn" | "error"; + message: string; + object: "fine_tuning.job.event"; +} + +@doc(""" + The `fine_tuning.job.checkpoint` object represents a model checkpoint for a fine-tuning job that is ready to use. + """) +model FineTuningJobCheckpoint { + /** The checkpoint identifier, which can be referenced in the API endpoints. */ + id: string; + + // Tool customization: 'created' and fields ending in '_at' are Unix encoded utcDateTime + /** The Unix timestamp (in seconds) for when the checkpoint was created. */ + @encode("unixTimestamp", int32) + created_at: utcDateTime; + + /** The name of the fine-tuned checkpoint model that is created. */ + fine_tuned_model_checkpoint: string; + + /** The step number that the checkpoint was created at. */ + step_number: int32; + + /** Metrics at the step number during the fine-tuning job. */ + metrics: { + step?: float32; + train_loss?: float32; + train_mean_token_accuracy?: float32; + valid_loss?: float32; + valid_mean_token_accuracy?: float32; + full_valid_loss?: float32; + full_valid_mean_token_accuracy?: float32; + }; + + /** The name of the fine-tuning job that this checkpoint was created from. */ + fine_tuning_job_id: string; + + /** The object type, which is always "fine_tuning.job.checkpoint". 
*/ + object: "fine_tuning.job.checkpoint"; +} + +/** The per-line training example of a fine-tuning input file for chat models */ +model FinetuneChatRequestInput { + @minItems(1) + @extension("x-oaiExpandable", true) + messages?: ( + | ChatCompletionRequestSystemMessage + | ChatCompletionRequestUserMessage + | FineTuneChatCompletionRequestAssistantMessage + | ChatCompletionRequestToolMessage + | ChatCompletionRequestFunctionMessage)[]; + + /** A list of tools the model may generate JSON inputs for. */ + tools?: ChatCompletionTool[]; + + parallel_tool_calls?: ParallelToolCalls = true; + + /** A list of functions the model may generate JSON inputs for. */ + #deprecated "This field is marked as deprecated." + @minItems(1) + @maxItems(128) + functions?: ChatCompletionFunctions[]; +} + +/** The per-line training example of a fine-tuning input file for completions models */ +model FinetuneCompletionRequestInput { + /** The input prompt for this training example. */ + prompt?: string; + + /** The desired completion for this training example. */ + completion?: string; +} + +model FineTuneChatCompletionRequestAssistantMessage + extends ChatCompletionRequestAssistantMessage {} diff --git a/.typespec/fine-tuning/operations.tsp b/.typespec/fine-tuning/operations.tsp new file mode 100644 index 000000000..ddb4cabee --- /dev/null +++ b/.typespec/fine-tuning/operations.tsp @@ -0,0 +1,97 @@ +import "@typespec/http"; +import "@typespec/openapi"; + +import "../common"; +import "./models.tsp"; + +using TypeSpec.Http; +using TypeSpec.OpenAPI; + +namespace OpenAI; + +@route("/v1/fine_tuning") +interface FineTuning { + @route("jobs") + @post + @operationId("createFineTuningJob") + @tag("Fine-tuning") + @summary(""" + Creates a fine-tuning job which begins the process of creating a new model from a given dataset. + + Response includes details of the enqueued job including job status and the name of the fine-tuned models once complete. 
+ + [Learn more about fine-tuning](/docs/guides/fine-tuning) + """) + createFineTuningJob( + @body requestBody: CreateFineTuningJobRequest, + ): FineTuningJob | ErrorResponse; + + @route("jobs") + @get + @operationId("listPaginatedFineTuningJobs") + @tag("Fine-tuning") + @summary("List your organization's fine-tuning jobs") + listPaginatedFineTuningJobs( + /** Identifier for the last job from the previous pagination request. */ + @query after?: string, + + /** Number of fine-tuning jobs to retrieve. */ + @query limit?: int32 = 20, + ): ListPaginatedFineTuningJobsResponse | ErrorResponse; + + @route("jobs/{fine_tuning_job_id}") + @get + @operationId("retrieveFineTuningJob") + @tag("Fine-tuning") + @summary(""" + Get info about a fine-tuning job. + + [Learn more about fine-tuning](/docs/guides/fine-tuning) + """) + retrieveFineTuningJob( + /** The ID of the fine-tuning job. */ + @path fine_tuning_job_id: string, + ): FineTuningJob | ErrorResponse; + + @route("jobs/{fine_tuning_job_id}/cancel") + @post + @operationId("cancelFineTuningJob") + @tag("Fine-tuning") + @summary("Immediately cancel a fine-tune job.") + cancelFineTuningJob( + /** The ID of the fine-tuning job to cancel. */ + @path fine_tuning_job_id: string, + ): FineTuningJob | ErrorResponse; + + @route("jobs/{fine_tuning_job_id}/checkpoints") + @get + @operationId("listFineTuningJobCheckpoints") + @tag("Fine-tuning") + @summary("List the checkpoints for a fine-tuning job.") + listFineTuningJobCheckpoints( + /** The ID of the fine-tuning job to get checkpoints for. */ + @path fine_tuning_job_id: string, + + /** Identifier for the last checkpoint ID from the previous pagination request. */ + @query after?: string, + + /** Number of checkpoints to retrieve. 
*/ + @query limit?: int32 = 10, + ): ListFineTuningJobCheckpointsResponse | ErrorResponse; + + @route("jobs/{fine_tuning_job_id}/events") + @get + @operationId("listFineTuningEvents") + @tag("Fine-tuning") + @summary("Get status updates for a fine-tuning job.") + listFineTuningEvents( + /** The ID of the fine-tuning job to get events for. */ + @path fine_tuning_job_id: string, + + /** Identifier for the last event from the previous pagination request. */ + @query after?: string, + + /** Number of events to retrieve. */ + @query limit?: int32 = 20, + ): ListFineTuningJobEventsResponse | ErrorResponse; +} diff --git a/fine-tuning/main.tsp b/.typespec/images/main.tsp similarity index 100% rename from fine-tuning/main.tsp rename to .typespec/images/main.tsp diff --git a/.typespec/images/models.tsp b/.typespec/images/models.tsp new file mode 100644 index 000000000..6121f821f --- /dev/null +++ b/.typespec/images/models.tsp @@ -0,0 +1,154 @@ +/* + * This file was automatically generated from an OpenAPI .yaml file. + * Edits made directly to this file will be lost. + */ + +using TypeSpec.OpenAPI; + +namespace OpenAI; + +model CreateImageRequest { + @doc(""" + A text description of the desired image(s). The maximum length is 1000 characters for `dall-e-2` and 4000 characters for `dall-e-3`. + """) + prompt: string; + + /** The model to use for image generation. */ + @extension("x-oaiTypeLabel", "string") + `model`?: string | "dall-e-2" | "dall-e-3" | null = "dall-e-2"; + + @doc(""" + The number of images to generate. Must be between 1 and 10. For `dall-e-3`, only `n=1` is supported. + """) + @minValue(1) + @maxValue(10) + n?: int32 | null = 1; + + @doc(""" + The quality of the image that will be generated. `hd` creates images with finer details and greater consistency across the image. This param is only supported for `dall-e-3`. + """) + quality?: "standard" | "hd" = "standard"; + + @doc(""" + The format in which the generated images are returned. 
Must be one of `url` or `b64_json`. URLs are only valid for 60 minutes after the image has been generated. + """) + response_format?: "url" | "b64_json" | null = "url"; + + @doc(""" + The size of the generated images. Must be one of `256x256`, `512x512`, or `1024x1024` for `dall-e-2`. Must be one of `1024x1024`, `1792x1024`, or `1024x1792` for `dall-e-3` models. + """) + size?: + | "256x256" + | "512x512" + | "1024x1024" + | "1792x1024" + | "1024x1792" + | null = "1024x1024"; + + @doc(""" + The style of the generated images. Must be one of `vivid` or `natural`. Vivid causes the model to lean towards generating hyper-real and dramatic images. Natural causes the model to produce more natural, less hyper-real looking images. This param is only supported for `dall-e-3`. + """) + style?: "vivid" | "natural" | null = "vivid"; + + /** A unique identifier representing your end-user, which can help OpenAI to monitor and detect abuse. [Learn more](/docs/guides/safety-best-practices/end-user-ids). */ + user?: string; +} + +model CreateImageEditRequest { + // Tool customization: binary payloads are encoded bytes + /** The image to edit. Must be a valid PNG file, less than 4MB, and square. If mask is not provided, image must have transparency, which will be used as the mask. */ + @encode("binary") + image: bytes; + + /** A text description of the desired image(s). The maximum length is 1000 characters. */ + prompt: string; + + // Tool customization: binary payloads are encoded bytes + @doc(""" + An additional image whose fully transparent areas (e.g. where alpha is zero) indicate where `image` should be edited. Must be a valid PNG file, less than 4MB, and have the same dimensions as `image`. + """) + @encode("binary") + mask?: bytes; + + @doc(""" + The model to use for image generation. Only `dall-e-2` is supported at this time. + """) + @extension("x-oaiTypeLabel", "string") + `model`?: string | "dall-e-2" | null = "dall-e-2"; + + /** The number of images to generate. 
Must be between 1 and 10. */ + @minValue(1) + @maxValue(10) + n?: int32 | null = 1; + + @doc(""" + The size of the generated images. Must be one of `256x256`, `512x512`, or `1024x1024`. + """) + size?: "256x256" | "512x512" | "1024x1024" | null = "1024x1024"; + + @doc(""" + The format in which the generated images are returned. Must be one of `url` or `b64_json`. URLs are only valid for 60 minutes after the image has been generated. + """) + response_format?: "url" | "b64_json" | null = "url"; + + /** A unique identifier representing your end-user, which can help OpenAI to monitor and detect abuse. [Learn more](/docs/guides/safety-best-practices/end-user-ids). */ + user?: string; +} + +model CreateImageVariationRequest { + /** The image to use as the basis for the variation(s). Must be a valid PNG file, less than 4MB, and square. */ + image: bytes; + + @doc(""" + The model to use for image generation. Only `dall-e-2` is supported at this time. + """) + @extension("x-oaiTypeLabel", "string") + `model`?: string | "dall-e-2" | null = "dall-e-2"; + + @doc(""" + The number of images to generate. Must be between 1 and 10. For `dall-e-3`, only `n=1` is supported. + """) + @minValue(1) + @maxValue(10) + n?: int32 | null = 1; + + @doc(""" + The format in which the generated images are returned. Must be one of `url` or `b64_json`. URLs are only valid for 60 minutes after the image has been generated. + """) + response_format?: "url" | "b64_json" | null = "url"; + + @doc(""" + The size of the generated images. Must be one of `256x256`, `512x512`, or `1024x1024`. + """) + size?: "256x256" | "512x512" | "1024x1024" | null = "1024x1024"; + + /** A unique identifier representing your end-user, which can help OpenAI to monitor and detect abuse. [Learn more](/docs/guides/safety-best-practices/end-user-ids). 
*/ + user?: string; +} + +model ImagesResponse { + // Tool customization: 'created' and fields ending in '_at' are Unix encoded utcDateTime + @encode("unixTimestamp", int32) + created: utcDateTime; + + data: Image[]; +} + +/** Represents the url or the content of an image generated by the OpenAI API. */ +model Image { + // Tool customization: base64 input uses an encoded bytes type + @doc(""" + The base64-encoded JSON of the generated image, if `response_format` is `b64_json`. + """) + @encode("base64", string) + b64_json?: bytes; + + // Tool customization: url uses the url type + @doc(""" + The URL of the generated image, if `response_format` is `url` (default). + """) + url?: url; + + /** The prompt that was used to generate the image, if there was any revision to the prompt. */ + revised_prompt?: string; +} diff --git a/images/operations.tsp b/.typespec/images/operations.tsp similarity index 72% rename from images/operations.tsp rename to .typespec/images/operations.tsp index 09203262b..62c37f590 100644 --- a/images/operations.tsp +++ b/.typespec/images/operations.tsp @@ -1,7 +1,7 @@ import "@typespec/http"; import "@typespec/openapi"; -import "../common/errors.tsp"; +import "../common/models.tsp"; import "./models.tsp"; using TypeSpec.Http; @@ -9,32 +9,34 @@ using TypeSpec.OpenAPI; namespace OpenAI; -@route("/images") +@route("/v1/images") interface Images { @route("generations") @post @operationId("createImage") - @tag("OpenAI") + @tag("Images") @summary("Creates an image given a prompt") - createImage(@body image: CreateImageRequest): ImagesResponse | ErrorResponse; + createImage( + @body requestBody: CreateImageRequest, + ): ImagesResponse | ErrorResponse; @route("edits") @post @operationId("createImageEdit") - @tag("OpenAI") + @tag("Images") @summary("Creates an edited or extended image given an original image and a prompt.") createImageEdit( @header contentType: "multipart/form-data", - @body image: CreateImageEditRequest, + @body requestBody: 
CreateImageEditRequest, ): ImagesResponse | ErrorResponse; @route("variations") @post @operationId("createImageVariation") - @tag("OpenAI") + @tag("Images") @summary("Creates an edited or extended image given an original image and a prompt.") createImageVariation( @header contentType: "multipart/form-data", - @body image: CreateImageVariationRequest, + @body requestBody: CreateImageVariationRequest, ): ImagesResponse | ErrorResponse; } diff --git a/main.tsp b/.typespec/main.tsp similarity index 69% rename from main.tsp rename to .typespec/main.tsp index 2ea8cbbc3..c7b01887a 100644 --- a/main.tsp +++ b/.typespec/main.tsp @@ -2,14 +2,23 @@ import "@typespec/http"; import "@typespec/openapi3"; import "@typespec/openapi"; +import "./administration"; import "./audio"; +import "./assistants"; +import "./batch"; +import "./chat"; import "./completions"; -import "./edits"; import "./embeddings"; import "./files"; import "./fine-tuning"; import "./images"; -import "./moderation"; +import "./messages"; +import "./models"; +import "./moderations"; +import "./runs"; +import "./threads"; +import "./vector-stores"; +import "./uploads"; using TypeSpec.Http; @@ -25,8 +34,7 @@ using TypeSpec.Http; name: "MIT", url: "https://github.com/openai/openai-openapi/blob/master/LICENSE", }, - version: "2.0.0", }) -@server("https://api.openai.com/v1", "OpenAI Endpoint") +@server("https://api.openai.com", "OpenAI Endpoint") @useAuth(BearerAuth) namespace OpenAI; diff --git a/.typespec/messages/client.tsp b/.typespec/messages/client.tsp new file mode 100644 index 000000000..fdf340460 --- /dev/null +++ b/.typespec/messages/client.tsp @@ -0,0 +1,65 @@ +import "@azure-tools/typespec-client-generator-core"; +import "./custom.tsp"; +import "./models.tsp"; + +using Azure.ClientGenerator.Core; +using OpenAI; + +// +@@access(MessageContent, Access.public); +@@usage(MessageContent, Usage.input | Usage.output); + +@@access(MessageContentImageFileObject, Access.public); 
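As an illustrative aside (not part of the generated TypeSpec or of this diff), the `CreateImageRequest` model defined in `images/models.tsp` above implies client payloads shaped like the following. The `build_image_request` helper is hypothetical; it merely mirrors the documented constraints (prompt length limits per model, `n` between 1 and 10 with `dall-e-3` restricted to `n=1`, and per-model size enums):

```python
# Hypothetical sketch: build and locally validate a payload matching the
# CreateImageRequest TypeSpec model above. Names and defaults here are
# illustrative assumptions, not part of the spec.

DALLE2_SIZES = {"256x256", "512x512", "1024x1024"}
DALLE3_SIZES = {"1024x1024", "1792x1024", "1024x1792"}


def build_image_request(prompt, model="dall-e-2", n=1, size="1024x1024",
                        response_format="url"):
    # Prompt length limits documented on CreateImageRequest.prompt
    if model == "dall-e-2" and len(prompt) > 1000:
        raise ValueError("dall-e-2 prompts are limited to 1000 characters")
    if model == "dall-e-3" and len(prompt) > 4000:
        raise ValueError("dall-e-3 prompts are limited to 4000 characters")
    # n is constrained by @minValue(1)/@maxValue(10); dall-e-3 supports n=1 only
    if not 1 <= n <= 10:
        raise ValueError("n must be between 1 and 10")
    if model == "dall-e-3" and n != 1:
        raise ValueError("dall-e-3 only supports n=1")
    # Size enums differ per model, as documented on CreateImageRequest.size
    allowed = DALLE3_SIZES if model == "dall-e-3" else DALLE2_SIZES
    if size not in allowed:
        raise ValueError(f"size {size!r} is not valid for {model}")
    if response_format not in ("url", "b64_json"):
        raise ValueError("response_format must be 'url' or 'b64_json'")
    return {"prompt": prompt, "model": model, "n": n, "size": size,
            "response_format": response_format}
```

A payload like this would be POSTed to the `/v1/images/generations` route declared in the accompanying `images/operations.tsp`.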
+@@usage(MessageContentImageFileObject, Usage.input | Usage.output); + +@@access(MessageContentTextObject, Access.public); +@@usage(MessageContentTextObject, Usage.input | Usage.output); + +@@access(MessageContentImageUrlObject, Access.public); +@@usage(MessageContentImageUrlObject, Usage.input | Usage.output); + +@@access(MessageContentRefusalObject, Access.public); +@@usage(MessageContentRefusalObject, Usage.input | Usage.output); + +@@access(MessageRequestContentTextObject, Access.public); +@@usage(MessageRequestContentTextObject, Usage.input | Usage.output); + +@@access(MessageContentTextObjectAnnotation, Access.public); +@@usage(MessageContentTextObjectAnnotation, Usage.input | Usage.output); + +@@access(MessageContentTextAnnotationsFileCitationObject, Access.public); +@@usage(MessageContentTextAnnotationsFileCitationObject, + Usage.input | Usage.output +); + +@@access(MessageContentTextAnnotationsFilePathObject, Access.public); +@@usage(MessageContentTextAnnotationsFilePathObject, + Usage.input | Usage.output +); + +// +@@access(MessageDeltaContent, Access.public); +@@usage(MessageDeltaContent, Usage.output); + +@@access(MessageDeltaContentImageFileObject, Access.public); +@@usage(MessageDeltaContentImageFileObject, Usage.output); + +@@access(MessageDeltaContentImageUrlObject, Access.public); +@@usage(MessageDeltaContentImageUrlObject, Usage.output); + +@@access(MessageDeltaContentTextObject, Access.public); +@@usage(MessageDeltaContentTextObject, Usage.output); + +// +@@access(MessageDeltaObject, Access.public); +@@usage(MessageDeltaObject, Usage.output); + +// +@@access(MessageDeltaTextContentAnnotation, Access.public); +@@usage(MessageDeltaTextContentAnnotation, Usage.output); + +@@access(MessageDeltaContentTextAnnotationsFileCitationObject, Access.public); +@@usage(MessageDeltaContentTextAnnotationsFileCitationObject, Usage.output); + +@@access(MessageDeltaContentTextAnnotationsFilePathObject, Access.public); 
+@@usage(MessageDeltaContentTextAnnotationsFilePathObject, Usage.output); diff --git a/.typespec/messages/custom.tsp b/.typespec/messages/custom.tsp new file mode 100644 index 000000000..e8a6f2e95 --- /dev/null +++ b/.typespec/messages/custom.tsp @@ -0,0 +1,37 @@ +/* + * This file was automatically generated from an OpenAPI .yaml file. + * Edits made directly to this file will be lost. + */ + +import "../assistants"; + +using TypeSpec.OpenAPI; + +namespace OpenAI; + +/** + * Represents a single piece of content in an Assistants API message. + */ +// TODO: @discriminator("type") +model MessageContent {} + +@discriminator("type") +model MessageContentTextObjectAnnotation { + /** The discriminated type identifier for the content item. */ + type: string; +} + +/** + * Represents a single piece of incremental content in an Assistants API streaming response. + */ +@discriminator("type") +model MessageDeltaContent { + /** The discriminated type identifier for the content item. */ + type: string; +} + +@discriminator("type") +model MessageDeltaTextContentAnnotation { + /** The discriminated type identifier for the content item. */ + type: string; +} diff --git a/.typespec/messages/main.tsp b/.typespec/messages/main.tsp new file mode 100644 index 000000000..e7af5325f --- /dev/null +++ b/.typespec/messages/main.tsp @@ -0,0 +1,2 @@ +import "./client.tsp"; +import "./operations.tsp"; diff --git a/.typespec/messages/models.tsp b/.typespec/messages/models.tsp new file mode 100644 index 000000000..d3bfe66e2 --- /dev/null +++ b/.typespec/messages/models.tsp @@ -0,0 +1,440 @@ +/* + * This file was automatically generated from an OpenAPI .yaml file. + * Edits made directly to this file will be lost. + */ + +import "../assistants"; +import "./custom.tsp"; + +using TypeSpec.OpenAPI; + +namespace OpenAI; + +// Tool generated type. Extracts CreateMessageRequest.attachments +model CreateMessageRequestAttachments + is { + /** The ID of the file to attach to the message. 
*/ + file_id: string; + + /** The tools to add this file to. */ + @extension("x-oaiExpandable", true) + tools: (AssistantToolsCode | AssistantToolsFileSearchTypeOnly)[]; + }[]; + +// Tool generated type. Extracts MessageObject.attachments +model MessageObjectAttachments + is { + /** The ID of the file to attach to the message. */ + file_id?: string; + + /** The tools to add this file to. */ + @extension("x-oaiExpandable", true) + tools?: (AssistantToolsCode | AssistantToolsFileSearchTypeOnly)[]; + }[]; + +model CreateMessageRequest { + @doc(""" + The role of the entity that is creating the message. Allowed values include: + - `user`: Indicates the message is sent by an actual user and should be used in most cases to represent user-generated messages. + - `assistant`: Indicates the message is generated by the assistant. Use this value to insert messages from the assistant into the conversation. + """) + role: "user" | "assistant"; + + // Tool customization: use abstract base model for message request content items + @extension("x-oaiExpandable", true) + content: MessageContent[]; + + /** A list of files attached to the message, and the tools they should be added to. */ + attachments?: CreateMessageRequestAttachments | null; + + /** Set of 16 key-value pairs that can be attached to an object. This can be useful for storing additional information about the object in a structured format. Keys can be a maximum of 64 characters long and values can be a maximum of 512 characters long. */ + @extension("x-oaiTypeLabel", "map") + metadata?: Record | null; +} + +model ModifyMessageRequest { + /** Set of 16 key-value pairs that can be attached to an object. This can be useful for storing additional information about the object in a structured format. Keys can be a maximum of 64 characters long and values can be a maximum of 512 characters long.
*/ + @extension("x-oaiTypeLabel", "map") + metadata?: Record | null; +} + +model ListMessagesResponse { + // Tool customization: add a clear enum enforcement of constrained 'object' label + object: "list"; + + data: MessageObject[]; + first_id: string; + last_id: string; + has_more: boolean; +} + +model DeleteMessageResponse { + id: string; + deleted: boolean; + object: "thread.message.deleted"; +} + +/** Represents a message within a [thread](/docs/api-reference/threads). */ +model MessageObject { + /** The identifier, which can be referenced in API endpoints. */ + id: string; + + @doc(""" + The object type, which is always `thread.message`. + """) + object: "thread.message"; + + // Tool customization: 'created' and fields ending in '_at' are Unix encoded utcDateTime + /** The Unix timestamp (in seconds) for when the message was created. */ + @encode("unixTimestamp", int32) + created_at: utcDateTime; + + /** The [thread](/docs/api-reference/threads) ID that this message belongs to. */ + thread_id: string; + + @doc(""" + The status of the message, which can be either `in_progress`, `incomplete`, or `completed`. + """) + status: "in_progress" | "incomplete" | "completed"; + + /** On an incomplete message, details about why the message is incomplete. */ + incomplete_details: { + /** The reason the message is incomplete. */ + reason: + | "content_filter" + | "max_tokens" + | "run_cancelled" + | "run_expired" + | "run_failed"; + } | null; + + // Tool customization: 'created' and fields ending in '_at' are Unix encoded utcDateTime + /** The Unix timestamp (in seconds) for when the message was completed. */ + @encode("unixTimestamp", int32) + completed_at: utcDateTime | null; + + // Tool customization: 'created' and fields ending in '_at' are Unix encoded utcDateTime + /** The Unix timestamp (in seconds) for when the message was marked as incomplete. 
*/ + @encode("unixTimestamp", int32) + incomplete_at: utcDateTime | null; + + @doc(""" + The entity that produced the message. One of `user` or `assistant`. + """) + role: "user" | "assistant"; + + // Tool customization: use abstract base model for message response content items + /** The content of the message in an array of text and/or images. */ + @extension("x-oaiExpandable", true) + content: MessageContent[]; + + /** If applicable, the ID of the [assistant](/docs/api-reference/assistants) that authored this message. */ + assistant_id: string | null; + + @doc(""" + The ID of the [run](/docs/api-reference/runs) associated with the creation of this message. Value is `null` when messages are created manually using the create message or create thread endpoints. + """) + run_id: string | null; + + /** A list of files attached to the message, and the tools they were added to. */ + attachments: MessageObjectAttachments | null; + + /** Set of 16 key-value pairs that can be attached to an object. This can be useful for storing additional information about the object in a structured format. Keys can be a maximum of 64 characters long and values can be a maximum of 512 characters long. */ + @extension("x-oaiTypeLabel", "map") + metadata: Record | null; +} + +// Tool customization: apply a common model base for all assistants message content items +/** References an image [File](/docs/api-reference/files) in the content of a message. */ +model MessageContentImageFileObject extends MessageContent { + @doc(""" + Always `image_file`. + """) + type: "image_file"; + + image_file: { + @doc(""" + The [File](/docs/api-reference/files) ID of the image in the message content. Set `purpose="vision"` when uploading the File if you need to later display the file content. + """) + file_id: string; + + @doc(""" + Specifies the detail level of the image if specified by the user. `low` uses fewer tokens, you can opt in to high resolution using `high`.
+ """) + detail?: "auto" | "low" | "high" = "auto"; + }; +} + +// Tool customization: apply a common model base for all assistants message content items +/** The text content that is part of a message. */ +model MessageContentTextObject extends MessageContent { + @doc(""" + Always `text`. + """) + type: "text"; + + text: { + /** The data that makes up the text. */ + value: string; + + // Tool customization: replace unioned types with a custom, common base + @extension("x-oaiExpandable", true) + annotations: MessageContentTextObjectAnnotation[]; + }; +} + +// Tool customization: apply a common model base for all assistants message content items +/** The refusal content generated by the assistant. */ +model MessageContentRefusalObject extends MessageContent { + @doc(""" + Always `refusal`. + """) + type: "refusal"; + + refusal: string; +} + +// Tool customization: apply custom, common base type to union items +/** A citation within the message that points to a specific quote from a specific File associated with the assistant or the message. Generated when the assistant uses the "file_search" tool to search files. */ +model MessageContentTextAnnotationsFileCitationObject + extends MessageContentTextObjectAnnotation { + @doc(""" + Always `file_citation`. + """) + type: "file_citation"; + + /** The text in the message content that needs to be replaced. */ + text: string; + + file_citation: { + /** The ID of the specific File the citation is from. */ + file_id: string; + }; + + @minValue(0) + start_index: int32; + + @minValue(0) + end_index: int32; +} + +// Tool customization: apply custom, common base type to union items +@doc(""" + A URL for the file that's generated when the assistant used the `code_interpreter` tool to generate a file. + """) +model MessageContentTextAnnotationsFilePathObject + extends MessageContentTextObjectAnnotation { + @doc(""" + Always `file_path`. + """) + type: "file_path"; + + /** The text in the message content that needs to be replaced. 
*/ + text: string; + + file_path: { + /** The ID of the file that was generated. */ + file_id: string; + }; + + @minValue(0) + start_index: int32; + + @minValue(0) + end_index: int32; +} + +// Tool customization: apply custom, common base type to union items +/** References an image [File](/docs/api-reference/files) in the content of a message. */ +model MessageDeltaContentImageFileObject extends MessageDeltaContent { + /** The index of the content part in the message. */ + index: int32; + + @doc(""" + Always `image_file`. + """) + type: "image_file"; + + image_file?: { + @doc(""" + The [File](/docs/api-reference/files) ID of the image in the message content. Set `purpose="vision"` when uploading the File if you need to later display the file content. + """) + file_id?: string; + + @doc(""" + Specifies the detail level of the image if specified by the user. `low` uses fewer tokens, you can opt in to high resolution using `high`. + """) + detail?: "auto" | "low" | "high" = "auto"; + }; +} + +// Tool customization: apply a common model base for all assistants message content items +/** References an image URL in the content of a message. */ +model MessageContentImageUrlObject extends MessageContent { + /** The type of the content part. */ + type: "image_url"; + + image_url: { + // Tool customization: url uses the url type + /** The external URL of the image; must be one of the supported image types: jpeg, jpg, png, gif, or webp. */ + url: url; + + @doc(""" + Specifies the detail level of the image. `low` uses fewer tokens, you can opt in to high resolution using `high`. Default value is `auto`. + """) + detail?: "auto" | "low" | "high" = "auto"; + }; +} + +// Tool customization: apply custom, common base type to union items +/** References an image URL in the content of a message. */ +model MessageDeltaContentImageUrlObject extends MessageDeltaContent { + /** The index of the content part in the message. */ + index: int32; + + @doc(""" + Always `image_url`.
+ """) + type: "image_url"; + + image_url?: { + // Tool customization: url uses the url type + /** The URL of the image; must be one of the supported image types: jpeg, jpg, png, gif, or webp. */ + url?: url; + + @doc(""" + Specifies the detail level of the image. `low` uses fewer tokens, you can opt in to high resolution using `high`. + """) + detail?: "auto" | "low" | "high" = "auto"; + }; +} + +/** Represents a message delta, i.e. any changed fields on a message during streaming. */ +model MessageDeltaObject { + /** The identifier of the message, which can be referenced in API endpoints. */ + id: string; + + @doc(""" + The object type, which is always `thread.message.delta`. + """) + object: "thread.message.delta"; + + /** The delta containing the fields that have changed on the Message. */ + delta: { + @doc(""" + The entity that produced the message. One of `user` or `assistant`. + """) + role?: "user" | "assistant"; + + // Tool customization: replace unioned types with a custom, common base + /** The content of the message in an array of text and/or images. */ + @extension("x-oaiExpandable", true) + content?: MessageDeltaContent[]; + }; +} + +// Tool customization: apply custom, common base type to union items +/** The text content that is part of a message. */ +model MessageDeltaContentTextObject extends MessageDeltaContent { + /** The index of the content part in the message. */ + index: int32; + + @doc(""" + Always `text`. + """) + type: "text"; + + text?: { + /** The data that makes up the text. */ + value?: string; + + // Tool customization: replace unioned types with a custom, common base + @extension("x-oaiExpandable", true) + annotations?: MessageDeltaTextContentAnnotation[]; + }; +} + +// Tool customization: apply a common model base for all assistants message content items +/** The text content that is part of a message. */ +model MessageRequestContentTextObject extends MessageContent { + @doc(""" + Always `text`.
+ """) + type: "text"; + + /** Text content to be sent to the model */ + text: string; +} + +// Tool customization: apply custom, common base type to union items +/** A citation within the message that points to a specific quote from a specific File associated with the assistant or the message. Generated when the assistant uses the "file_search" tool to search files. */ +model MessageDeltaContentTextAnnotationsFileCitationObject + extends MessageDeltaTextContentAnnotation { + /** The index of the annotation in the text content part. */ + index: int32; + + @doc(""" + Always `file_citation`. + """) + type: "file_citation"; + + /** The text in the message content that needs to be replaced. */ + text?: string; + + file_citation?: { + /** The ID of the specific File the citation is from. */ + file_id?: string; + + /** The specific quote in the file. */ + quote?: string; + }; + + @minValue(0) + start_index?: int32; + + @minValue(0) + end_index?: int32; +} + +// Tool customization: apply custom, common base type to union items +@doc(""" + A URL for the file that's generated when the assistant used the `code_interpreter` tool to generate a file. + """) +model MessageDeltaContentTextAnnotationsFilePathObject + extends MessageDeltaTextContentAnnotation { + /** The index of the annotation in the text content part. */ + index: int32; + + @doc(""" + Always `file_path`. + """) + type: "file_path"; + + /** The text in the message content that needs to be replaced. */ + text?: string; + + file_path?: { + /** The ID of the file that was generated. */ + file_id?: string; + }; + + @minValue(0) + start_index?: int32; + + @minValue(0) + end_index?: int32; +} + +// Tool customization: apply custom, common base type to union items +/** The refusal content that is part of a message. */ +model MessageDeltaContentRefusalObject extends MessageDeltaContent { + /** The index of the refusal part in the message. */ + index: int32; + + @doc(""" + Always `refusal`. 
+ """) + type: "refusal"; + + refusal?: string; +} diff --git a/.typespec/messages/operations.tsp b/.typespec/messages/operations.tsp new file mode 100644 index 000000000..23b0e34a0 --- /dev/null +++ b/.typespec/messages/operations.tsp @@ -0,0 +1,100 @@ +import "@typespec/http"; +import "@typespec/openapi"; + +import "../common"; +import "./models.tsp"; + +using TypeSpec.Http; +using TypeSpec.OpenAPI; + +namespace OpenAI; + +@route("/v1/threads/{thread_id}/messages") +interface Messages { + @post + @operationId("createMessage") + @tag("Assistants") + @summary("Create a message.") + createMessage( + /** The ID of the [thread](/docs/api-reference/threads) to create a message for. */ + @path thread_id: string, + + @body requestBody: CreateMessageRequest, + ): MessageObject | ErrorResponse; + + @get + @operationId("listMessages") + @tag("Assistants") + @summary("Returns a list of messages for a given thread.") + listMessages( + /** The ID of the [thread](/docs/api-reference/threads) the messages belong to. */ + @path thread_id: string, + + /** + * A limit on the number of objects to be returned. Limit can range between 1 and 100, and the + * default is 20. + */ + @query limit?: int32 = 20, + + /** + * Sort order by the `created_at` timestamp of the objects. `asc` for ascending order and`desc` + * for descending order. + */ + @query order?: ListOrder = ListOrder.desc, + + /** + * A cursor for use in pagination. `after` is an object ID that defines your place in the list. + * For instance, if you make a list request and receive 100 objects, ending with obj_foo, your + * subsequent call can include after=obj_foo in order to fetch the next page of the list. + */ + @query after?: string, + + /** + * A cursor for use in pagination. `before` is an object ID that defines your place in the list. 
+ * For instance, if you make a list request and receive 100 objects, ending with obj_foo, your + * subsequent call can include before=obj_foo in order to fetch the previous page of the list. + */ + @query before?: string, + ): ListMessagesResponse | ErrorResponse; + + @route("{message_id}") + @get + @operationId("getMessage") + @tag("Assistants") + @summary("Retrieve a message.") + getMessage( + /** The ID of the [thread](/docs/api-reference/threads) to which this message belongs. */ + @path thread_id: string, + + /** The ID of the message to retrieve. */ + @path message_id: string, + ): MessageObject | ErrorResponse; + + @route("{message_id}") + @post + @operationId("modifyMessage") + @tag("Assistants") + @summary("Modifies a message.") + modifyMessage( + /** The ID of the thread to which this message belongs. */ + @path thread_id: string, + + /** The ID of the message to modify. */ + @path message_id: string, + + @body requestBody: ModifyMessageRequest, + ): MessageObject | ErrorResponse; + + @route("{message_id}") + @delete + @operationId("deleteMessage") + @tag("Assistants") + @summary("Deletes a message.") + deleteMessage( + /** The ID of the thread to which this message belongs. */ + @path thread_id: string, + + /** The ID of the message to delete. */ + @path message_id: string, + ): DeleteMessageResponse | ErrorResponse; +} diff --git a/images/main.tsp b/.typespec/models/main.tsp similarity index 100% rename from images/main.tsp rename to .typespec/models/main.tsp diff --git a/common/models.tsp b/.typespec/models/models.tsp similarity index 64% rename from common/models.tsp rename to .typespec/models/models.tsp index d6d0d4f91..f18c9ca31 100644 --- a/common/models.tsp +++ b/.typespec/models/models.tsp @@ -1,39 +1,38 @@ -namespace OpenAI; +/* + * This file was automatically generated from an OpenAPI .yaml file. + * Edits made directly to this file will be lost. 
+ */ + using TypeSpec.OpenAPI; +namespace OpenAI; + model ListModelsResponse { - object: string; + object: "list"; data: Model[]; } +model DeleteModelResponse { + id: string; + deleted: boolean; + + // Tool customization: add a clear enum enforcement of constrained 'object' label + object: "model"; +} + /** Describes an OpenAI model offering that can be used with the API. */ model Model { /** The model identifier, which can be referenced in the API endpoints. */ id: string; - /** The object type, which is always "model". */ - object: "model"; - + // Tool customization: 'created' and fields ending in '_at' are Unix encoded utcDateTime /** The Unix timestamp (in seconds) when the model was created. */ @encode("unixTimestamp", int32) created: utcDateTime; + /** The object type, which is always "model". */ + object: "model"; + /** The organization that owns the model. */ owned_by: string; } - -model DeleteModelResponse { - id: string; - object: string; - deleted: boolean; -} - -// this is using yaml refs instead of a def in the openapi, not sure if that's required? - -scalar User extends string; - -@minItems(1) -model TokenArray is safeint[]; - -@minItems(1) -model TokenArrayArray is TokenArray[]; diff --git a/.typespec/models/operations.tsp b/.typespec/models/operations.tsp new file mode 100644 index 000000000..558cc9716 --- /dev/null +++ b/.typespec/models/operations.tsp @@ -0,0 +1,47 @@ +import "@typespec/http"; +import "@typespec/openapi"; + +import "../common"; +import "./models.tsp"; + +using TypeSpec.Http; +using TypeSpec.OpenAPI; + +namespace OpenAI; + +@route("/v1/models") +interface Models { + @get + @operationId("listModels") + @tag("Models") + @summary(""" + Lists the currently available models, and provides basic information about each one such as the + owner and availability. 
+ """) + listModels(): ListModelsResponse | ErrorResponse; + + @route("{model}") + @get + @operationId("retrieveModel") + @tag("Models") + @summary(""" + Retrieves a model instance, providing basic information about the model such as the owner and + permissioning. + """) + retrieve( + /** The ID of the model to use for this request. */ + @path `model`: string, + ): Model | ErrorResponse; + + @route("{model}") + @delete + @operationId("deleteModel") + @tag("Models") + @summary(""" + Delete a fine-tuned model. You must have the Owner role in your organization to delete a model. + """) + delete( + /** The model to delete */ + @path `model`: string, + ): DeleteModelResponse | ErrorResponse; +} diff --git a/moderation/main.tsp b/.typespec/moderations/main.tsp similarity index 100% rename from moderation/main.tsp rename to .typespec/moderations/main.tsp diff --git a/.typespec/moderations/models.tsp b/.typespec/moderations/models.tsp new file mode 100644 index 000000000..5af81e7ec --- /dev/null +++ b/.typespec/moderations/models.tsp @@ -0,0 +1,108 @@ +/* + * This file was automatically generated from an OpenAPI .yaml file. + * Edits made directly to this file will be lost. + */ + +using TypeSpec.OpenAPI; + +namespace OpenAI; + +model CreateModerationRequest { + /** The input text to classify */ + input: string | string[]; + + @doc(""" + Two content moderation models are available: `text-moderation-stable` and `text-moderation-latest`. + + The default is `text-moderation-latest` which will be automatically upgraded over time. This ensures you are always using our most accurate model. If you use `text-moderation-stable`, we will provide advance notice before updating the model. Accuracy of `text-moderation-stable` may be slightly lower than for `text-moderation-latest`.
+ """) + @extension("x-oaiTypeLabel", "string") + `model`?: string | "text-moderation-latest" | "text-moderation-stable" = "text-moderation-latest"; +} + +/** Represents if a given text input is potentially harmful. */ +model CreateModerationResponse { + /** The unique identifier for the moderation request. */ + id: string; + + /** The model used to generate the moderation results. */ + `model`: string; + + /** A list of moderation objects. */ + results: { + /** Whether any of the below categories are flagged. */ + flagged: boolean; + + /** A list of the categories, and whether they are flagged or not. */ + categories: { + /** Content that expresses, incites, or promotes hate based on race, gender, ethnicity, religion, nationality, sexual orientation, disability status, or caste. Hateful content aimed at non-protected groups (e.g., chess players) is harassment. */ + hate: boolean; + + /** Hateful content that also includes violence or serious harm towards the targeted group based on race, gender, ethnicity, religion, nationality, sexual orientation, disability status, or caste. */ + `hate/threatening`: boolean; + + /** Content that expresses, incites, or promotes harassing language towards any target. */ + harassment: boolean; + + /** Harassment content that also includes violence or serious harm towards any target. */ + `harassment/threatening`: boolean; + + /** Content that promotes, encourages, or depicts acts of self-harm, such as suicide, cutting, and eating disorders. */ + `self-harm`: boolean; + + /** Content where the speaker expresses that they are engaging or intend to engage in acts of self-harm, such as suicide, cutting, and eating disorders. */ + `self-harm/intent`: boolean; + + /** Content that encourages performing acts of self-harm, such as suicide, cutting, and eating disorders, or that gives instructions or advice on how to commit such acts. 
*/ + `self-harm/instructions`: boolean; + + /** Content meant to arouse sexual excitement, such as the description of sexual activity, or that promotes sexual services (excluding sex education and wellness). */ + sexual: boolean; + + /** Sexual content that includes an individual who is under 18 years old. */ + `sexual/minors`: boolean; + + /** Content that depicts death, violence, or physical injury. */ + violence: boolean; + + /** Content that depicts death, violence, or physical injury in graphic detail. */ + `violence/graphic`: boolean; + }; + + /** A list of the categories along with their scores as predicted by model. */ + category_scores: { + /** The score for the category 'hate'. */ + hate: float32; + + /** The score for the category 'hate/threatening'. */ + `hate/threatening`: float32; + + /** The score for the category 'harassment'. */ + harassment: float32; + + /** The score for the category 'harassment/threatening'. */ + `harassment/threatening`: float32; + + /** The score for the category 'self-harm'. */ + `self-harm`: float32; + + /** The score for the category 'self-harm/intent'. */ + `self-harm/intent`: float32; + + /** The score for the category 'self-harm/instructions'. */ + `self-harm/instructions`: float32; + + /** The score for the category 'sexual'. */ + sexual: float32; + + /** The score for the category 'sexual/minors'. */ + `sexual/minors`: float32; + + /** The score for the category 'violence'. */ + violence: float32; + + /** The score for the category 'violence/graphic'. 
*/ + `violence/graphic`: float32; + }; + }[]; +} diff --git a/moderation/operations.tsp b/.typespec/moderations/operations.tsp similarity index 59% rename from moderation/operations.tsp rename to .typespec/moderations/operations.tsp index 5f29bc3be..b288ccb6e 100644 --- a/moderation/operations.tsp +++ b/.typespec/moderations/operations.tsp @@ -1,7 +1,7 @@ import "@typespec/http"; import "@typespec/openapi"; -import "../common/errors.tsp"; +import "../common"; import "./models.tsp"; using TypeSpec.Http; @@ -9,12 +9,13 @@ using TypeSpec.OpenAPI; namespace OpenAI; -@route("/moderations") +@route("/v1/moderations") interface Moderations { + @post @operationId("createModeration") - @tag("OpenAI") - @summary("Classifies if text violates OpenAI's Content Policy") + @tag("Moderations") + @summary("Classifies if text is potentially harmful.") createModeration( - @body content: CreateModerationRequest, + @body requestBody: CreateModerationRequest, ): CreateModerationResponse | ErrorResponse; } diff --git a/.typespec/runs/client.tsp b/.typespec/runs/client.tsp new file mode 100644 index 000000000..bbc9b40c6 --- /dev/null +++ b/.typespec/runs/client.tsp @@ -0,0 +1,38 @@ +import "@azure-tools/typespec-client-generator-core"; +import "./custom.tsp"; +import "./models.tsp"; + +using Azure.ClientGenerator.Core; +using OpenAI; + +@@access(RunStepDeltaObject, Access.public); +@@usage(RunStepDeltaObject, Usage.output); + +@@access(RunStepDeltaStepDetailsMessageCreationObject, Access.public); +@@usage(RunStepDeltaStepDetailsMessageCreationObject, Usage.output); + +@@access(RunStepDeltaStepDetailsToolCallsObject, Access.public); +@@usage(RunStepDeltaStepDetailsToolCallsObject, Usage.output); + +// +// @@access(RunStepDeltaStepDetailsToolCallsObjectToolCallsObject, Access.public); +// @@usage(RunStepDeltaStepDetailsToolCallsObjectToolCallsObject, Usage.output); + +@@access(RunStepDeltaStepDetailsToolCallsCodeObject, Access.public); +@@usage(RunStepDeltaStepDetailsToolCallsCodeObject, 
Usage.output); + +@@access(RunStepDeltaStepDetailsToolCallsFileSearchObject, Access.public); +@@usage(RunStepDeltaStepDetailsToolCallsFileSearchObject, Usage.output); + +@@access(RunStepDeltaStepDetailsToolCallsFunctionObject, Access.public); +@@usage(RunStepDeltaStepDetailsToolCallsFunctionObject, Usage.output); + +// +// @@access(RunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreterOutputsObject, Access.public); +// @@usage(RunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreterOutputsObject, Usage.output); + +@@access(RunStepDeltaStepDetailsToolCallsCodeOutputImageObject, Access.public); +@@usage(RunStepDeltaStepDetailsToolCallsCodeOutputImageObject, Usage.output); + +@@access(RunStepDeltaStepDetailsToolCallsCodeOutputLogsObject, Access.public); +@@usage(RunStepDeltaStepDetailsToolCallsCodeOutputLogsObject, Usage.output); diff --git a/.typespec/runs/custom.tsp b/.typespec/runs/custom.tsp new file mode 100644 index 000000000..2127e29f9 --- /dev/null +++ b/.typespec/runs/custom.tsp @@ -0,0 +1,61 @@ +/* + * This file was automatically generated from an OpenAPI .yaml file. + * Edits made directly to this file will be lost. + */ + +import "../assistants"; + +using TypeSpec.OpenAPI; + +namespace OpenAI; + +/** + * Abstractly represents a run step details object. + */ +@discriminator("type") +model RunStepObjectStepDetails { + /** The discriminated type identifier for the details object. */ + type: string; +} + +/** + * Abstractly represents a run step tool call details inner object. + */ +@discriminator("type") +model RunStepDetailsToolCallsObjectToolCallsObject { + /** The discriminated type identifier for the details object. */ + type: string; +} + +/** + * Abstractly represents a run step tool call details code interpreter output. + */ +@discriminator("type") +model RunStepDetailsToolCallsCodeObjectCodeInterpreterOutputsObject { + /** The discriminated type identifier for the details object. 
*/ + type: string; +} + +@discriminator("type") +model RunStepDeltaStepDetails { + /** The discriminated type identifier for the details object. */ + type: string; +} + +/** + * Abstractly represents a run step tool call details inner object. + */ +@discriminator("type") +model RunStepDeltaStepDetailsToolCallsObjectToolCallsObject { + /** The discriminated type identifier for the details object. */ + type: string; +} + +/** + * Abstractly represents a run step tool call details code interpreter output. + */ +@discriminator("type") +model RunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreterOutputsObject { + /** The discriminated type identifier for the details object. */ + type: string; +} diff --git a/.typespec/runs/main.tsp b/.typespec/runs/main.tsp new file mode 100644 index 000000000..e7af5325f --- /dev/null +++ b/.typespec/runs/main.tsp @@ -0,0 +1,2 @@ +import "./client.tsp"; +import "./operations.tsp"; diff --git a/.typespec/runs/models.tsp b/.typespec/runs/models.tsp new file mode 100644 index 000000000..8de731324 --- /dev/null +++ b/.typespec/runs/models.tsp @@ -0,0 +1,804 @@ +/* + * This file was automatically generated from an OpenAPI .yaml file. + * Edits made directly to this file will be lost. + */ + +import "../assistants"; +import "../common"; +import "../messages"; +import "../threads"; +import "./custom.tsp"; + +using TypeSpec.OpenAPI; + +namespace OpenAI; + +// Tool generated type. Extracts CreateRunRequest.additional_messages +model CreateRunRequestAdditional_messages is CreateMessageRequest[]; + +// Tool generated type. Extracts CreateRunRequest.tools +@maxItems(20) +@extension("x-oaiExpandable", true) +model CreateRunRequestTools is AssistantToolDefinition[]; + +// Tool generated type. Extracts CreateThreadAndRunRequest.tools +@maxItems(20) +model CreateThreadAndRunRequestTools is AssistantToolDefinition[]; + +model CreateRunRequest { + /** The ID of the [assistant](/docs/api-reference/assistants) to use to execute this run. 
*/ + assistant_id: string; + + /** The ID of the [Model](/docs/api-reference/models) to be used to execute this run. If a value is provided here, it will override the model associated with the assistant. If not, the model associated with the assistant will be used. */ + @extension("x-oaiTypeLabel", "string") + `model`?: + | string + | "gpt-4o" + | "gpt-4o-2024-08-06" + | "gpt-4o-2024-05-13" + | "gpt-4o-mini" + | "gpt-4o-mini-2024-07-18" + | "gpt-4-turbo" + | "gpt-4-turbo-2024-04-09" + | "gpt-4-0125-preview" + | "gpt-4-turbo-preview" + | "gpt-4-1106-preview" + | "gpt-4-vision-preview" + | "gpt-4" + | "gpt-4-0314" + | "gpt-4-0613" + | "gpt-4-32k" + | "gpt-4-32k-0314" + | "gpt-4-32k-0613" + | "gpt-3.5-turbo" + | "gpt-3.5-turbo-16k" + | "gpt-3.5-turbo-0613" + | "gpt-3.5-turbo-1106" + | "gpt-3.5-turbo-0125" + | "gpt-3.5-turbo-16k-0613" + | null; + + /** Overrides the [instructions](/docs/api-reference/assistants/createAssistant) of the assistant. This is useful for modifying the behavior on a per-run basis. */ + instructions?: string | null; + + /** Appends additional instructions at the end of the instructions for the run. This is useful for modifying the behavior on a per-run basis without overriding other instructions. */ + additional_instructions?: string | null; + + /** Adds additional messages to the thread before creating the run. */ + additional_messages?: CreateRunRequestAdditional_messages | null; + + // Tool customization: use common model base for tool definitions + /** Override the tools the assistant can use for this run. This is useful for modifying the behavior on a per-run basis. */ + tools?: CreateRunRequestTools | null; + + /** Set of 16 key-value pairs that can be attached to an object. This can be useful for storing additional information about the object in a structured format. Keys can be a maximum of 64 characters long and values can be a maxium of 512 characters long. 
*/ + @extension("x-oaiTypeLabel", "map") + metadata?: Record | null; + + /** What sampling temperature to use, between 0 and 2. Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic. */ + @minValue(0) + @maxValue(2) + temperature?: float32 | null = 1; + + /** + * An alternative to sampling with temperature, called nucleus sampling, where the model considers the results of the tokens with top_p probability mass. So 0.1 means only the tokens comprising the top 10% probability mass are considered. + * + * We generally recommend altering this or temperature but not both. + */ + @minValue(0) + @maxValue(1) + top_p?: float32 | null = 1; + + @doc(""" + If `true`, returns a stream of events that happen during the Run as server-sent events, terminating when the Run enters a terminal state with a `data: [DONE]` message. + """) + stream?: boolean | null; + + @doc(""" + The maximum number of prompt tokens that may be used over the course of the run. The run will make a best effort to use only the number of prompt tokens specified, across multiple turns of the run. If the run exceeds the number of prompt tokens specified, the run will end with status `incomplete`. See `incomplete_details` for more info. + """) + @minValue(256) + max_prompt_tokens?: int32 | null; + + @doc(""" + The maximum number of completion tokens that may be used over the course of the run. The run will make a best effort to use only the number of completion tokens specified, across multiple turns of the run. If the run exceeds the number of completion tokens specified, the run will end with status `incomplete`. See `incomplete_details` for more info. 
+ """) + @minValue(256) + max_completion_tokens?: int32 | null; + + truncation_strategy?: TruncationObject | null; + tool_choice?: AssistantsApiToolChoiceOption | null; + parallel_tool_calls?: ParallelToolCalls = true; + response_format?: AssistantsApiResponseFormatOption | null; +} + +model ModifyRunRequest { + /** Set of 16 key-value pairs that can be attached to an object. This can be useful for storing additional information about the object in a structured format. Keys can be a maximum of 64 characters long and values can be a maximum of 512 characters long. */ + @extension("x-oaiTypeLabel", "map") + metadata?: Record | null; +} + +model CreateThreadAndRunRequest { + /** The ID of the [assistant](/docs/api-reference/assistants) to use to execute this run. */ + assistant_id: string; + + /** If no thread is provided, an empty thread will be created. */ + thread?: CreateThreadRequest; + + /** The ID of the [Model](/docs/api-reference/models) to be used to execute this run. If a value is provided here, it will override the model associated with the assistant. If not, the model associated with the assistant will be used. */ + @extension("x-oaiTypeLabel", "string") + `model`?: + | string + | "gpt-4o" + | "gpt-4o-2024-08-06" + | "gpt-4o-2024-05-13" + | "gpt-4o-mini" + | "gpt-4o-mini-2024-07-18" + | "gpt-4-turbo" + | "gpt-4-turbo-2024-04-09" + | "gpt-4-0125-preview" + | "gpt-4-turbo-preview" + | "gpt-4-1106-preview" + | "gpt-4-vision-preview" + | "gpt-4" + | "gpt-4-0314" + | "gpt-4-0613" + | "gpt-4-32k" + | "gpt-4-32k-0314" + | "gpt-4-32k-0613" + | "gpt-3.5-turbo" + | "gpt-3.5-turbo-16k" + | "gpt-3.5-turbo-0613" + | "gpt-3.5-turbo-1106" + | "gpt-3.5-turbo-0125" + | "gpt-3.5-turbo-16k-0613" + | null; + + /** Override the default system message of the assistant. This is useful for modifying the behavior on a per-run basis.
*/ + instructions?: string | null; + + // Tool customization: use common model base for tool definitions + /** Override the tools the assistant can use for this run. This is useful for modifying the behavior on a per-run basis. */ + tools?: CreateThreadAndRunRequestTools | null; + + @doc(""" + A set of resources that are used by the assistant's tools. The resources are specific to the type of tool. For example, the `code_interpreter` tool requires a list of file IDs, while the `file_search` tool requires a list of vector store IDs. + """) + tool_resources?: { + code_interpreter?: { + @doc(""" + A list of [file](/docs/api-reference/files) IDs made available to the `code_interpreter` tool. There can be a maximum of 20 files associated with the tool. + """) + @maxItems(20) + file_ids?: string[] = #[]; + }; + + // Tool customization: use custom type for sophisticated union + file_search?: ToolResourcesFileSearchIdsOnly; + } | null; + + /** Set of 16 key-value pairs that can be attached to an object. This can be useful for storing additional information about the object in a structured format. Keys can be a maximum of 64 characters long and values can be a maxium of 512 characters long. */ + @extension("x-oaiTypeLabel", "map") + metadata?: Record | null; + + /** What sampling temperature to use, between 0 and 2. Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic. */ + @minValue(0) + @maxValue(2) + temperature?: float32 | null = 1; + + /** + * An alternative to sampling with temperature, called nucleus sampling, where the model considers the results of the tokens with top_p probability mass. So 0.1 means only the tokens comprising the top 10% probability mass are considered. + * + * We generally recommend altering this or temperature but not both. 
+ */ + @minValue(0) + @maxValue(1) + top_p?: float32 | null = 1; + + @doc(""" + If `true`, returns a stream of events that happen during the Run as server-sent events, terminating when the Run enters a terminal state with a `data: [DONE]` message. + """) + stream?: boolean | null; + + @doc(""" + The maximum number of prompt tokens that may be used over the course of the run. The run will make a best effort to use only the number of prompt tokens specified, across multiple turns of the run. If the run exceeds the number of prompt tokens specified, the run will end with status `incomplete`. See `incomplete_details` for more info. + """) + @minValue(256) + max_prompt_tokens?: int32 | null; + + @doc(""" + The maximum number of completion tokens that may be used over the course of the run. The run will make a best effort to use only the number of completion tokens specified, across multiple turns of the run. If the run exceeds the number of completion tokens specified, the run will end with status `incomplete`. See `incomplete_details` for more info. + """) + @minValue(256) + max_completion_tokens?: int32 | null; + + truncation_strategy?: TruncationObject | null; + tool_choice?: AssistantsApiToolChoiceOption | null; + parallel_tool_calls?: ParallelToolCalls = true; + response_format?: AssistantsApiResponseFormatOption | null; +} + +model SubmitToolOutputsRunRequest { + /** A list of tools for which the outputs are being submitted. */ + tool_outputs: { + @doc(""" + The ID of the tool call in the `required_action` object within the run object the output is being submitted for. + """) + tool_call_id?: string; + + /** The output of the tool call to be submitted to continue the run. */ + output?: string; + }[]; + + @doc(""" + If `true`, returns a stream of events that happen during the Run as server-sent events, terminating when the Run enters a terminal state with a `data: [DONE]` message. 
+ """) + stream?: boolean | null; +} + +model ListRunsResponse { + // Tool customization: add a clear enum enforcement of constrained 'object' label + object: "list"; + + data: RunObject[]; + first_id: string; + last_id: string; + has_more: boolean; +} + +model ListRunStepsResponse { + // Tool customization: add a clear enum enforcement of constrained 'object' label + object: "list"; + + data: RunStepObject[]; + first_id: string; + last_id: string; + has_more: boolean; +} + +// Tool customization: apply custom, common base type to union items +/** Details of the message creation by the run step. */ +model RunStepDetailsMessageCreationObject extends RunStepObjectStepDetails { + @doc(""" + Always `message_creation`. + """) + type: "message_creation"; + + message_creation: { + /** The ID of the message that was created by this run step. */ + message_id: string; + }; +} + +// Tool customization: apply custom, common base type to union items +/** Details of the tool call. */ +model RunStepDetailsToolCallsObject extends RunStepObjectStepDetails { + @doc(""" + Always `tool_calls`. + """) + type: "tool_calls"; + + // Tool customization: replace unioned types with a custom, common base + @doc(""" + An array of tool calls the run step was involved in. These can be associated with one of three types of tools: `code_interpreter`, `file_search`, or `function`. + """) + @extension("x-oaiExpandable", true) + tool_calls: RunStepDetailsToolCallsObjectToolCallsObject[]; +} + +// Tool customization: apply custom, common base type to union items +/** Details of the Code Interpreter tool call the run step was involved in. */ +model RunStepDetailsToolCallsCodeObject + extends RunStepDetailsToolCallsObjectToolCallsObject { + /** The ID of the tool call. */ + id: string; + + @doc(""" + The type of tool call. This is always going to be `code_interpreter` for this type of tool call. + """) + type: "code_interpreter"; + + /** The Code Interpreter tool call definition. 
*/ + code_interpreter: { + /** The input to the Code Interpreter tool call. */ + input: string; + + // Tool customization: replace unioned types with a custom, common base + @doc(""" + The outputs from the Code Interpreter tool call. Code Interpreter can output one or more items, including text (`logs`) or images (`image`). Each of these are represented by a different object type. + """) + @extension("x-oaiExpandable", true) + outputs: RunStepDetailsToolCallsCodeObjectCodeInterpreterOutputsObject[]; + }; +} + +// Tool customization: apply custom, common base type to union items +/** Text output from the Code Interpreter tool call as part of a run step. */ +model RunStepDetailsToolCallsCodeOutputLogsObject + extends RunStepDetailsToolCallsCodeObjectCodeInterpreterOutputsObject { + @doc(""" + Always `logs`. + """) + type: "logs"; + + /** The text output from the Code Interpreter tool call. */ + logs: string; +} + +// Tool customization: apply custom, common base type to union items +model RunStepDetailsToolCallsCodeOutputImageObject + extends RunStepDetailsToolCallsCodeObjectCodeInterpreterOutputsObject { + @doc(""" + Always `image`. + """) + type: "image"; + + image: { + /** The [file](/docs/api-reference/files) ID of the image. */ + file_id: string; + }; +} + +// Tool customization: apply custom, common base type to union items +model RunStepDetailsToolCallsFileSearchObject + extends RunStepDetailsToolCallsObjectToolCallsObject { + /** The ID of the tool call object. */ + id: string; + + @doc(""" + The type of tool call. This is always going to be `file_search` for this type of tool call. + """) + type: "file_search"; + + /** For now, this is always going to be an empty object. */ + @extension("x-oaiTypeLabel", "map") + file_search: Record; +} + +// Tool customization: apply custom, common base type to union items +model RunStepDetailsToolCallsFunctionObject + extends RunStepDetailsToolCallsObjectToolCallsObject { + /** The ID of the tool call object. 
*/ + id: string; + + @doc(""" + The type of tool call. This is always going to be `function` for this type of tool call. + """) + type: "function"; + + /** The definition of the function that was called. */ + function: { + /** The name of the function. */ + name: string; + + /** The arguments passed to the function. */ + arguments: string; + + @doc(""" + The output of the function. This will be `null` if the outputs have not been [submitted](/docs/api-reference/runs/submitToolOutputs) yet. + """) + output: string | null; + }; +} + +@doc(""" + Usage statistics related to the run. This value will be `null` if the run is not in a terminal state (i.e. `in_progress`, `queued`, etc.). + """) +model RunCompletionUsage { + /** Number of completion tokens used over the course of the run. */ + completion_tokens: int32; + + /** Number of prompt tokens used over the course of the run. */ + prompt_tokens: int32; + + /** Total number of tokens used (prompt + completion). */ + total_tokens: int32; +} + +@doc(""" + Usage statistics related to the run step. This value will be `null` while the run step's status is `in_progress`. + """) +model RunStepCompletionUsage { + /** Number of completion tokens used over the course of the run step. */ + completion_tokens: int32; + + /** Number of prompt tokens used over the course of the run step. */ + prompt_tokens: int32; + + /** Total number of tokens used (prompt + completion). */ + total_tokens: int32; +} + +/** Represents an execution run on a [thread](/docs/api-reference/threads). */ +model RunObject { + /** The identifier, which can be referenced in API endpoints. */ + id: string; + + @doc(""" + The object type, which is always `thread.run`. + """) + object: "thread.run"; + + // Tool customization: 'created' and fields ending in '_at' are Unix encoded utcDateTime + /** The Unix timestamp (in seconds) for when the run was created. 
*/ + @encode("unixTimestamp", int32) + created_at: utcDateTime; + + /** The ID of the [thread](/docs/api-reference/threads) that was executed on as a part of this run. */ + thread_id: string; + + /** The ID of the [assistant](/docs/api-reference/assistants) used for execution of this run. */ + assistant_id: string; + + @doc(""" + The status of the run, which can be either `queued`, `in_progress`, `requires_action`, `cancelling`, `cancelled`, `failed`, `completed`, `incomplete`, or `expired`. + """) + status: + | "queued" + | "in_progress" + | "requires_action" + | "cancelling" + | "cancelled" + | "failed" + | "completed" + | "incomplete" + | "expired"; + + @doc(""" + Details on the action required to continue the run. Will be `null` if no action is required. + """) + required_action: { + @doc(""" + For now, this is always `submit_tool_outputs`. + """) + type: "submit_tool_outputs"; + + /** Details on the tool outputs needed for this run to continue. */ + submit_tool_outputs: { + /** A list of the relevant tool calls. */ + tool_calls: RunToolCallObject[]; + }; + } | null; + + @doc(""" + The last error associated with this run. Will be `null` if there are no errors. + """) + last_error: { + @doc(""" + One of `server_error`, `rate_limit_exceeded`, or `invalid_prompt`. + """) + code: "server_error" | "rate_limit_exceeded" | "invalid_prompt"; + + /** A human-readable description of the error. */ + message: string; + } | null; + + // Tool customization: 'created' and fields ending in '_at' are Unix encoded utcDateTime + /** The Unix timestamp (in seconds) for when the run will expire. */ + @encode("unixTimestamp", int32) + expires_at: utcDateTime | null; + + // Tool customization: 'created' and fields ending in '_at' are Unix encoded utcDateTime + /** The Unix timestamp (in seconds) for when the run was started. 
*/ + @encode("unixTimestamp", int32) + started_at: utcDateTime | null; + + // Tool customization: 'created' and fields ending in '_at' are Unix encoded utcDateTime + /** The Unix timestamp (in seconds) for when the run was cancelled. */ + @encode("unixTimestamp", int32) + cancelled_at: utcDateTime | null; + + // Tool customization: 'created' and fields ending in '_at' are Unix encoded utcDateTime + /** The Unix timestamp (in seconds) for when the run failed. */ + @encode("unixTimestamp", int32) + failed_at: utcDateTime | null; + + // Tool customization: 'created' and fields ending in '_at' are Unix encoded utcDateTime + /** The Unix timestamp (in seconds) for when the run was completed. */ + @encode("unixTimestamp", int32) + completed_at: utcDateTime | null; + + @doc(""" + Details on why the run is incomplete. Will be `null` if the run is not incomplete. + """) + incomplete_details: { + /** The reason why the run is incomplete. This will point to which specific token limit was reached over the course of the run. */ + reason?: "max_completion_tokens" | "max_prompt_tokens"; + } | null; + + /** The model that the [assistant](/docs/api-reference/assistants) used for this run. */ + `model`: string; + + /** The instructions that the [assistant](/docs/api-reference/assistants) used for this run. */ + instructions: string; + + // Tool customization: use common model base for tool definitions + /** The list of tools that the [assistant](/docs/api-reference/assistants) used for this run. */ + @maxItems(20) + @extension("x-oaiExpandable", true) + tools: AssistantToolDefinition[] = #[]; + + /** Set of 16 key-value pairs that can be attached to an object. This can be useful for storing additional information about the object in a structured format. Keys can be a maximum of 64 characters long and values can be a maxium of 512 characters long. 
*/ + @extension("x-oaiTypeLabel", "map") + metadata: Record | null; + + usage: RunCompletionUsage | null; + + /** The sampling temperature used for this run. If not set, defaults to 1. */ + temperature?: float32 | null; + + /** The nucleus sampling value used for this run. If not set, defaults to 1. */ + top_p?: float32 | null; + + /** The maximum number of prompt tokens specified to have been used over the course of the run. */ + @minValue(256) + max_prompt_tokens: int32 | null; + + /** The maximum number of completion tokens specified to have been used over the course of the run. */ + @minValue(256) + max_completion_tokens: int32 | null; + + truncation_strategy: TruncationObject | null; + tool_choice: AssistantsApiToolChoiceOption | null; + parallel_tool_calls: ParallelToolCalls = true; + response_format: AssistantsApiResponseFormatOption | null; +} + +/** Represents a step in execution of a run. */ +model RunStepObject { + /** The identifier of the run step, which can be referenced in API endpoints. */ + id: string; + + @doc(""" + The object type, which is always `thread.run.step`. + """) + object: "thread.run.step"; + + // Tool customization: 'created' and fields ending in '_at' are Unix encoded utcDateTime + /** The Unix timestamp (in seconds) for when the run step was created. */ + @encode("unixTimestamp", int32) + created_at: utcDateTime; + + /** The ID of the [assistant](/docs/api-reference/assistants) associated with the run step. */ + assistant_id: string; + + /** The ID of the [thread](/docs/api-reference/threads) that was run. */ + thread_id: string; + + /** The ID of the [run](/docs/api-reference/runs) that this run step is a part of. */ + run_id: string; + + @doc(""" + The type of run step, which can be either `message_creation` or `tool_calls`. + """) + type: "message_creation" | "tool_calls"; + + @doc(""" + The status of the run step, which can be either `in_progress`, `cancelled`, `failed`, `completed`, or `expired`. 
+ """) + status: "in_progress" | "cancelled" | "failed" | "completed" | "expired"; + + // Tool customization: replace unioned types with a custom, common base + /** The details of the run step. */ + @extension("x-oaiExpandable", true) + step_details: RunStepObjectStepDetails; + + @doc(""" + The last error associated with this run step. Will be `null` if there are no errors. + """) + last_error: { + @doc(""" + One of `server_error` or `rate_limit_exceeded`. + """) + code: "server_error" | "rate_limit_exceeded"; + + /** A human-readable description of the error. */ + message: string; + } | null; + + // Tool customization: 'created' and fields ending in '_at' are Unix encoded utcDateTime + /** The Unix timestamp (in seconds) for when the run step expired. A step is considered expired if the parent run is expired. */ + @encode("unixTimestamp", int32) + expired_at: utcDateTime | null; + + // Tool customization: 'created' and fields ending in '_at' are Unix encoded utcDateTime + /** The Unix timestamp (in seconds) for when the run step was cancelled. */ + @encode("unixTimestamp", int32) + cancelled_at: utcDateTime | null; + + // Tool customization: 'created' and fields ending in '_at' are Unix encoded utcDateTime + /** The Unix timestamp (in seconds) for when the run step failed. */ + @encode("unixTimestamp", int32) + failed_at: utcDateTime | null; + + // Tool customization: 'created' and fields ending in '_at' are Unix encoded utcDateTime + /** The Unix timestamp (in seconds) for when the run step completed. */ + @encode("unixTimestamp", int32) + completed_at: utcDateTime | null; + + /** Set of 16 key-value pairs that can be attached to an object. This can be useful for storing additional information about the object in a structured format. Keys can be a maximum of 64 characters long and values can be a maximum of 512 characters long.
*/ + @extension("x-oaiTypeLabel", "map") + metadata: Record | null; + + usage: RunStepCompletionUsage | null; +} + +/** Tool call objects */ +model RunToolCallObject { + /** The ID of the tool call. This ID must be referenced when you submit the tool outputs using the [Submit tool outputs to run](/docs/api-reference/runs/submitToolOutputs) endpoint. */ + id: string; + + @doc(""" + The type of tool call the output is required for. For now, this is always `function`. + """) + type: "function"; + + /** The function definition. */ + function: { + /** The name of the function. */ + name: string; + + /** The arguments that the model expects you to pass to the function. */ + arguments: string; + }; +} + +/** Represents a run step delta, i.e. any changed fields on a run step during streaming. */ +model RunStepDeltaObject { + /** The identifier of the run step, which can be referenced in API endpoints. */ + id: string; + + @doc(""" + The object type, which is always `thread.run.step.delta`. + """) + object: "thread.run.step.delta"; + + /** The delta containing the fields that have changed on the run step. */ + delta: { + // Tool customization: replace unioned types with a custom, common base + /** The details of the run step. */ + @extension("x-oaiExpandable", true) + step_details?: RunStepDeltaStepDetails; + }; +} + +// Tool customization: apply custom, common base type to union items +/** Details of the message creation by the run step. */ +model RunStepDeltaStepDetailsMessageCreationObject + extends RunStepDeltaStepDetails { + @doc(""" + Always `message_creation`. + """) + type: "message_creation"; + + message_creation?: { + /** The ID of the message that was created by this run step. */ + message_id?: string; + }; +} + +// Tool customization: apply custom, common base type to union items +/** Details of the tool call. */ +model RunStepDeltaStepDetailsToolCallsObject extends RunStepDeltaStepDetails { + @doc(""" + Always `tool_calls`.
+ """) + type: "tool_calls"; + + // Tool customization: replace unioned types with a custom, common base + @doc(""" + An array of tool calls the run step was involved in. These can be associated with one of three types of tools: `code_interpreter`, `file_search`, or `function`. + """) + @extension("x-oaiExpandable", true) + tool_calls?: RunStepDeltaStepDetailsToolCallsObjectToolCallsObject[]; +} + +// Tool customization: apply custom, common base type to union items +/** Details of the Code Interpreter tool call the run step was involved in. */ +model RunStepDeltaStepDetailsToolCallsCodeObject + extends RunStepDeltaStepDetailsToolCallsObjectToolCallsObject { + /** The index of the tool call in the tool calls array. */ + index: int32; + + /** The ID of the tool call. */ + id?: string; + + @doc(""" + The type of tool call. This is always going to be `code_interpreter` for this type of tool call. + """) + type: "code_interpreter"; + + /** The Code Interpreter tool call definition. */ + code_interpreter?: { + /** The input to the Code Interpreter tool call. */ + input?: string; + + // Tool customization: replace unioned types with a custom, common base + @doc(""" + The outputs from the Code Interpreter tool call. Code Interpreter can output one or more items, including text (`logs`) or images (`image`). Each of these is represented by a different object type. + """) + @extension("x-oaiExpandable", true) + outputs?: RunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreterOutputsObject[]; + }; +} + +// Tool customization: apply custom, common base type to union items +/** Text output from the Code Interpreter tool call as part of a run step. */ +model RunStepDeltaStepDetailsToolCallsCodeOutputLogsObject + extends RunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreterOutputsObject { + /** The index of the output in the outputs array. */ + index: int32; + + @doc(""" + Always `logs`. + """) + type: "logs"; + + /** The text output from the Code Interpreter tool call.
*/ + logs?: string; +} + +// Tool customization: apply custom, common base type to union items +model RunStepDeltaStepDetailsToolCallsCodeOutputImageObject + extends RunStepDeltaStepDetailsToolCallsCodeObjectCodeInterpreterOutputsObject { + /** The index of the output in the outputs array. */ + index: int32; + + @doc(""" + Always `image`. + """) + type: "image"; + + image?: { + /** The [file](/docs/api-reference/files) ID of the image. */ + file_id?: string; + }; +} + +// Tool customization: apply custom, common base type to union items +model RunStepDeltaStepDetailsToolCallsFileSearchObject + extends RunStepDeltaStepDetailsToolCallsObjectToolCallsObject { + /** The index of the tool call in the tool calls array. */ + index: int32; + + /** The ID of the tool call object. */ + id?: string; + + @doc(""" + The type of tool call. This is always going to be `file_search` for this type of tool call. + """) + type: "file_search"; + + /** For now, this is always going to be an empty object. */ + @extension("x-oaiTypeLabel", "map") + file_search: Record; +} + +// Tool customization: apply custom, common base type to union items +model RunStepDeltaStepDetailsToolCallsFunctionObject + extends RunStepDeltaStepDetailsToolCallsObjectToolCallsObject { + /** The index of the tool call in the tool calls array. */ + index: int32; + + /** The ID of the tool call object. */ + id?: string; + + @doc(""" + The type of tool call. This is always going to be `function` for this type of tool call. + """) + type: "function"; + + /** The definition of the function that was called. */ + function?: { + /** The name of the function. */ + name?: string; + + /** The arguments passed to the function. */ + arguments?: string; + + @doc(""" + The output of the function. This will be `null` if the outputs have not been [submitted](/docs/api-reference/runs/submitToolOutputs) yet. 
+ """) + output?: string | null; + }; +} diff --git a/.typespec/runs/operations.tsp b/.typespec/runs/operations.tsp new file mode 100644 index 000000000..18cd5498d --- /dev/null +++ b/.typespec/runs/operations.tsp @@ -0,0 +1,185 @@ +import "@typespec/http"; +import "@typespec/openapi"; + +import "../common"; +import "./models.tsp"; + +using TypeSpec.Http; +using TypeSpec.OpenAPI; + +namespace OpenAI; + +@route("/v1/threads") +interface Runs { + @route("runs") + @post + @operationId("createThreadAndRun") + @tag("Assistants") + @summary("Create a thread and run it in one request.") + createThreadAndRun( + @body requestBody: CreateThreadAndRunRequest, + ): RunObject | ErrorResponse; + + @route("{thread_id}/runs") + @post + @operationId("createRun") + @tag("Assistants") + @summary("Create a run.") + createRun( + /** The ID of the thread to run. */ + @path thread_id: string, + + @body requestBody: CreateRunRequest, + ): RunObject | ErrorResponse; + + @route("{thread_id}/runs") + @get + @operationId("listRuns") + @tag("Assistants") + @summary("Returns a list of runs belonging to a thread.") + listRuns( + /** The ID of the thread the run belongs to. */ + @path thread_id: string, + + /** + * A limit on the number of objects to be returned. Limit can range between 1 and 100, and the + * default is 20. + */ + @query limit?: int32 = 20, + + /** + * Sort order by the `created_at` timestamp of the objects. `asc` for ascending order and `desc` + * for descending order. + */ + @query order?: ListOrder = ListOrder.desc, + + /** + * A cursor for use in pagination. `after` is an object ID that defines your place in the list. + * For instance, if you make a list request and receive 100 objects, ending with obj_foo, your + * subsequent call can include after=obj_foo in order to fetch the next page of the list. + */ + @query after?: string, + + /** + * A cursor for use in pagination. `before` is an object ID that defines your place in the list.
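The `after`/`before` cursor pagination documented on these list operations follows a common pattern: pass the `last_id` of one page as `after` to fetch the next, until `has_more` is false. The sketch below is illustrative only and not part of the spec; the `fetch_page` callable and the fake `pages` dictionary stand in for HTTP calls to an endpoint such as `listRuns`.

```python
def list_all(fetch_page, limit=20):
    """Yield every item by following `after` cursors until `has_more` is false."""
    after = None
    while True:
        page = fetch_page(limit=limit, after=after)
        yield from page["data"]
        if not page["has_more"]:
            break
        after = page["last_id"]


# Fake two-page endpoint standing in for GET /v1/threads/{thread_id}/runs.
pages = {
    None: {"data": ["run_1", "run_2"], "last_id": "run_2", "has_more": True},
    "run_2": {"data": ["run_3"], "last_id": "run_3", "has_more": False},
}
runs = list(list_all(lambda limit, after: pages[after]))
print(runs)  # ['run_1', 'run_2', 'run_3']
```

The same loop works for `listRunSteps` and the other paginated list operations, since they share the `data`/`first_id`/`last_id`/`has_more` response shape.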
+ * For instance, if you make a list request and receive 100 objects, ending with obj_foo, your + * subsequent call can include before=obj_foo in order to fetch the previous page of the list. + */ + @query before?: string, + ): ListRunsResponse | ErrorResponse; + + @route("{thread_id}/runs/{run_id}") + @get + @operationId("getRun") + @tag("Assistants") + @summary("Retrieves a run.") + getRun( + /** The ID of the [thread](/docs/api-reference/threads) that was run. */ + @path thread_id: string, + + /** The ID of the run to retrieve. */ + @path run_id: string, + ): RunObject | ErrorResponse; + + @route("{thread_id}/runs/{run_id}") + @post + @operationId("modifyRun") + @tag("Assistants") + @summary("Modifies a run.") + modifyRun( + /** The ID of the [thread](/docs/api-reference/threads) that was run. */ + @path thread_id: string, + + /** The ID of the run to modify. */ + @path run_id: string, + + @body requestBody: ModifyRunRequest, + ): RunObject | ErrorResponse; + + @route("{thread_id}/runs/{run_id}/cancel") + @post + @operationId("cancelRun") + @tag("Assistants") + @summary("Cancels a run that is `in_progress`.") + cancelRun( + /** The ID of the thread to which this run belongs. */ + @path thread_id: string, + + /** The ID of the run to cancel. */ + @path run_id: string, + ): RunObject | ErrorResponse; + + @route("{thread_id}/runs/{run_id}/submit_tool_outputs") + @post + @operationId("submitToolOutputsToRun") + @tag("Assistants") + @summary(""" + When a run has the `status: "requires_action"` and `required_action.type` is + `submit_tool_outputs`, this endpoint can be used to submit the outputs from the tool calls once + they're all completed. All outputs must be submitted in a single request. + """) + submitToolOutputsToRun( + /** The ID of the [thread](/docs/api-reference/threads) to which this run belongs. */ + @path thread_id: string, + + /** The ID of the run that requires the tool output submission. 
*/ + @path run_id: string, + + @body requestBody: SubmitToolOutputsRunRequest, + ): RunObject | ErrorResponse; + + @route("{thread_id}/runs/{run_id}/steps") + @get + @operationId("listRunSteps") + @tag("Assistants") + @summary("Returns a list of run steps belonging to a run.") + listRunSteps( + /** The ID of the thread the run and run steps belong to. */ + @path thread_id: string, + + /** The ID of the run the run steps belong to. */ + @path run_id: string, + + /** + * A limit on the number of objects to be returned. Limit can range between 1 and 100, and the + * default is 20. + */ + @query limit?: int32 = 20, + + /** + * Sort order by the `created_at` timestamp of the objects. `asc` for ascending order and `desc` + * for descending order. + */ + @query order?: ListOrder = ListOrder.desc, + + /** + * A cursor for use in pagination. `after` is an object ID that defines your place in the list. + * For instance, if you make a list request and receive 100 objects, ending with obj_foo, your + * subsequent call can include after=obj_foo in order to fetch the next page of the list. + */ + @query after?: string, + + /** + * A cursor for use in pagination. `before` is an object ID that defines your place in the list. + * For instance, if you make a list request and receive 100 objects, ending with obj_foo, your + * subsequent call can include before=obj_foo in order to fetch the previous page of the list. + */ + @query before?: string, + ): ListRunStepsResponse | ErrorResponse; + + @route("{thread_id}/runs/{run_id}/steps/{step_id}") + @get + @operationId("getRunStep") + @tag("Assistants") + @summary("Retrieves a run step.") + getRunStep( + /** The ID of the thread to which the run and run step belong. */ + @path thread_id: string, + + /** The ID of the run to which the run step belongs. */ + @path run_id: string, + + /** The ID of the run step to retrieve.
*/ + @path step_id: string, + ): RunStepObject | ErrorResponse; +} diff --git a/.typespec/streaming/models.tsp b/.typespec/streaming/models.tsp new file mode 100644 index 000000000..ef8144ce4 --- /dev/null +++ b/.typespec/streaming/models.tsp @@ -0,0 +1,159 @@ +/* + * This file was automatically generated from an OpenAPI .yaml file. + * Edits made directly to this file will be lost. + */ + +import "../common"; +import "../messages"; +import "../runs"; +import "../threads"; + +using TypeSpec.OpenAPI; + +namespace OpenAI; + +@doc(""" + Represents an event emitted when streaming a Run. + + Each event in a server-sent events stream has an `event` and `data` property: + + ``` + event: thread.created + data: {"id": "thread_123", "object": "thread", ...} + ``` + + We emit events whenever a new object is created, transitions to a new state, or is being + streamed in parts (deltas). For example, we emit `thread.run.created` when a new run + is created, `thread.run.completed` when a run completes, and so on. When an Assistant chooses + to create a message during a run, we emit a `thread.message.created` event, a + `thread.message.in_progress` event, many `thread.message.delta` events, and finally a + `thread.message.completed` event. + + We may add additional events over time, so we recommend handling unknown events gracefully + in your code. See the [Assistants API quickstart](/docs/assistants/overview) to learn how to + integrate the Assistants API with streaming.
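The `event:`/`data:` framing described in the doc comment above can be consumed with a small parser. The sketch below is a simplified, assumed implementation for illustration only; real server-sent events also permit comment lines, `id:`, and `retry:` fields, which are omitted here.

```python
def parse_sse(stream_text):
    """Parse a server-sent events payload into (event, data) pairs."""
    events = []
    event_name, data_lines = None, []
    for line in stream_text.splitlines():
        if line.startswith("event:"):
            event_name = line[len("event:"):].strip()
        elif line.startswith("data:"):
            data_lines.append(line[len("data:"):].strip())
        elif line == "" and event_name is not None:
            # Blank line terminates one event.
            events.append((event_name, "\n".join(data_lines)))
            event_name, data_lines = None, []
    if event_name is not None:  # flush a trailing, unterminated event
        events.append((event_name, "\n".join(data_lines)))
    return events


sample = (
    "event: thread.created\n"
    'data: {"id": "thread_123", "object": "thread"}\n'
    "\n"
    "event: done\n"
    "data: [DONE]\n"
)
parsed = parse_sse(sample)
print(parsed[0][0])  # thread.created
```

As the doc comment recommends, a robust client would treat unknown `event` names as a no-op rather than an error.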
+ """) +union AssistantStreamEvent { + ThreadStreamEvent, + RunStreamEvent, + RunStepStreamEvent, + MessageStreamEvent, + ErrorEvent, + DoneEvent, +} + +union ThreadStreamEvent { + { + event: "thread.created", + data: ThreadObject, + }, +} + +union RunStreamEvent { + { + event: "thread.run.created", + data: RunObject, + }, + { + event: "thread.run.queued", + data: RunObject, + }, + { + event: "thread.run.in_progress", + data: RunObject, + }, + { + event: "thread.run.requires_action", + data: RunObject, + }, + { + event: "thread.run.completed", + data: RunObject, + }, + { + event: "thread.run.incomplete", + data: RunObject, + }, + { + event: "thread.run.failed", + data: RunObject, + }, + { + event: "thread.run.cancelling", + data: RunObject, + }, + { + event: "thread.run.cancelled", + data: RunObject, + }, + { + event: "thread.run.expired", + data: RunObject, + }, +} + +union RunStepStreamEvent { + { + event: "thread.run.step.created", + data: RunStepObject, + }, + { + event: "thread.run.step.in_progress", + data: RunStepObject, + }, + { + event: "thread.run.step.delta", + data: RunStepDeltaObject, + }, + { + event: "thread.run.step.completed", + data: RunStepObject, + }, + { + event: "thread.run.step.failed", + data: RunStepObject, + }, + { + event: "thread.run.step.cancelled", + data: RunStepObject, + }, + { + event: "thread.run.step.expired", + data: RunStepObject, + }, +} + +union MessageStreamEvent { + { + event: "thread.message.created", + data: MessageObject, + }, + { + event: "thread.message.in_progress", + data: MessageObject, + }, + { + event: "thread.message.delta", + data: MessageDeltaObject, + }, + { + event: "thread.message.completed", + data: MessageObject, + }, + { + event: "thread.message.incomplete", + data: MessageObject, + }, +} + +/** Occurs when an [error](/docs/guides/error-codes/api-errors) occurs. This can happen due to an internal server error or a timeout. 
*/ +model ErrorEvent { + event: "error"; + data: Error; +} + +/** Occurs when a stream ends. */ +model DoneEvent { + event: "done"; + data: "[DONE]"; +} diff --git a/.typespec/threads/client.tsp b/.typespec/threads/client.tsp new file mode 100644 index 000000000..b1519d63c --- /dev/null +++ b/.typespec/threads/client.tsp @@ -0,0 +1,12 @@ +import "@azure-tools/typespec-client-generator-core"; +import "./custom.tsp"; +import "./models.tsp"; + +using Azure.ClientGenerator.Core; +using OpenAI; + +@@access(ListThreadsResponse, Access.public); +@@usage(ListThreadsResponse, Usage.output); + +@@access(CreateThreadRequestToolResourcesFileSearchBase, Access.public); +@@usage(CreateThreadRequestToolResourcesFileSearchBase, Usage.input); diff --git a/.typespec/threads/custom.tsp b/.typespec/threads/custom.tsp new file mode 100644 index 000000000..256b16f67 --- /dev/null +++ b/.typespec/threads/custom.tsp @@ -0,0 +1,52 @@ +using TypeSpec.OpenAPI; + +namespace OpenAI; + +// This customization allows us to concretely specify that the file_search object must provide +// either ID references --or-- in-line creation helpers, but not both. + +alias CreateThreadRequestToolResourcesFileSearch = CreateThreadRequestToolResourcesFileSearchVectorStoreIdReferences | CreateThreadRequestToolResourcesFileSearchVectorStoreCreationHelpers; + +model CreateThreadRequestToolResourcesFileSearchVectorStoreIdReferences { + ...CreateThreadRequestToolResourcesFileSearchBase; + + /** + * The [vector store](/docs/api-reference/vector-stores/object) attached to this thread. + * There can be a maximum of 1 vector store attached to the thread. + */ + @maxItems(1) + vector_store_ids?: string[]; +} + +model CreateThreadRequestToolResourcesFileSearchVectorStoreCreationHelpers { + ...CreateThreadRequestToolResourcesFileSearchBase; + + /** + * A helper to create a [vector store](/docs/api-reference/vector-stores/object) with + * file_ids and attach it to this thread. 
There can be a maximum of 1 vector store attached + * to the thread. + */ + @maxItems(1) + vector_stores?: CreateThreadRequestToolResourcesFileSearchVectorStoreCreationHelper[]; +} + +model CreateThreadRequestToolResourcesFileSearchBase { + // Common fields (currently none) +} + +alias CreateThreadRequestToolResourcesFileSearchVectorStoreCreationHelper = { + /** + * A list of [file](/docs/api-reference/files) IDs to add to the vector store. There can be + * a maximum of 10000 files in a vector store. + */ + @maxItems(10000) + file_ids?: string[]; + + /** + * Set of 16 key-value pairs that can be attached to a vector store. This can be useful for + * storing additional information about the vector store in a structured format. Keys can + * be a maximum of 64 characters long and values can be a maximum of 512 characters long. + */ + @extension("x-oaiTypeLabel", "map") + metadata?: Record; +}; diff --git a/.typespec/threads/main.tsp b/.typespec/threads/main.tsp new file mode 100644 index 000000000..e7af5325f --- /dev/null +++ b/.typespec/threads/main.tsp @@ -0,0 +1,2 @@ +import "./client.tsp"; +import "./operations.tsp"; diff --git a/.typespec/threads/models.tsp b/.typespec/threads/models.tsp new file mode 100644 index 000000000..aad860a05 --- /dev/null +++ b/.typespec/threads/models.tsp @@ -0,0 +1,112 @@ +/* + * This file was automatically generated from an OpenAPI .yaml file. + * Edits made directly to this file will be lost. + */ + +import "../messages"; +import "./custom.tsp"; + +using TypeSpec.OpenAPI; + +namespace OpenAI; + +model CreateThreadRequest { + /** A list of [messages](/docs/api-reference/messages) to start the thread with. */ + messages?: CreateMessageRequest[]; + + @doc(""" + A set of resources that are made available to the assistant's tools in this thread. The resources are specific to the type of tool. For example, the `code_interpreter` tool requires a list of file IDs, while the `file_search` tool requires a list of vector store IDs.
+ """) + tool_resources?: { + code_interpreter?: { + @doc(""" + A list of [file](/docs/api-reference/files) IDs made available to the `code_interpreter` tool. There can be a maximum of 20 files associated with the tool. + """) + @maxItems(20) + file_ids?: string[] = #[]; + }; + + // Tool customization: use custom type for sophisticated union + file_search?: ToolResourcesFileSearch; + } | null; + + /** Set of 16 key-value pairs that can be attached to an object. This can be useful for storing additional information about the object in a structured format. Keys can be a maximum of 64 characters long and values can be a maximum of 512 characters long. */ + @extension("x-oaiTypeLabel", "map") + metadata?: Record | null; +} + +model ModifyThreadRequest { + @doc(""" + A set of resources that are made available to the assistant's tools in this thread. The resources are specific to the type of tool. For example, the `code_interpreter` tool requires a list of file IDs, while the `file_search` tool requires a list of vector store IDs. + """) + tool_resources?: { + code_interpreter?: { + @doc(""" + A list of [file](/docs/api-reference/files) IDs made available to the `code_interpreter` tool. There can be a maximum of 20 files associated with the tool. + """) + @maxItems(20) + file_ids?: string[] = #[]; + }; + + // Tool customization: use custom type for sophisticated union + file_search?: ToolResourcesFileSearchIdsOnly; + } | null; + + /** Set of 16 key-value pairs that can be attached to an object. This can be useful for storing additional information about the object in a structured format. Keys can be a maximum of 64 characters long and values can be a maximum of 512 characters long. */ + @extension("x-oaiTypeLabel", "map") + metadata?: Record | null; +} + +model DeleteThreadResponse { + id: string; + deleted: boolean; + object: "thread.deleted"; +} + +/** Represents a thread that contains [messages](/docs/api-reference/messages).
*/ +model ThreadObject { + /** The identifier, which can be referenced in API endpoints. */ + id: string; + + @doc(""" + The object type, which is always `thread`. + """) + object: "thread"; + + // Tool customization: 'created' and fields ending in '_at' are Unix encoded utcDateTime + /** The Unix timestamp (in seconds) for when the thread was created. */ + @encode("unixTimestamp", int32) + created_at: utcDateTime; + + @doc(""" + A set of resources that are made available to the assistant's tools in this thread. The resources are specific to the type of tool. For example, the `code_interpreter` tool requires a list of file IDs, while the `file_search` tool requires a list of vector store IDs. + """) + tool_resources: { + code_interpreter?: { + @doc(""" + A list of [file](/docs/api-reference/files) IDs made available to the `code_interpreter` tool. There can be a maximum of 20 files associated with the tool. + """) + @maxItems(20) + file_ids?: string[] = #[]; + }; + file_search?: { + /** The [vector store](/docs/api-reference/vector-stores/object) attached to this thread. There can be a maximum of 1 vector store attached to the thread. */ + @maxItems(1) + vector_store_ids?: string[]; + }; + } | null; + + /** Set of 16 key-value pairs that can be attached to an object. This can be useful for storing additional information about the object in a structured format. Keys can be a maximum of 64 characters long and values can be a maximum of 512 characters long.
*/ + @extension("x-oaiTypeLabel", "map") + metadata: Record | null; +} + +model ListThreadsResponse { + // Tool customization: add a clear enum enforcement of constrained 'object' label + object: "list"; + + data: ThreadObject[]; + first_id: string; + last_id: string; + has_more: boolean; +} diff --git a/.typespec/threads/operations.tsp b/.typespec/threads/operations.tsp new file mode 100644 index 000000000..00dffe377 --- /dev/null +++ b/.typespec/threads/operations.tsp @@ -0,0 +1,53 @@ +import "@typespec/http"; +import "@typespec/openapi"; + +import "../common"; +import "./models.tsp"; + +using TypeSpec.Http; +using TypeSpec.OpenAPI; + +namespace OpenAI; + +@route("/v1/threads") +interface Threads { + @post + @operationId("createThread") + @tag("Assistants") + @summary("Create a thread.") + createThread( + @body requestBody: CreateThreadRequest, + ): ThreadObject | ErrorResponse; + + @route("{thread_id}") + @get + @operationId("getThread") + @tag("Assistants") + @summary("Retrieves a thread.") + getThread( + /** The ID of the thread to retrieve. */ + @path thread_id: string, + ): ThreadObject | ErrorResponse; + + @route("{thread_id}") + @post + @operationId("modifyThread") + @tag("Assistants") + @summary("Modifies a thread.") + modifyThread( + /** The ID of the thread to modify. Only the `metadata` can be modified. */ + @path thread_id: string, + + @body requestBody: ModifyThreadRequest, + ): ThreadObject | ErrorResponse; + + @route("{thread_id}") + @delete + @operationId("deleteThread") + @tag("Assistants") + @summary("Delete a thread.") + deleteThread( + /** The ID of the thread to delete. 
*/ + @path thread_id: string, + ): DeleteThreadResponse | ErrorResponse; +} diff --git a/.typespec/tspconfig.yaml b/.typespec/tspconfig.yaml new file mode 100644 index 000000000..97d577288 --- /dev/null +++ b/.typespec/tspconfig.yaml @@ -0,0 +1,11 @@ +emit: + - "@typespec/openapi3" + - "@azure-tools/typespec-csharp" +options: + "@typespec/openapi3": + output-file: "{project-root}/../.openapi3/openapi3-openai.yaml" + "@azure-tools/typespec-csharp": + emitter-output-dir: "{project-root}/../.dotnet/src" + unreferenced-types-handling: keepAll + disable-xml-docs: true + enable-internal-raw-data: true diff --git a/.typespec/uploads/main.tsp b/.typespec/uploads/main.tsp new file mode 100644 index 000000000..144c4aeaf --- /dev/null +++ b/.typespec/uploads/main.tsp @@ -0,0 +1 @@ +import "./operations.tsp"; diff --git a/.typespec/uploads/models.tsp b/.typespec/uploads/models.tsp new file mode 100644 index 000000000..13fe13989 --- /dev/null +++ b/.typespec/uploads/models.tsp @@ -0,0 +1,100 @@ +/* + * This file was automatically generated from an OpenAPI .yaml file. + * Edits made directly to this file will be lost. + */ + +import "../files"; + +using TypeSpec.OpenAPI; + +namespace OpenAI; + +model CreateUploadRequest { + /** The name of the file to upload. */ + filename: string; + + /** + * The intended purpose of the uploaded file. + * + * See the [documentation on File purposes](/docs/api-reference/files/create#files-create-purpose). + */ + purpose: "assistants" | "batch" | "fine-tune" | "vision"; + + /** The number of bytes in the file you are uploading. */ + bytes: int32; + + /** + * The MIME type of the file. + * + * This must fall within the supported MIME types for your file purpose. See the supported MIME types for assistants and vision. + */ + mime_type: string; +} + +model AddUploadPartRequest { + /** The chunk of bytes for this Part. */ + data: bytes; +} + +model CompleteUploadRequest { + /** The ordered list of Part IDs. 
*/ + part_ids: string[]; + + /** The optional md5 checksum for the file contents to verify that the bytes uploaded match what you expect. */ + md5?: string; +} + +alias CancelUploadRequest = unknown; + +/** The Upload object can accept byte chunks in the form of Parts. */ +model Upload { + /** The Upload unique identifier, which can be referenced in API endpoints. */ + id: string; + + // Tool customization: 'created' and fields ending in '_at' are Unix encoded utcDateTime + /** The Unix timestamp (in seconds) for when the Upload was created. */ + @encode("unixTimestamp", int32) + created_at: utcDateTime; + + /** The name of the file to be uploaded. */ + filename: string; + + /** The intended number of bytes to be uploaded. */ + bytes: int32; + + /** The intended purpose of the file. [Please refer here](/docs/api-reference/files/object#files/object-purpose) for acceptable values. */ + purpose: string; + + /** The status of the Upload. */ + status: "pending" | "completed" | "cancelled" | "expired"; + + // Tool customization: 'created' and fields ending in '_at' are Unix encoded utcDateTime + /** The Unix timestamp (in seconds) for when the Upload will expire. */ + @encode("unixTimestamp", int32) + expires_at: utcDateTime; + + /** The object type, which is always "upload". */ + object?: "upload"; + + /** The ready File object after the Upload is completed. */ + file?: OpenAIFile | null; +} + +/** The upload Part represents a chunk of bytes we can add to an Upload object. */ +model UploadPart { + /** The upload Part unique identifier, which can be referenced in API endpoints. */ + id: string; + + // Tool customization: 'created' and fields ending in '_at' are Unix encoded utcDateTime + /** The Unix timestamp (in seconds) for when the Part was created. */ + @encode("unixTimestamp", int32) + created_at: utcDateTime; + + /** The ID of the Upload object that this Part was added to. */ + upload_id: string; + + @doc(""" + The object type, which is always `upload.part`.
+ """) + object: "upload.part"; +} diff --git a/.typespec/uploads/operations.tsp b/.typespec/uploads/operations.tsp new file mode 100644 index 000000000..eabef6ea2 --- /dev/null +++ b/.typespec/uploads/operations.tsp @@ -0,0 +1,70 @@ +import "@typespec/http"; +import "@typespec/openapi"; + +import "../common"; +import "./models.tsp"; + +using TypeSpec.Http; +using TypeSpec.OpenAPI; + +namespace OpenAI; + +@route("/v1/uploads") +interface Uploads { + @post + @operationId("createUpload") + @tag("Uploads") + @summary(""" + Creates an intermediate [Upload](/docs/api-reference/uploads/object) object that you can add [Parts](/docs/api-reference/uploads/part-object) to. Currently, an Upload can accept at most 8 GB in total and expires an hour after you create it. + + Once you complete the Upload, we will create a [File](/docs/api-reference/files/object) object that contains all the parts you uploaded. This File is usable in the rest of our platform as a regular File object. + + For certain `purpose`s, the correct `mime_type` must be specified. Please refer to documentation for the supported MIME types for your use case: + - [Assistants](/docs/assistants/tools/file-search/supported-files) + + For guidance on the proper filename extensions for each purpose, please follow the documentation on [creating a File](/docs/api-reference/files/create). + """) + createUpload(@body requestBody: CreateUploadRequest): Upload | ErrorResponse; + + @route("{upload_id}/parts") + @post + @operationId("addUploadPart") + @tag("Uploads") + @summary(""" + Adds a [Part](/docs/api-reference/uploads/part-object) to an [Upload](/docs/api-reference/uploads/object) object. A Part represents a chunk of bytes from the file you are trying to upload. + + Each Part can be at most 64 MB, and you can add Parts until you hit the Upload maximum of 8 GB. + + It is possible to add multiple Parts in parallel.
You can decide the intended order of the Parts when you [complete the Upload](/docs/api-reference/uploads/complete). + """) + addUploadPart( + @path upload_id: string, + @header contentType: "multipart/form-data", + @body requestBody: AddUploadPartRequest, + ): UploadPart | ErrorResponse; + + @route("{upload_id}/complete") + @post + @operationId("completeUpload") + @tag("Uploads") + @summary(""" + Completes the [Upload](/docs/api-reference/uploads/object). + + Within the returned Upload object, there is a nested [File](/docs/api-reference/files/object) object that is ready to use in the rest of the platform. + + You can specify the order of the Parts by passing in an ordered list of the Part IDs. + + The number of bytes uploaded upon completion must match the number of bytes initially specified when creating the Upload object. No Parts may be added after an Upload is completed. + """) + completeUpload( + @path upload_id: string, + @body requestBody: CompleteUploadRequest, + ): Upload | ErrorResponse; + + @route("{upload_id}/cancel") + @post + @operationId("cancelUpload") + @tag("Uploads") + @summary("Cancels the Upload. No Parts may be added after an Upload is cancelled.") + cancelUpload(@path upload_id: string): Upload | ErrorResponse; +} diff --git a/.typespec/vector-stores/client.tsp b/.typespec/vector-stores/client.tsp new file mode 100644 index 000000000..81c3d5c86 --- /dev/null +++ b/.typespec/vector-stores/client.tsp @@ -0,0 +1,13 @@ +import "@azure-tools/typespec-client-generator-core"; +import "./custom.tsp"; +import "./models.tsp"; + +using Azure.ClientGenerator.Core; +using OpenAI; + +@@access(FileChunkingStrategyResponseParam, Access.public); +@@usage(FileChunkingStrategyResponseParam, Usage.input | Usage.output); + +// TODO: Auto should only be an input. 
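The Upload limits described in the operations above (64 MB per Part, 8 GB per Upload) imply a simple client-side chunking step before calling `addUploadPart` and `completeUpload`. The sketch below is illustrative only: the constants are scaled down so the example is cheap to run, and no HTTP calls are made; a real client would POST each chunk to `/v1/uploads/{upload_id}/parts` and then send the ordered Part IDs to `/v1/uploads/{upload_id}/complete`.

```python
PART_SIZE = 64      # stands in for the 64 MB per-Part limit
MAX_UPLOAD = 8 * 64  # stands in for the 8 GB per-Upload limit


def split_into_parts(data: bytes, part_size: int = PART_SIZE) -> list[bytes]:
    """Return ordered chunks of at most `part_size` bytes each."""
    if len(data) > MAX_UPLOAD:
        raise ValueError("upload exceeds maximum size")
    return [data[i:i + part_size] for i in range(0, len(data), part_size)]


# 150 bytes splits into two full parts and one remainder part.
parts = split_into_parts(b"x" * 150)
print([len(p) for p in parts])  # [64, 64, 22]
```

Because Parts may be uploaded in parallel, the list order here, not upload completion order, is what determines the byte order passed to `completeUpload`.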
+@@access(AutoChunkingStrategyResponseParam, Access.public); +@@usage(AutoChunkingStrategyResponseParam, Usage.input | Usage.output); diff --git a/.typespec/vector-stores/custom.tsp b/.typespec/vector-stores/custom.tsp new file mode 100644 index 000000000..29877d183 --- /dev/null +++ b/.typespec/vector-stores/custom.tsp @@ -0,0 +1,26 @@ +using TypeSpec.OpenAPI; + +namespace OpenAI; + +union ListVectorStoreFilesFilter { + string, + "in_progress", + "completed", + "failed", + "cancelled", +} + +@discriminator("type") +model FileChunkingStrategyRequestParam { + type: string; +} + +@discriminator("type") +model FileChunkingStrategyResponseParam { + type: string; +} + +model AutoChunkingStrategyResponseParam + extends FileChunkingStrategyResponseParam { + type: "auto"; +} diff --git a/.typespec/vector-stores/main.tsp b/.typespec/vector-stores/main.tsp new file mode 100644 index 000000000..e7af5325f --- /dev/null +++ b/.typespec/vector-stores/main.tsp @@ -0,0 +1,2 @@ +import "./client.tsp"; +import "./operations.tsp"; diff --git a/.typespec/vector-stores/models.tsp b/.typespec/vector-stores/models.tsp new file mode 100644 index 000000000..c24a9f21f --- /dev/null +++ b/.typespec/vector-stores/models.tsp @@ -0,0 +1,327 @@ +/* + * This file was automatically generated from an OpenAPI .yaml file. + * Edits made directly to this file will be lost. + */ + +import "./custom.tsp"; + +using TypeSpec.OpenAPI; + +namespace OpenAI; + +/** The expiration policy for a vector store. */ +model VectorStoreExpirationAfter { + @doc(""" + Anchor timestamp after which the expiration policy applies. Supported anchors: `last_active_at`. + """) + anchor: "last_active_at"; + + /** The number of days after the anchor time that the vector store will expire. */ + @minValue(1) + @maxValue(365) + days: int32; +} + +@doc(""" + A vector store is a collection of processed files that can be used by the `file_search` tool.
+ """) +model VectorStoreObject { + /** The identifier, which can be referenced in API endpoints. */ + id: string; + + @doc(""" + The object type, which is always `vector_store`. + """) + object: "vector_store"; + + // Tool customization: 'created' and fields ending in '_at' are Unix encoded utcDateTime + /** The Unix timestamp (in seconds) for when the vector store was created. */ + @encode("unixTimestamp", int32) + created_at: utcDateTime; + + /** The name of the vector store. */ + name: string; + + /** The total number of bytes used by the files in the vector store. */ + usage_bytes: int32; + + file_counts: { + /** The number of files that are currently being processed. */ + in_progress: int32; + + /** The number of files that have been successfully processed. */ + completed: int32; + + /** The number of files that have failed to process. */ + failed: int32; + + /** The number of files that were cancelled. */ + cancelled: int32; + + /** The total number of files. */ + total: int32; + }; + + @doc(""" + The status of the vector store, which can be either `expired`, `in_progress`, or `completed`. A status of `completed` indicates that the vector store is ready for use. + """) + status: "expired" | "in_progress" | "completed"; + + expires_after?: VectorStoreExpirationAfter; + + // Tool customization: 'created' and fields ending in '_at' are Unix encoded utcDateTime + /** The Unix timestamp (in seconds) for when the vector store will expire. */ + @encode("unixTimestamp", int32) + expires_at?: utcDateTime | null; + + // Tool customization: 'created' and fields ending in '_at' are Unix encoded utcDateTime + /** The Unix timestamp (in seconds) for when the vector store was last active. */ + @encode("unixTimestamp", int32) + last_active_at: utcDateTime | null; + + /** Set of 16 key-value pairs that can be attached to an object. This can be useful for storing additional information about the object in a structured format. 
Keys can be a maximum of 64 characters long and values can be a maximum of 512 characters long. */ + @extension("x-oaiTypeLabel", "map") + metadata: Record | null; +} + +model CreateVectorStoreRequest { + @doc(""" + A list of [File](/docs/api-reference/files) IDs that the vector store should use. Useful for tools like `file_search` that can access files. + """) + @maxItems(500) + file_ids?: string[]; + + /** The name of the vector store. */ + name?: string; + + expires_after?: VectorStoreExpirationAfter; + + @doc(""" + The chunking strategy used to chunk the file(s). If not set, will use the `auto` strategy. Only applicable if `file_ids` is non-empty. + """) + @extension("x-oaiExpandable", true) + chunking_strategy?: AutoChunkingStrategyRequestParam | StaticChunkingStrategyRequestParam; + + /** Set of 16 key-value pairs that can be attached to an object. This can be useful for storing additional information about the object in a structured format. Keys can be a maximum of 64 characters long and values can be a maximum of 512 characters long. */ + @extension("x-oaiTypeLabel", "map") + metadata?: Record | null; +} + +model UpdateVectorStoreRequest { + /** The name of the vector store. */ + name?: string | null; + + expires_after?: VectorStoreExpirationAfter | null; + + /** Set of 16 key-value pairs that can be attached to an object. This can be useful for storing additional information about the object in a structured format. Keys can be a maximum of 64 characters long and values can be a maximum of 512 characters long.
*/ + @extension("x-oaiTypeLabel", "map") + metadata?: Record | null; +} + +model ListVectorStoresResponse { + // Tool customization: add a clear enum enforcement of constrained 'object' label + object: "list"; + + data: VectorStoreObject[]; + first_id: string; + last_id: string; + has_more: boolean; +} + +model DeleteVectorStoreResponse { + id: string; + deleted: boolean; + object: "vector_store.deleted"; +} + +/** A list of files attached to a vector store. */ +model VectorStoreFileObject { + /** The identifier, which can be referenced in API endpoints. */ + id: string; + + @doc(""" + The object type, which is always `vector_store.file`. + """) + object: "vector_store.file"; + + /** The total vector store usage in bytes. Note that this may be different from the original file size. */ + usage_bytes: int32; + + // Tool customization: 'created' and fields ending in '_at' are Unix encoded utcDateTime + /** The Unix timestamp (in seconds) for when the vector store file was created. */ + @encode("unixTimestamp", int32) + created_at: utcDateTime; + + /** The ID of the [vector store](/docs/api-reference/vector-stores/object) that the [File](/docs/api-reference/files) is attached to. */ + vector_store_id: string; + + @doc(""" + The status of the vector store file, which can be either `in_progress`, `completed`, `cancelled`, or `failed`. The status `completed` indicates that the vector store file is ready for use. + """) + status: "in_progress" | "completed" | "cancelled" | "failed"; + + @doc(""" + The last error associated with this vector store file. Will be `null` if there are no errors. + """) + last_error: { + @doc(""" + One of `server_error` or `rate_limit_exceeded`. + """) + code: "server_error" | "unsupported_file" | "invalid_file"; + + /** A human-readable description of the error. */ + message: string; + } | null; + + /** The strategy used to chunk the file. 
*/ + @extension("x-oaiExpandable", true) + chunking_strategy?: StaticChunkingStrategyResponseParam | OtherChunkingStrategyResponseParam; +} + +model CreateVectorStoreFileRequest { + @doc(""" + A [File](/docs/api-reference/files) ID that the vector store should use. Useful for tools like `file_search` that can access files. + """) + file_id: string; + + chunking_strategy?: ChunkingStrategyRequestParam; +} + +model ListVectorStoreFilesResponse { + // Tool customization: add a clear enum enforcement of constrained 'object' label + object: "list"; + + data: VectorStoreFileObject[]; + first_id: string; + last_id: string; + has_more: boolean; +} + +model DeleteVectorStoreFileResponse { + id: string; + deleted: boolean; + object: "vector_store.file.deleted"; +} + +/** A batch of files attached to a vector store. */ +model VectorStoreFileBatchObject { + /** The identifier, which can be referenced in API endpoints. */ + id: string; + + @doc(""" + The object type, which is always `vector_store.file_batch`. + """) + object: "vector_store.files_batch"; + + // Tool customization: 'created' and fields ending in '_at' are Unix encoded utcDateTime + /** The Unix timestamp (in seconds) for when the vector store files batch was created. */ + @encode("unixTimestamp", int32) + created_at: utcDateTime; + + /** The ID of the [vector store](/docs/api-reference/vector-stores/object) that the [File](/docs/api-reference/files) is attached to. */ + vector_store_id: string; + + @doc(""" + The status of the vector store files batch, which can be either `in_progress`, `completed`, `cancelled` or `failed`. + """) + status: "in_progress" | "completed" | "cancelled" | "failed"; + + file_counts: { + /** The number of files that are currently being processed. */ + in_progress: int32; + + /** The number of files that have been processed. */ + completed: int32; + + /** The number of files that have failed to process. */ + failed: int32; + + /** The number of files that where cancelled. 
*/ + cancelled: int32; + + /** The total number of files. */ + total: int32; + }; +} + +model CreateVectorStoreFileBatchRequest { + @doc(""" + A list of [File](/docs/api-reference/files) IDs that the vector store should use. Useful for tools like `file_search` that can access files. + """) + @minItems(1) + @maxItems(500) + file_ids: string[]; + + chunking_strategy?: ChunkingStrategyRequestParam; +} + +// Tool customization: apply custom common base +@doc(""" + This is returned when the chunking strategy is unknown. Typically, this is because the file was indexed before the `chunking_strategy` concept was introduced in the API. + """) +model OtherChunkingStrategyResponseParam + extends FileChunkingStrategyResponseParam { + @doc(""" + Always `other`. + """) + type: "other"; +} + +// Tool customization: apply custom common base +model StaticChunkingStrategyResponseParam + extends FileChunkingStrategyResponseParam { + @doc(""" + Always `static`. + """) + type: "static"; + + static: StaticChunkingStrategy; +} + +model StaticChunkingStrategy { + @doc(""" + The maximum number of tokens in each chunk. The default value is `800`. The minimum value is `100` and the maximum value is `4096`. + """) + @minValue(100) + @maxValue(4096) + max_chunk_size_tokens: int32; + + @doc(""" + The number of tokens that overlap between chunks. The default value is `400`. + + Note that the overlap must not exceed half of `max_chunk_size_tokens`. + """) + chunk_overlap_tokens: int32; +} + +// Tool customization: apply custom common base +@doc(""" + The default strategy. This strategy currently uses a `max_chunk_size_tokens` of `800` and `chunk_overlap_tokens` of `400`. + """) +model AutoChunkingStrategyRequestParam + extends FileChunkingStrategyRequestParam { + @doc(""" + Always `auto`. + """) + type: "auto"; +} + +// Tool customization: apply custom common base +model StaticChunkingStrategyRequestParam + extends FileChunkingStrategyRequestParam { + @doc(""" + Always `static`. 
+ """) + type: "static"; + + static: StaticChunkingStrategy; +} + +@doc(""" + The chunking strategy used to chunk the file(s). If not set, will use the `auto` strategy. + """) +@extension("x-oaiExpandable", true) +union ChunkingStrategyRequestParam { + AutoChunkingStrategyRequestParam, + StaticChunkingStrategyRequestParam, +} diff --git a/.typespec/vector-stores/operations.tsp b/.typespec/vector-stores/operations.tsp new file mode 100644 index 000000000..908cf6583 --- /dev/null +++ b/.typespec/vector-stores/operations.tsp @@ -0,0 +1,247 @@ +import "@typespec/http"; +import "@typespec/openapi"; + +import "../common"; +import "./custom.tsp"; +import "./models.tsp"; + +using TypeSpec.Http; +using TypeSpec.OpenAPI; + +namespace OpenAI; + +@route("/v1/vector_stores") +interface VectorStores { + @get + @operationId("listVectorStores") + @tag("Vector Stores") + @summary("Returns a list of vector-stores.") + listVectorStores( + /** + * A limit on the number of objects to be returned. Limit can range between 1 and 100, and the + * default is 20. + */ + @query limit?: int32 = 20, + + /** + * Sort order by the `created_at` timestamp of the objects. `asc` for ascending order and`desc` + * for descending order. + */ + @query order?: ListOrder = ListOrder.desc, + + /** + * A cursor for use in pagination. `after` is an object ID that defines your place in the list. + * For instance, if you make a list request and receive 100 objects, ending with obj_foo, your + * subsequent call can include after=obj_foo in order to fetch the next page of the list. + */ + @query after?: string, + + /** + * A cursor for use in pagination. `before` is an object ID that defines your place in the list. + * For instance, if you make a list request and receive 100 objects, ending with obj_foo, your + * subsequent call can include before=obj_foo in order to fetch the previous page of the list. 
+ */ + @query before?: string, + ): ListVectorStoresResponse | ErrorResponse; + + @post + @operationId("createVectorStore") + @tag("Vector Stores") + @summary("Creates a vector store.") + createVectorStore( + @body requestBody: CreateVectorStoreRequest, + ): VectorStoreObject | ErrorResponse; + + @route("{vector_store_id}") + @get + @operationId("getVectorStore") + @tag("Vector Stores") + @summary("Retrieves a vector store.") + getVectorStore( + /** The ID of the vector store to retrieve. */ + @path vector_store_id: string, + ): VectorStoreObject | ErrorResponse; + + @route("{vector_store_id}") + @post + @operationId("modifyVectorStore") + @tag("Vector Stores") + @summary("Modifies a vector store.") + modifyVectorStore( + /** The ID of the vector store to modify. */ + @path vector_store_id: string, + + @body requestBody: UpdateVectorStoreRequest, + ): VectorStoreObject | ErrorResponse; + + @route("{vector_store_id}") + @delete + @operationId("deleteVectorStore") + @tag("Vector Stores") + @summary("Delete a vector store.") + deleteVectorStore( + /** The ID of the vector store to delete. */ + @path vector_store_id: string, + ): DeleteVectorStoreResponse | ErrorResponse; + + @route("{vector_store_id}/files") + @get + @operationId("listVectorStoreFiles") + @tag("Vector Stores") + @summary("Returns a list of vector store files.") + listVectorStoreFiles( + /** The ID of the vector store that the files belong to. */ + @path vector_store_id: string, + + /** + * A limit on the number of objects to be returned. Limit can range between 1 and 100, and the + * default is 20. + */ + @query limit?: int32 = 20, + + /** + * Sort order by the `created_at` timestamp of the objects. `asc` for ascending order and `desc` + * for descending order. + */ + @query order?: ListOrder = ListOrder.desc, + + /** + * A cursor for use in pagination. `after` is an object ID that defines your place in the list.
+ * For instance, if you make a list request and receive 100 objects, ending with obj_foo, your + * subsequent call can include after=obj_foo in order to fetch the next page of the list. + */ + @query after?: string, + + /** + * A cursor for use in pagination. `before` is an object ID that defines your place in the list. + * For instance, if you make a list request and receive 100 objects, ending with obj_foo, your + * subsequent call can include before=obj_foo in order to fetch the previous page of the list. + */ + @query before?: string, + + /** + * Filter by file status. One of `in_progress`, `completed`, `failed`, `cancelled`. + */ + @query filter?: ListVectorStoreFilesFilter, + ): ListVectorStoreFilesResponse | ErrorResponse; + + @route("{vector_store_id}/files") + @post + @operationId("createVectorStoreFile") + @tag("Vector Stores") + @summary("Create a vector store file by attaching a [File](/docs/api-reference/files) to a [vector store](/docs/api-reference/vector-stores/object).") + createVectorStoreFile( + /** The ID of the vector store for which to create a File. */ + @path vector_store_id: string, + + @body requestBody: CreateVectorStoreFileRequest, + ): VectorStoreFileObject | ErrorResponse; + + @route("{vector_store_id}/files/{file_id}") + @get + @operationId("getVectorStoreFile") + @tag("Vector Stores") + @summary("Retrieves a vector store file.") + getVectorStoreFile( + /** The ID of the vector store that the file belongs to. */ + @path vector_store_id: string, + + /** The ID of the file being retrieved. */ + @path file_id: string, + ): VectorStoreFileObject | ErrorResponse; + + @route("{vector_store_id}/files/{file_id}") + @delete + @operationId("deleteVectorStoreFile") + @tag("Vector Stores") + @summary("Delete a vector store file. This will remove the file from the vector store but the file itself will not be deleted. 
To delete the file, use the [delete file](/docs/api-reference/files/delete) endpoint.") + deleteVectorStoreFile( + /** The ID of the vector store that the file belongs to. */ + @path vector_store_id: string, + + /** The ID of the file to delete. */ + @path file_id: string, + ): DeleteVectorStoreFileResponse | ErrorResponse; + + @route("{vector_store_id}/file_batches") + @post + @operationId("createVectorStoreFileBatch") + @tag("Vector Stores") + @summary("Create a vector store file batch.") + createVectorStoreFileBatch( + /** The ID of the vector store for which to create a file batch. */ + @path vector_store_id: string, + + @body requestBody: CreateVectorStoreFileBatchRequest, + ): VectorStoreFileBatchObject | ErrorResponse; + + @route("{vector_store_id}/file_batches/{batch_id}") + @get + @operationId("getVectorStoreFileBatch") + @tag("Vector Stores") + @summary("Retrieves a vector store file batch.") + getVectorStoreFileBatch( + /** The ID of the vector store that the file batch belongs to. */ + @path vector_store_id: string, + + /** The ID of the file batch being retrieved. */ + @path batch_id: string, + ): VectorStoreFileBatchObject | ErrorResponse; + + @route("{vector_store_id}/file_batches/{batch_id}/cancel") + @post + @operationId("cancelVectorStoreFileBatch") + @tag("Vector Stores") + @summary("Cancel a vector store file batch. This attempts to cancel the processing of files in this batch as soon as possible.") + cancelVectorStoreFileBatch( + /** The ID of the vector store that the file batch belongs to. */ + @path vector_store_id: string, + + /** The ID of the file batch to cancel. */ + @path batch_id: string, + ): VectorStoreFileBatchObject | ErrorResponse; + + @route("{vector_store_id}/file_batches/{batch_id}/files") + @get + @operationId("listFilesInVectorStoreBatch") + @tag("Vector Stores") + @summary("Returns a list of vector store files in a batch.") + listFilesInVectorStoreBatch( + /** The ID of the vector store that the file batch belongs to. 
*/ + @path vector_store_id: string, + + /** The ID of the file batch that the files belong to. */ + @path batch_id: string, + + /** + * A limit on the number of objects to be returned. Limit can range between 1 and 100, and the + * default is 20. + */ + @query limit?: int32 = 20, + + /** + * Sort order by the `created_at` timestamp of the objects. `asc` for ascending order and `desc` + * for descending order. + */ + @query order?: ListOrder = ListOrder.desc, + + /** + * A cursor for use in pagination. `after` is an object ID that defines your place in the list. + * For instance, if you make a list request and receive 100 objects, ending with obj_foo, your + * subsequent call can include after=obj_foo in order to fetch the next page of the list. + */ + @query after?: string, + + /** + * A cursor for use in pagination. `before` is an object ID that defines your place in the list. + * For instance, if you make a list request and receive 100 objects, ending with obj_foo, your + * subsequent call can include before=obj_foo in order to fetch the previous page of the list. + */ + @query before?: string, + + /** + * Filter by file status. One of `in_progress`, `completed`, `failed`, `cancelled`. + */ + @query filter?: ListVectorStoreFilesFilter, + ): ListVectorStoreFilesResponse | ErrorResponse; +} diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md new file mode 100644 index 000000000..bdb2d9e1a --- /dev/null +++ b/CONTRIBUTING.md @@ -0,0 +1,16 @@ +# CONTRIBUTING + +## How to run code generation + +1. Run the following command to install the necessary tools: + `npm install` +1. Regenerate the OpenAPI spec by running the following command: + `npx tsp compile .\openai-in-typespec\main.tsp --emit @typespec/openapi3` +1. Regenerate the library by running the following command: + `npx tsp compile .\openai-in-typespec\main.tsp --emit @azure-tools/typespec-csharp --option @azure-tools/typespec-csharp.emitter-output-dir=.\openai-in-typespec\.dotnet\src` +1.
Run the following script: + `.\openai-in-typespec\.scripts\Update-ClientModel.ps1` +1. Run the following script: + `.\openai-in-typespec\.scripts\ConvertTo-Internal.ps1` +1. Run the following script: + `.\openai-in-typespec\.scripts\Add-Customizations.ps1` diff --git a/README.md b/README.md index 04bcd9be3..f9d6e9508 100644 --- a/README.md +++ b/README.md @@ -1,3 +1,17 @@ -# OpenAPI spec for the OpenAI API +# A conversion of the OpenAI OpenAPI to TypeSpec -This repository contains an [OpenAPI](https://www.openapis.org/) specification for the [OpenAI API](https://platform.openai.com/docs/api-reference). +Snapshot: https://raw.githubusercontent.com/openai/openai-openapi/3d5576596e5fe1cd3b88ddcd407dd1c5f3594f02/openapi.yaml + +There are some deltas: + +### Changes to API Semantics + +- Many things are missing defaults (mostly due to a bug where we can't set null defaults). +- Error responses have been added. +- Where known, the `object` property's type is narrowed from string to the constant value it will always be. + +### Changes to API metadata or OpenAPI format + +- Many of the `x-oaiMeta` entries have not been added. +- In some cases, new schemas needed to be defined in order to be expressed in TypeSpec (e.g. because the constraints could not be added to a model property with a heterogeneous type). +- There is presently no way to set `title`. diff --git a/audio/models.tsp b/audio/models.tsp deleted file mode 100644 index a2a440a90..000000000 --- a/audio/models.tsp +++ /dev/null @@ -1,92 +0,0 @@ -namespace OpenAI; -using TypeSpec.OpenAPI; - -model CreateTranscriptionRequest { - /** - * The audio file object (not file name) to transcribe, in one of these formats: flac, mp3, mp4, - * mpeg, mpga, m4a, ogg, wav, or webm. - */ - @encode("binary") - @extension("x-oaiTypeLabel", "file") - file: bytes; - - /** ID of the model to use. Only `whisper-1` is currently available.
*/ - @extension("x-oaiTypeLabel", "string") - `model`: string | "whisper-1"; - - /** - * An optional text to guide the model's style or continue a previous audio segment. The - * [prompt](/docs/guides/speech-to-text/prompting) should match the audio language. - */ - prompt?: string; - - /** - * The format of the transcript output, in one of these options: json, text, srt, verbose_json, or - * vtt. - */ - response_format?: "json" | "text" | "srt" | "verbose_json" | "vtt" = "json"; - - /** - * The sampling temperature, between 0 and 1. Higher values like 0.8 will make the output more - * random, while lower values like 0.2 will make it more focused and deterministic. If set to 0, - * the model will use [log probability](https://en.wikipedia.org/wiki/Log_probability) to - * automatically increase the temperature until certain thresholds are hit. - */ - @minValue(0) - @maxValue(1) - temperature?: float64 = 0; - - /** - * The language of the input audio. Supplying the input language in - * [ISO-639-1](https://en.wikipedia.org/wiki/List_of_ISO_639-1_codes) format will improve accuracy - * and latency. - */ - language?: string; -} - -// Note: This does not currently support the non-default response format types. -model CreateTranscriptionResponse { - text: string; -} - -model CreateTranslationRequest { - /** - * The audio file object (not file name) to translate, in one of these formats: flac, mp3, mp4, - * mpeg, mpga, m4a, ogg, wav, or webm. - */ - @encode("binary") - @extension("x-oaiTypeLabel", "file") - file: bytes; - - /** ID of the model to use. Only `whisper-1` is currently available. */ - @extension("x-oaiTypeLabel", "string") - `model`: string | "whisper-1"; - - /** - * An optional text to guide the model's style or continue a previous audio segment. The - * [prompt](/docs/guides/speech-to-text/prompting) should match the audio language. - */ - prompt?: string; - - // NOTE: this is just string in the actual API? 
- /** - * The format of the transcript output, in one of these options: json, text, srt, verbose_json, or - * vtt. - */ - response_format?: "json" | "text" | "srt" | "verbose_json" | "vtt" = "json"; - - /** - * The sampling temperature, between 0 and 1. Higher values like 0.8 will make the output more - * random, while lower values like 0.2 will make it more focused and deterministic. If set to 0, - * the model will use [log probability](https://en.wikipedia.org/wiki/Log_probability) to - * automatically increase the temperature until certain thresholds are hit. - */ - @minValue(0) - @maxValue(1) - temperature?: float64 = 0; -} - -// Note: This does not currently support the non-default response format types. -model CreateTranslationResponse { - text: string; -} diff --git a/audio/operations.tsp b/audio/operations.tsp deleted file mode 100644 index 636fb941a..000000000 --- a/audio/operations.tsp +++ /dev/null @@ -1,35 +0,0 @@ -import "@typespec/http"; -import "@typespec/openapi"; - -import "../common/errors.tsp"; - -using TypeSpec.Http; -using TypeSpec.OpenAPI; - -namespace OpenAI; -@route("/audio") -namespace Audio { - @route("transcriptions") - interface Transcriptions { - @post - @operationId("createTranscription") - @tag("OpenAI") - @summary("Transcribes audio into the input language.") - createTranscription( - @header contentType: "multipart/form-data", - @body audio: CreateTranscriptionRequest, - ): CreateTranscriptionResponse | ErrorResponse; - } - - @route("translations") - interface Translations { - @post - @operationId("createTranslation") - @tag("OpenAI") - @summary("Transcribes audio into the input language.") - createTranslation( - @header contentType: "multipart/form-data", - @body audio: CreateTranslationRequest, - ): CreateTranslationResponse | ErrorResponse; - } -} diff --git a/common/errors.tsp b/common/errors.tsp deleted file mode 100644 index cd4395a05..000000000 --- a/common/errors.tsp +++ /dev/null @@ -1,13 +0,0 @@ -namespace OpenAI; - -model 
Error { - type: string; - message: string; - param: string | null; - code: string | null; -} - -@error -model ErrorResponse { - error: Error; -} diff --git a/completions/chat-meta.tsp b/completions/chat-meta.tsp deleted file mode 100644 index 7823da41d..000000000 --- a/completions/chat-meta.tsp +++ /dev/null @@ -1,168 +0,0 @@ -using TypeSpec.OpenAPI; - -@@extension(OpenAI.Completions.createCompletion, - "x-oaiMeta", - { - name: "Create chat completion", - group: "chat", - returns: """ - Returns a [chat completion](/docs/api-reference/chat/object) object, or a streamed sequence of - [chat completion chunk](/docs/api-reference/chat/streaming) objects if the request is streamed. - """, - path: "create", - examples: [ - { - title: "No streaming", - request: { - curl: """ - curl https://api.openai.com/v1/chat/completions \\ - -H "Content-Type: application/json" \\ - -H "Authorization: Bearer $OPENAI_API_KEY" \\ - -d '{ - "model": "VAR_model_id", - "messages": [ - { - "role": "system", - "content": "You are a helpful assistant." - }, - { - "role": "user", - "content": "Hello!" 
- } - ] - """, - python: """ - import os - import openai - openai.api_key = os.getenv("OPENAI_API_KEY") - - completion = openai.ChatCompletion.create( - model="VAR_model_id", - messages=[ - {"role": "system", "content": "You are a helpful assistant."}, - {"role": "user", "content": "Hello!"} - ] - ) - - print(completion.choices[0].message) - """, - `node.js`: """ - import OpenAI from "openai"; - - const openai = new OpenAI(); - - async function main() { - const completion = await openai.chat.completions.create({ - messages: [{ role: "system", content: "string" }], - model: "VAR_model_id", - }); - - console.log(completion.choices[0]); - } - - main(); - """, - }, - response: """ - { - "id": "chatcmpl-123", - "object": "chat.completion", - "created": 1677652288, - "model": "gpt-3.5-turbo-0613", - "choices": [{ - "index": 0, - "message": { - "role": "assistant", - "content": "\n\nHello there, how may I assist you today?", - }, - "finish_reason": "stop" - }], - "usage": { - "prompt_tokens": 9, - "completion_tokens": 12, - "total_tokens": 21 - } - } - """, - }, - { - title: "Streaming", - request: { - curl: """ - curl https://api.openai.com/v1/chat/completions \\ - -H "Content-Type: application/json" \\ - -H "Authorization: Bearer $OPENAI_API_KEY" \\ - -d '{ - "model": "VAR_model_id", - "messages": [ - { - "role": "system", - "content": "You are a helpful assistant." - }, - { - "role": "user", - "content": "Hello!" 
- } - ], - "stream": true - }' - """, - python: """ - import os - import openai - openai.api_key = os.getenv("OPENAI_API_KEY") - - completion = openai.ChatCompletion.create( - model="VAR_model_id", - messages=[ - {"role": "system", "content": "You are a helpful assistant."}, - {"role": "user", "content": "Hello!"} - ], - stream=True - ) - - for chunk in completion: - print(chunk.choices[0].delta) - """, - `node.js`: """ - import OpenAI from "openai"; - - const openai = new OpenAI(); - - async function main() { - const completion = await openai.chat.completions.create({ - model: "VAR_model_id", - messages: [ - {"role": "system", "content": "You are a helpful assistant."}, - {"role": "user", "content": "Hello!"} - ], - stream: true, - }); - - for await (const chunk of completion) { - console.log(chunk.choices[0].delta.content); - } - } - - main(); - """, - }, - response: """ - { - "id": "chatcmpl-123", - "object": "chat.completion.chunk", - "created": 1677652288, - "model": "gpt-3.5-turbo", - "choices": [{ - "index": 0, - "delta": { - "content": "Hello", - }, - "finish_reason": "stop" - }] - } - """, - } - ], - } -); diff --git a/completions/models.tsp b/completions/models.tsp deleted file mode 100644 index 5aa332b32..000000000 --- a/completions/models.tsp +++ /dev/null @@ -1,420 +0,0 @@ -namespace OpenAI; -using TypeSpec.OpenAPI; - -alias CHAT_COMPLETION_MODELS = - | "gpt4" - | "gpt-4-0314" - | "gpt-4-0613" - | "gpt-4-32k" - | "gpt-4-32k-0314" - | "gpt-4-32k-0613" - | "gpt-3.5-turbo" - | "gpt-3.5-turbo-16k" - | "gpt-3.5-turbo-0301" - | "gpt-3.5-turbo-0613" - | "gpt-3.5-turbo-16k-0613"; - -alias COMPLETION_MODELS = - | "babbage-002" - | "davinci-002" - | "text-davinci-003" - | "text-davinci-002" - | "text-davinci-001" - | "code-davinci-002" - | "text-curie-001" - | "text-babbage-001" - | "text-ada-001"; - -alias SharedCompletionProperties = { - /** - * What sampling temperature to use, between 0 and 2. 
Higher values like 0.8 will make the output - * more random, while lower values like 0.2 will make it more focused and deterministic. - * - * We generally recommend altering this or `top_p` but not both. - */ - temperature?: Temperature | null = 1; - - /** - * An alternative to sampling with temperature, called nucleus sampling, where the model considers - * the results of the tokens with top_p probability mass. So 0.1 means only the tokens comprising - * the top 10% probability mass are considered. - * - * We generally recommend altering this or `temperature` but not both. - */ - top_p?: TopP | null = 1; - - /** - * How many completions to generate for each prompt. - * **Note:** Because this parameter generates many completions, it can quickly consume your token - * quota. Use carefully and ensure that you have reasonable settings for `max_tokens` and `stop`. - */ - n?: N | null = 1; - - /** - * The maximum number of [tokens](/tokenizer) to generate in the completion. - * - * The token count of your prompt plus `max_tokens` cannot exceed the model's context length. - * [Example Python code](https://github.com/openai/openai-cookbook/blob/main/examples/How_to_count_tokens_with_tiktoken.ipynb) - * for counting tokens. - */ - max_tokens?: MaxTokens | null = 16; - - // todo: consider inlining when https://github.com/microsoft/typespec/issues/2356 is resolved - // https://github.com/microsoft/typespec/issues/2355 - /** Up to 4 sequences where the API will stop generating further tokens. */ - stop?: Stop = null; - - // needs default - // https://github.com/microsoft/typespec/issues/1646 - /** - * Number between -2.0 and 2.0. Positive values penalize new tokens based on whether they appear - * in the text so far, increasing the model's likelihood to talk about new topics. 
- * - * [See more information about frequency and presence penalties.](/docs/guides/gpt/parameter-details) - */ - presence_penalty?: Penalty | null; - - // needs default - /** - * Number between -2.0 and 2.0. Positive values penalize new tokens based on their existing - * frequency in the text so far, decreasing the model's likelihood to repeat the same line - * verbatim. - * - * [See more information about frequency and presence penalties.](/docs/guides/gpt/parameter-details) - */ - frequency_penalty?: Penalty | null; - - // needs default of null - /** - * Modify the likelihood of specified tokens appearing in the completion. - * Accepts a JSON object that maps tokens (specified by their token ID in the tokenizer) to an - * associated bias value from -100 to 100. Mathematically, the bias is added to the logits - * generated by the model prior to sampling. The exact effect will vary per model, but values - * between -1 and 1 should decrease or increase likelihood of selection; values like -100 or 100 - * should result in a ban or exclusive selection of the relevant token. - */ - @extension("x-oaiTypeLabel", "map") - logit_bias?: Record | null; - - /** - * A unique identifier representing your end-user, which can help OpenAI to monitor and detect - * abuse. [Learn more](/docs/guides/safety-best-practices/end-user-ids). - */ - user?: User; - - /** - * If set, partial message deltas will be sent, like in ChatGPT. Tokens will be sent as data-only - * [server-sent events](https://developer.mozilla.org/en-US/docs/Web/API/Server-sent_events/Using_server-sent_events#Event_stream_format) - * as they become available, with the stream terminated by a `data: [DONE]` message. - * [Example Python code](https://github.com/openai/openai-cookbook/blob/main/examples/How_to_stream_completions.ipynb).
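The `stream` documentation above describes data-only server-sent events terminated by a `data: [DONE]` sentinel. A minimal, dependency-free sketch of the client-side parsing this implies (the event lines below are fabricated for illustration; real clients should use the official SDK):

```python
import json

def iter_sse_chunks(lines):
    """Parse data-only server-sent events as described above.

    `lines` is any iterable of text lines (e.g. an HTTP response body);
    yields each decoded JSON payload and stops at the `data: [DONE]`
    sentinel that terminates the stream.
    """
    for line in lines:
        line = line.strip()
        if not line.startswith("data:"):
            continue  # skip blank lines and SSE comments/keep-alives
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":
            return
        yield json.loads(payload)

# Fabricated stream of chat.completion.chunk events:
raw = [
    'data: {"choices": [{"delta": {"content": "Hel"}}]}',
    'data: {"choices": [{"delta": {"content": "lo"}}]}',
    "data: [DONE]",
]
text = "".join(
    c["choices"][0]["delta"].get("content", "") for c in iter_sse_chunks(raw)
)
print(text)  # Hello
```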
- */ - stream?: boolean | null = false; -}; - -@oneOf -union Stop { - string, - StopSequences, - null, -} - -@minValue(-2) -@maxValue(2) -scalar Penalty extends float64; - -@minItems(1) -@maxItems(4) -model StopSequences is string[]; - -@minValue(0) -@maxValue(2) -scalar Temperature extends float64; - -@minValue(0) -@maxValue(1) -scalar TopP extends float64; - -@minValue(1) -@maxValue(128) -scalar N extends safeint; - -@minValue(0) -scalar MaxTokens extends safeint; - -model CreateChatCompletionRequest { - /** - * ID of the model to use. See the [model endpoint compatibility](/docs/models/model-endpoint-compatibility) - * table for details on which models work with the Chat API. - */ - @extension("x-oaiTypeLabel", "string") - `model`: string | CHAT_COMPLETION_MODELS; - - /** - * A list of messages comprising the conversation so far. - * [Example Python code](https://github.com/openai/openai-cookbook/blob/main/examples/How_to_format_inputs_to_ChatGPT_models.ipynb). - */ - @minItems(1) - messages: ChatCompletionRequestMessage[]; - - /** A list of functions the model may generate JSON inputs for. */ - @minItems(1) - @maxItems(128) - functions?: ChatCompletionFunctions[]; - - /** - * Controls how the model responds to function calls. `none` means the model does not call a - * function, and responds to the end-user. `auto` means the model can pick between an end-user or - * calling a function. Specifying a particular function via `{\"name\": \"my_function\"}` forces the - * model to call that function. `none` is the default when no functions are present. `auto` is the - * default if functions are present. - */ - function_call?: "none" | "auto" | ChatCompletionFunctionCallOption; - - ...SharedCompletionProperties; -} - -model ChatCompletionFunctionCallOption { - /** The name of the function to call. */ - name: string; -} - -model ChatCompletionFunctions { - /** - * The name of the function to be called.
Must be a-z, A-Z, 0-9, or contain underscores and - * dashes, with a maximum length of 64. - */ - name: string; - - /** - * A description of what the function does, used by the model to choose when and how to call the - * function. - */ - description?: string; - - /** - * The parameters the function accepts, described as a JSON Schema object. See the - * [guide](/docs/guides/gpt/function-calling) for examples, and the - * [JSON Schema reference](https://json-schema.org/understanding-json-schema/) for documentation - * about the format. To describe a function that accepts no parameters, provide the value - * `{\"type\": \"object\", \"properties\": {}}`. - */ - parameters: ChatCompletionFunctionParameters; -} - -model ChatCompletionFunctionParameters is Record; - -model ChatCompletionRequestMessage { - /** The role of the message's author. One of `system`, `user`, `assistant`, or `function`. */ - role: "system" | "user" | "assistant" | "function"; - - /** - * The contents of the message. `content` is required for all messages, and may be null for - * assistant messages with function calls. - */ - content: string | null; - - // TODO: the constraints are not specified in the API - /** - * The name of the author of this message. `name` is required if role is `function`, and it - * should be the name of the function whose response is in the `content`. May contain a-z, - * A-Z, 0-9, and underscores, with a maximum length of 64 characters. - */ - name?: string; - - /** The name and arguments of a function that should be called, as generated by the model. */ - function_call?: { - /** The name of the function to call. */ - name: string; - - /** - * The arguments to call the function with, as generated by the model in JSON format. Note that - * the model does not always generate valid JSON, and may hallucinate parameters not defined by - * your function schema. Validate the arguments in your code before calling your function.
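The doc comments above warn that model-generated `arguments` may be invalid JSON or contain parameters not declared in the function's JSON Schema. One possible defensive dispatch, sketched with a hypothetical `get_weather` function and schema (neither is part of this spec):

```python
import json

def safe_call(function_call, registry, schemas):
    """Dispatch a model-generated function_call defensively: reject unknown
    functions, malformed JSON arguments, and hallucinated parameters."""
    name = function_call["name"]
    if name not in registry:
        raise ValueError(f"model requested unknown function {name!r}")
    try:
        args = json.loads(function_call["arguments"])
    except json.JSONDecodeError as exc:
        raise ValueError(f"arguments were not valid JSON: {exc}") from None
    allowed = set(schemas[name].get("properties", {}))
    unknown = set(args) - allowed
    if unknown:
        raise ValueError(f"hallucinated parameters: {sorted(unknown)}")
    return registry[name](**args)

# Hypothetical function and schema, mirroring ChatCompletionFunctions:
schemas = {
    "get_weather": {"type": "object", "properties": {"city": {"type": "string"}}}
}
registry = {"get_weather": lambda city: f"sunny in {city}"}
call = {"name": "get_weather", "arguments": '{"city": "Paris"}'}
print(safe_call(call, registry, schemas))  # sunny in Paris
```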
- */ - arguments: string; - }; -} - -/** Represents a chat completion response returned by the model, based on the provided input. */ -// TODO: Fill in example here. -@extension( - "x-oaiMeta", - { - name: "The chat completion object", - group: "chat", - example: "", - } -) -model CreateChatCompletionResponse { - /** A unique identifier for the chat completion. */ - id: string; - - /** The object type, which is always `chat.completion`. */ - object: string; - - /** The Unix timestamp (in seconds) of when the chat completion was created. */ - @encode("unixTimestamp", int32) - created: utcDateTime; - - /** The model used for the chat completion. */ - `model`: string; - - /** A list of chat completion choices. Can be more than one if `n` is greater than 1. */ - choices: { - /** The index of the choice in the list of choices. */ - index: safeint; - - message: ChatCompletionResponseMessage; - - /** - * The reason the model stopped generating tokens. This will be `stop` if the model hit a - * natural stop point or a provided stop sequence, `length` if the maximum number of tokens - * specified in the request was reached, `content_filter` if the content was omitted due to - * a flag from our content filters, or `function_call` if the model called a function. - */ - finish_reason: "stop" | "length" | "function_call" | "content_filter"; - }[]; - - usage?: CompletionUsage; -} - -/** Usage statistics for the completion request. */ -model CompletionUsage { - /** Number of tokens in the prompt. */ - prompt_tokens: safeint; - - /** Number of tokens in the generated completion. */ - completion_tokens: safeint; - - /** Total number of tokens used in the request (prompt + completion). */ - total_tokens: safeint; -} - -model ChatCompletionResponseMessage { - /** The role of the author of this message. */ - role: "system" | "user" | "assistant" | "function"; - - /** The contents of the message.
*/ - content: string | null; - - /** The name and arguments of a function that should be called, as generated by the model. */ - function_call?: { - /** The name of the function to call. */ - name: string; - - /** - * The arguments to call the function with, as generated by the model in JSON format. Note that - * the model does not always generate valid JSON, and may hallucinate parameters not defined by - * your function schema. Validate the arguments in your code before calling your function. - */ - arguments: string; - }; -} - -model CreateCompletionRequest { - /** - * ID of the model to use. You can use the [List models](/docs/api-reference/models/list) API to - * see all of your available models, or see our [Model overview](/docs/models/overview) for - * descriptions of them. - */ - @extension("x-oaiTypeLabel", "string") - `model`: string | COMPLETION_MODELS; - - /** - * The prompt(s) to generate completions for, encoded as a string, array of strings, array of - * tokens, or array of token arrays. - * - * Note that <|endoftext|> is the document separator that the model sees during training, so if a - * prompt is not specified the model will generate as if from the beginning of a new document. - */ - // TODO: consider inlining when https://github.com/microsoft/typespec/issues/2356 fixed - prompt: Prompt = "<|endoftext|>"; - - /** The suffix that comes after a completion of inserted text. */ - suffix?: string | null = null; - - ...SharedCompletionProperties; - - /** - * Include the log probabilities on the `logprobs` most likely tokens, as well as the chosen tokens. - * For example, if `logprobs` is 5, the API will return a list of the 5 most likely tokens. The - * API will always return the `logprob` of the sampled token, so there may be up to `logprobs+1` - * elements in the response. - * - * The maximum value for `logprobs` is 5.
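The `logprobs` values described above are natural-log probabilities. A short worked example of turning them back into per-token probabilities and a joint sequence probability (the values are fabricated for illustration):

```python
import math

# Token logprobs as returned in a `logprobs` block are natural logs;
# exponentiating recovers per-token probabilities, and summing before
# exponentiating gives the joint probability of the sampled sequence.
token_logprobs = [-0.02, -1.3, -0.45]  # fabricated values

per_token = [math.exp(lp) for lp in token_logprobs]
joint = math.exp(sum(token_logprobs))

print([round(p, 3) for p in per_token])  # [0.98, 0.273, 0.638]
print(round(joint, 3))                   # 0.17
```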
- */ - logprobs?: safeint | null = null; - - /** Echo back the prompt in addition to the completion */ - echo?: boolean | null = false; - - /** - * Generates `best_of` completions server-side and returns the "best" (the one with the highest - * log probability per token). Results cannot be streamed. - * - * When used with `n`, `best_of` controls the number of candidate completions and `n` specifies - * how many to return – `best_of` must be greater than `n`. - * - * **Note:** Because this parameter generates many completions, it can quickly consume your token - * quota. Use carefully and ensure that you have reasonable settings for `max_tokens` and `stop`. - */ - best_of?: safeint | null = 1; -} - -@oneOf -union Prompt { - string, - string[], - TokenArray, - TokenArrayArray, - null, -} -/** - * Represents a completion response from the API. Note: both the streamed and non-streamed response - * objects share the same shape (unlike the chat endpoint). - */ -@extension( - "x-oaiMeta", - { - name: "The completion object", - legacy: true, - example: "", // fill in - } -) -model CreateCompletionResponse { - /** A unique identifier for the completion. */ - id: string; - - /** The object type, which is always `text_completion`. */ - object: string; - - /** The Unix timestamp (in seconds) of when the completion was created. */ - @encode("unixTimestamp", int32) - created: utcDateTime; - - /** The model used for the completion. */ - `model`: string; - - /** The list of completion choices the model generated for the input. */ - choices: { - index: safeint; - text: string; - logprobs: null | { - tokens: string[]; - token_logprobs: float64[]; - top_logprobs: Record[]; - text_offset: safeint[]; - }; - - /** - * The reason the model stopped generating tokens. 
This will be `stop` if the model hit a - * natural stop point or a provided stop sequence, `length` if the maximum number of tokens specified - * in the request was reached, or `content_filter` if content was omitted due to a flag from our - * content filters. - */ - finish_reason: "stop" | "length" | "content_filter"; - }[]; - - usage?: CompletionUsage; -} diff --git a/completions/operations.tsp b/completions/operations.tsp deleted file mode 100644 index d53245f7c..000000000 --- a/completions/operations.tsp +++ /dev/null @@ -1,33 +0,0 @@ -import "@typespec/http"; -import "@typespec/openapi"; - -import "../common/errors.tsp"; -import "./models.tsp"; -import "./chat-meta.tsp"; - -using TypeSpec.Http; -using TypeSpec.OpenAPI; - -namespace OpenAI; - -@route("/chat") -namespace Chat { - @route("/completions") - interface Completions { - @tag("OpenAI") - @post - @operationId("createChatCompletion") - createChatCompletion( - ...CreateChatCompletionRequest, - ): CreateChatCompletionResponse | ErrorResponse; - } -} -@route("/completions") -interface Completions { - @tag("OpenAI") - @post - @operationId("createCompletion") - createCompletion( - ...CreateCompletionRequest, - ): CreateCompletionResponse | ErrorResponse; -} diff --git a/edits/models.tsp b/edits/models.tsp deleted file mode 100644 index d76372649..000000000 --- a/edits/models.tsp +++ /dev/null @@ -1,69 +0,0 @@ -namespace OpenAI; -using TypeSpec.OpenAPI; - -model CreateEditRequest { - /** - * ID of the model to use. You can use the `text-davinci-edit-001` or `code-davinci-edit-001` - * model with this endpoint. - */ - @extension("x-oaiTypeLabel", "string") - `model`: string | "text-davinci-edit-001" | "code-davinci-edit-001"; - - /** The input text to use as a starting point for the edit. */ - input?: string | null = ""; - - /** The instruction that tells the model how to edit the prompt.
*/ - instruction: string; - - /** How many edits to generate for the input and instruction. */ - n?: EditN | null = 1; - - /** - * What sampling temperature to use, between 0 and 2. Higher values like 0.8 will make the output - * more random, while lower values like 0.2 will make it more focused and deterministic. - * - * We generally recommend altering this or `top_p` but not both. - */ - temperature?: Temperature | null = 1; - - /** - * An alternative to sampling with temperature, called nucleus sampling, where the model considers - * the results of the tokens with top_p probability mass. So 0.1 means only the tokens comprising - * the top 10% probability mass are considered. - * - * We generally recommend altering this or `temperature` but not both. - */ - top_p?: TopP | null = 1; -} - -#deprecated "deprecated" -model CreateEditResponse { - /** The object type, which is always `edit`. */ - object: "edit"; - - /** The Unix timestamp (in seconds) of when the edit was created. */ - @encode("unixTimestamp", int32) - created: utcDateTime; - - /** A list of edit choices. Can be more than one if `n` is greater than 1. */ - choices: { - /** The edited result. */ - text: string; - - /** The index of the choice in the list of choices. */ - index: safeint; - - /** - * The reason the model stopped generating tokens. This will be `stop` if the model hit a - * natural stop point or a provided stop sequence, or `length` if the maximum number of tokens - * specified in the request was reached.
- */ - finish_reason: "stop" | "length"; - }[]; - - usage: CompletionUsage; -} - -@minValue(0) -@maxValue(20) -scalar EditN extends safeint; diff --git a/edits/operations.tsp b/edits/operations.tsp deleted file mode 100644 index 08497364e..000000000 --- a/edits/operations.tsp +++ /dev/null @@ -1,19 +0,0 @@ -import "@typespec/http"; -import "@typespec/openapi"; - -import "../common/errors.tsp"; -import "./models.tsp"; - -using TypeSpec.Http; -using TypeSpec.OpenAPI; - -namespace OpenAI; - -@route("/edits") -interface Edits { - #deprecated "deprecated" - @post - @tag("OpenAI") - @operationId("createEdit") - createEdit(@body edit: CreateEditRequest): CreateEditResponse | ErrorResponse; -} diff --git a/embeddings/models.tsp b/embeddings/models.tsp deleted file mode 100644 index ab46275b2..000000000 --- a/embeddings/models.tsp +++ /dev/null @@ -1,55 +0,0 @@ -import "../common/models.tsp"; - -namespace OpenAI; -using TypeSpec.OpenAPI; - -model CreateEmbeddingRequest { - /** ID of the model to use. You can use the [List models](/docs/api-reference/models/list) API to see all of your available models, or see our [Model overview](/docs/models/overview) for descriptions of them. */ - @extension("x-oaiTypeLabel", "string") - `model`: string | "text-embedding-ada-002"; - - /** - * Input text to embed, encoded as a string or array of tokens. To embed multiple inputs in a - * single request, pass an array of strings or array of token arrays. Each input must not exceed - * the max input tokens for the model (8191 tokens for `text-embedding-ada-002`) and cannot be an empty string. - * [Example Python code](https://github.com/openai/openai-cookbook/blob/main/examples/How_to_count_tokens_with_tiktoken.ipynb) - * for counting tokens. - */ - input: string | string[] | TokenArray | TokenArrayArray; - - user?: User; -} -model CreateEmbeddingResponse { - /** The object type, which is always "embedding". 
*/ - object: "embedding"; - - /** The name of the model used to generate the embedding. */ - `model`: string; - - /** The list of embeddings generated by the model. */ - data: Embedding[]; - - /** The usage information for the request. */ - usage: { - /** The number of tokens used by the prompt. */ - prompt_tokens: safeint; - - /** The total number of tokens used by the request. */ - total_tokens: safeint; - }; -} - -/** Represents an embedding vector returned by the embeddings endpoint. */ -model Embedding { - /** The index of the embedding in the list of embeddings. */ - index: safeint; - - /** The object type, which is always "embedding". */ - object: "embedding"; - - /** - * The embedding vector, which is a list of floats. The length of the vector depends on the model, - * as listed in the [embedding guide](/docs/guides/embeddings). - */ - embedding: float64[]; -} diff --git a/files/models.tsp b/files/models.tsp deleted file mode 100644 index 990c1ea11..000000000 --- a/files/models.tsp +++ /dev/null @@ -1,70 +0,0 @@ -namespace OpenAI; -using TypeSpec.OpenAPI; - -model ListFilesResponse { - object: string; // presumably this is always some constant, but not defined. - data: OpenAIFile[]; -} - -model CreateFileRequest { - /** - * Name of the [JSON Lines](https://jsonlines.readthedocs.io/en/latest/) file to be uploaded. - * - * If the `purpose` is set to "fine-tune", the file will be used for fine-tuning. - */ - @encode("binary") - file: bytes; - - /** - * The intended purpose of the uploaded documents. Use "fine-tune" for - * [fine-tuning](/docs/api-reference/fine-tuning). This allows us to validate the format of the - * uploaded file. - */ - purpose: string; -} - -/** The `File` object represents a document that has been uploaded to OpenAI. */ -model OpenAIFile { - /** The file identifier, which can be referenced in the API endpoints. */ - id: string; - - /** The object type, which is always "file". */ - object: "file"; - - /** The size of the file in bytes.
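The `Embedding.embedding` field above is just a list of floats; in practice these vectors are usually compared with cosine similarity. A dependency-free sketch, using fabricated low-dimensional vectors (real vectors from the models in this spec are much longer):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors (lists of floats):
    the dot product divided by the product of the vector norms."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Fabricated 4-dimensional vectors for illustration:
v1 = [0.1, 0.2, 0.3, 0.4]
v2 = [0.1, 0.2, 0.3, 0.4]
v3 = [0.4, -0.3, 0.2, -0.1]
print(round(cosine_similarity(v1, v2), 3))  # 1.0 (identical direction)
print(round(cosine_similarity(v1, v3), 3))  # 0.0 (orthogonal)
```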
*/ - bytes: safeint; - - /** The Unix timestamp (in seconds) for when the file was created. */ - @encode("unixTimestamp", int32) - created_at: utcDateTime; - - /** The name of the file. */ - filename: string; - - /** The intended purpose of the file. Currently, only "fine-tune" is supported. */ - purpose: string; - - /** - * The current status of the file, which can be either `uploaded`, `processed`, `pending`, - * `error`, `deleting` or `deleted`. - */ - status: - | "uploaded" - | "processed" - | "pending" - | "error" - | "deleting" - | "deleted"; - - /** - * Additional details about the status of the file. If the file is in the `error` state, this will - * include a message describing the error. - */ - status_details?: string | null; -} - -model DeleteFileResponse { - id: string; - object: string; - deleted: boolean; -} diff --git a/files/operations.tsp b/files/operations.tsp deleted file mode 100644 index 2e601ae03..000000000 --- a/files/operations.tsp +++ /dev/null @@ -1,58 +0,0 @@ -import "@typespec/http"; -import "@typespec/openapi"; - -import "../common/errors.tsp"; -import "./models.tsp"; - -using TypeSpec.Http; -using TypeSpec.OpenAPI; - -namespace OpenAI; - -@route("/files") -interface Files { - @tag("OpenAI") - @get - @summary("Returns a list of files that belong to the user's organization.") - @operationId("listFiles") - listFiles(): ListFilesResponse | ErrorResponse; - - @tag("OpenAI") - @post - @summary("Upload a file.") - @operationId("createFile") - createFile( - @header contentType: "multipart/form-data", - @body file: CreateFileRequest, - ): OpenAIFile | ErrorResponse; - - @tag("OpenAI") - @get - @summary("Returns information about a specific file.") - @operationId("retrieveFile") - @route("/files/{file_id}") - retrieveFile( - /** The ID of the file to use for this request.
*/ - @path file_id: string, - ): OpenAIFile | ErrorResponse; - - @tag("OpenAI") - @delete - @summary("Delete a file") - @operationId("deleteFile") - @route("/files/{file_id}") - deleteFile( - /** The ID of the file to use for this request. */ - @path file_id: string, - ): DeleteFileResponse | ErrorResponse; - - @route("/files/{file_id}/content") - @tag("OpenAI") - @get - @summary("Returns the contents of the specified file.") - @operationId("downloadFile") - downloadFile( - /** The ID of the file to use for this request. */ - @path file_id: string, - ): string | ErrorResponse; -} diff --git a/fine-tuning/models.tsp b/fine-tuning/models.tsp deleted file mode 100644 index bf846072b..000000000 --- a/fine-tuning/models.tsp +++ /dev/null @@ -1,416 +0,0 @@ -namespace OpenAI; -using TypeSpec.OpenAPI; - -model FineTuningJob { - /** The object identifier, which can be referenced in the API endpoints. */ - id: string; - - /** The object type, which is always "fine_tuning.job". */ - object: "fine_tuning.job"; - - /** The Unix timestamp (in seconds) for when the fine-tuning job was created. */ - @encode("unixTimestamp", int32) - created_at: utcDateTime; - - /** - * The Unix timestamp (in seconds) for when the fine-tuning job was finished. The value will be - * null if the fine-tuning job is still running. - */ - @encode("unixTimestamp", int32) - finished_at: utcDateTime | null; - - /** The base model that is being fine-tuned. */ - `model`: string; - - /** - * The name of the fine-tuned model that is being created. The value will be null if the - * fine-tuning job is still running. - */ - fine_tuned_model: string | null; - - /** The organization that owns the fine-tuning job. */ - organization_id: string; - - /** - * The current status of the fine-tuning job, which can be either `created`, `pending`, `running`, - * `succeeded`, `failed`, or `cancelled`. 
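The timestamp fields in these models (`created_at`, `finished_at`, etc.) use `@encode("unixTimestamp", int32)`, so the wire format is integer Unix seconds. A small client-side decoding sketch; the sample value is the `created` timestamp from the chunk example earlier in this spec:

```python
from datetime import datetime, timezone

def decode_unix_timestamp(seconds):
    """Decode the integer Unix-seconds wire format produced by
    @encode("unixTimestamp", int32) into an aware UTC datetime."""
    return datetime.fromtimestamp(seconds, tz=timezone.utc)

created = decode_unix_timestamp(1677652288)  # value from the example payload
print(created.isoformat())  # 2023-03-01T06:31:28+00:00
```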
- */ - status: - | "created" - | "pending" - | "running" - | "succeeded" - | "failed" - | "cancelled"; - - /** - * The hyperparameters used for the fine-tuning job. See the - * [fine-tuning guide](/docs/guides/fine-tuning) for more details. - */ - hyperparameters: { - /** - * The number of epochs to train the model for. An epoch refers to one full cycle through the - * training dataset. - * - * "Auto" decides the optimal number of epochs based on the size of the dataset. If setting the - * number manually, we support any number between 1 and 50 epochs. - */ - n_epochs?: "auto" | NEpochs = "auto"; - }; - - /** - * The file ID used for training. You can retrieve the training data with the - * [Files API](/docs/api-reference/files/retrieve-contents). - */ - training_file: string; - - /** - * The file ID used for validation. You can retrieve the validation results with the - * [Files API](/docs/api-reference/files/retrieve-contents). - */ - validation_file: string | null; - - /** - * The compiled results file ID(s) for the fine-tuning job. You can retrieve the results with the - * [Files API](/docs/api-reference/files/retrieve-contents). - */ - result_files: string[]; - - /** - * The total number of billable tokens processed by this fine tuning job. The value will be null - * if the fine-tuning job is still running. - */ - trained_tokens: safeint | null; - - /** - * For fine-tuning jobs that have `failed`, this will contain more information on the cause of the - * failure. - */ - error: { - /** A human-readable error message. */ - message?: string; // likely should be required, but spec doesn't say so. - - /** A machine-readable error code. */ - code?: string; - - /** - * The parameter that was invalid, usually `training_file` or `validation_file`. This field - * will be null if the failure was not parameter-specific. 
- */ - param?: string | null; - } | null; -} - -model FineTuningEvent { - object: string; - - @encode("unixTimestamp", int32) - created_at: utcDateTime; - - level: string; - message: string; - data?: Record | null; - type?: "message" | "metrics"; // default is "none"? -} - -/** The `FineTune` object represents a legacy fine-tune job that has been created through the API. */ -#deprecated "deprecated" -model FineTune { - /** The object identifier, which can be referenced in the API endpoints. */ - id: string; - - /** The object type, which is always "fine-tune". */ - object: "fine-tune"; - - /** The Unix timestamp (in seconds) for when the fine-tuning job was created. */ - @encode("unixTimestamp", int32) - created_at: utcDateTime; - - /** The Unix timestamp (in seconds) for when the fine-tuning job was last updated. */ - @encode("unixTimestamp", int32) - updated_at: utcDateTime; - - /** The base model that is being fine-tuned. */ - `model`: string; - - /** The name of the fine-tuned model that is being created. */ - fine_tuned_model: string | null; - - /** The organization that owns the fine-tuning job. */ - organization_id: string; - - /** - * The current status of the fine-tuning job, which can be either `created`, `running`, - * `succeeded`, `failed`, or `cancelled`. - */ - status: "created" | "running" | "succeeded" | "failed" | "cancelled"; - - /** - * The hyperparameters used for the fine-tuning job. See the - * [fine-tuning guide](/docs/guides/legacy-fine-tuning/hyperparameters) for more details. - */ - hyperparams: { - /** - * The number of epochs to train the model for. An epoch refers to one full cycle through the - * training dataset. - */ - n_epochs: safeint; - - /** - * The batch size to use for training. The batch size is the number of training examples used to - * train a single forward and backward pass. - */ - batch_size: safeint; - - /** The weight to use for loss on the prompt tokens.
*/ - prompt_loss_weight: float64; - - /** The learning rate multiplier to use for training. */ - learning_rate_multiplier: float64; - - /** The classification metrics to compute using the validation dataset at the end of every epoch. */ - compute_classification_metrics?: boolean; - - /** The positive class to use for computing classification metrics. */ - classification_positive_class?: string; - - /** The number of classes to use for computing classification metrics. */ - classification_n_classes?: safeint; - }; - - /** The list of files used for training. */ - training_files: OpenAIFile[]; - - /** The list of files used for validation. */ - validation_files: OpenAIFile[]; - - /** The compiled results files for the fine-tuning job. */ - result_files: OpenAIFile[]; - - /** The list of events that have been observed in the lifecycle of the FineTune job. */ - events?: FineTuneEvent[]; -} - -model FineTuningJobEvent { - id: string; - object: string; - - @encode("unixTimestamp", int32) - created_at: utcDateTime; - - level: "info" | "warn" | "error"; - message: string; -} - -model FineTuneEvent { - object: string; - - @encode("unixTimestamp", int32) - created_at: utcDateTime; - - level: string; - message: string; -} - -model CreateFineTuningJobRequest { - /** - * The ID of an uploaded file that contains training data. - * - * See [upload file](/docs/api-reference/files/upload) for how to upload a file. - * - * Your dataset must be formatted as a JSONL file. Additionally, you must upload your file with - * the purpose `fine-tune`. - * - * See the [fine-tuning guide](/docs/guides/fine-tuning) for more details. - */ - training_file: string; - - /** - * The ID of an uploaded file that contains validation data. - * - * If you provide this file, the data is used to generate validation metrics periodically during - * fine-tuning. These metrics can be viewed in the fine-tuning results file. The same data should - * not be present in both train and validation files. 
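The request models above require training data uploaded as a JSONL file. A small pre-upload validation sketch; the `prompt`/`completion` key pair matches the legacy format described for `CreateFineTuneRequest` in this file, and the key set is illustrative:

```python
import json

def validate_jsonl(lines, required_keys=("prompt", "completion")):
    """Check that training data is valid JSONL where every non-empty line
    is a JSON object carrying the required keys."""
    for i, line in enumerate(lines, start=1):
        if not line.strip():
            continue  # blank lines carry no record
        record = json.loads(line)  # raises ValueError on malformed JSON
        missing = [k for k in required_keys if k not in record]
        if missing:
            raise ValueError(f"line {i} is missing keys: {missing}")
    return True

sample = [
    '{"prompt": "Say hi ->", "completion": " hi"}',
    '{"prompt": "Say bye ->", "completion": " bye"}',
]
print(validate_jsonl(sample))  # True
```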
- * - * Your dataset must be formatted as a JSONL file. You must upload your file with the purpose - * `fine-tune`. - * - * See the [fine-tuning guide](/docs/guides/fine-tuning) for more details. - */ - validation_file?: string | null; - - /** - * The name of the model to fine-tune. You can select one of the - * [supported models](/docs/guides/fine-tuning/what-models-can-be-fine-tuned). - */ - @extension("x-oaiTypeLabel", "string") - `model`: string | "babbage-002" | "davinci-002" | "gpt-3.5-turbo"; - - /** The hyperparameters used for the fine-tuning job. */ - hyperparameters?: { - /** - * The number of epochs to train the model for. An epoch refers to one full cycle through the - * training dataset. - */ - n_epochs?: "auto" | NEpochs = "auto"; - }; - - /** - * A string of up to 18 characters that will be added to your fine-tuned model name. - * - * For example, a `suffix` of "custom-model-name" would produce a model name like - * `ft:gpt-3.5-turbo:openai:custom-model-name:7p4lURel`. - */ - suffix?: SuffixString | null = null; -} - -@minValue(1) -@maxValue(50) -scalar NEpochs extends safeint; - -model ListFineTuningJobEventsResponse { - object: string; - data: FineTuningJobEvent[]; -} - -model CreateFineTuneRequest { - /** - * The ID of an uploaded file that contains training data. - * - * See [upload file](/docs/api-reference/files/upload) for how to upload a file. - * - * Your dataset must be formatted as a JSONL file, where each training example is a JSON object - * with the keys "prompt" and "completion". Additionally, you must upload your file with the - * purpose `fine-tune`. - * - * See the [fine-tuning guide](/docs/guides/legacy-fine-tuning/creating-training-data) for more - * details. - */ - training_file: string; - - /** - * The ID of an uploaded file that contains validation data. - * - * If you provide this file, the data is used to generate validation metrics periodically during - * fine-tuning. 
These metrics can be viewed in the - * [fine-tuning results file](/docs/guides/legacy-fine-tuning/analyzing-your-fine-tuned-model). - * Your train and validation data should be mutually exclusive. - * - * Your dataset must be formatted as a JSONL file, where each validation example is a JSON object - * with the keys "prompt" and "completion". Additionally, you must upload your file with the - * purpose `fine-tune`. - * - * See the [fine-tuning guide](/docs/guides/legacy-fine-tuning/creating-training-data) for more - * details. - */ - validation_file?: string | null; - - /** - * The name of the base model to fine-tune. You can select one of "ada", "babbage", "curie", - * "davinci", or a fine-tuned model created after 2022-04-21 and before 2023-08-22. To learn more - * about these models, see the [Models](/docs/models) documentation. - */ - @extension("x-oaiTypeLabel", "string") - `model`?: string | "ada" | "babbage" | "curie" | "davinci" | null; - - /** - * The number of epochs to train the model for. An epoch refers to one full cycle through the - * training dataset. - */ - n_epochs?: safeint | null = 4; - - /** - * The batch size to use for training. The batch size is the number of training examples used to - * train a single forward and backward pass. - * - * By default, the batch size will be dynamically configured to be ~0.2% of the number of examples - * in the training set, capped at 256 - in general, we've found that larger batch sizes tend to - * work better for larger datasets. - */ - batch_size?: safeint | null = null; - - /** - * The learning rate multiplier to use for training. The fine-tuning learning rate is the original - * learning rate used for pretraining multiplied by this value. - * - * By default, the learning rate multiplier is the 0.05, 0.1, or 0.2 depending on final - * `batch_size` (larger learning rates tend to perform better with larger batch sizes). 
We - * recommend experimenting with values in the range 0.02 to 0.2 to see what produces the best - * results. - */ - learning_rate_multiplier?: float64 | null = null; - - /** - * The weight to use for loss on the prompt tokens. This controls how much the model tries to - * learn to generate the prompt (as compared to the completion which always has a weight of 1.0), - * and can add a stabilizing effect to training when completions are short. - * - * If prompts are extremely long (relative to completions), it may make sense to reduce this - * weight so as to avoid over-prioritizing learning the prompt. - */ - prompt_loss_rate?: float64 | null = 0.01; - - /** - * If set, we calculate classification-specific metrics such as accuracy and F-1 score using the - * validation set at the end of every epoch. These metrics can be viewed in the - * [results file](/docs/guides/legacy-fine-tuning/analyzing-your-fine-tuned-model). - * - * In order to compute classification metrics, you must provide a `validation_file`. Additionally, - * you must specify `classification_n_classes` for multiclass classification or - * `classification_positive_class` for binary classification. - */ - compute_classification_metrics?: boolean | null = false; - - /** - * The number of classes in a classification task. - * - * This parameter is required for multiclass classification. - */ - classification_n_classes?: safeint | null = null; - - /** - * The positive class in binary classification. - * - * This parameter is needed to generate precision, recall, and F1 metrics when doing binary - * classification. - */ - classification_positive_class?: string | null = null; - - /** - * If this is provided, we calculate F-beta scores at the specified beta values. The F-beta score - * is a generalization of F-1 score. This is only used for binary classification. - * - * With a beta of 1 (i.e. the F-1 score), precision and recall are given the same weight. 
A larger - * beta score puts more weight on recall and less on precision. A smaller beta score puts more - * weight on precision and less on recall. - */ - classification_betas?: float64[] | null = null; - - /** - * A string of up to 18 characters that will be added to your fine-tuned model name. - * - * For example, a `suffix` of "custom-model-name" would produce a model name like - * `ada:ft-your-org:custom-model-name-2022-02-15-04-21-04`. - */ - suffix?: SuffixString | null = null; -} - -@minLength(1) -@maxLength(40) -scalar SuffixString extends string; - -model ListFineTunesResponse { - object: string; - data: FineTune[]; -} - -model ListFineTuneEventsResponse { - object: string; - data: FineTuneEvent[]; -} - -model ListPaginatedFineTuningJobsResponse { - object: string; - data: FineTuningJob[]; - has_more: boolean; -} diff --git a/fine-tuning/operations.tsp b/fine-tuning/operations.tsp deleted file mode 100644 index 15491f62e..000000000 --- a/fine-tuning/operations.tsp +++ /dev/null @@ -1,191 +0,0 @@ -import "@typespec/http"; -import "@typespec/openapi"; - -import "../common/errors.tsp"; -import "./models.tsp"; - -using TypeSpec.Http; -using TypeSpec.OpenAPI; - -namespace OpenAI; - -@route("/fine_tuning") -namespace FineTuning { - @route("jobs") - interface Jobs { - /** - * Creates a job that fine-tunes a specified model from a given dataset. - * - * Response includes details of the enqueued job including job status and the name of the - * fine-tuned models once complete. - * - * [Learn more about fine-tuning](/docs/guides/fine-tuning) - */ - @post - @tag("OpenAI") - @operationId("createFineTuningJob") - createFineTuningJob( - @body job: CreateFineTuningJobRequest, - ): FineTuningJob | ErrorResponse; - - @get - @tag("OpenAI") - @operationId("listPaginatedFineTuningJobs") - listPaginatedFineTuningJobs( - /** Identifier for the last job from the previous pagination request. */ - @query after?: string, - - /** Number of fine-tuning jobs to retrieve. 
*/ - @query limit?: safeint = 20, - ): ListPaginatedFineTuningJobsResponse | ErrorResponse; - - @summary(""" - Get info about a fine-tuning job. - - [Learn more about fine-tuning](/docs/guides/fine-tuning) - """) - @route("{fine_tuning_job_id}") - @tag("OpenAI") - @get - @operationId("retrieveFineTuningJob") - retrieveFineTuningJob( - @path fine_tuning_job_id: string, - ): FineTuningJob | ErrorResponse; - - @summary("Get status updates for a fine-tuning job.") - @tag("OpenAI") - @route("{fine_tuning_job_id}/events") - @get - @operationId("listFineTuningEvents") - listFineTuningEvents( - /** The ID of the fine-tuning job to get events for. */ - @path fine_tuning_job_id: string, - - /** Identifier for the last event from the previous pagination request. */ - @query after?: string, - - /** Number of events to retrieve. */ - @query limit?: integer = 20, - ): ListFineTuningJobEventsResponse | ErrorResponse; - - @summary("Immediately cancel a fine-tune job.") - @tag("OpenAI") - @route("{fine_tuning_job_id}/cancel") - @post - @operationId("cancelFineTuningJob") - cancelFineTuningJob( - /** The ID of the fine-tuning job to cancel. */ - @path fine_tuning_job_id: string, - ): FineTuningJob | ErrorResponse; - } -} - -@route("/fine-tunes") -interface FineTunes { - #deprecated "deprecated" - @post - @tag("OpenAI") - @summary(""" - Creates a job that fine-tunes a specified model from a given dataset. - - Response includes details of the enqueued job including job status and the name of the fine-tuned models once complete. 
- - [Learn more about fine-tuning](/docs/guides/legacy-fine-tuning) - """) - @operationId("createFineTune") - createFineTune( - @body fine_tune: CreateFineTuneRequest, - ): FineTune | ErrorResponse; - - #deprecated "deprecated" - @get - @tag("OpenAI") - @summary("List your organization's fine-tuning jobs") - @operationId("listFineTunes") - listFineTunes(): ListFineTunesResponse | ErrorResponse; - - #deprecated "deprecated" - @get - @route("{fine_tune_id}") - @tag("OpenAI") - @summary(""" - Gets info about the fine-tune job. - - [Learn more about fine-tuning](/docs/guides/legacy-fine-tuning) - """) - @operationId("retrieveFineTune") - retrieveFineTune( - /** The ID of the fine-tune job */ - @path fine_tune_id: string, - ): FineTune | ErrorResponse; - - #deprecated "deprecated" - @route("{fine_tune_id}/events") - @get - @tag("OpenAI") - @summary("Get fine-grained status updates for a fine-tune job.") - @operationId("listFineTuneEvents") - listFineTuneEvents( - /** The ID of the fine-tune job to get events for. */ - @path fine_tune_id: string, - - /** - * Whether to stream events for the fine-tune job. If set to true, events will be sent as - * data-only - * [server-sent events](https://developer.mozilla.org/en-US/docs/Web/API/Server-sent_events/Using_server-sent_events#Event_stream_format) - * as they become available. The stream will terminate with a `data: [DONE]` message when the - * job is finished (succeeded, cancelled, or failed). - * - * If set to false, only events generated so far will be returned. 
- */ - @query stream?: boolean = false, - ): ListFineTuneEventsResponse | ErrorResponse; - - #deprecated "deprecated" - @route("{fine_tune_id}/cancel") - @post - @tag("OpenAI") - @summary("Immediately cancel a fine-tune job.") - @operationId("cancelFineTune") - cancelFineTune( - /** The ID of the fine-tune job to cancel */ - @path fine_tune_id: string, - ): FineTune | ErrorResponse; -} - -@route("/models") -interface Models { - @get - @tag("OpenAI") - @summary(""" - Lists the currently available models, and provides basic information about each one such as the - owner and availability. - """) - @operationId("listModels") - listModels(): ListModelsResponse | ErrorResponse; - - @get - @route("{model}") - @operationId("retrieveModel") - @tag("OpenAI") - @summary(""" - Retrieves a model instance, providing basic information about the model such as the owner and - permissioning. - """) - retrieve( - /** The ID of the model to use for this request. */ - @path `model`: string, - ): Model | ErrorResponse; - - @delete - @route("{model}") - @operationId("deleteModel") - @tag("OpenAI") - @summary(""" - Delete a fine-tuned model. You must have the Owner role in your organization to delete a model. - """) - delete( - /** The model to delete */ - @path `model`: string, - ): DeleteModelResponse | ErrorResponse; -} diff --git a/images/models.tsp b/images/models.tsp deleted file mode 100644 index 3d7020b51..000000000 --- a/images/models.tsp +++ /dev/null @@ -1,80 +0,0 @@ -import "../common/models.tsp"; - -namespace OpenAI; -using TypeSpec.OpenAPI; - -alias SharedImageProperties = { - /** The number of images to generate. Must be between 1 and 10. */ - n?: ImagesN | null = 1; - - /** The size of the generated images. Must be one of `256x256`, `512x512`, or `1024x1024`. */ - size?: IMAGE_SIZES | null = "1024x1024"; - - /** The format in which the generated images are returned. Must be one of `url` or `b64_json`. 
*/ - response_format?: "url" | "b64_json" | null = "url"; - - user?: User; -}; - -model CreateImageRequest { - /** A text description of the desired image(s). The maximum length is 1000 characters. */ - prompt: string; - - ...SharedImageProperties; -} - -model ImagesResponse { - @encode("unixTimestamp", int32) - created: utcDateTime; - - data: Image[]; -} - -alias IMAGE_SIZES = "256x256" | "512x512" | "1024x1024"; - -/** Represents the url or the content of an image generated by the OpenAI API. */ -model Image { - /** The URL of the generated image, if `response_format` is `url` (default). */ - url?: url; - - /** The base64-encoded JSON of the generated image, if `response_format` is `b64_json`. */ - @encode("base64", string) - b64_json?: bytes; -} - -model CreateImageEditRequest { - /** A text description of the desired image(s). The maximum length is 1000 characters. */ - prompt: string; - - /** - * The image to edit. Must be a valid PNG file, less than 4MB, and square. If mask is not - * provided, image must have transparency, which will be used as the mask. - */ - @encode("binary") - image: bytes; - - /** - * An additional image whose fully transparent areas (e.g. where alpha is zero) indicate where - * `image` should be edited. Must be a valid PNG file, less than 4MB, and have the same dimensions - * as `image`. - */ - @encode("binary") - mask?: bytes; - - ...SharedImageProperties; -} - -model CreateImageVariationRequest { - /** - * The image to use as the basis for the variation(s). Must be a valid PNG file, less than 4MB, - * and square. 
- */ - @encode("binary") - image: bytes; - - ...SharedImageProperties; -} - -@minValue(1) -@maxValue(10) -scalar ImagesN extends safeint; diff --git a/moderation/models.tsp b/moderation/models.tsp deleted file mode 100644 index f47b21be1..000000000 --- a/moderation/models.tsp +++ /dev/null @@ -1,123 +0,0 @@ -namespace OpenAI; -using TypeSpec.OpenAPI; - -model CreateModerationRequest { - /** The input text to classify */ - input: string | string[]; - - /** - * Two content moderation models are available: `text-moderation-stable` and - * `text-moderation-latest`. The default is `text-moderation-latest`, which will be automatically - * upgraded over time. This ensures you are always using our most accurate model. If you use - * `text-moderation-stable`, we will provide advance notice before updating the model. Accuracy - * of `text-moderation-stable` may be slightly lower than for `text-moderation-latest`. - */ - @extension("x-oaiTypeLabel", "string") - `model`?: string | "text-moderation-latest" | "text-moderation-stable" = "text-moderation-latest"; -} - -model CreateModerationResponse { - /** The unique identifier for the moderation request. */ - id: string; - - /** The model used to generate the moderation results. */ - `model`: string; - - /** A list of moderation objects. */ - results: { - /** Whether the content violates [OpenAI's usage policies](/policies/usage-policies). */ - flagged: boolean; - - /** A list of the categories, and whether they are flagged or not. */ - categories: { - /** - * Content that expresses, incites, or promotes hate based on race, gender, ethnicity, - * religion, nationality, sexual orientation, disability status, or caste. Hateful content - * aimed at non-protected groups (e.g., chess players) is harassment.
- */ - hate: boolean; - - /** - * Hateful content that also includes violence or serious harm towards the targeted group - * based on race, gender, ethnicity, religion, nationality, sexual orientation, disability - * status, or caste. - */ - `hate/threatening`: boolean; - - /** Content that expresses, incites, or promotes harassing language towards any target. */ - harassment: boolean; - - /** Harassment content that also includes violence or serious harm towards any target. */ - `harassment/threatening`: boolean; - - /** - * Content that promotes, encourages, or depicts acts of self-harm, such as suicide, cutting, - * and eating disorders. - */ - `self-harm`: boolean; - - /** - * Content where the speaker expresses that they are engaging or intend to engage in acts of - * self-harm, such as suicide, cutting, and eating disorders. - */ - `self-harm/intent`: boolean; - - /** - * Content that encourages performing acts of self-harm, such as suicide, cutting, and eating - * disorders, or that gives instructions or advice on how to commit such acts. - */ - `self-harm/instructive`: boolean; - - /** - * Content meant to arouse sexual excitement, such as the description of sexual activity, or - * that promotes sexual services (excluding sex education and wellness). - */ - sexual: boolean; - - /** Sexual content that includes an individual who is under 18 years old. */ - `sexual/minors`: boolean; - - /** Content that depicts death, violence, or physical injury. */ - violence: boolean; - - /** Content that depicts death, violence, or physical injury in graphic detail. */ - `violence/graphic`: boolean; - }; - - /** A list of the categories along with their scores as predicted by the model. */ - category_scores: { - /** The score for the category 'hate'. */ - hate: float64; - - /** The score for the category 'hate/threatening'. */ - `hate/threatening`: float64; - - /** The score for the category 'harassment'.
*/ - harassment: float64; - - /** The score for the category 'harassment/threatening'. */ - `harassment/threatening`: float64; - - /** The score for the category 'self-harm'. */ - `self-harm`: float64; - - /** The score for the category 'self-harm/intent'. */ - `self-harm/intent`: float64; - - /** The score for the category 'self-harm/instructive'. */ - `self-harm/instructive`: float64; - - /** The score for the category 'sexual'. */ - sexual: float64; - - /** The score for the category 'sexual/minors'. */ - `sexual/minors`: float64; - - /** The score for the category 'violence'. */ - violence: float64; - - /** The score for the category 'violence/graphic'. */ - `violence/graphic`: float64; - }; - }[]; -} diff --git a/openapi.yaml b/openapi.yaml deleted file mode 100644 index 011ccf375..000000000 --- a/openapi.yaml +++ /dev/null @@ -1,4382 +0,0 @@ -openapi: 3.0.0 -info: - title: OpenAI API - description: The OpenAI REST API. Please see https://platform.openai.com/docs/api-reference for more details. - version: "2.0.0" - termsOfService: https://openai.com/policies/terms-of-use - contact: - name: OpenAI Support - url: https://help.openai.com/ - license: - name: MIT - url: https://github.com/openai/openai-openapi/blob/master/LICENSE -servers: - - url: https://api.openai.com/v1 -tags: - - name: OpenAI - description: The OpenAI REST API -paths: - # Note: When adding an endpoint, make sure you also add it in the `groups` section, in the end of this file, - # under the appropriate group - /chat/completions: - post: - operationId: createChatCompletion - tags: - - OpenAI - summary: Creates a model response for the given chat conversation. 
- requestBody: - required: true - content: - application/json: - schema: - $ref: "#/components/schemas/CreateChatCompletionRequest" - responses: - "200": - description: OK - content: - application/json: - schema: - $ref: "#/components/schemas/CreateChatCompletionResponse" - - x-oaiMeta: - name: Create chat completion - group: chat - returns: | - Returns a [chat completion](/docs/api-reference/chat/object) object, or a streamed sequence of [chat completion chunk](/docs/api-reference/chat/streaming) objects if the request is streamed. - path: create - examples: - - title: No Streaming - request: - curl: | - curl https://api.openai.com/v1/chat/completions \ - -H "Content-Type: application/json" \ - -H "Authorization: Bearer $OPENAI_API_KEY" \ - -d '{ - "model": "VAR_model_id", - "messages": [ - { - "role": "system", - "content": "You are a helpful assistant." - }, - { - "role": "user", - "content": "Hello!" - } - ] - }' - python: | - import os - import openai - openai.api_key = os.getenv("OPENAI_API_KEY") - - completion = openai.ChatCompletion.create( - model="VAR_model_id", - messages=[ - {"role": "system", "content": "You are a helpful assistant."}, - {"role": "user", "content": "Hello!"} - ] - ) - - print(completion.choices[0].message) - node.js: |- - import OpenAI from "openai"; - - const openai = new OpenAI(); - - async function main() { - const completion = await openai.chat.completions.create({ - messages: [{ role: "system", content: "You are a helpful assistant." 
}], - model: "VAR_model_id", - }); - - console.log(completion.choices[0]); - } - - main(); - response: &chat_completion_example | - { - "id": "chatcmpl-123", - "object": "chat.completion", - "created": 1677652288, - "model": "gpt-3.5-turbo-0613", - "choices": [{ - "index": 0, - "message": { - "role": "assistant", - "content": "\n\nHello there, how may I assist you today?" - }, - "finish_reason": "stop" - }], - "usage": { - "prompt_tokens": 9, - "completion_tokens": 12, - "total_tokens": 21 - } - } - - title: Streaming - request: - curl: | - curl https://api.openai.com/v1/chat/completions \ - -H "Content-Type: application/json" \ - -H "Authorization: Bearer $OPENAI_API_KEY" \ - -d '{ - "model": "VAR_model_id", - "messages": [ - { - "role": "system", - "content": "You are a helpful assistant." - }, - { - "role": "user", - "content": "Hello!" - } - ], - "stream": true - }' - python: | - import os - import openai - openai.api_key = os.getenv("OPENAI_API_KEY") - - completion = openai.ChatCompletion.create( - model="VAR_model_id", - messages=[ - {"role": "system", "content": "You are a helpful assistant."}, - {"role": "user", "content": "Hello!"} - ], - stream=True - ) - - for chunk in completion: - print(chunk.choices[0].delta) - - node.js: |- - import OpenAI from "openai"; - - const openai = new OpenAI(); - - async function main() { - const completion = await openai.chat.completions.create({ - model: "VAR_model_id", - messages: [ - {"role": "system", "content": "You are a helpful assistant."}, - {"role": "user", "content": "Hello!"} - ], - stream: true, - }); - - for await (const chunk of completion) { - console.log(chunk.choices[0].delta.content); - } - } - - main(); - response: &chat_completion_chunk_example | - { - "id": "chatcmpl-123", - "object": "chat.completion.chunk", - "created": 1677652288, - "model": "gpt-3.5-turbo", - "choices": [{ - "index": 0, - "delta": { - "content": "Hello" - }, - "finish_reason": "stop" - }] - } - /completions: - post: -
operationId: createCompletion - tags: - - OpenAI - summary: Creates a completion for the provided prompt and parameters. - requestBody: - required: true - content: - application/json: - schema: - $ref: "#/components/schemas/CreateCompletionRequest" - responses: - "200": - description: OK - content: - application/json: - schema: - $ref: "#/components/schemas/CreateCompletionResponse" - x-oaiMeta: - name: Create completion - returns: | - Returns a [completion](/docs/api-reference/completions/object) object, or a sequence of completion objects if the request is streamed. - legacy: true - examples: - - title: No streaming - request: - curl: | - curl https://api.openai.com/v1/completions \ - -H "Content-Type: application/json" \ - -H "Authorization: Bearer $OPENAI_API_KEY" \ - -d '{ - "model": "VAR_model_id", - "prompt": "Say this is a test", - "max_tokens": 7, - "temperature": 0 - }' - python: | - import os - import openai - openai.api_key = os.getenv("OPENAI_API_KEY") - openai.Completion.create( - model="VAR_model_id", - prompt="Say this is a test", - max_tokens=7, - temperature=0 - ) - node.js: |- - import OpenAI from "openai"; - - const openai = new OpenAI(); - - async function main() { - const completion = await openai.completions.create({ - model: "VAR_model_id", - prompt: "Say this is a test.", - max_tokens: 7, - temperature: 0, - }); - - console.log(completion); - } - main(); - response: | - { - "id": "cmpl-uqkvlQyYK7bGYrRHQ0eXlWi7", - "object": "text_completion", - "created": 1589478378, - "model": "VAR_model_id", - "choices": [ - { - "text": "\n\nThis is indeed a test", - "index": 0, - "logprobs": null, - "finish_reason": "length" - } - ], - "usage": { - "prompt_tokens": 5, - "completion_tokens": 7, - "total_tokens": 12 - } - } - - title: Streaming - request: - curl: | - curl https://api.openai.com/v1/completions \ - -H "Content-Type: application/json" \ - -H "Authorization: Bearer $OPENAI_API_KEY" \ - -d '{ - "model": "VAR_model_id", - "prompt": "Say this is 
a test", - "max_tokens": 7, - "temperature": 0, - "stream": true - }' - python: | - import os - import openai - openai.api_key = os.getenv("OPENAI_API_KEY") - for chunk in openai.Completion.create( - model="VAR_model_id", - prompt="Say this is a test", - max_tokens=7, - temperature=0, - stream=True - ): - print(chunk['choices'][0]['text']) - node.js: |- - import OpenAI from "openai"; - - const openai = new OpenAI(); - - async function main() { - const stream = await openai.completions.create({ - model: "VAR_model_id", - prompt: "Say this is a test.", - stream: true, - }); - - for await (const chunk of stream) { - console.log(chunk.choices[0].text) - } - } - main(); - response: | - { - "id": "cmpl-7iA7iJjj8V2zOkCGvWF2hAkDWBQZe", - "object": "text_completion", - "created": 1690759702, - "choices": [ - { - "text": "This", - "index": 0, - "logprobs": null, - "finish_reason": null - } - ], - "model": "gpt-3.5-turbo-instruct" - } - /edits: - post: - operationId: createEdit - deprecated: true - tags: - - OpenAI - summary: Creates a new edit for the provided input, instruction, and parameters. - requestBody: - required: true - content: - application/json: - schema: - $ref: "#/components/schemas/CreateEditRequest" - responses: - "200": - description: OK - content: - application/json: - schema: - $ref: "#/components/schemas/CreateEditResponse" - x-oaiMeta: - name: Create edit - returns: | - Returns an [edit](/docs/api-reference/edits/object) object. 
- group: edits - examples: - request: - curl: | - curl https://api.openai.com/v1/edits \ - -H "Content-Type: application/json" \ - -H "Authorization: Bearer $OPENAI_API_KEY" \ - -d '{ - "model": "VAR_model_id", - "input": "What day of the wek is it?", - "instruction": "Fix the spelling mistakes" - }' - python: | - import os - import openai - openai.api_key = os.getenv("OPENAI_API_KEY") - openai.Edit.create( - model="VAR_model_id", - input="What day of the wek is it?", - instruction="Fix the spelling mistakes" - ) - node.js: |- - import OpenAI from "openai"; - - const openai = new OpenAI(); - - async function main() { - const edit = await openai.edits.create({ - model: "VAR_model_id", - input: "What day of the wek is it?", - instruction: "Fix the spelling mistakes.", - }); - - console.log(edit); - } - - main(); - response: &edit_example | - { - "object": "edit", - "created": 1589478378, - "choices": [ - { - "text": "What day of the week is it?", - "index": 0 - } - ], - "usage": { - "prompt_tokens": 25, - "completion_tokens": 32, - "total_tokens": 57 - } - } - - /images/generations: - post: - operationId: createImage - tags: - - OpenAI - summary: Creates an image given a prompt. - requestBody: - required: true - content: - application/json: - schema: - $ref: "#/components/schemas/CreateImageRequest" - responses: - "200": - description: OK - content: - application/json: - schema: - $ref: "#/components/schemas/ImagesResponse" - x-oaiMeta: - name: Create image - returns: Returns a list of [image](/docs/api-reference/images/object) objects.
- examples: - request: - curl: | - curl https://api.openai.com/v1/images/generations \ - -H "Content-Type: application/json" \ - -H "Authorization: Bearer $OPENAI_API_KEY" \ - -d '{ - "prompt": "A cute baby sea otter", - "n": 2, - "size": "1024x1024" - }' - python: | - import os - import openai - openai.api_key = os.getenv("OPENAI_API_KEY") - openai.Image.create( - prompt="A cute baby sea otter", - n=2, - size="1024x1024" - ) - node.js: |- - import OpenAI from "openai"; - - const openai = new OpenAI(); - - async function main() { - const image = await openai.images.generate({ prompt: "A cute baby sea otter" }); - - console.log(image.data); - } - main(); - response: | - { - "created": 1589478378, - "data": [ - { - "url": "https://..." - }, - { - "url": "https://..." - } - ] - } - - /images/edits: - post: - operationId: createImageEdit - tags: - - OpenAI - summary: Creates an edited or extended image given an original image and a prompt. - requestBody: - required: true - content: - multipart/form-data: - schema: - $ref: "#/components/schemas/CreateImageEditRequest" - responses: - "200": - description: OK - content: - application/json: - schema: - $ref: "#/components/schemas/ImagesResponse" - x-oaiMeta: - name: Create image edit - returns: Returns a list of [image](/docs/api-reference/images/object) objects. 
- examples: - request: - curl: | - curl https://api.openai.com/v1/images/edits \ - -H "Authorization: Bearer $OPENAI_API_KEY" \ - -F image="@otter.png" \ - -F mask="@mask.png" \ - -F prompt="A cute baby sea otter wearing a beret" \ - -F n=2 \ - -F size="1024x1024" - python: | - import os - import openai - openai.api_key = os.getenv("OPENAI_API_KEY") - openai.Image.create_edit( - image=open("otter.png", "rb"), - mask=open("mask.png", "rb"), - prompt="A cute baby sea otter wearing a beret", - n=2, - size="1024x1024" - ) - node.js: |- - import fs from "fs"; - import OpenAI from "openai"; - - const openai = new OpenAI(); - - async function main() { - const image = await openai.images.edit({ - image: fs.createReadStream("otter.png"), - mask: fs.createReadStream("mask.png"), - prompt: "A cute baby sea otter wearing a beret", - }); - - console.log(image.data); - } - main(); - response: | - { - "created": 1589478378, - "data": [ - { - "url": "https://..." - }, - { - "url": "https://..." - } - ] - } - - /images/variations: - post: - operationId: createImageVariation - tags: - - OpenAI - summary: Creates a variation of a given image. - requestBody: - required: true - content: - multipart/form-data: - schema: - $ref: "#/components/schemas/CreateImageVariationRequest" - responses: - "200": - description: OK - content: - application/json: - schema: - $ref: "#/components/schemas/ImagesResponse" - x-oaiMeta: - name: Create image variation - returns: Returns a list of [image](/docs/api-reference/images/object) objects. 
- examples: - request: - curl: | - curl https://api.openai.com/v1/images/variations \ - -H "Authorization: Bearer $OPENAI_API_KEY" \ - -F image="@otter.png" \ - -F n=2 \ - -F size="1024x1024" - python: | - import os - import openai - openai.api_key = os.getenv("OPENAI_API_KEY") - openai.Image.create_variation( - image=open("otter.png", "rb"), - n=2, - size="1024x1024" - ) - node.js: |- - import fs from "fs"; - import OpenAI from "openai"; - - const openai = new OpenAI(); - - async function main() { - const image = await openai.images.createVariation({ - image: fs.createReadStream("otter.png"), - }); - - console.log(image.data); - } - main(); - response: | - { - "created": 1589478378, - "data": [ - { - "url": "https://..." - }, - { - "url": "https://..." - } - ] - } - - /embeddings: - post: - operationId: createEmbedding - tags: - - OpenAI - summary: Creates an embedding vector representing the input text. - requestBody: - required: true - content: - application/json: - schema: - $ref: "#/components/schemas/CreateEmbeddingRequest" - responses: - "200": - description: OK - content: - application/json: - schema: - $ref: "#/components/schemas/CreateEmbeddingResponse" - x-oaiMeta: - name: Create embeddings - returns: A list of [embedding](/docs/api-reference/embeddings/object) objects. - examples: - request: - curl: | - curl https://api.openai.com/v1/embeddings \ - -H "Authorization: Bearer $OPENAI_API_KEY" \ - -H "Content-Type: application/json" \ - -d '{ - "input": "The food was delicious and the waiter...", - "model": "text-embedding-ada-002" - }' - python: | - import os - import openai - openai.api_key = os.getenv("OPENAI_API_KEY") - openai.Embedding.create( - model="text-embedding-ada-002", - input="The food was delicious and the waiter..." 
- ) - node.js: |- - import OpenAI from "openai"; - - const openai = new OpenAI(); - - async function main() { - const embedding = await openai.embeddings.create({ - model: "text-embedding-ada-002", - input: "The quick brown fox jumped over the lazy dog", - }); - - console.log(embedding); - } - - main(); - response: | - { - "object": "list", - "data": [ - { - "object": "embedding", - "embedding": [ - 0.0023064255, - -0.009327292, - .... (1536 floats total for ada-002) - -0.0028842222, - ], - "index": 0 - } - ], - "model": "text-embedding-ada-002", - "usage": { - "prompt_tokens": 8, - "total_tokens": 8 - } - } - - /audio/transcriptions: - post: - operationId: createTranscription - tags: - - OpenAI - summary: Transcribes audio into the input language. - requestBody: - required: true - content: - multipart/form-data: - schema: - $ref: "#/components/schemas/CreateTranscriptionRequest" - responses: - "200": - description: OK - content: - application/json: - schema: - $ref: "#/components/schemas/CreateTranscriptionResponse" - x-oaiMeta: - name: Create transcription - returns: The transcribed text.
- examples: - request: - curl: | - curl https://api.openai.com/v1/audio/transcriptions \ - -H "Authorization: Bearer $OPENAI_API_KEY" \ - -H "Content-Type: multipart/form-data" \ - -F file="@/path/to/file/audio.mp3" \ - -F model="whisper-1" - python: | - import os - import openai - openai.api_key = os.getenv("OPENAI_API_KEY") - audio_file = open("audio.mp3", "rb") - transcript = openai.Audio.transcribe("whisper-1", audio_file) - node: |- - import fs from "fs"; - import OpenAI from "openai"; - - const openai = new OpenAI(); - - async function main() { - const transcription = await openai.audio.transcriptions.create({ - file: fs.createReadStream("audio.mp3"), - model: "whisper-1", - }); - - console.log(transcription.text); - } - main(); - response: | - { - "text": "Imagine the wildest idea that you've ever had, and you're curious about how it might scale to something that's a 100, a 1,000 times bigger. This is a place where you can get to do that." - } - - /audio/translations: - post: - operationId: createTranslation - tags: - - OpenAI - summary: Translates audio into English. - requestBody: - required: true - content: - multipart/form-data: - schema: - $ref: "#/components/schemas/CreateTranslationRequest" - responses: - "200": - description: OK - content: - application/json: - schema: - $ref: "#/components/schemas/CreateTranslationResponse" - x-oaiMeta: - name: Create translation - returns: The translated text. 
- examples: - request: - curl: | - curl https://api.openai.com/v1/audio/translations \ - -H "Authorization: Bearer $OPENAI_API_KEY" \ - -H "Content-Type: multipart/form-data" \ - -F file="@/path/to/file/german.m4a" \ - -F model="whisper-1" - python: | - import os - import openai - openai.api_key = os.getenv("OPENAI_API_KEY") - audio_file = open("german.m4a", "rb") - transcript = openai.Audio.translate("whisper-1", audio_file) - node: | - const fs = require("fs"); - const { Configuration, OpenAIApi } = require("openai"); - const configuration = new Configuration({ - apiKey: process.env.OPENAI_API_KEY, - }); - const openai = new OpenAIApi(configuration); - const resp = await openai.createTranslation( - fs.createReadStream("german.m4a"), - "whisper-1" - ); - response: | - { - "text": "Hello, my name is Wolfgang and I come from Germany. Where are you heading today?" - } - - /files: - get: - operationId: listFiles - tags: - - OpenAI - summary: Returns a list of files that belong to the user's organization. - responses: - "200": - description: OK - content: - application/json: - schema: - $ref: "#/components/schemas/ListFilesResponse" - x-oaiMeta: - name: List files - returns: A list of [file](/docs/api-reference/files/object) objects.
- examples: - request: - curl: | - curl https://api.openai.com/v1/files \ - -H "Authorization: Bearer $OPENAI_API_KEY" - python: | - import os - import openai - openai.api_key = os.getenv("OPENAI_API_KEY") - openai.File.list() - node.js: |- - import OpenAI from "openai"; - - const openai = new OpenAI(); - - async function main() { - const list = await openai.files.list(); - - for await (const file of list) { - console.log(file); - } - } - - main(); - response: | - { - "data": [ - { - "id": "file-abc123", - "object": "file", - "bytes": 175, - "created_at": 1613677385, - "filename": "train.jsonl", - "purpose": "search" - }, - { - "id": "file-abc123", - "object": "file", - "bytes": 140, - "created_at": 1613779121, - "filename": "puppy.jsonl", - "purpose": "search" - } - ], - "object": "list" - } - post: - operationId: createFile - tags: - - OpenAI - summary: | - Upload a file that contains document(s) to be used across various endpoints/features. Currently, the size of all the files uploaded by one organization can be up to 1 GB. Please contact us if you need to increase the storage limit. - - requestBody: - required: true - content: - multipart/form-data: - schema: - $ref: "#/components/schemas/CreateFileRequest" - responses: - "200": - description: OK - content: - application/json: - schema: - $ref: "#/components/schemas/OpenAIFile" - x-oaiMeta: - name: Upload file - returns: The uploaded [file](/docs/api-reference/files/object) object. 
- examples: - request: - curl: | - curl https://api.openai.com/v1/files \ - -H "Authorization: Bearer $OPENAI_API_KEY" \ - -F purpose="fine-tune" \ - -F file="@mydata.jsonl" - python: | - import os - import openai - openai.api_key = os.getenv("OPENAI_API_KEY") - openai.File.create( - file=open("mydata.jsonl", "rb"), - purpose='fine-tune' - ) - node.js: |- - import fs from "fs"; - import OpenAI from "openai"; - - const openai = new OpenAI(); - - async function main() { - const file = await openai.files.create({ - file: fs.createReadStream("mydata.jsonl"), - purpose: "fine-tune", - }); - - console.log(file); - } - - main(); - response: | - { - "id": "file-abc123", - "object": "file", - "bytes": 140, - "created_at": 1613779121, - "filename": "mydata.jsonl", - "purpose": "fine-tune", - "status": "uploaded" | "processed" | "pending" | "error" - } - - /files/{file_id}: - delete: - operationId: deleteFile - tags: - - OpenAI - summary: Delete a file. - parameters: - - in: path - name: file_id - required: true - schema: - type: string - description: The ID of the file to use for this request. - responses: - "200": - description: OK - content: - application/json: - schema: - $ref: "#/components/schemas/DeleteFileResponse" - x-oaiMeta: - name: Delete file - returns: Deletion status. - examples: - request: - curl: | - curl https://api.openai.com/v1/files/file-abc123 \ - -X DELETE \ - -H "Authorization: Bearer $OPENAI_API_KEY" - python: | - import os - import openai - openai.api_key = os.getenv("OPENAI_API_KEY") - openai.File.delete("file-abc123") - node.js: |- - import OpenAI from "openai"; - - const openai = new OpenAI(); - - async function main() { - const file = await openai.files.del("file-abc123"); - - console.log(file); - } - - main(); - response: | - { - "id": "file-abc123", - "object": "file", - "deleted": true - } - get: - operationId: retrieveFile - tags: - - OpenAI - summary: Returns information about a specific file. 
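The upload endpoint above expects a `.jsonl` file uploaded with `purpose="fine-tune"`, where each line is one standalone JSON record. A minimal sketch of preparing and validating such a file with the standard library (the chat-format records are illustrative assumptions, not a schema defined in this spec):

```python
import json
import os
import tempfile

# Illustrative training records; consult the fine-tuning guide for the
# exact record schema your target model expects.
examples = [
    {"messages": [{"role": "user", "content": "Hi"},
                  {"role": "assistant", "content": "Hello!"}]},
    {"messages": [{"role": "user", "content": "Ping"},
                  {"role": "assistant", "content": "Pong"}]},
]

path = os.path.join(tempfile.mkdtemp(), "mydata.jsonl")
with open(path, "w", encoding="utf-8") as f:
    for record in examples:
        # One JSON object per line -- the JSONL convention.
        f.write(json.dumps(record) + "\n")

# Verify every line parses back as JSON before uploading the file.
with open(path, encoding="utf-8") as f:
    parsed = [json.loads(line) for line in f]
print(len(parsed))  # → 2
```

Validating each line locally avoids a round trip through the file `status` lifecycle (`uploaded`/`processed`/`error`) shown in the response example above.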
- parameters: - - in: path - name: file_id - required: true - schema: - type: string - description: The ID of the file to use for this request. - responses: - "200": - description: OK - content: - application/json: - schema: - $ref: "#/components/schemas/OpenAIFile" - x-oaiMeta: - name: Retrieve file - returns: The [file](/docs/api-reference/files/object) object matching the specified ID. - examples: - request: - curl: | - curl https://api.openai.com/v1/files/file-abc123 \ - -H "Authorization: Bearer $OPENAI_API_KEY" - python: | - import os - import openai - openai.api_key = os.getenv("OPENAI_API_KEY") - openai.File.retrieve("file-abc123") - node.js: |- - import OpenAI from "openai"; - - const openai = new OpenAI(); - - async function main() { - const file = await openai.files.retrieve("file-abc123"); - - console.log(file); - } - - main(); - response: | - { - "id": "file-abc123", - "object": "file", - "bytes": 140, - "created_at": 1613779657, - "filename": "mydata.jsonl", - "purpose": "fine-tune" - } - - /files/{file_id}/content: - get: - operationId: downloadFile - tags: - - OpenAI - summary: Returns the contents of the specified file. - parameters: - - in: path - name: file_id - required: true - schema: - type: string - description: The ID of the file to use for this request. - responses: - "200": - description: OK - content: - application/json: - schema: - type: string - x-oaiMeta: - name: Retrieve file content - returns: The file content. 
- examples: - request: - curl: | - curl https://api.openai.com/v1/files/file-abc123/content \ - -H "Authorization: Bearer $OPENAI_API_KEY" > file.jsonl - python: | - import os - import openai - openai.api_key = os.getenv("OPENAI_API_KEY") - content = openai.File.download("file-abc123") - node.js: | - import OpenAI from "openai"; - - const openai = new OpenAI(); - - async function main() { - const file = await openai.files.retrieveContent("file-abc123"); - - console.log(file); - } - - main(); - - /fine_tuning/jobs: - post: - operationId: createFineTuningJob - tags: - - OpenAI - summary: | - Creates a job that fine-tunes a specified model from a given dataset. - - Response includes details of the enqueued job including job status and the name of the fine-tuned models once complete. - - [Learn more about fine-tuning](/docs/guides/fine-tuning) - requestBody: - required: true - content: - application/json: - schema: - $ref: "#/components/schemas/CreateFineTuningJobRequest" - responses: - "200": - description: OK - content: - application/json: - schema: - $ref: "#/components/schemas/FineTuningJob" - x-oaiMeta: - name: Create fine-tuning job - returns: A [fine-tuning.job](/docs/api-reference/fine-tuning/object) object. 
- examples: - - title: No hyperparameters - request: - curl: | - curl https://api.openai.com/v1/fine_tuning/jobs \ - -H "Content-Type: application/json" \ - -H "Authorization: Bearer $OPENAI_API_KEY" \ - -d '{ - "training_file": "file-abc123", - "model": "gpt-3.5-turbo" - }' - python: | - import os - import openai - openai.api_key = os.getenv("OPENAI_API_KEY") - openai.FineTuningJob.create(training_file="file-abc123", model="gpt-3.5-turbo") - node.js: | - import OpenAI from "openai"; - - const openai = new OpenAI(); - - async function main() { - const fineTune = await openai.fineTuning.jobs.create({ - training_file: "file-abc123", - model: "gpt-3.5-turbo" - }); - - console.log(fineTune); - } - - main(); - response: | - { - "object": "fine_tuning.job", - "id": "ft-AF1WoRqd3aJAHsqc9NY7iL8F", - "model": "gpt-3.5-turbo-0613", - "created_at": 1614807352, - "fine_tuned_model": null, - "organization_id": "org-123", - "result_files": [], - "status": "queued", - "validation_file": null, - "training_file": "file-abc123" - } - - title: Hyperparameters - request: - curl: | - curl https://api.openai.com/v1/fine_tuning/jobs \ - -H "Content-Type: application/json" \ - -H "Authorization: Bearer $OPENAI_API_KEY" \ - -d '{ - "training_file": "file-abc123", - "model": "gpt-3.5-turbo", - "hyperparameters": { - "n_epochs": 2 - } - }' - python: | - import os - import openai - openai.api_key = os.getenv("OPENAI_API_KEY") - openai.FineTuningJob.create(training_file="file-abc123", model="gpt-3.5-turbo", hyperparameters={"n_epochs":2}) - node.js: | - import OpenAI from "openai"; - - const openai = new OpenAI(); - - async function main() { - const fineTune = await openai.fineTuning.jobs.create({ - training_file: "file-abc123", - model: "gpt-3.5-turbo", - hyperparameters: { n_epochs: 2 }, - }); - - console.log(fineTune); - } - - main(); - response: | - { - "object": "fine_tuning.job", - "id": "ft-AF1WoRqd3aJAHsqc9NY7iL8F", - "model": "gpt-3.5-turbo-0613", - "created_at": 1614807352, - "fine_tuned_model": null, - 
"organization_id": "org-123", - "result_files": [], - "status": "queued", - "validation_file": null, - "training_file": "file-abc123", - "hyperparameters": {"n_epochs": 2} - } - get: - operationId: listPaginatedFineTuningJobs - tags: - - OpenAI - summary: | - List your organization's fine-tuning jobs - parameters: - - name: after - in: query - description: Identifier for the last job from the previous pagination request. - required: false - schema: - type: string - - name: limit - in: query - description: Number of fine-tuning jobs to retrieve. - required: false - schema: - type: integer - default: 20 - responses: - "200": - description: OK - content: - application/json: - schema: - $ref: "#/components/schemas/ListPaginatedFineTuningJobsResponse" - x-oaiMeta: - name: List fine-tuning jobs - returns: A list of paginated [fine-tuning job](/docs/api-reference/fine-tuning/object) objects. - examples: - request: - curl: | - curl https://api.openai.com/v1/fine_tuning/jobs?limit=2 \ - -H "Authorization: Bearer $OPENAI_API_KEY" - python: | - import os - import openai - openai.api_key = os.getenv("OPENAI_API_KEY") - openai.FineTuningJob.list() - node.js: |- - import OpenAI from "openai"; - - const openai = new OpenAI(); - - async function main() { - const list = await openai.fineTuning.jobs.list(); - - for await (const fineTune of list) { - console.log(fineTune); - } - } - - main(); - response: | - { - "object": "list", - "data": [ - { - "object": "fine_tuning.job", - "id": "ft-AF1WoRqd3aJAHsqc9NY7iL8F", - "model": "gpt-3.5-turbo-0613", - "created_at": 1689813489, - "fine_tuned_model": null, - "organization_id": "org-123", - "result_files": [], - "status": "queued", - "validation_file": null, - "training_file": "file-abc123" - }, - { ... }, - { ... } - ], - "has_more": true - } - /fine_tuning/jobs/{fine_tuning_job_id}: - get: - operationId: retrieveFineTuningJob - tags: - - OpenAI - summary: | - Get info about a fine-tuning job. 
- - [Learn more about fine-tuning](/docs/guides/fine-tuning) - parameters: - - in: path - name: fine_tuning_job_id - required: true - schema: - type: string - example: ft-AF1WoRqd3aJAHsqc9NY7iL8F - description: | - The ID of the fine-tuning job. - responses: - "200": - description: OK - content: - application/json: - schema: - $ref: "#/components/schemas/FineTuningJob" - x-oaiMeta: - name: Retrieve fine-tuning job - returns: The [fine-tuning](/docs/api-reference/fine-tuning/object) object with the given ID. - examples: - request: - curl: | - curl https://api.openai.com/v1/fine_tuning/jobs/ft-AF1WoRqd3aJAHsqc9NY7iL8F \ - -H "Authorization: Bearer $OPENAI_API_KEY" - python: | - import os - import openai - openai.api_key = os.getenv("OPENAI_API_KEY") - openai.FineTuningJob.retrieve("ft-AF1WoRqd3aJAHsqc9NY7iL8F") - node.js: | - import OpenAI from "openai"; - - const openai = new OpenAI(); - - async function main() { - const fineTune = await openai.fineTuning.jobs.retrieve("ft-AF1WoRqd3aJAHsqc9NY7iL8F"); - - console.log(fineTune); - } - - main(); - response: &fine_tuning_example | - { - "object": "fine_tuning.job", - "id": "ft-zRdUkP4QeZqeYjDcQL0wwam1", - "model": "davinci-002", - "created_at": 1692661014, - "finished_at": 1692661190, - "fine_tuned_model": "ft:davinci-002:my-org:custom_suffix:7q8mpxmy", - "organization_id": "org-123", - "result_files": [ - "file-abc123" - ], - "status": "succeeded", - "validation_file": null, - "training_file": "file-abc123", - "hyperparameters": { - "n_epochs": 4 - }, - "trained_tokens": 5768 - } - - /fine_tuning/jobs/{fine_tuning_job_id}/events: - get: - operationId: listFineTuningEvents - tags: - - OpenAI - summary: | - Get status updates for a fine-tuning job. - parameters: - - in: path - name: fine_tuning_job_id - required: true - schema: - type: string - example: ft-AF1WoRqd3aJAHsqc9NY7iL8F - description: | - The ID of the fine-tuning job to get events for. 
- - name: after - in: query - description: Identifier for the last event from the previous pagination request. - required: false - schema: - type: string - - name: limit - in: query - description: Number of events to retrieve. - required: false - schema: - type: integer - default: 20 - responses: - "200": - description: OK - content: - application/json: - schema: - $ref: "#/components/schemas/ListFineTuningJobEventsResponse" - x-oaiMeta: - name: List fine-tuning events - returns: A list of fine-tuning event objects. - examples: - request: - curl: | - curl https://api.openai.com/v1/fine_tuning/jobs/ft-AF1WoRqd3aJAHsqc9NY7iL8F/events \ - -H "Authorization: Bearer $OPENAI_API_KEY" - python: | - import os - import openai - openai.api_key = os.getenv("OPENAI_API_KEY") - openai.FineTuningJob.list_events(id="ft-w9WJrnTe9vcVopaTy9LrlGQv", limit=2) - node.js: |- - import OpenAI from "openai"; - - const openai = new OpenAI(); - - async function main() { - const list = await openai.fineTuning.jobs.listEvents("ft-w9WJrnTe9vcVopaTy9LrlGQv", { limit: 2 }); - - for await (const fineTune of list) { - console.log(fineTune); - } - } - - main(); - response: | - { - "object": "list", - "data": [ - { - "object": "fine_tuning.job.event", - "id": "ft-event-ddTJfwuMVpfLXseO0Am0Gqjm", - "created_at": 1692407401, - "level": "info", - "message": "Fine tuning job successfully completed", - "data": null, - "type": "message" - }, - { - "object": "fine_tuning.job.event", - "id": "ft-event-tyiGuB72evQncpH87xe505Sv", - "created_at": 1692407400, - "level": "info", - "message": "New fine-tuned model created: ft:gpt-3.5-turbo:openai::7p4lURel", - "data": null, - "type": "message" - } - ], - "has_more": true - } - - /fine_tuning/jobs/{fine_tuning_job_id}/cancel: - post: - operationId: cancelFineTuningJob - tags: - - OpenAI - summary: | - Immediately cancel a fine-tune job. 
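The job and event listing endpoints above paginate via an `after` cursor (the ID of the last item from the previous page) plus a `limit`, with `has_more` signalling further pages. A sketch of draining all pages, where `fetch_page` is a stand-in for the actual HTTP call and the job data is simulated:

```python
# Stand-in for GET /fine_tuning/jobs?after=...&limit=... over a fake
# dataset of 45 jobs; a real client would issue an HTTP request here.
def fetch_page(after=None, limit=20):
    jobs = [{"id": f"ft-{i:03d}"} for i in range(45)]
    start = 0
    if after is not None:
        start = next(i for i, j in enumerate(jobs) if j["id"] == after) + 1
    page = jobs[start:start + limit]
    return {"data": page, "has_more": start + limit < len(jobs)}

def list_all_jobs(limit=20):
    results, after = [], None
    while True:
        page = fetch_page(after=after, limit=limit)
        results.extend(page["data"])
        if not page["has_more"] or not page["data"]:
            break
        after = page["data"][-1]["id"]  # cursor = last item of this page
    return results

print(len(list_all_jobs()))  # → 45, across three pages of up to 20
```

The same loop works for `/fine_tuning/jobs/{id}/events`, since it takes the identical `after`/`limit` parameters.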
- parameters: - - in: path - name: fine_tuning_job_id - required: true - schema: - type: string - example: ft-AF1WoRqd3aJAHsqc9NY7iL8F - description: | - The ID of the fine-tuning job to cancel. - responses: - "200": - description: OK - content: - application/json: - schema: - $ref: "#/components/schemas/FineTuningJob" - x-oaiMeta: - name: Cancel fine-tuning - returns: The cancelled [fine-tuning](/docs/api-reference/fine-tuning/object) object. - examples: - request: - curl: | - curl -X POST https://api.openai.com/v1/fine_tuning/jobs/ft-AF1WoRqd3aJAHsqc9NY7iL8F/cancel \ - -H "Authorization: Bearer $OPENAI_API_KEY" - python: | - import os - import openai - openai.api_key = os.getenv("OPENAI_API_KEY") - openai.FineTuningJob.cancel("ft-anaKUAgnnBkNGB3QcSr4pImR") - node.js: |- - import OpenAI from "openai"; - - const openai = new OpenAI(); - - async function main() { - const fineTune = await openai.fineTuning.jobs.cancel("ft-AF1WoRqd3aJAHsqc9NY7iL8F"); - - console.log(fineTune); - } - main(); - response: | - { - "object": "fine_tuning.job", - "id": "ft-gleYLJhWh1YFufiy29AahVpj", - "model": "gpt-3.5-turbo-0613", - "created_at": 1689376978, - "fine_tuned_model": null, - "organization_id": "org-123", - "result_files": [], - "hyperparameters": { - "n_epochs": "auto" - }, - "status": "cancelled", - "validation_file": "file-abc123", - "training_file": "file-abc123" - } - - /fine-tunes: - post: - operationId: createFineTune - deprecated: true - tags: - - OpenAI - summary: | - Creates a job that fine-tunes a specified model from a given dataset. - - Response includes details of the enqueued job including job status and the name of the fine-tuned models once complete. 
- - [Learn more about fine-tuning](/docs/guides/legacy-fine-tuning) - requestBody: - required: true - content: - application/json: - schema: - $ref: "#/components/schemas/CreateFineTuneRequest" - responses: - "200": - description: OK - content: - application/json: - schema: - $ref: "#/components/schemas/FineTune" - x-oaiMeta: - name: Create fine-tune - returns: A [fine-tune](/docs/api-reference/fine-tunes/object) object. - examples: - request: - curl: | - curl https://api.openai.com/v1/fine-tunes \ - -H "Content-Type: application/json" \ - -H "Authorization: Bearer $OPENAI_API_KEY" \ - -d '{ - "training_file": "file-abc123" - }' - python: | - import os - import openai - openai.api_key = os.getenv("OPENAI_API_KEY") - openai.FineTune.create(training_file="file-abc123") - node.js: | - import OpenAI from "openai"; - - const openai = new OpenAI(); - - async function main() { - const fineTune = await openai.fineTunes.create({ - training_file: "file-abc123" - }); - - console.log(fineTune); - } - - main(); - response: | - { - "id": "ft-AF1WoRqd3aJAHsqc9NY7iL8F", - "object": "fine-tune", - "model": "curie", - "created_at": 1614807352, - "events": [ - { - "object": "fine-tune-event", - "created_at": 1614807352, - "level": "info", - "message": "Job enqueued. Waiting for jobs ahead to complete. Queue number: 0." 
- } - ], - "fine_tuned_model": null, - "hyperparams": { - "batch_size": 4, - "learning_rate_multiplier": 0.1, - "n_epochs": 4, - "prompt_loss_weight": 0.1, - }, - "organization_id": "org-123", - "result_files": [], - "status": "pending", - "validation_files": [], - "training_files": [ - { - "id": "file-abc123", - "object": "file", - "bytes": 1547276, - "created_at": 1610062281, - "filename": "my-data-train.jsonl", - "purpose": "fine-tune-train" - } - ], - "updated_at": 1614807352, - } - get: - operationId: listFineTunes - deprecated: true - tags: - - OpenAI - summary: | - List your organization's fine-tuning jobs - responses: - "200": - description: OK - content: - application/json: - schema: - $ref: "#/components/schemas/ListFineTunesResponse" - x-oaiMeta: - name: List fine-tunes - returns: A list of [fine-tune](/docs/api-reference/fine-tunes/object) objects. - examples: - request: - curl: | - curl https://api.openai.com/v1/fine-tunes \ - -H "Authorization: Bearer $OPENAI_API_KEY" - python: | - import os - import openai - openai.api_key = os.getenv("OPENAI_API_KEY") - openai.FineTune.list() - node.js: |- - import OpenAI from "openai"; - - const openai = new OpenAI(); - - async function main() { - const list = await openai.fineTunes.list(); - - for await (const fineTune of list) { - console.log(fineTune); - } - } - - main(); - response: | - { - "object": "list", - "data": [ - { - "id": "ft-AF1WoRqd3aJAHsqc9NY7iL8F", - "object": "fine-tune", - "model": "curie", - "created_at": 1614807352, - "fine_tuned_model": null, - "hyperparams": { ... }, - "organization_id": "org-123", - "result_files": [], - "status": "pending", - "validation_files": [], - "training_files": [ { ... } ], - "updated_at": 1614807352, - }, - { ... }, - { ... } - ] - } - - /fine-tunes/{fine_tune_id}: - get: - operationId: retrieveFineTune - deprecated: true - tags: - - OpenAI - summary: | - Gets info about the fine-tune job. 
- - [Learn more about fine-tuning](/docs/guides/legacy-fine-tuning) - parameters: - - in: path - name: fine_tune_id - required: true - schema: - type: string - example: ft-AF1WoRqd3aJAHsqc9NY7iL8F - description: | - The ID of the fine-tune job - responses: - "200": - description: OK - content: - application/json: - schema: - $ref: "#/components/schemas/FineTune" - x-oaiMeta: - name: Retrieve fine-tune - returns: The [fine-tune](/docs/api-reference/fine-tunes/object) object with the given ID. - examples: - request: - curl: | - curl https://api.openai.com/v1/fine-tunes/ft-AF1WoRqd3aJAHsqc9NY7iL8F \ - -H "Authorization: Bearer $OPENAI_API_KEY" - python: | - import os - import openai - openai.api_key = os.getenv("OPENAI_API_KEY") - openai.FineTune.retrieve(id="ft-AF1WoRqd3aJAHsqc9NY7iL8F") - node.js: |- - import OpenAI from "openai"; - - const openai = new OpenAI(); - - async function main() { - const fineTune = await openai.fineTunes.retrieve("ft-AF1WoRqd3aJAHsqc9NY7iL8F"); - - console.log(fineTune); - } - - main(); - response: &fine_tune_example | - { - "id": "ft-AF1WoRqd3aJAHsqc9NY7iL8F", - "object": "fine-tune", - "model": "curie", - "created_at": 1614807352, - "events": [ - { - "object": "fine-tune-event", - "created_at": 1614807352, - "level": "info", - "message": "Job enqueued. Waiting for jobs ahead to complete. Queue number: 0." - }, - { - "object": "fine-tune-event", - "created_at": 1614807356, - "level": "info", - "message": "Job started." - }, - { - "object": "fine-tune-event", - "created_at": 1614807861, - "level": "info", - "message": "Uploaded snapshot: curie:ft-acmeco-2021-03-03-21-44-20." - }, - { - "object": "fine-tune-event", - "created_at": 1614807864, - "level": "info", - "message": "Uploaded result files: file-abc123." - }, - { - "object": "fine-tune-event", - "created_at": 1614807864, - "level": "info", - "message": "Job succeeded." 
- } - ], - "fine_tuned_model": "curie:ft-acmeco-2021-03-03-21-44-20", - "hyperparams": { - "batch_size": 4, - "learning_rate_multiplier": 0.1, - "n_epochs": 4, - "prompt_loss_weight": 0.1, - }, - "organization_id": "org-123", - "result_files": [ - { - "id": "file-abc123", - "object": "file", - "bytes": 81509, - "created_at": 1614807863, - "filename": "compiled_results.csv", - "purpose": "fine-tune-results" - } - ], - "status": "succeeded", - "validation_files": [], - "training_files": [ - { - "id": "file-abc123", - "object": "file", - "bytes": 1547276, - "created_at": 1610062281, - "filename": "my-data-train.jsonl", - "purpose": "fine-tune-train" - } - ], - "updated_at": 1614807865, - } - - /fine-tunes/{fine_tune_id}/cancel: - post: - operationId: cancelFineTune - deprecated: true - tags: - - OpenAI - summary: | - Immediately cancel a fine-tune job. - parameters: - - in: path - name: fine_tune_id - required: true - schema: - type: string - example: ft-AF1WoRqd3aJAHsqc9NY7iL8F - description: | - The ID of the fine-tune job to cancel - responses: - "200": - description: OK - content: - application/json: - schema: - $ref: "#/components/schemas/FineTune" - x-oaiMeta: - name: Cancel fine-tune - returns: The cancelled [fine-tune](/docs/api-reference/fine-tunes/object) object. - examples: - request: - curl: | - curl https://api.openai.com/v1/fine-tunes/ft-AF1WoRqd3aJAHsqc9NY7iL8F/cancel \ - -H "Authorization: Bearer $OPENAI_API_KEY" - python: | - import os - import openai - openai.api_key = os.getenv("OPENAI_API_KEY") - openai.FineTune.cancel(id="ft-AF1WoRqd3aJAHsqc9NY7iL8F") - node.js: |- - import OpenAI from "openai"; - - const openai = new OpenAI(); - - async function main() { - const fineTune = await openai.fineTunes.cancel("ft-AF1WoRqd3aJAHsqc9NY7iL8F"); - - console.log(fineTune); - } - main(); - response: | - { - "id": "ft-xhrpBbvVUzYGo8oUO1FY4nI7", - "object": "fine-tune", - "model": "curie", - "created_at": 1614807770, - "events": [ { ... 
} ], - "fine_tuned_model": null, - "hyperparams": { ... }, - "organization_id": "org-123", - "result_files": [], - "status": "cancelled", - "validation_files": [], - "training_files": [ - { - "id": "file-abc123", - "object": "file", - "bytes": 1547276, - "created_at": 1610062281, - "filename": "my-data-train.jsonl", - "purpose": "fine-tune-train" - } - ], - "updated_at": 1614807789, - } - - /fine-tunes/{fine_tune_id}/events: - get: - operationId: listFineTuneEvents - deprecated: true - tags: - - OpenAI - summary: | - Get fine-grained status updates for a fine-tune job. - parameters: - - in: path - name: fine_tune_id - required: true - schema: - type: string - example: ft-AF1WoRqd3aJAHsqc9NY7iL8F - description: | - The ID of the fine-tune job to get events for. - - in: query - name: stream - required: false - schema: - type: boolean - default: false - description: | - Whether to stream events for the fine-tune job. If set to true, - events will be sent as data-only - [server-sent events](https://developer.mozilla.org/en-US/docs/Web/API/Server-sent_events/Using_server-sent_events#Event_stream_format) - as they become available. The stream will terminate with a - `data: [DONE]` message when the job is finished (succeeded, cancelled, - or failed). - - If set to false, only events generated so far will be returned. - responses: - "200": - description: OK - content: - application/json: - schema: - $ref: "#/components/schemas/ListFineTuneEventsResponse" - x-oaiMeta: - name: List fine-tune events - returns: A list of fine-tune event objects. 
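When `stream` is set to true, the events endpoint above delivers data-only server-sent events and terminates the stream with a `data: [DONE]` message. A minimal sketch of decoding such a stream (the raw text below simulates the wire format; a real client would read it from the HTTP response):

```python
import json

def parse_sse_events(raw: str):
    """Yield decoded JSON payloads from a data-only SSE stream,
    stopping at the `data: [DONE]` sentinel."""
    for line in raw.splitlines():
        line = line.strip()
        if not line.startswith("data: "):
            continue  # skip blank separator lines and comments
        payload = line[len("data: "):]
        if payload == "[DONE]":
            break
        yield json.loads(payload)

# Simulated stream of fine-tune events followed by the terminator.
stream = (
    'data: {"object": "fine-tune-event", "level": "info", "message": "Job started."}\n'
    "\n"
    'data: {"object": "fine-tune-event", "level": "info", "message": "Job succeeded."}\n'
    "\n"
    "data: [DONE]\n"
)

events = list(parse_sse_events(stream))
print([e["message"] for e in events])  # → ['Job started.', 'Job succeeded.']
```

With `stream=false` the same payloads arrive instead as a single JSON list response, as shown in the examples below.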
- examples: - request: - curl: | - curl https://api.openai.com/v1/fine-tunes/ft-AF1WoRqd3aJAHsqc9NY7iL8F/events \ - -H "Authorization: Bearer $OPENAI_API_KEY" - python: | - import os - import openai - openai.api_key = os.getenv("OPENAI_API_KEY") - openai.FineTune.list_events(id="ft-AF1WoRqd3aJAHsqc9NY7iL8F") - node.js: |- - import OpenAI from "openai"; - - const openai = new OpenAI(); - - async function main() { - const fineTune = await openai.fineTunes.listEvents("ft-AF1WoRqd3aJAHsqc9NY7iL8F"); - - console.log(fineTune); - } - main(); - response: | - { - "object": "list", - "data": [ - { - "object": "fine-tune-event", - "created_at": 1614807352, - "level": "info", - "message": "Job enqueued. Waiting for jobs ahead to complete. Queue number: 0." - }, - { - "object": "fine-tune-event", - "created_at": 1614807356, - "level": "info", - "message": "Job started." - }, - { - "object": "fine-tune-event", - "created_at": 1614807861, - "level": "info", - "message": "Uploaded snapshot: curie:ft-acmeco-2021-03-03-21-44-20." - }, - { - "object": "fine-tune-event", - "created_at": 1614807864, - "level": "info", - "message": "Uploaded result files: file-abc123" - }, - { - "object": "fine-tune-event", - "created_at": 1614807864, - "level": "info", - "message": "Job succeeded." - } - ] - } - - /models: - get: - operationId: listModels - tags: - - OpenAI - summary: Lists the currently available models, and provides basic information about each one such as the owner and availability. - responses: - "200": - description: OK - content: - application/json: - schema: - $ref: "#/components/schemas/ListModelsResponse" - x-oaiMeta: - name: List models - returns: A list of [model](/docs/api-reference/models/object) objects. 
- examples: - request: - curl: | - curl https://api.openai.com/v1/models \ - -H "Authorization: Bearer $OPENAI_API_KEY" - python: | - import os - import openai - openai.api_key = os.getenv("OPENAI_API_KEY") - openai.Model.list() - node.js: |- - import OpenAI from "openai"; - - const openai = new OpenAI(); - - async function main() { - const list = await openai.models.list(); - - for await (const model of list) { - console.log(model); - } - } - main(); - response: | - { - "object": "list", - "data": [ - { - "id": "model-id-0", - "object": "model", - "created": 1686935002, - "owned_by": "organization-owner" - }, - { - "id": "model-id-1", - "object": "model", - "created": 1686935002, - "owned_by": "organization-owner" - }, - { - "id": "model-id-2", - "object": "model", - "created": 1686935002, - "owned_by": "openai" - } - ] - } - - /models/{model}: - get: - operationId: retrieveModel - tags: - - OpenAI - summary: Retrieves a model instance, providing basic information about the model such as the owner and permissioning. - parameters: - - in: path - name: model - required: true - schema: - type: string - # ideally this will be an actual ID, so this will always work from browser - example: gpt-3.5-turbo - description: The ID of the model to use for this request - responses: - "200": - description: OK - content: - application/json: - schema: - $ref: "#/components/schemas/Model" - x-oaiMeta: - name: Retrieve model - returns: The [model](/docs/api-reference/models/object) object matching the specified ID. 
- examples: - request: - curl: | - curl https://api.openai.com/v1/models/VAR_model_id \ - -H "Authorization: Bearer $OPENAI_API_KEY" - python: | - import os - import openai - openai.api_key = os.getenv("OPENAI_API_KEY") - openai.Model.retrieve("VAR_model_id") - node.js: |- - import OpenAI from "openai"; - - const openai = new OpenAI(); - - async function main() { - const model = await openai.models.retrieve("gpt-3.5-turbo"); - - console.log(model); - } - - main(); - response: &retrieve_model_response | - { - "id": "VAR_model_id", - "object": "model", - "created": 1686935002, - "owned_by": "openai" - } - delete: - operationId: deleteModel - tags: - - OpenAI - summary: Delete a fine-tuned model. You must have the Owner role in your organization to delete a model. - parameters: - - in: path - name: model - required: true - schema: - type: string - example: ft:gpt-3.5-turbo:acemeco:suffix:abc123 - description: The model to delete - responses: - "200": - description: OK - content: - application/json: - schema: - $ref: "#/components/schemas/DeleteModelResponse" - x-oaiMeta: - name: Delete fine-tune model - returns: Deletion status. 
- examples: - request: - curl: | - curl https://api.openai.com/v1/models/ft:gpt-3.5-turbo:acemeco:suffix:abc123 \ - -X DELETE \ - -H "Authorization: Bearer $OPENAI_API_KEY" - python: | - import os - import openai - openai.api_key = os.getenv("OPENAI_API_KEY") - openai.Model.delete("ft:gpt-3.5-turbo:acemeco:suffix:abc123") - node.js: |- - import OpenAI from "openai"; - - const openai = new OpenAI(); - - async function main() { - const model = await openai.models.del("ft:gpt-3.5-turbo:acemeco:suffix:abc123"); - - console.log(model); - } - main(); - response: | - { - "id": "ft:gpt-3.5-turbo:acemeco:suffix:abc123", - "object": "model", - "deleted": true - } - - /moderations: - post: - operationId: createModeration - tags: - - OpenAI - summary: Classifies if text violates OpenAI's Content Policy - requestBody: - required: true - content: - application/json: - schema: - $ref: "#/components/schemas/CreateModerationRequest" - responses: - "200": - description: OK - content: - application/json: - schema: - $ref: "#/components/schemas/CreateModerationResponse" - x-oaiMeta: - name: Create moderation - returns: A [moderation](/docs/api-reference/moderations/object) object. - examples: - request: - curl: | - curl https://api.openai.com/v1/moderations \ - -H "Content-Type: application/json" \ - -H "Authorization: Bearer $OPENAI_API_KEY" \ - -d '{ - "input": "I want to kill them." - }' - python: | - import os - import openai - openai.api_key = os.getenv("OPENAI_API_KEY") - openai.Moderation.create( - input="I want to kill them.", - ) - node.js: | - import OpenAI from "openai"; - - const openai = new OpenAI(); - - async function main() { - const moderation = await openai.moderations.create({ input: "I want to kill them." 
}); - - console.log(moderation); - } - main(); - response: &moderation_example | - { - "id": "modr-XXXXX", - "model": "text-moderation-005", - "results": [ - { - "flagged": true, - "categories": { - "sexual": false, - "hate": false, - "harassment": false, - "self-harm": false, - "sexual/minors": false, - "hate/threatening": false, - "violence/graphic": false, - "self-harm/intent": false, - "self-harm/instructions": false, - "harassment/threatening": true, - "violence": true, - }, - "category_scores": { - "sexual": 1.2282071e-06, - "hate": 0.010696256, - "harassment": 0.29842457, - "self-harm": 1.5236925e-08, - "sexual/minors": 5.7246268e-08, - "hate/threatening": 0.0060676364, - "violence/graphic": 4.435014e-06, - "self-harm/intent": 8.098441e-10, - "self-harm/instructions": 2.8498655e-11, - "harassment/threatening": 0.63055265, - "violence": 0.99011886, - } - } - ] - } - -components: - - securitySchemes: - ApiKeyAuth: - type: http - scheme: 'bearer' - - schemas: - Error: - type: object - properties: - type: - type: string - nullable: false - message: - type: string - nullable: false - param: - type: string - nullable: true - code: - type: string - nullable: true - required: - - type - - message - - param - - code - - ErrorResponse: - type: object - properties: - error: - $ref: "#/components/schemas/Error" - required: - - error - - ListModelsResponse: - type: object - properties: - object: - type: string - data: - type: array - items: - $ref: "#/components/schemas/Model" - required: - - object - - data - - DeleteModelResponse: - type: object - properties: - id: - type: string - object: - type: string - deleted: - type: boolean - required: - - id - - object - - deleted - - CreateCompletionRequest: - type: object - properties: - model: - description: &model_description | - ID of the model to use. 
You can use the [List models](/docs/api-reference/models/list) API to see all of your available models, or see our [Model overview](/docs/models/overview) for descriptions of them. - anyOf: - - type: string - - type: string - enum: - [ - "babbage-002", - "davinci-002", - "gpt-3.5-turbo-instruct", - "text-davinci-003", - "text-davinci-002", - "text-davinci-001", - "code-davinci-002", - "text-curie-001", - "text-babbage-001", - "text-ada-001", - ] - x-oaiTypeLabel: string - prompt: - description: &completions_prompt_description | - The prompt(s) to generate completions for, encoded as a string, array of strings, array of tokens, or array of token arrays. - - Note that <|endoftext|> is the document separator that the model sees during training, so if a prompt is not specified the model will generate as if from the beginning of a new document. - default: "<|endoftext|>" - nullable: true - oneOf: - - type: string - default: "" - example: "This is a test." - - type: array - items: - type: string - default: "" - example: "This is a test." - - type: array - minItems: 1 - items: - type: integer - example: "[1212, 318, 257, 1332, 13]" - - type: array - minItems: 1 - items: - type: array - minItems: 1 - items: - type: integer - example: "[[1212, 318, 257, 1332, 13]]" - suffix: - description: The suffix that comes after a completion of inserted text. - default: null - nullable: true - type: string - example: "test." - max_tokens: - type: integer - minimum: 0 - default: 16 - example: 16 - nullable: true - description: &completions_max_tokens_description | - The maximum number of [tokens](/tokenizer) to generate in the completion. - - The token count of your prompt plus `max_tokens` cannot exceed the model's context length. [Example Python code](https://github.com/openai/openai-cookbook/blob/main/examples/How_to_count_tokens_with_tiktoken.ipynb) for counting tokens. 
- temperature: - type: number - minimum: 0 - maximum: 2 - default: 1 - example: 1 - nullable: true - description: &completions_temperature_description | - What sampling temperature to use, between 0 and 2. Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic. - - We generally recommend altering this or `top_p` but not both. - top_p: - type: number - minimum: 0 - maximum: 1 - default: 1 - example: 1 - nullable: true - description: &completions_top_p_description | - An alternative to sampling with temperature, called nucleus sampling, where the model considers the results of the tokens with top_p probability mass. So 0.1 means only the tokens comprising the top 10% probability mass are considered. - - We generally recommend altering this or `temperature` but not both. - n: - type: integer - minimum: 1 - maximum: 128 - default: 1 - example: 1 - nullable: true - description: &completions_completions_description | - How many completions to generate for each prompt. - - **Note:** Because this parameter generates many completions, it can quickly consume your token quota. Use carefully and ensure that you have reasonable settings for `max_tokens` and `stop`. - stream: - description: > - Whether to stream back partial progress. If set, tokens will be sent as data-only [server-sent events](https://developer.mozilla.org/en-US/docs/Web/API/Server-sent_events/Using_server-sent_events#Event_stream_format) - as they become available, with the stream terminated by a `data: [DONE]` message. [Example Python code](https://github.com/openai/openai-cookbook/blob/main/examples/How_to_stream_completions.ipynb). 
- type: boolean - nullable: true - default: false - logprobs: &completions_logprobs_configuration - type: integer - minimum: 0 - maximum: 5 - default: null - nullable: true - description: &completions_logprobs_description | - Include the log probabilities on the `logprobs` most likely tokens, as well the chosen tokens. For example, if `logprobs` is 5, the API will return a list of the 5 most likely tokens. The API will always return the `logprob` of the sampled token, so there may be up to `logprobs+1` elements in the response. - - The maximum value for `logprobs` is 5. - echo: - type: boolean - default: false - nullable: true - description: &completions_echo_description > - Echo back the prompt in addition to the completion - stop: - description: &completions_stop_description > - Up to 4 sequences where the API will stop generating further tokens. The returned text will not contain the stop sequence. - default: null - nullable: true - oneOf: - - type: string - default: <|endoftext|> - example: "\n" - nullable: true - - type: array - minItems: 1 - maxItems: 4 - items: - type: string - example: '["\n"]' - presence_penalty: - type: number - default: 0 - minimum: -2 - maximum: 2 - nullable: true - description: &completions_presence_penalty_description | - Number between -2.0 and 2.0. Positive values penalize new tokens based on whether they appear in the text so far, increasing the model's likelihood to talk about new topics. - - [See more information about frequency and presence penalties.](/docs/guides/gpt/parameter-details) - frequency_penalty: - type: number - default: 0 - minimum: -2 - maximum: 2 - nullable: true - description: &completions_frequency_penalty_description | - Number between -2.0 and 2.0. Positive values penalize new tokens based on their existing frequency in the text so far, decreasing the model's likelihood to repeat the same line verbatim. 
- - [See more information about frequency and presence penalties.](/docs/guides/gpt/parameter-details) - best_of: - type: integer - default: 1 - minimum: 0 - maximum: 20 - nullable: true - description: &completions_best_of_description | - Generates `best_of` completions server-side and returns the "best" (the one with the highest log probability per token). Results cannot be streamed. - - When used with `n`, `best_of` controls the number of candidate completions and `n` specifies how many to return – `best_of` must be greater than `n`. - - **Note:** Because this parameter generates many completions, it can quickly consume your token quota. Use carefully and ensure that you have reasonable settings for `max_tokens` and `stop`. - logit_bias: &completions_logit_bias - type: object - x-oaiTypeLabel: map - default: null - nullable: true - additionalProperties: - type: integer - description: &completions_logit_bias_description | - Modify the likelihood of specified tokens appearing in the completion. - - Accepts a json object that maps tokens (specified by their token ID in the GPT tokenizer) to an associated bias value from -100 to 100. You can use this [tokenizer tool](/tokenizer?view=bpe) (which works for both GPT-2 and GPT-3) to convert text to token IDs. Mathematically, the bias is added to the logits generated by the model prior to sampling. The exact effect will vary per model, but values between -1 and 1 should decrease or increase likelihood of selection; values like -100 or 100 should result in a ban or exclusive selection of the relevant token. - - As an example, you can pass `{"50256": -100}` to prevent the <|endoftext|> token from being generated. - user: &end_user_param_configuration - type: string - example: user-1234 - description: | - A unique identifier representing your end-user, which can help OpenAI to monitor and detect abuse. [Learn more](/docs/guides/safety-best-practices/end-user-ids). 
- required: - - model - - prompt - - CreateCompletionResponse: - type: object - description: | - Represents a completion response from the API. Note: both the streamed and non-streamed response objects share the same shape (unlike the chat endpoint). - properties: - id: - type: string - description: A unique identifier for the completion. - object: - type: string - description: The object type, which is always "text_completion" - created: - type: integer - description: The Unix timestamp (in seconds) of when the completion was created. - model: - type: string - description: The model used for completion. - choices: - type: array - description: The list of completion choices the model generated for the input prompt. - items: - type: object - required: - - text - - index - - logprobs - - finish_reason - properties: - text: - type: string - index: - type: integer - logprobs: - type: object - nullable: true - properties: - tokens: - type: array - items: - type: string - token_logprobs: - type: array - items: - type: number - top_logprobs: - type: array - items: - type: object - additionalProperties: - type: integer - text_offset: - type: array - items: - type: integer - finish_reason: - type: string - description: &completion_finish_reason_description | - The reason the model stopped generating tokens. This will be `stop` if the model hit a natural stop point or a provided stop sequence, - `length` if the maximum number of tokens specified in the request was reached, - or `content_filter` if content was omitted due to a flag from our content filters. 
- enum: ["stop", "length", "content_filter"] - usage: - $ref: "#/components/schemas/CompletionUsage" - required: - - id - - object - - created - - model - - choices - x-oaiMeta: - name: The completion object - legacy: true - example: | - { - "id": "cmpl-uqkvlQyYK7bGYrRHQ0eXlWi7", - "object": "text_completion", - "created": 1589478378, - "model": "gpt-3.5-turbo", - "choices": [ - { - "text": "\n\nThis is indeed a test", - "index": 0, - "logprobs": null, - "finish_reason": "length" - } - ], - "usage": { - "prompt_tokens": 5, - "completion_tokens": 7, - "total_tokens": 12 - } - } - - ChatCompletionRequestMessage: - type: object - properties: - role: - type: string - enum: ["system", "user", "assistant", "function"] - description: The role of the messages author. One of `system`, `user`, `assistant`, or `function`. - content: - type: string - nullable: true - description: The contents of the message. `content` is required for all messages, and may be null for assistant messages with function calls. - name: - type: string - description: The name of the author of this message. `name` is required if role is `function`, and it should be the name of the function whose response is in the `content`. May contain a-z, A-Z, 0-9, and underscores, with a maximum length of 64 characters. - function_call: - type: object - description: The name and arguments of a function that should be called, as generated by the model. - properties: - name: - type: string - description: The name of the function to call. - arguments: - type: string - description: The arguments to call the function with, as generated by the model in JSON format. Note that the model does not always generate valid JSON, and may hallucinate parameters not defined by your function schema. Validate the arguments in your code before calling your function. 
- required: - - name - - arguments - required: - - role - - content - - ChatCompletionFunctionParameters: - type: object - description: "The parameters the functions accepts, described as a JSON Schema object. See the [guide](/docs/guides/gpt/function-calling) for examples, and the [JSON Schema reference](https://json-schema.org/understanding-json-schema/) for documentation about the format.\n\nTo describe a function that accepts no parameters, provide the value `{\"type\": \"object\", \"properties\": {}}`." - additionalProperties: true - - ChatCompletionFunctions: - type: object - properties: - name: - type: string - description: The name of the function to be called. Must be a-z, A-Z, 0-9, or contain underscores and dashes, with a maximum length of 64. - description: - type: string - description: A description of what the function does, used by the model to choose when and how to call the function. - parameters: - $ref: "#/components/schemas/ChatCompletionFunctionParameters" - required: - - name - - parameters - - ChatCompletionFunctionCallOption: - type: object - properties: - name: - type: string - description: The name of the function to call. - required: - - name - - ChatCompletionResponseMessage: - type: object - description: A chat completion message generated by the model. - properties: - role: - type: string - enum: ["system", "user", "assistant", "function"] - description: The role of the author of this message. - content: - type: string - description: The contents of the message. - nullable: true - function_call: - type: object - description: The name and arguments of a function that should be called, as generated by the model. - properties: - name: - type: string - description: The name of the function to call. - arguments: - type: string - description: The arguments to call the function with, as generated by the model in JSON format. 
Note that the model does not always generate valid JSON, and may hallucinate parameters not defined by your function schema. Validate the arguments in your code before calling your function. - required: - - name - - arguments - required: - - role - - content - - ChatCompletionStreamResponseDelta: - type: object - description: A chat completion delta generated by streamed model responses. - properties: - role: - type: string - enum: ["system", "user", "assistant", "function"] - description: The role of the author of this message. - content: - type: string - description: The contents of the chunk message. - nullable: true - function_call: - type: object - description: The name and arguments of a function that should be called, as generated by the model. - properties: - name: - type: string - description: The name of the function to call. - arguments: - type: string - description: The arguments to call the function with, as generated by the model in JSON format. Note that the model does not always generate valid JSON, and may hallucinate parameters not defined by your function schema. Validate the arguments in your code before calling your function. - - CreateChatCompletionRequest: - type: object - properties: - model: - description: ID of the model to use. See the [model endpoint compatibility](/docs/models/model-endpoint-compatibility) table for details on which models work with the Chat API. - example: "gpt-3.5-turbo" - anyOf: - - type: string - - type: string - enum: - [ - "gpt-4", - "gpt-4-0314", - "gpt-4-0613", - "gpt-4-32k", - "gpt-4-32k-0314", - "gpt-4-32k-0613", - "gpt-3.5-turbo", - "gpt-3.5-turbo-16k", - "gpt-3.5-turbo-0301", - "gpt-3.5-turbo-0613", - "gpt-3.5-turbo-16k-0613", - ] - x-oaiTypeLabel: string - messages: - description: A list of messages comprising the conversation so far. [Example Python code](https://github.com/openai/openai-cookbook/blob/main/examples/How_to_format_inputs_to_ChatGPT_models.ipynb). 
- type: array - minItems: 1 - items: - $ref: "#/components/schemas/ChatCompletionRequestMessage" - functions: - description: A list of functions the model may generate JSON inputs for. - type: array - minItems: 1 - maxItems: 128 - items: - $ref: "#/components/schemas/ChatCompletionFunctions" - function_call: - description: "Controls how the model responds to function calls. `none` means the model does not call a function, and responds to the end-user. `auto` means the model can pick between an end-user or calling a function. Specifying a particular function via `{\"name\": \"my_function\"}` forces the model to call that function. `none` is the default when no functions are present. `auto` is the default if functions are present." - oneOf: - - type: string - enum: [none, auto] - - $ref: "#/components/schemas/ChatCompletionFunctionCallOption" - temperature: - type: number - minimum: 0 - maximum: 2 - default: 1 - example: 1 - nullable: true - description: *completions_temperature_description - top_p: - type: number - minimum: 0 - maximum: 1 - default: 1 - example: 1 - nullable: true - description: *completions_top_p_description - n: - type: integer - minimum: 1 - maximum: 128 - default: 1 - example: 1 - nullable: true - description: How many chat completion choices to generate for each input message. - stream: - description: > - If set, partial message deltas will be sent, like in ChatGPT. Tokens will be sent as data-only [server-sent events](https://developer.mozilla.org/en-US/docs/Web/API/Server-sent_events/Using_server-sent_events#Event_stream_format) - as they become available, with the stream terminated by a `data: [DONE]` message. [Example Python code](https://github.com/openai/openai-cookbook/blob/main/examples/How_to_stream_completions.ipynb). - type: boolean - nullable: true - default: false - stop: - description: | - Up to 4 sequences where the API will stop generating further tokens. 
- default: null - oneOf: - - type: string - nullable: true - - type: array - minItems: 1 - maxItems: 4 - items: - type: string - max_tokens: - description: | - The maximum number of [tokens](/tokenizer) to generate in the chat completion. - - The total length of input tokens and generated tokens is limited by the model's context length. [Example Python code](https://github.com/openai/openai-cookbook/blob/main/examples/How_to_count_tokens_with_tiktoken.ipynb) for counting tokens. - default: inf - type: integer - nullable: true - presence_penalty: - type: number - default: 0 - minimum: -2 - maximum: 2 - nullable: true - description: *completions_presence_penalty_description - frequency_penalty: - type: number - default: 0 - minimum: -2 - maximum: 2 - nullable: true - description: *completions_frequency_penalty_description - logit_bias: - type: object - x-oaiTypeLabel: map - default: null - nullable: true - additionalProperties: - type: integer - description: | - Modify the likelihood of specified tokens appearing in the completion. - - Accepts a json object that maps tokens (specified by their token ID in the tokenizer) to an associated bias value from -100 to 100. Mathematically, the bias is added to the logits generated by the model prior to sampling. The exact effect will vary per model, but values between -1 and 1 should decrease or increase likelihood of selection; values like -100 or 100 should result in a ban or exclusive selection of the relevant token. - user: *end_user_param_configuration - required: - - model - - messages - - CreateChatCompletionResponse: - type: object - description: Represents a chat completion response returned by the model, based on the provided input. - properties: - id: - type: string - description: A unique identifier for the chat completion. - object: - type: string - description: The object type, which is always `chat.completion`. 
- created: - type: integer - description: The Unix timestamp (in seconds) of when the chat completion was created. - model: - type: string - description: The model used for the chat completion. - choices: - type: array - description: A list of chat completion choices. Can be more than one if `n` is greater than 1. - items: - type: object - required: - - index - - message - - finish_reason - properties: - index: - type: integer - description: The index of the choice in the list of choices. - message: - $ref: "#/components/schemas/ChatCompletionResponseMessage" - finish_reason: - type: string - description: &chat_completion_finish_reason_description | - The reason the model stopped generating tokens. This will be `stop` if the model hit a natural stop point or a provided stop sequence, - `length` if the maximum number of tokens specified in the request was reached, - `content_filter` if content was omitted due to a flag from our content filters, - or `function_call` if the model called a function. - enum: ["stop", "length", "function_call", "content_filter"] - usage: - $ref: "#/components/schemas/CompletionUsage" - required: - - id - - object - - created - - model - - choices - x-oaiMeta: - name: The chat completion object - group: chat - example: *chat_completion_example - - ListPaginatedFineTuningJobsResponse: - type: object - properties: - object: - type: string - data: - type: array - items: - $ref: "#/components/schemas/FineTuningJob" - has_more: - type: boolean - required: - - object - - data - - has_more - - CreateChatCompletionStreamResponse: - type: object - description: Represents a streamed chunk of a chat completion response returned by the model, based on the provided input. - properties: - id: - type: string - description: A unique identifier for the chat completion chunk. - object: - type: string - description: The object type, which is always `chat.completion.chunk`. 
- created: - type: integer - description: The Unix timestamp (in seconds) of when the chat completion chunk was created. - model: - type: string - description: The model used to generate the completion. - choices: - type: array - description: A list of chat completion choices. Can be more than one if `n` is greater than 1. - items: - type: object - required: - - index - - delta - - finish_reason - properties: - index: - type: integer - description: The index of the choice in the list of choices. - delta: - $ref: "#/components/schemas/ChatCompletionStreamResponseDelta" - finish_reason: - type: string - description: *chat_completion_finish_reason_description - enum: ["stop", "length", "function_call"] - nullable: true - required: - - id - - object - - created - - model - - choices - x-oaiMeta: - name: The chat completion chunk object - group: chat - example: *chat_completion_chunk_example - - CreateEditRequest: - type: object - properties: - model: - description: ID of the model to use. You can use the `text-davinci-edit-001` or `code-davinci-edit-001` model with this endpoint. - example: "text-davinci-edit-001" - anyOf: - - type: string - - type: string - enum: ["text-davinci-edit-001", "code-davinci-edit-001"] - x-oaiTypeLabel: string - input: - description: The input text to use as a starting point for the edit. - type: string - default: "" - nullable: true - example: "What day of the wek is it?" - instruction: - description: The instruction that tells the model how to edit the prompt. - type: string - example: "Fix the spelling mistakes." - n: - type: integer - minimum: 1 - maximum: 20 - default: 1 - example: 1 - nullable: true - description: How many edits to generate for the input and instruction. 
- temperature: - type: number - minimum: 0 - maximum: 2 - default: 1 - example: 1 - nullable: true - description: *completions_temperature_description - top_p: - type: number - minimum: 0 - maximum: 1 - default: 1 - example: 1 - nullable: true - description: *completions_top_p_description - required: - - model - - instruction - - CreateEditResponse: - type: object - title: Edit - deprecated: true - properties: - object: - type: string - description: The object type, which is always `edit`. - created: - type: integer - description: The Unix timestamp (in seconds) of when the edit was created. - choices: - type: array - description: A list of edit choices. Can be more than one if `n` is greater than 1. - items: - type: object - required: - - text - - index - - finish_reason - properties: - text: - type: string - description: The edited result. - index: - type: integer - description: The index of the choice in the list of choices. - finish_reason: - type: string - description: *completion_finish_reason_description - enum: ["stop", "length"] - usage: - $ref: "#/components/schemas/CompletionUsage" - required: - - object - - created - - choices - - usage - x-oaiMeta: - name: The edit object - example: *edit_example - - CreateImageRequest: - type: object - properties: - prompt: - description: A text description of the desired image(s). The maximum length is 1000 characters. - type: string - example: "A cute baby sea otter" - n: &images_n - type: integer - minimum: 1 - maximum: 10 - default: 1 - example: 1 - nullable: true - description: The number of images to generate. Must be between 1 and 10. - size: &images_size - type: string - enum: ["256x256", "512x512", "1024x1024"] - default: "1024x1024" - example: "1024x1024" - nullable: true - description: The size of the generated images. Must be one of `256x256`, `512x512`, or `1024x1024`. 
- response_format: &images_response_format - type: string - enum: ["url", "b64_json"] - default: "url" - example: "url" - nullable: true - description: The format in which the generated images are returned. Must be one of `url` or `b64_json`. - user: *end_user_param_configuration - required: - - prompt - - ImagesResponse: - properties: - created: - type: integer - data: - type: array - items: - $ref: "#/components/schemas/Image" - required: - - created - - data - - Image: - type: object - description: Represents the url or the content of an image generated by the OpenAI API. - properties: - url: - type: string - description: The URL of the generated image, if `response_format` is `url` (default). - b64_json: - type: string - description: The base64-encoded JSON of the generated image, if `response_format` is `b64_json`. - x-oaiMeta: - name: The image object - example: | - { - "url": "..." - } - - CreateImageEditRequest: - type: object - properties: - image: - description: The image to edit. Must be a valid PNG file, less than 4MB, and square. If mask is not provided, image must have transparency, which will be used as the mask. - type: string - format: binary - mask: - description: An additional image whose fully transparent areas (e.g. where alpha is zero) indicate where `image` should be edited. Must be a valid PNG file, less than 4MB, and have the same dimensions as `image`. - type: string - format: binary - prompt: - description: A text description of the desired image(s). The maximum length is 1000 characters. - type: string - example: "A cute baby sea otter wearing a beret" - n: *images_n - size: *images_size - response_format: *images_response_format - user: *end_user_param_configuration - required: - - prompt - - image - - CreateImageVariationRequest: - type: object - properties: - image: - description: The image to use as the basis for the variation(s). Must be a valid PNG file, less than 4MB, and square. 
- type: string - format: binary - n: *images_n - size: *images_size - response_format: *images_response_format - user: *end_user_param_configuration - required: - - image - - CreateModerationRequest: - type: object - properties: - input: - description: The input text to classify - oneOf: - - type: string - default: "" - example: "I want to kill them." - - type: array - items: - type: string - default: "" - example: "I want to kill them." - model: - description: | - Two content moderation models are available: `text-moderation-stable` and `text-moderation-latest`. - - The default is `text-moderation-latest`, which will be automatically upgraded over time. This ensures you are always using our most accurate model. If you use `text-moderation-stable`, we will provide advance notice before updating the model. Accuracy of `text-moderation-stable` may be slightly lower than for `text-moderation-latest`. - nullable: false - default: "text-moderation-latest" - example: "text-moderation-stable" - anyOf: - - type: string - - type: string - enum: ["text-moderation-latest", "text-moderation-stable"] - x-oaiTypeLabel: string - required: - - input - - CreateModerationResponse: - type: object - description: Represents a policy compliance report by OpenAI's content moderation model against a given input. - properties: - id: - type: string - description: The unique identifier for the moderation request. - model: - type: string - description: The model used to generate the moderation results. - results: - type: array - description: A list of moderation objects. - items: - type: object - properties: - flagged: - type: boolean - description: Whether the content violates [OpenAI's usage policies](/policies/usage-policies). - categories: - type: object - description: A list of the categories, and whether they are flagged or not. 
- properties: - hate: - type: boolean - description: Content that expresses, incites, or promotes hate based on race, gender, ethnicity, religion, nationality, sexual orientation, disability status, or caste. Hateful content aimed at non-protected groups (e.g., chess players) is harassment. - hate/threatening: - type: boolean - description: Hateful content that also includes violence or serious harm towards the targeted group based on race, gender, ethnicity, religion, nationality, sexual orientation, disability status, or caste. - harassment: - type: boolean - description: Content that expresses, incites, or promotes harassing language towards any target. - harassment/threatening: - type: boolean - description: Harassment content that also includes violence or serious harm towards any target. - self-harm: - type: boolean - description: Content that promotes, encourages, or depicts acts of self-harm, such as suicide, cutting, and eating disorders. - self-harm/intent: - type: boolean - description: Content where the speaker expresses that they are engaging or intend to engage in acts of self-harm, such as suicide, cutting, and eating disorders. - self-harm/instructions: - type: boolean - description: Content that encourages performing acts of self-harm, such as suicide, cutting, and eating disorders, or that gives instructions or advice on how to commit such acts. - sexual: - type: boolean - description: Content meant to arouse sexual excitement, such as the description of sexual activity, or that promotes sexual services (excluding sex education and wellness). - sexual/minors: - type: boolean - description: Sexual content that includes an individual who is under 18 years old. - violence: - type: boolean - description: Content that depicts death, violence, or physical injury. - violence/graphic: - type: boolean - description: Content that depicts death, violence, or physical injury in graphic detail. 
- required: - - hate - - hate/threatening - - harassment - - harassment/threatening - - self-harm - - self-harm/intent - - self-harm/instructions - - sexual - - sexual/minors - - violence - - violence/graphic - category_scores: - type: object - description: A list of the categories along with their scores as predicted by the model. - properties: - hate: - type: number - description: The score for the category 'hate'. - hate/threatening: - type: number - description: The score for the category 'hate/threatening'. - harassment: - type: number - description: The score for the category 'harassment'. - harassment/threatening: - type: number - description: The score for the category 'harassment/threatening'. - self-harm: - type: number - description: The score for the category 'self-harm'. - self-harm/intent: - type: number - description: The score for the category 'self-harm/intent'. - self-harm/instructions: - type: number - description: The score for the category 'self-harm/instructions'. - sexual: - type: number - description: The score for the category 'sexual'. - sexual/minors: - type: number - description: The score for the category 'sexual/minors'. - violence: - type: number - description: The score for the category 'violence'. - violence/graphic: - type: number - description: The score for the category 'violence/graphic'. 
-                required:
-                  - hate
-                  - hate/threatening
-                  - harassment
-                  - harassment/threatening
-                  - self-harm
-                  - self-harm/intent
-                  - self-harm/instructions
-                  - sexual
-                  - sexual/minors
-                  - violence
-                  - violence/graphic
-            required:
-              - flagged
-              - categories
-              - category_scores
-      required:
-        - id
-        - model
-        - results
-      x-oaiMeta:
-        name: The moderation object
-        example: *moderation_example
-
-    ListFilesResponse:
-      type: object
-      properties:
-        object:
-          type: string
-        data:
-          type: array
-          items:
-            $ref: "#/components/schemas/OpenAIFile"
-      required:
-        - object
-        - data
-
-    CreateFileRequest:
-      type: object
-      additionalProperties: false
-      properties:
-        file:
-          description: |
-            Name of the [JSON Lines](https://jsonlines.readthedocs.io/en/latest/) file to be uploaded.
-
-            If the `purpose` is set to "fine-tune", the file will be used for fine-tuning.
-          type: string
-          format: binary
-        purpose:
-          description: |
-            The intended purpose of the uploaded documents.
-
-            Use "fine-tune" for [fine-tuning](/docs/api-reference/fine-tuning). This allows us to validate the format of the uploaded file.
-          type: string
-      required:
-        - file
-        - purpose
-
-    DeleteFileResponse:
-      type: object
-      properties:
-        id:
-          type: string
-        object:
-          type: string
-        deleted:
-          type: boolean
-      required:
-        - id
-        - object
-        - deleted
-
-    CreateFineTuningJobRequest:
-      type: object
-      properties:
-        training_file:
-          description: |
-            The ID of an uploaded file that contains training data.
-
-            See [upload file](/docs/api-reference/files/upload) for how to upload a file.
-
-            Your dataset must be formatted as a JSONL file. Additionally, you must upload your file with the purpose `fine-tune`.
-
-            See the [fine-tuning guide](/docs/guides/fine-tuning) for more details.
-          type: string
-          example: "file-abc123"
-        validation_file:
-          description: |
-            The ID of an uploaded file that contains validation data.
-
-            If you provide this file, the data is used to generate validation
-            metrics periodically during fine-tuning. These metrics can be viewed in
-            the fine-tuning results file.
-            The same data should not be present in both train and validation files.
-
-            Your dataset must be formatted as a JSONL file. You must upload your file with the purpose `fine-tune`.
-
-            See the [fine-tuning guide](/docs/guides/fine-tuning) for more details.
-          type: string
-          nullable: true
-          example: "file-abc123"
-        model:
-          description: |
-            The name of the model to fine-tune. You can select one of the
-            [supported models](/docs/guides/fine-tuning/what-models-can-be-fine-tuned).
-          example: "gpt-3.5-turbo"
-          anyOf:
-            - type: string
-            - type: string
-              enum: ["babbage-002", "davinci-002", "gpt-3.5-turbo"]
-          x-oaiTypeLabel: string
-        hyperparameters:
-          type: object
-          description: The hyperparameters used for the fine-tuning job.
-          properties:
-            n_epochs:
-              description: |
-                The number of epochs to train the model for. An epoch refers to one
-                full cycle through the training dataset.
-              oneOf:
-                - type: string
-                  enum: [auto]
-                - type: integer
-                  minimum: 1
-                  maximum: 50
-              default: auto
-        suffix:
-          description: |
-            A string of up to 18 characters that will be added to your fine-tuned model name.
-
-            For example, a `suffix` of "custom-model-name" would produce a model name like `ft:gpt-3.5-turbo:openai:custom-model-name:7p4lURel`.
-          type: string
-          minLength: 1
-          maxLength: 40
-          default: null
-          nullable: true
-      required:
-        - training_file
-        - model
-
-    ListFineTuningJobEventsResponse:
-      type: object
-      properties:
-        object:
-          type: string
-        data:
-          type: array
-          items:
-            $ref: "#/components/schemas/FineTuningJobEvent"
-      required:
-        - object
-        - data
-
-    CreateFineTuneRequest:
-      type: object
-      properties:
-        training_file:
-          description: |
-            The ID of an uploaded file that contains training data.
-
-            See [upload file](/docs/api-reference/files/upload) for how to upload a file.
-
-            Your dataset must be formatted as a JSONL file, where each training
-            example is a JSON object with the keys "prompt" and "completion".
-            Additionally, you must upload your file with the purpose `fine-tune`.
-
-            See the [fine-tuning guide](/docs/guides/legacy-fine-tuning/creating-training-data) for more details.
-          type: string
-          example: "file-abc123"
-        validation_file:
-          description: |
-            The ID of an uploaded file that contains validation data.
-
-            If you provide this file, the data is used to generate validation
-            metrics periodically during fine-tuning. These metrics can be viewed in
-            the [fine-tuning results file](/docs/guides/legacy-fine-tuning/analyzing-your-fine-tuned-model).
-            Your train and validation data should be mutually exclusive.
-
-            Your dataset must be formatted as a JSONL file, where each validation
-            example is a JSON object with the keys "prompt" and "completion".
-            Additionally, you must upload your file with the purpose `fine-tune`.
-
-            See the [fine-tuning guide](/docs/guides/legacy-fine-tuning/creating-training-data) for more details.
-          type: string
-          nullable: true
-          example: "file-abc123"
-        model:
-          description: |
-            The name of the base model to fine-tune. You can select one of "ada",
-            "babbage", "curie", "davinci", or a fine-tuned model created after 2022-04-21 and before 2023-08-22.
-            To learn more about these models, see the
-            [Models](/docs/models) documentation.
-          default: "curie"
-          example: "curie"
-          nullable: true
-          anyOf:
-            - type: string
-            - type: string
-              enum: ["ada", "babbage", "curie", "davinci"]
-          x-oaiTypeLabel: string
-        n_epochs:
-          description: |
-            The number of epochs to train the model for. An epoch refers to one
-            full cycle through the training dataset.
-          default: 4
-          type: integer
-          nullable: true
-        batch_size:
-          description: |
-            The batch size to use for training. The batch size is the number of
-            training examples used to train a single forward and backward pass.
-
-            By default, the batch size will be dynamically configured to be
-            ~0.2% of the number of examples in the training set, capped at 256 -
-            in general, we've found that larger batch sizes tend to work better
-            for larger datasets.
-          default: null
-          type: integer
-          nullable: true
-        learning_rate_multiplier:
-          description: |
-            The learning rate multiplier to use for training.
-            The fine-tuning learning rate is the original learning rate used for
-            pretraining multiplied by this value.
-
-            By default, the learning rate multiplier is the 0.05, 0.1, or 0.2
-            depending on final `batch_size` (larger learning rates tend to
-            perform better with larger batch sizes). We recommend experimenting
-            with values in the range 0.02 to 0.2 to see what produces the best
-            results.
-          default: null
-          type: number
-          nullable: true
-        prompt_loss_weight:
-          description: |
-            The weight to use for loss on the prompt tokens. This controls how
-            much the model tries to learn to generate the prompt (as compared
-            to the completion which always has a weight of 1.0), and can add
-            a stabilizing effect to training when completions are short.
-
-            If prompts are extremely long (relative to completions), it may make
-            sense to reduce this weight so as to avoid over-prioritizing
-            learning the prompt.
-          default: 0.01
-          type: number
-          nullable: true
-        compute_classification_metrics:
-          description: |
-            If set, we calculate classification-specific metrics such as accuracy
-            and F-1 score using the validation set at the end of every epoch.
-            These metrics can be viewed in the [results file](/docs/guides/legacy-fine-tuning/analyzing-your-fine-tuned-model).
-
-            In order to compute classification metrics, you must provide a
-            `validation_file`. Additionally, you must
-            specify `classification_n_classes` for multiclass classification or
-            `classification_positive_class` for binary classification.
-          type: boolean
-          default: false
-          nullable: true
-        classification_n_classes:
-          description: |
-            The number of classes in a classification task.
-
-            This parameter is required for multiclass classification.
-          type: integer
-          default: null
-          nullable: true
-        classification_positive_class:
-          description: |
-            The positive class in binary classification.
-
-            This parameter is needed to generate precision, recall, and F1
-            metrics when doing binary classification.
-          type: string
-          default: null
-          nullable: true
-        classification_betas:
-          description: |
-            If this is provided, we calculate F-beta scores at the specified
-            beta values. The F-beta score is a generalization of F-1 score.
-            This is only used for binary classification.
-
-            With a beta of 1 (i.e. the F-1 score), precision and recall are
-            given the same weight. A larger beta score puts more weight on
-            recall and less on precision. A smaller beta score puts more weight
-            on precision and less on recall.
-          type: array
-          items:
-            type: number
-          example: [0.6, 1, 1.5, 2]
-          default: null
-          nullable: true
-        suffix:
-          description: |
-            A string of up to 40 characters that will be added to your fine-tuned model name.
-
-            For example, a `suffix` of "custom-model-name" would produce a model name like `ada:ft-your-org:custom-model-name-2022-02-15-04-21-04`.
-          type: string
-          minLength: 1
-          maxLength: 40
-          default: null
-          nullable: true
-      required:
-        - training_file
-
-    ListFineTunesResponse:
-      type: object
-      properties:
-        object:
-          type: string
-        data:
-          type: array
-          items:
-            $ref: "#/components/schemas/FineTune"
-      required:
-        - object
-        - data
-
-    ListFineTuneEventsResponse:
-      type: object
-      properties:
-        object:
-          type: string
-        data:
-          type: array
-          items:
-            $ref: "#/components/schemas/FineTuneEvent"
-      required:
-        - object
-        - data
-
-    CreateEmbeddingRequest:
-      type: object
-      additionalProperties: false
-      properties:
-        model:
-          description: *model_description
-          example: "text-embedding-ada-002"
-          anyOf:
-            - type: string
-            - type: string
-              enum: ["text-embedding-ada-002"]
-          x-oaiTypeLabel: string
-        input:
-          description: |
-            Input text to embed, encoded as a string or array of tokens. To embed multiple inputs in a single request, pass an array of strings or array of token arrays. Each input must not exceed the max input tokens for the model (8191 tokens for `text-embedding-ada-002`) and cannot be an empty string. [Example Python code](https://github.com/openai/openai-cookbook/blob/main/examples/How_to_count_tokens_with_tiktoken.ipynb) for counting tokens.
-          example: "The quick brown fox jumped over the lazy dog"
-          oneOf:
-            - type: string
-              default: ""
-              example: "This is a test."
-            - type: array
-              items:
-                type: string
-                default: ""
-                example: "This is a test."
-            - type: array
-              minItems: 1
-              items:
-                type: integer
-              example: "[1212, 318, 257, 1332, 13]"
-            - type: array
-              minItems: 1
-              items:
-                type: array
-                minItems: 1
-                items:
-                  type: integer
-              example: "[[1212, 318, 257, 1332, 13]]"
-        user: *end_user_param_configuration
-      required:
-        - model
-        - input
-
-    CreateEmbeddingResponse:
-      type: object
-      properties:
-        object:
-          type: string
-          description: The object type, which is always "embedding".
-        model:
-          type: string
-          description: The name of the model used to generate the embedding.
-        data:
-          type: array
-          description: The list of embeddings generated by the model.
-          items:
-            $ref: "#/components/schemas/Embedding"
-        usage:
-          type: object
-          description: The usage information for the request.
-          properties:
-            prompt_tokens:
-              type: integer
-              description: The number of tokens used by the prompt.
-            total_tokens:
-              type: integer
-              description: The total number of tokens used by the request.
-          required:
-            - prompt_tokens
-            - total_tokens
-      required:
-        - object
-        - model
-        - data
-        - usage
-
-    CreateTranscriptionRequest:
-      type: object
-      additionalProperties: false
-      properties:
-        file:
-          description: |
-            The audio file object (not file name) to transcribe, in one of these formats: flac, mp3, mp4, mpeg, mpga, m4a, ogg, wav, or webm.
-          type: string
-          x-oaiTypeLabel: file
-          format: binary
-        model:
-          description: |
-            ID of the model to use. Only `whisper-1` is currently available.
-          example: whisper-1
-          anyOf:
-            - type: string
-            - type: string
-              enum: ["whisper-1"]
-          x-oaiTypeLabel: string
-        prompt:
-          description: |
-            An optional text to guide the model's style or continue a previous audio segment. The [prompt](/docs/guides/speech-to-text/prompting) should match the audio language.
-          type: string
-        response_format:
-          description: |
-            The format of the transcript output, in one of these options: json, text, srt, verbose_json, or vtt.
-          type: string
-          enum:
-            - json
-            - text
-            - srt
-            - verbose_json
-            - vtt
-          default: json
-        temperature:
-          description: |
-            The sampling temperature, between 0 and 1. Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic. If set to 0, the model will use [log probability](https://en.wikipedia.org/wiki/Log_probability) to automatically increase the temperature until certain thresholds are hit.
-          type: number
-          default: 0
-        language:
-          description: |
-            The language of the input audio. Supplying the input language in [ISO-639-1](https://en.wikipedia.org/wiki/List_of_ISO_639-1_codes) format will improve accuracy and latency.
-          type: string
-      required:
-        - file
-        - model
-
-    # Note: This does not currently support the non-default response format types.
-    CreateTranscriptionResponse:
-      type: object
-      properties:
-        text:
-          type: string
-      required:
-        - text
-
-    CreateTranslationRequest:
-      type: object
-      additionalProperties: false
-      properties:
-        file:
-          description: |
-            The audio file object (not file name) translate, in one of these formats: flac, mp3, mp4, mpeg, mpga, m4a, ogg, wav, or webm.
-          type: string
-          x-oaiTypeLabel: file
-          format: binary
-        model:
-          description: |
-            ID of the model to use. Only `whisper-1` is currently available.
-          example: whisper-1
-          anyOf:
-            - type: string
-            - type: string
-              enum: ["whisper-1"]
-          x-oaiTypeLabel: string
-        prompt:
-          description: |
-            An optional text to guide the model's style or continue a previous audio segment. The [prompt](/docs/guides/speech-to-text/prompting) should be in English.
-          type: string
-        response_format:
-          description: |
-            The format of the transcript output, in one of these options: json, text, srt, verbose_json, or vtt.
-          type: string
-          default: json
-        temperature:
-          description: |
-            The sampling temperature, between 0 and 1. Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic. If set to 0, the model will use [log probability](https://en.wikipedia.org/wiki/Log_probability) to automatically increase the temperature until certain thresholds are hit.
-          type: number
-          default: 0
-      required:
-        - file
-        - model
-
-    # Note: This does not currently support the non-default response format types.
-    CreateTranslationResponse:
-      type: object
-      properties:
-        text:
-          type: string
-      required:
-        - text
-
-    Model:
-      title: Model
-      description: Describes an OpenAI model offering that can be used with the API.
-      properties:
-        id:
-          type: string
-          description: The model identifier, which can be referenced in the API endpoints.
-        object:
-          type: string
-          description: The object type, which is always "model".
-        created:
-          type: integer
-          description: The Unix timestamp (in seconds) when the model was created.
-        owned_by:
-          type: string
-          description: The organization that owns the model.
-      required:
-        - id
-        - object
-        - created
-        - owned_by
-      x-oaiMeta:
-        name: The model object
-        example: *retrieve_model_response
-
-    OpenAIFile:
-      title: OpenAIFile
-      description: |
-        The `File` object represents a document that has been uploaded to OpenAI.
-      properties:
-        id:
-          type: string
-          description: The file identifier, which can be referenced in the API endpoints.
-        object:
-          type: string
-          description: The object type, which is always "file".
-        bytes:
-          type: integer
-          description: The size of the file in bytes.
-        created_at:
-          type: integer
-          description: The Unix timestamp (in seconds) for when the file was created.
-        filename:
-          type: string
-          description: The name of the file.
-        purpose:
-          type: string
-          description: The intended purpose of the file. Currently, only "fine-tune" is supported.
-        status:
-          type: string
-          description: The current status of the file, which can be either `uploaded`, `processed`, `pending`, `error`, `deleting` or `deleted`.
-        status_details:
-          type: string
-          nullable: true
-          description: |
-            Additional details about the status of the file. If the file is in the `error` state, this will include a message describing the error.
-      required:
-        - id
-        - object
-        - bytes
-        - created_at
-        - filename
-        - purpose
-        - format
-      x-oaiMeta:
-        name: The file object
-        example: |
-          {
-            "id": "file-abc123",
-            "object": "file",
-            "bytes": 120000,
-            "created_at": 1677610602,
-            "filename": "my_file.jsonl",
-            "purpose": "fine-tune",
-            "status": "uploaded",
-            "status_details": null
-          }
-    Embedding:
-      type: object
-      description: |
-        Represents an embedding vector returned by embedding endpoint.
-      properties:
-        index:
-          type: integer
-          description: The index of the embedding in the list of embeddings.
-        object:
-          type: string
-          description: The object type, which is always "embedding".
-        embedding:
-          type: array
-          description: |
-            The embedding vector, which is a list of floats. The length of vector depends on the model as listed in the [embedding guide](/docs/guides/embeddings).
-          items:
-            type: number
-      required:
-        - index
-        - object
-        - embedding
-      x-oaiMeta:
-        name: The embedding object
-        example: |
-          {
-            "object": "embedding",
-            "embedding": [
-              0.0023064255,
-              -0.009327292,
-              .... (1536 floats total for ada-002)
-              -0.0028842222,
-            ],
-            "index": 0
-          }
-
-    FineTuningJob:
-      title: FineTuningJob
-      description: |
-        The `fine_tuning.job` object represents a fine-tuning job that has been created through the API.
-      properties:
-        id:
-          type: string
-          description: The object identifier, which can be referenced in the API endpoints.
-        object:
-          type: string
-          description: The object type, which is always "fine_tuning.job".
-        created_at:
-          type: integer
-          description: The Unix timestamp (in seconds) for when the fine-tuning job was created.
-        finished_at:
-          type: integer
-          nullable: true
-          description: The Unix timestamp (in seconds) for when the fine-tuning job was finished. The value will be null if the fine-tuning job is still running.
-        model:
-          type: string
-          description: The base model that is being fine-tuned.
-        fine_tuned_model:
-          type: string
-          nullable: true
-          description: The name of the fine-tuned model that is being created. The value will be null if the fine-tuning job is still running.
-        organization_id:
-          type: string
-          description: The organization that owns the fine-tuning job.
-        status:
-          type: string
-          description: The current status of the fine-tuning job, which can be either `validating_files`, `queued`, `running`, `succeeded`, `failed`, or `cancelled`.
-        hyperparameters:
-          type: object
-          description: The hyperparameters used for the fine-tuning job. See the [fine-tuning guide](/docs/guides/fine-tuning) for more details.
-          properties:
-            n_epochs:
-              oneOf:
-                - type: string
-                  enum: [auto]
-                - type: integer
-                  minimum: 1
-                  maximum: 50
-              default: auto
-              description:
-                The number of epochs to train the model for. An epoch refers to one full cycle through the training dataset.
-
-                "auto" decides the optimal number of epochs based on the size of the dataset. If setting the number manually, we support any number between 1 and 50 epochs.
-          required:
-            - n_epochs
-        training_file:
-          type: string
-          description: The file ID used for training. You can retrieve the training data with the [Files API](/docs/api-reference/files/retrieve-contents).
-        validation_file:
-          type: string
-          nullable: true
-          description: The file ID used for validation. You can retrieve the validation results with the [Files API](/docs/api-reference/files/retrieve-contents).
-        result_files:
-          type: array
-          description: The compiled results file ID(s) for the fine-tuning job. You can retrieve the results with the [Files API](/docs/api-reference/files/retrieve-contents).
-          items:
-            type: string
-            example: file-abc123
-        trained_tokens:
-          type: integer
-          nullable: true
-          description: The total number of billable tokens processed by this fine-tuning job. The value will be null if the fine-tuning job is still running.
-        error:
-          type: object
-          nullable: true
-          description: For fine-tuning jobs that have `failed`, this will contain more information on the cause of the failure.
-          properties:
-            message:
-              type: string
-              description: A human-readable error message.
-            code:
-              type: string
-              description: A machine-readable error code.
-            param:
-              type: string
-              description: The parameter that was invalid, usually `training_file` or `validation_file`. This field will be null if the failure was not parameter-specific.
-              nullable: true
-          required:
-            - message
-            - code
-            - param
-      required:
-        - id
-        - object
-        - created_at
-        - finished_at
-        - model
-        - fine_tuned_model
-        - organization_id
-        - status
-        - hyperparameters
-        - training_file
-        - validation_file
-        - result_files
-        - trained_tokens
-        - error
-      x-oaiMeta:
-        name: The fine-tuning job object
-        example: *fine_tuning_example
-
-    FineTuningEvent:
-      title: FineTuningEvent
-      properties:
-        object:
-          type: string
-        created_at:
-          type: integer
-        level:
-          type: string
-        message:
-          type: string
-        data:
-          oneOf:
-            - type: string
-              default: none
-              enum: [none, string]
-        type:
-          oneOf:
-            - type: string
-              default: none
-              enum: ["message", "metrics"]
-      required:
-        - object
-        - created_at
-        - level
-        - message
-      x-oiMeta:
-        name: The fine-tuning event object
-        example: |
-          {
-            "object": "fine_tuning.job.event",
-            "created_at": "1689376978",
-            "level": "info" | "warn" | "error",
-            "message": "",
-            "data": null | JSON,
-            "type": "message"| "metrics"
-          }
-
-    FineTune:
-      title: FineTune
-      deprecated: true
-      description: |
-        The `FineTune` object represents a legacy fine-tune job that has been created through the API.
-      properties:
-        id:
-          type: string
-          description: The object identifier, which can be referenced in the API endpoints.
-        object:
-          type: string
-          description: The object type, which is always "fine-tune".
-        created_at:
-          type: integer
-          description: The Unix timestamp (in seconds) for when the fine-tuning job was created.
-        updated_at:
-          type: integer
-          description: The Unix timestamp (in seconds) for when the fine-tuning job was last updated.
-        model:
-          type: string
-          description: The base model that is being fine-tuned.
-        fine_tuned_model:
-          type: string
-          nullable: true
-          description: The name of the fine-tuned model that is being created.
-        organization_id:
-          type: string
-          description: The organization that owns the fine-tuning job.
-        status:
-          type: string
-          description: The current status of the fine-tuning job, which can be either `created`, `running`, `succeeded`, `failed`, or `cancelled`.
-        hyperparams:
-          type: object
-          description: The hyperparameters used for the fine-tuning job. See the [fine-tuning guide](/docs/guides/legacy-fine-tuning/hyperparameters) for more details.
-          properties:
-            n_epochs:
-              type: integer
-              description: |
-                The number of epochs to train the model for. An epoch refers to one
-                full cycle through the training dataset.
-            batch_size:
-              type: integer
-              description: |
-                The batch size to use for training. The batch size is the number of
-                training examples used to train a single forward and backward pass.
-            prompt_loss_weight:
-              type: number
-              description: |
-                The weight to use for loss on the prompt tokens.
-            learning_rate_multiplier:
-              type: number
-              description: |
-                The learning rate multiplier to use for training.
-            compute_classification_metrics:
-              type: boolean
-              description: |
-                The classification metrics to compute using the validation dataset at the end of every epoch.
-            classification_positive_class:
-              type: string
-              description: |
-                The positive class to use for computing classification metrics.
-            classification_n_classes:
-              type: integer
-              description: |
-                The number of classes to use for computing classification metrics.
-          required:
-            - n_epochs
-            - batch_size
-            - prompt_loss_weight
-            - learning_rate_multiplier
-        training_files:
-          type: array
-          description: The list of files used for training.
-          items:
-            $ref: "#/components/schemas/OpenAIFile"
-        validation_files:
-          type: array
-          description: The list of files used for validation.
-          items:
-            $ref: "#/components/schemas/OpenAIFile"
-        result_files:
-          type: array
-          description: The compiled results files for the fine-tuning job.
-          items:
-            $ref: "#/components/schemas/OpenAIFile"
-        events:
-          type: array
-          description: The list of events that have been observed in the lifecycle of the FineTune job.
-          items:
-            $ref: "#/components/schemas/FineTuneEvent"
-      required:
-        - id
-        - object
-        - created_at
-        - updated_at
-        - model
-        - fine_tuned_model
-        - organization_id
-        - status
-        - hyperparams
-        - training_files
-        - validation_files
-        - result_files
-      x-oaiMeta:
-        name: The fine-tune object
-        example: *fine_tune_example
-
-    FineTuningJobEvent:
-      title: FineTuningJobEvent
-      properties:
-        id:
-          type: string
-        object:
-          type: string
-        created_at:
-          type: integer
-        level:
-          type: string
-          enum: ["info", "warn", "error"]
-        message:
-          type: string
-      required:
-        - id
-        - object
-        - created_at
-        - level
-        - message
-      x-oiMeta:
-        name: The fine-tuning job event object
-        example: |
-          {
-            "object": "event",
-            "id": "ftevent-abc123"
-            "created_at": 1677610602,
-            "level": "info",
-            "message": "Created fine-tuning job"
-          }
-
-    FineTuneEvent:
-      title: FineTuneEvent
-      properties:
-        object:
-          type: string
-        created_at:
-          type: integer
-        level:
-          type: string
-        message:
-          type: string
-      required:
-        - object
-        - created_at
-        - level
-        - message
-      x-oiMeta:
-        name: The fine-tune event object
-        example: |
-          {
-            "object": "event",
-            "created_at": 1677610602,
-            "level": "info",
-            "message": "Created fine-tune job"
-          }
-    CompletionUsage:
-      type: object
-      description: Usage statistics for the completion request.
-      properties:
-        prompt_tokens:
-          type: integer
-          description: Number of tokens in the prompt.
-        completion_tokens:
-          type: integer
-          description: Number of tokens in the generated completion.
-        total_tokens:
-          type: integer
-          description: Total number of tokens used in the request (prompt + completion).
-      required:
-        - prompt_tokens
-        - completion_tokens
-        - total_tokens
-
-security:
-  - ApiKeyAuth: []
-
-x-oaiMeta:
-  groups:
-    # > General Notes
-    # The `groups` section is used to generate the API reference pages and navigation, in the same
-    # order listed below. Additionally, each `group` can have a list of `sections`, each of which
-    # will become a navigation subroute and subsection under the group. Each section has:
-    #  - `type`: Currently, either an `endpoint` or `object`, depending on how the section needs to
-    #            be rendered
-    #  - `key`: The reference key that can be used to lookup the section definition
-    #  - `path`: The path (url) of the section, which is used to generate the navigation link.
-    #
-    # > The `object` sections maps to a schema component and the following fields are read for rendering
-    #  - `x-oaiMeta.name`: The name of the object, which will become the section title
-    #  - `x-oaiMeta.example`: The example object, which will be used to generate the example sample (always JSON)
-    #  - `description`: The description of the object, which will be used to generate the section description
-    #
-    # > The `endpoint` section maps to an operation path and the following fields are read for rendering:
-    #  - `x-oaiMeta.name`: The name of the endpoint, which will become the section title
-    #  - `x-oaiMeta.examples`: The endpoint examples, which can be an object (meaning a single variation, most
-    #                          endpoints, or an array of objects, meaning multiple variations, e.g. the
-    #                          chat completion and completion endpoints, with streamed and non-streamed examples.
-    #  - `x-oaiMeta.returns`: text describing what the endpoint returns.
-    #  - `summary`: The summary of the endpoint, which will be used to generate the section description
-    - id: audio
-      title: Audio
-      description: |
-        Learn how to turn audio into text.
-
-        Related guide: [Speech to text](/docs/guides/speech-to-text)
-      sections:
-        - type: endpoint
-          key: createTranscription
-          path: createTranscription
-        - type: endpoint
-          key: createTranslation
-          path: createTranslation
-    - id: chat
-      title: Chat
-      description: |
-        Given a list of messages comprising a conversation, the model will return a response.
-
-        Related guide: [Chat completions](/docs/guides/gpt)
-      sections:
-        - type: object
-          key: CreateChatCompletionResponse
-          path: object
-        - type: object
-          key: CreateChatCompletionStreamResponse
-          path: streaming
-        - type: endpoint
-          key: createChatCompletion
-          path: create
-    - id: completions
-      title: Completions
-      legacy: true
-      description: |
-        Given a prompt, the model will return one or more predicted completions, and can also return the probabilities of alternative tokens at each position. We recommend most users use our Chat completions API. [Learn more](/docs/deprecations/2023-07-06-gpt-and-embeddings)
-
-        Related guide: [Legacy Completions](/docs/guides/gpt/completions-api)
-      sections:
-        - type: object
-          key: CreateCompletionResponse
-          path: object
-        - type: endpoint
-          key: createCompletion
-          path: create
-    - id: embeddings
-      title: Embeddings
-      description: |
-        Get a vector representation of a given input that can be easily consumed by machine learning models and algorithms.
-
-        Related guide: [Embeddings](/docs/guides/embeddings)
-      sections:
-        - type: object
-          key: Embedding
-          path: object
-        - type: endpoint
-          key: createEmbedding
-          path: create
-    - id: fine-tuning
-      title: Fine-tuning
-      description: |
-        Manage fine-tuning jobs to tailor a model to your specific training data.
-
-        Related guide: [fine-tune models](/docs/guides/fine-tuning)
-      sections:
-        - type: object
-          path: object
-          key: FineTuningJob
-        - type: endpoint
-          key: createFineTuningJob
-          path: create
-        - type: endpoint
-          key: listPaginatedFineTuningJobs
-        - type: endpoint
-          key: retrieveFineTuningJob
-          path: retrieve
-        - type: endpoint
-          key: cancelFineTuningJob
-          path: cancel
-        - type: endpoint
-          key: listFineTuningEvents
-          path: list-events
-    - id: files
-      title: Files
-      description: |
-        Files are used to upload documents that can be used with features like [fine-tuning](/docs/api-reference/fine-tuning).
-      sections:
-        - type: object
-          key: OpenAIFile
-          path: object
-        - type: endpoint
-          key: listFiles
-          path: list
-        - type: endpoint
-          key: createFile
-          path: create
-        - type: endpoint
-          key: deleteFile
-          path: delete
-        - type: endpoint
-          key: retrieveFile
-          path: retrieve
-        - type: endpoint
-          key: downloadFile
-          path: retrieve-contents
-    - id: images
-      title: Images
-      description: |
-        Given a prompt and/or an input image, the model will generate a new image.
-
-        Related guide: [Image generation](/docs/guides/images)
-      sections:
-        - type: object
-          key: Image
-          path: object
-        - type: endpoint
-          key: createImage
-          path: create
-        - type: endpoint
-          key: createImageEdit
-          path: createEdit
-        - type: endpoint
-          key: createImageVariation
-          path: createVariation
-    - id: models
-      title: Models
-      description: |
-        List and describe the various models available in the API. You can refer to the [Models](/docs/models) documentation to understand what models are available and the differences between them.
-      sections:
-        - type: object
-          key: Model
-          path: object
-        - type: endpoint
-          key: listModels
-          path: list
-        - type: endpoint
-          key: retrieveModel
-          path: retrieve
-        - type: endpoint
-          key: deleteModel
-          path: delete
-    - id: moderations
-      title: Moderations
-      description: |
-        Given a input text, outputs if the model classifies it as violating OpenAI's content policy.
-
-        Related guide: [Moderations](/docs/guides/moderation)
-      sections:
-        - type: object
-          key: CreateModerationResponse
-          path: object
-        - type: endpoint
-          key: createModeration
-          path: create
-    - id: fine-tunes
-      title: Fine-tunes
-      deprecated: true
-      description: |
-        Manage legacy fine-tuning jobs to tailor a model to your specific training data.
-
-        We recommend transitioning to the updating [fine-tuning API](/docs/guides/fine-tuning)
-      sections:
-        - type: object
-          path: object
-          key: FineTune
-        - type: endpoint
-          key: createFineTune
-          path: create
-        - type: endpoint
-          key: listFineTunes
-          path: list
-        - type: endpoint
-          key: retrieveFineTune
-          path: retrieve
-        - type: endpoint
-          key: cancelFineTune
-          path: cancel
-        - type: endpoint
-          key: listFineTuneEvents
-          path: list-events
-    - id: edits
-      title: Edits
-      deprecated: true
-      description: |
-        Given a prompt and an instruction, the model will return an edited version of the prompt.
-      sections:
-        - type: object
-          key: CreateEditResponse
-          path: object
-        - type: endpoint
-          key: createEdit
-          path: create
diff --git a/openapi3-original.yaml b/openapi3-original.yaml
new file mode 100644
index 000000000..81f8e0ed1
--- /dev/null
+++ b/openapi3-original.yaml
@@ -0,0 +1,17073 @@
+openapi: 3.0.0
+info:
+  title: OpenAI API
+  description: The OpenAI REST API. Please see https://platform.openai.com/docs/api-reference for more details.
+  version: "2.3.0"
+  termsOfService: https://openai.com/policies/terms-of-use
+  contact:
+    name: OpenAI Support
+    url: https://help.openai.com/
+  license:
+    name: MIT
+    url: https://github.com/openai/openai-openapi/blob/master/LICENSE
+servers:
+  - url: https://api.openai.com/v1
+tags:
+  - name: Assistants
+    description: Build Assistants that can call models and use tools.
+  - name: Audio
+    description: Turn audio into text or text into audio.
+  - name: Chat
+    description: Given a list of messages comprising a conversation, the model will return a response.
+  - name: Completions
+    description: Given a prompt, the model will return one or more predicted completions, and can also return the probabilities of alternative tokens at each position.
+  - name: Embeddings
+    description: Get a vector representation of a given input that can be easily consumed by machine learning models and algorithms.
+  - name: Fine-tuning
+    description: Manage fine-tuning jobs to tailor a model to your specific training data.
+  - name: Batch
+    description: Create large batches of API requests to run asynchronously.
+  - name: Files
+    description: Files are used to upload documents that can be used with features like Assistants and Fine-tuning.
+  - name: Uploads
+    description: Use Uploads to upload large files in multiple parts.
+  - name: Images
+    description: Given a prompt and/or an input image, the model will generate a new image.
+  - name: Models
+    description: List and describe the various models available in the API.
+  - name: Moderations
+    description: Given a input text, outputs if the model classifies it as potentially harmful.
+  - name: Audit Logs
+    description: List user actions and configuration changes within this organization.
+paths:
+  # Note: When adding an endpoint, make sure you also add it in the `groups` section, in the end of this file,
+  # under the appropriate group
+  /chat/completions:
+    post:
+      operationId: createChatCompletion
+      tags:
+        - Chat
+      summary: Creates a model response for the given chat conversation.
+ requestBody: + required: true + content: + application/json: + schema: + $ref: "#/components/schemas/CreateChatCompletionRequest" + responses: + "200": + description: OK + content: + application/json: + schema: + $ref: "#/components/schemas/CreateChatCompletionResponse" + + x-oaiMeta: + name: Create chat completion + group: chat + returns: | + Returns a [chat completion](/docs/api-reference/chat/object) object, or a streamed sequence of [chat completion chunk](/docs/api-reference/chat/streaming) objects if the request is streamed. + path: create + examples: + - title: Default + request: + curl: | + curl https://api.openai.com/v1/chat/completions \ + -H "Content-Type: application/json" \ + -H "Authorization: Bearer $OPENAI_API_KEY" \ + -d '{ + "model": "VAR_model_id", + "messages": [ + { + "role": "system", + "content": "You are a helpful assistant." + }, + { + "role": "user", + "content": "Hello!" + } + ] + }' + python: | + from openai import OpenAI + client = OpenAI() + + completion = client.chat.completions.create( + model="VAR_model_id", + messages=[ + {"role": "system", "content": "You are a helpful assistant."}, + {"role": "user", "content": "Hello!"} + ] + ) + + print(completion.choices[0].message) + node.js: |- + import OpenAI from "openai"; + + const openai = new OpenAI(); + + async function main() { + const completion = await openai.chat.completions.create({ + messages: [{ role: "system", content: "You are a helpful assistant." 
}], + model: "VAR_model_id", + }); + + console.log(completion.choices[0]); + } + + main(); + response: &chat_completion_example | + { + "id": "chatcmpl-123", + "object": "chat.completion", + "created": 1677652288, + "model": "gpt-4o-mini", + "system_fingerprint": "fp_44709d6fcb", + "choices": [{ + "index": 0, + "message": { + "role": "assistant", + "content": "\n\nHello there, how may I assist you today?" + }, + "logprobs": null, + "finish_reason": "stop" + }], + "usage": { + "prompt_tokens": 9, + "completion_tokens": 12, + "total_tokens": 21 + } + } + - title: Image input + request: + curl: | + curl https://api.openai.com/v1/chat/completions \ + -H "Content-Type: application/json" \ + -H "Authorization: Bearer $OPENAI_API_KEY" \ + -d '{ + "model": "gpt-4o", + "messages": [ + { + "role": "user", + "content": [ + { + "type": "text", + "text": "What'\''s in this image?" + }, + { + "type": "image_url", + "image_url": { + "url": "https://upload.wikimedia.org/wikipedia/commons/thumb/d/dd/Gfp-wisconsin-madison-the-nature-boardwalk.jpg/2560px-Gfp-wisconsin-madison-the-nature-boardwalk.jpg" + } + } + ] + } + ], + "max_tokens": 300 + }' + python: | + from openai import OpenAI + + client = OpenAI() + + response = client.chat.completions.create( + model="gpt-4o", + messages=[ + { + "role": "user", + "content": [ + {"type": "text", "text": "What's in this image?"}, + { + "type": "image_url", + "image_url": {"url": "https://upload.wikimedia.org/wikipedia/commons/thumb/d/dd/Gfp-wisconsin-madison-the-nature-boardwalk.jpg/2560px-Gfp-wisconsin-madison-the-nature-boardwalk.jpg"}, + }, + ], + } + ], + max_tokens=300, + ) + + print(response.choices[0]) + node.js: |- + import OpenAI from "openai"; + + const openai = new OpenAI(); + + async function main() { + const response = await openai.chat.completions.create({ + model: "gpt-4o", + messages: [ + { + role: "user", + content: [ + { type: "text", text: "What's in this image?"
}, + { + type: "image_url", + image_url: { url: "https://upload.wikimedia.org/wikipedia/commons/thumb/d/dd/Gfp-wisconsin-madison-the-nature-boardwalk.jpg/2560px-Gfp-wisconsin-madison-the-nature-boardwalk.jpg" }, + }, + ], + }, + ], + }); + console.log(response.choices[0]); + } + main(); + response: &chat_completion_image_example | + { + "id": "chatcmpl-123", + "object": "chat.completion", + "created": 1677652288, + "model": "gpt-4o-mini", + "system_fingerprint": "fp_44709d6fcb", + "choices": [{ + "index": 0, + "message": { + "role": "assistant", + "content": "\n\nThis image shows a wooden boardwalk extending through a lush green marshland." + }, + "logprobs": null, + "finish_reason": "stop" + }], + "usage": { + "prompt_tokens": 9, + "completion_tokens": 12, + "total_tokens": 21 + } + } + - title: Streaming + request: + curl: | + curl https://api.openai.com/v1/chat/completions \ + -H "Content-Type: application/json" \ + -H "Authorization: Bearer $OPENAI_API_KEY" \ + -d '{ + "model": "VAR_model_id", + "messages": [ + { + "role": "system", + "content": "You are a helpful assistant." + }, + { + "role": "user", + "content": "Hello!"
+ } + ], + "stream": true + }' + python: | + from openai import OpenAI + client = OpenAI() + + completion = client.chat.completions.create( + model="VAR_model_id", + messages=[ + {"role": "system", "content": "You are a helpful assistant."}, + {"role": "user", "content": "Hello!"} + ], + stream=True + ) + + for chunk in completion: + print(chunk.choices[0].delta) + + node.js: |- + import OpenAI from "openai"; + + const openai = new OpenAI(); + + async function main() { + const completion = await openai.chat.completions.create({ + model: "VAR_model_id", + messages: [ + {"role": "system", "content": "You are a helpful assistant."}, + {"role": "user", "content": "Hello!"} + ], + stream: true, + }); + + for await (const chunk of completion) { + console.log(chunk.choices[0].delta.content); + } + } + + main(); + response: &chat_completion_chunk_example | + {"id":"chatcmpl-123","object":"chat.completion.chunk","created":1694268190,"model":"gpt-4o-mini", "system_fingerprint": "fp_44709d6fcb", "choices":[{"index":0,"delta":{"role":"assistant","content":""},"logprobs":null,"finish_reason":null}]} + + {"id":"chatcmpl-123","object":"chat.completion.chunk","created":1694268190,"model":"gpt-4o-mini", "system_fingerprint": "fp_44709d6fcb", "choices":[{"index":0,"delta":{"content":"Hello"},"logprobs":null,"finish_reason":null}]} + + .... + + {"id":"chatcmpl-123","object":"chat.completion.chunk","created":1694268190,"model":"gpt-4o-mini", "system_fingerprint": "fp_44709d6fcb", "choices":[{"index":0,"delta":{},"logprobs":null,"finish_reason":"stop"}]} + - title: Functions + request: + curl: | + curl https://api.openai.com/v1/chat/completions \ + -H "Content-Type: application/json" \ + -H "Authorization: Bearer $OPENAI_API_KEY" \ + -d '{ + "model": "gpt-4o", + "messages": [ + { + "role": "user", + "content": "What'\''s the weather like in Boston today?" 
+ } + ], + "tools": [ + { + "type": "function", + "function": { + "name": "get_current_weather", + "description": "Get the current weather in a given location", + "parameters": { + "type": "object", + "properties": { + "location": { + "type": "string", + "description": "The city and state, e.g. San Francisco, CA" + }, + "unit": { + "type": "string", + "enum": ["celsius", "fahrenheit"] + } + }, + "required": ["location"] + } + } + } + ], + "tool_choice": "auto" + }' + python: | + from openai import OpenAI + client = OpenAI() + + tools = [ + { + "type": "function", + "function": { + "name": "get_current_weather", + "description": "Get the current weather in a given location", + "parameters": { + "type": "object", + "properties": { + "location": { + "type": "string", + "description": "The city and state, e.g. San Francisco, CA", + }, + "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]}, + }, + "required": ["location"], + }, + } + } + ] + messages = [{"role": "user", "content": "What's the weather like in Boston today?"}] + completion = client.chat.completions.create( + model="VAR_model_id", + messages=messages, + tools=tools, + tool_choice="auto" + ) + + print(completion) + node.js: |- + import OpenAI from "openai"; + + const openai = new OpenAI(); + + async function main() { + const messages = [{"role": "user", "content": "What's the weather like in Boston today?"}]; + const tools = [ + { + "type": "function", + "function": { + "name": "get_current_weather", + "description": "Get the current weather in a given location", + "parameters": { + "type": "object", + "properties": { + "location": { + "type": "string", + "description": "The city and state, e.g. 
San Francisco, CA", + }, + "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]}, + }, + "required": ["location"], + }, + } + } + ]; + + const response = await openai.chat.completions.create({ + model: "gpt-4o", + messages: messages, + tools: tools, + tool_choice: "auto", + }); + + console.log(response); + } + + main(); + response: &chat_completion_function_example | + { + "id": "chatcmpl-abc123", + "object": "chat.completion", + "created": 1699896916, + "model": "gpt-4o-mini", + "choices": [ + { + "index": 0, + "message": { + "role": "assistant", + "content": null, + "tool_calls": [ + { + "id": "call_abc123", + "type": "function", + "function": { + "name": "get_current_weather", + "arguments": "{\n\"location\": \"Boston, MA\"\n}" + } + } + ] + }, + "logprobs": null, + "finish_reason": "tool_calls" + } + ], + "usage": { + "prompt_tokens": 82, + "completion_tokens": 17, + "total_tokens": 99 + } + } + - title: Logprobs + request: + curl: | + curl https://api.openai.com/v1/chat/completions \ + -H "Content-Type: application/json" \ + -H "Authorization: Bearer $OPENAI_API_KEY" \ + -d '{ + "model": "VAR_model_id", + "messages": [ + { + "role": "user", + "content": "Hello!" + } + ], + "logprobs": true, + "top_logprobs": 2 + }' + python: | + from openai import OpenAI + client = OpenAI() + + completion = client.chat.completions.create( + model="VAR_model_id", + messages=[ + {"role": "user", "content": "Hello!"} + ], + logprobs=True, + top_logprobs=2 + ) + + print(completion.choices[0].message) + print(completion.choices[0].logprobs) + node.js: |- + import OpenAI from "openai"; + + const openai = new OpenAI(); + + async function main() { + const completion = await openai.chat.completions.create({ + messages: [{ role: "user", content: "Hello!" 
}], + model: "VAR_model_id", + logprobs: true, + top_logprobs: 2, + }); + + console.log(completion.choices[0]); + } + + main(); + response: | + { + "id": "chatcmpl-123", + "object": "chat.completion", + "created": 1702685778, + "model": "gpt-4o-mini", + "choices": [ + { + "index": 0, + "message": { + "role": "assistant", + "content": "Hello! How can I assist you today?" + }, + "logprobs": { + "content": [ + { + "token": "Hello", + "logprob": -0.31725305, + "bytes": [72, 101, 108, 108, 111], + "top_logprobs": [ + { + "token": "Hello", + "logprob": -0.31725305, + "bytes": [72, 101, 108, 108, 111] + }, + { + "token": "Hi", + "logprob": -1.3190403, + "bytes": [72, 105] + } + ] + }, + { + "token": "!", + "logprob": -0.02380986, + "bytes": [ + 33 + ], + "top_logprobs": [ + { + "token": "!", + "logprob": -0.02380986, + "bytes": [33] + }, + { + "token": " there", + "logprob": -3.787621, + "bytes": [32, 116, 104, 101, 114, 101] + } + ] + }, + { + "token": " How", + "logprob": -0.000054669687, + "bytes": [32, 72, 111, 119], + "top_logprobs": [ + { + "token": " How", + "logprob": -0.000054669687, + "bytes": [32, 72, 111, 119] + }, + { + "token": "<|end|>", + "logprob": -10.953937, + "bytes": null + } + ] + }, + { + "token": " can", + "logprob": -0.015801601, + "bytes": [32, 99, 97, 110], + "top_logprobs": [ + { + "token": " can", + "logprob": -0.015801601, + "bytes": [32, 99, 97, 110] + }, + { + "token": " may", + "logprob": -4.161023, + "bytes": [32, 109, 97, 121] + } + ] + }, + { + "token": " I", + "logprob": -3.7697225e-6, + "bytes": [ + 32, + 73 + ], + "top_logprobs": [ + { + "token": " I", + "logprob": -3.7697225e-6, + "bytes": [32, 73] + }, + { + "token": " assist", + "logprob": -13.596657, + "bytes": [32, 97, 115, 115, 105, 115, 116] + } + ] + }, + { + "token": " assist", + "logprob": -0.04571125, + "bytes": [32, 97, 115, 115, 105, 115, 116], + "top_logprobs": [ + { + "token": " assist", + "logprob": -0.04571125, + "bytes": [32, 97, 115, 115, 105, 115, 116] + }, + { + 
"token": " help", + "logprob": -3.1089056, + "bytes": [32, 104, 101, 108, 112] + } + ] + }, + { + "token": " you", + "logprob": -5.4385737e-6, + "bytes": [32, 121, 111, 117], + "top_logprobs": [ + { + "token": " you", + "logprob": -5.4385737e-6, + "bytes": [32, 121, 111, 117] + }, + { + "token": " today", + "logprob": -12.807695, + "bytes": [32, 116, 111, 100, 97, 121] + } + ] + }, + { + "token": " today", + "logprob": -0.0040071653, + "bytes": [32, 116, 111, 100, 97, 121], + "top_logprobs": [ + { + "token": " today", + "logprob": -0.0040071653, + "bytes": [32, 116, 111, 100, 97, 121] + }, + { + "token": "?", + "logprob": -5.5247097, + "bytes": [63] + } + ] + }, + { + "token": "?", + "logprob": -0.0008108172, + "bytes": [63], + "top_logprobs": [ + { + "token": "?", + "logprob": -0.0008108172, + "bytes": [63] + }, + { + "token": "?\n", + "logprob": -7.184561, + "bytes": [63, 10] + } + ] + } + ] + }, + "finish_reason": "stop" + } + ], + "usage": { + "prompt_tokens": 9, + "completion_tokens": 9, + "total_tokens": 18 + }, + "system_fingerprint": null + } + + /completions: + post: + operationId: createCompletion + tags: + - Completions + summary: Creates a completion for the provided prompt and parameters. + requestBody: + required: true + content: + application/json: + schema: + $ref: "#/components/schemas/CreateCompletionRequest" + responses: + "200": + description: OK + content: + application/json: + schema: + $ref: "#/components/schemas/CreateCompletionResponse" + x-oaiMeta: + name: Create completion + group: completions + returns: | + Returns a [completion](/docs/api-reference/completions/object) object, or a sequence of completion objects if the request is streamed. 
+ legacy: true + examples: + - title: No streaming + request: + curl: | + curl https://api.openai.com/v1/completions \ + -H "Content-Type: application/json" \ + -H "Authorization: Bearer $OPENAI_API_KEY" \ + -d '{ + "model": "VAR_model_id", + "prompt": "Say this is a test", + "max_tokens": 7, + "temperature": 0 + }' + python: | + from openai import OpenAI + client = OpenAI() + + client.completions.create( + model="VAR_model_id", + prompt="Say this is a test", + max_tokens=7, + temperature=0 + ) + node.js: |- + import OpenAI from "openai"; + + const openai = new OpenAI(); + + async function main() { + const completion = await openai.completions.create({ + model: "VAR_model_id", + prompt: "Say this is a test.", + max_tokens: 7, + temperature: 0, + }); + + console.log(completion); + } + main(); + response: | + { + "id": "cmpl-uqkvlQyYK7bGYrRHQ0eXlWi7", + "object": "text_completion", + "created": 1589478378, + "model": "VAR_model_id", + "system_fingerprint": "fp_44709d6fcb", + "choices": [ + { + "text": "\n\nThis is indeed a test", + "index": 0, + "logprobs": null, + "finish_reason": "length" + } + ], + "usage": { + "prompt_tokens": 5, + "completion_tokens": 7, + "total_tokens": 12 + } + } + - title: Streaming + request: + curl: | + curl https://api.openai.com/v1/completions \ + -H "Content-Type: application/json" \ + -H "Authorization: Bearer $OPENAI_API_KEY" \ + -d '{ + "model": "VAR_model_id", + "prompt": "Say this is a test", + "max_tokens": 7, + "temperature": 0, + "stream": true + }' + python: | + from openai import OpenAI + client = OpenAI() + + for chunk in client.completions.create( + model="VAR_model_id", + prompt="Say this is a test", + max_tokens=7, + temperature=0, + stream=True + ): + print(chunk.choices[0].text) + node.js: |- + import OpenAI from "openai"; + + const openai = new OpenAI(); + + async function main() { + const stream = await openai.completions.create({ + model: "VAR_model_id", + prompt: "Say this is a test.", + stream: true, + }); + + for 
await (const chunk of stream) { + console.log(chunk.choices[0].text) + } + } + main(); + response: | + { + "id": "cmpl-7iA7iJjj8V2zOkCGvWF2hAkDWBQZe", + "object": "text_completion", + "created": 1690759702, + "choices": [ + { + "text": "This", + "index": 0, + "logprobs": null, + "finish_reason": null + } + ], + "model": "gpt-3.5-turbo-instruct", + "system_fingerprint": "fp_44709d6fcb" + } + + /images/generations: + post: + operationId: createImage + tags: + - Images + summary: Creates an image given a prompt. + requestBody: + required: true + content: + application/json: + schema: + $ref: "#/components/schemas/CreateImageRequest" + responses: + "200": + description: OK + content: + application/json: + schema: + $ref: "#/components/schemas/ImagesResponse" + x-oaiMeta: + name: Create image + group: images + returns: Returns a list of [image](/docs/api-reference/images/object) objects. + examples: + request: + curl: | + curl https://api.openai.com/v1/images/generations \ + -H "Content-Type: application/json" \ + -H "Authorization: Bearer $OPENAI_API_KEY" \ + -d '{ + "model": "dall-e-3", + "prompt": "A cute baby sea otter", + "n": 1, + "size": "1024x1024" + }' + python: | + from openai import OpenAI + client = OpenAI() + + client.images.generate( + model="dall-e-3", + prompt="A cute baby sea otter", + n=1, + size="1024x1024" + ) + node.js: |- + import OpenAI from "openai"; + + const openai = new OpenAI(); + + async function main() { + const image = await openai.images.generate({ model: "dall-e-3", prompt: "A cute baby sea otter" }); + + console.log(image.data); + } + main(); + response: | + { + "created": 1589478378, + "data": [ + { + "url": "https://..." + }, + { + "url": "https://..." + } + ] + } + /images/edits: + post: + operationId: createImageEdit + tags: + - Images + summary: Creates an edited or extended image given an original image and a prompt.
+ requestBody: + required: true + content: + multipart/form-data: + schema: + $ref: "#/components/schemas/CreateImageEditRequest" + responses: + "200": + description: OK + content: + application/json: + schema: + $ref: "#/components/schemas/ImagesResponse" + x-oaiMeta: + name: Create image edit + group: images + returns: Returns a list of [image](/docs/api-reference/images/object) objects. + examples: + request: + curl: | + curl https://api.openai.com/v1/images/edits \ + -H "Authorization: Bearer $OPENAI_API_KEY" \ + -F image="@otter.png" \ + -F mask="@mask.png" \ + -F prompt="A cute baby sea otter wearing a beret" \ + -F n=2 \ + -F size="1024x1024" + python: | + from openai import OpenAI + client = OpenAI() + + client.images.edit( + image=open("otter.png", "rb"), + mask=open("mask.png", "rb"), + prompt="A cute baby sea otter wearing a beret", + n=2, + size="1024x1024" + ) + node.js: |- + import fs from "fs"; + import OpenAI from "openai"; + + const openai = new OpenAI(); + + async function main() { + const image = await openai.images.edit({ + image: fs.createReadStream("otter.png"), + mask: fs.createReadStream("mask.png"), + prompt: "A cute baby sea otter wearing a beret", + }); + + console.log(image.data); + } + main(); + response: | + { + "created": 1589478378, + "data": [ + { + "url": "https://..." + }, + { + "url": "https://..." + } + ] + } + /images/variations: + post: + operationId: createImageVariation + tags: + - Images + summary: Creates a variation of a given image. + requestBody: + required: true + content: + multipart/form-data: + schema: + $ref: "#/components/schemas/CreateImageVariationRequest" + responses: + "200": + description: OK + content: + application/json: + schema: + $ref: "#/components/schemas/ImagesResponse" + x-oaiMeta: + name: Create image variation + group: images + returns: Returns a list of [image](/docs/api-reference/images/object) objects. 
+ examples: + request: + curl: | + curl https://api.openai.com/v1/images/variations \ + -H "Authorization: Bearer $OPENAI_API_KEY" \ + -F image="@otter.png" \ + -F n=2 \ + -F size="1024x1024" + python: | + from openai import OpenAI + client = OpenAI() + + response = client.images.create_variation( + image=open("image_edit_original.png", "rb"), + n=2, + size="1024x1024" + ) + node.js: |- + import fs from "fs"; + import OpenAI from "openai"; + + const openai = new OpenAI(); + + async function main() { + const image = await openai.images.createVariation({ + image: fs.createReadStream("otter.png"), + }); + + console.log(image.data); + } + main(); + response: | + { + "created": 1589478378, + "data": [ + { + "url": "https://..." + }, + { + "url": "https://..." + } + ] + } + + /embeddings: + post: + operationId: createEmbedding + tags: + - Embeddings + summary: Creates an embedding vector representing the input text. + requestBody: + required: true + content: + application/json: + schema: + $ref: "#/components/schemas/CreateEmbeddingRequest" + responses: + "200": + description: OK + content: + application/json: + schema: + $ref: "#/components/schemas/CreateEmbeddingResponse" + x-oaiMeta: + name: Create embeddings + group: embeddings + returns: A list of [embedding](/docs/api-reference/embeddings/object) objects. 
+ examples: + request: + curl: | + curl https://api.openai.com/v1/embeddings \ + -H "Authorization: Bearer $OPENAI_API_KEY" \ + -H "Content-Type: application/json" \ + -d '{ + "input": "The food was delicious and the waiter...", + "model": "text-embedding-ada-002", + "encoding_format": "float" + }' + python: | + from openai import OpenAI + client = OpenAI() + + client.embeddings.create( + model="text-embedding-ada-002", + input="The food was delicious and the waiter...", + encoding_format="float" + ) + node.js: |- + import OpenAI from "openai"; + + const openai = new OpenAI(); + + async function main() { + const embedding = await openai.embeddings.create({ + model: "text-embedding-ada-002", + input: "The quick brown fox jumped over the lazy dog", + encoding_format: "float", + }); + + console.log(embedding); + } + + main(); + response: | + { + "object": "list", + "data": [ + { + "object": "embedding", + "embedding": [ + 0.0023064255, + -0.009327292, + .... (1536 floats total for ada-002) + -0.0028842222, + ], + "index": 0 + } + ], + "model": "text-embedding-ada-002", + "usage": { + "prompt_tokens": 8, + "total_tokens": 8 + } + } + + /audio/speech: + post: + operationId: createSpeech + tags: + - Audio + summary: Generates audio from the input text. + requestBody: + required: true + content: + application/json: + schema: + $ref: "#/components/schemas/CreateSpeechRequest" + responses: + "200": + description: OK + headers: + Transfer-Encoding: + schema: + type: string + description: chunked + content: + application/octet-stream: + schema: + type: string + format: binary + x-oaiMeta: + name: Create speech + group: audio + returns: The audio file content. 
+ examples: + request: + curl: | + curl https://api.openai.com/v1/audio/speech \ + -H "Authorization: Bearer $OPENAI_API_KEY" \ + -H "Content-Type: application/json" \ + -d '{ + "model": "tts-1", + "input": "The quick brown fox jumped over the lazy dog.", + "voice": "alloy" + }' \ + --output speech.mp3 + python: | + from pathlib import Path + import openai + + speech_file_path = Path(__file__).parent / "speech.mp3" + response = openai.audio.speech.create( + model="tts-1", + voice="alloy", + input="The quick brown fox jumped over the lazy dog." + ) + response.stream_to_file(speech_file_path) + node: | + import fs from "fs"; + import path from "path"; + import OpenAI from "openai"; + + const openai = new OpenAI(); + + const speechFile = path.resolve("./speech.mp3"); + + async function main() { + const mp3 = await openai.audio.speech.create({ + model: "tts-1", + voice: "alloy", + input: "Today is a wonderful day to build something people love!", + }); + console.log(speechFile); + const buffer = Buffer.from(await mp3.arrayBuffer()); + await fs.promises.writeFile(speechFile, buffer); + } + main(); + /audio/transcriptions: + post: + operationId: createTranscription + tags: + - Audio + summary: Transcribes audio into the input language. + requestBody: + required: true + content: + multipart/form-data: + schema: + $ref: "#/components/schemas/CreateTranscriptionRequest" + responses: + "200": + description: OK + content: + application/json: + schema: + oneOf: + - $ref: "#/components/schemas/CreateTranscriptionResponseJson" + - $ref: "#/components/schemas/CreateTranscriptionResponseVerboseJson" + x-oaiMeta: + name: Create transcription + group: audio + returns: The [transcription object](/docs/api-reference/audio/json-object) or a [verbose transcription object](/docs/api-reference/audio/verbose-json-object). 
+ examples: + - title: Default + request: + curl: | + curl https://api.openai.com/v1/audio/transcriptions \ + -H "Authorization: Bearer $OPENAI_API_KEY" \ + -H "Content-Type: multipart/form-data" \ + -F file="@/path/to/file/audio.mp3" \ + -F model="whisper-1" + python: | + from openai import OpenAI + client = OpenAI() + + audio_file = open("speech.mp3", "rb") + transcript = client.audio.transcriptions.create( + model="whisper-1", + file=audio_file + ) + node: | + import fs from "fs"; + import OpenAI from "openai"; + + const openai = new OpenAI(); + + async function main() { + const transcription = await openai.audio.transcriptions.create({ + file: fs.createReadStream("audio.mp3"), + model: "whisper-1", + }); + + console.log(transcription.text); + } + main(); + response: &basic_transcription_response_example | + { + "text": "Imagine the wildest idea that you've ever had, and you're curious about how it might scale to something that's a 100, a 1,000 times bigger. This is a place where you can get to do that." 
+ } + - title: Word timestamps + request: + curl: | + curl https://api.openai.com/v1/audio/transcriptions \ + -H "Authorization: Bearer $OPENAI_API_KEY" \ + -H "Content-Type: multipart/form-data" \ + -F file="@/path/to/file/audio.mp3" \ + -F "timestamp_granularities[]=word" \ + -F model="whisper-1" \ + -F response_format="verbose_json" + python: | + from openai import OpenAI + client = OpenAI() + + audio_file = open("speech.mp3", "rb") + transcript = client.audio.transcriptions.create( + file=audio_file, + model="whisper-1", + response_format="verbose_json", + timestamp_granularities=["word"] + ) + + print(transcript.words) + node: | + import fs from "fs"; + import OpenAI from "openai"; + + const openai = new OpenAI(); + + async function main() { + const transcription = await openai.audio.transcriptions.create({ + file: fs.createReadStream("audio.mp3"), + model: "whisper-1", + response_format: "verbose_json", + timestamp_granularities: ["word"] + }); + + console.log(transcription.text); + } + main(); + response: | + { + "task": "transcribe", + "language": "english", + "duration": 8.470000267028809, + "text": "The beach was a popular spot on a hot summer day. People were swimming in the ocean, building sandcastles, and playing beach volleyball.", + "words": [ + { + "word": "The", + "start": 0.0, + "end": 0.23999999463558197 + }, + ... 
+ { + "word": "volleyball", + "start": 7.400000095367432, + "end": 7.900000095367432 + } + ] + } + - title: Segment timestamps + request: + curl: | + curl https://api.openai.com/v1/audio/transcriptions \ + -H "Authorization: Bearer $OPENAI_API_KEY" \ + -H "Content-Type: multipart/form-data" \ + -F file="@/path/to/file/audio.mp3" \ + -F "timestamp_granularities[]=segment" \ + -F model="whisper-1" \ + -F response_format="verbose_json" + python: | + from openai import OpenAI + client = OpenAI() + + audio_file = open("speech.mp3", "rb") + transcript = client.audio.transcriptions.create( + file=audio_file, + model="whisper-1", + response_format="verbose_json", + timestamp_granularities=["segment"] + ) + + print(transcript.words) + node: | + import fs from "fs"; + import OpenAI from "openai"; + + const openai = new OpenAI(); + + async function main() { + const transcription = await openai.audio.transcriptions.create({ + file: fs.createReadStream("audio.mp3"), + model: "whisper-1", + response_format: "verbose_json", + timestamp_granularities: ["segment"] + }); + + console.log(transcription.text); + } + main(); + response: &verbose_transcription_response_example | + { + "task": "transcribe", + "language": "english", + "duration": 8.470000267028809, + "text": "The beach was a popular spot on a hot summer day. People were swimming in the ocean, building sandcastles, and playing beach volleyball.", + "segments": [ + { + "id": 0, + "seek": 0, + "start": 0.0, + "end": 3.319999933242798, + "text": " The beach was a popular spot on a hot summer day.", + "tokens": [ + 50364, 440, 7534, 390, 257, 3743, 4008, 322, 257, 2368, 4266, 786, 13, 50530 + ], + "temperature": 0.0, + "avg_logprob": -0.2860786020755768, + "compression_ratio": 1.2363636493682861, + "no_speech_prob": 0.00985979475080967 + }, + ... + ] + } + /audio/translations: + post: + operationId: createTranslation + tags: + - Audio + summary: Translates audio into English. 
+ requestBody: + required: true + content: + multipart/form-data: + schema: + $ref: "#/components/schemas/CreateTranslationRequest" + responses: + "200": + description: OK + content: + application/json: + schema: + oneOf: + - $ref: "#/components/schemas/CreateTranslationResponseJson" + - $ref: "#/components/schemas/CreateTranslationResponseVerboseJson" + x-oaiMeta: + name: Create translation + group: audio + returns: The translated text. + examples: + request: + curl: | + curl https://api.openai.com/v1/audio/translations \ + -H "Authorization: Bearer $OPENAI_API_KEY" \ + -H "Content-Type: multipart/form-data" \ + -F file="@/path/to/file/german.m4a" \ + -F model="whisper-1" + python: | + from openai import OpenAI + client = OpenAI() + + audio_file = open("speech.mp3", "rb") + transcript = client.audio.translations.create( + model="whisper-1", + file=audio_file + ) + node: | + import fs from "fs"; + import OpenAI from "openai"; + + const openai = new OpenAI(); + + async function main() { + const translation = await openai.audio.translations.create({ + file: fs.createReadStream("speech.mp3"), + model: "whisper-1", + }); + + console.log(translation.text); + } + main(); + response: | + { + "text": "Hello, my name is Wolfgang and I come from Germany. Where are you heading today?" + } + + /files: + get: + operationId: listFiles + tags: + - Files + summary: Returns a list of files that belong to the user's organization. + parameters: + - in: query + name: purpose + required: false + schema: + type: string + description: Only return files with the given purpose. + responses: + "200": + description: OK + content: + application/json: + schema: + $ref: "#/components/schemas/ListFilesResponse" + x-oaiMeta: + name: List files + group: files + returns: A list of [File](/docs/api-reference/files/object) objects. 
+ examples: + request: + curl: | + curl https://api.openai.com/v1/files \ + -H "Authorization: Bearer $OPENAI_API_KEY" + python: | + from openai import OpenAI + client = OpenAI() + + client.files.list() + node.js: |- + import OpenAI from "openai"; + + const openai = new OpenAI(); + + async function main() { + const list = await openai.files.list(); + + for await (const file of list) { + console.log(file); + } + } + + main(); + response: | + { + "data": [ + { + "id": "file-abc123", + "object": "file", + "bytes": 175, + "created_at": 1613677385, + "filename": "salesOverview.pdf", + "purpose": "assistants" + }, + { + "id": "file-abc456", + "object": "file", + "bytes": 140, + "created_at": 1613779121, + "filename": "puppy.jsonl", + "purpose": "fine-tune" + } + ], + "object": "list" + } + post: + operationId: createFile + tags: + - Files + summary: | + Upload a file that can be used across various endpoints. Individual files can be up to 512 MB, and the size of all files uploaded by one organization can be up to 100 GB. + + The Assistants API supports files up to 2 million tokens and of specific file types. See the [Assistants Tools guide](/docs/assistants/tools) for details. + + The Fine-tuning API only supports `.jsonl` files. The input also has certain required formats for fine-tuning [chat](/docs/api-reference/fine-tuning/chat-input) or [completions](/docs/api-reference/fine-tuning/completions-input) models. + + The Batch API only supports `.jsonl` files up to 100 MB in size. The input also has a specific required [format](/docs/api-reference/batch/request-input). + + Please [contact us](https://help.openai.com/) if you need to increase these storage limits.
+      requestBody:
+        required: true
+        content:
+          multipart/form-data:
+            schema:
+              $ref: "#/components/schemas/CreateFileRequest"
+      responses:
+        "200":
+          description: OK
+          content:
+            application/json:
+              schema:
+                $ref: "#/components/schemas/OpenAIFile"
+      x-oaiMeta:
+        name: Upload file
+        group: files
+        returns: The uploaded [File](/docs/api-reference/files/object) object.
+        examples:
+          request:
+            curl: |
+              curl https://api.openai.com/v1/files \
+                -H "Authorization: Bearer $OPENAI_API_KEY" \
+                -F purpose="fine-tune" \
+                -F file="@mydata.jsonl"
+            python: |
+              from openai import OpenAI
+              client = OpenAI()
+
+              client.files.create(
+                file=open("mydata.jsonl", "rb"),
+                purpose="fine-tune"
+              )
+            node.js: |-
+              import fs from "fs";
+              import OpenAI from "openai";
+
+              const openai = new OpenAI();
+
+              async function main() {
+                const file = await openai.files.create({
+                  file: fs.createReadStream("mydata.jsonl"),
+                  purpose: "fine-tune",
+                });
+
+                console.log(file);
+              }
+
+              main();
+          response: |
+            {
+              "id": "file-abc123",
+              "object": "file",
+              "bytes": 120000,
+              "created_at": 1677610602,
+              "filename": "mydata.jsonl",
+              "purpose": "fine-tune"
+            }
+  /files/{file_id}:
+    delete:
+      operationId: deleteFile
+      tags:
+        - Files
+      summary: Delete a file.
+      parameters:
+        - in: path
+          name: file_id
+          required: true
+          schema:
+            type: string
+          description: The ID of the file to use for this request.
+      responses:
+        "200":
+          description: OK
+          content:
+            application/json:
+              schema:
+                $ref: "#/components/schemas/DeleteFileResponse"
+      x-oaiMeta:
+        name: Delete file
+        group: files
+        returns: Deletion status.
+        examples:
+          request:
+            curl: |
+              curl https://api.openai.com/v1/files/file-abc123 \
+                -X DELETE \
+                -H "Authorization: Bearer $OPENAI_API_KEY"
+            python: |
+              from openai import OpenAI
+              client = OpenAI()
+
+              client.files.delete("file-abc123")
+            node.js: |-
+              import OpenAI from "openai";
+
+              const openai = new OpenAI();
+
+              async function main() {
+                const file = await openai.files.del("file-abc123");
+
+                console.log(file);
+              }
+
+              main();
+          response: |
+            {
+              "id": "file-abc123",
+              "object": "file",
+              "deleted": true
+            }
+    get:
+      operationId: retrieveFile
+      tags:
+        - Files
+      summary: Returns information about a specific file.
+      parameters:
+        - in: path
+          name: file_id
+          required: true
+          schema:
+            type: string
+          description: The ID of the file to use for this request.
+      responses:
+        "200":
+          description: OK
+          content:
+            application/json:
+              schema:
+                $ref: "#/components/schemas/OpenAIFile"
+      x-oaiMeta:
+        name: Retrieve file
+        group: files
+        returns: The [File](/docs/api-reference/files/object) object matching the specified ID.
+        examples:
+          request:
+            curl: |
+              curl https://api.openai.com/v1/files/file-abc123 \
+                -H "Authorization: Bearer $OPENAI_API_KEY"
+            python: |
+              from openai import OpenAI
+              client = OpenAI()
+
+              client.files.retrieve("file-abc123")
+            node.js: |-
+              import OpenAI from "openai";
+
+              const openai = new OpenAI();
+
+              async function main() {
+                const file = await openai.files.retrieve("file-abc123");
+
+                console.log(file);
+              }
+
+              main();
+          response: |
+            {
+              "id": "file-abc123",
+              "object": "file",
+              "bytes": 120000,
+              "created_at": 1677610602,
+              "filename": "mydata.jsonl",
+              "purpose": "fine-tune"
+            }
+  /files/{file_id}/content:
+    get:
+      operationId: downloadFile
+      tags:
+        - Files
+      summary: Returns the contents of the specified file.
+      parameters:
+        - in: path
+          name: file_id
+          required: true
+          schema:
+            type: string
+          description: The ID of the file to use for this request.
+      responses:
+        "200":
+          description: OK
+          content:
+            application/json:
+              schema:
+                type: string
+      x-oaiMeta:
+        name: Retrieve file content
+        group: files
+        returns: The file content.
+        examples:
+          request:
+            curl: |
+              curl https://api.openai.com/v1/files/file-abc123/content \
+                -H "Authorization: Bearer $OPENAI_API_KEY" > file.jsonl
+            python: |
+              from openai import OpenAI
+              client = OpenAI()
+
+              content = client.files.content("file-abc123")
+            node.js: |
+              import OpenAI from "openai";
+
+              const openai = new OpenAI();
+
+              async function main() {
+                const file = await openai.files.content("file-abc123");
+
+                console.log(file);
+              }
+
+              main();
+  /uploads:
+    post:
+      operationId: createUpload
+      tags:
+        - Uploads
+      summary: |
+        Creates an intermediate [Upload](/docs/api-reference/uploads/object) object that you can add [Parts](/docs/api-reference/uploads/part-object) to. Currently, an Upload can accept at most 8 GB in total and expires one hour after you create it.
+
+        Once you complete the Upload, we will create a [File](/docs/api-reference/files/object) object that contains all the parts you uploaded. This File is usable in the rest of our platform as a regular File object.
+
+        For certain `purpose`s, the correct `mime_type` must be specified. Please refer to documentation for the supported MIME types for your use case:
+        - [Assistants](/docs/assistants/tools/file-search/supported-files)
+
+        For guidance on the proper filename extensions for each purpose, please follow the documentation on [creating a File](/docs/api-reference/files/create).
+      requestBody:
+        required: true
+        content:
+          application/json:
+            schema:
+              $ref: "#/components/schemas/CreateUploadRequest"
+      responses:
+        "200":
+          description: OK
+          content:
+            application/json:
+              schema:
+                $ref: "#/components/schemas/Upload"
+      x-oaiMeta:
+        name: Create upload
+        group: uploads
+        returns: The [Upload](/docs/api-reference/uploads/object) object with status `pending`.
+        examples:
+          request:
+            curl: |
+              curl https://api.openai.com/v1/uploads \
+                -H "Authorization: Bearer $OPENAI_API_KEY" \
+                -d '{
+                  "purpose": "fine-tune",
+                  "filename": "training_examples.jsonl",
+                  "bytes": 2147483648,
+                  "mime_type": "text/jsonl"
+                }'
+          response: |
+            {
+              "id": "upload_abc123",
+              "object": "upload",
+              "bytes": 2147483648,
+              "created_at": 1719184911,
+              "filename": "training_examples.jsonl",
+              "purpose": "fine-tune",
+              "status": "pending",
+              "expires_at": 1719127296
+            }
+
+  /uploads/{upload_id}/parts:
+    post:
+      operationId: addUploadPart
+      tags:
+        - Uploads
+      summary: |
+        Adds a [Part](/docs/api-reference/uploads/part-object) to an [Upload](/docs/api-reference/uploads/object) object. A Part represents a chunk of bytes from the file you are trying to upload.
+
+        Each Part can be at most 64 MB, and you can add Parts until you hit the Upload maximum of 8 GB.
+
+        It is possible to add multiple Parts in parallel. You can decide the intended order of the Parts when you [complete the Upload](/docs/api-reference/uploads/complete).
+      parameters:
+        - in: path
+          name: upload_id
+          required: true
+          schema:
+            type: string
+          example: upload_abc123
+          description: |
+            The ID of the Upload.
+      requestBody:
+        required: true
+        content:
+          multipart/form-data:
+            schema:
+              $ref: "#/components/schemas/AddUploadPartRequest"
+      responses:
+        "200":
+          description: OK
+          content:
+            application/json:
+              schema:
+                $ref: "#/components/schemas/UploadPart"
+      x-oaiMeta:
+        name: Add upload part
+        group: uploads
+        returns: The upload [Part](/docs/api-reference/uploads/part-object) object.
+        examples:
+          request:
+            curl: |
+              curl https://api.openai.com/v1/uploads/upload_abc123/parts \
+                -H "Authorization: Bearer $OPENAI_API_KEY" \
+                -F data="aHR0cHM6Ly9hcGkub3BlbmFpLmNvbS92MS91cGxvYWRz..."
+ response: | + { + "id": "part_def456", + "object": "upload.part", + "created_at": 1719185911, + "upload_id": "upload_abc123" + } + + /uploads/{upload_id}/complete: + post: + operationId: completeUpload + tags: + - Uploads + summary: | + Completes the [Upload](/docs/api-reference/uploads/object). + + Within the returned Upload object, there is a nested [File](/docs/api-reference/files/object) object that is ready to use in the rest of the platform. + + You can specify the order of the Parts by passing in an ordered list of the Part IDs. + + The number of bytes uploaded upon completion must match the number of bytes initially specified when creating the Upload object. No Parts may be added after an Upload is completed. + parameters: + - in: path + name: upload_id + required: true + schema: + type: string + example: upload_abc123 + description: | + The ID of the Upload. + requestBody: + required: true + content: + application/json: + schema: + $ref: "#/components/schemas/CompleteUploadRequest" + responses: + "200": + description: OK + content: + application/json: + schema: + $ref: "#/components/schemas/Upload" + x-oaiMeta: + name: Complete upload + group: uploads + returns: The [Upload](/docs/api-reference/uploads/object) object with status `completed` with an additional `file` property containing the created usable File object. 
+        examples:
+          request:
+            curl: |
+              curl https://api.openai.com/v1/uploads/upload_abc123/complete \
+                -H "Authorization: Bearer $OPENAI_API_KEY" \
+                -d '{
+                  "part_ids": ["part_def456", "part_ghi789"]
+                }'
+          response: |
+            {
+              "id": "upload_abc123",
+              "object": "upload",
+              "bytes": 2147483648,
+              "created_at": 1719184911,
+              "filename": "training_examples.jsonl",
+              "purpose": "fine-tune",
+              "status": "completed",
+              "expires_at": 1719127296,
+              "file": {
+                "id": "file-xyz321",
+                "object": "file",
+                "bytes": 2147483648,
+                "created_at": 1719186911,
+                "filename": "training_examples.jsonl",
+                "purpose": "fine-tune"
+              }
+            }
+
+  /uploads/{upload_id}/cancel:
+    post:
+      operationId: cancelUpload
+      tags:
+        - Uploads
+      summary: |
+        Cancels the Upload. No Parts may be added after an Upload is cancelled.
+      parameters:
+        - in: path
+          name: upload_id
+          required: true
+          schema:
+            type: string
+          example: upload_abc123
+          description: |
+            The ID of the Upload.
+      responses:
+        "200":
+          description: OK
+          content:
+            application/json:
+              schema:
+                $ref: "#/components/schemas/Upload"
+      x-oaiMeta:
+        name: Cancel upload
+        group: uploads
+        returns: The [Upload](/docs/api-reference/uploads/object) object with status `cancelled`.
+        examples:
+          request:
+            curl: |
+              curl -X POST https://api.openai.com/v1/uploads/upload_abc123/cancel \
+                -H "Authorization: Bearer $OPENAI_API_KEY"
+          response: |
+            {
+              "id": "upload_abc123",
+              "object": "upload",
+              "bytes": 2147483648,
+              "created_at": 1719184911,
+              "filename": "training_examples.jsonl",
+              "purpose": "fine-tune",
+              "status": "cancelled",
+              "expires_at": 1719127296
+            }
+
+  /fine_tuning/jobs:
+    post:
+      operationId: createFineTuningJob
+      tags:
+        - Fine-tuning
+      summary: |
+        Creates a fine-tuning job which begins the process of creating a new model from a given dataset.
+
+        Response includes details of the enqueued job including job status and the name of the fine-tuned model once complete.
+ + [Learn more about fine-tuning](/docs/guides/fine-tuning) + requestBody: + required: true + content: + application/json: + schema: + $ref: "#/components/schemas/CreateFineTuningJobRequest" + responses: + "200": + description: OK + content: + application/json: + schema: + $ref: "#/components/schemas/FineTuningJob" + x-oaiMeta: + name: Create fine-tuning job + group: fine-tuning + returns: A [fine-tuning.job](/docs/api-reference/fine-tuning/object) object. + examples: + - title: Default + request: + curl: | + curl https://api.openai.com/v1/fine_tuning/jobs \ + -H "Content-Type: application/json" \ + -H "Authorization: Bearer $OPENAI_API_KEY" \ + -d '{ + "training_file": "file-BK7bzQj3FfZFXr7DbL6xJwfo", + "model": "gpt-4o-mini" + }' + python: | + from openai import OpenAI + client = OpenAI() + + client.fine_tuning.jobs.create( + training_file="file-abc123", + model="gpt-4o-mini" + ) + node.js: | + import OpenAI from "openai"; + + const openai = new OpenAI(); + + async function main() { + const fineTune = await openai.fineTuning.jobs.create({ + training_file: "file-abc123" + }); + + console.log(fineTune); + } + + main(); + response: | + { + "object": "fine_tuning.job", + "id": "ftjob-abc123", + "model": "gpt-4o-mini-2024-07-18", + "created_at": 1721764800, + "fine_tuned_model": null, + "organization_id": "org-123", + "result_files": [], + "status": "queued", + "validation_file": null, + "training_file": "file-abc123", + } + - title: Epochs + request: + curl: | + curl https://api.openai.com/v1/fine_tuning/jobs \ + -H "Content-Type: application/json" \ + -H "Authorization: Bearer $OPENAI_API_KEY" \ + -d '{ + "training_file": "file-abc123", + "model": "gpt-4o-mini", + "hyperparameters": { + "n_epochs": 2 + } + }' + python: | + from openai import OpenAI + client = OpenAI() + + client.fine_tuning.jobs.create( + training_file="file-abc123", + model="gpt-4o-mini", + hyperparameters={ + "n_epochs":2 + } + ) + node.js: | + import OpenAI from "openai"; + + const openai = new 
OpenAI(); + + async function main() { + const fineTune = await openai.fineTuning.jobs.create({ + training_file: "file-abc123", + model: "gpt-4o-mini", + hyperparameters: { n_epochs: 2 } + }); + + console.log(fineTune); + } + + main(); + response: | + { + "object": "fine_tuning.job", + "id": "ftjob-abc123", + "model": "gpt-4o-mini-2024-07-18", + "created_at": 1721764800, + "fine_tuned_model": null, + "organization_id": "org-123", + "result_files": [], + "status": "queued", + "validation_file": null, + "training_file": "file-abc123", + "hyperparameters": {"n_epochs": 2}, + } + - title: Validation file + request: + curl: | + curl https://api.openai.com/v1/fine_tuning/jobs \ + -H "Content-Type: application/json" \ + -H "Authorization: Bearer $OPENAI_API_KEY" \ + -d '{ + "training_file": "file-abc123", + "validation_file": "file-abc123", + "model": "gpt-4o-mini" + }' + python: | + from openai import OpenAI + client = OpenAI() + + client.fine_tuning.jobs.create( + training_file="file-abc123", + validation_file="file-def456", + model="gpt-4o-mini" + ) + node.js: | + import OpenAI from "openai"; + + const openai = new OpenAI(); + + async function main() { + const fineTune = await openai.fineTuning.jobs.create({ + training_file: "file-abc123", + validation_file: "file-abc123" + }); + + console.log(fineTune); + } + + main(); + response: | + { + "object": "fine_tuning.job", + "id": "ftjob-abc123", + "model": "gpt-4o-mini-2024-07-18", + "created_at": 1721764800, + "fine_tuned_model": null, + "organization_id": "org-123", + "result_files": [], + "status": "queued", + "validation_file": "file-abc123", + "training_file": "file-abc123", + } + - title: W&B Integration + request: + curl: | + curl https://api.openai.com/v1/fine_tuning/jobs \ + -H "Content-Type: application/json" \ + -H "Authorization: Bearer $OPENAI_API_KEY" \ + -d '{ + "training_file": "file-abc123", + "validation_file": "file-abc123", + "model": "gpt-4o-mini", + "integrations": [ + { + "type": "wandb", + "wandb": { 
+                          "project": "my-wandb-project",
+                          "name": "ft-run-display-name",
+                          "tags": [
+                            "first-experiment", "v2"
+                          ]
+                        }
+                      }
+                    ]
+                  }'
+            response: |
+              {
+                "object": "fine_tuning.job",
+                "id": "ftjob-abc123",
+                "model": "gpt-4o-mini-2024-07-18",
+                "created_at": 1721764800,
+                "fine_tuned_model": null,
+                "organization_id": "org-123",
+                "result_files": [],
+                "status": "queued",
+                "validation_file": "file-abc123",
+                "training_file": "file-abc123",
+                "integrations": [
+                  {
+                    "type": "wandb",
+                    "wandb": {
+                      "project": "my-wandb-project",
+                      "entity": null,
+                      "run_id": "ftjob-abc123"
+                    }
+                  }
+                ]
+              }
+    get:
+      operationId: listPaginatedFineTuningJobs
+      tags:
+        - Fine-tuning
+      summary: |
+        List your organization's fine-tuning jobs
+      parameters:
+        - name: after
+          in: query
+          description: Identifier for the last job from the previous pagination request.
+          required: false
+          schema:
+            type: string
+        - name: limit
+          in: query
+          description: Number of fine-tuning jobs to retrieve.
+          required: false
+          schema:
+            type: integer
+            default: 20
+      responses:
+        "200":
+          description: OK
+          content:
+            application/json:
+              schema:
+                $ref: "#/components/schemas/ListPaginatedFineTuningJobsResponse"
+      x-oaiMeta:
+        name: List fine-tuning jobs
+        group: fine-tuning
+        returns: A list of paginated [fine-tuning job](/docs/api-reference/fine-tuning/object) objects.
+        examples:
+          request:
+            curl: |
+              curl https://api.openai.com/v1/fine_tuning/jobs?limit=2 \
+                -H "Authorization: Bearer $OPENAI_API_KEY"
+            python: |
+              from openai import OpenAI
+              client = OpenAI()
+
+              client.fine_tuning.jobs.list()
+            node.js: |-
+              import OpenAI from "openai";
+
+              const openai = new OpenAI();
+
+              async function main() {
+                const list = await openai.fineTuning.jobs.list();
+
+                for await (const fineTune of list) {
+                  console.log(fineTune);
+                }
+              }
+
+              main();
+          response: |
+            {
+              "object": "list",
+              "data": [
+                {
+                  "object": "fine_tuning.job",
+                  "id": "ftjob-abc123",
+                  "model": "gpt-4o-mini-2024-07-18",
+                  "created_at": 1721764800,
+                  "fine_tuned_model": null,
+                  "organization_id": "org-123",
+                  "result_files": [],
+                  "status": "queued",
+                  "validation_file": null,
+                  "training_file": "file-abc123"
+                },
+                { ... },
+                { ... }
+              ],
+              "has_more": true
+            }
+  /fine_tuning/jobs/{fine_tuning_job_id}:
+    get:
+      operationId: retrieveFineTuningJob
+      tags:
+        - Fine-tuning
+      summary: |
+        Get info about a fine-tuning job.
+
+        [Learn more about fine-tuning](/docs/guides/fine-tuning)
+      parameters:
+        - in: path
+          name: fine_tuning_job_id
+          required: true
+          schema:
+            type: string
+          example: ft-AF1WoRqd3aJAHsqc9NY7iL8F
+          description: |
+            The ID of the fine-tuning job.
+      responses:
+        "200":
+          description: OK
+          content:
+            application/json:
+              schema:
+                $ref: "#/components/schemas/FineTuningJob"
+      x-oaiMeta:
+        name: Retrieve fine-tuning job
+        group: fine-tuning
+        returns: The [fine-tuning](/docs/api-reference/fine-tuning/object) object with the given ID.
+ examples: + request: + curl: | + curl https://api.openai.com/v1/fine_tuning/jobs/ft-AF1WoRqd3aJAHsqc9NY7iL8F \ + -H "Authorization: Bearer $OPENAI_API_KEY" + python: | + from openai import OpenAI + client = OpenAI() + + client.fine_tuning.jobs.retrieve("ftjob-abc123") + node.js: | + import OpenAI from "openai"; + + const openai = new OpenAI(); + + async function main() { + const fineTune = await openai.fineTuning.jobs.retrieve("ftjob-abc123"); + + console.log(fineTune); + } + + main(); + response: &fine_tuning_example | + { + "object": "fine_tuning.job", + "id": "ftjob-abc123", + "model": "davinci-002", + "created_at": 1692661014, + "finished_at": 1692661190, + "fine_tuned_model": "ft:davinci-002:my-org:custom_suffix:7q8mpxmy", + "organization_id": "org-123", + "result_files": [ + "file-abc123" + ], + "status": "succeeded", + "validation_file": null, + "training_file": "file-abc123", + "hyperparameters": { + "n_epochs": 4, + "batch_size": 1, + "learning_rate_multiplier": 1.0 + }, + "trained_tokens": 5768, + "integrations": [], + "seed": 0, + "estimated_finish": 0 + } + /fine_tuning/jobs/{fine_tuning_job_id}/events: + get: + operationId: listFineTuningEvents + tags: + - Fine-tuning + summary: | + Get status updates for a fine-tuning job. + parameters: + - in: path + name: fine_tuning_job_id + required: true + schema: + type: string + example: ft-AF1WoRqd3aJAHsqc9NY7iL8F + description: | + The ID of the fine-tuning job to get events for. + - name: after + in: query + description: Identifier for the last event from the previous pagination request. + required: false + schema: + type: string + - name: limit + in: query + description: Number of events to retrieve. 
+          required: false
+          schema:
+            type: integer
+            default: 20
+      responses:
+        "200":
+          description: OK
+          content:
+            application/json:
+              schema:
+                $ref: "#/components/schemas/ListFineTuningJobEventsResponse"
+      x-oaiMeta:
+        name: List fine-tuning events
+        group: fine-tuning
+        returns: A list of fine-tuning event objects.
+        examples:
+          request:
+            curl: |
+              curl https://api.openai.com/v1/fine_tuning/jobs/ftjob-abc123/events \
+                -H "Authorization: Bearer $OPENAI_API_KEY"
+            python: |
+              from openai import OpenAI
+              client = OpenAI()
+
+              client.fine_tuning.jobs.list_events(
+                fine_tuning_job_id="ftjob-abc123",
+                limit=2
+              )
+            node.js: |-
+              import OpenAI from "openai";
+
+              const openai = new OpenAI();
+
+              async function main() {
+                const list = await openai.fineTuning.jobs.listEvents("ftjob-abc123", { limit: 2 });
+
+                for await (const fineTune of list) {
+                  console.log(fineTune);
+                }
+              }
+
+              main();
+          response: |
+            {
+              "object": "list",
+              "data": [
+                {
+                  "object": "fine_tuning.job.event",
+                  "id": "ft-event-ddTJfwuMVpfLXseO0Am0Gqjm",
+                  "created_at": 1721764800,
+                  "level": "info",
+                  "message": "Fine tuning job successfully completed",
+                  "data": null,
+                  "type": "message"
+                },
+                {
+                  "object": "fine_tuning.job.event",
+                  "id": "ft-event-tyiGuB72evQncpH87xe505Sv",
+                  "created_at": 1721764800,
+                  "level": "info",
+                  "message": "New fine-tuned model created: ft:gpt-4o-mini:openai::7p4lURel",
+                  "data": null,
+                  "type": "message"
+                }
+              ],
+              "has_more": true
+            }
+  /fine_tuning/jobs/{fine_tuning_job_id}/cancel:
+    post:
+      operationId: cancelFineTuningJob
+      tags:
+        - Fine-tuning
+      summary: |
+        Immediately cancel a fine-tune job.
+      parameters:
+        - in: path
+          name: fine_tuning_job_id
+          required: true
+          schema:
+            type: string
+          example: ft-AF1WoRqd3aJAHsqc9NY7iL8F
+          description: |
+            The ID of the fine-tuning job to cancel.
+ responses: + "200": + description: OK + content: + application/json: + schema: + $ref: "#/components/schemas/FineTuningJob" + x-oaiMeta: + name: Cancel fine-tuning + group: fine-tuning + returns: The cancelled [fine-tuning](/docs/api-reference/fine-tuning/object) object. + examples: + request: + curl: | + curl -X POST https://api.openai.com/v1/fine_tuning/jobs/ftjob-abc123/cancel \ + -H "Authorization: Bearer $OPENAI_API_KEY" + python: | + from openai import OpenAI + client = OpenAI() + + client.fine_tuning.jobs.cancel("ftjob-abc123") + node.js: |- + import OpenAI from "openai"; + + const openai = new OpenAI(); + + async function main() { + const fineTune = await openai.fineTuning.jobs.cancel("ftjob-abc123"); + + console.log(fineTune); + } + main(); + response: | + { + "object": "fine_tuning.job", + "id": "ftjob-abc123", + "model": "gpt-4o-mini-2024-07-18", + "created_at": 1721764800, + "fine_tuned_model": null, + "organization_id": "org-123", + "result_files": [], + "hyperparameters": { + "n_epochs": "auto" + }, + "status": "cancelled", + "validation_file": "file-abc123", + "training_file": "file-abc123" + } + /fine_tuning/jobs/{fine_tuning_job_id}/checkpoints: + get: + operationId: listFineTuningJobCheckpoints + tags: + - Fine-tuning + summary: | + List checkpoints for a fine-tuning job. + parameters: + - in: path + name: fine_tuning_job_id + required: true + schema: + type: string + example: ft-AF1WoRqd3aJAHsqc9NY7iL8F + description: | + The ID of the fine-tuning job to get checkpoints for. + - name: after + in: query + description: Identifier for the last checkpoint ID from the previous pagination request. + required: false + schema: + type: string + - name: limit + in: query + description: Number of checkpoints to retrieve. 
+          required: false
+          schema:
+            type: integer
+            default: 10
+      responses:
+        "200":
+          description: OK
+          content:
+            application/json:
+              schema:
+                $ref: "#/components/schemas/ListFineTuningJobCheckpointsResponse"
+      x-oaiMeta:
+        name: List fine-tuning checkpoints
+        group: fine-tuning
+        returns: A list of fine-tuning [checkpoint objects](/docs/api-reference/fine-tuning/checkpoint-object) for a fine-tuning job.
+        examples:
+          request:
+            curl: |
+              curl https://api.openai.com/v1/fine_tuning/jobs/ftjob-abc123/checkpoints \
+                -H "Authorization: Bearer $OPENAI_API_KEY"
+          response: |
+            {
+              "object": "list",
+              "data": [
+                {
+                  "object": "fine_tuning.job.checkpoint",
+                  "id": "ftckpt_zc4Q7MP6XxulcVzj4MZdwsAB",
+                  "created_at": 1721764867,
+                  "fine_tuned_model_checkpoint": "ft:gpt-4o-mini-2024-07-18:my-org:custom-suffix:96olL566:ckpt-step-2000",
+                  "metrics": {
+                    "full_valid_loss": 0.134,
+                    "full_valid_mean_token_accuracy": 0.874
+                  },
+                  "fine_tuning_job_id": "ftjob-abc123",
+                  "step_number": 2000
+                },
+                {
+                  "object": "fine_tuning.job.checkpoint",
+                  "id": "ftckpt_enQCFmOTGj3syEpYVhBRLTSy",
+                  "created_at": 1721764800,
+                  "fine_tuned_model_checkpoint": "ft:gpt-4o-mini-2024-07-18:my-org:custom-suffix:7q8mpxmy:ckpt-step-1000",
+                  "metrics": {
+                    "full_valid_loss": 0.167,
+                    "full_valid_mean_token_accuracy": 0.781
+                  },
+                  "fine_tuning_job_id": "ftjob-abc123",
+                  "step_number": 1000
+                }
+              ],
+              "first_id": "ftckpt_zc4Q7MP6XxulcVzj4MZdwsAB",
+              "last_id": "ftckpt_enQCFmOTGj3syEpYVhBRLTSy",
+              "has_more": true
+            }
+
+  /models:
+    get:
+      operationId: listModels
+      tags:
+        - Models
+      summary: Lists the currently available models, and provides basic information about each one such as the owner and availability.
+      responses:
+        "200":
+          description: OK
+          content:
+            application/json:
+              schema:
+                $ref: "#/components/schemas/ListModelsResponse"
+      x-oaiMeta:
+        name: List models
+        group: models
+        returns: A list of [model](/docs/api-reference/models/object) objects.
+        examples:
+          request:
+            curl: |
+              curl https://api.openai.com/v1/models \
+                -H "Authorization: Bearer $OPENAI_API_KEY"
+            python: |
+              from openai import OpenAI
+              client = OpenAI()
+
+              client.models.list()
+            node.js: |-
+              import OpenAI from "openai";
+
+              const openai = new OpenAI();
+
+              async function main() {
+                const list = await openai.models.list();
+
+                for await (const model of list) {
+                  console.log(model);
+                }
+              }
+              main();
+          response: |
+            {
+              "object": "list",
+              "data": [
+                {
+                  "id": "model-id-0",
+                  "object": "model",
+                  "created": 1686935002,
+                  "owned_by": "organization-owner"
+                },
+                {
+                  "id": "model-id-1",
+                  "object": "model",
+                  "created": 1686935002,
+                  "owned_by": "organization-owner"
+                },
+                {
+                  "id": "model-id-2",
+                  "object": "model",
+                  "created": 1686935002,
+                  "owned_by": "openai"
+                }
+              ]
+            }
+  /models/{model}:
+    get:
+      operationId: retrieveModel
+      tags:
+        - Models
+      summary: Retrieves a model instance, providing basic information about the model such as the owner and permissioning.
+      parameters:
+        - in: path
+          name: model
+          required: true
+          schema:
+            type: string
+          # ideally this will be an actual ID, so this will always work from browser
+          example: gpt-4o-mini
+          description: The ID of the model to use for this request
+      responses:
+        "200":
+          description: OK
+          content:
+            application/json:
+              schema:
+                $ref: "#/components/schemas/Model"
+      x-oaiMeta:
+        name: Retrieve model
+        group: models
+        returns: The [model](/docs/api-reference/models/object) object matching the specified ID.
+ examples: + request: + curl: | + curl https://api.openai.com/v1/models/VAR_model_id \ + -H "Authorization: Bearer $OPENAI_API_KEY" + python: | + from openai import OpenAI + client = OpenAI() + + client.models.retrieve("VAR_model_id") + node.js: |- + import OpenAI from "openai"; + + const openai = new OpenAI(); + + async function main() { + const model = await openai.models.retrieve("VAR_model_id"); + + console.log(model); + } + + main(); + response: &retrieve_model_response | + { + "id": "VAR_model_id", + "object": "model", + "created": 1686935002, + "owned_by": "openai" + } + delete: + operationId: deleteModel + tags: + - Models + summary: Delete a fine-tuned model. You must have the Owner role in your organization to delete a model. + parameters: + - in: path + name: model + required: true + schema: + type: string + example: ft:gpt-4o-mini:acemeco:suffix:abc123 + description: The model to delete + responses: + "200": + description: OK + content: + application/json: + schema: + $ref: "#/components/schemas/DeleteModelResponse" + x-oaiMeta: + name: Delete a fine-tuned model + group: models + returns: Deletion status. + examples: + request: + curl: | + curl https://api.openai.com/v1/models/ft:gpt-4o-mini:acemeco:suffix:abc123 \ + -X DELETE \ + -H "Authorization: Bearer $OPENAI_API_KEY" + python: | + from openai import OpenAI + client = OpenAI() + + client.models.delete("ft:gpt-4o-mini:acemeco:suffix:abc123") + node.js: |- + import OpenAI from "openai"; + + const openai = new OpenAI(); + + async function main() { + const model = await openai.models.del("ft:gpt-4o-mini:acemeco:suffix:abc123"); + + console.log(model); + } + main(); + response: | + { + "id": "ft:gpt-4o-mini:acemeco:suffix:abc123", + "object": "model", + "deleted": true + } + + /moderations: + post: + operationId: createModeration + tags: + - Moderations + summary: Classifies if text is potentially harmful. 
+      requestBody:
+        required: true
+        content:
+          application/json:
+            schema:
+              $ref: "#/components/schemas/CreateModerationRequest"
+      responses:
+        "200":
+          description: OK
+          content:
+            application/json:
+              schema:
+                $ref: "#/components/schemas/CreateModerationResponse"
+      x-oaiMeta:
+        name: Create moderation
+        group: moderations
+        returns: A [moderation](/docs/api-reference/moderations/object) object.
+        examples:
+          request:
+            curl: |
+              curl https://api.openai.com/v1/moderations \
+                -H "Content-Type: application/json" \
+                -H "Authorization: Bearer $OPENAI_API_KEY" \
+                -d '{
+                  "input": "I want to kill them."
+                }'
+            python: |
+              from openai import OpenAI
+              client = OpenAI()
+
+              moderation = client.moderations.create(input="I want to kill them.")
+              print(moderation)
+            node.js: |
+              import OpenAI from "openai";
+
+              const openai = new OpenAI();
+
+              async function main() {
+                const moderation = await openai.moderations.create({ input: "I want to kill them." });
+
+                console.log(moderation);
+              }
+              main();
+          response: &moderation_example |
+            {
+              "id": "modr-XXXXX",
+              "model": "text-moderation-005",
+              "results": [
+                {
+                  "flagged": true,
+                  "categories": {
+                    "sexual": false,
+                    "hate": false,
+                    "harassment": false,
+                    "self-harm": false,
+                    "sexual/minors": false,
+                    "hate/threatening": false,
+                    "violence/graphic": false,
+                    "self-harm/intent": false,
+                    "self-harm/instructions": false,
+                    "harassment/threatening": true,
+                    "violence": true
+                  },
+                  "category_scores": {
+                    "sexual": 1.2282071e-06,
+                    "hate": 0.010696256,
+                    "harassment": 0.29842457,
+                    "self-harm": 1.5236925e-08,
+                    "sexual/minors": 5.7246268e-08,
+                    "hate/threatening": 0.0060676364,
+                    "violence/graphic": 4.435014e-06,
+                    "self-harm/intent": 8.098441e-10,
+                    "self-harm/instructions": 2.8498655e-11,
+                    "harassment/threatening": 0.63055265,
+                    "violence": 0.99011886
+                  }
+                }
+              ]
+            }
+
+  /assistants:
+    get:
+      operationId: listAssistants
+      tags:
+        - Assistants
+      summary: Returns a list of assistants.
+ parameters: + - name: limit + in: query + description: &pagination_limit_param_description | + A limit on the number of objects to be returned. Limit can range between 1 and 100, and the default is 20. + required: false + schema: + type: integer + default: 20 + - name: order + in: query + description: &pagination_order_param_description | + Sort order by the `created_at` timestamp of the objects. `asc` for ascending order and `desc` for descending order. + schema: + type: string + default: desc + enum: ["asc", "desc"] + - name: after + in: query + description: &pagination_after_param_description | + A cursor for use in pagination. `after` is an object ID that defines your place in the list. For instance, if you make a list request and receive 100 objects, ending with obj_foo, your subsequent call can include after=obj_foo in order to fetch the next page of the list. + schema: + type: string + - name: before + in: query + description: &pagination_before_param_description | + A cursor for use in pagination. `before` is an object ID that defines your place in the list. For instance, if you make a list request and receive 100 objects, ending with obj_foo, your subsequent call can include before=obj_foo in order to fetch the previous page of the list. + schema: + type: string + responses: + "200": + description: OK + content: + application/json: + schema: + $ref: "#/components/schemas/ListAssistantsResponse" + x-oaiMeta: + name: List assistants + group: assistants + beta: true + returns: A list of [assistant](/docs/api-reference/assistants/object) objects. 
+ examples: + request: + curl: | + curl "https://api.openai.com/v1/assistants?order=desc&limit=20" \ + -H "Content-Type: application/json" \ + -H "Authorization: Bearer $OPENAI_API_KEY" \ + -H "OpenAI-Beta: assistants=v2" + python: | + from openai import OpenAI + client = OpenAI() + + my_assistants = client.beta.assistants.list( + order="desc", + limit="20", + ) + print(my_assistants.data) + node.js: |- + import OpenAI from "openai"; + + const openai = new OpenAI(); + + async function main() { + const myAssistants = await openai.beta.assistants.list({ + order: "desc", + limit: "20", + }); + + console.log(myAssistants.data); + } + + main(); + response: &list_assistants_example | + { + "object": "list", + "data": [ + { + "id": "asst_abc123", + "object": "assistant", + "created_at": 1698982736, + "name": "Coding Tutor", + "description": null, + "model": "gpt-4o", + "instructions": "You are a helpful assistant designed to make me better at coding!", + "tools": [], + "tool_resources": {}, + "metadata": {}, + "top_p": 1.0, + "temperature": 1.0, + "response_format": "auto" + }, + { + "id": "asst_abc456", + "object": "assistant", + "created_at": 1698982718, + "name": "My Assistant", + "description": null, + "model": "gpt-4o", + "instructions": "You are a helpful assistant designed to make me better at coding!", + "tools": [], + "tool_resources": {}, + "metadata": {}, + "top_p": 1.0, + "temperature": 1.0, + "response_format": "auto" + }, + { + "id": "asst_abc789", + "object": "assistant", + "created_at": 1698982643, + "name": null, + "description": null, + "model": "gpt-4o", + "instructions": null, + "tools": [], + "tool_resources": {}, + "metadata": {}, + "top_p": 1.0, + "temperature": 1.0, + "response_format": "auto" + } + ], + "first_id": "asst_abc123", + "last_id": "asst_abc789", + "has_more": false + } + post: + operationId: createAssistant + tags: + - Assistants + summary: Create an assistant with a model and instructions. 
+ requestBody: + required: true + content: + application/json: + schema: + $ref: "#/components/schemas/CreateAssistantRequest" + responses: + "200": + description: OK + content: + application/json: + schema: + $ref: "#/components/schemas/AssistantObject" + x-oaiMeta: + name: Create assistant + group: assistants + beta: true + returns: An [assistant](/docs/api-reference/assistants/object) object. + examples: + - title: Code Interpreter + request: + curl: | + curl "https://api.openai.com/v1/assistants" \ + -H "Content-Type: application/json" \ + -H "Authorization: Bearer $OPENAI_API_KEY" \ + -H "OpenAI-Beta: assistants=v2" \ + -d '{ + "instructions": "You are a personal math tutor. When asked a question, write and run Python code to answer the question.", + "name": "Math Tutor", + "tools": [{"type": "code_interpreter"}], + "model": "gpt-4o" + }' + + python: | + from openai import OpenAI + client = OpenAI() + + my_assistant = client.beta.assistants.create( + instructions="You are a personal math tutor. When asked a question, write and run Python code to answer the question.", + name="Math Tutor", + tools=[{"type": "code_interpreter"}], + model="gpt-4o", + ) + print(my_assistant) + node.js: |- + import OpenAI from "openai"; + + const openai = new OpenAI(); + + async function main() { + const myAssistant = await openai.beta.assistants.create({ + instructions: + "You are a personal math tutor. When asked a question, write and run Python code to answer the question.", + name: "Math Tutor", + tools: [{ type: "code_interpreter" }], + model: "gpt-4o", + }); + + console.log(myAssistant); + } + + main(); + response: &create_assistants_example | + { + "id": "asst_abc123", + "object": "assistant", + "created_at": 1698984975, + "name": "Math Tutor", + "description": null, + "model": "gpt-4o", + "instructions": "You are a personal math tutor. 
When asked a question, write and run Python code to answer the question.", + "tools": [ + { + "type": "code_interpreter" + } + ], + "metadata": {}, + "top_p": 1.0, + "temperature": 1.0, + "response_format": "auto" + } + - title: Files + request: + curl: | + curl https://api.openai.com/v1/assistants \ + -H "Content-Type: application/json" \ + -H "Authorization: Bearer $OPENAI_API_KEY" \ + -H "OpenAI-Beta: assistants=v2" \ + -d '{ + "instructions": "You are an HR bot, and you have access to files to answer employee questions about company policies.", + "name": "HR Helper", + "tools": [{"type": "file_search"}], + "tool_resources": {"file_search": {"vector_store_ids": ["vs_123"]}}, + "model": "gpt-4o" + }' + python: | + from openai import OpenAI + client = OpenAI() + + my_assistant = client.beta.assistants.create( + instructions="You are an HR bot, and you have access to files to answer employee questions about company policies.", + name="HR Helper", + tools=[{"type": "file_search"}], + tool_resources={"file_search": {"vector_store_ids": ["vs_123"]}}, + model="gpt-4o" + ) + print(my_assistant) + node.js: |- + import OpenAI from "openai"; + + const openai = new OpenAI(); + + async function main() { + const myAssistant = await openai.beta.assistants.create({ + instructions: + "You are an HR bot, and you have access to files to answer employee questions about company policies.", + name: "HR Helper", + tools: [{ type: "file_search" }], + tool_resources: { + file_search: { + vector_store_ids: ["vs_123"] + } + }, + model: "gpt-4o" + }); + + console.log(myAssistant); + } + + main(); + response: | + { + "id": "asst_abc123", + "object": "assistant", + "created_at": 1699009403, + "name": "HR Helper", + "description": null, + "model": "gpt-4o", + "instructions": "You are an HR bot, and you have access to files to answer employee questions about company policies.", + "tools": [ + { + "type": "file_search" + } + ], + "tool_resources": { + "file_search": { + "vector_store_ids": ["vs_123"] + } + }, +
"metadata": {}, + "top_p": 1.0, + "temperature": 1.0, + "response_format": "auto" + } + + /assistants/{assistant_id}: + get: + operationId: getAssistant + tags: + - Assistants + summary: Retrieves an assistant. + parameters: + - in: path + name: assistant_id + required: true + schema: + type: string + description: The ID of the assistant to retrieve. + responses: + "200": + description: OK + content: + application/json: + schema: + $ref: "#/components/schemas/AssistantObject" + x-oaiMeta: + name: Retrieve assistant + group: assistants + beta: true + returns: The [assistant](/docs/api-reference/assistants/object) object matching the specified ID. + examples: + request: + curl: | + curl https://api.openai.com/v1/assistants/asst_abc123 \ + -H "Content-Type: application/json" \ + -H "Authorization: Bearer $OPENAI_API_KEY" \ + -H "OpenAI-Beta: assistants=v2" + python: | + from openai import OpenAI + client = OpenAI() + + my_assistant = client.beta.assistants.retrieve("asst_abc123") + print(my_assistant) + node.js: |- + import OpenAI from "openai"; + + const openai = new OpenAI(); + + async function main() { + const myAssistant = await openai.beta.assistants.retrieve( + "asst_abc123" + ); + + console.log(myAssistant); + } + + main(); + response: | + { + "id": "asst_abc123", + "object": "assistant", + "created_at": 1699009709, + "name": "HR Helper", + "description": null, + "model": "gpt-4o", + "instructions": "You are an HR bot, and you have access to files to answer employee questions about company policies.", + "tools": [ + { + "type": "file_search" + } + ], + "metadata": {}, + "top_p": 1.0, + "temperature": 1.0, + "response_format": "auto" + } + post: + operationId: modifyAssistant + tags: + - Assistants + summary: Modifies an assistant. + parameters: + - in: path + name: assistant_id + required: true + schema: + type: string + description: The ID of the assistant to modify. 
+ requestBody: + required: true + content: + application/json: + schema: + $ref: "#/components/schemas/ModifyAssistantRequest" + responses: + "200": + description: OK + content: + application/json: + schema: + $ref: "#/components/schemas/AssistantObject" + x-oaiMeta: + name: Modify assistant + group: assistants + beta: true + returns: The modified [assistant](/docs/api-reference/assistants/object) object. + examples: + request: + curl: | + curl https://api.openai.com/v1/assistants/asst_abc123 \ + -H "Content-Type: application/json" \ + -H "Authorization: Bearer $OPENAI_API_KEY" \ + -H "OpenAI-Beta: assistants=v2" \ + -d '{ + "instructions": "You are an HR bot, and you have access to files to answer employee questions about company policies. Always respond with info from either of the files.", + "tools": [{"type": "file_search"}], + "model": "gpt-4o" + }' + python: | + from openai import OpenAI + client = OpenAI() + + my_updated_assistant = client.beta.assistants.update( + "asst_abc123", + instructions="You are an HR bot, and you have access to files to answer employee questions about company policies. Always respond with info from either of the files.", + name="HR Helper", + tools=[{"type": "file_search"}], + model="gpt-4o" + ) + + print(my_updated_assistant) + node.js: |- + import OpenAI from "openai"; + + const openai = new OpenAI(); + + async function main() { + const myUpdatedAssistant = await openai.beta.assistants.update( + "asst_abc123", + { + instructions: + "You are an HR bot, and you have access to files to answer employee questions about company policies. 
Always respond with info from either of the files.", + name: "HR Helper", + tools: [{ type: "file_search" }], + model: "gpt-4o" + } + ); + + console.log(myUpdatedAssistant); + } + + main(); + response: | + { + "id": "asst_abc123", + "object": "assistant", + "created_at": 1699009709, + "name": "HR Helper", + "description": null, + "model": "gpt-4o", + "instructions": "You are an HR bot, and you have access to files to answer employee questions about company policies. Always respond with info from either of the files.", + "tools": [ + { + "type": "file_search" + } + ], + "tool_resources": { + "file_search": { + "vector_store_ids": [] + } + }, + "metadata": {}, + "top_p": 1.0, + "temperature": 1.0, + "response_format": "auto" + } + delete: + operationId: deleteAssistant + tags: + - Assistants + summary: Delete an assistant. + parameters: + - in: path + name: assistant_id + required: true + schema: + type: string + description: The ID of the assistant to delete. + responses: + "200": + description: OK + content: + application/json: + schema: + $ref: "#/components/schemas/DeleteAssistantResponse" + x-oaiMeta: + name: Delete assistant + group: assistants + beta: true + returns: Deletion status + examples: + request: + curl: | + curl https://api.openai.com/v1/assistants/asst_abc123 \ + -H "Content-Type: application/json" \ + -H "Authorization: Bearer $OPENAI_API_KEY" \ + -H "OpenAI-Beta: assistants=v2" \ + -X DELETE + python: | + from openai import OpenAI + client = OpenAI() + + response = client.beta.assistants.delete("asst_abc123") + print(response) + node.js: |- + import OpenAI from "openai"; + + const openai = new OpenAI(); + + async function main() { + const response = await openai.beta.assistants.del("asst_abc123"); + + console.log(response); + } + main(); + response: | + { + "id": "asst_abc123", + "object": "assistant.deleted", + "deleted": true + } + + /threads: + post: + operationId: createThread + tags: + - Assistants + summary: Create a thread.
+ requestBody: + content: + application/json: + schema: + $ref: "#/components/schemas/CreateThreadRequest" + responses: + "200": + description: OK + content: + application/json: + schema: + $ref: "#/components/schemas/ThreadObject" + x-oaiMeta: + name: Create thread + group: threads + beta: true + returns: A [thread](/docs/api-reference/threads) object. + examples: + - title: Empty + request: + curl: | + curl https://api.openai.com/v1/threads \ + -H "Content-Type: application/json" \ + -H "Authorization: Bearer $OPENAI_API_KEY" \ + -H "OpenAI-Beta: assistants=v2" \ + -d '' + python: | + from openai import OpenAI + client = OpenAI() + + empty_thread = client.beta.threads.create() + print(empty_thread) + node.js: |- + import OpenAI from "openai"; + + const openai = new OpenAI(); + + async function main() { + const emptyThread = await openai.beta.threads.create(); + + console.log(emptyThread); + } + + main(); + response: | + { + "id": "thread_abc123", + "object": "thread", + "created_at": 1699012949, + "metadata": {}, + "tool_resources": {} + } + - title: Messages + request: + curl: | + curl https://api.openai.com/v1/threads \ + -H "Content-Type: application/json" \ + -H "Authorization: Bearer $OPENAI_API_KEY" \ + -H "OpenAI-Beta: assistants=v2" \ + -d '{ + "messages": [{ + "role": "user", + "content": "Hello, what is AI?" + }, { + "role": "user", + "content": "How does AI work? Explain it in simple terms." + }] + }' + python: | + from openai import OpenAI + client = OpenAI() + + message_thread = client.beta.threads.create( + messages=[ + { + "role": "user", + "content": "Hello, what is AI?" + }, + { + "role": "user", + "content": "How does AI work? Explain it in simple terms." + }, + ] + ) + + print(message_thread) + node.js: |- + import OpenAI from "openai"; + + const openai = new OpenAI(); + + async function main() { + const messageThread = await openai.beta.threads.create({ + messages: [ + { + role: "user", + content: "Hello, what is AI?" 
+ }, + { + role: "user", + content: "How does AI work? Explain it in simple terms.", + }, + ], + }); + + console.log(messageThread); + } + + main(); + response: | + { + "id": "thread_abc123", + "object": "thread", + "created_at": 1699014083, + "metadata": {}, + "tool_resources": {} + } + + /threads/{thread_id}: + get: + operationId: getThread + tags: + - Assistants + summary: Retrieves a thread. + parameters: + - in: path + name: thread_id + required: true + schema: + type: string + description: The ID of the thread to retrieve. + responses: + "200": + description: OK + content: + application/json: + schema: + $ref: "#/components/schemas/ThreadObject" + x-oaiMeta: + name: Retrieve thread + group: threads + beta: true + returns: The [thread](/docs/api-reference/threads/object) object matching the specified ID. + examples: + request: + curl: | + curl https://api.openai.com/v1/threads/thread_abc123 \ + -H "Content-Type: application/json" \ + -H "Authorization: Bearer $OPENAI_API_KEY" \ + -H "OpenAI-Beta: assistants=v2" + python: | + from openai import OpenAI + client = OpenAI() + + my_thread = client.beta.threads.retrieve("thread_abc123") + print(my_thread) + node.js: |- + import OpenAI from "openai"; + + const openai = new OpenAI(); + + async function main() { + const myThread = await openai.beta.threads.retrieve( + "thread_abc123" + ); + + console.log(myThread); + } + + main(); + response: | + { + "id": "thread_abc123", + "object": "thread", + "created_at": 1699014083, + "metadata": {}, + "tool_resources": { + "code_interpreter": { + "file_ids": [] + } + } + } + post: + operationId: modifyThread + tags: + - Assistants + summary: Modifies a thread. + parameters: + - in: path + name: thread_id + required: true + schema: + type: string + description: The ID of the thread to modify. Only the `metadata` can be modified. 
+ requestBody: + required: true + content: + application/json: + schema: + $ref: "#/components/schemas/ModifyThreadRequest" + responses: + "200": + description: OK + content: + application/json: + schema: + $ref: "#/components/schemas/ThreadObject" + x-oaiMeta: + name: Modify thread + group: threads + beta: true + returns: The modified [thread](/docs/api-reference/threads/object) object matching the specified ID. + examples: + request: + curl: | + curl https://api.openai.com/v1/threads/thread_abc123 \ + -H "Content-Type: application/json" \ + -H "Authorization: Bearer $OPENAI_API_KEY" \ + -H "OpenAI-Beta: assistants=v2" \ + -d '{ + "metadata": { + "modified": "true", + "user": "abc123" + } + }' + python: | + from openai import OpenAI + client = OpenAI() + + my_updated_thread = client.beta.threads.update( + "thread_abc123", + metadata={ + "modified": "true", + "user": "abc123" + } + ) + print(my_updated_thread) + node.js: |- + import OpenAI from "openai"; + + const openai = new OpenAI(); + + async function main() { + const updatedThread = await openai.beta.threads.update( + "thread_abc123", + { + metadata: { modified: "true", user: "abc123" }, + } + ); + + console.log(updatedThread); + } + + main(); + response: | + { + "id": "thread_abc123", + "object": "thread", + "created_at": 1699014083, + "metadata": { + "modified": "true", + "user": "abc123" + }, + "tool_resources": {} + } + delete: + operationId: deleteThread + tags: + - Assistants + summary: Delete a thread. + parameters: + - in: path + name: thread_id + required: true + schema: + type: string + description: The ID of the thread to delete. 
+ responses: + "200": + description: OK + content: + application/json: + schema: + $ref: "#/components/schemas/DeleteThreadResponse" + x-oaiMeta: + name: Delete thread + group: threads + beta: true + returns: Deletion status + examples: + request: + curl: | + curl https://api.openai.com/v1/threads/thread_abc123 \ + -H "Content-Type: application/json" \ + -H "Authorization: Bearer $OPENAI_API_KEY" \ + -H "OpenAI-Beta: assistants=v2" \ + -X DELETE + python: | + from openai import OpenAI + client = OpenAI() + + response = client.beta.threads.delete("thread_abc123") + print(response) + node.js: |- + import OpenAI from "openai"; + + const openai = new OpenAI(); + + async function main() { + const response = await openai.beta.threads.del("thread_abc123"); + + console.log(response); + } + main(); + response: | + { + "id": "thread_abc123", + "object": "thread.deleted", + "deleted": true + } + + /threads/{thread_id}/messages: + get: + operationId: listMessages + tags: + - Assistants + summary: Returns a list of messages for a given thread. + parameters: + - in: path + name: thread_id + required: true + schema: + type: string + description: The ID of the [thread](/docs/api-reference/threads) the messages belong to. + - name: limit + in: query + description: *pagination_limit_param_description + required: false + schema: + type: integer + default: 20 + - name: order + in: query + description: *pagination_order_param_description + schema: + type: string + default: desc + enum: ["asc", "desc"] + - name: after + in: query + description: *pagination_after_param_description + schema: + type: string + - name: before + in: query + description: *pagination_before_param_description + schema: + type: string + - name: run_id + in: query + description: | + Filter messages by the run ID that generated them. 
+ schema: + type: string + responses: + "200": + description: OK + content: + application/json: + schema: + $ref: "#/components/schemas/ListMessagesResponse" + x-oaiMeta: + name: List messages + group: threads + beta: true + returns: A list of [message](/docs/api-reference/messages) objects. + examples: + request: + curl: | + curl https://api.openai.com/v1/threads/thread_abc123/messages \ + -H "Content-Type: application/json" \ + -H "Authorization: Bearer $OPENAI_API_KEY" \ + -H "OpenAI-Beta: assistants=v2" + python: | + from openai import OpenAI + client = OpenAI() + + thread_messages = client.beta.threads.messages.list("thread_abc123") + print(thread_messages.data) + node.js: |- + import OpenAI from "openai"; + + const openai = new OpenAI(); + + async function main() { + const threadMessages = await openai.beta.threads.messages.list( + "thread_abc123" + ); + + console.log(threadMessages.data); + } + + main(); + response: | + { + "object": "list", + "data": [ + { + "id": "msg_abc123", + "object": "thread.message", + "created_at": 1699016383, + "assistant_id": null, + "thread_id": "thread_abc123", + "run_id": null, + "role": "user", + "content": [ + { + "type": "text", + "text": { + "value": "How does AI work? Explain it in simple terms.", + "annotations": [] + } + } + ], + "attachments": [], + "metadata": {} + }, + { + "id": "msg_abc456", + "object": "thread.message", + "created_at": 1699016383, + "assistant_id": null, + "thread_id": "thread_abc123", + "run_id": null, + "role": "user", + "content": [ + { + "type": "text", + "text": { + "value": "Hello, what is AI?", + "annotations": [] + } + } + ], + "attachments": [], + "metadata": {} + } + ], + "first_id": "msg_abc123", + "last_id": "msg_abc456", + "has_more": false + } + post: + operationId: createMessage + tags: + - Assistants + summary: Create a message. 
+ parameters: + - in: path + name: thread_id + required: true + schema: + type: string + description: The ID of the [thread](/docs/api-reference/threads) to create a message for. + requestBody: + required: true + content: + application/json: + schema: + $ref: "#/components/schemas/CreateMessageRequest" + responses: + "200": + description: OK + content: + application/json: + schema: + $ref: "#/components/schemas/MessageObject" + x-oaiMeta: + name: Create message + group: threads + beta: true + returns: A [message](/docs/api-reference/messages/object) object. + examples: + request: + curl: | + curl https://api.openai.com/v1/threads/thread_abc123/messages \ + -H "Content-Type: application/json" \ + -H "Authorization: Bearer $OPENAI_API_KEY" \ + -H "OpenAI-Beta: assistants=v2" \ + -d '{ + "role": "user", + "content": "How does AI work? Explain it in simple terms." + }' + python: | + from openai import OpenAI + client = OpenAI() + + thread_message = client.beta.threads.messages.create( + "thread_abc123", + role="user", + content="How does AI work? Explain it in simple terms.", + ) + print(thread_message) + node.js: |- + import OpenAI from "openai"; + + const openai = new OpenAI(); + + async function main() { + const threadMessages = await openai.beta.threads.messages.create( + "thread_abc123", + { role: "user", content: "How does AI work? Explain it in simple terms." } + ); + + console.log(threadMessages); + } + + main(); + response: | + { + "id": "msg_abc123", + "object": "thread.message", + "created_at": 1713226573, + "assistant_id": null, + "thread_id": "thread_abc123", + "run_id": null, + "role": "user", + "content": [ + { + "type": "text", + "text": { + "value": "How does AI work? Explain it in simple terms.", + "annotations": [] + } + } + ], + "attachments": [], + "metadata": {} + } + + /threads/{thread_id}/messages/{message_id}: + get: + operationId: getMessage + tags: + - Assistants + summary: Retrieve a message. 
+ parameters: + - in: path + name: thread_id + required: true + schema: + type: string + description: The ID of the [thread](/docs/api-reference/threads) to which this message belongs. + - in: path + name: message_id + required: true + schema: + type: string + description: The ID of the message to retrieve. + responses: + "200": + description: OK + content: + application/json: + schema: + $ref: "#/components/schemas/MessageObject" + x-oaiMeta: + name: Retrieve message + group: threads + beta: true + returns: The [message](/docs/api-reference/messages/object) object matching the specified ID. + examples: + request: + curl: | + curl https://api.openai.com/v1/threads/thread_abc123/messages/msg_abc123 \ + -H "Content-Type: application/json" \ + -H "Authorization: Bearer $OPENAI_API_KEY" \ + -H "OpenAI-Beta: assistants=v2" + python: | + from openai import OpenAI + client = OpenAI() + + message = client.beta.threads.messages.retrieve( + message_id="msg_abc123", + thread_id="thread_abc123", + ) + print(message) + node.js: |- + import OpenAI from "openai"; + + const openai = new OpenAI(); + + async function main() { + const message = await openai.beta.threads.messages.retrieve( + "thread_abc123", + "msg_abc123" + ); + + console.log(message); + } + + main(); + response: | + { + "id": "msg_abc123", + "object": "thread.message", + "created_at": 1699017614, + "assistant_id": null, + "thread_id": "thread_abc123", + "run_id": null, + "role": "user", + "content": [ + { + "type": "text", + "text": { + "value": "How does AI work? Explain it in simple terms.", + "annotations": [] + } + } + ], + "attachments": [], + "metadata": {} + } + post: + operationId: modifyMessage + tags: + - Assistants + summary: Modifies a message. + parameters: + - in: path + name: thread_id + required: true + schema: + type: string + description: The ID of the thread to which this message belongs. 
+ - in: path + name: message_id + required: true + schema: + type: string + description: The ID of the message to modify. + requestBody: + required: true + content: + application/json: + schema: + $ref: "#/components/schemas/ModifyMessageRequest" + responses: + "200": + description: OK + content: + application/json: + schema: + $ref: "#/components/schemas/MessageObject" + x-oaiMeta: + name: Modify message + group: threads + beta: true + returns: The modified [message](/docs/api-reference/messages/object) object. + examples: + request: + curl: | + curl https://api.openai.com/v1/threads/thread_abc123/messages/msg_abc123 \ + -H "Content-Type: application/json" \ + -H "Authorization: Bearer $OPENAI_API_KEY" \ + -H "OpenAI-Beta: assistants=v2" \ + -d '{ + "metadata": { + "modified": "true", + "user": "abc123" + } + }' + python: | + from openai import OpenAI + client = OpenAI() + + message = client.beta.threads.messages.update( + message_id="msg_abc123", + thread_id="thread_abc123", + metadata={ + "modified": "true", + "user": "abc123", + }, + ) + print(message) + node.js: |- + import OpenAI from "openai"; + + const openai = new OpenAI(); + + async function main() { + const message = await openai.beta.threads.messages.update( + "thread_abc123", + "msg_abc123", + { + metadata: { + modified: "true", + user: "abc123", + }, + } + ); + + console.log(message); + } + + main(); + response: | + { + "id": "msg_abc123", + "object": "thread.message", + "created_at": 1699017614, + "assistant_id": null, + "thread_id": "thread_abc123", + "run_id": null, + "role": "user", + "content": [ + { + "type": "text", + "text": { + "value": "How does AI work? Explain it in simple terms.", + "annotations": [] + } + } + ], + "attachments": [], + "metadata": { + "modified": "true", + "user": "abc123" + } + } + delete: + operationId: deleteMessage + tags: + - Assistants + summary: Deletes a message.
+ parameters: + - in: path + name: thread_id + required: true + schema: + type: string + description: The ID of the thread to which this message belongs. + - in: path + name: message_id + required: true + schema: + type: string + description: The ID of the message to delete. + responses: + "200": + description: OK + content: + application/json: + schema: + $ref: "#/components/schemas/DeleteMessageResponse" + x-oaiMeta: + name: Delete message + group: threads + beta: true + returns: Deletion status + examples: + request: + curl: | + curl -X DELETE https://api.openai.com/v1/threads/thread_abc123/messages/msg_abc123 \ + -H "Content-Type: application/json" \ + -H "Authorization: Bearer $OPENAI_API_KEY" \ + -H "OpenAI-Beta: assistants=v2" + python: | + from openai import OpenAI + client = OpenAI() + + deleted_message = client.beta.threads.messages.delete( + message_id="msg_abc123", + thread_id="thread_abc123", + ) + print(deleted_message) + node.js: |- + import OpenAI from "openai"; + + const openai = new OpenAI(); + + async function main() { + const deletedMessage = await openai.beta.threads.messages.del( + "thread_abc123", + "msg_abc123" + ); + + console.log(deletedMessage); + } + main(); + response: | + { + "id": "msg_abc123", + "object": "thread.message.deleted", + "deleted": true + } + + /threads/runs: + post: + operationId: createThreadAndRun + tags: + - Assistants + summary: Create a thread and run it in one request. + requestBody: + required: true + content: + application/json: + schema: + $ref: "#/components/schemas/CreateThreadAndRunRequest" + responses: + "200": + description: OK + content: + application/json: + schema: + $ref: "#/components/schemas/RunObject" + x-oaiMeta: + name: Create thread and run + group: threads + beta: true + returns: A [run](/docs/api-reference/runs/object) object.
+ examples: + - title: Default + request: + curl: | + curl https://api.openai.com/v1/threads/runs \ + -H "Authorization: Bearer $OPENAI_API_KEY" \ + -H "Content-Type: application/json" \ + -H "OpenAI-Beta: assistants=v2" \ + -d '{ + "assistant_id": "asst_abc123", + "thread": { + "messages": [ + {"role": "user", "content": "Explain deep learning to a 5 year old."} + ] + } + }' + python: | + from openai import OpenAI + client = OpenAI() + + run = client.beta.threads.create_and_run( + assistant_id="asst_abc123", + thread={ + "messages": [ + {"role": "user", "content": "Explain deep learning to a 5 year old."} + ] + } + ) + + print(run) + node.js: | + import OpenAI from "openai"; + + const openai = new OpenAI(); + + async function main() { + const run = await openai.beta.threads.createAndRun({ + assistant_id: "asst_abc123", + thread: { + messages: [ + { role: "user", content: "Explain deep learning to a 5 year old." }, + ], + }, + }); + + console.log(run); + } + + main(); + response: | + { + "id": "run_abc123", + "object": "thread.run", + "created_at": 1699076792, + "assistant_id": "asst_abc123", + "thread_id": "thread_abc123", + "status": "queued", + "started_at": null, + "expires_at": 1699077392, + "cancelled_at": null, + "failed_at": null, + "completed_at": null, + "required_action": null, + "last_error": null, + "model": "gpt-4o", + "instructions": "You are a helpful assistant.", + "tools": [], + "tool_resources": {}, + "metadata": {}, + "temperature": 1.0, + "top_p": 1.0, + "max_completion_tokens": null, + "max_prompt_tokens": null, + "truncation_strategy": { + "type": "auto", + "last_messages": null + }, + "incomplete_details": null, + "usage": null, + "response_format": "auto", + "tool_choice": "auto", + "parallel_tool_calls": true + } + + - title: Streaming + request: + curl: | + curl https://api.openai.com/v1/threads/runs \ + -H "Authorization: Bearer $OPENAI_API_KEY" \ + -H "Content-Type: application/json" \ + -H "OpenAI-Beta: assistants=v2" \ + -d '{ + 
"assistant_id": "asst_123", + "thread": { + "messages": [ + {"role": "user", "content": "Hello"} + ] + }, + "stream": true + }' + python: | + from openai import OpenAI + client = OpenAI() + + stream = client.beta.threads.create_and_run( + assistant_id="asst_123", + thread={ + "messages": [ + {"role": "user", "content": "Hello"} + ] + }, + stream=True + ) + + for event in stream: + print(event) + node.js: | + import OpenAI from "openai"; + + const openai = new OpenAI(); + + async function main() { + const stream = await openai.beta.threads.createAndRun({ + assistant_id: "asst_123", + thread: { + messages: [ + { role: "user", content: "Hello" }, + ], + }, + stream: true + }); + + for await (const event of stream) { + console.log(event); + } + } + + main(); + response: | + event: thread.created + data: {"id":"thread_123","object":"thread","created_at":1710348075,"metadata":{}} + + event: thread.run.created + data: {"id":"run_123","object":"thread.run","created_at":1710348075,"assistant_id":"asst_123","thread_id":"thread_123","status":"queued","started_at":null,"expires_at":1710348675,"cancelled_at":null,"failed_at":null,"completed_at":null,"required_action":null,"last_error":null,"model":"gpt-4o","instructions":null,"tools":[],"tool_resources":{},"metadata":{},"temperature":1.0,"top_p":1.0,"max_completion_tokens":null,"max_prompt_tokens":null,"truncation_strategy":{"type":"auto","last_messages":null},"incomplete_details":null,"usage":null,"response_format":"auto","tool_choice":"auto","parallel_tool_calls":true} + + event: thread.run.queued + data: 
{"id":"run_123","object":"thread.run","created_at":1710348075,"assistant_id":"asst_123","thread_id":"thread_123","status":"queued","started_at":null,"expires_at":1710348675,"cancelled_at":null,"failed_at":null,"completed_at":null,"required_action":null,"last_error":null,"model":"gpt-4o","instructions":null,"tools":[],"tool_resources":{},"metadata":{},"temperature":1.0,"top_p":1.0,"max_completion_tokens":null,"max_prompt_tokens":null,"truncation_strategy":{"type":"auto","last_messages":null},"incomplete_details":null,"usage":null,"response_format":"auto","tool_choice":"auto","parallel_tool_calls":true} + + event: thread.run.in_progress + data: {"id":"run_123","object":"thread.run","created_at":1710348075,"assistant_id":"asst_123","thread_id":"thread_123","status":"in_progress","started_at":null,"expires_at":1710348675,"cancelled_at":null,"failed_at":null,"completed_at":null,"required_action":null,"last_error":null,"model":"gpt-4o","instructions":null,"tools":[],"tool_resources":{},"metadata":{},"temperature":1.0,"top_p":1.0,"max_completion_tokens":null,"max_prompt_tokens":null,"truncation_strategy":{"type":"auto","last_messages":null},"incomplete_details":null,"usage":null,"response_format":"auto","tool_choice":"auto","parallel_tool_calls":true} + + event: thread.run.step.created + data: {"id":"step_001","object":"thread.run.step","created_at":1710348076,"run_id":"run_123","assistant_id":"asst_123","thread_id":"thread_123","type":"message_creation","status":"in_progress","cancelled_at":null,"completed_at":null,"expires_at":1710348675,"failed_at":null,"last_error":null,"step_details":{"type":"message_creation","message_creation":{"message_id":"msg_001"}},"usage":null} + + event: thread.run.step.in_progress + data: 
{"id":"step_001","object":"thread.run.step","created_at":1710348076,"run_id":"run_123","assistant_id":"asst_123","thread_id":"thread_123","type":"message_creation","status":"in_progress","cancelled_at":null,"completed_at":null,"expires_at":1710348675,"failed_at":null,"last_error":null,"step_details":{"type":"message_creation","message_creation":{"message_id":"msg_001"}},"usage":null} + + event: thread.message.created + data: {"id":"msg_001","object":"thread.message","created_at":1710348076,"assistant_id":"asst_123","thread_id":"thread_123","run_id":"run_123","status":"in_progress","incomplete_details":null,"incomplete_at":null,"completed_at":null,"role":"assistant","content":[], "metadata":{}} + + event: thread.message.in_progress + data: {"id":"msg_001","object":"thread.message","created_at":1710348076,"assistant_id":"asst_123","thread_id":"thread_123","run_id":"run_123","status":"in_progress","incomplete_details":null,"incomplete_at":null,"completed_at":null,"role":"assistant","content":[], "metadata":{}} + + event: thread.message.delta + data: {"id":"msg_001","object":"thread.message.delta","delta":{"content":[{"index":0,"type":"text","text":{"value":"Hello","annotations":[]}}]}} + + ... + + event: thread.message.delta + data: {"id":"msg_001","object":"thread.message.delta","delta":{"content":[{"index":0,"type":"text","text":{"value":" today"}}]}} + + event: thread.message.delta + data: {"id":"msg_001","object":"thread.message.delta","delta":{"content":[{"index":0,"type":"text","text":{"value":"?"}}]}} + + event: thread.message.completed + data: {"id":"msg_001","object":"thread.message","created_at":1710348076,"assistant_id":"asst_123","thread_id":"thread_123","run_id":"run_123","status":"completed","incomplete_details":null,"incomplete_at":null,"completed_at":1710348077,"role":"assistant","content":[{"type":"text","text":{"value":"Hello! 
How can I assist you today?","annotations":[]}}], "metadata":{}} + + event: thread.run.step.completed + data: {"id":"step_001","object":"thread.run.step","created_at":1710348076,"run_id":"run_123","assistant_id":"asst_123","thread_id":"thread_123","type":"message_creation","status":"completed","cancelled_at":null,"completed_at":1710348077,"expires_at":1710348675,"failed_at":null,"last_error":null,"step_details":{"type":"message_creation","message_creation":{"message_id":"msg_001"}},"usage":{"prompt_tokens":20,"completion_tokens":11,"total_tokens":31}} + + event: thread.run.completed + data: {"id":"run_123","object":"thread.run","created_at":1710348076,"assistant_id":"asst_123","thread_id":"thread_123","status":"completed","started_at":1713226836,"expires_at":null,"cancelled_at":null,"failed_at":null,"completed_at":1713226837,"required_action":null,"last_error":null,"model":"gpt-4o","instructions":null,"tools":[],"metadata":{},"temperature":1.0,"top_p":1.0,"max_completion_tokens":null,"max_prompt_tokens":null,"truncation_strategy":{"type":"auto","last_messages":null},"incomplete_details":null,"usage":{"prompt_tokens":345,"completion_tokens":11,"total_tokens":356},"response_format":"auto","tool_choice":"auto","parallel_tool_calls":true} + + event: done + data: [DONE] + + - title: Streaming with Functions + request: + curl: | + curl https://api.openai.com/v1/threads/runs \ + -H "Authorization: Bearer $OPENAI_API_KEY" \ + -H "Content-Type: application/json" \ + -H "OpenAI-Beta: assistants=v2" \ + -d '{ + "assistant_id": "asst_abc123", + "thread": { + "messages": [ + {"role": "user", "content": "What is the weather like in San Francisco?"} + ] + }, + "tools": [ + { + "type": "function", + "function": { + "name": "get_current_weather", + "description": "Get the current weather in a given location", + "parameters": { + "type": "object", + "properties": { + "location": { + "type": "string", + "description": "The city and state, e.g. 
San Francisco, CA" + }, + "unit": { + "type": "string", + "enum": ["celsius", "fahrenheit"] + } + }, + "required": ["location"] + } + } + } + ], + "stream": true + }' + python: | + from openai import OpenAI + client = OpenAI() + + tools = [ + { + "type": "function", + "function": { + "name": "get_current_weather", + "description": "Get the current weather in a given location", + "parameters": { + "type": "object", + "properties": { + "location": { + "type": "string", + "description": "The city and state, e.g. San Francisco, CA", + }, + "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]}, + }, + "required": ["location"], + }, + } + } + ] + + stream = client.beta.threads.create_and_run( + thread={ + "messages": [ + {"role": "user", "content": "What is the weather like in San Francisco?"} + ] + }, + assistant_id="asst_abc123", + tools=tools, + stream=True + ) + + for event in stream: + print(event) + node.js: | + import OpenAI from "openai"; + + const openai = new OpenAI(); + + const tools = [ + { + "type": "function", + "function": { + "name": "get_current_weather", + "description": "Get the current weather in a given location", + "parameters": { + "type": "object", + "properties": { + "location": { + "type": "string", + "description": "The city and state, e.g. San Francisco, CA", + }, + "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]}, + }, + "required": ["location"], + }, + } + } + ]; + + async function main() { + const stream = await openai.beta.threads.createAndRun({ + assistant_id: "asst_123", + thread: { + messages: [ + { role: "user", content: "What is the weather like in San Francisco?" 
}, + ], + }, + tools: tools, + stream: true + }); + + for await (const event of stream) { + console.log(event); + } + } + + main(); + response: | + event: thread.created + data: {"id":"thread_123","object":"thread","created_at":1710351818,"metadata":{}} + + event: thread.run.created + data: {"id":"run_123","object":"thread.run","created_at":1710351818,"assistant_id":"asst_123","thread_id":"thread_123","status":"queued","started_at":null,"expires_at":1710352418,"cancelled_at":null,"failed_at":null,"completed_at":null,"required_action":null,"last_error":null,"model":"gpt-4o","instructions":null,"tools":[{"type":"function","function":{"name":"get_current_weather","description":"Get the current weather in a given location","parameters":{"type":"object","properties":{"location":{"type":"string","description":"The city and state, e.g. San Francisco, CA"},"unit":{"type":"string","enum":["celsius","fahrenheit"]}},"required":["location"]}}}],"metadata":{},"temperature":1.0,"top_p":1.0,"max_completion_tokens":null,"max_prompt_tokens":null,"truncation_strategy":{"type":"auto","last_messages":null},"incomplete_details":null,"usage":null,"response_format":"auto","tool_choice":"auto","parallel_tool_calls":true} + + event: thread.run.queued + data: {"id":"run_123","object":"thread.run","created_at":1710351818,"assistant_id":"asst_123","thread_id":"thread_123","status":"queued","started_at":null,"expires_at":1710352418,"cancelled_at":null,"failed_at":null,"completed_at":null,"required_action":null,"last_error":null,"model":"gpt-4o","instructions":null,"tools":[{"type":"function","function":{"name":"get_current_weather","description":"Get the current weather in a given location","parameters":{"type":"object","properties":{"location":{"type":"string","description":"The city and state, e.g. 
San Francisco, CA"},"unit":{"type":"string","enum":["celsius","fahrenheit"]}},"required":["location"]}}}],"metadata":{},"temperature":1.0,"top_p":1.0,"max_completion_tokens":null,"max_prompt_tokens":null,"truncation_strategy":{"type":"auto","last_messages":null},"incomplete_details":null,"usage":null,"response_format":"auto","tool_choice":"auto","parallel_tool_calls":true} + + event: thread.run.in_progress + data: {"id":"run_123","object":"thread.run","created_at":1710351818,"assistant_id":"asst_123","thread_id":"thread_123","status":"in_progress","started_at":1710351818,"expires_at":1710352418,"cancelled_at":null,"failed_at":null,"completed_at":null,"required_action":null,"last_error":null,"model":"gpt-4o","instructions":null,"tools":[{"type":"function","function":{"name":"get_current_weather","description":"Get the current weather in a given location","parameters":{"type":"object","properties":{"location":{"type":"string","description":"The city and state, e.g. San Francisco, CA"},"unit":{"type":"string","enum":["celsius","fahrenheit"]}},"required":["location"]}}}],"metadata":{},"temperature":1.0,"top_p":1.0,"max_completion_tokens":null,"max_prompt_tokens":null,"truncation_strategy":{"type":"auto","last_messages":null},"incomplete_details":null,"usage":null,"response_format":"auto","tool_choice":"auto","parallel_tool_calls":true} + + event: thread.run.step.created + data: {"id":"step_001","object":"thread.run.step","created_at":1710351819,"run_id":"run_123","assistant_id":"asst_123","thread_id":"thread_123","type":"tool_calls","status":"in_progress","cancelled_at":null,"completed_at":null,"expires_at":1710352418,"failed_at":null,"last_error":null,"step_details":{"type":"tool_calls","tool_calls":[]},"usage":null} + + event: thread.run.step.in_progress + data: 
{"id":"step_001","object":"thread.run.step","created_at":1710351819,"run_id":"run_123","assistant_id":"asst_123","thread_id":"thread_123","type":"tool_calls","status":"in_progress","cancelled_at":null,"completed_at":null,"expires_at":1710352418,"failed_at":null,"last_error":null,"step_details":{"type":"tool_calls","tool_calls":[]},"usage":null} + + event: thread.run.step.delta + data: {"id":"step_001","object":"thread.run.step.delta","delta":{"step_details":{"type":"tool_calls","tool_calls":[{"index":0,"id":"call_XXNp8YGaFrjrSjgqxtC8JJ1B","type":"function","function":{"name":"get_current_weather","arguments":"","output":null}}]}}} + + event: thread.run.step.delta + data: {"id":"step_001","object":"thread.run.step.delta","delta":{"step_details":{"type":"tool_calls","tool_calls":[{"index":0,"type":"function","function":{"arguments":"{\""}}]}}} + + event: thread.run.step.delta + data: {"id":"step_001","object":"thread.run.step.delta","delta":{"step_details":{"type":"tool_calls","tool_calls":[{"index":0,"type":"function","function":{"arguments":"location"}}]}}} + + ... 
+ + event: thread.run.step.delta + data: {"id":"step_001","object":"thread.run.step.delta","delta":{"step_details":{"type":"tool_calls","tool_calls":[{"index":0,"type":"function","function":{"arguments":"ahrenheit"}}]}}} + + event: thread.run.step.delta + data: {"id":"step_001","object":"thread.run.step.delta","delta":{"step_details":{"type":"tool_calls","tool_calls":[{"index":0,"type":"function","function":{"arguments":"\"}"}}]}}} + + event: thread.run.requires_action + data: {"id":"run_123","object":"thread.run","created_at":1710351818,"assistant_id":"asst_123","thread_id":"thread_123","status":"requires_action","started_at":1710351818,"expires_at":1710352418,"cancelled_at":null,"failed_at":null,"completed_at":null,"required_action":{"type":"submit_tool_outputs","submit_tool_outputs":{"tool_calls":[{"id":"call_XXNp8YGaFrjrSjgqxtC8JJ1B","type":"function","function":{"name":"get_current_weather","arguments":"{\"location\":\"San Francisco, CA\",\"unit\":\"fahrenheit\"}"}}]}},"last_error":null,"model":"gpt-4o","instructions":null,"tools":[{"type":"function","function":{"name":"get_current_weather","description":"Get the current weather in a given location","parameters":{"type":"object","properties":{"location":{"type":"string","description":"The city and state, e.g. San Francisco, CA"},"unit":{"type":"string","enum":["celsius","fahrenheit"]}},"required":["location"]}}}],"metadata":{},"temperature":1.0,"top_p":1.0,"max_completion_tokens":null,"max_prompt_tokens":null,"truncation_strategy":{"type":"auto","last_messages":null},"incomplete_details":null,"usage":{"prompt_tokens":345,"completion_tokens":11,"total_tokens":356},"response_format":"auto","tool_choice":"auto","parallel_tool_calls":true} + + event: done + data: [DONE] + + /threads/{thread_id}/runs: + get: + operationId: listRuns + tags: + - Assistants + summary: Returns a list of runs belonging to a thread. 
+ parameters: + - name: thread_id + in: path + required: true + schema: + type: string + description: The ID of the thread the run belongs to. + - name: limit + in: query + description: *pagination_limit_param_description + required: false + schema: + type: integer + default: 20 + - name: order + in: query + description: *pagination_order_param_description + schema: + type: string + default: desc + enum: ["asc", "desc"] + - name: after + in: query + description: *pagination_after_param_description + schema: + type: string + - name: before + in: query + description: *pagination_before_param_description + schema: + type: string + responses: + "200": + description: OK + content: + application/json: + schema: + $ref: "#/components/schemas/ListRunsResponse" + x-oaiMeta: + name: List runs + group: threads + beta: true + returns: A list of [run](/docs/api-reference/runs/object) objects. + examples: + request: + curl: | + curl https://api.openai.com/v1/threads/thread_abc123/runs \ + -H "Authorization: Bearer $OPENAI_API_KEY" \ + -H "Content-Type: application/json" \ + -H "OpenAI-Beta: assistants=v2" + python: | + from openai import OpenAI + client = OpenAI() + + runs = client.beta.threads.runs.list( + "thread_abc123" + ) + + print(runs) + node.js: | + import OpenAI from "openai"; + + const openai = new OpenAI(); + + async function main() { + const runs = await openai.beta.threads.runs.list( + "thread_abc123" + ); + + console.log(runs); + } + + main(); + response: | + { + "object": "list", + "data": [ + { + "id": "run_abc123", + "object": "thread.run", + "created_at": 1699075072, + "assistant_id": "asst_abc123", + "thread_id": "thread_abc123", + "status": "completed", + "started_at": 1699075072, + "expires_at": null, + "cancelled_at": null, + "failed_at": null, + "completed_at": 1699075073, + "last_error": null, + "model": "gpt-4o", + "instructions": null, + "incomplete_details": null, + "tools": [ + { + "type": "code_interpreter" + } + ], + "tool_resources": { + 
"code_interpreter": { + "file_ids": [ + "file-abc123", + "file-abc456" + ] + } + }, + "metadata": {}, + "usage": { + "prompt_tokens": 123, + "completion_tokens": 456, + "total_tokens": 579 + }, + "temperature": 1.0, + "top_p": 1.0, + "max_prompt_tokens": 1000, + "max_completion_tokens": 1000, + "truncation_strategy": { + "type": "auto", + "last_messages": null + }, + "response_format": "auto", + "tool_choice": "auto", + "parallel_tool_calls": true + }, + { + "id": "run_abc456", + "object": "thread.run", + "created_at": 1699063290, + "assistant_id": "asst_abc123", + "thread_id": "thread_abc123", + "status": "completed", + "started_at": 1699063290, + "expires_at": null, + "cancelled_at": null, + "failed_at": null, + "completed_at": 1699063291, + "last_error": null, + "model": "gpt-4o", + "instructions": null, + "incomplete_details": null, + "tools": [ + { + "type": "code_interpreter" + } + ], + "tool_resources": { + "code_interpreter": { + "file_ids": [ + "file-abc123", + "file-abc456" + ] + } + }, + "metadata": {}, + "usage": { + "prompt_tokens": 123, + "completion_tokens": 456, + "total_tokens": 579 + }, + "temperature": 1.0, + "top_p": 1.0, + "max_prompt_tokens": 1000, + "max_completion_tokens": 1000, + "truncation_strategy": { + "type": "auto", + "last_messages": null + }, + "response_format": "auto", + "tool_choice": "auto", + "parallel_tool_calls": true + } + ], + "first_id": "run_abc123", + "last_id": "run_abc456", + "has_more": false + } + post: + operationId: createRun + tags: + - Assistants + summary: Create a run. + parameters: + - in: path + name: thread_id + required: true + schema: + type: string + description: The ID of the thread to run. 
+ requestBody: + required: true + content: + application/json: + schema: + $ref: "#/components/schemas/CreateRunRequest" + responses: + "200": + description: OK + content: + application/json: + schema: + $ref: "#/components/schemas/RunObject" + x-oaiMeta: + name: Create run + group: threads + beta: true + returns: A [run](/docs/api-reference/runs/object) object. + examples: + - title: Default + request: + curl: | + curl https://api.openai.com/v1/threads/thread_abc123/runs \ + -H "Authorization: Bearer $OPENAI_API_KEY" \ + -H "Content-Type: application/json" \ + -H "OpenAI-Beta: assistants=v2" \ + -d '{ + "assistant_id": "asst_abc123" + }' + python: | + from openai import OpenAI + client = OpenAI() + + run = client.beta.threads.runs.create( + thread_id="thread_abc123", + assistant_id="asst_abc123" + ) + + print(run) + node.js: | + import OpenAI from "openai"; + + const openai = new OpenAI(); + + async function main() { + const run = await openai.beta.threads.runs.create( + "thread_abc123", + { assistant_id: "asst_abc123" } + ); + + console.log(run); + } + + main(); + response: &run_object_example | + { + "id": "run_abc123", + "object": "thread.run", + "created_at": 1699063290, + "assistant_id": "asst_abc123", + "thread_id": "thread_abc123", + "status": "queued", + "started_at": 1699063290, + "expires_at": null, + "cancelled_at": null, + "failed_at": null, + "completed_at": 1699063291, + "last_error": null, + "model": "gpt-4o", + "instructions": null, + "incomplete_details": null, + "tools": [ + { + "type": "code_interpreter" + } + ], + "metadata": {}, + "usage": null, + "temperature": 1.0, + "top_p": 1.0, + "max_prompt_tokens": 1000, + "max_completion_tokens": 1000, + "truncation_strategy": { + "type": "auto", + "last_messages": null + }, + "response_format": "auto", + "tool_choice": "auto", + "parallel_tool_calls": true + } + - title: Streaming + request: + curl: | + curl https://api.openai.com/v1/threads/thread_123/runs \ + -H "Authorization: Bearer 
$OPENAI_API_KEY" \ + -H "Content-Type: application/json" \ + -H "OpenAI-Beta: assistants=v2" \ + -d '{ + "assistant_id": "asst_123", + "stream": true + }' + python: | + from openai import OpenAI + client = OpenAI() + + stream = client.beta.threads.runs.create( + thread_id="thread_123", + assistant_id="asst_123", + stream=True + ) + + for event in stream: + print(event) + node.js: | + import OpenAI from "openai"; + + const openai = new OpenAI(); + + async function main() { + const stream = await openai.beta.threads.runs.create( + "thread_123", + { assistant_id: "asst_123", stream: true } + ); + + for await (const event of stream) { + console.log(event); + } + } + + main(); + response: | + event: thread.run.created + data: {"id":"run_123","object":"thread.run","created_at":1710330640,"assistant_id":"asst_123","thread_id":"thread_123","status":"queued","started_at":null,"expires_at":1710331240,"cancelled_at":null,"failed_at":null,"completed_at":null,"required_action":null,"last_error":null,"model":"gpt-4o","instructions":null,"tools":[],"metadata":{},"temperature":1.0,"top_p":1.0,"max_completion_tokens":null,"max_prompt_tokens":null,"truncation_strategy":{"type":"auto","last_messages":null},"incomplete_details":null,"usage":null,"response_format":"auto","tool_choice":"auto","parallel_tool_calls":true} + + event: thread.run.queued + data: {"id":"run_123","object":"thread.run","created_at":1710330640,"assistant_id":"asst_123","thread_id":"thread_123","status":"queued","started_at":null,"expires_at":1710331240,"cancelled_at":null,"failed_at":null,"completed_at":null,"required_action":null,"last_error":null,"model":"gpt-4o","instructions":null,"tools":[],"metadata":{},"temperature":1.0,"top_p":1.0,"max_completion_tokens":null,"max_prompt_tokens":null,"truncation_strategy":{"type":"auto","last_messages":null},"incomplete_details":null,"usage":null,"response_format":"auto","tool_choice":"auto","parallel_tool_calls":true} + + event: thread.run.in_progress + data: 
{"id":"run_123","object":"thread.run","created_at":1710330640,"assistant_id":"asst_123","thread_id":"thread_123","status":"in_progress","started_at":1710330641,"expires_at":1710331240,"cancelled_at":null,"failed_at":null,"completed_at":null,"required_action":null,"last_error":null,"model":"gpt-4o","instructions":null,"tools":[],"metadata":{},"temperature":1.0,"top_p":1.0,"max_completion_tokens":null,"max_prompt_tokens":null,"truncation_strategy":{"type":"auto","last_messages":null},"incomplete_details":null,"usage":null,"response_format":"auto","tool_choice":"auto","parallel_tool_calls":true} + + event: thread.run.step.created + data: {"id":"step_001","object":"thread.run.step","created_at":1710330641,"run_id":"run_123","assistant_id":"asst_123","thread_id":"thread_123","type":"message_creation","status":"in_progress","cancelled_at":null,"completed_at":null,"expires_at":1710331240,"failed_at":null,"last_error":null,"step_details":{"type":"message_creation","message_creation":{"message_id":"msg_001"}},"usage":null} + + event: thread.run.step.in_progress + data: {"id":"step_001","object":"thread.run.step","created_at":1710330641,"run_id":"run_123","assistant_id":"asst_123","thread_id":"thread_123","type":"message_creation","status":"in_progress","cancelled_at":null,"completed_at":null,"expires_at":1710331240,"failed_at":null,"last_error":null,"step_details":{"type":"message_creation","message_creation":{"message_id":"msg_001"}},"usage":null} + + event: thread.message.created + data: {"id":"msg_001","object":"thread.message","created_at":1710330641,"assistant_id":"asst_123","thread_id":"thread_123","run_id":"run_123","status":"in_progress","incomplete_details":null,"incomplete_at":null,"completed_at":null,"role":"assistant","content":[],"metadata":{}} + + event: thread.message.in_progress + data: 
{"id":"msg_001","object":"thread.message","created_at":1710330641,"assistant_id":"asst_123","thread_id":"thread_123","run_id":"run_123","status":"in_progress","incomplete_details":null,"incomplete_at":null,"completed_at":null,"role":"assistant","content":[],"metadata":{}} + + event: thread.message.delta + data: {"id":"msg_001","object":"thread.message.delta","delta":{"content":[{"index":0,"type":"text","text":{"value":"Hello","annotations":[]}}]}} + + ... + + event: thread.message.delta + data: {"id":"msg_001","object":"thread.message.delta","delta":{"content":[{"index":0,"type":"text","text":{"value":" today"}}]}} + + event: thread.message.delta + data: {"id":"msg_001","object":"thread.message.delta","delta":{"content":[{"index":0,"type":"text","text":{"value":"?"}}]}} + + event: thread.message.completed + data: {"id":"msg_001","object":"thread.message","created_at":1710330641,"assistant_id":"asst_123","thread_id":"thread_123","run_id":"run_123","status":"completed","incomplete_details":null,"incomplete_at":null,"completed_at":1710330642,"role":"assistant","content":[{"type":"text","text":{"value":"Hello! 
How can I assist you today?","annotations":[]}}],"metadata":{}} + + event: thread.run.step.completed + data: {"id":"step_001","object":"thread.run.step","created_at":1710330641,"run_id":"run_123","assistant_id":"asst_123","thread_id":"thread_123","type":"message_creation","status":"completed","cancelled_at":null,"completed_at":1710330642,"expires_at":1710331240,"failed_at":null,"last_error":null,"step_details":{"type":"message_creation","message_creation":{"message_id":"msg_001"}},"usage":{"prompt_tokens":20,"completion_tokens":11,"total_tokens":31}} + + event: thread.run.completed + data: {"id":"run_123","object":"thread.run","created_at":1710330640,"assistant_id":"asst_123","thread_id":"thread_123","status":"completed","started_at":1710330641,"expires_at":null,"cancelled_at":null,"failed_at":null,"completed_at":1710330642,"required_action":null,"last_error":null,"model":"gpt-4o","instructions":null,"tools":[],"metadata":{},"temperature":1.0,"top_p":1.0,"max_completion_tokens":null,"max_prompt_tokens":null,"truncation_strategy":{"type":"auto","last_messages":null},"incomplete_details":null,"usage":{"prompt_tokens":20,"completion_tokens":11,"total_tokens":31},"response_format":"auto","tool_choice":"auto","parallel_tool_calls":true} + + event: done + data: [DONE] + + - title: Streaming with Functions + request: + curl: | + curl https://api.openai.com/v1/threads/thread_abc123/runs \ + -H "Authorization: Bearer $OPENAI_API_KEY" \ + -H "Content-Type: application/json" \ + -H "OpenAI-Beta: assistants=v2" \ + -d '{ + "assistant_id": "asst_abc123", + "tools": [ + { + "type": "function", + "function": { + "name": "get_current_weather", + "description": "Get the current weather in a given location", + "parameters": { + "type": "object", + "properties": { + "location": { + "type": "string", + "description": "The city and state, e.g. 
San Francisco, CA" + }, + "unit": { + "type": "string", + "enum": ["celsius", "fahrenheit"] + } + }, + "required": ["location"] + } + } + } + ], + "stream": true + }' + python: | + from openai import OpenAI + client = OpenAI() + + tools = [ + { + "type": "function", + "function": { + "name": "get_current_weather", + "description": "Get the current weather in a given location", + "parameters": { + "type": "object", + "properties": { + "location": { + "type": "string", + "description": "The city and state, e.g. San Francisco, CA", + }, + "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]}, + }, + "required": ["location"], + }, + } + } + ] + + stream = client.beta.threads.runs.create( + thread_id="thread_abc123", + assistant_id="asst_abc123", + tools=tools, + stream=True + ) + + for event in stream: + print(event) + node.js: | + import OpenAI from "openai"; + + const openai = new OpenAI(); + + const tools = [ + { + "type": "function", + "function": { + "name": "get_current_weather", + "description": "Get the current weather in a given location", + "parameters": { + "type": "object", + "properties": { + "location": { + "type": "string", + "description": "The city and state, e.g. 
San Francisco, CA", + }, + "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]}, + }, + "required": ["location"], + }, + } + } + ]; + + async function main() { + const stream = await openai.beta.threads.runs.create( + "thread_abc123", + { + assistant_id: "asst_abc123", + tools: tools, + stream: true + } + ); + + for await (const event of stream) { + console.log(event); + } + } + + main(); + response: | + event: thread.run.created + data: {"id":"run_123","object":"thread.run","created_at":1710348075,"assistant_id":"asst_123","thread_id":"thread_123","status":"queued","started_at":null,"expires_at":1710348675,"cancelled_at":null,"failed_at":null,"completed_at":null,"required_action":null,"last_error":null,"model":"gpt-4o","instructions":null,"tools":[],"metadata":{},"temperature":1.0,"top_p":1.0,"max_completion_tokens":null,"max_prompt_tokens":null,"truncation_strategy":{"type":"auto","last_messages":null},"incomplete_details":null,"usage":null,"response_format":"auto","tool_choice":"auto","parallel_tool_calls":true} + + event: thread.run.queued + data: {"id":"run_123","object":"thread.run","created_at":1710348075,"assistant_id":"asst_123","thread_id":"thread_123","status":"queued","started_at":null,"expires_at":1710348675,"cancelled_at":null,"failed_at":null,"completed_at":null,"required_action":null,"last_error":null,"model":"gpt-4o","instructions":null,"tools":[],"metadata":{},"temperature":1.0,"top_p":1.0,"max_completion_tokens":null,"max_prompt_tokens":null,"truncation_strategy":{"type":"auto","last_messages":null},"incomplete_details":null,"usage":null,"response_format":"auto","tool_choice":"auto","parallel_tool_calls":true} + + event: thread.run.in_progress + data: 
{"id":"run_123","object":"thread.run","created_at":1710348075,"assistant_id":"asst_123","thread_id":"thread_123","status":"in_progress","started_at":1710348075,"expires_at":1710348675,"cancelled_at":null,"failed_at":null,"completed_at":null,"required_action":null,"last_error":null,"model":"gpt-4o","instructions":null,"tools":[],"metadata":{},"temperature":1.0,"top_p":1.0,"max_completion_tokens":null,"max_prompt_tokens":null,"truncation_strategy":{"type":"auto","last_messages":null},"incomplete_details":null,"usage":null,"response_format":"auto","tool_choice":"auto","parallel_tool_calls":true} + + event: thread.run.step.created + data: {"id":"step_001","object":"thread.run.step","created_at":1710348076,"run_id":"run_123","assistant_id":"asst_123","thread_id":"thread_123","type":"message_creation","status":"in_progress","cancelled_at":null,"completed_at":null,"expires_at":1710348675,"failed_at":null,"last_error":null,"step_details":{"type":"message_creation","message_creation":{"message_id":"msg_001"}},"usage":null} + + event: thread.run.step.in_progress + data: {"id":"step_001","object":"thread.run.step","created_at":1710348076,"run_id":"run_123","assistant_id":"asst_123","thread_id":"thread_123","type":"message_creation","status":"in_progress","cancelled_at":null,"completed_at":null,"expires_at":1710348675,"failed_at":null,"last_error":null,"step_details":{"type":"message_creation","message_creation":{"message_id":"msg_001"}},"usage":null} + + event: thread.message.created + data: {"id":"msg_001","object":"thread.message","created_at":1710348076,"assistant_id":"asst_123","thread_id":"thread_123","run_id":"run_123","status":"in_progress","incomplete_details":null,"incomplete_at":null,"completed_at":null,"role":"assistant","content":[],"metadata":{}} + + event: thread.message.in_progress + data: 
{"id":"msg_001","object":"thread.message","created_at":1710348076,"assistant_id":"asst_123","thread_id":"thread_123","run_id":"run_123","status":"in_progress","incomplete_details":null,"incomplete_at":null,"completed_at":null,"role":"assistant","content":[],"metadata":{}} + + event: thread.message.delta + data: {"id":"msg_001","object":"thread.message.delta","delta":{"content":[{"index":0,"type":"text","text":{"value":"Hello","annotations":[]}}]}} + + ... + + event: thread.message.delta + data: {"id":"msg_001","object":"thread.message.delta","delta":{"content":[{"index":0,"type":"text","text":{"value":" today"}}]}} + + event: thread.message.delta + data: {"id":"msg_001","object":"thread.message.delta","delta":{"content":[{"index":0,"type":"text","text":{"value":"?"}}]}} + + event: thread.message.completed + data: {"id":"msg_001","object":"thread.message","created_at":1710348076,"assistant_id":"asst_123","thread_id":"thread_123","run_id":"run_123","status":"completed","incomplete_details":null,"incomplete_at":null,"completed_at":1710348077,"role":"assistant","content":[{"type":"text","text":{"value":"Hello! 
How can I assist you today?","annotations":[]}}],"metadata":{}} + + event: thread.run.step.completed + data: {"id":"step_001","object":"thread.run.step","created_at":1710348076,"run_id":"run_123","assistant_id":"asst_123","thread_id":"thread_123","type":"message_creation","status":"completed","cancelled_at":null,"completed_at":1710348077,"expires_at":1710348675,"failed_at":null,"last_error":null,"step_details":{"type":"message_creation","message_creation":{"message_id":"msg_001"}},"usage":{"prompt_tokens":20,"completion_tokens":11,"total_tokens":31}} + + event: thread.run.completed + data: {"id":"run_123","object":"thread.run","created_at":1710348075,"assistant_id":"asst_123","thread_id":"thread_123","status":"completed","started_at":1710348075,"expires_at":null,"cancelled_at":null,"failed_at":null,"completed_at":1710348077,"required_action":null,"last_error":null,"model":"gpt-4o","instructions":null,"tools":[],"metadata":{},"temperature":1.0,"top_p":1.0,"max_completion_tokens":null,"max_prompt_tokens":null,"truncation_strategy":{"type":"auto","last_messages":null},"incomplete_details":null,"usage":{"prompt_tokens":20,"completion_tokens":11,"total_tokens":31},"response_format":"auto","tool_choice":"auto","parallel_tool_calls":true} + + event: done + data: [DONE] + + /threads/{thread_id}/runs/{run_id}: + get: + operationId: getRun + tags: + - Assistants + summary: Retrieves a run. + parameters: + - in: path + name: thread_id + required: true + schema: + type: string + description: The ID of the [thread](/docs/api-reference/threads) that was run. + - in: path + name: run_id + required: true + schema: + type: string + description: The ID of the run to retrieve. + responses: + "200": + description: OK + content: + application/json: + schema: + $ref: "#/components/schemas/RunObject" + x-oaiMeta: + name: Retrieve run + group: threads + beta: true + returns: The [run](/docs/api-reference/runs/object) object matching the specified ID. 
+ examples: + request: + curl: | + curl https://api.openai.com/v1/threads/thread_abc123/runs/run_abc123 \ + -H "Authorization: Bearer $OPENAI_API_KEY" \ + -H "OpenAI-Beta: assistants=v2" + python: | + from openai import OpenAI + client = OpenAI() + + run = client.beta.threads.runs.retrieve( + thread_id="thread_abc123", + run_id="run_abc123" + ) + + print(run) + node.js: | + import OpenAI from "openai"; + + const openai = new OpenAI(); + + async function main() { + const run = await openai.beta.threads.runs.retrieve( + "thread_abc123", + "run_abc123" + ); + + console.log(run); + } + + main(); + response: | + { + "id": "run_abc123", + "object": "thread.run", + "created_at": 1699075072, + "assistant_id": "asst_abc123", + "thread_id": "thread_abc123", + "status": "completed", + "started_at": 1699075072, + "expires_at": null, + "cancelled_at": null, + "failed_at": null, + "completed_at": 1699075073, + "last_error": null, + "model": "gpt-4o", + "instructions": null, + "incomplete_details": null, + "tools": [ + { + "type": "code_interpreter" + } + ], + "metadata": {}, + "usage": { + "prompt_tokens": 123, + "completion_tokens": 456, + "total_tokens": 579 + }, + "temperature": 1.0, + "top_p": 1.0, + "max_prompt_tokens": 1000, + "max_completion_tokens": 1000, + "truncation_strategy": { + "type": "auto", + "last_messages": null + }, + "response_format": "auto", + "tool_choice": "auto", + "parallel_tool_calls": true + } + post: + operationId: modifyRun + tags: + - Assistants + summary: Modifies a run. + parameters: + - in: path + name: thread_id + required: true + schema: + type: string + description: The ID of the [thread](/docs/api-reference/threads) that was run. + - in: path + name: run_id + required: true + schema: + type: string + description: The ID of the run to modify. 
+ requestBody: + required: true + content: + application/json: + schema: + $ref: "#/components/schemas/ModifyRunRequest" + responses: + "200": + description: OK + content: + application/json: + schema: + $ref: "#/components/schemas/RunObject" + x-oaiMeta: + name: Modify run + group: threads + beta: true + returns: The modified [run](/docs/api-reference/runs/object) object matching the specified ID. + examples: + request: + curl: | + curl https://api.openai.com/v1/threads/thread_abc123/runs/run_abc123 \ + -H "Authorization: Bearer $OPENAI_API_KEY" \ + -H "Content-Type: application/json" \ + -H "OpenAI-Beta: assistants=v2" \ + -d '{ + "metadata": { + "user_id": "user_abc123" + } + }' + python: | + from openai import OpenAI + client = OpenAI() + + run = client.beta.threads.runs.update( + thread_id="thread_abc123", + run_id="run_abc123", + metadata={"user_id": "user_abc123"}, + ) + + print(run) + node.js: | + import OpenAI from "openai"; + + const openai = new OpenAI(); + + async function main() { + const run = await openai.beta.threads.runs.update( + "thread_abc123", + "run_abc123", + { + metadata: { + user_id: "user_abc123", + }, + } + ); + + console.log(run); + } + + main(); + response: | + { + "id": "run_abc123", + "object": "thread.run", + "created_at": 1699075072, + "assistant_id": "asst_abc123", + "thread_id": "thread_abc123", + "status": "completed", + "started_at": 1699075072, + "expires_at": null, + "cancelled_at": null, + "failed_at": null, + "completed_at": 1699075073, + "last_error": null, + "model": "gpt-4o", + "instructions": null, + "incomplete_details": null, + "tools": [ + { + "type": "code_interpreter" + } + ], + "tool_resources": { + "code_interpreter": { + "file_ids": [ + "file-abc123", + "file-abc456" + ] + } + }, + "metadata": { + "user_id": "user_abc123" + }, + "usage": { + "prompt_tokens": 123, + "completion_tokens": 456, + "total_tokens": 579 + }, + "temperature": 1.0, + "top_p": 1.0, + "max_prompt_tokens": 1000, + "max_completion_tokens": 
1000, + "truncation_strategy": { + "type": "auto", + "last_messages": null + }, + "response_format": "auto", + "tool_choice": "auto", + "parallel_tool_calls": true + } + + /threads/{thread_id}/runs/{run_id}/submit_tool_outputs: + post: + operationId: submitToolOutputsToRun + tags: + - Assistants + summary: | + When a run has the `status: "requires_action"` and `required_action.type` is `submit_tool_outputs`, this endpoint can be used to submit the outputs from the tool calls once they're all completed. All outputs must be submitted in a single request. + parameters: + - in: path + name: thread_id + required: true + schema: + type: string + description: The ID of the [thread](/docs/api-reference/threads) to which this run belongs. + - in: path + name: run_id + required: true + schema: + type: string + description: The ID of the run that requires the tool output submission. + requestBody: + required: true + content: + application/json: + schema: + $ref: "#/components/schemas/SubmitToolOutputsRunRequest" + responses: + "200": + description: OK + content: + application/json: + schema: + $ref: "#/components/schemas/RunObject" + x-oaiMeta: + name: Submit tool outputs to run + group: threads + beta: true + returns: The modified [run](/docs/api-reference/runs/object) object matching the specified ID. + examples: + - title: Default + request: + curl: | + curl https://api.openai.com/v1/threads/thread_123/runs/run_123/submit_tool_outputs \ + -H "Authorization: Bearer $OPENAI_API_KEY" \ + -H "Content-Type: application/json" \ + -H "OpenAI-Beta: assistants=v2" \ + -d '{ + "tool_outputs": [ + { + "tool_call_id": "call_001", + "output": "70 degrees and sunny." + } + ] + }' + python: | + from openai import OpenAI + client = OpenAI() + + run = client.beta.threads.runs.submit_tool_outputs( + thread_id="thread_123", + run_id="run_123", + tool_outputs=[ + { + "tool_call_id": "call_001", + "output": "70 degrees and sunny."
+ } + ] + ) + + print(run) + node.js: | + import OpenAI from "openai"; + + const openai = new OpenAI(); + + async function main() { + const run = await openai.beta.threads.runs.submitToolOutputs( + "thread_123", + "run_123", + { + tool_outputs: [ + { + tool_call_id: "call_001", + output: "70 degrees and sunny.", + }, + ], + } + ); + + console.log(run); + } + + main(); + response: | + { + "id": "run_123", + "object": "thread.run", + "created_at": 1699075592, + "assistant_id": "asst_123", + "thread_id": "thread_123", + "status": "queued", + "started_at": 1699075592, + "expires_at": 1699076192, + "cancelled_at": null, + "failed_at": null, + "completed_at": null, + "last_error": null, + "model": "gpt-4o", + "instructions": null, + "tools": [ + { + "type": "function", + "function": { + "name": "get_current_weather", + "description": "Get the current weather in a given location", + "parameters": { + "type": "object", + "properties": { + "location": { + "type": "string", + "description": "The city and state, e.g. San Francisco, CA" + }, + "unit": { + "type": "string", + "enum": ["celsius", "fahrenheit"] + } + }, + "required": ["location"] + } + } + } + ], + "metadata": {}, + "usage": null, + "temperature": 1.0, + "top_p": 1.0, + "max_prompt_tokens": 1000, + "max_completion_tokens": 1000, + "truncation_strategy": { + "type": "auto", + "last_messages": null + }, + "response_format": "auto", + "tool_choice": "auto", + "parallel_tool_calls": true + } + + - title: Streaming + request: + curl: | + curl https://api.openai.com/v1/threads/thread_123/runs/run_123/submit_tool_outputs \ + -H "Authorization: Bearer $OPENAI_API_KEY" \ + -H "Content-Type: application/json" \ + -H "OpenAI-Beta: assistants=v2" \ + -d '{ + "tool_outputs": [ + { + "tool_call_id": "call_001", + "output": "70 degrees and sunny." 
+ } + ], + "stream": true + }' + python: | + from openai import OpenAI + client = OpenAI() + + stream = client.beta.threads.runs.submit_tool_outputs( + thread_id="thread_123", + run_id="run_123", + tool_outputs=[ + { + "tool_call_id": "call_001", + "output": "70 degrees and sunny." + } + ], + stream=True + ) + + for event in stream: + print(event) + node.js: | + import OpenAI from "openai"; + + const openai = new OpenAI(); + + async function main() { + const stream = await openai.beta.threads.runs.submitToolOutputs( + "thread_123", + "run_123", + { + tool_outputs: [ + { + tool_call_id: "call_001", + output: "70 degrees and sunny.", + }, + ], + } + ); + + for await (const event of stream) { + console.log(event); + } + } + + main(); + response: | + event: thread.run.step.completed + data: {"id":"step_001","object":"thread.run.step","created_at":1710352449,"run_id":"run_123","assistant_id":"asst_123","thread_id":"thread_123","type":"tool_calls","status":"completed","cancelled_at":null,"completed_at":1710352475,"expires_at":1710353047,"failed_at":null,"last_error":null,"step_details":{"type":"tool_calls","tool_calls":[{"id":"call_iWr0kQ2EaYMaxNdl0v3KYkx7","type":"function","function":{"name":"get_current_weather","arguments":"{\"location\":\"San Francisco, CA\",\"unit\":\"fahrenheit\"}","output":"70 degrees and sunny."}}]},"usage":{"prompt_tokens":291,"completion_tokens":24,"total_tokens":315}} + + event: thread.run.queued + data: {"id":"run_123","object":"thread.run","created_at":1710352447,"assistant_id":"asst_123","thread_id":"thread_123","status":"queued","started_at":1710352448,"expires_at":1710353047,"cancelled_at":null,"failed_at":null,"completed_at":null,"required_action":null,"last_error":null,"model":"gpt-4o","instructions":null,"tools":[{"type":"function","function":{"name":"get_current_weather","description":"Get the current weather in a given location","parameters":{"type":"object","properties":{"location":{"type":"string","description":"The city and 
state, e.g. San Francisco, CA"},"unit":{"type":"string","enum":["celsius","fahrenheit"]}},"required":["location"]}}}],"metadata":{},"temperature":1.0,"top_p":1.0,"max_completion_tokens":null,"max_prompt_tokens":null,"truncation_strategy":{"type":"auto","last_messages":null},"incomplete_details":null,"usage":null,"response_format":"auto","tool_choice":"auto","parallel_tool_calls":true}} + + event: thread.run.in_progress + data: {"id":"run_123","object":"thread.run","created_at":1710352447,"assistant_id":"asst_123","thread_id":"thread_123","status":"in_progress","started_at":1710352475,"expires_at":1710353047,"cancelled_at":null,"failed_at":null,"completed_at":null,"required_action":null,"last_error":null,"model":"gpt-4o","instructions":null,"tools":[{"type":"function","function":{"name":"get_current_weather","description":"Get the current weather in a given location","parameters":{"type":"object","properties":{"location":{"type":"string","description":"The city and state, e.g. San Francisco, CA"},"unit":{"type":"string","enum":["celsius","fahrenheit"]}},"required":["location"]}}}],"metadata":{},"temperature":1.0,"top_p":1.0,"max_completion_tokens":null,"max_prompt_tokens":null,"truncation_strategy":{"type":"auto","last_messages":null},"incomplete_details":null,"usage":null,"response_format":"auto","tool_choice":"auto","parallel_tool_calls":true}} + + event: thread.run.step.created + data: {"id":"step_002","object":"thread.run.step","created_at":1710352476,"run_id":"run_123","assistant_id":"asst_123","thread_id":"thread_123","type":"message_creation","status":"in_progress","cancelled_at":null,"completed_at":null,"expires_at":1710353047,"failed_at":null,"last_error":null,"step_details":{"type":"message_creation","message_creation":{"message_id":"msg_002"}},"usage":null} + + event: thread.run.step.in_progress + data: 
{"id":"step_002","object":"thread.run.step","created_at":1710352476,"run_id":"run_123","assistant_id":"asst_123","thread_id":"thread_123","type":"message_creation","status":"in_progress","cancelled_at":null,"completed_at":null,"expires_at":1710353047,"failed_at":null,"last_error":null,"step_details":{"type":"message_creation","message_creation":{"message_id":"msg_002"}},"usage":null} + + event: thread.message.created + data: {"id":"msg_002","object":"thread.message","created_at":1710352476,"assistant_id":"asst_123","thread_id":"thread_123","run_id":"run_123","status":"in_progress","incomplete_details":null,"incomplete_at":null,"completed_at":null,"role":"assistant","content":[],"metadata":{}} + + event: thread.message.in_progress + data: {"id":"msg_002","object":"thread.message","created_at":1710352476,"assistant_id":"asst_123","thread_id":"thread_123","run_id":"run_123","status":"in_progress","incomplete_details":null,"incomplete_at":null,"completed_at":null,"role":"assistant","content":[],"metadata":{}} + + event: thread.message.delta + data: {"id":"msg_002","object":"thread.message.delta","delta":{"content":[{"index":0,"type":"text","text":{"value":"The","annotations":[]}}]}} + + event: thread.message.delta + data: {"id":"msg_002","object":"thread.message.delta","delta":{"content":[{"index":0,"type":"text","text":{"value":" current"}}]}} + + event: thread.message.delta + data: {"id":"msg_002","object":"thread.message.delta","delta":{"content":[{"index":0,"type":"text","text":{"value":" weather"}}]}} + + ... 
+ + event: thread.message.delta + data: {"id":"msg_002","object":"thread.message.delta","delta":{"content":[{"index":0,"type":"text","text":{"value":" sunny"}}]}} + + event: thread.message.delta + data: {"id":"msg_002","object":"thread.message.delta","delta":{"content":[{"index":0,"type":"text","text":{"value":"."}}]}} + + event: thread.message.completed + data: {"id":"msg_002","object":"thread.message","created_at":1710352476,"assistant_id":"asst_123","thread_id":"thread_123","run_id":"run_123","status":"completed","incomplete_details":null,"incomplete_at":null,"completed_at":1710352477,"role":"assistant","content":[{"type":"text","text":{"value":"The current weather in San Francisco, CA is 70 degrees Fahrenheit and sunny.","annotations":[]}}],"metadata":{}} + + event: thread.run.step.completed + data: {"id":"step_002","object":"thread.run.step","created_at":1710352476,"run_id":"run_123","assistant_id":"asst_123","thread_id":"thread_123","type":"message_creation","status":"completed","cancelled_at":null,"completed_at":1710352477,"expires_at":1710353047,"failed_at":null,"last_error":null,"step_details":{"type":"message_creation","message_creation":{"message_id":"msg_002"}},"usage":{"prompt_tokens":329,"completion_tokens":18,"total_tokens":347}} + + event: thread.run.completed + data: {"id":"run_123","object":"thread.run","created_at":1710352447,"assistant_id":"asst_123","thread_id":"thread_123","status":"completed","started_at":1710352475,"expires_at":null,"cancelled_at":null,"failed_at":null,"completed_at":1710352477,"required_action":null,"last_error":null,"model":"gpt-4o","instructions":null,"tools":[{"type":"function","function":{"name":"get_current_weather","description":"Get the current weather in a given location","parameters":{"type":"object","properties":{"location":{"type":"string","description":"The city and state, e.g. 
San Francisco, CA"},"unit":{"type":"string","enum":["celsius","fahrenheit"]}},"required":["location"]}}}],"metadata":{},"temperature":1.0,"top_p":1.0,"max_completion_tokens":null,"max_prompt_tokens":null,"truncation_strategy":{"type":"auto","last_messages":null},"incomplete_details":null,"usage":{"prompt_tokens":20,"completion_tokens":11,"total_tokens":31},"response_format":"auto","tool_choice":"auto","parallel_tool_calls":true}} + + event: done + data: [DONE] + + /threads/{thread_id}/runs/{run_id}/cancel: + post: + operationId: cancelRun + tags: + - Assistants + summary: Cancels a run that is `in_progress`. + parameters: + - in: path + name: thread_id + required: true + schema: + type: string + description: The ID of the thread to which this run belongs. + - in: path + name: run_id + required: true + schema: + type: string + description: The ID of the run to cancel. + responses: + "200": + description: OK + content: + application/json: + schema: + $ref: "#/components/schemas/RunObject" + x-oaiMeta: + name: Cancel a run + group: threads + beta: true + returns: The modified [run](/docs/api-reference/runs/object) object matching the specified ID. 
+ examples: + request: + curl: | + curl https://api.openai.com/v1/threads/thread_abc123/runs/run_abc123/cancel \ + -H "Authorization: Bearer $OPENAI_API_KEY" \ + -H "OpenAI-Beta: assistants=v2" \ + -X POST + python: | + from openai import OpenAI + client = OpenAI() + + run = client.beta.threads.runs.cancel( + thread_id="thread_abc123", + run_id="run_abc123" + ) + + print(run) + node.js: | + import OpenAI from "openai"; + + const openai = new OpenAI(); + + async function main() { + const run = await openai.beta.threads.runs.cancel( + "thread_abc123", + "run_abc123" + ); + + console.log(run); + } + + main(); + response: | + { + "id": "run_abc123", + "object": "thread.run", + "created_at": 1699076126, + "assistant_id": "asst_abc123", + "thread_id": "thread_abc123", + "status": "cancelling", + "started_at": 1699076126, + "expires_at": 1699076726, + "cancelled_at": null, + "failed_at": null, + "completed_at": null, + "last_error": null, + "model": "gpt-4o", + "instructions": "You summarize books.", + "tools": [ + { + "type": "file_search" + } + ], + "tool_resources": { + "file_search": { + "vector_store_ids": ["vs_123"] + } + }, + "metadata": {}, + "usage": null, + "temperature": 1.0, + "top_p": 1.0, + "response_format": "auto", + "tool_choice": "auto", + "parallel_tool_calls": true + } + + /threads/{thread_id}/runs/{run_id}/steps: + get: + operationId: listRunSteps + tags: + - Assistants + summary: Returns a list of run steps belonging to a run. + parameters: + - name: thread_id + in: path + required: true + schema: + type: string + description: The ID of the thread the run and run steps belong to. + - name: run_id + in: path + required: true + schema: + type: string + description: The ID of the run the run steps belong to. 
+ - name: limit + in: query + description: *pagination_limit_param_description + required: false + schema: + type: integer + default: 20 + - name: order + in: query + description: *pagination_order_param_description + schema: + type: string + default: desc + enum: ["asc", "desc"] + - name: after + in: query + description: *pagination_after_param_description + schema: + type: string + - name: before + in: query + description: *pagination_before_param_description + schema: + type: string + responses: + "200": + description: OK + content: + application/json: + schema: + $ref: "#/components/schemas/ListRunStepsResponse" + x-oaiMeta: + name: List run steps + group: threads + beta: true + returns: A list of [run step](/docs/api-reference/runs/step-object) objects. + examples: + request: + curl: | + curl https://api.openai.com/v1/threads/thread_abc123/runs/run_abc123/steps \ + -H "Authorization: Bearer $OPENAI_API_KEY" \ + -H "Content-Type: application/json" \ + -H "OpenAI-Beta: assistants=v2" + python: | + from openai import OpenAI + client = OpenAI() + + run_steps = client.beta.threads.runs.steps.list( + thread_id="thread_abc123", + run_id="run_abc123" + ) + + print(run_steps) + node.js: | + import OpenAI from "openai"; + const openai = new OpenAI(); + + async function main() { + const runStep = await openai.beta.threads.runs.steps.list( + "thread_abc123", + "run_abc123" + ); + console.log(runStep); + } + + main(); + response: | + { + "object": "list", + "data": [ + { + "id": "step_abc123", + "object": "thread.run.step", + "created_at": 1699063291, + "run_id": "run_abc123", + "assistant_id": "asst_abc123", + "thread_id": "thread_abc123", + "type": "message_creation", + "status": "completed", + "cancelled_at": null, + "completed_at": 1699063291, + "expired_at": null, + "failed_at": null, + "last_error": null, + "step_details": { + "type": "message_creation", + "message_creation": { + "message_id": "msg_abc123" + } + }, + "usage": { + "prompt_tokens": 123, + 
"completion_tokens": 456, + "total_tokens": 579 + } + } + ], + "first_id": "step_abc123", + "last_id": "step_abc456", + "has_more": false + } + + /threads/{thread_id}/runs/{run_id}/steps/{step_id}: + get: + operationId: getRunStep + tags: + - Assistants + summary: Retrieves a run step. + parameters: + - in: path + name: thread_id + required: true + schema: + type: string + description: The ID of the thread to which the run and run step belongs. + - in: path + name: run_id + required: true + schema: + type: string + description: The ID of the run to which the run step belongs. + - in: path + name: step_id + required: true + schema: + type: string + description: The ID of the run step to retrieve. + responses: + "200": + description: OK + content: + application/json: + schema: + $ref: "#/components/schemas/RunStepObject" + x-oaiMeta: + name: Retrieve run step + group: threads + beta: true + returns: The [run step](/docs/api-reference/runs/step-object) object matching the specified ID. + examples: + request: + curl: | + curl https://api.openai.com/v1/threads/thread_abc123/runs/run_abc123/steps/step_abc123 \ + -H "Authorization: Bearer $OPENAI_API_KEY" \ + -H "Content-Type: application/json" \ + -H "OpenAI-Beta: assistants=v2" + python: | + from openai import OpenAI + client = OpenAI() + + run_step = client.beta.threads.runs.steps.retrieve( + thread_id="thread_abc123", + run_id="run_abc123", + step_id="step_abc123" + ) + + print(run_step) + node.js: | + import OpenAI from "openai"; + const openai = new OpenAI(); + + async function main() { + const runStep = await openai.beta.threads.runs.steps.retrieve( + "thread_abc123", + "run_abc123", + "step_abc123" + ); + console.log(runStep); + } + + main(); + response: &run_step_object_example | + { + "id": "step_abc123", + "object": "thread.run.step", + "created_at": 1699063291, + "run_id": "run_abc123", + "assistant_id": "asst_abc123", + "thread_id": "thread_abc123", + "type": "message_creation", + "status": "completed", + 
"cancelled_at": null, + "completed_at": 1699063291, + "expired_at": null, + "failed_at": null, + "last_error": null, + "step_details": { + "type": "message_creation", + "message_creation": { + "message_id": "msg_abc123" + } + }, + "usage": { + "prompt_tokens": 123, + "completion_tokens": 456, + "total_tokens": 579 + } + } + + /vector_stores: + get: + operationId: listVectorStores + tags: + - Vector Stores + summary: Returns a list of vector stores. + parameters: + - name: limit + in: query + description: *pagination_limit_param_description + required: false + schema: + type: integer + default: 20 + - name: order + in: query + description: *pagination_order_param_description + schema: + type: string + default: desc + enum: ["asc", "desc"] + - name: after + in: query + description: *pagination_after_param_description + schema: + type: string + - name: before + in: query + description: *pagination_before_param_description + schema: + type: string + responses: + "200": + description: OK + content: + application/json: + schema: + $ref: "#/components/schemas/ListVectorStoresResponse" + x-oaiMeta: + name: List vector stores + group: vector_stores + beta: true + returns: A list of [vector store](/docs/api-reference/vector-stores/object) objects. 
+ examples: + request: + curl: | + curl https://api.openai.com/v1/vector_stores \ + -H "Authorization: Bearer $OPENAI_API_KEY" \ + -H "Content-Type: application/json" \ + -H "OpenAI-Beta: assistants=v2" + python: | + from openai import OpenAI + client = OpenAI() + + vector_stores = client.beta.vector_stores.list() + print(vector_stores) + node.js: | + import OpenAI from "openai"; + const openai = new OpenAI(); + + async function main() { + const vectorStores = await openai.beta.vectorStores.list(); + console.log(vectorStores); + } + + main(); + response: | + { + "object": "list", + "data": [ + { + "id": "vs_abc123", + "object": "vector_store", + "created_at": 1699061776, + "name": "Support FAQ", + "bytes": 139920, + "file_counts": { + "in_progress": 0, + "completed": 3, + "failed": 0, + "cancelled": 0, + "total": 3 + } + }, + { + "id": "vs_abc456", + "object": "vector_store", + "created_at": 1699061776, + "name": "Support FAQ v2", + "bytes": 139920, + "file_counts": { + "in_progress": 0, + "completed": 3, + "failed": 0, + "cancelled": 0, + "total": 3 + } + } + ], + "first_id": "vs_abc123", + "last_id": "vs_abc456", + "has_more": false + } + post: + operationId: createVectorStore + tags: + - Vector Stores + summary: Create a vector store. + requestBody: + required: true + content: + application/json: + schema: + $ref: "#/components/schemas/CreateVectorStoreRequest" + responses: + "200": + description: OK + content: + application/json: + schema: + $ref: "#/components/schemas/VectorStoreObject" + x-oaiMeta: + name: Create vector store + group: vector_stores + beta: true + returns: A [vector store](/docs/api-reference/vector-stores/object) object. 
+ examples: + request: + curl: | + curl https://api.openai.com/v1/vector_stores \ + -H "Authorization: Bearer $OPENAI_API_KEY" \ + -H "Content-Type: application/json" \ + -H "OpenAI-Beta: assistants=v2" \ + -d '{ + "name": "Support FAQ" + }' + python: | + from openai import OpenAI + client = OpenAI() + + vector_store = client.beta.vector_stores.create( + name="Support FAQ" + ) + print(vector_store) + node.js: | + import OpenAI from "openai"; + const openai = new OpenAI(); + + async function main() { + const vectorStore = await openai.beta.vectorStores.create({ + name: "Support FAQ" + }); + console.log(vectorStore); + } + + main(); + response: | + { + "id": "vs_abc123", + "object": "vector_store", + "created_at": 1699061776, + "name": "Support FAQ", + "bytes": 139920, + "file_counts": { + "in_progress": 0, + "completed": 3, + "failed": 0, + "cancelled": 0, + "total": 3 + } + } + + /vector_stores/{vector_store_id}: + get: + operationId: getVectorStore + tags: + - Vector Stores + summary: Retrieves a vector store. + parameters: + - in: path + name: vector_store_id + required: true + schema: + type: string + description: The ID of the vector store to retrieve. + responses: + "200": + description: OK + content: + application/json: + schema: + $ref: "#/components/schemas/VectorStoreObject" + x-oaiMeta: + name: Retrieve vector store + group: vector_stores + beta: true + returns: The [vector store](/docs/api-reference/vector-stores/object) object matching the specified ID.
+ examples: + request: + curl: | + curl https://api.openai.com/v1/vector_stores/vs_abc123 \ + -H "Authorization: Bearer $OPENAI_API_KEY" \ + -H "Content-Type: application/json" \ + -H "OpenAI-Beta: assistants=v2" + python: | + from openai import OpenAI + client = OpenAI() + + vector_store = client.beta.vector_stores.retrieve( + vector_store_id="vs_abc123" + ) + print(vector_store) + node.js: | + import OpenAI from "openai"; + const openai = new OpenAI(); + + async function main() { + const vectorStore = await openai.beta.vectorStores.retrieve( + "vs_abc123" + ); + console.log(vectorStore); + } + + main(); + response: | + { + "id": "vs_abc123", + "object": "vector_store", + "created_at": 1699061776 + } + post: + operationId: modifyVectorStore + tags: + - Vector Stores + summary: Modifies a vector store. + parameters: + - in: path + name: vector_store_id + required: true + schema: + type: string + description: The ID of the vector store to modify. + requestBody: + required: true + content: + application/json: + schema: + $ref: "#/components/schemas/UpdateVectorStoreRequest" + responses: + "200": + description: OK + content: + application/json: + schema: + $ref: "#/components/schemas/VectorStoreObject" + x-oaiMeta: + name: Modify vector store + group: vector_stores + beta: true + returns: The modified [vector store](/docs/api-reference/vector-stores/object) object. 
+ examples: + request: + curl: | + curl https://api.openai.com/v1/vector_stores/vs_abc123 \ + -H "Authorization: Bearer $OPENAI_API_KEY" \ + -H "Content-Type: application/json" \ + -H "OpenAI-Beta: assistants=v2" \ + -d '{ + "name": "Support FAQ" + }' + python: | + from openai import OpenAI + client = OpenAI() + + vector_store = client.beta.vector_stores.update( + vector_store_id="vs_abc123", + name="Support FAQ" + ) + print(vector_store) + node.js: | + import OpenAI from "openai"; + const openai = new OpenAI(); + + async function main() { + const vectorStore = await openai.beta.vectorStores.update( + "vs_abc123", + { + name: "Support FAQ" + } + ); + console.log(vectorStore); + } + + main(); + response: | + { + "id": "vs_abc123", + "object": "vector_store", + "created_at": 1699061776, + "name": "Support FAQ", + "bytes": 139920, + "file_counts": { + "in_progress": 0, + "completed": 3, + "failed": 0, + "cancelled": 0, + "total": 3 + } + } + + delete: + operationId: deleteVectorStore + tags: + - Vector Stores + summary: Delete a vector store. + parameters: + - in: path + name: vector_store_id + required: true + schema: + type: string + description: The ID of the vector store to delete.
+ responses: + "200": + description: OK + content: + application/json: + schema: + $ref: "#/components/schemas/DeleteVectorStoreResponse" + x-oaiMeta: + name: Delete vector store + group: vector_stores + beta: true + returns: Deletion status + examples: + request: + curl: | + curl https://api.openai.com/v1/vector_stores/vs_abc123 \ + -H "Authorization: Bearer $OPENAI_API_KEY" \ + -H "Content-Type: application/json" \ + -H "OpenAI-Beta: assistants=v2" \ + -X DELETE + python: | + from openai import OpenAI + client = OpenAI() + + deleted_vector_store = client.beta.vector_stores.delete( + vector_store_id="vs_abc123" + ) + print(deleted_vector_store) + node.js: | + import OpenAI from "openai"; + const openai = new OpenAI(); + + async function main() { + const deletedVectorStore = await openai.beta.vectorStores.del( + "vs_abc123" + ); + console.log(deletedVectorStore); + } + + main(); + response: | + { + "id": "vs_abc123", + "object": "vector_store.deleted", + "deleted": true + } + + /vector_stores/{vector_store_id}/files: + get: + operationId: listVectorStoreFiles + tags: + - Vector Stores + summary: Returns a list of vector store files. + parameters: + - name: vector_store_id + in: path + description: The ID of the vector store that the files belong to. + required: true + schema: + type: string + - name: limit + in: query + description: *pagination_limit_param_description + required: false + schema: + type: integer + default: 20 + - name: order + in: query + description: *pagination_order_param_description + schema: + type: string + default: desc + enum: ["asc", "desc"] + - name: after + in: query + description: *pagination_after_param_description + schema: + type: string + - name: before + in: query + description: *pagination_before_param_description + schema: + type: string + - name: filter + in: query + description: "Filter by file status. One of `in_progress`, `completed`, `failed`, `cancelled`."
+ schema: + type: string + enum: ["in_progress", "completed", "failed", "cancelled"] + responses: + "200": + description: OK + content: + application/json: + schema: + $ref: "#/components/schemas/ListVectorStoreFilesResponse" + x-oaiMeta: + name: List vector store files + group: vector_stores + beta: true + returns: A list of [vector store file](/docs/api-reference/vector-stores-files/file-object) objects. + examples: + request: + curl: | + curl https://api.openai.com/v1/vector_stores/vs_abc123/files \ + -H "Authorization: Bearer $OPENAI_API_KEY" \ + -H "Content-Type: application/json" \ + -H "OpenAI-Beta: assistants=v2" + python: | + from openai import OpenAI + client = OpenAI() + + vector_store_files = client.beta.vector_stores.files.list( + vector_store_id="vs_abc123" + ) + print(vector_store_files) + node.js: | + import OpenAI from "openai"; + const openai = new OpenAI(); + + async function main() { + const vectorStoreFiles = await openai.beta.vectorStores.files.list( + "vs_abc123" + ); + console.log(vectorStoreFiles); + } + + main(); + response: | + { + "object": "list", + "data": [ + { + "id": "file-abc123", + "object": "vector_store.file", + "created_at": 1699061776, + "vector_store_id": "vs_abc123" + }, + { + "id": "file-abc456", + "object": "vector_store.file", + "created_at": 1699061776, + "vector_store_id": "vs_abc123" + } + ], + "first_id": "file-abc123", + "last_id": "file-abc456", + "has_more": false + } + post: + operationId: createVectorStoreFile + tags: + - Vector Stores + summary: Create a vector store file by attaching a [File](/docs/api-reference/files) to a [vector store](/docs/api-reference/vector-stores/object). + parameters: + - in: path + name: vector_store_id + required: true + schema: + type: string + example: vs_abc123 + description: | + The ID of the vector store for which to create a File. 
+ requestBody: + required: true + content: + application/json: + schema: + $ref: "#/components/schemas/CreateVectorStoreFileRequest" + responses: + "200": + description: OK + content: + application/json: + schema: + $ref: "#/components/schemas/VectorStoreFileObject" + x-oaiMeta: + name: Create vector store file + group: vector_stores + beta: true + returns: A [vector store file](/docs/api-reference/vector-stores-files/file-object) object. + examples: + request: + curl: | + curl https://api.openai.com/v1/vector_stores/vs_abc123/files \ + -H "Authorization: Bearer $OPENAI_API_KEY" \ + -H "Content-Type: application/json" \ + -H "OpenAI-Beta: assistants=v2" \ + -d '{ + "file_id": "file-abc123" + }' + python: | + from openai import OpenAI + client = OpenAI() + + vector_store_file = client.beta.vector_stores.files.create( + vector_store_id="vs_abc123", + file_id="file-abc123" + ) + print(vector_store_file) + node.js: | + import OpenAI from "openai"; + const openai = new OpenAI(); + + async function main() { + const myVectorStoreFile = await openai.beta.vectorStores.files.create( + "vs_abc123", + { + file_id: "file-abc123" + } + ); + console.log(myVectorStoreFile); + } + + main(); + response: | + { + "id": "file-abc123", + "object": "vector_store.file", + "created_at": 1699061776, + "usage_bytes": 1234, + "vector_store_id": "vs_abcd", + "status": "completed", + "last_error": null + } + + /vector_stores/{vector_store_id}/files/{file_id}: + get: + operationId: getVectorStoreFile + tags: + - Vector Stores + summary: Retrieves a vector store file. + parameters: + - in: path + name: vector_store_id + required: true + schema: + type: string + example: vs_abc123 + description: The ID of the vector store that the file belongs to. + - in: path + name: file_id + required: true + schema: + type: string + example: file-abc123 + description: The ID of the file being retrieved. 
+ responses: + "200": + description: OK + content: + application/json: + schema: + $ref: "#/components/schemas/VectorStoreFileObject" + x-oaiMeta: + name: Retrieve vector store file + group: vector_stores + beta: true + returns: The [vector store file](/docs/api-reference/vector-stores-files/file-object) object. + examples: + request: + curl: | + curl https://api.openai.com/v1/vector_stores/vs_abc123/files/file-abc123 \ + -H "Authorization: Bearer $OPENAI_API_KEY" \ + -H "Content-Type: application/json" \ + -H "OpenAI-Beta: assistants=v2" + python: | + from openai import OpenAI + client = OpenAI() + + vector_store_file = client.beta.vector_stores.files.retrieve( + vector_store_id="vs_abc123", + file_id="file-abc123" + ) + print(vector_store_file) + node.js: | + import OpenAI from "openai"; + const openai = new OpenAI(); + + async function main() { + const vectorStoreFile = await openai.beta.vectorStores.files.retrieve( + "vs_abc123", + "file-abc123" + ); + console.log(vectorStoreFile); + } + + main(); + response: | + { + "id": "file-abc123", + "object": "vector_store.file", + "created_at": 1699061776, + "vector_store_id": "vs_abc123", + "status": "completed", + "last_error": null + } + delete: + operationId: deleteVectorStoreFile + tags: + - Vector Stores + summary: Delete a vector store file. This will remove the file from the vector store but the file itself will not be deleted. To delete the file, use the [delete file](/docs/api-reference/files/delete) endpoint. + parameters: + - in: path + name: vector_store_id + required: true + schema: + type: string + description: The ID of the vector store that the file belongs to. + - in: path + name: file_id + required: true + schema: + type: string + description: The ID of the file to delete.
+ responses: + "200": + description: OK + content: + application/json: + schema: + $ref: "#/components/schemas/DeleteVectorStoreFileResponse" + x-oaiMeta: + name: Delete vector store file + group: vector_stores + beta: true + returns: Deletion status + examples: + request: + curl: | + curl https://api.openai.com/v1/vector_stores/vs_abc123/files/file-abc123 \ + -H "Authorization: Bearer $OPENAI_API_KEY" \ + -H "Content-Type: application/json" \ + -H "OpenAI-Beta: assistants=v2" \ + -X DELETE + python: | + from openai import OpenAI + client = OpenAI() + + deleted_vector_store_file = client.beta.vector_stores.files.delete( + vector_store_id="vs_abc123", + file_id="file-abc123" + ) + print(deleted_vector_store_file) + node.js: | + import OpenAI from "openai"; + const openai = new OpenAI(); + + async function main() { + const deletedVectorStoreFile = await openai.beta.vectorStores.files.del( + "vs_abc123", + "file-abc123" + ); + console.log(deletedVectorStoreFile); + } + + main(); + response: | + { + "id": "file-abc123", + "object": "vector_store.file.deleted", + "deleted": true + } + + /vector_stores/{vector_store_id}/file_batches: + post: + operationId: createVectorStoreFileBatch + tags: + - Vector Stores + summary: Create a vector store file batch. + parameters: + - in: path + name: vector_store_id + required: true + schema: + type: string + example: vs_abc123 + description: | + The ID of the vector store for which to create a File Batch. + requestBody: + required: true + content: + application/json: + schema: + $ref: "#/components/schemas/CreateVectorStoreFileBatchRequest" + responses: + "200": + description: OK + content: + application/json: + schema: + $ref: "#/components/schemas/VectorStoreFileBatchObject" + x-oaiMeta: + name: Create vector store file batch + group: vector_stores + beta: true + returns: A [vector store file batch](/docs/api-reference/vector-stores-file-batches/batch-object) object.
+ examples: + request: + curl: | + curl https://api.openai.com/v1/vector_stores/vs_abc123/file_batches \ + -H "Authorization: Bearer $OPENAI_API_KEY" \ + -H "Content-Type: application/json" \ + -H "OpenAI-Beta: assistants=v2" \ + -d '{ + "file_ids": ["file-abc123", "file-abc456"] + }' + python: | + from openai import OpenAI + client = OpenAI() + + vector_store_file_batch = client.beta.vector_stores.file_batches.create( + vector_store_id="vs_abc123", + file_ids=["file-abc123", "file-abc456"] + ) + print(vector_store_file_batch) + node.js: | + import OpenAI from "openai"; + const openai = new OpenAI(); + + async function main() { + const myVectorStoreFileBatch = await openai.beta.vectorStores.fileBatches.create( + "vs_abc123", + { + file_ids: ["file-abc123", "file-abc456"] + } + ); + console.log(myVectorStoreFileBatch); + } + + main(); + response: | + { + "id": "vsfb_abc123", + "object": "vector_store.file_batch", + "created_at": 1699061776, + "vector_store_id": "vs_abc123", + "status": "in_progress", + "file_counts": { + "in_progress": 1, + "completed": 1, + "failed": 0, + "cancelled": 0, + "total": 2 + } + } + + /vector_stores/{vector_store_id}/file_batches/{batch_id}: + get: + operationId: getVectorStoreFileBatch + tags: + - Vector Stores + summary: Retrieves a vector store file batch. + parameters: + - in: path + name: vector_store_id + required: true + schema: + type: string + example: vs_abc123 + description: The ID of the vector store that the file batch belongs to. + - in: path + name: batch_id + required: true + schema: + type: string + example: vsfb_abc123 + description: The ID of the file batch being retrieved. + responses: + "200": + description: OK + content: + application/json: + schema: + $ref: "#/components/schemas/VectorStoreFileBatchObject" + x-oaiMeta: + name: Retrieve vector store file batch + group: vector_stores + beta: true + returns: The [vector store file batch](/docs/api-reference/vector-stores-file-batches/batch-object) object.
+ examples: + request: + curl: | + curl https://api.openai.com/v1/vector_stores/vs_abc123/file_batches/vsfb_abc123 \ + -H "Authorization: Bearer $OPENAI_API_KEY" \ + -H "Content-Type: application/json" \ + -H "OpenAI-Beta: assistants=v2" + python: | + from openai import OpenAI + client = OpenAI() + + vector_store_file_batch = client.beta.vector_stores.file_batches.retrieve( + vector_store_id="vs_abc123", + batch_id="vsfb_abc123" + ) + print(vector_store_file_batch) + node.js: | + import OpenAI from "openai"; + const openai = new OpenAI(); + + async function main() { + const vectorStoreFileBatch = await openai.beta.vectorStores.fileBatches.retrieve( + "vs_abc123", + "vsfb_abc123" + ); + console.log(vectorStoreFileBatch); + } + + main(); + response: | + { + "id": "vsfb_abc123", + "object": "vector_store.file_batch", + "created_at": 1699061776, + "vector_store_id": "vs_abc123", + "status": "in_progress", + "file_counts": { + "in_progress": 1, + "completed": 1, + "failed": 0, + "cancelled": 0, + "total": 2 + } + } + + /vector_stores/{vector_store_id}/file_batches/{batch_id}/cancel: + post: + operationId: cancelVectorStoreFileBatch + tags: + - Vector Stores + summary: Cancel a vector store file batch. This attempts to cancel the processing of files in this batch as soon as possible. + parameters: + - in: path + name: vector_store_id + required: true + schema: + type: string + description: The ID of the vector store that the file batch belongs to. + - in: path + name: batch_id + required: true + schema: + type: string + description: The ID of the file batch to cancel. + responses: + "200": + description: OK + content: + application/json: + schema: + $ref: "#/components/schemas/VectorStoreFileBatchObject" + x-oaiMeta: + name: Cancel vector store file batch + group: vector_stores + beta: true + returns: The modified vector store file batch object.
+ examples: + request: + curl: | + curl https://api.openai.com/v1/vector_stores/vs_abc123/file_batches/vsfb_abc123/cancel \ + -H "Authorization: Bearer $OPENAI_API_KEY" \ + -H "Content-Type: application/json" \ + -H "OpenAI-Beta: assistants=v2" \ + -X POST + python: | + from openai import OpenAI + client = OpenAI() + + cancelled_vector_store_file_batch = client.beta.vector_stores.file_batches.cancel( + vector_store_id="vs_abc123", + batch_id="vsfb_abc123" + ) + print(cancelled_vector_store_file_batch) + node.js: | + import OpenAI from "openai"; + const openai = new OpenAI(); + + async function main() { + const cancelledVectorStoreFileBatch = await openai.beta.vectorStores.fileBatches.cancel( + "vs_abc123", + "vsfb_abc123" + ); + console.log(cancelledVectorStoreFileBatch); + } + + main(); + response: | + { + "id": "vsfb_abc123", + "object": "vector_store.file_batch", + "created_at": 1699061776, + "vector_store_id": "vs_abc123", + "status": "cancelling", + "file_counts": { + "in_progress": 12, + "completed": 3, + "failed": 0, + "cancelled": 0, + "total": 15 + } + } + + /vector_stores/{vector_store_id}/file_batches/{batch_id}/files: + get: + operationId: listFilesInVectorStoreBatch + tags: + - Vector Stores + summary: Returns a list of vector store files in a batch. + parameters: + - name: vector_store_id + in: path + description: The ID of the vector store that the files belong to. + required: true + schema: + type: string + - name: batch_id + in: path + description: The ID of the file batch that the files belong to.
+ required: true + schema: + type: string + - name: limit + in: query + description: *pagination_limit_param_description + required: false + schema: + type: integer + default: 20 + - name: order + in: query + description: *pagination_order_param_description + schema: + type: string + default: desc + enum: ["asc", "desc"] + - name: after + in: query + description: *pagination_after_param_description + schema: + type: string + - name: before + in: query + description: *pagination_before_param_description + schema: + type: string + - name: filter + in: query + description: "Filter by file status. One of `in_progress`, `completed`, `failed`, `cancelled`." + schema: + type: string + enum: ["in_progress", "completed", "failed", "cancelled"] + responses: + "200": + description: OK + content: + application/json: + schema: + $ref: "#/components/schemas/ListVectorStoreFilesResponse" + x-oaiMeta: + name: List vector store files in a batch + group: vector_stores + beta: true + returns: A list of [vector store file](/docs/api-reference/vector-stores-files/file-object) objects. 
+ examples: + request: + curl: | + curl https://api.openai.com/v1/vector_stores/vs_abc123/file_batches/vsfb_abc123/files \ + -H "Authorization: Bearer $OPENAI_API_KEY" \ + -H "Content-Type: application/json" \ + -H "OpenAI-Beta: assistants=v2" + python: | + from openai import OpenAI + client = OpenAI() + + vector_store_files = client.beta.vector_stores.file_batches.list_files( + vector_store_id="vs_abc123", + batch_id="vsfb_abc123" + ) + print(vector_store_files) + node.js: | + import OpenAI from "openai"; + const openai = new OpenAI(); + + async function main() { + const vectorStoreFiles = await openai.beta.vectorStores.fileBatches.listFiles( + "vs_abc123", + "vsfb_abc123" + ); + console.log(vectorStoreFiles); + } + + main(); + response: | + { + "object": "list", + "data": [ + { + "id": "file-abc123", + "object": "vector_store.file", + "created_at": 1699061776, + "vector_store_id": "vs_abc123" + }, + { + "id": "file-abc456", + "object": "vector_store.file", + "created_at": 1699061776, + "vector_store_id": "vs_abc123" + } + ], + "first_id": "file-abc123", + "last_id": "file-abc456", + "has_more": false + } + + /batches: + post: + summary: Creates and executes a batch from an uploaded file of requests + operationId: createBatch + tags: + - Batch + requestBody: + required: true + content: + application/json: + schema: + type: object + required: + - input_file_id + - endpoint + - completion_window + properties: + input_file_id: + type: string + description: | + The ID of an uploaded file that contains requests for the new batch. + + See [upload file](/docs/api-reference/files/create) for how to upload a file. + + Your input file must be formatted as a [JSONL file](/docs/api-reference/batch/request-input), and must be uploaded with the purpose `batch`. The file can contain up to 50,000 requests, and can be up to 100 MB in size.
+ endpoint: + type: string + enum: + [ + "/v1/chat/completions", + "/v1/embeddings", + "/v1/completions", + ] + description: The endpoint to be used for all requests in the batch. Currently `/v1/chat/completions`, `/v1/embeddings`, and `/v1/completions` are supported. Note that `/v1/embeddings` batches are also restricted to a maximum of 50,000 embedding inputs across all requests in the batch. + completion_window: + type: string + enum: ["24h"] + description: The time frame within which the batch should be processed. Currently only `24h` is supported. + metadata: + type: object + additionalProperties: + type: string + description: Optional custom metadata for the batch. + nullable: true + responses: + "200": + description: Batch created successfully. + content: + application/json: + schema: + $ref: "#/components/schemas/Batch" + x-oaiMeta: + name: Create batch + group: batch + returns: The created [Batch](/docs/api-reference/batch/object) object. + examples: + request: + curl: | + curl https://api.openai.com/v1/batches \ + -H "Authorization: Bearer $OPENAI_API_KEY" \ + -H "Content-Type: application/json" \ + -d '{ + "input_file_id": "file-abc123", + "endpoint": "/v1/chat/completions", + "completion_window": "24h" + }' + python: | + from openai import OpenAI + client = OpenAI() + + client.batches.create( + input_file_id="file-abc123", + endpoint="/v1/chat/completions", + completion_window="24h" + ) + node: | + import OpenAI from "openai"; + + const openai = new OpenAI(); + + async function main() { + const batch = await openai.batches.create({ + input_file_id: "file-abc123", + endpoint: "/v1/chat/completions", + completion_window: "24h" + }); + + console.log(batch); + } + + main(); + response: | + { + "id": "batch_abc123", + "object": "batch", + "endpoint": "/v1/chat/completions", + "errors": null, + "input_file_id": "file-abc123", + "completion_window": "24h", + "status": "validating", + "output_file_id": null, + "error_file_id": null, + "created_at": 1711471533, 
+ "in_progress_at": null, + "expires_at": null, + "finalizing_at": null, + "completed_at": null, + "failed_at": null, + "expired_at": null, + "cancelling_at": null, + "cancelled_at": null, + "request_counts": { + "total": 0, + "completed": 0, + "failed": 0 + }, + "metadata": { + "customer_id": "user_123456789", + "batch_description": "Nightly eval job", + } + } + get: + operationId: listBatches + tags: + - Batch + summary: List your organization's batches. + parameters: + - in: query + name: after + required: false + schema: + type: string + description: *pagination_after_param_description + - name: limit + in: query + description: *pagination_limit_param_description + required: false + schema: + type: integer + default: 20 + responses: + "200": + description: Batch listed successfully. + content: + application/json: + schema: + $ref: "#/components/schemas/ListBatchesResponse" + x-oaiMeta: + name: List batch + group: batch + returns: A list of paginated [Batch](/docs/api-reference/batch/object) objects. 
+ examples: + request: + curl: | + curl https://api.openai.com/v1/batches?limit=2 \ + -H "Authorization: Bearer $OPENAI_API_KEY" \ + -H "Content-Type: application/json" + python: | + from openai import OpenAI + client = OpenAI() + + client.batches.list() + node: | + import OpenAI from "openai"; + + const openai = new OpenAI(); + + async function main() { + const list = await openai.batches.list(); + + for await (const batch of list) { + console.log(batch); + } + } + + main(); + response: | + { + "object": "list", + "data": [ + { + "id": "batch_abc123", + "object": "batch", + "endpoint": "/v1/chat/completions", + "errors": null, + "input_file_id": "file-abc123", + "completion_window": "24h", + "status": "completed", + "output_file_id": "file-cvaTdG", + "error_file_id": "file-HOWS94", + "created_at": 1711471533, + "in_progress_at": 1711471538, + "expires_at": 1711557933, + "finalizing_at": 1711493133, + "completed_at": 1711493163, + "failed_at": null, + "expired_at": null, + "cancelling_at": null, + "cancelled_at": null, + "request_counts": { + "total": 100, + "completed": 95, + "failed": 5 + }, + "metadata": { + "customer_id": "user_123456789", + "batch_description": "Nightly job" + } + }, + { ... } + ], + "first_id": "batch_abc123", + "last_id": "batch_abc456", + "has_more": true + } + + /batches/{batch_id}: + get: + operationId: retrieveBatch + tags: + - Batch + summary: Retrieves a batch. + parameters: + - in: path + name: batch_id + required: true + schema: + type: string + description: The ID of the batch to retrieve. + responses: + "200": + description: Batch retrieved successfully. + content: + application/json: + schema: + $ref: "#/components/schemas/Batch" + x-oaiMeta: + name: Retrieve batch + group: batch + returns: The [Batch](/docs/api-reference/batch/object) object matching the specified ID.
+ examples: + request: + curl: | + curl https://api.openai.com/v1/batches/batch_abc123 \ + -H "Authorization: Bearer $OPENAI_API_KEY" \ + -H "Content-Type: application/json" + python: | + from openai import OpenAI + client = OpenAI() + + client.batches.retrieve("batch_abc123") + node: | + import OpenAI from "openai"; + + const openai = new OpenAI(); + + async function main() { + const batch = await openai.batches.retrieve("batch_abc123"); + + console.log(batch); + } + + main(); + response: &batch_object | + { + "id": "batch_abc123", + "object": "batch", + "endpoint": "/v1/completions", + "errors": null, + "input_file_id": "file-abc123", + "completion_window": "24h", + "status": "completed", + "output_file_id": "file-cvaTdG", + "error_file_id": "file-HOWS94", + "created_at": 1711471533, + "in_progress_at": 1711471538, + "expires_at": 1711557933, + "finalizing_at": 1711493133, + "completed_at": 1711493163, + "failed_at": null, + "expired_at": null, + "cancelling_at": null, + "cancelled_at": null, + "request_counts": { + "total": 100, + "completed": 95, + "failed": 5 + }, + "metadata": { + "customer_id": "user_123456789", + "batch_description": "Nightly eval job" + } + } + + /batches/{batch_id}/cancel: + post: + operationId: cancelBatch + tags: + - Batch + summary: Cancels an in-progress batch. The batch will be in status `cancelling` for up to 10 minutes, before changing to `cancelled`, where it will have partial results (if any) available in the output file. + parameters: + - in: path + name: batch_id + required: true + schema: + type: string + description: The ID of the batch to cancel. + responses: + "200": + description: Batch is cancelling. Returns the cancelling batch's details. + content: + application/json: + schema: + $ref: "#/components/schemas/Batch" + x-oaiMeta: + name: Cancel batch + group: batch + returns: The [Batch](/docs/api-reference/batch/object) object matching the specified ID.
+ examples: + request: + curl: | + curl https://api.openai.com/v1/batches/batch_abc123/cancel \ + -H "Authorization: Bearer $OPENAI_API_KEY" \ + -H "Content-Type: application/json" \ + -X POST + python: | + from openai import OpenAI + client = OpenAI() + + client.batches.cancel("batch_abc123") + node: | + import OpenAI from "openai"; + + const openai = new OpenAI(); + + async function main() { + const batch = await openai.batches.cancel("batch_abc123"); + + console.log(batch); + } + + main(); + response: | + { + "id": "batch_abc123", + "object": "batch", + "endpoint": "/v1/chat/completions", + "errors": null, + "input_file_id": "file-abc123", + "completion_window": "24h", + "status": "cancelling", + "output_file_id": null, + "error_file_id": null, + "created_at": 1711471533, + "in_progress_at": 1711471538, + "expires_at": 1711557933, + "finalizing_at": null, + "completed_at": null, + "failed_at": null, + "expired_at": null, + "cancelling_at": 1711475133, + "cancelled_at": null, + "request_counts": { + "total": 100, + "completed": 23, + "failed": 1 + }, + "metadata": { + "customer_id": "user_123456789", + "batch_description": "Nightly eval job" + } + } + + # Organization + # Audit Logs List + /organization/audit_logs: + get: + summary: List user actions and configuration changes within this organization. + operationId: list-audit-logs + tags: + - Audit Logs + parameters: + - name: effective_at + in: query + description: Return only events whose `effective_at` (Unix seconds) is in this range. + required: false + schema: + type: object + properties: + gt: + type: integer + description: Return only events whose `effective_at` (Unix seconds) is greater than this value. + gte: + type: integer + description: Return only events whose `effective_at` (Unix seconds) is greater than or equal to this value. + lt: + type: integer + description: Return only events whose `effective_at` (Unix seconds) is less than this value.
+ lte: + type: integer + description: Return only events whose `effective_at` (Unix seconds) is less than or equal to this value. + - name: project_ids[] + in: query + description: Return only events for these projects. + required: false + schema: + type: array + items: + type: string + - name: event_types[] + in: query + description: Return only events with a `type` in one of these values. For example, `project.created`. For all options, see the documentation for the [audit log object](/docs/api-reference/audit-logs/object). + required: false + schema: + type: array + items: + $ref: "#/components/schemas/AuditLogEventType" + - name: actor_ids[] + in: query + description: Return only events performed by these actors. Can be a user ID, a service account ID, or an API key tracking ID. + required: false + schema: + type: array + items: + type: string + - name: actor_emails[] + in: query + description: Return only events performed by users with these emails. + required: false + schema: + type: array + items: + type: string + - name: resource_ids[] + in: query + description: Return only events performed on these targets. For example, a project ID that was updated. + required: false + schema: + type: array + items: + type: string + - name: limit + in: query + description: *pagination_limit_param_description + required: false + schema: + type: integer + default: 20 + - name: after + in: query + description: *pagination_after_param_description + schema: + type: string + - name: before + in: query + description: *pagination_before_param_description + schema: + type: string + responses: + "200": + description: Audit logs listed successfully. + content: + application/json: + schema: + $ref: "#/components/schemas/ListAuditLogsResponse" + x-oaiMeta: + name: List audit logs + group: audit-logs + returns: A list of paginated [Audit Log](/docs/api-reference/audit-logs/object) objects.
+ examples: + request: + curl: | + curl https://api.openai.com/v1/organization/audit_logs \ + -H "Authorization: Bearer $OPENAI_ADMIN_KEY" \ + -H "Content-Type: application/json" + response: | + { + "object": "list", + "data": [ + { + "id": "audit_log-xxx__20240101", + "type": "project.archived", + "effective_at": 1722461446, + "actor": { + "type": "api_key", + "api_key": { + "type": "user", + "user": { + "id": "user-xxx", + "email": "user@example.com" + } + } + }, + "project.archived": { + "id": "proj_abc" + } + }, + { + "id": "audit_log-yyy__20240101", + "type": "api_key.updated", + "effective_at": 1720804190, + "actor": { + "type": "session", + "session": { + "user": { + "id": "user-xxx", + "email": "user@example.com" + }, + "ip_address": "127.0.0.1", + "user_agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.124 Safari/537.36" + } + }, + "api_key.updated": { + "id": "key_xxxx", + "data": { + "scopes": ["resource_2.operation_2"] + } + } + } + ], + "first_id": "audit_log-xxx__20240101", + "last_id": "audit_log-yyy__20240101", + "has_more": true + } + /organization/invites: + get: + summary: Returns a list of invites in the organization. + operationId: list-invites + tags: + - Invites + parameters: + - name: limit + in: query + description: *pagination_limit_param_description + required: false + schema: + type: integer + default: 20 + - name: after + in: query + description: *pagination_after_param_description + required: false + schema: + type: string + responses: + "200": + description: Invites listed successfully. + content: + application/json: + schema: + $ref: '#/components/schemas/InviteListResponse' + x-oaiMeta: + name: List invites + group: administration + returns: A list of [Invite](/docs/api-reference/invite/object) objects.
+ examples: + request: + curl: | + curl https://api.openai.com/v1/organization/invites?after=invite-abc&limit=20 \ + -H "Authorization: Bearer $OPENAI_ADMIN_KEY" \ + -H "Content-Type: application/json" + response: + content: | + { + "object": "list", + "data": [ + { + "object": "organization.invite", + "id": "invite-abc", + "email": "user@example.com", + "role": "owner", + "status": "accepted", + "invited_at": 1711471533, + "expires_at": 1711471533, + "accepted_at": 1711471533 + } + ], + "first_id": "invite-abc", + "last_id": "invite-abc", + "has_more": false + } + + post: + summary: Create an invite for a user to the organization. The invite must be accepted by the user before they have access to the organization. + operationId: inviteUser + tags: + - Invites + requestBody: + description: The invite request payload. + required: true + content: + application/json: + schema: + $ref: '#/components/schemas/InviteRequest' + responses: + "200": + description: User invited successfully. + content: + application/json: + schema: + $ref: '#/components/schemas/Invite' + x-oaiMeta: + name: Create invite + group: administration + returns: The created [Invite](/docs/api-reference/invite/object) object. + examples: + request: + curl: | + curl -X POST https://api.openai.com/v1/organization/invites \ + -H "Authorization: Bearer $OPENAI_ADMIN_KEY" \ + -H "Content-Type: application/json" \ + -d '{ + "email": "user@example.com", + "role": "owner" + }' + response: + content: | + { + "object": "organization.invite", + "id": "invite-abc", + "email": "user@example.com", + "role": "owner", + "invited_at": 1711471533, + "expires_at": 1711471533, + "accepted_at": null + } + + /organization/invites/{invite_id}: + get: + summary: Retrieves an invite. + operationId: retrieve-invite + tags: + - Invites + parameters: + - in: path + name: invite_id + required: true + schema: + type: string + description: The ID of the invite to retrieve. 
+ responses: + "200": + description: Invite retrieved successfully. + content: + application/json: + schema: + $ref: '#/components/schemas/Invite' + x-oaiMeta: + name: Retrieve invite + group: administration + returns: The [Invite](/docs/api-reference/invite/object) object matching the specified ID. + examples: + request: + curl: | + curl https://api.openai.com/v1/organization/invites/invite-abc \ + -H "Authorization: Bearer $OPENAI_ADMIN_KEY" \ + -H "Content-Type: application/json" + response: + content: | + { + "object": "organization.invite", + "id": "invite-abc", + "email": "user@example.com", + "role": "owner", + "status": "accepted", + "invited_at": 1711471533, + "expires_at": 1711471533, + "accepted_at": 1711471533 + } + delete: + summary: Delete an invite. If the invite has already been accepted, it cannot be deleted. + operationId: delete-invite + tags: + - Invites + parameters: + - in: path + name: invite_id + required: true + schema: + type: string + description: The ID of the invite to delete. + responses: + "200": + description: Invite deleted successfully. + content: + application/json: + schema: + $ref: '#/components/schemas/InviteDeleteResponse' + x-oaiMeta: + name: Delete invite + group: administration + returns: Confirmation that the invite has been deleted + examples: + request: + curl: | + curl -X DELETE https://api.openai.com/v1/organization/invites/invite-abc \ + -H "Authorization: Bearer $OPENAI_ADMIN_KEY" \ + -H "Content-Type: application/json" + response: + content: | + { + "object": "organization.invite.deleted", + "id": "invite-abc", + "deleted": true + } + + /organization/users: + get: + summary: Lists all of the users in the organization. 
+ operationId: list-users + tags: + - Users + parameters: + - name: limit + in: query + description: *pagination_limit_param_description + required: false + schema: + type: integer + default: 20 + - name: after + in: query + description: *pagination_after_param_description + required: false + schema: + type: string + responses: + "200": + description: Users listed successfully. + content: + application/json: + schema: + $ref: '#/components/schemas/UserListResponse' + x-oaiMeta: + name: List users + group: administration + returns: A list of [User](/docs/api-reference/users/object) objects. + examples: + request: + curl: | + curl https://api.openai.com/v1/organization/users?after=user_abc&limit=20 \ + -H "Authorization: Bearer $OPENAI_ADMIN_KEY" \ + -H "Content-Type: application/json" + response: + content: | + { + "object": "list", + "data": [ + { + "object": "organization.user", + "id": "user_abc", + "name": "First Last", + "email": "user@example.com", + "role": "owner", + "added_at": 1711471533 + } + ], + "first_id": "user_abc", + "last_id": "user_abc", + "has_more": false + } + + /organization/users/{user_id}: + get: + summary: Retrieves a user by their identifier. + operationId: retrieve-user + tags: + - Users + parameters: + - name: user_id + in: path + description: The ID of the user. + required: true + schema: + type: string + responses: + "200": + description: User retrieved successfully. + content: + application/json: + schema: + $ref: '#/components/schemas/User' + x-oaiMeta: + name: Retrieve user + group: administration + returns: The [User](/docs/api-reference/users/object) object matching the specified ID.
+ examples: + request: + curl: | + curl https://api.openai.com/v1/organization/users/user_abc \ + -H "Authorization: Bearer $OPENAI_ADMIN_KEY" \ + -H "Content-Type: application/json" + response: + content: | + { + "object": "organization.user", + "id": "user_abc", + "name": "First Last", + "email": "user@example.com", + "role": "owner", + "added_at": 1711471533 + } + + post: + summary: Modifies a user's role in the organization. + operationId: modify-user + tags: + - Users + requestBody: + description: The new user role to modify. This must be one of `owner` or `member`. + required: true + content: + application/json: + schema: + $ref: '#/components/schemas/UserRoleUpdateRequest' + responses: + "200": + description: User role updated successfully. + content: + application/json: + schema: + $ref: '#/components/schemas/User' + x-oaiMeta: + name: Modify user + group: administration + returns: The updated [User](/docs/api-reference/users/object) object. + examples: + request: + curl: | + curl -X POST https://api.openai.com/v1/organization/users/user_abc \ + -H "Authorization: Bearer $OPENAI_ADMIN_KEY" \ + -H "Content-Type: application/json" \ + -d '{ + "role": "owner" + }' + response: + content: | + { + "object": "organization.user", + "id": "user_abc", + "name": "First Last", + "email": "user@example.com", + "role": "owner", + "added_at": 1711471533 + } + + delete: + summary: Deletes a user from the organization. + operationId: delete-user + tags: + - Users + parameters: + - name: user_id + in: path + description: The ID of the user. + required: true + schema: + type: string + responses: + "200": + description: User deleted successfully. 
+ content: + application/json: + schema: + $ref: '#/components/schemas/UserDeleteResponse' + x-oaiMeta: + name: Delete user + group: administration + returns: Confirmation of the deleted user + examples: + request: + curl: | + curl -X DELETE https://api.openai.com/v1/organization/users/user_abc \ + -H "Authorization: Bearer $OPENAI_ADMIN_KEY" \ + -H "Content-Type: application/json" + response: + content: | + { + "object": "organization.user.deleted", + "id": "user_abc", + "deleted": true + } + /organization/projects: + get: + summary: Returns a list of projects. + operationId: list-projects + tags: + - Projects + parameters: + - name: limit + in: query + description: *pagination_limit_param_description + required: false + schema: + type: integer + default: 20 + - name: after + in: query + description: *pagination_after_param_description + required: false + schema: + type: string + - name: include_archived + in: query + schema: + type: boolean + default: false + description: If `true` returns all projects including those that have been `archived`. Archived projects are not included by default. + responses: + "200": + description: Projects listed successfully. + content: + application/json: + schema: + $ref: '#/components/schemas/ProjectListResponse' + x-oaiMeta: + name: List projects + group: administration + returns: A list of [Project](/docs/api-reference/projects/object) objects. + examples: + request: + curl: | + curl https://api.openai.com/v1/organization/projects?after=proj_abc&limit=20&include_archived=false \ + -H "Authorization: Bearer $OPENAI_ADMIN_KEY" \ + -H "Content-Type: application/json" + response: + content: | + { + "object": "list", + "data": [ + { + "id": "proj_abc", + "object": "organization.project", + "name": "Project example", + "created_at": 1711471533, + "archived_at": null, + "status": "active" + } + ], + "first_id": "proj_abc", + "last_id": "proj_abc", + "has_more": false + } + + post: + summary: Create a new project in the organization.
Projects can be created and archived, but cannot be deleted. + operationId: create-project + tags: + - Projects + requestBody: + description: The project create request payload. + required: true + content: + application/json: + schema: + $ref: '#/components/schemas/ProjectCreateRequest' + responses: + "200": + description: Project created successfully. + content: + application/json: + schema: + $ref: '#/components/schemas/Project' + x-oaiMeta: + name: Create project + group: administration + returns: The created [Project](/docs/api-reference/projects/object) object. + examples: + request: + curl: | + curl -X POST https://api.openai.com/v1/organization/projects \ + -H "Authorization: Bearer $OPENAI_ADMIN_KEY" \ + -H "Content-Type: application/json" \ + -d '{ + "name": "Project ABC" + }' + response: + content: | + { + "id": "proj_abc", + "object": "organization.project", + "name": "Project ABC", + "created_at": 1711471533, + "archived_at": null, + "status": "active" + } + + /organization/projects/{project_id}: + get: + summary: Retrieves a project. + operationId: retrieve-project + tags: + - Projects + parameters: + - name: project_id + in: path + description: The ID of the project. + required: true + schema: + type: string + responses: + "200": + description: Project retrieved successfully. + content: + application/json: + schema: + $ref: '#/components/schemas/Project' + x-oaiMeta: + name: Retrieve project + group: administration + description: Retrieve a project. + returns: The [Project](/docs/api-reference/projects/object) object matching the specified ID. 
+ examples: + request: + curl: | + curl https://api.openai.com/v1/organization/projects/proj_abc \ + -H "Authorization: Bearer $OPENAI_ADMIN_KEY" \ + -H "Content-Type: application/json" + response: + content: | + { + "id": "proj_abc", + "object": "organization.project", + "name": "Project example", + "created_at": 1711471533, + "archived_at": null, + "status": "active" + } + + post: + summary: Modifies a project in the organization. + operationId: modify-project + tags: + - Projects + requestBody: + description: The project update request payload. + required: true + content: + application/json: + schema: + $ref: '#/components/schemas/ProjectUpdateRequest' + responses: + "200": + description: Project updated successfully. + content: + application/json: + schema: + $ref: '#/components/schemas/Project' + "400": + description: Error response when updating the default project. + content: + application/json: + schema: + $ref: '#/components/schemas/ErrorResponse' + x-oaiMeta: + name: Modify project + group: administration + returns: The updated [Project](/docs/api-reference/projects/object) object. + examples: + request: + curl: | + curl -X POST https://api.openai.com/v1/organization/projects/proj_abc \ + -H "Authorization: Bearer $OPENAI_ADMIN_KEY" \ + -H "Content-Type: application/json" \ + -d '{ + "name": "Project DEF" + }' + + /organization/projects/{project_id}/archive: + post: + summary: Archives a project in the organization. Archived projects cannot be used or updated. + operationId: archive-project + tags: + - Projects + parameters: + - name: project_id + in: path + description: The ID of the project. + required: true + schema: + type: string + responses: + "200": + description: Project archived successfully. + content: + application/json: + schema: + $ref: '#/components/schemas/Project' + x-oaiMeta: + name: Archive project + group: administration + returns: The archived [Project](/docs/api-reference/projects/object) object. 
+ examples: + request: + curl: | + curl -X POST https://api.openai.com/v1/organization/projects/proj_abc/archive \ + -H "Authorization: Bearer $OPENAI_ADMIN_KEY" \ + -H "Content-Type: application/json" + response: + content: | + { + "id": "proj_abc", + "object": "organization.project", + "name": "Project DEF", + "created_at": 1711471533, + "archived_at": 1711471533, + "status": "archived" + } + + + /organization/projects/{project_id}/users: + get: + summary: Returns a list of users in the project. + operationId: list-project-users + tags: + - Projects + parameters: + - name: project_id + in: path + description: The ID of the project. + required: true + schema: + type: string + - name: limit + in: query + description: *pagination_limit_param_description + required: false + schema: + type: integer + default: 20 + - name: after + in: query + description: *pagination_after_param_description + required: false + schema: + type: string + responses: + "200": + description: Project users listed successfully. + content: + application/json: + schema: + $ref: '#/components/schemas/ProjectUserListResponse' + "400": + description: Error response when project is archived. + content: + application/json: + schema: + $ref: '#/components/schemas/ErrorResponse' + x-oaiMeta: + name: List project users + group: administration + returns: A list of [ProjectUser](/docs/api-reference/project-users/object) objects. 
+ examples: + request: + curl: | + curl https://api.openai.com/v1/organization/projects/proj_abc/users?after=user_abc&limit=20 \ + -H "Authorization: Bearer $OPENAI_ADMIN_KEY" \ + -H "Content-Type: application/json" + response: + content: | + { + "object": "list", + "data": [ + { + "object": "organization.project.user", + "id": "user_abc", + "name": "First Last", + "email": "user@example.com", + "role": "owner", + "added_at": 1711471533 + } + ], + "first_id": "user-abc", + "last_id": "user-xyz", + "has_more": false + } + error_response: + content: | + { + "code": 400, + "message": "Project {name} is archived" + } + + post: + summary: Adds a user to the project. Users must already be members of the organization to be added to a project. + operationId: create-project-user + parameters: + - name: project_id + in: path + description: The ID of the project. + required: true + schema: + type: string + tags: + - Projects + requestBody: + description: The project user create request payload. + required: true + content: + application/json: + schema: + $ref: '#/components/schemas/ProjectUserCreateRequest' + responses: + "200": + description: User added to project successfully. + content: + application/json: + schema: + $ref: '#/components/schemas/ProjectUser' + "400": + description: Error response for various conditions. + content: + application/json: + schema: + $ref: '#/components/schemas/ErrorResponse' + x-oaiMeta: + name: Create project user + group: administration + returns: The created [ProjectUser](/docs/api-reference/project-users/object) object. 
+ examples: + request: + curl: | + curl -X POST https://api.openai.com/v1/organization/projects/proj_abc/users \ + -H "Authorization: Bearer $OPENAI_ADMIN_KEY" \ + -H "Content-Type: application/json" \ + -d '{ + "user_id": "user_abc", + "role": "member" + }' + response: + content: | + { + "object": "organization.project.user", + "id": "user_abc", + "email": "user@example.com", + "role": "owner", + "added_at": 1711471533 + } + error_response: + content: | + { + "code": 400, + "message": "Project {name} is archived" + } + + /organization/projects/{project_id}/users/{user_id}: + get: + summary: Retrieves a user in the project. + operationId: retrieve-project-user + tags: + - Projects + parameters: + - name: project_id + in: path + description: The ID of the project. + required: true + schema: + type: string + - name: user_id + in: path + description: The ID of the user. + required: true + schema: + type: string + responses: + "200": + description: Project user retrieved successfully. + content: + application/json: + schema: + $ref: '#/components/schemas/ProjectUser' + x-oaiMeta: + name: Retrieve project user + group: administration + returns: The [ProjectUser](/docs/api-reference/project-users/object) object matching the specified ID. + examples: + request: + curl: | + curl https://api.openai.com/v1/organization/projects/proj_abc/users/user_abc \ + -H "Authorization: Bearer $OPENAI_ADMIN_KEY" \ + -H "Content-Type: application/json" + response: + content: | + { + "object": "organization.project.user", + "id": "user_abc", + "name": "First Last", + "email": "user@example.com", + "role": "owner", + "added_at": 1711471533 + } + + post: + summary: Modifies a user's role in the project. + operationId: modify-project-user + tags: + - Projects + requestBody: + description: The project user update request payload. 
+ required: true + content: + application/json: + schema: + $ref: '#/components/schemas/ProjectUserUpdateRequest' + responses: + "200": + description: Project user's role updated successfully. + content: + application/json: + schema: + $ref: '#/components/schemas/ProjectUser' + "400": + description: Error response for various conditions. + content: + application/json: + schema: + $ref: '#/components/schemas/ErrorResponse' + x-oaiMeta: + name: Modify project user + group: administration + returns: The updated [ProjectUser](/docs/api-reference/project-users/object) object. + examples: + request: + curl: | + curl -X POST https://api.openai.com/v1/organization/projects/proj_abc/users/user_abc \ + -H "Authorization: Bearer $OPENAI_ADMIN_KEY" \ + -H "Content-Type: application/json" \ + -d '{ + "role": "owner" + }' + response: + content: | + { + "object": "organization.project.user", + "id": "user_abc", + "name": "First Last", + "email": "user@example.com", + "role": "owner", + "added_at": 1711471533 + } + + delete: + summary: Deletes a user from the project. + operationId: delete-project-user + tags: + - Projects + parameters: + - name: project_id + in: path + description: The ID of the project. + required: true + schema: + type: string + - name: user_id + in: path + description: The ID of the user. + required: true + schema: + type: string + responses: + "200": + description: Project user deleted successfully. + content: + application/json: + schema: + $ref: '#/components/schemas/ProjectUserDeleteResponse' + "400": + description: Error response for various conditions. 
+ content:
+ application/json:
+ schema:
+ $ref: '#/components/schemas/ErrorResponse'
+ x-oaiMeta:
+ name: Delete project user
+ group: administration
+ returns: Confirmation that the user has been deleted from the project, or an error if the project is archived, since archived projects have no users
+ examples:
+ request:
+ curl: |
+ curl -X DELETE https://api.openai.com/v1/organization/projects/proj_abc/users/user_abc \
+ -H "Authorization: Bearer $OPENAI_ADMIN_KEY" \
+ -H "Content-Type: application/json"
+ response:
+ content: |
+ {
+ "object": "organization.project.user.deleted",
+ "id": "user_abc",
+ "deleted": true
+ }
+
+ /organization/projects/{project_id}/service_accounts:
+ get:
+ summary: Returns a list of service accounts in the project.
+ operationId: list-project-service-accounts
+ tags:
+ - Projects
+ parameters:
+ - name: project_id
+ in: path
+ description: The ID of the project.
+ required: true
+ schema:
+ type: string
+ - name: limit
+ in: query
+ description: *pagination_limit_param_description
+ required: false
+ schema:
+ type: integer
+ default: 20
+ - name: after
+ in: query
+ description: *pagination_after_param_description
+ required: false
+ schema:
+ type: string
+ responses:
+ "200":
+ description: Project service accounts listed successfully.
+ content:
+ application/json:
+ schema:
+ $ref: '#/components/schemas/ProjectServiceAccountListResponse'
+ "400":
+ description: Error response when project is archived.
+ content:
+ application/json:
+ schema:
+ $ref: '#/components/schemas/ErrorResponse'
+ x-oaiMeta:
+ name: List project service accounts
+ group: administration
+ returns: A list of [ProjectServiceAccount](/docs/api-reference/project-service-accounts/object) objects.
+ examples: + request: + curl: | + curl https://api.openai.com/v1/organization/projects/proj_abc/service_accounts?after=custom_id&limit=20 \ + -H "Authorization: Bearer $OPENAI_ADMIN_KEY" \ + -H "Content-Type: application/json" + response: + content: | + { + "object": "list", + "data": [ + { + "object": "organization.project.service_account", + "id": "svc_acct_abc", + "name": "Service Account", + "role": "owner", + "created_at": 1711471533 + } + ], + "first_id": "svc_acct_abc", + "last_id": "svc_acct_xyz", + "has_more": false + } + + post: + summary: Creates a new service account in the project. This also returns an unredacted API key for the service account. + operationId: create-project-service-account + tags: + - Projects + parameters: + - name: project_id + in: path + description: The ID of the project. + required: true + schema: + type: string + requestBody: + description: The project service account create request payload. + required: true + content: + application/json: + schema: + $ref: '#/components/schemas/ProjectServiceAccountCreateRequest' + responses: + "200": + description: Project service account created successfully. + content: + application/json: + schema: + $ref: '#/components/schemas/ProjectServiceAccountCreateResponse' + "400": + description: Error response when project is archived. + content: + application/json: + schema: + $ref: '#/components/schemas/ErrorResponse' + x-oaiMeta: + name: Create project service account + group: administration + returns: The created [ProjectServiceAccount](/docs/api-reference/project-service-accounts/object) object. 
+ examples: + request: + curl: | + curl -X POST https://api.openai.com/v1/organization/projects/proj_abc/service_accounts \ + -H "Authorization: Bearer $OPENAI_ADMIN_KEY" \ + -H "Content-Type: application/json" \ + -d '{ + "name": "Production App" + }' + response: + content: | + { + "object": "organization.project.service_account", + "id": "svc_acct_abc", + "name": "Production App", + "role": "member", + "created_at": 1711471533, + "api_key": { + "object": "organization.project.service_account.api_key", + "value": "sk-abcdefghijklmnop123", + "name": "Secret Key", + "created_at": 1711471533, + "id": "key_abc" + } + } + + /organization/projects/{project_id}/service_accounts/{service_account_id}: + get: + summary: Retrieves a service account in the project. + operationId: retrieve-project-service-account + tags: + - Projects + parameters: + - name: project_id + in: path + description: The ID of the project. + required: true + schema: + type: string + - name: service_account_id + in: path + description: The ID of the service account. + required: true + schema: + type: string + responses: + "200": + description: Project service account retrieved successfully. + content: + application/json: + schema: + $ref: '#/components/schemas/ProjectServiceAccount' + x-oaiMeta: + name: Retrieve project service account + group: administration + returns: The [ProjectServiceAccount](/docs/api-reference/project-service-accounts/object) object matching the specified ID. + examples: + request: + curl: | + curl https://api.openai.com/v1/organization/projects/proj_abc/service_accounts/svc_acct_abc \ + -H "Authorization: Bearer $OPENAI_ADMIN_KEY" \ + -H "Content-Type: application/json" + response: + content: | + { + "object": "organization.project.service_account", + "id": "svc_acct_abc", + "name": "Service Account", + "role": "owner", + "created_at": 1711471533 + } + + delete: + summary: Deletes a service account from the project. 
+ operationId: delete-project-service-account
+ tags:
+ - Projects
+ parameters:
+ - name: project_id
+ in: path
+ description: The ID of the project.
+ required: true
+ schema:
+ type: string
+ - name: service_account_id
+ in: path
+ description: The ID of the service account.
+ required: true
+ schema:
+ type: string
+ responses:
+ "200":
+ description: Project service account deleted successfully.
+ content:
+ application/json:
+ schema:
+ $ref: '#/components/schemas/ProjectServiceAccountDeleteResponse'
+ x-oaiMeta:
+ name: Delete project service account
+ group: administration
+ returns: Confirmation that the service account was deleted, or an error if the project is archived, since archived projects have no service accounts
+ examples:
+ request:
+ curl: |
+ curl -X DELETE https://api.openai.com/v1/organization/projects/proj_abc/service_accounts/svc_acct_abc \
+ -H "Authorization: Bearer $OPENAI_ADMIN_KEY" \
+ -H "Content-Type: application/json"
+ response:
+ content: |
+ {
+ "object": "organization.project.service_account.deleted",
+ "id": "svc_acct_abc",
+ "deleted": true
+ }
+
+ /organization/projects/{project_id}/api_keys:
+ get:
+ summary: Returns a list of API keys in the project.
+ operationId: list-project-api-keys
+ tags:
+ - Projects
+ parameters:
+ - name: project_id
+ in: path
+ description: The ID of the project.
+ required: true
+ schema:
+ type: string
+ - name: limit
+ in: query
+ description: *pagination_limit_param_description
+ required: false
+ schema:
+ type: integer
+ default: 20
+ - name: after
+ in: query
+ description: *pagination_after_param_description
+ required: false
+ schema:
+ type: string
+ responses:
+ "200":
+ description: Project API keys listed successfully.
+ content:
+ application/json:
+ schema:
+ $ref: '#/components/schemas/ProjectApiKeyListResponse'
+
+ x-oaiMeta:
+ name: List project API keys
+ group: administration
+ returns: A list of [ProjectApiKey](/docs/api-reference/project-api-keys/object) objects.
+ examples: + request: + curl: | + curl https://api.openai.com/v1/organization/projects/proj_abc/api_keys?after=key_abc&limit=20 \ + -H "Authorization: Bearer $OPENAI_ADMIN_KEY" \ + -H "Content-Type: application/json" + response: + content: | + { + "object": "list", + "data": [ + { + "object": "organization.project.api_key", + "redacted_value": "sk-abc...def", + "name": "My API Key", + "created_at": 1711471533, + "id": "key_abc", + "owner": { + "type": "user", + "user": { + "object": "organization.project.user", + "id": "user_abc", + "name": "First Last", + "email": "user@example.com", + "role": "owner", + "added_at": 1711471533 + } + } + } + ], + "first_id": "key_abc", + "last_id": "key_xyz", + "has_more": false + } + error_response: + content: | + { + "code": 400, + "message": "Project {name} is archived" + } + + /organization/projects/{project_id}/api_keys/{key_id}: + get: + summary: Retrieves an API key in the project. + operationId: retrieve-project-api-key + tags: + - Projects + parameters: + - name: project_id + in: path + description: The ID of the project. + required: true + schema: + type: string + - name: key_id + in: path + description: The ID of the API key. + required: true + schema: + type: string + responses: + "200": + description: Project API key retrieved successfully. + content: + application/json: + schema: + $ref: '#/components/schemas/ProjectApiKey' + x-oaiMeta: + name: Retrieve project API key + group: administration + returns: The [ProjectApiKey](/docs/api-reference/project-api-keys/object) object matching the specified ID. 
+ examples: + request: + curl: | + curl https://api.openai.com/v1/organization/projects/proj_abc/api_keys/key_abc \ + -H "Authorization: Bearer $OPENAI_ADMIN_KEY" \ + -H "Content-Type: application/json" + response: + content: | + { + "object": "organization.project.api_key", + "redacted_value": "sk-abc...def", + "name": "My API Key", + "created_at": 1711471533, + "id": "key_abc", + "owner": { + "type": "user", + "user": { + "object": "organization.project.user", + "id": "user_abc", + "name": "First Last", + "email": "user@example.com", + "role": "owner", + "added_at": 1711471533 + } + } + } + + delete: + summary: Deletes an API key from the project. + operationId: delete-project-api-key + tags: + - Projects + parameters: + - name: project_id + in: path + description: The ID of the project. + required: true + schema: + type: string + - name: key_id + in: path + description: The ID of the API key. + required: true + schema: + type: string + responses: + "200": + description: Project API key deleted successfully. + content: + application/json: + schema: + $ref: '#/components/schemas/ProjectApiKeyDeleteResponse' + "400": + description: Error response for various conditions. 
+ content: + application/json: + schema: + $ref: '#/components/schemas/ErrorResponse' + x-oaiMeta: + name: Delete project API key + group: administration + returns: Confirmation of the key's deletion or an error if the key belonged to a service account + examples: + request: + curl: | + curl -X DELETE https://api.openai.com/v1/organization/projects/proj_abc/api_keys/key_abc \ + -H "Authorization: Bearer $OPENAI_ADMIN_KEY" \ + -H "Content-Type: application/json" + response: + content: | + { + "object": "organization.project.api_key.deleted", + "id": "key_abc", + "deleted": true + } + error_response: + content: | + { + "code": 400, + "message": "API keys cannot be deleted for service accounts, please delete the service account" + } + +components: + securitySchemes: + ApiKeyAuth: + type: http + scheme: "bearer" + + schemas: + Error: + type: object + properties: + code: + type: string + nullable: true + message: + type: string + nullable: false + param: + type: string + nullable: true + type: + type: string + nullable: false + required: + - type + - message + - param + - code + ErrorResponse: + type: object + properties: + error: + $ref: "#/components/schemas/Error" + required: + - error + + ListModelsResponse: + type: object + properties: + object: + type: string + enum: [list] + data: + type: array + items: + $ref: "#/components/schemas/Model" + required: + - object + - data + DeleteModelResponse: + type: object + properties: + id: + type: string + deleted: + type: boolean + object: + type: string + required: + - id + - object + - deleted + + CreateCompletionRequest: + type: object + properties: + model: + description: &model_description | + ID of the model to use. You can use the [List models](/docs/api-reference/models/list) API to see all of your available models, or see our [Model overview](/docs/models/overview) for descriptions of them. 
+ anyOf: + - type: string + - type: string + enum: ["gpt-3.5-turbo-instruct", "davinci-002", "babbage-002"] + x-oaiTypeLabel: string + prompt: + description: &completions_prompt_description | + The prompt(s) to generate completions for, encoded as a string, array of strings, array of tokens, or array of token arrays. + + Note that <|endoftext|> is the document separator that the model sees during training, so if a prompt is not specified the model will generate as if from the beginning of a new document. + default: "<|endoftext|>" + nullable: true + oneOf: + - type: string + default: "" + example: "This is a test." + - type: array + items: + type: string + default: "" + example: "This is a test." + - type: array + minItems: 1 + items: + type: integer + example: "[1212, 318, 257, 1332, 13]" + - type: array + minItems: 1 + items: + type: array + minItems: 1 + items: + type: integer + example: "[[1212, 318, 257, 1332, 13]]" + best_of: + type: integer + default: 1 + minimum: 0 + maximum: 20 + nullable: true + description: &completions_best_of_description | + Generates `best_of` completions server-side and returns the "best" (the one with the highest log probability per token). Results cannot be streamed. + + When used with `n`, `best_of` controls the number of candidate completions and `n` specifies how many to return – `best_of` must be greater than `n`. + + **Note:** Because this parameter generates many completions, it can quickly consume your token quota. Use carefully and ensure that you have reasonable settings for `max_tokens` and `stop`. + echo: + type: boolean + default: false + nullable: true + description: &completions_echo_description > + Echo back the prompt in addition to the completion + frequency_penalty: + type: number + default: 0 + minimum: -2 + maximum: 2 + nullable: true + description: &completions_frequency_penalty_description | + Number between -2.0 and 2.0. 
Positive values penalize new tokens based on their existing frequency in the text so far, decreasing the model's likelihood to repeat the same line verbatim.
+
+ [See more information about frequency and presence penalties.](/docs/guides/text-generation/parameter-details)
+ logit_bias: &completions_logit_bias
+ type: object
+ x-oaiTypeLabel: map
+ default: null
+ nullable: true
+ additionalProperties:
+ type: integer
+ description: &completions_logit_bias_description |
+ Modify the likelihood of specified tokens appearing in the completion.
+
+ Accepts a JSON object that maps tokens (specified by their token ID in the GPT tokenizer) to an associated bias value from -100 to 100. You can use this [tokenizer tool](/tokenizer?view=bpe) to convert text to token IDs. Mathematically, the bias is added to the logits generated by the model prior to sampling. The exact effect will vary per model, but values between -1 and 1 should decrease or increase likelihood of selection; values like -100 or 100 should result in a ban or exclusive selection of the relevant token.
+
+ As an example, you can pass `{"50256": -100}` to prevent the <|endoftext|> token from being generated.
+ logprobs: &completions_logprobs_configuration
+ type: integer
+ minimum: 0
+ maximum: 5
+ default: null
+ nullable: true
+ description: &completions_logprobs_description |
+ Include the log probabilities on the `logprobs` most likely output tokens, as well as the chosen tokens. For example, if `logprobs` is 5, the API will return a list of the 5 most likely tokens. The API will always return the `logprob` of the sampled token, so there may be up to `logprobs+1` elements in the response.
+
+ The maximum value for `logprobs` is 5.
+ max_tokens:
+ type: integer
+ minimum: 0
+ default: 16
+ example: 16
+ nullable: true
+ description: &completions_max_tokens_description |
+ The maximum number of [tokens](/tokenizer) that can be generated in the completion.
+ + The token count of your prompt plus `max_tokens` cannot exceed the model's context length. [Example Python code](https://cookbook.openai.com/examples/how_to_count_tokens_with_tiktoken) for counting tokens. + n: + type: integer + minimum: 1 + maximum: 128 + default: 1 + example: 1 + nullable: true + description: &completions_completions_description | + How many completions to generate for each prompt. + + **Note:** Because this parameter generates many completions, it can quickly consume your token quota. Use carefully and ensure that you have reasonable settings for `max_tokens` and `stop`. + presence_penalty: + type: number + default: 0 + minimum: -2 + maximum: 2 + nullable: true + description: &completions_presence_penalty_description | + Number between -2.0 and 2.0. Positive values penalize new tokens based on whether they appear in the text so far, increasing the model's likelihood to talk about new topics. + + [See more information about frequency and presence penalties.](/docs/guides/text-generation/parameter-details) + seed: &completions_seed_param + type: integer + minimum: -9223372036854775808 + maximum: 9223372036854775807 + nullable: true + description: | + If specified, our system will make a best effort to sample deterministically, such that repeated requests with the same `seed` and parameters should return the same result. + + Determinism is not guaranteed, and you should refer to the `system_fingerprint` response parameter to monitor changes in the backend. + stop: + description: &completions_stop_description > + Up to 4 sequences where the API will stop generating further tokens. The returned text will not contain the stop sequence. + default: null + nullable: true + oneOf: + - type: string + default: <|endoftext|> + example: "\n" + nullable: true + - type: array + minItems: 1 + maxItems: 4 + items: + type: string + example: '["\n"]' + stream: + description: > + Whether to stream back partial progress. 
If set, tokens will be sent as data-only [server-sent events](https://developer.mozilla.org/en-US/docs/Web/API/Server-sent_events/Using_server-sent_events#Event_stream_format) + as they become available, with the stream terminated by a `data: [DONE]` message. [Example Python code](https://cookbook.openai.com/examples/how_to_stream_completions). + type: boolean + nullable: true + default: false + stream_options: + $ref: "#/components/schemas/ChatCompletionStreamOptions" + suffix: + description: | + The suffix that comes after a completion of inserted text. + + This parameter is only supported for `gpt-3.5-turbo-instruct`. + default: null + nullable: true + type: string + example: "test." + temperature: + type: number + minimum: 0 + maximum: 2 + default: 1 + example: 1 + nullable: true + description: &completions_temperature_description | + What sampling temperature to use, between 0 and 2. Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic. + + We generally recommend altering this or `top_p` but not both. + top_p: + type: number + minimum: 0 + maximum: 1 + default: 1 + example: 1 + nullable: true + description: &completions_top_p_description | + An alternative to sampling with temperature, called nucleus sampling, where the model considers the results of the tokens with top_p probability mass. So 0.1 means only the tokens comprising the top 10% probability mass are considered. + + We generally recommend altering this or `temperature` but not both. + user: &end_user_param_configuration + type: string + example: user-1234 + description: | + A unique identifier representing your end-user, which can help OpenAI to monitor and detect abuse. [Learn more](/docs/guides/safety-best-practices/end-user-ids). + required: + - model + - prompt + + CreateCompletionResponse: + type: object + description: | + Represents a completion response from the API. 
Note: both the streamed and non-streamed response objects share the same shape (unlike the chat endpoint). + properties: + id: + type: string + description: A unique identifier for the completion. + choices: + type: array + description: The list of completion choices the model generated for the input prompt. + items: + type: object + required: + - finish_reason + - index + - logprobs + - text + properties: + finish_reason: + type: string + description: &completion_finish_reason_description | + The reason the model stopped generating tokens. This will be `stop` if the model hit a natural stop point or a provided stop sequence, + `length` if the maximum number of tokens specified in the request was reached, + or `content_filter` if content was omitted due to a flag from our content filters. + enum: ["stop", "length", "content_filter"] + index: + type: integer + logprobs: + type: object + nullable: true + properties: + text_offset: + type: array + items: + type: integer + token_logprobs: + type: array + items: + type: number + tokens: + type: array + items: + type: string + top_logprobs: + type: array + items: + type: object + additionalProperties: + type: number + text: + type: string + created: + type: integer + description: The Unix timestamp (in seconds) of when the completion was created. + model: + type: string + description: The model used for completion. + system_fingerprint: + type: string + description: | + This fingerprint represents the backend configuration that the model runs with. + + Can be used in conjunction with the `seed` request parameter to understand when backend changes have been made that might impact determinism. 
+ object: + type: string + description: The object type, which is always "text_completion" + enum: [text_completion] + usage: + $ref: "#/components/schemas/CompletionUsage" + required: + - id + - object + - created + - model + - choices + x-oaiMeta: + name: The completion object + legacy: true + example: | + { + "id": "cmpl-uqkvlQyYK7bGYrRHQ0eXlWi7", + "object": "text_completion", + "created": 1589478378, + "model": "gpt-4-turbo", + "choices": [ + { + "text": "\n\nThis is indeed a test", + "index": 0, + "logprobs": null, + "finish_reason": "length" + } + ], + "usage": { + "prompt_tokens": 5, + "completion_tokens": 7, + "total_tokens": 12 + } + } + + ChatCompletionRequestMessageContentPartText: + type: object + title: Text content part + properties: + type: + type: string + enum: ["text"] + description: The type of the content part. + text: + type: string + description: The text content. + required: + - type + - text + + ChatCompletionRequestMessageContentPartImage: + type: object + title: Image content part + properties: + type: + type: string + enum: ["image_url"] + description: The type of the content part. + image_url: + type: object + properties: + url: + type: string + description: Either a URL of the image or the base64 encoded image data. + format: uri + detail: + type: string + description: Specifies the detail level of the image. Learn more in the [Vision guide](/docs/guides/vision/low-or-high-fidelity-image-understanding). + enum: ["auto", "low", "high"] + default: "auto" + required: + - url + required: + - type + - image_url + + ChatCompletionRequestMessageContentPartRefusal: + type: object + title: Refusal content part + properties: + type: + type: string + enum: ["refusal"] + description: The type of the content part. + refusal: + type: string + description: The refusal message generated by the model. 
+ required: + - type + - refusal + + ChatCompletionRequestMessage: + oneOf: + - $ref: "#/components/schemas/ChatCompletionRequestSystemMessage" + - $ref: "#/components/schemas/ChatCompletionRequestUserMessage" + - $ref: "#/components/schemas/ChatCompletionRequestAssistantMessage" + - $ref: "#/components/schemas/ChatCompletionRequestToolMessage" + - $ref: "#/components/schemas/ChatCompletionRequestFunctionMessage" + x-oaiExpandable: true + + ChatCompletionRequestSystemMessageContentPart: + oneOf: + - $ref: "#/components/schemas/ChatCompletionRequestMessageContentPartText" + x-oaiExpandable: true + + ChatCompletionRequestUserMessageContentPart: + oneOf: + - $ref: "#/components/schemas/ChatCompletionRequestMessageContentPartText" + - $ref: "#/components/schemas/ChatCompletionRequestMessageContentPartImage" + x-oaiExpandable: true + + ChatCompletionRequestAssistantMessageContentPart: + oneOf: + - $ref: "#/components/schemas/ChatCompletionRequestMessageContentPartText" + - $ref: "#/components/schemas/ChatCompletionRequestMessageContentPartRefusal" + x-oaiExpandable: true + + ChatCompletionRequestToolMessageContentPart: + oneOf: + - $ref: "#/components/schemas/ChatCompletionRequestMessageContentPartText" + x-oaiExpandable: true + + ChatCompletionRequestSystemMessage: + type: object + title: System message + properties: + content: + description: The contents of the system message. + oneOf: + - type: string + description: The contents of the system message. + title: Text content + - type: array + description: An array of content parts with a defined type. For system messages, only type `text` is supported. + title: Array of content parts + items: + $ref: "#/components/schemas/ChatCompletionRequestSystemMessageContentPart" + minItems: 1 + role: + type: string + enum: ["system"] + description: The role of the messages author, in this case `system`. + name: + type: string + description: An optional name for the participant. 
Provides the model information to differentiate between participants of the same role. + required: + - content + - role + + ChatCompletionRequestUserMessage: + type: object + title: User message + properties: + content: + description: | + The contents of the user message. + oneOf: + - type: string + description: The text contents of the message. + title: Text content + - type: array + description: An array of content parts with a defined type, each can be of type `text` or `image_url` when passing in images. You can pass multiple images by adding multiple `image_url` content parts. Image input is only supported when using the `gpt-4o` model. + title: Array of content parts + items: + $ref: "#/components/schemas/ChatCompletionRequestUserMessageContentPart" + minItems: 1 + x-oaiExpandable: true + role: + type: string + enum: ["user"] + description: The role of the messages author, in this case `user`. + name: + type: string + description: An optional name for the participant. Provides the model information to differentiate between participants of the same role. + required: + - content + - role + + ChatCompletionRequestAssistantMessage: + type: object + title: Assistant message + properties: + content: + nullable: true + oneOf: + - type: string + description: The contents of the assistant message. + title: Text content + - type: array + description: An array of content parts with a defined type. Can be one or more of type `text`, or exactly one of type `refusal`. + title: Array of content parts + items: + $ref: "#/components/schemas/ChatCompletionRequestAssistantMessageContentPart" + minItems: 1 + description: | + The contents of the assistant message. Required unless `tool_calls` or `function_call` is specified. + refusal: + nullable: true + type: string + description: The refusal message by the assistant. + role: + type: string + enum: ["assistant"] + description: The role of the messages author, in this case `assistant`. 
+ name: + type: string + description: An optional name for the participant. Provides the model information to differentiate between participants of the same role. + tool_calls: + $ref: "#/components/schemas/ChatCompletionMessageToolCalls" + function_call: + type: object + deprecated: true + description: "Deprecated and replaced by `tool_calls`. The name and arguments of a function that should be called, as generated by the model." + nullable: true + properties: + arguments: + type: string + description: The arguments to call the function with, as generated by the model in JSON format. Note that the model does not always generate valid JSON, and may hallucinate parameters not defined by your function schema. Validate the arguments in your code before calling your function. + name: + type: string + description: The name of the function to call. + required: + - arguments + - name + required: + - role + + FineTuneChatCompletionRequestAssistantMessage: + allOf: + - type: object + title: Assistant message + deprecated: false + properties: + weight: + type: integer + enum: [0, 1] + description: "Controls whether the assistant message is trained against (0 or 1)" + - $ref: "#/components/schemas/ChatCompletionRequestAssistantMessage" + required: + - role + + ChatCompletionRequestToolMessage: + type: object + title: Tool message + properties: + role: + type: string + enum: ["tool"] + description: The role of the messages author, in this case `tool`. + content: + oneOf: + - type: string + description: The contents of the tool message. + title: Text content + - type: array + description: An array of content parts with a defined type. For tool messages, only type `text` is supported. + title: Array of content parts + items: + $ref: "#/components/schemas/ChatCompletionRequestToolMessageContentPart" + minItems: 1 + description: The contents of the tool message. + tool_call_id: + type: string + description: Tool call that this message is responding to. 
required: + - role + - content + - tool_call_id + + ChatCompletionRequestFunctionMessage: + type: object + title: Function message + deprecated: true + properties: + role: + type: string + enum: ["function"] + description: The role of the messages author, in this case `function`. + content: + nullable: true + type: string + description: The contents of the function message. + name: + type: string + description: The name of the function to call. + required: + - role + - content + - name + + FunctionParameters: + type: object + description: "The parameters the function accepts, described as a JSON Schema object. See the [guide](/docs/guides/function-calling) for examples, and the [JSON Schema reference](https://json-schema.org/understanding-json-schema/) for documentation about the format. \n\nOmitting `parameters` defines a function with an empty parameter list." + additionalProperties: true + + ChatCompletionFunctions: + type: object + deprecated: true + properties: + description: + type: string + description: A description of what the function does, used by the model to choose when and how to call the function. + name: + type: string + description: The name of the function to be called. Must be a-z, A-Z, 0-9, or contain underscores and dashes, with a maximum length of 64. + parameters: + $ref: "#/components/schemas/FunctionParameters" + required: + - name + + ChatCompletionFunctionCallOption: + type: object + description: > + Specifying a particular function via `{"name": "my_function"}` forces the model to call that function. + properties: + name: + type: string + description: The name of the function to call. + required: + - name + + ChatCompletionTool: + type: object + properties: + type: + type: string + enum: ["function"] + description: The type of the tool. Currently, only `function` is supported.
+ function: + $ref: "#/components/schemas/FunctionObject" + required: + - type + - function + + FunctionObject: + type: object + properties: + description: + type: string + description: A description of what the function does, used by the model to choose when and how to call the function. + name: + type: string + description: The name of the function to be called. Must be a-z, A-Z, 0-9, or contain underscores and dashes, with a maximum length of 64. + parameters: + $ref: "#/components/schemas/FunctionParameters" + strict: + type: boolean + nullable: true + default: false + description: Whether to enable strict schema adherence when generating the function call. If set to true, the model will follow the exact schema defined in the `parameters` field. Only a subset of JSON Schema is supported when `strict` is `true`. Learn more about Structured Outputs in the [function calling guide](docs/guides/function-calling). + required: + - name + + ResponseFormatText: + type: object + properties: + type: + type: string + description: "The type of response format being defined: `text`" + enum: ["text"] + required: + - type + + ResponseFormatJsonObject: + type: object + properties: + type: + type: string + description: "The type of response format being defined: `json_object`" + enum: ["json_object"] + required: + - type + + ResponseFormatJsonSchemaSchema: + type: object + description: "The schema for the response format, described as a JSON Schema object." + additionalProperties: true + + ResponseFormatJsonSchema: + type: object + properties: + type: + type: string + description: 'The type of response format being defined: `json_schema`' + enum: ['json_schema'] + json_schema: + type: object + properties: + description: + type: string + description: A description of what the response format is for, used by the model to determine how to respond in the format. + name: + type: string + description: The name of the response format. 
Must be a-z, A-Z, 0-9, or contain underscores and dashes, with a maximum length of 64. + schema: + $ref: '#/components/schemas/ResponseFormatJsonSchemaSchema' + strict: + type: boolean + nullable: true + default: false + description: Whether to enable strict schema adherence when generating the output. If set to true, the model will always follow the exact schema defined in the `schema` field. Only a subset of JSON Schema is supported when `strict` is `true`. To learn more, read the [Structured Outputs guide](/docs/guides/structured-outputs). + required: + - type + - name + required: + - type + - json_schema + + ChatCompletionToolChoiceOption: + description: | + Controls which (if any) tool is called by the model. + `none` means the model will not call any tool and instead generates a message. + `auto` means the model can pick between generating a message or calling one or more tools. + `required` means the model must call one or more tools. + Specifying a particular tool via `{"type": "function", "function": {"name": "my_function"}}` forces the model to call that tool. + + `none` is the default when no tools are present. `auto` is the default if tools are present. + oneOf: + - type: string + description: > + `none` means the model will not call any tool and instead generates a message. + `auto` means the model can pick between generating a message or calling one or more tools. + `required` means the model must call one or more tools. + enum: [none, auto, required] + - $ref: "#/components/schemas/ChatCompletionNamedToolChoice" + x-oaiExpandable: true + + ChatCompletionNamedToolChoice: + type: object + description: Specifies a tool the model should use. Use to force the model to call a specific function. + properties: + type: + type: string + enum: ["function"] + description: The type of the tool. Currently, only `function` is supported. + function: + type: object + properties: + name: + type: string + description: The name of the function to call. 
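+        # Illustrative sketch (not part of the schema): the string forms of
+        # `tool_choice` are "none", "auto", and "required"; forcing a specific
+        # (hypothetical) function instead looks like:
+        #   {"tool_choice": {"type": "function", "function": {"name": "my_function"}}}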
+ required: + - name + required: + - type + - function + + ParallelToolCalls: + description: Whether to enable [parallel function calling](/docs/guides/function-calling/parallel-function-calling) during tool use. + type: boolean + default: true + + ChatCompletionMessageToolCalls: + type: array + description: The tool calls generated by the model, such as function calls. + items: + $ref: "#/components/schemas/ChatCompletionMessageToolCall" + + ChatCompletionMessageToolCall: + type: object + properties: + # TODO: index included when streaming + id: + type: string + description: The ID of the tool call. + type: + type: string + enum: ["function"] + description: The type of the tool. Currently, only `function` is supported. + function: + type: object + description: The function that the model called. + properties: + name: + type: string + description: The name of the function to call. + arguments: + type: string + description: The arguments to call the function with, as generated by the model in JSON format. Note that the model does not always generate valid JSON, and may hallucinate parameters not defined by your function schema. Validate the arguments in your code before calling your function. + required: + - name + - arguments + required: + - id + - type + - function + + ChatCompletionMessageToolCallChunk: + type: object + properties: + index: + type: integer + id: + type: string + description: The ID of the tool call. + type: + type: string + enum: ["function"] + description: The type of the tool. Currently, only `function` is supported. + function: + type: object + properties: + name: + type: string + description: The name of the function to call. + arguments: + type: string + description: The arguments to call the function with, as generated by the model in JSON format. Note that the model does not always generate valid JSON, and may hallucinate parameters not defined by your function schema. Validate the arguments in your code before calling your function. 
+ required: + - index + + # Note, this isn't referenced anywhere, but is kept as a convenience to record all possible roles in one place. + ChatCompletionRole: + type: string + description: The role of the author of a message + enum: + - system + - user + - assistant + - tool + - function + + ChatCompletionStreamOptions: + description: | + Options for streaming response. Only set this when you set `stream: true`. + type: object + nullable: true + default: null + properties: + include_usage: + type: boolean + description: | + If set, an additional chunk will be streamed before the `data: [DONE]` message. The `usage` field on this chunk shows the token usage statistics for the entire request, and the `choices` field will always be an empty array. All other chunks will also include a `usage` field, but with a null value. + + ChatCompletionResponseMessage: + type: object + description: A chat completion message generated by the model. + properties: + content: + type: string + description: The contents of the message. + nullable: true + refusal: + type: string + description: The refusal message generated by the model. + nullable: true + tool_calls: + $ref: "#/components/schemas/ChatCompletionMessageToolCalls" + role: + type: string + enum: ["assistant"] + description: The role of the author of this message. + function_call: + type: object + deprecated: true + description: "Deprecated and replaced by `tool_calls`. The name and arguments of a function that should be called, as generated by the model." + properties: + arguments: + type: string + description: The arguments to call the function with, as generated by the model in JSON format. Note that the model does not always generate valid JSON, and may hallucinate parameters not defined by your function schema. Validate the arguments in your code before calling your function. + name: + type: string + description: The name of the function to call. 
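+        # Illustrative sketch (not part of the schema): requesting usage statistics
+        # on a streamed completion. The final chunk before `data: [DONE]` then
+        # carries a populated `usage` object with an empty `choices` array.
+        #   {"model": "gpt-4o", "messages": [...], "stream": true,
+        #    "stream_options": {"include_usage": true}}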
+ required: + - name + - arguments + required: + - role + - content + - refusal + + ChatCompletionStreamResponseDelta: + type: object + description: A chat completion delta generated by streamed model responses. + properties: + content: + type: string + description: The contents of the chunk message. + nullable: true + function_call: + deprecated: true + type: object + description: "Deprecated and replaced by `tool_calls`. The name and arguments of a function that should be called, as generated by the model." + properties: + arguments: + type: string + description: The arguments to call the function with, as generated by the model in JSON format. Note that the model does not always generate valid JSON, and may hallucinate parameters not defined by your function schema. Validate the arguments in your code before calling your function. + name: + type: string + description: The name of the function to call. + tool_calls: + type: array + items: + $ref: "#/components/schemas/ChatCompletionMessageToolCallChunk" + role: + type: string + enum: ["system", "user", "assistant", "tool"] + description: The role of the author of this message. + refusal: + type: string + description: The refusal message generated by the model. + nullable: true + + CreateChatCompletionRequest: + type: object + properties: + messages: + description: A list of messages comprising the conversation so far. [Example Python code](https://cookbook.openai.com/examples/how_to_format_inputs_to_chatgpt_models). + type: array + minItems: 1 + items: + $ref: "#/components/schemas/ChatCompletionRequestMessage" + model: + description: ID of the model to use. See the [model endpoint compatibility](/docs/models/model-endpoint-compatibility) table for details on which models work with the Chat API. 
+ example: "gpt-4o" + anyOf: + - type: string + - type: string + enum: + [ + "gpt-4o", + "gpt-4o-2024-05-13", + "gpt-4o-2024-08-06", + "chatgpt-4o-latest", + "gpt-4o-mini", + "gpt-4o-mini-2024-07-18", + "gpt-4-turbo", + "gpt-4-turbo-2024-04-09", + "gpt-4-0125-preview", + "gpt-4-turbo-preview", + "gpt-4-1106-preview", + "gpt-4-vision-preview", + "gpt-4", + "gpt-4-0314", + "gpt-4-0613", + "gpt-4-32k", + "gpt-4-32k-0314", + "gpt-4-32k-0613", + "gpt-3.5-turbo", + "gpt-3.5-turbo-16k", + "gpt-3.5-turbo-0301", + "gpt-3.5-turbo-0613", + "gpt-3.5-turbo-1106", + "gpt-3.5-turbo-0125", + "gpt-3.5-turbo-16k-0613", + ] + x-oaiTypeLabel: string + frequency_penalty: + type: number + default: 0 + minimum: -2 + maximum: 2 + nullable: true + description: *completions_frequency_penalty_description + logit_bias: + type: object + x-oaiTypeLabel: map + default: null + nullable: true + additionalProperties: + type: integer + description: | + Modify the likelihood of specified tokens appearing in the completion. + + Accepts a JSON object that maps tokens (specified by their token ID in the tokenizer) to an associated bias value from -100 to 100. Mathematically, the bias is added to the logits generated by the model prior to sampling. The exact effect will vary per model, but values between -1 and 1 should decrease or increase likelihood of selection; values like -100 or 100 should result in a ban or exclusive selection of the relevant token. + logprobs: + description: Whether to return log probabilities of the output tokens or not. If true, returns the log probabilities of each output token returned in the `content` of `message`. + type: boolean + default: false + nullable: true + top_logprobs: + description: An integer between 0 and 20 specifying the number of most likely tokens to return at each token position, each with an associated log probability. `logprobs` must be set to `true` if this parameter is used. 
+ type: integer + minimum: 0 + maximum: 20 + nullable: true + max_tokens: + description: | + The maximum number of [tokens](/tokenizer) that can be generated in the chat completion. + + The total length of input tokens and generated tokens is limited by the model's context length. [Example Python code](https://cookbook.openai.com/examples/how_to_count_tokens_with_tiktoken) for counting tokens. + type: integer + nullable: true + n: + type: integer + minimum: 1 + maximum: 128 + default: 1 + example: 1 + nullable: true + description: How many chat completion choices to generate for each input message. Note that you will be charged based on the number of generated tokens across all of the choices. Keep `n` as `1` to minimize costs. + presence_penalty: + type: number + default: 0 + minimum: -2 + maximum: 2 + nullable: true + description: *completions_presence_penalty_description + response_format: + description: | + An object specifying the format that the model must output. Compatible with [GPT-4o](/docs/models/gpt-4o), [GPT-4o mini](/docs/models/gpt-4o-mini), [GPT-4 Turbo](/docs/models/gpt-4-and-gpt-4-turbo) and all GPT-3.5 Turbo models newer than `gpt-3.5-turbo-1106`. + + Setting to `{ "type": "json_schema", "json_schema": {...} }` enables Structured Outputs which guarantees the model will match your supplied JSON schema. Learn more in the [Structured Outputs guide](/docs/guides/structured-outputs). + + Setting to `{ "type": "json_object" }` enables JSON mode, which guarantees the message the model generates is valid JSON. + + **Important:** when using JSON mode, you **must** also instruct the model to produce JSON yourself via a system or user message. Without this, the model may generate an unending stream of whitespace until the generation reaches the token limit, resulting in a long-running and seemingly "stuck" request. 
Also note that the message content may be partially cut off if `finish_reason="length"`, which indicates the generation exceeded `max_tokens` or the conversation exceeded the max context length. + oneOf: + - $ref: "#/components/schemas/ResponseFormatText" + - $ref: "#/components/schemas/ResponseFormatJsonObject" + - $ref: "#/components/schemas/ResponseFormatJsonSchema" + x-oaiExpandable: true + seed: + type: integer + minimum: -9223372036854775808 + maximum: 9223372036854775807 + nullable: true + description: | + This feature is in Beta. + If specified, our system will make a best effort to sample deterministically, such that repeated requests with the same `seed` and parameters should return the same result. + Determinism is not guaranteed, and you should refer to the `system_fingerprint` response parameter to monitor changes in the backend. + x-oaiMeta: + beta: true + service_tier: + description: | + Specifies the latency tier to use for processing the request. This parameter is relevant for customers subscribed to the scale tier service: + - If set to 'auto', the system will utilize scale tier credits until they are exhausted. + - If set to 'default', the request will be processed using the default service tier with a lower uptime SLA and no latency guarantee. + - When not set, the default behavior is 'auto'. + + When this parameter is set, the response body will include the `service_tier` utilized. + type: string + enum: ["auto", "default"] + nullable: true + default: null + stop: + description: | + Up to 4 sequences where the API will stop generating further tokens. + default: null + oneOf: + - type: string + nullable: true + - type: array + minItems: 1 + maxItems: 4 + items: + type: string + stream: + description: > + If set, partial message deltas will be sent, like in ChatGPT.
Tokens will be sent as data-only [server-sent events](https://developer.mozilla.org/en-US/docs/Web/API/Server-sent_events/Using_server-sent_events#Event_stream_format) + as they become available, with the stream terminated by a `data: [DONE]` message. [Example Python code](https://cookbook.openai.com/examples/how_to_stream_completions). + type: boolean + nullable: true + default: false + stream_options: + $ref: "#/components/schemas/ChatCompletionStreamOptions" + temperature: + type: number + minimum: 0 + maximum: 2 + default: 1 + example: 1 + nullable: true + description: *completions_temperature_description + top_p: + type: number + minimum: 0 + maximum: 1 + default: 1 + example: 1 + nullable: true + description: *completions_top_p_description + tools: + type: array + description: > + A list of tools the model may call. Currently, only functions are supported as a tool. + Use this to provide a list of functions the model may generate JSON inputs for. A max of 128 functions are supported. + items: + $ref: "#/components/schemas/ChatCompletionTool" + tool_choice: + $ref: "#/components/schemas/ChatCompletionToolChoiceOption" + parallel_tool_calls: + $ref: "#/components/schemas/ParallelToolCalls" + user: *end_user_param_configuration + function_call: + deprecated: true + description: | + Deprecated in favor of `tool_choice`. + + Controls which (if any) function is called by the model. + `none` means the model will not call a function and instead generates a message. + `auto` means the model can pick between generating a message or calling a function. + Specifying a particular function via `{"name": "my_function"}` forces the model to call that function. + + `none` is the default when no functions are present. `auto` is the default if functions are present. + oneOf: + - type: string + description: > + `none` means the model will not call a function and instead generates a message. + `auto` means the model can pick between generating a message or calling a function. 
enum: [none, auto] + - $ref: "#/components/schemas/ChatCompletionFunctionCallOption" + x-oaiExpandable: true + functions: + deprecated: true + description: | + Deprecated in favor of `tools`. + + A list of functions the model may generate JSON inputs for. + type: array + minItems: 1 + maxItems: 128 + items: + $ref: "#/components/schemas/ChatCompletionFunctions" + + required: + - model + - messages + + CreateChatCompletionResponse: + type: object + description: Represents a chat completion response returned by the model, based on the provided input. + properties: + id: + type: string + description: A unique identifier for the chat completion. + choices: + type: array + description: A list of chat completion choices. Can be more than one if `n` is greater than 1. + items: + type: object + required: + - finish_reason + - index + - message + - logprobs + properties: + finish_reason: + type: string + description: &chat_completion_finish_reason_description | + The reason the model stopped generating tokens. This will be `stop` if the model hit a natural stop point or a provided stop sequence, + `length` if the maximum number of tokens specified in the request was reached, + `content_filter` if content was omitted due to a flag from our content filters, + `tool_calls` if the model called a tool, or `function_call` (deprecated) if the model called a function. + enum: + [ + "stop", + "length", + "tool_calls", + "content_filter", + "function_call", + ] + index: + type: integer + description: The index of the choice in the list of choices. + message: + $ref: "#/components/schemas/ChatCompletionResponseMessage" + logprobs: &chat_completion_response_logprobs + description: Log probability information for the choice. + type: object + nullable: true + properties: + content: + description: A list of message content tokens with log probability information.
type: array + items: + $ref: "#/components/schemas/ChatCompletionTokenLogprob" + nullable: true + refusal: + description: A list of message refusal tokens with log probability information. + type: array + items: + $ref: "#/components/schemas/ChatCompletionTokenLogprob" + nullable: true + required: + - content + - refusal + + created: + type: integer + description: The Unix timestamp (in seconds) of when the chat completion was created. + model: + type: string + description: The model used for the chat completion. + service_tier: + description: The service tier used for processing the request. This field is only included if the `service_tier` parameter is specified in the request. + type: string + enum: ["scale", "default"] + example: "scale" + nullable: true + system_fingerprint: + type: string + description: | + This fingerprint represents the backend configuration that the model runs with. + + Can be used in conjunction with the `seed` request parameter to understand when backend changes have been made that might impact determinism. + object: + type: string + description: The object type, which is always `chat.completion`. + enum: [chat.completion] + usage: + $ref: "#/components/schemas/CompletionUsage" + required: + - choices + - created + - id + - model + - object + x-oaiMeta: + name: The chat completion object + group: chat + example: *chat_completion_example + + CreateChatCompletionFunctionResponse: + type: object + description: Represents a chat completion response returned by the model, based on the provided input. + properties: + id: + type: string + description: A unique identifier for the chat completion. + choices: + type: array + description: A list of chat completion choices. Can be more than one if `n` is greater than 1.
+ items: + type: object + required: + - finish_reason + - index + - message + - logprobs + properties: + finish_reason: + type: string + description: + &chat_completion_function_finish_reason_description | + The reason the model stopped generating tokens. This will be `stop` if the model hit a natural stop point or a provided stop sequence, `length` if the maximum number of tokens specified in the request was reached, `content_filter` if content was omitted due to a flag from our content filters, or `function_call` if the model called a function. + enum: + ["stop", "length", "function_call", "content_filter"] + index: + type: integer + description: The index of the choice in the list of choices. + message: + $ref: "#/components/schemas/ChatCompletionResponseMessage" + created: + type: integer + description: The Unix timestamp (in seconds) of when the chat completion was created. + model: + type: string + description: The model used for the chat completion. + system_fingerprint: + type: string + description: | + This fingerprint represents the backend configuration that the model runs with. + + Can be used in conjunction with the `seed` request parameter to understand when backend changes have been made that might impact determinism. + object: + type: string + description: The object type, which is always `chat.completion`. + enum: [chat.completion] + usage: + $ref: "#/components/schemas/CompletionUsage" + required: + - choices + - created + - id + - model + - object + x-oaiMeta: + name: The chat completion object + group: chat + example: *chat_completion_function_example + + ChatCompletionTokenLogprob: + type: object + properties: + token: &chat_completion_response_logprobs_token + description: The token. + type: string + logprob: &chat_completion_response_logprobs_token_logprob + description: The log probability of this token, if it is within the top 20 most likely tokens. Otherwise, the value `-9999.0` is used to signify that the token is very unlikely. 
type: number + bytes: &chat_completion_response_logprobs_bytes + description: A list of integers representing the UTF-8 bytes representation of the token. Useful in instances where characters are represented by multiple tokens and their byte representations must be combined to generate the correct text representation. Can be `null` if there is no bytes representation for the token. + type: array + items: + type: integer + nullable: true + top_logprobs: + description: List of the most likely tokens and their log probability, at this token position. In rare cases, there may be fewer than the number of requested `top_logprobs` returned. + type: array + items: + type: object + properties: + token: *chat_completion_response_logprobs_token + logprob: *chat_completion_response_logprobs_token_logprob + bytes: *chat_completion_response_logprobs_bytes + required: + - token + - logprob + - bytes + required: + - token + - logprob + - bytes + - top_logprobs + + ListPaginatedFineTuningJobsResponse: + type: object + properties: + data: + type: array + items: + $ref: "#/components/schemas/FineTuningJob" + has_more: + type: boolean + object: + type: string + enum: [list] + required: + - object + - data + - has_more + + CreateChatCompletionStreamResponse: + type: object + description: Represents a streamed chunk of a chat completion response returned by the model, based on the provided input. + properties: + id: + type: string + description: A unique identifier for the chat completion. Each chunk has the same ID. + choices: + type: array + description: | + A list of chat completion choices. Can contain more than one element if `n` is greater than 1. Can also be empty for the + last chunk if you set `stream_options: {"include_usage": true}`.
+ items: + type: object + required: + - delta + - finish_reason + - index + properties: + delta: + $ref: "#/components/schemas/ChatCompletionStreamResponseDelta" + logprobs: *chat_completion_response_logprobs + finish_reason: + type: string + description: *chat_completion_finish_reason_description + enum: + [ + "stop", + "length", + "tool_calls", + "content_filter", + "function_call", + ] + nullable: true + index: + type: integer + description: The index of the choice in the list of choices. + created: + type: integer + description: The Unix timestamp (in seconds) of when the chat completion was created. Each chunk has the same timestamp. + model: + type: string + description: The model used to generate the completion. + service_tier: + description: The service tier used for processing the request. This field is only included if the `service_tier` parameter is specified in the request. + type: string + enum: ["scale", "default"] + example: "scale" + nullable: true + system_fingerprint: + type: string + description: | + This fingerprint represents the backend configuration that the model runs with. + Can be used in conjunction with the `seed` request parameter to understand when backend changes have been made that might impact determinism. + object: + type: string + description: The object type, which is always `chat.completion.chunk`. + enum: [chat.completion.chunk] + usage: + type: object + description: | + An optional field that will only be present when you set `stream_options: {"include_usage": true}` in your request. + When present, it contains a null value except for the last chunk which contains the token usage statistics for the entire request. + properties: + completion_tokens: + type: integer + description: Number of tokens in the generated completion. + prompt_tokens: + type: integer + description: Number of tokens in the prompt. + total_tokens: + type: integer + description: Total number of tokens used in the request (prompt + completion).
+ required: + - prompt_tokens + - completion_tokens + - total_tokens + required: + - choices + - created + - id + - model + - object + x-oaiMeta: + name: The chat completion chunk object + group: chat + example: *chat_completion_chunk_example + + CreateChatCompletionImageResponse: + type: object + description: Represents a streamed chunk of a chat completion response returned by model, based on the provided input. + x-oaiMeta: + name: The chat completion chunk object + group: chat + example: *chat_completion_image_example + + CreateImageRequest: + type: object + properties: + prompt: + description: A text description of the desired image(s). The maximum length is 1000 characters for `dall-e-2` and 4000 characters for `dall-e-3`. + type: string + example: "A cute baby sea otter" + model: + anyOf: + - type: string + - type: string + enum: ["dall-e-2", "dall-e-3"] + x-oaiTypeLabel: string + default: "dall-e-2" + example: "dall-e-3" + nullable: true + description: The model to use for image generation. + n: &images_n + type: integer + minimum: 1 + maximum: 10 + default: 1 + example: 1 + nullable: true + description: The number of images to generate. Must be between 1 and 10. For `dall-e-3`, only `n=1` is supported. + quality: + type: string + enum: ["standard", "hd"] + default: "standard" + example: "standard" + description: The quality of the image that will be generated. `hd` creates images with finer details and greater consistency across the image. This param is only supported for `dall-e-3`. + response_format: &images_response_format + type: string + enum: ["url", "b64_json"] + default: "url" + example: "url" + nullable: true + description: The format in which the generated images are returned. Must be one of `url` or `b64_json`. URLs are only valid for 60 minutes after the image has been generated. 
+ size: &images_size + type: string + enum: ["256x256", "512x512", "1024x1024", "1792x1024", "1024x1792"] + default: "1024x1024" + example: "1024x1024" + nullable: true + description: The size of the generated images. Must be one of `256x256`, `512x512`, or `1024x1024` for `dall-e-2`. Must be one of `1024x1024`, `1792x1024`, or `1024x1792` for `dall-e-3` models. + style: + type: string + enum: ["vivid", "natural"] + default: "vivid" + example: "vivid" + nullable: true + description: The style of the generated images. Must be one of `vivid` or `natural`. Vivid causes the model to lean towards generating hyper-real and dramatic images. Natural causes the model to produce more natural, less hyper-real looking images. This param is only supported for `dall-e-3`. + user: *end_user_param_configuration + required: + - prompt + + ImagesResponse: + properties: + created: + type: integer + data: + type: array + items: + $ref: "#/components/schemas/Image" + required: + - created + - data + + Image: + type: object + description: Represents the url or the content of an image generated by the OpenAI API. + properties: + b64_json: + type: string + description: The base64-encoded JSON of the generated image, if `response_format` is `b64_json`. + url: + type: string + description: The URL of the generated image, if `response_format` is `url` (default). + revised_prompt: + type: string + description: The prompt that was used to generate the image, if there was any revision to the prompt. + x-oaiMeta: + name: The image object + example: | + { + "url": "...", + "revised_prompt": "..." + } + + CreateImageEditRequest: + type: object + properties: + image: + description: The image to edit. Must be a valid PNG file, less than 4MB, and square. If mask is not provided, image must have transparency, which will be used as the mask. + type: string + format: binary + prompt: + description: A text description of the desired image(s). The maximum length is 1000 characters. 
+ type: string + example: "A cute baby sea otter wearing a beret" + mask: + description: An additional image whose fully transparent areas (e.g. where alpha is zero) indicate where `image` should be edited. Must be a valid PNG file, less than 4MB, and have the same dimensions as `image`. + type: string + format: binary + model: + anyOf: + - type: string + - type: string + enum: ["dall-e-2"] + x-oaiTypeLabel: string + default: "dall-e-2" + example: "dall-e-2" + nullable: true + description: The model to use for image generation. Only `dall-e-2` is supported at this time. + n: + type: integer + minimum: 1 + maximum: 10 + default: 1 + example: 1 + nullable: true + description: The number of images to generate. Must be between 1 and 10. + size: &dalle2_images_size + type: string + enum: ["256x256", "512x512", "1024x1024"] + default: "1024x1024" + example: "1024x1024" + nullable: true + description: The size of the generated images. Must be one of `256x256`, `512x512`, or `1024x1024`. + response_format: *images_response_format + user: *end_user_param_configuration + required: + - prompt + - image + + CreateImageVariationRequest: + type: object + properties: + image: + description: The image to use as the basis for the variation(s). Must be a valid PNG file, less than 4MB, and square. + type: string + format: binary + model: + anyOf: + - type: string + - type: string + enum: ["dall-e-2"] + x-oaiTypeLabel: string + default: "dall-e-2" + example: "dall-e-2" + nullable: true + description: The model to use for image generation. Only `dall-e-2` is supported at this time. + n: *images_n + response_format: *images_response_format + size: *dalle2_images_size + user: *end_user_param_configuration + required: + - image + + CreateModerationRequest: + type: object + properties: + input: + description: The input text to classify + oneOf: + - type: string + default: "" + example: "I want to kill them." 
+ - type: array + items: + type: string + default: "" + example: "I want to kill them." + model: + description: | + Two content moderation models are available: `text-moderation-stable` and `text-moderation-latest`. + + The default is `text-moderation-latest` which will be automatically upgraded over time. This ensures you are always using our most accurate model. If you use `text-moderation-stable`, we will provide advance notice before updating the model. Accuracy of `text-moderation-stable` may be slightly lower than for `text-moderation-latest`. + nullable: false + default: "text-moderation-latest" + example: "text-moderation-stable" + anyOf: + - type: string + - type: string + enum: ["text-moderation-latest", "text-moderation-stable"] + x-oaiTypeLabel: string + required: + - input + + CreateModerationResponse: + type: object + description: Represents whether a given text input is potentially harmful. + properties: + id: + type: string + description: The unique identifier for the moderation request. + model: + type: string + description: The model used to generate the moderation results. + results: + type: array + description: A list of moderation objects. + items: + type: object + properties: + flagged: + type: boolean + description: Whether any of the below categories are flagged. + categories: + type: object + description: A list of the categories, and whether they are flagged or not. + properties: + hate: + type: boolean + description: Content that expresses, incites, or promotes hate based on race, gender, ethnicity, religion, nationality, sexual orientation, disability status, or caste. Hateful content aimed at non-protected groups (e.g., chess players) is harassment. + hate/threatening: + type: boolean + description: Hateful content that also includes violence or serious harm towards the targeted group based on race, gender, ethnicity, religion, nationality, sexual orientation, disability status, or caste.
+ harassment: + type: boolean + description: Content that expresses, incites, or promotes harassing language towards any target. + harassment/threatening: + type: boolean + description: Harassment content that also includes violence or serious harm towards any target. + self-harm: + type: boolean + description: Content that promotes, encourages, or depicts acts of self-harm, such as suicide, cutting, and eating disorders. + self-harm/intent: + type: boolean + description: Content where the speaker expresses that they are engaging or intend to engage in acts of self-harm, such as suicide, cutting, and eating disorders. + self-harm/instructions: + type: boolean + description: Content that encourages performing acts of self-harm, such as suicide, cutting, and eating disorders, or that gives instructions or advice on how to commit such acts. + sexual: + type: boolean + description: Content meant to arouse sexual excitement, such as the description of sexual activity, or that promotes sexual services (excluding sex education and wellness). + sexual/minors: + type: boolean + description: Sexual content that includes an individual who is under 18 years old. + violence: + type: boolean + description: Content that depicts death, violence, or physical injury. + violence/graphic: + type: boolean + description: Content that depicts death, violence, or physical injury in graphic detail. + required: + - hate + - hate/threatening + - harassment + - harassment/threatening + - self-harm + - self-harm/intent + - self-harm/instructions + - sexual + - sexual/minors + - violence + - violence/graphic + category_scores: + type: object + description: A list of the categories along with their scores as predicted by the model. + properties: + hate: + type: number + description: The score for the category 'hate'. + hate/threatening: + type: number + description: The score for the category 'hate/threatening'. + harassment: + type: number + description: The score for the category 'harassment'.
+ harassment/threatening: + type: number + description: The score for the category 'harassment/threatening'. + self-harm: + type: number + description: The score for the category 'self-harm'. + self-harm/intent: + type: number + description: The score for the category 'self-harm/intent'. + self-harm/instructions: + type: number + description: The score for the category 'self-harm/instructions'. + sexual: + type: number + description: The score for the category 'sexual'. + sexual/minors: + type: number + description: The score for the category 'sexual/minors'. + violence: + type: number + description: The score for the category 'violence'. + violence/graphic: + type: number + description: The score for the category 'violence/graphic'. + required: + - hate + - hate/threatening + - harassment + - harassment/threatening + - self-harm + - self-harm/intent + - self-harm/instructions + - sexual + - sexual/minors + - violence + - violence/graphic + required: + - flagged + - categories + - category_scores + required: + - id + - model + - results + x-oaiMeta: + name: The moderation object + example: *moderation_example + + ListFilesResponse: + type: object + properties: + data: + type: array + items: + $ref: "#/components/schemas/OpenAIFile" + object: + type: string + enum: [list] + required: + - object + - data + + CreateFileRequest: + type: object + additionalProperties: false + properties: + file: + description: | + The File object (not file name) to be uploaded. + type: string + format: binary + purpose: + description: | + The intended purpose of the uploaded file. + + Use "assistants" for [Assistants](/docs/api-reference/assistants) and [Message](/docs/api-reference/messages) files, "vision" for Assistants image file inputs, "batch" for [Batch API](/docs/guides/batch), and "fine-tune" for [Fine-tuning](/docs/api-reference/fine-tuning). 
+ type: string + enum: ["assistants", "batch", "fine-tune", "vision"] + required: + - file + - purpose + + DeleteFileResponse: + type: object + properties: + id: + type: string + object: + type: string + enum: [file] + deleted: + type: boolean + required: + - id + - object + - deleted + + CreateUploadRequest: + type: object + additionalProperties: false + properties: + filename: + description: | + The name of the file to upload. + type: string + purpose: + description: | + The intended purpose of the uploaded file. + + See the [documentation on File purposes](/docs/api-reference/files/create#files-create-purpose). + type: string + enum: ["assistants", "batch", "fine-tune", "vision"] + bytes: + description: | + The number of bytes in the file you are uploading. + type: integer + mime_type: + description: | + The MIME type of the file. + + This must fall within the supported MIME types for your file purpose. See the supported MIME types for assistants and vision. + type: string + required: + - filename + - purpose + - bytes + - mime_type + + AddUploadPartRequest: + type: object + additionalProperties: false + properties: + data: + description: | + The chunk of bytes for this Part. + type: string + format: binary + required: + - data + + CompleteUploadRequest: + type: object + additionalProperties: false + properties: + part_ids: + type: array + description: | + The ordered list of Part IDs. + items: + type: string + md5: + description: | + The optional md5 checksum for the file contents to verify if the bytes uploaded match what you expect. + type: string + required: + - part_ids + + CancelUploadRequest: + type: object + additionalProperties: false + + CreateFineTuningJobRequest: + type: object + properties: + model: + description: | + The name of the model to fine-tune. You can select one of the + [supported models](/docs/guides/fine-tuning/which-models-can-be-fine-tuned).
+ example: "gpt-4o-mini" + anyOf: + - type: string + - type: string + enum: ["babbage-002", "davinci-002", "gpt-3.5-turbo", "gpt-4o-mini"] + x-oaiTypeLabel: string + training_file: + description: | + The ID of an uploaded file that contains training data. + + See [upload file](/docs/api-reference/files/create) for how to upload a file. + + Your dataset must be formatted as a JSONL file. Additionally, you must upload your file with the purpose `fine-tune`. + + The contents of the file should differ depending on if the model uses the [chat](/docs/api-reference/fine-tuning/chat-input) or [completions](/docs/api-reference/fine-tuning/completions-input) format. + + See the [fine-tuning guide](/docs/guides/fine-tuning) for more details. + type: string + example: "file-abc123" + hyperparameters: + type: object + description: The hyperparameters used for the fine-tuning job. + properties: + batch_size: + description: | + Number of examples in each batch. A larger batch size means that model parameters + are updated less frequently, but with lower variance. + oneOf: + - type: string + enum: [auto] + - type: integer + minimum: 1 + maximum: 256 + default: auto + learning_rate_multiplier: + description: | + Scaling factor for the learning rate. A smaller learning rate may be useful to avoid + overfitting. + oneOf: + - type: string + enum: [auto] + - type: number + minimum: 0 + exclusiveMinimum: true + default: auto + n_epochs: + description: | + The number of epochs to train the model for. An epoch refers to one full cycle + through the training dataset. + oneOf: + - type: string + enum: [auto] + - type: integer + minimum: 1 + maximum: 50 + default: auto + suffix: + description: | + A string of up to 18 characters that will be added to your fine-tuned model name. + + For example, a `suffix` of "custom-model-name" would produce a model name like `ft:gpt-4o-mini:openai:custom-model-name:7p4lURel`. 
+ type: string + minLength: 1 + maxLength: 40 + default: null + nullable: true + validation_file: + description: | + The ID of an uploaded file that contains validation data. + + If you provide this file, the data is used to generate validation + metrics periodically during fine-tuning. These metrics can be viewed in + the fine-tuning results file. + The same data should not be present in both train and validation files. + + Your dataset must be formatted as a JSONL file. You must upload your file with the purpose `fine-tune`. + + See the [fine-tuning guide](/docs/guides/fine-tuning) for more details. + type: string + nullable: true + example: "file-abc123" + integrations: + type: array + description: A list of integrations to enable for your fine-tuning job. + nullable: true + items: + type: object + required: + - type + - wandb + properties: + type: + description: | + The type of integration to enable. Currently, only "wandb" (Weights and Biases) is supported. + oneOf: + - type: string + enum: [wandb] + wandb: + type: object + description: | + The settings for your integration with Weights and Biases. This payload specifies the project that + metrics will be sent to. Optionally, you can set an explicit display name for your run, add tags + to your run, and set a default entity (team, username, etc) to be associated with your run. + required: + - project + properties: + project: + description: | + The name of the project that the new run will be created under. + type: string + example: "my-wandb-project" + name: + description: | + A display name to set for the run. If not set, we will use the Job ID as the name. + nullable: true + type: string + entity: + description: | + The entity to use for the run. This allows you to set the team or username of the WandB user that you would + like associated with the run. If not set, the default entity for the registered WandB API key is used. 
+ nullable: true + type: string + tags: + description: | + A list of tags to be attached to the newly created run. These tags are passed through directly to WandB. Some + default tags are generated by OpenAI: "openai/finetune", "openai/{base-model}", "openai/{ftjob-abcdef}". + type: array + items: + type: string + example: "custom-tag" + + seed: + description: | + The seed controls the reproducibility of the job. Passing in the same seed and job parameters should produce the same results, but may differ in rare cases. + If a seed is not specified, one will be generated for you. + type: integer + nullable: true + minimum: 0 + maximum: 2147483647 + example: 42 + required: + - model + - training_file + + ListFineTuningJobEventsResponse: + type: object + properties: + data: + type: array + items: + $ref: "#/components/schemas/FineTuningJobEvent" + object: + type: string + enum: [list] + required: + - object + - data + + ListFineTuningJobCheckpointsResponse: + type: object + properties: + data: + type: array + items: + $ref: "#/components/schemas/FineTuningJobCheckpoint" + object: + type: string + enum: [list] + first_id: + type: string + nullable: true + last_id: + type: string + nullable: true + has_more: + type: boolean + required: + - object + - data + - has_more + + CreateEmbeddingRequest: + type: object + additionalProperties: false + properties: + input: + description: | + Input text to embed, encoded as a string or array of tokens. To embed multiple inputs in a single request, pass an array of strings or array of token arrays. The input must not exceed the max input tokens for the model (8192 tokens for `text-embedding-ada-002`), cannot be an empty string, and any array must be 2048 dimensions or less. [Example Python code](https://cookbook.openai.com/examples/how_to_count_tokens_with_tiktoken) for counting tokens. 
+ example: "The quick brown fox jumped over the lazy dog" + oneOf: + - type: string + title: string + description: The string that will be turned into an embedding. + default: "" + example: "This is a test." + - type: array + title: array + description: The array of strings that will be turned into an embedding. + minItems: 1 + maxItems: 2048 + items: + type: string + default: "" + example: "['This is a test.']" + - type: array + title: array + description: The array of integers that will be turned into an embedding. + minItems: 1 + maxItems: 2048 + items: + type: integer + example: "[1212, 318, 257, 1332, 13]" + - type: array + title: array + description: The array of arrays containing integers that will be turned into an embedding. + minItems: 1 + maxItems: 2048 + items: + type: array + minItems: 1 + items: + type: integer + example: "[[1212, 318, 257, 1332, 13]]" + x-oaiExpandable: true + model: + description: *model_description + example: "text-embedding-3-small" + anyOf: + - type: string + - type: string + enum: + [ + "text-embedding-ada-002", + "text-embedding-3-small", + "text-embedding-3-large", + ] + x-oaiTypeLabel: string + encoding_format: + description: "The format to return the embeddings in. Can be either `float` or [`base64`](https://pypi.org/project/pybase64/)." + example: "float" + default: "float" + type: string + enum: ["float", "base64"] + dimensions: + description: | + The number of dimensions the resulting output embeddings should have. Only supported in `text-embedding-3` and later models. + type: integer + minimum: 1 + user: *end_user_param_configuration + required: + - model + - input + + CreateEmbeddingResponse: + type: object + properties: + data: + type: array + description: The list of embeddings generated by the model. + items: + $ref: "#/components/schemas/Embedding" + model: + type: string + description: The name of the model used to generate the embedding. 
+ object: + type: string + description: The object type, which is always "list". + enum: [list] + usage: + type: object + description: The usage information for the request. + properties: + prompt_tokens: + type: integer + description: The number of tokens used by the prompt. + total_tokens: + type: integer + description: The total number of tokens used by the request. + required: + - prompt_tokens + - total_tokens + required: + - object + - model + - data + - usage + + CreateTranscriptionRequest: + type: object + additionalProperties: false + properties: + file: + description: | + The audio file object (not file name) to transcribe, in one of these formats: flac, mp3, mp4, mpeg, mpga, m4a, ogg, wav, or webm. + type: string + x-oaiTypeLabel: file + format: binary + model: + description: | + ID of the model to use. Only `whisper-1` (which is powered by our open source Whisper V2 model) is currently available. + example: whisper-1 + anyOf: + - type: string + - type: string + enum: ["whisper-1"] + x-oaiTypeLabel: string + language: + description: | + The language of the input audio. Supplying the input language in [ISO-639-1](https://en.wikipedia.org/wiki/List_of_ISO_639-1_codes) format will improve accuracy and latency. + type: string + prompt: + description: | + An optional text to guide the model's style or continue a previous audio segment. The [prompt](/docs/guides/speech-to-text/prompting) should match the audio language. + type: string + response_format: + description: | + The format of the transcript output, in one of these options: `json`, `text`, `srt`, `verbose_json`, or `vtt`. + type: string + enum: + - json + - text + - srt + - verbose_json + - vtt + default: json + temperature: + description: | + The sampling temperature, between 0 and 1. Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic. 
If set to 0, the model will use [log probability](https://en.wikipedia.org/wiki/Log_probability) to automatically increase the temperature until certain thresholds are hit. + type: number + default: 0 + timestamp_granularities[]: + description: | + The timestamp granularities to populate for this transcription. `response_format` must be set to `verbose_json` to use timestamp granularities. Either or both of these options are supported: `word` or `segment`. Note: There is no additional latency for segment timestamps, but generating word timestamps incurs additional latency. + type: array + items: + type: string + enum: + - word + - segment + default: [segment] + required: + - file + - model + + # Note: This does not currently support the non-default response format types. + CreateTranscriptionResponseJson: + type: object + description: Represents a transcription response returned by the model, based on the provided input. + properties: + text: + type: string + description: The transcribed text. + required: + - text + x-oaiMeta: + name: The transcription object (JSON) + group: audio + example: *basic_transcription_response_example + + TranscriptionSegment: + type: object + properties: + id: + type: integer + description: Unique identifier of the segment. + seek: + type: integer + description: Seek offset of the segment. + start: + type: number + format: float + description: Start time of the segment in seconds. + end: + type: number + format: float + description: End time of the segment in seconds. + text: + type: string + description: Text content of the segment. + tokens: + type: array + items: + type: integer + description: Array of token IDs for the text content. + temperature: + type: number + format: float + description: Temperature parameter used for generating the segment. + avg_logprob: + type: number + format: float + description: Average logprob of the segment. If the value is lower than -1, consider the logprobs failed.
+ compression_ratio: + type: number + format: float + description: Compression ratio of the segment. If the value is greater than 2.4, consider the compression failed. + no_speech_prob: + type: number + format: float + description: Probability of no speech in the segment. If the value is higher than 1.0 and the `avg_logprob` is below -1, consider this segment silent. + required: + - id + - seek + - start + - end + - text + - tokens + - temperature + - avg_logprob + - compression_ratio + - no_speech_prob + + TranscriptionWord: + type: object + properties: + word: + type: string + description: The text content of the word. + start: + type: number + format: float + description: Start time of the word in seconds. + end: + type: number + format: float + description: End time of the word in seconds. + required: [word, start, end] + + CreateTranscriptionResponseVerboseJson: + type: object + description: Represents a verbose JSON transcription response returned by the model, based on the provided input. + properties: + language: + type: string + description: The language of the input audio. + duration: + type: string + description: The duration of the input audio. + text: + type: string + description: The transcribed text. + words: + type: array + description: Extracted words and their corresponding timestamps. + items: + $ref: "#/components/schemas/TranscriptionWord" + segments: + type: array + description: Segments of the transcribed text and their corresponding details. + items: + $ref: "#/components/schemas/TranscriptionSegment" + required: [language, duration, text] + x-oaiMeta: + name: The transcription object (Verbose JSON) + group: audio + example: *verbose_transcription_response_example + + CreateTranslationRequest: + type: object + additionalProperties: false + properties: + file: + description: | + The audio file object (not file name) to translate, in one of these formats: flac, mp3, mp4, mpeg, mpga, m4a, ogg, wav, or webm.
+ type: string + x-oaiTypeLabel: file + format: binary + model: + description: | + ID of the model to use. Only `whisper-1` (which is powered by our open source Whisper V2 model) is currently available. + example: whisper-1 + anyOf: + - type: string + - type: string + enum: ["whisper-1"] + x-oaiTypeLabel: string + prompt: + description: | + An optional text to guide the model's style or continue a previous audio segment. The [prompt](/docs/guides/speech-to-text/prompting) should be in English. + type: string + response_format: + description: | + The format of the transcript output, in one of these options: `json`, `text`, `srt`, `verbose_json`, or `vtt`. + type: string + default: json + temperature: + description: | + The sampling temperature, between 0 and 1. Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic. If set to 0, the model will use [log probability](https://en.wikipedia.org/wiki/Log_probability) to automatically increase the temperature until certain thresholds are hit. + type: number + default: 0 + required: + - file + - model + + # Note: This does not currently support the non-default response format types. + CreateTranslationResponseJson: + type: object + properties: + text: + type: string + required: + - text + + CreateTranslationResponseVerboseJson: + type: object + properties: + language: + type: string + description: The language of the output translation (always `english`). + duration: + type: string + description: The duration of the input audio. + text: + type: string + description: The translated text. + segments: + type: array + description: Segments of the translated text and their corresponding details. 
+ items: + $ref: "#/components/schemas/TranscriptionSegment" + required: [language, duration, text] + + CreateSpeechRequest: + type: object + additionalProperties: false + properties: + model: + description: | + One of the available [TTS models](/docs/models/tts): `tts-1` or `tts-1-hd` + anyOf: + - type: string + - type: string + enum: ["tts-1", "tts-1-hd"] + x-oaiTypeLabel: string + input: + type: string + description: The text to generate audio for. The maximum length is 4096 characters. + maxLength: 4096 + voice: + description: The voice to use when generating the audio. Supported voices are `alloy`, `echo`, `fable`, `onyx`, `nova`, and `shimmer`. Previews of the voices are available in the [Text to speech guide](/docs/guides/text-to-speech/voice-options). + type: string + enum: ["alloy", "echo", "fable", "onyx", "nova", "shimmer"] + response_format: + description: "The format to return audio in. Supported formats are `mp3`, `opus`, `aac`, `flac`, `wav`, and `pcm`." + default: "mp3" + type: string + enum: ["mp3", "opus", "aac", "flac", "wav", "pcm"] + speed: + description: "The speed of the generated audio. Select a value from `0.25` to `4.0`. `1.0` is the default." + type: number + default: 1.0 + minimum: 0.25 + maximum: 4.0 + required: + - model + - input + - voice + + Model: + title: Model + description: Describes an OpenAI model offering that can be used with the API. + properties: + id: + type: string + description: The model identifier, which can be referenced in the API endpoints. + created: + type: integer + description: The Unix timestamp (in seconds) when the model was created. + object: + type: string + description: The object type, which is always "model". + enum: [model] + owned_by: + type: string + description: The organization that owns the model.
+ required:
+ - id
+ - object
+ - created
+ - owned_by
+ x-oaiMeta:
+ name: The model object
+ example: *retrieve_model_response
+
+ OpenAIFile:
+ title: OpenAIFile
+ description: The `File` object represents a document that has been uploaded to OpenAI.
+ properties:
+ id:
+ type: string
+ description: The file identifier, which can be referenced in the API endpoints.
+ bytes:
+ type: integer
+ description: The size of the file, in bytes.
+ created_at:
+ type: integer
+ description: The Unix timestamp (in seconds) for when the file was created.
+ filename:
+ type: string
+ description: The name of the file.
+ object:
+ type: string
+ description: The object type, which is always `file`.
+ enum: ["file"]
+ purpose:
+ type: string
+ description: The intended purpose of the file. Supported values are `assistants`, `assistants_output`, `batch`, `batch_output`, `fine-tune`, `fine-tune-results`, and `vision`.
+ enum:
+ [
+ "assistants",
+ "assistants_output",
+ "batch",
+ "batch_output",
+ "fine-tune",
+ "fine-tune-results",
+ "vision",
+ ]
+ status:
+ type: string
+ deprecated: true
+ description: Deprecated. The current status of the file, which can be either `uploaded`, `processed`, or `error`.
+ enum: ["uploaded", "processed", "error"]
+ status_details:
+ type: string
+ deprecated: true
+ description: Deprecated. For details on why a fine-tuning training file failed validation, see the `error` field on `fine_tuning.job`.
+ required:
+ - id
+ - object
+ - bytes
+ - created_at
+ - filename
+ - purpose
+ - status
+ x-oaiMeta:
+ name: The file object
+ example: |
+ {
+ "id": "file-abc123",
+ "object": "file",
+ "bytes": 120000,
+ "created_at": 1677610602,
+ "filename": "salesOverview.pdf",
+ "purpose": "assistants"
+ }
+ Upload:
+ type: object
+ title: Upload
+ description: |
+ The Upload object can accept byte chunks in the form of Parts.
+ properties:
+ id:
+ type: string
+ description: The Upload unique identifier, which can be referenced in API endpoints.
+ created_at:
+ type: integer
+ description: The Unix timestamp (in seconds) for when the Upload was created.
+ filename:
+ type: string
+ description: The name of the file to be uploaded.
+ bytes:
+ type: integer
+ description: The intended number of bytes to be uploaded.
+ purpose:
+ type: string
+ description: The intended purpose of the file. [Please refer here](/docs/api-reference/files/object#files/object-purpose) for acceptable values.
+ status:
+ type: string
+ description: The status of the Upload.
+ enum: ["pending", "completed", "cancelled", "expired"]
+ expires_at:
+ type: integer
+ description: The Unix timestamp (in seconds) for when the Upload will expire.
+ object:
+ type: string
+ description: The object type, which is always "upload".
+ enum: [upload]
+ file:
+ $ref: "#/components/schemas/OpenAIFile"
+ nullable: true
+ description: The ready File object after the Upload is completed.
+ required:
+ - bytes
+ - created_at
+ - expires_at
+ - filename
+ - id
+ - purpose
+ - status
+ x-oaiMeta:
+ name: The upload object
+ example: |
+ {
+ "id": "upload_abc123",
+ "object": "upload",
+ "bytes": 2147483648,
+ "created_at": 1719184911,
+ "filename": "training_examples.jsonl",
+ "purpose": "fine-tune",
+ "status": "completed",
+ "expires_at": 1719127296,
+ "file": {
+ "id": "file-xyz321",
+ "object": "file",
+ "bytes": 2147483648,
+ "created_at": 1719186911,
+ "filename": "training_examples.jsonl",
+ "purpose": "fine-tune"
+ }
+ }
+ UploadPart:
+ type: object
+ title: UploadPart
+ description: |
+ The upload Part represents a chunk of bytes we can add to an Upload object.
+ properties:
+ id:
+ type: string
+ description: The upload Part unique identifier, which can be referenced in API endpoints.
+ created_at:
+ type: integer
+ description: The Unix timestamp (in seconds) for when the Part was created.
+ upload_id:
+ type: string
+ description: The ID of the Upload object that this Part was added to.
+ object:
+ type: string
+ description: The object type, which is always `upload.part`.
+ enum: ['upload.part']
+ required:
+ - created_at
+ - id
+ - object
+ - upload_id
+ x-oaiMeta:
+ name: The upload part object
+ example: |
+ {
+ "id": "part_def456",
+ "object": "upload.part",
+ "created_at": 1719186911,
+ "upload_id": "upload_abc123"
+ }
+ Embedding:
+ type: object
+ description: |
+ Represents an embedding vector returned by the embedding endpoint.
+ properties:
+ index:
+ type: integer
+ description: The index of the embedding in the list of embeddings.
+ embedding:
+ type: array
+ description: |
+ The embedding vector, which is a list of floats. The length of the vector depends on the model as listed in the [embedding guide](/docs/guides/embeddings).
+ items:
+ type: number
+ object:
+ type: string
+ description: The object type, which is always "embedding".
+ enum: [embedding]
+ required:
+ - index
+ - object
+ - embedding
+ x-oaiMeta:
+ name: The embedding object
+ example: |
+ {
+ "object": "embedding",
+ "embedding": [
+ 0.0023064255,
+ -0.009327292,
+ .... (1536 floats total for ada-002)
+ -0.0028842222,
+ ],
+ "index": 0
+ }
+
+ FineTuningJob:
+ type: object
+ title: FineTuningJob
+ description: |
+ The `fine_tuning.job` object represents a fine-tuning job that has been created through the API.
+ properties:
+ id:
+ type: string
+ description: The object identifier, which can be referenced in the API endpoints.
+ created_at:
+ type: integer
+ description: The Unix timestamp (in seconds) for when the fine-tuning job was created.
+ error:
+ type: object
+ nullable: true
+ description: For fine-tuning jobs that have `failed`, this will contain more information on the cause of the failure.
+ properties:
+ code:
+ type: string
+ description: A machine-readable error code.
+ message:
+ type: string
+ description: A human-readable error message.
+ param:
+ type: string
+ description: The parameter that was invalid, usually `training_file` or `validation_file`.
This field will be null if the failure was not parameter-specific. + nullable: true + required: + - code + - message + - param + fine_tuned_model: + type: string + nullable: true + description: The name of the fine-tuned model that is being created. The value will be null if the fine-tuning job is still running. + finished_at: + type: integer + nullable: true + description: The Unix timestamp (in seconds) for when the fine-tuning job was finished. The value will be null if the fine-tuning job is still running. + hyperparameters: + type: object + description: The hyperparameters used for the fine-tuning job. See the [fine-tuning guide](/docs/guides/fine-tuning) for more details. + properties: + n_epochs: + oneOf: + - type: string + enum: [auto] + - type: integer + minimum: 1 + maximum: 50 + default: auto + description: + The number of epochs to train the model for. An epoch refers to one full cycle through the training dataset. + + "auto" decides the optimal number of epochs based on the size of the dataset. If setting the number manually, we support any number between 1 and 50 epochs. + required: + - n_epochs + model: + type: string + description: The base model that is being fine-tuned. + object: + type: string + description: The object type, which is always "fine_tuning.job". + enum: [fine_tuning.job] + organization_id: + type: string + description: The organization that owns the fine-tuning job. + result_files: + type: array + description: The compiled results file ID(s) for the fine-tuning job. You can retrieve the results with the [Files API](/docs/api-reference/files/retrieve-contents). + items: + type: string + example: file-abc123 + status: + type: string + description: The current status of the fine-tuning job, which can be either `validating_files`, `queued`, `running`, `succeeded`, `failed`, or `cancelled`. 
+ enum: + [ + "validating_files", + "queued", + "running", + "succeeded", + "failed", + "cancelled", + ] + trained_tokens: + type: integer + nullable: true + description: The total number of billable tokens processed by this fine-tuning job. The value will be null if the fine-tuning job is still running. + training_file: + type: string + description: The file ID used for training. You can retrieve the training data with the [Files API](/docs/api-reference/files/retrieve-contents). + validation_file: + type: string + nullable: true + description: The file ID used for validation. You can retrieve the validation results with the [Files API](/docs/api-reference/files/retrieve-contents). + integrations: + type: array + nullable: true + description: A list of integrations to enable for this fine-tuning job. + maxItems: 5 + items: + oneOf: + - $ref: "#/components/schemas/FineTuningIntegration" + x-oaiExpandable: true + seed: + type: integer + description: The seed used for the fine-tuning job. + estimated_finish: + type: integer + nullable: true + description: The Unix timestamp (in seconds) for when the fine-tuning job is estimated to finish. The value will be null if the fine-tuning job is not running. + required: + - created_at + - error + - finished_at + - fine_tuned_model + - hyperparameters + - id + - model + - object + - organization_id + - result_files + - status + - trained_tokens + - training_file + - validation_file + - seed + x-oaiMeta: + name: The fine-tuning job object + example: *fine_tuning_example + + FineTuningIntegration: + type: object + title: Fine-Tuning Job Integration + required: + - type + - wandb + properties: + type: + type: string + description: "The type of the integration being enabled for the fine-tuning job" + enum: ["wandb"] + wandb: + type: object + description: | + The settings for your integration with Weights and Biases. This payload specifies the project that + metrics will be sent to. 
Optionally, you can set an explicit display name for your run, add tags
+ to your run, and set a default entity (team, username, etc.) to be associated with your run.
+ required:
+ - project
+ properties:
+ project:
+ description: |
+ The name of the project that the new run will be created under.
+ type: string
+ example: "my-wandb-project"
+ name:
+ description: |
+ A display name to set for the run. If not set, we will use the Job ID as the name.
+ nullable: true
+ type: string
+ entity:
+ description: |
+ The entity to use for the run. This allows you to set the team or username of the WandB user that you would
+ like associated with the run. If not set, the default entity for the registered WandB API key is used.
+ nullable: true
+ type: string
+ tags:
+ description: |
+ A list of tags to be attached to the newly created run. These tags are passed through directly to WandB. Some
+ default tags are generated by OpenAI: "openai/finetune", "openai/{base-model}", "openai/{ftjob-abcdef}".
+ type: array
+ items:
+ type: string
+ example: "custom-tag"
+
+ FineTuningJobEvent:
+ type: object
+ description: Fine-tuning job event object
+ properties:
+ id:
+ type: string
+ created_at:
+ type: integer
+ level:
+ type: string
+ enum: ["info", "warn", "error"]
+ message:
+ type: string
+ object:
+ type: string
+ enum: [fine_tuning.job.event]
+ required:
+ - id
+ - object
+ - created_at
+ - level
+ - message
+ x-oaiMeta:
+ name: The fine-tuning job event object
+ example: |
+ {
+ "object": "fine_tuning.job.event",
+ "id": "ftevent-abc123",
+ "created_at": 1677610602,
+ "level": "info",
+ "message": "Created fine-tuning job"
+ }
+
+ FineTuningJobCheckpoint:
+ type: object
+ title: FineTuningJobCheckpoint
+ description: |
+ The `fine_tuning.job.checkpoint` object represents a model checkpoint for a fine-tuning job that is ready to use.
+ properties:
+ id:
+ type: string
+ description: The checkpoint identifier, which can be referenced in the API endpoints.
+ created_at:
+ type: integer
+ description: The Unix timestamp (in seconds) for when the checkpoint was created.
+ fine_tuned_model_checkpoint:
+ type: string
+ description: The name of the fine-tuned checkpoint model that is created.
+ step_number:
+ type: integer
+ description: The step number that the checkpoint was created at.
+ metrics:
+ type: object
+ description: Metrics at the step number during the fine-tuning job.
+ properties:
+ step:
+ type: number
+ train_loss:
+ type: number
+ train_mean_token_accuracy:
+ type: number
+ valid_loss:
+ type: number
+ valid_mean_token_accuracy:
+ type: number
+ full_valid_loss:
+ type: number
+ full_valid_mean_token_accuracy:
+ type: number
+ fine_tuning_job_id:
+ type: string
+ description: The ID of the fine-tuning job that this checkpoint was created from.
+ object:
+ type: string
+ description: The object type, which is always "fine_tuning.job.checkpoint".
+ enum: [fine_tuning.job.checkpoint]
+ required:
+ - created_at
+ - fine_tuning_job_id
+ - fine_tuned_model_checkpoint
+ - id
+ - metrics
+ - object
+ - step_number
+ x-oaiMeta:
+ name: The fine-tuning job checkpoint object
+ example: |
+ {
+ "object": "fine_tuning.job.checkpoint",
+ "id": "ftckpt_qtZ5Gyk4BLq1SfLFWp3RtO3P",
+ "created_at": 1712211699,
+ "fine_tuned_model_checkpoint": "ft:gpt-4o-mini-2024-07-18:my-org:custom_suffix:9ABel2dg:ckpt-step-88",
+ "fine_tuning_job_id": "ftjob-fpbNQ3H1GrMehXRf8cO97xTN",
+ "metrics": {
+ "step": 88,
+ "train_loss": 0.478,
+ "train_mean_token_accuracy": 0.924,
+ "valid_loss": 10.112,
+ "valid_mean_token_accuracy": 0.145,
+ "full_valid_loss": 0.567,
+ "full_valid_mean_token_accuracy": 0.944
+ },
+ "step_number": 88
+ }
+
+ FinetuneChatRequestInput:
+ type: object
+ description: The per-line training example of a fine-tuning input file for chat models
+ properties:
+ messages:
+ type: array
+ minItems: 1
+ items:
+ oneOf:
+ - $ref: "#/components/schemas/ChatCompletionRequestSystemMessage"
+ - $ref:
"#/components/schemas/ChatCompletionRequestUserMessage" + - $ref: "#/components/schemas/FineTuneChatCompletionRequestAssistantMessage" + - $ref: "#/components/schemas/ChatCompletionRequestToolMessage" + - $ref: "#/components/schemas/ChatCompletionRequestFunctionMessage" + x-oaiExpandable: true + tools: + type: array + description: A list of tools the model may generate JSON inputs for. + items: + $ref: "#/components/schemas/ChatCompletionTool" + parallel_tool_calls: + $ref: "#/components/schemas/ParallelToolCalls" + functions: + deprecated: true + description: + A list of functions the model may generate JSON inputs for. + type: array + minItems: 1 + maxItems: 128 + items: + $ref: "#/components/schemas/ChatCompletionFunctions" + x-oaiMeta: + name: Training format for chat models + example: | + { + "messages": [ + { "role": "user", "content": "What is the weather in San Francisco?" }, + { + "role": "assistant", + "tool_calls": [ + { + "id": "call_id", + "type": "function", + "function": { + "name": "get_current_weather", + "arguments": "{\"location\": \"San Francisco, USA\", \"format\": \"celsius\"}" + } + } + ] + } + ], + "parallel_tool_calls": false, + "tools": [ + { + "type": "function", + "function": { + "name": "get_current_weather", + "description": "Get the current weather", + "parameters": { + "type": "object", + "properties": { + "location": { + "type": "string", + "description": "The city and country, eg. San Francisco, USA" + }, + "format": { "type": "string", "enum": ["celsius", "fahrenheit"] } + }, + "required": ["location", "format"] + } + } + } + ] + } + + FinetuneCompletionRequestInput: + type: object + description: The per-line training example of a fine-tuning input file for completions models + properties: + prompt: + type: string + description: The input prompt for this training example. + completion: + type: string + description: The desired completion for this training example. 
+ x-oaiMeta: + name: Training format for completions models + example: | + { + "prompt": "What is the answer to 2+2", + "completion": "4" + } + + CompletionUsage: + type: object + description: Usage statistics for the completion request. + properties: + completion_tokens: + type: integer + description: Number of tokens in the generated completion. + prompt_tokens: + type: integer + description: Number of tokens in the prompt. + total_tokens: + type: integer + description: Total number of tokens used in the request (prompt + completion). + required: + - prompt_tokens + - completion_tokens + - total_tokens + + RunCompletionUsage: + type: object + description: Usage statistics related to the run. This value will be `null` if the run is not in a terminal state (i.e. `in_progress`, `queued`, etc.). + properties: + completion_tokens: + type: integer + description: Number of completion tokens used over the course of the run. + prompt_tokens: + type: integer + description: Number of prompt tokens used over the course of the run. + total_tokens: + type: integer + description: Total number of tokens used (prompt + completion). + required: + - prompt_tokens + - completion_tokens + - total_tokens + nullable: true + + RunStepCompletionUsage: + type: object + description: Usage statistics related to the run step. This value will be `null` while the run step's status is `in_progress`. + properties: + completion_tokens: + type: integer + description: Number of completion tokens used over the course of the run step. + prompt_tokens: + type: integer + description: Number of prompt tokens used over the course of the run step. + total_tokens: + type: integer + description: Total number of tokens used (prompt + completion). + required: + - prompt_tokens + - completion_tokens + - total_tokens + nullable: true + + AssistantsApiResponseFormatOption: + description: | + Specifies the format that the model must output. 
Compatible with [GPT-4o](/docs/models/gpt-4o), [GPT-4 Turbo](/docs/models/gpt-4-turbo-and-gpt-4), and all GPT-3.5 Turbo models since `gpt-3.5-turbo-1106`. + + Setting to `{ "type": "json_schema", "json_schema": {...} }` enables Structured Outputs which guarantees the model will match your supplied JSON schema. Learn more in the [Structured Outputs guide](/docs/guides/structured-outputs). + + Setting to `{ "type": "json_object" }` enables JSON mode, which guarantees the message the model generates is valid JSON. + + **Important:** when using JSON mode, you **must** also instruct the model to produce JSON yourself via a system or user message. Without this, the model may generate an unending stream of whitespace until the generation reaches the token limit, resulting in a long-running and seemingly "stuck" request. Also note that the message content may be partially cut off if `finish_reason="length"`, which indicates the generation exceeded `max_tokens` or the conversation exceeded the max context length. + oneOf: + - type: string + description: > + `auto` is the default value + enum: [auto] + - $ref: '#/components/schemas/ResponseFormatText' + - $ref: '#/components/schemas/ResponseFormatJsonObject' + - $ref: '#/components/schemas/ResponseFormatJsonSchema' + x-oaiExpandable: true + + AssistantObject: + type: object + title: Assistant + description: Represents an `assistant` that can call the model and use tools. + properties: + id: + description: The identifier, which can be referenced in API endpoints. + type: string + object: + description: The object type, which is always `assistant`. + type: string + enum: [assistant] + created_at: + description: The Unix timestamp (in seconds) for when the assistant was created. + type: integer + name: + description: &assistant_name_param_description | + The name of the assistant. The maximum length is 256 characters. 
+ type: string
+ maxLength: 256
+ nullable: true
+ description:
+ description: &assistant_description_param_description |
+ The description of the assistant. The maximum length is 512 characters.
+ type: string
+ maxLength: 512
+ nullable: true
+ model:
+ description: *model_description
+ type: string
+ instructions:
+ description: &assistant_instructions_param_description |
+ The system instructions that the assistant uses. The maximum length is 256,000 characters.
+ type: string
+ maxLength: 256000
+ nullable: true
+ tools:
+ description: &assistant_tools_param_description |
+ A list of tools enabled on the assistant. There can be a maximum of 128 tools per assistant. Tools can be of types `code_interpreter`, `file_search`, or `function`.
+ default: []
+ type: array
+ maxItems: 128
+ items:
+ oneOf:
+ - $ref: "#/components/schemas/AssistantToolsCode"
+ - $ref: "#/components/schemas/AssistantToolsFileSearch"
+ - $ref: "#/components/schemas/AssistantToolsFunction"
+ x-oaiExpandable: true
+ tool_resources:
+ type: object
+ description: |
+ A set of resources that are used by the assistant's tools. The resources are specific to the type of tool. For example, the `code_interpreter` tool requires a list of file IDs, while the `file_search` tool requires a list of vector store IDs.
+ properties:
+ code_interpreter:
+ type: object
+ properties:
+ file_ids:
+ type: array
+ description: |
+ A list of [file](/docs/api-reference/files) IDs made available to the `code_interpreter` tool. There can be a maximum of 20 files associated with the tool.
+ default: []
+ maxItems: 20
+ items:
+ type: string
+ file_search:
+ type: object
+ properties:
+ vector_store_ids:
+ type: array
+ description: |
+ The ID of the [vector store](/docs/api-reference/vector-stores/object) attached to this assistant. There can be a maximum of 1 vector store attached to the assistant.
+ maxItems: 1
+ items:
+ type: string
+ nullable: true
+ metadata:
+ description: &metadata_description |
+ Set of 16 key-value pairs that can be attached to an object. This can be useful for storing additional information about the object in a structured format. Keys can be a maximum of 64 characters long and values can be a maximum of 512 characters long.
+ type: object
+ x-oaiTypeLabel: map
+ nullable: true
+ temperature:
+ description: &run_temperature_description |
+ What sampling temperature to use, between 0 and 2. Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic.
+ type: number
+ minimum: 0
+ maximum: 2
+ default: 1
+ example: 1
+ nullable: true
+ top_p:
+ type: number
+ minimum: 0
+ maximum: 1
+ default: 1
+ example: 1
+ nullable: true
+ description: &run_top_p_description |
+ An alternative to sampling with temperature, called nucleus sampling, where the model considers the results of the tokens with top_p probability mass. So 0.1 means only the tokens comprising the top 10% probability mass are considered.
+
+ We generally recommend altering this or temperature but not both.
+ response_format:
+ $ref: "#/components/schemas/AssistantsApiResponseFormatOption"
+ nullable: true
+ required:
+ - id
+ - object
+ - created_at
+ - name
+ - description
+ - model
+ - instructions
+ - tools
+ - metadata
+ x-oaiMeta:
+ name: The assistant object
+ beta: true
+ example: *create_assistants_example
+
+ CreateAssistantRequest:
+ type: object
+ additionalProperties: false
+ properties:
+ model:
+ description: *model_description
+ example: "gpt-4o"
+ anyOf:
+ - type: string
+ - type: string
+ enum:
+ [
+ "gpt-4o",
+ "gpt-4o-2024-08-06",
+ "gpt-4o-2024-05-13",
+ "gpt-4o-mini",
+ "gpt-4o-mini-2024-07-18",
+ "gpt-4-turbo",
+ "gpt-4-turbo-2024-04-09",
+ "gpt-4-0125-preview",
+ "gpt-4-turbo-preview",
+ "gpt-4-1106-preview",
+ "gpt-4-vision-preview",
+ "gpt-4",
+ "gpt-4-0314",
+ "gpt-4-0613",
+ "gpt-4-32k",
+ "gpt-4-32k-0314",
+ "gpt-4-32k-0613",
+ "gpt-3.5-turbo",
+ "gpt-3.5-turbo-16k",
+ "gpt-3.5-turbo-0613",
+ "gpt-3.5-turbo-1106",
+ "gpt-3.5-turbo-0125",
+ "gpt-3.5-turbo-16k-0613",
+ ]
+ x-oaiTypeLabel: string
+ name:
+ description: *assistant_name_param_description
+ type: string
+ nullable: true
+ maxLength: 256
+ description:
+ description: *assistant_description_param_description
+ type: string
+ nullable: true
+ maxLength: 512
+ instructions:
+ description: *assistant_instructions_param_description
+ type: string
+ nullable: true
+ maxLength: 256000
+ tools:
+ description: *assistant_tools_param_description
+ default: []
+ type: array
+ maxItems: 128
+ items:
+ oneOf:
+ - $ref: "#/components/schemas/AssistantToolsCode"
+ - $ref: "#/components/schemas/AssistantToolsFileSearch"
+ - $ref: "#/components/schemas/AssistantToolsFunction"
+ x-oaiExpandable: true
+ tool_resources:
+ type: object
+ description: |
+ A set of resources that are used by the assistant's tools. The resources are specific to the type of tool.
For example, the `code_interpreter` tool requires a list of file IDs, while the `file_search` tool requires a list of vector store IDs. + properties: + code_interpreter: + type: object + properties: + file_ids: + type: array + description: | + A list of [file](/docs/api-reference/files) IDs made available to the `code_interpreter` tool. There can be a maximum of 20 files associated with the tool. + default: [] + maxItems: 20 + items: + type: string + file_search: + type: object + properties: + vector_store_ids: + type: array + description: | + The [vector store](/docs/api-reference/vector-stores/object) attached to this assistant. There can be a maximum of 1 vector store attached to the assistant. + maxItems: 1 + items: + type: string + vector_stores: + type: array + description: | + A helper to create a [vector store](/docs/api-reference/vector-stores/object) with file_ids and attach it to this assistant. There can be a maximum of 1 vector store attached to the assistant. + maxItems: 1 + items: + type: object + properties: + file_ids: + type: array + description: | + A list of [file](/docs/api-reference/files) IDs to add to the vector store. There can be a maximum of 10000 files in a vector store. + maxItems: 10000 + items: + type: string + chunking_strategy: + # Ideally we'd reuse the chunking strategy schema here, but it doesn't expand properly + type: object + description: The chunking strategy used to chunk the file(s). If not set, will use the `auto` strategy. + oneOf: + - type: object + title: Auto Chunking Strategy + description: The default strategy. This strategy currently uses a `max_chunk_size_tokens` of `800` and `chunk_overlap_tokens` of `400`. + additionalProperties: false + properties: + type: + type: string + description: Always `auto`. + enum: ["auto"] + required: + - type + - type: object + title: Static Chunking Strategy + additionalProperties: false + properties: + type: + type: string + description: Always `static`. 
+ enum: ["static"]
+ static:
+ type: object
+ additionalProperties: false
+ properties:
+ max_chunk_size_tokens:
+ type: integer
+ minimum: 100
+ maximum: 4096
+ description: The maximum number of tokens in each chunk. The default value is `800`. The minimum value is `100` and the maximum value is `4096`.
+ chunk_overlap_tokens:
+ type: integer
+ description: |
+ The number of tokens that overlap between chunks. The default value is `400`.
+
+ Note that the overlap must not exceed half of `max_chunk_size_tokens`.
+ required:
+ - max_chunk_size_tokens
+ - chunk_overlap_tokens
+ required:
+ - type
+ - static
+ x-oaiExpandable: true
+ metadata:
+ type: object
+ description: |
+ Set of 16 key-value pairs that can be attached to a vector store. This can be useful for storing additional information about the vector store in a structured format. Keys can be a maximum of 64 characters long and values can be a maximum of 512 characters long.
+ x-oaiTypeLabel: map
+ oneOf:
+ - required: [vector_store_ids]
+ - required: [vector_stores]
+ nullable: true
+ metadata:
+ description: *metadata_description
+ type: object
+ x-oaiTypeLabel: map
+ nullable: true
+ temperature:
+ description: *run_temperature_description
+ type: number
+ minimum: 0
+ maximum: 2
+ default: 1
+ example: 1
+ nullable: true
+ top_p:
+ type: number
+ minimum: 0
+ maximum: 1
+ default: 1
+ example: 1
+ nullable: true
+ description: *run_top_p_description
+ response_format:
+ $ref: "#/components/schemas/AssistantsApiResponseFormatOption"
+ nullable: true
+ required:
+ - model
+
+ ModifyAssistantRequest:
+ type: object
+ additionalProperties: false
+ properties:
+ model:
+ description: *model_description
+ anyOf:
+ - type: string
+ name:
+ description: *assistant_name_param_description
+ type: string
+ nullable: true
+ maxLength: 256
+ description:
+ description: *assistant_description_param_description
+ type: string
+ nullable: true
+ maxLength: 512
+ instructions:
+ description:
*assistant_instructions_param_description + type: string + nullable: true + maxLength: 256000 + tools: + description: *assistant_tools_param_description + default: [] + type: array + maxItems: 128 + items: + oneOf: + - $ref: "#/components/schemas/AssistantToolsCode" + - $ref: "#/components/schemas/AssistantToolsFileSearch" + - $ref: "#/components/schemas/AssistantToolsFunction" + x-oaiExpandable: true + tool_resources: + type: object + description: | + A set of resources that are used by the assistant's tools. The resources are specific to the type of tool. For example, the `code_interpreter` tool requires a list of file IDs, while the `file_search` tool requires a list of vector store IDs. + properties: + code_interpreter: + type: object + properties: + file_ids: + type: array + description: | + Overrides the list of [file](/docs/api-reference/files) IDs made available to the `code_interpreter` tool. There can be a maximum of 20 files associated with the tool. + default: [] + maxItems: 20 + items: + type: string + file_search: + type: object + properties: + vector_store_ids: + type: array + description: | + Overrides the [vector store](/docs/api-reference/vector-stores/object) attached to this assistant. There can be a maximum of 1 vector store attached to the assistant. 
+ maxItems: 1 + items: + type: string + nullable: true + metadata: + description: *metadata_description + type: object + x-oaiTypeLabel: map + nullable: true + temperature: + description: *run_temperature_description + type: number + minimum: 0 + maximum: 2 + default: 1 + example: 1 + nullable: true + top_p: + type: number + minimum: 0 + maximum: 1 + default: 1 + example: 1 + nullable: true + description: *run_top_p_description + response_format: + $ref: "#/components/schemas/AssistantsApiResponseFormatOption" + nullable: true + + DeleteAssistantResponse: + type: object + properties: + id: + type: string + deleted: + type: boolean + object: + type: string + enum: [assistant.deleted] + required: + - id + - object + - deleted + + ListAssistantsResponse: + type: object + properties: + object: + type: string + example: "list" + data: + type: array + items: + $ref: "#/components/schemas/AssistantObject" + first_id: + type: string + example: "asst_abc123" + last_id: + type: string + example: "asst_abc456" + has_more: + type: boolean + example: false + required: + - object + - data + - first_id + - last_id + - has_more + x-oaiMeta: + name: List assistants response object + group: chat + example: *list_assistants_example + + AssistantToolsCode: + type: object + title: Code interpreter tool + properties: + type: + type: string + description: "The type of tool being defined: `code_interpreter`" + enum: ["code_interpreter"] + required: + - type + + AssistantToolsFileSearch: + type: object + title: FileSearch tool + properties: + type: + type: string + description: "The type of tool being defined: `file_search`" + enum: ["file_search"] + file_search: + type: object + description: Overrides for the file search tool. + properties: + max_num_results: + type: integer + minimum: 1 + maximum: 50 + description: | + The maximum number of results the file search tool should output. The default is 20 for `gpt-4*` models and 5 for `gpt-3.5-turbo`. 
This number should be between 1 and 50 inclusive.
+
+ Note that the file search tool may output fewer than `max_num_results` results. See the [file search tool documentation](/docs/assistants/tools/file-search/number-of-chunks-returned) for more information.
+ required:
+ - type
+
+ AssistantToolsFileSearchTypeOnly:
+ type: object
+ title: FileSearch tool
+ properties:
+ type:
+ type: string
+ description: "The type of tool being defined: `file_search`"
+ enum: ["file_search"]
+ required:
+ - type
+
+ AssistantToolsFunction:
+ type: object
+ title: Function tool
+ properties:
+ type:
+ type: string
+ description: "The type of tool being defined: `function`"
+ enum: ["function"]
+ function:
+ $ref: "#/components/schemas/FunctionObject"
+ required:
+ - type
+ - function
+
+ TruncationObject:
+ type: object
+ title: Thread Truncation Controls
+ description: Controls for how a thread will be truncated prior to the run. Use this to control the initial context window of the run.
+ properties:
+ type:
+ type: string
+ description: The truncation strategy to use for the thread. The default is `auto`. If set to `last_messages`, the thread will be truncated to the n most recent messages in the thread. When set to `auto`, messages in the middle of the thread will be dropped to fit the context length of the model, `max_prompt_tokens`.
+ enum: ["auto", "last_messages"]
+ last_messages:
+ type: integer
+ description: The number of most recent messages from the thread when constructing the context for the run.
+ minimum: 1
+ nullable: true
+ required:
+ - type
+
+ AssistantsApiToolChoiceOption:
+ description: |
+ Controls which (if any) tool is called by the model.
+ `none` means the model will not call any tools and instead generates a message.
+ `auto` is the default value and means the model can pick between generating a message or calling one or more tools.
+ `required` means the model must call one or more tools before responding to the user.
+ Specifying a particular tool like `{"type": "file_search"}` or `{"type": "function", "function": {"name": "my_function"}}` forces the model to call that tool. + + oneOf: + - type: string + description: > + `none` means the model will not call any tools and instead generates a message. + `auto` means the model can pick between generating a message or calling one or more tools. + `required` means the model must call one or more tools before responding to the user. + enum: [none, auto, required] + - $ref: "#/components/schemas/AssistantsNamedToolChoice" + x-oaiExpandable: true + + AssistantsNamedToolChoice: + type: object + description: Specifies a tool the model should use. Use to force the model to call a specific tool. + properties: + type: + type: string + enum: ["function", "code_interpreter", "file_search"] + description: The type of the tool. If type is `function`, the function name must be set + function: + type: object + properties: + name: + type: string + description: The name of the function to call. + required: + - name + required: + - type + + RunObject: + type: object + title: A run on a thread + description: Represents an execution run on a [thread](/docs/api-reference/threads). + properties: + id: + description: The identifier, which can be referenced in API endpoints. + type: string + object: + description: The object type, which is always `thread.run`. + type: string + enum: ["thread.run"] + created_at: + description: The Unix timestamp (in seconds) for when the run was created. + type: integer + thread_id: + description: The ID of the [thread](/docs/api-reference/threads) that was executed on as a part of this run. + type: string + assistant_id: + description: The ID of the [assistant](/docs/api-reference/assistants) used for execution of this run. 
+ type: string + status: + description: The status of the run, which can be either `queued`, `in_progress`, `requires_action`, `cancelling`, `cancelled`, `failed`, `completed`, `incomplete`, or `expired`. + type: string + enum: + [ + "queued", + "in_progress", + "requires_action", + "cancelling", + "cancelled", + "failed", + "completed", + "incomplete", + "expired", + ] + required_action: + type: object + description: Details on the action required to continue the run. Will be `null` if no action is required. + nullable: true + properties: + type: + description: For now, this is always `submit_tool_outputs`. + type: string + enum: ["submit_tool_outputs"] + submit_tool_outputs: + type: object + description: Details on the tool outputs needed for this run to continue. + properties: + tool_calls: + type: array + description: A list of the relevant tool calls. + items: + $ref: "#/components/schemas/RunToolCallObject" + required: + - tool_calls + required: + - type + - submit_tool_outputs + last_error: + type: object + description: The last error associated with this run. Will be `null` if there are no errors. + nullable: true + properties: + code: + type: string + description: One of `server_error`, `rate_limit_exceeded`, or `invalid_prompt`. + enum: + ["server_error", "rate_limit_exceeded", "invalid_prompt"] + message: + type: string + description: A human-readable description of the error. + required: + - code + - message + expires_at: + description: The Unix timestamp (in seconds) for when the run will expire. + type: integer + nullable: true + started_at: + description: The Unix timestamp (in seconds) for when the run was started. + type: integer + nullable: true + cancelled_at: + description: The Unix timestamp (in seconds) for when the run was cancelled. + type: integer + nullable: true + failed_at: + description: The Unix timestamp (in seconds) for when the run failed. 
+ type: integer + nullable: true + completed_at: + description: The Unix timestamp (in seconds) for when the run was completed. + type: integer + nullable: true + incomplete_details: + description: Details on why the run is incomplete. Will be `null` if the run is not incomplete. + type: object + nullable: true + properties: + reason: + description: The reason why the run is incomplete. This will point to which specific token limit was reached over the course of the run. + type: string + enum: ["max_completion_tokens", "max_prompt_tokens"] + model: + description: The model that the [assistant](/docs/api-reference/assistants) used for this run. + type: string + instructions: + description: The instructions that the [assistant](/docs/api-reference/assistants) used for this run. + type: string + tools: + description: The list of tools that the [assistant](/docs/api-reference/assistants) used for this run. + default: [] + type: array + maxItems: 20 + items: + oneOf: + - $ref: "#/components/schemas/AssistantToolsCode" + - $ref: "#/components/schemas/AssistantToolsFileSearch" + - $ref: "#/components/schemas/AssistantToolsFunction" + x-oaiExpandable: true + metadata: + description: *metadata_description + type: object + x-oaiTypeLabel: map + nullable: true + usage: + $ref: "#/components/schemas/RunCompletionUsage" + temperature: + description: The sampling temperature used for this run. If not set, defaults to 1. + type: number + nullable: true + top_p: + description: The nucleus sampling value used for this run. If not set, defaults to 1. + type: number + nullable: true + max_prompt_tokens: + type: integer + nullable: true + description: | + The maximum number of prompt tokens specified to have been used over the course of the run. + minimum: 256 + max_completion_tokens: + type: integer + nullable: true + description: | + The maximum number of completion tokens specified to have been used over the course of the run. 
+ minimum: 256 + truncation_strategy: + $ref: "#/components/schemas/TruncationObject" + nullable: true + tool_choice: + $ref: "#/components/schemas/AssistantsApiToolChoiceOption" + nullable: true + parallel_tool_calls: + $ref: "#/components/schemas/ParallelToolCalls" + response_format: + $ref: "#/components/schemas/AssistantsApiResponseFormatOption" + nullable: true + required: + - id + - object + - created_at + - thread_id + - assistant_id + - status + - required_action + - last_error + - expires_at + - started_at + - cancelled_at + - failed_at + - completed_at + - model + - instructions + - tools + - metadata + - usage + - incomplete_details + - max_prompt_tokens + - max_completion_tokens + - truncation_strategy + - tool_choice + - parallel_tool_calls + - response_format + x-oaiMeta: + name: The run object + beta: true + example: | + { + "id": "run_abc123", + "object": "thread.run", + "created_at": 1698107661, + "assistant_id": "asst_abc123", + "thread_id": "thread_abc123", + "status": "completed", + "started_at": 1699073476, + "expires_at": null, + "cancelled_at": null, + "failed_at": null, + "completed_at": 1699073498, + "last_error": null, + "model": "gpt-4o", + "instructions": null, + "tools": [{"type": "file_search"}, {"type": "code_interpreter"}], + "metadata": {}, + "incomplete_details": null, + "usage": { + "prompt_tokens": 123, + "completion_tokens": 456, + "total_tokens": 579 + }, + "temperature": 1.0, + "top_p": 1.0, + "max_prompt_tokens": 1000, + "max_completion_tokens": 1000, + "truncation_strategy": { + "type": "auto", + "last_messages": null + }, + "response_format": "auto", + "tool_choice": "auto", + "parallel_tool_calls": true + } + CreateRunRequest: + type: object + additionalProperties: false + properties: + assistant_id: + description: The ID of the [assistant](/docs/api-reference/assistants) to use to execute this run. + type: string + model: + description: The ID of the [Model](/docs/api-reference/models) to be used to execute this run. 
If a value is provided here, it will override the model associated with the assistant. If not, the model associated with the assistant will be used.
+ example: "gpt-4o"
+ anyOf:
+ - type: string
+ - type: string
+ enum:
+ [
+ "gpt-4o",
+ "gpt-4o-2024-08-06",
+ "gpt-4o-2024-05-13",
+ "gpt-4o-mini",
+ "gpt-4o-mini-2024-07-18",
+ "gpt-4-turbo",
+ "gpt-4-turbo-2024-04-09",
+ "gpt-4-0125-preview",
+ "gpt-4-turbo-preview",
+ "gpt-4-1106-preview",
+ "gpt-4-vision-preview",
+ "gpt-4",
+ "gpt-4-0314",
+ "gpt-4-0613",
+ "gpt-4-32k",
+ "gpt-4-32k-0314",
+ "gpt-4-32k-0613",
+ "gpt-3.5-turbo",
+ "gpt-3.5-turbo-16k",
+ "gpt-3.5-turbo-0613",
+ "gpt-3.5-turbo-1106",
+ "gpt-3.5-turbo-0125",
+ "gpt-3.5-turbo-16k-0613",
+ ]
+ x-oaiTypeLabel: string
+ nullable: true
+ instructions:
+ description: Overrides the [instructions](/docs/api-reference/assistants/createAssistant) of the assistant. This is useful for modifying the behavior on a per-run basis.
+ type: string
+ nullable: true
+ additional_instructions:
+ description: Appends additional instructions at the end of the instructions for the run. This is useful for modifying the behavior on a per-run basis without overriding other instructions.
+ type: string
+ nullable: true
+ additional_messages:
+ description: Adds additional messages to the thread before creating the run.
+ type: array
+ items:
+ $ref: "#/components/schemas/CreateMessageRequest"
+ nullable: true
+ tools:
+ description: Override the tools the assistant can use for this run. This is useful for modifying the behavior on a per-run basis.
+ nullable: true + type: array + maxItems: 20 + items: + oneOf: + - $ref: "#/components/schemas/AssistantToolsCode" + - $ref: "#/components/schemas/AssistantToolsFileSearch" + - $ref: "#/components/schemas/AssistantToolsFunction" + x-oaiExpandable: true + metadata: + description: *metadata_description + type: object + x-oaiTypeLabel: map + nullable: true + temperature: + type: number + minimum: 0 + maximum: 2 + default: 1 + example: 1 + nullable: true + description: *run_temperature_description + top_p: + type: number + minimum: 0 + maximum: 1 + default: 1 + example: 1 + nullable: true + description: *run_top_p_description + stream: + type: boolean + nullable: true + description: | + If `true`, returns a stream of events that happen during the Run as server-sent events, terminating when the Run enters a terminal state with a `data: [DONE]` message. + max_prompt_tokens: + type: integer + nullable: true + description: | + The maximum number of prompt tokens that may be used over the course of the run. The run will make a best effort to use only the number of prompt tokens specified, across multiple turns of the run. If the run exceeds the number of prompt tokens specified, the run will end with status `incomplete`. See `incomplete_details` for more info. + minimum: 256 + max_completion_tokens: + type: integer + nullable: true + description: | + The maximum number of completion tokens that may be used over the course of the run. The run will make a best effort to use only the number of completion tokens specified, across multiple turns of the run. If the run exceeds the number of completion tokens specified, the run will end with status `incomplete`. See `incomplete_details` for more info. 
+ minimum: 256 + truncation_strategy: + $ref: "#/components/schemas/TruncationObject" + nullable: true + tool_choice: + $ref: "#/components/schemas/AssistantsApiToolChoiceOption" + nullable: true + parallel_tool_calls: + $ref: "#/components/schemas/ParallelToolCalls" + response_format: + $ref: "#/components/schemas/AssistantsApiResponseFormatOption" + nullable: true + required: + - thread_id + - assistant_id + ListRunsResponse: + type: object + properties: + object: + type: string + example: "list" + data: + type: array + items: + $ref: "#/components/schemas/RunObject" + first_id: + type: string + example: "run_abc123" + last_id: + type: string + example: "run_abc456" + has_more: + type: boolean + example: false + required: + - object + - data + - first_id + - last_id + - has_more + ModifyRunRequest: + type: object + additionalProperties: false + properties: + metadata: + description: *metadata_description + type: object + x-oaiTypeLabel: map + nullable: true + SubmitToolOutputsRunRequest: + type: object + additionalProperties: false + properties: + tool_outputs: + description: A list of tools for which the outputs are being submitted. + type: array + items: + type: object + properties: + tool_call_id: + type: string + description: The ID of the tool call in the `required_action` object within the run object the output is being submitted for. + output: + type: string + description: The output of the tool call to be submitted to continue the run. + stream: + type: boolean + nullable: true + description: | + If `true`, returns a stream of events that happen during the Run as server-sent events, terminating when the Run enters a terminal state with a `data: [DONE]` message. + required: + - tool_outputs + + RunToolCallObject: + type: object + description: Tool call objects + properties: + id: + type: string + description: The ID of the tool call. 
This ID must be referenced when you submit the tool outputs using the [Submit tool outputs to run](/docs/api-reference/runs/submitToolOutputs) endpoint.
+ type:
+ type: string
+ description: The type of tool call the output is required for. For now, this is always `function`.
+ enum: ["function"]
+ function:
+ type: object
+ description: The function definition.
+ properties:
+ name:
+ type: string
+ description: The name of the function.
+ arguments:
+ type: string
+ description: The arguments that the model expects you to pass to the function.
+ required:
+ - name
+ - arguments
+ required:
+ - id
+ - type
+ - function
+
+ CreateThreadAndRunRequest:
+ type: object
+ additionalProperties: false
+ properties:
+ assistant_id:
+ description: The ID of the [assistant](/docs/api-reference/assistants) to use to execute this run.
+ type: string
+ thread:
+ $ref: "#/components/schemas/CreateThreadRequest"
+ description: If no thread is provided, an empty thread will be created.
+ model:
+ description: The ID of the [Model](/docs/api-reference/models) to be used to execute this run. If a value is provided here, it will override the model associated with the assistant. If not, the model associated with the assistant will be used.
+ example: "gpt-4o"
+ anyOf:
+ - type: string
+ - type: string
+ enum:
+ [
+ "gpt-4o",
+ "gpt-4o-2024-08-06",
+ "gpt-4o-2024-05-13",
+ "gpt-4o-mini",
+ "gpt-4o-mini-2024-07-18",
+ "gpt-4-turbo",
+ "gpt-4-turbo-2024-04-09",
+ "gpt-4-0125-preview",
+ "gpt-4-turbo-preview",
+ "gpt-4-1106-preview",
+ "gpt-4-vision-preview",
+ "gpt-4",
+ "gpt-4-0314",
+ "gpt-4-0613",
+ "gpt-4-32k",
+ "gpt-4-32k-0314",
+ "gpt-4-32k-0613",
+ "gpt-3.5-turbo",
+ "gpt-3.5-turbo-16k",
+ "gpt-3.5-turbo-0613",
+ "gpt-3.5-turbo-1106",
+ "gpt-3.5-turbo-0125",
+ "gpt-3.5-turbo-16k-0613",
+ ]
+ x-oaiTypeLabel: string
+ nullable: true
+ instructions:
+ description: Override the default system message of the assistant.
This is useful for modifying the behavior on a per-run basis. + type: string + nullable: true + tools: + description: Override the tools the assistant can use for this run. This is useful for modifying the behavior on a per-run basis. + nullable: true + type: array + maxItems: 20 + items: + oneOf: + - $ref: "#/components/schemas/AssistantToolsCode" + - $ref: "#/components/schemas/AssistantToolsFileSearch" + - $ref: "#/components/schemas/AssistantToolsFunction" + tool_resources: + type: object + description: | + A set of resources that are used by the assistant's tools. The resources are specific to the type of tool. For example, the `code_interpreter` tool requires a list of file IDs, while the `file_search` tool requires a list of vector store IDs. + properties: + code_interpreter: + type: object + properties: + file_ids: + type: array + description: | + A list of [file](/docs/api-reference/files) IDs made available to the `code_interpreter` tool. There can be a maximum of 20 files associated with the tool. + default: [] + maxItems: 20 + items: + type: string + file_search: + type: object + properties: + vector_store_ids: + type: array + description: | + The ID of the [vector store](/docs/api-reference/vector-stores/object) attached to this assistant. There can be a maximum of 1 vector store attached to the assistant. 
+ maxItems: 1 + items: + type: string + nullable: true + metadata: + description: *metadata_description + type: object + x-oaiTypeLabel: map + nullable: true + temperature: + type: number + minimum: 0 + maximum: 2 + default: 1 + example: 1 + nullable: true + description: *run_temperature_description + top_p: + type: number + minimum: 0 + maximum: 1 + default: 1 + example: 1 + nullable: true + description: *run_top_p_description + stream: + type: boolean + nullable: true + description: | + If `true`, returns a stream of events that happen during the Run as server-sent events, terminating when the Run enters a terminal state with a `data: [DONE]` message. + max_prompt_tokens: + type: integer + nullable: true + description: | + The maximum number of prompt tokens that may be used over the course of the run. The run will make a best effort to use only the number of prompt tokens specified, across multiple turns of the run. If the run exceeds the number of prompt tokens specified, the run will end with status `incomplete`. See `incomplete_details` for more info. + minimum: 256 + max_completion_tokens: + type: integer + nullable: true + description: | + The maximum number of completion tokens that may be used over the course of the run. The run will make a best effort to use only the number of completion tokens specified, across multiple turns of the run. If the run exceeds the number of completion tokens specified, the run will end with status `incomplete`. See `incomplete_details` for more info. 
+ minimum: 256 + truncation_strategy: + $ref: "#/components/schemas/TruncationObject" + nullable: true + tool_choice: + $ref: "#/components/schemas/AssistantsApiToolChoiceOption" + nullable: true + parallel_tool_calls: + $ref: "#/components/schemas/ParallelToolCalls" + response_format: + $ref: "#/components/schemas/AssistantsApiResponseFormatOption" + nullable: true + required: + - thread_id + - assistant_id + + ThreadObject: + type: object + title: Thread + description: Represents a thread that contains [messages](/docs/api-reference/messages). + properties: + id: + description: The identifier, which can be referenced in API endpoints. + type: string + object: + description: The object type, which is always `thread`. + type: string + enum: ["thread"] + created_at: + description: The Unix timestamp (in seconds) for when the thread was created. + type: integer + tool_resources: + type: object + description: | + A set of resources that are made available to the assistant's tools in this thread. The resources are specific to the type of tool. For example, the `code_interpreter` tool requires a list of file IDs, while the `file_search` tool requires a list of vector store IDs. + properties: + code_interpreter: + type: object + properties: + file_ids: + type: array + description: | + A list of [file](/docs/api-reference/files) IDs made available to the `code_interpreter` tool. There can be a maximum of 20 files associated with the tool. + default: [] + maxItems: 20 + items: + type: string + file_search: + type: object + properties: + vector_store_ids: + type: array + description: | + The [vector store](/docs/api-reference/vector-stores/object) attached to this thread. There can be a maximum of 1 vector store attached to the thread. 
+ maxItems: 1 + items: + type: string + nullable: true + metadata: + description: *metadata_description + type: object + x-oaiTypeLabel: map + nullable: true + required: + - id + - object + - created_at + - tool_resources + - metadata + x-oaiMeta: + name: The thread object + beta: true + example: | + { + "id": "thread_abc123", + "object": "thread", + "created_at": 1698107661, + "metadata": {} + } + + CreateThreadRequest: + type: object + additionalProperties: false + properties: + messages: + description: A list of [messages](/docs/api-reference/messages) to start the thread with. + type: array + items: + $ref: "#/components/schemas/CreateMessageRequest" + tool_resources: + type: object + description: | + A set of resources that are made available to the assistant's tools in this thread. The resources are specific to the type of tool. For example, the `code_interpreter` tool requires a list of file IDs, while the `file_search` tool requires a list of vector store IDs. + properties: + code_interpreter: + type: object + properties: + file_ids: + type: array + description: | + A list of [file](/docs/api-reference/files) IDs made available to the `code_interpreter` tool. There can be a maximum of 20 files associated with the tool. + default: [] + maxItems: 20 + items: + type: string + file_search: + type: object + properties: + vector_store_ids: + type: array + description: | + The [vector store](/docs/api-reference/vector-stores/object) attached to this thread. There can be a maximum of 1 vector store attached to the thread. + maxItems: 1 + items: + type: string + vector_stores: + type: array + description: | + A helper to create a [vector store](/docs/api-reference/vector-stores/object) with file_ids and attach it to this thread. There can be a maximum of 1 vector store attached to the thread. + maxItems: 1 + items: + type: object + properties: + file_ids: + type: array + description: | + A list of [file](/docs/api-reference/files) IDs to add to the vector store. 
There can be a maximum of 10000 files in a vector store.
+ maxItems: 10000
+ items:
+ type: string
+ chunking_strategy:
+ # Ideally we'd reuse the chunking strategy schema here, but it doesn't expand properly
+ type: object
+ description: The chunking strategy used to chunk the file(s). If not set, will use the `auto` strategy.
+ oneOf:
+ - type: object
+ title: Auto Chunking Strategy
+ description: The default strategy. This strategy currently uses a `max_chunk_size_tokens` of `800` and `chunk_overlap_tokens` of `400`.
+ additionalProperties: false
+ properties:
+ type:
+ type: string
+ description: Always `auto`.
+ enum: ["auto"]
+ required:
+ - type
+ - type: object
+ title: Static Chunking Strategy
+ additionalProperties: false
+ properties:
+ type:
+ type: string
+ description: Always `static`.
+ enum: ["static"]
+ static:
+ type: object
+ additionalProperties: false
+ properties:
+ max_chunk_size_tokens:
+ type: integer
+ minimum: 100
+ maximum: 4096
+ description: The maximum number of tokens in each chunk. The default value is `800`. The minimum value is `100` and the maximum value is `4096`.
+ chunk_overlap_tokens:
+ type: integer
+ description: |
+ The number of tokens that overlap between chunks. The default value is `400`.
+
+ Note that the overlap must not exceed half of `max_chunk_size_tokens`.
+ required:
+ - max_chunk_size_tokens
+ - chunk_overlap_tokens
+ required:
+ - type
+ - static
+ x-oaiExpandable: true
+ metadata:
+ type: object
+ description: |
+ Set of 16 key-value pairs that can be attached to a vector store. This can be useful for storing additional information about the vector store in a structured format. Keys can be a maximum of 64 characters long and values can be a maximum of 512 characters long.
+ x-oaiTypeLabel: map + x-oaiExpandable: true + oneOf: + - required: [vector_store_ids] + - required: [vector_stores] + nullable: true + metadata: + description: *metadata_description + type: object + x-oaiTypeLabel: map + nullable: true + + ModifyThreadRequest: + type: object + additionalProperties: false + properties: + tool_resources: + type: object + description: | + A set of resources that are made available to the assistant's tools in this thread. The resources are specific to the type of tool. For example, the `code_interpreter` tool requires a list of file IDs, while the `file_search` tool requires a list of vector store IDs. + properties: + code_interpreter: + type: object + properties: + file_ids: + type: array + description: | + A list of [file](/docs/api-reference/files) IDs made available to the `code_interpreter` tool. There can be a maximum of 20 files associated with the tool. + default: [] + maxItems: 20 + items: + type: string + file_search: + type: object + properties: + vector_store_ids: + type: array + description: | + The [vector store](/docs/api-reference/vector-stores/object) attached to this thread. There can be a maximum of 1 vector store attached to the thread. 
+ maxItems: 1
+ items:
+ type: string
+ nullable: true
+ metadata:
+ description: *metadata_description
+ type: object
+ x-oaiTypeLabel: map
+ nullable: true
+
+ DeleteThreadResponse:
+ type: object
+ properties:
+ id:
+ type: string
+ deleted:
+ type: boolean
+ object:
+ type: string
+ enum: [thread.deleted]
+ required:
+ - id
+ - object
+ - deleted
+
+ ListThreadsResponse:
+ properties:
+ object:
+ type: string
+ example: "list"
+ data:
+ type: array
+ items:
+ $ref: "#/components/schemas/ThreadObject"
+ first_id:
+ type: string
+ example: "thread_abc123"
+ last_id:
+ type: string
+ example: "thread_abc456"
+ has_more:
+ type: boolean
+ example: false
+ required:
+ - object
+ - data
+ - first_id
+ - last_id
+ - has_more
+
+ MessageObject:
+ type: object
+ title: The message object
+ description: Represents a message within a [thread](/docs/api-reference/threads).
+ properties:
+ id:
+ description: The identifier, which can be referenced in API endpoints.
+ type: string
+ object:
+ description: The object type, which is always `thread.message`.
+ type: string
+ enum: ["thread.message"]
+ created_at:
+ description: The Unix timestamp (in seconds) for when the message was created.
+ type: integer
+ thread_id:
+ description: The [thread](/docs/api-reference/threads) ID that this message belongs to.
+ type: string
+ status:
+ description: The status of the message, which can be either `in_progress`, `incomplete`, or `completed`.
+ type: string
+ enum: ["in_progress", "incomplete", "completed"]
+ incomplete_details:
+ description: On an incomplete message, details about why the message is incomplete.
+ type: object
+ properties:
+ reason:
+ type: string
+ description: The reason the message is incomplete.
+ enum:
+ [
+ "content_filter",
+ "max_tokens",
+ "run_cancelled",
+ "run_expired",
+ "run_failed",
+ ]
+ nullable: true
+ required:
+ - reason
+ completed_at:
+ description: The Unix timestamp (in seconds) for when the message was completed.
+ type: integer + nullable: true + incomplete_at: + description: The Unix timestamp (in seconds) for when the message was marked as incomplete. + type: integer + nullable: true + role: + description: The entity that produced the message. One of `user` or `assistant`. + type: string + enum: ["user", "assistant"] + content: + description: The content of the message in array of text and/or images. + type: array + items: + oneOf: + - $ref: "#/components/schemas/MessageContentImageFileObject" + - $ref: "#/components/schemas/MessageContentImageUrlObject" + - $ref: "#/components/schemas/MessageContentTextObject" + - $ref: "#/components/schemas/MessageContentRefusalObject" + x-oaiExpandable: true + assistant_id: + description: If applicable, the ID of the [assistant](/docs/api-reference/assistants) that authored this message. + type: string + nullable: true + run_id: + description: The ID of the [run](/docs/api-reference/runs) associated with the creation of this message. Value is `null` when messages are created manually using the create message or create thread endpoints. + type: string + nullable: true + attachments: + type: array + items: + type: object + properties: + file_id: + type: string + description: The ID of the file to attach to the message. + tools: + description: The tools to add this file to. + type: array + items: + oneOf: + - $ref: "#/components/schemas/AssistantToolsCode" + - $ref: "#/components/schemas/AssistantToolsFileSearchTypeOnly" + x-oaiExpandable: true + description: A list of files attached to the message, and the tools they were added to. 
+ nullable: true + metadata: + description: *metadata_description + type: object + x-oaiTypeLabel: map + nullable: true + required: + - id + - object + - created_at + - thread_id + - status + - incomplete_details + - completed_at + - incomplete_at + - role + - content + - assistant_id + - run_id + - attachments + - metadata + x-oaiMeta: + name: The message object + beta: true + example: | + { + "id": "msg_abc123", + "object": "thread.message", + "created_at": 1698983503, + "thread_id": "thread_abc123", + "role": "assistant", + "content": [ + { + "type": "text", + "text": { + "value": "Hi! How can I help you today?", + "annotations": [] + } + } + ], + "assistant_id": "asst_abc123", + "run_id": "run_abc123", + "attachments": [], + "metadata": {} + } + + MessageDeltaObject: + type: object + title: Message delta object + description: | + Represents a message delta i.e. any changed fields on a message during streaming. + properties: + id: + description: The identifier of the message, which can be referenced in API endpoints. + type: string + object: + description: The object type, which is always `thread.message.delta`. + type: string + enum: ["thread.message.delta"] + delta: + description: The delta containing the fields that have changed on the Message. + type: object + properties: + role: + description: The entity that produced the message. One of `user` or `assistant`. + type: string + enum: ["user", "assistant"] + content: + description: The content of the message in array of text and/or images. 
+ type: array + items: + oneOf: + - $ref: "#/components/schemas/MessageDeltaContentImageFileObject" + - $ref: "#/components/schemas/MessageDeltaContentTextObject" + - $ref: "#/components/schemas/MessageDeltaContentRefusalObject" + - $ref: "#/components/schemas/MessageDeltaContentImageUrlObject" + x-oaiExpandable: true + required: + - id + - object + - delta + x-oaiMeta: + name: The message delta object + beta: true + example: | + { + "id": "msg_123", + "object": "thread.message.delta", + "delta": { + "content": [ + { + "index": 0, + "type": "text", + "text": { "value": "Hello", "annotations": [] } + } + ] + } + } + + CreateMessageRequest: + type: object + additionalProperties: false + required: + - role + - content + properties: + role: + type: string + enum: ["user", "assistant"] + description: | + The role of the entity that is creating the message. Allowed values include: + - `user`: Indicates the message is sent by an actual user and should be used in most cases to represent user-generated messages. + - `assistant`: Indicates the message is generated by the assistant. Use this value to insert messages from the assistant into the conversation. + content: + oneOf: + - type: string + description: The text contents of the message. + title: Text content + - type: array + description: An array of content parts with a defined type, each can be of type `text` or images can be passed with `image_url` or `image_file`. Image types are only supported on [Vision-compatible models](/docs/models/overview). + title: Array of content parts + items: + oneOf: + - $ref: "#/components/schemas/MessageContentImageFileObject" + - $ref: "#/components/schemas/MessageContentImageUrlObject" + - $ref: "#/components/schemas/MessageRequestContentTextObject" + x-oaiExpandable: true + minItems: 1 + x-oaiExpandable: true + attachments: + type: array + items: + type: object + properties: + file_id: + type: string + description: The ID of the file to attach to the message. 
+ tools: + description: The tools to add this file to. + type: array + items: + oneOf: + - $ref: "#/components/schemas/AssistantToolsCode" + - $ref: "#/components/schemas/AssistantToolsFileSearchTypeOnly" + x-oaiExpandable: true + description: A list of files attached to the message, and the tools they should be added to. + required: + - file_id + - tools + nullable: true + metadata: + description: *metadata_description + type: object + x-oaiTypeLabel: map + nullable: true + + ModifyMessageRequest: + type: object + additionalProperties: false + properties: + metadata: + description: *metadata_description + type: object + x-oaiTypeLabel: map + nullable: true + + DeleteMessageResponse: + type: object + properties: + id: + type: string + deleted: + type: boolean + object: + type: string + enum: [thread.message.deleted] + required: + - id + - object + - deleted + + ListMessagesResponse: + properties: + object: + type: string + example: "list" + data: + type: array + items: + $ref: "#/components/schemas/MessageObject" + first_id: + type: string + example: "msg_abc123" + last_id: + type: string + example: "msg_abc123" + has_more: + type: boolean + example: false + required: + - object + - data + - first_id + - last_id + - has_more + + MessageContentImageFileObject: + title: Image file + type: object + description: References an image [File](/docs/api-reference/files) in the content of a message. + properties: + type: + description: Always `image_file`. + type: string + enum: ["image_file"] + image_file: + type: object + properties: + file_id: + description: The [File](/docs/api-reference/files) ID of the image in the message content. Set `purpose="vision"` when uploading the File if you need to later display the file content. + type: string + detail: + type: string + description: Specifies the detail level of the image if specified by the user. `low` uses fewer tokens, you can opt in to high resolution using `high`. 
+ enum: ["auto", "low", "high"] + default: "auto" + required: + - file_id + required: + - type + - image_file + + MessageDeltaContentImageFileObject: + title: Image file + type: object + description: References an image [File](/docs/api-reference/files) in the content of a message. + properties: + index: + type: integer + description: The index of the content part in the message. + type: + description: Always `image_file`. + type: string + enum: ["image_file"] + image_file: + type: object + properties: + file_id: + description: The [File](/docs/api-reference/files) ID of the image in the message content. Set `purpose="vision"` when uploading the File if you need to later display the file content. + type: string + detail: + type: string + description: Specifies the detail level of the image if specified by the user. `low` uses fewer tokens, you can opt in to high resolution using `high`. + enum: ["auto", "low", "high"] + default: "auto" + required: + - index + - type + + MessageContentImageUrlObject: + title: Image URL + type: object + description: References an image URL in the content of a message. + properties: + type: + type: string + enum: ["image_url"] + description: The type of the content part. + image_url: + type: object + properties: + url: + type: string + description: "The external URL of the image, must be a supported image types: jpeg, jpg, png, gif, webp." + format: uri + detail: + type: string + description: Specifies the detail level of the image. `low` uses fewer tokens, you can opt in to high resolution using `high`. Default value is `auto` + enum: ["auto", "low", "high"] + default: "auto" + required: + - url + required: + - type + - image_url + + MessageDeltaContentImageUrlObject: + title: Image URL + type: object + description: References an image URL in the content of a message. + properties: + index: + type: integer + description: The index of the content part in the message. + type: + description: Always `image_url`. 
+ type: string + enum: ["image_url"] + image_url: + type: object + properties: + url: + description: "The URL of the image; must be one of the supported image types: jpeg, jpg, png, gif, webp." + type: string + detail: + type: string + description: Specifies the detail level of the image. `low` uses fewer tokens, you can opt in to high resolution using `high`. + enum: ["auto", "low", "high"] + default: "auto" + required: + - index + - type + + MessageContentTextObject: + title: Text + type: object + description: The text content that is part of a message. + properties: + type: + description: Always `text`. + type: string + enum: ["text"] + text: + type: object + properties: + value: + description: The data that makes up the text. + type: string + annotations: + type: array + items: + oneOf: + - $ref: "#/components/schemas/MessageContentTextAnnotationsFileCitationObject" + - $ref: "#/components/schemas/MessageContentTextAnnotationsFilePathObject" + x-oaiExpandable: true + required: + - value + - annotations + required: + - type + - text + + MessageContentRefusalObject: + title: Refusal + type: object + description: The refusal content generated by the assistant. + properties: + type: + description: Always `refusal`. + type: string + enum: ["refusal"] + refusal: + type: string + nullable: false + required: + - type + - refusal + + MessageRequestContentTextObject: + title: Text + type: object + description: The text content that is part of a message. + properties: + type: + description: Always `text`. + type: string + enum: ["text"] + text: + type: string + description: Text content to be sent to the model + required: + - type + - text + + MessageContentTextAnnotationsFileCitationObject: + title: File citation + type: object + description: A citation within the message that points to a specific quote from a specific File associated with the assistant or the message. Generated when the assistant uses the "file_search" tool to search files. 
+ properties: + type: + description: Always `file_citation`. + type: string + enum: ["file_citation"] + text: + description: The text in the message content that needs to be replaced. + type: string + file_citation: + type: object + properties: + file_id: + description: The ID of the specific File the citation is from. + type: string + required: + - file_id + start_index: + type: integer + minimum: 0 + end_index: + type: integer + minimum: 0 + required: + - type + - text + - file_citation + - start_index + - end_index + + MessageContentTextAnnotationsFilePathObject: + title: File path + type: object + description: A URL for the file that's generated when the assistant used the `code_interpreter` tool to generate a file. + properties: + type: + description: Always `file_path`. + type: string + enum: ["file_path"] + text: + description: The text in the message content that needs to be replaced. + type: string + file_path: + type: object + properties: + file_id: + description: The ID of the file that was generated. + type: string + required: + - file_id + start_index: + type: integer + minimum: 0 + end_index: + type: integer + minimum: 0 + required: + - type + - text + - file_path + - start_index + - end_index + + MessageDeltaContentTextObject: + title: Text + type: object + description: The text content that is part of a message. + properties: + index: + type: integer + description: The index of the content part in the message. + type: + description: Always `text`. + type: string + enum: ["text"] + text: + type: object + properties: + value: + description: The data that makes up the text. 
+ type: string + annotations: + type: array + items: + oneOf: + - $ref: "#/components/schemas/MessageDeltaContentTextAnnotationsFileCitationObject" + - $ref: "#/components/schemas/MessageDeltaContentTextAnnotationsFilePathObject" + x-oaiExpandable: true + required: + - index + - type + + MessageDeltaContentRefusalObject: + title: Refusal + type: object + description: The refusal content that is part of a message. + properties: + index: + type: integer + description: The index of the refusal part in the message. + type: + description: Always `refusal`. + type: string + enum: ["refusal"] + refusal: + type: string + required: + - index + - type + + + MessageDeltaContentTextAnnotationsFileCitationObject: + title: File citation + type: object + description: A citation within the message that points to a specific quote from a specific File associated with the assistant or the message. Generated when the assistant uses the "file_search" tool to search files. + properties: + index: + type: integer + description: The index of the annotation in the text content part. + type: + description: Always `file_citation`. + type: string + enum: ["file_citation"] + text: + description: The text in the message content that needs to be replaced. + type: string + file_citation: + type: object + properties: + file_id: + description: The ID of the specific File the citation is from. + type: string + quote: + description: The specific quote in the file. + type: string + start_index: + type: integer + minimum: 0 + end_index: + type: integer + minimum: 0 + required: + - index + - type + + MessageDeltaContentTextAnnotationsFilePathObject: + title: File path + type: object + description: A URL for the file that's generated when the assistant used the `code_interpreter` tool to generate a file. + properties: + index: + type: integer + description: The index of the annotation in the text content part. + type: + description: Always `file_path`. 
+ type: string + enum: ["file_path"] + text: + description: The text in the message content that needs to be replaced. + type: string + file_path: + type: object + properties: + file_id: + description: The ID of the file that was generated. + type: string + start_index: + type: integer + minimum: 0 + end_index: + type: integer + minimum: 0 + required: + - index + - type + + RunStepObject: + type: object + title: Run steps + description: | + Represents a step in execution of a run. + properties: + id: + description: The identifier of the run step, which can be referenced in API endpoints. + type: string + object: + description: The object type, which is always `thread.run.step`. + type: string + enum: ["thread.run.step"] + created_at: + description: The Unix timestamp (in seconds) for when the run step was created. + type: integer + assistant_id: + description: The ID of the [assistant](/docs/api-reference/assistants) associated with the run step. + type: string + thread_id: + description: The ID of the [thread](/docs/api-reference/threads) that was run. + type: string + run_id: + description: The ID of the [run](/docs/api-reference/runs) that this run step is a part of. + type: string + type: + description: The type of run step, which can be either `message_creation` or `tool_calls`. + type: string + enum: ["message_creation", "tool_calls"] + status: + description: The status of the run step, which can be either `in_progress`, `cancelled`, `failed`, `completed`, or `expired`. + type: string + enum: ["in_progress", "cancelled", "failed", "completed", "expired"] + step_details: + type: object + description: The details of the run step. + oneOf: + - $ref: "#/components/schemas/RunStepDetailsMessageCreationObject" + - $ref: "#/components/schemas/RunStepDetailsToolCallsObject" + x-oaiExpandable: true + last_error: + type: object + description: The last error associated with this run step. Will be `null` if there are no errors. 
+ nullable: true + properties: + code: + type: string + description: One of `server_error` or `rate_limit_exceeded`. + enum: ["server_error", "rate_limit_exceeded"] + message: + type: string + description: A human-readable description of the error. + required: + - code + - message + expired_at: + description: The Unix timestamp (in seconds) for when the run step expired. A step is considered expired if the parent run is expired. + type: integer + nullable: true + cancelled_at: + description: The Unix timestamp (in seconds) for when the run step was cancelled. + type: integer + nullable: true + failed_at: + description: The Unix timestamp (in seconds) for when the run step failed. + type: integer + nullable: true + completed_at: + description: The Unix timestamp (in seconds) for when the run step completed. + type: integer + nullable: true + metadata: + description: *metadata_description + type: object + x-oaiTypeLabel: map + nullable: true + usage: + $ref: "#/components/schemas/RunStepCompletionUsage" + required: + - id + - object + - created_at + - assistant_id + - thread_id + - run_id + - type + - status + - step_details + - last_error + - expired_at + - cancelled_at + - failed_at + - completed_at + - metadata + - usage + x-oaiMeta: + name: The run step object + beta: true + example: *run_step_object_example + + RunStepDeltaObject: + type: object + title: Run step delta object + description: | + Represents a run step delta i.e. any changed fields on a run step during streaming. + properties: + id: + description: The identifier of the run step, which can be referenced in API endpoints. + type: string + object: + description: The object type, which is always `thread.run.step.delta`. + type: string + enum: ["thread.run.step.delta"] + delta: + description: The delta containing the fields that have changed on the run step. + type: object + properties: + step_details: + type: object + description: The details of the run step. 
+ oneOf: + - $ref: "#/components/schemas/RunStepDeltaStepDetailsMessageCreationObject" + - $ref: "#/components/schemas/RunStepDeltaStepDetailsToolCallsObject" + x-oaiExpandable: true + required: + - id + - object + - delta + x-oaiMeta: + name: The run step delta object + beta: true + example: | + { + "id": "step_123", + "object": "thread.run.step.delta", + "delta": { + "step_details": { + "type": "tool_calls", + "tool_calls": [ + { + "index": 0, + "id": "call_123", + "type": "code_interpreter", + "code_interpreter": { "input": "", "outputs": [] } + } + ] + } + } + } + + ListRunStepsResponse: + properties: + object: + type: string + example: "list" + data: + type: array + items: + $ref: "#/components/schemas/RunStepObject" + first_id: + type: string + example: "step_abc123" + last_id: + type: string + example: "step_abc456" + has_more: + type: boolean + example: false + required: + - object + - data + - first_id + - last_id + - has_more + + RunStepDetailsMessageCreationObject: + title: Message creation + type: object + description: Details of the message creation by the run step. + properties: + type: + description: Always `message_creation`. + type: string + enum: ["message_creation"] + message_creation: + type: object + properties: + message_id: + type: string + description: The ID of the message that was created by this run step. + required: + - message_id + required: + - type + - message_creation + + RunStepDeltaStepDetailsMessageCreationObject: + title: Message creation + type: object + description: Details of the message creation by the run step. + properties: + type: + description: Always `message_creation`. + type: string + enum: ["message_creation"] + message_creation: + type: object + properties: + message_id: + type: string + description: The ID of the message that was created by this run step. + required: + - type + + RunStepDetailsToolCallsObject: + title: Tool calls + type: object + description: Details of the tool call. 
+ properties: + type: + description: Always `tool_calls`. + type: string + enum: ["tool_calls"] + tool_calls: + type: array + description: | + An array of tool calls the run step was involved in. These can be associated with one of three types of tools: `code_interpreter`, `file_search`, or `function`. + items: + oneOf: + - $ref: "#/components/schemas/RunStepDetailsToolCallsCodeObject" + - $ref: "#/components/schemas/RunStepDetailsToolCallsFileSearchObject" + - $ref: "#/components/schemas/RunStepDetailsToolCallsFunctionObject" + x-oaiExpandable: true + required: + - type + - tool_calls + + RunStepDeltaStepDetailsToolCallsObject: + title: Tool calls + type: object + description: Details of the tool call. + properties: + type: + description: Always `tool_calls`. + type: string + enum: ["tool_calls"] + tool_calls: + type: array + description: | + An array of tool calls the run step was involved in. These can be associated with one of three types of tools: `code_interpreter`, `file_search`, or `function`. + items: + oneOf: + - $ref: "#/components/schemas/RunStepDeltaStepDetailsToolCallsCodeObject" + - $ref: "#/components/schemas/RunStepDeltaStepDetailsToolCallsFileSearchObject" + - $ref: "#/components/schemas/RunStepDeltaStepDetailsToolCallsFunctionObject" + x-oaiExpandable: true + required: + - type + + RunStepDetailsToolCallsCodeObject: + title: Code Interpreter tool call + type: object + description: Details of the Code Interpreter tool call the run step was involved in. + properties: + id: + type: string + description: The ID of the tool call. + type: + type: string + description: The type of tool call. This is always going to be `code_interpreter` for this type of tool call. + enum: ["code_interpreter"] + code_interpreter: + type: object + description: The Code Interpreter tool call definition. + required: + - input + - outputs + properties: + input: + type: string + description: The input to the Code Interpreter tool call. 
+ outputs: + type: array + description: The outputs from the Code Interpreter tool call. Code Interpreter can output one or more items, including text (`logs`) or images (`image`). Each of these is represented by a different object type. + items: + type: object + oneOf: + - $ref: "#/components/schemas/RunStepDetailsToolCallsCodeOutputLogsObject" + - $ref: "#/components/schemas/RunStepDetailsToolCallsCodeOutputImageObject" + x-oaiExpandable: true + required: + - id + - type + - code_interpreter + + RunStepDeltaStepDetailsToolCallsCodeObject: + title: Code interpreter tool call + type: object + description: Details of the Code Interpreter tool call the run step was involved in. + properties: + index: + type: integer + description: The index of the tool call in the tool calls array. + id: + type: string + description: The ID of the tool call. + type: + type: string + description: The type of tool call. This is always going to be `code_interpreter` for this type of tool call. + enum: ["code_interpreter"] + code_interpreter: + type: object + description: The Code Interpreter tool call definition. + properties: + input: + type: string + description: The input to the Code Interpreter tool call. + outputs: + type: array + description: The outputs from the Code Interpreter tool call. Code Interpreter can output one or more items, including text (`logs`) or images (`image`). Each of these is represented by a different object type. + items: + type: object + oneOf: + - $ref: "#/components/schemas/RunStepDeltaStepDetailsToolCallsCodeOutputLogsObject" + - $ref: "#/components/schemas/RunStepDeltaStepDetailsToolCallsCodeOutputImageObject" + x-oaiExpandable: true + required: + - index + - type + + RunStepDetailsToolCallsCodeOutputLogsObject: + title: Code Interpreter log output + type: object + description: Text output from the Code Interpreter tool call as part of a run step. + properties: + type: + description: Always `logs`. 
+ type: string + enum: ["logs"] + logs: + type: string + description: The text output from the Code Interpreter tool call. + required: + - type + - logs + + RunStepDeltaStepDetailsToolCallsCodeOutputLogsObject: + title: Code interpreter log output + type: object + description: Text output from the Code Interpreter tool call as part of a run step. + properties: + index: + type: integer + description: The index of the output in the outputs array. + type: + description: Always `logs`. + type: string + enum: ["logs"] + logs: + type: string + description: The text output from the Code Interpreter tool call. + required: + - index + - type + + RunStepDetailsToolCallsCodeOutputImageObject: + title: Code Interpreter image output + type: object + properties: + type: + description: Always `image`. + type: string + enum: ["image"] + image: + type: object + properties: + file_id: + description: The [file](/docs/api-reference/files) ID of the image. + type: string + required: + - file_id + required: + - type + - image + + RunStepDeltaStepDetailsToolCallsCodeOutputImageObject: + title: Code interpreter image output + type: object + properties: + index: + type: integer + description: The index of the output in the outputs array. + type: + description: Always `image`. + type: string + enum: ["image"] + image: + type: object + properties: + file_id: + description: The [file](/docs/api-reference/files) ID of the image. + type: string + required: + - index + - type + + RunStepDetailsToolCallsFileSearchObject: + title: File search tool call + type: object + properties: + id: + type: string + description: The ID of the tool call object. + type: + type: string + description: The type of tool call. This is always going to be `file_search` for this type of tool call. + enum: ["file_search"] + file_search: + type: object + description: For now, this is always going to be an empty object. 
+ x-oaiTypeLabel: map + required: + - id + - type + - file_search + + RunStepDeltaStepDetailsToolCallsFileSearchObject: + title: File search tool call + type: object + properties: + index: + type: integer + description: The index of the tool call in the tool calls array. + id: + type: string + description: The ID of the tool call object. + type: + type: string + description: The type of tool call. This is always going to be `file_search` for this type of tool call. + enum: ["file_search"] + file_search: + type: object + description: For now, this is always going to be an empty object. + x-oaiTypeLabel: map + required: + - index + - type + - file_search + + RunStepDetailsToolCallsFunctionObject: + type: object + title: Function tool call + properties: + id: + type: string + description: The ID of the tool call object. + type: + type: string + description: The type of tool call. This is always going to be `function` for this type of tool call. + enum: ["function"] + function: + type: object + description: The definition of the function that was called. + properties: + name: + type: string + description: The name of the function. + arguments: + type: string + description: The arguments passed to the function. + output: + type: string + description: The output of the function. This will be `null` if the outputs have not been [submitted](/docs/api-reference/runs/submitToolOutputs) yet. + nullable: true + required: + - name + - arguments + - output + required: + - id + - type + - function + + RunStepDeltaStepDetailsToolCallsFunctionObject: + type: object + title: Function tool call + properties: + index: + type: integer + description: The index of the tool call in the tool calls array. + id: + type: string + description: The ID of the tool call object. + type: + type: string + description: The type of tool call. This is always going to be `function` for this type of tool call. 
+ enum: ["function"] + function: + type: object + description: The definition of the function that was called. + properties: + name: + type: string + description: The name of the function. + arguments: + type: string + description: The arguments passed to the function. + output: + type: string + description: The output of the function. This will be `null` if the outputs have not been [submitted](/docs/api-reference/runs/submitToolOutputs) yet. + nullable: true + required: + - index + - type + + VectorStoreExpirationAfter: + type: object + title: Vector store expiration policy + description: The expiration policy for a vector store. + properties: + anchor: + description: "Anchor timestamp after which the expiration policy applies. Supported anchors: `last_active_at`." + type: string + enum: ["last_active_at"] + days: + description: The number of days after the anchor time that the vector store will expire. + type: integer + minimum: 1 + maximum: 365 + required: + - anchor + - days + + VectorStoreObject: + type: object + title: Vector store + description: A vector store is a collection of processed files can be used by the `file_search` tool. + properties: + id: + description: The identifier, which can be referenced in API endpoints. + type: string + object: + description: The object type, which is always `vector_store`. + type: string + enum: ["vector_store"] + created_at: + description: The Unix timestamp (in seconds) for when the vector store was created. + type: integer + name: + description: The name of the vector store. + type: string + usage_bytes: + description: The total number of bytes used by the files in the vector store. + type: integer + file_counts: + type: object + properties: + in_progress: + description: The number of files that are currently being processed. + type: integer + completed: + description: The number of files that have been successfully processed. + type: integer + failed: + description: The number of files that have failed to process. 
+ type: integer + cancelled: + description: The number of files that were cancelled. + type: integer + total: + description: The total number of files. + type: integer + required: + - in_progress + - completed + - failed + - cancelled + - total + status: + description: The status of the vector store, which can be either `expired`, `in_progress`, or `completed`. A status of `completed` indicates that the vector store is ready for use. + type: string + enum: ["expired", "in_progress", "completed"] + expires_after: + $ref: "#/components/schemas/VectorStoreExpirationAfter" + expires_at: + description: The Unix timestamp (in seconds) for when the vector store will expire. + type: integer + nullable: true + last_active_at: + description: The Unix timestamp (in seconds) for when the vector store was last active. + type: integer + nullable: true + metadata: + description: *metadata_description + type: object + x-oaiTypeLabel: map + nullable: true + required: + - id + - object + - usage_bytes + - created_at + - status + - last_active_at + - name + - file_counts + - metadata + x-oaiMeta: + name: The vector store object + beta: true + example: | + { + "id": "vs_123", + "object": "vector_store", + "created_at": 1698107661, + "usage_bytes": 123456, + "last_active_at": 1698107661, + "name": "my_vector_store", + "status": "completed", + "file_counts": { + "in_progress": 0, + "completed": 100, + "cancelled": 0, + "failed": 0, + "total": 100 + }, + "metadata": {}, + "last_used_at": 1698107661 + } + + CreateVectorStoreRequest: + type: object + additionalProperties: false + properties: + file_ids: + description: A list of [File](/docs/api-reference/files) IDs that the vector store should use. Useful for tools like `file_search` that can access files. + type: array + maxItems: 500 + items: + type: string + name: + description: The name of the vector store. 
+ type: string + expires_after: + $ref: "#/components/schemas/VectorStoreExpirationAfter" + chunking_strategy: + type: object + description: The chunking strategy used to chunk the file(s). If not set, will use the `auto` strategy. Only applicable if `file_ids` is non-empty. + oneOf: + - $ref: "#/components/schemas/AutoChunkingStrategyRequestParam" + - $ref: "#/components/schemas/StaticChunkingStrategyRequestParam" + x-oaiExpandable: true + metadata: + description: *metadata_description + type: object + x-oaiTypeLabel: map + nullable: true + + UpdateVectorStoreRequest: + type: object + additionalProperties: false + properties: + name: + description: The name of the vector store. + type: string + nullable: true + expires_after: + $ref: "#/components/schemas/VectorStoreExpirationAfter" + nullable: true + metadata: + description: *metadata_description + type: object + x-oaiTypeLabel: map + nullable: true + + ListVectorStoresResponse: + properties: + object: + type: string + example: "list" + data: + type: array + items: + $ref: "#/components/schemas/VectorStoreObject" + first_id: + type: string + example: "vs_abc123" + last_id: + type: string + example: "vs_abc456" + has_more: + type: boolean + example: false + required: + - object + - data + - first_id + - last_id + - has_more + + DeleteVectorStoreResponse: + type: object + properties: + id: + type: string + deleted: + type: boolean + object: + type: string + enum: [vector_store.deleted] + required: + - id + - object + - deleted + + VectorStoreFileObject: + type: object + title: Vector store files + description: A list of files attached to a vector store. + properties: + id: + description: The identifier, which can be referenced in API endpoints. + type: string + object: + description: The object type, which is always `vector_store.file`. + type: string + enum: ["vector_store.file"] + usage_bytes: + description: The total vector store usage in bytes. Note that this may be different from the original file size. 
+ type: integer + created_at: + description: The Unix timestamp (in seconds) for when the vector store file was created. + type: integer + vector_store_id: + description: The ID of the [vector store](/docs/api-reference/vector-stores/object) that the [File](/docs/api-reference/files) is attached to. + type: string + status: + description: The status of the vector store file, which can be either `in_progress`, `completed`, `cancelled`, or `failed`. The status `completed` indicates that the vector store file is ready for use. + type: string + enum: ["in_progress", "completed", "cancelled", "failed"] + last_error: + type: object + description: The last error associated with this vector store file. Will be `null` if there are no errors. + nullable: true + properties: + code: + type: string + description: One of `server_error`, `unsupported_file`, or `invalid_file`. + enum: + [ + "server_error", + "unsupported_file", + "invalid_file", + ] + message: + type: string + description: A human-readable description of the error. + required: + - code + - message + chunking_strategy: + type: object + description: The strategy used to chunk the file. + oneOf: + - $ref: "#/components/schemas/StaticChunkingStrategyResponseParam" + - $ref: "#/components/schemas/OtherChunkingStrategyResponseParam" + x-oaiExpandable: true + required: + - id + - object + - usage_bytes + - created_at + - vector_store_id + - status + - last_error + x-oaiMeta: + name: The vector store file object + beta: true + example: | + { + "id": "file-abc123", + "object": "vector_store.file", + "usage_bytes": 1234, + "created_at": 1698107661, + "vector_store_id": "vs_abc123", + "status": "completed", + "last_error": null, + "chunking_strategy": { + "type": "static", + "static": { + "max_chunk_size_tokens": 800, + "chunk_overlap_tokens": 400 + } + } + } + + OtherChunkingStrategyResponseParam: + type: object + title: Other Chunking Strategy + description: This is returned when the chunking strategy is unknown. 
Typically, this is because the file was indexed before the `chunking_strategy` concept was introduced in the API. + additionalProperties: false + properties: + type: + type: string + description: Always `other`. + enum: ["other"] + required: + - type + + StaticChunkingStrategyResponseParam: + type: object + title: Static Chunking Strategy + additionalProperties: false + properties: + type: + type: string + description: Always `static`. + enum: ["static"] + static: + $ref: "#/components/schemas/StaticChunkingStrategy" + required: + - type + - static + + StaticChunkingStrategy: + type: object + additionalProperties: false + properties: + max_chunk_size_tokens: + type: integer + minimum: 100 + maximum: 4096 + description: The maximum number of tokens in each chunk. The default value is `800`. The minimum value is `100` and the maximum value is `4096`. + chunk_overlap_tokens: + type: integer + description: | + The number of tokens that overlap between chunks. The default value is `400`. + + Note that the overlap must not exceed half of `max_chunk_size_tokens`. + required: + - max_chunk_size_tokens + - chunk_overlap_tokens + + AutoChunkingStrategyRequestParam: + type: object + title: Auto Chunking Strategy + description: The default strategy. This strategy currently uses a `max_chunk_size_tokens` of `800` and `chunk_overlap_tokens` of `400`. + additionalProperties: false + properties: + type: + type: string + description: Always `auto`. + enum: ["auto"] + required: + - type + + StaticChunkingStrategyRequestParam: + type: object + title: Static Chunking Strategy + additionalProperties: false + properties: + type: + type: string + description: Always `static`. + enum: ["static"] + static: + $ref: "#/components/schemas/StaticChunkingStrategy" + required: + - type + - static + + ChunkingStrategyRequestParam: + type: object + description: The chunking strategy used to chunk the file(s). If not set, will use the `auto` strategy. 
+ oneOf: + - $ref: "#/components/schemas/AutoChunkingStrategyRequestParam" + - $ref: "#/components/schemas/StaticChunkingStrategyRequestParam" + x-oaiExpandable: true + + CreateVectorStoreFileRequest: + type: object + additionalProperties: false + properties: + file_id: + description: A [File](/docs/api-reference/files) ID that the vector store should use. Useful for tools like `file_search` that can access files. + type: string + chunking_strategy: + $ref: "#/components/schemas/ChunkingStrategyRequestParam" + required: + - file_id + + ListVectorStoreFilesResponse: + properties: + object: + type: string + example: "list" + data: + type: array + items: + $ref: "#/components/schemas/VectorStoreFileObject" + first_id: + type: string + example: "file-abc123" + last_id: + type: string + example: "file-abc456" + has_more: + type: boolean + example: false + required: + - object + - data + - first_id + - last_id + - has_more + + DeleteVectorStoreFileResponse: + type: object + properties: + id: + type: string + deleted: + type: boolean + object: + type: string + enum: [vector_store.file.deleted] + required: + - id + - object + - deleted + + VectorStoreFileBatchObject: + type: object + title: Vector store file batch + description: A batch of files attached to a vector store. + properties: + id: + description: The identifier, which can be referenced in API endpoints. + type: string + object: + description: The object type, which is always `vector_store.files_batch`. + type: string + enum: ["vector_store.files_batch"] + created_at: + description: The Unix timestamp (in seconds) for when the vector store files batch was created. + type: integer + vector_store_id: + description: The ID of the [vector store](/docs/api-reference/vector-stores/object) that the [File](/docs/api-reference/files) is attached to. + type: string + status: + description: The status of the vector store files batch, which can be either `in_progress`, `completed`, `cancelled` or `failed`. 
+ type: string
+ enum: ["in_progress", "completed", "cancelled", "failed"]
+ file_counts:
+ type: object
+ properties:
+ in_progress:
+ description: The number of files that are currently being processed.
+ type: integer
+ completed:
+ description: The number of files that have been processed.
+ type: integer
+ failed:
+ description: The number of files that have failed to process.
+ type: integer
+ cancelled:
+ description: The number of files that were cancelled.
+ type: integer
+ total:
+ description: The total number of files.
+ type: integer
+ required:
+ - in_progress
+ - completed
+ - cancelled
+ - failed
+ - total
+ required:
+ - id
+ - object
+ - created_at
+ - vector_store_id
+ - status
+ - file_counts
+ x-oaiMeta:
+ name: The vector store files batch object
+ beta: true
+ example: |
+ {
+ "id": "vsfb_123",
+ "object": "vector_store.files_batch",
+ "created_at": 1698107661,
+ "vector_store_id": "vs_abc123",
+ "status": "completed",
+ "file_counts": {
+ "in_progress": 0,
+ "completed": 100,
+ "failed": 0,
+ "cancelled": 0,
+ "total": 100
+ }
+ }
+
+ CreateVectorStoreFileBatchRequest:
+ type: object
+ additionalProperties: false
+ properties:
+ file_ids:
+ description: A list of [File](/docs/api-reference/files) IDs that the vector store should use. Useful for tools like `file_search` that can access files.
+ type: array
+ minItems: 1
+ maxItems: 500
+ items:
+ type: string
+ chunking_strategy:
+ $ref: "#/components/schemas/ChunkingStrategyRequestParam"
+ required:
+ - file_ids
+
+ AssistantStreamEvent:
+ description: |
+ Represents an event emitted when streaming a Run.
+
+ Each event in a server-sent events stream has an `event` and `data` property:
+
+ ```
+ event: thread.created
+ data: {"id": "thread_123", "object": "thread", ...}
+ ```
+
+ We emit events whenever a new object is created, transitions to a new state, or is being
+ streamed in parts (deltas).
For example, we emit `thread.run.created` when a new run
+ is created, `thread.run.completed` when a run completes, and so on. When an Assistant chooses
+ to create a message during a run, we emit a `thread.message.created` event, a
+ `thread.message.in_progress` event, many `thread.message.delta` events, and finally a
+ `thread.message.completed` event.
+
+ We may add additional events over time, so we recommend handling unknown events gracefully
+ in your code. See the [Assistants API quickstart](/docs/assistants/overview) to learn how to
+ integrate the Assistants API with streaming.
+ oneOf:
+ - $ref: "#/components/schemas/ThreadStreamEvent"
+ - $ref: "#/components/schemas/RunStreamEvent"
+ - $ref: "#/components/schemas/RunStepStreamEvent"
+ - $ref: "#/components/schemas/MessageStreamEvent"
+ - $ref: "#/components/schemas/ErrorEvent"
+ - $ref: "#/components/schemas/DoneEvent"
+ x-oaiMeta:
+ name: Assistant stream events
+ beta: true
+
+ ThreadStreamEvent:
+ oneOf:
+ - type: object
+ properties:
+ event:
+ type: string
+ enum: ["thread.created"]
+ data:
+ $ref: "#/components/schemas/ThreadObject"
+ required:
+ - event
+ - data
+ description: Occurs when a new [thread](/docs/api-reference/threads/object) is created.
+ x-oaiMeta:
+ dataDescription: "`data` is a [thread](/docs/api-reference/threads/object)"
+
+ RunStreamEvent:
+ oneOf:
+ - type: object
+ properties:
+ event:
+ type: string
+ enum: ["thread.run.created"]
+ data:
+ $ref: "#/components/schemas/RunObject"
+ required:
+ - event
+ - data
+ description: Occurs when a new [run](/docs/api-reference/runs/object) is created.
+ x-oaiMeta:
+ dataDescription: "`data` is a [run](/docs/api-reference/runs/object)"
+ - type: object
+ properties:
+ event:
+ type: string
+ enum: ["thread.run.queued"]
+ data:
+ $ref: "#/components/schemas/RunObject"
+ required:
+ - event
+ - data
+ description: Occurs when a [run](/docs/api-reference/runs/object) moves to a `queued` status.
+ x-oaiMeta: + dataDescription: "`data` is a [run](/docs/api-reference/runs/object)" + - type: object + properties: + event: + type: string + enum: ["thread.run.in_progress"] + data: + $ref: "#/components/schemas/RunObject" + required: + - event + - data + description: Occurs when a [run](/docs/api-reference/runs/object) moves to an `in_progress` status. + x-oaiMeta: + dataDescription: "`data` is a [run](/docs/api-reference/runs/object)" + - type: object + properties: + event: + type: string + enum: ["thread.run.requires_action"] + data: + $ref: "#/components/schemas/RunObject" + required: + - event + - data + description: Occurs when a [run](/docs/api-reference/runs/object) moves to a `requires_action` status. + x-oaiMeta: + dataDescription: "`data` is a [run](/docs/api-reference/runs/object)" + - type: object + properties: + event: + type: string + enum: ["thread.run.completed"] + data: + $ref: "#/components/schemas/RunObject" + required: + - event + - data + description: Occurs when a [run](/docs/api-reference/runs/object) is completed. + x-oaiMeta: + dataDescription: "`data` is a [run](/docs/api-reference/runs/object)" + - type: object + properties: + event: + type: string + enum: [ "thread.run.incomplete" ] + data: + $ref: "#/components/schemas/RunObject" + required: + - event + - data + description: Occurs when a [run](/docs/api-reference/runs/object) ends with status `incomplete`. + x-oaiMeta: + dataDescription: "`data` is a [run](/docs/api-reference/runs/object)" + - type: object + properties: + event: + type: string + enum: ["thread.run.failed"] + data: + $ref: "#/components/schemas/RunObject" + required: + - event + - data + description: Occurs when a [run](/docs/api-reference/runs/object) fails. 
+ x-oaiMeta: + dataDescription: "`data` is a [run](/docs/api-reference/runs/object)" + - type: object + properties: + event: + type: string + enum: ["thread.run.cancelling"] + data: + $ref: "#/components/schemas/RunObject" + required: + - event + - data + description: Occurs when a [run](/docs/api-reference/runs/object) moves to a `cancelling` status. + x-oaiMeta: + dataDescription: "`data` is a [run](/docs/api-reference/runs/object)" + - type: object + properties: + event: + type: string + enum: ["thread.run.cancelled"] + data: + $ref: "#/components/schemas/RunObject" + required: + - event + - data + description: Occurs when a [run](/docs/api-reference/runs/object) is cancelled. + x-oaiMeta: + dataDescription: "`data` is a [run](/docs/api-reference/runs/object)" + - type: object + properties: + event: + type: string + enum: ["thread.run.expired"] + data: + $ref: "#/components/schemas/RunObject" + required: + - event + - data + description: Occurs when a [run](/docs/api-reference/runs/object) expires. + x-oaiMeta: + dataDescription: "`data` is a [run](/docs/api-reference/runs/object)" + + RunStepStreamEvent: + oneOf: + - type: object + properties: + event: + type: string + enum: ["thread.run.step.created"] + data: + $ref: "#/components/schemas/RunStepObject" + required: + - event + - data + description: Occurs when a [run step](/docs/api-reference/runs/step-object) is created. + x-oaiMeta: + dataDescription: "`data` is a [run step](/docs/api-reference/runs/step-object)" + - type: object + properties: + event: + type: string + enum: ["thread.run.step.in_progress"] + data: + $ref: "#/components/schemas/RunStepObject" + required: + - event + - data + description: Occurs when a [run step](/docs/api-reference/runs/step-object) moves to an `in_progress` state. 
+ x-oaiMeta: + dataDescription: "`data` is a [run step](/docs/api-reference/runs/step-object)" + - type: object + properties: + event: + type: string + enum: ["thread.run.step.delta"] + data: + $ref: "#/components/schemas/RunStepDeltaObject" + required: + - event + - data + description: Occurs when parts of a [run step](/docs/api-reference/runs/step-object) are being streamed. + x-oaiMeta: + dataDescription: "`data` is a [run step delta](/docs/api-reference/assistants-streaming/run-step-delta-object)" + - type: object + properties: + event: + type: string + enum: ["thread.run.step.completed"] + data: + $ref: "#/components/schemas/RunStepObject" + required: + - event + - data + description: Occurs when a [run step](/docs/api-reference/runs/step-object) is completed. + x-oaiMeta: + dataDescription: "`data` is a [run step](/docs/api-reference/runs/step-object)" + - type: object + properties: + event: + type: string + enum: ["thread.run.step.failed"] + data: + $ref: "#/components/schemas/RunStepObject" + required: + - event + - data + description: Occurs when a [run step](/docs/api-reference/runs/step-object) fails. + x-oaiMeta: + dataDescription: "`data` is a [run step](/docs/api-reference/runs/step-object)" + - type: object + properties: + event: + type: string + enum: ["thread.run.step.cancelled"] + data: + $ref: "#/components/schemas/RunStepObject" + required: + - event + - data + description: Occurs when a [run step](/docs/api-reference/runs/step-object) is cancelled. + x-oaiMeta: + dataDescription: "`data` is a [run step](/docs/api-reference/runs/step-object)" + - type: object + properties: + event: + type: string + enum: ["thread.run.step.expired"] + data: + $ref: "#/components/schemas/RunStepObject" + required: + - event + - data + description: Occurs when a [run step](/docs/api-reference/runs/step-object) expires. 
+ x-oaiMeta: + dataDescription: "`data` is a [run step](/docs/api-reference/runs/step-object)" + + MessageStreamEvent: + oneOf: + - type: object + properties: + event: + type: string + enum: ["thread.message.created"] + data: + $ref: "#/components/schemas/MessageObject" + required: + - event + - data + description: Occurs when a [message](/docs/api-reference/messages/object) is created. + x-oaiMeta: + dataDescription: "`data` is a [message](/docs/api-reference/messages/object)" + - type: object + properties: + event: + type: string + enum: ["thread.message.in_progress"] + data: + $ref: "#/components/schemas/MessageObject" + required: + - event + - data + description: Occurs when a [message](/docs/api-reference/messages/object) moves to an `in_progress` state. + x-oaiMeta: + dataDescription: "`data` is a [message](/docs/api-reference/messages/object)" + - type: object + properties: + event: + type: string + enum: ["thread.message.delta"] + data: + $ref: "#/components/schemas/MessageDeltaObject" + required: + - event + - data + description: Occurs when parts of a [Message](/docs/api-reference/messages/object) are being streamed. + x-oaiMeta: + dataDescription: "`data` is a [message delta](/docs/api-reference/assistants-streaming/message-delta-object)" + - type: object + properties: + event: + type: string + enum: ["thread.message.completed"] + data: + $ref: "#/components/schemas/MessageObject" + required: + - event + - data + description: Occurs when a [message](/docs/api-reference/messages/object) is completed. + x-oaiMeta: + dataDescription: "`data` is a [message](/docs/api-reference/messages/object)" + - type: object + properties: + event: + type: string + enum: ["thread.message.incomplete"] + data: + $ref: "#/components/schemas/MessageObject" + required: + - event + - data + description: Occurs when a [message](/docs/api-reference/messages/object) ends before it is completed. 
+ x-oaiMeta: + dataDescription: "`data` is a [message](/docs/api-reference/messages/object)" + + ErrorEvent: + type: object + properties: + event: + type: string + enum: ["error"] + data: + $ref: "#/components/schemas/Error" + required: + - event + - data + description: Occurs when an [error](/docs/guides/error-codes/api-errors) occurs. This can happen due to an internal server error or a timeout. + x-oaiMeta: + dataDescription: "`data` is an [error](/docs/guides/error-codes/api-errors)" + + DoneEvent: + type: object + properties: + event: + type: string + enum: ["done"] + data: + type: string + enum: ["[DONE]"] + required: + - event + - data + description: Occurs when a stream ends. + x-oaiMeta: + dataDescription: "`data` is `[DONE]`" + + Batch: + type: object + properties: + id: + type: string + object: + type: string + enum: [batch] + description: The object type, which is always `batch`. + endpoint: + type: string + description: The OpenAI API endpoint used by the batch. + + errors: + type: object + properties: + object: + type: string + description: The object type, which is always `list`. + data: + type: array + items: + type: object + properties: + code: + type: string + description: An error code identifying the error type. + message: + type: string + description: A human-readable message providing more details about the error. + param: + type: string + description: The name of the parameter that caused the error, if applicable. + nullable: true + line: + type: integer + description: The line number of the input file where the error occurred, if applicable. + nullable: true + input_file_id: + type: string + description: The ID of the input file for the batch. + completion_window: + type: string + description: The time frame within which the batch should be processed. + status: + type: string + description: The current status of the batch. 
+ enum: + - validating + - failed + - in_progress + - finalizing + - completed + - expired + - cancelling + - cancelled + output_file_id: + type: string + description: The ID of the file containing the outputs of successfully executed requests. + error_file_id: + type: string + description: The ID of the file containing the outputs of requests with errors. + created_at: + type: integer + description: The Unix timestamp (in seconds) for when the batch was created. + in_progress_at: + type: integer + description: The Unix timestamp (in seconds) for when the batch started processing. + expires_at: + type: integer + description: The Unix timestamp (in seconds) for when the batch will expire. + finalizing_at: + type: integer + description: The Unix timestamp (in seconds) for when the batch started finalizing. + completed_at: + type: integer + description: The Unix timestamp (in seconds) for when the batch was completed. + failed_at: + type: integer + description: The Unix timestamp (in seconds) for when the batch failed. + expired_at: + type: integer + description: The Unix timestamp (in seconds) for when the batch expired. + cancelling_at: + type: integer + description: The Unix timestamp (in seconds) for when the batch started cancelling. + cancelled_at: + type: integer + description: The Unix timestamp (in seconds) for when the batch was cancelled. + request_counts: + type: object + properties: + total: + type: integer + description: Total number of requests in the batch. + completed: + type: integer + description: Number of requests that have been completed successfully. + failed: + type: integer + description: Number of requests that have failed. + required: + - total + - completed + - failed + description: The request counts for different statuses within the batch. 
+ metadata:
+ description: *metadata_description
+ type: object
+ x-oaiTypeLabel: map
+ nullable: true
+ required:
+ - id
+ - object
+ - endpoint
+ - input_file_id
+ - completion_window
+ - status
+ - created_at
+ x-oaiMeta:
+ name: The batch object
+ example: *batch_object
+
+ BatchRequestInput:
+ type: object
+ description: The per-line object of the batch input file
+ properties:
+ custom_id:
+ type: string
+ description: A developer-provided per-request id that will be used to match outputs to inputs. Must be unique for each request in a batch.
+ method:
+ type: string
+ enum: ["POST"]
+ description: The HTTP method to be used for the request. Currently only `POST` is supported.
+ url:
+ type: string
+ description: The OpenAI API relative URL to be used for the request. Currently `/v1/chat/completions`, `/v1/embeddings`, and `/v1/completions` are supported.
+ x-oaiMeta:
+ name: The request input object
+ example: |
+ {"custom_id": "request-1", "method": "POST", "url": "/v1/chat/completions", "body": {"model": "gpt-4o-mini", "messages": [{"role": "system", "content": "You are a helpful assistant."}, {"role": "user", "content": "What is 2+2?"}]}}
+
+ BatchRequestOutput:
+ type: object
+ description: The per-line object of the batch output and error files
+ properties:
+ id:
+ type: string
+ custom_id:
+ type: string
+ description: A developer-provided per-request id that will be used to match outputs to inputs.
+ response:
+ type: object
+ nullable: true
+ properties:
+ status_code:
+ type: integer
+ description: The HTTP status code of the response
+ request_id:
+ type: string
+ description: A unique identifier for the OpenAI API request. Please include this request ID when contacting support.
+ body:
+ type: object
+ x-oaiTypeLabel: map
+ description: The JSON body of the response
+ error:
+ type: object
+ nullable: true
+ description: For requests that failed with a non-HTTP error, this will contain more information on the cause of the failure.
+ properties: + code: + type: string + description: A machine-readable error code. + message: + type: string + description: A human-readable error message. + x-oaiMeta: + name: The request output object + example: | + {"id": "batch_req_wnaDys", "custom_id": "request-2", "response": {"status_code": 200, "request_id": "req_c187b3", "body": {"id": "chatcmpl-9758Iw", "object": "chat.completion", "created": 1711475054, "model": "gpt-4o-mini", "choices": [{"index": 0, "message": {"role": "assistant", "content": "2 + 2 equals 4."}, "finish_reason": "stop"}], "usage": {"prompt_tokens": 24, "completion_tokens": 15, "total_tokens": 39}, "system_fingerprint": null}}, "error": null} + + ListBatchesResponse: + type: object + properties: + data: + type: array + items: + $ref: "#/components/schemas/Batch" + first_id: + type: string + example: "batch_abc123" + last_id: + type: string + example: "batch_abc456" + has_more: + type: boolean + object: + type: string + enum: [list] + required: + - object + - data + - has_more + + AuditLogActorServiceAccount: + type: object + description: The service account that performed the audit logged action. + properties: + id: + type: string + description: The service account id. + + AuditLogActorUser: + type: object + description: The user who performed the audit logged action. + properties: + id: + type: string + description: The user id. + email: + type: string + description: The user email. + + AuditLogActorApiKey: + type: object + description: The API Key used to perform the audit logged action. + properties: + id: + type: string + description: The tracking id of the API key. + type: + type: string + description: The type of API key. Can be either `user` or `service_account`. 
+ enum: ["user", "service_account"] + user: + $ref: "#/components/schemas/AuditLogActorUser" + service_account: + $ref: "#/components/schemas/AuditLogActorServiceAccount" + + AuditLogActorSession: + type: object + description: The session in which the audit logged action was performed. + properties: + user: + $ref: "#/components/schemas/AuditLogActorUser" + ip_address: + type: string + description: The IP address from which the action was performed. + + AuditLogActor: + type: object + description: The actor who performed the audit logged action. + properties: + type: + type: string + description: The type of actor. Is either `session` or `api_key`. + enum: ["session", "api_key"] + session: + type: object + $ref: "#/components/schemas/AuditLogActorSession" + api_key: + type: object + $ref: "#/components/schemas/AuditLogActorApiKey" + + + AuditLogEventType: + type: string + description: The event type. + x-oaiExpandable: true + enum: + - api_key.created + - api_key.updated + - api_key.deleted + - invite.sent + - invite.accepted + - invite.deleted + - login.succeeded + - login.failed + - logout.succeeded + - logout.failed + - organization.updated + - project.created + - project.updated + - project.archived + - service_account.created + - service_account.updated + - service_account.deleted + - user.added + - user.updated + - user.deleted + + AuditLog: + type: object + description: A log of a user action or configuration change within this organization. + properties: + id: + type: string + description: The ID of this log. + type: + $ref: "#/components/schemas/AuditLogEventType" + + effective_at: + type: integer + description: The Unix timestamp (in seconds) of the event. + project: + type: object + description: The project that the action was scoped to. Absent for actions not scoped to projects. + properties: + id: + type: string + description: The project ID. + name: + type: string + description: The project title. 
+ actor: + $ref: "#/components/schemas/AuditLogActor" + api_key.created: + type: object + description: The details for events with this `type`. + properties: + id: + type: string + description: The tracking ID of the API key. + data: + type: object + description: The payload used to create the API key. + properties: + scopes: + type: array + items: + type: string + description: A list of scopes allowed for the API key, e.g. `["api.model.request"]` + api_key.updated: + type: object + description: The details for events with this `type`. + properties: + id: + type: string + description: The tracking ID of the API key. + changes_requested: + type: object + description: The payload used to update the API key. + properties: + scopes: + type: array + items: + type: string + description: A list of scopes allowed for the API key, e.g. `["api.model.request"]` + api_key.deleted: + type: object + description: The details for events with this `type`. + properties: + id: + type: string + description: The tracking ID of the API key. + invite.sent: + type: object + description: The details for events with this `type`. + properties: + id: + type: string + description: The ID of the invite. + data: + type: object + description: The payload used to create the invite. + properties: + email: + type: string + description: The email invited to the organization. + role: + type: string + description: The role the email was invited to be. Is either `owner` or `member`. + invite.accepted: + type: object + description: The details for events with this `type`. + properties: + id: + type: string + description: The ID of the invite. + invite.deleted: + type: object + description: The details for events with this `type`. + properties: + id: + type: string + description: The ID of the invite. + login.failed: + type: object + description: The details for events with this `type`. + properties: + error_code: + type: string + description: The error code of the failure. 
+ error_message: + type: string + description: The error message of the failure. + logout.failed: + type: object + description: The details for events with this `type`. + properties: + error_code: + type: string + description: The error code of the failure. + error_message: + type: string + description: The error message of the failure. + organization.updated: + type: object + description: The details for events with this `type`. + properties: + id: + type: string + description: The organization ID. + changes_requested: + type: object + description: The payload used to update the organization settings. + properties: + title: + type: string + description: The organization title. + description: + type: string + description: The organization description. + name: + type: string + description: The organization name. + settings: + type: object + properties: + threads_ui_visibility: + type: string + description: Visibility of the threads page which shows messages created with the Assistants API and Playground. One of `ANY_ROLE`, `OWNERS`, or `NONE`. + usage_dashboard_visibility: + type: string + description: Visibility of the usage dashboard which shows activity and costs for your organization. One of `ANY_ROLE` or `OWNERS`. + project.created: + type: object + description: The details for events with this `type`. + properties: + id: + type: string + description: The project ID. + data: + type: object + description: The payload used to create the project. + properties: + name: + type: string + description: The project name. + title: + type: string + description: The title of the project as seen on the dashboard. + project.updated: + type: object + description: The details for events with this `type`. + properties: + id: + type: string + description: The project ID. + changes_requested: + type: object + description: The payload used to update the project. + properties: + title: + type: string + description: The title of the project as seen on the dashboard. 
+ project.archived:
+ type: object
+ description: The details for events with this `type`.
+ properties:
+ id:
+ type: string
+ description: The project ID.
+ service_account.created:
+ type: object
+ description: The details for events with this `type`.
+ properties:
+ id:
+ type: string
+ description: The service account ID.
+ data:
+ type: object
+ description: The payload used to create the service account.
+ properties:
+ role:
+ type: string
+ description: The role of the service account. Is either `owner` or `member`.
+ service_account.updated:
+ type: object
+ description: The details for events with this `type`.
+ properties:
+ id:
+ type: string
+ description: The service account ID.
+ changes_requested:
+ type: object
+ description: The payload used to update the service account.
+ properties:
+ role:
+ type: string
+ description: The role of the service account. Is either `owner` or `member`.
+ service_account.deleted:
+ type: object
+ description: The details for events with this `type`.
+ properties:
+ id:
+ type: string
+ description: The service account ID.
+ user.added:
+ type: object
+ description: The details for events with this `type`.
+ properties:
+ id:
+ type: string
+ description: The user ID.
+ data:
+ type: object
+ description: The payload used to add the user to the project.
+ properties:
+ role:
+ type: string
+ description: The role of the user. Is either `owner` or `member`.
+ user.updated:
+ type: object
+ description: The details for events with this `type`.
+ properties:
+ id:
+ type: string
+ description: The user ID.
+ changes_requested:
+ type: object
+ description: The payload used to update the user.
+ properties:
+ role:
+ type: string
+ description: The role of the user. Is either `owner` or `member`.
+ user.deleted:
+ type: object
+ description: The details for events with this `type`.
+ properties:
+ id:
+ type: string
+ description: The user ID.
+ required:
+ - id
+ - type
+ - effective_at
+ - actor
+ x-oaiMeta:
+ name: The audit log object
+ example: |
+ {
+ "id": "req_xxx_20240101",
+ "type": "api_key.created",
+ "effective_at": 1720804090,
+ "actor": {
+ "type": "session",
+ "session": {
+ "user": {
+ "id": "user-xxx",
+ "email": "user@example.com"
+ },
+ "ip_address": "127.0.0.1",
+ "user_agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.124 Safari/537.36"
+ }
+ },
+ "api_key.created": {
+ "id": "key_xxxx",
+ "data": {
+ "scopes": ["resource.operation"]
+ }
+ }
+ }
+
+ ListAuditLogsResponse:
+ type: object
+ properties:
+ object:
+ type: string
+ enum: [list]
+ data:
+ type: array
+ items:
+ $ref: "#/components/schemas/AuditLog"
+ first_id:
+ type: string
+ example: "audit_log-defb456h8dks"
+ last_id:
+ type: string
+ example: "audit_log-hnbkd8s93s"
+ has_more:
+ type: boolean
+
+ required:
+ - object
+ - data
+ - first_id
+ - last_id
+ - has_more
+
+ Invite:
+ type: object
+ description: Represents an individual `invite` to the organization.
+ properties:
+ object:
+ type: string
+ enum: [organization.invite]
+ description: The object type, which is always `organization.invite`
+ id:
+ type: string
+ description: The identifier, which can be referenced in API endpoints
+ email:
+ type: string
+ description: The email address of the individual to whom the invite was sent
+ role:
+ type: string
+ enum: [owner, reader]
+ description: "`owner` or `reader`"
+ status:
+ type: string
+ enum: [accepted, expired, pending]
+ description: "`accepted`, `expired`, or `pending`"
+ invited_at:
+ type: integer
+ description: The Unix timestamp (in seconds) of when the invite was sent.
+ expires_at:
+ type: integer
+ description: The Unix timestamp (in seconds) of when the invite expires.
+ accepted_at:
+ type: integer
+ description: The Unix timestamp (in seconds) of when the invite was accepted.
+ + required: + - object + - id + - email + - role + - status + - invited_at + - expires_at + x-oaiMeta: + name: The invite object + example: | + { + "object": "organization.invite", + "id": "invite-abc", + "email": "user@example.com", + "role": "owner", + "status": "accepted", + "invited_at": 1711471533, + "expires_at": 1711471533, + "accepted_at": 1711471533 + } + + InviteListResponse: + type: object + properties: + object: + type: string + enum: [list] + description: The object type, which is always `list` + data: + type: array + items: + $ref: '#/components/schemas/Invite' + first_id: + type: string + description: The first `invite_id` in the retrieved `list` + last_id: + type: string + description: The last `invite_id` in the retrieved `list` + has_more: + type: boolean + description: The `has_more` property is used for pagination to indicate there are additional results. + required: + - object + - data + + InviteRequest: + type: object + properties: + email: + type: string + description: "Send an email to this address" + role: + type: string + enum: [reader, owner] + description: "`owner` or `reader`" + required: + - email + - role + + InviteDeleteResponse: + type: object + properties: + object: + type: string + enum: [organization.invite.deleted] + description: The object type, which is always `organization.invite.deleted` + id: + type: string + deleted: + type: boolean + required: + - object + - id + - deleted + + User: + type: object + description: Represents an individual `user` within an organization. 
+ properties: + object: + type: string + enum: [organization.user] + description: The object type, which is always `organization.user` + id: + type: string + description: The identifier, which can be referenced in API endpoints + name: + type: string + description: The name of the user + email: + type: string + description: The email address of the user + role: + type: string + enum: [owner, reader] + description: "`owner` or `reader`" + added_at: + type: integer + description: The Unix timestamp (in seconds) of when the user was added. + required: + - object + - id + - name + - email + - role + - added_at + x-oaiMeta: + name: The user object + example: | + { + "object": "organization.user", + "id": "user_abc", + "name": "First Last", + "email": "user@example.com", + "role": "owner", + "added_at": 1711471533 + } + + UserListResponse: + type: object + properties: + object: + type: string + enum: [list] + data: + type: array + items: + $ref: '#/components/schemas/User' + first_id: + type: string + last_id: + type: string + has_more: + type: boolean + required: + - object + - data + - first_id + - last_id + - has_more + + UserRoleUpdateRequest: + type: object + properties: + role: + type: string + enum: [owner,reader] + description: "`owner` or `reader`" + required: + - role + + UserDeleteResponse: + type: object + properties: + object: + type: string + enum: [organization.user.deleted] + id: + type: string + deleted: + type: boolean + required: + - object + - id + - deleted + + Project: + type: object + description: Represents an individual project. + properties: + id: + type: string + description: The identifier, which can be referenced in API endpoints + object: + type: string + enum: [organization.project] + description: The object type, which is always `organization.project` + name: + type: string + description: The name of the project. This appears in reporting. 
+ created_at:
+ type: integer
+ description: The Unix timestamp (in seconds) of when the project was created.
+ archived_at:
+ type: integer
+ nullable: true
+ description: The Unix timestamp (in seconds) of when the project was archived or `null`.
+ status:
+ type: string
+ enum: [active, archived]
+ description: "`active` or `archived`"
+ required:
+ - id
+ - object
+ - name
+ - created_at
+ - status
+ x-oaiMeta:
+ name: The project object
+ example: |
+ {
+ "id": "proj_abc",
+ "object": "organization.project",
+ "name": "Project example",
+ "created_at": 1711471533,
+ "archived_at": null,
+ "status": "active"
+ }
+
+ ProjectListResponse:
+ type: object
+ properties:
+ object:
+ type: string
+ enum: [list]
+ data:
+ type: array
+ items:
+ $ref: '#/components/schemas/Project'
+ first_id:
+ type: string
+ last_id:
+ type: string
+ has_more:
+ type: boolean
+ required:
+ - object
+ - data
+ - first_id
+ - last_id
+ - has_more
+
+ ProjectCreateRequest:
+ type: object
+ properties:
+ name:
+ type: string
+ description: The friendly name of the project; this name appears in reports.
+ required:
+ - name
+
+ ProjectUpdateRequest:
+ type: object
+ properties:
+ name:
+ type: string
+ description: The updated name of the project; this name appears in reports.
+ required:
+ - name
+
+ DefaultProjectErrorResponse:
+ type: object
+ properties:
+ code:
+ type: integer
+ message:
+ type: string
+ required:
+ - code
+ - message
+
+ ProjectUser:
+ type: object
+ description: Represents an individual user in a project.
+      properties:
+        object:
+          type: string
+          enum: [organization.project.user]
+          description: The object type, which is always `organization.project.user`
+        id:
+          type: string
+          description: The identifier, which can be referenced in API endpoints
+        name:
+          type: string
+          description: The name of the user
+        email:
+          type: string
+          description: The email address of the user
+        role:
+          type: string
+          enum: [owner, member]
+          description: "`owner` or `member`"
+        added_at:
+          type: integer
+          description: The Unix timestamp (in seconds) of when the user was added.
+      required:
+        - object
+        - id
+        - name
+        - email
+        - role
+        - added_at
+      x-oaiMeta:
+        name: The project user object
+        example: |
+          {
+            "object": "organization.project.user",
+            "id": "user_abc",
+            "name": "First Last",
+            "email": "user@example.com",
+            "role": "owner",
+            "added_at": 1711471533
+          }
+
+    ProjectUserListResponse:
+      type: object
+      properties:
+        object:
+          type: string
+        data:
+          type: array
+          items:
+            $ref: '#/components/schemas/ProjectUser'
+        first_id:
+          type: string
+        last_id:
+          type: string
+        has_more:
+          type: boolean
+      required:
+        - object
+        - data
+        - first_id
+        - last_id
+        - has_more
+
+    ProjectUserCreateRequest:
+      type: object
+      properties:
+        user_id:
+          type: string
+          description: The ID of the user.
+        role:
+          type: string
+          enum: [owner, member]
+          description: "`owner` or `member`"
+      required:
+        - user_id
+        - role
+
+    ProjectUserUpdateRequest:
+      type: object
+      properties:
+        role:
+          type: string
+          enum: [owner, member]
+          description: "`owner` or `member`"
+      required:
+        - role
+
+    ProjectUserDeleteResponse:
+      type: object
+      properties:
+        object:
+          type: string
+          enum: [organization.project.user.deleted]
+        id:
+          type: string
+        deleted:
+          type: boolean
+      required:
+        - object
+        - id
+        - deleted
+
+    ProjectServiceAccount:
+      type: object
+      description: Represents an individual service account in a project.
+ properties: + object: + type: string + enum: [organization.project.service_account] + description: The object type, which is always `organization.project.service_account` + id: + type: string + description: The identifier, which can be referenced in API endpoints + name: + type: string + description: The name of the service account + role: + type: string + enum: [owner, member] + description: "`owner` or `member`" + created_at: + type: integer + description: The Unix timestamp (in seconds) of when the service account was created + required: + - object + - id + - name + - role + - created_at + x-oaiMeta: + name: The project service account object + example: | + { + "object": "organization.project.service_account", + "id": "svc_acct_abc", + "name": "Service Account", + "role": "owner", + "created_at": 1711471533 + } + + ProjectServiceAccountListResponse: + type: object + properties: + object: + type: string + enum: [list] + data: + type: array + items: + $ref: '#/components/schemas/ProjectServiceAccount' + first_id: + type: string + last_id: + type: string + has_more: + type: boolean + required: + - object + - data + - first_id + - last_id + - has_more + + ProjectServiceAccountCreateRequest: + type: object + properties: + name: + type: string + description: The name of the service account being created. 
+ required: + - name + + ProjectServiceAccountCreateResponse: + type: object + properties: + object: + type: string + enum: [organization.project.service_account] + id: + type: string + name: + type: string + role: + type: string + enum: [member] + description: Service accounts can only have one role of type `member` + created_at: + type: integer + api_key: + $ref: '#/components/schemas/ProjectServiceAccountApiKey' + required: + - object + - id + - name + - role + - created_at + - api_key + + ProjectServiceAccountApiKey: + type: object + properties: + object: + type: string + enum: [organization.project.service_account.api_key] + description: The object type, which is always `organization.project.service_account.api_key` + + value: + type: string + name: + type: string + created_at: + type: integer + id: + type: string + required: + - object + - value + - name + - created_at + - id + + ProjectServiceAccountDeleteResponse: + type: object + properties: + object: + type: string + enum: [organization.project.service_account.deleted] + id: + type: string + deleted: + type: boolean + required: + - object + - id + - deleted + + ProjectApiKey: + type: object + description: Represents an individual API key in a project. 
+ properties: + object: + type: string + enum: [organization.project.api_key] + description: The object type, which is always `organization.project.api_key` + redacted_value: + type: string + description: The redacted value of the API key + name: + type: string + description: The name of the API key + created_at: + type: integer + description: The Unix timestamp (in seconds) of when the API key was created + id: + type: string + description: The identifier, which can be referenced in API endpoints + owner: + type: object + properties: + type: + type: string + enum: [user, service_account] + description: "`user` or `service_account`" + user: + $ref: '#/components/schemas/ProjectUser' + service_account: + $ref: '#/components/schemas/ProjectServiceAccount' + required: + - object + - redacted_value + - name + - created_at + - id + - owner + x-oaiMeta: + name: The project API key object + example: | + { + "object": "organization.project.api_key", + "redacted_value": "sk-abc...def", + "name": "My API Key", + "created_at": 1711471533, + "id": "key_abc", + "owner": { + "type": "user", + "user": { + "object": "organization.project.user", + "id": "user_abc", + "name": "First Last", + "email": "user@example.com", + "role": "owner", + "added_at": 1711471533 + } + } + } + + ProjectApiKeyListResponse: + type: object + properties: + object: + type: string + enum: [list] + data: + type: array + items: + $ref: '#/components/schemas/ProjectApiKey' + first_id: + type: string + last_id: + type: string + has_more: + type: boolean + required: + - object + - data + - first_id + - last_id + - has_more + + ProjectApiKeyDeleteResponse: + type: object + properties: + object: + type: string + enum: [organization.project.api_key.deleted] + id: + type: string + deleted: + type: boolean + required: + - object + - id + - deleted + +security: + - ApiKeyAuth: [] + +x-oaiMeta: + navigationGroups: + - id: endpoints + title: Endpoints + - id: assistants + title: Assistants + - id: administration + 
title: Administration + - id: legacy + title: Legacy + groups: + # > General Notes + # The `groups` section is used to generate the API reference pages and navigation, in the same + # order listed below. Additionally, each `group` can have a list of `sections`, each of which + # will become a navigation subroute and subsection under the group. Each section has: + # - `type`: Currently, either an `endpoint` or `object`, depending on how the section needs to + # be rendered + # - `key`: The reference key that can be used to lookup the section definition + # - `path`: The path (url) of the section, which is used to generate the navigation link. + # + # > The `object` sections maps to a schema component and the following fields are read for rendering + # - `x-oaiMeta.name`: The name of the object, which will become the section title + # - `x-oaiMeta.example`: The example object, which will be used to generate the example sample (always JSON) + # - `description`: The description of the object, which will be used to generate the section description + # + # > The `endpoint` section maps to an operation path and the following fields are read for rendering: + # - `x-oaiMeta.name`: The name of the endpoint, which will become the section title + # - `x-oaiMeta.examples`: The endpoint examples, which can be an object (meaning a single variation, most + # endpoints, or an array of objects, meaning multiple variations, e.g. the + # chat completion and completion endpoints, with streamed and non-streamed examples. + # - `x-oaiMeta.returns`: text describing what the endpoint returns. + # - `summary`: The summary of the endpoint, which will be used to generate the section description + - id: audio + title: Audio + description: | + Learn how to turn audio into text or text into audio. 
+ + Related guide: [Speech to text](/docs/guides/speech-to-text) + navigationGroup: endpoints + sections: + - type: endpoint + key: createSpeech + path: createSpeech + - type: endpoint + key: createTranscription + path: createTranscription + - type: endpoint + key: createTranslation + path: createTranslation + - type: object + key: CreateTranscriptionResponseJson + path: json-object + - type: object + key: CreateTranscriptionResponseVerboseJson + path: verbose-json-object + - id: chat + title: Chat + description: | + Given a list of messages comprising a conversation, the model will return a response. + + Related guide: [Chat Completions](/docs/guides/text-generation) + navigationGroup: endpoints + sections: + - type: endpoint + key: createChatCompletion + path: create + - type: object + key: CreateChatCompletionResponse + path: object + - type: object + key: CreateChatCompletionStreamResponse + path: streaming + - id: embeddings + title: Embeddings + description: | + Get a vector representation of a given input that can be easily consumed by machine learning models and algorithms. + + Related guide: [Embeddings](/docs/guides/embeddings) + navigationGroup: endpoints + sections: + - type: endpoint + key: createEmbedding + path: create + - type: object + key: Embedding + path: object + - id: fine-tuning + title: Fine-tuning + description: | + Manage fine-tuning jobs to tailor a model to your specific training data. 
+ + Related guide: [Fine-tune models](/docs/guides/fine-tuning) + navigationGroup: endpoints + sections: + - type: endpoint + key: createFineTuningJob + path: create + - type: endpoint + key: listPaginatedFineTuningJobs + path: list + - type: endpoint + key: listFineTuningEvents + path: list-events + - type: endpoint + key: listFineTuningJobCheckpoints + path: list-checkpoints + - type: endpoint + key: retrieveFineTuningJob + path: retrieve + - type: endpoint + key: cancelFineTuningJob + path: cancel + - type: object + key: FinetuneChatRequestInput + path: chat-input + - type: object + key: FinetuneCompletionRequestInput + path: completions-input + - type: object + key: FineTuningJob + path: object + - type: object + key: FineTuningJobEvent + path: event-object + - type: object + key: FineTuningJobCheckpoint + path: checkpoint-object + - id: batch + title: Batch + description: | + Create large batches of API requests for asynchronous processing. The Batch API returns completions within 24 hours for a 50% discount. + + Related guide: [Batch](/docs/guides/batch) + navigationGroup: endpoints + sections: + - type: endpoint + key: createBatch + path: create + - type: endpoint + key: retrieveBatch + path: retrieve + - type: endpoint + key: cancelBatch + path: cancel + - type: endpoint + key: listBatches + path: list + - type: object + key: Batch + path: object + - type: object + key: BatchRequestInput + path: request-input + - type: object + key: BatchRequestOutput + path: request-output + - id: files + title: Files + description: | + Files are used to upload documents that can be used with features like [Assistants](/docs/api-reference/assistants), [Fine-tuning](/docs/api-reference/fine-tuning), and [Batch API](/docs/guides/batch). 
+ navigationGroup: endpoints + sections: + - type: endpoint + key: createFile + path: create + - type: endpoint + key: listFiles + path: list + - type: endpoint + key: retrieveFile + path: retrieve + - type: endpoint + key: deleteFile + path: delete + - type: endpoint + key: downloadFile + path: retrieve-contents + - type: object + key: OpenAIFile + path: object + - id: uploads + title: Uploads + description: | + Allows you to upload large files in multiple parts. + navigationGroup: endpoints + sections: + - type: endpoint + key: createUpload + path: create + - type: endpoint + key: addUploadPart + path: add-part + - type: endpoint + key: completeUpload + path: complete + - type: endpoint + key: cancelUpload + path: cancel + - type: object + key: Upload + path: object + - type: object + key: UploadPart + path: part-object + - id: images + title: Images + description: | + Given a prompt and/or an input image, the model will generate a new image. + + Related guide: [Image generation](/docs/guides/images) + navigationGroup: endpoints + sections: + - type: endpoint + key: createImage + path: create + - type: endpoint + key: createImageEdit + path: createEdit + - type: endpoint + key: createImageVariation + path: createVariation + - type: object + key: Image + path: object + - id: models + title: Models + description: | + List and describe the various models available in the API. You can refer to the [Models](/docs/models) documentation to understand what models are available and the differences between them. + navigationGroup: endpoints + sections: + - type: endpoint + key: listModels + path: list + - type: endpoint + key: retrieveModel + path: retrieve + - type: endpoint + key: deleteModel + path: delete + - type: object + key: Model + path: object + - id: moderations + title: Moderations + description: | + Given some input text, outputs if the model classifies it as potentially harmful across several categories. 
+ + Related guide: [Moderations](/docs/guides/moderation) + navigationGroup: endpoints + sections: + - type: endpoint + key: createModeration + path: create + - type: object + key: CreateModerationResponse + path: object + + + - id: assistants + title: Assistants + beta: true + description: | + Build assistants that can call models and use tools to perform tasks. + + [Get started with the Assistants API](/docs/assistants) + navigationGroup: assistants + sections: + - type: endpoint + key: createAssistant + path: createAssistant + - type: endpoint + key: listAssistants + path: listAssistants + - type: endpoint + key: getAssistant + path: getAssistant + - type: endpoint + key: modifyAssistant + path: modifyAssistant + - type: endpoint + key: deleteAssistant + path: deleteAssistant + - type: object + key: AssistantObject + path: object + - id: threads + title: Threads + beta: true + description: | + Create threads that assistants can interact with. + + Related guide: [Assistants](/docs/assistants/overview) + navigationGroup: assistants + sections: + - type: endpoint + key: createThread + path: createThread + - type: endpoint + key: getThread + path: getThread + - type: endpoint + key: modifyThread + path: modifyThread + - type: endpoint + key: deleteThread + path: deleteThread + - type: object + key: ThreadObject + path: object + - id: messages + title: Messages + beta: true + description: | + Create messages within threads + + Related guide: [Assistants](/docs/assistants/overview) + navigationGroup: assistants + sections: + - type: endpoint + key: createMessage + path: createMessage + - type: endpoint + key: listMessages + path: listMessages + - type: endpoint + key: getMessage + path: getMessage + - type: endpoint + key: modifyMessage + path: modifyMessage + - type: endpoint + key: deleteMessage + path: deleteMessage + - type: object + key: MessageObject + path: object + - id: runs + title: Runs + beta: true + description: | + Represents an execution run on a 
thread. + + Related guide: [Assistants](/docs/assistants/overview) + navigationGroup: assistants + sections: + - type: endpoint + key: createRun + path: createRun + - type: endpoint + key: createThreadAndRun + path: createThreadAndRun + - type: endpoint + key: listRuns + path: listRuns + - type: endpoint + key: getRun + path: getRun + - type: endpoint + key: modifyRun + path: modifyRun + - type: endpoint + key: submitToolOuputsToRun + path: submitToolOutputs + - type: endpoint + key: cancelRun + path: cancelRun + - type: object + key: RunObject + path: object + - id: run-steps + title: Run Steps + beta: true + description: | + Represents the steps (model and tool calls) taken during the run. + + Related guide: [Assistants](/docs/assistants/overview) + navigationGroup: assistants + sections: + - type: endpoint + key: listRunSteps + path: listRunSteps + - type: endpoint + key: getRunStep + path: getRunStep + - type: object + key: RunStepObject + path: step-object + - id: vector-stores + title: Vector Stores + beta: true + description: | + Vector stores are used to store files for use by the `file_search` tool. + + Related guide: [File Search](/docs/assistants/tools/file-search) + navigationGroup: assistants + sections: + - type: endpoint + key: createVectorStore + path: create + - type: endpoint + key: listVectorStores + path: list + - type: endpoint + key: getVectorStore + path: retrieve + - type: endpoint + key: modifyVectorStore + path: modify + - type: endpoint + key: deleteVectorStore + path: delete + - type: object + key: VectorStoreObject + path: object + - id: vector-stores-files + title: Vector Store Files + beta: true + description: | + Vector store files represent files inside a vector store. 
+ + Related guide: [File Search](/docs/assistants/tools/file-search) + navigationGroup: assistants + sections: + - type: endpoint + key: createVectorStoreFile + path: createFile + - type: endpoint + key: listVectorStoreFiles + path: listFiles + - type: endpoint + key: getVectorStoreFile + path: getFile + - type: endpoint + key: deleteVectorStoreFile + path: deleteFile + - type: object + key: VectorStoreFileObject + path: file-object + - id: vector-stores-file-batches + title: Vector Store File Batches + beta: true + description: | + Vector store file batches represent operations to add multiple files to a vector store. + + Related guide: [File Search](/docs/assistants/tools/file-search) + navigationGroup: assistants + sections: + - type: endpoint + key: createVectorStoreFileBatch + path: createBatch + - type: endpoint + key: getVectorStoreFileBatch + path: getBatch + - type: endpoint + key: cancelVectorStoreFileBatch + path: cancelBatch + - type: endpoint + key: listFilesInVectorStoreBatch + path: listBatchFiles + - type: object + key: VectorStoreFileBatchObject + path: batch-object + - id: assistants-streaming + title: Streaming + beta: true + description: | + Stream the result of executing a Run or resuming a Run after submitting tool outputs. + + You can stream events from the [Create Thread and Run](/docs/api-reference/runs/createThreadAndRun), + [Create Run](/docs/api-reference/runs/createRun), and [Submit Tool Outputs](/docs/api-reference/runs/submitToolOutputs) + endpoints by passing `"stream": true`. The response will be a [Server-Sent events](https://html.spec.whatwg.org/multipage/server-sent-events.html#server-sent-events) stream. + + Our Node and Python SDKs provide helpful utilities to make streaming easy. Reference the + [Assistants API quickstart](/docs/assistants/overview) to learn more. 
+      navigationGroup: assistants
+      sections:
+        - type: object
+          key: MessageDeltaObject
+          path: message-delta-object
+        - type: object
+          key: RunStepDeltaObject
+          path: run-step-delta-object
+        - type: object
+          key: AssistantStreamEvent
+          path: events
+
+    - id: administration
+      title: Overview
+      description: |
+        Programmatically manage your organization.
+
+        The Audit Logs endpoint provides a log of all actions taken in the
+        organization for security and monitoring purposes.
+
+        To access these endpoints, please generate an Admin API Key through the [API Platform Organization overview](/organization/admin-keys). Admin API keys cannot be used for non-administration endpoints.
+
+        For best practices on setting up your organization, please refer to this [guide](/docs/guides/production-best-practices/setting-up-your-organization).
+      navigationGroup: administration
+
+    - id: invite
+      title: Invites
+      description: Invite and manage invitations for an organization. Invited users are automatically added to the Default project.
+      navigationGroup: administration
+      sections:
+        - type: endpoint
+          key: list-invites
+          path: list
+        - type: endpoint
+          key: inviteUser
+          path: create
+        - type: endpoint
+          key: retrieve-invite
+          path: retrieve
+        - type: endpoint
+          key: delete-invite
+          path: delete
+        - type: object
+          key: Invite
+          path: object
+
+    - id: users
+      title: Users
+      description: |
+        Manage users and their role in an organization. Users will be automatically added to the Default project.
+      navigationGroup: administration
+      sections:
+        - type: endpoint
+          key: list-users
+          path: list
+        - type: endpoint
+          key: modify-user
+          path: modify
+        - type: endpoint
+          key: retrieve-user
+          path: retrieve
+        - type: endpoint
+          key: delete-user
+          path: delete
+        - type: object
+          key: User
+          path: object
+
+    - id: projects
+      title: Projects
+      description: |
+        Manage the projects within an organization, including creation, updating, and archiving of projects.
+        The Default project cannot be modified or archived.
+      navigationGroup: administration
+      sections:
+        - type: endpoint
+          key: list-projects
+          path: list
+        - type: endpoint
+          key: create-project
+          path: create
+        - type: endpoint
+          key: retrieve-project
+          path: retrieve
+        - type: endpoint
+          key: modify-project
+          path: modify
+        - type: endpoint
+          key: archive-project
+          path: archive
+        - type: object
+          key: Project
+          path: object
+
+    - id: project-users
+      title: Project Users
+      description: |
+        Manage users within a project, including adding, updating roles, and removing users.
+        Users cannot be removed from the Default project unless they are being removed from the organization.
+      navigationGroup: administration
+      sections:
+        - type: endpoint
+          key: list-project-users
+          path: list
+        - type: endpoint
+          key: create-project-user
+          path: create
+        - type: endpoint
+          key: retrieve-project-user
+          path: retrieve
+        - type: endpoint
+          key: modify-project-user
+          path: modify
+        - type: endpoint
+          key: delete-project-user
+          path: delete
+        - type: object
+          key: ProjectUser
+          path: object
+
+    - id: project-service-accounts
+      title: Project Service Accounts
+      description: |
+        Manage service accounts within a project. A service account is a bot user that is not associated with an individual user.
+        If a user leaves an organization, their keys and membership in projects will no longer work. Service accounts
+        do not have this limitation. However, service accounts can also be deleted from a project.
+      navigationGroup: administration
+      sections:
+        - type: endpoint
+          key: list-project-service-accounts
+          path: list
+        - type: endpoint
+          key: create-project-service-account
+          path: create
+        - type: endpoint
+          key: retrieve-project-service-account
+          path: retrieve
+        - type: endpoint
+          key: delete-project-service-account
+          path: delete
+        - type: object
+          key: ProjectServiceAccount
+          path: object
+
+    - id: project-api-keys
+      title: Project API Keys
+      description: |
+        Manage API keys for a given project. Supports listing and deleting keys for users.
+        This API does not allow issuing keys for users, as users need to authorize themselves to generate keys.
+      navigationGroup: administration
+      sections:
+        - type: endpoint
+          key: list-project-api-keys
+          path: list
+        - type: endpoint
+          key: retrieve-project-api-key
+          path: retrieve
+        - type: endpoint
+          key: delete-project-api-key
+          path: delete
+        - type: object
+          key: ProjectApiKey
+          path: object
+
+    - id: audit-logs
+      title: Audit Logs
+      description: |
+        Logs of user actions and configuration changes within this organization.
+
+        To log events, you must activate logging in the [Organization Settings](/settings/organization/general).
+        Once activated, for security reasons, logging cannot be deactivated.
+      navigationGroup: administration
+      sections:
+        - type: endpoint
+          key: list-audit-logs
+          path: list
+        - type: object
+          key: AuditLog
+          path: object
+
+    - id: completions
+      title: Completions
+      legacy: true
+      navigationGroup: legacy
+      description: |
+        Given a prompt, the model will return one or more predicted completions along with the probabilities of alternative tokens at each position. Most developers should use our [Chat Completions API](/docs/guides/text-generation/text-generation-models) to leverage our best and newest models.
+ sections: + - type: endpoint + key: createCompletion + path: create + - type: object + key: CreateCompletionResponse + path: object \ No newline at end of file diff --git a/package-lock.json b/package-lock.json index a77b672a3..afd00388f 100644 --- a/package-lock.json +++ b/package-lock.json @@ -8,49 +8,196 @@ "name": "openai-tsp", "version": "0.1.0", "dependencies": { - "@typespec/compiler": "^0.49.0-dev.11", - "@typespec/openapi": "^0.49.0-dev.4", - "@typespec/openapi3": "^0.49.0-dev.10", - "@typespec/rest": "^0.49.0-dev.3" + "@autorest/csharp": "3.0.0-beta.20240722.4", + "@azure-tools/typespec-csharp": "0.2.0-beta.20240722.4", + "@typespec/openapi3": "^0.58.0" + } + }, + "node_modules/@apidevtools/swagger-methods": { + "version": "3.0.2", + "resolved": "https://registry.npmjs.org/@apidevtools/swagger-methods/-/swagger-methods-3.0.2.tgz", + "integrity": "sha512-QAkD5kK2b1WfjDS/UQn/qQkbwF31uqRjPTrsCs5ZG9BQGAkjwvqGFjjPqAuzac/IYzpPtRzjCP1WrTuAIjMrXg==" + }, + "node_modules/@autorest/csharp": { + "version": "3.0.0-beta.20240722.4", + "resolved": "https://registry.npmjs.org/@autorest/csharp/-/csharp-3.0.0-beta.20240722.4.tgz", + "integrity": "sha512-wpCJGvRnA4Zv4tT20LOc+6vCSOrDs0v5HYcSPB2y8iwC6ohLqSYpjeoc0FiV2ePX4FodBlFRO8WCueUDhCBgFQ==" + }, + "node_modules/@azure-tools/typespec-autorest": { + "version": "0.44.0", + "resolved": "https://registry.npmjs.org/@azure-tools/typespec-autorest/-/typespec-autorest-0.44.0.tgz", + "integrity": "sha512-GlIQayA6HfKndq1T2qHBXtL6n8gTiShUEhi30zncoBaIUnwumkXSnx18uCQl0EzFmvAqLYt3kbHqQNzZIdGaeQ==", + "peer": true, + "engines": { + "node": ">=18.0.0" + }, + "peerDependencies": { + "@azure-tools/typespec-azure-core": "~0.44.0", + "@azure-tools/typespec-azure-resource-manager": "~0.44.0", + "@azure-tools/typespec-client-generator-core": "~0.44.0", + "@typespec/compiler": "~0.58.0", + "@typespec/http": "~0.58.0", + "@typespec/openapi": "~0.58.0", + "@typespec/rest": "~0.58.0", + "@typespec/versioning": "~0.58.0" + } + }, + 
"node_modules/@azure-tools/typespec-azure-core": { + "version": "0.44.0", + "resolved": "https://registry.npmjs.org/@azure-tools/typespec-azure-core/-/typespec-azure-core-0.44.0.tgz", + "integrity": "sha512-d11QK2v5fOZH8YUqf42FsqHEirKCHzeKFq4Uo/51BXCXmJJahsTaFMAG2M0GoJe8tmTHeMijStnVMfzcGNqCAA==", + "peer": true, + "engines": { + "node": ">=18.0.0" + }, + "peerDependencies": { + "@typespec/compiler": "~0.58.0", + "@typespec/http": "~0.58.0", + "@typespec/rest": "~0.58.0" + } + }, + "node_modules/@azure-tools/typespec-azure-resource-manager": { + "version": "0.44.0", + "resolved": "https://registry.npmjs.org/@azure-tools/typespec-azure-resource-manager/-/typespec-azure-resource-manager-0.44.0.tgz", + "integrity": "sha512-m4dG41at6En1swbxlvCDl1v4Mvrfp17acDnRxEcd4SdKP2R9eVS2mBy1tSuFtMcJlOnoBZ5CxQgk+Osg/Q9nmA==", + "peer": true, + "dependencies": { + "change-case": "~5.4.4", + "pluralize": "^8.0.0" + }, + "engines": { + "node": ">=18.0.0" + }, + "peerDependencies": { + "@azure-tools/typespec-azure-core": "~0.44.0", + "@typespec/compiler": "~0.58.0", + "@typespec/http": "~0.58.0", + "@typespec/openapi": "~0.58.0", + "@typespec/rest": "~0.58.0", + "@typespec/versioning": "~0.58.0" + } + }, + "node_modules/@azure-tools/typespec-azure-rulesets": { + "version": "0.44.0", + "resolved": "https://registry.npmjs.org/@azure-tools/typespec-azure-rulesets/-/typespec-azure-rulesets-0.44.0.tgz", + "integrity": "sha512-ZFiT+rtLIq3uP4uSr85i7w+3r02BEqERePaCtTyjexo2IBz0lwQ5Nn/5ujfuDDSy+3daoC2bQy8hw/BGWg9/Ng==", + "peer": true, + "engines": { + "node": ">=18.0.0" + }, + "peerDependencies": { + "@azure-tools/typespec-azure-core": "~0.44.0", + "@azure-tools/typespec-azure-resource-manager": "~0.44.0", + "@azure-tools/typespec-client-generator-core": "~0.44.0", + "@typespec/compiler": "~0.58.0" + } + }, + "node_modules/@azure-tools/typespec-client-generator-core": { + "version": "0.44.1", + "resolved": 
"https://registry.npmjs.org/@azure-tools/typespec-client-generator-core/-/typespec-client-generator-core-0.44.1.tgz", + "integrity": "sha512-hpDYS4J329kPnXMAndburITh81jgOloxLrv6QXJadurnFhxFHb8AycGO8VWgFYTf04cWd7yDx7HutzGSN9C7TQ==", + "peer": true, + "dependencies": { + "change-case": "~5.4.4", + "pluralize": "^8.0.0" + }, + "engines": { + "node": ">=18.0.0" + }, + "peerDependencies": { + "@azure-tools/typespec-azure-core": "~0.44.0", + "@typespec/compiler": "~0.58.0", + "@typespec/http": "~0.58.0", + "@typespec/rest": "~0.58.0", + "@typespec/versioning": "~0.58.0" + } + }, + "node_modules/@azure-tools/typespec-csharp": { + "version": "0.2.0-beta.20240722.4", + "resolved": "https://registry.npmjs.org/@azure-tools/typespec-csharp/-/typespec-csharp-0.2.0-beta.20240722.4.tgz", + "integrity": "sha512-TSe1W4gutfalAN/Xuq6WP+DvjJYWznc0kqMRCXjuUd/F3XSdbH71QZxsVBnOqzT2cGkOamY7XRYXlKMTdzifvA==", + "dependencies": { + "@autorest/csharp": "3.0.0-beta.20240722.4", + "@typespec/http-client-csharp": "0.1.9-alpha.20240718.1", + "json-serialize-refs": "0.1.0-0" + }, + "peerDependencies": { + "@azure-tools/typespec-autorest": ">=0.42.1 <1.0.0", + "@azure-tools/typespec-azure-core": ">=0.36.0 <1.0.0", + "@azure-tools/typespec-azure-resource-manager": ">=0.36.0 <1.0.0", + "@azure-tools/typespec-azure-rulesets": ">=0.36.0 <1.0.0", + "@azure-tools/typespec-client-generator-core": ">=0.36.0 <1.0.0", + "@typespec/compiler": ">=0.50.0 <1.0.0", + "@typespec/http": ">=0.50.0 <1.0.0", + "@typespec/openapi": ">=0.50.0 <1.0.0", + "@typespec/rest": ">=0.50.0 <1.0.0", + "@typespec/versioning": ">=0.50.0 <1.0.0", + "@typespec/xml": ">=0.50.0 <1.0.0" } }, "node_modules/@babel/code-frame": { - "version": "7.22.13", - "resolved": "https://registry.npmjs.org/@babel/code-frame/-/code-frame-7.22.13.tgz", - "integrity": "sha512-XktuhWlJ5g+3TJXc5upd9Ks1HutSArik6jf2eAjYFyIOf4ej3RN+184cZbzDvbPnuTJIUhPKKJE3cIsYTiAT3w==", + "version": "7.24.7", + "resolved": 
"https://registry.npmjs.org/@babel/code-frame/-/code-frame-7.24.7.tgz", + "integrity": "sha512-BcYH1CVJBO9tvyIZ2jVeXgSIMvGZ2FDRvDdOIVQyuklNKSsx+eppDEBq/g47Ayw+RqNFE+URvOShmf+f/qwAlA==", "dependencies": { - "@babel/highlight": "^7.22.13", - "chalk": "^2.4.2" + "@babel/highlight": "^7.24.7", + "picocolors": "^1.0.0" }, "engines": { "node": ">=6.9.0" } }, "node_modules/@babel/helper-validator-identifier": { - "version": "7.22.5", - "resolved": "https://registry.npmjs.org/@babel/helper-validator-identifier/-/helper-validator-identifier-7.22.5.tgz", - "integrity": "sha512-aJXu+6lErq8ltp+JhkJUfk1MTGyuA4v7f3pA+BJ5HLfNC6nAQ0Cpi9uOquUj8Hehg0aUiHzWQbOVJGao6ztBAQ==", + "version": "7.24.7", + "resolved": "https://registry.npmjs.org/@babel/helper-validator-identifier/-/helper-validator-identifier-7.24.7.tgz", + "integrity": "sha512-rR+PBcQ1SMQDDyF6X0wxtG8QyLCgUB0eRAGguqRLfkCA87l7yAP7ehq8SNj96OOGTO8OBV70KhuFYcIkHXOg0w==", "engines": { "node": ">=6.9.0" } }, "node_modules/@babel/highlight": { - "version": "7.22.13", - "resolved": "https://registry.npmjs.org/@babel/highlight/-/highlight-7.22.13.tgz", - "integrity": "sha512-C/BaXcnnvBCmHTpz/VGZ8jgtE2aYlW4hxDhseJAWZb7gqGM/qtCK6iZUb0TyKFf7BOUsBH7Q7fkRsDRhg1XklQ==", + "version": "7.24.7", + "resolved": "https://registry.npmjs.org/@babel/highlight/-/highlight-7.24.7.tgz", + "integrity": "sha512-EStJpq4OuY8xYfhGVXngigBJRWxftKX9ksiGDnmlY3o7B/V7KIAc9X4oiK87uPJSc/vs5L869bem5fhZa8caZw==", "dependencies": { - "@babel/helper-validator-identifier": "^7.22.5", + "@babel/helper-validator-identifier": "^7.24.7", "chalk": "^2.4.2", - "js-tokens": "^4.0.0" + "js-tokens": "^4.0.0", + "picocolors": "^1.0.0" }, "engines": { "node": ">=6.9.0" } }, + "node_modules/@babel/runtime": { + "version": "7.24.8", + "resolved": "https://registry.npmjs.org/@babel/runtime/-/runtime-7.24.8.tgz", + "integrity": "sha512-5F7SDGs1T72ZczbRwbGO9lQi0NLjQxzl6i4lJxLxfW9U5UluCSyEJeniWvnhl3/euNiqQVbo8zruhsDfid0esA==", + "dependencies": { + "regenerator-runtime": "^0.14.0" + 
}, + "engines": { + "node": ">=6.9.0" + } + }, + "node_modules/@humanwhocodes/momoa": { + "version": "2.0.4", + "resolved": "https://registry.npmjs.org/@humanwhocodes/momoa/-/momoa-2.0.4.tgz", + "integrity": "sha512-RE815I4arJFtt+FVeU1Tgp9/Xvecacji8w/V6XtXsWWH/wz/eNkNbhb+ny/+PlVZjV0rxQpRSQKNKE3lcktHEA==", + "engines": { + "node": ">=10.10.0" + } + }, + "node_modules/@jsdevtools/ono": { + "version": "7.1.3", + "resolved": "https://registry.npmjs.org/@jsdevtools/ono/-/ono-7.1.3.tgz", + "integrity": "sha512-4JQNk+3mVzK3xh2rqd6RB4J46qUR19azEHBneZyTZM+c456qOrbbM/5xcR8huNCCcbVt7+UmizG6GuUvPvKUYg==" + }, "node_modules/@nodelib/fs.scandir": { "version": "2.1.5", "resolved": "https://registry.npmjs.org/@nodelib/fs.scandir/-/fs.scandir-2.1.5.tgz", "integrity": "sha512-vq24Bq3ym5HEQm2NKCr3yXDwjc7vTsEThRDnkp2DK9p1uqLR+DHurm/NOTo0KG7HYHU7eppKZj3MyqYuMBf62g==", + "peer": true, "dependencies": { "@nodelib/fs.stat": "2.0.5", "run-parallel": "^1.1.9" @@ -63,6 +210,7 @@ "version": "2.0.5", "resolved": "https://registry.npmjs.org/@nodelib/fs.stat/-/fs.stat-2.0.5.tgz", "integrity": "sha512-RkhPPp2zrqDAQA/2jNhnztcPAlv64XdhIp7a7454A5ovI7Bukxgt7MX7udwAu3zg1DcpPU0rz3VV1SeaqvY4+A==", + "peer": true, "engines": { "node": ">= 8" } @@ -71,6 +219,7 @@ "version": "1.2.8", "resolved": "https://registry.npmjs.org/@nodelib/fs.walk/-/fs.walk-1.2.8.tgz", "integrity": "sha512-oGB+UxlgWcgQkgwo8GcEGwemoTFt3FIO9ababBmaGwXIoBKZ+GTy0pP185beGg7Llih/NSHSV2XAs1lnznocSg==", + "peer": true, "dependencies": { "@nodelib/fs.scandir": "2.1.5", "fastq": "^1.6.0" @@ -79,23 +228,166 @@ "node": ">= 8" } }, + "node_modules/@readme/better-ajv-errors": { + "version": "1.6.0", + "resolved": "https://registry.npmjs.org/@readme/better-ajv-errors/-/better-ajv-errors-1.6.0.tgz", + "integrity": "sha512-9gO9rld84Jgu13kcbKRU+WHseNhaVt76wYMeRDGsUGYxwJtI3RmEJ9LY9dZCYQGI8eUZLuxb5qDja0nqklpFjQ==", + "dependencies": { + "@babel/code-frame": "^7.16.0", + "@babel/runtime": "^7.21.0", + "@humanwhocodes/momoa": "^2.0.3", + "chalk": 
"^4.1.2", + "json-to-ast": "^2.0.3", + "jsonpointer": "^5.0.0", + "leven": "^3.1.0" + }, + "engines": { + "node": ">=14" + }, + "peerDependencies": { + "ajv": "4.11.8 - 8" + } + }, + "node_modules/@readme/better-ajv-errors/node_modules/ansi-styles": { + "version": "4.3.0", + "resolved": "https://registry.npmjs.org/ansi-styles/-/ansi-styles-4.3.0.tgz", + "integrity": "sha512-zbB9rCJAT1rbjiVDb2hqKFHNYLxgtk8NURxZ3IZwD3F6NtxbXZQCnnSi1Lkx+IDohdPlFp222wVALIheZJQSEg==", + "dependencies": { + "color-convert": "^2.0.1" + }, + "engines": { + "node": ">=8" + }, + "funding": { + "url": "https://github.com/chalk/ansi-styles?sponsor=1" + } + }, + "node_modules/@readme/better-ajv-errors/node_modules/chalk": { + "version": "4.1.2", + "resolved": "https://registry.npmjs.org/chalk/-/chalk-4.1.2.tgz", + "integrity": "sha512-oKnbhFyRIXpUuez8iBMmyEa4nbj4IOQyuhc/wy9kY7/WVPcwIO9VA668Pu8RkO7+0G76SLROeyw9CpQ061i4mA==", + "dependencies": { + "ansi-styles": "^4.1.0", + "supports-color": "^7.1.0" + }, + "engines": { + "node": ">=10" + }, + "funding": { + "url": "https://github.com/chalk/chalk?sponsor=1" + } + }, + "node_modules/@readme/better-ajv-errors/node_modules/color-convert": { + "version": "2.0.1", + "resolved": "https://registry.npmjs.org/color-convert/-/color-convert-2.0.1.tgz", + "integrity": "sha512-RRECPsj7iu/xb5oKYcsFHSppFNnsj/52OVTRKb4zP5onXwVF3zVmmToNcOfGC+CRDpfK/U584fMg38ZHCaElKQ==", + "dependencies": { + "color-name": "~1.1.4" + }, + "engines": { + "node": ">=7.0.0" + } + }, + "node_modules/@readme/better-ajv-errors/node_modules/color-name": { + "version": "1.1.4", + "resolved": "https://registry.npmjs.org/color-name/-/color-name-1.1.4.tgz", + "integrity": "sha512-dOy+3AuW3a2wNbZHIuMZpTcgjGuLU/uBL/ubcZF9OXbDo8ff4O8yVp5Bf0efS8uEoYo5q4Fx7dY9OgQGXgAsQA==" + }, + "node_modules/@readme/better-ajv-errors/node_modules/has-flag": { + "version": "4.0.0", + "resolved": "https://registry.npmjs.org/has-flag/-/has-flag-4.0.0.tgz", + "integrity": 
"sha512-EykJT/Q1KjTWctppgIAgfSO0tKVuZUjhgMr17kqTumMl6Afv3EISleU7qZUzoXDFTAHTDC4NOoG/ZxU3EvlMPQ==", + "engines": { + "node": ">=8" + } + }, + "node_modules/@readme/better-ajv-errors/node_modules/supports-color": { + "version": "7.2.0", + "resolved": "https://registry.npmjs.org/supports-color/-/supports-color-7.2.0.tgz", + "integrity": "sha512-qpCAvRl9stuOHveKsn7HncJRvv501qIacKzQlO/+Lwxc9+0q2wLyv4Dfvt80/DPn2pqOBsJdDiogXGR9+OvwRw==", + "dependencies": { + "has-flag": "^4.0.0" + }, + "engines": { + "node": ">=8" + } + }, + "node_modules/@readme/json-schema-ref-parser": { + "version": "1.2.0", + "resolved": "https://registry.npmjs.org/@readme/json-schema-ref-parser/-/json-schema-ref-parser-1.2.0.tgz", + "integrity": "sha512-Bt3QVovFSua4QmHa65EHUmh2xS0XJ3rgTEUPH998f4OW4VVJke3BuS16f+kM0ZLOGdvIrzrPRqwihuv5BAjtrA==", + "dependencies": { + "@jsdevtools/ono": "^7.1.3", + "@types/json-schema": "^7.0.6", + "call-me-maybe": "^1.0.1", + "js-yaml": "^4.1.0" + } + }, + "node_modules/@readme/openapi-parser": { + "version": "2.6.0", + "resolved": "https://registry.npmjs.org/@readme/openapi-parser/-/openapi-parser-2.6.0.tgz", + "integrity": "sha512-pyFJXezWj9WI1O+gdp95CoxfY+i+Uq3kKk4zXIFuRAZi9YnHpHOpjumWWr67wkmRTw19Hskh9spyY0Iyikf3fA==", + "dependencies": { + "@apidevtools/swagger-methods": "^3.0.2", + "@jsdevtools/ono": "^7.1.3", + "@readme/better-ajv-errors": "^1.6.0", + "@readme/json-schema-ref-parser": "^1.2.0", + "@readme/openapi-schemas": "^3.1.0", + "ajv": "^8.12.0", + "ajv-draft-04": "^1.0.0", + "call-me-maybe": "^1.0.1" + }, + "engines": { + "node": ">=18" + }, + "peerDependencies": { + "openapi-types": ">=7" + } + }, + "node_modules/@readme/openapi-schemas": { + "version": "3.1.0", + "resolved": "https://registry.npmjs.org/@readme/openapi-schemas/-/openapi-schemas-3.1.0.tgz", + "integrity": "sha512-9FC/6ho8uFa8fV50+FPy/ngWN53jaUu4GRXlAjcxIRrzhltJnpKkBG2Tp0IDraFJeWrOpk84RJ9EMEEYzaI1Bw==", + "engines": { + "node": ">=18" + } + }, + "node_modules/@sindresorhus/merge-streams": { 
+ "version": "2.3.0", + "resolved": "https://registry.npmjs.org/@sindresorhus/merge-streams/-/merge-streams-2.3.0.tgz", + "integrity": "sha512-LtoMMhxAlorcGhmFYI+LhPgbPZCkgP6ra1YL604EeF6U98pLlQ3iWIGMdWSC+vWmPBWBNgmDBAhnAobLROJmwg==", + "peer": true, + "engines": { + "node": ">=18" + }, + "funding": { + "url": "https://github.com/sponsors/sindresorhus" + } + }, + "node_modules/@types/json-schema": { + "version": "7.0.15", + "resolved": "https://registry.npmjs.org/@types/json-schema/-/json-schema-7.0.15.tgz", + "integrity": "sha512-5+fP8P8MFNC+AyZCDxrB2pkZFPGzqQWUzpSeuuVLvm8VMcorNYavBqoFcxK8bQz4Qsbn4oUEEem4wDLfcysGHA==" + }, "node_modules/@typespec/compiler": { - "version": "0.49.0-dev.11", - "resolved": "https://registry.npmjs.org/@typespec/compiler/-/compiler-0.49.0-dev.11.tgz", - "integrity": "sha512-SNt6hqu017JhwU3qPpolsGRKgSnb9Wc4FZs5FPQ6i1Ktubtgx9Ac9pxEdSNgOsdoBC3efzbpNCBasLGms0V+Fw==", - "dependencies": { - "@babel/code-frame": "~7.22.13", - "ajv": "~8.12.0", - "change-case": "~4.1.2", - "globby": "~13.2.2", + "version": "0.58.1", + "resolved": "https://registry.npmjs.org/@typespec/compiler/-/compiler-0.58.1.tgz", + "integrity": "sha512-bVxxM35r40OtuL4+/9W/g1EevlnWnW6i151nsZAFOJj1xWHoE2G9zkx5/Feic8OlzArjhGGLJOLH3Ez1Wrw35A==", + "peer": true, + "dependencies": { + "@babel/code-frame": "~7.24.7", + "ajv": "~8.16.0", + "change-case": "~5.4.4", + "globby": "~14.0.2", "mustache": "~4.2.0", - "picocolors": "~1.0.0", - "prettier": "~3.0.3", + "picocolors": "~1.0.1", + "prettier": "~3.3.2", "prompts": "~2.4.2", - "semver": "^7.5.4", - "vscode-languageserver": "~9.0.0", - "vscode-languageserver-textdocument": "~1.0.8", - "yaml": "~2.3.2", + "semver": "^7.6.2", + "temporal-polyfill": "^0.2.5", + "vscode-languageserver": "~9.0.1", + "vscode-languageserver-textdocument": "~1.0.11", + "yaml": "~2.4.5", "yargs": "~17.7.2" }, "bin": { @@ -103,93 +395,142 @@ "tsp-server": "cmd/tsp-server.js" }, "engines": { - "node": ">=16.0.0" + "node": ">=18.0.0" } }, 
"node_modules/@typespec/http": { - "version": "0.48.0", - "resolved": "https://registry.npmjs.org/@typespec/http/-/http-0.48.0.tgz", - "integrity": "sha512-e+0Y0Ky71flUNZSRzCfoOm8XvXsSYGmQgB9VZFDbLl8mQlXwuTfib4tWrU531TCtZHMnylbXx2wAk5+3uC6b9g==", + "version": "0.58.0", + "resolved": "https://registry.npmjs.org/@typespec/http/-/http-0.58.0.tgz", + "integrity": "sha512-jQpkugg9AZVrNDMkDIgZRpIoRkkU2b0LtKWqMGg33MItYj9/DYSgDtY7xb7oCBppRtFFZ/h138HyhYl3zQxZRg==", "peer": true, "engines": { - "node": ">=16.0.0" + "node": ">=18.0.0" + }, + "peerDependencies": { + "@typespec/compiler": "~0.58.0" + } + }, + "node_modules/@typespec/http-client-csharp": { + "version": "0.1.9-alpha.20240718.1", + "resolved": "https://registry.npmjs.org/@typespec/http-client-csharp/-/http-client-csharp-0.1.9-alpha.20240718.1.tgz", + "integrity": "sha512-4bB2Q/ZJiQs63tLQKB4ao1DtbIhSipKM9zsHComz7N3p95rvVw2a2EnXCfqqgmJgHERgykcLB0X8CljuJGPHMA==", + "dependencies": { + "json-serialize-refs": "0.1.0-0" }, "peerDependencies": { - "@typespec/compiler": "~0.48.0" + "@azure-tools/typespec-azure-core": ">=0.36.0 <1.0.0", + "@azure-tools/typespec-client-generator-core": ">=0.36.0 <1.0.0", + "@typespec/compiler": ">=0.50.0 <1.0.0", + "@typespec/http": ">=0.50.0 <1.0.0", + "@typespec/openapi": ">=0.50.0 <1.0.0", + "@typespec/rest": ">=0.50.0 <1.0.0", + "@typespec/versioning": ">=0.50.0 <1.0.0" } }, "node_modules/@typespec/openapi": { - "version": "0.49.0-dev.4", - "resolved": "https://registry.npmjs.org/@typespec/openapi/-/openapi-0.49.0-dev.4.tgz", - "integrity": "sha512-qH2borMxQoAdiMDvd88MTvlF2vFZUzusDFtxmKx/GEy+aqkw7pAnR0fqeCbPGR/P8a6slpDchusY/le3608yAQ==", + "version": "0.58.0", + "resolved": "https://registry.npmjs.org/@typespec/openapi/-/openapi-0.58.0.tgz", + "integrity": "sha512-gu6nXfmpfZrfq8Etpgl1dpMfsXii7EzQyhZgsPhIy7ZwV5bDmFk1/oyhTqIpWrnr4pD3r151T2BQjzJefjf15A==", + "peer": true, "engines": { - "node": ">=16.0.0" + "node": ">=18.0.0" }, "peerDependencies": { - "@typespec/compiler": "~0.48.1 || 
>=0.49.0-dev <0.49.0", - "@typespec/http": "~0.48.0 || >=0.49.0-dev <0.49.0" + "@typespec/compiler": "~0.58.0", + "@typespec/http": "~0.58.0" } }, "node_modules/@typespec/openapi3": { - "version": "0.49.0-dev.10", - "resolved": "https://registry.npmjs.org/@typespec/openapi3/-/openapi3-0.49.0-dev.10.tgz", - "integrity": "sha512-J9oiVJKv3pTcNIUzftHS676w4LOxvQe6fqAAx37Nql7SJ3AZrqHXwOrlxjMKZHifU7T+V/KZKF8Y6Li4ORPTPw==", + "version": "0.58.0", + "resolved": "https://registry.npmjs.org/@typespec/openapi3/-/openapi3-0.58.0.tgz", + "integrity": "sha512-G9t9CWT9cN6ip39dLZaE6JdEDxGsFyOUxA2s6a087rweoTH85XzsFiQL7uiUD8vHhXyEo6tF6sy3LMZVN0BsoQ==", "dependencies": { - "yaml": "~2.3.2" + "@readme/openapi-parser": "~2.6.0", + "yaml": "~2.4.5" + }, + "bin": { + "tsp-openapi3": "cmd/tsp-openapi3.js" }, "engines": { - "node": ">=16.0.0" + "node": ">=18.0.0" }, "peerDependencies": { - "@typespec/compiler": "~0.48.1 || >=0.49.0-dev <0.49.0", - "@typespec/http": "~0.48.0 || >=0.49.0-dev <0.49.0", - "@typespec/openapi": "~0.48.0 || >=0.49.0-dev <0.49.0", - "@typespec/versioning": "~0.48.0 || >=0.49.0-dev <0.49.0" + "@typespec/compiler": "~0.58.0", + "@typespec/http": "~0.58.0", + "@typespec/openapi": "~0.58.0", + "@typespec/versioning": "~0.58.0" } }, "node_modules/@typespec/rest": { - "version": "0.49.0-dev.3", - "resolved": "https://registry.npmjs.org/@typespec/rest/-/rest-0.49.0-dev.3.tgz", - "integrity": "sha512-/33xOp3N5wtUZ6O+kNssIzCEXR7+fjThtGysnsUL0lS8W3OesCgF9gKZH9fB0beaRlccmzFoRcHSOQLwalkfmg==", + "version": "0.58.0", + "resolved": "https://registry.npmjs.org/@typespec/rest/-/rest-0.58.0.tgz", + "integrity": "sha512-QBxkED0/KQKG22pwzis0n7BY+uLMSZZPSoVe/ESBFika9n5/yyeQ0l58xbFFwwfxAxe4xwuZ5PNwTdEXZbzr5g==", + "peer": true, "engines": { - "node": ">=16.0.0" + "node": ">=18.0.0" }, "peerDependencies": { - "@typespec/compiler": "~0.48.1 || >=0.49.0-dev <0.49.0", - "@typespec/http": "~0.48.0 || >=0.49.0-dev <0.49.0" + "@typespec/compiler": "~0.58.0", + "@typespec/http": "~0.58.0" } }, 
"node_modules/@typespec/versioning": { - "version": "0.48.0", - "resolved": "https://registry.npmjs.org/@typespec/versioning/-/versioning-0.48.0.tgz", - "integrity": "sha512-WF26vmMPwizhSnjX0ox23nbp7hthtB4cN/J5w1tlryXyp/BXySHoYsJEMK7fviSpj4WdreVXdM6wmRIG33zqig==", + "version": "0.58.0", + "resolved": "https://registry.npmjs.org/@typespec/versioning/-/versioning-0.58.0.tgz", + "integrity": "sha512-brnQQ3wKWh4AbgqmnVLj+8zyOaDk9VPWg4QBecdQxzz7PrSrlAzIzRfeIyr67+hwi/0SvkTAB6GNH7YYTypKGA==", "peer": true, "engines": { - "node": ">=16.0.0" + "node": ">=18.0.0" }, "peerDependencies": { - "@typespec/compiler": "~0.48.0" + "@typespec/compiler": "~0.58.0" + } + }, + "node_modules/@typespec/xml": { + "version": "0.58.0", + "resolved": "https://registry.npmjs.org/@typespec/xml/-/xml-0.58.0.tgz", + "integrity": "sha512-2OG2JXypMZR90nVoIVwxc+6Sc4lsggLPuh1yhw1BKiv2NBAuSdfcK4MuiIooqR206QJw9xM8qtNxlUeDb00DDw==", + "peer": true, + "engines": { + "node": ">=18.0.0" + }, + "peerDependencies": { + "@typespec/compiler": "~0.58.0" } }, "node_modules/ajv": { - "version": "8.12.0", - "resolved": "https://registry.npmjs.org/ajv/-/ajv-8.12.0.tgz", - "integrity": "sha512-sRu1kpcO9yLtYxBKvqfTeh9KzZEwO3STyX1HT+4CaDzC6HpTGYhIhPIzj9XuKU7KYDwnaeh5hcOwjy1QuJzBPA==", + "version": "8.16.0", + "resolved": "https://registry.npmjs.org/ajv/-/ajv-8.16.0.tgz", + "integrity": "sha512-F0twR8U1ZU67JIEtekUcLkXkoO5mMMmgGD8sK/xUFzJ805jxHQl92hImFAqqXMyMYjSPOyUPAwHYhB72g5sTXw==", "dependencies": { - "fast-deep-equal": "^3.1.1", + "fast-deep-equal": "^3.1.3", "json-schema-traverse": "^1.0.0", "require-from-string": "^2.0.2", - "uri-js": "^4.2.2" + "uri-js": "^4.4.1" }, "funding": { "type": "github", "url": "https://github.com/sponsors/epoberezkin" } }, + "node_modules/ajv-draft-04": { + "version": "1.0.0", + "resolved": "https://registry.npmjs.org/ajv-draft-04/-/ajv-draft-04-1.0.0.tgz", + "integrity": "sha512-mv00Te6nmYbRp5DCwclxtt7yV/joXJPGS7nM+97GdxvuttCOfgI3K4U25zboyeX0O+myI8ERluxQe5wljMmVIw==", + 
"peerDependencies": { + "ajv": "^8.5.0" + }, + "peerDependenciesMeta": { + "ajv": { + "optional": true + } + } + }, "node_modules/ansi-regex": { "version": "5.0.1", "resolved": "https://registry.npmjs.org/ansi-regex/-/ansi-regex-5.0.1.tgz", "integrity": "sha512-quJQXlTSUGL2LH9SUXo8VwsY4soanhgo6LNSm84E1LBcE8s3O0wpdiRzyR9z/ZZJMlMWv37qOOb9pdJlMUEKFQ==", + "peer": true, "engines": { "node": ">=8" } @@ -205,35 +546,27 @@ "node": ">=4" } }, + "node_modules/argparse": { + "version": "2.0.1", + "resolved": "https://registry.npmjs.org/argparse/-/argparse-2.0.1.tgz", + "integrity": "sha512-8+9WqebbFzpX9OR+Wa6O29asIogeRMzcGtAINdpMHHyAg10f05aSFVBbcEqGf/PXw1EjAZ+q2/bEBg3DvurK3Q==" + }, "node_modules/braces": { - "version": "3.0.2", - "resolved": "https://registry.npmjs.org/braces/-/braces-3.0.2.tgz", - "integrity": "sha512-b8um+L1RzM3WDSzvhm6gIz1yfTbBt6YTlcEKAvsmqCZZFw46z626lVj9j1yEPW33H5H+lBQpZMP1k8l+78Ha0A==", + "version": "3.0.3", + "resolved": "https://registry.npmjs.org/braces/-/braces-3.0.3.tgz", + "integrity": "sha512-yQbXgO/OSZVD2IsiLlro+7Hf6Q18EJrKSEsdoMzKePKXct3gvD8oLcOQdIzGupr5Fj+EDe8gO/lxc1BzfMpxvA==", + "peer": true, "dependencies": { - "fill-range": "^7.0.1" + "fill-range": "^7.1.1" }, "engines": { "node": ">=8" } }, - "node_modules/camel-case": { - "version": "4.1.2", - "resolved": "https://registry.npmjs.org/camel-case/-/camel-case-4.1.2.tgz", - "integrity": "sha512-gxGWBrTT1JuMx6R+o5PTXMmUnhnVzLQ9SNutD4YqKtI6ap897t3tKECYla6gCWEkplXnlNybEkZg9GEGxKFCgw==", - "dependencies": { - "pascal-case": "^3.1.2", - "tslib": "^2.0.3" - } - }, - "node_modules/capital-case": { - "version": "1.0.4", - "resolved": "https://registry.npmjs.org/capital-case/-/capital-case-1.0.4.tgz", - "integrity": "sha512-ds37W8CytHgwnhGGTi88pcPyR15qoNkOpYwmMMfnWqqWgESapLqvDx6huFjQ5vqWSn2Z06173XNA7LtMOeUh1A==", - "dependencies": { - "no-case": "^3.0.4", - "tslib": "^2.0.3", - "upper-case-first": "^2.0.2" - } + "node_modules/call-me-maybe": { + "version": "1.0.2", + "resolved": 
"https://registry.npmjs.org/call-me-maybe/-/call-me-maybe-1.0.2.tgz", + "integrity": "sha512-HpX65o1Hnr9HH25ojC1YGs7HCQLq0GCOibSaWER0eNpgJ/Z1MZv2mTc7+xh6WOPxbRVcmgbv4hGU+uSQ/2xFZQ==" }, "node_modules/chalk": { "version": "2.4.2", @@ -249,28 +582,16 @@ } }, "node_modules/change-case": { - "version": "4.1.2", - "resolved": "https://registry.npmjs.org/change-case/-/change-case-4.1.2.tgz", - "integrity": "sha512-bSxY2ws9OtviILG1EiY5K7NNxkqg/JnRnFxLtKQ96JaviiIxi7djMrSd0ECT9AC+lttClmYwKw53BWpOMblo7A==", - "dependencies": { - "camel-case": "^4.1.2", - "capital-case": "^1.0.4", - "constant-case": "^3.0.4", - "dot-case": "^3.0.4", - "header-case": "^2.0.4", - "no-case": "^3.0.4", - "param-case": "^3.0.4", - "pascal-case": "^3.1.2", - "path-case": "^3.0.4", - "sentence-case": "^3.0.4", - "snake-case": "^3.0.4", - "tslib": "^2.0.3" - } + "version": "5.4.4", + "resolved": "https://registry.npmjs.org/change-case/-/change-case-5.4.4.tgz", + "integrity": "sha512-HRQyTk2/YPEkt9TnUPbOpr64Uw3KOicFWPVBb+xiHvd6eBx/qPr9xqfBFDT8P2vWsvvz4jbEkfDe71W3VyNu2w==", + "peer": true }, "node_modules/cliui": { "version": "8.0.1", "resolved": "https://registry.npmjs.org/cliui/-/cliui-8.0.1.tgz", "integrity": "sha512-BSeNnyus75C4//NQ9gQt1/csTXyo/8Sb+afLAkzAptFuMsod9HFokGNudZpi/oQV73hnVK+sR+5PVRMd+Dr7YQ==", + "peer": true, "dependencies": { "string-width": "^4.2.0", "strip-ansi": "^6.0.1", @@ -280,6 +601,14 @@ "node": ">=12" } }, + "node_modules/code-error-fragment": { + "version": "0.0.230", + "resolved": "https://registry.npmjs.org/code-error-fragment/-/code-error-fragment-0.0.230.tgz", + "integrity": "sha512-cadkfKp6932H8UkhzE/gcUqhRMNf8jHzkAN7+5Myabswaghu4xABTgPHDCjW+dBAJxj/SpkTYokpzDqY4pCzQw==", + "engines": { + "node": ">= 4" + } + }, "node_modules/color-convert": { "version": "1.9.3", "resolved": "https://registry.npmjs.org/color-convert/-/color-convert-1.9.3.tgz", @@ -293,45 +622,17 @@ "resolved": "https://registry.npmjs.org/color-name/-/color-name-1.1.3.tgz", "integrity": 
"sha512-72fSenhMw2HZMTVHeCA9KCmpEIbzWiQsjN+BHcBbS9vr1mtt+vJjPdksIBNUmKAW8TFUDPJK5SUU3QhE9NEXDw==" }, - "node_modules/constant-case": { - "version": "3.0.4", - "resolved": "https://registry.npmjs.org/constant-case/-/constant-case-3.0.4.tgz", - "integrity": "sha512-I2hSBi7Vvs7BEuJDr5dDHfzb/Ruj3FyvFyh7KLilAjNQw3Be+xgqUBA2W6scVEcL0hL1dwPRtIqEPVUCKkSsyQ==", - "dependencies": { - "no-case": "^3.0.4", - "tslib": "^2.0.3", - "upper-case": "^2.0.2" - } - }, - "node_modules/dir-glob": { - "version": "3.0.1", - "resolved": "https://registry.npmjs.org/dir-glob/-/dir-glob-3.0.1.tgz", - "integrity": "sha512-WkrWp9GR4KXfKGYzOLmTuGVi1UWFfws377n9cc55/tb6DuqyF6pcQ5AbiHEshaDpY9v6oaSr2XCDidGmMwdzIA==", - "dependencies": { - "path-type": "^4.0.0" - }, - "engines": { - "node": ">=8" - } - }, - "node_modules/dot-case": { - "version": "3.0.4", - "resolved": "https://registry.npmjs.org/dot-case/-/dot-case-3.0.4.tgz", - "integrity": "sha512-Kv5nKlh6yRrdrGvxeJ2e5y2eRUpkUosIW4A2AS38zwSz27zu7ufDwQPi5Jhs3XAlGNetl3bmnGhQsMtkKJnj3w==", - "dependencies": { - "no-case": "^3.0.4", - "tslib": "^2.0.3" - } - }, "node_modules/emoji-regex": { "version": "8.0.0", "resolved": "https://registry.npmjs.org/emoji-regex/-/emoji-regex-8.0.0.tgz", - "integrity": "sha512-MSjYzcWNOA0ewAHpz0MxpYFvwg6yjy1NG3xteoqz644VCo/RPgnr1/GGt+ic3iJTzQ8Eu3TdM14SawnVUmGE6A==" + "integrity": "sha512-MSjYzcWNOA0ewAHpz0MxpYFvwg6yjy1NG3xteoqz644VCo/RPgnr1/GGt+ic3iJTzQ8Eu3TdM14SawnVUmGE6A==", + "peer": true }, "node_modules/escalade": { - "version": "3.1.1", - "resolved": "https://registry.npmjs.org/escalade/-/escalade-3.1.1.tgz", - "integrity": "sha512-k0er2gUkLf8O0zKJiAhmkTnJlTvINGv7ygDNPbeIsX/TJjGJZHuh9B2UxbsaEkmlEo9MfhrSzmhIlhRlI2GXnw==", + "version": "3.1.2", + "resolved": "https://registry.npmjs.org/escalade/-/escalade-3.1.2.tgz", + "integrity": "sha512-ErCHMCae19vR8vQGe50xIsVomy19rg6gFu3+r3jkEO46suLMWBksvVyoGgQV+jOfl84ZSOSlmv6Gxa89PmTGmA==", + "peer": true, "engines": { "node": ">=6" } @@ -350,9 +651,10 @@ "integrity": 
"sha512-f3qQ9oQy9j2AhBe/H9VC91wLmKBCCU/gDOnKNAYG5hswO7BLKj09Hc5HYNz9cGI++xlpDCIgDaitVs03ATR84Q==" }, "node_modules/fast-glob": { - "version": "3.3.1", - "resolved": "https://registry.npmjs.org/fast-glob/-/fast-glob-3.3.1.tgz", - "integrity": "sha512-kNFPyjhh5cKjrUltxs+wFx+ZkbRaxxmZ+X0ZU31SOsxCEtP9VPgtq2teZw1DebupL5GmDaNQ6yKMMVcM41iqDg==", + "version": "3.3.2", + "resolved": "https://registry.npmjs.org/fast-glob/-/fast-glob-3.3.2.tgz", + "integrity": "sha512-oX2ruAFQwf/Orj8m737Y5adxDQO0LAB7/S5MnxCdTNDd4p6BsyIVsv9JQsATbTSq8KHRpLwIHbVlUNatxd+1Ow==", + "peer": true, "dependencies": { "@nodelib/fs.stat": "^2.0.2", "@nodelib/fs.walk": "^1.2.3", @@ -365,17 +667,19 @@ } }, "node_modules/fastq": { - "version": "1.15.0", - "resolved": "https://registry.npmjs.org/fastq/-/fastq-1.15.0.tgz", - "integrity": "sha512-wBrocU2LCXXa+lWBt8RoIRD89Fi8OdABODa/kEnyeyjS5aZO5/GNvI5sEINADqP/h8M29UHTHUb53sUu5Ihqdw==", + "version": "1.17.1", + "resolved": "https://registry.npmjs.org/fastq/-/fastq-1.17.1.tgz", + "integrity": "sha512-sRVD3lWVIXWg6By68ZN7vho9a1pQcN/WBFaAAsDDFzlJjvoGx0P8z7V1t72grFJfJhu3YPZBuu25f7Kaw2jN1w==", + "peer": true, "dependencies": { "reusify": "^1.0.4" } }, "node_modules/fill-range": { - "version": "7.0.1", - "resolved": "https://registry.npmjs.org/fill-range/-/fill-range-7.0.1.tgz", - "integrity": "sha512-qOo9F+dMUmC2Lcb4BbVvnKJxTPjCm+RRpe4gDuGrzkL7mEVl/djYSu2OdQ2Pa302N4oqkSg9ir6jaLWJ2USVpQ==", + "version": "7.1.1", + "resolved": "https://registry.npmjs.org/fill-range/-/fill-range-7.1.1.tgz", + "integrity": "sha512-YsGpe3WHLK8ZYi4tWDg2Jy3ebRz2rXowDxnld4bkQB00cc/1Zw9AWnC0i9ztDJitivtQvaI9KaLyKrc+hBW0yg==", + "peer": true, "dependencies": { "to-regex-range": "^5.0.1" }, @@ -387,6 +691,7 @@ "version": "2.0.5", "resolved": "https://registry.npmjs.org/get-caller-file/-/get-caller-file-2.0.5.tgz", "integrity": "sha512-DyFP3BM/3YHTQOCUL/w0OZHR0lpKeGrxotcHWcqNEdnltqFwXVfhEBQ94eIo34AfQpo0rGki4cyIiftY06h2Fg==", + "peer": true, "engines": { "node": "6.* || 8.* || >= 10.*" } @@ 
-395,6 +700,7 @@ "version": "5.1.2", "resolved": "https://registry.npmjs.org/glob-parent/-/glob-parent-5.1.2.tgz", "integrity": "sha512-AOIgSQCepiJYwP3ARnGx+5VnTu2HBYdzbGP45eLw1vr3zB3vZLeyed1sC9hnbcOc9/SrMyM5RPQrkGz4aS9Zow==", + "peer": true, "dependencies": { "is-glob": "^4.0.1" }, @@ -403,23 +709,30 @@ } }, "node_modules/globby": { - "version": "13.2.2", - "resolved": "https://registry.npmjs.org/globby/-/globby-13.2.2.tgz", - "integrity": "sha512-Y1zNGV+pzQdh7H39l9zgB4PJqjRNqydvdYCDG4HFXM4XuvSaQQlEc91IU1yALL8gUTDomgBAfz3XJdmUS+oo0w==", + "version": "14.0.2", + "resolved": "https://registry.npmjs.org/globby/-/globby-14.0.2.tgz", + "integrity": "sha512-s3Fq41ZVh7vbbe2PN3nrW7yC7U7MFVc5c98/iTl9c2GawNMKx/J648KQRW6WKkuU8GIbbh2IXfIRQjOZnXcTnw==", + "peer": true, "dependencies": { - "dir-glob": "^3.0.1", - "fast-glob": "^3.3.0", + "@sindresorhus/merge-streams": "^2.1.0", + "fast-glob": "^3.3.2", "ignore": "^5.2.4", - "merge2": "^1.4.1", - "slash": "^4.0.0" + "path-type": "^5.0.0", + "slash": "^5.1.0", + "unicorn-magic": "^0.1.0" }, "engines": { - "node": "^12.20.0 || ^14.13.1 || >=16.0.0" + "node": ">=18" }, "funding": { "url": "https://github.com/sponsors/sindresorhus" } }, + "node_modules/grapheme-splitter": { + "version": "1.0.4", + "resolved": "https://registry.npmjs.org/grapheme-splitter/-/grapheme-splitter-1.0.4.tgz", + "integrity": "sha512-bzh50DW9kTPM00T8y4o8vQg89Di9oLJVLW/KaOGIXJWP/iqCN6WKYkbNOF04vFLJhwcpYUh9ydh/+5vpOqV4YQ==" + }, "node_modules/has-flag": { "version": "3.0.0", "resolved": "https://registry.npmjs.org/has-flag/-/has-flag-3.0.0.tgz", @@ -428,19 +741,11 @@ "node": ">=4" } }, - "node_modules/header-case": { - "version": "2.0.4", - "resolved": "https://registry.npmjs.org/header-case/-/header-case-2.0.4.tgz", - "integrity": "sha512-H/vuk5TEEVZwrR0lp2zed9OCo1uAILMlx0JEMgC26rzyJJ3N1v6XkwHHXJQdR2doSjcGPM6OKPYoJgf0plJ11Q==", - "dependencies": { - "capital-case": "^1.0.4", - "tslib": "^2.0.3" - } - }, "node_modules/ignore": { - "version": "5.2.4", - 
"resolved": "https://registry.npmjs.org/ignore/-/ignore-5.2.4.tgz", - "integrity": "sha512-MAb38BcSbH0eHNBxn7ql2NH/kX33OkB3lZ1BNdh7ENeRChHTYsTvWrMubiIAMNS2llXEEgZ1MUOBtXChP3kaFQ==", + "version": "5.3.1", + "resolved": "https://registry.npmjs.org/ignore/-/ignore-5.3.1.tgz", + "integrity": "sha512-5Fytz/IraMjqpwfd34ke28PTVMjZjJG2MPn5t7OE4eUCUNf8BAa7b5WUS9/Qvr6mwOQS7Mk6vdsMno5he+T8Xw==", + "peer": true, "engines": { "node": ">= 4" } @@ -449,6 +754,7 @@ "version": "2.1.1", "resolved": "https://registry.npmjs.org/is-extglob/-/is-extglob-2.1.1.tgz", "integrity": "sha512-SbKbANkN603Vi4jEZv49LeVJMn4yGwsbzZworEoyEiutsN3nJYdbO36zfhGJ6QEDpOZIFkDtnq5JRxmvl3jsoQ==", + "peer": true, "engines": { "node": ">=0.10.0" } @@ -457,6 +763,7 @@ "version": "3.0.0", "resolved": "https://registry.npmjs.org/is-fullwidth-code-point/-/is-fullwidth-code-point-3.0.0.tgz", "integrity": "sha512-zymm5+u+sCsSWyD9qNaejV3DFvhCKclKdizYaJUuHA83RLjb7nSuGnddCHGv0hk+KY7BMAlsWeK4Ueg6EV6XQg==", + "peer": true, "engines": { "node": ">=8" } @@ -465,6 +772,7 @@ "version": "4.0.3", "resolved": "https://registry.npmjs.org/is-glob/-/is-glob-4.0.3.tgz", "integrity": "sha512-xelSayHH36ZgE7ZWhli7pW34hNbNl8Ojv5KVmkJD4hBdD3th8Tfk9vYasLM+mXWOZhFkgZfxhLSnrwRr4elSSg==", + "peer": true, "dependencies": { "is-extglob": "^2.1.1" }, @@ -476,6 +784,7 @@ "version": "7.0.0", "resolved": "https://registry.npmjs.org/is-number/-/is-number-7.0.0.tgz", "integrity": "sha512-41Cifkg6e8TylSpdtTpeLVMqvSBEVzTttHvERD741+pnZ8ANv0004MRL43QKPDlK9cGvNp6NZWZUBlbGXYxxng==", + "peer": true, "engines": { "node": ">=0.12.0" } @@ -485,52 +794,80 @@ "resolved": "https://registry.npmjs.org/js-tokens/-/js-tokens-4.0.0.tgz", "integrity": "sha512-RdJUflcE3cUzKiMqQgsCu06FPu9UdIJO0beYbPhHN4k6apgJtifcoCtT9bcxOpYBtpD2kCM6Sbzg4CausW/PKQ==" }, + "node_modules/js-yaml": { + "version": "4.1.0", + "resolved": "https://registry.npmjs.org/js-yaml/-/js-yaml-4.1.0.tgz", + "integrity": 
"sha512-wpxZs9NoxZaJESJGIZTyDEaYpl0FKSA+FB9aJiyemKhMwkxQg63h4T1KJgUGHpTqPDNRcmmYLugrRjJlBtWvRA==", + "dependencies": { + "argparse": "^2.0.1" + }, + "bin": { + "js-yaml": "bin/js-yaml.js" + } + }, "node_modules/json-schema-traverse": { "version": "1.0.0", "resolved": "https://registry.npmjs.org/json-schema-traverse/-/json-schema-traverse-1.0.0.tgz", "integrity": "sha512-NM8/P9n3XjXhIZn1lLhkFaACTOURQXjWhV4BA/RnOv8xvgqtqpAX9IO4mRQxSx1Rlo4tqzeqb0sOlruaOy3dug==" }, + "node_modules/json-serialize-refs": { + "version": "0.1.0-0", + "resolved": "https://registry.npmjs.org/json-serialize-refs/-/json-serialize-refs-0.1.0-0.tgz", + "integrity": "sha512-SnNMfW2RRPDXIMKa8zdLb59UjMSI1UFZCtIb8ae68GcZ0a6x8b77lIWqqTOdq1azzmkXupD6UWriPLd0JCrFng==" + }, + "node_modules/json-to-ast": { + "version": "2.1.0", + "resolved": "https://registry.npmjs.org/json-to-ast/-/json-to-ast-2.1.0.tgz", + "integrity": "sha512-W9Lq347r8tA1DfMvAGn9QNcgYm4Wm7Yc+k8e6vezpMnRT+NHbtlxgNBXRVjXe9YM6eTn6+p/MKOlV/aABJcSnQ==", + "dependencies": { + "code-error-fragment": "0.0.230", + "grapheme-splitter": "^1.0.4" + }, + "engines": { + "node": ">= 4" + } + }, + "node_modules/jsonpointer": { + "version": "5.0.1", + "resolved": "https://registry.npmjs.org/jsonpointer/-/jsonpointer-5.0.1.tgz", + "integrity": "sha512-p/nXbhSEcu3pZRdkW1OfJhpsVtW1gd4Wa1fnQc9YLiTfAjn0312eMKimbdIQzuZl9aa9xUGaRlP9T/CJE/ditQ==", + "engines": { + "node": ">=0.10.0" + } + }, "node_modules/kleur": { "version": "3.0.3", "resolved": "https://registry.npmjs.org/kleur/-/kleur-3.0.3.tgz", "integrity": "sha512-eTIzlVOSUR+JxdDFepEYcBMtZ9Qqdef+rnzWdRZuMbOywu5tO2w2N7rqjoANZ5k9vywhL6Br1VRjUIgTQx4E8w==", + "peer": true, "engines": { "node": ">=6" } }, - "node_modules/lower-case": { - "version": "2.0.2", - "resolved": "https://registry.npmjs.org/lower-case/-/lower-case-2.0.2.tgz", - "integrity": "sha512-7fm3l3NAF9WfN6W3JOmf5drwpVqX78JtoGJ3A6W0a6ZnldM41w2fV5D490psKFTpMds8TJse/eHLFFsNHHjHgg==", - "dependencies": { - "tslib": "^2.0.3" - } - }, - 
"node_modules/lru-cache": { - "version": "6.0.0", - "resolved": "https://registry.npmjs.org/lru-cache/-/lru-cache-6.0.0.tgz", - "integrity": "sha512-Jo6dJ04CmSjuznwJSS3pUeWmd/H0ffTlkXXgwZi+eq1UCmqQwCh+eLsYOYCwY991i2Fah4h1BEMCx4qThGbsiA==", - "dependencies": { - "yallist": "^4.0.0" - }, + "node_modules/leven": { + "version": "3.1.0", + "resolved": "https://registry.npmjs.org/leven/-/leven-3.1.0.tgz", + "integrity": "sha512-qsda+H8jTaUaN/x5vzW2rzc+8Rw4TAQ/4KjB46IwK5VH+IlVeeeje/EoZRpiXvIqjFgK84QffqPztGI3VBLG1A==", "engines": { - "node": ">=10" + "node": ">=6" } }, "node_modules/merge2": { "version": "1.4.1", "resolved": "https://registry.npmjs.org/merge2/-/merge2-1.4.1.tgz", "integrity": "sha512-8q7VEgMJW4J8tcfVPy8g09NcQwZdbwFEqhe/WZkoIzjn/3TGDwtOCYtXGxA3O8tPzpczCCDgv+P2P5y00ZJOOg==", + "peer": true, "engines": { "node": ">= 8" } }, "node_modules/micromatch": { - "version": "4.0.5", - "resolved": "https://registry.npmjs.org/micromatch/-/micromatch-4.0.5.tgz", - "integrity": "sha512-DMy+ERcEW2q8Z2Po+WNXuw3c5YaUSFjAO5GsJqfEl7UjvtIuFKO6ZrKvcItdy98dwFI2N1tg3zNIdKaQT+aNdA==", + "version": "4.0.7", + "resolved": "https://registry.npmjs.org/micromatch/-/micromatch-4.0.7.tgz", + "integrity": "sha512-LPP/3KorzCwBxfeUuZmaR6bG2kdeHSbe0P2tY3FLRU4vYrjYz5hI4QZwV0njUx3jeuKe67YukQ1LSPZBKDqO/Q==", + "peer": true, "dependencies": { - "braces": "^3.0.2", + "braces": "^3.0.3", "picomatch": "^2.3.1" }, "engines": { @@ -541,63 +878,39 @@ "version": "4.2.0", "resolved": "https://registry.npmjs.org/mustache/-/mustache-4.2.0.tgz", "integrity": "sha512-71ippSywq5Yb7/tVYyGbkBggbU8H3u5Rz56fH60jGFgr8uHwxs+aSKeqmluIVzM0m0kB7xQjKS6qPfd0b2ZoqQ==", + "peer": true, "bin": { "mustache": "bin/mustache" } }, - "node_modules/no-case": { - "version": "3.0.4", - "resolved": "https://registry.npmjs.org/no-case/-/no-case-3.0.4.tgz", - "integrity": "sha512-fgAN3jGAh+RoxUGZHTSOLJIqUc2wmoBwGR4tbpNAKmmovFoWq0OdRkb0VkldReO2a2iBT/OEulG9XSUc10r3zg==", - "dependencies": { - "lower-case": "^2.0.2", - "tslib": "^2.0.3" 
- } - }, - "node_modules/param-case": { - "version": "3.0.4", - "resolved": "https://registry.npmjs.org/param-case/-/param-case-3.0.4.tgz", - "integrity": "sha512-RXlj7zCYokReqWpOPH9oYivUzLYZ5vAPIfEmCTNViosC78F8F0H9y7T7gG2M39ymgutxF5gcFEsyZQSph9Bp3A==", - "dependencies": { - "dot-case": "^3.0.4", - "tslib": "^2.0.3" - } - }, - "node_modules/pascal-case": { - "version": "3.1.2", - "resolved": "https://registry.npmjs.org/pascal-case/-/pascal-case-3.1.2.tgz", - "integrity": "sha512-uWlGT3YSnK9x3BQJaOdcZwrnV6hPpd8jFH1/ucpiLRPh/2zCVJKS19E4GvYHvaCcACn3foXZ0cLB9Wrx1KGe5g==", - "dependencies": { - "no-case": "^3.0.4", - "tslib": "^2.0.3" - } - }, - "node_modules/path-case": { - "version": "3.0.4", - "resolved": "https://registry.npmjs.org/path-case/-/path-case-3.0.4.tgz", - "integrity": "sha512-qO4qCFjXqVTrcbPt/hQfhTQ+VhFsqNKOPtytgNKkKxSoEp3XPUQ8ObFuePylOIok5gjn69ry8XiULxCwot3Wfg==", - "dependencies": { - "dot-case": "^3.0.4", - "tslib": "^2.0.3" - } + "node_modules/openapi-types": { + "version": "12.1.3", + "resolved": "https://registry.npmjs.org/openapi-types/-/openapi-types-12.1.3.tgz", + "integrity": "sha512-N4YtSYJqghVu4iek2ZUvcN/0aqH1kRDuNqzcycDxhOUpg7GdvLa2F3DgS6yBNhInhv2r/6I0Flkn7CqL8+nIcw==", + "peer": true }, "node_modules/path-type": { - "version": "4.0.0", - "resolved": "https://registry.npmjs.org/path-type/-/path-type-4.0.0.tgz", - "integrity": "sha512-gDKb8aZMDeD/tZWs9P6+q0J9Mwkdl6xMV8TjnGP3qJVJ06bdMgkbBlLU8IdfOsIsFz2BW1rNVT3XuNEl8zPAvw==", + "version": "5.0.0", + "resolved": "https://registry.npmjs.org/path-type/-/path-type-5.0.0.tgz", + "integrity": "sha512-5HviZNaZcfqP95rwpv+1HDgUamezbqdSYTyzjTvwtJSnIH+3vnbmWsItli8OFEndS984VT55M3jduxZbX351gg==", + "peer": true, "engines": { - "node": ">=8" + "node": ">=12" + }, + "funding": { + "url": "https://github.com/sponsors/sindresorhus" } }, "node_modules/picocolors": { - "version": "1.0.0", - "resolved": "https://registry.npmjs.org/picocolors/-/picocolors-1.0.0.tgz", - "integrity": 
"sha512-1fygroTLlHu66zi26VoTDv8yRgm0Fccecssto+MhsZ0D/DGW2sm8E8AjW7NU5VVTRt5GxbeZ5qBuJr+HyLYkjQ==" + "version": "1.0.1", + "resolved": "https://registry.npmjs.org/picocolors/-/picocolors-1.0.1.tgz", + "integrity": "sha512-anP1Z8qwhkbmu7MFP5iTt+wQKXgwzf7zTyGlcdzabySa9vd0Xt392U0rVmz9poOaBj0uHJKyyo9/upk0HrEQew==" }, "node_modules/picomatch": { "version": "2.3.1", "resolved": "https://registry.npmjs.org/picomatch/-/picomatch-2.3.1.tgz", "integrity": "sha512-JU3teHTNjmE2VCGFzuY8EXzCDVwEqB2a8fsIvwaStHhAWJEeVd1o1QD80CU6+ZdEXXSLbSsuLwJjkCBWqRQUVA==", + "peer": true, "engines": { "node": ">=8.6" }, @@ -605,10 +918,20 @@ "url": "https://github.com/sponsors/jonschlinkert" } }, + "node_modules/pluralize": { + "version": "8.0.0", + "resolved": "https://registry.npmjs.org/pluralize/-/pluralize-8.0.0.tgz", + "integrity": "sha512-Nc3IT5yHzflTfbjgqWcCPpo7DaKy4FnpB0l/zCAW0Tc7jxAiuqSxHasntB3D7887LSrA93kDJ9IXovxJYxyLCA==", + "peer": true, + "engines": { + "node": ">=4" + } + }, "node_modules/prettier": { - "version": "3.0.3", - "resolved": "https://registry.npmjs.org/prettier/-/prettier-3.0.3.tgz", - "integrity": "sha512-L/4pUDMxcNa8R/EthV08Zt42WBO4h1rarVtK0K+QJG0X187OLo7l699jWw0GKuwzkPQ//jMFA/8Xm6Fh3J/DAg==", + "version": "3.3.3", + "resolved": "https://registry.npmjs.org/prettier/-/prettier-3.3.3.tgz", + "integrity": "sha512-i2tDNA0O5IrMO757lfrdQZCc2jPNDVntV0m/+4whiDfWaTKfMNgR7Qz0NAeGz/nRqF4m5/6CLzbP4/liHt12Ew==", + "peer": true, "bin": { "prettier": "bin/prettier.cjs" }, @@ -623,6 +946,7 @@ "version": "2.4.2", "resolved": "https://registry.npmjs.org/prompts/-/prompts-2.4.2.tgz", "integrity": "sha512-NxNv/kLguCA7p3jE8oL2aEBsrJWgAakBpgmgK6lpPWV+WuOmY6r2/zbAVnP+T8bQlA0nzHXSJSJW0Hq7ylaD2Q==", + "peer": true, "dependencies": { "kleur": "^3.0.3", "sisteransi": "^1.0.5" @@ -632,9 +956,9 @@ } }, "node_modules/punycode": { - "version": "2.3.0", - "resolved": "https://registry.npmjs.org/punycode/-/punycode-2.3.0.tgz", - "integrity": 
"sha512-rRV+zQD8tVFys26lAGR9WUuS4iUAngJScM+ZRSKtvl5tKeZ2t5bvdNFdNHBW9FWR4guGHlgmsZ1G7BSm2wTbuA==", + "version": "2.3.1", + "resolved": "https://registry.npmjs.org/punycode/-/punycode-2.3.1.tgz", + "integrity": "sha512-vYt7UD1U9Wg6138shLtLOvdAu+8DsC/ilFtEVHcH+wydcSpNE20AfSOduf6MkRFahL5FY7X1oU7nKVZFtfq8Fg==", "engines": { "node": ">=6" } @@ -656,12 +980,19 @@ "type": "consulting", "url": "https://feross.org/support" } - ] + ], + "peer": true + }, + "node_modules/regenerator-runtime": { + "version": "0.14.1", + "resolved": "https://registry.npmjs.org/regenerator-runtime/-/regenerator-runtime-0.14.1.tgz", + "integrity": "sha512-dYnhHh0nJoMfnkZs6GmmhFknAGRrLznOu5nc9ML+EJxGvrx6H7teuevqVqCuPcPK//3eDrrjQhehXVx9cnkGdw==" }, "node_modules/require-directory": { "version": "2.1.1", "resolved": "https://registry.npmjs.org/require-directory/-/require-directory-2.1.1.tgz", "integrity": "sha512-fGxEI7+wsG9xrvdjsrlmL22OMTTiHRwAMroiEeMgq8gzoLC/PQr7RsRDSTLUg/bZAZtF+TVIkHc6/4RIKrui+Q==", + "peer": true, "engines": { "node": ">=0.10.0" } @@ -678,6 +1009,7 @@ "version": "1.0.4", "resolved": "https://registry.npmjs.org/reusify/-/reusify-1.0.4.tgz", "integrity": "sha512-U9nH88a3fc/ekCF1l0/UP1IosiuIjyTh7hBvXVMHYgVcfGvt897Xguj2UOLDeI5BG2m7/uwyaLVT6fbtCwTyzw==", + "peer": true, "engines": { "iojs": ">=1.0.0", "node": ">=0.10.0" @@ -701,17 +1033,16 @@ "url": "https://feross.org/support" } ], + "peer": true, "dependencies": { "queue-microtask": "^1.2.2" } }, "node_modules/semver": { - "version": "7.5.4", - "resolved": "https://registry.npmjs.org/semver/-/semver-7.5.4.tgz", - "integrity": "sha512-1bCSESV6Pv+i21Hvpxp3Dx+pSD8lIPt8uVjRrxAUt/nbswYc+tK6Y2btiULjd4+fnq15PX+nqQDC7Oft7WkwcA==", - "dependencies": { - "lru-cache": "^6.0.0" - }, + "version": "7.6.3", + "resolved": "https://registry.npmjs.org/semver/-/semver-7.6.3.tgz", + "integrity": "sha512-oVekP1cKtI+CTDvHWYFUcMtsK/00wmAEfyqKfNdARm8u1wNVhSgaX7A8d4UuIlUI5e84iEwOhs7ZPYRmzU9U6A==", + "peer": true, "bin": { "semver": "bin/semver.js" }, @@ 
-719,45 +1050,29 @@ "node": ">=10" } }, - "node_modules/sentence-case": { - "version": "3.0.4", - "resolved": "https://registry.npmjs.org/sentence-case/-/sentence-case-3.0.4.tgz", - "integrity": "sha512-8LS0JInaQMCRoQ7YUytAo/xUu5W2XnQxV2HI/6uM6U7CITS1RqPElr30V6uIqyMKM9lJGRVFy5/4CuzcixNYSg==", - "dependencies": { - "no-case": "^3.0.4", - "tslib": "^2.0.3", - "upper-case-first": "^2.0.2" - } - }, "node_modules/sisteransi": { "version": "1.0.5", "resolved": "https://registry.npmjs.org/sisteransi/-/sisteransi-1.0.5.tgz", - "integrity": "sha512-bLGGlR1QxBcynn2d5YmDX4MGjlZvy2MRBDRNHLJ8VI6l6+9FUiyTFNJ0IveOSP0bcXgVDPRcfGqA0pjaqUpfVg==" + "integrity": "sha512-bLGGlR1QxBcynn2d5YmDX4MGjlZvy2MRBDRNHLJ8VI6l6+9FUiyTFNJ0IveOSP0bcXgVDPRcfGqA0pjaqUpfVg==", + "peer": true }, "node_modules/slash": { - "version": "4.0.0", - "resolved": "https://registry.npmjs.org/slash/-/slash-4.0.0.tgz", - "integrity": "sha512-3dOsAHXXUkQTpOYcoAxLIorMTp4gIQr5IW3iVb7A7lFIp0VHhnynm9izx6TssdrIcVIESAlVjtnO2K8bg+Coew==", + "version": "5.1.0", + "resolved": "https://registry.npmjs.org/slash/-/slash-5.1.0.tgz", + "integrity": "sha512-ZA6oR3T/pEyuqwMgAKT0/hAv8oAXckzbkmR0UkUosQ+Mc4RxGoJkRmwHgHufaenlyAgE1Mxgpdcrf75y6XcnDg==", + "peer": true, "engines": { - "node": ">=12" + "node": ">=14.16" }, "funding": { "url": "https://github.com/sponsors/sindresorhus" } }, - "node_modules/snake-case": { - "version": "3.0.4", - "resolved": "https://registry.npmjs.org/snake-case/-/snake-case-3.0.4.tgz", - "integrity": "sha512-LAOh4z89bGQvl9pFfNF8V146i7o7/CqFPbqzYgP+yYzDIDeS9HaNFtXABamRW+AQzEVODcvE79ljJ+8a9YSdMg==", - "dependencies": { - "dot-case": "^3.0.4", - "tslib": "^2.0.3" - } - }, "node_modules/string-width": { "version": "4.2.3", "resolved": "https://registry.npmjs.org/string-width/-/string-width-4.2.3.tgz", "integrity": "sha512-wKyQRQpjJ0sIp62ErSZdGsjMJWsap5oRNihHhu6G7JVO/9jIB6UyevL+tXuOqrng8j/cxKTWyWUwvSTriiZz/g==", + "peer": true, "dependencies": { "emoji-regex": "^8.0.0", "is-fullwidth-code-point": "^3.0.0", @@ 
-771,6 +1086,7 @@ "version": "6.0.1", "resolved": "https://registry.npmjs.org/strip-ansi/-/strip-ansi-6.0.1.tgz", "integrity": "sha512-Y38VPSHcqkFrCpFnQ9vuSXmquuv5oXOKpGeT6aGrr3o3Gc9AlVa6JBfUSOCnbxGGZF+/0ooI7KrPuUSztUdU5A==", + "peer": true, "dependencies": { "ansi-regex": "^5.0.1" }, @@ -789,10 +1105,26 @@ "node": ">=4" } }, + "node_modules/temporal-polyfill": { + "version": "0.2.5", + "resolved": "https://registry.npmjs.org/temporal-polyfill/-/temporal-polyfill-0.2.5.tgz", + "integrity": "sha512-ye47xp8Cb0nDguAhrrDS1JT1SzwEV9e26sSsrWzVu+yPZ7LzceEcH0i2gci9jWfOfSCCgM3Qv5nOYShVUUFUXA==", + "peer": true, + "dependencies": { + "temporal-spec": "^0.2.4" + } + }, + "node_modules/temporal-spec": { + "version": "0.2.4", + "resolved": "https://registry.npmjs.org/temporal-spec/-/temporal-spec-0.2.4.tgz", + "integrity": "sha512-lDMFv4nKQrSjlkHKAlHVqKrBG4DyFfa9F74cmBZ3Iy3ed8yvWnlWSIdi4IKfSqwmazAohBNwiN64qGx4y5Q3IQ==", + "peer": true + }, "node_modules/to-regex-range": { "version": "5.0.1", "resolved": "https://registry.npmjs.org/to-regex-range/-/to-regex-range-5.0.1.tgz", "integrity": "sha512-65P7iz6X5yEr1cwcgvQxbbIw7Uk3gOy5dIdtZ4rDveLqhrdJP+Li/Hx6tyK0NEb+2GCyneCMJiGqrADCSNk8sQ==", + "peer": true, "dependencies": { "is-number": "^7.0.0" }, @@ -800,25 +1132,16 @@ "node": ">=8.0" } }, - "node_modules/tslib": { - "version": "2.6.2", - "resolved": "https://registry.npmjs.org/tslib/-/tslib-2.6.2.tgz", - "integrity": "sha512-AEYxH93jGFPn/a2iVAwW87VuUIkR1FVUKB77NwMF7nBTDkDrrT/Hpt/IrCJ0QXhW27jTBDcf5ZY7w6RiqTMw2Q==" - }, - "node_modules/upper-case": { - "version": "2.0.2", - "resolved": "https://registry.npmjs.org/upper-case/-/upper-case-2.0.2.tgz", - "integrity": "sha512-KgdgDGJt2TpuwBUIjgG6lzw2GWFRCW9Qkfkiv0DxqHHLYJHmtmdUIKcZd8rHgFSjopVTlw6ggzCm1b8MFQwikg==", - "dependencies": { - "tslib": "^2.0.3" - } - }, - "node_modules/upper-case-first": { - "version": "2.0.2", - "resolved": "https://registry.npmjs.org/upper-case-first/-/upper-case-first-2.0.2.tgz", - "integrity": 
"sha512-514ppYHBaKwfJRK/pNC6c/OxfGa0obSnAl106u97Ed0I625Nin96KAjttZF6ZL3e1XLtphxnqrOi9iWgm+u+bg==", - "dependencies": { - "tslib": "^2.0.3" + "node_modules/unicorn-magic": { + "version": "0.1.0", + "resolved": "https://registry.npmjs.org/unicorn-magic/-/unicorn-magic-0.1.0.tgz", + "integrity": "sha512-lRfVq8fE8gz6QMBuDM6a+LO3IAzTi05H6gCVaUpir2E1Rwpo4ZUog45KpNXKC/Mn3Yb9UDuHumeFTo9iV/D9FQ==", + "peer": true, + "engines": { + "node": ">=18" + }, + "funding": { + "url": "https://github.com/sponsors/sindresorhus" } }, "node_modules/uri-js": { @@ -833,44 +1156,50 @@ "version": "8.2.0", "resolved": "https://registry.npmjs.org/vscode-jsonrpc/-/vscode-jsonrpc-8.2.0.tgz", "integrity": "sha512-C+r0eKJUIfiDIfwJhria30+TYWPtuHJXHtI7J0YlOmKAo7ogxP20T0zxB7HZQIFhIyvoBPwWskjxrvAtfjyZfA==", + "peer": true, "engines": { "node": ">=14.0.0" } }, "node_modules/vscode-languageserver": { - "version": "9.0.0", - "resolved": "https://registry.npmjs.org/vscode-languageserver/-/vscode-languageserver-9.0.0.tgz", - "integrity": "sha512-npT72Iu28Tjsm94MsCbwJmIu5ycCF3UEPj3Eb3725T1Hqf4d+Vj2W4GC+F8l4n9yNItJuvE/AHYvomvAs9Dj8A==", + "version": "9.0.1", + "resolved": "https://registry.npmjs.org/vscode-languageserver/-/vscode-languageserver-9.0.1.tgz", + "integrity": "sha512-woByF3PDpkHFUreUa7Hos7+pUWdeWMXRd26+ZX2A8cFx6v/JPTtd4/uN0/jB6XQHYaOlHbio03NTHCqrgG5n7g==", + "peer": true, "dependencies": { - "vscode-languageserver-protocol": "3.17.4" + "vscode-languageserver-protocol": "3.17.5" }, "bin": { "installServerIntoExtension": "bin/installServerIntoExtension" } }, "node_modules/vscode-languageserver-protocol": { - "version": "3.17.4", - "resolved": "https://registry.npmjs.org/vscode-languageserver-protocol/-/vscode-languageserver-protocol-3.17.4.tgz", - "integrity": "sha512-IpaHLPft+UBWf4dOIH15YEgydTbXGz52EMU2h16SfFpYu/yOQt3pY14049mtpJu+4CBHn+hq7S67e7O0AwpRqQ==", + "version": "3.17.5", + "resolved": "https://registry.npmjs.org/vscode-languageserver-protocol/-/vscode-languageserver-protocol-3.17.5.tgz", + 
"integrity": "sha512-mb1bvRJN8SVznADSGWM9u/b07H7Ecg0I3OgXDuLdn307rl/J3A9YD6/eYOssqhecL27hK1IPZAsaqh00i/Jljg==", + "peer": true, "dependencies": { "vscode-jsonrpc": "8.2.0", - "vscode-languageserver-types": "3.17.4" + "vscode-languageserver-types": "3.17.5" } }, "node_modules/vscode-languageserver-textdocument": { - "version": "1.0.8", - "resolved": "https://registry.npmjs.org/vscode-languageserver-textdocument/-/vscode-languageserver-textdocument-1.0.8.tgz", - "integrity": "sha512-1bonkGqQs5/fxGT5UchTgjGVnfysL0O8v1AYMBjqTbWQTFn721zaPGDYFkOKtfDgFiSgXM3KwaG3FMGfW4Ed9Q==" + "version": "1.0.11", + "resolved": "https://registry.npmjs.org/vscode-languageserver-textdocument/-/vscode-languageserver-textdocument-1.0.11.tgz", + "integrity": "sha512-X+8T3GoiwTVlJbicx/sIAF+yuJAqz8VvwJyoMVhwEMoEKE/fkDmrqUgDMyBECcM2A2frVZIUj5HI/ErRXCfOeA==", + "peer": true }, "node_modules/vscode-languageserver-types": { - "version": "3.17.4", - "resolved": "https://registry.npmjs.org/vscode-languageserver-types/-/vscode-languageserver-types-3.17.4.tgz", - "integrity": "sha512-9YXi5pA3XF2V+NUQg6g+lulNS0ncRCKASYdK3Cs7kiH9sVFXWq27prjkC/B8M/xJLRPPRSPCHVMuBTgRNFh2sQ==" + "version": "3.17.5", + "resolved": "https://registry.npmjs.org/vscode-languageserver-types/-/vscode-languageserver-types-3.17.5.tgz", + "integrity": "sha512-Ld1VelNuX9pdF39h2Hgaeb5hEZM2Z3jUrrMgWQAu82jMtZp7p3vJT3BzToKtZI7NgQssZje5o0zryOrhQvzQAg==", + "peer": true }, "node_modules/wrap-ansi": { "version": "7.0.0", "resolved": "https://registry.npmjs.org/wrap-ansi/-/wrap-ansi-7.0.0.tgz", "integrity": "sha512-YVGIj2kamLSTxw6NsZjoBxfSwsn0ycdesmc4p+Q21c5zPuZ1pl+NfxVdxPtdHvmNVOQ6XSYG4AUtyt/Fi7D16Q==", + "peer": true, "dependencies": { "ansi-styles": "^4.0.0", "string-width": "^4.1.0", @@ -887,6 +1216,7 @@ "version": "4.3.0", "resolved": "https://registry.npmjs.org/ansi-styles/-/ansi-styles-4.3.0.tgz", "integrity": "sha512-zbB9rCJAT1rbjiVDb2hqKFHNYLxgtk8NURxZ3IZwD3F6NtxbXZQCnnSi1Lkx+IDohdPlFp222wVALIheZJQSEg==", + "peer": true, 
"dependencies": { "color-convert": "^2.0.1" }, @@ -901,6 +1231,7 @@ "version": "2.0.1", "resolved": "https://registry.npmjs.org/color-convert/-/color-convert-2.0.1.tgz", "integrity": "sha512-RRECPsj7iu/xb5oKYcsFHSppFNnsj/52OVTRKb4zP5onXwVF3zVmmToNcOfGC+CRDpfK/U584fMg38ZHCaElKQ==", + "peer": true, "dependencies": { "color-name": "~1.1.4" }, @@ -911,25 +1242,25 @@ "node_modules/wrap-ansi/node_modules/color-name": { "version": "1.1.4", "resolved": "https://registry.npmjs.org/color-name/-/color-name-1.1.4.tgz", - "integrity": "sha512-dOy+3AuW3a2wNbZHIuMZpTcgjGuLU/uBL/ubcZF9OXbDo8ff4O8yVp5Bf0efS8uEoYo5q4Fx7dY9OgQGXgAsQA==" + "integrity": "sha512-dOy+3AuW3a2wNbZHIuMZpTcgjGuLU/uBL/ubcZF9OXbDo8ff4O8yVp5Bf0efS8uEoYo5q4Fx7dY9OgQGXgAsQA==", + "peer": true }, "node_modules/y18n": { "version": "5.0.8", "resolved": "https://registry.npmjs.org/y18n/-/y18n-5.0.8.tgz", "integrity": "sha512-0pfFzegeDWJHJIAmTLRP2DwHjdF5s7jo9tuztdQxAhINCdvS+3nGINqPd00AphqJR/0LhANUS6/+7SCb98YOfA==", + "peer": true, "engines": { "node": ">=10" } }, - "node_modules/yallist": { - "version": "4.0.0", - "resolved": "https://registry.npmjs.org/yallist/-/yallist-4.0.0.tgz", - "integrity": "sha512-3wdGidZyq5PB084XLES5TpOSRA3wjXAlIWMhum2kRcv/41Sn2emQ0dycQW4uZXLejwKvg6EsvbdlVL+FYEct7A==" - }, "node_modules/yaml": { - "version": "2.3.2", - "resolved": "https://registry.npmjs.org/yaml/-/yaml-2.3.2.tgz", - "integrity": "sha512-N/lyzTPaJasoDmfV7YTrYCI0G/3ivm/9wdG0aHuheKowWQwGTsK0Eoiw6utmzAnI6pkJa0DUVygvp3spqqEKXg==", + "version": "2.4.5", + "resolved": "https://registry.npmjs.org/yaml/-/yaml-2.4.5.tgz", + "integrity": "sha512-aBx2bnqDzVOyNKfsysjA2ms5ZlnjSAW2eG3/L5G/CSujfjLJTJsEw1bGw8kCf04KodQWk1pxlGnZ56CRxiawmg==", + "bin": { + "yaml": "bin.mjs" + }, "engines": { "node": ">= 14" } @@ -938,6 +1269,7 @@ "version": "17.7.2", "resolved": "https://registry.npmjs.org/yargs/-/yargs-17.7.2.tgz", "integrity": "sha512-7dSzzRQ++CKnNI/krKnYRV7JKKPUXMEh61soaHKg9mrWEhzFWhFnxPxGl+69cD1Ou63C13NUPCnmIcrvqCuM6w==", + "peer": true, 
"dependencies": { "cliui": "^8.0.1", "escalade": "^3.1.1", @@ -955,6 +1287,7 @@ "version": "21.1.1", "resolved": "https://registry.npmjs.org/yargs-parser/-/yargs-parser-21.1.1.tgz", "integrity": "sha512-tVpsJW7DdjecAiFpbIB1e3qxIQsE6NoPc5/eTdrbbIC4h0LVsWhnoa3g+m2HclBIujHzsxZ4VJVA+GUuc2/LBw==", + "peer": true, "engines": { "node": ">=12" } diff --git a/package.json b/package.json index eba48058d..7ff610bbf 100644 --- a/package.json +++ b/package.json @@ -3,12 +3,9 @@ "version": "0.1.0", "type": "module", "dependencies": { - "@typespec/compiler": "^0.49.0-dev.11", - "@typespec/openapi": "^0.49.0-dev.4", - "@typespec/openapi3": "^0.49.0-dev.10", - "@typespec/rest": "^0.49.0-dev.3", - "@typespec/http": "^0.49.0-dev.0", - "@typespec/versioning": "^0.49.0-dev.0" + "@autorest/csharp": "3.0.0-beta.20240722.4", + "@azure-tools/typespec-csharp": "0.2.0-beta.20240722.4", + "@typespec/openapi3": "^0.58.0" }, "private": true } diff --git a/readme.md b/readme.md deleted file mode 100644 index c4dd93188..000000000 --- a/readme.md +++ /dev/null @@ -1,15 +0,0 @@ -A conversion of the OpenAI OpenAPI to TypeSpec. - -There are some deltas: - -### Changes to API Semantics: - -- Many things are missing defaults (mostly due to bug where we can't set null defaults) -- Error responses have been added. -- Where known, the `object` property's type is narrowed from string to the constant value it will always be - -### Changes to API metadata or OpenAPI format - -- Much of the x-oaiMeta entries have not been added. -- In some cases, new schemas needed to be defined in order to be defined in TypeSpec (e.g. 
because the constraints could not be added to a model property with a heterogeneous type) -- There is presently no way to set `title` diff --git a/tsp-output/@typespec/openapi3/openapi.yaml b/tsp-output/@typespec/openapi3/openapi.yaml deleted file mode 100644 index d37490680..000000000 --- a/tsp-output/@typespec/openapi3/openapi.yaml +++ /dev/null @@ -1,2963 +0,0 @@ -openapi: 3.0.0 -info: - title: OpenAI API - version: 2.0.0 - description: The OpenAI REST API. Please see https://platform.openai.com/docs/api-reference for more details. -tags: - - name: OpenAI -paths: - /audio/transcriptions: - post: - tags: - - OpenAI - operationId: createTranscription - summary: Transcribes audio into the input language. - parameters: [] - responses: - '200': - description: The request has succeeded. - content: - application/json: - schema: - $ref: '#/components/schemas/CreateTranscriptionResponse' - default: - description: An unexpected error response. - content: - application/json: - schema: - $ref: '#/components/schemas/ErrorResponse' - requestBody: - required: true - content: - multipart/form-data: - schema: - $ref: '#/components/schemas/CreateTranscriptionRequest' - /audio/translations: - post: - tags: - - OpenAI - operationId: createTranslation - summary: Transcribes audio into the input language. - parameters: [] - responses: - '200': - description: The request has succeeded. - content: - application/json: - schema: - $ref: '#/components/schemas/CreateTranslationResponse' - default: - description: An unexpected error response. - content: - application/json: - schema: - $ref: '#/components/schemas/ErrorResponse' - requestBody: - required: true - content: - multipart/form-data: - schema: - $ref: '#/components/schemas/CreateTranslationRequest' - /chat/completions: - post: - tags: - - OpenAI - operationId: createChatCompletion - parameters: [] - responses: - '200': - description: The request has succeeded. 
- content: - application/json: - schema: - $ref: '#/components/schemas/CreateChatCompletionResponse' - default: - description: An unexpected error response. - content: - application/json: - schema: - $ref: '#/components/schemas/ErrorResponse' - requestBody: - required: true - content: - application/json: - schema: - $ref: '#/components/schemas/CreateChatCompletionRequest' - /completions: - post: - tags: - - OpenAI - operationId: createCompletion - parameters: [] - responses: - '200': - description: The request has succeeded. - content: - application/json: - schema: - $ref: '#/components/schemas/CreateCompletionResponse' - default: - description: An unexpected error response. - content: - application/json: - schema: - $ref: '#/components/schemas/ErrorResponse' - requestBody: - required: true - content: - application/json: - schema: - $ref: '#/components/schemas/CreateCompletionRequest' - x-oaiMeta: - name: Create chat completion - group: chat - returns: |- - Returns a [chat completion](/docs/api-reference/chat/object) object, or a streamed sequence of - [chat completion chunk](/docs/api-reference/chat/streaming) objects if the request is streamed. - path: create - examples: - - title: No streaming - request: - curl: |- - curl https://api.openai.com/v1/chat/completions \ - -H "Content-Type: application/json" \ - -H "Authorization: Bearer $OPENAI_API_KEY" \ - -d '{ - "model": "VAR_model_id", - "messages": [ - { - "role": "system", - "content": "You are a helpful assistant." - }, - { - "role": "user", - "content": "Hello!" 
- } - ] - python: |- - import os - import openai - openai.api_key = os.getenv("OPENAI_API_KEY") - - completion = openai.ChatCompletion.create( - model="VAR_model_id", - messages=[ - {"role": "system", "content": "You are a helpful assistant."}, - {"role": "user", "content": "Hello!"} - ] - ) - - print(completion.choices[0].message) - node.js: |- - import OpenAI from "openai"; - - const openai = new OpenAI(); - - async function main() { - const completion = await openai.chat.completions.create({ - messages: [{ role: "system", content: "string" }], - model: "VAR_model_id", - }); - - console.log(completion.choices[0]); - } - - main(); - response: |- - { - "id": "chatcmpl-123", - "object": "chat.completion", - "created": 1677652288, - "model": "gpt-3.5-turbo-0613", - "choices": [{ - "index": 0, - "message": { - "role": "assistant", - "content": " - - Hello there, how may I assist you today?", - }, - "finish_reason": "stop" - }], - "usage": { - "prompt_tokens": 9, - "completion_tokens": 12, - "total_tokens": 21 - } - } - - title: Streaming - request: - curl: |- - curl https://api.openai.com/v1/chat/completions \ - -H "Content-Type: application/json" \ - -H "Authorization: Bearer $OPENAI_API_KEY" \ - -d '{ - "model": "VAR_model_id", - "messages": [ - { - "role": "system", - "content": "You are a helpful assistant." - }, - { - "role": "user", - "content": "Hello!" 
- } - ], - "stream": true - }' - python: |- - import os - import openai - openai.api_key = os.getenv("OPENAI_API_KEY") - - completion = openai.ChatCompletion.create( - model="VAR_model_id", - messages=[ - {"role": "system", "content": "You are a helpful assistant."}, - {"role": "user", "content": "Hello!"} - ], - stream=True - ) - - for chunk in completion: - print(chunk.choices[0].delta) - node.js: |- - import OpenAI from "openai"; - - const openai = new OpenAI(); - - async function main() { - const completion = await openai.chat.completions.create({ - model: "VAR_model_id", - messages: [ - {"role": "system", "content": "You are a helpful assistant."}, - {"role": "user", "content": "Hello!"} - ], - stream: true, - }); - - for await (const chunk of completion) { - console.log(chunk.choices[0].delta.content); - } - } - - main(); - response: |- - { - "id": "chatcmpl-123", - "object": "chat.completion.chunk", - "created": 1677652288, - "model": "gpt-3.5-turbo", - "choices": [{ - "index": 0, - "delta": { - "content": "Hello", - }, - "finish_reason": "stop" - }] - } - /edits: - post: - tags: - - OpenAI - operationId: createEdit - parameters: [] - responses: - '200': - description: The request has succeeded. - content: - application/json: - schema: - $ref: '#/components/schemas/CreateEditResponse' - default: - description: An unexpected error response. - content: - application/json: - schema: - $ref: '#/components/schemas/ErrorResponse' - requestBody: - required: true - content: - application/json: - schema: - $ref: '#/components/schemas/CreateEditRequest' - deprecated: true - /embeddings: - post: - tags: - - OpenAI - operationId: createEmbedding - summary: Creates an embedding vector representing the input text. - parameters: [] - responses: - '200': - description: The request has succeeded. - content: - application/json: - schema: - $ref: '#/components/schemas/CreateEmbeddingResponse' - default: - description: An unexpected error response. 
- content: - application/json: - schema: - $ref: '#/components/schemas/ErrorResponse' - requestBody: - required: true - content: - application/json: - schema: - $ref: '#/components/schemas/CreateEmbeddingRequest' - /files: - get: - tags: - - OpenAI - operationId: listFiles - summary: Returns a list of files that belong to the user's organization. - parameters: [] - responses: - '200': - description: The request has succeeded. - content: - application/json: - schema: - $ref: '#/components/schemas/ListFilesResponse' - default: - description: An unexpected error response. - content: - application/json: - schema: - $ref: '#/components/schemas/ErrorResponse' - post: - tags: - - OpenAI - operationId: createFile - summary: Returns a list of files that belong to the user's organization. - parameters: [] - responses: - '200': - description: The request has succeeded. - content: - application/json: - schema: - $ref: '#/components/schemas/OpenAIFile' - default: - description: An unexpected error response. - content: - application/json: - schema: - $ref: '#/components/schemas/ErrorResponse' - requestBody: - required: true - content: - multipart/form-data: - schema: - $ref: '#/components/schemas/CreateFileRequest' - /files/files/{file_id}: - post: - tags: - - OpenAI - operationId: retrieveFile - summary: Returns information about a specific file. - parameters: - - name: file_id - in: path - required: true - description: The ID of the file to use for this request. - schema: - type: string - responses: - '200': - description: The request has succeeded. - content: - application/json: - schema: - $ref: '#/components/schemas/OpenAIFile' - default: - description: An unexpected error response. - content: - application/json: - schema: - $ref: '#/components/schemas/ErrorResponse' - delete: - tags: - - OpenAI - operationId: deleteFile - summary: Delete a file - parameters: - - name: file_id - in: path - required: true - description: The ID of the file to use for this request. 
- schema: - type: string - responses: - '200': - description: The request has succeeded. - content: - application/json: - schema: - $ref: '#/components/schemas/DeleteFileResponse' - default: - description: An unexpected error response. - content: - application/json: - schema: - $ref: '#/components/schemas/ErrorResponse' - /files/files/{file_id}/content: - get: - tags: - - OpenAI - operationId: downloadFile - summary: Returns the contents of the specified file. - parameters: - - name: file_id - in: path - required: true - description: The ID of the file to use for this request. - schema: - type: string - responses: - '200': - description: The request has succeeded. - content: - application/json: - schema: - type: string - default: - description: An unexpected error response. - content: - application/json: - schema: - $ref: '#/components/schemas/ErrorResponse' - /fine-tunes: - post: - tags: - - OpenAI - operationId: createFineTune - summary: |- - Creates a job that fine-tunes a specified model from a given dataset. - - Response includes details of the enqueued job including job status and the name of the fine-tuned models once complete. - - [Learn more about fine-tuning](/docs/guides/legacy-fine-tuning) - parameters: [] - responses: - '200': - description: The request has succeeded. - content: - application/json: - schema: - $ref: '#/components/schemas/FineTune' - default: - description: An unexpected error response. - content: - application/json: - schema: - $ref: '#/components/schemas/ErrorResponse' - requestBody: - required: true - content: - application/json: - schema: - $ref: '#/components/schemas/CreateFineTuneRequest' - deprecated: true - get: - tags: - - OpenAI - operationId: listFineTunes - summary: List your organization's fine-tuning jobs - parameters: [] - responses: - '200': - description: The request has succeeded. 
- content: - application/json: - schema: - $ref: '#/components/schemas/ListFineTunesResponse' - default: - description: An unexpected error response. - content: - application/json: - schema: - $ref: '#/components/schemas/ErrorResponse' - deprecated: true - /fine-tunes/{fine_tune_id}: - get: - tags: - - OpenAI - operationId: retrieveFineTune - summary: |- - Gets info about the fine-tune job. - - [Learn more about fine-tuning](/docs/guides/legacy-fine-tuning) - parameters: - - name: fine_tune_id - in: path - required: true - description: The ID of the fine-tune job - schema: - type: string - responses: - '200': - description: The request has succeeded. - content: - application/json: - schema: - $ref: '#/components/schemas/FineTune' - default: - description: An unexpected error response. - content: - application/json: - schema: - $ref: '#/components/schemas/ErrorResponse' - deprecated: true - /fine-tunes/{fine_tune_id}/cancel: - post: - tags: - - OpenAI - operationId: cancelFineTune - summary: Immediately cancel a fine-tune job. - parameters: - - name: fine_tune_id - in: path - required: true - description: The ID of the fine-tune job to cancel - schema: - type: string - responses: - '200': - description: The request has succeeded. - content: - application/json: - schema: - $ref: '#/components/schemas/FineTune' - default: - description: An unexpected error response. - content: - application/json: - schema: - $ref: '#/components/schemas/ErrorResponse' - deprecated: true - /fine-tunes/{fine_tune_id}/events: - get: - tags: - - OpenAI - operationId: listFineTuneEvents - summary: Get fine-grained status updates for a fine-tune job. - parameters: - - name: fine_tune_id - in: path - required: true - description: The ID of the fine-tune job to get events for. - schema: - type: string - - name: stream - in: query - required: false - description: |- - Whether to stream events for the fine-tune job. 
If set to true, events will be sent as - data-only - [server-sent events](https://developer.mozilla.org/en-US/docs/Web/API/Server-sent_events/Using_server-sent_events#Event_stream_format) - as they become available. The stream will terminate with a `data: [DONE]` message when the - job is finished (succeeded, cancelled, or failed). - - If set to false, only events generated so far will be returned. - schema: - type: boolean - default: false - responses: - '200': - description: The request has succeeded. - content: - application/json: - schema: - $ref: '#/components/schemas/ListFineTuneEventsResponse' - default: - description: An unexpected error response. - content: - application/json: - schema: - $ref: '#/components/schemas/ErrorResponse' - deprecated: true - /fine_tuning/jobs: - post: - tags: - - OpenAI - operationId: createFineTuningJob - description: |- - Creates a job that fine-tunes a specified model from a given dataset. - - Response includes details of the enqueued job including job status and the name of the - fine-tuned models once complete. - - [Learn more about fine-tuning](/docs/guides/fine-tuning) - parameters: [] - responses: - '200': - description: The request has succeeded. - content: - application/json: - schema: - $ref: '#/components/schemas/FineTuningJob' - default: - description: An unexpected error response. - content: - application/json: - schema: - $ref: '#/components/schemas/ErrorResponse' - requestBody: - required: true - content: - application/json: - schema: - $ref: '#/components/schemas/CreateFineTuningJobRequest' - get: - tags: - - OpenAI - operationId: listPaginatedFineTuningJobs - parameters: - - name: after - in: query - required: false - description: Identifier for the last job from the previous pagination request. - schema: - type: string - - name: limit - in: query - required: false - description: Number of fine-tuning jobs to retrieve. 
- schema: - type: integer - format: int64 - default: 20 - responses: - '200': - description: The request has succeeded. - content: - application/json: - schema: - $ref: '#/components/schemas/ListPaginatedFineTuningJobsResponse' - default: - description: An unexpected error response. - content: - application/json: - schema: - $ref: '#/components/schemas/ErrorResponse' - /fine_tuning/jobs/{fine_tuning_job_id}: - get: - tags: - - OpenAI - operationId: retrieveFineTuningJob - summary: |- - Get info about a fine-tuning job. - - [Learn more about fine-tuning](/docs/guides/fine-tuning) - parameters: - - name: fine_tuning_job_id - in: path - required: true - schema: - type: string - responses: - '200': - description: The request has succeeded. - content: - application/json: - schema: - $ref: '#/components/schemas/FineTuningJob' - default: - description: An unexpected error response. - content: - application/json: - schema: - $ref: '#/components/schemas/ErrorResponse' - /fine_tuning/jobs/{fine_tuning_job_id}/cancel: - post: - tags: - - OpenAI - operationId: cancelFineTuningJob - summary: Immediately cancel a fine-tune job. - parameters: - - name: fine_tuning_job_id - in: path - required: true - description: The ID of the fine-tuning job to cancel. - schema: - type: string - responses: - '200': - description: The request has succeeded. - content: - application/json: - schema: - $ref: '#/components/schemas/FineTuningJob' - default: - description: An unexpected error response. - content: - application/json: - schema: - $ref: '#/components/schemas/ErrorResponse' - /fine_tuning/jobs/{fine_tuning_job_id}/events: - get: - tags: - - OpenAI - operationId: listFineTuningEvents - summary: Get status updates for a fine-tuning job. - parameters: - - name: fine_tuning_job_id - in: path - required: true - description: The ID of the fine-tuning job to get events for. 
- schema: - type: string - - name: after - in: query - required: false - description: Identifier for the last event from the previous pagination request. - schema: - type: string - - name: limit - in: query - required: false - description: Number of events to retrieve. - schema: - type: integer - default: 20 - responses: - '200': - description: The request has succeeded. - content: - application/json: - schema: - $ref: '#/components/schemas/ListFineTuningJobEventsResponse' - default: - description: An unexpected error response. - content: - application/json: - schema: - $ref: '#/components/schemas/ErrorResponse' - /images/edits: - post: - tags: - - OpenAI - operationId: createImageEdit - summary: Creates an edited or extended image given an original image and a prompt. - parameters: [] - responses: - '200': - description: The request has succeeded. - content: - application/json: - schema: - $ref: '#/components/schemas/ImagesResponse' - default: - description: An unexpected error response. - content: - application/json: - schema: - $ref: '#/components/schemas/ErrorResponse' - requestBody: - required: true - content: - multipart/form-data: - schema: - $ref: '#/components/schemas/CreateImageEditRequest' - /images/generations: - post: - tags: - - OpenAI - operationId: createImage - summary: Creates an image given a prompt - parameters: [] - responses: - '200': - description: The request has succeeded. - content: - application/json: - schema: - $ref: '#/components/schemas/ImagesResponse' - default: - description: An unexpected error response. - content: - application/json: - schema: - $ref: '#/components/schemas/ErrorResponse' - requestBody: - required: true - content: - application/json: - schema: - $ref: '#/components/schemas/CreateImageRequest' - /images/variations: - post: - tags: - - OpenAI - operationId: createImageVariation - summary: Creates an edited or extended image given an original image and a prompt. 
- parameters: [] - responses: - '200': - description: The request has succeeded. - content: - application/json: - schema: - $ref: '#/components/schemas/ImagesResponse' - default: - description: An unexpected error response. - content: - application/json: - schema: - $ref: '#/components/schemas/ErrorResponse' - requestBody: - required: true - content: - multipart/form-data: - schema: - $ref: '#/components/schemas/CreateImageVariationRequest' - /models: - get: - tags: - - OpenAI - operationId: listModels - summary: |- - Lists the currently available models, and provides basic information about each one such as the - owner and availability. - parameters: [] - responses: - '200': - description: The request has succeeded. - content: - application/json: - schema: - $ref: '#/components/schemas/ListModelsResponse' - default: - description: An unexpected error response. - content: - application/json: - schema: - $ref: '#/components/schemas/ErrorResponse' - /models/{model}: - get: - tags: - - OpenAI - operationId: retrieveModel - summary: |- - Retrieves a model instance, providing basic information about the model such as the owner and - permissioning. - parameters: - - name: model - in: path - required: true - description: The ID of the model to use for this request. - schema: - type: string - responses: - '200': - description: The request has succeeded. - content: - application/json: - schema: - $ref: '#/components/schemas/Model' - default: - description: An unexpected error response. - content: - application/json: - schema: - $ref: '#/components/schemas/ErrorResponse' - delete: - tags: - - OpenAI - operationId: deleteModel - summary: Delete a fine-tuned model. You must have the Owner role in your organization to delete a model. - parameters: - - name: model - in: path - required: true - description: The model to delete - schema: - type: string - responses: - '200': - description: The request has succeeded. 
- content: - application/json: - schema: - $ref: '#/components/schemas/DeleteModelResponse' - default: - description: An unexpected error response. - content: - application/json: - schema: - $ref: '#/components/schemas/ErrorResponse' - /moderations: - post: - tags: - - OpenAI - operationId: createModeration - summary: Classifies if text violates OpenAI's Content Policy - parameters: [] - responses: - '200': - description: The request has succeeded. - content: - application/json: - schema: - $ref: '#/components/schemas/CreateModerationResponse' - default: - description: An unexpected error response. - content: - application/json: - schema: - $ref: '#/components/schemas/ErrorResponse' - requestBody: - required: true - content: - application/json: - schema: - $ref: '#/components/schemas/CreateModerationRequest' -security: - - BearerAuth: [] -components: - schemas: - ChatCompletionFunctionCallOption: - type: object - required: - - name - properties: - name: - type: string - description: The name of the function to call. - ChatCompletionFunctionParameters: - type: object - additionalProperties: {} - ChatCompletionFunctions: - type: object - required: - - name - - parameters - properties: - name: - type: string - description: |- - The name of the function to be called. Must be a-z, A-Z, 0-9, or contain underscores and - dashes, with a maximum length of 64. - description: - type: string - description: |- - A description of what the function does, used by the model to choose when and how to call the - function. - parameters: - allOf: - - $ref: '#/components/schemas/ChatCompletionFunctionParameters' - description: |- - The parameters the function accepts, described as a JSON Schema object.
See the - [guide](/docs/guides/gpt/function-calling) for examples, and the - [JSON Schema reference](https://json-schema.org/understanding-json-schema/) for documentation - about the format. To describe a function that accepts no parameters, provide the value - `{"type": "object", "properties": {}}`. - ChatCompletionRequestMessage: - type: object - required: - - role - - content - properties: - role: - type: string - enum: - - system - - user - - assistant - - function - description: The role of the message's author. One of `system`, `user`, `assistant`, or `function`. - content: - type: string - nullable: true - description: |- - The contents of the message. `content` is required for all messages, and may be null for - assistant messages with function calls. - name: - type: string - description: |- - The name of the author of this message. `name` is required if role is `function`, and it - should be the name of the function whose response is in the `content`. May contain a-z, - A-Z, 0-9, and underscores, with a maximum length of 64 characters. - function_call: - type: object - description: The name and arguments of a function that should be called, as generated by the model. - required: - - name - - arguments - properties: - name: - type: string - description: The name of the function to call. - arguments: - type: string - description: |- - The arguments to call the function with, as generated by the model in JSON format. Note that - the model does not always generate valid JSON, and may hallucinate parameters not defined by - your function schema. Validate the arguments in your code before calling your function. - ChatCompletionResponseMessage: - type: object - required: - - role - - content - properties: - role: - type: string - enum: - - system - - user - - assistant - - function - description: The role of the author of this message. - content: - type: string - nullable: true - description: The contents of the message.
- function_call: - type: object - description: The name and arguments of a function that should be called, as generated by the model. - required: - - name - - arguments - properties: - name: - type: string - description: The name of the function to call. - arguments: - type: string - description: |- - The arguments to call the function with, as generated by the model in JSON format. Note that - the model does not always generate valid JSON, and may hallucinate parameters not defined by - your function schema. Validate the arguments in your code before calling your function. - CompletionUsage: - type: object - description: Usage statistics for the completion request. - required: - - prompt_tokens - - completion_tokens - - total_tokens - properties: - prompt_tokens: - type: integer - format: int64 - description: Number of tokens in the prompt. - completion_tokens: - type: integer - format: int64 - description: Number of tokens in the generated completion. - total_tokens: - type: integer - format: int64 - description: Total number of tokens used in the request (prompt + completion). - CreateChatCompletionRequest: - type: object - required: - - model - - messages - properties: - model: - anyOf: - - type: string - - type: string - enum: - - gpt-4 - - gpt-4-0314 - - gpt-4-0613 - - gpt-4-32k - - gpt-4-32k-0314 - - gpt-4-32k-0613 - - gpt-3.5-turbo - - gpt-3.5-turbo-16k - - gpt-3.5-turbo-0301 - - gpt-3.5-turbo-0613 - - gpt-3.5-turbo-16k-0613 - description: |- - ID of the model to use. See the [model endpoint compatibility](/docs/models/model-endpoint-compatibility) - table for details on which models work with the Chat API. - x-oaiTypeLabel: string - messages: - type: array - items: - $ref: '#/components/schemas/ChatCompletionRequestMessage' - description: |- - A list of messages comprising the conversation so far. - [Example Python code](https://github.com/openai/openai-cookbook/blob/main/examples/How_to_format_inputs_to_ChatGPT_models.ipynb).
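As a concrete illustration of the `CreateChatCompletionRequest` shape defined above, the Python sketch below builds and sanity-checks a minimal request body. The `validate_chat_request` helper is hypothetical and local to this example (it is not part of any OpenAI SDK); it mirrors only the schema's `required` fields, the `minItems: 1` constraint on `messages`, and the role enum.

```python
# Minimal sketch of a CreateChatCompletionRequest payload per the schema
# above. validate_chat_request is a hypothetical local helper, not an SDK
# function; it checks only required fields, minItems, and the role enum.
VALID_ROLES = {"system", "user", "assistant", "function"}

def validate_chat_request(payload):
    for field in ("model", "messages"):
        if field not in payload:
            raise ValueError("missing required field: " + field)
    if not payload["messages"]:
        raise ValueError("messages requires at least one item (minItems: 1)")
    for message in payload["messages"]:
        if message.get("role") not in VALID_ROLES:
            raise ValueError("invalid role: " + repr(message.get("role")))
        if "content" not in message:
            raise ValueError("content key is required (its value may be null)")

request = {
    "model": "gpt-3.5-turbo",
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ],
}
validate_chat_request(request)  # passes; raises ValueError on malformed input
```

Note that `content` must be present on every message even though its value may be null for assistant messages that carry a `function_call`.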
- minItems: 1 - functions: - type: array - items: - $ref: '#/components/schemas/ChatCompletionFunctions' - description: A list of functions the model may generate JSON inputs for. - minItems: 1 - maxItems: 128 - function_call: - anyOf: - - type: string - enum: - - none - - auto - - $ref: '#/components/schemas/ChatCompletionFunctionCallOption' - description: |- - Controls how the model responds to function calls. `none` means the model does not call a - function, and responds to the end-user. `auto` means the model can pick between generating a message or - calling a function. Specifying a particular function via `{"name": "my_function"}` forces the - model to call that function. `none` is the default when no functions are present. `auto` is the - default if functions are present. - temperature: - oneOf: - - $ref: '#/components/schemas/Temperature' - nullable: true - description: |- - What sampling temperature to use, between 0 and 2. Higher values like 0.8 will make the output - more random, while lower values like 0.2 will make it more focused and deterministic. - - We generally recommend altering this or `top_p` but not both. - default: 1 - top_p: - oneOf: - - $ref: '#/components/schemas/TopP' - nullable: true - description: |- - An alternative to sampling with temperature, called nucleus sampling, where the model considers - the results of the tokens with top_p probability mass. So 0.1 means only the tokens comprising - the top 10% probability mass are considered. - - We generally recommend altering this or `temperature` but not both. - default: 1 - n: - oneOf: - - $ref: '#/components/schemas/N' - nullable: true - description: |- - How many completions to generate for each prompt. - **Note:** Because this parameter generates many completions, it can quickly consume your token - quota. Use carefully and ensure that you have reasonable settings for `max_tokens` and `stop`.
- default: 1 - max_tokens: - oneOf: - - $ref: '#/components/schemas/MaxTokens' - nullable: true - description: |- - The maximum number of [tokens](/tokenizer) to generate in the completion. - - The token count of your prompt plus `max_tokens` cannot exceed the model's context length. - [Example Python code](https://github.com/openai/openai-cookbook/blob/main/examples/How_to_count_tokens_with_tiktoken.ipynb) - for counting tokens. - default: 16 - stop: - allOf: - - $ref: '#/components/schemas/Stop' - description: Up to 4 sequences where the API will stop generating further tokens. - default: null - presence_penalty: - oneOf: - - $ref: '#/components/schemas/Penalty' - nullable: true - description: |- - Number between -2.0 and 2.0. Positive values penalize new tokens based on whether they appear - in the text so far, increasing the model's likelihood to talk about new topics. - - [See more information about frequency and presence penalties.](/docs/guides/gpt/parameter-details) - frequency_penalty: - oneOf: - - $ref: '#/components/schemas/Penalty' - nullable: true - description: |- - Number between -2.0 and 2.0. Positive values penalize new tokens based on their existing - frequency in the text so far, decreasing the model's likelihood to repeat the same line - verbatim. - - [See more information about frequency and presence penalties.](/docs/guides/gpt/parameter-details) - logit_bias: - type: object - description: |- - Modify the likelihood of specified tokens appearing in the completion. - Accepts a json object that maps tokens (specified by their token ID in the tokenizer) to an - associated bias value from -100 to 100. Mathematically, the bias is added to the logits - generated by the model prior to sampling. The exact effect will vary per model, but values - between -1 and 1 should decrease or increase likelihood of selection; values like -100 or 100 - should result in a ban or exclusive selection of the relevant token. 
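The `logit_bias` property described above is a JSON object mapping tokenizer token IDs to bias values in [-100, 100]. The sketch below shows one way to assemble that map in Python; `make_logit_bias` is a hypothetical helper written for this example, and the token IDs used are illustrative only (real IDs depend on the model's tokenizer).

```python
# Sketch of building the logit_bias map described above: token IDs mapped to
# a bias in [-100, 100]. make_logit_bias is a hypothetical helper; the token
# IDs below are illustrative values, not tied to any particular tokenizer.
def make_logit_bias(biases):
    """Serialize {token_id: bias} to the JSON map shape, enforcing the range."""
    out = {}
    for token_id, bias in biases.items():
        if not -100 <= bias <= 100:
            raise ValueError("bias %d outside [-100, 100]" % bias)
        out[str(token_id)] = bias  # JSON object keys must be strings
    return out

bias_map = make_logit_bias({50256: -100, 1234: 5})  # ban one token, nudge another
```

Per the description above, a value of -100 effectively bans a token, +100 effectively forces it, and small values near ±1 only nudge the selection probability.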
- additionalProperties: - type: integer - format: int64 - nullable: true - x-oaiTypeLabel: map - user: - allOf: - - $ref: '#/components/schemas/User' - description: |- - A unique identifier representing your end-user, which can help OpenAI to monitor and detect - abuse. [Learn more](/docs/guides/safety-best-practices/end-user-ids). - stream: - type: boolean - nullable: true - description: |- - If set, partial message deltas will be sent, like in ChatGPT. Tokens will be sent as data-only - [server-sent events](https://developer.mozilla.org/en-US/docs/Web/API/Server-sent_events/Using_server-sent_events#Event_stream_format) - as they become available, with the stream terminated by a `data: [DONE]` message. - [Example Python code](https://github.com/openai/openai-cookbook/blob/main/examples/How_to_stream_completions.ipynb). - default: false - CreateChatCompletionResponse: - type: object - description: Represents a chat completion response returned by the model, based on the provided input. - required: - - id - - object - - created - - model - - choices - properties: - id: - type: string - description: A unique identifier for the chat completion. - object: - type: string - description: The object type, which is always `chat.completion`. - created: - type: integer - format: unixtime - description: The Unix timestamp (in seconds) of when the chat completion was created. - model: - type: string - description: The model used for the chat completion. - choices: - type: array - items: - type: object - required: - - index - - message - - finish_reason - properties: - index: - type: integer - format: int64 - description: The index of the choice in the list of choices. - message: - $ref: '#/components/schemas/ChatCompletionResponseMessage' - finish_reason: - type: string - enum: - - stop - - length - - function_call - - content_filter - description: |- - The reason the model stopped generating tokens.
This will be `stop` if the model hit a - natural stop point or a provided stop sequence, `length` if the maximum number of tokens - specified in the request was reached, `content_filter` if the content was omitted due to - a flag from our content filters, or `function_call` if the model called a function. - description: A list of chat completion choices. Can be more than one if `n` is greater than 1. - usage: - $ref: '#/components/schemas/CompletionUsage' - x-oaiMeta: - name: The chat completion object - group: chat - example: '' - CreateCompletionRequest: - type: object - required: - - model - - prompt - properties: - model: - anyOf: - - type: string - - type: string - enum: - - babbage-002 - - davinci-002 - - text-davinci-003 - - text-davinci-002 - - text-davinci-001 - - code-davinci-002 - - text-curie-001 - - text-babbage-001 - - text-ada-001 - description: |- - ID of the model to use. You can use the [List models](/docs/api-reference/models/list) API to - see all of your available models, or see our [Model overview](/docs/models/overview) for - descriptions of them. - x-oaiTypeLabel: string - prompt: - allOf: - - $ref: '#/components/schemas/Prompt' - description: |- - The prompt(s) to generate completions for, encoded as a string, array of strings, array of - tokens, or array of token arrays. - - Note that <|endoftext|> is the document separator that the model sees during training, so if a - prompt is not specified the model will generate as if from the beginning of a new document. - default: <|endoftext|> - suffix: - type: string - nullable: true - description: The suffix that comes after a completion of inserted text. - default: null - temperature: - oneOf: - - $ref: '#/components/schemas/Temperature' - nullable: true - description: |- - What sampling temperature to use, between 0 and 2. Higher values like 0.8 will make the output - more random, while lower values like 0.2 will make it more focused and deterministic. 
- - We generally recommend altering this or `top_p` but not both. - default: 1 - top_p: - oneOf: - - $ref: '#/components/schemas/TopP' - nullable: true - description: |- - An alternative to sampling with temperature, called nucleus sampling, where the model considers - the results of the tokens with top_p probability mass. So 0.1 means only the tokens comprising - the top 10% probability mass are considered. - - We generally recommend altering this or `temperature` but not both. - default: 1 - n: - oneOf: - - $ref: '#/components/schemas/N' - nullable: true - description: |- - How many completions to generate for each prompt. - **Note:** Because this parameter generates many completions, it can quickly consume your token - quota. Use carefully and ensure that you have reasonable settings for `max_tokens` and `stop`. - default: 1 - max_tokens: - oneOf: - - $ref: '#/components/schemas/MaxTokens' - nullable: true - description: |- - The maximum number of [tokens](/tokenizer) to generate in the completion. - - The token count of your prompt plus `max_tokens` cannot exceed the model's context length. - [Example Python code](https://github.com/openai/openai-cookbook/blob/main/examples/How_to_count_tokens_with_tiktoken.ipynb) - for counting tokens. - default: 16 - stop: - allOf: - - $ref: '#/components/schemas/Stop' - description: Up to 4 sequences where the API will stop generating further tokens. - default: null - presence_penalty: - oneOf: - - $ref: '#/components/schemas/Penalty' - nullable: true - description: |- - Number between -2.0 and 2.0. Positive values penalize new tokens based on whether they appear - in the text so far, increasing the model's likelihood to talk about new topics. - - [See more information about frequency and presence penalties.](/docs/guides/gpt/parameter-details) - frequency_penalty: - oneOf: - - $ref: '#/components/schemas/Penalty' - nullable: true - description: |- - Number between -2.0 and 2.0. 
Positive values penalize new tokens based on their existing - frequency in the text so far, decreasing the model's likelihood to repeat the same line - verbatim. - - [See more information about frequency and presence penalties.](/docs/guides/gpt/parameter-details) - logit_bias: - type: object - description: |- - Modify the likelihood of specified tokens appearing in the completion. - Accepts a json object that maps tokens (specified by their token ID in the tokenizer) to an - associated bias value from -100 to 100. Mathematically, the bias is added to the logits - generated by the model prior to sampling. The exact effect will vary per model, but values - between -1 and 1 should decrease or increase likelihood of selection; values like -100 or 100 - should result in a ban or exclusive selection of the relevant token. - additionalProperties: - type: integer - format: int64 - nullable: true - x-oaiTypeLabel: map - user: - allOf: - - $ref: '#/components/schemas/User' - description: |- - A unique identifier representing your end-user, which can help OpenAI to monitor and detect - abuse. [Learn more](/docs/guides/safety-best-practices/end-user-ids). - stream: - type: boolean - nullable: true - description: |- - If set, partial message deltas will be sent, like in ChatGPT. Tokens will be sent as data-only - [server-sent events](https://developer.mozilla.org/en-US/docs/Web/API/Server-sent_events/Using_server-sent_events#Event_stream_format) - as they become available, with the stream terminated by a `data: [DONE]` message. - [Example Python code](https://github.com/openai/openai-cookbook/blob/main/examples/How_to_stream_completions.ipynb). - default: false - logprobs: - type: integer - format: int64 - nullable: true - description: |- - Include the log probabilities on the `logprobs` most likely tokens, as well as the chosen tokens. - For example, if `logprobs` is 5, the API will return a list of the 5 most likely tokens.
The - API will always return the `logprob` of the sampled token, so there may be up to `logprobs+1` - elements in the response. - - The maximum value for `logprobs` is 5. - default: null - echo: - type: boolean - nullable: true - description: Echo back the prompt in addition to the completion - default: false - best_of: - type: integer - format: int64 - nullable: true - description: |- - Generates `best_of` completions server-side and returns the "best" (the one with the highest - log probability per token). Results cannot be streamed. - - When used with `n`, `best_of` controls the number of candidate completions and `n` specifies - how many to return – `best_of` must be greater than `n`. - - **Note:** Because this parameter generates many completions, it can quickly consume your token - quota. Use carefully and ensure that you have reasonable settings for `max_tokens` and `stop`. - default: 1 - CreateCompletionResponse: - type: object - description: |- - Represents a completion response from the API. Note: both the streamed and non-streamed response - objects share the same shape (unlike the chat endpoint). - required: - - id - - object - - created - - model - - choices - properties: - id: - type: string - description: A unique identifier for the completion. - object: - type: string - description: The object type, which is always `text_completion`. - created: - type: integer - format: unixtime - description: The Unix timestamp (in seconds) of when the completion was created. - model: - type: string - description: The model used for the completion. 
- choices: - type: array - items: - type: object - required: - - index - - text - - logprobs - - finish_reason - properties: - index: - type: integer - format: int64 - text: - type: string - logprobs: - type: object - required: - - tokens - - token_logprobs - - top_logprobs - - text_offset - properties: - tokens: - type: array - items: - type: string - token_logprobs: - type: array - items: - type: number - format: double - top_logprobs: - type: array - items: - type: object - additionalProperties: - type: integer - format: int64 - text_offset: - type: array - items: - type: integer - format: int64 - nullable: true - finish_reason: - type: string - enum: - - stop - - length - - content_filter - description: |- - The reason the model stopped generating tokens. This will be `stop` if the model hit a - natural stop point or a provided stop sequence, `length` if the maximum number of tokens - specified in the request was reached, or `content_filter` if content was omitted due to a - flag from our content filters. - description: The list of completion choices the model generated for the input. - usage: - $ref: '#/components/schemas/CompletionUsage' - x-oaiMeta: - name: The completion object - legacy: true - example: '' - CreateEditRequest: - type: object - required: - - model - - instruction - properties: - model: - anyOf: - - type: string - - type: string - enum: - - text-davinci-edit-001 - - code-davinci-edit-001 - description: |- - ID of the model to use. You can use the `text-davinci-edit-001` or `code-davinci-edit-001` - model with this endpoint. - x-oaiTypeLabel: string - input: - type: string - nullable: true - description: The input text to use as a starting point for the edit. - default: '' - instruction: - type: string - description: The instruction that tells the model how to edit the prompt.
- n: - oneOf: - - $ref: '#/components/schemas/EditN' - nullable: true - description: How many edits to generate for the input and instruction. - default: 1 - temperature: - oneOf: - - $ref: '#/components/schemas/Temperature' - nullable: true - description: |- - What sampling temperature to use, between 0 and 2. Higher values like 0.8 will make the output - more random, while lower values like 0.2 will make it more focused and deterministic. - - We generally recommend altering this or `top_p` but not both. - default: 1 - top_p: - oneOf: - - $ref: '#/components/schemas/TopP' - nullable: true - description: |- - An alternative to sampling with temperature, called nucleus sampling, where the model considers - the results of the tokens with top_p probability mass. So 0.1 means only the tokens comprising - the top 10% probability mass are considered. - - We generally recommend altering this or `temperature` but not both. - default: 1 - CreateEditResponse: - type: object - required: - - object - - created - - choices - - usage - properties: - object: - type: string - enum: - - edit - description: The object type, which is always `edit`. - created: - type: integer - format: unixtime - description: The Unix timestamp (in seconds) of when the edit was created. - choices: - type: array - items: - type: object - required: - - text - - index - - finish_reason - properties: - text: - type: string - description: The edited result. - index: - type: integer - format: int64 - description: The index of the choice in the list of choices. - finish_reason: - type: string - enum: - - stop - - length - description: |- - The reason the model stopped generating tokens. This will be `stop` if the model hit a - natural stop point or a provided stop sequence, or `length` if the maximum number of tokens - specified in the request was reached. - description: A list of edit choices. Can be more than one if `n` is greater than 1.
- usage: - $ref: '#/components/schemas/CompletionUsage' - CreateEmbeddingRequest: - type: object - required: - - model - - input - properties: - model: - anyOf: - - type: string - - type: string - enum: - - text-embedding-ada-002 - description: ID of the model to use. You can use the [List models](/docs/api-reference/models/list) API to see all of your available models, or see our [Model overview](/docs/models/overview) for descriptions of them. - x-oaiTypeLabel: string - input: - anyOf: - - type: string - - type: array - items: - type: string - - $ref: '#/components/schemas/TokenArray' - - $ref: '#/components/schemas/TokenArrayArray' - description: |- - Input text to embed, encoded as a string or array of tokens. To embed multiple inputs in a - single request, pass an array of strings or array of token arrays. Each input must not exceed - the max input tokens for the model (8191 tokens for `text-embedding-ada-002`) and cannot be an empty string. - [Example Python code](https://github.com/openai/openai-cookbook/blob/main/examples/How_to_count_tokens_with_tiktoken.ipynb) - for counting tokens. - user: - $ref: '#/components/schemas/User' - CreateEmbeddingResponse: - type: object - required: - - object - - model - - data - - usage - properties: - object: - type: string - enum: - - embedding - description: The object type, which is always "embedding". - model: - type: string - description: The name of the model used to generate the embedding. - data: - type: array - items: - $ref: '#/components/schemas/Embedding' - description: The list of embeddings generated by the model. - usage: - type: object - description: The usage information for the request. - required: - - prompt_tokens - - total_tokens - properties: - prompt_tokens: - type: integer - format: int64 - description: The number of tokens used by the prompt. - total_tokens: - type: integer - format: int64 - description: The total number of tokens used by the request. 
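The `CreateEmbeddingRequest` schema above accepts `input` as a single string, an array of strings, or token arrays, and forbids empty strings. The Python sketch below assembles such a request body; `make_embedding_request` is a hypothetical helper for this example, and the token-length check (8191 max for `text-embedding-ada-002`) is omitted because it would require a tokenizer such as tiktoken.

```python
# Sketch of assembling a CreateEmbeddingRequest body per the schema above.
# make_embedding_request is a hypothetical helper: it accepts a string or a
# list of strings and rejects empty strings, as the schema requires. The
# 8191-token length check is out of scope here (it needs a tokenizer).
def make_embedding_request(texts, model="text-embedding-ada-002"):
    items = [texts] if isinstance(texts, str) else list(texts)
    for item in items:
        if isinstance(item, str) and item == "":
            raise ValueError("input cannot contain an empty string")
    return {"model": model, "input": texts}

req = make_embedding_request(["first document", "second document"])
```

Passing a list of strings embeds multiple inputs in a single request, which is generally cheaper than one request per input.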
- CreateFileRequest: - type: object - required: - - file - - purpose - properties: - file: - type: string - format: binary - description: |- - Name of the [JSON Lines](https://jsonlines.readthedocs.io/en/latest/) file to be uploaded. - - If the `purpose` is set to "fine-tune", the file will be used for fine-tuning. - purpose: - type: string - description: |- - The intended purpose of the uploaded documents. Use "fine-tune" for - [fine-tuning](/docs/api-reference/fine-tuning). This allows us to validate the format of the - uploaded file. - CreateFineTuneRequest: - type: object - required: - - training_file - properties: - training_file: - type: string - description: |- - The ID of an uploaded file that contains training data. - - See [upload file](/docs/api-reference/files/upload) for how to upload a file. - - Your dataset must be formatted as a JSONL file, where each training example is a JSON object - with the keys "prompt" and "completion". Additionally, you must upload your file with the - purpose `fine-tune`. - - See the [fine-tuning guide](/docs/guides/legacy-fine-tuning/creating-training-data) for more - details. - validation_file: - type: string - nullable: true - description: |- - The ID of an uploaded file that contains validation data. - - If you provide this file, the data is used to generate validation metrics periodically during - fine-tuning. These metrics can be viewed in the - [fine-tuning results file](/docs/guides/legacy-fine-tuning/analyzing-your-fine-tuned-model). - Your train and validation data should be mutually exclusive. - - Your dataset must be formatted as a JSONL file, where each validation example is a JSON object - with the keys "prompt" and "completion". Additionally, you must upload your file with the - purpose `fine-tune`. - - See the [fine-tuning guide](/docs/guides/legacy-fine-tuning/creating-training-data) for more - details. 
- model: - anyOf: - - type: string - - type: string - enum: - - ada - - babbage - - curie - - davinci - nullable: true - description: |- - The name of the base model to fine-tune. You can select one of "ada", "babbage", "curie", - "davinci", or a fine-tuned model created after 2022-04-21 and before 2023-08-22. To learn more - about these models, see the [Models](/docs/models) documentation. - x-oaiTypeLabel: string - n_epochs: - type: integer - format: int64 - nullable: true - description: |- - The number of epochs to train the model for. An epoch refers to one full cycle through the - training dataset. - default: 4 - batch_size: - type: integer - format: int64 - nullable: true - description: |- - The batch size to use for training. The batch size is the number of training examples used to - train a single forward and backward pass. - - By default, the batch size will be dynamically configured to be ~0.2% of the number of examples - in the training set, capped at 256 - in general, we've found that larger batch sizes tend to - work better for larger datasets. - default: null - learning_rate_multiplier: - type: number - format: double - nullable: true - description: |- - The learning rate multiplier to use for training. The fine-tuning learning rate is the original - learning rate used for pretraining multiplied by this value. - - By default, the learning rate multiplier is 0.05, 0.1, or 0.2, depending on the final - `batch_size` (larger learning rates tend to perform better with larger batch sizes). We - recommend experimenting with values in the range 0.02 to 0.2 to see what produces the best - results. - default: null - prompt_loss_rate: - type: number - format: double - nullable: true - description: |- - The weight to use for loss on the prompt tokens. This controls how much the model tries to - learn to generate the prompt (as compared to the completion which always has a weight of 1.0), - and can add a stabilizing effect to training when completions are short.
- - If prompts are extremely long (relative to completions), it may make sense to reduce this - weight so as to avoid over-prioritizing learning the prompt. - default: 0.01 - compute_classification_metrics: - type: boolean - nullable: true - description: |- - If set, we calculate classification-specific metrics such as accuracy and F-1 score using the - validation set at the end of every epoch. These metrics can be viewed in the - [results file](/docs/guides/legacy-fine-tuning/analyzing-your-fine-tuned-model). - - In order to compute classification metrics, you must provide a `validation_file`. Additionally, - you must specify `classification_n_classes` for multiclass classification or - `classification_positive_class` for binary classification. - default: false - classification_n_classes: - type: integer - format: int64 - nullable: true - description: |- - The number of classes in a classification task. - - This parameter is required for multiclass classification. - default: null - classification_positive_class: - type: string - nullable: true - description: |- - The positive class in binary classification. - - This parameter is needed to generate precision, recall, and F1 metrics when doing binary - classification. - default: null - classification_betas: - type: array - items: - type: number - format: double - nullable: true - description: |- - If this is provided, we calculate F-beta scores at the specified beta values. The F-beta score - is a generalization of F-1 score. This is only used for binary classification. - - With a beta of 1 (i.e. the F-1 score), precision and recall are given the same weight. A larger - beta score puts more weight on recall and less on precision. A smaller beta score puts more - weight on precision and less on recall. - default: null - suffix: - oneOf: - - $ref: '#/components/schemas/SuffixString' - nullable: true - description: |- - A string of up to 18 characters that will be added to your fine-tuned model name. 
- - For example, a `suffix` of "custom-model-name" would produce a model name like - `ada:ft-your-org:custom-model-name-2022-02-15-04-21-04`. - default: null - CreateFineTuningJobRequest: - type: object - required: - - training_file - - model - properties: - training_file: - type: string - description: |- - The ID of an uploaded file that contains training data. - - See [upload file](/docs/api-reference/files/upload) for how to upload a file. - - Your dataset must be formatted as a JSONL file. Additionally, you must upload your file with - the purpose `fine-tune`. - - See the [fine-tuning guide](/docs/guides/fine-tuning) for more details. - validation_file: - type: string - nullable: true - description: |- - The ID of an uploaded file that contains validation data. - - If you provide this file, the data is used to generate validation metrics periodically during - fine-tuning. These metrics can be viewed in the fine-tuning results file. The same data should - not be present in both train and validation files. - - Your dataset must be formatted as a JSONL file. You must upload your file with the purpose - `fine-tune`. - - See the [fine-tuning guide](/docs/guides/fine-tuning) for more details. - model: - anyOf: - - type: string - - type: string - enum: - - babbage-002 - - davinci-002 - - gpt-3.5-turbo - description: |- - The name of the model to fine-tune. You can select one of the - [supported models](/docs/guides/fine-tuning/what-models-can-be-fine-tuned). - x-oaiTypeLabel: string - hyperparameters: - type: object - description: The hyperparameters used for the fine-tuning job. - properties: - n_epochs: - anyOf: - - type: string - enum: - - auto - - $ref: '#/components/schemas/NEpochs' - description: |- - The number of epochs to train the model for. An epoch refers to one full cycle through the - training dataset. 
- default: auto - suffix: - oneOf: - - $ref: '#/components/schemas/SuffixString' - nullable: true - description: |- - A string of up to 18 characters that will be added to your fine-tuned model name. - - For example, a `suffix` of "custom-model-name" would produce a model name like - `ft:gpt-3.5-turbo:openai:custom-model-name:7p4lURel`. - default: null - CreateImageEditRequest: - type: object - required: - - prompt - - image - properties: - prompt: - type: string - description: A text description of the desired image(s). The maximum length is 1000 characters. - image: - type: string - format: binary - description: |- - The image to edit. Must be a valid PNG file, less than 4MB, and square. If mask is not - provided, image must have transparency, which will be used as the mask. - mask: - type: string - format: binary - description: |- - An additional image whose fully transparent areas (e.g. where alpha is zero) indicate where - `image` should be edited. Must be a valid PNG file, less than 4MB, and have the same dimensions - as `image`. - n: - oneOf: - - $ref: '#/components/schemas/ImagesN' - nullable: true - description: The number of images to generate. Must be between 1 and 10. - default: 1 - size: - type: string - enum: - - 256x256 - - 512x512 - - 1024x1024 - nullable: true - description: The size of the generated images. Must be one of `256x256`, `512x512`, or `1024x1024`. - default: 1024x1024 - response_format: - type: string - enum: - - url - - b64_json - nullable: true - description: The format in which the generated images are returned. Must be one of `url` or `b64_json`. - default: url - user: - $ref: '#/components/schemas/User' - CreateImageRequest: - type: object - required: - - prompt - properties: - prompt: - type: string - description: A text description of the desired image(s). The maximum length is 1000 characters. - n: - oneOf: - - $ref: '#/components/schemas/ImagesN' - nullable: true - description: The number of images to generate. 
Must be between 1 and 10. - default: 1 - size: - type: string - enum: - - 256x256 - - 512x512 - - 1024x1024 - nullable: true - description: The size of the generated images. Must be one of `256x256`, `512x512`, or `1024x1024`. - default: 1024x1024 - response_format: - type: string - enum: - - url - - b64_json - nullable: true - description: The format in which the generated images are returned. Must be one of `url` or `b64_json`. - default: url - user: - $ref: '#/components/schemas/User' - CreateImageVariationRequest: - type: object - required: - - image - properties: - image: - type: string - format: binary - description: |- - The image to use as the basis for the variation(s). Must be a valid PNG file, less than 4MB, - and square. - n: - oneOf: - - $ref: '#/components/schemas/ImagesN' - nullable: true - description: The number of images to generate. Must be between 1 and 10. - default: 1 - size: - type: string - enum: - - 256x256 - - 512x512 - - 1024x1024 - nullable: true - description: The size of the generated images. Must be one of `256x256`, `512x512`, or `1024x1024`. - default: 1024x1024 - response_format: - type: string - enum: - - url - - b64_json - nullable: true - description: The format in which the generated images are returned. Must be one of `url` or `b64_json`. - default: url - user: - $ref: '#/components/schemas/User' - CreateModerationRequest: - type: object - required: - - input - properties: - input: - anyOf: - - type: string - - type: array - items: - type: string - description: The input text to classify - model: - anyOf: - - type: string - - type: string - enum: - - text-moderation-latest - - text-moderation-stable - description: |- - Two content moderation models are available: `text-moderation-stable` and - `text-moderation-latest`. The default is `text-moderation-latest`, which will be automatically - upgraded over time. This ensures you are always using our most accurate model. 
If you use - `text-moderation-stable`, we will provide advance notice before updating the model. Accuracy - of `text-moderation-stable` may be slightly lower than for `text-moderation-latest`. - x-oaiTypeLabel: string - default: text-moderation-latest - CreateModerationResponse: - type: object - required: - - id - - model - - results - properties: - id: - type: string - description: The unique identifier for the moderation request. - model: - type: string - description: The model used to generate the moderation results. - results: - type: array - items: - type: object - required: - - flagged - - categories - - category_scores - properties: - flagged: - type: boolean - description: Whether the content violates [OpenAI's usage policies](/policies/usage-policies). - categories: - type: object - description: A list of the categories, and whether they are flagged or not. - required: - - hate - - hate/threatening - - harassment - - harassment/threatening - - self-harm - - self-harm/intent - - self-harm/instructive - - sexual - - sexual/minors - - violence - - violence/graphic - properties: - hate: - type: boolean - description: |- - Content that expresses, incites, or promotes hate based on race, gender, ethnicity, - religion, nationality, sexual orientation, disability status, or caste. Hateful content - aimed at non-protected groups (e.g., chess players) is harassment. - hate/threatening: - type: boolean - description: |- - Hateful content that also includes violence or serious harm towards the targeted group - based on race, gender, ethnicity, religion, nationality, sexual orientation, disability - status, or caste. - harassment: - type: boolean - description: Content that expresses, incites, or promotes harassing language towards any target. - harassment/threatening: - type: boolean - description: Harassment content that also includes violence or serious harm towards any target. 
- self-harm: - type: boolean - description: |- - Content that promotes, encourages, or depicts acts of self-harm, such as suicide, cutting, - and eating disorders. - self-harm/intent: - type: boolean - description: |- - Content where the speaker expresses that they are engaging or intend to engage in acts of - self-harm, such as suicide, cutting, and eating disorders. - self-harm/instructive: - type: boolean - description: |- - Content that encourages performing acts of self-harm, such as suicide, cutting, and eating - disorders, or that gives instructions or advice on how to commit such acts. - sexual: - type: boolean - description: |- - Content meant to arouse sexual excitement, such as the description of sexual activity, or - that promotes sexual services (excluding sex education and wellness). - sexual/minors: - type: boolean - description: Sexual content that includes an individual who is under 18 years old. - violence: - type: boolean - description: Content that depicts death, violence, or physical injury. - violence/graphic: - type: boolean - description: Content that depicts death, violence, or physical injury in graphic detail. - category_scores: - type: object - description: A list of the categories along with their scores as predicted by the model. - required: - - hate - - hate/threatening - - harassment - - harassment/threatening - - self-harm - - self-harm/intent - - self-harm/instructive - - sexual - - sexual/minors - - violence - - violence/graphic - properties: - hate: - type: number - format: double - description: The score for the category 'hate'. - hate/threatening: - type: number - format: double - description: The score for the category 'hate/threatening'. - harassment: - type: number - format: double - description: The score for the category 'harassment'. - harassment/threatening: - type: number - format: double - description: The score for the category 'harassment/threatening'. 
- self-harm: - type: number - format: double - description: The score for the category 'self-harm'. - self-harm/intent: - type: number - format: double - description: The score for the category 'self-harm/intent'. - self-harm/instructive: - type: number - format: double - description: The score for the category 'self-harm/instructive'. - sexual: - type: number - format: double - description: The score for the category 'sexual'. - sexual/minors: - type: number - format: double - description: The score for the category 'sexual/minors'. - violence: - type: number - format: double - description: The score for the category 'violence'. - violence/graphic: - type: number - format: double - description: The score for the category 'violence/graphic'. - description: A list of moderation objects. - CreateTranscriptionRequest: - type: object - required: - - file - - model - properties: - file: - type: string - format: binary - description: |- - The audio file object (not file name) to transcribe, in one of these formats: flac, mp3, mp4, - mpeg, mpga, m4a, ogg, wav, or webm. - x-oaiTypeLabel: file - model: - anyOf: - - type: string - - type: string - enum: - - whisper-1 - description: ID of the model to use. Only `whisper-1` is currently available. - x-oaiTypeLabel: string - prompt: - type: string - description: |- - An optional text to guide the model's style or continue a previous audio segment. The - [prompt](/docs/guides/speech-to-text/prompting) should match the audio language. - response_format: - type: string - enum: - - json - - text - - srt - - verbose_json - - vtt - description: |- - The format of the transcript output, in one of these options: json, text, srt, verbose_json, or - vtt. - default: json - temperature: - type: number - format: double - description: |- - The sampling temperature, between 0 and 1. Higher values like 0.8 will make the output more - random, while lower values like 0.2 will make it more focused and deterministic. 
If set to 0, - the model will use [log probability](https://en.wikipedia.org/wiki/Log_probability) to - automatically increase the temperature until certain thresholds are hit. - minimum: 0 - maximum: 1 - default: 0 - language: - type: string - description: |- - The language of the input audio. Supplying the input language in - [ISO-639-1](https://en.wikipedia.org/wiki/List_of_ISO_639-1_codes) format will improve accuracy - and latency. - CreateTranscriptionResponse: - type: object - required: - - text - properties: - text: - type: string - CreateTranslationRequest: - type: object - required: - - file - - model - properties: - file: - type: string - format: binary - description: |- - The audio file object (not file name) to translate, in one of these formats: flac, mp3, mp4, - mpeg, mpga, m4a, ogg, wav, or webm. - x-oaiTypeLabel: file - model: - anyOf: - - type: string - - type: string - enum: - - whisper-1 - description: ID of the model to use. Only `whisper-1` is currently available. - x-oaiTypeLabel: string - prompt: - type: string - description: |- - An optional text to guide the model's style or continue a previous audio segment. The - [prompt](/docs/guides/speech-to-text/prompting) should match the audio language. - response_format: - type: string - enum: - - json - - text - - srt - - verbose_json - - vtt - description: |- - The format of the transcript output, in one of these options: json, text, srt, verbose_json, or - vtt. - default: json - temperature: - type: number - format: double - description: |- - The sampling temperature, between 0 and 1. Higher values like 0.8 will make the output more - random, while lower values like 0.2 will make it more focused and deterministic. If set to 0, - the model will use [log probability](https://en.wikipedia.org/wiki/Log_probability) to - automatically increase the temperature until certain thresholds are hit. 
- minimum: 0 - maximum: 1 - default: 0 - CreateTranslationResponse: - type: object - required: - - text - properties: - text: - type: string - DeleteFileResponse: - type: object - required: - - id - - object - - deleted - properties: - id: - type: string - object: - type: string - deleted: - type: boolean - DeleteModelResponse: - type: object - required: - - id - - object - - deleted - properties: - id: - type: string - object: - type: string - deleted: - type: boolean - EditN: - type: integer - format: int64 - minimum: 0 - maximum: 20 - Embedding: - type: object - description: Represents an embedding vector returned by the embeddings endpoint. - required: - - index - - object - - embedding - properties: - index: - type: integer - format: int64 - description: The index of the embedding in the list of embeddings. - object: - type: string - enum: - - embedding - description: The object type, which is always "embedding". - embedding: - type: array - items: - type: number - format: double - description: |- - The embedding vector, which is a list of floats. The length of the vector depends on the model, as - listed in the [embedding guide](/docs/guides/embeddings). - Error: - type: object - required: - - type - - message - - param - - code - properties: - type: - type: string - message: - type: string - param: - type: string - nullable: true - code: - type: string - nullable: true - ErrorResponse: - type: object - required: - - error - properties: - error: - $ref: '#/components/schemas/Error' - FineTune: - type: object - description: The `FineTune` object represents a legacy fine-tune job that has been created through the API. - required: - - id - - object - - created_at - - updated_at - - model - - fine_tuned_model - - organization_id - - status - - hyperparams - - training_files - - validation_files - - result_files - properties: - id: - type: string - description: The object identifier, which can be referenced in the API endpoints. 
- object: - type: string - enum: - - fine-tune - description: The object type, which is always "fine-tune". - created_at: - type: integer - format: unixtime - description: The Unix timestamp (in seconds) for when the fine-tuning job was created. - updated_at: - type: integer - format: unixtime - description: The Unix timestamp (in seconds) for when the fine-tuning job was last updated. - model: - type: string - description: The base model that is being fine-tuned. - fine_tuned_model: - type: string - nullable: true - description: The name of the fine-tuned model that is being created. - organization_id: - type: string - description: The organization that owns the fine-tuning job. - status: - type: string - enum: - - created - - running - - succeeded - - failed - - cancelled - description: |- - The current status of the fine-tuning job, which can be either `created`, `running`, - `succeeded`, `failed`, or `cancelled`. - hyperparams: - type: object - description: |- - The hyperparameters used for the fine-tuning job. See the - [fine-tuning guide](/docs/guides/legacy-fine-tuning/hyperparameters) for more details. - required: - - n_epochs - - batch_size - - prompt_loss_weight - - learning_rate_multiplier - properties: - n_epochs: - type: integer - format: int64 - description: |- - The number of epochs to train the model for. An epoch refers to one full cycle through the - training dataset. - batch_size: - type: integer - format: int64 - description: |- - The batch size to use for training. The batch size is the number of training examples used to - train a single forward and backward pass. - prompt_loss_weight: - type: number - format: double - description: The weight to use for loss on the prompt tokens. - learning_rate_multiplier: - type: number - format: double - description: The learning rate multiplier to use for training. 
- compute_classification_metrics: - type: boolean - description: Whether to compute classification-specific metrics using the validation dataset at the end of every epoch. - classification_positive_class: - type: string - description: The positive class to use for computing classification metrics. - classification_n_classes: - type: integer - format: int64 - description: The number of classes to use for computing classification metrics. - training_files: - type: array - items: - $ref: '#/components/schemas/OpenAIFile' - description: The list of files used for training. - validation_files: - type: array - items: - $ref: '#/components/schemas/OpenAIFile' - description: The list of files used for validation. - result_files: - type: array - items: - $ref: '#/components/schemas/OpenAIFile' - description: The compiled results files for the fine-tuning job. - events: - type: array - items: - $ref: '#/components/schemas/FineTuneEvent' - description: The list of events that have been observed in the lifecycle of the FineTune job. - FineTuneEvent: - type: object - required: - - object - - created_at - - level - - message - properties: - object: - type: string - created_at: - type: integer - format: unixtime - level: - type: string - message: - type: string - FineTuningEvent: - type: object - required: - - object - - created_at - - level - - message - properties: - object: - type: string - created_at: - type: integer - format: unixtime - level: - type: string - message: - type: string - data: - type: object - additionalProperties: {} - nullable: true - type: - type: string - enum: - - message - - metrics - FineTuningJob: - type: object - required: - - id - - object - - created_at - - finished_at - - model - - fine_tuned_model - - organization_id - - status - - hyperparameters - - training_file - - validation_file - - result_files - - trained_tokens - - error - properties: - id: - type: string - description: The object identifier, which can be referenced in the API endpoints. 
- object: - type: string - enum: - - fine_tuning.job - description: The object type, which is always "fine_tuning.job". - created_at: - type: integer - format: unixtime - description: The Unix timestamp (in seconds) for when the fine-tuning job was created. - finished_at: - type: integer - format: unixtime - nullable: true - description: |- - The Unix timestamp (in seconds) for when the fine-tuning job was finished. The value will be - null if the fine-tuning job is still running. - model: - type: string - description: The base model that is being fine-tuned. - fine_tuned_model: - type: string - nullable: true - description: |- - The name of the fine-tuned model that is being created. The value will be null if the - fine-tuning job is still running. - organization_id: - type: string - description: The organization that owns the fine-tuning job. - status: - type: string - enum: - - created - - pending - - running - - succeeded - - failed - - cancelled - description: |- - The current status of the fine-tuning job, which can be either `created`, `pending`, `running`, - `succeeded`, `failed`, or `cancelled`. - hyperparameters: - type: object - description: |- - The hyperparameters used for the fine-tuning job. See the - [fine-tuning guide](/docs/guides/fine-tuning) for more details. - properties: - n_epochs: - anyOf: - - type: string - enum: - - auto - - $ref: '#/components/schemas/NEpochs' - description: |- - The number of epochs to train the model for. An epoch refers to one full cycle through the - training dataset. - - "Auto" decides the optimal number of epochs based on the size of the dataset. If setting the - number manually, we support any number between 1 and 50 epochs. - default: auto - training_file: - type: string - description: |- - The file ID used for training. You can retrieve the training data with the - [Files API](/docs/api-reference/files/retrieve-contents). 
- validation_file: - type: string - nullable: true - description: |- - The file ID used for validation. You can retrieve the validation results with the - [Files API](/docs/api-reference/files/retrieve-contents). - result_files: - type: array - items: - type: string - description: |- - The compiled results file ID(s) for the fine-tuning job. You can retrieve the results with the - [Files API](/docs/api-reference/files/retrieve-contents). - trained_tokens: - type: integer - format: int64 - nullable: true - description: |- - The total number of billable tokens processed by this fine-tuning job. The value will be null - if the fine-tuning job is still running. - error: - type: object - description: |- - For fine-tuning jobs that have `failed`, this will contain more information on the cause of the - failure. - properties: - message: - type: string - description: A human-readable error message. - code: - type: string - description: A machine-readable error code. - param: - type: string - nullable: true - description: |- - The parameter that was invalid, usually `training_file` or `validation_file`. This field - will be null if the failure was not parameter-specific. - nullable: true - FineTuningJobEvent: - type: object - required: - - id - - object - - created_at - - level - - message - properties: - id: - type: string - object: - type: string - created_at: - type: integer - format: unixtime - level: - type: string - enum: - - info - - warn - - error - message: - type: string - Image: - type: object - description: Represents the URL or the content of an image generated by the OpenAI API. - properties: - url: - type: string - format: uri - description: The URL of the generated image, if `response_format` is `url` (default). - b64_json: - type: string - format: base64 - description: The base64-encoded JSON of the generated image, if `response_format` is `b64_json`. 
- ImagesN: - type: integer - format: int64 - minimum: 1 - maximum: 10 - ImagesResponse: - type: object - required: - - created - - data - properties: - created: - type: integer - format: unixtime - data: - type: array - items: - $ref: '#/components/schemas/Image' - ListFilesResponse: - type: object - required: - - object - - data - properties: - object: - type: string - data: - type: array - items: - $ref: '#/components/schemas/OpenAIFile' - ListFineTuneEventsResponse: - type: object - required: - - object - - data - properties: - object: - type: string - data: - type: array - items: - $ref: '#/components/schemas/FineTuneEvent' - ListFineTunesResponse: - type: object - required: - - object - - data - properties: - object: - type: string - data: - type: array - items: - $ref: '#/components/schemas/FineTune' - ListFineTuningJobEventsResponse: - type: object - required: - - object - - data - properties: - object: - type: string - data: - type: array - items: - $ref: '#/components/schemas/FineTuningJobEvent' - ListModelsResponse: - type: object - required: - - object - - data - properties: - object: - type: string - data: - type: array - items: - $ref: '#/components/schemas/Model' - ListPaginatedFineTuningJobsResponse: - type: object - required: - - object - - data - - has_more - properties: - object: - type: string - data: - type: array - items: - $ref: '#/components/schemas/FineTuningJob' - has_more: - type: boolean - MaxTokens: - type: integer - format: int64 - minimum: 0 - Model: - type: object - description: Describes an OpenAI model offering that can be used with the API. - required: - - id - - object - - created - - owned_by - properties: - id: - type: string - description: The model identifier, which can be referenced in the API endpoints. - object: - type: string - enum: - - model - description: The object type, which is always "model". - created: - type: integer - format: unixtime - description: The Unix timestamp (in seconds) when the model was created. 
- owned_by: - type: string - description: The organization that owns the model. - N: - type: integer - format: int64 - minimum: 1 - maximum: 128 - NEpochs: - type: integer - format: int64 - minimum: 1 - maximum: 50 - OpenAIFile: - type: object - description: The `File` object represents a document that has been uploaded to OpenAI. - required: - - id - - object - - bytes - - createdAt - - filename - - purpose - - status - properties: - id: - type: string - description: The file identifier, which can be referenced in the API endpoints. - object: - type: string - enum: - - file - description: The object type, which is always "file". - bytes: - type: integer - format: int64 - description: The size of the file in bytes. - createdAt: - type: integer - format: unixtime - description: The Unix timestamp (in seconds) for when the file was created. - filename: - type: string - description: The name of the file. - purpose: - type: string - description: The intended purpose of the file. Currently, only "fine-tune" is supported. - status: - type: string - enum: - - uploaded - - processed - - pending - - error - - deleting - - deleted - description: |- - The current status of the file, which can be either `uploaded`, `processed`, `pending`, - `error`, `deleting` or `deleted`. - status_details: - type: string - nullable: true - description: |- - Additional details about the status of the file. If the file is in the `error` state, this will - include a message describing the error. 
- Penalty: - type: number - format: double - minimum: -2 - maximum: 2 - Prompt: - oneOf: - - type: string - - type: array - items: - type: string - - $ref: '#/components/schemas/TokenArray' - - $ref: '#/components/schemas/TokenArrayArray' - nullable: true - Stop: - oneOf: - - type: string - - $ref: '#/components/schemas/StopSequences' - nullable: true - StopSequences: - type: array - items: - type: string - minItems: 1 - maxItems: 4 - SuffixString: - type: string - minLength: 1 - maxLength: 40 - Temperature: - type: number - format: double - minimum: 0 - maximum: 2 - TokenArray: - type: array - items: - type: integer - format: int64 - minItems: 1 - TokenArrayArray: - type: array - items: - $ref: '#/components/schemas/TokenArray' - minItems: 1 - TopP: - type: number - format: double - minimum: 0 - maximum: 1 - User: - type: string - securitySchemes: - BearerAuth: - type: http - scheme: bearer -servers: - - url: https://api.openai.com/v1 - description: OpenAI Endpoint - variables: {}