2 changes: 1 addition & 1 deletion .release-please-manifest.json
@@ -1,3 +1,3 @@
{
".": "4.29.2"
".": "4.30.0"
}
27 changes: 27 additions & 0 deletions CHANGELOG.md
@@ -1,5 +1,32 @@
# Changelog

## 4.30.0 (2024-03-28)

Full Changelog: [v4.29.2...v4.30.0](https://github.com/openai/openai-node/compare/v4.29.2...v4.30.0)

### Features

* assistant fromReadableStream ([#738](https://github.com/openai/openai-node/issues/738)) ([8f4ba18](https://github.com/openai/openai-node/commit/8f4ba18268797d6c54c393d701b13c7ff2aa71bc))


### Bug Fixes

* **client:** correctly send deno version header ([#736](https://github.com/openai/openai-node/issues/736)) ([b7ea175](https://github.com/openai/openai-node/commit/b7ea175b2854909de77b920dd25613f1d2daefd6))
* **example:** correcting example ([#739](https://github.com/openai/openai-node/issues/739)) ([a819551](https://github.com/openai/openai-node/commit/a81955175da24e196490a38850bbf6f9b6779ea8))
* handle process.env being undefined in debug func ([#733](https://github.com/openai/openai-node/issues/733)) ([2baa149](https://github.com/openai/openai-node/commit/2baa1491f7834f779ca49c3027d2344ead412dd2))
* **internal:** make toFile use input file's options ([#727](https://github.com/openai/openai-node/issues/727)) ([15880d7](https://github.com/openai/openai-node/commit/15880d77b6c1cf58a6b9cfdbf7ae4442cdbddbd6))


### Chores

* **internal:** add type ([#737](https://github.com/openai/openai-node/issues/737)) ([18c1989](https://github.com/openai/openai-node/commit/18c19891f783019517d7961fe03c4d98de0fcf93))


### Documentation

* **readme:** consistent use of sentence case in headings ([#729](https://github.com/openai/openai-node/issues/729)) ([7e515fd](https://github.com/openai/openai-node/commit/7e515fde433ebfb7871d75d53915eef05a08a916))
* **readme:** document how to make undocumented requests ([#730](https://github.com/openai/openai-node/issues/730)) ([a06d861](https://github.com/openai/openai-node/commit/a06d861a015eeee411fa2c6ed9bf3000313cfc03))

## 4.29.2 (2024-03-19)

Full Changelog: [v4.29.1...v4.29.2](https://github.com/openai/openai-node/compare/v4.29.1...v4.29.2)
58 changes: 52 additions & 6 deletions README.md
@@ -19,7 +19,7 @@ You can import in Deno via:
<!-- x-release-please-start-version -->

```ts
import OpenAI from 'https://deno.land/x/openai@v4.29.2/mod.ts';
import OpenAI from 'https://deno.land/x/openai@v4.30.0/mod.ts';
```

<!-- x-release-please-end -->
@@ -46,7 +46,7 @@ async function main() {
main();
```

## Streaming Responses
## Streaming responses

We provide support for streaming responses using Server-Sent Events (SSE).

@@ -256,7 +256,7 @@ Note that `runFunctions` was previously available as well, but has been deprecat
Read more about various examples such as with integrating with [zod](helpers.md#integrate-with-zod),
[next.js](helpers.md#integrate-wtih-next-js), and [proxying a stream to the browser](helpers.md#proxy-streaming-to-a-browser).

## File Uploads
## File uploads

Request parameters that correspond to file uploads can be passed in many different forms:

@@ -437,7 +437,51 @@ console.log(raw.headers.get('X-My-Header'));
console.log(chatCompletion);
```

## Customizing the fetch client
### Making custom/undocumented requests

This library is typed for convenient access to the documented API. If you need to access undocumented
endpoints, params, or response properties, the library can still be used.

#### Undocumented endpoints

To make requests to undocumented endpoints, you can use `client.get`, `client.post`, and other HTTP verbs.
Options on the client, such as retries, will be respected when making these requests.

```ts
await client.post('/some/path', {
body: { some_prop: 'foo' },
query: { some_query_arg: 'bar' },
});
```

#### Undocumented params

To make requests using undocumented parameters, you may use `// @ts-expect-error` on the undocumented
parameter. This library doesn't validate at runtime that the request matches the type, so any extra values you
send will be sent as-is.

```ts
client.foo.create({
foo: 'my_param',
bar: 12,
// @ts-expect-error baz is not yet public
baz: 'undocumented option',
});
```

For requests with the `GET` verb, any extra params will be sent in the query string; all other requests will
send the extra params in the body.

If you want to explicitly send an extra argument, you can do so with the `query`, `body`, and `headers` request
options.

#### Undocumented properties

To access undocumented response properties, you may use `// @ts-expect-error` on the property access, or
cast the response object to the requisite type. As with the request params, we do not validate or strip
extra properties from the API response.
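For example (a purely local sketch with a hypothetical `secret_field` property; no request is made):

```typescript
// Hypothetical typed response shape: `secret_field` is not part of it.
interface CompletionLike {
  id: string;
}

// Stand-in for a parsed API response carrying an undocumented property.
const completion: CompletionLike = JSON.parse('{"id": "chatcmpl-123", "secret_field": "hidden"}');

// Option 1: suppress the type error on the property access.
// @ts-expect-error secret_field is not in the documented type
const viaExpectError: string = completion.secret_field;

// Option 2: cast the response to a widened type.
const viaCast = (completion as CompletionLike & { secret_field: string }).secret_field;

console.log(viaExpectError, viaCast); // hidden hidden
```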

### Customizing the fetch client

By default, this library uses `node-fetch` in Node, and expects a global `fetch` function in other environments.

@@ -455,6 +499,8 @@ import OpenAI from 'openai';
To do the inverse, add `import "openai/shims/node"` (which does import polyfills).
This can also be useful if you are getting the wrong TypeScript types for `Response` ([more details](https://github.com/openai/openai-node/tree/master/src/_shims#readme)).

### Logging and middleware

You may also provide a custom `fetch` function when instantiating the client,
which can be used to inspect or alter the `Request` or `Response` before/after each request:

Expand All @@ -475,7 +521,7 @@ const client = new OpenAI({
Note that if given a `DEBUG=true` environment variable, this library will log all requests and responses automatically.
This is intended for debugging purposes only and may change in the future without notice.

## Configuring an HTTP(S) Agent (e.g., for proxies)
### Configuring an HTTP(S) Agent (e.g., for proxies)

By default, this library uses a stable agent for all http/https requests to reuse TCP connections, eliminating many TCP & TLS handshakes and shaving around 100ms off most requests.

@@ -497,7 +543,7 @@ await openai.models.list({
});
```

## Semantic Versioning
## Semantic versioning

This package generally follows [SemVer](https://semver.org/spec/v2.0.0.html) conventions, though certain backwards-incompatible changes may be released as minor versions:

2 changes: 1 addition & 1 deletion build-deno
@@ -14,7 +14,7 @@ This is a build produced from https://github.com/openai/openai-node – please g
Usage:

\`\`\`ts
import OpenAI from "https://deno.land/x/openai@v4.29.2/mod.ts";
import OpenAI from "https://deno.land/x/openai@v4.30.0/mod.ts";

const client = new OpenAI();
\`\`\`
4 changes: 3 additions & 1 deletion examples/assistant-stream-raw.ts
@@ -1,3 +1,5 @@
#!/usr/bin/env -S npm run tsn -T

import OpenAI from 'openai';

const openai = new OpenAI();
@@ -27,7 +29,7 @@ async function main() {
for await (const event of stream) {
if (event.event === 'thread.message.delta') {
const chunk = event.data.delta.content?.[0];
if (chunk && 'text' in chunk) {
if (chunk && 'text' in chunk && chunk.text.value) {
process.stdout.write(chunk.text.value);
}
}
2 changes: 1 addition & 1 deletion package.json
@@ -1,6 +1,6 @@
{
"name": "openai",
"version": "4.29.2",
"version": "4.30.0",
"description": "The official TypeScript library for the OpenAI API",
"author": "OpenAI <support@openai.com>",
"types": "dist/index.d.ts",
5 changes: 3 additions & 2 deletions src/core.ts
@@ -818,7 +818,8 @@ const getPlatformProperties = (): PlatformProperties => {
'X-Stainless-OS': normalizePlatform(Deno.build.os),
'X-Stainless-Arch': normalizeArch(Deno.build.arch),
'X-Stainless-Runtime': 'deno',
'X-Stainless-Runtime-Version': Deno.version,
'X-Stainless-Runtime-Version':
typeof Deno.version === 'string' ? Deno.version : Deno.version?.deno ?? 'unknown',
};
}
if (typeof EdgeRuntime !== 'undefined') {
@@ -1075,7 +1076,7 @@ function applyHeadersMut(targetHeaders: Headers, newHeaders: Headers): void {
}

export function debug(action: string, ...args: any[]) {
if (typeof process !== 'undefined' && process.env['DEBUG'] === 'true') {
if (typeof process !== 'undefined' && process?.env?.['DEBUG'] === 'true') {
console.log(`OpenAI:DEBUG:${action}`, ...args);
}
}
28 changes: 27 additions & 1 deletion src/lib/AssistantStream.ts
@@ -158,6 +158,32 @@ export class AssistantStream
};
}

static fromReadableStream(stream: ReadableStream): AssistantStream {
const runner = new AssistantStream();
runner._run(() => runner._fromReadableStream(stream));
return runner;
}

protected async _fromReadableStream(
readableStream: ReadableStream,
options?: Core.RequestOptions,
): Promise<Run> {
const signal = options?.signal;
if (signal) {
if (signal.aborted) this.controller.abort();
signal.addEventListener('abort', () => this.controller.abort());
}
this._connected();
const stream = Stream.fromReadableStream<AssistantStreamEvent>(readableStream, this.controller);
for await (const event of stream) {
this.#handleEvent(event);
}
if (stream.controller.signal?.aborted) {
throw new APIUserAbortError();
}
return this._addRun(this.#endRequest());
}

toReadableStream(): ReadableStream {
const stream = new Stream(this[Symbol.asyncIterator].bind(this), this.controller);
return stream.toReadableStream();
@@ -385,7 +411,7 @@ export class AssistantStream
throw new OpenAIError(`stream has ended, this shouldn't happen`);
}

if (!this.#finalRun) throw Error('Final run has been been received');
if (!this.#finalRun) throw Error('Final run has not been received');

return this.#finalRun;
}
2 changes: 1 addition & 1 deletion src/streaming.ts
@@ -201,7 +201,7 @@ export class Stream<Item> implements AsyncIterable<Item> {
async start() {
iter = self[Symbol.asyncIterator]();
},
async pull(ctrl) {
async pull(ctrl: any) {
try {
const { value, done } = await iter.next();
if (done) return ctrl.close();
5 changes: 4 additions & 1 deletion src/uploads.ts
@@ -102,11 +102,14 @@ export type ToFileInput = Uploadable | Exclude<BlobLikePart, string> | AsyncIter
export async function toFile(
value: ToFileInput | PromiseLike<ToFileInput>,
name?: string | null | undefined,
options: FilePropertyBag | undefined = {},
options?: FilePropertyBag | undefined,
): Promise<FileLike> {
// If it's a promise, resolve it.
value = await value;

// Use the file's options if there isn't one provided
options ??= isFileLike(value) ? { lastModified: value.lastModified, type: value.type } : {};

if (isResponseLike(value)) {
const blob = await value.blob();
name ||= new URL(value.url).pathname.split(/[\\/]/).pop() ?? 'unknown_file';
2 changes: 1 addition & 1 deletion src/version.ts
@@ -1 +1 @@
export const VERSION = '4.29.2'; // x-release-please-version
export const VERSION = '4.30.0'; // x-release-please-version