Commit e4ea7a1

feat(beta): add streaming and function calling helpers (#409)
1 parent 0e67361 commit e4ea7a1

32 files changed: +4538 −7 lines changed

README.md

Lines changed: 114 additions & 1 deletion
@@ -21,7 +21,7 @@ You can import in Deno via:

<!-- x-release-please-start-version -->

```ts
- import OpenAI from 'https://raw.githubusercontent.com/openai/openai-node/v4.14.1-deno/mod.ts';
+ import OpenAI from 'https://raw.githubusercontent.com/openai/openai-node/v4.14.2-deno/mod.ts';
```

<!-- x-release-please-end -->
@@ -102,6 +102,119 @@ Documentation for each method, request param, and response field are available i
> [!IMPORTANT]
> Previous versions of this SDK used a `Configuration` class. See the [v3 to v4 migration guide](https://github.com/openai/openai-node/discussions/217).

### Streaming responses

This library provides several conveniences for streaming chat completions, for example:

```ts
import OpenAI from 'openai';

const openai = new OpenAI();

async function main() {
  const stream = await openai.beta.chat.completions.stream({
    model: 'gpt-4',
    messages: [{ role: 'user', content: 'Say this is a test' }],
    stream: true,
  });

  stream.on('content', (delta, snapshot) => {
    process.stdout.write(delta);
  });

  // or, equivalently:
  for await (const part of stream) {
    process.stdout.write(part.choices[0]?.delta?.content || '');
  }

  const chatCompletion = await stream.finalChatCompletion();
  console.log(chatCompletion); // {id: "…", choices: […], …}
}

main();
```

Streaming with `openai.beta.chat.completions.stream({…})` exposes [various helpers for your convenience](helpers.md#events) including event handlers and promises.
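
For instance, you can register handlers for named events and await promise-style accessors for the final result. A minimal sketch, based only on the events and accessors exercised by the ecosystem tests added in this commit (`content`, `finalContent`, `finalMessage()`):

```ts
import OpenAI from 'openai';

const openai = new OpenAI();

async function main() {
  const stream = openai.beta.chat.completions
    .stream({
      model: 'gpt-4',
      messages: [{ role: 'user', content: 'Say this is a test' }],
    })
    // Event handlers fire as data arrives.
    .on('content', (delta, snapshot) => process.stdout.write(delta))
    .on('finalContent', (content) => console.log('\nfinal content:', content));

  // Promise-style accessors resolve once the stream has finished.
  const message = await stream.finalMessage();
  console.log(message);
}

main();
```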

Alternatively, you can use `openai.chat.completions.create({ stream: true, … })`, which only returns an async iterable of the chunks in the stream and thus uses less memory (it does not build up a final chat completion object for you).

If you need to cancel a stream, you can `break` from a `for await` loop or call `stream.abort()`.
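
A minimal sketch of that lower-level approach (the model and prompt are reused from the example above purely for illustration):

```ts
import OpenAI from 'openai';

const openai = new OpenAI();

async function main() {
  const stream = await openai.chat.completions.create({
    model: 'gpt-4',
    messages: [{ role: 'user', content: 'Say this is a test' }],
    stream: true,
  });

  for await (const chunk of stream) {
    process.stdout.write(chunk.choices[0]?.delta?.content || '');
    // `break`ing out of this loop cancels the stream;
    // no final chat completion object is assembled for you.
  }
}

main();
```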

### Automated function calls

We provide an `openai.beta.chat.completions.runFunctions({…})` convenience helper for using function calls with the `/chat/completions` endpoint, which automatically calls the JavaScript functions you provide and sends their results back to the `/chat/completions` endpoint, looping as long as the model requests function calls.

If you pass a `parse` function, it will automatically parse the `arguments` for you and return any parsing errors to the model to attempt auto-recovery. Otherwise, the args will be passed to the function you provide as a string.

If you pass `function_call: {name: …}` instead of `auto`, it returns immediately after calling that function (and only loops to auto-recover parsing errors).

```ts
import OpenAI from 'openai';

const client = new OpenAI();

async function main() {
  const runner = client.beta.chat.completions
    .runFunctions({
      model: 'gpt-3.5-turbo',
      messages: [{ role: 'user', content: 'How is the weather this week?' }],
      functions: [
        {
          function: getCurrentLocation,
          parameters: { type: 'object', properties: {} },
        },
        {
          function: getWeather,
          parse: JSON.parse, // or use a validation library like zod for typesafe parsing.
          parameters: {
            type: 'object',
            properties: {
              location: { type: 'string' },
            },
          },
        },
      ],
    })
    .on('message', (message) => console.log(message));

  const finalContent = await runner.finalContent();
  console.log();
  console.log('Final content:', finalContent);
}

async function getCurrentLocation() {
  return 'Boston'; // Simulate lookup
}

async function getWeather(args: { location: string }) {
  const { location } = args;
  // … do lookup …
  return { temperature: '50degF', precipitation: 'high' };
}

main();

// {role: "user", content: "How's the weather this week?"}
// {role: "assistant", function_call: "getCurrentLocation", arguments: "{}"}
// {role: "function", name: "getCurrentLocation", content: "Boston"}
// {role: "assistant", function_call: "getWeather", arguments: '{"location": "Boston"}'}
// {role: "function", name: "getWeather", content: '{"temperature": "50degF", "precipitation": "high"}'}
// {role: "assistant", content: "It's looking cold and rainy - you might want to wear a jacket!"}
//
// Final content: "It's looking cold and rainy - you might want to wear a jacket!"
```
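
As a sketch of the `parse` and `function_call: {name: …}` variations described above (hypothetical weather function; assumes `zod` is installed for schema validation):

```ts
import OpenAI from 'openai';
import { z } from 'zod';

const client = new OpenAI();

const weatherArgs = z.object({ location: z.string() });

async function getWeather({ location }: { location: string }) {
  return { location, temperature: '50degF', precipitation: 'high' }; // simulated lookup
}

async function main() {
  const runner = client.beta.chat.completions
    .runFunctions({
      model: 'gpt-3.5-turbo',
      messages: [{ role: 'user', content: 'How is the weather in Boston?' }],
      // function_call: { name: 'getWeather' }, // force this function and return right after it runs
      functions: [
        {
          function: getWeather,
          // Validation errors thrown here are sent back to the model so it can retry.
          parse: (args: string) => weatherArgs.parse(JSON.parse(args)),
          parameters: {
            type: 'object',
            properties: { location: { type: 'string' } },
          },
        },
      ],
    })
    .on('message', (message) => console.log(message));

  console.log('Final content:', await runner.finalContent());
}

main();
```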

Like with `.stream()`, we provide a variety of [helpers and events](helpers.md#events).

Read more about various examples such as integrating with [zod](helpers.md#integrate-wtih-next-js), [next.js](helpers.md#integrate-wtih-next-js), and [proxying a stream to the browser](helpers.md#proxy-streaming-to-a-browser).

## File Uploads

Request parameters that correspond to file uploads can be passed in many different forms:

api.md

Lines changed: 11 additions & 0 deletions
@@ -156,3 +156,14 @@ Methods:
- <code title="get /fine-tunes">client.fineTunes.<a href="./src/resources/fine-tunes.ts">list</a>() -> FineTunesPage</code>
- <code title="post /fine-tunes/{fine_tune_id}/cancel">client.fineTunes.<a href="./src/resources/fine-tunes.ts">cancel</a>(fineTuneId) -> FineTune</code>
- <code title="get /fine-tunes/{fine_tune_id}/events">client.fineTunes.<a href="./src/resources/fine-tunes.ts">listEvents</a>(fineTuneId, { ...params }) -> FineTuneEventsListResponse</code>

# Beta

## Chat

### Completions

Methods:

- <code>client.beta.chat.completions.<a href="./src/resources/beta/chat/completions.ts">runFunctions</a>(body, options?) -> ChatCompletionRunner | ChatCompletionStreamingRunner</code>
- <code>client.beta.chat.completions.<a href="./src/resources/beta/chat/completions.ts">stream</a>(body, options?) -> ChatCompletionStream</code>
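
A minimal sketch of how the two types in the `runFunctions` return union would typically come into play, assuming (as the union suggests) that passing `stream: true` selects the streaming runner; `getTime` is a hypothetical helper for illustration:

```ts
import OpenAI from 'openai';

const client = new OpenAI();

async function getTime() {
  return new Date().toISOString();
}

const params = {
  model: 'gpt-3.5-turbo',
  messages: [{ role: 'user' as const, content: 'What time is it?' }],
  functions: [{ function: getTime, parameters: { type: 'object', properties: {} } }],
};

// Without `stream`, runFunctions returns a ChatCompletionRunner.
const runner = client.beta.chat.completions.runFunctions(params);

// With `stream: true`, the same helper returns a ChatCompletionStreamingRunner,
// which additionally emits chunk/content events as they arrive.
const streamingRunner = client.beta.chat.completions.runFunctions({ ...params, stream: true });
```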

ecosystem-tests/node-ts-cjs-auto/tests/test.ts

Lines changed: 87 additions & 1 deletion
@@ -1,4 +1,4 @@
- import OpenAI, { toFile } from 'openai';
+ import OpenAI, { APIUserAbortError, toFile } from 'openai';
import { TranscriptionCreateParams } from 'openai/resources/audio/transcriptions';
import fetch from 'node-fetch';
import { File as FormDataFile, Blob as FormDataBlob } from 'formdata-node';

@@ -68,6 +68,92 @@ it(`streaming works`, async function () {
  expect(chunks.map((c) => c.choices[0]?.delta.content || '').join('')).toBeSimilarTo('This is a test', 10);
});

it(`ChatCompletionStream works`, async function () {
  const chunks: OpenAI.Chat.ChatCompletionChunk[] = [];
  const contents: [string, string][] = [];
  const messages: OpenAI.Chat.ChatCompletionMessage[] = [];
  const chatCompletions: OpenAI.Chat.ChatCompletion[] = [];
  let finalContent: string | undefined;
  let finalMessage: OpenAI.Chat.ChatCompletionMessage | undefined;
  let finalChatCompletion: OpenAI.Chat.ChatCompletion | undefined;

  const stream = client.beta.chat.completions
    .stream({
      model: 'gpt-4',
      messages: [{ role: 'user', content: 'Say this is a test' }],
    })
    .on('chunk', (chunk) => chunks.push(chunk))
    .on('content', (delta, snapshot) => contents.push([delta, snapshot]))
    .on('message', (message) => messages.push(message))
    .on('chatCompletion', (completion) => chatCompletions.push(completion))
    .on('finalContent', (content) => (finalContent = content))
    .on('finalMessage', (message) => (finalMessage = message))
    .on('finalChatCompletion', (completion) => (finalChatCompletion = completion));
  const content = await stream.finalContent();

  expect(content).toBeSimilarTo('This is a test', 10);
  expect(chunks.length).toBeGreaterThan(0);
  expect(contents.length).toBeGreaterThan(0);
  for (const chunk of chunks) {
    expect(chunk.id).toEqual(finalChatCompletion?.id);
    expect(chunk.created).toEqual(finalChatCompletion?.created);
    expect(chunk.model).toEqual(finalChatCompletion?.model);
  }
  expect(finalContent).toEqual(content);
  expect(contents.at(-1)?.[1]).toEqual(content);
  expect(finalMessage?.content).toEqual(content);
  expect(finalChatCompletion?.choices?.[0]?.message.content).toEqual(content);
  expect(messages).toEqual([finalMessage]);
  expect(chatCompletions).toEqual([finalChatCompletion]);
  expect(await stream.finalContent()).toEqual(content);
  expect(await stream.finalMessage()).toEqual(finalMessage);
  expect(await stream.finalChatCompletion()).toEqual(finalChatCompletion);
});

it(`aborting ChatCompletionStream works`, async function () {
  const chunks: OpenAI.Chat.ChatCompletionChunk[] = [];
  const contents: [string, string][] = [];
  const messages: OpenAI.Chat.ChatCompletionMessage[] = [];
  const chatCompletions: OpenAI.Chat.ChatCompletion[] = [];
  let finalContent: string | undefined;
  let finalMessage: OpenAI.Chat.ChatCompletionMessage | undefined;
  let finalChatCompletion: OpenAI.Chat.ChatCompletion | undefined;
  let emittedError: any;
  let caughtError: any;
  const controller = new AbortController();
  const stream = client.beta.chat.completions
    .stream(
      {
        model: 'gpt-4',
        messages: [{ role: 'user', content: 'Say this is a test' }],
      },
      { signal: controller.signal },
    )
    .on('error', (e) => (emittedError = e))
    .on('chunk', (chunk) => chunks.push(chunk))
    .on('content', (delta, snapshot) => {
      contents.push([delta, snapshot]);
      controller.abort();
    })
    .on('message', (message) => messages.push(message))
    .on('chatCompletion', (completion) => chatCompletions.push(completion))
    .on('finalContent', (content) => (finalContent = content))
    .on('finalMessage', (message) => (finalMessage = message))
    .on('finalChatCompletion', (completion) => (finalChatCompletion = completion));
  try {
    await stream.finalContent();
  } catch (error) {
    caughtError = error;
  }
  expect(caughtError).toBeInstanceOf(APIUserAbortError);
  expect(finalContent).toBeUndefined();
  expect(finalMessage).toBeUndefined();
  expect(finalChatCompletion).toBeUndefined();
  expect(chatCompletions).toEqual([]);
  expect(chunks.length).toBeGreaterThan(0);
  expect(contents.length).toBeGreaterThan(0);
});

it('handles formdata-node File', async function () {
  const file = await fetch(url)
    .then((x) => x.arrayBuffer())

examples/.gitignore

Lines changed: 2 additions & 0 deletions
@@ -0,0 +1,2 @@
yarn.lock
node_modules

examples/function-call-diy.ts

Lines changed: 142 additions & 0 deletions
@@ -0,0 +1,142 @@
#!/usr/bin/env -S npm run tsn -T

import OpenAI from 'openai';
import { ChatCompletionMessage, ChatCompletionMessageParam } from 'openai/resources/chat';

// gets API Key from environment variable OPENAI_API_KEY
const openai = new OpenAI();

const functions: OpenAI.Chat.ChatCompletionCreateParams.Function[] = [
  {
    name: 'list',
    description: 'list queries books by genre, and returns a list of names of books',
    parameters: {
      type: 'object',
      properties: {
        genre: { type: 'string', enum: ['mystery', 'nonfiction', 'memoir', 'romance', 'historical'] },
      },
    },
  },
  {
    name: 'search',
    description: 'search queries books by their name and returns a list of book names and their ids',
    parameters: {
      type: 'object',
      properties: {
        name: { type: 'string' },
      },
    },
  },
  {
    name: 'get',
    description:
      "get returns a book's detailed information based on the id of the book. Note that this does not accept names, and only IDs, which you can get by using search.",
    parameters: {
      type: 'object',
      properties: {
        id: { type: 'string' },
      },
    },
  },
];

async function callFunction(function_call: ChatCompletionMessage.FunctionCall): Promise<any> {
  const args = JSON.parse(function_call.arguments!);
  switch (function_call.name) {
    case 'list':
      return await list(args['genre']);

    case 'search':
      return await search(args['name']);

    case 'get':
      return await get(args['id']);

    default:
      throw new Error('No function found');
  }
}

async function main() {
  const messages: ChatCompletionMessageParam[] = [
    {
      role: 'system',
      content:
        'Please use our book database, which you can access using functions to answer the following questions.',
    },
    {
      role: 'user',
      content:
        'I really enjoyed reading To Kill a Mockingbird, could you recommend me a book that is similar and tell me why?',
    },
  ];
  console.log(messages[0]);
  console.log(messages[1]);
  console.log();

  while (true) {
    const completion = await openai.chat.completions.create({
      model: 'gpt-3.5-turbo',
      messages,
      functions: functions,
    });

    const message = completion.choices[0]!.message;
    messages.push(message);
    console.log(message);

    // If there is no function call, we're done and can exit this loop
    if (!message.function_call) {
      return;
    }

    // If there is a function call, we generate a new message with the role 'function'.
    const result = await callFunction(message.function_call);
    const newMessage = {
      role: 'function' as const,
      name: message.function_call.name!,
      content: JSON.stringify(result),
    };
    messages.push(newMessage);

    console.log(newMessage);
    console.log();
  }
}

const db = [
  {
    id: 'a1',
    name: 'To Kill a Mockingbird',
    genre: 'historical',
    description: `Compassionate, dramatic, and deeply moving, "To Kill A Mockingbird" takes readers to the roots of human behavior - to innocence and experience, kindness and cruelty, love and hatred, humor and pathos. Now with over 18 million copies in print and translated into forty languages, this regional story by a young Alabama woman claims universal appeal. Harper Lee always considered her book to be a simple love story. Today it is regarded as a masterpiece of American literature.`,
  },
  {
    id: 'a2',
    name: 'All the Light We Cannot See',
    genre: 'historical',
    description: `In a mining town in Germany, Werner Pfennig, an orphan, grows up with his younger sister, enchanted by a crude radio they find that brings them news and stories from places they have never seen or imagined. Werner becomes an expert at building and fixing these crucial new instruments and is enlisted to use his talent to track down the resistance. Deftly interweaving the lives of Marie-Laure and Werner, Doerr illuminates the ways, against all odds, people try to be good to one another.`,
  },
  {
    id: 'a3',
    name: 'Where the Crawdads Sing',
    genre: 'historical',
    description: `For years, rumors of the “Marsh Girl” haunted Barkley Cove, a quiet fishing village. Kya Clark is barefoot and wild; unfit for polite society. So in late 1969, when the popular Chase Andrews is found dead, locals immediately suspect her.

But Kya is not what they say. A born naturalist with just one day of school, she takes life's lessons from the land, learning the real ways of the world from the dishonest signals of fireflies. But while she has the skills to live in solitude forever, the time comes when she yearns to be touched and loved. Drawn to two young men from town, who are each intrigued by her wild beauty, Kya opens herself to a new and startling world—until the unthinkable happens.`,
  },
];

async function list(genre: string) {
  return db.filter((item) => item.genre === genre).map((item) => ({ name: item.name, id: item.id }));
}

async function search(name: string) {
  return db.filter((item) => item.name.includes(name)).map((item) => ({ name: item.name, id: item.id }));
}

async function get(id: string) {
  return db.find((item) => item.id === id)!;
}

main();
