functionCallResult never gets called in the frontend with ChatCompletionStream #1060
Comments
This is the same error: https://community.openai.com/t/beta-completion-runtools/531973

After some investigation: when `role === "tool"`, the message never gets added to the stream. In other words, this will log only in the backend and never in the frontend:

```ts
runner.on("message", (message) => {
  if (message.role === "tool") {
    console.log(message) // Works on backend but not on frontend
  }
})
```

Is there a way to manually push it as a chunk? Is there anything I can do to at least send it as extra data and then somehow add it back at the correct index in my messages in the frontend? I really need this data, and OpenAI will crash on my next message anyway, as it doesn't have the tool call response.
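One way to splice the tool results back in on the frontend, assuming you forward them out-of-band as extra data, is a small helper that inserts each `role: "tool"` message directly after the assistant message whose `tool_calls` references its `tool_call_id`. This is a sketch, not part of the SDK; the message shape follows the Chat Completions API, and `insertToolMessage` is a hypothetical name:

```typescript
type ToolCall = { id: string };
type Message =
  | { role: "assistant"; content: string | null; tool_calls?: ToolCall[] }
  | { role: "tool"; tool_call_id: string; content: string }
  | { role: "user" | "system"; content: string };

// Insert a tool-result message right after the assistant message that
// issued the matching tool call, so the history stays in API order.
function insertToolMessage(
  messages: Message[],
  toolMsg: Message & { role: "tool" }
): Message[] {
  const idx = messages.findIndex(
    (m) =>
      m.role === "assistant" &&
      m.tool_calls?.some((c) => c.id === toolMsg.tool_call_id)
  );
  if (idx === -1) return [...messages, toolMsg]; // no matching call: append
  // Skip past any tool results already inserted for the same assistant turn.
  let insertAt = idx + 1;
  while (insertAt < messages.length && messages[insertAt].role === "tool") {
    insertAt++;
  }
  return [...messages.slice(0, insertAt), toolMsg, ...messages.slice(insertAt)];
}
```

With this, the next request's `messages` array carries the tool result in the position the API expects.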
@cosbgn thanks for the report. I think you might be using the wrong stream class on the frontend; can you try using
Hi, I tried all possible classes, and also simply reading the chunks from the stream without any class. The tool response never gets sent to the frontend; I think it's an issue with the `runTools` implementation.
Running into the same issue; I can create a custom
Thanks for reporting; I can reproduce the issue. @blechatellier, unfortunately I'm not currently aware of a full workaround. I think you'd have to manually enqueue chunks in your own format and then call

We'll look into fixing this.
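A sketch of the "own format" idea above, assuming each event is framed as one JSON object per line (NDJSON) with a `kind` discriminator so the frontend can tell completion chunks from tool messages. `encodeEvent` and `decodeEvents` are hypothetical helpers, not SDK functions:

```typescript
type StreamEvent =
  | { kind: "chunk"; data: unknown } // a raw ChatCompletionChunk
  | { kind: "tool"; data: unknown }; // a role:"tool" message object

// One JSON object per line, so the frontend can split on "\n".
function encodeEvent(ev: StreamEvent): string {
  return JSON.stringify(ev) + "\n";
}

function decodeEvents(text: string): StreamEvent[] {
  return text
    .split("\n")
    .filter((line) => line.trim().length > 0)
    .map((line) => JSON.parse(line) as StreamEvent);
}
```

On the backend you would enqueue `encodeEvent({ kind: "tool", data: message })` from the runner's `message` handler into your own `ReadableStream`, alongside the encoded chunks; on the frontend, `decodeEvents` lets you route `tool` events back into your messages array.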
@RobertCraigie I'm trying something like this in the backend:

```ts
.on("message", async (message) => {
  if (message.role === "tool" && !added_tool_call_ids.includes(message.tool_call_id)) {
    added_tool_call_ids.push(message.tool_call_id)
    runner._addMessage(message)
  }
})
```

However it fails, because I believe it already has the

Do you have an example of how I could get this to work?
Confirm this is a Node library issue and not an underlying OpenAI API issue
Describe the bug
In my backend I do:
And in my frontend I do:
This works, but after the `role: "assistant"` message with `tool_calls` it doesn't push the tool call results to the messages array. In the frontend, `runner.on('functionCallResult')` never gets called. This means that subsequent calls fail, because after a tool call we must pass the tool call results back.
How can I get the tool call function response to show up in messages?
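The follow-up request fails because the API rejects a history in which an assistant `tool_calls` entry has no matching `role: "tool"` message. A small check like the following (a hypothetical helper, not part of the SDK) makes the missing entries visible before you send the next request:

```typescript
type Msg = {
  role: string;
  tool_calls?: { id: string }[];
  tool_call_id?: string;
};

// Return the ids of tool calls that have no matching role:"tool" message.
function missingToolResults(messages: Msg[]): string[] {
  const answered = new Set(
    messages.filter((m) => m.role === "tool").map((m) => m.tool_call_id)
  );
  return messages
    .filter((m) => m.role === "assistant")
    .flatMap((m) => m.tool_calls ?? [])
    .map((c) => c.id)
    .filter((id) => !answered.has(id));
}
```

If this returns a non-empty array for the history you are about to send, the request will be rejected until the corresponding tool messages are added back in.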
To Reproduce
```ts
const stream = client.beta.chat.completions.runTools({})
return stream.toReadableStream()
```

1. Read the stream with `ChatCompletionStream`.
2. Fail to get the function response as a message.

P.S. I've also tested with `ChatCompletionStreamingRunner` and it's the same.
Code snippets
No response
OS
mac latest
Node version
22
Library version
latest