
tool_calls and other keys in OpenAI API return value set to None are stripped #23

Open
gaborvar opened this issue Jan 26, 2025 · 0 comments

gaborvar commented Jan 26, 2025

Hi @pamelafox
I use message_builder in code that relies on the tool_calls feature of GPT.

model_helper.py assumes that certain keys, if present, are never set to None in the return value of the ChatCompletion API:

raise ValueError(f"Could not encode unsupported message value type: {type(value)}")

Example return value:

ChatCompletionMessage(content='Inform, assist, connect with legal experts, facilitate paperwork.', refusal=None, role='assistant', audio=None, function_call=None, tool_calls=None)

Keys set to None are routinely returned by the model (in my case, gpt-4o 2024-08-06).
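To illustrate, here is a minimal sketch of the failure mode. The `encode_value` helper below is hypothetical, not the actual code in model_helper.py; it only mimics the quoted `raise ValueError(...)` line, and shows that dropping None-valued keys before encoding avoids the error:

```python
def encode_value(value):
    # Hypothetical encoder that, like the quoted line in model_helper.py,
    # raises on value types it does not recognize, including None.
    if isinstance(value, str):
        return value
    if isinstance(value, (list, dict)):
        return str(value)
    raise ValueError(f"Could not encode unsupported message value type: {type(value)}")

# A message shaped like the example return value above, with several keys None:
message = {
    "role": "assistant",
    "content": "Inform, assist, connect with legal experts, facilitate paperwork.",
    "refusal": None,
    "function_call": None,
    "tool_calls": None,
}

# Filtering out None-valued keys before encoding sidesteps the ValueError:
cleaned = {k: v for k, v in message.items() if v is not None}
encoded = {k: encode_value(v) for k, v in cleaned.items()}
```

With the None keys left in, the dict comprehension over `message` would raise on `refusal`.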

This looks like a bug that is masked elsewhere in the code when tool_calls is not used: other code paths drop None-valued keys before they reach this line. For tool_calls to work correctly, however, I need to pass back to the model exactly what it returned previously; the OpenAI API returns an error if I omit certain parts of its earlier responses.
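For context, here is a sketch of the tool-calling round trip that makes stripping problematic. The call id, function name, and tool result below are made up for illustration; the shape follows the Chat Completions convention that the assistant message carrying tool_calls must be echoed back, followed by a "tool" message for each call:

```python
# The assistant turn as returned by the model: content is legitimately None
# here, and tool_calls carries the call the model wants executed.
assistant_message = {
    "role": "assistant",
    "content": None,
    "tool_calls": [
        {
            "id": "call_abc123",  # hypothetical call id
            "type": "function",
            "function": {"name": "lookup_expert", "arguments": '{"area": "family law"}'},
        }
    ],
}

# The follow-up request must include that assistant message unmodified,
# plus a tool message whose tool_call_id matches the call id above.
followup_messages = [
    {"role": "user", "content": "Find me a legal expert."},
    assistant_message,
    {"role": "tool", "tool_call_id": "call_abc123", "content": "Dr. Smith, family law."},
]
```

If a token-counting helper strips the None `content` key or the whole message, the id pairing breaks and the API rejects the request.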

Is using tool_calls together with openai_messages_token_helper a niche scenario, or is it worth fixing in the repo?

See also #21 #20
Thanks
