Self Checks
I confirm that I am using English to submit this report (I have read and agree to the Language Policy).
[FOR CHINESE USERS] Please be sure to submit issues in English, otherwise they will be closed. Thank you! :)
Please do not modify this template :) and fill in all the required fields.
Dify version
0.13.2
Cloud or Self Hosted
Self Hosted (Docker)
Steps to reproduce
I added a locally deployed glm-4-9b-chat model via the OpenAI-API-compatible provider and enabled function calling.
When I ran an agent with this model, I encountered a 400 error. Below is the detailed error message:
[openai_api_compatible] Error: API request failed with status code 400: {"object":"error","message":[{"type":'extra_forbidden', 'loc':('body', 'functions'), 'msg': 'Extra inputs are not permitted', 'input':[{'name': 'Analysis', 'description': '', 'parameters': {'type': 'object', 'properties': {'text': {'type': 'string', 'description': ''}, 'class_name': {'type': 'string', 'description': ''}}, 'required': ['text']}}]}],"type":"BadRequestError","param":null,"code":400}
How should I go about resolving this issue?
✔️ Expected Behavior
I hope to be able to use function calls within the agent
❌ Actual Behavior
Some errors have occurred.
The "Extra inputs are not permitted" error means that the backend serving your locally deployed glm-4-9b-chat rejected the functions field in the request body. This usually indicates either that the model (or the endpoint serving it) does not support function calling in this form, or that the request is not formatted the way the endpoint expects.
Here are some steps you can take to address this issue:
Check Model Compatibility: Ensure that the model you are using supports function calling. According to the Dify platform, models like gpt-3.5 and gpt-4 series support function calling [1][2]. If glm-4-9b-chat does not support this feature, you may need to switch to a compatible model.
Verify Request Format: Make sure the functions section of the request body is formatted correctly. The functions list should contain dictionaries with name, description, and parameters fields, each defined according to the expected input format [3] (see the sketch after this list).
Review Configuration Settings: Double-check your configuration settings to ensure that the function_calling_type is set correctly. If you are using function calls, this should be set to "function_call" [3].
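To make item 2 concrete, here is a minimal sketch of what a well-formed request looks like, assuming a locally served glm-4-9b-chat behind an OpenAI-compatible endpoint. The base URL and API key are placeholders, and this is not the exact payload Dify constructs internally:

```python
import requests

# Placeholder endpoint and key for a locally served OpenAI-compatible backend.
BASE_URL = "http://localhost:8000/v1"
API_KEY = "EMPTY"

payload = {
    "model": "glm-4-9b-chat",
    "messages": [{"role": "user", "content": "Classify this text: ..."}],
    # Legacy OpenAI "functions" format: each entry needs a name, a description,
    # and a JSON-Schema "parameters" object.
    "functions": [
        {
            "name": "Analysis",
            "description": "Classify a piece of text.",
            "parameters": {
                "type": "object",
                "properties": {
                    "text": {"type": "string", "description": "The text to classify."},
                    "class_name": {"type": "string", "description": "The predicted class."},
                },
                "required": ["text"],
            },
        }
    ],
}

resp = requests.post(
    f"{BASE_URL}/chat/completions",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json=payload,
    timeout=60,
)
print(resp.status_code, resp.text)
```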
By ensuring compatibility and correct configuration, you should be able to resolve the error and use function calls within the agent.
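Note that the extra_forbidden error above shows the backend rejecting the functions field outright, which commonly happens when the serving layer (for example, a recent vLLM OpenAI-compatible server) only implements the newer tools parameter rather than the deprecated functions one. That is an assumption about your setup, but if it applies, the same definition would be sent in the tools shape instead, for example:

```python
# Same request as above, but using the newer "tools" wrapper instead of
# the deprecated "functions" field (payload shape only; reuse the request
# code from the previous sketch to send it).
payload = {
    "model": "glm-4-9b-chat",
    "messages": [{"role": "user", "content": "Classify this text: ..."}],
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "Analysis",
                "description": "Classify a piece of text.",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "text": {"type": "string", "description": "The text to classify."},
                        "class_name": {"type": "string", "description": "The predicted class."},
                    },
                    "required": ["text"],
                },
            },
        }
    ],
}
```

If your backend accepts this format, selecting the Tool Call option (rather than Function Call) for function calling in Dify's OpenAI-API-compatible model settings, assuming your Dify version exposes it, should make the agent send requests in this shape.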