Add function calling and structured outputs support #46
base: main
Conversation
Fixes #9

Add support for function calling and structured outputs.

* **README.md**
  - Add a section about function calling and structured outputs.
  - Include examples of using function calling and structured outputs.
  - Mention the future plans for these features.
* **src/model.py**
  - Add support for structured data formats like JSON and XML.
  - Implement function calling capabilities.
  - Include integration with external tools and APIs.
* **src/utils.py**
  - Add utility functions for parsing and generating structured data formats.
  - Include helper functions for function calling.
* **tests/test_model.py**
  - Add unit tests for structured data format support.
  - Include tests for function calling capabilities.
  - Add tests for API integration.
* **tests/test_utils.py**
  - Add unit tests for utility functions related to structured data formats.
  - Include tests for helper functions for function calling.

---

For more details, open the [Copilot Workspace session](https://copilot-workspace.githubnext.com/deepseek-ai/DeepSeek-R1/issues/9?shareId=XXXX-XXXX-XXXX-XXXX).
If you want to use DeepSeek with tools, I created a suite of tools and a wrapper that works with DeepSeek, along with a guide and examples: https://github.com/justinlietz93/agent_tools
Awesome, thank you very much! Can't wait!
def call_function(self, function_name, args):
    prompt = f"Call function {function_name} with args {args} and return the result."
    return self.generate(prompt)
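Note that this returns whatever free text the model generates rather than a structured call. A minimal sketch of how a caller could constrain the output to JSON and parse it, using a hypothetical `FakeModel` stand-in (the real `generate` is not available here, so it returns a canned string):

```python
import json

class FakeModel:
    """Hypothetical stand-in for the PR's model class; generate() would
    normally run inference, here it returns a canned JSON string."""
    def generate(self, prompt):
        return '{"function": "get_weather", "args": {"city": "Paris"}}'

    def call_function(self, function_name, args):
        # Ask the model to answer only with machine-parseable JSON,
        # then parse it so the caller gets a dict, not raw text.
        prompt = (f"Call function {function_name} with args {args}. "
                  'Respond only with JSON: {"function": ..., "args": ...}')
        return json.loads(self.generate(prompt))

print(FakeModel().call_function("get_weather", {"city": "Paris"}))
```

Even then, the caller would still have to look up and execute the named function itself.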
Is this really function calling? Function calling usually just "returns" the function name and params to use, chosen from the list of supplied functions. It doesn't call any function itself, right?
Has anyone else taken a look at this?
Well, technically this is the same as OpenAI. OpenAI cannot run the function either. You would have to process the args and run it yourself.
from openai import OpenAI
client = OpenAI()
tools = [{
"type": "function",
"function": {
"name": "get_weather",
"description": "Get current temperature for a given location.",
"parameters": {
"type": "object",
"properties": {
"location": {
"type": "string",
"description": "City and country e.g. Bogotá, Colombia"
}
},
"required": [
"location"
],
"additionalProperties": False
},
"strict": True
}
}]
completion = client.chat.completions.create(
model="gpt-4o",
messages=[{"role": "user", "content": "What is the weather like in Paris today?"}],
tools=tools
)
print(completion.choices[0].message.tool_calls)
This returns:
[{
"id": "call_12345xyz",
"type": "function",
"function": {
"name": "get_weather",
"arguments": "{\"location\":\"Paris, France\"}"
}
}]
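As the payload above shows, the API only hands back the tool call; the caller executes it and sends the result back. A minimal sketch of that dispatch step, with a local `get_weather` stub standing in for a real weather API (the stub and its return values are assumptions, not part of any real API):

```python
import json

def get_weather(location):
    # Stub; a real implementation would query a weather service.
    return {"location": location, "temperature_c": 21}

# Map tool names from the model's response to local callables.
TOOLS = {"get_weather": get_weather}

def run_tool_calls(tool_calls):
    """Execute each tool call returned by the model and collect results."""
    results = []
    for call in tool_calls:
        fn = TOOLS[call["function"]["name"]]
        # Arguments arrive as a JSON string and must be parsed first.
        args = json.loads(call["function"]["arguments"])
        results.append({"tool_call_id": call["id"], "output": fn(**args)})
    return results

calls = [{
    "id": "call_12345xyz",
    "type": "function",
    "function": {"name": "get_weather",
                 "arguments": "{\"location\":\"Paris, France\"}"},
}]
print(run_tool_calls(calls))
```

Each result would then be appended to `messages` as a `"tool"` role message so the model can compose its final answer.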
Taken from https://platform.openai.com/docs/guides/function-calling
Correct, the models don't call the functions themselves; the host just executes the tools based on the known function definitions. I'm not aware of any models that can do what Anthropic refers to as "sampling" (in the MCP docs), where the tool can use the main LLM and keys to carry out a task.
Hi there! I think you can use llama3.2-1b to parse the tool call. DeepSeek-R1 currently lacks the ability to emit a parseable tool call, but we can ask it to list its needs in text. The llama3.2-1b model, on the other hand, can emit tool calls but lacks the ability to "think". By using these two models together, you can get function calling. Here is how I achieved it.
Do ignore the first part, it is just some personalisation done on my end.
response = ollama.chat(
model = 'deepseek-r1:7b',
messages = [
{
'role': 'user',
'content': open('deepseek_prompt_engineering.txt', encoding = 'utf-8').read() + '今天天气怎么样'
}
])
This returned:
So you can basically truncate the hallucination off. After getting the "function call" from deepseek-r1, you can then pass it through llama3.2-1b.
import ollama
def get_weather():
"""
Gets the current weather in Singapore from the weather API.
Args:
Returns:
dict: The json weather from the weather API.
"""
return {
'weather': 'sunny',
'humidity': '69%'
}
response = ollama.chat(
model = 'llama3.2:1b',
messages = [
{
'role': 'user',
'content': '[Function Call] 调用工具“天气”'
}
],
tools = [get_weather]
)
print(response.message)
This returned:
As you can see, the tool call has been successfully generated.
import ollama
response = ollama.chat(
model = 'deepseek-r1:7b',
messages = [
{
'role': 'user',
'content': open('deepseek_prompt_engineering.txt', encoding = 'utf-8').read() + '今天天气怎么样'
},
{
'role': 'assistant',
'content': '[Function Call] 调用工具“天气”,参数为获取当前实时天气情况。'
},
{
'role': 'user',
'content': '来自函数调用的结果:{"weather": "sunny", "humidity": "69%"}。请你根据以上把信息转达给用户。'
}
]
)
print(response.message.content)
This returned:
As you can see, this method kinda works. If you have any ideas or comments, please comment down below! I did not test this at a very large scale and I am assuming a lot of things here. If possible, please help me test this at a larger scale and tell me whether this method is feasible as a whole. Thanks a lot. P.S. I know both Chinese and English so it's fine communicating in both languages. I am a very young student so I make a lot of mistakes. Please tell me if I have made any :D
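The three-step flow described above (reasoner states its need in text, small model parses it into a tool call, tool result goes back to the reasoner) can be sketched end to end. `ollama.chat` is replaced here with deterministic stand-in chat functions (`fake_reasoner` and `fake_parser` are assumptions, not real APIs), so the pipeline is runnable offline:

```python
def get_weather():
    """Returns the current weather (stubbed, as in the comment above)."""
    return {"weather": "sunny", "humidity": "69%"}

def two_model_answer(user_msg, reasoner_chat, parser_chat, tools):
    # Step 1: the reasoning model states its need in plain text.
    request = reasoner_chat(user_msg)
    # Step 2: the small model turns that text into a tool name to call.
    tool_name = parser_chat(request)
    result = tools[tool_name]()
    # Step 3: hand the tool result back to the reasoning model to relay.
    return reasoner_chat(f"Result of the function call: {result}. "
                         "Please relay this to the user.")

def fake_reasoner(msg):
    # Deterministic stand-in for deepseek-r1 via ollama.chat.
    if "Result of the function call" in msg:
        return "It is sunny today with 69% humidity."
    return '[Function Call] call the "weather" tool'

def fake_parser(request):
    # Deterministic stand-in for llama3.2-1b tool-call parsing.
    return "weather"

print(two_model_answer("How is the weather today?", fake_reasoner,
                       fake_parser, {"weather": get_weather}))
```

In the real setup, `fake_reasoner` and `fake_parser` would each be an `ollama.chat` call against `deepseek-r1:7b` and `llama3.2:1b` respectively, as shown in the comment above.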
Check out my agent tools repo; I have DeepSeek R1 running in an autonomous loop right now which can use tool calls at will.