11 changes: 11 additions & 0 deletions DeepSeek/DeepSeek-V3_1.md
@@ -26,6 +26,17 @@ vllm serve deepseek-ai/DeepSeek-V3.1 \
--served-model-name ds31
```

### Function calling

vLLM also supports calling user-defined functions. Make sure to serve your DeepSeek-V3.1 model with the following arguments. The chat template file is included in the official container and can also be downloaded [here](https://github.com/vllm-project/vllm/blob/main/examples/tool_chat_template_deepseekv31.jinja).

```bash
vllm serve ... \
    --enable-auto-tool-choice \
    --tool-call-parser deepseek_v31 \
    --chat-template examples/tool_chat_template_deepseekv31.jinja
```

**Contributor** (medium):

> The `vllm serve ...` syntax is a bit ambiguous, and using `...` on a separate line is unconventional. It would be clearer to show a more complete command that includes the model name, to indicate that these are additional arguments, similar to the previous example in the document.

Suggested change:

```diff
-vllm serve ...
+vllm serve deepseek-ai/DeepSeek-V3.1 \
```
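Once the server is running with tool calling enabled, requests to the OpenAI-compatible chat completions endpoint can include a `tools` array. Below is a minimal sketch of such a request body; the `get_weather` tool, its schema, and the payload values are illustrative assumptions, not part of the official example.

```python
import json

# Illustrative tool definition; "get_weather" and its schema are assumptions
# for demonstration, not part of the DeepSeek-V3.1 release.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the current weather for a city",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }
]

# Request body for POST /v1/chat/completions on the tool-enabled server.
# With --enable-auto-tool-choice and the deepseek_v31 parser, the server can
# return structured tool calls in choices[0].message.tool_calls.
body = {
    "model": "deepseek-ai/DeepSeek-V3.1",  # use your served model name here
    "messages": [{"role": "user", "content": "What's the weather in Paris?"}],
    "tools": tools,
    "tool_choice": "auto",
}
print(json.dumps(body, indent=2))
```

Any OpenAI-compatible client (for example, the `openai` Python package pointed at the server's `/v1` base URL) can send this payload.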

## Using the Model

### OpenAI Client Example