[Model] Support IQuestCoder model #31575
Conversation
Code Review
This pull request introduces support for two new models, IQuestCoder and IQuestLoopCoder, by adding their respective implementations. The code is well-structured and largely follows the existing patterns for model integration in vLLM. My review focuses on ensuring the correctness and clarity of the new model definitions. I've identified a few areas for improvement, mainly related to cleaning up unused code for pipeline parallelism that seems to have been carried over from template files, and correcting some type hint mismatches. Addressing these points will improve the maintainability and correctness of the new model implementations.
👋 Hi! Thank you for contributing to the vLLM project. 💬 Join our developer Slack at https://slack.vllm.ai to discuss your PR in #pr-reviews, coordinate on features in #feat- channels, or join special interest groups in #sig- channels. Just a reminder: PRs will not trigger a full CI run by default. Instead, only a small and essential subset of CI tests runs to quickly catch errors; you can ask your reviewers to trigger select CI tests on top of those. Once the PR is approved and ready to go, your PR reviewer(s) can run CI to test the changes comprehensively before merging. To run CI, PR reviewers can either add the ready label to the PR or enable auto-merge. If you have any questions, please reach out to us on Slack at https://slack.vllm.ai. 🚀
@yxing-bj you can send email to collaboration@vllm.ai for official collaboration!
Thanks. How should I send an email to establish a collaboration?
Just send an email from your company email address, and then we can better coordinate model releases before they go public.
When can this PR be merged into the main branch?
I replaced the
Documentation preview: https://vllm--31575.org.readthedocs.build/en/31575/
Signed-off-by: yxing <yxing@iquestlab.com>
Does IQuestLab/IQuest-Coder-V1-40B-Instruct support tool calls? Should we use the qwen3_coder template? `vllm serve IQuestLab/IQuest-Coder-V1-40B-Instruct --tensor-parallel-size 4 --trust-remote-code --tool-call-parser qwen3_coder --enable-auto-tool-choice`
We support the qwen3_coder template; you can try it. However, the content generated by the current model is still not quite satisfactory. In the future, we will update the model to support tools better.
I have tried --tool-call-parser qwen3_coder and --tool-call-parser hermes; the tool-call accuracy from the terminal is close to 0. Maybe there is an error somewhere. Could you give me the correct config?
We need the
The same question!
In order to support tool calls, we launch the vLLM service with the following command: Then we send a request and get a response. The following is an example: Then we would get this result: We can also try other BFCL examples; sample user prompts are in BFCL_v4_web_search.json and the tools are in web_search.json.
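The exact request body was not captured in the comment above, so here is a hedged sketch of an OpenAI-compatible tool-call request payload. The tool name, description, and parameters are assumptions for illustration, not copied from BFCL's web_search.json.

```python
import json

# Hypothetical web-search tool schema; the real BFCL web_search.json may differ.
tools = [
    {
        "type": "function",
        "function": {
            "name": "web_search",
            "description": "Search the web for a query.",
            "parameters": {
                "type": "object",
                "properties": {"query": {"type": "string"}},
                "required": ["query"],
            },
        },
    }
]

# Request body for POST /v1/chat/completions on the vLLM server.
payload = {
    "model": "IQuestLab/IQuest-Coder-V1-40B-Instruct",
    "messages": [{"role": "user", "content": "Find the latest vLLM release notes."}],
    "tools": tools,
    "tool_choice": "auto",
    "temperature": 0.6,
}

print(json.dumps(payload, indent=2))
```

With `--enable-auto-tool-choice` and a matching `--tool-call-parser`, a request shaped like this should come back with `tool_calls` entries referencing the declared tool names.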
Hi @yxing-bj, the pre-commit checks have failed. Please run:

```shell
uv pip install pre-commit
pre-commit install
pre-commit run --all-files
```

Then, commit the changes and push to your branch. For future commits, the installed hooks will run automatically.
Please fix pre-commit.
Head branch was pushed to by a user without write access
I have tried this config, but I find the model returns toolCalls.function.name = ***, and this name is not defined in the request's tools list. I want to know whether the tool-call accuracy of the current model has been verified.

```json
{
  "toolCalls": [
    {
      "index": 0,
      "id": "chatcmpl-tool-***",
      "type": "function",
      "function": {
        "arguments": "***",
        "name": "****"
      }
    }
  ]
}
```

with "temperature": 0.6
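One way to detect the mismatch described above is to check each returned tool-call name against the names declared in the request. This is a sketch assuming the `toolCalls` response shape shown; the example tool and hallucinated name are hypothetical.

```python
def undeclared_tool_calls(request_tools, response):
    """Return tool-call names not declared in the request's `tools` list.

    `request_tools` is the OpenAI-style tools list sent in the request;
    `response` is assumed to carry the `toolCalls` shape shown above.
    """
    declared = {t["function"]["name"] for t in request_tools}
    return [
        call["function"]["name"]
        for call in response.get("toolCalls", [])
        if call["function"]["name"] not in declared
    ]


# Hypothetical example: one declared tool, one hallucinated name in the reply.
tools = [{"type": "function", "function": {"name": "web_search", "parameters": {}}}]
response = {
    "toolCalls": [
        {"index": 0, "id": "chatcmpl-tool-1", "type": "function",
         "function": {"arguments": "{}", "name": "made_up_tool"}}
    ]
}
print(undeclared_tool_calls(tools, response))  # ['made_up_tool']
```

A non-empty result means the model invented a tool name, which matches the behavior reported in this thread.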
We have noticed this issue and will fix it in the new version.
A new version of vLLM or of the model? Because I have the issue that, with opencode, it is hallucinating and calling tools that don't exist, like
Signed-off-by: dsuhinin <suhinin.dmitriy@gmail.com>
Purpose
IQuest-Coder-V1 is a new family of code large language models (LLMs) designed to advance autonomous software engineering and code intelligence. We have built a repo for IQuestCoder.
We have uploaded these models to Hugging Face, including IQuestCoder and IQuestLoopCoder. To make them easier for everyone to use, we are adding support for these models to the vLLM platform.
Test Plan
First, we launch a vLLM server.
Then, we use vLLM through its OpenAI-compatible API endpoint.
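The two steps above can be sketched as follows. The model name is taken from the discussion in this thread; the flags and the example request are illustrative, not the author's exact test commands.

```shell
# Step 1: launch a vLLM server for the new model (flags are illustrative).
vllm serve IQuestLab/IQuest-Coder-V1-40B-Instruct \
    --tensor-parallel-size 4 \
    --trust-remote-code

# Step 2: query the OpenAI-compatible endpoint.
curl http://localhost:8000/v1/chat/completions \
    -H "Content-Type: application/json" \
    -d '{
          "model": "IQuestLab/IQuest-Coder-V1-40B-Instruct",
          "messages": [{"role": "user", "content": "Write hello world in Python."}]
        }'
```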
Test Result
Essential Elements of an Effective PR Description Checklist
Update `supported_models.md` and `examples` for a new model.