Conversation
We will need to make sure that we require the version of vLLM that supports this feature.
Sure, we can wait for the vLLM release and then update the vLLM version in the setup in this PR.
You will need to update this: https://github.com/huggingface/trl/blob/main/setup.cfg#L67
Yes, exactly. I will update it once vllm 10.3.0 is released.
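For context, the pin being discussed lives in setup.cfg; a hypothetical sketch of what the bound might look like (section layout and version numbers are illustrative assumptions, not the actual file contents):

```ini
; Sketch only -- not the real trl setup.cfg. Shows the general shape of
; pinning the vllm extra to the supported version window.
[options.extras_require]
vllm =
    vllm>=0.10.2,<0.12.0
```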
Signed-off-by: jiqing-feng <jiqing.feng@intel.com>
For reference: #4122 will pin vLLM.
Issue tracking this PR: #4617 |
Hi @kashif @qgallouedec, I have updated this PR. Please review it. Thanks!
Thanks @jiqing-feng! BTW, it looks like you're pinning to <v0.12.0. What's the blocker now for v0.12.0? It was released last week. |
I just kept the original maximum version bound of <v0.12.0; it's not my change.
My understanding is that TRL already worked with v0.11.2. This fix is required for support of v0.12.0. @qgallouedec, wanted to ensure this PR was on your radar. Thanks! |
Thanks for the PR. Indeed, TRL supports 0.10.2 - 0.11.2 but not 0.12, and this PR should allow us to support 0.12. Ideally we would like to continue supporting 0.10.2 - 0.11.2; I'm not sure whether this PR allows that, so I'll need to check. I'll review it in detail very soon.
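Supporting both version ranges would mean gating the parameter name on the installed vLLM version. A minimal sketch of that idea (hypothetical helper, not TRL's actual implementation; version strings are assumed to be plain `major.minor.patch`):

```python
# Sketch only, not TRL's real code: pick the sampling-parameter name
# based on the vLLM version in use, so 0.10.x-0.11.x keep using
# `guided_decoding` while 0.12+ gets `structured_outputs`.

def _parse(version: str) -> tuple:
    """Parse a "major.minor.patch" string into a comparable int tuple."""
    return tuple(int(part) for part in version.split(".")[:3])

def structured_output_kwargs(vllm_version: str, options: dict) -> dict:
    """Return the options dict under the kwarg name this vLLM version expects."""
    if _parse(vllm_version) >= (0, 12, 0):
        return {"structured_outputs": options}
    return {"guided_decoding": options}
```

A caller would then splat the result into the sampling-params constructor, so the branch lives in one place instead of at every call site.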
docs/source/vllm_integration.md (outdated):

```diff
 > [!WARNING]
-> TRL currently only supports vLLM versions `0.10.2`, `0.11.0`, `0.11.1`, and `0.11.2`. Please ensure you have one of these versions installed to avoid compatibility issues.
+> TRL currently only supports vLLM versions `0.10.3`, `0.11.0`, `0.11.1`, and `0.11.2`. Please ensure you have one of these versions installed to avoid compatibility issues.
```
0.10.3 is not a valid vllm version.
qgallouedec left a comment:
LGTM, thanks! I modified a bit to ensure backward compatibility
The latest vLLM changed `guided_decoding` to `structured_outputs` in vLLM PR 22772. We also need to update this parameter in the vLLM client and vllm serve.

Hi @qgallouedec, would you please review this PR? Thanks!
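On the client side, the rename could be applied to the outgoing request payload in one place; a hedged sketch (the helper name and payload shape are assumptions for illustration, not the actual vLLM client or server API):

```python
# Sketch only: rename the legacy `guided_decoding` key to the new
# `structured_outputs` key when the target vLLM build expects the new name.

def adapt_payload(payload: dict, use_new_name: bool) -> dict:
    """Return a copy of `payload` with the key renamed when needed."""
    adapted = dict(payload)  # shallow copy; don't mutate the caller's dict
    if use_new_name and "guided_decoding" in adapted:
        adapted["structured_outputs"] = adapted.pop("guided_decoding")
    return adapted
```

Keeping the rename in a single adapter like this is what would let one client codebase talk to both the pre-0.12 and post-0.12 parameter names.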