[v1] fix parallel config rank #13445
Conversation
WoosukKwon
left a comment
LGTM. Seems to work on my dev box!
Signed-off-by: youkaichao <youkaichao@gmail.com>
Signed-off-by: Louis Ulmer <ulmerlouis@gmail.com>
#12816 forgot to copy this line of code, so workers cannot get their rank correctly. That information is used in
vllm/vllm/compilation/backends.py
(line 400 in b3942e1).
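To make the failure mode concrete, here is a minimal sketch of the pattern involved. All names (`ParallelConfig`, `init_worker`, `should_log_compilation`) are illustrative stand-ins, not vLLM's actual API: each worker must copy its own rank into the parallel config it was handed, because downstream consumers such as the compilation backend read the rank from that config (for example, to have only rank 0 report progress). Dropping the copy, as #12816 did, leaves every worker with the stale default rank.

```python
# Hypothetical sketch of the bug this PR fixes; names are illustrative,
# not vLLM's actual API.
from dataclasses import dataclass


@dataclass
class ParallelConfig:
    world_size: int = 1
    rank: int = 0  # stale default unless each worker updates it


def init_worker(parallel_config: ParallelConfig, rank: int) -> ParallelConfig:
    # The fix: copy the worker's rank into its parallel config.
    # Without this line, every worker believes it is rank 0.
    parallel_config.rank = rank
    return parallel_config


def should_log_compilation(parallel_config: ParallelConfig) -> bool:
    # Downstream consumer analogous to vllm/compilation/backends.py:
    # only the rank-0 worker reports compilation progress.
    return parallel_config.rank == 0


cfg = init_worker(ParallelConfig(world_size=2), rank=1)
print(cfg.rank, should_log_compilation(cfg))  # → 1 False
```

If `init_worker` skipped the `parallel_config.rank = rank` assignment, every worker would pass the `rank == 0` check and all of them would log, which matches the symptom described above ("workers cannot get their rank correctly").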