
[FIX_FOR_VLLM_LATEST] Fix for hourly #697

Merged

iboiko-habana merged 2 commits into main from adobrzyn/hourly_fix on Dec 8, 2025

Conversation

@adobrzyn (Collaborator) commented Dec 8, 2025

Signed-off-by: Dobrzyniewicz, Agata <agata.dobrzyniewicz@intel.com>
Copilot AI (Contributor) left a comment

Pull request overview

This PR adds a missing remote_request_id field to a request configuration dictionary to restore compatibility with the latest version of vLLM. The change addresses breaking changes introduced in vllm-project/vllm#29665.

Key Changes:

  • Added remote_request_id parameter to the request configuration in send_request_to_service function
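The change above can be sketched as follows. This is an illustrative sketch only: the function and field names other than remote_request_id and send_request_to_service are assumptions, not the repository's actual code. The point is simply that newer vLLM (after vllm-project/vllm#29665) expects a remote_request_id key in the request configuration.

```python
import uuid


def build_request_config(prompt: str, request_id: str) -> dict:
    """Build the payload that send_request_to_service would send.

    Hypothetical sketch of the fixed request configuration; the real
    function in the PR constructs a similar dict, and the fix adds the
    remote_request_id key required by newer vLLM.
    """
    return {
        "prompt": prompt,
        "request_id": request_id,
        # The fix: newer vLLM expects this field in the request config.
        "remote_request_id": request_id,
    }


config = build_request_config("Hello", str(uuid.uuid4()))
```

Without the remote_request_id key, requests built this way would fail against a vLLM build that includes vllm-project/vllm#29665.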


Signed-off-by: Dobrzyniewicz, Agata <agata.dobrzyniewicz@intel.com>
@github-actions bot commented Dec 8, 2025

✅ CI Passed

All checks passed successfully against the following vllm commit:
408cf42f67dbcd50027fcd0f6ba35df83ced9107

@iboiko-habana changed the title from "[FIX_FOR_VLLM_LATEST] Maybe fix for hourly" to "[FIX_FOR_VLLM_LATEST] Fix for hourly" on Dec 8, 2025
@iboiko-habana merged commit de92b87 into main on Dec 8, 2025
46 checks passed
@adobrzyn deleted the adobrzyn/hourly_fix branch on January 9, 2026

Labels: None yet

Projects: None yet


3 participants