
fix(grpc): update vLLM imports for inputs reorganization#1033

Merged
CatherineSue merged 4 commits into main from fix/vllm-mm-inputs-rename
Apr 3, 2026

Conversation

@CatherineSue
Member

@CatherineSue CatherineSue commented Apr 3, 2026

Description

Problem

vLLM reorganized its inputs module in vllm-project/vllm#35182 (commit ba2f0acc2, 2026-03-25), renaming and relocating several symbols. This change first appeared in v0.18.2rc0 and is now officially released in v0.19.0:

  • vllm.multimodal.inputs.mm_inputs → vllm.inputs.engine.mm_input
  • vllm.multimodal.inputs.MultiModalInputs → vllm.inputs.engine.MultiModalInput
  • vllm.inputs.token_inputs → vllm.inputs.engine.tokens_input

The gRPC servicer was importing the old names, which breaks on vLLM >= v0.19.0.
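Because the rename is purely mechanical, one way to bridge both releases is a version-tolerant import shim. The sketch below is illustrative only — the helper name `resolve_vllm_input_symbols` is not from this PR — and assumes exactly the old/new paths listed above:

```python
def resolve_vllm_input_symbols():
    """Return (MultiModalInput, mm_input, tokens_input), preferring the
    vllm.inputs.engine location introduced in vLLM v0.19.0 and falling
    back to the pre-reorganization names on older releases."""
    try:
        # New locations (vLLM >= 0.19.0, after vllm-project/vllm#35182)
        from vllm.inputs.engine import MultiModalInput, mm_input, tokens_input
    except ImportError:
        # Old locations (vLLM < 0.19.0)
        from vllm.multimodal.inputs import MultiModalInputs as MultiModalInput
        from vllm.multimodal.inputs import mm_inputs as mm_input
        from vllm.inputs import token_inputs as tokens_input
    return MultiModalInput, mm_input, tokens_input
```

The PR instead opts for a hard dependency bump to vllm>=0.19.0, which keeps the servicer free of compatibility branches.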

Solution

Update the gRPC servicer imports to use the new module paths and renamed symbols from vllm.inputs.engine.

Changes

  • Update grpc_servicer/smg_grpc_servicer/vllm/servicer.py:
    • Import MultiModalInput (was MultiModalInputs) from vllm.inputs.engine
    • Import mm_input (was mm_inputs) from vllm.inputs.engine
    • Import tokens_input (was token_inputs) from vllm.inputs.engine
    • Update type annotations and function calls to use the new names

Test Plan

  • Verify imports resolve correctly against vLLM v0.19.0
  • Run existing gRPC servicer tests to ensure no regression
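The first test-plan step can be automated with a small resolution check. A sketch, with the module/symbol list taken from this PR and the helper name hypothetical:

```python
import importlib

# Symbols this PR expects to resolve on vLLM >= 0.19.0
REQUIRED_SYMBOLS = [
    ("vllm.inputs.engine", "MultiModalInput"),
    ("vllm.inputs.engine", "mm_input"),
    ("vllm.inputs.engine", "tokens_input"),
]


def check_imports(required=None):
    """Return the (module, name) pairs that fail to resolve;
    an empty list means every expected symbol is importable."""
    missing = []
    for module, name in (required if required is not None else REQUIRED_SYMBOLS):
        try:
            mod = importlib.import_module(module)
        except ImportError:
            missing.append((module, name))
            continue
        if not hasattr(mod, name):
            missing.append((module, name))
    return missing
```

In an environment with vLLM v0.19.0 installed, `check_imports()` should return an empty list.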
Checklist
  • cargo +nightly fmt passes
  • cargo clippy --all-targets --all-features -- -D warnings passes
  • (Optional) Documentation updated
  • (Optional) Please join us on Slack #sig-smg to discuss, review, and merge PRs

Summary by CodeRabbit

  • Refactor
    • Updated internal tokenization and multimodal input handling to align with the newer engine-level interfaces, improving compatibility and maintainability.
  • Chores
    • Bumped the minimum vLLM dependency to a more recent release to match the updated interfaces.

vLLM upstream (#35182) reorganized multimodal input types:
- MultiModalInputs -> MultiModalInput (moved to vllm.inputs.engine)
- mm_inputs() -> mm_input() (moved to vllm.inputs.engine)

Update the gRPC servicer imports and usages to match. This change
is prepared ahead of the vLLM release that includes the reorganization
and should be merged when that version is available.

Signed-off-by: Chang Su <chang.s.su@oracle.com>

# Conflicts:
#	grpc_servicer/smg_grpc_servicer/vllm/servicer.py
@github-actions github-actions bot added the grpc gRPC client and router changes label Apr 3, 2026
@coderabbitai

coderabbitai bot commented Apr 3, 2026

Caution

Review failed

The pull request is closed.

ℹ️ Recent review info
⚙️ Run configuration

Configuration used: Organization UI

Review profile: ASSERTIVE

Plan: Pro

Run ID: 7ca3a81f-0c26-4cf5-a93f-d6583046a0a0

📥 Commits

Reviewing files that changed from the base of the PR and between a4a79f7 and 194028c.

📒 Files selected for processing (1)
  • grpc_servicer/pyproject.toml

📝 Walkthrough

Walkthrough

Replaced vLLM input constructors in the gRPC servicer: switched the tokenized prompt import to vllm.inputs.engine.tokens_input and swapped multimodal input construction/type from mm_inputs/VllmMultiModalInputs to mm_input/VllmMultiModalInput in _build_preprocessed_mm_inputs.

Changes

  • vLLM input constructor updates — grpc_servicer/smg_grpc_servicer/vllm/servicer.py: Replaced the vllm.inputs.token_inputs import with vllm.inputs.engine.tokens_input; changed _build_preprocessed_mm_inputs to construct multimodal inputs via vllm.inputs.engine.mm_input and updated the return annotation from VllmMultiModalInputs to VllmMultiModalInput; preserved the argument structure (prompt_token_ids, mm_kwargs, mm_hashes, mm_placeholders, prompt).
  • Dependency bump — grpc_servicer/pyproject.toml: Raised the optional vllm extra from vllm>=0.17.0 to vllm>=0.19.0.
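The dependency bump amounts to a one-line change in the optional extras. A sketch of what the constraint change implies for grpc_servicer/pyproject.toml (the surrounding table contents in the real file may differ):

```toml
[project.optional-dependencies]
# vllm.inputs.engine first shipped in v0.19.0 (v0.18.2rc0 pre-release)
vllm = ["vllm>=0.19.0"]  # previously: ["vllm>=0.17.0"]
```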


Estimated code review effort

🎯 2 (Simple) | ⏱️ ~10 minutes

Suggested labels

dependencies

Suggested reviewers

  • njhill
  • slin1237

Poem

🐰 Hopped through code with nimble paws,
Swapped old imports for fresher laws,
Tokens trimmed and mm stitched tight,
Engine-ready now — what a sight!
A tiny hop, then off to write.

🚥 Pre-merge checks | ✅ 3
✅ Passed checks (3 passed)
  • Description Check — ✅ Passed: Check skipped; CodeRabbit's high-level summary is enabled.
  • Docstring Coverage — ✅ Passed: Docstring coverage is 100.00%, which meets the required 80.00% threshold.
  • Title check — ✅ Passed: The title accurately describes the main change: updating vLLM imports in the gRPC servicer due to vLLM's inputs module reorganization. It is specific, clear, and directly related to the primary objective of the PR.





@gemini-code-assist gemini-code-assist bot left a comment


Code Review

This pull request updates the vLLM multimodal input handling by migrating from the deprecated mm_inputs and MultiModalInputs to the newer mm_input and MultiModalInput classes from vllm.inputs.engine. These changes include updating imports, return type hints, and function calls within the _build_preprocessed_mm_inputs method. I have no feedback to provide.

…nization

Part of the vLLM #35182 reorganization that moved symbols from
vllm.inputs.data to vllm.inputs.engine with renames.

Signed-off-by: Chang Su <chang.s.su@oracle.com>
Signed-off-by: Chang Su <chang.s.su@oracle.com>

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1


Inline comments:
In `@grpc_servicer/smg_grpc_servicer/vllm/servicer.py`:
- Around line 18-19: The project imports VllmMultiModalInput, mm_input, and
tokens_input from vllm (in grpc_servicer/smg_grpc_servicer/vllm/servicer.py),
which require vllm>=0.19.0; update the dependency constraint in
grpc_servicer/pyproject.toml by changing the vllm requirement to vllm =
["vllm>=0.19.0"] so the installed vllm release matches the new API
reorganization.

ℹ️ Review info
⚙️ Run configuration

Configuration used: Organization UI

Review profile: ASSERTIVE

Plan: Pro

Run ID: c8443587-019b-4c52-8cea-1408800ea702

📥 Commits

Reviewing files that changed from the base of the PR and between 57a30b4 and a4a79f7.

📒 Files selected for processing (1)
  • grpc_servicer/smg_grpc_servicer/vllm/servicer.py

@CatherineSue CatherineSue changed the title fix(grpc): update vLLM multimodal imports for inputs reorganization fix(grpc): update vLLM imports for inputs reorganization Apr 3, 2026
The new imports from vllm.inputs.engine require vLLM >=0.19.0.

Signed-off-by: Chang Su <chang.s.su@oracle.com>
@github-actions github-actions bot added the dependencies Dependency updates label Apr 3, 2026
@CatherineSue CatherineSue merged commit 0b02060 into main Apr 3, 2026
11 of 13 checks passed
@CatherineSue CatherineSue deleted the fix/vllm-mm-inputs-rename branch April 3, 2026 17:07

Labels

  • dependencies — Dependency updates
  • grpc — gRPC client and router changes
