[EncoderCacheManager] Remove unnecessary copy#32800

Merged
DarkLight1337 merged 1 commit into vllm-project:main from lgeiger:encode-cache-copy on Jan 24, 2026
Conversation

lgeiger (Contributor) commented on Jan 21, 2026

Purpose

#13173 introduced this copy. Since then, `self.get_cached_input_ids` has been changed to always return a new set, so there is no need to copy it again.

    def get_cached_input_ids(self, request: Request) -> set[int]:
        """Get all cached multimodal input IDs for a request.

        Returns the set of input IDs whose `mm_hash` exists in the cache map.
        This includes entries that are currently unreferenced (and thus present
        in `freeable`); for such entries, freeing for this request will be a
        no-op.
        """
        return {
            input_id
            for input_id in range(len(request.mm_features))
            if request.mm_features[input_id].identifier in self.cached
        }

    def get_cached_input_ids(self, request: Request) -> set[int]:
        return set(range(len(request.mm_features)))
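To illustrate why an extra `.copy()` on the result is redundant, here is a minimal standalone sketch. `FakeFeature` and `FakeManager` are simplified stand-ins for the real vLLM `Request`/`EncoderCacheManager` types, used only to show the set-comprehension behavior:

```python
class FakeFeature:
    """Simplified stand-in for a multimodal feature with a cache identifier."""
    def __init__(self, identifier: str):
        self.identifier = identifier


class FakeManager:
    """Simplified stand-in for EncoderCacheManager."""
    def __init__(self, cached: set[str]):
        self.cached = cached

    def get_cached_input_ids(self, features: list[FakeFeature]) -> set[int]:
        # A set comprehension always builds a brand-new set object,
        # so the caller already owns an independent set.
        return {
            input_id
            for input_id in range(len(features))
            if features[input_id].identifier in self.cached
        }


manager = FakeManager(cached={"hash_a", "hash_c"})
features = [FakeFeature("hash_a"), FakeFeature("hash_b"), FakeFeature("hash_c")]

ids = manager.get_cached_input_ids(features)
assert ids == {0, 2}

# Each call returns a distinct object; mutating one result cannot affect
# a later call, so wrapping the call in an extra .copy() buys nothing.
ids.discard(0)
assert manager.get_cached_input_ids(features) == {0, 2}
```

Because the method constructs its return value fresh on every call, callers such as `free` can iterate or mutate the result directly.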

Test Plan

CI

Signed-off-by: Lukas Geiger <lukas.geiger94@gmail.com>

@gemini-code-assist gemini-code-assist bot left a comment


Code Review

This pull request removes a redundant `.copy()` call on the result of `get_cached_input_ids` within the `EncoderCacheManager.free` method. Since `get_cached_input_ids` already constructs and returns a new set, the additional copy is unnecessary. The change is correct and improves efficiency by avoiding an extra object allocation. I have reviewed the changes and found no issues.

@DarkLight1337 DarkLight1337 enabled auto-merge (squash) January 24, 2026 12:47
@github-actions github-actions bot added the `ready` label (ONLY add when PR is ready to merge/full CI is needed) Jan 24, 2026
@DarkLight1337 DarkLight1337 merged commit 5fa0f6e into vllm-project:main Jan 24, 2026
50 of 52 checks passed
@lgeiger lgeiger deleted the encode-cache-copy branch January 24, 2026 17:23
ms1design pushed a commit to ms1design/vllm that referenced this pull request Jan 24, 2026
Signed-off-by: Lukas Geiger <lukas.geiger94@gmail.com>
Signed-off-by: Mieszko Syty <mieszko@ms1design.pl>
cwazai pushed a commit to cwazai/vllm that referenced this pull request Jan 25, 2026
Signed-off-by: Lukas Geiger <lukas.geiger94@gmail.com>
Signed-off-by: 陈建华 <1647430658@qq.com>
ItzDEXX pushed a commit to ItzDEXX/vllm that referenced this pull request Feb 19, 2026
Signed-off-by: Lukas Geiger <lukas.geiger94@gmail.com>

Labels

ready (ONLY add when PR is ready to merge/full CI is needed), v1

2 participants