
[BugFix] Fix cmake based incremental install (wrong vllm install dir)#35773

Merged
vllm-bot merged 2 commits into vllm-project:main from neuralmagic:lwilkinson/fix-install
Mar 3, 2026
Conversation

@LucasWilkinson
Collaborator

@LucasWilkinson LucasWilkinson commented Mar 2, 2026

Alternative to #35768 (broken by 8b5014d). When using

cmake --build --preset release --target install

all components are installed at once, unlike setup.py-based methods such as python setup.py build_ext --inplace or uv pip install -e ., which install components one by one. This led to vllm being appended to the install path twice.

Signed-off-by: Lucas Wilkinson <lwilkins@redhat.com>
@mergify Bot added the ci/build and bug (Something isn't working) labels Mar 2, 2026
Contributor

@gemini-code-assist gemini-code-assist Bot left a comment


Code Review

This pull request addresses a bug in the CMake-based installation process where installing multiple vllm-flash-attn components at once would lead to an incorrect, doubly-nested installation directory. The original code used a foreach loop that modified the CMAKE_INSTALL_PREFIX for each component, causing the issue. The fix replaces this loop with install(CODE ... ALL_COMPONENTS), which correctly ensures the prefix modification logic runs only once per installation. This change is a clean and effective solution to the problem.
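The before/after pattern described in the review can be sketched roughly as follows. This is a minimal illustration only: the component list variable and component names are assumptions for the sketch, not the actual vLLM build code.

```cmake
# Buggy pattern: one install(CODE) is registered per component, so a full
# install (which runs every registered script) appends "vllm" to the
# prefix once per component, producing a doubly-nested install directory.
foreach(_component IN LISTS _VLLM_FA_COMPONENTS)  # hypothetical list name
  install(CODE
    "set(CMAKE_INSTALL_PREFIX \"\${CMAKE_INSTALL_PREFIX}/vllm\")"
    COMPONENT ${_component})
endforeach()

# Fixed pattern (CMake >= 3.21): ALL_COMPONENTS registers the script a
# single time while still applying it to component-specific installs, so a
# full install executes the prefix adjustment exactly once.
install(CODE
  "set(CMAKE_INSTALL_PREFIX \"\${CMAKE_INSTALL_PREFIX}/vllm\")"
  ALL_COMPONENTS)
```

This matches the failure mode in the PR description: setup.py-based flows install components one at a time (each invocation runs the script once, so the foreach version happens to work), while `cmake --install` of the whole tree runs every registered script in one pass.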

Member

@yewentao256 yewentao256 left a comment


Verified that this fixes the issue

@yewentao256 yewentao256 added the ready ONLY add when PR is ready to merge/full CI is needed label Mar 2, 2026
@LucasWilkinson LucasWilkinson enabled auto-merge (squash) March 2, 2026 19:27
@vllm-bot vllm-bot merged commit f44d1dd into vllm-project:main Mar 3, 2026
113 of 116 checks passed
Copilot AI pushed a commit to machov/vllm that referenced this pull request Mar 10, 2026
…vllm-project#35773)

Signed-off-by: Lucas Wilkinson <lwilkins@redhat.com>
Co-authored-by: Wentao Ye <44945378+yewentao256@users.noreply.github.com>
avinashsingh77 pushed a commit to avinashsingh77/vllm that referenced this pull request Mar 12, 2026
…vllm-project#35773)

Signed-off-by: Lucas Wilkinson <lwilkins@redhat.com>
Co-authored-by: Wentao Ye <44945378+yewentao256@users.noreply.github.com>
wendyliu235 pushed a commit to wendyliu235/vllm-public that referenced this pull request Mar 18, 2026
…vllm-project#35773)

Signed-off-by: Lucas Wilkinson <lwilkins@redhat.com>
Co-authored-by: Wentao Ye <44945378+yewentao256@users.noreply.github.com>

Labels

bug (Something isn't working), ci/build, ready (ONLY add when PR is ready to merge/full CI is needed)


3 participants