
[Bugfix] restore legacy stage config precedence #2663

Merged
hsliuustc0106 merged 1 commit into vllm-project:main from xiaohajiayou:bugfix/restore-stage-config-precedence
Apr 10, 2026

Conversation

xiaohajiayou (Contributor) commented Apr 10, 2026


Purpose

This PR fixes an unintended behavior introduced by #2076 in the legacy
load_and_resolve_stage_configs path.

After #2076, CLI runtime args can override default stage configs. However,
CLI default values also started participating in the merge and could override
the YAML's default stage config values even when the user did not explicitly
set those CLI args.
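The root cause is a generic pitfall: a parsed-args object reports a parser default exactly the same way it reports a value the user actually typed, so a naive dict merge cannot distinguish the two. The sketch below is not the actual vllm-omni code; the option name and values are hypothetical, but it reproduces the failure mode described above.

```python
import argparse

# Hypothetical CLI option with a parser-level default.
parser = argparse.ArgumentParser()
parser.add_argument("--max-batch-size", type=int, default=8)

# Hypothetical stage YAML default that should win when the user is silent.
yaml_stage_config = {"max_batch_size": 32}

# The user passes nothing, yet argparse still yields max_batch_size=8.
args = vars(parser.parse_args([]))

# A merge where the CLI dict wins clobbers the YAML value with the
# CLI *default*, even though the user never set --max-batch-size.
merged = {**yaml_stage_config, **args}
print(merged["max_batch_size"])  # prints 8, not the YAML's 32
```

A common remedy in argparse-based code is `default=argparse.SUPPRESS`, which keeps unset options out of the namespace entirely; this PR instead restores the legacy precedence in the merge itself.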

This PR restores the previous behavior for the legacy loader by switching
prefer_stage_engine_args back to True when loading model-default stage
configs.

Behavior

Before this fix:

  • CLI defaults could override default stage YAML values
  • default stage configs could be unintentionally polluted by CLI defaults

After this fix:

  • stage YAML defaults take precedence again in the legacy model-default load path
  • only intentional runtime overrides should affect resolved configs
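The before/after behavior can be sketched as a single merge helper whose flag decides which side wins on conflict. This is a hypothetical signature, not the real `load_and_resolve_stage_configs` implementation; only the flag name `prefer_stage_engine_args` and its restored-to-True semantics come from this PR.

```python
def resolve_stage_config(stage_yaml: dict, engine_args: dict,
                         prefer_stage_engine_args: bool) -> dict:
    """Merge a stage's YAML config with CLI engine args.

    prefer_stage_engine_args=True  -> stage YAML values win on conflict
                                      (the restored legacy behavior)
    prefer_stage_engine_args=False -> CLI values win, including CLI
                                      defaults (the buggy behavior)
    """
    if prefer_stage_engine_args:
        return {**engine_args, **stage_yaml}
    return {**stage_yaml, **engine_args}

# Hypothetical values for illustration.
stage_yaml = {"max_batch_size": 32, "dtype": "bf16"}
engine_args = {"max_batch_size": 8, "tp_size": 2}

# With the fix: the YAML's max_batch_size=32 survives, while keys the
# YAML does not define (tp_size) still come from the engine args.
print(resolve_stage_config(stage_yaml, engine_args, True))
```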

Rationale

The affected path is the legacy load_and_resolve_stage_configs chain, which
is planned to be removed by the config refactor tracked in #2072.

This PR is a minimal rollback of the legacy precedence behavior, intended as a temporary fix until the #2072 refactor is completed.



Signed-off-by: xiaohajiayou <923390377@qq.com>
@chatgpt-codex-connector

Codex usage limits have been reached for code reviews. Please check with the admins of this repo to increase the limits by adding credits.
Credits must be used to enable repository wide code reviews.

@hsliuustc0106 hsliuustc0106 merged commit 0c46ba5 into vllm-project:main Apr 10, 2026
5 checks passed
Sy0307 pushed a commit to Sy0307/vllm-omni that referenced this pull request Apr 10, 2026
Signed-off-by: xiaohajiayou <923390377@qq.com>
daixinning pushed a commit to daixinning/vllm-omni that referenced this pull request Apr 13, 2026
Signed-off-by: xiaohajiayou <923390377@qq.com>