
[BugFix] Add BAGEL single-stage diffusion config and fix multiple <im_start><im_end> bug#2381

Merged — princepride merged 2 commits into vllm-project:main from princepride:add-bagel-single-stage-yaml on Apr 1, 2026
Conversation

@princepride (Collaborator) commented Mar 31, 2026

Summary

  • Add a `bagel_single_stage.yaml` stage config for running BAGEL as a single-stage diffusion pipeline on a single GPU, with sensible defaults (`gpu_memory_utilization=0.45`, `enforce_eager`, `seed=52`).
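The PR description does not reproduce the config file itself. A minimal sketch of what `bagel_single_stage.yaml` might contain, based only on the defaults listed above (the field names and layout are assumptions, not the actual file):

```yaml
# Hypothetical sketch of bagel_single_stage.yaml. Only the three values
# named in the PR summary come from the source; everything else is assumed.
stages:
  - name: bagel_diffusion          # assumed stage name
    gpu_memory_utilization: 0.45   # from the PR summary
    enforce_eager: true            # from the PR summary
    seed: 52                       # from the PR summary
```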

Test plan

python3 ./examples/offline_inference/bagel/end2end.py --stage-configs-path vllm_omni/model_executor/stage_configs/bagel_single_stage.yaml
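The "multiple `<im_start><im_end>`" fix itself is not detailed in the description. Purely as an illustration of the kind of bug the title describes, a minimal sketch of collapsing accidentally repeated special-token pairs in a prompt might look like this (function and constant names are hypothetical, not from the PR):

```python
import re

# Hypothetical illustration only — not the actual fix in this PR.
# Collapses runs of repeated "<im_start><im_end>" pairs down to a single pair.
IM_PAIR = "<im_start><im_end>"

def dedup_im_pairs(prompt: str) -> str:
    """Replace two or more consecutive <im_start><im_end> pairs with one."""
    return re.sub(f"(?:{re.escape(IM_PAIR)}){{2,}}", IM_PAIR, prompt)
```

For example, a prompt that accidentally accumulated the pair twice would be normalized back to a single occurrence, while prompts with one pair pass through unchanged.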

Result

Before:
image

After:
image

Signed-off-by: princepride <wangzhipeng628@gmail.com>
@hsliuustc0106 added the `ready label to trigger buildkite CI` label on Mar 31, 2026
@princepride changed the title from "[Config] Add BAGEL single-stage diffusion config" to "[Config] Add BAGEL single-stage diffusion config and fix multiple <im_start><im_end> bug" on Mar 31, 2026
@princepride changed the title from "[Config] Add BAGEL single-stage diffusion config and fix multiple <im_start><im_end> bug" to "[BugFix] Add BAGEL single-stage diffusion config and fix multiple <im_start><im_end> bug" on Mar 31, 2026
@gcanlin (Collaborator) left a comment:


LGTM

@princepride princepride merged commit 369f301 into vllm-project:main Apr 1, 2026
7 of 8 checks passed
vraiti pushed a commit to vraiti/vllm-omni that referenced this pull request Apr 9, 2026
…m_start><im_end>` bug (vllm-project#2381)

Signed-off-by: princepride <wangzhipeng628@gmail.com>

Labels

ready label to trigger buildkite CI

3 participants