[Test][HunyuanImage3] Add e2e offline inference smoke tests for I2T, T2I #2986

Closed
TaffyOfficial wants to merge 2 commits into vllm-project:main from TaffyOfficial:feat/hunyuan-image3-e2e-tests

Conversation

@TaffyOfficial
Contributor

@TaffyOfficial TaffyOfficial commented Apr 21, 2026

Summary

Add missing e2e smoke tests for all three HunyuanImage-3.0 offline inference pipelines.
Currently only test_hunyuanimage3_text2img.py (CLIP-based accuracy test) exists — there
are no basic smoke tests that verify each pipeline topology can load and produce output.

Changes

Tests

  • tests/e2e/offline_inference/test_hunyuanimage3_i2t.py — Image-to-Text (single AR stage, text output)
  • tests/e2e/offline_inference/test_hunyuanimage3_t2i.py — Text-to-Image (DiT-only)

All tests are marked @pytest.mark.advanced_model and require ≥ 4 CUDA GPUs (TP4).
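The gating described above can be sketched as a module-level pytest guard. The `advanced_model` marker name comes from this PR; the device-counting helper below is an illustrative stand-in (a plain `torch` probe), not the actual utility the suite uses:

```python
# Sketch of the gating used by these smoke tests: the advanced_model
# marker plus a skip when fewer than 4 CUDA GPUs are available (TP4).
# The helper is a hypothetical stand-in for the suite's device check.
import pytest


def cuda_device_count() -> int:
    """Return the number of visible CUDA GPUs, 0 when torch/CUDA is absent."""
    try:
        import torch
        return torch.cuda.device_count()
    except ImportError:
        return 0


pytestmark = [
    pytest.mark.advanced_model,
    pytest.mark.skipif(
        cuda_device_count() < 4,
        reason="HunyuanImage-3.0 TP4 tests need >= 4 CUDA GPUs",
    ),
]
```

With this in place, both test modules are collected everywhere but only executed on hosts that satisfy the TP4 requirement.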

Examples

  • Add examples/offline_inference/hunyuan_image3/image_to_image.py for IT2I pipeline
  • Update image_to_text.py to use build_prompt() and stage_configs_path
  • Minor fix in prompt_utils.py

Test Plan

pytest tests/e2e/offline_inference/test_hunyuanimage3_i2t.py -v -m advanced_model
pytest tests/e2e/offline_inference/test_hunyuanimage3_t2i.py -v -m advanced_model
---

@chatgpt-codex-connector

Codex usage limits have been reached for code reviews. Please check with the admins of this repo to increase the limits by adding credits.
Credits must be used to enable repository wide code reviews.

@TaffyOfficial TaffyOfficial changed the title [WIP][Test][HunyuanImage3] Add e2e offline inference smoke tests for I2T, … [WIP][Test][HunyuanImage3] Add e2e offline inference smoke tests for I2T, T2I Apr 21, 2026
@TaffyOfficial TaffyOfficial force-pushed the feat/hunyuan-image3-e2e-tests branch 3 times, most recently from 6f49a9d to 48a494f Compare April 21, 2026 09:36
@TaffyOfficial TaffyOfficial changed the title [WIP][Test][HunyuanImage3] Add e2e offline inference smoke tests for I2T, T2I [Test][HunyuanImage3] Add e2e offline inference smoke tests for I2T, T2I Apr 21, 2026
…nd T2I pipelines

Add missing e2e smoke tests for HunyuanImage-3.0 offline inference:
- test_hunyuanimage3_i2t.py: Image-to-Text (single AR stage, text output)
- test_hunyuanimage3_t2i.py: Text-to-Image (DiT-only)

Both tests are marked @pytest.mark.advanced_model and require >= 4 CUDA GPUs (TP4).

Signed-off-by: TaffyOfficial <2324465096@qq.com>
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
@TaffyOfficial TaffyOfficial force-pushed the feat/hunyuan-image3-e2e-tests branch from 48a494f to 4819192 Compare April 21, 2026 09:40
@TaffyOfficial
Contributor Author

@hsliuustc0106 This is the test case that was overlooked during the last merge.

@TaffyOfficial
Contributor Author

@hsliuustc0106 need review

…re to nightly CI

- Pin first 20 chars of AR output ("The image is a solid") against the HF greedy
  reference captured via scripts/bench/hf_i2t_pr2986_baseline.py. 20 chars is the
  longest stable common prefix; vllm-omni vs HF is not bitwise-alignable past
  that point (sampler tie-break on temp=0 + MoE numerics).
- Add nightly Buildkite step under nightly-diffusion-x2iat-group running both
  test_hunyuanimage3_i2t.py and test_hunyuanimage3_t2i.py on 4x H100, gated by
  the group's existing labels (NIGHTLY / nightly-test / diffusion-x2iat-test).
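The prefix pin described in the first bullet can be sketched as a plain string assertion. The 20-character reference string is quoted from the commit message above; the helper name is illustrative:

```python
# Sketch of the prefix pin: greedy (temperature=0) AR output is only
# bitwise-stable against the HF reference for its first 20 characters,
# so the test compares just that prefix rather than the full caption.
HF_GREEDY_PREFIX = "The image is a solid"  # captured HF greedy reference


def check_i2t_prefix(generated: str, prefix: str = HF_GREEDY_PREFIX) -> None:
    """Fail if the generated caption diverges within the pinned prefix."""
    got = generated[:len(prefix)]
    assert got == prefix, f"AR prefix drifted: {got!r} != {prefix!r}"


# Past the prefix, sampler tie-breaks and MoE numerics may differ, so a
# longer caption that shares the first 20 characters still passes.
check_i2t_prefix("The image is a solid blue square on a white background.")
```

Pinning only the longest stable common prefix keeps the check deterministic without asserting bitwise alignment the two stacks cannot guarantee.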

Signed-off-by: zuiho-kai <wu15922848573@outlook.com>
Signed-off-by: TaffyOfficial <2324465096@qq.com>
@TaffyOfficial TaffyOfficial force-pushed the feat/hunyuan-image3-e2e-tests branch from 9517ac7 to 0ad2e82 Compare May 4, 2026 03:37
@Gaohan123 Gaohan123 added this to the v0.20.0 milestone May 4, 2026
@hsliuustc0106
Collaborator

How does this relate to #3332, given that I2T is already covered there?

@TaffyOfficial
Contributor Author

> How does this relate to #3332, given that I2T is already covered there?

That PR handled the CI wiring and this one carried the test cases. This PR is now abandoned; use #3322 instead.
