Conversation

@kfhfar (Contributor) commented on Oct 9, 2025

Purpose

Part of #22041 and the CI sprint. This PR prunes the kernel mamba tests, using #22936 and #22939 for reference.

Test Plan

pytest tests/kernels/mamba

Test Result

Still to complete

Essential Elements of an Effective PR Description Checklist
  • The purpose of the PR, such as "Fix some issue (link existing issues this PR will resolve)".
  • The test plan, such as providing the test command.
  • The test results, such as pasting a before/after comparison or e2e results.
  • (Optional) The necessary documentation update, such as updating supported_models.md and examples for a new model.
  • (Optional) Release notes update. If your change is user facing, please update the release notes draft in the Google Doc.

@yewentao256 (Member) left a comment:

Thanks for the work!
Generally, please don't change functionality, e.g. by removing a test dtype. Removing redundant cases is fine, e.g. reducing `[128, 256, 512, 1024, 2048, 4096]` to `[128, 1024, 4096]`.

```python
return (out if activation is None else F.silu(out)).to(dtype=dtype_in)


@pytest.mark.parametrize("itype", [torch.bfloat16, torch.float])
```
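The pruning suggested above could be sketched as follows. This is a hypothetical test, not the PR's actual diff; the real parametrizations live under `tests/kernels/mamba`:

```python
import pytest

# Original list: every power of two from 128 to 4096.
FULL_SEQLENS = [128, 256, 512, 1024, 2048, 4096]
# Pruned list: the endpoints plus one midpoint, which still exercises
# small, medium, and large sequence lengths with half the test time.
PRUNED_SEQLENS = [128, 1024, 4096]


@pytest.mark.parametrize("seqlen", PRUNED_SEQLENS)
def test_seqlen_is_still_covered(seqlen):
    # Placeholder body: a real test would run the mamba kernel for this
    # seqlen and compare against a reference implementation.
    assert seqlen in FULL_SEQLENS
```

The key constraint is that the pruned set stays a subset of the original and keeps both boundary sizes, so coverage of the extremes is unchanged.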
Review comment (Member):

A test for float is still needed here.

Reply (Contributor, author):

Ack.

```python
@pytest.mark.parametrize("has_bias", [False, True])
@pytest.mark.parametrize("seqlen", [1])
@pytest.mark.parametrize("width", [4])
@pytest.mark.parametrize("dim", [2048, 2048 + 16, 4096])
```
Review comment (Member):

Test for arbitrary shape is needed

Reply (Contributor, author):

Ack

Signed-off-by: kfhfar <[email protected]>
Signed-off-by: Fardin Hoque <[email protected]>

Removing bfloat16, as it was added accidentally

Signed-off-by: Fardin Hoque <[email protected]>
@yewentao256 (Member) left a comment:

Thanks for the work! Just a few thoughts

@kfhfar kfhfar marked this pull request as ready for review October 13, 2025 00:08
@yewentao256 (Member) left a comment:

LGTM, thanks for the work!

@yewentao256 yewentao256 added the ready ONLY add when PR is ready to merge/full CI is needed label Oct 13, 2025
@yewentao256 yewentao256 merged commit 577c72a into vllm-project:main Oct 13, 2025
18 checks passed
1994 pushed a commit to 1994/vllm that referenced this pull request Oct 14, 2025
Dhruvilbhatt pushed a commit to Dhruvilbhatt/vllm that referenced this pull request Oct 14, 2025
bbartels pushed a commit to bbartels/vllm that referenced this pull request Oct 16, 2025
lywa1998 pushed a commit to lywa1998/vllm that referenced this pull request Oct 20, 2025
alhridoy pushed a commit to alhridoy/vllm that referenced this pull request Oct 24, 2025
xuebwang-amd pushed a commit to xuebwang-amd/vllm that referenced this pull request Oct 24, 2025
xuebwang-amd pushed a commit to xuebwang-amd/vllm that referenced this pull request Oct 24, 2025
0xrushi pushed a commit to 0xrushi/vllm that referenced this pull request Oct 26, 2025
0xrushi pushed a commit to 0xrushi/vllm that referenced this pull request Oct 26, 2025
rtourgeman pushed a commit to rtourgeman/vllm that referenced this pull request Nov 10, 2025
Zhathw pushed a commit to Zhathw/vllm that referenced this pull request Nov 12, 2025

Labels

ready ONLY add when PR is ready to merge/full CI is needed
