
CustomOp: test forward dispatch for grouped_topk #31530

Merged

yewentao256 merged 2 commits into vllm-project:main from xinyu-intel:dev/xinyu/grouped-topk-test on Jan 2, 2026

CustomOp: test forward dispatch for grouped_topk#31530
yewentao256 merged 2 commits intovllm-project:mainfrom
xinyu-intel:dev/xinyu/grouped-topk-test

Conversation

@xinyu-intel (Contributor) commented on Dec 30, 2025

Purpose

Follow-up to #29575, #31221, and #31523: test forward-function dispatch for the grouped_topk custom op.
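The shape of such a dispatch test can be illustrated without vLLM itself. The sketch below is a hypothetical mini-model (class and method names assumed, not vLLM's actual implementation): an op exposes per-platform `forward_*` methods, dispatch selects one based on the platform and whether the op is enabled, and the test asserts both that the expected method was chosen and that its output matches the reference path.

```python
# Hypothetical mini-model of CustomOp forward dispatch (names assumed,
# not vLLM's actual code).

class CustomOp:
    def __init__(self, enabled: bool, platform: str):
        self._enabled = enabled
        self._platform = platform

    def forward_native(self, x):
        # Reference implementation used as the correctness baseline.
        return [v * 2 for v in x]

    def forward_cuda(self, x):
        # Stand-in for a fused platform kernel (same math here).
        return [v * 2 for v in x]

    def dispatch_forward(self):
        # Disabled ops always fall back to the native reference path.
        if not self._enabled:
            return self.forward_native
        if self._platform in ("cuda", "hip"):
            return self.forward_cuda
        return self.forward_native


op = CustomOp(enabled=True, platform="cuda")
fwd = op.dispatch_forward()

# The test makes two kinds of assertions: the right method was selected...
assert fwd.__func__ is CustomOp.forward_cuda
# ...and the selected path agrees numerically with the baseline.
assert fwd([1, 2, 3]) == op.forward_native([1, 2, 3])
```

The actual test in this PR does the analogous two checks against vLLM's real `grouped_topk` op: it inspects which `forward_*` method the dispatcher bound, and compares the fused kernel's output against a baseline implementation.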

@xinyu-intel (Contributor, Author) commented:

@tjtanaa @ganyi1996ppo please check this.

@gemini-code-assist bot left a comment:
Code Review

This pull request adds a test for the forward dispatch mechanism of the grouped_topk custom operation. The changes correctly set up a VllmConfig to enable the custom op and then assert that the appropriate platform-specific forward method (forward_cuda or forward_hip) is selected. The test also includes a correctness check comparing a baseline implementation with the fused kernel. The implementation is sound and effectively validates the dispatch logic. I have no issues to report.

@chatgpt-codex-connector bot left a comment:

💡 Codex Review

Here are some automated review suggestions for this pull request.


@xinyu-intel force-pushed the dev/xinyu/grouped-topk-test branch from 49ac2e9 to c8e285e on December 30, 2025 13:01
Signed-off-by: Xinyu Chen <xinyu1.chen@intel.com>
Signed-off-by: Xinyu Chen <xinyu1.chen@intel.com>
The inline review comment below refers to this excerpt of the test:

```python
    dtype: torch.dtype,
):
    vllm_config = VllmConfig(
        compilation_config=CompilationConfig(custom_ops=["all", "+grouped_topk"])
```
A Collaborator left an inline comment:

I think we don't need to add "all", since we are only testing "grouped_topk"; "all" would enable every custom op. We should also test with only "+grouped_topk".

@xinyu-intel (Contributor, Author) replied:

good catch :)
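The point of the suggestion is that `custom_ops` entries compose: `"all"`/`"none"` set a default, while `"+op"`/`"-op"` force an individual op on or off. A hypothetical sketch of that resolution logic (a simplification for illustration, not vLLM's actual code) shows why `["all", "+grouped_topk"]` is a weak test configuration:

```python
# Simplified, assumed semantics of a custom_ops list like vLLM's:
# "+op" force-enables, "-op" force-disables, "all" enables by default.

def op_enabled(custom_ops: list[str], op_name: str) -> bool:
    if "+" + op_name in custom_ops:
        return True
    if "-" + op_name in custom_ops:
        return False
    return "all" in custom_ops  # default off unless "all" is present


# With ["all", "+grouped_topk"], every op is enabled, so the test
# cannot tell whether "+grouped_topk" alone has any effect:
assert op_enabled(["all", "+grouped_topk"], "grouped_topk")
assert op_enabled(["all", "+grouped_topk"], "some_other_op")

# The suggested config isolates the op under test:
assert op_enabled(["+grouped_topk"], "grouped_topk")
assert not op_enabled(["+grouped_topk"], "some_other_op")
```

Testing with only `["+grouped_topk"]` therefore exercises the dispatch path for exactly the op under test, without every other custom op masking a dispatch bug.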

@jikunshang added the ready label (ONLY add when PR is ready to merge/full CI is needed) on Dec 31, 2025
@yewentao256 (Member) left a comment:

LGTM, thanks for the work!

@yewentao256 merged commit 08f425b into vllm-project:main on Jan 2, 2026
19 checks passed
LucasWilkinson pushed a commit to neuralmagic/vllm that referenced this pull request Jan 6, 2026
Signed-off-by: Xinyu Chen <xinyu1.chen@intel.com>
yugong333 pushed a commit to yugong333/vllm that referenced this pull request Jan 9, 2026
Signed-off-by: Xinyu Chen <xinyu1.chen@intel.com>
akh64bit pushed a commit to akh64bit/vllm that referenced this pull request Jan 16, 2026
Signed-off-by: Xinyu Chen <xinyu1.chen@intel.com>
dsuhinin pushed a commit to dsuhinin/vllm that referenced this pull request Jan 21, 2026
Signed-off-by: Xinyu Chen <xinyu1.chen@intel.com>
Signed-off-by: dsuhinin <suhinin.dmitriy@gmail.com>
ItzDEXX pushed a commit to ItzDEXX/vllm that referenced this pull request Feb 19, 2026
Signed-off-by: Xinyu Chen <xinyu1.chen@intel.com>