[perf] Support MOE Multi-stream in Deepseek #947

Merged
wangxiyuan merged 10 commits into vllm-project:main from David9857:cv
Jun 5, 2025
Conversation

@David9857
Contributor

@David9857 David9857 commented May 24, 2025

What this PR does / why we need it?

Support MoE inner multi-stream for DeepSeek.
This feature requires graph mode with MC2 enabled.
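The idea behind the feature is to overlap the shared-expert branch with the routed-expert branch (MC2 dispatch, expert GEMMs, MC2 combine) on a second device stream. The sketch below is only a conceptual analog: the real implementation uses a second NPU stream under torchair graph mode, while here Python threads stand in for streams, and all function names are hypothetical.

```python
# Conceptual analog of MoE inner multi-stream: run the routed-expert
# branch and the shared-expert branch concurrently, then combine.
# Threads stand in for device streams; names are illustrative only.
from concurrent.futures import ThreadPoolExecutor

def routed_experts(tokens):
    # stand-in for MC2 dispatch -> expert computation -> MC2 combine
    return [t * 2 for t in tokens]

def shared_expert(tokens):
    # stand-in for the always-active shared-expert branch
    return [t + 1 for t in tokens]

def moe_forward(tokens):
    with ThreadPoolExecutor(max_workers=2) as pool:
        routed = pool.submit(routed_experts, tokens)  # "stream 1"
        shared = pool.submit(shared_expert, tokens)   # "stream 2"
        # DeepSeek's MoE sums the two branches' outputs
        return [r + s for r, s in zip(routed.result(), shared.result())]

print(moe_forward([1, 2, 3]))  # [4, 7, 10]
```

The benefit on real hardware comes from the routed branch being communication-heavy (MC2) while the shared branch is compute-heavy, so the two can genuinely overlap.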

Does this PR introduce any user-facing change?

How was this patch tested?

```python
moe_expert_num = len(expert_map)
# hidden_states = hidden_states.bfloat16()
kwargs = {
kwargs1 = {
```
Collaborator

Rename `kwargs1` to a readable name.

Comment thread vllm_ascend/envs.py (Outdated)

```python
lambda: bool(int(os.getenv("COMPILE_CUSTOM_KERNELS", "1"))),
"VLLM_ENABLE_MC2":
lambda: bool(int(os.getenv("VLLM_ENABLE_MC2", '0'))),
"VLLM_ENABLE_CV_PARALLEL":
```
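The `bool(int(os.getenv(...)))` idiom from the diff above can be sketched in isolation. The helper name `env_flag` is hypothetical; only the idiom and the two variable names come from the thread:

```python
import os

def env_flag(name: str, default: str = "0") -> bool:
    # Parse a "0"/"1" environment variable, matching the
    # bool(int(os.getenv(...))) idiom used in vllm_ascend/envs.py.
    return bool(int(os.getenv(name, default)))

os.environ["VLLM_ENABLE_MC2"] = "1"
print(env_flag("VLLM_ENABLE_MC2"))          # True
print(env_flag("VLLM_ENABLE_CV_PARALLEL"))  # False (unset, default "0")
```

Note that any non-numeric value (e.g. `"true"`) raises `ValueError` under this idiom, which is one reason the reviewer below prefers a structured config over env vars.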
Collaborator

@wangxiyuan wangxiyuan May 29, 2025
Use `additional_config` instead of an env var, since this change is only used for torchair GE mode. Like #839 does; there are another three new config options coming.

How about:

```python
{
    "additional_config": {
        "torchair_graph_config": {
            "enable": True,
            "enable_cv_parallet": True,
            "batch_sizes": "12345",
            "batch_sizes_init": True
        }
    }
}
```

cc @zzzzwwjj

@wangxiyuan
Collaborator

wangxiyuan commented May 29, 2025

David9857 added 2 commits June 4, 2025 11:46

- use additional_config to enable cv parallel
  Signed-off-by: David9857 <985700846@qq.com>
- rename kwargs1 in fused_experts_with_mc2
  Signed-off-by: David9857 <985700846@qq.com>
@github-actions
Contributor

github-actions bot commented Jun 4, 2025

This pull request has conflicts, please resolve those before we can evaluate the pull request.

@github-actions
Contributor

github-actions bot commented Jun 5, 2025

This pull request has conflicts, please resolve those before we can evaluate the pull request.

Comment thread vllm_ascend/models/deepseek_v2.py (Outdated)

```python
self.gate.e_score_correction_bias = None

self.enable_cv_parallel = False
additional_config = get_current_vllm_config().additional_config
```
Collaborator

Please use `ascend_config` instead now. Note that the doc should be updated at the same time.
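The pattern in the diff above, reading a nested flag out of `additional_config`, can be sketched with plain dicts. The key names follow the PR discussion; the helper `read_cv_parallel_flag` and its defensive defaults are assumptions, not the merged implementation:

```python
def read_cv_parallel_flag(additional_config) -> bool:
    # Return the enable_cv_parallel flag from a vLLM-style
    # additional_config dict, defaulting to False when the
    # config or the nested section is absent.
    if not additional_config:
        return False
    graph_cfg = additional_config.get("torchair_graph_config") or {}
    return bool(graph_cfg.get("enable_cv_parallel", False))

cfg = {"torchair_graph_config": {"enable": True, "enable_cv_parallel": True}}
print(read_cv_parallel_flag(cfg))   # True
print(read_cv_parallel_flag(None))  # False
```

Defaulting to `False` keeps the feature opt-in, which matches the `self.enable_cv_parallel = False` initialization shown in the diff.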

Signed-off-by: David9857 <985700846@qq.com>
Signed-off-by: David9857 <985700846@qq.com>

bugfix

Signed-off-by: David9857 <985700846@qq.com>
@wangxiyuan wangxiyuan merged commit 78431b3 into vllm-project:main Jun 5, 2025
23 checks passed
chopper0126 pushed a commit to chopper0126/vllm-ascend that referenced this pull request Oct 16, 2025
Angazenn pushed a commit to Angazenn/vllm-ascend that referenced this pull request Oct 21, 2025

2 participants