[bugfix] fix typo of _skip_all_reduce_across_dp_group (#5435)
wangxiyuan merged 2 commits into vllm-project:main
Conversation
Signed-off-by: jiangkuaixue123 <jiangxiaozhou111@163.com>
👋 Hi! Thank you for contributing to the vLLM Ascend project. The following points will speed up your PR merge:
If CI fails, you can run linting and testing checks locally according to the Contributing and Testing guides.
Code Review
This pull request corrects a typo in the method name _skip_all_reduce_across_dp_group and at its call site. The change is straightforward and correct, and it improves code clarity. I have no further comments, as there are no high- or critical-severity issues in the changed code.
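The diff itself is not shown in this conversation, but the review describes a rename of a method and its call site. The sketch below is a hypothetical illustration of that shape of change: the class, the example misspelling, and the surrounding logic are illustrative stand-ins, not code from vllm-ascend.

```python
# Hypothetical sketch of the kind of rename this PR performs: a misspelled
# method name (e.g. `_skip_all_reduce_acorss_dp_group`) is corrected to
# `_skip_all_reduce_across_dp_group` at both its definition and its call site.
# The class and its logic are illustrative only, not from vllm-ascend.

class DPGroupCoordinator:
    """Toy stand-in for a class that owns the renamed method."""

    def __init__(self, dp_world_size: int) -> None:
        self.dp_world_size = dp_world_size

    def _skip_all_reduce_across_dp_group(self) -> bool:
        # With a single data-parallel rank there is nothing to reduce.
        return self.dp_world_size <= 1

    def maybe_all_reduce(self, value: float) -> float:
        # Call site updated to use the corrected spelling.
        if self._skip_all_reduce_across_dp_group():
            return value
        return value * self.dp_world_size  # placeholder for a real all-reduce


coord = DPGroupCoordinator(dp_world_size=1)
print(coord.maybe_all_reduce(3.0))  # all-reduce skipped: 3.0
```

Because only a name changes, a fix like this carries no behavioral or user-facing impact, which matches the PR description below.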
@wangxiyuan I noticed a typo while reviewing the code. Could you help merge the fix?
Thanks for your first contribution! Your awesome first PR has been included in the vLLM Ascend v0.13.0rc1 release. [1] https://github.com/vllm-project/vllm-ascend/releases/tag/v0.13.0rc1
…to eplb_refactor
* 'main' of https://github.com/vllm-project/vllm-ascend: (46 commits)
  - [Feature] Support to use fullgraph with eagle (vllm-project#5118)
  - [EPLB][refactor] Modification of the initialization logic for expert_map and log2phy (depends on pr5285) (vllm-project#5311)
  - [Refactor] 6/N Extract common code of class AscendMLAImpl (vllm-project#5314)
  - [Refactor] cache cos/sin in mla & remove parameter model in builder (vllm-project#5277)
  - update vllm pin to 12.27 (vllm-project#5412)
  - [ReleaseNote] Add release note for v0.13.0rc1 (vllm-project#5334)
  - [Bugfix] Correctly handle the output shape in multimodal attention (vllm-project#5443)
  - Fix nightly (vllm-project#5413)
  - [bugfix] fix typo of _skip_all_reduce_across_dp_group (vllm-project#5435)
  - [Doc] modify pcp tutorial doc (vllm-project#5440)
  - [Misc] fast fail for exiting if tools/install_flash_infer_attention_score_ops_a2.sh (vllm-project#5422)
  - [Doc] Update DeepSeek V3.1/R1 2P1D doc (vllm-project#5387)
  - [DOC] Fix model weight download links (vllm-project#5436)
  - [Doc] Modify DeepSeek-R1/V3.1 documentation (vllm-project#5426)
  - Revert "[feat] enable hierarchical mc2 ops on A2 by default (vllm-project#5300)" (vllm-project#5434)
  - [Bugfix] fix greedy temperature detection (vllm-project#5417)
  - [doc] Update Qwen3-235B doc for reproducing latest performance (vllm-project#5323)
  - [feat] enable hierarchical mc2 ops on A2 by default (vllm-project#5300)
  - [Doc] delete environment variable HCCL_OP_EXPANSION_MODE in DeepSeekV3.1/R1 (vllm-project#5419)
  - [Doc] add long_sequence feature user guide (vllm-project#5343)
  - ...
)

### What this PR does / why we need it?
Fix a typo in `_skip_all_reduce_across_dp_group`.

### Does this PR introduce _any_ user-facing change?
No.

### How was this patch tested?
- vLLM version: release/v0.13.0
- vLLM main: vllm-project/vllm@81786c8

Signed-off-by: jiangkuaixue123 <jiangxiaozhou111@163.com>
Signed-off-by: Che Ruan <cr623@ic.ac.uk>
Signed-off-by: zrj026 <zhangrunjiang026@gmail.com>
What this PR does / why we need it?
Fix a typo in the method name `_skip_all_reduce_across_dp_group` and at its call site.
Does this PR introduce any user-facing change?
No.
How was this patch tested?