
[Misc] fast fail for exiting if tools/install_flash_infer_attention_score_ops_a2.sh #5422

Merged
MengqingCao merged 4 commits into vllm-project:main from MengqingCao:fd
Dec 27, 2025

Conversation

Collaborator

@MengqingCao MengqingCao commented Dec 27, 2025

What this PR does / why we need it?

Use `set -euo pipefail` so that tools/install_flash_infer_attention_score_ops_a2.sh exits immediately if any line fails.

How was this patch tested?

Tests pass locally (screenshot omitted).

…r_attention_score_ops_a2.sh failed in any line

Signed-off-by: MengqingCao <cmq0113@163.com>
@github-actions
Contributor

👋 Hi! Thank you for contributing to the vLLM Ascend project. The following points will speed up your PR merge:

  • A PR should do only one thing; smaller PRs enable faster reviews.
  • Every PR should include unit tests and end-to-end tests to ensure it works and is not broken by future PRs.
  • Write the commit message by filling out the PR description to help reviewers and future developers understand.

If CI fails, you can run linting and testing checks locally according to Contributing and Testing.

Contributor

@gemini-code-assist gemini-code-assist bot left a comment


Code Review

This pull request introduces set -euo pipefail to two shell scripts to ensure they exit immediately upon any command failure. This is a crucial improvement for the robustness of these installation scripts. My review feedback builds upon this by suggesting the addition of an ERR trap. This will enhance debuggability by printing the exact command and line number that caused the script to fail, making it easier to diagnose issues during setup.

Comment thread tools/install_flash_infer_attention_score_ops_a2.sh
Comment thread tools/install_flash_infer_attention_score_ops_a3.sh
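The ERR-trap suggestion above can be sketched as follows; the install commands are placeholders, since the scripts' actual contents are not shown in this thread:

```shell
#!/usr/bin/env bash
set -euo pipefail

# ERR trap suggested in the review: report the failing command and its
# line number before the script exits, to ease debugging during setup.
trap 'echo "Error on line ${LINENO}: \"${BASH_COMMAND}\" failed" >&2' ERR

# Placeholder install steps (hypothetical; not the real script body).
echo "downloading attention-score op package..."
echo "installing ops..."
```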
MengqingCao and others added 2 commits December 27, 2025 14:57
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
Signed-off-by: Mengqing Cao <cmq0113@163.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
Signed-off-by: Mengqing Cao <cmq0113@163.com>
@Yikun Yikun changed the title [Build][QuickFix] Quick fix for exiting if tools/install_flash_infer_attention_score_ops_a2.sh failed in anyline [Misc] Quick fix for exiting if tools/install_flash_infer_attention_score_ops_a2.sh failed in anyline Dec 27, 2025
@Yikun Yikun changed the title [Misc] Quick fix for exiting if tools/install_flash_infer_attention_score_ops_a2.sh failed in anyline [Misc] fast fail for exiting if tools/install_flash_infer_attention_score_ops_a2.sh Dec 27, 2025
@MengqingCao MengqingCao merged commit 77cd960 into vllm-project:main Dec 27, 2025
6 of 8 checks passed
845473182 pushed a commit to 845473182/vllm-ascend that referenced this pull request Dec 29, 2025
…to eplb_refactor

* 'main' of https://github.com/vllm-project/vllm-ascend: (46 commits)
  [Feature] Support to use fullgraph with eagle (vllm-project#5118)
  [EPLB][refactor] Modification of the initialization logic for expert_map and log2phy(depend on pr5285) (vllm-project#5311)
  [Refactor]6/N Extract common code of class AscendMLAImpl (vllm-project#5314)
  [Refactor] cache cos/sin in mla & remove parameter model in builder. (vllm-project#5277)
  update vllm pin to 12.27 (vllm-project#5412)
  [ReleaseNote] Add release note for v0.13.0rc1 (vllm-project#5334)
  [Bugfix] Correctly handle the output shape in multimodal attention (vllm-project#5443)
  Fix nightly (vllm-project#5413)
  [bugfix] fix typo of _skip_all_reduce_across_dp_group (vllm-project#5435)
  [Doc]modify pcp tutorial doc (vllm-project#5440)
  [Misc] fast fail for exiting if tools/install_flash_infer_attention_score_ops_a2.sh (vllm-project#5422)
  [Doc] Update DeepSeek V3.1/R1 2P1D doc (vllm-project#5387)
  [DOC]Fix model weight download links (vllm-project#5436)
  [Doc] Modify DeepSeek-R1/V3.1 documentation (vllm-project#5426)
  Revert "[feat] enable hierarchical mc2 ops on A2 by default (vllm-project#5300)" (vllm-project#5434)
  [Bugfix] fix greedy temperature detection (vllm-project#5417)
  [doc] Update Qwen3-235B doc for reproducing latest performance (vllm-project#5323)
  [feat] enable hierarchical mc2 ops on A2 by default (vllm-project#5300)
  [Doc] delete environment variable HCCL_OP_EXPANSION_MODE in DeepSeekV3.1/R1 (vllm-project#5419)
  [Doc] add long_sequence feature user guide (vllm-project#5343)
  ...
ZRJ026 pushed a commit to ZRJ026/vllm-ascend that referenced this pull request Feb 28, 2026
…core_ops_a2.sh (vllm-project#5422)

### What this PR does / why we need it?
Use `set -euo pipefail` to exit if
tools/install_flash_infer_attention_score_ops_a2.sh failed in any line

- vLLM version: release/v0.13.0
- vLLM main:
vllm-project/vllm@81786c8
---------
Signed-off-by: MengqingCao <cmq0113@163.com>
Signed-off-by: Mengqing Cao <cmq0113@163.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
Signed-off-by: zrj026 <zhangrunjiang026@gmail.com>
maoxx241 pushed a commit to maoxx241/vllm-ascend that referenced this pull request Mar 2, 2026
…core_ops_a2.sh (vllm-project#5422)

### What this PR does / why we need it?
Use `set -euo pipefail` to exit if
tools/install_flash_infer_attention_score_ops_a2.sh failed in any line

- vLLM version: release/v0.13.0
- vLLM main:
vllm-project/vllm@81786c8
---------
Signed-off-by: MengqingCao <cmq0113@163.com>
Signed-off-by: Mengqing Cao <cmq0113@163.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
ZRJ026 pushed a commit to ZRJ026/vllm-ascend that referenced this pull request Mar 4, 2026
…core_ops_a2.sh (vllm-project#5422)

### What this PR does / why we need it?
Use `set -euo pipefail` to exit if
tools/install_flash_infer_attention_score_ops_a2.sh failed in any line

- vLLM version: release/v0.13.0
- vLLM main:
vllm-project/vllm@81786c8
---------
Signed-off-by: MengqingCao <cmq0113@163.com>
Signed-off-by: Mengqing Cao <cmq0113@163.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
Signed-off-by: zrj026 <zhangrunjiang026@gmail.com>
