Fix build errors of GridSample and test failures in test_attention_fusion.py#27642
Merged
Conversation
Contributor
Pull request overview
This PR fixes a build error in the CUDA GridSample operator and reverts a transformer fusion fallback that caused test regressions, following up on PRs #27201 and #27556.
Changes:
- Removed an unused local variable `mode_str` in the GridSample CUDA constructor that shadowed per-branch locals and caused a build warning/error.
- Reverted the `SkipLayerNormalization` fusion fallback in `fusion_skiplayernorm.py` (from PR #27556) that broke GPT-2 tests, restoring the early return when symbolic shape inference fails.
- Updated GridSample custom tests and Qwen3 attention fusion test expectations to match the fixed code.
Reviewed changes
Copilot reviewed 4 out of 4 changed files in this pull request and generated no comments.
| File | Description |
|---|---|
| `onnxruntime/core/providers/cuda/tensor/grid_sample.cc` | Remove unused outer-scope `mode_str` variable to fix shadowing build error |
| `onnxruntime/test/providers/cpu/tensor/grid_sample_test_custom.inc` | Use default `GetExecutionProviders()` matching convention of all other GridSample tests |
| `onnxruntime/python/tools/transformers/fusion_skiplayernorm.py` | Revert fallback: early-return when shape inference fails instead of using default `skip_index` |
| `onnxruntime/test/python/transformers/test_attention_fusion.py` | Update Qwen3 test expectations (4 SLN, 0 SSLN) to match reverted fusion behavior |
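The reverted fallback can be illustrated with a minimal sketch (hypothetical function and parameter names, not the actual onnxruntime fusion code): when symbolic shape inference is unavailable, the fusion pass returns early instead of guessing a default `skip_index`.

```python
# Minimal sketch of the restored early-return behavior in a
# SkipLayerNormalization-style fusion pass. All names here are
# hypothetical; the real logic lives in fusion_skiplayernorm.py.

def try_fuse_skip_layer_norm(add_node, shape_infer):
    """Attempt fusion; bail out when symbolic shape inference failed
    rather than falling back to a default skip_index."""
    if shape_infer is None:
        # Restored behavior: no fallback, leave the graph unchanged.
        return None
    # ... (a real pass would match an Add + LayerNormalization pair here) ...
    return "SkipLayerNormalization"

# With shape inference available, fusion proceeds:
assert try_fuse_skip_layer_norm("add_1", shape_infer=object()) == "SkipLayerNormalization"
# Without it, the pass returns early:
assert try_fuse_skip_layer_norm("add_1", shape_infer=None) is None
```

The PR's point is that the fallback path (fusing with a guessed `skip_index`) produced incorrect fusions for GPT-2 models, so the conservative early return is the safer default.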
sanaa-hamel-microsoft
approved these changes
Mar 13, 2026
hariharans29
approved these changes
Mar 13, 2026
Description
This PR addresses a build error and subsequent test failures related to recent changes in GridSample and the transformer optimizer. Related PRs: #27201, #27556.
Changes
1. Fix GridSample Build Error
Removed the unused local variable `mode_str` in `onnxruntime/core/providers/cuda/tensor/grid_sample.cc` that was causing a warning (treated as an error). This warning surfaced after C++20 was enabled in a recent commit.
2. Update GridSample Tests
Updated `onnxruntime/test/providers/cpu/tensor/grid_sample_test_custom.inc` to use the default execution providers in `RunTests` (matching the convention of the other GridSample tests), ensuring compatibility across different environments.
3. Revert Transformer Fusion Fallback
Reverted the change in `onnxruntime/python/tools/transformers/fusion_skiplayernorm.py` (from PR #27556) that enabled a fallback for `SkipLayerNormalization` fusion when symbolic shape inference fails, restoring the early return in that case.
4. Restore Transformer Test Parity
Updated `onnxruntime/test/python/transformers/test_attention_fusion.py`, specifically `test_qwen3_normalization_fusion`, to match the expected node counts after reverting the fusion fallback.
Verification
`onnxruntime/test/python/transformers/test_attention_fusion.py` passes with "OK".
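The kind of check the updated test performs can be sketched as counting operator types in the optimized graph. This is a simplified illustration with a hypothetical node list (the real test builds and optimizes an actual Qwen3 model); the expected counts of 4 `SkipLayerNormalization` (SLN) and 0 `SkipSimplifiedLayerNormalization` (SSLN) come from the reviewed-changes summary above.

```python
from collections import Counter

def count_ops(node_op_types):
    """Count occurrences of each operator type in a graph's node list."""
    return Counter(node_op_types)

# Hypothetical optimized-graph node list after the reverted fusion:
nodes = ["Attention", "SkipLayerNormalization"] * 4
counts = count_ops(nodes)
assert counts["SkipLayerNormalization"] == 4
assert counts["SkipSimplifiedLayerNormalization"] == 0
```

A `Counter` returns 0 for missing keys, which makes asserting the absence of an op type (here SSLN) straightforward.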