
Disable warpReduce under the LTS driver #1337

Merged
alexbaden merged 1 commit into llvm-target from alex/lts_warp_reduce on Jun 12, 2024

Conversation

@alexbaden
Contributor

Allows all HF Float32 Training accuracy models to pass under LTS driver and upstream PyTorch.

cc #1336

@alexbaden alexbaden requested a review from whitneywhtsang June 12, 2024 17:13
@whitneywhtsang
Contributor

What's the issue for fixing it on LTS?

@alexbaden
Contributor Author

#1336 linked above

@whitneywhtsang
Contributor

> #1336 linked above

Thanks, I thought that issue was for this PR, not for the future investigation.

Comment on lines +90 to +93
const bool isLTS =
op->getParentOfType<ModuleOp>()->hasAttr("triton_gpu.is_lts");
if (isLTS)
return false;
Contributor


[optional] IMO triton_gpu.is_lts is clear, no need to create a variable.

Suggested change
- const bool isLTS =
-     op->getParentOfType<ModuleOp>()->hasAttr("triton_gpu.is_lts");
- if (isLTS)
-   return false;
+ if (op->getParentOfType<ModuleOp>()->hasAttr("triton_gpu.is_lts"))
+   return false;

@alexbaden
Contributor Author

Still fails after the fox127 upgrade, so I'm going to merge this.

@alexbaden alexbaden merged commit 3d79d38 into llvm-target on Jun 12, 2024
@alexbaden alexbaden deleted the alex/lts_warp_reduce branch on June 12, 2024 21:09
etiotto pushed a commit that referenced this pull request Jul 10, 2024
The workarounds were added in
#1275 and
#1337.
All huggingface training float32 models pass with the LTS workaround
removed:
https://github.com/intel/intel-xpu-backend-for-triton/actions/runs/9865658421

Signed-off-by: Whitney Tsang <whitney.tsang@intel.com>
wdziurdz pushed a commit that referenced this pull request Apr 7, 2026
Fixes #1179

Signed-off-by: Gregory Shimansky <gregory.shimansky@intel.com>