
Conversation

@nicolasvasilache nicolasvasilache commented Jun 24, 2025

Carrying reverts from #21162

Also added a revert for 4d21da002a056c64231fb89ee9e4eba90080e9bb (not a hard fix; it is just a load-bearing change that should be done as a separate cherry-pick)

Adds a local commit to stablehlo to allow compiling with the new llvm patch

@nicolasvasilache nicolasvasilache force-pushed the users/nico/pad-attention-integrate-only branch from a5d8087 to 9026569 on June 24, 2025 15:12
@Groverkss Groverkss force-pushed the users/nico/pad-attention-integrate-only branch from 9026569 to 60a321e on June 24, 2025 17:13
@Groverkss Groverkss changed the title from "LLVM / MLIR integrate for #21152" to "Bump LLVM to d31ba5256327d30f264c2f671bf197877b242cde" on Jun 24, 2025
@Groverkss Groverkss force-pushed the users/nico/pad-attention-integrate-only branch from 60a321e to 7c18d18 on June 24, 2025 17:18
@Groverkss Groverkss changed the title from "Bump LLVM to d31ba5256327d30f264c2f671bf197877b242cde" to "Bump LLVM to 4ac4726d00644f6c6b0e2de1df0d00deed0015bf" on Jun 25, 2025
@Max191 Max191 left a comment

Stamping, but I'm not able to review all the changes right now. If there are changes in here that need a closer look, you might want to wait for another approval too.

@lialan lialan mentioned this pull request Jun 26, 2025
@lialan lialan force-pushed the users/nico/pad-attention-integrate-only branch 3 times, most recently from e662694 to 0f9eb07 on June 26, 2025 16:57
ci-skip: windows_x64_msvc
@lialan lialan force-pushed the users/nico/pad-attention-integrate-only branch from 0f9eb07 to aa65b4e on June 26, 2025 20:17
@lialan lialan merged commit 6965a0d into iree-org:main Jun 26, 2025
44 checks passed
@lialan lialan deleted the users/nico/pad-attention-integrate-only branch June 26, 2025 21:17
nicolasvasilache added a commit that referenced this pull request Jul 6, 2025
This PR makes OnlineAttention derive from IndexingMapOpInterface and
pads it with transform.structured.pad_tiling_interface.

Additionally, it ensures the dynamic case pads to a constant before
tiling and properly canonicalizes to constant shapes once AffineMin
simplification kicks in.

This requires integrating LLVM past
d31ba5256327d30f264c2f671bf197877b242cde.

The integrate PR is split out as #21175.
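
For readers unfamiliar with the new padding path, here is an illustrative transform-dialect sketch of the mechanism described above. The matched op name (`iree_linalg_ext.online_attention`) and the exact assembly of `transform.structured.pad_tiling_interface` (attribute names, result arity) are assumptions based on the upstream op and may differ slightly from this integrate:

```mlir
// Illustrative sketch only: pad an op implementing IndexingMapOpInterface
// to static sizes via the transform dialect. Op names and attribute
// syntax are assumptions, not copied from this PR.
module attributes {transform.with_named_sequence} {
  transform.named_sequence @__transform_main(
      %root: !transform.any_op {transform.readonly}) {
    // Match the attention op (any IndexingMapOpInterface op would do).
    %attn = transform.structured.match
        ops{["iree_linalg_ext.online_attention"]} in %root
        : (!transform.any_op) -> !transform.any_op
    // Pad iteration-space sizes up to static constants so that later
    // tiling sees constant shapes once AffineMin simplification kicks in.
    %padded, %pad = transform.structured.pad_tiling_interface %attn
        padding_values = [0.0 : f32]
        padding_sizes = [64, 64]
        : (!transform.any_op) -> (!transform.any_op, !transform.any_op)
    transform.yield
  }
}
```

The key point is that padding is driven by the op's indexing maps rather than by linalg-specific logic, which is what deriving from IndexingMapOpInterface enables.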

---------

Signed-off-by: Nicolas Vasilache <[email protected]>
Co-authored-by: Nicolas Vasilache <[email protected]>
Co-authored-by: Kunwar Grover <[email protected]>
keshavvinayak01 pushed a commit to keshavvinayak01/iree that referenced this pull request Sep 4, 2025
Carrying reverts from iree-org#21162

Also added a revert for
[4d21da002a056c64231fb89ee9e4eba90080e9bb](https://github.com/llvm/llvm-project/pull/144158)
(not a hard fix; it is just a load-bearing change that should be done as a
separate cherry-pick)

Adds a local commit to stablehlo to allow compiling with the new llvm
patch

Co-authored-by: Kunwar Grover <[email protected]>
Signed-off-by: keshavvinayak01 <[email protected]>
keshavvinayak01 pushed a commit to keshavvinayak01/iree that referenced this pull request Sep 4, 2025
This PR makes OnlineAttention derive from IndexingMapOpInterface and
pads it with transform.structured.pad_tiling_interface.

Additionally, it ensures the dynamic case pads to a constant before
tiling and properly canonicalizes to constant shapes once AffineMin
simplification kicks in.

This requires integrating LLVM past
d31ba5256327d30f264c2f671bf197877b242cde.

The integrate PR is split out as iree-org#21175.

---------

Signed-off-by: Nicolas Vasilache <[email protected]>
Co-authored-by: Nicolas Vasilache <[email protected]>
Co-authored-by: Kunwar Grover <[email protected]>
Signed-off-by: keshavvinayak01 <[email protected]>