…2273)
### Ticket
#2041 #2042
### Problem description
To enable llama to go through the optimizer, we need constraints and
runtime APIs for all ops (tracked in #2084). This PR enables these APIs
for `mean` and `reshape`.
### What's changed
- Added `getOpRuntime()` and `getOpConstraints()` interface methods to
`ReshapeOp` and `MeanOp`
- Added unit tests for both APIs on both Ops
- Closes #2041
- Closes #2042
Note: This PR looks long for a "simple" integration change because both
APIs require manually translating MLIR types down to TTNN types. As a
result, opportunities for sharing code across Ops are limited. About a
third of the diff is unit tests.
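
For illustration only, here is a minimal C++ sketch of the kind of per-op translate-then-query pattern described above. All type and function names in it (`MlirLikeTensorLayout`, `TtnnLikeTensorSpec`, `toTtnnLikeSpec`, `queryBackendL1Usage`, `getReshapeOpConstraintsSketch`) are hypothetical placeholders, not the actual tt-mlir or TTNN APIs added in this PR.

```cpp
// Hypothetical sketch of a getOpConstraints()-style query: convert the
// compiler-side (MLIR-like) layout into a backend-side (TTNN-like) spec,
// then ask the backend for a constraint estimate. Placeholder names only;
// this is not the real tt-mlir/TTNN interface.
#include <cstdint>
#include <optional>
#include <vector>

struct MlirLikeTensorLayout {  // stand-in for an MLIR-side tensor layout
  std::vector<int64_t> shape;
  bool isTiled;
};

struct TtnnLikeTensorSpec {    // stand-in for a TTNN-side tensor spec
  std::vector<uint32_t> shape;
  bool tileLayout;
};

// Manual MLIR-like -> TTNN-like translation. Each op needs its own variant
// of this step, which is why little code can be shared across ops.
static std::optional<TtnnLikeTensorSpec>
toTtnnLikeSpec(const MlirLikeTensorLayout &layout) {
  TtnnLikeTensorSpec spec;
  for (int64_t dim : layout.shape) {
    if (dim < 0)
      return std::nullopt;  // dynamic dims not handled in this sketch
    spec.shape.push_back(static_cast<uint32_t>(dim));
  }
  spec.tileLayout = layout.isTiled;
  return spec;
}

// Placeholder for the backend query the real interface method would call.
static std::size_t queryBackendL1Usage(const TtnnLikeTensorSpec &in,
                                       const TtnnLikeTensorSpec &out) {
  return (in.shape.size() + out.shape.size()) * 1024;  // dummy estimate
}

// Constraints-style query for a reshape-like op: translate both layouts,
// then return the backend's estimate (or nothing if translation fails).
std::optional<std::size_t>
getReshapeOpConstraintsSketch(const MlirLikeTensorLayout &input,
                              const MlirLikeTensorLayout &output) {
  auto inSpec = toTtnnLikeSpec(input);
  auto outSpec = toTtnnLikeSpec(output);
  if (!inSpec || !outSpec)
    return std::nullopt;
  return queryBackendL1Usage(*inSpec, *outSpec);
}
```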
### Checklist
- [x] New/Existing tests provide coverage for changes
This is needed to run llama sharded.
Reference op constraints implementation: #1554