cpu: rv64: matmul: add dropout attribute check#4197

Merged
vpirogov merged 2 commits into uxlfoundation:main from zhangfeiv0:fix_matmul_dropout
Nov 7, 2025
Conversation

@zhangfeiv0
Contributor

Description

This issue was introduced by #3784. The error can be reproduced as follows:

./tests/benchdnn/benchdnn --matmul --stag=ab --dtag=ab --attr-dropout=0.5:12345678 1x1:1x1
Error: Function 'initialize_memory_create' at (/data/zhangfei/oneDNN/tests/benchdnn/dnnl_memory.cpp:880) returned 'invalid_arguments'
[CHECK_MEM][ERROR]: Allocations were not cleared
[CHECK_MEM][ERROR]: Total size wasn't reduced to 0

This PR therefore adds a check for the dropout attribute so that the rv64 matmul implementation falls back to the generic (reference) implementation instead of failing. The effect of applying this PR is as follows:

./tests/benchdnn/benchdnn --matmul --stag=ab --dtag=ab --attr-dropout=0.5:12345678 1x1:1x1
0:PASSED (13 ms) __REPRO: --matmul --stag=ab --dtag=ab --attr-dropout=0.5:12345678 1x1:1x1
============================================================
= Implementation statistics (--summary=no-impl to disable) =
============================================================
| ref:any : 1 (100%)                                       |
============================================================
tests:1 passed:1 skipped:0 mistrusted:0 unimplemented:0 invalid_arguments:0 failed:0 listed:0
total: 0.01s; create_pd: 0.00s (2%); create_prim: 0.00s (0%); fill: 0.00s (6%); execute: 0.00s (0%); compute_ref: 0.00s (0%); compare: 0.01s (78%);
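The fall-back mechanism can be sketched as follows. This is a minimal, self-contained illustration of the dispatch pattern, not the actual patch: the names `attr_t`, `dropout_is_default`, and `rv64_matmul_init` are assumptions for the sketch; the real oneDNN code uses its internal primitive-descriptor `init()` and attribute-checking macros.

```cpp
#include <cassert>

// Illustrative only: models how an implementation-specific init() rejects
// an unsupported attribute so the framework falls back to the next
// (generic/reference) implementation in the dispatch list.
enum class status_t { success, unimplemented };

struct attr_t {
    // A non-default dropout attribute means the user requested dropout
    // (e.g. benchdnn's --attr-dropout=0.5:12345678).
    bool dropout_is_default = true;
};

// Sketch of the rv64 matmul primitive-descriptor init(): returning
// "unimplemented" tells the dispatcher to try the next implementation,
// which is how the run above ends up on ref:any.
status_t rv64_matmul_init(const attr_t &attr) {
    if (!attr.dropout_is_default) return status_t::unimplemented;
    return status_t::success;
}
```

With this check in place, a matmul created with a dropout attribute skips the rv64 kernel cleanly rather than reaching it with arguments it cannot handle.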

@zhangfeiv0 zhangfeiv0 requested a review from a team as a code owner October 23, 2025 09:04
@zhangfeiv0 zhangfeiv0 requested a review from dzarukin October 27, 2025 01:17
@vpirogov vpirogov merged commit f2286a0 into uxlfoundation:main Nov 7, 2025
11 checks passed

3 participants