bench: Fix execution pre-validation #340
Conversation
All of this is checked in validate_benchmark_case where a test run is performed. I don't see the point of repeating this.
I may have missed it, but I do get a crash on inputs in
Please investigate where exactly the problem is and why
@chfast pushed your fixes here, but feel free to close.
Codecov Report
@@            Coverage Diff             @@
##           master     #340      +/-   ##
==========================================
- Coverage   98.85%   98.84%   -0.01%
==========================================
  Files          41       42       +1
  Lines       11854    12134     +280
==========================================
+ Hits        11718    11994     +276
- Misses        136      140       +4
Force-pushed cff2264 to 59c369f
# Copyright 2020 The Fizzy Authors.
# SPDX-License-Identifier: Apache-2.0

add_test(
Hmm, couldn't we run spectest smoketests the same way?
Sure. I had not connected the dots.
Force-pushed 0a113b1 to b28ad5e
@@ -100,7 +100,6 @@ WasmEngine::Result FizzyEngine::execute(
 {
     const auto [trapped, result_stack] =
         fizzy::execute(*m_instance, static_cast<uint32_t>(func_ref), args);
-    assert(result_stack.size() <= 1);
This just shows Fizzy's "public API" is broken right now :)
Hopefully after the span changes go in, we can merge a version of #219 and fix this.
I was confused at first, but indeed we dump the whole stack also in the case of returning trapped.
I can add a disabled test for such a case, but I don't think it really helps.
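To illustrate why the dropped assert was wrong: when execution traps mid-function, the engine returns the operand stack as-is, so the result stack can hold more than one value. The sketch below is a minimal stand-in, not Fizzy's real API; the names `execute_mock` and `Stack` are hypothetical.

```cpp
#include <cstdint>
#include <utility>
#include <vector>

// Hypothetical mini-engine: on a trap the whole operand stack is
// returned, so its size may exceed 1; on a normal exit at most one
// result value remains.
using Stack = std::vector<uint64_t>;

std::pair<bool, Stack> execute_mock(bool trap_midway)
{
    Stack stack;
    stack.push_back(1);  // intermediate operands accumulated so far
    stack.push_back(2);
    if (trap_midway)
        return {true, stack};  // trapped: whole stack is dumped
    stack.pop_back();          // normal exit: single result left
    return {false, stack};
}
```

This is why `assert(result_stack.size() <= 1)` fails for trapping executions.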
engine->instantiate(*benchmark_case.wasm_binary);
const auto func_ref = engine->find_function(benchmark_case.func_name, benchmark_case.func_sig);
std::optional<fizzy::test::WasmEngine::FuncRef> func_ref;
if (ok)
Can you explain the changes in this function?
It looks like when validation is not ok, you would still dereference func_ref
at line 185.
Ah, I guess it works because validate_benchmark_case
would call state.SkipWithError
and the loop below will not be executed.
But why then not just early return:
if (!validate_benchmark_case(...))
    return;
Yes. The loop will not run, but we need to reach the `auto _ : state` loop
anyway (this is a libbenchmark limitation in the version we use).
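The control flow being discussed can be sketched with a simplified stand-in for `benchmark::State` (the real library's iteration is more involved; `MockState`, `KeepRunning`, and `run_benchmark` below are illustrative names only): after `SkipWithError()` the loop body runs zero times, but the loop itself must still be reached, which is why an early return is not an option.

```cpp
// Simplified stand-in for benchmark::State to illustrate why the
// benchmark loop must be entered even after SkipWithError().
struct MockState
{
    bool skipped = false;
    int remaining = 3;  // pretend the library scheduled 3 iterations
    void SkipWithError(const char*) { skipped = true; }
    bool KeepRunning() { return !skipped && remaining-- > 0; }
};

int run_benchmark(MockState& state, bool valid)
{
    if (!valid)
        state.SkipWithError("validation failed");
    // No early return: the loop must still be reached; it simply
    // performs zero iterations when the state is skipped.
    int iters = 0;
    while (state.KeepRunning())
        ++iters;
    return iters;
}
```

With a valid case the loop iterates normally; with an invalid one it is entered but runs zero times.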
Perhaps it's worth adding a comment that we need to call it even when validation fails.
This adds a set of "wrong" wasm benchmark cases.
Looks good to me, @gumb0 okay to merge?
Looks good, only a minor suggestion to add a comment
In case of SkipWithError, the benchmark loop must be reached anyway because of a libbenchmark limitation. Also, don't execute WasmEngine::find_function() if parsing fails.
Drop the assert which checks that the number of results is not greater than 1. This is not true for executions resulting in traps, and FizzyEngine is not the best place to check it.
This adds fizzy-bench integration tests and then fixes the crashes. The changes are fine to land as they don't modify any benchmarking loop.