
bench: Fix execution pre-validation #340

Merged
merged 5 commits into from
Jun 2, 2020
Conversation

axic
Member

@axic axic commented May 26, 2020

This adds fizzy-bench integration tests and then fixes crashes.

Maybe we should merge this after 0.2.0, in case it would make comparison to 0.1.0 harder?
Or we could release a patch (0.1.1) with this commit.

The changes are fine to land as they don't modify any benchmarking loop.

@axic axic requested a review from chfast May 26, 2020 11:11
Collaborator

@chfast chfast left a comment

All of this is checked in validate_benchmark_case, where a test run is performed. I don't see the point of repeating this.

@axic
Member Author

axic commented May 26, 2020

All of this is checked in validate_benchmark_case, where a test run is performed. I don't see the point of repeating this.

I may have missed it, but I do get a crash in execution on inputs which return a failure in the instantiate case.

@chfast
Collaborator

chfast commented May 26, 2020

I may have missed it, but I do get a crash in execution on inputs which return a failure in the instantiate case.

Please investigate where exactly the problem is and why validate_benchmark_case is not able to catch it.

@axic
Member Author

axic commented May 26, 2020

@chfast pushed your fixes here, but feel free to close.

@axic axic requested a review from chfast May 27, 2020 12:31
@chfast chfast changed the title bench: make sure benchmark_execute also detects errors bench: Fix execution pre-validation May 29, 2020
@codecov

codecov bot commented May 29, 2020

Codecov Report

Merging #340 into master will decrease coverage by 0.00%.
The diff coverage is 100.00%.

@@            Coverage Diff             @@
##           master     #340      +/-   ##
==========================================
- Coverage   98.85%   98.84%   -0.01%     
==========================================
  Files          41       42       +1     
  Lines       11854    12134     +280     
==========================================
+ Hits        11718    11994     +276     
- Misses        136      140       +4     

@chfast chfast force-pushed the bench-exec branch 2 times, most recently from cff2264 to 59c369f Compare May 29, 2020 11:27
# Copyright 2020 The Fizzy Authors.
# SPDX-License-Identifier: Apache-2.0

add_test(
Member Author

Hmm, couldn't we run spectest smoketests the same way?

Collaborator

Sure. I have not connected the dots.

@chfast chfast force-pushed the bench-exec branch 7 times, most recently from 0a113b1 to b28ad5e Compare June 1, 2020 15:36
@@ -100,7 +100,6 @@ WasmEngine::Result FizzyEngine::execute(
{
const auto [trapped, result_stack] =
fizzy::execute(*m_instance, static_cast<uint32_t>(func_ref), args);
assert(result_stack.size() <= 1);
Member Author

This just shows fizzy's "public api" is broken right now :)

Hopefully after the span changes go in, a version of #219 can be merged to fix this.

Collaborator

I was confused at first, but indeed we also dump the whole stack in the case when execution traps.
I can add a disabled test for such a case, but I don't think this really helps.

@axic
Member Author

axic commented Jun 1, 2020

Looks good to me, but I'm a bit worried that if this goes in before #331, then #331 will need to be adjusted further to avoid a drop in coverage.

@chfast
Collaborator

chfast commented Jun 1, 2020

Looks good to me, but I'm a bit worried that if this goes in before #331, then #331 will need to be adjusted further to avoid a drop in coverage.

No problem, this can go after #331.

engine->instantiate(*benchmark_case.wasm_binary);
const auto func_ref = engine->find_function(benchmark_case.func_name, benchmark_case.func_sig);
std::optional<fizzy::test::WasmEngine::FuncRef> func_ref;
if (ok)
Collaborator

Can you explain the changes here in this function?
It looks like when validation is not ok, you would still go and dereference func_ref at line 185

Collaborator

Ah, I guess it works because validate_benchmark_case would call state.SkipWithError and the loop below will not be executed.

But why not just early return then:

  if (!validate_benchmark_case(...))
    return;

Collaborator

Yes. The loop will not run, but we need to call auto _ : state anyway (this is a libbenchmark limitation in the version we use).

Collaborator

Perhaps it's worth adding a comment that we need to call it even when validation fails.

@axic
Member Author

axic commented Jun 2, 2020

Looks good to me, @gumb0 okay to merge?

Collaborator

@gumb0 gumb0 left a comment

Looks good, only a minor suggestion to add a comment

chfast added 3 commits June 2, 2020 18:58
In case of SkipWithError, the benchmark loop must be reached anyway
because of the libbenchmark limitation.

Also don't execute WasmEngine::find_function() if parsing fails.
Drop the assert which checks that the number of results is not greater than 1.
This is not true for executions resulting in traps, and FizzyEngine
is not the best place to check it.
@chfast chfast merged commit d0482be into master Jun 2, 2020
@chfast chfast deleted the bench-exec branch June 2, 2020 17:12