
Add additional benchmarks #565

Merged: 4 commits into rusterlium:master on Mar 21, 2024
Conversation

@cleaton (Contributor) commented Sep 29, 2023

These are the benchmarks used to verify #482. At the time, there was no benchmark setup in Rustler yet, so I didn't include them in that PR. Now that there is one, they might be of use to others or for CI/CD integration.

@evnu (Member) commented Sep 29, 2023

Thanks! Can you fix clippy? Also, it would be nice to see some example results; could you simply post the machine you used and the output of the benchmark here? :) I also wonder what a good structure would be for finding "interesting" benchmarks if the set of benchmarks in the repository grows. Maybe we should have some logical structure w.r.t. what benchmarks we have.

@cleaton (Contributor, Author) commented Sep 29, 2023

@evnu yes, I agree. For now I have included all the benchmarks I used, even though there is a bit of overlap with the existing benchmarks, just to start a discussion.

What about having a single Benchmark.run() that can read command-line arguments and generate the Benchee options? That way, developers can run a single specific test locally without commenting out code, while there is also a single place for a CI/CD job to call when running all benchmarks as part of a regression job (e.g. using https://github.com/bencheeorg/benchee#saving-loading-and-comparing-previous-runs).
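
A minimal sketch of what such an entry point could look like. The `MyProject.Native` NIF module and the job names are hypothetical; only documented Benchee options (`time:`, `save:`) are used:

```elixir
defmodule MyProject.Benchmark do
  # Registry of available benchmark jobs. The NIF calls below are
  # hypothetical stand-ins for whatever the project actually benchmarks.
  @benchmarks %{
    "encode" => fn -> MyProject.Native.encode(:foo) end,
    "decode" => fn -> MyProject.Native.decode("foo") end
  }

  # Entry point, e.g.:
  #   mix run -e "MyProject.Benchmark.run()" -- decode --save out.benchee
  def run(argv \\ System.argv()) do
    {opts, names, _invalid} =
      OptionParser.parse(argv, strict: [time: :integer, save: :string])

    # With no job names given, run everything; otherwise only the named jobs.
    jobs = if names == [], do: @benchmarks, else: Map.take(@benchmarks, names)

    benchee_opts = [time: Keyword.get(opts, :time, 5)]

    # `--save` writes the run to disk so a CI job can later compare against
    # it (see the Benchee docs linked above).
    benchee_opts =
      case opts[:save] do
        nil -> benchee_opts
        path -> Keyword.put(benchee_opts, :save, path: path)
      end

    Benchee.run(jobs, benchee_opts)
  end
end
```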

@cleaton (Contributor, Author) commented Sep 30, 2023

As an addition, it would be nice if Rustler also had a standard way for Rustler projects to add benchmarks/tests that would automatically be set up for PGO, following these steps: https://doc.rust-lang.org/rustc/profile-guided-optimization.html#a-complete-cargo-workflow

Something that could be configured in the project's Native.ex:

use Rustler,
    pgo_run: <Project>.Benchmark.run()

Rustler would then combine its internal benchmarks with the project-provided benchmark to merge a combined instrumentation profile. By default, if none is provided, Rustler could still use its internal benchmarks to run PGO.
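
To make the idea concrete, here is a purely illustrative sketch of how a hypothetical Mix task could drive the cargo PGO workflow from the linked rustc documentation. The module names and paths are invented; only the RUSTFLAGS values and the llvm-profdata merge step follow the documented workflow:

```elixir
defmodule Mix.Tasks.Rustler.Pgo do
  use Mix.Task

  @profile_dir "/tmp/pgo-data"

  def run(_args) do
    # 1. Build the NIF with profiling instrumentation.
    cargo_build("-Cprofile-generate=#{@profile_dir}")

    # 2. Exercise the instrumented build, e.g. via the project's
    #    `pgo_run` benchmark (hypothetical, as proposed above).
    MyProject.Benchmark.run([])

    # 3. Merge the raw profiles into a single profile file.
    {_, 0} =
      System.cmd("llvm-profdata", [
        "merge",
        "-o",
        Path.join(@profile_dir, "merged.profdata"),
        @profile_dir
      ])

    # 4. Rebuild using the merged profile.
    cargo_build("-Cprofile-use=" <> Path.join(@profile_dir, "merged.profdata"))
  end

  defp cargo_build(rustflags) do
    {_, 0} =
      System.cmd("cargo", ["build", "--release"], env: [{"RUSTFLAGS", rustflags}])
  end
end
```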

I guess PGO would automatically perform some of the optimizations that are being done manually in #475.

@evnu (Member) commented Oct 4, 2023

> What about having a single Benchmark.run() that can read command-line arguments and generate the Benchee options? That way, developers can run a single specific test locally without commenting out code, while there is also a single place for a CI/CD job to call when running all benchmarks as part of a regression job.

W.r.t. regression testing in CI/CD: in my experience, that seldom works well unless one puts in considerable effort. CI/CD runs on non-dedicated machines, so results can vary wildly. I wouldn't put too much work into that.

Enabling running all benchmarks with a single command sounds good. I guess for that we would need to make the benchmarks coherent, so that the runs "tell a story". Otherwise, someone running the benchmarks still needs to read the benchmark code first to understand what it is all about.

> As an addition, it would be nice if Rustler also had a standard way for Rustler projects to add benchmarks/tests that would automatically be set up for PGO.

Interesting suggestion. This goes beyond the scope of this PR, though; maybe create a separate ticket just for that? I have never played with PGO yet, so I don't know how complicated it is to use.

@cleaton (Contributor, Author) commented Oct 9, 2023

> Interesting suggestion. This goes beyond the scope of this PR, though; maybe create a separate ticket just for that? I have never played with PGO yet, so I don't know how complicated it is to use.

Neither have I. Yes, I think a separate PR is a better fit. I'll leave this PR here for now and create another one for PGO; we can come back to this one later to see if it's still of interest.

@cleaton mentioned this pull request Oct 14, 2023
@evnu self-requested a review October 16, 2023 19:07
@evnu requested a review from a team November 21, 2023 08:53
@evnu (Member) commented Nov 21, 2023

@cleaton it seems I can't make the GitHub Actions run again. Can you rebase on master so that the tests run again?

@filmor merged commit 199e48d into rusterlium:master on Mar 21, 2024
34 checks passed