Benchmarking compiler's peak memory use #15450
Conversation
Locally I get an error even though the directory does contain files. Is there something wrong with my setup? Do I have to source something beforehand?
(force-pushed from 7d614eb to 95c1971)
No, it should be enough to just run the script the way you're doing it. I think you just don't have GNU time installed. I added a check against that, so now you should at least be getting a more understandable error message.
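A check like that could be sketched as follows (hypothetical; the wording in the actual PR script may differ). It relies on the fact that GNU time supports `--version`, while the bash built-in and BSD `time` do not:

```shell
# Hypothetical sketch of a GNU time availability check.
# Only the standalone GNU time binary understands --version.
if /usr/bin/time --version > /dev/null 2>&1; then
    echo "GNU time found."
else
    echo "Error: this script requires GNU time (often packaged as 'time')." >&2
fi
```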
I ran the benchmark on the 0.8.24 and 0.8.27 release binaries to test it and to see how the memory usage changed. Looks like it decreased slightly in most cases. I also included timing numbers to get some perspective on how far we have come since then.

Note that the runs that ended due to an error differ more, but I think they must have just crashed at different points. At least in the case of OZ, the 0.8.27 run did finish while the 0.8.24 one didn't. For this reason I also didn't wait for Sablier to finish on 0.8.24; it would probably run out of memory on my system anyway.
- … and round it for external projects
- via-ir -> ir
- Centered headers
(force-pushed from 95c1971 to 9bf850b)
That was precisely the issue! It's still a relatively fresh installation on this machine, and the package was indeed missing :)
This PR adds an extra column showing the maximum resident set size of the process during its lifetime, as reported by `/usr/bin/time`. In local benchmarks this is `solc` alone, while in external ones it is the whole `forge` command.

It also updates the scripts to report CPU time rather than wall-clock time. Since solc is single-threaded the two should mostly match, but CPU time is what we're really interested in (we could very easily start reporting both if that ever changes).
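For illustration, here is a minimal Python sketch (not the PR's actual scripts, which are shell-based) of measuring a child process's peak memory and CPU time, analogous to what GNU time reports via its `%M`, `%U`, and `%S` format specifiers:

```python
# Minimal sketch of measuring a child process's peak memory and CPU time,
# similar in spirit to what /usr/bin/time reports. Names here are illustrative.
import resource
import subprocess
import sys


def run_measured(cmd):
    """Run cmd; return (max_rss_kib, cpu_seconds) accumulated over children."""
    subprocess.run(cmd, check=True)
    usage = resource.getrusage(resource.RUSAGE_CHILDREN)
    # ru_maxrss is in KiB on Linux (bytes on macOS).
    # CPU time = user + system time, not wall-clock time.
    return usage.ru_maxrss, usage.ru_utime + usage.ru_stime


if __name__ == "__main__":
    # Child allocates ~50 MB so its peak RSS is clearly visible.
    rss, cpu = run_measured([sys.executable, "-c", "x = b'a' * 50_000_000"])
    print(f"max RSS: {rss} KiB, CPU time: {cpu:.2f} s")
```

The key distinction is that user + system time is CPU time: for a single-threaded process it is at most the wall-clock time and is not inflated by the machine being otherwise busy.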
There are also a few small tweaks, like time rounding in external benchmarks and cosmetic changes to how the table looks.
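The rounding is mainly a readability matter: external benchmarks run for minutes, so sub-second precision is just noise in the table. A hypothetical formatter (not the PR's actual code) might look like:

```python
# Hypothetical helper illustrating the kind of rounding mentioned above.
def format_time(seconds: float) -> str:
    """Round a duration to whole seconds for display in a results table."""
    return f"{round(seconds)} s"


print(format_time(127.8361))
print(format_time(3.4))
```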