Significant performance drop compared to node v0.10.16 #472

Closed

sufianrhazi opened this issue Jan 16, 2015 · 12 comments

@sufianrhazi

Hey folks,

At Etsy, we have an asset build system that is currently using node v0.10.16.

A full run of the build system is CPU bound, but performance drops significantly when using io.js v1.0.2:

  • node v0.10.16: 127079.67ms
  • io.js v1.0.2: 196066.5ms

I've created a synthetic benchmark (generating jquery source from an AST using escodegen), which represents a big portion of the time spent in this use case: https://github.com/sufianrhazi/escodegen-bench
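For reference, the core of a benchmark like this boils down to parsing a large file into an AST once and then timing only the code-generation step. A minimal sketch, assuming esprima and escodegen are installed (the harness in the linked repo may differ in its details):

    // bench.js (hypothetical name): time escodegen on jquery's AST
    var fs = require('fs');
    var esprima = require('esprima');
    var escodegen = require('escodegen');

    // Parse once, outside the timed region
    var source = fs.readFileSync('jquery.js', 'utf8');
    var ast = esprima.parse(source);

    // Time only the AST -> source generation step
    var start = process.hrtime();
    escodegen.generate(ast);
    var diff = process.hrtime(start);
    console.log('generate took %d ns', diff[0] * 1e9 + diff[1]);

Running the same script under each binary isolates the regression to the JS engine rather than the build system around it.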

Hope this helps! Is there any profiling info I can provide to suss out new bottlenecks?

@trevnorris
Contributor

I've attempted to reproduce the issue as best I could. Here are the results:

exec         time (ns)
v0.10.16     1368457789
v0.10.35     1393973208
v0.12 (node) 1931450232
v1.x  (iojs) 2032736628
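For anyone reproducing this, numbers like these come from running the same benchmark script under each installed binary, along these lines (install paths are hypothetical):

    $ ~/node-v0.10.16/bin/node bench.js   # one timed run per version
    $ ~/node-v0.12/bin/node bench.js
    $ ~/iojs-v1.x/bin/iojs bench.js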

The benchmark repo shows you're using the 2.6.32-220.4.1.el6.x86_64 kernel. @rvagg mentioned there may be a correlation if the binary was built on el5, but I'll let him speak to that.

@trevnorris
Contributor

Rehashing the numbers: I can reproduce a ~25-30% performance hit, which amounts to around 500-1000ms of additional execution time. I can't reproduce the ~85% performance hit stated in the benchmark repository, nor can I see how execution time reached the 100k+ ms range.

@rvagg
Member

rvagg commented Jan 16, 2015

To clarify: the binaries on iojs.org/dist/ for Linux are compiled on EL5 so that they work on Linuxes forward from there, but they are generated with GCC 4.8. I don't believe there should be a performance hit from compiling this way, but @sufianrhazi, could you clarify which binaries you are using? Did you create your own or download them from iojs.org?

@sufianrhazi
Author

I downloaded the precompiled 64 bit binaries from iojs.org.

I should have mentioned that my specific numbers were generated on a VM on a host with many other VMs, so the hard numbers will not be comparable. Regardless, the relative times are significant.

@bnoordhuis
Member

@sufianrhazi Would it be possible for you to run perf record -g -i -c 40000 (and perf report --stdio afterwards) for io.js and node.js v0.10 / v0.12? Putting the performance profiles side by side might provide some clues.
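Concretely, that workflow looks something like this (bench.js is a placeholder for the benchmark's entry point):

    # Sample with call graphs at a fixed event period, per binary
    $ perf record -g -i -c 40000 -- iojs bench.js
    $ perf report --stdio > iojs-profile.txt

    $ perf record -g -i -c 40000 -- node bench.js   # repeat for v0.10 / v0.12
    $ perf report --stdio > node-profile.txt

perf report reads the perf.data file left by the most recent perf record in the current directory, so either run each pair from its own directory or pass -o/-i to keep the profiles separate.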

@mraleph
Contributor

mraleph commented Jan 22, 2015

I took a look at the profiles and found at least one issue with V8: https://code.google.com/p/v8/issues/detail?id=3842

@jbergstroem
Member

I saw a fix was committed upstream. Looks like it lives in V8 4.2 as well. Anyone keen on trying -next?

@jdalton
Member

jdalton commented Apr 24, 2015

Within the last week, I noticed travis-ci has started timing out on lodash-cli builds, but only when testing iojs, because the build takes too long. Something may have happened recently to regress performance further.

@Fishrock123
Contributor

@jdalton do you have any more info? Since what version might this have happened? What part of it is timing out? (Opening a new issue may be better, though.)

@jdalton
Member

jdalton commented Apr 24, 2015

~~No more info at the moment; it looks like 1.8.1. It's the building of ~400 modules (that normally takes some time, but not the 10+ minutes that hits travis-ci's timeout). I'll dig in a bit more. No worries yet, just wanted to throw a +1 on the perf note.~~

Patched lodash-cli to improve its perf by more than 30x, which avoids the travis-ci timeout.
There's still a difference, but it's not critical now.

Node 0.12.2

Created 401 modules in 10 seconds.

io.js 1.8.1

Created 401 modules in 13 seconds.

@bnoordhuis
Member

I think we can close this now? Unless someone wants to take a look at the remaining gap?

@Qard
Member

Qard commented Aug 14, 2015

Closing as stale. Feel free to re-open if you disagree.
