Remove argue.js to improve performance by almost 40% #11707
Conversation
Thanks for your pull request. The automated tests will run as soon as one of the admins verifies this change is ok for us to run on our infrastructure.
This is OK to test
LGTM
deserialize: Function,
argument: null, metadata: [Metadata, new Metadata()],
options: [Object], callback: Function}, arguments);
/* Remove argue.js and parse arguments array explicitly as it reduces overhead by almost 40% */
I don't think this comment is really necessary actually.
I was evaluating the QPS performance of gRPC on Java and Node.js for an internal project when I was surprised by the low QPS numbers for Node.js. Digging in, I found that a big chunk of time was spent in the argue.js call that fills in default and optional function parameters (see below for the numbers and a flame graph of the profiler output), so I modified the code to fill in the default and optional parameters explicitly. The Node.js QPS benchmark results can be reproduced using this benchmark driver.
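To illustrate the change, here is a minimal sketch of replacing a generic argue.js-style normalizer with explicit argument parsing. The function name `makeRequest` and the parameter shapes are hypothetical simplifications, not the actual gRPC source; only the optional-parameter pattern (trailing callback, optional metadata/options) mirrors the one described in this PR.

```javascript
// Hypothetical example: explicit parsing of optional parameters,
// avoiding a reflection-based helper like argue.js on the hot path.
function makeRequest(argument, metadata, options, callback) {
  // Callers may omit metadata and/or options:
  //   makeRequest(arg, cb)
  //   makeRequest(arg, metadata, cb)
  //   makeRequest(arg, metadata, options, cb)
  if (typeof metadata === 'function') {
    callback = metadata;
    metadata = {};
    options = {};
  } else if (typeof options === 'function') {
    callback = options;
    options = {};
  }
  metadata = metadata || {};
  options = options || {};
  callback(null, {argument: argument, metadata: metadata, options: options});
}

// Omitted optional parameters get defaults with plain typeof checks
// instead of a generic argument-parsing library.
makeRequest('payload', function (err, parsed) {
  console.log(parsed.metadata); // defaults to {}
});
```

The explicit branches cost a couple of `typeof` checks per call, whereas a generic parser has to inspect a schema object and the `arguments` array on every invocation, which is what showed up in the profile.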
Machine Configuration
12GB RAM, 4 VCPU
Hardened Ubuntu 14.04
JDK 1.8 JVM / Node.js v6.10
Canned Request (1K) / Response (4K)
Java QPS
Channels: 1
Outstanding RPCs per Channel: 10
Server Payload Size: 4096
Client Payload Size: 1024
50%ile Latency (in micros): 915
90%ile Latency (in micros): 1607
95%ile Latency (in micros): 1719
99%ile Latency (in micros): 1879
99.9%ile Latency (in micros): 4223
Maximum Latency (in micros): 24191
QPS: 9715
Node.js QPS with argue.js
Channels: 1
Outstanding RPCs per Channel: 10
Server Payload Size: 4096
Client Payload Size: 1024
50%ile Latency (in micros): 2883
90%ile Latency (in micros): 3611
95%ile Latency (in micros): 3815
99%ile Latency (in micros): 10783
99.9%ile Latency (in micros): 13231
Maximum Latency (in micros): 32303
Request Count: 1986074
QPS: 3310
User Time: 560.490193
System Time: 39.202487
Elapsed Time: 600.001344083
Node.js QPS without argue.js
Channels: 1
Outstanding RPCs per Channel: 10
Server Payload Size: 4096
Client Payload Size: 1024
50%ile Latency (in micros): 2032
90%ile Latency (in micros): 2621
95%ile Latency (in micros): 2851
99%ile Latency (in micros): 3583
99.9%ile Latency (in micros): 17887
Maximum Latency (in micros): 28463
Request Count: 2712765
QPS: 4521.272519230517
User Time: 547.966842
System Time: 53.169124
Elapsed Time: 600.000329213
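As a quick sanity check of the "almost 40%" claim in the title, the throughput improvement follows directly from the two QPS figures reported above:

```javascript
// Relative QPS improvement from the benchmark runs above.
const qpsWithArgue = 3310;
const qpsWithout = 4521.272519230517;
const improvement = (qpsWithout - qpsWithArgue) / qpsWithArgue * 100;
console.log(improvement.toFixed(1) + '%'); // ≈ 36.6%
```

So the measured gain is roughly 37%, consistent with "almost 40%".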
Flame Graphs