
http performance test result is bad #8246

Closed

sushi90 opened this issue Aug 24, 2016 · 6 comments
Labels
http (Issues or PRs related to the http subsystem), performance (Issues and PRs related to the performance of Node.js)

Comments

@sushi90 (Contributor) commented Aug 24, 2016

  • Version: v6.4.0
  • Platform: Linux 3.13.0-92-generic #139-Ubuntu SMP Tue Jun 28 20:42:26 UTC 2016 x86_64 x86_64 x86_64 GNU/Linux
  • Subsystem: none

I ran an ab test comparing Node.js and PHP, but the result confuses me. The machine has a 16-core CPU, so I started 16 processes for both Node.js and PHP. The ab command is: ab -c 1000 -n 100000. The results are below:

php:
Requests per second:    91116.93 [#/sec] (mean)
Time per request:       5.487 [ms] (mean)
Time per request:       0.011 [ms] (mean, across all concurrent requests)
Transfer rate:          14681.93 [Kbytes/sec] received

Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    0  12.7      0    1008
Processing:     0    5   3.3      5     224
Waiting:        0    5   3.3      5     224
Total:          0    5  13.2      5    1021

nodejs:
Requests per second:    17698.54 [#/sec] (mean)
Time per request:       56.502 [ms] (mean)
Time per request:       0.057 [ms] (mean, across all concurrent requests)
Transfer rate:          1486.40 [Kbytes/sec] received

Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0   28 160.1      3    3039
Processing:     0   21  59.6      7    1641
Waiting:        0   20  59.6      6    1640
Total:          1   49 172.9     10    3426

The Node.js code is below:

'use strict'
const cluster = require('cluster')
const http = require('http')
const cpuNums = require('os').cpus().length

if (cluster.isMaster) {
  cluster.schedulingPolicy = cluster.SCHED_NONE
  for (var i = 0; i < cpuNums; i++) {
    cluster.fork()
  }

  cluster.on('exit', (worker, code, signal) => {
    console.log(`worker ${worker.process.pid} died`)
  })
} else {
  // Workers can share any TCP connection
  // In this case it is an HTTP server
  http.createServer((req, res) => {
    res.writeHead(200)
    res.end('hello world')
  }).listen(8002)
}

If I don't set cluster.schedulingPolicy = cluster.SCHED_NONE, the performance is even worse.
Is there something wrong with my code, or is this normal?

@addaleax added the http and performance labels Aug 24, 2016
@bnoordhuis (Member)

Can you post your php script and the configuration of the web server? Did you also benchmark single-process mode, i.e., without cluster?

@sushi90 (Contributor, Author) commented Aug 24, 2016

Without cluster, in single-process mode, the ab command is: ab -c 50 -n 10000. The benchmark results are below.

php
Concurrency Level:      50
Time taken for tests:   0.656 seconds
Complete requests:      10000
Failed requests:        0
Total transferred:      2480000 bytes
HTML transferred:       120000 bytes
Requests per second:    15253.27 [#/sec] (mean)
Time per request:       3.278 [ms] (mean)
Time per request:       0.066 [ms] (mean, across all concurrent requests)
Transfer rate:          3694.15 [Kbytes/sec] received

Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    1   0.9      1      16
Processing:     0    1   1.3      1      18
Waiting:        0    1   1.3      1      18
Total:          1    2   1.7      2      18

nodejs
Concurrency Level:      50
Time taken for tests:   1.514 seconds
Complete requests:      10000
Failed requests:        0
Total transferred:      860000 bytes
HTML transferred:       110000 bytes
Requests per second:    6603.19 [#/sec] (mean)
Time per request:       7.572 [ms] (mean)
Time per request:       0.151 [ms] (mean, across all concurrent requests)
Transfer rate:          554.56 [Kbytes/sec] received

Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    0   0.6      0       9
Processing:     3    7   7.3      7     210
Waiting:        3    7   7.3      7     210
Total:          4    8   7.4      7     212

PHP version: 5.6.24, using the swoole extension, version 1.8.10. The PHP code is below:

<?php
$server = new \Swoole\Http\Server('0.0.0.0', '8001');
$server->set(array(
    'worker_num'    => 1,
    'daemonize'     => 1,
    'dispatch_mode' => 3,
    'open_tcp_nodelay'  => 1,
));
$server->on('Request', function(\Swoole\Http\Request $request, \Swoole\Http\Response $response){
    $response->header('Last-Modified', 'Thu, 18 Jun 2015 10:24:27 GMT');
    $response->header('E-Tag', '55829c5b-17');
    $response->header('Accept-Ranges', 'bytes');
    $response->status(200);
    $response->end("hello world\n");
});
$server->start();

Node.js code:

'use strict'
// const cluster = require('cluster')
const http = require('http')
const cpuNums = require('os').cpus().length

// if (cluster.isMaster) {
//   cluster.schedulingPolicy = cluster.SCHED_NONE
//   for (var i = 0; i < cpuNums; i++) {
//     cluster.fork()
//   }

//   cluster.on('exit', (worker, code, signal) => {
//     console.log(`worker ${worker.process.pid} died`)
//   })
// } else {
  // Workers can share any TCP connection
  // In this case it is an HTTP server
  http.createServer((req, res) => {
    res.writeHead(200)
    res.end('hello world')
  }).listen(8002)
// }

With the keep-alive option, the result is even worse. The ab command: ab -c 50 -n 10000 -k. The results are below:

php
Concurrency Level:      50
Time taken for tests:   0.149 seconds
Complete requests:      10000
Failed requests:        0
Keep-Alive requests:    10000
Total transferred:      2530000 bytes
HTML transferred:       120000 bytes
Requests per second:    67241.36 [#/sec] (mean)
Time per request:       0.744 [ms] (mean)
Time per request:       0.015 [ms] (mean, across all concurrent requests)
Transfer rate:          16613.34 [Kbytes/sec] received

Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    0   0.1      0       1
Processing:     0    1   0.1      1       2
Waiting:        0    1   0.1      1       2
Total:          0    1   0.1      1       2

nodejs
Concurrency Level:      50
Time taken for tests:   1.349 seconds
Complete requests:      10000
Failed requests:        0
Keep-Alive requests:    0
Total transferred:      860000 bytes
HTML transferred:       110000 bytes
Requests per second:    7414.12 [#/sec] (mean)
Time per request:       6.744 [ms] (mean)
Time per request:       0.135 [ms] (mean, across all concurrent requests)
Transfer rate:          622.67 [Kbytes/sec] received

Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    0   0.5      0       8
Processing:     3    6   2.0      6      17
Waiting:        3    6   2.0      6      17
Total:          4    7   2.0      6      17

@bnoordhuis (Member)

One thing you're doing in your php code that you aren't doing in the node.js script is enabling TCP nodelay. Another is setting a static Date header, whereas the node.js script has to generate one for you (something that is more expensive than you might think it is).

You should investigate whether both servers are doing roughly the same unit of work per request. If swoole is taking shortcuts left and right, it's going to be faster but not necessarily more correct.

Meaningful comparative benchmarks are a pretty broad subject. For example, how frequently does the garbage collector run in your php script vs. node.js? How many system calls do they make in total / per request? What is peak and average memory consumption like? And so on.
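
As a rough illustration of the "same work per request" point (a sketch, not code posted in this thread; port 8002, the body, and the static header values are copied from the earlier examples), a more like-for-like Node.js handler could look like this:

'use strict'
// A sketch: a Node.js handler doing roughly the same per-request work as the
// swoole script -- the same static headers and an explicit Content-Length.
const http = require('http')

const body = 'hello world\n'

http.createServer((req, res) => {
  // res.sendDate = false // optional: skip the auto-generated Date header
  res.writeHead(200, {
    'Content-Length': Buffer.byteLength(body),
    'Last-Modified': 'Thu, 18 Jun 2015 10:24:27 GMT',
    'E-Tag': '55829c5b-17',
    'Accept-Ranges': 'bytes'
  })
  res.end(body)
}).listen(8002)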

@sushi90 (Contributor, Author) commented Aug 24, 2016

Meaningful comparative benchmarks are a pretty broad subject. For example, how frequently does the garbage collector run in your php script vs. node.js? How many system calls do they make in total / per request? What is peak and average memory consumption like? And so on.

I agree with your point of view.
The good news is that when I set 'Content-Length' in the HTTP header, the performance improves hugely, but I don't know why. My guess is that if Content-Length is not set, Transfer-Encoding: chunked is used by default, and that leads to the slower performance?
Lastly, does the HTTP server use Nagle's algorithm by default or not? I can't find a method in the docs to set the HTTP response to nodelay.
Thanks very much.
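
For illustration, these are the two variants being compared (a sketch, not code from the thread; the second port number is arbitrary). Without a Content-Length, Node.js does not know the body size up front and frames the response with Transfer-Encoding: chunked for HTTP/1.1 clients; with the length set, it can send a plain fixed-length response.

const http = require('http')

// Variant 1: no Content-Length, so the response is sent with
// Transfer-Encoding: chunked (for HTTP/1.1 requests).
http.createServer((req, res) => {
  res.writeHead(200)
  res.end('hello world\n')
}).listen(8002)

// Variant 2: Content-Length known up front, no chunked framing needed.
http.createServer((req, res) => {
  const body = 'hello world\n'
  res.writeHead(200, {
    'Content-Type': 'text/plain',
    'Content-Length': Buffer.byteLength(body)
  })
  res.end(body)
}).listen(8003)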

@silverwind (Contributor) commented Aug 24, 2016

Lastly, does the HTTP server use Nagle's algorithm by default or not?

That's platform-specific right now. Linux generally defaults to enabled TCP_NODELAY, while other platforms generally do not. There's #906 about enabling it everywhere.

I can't find a method in the docs to set the HTTP response to nodelay.

You'd do something like this:

server.on('connection', function(socket) {
  socket.setNoDelay(true);
});

Let us know if it helps your performance. It might help our decision on enabling it.
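
Wired into a complete program, that suggestion would look roughly like this (a sketch assuming the same plain "hello world" server on port 8002 from earlier in the thread):

const http = require('http')

const server = http.createServer((req, res) => {
  res.writeHead(200)
  res.end('hello world')
})

// Disable Nagle's algorithm on every TCP connection the server accepts.
server.on('connection', (socket) => {
  socket.setNoDelay(true)
})

server.listen(8002)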

@sushi90 (Contributor, Author) commented Aug 25, 2016

I think if you want to set HTTP nodelay, the code should look like this:

http.createServer((req, res) => {
  res.connection.setNoDelay(false)
  var body = 'hello world\n'
  res.writeHead(200, {
    'Content-Length': body.length,
    'Content-Type': 'text/plain'
  })
  res.end(body)
}).listen(8002)

Whether it is set to false or not, the performance is the same. So I think the key point of my problem is whether 'Content-Length' is set in the HTTP header or not.

@sushi90 closed this as completed Aug 25, 2016