http://nodejs.org/api/stdio.html#stdio_console says that
"The console functions are synchronous when the destination is a terminal or a file (to avoid lost messages in case of premature exit) and asynchronous when it's a pipe (to avoid blocking for long periods of time)."
And then it gives the following code to test it for ourselves:
$ node script.js 2> error.log | tee info.log
So I created a very simple pipe.js as follows:
var data = '...................................................';
for (var i = 0, l = 22; i < l; i++) {
  data += data;
}

var start = Date.now();
console.log(data);
console.log('wrote %d bytes in %dms', data.length, Date.now() - start);

process.on('exit', function() {
  console.log('exited');
  console.log('Total processing time: ' + (Date.now() - start) + 'ms.');
});

process.on('error', function() {
  console.log('err');
});
When I run
node pipe.js 2> error.log | tee info.log
I get
wrote 213909504 bytes in 18084ms
exited
Total processing time: 18114ms.
at the end of info.log.
Since the total processing time (18114ms) is almost entirely accounted for by the 18084ms spent in the console.log call, it looks like the write operation is blocking.
(Running pipe.js as a daemon via forever, or under nohup, produced similar output.)
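Incidentally, a quick way to double-check what each of these invocations actually hands the process is to stat fd 1 directly. A minimal sketch (the file name check-stdout.js is made up; it only uses the standard fs API, and reports to stderr so the output stays out of whatever stdout is piped into):

var fs = require('fs');

var stats = fs.fstatSync(1);   // fstat() on stdout's file descriptor

if (process.stdout.isTTY) {
  console.error('stdout is a TTY');
} else if (stats.isFIFO()) {
  console.error('stdout is a pipe');
} else if (stats.isFile()) {
  console.error('stdout is a regular file');
} else {
  console.error('stdout is something else (e.g. a socket)');
}

Running it bare, behind | tee info.log, and behind > baz.log should report a TTY, a pipe, and a regular file respectively.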
For instance, redirecting stdout to a file instead of a pipe:
$ node pipe.js > baz.log; cat baz.log | grep ms
gives
wrote 213909504 bytes in 1633ms
Total processing time: 1634ms.
The values, albeit smaller, are again really close, which hints at a blocking operation.
...
As a counter-example, if we use a file stream for the very same data:
var fs = require('fs'),
    stream;

function initialize(file) {
  if (typeof file === 'string') {
    stream = fs.createWriteStream(
      file, {flags: 'a+', encoding: 'utf8'}
    );
    return;
  }
  stream = file;
}

initialize('out.log');

function log(stuff) {
  stream.write(stuff + '\n');
}

var data = '...................................................';
for (var i = 0, l = 22; i < l; i++) {
  data += data;
}

var start = Date.now();
log(data);
log(data);
log(data);
log(data);
log(data);
log(data);
log(data);
log(data);
log(data);
console.log('wrote %d bytes in %dms', data.length, Date.now() - start);

process.on('exit', function() {
  console.log('exited');
  console.log('Total processing time: ' + (Date.now() - start) + 'ms.');
});
The output is:
wrote 213909504 bytes in 3554ms
exited
Total processing time: 12596ms.
The ~10sec difference between the two timestamps shows that the I/O operation is being done asynchronously.
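To pin down when those buffered writes actually complete, one can end the stream and time its 'finish' callback. A self-contained sketch along the lines of the example above (file name and data size are arbitrary):

var fs = require('fs');

var data = '...................................................';
for (var i = 0; i < 22; i++) {
  data += data;               // same doubling trick as above, ~200MB of dots
}

var stream = fs.createWriteStream('out.log', {flags: 'a+', encoding: 'utf8'});
var start = Date.now();

stream.write(data + '\n');
console.log('write() returned after ' + (Date.now() - start) + 'ms');

// end() flushes whatever is still buffered; its callback is attached to 'finish'
stream.end(function() {
  console.log('all data flushed after ' + (Date.now() - start) + 'ms');
});

The gap between the two numbers should correspond to the asynchronous flushing that accounts for the difference above.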
...
In addition, when I look at the source code, I see that console.log does some minimal formatting and then hands the write off to process.stdout.
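(For reference, a console.log call in the node source of that era boils down to roughly the following; this is a paraphrase, not the exact implementation.)

var util = require('util');

function log() {
  // printf-style formatting of the arguments, then a single write to
  // process.stdout -- so any blocking behaviour comes from that write
  process.stdout.write(util.format.apply(null, arguments) + '\n');
}

log('wrote %d bytes in %dms', 1024, 42);

And the documentation of process says (ref: http://nodejs.org/api/process.html#process_process_stdout) that: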
"process.stderr and process.stdout are unlike other streams in Node in that writes to them are usually blocking.
They are blocking in the case that they refer to regular files or TTY file descriptors.
In the case they refer to pipes:
They are blocking in Linux/Unix.
They are non-blocking like other streams in Windows."
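One way to observe which of these cases applies is to time a single process.stdout.write() call against its completion callback. A sketch (timing messages go to stderr so they stay out of the stream being measured):

var chunk = new Array(4 * 1024 * 1024).join('.');   // a few MB of dots
var start = Date.now();

process.stdout.write(chunk, function() {
  // fires once the chunk has been flushed out of the writable
  console.error('callback fired after ' + (Date.now() - start) + 'ms');
});

console.error('write() returned after ' + (Date.now() - start) + 'ms');

If the write is asynchronous, write() returns almost immediately and the callback lags behind; if it is blocking, both numbers come out close to each other.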
...
My logical conclusion is that console.log is always blocking on Linux-like systems, and the documentation at http://nodejs.org/api/stdio.html#stdio_console is outdated.
Does that make sense?
(tested on Mac OS X 10.9.4 and Ubuntu 14.04.1, with similar results)

You are very correct! This is actually being tracked right now as an issue in node 4: nodejs/node#784. I'd recommend pushing discourse to there for now, and then if a fix lands, we can consider backporting it.