Why does node/io.js hang when printing to the console inside a loop? #1741

Closed

pgkos opened this issue May 20, 2015 · 22 comments
Labels
help wanted: Issues that need assistance from volunteers or PRs that need help to proceed.
memory: Issues and PRs related to the memory management or memory footprint.
process: Issues and PRs related to the process subsystem.

Comments

pgkos commented May 20, 2015

Sorry if it is a dumb question/issue, but why does this code:

for (i = 0; i < 1000000; i++) {
  console.log('abcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabc' + i);
}

hang node for a few seconds (at around i = 1500) before resuming? During the hang, the node process's memory usage grows very quickly and CPU usage is very high.

On the other hand, if I write:

for (i = 0; i < 1000000; i++) {
  console.log(i);
}

then the node process does not hang at all.

The reason I am asking is that I have a piece of code which involves printing inside a loop, and it also hangs, but for much longer (like 30-60 seconds, before resuming again). The code prints the paths of all files in a directory (walking recursively): https://gist.github.com/pgkos/f0a650daf56aa49899e9.

The issue occurs on both io.js v2.0.2 and node.js v0.12.2, on Ubuntu 64-bit.

pgkos changed the title from "Why does node/io.js hang when printing to a console inside a loop?" to "Why does node/io.js hang when printing to the console inside a loop?" May 20, 2015
@bnoordhuis (Member)

I can think of a couple of reasons. The first snippet creates a million strings (i.e. does a million heap allocations) whereas the second one does not; you can verify that with the --trace_gc flag.

What's more, if the first argument to console.log() is a string, it's considered a format string that needs to be parsed. In the first snippet, that works out to about 60 MB of memory that is scanned. You can avoid that by using process.stdout.write() instead.

Apropos memory usage, console.log() and process.stdout.write() are mostly asynchronous under the hood. When you call either of them, a 'write to the console' request is put together that is then handed off to libuv.

It's not until the next tick of the event loop that the memory of that request is freed; that includes the actual string that is written. In both snippets, one million requests are in flight but in the first one, they're much bigger.
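
For illustration, here is the first snippet rewritten as suggested; this is only a sketch to show the API difference (run it with the --trace_gc flag to compare), not a claim that it removes the hang entirely, since a string is still allocated on every iteration:

var prefix = 'abcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabc';
for (var i = 0; i < 1000000; i++) {
  // A plain string passed to process.stdout.write skips console.log's
  // format-string handling; note that a new string is still allocated here.
  process.stdout.write(prefix + i + '\n');
}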

Fishrock123 added the question label (Issues that look for answers.) May 20, 2015
mscdex added the console label (Issues and PRs related to the console subsystem.) May 20, 2015
ChALkeR (Member) commented May 20, 2015

Actually, + i is not needed there.

var x = process.hrtime();
for (var i = 0; i < 1000000; i++) {
    console.log('abcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabc1234567890');
}
console.log(process.hrtime(x));

is almost equally slow.

ChALkeR (Member) commented May 20, 2015

@bnoordhuis

What's more, if the first argument to console.log() is a string, it's considered a format string that needs to be parsed. In the first snippet, that works out to about 60 MB of memory that is scanned. You can avoid that by using process.stdout.write() instead.

See #1749. It does not fix this issue, but makes things slightly better.

@Fishrock123 (Contributor)

Simple answer: the code in question blocks the process for a bit while the loop is running~~, and libuv doesn't print it until the next tick~~.

ChALkeR (Member) commented May 25, 2015

@Fishrock123 I would say that (almost) everything is printed in the same tick.

The cause of the lag is gc going crazy at some point and hanging everything for a second or so.

Part of this single gc loop:

[16372]      251 ms: Scavenge 9.3 (45.0) -> 6.8 (46.0) MB, 2.3 ms [allocation failure].
[16372]      263 ms: Scavenge 9.8 (46.0) -> 7.7 (47.0) MB, 3.2 ms [allocation failure].
[16372]      403 ms: Scavenge 10.8 (47.0) -> 8.6 (47.0) MB, 132.1 ms [allocation failure].
[16372]      428 ms: Scavenge 15.6 (47.0) -> 10.6 (48.0) MB, 5.6 ms [allocation failure].
[16372]      452 ms: Scavenge 16.5 (48.0) -> 12.3 (50.0) MB, 7.6 ms [allocation failure].
[16372]      738 ms: Scavenge 18.5 (50.0) -> 14.1 (52.0) MB, 254.8 ms (+ 13.7 ms in 94 steps since last GC) [allocation failure].
[16372]      795 ms: Mark-sweep 22.0 (52.0) -> 15.7 (54.0) MB, 15.8 ms (+ 32.0 ms in 214 steps since start of marking, biggest step 0.4 ms) [GC interrupt] [GC in old space requested].
[16372]      846 ms: Scavenge 29.2 (54.0) -> 19.7 (56.0) MB, 11.0 ms [allocation failure].
[16372]      892 ms: Scavenge 31.5 (56.0) -> 23.1 (60.0) MB, 12.9 ms [allocation failure].
[16372]      934 ms: Scavenge 35.4 (60.0) -> 26.7 (63.0) MB, 12.4 ms [allocation failure].
[16372]      976 ms: Scavenge 38.8 (63.0) -> 30.2 (67.0) MB, 13.1 ms [allocation failure].
[16372]     1019 ms: Scavenge 42.4 (67.0) -> 33.7 (71.0) MB, 12.9 ms [allocation failure].
[16372]     1060 ms: Scavenge 45.9 (71.0) -> 37.3 (74.0) MB, 12.3 ms [allocation failure].
[16372]     1103 ms: Scavenge 49.5 (74.0) -> 40.8 (78.0) MB, 12.5 ms [allocation failure].
[16372]     1145 ms: Scavenge 53.0 (78.0) -> 44.3 (81.0) MB, 13.1 ms [allocation failure].
[16372]     1187 ms: Scavenge 56.5 (81.0) -> 47.9 (85.0) MB, 12.5 ms [allocation failure].
[16372]     1229 ms: Scavenge 60.1 (85.0) -> 51.4 (89.0) MB, 12.7 ms [allocation failure].
[16372]     1272 ms: Scavenge 63.6 (89.0) -> 54.9 (92.0) MB, 12.7 ms [allocation failure].
[16372]     1315 ms: Scavenge 67.1 (92.0) -> 58.5 (96.0) MB, 12.7 ms [allocation failure].
[16372]     1388 ms: Scavenge 70.7 (96.0) -> 62.0 (99.0) MB, 15.9 ms (+ 24.6 ms in 186 steps since last GC) [allocation failure].
[16372]     1533 ms: Scavenge 74.2 (99.0) -> 65.6 (103.0) MB, 19.9 ms (+ 91.8 ms in 186 steps since last GC) [allocation failure].
[16372]     1611 ms: Scavenge 77.8 (103.0) -> 69.1 (107.0) MB, 16.2 ms (+ 28.4 ms in 186 steps since last GC) [allocation failure].
[16372]     1683 ms: Scavenge 81.3 (107.0) -> 72.6 (110.0) MB, 13.5 ms (+ 26.9 ms in 186 steps since last GC) [allocation failure].
[16372]     1755 ms: Scavenge 84.8 (110.0) -> 76.2 (114.0) MB, 13.7 ms (+ 25.6 ms in 186 steps since last GC) [allocation failure].
[16372]     1828 ms: Mark-sweep 80.9 (114.0) -> 77.5 (117.0) MB, 49.2 ms (+ 208.7 ms in 1003 steps since start of marking, biggest step 68.1 ms) [GC interrupt] [GC in old space requested].
[16372]     1874 ms: Scavenge 91.8 (117.0) -> 81.6 (119.0) MB, 10.7 ms [allocation failure].
[16372]     1917 ms: Scavenge 93.2 (119.0) -> 85.0 (123.0) MB, 15.0 ms [allocation failure].

ChALkeR (Member) commented May 25, 2015

@Fishrock123 FYI, I can reproduce this with async code:

function f(max) {
    for (var i = 0; i < max; i++) {
        console.log('abcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabc1234567890');
    }
}

function g() {
    f(100);
    process.nextTick(g);
}
g();

Run with --trace-gc.

ChALkeR (Member) commented May 25, 2015

Btw, I have seen some misconceptions about sync code.
Please note:

  1. It does not block any type of gc run from executing. Both scavenge and mark-sweep runs are executed automatically in the middle of a tick, so there is no need to manually trigger gc().
  2. It does not block output by itself.

ChALkeR (Member) commented May 25, 2015

Another example:

function g() {
    process.stdout.write('abcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabc1234567890\n');
    process.nextTick(g);
}
g();

It's far worse in this case. Once gc starts, it never finishes (it runs over and over again with no actual stdout output), memory usage quickly climbs to 1.4 GiB, and the process quits with an «Allocation failed - process out of memory» error.

It looks as if the garbage collector blocks output from happening while the js code keeps running: process.stdout.write keeps getting called and overflows the output buffer.

Fishrock123 reopened this May 25, 2015
ChALkeR (Member) commented May 25, 2015

If you don't like recursive process.nextTick, here is a version that uses setTimeout with a 1 ms delay every 1e6 iterations.

var i = 0;
function g() {
    console.log('abcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabc1234567890');
    i++;
    if (i % 1000000 === 0) {
        setTimeout(g, 1);
    } else {
        process.nextTick(g);
    }
}
g();

Takes more time, but still crashes.

Another example (setTimeout for 0 ms every 1e5 iterations):

var i = 0;
function g() {
    console.log('abcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabc1234567890');
    i++;
    if (i % 100000 === 0) {
        setTimeout(g, 0);
    } else {
        process.nextTick(g);
    }
}
g();

Another one:

var s = 'abcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabc1234567890';
function g() {
    for (var i = 0; i < 1e5; i++) {
        console.log(s);
    }
    setTimeout(g, 0);
}
g();

The numbers might require some tuning if you have a system that is faster than mine.

ChALkeR (Member) commented May 25, 2015

Asked on IRC by @Fishrock123:

can you try allocating the string outside the loop and using a reference to it?

Doesn't change anything.

ChALkeR (Member) commented May 25, 2015

Btw, there is no issue when stdout is piped somewhere (for example, iojs test.js > /dev/null, iojs test.js > out.txt, iojs test.js | wc). Probably because in this case process.stdout is blocking.

ChALkeR (Member) commented May 25, 2015

One approach to «fixing» this would be to actually block output if the buffer is too large until it is dumped.

Almost like what is already done when output is piped, but only when the buffer gets overfilled, and only until it drains back to a reasonable size.

Btw, that could actually make output faster, because with the current implementation the garbage collector goes crazy (notice the gc runs that took 254 ms above) when the output buffer grows too big.
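
A userland variant of the same idea is to respect the stream's backpressure signal: write() returns false once the internal buffer is over the high-water mark, and the 'drain' event fires when it has been flushed. A minimal sketch, not the core-level change proposed above, just the same principle applied where you control the writing code:

var line = 'abcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabc1234567890\n';
var remaining = 1000000;
function writeSome() {
  while (remaining > 0) {
    remaining--;
    // write() returns false once the internal buffer is over the high-water mark
    if (!process.stdout.write(line)) {
      // stop producing until the buffered output has been flushed
      process.stdout.once('drain', writeSome);
      return;
    }
  }
}
writeSome();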

A side question: why is output blocking when it is piped? Could it be for the same reason (everything going boom when it is processed too slowly)? Or are there other reasons?

@Fishrock123 (Contributor)

Probably because in this case process.stdout is blocking.

Not unless you're on win32. See #1771, could you try that patch? (It might fix this?)

ChALkeR (Member) commented May 26, 2015 via email

ChALkeR added the memory label (Issues and PRs related to the memory management or memory footprint.) Aug 17, 2015
ChALkeR removed the question label (Issues that look for answers.) Sep 20, 2015
gdiggs added a commit to codeclimate/codeclimate-eslint that referenced this issue Jan 12, 2016
`console.log` does some extra work to construct the output, and because
it's already a string, we can save some memory by using
`process.stdout.write`. See this issue for more information:
nodejs/node#1741 (comment)

This is an attempt to remedy the OOM issues when rendering all the
issues. Increasing the batch size will also help by reducing the number
of GCs.
Fishrock123 added the process label and removed the console label Jan 13, 2016
pgkos (Author) commented Mar 1, 2016

Btw, placing this piece of code before the loop with console.log:

process.stdout._handle.setBlocking(true);

completely fixes the problem.
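
For context, applied to the original snippet the workaround would look something like this; note that _handle and setBlocking are undocumented internals, so the guard below is a defensive assumption rather than an official API:

// _handle and setBlocking are internal, undocumented APIs; guard their use
if (process.stdout._handle && typeof process.stdout._handle.setBlocking === 'function') {
  process.stdout._handle.setBlocking(true);
}
for (var i = 0; i < 1000000; i++) {
  console.log('abcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabc' + i);
}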

jasnell (Member) commented Mar 9, 2016

@ChALkeR @Fishrock123 ... does this one need to stay open?

ChALkeR (Member) commented Mar 9, 2016

@jasnell To me, this still looks like an issue that has to be solved. Very low priority, though.
I'm keeping this one in mind.

@Fishrock123 (Contributor)

A side question: why is output blocking when it is piped? Could it be for the same reason (everything going boom when it is processed too slowly)? Or are there other reasons?

It shouldn't be?

@ChALkeR Is this a bug in libuv or is it a v8 memory issue?

@gireeshpunathil (Member)

Should this remain open? I am not sure whether the reported problem is still relevant.
/cc @ChALkeR @Fishrock123 @bnoordhuis

ChALkeR (Member) commented Apr 8, 2018

@gireeshpunathil This and #6379 are basically the same issue. Perhaps we can keep only one of those open; I would prefer the latter.

@bnoordhuis (Member)

Okay, let's close this one.

Limess pushed a commit to Financial-Times/lambda-logger that referenced this issue Jul 27, 2018
This is not reliable anymore after AWS changes.

See 'nodejs/node#1741 (comment)'
jamesdonoh referenced this issue in davisjam/vuln-regex-detector Feb 13, 2019
1. Compute deathSignal correctly
2. Redirect output to tmp file.
   Very long piped strings sometimes get mishandled.
harryi3t added a commit to postmanlabs/electron that referenced this issue Oct 29, 2019
- remove usage of s3 bucket
- update required file list, we don't build for arm, mas & linux 32 bit
- always download file in quiet mode: node crashes with an out-of-heap-memory error if we log really fast to stdout nodejs/node#1741
ProfessorManhattan pushed a commit to ProfessorManhattan/codeclimate-eslint that referenced this issue Mar 12, 2022
arthurwolf commented Apr 5, 2023

For anyone running into this issue and having trouble getting the magic call (process.stdout._handle.setBlocking(true);) to work with TypeScript, use:

import { WriteStream } from 'tty';

(process.stdout as unknown as WriteStream & { _handle: any })._handle.setBlocking(true);
