
Strange (?) memory footprint #1146

Closed
Olegas opened this issue Mar 13, 2015 · 12 comments

Comments

@Olegas
Contributor

Olegas commented Mar 13, 2015

I have an app which monitors its memory usage via process.memoryUsage().

Sometimes the data reported by this method looks strange to me.

I can see the process's heapUsed > heapTotal and heapUsed > RSS.

Is this normal behavior? How can it be explained?

I can see this on node 0.10.33 and on io.js 1.5.1 (and some versions back).
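
For reference, a minimal sketch of what such monitoring might look like (the 10-second interval and console logging are illustrative, not taken from the actual app):

// Sample process.memoryUsage() periodically and flag the anomalous
// readings described above (heapUsed > heapTotal or heapUsed > rss).
setInterval(function () {
  var m = process.memoryUsage(); // { rss, heapTotal, heapUsed }, in bytes
  if (m.heapUsed > m.heapTotal || m.heapUsed > m.rss) {
    console.warn('anomalous memory reading: ' + JSON.stringify(m));
  }
}, 10000);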

@bnoordhuis
Member

heapUsed > RSS is normal; it means part of the process's memory has been swapped out to disk.

I don't think I've ever seen heapUsed > heapTotal. That's coming directly from V8 though.

@bnoordhuis
Member

That said, I did spot a minor bug, but unless you've been tweaking --max_old_space_size, it's unlikely that you've been affected by it. See #1148.

@Olegas
Contributor Author

Olegas commented Mar 13, 2015

@bnoordhuis I'm using --max-old-space-size=... and --expose-gc

@bnoordhuis
Member

Right, if you set it to something > 4 GB, heapUsed and heapTotal can wrap around. #1148 fixes that.
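
To illustrate the arithmetic (this wrap32 helper is hypothetical, not V8's actual code): when a byte counter is truncated modulo 2^32, any value above 4 GB is reported as something far smaller than it really is.

// Hypothetical illustration of 32-bit truncation of a byte counter.
function wrap32(bytes) {
  return bytes % Math.pow(2, 32); // keep only the low 32 bits
}
var actual = 5 * 1024 * 1024 * 1024;  // 5 GB actually in use
console.log(wrap32(actual));          // 1073741824, i.e. reported as 1 GB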

@Olegas
Contributor Author

Olegas commented Mar 13, 2015

[screenshot: memory usage graph]
@bnoordhuis this is what my memoryUsage looks like.
Green is heapUsed, red is heapTotal, and blue is RSS.
This is the sum of 2 cluster workers.
My --max-old-space-size is 1900m.

@bnoordhuis
Member

I did a quick check of how V8 calculates the used and total size, but I didn't spot anything obvious. It's possible the size of the live objects is overestimated, but that's just guesswork. Sorry, I don't think there's anything actionable for us here.

@Olegas
Contributor Author

Olegas commented Mar 13, 2015

From my side, this is not just curiosity. Sometimes my workers run into trouble and begin to consume all CPU time, 100%. The process is still able to handle incoming requests, write logs (including the memory ones), and so on, but things get very, very slow.

While investigating this, I noticed that (from my point of view) strange memory behavior. And if I let things run and do not restart, heapUsed grows up to 1.9 GB (my max-old-space-size). So I decided to file an issue just to understand: is it possible for heapUsed to become greater than RSS?

So now I understand: heapUsed > RSS means swapping, and I have some long-lived objects or leaks in my app.

@bnoordhuis , thank you.

@bnoordhuis
Member

@Olegas Happy to help. I'll close the issue now but let me know if you find anything.

@Olegas
Contributor Author

Olegas commented Mar 17, 2015

@bnoordhuis I've got another sample of such strange process.

One of our worker servers became unresponsive (TTFB ~30 sec). The worker with that response time has the "strange memory footprint". Below are some details:

RSS: 215.47 MB, Heap Used: 1.91 GB, Heap total: 126.16 MB

ps wuax | grep 15941
USER       PID %CPU %MEM    VSZ   RSS TTY      STAT START   TIME COMMAND
503      15941 17.5  2.7 1224572 218912 ?      Rl   01:30 222:18 /usr/bin/node --expose-gc --nouse-idle-notification --max-old-space-size=1900 /usr/local/bin/ws-preproc/app.js

System memory status:

Mem:   8057668k total,  7704652k used,   353016k free,    95752k buffers
Swap:  4128764k total,    20316k used,  4108448k free,  3241732k cached

No swapping issues!

lsof

COMMAND   PID        USER   FD   TYPE             DEVICE   SIZE/OFF     NODE NAME
node    15941 explotation  cwd    DIR              253,0      36864   391682 /tmp
node    15941 explotation  rtd    DIR              253,0       4096        2 /
node    15941 explotation  txt    REG              253,0    1378496   131479 /usr/bin/node
node    15941 explotation  mem    REG              253,0      27424      114 /lib64/libnss_dns-2.12.so
node    15941 explotation  mem    REG              253,0      65928      387 /lib64/libnss_files-2.12.so
node    15941 explotation  mem    REG              253,0      46288   145359 /usr/local/bin/ws-preproc/node_modules/hiredis/build/Release/hiredis.node
node    15941 explotation  mem    REG              253,0      15536   145320 /usr/local/bin/ws-preproc/node_modules/heapdump/build/Release/addon.node
node    15941 explotation  mem    REG              253,0     122040     8954 /lib64/libselinux.so.1
node    15941 explotation  mem    REG              253,0     110960       67 /lib64/libresolv-2.12.so
node    15941 explotation  mem    REG              253,0      10192      128 /lib64/libkeyutils.so.1.3
node    15941 explotation  mem    REG              253,0      43728     7366 /lib64/libkrb5support.so.0.1
node    15941 explotation  mem    REG              253,0     174840     4171 /lib64/libk5crypto.so.3.1
node    15941 explotation  mem    REG              253,0      14664       55 /lib64/libcom_err.so.2.1
node    15941 explotation  mem    REG              253,0     941920     6945 /lib64/libkrb5.so.3.3
node    15941 explotation  mem    REG              253,0     277704     4487 /lib64/libgssapi_krb5.so.2.2
node    15941 explotation  mem    REG              253,0   16011824   134075 /usr/lib64/libicudata.so.42.1
node    15941 explotation  mem    REG              253,0    1376272   134087 /usr/lib64/libicuuc.so.42.1
node    15941 explotation  mem    REG              253,0    1660584   134077 /usr/lib64/libicui18n.so.42.1
node    15941 explotation  mem    REG              253,0      19536       72 /lib64/libdl-2.12.so
node    15941 explotation  mem    REG              253,0      43880     1553 /lib64/librt-2.12.so
node    15941 explotation  mem    REG              253,0    1921176       41 /lib64/libc-2.12.so
node    15941 explotation  mem    REG              253,0     142640       65 /lib64/libpthread-2.12.so
node    15941 explotation  mem    REG              253,0      90880       77 /lib64/libgcc_s-4.4.7-20120601.so.1
node    15941 explotation  mem    REG              253,0     596272     1565 /lib64/libm-2.12.so
node    15941 explotation  mem    REG              253,0     987096   143553 /usr/lib64/libstdc++.so.6.0.13
node    15941 explotation  mem    REG              253,0    1963264   153183 /usr/lib64/libcrypto.so.1.0.1e
node    15941 explotation  mem    REG              253,0     441240   153187 /usr/lib64/libssl.so.1.0.1e
node    15941 explotation  mem    REG              253,0    6104872   141271 /usr/lib64/libv8.so.3.14.5
node    15941 explotation  mem    REG              253,0     110592   143525 /usr/lib64/libuv.so.0.10
node    15941 explotation  mem    REG              253,0      66336   151271 /usr/lib64/libcares19.so.2.0.0
node    15941 explotation  mem    REG              253,0      19592   150571 /usr/lib64/libhttp_parser.so.2.0
node    15941 explotation  mem    REG              253,0      88600      100 /lib64/libz.so.1.2.3
node    15941 explotation  mem    REG              253,0     154528     1531 /lib64/ld-2.12.so
node    15941 explotation    0u  unix 0xffff8801c1362400        0t0 87145247 socket
node    15941 explotation    1u  unix 0xffff880239e8fb80        0t0 87145249 socket
node    15941 explotation    2u  unix 0xffff88023b75fc00        0t0 87145251 socket
node    15941 explotation    3u  unix 0xffff8802397a1740        0t0 87145576 socket
node    15941 explotation    4r  FIFO                0,8        0t0 87145590 pipe
node    15941 explotation    5w  FIFO                0,8        0t0 87145590 pipe
node    15941 explotation    6u   REG                0,9          0     3971 [eventpoll]
node    15941 explotation    7r  FIFO                0,8        0t0 87145591 pipe
node    15941 explotation    8w  FIFO                0,8        0t0 87145591 pipe
node    15941 explotation    9u   REG                0,9          0     3971 [eventfd]
node    15941 explotation   10r   DIR              253,0       4096        2 /
node    15941 explotation   11w   REG              253,0   38961493   136959 /usr/local/bin/ws-preproc/logs/health.log
node    15941 explotation   12u  IPv4           87146394        0t0      TCP -----:56410->------:metatude-mds (ESTABLISHED)
node    15941 explotation   13w   REG              253,0     103448   152022 /usr/local/bin/ws-preproc/logs/----------------2211.json.log
node    15941 explotation   14w   REG              253,0   16400524   151394 /usr/local/bin/ws-preproc/logs/http.log
node    15941 explotation   15w   REG              253,0   73950550   154044 /usr/local/bin/ws-preproc/logs/bl.log
node    15941 explotation   16w   REG              253,0    1107495   147872 /usr/local/bin/ws-preproc/logs/httpSlow.log
node    15941 explotation   17w   REG              253,0          0   154121 /usr/local/bin/ws-preproc/logs/blSlow.log
node    15941 explotation   18r   DIR               0,10          0        1 inotify
node    15941 explotation   19w   REG              253,0 1591622299   154111 /usr/local/bin/ws-preproc/logs/rights.log
node    15941 explotation   20w   REG              253,0 2834461269   154118 /usr/local/bin/ws-preproc/logs/redis.log
node    15941 explotation   21w   REG              253,0  550997769   152018 /usr/local/bin/ws-preproc/logs/net.log
node    15941 explotation   22w   REG              253,0     150766   146607 /usr/local/bin/ws-preproc/logs/httpFail.log
node    15941 explotation   23u  IPv4           87146395        0t0      TCP -------:56411->-------:metatude-mds (ESTABLISHED)
node    15941 explotation   24u  IPv4           87147022        0t0      TCP *:emwin (LISTEN)
node    15941 explotation   25u  IPv4           87146203        0t0      TCP *:56813 (LISTEN)
node    15941 explotation   26w   REG              253,0   12892466   152021 /usr/local/bin/ws-preproc/logs/express-res200.log
node    15941 explotation   27w   REG              253,0      24808   152019 /usr/local/bin/ws-preproc/logs/express-res304.log
node    15941 explotation   29u  IPv4           95666353        0t0      TCP --------:40735->----------:https (ESTABLISHED)
node    15941 explotation   30u  IPv4           95666773        0t0      TCP ------:40736->------:https (ESTABLISHED)
node    15941 explotation   31u  IPv4           95666774        0t0      TCP ------:40737->------:https (ESTABLISHED)
node    15941 explotation   32u  IPv4           95666775        0t0      TCP ------:40738->-------:https (ESTABLISHED)
node    15941 explotation   33u  IPv4           95666776        0t0      TCP --------:40739->------:https (ESTABLISHED)
node    15941 explotation   34u  IPv4           95666777        0t0      TCP --------:40740->-------:https (ESTABLISHED)
node    15941 explotation   35u  IPv4           95666778        0t0      TCP ------:40741->-----:https (ESTABLISHED)
node    15941 explotation   42w   REG              253,0        203   152024 /usr/local/bin/ws-preproc/logs/express-res404.log

The .heapsnapshot is ~60 MB (taken with node-heapdump).
Chrome DevTools shows 135 MB for it.
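
(The snapshot can be produced with node-heapdump's writeSnapshot(); a minimal sketch, with an illustrative file path:)

// node-heapdump writes a V8 heap snapshot that Chrome DevTools can load.
var heapdump = require('heapdump');
heapdump.writeSnapshot('/tmp/' + Date.now() + '.heapsnapshot'); // path is illustrative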

This is node 0.10.33 on CentOS. io.js 1.5.1 exhibits the same memory footprint behavior (same app, same flags, same OS).

@Olegas
Contributor Author

Olegas commented Mar 17, 2015

Some internal log inspection shows the process is able to communicate with "upstream" servers with normal response times, but sometimes internal (to this machine) tasks (reading files, working with strings using regexps, etc.) take an enormous amount of time to complete.
We have some "custom" event loop "response time" metrics: we set a timer and measure its deviation. When an instance gets into this "strange" state, I can see some timers deviate by up to 5-10 sec (!) from the expected 10 ms interval.
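
A minimal sketch of such an event-loop lag metric (the variable names and warning threshold are illustrative, not our actual code):

var INTERVAL = 10; // ms, the expected timer interval mentioned above
var last = Date.now();
setInterval(function () {
  var now = Date.now();
  var lag = now - last - INTERVAL; // deviation from the expected interval
  if (lag > 100) {
    console.warn('event loop lag: ' + lag + ' ms'); // threshold is illustrative
  }
  last = now;
}, INTERVAL);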

@Olegas
Contributor Author

Olegas commented Mar 17, 2015

My app uses requirejs, which uses vm.runInThisContext internally. Could this be related to #1170?
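
For context, vm.runInThisContext compiles and runs a string of code in the current global context; a minimal illustration:

var vm = require('vm');
// Evaluates the source string in the current global context and returns
// the result; this is the primitive requirejs uses to evaluate modules.
console.log(vm.runInThisContext('1 + 1')); // 2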

@bnoordhuis
Member

@Olegas It's possible, but this is not the right bug tracker for issues with node.js v0.10. The vm subsystem in io.js is a from-the-ground-up rewrite of the one in v0.10.
