High RSS with steady, low heap usage #12805
Comments
@terrywh Can you reproduce it without the compiled addons? It could be that the addons you're using do not let V8 know about the memory they have allocated/deallocated.
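For reference, a quick way to see where the memory sits is to log `process.memoryUsage()` over time: memory a native addon allocates itself shows up in `rss` but not in `heapUsed`, and only appears in `external` if the addon reports it to V8. A minimal sketch (not from this thread):

```js
// Log the different memory pools Node reports, every 5 seconds.
// If rss keeps growing while heapUsed and external stay flat, the growth is
// likely native allocations V8 cannot see.
setInterval(() => {
  const { rss, heapTotal, heapUsed, external } = process.memoryUsage();
  const mb = (n) => (n / 1024 / 1024).toFixed(1) + ' MB';
  console.log('rss=' + mb(rss), 'heapTotal=' + mb(heapTotal),
              'heapUsed=' + mb(heapUsed), 'external=' + mb(external));
}, 5000);
```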
Do you have transparent huge pages enabled? Try turning it off; see #11077.
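For anyone who wants to confirm the THP setting before experimenting, a small sketch (Linux only; the sysfs path below is the usual location but may differ per distribution):

```js
// Read the kernel's transparent huge pages setting discussed in #11077.
const fs = require('fs');
try {
  const thp = fs.readFileSync('/sys/kernel/mm/transparent_hugepage/enabled', 'utf8');
  console.log('THP setting:', thp.trim()); // e.g. "[always] madvise never"
} catch (err) {
  console.log('Could not read THP setting:', err.message);
}
```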
@mscdex I'm using https://github.com/Blizzard/node-rdkafka; I will try to see if I can reproduce without it.
Same problem here on EBS t2.micro as well as t2.medium instances. "max_old_space_size" and the like had no effect, and disabling transparent huge pages did not help either. I stripped down the app to do exactly nothing, but still, RSS constantly grows while the heap stays a flat line.
In addition to my previous post: I checked this behaviour on 7.10.0 and 7.6.0, and the only working fix was to go back to 6.10.3.
@dlueth "do exactly nothing" so what does it do specifically? |
@terrywh Sorry for having been a bit unspecific: nothing is left but an idle loop, just to keep the process running.
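A minimal sketch of the kind of idle repro described here (assumed, not dlueth's actual code): an empty timer keeps the process alive and memory is logged once a minute, so RSS growth can be watched without any application code running.

```js
// Idle process that does nothing except report its own memory usage.
setInterval(() => {
  const { rss, heapUsed } = process.memoryUsage();
  console.log(new Date().toISOString(),
    'rss', Math.round(rss / 1024 / 1024), 'MB',
    'heapUsed', Math.round(heapUsed / 1024 / 1024), 'MB');
}, 60 * 1000);
```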
@dlueth Maybe you can give us some code to reproduce this problem? My problem seems to be caused by the C++ module.
@terrywh If I find the time tonight I will try to prepare something.
Will take some more time to prepare an example of our problem - will report back!
Closing due to inactivity. Can reopen if needed.
I filed another issue on the C++ module Blizzard/node-rdkafka, and will report back if I get anything concrete.
@dlueth Could you possibly test on Node 7.4.0? We are facing a similar large-RSS, no-heap-growth issue, and I swear it used to work on 7.4.0. If you're unable to reproduce on that version, then it might narrow the window of what changed that caused it. Or maybe it's a 3rd party package... not sure yet.
@owenallenaz Will see what I can do to test this and report back!
@owenallenaz Did not help in our case; switching to 7.4.0 shows the same behaviour (large RSS but constant heap) as later versions do.
@dlueth I am seeing similar issues even on Node 8.9.3 and 9.5.0 while streaming image files from an image server through a zip archiving routine and then via Express down to the user. I have tried destroying every manually created stream and calling garbage collection manually, but RSS stays high and never decreases, while the heap stays relatively constant and low. Thank you for any update. The data graphed here covers the period of manual stream destruction and garbage collection, for example.
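For comparison, a hedged sketch (not the poster's actual code) of forwarding a file stream to an Express response with `stream.pipeline` (available from Node 10): it destroys every stream in the chain on error or early client disconnect. Half-consumed streams that are never destroyed are a common source of RSS growth that does not show up in heap snapshots, because file buffers live largely outside the JS heap.

```js
const express = require('express');   // assumed dependency, as in the poster's setup
const fs = require('fs');
const { pipeline } = require('stream');

const app = express();

app.get('/image/:name', (req, res) => {
  // Hypothetical path, for illustration only.
  const src = fs.createReadStream(`/data/images/${req.params.name}`);
  // pipeline() tears down both streams if either side fails or the client
  // disconnects early, instead of leaving a dangling read stream around.
  pipeline(src, res, (err) => {
    if (err) console.error('stream failed:', err.message);
  });
});

app.listen(3000);
```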
@bam-tbf In our case the issue is not solved, but the overall situation improved, at least with 8.9.4. There is still a memory leak buried somewhere, but it is now within acceptable limits; restarting our instances once a week is sufficient so far. One of the causes was an external script we depended on, which has since fixed its leaks (and which we finally got rid of anyway), but the people behind it mentioned that there are still some potential leaks left within Node itself.
@dlueth If it's a memory leak, shouldn't it consume heap space?
@Restuta It normally should, but somehow Node seemed to eventually lose track of it and treat it as non-heap memory.
Try using the jemalloc library; it prevents the RSS from growing.
We are still experiencing the same issue in v12.16.3. Any leads on how to track down what is causing RSS to grow? No luck with the devtools, since they report only the heap memory.
@hakimelek What is your system's memory configuration? I suspect something similar to #31641, wherein a large memory demand in the process occurred and was satisfied in the past, but was never released back because the system did not want it back, as it has ample free memory.
@gireeshpunathil Yes, there are 4 processes running on 8 GB of RAM. When inspecting the processes' memory I see both the virtual size and the resident size trending up under load; these figures come from New Relic. These boxes in production are running v10.21.0. I have tried to reproduce the same locally, even after upgrading to v12.16.3. After a couple of heap snapshots in production, I am wondering whether the memory growth perceived by the system is not caused by the heap memory at all, and I am not sure what tools I can use to get insight into the RSS memory. 11/2 update: this is the behavior I see in the application locally. I would assume that every time GC kicks in and heap memory drops, RSS should drop as well. Am I missing anything?
Thanks for the charts. As mentioned in the referenced issue, the …
My summary: unless the … Hope this helps.
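For anyone wanting to observe the effect described in #31641 locally, a small demo sketch (not from this thread; assumes it is run with `--expose-gc`, and exact numbers depend on platform and allocator): after a burst of large allocations is collected, `heapUsed` drops back down while `rss` often stays high, because the allocator keeps the freed pages for reuse rather than returning them to the kernel right away.

```js
// Run with: node --expose-gc retain-demo.js   (file name is hypothetical)
const mb = (n) => Math.round(n / 1024 / 1024) + ' MB';
const report = (label) => {
  const { rss, heapUsed } = process.memoryUsage();
  console.log(label.padEnd(12), 'rss', mb(rss), 'heapUsed', mb(heapUsed));
};

report('before');
let chunks = [];
for (let i = 0; i < 30; i++) {
  chunks.push(new Array(1e6).fill(Math.random())); // roughly 8 MB of JS heap each
}
report('after burst');
chunks = null;   // drop all references
global.gc();     // force a full collection (requires --expose-gc)
report('after gc');  // heapUsed falls; rss usually stays elevated for a while
```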
Version: Node v7.8.0
Platform: Linux (CentOS 7.0 x64)
I am using --inspect to get heap snapshots. The heap snapshots show a memory usage of only 27 MB, but when looking at the system it shows a large RSS usage.
message from console:
snap from top command:
heap snapshots:
Am I missing something, or is this how it is supposed to be? This project is using some third-party (C++) modules.
I tried adding the --max_old_space_size=128 option but had no luck; it seems this option is not working at all (I also tried the variant using '-' instead of '_', --max-old-space-size=128, which also did not seem to work).
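One way to check whether the flag was picked up at all (a sketch, with a hypothetical file name): `--max-old-space-size` only caps V8's old generation, not RSS or memory allocated by native modules, but V8 does report the resulting limit via `v8.getHeapStatistics()`.

```js
// Run with: node --max-old-space-size=128 check-limit.js
const v8 = require('v8');
const limitMb = v8.getHeapStatistics().heap_size_limit / 1024 / 1024;
console.log('heap_size_limit ~', Math.round(limitMb), 'MB');
// With the flag, this prints a value near 128 (plus some overhead for the
// other heap spaces); without it, the default is typically well over 1000 MB.
// Even when the flag works, rss can stay much larger than this limit.
```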