undici fetch has memory leak #1108
Comments
Do you have a repro?
At first glance, your code behaves more or less how I expected. There should not be any memory leaks there. (I should probably spend some time documenting how this happens; however, I'm a bit short on time right now.)
It would be valuable to try to reproduce this issue with just […]. One area within fetch where this could happen is the Body Mixin. We should ensure that once something like […]
@Ethan-Arrowood My understanding is that there is no issue. All the requests are still referenced in the array and they can't be collected.
@cjh980402 Can you try again with the latest release?
I can confirm a quite severe memory leak is still present with Node v16.13.1 and undici v4.11.3. The issue is described in Node.js non-heap memory leak. As soon as we replace […]. However, do we know if the issue is with […]? Also, switching back to […]
@kyrylkov: Can you show how you use fetch?
@ronag There are multiple cases. Let me collect some.
Can you try with this branch? https://github.com/nodejs/undici/tree/finalization
Trying...
You need to consume the response. See https://github.com/nodejs/undici#garbage-collection.
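For context, a minimal sketch of what consuming the response looks like; the URL is just a placeholder, not from this thread:

```js
import { fetch } from 'undici';

// Drain the body even when you don't need it, so the underlying stream and
// connection can be released promptly instead of waiting for GC/finalization.
const res = await fetch('https://example.com');

if (res.ok) {
  const text = await res.text(); // or res.json(), res.arrayBuffer(), ...
  console.log(text.length);
} else {
  // Consume the body on error paths too.
  await res.arrayBuffer();
}
```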
@ronag Alright. But this cannot cause the leak, because this specific code is called once per week at most.
@ronag GC pauses immediately got at least 2x worse. Need more time to see if the memory leak is still present.
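One way to put numbers on GC pause behavior is to observe `gc` performance entries; this is an illustrative sketch, not necessarily how it was measured here, and the interval is arbitrary:

```js
// Track GC pause count and average duration via perf_hooks 'gc' entries.
const { PerformanceObserver } = require('node:perf_hooks');

let pauses = 0;
let totalMs = 0;

const obs = new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    pauses += 1;
    totalMs += entry.duration;
  }
});
obs.observe({ entryTypes: ['gc'] });

setInterval(() => {
  if (pauses > 0) {
    console.log(`gc: ${pauses} pauses, avg ${(totalMs / pauses).toFixed(2)} ms`);
  }
}, 10000).unref();
```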
@ronag It's leaking again.
The following does not leak memory:

```js
import { fetch } from './index.js';
import { setTimeout } from 'timers/promises';

for (let i = 0; i < 200; i++) {
  if (i % 10 == 0) {
    console.log(process.memoryUsage().heapUsed / 1024 / 1024);
  }
  for (let j = 0; j < 20; j++) {
    await (await fetch('https://www.youtube.com/')).text();
  }
  await (await fetch('https://naver.com')).text();
}

console.log('all req sent');
console.log(process.memoryUsage().heapUsed / 1024 / 1024);
await setTimeout(10000);
console.log(process.memoryUsage().heapUsed / 1024 / 1024);
await setTimeout(100000);
console.log('end');
```
@mcollina The leak we're observing is in non-heap memory, so my understanding is that the test above wouldn't show it.
Could you please include a test to reproduce? What exactly is the scale of those graphs? What's the output you're seeing when Node is giving you the out-of-memory error?
We don't have a test because it's a big app and we've been chasing a non-heap memory leak for weeks. We seem to have isolated the cause to undici fetch vs node-fetch.
Total memory on the chart in the Stack Overflow question is 768 MB.
We would need to allow it to crash again to see the exact out-of-memory error. The problem is that GC pauses increase 2x-3x as soon as we switch to undici fetch, which negatively affects app performance in production. Thus running these tests until out of memory is both time-consuming and bad for the production application experience overall. We don't have a way to reproduce it outside of the production environment.
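A sketch of the kind of logging that separates heap from non-heap growth in production; the fields come from process.memoryUsage(), the interval and formatting are arbitrary:

```js
// Periodically log the full memory breakdown so rss/external growth can be
// distinguished from heap growth in long-running production logs.
const toMB = (bytes) => (bytes / 1024 / 1024).toFixed(1);

setInterval(() => {
  const { rss, heapTotal, heapUsed, external, arrayBuffers } = process.memoryUsage();
  console.log(
    `rss=${toMB(rss)}MB heapUsed=${toMB(heapUsed)}/${toMB(heapTotal)}MB ` +
    `external=${toMB(external)}MB arrayBuffers=${toMB(arrayBuffers)}MB`
  );
}, 60000).unref();
```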
The finalization branch linked above should fix/resolve it. This is actually pretty bad and possibly a reason not to be spec-compliant, @ronag, and drop the FinalizationRegistry.
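For readers unfamiliar with the pattern under discussion, a rough sketch of how a FinalizationRegistry can be used to clean up unconsumed bodies; this is illustrative only, with a hypothetical helper, and is not undici's actual implementation:

```js
// A registry runs a cleanup callback some time after the registered object is
// garbage collected. That ties resource release to GC timing and adds
// per-request bookkeeping, which is the cost being weighed in this thread.
const registry = new FinalizationRegistry((cancelBody) => {
  cancelBody();
});

// Hypothetical helper: associate a Response with its underlying stream.
function trackResponse(response, bodyStream) {
  registry.register(response, () => {
    // If nobody ever read the body, cancel the stream so buffers are freed.
    if (!bodyStream.locked) {
      bodyStream.cancel().catch(() => {});
    }
  });
}
```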
@mcollina What we can and will do next week is try to run for 24 hours with undici fetch, recording memory and GC profiles, and then another 24 hours with node-fetch. That should answer lots of questions.
We really need a way to reproduce the problem if we want to fix it.
Here's a good way to reproduce the memory leak.

index.js:

```js
const { Worker } = require('worker_threads');
const os = require('os');

for (let i = 0; i < 50; i++) {
  new Worker('./worker.js');
}

setInterval(() => {
  console.log(os.freemem());
}, 1000);
```

worker.js:

```js
const { fetch } = require("undici");

const main = async () => {
  const res = await fetch("https://jsonplaceholder.typicode.com/posts/1");
  const json = await res.json();
  main();
};

main();
```

Result with undici fetch:

Exact same program with node-fetch:
I have tested this for about 10 minutes. Eventually it stabilizes, but it degrades pretty sharply. @ronag I think we might want to remove it.
But we removed that?
It seems it's still there: Line 31 in 50fa51b
@mcollina This one has been added after the removal:
I've tried removing that and it's not the cause of the problem. Technically it's not a leak, but there is some very odd memory behavior that we should fix.
I haven't really spent much time optimising allocations. It might also be related to web streams.
I cannot reproduce it directly when using a single process. This is quite interesting:

```js
const { fetch } = require(".");
const { isMainThread } = require('worker_threads');
const os = require('os');

const main = async () => {
  const res = await fetch("https://jsonplaceholder.typicode.com/posts/1");
  const json = await res.json();
  main();
};

main();
main();
main();
main();
main();

if (isMainThread) {
  setInterval(() => {
    console.log(os.freemem());
  }, 1000);
}
```

Very interestingly, most of the allocations come from Node core internals.

Going back to the multithreaded example, on my system the RSS stabilizes at 1.5 GB of RAM. My theory is that each worker uses about 30 MB of RAM (× 50 = 1.5 GB) because of the activity it needs to do; this is consistent with the heap requirements of running a Node.js application. See nodejs/node#34823 for more details on the memory requirements of worker threads. Now, undici fetch() is not gentle on memory allocations and should probably be optimized to reduce memory consumption, but this can be closed.
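One way to sanity-check the per-worker baseline theory is to spawn the same number of idle workers and compare RSS; a sketch, mirroring the 50-worker repro above:

```js
// 50 workers that never call fetch, as a baseline for the ~30 MB/worker estimate.
const { Worker } = require('worker_threads');

for (let i = 0; i < 50; i++) {
  // Each worker just stays alive doing nothing.
  new Worker('setInterval(() => {}, 1 << 30);', { eval: true });
}

setInterval(() => {
  // RSS is process-wide, so it includes all worker threads.
  console.log(`rss: ${(process.memoryUsage().rss / 1024 / 1024).toFixed(0)} MB`);
}, 1000);
```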
Landing this in core will probably help some: nodejs/node#41121
Hopefully that'll happen soon.
Scratch that. It did make it into Node 17.2. Please try with Node 17.2+ and undici 4.12.1+.
I have re-run this with undici 4.12.1. It looks like the memory usage is still very weird, but it seems to stabilize after 90 seconds while using 1.5 GB of RAM.
Bug Description
When I use node-fetch or axios, there is no memory leak, as in the screenshot below.
However, I found a memory leak when I use undici.fetch.
Reproducible By
I made about 50~200 requests and then observed the memory leak easily.
Expected Behavior
There is no memory leak.
Logs & Screenshots
Environment
Ubuntu 20.04
Node v16.13.0
Additional context