JavaScript heap out of memory after upgrade to Jest 26 #9980
Comments
We will need a repro that can be downloaded and analyzed.
Oh
Thanks, that's good to know. Still weird.
I spoke too soon; it seems like the issue is this helper function:
This is still the case 🙂
Also sounds like JSDOM leaking.
Not sure if it is related, but I get a heap leak for a simple expect:
Clearing the cache doesn't help.
I am facing similar issues.
Same issue here as well (using ts-jest).
I got it during a full run in which some tests failed. I spent some time debugging, taking memory snapshots, and comparing them, but I couldn't find any leaks.
I think I'm running into the same issue. I created a new app recently with Jest 26, using Enzyme for snapshot testing. I updated a test to use
I posted an issue to Enzyme: enzymejs/enzyme#2405 (comment). Below is the error I get on this test:
I tried removing random test suites from my tests, but Jest still leaks memory, so there is no particular test causing the leak.
I had a similar problem where I used to run into
After doing some research, it seems this memory leak has been an ongoing issue since 2019 (Jest 22), so I wanted to consolidate some notes here for posterity. Past issues have been related to graceful-fs, and I think some people have solved it via a hack/workaround that removes graceful-fs and then re-adds graceful-fs after running Jest. One troubleshooting thread was looking at compileFunction in the
The common causes of the issues I've seen are collecting coverage and graceful-fs. I haven't done an in-depth analysis of those issues, but seeing that they are both filesystem-related, and having solved my own issue which was related to file imports, I suspect they are some version of the same issue I was having. I wanted to provide the solution I found so others may reap the benefits.
The cause: using imports of the format
The solution: using the format
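The exact import formats in the comment above did not survive, so the following is only a hypothetical illustration of the kind of difference such reports usually describe (the module paths and names are invented, not the commenter's code): importing through a directory/barrel index can drag a large dependency graph into every test file, while importing the concrete file keeps each test's module registry small.

```ts
// Hypothetical illustration only; the paths, names, and barrel layout are invented.

// Barrel-style import: "../src" resolves to ../src/index.ts, which re-exports the
// whole directory, so every test file transitively loads far more modules than it needs.
// import { formatDate } from "../src";

// Direct file import: only the module under test (and its own imports) end up in
// Jest's module registry for this test file.
import { formatDate } from "../src/utils/formatDate";

test("formats an epoch date", () => {
  // Assumes a hypothetical formatDate(date: Date): string helper.
  expect(formatDate(new Date(0))).toEqual(expect.any(String));
});
```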
Oftentimes when this happens, I delete the src folder (provided it's on version control) and run
+1. @alexfromapex's solution did not work for me. Jest 26.6.3. Dump: https://pastebin.com/Mfwi2iiA
It happens after some re-runs on any CI server (my runners are Docker containers). After a fresh boot it always works normally, and after some runs it breaks again, only coming back after a new reboot. I tried with 1 GB RAM and 2 GB RAM machines, same result. It does not seem to happen on 8 GB+ RAM hardware (my local machine). Some other info I've gathered: it always happens after about 5 minutes of running, and every time the test log has the same size (it might be breaking at the same spot).
I have the same issue.
I have a very similar issue. My test:

```js
const first = [ div_obj, p_obj, a_obj ];  // array with three DOM elements
const second = [ div_obj, p_obj, a_obj ]; // array with the same DOM elements
second.push( pre_obj );                   // add a new object
expect(first).toEqual(second);            // compare the two arrays: one with 3 elements, the other with 4
```

The test should fail within 250 ms (timeout), but it takes 40 seconds and then spits out this message:
Somehow I believe the stack trace points to Jest: v26.6.3
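A hedged guess at why this particular failure is so expensive: on a failed toEqual, Jest deep-compares and then diff-prints the DOM elements, which are large, cyclic object graphs. Under that assumption, a sketch of a cheaper way to express the same check (reusing the first/second arrays from the snippet above) is to compare length and element identity, so a failure never triggers a structural diff of entire DOM trees:

```ts
// Sketch, assuming that deep-diffing cyclic DOM nodes is the bottleneck.
// first and second are the arrays from the comment above.
expect(second.length).toBe(first.length);              // fails immediately: 4 !== 3
first.forEach((el, i) => expect(second[i]).toBe(el));  // reference equality, no structural diff
```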
I have a similar issue with: Node 15.5.1
Update: it just happened on my local host with 8 GB of RAM, but this time in watch mode and outside Docker, running a single test, after consecutive file saves (without waiting for tests to finish). Here is the dump: https://pastebin.com/jrDkCYiH
I don't know if this helps, but here is the memory status when it happened:

```
[klarkc@ssdarch ~]$ free -m
              total        used        free      shared  buff/cache   available
Mem:           7738        3731        2621         473        1385        3226
Swap:          8191        2133        6058
```
I had this same error on my GitLab CI and I just temporarily added the
We see this regularly on our tests at https://github.com/renovatebot/renovate
We see this regularly on our tests -- the first run succeeds, then the second fails.
We are also struggling with this issue on Jest 26. Upgrading to Jest 29 didn't work.
Specifying
Sadly, this didn't solve our issues. However, I can run our unit tests in two sessions, which works around the issue for now.
This solved my issue:
For me this happened only on CI.
Updating my
I tried a few different solutions that didn't work for me:
What did work for me: limiting the idle memory per worker using the flag. I'm also limiting the number of workers, so maybe it was a combination of the solutions.
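The exact flag name did not survive in the comment above. Assuming it refers to Jest's workerIdleMemoryLimit option (available since Jest 29.3) combined with maxWorkers, a jest.config.ts sketch might look like this; the concrete values are placeholders, not recommendations:

```ts
// jest.config.ts: sketch only; values are placeholders, and workerIdleMemoryLimit
// is an assumption about which setting the comment above meant (Jest >= 29.3).
import type { Config } from "jest";

const config: Config = {
  // Fewer parallel workers means fewer heaps competing for RAM at once.
  maxWorkers: 2,
  // Restart a worker whose idle memory exceeds this limit, so a slow leak
  // inside a worker cannot accumulate across the whole run.
  workerIdleMemoryLimit: "512MB",
};

export default config;
```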
Doesn't help in my case; still stuck in the test.
Jest

```ts
expect(s3.client).toHaveReceivedCommandWith(PutObjectCommand, {
  Bucket: IMAGES_BUCKET,
  Key: `${a}/${b}/info.json`,
  Body: expect.jsonMatching(infoFile),
  ContentType: "application/json",
});
```

The s3 client is mocked with:

```
"aws-sdk-client-mock": "2.2.0", // happens with 3.0.0 as well
"aws-sdk-client-mock-jest": "2.2.0",
"@aws-sdk/client-s3": "3.414.0",
```

There's nothing fancy in there:

```ts
const infoFile: SomeType = {
  imageKeys: [imageObject1.Key!, imageObject2.Key!], // strings
  taskToken, // string
  location, // shallow dict
};
```

Commenting out parts of it does not help, but as soon as I comment out the whole expectation my test turns green. With it I constantly get:
I tried
What's also interesting: I use similar expectations right before and after the problematic one and they run just fine.
Have you tried Node 21.1+? EDIT: oh, a specific assertion, that's weird. Could you put together a minimal reproduction?
Hi @SimenB, I'll give it a try. What's rather silly: this is my expectation:

```ts
expect(s3.client).toHaveReceivedCommandWith(PutObjectCommand, {
  Bucket: IMAGES_BUCKET,
  Key: `${a}/${b}/info.json`,
  Body: expect.jsonMatching(infoFile),
  ContentType: "application/json",
});
```

And this follows right after that and runs fine:

```ts
expect(s3.client).toHaveReceivedCommandWith(PutObjectCommand, {
  Bucket: IMAGES_BUCKET,
  Key: `${a}/${IMAGES_DOCUMENT_NAME}`,
  Body: expect.anything(),
  ContentType: "application/pdf",
});
```

But it doesn't seem to matter what the actual body of the first one is; it still runs out of memory even if I turn it into:

```ts
expect(s3.client).toHaveReceivedCommandWith(PutObjectCommand, {
  Bucket: "b",
  Key: `a`,
  Body: "",
  ContentType: "application/json",
});
```

This is driving me crazy. It's the same thing ...
When I strip it down to the size where I can almost share it, it starts working again -.-' But another fun fact I forgot to mention: as soon as I attach the debugger (IntelliJ IDEA), it works, too. It still blows up the RAM and takes forever, but it can step over it. I see that IDEA comes with some "fixes".

/edit I was, however, able to boil it down further.

```ts
expect(s3.client).toHaveReceivedCommandWith(PutObjectCommand, {
  Body: expect.anything(),
  // ...
```

works, whereas

```ts
expect(s3.client).toHaveReceivedCommandWith(PutObjectCommand, {
  Body: expect.jsonMatching({}), // or jsonMatching(1)
  // ...
```

or even

```ts
expect(s3.client).toHaveReceivedCommandWith(PutObjectCommand, {
  Body: expect.any(Number),
  // ...
```

blows up.

@SimenB is there any way I can profile the jest worker? When I
/edit2: tried Node 20, same behaviour
/edit3:
This started happening to us this week. Affects previously passing builds.
I tried it on pure Windows and Mac and it's basically the same behaviour. The single test, which virtually does nothing, takes up about 2.5 GB of RAM. For some reason that's too much for my WSL; in PowerShell and on Mac the max heap seems to be higher for some reason. With
Maybe related: #11956
My stripped-down sample only seems to use around 350 MB: it does the same thing, there's just less code not being executed.
@mike-4040 I'm facing the same problem. Did you solve it?
Alright, I drilled further down into Jest and here's what I've found: there's some map-reduce equality check, counting whether any invocation matched the given args (and swallowing any exception). In my test, one input was a "larger" image, and for some reason that equality check took 2+ GB of RAM for the 4 MB image. I don't know what Jest does down there in order to compare the 4 MB buffer to a 4-byte string, but I circumvented the issue by reducing the image to 1x1 pixels (since the actual contents don't matter for now).
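A sketch of that workaround under the same assumptions: hand the code under test a tiny placeholder buffer instead of the real multi-megabyte fixture, so the mocked S3 client never receives megabytes of data for the matcher to deep-compare. The fixture path and the way the image reaches the handler are invented for illustration:

```ts
// Sketch: the fixture handling is hypothetical; the point is only that the value
// the test feeds in (and thus what the mocked S3 client records) stays tiny.

// Instead of the real multi-megabyte JPEG fixture:
//   const imageBody = fs.readFileSync("fixtures/large-photo.jpg");
// use a few bytes that still look like a JPEG to code that only checks markers:
const imageBody = Buffer.from([0xff, 0xd8, 0xff, 0xd9]); // JPEG SOI + EOI markers

// ...then pass imageBody wherever the test previously used the large fixture.
```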
Yes, by migrating to mocha + chai + sinon :)
@black-snow would you be able to share that image (perhaps privately)?
@SimenB the concrete image doesn't seem to be the issue; Jest hangs in the matcher check:

```ts
({ commandCalls }) => {
  const matchCount = commandCalls
    .map(call => call.args[0].input) // eslint-disable-line @typescript-eslint/no-unsafe-return
    .map(received => {
      try {
        expect(received).toEqual(expect.objectContaining(input)); // here
        return true;
      }
      catch (e) {
        return false;
      }
    })
    .reduce((acc, val) => acc + Number(val), 0);
  return { pass: matchCount > 0, data: { matchCount } };
}
```

Jest iterates over all the things it received and does some comparison. When it hits my

```ts
expect(s3.client).toHaveReceivedCommandWith(PutObjectCommand, {
  Bucket: IMAGES_BUCKET,
  Key: `${a}/${b}/info.json`,
  Body: expect.jsonMatching(infoFile),
  ContentType: "application/json",
});
```

it compares a 4 MB buffer, probably containing the JPEG, with some 4-byte values, and for some reason this blows up the heap and takes forever.

P.S.: Why not return 0/1 instead of booleans, which get cast to Number right after?
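Given that the matcher above runs a full objectContaining comparison against every received call, including the one whose Body is the 4 MB buffer, one possible workaround is to pull the calls out of the mock directly and assert only on small fields, so the large buffer never goes through a deep equality check. This is a sketch built on the commandCalls API visible in that matcher and on names from the earlier comments, not an official recommendation:

```ts
// Sketch using aws-sdk-client-mock's commandCalls(), as seen in the matcher above.
// s3.client, PutObjectCommand, a, b, and infoFile are the names from the earlier comments.
const putCalls = s3.client.commandCalls(PutObjectCommand);

// Select the one upload we care about by its small Key field first...
const infoCall = putCalls.find(
  (call) => call.args[0].input.Key === `${a}/${b}/info.json`,
);

// ...then assert only on cheap-to-compare fields; the 4 MB image upload in the
// other call is never deep-compared.
expect(infoCall).toBeDefined();
expect(infoCall!.args[0].input.ContentType).toBe("application/json");
// Assumes the Body of this particular call is a JSON string.
expect(JSON.parse(infoCall!.args[0].input.Body as string)).toEqual(infoFile);
```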
This solved mine too, thanks!
I ran into the issue with Node 16.19.0 and Jest 29.7.0.
What did not work for me:
What did work for me:

Same problem in Node 16.20.2 and Jest 29.7.0.
What did not work for me:
This worked:
or
🐛 Bug Report
I upgraded from 24.x to 26.0.0, but now a test that was passing is failing.
Running the test takes a long time to complete, then I get this error:
To Reproduce
My test:
code:
jest.config:
envinfo
System:
OS: Linux 4.15 Ubuntu 18.04.4 LTS (Bionic Beaver)
CPU: (36) x64 Intel(R) Xeon(R) Platinum 8124M CPU @ 3.00GHz
Binaries:
Node: 14.1.0 - ~/.nvm/versions/node/v14.1.0/bin/node
Yarn: 1.22.4 - /usr/bin/yarn
npm: 6.14.4 - ~/.nvm/versions/node/v14.1.0/bin/npm
npmPackages:
jest: ^26.0.0 => 26.0.0