[Bug]: --runInBand seems to leak memory when importing the same module multiple times #12142

Closed
bcluyse opened this issue Dec 12, 2021 · 7 comments

Comments


bcluyse commented Dec 12, 2021

Version

27.4.4

Steps to reproduce

Any version since 25.1.0 seems to leak memory when running with --runInBand.

  1. Clone my repo at https://github.com/bcluyse/jest-runinband-memory-leak-repro
  2. Install dependencies by running npm run build
  3. Run npm run test27, attach a debugger, and take a memory snapshot at the end of the test run
  4. Observe dozens of instances of the lodash module source string in memory (a minimal sketch of the kind of test involved is shown after this list)
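For reference, a minimal sketch of the kind of test involved (the file name and assertion are my own illustration, not copied from the repro repo); each suite does nothing more than require lodash, yet with --runInBand every suite's copy of the module source appears to stick around in the single worker process:

  // __tests__/uses-lodash.test.js (hypothetical file name)
  // Jest gives every test file its own module registry, so each suite
  // requires lodash independently; the report here is that the module's
  // source string is then retained once per suite instead of once overall.
  const _ = require('lodash');

  test('lodash works inside this suite', () => {
    expect(_.chunk([1, 2, 3, 4], 2)).toEqual([[1, 2], [3, 4]]);
  });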

Expected behavior

The lodash module source string is present in memory only once.

Actual behavior

Dozens of instances of the lodash module source string remain in memory.
(The screenshot below is the memory snapshot for Jest 25.)

[Screenshot 25_screenshot: Jest 25 heap snapshot showing the duplicated lodash module strings]

Additional context

We have seen this behavior in our application, where we use --runInBand to run tests in CI.
All versions I have tested between 25.1.0 and 27.4.4 have this issue.
24.9.0 still seems to be okay, as can be seen by running npm run test24.

Environment

System:
  OS: Linux 5.4 Ubuntu 20.04.3 LTS (Focal Fossa)
  CPU: (8) x64 Intel(R) Core(TM) i5-8350U CPU @ 1.70GHz
Binaries:
  Node: 14.16.0 - ~/.nvm/versions/node/v14.16.0/bin/node
  Yarn: 1.22.17 - ~/.nvm/versions/node/v14.16.0/bin/yarn
  npm: 7.6.0 - ~/.nvm/versions/node/v14.16.0/bin/npm
@wojtek1150

I've also noticed that previous versions were faster on CI: on 25 and 26 the suite takes 40s locally and 4min on CI, while after updating to 27 it takes 25s locally but times out on CI after 25min.

@sibelius

Any ideas on how to fix it?

@smolijar

We ran into the same issues. I put together a small benchmark to see whether switching to a different test framework would solve the issue. In doing so I discovered that the leaks get worse when open handles are left behind: https://github.com/grissius/jest-is-a-rude-needy-clown-and-eats-lot-of-memory#leakables
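For illustration, a hypothetical example of such an open handle (my own sketch, not taken from the linked benchmark): an uncleared timer keeps its callback, and everything the callback closes over, reachable long after the suite has finished, which compounds the per-suite retention described above. Jest's --detectOpenHandles flag reports handles like this one.

  // leaky-handle.test.js (hypothetical example)
  const _ = require('lodash');

  test('leaves an open handle behind', () => {
    // The interval is never cleared, so the timer handle keeps this callback
    // (and the lodash instance it closes over) alive after the test ends.
    setInterval(() => _.noop(), 1000);
    expect(_.isFunction(_.noop)).toBe(true);
  });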

@leonardopliski

Using this approach from @oskar-kupski-elpassion reduces the memory leaks for me. Also, if you're using TypeScript with ts-jest, I'd recommend switching to swc or using a custom Jest transformer with the typescript package, as ts-jest causes additional memory leaks for me.
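As a sketch of what such a transformer swap could look like (my own example, assuming @swc/jest is installed as a dev dependency; the file pattern is illustrative), the relevant part of jest.config.js might be:

  // jest.config.js (sketch, not taken from any repo in this thread)
  module.exports = {
    transform: {
      // Hand .ts/.tsx/.js/.jsx files to the swc-based transformer instead of ts-jest.
      '^.+\\.(t|j)sx?$': '@swc/jest',
    },
  };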

bcluyse (Author) commented Jan 24, 2022

@leonardopliski
We are not using TypeScript, and neither is the minimal reproduction repo mentioned above.
The suggested approach has no impact on the memory consumption of the minimal reproduction, so I am not sure it is related.

SimenB (Member) commented Mar 3, 2022

As mentioned in #7874 (comment), I think the cached strings are red herrings (and if they're not, Node needs to expose some API).

From the linked reproduction in the OP (really well put together, thank you!) I think you can see the same: while memory usage increases, it also drops once GC has run (and running with --detect-leaks, which forces GC, keeps it stable at 32MB).
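One way to observe the same thing locally (my own sketch, not part of the repro repo) is to force a collection after each suite and watch the per-file heap numbers reported by Jest's --logHeapUsage flag drop rather than grow monotonically:

  // force-gc.setup.js (hypothetical file, registered via setupFilesAfterEnv in jest.config.js)
  // Run with the GC exposed so global.gc is available:
  //   node --expose-gc ./node_modules/.bin/jest --runInBand --logHeapUsage
  afterAll(() => {
    if (global.gc) {
      // Forcing a collection here lets the heap drop back down between suites,
      // matching the behaviour described above.
      global.gc();
    }
  });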

SimenB closed this as completed Mar 3, 2022

github-actions bot commented Apr 3, 2022

This issue has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.
Please note this issue tracker is not a help forum. We recommend using StackOverflow or our discord channel for questions.

github-actions bot locked as resolved and limited conversation to collaborators Apr 3, 2022