
fix(test): windows tests are failing #3637

Closed
MSNev wants to merge 2 commits

Conversation


MSNev (Contributor) commented Feb 27, 2023

Which problem is this PR solving?

This issue is occurring because the underlying V8 engine used by Node.js consumes more memory on Windows. While #3628 is about the tests failing in the Windows CI build, they also fail on a local Windows machine.

Fixes #3628

Short description of the changes

This is a targeted fix for the one package whose tests are currently failing to build. The same change could also be applied in unit-test.yml, but that alone would not fix the problem for anyone cloning the repo and running the tests on a Windows machine.

For reference, --max-old-space-size= is actually a V8 configuration option (not a Node.js-specific one): nodejs/node#7937
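
As a quick sanity check (a minimal sketch, not part of this PR; the file name check-heap.ts and the use of ts-node are assumptions), the limit that V8 actually picked up from NODE_OPTIONS can be read back via Node's built-in v8 module:

// check-heap.ts — hypothetical helper for verifying the flag, not part of this change.
// Example run: cross-env NODE_OPTIONS='--max-old-space-size=8192' ts-node check-heap.ts
import { getHeapStatistics } from 'v8';

// heap_size_limit is reported in bytes and roughly tracks --max-old-space-size
// (plus some V8 overhead for the young generation and other spaces).
const limitMiB = getHeapStatistics().heap_size_limit / (1024 * 1024);
console.log(`V8 heap size limit: ${limitMiB.toFixed(0)} MiB`);

If the reported limit is still the default rather than roughly 8 GiB, the NODE_OPTIONS value is not reaching the test process.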

Type of change

  • Bug fix (non-breaking change which fixes an issue)

How Has This Been Tested?

Tested locally on a Windows development machine with the following repro steps (the only way I found to reproduce the failure locally). The clean is required because lerna caches the result once a successful build has occurred.

git clean -xdf
npm install
npm run compile
npm run test

It was also verified in the GitHub CI pipeline.

MSNev requested a review from a team on February 27, 2023 22:17
codecov bot commented Feb 27, 2023

Codecov Report

Merging #3637 (a18b547) into main (708afd0) will increase coverage by 0.27%.
The diff coverage is n/a.

❗ Current head a18b547 differs from pull request most recent head b92e438. Consider uploading reports for the commit b92e438 to get more accurate results

Additional details and impacted files
@@            Coverage Diff             @@
##             main    #3637      +/-   ##
==========================================
+ Coverage   93.79%   94.07%   +0.27%     
==========================================
  Files         264      268       +4     
  Lines        7349     7930     +581     
  Branches     1491     1645     +154     
==========================================
+ Hits         6893     7460     +567     
- Misses        456      470      +14     
Impacted Files Coverage Δ
...ges/opentelemetry-instrumentation-http/src/http.ts 94.21% <0.00%> (ø)
...y-instrumentation-http/src/enums/AttributeNames.ts 100.00% <0.00%> (ø)
...es/opentelemetry-instrumentation-http/src/utils.ts 99.18% <0.00%> (ø)
.../src/platform/browser/export/BatchSpanProcessor.ts 50.00% <0.00%> (ø)
packages/opentelemetry-sdk-trace-base/src/Span.ts 98.58% <0.00%> (+0.70%) ⬆️
api/karma.worker.js 100.00% <0.00%> (+100.00%) ⬆️
packages/opentelemetry-resources/karma.worker.js 100.00% <0.00%> (+100.00%) ⬆️
...ckages/opentelemetry-sdk-trace-web/karma.worker.js 100.00% <0.00%> (+100.00%) ⬆️
...kages/opentelemetry-sdk-trace-base/karma.worker.js 100.00% <0.00%> (+100.00%) ⬆️


MSNev commented Feb 27, 2023

Hmm, while this fixes the out-of-memory issue, the @opentelemetry/shim-opentracing:test task is still failing, and I don't have a local repro for that failure...

@@ -9,7 +9,7 @@
"prepublishOnly": "npm run compile",
"compile": "tsc --build",
"clean": "tsc --build --clean",
"test": "nyc ts-mocha -p tsconfig.json test/**/*.test.ts",
"test": "cross-env NODE_OPTIONS='--max-old-space-size=8192' nyc ts-mocha -p tsconfig.json test/**/*.test.ts",
MSNev commented Feb 27, 2023


This one is here to fix local (cloned repo) builds; it also fixes the CI test run for unit-test.yml. However, because of other failing tests I've also added NODE_OPTIONS: --max-old-space-size=8192 to unit-test.yml (line 67). (cross-env is used so the environment variable is set the same way on both Windows and POSIX shells.)

Only the instrumentation-http package was failing for a local clone build (for me at least).

legendecas (Member) commented:

Restarted the Windows build and it reports OOM for shim-opentracing too:


> @opentelemetry/shim-opentracing:test


> @opentelemetry/shim-opentracing@<version> test
> nyc ts-mocha -p tsconfig.json test/**/*.test.ts


<--- Last few GCs --->

[2568:000001E238B2A740]     8985 ms: Mark-sweep (reduce) 15.9 (30.1) -> 13.4 (16.3) MB, 13.3 / 0.0 ms  (+ 0.2 ms in 2 steps since start of marking, biggest step 0.2 ms, walltime since start of marking 37 ms) (average mu = 0.998, current mu = 0.998) finali

<--- JS stacktrace --->

FATAL ERROR: Committing semi space failed. Allocation failed - JavaScript heap out of memory
 1: 00007FF6256E168F v8::internal::CodeObjectRegistry::~CodeObjectRegistry+122159
 2: 00007FF62566B456 DSA_meth_get_flags+64118
 3: 00007FF62566C4D2 DSA_meth_get_flags+68338
 4: 00007FF625FA3CB4 v8::Isolate::ReportExternalAllocationLimitReached+116
 5: 00007FF625F8E27D v8::SharedArrayBuffer::Externalize+781
 6: 00007FF625E3183C v8::internal::Heap::EphemeronKeyWriteBarrierFromCode+1468
 7: 00007FF625E3B55F v8::internal::Heap::PageFlagsAreConsistent+3007
 8: 00007FF625E2E119 v8::internal::Heap::CollectGarbage+2137
 9: 00007FF625E3194B v8::internal::Heap::FinalizeIncrementalMarkingAtomically+267
10: 00007FF625E2640F v8::base::CPU::is_fp64_mode+415
11: 00007FF62560007D node::TriggerNodeReport+53341
12: 00007FF6255FE876 node::TriggerNodeReport+47190
13: 00007FF62573F3DB uv_async_send+331
14: 00007FF62573EB6C uv_loop_init+1292
15: 00007FF62573ED0A uv_run+202
16: 00007FF62570DDF5 node::SpinEventLoop+325
17: 00007FF625624271 cppgc::internal::Marker::visitor+57937
18: 00007FF6256A33FA node::Start+202
19: 00007FF6254C86EC RC4_options+347628
20: 00007FF62652A228 v8::internal::compiler::RepresentationChanger::Uint32OverflowOperatorFor+14520
21: 00007FFEF1A84DE0 BaseThreadInitThunk+16
22: 00007FFEF21DE40B RtlUserThreadStart+43


dyladan commented Feb 28, 2023

Looks like #3642 addresses the underlying issue causing the high memory usage, so that might be a better way to go.

MSNev closed this Mar 1, 2023