chore: update core dependencies to 0.25 #635
Conversation
Codecov Report
```diff
@@            Coverage Diff             @@
##             main     #635      +/-   ##
==========================================
- Coverage   96.68%   94.87%    -1.82%
==========================================
  Files          13      208      +195
  Lines         634    12128    +11494
  Branches      124     1155     +1031
==========================================
+ Hits          613    11506    +10893
- Misses         21      622      +601
```
Anyone know why this is failing? I'm not really sure.
Best guess is too-high load because of parallel testing. I see no other reason for e.g.
I agree. I can't reproduce it locally either.
Trying to limit concurrency to 4 (there are 2 CPUs on the runners) to see if that helps. We probably haven't seen this before because we haven't had an update that touches so many packages.
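(For reference, in a lerna-managed repo a limit like this would typically be applied by running the package test scripts with something along the lines of `lerna run test --concurrency 4` in the CI workflow; the exact command used here is an assumption.)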
Seems like the lambda failure is legitimate.
@willarmiros the AWS Lambda tests are consistently failing in CI but pass locally. Any idea what might be up?
Taking a look... seems like the problem is that the spans aren't getting exported. Were there any changes to the memory exporter or force flush?
I can see that no relevant changes were made, and that wouldn't really make sense anyway since the tests pass locally. There are many other tests that use the in-memory exporter and then verify the content of the exported spans. One difference is that we're using the BatchSpanProcessor while all the others I've seen use the SimpleSpanProcessor, so maybe there's something to that (see the sketch below).
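For context on that difference, here is a minimal sketch (not this repo's actual test code) of why assertions against an in-memory exporter can race when a BatchSpanProcessor is involved. The package name and setup assume the 0.25-era `@opentelemetry/sdk-trace-base` API; the tracer and span names are made up:

```ts
import * as assert from 'assert';
import {
  BasicTracerProvider,
  BatchSpanProcessor,
  InMemorySpanExporter,
} from '@opentelemetry/sdk-trace-base';

async function demo() {
  const exporter = new InMemorySpanExporter();
  // A SimpleSpanProcessor would hand each span to the exporter synchronously
  // when end() is called; the BatchSpanProcessor buffers spans and exports
  // them later, on a timer or when explicitly flushed.
  const processor = new BatchSpanProcessor(exporter);

  const provider = new BasicTracerProvider();
  provider.addSpanProcessor(processor);

  const tracer = provider.getTracer('lambda-test');
  tracer.startSpan('handler').end();

  // Without this explicit flush, getFinishedSpans() may still be empty when
  // the assertion runs; that is exactly the kind of race that might only
  // surface on a slow, heavily loaded CI runner.
  await processor.forceFlush();

  assert.strictEqual(exporter.getFinishedSpans().length, 1);
}

demo().catch(err => {
  console.error(err);
  process.exit(1);
});
```

If the Lambda tests rely on the batch processor's own timer (or on a flush that happens inside the instrumentation) rather than flushing explicitly before asserting, a loaded runner could plausibly delay the export past the assertion.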
Not that I'm aware of.
A quick perusal of the 0.25 release notes shows only this that might have made a difference?
Yeah, I saw that. I don't think that's the culprit, because it should have equally impacted the tests that use the SimpleSpanProcessor. It could be something weird with the GitHub runner environment; I'm just really not sure why we'd only see it with these tests on this particular PR.
@anuraaga do you have any ideas?
@willarmiros Since we don't have a package-lock.json, is it possible the dependencies aren't at the same versions locally as in CI? That's what comes to mind as a possible reason for not being able to reproduce. We have this, which relies on some weird behavior of the patching infrastructure, but I wonder if an update changed that behavior.
@anuraaga @willarmiros I wonder if it has to do with hoisting? Dependencies are hoisted in CI to make the run faster.
What it says on the tin