JaegerExporter can throw on export #608
Explanation of the issue: BatchSpanProcessor has a 20 second default timeout. When the JaegerExporter exports, it appends the spans to its sender and then flushes. After 20 seconds, the BatchSpanProcessor exports a batch of spans, which are appended and then flushed. During this export, the 4th 5-second tick of the Jaeger exporter's own flush timer fires and it attempts to flush as well. The two concurrent flushes cause the crash.

Fix: remove the 5 second timer from the Jaeger exporter. The Jaeger exporter is already flushing on every export, so there is no need for it to ever flush on a timer.
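The race described above can be sketched in a few lines. This is illustrative only, with hypothetical names (`FakeSender`, `JaegerLikeExporter`), not the real opentelemetry-js or jaeger-client classes; it shows why dropping the periodic timer and flushing only from `export()` avoids two flushes ever overlapping:

```javascript
// A sender that throws if a second flush starts while one is in progress,
// mimicking the crash described in this issue. Names are hypothetical.
class FakeSender {
  constructor() {
    this.flushing = false;
    this.flushCount = 0;
  }
  flush() {
    if (this.flushing) throw new Error('concurrent flush'); // the crash
    this.flushing = true;
    this.flushCount += 1;
    this.flushing = false;
  }
}

class JaegerLikeExporter {
  constructor(sender) {
    this.sender = sender;
    // The fix: no `setInterval(() => this.sender.flush(), 5000)` here.
    // Flushing happens only as part of export(), so flushes never overlap.
  }
  export(spans) {
    // Append spans to the sender's buffer, then flush immediately.
    this.sender.flush();
  }
}

const exporter = new JaegerLikeExporter(new FakeSender());
exporter.export(['span-1']);
exporter.export(['span-2']);
console.log(exporter.sender.flushCount); // 2
```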
I think this is not true; the sender will wait until its internal buffer reaches the maximum packet size.
This might very well work in production, but in the case of examples you might not see the full traces unless you call flush manually.
@mayurkale22 stopping my simple example in the debugger shows the max packet size value coming from line 87. All of this is to say, the branch you linked always evaluates to false no matter how the Jaeger exporter is configured.
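The condition being debated can be reduced to a minimal sketch. The function name and numbers here are illustrative assumptions, not the real jaeger-client code: the point is that an early flush only triggers when the batch's byte length exceeds the max packet size, so if the computed length stays far below that limit, the branch never fires:

```javascript
// Illustrative sketch of a size-based early-flush check, as discussed above.
// `batchByteLength` and `maxPacketSize` are hypothetical parameter names.
function shouldFlushEarly(batchByteLength, maxPacketSize) {
  // Only flush before the timer/export path when the buffered batch
  // would exceed a single UDP packet.
  return batchByteLength > maxPacketSize;
}

console.log(shouldFlushEarly(1200, 65000));  // false: small batch, never fires
console.log(shouldFlushEarly(70000, 65000)); // true: batch exceeds the packet size
```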
I just did that.
with the exact above code using the 0.2.0 release versions of otel? |
Blowing away my node_modules and rebuilding also gave me 64935.
In case someone else stumbles over the same error: double check how the byte length is computed.

```javascript
const x = this._agentThrift.Agent.emitBatch.argumentsMessageRW.byteLength(this._convertBatchToThriftMessage());
console.log(x, x.length);
return x.length;
```

If something is wrong with the message itself, this function barfs the error message out. Hope that helps someone.
Please answer these questions before submitting a bug report.
What version of OpenTelemetry are you using?
0.2.0
What version of Node are you using?
8 and 10
What did you do?
If possible, provide a recipe for reproducing the error.
Make an http request to localhost:8080, wait about 5 seconds for the batch to send.
What did you expect to see?
spans in the backend
What did you see instead?
process crash
@mayurkale22 please assign me, I think I found the issue.