
chore: adding force flush to span processors #802

Merged
5 commits merged into open-telemetry:master on Mar 3, 2020

Conversation

@obecny (Member) commented Feb 20, 2020

Which problem is this PR solving?

Short description of the changes

  • It adds a forceFlush function to the span processor. It mainly affects the BatchSpanProcessor, since the other processors export spans immediately without waiting (see the interface sketch below).
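
As a rough illustration of the shape of the change (not the exact code from this PR; the real forceFlush signature may take a callback or return a promise), the processor interface gains a flush hook alongside the existing lifecycle methods:

// Sketch only: placeholder span type; the SDK uses its own span types.
type Span = Record<string, unknown>;

interface SpanProcessor {
  onStart(span: Span): void;
  onEnd(span: Span): void;
  // New in this PR: export any buffered spans immediately.
  forceFlush(): void;
  shutdown(): void;
}

For the BatchSpanProcessor, forceFlush would drain the internal buffer right away instead of waiting for the scheduled export; for a simple processor that exports on every onEnd it is effectively a no-op.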

@obecny obecny self-assigned this Feb 20, 2020
@obecny obecny added the size/L Denotes a PR that changes 100-499 lines, ignoring generated files. label Feb 20, 2020
@dyladan (Member) left a comment

LGTM. Happy to get rid of some @todos.

@dyladan dyladan added the enhancement New feature or request label Feb 20, 2020
@dyladan dyladan changed the title chore: adding force flash to span processors chore: adding force flush to span processors Feb 20, 2020

  // does nothing.
  onStart(span: Span): void {}

  onEnd(span: Span): void {
    if (this._isShutdown) {
Member

This looks redundant to me; we already have the same logic in the Zipkin and Collector exporters. Maybe it's a bug in the Stackdriver exporter, where we don't do anything on shutdown, and in the Jaeger exporter we just close the socket connection.

Member

I actually agree. If the span processor shuts down its exporter, then it should be safe for it to continue sending spans even after shutdown, and the exporter will just no-op.

@obecny (Member, Author) commented Feb 21, 2020

But this is what the spec says for span processors:
https://github.com/open-telemetry/opentelemetry-specification/blob/master/specification/sdk-tracing.md#shutdown
So they should have the same functionality?

Member

One additional if check isn't hurting anything. I would say just leave it in.
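
For illustration, a minimal sketch of the guard being discussed (names are illustrative, not the exact PR code):

export class GuardedProcessorSketch {
  private _isShutdown = false;

  onEnd(span: unknown): void {
    if (this._isShutdown) {
      // drop spans once the processor has been shut down
      return;
    }
    // ...buffer or export the span here...
  }

  shutdown(): void {
    this._isShutdown = true;
    // the exporter's own shutdown already makes later exports no-ops,
    // which is why reviewers called this extra check redundant but harmless
  }
}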

@dyladan dyladan added this to the Beta milestone Feb 26, 2020
@dyladan (Member) commented Mar 2, 2020

Please resolve the conflicts.

@open-telemetry/javascript-approvers please review this. It is needed for beta.

@codecov-io commented Mar 2, 2020

Codecov Report

Merging #802 into master will increase coverage by 1.84%.
The diff coverage is 100%.

@@            Coverage Diff             @@
##           master     #802      +/-   ##
==========================================
+ Coverage   92.54%   94.38%   +1.84%     
==========================================
  Files         245      246       +1     
  Lines       10712    10837     +125     
  Branches     1044     1052       +8     
==========================================
+ Hits         9913    10229     +316     
+ Misses        799      608     -191
Impacted Files | Coverage Δ
...telemetry-tracing/test/BasicTracerProvider.test.ts | 100% <ø> (ø)
packages/opentelemetry-resources/src/version.ts | 0% <ø> (ø) ⬆️
...pentelemetry-exporter-prometheus/src/prometheus.ts | 90% <100%> (ø) ⬆️
...ages/opentelemetry-plugin-http/test/utils/utils.ts | 33.33% <0%> (-26.67%) ⬇️
...ges/opentelemetry-plugin-https/test/utils/utils.ts | 33.33% <0%> (-26.67%) ⬇️
.../opentelemetry-exporter-collector/src/transform.ts | 88.88% <0%> (-8.09%) ⬇️
...ntelemetry-tracing/test/MultiSpanProcessor.test.ts | 97.43% <0%> (-2.57%) ⬇️
...entelemetry-exporter-jaeger/test/transform.test.ts | 100% <0%> (ø) ⬆️
...s/opentelemetry-metrics/test/MeterProvider.test.ts | 100% <0%> (ø) ⬆️
packages/opentelemetry-resources/src/constants.ts | 100% <0%> (ø) ⬆️
... and 52 more

@mayurkale22 (Member) left a comment

lgtm

@mayurkale22 (Member)

/cc @hekike (original author) FYI

@@ -45,7 +44,6 @@ export class ZipkinExporter implements SpanExporter {
    const urlStr = config.url || ZipkinExporter.DEFAULT_URL;
    const urlOpts = url.parse(urlStr);

    this._forceFlush = config.forceFlush || true;
Member

I think we should remove _forceFlushOnShutdown from the Jaeger exporter also.

Member (Author)

In Zipkin this functionality was in fact "dead", but for Jaeger it does something: there is a timeout that waits for Jaeger to close the sender. Are you ok with doing that in a separate PR then?

Member

sure 👍
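
A side note on the Zipkin flag discussed above (an observation about the quoted line, not something stated in the thread): with the initialization config.forceFlush || true, the option can never come out false, which is one reading of why it was effectively dead:

// The quoted line: this._forceFlush = config.forceFlush || true;
const forceFlushFrom = (config: { forceFlush?: boolean }) =>
  config.forceFlush || true;

forceFlushFrom({ forceFlush: false }); // true, because false || true is true
forceFlushFrom({});                    // true, because undefined || true is true
forceFlushFrom({ forceFlush: true });  // true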

@mayurkale22 mayurkale22 added the Merge:LGTM This PR is ready to be merged by a Maintainer (has enough valid approvals, successful build, etc.) label Mar 3, 2020
@mayurkale22 mayurkale22 merged commit b4cfde3 into open-telemetry:master Mar 3, 2020
@obecny obecny deleted the feat-force-flash branch July 8, 2020 12:15
Development

Successfully merging this pull request may close these issues.

Add ForceFlush to span processor