Please answer these questions before submitting a bug report.

**What version of OpenTelemetry are you using?**

```json
"@opentelemetry/api": "^1.0.4",
"@opentelemetry/auto-instrumentations-node": "^0.27.4",
"@opentelemetry/exporter-trace-otlp-proto": "^0.27.0",
"@opentelemetry/resources": "^1.0.1",
"@opentelemetry/sdk-node": "^0.27.0",
"@opentelemetry/semantic-conventions": "^1.0.1",
"@opentelemetry/exporter-trace-otlp-http": "^0.27.0",
```

**What version of Node are you using?**

v16.14.0

**Please provide the code you used to setup the OpenTelemetry SDK**

```js
"use strict";

const { v4: uuidv4 } = require("uuid");
const process = require("process");
const opentelemetry = require("@opentelemetry/sdk-node");
const { getNodeAutoInstrumentations } = require("@opentelemetry/auto-instrumentations-node");
const { Resource } = require("@opentelemetry/resources");
const { SemanticResourceAttributes } = require("@opentelemetry/semantic-conventions");
const { OTLPTraceExporter } = require("@opentelemetry/exporter-trace-otlp-proto");
const api = require("@opentelemetry/api");
const { CompressionAlgorithm } = require("@opentelemetry/exporter-trace-otlp-http");

api.diag.setLogger(new api.DiagConsoleLogger(), api.DiagLogLevel.DEBUG);

const resource = new Resource({
  [SemanticResourceAttributes.SERVICE_NAME]: "OpenTelemetry-Node.JS-Example",
  [SemanticResourceAttributes.SERVICE_INSTANCE_ID]: uuidv4(),
});

const instrumentations = [getNodeAutoInstrumentations()];

const traceExporter = new OTLPTraceExporter({
  compression: CompressionAlgorithm.GZIP,
  url: "http://localhost:4318/v1/traces",
});

const sdk = new opentelemetry.NodeSDK({
  resource,
  traceExporter,
  instrumentations,
});

sdk
  .start()
  .then(() => console.log("Tracing initialized"))
  .catch((error) => console.log("Error initializing tracing", error));

process.on("SIGTERM", () => {
  sdk
    .shutdown()
    .then(() => console.log("Tracing terminated"))
    .catch((error) => console.log("Error terminating tracing", error))
    .finally(() => process.exit(0));
});
```

**What did you do?**

You can use this example to reproduce the error: https://github.com/newrelic/newrelic-opentelemetry-examples/tree/main/javascript/simple-nodejs-app-http-exp

Just update the endpoint to export to a local collector and add gzip compression to the exporter (as shown above).

**What did you expect to see?**

statusCode: 200

**What did you see instead?**

Got this error:

```
stack":"Error: Timeout\n    at Timeout._onTimeout......
```
Update on this issue:

The `Content-Length` header was causing it. The length was set before the data was compressed. Removing the `Content-Length` header solved the issue.
Shouldn't we set the correct length after compression instead?
That would require gzipping everything in memory first, because the HTTP headers have to be sent before the body, rather than streaming the data.