
Unable to export when using gzip compression with OTLP/HTTP and OTLP/PROTO trace exporters #2876

Closed
svetlanabrennan opened this issue Mar 31, 2022 · 3 comments · Fixed by #2879
Labels
bug Something isn't working

Comments

@svetlanabrennan
Contributor

Please answer these questions before submitting a bug report.

What version of OpenTelemetry are you using?

"@opentelemetry/api": "^1.0.4",
"@opentelemetry/auto-instrumentations-node": "^0.27.4",
"@opentelemetry/exporter-trace-otlp-proto": "^0.27.0",
"@opentelemetry/resources": "^1.0.1",
"@opentelemetry/sdk-node": "^0.27.0",
"@opentelemetry/semantic-conventions": "^1.0.1",
"@opentelemetry/exporter-trace-otlp-http": "^0.27.0",

What version of Node are you using?

v16.14.0

Please provide the code you used to setup the OpenTelemetry SDK

"use strict";

const { v4: uuidv4 } = require("uuid");
const process = require("process");
const opentelemetry = require("@opentelemetry/sdk-node");
const { getNodeAutoInstrumentations } = require("@opentelemetry/auto-instrumentations-node");
const { Resource } = require("@opentelemetry/resources");
const { SemanticResourceAttributes } = require("@opentelemetry/semantic-conventions");
const { OTLPTraceExporter } = require("@opentelemetry/exporter-trace-otlp-proto");
const api = require("@opentelemetry/api");
const { CompressionAlgorithm } = require('@opentelemetry/exporter-trace-otlp-http')

api.diag.setLogger(
  new api.DiagConsoleLogger(),
  api.DiagLogLevel.DEBUG,
);

const resource = new Resource({
  [SemanticResourceAttributes.SERVICE_NAME]: "OpenTelemetry-Node.JS-Example",
  [SemanticResourceAttributes.SERVICE_INSTANCE_ID]: uuidv4(),
});

const instrumentations = [getNodeAutoInstrumentations()];

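// Enabling gzip compression on this exporter is what triggers the reported timeout.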
const traceExporter = new OTLPTraceExporter({
  compression: CompressionAlgorithm.GZIP,
  url: 'http://localhost:4318/v1/traces'
});

const sdk = new opentelemetry.NodeSDK({
  resource,
  traceExporter,
  instrumentations,
});

sdk
  .start()
  .then(() => console.log("Tracing initialized"))
  .catch((error) => console.log("Error initializing tracing", error));

process.on("SIGTERM", () => {
  sdk
    .shutdown()
    .then(() => console.log("Tracing terminated"))
    .catch((error) => console.log("Error terminating tracing", error))
    .finally(() => process.exit(0));
});

What did you do?

If possible, provide a recipe for reproducing the error.

You can use this example to reproduce the error: https://github.com/newrelic/newrelic-opentelemetry-examples/tree/main/javascript/simple-nodejs-app-http-exp

Just update the endpoint to export to a local collector and add gzip compression to the exporter (as shown above).

What did you expect to see?

statusCode: 200

What did you see instead?

Got a timeout error instead: stack":"Error: Timeout\n at Timeout._onTimeout......


@svetlanabrennan added the bug label on Mar 31, 2022
@svetlanabrennan
Contributor Author

Update on this issue:

The 'Content-Length' header was causing this issue: its value was computed from the payload before the data was compressed, so the gzipped body was shorter than the header declared and the server kept waiting for the rest.

Removing the Content-Length header solved the issue.
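
For illustration, here is a minimal sketch of the failure mode (not the exporter's actual code; the endpoint and payload are placeholders): when Content-Length is taken from the payload before it is gzipped, the collector receives fewer bytes than the header promises, keeps waiting for the rest, and the exporter eventually times out. Leaving the header off lets Node fall back to chunked transfer encoding, so the compressed body is framed correctly.

"use strict";

const http = require("http");
const zlib = require("zlib");

// Placeholder payload standing in for the serialized spans.
const payload = Buffer.from(JSON.stringify({ resourceSpans: [] }));
const gzipped = zlib.gzipSync(payload);

const request = http.request("http://localhost:4318/v1/traces", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    "Content-Encoding": "gzip",
    // BUG: this is the *uncompressed* size, so the collector waits for
    // bytes that never arrive and the request times out:
    // "Content-Length": payload.length,
    //
    // With the header omitted, Node sends Transfer-Encoding: chunked and
    // the request completes (the fix reported in this comment).
  },
});

request.on("response", (res) => console.log("statusCode:", res.statusCode));
request.on("error", (err) => console.error(err));
request.end(gzipped);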

@vmarchaud
Member

Shouldn't we set the correct length after compression instead?

@Flarna
Member

Flarna commented Apr 2, 2022

Shouldn't we set the correct length after compression instead?

That would require gzipping the whole payload in memory first, since the HTTP headers have to be sent before the body, instead of streaming the data.
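
For illustration only, here is a sketch of that trade-off using Node's http and zlib (not the exporter's internals; the endpoint and payload are placeholders). Buffering the gzipped payload makes an exact Content-Length available before the headers go out, at the cost of holding the whole compressed body in memory; streaming through a gzip transform keeps memory flat, but the final size is unknown when the headers are written, so the request has to use chunked transfer encoding instead.

"use strict";

const http = require("http");
const zlib = require("zlib");

const url = "http://localhost:4318/v1/traces"; // placeholder endpoint
const body = Buffer.from("placeholder serialized spans");

// Option A: compress everything up front so an exact Content-Length can be
// sent with the headers. Memory usage grows with the payload size.
function sendBuffered() {
  const gzipped = zlib.gzipSync(body);
  const req = http.request(url, {
    method: "POST",
    headers: {
      "Content-Encoding": "gzip",
      "Content-Length": gzipped.length, // correct because it is measured after compression
    },
  });
  req.on("error", (err) => console.error(err));
  req.end(gzipped);
}

// Option B: stream through a gzip transform. The compressed size is unknown
// when the headers are written, so no Content-Length is set and Node uses
// chunked transfer encoding.
function sendStreamed() {
  const req = http.request(url, {
    method: "POST",
    headers: { "Content-Encoding": "gzip" },
  });
  req.on("error", (err) => console.error(err));
  const gzip = zlib.createGzip();
  gzip.pipe(req);
  gzip.end(body);
}

Option A is what setting the length after compression would imply; option B matches the streaming behavior described above, where the compressed length simply is not known yet when the headers have to go out.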
