Observable Counters/UpDownCounters Report Incorrect Values #2296

Closed
samin36 opened this issue Sep 5, 2023 · 1 comment · Fixed by #2298
Assignees: lalitb
Labels: bug, triage/accepted

Comments

samin36 commented Sep 5, 2023

Environment
Opentelemetry-cpp v1.9.0 on CentOS 7

Problem
Observable Counters/UpDownCounters report incorrect values if the initial observation value is non-zero.

Steps to reproduce

#include "opentelemetry/exporters/ostream/metric_exporter.h"
#include "opentelemetry/metrics/provider.h"
#include "opentelemetry/sdk/metrics/aggregation/default_aggregation.h"
#include "opentelemetry/sdk/metrics/aggregation/histogram_aggregation.h"
#include "opentelemetry/sdk/metrics/export/periodic_exporting_metric_reader.h"
#include "opentelemetry/sdk/metrics/meter.h"
#include "opentelemetry/sdk/metrics/meter_provider.h"
#include <memory>
#include <thread>

namespace metric_sdk = opentelemetry::sdk::metrics;
namespace common = opentelemetry::common;
namespace exportermetrics = opentelemetry::exporter::metrics;
namespace metrics_api = opentelemetry::metrics;
namespace nostd = opentelemetry::nostd;

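// Non-zero initial observation (3); it is incremented on every callback invocation.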
int value = 3;

void InitMetrics(const std::string &name) {
    std::unique_ptr<metric_sdk::PushMetricExporter> exporter{
        new exportermetrics::OStreamMetricExporter};

    std::string version{"1.2.0"};
    std::string schema{"https://opentelemetry.io/schemas/1.2.0"};

    // Initialize and set the global MeterProvider
    metric_sdk::PeriodicExportingMetricReaderOptions options;
    options.export_interval_millis = std::chrono::milliseconds(5000);
    options.export_timeout_millis = std::chrono::milliseconds(500);
    std::unique_ptr<metric_sdk::MetricReader> reader{
        new metric_sdk::PeriodicExportingMetricReader(std::move(exporter),
                                                      options)};
    auto provider = std::shared_ptr<metrics_api::MeterProvider>(
        new metric_sdk::MeterProvider());
    auto p = std::static_pointer_cast<metric_sdk::MeterProvider>(provider);
    p->AddMetricReader(std::move(reader));

    metrics_api::Provider::SetMeterProvider(provider);
}

void CleanupMetrics() {
    std::shared_ptr<metrics_api::MeterProvider> none;
    metrics_api::Provider::SetMeterProvider(none);
}

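// Observable callback: reports the current value of the global counter, then
// increments it so the next collection observes the next value.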
void TestCallback(opentelemetry::metrics::ObserverResult observer_result,
                  void * /* state */) {
    if (nostd::holds_alternative<nostd::shared_ptr<
            opentelemetry::metrics::ObserverResultT<int64_t>>>(
            observer_result)) {
        nostd::get<nostd::shared_ptr<
            opentelemetry::metrics::ObserverResultT<int64_t>>>(observer_result)
            ->Observe(value++);
    }
}

int main(int argc, char **argv) {
    std::string name{"ostream_metric_example"};
    InitMetrics(name);

    auto provider = metrics_api::Provider::GetMeterProvider();
    nostd::shared_ptr<metrics_api::Meter> meter =
        provider->GetMeter(name, "1.2.0");

    auto test_counter =
        meter->CreateInt64ObservableUpDownCounter("test_counter");
    test_counter->AddCallback(TestCallback, nullptr);

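    // Spin so the process stays alive and the periodic reader keeps collecting/exporting.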
    while (true);

    CleanupMetrics();
}

What is the expected behavior?
The values should be reported as 3, 4, 5, 6, etc. (a short sketch of this mapping follows the sample output below).

{
  scope name    : ostream_metric_example
  schema url    :
  version       : 1.2.0
  start time    : Tue Sep  5 14:50:30 2023
  end time      : Tue Sep  5 14:50:30 2023
  instrument name       : test_counter
  description   :
  unit          :
  type          : SumPointData
  value         : 3
  attributes            :
  resources     :
        service.name: unknown_service
        telemetry.sdk.language: cpp
        telemetry.sdk.name: opentelemetry
        telemetry.sdk.version: 1.9.0
}
{
  scope name    : ostream_metric_example
  schema url    :
  version       : 1.2.0
  start time    : Tue Sep  5 14:50:30 2023
  end time      : Tue Sep  5 14:50:35 2023
  instrument name       : test_counter
  description   :
  unit          :
  type          : SumPointData
  value         : 4
  attributes            :
  resources     :
        service.name: unknown_service
        telemetry.sdk.language: cpp
        telemetry.sdk.name: opentelemetry
        telemetry.sdk.version: 1.9.0
}
{
  scope name    : ostream_metric_example
  schema url    :
  version       : 1.2.0
  start time    : Tue Sep  5 14:50:30 2023
  end time      : Tue Sep  5 14:50:40 2023
  instrument name       : test_counter
  description   :
  unit          :
  type          : SumPointData
  value         : 5
  attributes            :
  resources     :
        service.name: unknown_service
        telemetry.sdk.language: cpp
        telemetry.sdk.name: opentelemetry
        telemetry.sdk.version: 1.9.0
}
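
For reference, with cumulative aggregation temporality the value passed to Observe() by an asynchronous instrument is itself the cumulative total, so each exported SumPointData value should simply mirror the latest observation. A minimal standalone sketch of that mapping (an illustration only, not the SDK's aggregation code):

#include <cstdint>
#include <iostream>

int main()
{
    int64_t exported = 0;
    for (int64_t observed : {3, 4, 5, 6})
    {
        // Asynchronous instruments report absolute values, so the cumulative
        // point is just the most recent observation.
        exported = observed;
        std::cout << "value : " << exported << "\n";  // prints 3, 4, 5, 6
    }
    return 0;
}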

What is the actual behavior?
The values are reported as 3, 7, 8, 9, etc. (a sketch of the arithmetic these numbers suggest follows the output below).


{
  scope name    : ostream_metric_example
  schema url    :
  version       : 1.2.0
  start time    : Tue Sep  5 14:50:30 2023
  end time      : Tue Sep  5 14:50:30 2023
  instrument name       : test_counter
  description   :
  unit          :
  type          : SumPointData
  value         : 3
  attributes            :
  resources     :
        service.name: unknown_service
        telemetry.sdk.language: cpp
        telemetry.sdk.name: opentelemetry
        telemetry.sdk.version: 1.9.0
}
{
  scope name    : ostream_metric_example
  schema url    :
  version       : 1.2.0
  start time    : Tue Sep  5 14:50:30 2023
  end time      : Tue Sep  5 14:50:35 2023
  instrument name       : test_counter
  description   :
  unit          :
  type          : SumPointData
  value         : 7
  attributes            :
  resources     :
        service.name: unknown_service
        telemetry.sdk.language: cpp
        telemetry.sdk.name: opentelemetry
        telemetry.sdk.version: 1.9.0
}
{
  scope name    : ostream_metric_example
  schema url    :
  version       : 1.2.0
  start time    : Tue Sep  5 14:50:30 2023
  end time      : Tue Sep  5 14:50:40 2023
  instrument name       : test_counter
  description   :
  unit          :
  type          : SumPointData
  value         : 8
  attributes            :
  resources     :
        service.name: unknown_service
        telemetry.sdk.language: cpp
        telemetry.sdk.name: opentelemetry
        telemetry.sdk.version: 1.9.0
}
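
One plausible reading of these numbers (again, only an illustration of the arithmetic, not the SDK's actual code): the first observation is exported as-is but never recorded as the "last observed" value, so the second collection adds the full second observation as a delta (3 + 4 = 7); from then on the delta against the previous observation is computed correctly (7 + (5 - 4) = 8, 8 + (6 - 5) = 9). A minimal sketch that reproduces the sequence under that assumption:

#include <cstdint>
#include <iostream>

int main()
{
    int64_t last_observed = 0;  // hypothetically left at 0 after the first collection
    int64_t running_total = 0;
    bool first = true;

    for (int64_t observed : {3, 4, 5, 6})
    {
        if (first)
        {
            running_total = observed;  // first export: 3, as expected
            first = false;
            // suspected gap: last_observed is not updated to 3 here
        }
        else
        {
            running_total += observed - last_observed;  // 3 + (4 - 0) = 7
            last_observed = observed;                   // updated from here on
        }
        std::cout << "value : " << running_total << "\n";  // prints 3, 7, 8, 9
    }
    return 0;
}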

Additional Info
I tried to reproduce the same issue with:

  • the Opentelemetry-Python SDK, but that reported the correct values (3, 4, 5, 6, etc.)
  • Opentelemetry-cpp v1.11.0, and was able to reproduce the issue there as well
samin36 added the bug label Sep 5, 2023
github-actions bot added the needs-triage label Sep 5, 2023

lalitb commented Sep 5, 2023

> Environment
> Opentelemetry-cpp v1.9.0 on CentOS 7

> Opentelemetry-cpp v1.11.0 and was able to successfully do so.

Do you mean you are seeing this issue with v1.9.0, and it works correctly in v1.11.0?

OK, please ignore that; I guess you mean the issue is there in v1.11.0 too. Will check further.

lalitb self-assigned this Sep 5, 2023
marcalff added the triage/accepted label and removed the needs-triage label Sep 6, 2023