Add Prometheus Receiver tests to run OpenMetrics test cases #6409
Conversation
What is the plan for keeping these tests in sync with the openmetrics repo? Can we add a script or makefile target to sync the tests from a tag in the openmetrics repo?
receiver/prometheusreceiver/metrics_receiver_open_metrics_test.go
@dashpole My initial approach was to pull the test data within these tests directly from the OpenMetrics repo, but that meant the tests would depend on GitHub availability. Will discuss this in the Prometheus WG meeting, and perhaps we can open a follow-up issue if we want to keep the tests in sync.
I'm OK if we want to tackle that as a follow-up.
@codeboten can you please merge? We are blocked on other PRs dependent on this one :-) Ty!
Is there an issue to track the work to sort out how to keep the test data in sync with the OpenMetrics repo?
"bad_clashing_names_0": {}, "bad_clashing_names_1": {}, "bad_clashing_names_2": {}, | ||
"bad_counter_values_0": {}, "bad_counter_values_1": {}, "bad_counter_values_2": {}, | ||
"bad_counter_values_3": {}, "bad_counter_values_5": {}, "bad_counter_values_6": {}, | ||
"bad_counter_values_10": {}, "bad_counter_values_11": {}, "bad_counter_values_12": {}, | ||
"bad_counter_values_13": {}, "bad_counter_values_14": {}, "bad_counter_values_15": {}, | ||
"bad_counter_values_16": {}, "bad_counter_values_17": {}, "bad_counter_values_18": {}, | ||
"bad_counter_values_19": {}, "bad_exemplars_on_unallowed_samples_2": {}, "bad_exemplar_timestamp_0": {}, | ||
"bad_exemplar_timestamp_1": {}, "bad_exemplar_timestamp_2": {}, "bad_grouping_or_ordering_0": {}, | ||
"bad_grouping_or_ordering_2": {}, "bad_grouping_or_ordering_3": {}, "bad_grouping_or_ordering_4": {}, | ||
"bad_grouping_or_ordering_5": {}, "bad_grouping_or_ordering_6": {}, "bad_grouping_or_ordering_7": {}, | ||
"bad_grouping_or_ordering_8": {}, "bad_grouping_or_ordering_9": {}, "bad_grouping_or_ordering_10": {}, | ||
"bad_histograms_0": {}, "bad_histograms_1": {}, "bad_histograms_2": {}, "bad_histograms_3": {}, | ||
"bad_histograms_6": {}, "bad_histograms_7": {}, "bad_histograms_8": {}, | ||
"bad_info_and_stateset_values_0": {}, "bad_info_and_stateset_values_1": {}, "bad_metadata_in_wrong_place_0": {}, | ||
"bad_metadata_in_wrong_place_1": {}, "bad_metadata_in_wrong_place_2": {}, | ||
"bad_missing_or_invalid_labels_for_a_type_1": {}, "bad_missing_or_invalid_labels_for_a_type_3": {}, | ||
"bad_missing_or_invalid_labels_for_a_type_4": {}, "bad_missing_or_invalid_labels_for_a_type_6": {}, | ||
"bad_missing_or_invalid_labels_for_a_type_7": {}, "bad_repeated_metadata_0": {}, | ||
"bad_repeated_metadata_1": {}, "bad_repeated_metadata_3": {}, "bad_stateset_info_values_0": {}, | ||
"bad_stateset_info_values_1": {}, "bad_stateset_info_values_2": {}, "bad_stateset_info_values_3": {}, | ||
"bad_timestamp_4": {}, "bad_timestamp_5": {}, "bad_timestamp_7": {}, "bad_unit_6": {}, "bad_unit_7": {}, |
why pull all these tests in if we're skipping them?
The idea was to keep track of the OpenMetrics tests that aren't currently supported by the receiver.
Created #6439.
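The deny-list shown above is a plain Go set (a map with empty-struct values). A minimal sketch of how such a list can gate which OpenMetrics cases execute; `shouldRun` and the two sample entries here are illustrative, not the receiver's actual helper:

```go
package main

import "fmt"

// skippedTests is a deny-list of OpenMetrics test case names the
// receiver does not yet support. Only two sample entries from the
// PR's full list are reproduced here.
var skippedTests = map[string]struct{}{
	"bad_clashing_names_0": {},
	"bad_counter_values_0": {},
}

// shouldRun reports whether a test case is outside the deny-list.
// In a real test, a deny-listed case would trigger t.Skip instead.
func shouldRun(name string) bool {
	_, skipped := skippedTests[name]
	return !skipped
}

func main() {
	for _, name := range []string{"bad_clashing_names_0", "bad_timestamp_3"} {
		fmt.Printf("%s run=%v\n", name, shouldRun(name))
	}
}
```

Using an explicit deny-list (rather than omitting the files) keeps the unsupported cases visible in the repo, so each one can be removed from the map as support lands.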
…emetry#6409)

* Add Prometheus receiver test to run OpenMetrics test cases
* fix: resolved gosec lint issue
* fix error handling in getTestCase
* remove http requests in tests by copying openmetrics data into testdata
* only skip failing openmetrics tests using a deny-list
Description:
To verify that the Prometheus Receiver correctly transforms OpenMetrics exposition to OTLP format, this PR adds two tests that run the positive and negative test cases from the OpenMetrics repository (https://github.com/OpenObservability/OpenMetrics/tree/main/tests/testdata/parsers):

* TestOpenMetricsPositive test
* TestOpenMetricsNegative test

Link to tracking Issue:
Closes #6106