feat(component spec validation tests): add partial support for component_discarded_events_total metrics #16935
davidhuie-dd wants to merge 1 commit into dh/error-metrics-validation from
Conversation
Regression Detector Results

Run ID: 27df08c6-2c2d-4399-94bd-ff28fd12814a

Explanation: A regression test is an integrated performance test for

The table below, if present, lists those experiments that have experienced a statistically significant change in mean optimization goal performance between baseline and comparison SHAs with 90.00% confidence, OR have been detected as newly erratic. Negative values mean that baseline is faster; positive values mean that comparison is faster. Results that do not exhibit more than a ±5.00% change in their mean optimization goal are discarded. An experiment is erratic if its coefficient of variation is greater than 0.1. The abbreviated table will be omitted if no interesting change is observed.

No interesting changes in experiment optimization goals with confidence ≥ 90.00% and |Δ mean %| ≥ 5.00%.

Fine details of change detection per experiment.
```rust
        .expect("should not fail to encode input event");
}
TestEvent::Interrupted {
    interrupted: _,
```
can these two cases be combined since they are identical?
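The suggestion above can be sketched with Rust's or-patterns: two match arms with identical bodies can be merged by joining their patterns with `|`. The `TestEvent` variants and the `encode` helper below are illustrative stand-ins, not the actual types from this PR.

```rust
// Hypothetical sketch of merging identical match arms with an or-pattern.
// `TestEvent` here is a simplified stand-in for the real test event type.
enum TestEvent {
    Passthrough(String),
    Modified(String),
    Interrupted(String),
}

fn encode(event: TestEvent) -> String {
    match event {
        // Instead of two separate arms with identical bodies, bind the
        // payload once across both patterns with `|`:
        TestEvent::Passthrough(payload) | TestEvent::Modified(payload) => {
            format!("encoded:{payload}")
        }
        TestEvent::Interrupted(payload) => format!("interrupted:{payload}"),
    }
}

fn main() {
    println!("{}", encode(TestEvent::Passthrough("a".into())));
    println!("{}", encode(TestEvent::Modified("b".into())));
    println!("{}", encode(TestEvent::Interrupted("c".into())));
}
```

Or-patterns require every alternative to bind the same names with the same types, which is exactly the situation the reviewer describes.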
David Huie seems not to be a GitHub user. You need a GitHub account to be able to sign the CLA. If you already have a GitHub account, please add the email address used for this commit to your account. Have you already signed the CLA but the status is still pending? Let us recheck it.
The code this touches has changed on master with changes I've made in the area, so this PR is no longer suitable to merge as is; the logic it introduced is already covered.
#16842
This introduces a test function that can verify the component_discarded_events_total metric total according to a new test event type: TestEvent::Interrupted. I don't introduce any actual test harness support for this metric here, since doing so is going to be an involved process per component: we'll have to bring up the full integration test harness, then interrupt the event transmission process midway through in order to trigger these errors. I imagine we will only do this for our most important integrations, due to the high cost of implementing support for this metric.
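The verification idea described above can be sketched as follows: interrupted events are the ones expected to be discarded, so the expected metric total is the count of `Interrupted` inputs. The `TestEvent` enum and the `expected_discarded_total` helper are assumptions for illustration, not the PR's actual harness API.

```rust
// Illustrative sketch only: compute the expected value of
// `component_discarded_events_total` from a set of test inputs.
// Every `Interrupted` input should be counted once as discarded.
#[derive(Debug)]
enum TestEvent {
    Passthrough(&'static str),
    Interrupted(&'static str),
}

// Hypothetical helper: the expected discarded-event total is simply
// the number of interrupted test events fed to the component.
fn expected_discarded_total(inputs: &[TestEvent]) -> u64 {
    inputs
        .iter()
        .filter(|e| matches!(e, TestEvent::Interrupted(_)))
        .count() as u64
}

fn main() {
    let inputs = [
        TestEvent::Passthrough("a"),
        TestEvent::Interrupted("b"),
        TestEvent::Interrupted("c"),
    ];
    // A real test would run the component, interrupt transmission midway,
    // then assert the emitted metric equals this expected value.
    println!("{}", expected_discarded_total(&inputs));
}
```

In the full harness, the hard part is not this accounting but actually interrupting transmission mid-stream, which is why the PR description scopes that work per component.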