[Bugfix]: Fixing bug in label generation for multiple groupbys #3594
mistercrunch merged 1 commit into apache:master from
Conversation
```js
} else {
  label = verbose_map[column];
  if (Array.isArray(column) && column.length) {
    label = verbose_map[column[0]] || column[0];
```
Is it a safe assumption that the metric is always in position 0?
`const label = column.map(s => verbose_map[s] || s).join(', ')` may be more elegant, but it may "verbosify" dimension members instead of metrics (collisions are probably rare and not super problematic). This problem exists here too: in the current code, `column[0]` could be either a metric or a dimension member anyhow.
Now I'm thinking verbosifying could be done in the backend, where we know what is a metric and what is a dimension member (this info is lost on the frontend). I know I said earlier to verbosify it in the frontend; what I wanted to avoid was using verbose names in dataframe column headers...
In any case, I'll merge this to fix the bug, and we can revisit later if needed.
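The two labeling strategies under discussion can be sketched side by side. This is a minimal sketch, not Superset's actual code: the `verbose_map` contents and the sample column tuples are hypothetical.

```javascript
// Hypothetical verbose map: raw metric names -> human-readable labels.
const verbose_map = { sum__num: 'Sum of Num' };

// Strategy taken by this PR: verbosify only position 0,
// which is assumed to hold the metric.
function labelFirstOnly(column) {
  if (Array.isArray(column) && column.length) {
    return [verbose_map[column[0]] || column[0], ...column.slice(1)].join(', ');
  }
  return verbose_map[column] || column;
}

// Alternative from the review comment: verbosify every member.
// Simpler, but a dimension member that collides with a metric
// name would also get renamed.
function labelAll(column) {
  if (Array.isArray(column)) {
    return column.map(s => verbose_map[s] || s).join(', ');
  }
  return verbose_map[column] || column;
}

console.log(labelFirstOnly(['sum__num', 'CA', 'male'])); // Sum of Num, CA, male
console.log(labelAll(['sum__num', 'CA', 'male']));       // Sum of Num, CA, male
```

Both produce the same label here; they only diverge when a groupby member happens to be a key in `verbose_map`.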
Yeah, my change for renaming on the backend is still around somewhere: #3437.
Anyway, your change #3563, as well as the original code, also assumed the metric is in the first position.
There are some advantages (that we currently don't use) to doing it on the frontend.
E.g. we could have a checkbox that could switch between verbose and regular names, or custom verbose maps per slice instead of per datasource. And updating the charts would not require requests to the application.
So if we document and define that metrics always have to come before groupbys, we should be fine doing it in the UI.
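Under that documented ordering, a frontend labeler becomes straightforward. The helper below is a hypothetical sketch assuming the column tuple is always `[metric, ...groupbyMembers]`; the per-slice `verboseMap` is invented for illustration.

```javascript
// Hypothetical per-slice verbose map, as suggested above.
const verboseMap = { count: 'COUNT(*)' };

// Relies on the convention that the metric always precedes
// groupby members, so only the head of the tuple is verbosified.
function formatLabel(column, map = verboseMap) {
  const [metric, ...members] = Array.isArray(column) ? column : [column];
  return [map[metric] || metric, ...members].join(', ');
}

console.log(formatLabel(['count', 'US', 'female'])); // COUNT(*), US, female
console.log(formatLabel('count'));                   // COUNT(*)
```

Because the map lives on the client, toggling verbose names or swapping per-slice maps would only re-run this function, with no round trip to the application.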
Fixes: #3590
Relates to: #3504 #3563 #3566
I guess I started this so I'll take 99% of the blame here:
History:
- Grouped-by, single-metric time series charts were affected; #3566 would have fixed both issues but came too late.
- So we either apply this fix, or roll back #3563 and apply #3566.
Hopefully this is the last issue :(