[Filebeat] Fix conditions to decode_json_fields and which pipeline to run#35268
Conversation
This pull request does not have a backport label.
To fix up this pull request, you need to add the backport labels for the needed branches.
```yaml
processors:
  # non-ECS: same as json.keys_under_root: false, allows compatibility with non-ECS logs.
  - decode_json_fields:
      when:
```
Ensure this only runs if it's a json log
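A hedged sketch of what such a guard could look like in the Filebeat processor config. The `message` field name and the regexp condition are illustrative assumptions, not the exact condition merged in this PR:

```yaml
processors:
  # Only attempt JSON decoding when the raw line looks like a JSON object.
  # The regexp below is an illustrative guess, not the condition from this PR.
  - decode_json_fields:
      fields: ["message"]
      target: "json"
      when:
        regexp:
          message: '^\s*\{'
```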
/test
force-pushed from 90d437e to 73431e0
force-pushed from 73431e0 to fa8a313
It seems that we don't set the
Thanks for catching this. I've managed to fix it and also updated the description with all the scenarios that this change covers.
```yaml
- script:
    lang: painless
    if: 'ctx.json != null'
    description: Merges filebeat generated fields with ECS log content
    source: |-
      ctx.json.keySet().each(key -> ctx.merge(key, ctx.json.get(key), (oldValue, newValue) -> {
        if (newValue instanceof Map) {
          newValue.putAll(oldValue);
        }
        return newValue;
      }))
```
This way, the pipeline can properly merge the ECS log content with the fields that Filebeat adds by default (e.g. `log.offset`).
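As an illustration of the merge (field values are hypothetical), a document before and after the script runs might look like:

```yaml
# Before the script: Filebeat-added fields live at the document root,
# decoded ECS content lives under `json`.
json:
  log:
    level: info
log:
  offset: 1234

# After `ctx.merge(...)`: since both values for the `log` key are maps,
# `newValue.putAll(oldValue)` keeps Filebeat's `log.offset` alongside
# the ECS `log.level`.
log:
  level: info
  offset: 1234
```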
klacabane
left a comment
@crespocarlos walked me through the change offline, looks great thanks!
… run (#35268) * Fix conditions to decode_json_fields and which pipeline to run * Fix pipeline condition * Properly merge ingest log content
What does this PR do?
This PR fixes a problem where Filebeat fails to use the correct ingest pipeline when `json.keys_under_root` is set to `true`.

Why is it important?
The Kibana logs on ECS deployments constantly show a `field [json] doesn't exist` message in the `error.message` field. That happens because cloud deployments automatically set `json.keys_under_root` to `true` for versions > 8.0.0, and the pipeline didn't properly cover that scenario after this change.

Checklist
- I have commented my code, particularly in hard-to-understand areas
- I have made corresponding changes to the documentation
- I have made corresponding changes to the default configuration files
- I have added tests that prove my fix is effective or that my feature works
- I have added an entry in `CHANGELOG.next.asciidoc` or `CHANGELOG-developer.next.asciidoc`.

How to test this PR locally
- `kibana.yml`
- Pull this branch and start filebeat from the source: https://github.com/elastic/kibana/blob/main/x-pack/plugins/monitoring/dev_docs/how_to/running_components_from_source.md#filebeat
- In `filebeat.yml`, enable the Kibana module.

The ingest pipeline has to properly override the original log content when there are custom field values in the `filebeat.yml` `input.fields` settings. Therefore, these test scenarios have to work as follows:

- `fields_under_root: true` and `json.keys_under_root: true` -> `input.fields` override original log content
- `fields_under_root: true` and `json.keys_under_root: false` -> `input.fields` override original log content
- `fields_under_root: false` and `json.keys_under_root: true` -> `input.fields` override original log content
- `fields_under_root: false` and `json.keys_under_root: false` -> `input.fields` override original log content
- Remove the `input` object altogether -> original log content is kept
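One of the scenarios above, sketched as a hypothetical `filebeat.yml` input; the paths and the custom field value are illustrative assumptions, not part of this PR:

```yaml
filebeat.inputs:
  - type: log
    paths:
      - /var/log/kibana/kibana.log
    json.keys_under_root: true
    fields_under_root: true
    # Expected to override the matching fields in the original log content.
    fields:
      env: staging
```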
The fact that `json.keys_under_root` is `true` shouldn't cause the `error.message` field to be present.

Related issues
closes #34210