[filebeat] Failed to parse kubernetes.labels.app #8773

@dmitryzykov

Description

I'm using helm/stable/filebeat, which is based on docker.elastic.co/beats/filebeat-oss:6.4.2.
All other components are also oss 6.4.2.

I'm having a problem where filebeat fails to parse kubernetes.labels.app for many containers.

This is the filebeat log when sending directly to elasticsearch:

2018-10-27T09:42:23.923Z	WARN	elasticsearch/client.go:520	Cannot index event publisher.Event{Content:beat.Event{Timestamp:time.Time{wall:0x12a6762a, ext:63676230135, loc:(*time.Location)(nil)}, Meta:common.MapStr(nil), Fields:common.MapStr{"beat":common.MapStr{"version":"6.4.2", "name":"filebeat-tx7fz", "hostname":"filebeat-tx7fz"}, "offset":3227, "stream":"stderr", "message":"   at Microsoft.Extensions.DependencyInjection.ServiceLookup.ServiceProviderEngine.GetService(Type serviceType, ServiceProviderEngineScope serviceProviderEngineScope)", "kubernetes":common.MapStr{"container":common.MapStr{"name":"sink"}, "namespace":"default", "replicaset":common.MapStr{"name":"test-propertysite-sink-78549f4bb7"}, "labels":common.MapStr{"pod-template-hash":"3410590663", "app":common.MapStr{"kubernetes":common.MapStr{"io/name":"propertysite", "io/component":"sink", "io/instance":"test"}}}, "pod":common.MapStr{"name":"test-propertysite-sink-78549f4bb7-rzj84"}, "node":common.MapStr{"name":"aks-nodepool1-38193062-2"}}, "host":common.MapStr{"name":"filebeat-tx7fz"}, "meta":common.MapStr{"cloud":common.MapStr{"provider":"az", "instance_id":"0fdf45d7-7bfa-4f34-b5e6-9edc8dde83e6", "instance_name":"aks-nodepool1-38193062-2", "machine_type":"Standard_DS2_v2", "region":"westeurope"}}, "source":"/var/lib/docker/containers/408f7e34fa197c31708354997cbbac77b1deb460c6ab7d2bb62ae59dae8a2231/408f7e34fa197c31708354997cbbac77b1deb460c6ab7d2bb62ae59dae8a2231-json.log", "prospector":common.MapStr{"type":"docker"}, "input":common.MapStr{"type":"docker"}}, Private:file.State{Id:"", Finished:false, Fileinfo:(*os.fileStat)(0xc4203b3040), Source:"/var/lib/docker/containers/408f7e34fa197c31708354997cbbac77b1deb460c6ab7d2bb62ae59dae8a2231/408f7e34fa197c31708354997cbbac77b1deb460c6ab7d2bb62ae59dae8a2231-json.log", Offset:3464, Timestamp:time.Time{wall:0xbeed2a9fb6313090, ext:224243617813, loc:(*time.Location)(0x1f61860)}, TTL:-1, Type:"docker", Meta:map[string]string(nil), 
FileStateOS:file.StateOS{Inode:0xfe57e, Device:0x801}}}, Flags:0x1} (status=400): {"type":"mapper_parsing_exception","reason":"failed to parse [kubernetes.labels.app]","caused_by":{"type":"illegal_state_exception","reason":"Can't get text on a START_OBJECT at 1:419"}}
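
Looking at the event in the log above, the dotted label key appears to be split on dots: "app.kubernetes.io/name" becomes a nested object under labels.app ("app":common.MapStr{"kubernetes":common.MapStr{"io/name":...}}). A minimal sketch of that expansion, reconstructed from the log output (my own illustration, not filebeat's actual code):

```python
# Sketch (assumption): reproduce how a dotted label key such as
# "app.kubernetes.io/name" gets expanded into nested objects,
# turning labels.app into an object instead of a plain string.

def expand_dotted(labels):
    """Expand dots in keys into nested dicts, as the log output suggests."""
    out = {}
    for key, value in labels.items():
        parts = key.split(".")
        node = out
        for part in parts[:-1]:
            node = node.setdefault(part, {})
        node[parts[-1]] = value
    return out

print(expand_dotted({"app": "bar"}))
# {'app': 'bar'}  -> labels.app is a string

print(expand_dotted({"app.kubernetes.io/name": "propertysite"}))
# {'app': {'kubernetes': {'io/name': 'propertysite'}}}
# -> labels.app is now an object, which conflicts with an index
#    where kubernetes.labels.app is already mapped as text
```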

And this is the same error from logstash, when sending from filebeat to logstash:

[2018-10-26T19:38:14,673][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"filebeat-2018.10.26", :_type=>"doc", :_routing=>nil}, #<LogStash::Event:0x21da5026>], :response=>{"index"=>{"_index"=>"filebeat-2018.10.26", "_type"=>"doc", "_id"=>"Ch3isWYBrdoKcExWXgp1", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [kubernetes.labels.app]", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:695"}}}}}

This is a part of my filebeat config file:

    filebeat.prospectors:
    - enabled: true
      paths:
      - /var/log/*.log
      - /var/log/messages
      - /var/log/syslog
      type: log
    - containers.ids:
      - '*'
      processors:
      - add_kubernetes_metadata:
          in_cluster: true
      - drop_event:
          when:
            equals:
              kubernetes.container.name: filebeat
      type: docker

I noticed that the parsing problem appears only for pods using the label format recommended for helm charts:

  labels:
    app.kubernetes.io/name: foo
    helm.sh/chart: foo-chart
    app.kubernetes.io/instance: foo
    app.kubernetes.io/managed-by: Tiller
    app.kubernetes.io/component: foo

and the problem doesn't appear with a simpler label format like:

  labels:
    name: bar
    chart: bar-chart
    instance: bar
    managed-by: Tiller
    component: bar
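
The conflict between the two formats can be seen with a toy model of dynamic mapping: once a pod with the simple labels has mapped kubernetes.labels.app as a text field, an event from a pod with the dotted labels tries to write an object into that same field and is rejected. This is only an illustrative sketch under that assumption, not elasticsearch's actual code:

```python
# Toy model (assumption, for illustration only): a field's mapping is
# fixed by the first document that uses it; a later document with a
# different shape for the same field is rejected, as elasticsearch does.

def toy_index(mapping, labels):
    for field, value in labels.items():
        kind = "object" if isinstance(value, dict) else "text"
        if mapping.setdefault(field, kind) != kind:
            raise ValueError(f"failed to parse [kubernetes.labels.{field}]")

mapping = {}
toy_index(mapping, {"app": "bar"})  # simple label: app mapped as text

try:
    # dotted label, already expanded into a nested object by filebeat
    toy_index(mapping, {"app": {"kubernetes": {"io/name": "foo"}}})
except ValueError as err:
    print(err)  # failed to parse [kubernetes.labels.app]
```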

The expected behavior is for filebeat to parse such labels successfully and ship the events to elasticsearch, instead of raising a parse error and failing to send the log.

For example, Fluentd parses app.kubernetes.io/component by de-dotting the key, so it appears in elasticsearch as kubernetes.labels.app_kubernetes_io/component.
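
A sketch of that de-dotting transformation, which keeps every label a flat string field (my own illustration, not Fluentd's code):

```python
# Sketch (assumption): replace dots in label keys with underscores, the
# way Fluentd's de-dot handling does, so no nested objects are created
# and no mapping conflict can occur.

def dedot_labels(labels):
    return {key.replace(".", "_"): value for key, value in labels.items()}

print(dedot_labels({"app.kubernetes.io/component": "sink"}))
# {'app_kubernetes_io/component': 'sink'}
```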

While researching this bug, I found similar reports.
