NULLs in tetragon data #193

Closed
jpegleg opened this issue Jun 24, 2022 · 6 comments · Fixed by #1090

Labels: kind/bug Something isn't working

jpegleg commented Jun 24, 2022

When tetragon is installed via helm as instructed, some fields are populated with null bytes (example in the data below) instead of the expected data:

{"process_exec":{"process":{"exec_id":"AAAAAAAAADoxNjcxODQ1Mjc0NTk0MTc6MTczMjQ4OA==","pid":1732488,"uid":0,"cwd":"/","binary":"/usr/bin/tetra","arguments":"status","flags":"execve rootcwd clone","start_time":"2022-06-24T01:27:13.470Z","auid":4294967295,"pod":{"namespace":"\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000","name":"\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000","container":{"id":"containerd://9001291f84d1fa67aeaafd7880621865e6b7b398a67294803716581e27c5d7a5","name":"tetragon","image":{"id":"quay.io/cilium/tetragon@sha256:bb81d915aafdefa1a7873de30791e5a4698322d463af51195b4c262060fcc703","name":"quay.io/cilium/tetragon:v0.8.0"},"start_time":"2022-06-23T18:52:01Z","pid":31562}},"docker":"9001291f84d1fa67aeaafd788062186","parent_exec_id":"AAAAAAAAADoxNjcxODQ0ODI1MDI1Nzg6MTczMjQ3OA==","refcnt":1},"parent":{"exec_id":"AAAAAAAAADoxNjcxODQ0ODI1MDI1Nzg6MTczMjQ3OA==","pid":1732478,"uid":0,"cwd":"/var/snap/microk8s/common/run/containerd/io.containerd.runtime.v2.task/k8s.io/462cbabc0ce9d0f9c11d21fbb6c8d62850f21eee7bd4289f04745b26cf1e6acd/","binary":"/snap/microk8s/3272/bin/runc","arguments":"--root /run/containerd/runc/k8s.io --log /var/snap/microk8s/common/run/containerd/io.containerd.runtime.v2.task/k8s.io/9001291f84d1fa67aeaaf 7880621865e6b7b398a67294803716581e27c5d7a5/log.json --log-format json exec --process /var/snap/microk8s/common/run/runc-process2712392004 --detach --pid-file /var/snap/microk8s/common/run/containerd/io.containerd.runtime.v2.task/k8s.io/9001291f84d1fa67aeaaf 7880621865e6b7b398a67294803716581e27c5d7a5/5a342fb48ed6670a76e757b94846ea207d1e73296b5fde67622d6ac3 09bc01f.pid 9001291f84d1fa67aeaafd7880621865e6b7b398a67294803716581e27c5d7a5","flags":"execve clone","start_time":"2022-06-24T01:27:13.425Z","auid":4294967295,"parent_exec_id":"AAAAAAAAADoxNjI1OTAwMDAwMDA6NTEwOQ==","refcnt":1}},"node_name":"\u0000\u0000\u0000\u0000\u0000\u0000\u0000","time":"2022-06-24T01:27:13.470Z"}
{"process_exit":{"process":{"exec_id":"AAAAAAAAADoxNjcxODQ1Mjc0NTk0MTc6MTczMjQ4OA==","pid":1732488,"uid":0,"cwd":"/","binary":"/usr/bin/tetra","arguments":"status","flags":"execve rootcwd clone","start_time":"2022-06-24T01:27:13.470Z","auid":4294967295,"pod":{"namespace":"\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000","name":"\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000","container":{"id":"containerd://9001291f84d1fa67aeaafd7880621865e6b7b398a67294803716581e27c5d7a5","name":"tetragon","image":{"id":"quay.io/cilium/tetragon@sha256:bb81d915aafdefa1a7873de30791e5a4698322d463af51195b4c262060fcc703","name":"quay.io/cilium/tetragon:v0.8.0"},"start_time":"2022-06-23T18:52:01Z","pid":31562}},"docker":"9001291f84d1fa67aeaafd788062186","parent_exec_id":"AAAAAAAAADoxNjcxODQ0ODI1MDI1Nzg6MTczMjQ3OA=="},"parent":{"exec_id":"AAAAAAAAADoxNjcxODQ0ODI1MDI1Nzg6MTczMjQ3OA==","pid":1732478,"uid":0,"cwd":"/var/snap/microk8s/common/run/containerd/io.containerd.runtime.v2.task/k8s.io/462cbabc0ce9d0f9c11d21fbb6c8d62850f21eee7bd4289f04745b26cf1e6acd/","binary":"/snap/microk8s/3272/bin/runc","arguments":"--root /run/containerd/runc/k8s.io --log /var/snap/microk8s/common/run/containerd/io.containerd.runtime.v2.task/k8s.io/9001291f84d1fa67aeaaf 7880621865e6b7b398a67294803716581e27c5d7a5/log.json --log-format json exec --process /var/snap/microk8s/common/run/runc-process2712392004 --detach --pid-file /var/snap/microk8s/common/run/containerd/io.containerd.runtime.v2.task/k8s.io/9001291f84d1fa67aeaaf 7880621865e6b7b398a67294803716581e27c5d7a5/5a342fb48ed6670a76e757b94846ea207d1e73296b5fde67622d6ac3 09bc01f.pid 9001291f84d1fa67aeaafd7880621865e6b7b398a67294803716581e27c5d7a5","flags":"execve clone","start_time":"2022-06-24T01:27:13.425Z","auid":4294967295,"parent_exec_id":"AAAAAAAAADoxNjI1OTAwMDAwMDA6NTEwOQ==","refcnt":4294967295}},"node_name":"\u0000\u0000\u0000\u0000\u0000\u0000\u0000","time":"2022-06-24T01:27:13.476Z"}

It isn't just tetragon container events that have these nulls like the example; other containers' tetragon trace data has also been affected.

This behavior doesn't appear immediately when tetragon is installed; it seems to take some time before it starts happening, perhaps a few hours or a day, and then it continues to occur.

I have observed this behavior across multiple builds of k3s and microk8s clusters but only when tetragon is deployed. We can also see related data in the microk8s kubelite daemon log:

microk8s.daemon-kubelite.log:Jun 23 13:17:55 moon2 microk8s.daemon-kubelite[16056]: E0623 13:17:55.084224   16056 status.go:71] apiserver received an error that is not an metav1.Status: storage.InvalidError{Errs:field.ErrorList{(*field.Error)(0xc014330000)}}: resourceVersion: Invalid value: "\x00\x00\x00\x00\x00\x00": strconv.ParseUint: parsing "\x00\x00\x00\x00\x00\x00": invalid syntax
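
For reference, the parse failure in that log line is exactly what Go's strconv reports when it is handed a string of NUL bytes. A minimal standalone reproduction (hypothetical, not kubelite or tetragon code):

package main

import (
    "fmt"
    "strconv"
)

func main() {
    // A resourceVersion that has been overwritten with NUL bytes, as in the
    // kubelite log above, cannot be parsed as an unsigned integer.
    rv := "\x00\x00\x00\x00\x00\x00"
    if _, err := strconv.ParseUint(rv, 10, 64); err != nil {
        fmt.Println(err)
        // strconv.ParseUint: parsing "\x00\x00\x00\x00\x00\x00": invalid syntax
    }
}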

This behavior does not occur when tetragon is not running/installed into the clusters.

My tests so far have been with both k3s and microk8s clusters.
Cluster nodes in the tests have been running openSUSE Leap 15.4, Ubuntu 22 Server, and Ubuntu 22 Desktop (kernels 5.15.0-39-generic and 5.14.21-150400.22-default at the moment).

Currently I have a microk8s cluster running with this happening.

Let me know if there is any other data that would be helpful!

Forsworns commented Jun 29, 2022

I use KinD and haven't encountered this problem, but I guess the problem originates from the following code.

The podInfo is queried in:

func GetProcess(

And the node name is queried in:

func GetNodeNameForExport() string {

These are both done on the Go side.
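
If it is a buffer-handling issue, a minimal sketch of the mechanism (assuming the value comes from a fixed-size, NUL-padded buffer; cstring is a hypothetical helper, not a tetragon function) shows how the padding ends up as \u0000 runs in the JSON export:

package main

import (
    "bytes"
    "encoding/json"
    "fmt"
)

// cstring trims a fixed-size, NUL-padded buffer at the first NUL byte so the
// padding never reaches the exporter.
func cstring(buf []byte) string {
    if i := bytes.IndexByte(buf, 0); i >= 0 {
        return string(buf[:i])
    }
    return string(buf)
}

func main() {
    // Simulate a node-name buffer whose contents were never filled in (or
    // never trimmed), leaving only NUL bytes.
    buf := make([]byte, 7)

    raw, _ := json.Marshal(string(buf)) // "\u0000\u0000\u0000\u0000\u0000\u0000\u0000"
    fixed, _ := json.Marshal(cstring(buf))

    fmt.Println(string(raw))
    fmt.Println(string(fixed)) // ""
}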

@Forsworns

BTW, the raw log is too hard to read, lol. Here is a formatted version. The pod.namespace, pod.name, and node_name values are missing, as seen below:

{
    "process_exec": {
        "process": {
            "exec_id": "AAAAAAAAADoxNjcxODQ1Mjc0NTk0MTc6MTczMjQ4OA==",
            "pid": 1732488,
            "uid": 0,
            "cwd": "/",
            "binary": "/usr/bin/tetra",
            "arguments": "status",
            "flags": "execve rootcwd clone",
            "start_time": "2022-06-24T01:27:13.470Z",
            "auid": 4294967295,
            "pod": {
                "namespace": "\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000",
                "name": "\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000",
                "container": {
                    "id": "containerd://9001291f84d1fa67aeaafd7880621865e6b7b398a67294803716581e27c5d7a5",
                    "name": "tetragon",
                    "image": {
                        "id": "quay.io/cilium/tetragon@sha256:bb81d915aafdefa1a7873de30791e5a4698322d463af51195b4c262060fcc703",
                        "name": "quay.io/cilium/tetragon:v0.8.0"
                    },
                    "start_time": "2022-06-23T18:52:01Z",
                    "pid": 31562
                }
            },
            "docker": "9001291f84d1fa67aeaafd788062186",
            "parent_exec_id": "AAAAAAAAADoxNjcxODQ0ODI1MDI1Nzg6MTczMjQ3OA==",
            "refcnt": 1
        },
        "parent": {
            "exec_id": "AAAAAAAAADoxNjcxODQ0ODI1MDI1Nzg6MTczMjQ3OA==",
            "pid": 1732478,
            "uid": 0,
            "cwd": "/var/snap/microk8s/common/run/containerd/io.containerd.runtime.v2.task/k8s.io/462cbabc0ce9d0f9c11d21fbb6c8d62850f21eee7bd4289f04745b26cf1e6acd/",
            "binary": "/snap/microk8s/3272/bin/runc",
            "arguments": "--root /run/containerd/runc/k8s.io --log /var/snap/microk8s/common/run/containerd/io.containerd.runtime.v2.task/k8s.io/9001291f84d1fa67aeaaf 7880621865e6b7b398a67294803716581e27c5d7a5/log.json --log-format json exec --process /var/snap/microk8s/common/run/runc-process2712392004 --detach --pid-file /var/snap/microk8s/common/run/containerd/io.containerd.runtime.v2.task/k8s.io/9001291f84d1fa67aeaaf 7880621865e6b7b398a67294803716581e27c5d7a5/5a342fb48ed6670a76e757b94846ea207d1e73296b5fde67622d6ac3 09bc01f.pid 9001291f84d1fa67aeaafd7880621865e6b7b398a67294803716581e27c5d7a5",
            "flags": "execve clone",
            "start_time": "2022-06-24T01:27:13.425Z",
            "auid": 4294967295,
            "parent_exec_id": "AAAAAAAAADoxNjI1OTAwMDAwMDA6NTEwOQ==",
            "refcnt": 1
        }
    },
    "node_name": "\u0000\u0000\u0000\u0000\u0000\u0000\u0000",
    "time": "2022-06-24T01:27:13.470Z"
}
{
    "process_exit": {
        "process": {
            "exec_id": "AAAAAAAAADoxNjcxODQ1Mjc0NTk0MTc6MTczMjQ4OA==",
            "pid": 1732488,
            "uid": 0,
            "cwd": "/",
            "binary": "/usr/bin/tetra",
            "arguments": "status",
            "flags": "execve rootcwd clone",
            "start_time": "2022-06-24T01:27:13.470Z",
            "auid": 4294967295,
            "pod": {
                "namespace": "\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000",
                "name": "\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000",
                "container": {
                    "id": "containerd://9001291f84d1fa67aeaafd7880621865e6b7b398a67294803716581e27c5d7a5",
                    "name": "tetragon",
                    "image": {
                        "id": "quay.io/cilium/tetragon@sha256:bb81d915aafdefa1a7873de30791e5a4698322d463af51195b4c262060fcc703",
                        "name": "quay.io/cilium/tetragon:v0.8.0"
                    },
                    "start_time": "2022-06-23T18:52:01Z",
                    "pid": 31562
                }
            },
            "docker": "9001291f84d1fa67aeaafd788062186",
            "parent_exec_id": "AAAAAAAAADoxNjcxODQ0ODI1MDI1Nzg6MTczMjQ3OA=="
        },
        "parent": {
            "exec_id": "AAAAAAAAADoxNjcxODQ0ODI1MDI1Nzg6MTczMjQ3OA==",
            "pid": 1732478,
            "uid": 0,
            "cwd": "/var/snap/microk8s/common/run/containerd/io.containerd.runtime.v2.task/k8s.io/462cbabc0ce9d0f9c11d21fbb6c8d62850f21eee7bd4289f04745b26cf1e6acd/",
            "binary": "/snap/microk8s/3272/bin/runc",
            "arguments": "--root /run/containerd/runc/k8s.io --log /var/snap/microk8s/common/run/containerd/io.containerd.runtime.v2.task/k8s.io/9001291f84d1fa67aeaaf 7880621865e6b7b398a67294803716581e27c5d7a5/log.json --log-format json exec --process /var/snap/microk8s/common/run/runc-process2712392004 --detach --pid-file /var/snap/microk8s/common/run/containerd/io.containerd.runtime.v2.task/k8s.io/9001291f84d1fa67aeaaf 7880621865e6b7b398a67294803716581e27c5d7a5/5a342fb48ed6670a76e757b94846ea207d1e73296b5fde67622d6ac3 09bc01f.pid 9001291f84d1fa67aeaafd7880621865e6b7b398a67294803716581e27c5d7a5",
            "flags": "execve clone",
            "start_time": "2022-06-24T01:27:13.425Z",
            "auid": 4294967295,
            "parent_exec_id": "AAAAAAAAADoxNjI1OTAwMDAwMDA6NTEwOQ==",
            "refcnt": 4294967295
        }
    },
    "node_name": "\u0000\u0000\u0000\u0000\u0000\u0000\u0000",
    "time": "2022-06-24T01:27:13.476Z"
}

jpegleg commented Jul 2, 2022

One of the more severe occurrences of this behavior (observed on k3s) has only timestamp data intact, looking like this:

{"node_name":"\u0000\u0000\u0000\u0000\u0000\u0000\u0000","time":"2022-07-02T11:29:28.228Z"}
{"node_name":"\u0000\u0000\u0000\u0000\u0000\u0000\u0000","time":"2022-07-02T11:29:28.235Z"}
{"node_name":"\u0000\u0000\u0000\u0000\u0000\u0000\u0000","time":"2022-07-02T11:29:38.234Z"}
{"node_name":"\u0000\u0000\u0000\u0000\u0000\u0000\u0000","time":"2022-07-02T11:29:38.240Z"}
{"node_name":"\u0000\u0000\u0000\u0000\u0000\u0000\u0000","time":"2022-07-02T11:29:48.239Z"}
{"node_name":"\u0000\u0000\u0000\u0000\u0000\u0000\u0000","time":"2022-07-02T11:29:48.249Z"}
{"node_name":"\u0000\u0000\u0000\u0000\u0000\u0000\u0000","time":"2022-07-02T11:29:58.230Z"}
{"node_name":"\u0000\u0000\u0000\u0000\u0000\u0000\u0000","time":"2022-07-02T11:29:58.236Z"}
{"node_name":"\u0000\u0000\u0000\u0000\u0000\u0000\u0000","time":"2022-07-02T11:30:08.228Z"}
{"node_name":"\u0000\u0000\u0000\u0000\u0000\u0000\u0000","time":"2022-07-02T11:30:08.234Z"}
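
In case it helps with triage, here is a rough standalone scanner that counts how many exported events carry a NUL-only node_name. The path and the one-JSON-object-per-line layout are assumptions based on the export-filename setting and the samples in this thread:

package main

import (
    "bufio"
    "encoding/json"
    "fmt"
    "os"
    "strings"
)

func main() {
    // Assumed default export location (export-filename in the tetragon config).
    f, err := os.Open("/var/run/cilium/tetragon/tetragon.log")
    if err != nil {
        fmt.Fprintln(os.Stderr, err)
        os.Exit(1)
    }
    defer f.Close()

    var total, nulled int
    sc := bufio.NewScanner(f)
    sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // export lines can be long
    for sc.Scan() {
        var ev struct {
            NodeName string `json:"node_name"`
        }
        if json.Unmarshal(sc.Bytes(), &ev) != nil {
            continue
        }
        total++
        // Count events whose node_name consists entirely of NUL bytes.
        if ev.NodeName != "" && strings.Trim(ev.NodeName, "\x00") == "" {
            nulled++
        }
    }
    fmt.Printf("%d of %d events have a NUL-only node_name\n", nulled, total)
}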

kkourt commented Jul 4, 2022

Thank you for the report!

Could you please add the logs from the tetragon container?

kkourt added the kind/bug (Something isn't working) label on Jul 4, 2022

jpegleg commented Jul 6, 2022

> Thank you for the report!
>
> Could you please add the logs from the tetragon container?

You're welcome!

Here is a tetragon log sample from a k3s cluster that encountered this behavior:

time="2022-07-06T02:01:55Z" level=info msg="Loaded config from directory" config-dir=/etc/tetragon
time="2022-07-06T02:01:55Z" level=info msg="Starting tetragon" version=v0.8.0
time="2022-07-06T02:01:55Z" level=info msg="config settings" config="map[bpf-lib:/var/lib/tetragon/ btf: cilium-bpf: config-dir:/etc/tetragon config-file: debug:false enable-cilium-api:false enable-export-aggregation:false enable-k8s-api:true enable-process-ancestors:true enable-process-cred:false enable-process-ns:false export-aggregation-buffer-size:10000 export-aggregation-window-size:15s export-allowlist:{\"event_set\":[\"PROCESS_EXEC\", \"PROCESS_EXIT\", \"PROCESS_KPROBE\"]} export-denylist:{\"health_check\":true}\n{\"namespace\":[\"\", \"cilium\", \"kube-system\"]} export-file-compress:false export-file-max-backups:5 export-file-max-size-mb:10 export-file-rotation-interval:0s export-filename:/var/run/cilium/tetragon/tetragon.log export-rate-limit:-1 force-small-progs:false ignore-missing-progs:false kernel: log-format:text log-level:info metrics-server::2112 netns-dir:/var/run/docker/netns/ process-cache-size:65536 procfs:/procRoot run-standalone:false server-address:localhost:54321 verbose:0]"
time="2022-07-06T02:01:55Z" level=info msg="Available sensors" sensors=
time="2022-07-06T02:01:55Z" level=info msg="Registered tracing sensors" sensors="kprobe sensor, tracepoint sensor"
time="2022-07-06T02:01:55Z" level=info msg="Registered probe types" types="kprobe sensor, tracepoint sensor"
time="2022-07-06T02:01:56Z" level=info msg="Enabling Kubernetes API"
time="2022-07-06T02:01:56Z" level=info msg="Starting metrics server" addr=":2112"
time="2022-07-06T02:01:56Z" level=info msg="Initialized pod cache" num_pods=4
time="2022-07-06T02:01:56Z" level=info msg="Disabling Cilium API"
time="2022-07-06T02:01:56Z" level=info msg="Starting process manager" enableCilium=false enableEventCache=true enableProcessCred=false enableProcessNs=false
time="2022-07-06T02:01:56Z" level=info msg="Starting gRPC server" address="localhost:54321"
time="2022-07-06T02:01:56Z" level=info msg="Starting JSON exporter" logger="&{/var/run/cilium/tetragon/tetragon.log 10 0 5 false false 0 <nil> {0 0} <nil> {0 {0 0}}}" request="allow_list:{event_set:PROCESS_EXEC  event_set:PROCESS_EXIT  event_set:PROCESS_KPROBE}  deny_list:{health_check:{value:true}}  deny_list:{namespace:\"\"  namespace:\"cilium\"  namespace:\"kube-system\"}"
time="2022-07-06T02:01:56Z" level=info msg="Exporter configuration" enabled=true fileName=/var/run/cilium/tetragon/tetragon.log
time="2022-07-06T02:01:56Z" level=info msg="Using metadata file" metadata=
time="2022-07-06T02:01:56Z" level=info msg="Loading sensor" name=__main__
time="2022-07-06T02:01:56Z" level=info msg="Loading kernel version 5.14.21"
time="2022-07-06T02:01:56Z" level=info msg="tetragon, map loaded." map=execve_map path=/sys/fs/bpf/tcpmon/execve_map sensor=__main__
time="2022-07-06T02:01:56Z" level=info msg="tetragon, map loaded." map=execve_map_stats path=/sys/fs/bpf/tcpmon/execve_map_stats sensor=__main__
time="2022-07-06T02:01:56Z" level=info msg="Started watching tracing policies"
time="2022-07-06T02:01:56Z" level=info msg="tetragon, map loaded." map=names_map path=/sys/fs/bpf/tcpmon/names_map sensor=__main__
time="2022-07-06T02:01:56Z" level=info msg="tetragon, map loaded." map=tcpmon_map path=/sys/fs/bpf/tcpmon/tcpmon_map sensor=__main__
time="2022-07-06T02:01:56Z" level=info msg="BPF prog was loaded" label=tracepoint/sys_exit prog=/var/lib/tetragon/bpf_exit.o
time="2022-07-06T02:01:56Z" level=info msg="BPF prog was loaded" label=kprobe/wake_up_new_task prog=/var/lib/tetragon/bpf_fork.o
time="2022-07-06T02:01:56Z" level=info msg="Load probe" Program=/var/lib/tetragon/bpf_execve_event_v53.o Type=execve
time="2022-07-06T02:01:57Z" level=info msg="Read ProcFS /procRoot appended 70/261 entries"
time="2022-07-06T02:01:57Z" level=warning msg="Procfs execve event pods/ identifier error" error="open /procRoot/0/cgroup: no such file or directory"
time="2022-07-06T02:01:57Z" level=info msg="BPF prog was loaded" label=tracepoint/sys_execve prog=/var/lib/tetragon/bpf_execve_event_v53.o
time="2022-07-06T02:01:57Z" level=info msg="Loaded BPF maps and events for sensor successfully" sensor=__main__
time="2022-07-06T02:01:57Z" level=info msg="Using metadata file" metadata=
time="2022-07-06T02:01:57Z" level=info msg="Loading sensor" name=__main__
time="2022-07-06T02:01:57Z" level=info msg="Loading kernel version 5.14.21"
time="2022-07-06T02:01:57Z" level=info msg="Loaded BPF maps and events for sensor successfully" sensor=__main__
time="2022-07-06T02:01:57Z" level=info msg="Listening for events..."
time="2022-07-06T02:33:01Z" level=error msg="Kubernetes API error" error="github.com/cilium/tetragon/pkg/k8s/client/informers/externalversions/factory.go:104: Failed to watch *v1alpha1.TracingPolicy: an error on the server (\"unknown\") has prevented the request from succeeding (get tracingpolicies.cilium.io)"
time="2022-07-06T02:33:02Z" level=error msg="Kubernetes API error" error="github.com/cilium/tetragon/pkg/k8s/client/informers/externalversions/factory.go:104: Failed to watch *v1alpha1.TracingPolicy: failed to list *v1alpha1.TracingPolicy: resourceVersion: Invalid value: \"\\x00\\x00\\x00\\x00\": strconv.ParseUint: parsing \"\\x00\\x00\\x00\\x00\": invalid syntax"
time="2022-07-06T02:33:05Z" level=error msg="Kubernetes API error" error="github.com/cilium/tetragon/pkg/k8s/client/informers/externalversions/factory.go:104: Failed to watch *v1alpha1.TracingPolicy: failed to list *v1alpha1.TracingPolicy: resourceVersion: Invalid value: \"\\x00\\x00\\x00\\x00\": strconv.ParseUint: parsing \"\\x00\\x00\\x00\\x00\": invalid syntax"
time="2022-07-06T02:33:09Z" level=error msg="Kubernetes API error" error="github.com/cilium/tetragon/pkg/k8s/client/informers/externalversions/factory.go:104: Failed to watch *v1alpha1.TracingPolicy: failed to list *v1alpha1.TracingPolicy: resourceVersion: Invalid value: \"\\x00\\x00\\x00\\x00\": strconv.ParseUint: parsing \"\\x00\\x00\\x00\\x00\": invalid syntax"
time="2022-07-06T02:33:17Z" level=error msg="Kubernetes API error" error="github.com/cilium/tetragon/pkg/k8s/client/informers/externalversions/factory.go:104: Failed to watch *v1alpha1.TracingPolicy: failed to list *v1alpha1.TracingPolicy: resourceVersion: Invalid value: \"\\x00\\x00\\x00\\x00\": strconv.ParseUint: parsing \"\\x00\\x00\\x00\\x00\": invalid syntax"
time="2022-07-06T02:33:34Z" level=error msg="Kubernetes API error" error="github.com/cilium/tetragon/pkg/k8s/client/informers/externalversions/factory.go:104: Failed to watch *v1alpha1.TracingPolicy: failed to list *v1alpha1.TracingPolicy: resourceVersion: Invalid value: \"\\x00\\x00\\x00\\x00\": strconv.ParseUint: parsing \"\\x00\\x00\\x00\\x00\": invalid syntax"
time="2022-07-06T02:34:12Z" level=error msg="Kubernetes API error" error="github.com/cilium/tetragon/pkg/k8s/client/informers/externalversions/factory.go:104: Failed to watch *v1alpha1.TracingPolicy: failed to list *v1alpha1.TracingPolicy: resourceVersion: Invalid value: \"\\x00\\x00\\x00\\x00\": strconv.ParseUint: parsing \"\\x00\\x00\\x00\\x00\": invalid syntax"
time="2022-07-06T02:34:50Z" level=error msg="Kubernetes API error" error="github.com/cilium/tetragon/pkg/k8s/client/informers/externalversions/factory.go:104: Failed to watch *v1alpha1.TracingPolicy: failed to list *v1alpha1.TracingPolicy: resourceVersion: Invalid value: \"\\x00\\x00\\x00\\x00\": strconv.ParseUint: parsing \"\\x00\\x00\\x00\\x00\": invalid syntax"
time="2022-07-06T02:35:29Z" level=error msg="Kubernetes API error" error="github.com/cilium/tetragon/pkg/k8s/client/informers/externalversions/factory.go:104: Failed to watch *v1alpha1.TracingPolicy: failed to list *v1alpha1.TracingPolicy: resourceVersion: Invalid value: \"\\x00\\x00\\x00\\x00\": strconv.ParseUint: parsing \"\\x00\\x00\\x00\\x00\": invalid syntax"
time="2022-07-06T02:36:20Z" level=error msg="Kubernetes API error" error="github.com/cilium/tetragon/pkg/k8s/client/informers/externalversions/factory.go:104: Failed to watch *v1alpha1.TracingPolicy: failed to list *v1alpha1.TracingPolicy: resourceVersion: Invalid value: \"\\x00\\x00\\x00\\x00\": strconv.ParseUint: parsing \"\\x00\\x00\\x00\\x00\": invalid syntax"
time="2022-07-06T02:36:56Z" level=error msg="Kubernetes API error" error="github.com/cilium/tetragon/pkg/k8s/client/informers/externalversions/factory.go:104: Failed to watch *v1alpha1.TracingPolicy: failed to list *v1alpha1.TracingPolicy: resourceVersion: Invalid value: \"\\x00\\x00\\x00\\x00\": strconv.ParseUint: parsing \"\\x00\\x00\\x00\\x00\": invalid syntax"
time="2022-07-06T02:37:37Z" level=error msg="Kubernetes API error" error="github.com/cilium/tetragon/pkg/k8s/client/informers/externalversions/factory.go:104: Failed to watch *v1alpha1.TracingPolicy: failed to list *v1alpha1.TracingPolicy: resourceVersion: Invalid value: \"\\x00\\x00\\x00\\x00\": strconv.ParseUint: parsing \"\\x00\\x00\\x00\\x00\": invalid syntax"
time="2022-07-06T02:38:24Z" level=error msg="Kubernetes API error" error="github.com/cilium/tetragon/pkg/k8s/client/informers/externalversions/factory.go:104: Failed to watch *v1alpha1.TracingPolicy: failed to list *v1alpha1.TracingPolicy: resourceVersion: Invalid value: \"\\x00\\x00\\x00\\x00\": strconv.ParseUint: parsing \"\\x00\\x00\\x00\\x00\": invalid syntax"
time="2022-07-06T02:39:20Z" level=error msg="Kubernetes API error" error="github.com/cilium/tetragon/pkg/k8s/client/informers/externalversions/factory.go:104: Failed to watch *v1alpha1.TracingPolicy: failed to list *v1alpha1.TracingPolicy: resourceVersion: Invalid value: \"\\x00\\x00\\x00\\x00\": strconv.ParseUint: parsing \"\\x00\\x00\\x00\\x00\": invalid syntax"
time="2022-07-06T02:39:58Z" level=error msg="Kubernetes API error" error="github.com/cilium/tetragon/pkg/k8s/client/informers/externalversions/factory.go:104: Failed to watch *v1alpha1.TracingPolicy: failed to list *v1alpha1.TracingPolicy: resourceVersion: Invalid value: \"\\x00\\x00\\x00\\x00\": strconv.ParseUint: parsing \"\\x00\\x00\\x00\\x00\": invalid syntax"
time="2022-07-06T02:40:37Z" level=error msg="Kubernetes API error" error="github.com/cilium/tetragon/pkg/k8s/client/informers/externalversions/factory.go:104: Failed to watch *v1alpha1.TracingPolicy: failed to list *v1alpha1.TracingPolicy: resourceVersion: Invalid value: \"\\x00\\x00\\x00\\x00\": strconv.ParseUint: parsing \"\\x00\\x00\\x00\\x00\": invalid syntax"

kkourt commented Jul 12, 2022

> Here is a tetragon log sample from a k3s cluster that encountered this behavior:

Thanks!

> time="2022-07-06T02:33:01Z" level=error msg="Kubernetes API error" error="github.com/cilium/tetragon/pkg/k8s/client/informers/externalversions/factory.go:104: Failed to watch *v1alpha1.TracingPolicy: an error on the server ("unknown") has prevented the request from succeeding (get tracingpolicies.cilium.io)"

So it seems that we need to:

  • Have a more sane event when this error happens (e.g., an error flag); see the sketch after this list
  
  • Figure out whether the error is something to be expected or not
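
A rough sketch of the first point, with hypothetical names (this is not tetragon's actual event type or API): validate pod/node metadata before attaching it to an event, and record an explicit error flag instead of exporting NUL-filled strings.

package main

import (
    "fmt"
    "strings"
)

type podInfo struct {
    Namespace string `json:"namespace,omitempty"`
    Name      string `json:"name,omitempty"`
}

type event struct {
    Pod      *podInfo `json:"pod,omitempty"`
    NodeName string   `json:"node_name,omitempty"`
    Errors   []string `json:"errors,omitempty"` // explicit error flag instead of NUL-filled fields
}

// valid rejects empty values and anything containing NUL bytes.
func valid(s string) bool {
    return s != "" && !strings.ContainsRune(s, '\x00')
}

func attachMeta(ev *event, ns, name, node string) {
    if valid(ns) && valid(name) {
        ev.Pod = &podInfo{Namespace: ns, Name: name}
    } else {
        ev.Errors = append(ev.Errors, "pod-info-unavailable")
    }
    if valid(node) {
        ev.NodeName = node
    } else {
        ev.Errors = append(ev.Errors, "node-name-unavailable")
    }
}

func main() {
    var ev event
    attachMeta(&ev, "\x00\x00\x00", "\x00\x00", "moon2")
    fmt.Printf("%+v\n", ev) // {Pod:<nil> NodeName:moon2 Errors:[pod-info-unavailable]}
}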
