
Environment variable FLUENT_OJ_OPTION_MAX_NESTING doesn't work #3888

Closed
lozuwa opened this issue Sep 7, 2022 · 1 comment

Comments


lozuwa commented Sep 7, 2022

Describe the bug

Setting the environment variable FLUENT_OJ_OPTION_MAX_NESTING does not limit the nesting depth of parsed JSON.

To Reproduce

Follow these steps to reproduce:

  1. Install fluentd using the helm chart kokuwa/fluentd-elasticsearch version 13.3.0

Command to install:

```shell
helm repo add kokuwa https://kokuwaio.github.io/helm-charts
helm install fluentd kokuwa/fluentd-elasticsearch --version=13.3.0 -f values.yaml
```

Values:

```yaml
elasticsearch:
  buffer:
    chunkLimitSize: "2M"
  hosts: ["elasticsearch-client:9200"]
  outputType: elasticsearch_dynamic
  suppressTypeName: true
  logstash:
    prefix: "logstash"
  scheme: "http"

env:
  FLUENT_OJ_OPTION_MAX_NESTING: 2

resources:
  limits:
    cpu: 500m
    memory: 500Mi
  requests:
    cpu: 100m
    memory: 200Mi
```
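For context, fluentd derives Oj serializer options from FLUENT_OJ_OPTION_* environment variables like the one set above. A minimal Ruby sketch of that kind of mapping (the helper name and coercion rules here are illustrative assumptions, not fluentd's actual implementation):

```ruby
# Illustrative sketch: translate FLUENT_OJ_OPTION_* environment variables
# into an Oj-style options hash. Helper name and coercion rules are
# assumptions for illustration, not fluentd's actual code.
PREFIX = 'FLUENT_OJ_OPTION_'

def oj_options_from_env(env)
  env.each_with_object({}) do |(key, value), opts|
    next unless key.start_with?(PREFIX)

    name = key.delete_prefix(PREFIX).downcase.to_sym
    # Coerce simple scalars so integers like max_nesting stay numeric.
    opts[name] =
      case value
      when /\A\d+\z/ then Integer(value)
      when 'true'    then true
      when 'false'   then false
      else value
      end
  end
end

puts oj_options_from_env('FLUENT_OJ_OPTION_MAX_NESTING' => '2')[:max_nesting]  # prints 2
```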

Please note:

  • The fluentd version installed by this chart version is 1.14.4
  • FLUENT_OJ_OPTION_MAX_NESTING has been set to 2
  2. Start sending logs from a pod that emits nested JSON objects

  3. Check the fluentd logs. FLUENT_OJ_OPTION_MAX_NESTING has no effect:

```
2022-09-07 19:16:44.313879242 +0000 fluent.error: {"message":"[elasticsearch] Could not push log to Elasticsearch:
...
 \"caused_by\"=>{\"type\"=>\"illegal_argument_exception\", \"reason\"=>\"Limit of total fields [1000] has been exceeded while adding new fields [1]\"}}}}]}"}
```

Expected behavior

FLUENT_OJ_OPTION_MAX_NESTING should limit the maximum nesting depth of parsed JSON.

Your Environment

- Fluentd version: fluentd-1.14.4
- TD Agent version: N/A
- Operating system: Provided by docker image quay.io/fluentd_elasticsearch/fluentd:v3.4.0
- Kernel version: Provided by docker image quay.io/fluentd_elasticsearch/fluentd:v3.4.0

Your Configuration

Helm chart with values:

```yaml
elasticsearch:
  buffer:
    chunkLimitSize: "2M"
  hosts: ["elasticsearch-client:9200"]
  outputType: elasticsearch_dynamic
  suppressTypeName: true
  logstash:
    prefix: "logstash"
  scheme: "http"

env:
  FLUENT_OJ_OPTION_MAX_NESTING: 2

resources:
  limits:
    cpu: 500m
    memory: 500Mi
  requests:
    cpu: 100m
    memory: 200Mi
```


Your Error Log

Here are the fluentd logs. FLUENT_OJ_OPTION_MAX_NESTING has no effect:

```shell
2022-09-07 19:16:44.313879242 +0000 fluent.error: {"message":"[elasticsearch] Could not push log to Elasticsearch:
...
 \"caused_by\"=>{\"type\"=>\"illegal_argument_exception\", \"reason\"=>\"Limit of total fields [1000] has been exceeded while adding new fields [1]\"}}}}]}"}
```

Additional context

No response


ashie commented Sep 9, 2022

Sorry, it's a known issue: #3311 (comment)
We'll remove it and add max_nesting to parser_json instead.
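A hypothetical sketch of where such an option could live once moved into the JSON parser plugin; the max_nesting parameter name and placement are assumptions based on the comment above, not a released fluentd feature (json_parser is parser_json's existing backend selector):

```
<parse>
  @type json
  json_parser oj     # existing parser_json backend selector
  max_nesting 2      # hypothetical parameter suggested by this issue
</parse>
```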

@ashie ashie closed this as completed Sep 9, 2022