I am using the JSON extractor, and one of my fields can be > 16383 UTF-8 characters.
Elasticsearch gives me this error:

```
IllegalArgumentException[Document contains at least one immense term in field="msg.response" (whose UTF8 encoding is longer than the max length 32766), all of which were skipped. Please correct the analyzer to not produce such terms. The prefix of the first immense term is: '[110, 116, 114, 121, 62, 60, 107, 101, 121, 62, 115, 101, 114, 118, 105, 99, 101, 95, 116, 121, 112, 101, 60, 47, 107, 101, 121, 62, 60, 118]...', original message: bytes can be at most 32766 in length; got 38507]; nested: MaxBytesLengthExceededException[bytes can be at most 32766 in length; got 38507];
```
As a result, the log entries are lost: Lucene rejects any single term longer than 32766 bytes, and because the field is indexed not_analyzed, the whole value becomes one term and the entire document fails to index.
There could be multiple ways to resolve this (one possibility is sketched below).
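One commonly suggested fix for this error is to cap the indexed length of the offending field with `ignore_above`, so oversized values are skipped at indexing time instead of failing the whole document. The sketch below is an illustration only, assuming Elasticsearch 1.x/2.x (string type, not_analyzed): the template name `graylog-custom`, the `graylog2_*` index pattern, the `message` mapping type, and the 8191 limit are all assumptions, not confirmed settings.

```python
import json

import requests

# A minimal sketch, not a confirmed configuration: the template name,
# the index pattern, the "message" mapping type, and the limit below
# are assumptions to be adjusted for the real deployment.
template = {
    "template": "graylog2_*",
    "mappings": {
        "message": {
            "properties": {
                "msg.response": {
                    "type": "string",
                    "index": "not_analyzed",
                    # ignore_above counts characters, not bytes;
                    # 32766 / 4 = 8191 stays under Lucene's 32766-byte
                    # term limit even for 4-byte UTF-8 characters.
                    "ignore_above": 8191,
                }
            }
        }
    }
}

resp = requests.put(
    "http://localhost:9200/_template/graylog-custom",
    headers={"Content-Type": "application/json"},
    data=json.dumps(template),
)
print(resp.status_code, resp.text)
```

With such a mapping, a value longer than the limit is simply not indexed, but the document is still stored, so the log entry is kept (it just cannot be searched by that field). Note that an index template only affects indices created after it is installed. Mapping the field as analyzed text is another option, since the analyzer splits the value into many small terms.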
So far I have found only one 'dummy' workaround.
Please provide a useful solution. Thank you.