(optional) Already reported to the 3rd party upstream repository or mailing list if you use a k8s addon or Helm charts.
Steps to replicate
Create 2 logs that use different formats for the same key, e.g. ts=1.6530758923877115E9 and ts=2022-05-20T20:38:47.298361917Z.
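For reference, a minimal reproduction sketch using Fluentd's built-in dummy input; the tags, OpenSearch host/port, and index name below are placeholders, not values from this report:

<source>
  @type dummy
  tag app1.log
  # epoch-style timestamp, as emitted by app1
  dummy {"ts":1.6530758923877115E9,"msg":"hello from app1"}
</source>

<source>
  @type dummy
  tag app2.log
  # ISO8601 timestamp, as emitted by app2
  dummy {"ts":"2022-05-20T20:38:47.298361917Z","msg":"hello from app2"}
</source>

<match app*.log>
  @type opensearch
  host localhost
  port 9200
  index_name ts-conflict-test
</match>

With dynamic mapping, whichever shape reaches a fresh index first typically decides the type of ts, and documents with the other shape are then rejected with mapper_parsing_exception.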
Expected Behavior or What you need to ask
I would expect not to lose logs because of luck (whichever ts format happens to reach the index first).
In my case this happened in AKS: after an upgrade, one managed service started emitting its timestamp in a different format, and some logs went missing. This had a serious impact because the logs were lost and we did not know about the issue. I would like not to lose logs if this happens again. If I know the problematic key/value I can handle it with a transformation or drop it, but what if this happens frequently with other keys/values?
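As a sketch of the "known key" workaround mentioned above (the tag pattern is hypothetical, and it assumes ts arrives either as epoch seconds or as an ISO8601 string), the built-in record_transformer filter could normalize the field before it reaches OpenSearch:

<filter kubernetes.app1.**>
  @type record_transformer
  enable_ruby true
  <record>
    # Rewrite epoch-style values as ISO8601 so ts always fits a date mapping;
    # values that already look like ISO8601 strings are left unchanged.
    ts ${record["ts"].to_s =~ /\A[0-9.eE+]+\z/ ? Time.at(record["ts"].to_f).utc.strftime("%Y-%m-%dT%H:%M:%S.%9NZ") : record["ts"]}
  </record>
</filter>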
Using Fluentd and OpenSearch plugin versions
It is running in AKS.
Version and plugins:
The dynamic datatype has conflicts: it looks like different apps use different formats for the key ts, so the dynamically mapped datatype does not accept records from all microservices.
OpenSearch rejects records from app1 in the dev environment:
error event: error_class=Fluent::Plugin::OpenSearchErrorHandler::OpenSearchError error="400 - Rejected by OpenSearch [error type]: mapper_parsing_exception [reason]: 'failed to parse field [ts] of type [date] in document with id 'mHYA44ABjc1sEmR6h9ze'. Preview of field's value: '1.6530758923877115E9''"
OpenSearch rejects records from app2 in the qa environment:
error event: error_class=Fluent::Plugin::OpenSearchErrorHandler::OpenSearchError error="400 - Rejected by OpenSearch [error type]: mapper_parsing_exception [reason]: 'failed to parse field [ts] of type [float] in document with id 'Y2Mx44ABx8DDlswJ0RGZ'. Preview of field's value: '2022-05-20T20:38:47.298361917Z''"
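As a hedged sketch of a way to stop losing the rejected records (assuming the OpenSearch plugin routes these 400 rejections through Fluentd's error event stream, which is what the error event lines above suggest), the built-in @ERROR label can catch them and dump them to a local file for later inspection; the path is a placeholder:

<label @ERROR>
  <match **>
    # Rejected records end up here instead of being dropped silently.
    @type file
    path /var/log/fluent/opensearch-rejected
    append true
  </match>
</label>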
cosmo0920 The bot closed the issue directly. I have updated the description; can we reopen it? If there is no solution for this, we could suggest a workaround and update the FAQ documentation.