This seems to be a kafka plugin issue, and the error happens inside ruby-kafka. So the problem is in the communication between ruby-kafka and your Kafka cluster.
ok, thanks.
Check the CONTRIBUTING guideline first; here is the information to help investigate the problem.
fluentd 0.14.20
CentOS, running in a Docker container
flush_interval 1s
buffer_chunk_limit 4m
buffer_queue_limit 43200
retry_wait 1s
max_retry_wait 60s
disable_retry_limit true
# ruby-kafka producer options
max_send_retries 1
# log level: trace, debug, info, warn, error
log_level debug
required_acks 1
compression_codec gzip
output_data_type json
output_include_time true
num_threads 1
#kafka_agg_max_bytes 1048576
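For context, these options would normally sit together inside a fluentd `<match>` block for the buffered kafka output. The sketch below is a reconstruction, not the reporter's actual config: the match tag, `brokers` list, and `default_topic` are placeholders, since they were not included in the report.

```
<match app.**>
  @type kafka_buffered
  # placeholder broker list and topic -- not from the report
  brokers kafka-broker1:9092,kafka-broker2:9092
  default_topic app-logs

  flush_interval 1s
  buffer_chunk_limit 4m
  buffer_queue_limit 43200
  retry_wait 1s
  max_retry_wait 60s
  disable_retry_limit true

  # ruby-kafka producer options
  max_send_retries 1
  log_level debug
  required_acks 1
  compression_codec gzip
  output_data_type json
  output_include_time true
  num_threads 1
  # kafka_agg_max_bytes 1048576
</match>
```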
2018-02-06 09:44:21 +0000 [warn]: #0 Send exception occurred: data error
2018-02-06 09:44:21 +0000 [warn]: #0 Exception Backtrace:
/usr/local/rvm/gems/ruby-2.4.0@fluentd/gems/ruby-kafka-0.4.0/lib/kafka/gzip_codec.rb:17:in `close'
/usr/local/rvm/gems/ruby-2.4.0@fluentd/gems/ruby-kafka-0.4.0/lib/kafka/gzip_codec.rb:17:in `compress'
/usr/local/rvm/gems/ruby-2.4.0@fluentd/gems/ruby-kafka-0.4.0/lib/kafka/compressor.rb:50:in `block in compress_data'
/usr/local/rvm/gems/ruby-2.4.0@fluentd/gems/ruby-kafka-0.4.0/lib/kafka/instrumenter.rb:21:in `instrument'
/usr/local/rvm/gems/ruby-2.4.0@fluentd/gems/ruby-kafka-0.4.0/lib/kafka/compressor.rb:49:in `compress_data'
/usr/local/rvm/gems/ruby-2.4.0@fluentd/gems/ruby-kafka-0.4.0/lib/kafka/compressor.rb:34:in `compress'
/usr/local/rvm/gems/ruby-2.4.0@fluentd/gems/ruby-kafka-0.4.0/lib/kafka/produce_operation.rb:89:in `block (2 levels) in send_buffered_messages'
/usr/local/rvm/gems/ruby-2.4.0@fluentd/gems/ruby-kafka-0.4.0/lib/kafka/message_buffer.rb:44:in `block (2 levels) in each'
/usr/local/rvm/gems/ruby-2.4.0@fluentd/gems/ruby-kafka-0.4.0/lib/kafka/message_buffer.rb:43:in `each'
/usr/local/rvm/gems/ruby-2.4.0@fluentd/gems/ruby-kafka-0.4.0/lib/kafka/message_buffer.rb:43:in `block in each'
/usr/local/rvm/gems/ruby-2.4.0@fluentd/gems/ruby-kafka-0.4.0/lib/kafka/message_buffer.rb:42:in `each'
/usr/local/rvm/gems/ruby-2.4.0@fluentd/gems/ruby-kafka-0.4.0/lib/kafka/message_buffer.rb:42:in `each'
/usr/local/rvm/gems/ruby-2.4.0@fluentd/gems/ruby-kafka-0.4.0/lib/kafka/produce_operation.rb:87:in `block in send_buffered_messages'
/usr/local/rvm/gems/ruby-2.4.0@fluentd/gems/ruby-kafka-0.4.0/lib/kafka/produce_operation.rb:81:in `each'
/usr/local/rvm/gems/ruby-2.4.0@fluentd/gems/ruby-kafka-0.4.0/lib/kafka/produce_operation.rb:81:in `send_buffered_messages'
/usr/local/rvm/gems/ruby-2.4.0@fluentd/gems/ruby-kafka-0.4.0/lib/kafka/produce_operation.rb:47:in `block in execute'
/usr/local/rvm/gems/ruby-2.4.0@fluentd/gems/ruby-kafka-0.4.0/lib/kafka/instrumenter.rb:21:in `instrument'
/usr/local/rvm/gems/ruby-2.4.0@fluentd/gems/ruby-kafka-0.4.0/lib/kafka/produce_operation.rb:41:in `execute'
/usr/local/rvm/gems/ruby-2.4.0@fluentd/gems/ruby-kafka-0.4.0/lib/kafka/producer.rb:301:in `block in deliver_messages_with_retries'
/usr/local/rvm/gems/ruby-2.4.0@fluentd/gems/ruby-kafka-0.4.0/lib/kafka/producer.rb:293:in `loop'
/usr/local/rvm/gems/ruby-2.4.0@fluentd/gems/ruby-kafka-0.4.0/lib/kafka/producer.rb:293:in `deliver_messages_with_retries'
/usr/local/rvm/gems/ruby-2.4.0@fluentd/gems/ruby-kafka-0.4.0/lib/kafka/producer.rb:243:in `block in deliver_messages'
/usr/local/rvm/gems/ruby-2.4.0@fluentd/gems/ruby-kafka-0.4.0/lib/kafka/instrumenter.rb:21:in `instrument'
/usr/local/rvm/gems/ruby-2.4.0@fluentd/gems/ruby-kafka-0.4.0/lib/kafka/producer.rb:236:in `deliver_messages'
/usr/local/rvm/gems/ruby-2.4.0@fluentd/gems/fluent-plugin-kafka-0.6.0/lib/fluent/plugin/out_kafka_buffered.rb:246:in `deliver_messages'
/usr/local/rvm/gems/ruby-2.4.0@fluentd/gems/fluent-plugin-kafka-0.6.0/lib/fluent/plugin/out_kafka_buffered.rb:309:in `write'
/usr/local/rvm/gems/ruby-2.4.0@fluentd/gems/fluentd-0.14.20/lib/fluent/compat/output.rb:131:in `write'
/usr/local/rvm/gems/ruby-2.4.0@fluentd/gems/fluentd-0.14.20/lib/fluent/plugin/output.rb:1043:in `try_flush'
/usr/local/rvm/gems/ruby-2.4.0@fluentd/gems/fluentd-0.14.20/lib/fluent/plugin/output.rb:1268:in `flush_thread_run'
/usr/local/rvm/gems/ruby-2.4.0@fluentd/gems/fluentd-0.14.20/lib/fluent/plugin/output.rb:420:in `block (2 levels) in start'
/usr/local/rvm/gems/ruby-2.4.0@fluentd/gems/fluentd-0.14.20/lib/fluent/plugin_helper/thread.rb:78:in `block in thread_create'
2018-02-06 09:44:21 +0000 [info]: #0 initialized kafka producer: kafka
2018-02-06 09:44:21 +0000 [debug]: #0 taking back chunk for errors. chunk="5648805a66fd415d7ee306278c3d0f39"
2018-02-06 09:44:21 +0000 [warn]: #0 failed to flush the buffer. retry_time=0 next_retry_seconds=2018-02-06 09:44:22 +0000 chunk="5648805a66fd415d7ee306278c3d0f39" error_class=Zlib::DataError error="data error"
2018-02-06 09:44:21 +0000 [warn]: #0 suppressed same stacktrace
2018-02-06 09:44:22 +0000 [warn]: #0 retry succeeded. chunk_id="5648805a66fd415d7ee306278c3d0f39"
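The top of the backtrace points at ruby-kafka's gzip codec, where the `Zlib::DataError` surfaced when the gzip stream was finalized. Below is a minimal Ruby sketch of what a gzip compress step along that path roughly looks like; it is an illustration of the mechanism, not the exact ruby-kafka 0.4.0 source, and the `gzip_compress` helper and sample payload are made up for the example.

```ruby
require "zlib"
require "stringio"

# Hypothetical helper illustrating the gzip compress path from the
# backtrace: write the payload through a GzipWriter and finalize it.
# The close call is where the reporter's Zlib::DataError was raised.
def gzip_compress(data)
  buffer = StringIO.new
  writer = Zlib::GzipWriter.new(buffer)
  writer.write(data)
  writer.close # finalizes the gzip stream (cf. gzip_codec.rb:17 `close')
  buffer.string
end

compressed = gzip_compress("hello kafka")
# Round-trip to confirm the produced gzip stream is valid:
puts Zlib::GzipReader.new(StringIO.new(compressed)).read # prints "hello kafka"
```

On a healthy system this round-trips cleanly, which is consistent with the maintainer's diagnosis: the error is transient and comes from the interaction between ruby-kafka and the cluster rather than from a reproducible bug in the compression step itself (the retry at 09:44:22 succeeded).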