
Running Fluentd on Windows, failed to flush the buffer #1272

Closed
sanju090 opened this issue Oct 11, 2016 · 9 comments

@sanju090 commented Oct 11, 2016

Hi Everyone,
I am facing an issue flushing the buffer from Fluentd to Elasticsearch. I am new to Fluentd; can you please help?

Elasticsearch version -> elasticsearch-2.4.1
Fluentd version -> 0.14.7
fluent-plugin-elasticsearch -> 1.7.0
OS -> Windows 7
[screenshot: flush_buffer_issue]

fluent.conf

<source>
  @type forward
  port 24224
</source>

# match pattern assumed; the original directive tags were stripped by the page renderer
<match **>
  @type elasticsearch
  host localhost
  port 9200
  index_name fluentd
  type_name fluentd
  buffer_type memory
  buffer_chunk_limit 256m
  buffer_queue_limit 128
  flush_interval 5s
  disable_retry_limit false
  retry_limit 5
  retry_wait 1s
  max_retry_wait 5s
</match>
@repeatedly (Member)

I'm not familiar with Windows, but this does not seem to be a Fluentd-related issue.
Check your environment: http://stackoverflow.com/questions/2972600/no-connection-could-be-made-because-the-target-machine-actively-refused-it
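
As a quick sanity check (not suggested in the thread itself, just a minimal sketch), the same Java environment used below can verify whether anything is actually listening on localhost:9200 before suspecting Fluentd; the host, port, and class name here are assumptions mirroring the configuration above.

import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class ElasticsearchPortCheck {

    public static void main(String[] args) {
        // Host and port copied from the fluent.conf above; adjust if Elasticsearch runs elsewhere.
        String host = "localhost";
        int port = 9200;
        try (Socket socket = new Socket()) {
            // A short timeout distinguishes "actively refused" from a merely slow response.
            socket.connect(new InetSocketAddress(host, port), 2000);
            System.out.println("Reachable: " + host + ":" + port);
        } catch (IOException e) {
            // "Connection refused" here means nothing is listening on that port,
            // the service is bound to a different interface, or a firewall blocks it.
            System.out.println("Could not connect: " + e.getMessage());
        }
    }
}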

@sanju090 (Author)

Hi Masahiro,
Thank you very much for your quick response. Please ignore the "No connection could be made..." part; that message appeared when I tried to kill the process with Ctrl + C.
I am able to send log data from the Java application to Fluentd; with stdout it prints the data correctly. The issue occurs when sending that data to Elasticsearch.

Here is my updated fluent.conf file

<source>
  @type forward
  port 24224
</source>

# match pattern assumed; the original directive tags were stripped by the page renderer
<match **>
  @type copy
  <store>
    @type stdout
  </store>
  <store>
    @type elasticsearch
    host localhost
    port 9200
    index_name fluentd
    type_name fluentd
    buffer_type memory
    buffer_chunk_limit 256m
    buffer_queue_limit 128
    flush_interval 5s
    disable_retry_limit false
    retry_limit 5
    retry_wait 1s
    max_retry_wait 5s
  </store>
</match>

Java code to send a log event to Fluentd:

import java.util.HashMap;
import java.util.Map;

import org.fluentd.logger.FluentLogger;

public class FluentdLogger {

    // Connects to the local Fluentd forward input on port 24224; "fluentd" is the tag prefix.
    static final FluentLogger LOG = FluentLogger.getLogger("fluentd", "localhost", 24224);

    public static void main(String[] args) {
        FluentdLogger fluentdlog = new FluentdLogger();
        fluentdlog.doApplicationLogic();
    }

    public void doApplicationLogic() {
        // Emit a single record; the full tag seen by Fluentd is "fluentd.tag1".
        Map<String, Object> data = new HashMap<String, Object>();
        data.put("from", "userA");
        data.put("to", "userB");
        LOG.log("tag1", data);
    }
}
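
One detail not raised in the thread: fluent-logger-java buffers records in memory when the connection to Fluentd is temporarily unavailable, so in a short-lived program like this it is safer to close the logger before main returns. A hedged sketch of that final step, reusing the LOG instance above:

    // At the end of main (or in a shutdown hook), flush and release the logger so
    // buffered records are not dropped when the JVM exits.
    LOG.close();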

[screenshots: buffer_issue1, buffer2]

@repeatedly (Member)

I see.
It seems to be a regression caused by #1263.
I will check it.

@repeatedly (Member)

Could you test this configuration with v0.14.6?

@repeatedly (Member)

I confirmed this msgpack_each-related error on Mac.
I will fix this issue soon.

@repeatedly (Member)

I found the cause and I'm now writing the patch.
We will release v0.14.8 soon.

@sanju090 (Author)

Hi Masahiro, thank you very much for looking into it and writing the patch. Can you please update this thread once v0.14.8 is ready? Thanks again.

repeatedly added a commit that referenced this issue on Oct 13, 2016 (…-compat-layer):
Chunks in compat layer should provide msgpack_each method. fix #1272
@repeatedly (Member)

Released v0.14.8.

@sanju090 (Author)

Thank you very much Masahiro for your quick responses and the fix 👍
