
Jaeger demo not working #30772

Closed

AkselAllas opened this issue Jan 25, 2024 · 4 comments
@AkselAllas

Component(s)

No response

What happened?

Description

https://github.com/open-telemetry/opentelemetry-collector-contrib/blob/main/examples/demo/docker-compose.yaml

It doesn't work on my macOS with:
Docker version 24.0.7, build afdd53b

Steps to Reproduce

Clone the latest main branch and run `docker compose up` from the demo directory, as sketched below.
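A minimal command sketch of these steps (the demo path follows the docker-compose link above; the clone location is up to you):

```sh
# Clone the repo and start the demo from examples/demo
git clone https://github.com/open-telemetry/opentelemetry-collector-contrib.git
cd opentelemetry-collector-contrib/examples/demo
docker compose up
```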

Expected Result

Jaeger shows traces.

Actual Result

I see the following error from the collector:

2024-01-25T08:12:24.842Z info    exporterhelper/retry_sender.go:177      Exporting failed. Will retry the request after interval.   {"kind": "exporter", "data_type": "traces", "name": "otlp", "error": "rpc error: code = Unavailable desc = connection error: desc = \"transport: Error while dialing: dial tcp 172.25.0.2:4317: connect: connection refused\"", "interval": "34.281815716s"}

And the full logs of the Jaeger container:

2024/01/25 08:11:38 maxprocs: Leaving GOMAXPROCS=10: CPU quota undefined
{"level":"info","ts":1706170298.7259846,"caller":"flags/service.go:119","msg":"Mounting metrics handler on admin server","route":"/metrics"}
{"level":"info","ts":1706170298.726109,"caller":"flags/service.go:125","msg":"Mounting expvar handler on admin server","route":"/debug/vars"}
{"level":"info","ts":1706170298.7262967,"caller":"flags/admin.go:129","msg":"Mounting health check on admin server","route":"/"}
{"level":"info","ts":1706170298.7263808,"caller":"flags/admin.go:143","msg":"Starting admin HTTP server","http-addr":":14269"}
{"level":"info","ts":1706170298.7263992,"caller":"flags/admin.go:121","msg":"Admin server started","http.host-port":"[::]:14269","health-status":"unavailable"}
{"level":"info","ts":1706170298.7275581,"caller":"memory/factory.go:66","msg":"Memory storage initialized","configuration":{"MaxTraces":0}}
{"level":"info","ts":1706170298.7278695,"caller":"static/strategy_store.go:138","msg":"Loading sampling strategies","filename":"/etc/jaeger/sampling_strategies.json"}
{"level":"info","ts":1706170298.7421882,"caller":"channelz/funcs.go:340","msg":"[core][Server #1] Server created","system":"grpc","grpc_log":true}
{"level":"info","ts":1706170298.7423134,"caller":"server/grpc.go:104","msg":"Starting jaeger-collector gRPC server","grpc.host-port":"[::]:14250"}
{"level":"info","ts":1706170298.742344,"caller":"server/http.go:56","msg":"Starting jaeger-collector HTTP server","http host-port":":14268"}
{"level":"info","ts":1706170298.7424648,"caller":"server/zipkin.go:52","msg":"Not listening for Zipkin HTTP traffic, port not configured"}
{"level":"info","ts":1706170298.742483,"caller":"grpc/builder.go:73","msg":"Agent requested insecure grpc connection to collector(s)"}
{"level":"info","ts":1706170298.7425313,"caller":"channelz/funcs.go:340","msg":"[core][Channel #3] Channel created","system":"grpc","grpc_log":true}
{"level":"info","ts":1706170298.7425632,"caller":"channelz/funcs.go:340","msg":"[core][Channel #3] original dial target is: \":14250\"","system":"grpc","grpc_log":true}
{"level":"info","ts":1706170298.7425747,"caller":"channelz/funcs.go:340","msg":"[core][Channel #3] dial target \":14250\" parse failed: parse \":14250\": missing protocol scheme","system":"grpc","grpc_log":true}
{"level":"info","ts":1706170298.7425785,"caller":"channelz/funcs.go:340","msg":"[core][Channel #3] fallback to scheme \"passthrough\"","system":"grpc","grpc_log":true}
{"level":"info","ts":1706170298.7426286,"caller":"channelz/funcs.go:340","msg":"[core][Channel #3] parsed dial target is: {Scheme:passthrough Authority: Endpoint::14250 URL:{Scheme:passthrough Opaque: User: Host: Path:/:14250 RawPath: OmitHost:false ForceQuery:false RawQuery: Fragment: RawFragment:}}","system":"grpc","grpc_log":true}
{"level":"info","ts":1706170298.7426353,"caller":"channelz/funcs.go:340","msg":"[core][Channel #3] Channel authority set to \"localhost:14250\"","system":"grpc","grpc_log":true}
{"level":"info","ts":1706170298.742516,"caller":"channelz/funcs.go:340","msg":"[core][Server #1 ListenSocket #2] ListenSocket created","system":"grpc","grpc_log":true}
{"level":"info","ts":1706170298.742804,"caller":"channelz/funcs.go:340","msg":"[core][Channel #3] Resolver state updated: {\n  \"Addresses\": [\n    {\n      \"Addr\": \":14250\",\n      \"ServerName\": \"\",\n      \"Attributes\": null,\n      \"BalancerAttributes\": null,\n      \"Type\": 0,\n      \"Metadata\": null\n    }\n  ],\n  \"ServiceConfig\": null,\n  \"Attributes\": null\n} (resolver returned new addresses)","system":"grpc","grpc_log":true}
{"level":"info","ts":1706170298.742848,"caller":"channelz/funcs.go:340","msg":"[core][Channel #3] Channel switches to new LB policy \"round_robin\"","system":"grpc","grpc_log":true}
{"level":"info","ts":1706170298.7428958,"caller":"grpclog/component.go:55","msg":"[balancer]base.baseBalancer: got new ClientConn state: {{[{\n  \"Addr\": \":14250\",\n  \"ServerName\": \"\",\n  \"Attributes\": null,\n  \"BalancerAttributes\": null,\n  \"Type\": 0,\n  \"Metadata\": null\n}] <nil> <nil>} <nil>}","system":"grpc","grpc_log":true}
{"level":"info","ts":1706170298.7429283,"caller":"channelz/funcs.go:340","msg":"[core][Channel #3 SubChannel #4] Subchannel created","system":"grpc","grpc_log":true}
{"level":"info","ts":1706170298.7429543,"caller":"grpclog/component.go:71","msg":"[roundrobin]roundrobinPicker: Build called with info: {map[]}","system":"grpc","grpc_log":true}
{"level":"info","ts":1706170298.7429621,"caller":"channelz/funcs.go:340","msg":"[core][Channel #3] Channel Connectivity change to CONNECTING","system":"grpc","grpc_log":true}
{"level":"info","ts":1706170298.7429955,"caller":"channelz/funcs.go:340","msg":"[core][Channel #3 SubChannel #4] Subchannel Connectivity change to CONNECTING","system":"grpc","grpc_log":true}
{"level":"info","ts":1706170298.7430136,"caller":"channelz/funcs.go:340","msg":"[core][Channel #3 SubChannel #4] Subchannel picks a new address \":14250\" to connect","system":"grpc","grpc_log":true}
{"level":"info","ts":1706170298.74306,"caller":"grpc/builder.go:113","msg":"Checking connection to collector"}
{"level":"info","ts":1706170298.7430677,"caller":"grpc/builder.go:124","msg":"Agent collector connection state change","dialTarget":":14250","status":"CONNECTING"}
{"level":"info","ts":1706170298.743093,"caller":"grpclog/component.go:71","msg":"[balancer]base.baseBalancer: handle SubConn state change: 0x40005e2630, CONNECTING","system":"grpc","grpc_log":true}
{"level":"info","ts":1706170298.7433767,"caller":"./main.go:256","msg":"Starting agent"}
{"level":"info","ts":1706170298.7434509,"caller":"querysvc/query_service.go:135","msg":"Archive storage not created","reason":"archive storage not supported"}
{"level":"info","ts":1706170298.743458,"caller":"app/flags.go:136","msg":"Archive storage not initialized"}
{"level":"info","ts":1706170298.743566,"caller":"channelz/funcs.go:340","msg":"[core][Server #7] Server created","system":"grpc","grpc_log":true}
{"level":"info","ts":1706170298.743592,"caller":"channelz/funcs.go:340","msg":"[core][Channel #3 SubChannel #4] Subchannel Connectivity change to READY","system":"grpc","grpc_log":true}
{"level":"info","ts":1706170298.7435942,"caller":"app/agent.go:69","msg":"Starting jaeger-agent HTTP server","http-port":5778}
{"level":"info","ts":1706170298.7436154,"caller":"grpclog/component.go:71","msg":"[balancer]base.baseBalancer: handle SubConn state change: 0x40005e2630, READY","system":"grpc","grpc_log":true}
{"level":"info","ts":1706170298.7436397,"caller":"channelz/funcs.go:340","msg":"[core][Channel #8] Channel created","system":"grpc","grpc_log":true}
{"level":"info","ts":1706170298.7436476,"caller":"channelz/funcs.go:340","msg":"[core][Channel #8] original dial target is: \":16685\"","system":"grpc","grpc_log":true}
{"level":"info","ts":1706170298.7436535,"caller":"channelz/funcs.go:340","msg":"[core][Channel #8] dial target \":16685\" parse failed: parse \":16685\": missing protocol scheme","system":"grpc","grpc_log":true}
{"level":"info","ts":1706170298.7436535,"caller":"grpclog/component.go:71","msg":"[roundrobin]roundrobinPicker: Build called with info: {map[0x40005e2630:{{\n  \"Addr\": \":14250\",\n  \"ServerName\": \"\",\n  \"Attributes\": null,\n  \"BalancerAttributes\": null,\n  \"Type\": 0,\n  \"Metadata\": null\n}}]}","system":"grpc","grpc_log":true}
{"level":"info","ts":1706170298.7436705,"caller":"channelz/funcs.go:340","msg":"[core][Channel #3] Channel Connectivity change to READY","system":"grpc","grpc_log":true}
{"level":"info","ts":1706170298.7436788,"caller":"grpc/builder.go:124","msg":"Agent collector connection state change","dialTarget":":14250","status":"READY"}
{"level":"info","ts":1706170298.7436566,"caller":"channelz/funcs.go:340","msg":"[core][Channel #8] fallback to scheme \"passthrough\"","system":"grpc","grpc_log":true}
{"level":"info","ts":1706170298.743843,"caller":"channelz/funcs.go:340","msg":"[core][Channel #8] parsed dial target is: {Scheme:passthrough Authority: Endpoint::16685 URL:{Scheme:passthrough Opaque: User: Host: Path:/:16685 RawPath: OmitHost:false ForceQuery:false RawQuery: Fragment: RawFragment:}}","system":"grpc","grpc_log":true}
{"level":"info","ts":1706170298.744708,"caller":"channelz/funcs.go:340","msg":"[core][Channel #8] Channel authority set to \"localhost:16685\"","system":"grpc","grpc_log":true}
{"level":"info","ts":1706170298.744752,"caller":"channelz/funcs.go:340","msg":"[core][Channel #8] Resolver state updated: {\n  \"Addresses\": [\n    {\n      \"Addr\": \":16685\",\n      \"ServerName\": \"\",\n      \"Attributes\": null,\n      \"BalancerAttributes\": null,\n      \"Type\": 0,\n      \"Metadata\": null\n    }\n  ],\n  \"ServiceConfig\": null,\n  \"Attributes\": null\n} (resolver returned new addresses)","system":"grpc","grpc_log":true}
{"level":"info","ts":1706170298.744769,"caller":"channelz/funcs.go:340","msg":"[core][Channel #8] Channel switches to new LB policy \"pick_first\"","system":"grpc","grpc_log":true}
{"level":"info","ts":1706170298.7447844,"caller":"channelz/funcs.go:340","msg":"[core][Channel #8 SubChannel #9] Subchannel created","system":"grpc","grpc_log":true}
{"level":"info","ts":1706170298.7448285,"caller":"channelz/funcs.go:340","msg":"[core][Channel #8] Channel Connectivity change to CONNECTING","system":"grpc","grpc_log":true}
{"level":"info","ts":1706170298.7449918,"caller":"channelz/funcs.go:340","msg":"[core][Channel #8 SubChannel #9] Subchannel Connectivity change to CONNECTING","system":"grpc","grpc_log":true}
{"level":"info","ts":1706170298.7450216,"caller":"channelz/funcs.go:340","msg":"[core][Channel #8 SubChannel #9] Subchannel picks a new address \":16685\" to connect","system":"grpc","grpc_log":true}
{"level":"info","ts":1706170298.7451622,"caller":"grpclog/component.go:71","msg":"[core]pickfirstBalancer: UpdateSubConnState: 0x4000790000, {CONNECTING <nil>}","system":"grpc","grpc_log":true}
{"level":"info","ts":1706170298.745174,"caller":"app/static_handler.go:181","msg":"UI config path not provided, config file will not be watched"}
{"level":"info","ts":1706170298.7452555,"caller":"app/server.go:218","msg":"Query server started","http_addr":"[::]:16686","grpc_addr":"[::]:16685"}
{"level":"info","ts":1706170298.7452831,"caller":"healthcheck/handler.go:129","msg":"Health Check state change","status":"ready"}
{"level":"info","ts":1706170298.745308,"caller":"app/server.go:301","msg":"Starting GRPC server","port":16685,"addr":":16685"}
{"level":"info","ts":1706170298.7453387,"caller":"app/server.go:282","msg":"Starting HTTP server","port":16686,"addr":":16686"}
{"level":"info","ts":1706170298.7454355,"caller":"grpclog/component.go:71","msg":"[core]Creating new client transport to \"{\\n  \\\"Addr\\\": \\\":16685\\\",\\n  \\\"ServerName\\\": \\\"localhost:16685\\\",\\n  \\\"Attributes\\\": null,\\n  \\\"BalancerAttributes\\\": null,\\n  \\\"Type\\\": 0,\\n  \\\"Metadata\\\": null\\n}\": connection error: desc = \"transport: Error while dialing dial tcp :16685: connect: connection refused\"","system":"grpc","grpc_log":true}
{"level":"warn","ts":1706170298.7454567,"caller":"channelz/funcs.go:342","msg":"[core][Channel #8 SubChannel #9] grpc: addrConn.createTransport failed to connect to {\n  \"Addr\": \":16685\",\n  \"ServerName\": \"localhost:16685\",\n  \"Attributes\": null,\n  \"BalancerAttributes\": null,\n  \"Type\": 0,\n  \"Metadata\": null\n}. Err: connection error: desc = \"transport: Error while dialing dial tcp :16685: connect: connection refused\"","system":"grpc","grpc_log":true}
{"level":"info","ts":1706170298.7454665,"caller":"channelz/funcs.go:340","msg":"[core][Channel #8 SubChannel #9] Subchannel Connectivity change to TRANSIENT_FAILURE","system":"grpc","grpc_log":true}
{"level":"info","ts":1706170298.7454777,"caller":"grpclog/component.go:71","msg":"[core]pickfirstBalancer: UpdateSubConnState: 0x4000790000, {TRANSIENT_FAILURE connection error: desc = \"transport: Error while dialing dial tcp :16685: connect: connection refused\"}","system":"grpc","grpc_log":true}
{"level":"info","ts":1706170298.7454839,"caller":"channelz/funcs.go:340","msg":"[core][Channel #8] Channel Connectivity change to TRANSIENT_FAILURE","system":"grpc","grpc_log":true}
{"level":"info","ts":1706170298.745315,"caller":"channelz/funcs.go:340","msg":"[core][Server #7 ListenSocket #10] ListenSocket created","system":"grpc","grpc_log":true}
{"level":"info","ts":1706170299.7455788,"caller":"channelz/funcs.go:340","msg":"[core][Channel #8 SubChannel #9] Subchannel Connectivity change to IDLE","system":"grpc","grpc_log":true}
{"level":"info","ts":1706170299.7456577,"caller":"grpclog/component.go:71","msg":"[core]pickfirstBalancer: UpdateSubConnState: 0x4000790000, {IDLE connection error: desc = \"transport: Error while dialing dial tcp :16685: connect: connection refused\"}","system":"grpc","grpc_log":true}
{"level":"info","ts":1706170299.7456737,"caller":"channelz/funcs.go:340","msg":"[core][Channel #8] Channel Connectivity change to IDLE","system":"grpc","grpc_log":true}

Collector version

0.88.0

Environment information

No response

OpenTelemetry Collector configuration

No response

Log output

No response

Additional context

No response

@AkselAllas added the `bug` (Something isn't working) and `needs triage` (New item requiring triage) labels on Jan 25, 2024
Contributor

Pinging code owners for examples/demo: @open-telemetry/collector-approvers. See Adding Labels via Comments if you do not have permissions to add labels yourself.

@crobert-1
Member

Hello @AkselAllas, I tested on macOS with the same Docker version and it's working for me. I can see traces in Jaeger and I'm not seeing any errors. The `connection refused` error makes me think it's an environment issue. Maybe a firewall is blocking the connection, or there's some other reason the port can't be reached?
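For example, you could check from the host whether the containers came up and whether the published ports accept connections. A rough sketch (the Jaeger service name and the published ports depend on the compose file; `jaeger-all-in-one` and the ports below are assumptions):

```sh
# List the demo services, their state, and port mappings
docker compose ps

# Look at the Jaeger container's logs for startup or listener errors
# (service name is an assumption; use the name from your compose file)
docker compose logs jaeger-all-in-one

# Check whether the Jaeger UI and OTLP gRPC ports accept connections,
# assuming they are published to the host
nc -vz localhost 16686
nc -vz localhost 4317
```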

@crobert-1 added the `question` (Further information is requested) label and removed the `bug` and `needs triage` labels on Jan 25, 2024
Contributor

This issue has been inactive for 60 days. It will be closed in 60 days if there is no activity. To ping code owners by adding a component label, see Adding Labels via Comments, or if you are unsure of which component this issue relates to, please ping @open-telemetry/collector-contrib-triagers. If this issue is still relevant, please ping the code owners or leave a comment explaining why it is still relevant. Otherwise, please close it.

Pinging code owners:

  • examples/demo: @open-telemetry/collector-approvers

See Adding Labels via Comments if you do not have permissions to add labels yourself.

@github-actions bot added the `Stale` label on Mar 26, 2024
Contributor

This issue has been closed as inactive because it has been stale for 120 days with no activity.

@github-actions bot closed this as not planned (won't fix, can't repro, duplicate, stale) on May 25, 2024