The issue you're experiencing with the OpenAIInstrumentor not propagating the trace context between OpenAI and vLLM spans could be related to how the trace context is being handled in the instrumentor. The OpenAIInstrumentor is supposed to propagate trace context by including it in the HTTP request headers, as verified by tests that check for the presence of the TRACEPARENT_HEADER_NAME in the request headers [1].
To address this issue, you might want to ensure that the trace context is correctly injected and propagated through the request headers. The _OpenAITracingWrapper class is responsible for injecting tracing information into OpenAI client requests by modifying the extra_headers in the request [2]. Make sure that this mechanism is functioning as expected in your setup.
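If automatic header injection is not happening in your environment, one way to confirm the diagnosis is to inject the current trace context into the request manually. The snippet below is a minimal sketch of that workaround, not the instrumentor's own code path; it assumes a globally configured TracerProvider, the standard opentelemetry-api propagation helpers, the extra_headers parameter of the openai v1 client, and a placeholder vLLM endpoint and model name.

```python
from opentelemetry import trace
from opentelemetry.propagate import inject
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")  # assumed vLLM endpoint
tracer = trace.get_tracer(__name__)

with tracer.start_as_current_span("parent-span"):
    # Inject the active trace context (the traceparent header) into a carrier dict.
    carrier = {}
    inject(carrier)

    # Forward the carrier as extra HTTP headers so the server side can join the same trace.
    client.chat.completions.create(
        model="my-model",  # placeholder model name
        messages=[{"role": "user", "content": "Hello"}],
        extra_headers=carrier,
    )
```

If the spans join into a single trace with this manual injection but not with the instrumentor alone, that points at the header-injection step handled by _OpenAITracingWrapper.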
Additionally, verify that the TracerProvider and SpanProcessor are correctly configured to manage spans and trace context. The TracerProvider should be set globally, and the spans should be processed and exported properly [3].
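As a sanity check, a minimal global setup might look like the sketch below; the service name and the console exporter are illustrative only, and in practice you would swap in your OTLP/Jaeger exporter.

```python
from opentelemetry import trace
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import ConsoleSpanExporter, SimpleSpanProcessor

# Build a provider, attach a span processor and exporter, and register the provider
# globally so that instrumentors and get_tracer() calls pick it up.
provider = TracerProvider(resource=Resource.create({"service.name": "openai-client"}))
provider.add_span_processor(SimpleSpanProcessor(ConsoleSpanExporter()))
trace.set_tracer_provider(provider)
```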
If the issue persists, consider checking for any discrepancies in the OpenAI client library version or any other dependencies that might affect trace context propagation [4].
Which component is this bug for?
OpenAI Instrumentation
📜 Description
This bug is related to #1983. Similar to what's described there, the OpenAIInstrumentor doesn't propagate the trace context of the exported span through the LLM request. This fix will also benefit other frameworks built on top of the OpenAI client library, such as CrewAI.
👟 Reproduction steps
Instrument the OpenAI client with the OpenAIInstrumentor and send a chat completion request from inside an active parent span; the original reproduction script is not included here, so a sketch of the setup follows below.
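This sketch is only a reconstruction of the setup the report describes. The import path assumes the OpenInference distribution of the instrumentor (openinference-instrumentation-openai), a vLLM server exposing the OpenAI-compatible API on http://localhost:8000/v1, and a Jaeger instance accepting OTLP on localhost:4317; all of these are assumptions and may differ from the original report.

```python
from openai import OpenAI
from openinference.instrumentation.openai import OpenAIInstrumentor  # assumed distribution
from opentelemetry import trace
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor

# Export spans to a local Jaeger instance over OTLP/gRPC.
provider = TracerProvider()
provider.add_span_processor(
    BatchSpanProcessor(OTLPSpanExporter(endpoint="http://localhost:4317"))
)
trace.set_tracer_provider(provider)

# Instrument the OpenAI client library.
OpenAIInstrumentor().instrument(tracer_provider=provider)

client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")  # vLLM's OpenAI-compatible API
tracer = trace.get_tracer(__name__)

# Issue a request inside a parent span; the instrumentor's LLM span should become its child,
# and the propagated traceparent header should let the vLLM server join the same trace.
with tracer.start_as_current_span("parent-span"):
    client.chat.completions.create(
        model="my-model",  # placeholder model name
        messages=[{"role": "user", "content": "Hello"}],
    )

provider.shutdown()
```

With context propagation working, running this should produce the single two-span trace described under the expected behavior below.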
👍 Expected behavior
I should see a single trace with two spans in the Jaeger web UI.
👎 Actual Behavior with Screenshots
I see two traces, each with a single span, in the Jaeger UI at http://localhost:16686/.
🤖 Python Version
No response
📃 Provide any additional context for the Bug.
No response
👀 Have you spent some time to check if this bug has been raised before?
Are you willing to submit PR?
Yes I am willing to submit a PR!