feat(langchain): implement emitting events in addition to current behavior #2889
nirga merged 14 commits into traceloop:main
Conversation
… to current behavior

* Add `use_legacy_attributes` to Config and the Instrumentor constructor, defaulting to `True`;
* emit events for user prompts and AI responses, following [OpenTelemetry semantic conventions](https://opentelemetry.io/docs/specs/semconv/gen-ai/gen-ai-events/);
* introduce a privacy safeguard by checking the `OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT` environment variable to enable or disable content capture in events;
* implement comprehensive tests to verify the new functionality and ensure that, when `use_legacy_attributes == True`, the existing behavior remains unchanged.
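The privacy safeguard described above could look roughly like the following sketch. The helper names (`should_capture_content`, `build_event_body`) are hypothetical; only the environment variable name comes from the PR description.

```python
import os

# Standard opt-in variable named in the PR description; content is only
# attached to emitted events when the user explicitly enables capture.
CONTENT_CAPTURE_VAR = "OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT"


def should_capture_content() -> bool:
    # Defaults to False, so message content is redacted unless opted in.
    return os.environ.get(CONTENT_CAPTURE_VAR, "false").lower() == "true"


def build_event_body(prompt: str) -> dict:
    # When capture is disabled, the prompt text is left out of the event body.
    if should_capture_content():
        return {"content": prompt}
    return {}
```

Defaulting to redaction keeps the instrumentation safe for production traffic that may contain PII, while a single environment variable flips capture on for debugging.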
- Introduce TypedDicts and dataclasses to standardize inputs for event handling;
- Add `emit_event` method to encapsulate all event emission logic via `Event` instances;
- Refactor instrumentation to use `emit_event`, improving clarity and maintainability.
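A minimal sketch of the pattern those bullets describe: a dataclass standardizes event inputs and one `emit_event` helper centralizes emission. The class and event names here are illustrative assumptions, and the `emit` callable stands in for the real OpenTelemetry event logger used by the instrumentation.

```python
from dataclasses import dataclass
from typing import Any, Callable


@dataclass
class PromptEvent:
    """Standardized input for a user-prompt event (hypothetical shape)."""
    content: Any
    role: str = "user"


def emit_event(event: PromptEvent, emit: Callable[[str, dict], None]) -> None:
    # All emission logic lives in one place, so instrumentation call sites
    # stay uniform and a future change to event shape happens here only.
    emit("gen_ai.user.message", {"role": event.role, "content": event.content})
```

Funneling every call site through one helper is what makes the legacy-attributes/events switch maintainable: the branch on `use_legacy_attributes` can live in a single function instead of being repeated across callbacks.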
…metry instrumentation for Langchain
- Improved code organization and readability across event models and span utilities.
nirga left a comment
I think you're reverting several changes we've incorporated in callback_handler, can you take a look?
Noticed this file is empty. Is it intentional, or should it be removed?
Hey @ronensc, I'm taking a look in my branch here. If any code was lost in the merges, I'll restore it, and I'll remove the file if it isn't necessary anymore. Thank you for pointing it out.
Hey @nirga, I'll take a look at the merges I did and fix it if that's the case. Sorry for the delay, I hadn't seen your comment.
…avior (traceloop#2889) Co-authored-by: Nir Gazit <nirga@users.noreply.github.com>
else:
    set_llm_request(span, serialized, prompts, kwargs, self.spans[run_id])

@dont_throw
@LuizDMM Could you please clarify the reason for removing @dont_throw from on_llm_end()?
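For context on why removing `@dont_throw` matters, here is a sketch of what a decorator like it typically does (the real implementation lives in the instrumentation's shared utilities and may differ): it swallows and logs exceptions so a failure inside instrumentation code never propagates into the user's application.

```python
import functools
import logging

logger = logging.getLogger(__name__)


def dont_throw(func):
    # Sketch of a dont_throw-style decorator: any exception raised by the
    # wrapped callback is logged and suppressed, and None is returned,
    # so instrumentation bugs cannot crash the instrumented application.
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        try:
            return func(*args, **kwargs)
        except Exception:
            logger.debug("instrumentation error in %s", func.__name__, exc_info=True)
    return wrapper
```

Dropping the decorator from `on_llm_end()` would let instrumentation errors surface as exceptions in user code, which is why its removal drew a review question.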
…avior (#2889) Co-authored-by: Nir Gazit <nirga@users.noreply.github.com>