
fix(langchain): langgraph application crash due to context detach#3256

Merged
galkleinman merged 2 commits into main from gk/fix-langgraph-crash
Aug 14, 2025

Conversation


@galkleinman galkleinman commented Aug 14, 2025

Important

Fixes Langchain application crash by implementing safe context attachment and detachment in async scenarios.

  • Behavior:
    • Implements try-except blocks for context attachment and detachment in __call__() in __init__.py and _create_span(), _create_llm_span(), _end_span() in callback_handler.py.
    • If context operations fail, the application continues without suppression or detachment, ensuring core functionality is unaffected.
  • Functions:
    • Adds _safe_attach_context() and _safe_detach_context() in callback_handler.py to handle context operations safely.
    • Modifies _create_span(), _create_llm_span(), and _end_span() to use the new safe context functions.
  • Misc:
    • Adds comments explaining the rationale for handling exceptions in context operations.

This description was created by Ellipsis for 8caba09. You can customize this summary. It will automatically update as commits are pushed.

Summary by CodeRabbit

  • Bug Fixes

    • Improved stability of LangChain instrumentation by safely handling context attach/detach failures.
    • Prevented rare crashes when suppressing LLM instrumentation, ensuring operations continue even if context updates fail.
    • Increased resilience in async and legacy chain scenarios (e.g., LLMChain), reducing interruptions during tracing.
  • Refactor

    • Centralized fail-safe context management to standardize attachment/detachment and minimize edge-case errors.


coderabbitai Bot commented Aug 14, 2025

Walkthrough

Adds guarded context attach/detach operations across LangChain instrumentation. Introduces safe helper methods in the callback handler and wraps SUPPRESS_LANGUAGE_MODEL_INSTRUMENTATION_KEY updates with try/except blocks. Adjusts span creation and end paths to tolerate context API failures without raising, while leaving public APIs unchanged.

Changes

Cohort / File(s) Summary of changes
Context suppression guard in OpenAI wrapper
packages/opentelemetry-instrumentation-langchain/opentelemetry/instrumentation/langchain/__init__.py
Wraps SUPPRESS_LANGUAGE_MODEL_INSTRUMENTATION_KEY attach/set in try/except within _OpenAITracingWrapper.__call__, making suppression non-fatal if context operations fail.
Safe context ops in callback handler
packages/opentelemetry-instrumentation-langchain/opentelemetry/instrumentation/langchain/callback_handler.py
Adds private helpers _safe_attach_context and _safe_detach_context; replaces direct attach/detach usage; guards association_properties attachment; wraps suppression key handling in _create_llm_span and on_chain_end; adjusts _end_span to use safe detach; preserves on_llm_end logic with minor refactor for clarity.

Sequence Diagram(s)

sequenceDiagram
  participant LangChain
  participant CallbackHandler
  participant ContextAPI
  participant Tracer/Span

  LangChain->>CallbackHandler: _create_span(...)
  CallbackHandler->>ContextAPI: safe attach span context
  alt attach fails
    CallbackHandler->>CallbackHandler: proceed without token
  end
  CallbackHandler->>Tracer/Span: create/start span

  LangChain->>CallbackHandler: _create_llm_span(...)
  CallbackHandler->>ContextAPI: set SUPPRESS_LANGUAGE_MODEL_INSTRUMENTATION_KEY (guarded)
  alt suppress set fails
    CallbackHandler->>CallbackHandler: continue without suppression token
  end

  LangChain->>CallbackHandler: _end_span(token)
  CallbackHandler->>ContextAPI: safe detach(token)
  alt detach fails
    CallbackHandler->>CallbackHandler: swallow error
  end
  CallbackHandler->>Tracer/Span: end span
sequenceDiagram
  participant Caller
  participant OpenAITracingWrapper
  participant ContextAPI

  Caller->>OpenAITracingWrapper: __call__(...)
  OpenAITracingWrapper->>ContextAPI: set/attach suppression key (guarded)
  alt failure
    OpenAITracingWrapper->>OpenAITracingWrapper: continue call without suppression
  end
  OpenAITracingWrapper->>Caller: result

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~20 minutes

Suggested reviewers

  • nirga
  • ronensc
  • doronkopit5

Poem

I hop through spans with gentle care,
Attach, detach—beware the snare.
If context slips, I will not cry,
I twitch my nose and safely try.
Suppressed or not, the traces sing—
A carrot’s nod to everything. 🥕🐇



@galkleinman galkleinman marked this pull request as ready for review August 14, 2025 11:59

@ellipsis-dev ellipsis-dev Bot left a comment


Important

Looks good to me! 👍

Reviewed everything up to 3666eaf in 55 seconds.
  • Reviewed 160 lines of code in 2 files
  • Skipped 0 files when reviewing.
  • Skipped posting 6 draft comments. View those below.
  • Modify your settings and rules to customize what types of comments Ellipsis leaves. And don't forget to react with 👍 or 👎 to teach Ellipsis.
1. packages/opentelemetry-instrumentation-langchain/opentelemetry/instrumentation/langchain/__init__.py:242
  • Draft comment:
    Wraps context API attach in a try/except to avoid crash. Consider logging the caught exception at debug level for extra diagnostics.
  • Reason this comment was not posted:
    Confidence changes required: 33% <= threshold 50% None
2. packages/opentelemetry-instrumentation-langchain/opentelemetry/instrumentation/langchain/callback_handler.py:194
  • Draft comment:
    The new _safe_attach_context helper cleanly wraps context attachment. Optionally log the exception on failure to aid debugging.
  • Reason this comment was not posted:
    Confidence changes required: 33% <= threshold 50% None
3. packages/opentelemetry-instrumentation-langchain/opentelemetry/instrumentation/langchain/callback_handler.py:208
  • Draft comment:
    The _safe_detach_context method gracefully handles context detach errors. Adding optional debug logging for exceptions might help diagnose context issues in async scenarios.
  • Reason this comment was not posted:
    Confidence changes required: 33% <= threshold 50% None
4. packages/opentelemetry-instrumentation-langchain/opentelemetry/instrumentation/langchain/callback_handler.py:261
  • Draft comment:
    The association properties update is now wrapped in a try/except block. Consider logging the exception at a debug level to record if the context update fails.
  • Reason this comment was not posted:
    Confidence changes required: 33% <= threshold 50% None
5. packages/opentelemetry-instrumentation-langchain/opentelemetry/instrumentation/langchain/callback_handler.py:363
  • Draft comment:
    In _create_llm_span, the try/except for setting the suppression flag is consistent. Consider adding optional debug logging if context setting fails.
  • Reason this comment was not posted:
    Confidence changes required: 33% <= threshold 50% None
6. packages/opentelemetry-instrumentation-langchain/opentelemetry/instrumentation/langchain/callback_handler.py:461
  • Draft comment:
    The on_chain_end method resets the suppression flag within a try/except block. Similar to other places, consider logging exceptions to ease future troubleshooting.
  • Reason this comment was not posted:
    Confidence changes required: 33% <= threshold 50% None

Workflow ID: wflow_TYJUKATW0BFQh1D8

You can customize Ellipsis by changing your verbosity settings, reacting with 👍 or 👎, replying to comments, or adding code review rules.


@ellipsis-dev ellipsis-dev Bot left a comment


Important

Looks good to me! 👍

Reviewed 8caba09 in 51 seconds.
  • Reviewed 57 lines of code in 1 file
  • Skipped 0 files when reviewing.
  • Skipped posting 3 draft comments. View those below.
1. packages/opentelemetry-instrumentation-langchain/opentelemetry/instrumentation/langchain/callback_handler.py:194
  • Draft comment:
    Minor docstring whitespace cleanup in _safe_attach_context; looks fine.
  • Reason this comment was not posted:
    Confidence changes required: 0% <= threshold 50% None
2. packages/opentelemetry-instrumentation-langchain/opentelemetry/instrumentation/langchain/callback_handler.py:221
  • Draft comment:
    Docstring formatting and inline comment cleanup in _safe_detach_context improve readability.
  • Reason this comment was not posted:
    Confidence changes required: 0% <= threshold 50% None
3. packages/opentelemetry-instrumentation-langchain/opentelemetry/instrumentation/langchain/callback_handler.py:564
  • Draft comment:
    Reformatted association_properties block in on_llm_end for improved readability.
  • Reason this comment was not posted:
    Confidence changes required: 0% <= threshold 50% None

Workflow ID: wflow_CSdr3vNvUD2WmWxH


@galkleinman galkleinman merged commit dbeceb8 into main Aug 14, 2025
8 of 9 checks passed

@coderabbitai coderabbitai Bot left a comment


Actionable comments posted: 2

🔭 Outside diff range comments (2)
packages/opentelemetry-instrumentation-langchain/opentelemetry/instrumentation/langchain/__init__.py (1)

245-255: Attach without detach leaks suppression context; use a token and detach in finally (also satisfies Ruff SIM105).

Current code sets SUPPRESS_LANGUAGE_MODEL_INSTRUMENTATION_KEY but never restores the previous context, potentially suppressing downstream instrumentation longer than intended. Wrap the call with a token and detach in a finally block. Also replace try/except/pass with contextlib.suppress per SIM105.

Apply this diff within call:

-        # In legacy chains like LLMChain, suppressing model instrumentations
-        # within create_llm_span doesn't work, so this should helps as a fallback
-        try:
-            context_api.attach(
-                context_api.set_value(SUPPRESS_LANGUAGE_MODEL_INSTRUMENTATION_KEY, True)
-            )
-        except Exception:
-            # If context setting fails, continue without suppression
-            # This is not critical for core functionality
-            pass
-
-        return wrapped(*args, **kwargs)
+        # In legacy chains like LLMChain, suppressing model instrumentations
+        # within create_llm_span doesn't work, so this helps as a fallback.
+        token = None
+        with contextlib.suppress(Exception):
+            token = context_api.attach(
+                context_api.set_value(
+                    SUPPRESS_LANGUAGE_MODEL_INSTRUMENTATION_KEY, True
+                )
+            )
+        try:
+            return wrapped(*args, **kwargs)
+        finally:
+            if token:
+                with contextlib.suppress(Exception):
+                    context_api.detach(token)

Additionally, add the missing import at the top of the file:

import contextlib
packages/opentelemetry-instrumentation-langchain/opentelemetry/instrumentation/langchain/callback_handler.py (1)

283-299: Critical: span context token is overwritten by suppression token → leaked current-span context and broken LIFO detach.

In _create_span you now attach the span to context and store its token in SpanHolder. In _create_llm_span you attach a suppression token and then overwrite self.spans[run_id] with a new SpanHolder that stores the suppression token, losing the span token. _end_span detaches only one token, so the span’s context token will leak and may corrupt parentage of subsequent spans.

Fix by:

  • Preserving the span token from _create_span.
  • Tracking the suppression token separately and detaching it first (then the span token, then the association token if any) to respect LIFO.

Apply these diffs:

  1. Detach suppression, then span token, then association token in _end_span:
     span.end()
-    token = self.spans[run_id].token
-    if token:
-        self._safe_detach_context(token)
+    # Detach in LIFO order: suppression -> span -> association
+    supp_token = getattr(self, "_suppression_tokens", {}).pop(run_id, None)
+    if supp_token:
+        self._safe_detach_context(supp_token)
+
+    token = self.spans[run_id].token  # span context token
+    if token:
+        self._safe_detach_context(token)
+
+    assoc_token = getattr(self, "_association_tokens", {}).pop(run_id, None)
+    if assoc_token:
+        self._safe_detach_context(assoc_token)
  2. Store only the suppression token (do not overwrite SpanHolder) in _create_llm_span:
-        # we already have an LLM span by this point,
-        # so skip any downstream instrumentation from here
-        try:
-            token = context_api.attach(
-                context_api.set_value(SUPPRESS_LANGUAGE_MODEL_INSTRUMENTATION_KEY, True)
-            )
-        except Exception:
-            # If context setting fails, continue without suppression token
-            token = None
-
-        self.spans[run_id] = SpanHolder(
-            span, token, None, [], workflow_name, None, entity_path
-        )
+        # We already have an LLM span; suppress downstream instrumentation from here
+        supp_token = None
+        with contextlib.suppress(Exception):
+            supp_token = context_api.attach(
+                context_api.set_value(
+                    SUPPRESS_LANGUAGE_MODEL_INSTRUMENTATION_KEY, True
+                )
+            )
+        # Track suppression token separately to detach it in LIFO order on end
+        if supp_token is not None:
+            self._suppression_tokens[run_id] = supp_token
  3. Track association_properties context token in _create_span for proper cleanup later:
-            try:
-                context_api.attach(
-                    context_api.set_value(
-                        "association_properties",
-                        {**current_association_properties, **sanitized_metadata},
-                    )
-                )
-            except Exception:
-                # If setting association properties fails, continue without them
-                # This doesn't affect the core span functionality
-                pass
+            with contextlib.suppress(Exception):
+                assoc_token = context_api.attach(
+                    context_api.set_value(
+                        "association_properties",
+                        {**current_association_properties, **sanitized_metadata},
+                    )
+                )
+                # Store token for LIFO-detach at end of run
+                if assoc_token is not None:
+                    self._association_tokens[run_id] = assoc_token

Additions required outside these hunks:

# In TraceloopCallbackHandler.__init__:
self._suppression_tokens: dict[UUID, object] = {}
self._association_tokens: dict[UUID, object] = {}

And ensure contextlib is imported at the top of this file:

import contextlib

This preserves correct context stack behavior and prevents leaks across async runs.

Also applies to: 362-374, 188-193

🧹 Nitpick comments (2)
packages/opentelemetry-instrumentation-langchain/opentelemetry/instrumentation/langchain/callback_handler.py (2)

194-207: Annotate return type of _safe_attach_context for clarity.

Explicitly annotate the token type as Optional[object] to improve readability and help static analysis.

-    def _safe_attach_context(self, span: Span):
+    def _safe_attach_context(self, span: Span) -> Optional[object]:

462-470: Use contextlib.suppress instead of try/except/pass (Ruff SIM105).

This resets suppression without emitting errors, consistent with the resilience goals.

-            try:
-                context_api.attach(
-                    context_api.set_value(
-                        SUPPRESS_LANGUAGE_MODEL_INSTRUMENTATION_KEY, False
-                    )
-                )
-            except Exception:
-                # If context reset fails, it's not critical for functionality
-                pass
+            with contextlib.suppress(Exception):
+                context_api.attach(
+                    context_api.set_value(
+                        SUPPRESS_LANGUAGE_MODEL_INSTRUMENTATION_KEY, False
+                    )
+                )
📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro


📥 Commits

Reviewing files that changed from the base of the PR and between f706a9f and 8caba09.

📒 Files selected for processing (2)
  • packages/opentelemetry-instrumentation-langchain/opentelemetry/instrumentation/langchain/__init__.py (1 hunks)
  • packages/opentelemetry-instrumentation-langchain/opentelemetry/instrumentation/langchain/callback_handler.py (6 hunks)
🧰 Additional context used
📓 Path-based instructions (1)
**/*.py

📄 CodeRabbit Inference Engine (CLAUDE.md)

**/*.py: Python code must conform to Flake8 linting rules
Do not hardcode API keys in source code; read them from environment variables or a secure vault

Files:

  • packages/opentelemetry-instrumentation-langchain/opentelemetry/instrumentation/langchain/__init__.py
  • packages/opentelemetry-instrumentation-langchain/opentelemetry/instrumentation/langchain/callback_handler.py
🧬 Code Graph Analysis (1)
packages/opentelemetry-instrumentation-langchain/opentelemetry/instrumentation/langchain/callback_handler.py (1)
packages/opentelemetry-instrumentation-langchain/opentelemetry/instrumentation/langchain/span_utils.py (1)
  • _extract_model_name_from_association_metadata (373-376)
🪛 Ruff (0.12.2)
packages/opentelemetry-instrumentation-langchain/opentelemetry/instrumentation/langchain/__init__.py

245-252: Use contextlib.suppress(Exception) instead of try-except-pass

(SIM105)

packages/opentelemetry-instrumentation-langchain/opentelemetry/instrumentation/langchain/callback_handler.py

262-272: Use contextlib.suppress(Exception) instead of try-except-pass

(SIM105)


462-470: Use contextlib.suppress(Exception) instead of try-except-pass

(SIM105)

⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (5)
  • GitHub Check: Test Packages (3.12)
  • GitHub Check: Test Packages (3.11)
  • GitHub Check: Lint
  • GitHub Check: Test Packages (3.10)
  • GitHub Check: Build Packages (3.11)
🔇 Additional comments (2)
packages/opentelemetry-instrumentation-langchain/opentelemetry/instrumentation/langchain/callback_handler.py (2)

568-573: LGTM: safer, clearer association properties retrieval.

The multi-line assignment improves readability without changing behavior.


241-300: SpanHolder is expandable and context attach is safe – please audit callers of _create_span

  • Verified that SpanHolder (packages/opentelemetry-instrumentation-langchain/opentelemetry/instrumentation/langchain/span_utils.py) is a @dataclass without __slots__, so it can safely be extended with additional fields.
  • The updated _create_span in callback_handler.py (lines 241–300) now attaches the new span to the current context, altering the previous “non-current” span pattern. The two-token detach in _safe_attach_context should restore the original context correctly.
  • Please review any callers of _create_span to ensure none relied on the old behavior of leaving the span non-current.

Comment on lines +208 to +240
    def _safe_detach_context(self, token):
        """
        Safely detach context token without causing application crashes.

        This method implements a fail-safe approach to context detachment that handles
        all known edge cases in async/concurrent scenarios where context tokens may
        become invalid or be detached in different execution contexts.

        We use the runtime context directly to avoid logging errors from context_api.detach()
        """
        if not token:
            return

        try:
            # Use the runtime context directly to avoid error logging from context_api.detach()
            from opentelemetry.context import _RUNTIME_CONTEXT

            _RUNTIME_CONTEXT.detach(token)
        except Exception:
            # Context detach can fail in async scenarios when tokens are created in different contexts
            # This includes ValueError, RuntimeError, and other context-related exceptions
            # This is expected behavior and doesn't affect the correct span hierarchy
            #
            # Common scenarios where this happens:
            # 1. Token created in one async task/thread, detached in another
            # 2. Context was already detached by another process
            # 3. Token became invalid due to context switching
            # 4. Race conditions in highly concurrent scenarios
            #
            # This is safe to ignore as the span itself was properly ended
            # and the tracing data is correctly captured.
            pass



🛠️ Refactor suggestion

Prefer contextlib.suppress and avoid relying solely on private _RUNTIME_CONTEXT.

Using a private API may break across OTel versions. Try context_api.detach first, then fall back to _RUNTIME_CONTEXT only if needed. Also adopt contextlib.suppress to satisfy linting and reduce noise.

-    def _safe_detach_context(self, token):
+    def _safe_detach_context(self, token: Optional[object]) -> None:
@@
-        try:
-            # Use the runtime context directly to avoid error logging from context_api.detach()
-            from opentelemetry.context import _RUNTIME_CONTEXT
-
-            _RUNTIME_CONTEXT.detach(token)
-        except Exception:
-            # Context detach can fail in async scenarios when tokens are created in different contexts
-            # This includes ValueError, RuntimeError, and other context-related exceptions
-            # This is expected behavior and doesn't affect the correct span hierarchy
-            #
-            # Common scenarios where this happens:
-            # 1. Token created in one async task/thread, detached in another
-            # 2. Context was already detached by another process
-            # 3. Token became invalid due to context switching
-            # 4. Race conditions in highly concurrent scenarios
-            #
-            # This is safe to ignore as the span itself was properly ended
-            # and the tracing data is correctly captured.
-            pass
+        # Attempt a normal detach first; fall back to runtime context if needed.
+        with contextlib.suppress(Exception):
+            context_api.detach(token)
+            return
+        with contextlib.suppress(Exception):
+            from opentelemetry.context import _RUNTIME_CONTEXT  # type: ignore
+            _RUNTIME_CONTEXT.detach(token)

Committable suggestion skipped: line range outside the PR's diff.

Comment on lines +262 to +272
            try:
                context_api.attach(
                    context_api.set_value(
                        "association_properties",
                        {**current_association_properties, **sanitized_metadata},
                    )
                )
            except Exception:
                # If setting association properties fails, continue without them
                # This doesn't affect the core span functionality
                pass


🛠️ Refactor suggestion

Replace try/except/pass with contextlib.suppress and retain the token for later detach (Ruff SIM105).

This aligns with lint guidance and prevents leaking association_properties in the runtime context beyond the span lifecycle.

Apply this diff:

-            try:
-                context_api.attach(
-                    context_api.set_value(
-                        "association_properties",
-                        {**current_association_properties, **sanitized_metadata},
-                    )
-                )
-            except Exception:
-                # If setting association properties fails, continue without them
-                # This doesn't affect the core span functionality
-                pass
+            with contextlib.suppress(Exception):
+                assoc_token = context_api.attach(
+                    context_api.set_value(
+                        "association_properties",
+                        {**current_association_properties, **sanitized_metadata},
+                    )
+                )
+                if assoc_token is not None:
+                    self._association_tokens[run_id] = assoc_token
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
-            try:
-                context_api.attach(
-                    context_api.set_value(
-                        "association_properties",
-                        {**current_association_properties, **sanitized_metadata},
-                    )
-                )
-            except Exception:
-                # If setting association properties fails, continue without them
-                # This doesn't affect the core span functionality
-                pass
+            with contextlib.suppress(Exception):
+                assoc_token = context_api.attach(
+                    context_api.set_value(
+                        "association_properties",
+                        {**current_association_properties, **sanitized_metadata},
+                    )
+                )
+                if assoc_token is not None:
+                    self._association_tokens[run_id] = assoc_token
🧰 Tools
🪛 Ruff (0.12.2)

262-272: Use contextlib.suppress(Exception) instead of try-except-pass

(SIM105)

🤖 Prompt for AI Agents
In
packages/opentelemetry-instrumentation-langchain/opentelemetry/instrumentation/langchain/callback_handler.py
around lines 262-272, replace the bare try/except/pass with contextlib.suppress
while capturing the token returned by context_api.attach so it can be detached
later; specifically, import contextlib, use contextlib.suppress(Exception)
around the context_api.attach(...) call and assign its return to a variable
(e.g., token = context_api.attach(...)) and ensure that elsewhere after the span
lifecycle you call context_api.detach(token) to avoid leaking
association_properties in the runtime context beyond the span.

