fix(anthropic): restore accidentally lost cache tokens attributes #3648
Changes from all commits: efe6a15, 598cee8, 122dd17, f5877d4, 01a7108, 1c95ef9
`packages/opentelemetry-instrumentation-anthropic/opentelemetry/instrumentation/anthropic/streaming.py`

```diff
@@ -94,6 +94,13 @@ def _set_token_usage(
     )
     set_span_attribute(span, SpanAttributes.LLM_USAGE_TOTAL_TOKENS, total_tokens)

+    set_span_attribute(
+        span, SpanAttributes.GEN_AI_USAGE_CACHE_READ_INPUT_TOKENS, cache_read_tokens
+    )
+    set_span_attribute(
+        span, SpanAttributes.GEN_AI_USAGE_CACHE_CREATION_INPUT_TOKENS, cache_creation_tokens
+    )
+
```
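The restored lines go through the package's `set_span_attribute` helper rather than calling `span.set_attribute` directly. The sketch below illustrates why such a helper is useful for usage counters: it can drop `None`/empty values so spans from responses without cache activity don't carry empty attributes. This is a hypothetical minimal version for illustration; `FakeSpan` and the guard logic are assumptions, not the actual `utils.py` implementation.

```python
# Hypothetical sketch of a set_span_attribute-style helper; the real one in
# the instrumentation package's utils.py may differ.

class FakeSpan:
    """Stand-in for an OpenTelemetry span that records attributes in a dict."""
    def __init__(self):
        self.attributes = {}

    def set_attribute(self, key, value):
        self.attributes[key] = value


def set_span_attribute(span, name, value):
    # Only record meaningful values; None or "" would otherwise pollute spans
    # for responses that report no cache usage.
    if value is not None and value != "":
        span.set_attribute(name, value)


span = FakeSpan()
set_span_attribute(span, "gen_ai.usage.cache_read_input_tokens", 128)
set_span_attribute(span, "gen_ai.usage.cache_creation_input_tokens", None)
print(span.attributes)  # only the cache-read attribute is recorded
```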
**Comment on lines +97 to +102**

🧩 Analysis chain

🏁 Script executed:

```shell
# First, let's verify the streaming.py file content at the specified lines
echo "=== Content of streaming.py around lines 97-102 ==="
cat -n packages/opentelemetry-instrumentation-anthropic/opentelemetry/instrumentation/anthropic/streaming.py | sed -n '80,110p'
```

Repository: traceloop/openllmetry
Length of output: 1460

🏁 Script executed:

```shell
# Now check for cache token attributes in __init__.py
echo "=== Searching for cache token attributes in __init__.py ==="
rg -n "CACHE_READ_INPUT_TOKENS|CACHE_CREATION_INPUT_TOKENS" \
  --type py \
  packages/opentelemetry-instrumentation-anthropic/opentelemetry/instrumentation/anthropic/
```

Length of output: 1227

🏁 Script executed:

```shell
# Check the set_span_attribute function implementation
echo "=== Checking set_span_attribute function in utils.py ==="
cat -n packages/opentelemetry-instrumentation-anthropic/opentelemetry/instrumentation/anthropic/utils.py | head -40
```

Length of output: 1450

🏁 Script executed:

```shell
# Search for _set_token_usage and _aset_token_usage functions in __init__.py
echo "=== Searching for token usage functions in __init__.py ==="
rg -n "_set_token_usage|_aset_token_usage" \
  --type py \
  packages/opentelemetry-instrumentation-anthropic/opentelemetry/instrumentation/anthropic/__init__.py \
  -A 15
```

Length of output: 2829

🏁 Script executed:

```shell
# Check the actual cache token attribute lines in __init__.py more closely
echo "=== Checking cache token attributes in _aset_token_usage (around line 286) ==="
cat -n packages/opentelemetry-instrumentation-anthropic/opentelemetry/instrumentation/anthropic/__init__.py | sed -n '280,295p'
echo ""
echo "=== Checking cache token attributes in _set_token_usage (around line 400) ==="
cat -n packages/opentelemetry-instrumentation-anthropic/opentelemetry/instrumentation/anthropic/__init__.py | sed -n '395,410p'
```

Length of output: 1455

🏁 Script executed:

```shell
# Verify the SpanAttributes constants used in both files
echo "=== SpanAttributes imports and usage in streaming.py ==="
head -50 packages/opentelemetry-instrumentation-anthropic/opentelemetry/instrumentation/anthropic/streaming.py | grep -i "import\|SpanAttributes"
echo ""
echo "=== SpanAttributes imports in __init__.py ==="
head -50 packages/opentelemetry-instrumentation-anthropic/opentelemetry/instrumentation/anthropic/__init__.py | grep -i "import\|SpanAttributes"
```

Length of output: 1867

🏁 Script executed:

```shell
# Let's verify the SpanAttributes constants to understand the difference
echo "=== Checking what SpanAttributes constants are available ==="
rg "LLM_USAGE_CACHE|GEN_AI_USAGE_CACHE" --type py
```

Length of output: 2886

🏁 Script executed:

```shell
# Check if these are custom constants defined in the package
echo "=== Looking for SpanAttributes definition/import ==="
rg -n "class SpanAttributes|from.*SpanAttributes" --type py packages/opentelemetry-instrumentation-anthropic/ -B 2 -A 2
```

Length of output: 2561

🏁 Script executed:

```shell
# Quick check to confirm cache_read_tokens and cache_creation_tokens extraction in __init__.py
echo "=== Checking cache token extraction in __init__.py _aset_token_usage ==="
cat -n packages/opentelemetry-instrumentation-anthropic/opentelemetry/instrumentation/anthropic/__init__.py | sed -n '200,230p'
```

Length of output: 1619

**Cache token attributes correctly restored, but SpanAttributes constants are inconsistently named across files.** The new calls in `streaming.py` record the cache counts under `SpanAttributes.GEN_AI_USAGE_CACHE_READ_INPUT_TOKENS` and `SpanAttributes.GEN_AI_USAGE_CACHE_CREATION_INPUT_TOKENS`. The non-streaming path in `__init__.py` already records the same cache token usage. However, there's an inconsistency: `streaming.py` uses the `GEN_AI_USAGE_CACHE_*` constants while `__init__.py` references differently named `SpanAttributes` cache constants for the same data, so the two code paths should be aligned on a single set of constants.
```python
set_span_attribute(
    span, GenAIAttributes.GEN_AI_RESPONSE_MODEL, complete_response.get("model")
)
```
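The snippet above reads the model name with `.get` rather than indexing, since a partially accumulated streaming response may never populate a `model` field. A minimal sketch of that pattern; the event shapes and the `"example-model"` value are placeholders, not the instrumentation's actual accumulation logic:

```python
# Illustrative sketch: accumulate a streamed response into a dict, then read
# the model name defensively with .get (returns None instead of raising
# KeyError when "model" was never populated).

complete_response = {}

events = [
    {"type": "message_start", "message": {"model": "example-model"}},
    {"type": "content_block_delta", "delta": {"text": "Hello"}},
]

for event in events:
    if event["type"] == "message_start":
        complete_response["model"] = event["message"]["model"]

print(complete_response.get("model"))  # example-model
print({}.get("model"))                 # None, no exception raised
```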
🧩 Analysis chain

🏁 Script executed:

```shell
rg -n "LLM_REQUEST_STRUCTURED_OUTPUT_SCHEMA\|structured_output_schema" \
  --type py \
  packages/opentelemetry-semantic-conventions-ai
```

Repository: traceloop/openllmetry
Length of output: 47

🏁 Script executed:

Length of output: 689

🏁 Script executed:

Length of output: 4042

🏁 Script executed:

Length of output: 642

🏁 Script executed:

Length of output: 515

🏁 Script executed:

```shell
# Check the main attributes file to see what constants are defined
cat -n packages/opentelemetry-semantic-conventions-ai/opentelemetry/semconv_ai/__init__.py
```

Length of output: 15476
**Add `LLM_REQUEST_STRUCTURED_OUTPUT_SCHEMA` constant to `SpanAttributes` or use the appropriate semantic convention constant.**

The constant `LLM_REQUEST_STRUCTURED_OUTPUT_SCHEMA` does not exist in `opentelemetry-semantic-conventions-ai`. Line 177 uses a literal string `"gen_ai.request.structured_output_schema"` while all other attributes in this file reference named constants from `SpanAttributes` or `GenAIAttributes` (e.g., `SpanAttributes.LLM_REQUEST_FUNCTIONS` at line 159, `SpanAttributes.LLM_USAGE_TOTAL_TOKENS` at lines 337 and 361).

Either define this constant in `SpanAttributes` to align with semantic-conventions best practices and maintain consistency, or verify that this attribute name is correct and document why a literal string is necessary here.
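The first option the review suggests is a one-line addition. A sketch of what it could look like, using the attribute string quoted in the comment; the class here is a simplified stand-in for the real `SpanAttributes` in `opentelemetry-semantic-conventions-ai`, and the other constant shown is only an example of the existing pattern:

```python
# Simplified stand-in for the package's SpanAttributes class, showing the
# proposed constant alongside an example of the existing naming pattern.

class SpanAttributes:
    # Example of an existing constant (value shown for illustration only).
    LLM_REQUEST_FUNCTIONS = "llm.request.functions"
    # Proposed addition, matching the literal currently used at the call site.
    LLM_REQUEST_STRUCTURED_OUTPUT_SCHEMA = "gen_ai.request.structured_output_schema"


# Call site before:
#   set_span_attribute(span, "gen_ai.request.structured_output_schema", schema)
# Call site after:
#   set_span_attribute(span, SpanAttributes.LLM_REQUEST_STRUCTURED_OUTPUT_SCHEMA, schema)
print(SpanAttributes.LLM_REQUEST_STRUCTURED_OUTPUT_SCHEMA)
```

With the constant defined, a typo in the attribute name becomes an `AttributeError` at import time instead of a silently wrong span attribute.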