support Anthropic chat models #9420
Conversation
Datadog Report
Branch report: ❌ 1 Failed (0 Known Flaky), 171831 Passed, 1139 Skipped, 11h 16m 6.38s Total duration (24m 50.11s time saved)
❌ Failed Tests (1)
Yun-Kim left a comment
Looking great! Added some comments to clarify context and offer a few suggestions.
tests/snapshots/tests.contrib.anthropic.test_anthropic.test_anthropic_llm_sync.json
```python
    def record_usage(self, span: Span, usage: Dict[str, Any]) -> None:
        if not usage or not self.metrics_enabled:
            return
        for token_type in ("prompt", "completion"):
```
Suggested change:

```diff
-        for token_type in ("prompt", "completion"):
+        for token_type in ("input", "output"):
```
why? I thought we were going with prompt and completion for the tag names?
```python
            self.record_usage(
                span,
                {"prompt": _get_attr(usage, "input_tokens", 0), "completion": getattr(usage, "output_tokens", 0)},
            )
```
`AnthropicIntegration.record_usage()` should be used to tag span metrics for the input/output/total token counts. We should have a separate `AnthropicIntegration._get_llmobs_metrics_tags()` to return the recorded span metric values, i.e.

```python
def _get_llmobs_metrics_tags(span):
    return {
        "input_tokens": span.get_metric("anthropic.response.usage.input_tokens"),
        "output_tokens": ...,
        "total_tokens": ...,
    }
```

And set that on the span, i.e. `span.set_tag_str(METRICS, json.dumps(self._get_llmobs_metrics_tags(span)))`
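A minimal, self-contained sketch of that split, with a stand-in `Span` class instead of ddtrace's real one (the metric key prefix `anthropic.response.usage.` comes from the thread; the `METRICS` constant value and the helper wiring here are assumptions for illustration):

```python
import json
from typing import Any, Dict, Optional


class Span:
    """Minimal stand-in for ddtrace's Span, for illustration only."""

    def __init__(self) -> None:
        self._metrics: Dict[str, float] = {}
        self._tags: Dict[str, str] = {}

    def set_metric(self, key: str, value: float) -> None:
        self._metrics[key] = value

    def get_metric(self, key: str) -> Optional[float]:
        return self._metrics.get(key)

    def set_tag_str(self, key: str, value: str) -> None:
        self._tags[key] = value


METRICS = "metrics"  # placeholder, not the real LLMObs constant


def record_usage(span: Span, usage: Dict[str, Any]) -> None:
    # Tag raw token counts as span metrics, plus a derived total.
    for token_type in ("input", "output"):
        tokens = usage.get("%s_tokens" % token_type)
        if tokens is not None:
            span.set_metric("anthropic.response.usage.%s_tokens" % token_type, tokens)
    total = (usage.get("input_tokens") or 0) + (usage.get("output_tokens") or 0)
    span.set_metric("anthropic.response.usage.total_tokens", total)


def _get_llmobs_metrics_tags(span: Span) -> Dict[str, Any]:
    # Read the recorded span metrics back out for the LLMObs metrics tag.
    return {
        "input_tokens": span.get_metric("anthropic.response.usage.input_tokens"),
        "output_tokens": span.get_metric("anthropic.response.usage.output_tokens"),
        "total_tokens": span.get_metric("anthropic.response.usage.total_tokens"),
    }


span = Span()
record_usage(span, {"input_tokens": 10, "output_tokens": 25})
span.set_tag_str(METRICS, json.dumps(_get_llmobs_metrics_tags(span)))
```

The point of the split is that `record_usage()` writes metrics once, and the LLMObs tag is derived from those same metrics rather than recomputed from the raw API response.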
and we set this tag in the LLMObs Integration correct?
Fixes #6055.
Checklist
- `changelog/no-changelog` is set
- @DataDog/apm-tees

Reviewer Checklist