24 changes: 23 additions & 1 deletion packages/aws_bedrock_agentcore/_dev/build/docs/README.md
@@ -35,6 +35,17 @@ For more details about these requirements, check the [AWS integration documentat
* You can install only one Elastic Agent per host.
* Elastic Agent is required to collect metrics from CloudWatch and ship the data to Elastic, where the events will then be processed through the integration's ingest pipelines.

### How to find the `log_group_arn` (for log-based datasets)

Some datasets in this integration require the ARN of the CloudWatch log group where your AgentCore logs are stored. You can find it by:

- Opening CloudWatch in the AWS Console
- Going to Logs > Log groups
- Selecting the log group used by your AgentCore deployment
- Copying the Log group ARN shown

You can then use this ARN when configuring any log-based dataset.
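
For example, a log group ARN has the form `arn:aws:logs:<region>:<account-id>:log-group:<log-group-name>:*`. The sketch below shows how it might be supplied to a log-based dataset; the account ID, region, and log group name are placeholders rather than values from a real deployment:

```yaml
# Placeholder ARN; replace it with the log group ARN copied from the CloudWatch console.
log_group_arn: "arn:aws:logs:us-east-1:123456789012:log-group:my-agentcore-log-group:*"
```

When `log_group_arn` is set, the region is taken from the ARN itself, so this package's input template only sets `region_name` for the name- and prefix-based options.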

## Setup

To use the Amazon Bedrock AgentCore metrics, ensure your agents are deployed and running. The integration will automatically collect metrics from the AWS/Bedrock-AgentCore CloudWatch namespace. For enhanced observability, enable detailed monitoring and logging for your AgentCore resources.
@@ -64,4 +75,15 @@ The metrics include the following dimensions for enhanced filtering and analysis
- `SessionId`: The session identifier for agent invocations

{{event "metrics"}}
{{fields "metrics"}}
{{fields "metrics"}}

## Logs

### Runtime Application Logs

Amazon Bedrock AgentCore runtime application logs provide detailed insights into agent execution, decision-making processes, and operational events. The integration collects these logs from your agents to help you understand their behavior and troubleshoot issues.

For more details about enabling logs for AgentCore, check the [Amazon Bedrock AgentCore Observability Guide](https://docs.aws.amazon.com/bedrock-agentcore/latest/devguide/observability-view.html).
Contributor:

Do you think it would be good to mention the steps needed to fetch the `log_group_arn`, which is needed for configuring this dataset?

As the steps are the same for all the log-based datasets, maybe a common section (perhaps under "What do I need to use this integration?") should suffice.

Contributor Author:

I've added a small section to explain how to get the log group ARN 👍


{{event "runtime_application_logs"}}
{{fields "runtime_application_logs"}}
5 changes: 5 additions & 0 deletions packages/aws_bedrock_agentcore/changelog.yml
@@ -1,4 +1,9 @@
# newer versions go on top
- version: "0.0.2"
changes:
- description: Add `runtime_application_logs` data stream.
type: enhancement
link: https://github.com/elastic/integrations/pull/999
Contributor:

Please fix the PR link

- version: "0.0.1"
changes:
- description: Initial draft of the package
@@ -0,0 +1 @@
{"resource_arn": "arn:aws:bedrock-agentcore:us-east-1:627286350134:runtime/claudeserver-CdBoW2FLP0", "event_timestamp": 1762144239907, "account_id": "627286350134", "request_id": "136b5865-b303-43a7-b737-3fed1ba17341", "session_id": "090ab333-40ff-49cd-ade1-1527b3a8ede7", "span_id": "3f2b2909093d27f3", "trace_id": "69082fe91a0ed1872de8e57378fe229b", "service_name": "AgentCoreCodeRuntime", "operation": "InvokeAgentRuntime", "request_payload":{"prompt": "What is this agent about, this is claudserver?"}}
@@ -0,0 +1,56 @@
{
"expected": [
{
"@timestamp": "2025-11-03T04:30:39.907Z",
"aws": {
"bedrock_agentcore": {
"account_id": "627286350134",
"request_id": "136b5865-b303-43a7-b737-3fed1ba17341",
"resource_arn": "arn:aws:bedrock-agentcore:us-east-1:627286350134:runtime/claudeserver-CdBoW2FLP0"
}
},
"cloud": {
"account": {
"id": "627286350134"
},
"provider": "aws",
"service": {
"name": "bedrock-agentcore"
}
},
"ecs": {
"version": "8.11.0"
},
"event": {
"original": "{\"resource_arn\": \"arn:aws:bedrock-agentcore:us-east-1:627286350134:runtime/claudeserver-CdBoW2FLP0\", \"event_timestamp\": 1762144239907, \"account_id\": \"627286350134\", \"request_id\": \"136b5865-b303-43a7-b737-3fed1ba17341\", \"session_id\": \"090ab333-40ff-49cd-ade1-1527b3a8ede7\", \"span_id\": \"3f2b2909093d27f3\", \"trace_id\": \"69082fe91a0ed1872de8e57378fe229b\", \"service_name\": \"AgentCoreCodeRuntime\", \"operation\": \"InvokeAgentRuntime\", \"request_payload\":{\"prompt\": \"What is this agent about, this is claudserver?\"}}",
"outcome": "success"
},
"gen_ai": {
"conversation": {
"id": "090ab333-40ff-49cd-ade1-1527b3a8ede7"
},
"operation": {
"name": "InvokeAgentRuntime"
},
"prompt": "What is this agent about, this is claudserver?",
"provider": {
"name": "aws"
},
"system": "aws_bedrock_agentcore"
},
"service": {
"name": "AgentCoreCodeRuntime"
},
"span": {
"id": "3f2b2909093d27f3"
},
"tags": [
"preserve_original_event",
"preserve_duplicate_custom_fields"
],
"trace": {
"id": "69082fe91a0ed1872de8e57378fe229b"
}
}
]
}
@@ -0,0 +1,5 @@
---
fields:
tags:
- preserve_original_event
- preserve_duplicate_custom_fields
@@ -0,0 +1,106 @@
{{#unless log_group_name}}
{{#unless log_group_name_prefix}}
{{#if log_group_arn }}
log_group_arn: {{ log_group_arn }}
{{/if}}
{{/unless}}
{{/unless}}

{{#unless log_group_arn}}
{{#unless log_group_name}}
{{#if log_group_name_prefix }}
log_group_name_prefix: {{ log_group_name_prefix }}
{{/if}}
{{#if include_linked_accounts_with_prefix }}
include_linked_accounts_for_prefix_mode: {{ include_linked_accounts_with_prefix }}
{{/if}}
{{#if number_of_workers }}
number_of_workers: {{ number_of_workers }}
{{/if}}
{{/unless}}
{{/unless}}

{{#unless log_group_arn}}
{{#unless log_group_name_prefix}}
{{#if log_group_name }}
log_group_name: {{ log_group_name }}
{{/if}}
{{/unless}}
{{/unless}}

{{#unless log_group_arn}}
region_name: {{ region_name }}
{{/unless}}

{{#unless log_stream_prefix}}
{{#if log_streams }}
log_streams: {{ log_streams }}
{{/if}}
{{/unless}}

{{#unless log_streams}}
{{#if log_stream_prefix }}
log_stream_prefix: {{ log_stream_prefix }}
{{/if}}
{{/unless}}

{{#if start_position }}
start_position: {{ start_position }}
{{/if}}

{{#if scan_frequency }}
scan_frequency: {{ scan_frequency }}
{{/if}}

{{#if api_sleep }}
api_sleep: {{ api_sleep }}
{{/if}}

{{#if latency }}
latency: {{ latency }}
{{/if}}

{{#if credential_profile_name}}
credential_profile_name: {{credential_profile_name}}
{{/if}}
{{#if shared_credential_file}}
shared_credential_file: {{shared_credential_file}}
{{/if}}
{{#if api_timeout}}
api_timeout: {{api_timeout}}
{{/if}}
{{#if default_region}}
default_region: {{default_region}}
{{/if}}
{{#if access_key_id}}
access_key_id: {{access_key_id}}
{{/if}}
{{#if secret_access_key}}
secret_access_key: {{secret_access_key}}
{{/if}}
{{#if session_token}}
session_token: {{session_token}}
{{/if}}
{{#if role_arn}}
role_arn: {{role_arn}}
{{/if}}
{{#if proxy_url }}
proxy_url: {{proxy_url}}
{{/if}}
tags:
{{#if preserve_original_event}}
- preserve_original_event
{{/if}}
{{#if preserve_duplicate_custom_fields}}
- preserve_duplicate_custom_fields
{{/if}}
{{#each tags as |tag|}}
- {{tag}}
{{/each}}
{{#contains "forwarded" tags}}
publisher_pipeline.disable_host: true
{{/contains}}
{{#if processors}}
processors:
{{processors}}
{{/if}}
@@ -0,0 +1,163 @@
---
description: Pipeline for Amazon Bedrock AgentCore runtime application logs
processors:
- rename:
field: message
target_field: event.original
if: 'ctx.event?.original == null'
description: 'Store original message in event.original'
ignore_missing: true

- remove:
field: message
ignore_missing: true
if: ctx.event?.original != null
description: 'The `message` field is no longer required if the document has an `event.original` field.'

# Parse JSON if the log comes as a string
- json:
field: event.original
target_field: aws.bedrock_agentcore
if: 'ctx.event?.original != null'
ignore_failure: true

# Set ECS version
- set:
field: ecs.version
value: '8.11.0'

# Set cloud service metadata
- set:
field: cloud.service.name
value: bedrock-agentcore

- set:
field: cloud.provider
value: aws

- set:
field: gen_ai.provider.name
value: aws
Contributor:

this isn't a valid value, see the note in semconv:

`gen_ai.provider.name` has the following list of well-known values. If one of them applies, then the respective value MUST be used; otherwise, a custom value MAY be used.

| Value | Description | Stability |
| --- | --- | --- |
| `anthropic` | Anthropic | Development |
| `aws.bedrock` | AWS Bedrock | Development |
| `azure.ai.inference` | Azure AI Inference | Development |
| `azure.ai.openai` | Azure OpenAI | Development |
| `cohere` | Cohere | Development |
| `deepseek` | DeepSeek | Development |
| `gcp.gemini` | Gemini [14] | Development |
| `gcp.gen_ai` | Any Google generative AI endpoint [15] | Development |
| `gcp.vertex_ai` | Vertex AI [16] | Development |
| `groq` | Groq | Development |
| `ibm.watsonx.ai` | IBM Watsonx AI | Development |
| `mistral_ai` | Mistral AI | Development |
| `openai` | OpenAI | Development |
| `perplexity` | Perplexity | Development |
| `x_ai` | xAI | Development |

i suggest aws.bedrock_agentcore?

Contributor:

it seems like this is more about the model provider though, as opposed to the framework provider. i think anything except the actual model provider would be misleading here.


# Process timestamp
- date:
field: aws.bedrock_agentcore.event_timestamp
target_field: "@timestamp"
formats:
- UNIX_MS
ignore_failure: true

# Map fields to ECS
- set:
field: cloud.account.id
copy_from: aws.bedrock_agentcore.account_id
ignore_empty_value: true

- rename:
field: aws.bedrock_agentcore.operation
Contributor:

does this field conform to the well-known values specified in semconv for this field?

specifically:

`gen_ai.operation.name` has the following list of well-known values. If one of them applies, then the respective value MUST be used; otherwise, a custom value MAY be used.

| Value | Description | Stability |
| --- | --- | --- |
| `chat` | Chat completion operation such as OpenAI Chat API | Development |
| `create_agent` | Create GenAI agent | Development |
| `embeddings` | Embeddings operation such as OpenAI Create embeddings API | Development |
| `execute_tool` | Execute a tool | Development |
| `generate_content` | Multimodal content generation operation such as Gemini Generate Content | Development |
| `invoke_agent` | Invoke GenAI agent | Development |
| `text_completion` | Text completions operation such as OpenAI Completions API (Legacy) | Development |

target_field: gen_ai.operation.name
ignore_missing: true

- rename:
field: aws.bedrock_agentcore.request_payload.prompt
target_field: gen_ai.prompt
ignore_missing: true

- rename:
field: aws.bedrock_agentcore.service_name
target_field: service.name
ignore_missing: true

# Set trace context
- rename:
field: aws.bedrock_agentcore.trace_id
target_field: trace.id
ignore_missing: true

- rename:
field: aws.bedrock_agentcore.span_id
target_field: span.id
ignore_missing: true

# Set gen_ai fields for consistency
- set:
field: gen_ai.system
value: aws_bedrock_agentcore

# Handle session information
- rename:
field: aws.bedrock_agentcore.session_id
Contributor:

i think it would be preferable to alias these fields so they are searchable both from the AWS documented naming and the gen_ai naming
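
As a reference for the aliasing suggestion above, such an alias could be declared in the data stream's field definitions. A minimal sketch (the exact file layout is assumed; only the field names come from this pipeline):

```yaml
# Hypothetical fields.yml entry: keeps the AWS-documented name searchable
# as an alias of the gen_ai field that the pipeline renames it to.
- name: aws.bedrock_agentcore.session_id
  type: alias
  path: gen_ai.conversation.id
```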

target_field: gen_ai.conversation.id
ignore_missing: true
- rename:
field: aws.bedrock_agentcore.region
target_field: cloud.region
ignore_missing: true

# Set event outcome based on presence of error indicators
- set:
field: event.outcome
value: success
if: ctx.error?.message == null

- set:
field: event.outcome
value: failure
if: ctx.error?.message != null

# Truncate large prompt fields to prevent storage issues
- script:
description: Truncate large prompts and store hash
lang: painless
if: ctx.gen_ai?.prompt != null && ctx.gen_ai?.prompt instanceof String
source: |
if (ctx.gen_ai.prompt.length() > 32766) {
ctx.aws.bedrock_agentcore.prompt_hash = ctx.gen_ai.prompt.sha1();
ctx.gen_ai.remove("prompt");
}
ignore_failure: true

# Remove empty/null values
- script:
description: Remove null/empty values recursively
lang: painless
source: |
boolean drop(Object o) {
if (o == null || o == "") {
return true;
} else if (o instanceof Map) {
((Map) o).values().removeIf(v -> drop(v));
return (((Map) o).size() == 0);
} else if (o instanceof List) {
((List) o).removeIf(v -> drop(v));
return (((List) o).length == 0);
}
return false;
}
drop(ctx);
# Clean up temporary fields
- remove:
field:
- aws.bedrock_agentcore.event_timestamp
ignore_missing: true
if: ctx['@timestamp'] != null

# Handle event.original field size
- script:
description: Truncate large event.original field
lang: painless
if: ctx.event?.original != null && ctx.event.original instanceof String
source: |
if (ctx.event.original.length() > 32766) {
ctx.event.original = 'sha1-' + ctx.event.original.sha1() + ':' + ctx.event.original.length().toString();
}
ignore_failure: true

on_failure:
- set:
field: error.message
value: 'Processor {{{_ingest.on_failure_processor_type}}} with tag {{{_ingest.on_failure_processor_tag}}} in pipeline {{{_ingest.pipeline}}} failed with message: {{{_ingest.on_failure_message}}}'
- set:
field: event.outcome
value: failure