Tracing Amazon Bedrock AgentCore

Integration via OpenTelemetry

Amazon Bedrock AgentCore can be integrated with MLflow via OpenTelemetry. Configure Amazon Bedrock AgentCore's OpenTelemetry exporter to send traces to MLflow's OTLP endpoint.

info

OpenTelemetry trace ingestion is supported in MLflow 3.6.0 and above.

OpenTelemetry endpoint (OTLP)

MLflow Server exposes an OTLP endpoint at /v1/traces. This endpoint accepts traces from any native OpenTelemetry instrumentation, allowing you to trace applications written in other languages such as Java, Go, and Rust.

To use this endpoint, start MLflow Server with a SQL-based backend store. The following command starts MLflow Server with an SQLite backend store:

bash
mlflow server --backend-store-uri sqlite:///mlflow.db

To use other types of SQL databases such as PostgreSQL, MySQL, and MSSQL, change the store URI as described in the backend store documentation.

In your application, configure the server endpoint and set the MLflow experiment ID in the OTLP header x-mlflow-experiment-id.

bash
export OTEL_EXPORTER_OTLP_TRACES_ENDPOINT=http://localhost:5000/v1/traces
export OTEL_EXPORTER_OTLP_TRACES_HEADERS=x-mlflow-experiment-id=123

note

Currently, MLflow Server supports only the OTLP/HTTP endpoint; the OTLP/gRPC endpoint is not yet supported.
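
The environment variables above are read automatically by OpenTelemetry's OTLP exporters. If you prefer to configure the exporter in code, the OpenTelemetry Python SDK accepts the same endpoint and header values directly. The following is a minimal sketch (not AgentCore-specific), assuming MLflow Server is running at http://localhost:5000 and the target experiment ID is 123 as in the example above, with the opentelemetry-sdk and opentelemetry-exporter-otlp-proto-http packages installed.

python
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

# Point the OTLP HTTP exporter at the MLflow Server endpoint and attach the
# experiment ID header (values assumed from the example above).
exporter = OTLPSpanExporter(
    endpoint="http://localhost:5000/v1/traces",
    headers={"x-mlflow-experiment-id": "123"},
)
provider = TracerProvider()
provider.add_span_processor(BatchSpanProcessor(exporter))
trace.set_tracer_provider(provider)

# Emit a test span and flush so it shows up in the MLflow experiment immediately.
tracer = trace.get_tracer("otlp-smoke-test")
with tracer.start_as_current_span("test-span"):
    pass
provider.force_flush()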

Enable OpenTelemetry in Amazon Bedrock AgentCore

Refer to the Amazon Bedrock AgentCore Observability documentation to set up tracing in Amazon Bedrock AgentCore, and configure the OTLP HTTP exporter with the environment variables above.
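
Once the exporter is configured, any spans that your agent code creates with the standard OpenTelemetry API are exported through the same OTLP pipeline and appear in the MLflow experiment alongside the framework's own instrumentation. The sketch below assumes the configuration above is in place; the instrumentation name and handler function are hypothetical placeholders, not part of the AgentCore SDK.

python
from opentelemetry import trace

# "my-agent" and handle_request are hypothetical placeholders for your own agent code.
tracer = trace.get_tracer("my-agent")

def handle_request(prompt: str) -> str:
    # Spans created here are sent to MLflow via the OTLP HTTP exporter
    # configured with the environment variables above.
    with tracer.start_as_current_span("agent-invocation") as span:
        span.set_attribute("prompt.length", len(prompt))
        # ... invoke the model and tools here ...
        return "response"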

Reference

For complete step-by-step instructions on sending traces to MLflow from OpenTelemetry-compatible frameworks, see the Collect OpenTelemetry Traces into MLflow guide.