Tracing Pydantic AI Gateway

Pydantic AI Gateway is a unified interface for accessing multiple AI providers with a single key. It supports models from OpenAI, Anthropic, Google Vertex, Groq, AWS Bedrock, and more. Key features include spending limits, failover management, and zero translation—requests flow through directly in each provider's native format, giving you immediate access to new model features as soon as they are released.

Since Pydantic AI Gateway exposes OpenAI and Anthropic-compatible APIs, you can use MLflow's automatic tracing integrations to capture detailed traces of your LLM interactions.

Looking for PydanticAI Agent Framework?

This guide covers tracing LLM calls through Pydantic AI Gateway. If you're building agents using the Pydantic AI framework directly, see the PydanticAI Integration guide instead.

Prerequisites

Start MLflow Server

If you have a local Python environment (Python 3.10 or later), you can start the MLflow server locally using the mlflow CLI command.

bash
mlflow server

Get Pydantic AI Gateway API Key

Create an account on Pydantic AI Gateway to get your API key, or bring your own API key from a supported LLM provider.

Query Gateway

You can trace LLM calls through Pydantic AI Gateway using any of the following approaches:

Since Pydantic AI Gateway exposes an OpenAI-compatible API, you can use MLflow's OpenAI automatic tracing integration to trace calls.

python
import mlflow
from openai import OpenAI

# Enable auto-tracing for OpenAI
mlflow.openai.autolog()

# Set MLflow tracking URI and experiment
mlflow.set_tracking_uri("http://localhost:5000")
mlflow.set_experiment("Pydantic AI Gateway")

# Point OpenAI client to Pydantic AI Gateway
client = OpenAI(
    base_url="https://gateway.pydantic.dev/proxy/chat/",
    api_key="<PYDANTIC_AI_GATEWAY_API_KEY>",
)

# Make API calls - traces will be captured automatically
response = client.chat.completions.create(
    model="gpt-5",
    messages=[{"role": "user", "content": "Hello world"}],
)
print(response.choices[0].message.content)
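
Because the gateway also exposes an Anthropic-compatible API, the same pattern works with MLflow's Anthropic auto-tracing integration (mlflow.anthropic.autolog()). The sketch below is a minimal example; the proxy base URL and model name are illustrative assumptions, so verify them against your Pydantic AI Gateway configuration.

python
import mlflow
from anthropic import Anthropic

# Enable auto-tracing for the Anthropic SDK
mlflow.anthropic.autolog()

# Set MLflow tracking URI and experiment
mlflow.set_tracking_uri("http://localhost:5000")
mlflow.set_experiment("Pydantic AI Gateway")

# Point the Anthropic client at Pydantic AI Gateway
# NOTE: this base URL is an assumed example; check your gateway configuration
client = Anthropic(
    base_url="https://gateway.pydantic.dev/proxy/anthropic/",
    api_key="<PYDANTIC_AI_GATEWAY_API_KEY>",
)

# Make API calls - traces will be captured automatically
response = client.messages.create(
    model="claude-sonnet-4-5",  # illustrative model name
    max_tokens=1024,
    messages=[{"role": "user", "content": "Hello world"}],
)
print(response.content[0].text)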

View Traces in MLflow UI

Open the MLflow UI at http://localhost:5000 (or your custom MLflow server URL) to see the traces from your Pydantic AI Gateway calls.
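
If you prefer to inspect traces programmatically, for example in a notebook or a test, you can also query them from Python. A minimal sketch, assuming the same tracking server and experiment name used above:

python
import mlflow

mlflow.set_tracking_uri("http://localhost:5000")
mlflow.set_experiment("Pydantic AI Gateway")

# Fetch the most recent traces from the active experiment as a pandas DataFrame
traces = mlflow.search_traces(max_results=5)
print(traces.head())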

Next Steps