mlflow.tracing
Attention
The mlflow.tracing namespace only contains a few utility functions for managing traces. The main entry point for MLflow Tracing is the Tracing Fluent APIs defined directly under the mlflow namespace, or the low-level Tracing Client APIs.
- mlflow.tracing.configure(span_processors: list[typing.Callable[[ForwardRef('LiveSpan')], NoneType]] | None = None) mlflow.tracing.config.TracingConfigContext [source]
Note
Experimental: This function may change or be removed in a future release without warning.
Configure MLflow tracing. Can be used as function or context manager.
Only updates explicitly provided arguments, leaving others unchanged.
- Parameters
span_processors – List of functions to process spans before export. This is helpful for filtering/masking particular attributes from the span to prevent sensitive data from being logged or for reducing the size of the span. Each function must accept a single argument of type LiveSpan and should not return any value. When multiple functions are provided, they are applied sequentially in the order they are provided.
- Returns
- Context manager for temporary configuration changes.
When used as a function, the configuration changes persist. When used as a context manager, changes are reverted on exit.
- Return type
TracingConfigContext
Examples
def pii_filter(span):
    """Example PII filter that masks sensitive data in span attributes."""
    # Mask sensitive inputs
    if inputs := span.inputs:
        for key, value in inputs.items():
            if "password" in key.lower() or "token" in key.lower():
                span.set_inputs({**inputs, key: "[REDACTED]"})

    # Mask sensitive outputs
    if outputs := span.outputs:
        if isinstance(outputs, dict):
            for key in outputs:
                if "secret" in key.lower():
                    outputs[key] = "[REDACTED]"
            span.set_outputs(outputs)

    # Mask sensitive attributes
    for attr_key in list(span.attributes.keys()):
        if "api_key" in attr_key.lower():
            span.set_attribute(attr_key, "[REDACTED]")


# Permanent configuration change
mlflow.tracing.configure(span_processors=[pii_filter])

# Temporary configuration change
with mlflow.tracing.configure(span_processors=[pii_filter]):
    # PII filtering enabled only in this block
    pass
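The order-dependent, mutate-in-place semantics of span_processors can be illustrated with a stand-in span object (SimpleNamespace here is a placeholder for a real LiveSpan; the two processors below are hypothetical examples, not MLflow APIs):

```python
from types import SimpleNamespace

# Stand-in for a LiveSpan: processors mutate it and return nothing.
span = SimpleNamespace(attributes={"api_key": "sk-12345", "model": "gpt-4"})


def mask_keys(span):
    # First processor: redact anything that looks like a credential.
    for key in span.attributes:
        if "api_key" in key.lower():
            span.attributes[key] = "[REDACTED]"


def truncate_values(span):
    # Second processor: shrink long attribute values to cap span size.
    for key, value in span.attributes.items():
        if isinstance(value, str) and len(value) > 5:
            span.attributes[key] = value[:5] + "..."


# Processors run sequentially in list order, as with span_processors=[...]
for processor in [mask_keys, truncate_values]:
    processor(span)

# Order matters: the key was masked first, then the placeholder truncated.
print(span.attributes)
```

If the list order were reversed, the raw credential would be truncated first and the masking check would run on an already-shortened value, so the order in which you pass processors is part of the contract.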
- mlflow.tracing.disable()[source]
Disable tracing.
Note
This function sets up OpenTelemetry to use NoOpTracerProvider and effectively disables all tracing operations.
Example:
import mlflow


@mlflow.trace
def f():
    return 0


# Tracing is enabled by default
f()
assert len(mlflow.search_traces()) == 1

# Disable tracing
mlflow.tracing.disable()
f()
assert len(mlflow.search_traces()) == 1
- mlflow.tracing.disable_notebook_display()[source]
Disables displaying the MLflow Trace UI in notebook output cells. Call mlflow.tracing.enable_notebook_display() to re-enable the display.
- mlflow.tracing.enable()[source]
Enable tracing.
Example:
import mlflow


@mlflow.trace
def f():
    return 0


# Tracing is enabled by default
f()
assert len(mlflow.search_traces()) == 1

# Disable tracing
mlflow.tracing.disable()
f()
assert len(mlflow.search_traces()) == 1

# Re-enable tracing
mlflow.tracing.enable()
f()
assert len(mlflow.search_traces()) == 2
- mlflow.tracing.enable_notebook_display()[source]
Enables the MLflow Trace UI in notebook output cells. The display is on by default, and the Trace UI will show up when any of the following operations are executed:

- On trace completion (i.e. whenever a trace is exported)
- When calling the mlflow.search_traces() fluent API
- When calling the mlflow.client.MlflowClient.get_trace() or mlflow.client.MlflowClient.search_traces() client APIs

To disable, please call mlflow.tracing.disable_notebook_display().
- mlflow.tracing.reset()[source]
Reset the flag that indicates whether the MLflow tracer provider has been initialized. This ensures that the tracer provider is re-initialized when the next tracing operation is performed.
- mlflow.tracing.set_destination(destination: mlflow.tracing.destination.TraceDestination, *, context_local: bool = False)[source]
Note
Experimental: This function may change or be removed in a future release without warning.
Set a custom span destination to which MLflow will export the traces.
A destination specified by this function takes precedence over other configurations, such as the tracking URI or OTLP environment variables.
- Parameters
destination – A TraceDestination object that specifies the destination of the trace data.
context_local – If False (default), the destination is set globally. If True, the destination is isolated per async task or thread, providing isolation in concurrent applications.
Example
import mlflow
from mlflow.tracing.destination import Databricks

# Setting the destination globally
mlflow.tracing.set_destination(Databricks(experiment_id="123"))

# Setting the destination with async task isolation
mlflow.tracing.set_destination(Databricks(experiment_id="456"), context_local=True)

# Reset the destination (to an active experiment as default)
mlflow.tracing.reset()
- mlflow.tracing.set_span_chat_tools(span: LiveSpan, tools: list[ChatTool])[source]
Set the mlflow.chat.tools attribute on the specified span. This attribute is used in the UI, and also by downstream applications that consume trace data, such as MLflow evaluate.
- Parameters
span – The LiveSpan to add the attribute to
tools – A list of standardized chat tool definitions (refer to the spec for details)
Example:
import mlflow
from mlflow.tracing import set_span_chat_tools

tools = [
    {
        "type": "function",
        "function": {
            "name": "add",
            "description": "Add two numbers",
            "parameters": {
                "type": "object",
                "properties": {
                    "a": {"type": "number"},
                    "b": {"type": "number"},
                },
                "required": ["a", "b"],
            },
        },
    }
]


@mlflow.trace
def f():
    span = mlflow.get_current_active_span()
    set_span_chat_tools(span, tools)
    return 0


f()
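Because tools must follow the standardized chat-tool spec, a quick structural check before calling set_span_chat_tools can catch malformed definitions early. The validator below is an illustrative sketch, not an MLflow API:

```python
def check_chat_tool(tool: dict) -> list[str]:
    """Return a list of structural problems with a chat-tool definition."""
    problems = []
    if tool.get("type") != "function":
        problems.append("'type' must be 'function'")
    fn = tool.get("function")
    if not isinstance(fn, dict):
        problems.append("'function' must be an object")
        return problems
    if not fn.get("name"):
        problems.append("'function.name' is required")
    params = fn.get("parameters", {})
    required = set(params.get("required", []))
    defined = set(params.get("properties", {}).keys())
    missing = required - defined
    if missing:
        problems.append(f"required params not in properties: {sorted(missing)}")
    return problems


good = {
    "type": "function",
    "function": {
        "name": "add",
        "parameters": {
            "type": "object",
            "properties": {"a": {"type": "number"}, "b": {"type": "number"}},
            "required": ["a", "b"],
        },
    },
}
bad = {"type": "function", "function": {"name": "", "parameters": {"required": ["x"]}}}

print(check_chat_tool(good))  # []
print(check_chat_tool(bad))   # flags the empty name and the dangling required param
```

Failing fast on structure here is cheaper than debugging a trace whose tools render incorrectly in the UI or confuse downstream consumers.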
- class mlflow.tracing.destination.Databricks(experiment_id: Optional[str] = None, experiment_name: Optional[str] = None)[source]
Note
Experimental: This class may change or be removed in a future release without warning.
A destination representing a Databricks tracing server.
By setting this destination in the mlflow.tracing.set_destination() function, MLflow will log traces to the specified experiment.

If neither experiment_id nor experiment_name is specified, the active experiment at the time traces are created will be used as the destination. If both are specified, they must refer to the same experiment.
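The resolution rule above (fall back to the active experiment; error if id and name disagree) can be sketched as plain Python; the helper and lookup table below are hypothetical, not the real implementation:

```python
# Hypothetical sketch of the experiment-resolution rule described above.
NAME_TO_ID = {"prod-traces": "123"}  # stand-in for an experiment-name lookup


def resolve_experiment(experiment_id=None, experiment_name=None, active_id="999"):
    if experiment_id is None and experiment_name is None:
        # Neither given: use the active experiment at trace-creation time.
        return active_id
    if experiment_name is not None:
        looked_up = NAME_TO_ID.get(experiment_name)
        if experiment_id is not None and experiment_id != looked_up:
            # Both given: they must refer to the same experiment.
            raise ValueError("experiment_id and experiment_name do not match")
        return looked_up
    return experiment_id


assert resolve_experiment() == "999"
assert resolve_experiment(experiment_id="123") == "123"
assert resolve_experiment(experiment_name="prod-traces") == "123"
assert resolve_experiment(experiment_id="123", experiment_name="prod-traces") == "123"
```

Passing both identifiers is redundant but allowed, which makes the mismatch check useful as a guard against pointing id and name at different experiments by accident.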
- class mlflow.tracing.destination.MlflowExperiment(experiment_id: Optional[str] = None)[source]
Note
Experimental: This class may change or be removed in a future release without warning.
A destination representing an MLflow experiment.
By setting this destination in the mlflow.tracing.set_destination() function, MLflow will log traces to the specified experiment.
- class mlflow.tracing.destination.TraceDestination[source]
Note
Experimental: This class may change or be removed in a future release without warning.
A configuration object for specifying the destination of trace data.