mlflow.onnx

The mlflow.onnx module provides APIs for logging and loading ONNX models in the MLflow Model format. This module exports MLflow Models with the following flavors:

ONNX (native) format
This is the main flavor that can be loaded back as an ONNX model object.
mlflow.pyfunc
Produced for use by generic pyfunc-based deployment tools and batch inference.
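Because the pyfunc flavor is also produced, a logged ONNX model can be scored through MLflow's generic pyfunc interface. A minimal sketch, assuming a model has already been logged under the hypothetical run-relative artifact path "onnx_model" (the run ID placeholder and the input column name are likewise hypothetical):

    import mlflow.pyfunc
    import pandas as pd

    # Hypothetical run ID and artifact path; substitute values from your own run.
    model_uri = "runs:/<mlflow_run_id>/onnx_model"

    # The pyfunc flavor exposes a generic predict() that takes a pandas DataFrame
    # whose column names must match the ONNX model's input names.
    pyfunc_model = mlflow.pyfunc.load_model(model_uri)
    predictions = pyfunc_model.predict(pd.DataFrame({"input": [[1.0, 2.0, 3.0, 4.0]]}))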
mlflow.onnx.get_default_conda_env()

Note

Experimental: This method may change or be removed in a future release without warning.

Returns: The default Conda environment for MLflow Models produced by calls to save_model() and log_model().
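A minimal sketch of how the returned environment might be inspected and extended before being passed as the conda_env argument of save_model() or log_model(); treating the dict's "dependencies" entry as a plain list (as in the conda example shown under log_model below) is an assumption, and the appended "pandas" dependency is purely illustrative:

    import mlflow.onnx

    # The default environment mirrors a conda.yaml dict: name, channels, dependencies.
    conda_env = mlflow.onnx.get_default_conda_env()
    print(conda_env)

    # Illustrative tweak: add an extra dependency before logging or saving a model.
    conda_env["dependencies"].append("pandas")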
mlflow.onnx.load_model(model_uri)

Note

Experimental: This method may change or be removed in a future release without warning.

Load an ONNX model from a local file or a run.

Parameters: model_uri

The location, in URI format, of the MLflow model, for example:

  • /Users/me/path/to/local/model
  • relative/path/to/local/model
  • s3://my_bucket/path/to/model
  • runs:/<mlflow_run_id>/run-relative/path/to/model

For more information about supported URI schemes, see the Artifacts Documentation.

Returns: An ONNX model instance.
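A minimal sketch of loading a saved model and scoring it with onnxruntime (which is not part of this module); the local path is the placeholder from the URI examples above, and the input name and shape depend on the particular model:

    import mlflow.onnx
    import numpy as np
    import onnxruntime

    # Load the ONNX model object; any of the URI forms listed above also work.
    onnx_model = mlflow.onnx.load_model("/Users/me/path/to/local/model")

    # Score it with onnxruntime; input names and shapes depend on your model.
    session = onnxruntime.InferenceSession(onnx_model.SerializeToString())
    input_name = session.get_inputs()[0].name
    outputs = session.run(None, {input_name: np.random.rand(1, 4).astype(np.float32)})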
mlflow.onnx.log_model(onnx_model, artifact_path, conda_env=None)

Note

Experimental: This method may change or be removed in a future release without warning.

Log an ONNX model as an MLflow artifact for the current run.

Parameters:
  • onnx_model – ONNX model to be saved.
  • artifact_path – Run-relative artifact path.
  • conda_env

    Either a dictionary representation of a Conda environment or the path to a Conda environment yaml file. If provided, this describes the environment this model should be run in. At minimum, it should specify the dependencies contained in get_default_conda_env(). If None, the default get_default_conda_env() environment is added to the model. The following is an example dictionary representation of a Conda environment:

    {
        'name': 'mlflow-env',
        'channels': ['defaults'],
        'dependencies': [
            'python=3.6.0',
            'onnx=1.4.1',
            'onnxruntime=0.3.0'
        ]
    }
    
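A minimal usage sketch for log_model; the source file "model.onnx" and the artifact path "onnx_model" are hypothetical, and omitting conda_env falls back to the default environment described above:

    import mlflow
    import mlflow.onnx
    import onnx

    # Hypothetical path to an already-exported ONNX file.
    onnx_model = onnx.load("model.onnx")

    with mlflow.start_run():
        # Logs the model under the run's artifact root at "onnx_model".
        mlflow.onnx.log_model(onnx_model, artifact_path="onnx_model")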
mlflow.onnx.save_model(onnx_model, path, conda_env=None, mlflow_model=<mlflow.models.Model object>)

Note

Experimental: This method may change or be removed in a future release without warning.

Save an ONNX model to a path on the local file system.

Parameters:
  • onnx_model – ONNX model to be saved.
  • path – Local path where the model is to be saved.
  • conda_env

    Either a dictionary representation of a Conda environment or the path to a Conda environment yaml file. If provided, this describes the environment this model should be run in. At minimum, it should specify the dependencies contained in get_default_conda_env(). If None, the default get_default_conda_env() environment is added to the model. The following is an example dictionary representation of a Conda environment:

    {
        'name': 'mlflow-env',
        'channels': ['defaults'],
        'dependencies': [
            'python=3.6.0',
            'onnx=1.4.1',
            'onnxruntime=0.3.0'
        ]
    }
    
  • mlflow_model – mlflow.models.Model this flavor is being added to.
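
A minimal usage sketch for save_model; the source file "model.onnx" and the output directory "onnx_model" are hypothetical:

    import mlflow.onnx
    import onnx

    # Hypothetical source file; any in-memory ONNX model object works as well.
    onnx_model = onnx.load("model.onnx")

    # Writes an MLflow model directory (MLmodel, conda.yaml, model file) to ./onnx_model.
    mlflow.onnx.save_model(onnx_model, path="onnx_model")

    # The saved directory can be reloaded with load_model().
    reloaded = mlflow.onnx.load_model("onnx_model")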