mlflow.keras
The mlflow.keras module provides an API for logging and loading Keras models. This module exports Keras models with the following flavors:

- Keras (native) format
  This is the main flavor that can be loaded back into Keras.
- mlflow.pyfunc
  Produced for use by generic pyfunc-based deployment tools and batch inference (see the sketch below).
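Because the pyfunc flavor is exported alongside the native one, a logged Keras model can also be loaded as a generic Python function for framework-agnostic inference. The following is a minimal sketch, not taken from the reference itself: it assumes an MLflow version where mlflow.pyfunc.load_model accepts a model URI, and the run ID and "models" artifact path are placeholders.

>>> import mlflow.pyfunc
>>> import pandas as pd
>>> # Placeholder URI; substitute a real run ID and artifact path
>>> pyfunc_model = mlflow.pyfunc.load_model("runs:/<mlflow_run_id>/models")
>>> # pyfunc models expect a pandas DataFrame and return predictions
>>> predictions = pyfunc_model.predict(pd.DataFrame(x_test))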
mlflow.keras.get_default_conda_env()

Returns: The default Conda environment for MLflow Models produced by calls to save_model() and log_model().
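As an illustrative sketch (not from the reference itself), the default environment can be inspected and a customized copy passed back through the conda_env parameter of log_model; the extra pillow dependency here is a hypothetical addition, and the exact layout of the returned dictionary may vary by MLflow version:

>>> import mlflow.keras
>>> conda_env = mlflow.keras.get_default_conda_env()
>>> print(conda_env)  # dict with 'name', 'channels', and 'dependencies' keys
>>> # Hypothetical extra dependency appended before logging the model
>>> conda_env['dependencies'].append('pillow')
>>> mlflow.keras.log_model(keras_model, "models", conda_env=conda_env)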
mlflow.keras.load_model(model_uri, **kwargs)

Load a Keras model from a local file or a run. Extra arguments are passed through to keras.load_model.

Parameters:
- model_uri – The location, in URI format, of the MLflow model. For example:
  - /Users/me/path/to/local/model
  - relative/path/to/local/model
  - s3://my_bucket/path/to/model
  - runs:/<mlflow_run_id>/run-relative/path/to/model
  For more information about supported URI schemes, see Referencing Artifacts.

Returns: A Keras model instance.

>>> # Load the persisted model as a native Keras model and call predict() on test data
>>> keras_model = mlflow.keras.load_model("runs:/96771d893a5e46159d9f3b49bf9013e2" + "/models")
>>> predictions = keras_model.predict(x_test)
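Since extra keyword arguments are forwarded to keras.load_model, Keras-specific loading options such as custom_objects can be supplied. A brief sketch under that assumption, where MyCustomLayer is a hypothetical user-defined layer class and the URI is a placeholder:

>>> # custom_objects is forwarded to keras.load_model; MyCustomLayer is hypothetical
>>> keras_model = mlflow.keras.load_model(
...     "runs:/<mlflow_run_id>/models", custom_objects={"MyCustomLayer": MyCustomLayer})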
mlflow.keras.log_model(keras_model, artifact_path, conda_env=None, **kwargs)

Log a Keras model as an MLflow artifact for the current run.

Parameters:
- keras_model – Keras model to be saved.
- artifact_path – Run-relative artifact path.
- conda_env – Either a dictionary representation of a Conda environment or the path to a Conda environment YAML file. If provided, this describes the environment this model should be run in. At a minimum, it should specify the dependencies contained in get_default_conda_env(). If None, the default mlflow.keras.get_default_conda_env() environment is added to the model. The following is an example dictionary representation of a Conda environment:

  {
      'name': 'mlflow-env',
      'channels': ['defaults'],
      'dependencies': [
          'python=3.7.0',
          'keras=2.2.4',
          'tensorflow=1.8.0'
      ]
  }

- kwargs – kwargs to pass to the keras_model.save method.
>>> from keras.layers import Dense
>>> import mlflow
>>> # Build, compile, and train your model
>>> keras_model = ...
>>> keras_model.compile(optimizer="rmsprop", loss="mse", metrics=["accuracy"])
>>> results = keras_model.fit(
...     x_train, y_train, epochs=20, batch_size=128, validation_data=(x_val, y_val))
>>> # Log metrics and log the model
>>> with mlflow.start_run() as run:
...     mlflow.keras.log_model(keras_model, "models")
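As a follow-up sketch (assuming an MLflow version where run.info.run_id is available), the model logged above can be loaded back by building a runs:/ URI from the captured run:

>>> # Load the model back from the run logged above; "models" matches the artifact path
>>> model_uri = "runs:/{}/models".format(run.info.run_id)
>>> loaded_model = mlflow.keras.load_model(model_uri)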
mlflow.keras.save_model(keras_model, path, conda_env=None, mlflow_model=<mlflow.models.Model object>)

Save a Keras model to a path on the local file system.

Parameters:
- keras_model – Keras model to be saved.
- path – Local path where the model is to be saved.
- conda_env – Either a dictionary representation of a Conda environment or the path to a Conda environment YAML file. If provided, this describes the environment this model should be run in. At a minimum, it should specify the dependencies contained in get_default_conda_env(). If None, the default get_default_conda_env() environment is added to the model. The following is an example dictionary representation of a Conda environment:

  {
      'name': 'mlflow-env',
      'channels': ['defaults'],
      'dependencies': [
          'python=3.7.0',
          'keras=2.2.4',
          'tensorflow=1.8.0'
      ]
  }

- mlflow_model – MLflow model config this flavor is being added to.
>>> import mlflow
>>> # Build, compile, and train your model
>>> keras_model = ...
>>> keras_model_path = ...
>>> keras_model.compile(optimizer="rmsprop", loss="mse", metrics=["accuracy"])
>>> results = keras_model.fit(
...     x_train, y_train, epochs=20, batch_size=128, validation_data=(x_val, y_val))
>>> # Save the model as an MLflow Model
>>> mlflow.keras.save_model(keras_model, keras_model_path)
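A locally saved model directory can be loaded back by passing the same local path as the model URI, since load_model accepts local filesystem locations; a brief sketch:

>>> # Load the model back from the local path it was saved to
>>> loaded_model = mlflow.keras.load_model(keras_model_path)
>>> predictions = loaded_model.predict(x_test)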