MLflow Models: packaging machine learning models

What are MLflow Models?

A standard format for packaging machine learning models so they can be used by a variety of downstream tools, for example real-time serving through a REST API or batch inference.

MLflow Models workflow

  1. Define an MLmodel file
  2. Train, evaluate, and customize models
  3. Deploy the model
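
A minimal sketch of these steps with scikit-learn (the dataset, model, and artifact names here are only illustrative; logging writes the MLmodel file and environment files automatically):

import mlflow
import mlflow.sklearn
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

# Train and evaluate a model (step 2)
X, y = load_iris(return_X_y=True)
model = LogisticRegression(max_iter=1000).fit(X, y)

# Logging the model creates the MLmodel file and environment files (step 1)
with mlflow.start_run():
    mlflow.sklearn.log_model(model, "model")

# Deploy the logged model (step 3), e.g. serve it locally:
#   mlflow models serve -m runs:/<run_id>/model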

Learn more

  • Storage format
  • Define an MLmodel
  • Set up a server for the model
    • mlflow models serve -m my_model
  • Deploying to AWS SageMaker
    • mlflow deployments create -t sagemaker -m my_model [other options]
  • Fields that can be set in the MLmodel file
    • time_created
    • run_id
    • signature
    • input_example
    • databricks_runtime
    • mlflow_version
  • Logging
    • For environment reproduction, the following files are automatically recorded each time a model is logged:
      • conda.yaml
      • python_env.yaml
      • requirements.txt
  • Model signature
    • Supports column-based signatures
    • Supports tensor-based signatures
  • Model input example
    • Provide an instance of a valid model input
  • Model API
    • add_flavor
    • save
    • log
    • load
  • Standard built-in flavors
    • python_function
    • crate
    • h2o
    • keras
    • mleap
    • pytorch
    • sklearn
    • spark
    • tensorflow
    • onnx
    • gluon
    • xgboost
    • lightgbm
    • catboost
    • spacy
    • fastai
    • statsmodels
    • prophet
    • pmdarima
    • diviner
  • Deploy the model
  • Deploy as a local REST API
  • Endpoint definitions
    • /ping: used for health checks
    • /health: same as /ping
    • /version: returns the MLflow version
    • /invocations: used for scoring (inference)
  • Example requests (a Python equivalent is sketched after this list)
    • curl http://127.0.0.1:5000/invocations -H 'Content-Type: application/json' -d '{"columns": ["a", "b", "c"], "data": [[1, 2, 3], [4, 5, 6]]}'
    • curl http://127.0.0.1:5000/invocations -H 'Content-Type: application/json; format=pandas-records' -d '[{"a": 1, "b": 2, "c": 3}, {"a": 4, "b": 5, "c": 6}]'
    • curl http://127.0.0.1:5000/invocations -H 'Content-Type: application/json' -d '{"instances": [{"a": "s1", "b": 1, "c": [1, 2, 3]}, {"a": "s2", "b": 2, "c": [4, 5, 6]}, {"a": "s3", "b": 3, "c": [7, 8, 9]}]}'
    • curl http://127.0.0.1:5000/invocations -H 'Content-Type: application/json' -d '{"inputs": {"a": ["s1", "s2", "s3"], "b": [1, 2, 3], "c": [[1, 2, 3], [4, 5, 6], [7, 8, 9]]}}'
  • Serving with MLServer (experimental)
  • Use MLServer to serve MLflow models
    • mlflow models serve -m my_model --enable-mlserver
  • Build a Docker image with MLServer
    • mlflow models build-docker -m my_model --enable-mlserver -n my-model
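
The curl examples above can also be issued from Python. A sketch, assuming the local scoring server started with mlflow models serve is listening on 127.0.0.1:5000:

import requests

# pandas-split payload, equivalent to the first curl example above
payload = {"columns": ["a", "b", "c"], "data": [[1, 2, 3], [4, 5, 6]]}
resp = requests.post("http://127.0.0.1:5000/invocations", json=payload)
print(resp.status_code, resp.json())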

Directory layout

my_model/
├── MLmodel
├── model.pkl
├── conda.yaml
├── python_env.yaml
└── requirements.txt
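
A layout like this is produced by saving (or logging) a model. A minimal sketch with scikit-learn; the model and output path are only illustrative:

from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
import mlflow.sklearn

X, y = load_iris(return_X_y=True)
clf = RandomForestClassifier().fit(X, y)

# Writes my_model/ containing MLmodel, model.pkl and the recorded environment files
mlflow.sklearn.save_model(clf, "my_model")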

The MLmodel file

time_created: 2018-05-25T17:28:53.35

flavors:
  sklearn:
    sklearn_version: 0.19.1
    pickled_model: model.pkl
  python_function:
    loader_module: mlflow.sklearn
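
Each flavor listed in the MLmodel file offers a way to load the model back. A sketch, assuming the my_model/ directory above:

import mlflow.pyfunc
import mlflow.sklearn

# Generic python_function flavor: a uniform predict() interface
pyfunc_model = mlflow.pyfunc.load_model("my_model")

# Native sklearn flavor: returns the original scikit-learn estimator
sk_model = mlflow.sklearn.load_model("my_model")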

Model signature

Column-based

signature:
    inputs: '[{"name": "sepal length (cm)", "type": "double"}, {"name": "sepal width
      (cm)", "type": "double"}, {"name": "petal length (cm)", "type": "double"}, {"name":
      "petal width (cm)", "type": "double"}]'
    outputs: '[{"type": "integer"}]'

Tensor-based

signature:
    inputs: '[{"name": "images", "dtype": "uint8", "shape": [-1, 28, 28, 1]}]'
    outputs: '[{"shape": [-1, 10], "dtype": "float32"}]'

Logging a signature

Column-based

import pandas as pd
from sklearn import datasets
from sklearn.ensemble import RandomForestClassifier
import mlflow
import mlflow.sklearn
from mlflow.models.signature import infer_signature

iris = datasets.load_iris()
iris_train = pd.DataFrame(iris.data, columns=iris.feature_names)
clf = RandomForestClassifier(max_depth=7, random_state=0)
clf.fit(iris_train, iris.target)
signature = infer_signature(iris_train, clf.predict(iris_train))
mlflow.sklearn.log_model(clf, "iris_rf", signature=signature)
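
The same signature can also be constructed by hand instead of inferred:
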
from mlflow.models.signature import ModelSignature
from mlflow.types.schema import Schema, ColSpec

input_schema = Schema([
  ColSpec("double", "sepal length (cm)"),
  ColSpec("double", "sepal width (cm)"),
  ColSpec("double", "petal length (cm)"),
  ColSpec("double", "petal width (cm)"),
])
output_schema = Schema([ColSpec("long")])
signature = ModelSignature(inputs=input_schema, outputs=output_schema)
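
The hand-built signature is then passed to log_model in the same way as the inferred one, e.g. (reusing clf from the snippet above):

mlflow.sklearn.log_model(clf, "iris_rf", signature=signature)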

Tensor-based

from keras.datasets import mnist
from keras.utils import to_categorical
from keras.models import Sequential
from keras.layers import Conv2D, MaxPooling2D, Dense, Flatten
from keras.optimizers import SGD
import mlflow
import mlflow.keras
from mlflow.models.signature import infer_signature

(train_X, train_Y), (test_X, test_Y) = mnist.load_data()
trainX = train_X.reshape((train_X.shape[0], 28, 28, 1))
testX = test_X.reshape((test_X.shape[0], 28, 28, 1))
trainY = to_categorical(train_Y)
testY = to_categorical(test_Y)

model = Sequential()
model.add(Conv2D(32, (3, 3), activation='relu', kernel_initializer='he_uniform', input_shape=(28, 28, 1)))
model.add(MaxPooling2D((2, 2)))
model.add(Flatten())
model.add(Dense(100, activation='relu', kernel_initializer='he_uniform'))
model.add(Dense(10, activation='softmax'))
opt = SGD(lr=0.01, momentum=0.9)
model.compile(optimizer=opt, loss='categorical_crossentropy', metrics=['accuracy'])
model.fit(trainX, trainY, epochs=10, batch_size=32, validation_data=(testX, testY))

signature = infer_signature(testX, model.predict(testX))
mlflow.keras.log_model(model, "mnist_cnn", signature=signature)
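
The tensor signature can also be declared explicitly with TensorSpec instead of being inferred:
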
import numpy as np
from mlflow.models.signature import ModelSignature
from mlflow.types.schema import Schema, TensorSpec

input_schema = Schema([
  TensorSpec(np.dtype(np.uint8), (-1, 28, 28, 1)),
])
output_schema = Schema([TensorSpec(np.dtype(np.float32), (-1, 10))])
signature = ModelSignature(inputs=input_schema, outputs=output_schema)
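
As before, this explicit signature can be supplied when logging, e.g. (reusing model from the snippet above):

mlflow.keras.log_model(model, "mnist_cnn", signature=signature)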

Model input example

Column-based

input_example = {
  "sepal length (cm)": 5.1,
  "sepal width (cm)": 3.5,
  "petal length (cm)": 1.4,
  "petal width (cm)": 0.2
}
mlflow.sklearn.log_model(..., input_example=input_example)

Tensor-based

# each input has shape (4, 4)
input_example = np.array([
   [[  0,   0,   0,   0],
    [  0, 134,  25,  56],
    [253, 242, 195,   6],
    [  0,  93,  82,  82]],
   [[  0,  23,  46,   0],
    [ 33,  13,  36, 166],
    [ 76,  75,   0, 255],
    [ 33,  44,  11,  82]]
], dtype=np.uint8)
mlflow.keras.log_model(..., input_example=input_example)