Throughout the model-building process, a model lives in memory and is accessible for the application's lifetime. Once the application stops running, however, the model is no longer accessible unless it's saved somewhere, locally or remotely. Models are typically used some time after training in other applications, either for inference or for re-training, so it's important to store the model.
Save a model locally
When saving a model, you need two things:

- The `ITransformer` of the model.
- The `DataViewSchema` of the `ITransformer`'s expected input.

Both come out of training, as shown in the sketch below.
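Here is a minimal sketch of where the trained model and its input schema come from, assuming a hypothetical `HousingData` input type and an SDCA regression pipeline (these are illustrative assumptions; any fitted pipeline and its training `IDataView` work the same way):

```csharp
using Microsoft.ML;
using Microsoft.ML.Data;

var mlContext = new MLContext();

// Loading data produces an IDataView whose Schema property is the DataViewSchema to save.
IDataView data = mlContext.Data.LoadFromEnumerable(new[]
{
    new HousingData { Size = 600f, Price = 100_000f },
    new HousingData { Size = 1_000f, Price = 300_000f }
});

// Fitting a pipeline produces the ITransformer to save.
var pipeline = mlContext.Transforms.Concatenate("Features", nameof(HousingData.Size))
    .Append(mlContext.Regression.Trainers.Sdca());

ITransformer trainedModel = pipeline.Fit(data);

// Hypothetical input type, used only for illustration.
public class HousingData
{
    public float Size { get; set; }

    [ColumnName("Label")]
    public float Price { get; set; }
}
```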
After training the model, use the `Save` method to save the trained model to a file called `model.zip`, using the `DataViewSchema` of the input data.
```csharp
// Save trained model
mlContext.Model.Save(trainedModel, data.Schema, "model.zip");
```
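If the model needs to end up somewhere other than the local file system (remote blob storage, for example), `Save` also has an overload that writes to a `Stream`. A brief sketch, reusing the `trainedModel` and `data` from above:

```csharp
using System.IO;

// Save the trained model to an in-memory stream instead of a file path.
using (var stream = new MemoryStream())
{
    mlContext.Model.Save(trainedModel, data.Schema, stream);

    // The serialized model bytes can then be persisted wherever the application needs.
    byte[] modelBytes = stream.ToArray();
}
```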
Load a model stored locally
In a separate application or process, use the `Load` method along with the file path to get the trained model into your application.
```csharp
// Define DataViewSchema for data preparation pipeline and trained model
DataViewSchema modelSchema;

// Load trained model
ITransformer trainedModel = mlContext.Model.Load("model.zip", out modelSchema);
```
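Once loaded, the model is typically wrapped in a `PredictionEngine` to score single examples. The sketch below continues from the snippet above; the `ModelInput` and `ModelOutput` types are hypothetical placeholders whose properties must match the columns the loaded model actually expects and produces:

```csharp
// Create a prediction engine from the loaded model and score one example.
var predictionEngine = mlContext.Model.CreatePredictionEngine<ModelInput, ModelOutput>(trainedModel);
ModelOutput prediction = predictionEngine.Predict(new ModelInput { Size = 850f });

// Hypothetical input/output types; shape these to match the loaded model's schema.
public class ModelInput
{
    public float Size { get; set; }
}

public class ModelOutput
{
    [ColumnName("Score")]
    public float PredictedPrice { get; set; }
}
```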