For data scientists, ML engineers, product managers, and other practitioners.
How do you maintain ML models after you deploy them, and what exactly should you prepare for? In this guide, we look into the key concepts of production ML model operations.
This guide is made for data scientists, data engineers, ML engineers, and AI product managers who operate production ML-based systems and products.
What you will find in this guide:
Data drift refers to shifts in the distribution of the input data that an ML model receives in production. You can use data drift analysis as a monitoring and debugging technique to evaluate whether the model still operates in a familiar environment.
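To make this concrete, here is a minimal sketch of one common data drift check: the Population Stability Index (PSI), which compares the binned distribution of a feature at training time against its current production distribution. The data, bin count, and the 0.2 threshold are illustrative assumptions, not prescriptions from this guide.

```python
import numpy as np

def psi(reference: np.ndarray, current: np.ndarray, bins: int = 10) -> float:
    """Population Stability Index between a reference and a current sample."""
    # bin edges are derived from the reference (training-time) distribution
    edges = np.histogram_bin_edges(reference, bins=bins)
    ref_pct = np.histogram(reference, bins=edges)[0] / len(reference)
    cur_pct = np.histogram(current, bins=edges)[0] / len(current)
    # clip to avoid log(0) for empty bins
    ref_pct = np.clip(ref_pct, 1e-6, None)
    cur_pct = np.clip(cur_pct, 1e-6, None)
    return float(np.sum((cur_pct - ref_pct) * np.log(cur_pct / ref_pct)))

# synthetic example data (an assumption for illustration)
rng = np.random.default_rng(0)
training_data = rng.normal(0.0, 1.0, size=1_000)  # what the model saw in training
similar_data = rng.normal(0.0, 1.0, size=1_000)   # production data, same distribution
shifted_data = rng.normal(1.5, 1.0, size=1_000)   # production data, mean has shifted

psi_similar = psi(training_data, similar_data)
psi_shifted = psi(training_data, shifted_data)
print(f"PSI (similar): {psi_similar:.3f}")
print(f"PSI (shifted): {psi_shifted:.3f}")
```

A common rule of thumb is that PSI above roughly 0.2 signals meaningful drift; in practice you would run a check like this per feature, on a schedule, against a fixed reference window.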
Concept drift refers to changes in the underlying patterns and relationships that the ML model has learned, such as the link between inputs and the target. It can cause a decline in production model quality. You can detect concept drift by monitoring for drops in production model quality or through proxy metrics like prediction drift.
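When ground-truth labels arrive with a delay, prediction drift is a useful proxy signal. Below is a hedged sketch of one simple version: comparing the positive-prediction rate in a current window against a reference window for a binary classifier. The window contents and the tolerance are illustrative assumptions.

```python
import numpy as np

def prediction_drift(ref_preds: np.ndarray, cur_preds: np.ndarray, tol: float = 0.1) -> bool:
    """Flag drift if the positive-prediction rate moves by more than `tol`."""
    return bool(abs(cur_preds.mean() - ref_preds.mean()) > tol)

# illustrative prediction windows for a binary classifier
ref_window = np.array([1, 0, 0, 1, 0, 0, 0, 1, 0, 0])  # ~30% positives at deployment
cur_window = np.array([1, 1, 1, 0, 1, 1, 0, 1, 1, 1])  # ~80% positives now

drift_detected = prediction_drift(ref_window, cur_window)
print(drift_detected)  # True: the prediction distribution has shifted
```

A shift like this does not prove concept drift on its own, but it is a cheap early-warning signal worth investigating before labeled data confirms a quality drop.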