ML Model Hosting
This post discusses different techniques for model hosting. A real-world scenario involves:
- Data Pipeline
- Model Training
- Model Deployment (saved file, or saved coefficients computed at runtime)
- Model Rendering
- Framework for monitoring model performance
- Model Retraining
Model Training involves pre-processing, normalization, bucketing, and multiple features. These transformation parameters are effectively metadata: they need to be stored in a metadata table, or in a separate standalone script, so that exactly the same transforms can be applied to the test data.
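A minimal sketch of that idea, with assumed example values: the training-time statistics ("metadata") are captured once, persisted (here as JSON; a metadata table row works the same way), and re-applied at scoring time so train and test data pass through identical transforms.

```python
import json

def fit_metadata(values, buckets):
    """Capture training-time stats: mean/std for normalization, bucket edges."""
    mean = sum(values) / len(values)
    std = (sum((v - mean) ** 2 for v in values) / len(values)) ** 0.5
    return {"mean": mean, "std": std, "buckets": buckets}

def apply_metadata(value, meta):
    """Re-apply the stored transforms to a new (test/serving) value."""
    z = (value - meta["mean"]) / meta["std"]           # normalization
    bucket = sum(value >= b for b in meta["buckets"])  # bucketing
    return {"z": z, "bucket": bucket}

meta = fit_metadata([10, 20, 30, 40], buckets=[15, 30])
stored = json.dumps(meta)        # persist to the metadata table / file
restored = json.loads(stored)    # reload in the standalone scoring script
print(apply_metadata(25, restored))
```

The key point is that `apply_metadata` never recomputes statistics from the incoming data; it only replays what was fitted at training time.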
Model Deployment - save the model as a pickle file, or save the coefficient values in a DB along with the metadata needed to compute predictions at runtime.
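Both deployment options can be sketched with a hypothetical minimal model (the same idea applies to any fitted scikit-learn or XGBoost object); the class name and coefficients here are assumptions for illustration.

```python
import io
import pickle

class LinearModel:
    """Hypothetical minimal model used to illustrate both options."""
    def __init__(self, coef, intercept):
        self.coef, self.intercept = coef, intercept
    def predict(self, x):
        return sum(c * xi for c, xi in zip(self.coef, x)) + self.intercept

model = LinearModel(coef=[0.5, -1.2], intercept=3.0)

# Option 1: save/load the whole model as a pickle
# (BytesIO stands in for a file on disk)
buf = io.BytesIO()
pickle.dump(model, buf)
buf.seek(0)
loaded = pickle.load(buf)

# Option 2: save only the coefficient values (e.g. a DB row)
# and rebuild the model at runtime
row = {"coef": model.coef, "intercept": model.intercept}
rebuilt = LinearModel(**row)

assert loaded.predict([1, 1]) == rebuilt.predict([1, 1])
```

Option 2 trades a little rebuild code for the ability to inspect and version coefficients in the database, independent of the Python object format.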
Model Rendering - expose the model as a service, and invoke it after a service call.
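A minimal sketch of "exposure as a service" with Flask (listed in the Tools section below); the route name, payload shape, and coefficients are assumptions, and the model would normally be loaded once at startup from the pickle file or DB coefficients.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# Stand-in for a model loaded once at startup
COEF, INTERCEPT = [0.5, -1.2], 3.0

@app.route("/predict", methods=["POST"])
def predict():
    features = request.get_json()["features"]
    score = sum(c * x for c, x in zip(COEF, features)) + INTERCEPT
    return jsonify({"score": score})

# The caller invokes this after its own service call, e.g.:
client = app.test_client()
resp = client.post("/predict", json={"features": [1, 1]})
print(resp.get_json())
```

In production the service would run behind a WSGI server rather than the test client, but the request/response contract is the same.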
Monitoring model performance - expected vs. actual values need to be monitored; deviation beyond a certain threshold requires a human in the loop to update, re-train, or fix the model.
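The monitoring rule above can be sketched as a simple check: compare expected against actual values and flag a human-in-the-loop review when the deviation crosses a threshold. The metric (mean absolute deviation) and the threshold value are assumptions; any deviation measure fits the same pattern.

```python
def needs_review(expected, actual, threshold=0.1):
    """Return True when mean absolute deviation exceeds the threshold."""
    deviations = [abs(e - a) for e, a in zip(expected, actual)]
    mad = sum(deviations) / len(deviations)
    return mad > threshold

expected = [1.0, 0.0, 1.0, 1.0]
actual   = [0.9, 0.2, 1.0, 0.4]
print(needs_review(expected, actual))  # deviation 0.225 > 0.1, so True
```

In practice this check would run on a schedule against logged predictions, and a `True` result would open a ticket or alert rather than just print.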
Model Retraining - the model has to be updated on a regular schedule, or whenever accuracy deviates beyond a certain threshold.
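Those two retraining triggers (a regular schedule, or accuracy dropping past a threshold) can be combined in one decision function; the age limit and accuracy-drop values here are illustrative assumptions.

```python
import datetime

def should_retrain(last_trained, accuracy, baseline,
                   max_age_days=30, max_drop=0.05):
    """Retrain if the model is too old OR accuracy has dropped too far."""
    age = (datetime.date.today() - last_trained).days
    return age > max_age_days or (baseline - accuracy) > max_drop

stale = datetime.date.today() - datetime.timedelta(days=45)
print(should_retrain(stale, accuracy=0.92, baseline=0.93))        # True (age)
print(should_retrain(datetime.date.today(), 0.80, 0.93))          # True (accuracy drop)
print(should_retrain(datetime.date.today(), 0.92, 0.93))          # False
```

A scheduler (cron, Airflow) would call this daily and kick off the training pipeline when it returns `True`.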
Tools -
- Flask API
- Web interface
- Visualization of clusters
- Recommendations
- SQL / Python code for preparing feature variables
- Handling missing data
- Normalization (standard code)
- Logging
- Model monitoring schema
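Two of the "standard code" items in the list, handling missing data and normalization, can be sketched in pure Python (in practice pandas/scikit-learn provide the same operations); mean imputation and z-score normalization are assumed as the concrete techniques.

```python
def impute_mean(values):
    """Replace missing values (None) with the mean of the present values."""
    present = [v for v in values if v is not None]
    mean = sum(present) / len(present)
    return [mean if v is None else v for v in values]

def zscore(values):
    """Z-score normalization: subtract the mean, divide by the std dev."""
    mean = sum(values) / len(values)
    std = (sum((v - mean) ** 2 for v in values) / len(values)) ** 0.5
    return [(v - mean) / std for v in values]

filled = impute_mean([1.0, None, 3.0, None])   # -> [1.0, 2.0, 3.0, 2.0]
normalized = zscore(filled)
print(normalized)
```

As noted earlier, the fitted statistics (means, standard deviations) belong in the metadata table so serving data is transformed with the training-time values, not recomputed.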
References -
Click Prediction ML Software Pipelines from the Trenches
Architecting a Machine Learning Pipeline
Machine Learning Pipeline from scratch
Models to Production
How to Code Neat Machine Learning Pipelines
Lessons learnt while building a Machine Learning Pipeline
Cookiecutter Data Science
production-data-science
Happy Learning!!!