- Metrics used to evaluate submissions
- Which model is "best" (e.g., which hyperplane is optimal) depends on the metric being optimized
- Do exploratory metric analysis along with exploratory data analysis
- Each metric is its own way to measure the effectiveness of an algorithm
- Mean Square Error (MSE)
- RMSE
- R Squared
- MSE, RMSE, and R Squared are the same from an optimization perspective
- Accuracy
- LogLoss
- AUC
- Cohen's Kappa
N - number of samples
y - vector of target values
ŷ - vector of predictions
yᵢ - target value for the i-th object
ŷᵢ - prediction for the i-th object
Mean Square Error
MSE = (1/N) * Σᵢ (yᵢ - ŷᵢ)²
- Average of the squared differences between targets and predictions
RMSE - Root Mean Square Error = sqrt(MSE)
- On the same scale as the target
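A minimal sketch of both metrics, assuming NumPy and small hypothetical arrays y (targets) and y_hat (predictions):

```python
import numpy as np

def mse(y, y_hat):
    # average of squared differences between targets and predictions
    return np.mean((y - y_hat) ** 2)

def rmse(y, y_hat):
    # square root of MSE, so it is on the same scale as the target
    return np.sqrt(mse(y, y_hat))

y = np.array([3.0, 5.0, 2.5, 7.0])       # hypothetical targets
y_hat = np.array([2.5, 5.0, 3.0, 8.0])   # hypothetical predictions
print(mse(y, y_hat), rmse(y, y_hat))
```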
- RMSE vs MSE
- Similar in terms of minimizers
- Every minimizer of RMSE is also a minimizer of MSE, and vice versa
- MSE(a) > MSE(b) <=> RMSE(a) > RMSE(b)
- MSE orders in same way as RMSE
- MSE easier to work with
- There is a slight difference for gradient-based models: the gradient of RMSE is the gradient of MSE scaled by 1/(2 * RMSE)
- So the two losses may not be interchangeable for such learning methods without adjusting the learning rate
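A small numerical check of that scaling, using the same hypothetical arrays: the analytic RMSE gradient equals the MSE gradient divided by 2 * RMSE, which is why a gradient-based learner may need a different learning rate for the two losses.

```python
import numpy as np

y = np.array([3.0, 5.0, 2.5, 7.0])       # hypothetical targets
y_hat = np.array([2.5, 5.0, 3.0, 8.0])   # hypothetical predictions
n = len(y)

mse = np.mean((y - y_hat) ** 2)
rmse = np.sqrt(mse)

grad_mse = 2.0 / n * (y_hat - y)          # d(MSE)/d(y_hat)
grad_rmse = (y_hat - y) / (n * rmse)      # d(RMSE)/d(y_hat), derived directly

# the two gradients differ only by the constant factor 1 / (2 * RMSE)
print(np.allclose(grad_rmse, grad_mse / (2.0 * rmse)))  # True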
R Squared
- Measures how much better the model is than the constant baseline
- R² = 1 - MSE / MSE(baseline), where the baseline always predicts the mean of the target
- R² = 1 when predictions are perfect (MSE = 0)
- All reasonable models score between 0 and 1
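A sketch computing R² by hand against the constant (mean) baseline and comparing with scikit-learn's r2_score, again on the hypothetical arrays:

```python
import numpy as np
from sklearn.metrics import r2_score

y = np.array([3.0, 5.0, 2.5, 7.0])       # hypothetical targets
y_hat = np.array([2.5, 5.0, 3.0, 8.0])   # hypothetical predictions

mse = np.mean((y - y_hat) ** 2)
baseline_mse = np.mean((y - y.mean()) ** 2)  # constant model that predicts the mean

r2 = 1.0 - mse / baseline_mse
print(r2, r2_score(y, y_hat))  # the two values match
```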
MAE - Mean Absolute Error
- Average of the absolute differences between targets and predictions
- MAE = (1/N) * Σᵢ |yᵢ - ŷᵢ|
- Widely used in finance
- A $10 error is twice as bad as a $5 error (under MSE it would be four times as bad)
- MAE is easier to justify
- The best constant prediction for MAE is the median of the target values
- The MAE gradient with respect to the prediction is a step function: -1 when the prediction is smaller than the target, +1 when it is greater
- MAE is not differentiable at the point where the prediction equals the target
- If the unusual values in the data are outliers, use MAE; if they are unexpected but normal values the model should care about, use MSE
- MAE is robust to outliers
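A quick illustration, with a hypothetical target containing one outlier, that the best constant prediction under MAE is (approximately) the median while under MSE it is the mean:

```python
import numpy as np

y = np.array([1.0, 2.0, 3.0, 4.0, 100.0])  # hypothetical targets with one outlier

# brute-force search over constant predictions
candidates = np.linspace(y.min(), y.max(), 10001)
mae = np.array([np.mean(np.abs(y - c)) for c in candidates])
mse = np.array([np.mean((y - c) ** 2) for c in candidates])

print("best constant for MAE:", candidates[mae.argmin()], "median:", np.median(y))
print("best constant for MSE:", candidates[mse.argmin()], "mean:", y.mean())
```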