The distinction between accuracy and loss is often neglected. Most people focus on the accuracy metric while training a model, but the loss deserves equal attention.
By definition, the accuracy score is the fraction of predictions the model gets right, while the loss measures how far the model's outputs are from the desired target values.
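As a small, hedged illustration (assuming a binary-classification setup scored with log loss; the labels and probabilities below are made up), accuracy only counts thresholded hits, while the loss looks at how far each predicted probability is from its target:

```python
import numpy as np

# Hypothetical ground-truth labels and predicted probabilities for 4 samples
y_true = np.array([1, 0, 1, 1])
y_prob = np.array([0.9, 0.2, 0.6, 0.8])

# Accuracy: fraction of thresholded predictions that match the labels
y_pred = (y_prob >= 0.5).astype(int)
accuracy = (y_pred == y_true).mean()

# Binary cross-entropy (log loss): penalizes distance from the target probability
eps = 1e-12  # avoid log(0)
loss = -np.mean(y_true * np.log(y_prob + eps) + (1 - y_true) * np.log(1 - y_prob + eps))

print(f"accuracy = {accuracy:.2f}")  # 1.00: every prediction lands on the right side of 0.5
print(f"log loss = {loss:.4f}")      # ~0.27: probabilities are not exactly 0 or 1
```

Here the accuracy is already perfect, yet the loss can still improve as the probabilities move closer to 0 and 1.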
When does an accuracy-loss discrepancy occur?
1) Predictions move closer to the target values. The loss improves while accuracy stays the same; the model is becoming more confident and robust.
2) Conversely, if the model is overfitting, a single confidently wrong prediction can cause a large jump in the loss with hardly any change in accuracy.
3) On imbalanced datasets, where one class makes up 90% or more of the samples, the accuracy score is generally high regardless of how well the model actually performs, so accuracy alone is not a reliable metric. (Scenarios 1 and 2 are sketched below.)
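A minimal sketch of scenarios 1 and 2, using hypothetical probabilities and the same log-loss definition as above: both prediction sets classify every sample correctly, so accuracy is identical, yet the more confident one has the lower loss; a single confidently wrong prediction then inflates the loss while accuracy barely moves.

```python
import numpy as np

def log_loss(y_true, y_prob, eps=1e-12):
    """Binary cross-entropy averaged over samples."""
    y_prob = np.clip(y_prob, eps, 1 - eps)
    return -np.mean(y_true * np.log(y_prob) + (1 - y_true) * np.log(1 - y_prob))

def accuracy(y_true, y_prob):
    """Fraction of thresholded predictions that match the labels."""
    return ((y_prob >= 0.5).astype(int) == y_true).mean()

y_true = np.array([1, 1, 1, 0, 0])

# Scenario 1: predictions move closer to the targets -> same accuracy, lower loss
barely_right   = np.array([0.55, 0.60, 0.55, 0.45, 0.40])
very_confident = np.array([0.95, 0.90, 0.95, 0.05, 0.10])
print(accuracy(y_true, barely_right),   log_loss(y_true, barely_right))    # 1.0, ~0.56
print(accuracy(y_true, very_confident), log_loss(y_true, very_confident))  # 1.0, ~0.07

# Scenario 2: one confidently wrong prediction -> accuracy drops a little,
# but the loss jumps a lot (typical of an overfit model on an unseen sample)
one_bad_miss = np.array([0.95, 0.90, 0.01, 0.05, 0.10])
print(accuracy(y_true, one_bad_miss), log_loss(y_true, one_bad_miss))      # 0.8, ~0.98
```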
Conclusion ->
1) Given two models with different hyperparameters but the same accuracy score, choose the one with the better (lower) loss.
2) On a well-balanced dataset, accuracy is usually a reasonable metric; for imbalanced data, see the sketch below.
3) Accuracy is a performance metric, whereas the loss reflects how well the model's parameters are being optimized during training.
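To make conclusion point 2 concrete, here is a quick hypothetical check of the class-imbalance caveat: a "model" that always predicts the majority class on a 90/10 dataset still reports 90% accuracy, which is why accuracy alone is not informative there.

```python
import numpy as np

# Hypothetical imbalanced labels: 90 negatives, 10 positives
y_true = np.array([0] * 90 + [1] * 10)

# A degenerate classifier that always predicts the majority class
y_pred = np.zeros_like(y_true)

accuracy = (y_pred == y_true).mean()
print(f"accuracy = {accuracy:.2f}")  # 0.90, despite never detecting a single positive
```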