A confusion matrix gives a detailed summary of prediction results on a classification problem. The numbers of correct and incorrect predictions are summarized as counts, broken down by class. This is the main concept behind the confusion matrix.
The confusion matrix shows the ways in which our classification model is confused when it makes predictions. From it we can understand not only the errors being made by a classifier but, more importantly, the types of errors that are being made.
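As a minimal sketch of this idea (assuming binary labels encoded as 0 = negative, 1 = positive; the data below is hypothetical), the matrix can be built simply by tallying each (actual, predicted) pair:

```python
# Build a 2x2 confusion matrix from binary labels (0 = negative, 1 = positive).
# Rows index the actual class, columns the predicted class.
def confusion_matrix_2x2(y_true, y_pred):
    matrix = [[0, 0], [0, 0]]
    for actual, predicted in zip(y_true, y_pred):
        matrix[actual][predicted] += 1
    return matrix

y_true = [1, 1, 1, 1, 1, 0, 0]  # hypothetical actual labels
y_pred = [1, 1, 1, 0, 0, 0, 1]  # hypothetical model predictions
print(confusion_matrix_2x2(y_true, y_pred))  # [[1, 1], [2, 3]]
```

The diagonal entries are the correct predictions (true negatives and true positives); the off-diagonal entries are the two kinds of mistakes, which the next section names.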
True Positive:
Statement: Prediction is positive and it’s true.
Example: You predicted that a woman is pregnant and she actually is.
True Negative:
Statement: Prediction is negative and it’s true.
Example: You predicted that a man is not pregnant and he actually is not.
False Positive: (Type 1 Error)
Statement: Prediction is positive and it’s false.
Example: You predicted that a man is pregnant but he actually is not.
False Negative: (Type 2 Error)
Statement: Prediction is negative and it’s false.
Example: You predicted that a woman is not pregnant but she actually is.
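The four outcomes above can be sketched as a small helper that names each (actual, predicted) pair; the function name is illustrative, not a standard API:

```python
# Classify one (actual, predicted) pair into the four confusion-matrix outcomes.
# Labels are assumed binary: 1 = positive, 0 = negative.
def outcome(actual, predicted):
    if predicted == 1 and actual == 1:
        return "TP"  # true positive: predicted pregnant, actually pregnant
    if predicted == 0 and actual == 0:
        return "TN"  # true negative: predicted not pregnant, actually not
    if predicted == 1 and actual == 0:
        return "FP"  # false positive (Type 1 error)
    return "FN"      # false negative (Type 2 error)

print(outcome(actual=1, predicted=1))  # TP
print(outcome(actual=0, predicted=1))  # FP
print(outcome(actual=1, predicted=0))  # FN
```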
Metrics derived from the Confusion Matrix:
Recall measures, out of all the actual positive classes, how many we predicted correctly: Recall = TP / (TP + FN). It should be as high as possible.
Precision measures, out of all the classes predicted as positive, how many are actually positive: Precision = TP / (TP + FP).
Accuracy measures, out of all the classes, how many are predicted correctly: Accuracy = (TP + TN) / (TP + TN + FP + FN), which in this case is 4/7. It should be as high as possible.
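A short sketch of these three metrics, using hypothetical counts (TP = 3, TN = 1, FP = 1, FN = 2, i.e. 7 predictions of which 4 are correct, matching the 4/7 above):

```python
# Hypothetical counts: 7 predictions, 4 of them correct.
TP, TN, FP, FN = 3, 1, 1, 2

recall = TP / (TP + FN)                     # 3/5 = 0.6
precision = TP / (TP + FP)                  # 3/4 = 0.75
accuracy = (TP + TN) / (TP + TN + FP + FN)  # 4/7 ~= 0.571

print(round(recall, 3), round(precision, 3), round(accuracy, 3))
# 0.6 0.75 0.571
```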
It is difficult to compare two models when one has low precision and high recall and the other has high precision and low recall. So, to compare the two models, the F-score is used.
The F-score measures recall and precision at the same time. It uses the harmonic mean rather than the arithmetic mean, because the harmonic mean punishes extreme values more: F-score = 2 × (Precision × Recall) / (Precision + Recall).
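This effect can be sketched with hypothetical precision/recall values: when one of the two is very low, the harmonic mean is pulled down toward it far more than the arithmetic mean would be:

```python
# F-score: harmonic mean of precision and recall.
def f_score(precision, recall):
    return 2 * precision * recall / (precision + recall)

# Balanced case (hypothetical values):
print(round(f_score(0.75, 0.6), 3))  # 0.667

# Extreme case: high precision, very low recall.
# The arithmetic mean would be 0.55, but the F-score is dragged down:
print(round(f_score(1.0, 0.1), 3))   # 0.182
```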