Interactive Confusion Matrix Explorer

                      Predicted
                  Positive   Negative
Actual  Positive     85         15
        Negative     10         90
Accuracy:             0.875
Precision:            0.895
Recall (Sensitivity): 0.850
F1 Score:             0.872
Specificity:          0.900
False Positive Rate:  0.100

Understanding the Metrics

Accuracy = (TP + TN) / (TP + TN + FP + FN) - Of all predictions, how many were correct?

Precision = TP / (TP + FP) - Of all positive predictions, how many were correct?

Recall = TP / (TP + FN) - Of all actual positives, how many did we find?

Specificity = TN / (TN + FP) - Of all actual negatives, how many did we correctly reject?

False Positive Rate = FP / (FP + TN) - Of all actual negatives, how many did we wrongly flag? (Equals 1 − Specificity.)

F1 Score = 2 × (Precision × Recall) / (Precision + Recall) - The harmonic mean of Precision and Recall.
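The metrics above can be reproduced directly from the four matrix cells. A minimal sketch in plain Python, using the example counts from the table (TP = 85, FN = 15, FP = 10, TN = 90):

```python
# Cell counts from the example confusion matrix above.
TP, FN = 85, 15
FP, TN = 10, 90

accuracy    = (TP + TN) / (TP + TN + FP + FN)   # fraction of all predictions that were correct
precision   = TP / (TP + FP)                    # correct positives among predicted positives
recall      = TP / (TP + FN)                    # found positives among actual positives (sensitivity)
f1          = 2 * precision * recall / (precision + recall)  # harmonic mean of precision and recall
specificity = TN / (TN + FP)                    # correct negatives among actual negatives
fpr         = FP / (FP + TN)                    # wrongly flagged negatives; equals 1 - specificity

print(f"Accuracy:            {accuracy:.3f}")
print(f"Precision:           {precision:.3f}")
print(f"Recall (Sensitivity): {recall:.3f}")
print(f"F1 Score:            {f1:.3f}")
print(f"Specificity:         {specificity:.3f}")
print(f"False Positive Rate: {fpr:.3f}")
```

Running this prints the same six values shown in the explorer (0.875, 0.895, 0.850, 0.872, 0.900, 0.100); swapping in your own cell counts recomputes every metric.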