3x3 Confusion Matrix
A 3x3 confusion matrix is a table that visualizes the performance of a classification model with three classes. The diagonal cells count correct predictions for each class, while the off-diagonal cells show where the model confused one class for another; per class, those errors correspond to false positives and false negatives.
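For illustration, here is a hypothetical matrix for classes A, B, and C, with rows giving the actual class and columns the predicted class:

```
              Predicted A   Predicted B   Predicted C
Actual A           50             3             2
Actual B            5            40             5
Actual C            2             8            45
```

The diagonal cells (50, 40, 45) are the correct predictions; every off-diagonal cell is a specific mistake, e.g. the 8 means eight actual-C examples were predicted as B.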
Enter the prediction count for each of the nine cells. The calculator then reports the following metrics (a sketch of the underlying arithmetic follows the list):
Accuracy: Overall proportion of correct predictions.
Precision: For each class, the proportion of predictions of that class that were correct.
Recall: For each class, the proportion of actual members of that class that were identified correctly.
F1 Score: Harmonic mean of precision and recall.
Macro Average: Unweighted mean of the per-class metrics, so every class counts equally.
Micro Average: Metrics computed from the pooled totals of true positives, false positives, and false negatives across all classes.
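As a minimal sketch of how these metrics fall out of the matrix (assuming NumPy and the hypothetical counts from the example above; this mirrors the arithmetic, not the calculator's actual code):

```python
import numpy as np

# Hypothetical counts; rows = actual class, columns = predicted class.
cm = np.array([
    [50,  3,  2],   # actual A
    [ 5, 40,  5],   # actual B
    [ 2,  8, 45],   # actual C
])

tp = np.diag(cm).astype(float)      # correct predictions per class
fp = cm.sum(axis=0) - tp            # predicted as the class, but wrong
fn = cm.sum(axis=1) - tp            # actually the class, but missed

accuracy  = tp.sum() / cm.sum()                    # overall correct share
precision = tp / (tp + fp)                         # per-class precision
recall    = tp / (tp + fn)                         # per-class recall
f1        = 2 * precision * recall / (precision + recall)

macro_f1 = f1.mean()                               # each class counts equally
micro_f1 = tp.sum() / (tp.sum() + 0.5 * (fp.sum() + fn.sum()))

print(f"accuracy={accuracy:.3f} macro_f1={macro_f1:.3f} micro_f1={micro_f1:.3f}")
```

For single-label multi-class data like this, the micro-averaged precision, recall, and F1 all equal accuracy, because every false positive for one class is simultaneously a false negative for another; macro F1 can differ whenever per-class performance is uneven.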
High values (close to 1) indicate better performance. Compare metrics across classes to identify where the model performs well or struggles. Macro averages treat all classes equally, while micro averages are influenced by class sizes.
Q1: When should I use a 3x3 confusion matrix?
A: When evaluating multi-class classification problems with exactly three classes.
Q2: What's the difference between macro and micro averages?
A: Macro averages treat all classes equally, while micro averages are influenced by class imbalance.
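A quick hypothetical illustration: suppose a rare class has recall 0.20 while two large classes each have recall 0.90. The macro-averaged recall is (0.20 + 0.90 + 0.90) / 3 ≈ 0.67, which exposes the weak class, whereas the micro average pools the raw counts and stays close to 0.90 because the large classes dominate.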
Q3: How do I improve my model based on these results?
A: Look for classes with low precision (many false positives) or low recall (many false negatives) to focus improvement efforts.
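As a hypothetical sketch (weakest_spots is an illustrative helper, not part of the calculator), both trouble spots can be read straight off the matrix:

```python
import numpy as np

def weakest_spots(cm):
    """Find the class the model misses most and the most frequent confusion."""
    tp = np.diag(cm).astype(float)
    recall = tp / cm.sum(axis=1)                 # per-class recall
    worst_class = int(recall.argmin())           # class missed most often
    off = cm.astype(float).copy()
    np.fill_diagonal(off, 0)                     # ignore correct predictions
    actual, predicted = np.unravel_index(off.argmax(), off.shape)
    return worst_class, (int(actual), int(predicted))

cm = np.array([[50, 3, 2], [5, 40, 5], [2, 8, 45]])
worst, (a, p) = weakest_spots(cm)
print(f"lowest recall: class {worst}; most frequent mix-up: actual {a} -> predicted {p}")
```

For the example matrix above, class B has the lowest recall (40/50 = 0.80) and the largest single error is actual C predicted as B (8 counts), so those cells are where targeted data collection or feature work would pay off first.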
Q4: What does an F1 score of 0.5 mean?
A: Moderate performance. Because F1 is the harmonic mean of precision and recall, a score of 0.5 can come from a balanced pair (precision and recall both 0.5) or a skewed one (for example, precision 1.0 with recall 1/3 also gives F1 = 0.5).
Q5: Can I use this for binary classification?
A: While possible, it's better to use a 2x2 confusion matrix calculator for binary problems.