I know that when a model is made to predict a float value, a common approach to report the model's validation performance is to use the k-fold technique and average the accuracy across all folds (here is a similar question).

Now suppose that my model is a classifier and each fold outputs a confusion matrix. How can I combine the confusion matrices?

morteza

1 Answer


You can just sum all the cells across folds: for every true class $T_i$ and every predicted class $P_j$, the combined count in cell $(T_i, P_j)$ is the sum of that cell over fold 1, fold 2, ..., fold N.

This works because each fold's confusion matrix is computed on that fold's test set, and by construction the union of the test sets across folds is the full dataset. This way you can report performance for the full dataset, without any data leakage of course.
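A minimal sketch of the summing approach, assuming scikit-learn; the dataset and classifier here are just illustrative placeholders:

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import StratifiedKFold

X, y = load_iris(return_X_y=True)
n_classes = len(np.unique(y))
total = np.zeros((n_classes, n_classes), dtype=int)

for train_idx, test_idx in StratifiedKFold(n_splits=5).split(X, y):
    model = LogisticRegression(max_iter=1000).fit(X[train_idx], y[train_idx])
    pred = model.predict(X[test_idx])
    # Passing labels=... keeps the matrix shape fixed even if a fold
    # happens to miss some class in its test set.
    total += confusion_matrix(y[test_idx], pred, labels=np.arange(n_classes))

# Entry (i, j) of `total` counts samples of true class i predicted as j,
# accumulated over all folds. Because the test sets partition the data,
# total.sum() equals the number of samples in the full dataset.
print(total)
```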

Note that a confusion matrix is not an evaluation measure by itself, though.

Erwan