
Is f1_score(average='micro') always the same as accuracy, or is it only the same in this particular case?

I have tried it with different values and they give the same answer, but I don't have an analytical demonstration. A quick randomized check is sketched after the output below.

from sklearn.metrics import accuracy_score
from sklearn.metrics import f1_score

y_true = [0, 1, 2, 0, 1, 2]
y_pred = [0, 2, 1, 0, 0, 1]
print(f1_score(y_true, y_pred, average='micro'))
print(accuracy_score(y_true, y_pred))

0.3333333

0.3333333
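Beyond the single example above, here is a minimal sketch (my addition, assuming NumPy and scikit-learn are available) that compares the two scores on several randomly generated single-label predictions:

import numpy as np
from sklearn.metrics import accuracy_score, f1_score

rng = np.random.default_rng(0)
for _ in range(5):
    # random single-label multi-class ground truth and predictions
    y_true = rng.integers(0, 4, size=100)
    y_pred = rng.integers(0, 4, size=100)
    assert np.isclose(f1_score(y_true, y_pred, average='micro'),
                      accuracy_score(y_true, y_pred))
print('micro-F1 matched accuracy on every random trial')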

Carlos Mougan

1 Answer


In classification tasks for which every test case is guaranteed to be assigned to exactly one class, micro-F is equivalent to accuracy.

The above answer is from: https://stackoverflow.com/questions/37358496/is-f1-micro-the-same-as-accuracy

More detailed explanation: https://simonhessner.de/why-are-precision-recall-and-f1-score-equal-when-using-micro-averaging-in-a-multi-class-problem/
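To see why, note that in single-label multi-class classification every misclassified sample counts as exactly one false positive (for the predicted class) and exactly one false negative (for the true class). Summed over all classes, total FP = total FN = number of errors, while total TP = number of correct predictions, so micro-precision and micro-recall both reduce to correct/total, i.e. accuracy, and micro-F1, the harmonic mean of two equal values, equals accuracy as well. The sketch below (my addition; the helper name micro_scores_from_confusion is just illustrative, not a scikit-learn function) computes these totals from the confusion matrix and compares them with accuracy_score:

import numpy as np
from sklearn.metrics import confusion_matrix, accuracy_score, f1_score

def micro_scores_from_confusion(y_true, y_pred):
    # Illustrative helper: micro-averaged precision/recall/F1 taken straight from the confusion matrix.
    cm = confusion_matrix(y_true, y_pred)
    tp = np.trace(cm)            # diagonal entries = correct predictions
    fp = cm.sum() - tp           # every error is one FP for the predicted class ...
    fn = cm.sum() - tp           # ... and one FN for the true class, so total FP == total FN
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return precision, recall, 2 * precision * recall / (precision + recall)

y_true = [0, 1, 2, 0, 1, 2]
y_pred = [0, 2, 1, 0, 0, 1]

print(micro_scores_from_confusion(y_true, y_pred))   # (0.333..., 0.333..., 0.333...)
print(f1_score(y_true, y_pred, average='micro'))     # 0.333...
print(accuracy_score(y_true, y_pred))                # 0.333...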