Why tensorflow addons F1 Score gave 0 for correct guess?


I am confused. My goal is to train my CNN model with the F1 score as a metric. However, the result is weird:

import tensorflow_addons as tfa
import numpy as np
metric = tfa.metrics.F1Score(
    num_classes=4, threshold=0.5)
y_true = np.array([
    [0, 1, 0, 0],
    # [0, 1, 0, 0],
    # [1, 0, 0, 0]
], np.int32)
y_pred = np.array([
    [0, 1, 0, 0],
    # [0.2, 0.6, 0.2, 0.2],
    # [0.6, 0.2, 0.2, 0.2]
], np.float32)
metric.update_state(y_true, y_pred)
result = metric.result()
result.numpy()

The expected result is

[1, 1, 1, 1]

so when I compute the macro F1 score, I expect 1, not 0.25.

The actual result is

[0, 1, 0, 0]

So, when I use the parameter average='macro', the actual result is 0.25.

EDIT:

I am confused. I added another row to y_true, and it works. I expected it to throw an error because the shapes no longer match, but it does not.

import tensorflow_addons as tfa
import numpy as np
metric = tfa.metrics.F1Score(
    num_classes=4, threshold=0.5)
y_true = np.array([
    [0, 1, 0, 0],
    [1, 0, 0, 0]
    # [0, 1, 0, 0],
    # [1, 0, 0, 0]
], np.int32)
y_pred = np.array([
    [0, 1, 0, 0],
    # [0.2, 0.6, 0.2, 0.2],
    # [0.6, 0.2, 0.2, 0.2]
], np.float32)
metric.update_state(y_true, y_pred)
result = metric.result()
result.numpy()

Is TensorFlow Addons buggy?

CodePudding user response:

There is no issue with tfa.metrics.F1Score. You defined 4 classes, and each element of a y_pred row represents a class probability; a probability is converted to 1 if it is above the threshold, and the F1 score is then computed per class. In your first example, there were no positive labels or predictions for classes 0, 2, and 3. With zero true positives, false positives, and false negatives, their F1 scores are defined as 0, which is why you got [0, 1, 0, 0].
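You can reproduce this by hand with plain NumPy (a minimal sketch of the per-class computation; the variable names are illustrative, not part of the tfa API):

```python
import numpy as np

y_true = np.array([[0, 1, 0, 0]], np.int32)
y_pred = np.array([[0, 1, 0, 0]], np.float32)

# Binarize predictions at the 0.5 threshold, as F1Score does internally
pred = (y_pred > 0.5).astype(np.int32)

tp = np.sum((pred == 1) & (y_true == 1), axis=0)  # true positives per class
fp = np.sum((pred == 1) & (y_true == 0), axis=0)  # false positives per class
fn = np.sum((pred == 0) & (y_true == 1), axis=0)  # false negatives per class

# F1 = 2*TP / (2*TP + FP + FN); classes with no positives anywhere get 0
denom = 2 * tp + fp + fn
f1 = np.where(denom > 0, 2 * tp / np.maximum(denom, 1), 0.0)
print(f1)  # [0. 1. 0. 0.]
```

Only class 1 has any positives at all, so it is the only class with a nonzero F1 score.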

Check the example below:

import tensorflow_addons as tfa
import numpy as np
metric = tfa.metrics.F1Score(
    num_classes=4, threshold=0.5)
y_true = np.array([
    [0, 1, 1, 0],
    [0, 0, 0, 1],
    [1, 0, 1, 0],
], np.int32)
y_pred = np.array([
    [0, 1, 0, 0],
    [0, 1, 0, 0],
    [0.6, 0, 0.51, 0],
], np.float32)
metric.update_state(y_true, y_pred)
metric.result().numpy()
# [1.       , 0.6666667, 0.6666667, 0.       ]
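As a cross-check, the same per-class values come out of scikit-learn after applying the same 0.5 threshold (a sketch assuming scikit-learn is installed; it is not part of the tfa example above):

```python
import numpy as np
from sklearn.metrics import f1_score

y_true = np.array([
    [0, 1, 1, 0],
    [0, 0, 0, 1],
    [1, 0, 1, 0],
], np.int32)
y_pred = np.array([
    [0, 1, 0, 0],
    [0, 1, 0, 0],
    [0.6, 0, 0.51, 0],
], np.float32)

# Binarize at the same 0.5 threshold before handing off to scikit-learn
pred = (y_pred > 0.5).astype(np.int32)

# average=None returns one F1 score per class; zero_division=0 matches
# tfa's convention of reporting 0 for classes with no positive predictions
per_class = f1_score(y_true, pred, average=None, zero_division=0)
print(per_class)  # approximately [1.0, 0.6667, 0.6667, 0.0]
```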