Why are my True/False Values with keras so high?

I have 39209 images for training and I'm trying to get the tp, tn, fp and fn values of my ML model. Shouldn't tp + tn + fp + fn add up to the total number of images? I'm confused about why my true/false values are so high, or is that actually correct?

    import tensorflow as tf
    from tensorflow.keras.models import Sequential
    from tensorflow.keras.layers import Conv2D, Dense, Dropout, Flatten, MaxPooling2D

    model = Sequential()
    model.add(Conv2D(32, kernel_size=(3, 3), activation='relu', input_shape=X_train.shape[1:]))
    model.add(Conv2D(64, (3, 3), activation='relu'))
    model.add(MaxPooling2D(pool_size=(2, 2)))
    model.add(Dropout(0.25))
    model.add(Flatten())
    model.add(Dense(128, activation='relu'))
    model.add(Dropout(0.5))
    model.add(Dense(43, activation='softmax'))  # one output per class, one-hot targets

    model.compile(loss='categorical_crossentropy',
                  optimizer='adam',
                  metrics=[tf.keras.metrics.TruePositives(name='tp'),
                           tf.keras.metrics.TrueNegatives(name='tn'),
                           tf.keras.metrics.FalsePositives(name='fp'),
                           tf.keras.metrics.FalseNegatives(name='fn')])

    model.fit(X_train, y_train, epochs=10)
Epoch 1/10
39209/39209 [==============================] - 16s 414us/sample - loss: 1.2279 - tp: 20191.0000 - tn: 1643982.0000 - fp: 2796.0000 - fn: 19018.0000
Epoch 2/10
39209/39209 [==============================] - 11s 292us/sample - loss: 0.4241 - tp: 31887.0000 - tn: 1644121.0000 - fp: 2657.0000 - fn: 7322.0000
Epoch 3/10
39209/39209 [==============================] - 11s 287us/sample - loss: 0.2881 - tp: 34487.0000 - tn: 1644681.0000 - fp: 2097.0000 - fn: 4722.0000
Epoch 4/10
39209/39209 [==============================] - 11s 287us/sample - loss: 0.2177 - tp: 35696.0000 - tn: 1645046.0000 - fp: 1732.0000 - fn: 3513.0000
Epoch 5/10
39209/39209 [==============================] - 11s 285us/sample - loss: 0.1828 - tp: 36241.0000 - tn: 1645204.0000 - fp: 1574.0000 - fn: 2968.0000
Epoch 6/10
39209/39209 [==============================] - 11s 291us/sample - loss: 0.1457 - tp: 36813.0000 - tn: 1645488.0000 - fp: 1290.0000 - fn: 2396.0000
Epoch 7/10
39209/39209 [==============================] - 11s 290us/sample - loss: 0.1231 - tp: 37267.0000 - tn: 1645655.0000 - fp: 1123.0000 - fn: 1942.0000
Epoch 8/10
39209/39209 [==============================] - 11s 288us/sample - loss: 0.1117 - tp: 37427.0000 - tn: 1645699.0000 - fp: 1079.0000 - fn: 1782.0000
Epoch 9/10
39209/39209 [==============================] - 11s 287us/sample - loss: 0.0990 - tp: 37720.0000 - tn: 1645846.0000 - fp: 932.0000 - fn: 1489.0000
Epoch 10/10
39209/39209 [==============================] - 11s 287us/sample - loss: 0.0867 - tp: 37847.0000 - tn: 1645879.0000 - fp: 899.0000 - fn: 1362.0000

CodePudding user response:

After playing with the confusion matrix from sklearn I figured it out.

The tf.keras.metrics.TruePositives() etc. metrics add the results together across all 43 labels: every sample contributes one binary decision per output unit, so the four counts sum to 39209 × 43 = 1,685,987 each epoch. That is why the totals are so much larger than the number of images.
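
A quick sanity check makes this visible (a minimal sketch; the numbers are copied from the epoch 10 log above):

    # Each of the 39209 samples yields one binary decision per output
    # unit, so the four counts always partition n_samples * n_classes.
    n_samples, n_classes = 39209, 43
    tp, tn, fp, fn = 37847, 1645879, 899, 1362  # epoch 10 values
    assert tp + tn + fp + fn == n_samples * n_classes  # 1685987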

If I instead generate a multi-class confusion matrix and calculate the per-class true/false values with numpy, I get the following results:

    import numpy as np
    from sklearn.metrics import confusion_matrix

    labels = test["ClassId"].values                         # y_true
    clean_preds = np.argmax(model.predict(X_test), axis=1)  # y_pred

    cm = confusion_matrix(labels, clean_preds)
    fp = cm.sum(axis=0) - np.diag(cm)  # column sums minus diagonal
    fn = cm.sum(axis=1) - np.diag(cm)  # row sums minus diagonal
    tp = np.diag(cm)                   # correct predictions per class
    tn = cm.sum() - (fp + fn + tp)     # everything else

    print('True Positive', tp)
    print('True Negative', tn)
    print('False Positive', fp)
    print('False Negative', fn)

This sklearn-based calculation also works with values that come from Keras and with data run through the preprocess() function from the Adversarial Robustness Toolbox.

True Positive [ 59 707 743 430 655 620 132 428 431 479 656 416 682 718 269 210 149 351
 383  60  89  84 109 144  89 476 170  60 147  87 133 266  60 209 119 389
 115  58 684  88  87  45  79]
True Negative [12570 11898 11873 12169 11959 11953 12480 12175 12173 12132 11968 12204
 11937 11907 12360 12413 12478 12269 12229 12562 12522 12538 12509 12477
 12538 12128 12445 12568 12478 12540 12480 12353 12568 12417 12508 12236
 12508 12569 11934 12540 12535 12561 12534]
False Positive [ 0 12  7 11 11 47  0  5  7 18  2  6  3  3  0  7  2  1 11  8 18  2  1  3
  2 22  5  2  2  0  0  7  2  3  2  4  2  1  6  0  5  9  6]
False Negative [ 1 13  7 20  5 10 18 22 19  1  4  4  8  2  1  0  1  9  7  0  1  6 11  6
  1  4 10  0  3  3 17  4  0  1  1  1  5  2  6  2  3 15 11]
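
As a consistency check (a small sketch reusing the arrays above): for every class the four counts partition the test set, e.g. class 0 gives 59 + 12570 + 0 + 1 = 12630 test images.

    # Per class, tp/tn/fp/fn each count every test sample exactly once:
    n_test = len(labels)  # 12630 here
    assert np.all(tp + tn + fp + fn == n_test)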
