Novice [post] Keras custom precision and recall metrics: why do these two indicators always show the same value?

Time:09-18

I copied this metric code from an online source, so it should be correct, but why does the training output show the same value for both metrics? I also defined an f1 metric in the same style, and it shows the same value as well. Does anyone know why?

 from keras import backend as K

 def check_units(y_true, y_pred):
     # If the model outputs more than one column, keep only the positive-class column
     if y_pred.shape[1] != 1:
         y_pred = y_pred[:, 1:2]
         y_true = y_true[:, 1:2]
     return y_true, y_pred

 def precision(y_true, y_pred):
     y_true, y_pred = check_units(y_true, y_pred)
     true_positives = K.sum(K.round(K.clip(y_true * y_pred, 0, 1)))
     predicted_positives = K.sum(K.round(K.clip(y_pred, 0, 1)))
     precision = true_positives / (predicted_positives + K.epsilon())
     return precision

 def recall(y_true, y_pred):
     y_true, y_pred = check_units(y_true, y_pred)
     true_positives = K.sum(K.round(K.clip(y_true * y_pred, 0, 1)))
     possible_positives = K.sum(K.round(K.clip(y_true, 0, 1)))
     recall = true_positives / (possible_positives + K.epsilon())
     return recall
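
The post says an f1 metric was defined "like them" but does not show it. For reference, a common companion definition built on the two functions above looks roughly like this (a sketch of what the poster may have written, not the actual code from the post):

 def f1(y_true, y_pred):
     # Harmonic mean of the precision and recall metrics defined above
     p = precision(y_true, y_pred)
     r = recall(y_true, y_pred)
     return 2 * p * r / (p + r + K.epsilon())

 # Typical way to attach the custom metrics when compiling, e.g.:
 # model.compile(optimizer='adam', loss='categorical_crossentropy',
 #               metrics=['accuracy', precision, recall, f1])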


Below is some of the training output:
 12/846 [...] - ETA: ?    - loss: 5.2509 - accuracy: 0.1624 - precision: 0.1624 - recall: 0.1624 - f1: 0.1624
 13/846 [...] - ETA: 6:23 - loss: 5.2181 - accuracy: 0.1641 - precision: 0.1641 - recall: 0.1641 - f1: 0.1641
 14/846 [...] - ETA: ?    - loss: 5.1733 - accuracy: 0.1666 - precision: 0.1666 - recall: 0.1666 - f1: 0.1666
 15/846 [...] - ETA: 6:05 - loss: 5.1398 - accuracy: 0.1686 - precision: 0.1686 - recall: 0.1686 - f1: 0.1686
 16/846 [...] - ETA: 5:55 - loss: 5.1052 - accuracy: 0.1703 - precision: 0.1703 - recall: 0.1703 - f1: 0.1703
 17/846 [...] - ETA: ?    - loss: 5.0709 - accuracy: 0.1722 - precision: 0.1722 - recall: 0.1722 - f1: 0.1722

CodePudding user response:

You are using the same method for your precision and recall, so the results are bound to be the same.

CodePudding user response:

In reply to the 1st floor:
    you are using the same method for your precision and recall, so the results are bound to be the same
They are not the same method: precision divides by predicted_positives, while recall divides by possible_positives, and those two counts are different.
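
One way to settle this is to evaluate the two functions on a small hand-made batch outside of training (a quick check, assuming TensorFlow 2.x with eager execution and the precision/recall definitions above in scope; the toy numbers are only an illustration, not the poster's data):

 import tensorflow as tf

 # One output column, so check_units leaves the tensors untouched
 y_true = tf.constant([[1.], [1.], [1.], [0.]])
 y_pred = tf.constant([[0.9], [0.2], [0.3], [0.8]])

 print(float(precision(y_true, y_pred)))  # 1 TP / 2 predicted positives = 0.5
 print(float(recall(y_true, y_pred)))     # 1 TP / 3 actual positives  ~ 0.33

 # If the two values differ here but always match during training,
 # the issue is in how the metrics are wired into the model, not in these formulas.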