sklearn.metrics.cohen_kappa_score zero results


I have prediction and label arrays as below, and I was trying to calculate Cohen's kappa (linear and quadratic). Although almost all the predictions are correct, I get a kappa score of 0.0.

labels = [0 0 0 0 0 0 0 1 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0]
preds = [0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0]

from sklearn.metrics import cohen_kappa_score

kappas_linear_cls = cohen_kappa_score(labels, preds, weights='linear')
kappas_quadratic_cls = cohen_kappa_score(labels, preds, weights='quadratic')

linear kappa: 0.0
quadratic kappa: 0.0

I am using the built-in function sklearn.metrics.cohen_kappa_score.

How might I solve this problem?

CodePudding user response:

That result is actually correct.

The observed agreement ratio (P0) is 33/35 (apologies if my count of the total length is slightly off). That is essentially the same thing as accuracy.

The expected agreement (Pe) is the sum, over classes, of the products of the class probabilities in the two vectors: (2/35 * 0) + (33/35 * 1) = 33/35. In other words, if the predictions were randomly shuffled, they would still agree with the labels in 33/35 of the cases.

Kappa is (P0 - Pe) / (1 - Pe) = (33/35 - 33/35) / (2/35) = 0.
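To double-check the arithmetic, here is a minimal sketch (the labels/preds lists below are reconstructed from the arrays in the question, and scikit-learn is assumed to be installed) that computes P0, Pe, and kappa by hand and compares the result with cohen_kappa_score:

from collections import Counter
from sklearn.metrics import cohen_kappa_score

labels = [0]*7 + [1] + [0]*5 + [1] + [0]*21   # 35 items, ones at positions 8 and 14
preds = [0]*35                                # predictions are 0 everywhere

n = len(labels)

# Observed agreement P0: fraction of positions where the two vectors match (same as accuracy).
p0 = sum(l == p for l, p in zip(labels, preds)) / n          # 33/35

# Expected agreement Pe: sum over classes of p(class in labels) * p(class in preds).
label_freq = Counter(labels)
pred_freq = Counter(preds)
classes = set(labels) | set(preds)
pe = sum((label_freq[c] / n) * (pred_freq[c] / n) for c in classes)   # 33/35

kappa_manual = (p0 - pe) / (1 - pe)                          # (33/35 - 33/35) / (2/35) = 0.0
kappa_sklearn = cohen_kappa_score(labels, preds)

print(p0, pe, kappa_manual, kappa_sklearn)                   # 0.942... 0.942... 0.0 0.0

Both values come out to 0.0: an "always predict 0" baseline would agree with the labels just as often as these predictions do, so kappa reports zero agreement beyond chance.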
