Example scenarios:

- Perfect agreement
- Random labelling
- Biased but consistent
- Kappa paradox (high percent agreement, low κ)
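The "kappa paradox" scenario deserves a closer look: when both labellers mark nearly everything as one class, chance agreement p_e is already very high, so even a high percent agreement can leave κ near (or below) zero. A minimal sketch, with illustrative cell counts that are not the ones in the matrix below:

```python
# Kappa paradox sketch: heavily imbalanced labels give high percent
# agreement but low (here slightly negative) kappa, because chance
# agreement p_e is already close to 1.

def kappa_stats(a, b, c, d):
    """2x2 cells [[a, b], [c, d]]; returns (p_o, p_e, kappa)."""
    n = a + b + c + d
    p_o = (a + d) / n
    # Chance agreement from the two labellers' marginal frequencies.
    p_e = ((a + b) * (a + c) + (c + d) * (b + d)) / n**2
    return p_o, p_e, (p_o - p_e) / (1 - p_e)

# Both labellers mark almost everything "correct":
p_o, p_e, kappa = kappa_stats(90, 5, 5, 0)
print(f"p_o = {p_o:.3f}, p_e = {p_e:.3f}, kappa = {kappa:.3f}")
# p_o = 0.900, p_e = 0.905, kappa = -0.053
```

Despite 90% raw agreement, κ is slightly negative: the labellers agree marginally less often than their skewed marginals alone would predict.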
Confusion matrix of the two labellers' judgements (counts):

                          Labeller B
                          correct   incorrect
  Labeller A  correct        70          5
  Labeller A  incorrect      10         15

Total: 100
Percent agreement (p_o): 85.0%
Chance agreement (p_e): 65.0%
Cohen's κ: ≈ 0.57

Moderate agreement: the two labellers genuinely agree more often than chance alone would produce.
κ = (p_o − p_e) / (1 − p_e)
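The formula can be computed directly from the confusion matrix. The sketch below is illustrative, not a specific library's API (if you have the raw label arrays rather than a matrix, scikit-learn's `cohen_kappa_score` does the same job):

```python
# Cohen's kappa from a square confusion matrix.

def cohens_kappa(matrix):
    """Return (p_o, p_e, kappa) for matrix[i][j] = count of items
    labelled class i by labeller A and class j by labeller B."""
    n = sum(sum(row) for row in matrix)
    k = len(matrix)
    # Observed agreement: fraction of items on the diagonal.
    p_o = sum(matrix[i][i] for i in range(k)) / n
    # Chance agreement: sum over classes of the product of the two
    # labellers' marginal frequencies.
    row_marg = [sum(row) / n for row in matrix]
    col_marg = [sum(matrix[i][j] for i in range(k)) / n for j in range(k)]
    p_e = sum(row_marg[i] * col_marg[i] for i in range(k))
    kappa = (p_o - p_e) / (1 - p_e)
    return p_o, p_e, kappa

p_o, p_e, kappa = cohens_kappa([[70, 5], [10, 15]])
print(f"p_o = {p_o:.3f}, p_e = {p_e:.3f}, kappa = {kappa:.3f}")
# p_o = 0.850, p_e = 0.650, kappa = 0.571
```

Running this on the matrix above reproduces the reported statistics, and the same function handles more than two classes without change.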