Cohen's Kappa: Weighted Kappa

If there is likely to be much guessing among the raters, it may make sense to use a kappa statistic; but if the raters are well trained and little guessing is likely, the researcher may safely rely on percent agreement to determine interrater reliability. Weighted kappa lets disagreements be weighted differently[21] and is especially useful when the codes are ordered.[8]:66 Three matrices are involved: the matrix of observed scores, the matrix of expected scores based on chance agreement, and the weight matrix. Weight matrix cells located on the diagonal (upper-left to bottom-right) represent agreement and thus contain zeros. Off-diagonal cells contain weights indicating the seriousness of that disagreement. Often, cells one step off the diagonal are weighted 1, those two steps off are weighted 2, and so on.
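As a minimal sketch of the computation described above (the function name and the linear-weight default are illustrative choices, not from the source), weighted kappa can be computed from a square confusion matrix by comparing the weighted observed disagreement with the weighted disagreement expected by chance:

```python
def weighted_kappa(confusion, weights=None):
    """Weighted Cohen's kappa from a square confusion matrix.

    confusion[i][j] = number of items rater A placed in category i
    and rater B placed in category j. If weights is None, linear
    weights |i - j| are used: zeros on the diagonal (agreement),
    1 one step off the diagonal, 2 two steps off, etc.
    """
    k = len(confusion)
    n = sum(sum(row) for row in confusion)
    if weights is None:
        weights = [[abs(i - j) for j in range(k)] for i in range(k)]
    # Marginal totals for each rater, used to build the chance-expected matrix.
    row_tot = [sum(confusion[i]) for i in range(k)]
    col_tot = [sum(confusion[i][j] for i in range(k)) for j in range(k)]
    # Weighted observed disagreement.
    observed = sum(weights[i][j] * confusion[i][j]
                   for i in range(k) for j in range(k))
    # Weighted disagreement expected under chance agreement
    # (expected count in cell (i, j) is row_tot[i] * col_tot[j] / n).
    expected = sum(weights[i][j] * row_tot[i] * col_tot[j] / n
                   for i in range(k) for j in range(k))
    return 1 - observed / expected
```

With this convention, perfect agreement (all counts on the diagonal) yields kappa 1, and a table whose observed disagreement matches the chance expectation yields kappa 0.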