Inter-rater analysis using the kappa coefficient

For detailed information, please read the following page.

Fleiss’ kappa

Fleiss’ kappa (see also Cohen’s kappa) is a statistical measure of the extent of agreement between raters. Unlike Cohen’s kappa, which is limited to two raters, it works for any number of raters assigning categorical ratings to a fixed number of items.
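As a minimal sketch, the standard Fleiss’ kappa calculation can be written directly from its definition: compute the mean observed agreement across items, the chance agreement implied by the overall category proportions, and take their normalized difference. The function name and the (items × categories) count-matrix layout below are illustrative choices, not part of any particular tool’s API.

```python
from typing import List

def fleiss_kappa(counts: List[List[int]]) -> float:
    """Fleiss' kappa for a count matrix of shape (n_items, n_categories),
    where counts[i][j] is the number of raters who assigned item i to
    category j. Every item must be rated by the same number of raters."""
    n_items = len(counts)
    n_raters = sum(counts[0])
    n_cats = len(counts[0])

    # P_i: observed agreement on each item, then its mean over items
    p_i = [
        (sum(c * c for c in row) - n_raters) / (n_raters * (n_raters - 1))
        for row in counts
    ]
    p_bar = sum(p_i) / n_items

    # p_j: proportion of all assignments falling in each category;
    # P_e: agreement expected by chance under those proportions
    p_j = [
        sum(row[j] for row in counts) / (n_items * n_raters)
        for j in range(n_cats)
    ]
    p_e = sum(p * p for p in p_j)

    return (p_bar - p_e) / (1 - p_e)
```

For example, on the classic 14-rater, 5-category dataset from Fleiss (1971), this returns a kappa of about 0.21, indicating fair agreement beyond chance.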

This value is reported in a separate summary table.