Kappa measure for inter-judge (dis)agreement

  • Kappa measure

    • Agreement measure among judges

    • Designed for categorical judgments

    • Corrects for chance agreement

  • Kappa = [ P(A) – P(E) ] / [ 1 – P(E) ]

  • P(A) – observed proportion of judgments on which the judges agree

  • P(E) – proportion of agreement expected by chance

  • Kappa = 0 for pure chance agreement, 1 for total agreement (a worked example follows below).
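
To make the formula concrete, here is a minimal Python sketch of the kappa computation for two judges making binary relevance judgments (R = relevant, N = nonrelevant). The judgment lists judge1 and judge2 are hypothetical illustration data, not taken from the slide.

```python
from collections import Counter

# Hypothetical relevance judgments from two judges over the same 10 items
judge1 = ["R", "R", "N", "R", "N", "R", "N", "N", "R", "R"]
judge2 = ["R", "N", "N", "R", "N", "R", "R", "N", "R", "R"]

n = len(judge1)

# P(A): observed proportion of items on which the judges agree
p_agree = sum(a == b for a, b in zip(judge1, judge2)) / n

# P(E): agreement expected by chance, from each judge's marginal label rates
c1, c2 = Counter(judge1), Counter(judge2)
labels = set(judge1) | set(judge2)
p_chance = sum((c1[l] / n) * (c2[l] / n) for l in labels)

# Kappa = [ P(A) - P(E) ] / [ 1 - P(E) ]
kappa = (p_agree - p_chance) / (1 - p_chance)

print(f"P(A) = {p_agree:.2f}, P(E) = {p_chance:.2f}, kappa = {kappa:.2f}")
# -> P(A) = 0.80, P(E) = 0.52, kappa = 0.58
```

With 8 of 10 agreements and both judges labeling 6 items relevant, chance agreement is 0.6*0.6 + 0.4*0.4 = 0.52, so kappa corrects the raw 0.80 agreement down to about 0.58.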

