Article Open access Peer-reviewed

The Kappa Statistic: A Second Look

2004; Association for Computational Linguistics; Volume: 30; Issue: 1 Language: English

10.1162/089120104773633402

ISSN

1530-9312

Authors

Barbara Di Eugenio, Michael Glass

Topic(s)

Statistical Methods in Epidemiology

Abstract

In recent years, the kappa coefficient of agreement has become the de facto standard for evaluating intercoder agreement for tagging tasks. In this squib, we highlight issues that affect κ and that the community has largely neglected. First, we discuss the assumptions underlying different computations of the expected agreement component of κ. Second, we discuss how prevalence and bias affect the κ measure.
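The abstract's first point concerns competing definitions of the expected (chance) agreement term in κ. As a hedged illustration, not taken from the paper itself, the sketch below contrasts two common conventions for two coders: Cohen's κ, which estimates a separate chance distribution per coder, and the pooled-distribution variant (Scott's π, as used in Siegel and Castellan's K), which estimates a single distribution from both coders' labels. The function name and keyword values are assumptions for this sketch.

```python
from collections import Counter

def kappa(ann1, ann2, expected="cohen"):
    """Agreement coefficient (P_obs - P_exp) / (1 - P_exp) for two coders.

    expected="cohen":  per-coder category distributions (Cohen's kappa).
    expected="pooled": one distribution pooled over both coders
                       (Scott's pi / Siegel & Castellan's K).
    """
    assert len(ann1) == len(ann2)
    n = len(ann1)
    # Observed agreement: fraction of items labeled identically.
    p_obs = sum(a == b for a, b in zip(ann1, ann2)) / n
    if expected == "cohen":
        c1, c2 = Counter(ann1), Counter(ann2)
        # Chance agreement from each coder's own label distribution.
        p_exp = sum((c1[k] / n) * (c2[k] / n) for k in c1.keys() | c2.keys())
    else:
        # Chance agreement from the labels of both coders pooled together.
        pooled = Counter(ann1) + Counter(ann2)
        p_exp = sum((v / (2 * n)) ** 2 for v in pooled.values())
    return (p_obs - p_exp) / (1 - p_exp)
```

On the same data the two conventions can yield different values whenever the coders' marginal distributions differ, which is one reason the choice of expected-agreement model matters.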
