Book chapter Open access Peer-reviewed

Assessors Agreement: A Case Study Across Assessor Type, Payment Levels, Query Variations and Relevance Dimensions

2016; Springer Science+Business Media; Language: English

10.1007/978-3-319-44564-9_4

ISSN

1611-3349

Authors

João Palotti, Guido Zuccon, Johannes Bernhardt, Allan Hanbury, Lorraine Goeuriot

Topic(s)

Topic Modeling

Abstract

Relevance assessments are the cornerstone of Information Retrieval evaluation. Yet, there is only limited understanding of how assessment disagreement influences the reliability of the evaluation in terms of system rankings. In this paper we examine the role of assessor type (expert vs. layperson), payment level (paid vs. unpaid), query variations and relevance dimensions (topicality and understandability), and their influence on system evaluation in the presence of disagreements across assessments obtained in the different settings. The analysis is carried out in the context of the CLEF 2015 eHealth Task 2 collection and shows that disagreements between assessors belonging to the same group have little impact on evaluation. It also shows, however, that assessment disagreement across settings has a major impact on evaluation when topical relevance is considered, while it has no impact when understandability assessments are considered.
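The abstract does not spell out how the impact of disagreement on system rankings was quantified. A common approach in IR evaluation, sketched below as a minimal Python example, is to score the same set of systems against two sets of relevance judgments (e.g. expert vs. layperson) and compare the resulting rankings with a rank correlation such as Kendall's tau; the system names and scores used here are hypothetical, not taken from the paper.

```python
# Illustrative sketch only: score systems under two assessment settings and
# measure how similarly the two settings rank them.
from scipy.stats import kendalltau

# Hypothetical per-system scores (e.g. mean P@10) computed with judgments
# from two assessor groups; names and values are made up for illustration.
scores_expert    = {"sysA": 0.42, "sysB": 0.38, "sysC": 0.31, "sysD": 0.27}
scores_layperson = {"sysA": 0.40, "sysB": 0.30, "sysC": 0.33, "sysD": 0.25}

systems = sorted(scores_expert)
tau, p_value = kendalltau(
    [scores_expert[s] for s in systems],
    [scores_layperson[s] for s in systems],
)

# A tau close to 1 means the two sets of judgments rank systems almost
# identically, i.e. the assessment disagreement has little effect on the
# evaluation outcome; a low tau means the disagreement changes the ranking.
print(f"Kendall tau = {tau:.3f} (p = {p_value:.3f})")
```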
