Kappa = 0.776. Convention: Kappa > 0.8 indicates good agreement; 0.67 < Kappa < 0.8 supports only tentative conclusions.


Introduction to Information Retrieval — Sec. 8.5: Test Collections

From document collections to test collections
- Still need:
  - Test queries
  - Relevance assessments
- Test queries
  - Must be germane to the docs available
  - Best designed by domain experts
  - Random query terms are generally not a good idea
- Relevance assessments
  - Human judges, time-consuming
  - Are human panels perfect?

Kappa measure for inter-judge (dis)agreement
- An agreement measure among judges
- Designed for categorical judgments
- Corrects for chance agreement

Kappa = [ P(A) - P(E) ] / [ 1 - P(E) ]
- P(A): proportion of the time the judges agree
- P(E): what agreement would be expected by chance
- Kappa = 0 for chance agreement, 1 for total agreement

Kappa Measure: Example

Number of docs          Judge 1: Relevant   Judge 1: Nonrelevant
Judge 2: Relevant              300                  20
Judge 2: Nonrelevant            10                  70

P(A)? P(E)?
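The P(A)/P(E) exercise above can be worked through in a short sketch. This assumes the pooled-marginals variant of kappa used in the IIR textbook (marginal probabilities estimated by pooling both judges' counts); the function name and table layout are my own choices, not from the slides.

```python
def kappa(table):
    """Cohen's kappa for a 2x2 judgment table, pooled-marginals variant.

    table[i][j] = number of docs labeled i by Judge 2 and j by Judge 1,
    with index 0 = Relevant and 1 = Nonrelevant.
    """
    total = sum(sum(row) for row in table)
    # P(A): proportion of docs on which the judges agree (the diagonal)
    p_agree = (table[0][0] + table[1][1]) / total
    # P(E): chance agreement, with P(relevant) estimated by pooling
    # both judges' "relevant" counts over all 2 * total judgments
    p_relevant = (sum(table[0]) + table[0][0] + table[1][0]) / (2 * total)
    p_nonrelevant = 1 - p_relevant
    p_chance = p_relevant ** 2 + p_nonrelevant ** 2
    return (p_agree - p_chance) / (1 - p_chance)

# The slide's example: rows are Judge 2, columns are Judge 1
table = [[300, 20], [10, 70]]
print(round(kappa(table), 3))  # → 0.776
```

Here P(A) = 370/400 = 0.925, the pooled P(relevant) = 630/800 = 0.7875, so P(E) = 0.7875² + 0.2125² ≈ 0.665, giving Kappa ≈ 0.776 — in the "tentative conclusions" band (0.67–0.8) rather than the "good agreement" band (> 0.8).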

This document was uploaded on 02/26/2014.
