Agreement models for multiraters


Saracbasi T.

TURKISH JOURNAL OF MEDICAL SCIENCES, vol.41, no.5, pp.939-944, 2011 (SCI-Expanded)

  • Publication Type: Article
  • Volume: 41 Issue: 5
  • Publication Date: 2011
  • DOI Number: 10.3906/sag-1006-891
  • Journal Name: TURKISH JOURNAL OF MEDICAL SCIENCES
  • Journal Indexes: Science Citation Index Expanded (SCI-EXPANDED), Scopus, TR DİZİN (ULAKBİM)
  • Page Numbers: pp.939-944
  • Hacettepe University Affiliated: No

Abstract

Aim: Agreement between 2 or more independent raters evaluating the same items on the same scale can be measured by the kappa coefficient. In recent years, modeling agreement among raters has been preferred over summary indices. In this study, the disadvantages of kappa are reviewed, agreement models are introduced, and these models are applied to a real data set.
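
The kappa referred to in the abstract is the familiar chance-corrected agreement statistic, kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed proportion of agreement and p_e is the agreement expected by chance from the raters' marginal distributions. The article itself favors agreement models over this summary index, but as a point of reference, the sketch below (not taken from the article; the toy ratings are made up) computes Cohen's kappa for two raters from first principles.

```python
# Minimal sketch (not from the paper): Cohen's kappa for two raters
# classifying the same items on a nominal scale.
from collections import Counter

def cohen_kappa(ratings_a, ratings_b):
    """Chance-corrected agreement between two raters over the same items."""
    n = len(ratings_a)
    categories = set(ratings_a) | set(ratings_b)

    # Observed proportion of items on which the raters agree.
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n

    # Agreement expected by chance, from each rater's marginal distribution.
    freq_a = Counter(ratings_a)
    freq_b = Counter(ratings_b)
    p_e = sum((freq_a[c] / n) * (freq_b[c] / n) for c in categories)

    return (p_o - p_e) / (1 - p_e)

# Hypothetical example: two raters classifying 10 items into categories 1-3.
rater_1 = [1, 2, 3, 1, 2, 2, 3, 1, 2, 3]
rater_2 = [1, 2, 3, 1, 3, 2, 3, 1, 2, 2]
print(round(cohen_kappa(rater_1, rater_2), 3))  # 0.697
```

For these toy data, the observed agreement is 0.8 and the chance agreement 0.34, giving kappa of about 0.697. The agreement models discussed in the paper go further by modeling the structure of the rater-by-rater table rather than reducing it to a single index.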