## Assessing interrater agreement on binary measurements via intraclass odds ratio.

### Details

Serval ID

serval:BIB_64C8711553C8

Type

**Article**: article from journal or magazine.

Collection

Publications

Title

Assessing interrater agreement on binary measurements via intraclass odds ratio.

Journal

Biometrical Journal. Biometrische Zeitschrift

ISSN

1521-4036 (Electronic)

ISSN-L

0323-3847

Publication state

Published

Issued date

2016

Volume

58

Number

4

Pages

962-973

Language

English

Abstract

Interrater agreement on binary measurements is usually assessed via Scott's π or Cohen's κ, which are known to be difficult to interpret. One reason for this difficulty is that these coefficients can be defined as a correlation between two exchangeable measurements made on the same subject, that is, as an "intraclass correlation", a concept originally defined for continuous measurements. To measure an association between two binary variables, however, it is more common to calculate an odds ratio than a correlation. For assessing interrater agreement on binary measurements, we thus suggest calculating the odds ratio between two exchangeable measurements made on the same subject, yielding the concept of an "intraclass odds ratio". Since it is interpretable as a ratio of the probabilities of (strict) concordance and discordance (between two raters rating two subjects), an intraclass odds ratio may be easier for researchers and clinicians to understand than an intraclass correlation. It may thus be a valuable descriptive measure (summary index) for evaluating agreement among a set of raters, without having to refer to arbitrary benchmark values. To facilitate its use, an explicit formula for calculating a confidence interval for the intraclass odds ratio is also provided.
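As a rough illustration of the quantity described in the abstract — a sketch under stated assumptions, not the paper's exact estimator — suppose each subject is rated by two exchangeable raters. Exchangeability forces the two discordant cell probabilities to be equal, so the intraclass odds ratio can be estimated as p̂₁₁·p̂₀₀ / p̂d², where p̂d pools the two discordant cells. The function name and data layout below are hypothetical.

```python
# Illustrative sketch: plug-in estimate of an intraclass odds ratio for
# two exchangeable binary ratings per subject. Not the paper's estimator
# or confidence-interval formula; naming and layout are assumptions.

def intraclass_odds_ratio(ratings):
    """ratings: list of (rater_a, rater_b) tuples of 0/1, one per subject."""
    n = len(ratings)
    p11 = sum(1 for a, b in ratings if a == 1 and b == 1) / n  # both positive
    p00 = sum(1 for a, b in ratings if a == 0 and b == 0) / n  # both negative
    # Pool the two discordant cells: exchangeability implies p10 = p01 = pd.
    pd = sum(1 for a, b in ratings if a != b) / (2 * n)
    return (p11 * p00) / (pd * pd)

# Example: 10 subjects, 8 concordant pairs and 2 discordant pairs.
pairs = [(1, 1)] * 4 + [(0, 0)] * 4 + [(1, 0), (0, 1)]
print(intraclass_odds_ratio(pairs))  # 0.4 * 0.4 / 0.1**2 = 16
```

A value of 1 would indicate ratings no more concordant than chance pairings; larger values indicate stronger agreement.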

Create date

01/04/2016 15:51

Last modification date

18/11/2016 14:49