Kappa statistic
Revision as of 12:30, 27 December 2007
Interpretation
Landis and Koch[1] proposed the schema in the table below for interpreting κ values.
| κ | Interpretation |
|---|---|
| < 0 | Poor agreement |
| 0.00 – 0.20 | Slight agreement |
| 0.21 – 0.40 | Fair agreement |
| 0.41 – 0.60 | Moderate agreement |
| 0.61 – 0.80 | Substantial agreement |
| 0.81 – 1.00 | Almost perfect agreement |
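As a sketch of how the table above might be applied in practice, the following computes Cohen's kappa for two raters, κ = (p_o − p_e) / (1 − p_e), where p_o is the observed agreement and p_e the agreement expected by chance from each rater's marginal label frequencies, and then looks up the Landis and Koch label. The function names and the example rater data are illustrative, not from the original article.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters over the same items."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed agreement: fraction of items labeled identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: from each rater's marginal label frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

def landis_koch_label(kappa):
    """Map a kappa value to the Landis and Koch interpretation above."""
    if kappa < 0:
        return "Poor agreement"
    for upper, label in [(0.20, "Slight agreement"),
                         (0.40, "Fair agreement"),
                         (0.60, "Moderate agreement"),
                         (0.80, "Substantial agreement"),
                         (1.00, "Almost perfect agreement")]:
        if kappa <= upper:
            return label

# Hypothetical example: two raters label 8 items, agreeing on 6.
a = ["yes", "yes", "no", "yes", "no", "no", "yes", "no"]
b = ["yes", "no", "no", "yes", "no", "yes", "yes", "no"]
k = cohens_kappa(a, b)
print(k, landis_koch_label(k))  # p_o = 0.75, p_e = 0.5, so kappa = 0.5
```

Here the raters agree on 75% of items, but half that agreement is expected by chance, giving κ = 0.5, which the table classifies as moderate agreement.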
References
1. Landis JR, Koch GG (1977). "The measurement of observer agreement for categorical data". Biometrics 33 (1): 159–74. PMID 843571.