Table 4

Interpretation of agreement by kappa (κ) values34

κ value       Strength of agreement
< 0.00        No agreement
0.01–0.20     Slight
0.21–0.40     Fair
0.41–0.60     Moderate
0.61–0.80     Substantial
0.81–1.00     Almost perfect
  • Agreement among raters was calculated using the Fleiss κ statistic; the resulting values range from no agreement to almost perfect agreement.
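As a minimal sketch, the bands in Table 4 can be expressed as a lookup function. The function name is illustrative, and a κ of exactly 0.00 is assigned to the 0.01–0.20 band here by assumption, since the table leaves that point unassigned.

```python
def interpret_kappa(kappa: float) -> str:
    """Map a kappa (κ) value to the strength-of-agreement label in Table 4."""
    if kappa < 0.0:
        return "No agreement"
    if kappa <= 0.20:      # assumption: κ = 0.00 falls in the 0.01–0.20 band
        return "Slight"
    if kappa <= 0.40:
        return "Fair"
    if kappa <= 0.60:
        return "Moderate"
    if kappa <= 0.80:
        return "Substantial"
    return "Almost perfect"
```

For example, a Fleiss κ of 0.55 would be reported as moderate agreement under this scale.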