
Table 2 Summary of Inter-auditor reliability

From: Intra- and inter-rater reliability of an electronic health record audit used in a chiropractic teaching clinic system: an observational study

| Auditors | Objective items (n = 20): % agreement (95% CI) | Objective items (n = 20): kappa (95% CI) | Subjective items (n = 41): % agreement (95% CI) | Subjective items (n = 41): kappa (95% CI) |
| --- | --- | --- | --- | --- |
| 1 and 2 | 88 (85, 91) | 0.72 (0.65, 0.78) | 71 (68, 73) | 0.46 (0.41, 0.51) |
| 1 and 3 | 78 (74, 81) | 0.50 (0.43, 0.57) | 73 (70, 75) | 0.47 (0.42, 0.52) |
| 2 and 3 | 81 (77, 85) | 0.57 (0.50, 0.64) | 67 (64, 69) | 0.41 (0.36, 0.45) |
| Three-auditor | 82 (80, 84) | 0.59 (0.53, 0.66) | 70 (69, 72) | 0.44 (0.40, 0.48) |
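The pairwise figures above combine raw percent agreement with Cohen's kappa, which corrects agreement for chance. As a minimal sketch of how such pairwise statistics can be computed, assuming binary item-level audit scores (the variable names and the ten-item toy data below are hypothetical, not the study's data):

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score

# Hypothetical binary audit scores (1 = item documented, 0 = not documented)
# for the same records rated independently by two auditors.
auditor_1 = np.array([1, 1, 0, 1, 0, 1, 1, 0, 1, 1])
auditor_2 = np.array([1, 0, 0, 1, 0, 1, 1, 1, 1, 1])

# Raw percent agreement: share of items the two auditors scored identically.
pct_agreement = 100 * np.mean(auditor_1 == auditor_2)

# Cohen's kappa: observed agreement corrected for chance agreement.
kappa = cohen_kappa_score(auditor_1, auditor_2)

print(f"% agreement: {pct_agreement:.0f}")
print(f"kappa: {kappa:.2f}")
```

The 95% confidence intervals reported in the table would come from an interval estimator for these statistics (for example, a bootstrap over audited records); the sketch above shows only the point estimates.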