
Table 3 The inter-rater agreement among 14 pharmacists for three sets of prescription evaluations

From: Evaluation of the reliability of the criteria for assessing prescription quality in Chinese hospitals among pharmacists in China

| Comparator | TPs | HPs | CPs |
|---|---|---|---|
| Inter-rater agreement, Fleiss’ kappa (95% CI) | | | |
|   Set 1 | 0.44 (0.40–0.48) | 0.62 (0.50–0.73) | 0.38 (0.32–0.44) |
|   Set 2 | 0.51 (0.47–0.54) | 0.62 (0.50–0.73) | 0.47 (0.41–0.53) |
|   Set 3 | 0.68 (0.64–0.72) | 0.69 (0.57–0.80) | 0.69 (0.63–0.75) |
| Accuracy rate (%) | | | |
|   Set 1 | 61.22 | 67.14 | 57.94 |
|   Set 2 | 65.76 | 73.10 | 61.69 |
|   Set 3 | 80.54 ∆, # | 82.76 | 79.31 ∆, # |

  1. TPs, total pharmacists; HPs, hospital pharmacists; CPs, community pharmacists
  2. ∆ p < 0.05, Set 3 vs. Set 1; # p < 0.05, Set 3 vs. Set 2
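The kappa values in the table follow the standard Fleiss' kappa definition: mean observed per-subject agreement corrected for chance agreement. A minimal sketch of that computation in Python, using the classic 14-rater, 5-category worked example as illustrative toy data (not the study's actual ratings):

```python
def fleiss_kappa(counts):
    """Fleiss' kappa for a subjects-by-categories table of rating counts.

    Each row holds how many raters assigned that subject to each category;
    every row must sum to the same number of raters n.
    """
    N = len(counts)                     # number of subjects
    n = sum(counts[0])                  # raters per subject
    total = N * n                       # total assignments
    # Marginal proportion of assignments falling in each category
    p = [sum(row[j] for row in counts) / total for j in range(len(counts[0]))]
    # Observed agreement for each subject
    P = [(sum(c * c for c in row) - n) / (n * (n - 1)) for row in counts]
    P_bar = sum(P) / N                  # mean observed agreement
    P_e = sum(pj * pj for pj in p)      # expected chance agreement
    return (P_bar - P_e) / (1 - P_e)

# Classic illustrative dataset: 10 items, 14 raters, 5 categories
ratings = [
    [0, 0, 0, 0, 14],
    [0, 2, 6, 4, 2],
    [0, 0, 3, 5, 6],
    [0, 3, 9, 2, 0],
    [2, 2, 8, 1, 1],
    [7, 7, 0, 0, 0],
    [3, 2, 6, 3, 0],
    [2, 5, 3, 2, 2],
    [6, 5, 2, 1, 0],
    [0, 2, 2, 3, 7],
]
print(round(fleiss_kappa(ratings), 3))  # → 0.21
```

Confidence intervals such as those reported in the table are typically obtained from the kappa standard error or by bootstrapping over subjects; the sketch above computes only the point estimate.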