The overall percentage agreement
rate was calculated as the proportion of all observations that matched the modal result.
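This modal-agreement calculation can be sketched in a few lines; a minimal illustration, assuming ratings are grouped per case (the function name and data are hypothetical, not the study's):

```python
from statistics import mode

def modal_agreement(ratings_per_case):
    """Overall percentage agreement with the modal result.

    ratings_per_case: list of lists, one inner list of observer ratings
    per case. Returns the proportion of all individual ratings that
    match the modal (most frequent) rating for their case.
    """
    total = sum(len(case) for case in ratings_per_case)
    matches = sum(case.count(mode(case)) for case in ratings_per_case)
    return matches / total

# Three cases, each rated by three observers (illustrative values).
cases = [[1, 1, 2], [3, 3, 3], [2, 1, 2]]
print(f"modal agreement = {modal_agreement(cases):.2%}")
```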
A kappa statistic test and percentage agreement
calculation were performed, and the kappa coefficient was derived.
Wilcoxon signed-rank tests and percentage agreement
calculations were used for comparisons between the radiographs and the gold-standard observations and between the orthogonal and mesio-angulated radiographs, respectively.
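The paired comparison described above can be reproduced with SciPy's signed-rank test; a sketch with made-up rating scores (the arrays and variable names are illustrative, not the study's data):

```python
from scipy.stats import wilcoxon

# Hypothetical paired scores for the same items read on orthogonal
# versus mesio-angulated radiographs.
orthogonal = [3, 2, 4, 3, 5, 2, 4, 3]
mesio_angulated = [3, 3, 4, 2, 5, 2, 3, 3]

# Tests whether the paired differences are symmetric about zero;
# zero-difference pairs are dropped under the classic Wilcoxon treatment.
stat, p_value = wilcoxon(orthogonal, mesio_angulated, zero_method="wilcox")
print(f"W = {stat}, p = {p_value:.3f}")
```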
For 4 items, k values were slightly below this threshold but raw percentage agreement
was high (90%-94%); therefore, these items were also included (online Technical Appendix Table 3).
In the present study, percentage agreement
was selected as the approach to establishing reliability.
Inter-observer percentage agreement
for the cephalic end vertebra ranged from 36% to 71%, and for the caudal end vertebra from 36% to 79% (Table 1).
Kappa describes agreement beyond chance relative to perfect agreement beyond chance (as opposed to percentage agreement
, which describes agreement relative to perfect agreement).
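That relation can be made concrete for two raters; a minimal sketch (the rating lists are illustrative), with chance agreement estimated from the raters' marginal frequencies as in Cohen's kappa:

```python
from collections import Counter

def percent_agreement(a, b):
    """Observed agreement p_o: share of items both raters label identically."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def cohens_kappa(a, b):
    """kappa = (p_o - p_e) / (1 - p_e): agreement beyond chance,
    scaled by the maximum possible agreement beyond chance."""
    n = len(a)
    p_o = percent_agreement(a, b)
    ca, cb = Counter(a), Counter(b)
    # Expected chance agreement from the raters' marginal frequencies.
    p_e = sum((ca[k] / n) * (cb[k] / n) for k in set(a) | set(b))
    return (p_o - p_e) / (1 - p_e)

r1 = ["yes", "yes", "no", "yes", "no", "no"]
r2 = ["yes", "no", "no", "yes", "no", "yes"]
print(percent_agreement(r1, r2), cohens_kappa(r1, r2))
```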
Interobserver reliability was measured using the simple percentage agreement
, Cohen's kappa coefficient, and Gwet's AC1 statistic, as well as the corresponding confidence intervals (CIs).
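For comparison with the statistics named above, Gwet's AC1 for two raters can be sketched from its published formula (the data here are illustrative, and no confidence interval is computed):

```python
from collections import Counter

def gwet_ac1(a, b):
    """Gwet's AC1 for two raters and categorical ratings.

    p_a  : observed agreement
    pi_q : (rater A's count of q + rater B's count of q) / (2N)
    p_e  : sum over categories of pi_q * (1 - pi_q), divided by Q - 1
    AC1  : (p_a - p_e) / (1 - p_e)
    """
    n = len(a)
    p_a = sum(x == y for x, y in zip(a, b)) / n
    counts = Counter(a) + Counter(b)
    cats = set(a) | set(b)
    p_e = sum((counts[q] / (2 * n)) * (1 - counts[q] / (2 * n))
              for q in cats) / (len(cats) - 1)
    return (p_a - p_e) / (1 - p_e)

r1 = ["yes", "yes", "no", "yes", "no", "no"]
r2 = ["yes", "no", "no", "yes", "no", "yes"]
print(f"AC1 = {gwet_ac1(r1, r2):.3f}")
```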
Percentage agreement
(A) between the raters was calculated by dividing the observed agreement (O) by the possible agreement (P).
Percentage agreement
and Cohen's kappa were calculated.
Excellent inter-observer agreement was seen for the presence or absence of nerve root compression (Percentage agreement
= 88.89%; k = 0.774; p = 0.737).
Compared with Lab A, the percentage agreement
for ART eligibility was 81% (i.e.