INTEROBSERVER AND INTRAOBSERVER VARIABILITY IN THE C-EOS: COMPARISON BETWEEN EXPERIENCED SPINE SURGEONS AND TRAINEES.
María del Mar Pozo-Balado, PhD
José María López-Puerta, MD
Marco Antonio Pérez-Gañán, MD
David M Farrington, MD
INTRODUCTION
Early Onset Scoliosis Classification System (C-EOS)
• Describes and groups patients with Early Onset Scoliosis (EOS) to guide optimal care and prognosticate outcomes.
• Substantial to excellent inter-observer reliability has been demonstrated among experienced surgeons.
However, in order to expand its use, it must also show…
• Intra-observer reproducibility, and
• Intra- and inter-observer reproducibility by clinicians with different levels of expertise.
OBJECTIVE
To demonstrate that the C-EOS classification is a highly reliable comparison tool in EOS patients regardless of the reviewer's level of training.
PATIENTS AND METHODS I
• Agreement study.
• Two expert spine orthopedic surgeons and two senior-year orthopedic residents classified 81 cases of EOS according to the C-EOS classification on two separate occasions.
Study Design
C-EOS classification
Williams BA, et al., J Bone Joint Surg Am. 2014
PATIENTS AND METHODS II
Inter-observer agreement.
• Comparison between the initial responses of all evaluators.
• Intraclass correlation coefficients (ICC) for continuous variables (age).
• Kappa indexes (κ) for categorical variables (etiology, Cobb angle, and kyphosis).
• Levels of agreement for ICC and κ: 0.00-0.20, slight agreement; 0.21-0.40, fair agreement; 0.41-0.60, moderate agreement; 0.61-0.80, substantial agreement; 0.81-1.00, almost perfect agreement.
Landis and Koch, Biometrics 1977
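The kappa index and the Landis and Koch agreement bands listed above can be illustrated with a short sketch. This is not the study's actual analysis; the eight etiology ratings below are hypothetical, invented only to show the computation.

```python
# Illustrative sketch (not the study's data): Cohen's kappa for one pair of
# raters over the same categorical cases, plus the Landis & Koch labels.
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Chance-corrected agreement between two raters on the same cases."""
    n = len(ratings_a)
    observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    freq_a, freq_b = Counter(ratings_a), Counter(ratings_b)
    # Agreement expected by chance from each rater's marginal frequencies.
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / n ** 2
    return (observed - expected) / (1 - expected)

def landis_koch(kappa):
    """Verbal agreement bands from Landis and Koch, Biometrics 1977."""
    if kappa <= 0.20: return "slight"
    if kappa <= 0.40: return "fair"
    if kappa <= 0.60: return "moderate"
    if kappa <= 0.80: return "substantial"
    return "almost perfect"

# Hypothetical etiology calls for 8 cases by an expert and a trainee.
expert  = ["congenital", "idiopathic", "congenital", "syndromic",
           "neuromuscular", "congenital", "idiopathic", "syndromic"]
trainee = ["congenital", "idiopathic", "congenital", "congenital",
           "neuromuscular", "congenital", "idiopathic", "syndromic"]

k = cohens_kappa(expert, trainee)
print(round(k, 3), landis_koch(k))  # 0.822 almost perfect
```

With these made-up ratings the raters disagree on one case out of eight, and the chance correction pulls the raw 0.875 agreement down to κ ≈ 0.822, which falls in the almost-perfect band.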
Patients.
• 81 children under 10 years old.
• Complete clinical data.
• Available radiographic studies.
PATIENTS AND METHODS III
Intra-observer agreement.
• Comparison between two evaluations of the same cases by the same evaluator, separated by 72 hours.
• Intraclass correlation coefficients (ICC) for continuous variables (age).
• Kappa indexes (κ) for categorical variables (etiology, Cobb angle, and kyphosis).
• Levels of agreement for ICC and κ: 0.00-0.20, slight agreement; 0.21-0.40, fair agreement; 0.41-0.60, moderate agreement; 0.61-0.80, substantial agreement; 0.81-1.00, almost perfect agreement.
Landis and Koch, Biometrics 1977
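The ICC used for age can be sketched as well. The slides do not state which ICC model was applied, so the one-way random-effects form ICC(1,1) below is only an assumed illustration, and the repeated age readings are hypothetical.

```python
# Hedged sketch: one-way random-effects ICC, ICC(1,1). The study's exact
# ICC model is not specified in the slides; this is only an illustration.
def icc_one_way(data):
    """data: one list of k repeated ratings per subject (same k throughout)."""
    n, k = len(data), len(data[0])
    grand_mean = sum(sum(row) for row in data) / (n * k)
    subject_means = [sum(row) / k for row in data]
    # Between-subjects and within-subject mean squares (one-way ANOVA).
    ms_between = k * sum((m - grand_mean) ** 2 for m in subject_means) / (n - 1)
    ms_within = sum((x - m) ** 2
                    for row, m in zip(data, subject_means)
                    for x in row) / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# Hypothetical ages (months) recorded twice, 72 hours apart, for 3 children.
ages = [[24, 26], [60, 58], [96, 97]]
print(round(icc_one_way(ages), 3))  # 0.999
```

Because the between-children spread dwarfs the small within-rater discrepancies, the ICC lands near 1.0, in the almost-perfect band, mirroring the near-unity age ICCs reported in the results.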
Sensitivity and Specificity
• Sensitivity: fraction of positive cases correctly classified by the evaluators, expressed as a percentage (true-positive assessments).
• Specificity: fraction of negative cases correctly classified by the evaluators, expressed as a percentage (true-negative assessments).
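These two rates can be sketched in a few lines. The reference labels and evaluator calls below are hypothetical, not the study data; they only show how the percentages are formed.

```python
# Illustrative sketch: sensitivity and specificity of an evaluator's calls
# against a reference ("gold standard") classification. Data are invented.
def sensitivity_specificity(reference, evaluated):
    """Both sequences hold booleans: True marks a positive case."""
    tp = sum(r and e for r, e in zip(reference, evaluated))          # true positives
    tn = sum((not r) and (not e) for r, e in zip(reference, evaluated))  # true negatives
    positives = sum(reference)
    negatives = len(reference) - positives
    # Expressed as percentages, as in the slides.
    return 100 * tp / positives, 100 * tn / negatives

# Hypothetical: 10 cases, 4 truly positive; the evaluator misses one
# positive and mislabels one negative as positive.
reference = [True, True, True, True, False, False, False, False, False, False]
evaluated = [True, True, True, False, True, False, False, False, False, False]
sens, spec = sensitivity_specificity(reference, evaluated)
print(f"sensitivity={sens:.1f}% specificity={spec:.1f}%")
```

Here the evaluator catches 3 of 4 positive cases (sensitivity 75.0%) and 5 of 6 negative cases (specificity ≈ 83.3%).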
RESULTS I
Demographic and clinical data of the 81 study children.
Characteristic Value
Age, median months [IQR] 81 [44-107]
Cobb, median angle [IQR] 36 [29-55]
Kyphosis, median angle [IQR] 20 [15-25]
Etiology n (%)
Congenital 37 (45.7)
Idiopathic 23 (28.4)
Neuromuscular 12 (14.8)
Syndromic 9 (11.1)
RESULTS II
Almost perfect intra-observer agreement for all variables.
Classification variable   Expert 1   Expert 2   Trainee 1   Trainee 2
Age                       1.000*     1.000*     0.983*      1.000*
Etiology                  0.981**    0.924**    0.964**     0.982**
Cobb                      1.000**    0.975**    0.950**     1.000**
Kyphosis                  1.000**    0.931**    1.000**     1.000**
*ICC; **κ
No significant differences were observed in the intra-observer agreement for age, etiology, Cobb angle, or kyphosis between experts and trainees.
RESULTS III
Almost perfect inter-observer agreement for all variables except etiology between experts.
Classification variable   Experts   Trainees
Age                       0.999*    0.983*
Etiology                  0.775**   0.874**
Cobb                      1.000**   0.950**
Kyphosis                  0.886**   1.000**
*ICC; **κ
No significant differences were observed in the inter-observer agreement for age, etiology, Cobb angle, or kyphosis between experts and trainees.
Inter-observer agreement for etiology between experts was substantial.
RESULTS IV
Sensitivity and specificity rates for the C-EOS classification according to evaluator experience.
Evaluator   Sensitivity (%)   Specificity (%)
Expert 1    100               97.1
Expert 2    60                92.1
Trainee 1   97.6              91.7
Trainee 2   81.8              98.6
CONCLUSIONS
Agreement between and within observers for the C-EOS classification is very high. However, the etiology variable shows lower agreement than the other variables.