Distal radius fractures: Classifications concordance among orthopedic residents on a teaching hospital
- PMID: 36460558
- DOI: 10.1016/j.jos.2022.11.010
Abstract
Background: Several classification systems have been developed to support orthopedic surgeons in the diagnosis, treatment, and prognosis of distal radius fracture (DRF). However, the best classification system for this fracture remains controversial. We aimed to assess the reliability of three different DRF classifications among orthopedists in training (medical residents).
Methods: Orthopedic residents (n = 22) evaluated thirty cases of DRF in anteroposterior and lateral projections at three time points (0, 6, and 12 months). Each radiograph was graded with three different classifications: Frykman, AO/OTA, and Jupiter-Fernandez. All assessments were blinded to the investigators. Inter- and intra-observer reliability was evaluated using Cohen's kappa coefficient. An additional analysis was performed for simpler sub-classifications of the AO/OTA system (27, 9, or 3 groups).
Results: Inter-observer agreement for AO/OTA, Frykman, and Jupiter-Fernandez classifications was slight (k = 0.15), fair (k = 0.31), and fair (k = 0.30), respectively. Intra-observer agreement showed similar results: AO/OTA, k = 0.14; Frykman, k = 0.28; and Jupiter-Fernandez, k = 0.28. When the AO/OTA classification was simplified (9 or 3 descriptions), the inter-observer agreement improved from slight (k = 0.16) to fair (k = 0.21 and k = 0.30, respectively). A similar improvement from slight (k = 0.14) to fair (k = 0.32 and k = 0.21) was detected for intra-observer agreement.
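The agreement statistics above are Cohen's kappa values, with the qualitative labels ("slight", "fair") following the conventional Landis and Koch bands (≤0.20 slight, 0.21–0.40 fair, 0.41–0.60 moderate, 0.61–0.80 substantial, >0.80 almost perfect). A minimal sketch of how such a kappa is computed for a pair of raters, using hypothetical AO/OTA group labels rather than the study's data:

```python
# Cohen's kappa for two raters' categorical labels.
# Illustrative sketch only; the labels below are hypothetical, not study data.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two sequences of labels."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: fraction of cases both raters labeled identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected agreement by chance, from each rater's marginal label frequencies.
    freq_a = Counter(rater_a)
    freq_b = Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n**2
    return (p_o - p_e) / (1 - p_e)

def landis_koch(k):
    """Conventional Landis & Koch interpretation bands for kappa."""
    for cutoff, label in [(0.20, "slight"), (0.40, "fair"), (0.60, "moderate"),
                          (0.80, "substantial")]:
        if k <= cutoff:
            return label
    return "almost perfect"

# Hypothetical AO/OTA group assignments for 10 radiographs by two residents.
a = ["A2", "A2", "B1", "C1", "A3", "B1", "C2", "A2", "B3", "C1"]
b = ["A2", "A3", "B1", "C1", "A2", "B1", "C3", "A2", "B3", "C2"]
k = cohens_kappa(a, b)
print(round(k, 2), landis_koch(k))  # prints "0.51 moderate"
```

In practice, pairwise kappas across all 22 raters (or a multi-rater generalization such as Fleiss' kappa) would be combined; the sketch shows only the two-rater core of the statistic.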
Conclusions: The more complex the DRF classification system, the more difficult it is to reach reliable inter- and intra-observer agreement among orthopedic trainees. Senior residents did not necessarily show greater kappa values in DRF classifications.
Keywords: Interobserver agreement; Intraobserver agreement; Orthopedic resident; Radius fracture; X-ray.
Copyright © 2022 The Japanese Orthopaedic Association. Published by Elsevier B.V. All rights reserved.
Conflict of interest statement
Declaration of competing interests: All authors declare no conflict of interest or competing interests.
Similar articles
- Reliability of distal radius fracture classification systems: a CT based study. Emerg Radiol. 2024 Dec;31(6):873-879. doi: 10.1007/s10140-024-02294-2. Epub 2024 Nov 5. PMID: 39499384
- Distal radius fractures are difficult to classify. Injury. 2018 Jun;49 Suppl 1:S29-S32. doi: 10.1016/S0020-1383(18)30299-7. PMID: 29929689
- Are distal radius fracture classifications reproducible? Intra and interobserver agreement. Sao Paulo Med J. 2008 May 1;126(3):180-5. doi: 10.1590/s1516-31802008000300008. PMID: 18711658. Free PMC article.
- Intrarater and Inter-rater Reliability of Tibial Plateau Fracture Classifications: Systematic Review and Meta-Analysis. JB JS Open Access. 2024 Oct 3;9(4):e23.00181. doi: 10.2106/JBJS.OA.23.00181. eCollection 2024 Oct-Dec. PMID: 39364175. Free PMC article. Review.
- Adult distal radius fractures classification systems: essential clinical knowledge or abstract memory testing? Ann R Coll Surg Engl. 2016 Nov;98(8):525-531. doi: 10.1308/rcsann.2016.0237. Epub 2016 Aug 11. PMID: 27513789. Free PMC article. Review.