J Orthop Sci. 2024 Jan;29(1):133-137. doi: 10.1016/j.jos.2022.11.010. Epub 2022 Nov 29.

Distal radius fractures: Classifications concordance among orthopedic residents at a teaching hospital

Victor M Peña-Martínez et al.

Abstract

Background: Several classification systems have been developed to support orthopedic surgeons in the diagnosis, treatment, and prognosis of distal radius fractures (DRF). However, the best classification system for this fracture remains controversial. We aimed to assess the reliability of three different DRF classifications among orthopedists in training (medical residents).

Methods: Orthopedic residents (n = 22) evaluated thirty cases of DRF in anteroposterior and lateral projections at three time points (0, 6, and 12 months). Each radiograph was classified with three different systems: Frykman, AO/OTA, and Jupiter-Fernandez. All assessments were blinded to the investigators. Inter- and intra-observer reliability was evaluated using Cohen's kappa coefficient, as sketched below. An additional analysis was performed for simpler sub-classifications of the AO/OTA system (27, 9, or 3 groups).
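For readers who want to reproduce the statistic: Cohen's kappa compares observed agreement p_o with the agreement p_e expected by chance, as k = (p_o - p_e) / (1 - p_e). Below is a minimal Python sketch using scikit-learn's cohen_kappa_score; the resident labels are invented for illustration and are not the study's data.

    from sklearn.metrics import cohen_kappa_score

    # Hypothetical AO/OTA group assignments by two residents for the same
    # six radiographs (illustrative only; not the study's data).
    rater_a = ["A2", "C1", "B3", "A3", "C2", "A2"]
    rater_b = ["A2", "C2", "B3", "A3", "C1", "A2"]

    # k = (p_o - p_e) / (1 - p_e), where p_o is the observed agreement and
    # p_e is the chance agreement implied by each rater's label frequencies.
    kappa = cohen_kappa_score(rater_a, rater_b)
    print(f"Cohen's kappa: {kappa:.2f}")

Intra-observer agreement is computed the same way, pairing one rater's labels from two reading sessions instead of two raters' labels from one session.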

Results: Inter-observer agreement for the AO/OTA, Frykman, and Jupiter-Fernandez classifications was slight (k = 0.15), fair (k = 0.31), and fair (k = 0.30), respectively. Intra-observer agreement showed similar results: AO/OTA, k = 0.14; Frykman, k = 0.28; and Jupiter-Fernandez, k = 0.28. When the AO/OTA classification was simplified (9 or 3 groups), the inter-observer agreement improved from slight (k = 0.16) to fair (k = 0.21 and k = 0.30, respectively). A similar improvement from slight (k = 0.14) to fair (k = 0.32 and k = 0.21, respectively) was detected for intra-observer agreement.
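The qualitative labels in these results ("slight", "fair") match the conventional Landis and Koch (1977) bands for interpreting kappa; the abstract does not name the scale explicitly, so treating it as Landis-Koch is an assumption. A small sketch of that mapping, using the reported values only as example inputs:

    def interpret_kappa(k: float) -> str:
        """Map a kappa value to its Landis & Koch (1977) agreement band."""
        if k < 0.0:
            return "poor"
        if k <= 0.20:
            return "slight"
        if k <= 0.40:
            return "fair"
        if k <= 0.60:
            return "moderate"
        if k <= 0.80:
            return "substantial"
        return "almost perfect"

    print(interpret_kappa(0.15))  # slight (AO/OTA inter-observer)
    print(interpret_kappa(0.31))  # fair (Frykman inter-observer)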

Conclusions: The more complex the DRF classification system, the more difficult it is to reach reliable inter- and intra-observer agreement among orthopedic trainees. Senior residents did not necessarily show greater kappa values in DRF classifications.

Keywords: Interobserver agreement; Intraobserver agreement; Orthopedic resident; Radius fracture; X-ray.


Conflict of interest statement

Declaration of competing interests: All authors declare no conflict of interest or competing interests.
