Impact of training on observer variation in chest radiographs of children with severe pneumonia

Archana B Patel et al. Indian Pediatr. 2007 Sep;44(9):675-81.

PMID: 17921556 | Free article

Abstract

Background: Pneumonia diagnosed using chest radiographs is often used as a study end point in trials and epidemiological studies. We studied whether training end-users in standardized chest radiographic features would decrease variability in interpretation.

Methods: Inter-observer variation among 3 observers in recognizing standardized radiographic features of pneumonia was studied in 172 chest radiographs of children with clinical severe pneumonia (as per the WHO definition). The observers were then trained using software with a repository of normal and abnormal films showing a spectrum of radiological changes in pneumonia. Inter-observer variation in recognizing the same standardized radiographic features was recorded after this training. For each radiographic feature, Cohen's kappa statistic was used to assess between-observer agreement and Fleiss's multiple-rater kappa statistic to assess agreement among all three clinicians.
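The agreement statistics named above can be computed with standard libraries. A minimal sketch, assuming each observer's per-film ratings of a single binary feature are coded as 0/1 arrays (the random data here are hypothetical; the study's actual per-observer ratings are not published in the abstract): Cohen's kappa for pairwise agreement via scikit-learn, and Fleiss's kappa across all three observers via statsmodels.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score
from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

# Hypothetical ratings: 1 = feature present, 0 = absent, for 172 films.
rng = np.random.default_rng(0)
obs1 = rng.integers(0, 2, size=172)
obs2 = rng.integers(0, 2, size=172)
obs3 = rng.integers(0, 2, size=172)

# Pairwise agreement: Cohen's kappa for observers 1 vs 2 and 1 vs 3.
print("kappa(1,2):", cohen_kappa_score(obs1, obs2))
print("kappa(1,3):", cohen_kappa_score(obs1, obs3))

# Agreement among all three observers: Fleiss's multiple-rater kappa.
# aggregate_raters converts a (subjects x raters) matrix into per-subject
# counts of raters choosing each category, the input fleiss_kappa expects.
ratings = np.column_stack([obs1, obs2, obs3])
counts, _ = aggregate_raters(ratings)
print("Fleiss kappa:", fleiss_kappa(counts, method='fleiss'))
```

In practice this would be run once per radiographic feature, before and after training, to obtain the per-feature kappa values reported in the results.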

Results: 'Uninterpretable' films decreased from 16.6% (95% CI 0%-34.1%) before training to 8.1% (95% CI 0%-17.7%) after training. 'Adequate' films increased from 54.2% (95% CI 12.5%-95.9%) before training to 70% (95% CI 46.5%-93.4%) after training. For all features, Cohen's kappa for agreement between observers 1 and 2 and between observers 1 and 3 improved from poor to moderate. Fleiss's kappa values were 0.1 to 0.2 before training and ranged from 0.37 to 0.52 after training, indicating moderate to good agreement after training.
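A 95% confidence interval for a proportion such as those above can be obtained with statsmodels. A minimal sketch, assuming a simple binomial proportion and a Wald ('normal') interval; the abstract does not state which method the authors used, and its unusually wide intervals suggest they also accounted for between-observer variability, so this illustrative calculation will not reproduce the reported numbers.

```python
from statsmodels.stats.proportion import proportion_confint

# Hypothetical reconstruction: ~8.1% of 172 films rated
# 'uninterpretable' after training (about 14 films).
count = round(0.081 * 172)
low, high = proportion_confint(count, 172, alpha=0.05, method='normal')
print(f"{count/172:.1%} (95% CI {low:.1%}-{high:.1%})")
```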

Conclusions: Training doctors in standardized radiographic features with the help of software substantially improves agreement in identifying radiological pneumonia.
