Representation of ophthalmology concepts by electronic systems: intercoder agreement among physicians using controlled terminologies
- PMID: 16488013
- DOI: 10.1016/j.ophtha.2006.01.017
Abstract
Objective: To assess intercoder agreement for ophthalmology concepts by 3 physician coders using 5 controlled terminologies (International Classification of Diseases 9, Clinical Modification [ICD9CM]; Current Procedural Terminology, fourth edition; Logical Observation Identifiers Names and Codes [LOINC]; Systematized Nomenclature of Medicine, Clinical Terms [SNOMED-CT]; and Medical Entities Dictionary).
Design: Noncomparative case series.
Participants: Five complete ophthalmology case presentations selected from a publicly available journal.
Methods: Each case was parsed into discrete concepts. Three physician coders independently used electronic or paper browsers to assign a code to every concept in each terminology. A match score representing the adequacy of each assignment was given on a 3-point scale (0, no match; 1, partial match; 2, complete match). For every concept, intercoder agreement was determined by 2 methods: (1) exact code matching, with complete agreement when all 3 coders assigned the same code, partial agreement when 2 coders assigned the same code, and no agreement when all coders assigned different codes; and (2) manual review of all assigned codes for semantic equivalence by an independent ophthalmologist, who classified intercoder agreement for each concept as complete, partial, or none. Intercoder agreement was then recalculated in the same manner for the subset of concepts judged to have adequate coverage by each terminology, defined as receiving a match score of 2 from at least 2 of the 3 coders.
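The two classification rules in the Methods (exact-code agreement among 3 coders, and adequacy of coverage from match scores) can be sketched in a few lines. This is an illustrative reconstruction, not the authors' code; the function names are hypothetical.

```python
from collections import Counter

def exact_code_agreement(codes):
    """Classify intercoder agreement for one concept from the 3 assigned
    codes (method 1 in the abstract): 'complete' if all 3 coders assigned
    the same code, 'partial' if exactly 2 did, 'none' otherwise."""
    top_count = Counter(codes).most_common(1)[0][1]
    if top_count == 3:
        return "complete"
    if top_count == 2:
        return "partial"
    return "none"

def adequately_covered(match_scores):
    """A concept is judged adequately covered by a terminology when at
    least 2 of the 3 coders gave it a complete-match score of 2."""
    return sum(1 for s in match_scores if s == 2) >= 2
```

For example, `exact_code_agreement(["H25.1", "H25.1", "366.10"])` yields `"partial"`, and `adequately_covered([2, 2, 1])` yields `True` (the code strings here are arbitrary placeholders, not codes from the study).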
Main outcome measures: Intercoder agreement in each controlled terminology: complete, partial, or none.
Results: Cases were parsed into 242 unique concepts. When all concepts were analyzed by manual review, the proportion of complete intercoder agreement ranged from 12% (LOINC) to 44% (SNOMED-CT), and the difference in intercoder agreement between LOINC and all other terminologies was statistically significant (P<0.004). When only concepts with adequate terminology coverage were analyzed by manual review, the proportion of complete intercoder agreement ranged from 33% (LOINC) to 64% (ICD9CM), and there were no statistically significant differences in intercoder agreement among any pairs of terminologies.
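The abstract reports pairwise significance tests on these proportions without naming the test used. As a hedged sketch only, a two-sided Fisher exact test on a 2x2 table (complete agreement vs. not, for two terminologies, assuming independent proportions) could be computed from the standard library as follows; the function name and the choice of test are assumptions, not the authors' stated method.

```python
from math import comb

def fisher_exact_2x2(a, b, c, d):
    """Two-sided Fisher exact p-value for the table [[a, b], [c, d]],
    e.g. rows = two terminologies, columns = complete agreement yes/no.
    Sums the hypergeometric probabilities of all tables (with the same
    margins) that are no more probable than the observed one."""
    row1, row2 = a + b, c + d
    col1, n = a + c, a + b + c + d
    denom = comb(n, col1)

    def prob(x):  # probability of a table with x in the top-left cell
        return comb(row1, x) * comb(row2, col1 - x) / denom

    p_obs = prob(a)
    lo, hi = max(0, col1 - row2), min(col1, row1)
    return sum(prob(x) for x in range(lo, hi + 1) if prob(x) <= p_obs + 1e-12)
```

With the reported percentages of 242 concepts (12% of 242 is roughly 29 concepts for LOINC; 44% is roughly 106 for SNOMED-CT), `fisher_exact_2x2(29, 213, 106, 136)` returns a p-value far below 0.004, consistent with the reported significance. Note that because all terminologies were scored on the same 242 concepts, the data are actually paired, so a paired test such as McNemar's would arguably be more appropriate.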
Conclusions: The level of intercoder agreement for ophthalmic concepts in existing controlled medical terminologies is imperfect. Intercoder reproducibility is essential for accurate and consistent electronic representation of medical data.
Similar articles
- Representation of ophthalmology concepts by electronic systems: adequacy of controlled medical terminologies. Ophthalmology. 2005 Feb;112(2):175-83. doi: 10.1016/j.ophtha.2004.09.032. PMID: 15691548
- Documentation and coding of ED patient encounters: an evaluation of the accuracy of an electronic medical record. Am J Emerg Med. 2006 Oct;24(6):664-78. doi: 10.1016/j.ajem.2006.02.005. PMID: 16984834
- Mapping the categories of the Swedish primary health care version of ICD-10 to SNOMED CT concepts: rule development and intercoder reliability in a mapping trial. BMC Med Inform Decis Mak. 2007 May 2;7:9. doi: 10.1186/1472-6947-7-9. PMID: 17472757. Free PMC article.
- An overview of coding and its relationship to standardized clinical terminology. Top Health Inf Manage. 2000 Nov;21(2):1-9. PMID: 11143274. Review.
- The practical impact of ontologies on biomedical informatics. Yearb Med Inform. 2006:124-35. PMID: 17051306. Review.
Cited by
- Evaluation of electronic health record implementation in ophthalmology at an academic medical center (an American Ophthalmological Society thesis). Trans Am Ophthalmol Soc. 2013 Sep;111:70-92. PMID: 24167326. Free PMC article.
- Inter-rater agreement for the annotation of neurologic signs and symptoms in electronic health records. Front Digit Health. 2023 Jun 13;5:1075771. doi: 10.3389/fdgth.2023.1075771. PMID: 37383943. Free PMC article.
- Qualitative analysis of manual annotations of clinical text with SNOMED CT. PLoS One. 2018 Dec 27;13(12):e0209547. doi: 10.1371/journal.pone.0209547. PMID: 30589855. Free PMC article.
- Risks and rewards of increasing patient access to medical records in clinical ophthalmology using OpenNotes. Eye (Lond). 2022 Oct;36(10):1951-1958. doi: 10.1038/s41433-021-01775-9. Epub 2021 Oct 5. PMID: 34611314. Free PMC article.
- Automated UMLS-based comparison of medical forms. PLoS One. 2013 Jul 4;8(7):e67883. doi: 10.1371/journal.pone.0067883. PMID: 23861827. Free PMC article.