Qual Life Res. 2025 Apr;34(4):937-947. doi: 10.1007/s11136-024-03858-y. Epub 2024 Nov 29.

The comprehensibility continuum: a novel method for analysing comprehensibility of patient reported outcome measures


Victoria Gale et al. Qual Life Res. 2025 Apr.

Abstract

Purpose: Evidence of comprehensibility is frequently required during the development of patient reported outcome measures (PROMs); the respondent's interpretation of PROM items needs to align with intended meanings. Cognitive interviews are recommended for investigating PROM comprehensibility, yet guidance for analysis is lacking. Consequently, the quality and trustworthiness of cognitive interview data and analysis are threatened, as there is no clear procedure detailing how analysts can systematically and consistently identify evidence that respondent interpretations align or misalign with intended meanings.

Methods: This paper presents a novel, structured approach to comprehensibility analysis - the 'Comprehensibility Continuum' - that builds upon existing cognitive interview guidance.

Results: The Comprehensibility Continuum comprises a structured rating scale for coding the depth of alignment between intended item meaning and respondent interpretation, applied across five main stages. Before cognitive interviews are conducted, researchers must (1) define the intended meanings of PROM items; and (2) determine comprehensibility thresholds at both the participant and item levels. After conducting interviews, they (3) prepare data by transcribing interviews using 'intelligent' verbatim transcription; (4) code transcripts with the Comprehensibility Continuum scale in iterative sets, assigning an overall code for each item at the participant level; and (5) compare participant-level codes across all participants to determine overall item comprehensibility, so that decisions can be made to retain, modify, or remove items.
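
The abstract does not state how participant-level codes are combined at stage (5), beyond comparison against the pre-defined thresholds from stage (2), so the following Python sketch shows only one way such an aggregation could look. The scale range, thresholds, and decision cut-offs below are assumptions for illustration and are not values specified in the paper.

    # Illustrative sketch of stage (5) only. The 0-4 scale, the participant-level
    # threshold of 3, and the 80% item-level threshold are assumed placeholders,
    # not values taken from the Comprehensibility Continuum method itself.
    from typing import List

    PARTICIPANT_THRESHOLD = 3   # assumed: codes >= 3 indicate adequate alignment
    ITEM_THRESHOLD = 0.8        # assumed: >= 80% of participants must reach the threshold

    def item_decision(participant_codes: List[int]) -> str:
        """Aggregate one item's participant-level codes into a retain/modify/remove decision."""
        if not participant_codes:
            return "insufficient data"
        comprehended = sum(code >= PARTICIPANT_THRESHOLD for code in participant_codes)
        proportion = comprehended / len(participant_codes)
        if proportion >= ITEM_THRESHOLD:
            return "retain"
        if proportion >= 0.5:   # assumed cut-off separating 'modify' from 'remove'
            return "modify"
        return "remove"

    # Example: overall codes for one PROM item across six participants
    # on the assumed 0-4 alignment scale (4 = full alignment with intended meaning).
    print(item_decision([4, 3, 4, 2, 3, 4]))   # -> "retain" (5/6 = 0.83 >= 0.8)

In practice, the thresholds used at this stage would be those fixed at stage (2), before any interviews are conducted.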

Conclusion: Quality in qualitative data analysis is achieved through rigorous methods that are clearly described and justified. Given the lack of sufficient guidance, cognitive interviewers must reflect on how best to demonstrate PROM comprehensibility systematically and consistently from interview data, and the Comprehensibility Continuum method offers a potential solution.

Keywords: Analysis; Cognitive interview; Content validity; Patient-reported outcome measures.


Conflict of interest statement

Declarations
Ethical approval: Ethical approval for this research was granted by the Sheffield Centre for Health and Related Research (SCHARR), School of Medicine and Population Health, Research Ethics Committee (Date: 02.03.2023, Reference number: 051410).
Consent to participate: Written informed consent was obtained from the parents of all children included in the research.
Consent to publish: The authors affirm that written informed consent was obtained from the parents of all children included in the research for the publication of anonymised participant quotes.
Competing interests: Philip A. Powell is an Associate Editor at Quality of Life Research. Victoria Gale and Jill Carlton declare no competing interests. The authors have no relevant financial interests to disclose.

Figures

Fig. 1 Summary of analysis process for evaluating comprehensibility using the Comprehensibility Continuum

