Differences in Mode Preferences, Response Rates, and Mode Effect Between Automated Email and Phone Survey Systems for Patients of Primary Care Practices: Cross-Sectional Study

Sharon Johnston et al. J Med Internet Res. 2021 Jan 11;23(1):e21240. doi: 10.2196/21240.
Abstract

Background: A growing number of health care practices are adopting software systems that link with their existing electronic medical records to generate outgoing phone calls, emails, or text notifications to patients for appointment reminders or practice updates. While practices are adopting this software technology for service notifications to patients, its use for collection of patient-reported measures is still nascent.

Objective: This study assessed mode preferences, response rates, and mode effect for a practice-based automated patient survey delivered by phone or email to patients of primary care practices.

Methods: This cross-sectional study analyzed responses and respondent demographics for a short, fully automated, telephone or email patient survey sent to individuals within 72 hours of a visit to their regular primary care practice. Each survey consisted of 5 questions drawn from a larger study's patient survey that all respondents completed in the waiting room at the time of their visit. Automated patient survey responses were linked to self-reported sociodemographic information provided on the waiting room survey including age, sex, reported income, and health status.

Results: A total of 871 patients from 87 primary care practices in British Columbia, Ontario, and Nova Scotia, Canada, agreed to receive the automated patient survey, and 470 (45.2%) completed all 5 questions. Email administration of the follow-up survey was preferred over phone-based administration, except among patients aged 75 years and older (P<.001). Overall, response rates were higher (P<.001) for those who selected an emailed survey (369/606, 60.9%) than for those who selected the phone survey (101/265, 38.1%), irrespective of age, sex, or chronic disease status. Response rates were also higher for email (range 57.4% [58/101] to 66.3% [108/163]) than for phone surveys (range 36% [23/64] to 43% [10/23]) across all income groups except the lowest income quintile, in which the two modes had similar response rates (email: 29/63, 46%; phone: 23/50, 46%). We observed moderate agreement (range 64.6% [62/96] to 78.8% [282/358]) between waiting room survey responses and those obtained in the follow-up automated survey. However, agreement was poor (range 45.3% [43/95] to 46.2% [43/93]) for 2 questions relating to care coordination.
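The headline mode comparison is simple proportion arithmetic on the counts reported above. A minimal sketch, using the abstract's counts; the two-proportion z-test shown here is an illustrative choice for checking the P<.001 claim, not necessarily the authors' exact statistical method:

```python
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Pooled two-proportion z statistic for a difference in response rates."""
    p_a = success_a / n_a
    p_b = success_b / n_b
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Counts reported in the abstract
email_responses, email_sent = 369, 606   # email mode, 60.9%
phone_responses, phone_sent = 101, 265   # phone mode, 38.1%

print(round(email_responses / email_sent, 3))  # 0.609
print(round(phone_responses / phone_sent, 3))  # 0.381

z = two_proportion_z(email_responses, email_sent, phone_responses, phone_sent)
# |z| > 3.29 corresponds to a two-sided p below .001
print(z > 3.29)  # True
```

With these counts the z statistic is roughly 6.2, comfortably past the two-sided p<.001 threshold, consistent with the significance level the abstract reports.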

Conclusions: An automated practice-based patient experience survey achieved significantly higher response rates by email than by phone, and email response rates rose with income group. Potential mode effects between the two survey modalities may limit multimodal survey approaches. An automated, minimal-burden patient survey could facilitate the integration of patient-reported outcomes into care planning and service organization, supporting the move of primary care practices toward a more responsive, patient-centered, continual learning system. However, practices must be attentive to the risk of furthering inequities in health care by underrepresenting the experience of certain groups in decision making, given the differing reach of survey modes.

Keywords: mixed-mode survey; primary care; response rates.


Conflict of interest statement

Conflicts of Interest: None declared.
