Engagement in mHealth behavioral interventions for HIV prevention and care: making sense of the metrics

Lisa B Hightow-Weidman et al. Mhealth. 2020 Jan 5;6:7. doi: 10.21037/mhealth.2019.10.01. eCollection 2020.

Abstract

Background: Engagement is the primary metric by which researchers can assess whether participants in an mHealth intervention used and interacted with the intervention's content as intended over a pre-specified period, such that behavior change could result. Paradata, defined as the process data documenting users' access, participation, and navigation through an mHealth intervention, have been associated with differential treatment outcomes in mHealth interventions. The number of behavioral mHealth interventions addressing the HIV prevention and care continuum has increased in recent years, yet few studies have reported engagement metrics or examined how these data could inform design modifications, promote continued engagement, and supplement primary intervention efficacy and scale-up efforts.

Methods: We review common paradata metrics in mHealth interventions (e.g., amount, frequency, duration, and depth of use), using case studies from four technology-driven HIV interventions to illustrate their utility in evaluating mHealth behavioral interventions for HIV prevention and care. Across the four case studies, participants' ages ranged between 15 and 30 years and included a racially and ethnically diverse sample of youth. The four case studies took different approaches to engaging young men who have sex with men: a tailored brief intervention, an interactive modular program, a daily tool to monitor and self-regulate treatment adherence, and an online platform promoting social engagement and social support. Each focused on key outcomes across the HIV prevention and care continuum [e.g., safer sex behaviors, HIV testing, antiretroviral therapy (ART) adherence] and collected paradata metrics systematically.
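
For illustration only (the paper reports no analysis code), a minimal sketch of how the paradata metrics named above might be computed from per-session usage logs is shown below; the log structure, field names, and screen counts are hypothetical and are not drawn from any of the four case studies.

    # Illustrative sketch: hypothetical paradata log, one record per app session,
    # listing which intervention screens the participant opened.
    from datetime import datetime

    sessions = [
        {"user": "A", "start": datetime(2020, 1, 6, 9, 0),  "end": datetime(2020, 1, 6, 9, 12),
         "screens": ["intro", "risk_quiz", "video_1"]},
        {"user": "A", "start": datetime(2020, 1, 8, 20, 30), "end": datetime(2020, 1, 8, 20, 41),
         "screens": ["video_1", "goal_setting"]},
        {"user": "B", "start": datetime(2020, 1, 7, 18, 15), "end": datetime(2020, 1, 7, 18, 20),
         "screens": ["intro"]},
    ]

    TOTAL_SCREENS = 12  # hypothetical total number of screens in the intervention


    def engagement_summary(records, total_screens):
        """Summarize amount, frequency, duration, and depth of use per participant."""
        per_user = {}
        for r in records:
            u = per_user.setdefault(r["user"], {"logins": 0, "minutes": 0.0, "screens": set()})
            u["logins"] += 1                                              # frequency: number of sessions
            u["minutes"] += (r["end"] - r["start"]).total_seconds() / 60  # amount: total time in app
            u["screens"].update(r["screens"])                             # depth: unique content viewed
        return {
            user: {
                "logins": v["logins"],
                "total_minutes": round(v["minutes"], 1),
                "avg_session_minutes": round(v["minutes"] / v["logins"], 1),  # duration per session
                "depth_pct_of_content": round(100 * len(v["screens"]) / total_screens, 1),
            }
            for user, v in per_user.items()
        }


    print(engagement_summary(sessions, TOTAL_SCREENS))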

Results: Across the four interventions, paradata were used to identify patterns of use, create user profiles, and determine a minimum engagement threshold for future randomized trials based on initial pilot trial data. Evidence of treatment differences based on paradata analyses was also observed in between-arm and within-arm analyses, indicating that intervention exposure and dosage might influence the strength of the observed intervention effects. Paradata reflecting participants' engagement with intervention content were used to suggest modifications to intervention design and navigation, to understand which theoretically driven content participants chose to engage with in an intervention, and to illustrate how engagement was linked to HIV-related outcomes.

Conclusions: Paradata monitoring and reporting can enhance the rigor of mHealth trials. Metrics of engagement must be systematically collected, analyzed, and interpreted to meaningfully understand an mHealth intervention's efficacy. Future mHealth trials should work to identify suitable engagement metrics during intervention development, ensure their collection throughout the trial, and evaluate their impact on trial outcomes.

Keywords: HIV prevention and care; engagement; mHealth; paradata.

Conflict of interest statement

Conflicts of Interest: The authors have no conflicts of interest to declare.
