Remote Moderator and Observer Experiences and Decision-making During Usability Testing of a Web-Based Empathy Training Portal: Content Analysis

Michelle Lobchuk et al. JMIR Form Res. 2022 Aug 3;6(8):e35319. doi: 10.2196/35319

Abstract

Background: COVID-19 restrictions severely curtailed empirical endeavors that involved in-person interaction, such as usability testing sessions for technology development. Researchers and developers found themselves using web-based moderation for usability testing. Skilled remote moderators and observers are fundamental to this approach. However, little empirical work to date has captured the perceptions and support needs of moderators and observers in testing situations.

Objective: The aim of this paper was to identify the experiences of remote moderator and observer participants and their use of specific tools to capture feedback from users as they interacted with the web browser application.

Methods: This research is part of a broader study on an educational web browser application for nursing students to learn perspective taking and enhance their perceptual understanding of a dialogue partner's thoughts and feelings. The broader study used a quantitative and think-aloud qualitative problem-discovery usability study design. This case study explored written accounts by the remote moderator and observer participants regarding their roles, experiences, and reactions to the testing protocol, as well as their suggestions for improved techniques and strategies for conducting remote usability testing. Content analysis was used to examine participants' experiences in the usability testing sessions.

Results: We collected data from 1 remote moderator and 2 remote observers. Five themes were identified: dealing with personal stressors, dealing with user anxiety, maintaining social presence, ethical response to the study protocol, and communication during sessions. The participants offered recommendations for the design of future remote testing activities as well as evidence-informed training materials for usability project personnel.

Conclusions: This study's findings contribute to a growing body of endeavors to understand human-computer interaction and its impact on remote moderator and observer roles. As technology rapidly advances, remote usability testing will become more common, and the knowledge gleaned in this study can inform that work. Recommendations based on moderator and observer participant perspectives identify the need for more evidence-informed training materials for these roles, focusing on web-based interpersonal communication skills, execution of user testing protocols, troubleshooting technology and test user issues, proficiency in web conferencing platforms, behavior analysis and feedback technologies, and time management.

Keywords: empathy; internet; qualitative research; user-centered design; web browser.

Conflict of interest statement

Conflicts of Interest: None declared.

Figures

Figure 1. Web conferencing test environment set up for phases 1 and 2.

Figure 2. Hotjar (Hotjar Ltd)—scrolling activity (on the application’s Training Portal for a sample of 282 scrolling activities; red indicates that all or almost all users have seen this part of the page).

Figure 3. Hotjar (Hotjar Ltd)—clicking activity (on the application’s Training Portal for a sample of 175 clicks).
