Review

Creating a Survey of Community Engagement in Research [Internet]

Washington (DC): Patient-Centered Outcomes Research Institute (PCORI); 2021 Oct.
Melody S. Goodman et al.

Excerpt

Background: Community-engaged research is an umbrella term for forms of research that have community and stakeholder engagement as a core principle, for example, patient-centered outcomes research (PCOR), participatory action research, and community-based participatory research. However, the implementation and category of community engagement can vary along a spectrum from minimal to fully collaborative engagement. A major methodological gap is the assessment of stakeholder engagement from a stakeholder perspective. Rigorous evaluation of the impact of stakeholder engagement on research development, implementation, and outcomes requires the development and validation of tools that assess engagement.

Objective: To develop and validate comprehensive and condensed versions of a survey instrument that will be completed by stakeholders and used to assess engagement in research studies from a stakeholder perspective.

Methods: We reviewed the literature on community engagement in research to identify existing measures of engagement. Using the engagement principles (EPs) we identified, we developed a comprehensive community engagement measure and evaluated its construct validity. The Disparities Elimination Advisory Committee (DEAC), a community-academic partnership at the Siteman Cancer Center and the Washington University in St Louis School of Medicine, served as the community advisory board guiding the work. The DEAC and the Patient Research Advisory Board (PRAB), a community group trained in research methods, served as community partners and assisted in determining the components to include in the measurement tool: engagement categories and EPs. These partners also advised on the development of survey items that would demonstrate the presence of EPs and on the recruitment and retention of participants to use the measurement tool.

To assess content validity, we convened a panel of national experts in stakeholder-engaged research (SER) and used a 5-round Delphi process to reach agreement on engagement categories and their definitions, EPs and their definitions, and items associated with each EP. The DEAC and the PRAB provided feedback on the work of the Delphi panel. The expert review panel comprised a range of stakeholders (eg, patients, caregivers, advocacy groups, clinicians, researchers). After initiation of the Delphi process, we recruited community members who had participated in patient- or community-engaged research across the community engagement spectrum to participate in a longitudinal, web-based study that consisted of 4 surveys over an 18-month period. As we developed the assessment tool to measure categories of community engagement, these surveys permitted the investigators to examine the reliability and validity of the tool, which we named the Research Engagement Survey Tool (REST). We also evaluated the performance of each item on the tool.

We examined 2 types of psychometric properties of the survey items—internal consistency and convergent validity with other scales (eg, Trust in Medical Researchers Scale). After the Delphi process and between surveys 3 and 4, we completed cognitive interviews about the survey items to improve the reliability and validity of REST.
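Internal consistency is commonly quantified with Cronbach's alpha. As an illustrative sketch only (not the report's actual analysis code, and with hypothetical function and variable names), alpha for a set of survey items can be computed as:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for respondents' item scores.

    `items` is a list of rows, one per respondent, each row a list of
    scores for the same k items. Illustrative example, not REST code.
    """
    k = len(items[0])                # number of items
    cols = list(zip(*items))         # item-wise columns of scores

    def var(xs):                     # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_var = sum(var(c) for c in cols)             # sum of item variances
    total_var = var([sum(row) for row in items])     # variance of total scores
    return k / (k - 1) * (1 - item_var / total_var)
```

Values near 1 indicate that the items for a scale move together across respondents, which is the property assessed here for each set of EP items.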

REST has 2 scoring approaches. The first approach, aligned with EPs, is scored by taking the mean of the 3 to 5 items for each EP to obtain EP-specific scores and then taking the overall mean of the EP means to obtain an overall REST score. The second scoring approach aligns REST with the categories of engagement developed during the Delphi process, based on the percentage of item responses that are aligned with each of 5 categories: (1) outreach and education, (2) consultation, (3) cooperation, (4) collaboration, and (5) partnership. We applied both scoring approaches to the 32-item and 9-item versions of REST.
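As an illustrative sketch of the 2 scoring approaches described above (not code from the report; all function names, dictionary keys, and data are hypothetical), the EP-based score and the category profile might be computed as:

```python
from collections import Counter

def rest_ep_scores(responses):
    """First approach: mean of the items for each EP, then the mean
    of the EP means as the overall score.

    `responses` maps each engagement principle (EP) to its list of
    3-5 item responses. Returns (per-EP means, overall REST score).
    """
    ep_means = {ep: sum(items) / len(items) for ep, items in responses.items()}
    overall = sum(ep_means.values()) / len(ep_means)
    return ep_means, overall

def rest_category_profile(category_alignments):
    """Second approach: percentage of item responses aligned with
    each of the 5 engagement categories."""
    categories = ["outreach and education", "consultation", "cooperation",
                  "collaboration", "partnership"]
    counts = Counter(category_alignments)
    n = len(category_alignments)
    return {c: 100 * counts[c] / n for c in categories}
```

Averaging within each EP before averaging across EPs weights every principle equally even though the number of items per EP varies from 3 to 5.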

Finally, we studied the implementation of the final version of REST (long form) in 20 project teams. These 20 project teams worked with us to administer REST to their stakeholders in 26 separate PCOR and SER projects; the project teams completed surveys and interviews that examined barriers to and facilitating factors for implementing REST.

Results: We identified 11 EPs during our search of the community engagement literature and used those EPs as the basic structure for developing REST. Each EP was assessed by 3 to 5 survey items measuring stakeholder engagement. Because administering the comprehensive measure of stakeholder engagement (32-item REST) will not always be feasible and can increase participant burden, we also developed a shorter version (9 items) of REST. We also developed a version of the condensed measure that could be tailored to the needs of a specific population (eg, those with less than a college degree).

We generated administration instructions for REST based on results of its implementation among the stakeholders of 20 PCOR or SER teams, including the findings from implementation surveys and interviews of project team members. The project teams stated that implementing REST was feasible, they understood the results, and they found it useful for their engagement work.

Conclusions: Using REST (both the comprehensive and condensed versions) enables the systematic measurement and collection of data about stakeholder engagement. Using the administration instructions we developed, investigators can apply REST repeatedly over time within and across projects and use it to evaluate the category of stakeholder engagement in research studies.

Limitations: Although REST is a validated measure, its implementation did not include paper administration or multiple forms of administration that might affect its psychometric properties. Studies are needed to evaluate its reliability over time. In addition, REST has not been translated into languages other than English or tested in non-English-speaking communities.

Grants and funding

Institution Receiving Award: Washington University School of Medicine