Multicenter Study
2025 May 23;25(1):759. doi: 10.1186/s12909-025-07337-x.

Faculty development for undergraduate student programmatic assessment: a Brazilian multi-centered longitudinal study


Rodrigo Humberto Flauzino et al. BMC Med Educ.

Abstract

Background: Student assessment in many Brazilian health professions schools consists mainly of summative exams targeting knowledge. Between 2018 and 2020, a faculty development (FD) project was held for nine Brazilian institutions, addressing programmatic assessment systems that favour comprehensive, formative assessment of clinical skills and professionalism. This study aimed to identify the immediate effects of the FD activities and to determine long-term (one-year) changes in participants' assessment practices and other effects.

Methods: A total of 359 teachers from 30 undergraduate courses (13 different health professions) participated. Local leaders completed an initial cycle of workshops (20 h), which was followed by 10 similar activities at the various institutions. The program covered topics on assessment practices and systems using active teaching-learning strategies. At the end, participants evaluated the quality of the FD activities and self-assessed the knowledge and skills acquired. One year later, participants were invited to complete an online structured questionnaire about the importance of the FD activities for 36 aspects of their work.

Results: Between 75% and 100% of the 292 respondents (81.3%) had a very positive perception of the workshops, particularly regarding program design, the delivered activities and the facilitators' role, with no significant differences between the initial and subsequent cycles. Immediately after the workshops, perceived levels of knowledge and skills were significantly greater (p = 0.0001) than before, for all topics evaluated. In the follow-up evaluation one year later, more than 90% of the 121 respondents (33.7%) to the reliable instrument (Cronbach's α = 0.90) attributed high importance to their participation for their work in student assessment. More than 70% reported positive impacts on other aspects, such as teaching, relationships with students and colleagues, and feelings of greater belonging and of personal and professional appreciation.
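The reliability figure above (Cronbach's α = 0.90) reflects the internal consistency of the follow-up questionnaire. As an illustration only (this is not the authors' analysis code, and the response data below are hypothetical), Cronbach's alpha can be computed from an items-by-respondents score matrix as follows:

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix.

    alpha = (k / (k - 1)) * (1 - sum(item variances) / variance of totals)
    """
    k = scores.shape[1]                          # number of items
    item_vars = scores.var(axis=0, ddof=1)       # sample variance per item
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of respondents' total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical Likert-type responses: 5 respondents x 4 items
data = np.array([
    [4, 5, 4, 5],
    [3, 4, 3, 4],
    [5, 5, 5, 5],
    [2, 3, 2, 3],
    [4, 4, 4, 4],
])
print(round(cronbach_alpha(data), 2))  # → 0.97
```

Values of α around 0.90, as reported for the follow-up instrument, are conventionally taken to indicate high internal consistency.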

Conclusion: Our findings indicate that well-designed traditional FD activities focusing solely on student assessment enhance teachers' knowledge and skills in this area, an effect that persists in the long term. Other improvements were observed in participants' overall performance and may represent broader positive impacts on teachers and their institutions. Additionally, these FD activities likely contribute to changes in the participating institutions, leading to a more constructive and meaningful approach to student assessment.

Keywords: Evaluation Program; Faculty Development; Health Professions Education; Student Assessment; Teaching Training.


Conflict of interest statement

Declarations. Ethics approval and informed consent to participate: We confirm that our study was conducted in accordance with relevant ethical guidelines, including the Declaration of Helsinki. Additionally, it was approved by the local Institutional Research Ethics Committee (Comitê de Ética em Pesquisa do Hospital das Clínicas da Faculdade de Medicina de Ribeirão Preto da Universidade de São Paulo) (Approval Certificate: Case Number 3902020; CAAE: 29464920.6.0000.5440). Informed consent was obtained from all the participants. Consent for publication: Not applicable. Competing interests: The authors declare no competing interests.

Figures

Fig. 1: Participants' perceptions (n = 292; 81.3%) of the general quality of centralized and decentralized faculty development workshops.
Fig. 2: Participants' perceptions (n = 297; 82.7%) of student assessment topics: median knowledge before and after faculty development workshops. Legend: PA, Programmatic Assessment.


