BMC Med Educ. 2025 Apr 21;25(1):587.
doi: 10.1186/s12909-025-07151-5.

Advancing the assessment of clinical competence in Latin America: a scoping review of OSCE implementation and challenges in resource-limited settings


Soledad Armijo-Rivera et al. BMC Med Educ.

Abstract

Background: The Objective Structured Clinical Examination (OSCE) is an important tool for assessing clinical competencies in the health professions. However, in Latin America, a region with limited resources, the implementation and quality of OSCEs remain underexplored despite their increasing use. This study analyses how the OSCE has been applied in Latin America and how its quality has evolved.

Methods: A scoping review methodology was used, with searches across PubMed, Scopus, Web of Science, LILACS, and SciELO, including studies on the implementation of the OSCE in Latin America written in English, French, Portuguese, or Spanish. Study quality was assessed using the criteria of AMEE Guides 81 and 49 and the MMERSQI. Data were extracted on OSCE structure, evaluator training, validity, reliability, and the use of simulated patients.

Results: 365 articles were retrieved, of which 69 met the inclusion criteria. The first report of OSCE implementation in the region dates back to 2000. Three countries (Chile, Mexico, and Brazil) accounted for 84.06% of the reports, and 68.12% of OSCEs were applied in undergraduate programs. In this group, implementation was mainly in medicine (69.57%), with lesser use in physiotherapy (7.95%) and nursing (2.9%). The number of stations and the duration of each varied, with 18-station circuits being the most common. Evidence of the validity and reliability of the OSCE was reported in 26.09% of the reports, feedback to students in 33.33%, and simulated patient training in 37.68%. Notable trends in the quinquennial analysis are the increased use of high-fidelity simulation and the shift towards remote OSCEs during the pandemic. Recurrently reported challenges include the inclusion of inactive stations, inadequate training of simulated patients, and the absence of evidence supporting instrument validation. Overall methodological quality has improved, as evidenced by the use of an OSCE committee and a blueprint in nearly 50% of the studies and by rising MMERSQI scores, especially in recent years.

Conclusion: While there has been progress in OSCE implementation, particularly in medical education, gaps remain in standardization, validation, training, and resource allocation. Further efforts are needed to ensure consistent quality, particularly in training simulated patients, addressing inactive stations, and ensuring instrument reliability. Addressing these gaps is crucial for enhancing the effectiveness of OSCEs in resource-limited settings and advancing health professional education across the region.

Keywords: Assessment; Latin America; OSCE; Objective structured clinical examination; Quality education.


Conflict of interest statement

Declarations. Ethics approval and consent to participate: Not applicable. Consent for publication: Not applicable. Competing interests: The authors declare no competing interests.

Figures

Fig. 1 PRISMA flowchart, OSCE Latam
Fig. 2 OSCE articles published from Latam (trend 2000 to 2024)
Fig. 3 OSCE articles and per capita income, 2023
Fig. 4 Heat map of article quality analyzed by quinquennium


References

    1. Harden RM. Revisiting 'Assessment of clinical competence using an objective structured clinical examination (OSCE)'. Med Educ. 2016;50(4):376–9. doi: 10.1111/medu.12801. PMID: 26995470.
    2. Boursicot K, Roberts T, Burdick W. Structured assessments of clinical competence. In: Understanding medical education. London: Wiley; 2018. pp. 335–45.
    3. Khan KZ, Gaunt K, Ramachandran S, Pushkar P. The Objective Structured Clinical Examination (OSCE): AMEE Guide No. 81. Part II: organisation & administration. Med Teach. 2013;35(9):e1447–63. doi: 10.3109/0142159X.2013.818635. PMID: 23968324.
    4. Daniels VJ, Pugh D. Twelve tips for developing an OSCE that measures what you want. Med Teach. 2018;40(12):1208–13. Epub 2017 Oct 25. PMID: 29069965.
    5. Pell G, Fuller R, Homer M, Roberts T; International Association for Medical Education. How to measure the quality of the OSCE: A review of metrics - AMEE Guide No. 49. Med Teach. 2010;32(10):802–11. doi: 10.3109/0142159X.2010.507716. PMID: 20854155.
