Int J Med Educ. 2017 Nov 24;8:408-413. doi: 10.5116/ijme.5a10.04e1.

Predictive validity of pre-admission assessments on medical student performance


Al-Awwab Dabaliz et al. Int J Med Educ. 2017.

Abstract

Objectives: To examine the predictive validity of pre-admission variables on students' performance in a medical school in Saudi Arabia.

Methods: In this retrospective study, we collected admission and college performance data for 737 students in preclinical and clinical years. Data included high school scores and other standardized test scores, such as those of the National Achievement Test and the General Aptitude Test. Additionally, we included the scores of the Test of English as a Foreign Language (TOEFL) and the International English Language Testing System (IELTS) exams. These data were then compared with college performance indicators, namely the cumulative Grade Point Average (cGPA) and progress test scores, using multivariate linear regression analysis.
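As an illustration of the analytic approach, the following Python sketch fits the kind of multivariate linear regression described above using pandas and statsmodels. The column names and example values are hypothetical placeholders for this sketch, not data from the study.

import pandas as pd
import statsmodels.api as sm

# Hypothetical admission and performance records (one row per student);
# the study's actual variables and values are not reproduced here.
df = pd.DataFrame({
    "high_school":     [92.0, 88.5, 95.0, 90.2, 97.1, 86.0, 93.4, 89.9],
    "nat_achievement": [85.0, 78.0, 91.0, 83.5, 94.0, 76.5, 88.0, 81.0],
    "gen_aptitude":    [80.0, 75.5, 88.0, 79.0, 90.5, 72.0, 84.5, 77.5],
    "toefl":           [95,   88,   102,  90,   110,  85,   99,   92],
    "cgpa":            [3.4,  3.1,  3.8,  3.3,  3.9,  2.9,  3.6,  3.2],
})

# Regress preclinical cGPA on the pre-admission variables (ordinary least squares).
X = sm.add_constant(df[["high_school", "nat_achievement", "gen_aptitude", "toefl"]])
y = df["cgpa"]
model = sm.OLS(y, X).fit()

# summary() reports unstandardized coefficients (B) and p-values, analogous in
# form to those quoted in the Results section below.
print(model.summary())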

Results: In preclinical years, both the National Achievement Test (p=0.04, B=0.08) and TOEFL (p=0.017, B=0.01) scores were positive predictors of cGPA, whereas the General Aptitude Test (p=0.048, B=-0.05) negatively predicted cGPA. Moreover, none of the pre-admission variables were predictive of progress test performance in the same group. On the other hand, none of the pre-admission variables were predictive of cGPA in clinical years. Overall, cGPA strongly predicted students' progress test performance (p<0.001, B=19.02).

Conclusions: Only the National Achievement Test and TOEFL significantly predicted performance in preclinical years. However, these variables did not predict progress test performance, meaning that they do not capture the functional knowledge reflected in the progress test. We report various strengths and deficiencies in the current medical college admission criteria and call for more sensitive and valid criteria that predict student performance and functional knowledge, especially in the clinical years.

Keywords: college performance indicators; cumulative grade point average; predictive validity; progress test; standardized admission tests.


