Question-writing as a learning tool for students--outcomes from curricular exams

Alexander Jobs et al. BMC Med Educ. 2013 Jun 21;13:89. doi: 10.1186/1472-6920-13-89.

Abstract

Background: Writing exam questions can be a valuable learning tool. We asked students to construct multiple-choice questions for curricular exams in Internal Medicine. The questions for each exam were chosen from a pool of at least 300 student-written questions, and the uncorrected pool was accessible to all students. We studied the influence of this approach on the students' learning habits and their test results. We hypothesized that creating a pool of their own questions for the exams would encourage students to discuss the learning material.

Methods: All students had to pass four exams covering seven fields of Internal Medicine. Three of the exams comprised 20 questions each, and we applied the new method in one of them. The fourth exam comprised 30 questions, 15 of which were chosen from the students' pool. After all exams had been completed, we asked the students to fill in a web-based questionnaire on their learning habits and their views on the new approach. The results on the student-written questions were compared with the results on the lecturers' questions, which were used to define high- and low-performing students.

Results: A total of 102 students completed all four exams in sequence, 68 of whom filled in the questionnaire. Low-performing students achieved significantly better results on the student-written questions. There was no difference between the two groups in the number of questions they constructed. The new method did not significantly promote group work. However, high-performing students expressed a stronger wish to be rewarded for their effort with good exam results.

Conclusions: Creating a curricular exam by choosing questions from a pool constructed by students did not significantly influence learning habits and favored low-performing students. Since high-performing students sought to be rewarded for their efforts, we do not consider the approach applied in our study to be appropriate.


Figures

Figure 1. Number of students taking the exams and filling in the questionnaire.

Figure 2. Bland-Altman plot of the performance of the students who took all four exams (n = 102), differentiated by question type. The gray saturation indicates how many data points coincide at an xy-pair (up to 5 students per point).

Figure 3. Bland-Altman plot of the performance of the students who took the GHO exam (n = 256), differentiated by question type. The gray saturation indicates how many data points coincide at an xy-pair (up to 19 students per point).
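Figures 2 and 3 summarize agreement between scores on the two question types with Bland-Altman plots, which place the mean of the two scores on the x-axis and their difference on the y-axis. The sketch below is a minimal, illustrative reconstruction of that plotting approach; the score arrays, sample size, and variable names (lecturer_scores, student_scores) are made-up placeholders, not the study's data.

```python
# Minimal Bland-Altman plot sketch with placeholder data (NOT the study's data).
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)

# Hypothetical per-student scores (fraction correct) on the two question types.
lecturer_scores = rng.uniform(0.4, 1.0, size=102)
student_scores = np.clip(lecturer_scores + rng.normal(0.05, 0.1, size=102), 0.0, 1.0)

mean_score = (lecturer_scores + student_scores) / 2   # x-axis: average performance
diff_score = student_scores - lecturer_scores         # y-axis: difference between question types

bias = diff_score.mean()
loa = 1.96 * diff_score.std(ddof=1)                   # 95% limits of agreement

plt.scatter(mean_score, diff_score, alpha=0.4)        # transparency mimics the gray saturation for overlapping points
plt.axhline(bias, color="k", linestyle="-")
plt.axhline(bias + loa, color="k", linestyle="--")
plt.axhline(bias - loa, color="k", linestyle="--")
plt.xlabel("Mean score across both question types")
plt.ylabel("Student-written minus lecturer-written score")
plt.title("Bland-Altman plot (illustrative data)")
plt.show()
```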
