Writing Assignments with a Metacognitive Component Enhance Learning in a Large Introductory Biology Course

Michelle Mynlieff et al. CBE Life Sci Educ. 2014 Summer;13(2):311-21. doi: 10.1187/cbe.13-05-0097.

Abstract

Writing assignments, including note taking and written recall, should enhance retention of knowledge, whereas analytical writing tasks with metacognitive aspects should enhance higher-order thinking. In this study, we assessed how writing-intensive "interventions" with a metacognitive component, namely written exam corrections and peer-reviewed writing assignments using Calibrated Peer Review, improve student learning. We designed and tested the possible benefits of these approaches using control and experimental groups across and between the three sections of our introductory biology course. Students who corrected exam questions showed significant improvement on postexam assessments compared with their nonparticipating peers. Differences were also observed between students participating in written and discussion-based exercises. Students with low ACT scores benefited equally from written and discussion-based exam corrections, whereas students with midrange to high ACT scores benefited more from written than from discussion-based exam corrections. Students scored higher on topics learned via peer-reviewed writing assignments than on topics learned through active classroom discussion or traditional lecture. However, students with low ACT scores (17-23) did not show the same benefit from peer-reviewed written essays as the other students. These changes offer significant student learning benefits with minimal additional effort by the instructors.


Figures

Figure 1.
Exam corrections increase student performance. Students participating in exam interventions (Table 2) were assessed for learning. Two weeks following each exam, five higher-order questions were administered in class for assessment (see Supplemental Material). The maximum score for each assessment was 10, with two points given for each correct answer. Because the assessment scores went up for all groups as the semester progressed, the assessment scores for exams 1 and 2 were normalized by multiplying each score by a normalization factor. The normalized data are expressed as the mean ± SEM and analyzed by a two-way ANOVA with the exam number and correction type as the independent variables, followed by the Holm-Sidak method of pairwise comparisons. n = 447 students; (a) p < 0.001; (b) p = 0.012.
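In outline, the Figure 1 analysis could be reproduced with a short Python sketch. This is illustrative only: the file name, the column names (score, exam, correction), and the use of the exam 3 mean as the reference for the unstated normalization factor are assumptions, not details from the paper.

    # Illustrative sketch of the Figure 1 analysis (assumed column names and layout).
    import pandas as pd
    from itertools import combinations
    from scipy import stats
    import statsmodels.formula.api as smf
    from statsmodels.stats.anova import anova_lm
    from statsmodels.stats.multitest import multipletests

    df = pd.read_csv("assessments.csv")  # one row per student per exam (hypothetical file)

    # Rescale each exam's assessment scores to a common mean; this stands in for
    # the normalization factor described in the caption.
    reference_mean = df.loc[df["exam"] == 3, "score"].mean()
    exam_means = df.groupby("exam")["score"].transform("mean")
    df["norm_score"] = df["score"] * (reference_mean / exam_means)

    # Two-way ANOVA with exam number and correction type as independent variables.
    model = smf.ols("norm_score ~ C(exam) * C(correction)", data=df).fit()
    print(anova_lm(model, typ=2))

    # Pairwise comparisons between correction types with Holm-Sidak adjustment.
    groups = {name: g["norm_score"].to_numpy() for name, g in df.groupby("correction")}
    pairs = list(combinations(groups, 2))
    raw_p = [stats.ttest_ind(groups[a], groups[b]).pvalue for a, b in pairs]
    adj_p = multipletests(raw_p, method="holm-sidak")[1]
    for (a, b), p in zip(pairs, adj_p):
        print(f"{a} vs {b}: adjusted p = {p:.4f}")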
Figure 2.
Students with low ACT scores benefit more from exam corrections than other students. (A) Exam-correction improvement data sorted into low (17–23), midrange (24–30), and high (31–35) ACT groups. Data were collected and normalized as described for Figure 1. A two-way ANOVA revealed significant differences in the postexam assessment scores across interventions and across ACT groups (p < 0.001). One-way ANOVAs revealed significant differences between interventions within each ACT group (p = 0.002, p < 0.001, and p = 0.003 for the low, midrange, and high ACT groups, respectively). Pairwise comparisons with the Holm-Sidak method revealed differences between no corrections and written corrections for all three ACT groups, and between no corrections and the discussion activity for the low and midrange ACT groups but not for the high ACT group. (a) p < 0.001; (b) p = 0.001; (c) p = 0.003; (d) p = 0.013. (B) The postexam assessment score for each student following the written corrections or discussion activity was divided by the postexam assessment score in the absence of any corrections and multiplied by 100 to obtain a percent gain with interventions. A two-way ANOVA revealed that the mean values differed significantly by ACT score (p = 0.029). A post hoc analysis using the Holm-Sidak procedure revealed that the low ACT group improved significantly more on the postexam assessments following the discussion activity and written corrections than either the midrange (p = 0.011) or high (p = 0.017) ACT group. The average gain for each intervention in each group was compared with 0 by a one-sample t test to determine whether there was a significant gain over no corrections; the p value for all groups and all interventions, except the discussion activity in the high ACT group, was < 0.002 (*). A two-sample t test was used to compare the average gain with written corrections with that from the discussion activity for each ACT group. The sample size was 42 for the low ACT group, 279 for the midrange ACT group, and 70 or 71 for the high ACT group, depending on the intervention. (a) p = 0.021. Data represent mean ± SEM.
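The percent-gain calculation and t tests in panel B could be sketched as below. The file name and column names (no_corr, written, discussion, act_group) are hypothetical, and treating "percent gain" as (intervention / baseline − 1) × 100, so that 0 means no gain, is an interpretation rather than the paper's stated formula.

    # Illustrative sketch of the Figure 2B analysis (assumed column names and layout).
    import pandas as pd
    from scipy import stats

    wide = pd.read_csv("per_student_scores.csv")  # one row per student (hypothetical file)

    # Percent gain of each intervention relative to the no-correction score.
    for col in ("written", "discussion"):
        wide[f"gain_{col}"] = (wide[col] / wide["no_corr"] - 1.0) * 100.0

    for band, grp in wide.groupby("act_group"):  # low / midrange / high ACT bands
        # One-sample t test: is the mean gain for each intervention different from 0?
        for col in ("gain_written", "gain_discussion"):
            t, p = stats.ttest_1samp(grp[col].dropna(), 0.0)
            print(f"{band} {col}: t = {t:.2f}, p = {p:.4f}")
        # Two-sample t test: written corrections versus the discussion activity.
        t, p = stats.ttest_ind(grp["gain_written"].dropna(), grp["gain_discussion"].dropna())
        print(f"{band} written vs discussion: t = {t:.2f}, p = {p:.4f}")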
Figure 3.
Writing assignments increase student short-term and long-term retention. Students participating in interventions (Table 3) were assessed for proficiency 2 wk after the assignment (five-question quiz; short-term retention) and on the cumulative final exam (two exam questions for each topic; long-term retention). In each case, the assessment was scored out of 10 points. Overall, students performed best on the scientific design assessments, so all data were normalized within each topic by multiplying each individual score by the ratio of the average score on the scientific design assessment to the average score on the assessment for that topic, as described in Methods. The data were analyzed by a two-way ANOVA with topic and instruction mode (essay, discussion, or lecture only) as the independent variables (p < 0.001), followed by the Holm-Sidak method of pairwise comparisons. (a) p ≤ 0.001; (b) p = 0.020. n = 551, 563, and 568 for essay and lecture, discussion and lecture, and lecture only, respectively. The normalized data are expressed as mean ± SEM.
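The topic normalization described above amounts to rescaling each topic's scores to the mean of the scientific-design assessment. A minimal sketch, with an assumed file name and assumed column names (topic, score):

    # Illustrative sketch of the Figure 3 normalization (assumed column names and layout).
    import pandas as pd

    long = pd.read_csv("retention_scores.csv")  # one row per student per topic (hypothetical file)

    reference_mean = long.loc[long["topic"] == "scientific design", "score"].mean()
    topic_means = long.groupby("topic")["score"].transform("mean")
    long["norm_score"] = long["score"] * (reference_mean / topic_means)

    # The normalized scores can then enter a two-way ANOVA with topic and
    # instruction mode as factors, followed by Holm-Sidak pairwise comparisons,
    # as sketched for Figure 1.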
Figure 4.
Effect of writing assignments on long-term retention varies with students' ACT scores. Student scores on the final exam questions (two questions for each topic; scored out of 10 points) were normalized as described for Figure 3, separated by ACT group for the long-term assessment, and analyzed with a one-way ANOVA followed by Holm-Sidak pairwise comparisons. For the low ACT group: n = 58, 61, and 60 for essay, discussion, and lecture, respectively; for the midrange ACT group: n = 358, 348, and 367; and for the high ACT group: n = 96, 93, and 95. (a) p = 0.002; (b) p = 0.021; (c) p = 0.001. The normalized data are expressed as mean ± SEM.
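The per-ACT-group comparison (a one-way ANOVA within each band followed by Holm-Sidak-adjusted pairwise tests) could look like the sketch below; the file name and column names (act_group, mode, final_score) are again assumptions.

    # Illustrative sketch of the Figure 4 analysis (assumed column names and layout).
    from itertools import combinations
    import pandas as pd
    from scipy import stats
    from statsmodels.stats.multitest import multipletests

    final = pd.read_csv("final_exam_scores.csv")  # scores normalized as for Figure 3 (hypothetical file)

    for band, grp in final.groupby("act_group"):
        modes = {m: g["final_score"].to_numpy() for m, g in grp.groupby("mode")}
        f, p = stats.f_oneway(*modes.values())
        print(f"{band}: one-way ANOVA F = {f:.2f}, p = {p:.4f}")
        pairs = list(combinations(modes, 2))
        raw_p = [stats.ttest_ind(modes[a], modes[b]).pvalue for a, b in pairs]
        adj_p = multipletests(raw_p, method="holm-sidak")[1]
        for (a, b), q in zip(pairs, adj_p):
            print(f"  {a} vs {b}: adjusted p = {q:.4f}")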
