CBE Life Sci Educ. 2022 Dec;21(4):ar61.
doi: 10.1187/cbe.21-08-0199.

Collaborative Teaching plus (CT+): A Timely, Flexible, and Dynamic Course Design Implemented during Emergency Remote Teaching in an Introductory Biology Course

Kamal S Dulai et al. CBE Life Sci Educ. 2022 Dec.

Abstract

Student-centered pedagogies promote student learning in college science, technology, engineering, and mathematics (STEM) classrooms. However, transitioning to active learning from traditional lecturing may be challenging for both students and instructors. This case study presents the development, implementation, and assessment of a modified collaborative teaching (CT) and team-based learning (TBL) approach (CT plus TBL, or CT+) in an introductory biology course at a Minority-Serving Institution. A logic model was formulated depicting the various assessment practices, with the culminating goal of improving the student learning experience. We analyzed qualitative and quantitative data based on students' and instructors' behaviors and discourse, as well as student midsemester and end-of-semester surveys. Our findings revealed that the integration of multiple instructors allowed for knowledge exchange in blending complementary behaviors and discourse practices during class sessions. In addition, the frequent ongoing assessments and incorporation of student feedback informed the CT+ design during both in-person and emergency remote teaching. Furthermore, this course design could be easily adapted to a variety of STEM courses in higher education, including remote instruction.

Figures

FIGURE 1.
Schematic comparison between TBL (top panel) and CT+ (bottom panel). The different components of the two approaches are displayed on the left. The three phases of the instruction format—pre-lecture, lecture, and post-lecture—flow from left to right (arrowheads). The design content of lectures for both approaches is displayed in the center block. The final block on the right relays any post-lecture components.
FIGURE 2.
CT+ logic model depicts a twofold development process: theory of action and assessment design represented by brackets on the left. The model is read from left to right with the culminating outcome of the improved student learning experience. In broad strokes, the CT+ components are postulated to cause a change in instructor practice that, in turn, influences student behavior and improves the student learning experience. Course design (teal box) is informed by ongoing assessment and evaluation at every level: course (blue), instructor outcomes (orange), and student outcomes (green). The assessment design incorporated collaboration between the course instructors and the SATAL team. The corresponding components are included in the nested brackets and the remaining green boxes. The various arrows represent the flow of information between the respective components. CLO, course learning objective(s).
FIGURE 3.
Diagram displaying the methodology used for qualitative data analysis. The five steps of content analysis from Saldaña (2015) were used to identify the categories from the open-ended Qualtrics survey (results shown in the left column). Representative examples of each step are listed in the right column.
FIGURE 4.
Analysis of student and instructor interactions during the CT+ timeline. (A) The CT+ timeline illustrates the components of each element of the session adjusted according to duration (length of the arrows). This timeline correlates directly with the intervals shown in corresponding graphs B, D, and F. (B) Instructor behaviors were measured using COPUS analysis. Individual codes are listed on the left and grouped by collapsed codes. The filled colored boxes (blocks) represent observations of behaviors by the lead instructor (red), co-instructor (gray), and both instructors (magenta). (C) Percentage of time each instructor spent on individual and collapsed COPUS codes during the session. The individual codes are grouped into four collapsed categories (Presenting, Guiding, Administering, Other). (D) Student behavior was measured using COPUS analysis. The individual codes were organized into four categories. The blue boxes correspond to student behaviors observed during those time intervals. (E) Percentage of time students spent with reference to the individual COPUS codes (mean) and total (collapsed) times. The collapsed codes are stated in filled boxes. (F) Instructor discourse was measured using CDOP analysis during the observed session. Individual codes are listed on the left, with grouping. Boxes are color-coded as in A. (G) Percentage of time each instructor spent on individual and collapsed CDOP codes during the session. The individual codes are grouped into four collapsed CDOP codes (Authoritative, Non-interactive; Authoritative, Interactive; Dialogic, Interactive; and Other).
FIGURE 5.
Impact of pre-lecture material on students’ learning experience. Midsemester and end-of-semester surveys were used to collect student feedback (A, B). Each row relays levels of student agreement, from strongly disagree to strongly agree. Percentages on the left of each box are the combined percentages of students who either strongly disagreed or disagreed (red tones), while those on the right are the combined percentages who agreed or strongly agreed (blue tones). The central percentage corresponds to those who neither agreed nor disagreed (gray boxes).
FIGURE 6.
Course resources have a positive impact on students’ learning experience. Midsemester and end-of-semester surveys were used to collect student feedback on course resources (A, B). Each row relays levels of student agreement for four resources, from strongly disagree to strongly agree. Percentages on the left of each box are the combined percentages of students who either strongly disagreed or disagreed (red tones), while those on the right are the combined percentages who agreed or strongly agreed (blue tones). The central percentage corresponds to those who neither agreed nor disagreed (gray boxes). Open-ended responses from student feedback were quantified (C, D). The most frequent categories of student comments in the midsemester and end-of-semester feedback were identified, with instructor-oriented comments in black and student-oriented elements in blue. The numbers displayed within each circle refer to the number of times those categories appeared in the survey results (i.e., number of references). The blue circles represent categories that worked well, while orange circles represent categories that needed adjustment.
