Is this situation at all like what you're experiencing? Class sizes are steadily increasing, students need more opportunities to practice critical thinking skills, and you need to keep the amount of time devoted to grading under control. That was the situation facing a group of molecular biology and biochemistry professors teaching an advanced recombinant DNA course. They designed an interesting assessment alternative that addressed what they were experiencing.
Here are the details: “We wanted a group exam schema, which we could supplement with individual exams on the same material (but different questions) to increase student accountability and provide the necessary motivation for under-engaged students to participate with their groups” (p. 233). To accomplish these objectives, they assigned students to groups of four or five members and gave each group two weeks to complete an open-book, take-home exam. “The group learning exam questions were specifically designed to focus on higher-order cognitive skills” (p. 233). The teachers used problem-based learning questions that were complex and reflective of real-world situations. They postulated that working together on these kinds of problems would prepare students for what came next. Immediately after submitting the group-constructed, take-home essay answers, students individually took an in-class, closed-book multiple-choice exam. The multiple-choice questions were different but covered the same content and, in the judgment of the teachers, were easier than the essay questions.
Groups were formed in two different ways. For the first take-home exam, students were placed in groups by major, with each group containing a mix of majors. For the second take-home, students were assigned to groups based on their performance on the first exam: those who earned As grouped together, those with Bs grouped together, and so on.
Some of the results were predictable. The take-home essay scores for students who collaborated were about 14 percent higher than those for students in a control group who wrote the take-home exams individually. “Instructor-assigned teams outperformed the control cohort on in-class exam 1 by an average of one standard letter grade” (p. 239). The same result did not occur for exam 2: students who wrote individual take-home essay answers had higher average scores on the in-class exam than did students who prepared their essays in groups. The faculty researchers speculate that the performance-based criteria used to form groups for the second exam produced the discrepancy.
Cumulative course final exam scores were significantly higher for students who had worked in teams on the take-home exam. Specifically, 12 percent more students performed at a B grade or higher on the final than did students who turned in individually prepared take-home exams (p. 240). There was also a significant learning gain for C-performing students who worked on teams. The researchers conclude that “instructor-assigned groups helped students not only understand the material in more depth, but also retain what they had learned throughout the course of the semester” (p. 240).
The researchers also collected survey data from students regarding this testing schema. “The vast majority [80 percent and above] of students in both semesters agreed that working in groups improved/would have improved their overall learning of the course material.” However, given the choice, only a small majority (57 percent in semester one and 60 percent in semester two) indicated that they would work on exams with their peers.
Sroug, M. C., Miller, H. B., Witherow, D. S., & Carson, S. (2013). Assessment of a novel group-centered testing schema in an upper-level undergraduate molecular biotechnology course. Biochemistry and Molecular Biology Education, 41(4), 232–241.