Test Questions and Quizzing Improve Exam Performance

Sometimes courses with large enrollments spawn useful innovations, and this study examines one empirically. Large courses almost always necessitate multiple-choice tests, and incorporating quizzes in them presents sizeable logistical challenges. To cope with that situation in a large molecular biology course, the faculty involved here developed an online multiple-choice question (MCQ) authoring, testing, and learning tool they dubbed Quizzical. Students in the course used the tool to write MCQs that were then used as quiz questions. The research explores how use of Quizzical affected performance in the course.

The study

Riggs, C. D., Kang, S., & Rennie, O. (2020). Positive impact of multiple-choice authoring and regular quiz participation on student learning. CBE—Life Sciences Education, 19(2). https://doi.org/10.1187/cbe.19-09-0189 [open access]

The research questions

  • What’s the impact of Quizzical engagement on final course grades?
  • Does incentivizing participation in authoring questions and taking quizzes correlate with higher exam performance?
  • Does a change in the level of Quizzical engagement between tests affect students’ performance on the subsequent test?

Interesting background information

Lots of prior research supports the use of interventions in which students work with exam questions and take quizzes. The design of these activities and the logistical support the software provides make this intervention of particular interest. Students were assigned a lecture, after which they authored two exam questions based on the lecture content. In addition to preparing the questions, students provided justifications for the correct answers and explanations of the distractors. The software provided guidance in writing MCQs, and the questions were reviewed (and could be revised) before they were posted. Students could take a 10-question quiz within 14 days of the lecture, and scores of 60 percent or more counted toward 6 to 8 percent of the course grade. For missed questions, students were able to review the supporting material prepared by the question author. Both parts of the intervention were handled by the Quizzical software with minimal instructor involvement after initial setup.

Study cohort

The researchers studied the use of Quizzical in 500-student sections of a sophomore molecular biology course offered in 2017 and 2018.

Methodological overview

The software tracked use of the quiz option as well as students’ scores. The research team categorized quiz use and scores and then compared them with scores on three exams and a cumulative final, and they analyzed how the level of engagement between exams affected scores on the subsequent exam. The analyses used various statistical tests and controlled for prior academic performance.

Findings

  • Both question authorship and quiz performance correlated highly with test performance and course grade.
  • “The participation incentive is likely to be a strong contributor to the positive association we see between engagement and test scoring.”
  • “Students whose Quizzical engagement [defined as the amount of quiz participation] increased from one exam to the next earned statistically significant higher scores on the subsequent exam.”
  • Overall conclusion: “The results presented here provide compelling evidence that Quizzical use strongly supports students’ learning in the course, after controlling for prior academic ability.”

Cautions and caveats

The researchers report that they did find a significant gender bias in their results: male students performed significantly better than female students in both years of the study. Students in this study were predominantly STEM majors, and previous research on STEM students reports conflicting findings with respect to gender bias. The research team writes, “The underlying reasons for these apparent gender biases are not clear, as different studies employ different metrics, assessment formats, and methods of analysis, but there are undoubtedly sociocultural and psychological factors involved. Despite the fact gender bias exists for the course grade in our study, we found no gender differences in Quizzical participation.”

Any study done with cohorts that share the same or similar majors raises the question of whether the findings can be generalized to students in other majors. So, although it is not possible to guarantee the same results with the Quizzical software when it’s used with other student cohorts, research results in many fields confirm that exam scores and course grades improve when students are involved with potential test questions.

Practical implications (what you might want to do about this research)

The strongest implications rest on the now well-established value of what’s called “test-enhanced learning.” Whether it’s giving students old exams, having them write potential test questions, or having them take quizzes, significant exam score improvements are regularly reported.

This particular study shows the value of incentivizing participation in exam preparation activities. Doing so plays into that typical student response: Don’t do anything that doesn’t earn points. But better exam scores mean more learning, so perhaps that makes it a worthwhile tradeoff. Moreover, as most of us have learned, even minimal rewards motivate student participation.

Finally, the beauty of this particular intervention rests on students completing activities that benefit their learning at the same time they relieve teachers of time-consuming tasks. Students write (and revise, if needed) what become the quiz questions. They justify the right answers and explain the errors in the distractor items. With the quizzes, students decide when to take them and how many to complete. The software manages the quiz process, including keeping track of quiz scores and other relevant data.

Information about using Quizzical may be obtained from one of the authors, whose email appears in the article. Various support materials guide the setup process. The researchers write that “deploying Quizzical is straightforward, as it was designed for instructors with little or no experience in using educational software. . . . Once the course has been set up, there is little or no intervention needed.”

Related research

Much research on the value of test questions and quizzes exists. Here are two good examples that illustrate work in this area:

Batsell Jr., W. R., Perry, J. L., Hanley, E., & Hostetter, A. B. (2017). Ecological validity of the testing effect: The use of daily quizzes in introductory psychology. Teaching of Psychology, 44(1), 18–23. https://doi.org/10.1177/0098628316677492

Brame, C. J., & Biel, R. (2015). Test-enhanced learning: The potential for testing to promote greater learning in undergraduate science courses. CBE—Life Sciences Education, 14(2). https://doi.org/10.1187/cbe.14-11-0208 [open access]
