Student-Written Exams Increase Student Involvement

What? Students writing their own exams? Yes, that’s exactly what these marketing faculty members had their students do. “The Student-Written Exam method is an open book and notes take-home exam in which each student writes and answers his or her own multiple-choice and short essay questions” (p. 32).

It’s an interesting idea that arose out of the authors’ desires to increase student involvement in learning and self-evaluation, minimize cheating, decrease exam stress, and make exam experiences more meaningful, among other goals. It’s an approach that can be used online or in class.

Even though students take all sorts of exams and quizzes across their college careers, most never get the chance to write an exam, and, as these authors point out, they need support in order to do so. They have prepared a detailed set of exam guidelines to accompany this exam writing experience (the article includes a Web address for these guidelines). The guidelines clearly identify what content is to be covered by the student exam. They list chapter learning objectives to help the students create questions on important content. They offer advice on writing multiple-choice and short essay questions and illustrate the advice with examples. They share Bloom’s taxonomy and encourage students to write challenging questions, again illustrating with examples. They include all the logistical details such as when the exam is due and how it should be formatted. And they share their grading rubric. In addition to the guidelines, they offer to review a multiple-choice and short answer question when students start writing them.

This approach has one other unique feature: an exam feedback session. Students come to it with their completed exam and, on a separate sheet, one short-answer question and answer. For an hour, student questions and answers are discussed. At the end of that session, students have 10 minutes to review and make changes to their answers.

Exams are graded on how well the set of questions covers the chapter learning objectives, how challenging the questions are, and the accuracy of the answers. In their classes of 25 to 35 students, the faculty did not find that the approach increased their exam grading time. And there was one unexpected grading benefit. “Our experience with this method also showed that it was less tedious to grade different questions than to read and grade the same answers multiple times” (p. 34).

In addition to being an evaluation experience through which students learned the content, the authors report that the approach accomplishes several other learning goals. It encourages students to take responsibility for their own learning and evaluation. It allows students to word questions and answers in ways that are meaningful to them. Writing questions and answers causes students to engage with the content in deeper ways. Students reported that this assessment experience was less stressful, and the approach largely eliminates the opportunity to cheat.

One of the biggest challenges of the approach is helping students learn how to write good questions. They don’t always see good examples on the exams they regularly take. And writing good test questions is hard, even for teachers. But here too, there is significant learning potential. Students are really being taught how to ask good questions, and that is an invaluable skill.

For instructors, the challenge is losing control over the difficulty and content of the questions students ask on their exams. Despite the supports provided, students didn’t always write challenging questions or cover all the topics the teachers felt should be covered. However, the trade-off was increased motivation. “It forced me to take more action and initiative while studying,” one student wrote. “Thinking of questions was a different way of learning” (p. 34). The authors also note that they found it informative to see what questions and problems students decided to put on their exams.

We tend to get stuck in ruts and narrow thinking when it comes to how we assess student knowledge. We have our favorite assessment approaches, which we use regularly. An exam alternative like this illustrates the viability of ideas that we may not have considered.

Reference: Corrigan, H., & Craciun, G. (2013). Asking the right questions: Using Student-Written Exams as an innovative approach to learning and evaluation. Marketing Education Review, 23(1), 31–35.

