Editor’s note: This article and “Quizzes on the Go” by W. Mick Charney (published last week) describe two innovative quiz strategies. For a couple of other unique approaches to quizzes, see “The (Mostly) Unmarked Quiz” and “The Unquiz: An Enjoyable Way to Jog Students’ Memories.”
- I call my quizzes “knowledge checks.” The name emphasizes the primary purpose of the quiz—to discover what the student has not mastered in the assigned material. Students do not associate the word “quiz” with my formative intentions, so I avoid using it.
- Both my knowledge checks and exams are cumulative. I use a pool of questions from each chapter to allow the LMS to randomly assign equivalent questions to each student. I have two goals: to create a scaffold for course mastery and to help reduce the amount of early course material forgotten by the end of the course.
- Students complete a 25-question knowledge check each week for the first 11 or 12 weeks of the course.
- Questions are deliberately designed to require analysis and evaluation of both the question prompt and the possible answers. This trains students to use their higher-level thinking skills rather than memorization as their primary approach to study and test taking.
- Around eight to 10 of each knowledge check’s questions are from the most recently completed chapter(s) or unit(s), and the rest are drawn from the materials from earlier weeks.
- I believe looking up answers is an important component of learning. All of my knowledge checks and the midterm are open book for this reason, but not the final exam, which assesses what students have mastered in the course.
- I set up the LMS to drop the bottom two knowledge check scores so that students do not have to worry about missing an occasional deadline or about an unusually low score.
- After the deadline has passed to complete the week’s knowledge check, the LMS lets students see the question and possible answers only for those questions that they answered incorrectly.
- The LMS does not reveal the correct answer, nor does it display the questions that were answered correctly. This feature frustrates attempts to compile exam answer lists. I then encourage students to work with me via email to discover, for themselves, the correct answers, as follows:
- They paste an image of the question and the possible answers into an email.
- They tell me their next best answer.
- They explain why that is their next best answer.
- I reply in one of three ways:
- “Yes, this is correct and your reasoning or evidence is valid.”
- “Yes, this is correct but your reasoning or evidence is faulty.”
- I then include guidance and suggestions and ask them to reply with a better reason.
- “No, this is incorrect.”
- I then include guidance and suggestions and ask them to give me their next best answer and reasoning.
- The student continues to correspond with me about each question until they discover for themselves the correct answer and the necessary reasoning or evidence. I don’t offer extra credit for reasoning through the correct answers, but since all tests are cumulative, students benefit from rising scores as the term progresses.
- I try to keep the knowledge check scores minor in comparison to midterm and final exam scores, because the students are still working to master the material. The checks must be worth enough to motivate students to study but not so much that they become traumatic.
- My midterms and final exams are drawn randomly by the LMS from the same question pools.
Keith Weber is a lecturer in the management information systems program at the Rochester Institute of Technology Saunders College of Business.