Knowledge Checks


Editor’s note: This article and “Quizzes on the Go” by W. Mick Charney (published last week) describe two innovative quiz strategies. For a couple of other unique approaches to quizzes, see “The (Mostly) Unmarked Quiz” and “The Unquiz: An Enjoyable Way to Jog Students’ Memories.”

  1. I call my quizzes “knowledge checks.” The name emphasizes the primary purpose of the quiz—to discover what the student has not mastered in the assigned material. Students do not associate the word “quiz” with my formative intentions, so I avoid using it.
  2. Both my knowledge checks and exams are cumulative. I use a pool of questions from each chapter so the LMS can randomly assign equivalent questions to each student. I have two goals: to create a scaffold for course mastery and to reduce how much early course material students have forgotten by the end of the course.
  3. Students complete a 25-question knowledge check each week for the first 11 or 12 weeks of the course.
  4. Questions are deliberately designed to require analysis and evaluation of both the question prompt and the possible answers. This trains students to rely on higher-level thinking skills rather than memorization as their primary approach to studying and test-taking.
  5. Around eight to 10 of each knowledge check’s questions come from the most recently completed chapter(s) or unit(s), and the rest are drawn from material covered in earlier weeks (a rough sketch of how such a check might be assembled appears after this list).
  6. I believe looking up answers is an important component of learning. All of my knowledge checks and the midterm are open book for this reason, but not the final exam, which assesses what students have mastered in the course.
  7. I set up the LMS to drop the bottom two knowledge check scores so that students do not have to worry about missing an occasional deadline or about an unusually low score.
  8. After the deadline for the week’s knowledge check has passed, the LMS shows students the question and possible answers only for the questions they answered incorrectly.
  9. The LMS does not reveal the correct answer, nor does it display the questions that were answered correctly. This feature frustrates attempts to compile exam answer lists. I then encourage students to work with me via email to discover the correct answers for themselves, as follows:
    1. They copy and paste an image of the question and the possible answers into an email.
    2. They tell me their next best answer.
    3. They explain why that is their next best answer.
    4. I reply in one of three ways:
      1. “Yes, this is correct and your reasoning or evidence is valid.”
      2. “Yes, this is correct but your reasoning or evidence is faulty.”
        • I then include guidance and suggestions and ask them to reply with a better reason.
      3. “No, this is incorrect.”
        • I then include guidance and suggestions and ask them to give me their next best answer and reasoning.
    5. The student continues to correspond with me about each question until they discover for themselves the correct answer and the necessary reasoning or evidence. I don’t offer extra credit for reasoning through the correct answers, but since all tests are cumulative, students benefit from rising scores as the term progresses.
  10. I try to keep the knowledge check scores minor compared to the midterm and final exam scores because students are still working to master the material. The checks must be worth enough to motivate students to study but not so much that they become traumatic.
  11. My midterms and final exams are drawn randomly by the LMS from the same question pools.
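
To make the mechanics in items 2, 5, and 7 concrete, here is a minimal, hypothetical sketch (in Python) of how an LMS might assemble one of these cumulative knowledge checks and apply the drop-the-two-lowest-scores rule. The pool layout, question counts, and function names are illustrative assumptions, not the configuration of any particular LMS.

```python
"""Hypothetical sketch of assembling a cumulative 25-question knowledge check
from chapter question pools and of averaging scores after dropping the two
lowest. Pool names, counts, and helpers are illustrative assumptions."""
import random


def build_knowledge_check(recent_pools, earlier_pools, total=25, recent_count=9):
    """Draw `recent_count` questions from the most recently completed
    chapter(s) and fill the remaining slots from earlier chapters so the
    check stays cumulative."""
    recent = random.sample([q for pool in recent_pools for q in pool], recent_count)
    earlier = random.sample([q for pool in earlier_pools for q in pool], total - recent_count)
    questions = recent + earlier
    random.shuffle(questions)  # each student gets an equivalent, randomized set
    return questions


def knowledge_check_average(scores, drop=2):
    """Average the knowledge check scores after dropping the lowest `drop`
    scores, mirroring the forgiveness rule for missed or unusually low checks."""
    kept = sorted(scores)[drop:] if len(scores) > drop else scores
    return sum(kept) / len(kept)


if __name__ == "__main__":
    # Hypothetical pools: chapter number -> list of question IDs
    pools = {ch: [f"ch{ch}-q{i}" for i in range(1, 31)] for ch in range(1, 6)}
    check = build_knowledge_check(
        recent_pools=[pools[5]],                       # most recently completed chapter
        earlier_pools=[pools[c] for c in range(1, 5)]  # everything covered earlier
    )
    print(len(check), "questions, e.g.:", check[:5])
    print("Average after dropping the two lowest:",
          knowledge_check_average([80, 92, 55, 0, 88, 95, 76]))
```

In practice this assembly happens inside the LMS itself; the point of the sketch is simply that each student receives an equivalent but not identical, cumulative set of questions, and that the two lowest check scores are forgiven at grading time.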

Keith Weber is a lecturer in the management information systems program at the Rochester Institute of Technology Saunders College of Business.
