More on Learning from Exams

Credit: iStock.com/momcilog

My interest in making exams more about learning and less about grades continues. I’m also a realist: exams will always be about grades. But could they please be at least a bit more about learning? The best way to increase learning focus is with strategies that get students dealing with their exam errors. I’ve shared some good ideas for doing that in a number of columns now (see here, here, and here). As good as those strategies are, there’s a chance another one might better fit your instructional situation. So how about this one?

For Those Who Teach from Maryellen Weimer

Barnard and Sweeder (2020) developed it for use in a large chemistry course, motivated by the fact that even though students were doing a post-exam worksheet of “exam-worthy” questions—read: questions on which they did poorly—exam scores were not improving, and the same kinds of mistakes kept showing up on subsequent exams. In the new system, students do not receive a fully graded exam in the next recitation section, as they did previously. Instead, they arrive having received an email that indicates their score on each exam question. What’s delivered to them in recitation is an unmarked, scanned hard copy of their exam. In groups, students look at their answers on what are described as constructed-response exams. They identify their mistakes and work with others in the group to construct a better answer to each exam question. Then students work individually to fill out a worksheet, responding to questions that ask how they prepared for the exam, for an analysis of their mistakes, and how they plan to prepare for the next one. After submitting the worksheet and their responses, students receive a graded copy of their exam that includes teacher feedback.

The approach abounds in commendable features. Students come to the exam debrief knowing the questions on which they lost points, but their privacy is protected: what they’ve missed is not marked on the exam used during recitation. They collaborate with fellow students to generate accurate answers—so they question each other, offer content explanations, and perhaps share answers and argue their merits. Talking about content helps students learn it. Furthermore, analysis and reflection encourage students to look beyond individual answers and toward general characteristics of mistakes. Here’s an activity that, as I’m fond of saying, gets students doing the hard, messy work of learning. The brief article does not say whether these revised answers earned students any additional credit, but that could be an option and might motivate even deeper engagement with the activity.

Too often we miss some of the learning opportunities that exam events afford. What happens here gives students a chance to learn content they hadn’t learned or had learned incorrectly for the exam. Feedback that identifies errors, even feedback that corrects the answer, does not illustrate the processes whereby one creates, finds, or improves an answer. In this strategy students get a chance to do that in the company of peers, which makes it less stressful than having to ask the teacher.

There’s always the argument about the student who aced all the answers, for whom an activity like this might be a waste of time. Given the fact that many students (even very good ones) store exam answers in short-term memory, an exam debrief can provide one more opportunity to encounter, review, and reconsider the content. And in a strategy like this, there are teaching opportunities, which are in fact learning opportunities for both teacher and student. In situations where the answer isn’t a single solution or a multiple-choice option, possible answers must be evaluated, and even good responses can be improved.

Faculty made this work in a large course. Exams were graded using an online grading program, with scores on each question distributed using the mail merge function on a word processing program. Technology does make teaching strategies once possible only in small courses now doable in big ones.
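For instructors who would rather script the merge than run it through a word processor, here is a minimal sketch of the same idea in Python. Everything in it is an assumption for illustration—the CSV layout, the column names, the file name, and the mail server—since the article itself says only that scores came from an online grading program and were distributed via mail merge.

```python
import csv
import smtplib
from email.message import EmailMessage

# Hypothetical gradebook export: one row per student, one column per
# exam question. Assumed columns: email, name, Q1, Q2, ...
GRADEBOOK = "exam1_scores.csv"          # assumed file name
SMTP_HOST = "smtp.example.edu"          # placeholder; use your institution's server
FROM_ADDR = "instructor@example.edu"    # placeholder sender address

with open(GRADEBOOK, newline="") as f:
    rows = list(csv.DictReader(f))

with smtplib.SMTP(SMTP_HOST) as smtp:
    for row in rows:
        # Pull out the per-question score columns (Q1, Q2, ...).
        question_cols = [k for k in row if k.startswith("Q")]
        score_lines = [f"{q}: {row[q]}" for q in question_cols]

        msg = EmailMessage()
        msg["Subject"] = "Exam 1: your score on each question"
        msg["From"] = FROM_ADDR
        msg["To"] = row["email"]
        msg.set_content(
            f"Hello {row['name']},\n\n"
            "Here are your scores by question. Bring this to recitation,\n"
            "where you'll revisit the questions with your group.\n\n"
            + "\n".join(score_lines)
        )
        smtp.send_message(msg)
```

The point is less the tool than the design: each student learns which questions cost them points before the debrief, while the exam copy they work from in recitation stays unmarked.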

I keep hoping the days of posting exam scores with no follow-up in the course are over. They should be. That’s an approach that does make grades matter more than learning. Yes, it’s our responsibility to test knowledge acquisition, but we are just as responsible for teaching in ways that promote learning. And there are lots of good ways we can push the learning agenda forward when returning exams.

Reference

Barnard, R. A., & Sweeder, R. D. (2020). Using online grading to stagger midterm exam feedback and create space for meaningful student reflection. College Teaching, 68(2), 60–61. https://doi.org/10.1080/87567555.2020.1713041

