I remember with horror and embarrassment the first multiple-choice exam I wrote. I didn’t think the students were taking my course all that seriously, so I decided to use the first exam to show just how substantive the content really was. I wrote long, complicated stems and followed them with multiple answer options and various combinations of them. And it worked. Students did poorly on the exam. I was pleased until I returned the test on what turned out to be one of the longest class periods of my teaching career. I desperately needed the advice that follows here.
- Don’t plan to write the entire test at once. If you do, chances are most of the questions will be of the “When did Columbus first visit the new world?” variety.
- The instructions should specify whether the student is selecting the correct answer or the best answer.
- Write the stem first. Aim for a stem that presents a single problem, and make it a problem related to significant content in the course. Using a verb in the stem helps ensure it presents the problem clearly.
- Write the stem either as a question or as an incomplete statement. Questions are generally preferable to statements because they make it obvious what the student is expected to answer.
- State the question or statement positively, avoiding negatives. Negatively worded questions trip up even careful readers, and students stressed about an exam are easily confused by them.
- After you’ve crafted the stem, write the correct or best answer. Make it brief and clear. It shouldn’t be longer than the incorrect options.
- Now write the incorrect answers, known as distractors. Common student errors make good, plausible distractors. It’s generally best to avoid humorous options. Some research shows that they don’t relax students, and a ridiculous choice is obviously not the right answer, so students who don’t know the material end up guessing among fewer options.
- Include in the stem all the words needed to answer the question. Don’t repeat in the options words or phrases that could be moved into the stem.
- Terms like “all,” “never,” and “always” appear more often in incorrect options than in correct ones. Test-wise students know this and use it to their advantage.
- Check for grammatical consistency between the stem and each option. An option that doesn’t follow grammatically from the stem sounds wrong, and most students won’t select it.
- Generally avoid “none of the above” and “all of the above,” or use them very cautiously. If students are selecting the best answer, “all of the above” is obviously a wrong answer, and “none of the above” is very likely wrong. If the instructions are to pick the correct answer, research has documented that when “all of the above” is offered, it is the right answer 25% of the time.
- Most test experts recommend four or five answer options.
- When the test is complete, mark the correct answers and check that they are randomly distributed among the option positions. Students quickly learn if, for example, you tend to make “C” the correct answer, and they use that knowledge when they don’t know the answer.
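The last check in the list is easy to automate. Here’s a minimal sketch, using a made-up answer key, that tallies how often each option position is correct and flags positions that are noticeably over-used:

```python
from collections import Counter

# Hypothetical answer key for a 20-question, four-option exam
answer_key = ["C", "A", "C", "B", "D", "C", "A", "C", "B", "C",
              "D", "C", "A", "B", "C", "C", "D", "A", "C", "B"]

counts = Counter(answer_key)
expected = len(answer_key) / 4  # even distribution across four options

for option in "ABCD":
    n = counts[option]
    # Flag any position used at least 1.5x more often than an even split
    flag = "  <-- over-used" if n > expected * 1.5 else ""
    print(f"{option}: {n}{flag}")
```

With this key, “C” is correct 9 times out of 20 and gets flagged, which is exactly the pattern test-wise guessers exploit.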
After my first multiple-choice test disaster, a colleague helped me with pointers like these. He also told me something else that has stuck with me. Think of a test question as a window that you look through to see what the student knows and understands. If the window is dirty, streaked, cracked, or broken, it’s harder to see whether the student has learned what you wanted them to learn. Good test questions are clean windows. They don’t obscure the view of what the student does and doesn’t know.
Despite good advice and a commitment to writing good multiple-choice questions, it is still possible to write the occasional bad one. A question is bad when a sizeable percentage of students miss it, and especially when a sizeable percentage of those with the highest scores on the exam (or in the class) miss it. At that point, it’s best to be honest. You don’t lose credibility with students if you toss out a question now and then. You lose a lot of credibility if you stand by questions that, although perfectly clear to you, confused and misled the masses.
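That screening rule can be made concrete. Here’s a rough sketch, with entirely made-up response data, that flags a question as suspect when both the class as a whole and the top scorers miss it at a high rate; the 50% thresholds and quartile cutoff are illustrative choices, not standards from the source:

```python
# Hypothetical results: each student's exam score and whether each of
# three questions was answered correctly.
students = [
    {"score": 95, "q": [True,  True,  False]},
    {"score": 90, "q": [True,  True,  False]},
    {"score": 80, "q": [True,  False, False]},
    {"score": 70, "q": [True,  False, False]},
    {"score": 60, "q": [False, False, True]},
    {"score": 50, "q": [False, False, True]},
]

students.sort(key=lambda s: s["score"], reverse=True)
top = students[: max(1, len(students) // 4)]  # top quartile of scorers

suspect = []
for i in range(len(students[0]["q"])):
    miss_all = sum(not s["q"][i] for s in students) / len(students)
    miss_top = sum(not s["q"][i] for s in top) / len(top)
    # A question most of the class misses AND most top scorers miss
    # is likely flawed, not just hard.
    if miss_all > 0.5 and miss_top > 0.5:
        suspect.append(i + 1)
        print(f"Question {i + 1}: missed by {miss_all:.0%} overall, "
              f"{miss_top:.0%} of top scorers")
```

In this toy data, question 2 is missed by most students but not by the top scorers, so it reads as hard-but-fair; question 3 is missed even by the strongest students, so it gets flagged for review.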
References: I used a Kansas State IDEA paper and the Jacobs and Chase book, Developing and Using Tests Effectively, to compile this list. Complete references to both appear in last week’s post.