Test-Item Order: Does It Matter?

Instructors frequently discourage cheating on multiple-choice exams by creating different versions of the exam. Test questions may be reordered randomly, by degree of difficulty, or in the order the material was presented in class, or the answer options may be ordered differently. The question is whether these different versions influence how well students perform on the exam. There is a fairly widespread belief among faculty that putting the easier questions first helps students: it reduces exam anxiety, which can improve performance.

Where does the research come down on this issue? A 1992 meta-analysis of test-item order (cited in the article referenced below) concluded that “students do perform better on exams beginning with easier items than those beginning with harder items or a random order” (p. 37). But not all the research supports this conclusion. A more recent study involving 17 sections of a principles of marketing course, enrolling a total of more than 400 students, found that test-item order was not a significant factor in student performance.

In this study, each exam was prepared in three versions. In one version, the harder items came first, followed by those of medium difficulty, ending with the easiest questions. The second version used the same items in reverse order. In the final version, questions were presented in a random order. Students did score lowest on the random version, but the differences were not statistically significant.

The author discusses the implications of these results: “Thus, instructors who teach an introductory marketing course with students enrolled from a variety of majors can prepare randomized versions of multiple-choice exams knowing that while cheating will be deterred, all students are treated fairly in the process” (p. 40).

Because research results are mixed, it is not a bad idea for instructors to systematically order test items and analyze the results to see how test-item order might be affecting their students. Discussion of test-item order and its influence on performance might be a topic worth exploring during exam debrief sessions. Have students considered how the order of the items might be affecting their thinking about the exam and their performance on it? How do they feel when they come to a question that they can’t answer? How might they constructively deal with the increased anxiety difficult questions provoke?

Reference: Vander Schee, B. A. (2013). Test item order, level of difficulty, and student performance in marketing education. Journal of Education for Business, 88(1), 36–41.

