Getting Answer-Oriented Students to Focus on the Questions

Are your students too answer oriented? Are they pretty much convinced that there’s a right answer to every question asked in class? When preparing for exams, do they focus on memorizing answers, often without thinking about the questions?

To cultivate interest in questions, consider having students write exam questions. Could this be a way to help teachers generate new test questions? Don’t count on it. Writing good test questions — ones that make students think, ones that really ascertain whether they understand the material — is hard work. Given that many students are not particularly strong writers to begin with, they won’t write good test questions automatically. In fact, you probably shouldn’t try the strategy unless you’re willing to devote some time to developing students’ test-writing skills.

But having students write test questions benefits them in several ways. It’s an indirect but effective way to get them involved in trying to answer their favorite “what-do-I-need-to-know-for-the-exam?” question. Initially, they may write questions much easier than those on the exams. Questions that test recall and focus on details are far easier to write than questions that require thinking. But if shown samples of questions that test knowledge at different levels, students can see the differences and begin to understand test questions better. If they write questions about content that will be included on the exam, you can use a set of their questions during the exam debrief to show how well they are figuring out on their own what will be on the test.

The strategy also deepens understanding and makes student thinking more precise, especially if they write questions that classmates must try to answer. In my classes, students wrote potential exam questions related to text material not discussed in class. They brought copies of those questions for the rest of the class, answered them first individually, and then in groups during the review session. Poorly worded, unclear, confusing questions generated all sorts of good discussion about questions and content.

The approach also focuses study efforts by connecting questions and answers, something that doesn’t always occur when students are memorizing answers. I’ve had students who could recite answers but were clueless as to the questions they answered.

If you work with students on writing good questions and are willing to do some editorial work on what they submit, some of their questions can show up on the exam. Then you will have a strategy that really motivates student interest in questions! Before considering which student questions to include on an exam, it’s wise to decide what content merits questions and then select those student questions that focus on that material. If none of their questions meets your standards, simply add your own questions on that content.

If the idea sounds interesting but you need some resources, the article by Green describes an assignment in which students created a test bank of questions (posted online without answers but with each question’s author identified). No more than 25% of the questions on the exams in this class were teacher-generated. Green’s article also includes a succinct set of guidelines for writing multiple-choice and short-answer questions that models the kind of resources students need in order to write good questions. A more complete set of test-question guidelines, along with a discussion of the rationale behind them, appears in Jacobs and Chase’s book, which has chapters on multiple-choice, true-false, matching, and completion items. Guidelines for writing good test questions are pretty much timeless, so don’t be put off by the book’s publication date. And if empirical evidence would persuade you that writing questions works, a recent study of the strategy is also listed.

References:

Green, D. H. “Student-Generated Exams: Testing and Learning.” Journal of Marketing Education, 1997, 19 (2), 43–53.

Jacobs, L. C., and Chase, C. I. Developing and Using Tests Effectively: A Guide for Faculty. San Francisco: Jossey-Bass, 1992.

Papinczak, T., Peterson, R., Brabri, A., Ward, K., Kippers, V., and Wilkinson, D. “Using Student-Generated Questions for Student-Centered Assessment.” Assessment & Evaluation in Higher Education, 2012, 37 (4), 439–452.
