Writing Questions about the Reading: A Formative Assessment Technique

Instructors often find out what students don’t understand when they grade exams, but by then it’s too late: that content has been covered, and there’s new material to work on. Instructors can ask questions in class, and they do hear student responses, but only from a few students, and those most inclined to answer often understand better than those who stay silent. Clicker questions do get responses from everyone, but they are teacher-generated questions, so they focus student thinking on topics the teacher has selected.

The article cited below describes an assignment in an upper-division, large-lecture biochemistry course that has students generate questions about the readings. The assignment is worth about 5 percent of the course grade. The goal was to develop a formative assessment strategy that “would successfully (1) elicit responses from the majority of students, not just the most vocal, and (2) reveal the full range of student ideas, thereby providing a more robust picture of the class as a whole.” (p. 31) When students generate questions (in this case about the reading), they “reveal both how they think about a topic, as well as the ways in which they make connections between topics as they extend upon and construct new knowledge.” (p. 31)

Here’s how the assignment worked. Students were to submit 11 reading questions, each worth up to three points (with one low-scoring question dropped). They were instructed that their questions “should not focus solely on factual material; rather, a reading question should also describe what conceptual problems the individual has with the material and how the individual arrived at that question.” (p. 31) Questions were due before that content was covered in lecture. The article includes a copy of the assignment description given to students. The authors report that students often wrote paragraphs that included more than one question.

The article describes the three methods the authors used to analyze the questions and determine how effective the questions were at revealing student thinking. First, they looked at the content of the questions: did they reflect conceptual understanding of the material? Second, they looked at the caliber of the questions themselves, wanting this to be an assignment that helped students learn how to ask “good” questions. For this they used a previously published taxonomy of five question types:

1) illogical questions, often based on misconceptions;
2) questions of definition or fact;
3) questions that asked for more information than was provided in the reading;
4) questions that resulted from extended thought and synthesis of prior knowledge; and
5) potential research questions that contained at least the kernel of a hypothesis.

Finally, they used a conceptual change rubric that assessed students’ levels of metacognition, or the extent to which students were diagnosing their own learning.

They write of their results: “Analysis of SGRQs (Student-Generated Reading Questions) demonstrates their utility as a rich and versatile data source which instructors can use to diagnose, and potentially, address student thinking.” (p. 34) As for the content analysis, “the detailed nature of SGRQs reveals to instructors not just what students tend to ask questions about but also how students think about and make sense of the materials presented in the chapter prior to the lecture.” (p. 35) That kind of information can help guide how instructors prepare and present material in class.

The authors were a bit surprised that this content analysis revealed students still had questions about material they were supposed to have learned in previous courses. “As instructors, we make a myriad of assumptions about the knowledge students bring to our courses.” (p. 36) That, of course, raises questions about learning in those earlier courses. But even if students’ content knowledge isn’t at the level instructors expect or deem necessary, it is still far better to have accurate information about what students know than to design courses and plan learning experiences based on what the instructor assumes they know.

The analysis of question types revealed that students do not typically ask questions that synthesize or extend knowledge. Fewer than 10 percent of the questions came close to being bona fide research questions. The authors point out that this assignment could easily be used to help students develop their inquiry skills. The analysis using the conceptual change rubric revealed similar results. All the students were able to consolidate their understanding, which is the first step necessary for metacognition, but only half asked questions “revealing confidence in their knowledge and openness to new ideas.” (p. 35) And only a small percentage asked questions coded as elaboration, the kind of questions indicating that students are “not only aware of their understanding but are actively questioning their ideas and reconciling them with alternative explanations.” (p. 35)

The article includes a useful section on implications, which offers a variety of ways teachers can use this assignment, not the least being this: “Incorporating a reading question assignment is a simple, easy-to-implement task that reinforces the importance of reading course materials.” (p. 35) It’s another one of those assignments with the potential to get students doing the reading before they come to class.

Reference:

Offerdahl, E. G., & Montplaisir, L. (2014). Student-generated reading questions: Diagnosing student thinking with diverse formative assessments. Biochemistry and Molecular Biology Education, 42(1), 29–38.
