Two Options That Improve Textbook Reading


Many students don’t do the reading before class. Most surveys report that fewer than 50 percent of students have read the assigned material before it’s covered in class or online. Most faculty don’t need to be persuaded of this fact. They regularly deal with students who aren’t doing the reading. There are multiple reasons why students skip it. They aren’t used to reading the kind of content assigned in college courses. Textbooks are dauntingly long. The vocabulary is new and difficult. Many students don’t see any value in reading—they don’t read, so they haven’t experienced how it makes understanding in class easier. And then, a lot of students just don’t like to read.

But students’ grades and their learning suffer if they aren’t doing the reading, or if they wait and try to do it all just before the exam. So, how do teachers get students to do the reading? Quizzes are one option, and there’s evidence that they work—not for all students, but for most. But quizzes aren’t the only option. Here are two effective alternatives.

Optional Reading Guides

Optional? If it’s not required, students won’t do it. That’s not what a research team discovered about the reading guides they used in a large (400+ students) introductory biology course. Reading guides were posted one week before the class session, one for every class period. The guides asked students to define terms, explain concepts, make tables and drawings, and interact with various visual materials in the text. The guides were not collected, and students received no points for completing them.

How did the researchers know whether students completed the guides? Three ways: students completed an online survey at the end of the course; during each class session, a clicker question asked whether students had completed all of the guide, part of it, less than half of it, or none of it; and, in TA-led discussion sections, the TAs randomly examined some reading guides without notifying students beforehand. In the survey, about 86 percent of the 677 students surveyed reported they completed the guides before class at least some of the time. Only 5 percent said they never completed the guides. Clicker question responses indicated that about 75 percent had completed at least some of the guide, with nearly 50 percent completing the entire guide before each class. Almost 70 percent of the guides the TAs examined matched the students’ self-reports.

Completion of the guides made a difference in exam scores. Using multiple linear regression analyses and controlling for potentially confounding factors, “full completion of the reading guides before class was significantly positively correlated with increased exam scores for all exams in the course” (p. 7). More specifically, completing six reading guides before an exam was associated with an increase in the exam score of between 6.5 and 9.6 points out of 100.

What was the secret to getting students to complete this optional assignment? The study design didn’t answer that question, but the research team offered some ideas in a section that addresses implementing reading guides in other courses. They did not post completed reading guides, thinking that if they did, students would download those and use them for study. That’s probably not without benefit, but related research in cognitive psychology lends support to the value of students coming up with their own answers, which forces them to engage directly with the material: in this case, the textbook. If students had questions based on the material used in the guides, they were welcome to ask them. Furthermore, the guides were used in class. Faculty referred to them and asked students to compare their work on particular guide questions with one another.

Obviously, it takes time and effort to develop the guides, and part of that work needs to be repeated if a new edition of the book comes out or if the assigned readings for the course are changed. However, this faculty group reported a benefit from preparing the guides: “We found that we were much more in tune with the content of the textbook and were very clear on what we wanted students to read in advance of each class session, which helped align learning objectives between preclass assignments and in-class activities” (p. 8).

What’s most encouraging about this work is the students’ response. A significant number used the reading guides, and more than 70 percent of those who did reported spending more than 60 minutes on them, all without requirement or reward.

Active Reading Questions

Fleck, Richmond, Rauer, and Beckman defined active reading questions (ARQs) as the type of question “intended to help students focus their attention as they read and clarify their understanding as they progress through reading material” (p. 221). ARQs foster higher-level thinking skills by asking students “to remember, apply, and analyze information using prior knowledge” (p. 221), and they foster lower-level thinking skills by asking students to remember and understand the content. Questions that promote both higher- and lower-level thinking skills increase students’ understanding and retention of course content. That assertion is documented with research evidence cited in the article.

The effectiveness of ARQs at promoting learning was tested in a pilot classroom setting and a laboratory experiment. Perhaps the most interesting part of this work was a comparison between having students answer the ARQs and giving students the PowerPoint slides used in the classroom presentation. In the laboratory setting, that comparison was expanded to include a control group that received no additional study aids. As the research team points out, PowerPoint slides are widely used in classrooms (and everywhere else), but “the evidence on their effectiveness as a tool for learning is mixed” (p. 222). However, students regularly ask professors to share their slides and have been known to become quite irate if the request is denied.

In the classroom study, with 37 students in one section and 38 in another, completion of the ARQs was optional, and students did not receive any credit for doing them. Even so, 27 of the 37 students in the first section said they used them at least 50 percent of the time, and 29 of the 38 students in the second section reported they used the PowerPoint slides at least 50 percent of the time. Although neither the questions nor the slides harmed students, neither improved their learning as measured by exam scores.

In the laboratory setting, however, where 153 students either read a passage and simultaneously responded to a set of ARQs, used PowerPoint slides (as they normally would), or were given nothing but were advised to use their own reading strategies, the question set did make a difference. Understanding of the text passage was tested by a set of free-recall questions (essentially, write whatever you remember from the text), multiple-choice questions, and a reading comprehension assessment. The results indicated “the students who used ARQs significantly outperformed both students who used PPT [PowerPoint slides] and those who used no tool at all in the free-recall questions, for both the higher-level and lower-level categorization” (p. 228). The researchers note that free-recall questions require students to search their memories for accurate information and generally encourage more higher-level thinking: “The free-recall questions required students to perform a more difficult, deeper level of processing” (p. 229).

ARQs also require work to create and work to grade. The researchers recommend requiring students to complete them, as the fact that they were optional in the classroom study is the likely reason they had no effect on exam performance. Question sets like this may not be a viable option in large courses—although they could be assigned less often, or only some of the questions could be graded. ARQs might be a valuable addition to introductory courses, where beginning students, who often find the assigned readings particularly challenging, still need to develop the kind of reading skills necessary for successful learning.

References:

Lieu, R., Wong, A., Asefirad, A., & Shaffer, J. F. (2017). Improving exam performance in introductory biology through the use of preclass reading guides. CBE—Life Sciences Education, 16(Fall), 1–10.

Fleck, B., Richmond, A., Rauer, H., & Beckman, L. (2017). Active reading questions as a strategy to support college students’ textbook reading. Scholarship of Teaching and Learning in Psychology, 3(3), 220–232.
