Collaboration without Learning

Active learning approaches frequently promote student conversations about the content. As students try to explain things to each other, argue about answers, and ask questions, learning happens. We can hear it and see it. It’s why we teach.

Teaching Professor Blog

An interesting study of student conversations over clicker questions was motivated by what researchers were hearing faculty say as they started using clicker questions (James & Willoughby, 2011). The faculty “invariably imagined idealized productive conversations wherein students would systematically discuss various question alternatives in light of their own prior knowledge and experience” (p. 123). As the researchers worked with faculty on implementing clicker questions, they started recording some of the student conversations. What they heard students say justified a more thorough analysis. They ended up recording 361 student conversations with the overarching goal of offering faculty insights into what students actually talked about when they discussed the clicker question and answer options.

Some students had the kind of productive conversations the teachers imagined: 38 percent of them discussed at least one of the multiple-choice alternatives, and the answer they finally selected represented the ideas they had discussed. But 62 percent did not have these kinds of exchanges. The researchers provide a typology of these derailed conversations, illustrating them with sample exchanges over specific clicker questions. Understanding some of their analysis requires knowledge of the astronomy content used in the questions, but some of what they heard has broad implications.

“We found that more than one-third of the conversations did not include any meaningful exchange of student ideas,” the authors wrote (p. 131). In some of these conversations students simply asserted that a particular answer was correct. They didn’t cite any evidence or offer any viable reasons but instead said things like an answer “sounded good.” In other conversations, students reached a consensus with no real discussion. Someone in the group proposed an answer, offered a justification—sometimes not a very good one—and everyone else agreed. From the recordings, researchers could not tell whether students went along because they didn’t know the answer, because they didn’t care, or because they didn’t feel comfortable offering another possibility. And in some conversations, students didn’t try. They announced that they didn’t know and had no idea how to figure it out and then took a wild guess.

Another category of off-target answers involved student ideas that weren’t among the answer options. These were mistakes or misunderstandings the instructors who wrote the questions hadn’t anticipated students would make. Clicker questions that report only aggregated answer data do not reveal these alternative student ideas. The same could be said of any student discussion the teacher doesn’t hear.

Are there lessons to take from this analysis? I think so. For starters, if we want students to have the kinds of conversations that promote learning, we can’t assume those conversations are the automatic outcome of collaboration. Most of us who’ve had students discuss problems in groups know that, even though we’d like to think otherwise. A lot of our students still don’t know how to carry on an intellectual conversation. They don’t understand the value of sharing ideas, considering options, and evaluating answers—those back-and-forth exchanges that increase understanding and lead to the right answer. Teachers can start cultivating that awareness simply, as the researchers did in this study, by developing a set of guidelines—in this case, guidelines that outline ways to discuss problems and possible answers.

This study’s findings also revealed that how the clicker questions were graded influenced student discussions. If more credit was awarded for correct answers than for incorrect ones, students were more likely to be passive and to select an answer proposed by someone else, even if they did not agree with it. When clicker questions earned credit regardless of their correctness, there was less passivity and more discussion of answer alternatives. These results offer yet another endorsement of low-stakes grading.

As for those conversations in which students offer alternative ideas—some of which may be brilliant (though most are not)—teachers need to hear those ideas. Teaching improves when a teacher understands student perspectives. And students are more willing to share their ideas when teachers respond respectfully, even while constructively confronting an interesting but totally incorrect answer.

Active learning powerfully promotes learning, but it doesn’t work magically.

Reference

James, M. C., & Willoughby, S. (2011). Listening to student conversations during clicker questions: What you have not heard might surprise you! American Journal of Physics, 79(1), 123–132. https://doi.org/10.1119/1.3488097