Clickers or Hand Raising?

Clickers have made their way into many classrooms, and unlike many other instructional innovations, they have already generated a substantial body of research findings, almost all of them indicating that these response systems benefit students. The study highlighted here illustrates just how effective they are at engaging students and promoting comprehension of course content.

The researcher compared the levels of engagement and content comprehension generated by clickers with those generated by the traditional response method, hand raising. Study participants were 380 undergraduates in kinesiology programs at three different institutions, enrolled in 11 sections of several different courses in the major. In class, they answered designated questions either by responding with clickers or by raising their hands to signal their willingness to respond if the teacher called on them.

The results are what most of us would predict: clickers engaged more students, way more students. When students responded using clickers, 97.9 percent answered all four of the designated questions. When they responded by raising their hands, only 9.1 percent answered all four questions, and 26.6 percent did not raise their hands for any of them.

The students also answered a series of survey questions asking whether clickers or hand raising did more to increase their confidence in answering questions, their understanding of the lectures, their concentration during lectures, their cognitive engagement, and their learning. Clickers were selected between 63.2 and 81.5 percent of the time for these items; hand raising was selected between 5.7 and 13.1 percent of the time. In responses to three open-ended questions, students made clear that they experienced fear, anxiety, and embarrassment when they had to answer questions aloud during lectures. The anonymity of clicker responses reduced those emotions.

Following the use of each response method, students answered two multiple-choice questions requiring higher-order thinking. After the clicker questions, 50.2 percent of the cohort answered none or one of those questions correctly and 49.8 percent answered both correctly. After the hand-raising questions, 58.5 percent answered none or one correctly and 41.5 percent answered both correctly. The difference is statistically significant.

Even though these results clearly favor clickers over hand raising, instructors who don't use clickers should not stop asking students questions. In the survey, students were asked, “In general, how likely are you to actively think about questions posed by the instructor when no opportunity to respond is given?” More than half of the students (55.2 percent) said they were likely to actively think about such questions even without an opportunity to respond, and another 14.3 percent said they were very likely to do so (p. 313). The researcher concludes, “The lack of overt public participation in class does not necessarily equate to cognitive disengagement” (p. 317).

As the researcher notes, it is not the clickers per se that generate these results. Their effectiveness, like that of all other technology enhancements, results from how they are used. “The benefit of clickers … is not dependent on the technology itself but rather on how well it is utilized to foster thought and reflection in learners” (p. 310).

Reference: Barr, M. L. (2014). Encouraging college student active engagement in learning: The influence of response methods. Innovative Higher Education, 39, 307-319.

 
