Student Resistance: Fact or Fiction

When faculty consider adopting a new instructional approach, there’s always a question about how it will be received by students. Will they engage with it and learn from it, or will they resist by complaining, participating reluctantly, and giving the course and instructor low evaluations? The fear of student resistance can prevent faculty from trying out new approaches, including any number of active learning approaches with well-documented learning benefits.

What isn’t clear about student resistance is whether it’s a reaction to new approaches in general or to certain kinds of instructional strategies. Up to this point, there has been scant empirical exploration of the phenomenon, which makes a recent study done in engineering particularly worth highlighting.

A large research team of mostly STEM faculty developed and validated a Student Response to Instructional Practices (StRIP) survey to “measure students’ expectations of active learning and other types of instruction. The StRIP survey also measured students’ experiences of types of instruction, instructor strategies for using in-class activities, and student response to instruction” (p. 4). They asked for student responses to four types of instruction: passive lecture where the instructor talked and students listened; active learning lecture where the instructor talked, but the instructor and students asked and answered questions; group-based activities where students worked on content with other students; and self-directed activities where students assumed responsibility for learning material on their own.

Students were asked to respond to these types of instruction in terms of value (how worthwhile they perceived the instruction to be), positivity (their attitude toward the instructor and course), and their evaluation of the course and instructor. They were also asked about their participation with a set of prompts that included positive and negative statements (“I tried my hardest to do a good job”; “I rushed through the activity, giving minimal effort”). A cohort of 179 students in four different courses at three different institutions took the survey at the beginning of the course, two weeks in, and a third time at the end of the course.

Student responses were analyzed with various statistical methods, which generated a number of different (and interesting) results; only the major findings are highlighted here. “Perhaps most importantly, the data show no significant negative correlation between any type of instruction and any student response to instruction” (p. 14). And that included how students responded to group work! The findings do not rule out the possibility of a student resisting a particular instructional approach, but they do indicate that instructors should not expect resistance as an automatic response to instructional approaches other than those students expect. Moreover, “there was no evidence found to support the common concern that instructor or course evaluations are negatively affected by adopting active learning strategies” (p. 14). Rather, these students “more often than not” saw active learning approaches as having value and participated in them fully.

Faculty with concerns about student resistance should find those results encouraging, but even more helpful was the finding that the strongest predictor of how students responded was not the type of instruction but the strategies the instructor used to implement it. The StRIP survey asked about implementation in terms of whether the instructor clearly explained what students were supposed to do, including the purpose of the activity, how it related to learning, and its degree of difficulty (not too easy or too difficult). Another set of prompts pertained to how the instructor facilitated the instruction: Was there an opportunity for students to provide feedback? Was the instructor there to help? Did the instructor’s demeanor encourage engagement? Was an appropriate amount of time devoted to the activity? Of this finding, the research team writes, “Clearly, instructors have a great deal of influence on how students respond to active learning” (p. 15).

This research was done only with students in one discipline, so more work is needed to confirm that students elsewhere do not resist approaches other than lecture in large numbers. But the research is valuable for identifying concrete actions instructors can take when implementing other approaches, actions that, in this study, were strong predictors of how students responded to the type of instruction.

Reference: 

Nguyen, K., Husman, J., Borrego, M., Shekhar, P., Prince, M., Demonbrun, M., Finelli, C., Henderson, C., & Waters, C. (2017). Students’ expectations, types of instruction and instructor strategies predicting student response to active learning. International Journal of Engineering Education, 33(1), 2–18.
