Many faculty don’t expect to learn a lot from those end-of-course student comments. Students don’t write much, don’t always think carefully about what they write, and have been known to make ugly comments. Low expectations would seem to be justified, and that’s unfortunate. Because they’ve experienced our teaching firsthand, students know best whether we’ve been clear, provided enough examples, made reasonable demands and offered adequate help. Are there some ways we might improve the caliber of their written comments? Would any of these options help?
The kinds of questions we ask matter. In using the royal “we” I recognize that the open-ended questions on end-of-course ratings forms are usually determined institutionally. But we’re part of that royal “we,” and we ought to object to some of those questions. “What did you like most/least about the course?” Some students (dare I say many?) “like” easy courses. Others “like” courses in which they don’t have to write or solve problems. The question promotes entitlement, conveying the impression that good courses provide students with what they like.
We can use better prompts to solicit written comments. An open-ended prompt such as “Describe the impact of policies, practices, and behavior in the course on your efforts to learn” generates specific examples and links them to learning. That’s the kind of feedback faculty need and can use to make targeted changes.
When we ask for feedback matters. I’m not sure there could be a worse time to ask for written commentary than at the end of the course. Students are stressed, tired, and ready for break. Most teachers feel the same way. Instead, we can ask for comments during the course, and we can ask about specific aspects of it: the assigned readings, the homework problems, the quiz system, the group activities used online or in class. When we ask regularly, students come to expect requests for written comments and get practice in providing them.
The amount of time given to writing comments matters. Most of us know that students don’t devote a lot of time to providing course feedback. In one recent study, 37 percent of the more than 600 students surveyed said they’d devote up to 10 minutes to a course evaluation, while 27.5 percent said that five minutes was their maximum (Hoel & Dahl, 2019). If students are answering closed-ended questions and also being asked for written comments, is five or 10 minutes enough time to provide quality feedback? How much time does it take us to provide good written feedback on student papers?
How we ask for written feedback matters. The
internet has made asking easy, and most of us are asked way too often—Delta
wants to know about the service provided on every flight I take. The request
for students’ written feedback needs to be personal. It needs to say why the
feedback is important and identify what we hope to learn from it.
How we respond to the comments matters. If
there’s never any response to the feedback, there’s not much motivation to
provide it. I’ve given up asking Delta to please have coffee on board the 5:55
a.m. commuter flights out of State College. Students need some evidence that
their comments have been read and considered. The study mentioned above also explored students’ motivations; it found that the main reason students skip course evaluations is that they don’t see them as useful or valuable. The power of this disincentive was verified by the more than 50 percent who reported that they never completed course evaluations and didn’t even open the email containing the link.
Talking with students about their comments also provides an
opportunity to share examples of comments that were unclear or that other
comments contradicted and of those that offered new insights. It’s also an
opportunity to help students learn how to offer constructive feedback.
What we do with the feedback matters. We can
act on it, consider it, ask for further comments, or decide not to act on it. All
those are legitimate responses, made more so if we share the rationale for what
we’ve decided with our students. We might also consider preserving some of the
feedback—not only those “walks on water” affirmations of our teaching (although
it’s nice to have a few of those around on some days) but also insightful comments
that reveal something we didn’t know or clearly understand about our teaching
and comments that provide good examples.
We need students’ insights about their learning experiences
in our courses. The way we typically solicit their comments doesn’t provide the
context or structure needed to generate quality feedback, but we can change
that. The place to start is in our courses and with our students.
Reference
Hoel, A., & Dahl, T. I. (2019). Why bother? Student motivation to participate in student evaluations of teaching. Assessment & Evaluation in Higher Education, 44(3), 361–378. https://doi.org/10.1080/02602938.2018.1511969