RateMyProfessors.com Comments: An Analysis

Whatever philosophical and empirical issues college teachers may have with the Rate My Professors (RMP) website, there is no denying that the site is now a huge repository of information on college teachers. The website reports that it contains 15 million ratings for 1.4 million professors at 7,000 schools.

Some faculty concerns about the site are legitimate and supported by research, but that's a different article. Other research reports positive correlations between RMP ratings and official student evaluation ratings. What has received much less attention so far are the comments collected on the site. One study reports that students rely on these written comments more than on the ratings. The comments do indeed reveal quite a bit about the evaluation criteria students use when they assess professors.

This particular analysis of RMP comments looked at 2,371 comments made about 442 marketing professors at 60 randomly selected colleges and universities in the U.S. The ratings of this faculty cohort were compared with the ratings for cross-disciplinary faculty groups, and no significant differences were found.

The researchers started with, and then refined, a coding schema developed for use in other research. Their final schema included 30 unique thematic codes grouped into five areas: 1) global quality (“great professor,” “terrible prof”); 2) course elements; 3) student development (the professor encouraged participation, developed the student's knowledge); 4) instructor characteristics; and 5) service provider personal characteristics (the professor was nice, had a good sense of humor). Results for all 30 statement codes appear in the article, although two codes did not meet sample size assumptions and were excluded from the statistical analysis, leaving 28.
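
For readers who like to see method made concrete, here is a minimal sketch of how a five-area coding schema like this one might be represented in code. The area names come from the article; the individual codes shown are only the examples it mentions, not the study's full set of 30, and the lookup function is purely illustrative.

```python
# A minimal sketch of the five-area coding schema described above.
# Area names come from the article; the codes listed are only the
# examples it mentions, not the study's full set of 30.
CODING_SCHEMA = {
    "global quality": ["great professor", "terrible prof"],
    "course elements": [],  # specific codes not listed in the article
    "student development": ["encouraged participation", "developed knowledge"],
    "instructor characteristics": ["knowledgeable", "organized", "helpful"],
    "service provider personal characteristics": ["nice", "good sense of humor"],
}

def area_for(code):
    """Return the thematic area a code belongs to, or None if unknown."""
    for area, codes in CODING_SCHEMA.items():
        if code in codes:
            return area
    return None

print(area_for("helpful"))  # -> "instructor characteristics"
```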

The researchers had two hypotheses. First, they predicted that student comments would be clearly positive or negative rather than mixed or neutral, and that hypothesis was confirmed: 55 percent of the comments were positive, 38 percent were negative, and the remainder were mixed or neutral. Second, they predicted that the comments would be positively skewed, and that hypothesis was also confirmed; 19 of the 28 code categories were positively skewed. This finding is worth noting, as many faculty continue to assume that RMP is the place students go when they want to gripe about a professor. On the contrary, as these researchers note, this collection of comments was “dominated by compliments rather than complaints.”
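
For those curious how a valence tally like the one behind those percentages might be computed, here is a minimal sketch in Python. The labeled comments are hypothetical stand-ins; the study itself coded 2,371 real comments.

```python
from collections import Counter

# Hypothetical valence labels standing in for coded comments; the
# study's actual data set of 2,371 comments is not reproduced here.
coded_comments = ["positive", "negative", "positive", "mixed",
                  "positive", "neutral", "negative", "positive"]

counts = Counter(coded_comments)
total = sum(counts.values())

# Report each valence category as a share of all comments, the same
# kind of breakdown as the article's 55/38/7 percent split.
for valence, n in counts.most_common():
    print(f"{valence}: {n} ({n / total:.0%})")
```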

Some other details are also revealing. One in every two comments included a statement about the global quality of the professor, but only one in 10 included a global quality statement about the course. Given the focus of the site, the preponderance of instructor-related comments is not surprising. However, fewer than three in 10 comments included an explicit recommendation about whether to take courses with the instructor.

Raters’ comments on the RMP site focus on instructors’ personal characteristics and teaching attributes. The attributes mentioned include whether the instructor is knowledgeable, organized, and helpful. Personal characteristics (those in the service provider code category) include any number of qualities not usually listed in professor job descriptions: the professor was nice, cool, likeable, entertaining, and so on. In fact, “there were significantly more assessments of the instructor’s niceness (n=393) compared to assessments of instructors’ knowledge of the material (n=275).”
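
The article does not report which statistical test produced that “significantly more” result, but a simple chi-square goodness-of-fit test on the two counts, sketched below under that assumption, shows the flavor of the comparison.

```python
from scipy.stats import chisquare

# Counts reported in the article: mentions of the instructor's niceness
# versus mentions of the instructor's knowledge of the material.
niceness, knowledge = 393, 275

# Test whether the two counts plausibly come from equal underlying
# frequencies (chisquare defaults to expecting a 50/50 split).
stat, p = chisquare([niceness, knowledge])
print(f"chi-square = {stat:.2f}, p = {p:.4f}")  # a small p means the counts differ
```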

Some faculty might argue that this focus on personal characteristics indicates how unqualified students are to judge instructors. “Niceness” is not generally considered a relevant assessment criterion for academic professionals. However, this analysis of student comments makes clear that the human connection between professors and students matters greatly to students. Students (especially young adults) identify first and foremost with their teachers as persons. It goes back to the teacher as role model, not just as a model of intellectual competence but as a model person. That may make us uncomfortable, and we may wish that students didn’t look at us this way, but they do, as this study and much other research document.

The influence of Rate My Professors should not be ignored, and this research group explains why: “As students increasingly use communication technologies for trusted sources of information, insights, and opinions, student perceptions of professors and the related class choice decisions will be increasingly influenced by Web communications and social networking mediums such as RMP” (p. 159).

If you’re at all interested in the research on RMP, the article referenced below contains an excellent bibliography that cites most of this work.

Reference:

Hartman, K. B., & Hunt, J. B. (2013). What ratemyprofessor.com reveals about how and why students evaluate their professors: A glimpse into the student mind-set. Marketing Education Review, 23(2), 151–161.
