More Research on RateMyProfessor.com

The RateMyProfessor (RMP) site has now been around for more than a decade. As of 2013, it contained 14 million entries for more than 1.3 million professors at 7,000 schools. “Its express purpose is to serve as a resource for other users in their decision-making, in this case students weighing their course options.” (p. 182) Despite its popularity among students, faculty continue to view the site with skepticism. Among the most common criticisms is the continuing concern that the students who use the site, particularly those who leave comments, hold extreme views—they loved or hated the instructor, the course, or both. Faculty criticism of the site’s validity is also widespread, which may reflect the larger discontent faculty feel about student evaluation in general. Fortunately, research projects involving the site continue to appear in the literature. They add to our knowledge, allowing us to confront assumptions and anecdotes with data.

This particular study is framed within a specific discipline (chemistry). However, the authors note, “there is nothing exclusive that would prevent the use of this methodological approach to inform the decision regarding the use or not of RMP information by other departments or institutions.” (p. 184) The study’s first question was whether students who contribute to RMP differ from the general chemistry cohort used in the study. The other two questions asked what information available on the site students found valuable and what motivated them to review ratings and contribute their own.

Of the almost 400 students in the chemistry cohort, only 3 percent had never heard of RMP, and only 21 percent had never used it. Sixty-three percent reported they used the site but did not contribute ratings, and 13 percent contributed ratings, comments, or both. Of the six rating criteria students use on the RMP site, this cohort considered [instructor] helpfulness, overall rating, and clarity the most important sources of information. Easiness of the course was second from the bottom, though it still garnered a positive score (6.2 out of 10.0). Nonetheless, these results support the researchers’ conclusion that their data challenge the faculty assumption that students visit the site primarily to find easy courses.

Also of note were the rankings given to 24 possible reasons (mostly drawn from previous research) for contributing to the RMP site. The reasons were a mix of positive and negative statements, with a few considered neutral. The six top-ranked reasons, with scores between 7.6 and 8.0 out of 10.0, were all positive statements that “described overall satisfaction, especially with the instructor, but also with instruction and the course.” (p. 191) The first negative statement—“I thought the instructor was not at all helpful”—appeared in the second group, and the highest proportion of negative statements fell in the third of the four groups.

Using a unique empirical approach, these researchers identified two categories of respondents—a group that preferred learning over grades and a second group that was ambivalent. Even though the data did not allow researchers to “identify a group that was exclusively inclined toward grades over learning, the learning/grade ambivalent group was not completely neutral in its preference.” (p. 194) As they explain, the key question is whether grade-oriented students who contribute to RMP are different from those who are learning oriented. “The association test showed that there was no statistically significant difference in the membership of the RMP groups as a function of learning/grade orientation.” (p. 194)
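The paper describes this step only as an “association test,” so the authors’ exact procedure isn’t specified here. For readers unfamiliar with the idea, a chi-square test of independence is one standard way to check whether two categorical variables (here, RMP usage group and learning/grade orientation) are associated. The sketch below is purely illustrative, using hypothetical counts rather than the study’s data:

```python
# Minimal sketch (not the authors' actual analysis) of an association test:
# a chi-square test of independence between learning/grade orientation and
# RMP usage group, with hypothetical cell counts.
from scipy.stats import chi2_contingency

# Rows: learning-oriented vs. learning/grade-ambivalent (hypothetical groups)
# Columns: non-users, users who don't contribute, contributors (hypothetical)
observed = [
    [40, 120, 25],  # learning-oriented
    [45, 130, 28],  # learning/grade ambivalent
]

chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}, dof = {dof}")
# A p-value above the chosen threshold (e.g., 0.05) would indicate no
# statistically significant association, which is the kind of null result
# the study reports for RMP group membership and orientation.
```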

The researchers conclude, “In the present study, we have presented evidence contradicting common assumptions about students who use and contribute to RMP: (a) contributors are substantially different from the rest, (b) that RMP visitors are especially drawn to the site to gather information about course/instructor easiness, and (c) that ranting and raving are particularly important motives for students to contribute to RMP.” (p. 196) They continue, “Our evidence adds to a growing body of research that points at RMP as a source of information that should be considered seriously.” (p. 196)

In addition to these interesting findings, this study describes and references virtually all the research that’s been done on the RMP site. For that reason, it’s an important resource to have on hand whenever ideas and opinions about the site are being exchanged. 

Reference: Villalta-Cerdas, A., McKeny, P., Gatlin, T., & Sandi-Urena, S. (2015). Evaluation of instruction: Students’ patterns of use and contribution to RateMyProfessor.com. Assessment & Evaluation in Higher Education, 40(2), 181–198.
