The Student News Site of Weber State University

The Signpost

Viewpoint: Professor-rating websites underregulated

Imagine, for a moment, that there was a popular website, filled with names upon names of university students, much like Facebook.

Now, picture professors across the country signing in and writing, publicly and anonymously, the things they hated and loved most about each of these students.

“Lauren L.? She was all right. Turned her work in, nothing too spectacular. I gave her the B she deserved.”

“Ew, gross. Steve R. showed up 10 minutes late every day with a new excuse. He hated me, he hated ENGL 2010, and to be honest, the world of literature is better off without him.”

“Ashley S. is one of the easiest students to please. Total suck-up. And she’s a hot tamale! At least an 8!”

Most students at Weber State University should be familiar by now with the website Ratemyprofessors.com and other faculty-rating websites like it. On these sites, students can assign their past professors grades on qualities like helpfulness, clarity and easiness, which usually combine into an overall score that ranks them in relation to other faculty members. On RMP, students can also add a special “chili pepper” icon, which indicates the “hotness” of the individual professor.

The idea behind a website like this is a good one: Students can build an off-campus database of teachers and inform future students of their best options. Many students on campuses just like WSU feel they don’t have enough of a say in how their instructors are evaluated, and that the only way to create change is to take the evaluation process into their own hands.

Despite the good intentions of this system, however, there are glaring errors in its execution.

Ignoring the obviously objectifying “hotness” scale, which is ridiculous and born out of the most medieval motivations, the next-most obvious error in these professor-grading sites is that there is no perceivable regulation of the commenting and grading functions. People leaving comments and assigning numerical grades to these professors could be scammers, other professors, students who’ve never taken a class from the teacher, or simply former students with axes to grind and plenty of time to angrily fill up a comment box.

There is no way to stop enraged students with bad test scores from posting hundreds of times, nor is there a way to stop students who loved the professor more than the average student did from writing the phrase “he/she changed my life” as many times as they can. And for a website that holds the character of so many educational professionals in its hands, it performs less like an impartial scale and more like a bulletin board covered in graffiti and personal notes.

Another problem is one that any customer service professional has seen: In the same way that no one fills out a comment card for an average experience at a restaurant, the outlying comments, whether positive or negative, are vastly overrepresented.

Perhaps the most damning attribute of these sites is their emphasis on finding the easiest teachers, instead of the best ones. What if our country’s medical system were filled with doctors who had skated by after finding all the easiest professors at their medical schools?

And how does one truly quantify the “overall quality” of a teacher? What is being measured? Different students value different qualities in their professors, and it would be nearly impossible to assign a true numerical value to the efforts of each educator.

Not every professor is a master mentor. It’s an unfortunate truth that, for every dynamic professor, there exists one mere instructor. The other unfortunate truth is that students rely on the grades their professors give them. It’s a fact of life.

And while there are students out there who feel, perhaps correctly, that this system of evaluation is a little one-sided, the solution to effectively evaluating the quality of university professionals does not lie in websites like RMP. These sites are objectifying and unscientific, and currently less effective than any university-sponsored evaluation system.
