Coding movie reviews: Why Rotten Tomatoes are less fresh


August 1, 2013

Greg Rice

In the dark overlap between quantitative research and movie nerds, Rotten Tomatoes scores are routinely used as an overall measurement of a film's quality. As a quantitative researcher who takes pride in using numbers to reflect reality, those Rotten Tomatoes scores drive me totally batty. Here's a breakdown of how Rotten Tomatoes scores movies, and why you, as a consumer, should demand better.

A quick primer on what Rotten Tomatoes does. They take a large number of reviews from film critics and decide how many of them are positive. Then they report the percentage of positive reviews as the film's Tomatometer score. The higher the score, the better the movie.

Except it's not so simple. By coding every review into one of two categories (positive or negative), they strip out the degree of support each critic gives a film. RT is basing a movie's quality on the fact that most critics agree it's above average, while ignoring how far above average it is.

Metacritic is a lesser-known site, but a much more useful one. They code each review on a 0 to 100 point scale: the more strongly a critic endorses a movie, the more points the review is assigned. A review can land anywhere on the scale, allowing far more freedom to represent the reviewer's nuance.
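The difference between the two coding schemes is easy to sketch in a few lines of Python. The review scores below are invented for illustration (they are not actual critic data), and the 60-point "fresh" cutoff and the unweighted average are simplifying assumptions — both sites use their own editorial judgment and, in Metacritic's case, undisclosed weights:

```python
# Hypothetical 0-100 scores for two films (illustrative only, not real data)
good_not_great = [65, 70, 68, 72, 66, 70, 55, 74, 69, 67]   # solid, rarely glowing
widely_acclaimed = [95, 90, 98, 92, 88, 96, 94, 55, 93, 91]  # mostly raves

def tomatometer(scores, threshold=60):
    """Rotten Tomatoes-style: percent of reviews coded positive ('fresh')."""
    fresh = sum(1 for s in scores if s >= threshold)
    return round(100 * fresh / len(scores))

def metascore(scores):
    """Metacritic-style: average of the 0-100 scores (unweighted here)."""
    return round(sum(scores) / len(scores))

print(tomatometer(good_not_great), metascore(good_not_great))      # 90 68
print(tomatometer(widely_acclaimed), metascore(widely_acclaimed))  # 90 89
```

Both films earn the same 90% Tomatometer, because nine of ten critics cleared the positivity bar. But the average score separates them by more than 20 points: the binary coding throws that distance away.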

It can make a big difference. Let's look at two recent movies that got near-universal praise. The recent horror film The Conjuring got solid reviews all around. Nearly everyone agreed it was above average, as seen by its 85% on Rotten Tomatoes. But its Metacritic score is just a 68, and when you dig into the score distribution you can see why. Everyone agrees that it's a good movie, but few claim it's a great one.

Compare that to the best-reviewed movie of the summer: Richard Linklater's Before Midnight. That movie was given a 98% score on Rotten Tomatoes and a 94 on Metacritic. The RT score implies that The Conjuring is almost as good as the best-reviewed movie of the summer; Metacritic is rightly saying it's nowhere close.

I do have to give Rotten Tomatoes credit.  They have built a great brand and have worked hard to become the Internet’s official movie rating for everyone.  Everyone, that is, except the quantiest of movie nerds like me.

Greg Rice

Senior Consultant, Los Angeles

Greg provides senior strategic oversight on quantitative and mixed method engagements with a unique blend of right brain/left brain thinking. He is well-known within the research industry for his...
