Will I "Really Like" this Movie?

Navigating Movie Website Ratings to Select More Enjoyable Movies


In the Objective Top Twenty, a Certified Fresh Is a Must…But Is It Enough?

When you review the Objective Top Twenty, you’ll notice that every movie has earned a Certified Fresh designation from Rotten Tomatoes. It is a dominant factor in my rating system. It may even be too dominant.

All of the analysis that I’ve done so far suggests that a Certified Fresh designation by Rotten Tomatoes is a strong indicator of a “really like” movie. The new Objective Database that I’m working with also shows that a Certified Fresh rating results in a high likelihood that IMDB voters will rate the movie a 7 or higher.

Rating             # of IMDB Votes   IMDB Votes 7+ %
Certified Fresh         19,654,608             88.2%
Fresh                    6,144,742             75.4%
Rotten                   9,735,096             48.5%

And, as you might expect, the likelihood of a 7-or-higher rating stair-steps down as you move from Certified Fresh into the Fresh and Rotten groups of movies.
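For the data-minded, here is a minimal sketch of how percentages like these could be computed with pandas. The file name and columns (rt_designation, votes_total, votes_7_plus) are hypothetical stand-ins for a per-movie extract of the database, not its actual layout:

```python
import pandas as pd

# Hypothetical extract: one row per movie, with its Rotten Tomatoes
# designation and its IMDB vote counts.
movies = pd.read_csv("objective_database.csv")

by_designation = movies.groupby("rt_designation").agg(
    total_votes=("votes_total", "sum"),
    votes_7_plus=("votes_7_plus", "sum"),
)
# Share of all votes, within each designation, that were 7 or higher.
by_designation["pct_7_plus"] = (
    by_designation["votes_7_plus"] / by_designation["total_votes"] * 100
).round(1)

print(by_designation[["total_votes", "pct_7_plus"]])
```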

This exposes a flaw in my previous thinking about Rotten Tomatoes. In the past I’ve indicated that I haven’t seen a statistical relationship between the % Fresh and the likelihood of a “really like” movie. And, actually, that’s a true statement. The flaw in my thinking was that, because I didn’t see it, I assumed it didn’t exist.

The Certified Fresh, Fresh, and Rotten designations are primarily defined by % Fresh (a quick code sketch follows the list):

  • Certified Fresh: for most movies, > 75% Fresh
  • Fresh: for most movies, 60% to 75% Fresh
  • Rotten: < 60% Fresh
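Expressed as a rule of thumb in code, a hedged sketch: the real designations also depend on review counts and other Rotten Tomatoes criteria, so % Fresh alone only approximates them, and the exact boundary handling here is a guess.

```python
def rt_designation(pct_fresh: float) -> str:
    """Approximate a Rotten Tomatoes designation from % Fresh alone.

    The real rules also require minimum review counts (and, for
    Certified Fresh, other criteria), so this is only the rule of
    thumb described above.
    """
    if pct_fresh > 75:
        return "Certified Fresh"
    if pct_fresh >= 60:
        return "Fresh"
    return "Rotten"

print(rt_designation(93))  # Certified Fresh
print(rt_designation(65))  # Fresh
print(rt_designation(40))  # Rotten
```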

If differentiation exists among these three groups, then it should exist between other % Fresh groups as well. For example, movies that are Certified Fresh at 95% Fresh should have a greater “really like” probability than movies that are Certified Fresh at 80% Fresh. I now believe that I haven’t seen the difference because there hasn’t been enough data to produce stable differences.

When I marry Rotten Tomatoes data with IMDB data, I also get much more of it. Below I’ve grouped the Certified Fresh movies into four bands based on % Fresh.

Certified Fresh, by % Fresh   # of IMDB Votes   IMDB Rating 7+ %
100%                                  966,496              90.7%
90-99%                             10,170,946              89.9%
80-89%                              5,391,437              87.3%
70-79%                              3,125,729              83.5%

These may well be the differences you’d expect to see once the pools of data get large enough.
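Here is one way those bands could be built, continuing the hypothetical per-movie columns from the earlier sketch:

```python
import pandas as pd

movies = pd.read_csv("objective_database.csv")  # hypothetical file
cf = movies[movies["rt_designation"] == "Certified Fresh"].copy()

# Band Certified Fresh movies by % Fresh: 70-79, 80-89, 90-99, exactly 100.
bands = pd.cut(
    cf["pct_fresh"],
    bins=[70, 80, 90, 100, 101],
    right=False,
    labels=["70-79%", "80-89%", "90-99%", "100%"],
)
grouped = cf.groupby(bands).agg(
    total_votes=("votes_total", "sum"),
    votes_7_plus=("votes_7_plus", "sum"),
)
grouped["pct_7_plus"] = (
    grouped["votes_7_plus"] / grouped["total_votes"] * 100
).round(1)
print(grouped[["total_votes", "pct_7_plus"]])
```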

So, why is this important? If we treat all Certified Fresh movies as equally strong “really like” prospects, we are in effect saying that we are as likely to “really like” The Shawshank Redemption (Certified Fresh 91%, IMDB Avg. Rating 9.3) as The Mask (Certified Fresh 77%, IMDB Avg. Rating 6.9). The “really like” model becomes a more dynamic movie pre-screening tool if it can make a Rotten Tomatoes distinction between those two movies.

I believe that the database has to get much larger before we can statistically differentiate between Certified Fresh 87% movies and Certified Fresh 85% movies. But, I think I can begin to integrate the Certified Fresh groupings I developed above to create some additional means of defining quality movies within the Certified Fresh grade.

You might just see this change in next Monday’s Objective Top Twenty.

***

In looking at this weekend’s new releases, there are no sure things, but three of the movies are worth keeping an eye on. The Foreigner, the Jackie Chan action thriller, is getting good early feedback from critics and IMDB voters. I expect it to do well at the box office. Marshall, the Thurgood Marshall biopic starring Chadwick Boseman, has received some early Oscar buzz. It appears to be headed towards a Certified Fresh rating from Rotten Tomatoes. The movie that may sneak up on audiences is Professor Marston & the Wonder Women. Professor Marston created the character of Wonder Woman in the 1940s, and this movie tells that story. Already 34 of 38 critics have given it a Fresh rating on Rotten Tomatoes. I expect it to receive its Certified Fresh designation by tomorrow morning.


Will “You” Really Like This Movie?

If you reviewed this week’s Objective Top Twenty, you might have noticed something other than five additional movies on the list. You might have noticed that, other than Hidden Figures holding onto the number one spot on the list, all of the rankings had changed.

A few months back I mentioned that I was developing a new objective database to project “really like” movies, one that is not influenced at all by my taste in movies. This week’s Objective Top Twenty reflects the early fruits of that labor.

The plan is to build a very robust database of all of the movies from the last twenty-five years that finished in the top 150 in box office sales for each year. I have 1992 through 1995 completed, which gives me enough movies to get started.

The key change in the “really like” formula is that my algorithm now measures the probability that users of IMDB will rate a particular movie a 7 out of 10 or higher, which is my definition of a “really like” movie. The key components of the formula are the IMDB average rating, the Rotten Tomatoes rating, the CinemaScore grade, and the number of Academy Award wins and nominations in both the major and the minor categories.

In future posts, I’ll flesh out my logic for all of these factors. But the key factor is the ability to measure, on IMDB, the percentage of voters who rated a particular movie a 7 or higher. When you aggregate all of the movies with a particular IMDB average rating, you get results that look like this sample:

Avg. Rating   % Rating 7+
8.5                 92.8%
8.0                 88.8%
7.5                 81.4%
7.0                 69.2%
6.5                 54.7%
6.0                 41.5%
5.5                 28.7%

Note that an average rating of 7.0 doesn’t mean that every movie with that rating is a “really like” movie. Only 69.2% of the votes cast for movies with a 7.0 rating were 7 or higher. Conversely, a movie with a 6.0 average rating isn’t always a “don’t really like” movie, since 41.5% of those voters handed out 7’s or higher. It does mean, though, that a movie with a 7.0 average rating is more likely to be a “really like” movie than one with a 6.0 rating.
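One way to put this sample to work is as a lookup table: linearly interpolate between the anchor points above to estimate, for any average rating, the share of voters likely to rate the movie 7 or higher. A minimal sketch; the function name is mine, and ratings outside the 5.5-8.5 range simply clamp to the nearest anchor:

```python
import numpy as np

# Anchor points from the sample above.
avg_ratings = np.array([5.5, 6.0, 6.5, 7.0, 7.5, 8.0, 8.5])
pct_7_plus = np.array([28.7, 41.5, 54.7, 69.2, 81.4, 88.8, 92.8])

def pct_voting_7_plus(avg_rating: float) -> float:
    """Interpolated % of IMDB votes expected to be 7 or higher."""
    return float(np.interp(avg_rating, avg_ratings, pct_7_plus))

print(pct_voting_7_plus(7.2))  # ~74.1
```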

These changes represent a start down a path towards a movie pre-screening tool that is more useful to the followers of this blog. It is a work in progress that will only get better as more years are added to the database. But, we have a better answer now to the question, “Will you ‘really like’ this movie?”

***

If you’re going to the movies this weekend, chances are that you’re going to see Blade Runner 2049. The early indications are that it is going to live up to the hype. You might also check out The Florida Project, an under-the-radar movie that is getting some apparently well-deserved buzz.

So Now Rotten Tomatoes Has No Impact On the Box Office? Not So Fast.

Conventional wisdom has been evolving that Rotten Tomatoes movie ratings are negatively impacting ticket sales at the movies. Over the last couple of weeks, a counterargument has been made, based on a study posted in a September 10th blog. The Wrap, Variety, and other websites covering the movie industry have run with the story that Rotten Tomatoes has little, if any, impact on movie ticket sales. I believe that is an oversimplification of both the study and the intersection of movie ratings and movie consumption.

The points made in the study that are getting the most attention are:

  1. There is very little statistical correlation between Rotten Tomatoes ratings and box office performance.
  2. The median Rotten Tomatoes rating for 2017 is 77.5% Fresh, whereas the median for each of the prior four years was either 72% or 73% Fresh.
  3. There is a correlation between Rotten Tomatoes ratings and Audience ratings.

So, the argument goes, you can’t blame Rotten Tomatoes for bad box office when it is statistically proven that it has no impact on box office and, by the way, critics have actually rated this year’s movies higher than last year’s, and audiences stay away from bad movies because they are more savvy today than they’ve been in the past.

I believe the third point should be the headline. When I’ve looked at this before, I’ve found a very strong correlation between the Certified Fresh, Fresh, and Rotten ratings and my “really like” ratings. On the other hand, I’ve found that the % Fresh rating has a weaker correlation to whether I’ll “really like” a movie. I wonder what the statistical correlation to box office performance is for just the three broad ratings?

As to the second point, the overlooked item in the study is that not only have critics in the aggregate liked 2017 movies better than prior years’ movies, the worldwide box office has responded with higher ticket sales in 2017 than in 2016. Is it possible that better movies in 2017 have translated into more people worldwide going to the movies?

The first point, and the one that became the headline in so many articles, doesn’t make a lot of sense to me. If there is a correlation between Rotten Tomatoes ratings and Audience ratings, doesn’t that suggest that Rotten Tomatoes has contributed to a more informed movie public? And, because they are more informed, audiences are staying away from bad movies. Therefore, Rotten Tomatoes has impacted the box office. The impact is indirect rather than direct, but calling that “no impact” is a little misleading. Isn’t it?

Near the end of his study presentation, Yves Bergquist, the author of the study, concludes that “Audiences are becoming extremely adept at predicting and judging the quality of a film”. Rotten Tomatoes is just one of the tools audiences use to pre-screen the movies they watch. IMDB ratings are taken into account, as are Cinemascore grades. For example, Box Office Mojo, the go-to site for movie box office information, specifically cited the “F” grade that Cinemascore gave Mother! last weekend as a factor in its “supremely disappointing $7.5 million from 2,368 locations” opening weekend box office. Cinemascore has handed out only nineteen F’s in almost forty years of movie surveys.

The movie industry may be looking for someone to blame for movie consumers behaving differently than they have in the past. But, the sooner the industry comes to grips with the new reality that movie audiences are more savvy today than they were in the past, the sooner they will improve their own fortunes. It is arrogant to blame Rotten Tomatoes for contributing to a more informed movie audience.

***

It has been seven weeks since a new movie, Detroit, joined The Objective Top Fifteen after its opening weekend. There is a chance that streak might be broken this weekend. Assuming Cinemascore surveys the movie, I think it’s likely that the Boston Marathon bombing biopic Stronger will join the list. I have hopes that Battle of the Sexes will sneak in as well. Check out my update on Monday to see how good my instincts were.

Why Did “The Big Sick” Drop Out of the Objective Top Fifteen This Week?

This past Sunday my wife, Pam, and I went to see The Big Sick. The movie tells the story of the early days of the relationship between its two screenwriters, Emily Gordon and Kumail Nanjiani. In fact, Nanjiani plays himself in the movie. It is the authenticity of the story, told in a heartfelt and humorous way, that makes this film special.

On the following day, last weekend’s blockbuster, Dunkirk, moved into the second spot in the revised Objective Top Fifteen rankings. When a new movie comes onto the list, another one exits. This week’s exiting movie, ironically, was The Big Sick. Wait! If The Big Sick is such a great movie, why isn’t it in my top fifteen for the year? Are all of the other movies on the list better movies? Maybe yes. Maybe no. You’ll have to determine that for yourselves. You see, the Objective Top Fifteen is your list, not mine.

I developed the Objective Top Ten, which became Fifteen at the beginning of July and will become Twenty at the beginning of October, to provide you with a ranking of 2017 widely released movies that are most likely to be “really like” movies. Because the ranking is based on objective benchmarks, my taste in movies has no influence on the list. The four benchmarks presently in use are: IMDB Avg. Rating, Rotten Tomatoes Rating, Cinemascore Rating, and Academy Award Nominations and Wins. A movie like Hidden Figures that meets all four benchmarks has the greatest statistical confidence in its “really like” status and earns the highest “really like” probability. A movie that meets three benchmarks has a greater “really like” probability than a movie that meets only two benchmarks. And so on.

The important thing to note, though, is that this is not a list of the fifteen best movies of the year. It is a ranking of probabilities (with some tiebreakers thrown in) that you’ll “really like” a movie. It is subject to data availability: the more positive data that’s available, the more statistical confidence, i.e. a higher probability, the model has in the projection.
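As a rough illustration of the benchmark idea, here is a hedged sketch. The Movie structure is mine, and the thresholds (an IMDB average of 7.2, a Cinemascore of “A-” or better, any Oscar recognition) are assembled from figures quoted across these posts, not the model’s exact cutoffs:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Movie:
    title: str
    imdb_avg: Optional[float] = None           # IMDB average rating
    rt_designation: Optional[str] = None       # Rotten Tomatoes designation
    cinemascore: Optional[str] = None          # Cinemascore grade, if surveyed
    oscar_noms_and_wins: Optional[int] = None  # Academy Award recognition

def benchmarks_met(m: Movie) -> int:
    """Count how many of the four objective benchmarks a movie meets."""
    met = 0
    if m.imdb_avg is not None and m.imdb_avg >= 7.2:
        met += 1
    if m.rt_designation == "Certified Fresh":
        met += 1
    if m.cinemascore in ("A+", "A", "A-"):
        met += 1
    if m.oscar_noms_and_wins:
        met += 1
    return met

# Figures are illustrative only.
print(benchmarks_met(Movie("Hidden Figures", 7.8, "Certified Fresh", "A+", 3)))  # 4
print(benchmarks_met(Movie("The Big Sick", 8.0, "Certified Fresh")))  # 2: no survey, no Oscars yet
```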

Which brings me back to The Big Sick. Cinemascore surveys those movies that it considers “major releases”. The Big Sick probably didn’t have a big advertising budget. Instead, the producers of the film chose to roll the movie out gradually, beginning on June 23rd, to create some buzz and momentum before putting it into wide release on July 14th. This is probably one of the reasons why Cinemascore didn’t survey The Big Sick. But, because The Big Sick is missing that third benchmark needed to develop a higher probability, it dropped out of the Top Fifteen. On the other hand, had it earned at least an “A-” from Cinemascore, The Big Sick would be the #2 movie on the list based on the tiebreakers.

And that is both the weakness and the strength of movie data. “Major releases” have it. Smaller movies like The Big Sick don’t.

***

This weekend may be the end of the four-week run of Objective Top Fifteen movie breakthroughs. Atomic Blonde, the Charlize Theron spy thriller, has an outside chance of earning a spot on the list. As of this morning, it is borderline for the IMDB and Rotten Tomatoes benchmarks. I’m also tracking Girls Trip, which earned a Certified Fresh rating from Rotten Tomatoes just in the last couple of days and has an “A+” in hand from Cinemascore. For now, it is just below the IMDB benchmark. We’ll see if that changes over the weekend.

Vacation, My 100th Post, and a July “Really Like” Movie Hot Streak

I arrived in the city of Seattle yesterday in the wee hours of the morning. I’m here to introduce myself to my new, beautiful granddaughter. So if there is a contemplative, or distracted, feel to this week’s post, there is good reason.

This is also my 100th post. Not quite as momentous as your first grandchild, but a marker worthy of reflection nevertheless. It has been a labor of love and a challenge. Blogging was new to me when I started out 99 posts ago. I discovered that you don’t find your voice in the first post. Little by little, though, you develop a style that you and your readers become comfortable with. If you’re lucky, enough people become engaged in your passion and come back for more. Thanks for your support if you’re one of those loyal followers, or even if you’ve just stopped by for an occasional “check and see”. On to the next 100 posts, beginning with a look at what’s caught my eye at the cineplex this coming weekend.

Dunkirk, which goes into wide release tomorrow, is poised to become the fourth high-quality mega-hit in four weeks. As of this morning, it is 94% Certified Fresh on Rotten Tomatoes. And the early overseas feedback on IMDB has produced an impressive 9.6 average rating. This Christopher Nolan depiction of the rescue of the surrounded British army at the beginning of World War II is being compared to the classic Saving Private Ryan. The Saving Private Ryan benchmarks to keep an eye on are Certified Fresh 92%, IMDB Avg. Rating 8.6, and Cinemascore “A”. Pre-wide release, Dunkirk is exceeding the Rotten Tomatoes and IMDB scores. We’ll have to wait until Saturday for Cinemascore results. I’m excited about this one.

In addition to off-schedule posts to this site, vacation for the Mad Movie Man invariably involves a trip to the movies. With an unusually high number of Certified Fresh movies at the theater, it is almost a can’t-miss proposition. But the absolute can’t-miss feature of this vacation is the incredible miracle of my granddaughter Addie Rose.

This Is Turning Into a “Really Like” Summer at the Movies.

In case you haven’t noticed, we are in the midst of a pretty good run of high-quality movies this summer. Since the first weekend in May, which serves as the unofficial beginning of the summer movie season, at least ten movies have earned a 7.2 or higher IMDB average rating and a Certified Fresh rating on Rotten Tomatoes.

May to July 2017 Wide Releases    IMDB Rating   Rotten Tomatoes Rating   % Fresh
Baby Driver                           8.4            C. Fresh               97%
Spider-Man: Homecoming                8.2            C. Fresh               93%
Wonder Woman                          8.0            C. Fresh               92%
Guardians of the Galaxy Vol. 2        8.1            C. Fresh               81%
The Big Sick                          8.0            C. Fresh               97%
I, Daniel Blake                       7.9            C. Fresh               92%
A Ghost Story                         7.5            C. Fresh               87%
Okja                                  7.7            C. Fresh               84%
The Beguiled                          7.3            C. Fresh               77%
The Hero                              7.3            C. Fresh               76%

If early indicators are accurate, War for the Planet of the Apes will join the list after this coming weekend. And, if the early buzz on social media holds up, Christopher Nolan’s new movie Dunkirk will join the list the following weekend.

This seems to me an unusually high number of quality movies for the summer so far, but I can’t tell you how unusual…yet. I’m working on a new long-term project: a database made up solely of objective “really like” movie indicators. It will include all movies finishing in the top 150 in box office receipts for each of the last 25 years. This database will provide a better representation of the bad movies that are released each year, as well as a more robust sample size.

For now, I can only compare this year’s quality to 1992 (the first of the 25 years in my new database). Because Rotten Tomatoes didn’t launch until 1998, I’ve included movies that aren’t Certified Fresh but would otherwise qualify if they had enough critic reviews. Even with that allowance, only 3 movies released between May and July 1992 meet the quality criteria I’m using for this summer.

May to July 1992 Wide Releases    IMDB Rating   Rotten Tomatoes Rating   % Fresh
Night on Earth                        7.5             Fresh                 73%
Enchanted April                       7.6             Fresh                 83%
A League of Their Own                 7.2            C. Fresh               78%

I’ll also add that IMDB average ratings tend to decline over time. It is probable that a few of this year’s movies will ultimately fall below the 7.2 IMDB minimum. But, with 7 of the 10 movies holding IMDB ratings of 7.7 or better, this year’s list should hold up pretty well over time.

***

As I mentioned above, War for the Planet of the Apes opens tomorrow. It is easy to overlook how good this franchise has been. Here are the “really like” indicators for the franchise, including a very early look at tomorrow’s entry.

                                        IMDB Rating   Rotten Tomatoes Rating   % Fresh   Cinemascore
Rise of the Planet of the Apes (2011)       7.6            C. Fresh               81%         A-
Dawn of the Planet of the Apes (2014)       7.6            C. Fresh               90%         A-
War for the Planet of the Apes (2017)       9.1            C. Fresh               93%         ?

Franchises tend to get tired after the first movie. From the critics’ perspective, this franchise appears to get better with each new entry. I expect to see War for the Planet of the Apes on the Objective Top Fifteen list on Monday.

Leave Mummy Out of Your Father’s Day Plans

One of the goals of this blog is to make sure that you are aware of the internet tools out there to protect you from wasting your time on blockbusters like The Mummy. While it had a disappointing opening in the U.S., moviegoers still shelled out an estimated $32.2 million at the box office last weekend for this bad movie. Overseas, it met its blockbuster expectations with a box office of $141.8 million. However, if you were really in the mood for a horror movie, a better choice, though not a sure thing, might have been It Comes At Night, which had a more modest U.S. box office of $6 million.

As a general rule, I won’t go to a movie on its opening weekend. I prefer to get at least a weekend’s worth of data. But if you just have to see a movie on its opening weekend, here are a couple of hints. First, if you are seeing the movie on its opening Friday, the most reliable indicator is Rotten Tomatoes. Most critics release their reviews before the day of the movie’s release, so the Rotten Tomatoes rating on release day is a statistically mature evaluation of the movie. It won’t change much after that.

If you are going to the movies on the Saturday of opening weekend, you can add Cinemascore to the mix. I’ve blogged about this tool before. This grade is based on feedback moviegoers provide about the movie as they are leaving the theater. The grade is posted on the Saturday after the Friday release.

Finally, by Sunday IMDB will produce a pretty good, though slightly inflated, average rating for the movie.

The comparison of these three checkpoints for The Mummy and for It Comes At Night might’ve been helpful to those who thought they were in for a “really like” movie experience.

                    Rotten Tomatoes          IMDB Avg. Rating   Cinemascore Grade
The Mummy           Rotten (17%)                   5.9                 B-
It Comes At Night   Certified Fresh (86%)          7.2                 D

While the Cinemascore grade of D for It Comes At Night would keep me away from opening weekend for both movies, if I had to see one, it wouldn’t be The Mummy.

Here’s the data behind my reasoning. For IMDB, the breakpoint between a movie with a good chance that I will “really like” it and one that I probably won’t is an average rating of 7.2. Movies with an IMDB average rating of 7.2 or higher I “really like” 63.3% of the time; movies below 7.2, only 43.3% of the time. Turning to Rotten Tomatoes, movies that are Certified Fresh I “really like” 68% of the time. That drops to 49.6% for movies that are Fresh and 37.5% for movies that are Rotten. So, absent any information based on my own personal tastes, I won’t go to the movieplex for a movie unless it is graded Certified Fresh by Rotten Tomatoes and has an IMDB rating of 7.2 or higher. That doesn’t mean I won’t “really like” any movie that falls short of those criteria. The movie may be in a genre that appeals to me, which might provide some tolerance for a little less quality. That being said, the odds that I’ll “really like” a low-rated movie are less than 50/50.
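Put as code, a hedged sketch of that screening rule, with the historical rates quoted above attached for reference (the function name and structure are mine):

```python
def worth_opening_weekend(rt_designation: str, imdb_avg: float) -> bool:
    """The stated rule: Certified Fresh AND an IMDB average of 7.2 or higher."""
    return rt_designation == "Certified Fresh" and imdb_avg >= 7.2

# Historical "really like" rates quoted above, for reference.
RT_REALLY_LIKE = {"Certified Fresh": 0.680, "Fresh": 0.496, "Rotten": 0.375}
IMDB_REALLY_LIKE = {"7.2 or higher": 0.633, "below 7.2": 0.433}

print(worth_opening_weekend("Rotten", 5.9))           # The Mummy -> False
print(worth_opening_weekend("Certified Fresh", 7.2))  # It Comes At Night -> True
```

Note that It Comes At Night passes this screen on critics’ and IMDB ratings alone; it’s the “D” from Cinemascore, which this rule doesn’t consider, that would keep me home on opening weekend.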

I should probably explore adding Cinemascore to the objective factors I use in developing “really like” probabilities. To date, though, I don’t have any Cinemascore data, so I don’t yet have a feel for its “really like” reliability. For now, I just use it as another piece of information that might tip me one way or the other if I’m on the fence about a new movie.

Enjoy Father’s Day but stay away from Mummy.

Wonder Woman Is Wonderful But Is It the GOAT Superhero Movie?

Everybody is talking about Wonder Woman and its record-breaking box office last weekend. Critics and audiences agree that Wonder Woman is worth a trip to the theater. The Mad Movie Man is convinced as well. You’ll find the movie in the top half of the 2017 Top Ten List and it is on my Watch List for the week, which means I plan on seeing it within the next week.

I mentioned last week that critics were falling all over themselves praising this movie, with some calling it the superhero GOAT (Greatest Of All Time). Does it warrant such acclaim? Maybe. When you compare it to four other highly rated superhero movies that kicked off franchises, it holds up pretty well.

                       Oscar Noms/Wins   IMDB Rating   Rotten Tomatoes Rating   % Fresh   Combined Rating
Wonder Woman (2017)         0/0              8.3            C. Fresh               93%          17.6
Iron Man (2008)             2/0              7.9            C. Fresh               94%          17.3
Batman Begins (2005)        1/0              8.3            C. Fresh               84%          16.7
Superman (1978)             3/0              7.3            C. Fresh               93%          16.6
Spider-Man (2002)           2/0              7.3            C. Fresh               89%          16.2

All four of the comparison movies were Oscar nominated. We’ll have to wait until next January to see if Wonder Woman earns Oscar recognition. The combined rating presented here adds the IMDB rating to the Rotten Tomatoes % Fresh (converted to a 10-point scale) to measure the response of both critics and audiences to the five movies. It is still early, and IMDB ratings tend to fade a little over time, but for now Wonder Woman is clearly in the GOAT discussion.
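The combined rating is simple enough to express in a couple of lines of code. A sketch, with the function name mine; dividing % Fresh by ten puts critics and IMDB voters on the same scale, weighting them equally:

```python
def combined_rating(imdb_avg: float, pct_fresh: float) -> float:
    """IMDB average rating plus % Fresh converted to a 10-point scale."""
    return round(imdb_avg + pct_fresh / 10.0, 1)

print(combined_rating(8.3, 93))  # Wonder Woman -> 17.6
print(combined_rating(7.3, 89))  # Spider-Man   -> 16.2
```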

If Wonder Woman holds on to its statistical GOAT position, it will be fueled by the response of women to the movie. A comparison of female and male IMDB ratings for the five movies lays this out pretty clearly.

                Female IMDB Rating   Male IMDB Rating   Difference
Wonder Woman           8.6                  8.2            +0.4
Iron Man               7.9                  7.9             0.0
Superman               7.3                  7.3             0.0
Batman Begins          8.1                  8.3            -0.2
Spider-Man             7.1                  7.3            -0.2

While men “really like” Wonder Woman, women love it. Women are responding to a superhero movie like they never have before. Men, on the other hand, have a slight preference for Christopher Nolan’s vision of Batman. I also have to admit that I personally consider Batman Begins one of the GOAT movies, irrespective of genre. That being said, I am really excited about seeing Wonder Woman.

***

After all of this praise for Wonder Woman, you might be wondering why it is only fifth on the 2017 Top Movies List. Does that mean that the four movies ahead of it are better movies? It might, but not necessarily. The top four movies all went into limited release in 2016 to qualify for Oscar consideration. They didn’t go into wide release until early 2017, which is why they are on this list. All of the other movies on the list won’t be considered for Oscar recognition until January 2018. As I mentioned last week, this list is based on objective criteria. The Oscar nominations that the top four movies received are additional objective evidence that they are quality movies. That allows the algorithm to be more confident in its evaluations and, as a result, produces higher “really like” probabilities. Again, just in case you were wondering.

When It Comes to Unique Movies, Don’t Put All of Your Movie Eggs in the Netflix Basket.

It is rare to find a movie that isn’t a sequel, a remake, a borrowed plot idea, or a tried-and-true formula. Many of these movies are entertaining because they feel familiar and remind us of another pleasant movie experience. The movie recommender websites Netflix, Movielens, and Criticker do a terrific job of identifying the movie types you “really liked” before and surfacing movies that match the familiar plot lines you’ve enjoyed in the past.

But what about those movies that are truly original? What about movies that present ideas and plot lines outside your usual comfort zone? Will these reliable websites surface unique movies that I might like? Netflix has 20,000+ genre categories into which it slots movies. But what if a movie doesn’t fit neatly into one of those 20,000 categories?

Yesterday I watched a great movie, Being There.


This 1979 movie, starring Peter Sellers in an Academy Award-nominated performance, is about a simple-minded gardener who never left his employer’s home during the first fifty years of his life. Aside from gardening, the only knowledge he has is what he’s seen on television. After his employer dies, he is forced to leave his home. The movie follows Chance the gardener as he becomes Chauncey Gardiner, economic advisor to the President. It’s not a story of transformation but of perception. The movie is fresh.

My most reliable movie recommenders, Netflix and Movielens, warned me away from this movie. The only reason I added it to my weekly Watch List is that I saw it in the theater when it first came out in 1979 and “really liked” it.

Another possible reason Netflix missed on this movie is that I hated Peter Sellers’ other classic, Dr. Strangelove. I rated it 1 out of 5 stars. If Being There is slotted into a Netflix category of Peter Sellers classics, that might explain the mismatch.

Is it impossible, then, to surface creative, original movies that you might like? Not entirely. Criticker predicted I would rate Being There an 86 out of 100; I gave it an 89. The IMDB rating is 8.0, based on over 54,000 votes. Rotten Tomatoes has it at 96% Certified Fresh. This is why I incorporate ratings from five different websites into the “really like” model rather than relying on Netflix alone.

When it comes to unique movies, don’t put all of your movie eggs in the Netflix basket.

Does Critic Expertise on Rotten Tomatoes Overcome the Law of Large Numbers?

In the evolution of my “really like” movie algorithm, one of the difficulties I keep encountering is how to integrate Rotten Tomatoes ratings in a statistically significant way. Every time I try, I rediscover that its ratings are not as useful as those of the other websites I use. It’s not that they have no use. To determine whether a movie is worth seeing within a week of its release, you’ll be hard pressed to find a better indicator. The problem is that most of the data for a particular movie is counted in that first week. Most critic reviews are completed close to the release date to give moviegoers guidance on the day a movie opens. After that first week, the critics are on to the next batch of new movies to review. With all of the other websites, the ratings keep improving as more people see the movie and provide input: the data pool gets larger and the law of large numbers kicks in. With Rotten Tomatoes, there is very little data growth. Its value rests on the expertise of the critics rather than on the law of large numbers.

The question becomes: what is the value of film critics’ expertise? It is actually pretty valuable. When Rotten Tomatoes slots movies into one of its three main rating buckets (Certified Fresh, Fresh, Rotten), it does create statistically significant differentiation.

Rating            “Really Like” %
Certified Fresh        69.7%
Fresh                  50.0%
Rotten                 36.6%

Rotten Tomatoes is able to separate pretty well those movies I “really like” from those I don’t.

So what’s the problem? If we stick to Certified Fresh movies, we’ll “really like” them 7 out of 10 times. That’s true. And if I’m deciding which new release to see in the movie theater, that’s really good. But if I’m deciding what movie my wife and I should watch on Friday movie night, choosing from the movies on cable or our streaming service, we can do better.

Of the 1,998 movies I’ve seen in the last 15 years, 923 are Certified Fresh. Which of those movies am I most likely to “really like”? Based on the following table, I wouldn’t rely on the Rotten Tomatoes % Fresh number.

Rating            % Fresh Range   “Really Like” %
Certified Fresh    96 to 100%          69.9%
Certified Fresh    93 to 95%           73.4%
Certified Fresh    89 to 92%           68.3%
Certified Fresh    85 to 88%           71.2%
Certified Fresh    80 to 84%           73.0%
Certified Fresh    74 to 79%           65.3%

This grouping of six equal-size buckets suggests that there isn’t much difference between a movie in my database that is 75% Fresh and one that is 100% Fresh. Now, it is entirely possible that an actual difference exists. If my database were larger, the data might produce a less random, statistically significant pattern. For now, though, the data is what it is: Certified Fresh is predictive, and the % Fresh part of the rating is less so.
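For what it’s worth, six equal-size buckets like these are straightforward to reproduce with pandas’ quantile cut. The file and column names (pct_fresh, and really_liked as a 1/0 flag) are hypothetical stand-ins for my viewing log:

```python
import pandas as pd

# Hypothetical viewing log: one row per movie seen, flagged 1/0 for
# whether I "really liked" it.
log = pd.read_csv("viewing_log.csv")
cf = log[log["rt_designation"] == "Certified Fresh"]

# Six equal-size buckets by % Fresh, as in the table above.
buckets = pd.qcut(cf["pct_fresh"], q=6, duplicates="drop")
really_like_pct = cf.groupby(buckets)["really_liked"].mean().mul(100).round(1)
print(really_like_pct)
```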

Expertise can reduce the numbers needed for meaningful differentiation between what is Certified Fresh and what is Rotten. The law of large numbers, though, may be too daunting for credible guidance much beyond that.
