Will I "Really Like" this Movie?

Navigating Movie Website Ratings to Select More Enjoyable Movies

Add a Year Here. Tweak a Formula There. And, the Objective Top Twenty Looks Very Different.

I was able to add 1998 to the Objective Database last weekend. The extra data allowed me to factor Oscar wins into the algorithm. But, it was one little tweak to the Oscar performance factor that dramatically altered the 2017 Objective Top Twenty this week.

For the Oscar performance part of my algorithm, I created five groupings of movies based on their highest Academy Award achievement. If a movie won in a major category, it went into the first group. If it was nominated for a major but didn’t win, it went into the second group. If it wasn’t nominated for a major but won in a minor category, it went into the third group. If it was only nominated in a minor category but didn’t win, it went into the fourth group. Finally, if it wasn’t nominated in any Oscar category, it went into the fifth group.

Here is the percentage of movies in each group with an average IMDB rating of 7 or better:

Best Oscar Performance: % with IMDB Avg. Rating of 7+
Major Win 90.3%
Major Nomination 87.7%
Minor Win 79.7%
Minor Nomination 71.7%
No Nominations 59.8%
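
To make the grouping concrete, here’s a minimal Python sketch of how a movie might be bucketed and mapped to the observed rates above. The field names and helper function are my own illustration, not the actual database schema.

```python
# Hedged sketch: bucket a movie by its best Oscar performance and look up
# the observed share of titles with a 7+ IMDB average rating (table above).
# Field names (major_wins, major_noms, ...) are illustrative placeholders.

OSCAR_GROUP_7PLUS_RATE = {
    "Major Win": 0.903,
    "Major Nomination": 0.877,
    "Minor Win": 0.797,
    "Minor Nomination": 0.717,
    "No Nominations": 0.598,
}

def best_oscar_group(major_wins: int, major_noms: int,
                     minor_wins: int, minor_noms: int) -> str:
    """Return the movie's highest Academy Award achievement group."""
    if major_wins > 0:
        return "Major Win"
    if major_noms > 0:
        return "Major Nomination"
    if minor_wins > 0:
        return "Minor Win"
    if minor_noms > 0:
        return "Minor Nomination"
    return "No Nominations"

# Example: a film with one minor win and no major recognition
group = best_oscar_group(major_wins=0, major_noms=0, minor_wins=1, minor_noms=2)
print(group, OSCAR_GROUP_7PLUS_RATE[group])  # Minor Win 0.797
```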

Wins seem to matter, particularly for the minor categories. Major nominations are clearly better “really like” indicators than minor nominations. It’s the no-nominations grouping that’s most revealing. If a movie doesn’t get at least one nomination, the odds of it being a “really like” movie drop dramatically. This led me to discover some faulty thinking on my part.

If movies like Dunkirk, Lady Bird, and Three Billboards Outside of Ebbing, Missouri, all movies headed towards major Oscar nominations in January, are treated in my algorithm as if they failed to earn a single Oscar nomination, those movies are being unfairly penalized. It was this flaw in my system that needed fixing. Now, movies that haven’t gone through the Oscar nominating process are designated as Not Applicable. No Oscar performance test is applied to them. Without the weight of the No Nominations designation, many of the movies that didn’t get their first release until 2017 have risen significantly in the 2017 Objective Top Twenty rankings.
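
A sketch of that fix: movies that haven’t yet been through a nominating cycle get a Not Applicable designation and simply skip the Oscar performance test. The function and flag names below are illustrative, not the actual code.

```python
# Hedged sketch of the fix: a movie that hasn't been through an Oscar
# nominating cycle yet is marked "Not Applicable" and contributes no
# Oscar-performance penalty. Names here are illustrative placeholders.

def oscar_performance_label(nominations_announced: bool, best_group: str) -> str:
    """best_group is the movie's highest-achievement group from the sketch
    above, e.g. "Major Win" or "No Nominations"."""
    if not nominations_announced:
        return "Not Applicable"  # skip the Oscar performance test entirely
    return best_group

print(oscar_performance_label(False, "No Nominations"))  # Not Applicable
print(oscar_performance_label(True, "No Nominations"))   # No Nominations
```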

***

Get ready for a Thanksgiving treat. Now that 1998 has been added to the Objective Database, we can reveal the Objective Top Seven Movies from the years 1992-1998. Adding Academy Award Wins to the mix will shake up those rankings as well. Check in next Thursday after you’ve taken your post-turkey dinner nap.

***

The wide releases this weekend are Justice League, The Star, and Wonder, but it’s the limited release, Mudbound, that I’ll be watching closely. This movie, set in the post-WWII rural American South, is being mentioned as a Best Picture contender. Here’s the thing, though. Most people won’t see it in a movie theater since it opens simultaneously on Friday on Netflix streaming. Can a movie that is more widely viewed at home than in the theater gain Academy Award traction? Stay tuned.

 


Objectively Speaking, What Are The Top Six Movies From 1992 to 1997?

Now, I’ll admit that a Top Six list from a seemingly random six-year period seems a little odd. But there is a method to my Movie Madness.

As I’ve mentioned on more than one occasion, I’m building a twenty-five-year movie database based solely on objective factors to better identify those movies most of us would “really like”. It’s a time-consuming process. If I’m uninterrupted by other priorities in my life, I can usually add a complete year to the database in a week and a half. There will always be interruptions, though, and I don’t expect to finish the project before mid-year 2018.

I’m a little impatient to get some useful information from my efforts, so I thought it might be fun to create an Objective Best Movie List for however many years I’ve completed. With six years done, I now have a list of the best six movies from that time frame. I should finish 1998 by the weekend, and after incorporating the new data into my algorithm I’ll be able to create a Top Seven list. Now that you have the picture, here’s the top six in ascending order.

6. Sense and Sensibility (1995). IMDB Avg. 7.7, Certified Fresh 80%, CinemaScore A, Oscar- 3 Major nominations, 4 Minor

This was the first of a mid-1990s run of Jane Austen titles to make it to the big screen. Emma Thompson won the Oscar for Best Screenplay; she is the only person ever to win Oscars for both acting and screenwriting. The movie is also noteworthy for the breakthrough performance of Kate Winslet, who at age 20 earned the first of her seven Oscar nominations.

5. In the Name of the Father (1994). IMDB Avg. 8.1, Certified Fresh 94%, CinemaScore A, Oscar- 4 Major nominations, 3 Minor

This is the movie that will probably surprise many of you. This biopic of Gerry Conlon, who was wrongly imprisoned for an IRA bombing, was the second of Daniel Day-Lewis’ five Best Actor nominations. He lost 30 pounds in preparation for the role and spent his nights on the set in the prison cell designed for the movie.

4. Good Will Hunting (1997). IMDB Avg. 8.3, Certified Fresh 97%, CinemaScore A, Oscar- 4 Major nominations, 5 Minor

This movie is in my personal top ten. Two relatively unknown actors, Matt Damon and Ben Affleck, became stars overnight and won the Oscar for Best Screenplay as well. If either of them ever gets a Best Actor award, he’ll join Emma Thompson in that select group. In his fourth nominated performance, Robin Williams won his only Oscar, for Best Supporting Actor.

3. Toy Story (1995). IMDB Avg. 8.3, Certified Fresh 100%, CinemaScore A, Oscar- 1 Major Nomination, 2 Minor

Toy Story’s ranking is driven by its 100% Fresh Rotten Tomatoes rating from 78 critics. While its Oscar performance is weaker than the other movies on the list, it should be noted that Toy Story was the first animated movie ever nominated for Best Screenplay. As the database grows, I expect the number of Oscar nominations and wins to become credible factors in these rankings. For now, receiving one Major and one Minor nomination has the same impact on the algorithm as it does for a movie like Titanic, which won eleven awards. This is probably the only movie of the six that appears out of place in the rankings.

2. Shawshank Redemption (1994). IMDB Avg. 9.3, Certified Fresh 91%, CinemaScore A, Oscar- 3 Major nominations, 4 Minor

Shawshank still ranks as IMDB’s top movie of all time. At some point, I’m going to write an article about movies that achieve cult status after only modest success at the box office. Shawshank would be one of those movies. After a pedestrian $28,341,469 domestic gross at the box office, it became one of the highest-grossing video rentals of all time.

1. Schindler’s List (1994). IMDB Avg. 8.9, Certified Fresh 96%, CinemaScore A+, Oscar- 4 Major nominations, 8 Minor

Interestingly, this is the only movie of the six on the list to win Best Picture. It is also the only one on the list to earn an A+ from CinemaScore. Combine that with its twelve Oscar nominations and you can see why, objectively, it is at the top of the list.

Objectivity improves as data grows, and it should be fun to see this list change as the database expands.

What do you think?

 

What Does the Best Picture Oscar Race Look Like Today?

I was away most of the week, so I wasn’t able to update my databases or my lists, or come up with new and interesting studies. But, that doesn’t mean I haven’t been thinking about “really like” movies.

This time of year I turn to AwardsCircuit.com for the latest thinking on the Oscar race. AwardsCircuit updated their projected nominees this past Monday, and with nine weekends left in the year, eight of the ten Best Picture projections have not gone into wide release yet. Does this mean that the best is yet to come? It could. But, it could also mean that they are still hyped for Best Picture because their exposure to critics and audiences has been limited.

There are other movies, already released, that were expected to be Best Picture contenders. Of these, only Dunkirk and Blade Runner 2049 have met their pre-release expectations and are still considered Best Picture caliber movies. Other Best Picture hyped movies, like Battle of the Sexes, Marshall, Suburbicon, and Mother!, have either wilted or flopped when exposed to critics and audiences. The same could happen to the eight pre-release movies still projected for Best Picture nominations.

If Dunkirk and Blade Runner 2049 have survived the scrutiny of critics and audiences to remain Best Picture contenders, how do the remaining eight projected contenders measure up to those movies so far? All eight have been screened at film festivals for a limited number of critics and audiences, so there is some feedback on how these movies are trending. Using IMDB average ratings and Rotten Tomatoes % Fresh ratings, we can get an early read on how those eight movies are faring. I’ve converted the Rotten Tomatoes % Fresh to a ten-point scale to get an apples-to-apples comparison with IMDB. I’ve also included the four movies mentioned above that haven’t lived up to the hype so far. The eight pre-release contenders are in bold on the list.

Movie IMDB Rotten Tomatoes Total Score
Call Me By Your Name 8.3 9.8 18.1
Three Billboards Outside of Ebbing, Missouri 8.3 9.8 18.1
Lady Bird 7.8 10.0 17.8
Dunkirk 8.3 9.2 17.5
Blade Runner 2049 8.5 8.8 17.3
Shape of Water, The 7.5 9.7 17.2
I, Tonya 7.4 9.1 16.5
Mudbound 6.3 9.5 15.8
Battle of the Sexes 6.9 8.5 15.4
Marshall 7.0 8.3 15.3
Mother! 7.1 6.9 14.0
Last Flag Flying 6.7 6.8 13.5
Darkest Hour 5.3 7.9 13.2
Suburbicon 4.7 2.5 7.2
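
The “Total Score” column is nothing more elaborate than the IMDB average plus the % Fresh rescaled to a ten-point scale. A minimal sketch:

```python
# Sketch of the apples-to-apples comparison: Rotten Tomatoes % Fresh is
# rescaled to a 10-point scale and added to the IMDB average rating.

def total_score(imdb_avg: float, rt_pct_fresh: int) -> float:
    rt_on_ten_point_scale = rt_pct_fresh / 10.0
    return imdb_avg + rt_on_ten_point_scale

print(round(total_score(8.3, 98), 1))   # Call Me By Your Name: 8.3 + 9.8 = 18.1
print(round(total_score(7.8, 100), 1))  # Lady Bird: 7.8 + 10.0 = 17.8
```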

If the post-release feedback is consistent with the pre-release feedback, then Call Me By Your Name, Three Billboards Outside of Ebbing, Missouri, and Lady Bird are the real deal. The Shape of Water and I, Tonya also appear solid. Mudbound could be on the fence. The early audience response to Last Flag Flying and Darkest Hour may be a warning sign that these movies have been overhyped. If they falter, Battle of the Sexes could move back into contention. You could also see two movies that haven’t been seen by either critics or audiences yet, The Post and Phantom Thread, emerge as contenders. You could also see a dark horse like The Florida Project (IMDB=8.1, Rotten Tomatoes=97% Fresh) sneak in. There are still many twists and turns before Best Picture nominations are announced in January.

The first of these eight movies to test itself will be Lady Bird, which goes into limited release this coming weekend. With fifty critic reviews registered on Rotten Tomatoes, it is still at 100% Certified Fresh. This is one that I’ll probably see in the theater. Saoirse Ronan has become one of my favorite young actresses.

The Objective Top Twenty Doesn’t Account for Personal Taste

Over the last few months I’ve spent a lot of time introducing you to the 2017 Objective Top Twenty Movies. But, let’s be clear. It isn’t my list of the top twenty movies so far. As a matter of fact, I’ve only seen a handful of the movies, and I may only see a handful more in the future. There are some movies on the list that I’ll never watch. At the end of the day, which movies on the list you watch is a matter of personal taste.

The Objective Top Twenty is a ranking of the movies that, based on the data available today, I’m most confident are good movies, regardless of personal taste. Hidden Figures and Lion are at the top of the list because there is more data available for them than for any other movies on the list, and that mature data continues to support their quality. I can say with a high degree of confidence that both are really good movies. On the other hand, Blade Runner 2049, which probably is a good movie, just doesn’t have the data yet to confidently back up that subjective opinion.

While I’m confident all of the movies on the Objective Top Twenty are good movies, I’m not confident that you, personally, would “really like” every movie on the list. In fact, I’m fairly confident you wouldn’t like every movie on the list. Our personal taste in movies reflects our life experiences. Those movies that we “really like” somehow connect with our memories, our aspirations, our curiosity, or represent a fun place to escape. Not every movie on the Objective Top Twenty is going to make the personal connection needed to make it a “really like” movie for each of us.

So, which of the Objective Top Twenty should you watch? Other than using the websites I promote in this blog, most people use trailers to see if they connect with a small sample of the movie. If it’s an Objective Top Twenty movie and the trailer connects with you, that’s not a bad approach. The only caution is that sometimes a trailer leaves you with the impression that a movie is about X when it’s really about Y.

My recommendation is to use at least one personal rating website that will model your personal taste in movies. I use three: Netflix-DVD, Movielens, and Criticker. There are links for all three at the top of the page. I’ve created a subjective “really like” model to go along with the objective model used to create the Objective Top Twenty. Here’s a ranking of the Objective Top Twenty based on the probability today that I will personally “really like” each movie.

2017 Released Movies Subjective “Really Like” Probability Objective “Really Like” Probability My Rating for Seen Movies
Hidden Figures 74.3% 76.78% 7.9
Lion 74.0% 76.00% 7.9
Wonder Woman 73.2% 71.39% 8.5
Dunkirk 72.7% 70.71% 8.4
Patriots Day 72.7% 71.01%
Spider-Man: Homecoming 71.9% 71.39%
Logan 71.3% 70.71%
Big Sick, The 69.5% 70.56% 8.4
Guardians of the Galaxy Vol. 2 69.2% 71.01%
Only the Brave  62.6% 71.01%
Monster Calls, A 62.2% 71.01%
Land of Mine 61.2% 74.72%
Salesman, The 59.2% 75.18%
I Am Not Your Negro 56.0% 75.18%
Kedi 52.4% 70.56%
Florida Project, The 51.6% 70.56%
Truman 50.8% 70.56%
20th Century Women 50.5% 75.21%
Silence 48.7% 72.78%
Lucky 45.7% 70.56%
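
For the curious, the re-ranking itself is nothing exotic: it’s just the objective list sorted by the subjective probability. A minimal sketch, using a few rows copied from the table above:

```python
# Sketch of the re-ranking step: sort the objective list by the subjective
# "really like" probability from the personal-taste model.
# (title, subjective probability, objective probability) from the table above.
movies = [
    ("Hidden Figures", 0.743, 0.7678),
    ("Lion", 0.740, 0.7600),
    ("20th Century Women", 0.505, 0.7521),
    ("Lucky", 0.457, 0.7056),
]

for title, subj, obj in sorted(movies, key=lambda m: m[1], reverse=True):
    print(f"{title}: subjective {subj:.1%}, objective {obj:.2%}")
```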

The movies that I’ve seen so far are, for the most part, the movies at the top of the list. In effect, I’ve ranked the Objective Top Twenty by the probability that I will “really like” each movie. I am certain that I will watch all of the top nine movies on this list. I will probably watch some of the remaining eleven movies. I will definitely not watch all of them.

However you choose to do it, the Objective Top Twenty needs a personal touch when you use the list to pick movies to watch. I can only guarantee that they are good movies. It’s up to you to figure out which ones will be “really like” movies for you.

I’m Stating the Obvious, But You Will Probably “Really Like” Oscar-Nominated Movies.

You are more likely to “really like” a movie that has received an Oscar nomination than one that hasn’t. Now, there’s a bold statement. But while most people would intuitively agree with the statement, I have statistical data to support it.

As followers of this blog are aware, I’m building a database of objective movie ratings data from the past 25 years. Last week I added a fifth year of data. With each year that I add, I can pose questions that are easier to test statistically, such as whether Oscar nominations have “really like” statistical significance. I even take it a step further by exploring whether there are differences between major nominations and minor ones.

Major nominations are the commonly accepted major awards: Best Picture, Director, Actor, Actress, and Screenplay. Minor nominations cover all of the other categories presented on Oscar night. Neither group includes the special technical awards presented in a separate ceremony.

Here are the results for the years 1992 to 1996. The movies are grouped by whether they received at least one major and/or minor nomination. The table shows the percentage of IMDB voters who gave the movies in each group a rating of 7 or higher.

Movies with: % 7+
Major & Minor Nominations 90.5%
Major Nominations Only 84.6%
Minor Nominations Only 74.7%
No Nominations 61.4%
All Sample Movies 73.0%
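
If you’re curious how a table like this gets built, here’s a minimal sketch of the aggregation as I understand it: each group’s figure is the share of all IMDB votes cast for that group’s movies that were 7 or higher. The per-movie records and numbers are made-up placeholders, not the actual database.

```python
from collections import defaultdict

# Hedged sketch: per-movie records carry the nomination group, total IMDB
# votes, and votes of 7 or higher. The numbers are placeholders.
movies = [
    {"group": "Major & Minor Nominations", "votes": 500_000, "votes_7plus": 455_000},
    {"group": "Minor Nominations Only",    "votes": 120_000, "votes_7plus":  90_000},
    {"group": "No Nominations",            "votes": 300_000, "votes_7plus": 180_000},
]

totals = defaultdict(lambda: [0, 0])  # group -> [7+ votes, all votes]
for m in movies:
    totals[m["group"]][0] += m["votes_7plus"]
    totals[m["group"]][1] += m["votes"]

for group, (sevens, votes) in totals.items():
    print(f"{group}: {sevens / votes:.1%} of ratings were 7+")
```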

Major nominations have a clear statistical advantage over minor nominations. The size of the gap between movies with only minor nominations and those with no nominations might be surprising. My gut tells me this gap will narrow as we add more years, especially more recent years. But, it is interesting nonetheless. It does suggest that members of the Academy of Motion Picture Arts and Sciences (AMPAS) understand their craft, and that their expertise does a great job of identifying the “really like” movies released in a given year.

There are more questions to answer regarding Oscar performance as a “really like” indicator. What is the predictive value of an Oscar win? Does predictive value increase with the number of nominations a movie receives? Does a Best Picture nomination have more predictive value than any other category? All of these questions and more will have to wait for more data.

One question we have answered is why all of the movies at the top of the Objective Top Twenty are Oscar-nominated movies from last year’s voting. The other takeaway is that the movies on the list that didn’t go through last year’s nominating process probably won’t stay on the list unless their names are called on January 23, 2018, when this year’s Oscar nominations are announced.

***

It might be a light weekend for new Objective Top Twenty contenders. I’m keeping my eye on Only The Brave, which chronicles the story of the Granite Mountain Hotshots, one of the elite firefighting units in the USA. As of this morning, it is 89% Fresh on Rotten Tomatoes and has a 7.3 on IMDB.

In the Objective Top Twenty, a Certified Fresh Is a Must…But Is It Enough?

When you review the Objective Top Twenty you’ll notice that every movie has earned a Certified Fresh designation from Rotten Tomatoes. It is a dominant factor in my rating system. It may even be too dominant.

All of the analysis that I’ve done so far suggests that a Certified Fresh designation by Rotten Tomatoes is a strong indicator of a “really like” movie. The new Objective Database that I’m working with also shows that a Certified Fresh rating results in a high likelihood that IMDB voters will rate the movie a 7 or higher.

RT Designation  # of IMDB Votes  IMDB Votes 7+ %
Certified Fresh  19,654,608  88.2%
Fresh  6,144,742  75.4%
Rotten  9,735,096  48.5%

And, as you might expect, the likelihood of a 7-or-higher rating stair-steps down as you move into the Fresh and Rotten groups of movies.

This exposes a flaw in my previous thinking about Rotten Tomatoes. In the past I’ve indicated that I haven’t seen a statistical relationship between % Fresh and the likelihood of a “really like” movie. And, actually, that’s a true statement. The flaw in my thinking was that, because I didn’t see it, I assumed it didn’t exist.

The Certified Fresh, Fresh, and Rotten designations are primarily defined by % Fresh:

  • Certified Fresh for most movies is > 75% Fresh
  • Fresh for most movies is > 60% and < 75% Fresh
  • Rotten is < 60% Fresh
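
Translated into code, those simplified designations look like this. The real Rotten Tomatoes rules also involve review counts and exact boundary handling, which this sketch ignores.

```python
# Sketch of the simplified designations described above. Rotten Tomatoes'
# actual Certified Fresh rules also require minimum review counts; this
# follows only the approximate % Fresh thresholds from the bullet list.

def rt_designation(pct_fresh: float) -> str:
    if pct_fresh > 75:
        return "Certified Fresh"
    if pct_fresh > 60:
        return "Fresh"
    return "Rotten"

print(rt_designation(91))  # Certified Fresh
print(rt_designation(65))  # Fresh
print(rt_designation(48))  # Rotten
```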

If differentiation exists among these three groups, then it should exist between other % Fresh groups as well. For example, movies that are 95% Certified Fresh should have a greater “really like” probability than movies that are 80% Certified Fresh. I now believe that I haven’t seen the difference because there hasn’t been enough data to produce stable differences.

When I marry the Rotten Tomatoes data with IMDB data, I also get more data to work with. Below, I’ve grouped the Certified Fresh movies into four groups based on % Fresh.

Certified Fresh:  # of IMDB Votes  IMDB Rating 7+ %
100%  966,496  90.7%
90-99%  10,170,946  89.9%
80-89%  5,391,437  87.3%
70-79%  3,125,729  83.5%
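
Here’s a sketch of the grouping behind that table: Certified Fresh movies bucketed by % Fresh, with the 7+ share weighted by each movie’s IMDB vote count. The movie tuples are placeholders, not the real data.

```python
# Hedged sketch: bucket Certified Fresh movies by % Fresh and compute the
# vote-weighted share of 7+ ratings per bucket. Sample tuples are made up.

def fresh_bucket(pct_fresh: int) -> str:
    if pct_fresh == 100:
        return "100%"
    if pct_fresh >= 90:
        return "90-99%"
    if pct_fresh >= 80:
        return "80-89%"
    return "70-79%"

# (pct_fresh, total IMDB votes, votes of 7 or higher)
certified_fresh_movies = [(100, 80_000, 73_000), (91, 250_000, 224_000), (77, 60_000, 50_000)]

buckets = {}
for pct, votes, sevens in certified_fresh_movies:
    b = buckets.setdefault(fresh_bucket(pct), [0, 0])
    b[0] += sevens
    b[1] += votes

for name, (sevens, votes) in buckets.items():
    print(f"{name}: {sevens / votes:.1%} rated 7+")
```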

We may finally be seeing the differences you’d expect as the amount of data gets larger.

So, why is this important? If we treat all Certified Fresh movies as equally strong “really like” prospects, we are in effect saying that we are as likely to “really like” The Shawshank Redemption (Certified Fresh 91%, IMDB Avg. Rating 9.3) as The Mask (Certified Fresh 77%, IMDB Avg. Rating 6.9). The “really like” model becomes a more dynamic movie pre-screening tool if it can make a Rotten Tomatoes distinction between those two movies.

I believe that the database has to get much larger before we can statistically differentiate between Certified Fresh 87% movies and Certified Fresh 85% movies. But, I think I can begin to use the Certified Fresh groupings developed above as an additional way of identifying quality movies within the Certified Fresh grade.

You might just see this change in next Monday’s Objective Top Twenty.

***

In looking at this weekend’s new releases, there are no sure things, but three of the movies are worth keeping an eye on. The Foreigner, the Jackie Chan action thriller, is getting good early feedback from critics and IMDB voters. I expect it to do well at the box office. Marshall, the Thurgood Marshall bio-pic starring Chadwick Boseman, has received some early Oscar buzz. It appears to be headed towards a Certified Fresh rating from Rotten Tomatoes. The movie that may sneak up on audiences is Professor Marston & the Wonder Women. Professor Marston created the character of Wonder Woman in the 1940s, and this movie tells that story. Already, 34 of 38 critics have given it a Fresh rating on Rotten Tomatoes. I would expect it to receive its Certified Fresh designation by tomorrow morning.

Will “You” Really Like This Movie?

If you reviewed this week’s Objective Top Twenty, you might have noticed something other than five additional movies on the list. You might have noticed that, other than Hidden Figures holding onto the number one spot on the list, all of the rankings had changed.

A few months back, I mentioned that I was developing a new objective database to project “really like” movies, one not influenced at all by my taste in movies. This week’s Objective Top Twenty reflects the early fruits of that labor.

The plan is to build a very robust database of all of the movies from the last twenty-five years that finished in the top 150 in box office sales for each year. I have 1992 through 1995 completed, which gives me enough movies to get started.

The key change in the “really like” formula is that my algorithm now measures the probability that users of the IMDB database will rate a particular movie a 7 out of 10 or higher, which is my definition of a “really like” movie. The key components of the formula are IMDB Average Rating, Rotten Tomatoes Rating, CinemaScore Grade, and the number of Academy Award wins and nominations in the major and minor categories.

In future posts, I’ll flesh out my logic for all of these factors. But the key factor is the ability to measure the percentage of IMDB voters who rated a particular movie a 7 or higher. When you aggregate all of the movies with a particular IMDB average rating, you get results that look like this sample:

Avg. Rating  % Rating 7+
8.5  92.8%
8.0  88.8%
7.5  81.4%
7.0  69.2%
6.5  54.7%
6.0  41.5%
5.5  28.7%
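
If you want to use a sample like this programmatically, a simple lookup with linear interpolation between the half-point steps will do. The interpolation is my own simplification, not necessarily how the full model works.

```python
# Sketch: linear interpolation over the sampled mapping from IMDB average
# rating to the share of votes that are 7+. Values come from the table above;
# the interpolation itself is an assumption for illustration.

SAMPLE = [(5.5, 0.287), (6.0, 0.415), (6.5, 0.547),
          (7.0, 0.692), (7.5, 0.814), (8.0, 0.888), (8.5, 0.928)]

def prob_7plus(avg_rating: float) -> float:
    if avg_rating <= SAMPLE[0][0]:
        return SAMPLE[0][1]
    if avg_rating >= SAMPLE[-1][0]:
        return SAMPLE[-1][1]
    for (x0, y0), (x1, y1) in zip(SAMPLE, SAMPLE[1:]):
        if x0 <= avg_rating <= x1:
            return y0 + (y1 - y0) * (avg_rating - x0) / (x1 - x0)

print(f"{prob_7plus(7.2):.1%}")  # roughly 74%, between the 7.0 and 7.5 rows
```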

Note that an average rating of 7.0 doesn’t mean that every movie with that average is a “really like” movie. Only 69.2% of the votes cast for movies with a 7.0 average rating were 7 or higher. Conversely, a movie with an average rating of 6.0 isn’t always a “don’t really like” movie, since 41.5% of those voters handed out 7’s or higher. It does mean, though, that a movie with a 7.0 average rating is more likely to be a “really like” movie than one with a 6.0 rating.

These changes represent a start down a path towards a movie pre-screening tool that is more useful to the followers of this blog. It is a work in progress that will only get better as more years are added to the database. But, we have a better answer now to the question, “Will you ‘really like’ this movie?”

***

If you’re going to the movies this weekend, chances are that you’re going to see Blade Runner 2049. The early indications are that it is going to live up to the hype. You might also check out The Florida Project, an under-the-radar movie that is getting some apparently well-deserved buzz.

Why Does CinemaScore Leave Out So Many Good Movies When Issuing Grades?

The 2017 Academy Awards will be forever remembered as the year that La La Land was awarded Best Picture for about a minute before they discovered that Moonlight was the actual winner. Those two movies have something else in common. Neither movie received a CinemaScore grade despite, arguably, being the top two movies of 2016.

I’m thinking about this issue this week because three movies with Oscar buzz, Stronger, Battle of the Sexes, and Victoria and Abdul, went into limited release last weekend. None of them was graded by CinemaScore. There is a valid reason for this, but that doesn’t make it any less disappointing to movie pre-screeners like myself.

For me, CinemaScore is appealing because it measures only opening night reaction. Most people who go to the opening night of a movie are there because they really want to see that movie. The pre-release buzz has grabbed their attention to such an extent that they can’t wait to see it. They walk into an opening night movie expecting to love it. When they walk out of the movie and respond to CinemaScore’s survey, they are probably grading based on expectations. So, when a movie receives an “A” from CinemaScore, it tells us that the movie lives up to the hype. Anything less than that suggests that the movie experience was less than they expected.

CinemaScore gets stuck when it comes to movies that are released in a limited number of theaters before being released widely. CinemaScore samples locations throughout the U.S. and Canada to establish a credible, unbiased sample. When a movie goes into limited release, it is released in some of their sample locations but not in most of them. Last weekend, Stronger was released in 573 theaters, Battle of the Sexes was released in 21 theaters, and Victoria and Abdul was released in 4 theaters. To provide some perspective, Kingsman: The Golden Circle opened in 4,003 theaters last weekend and earned a “B+” grade from CinemaScore. When Stronger and Battle of the Sexes go into wide release tomorrow, does word-of-mouth reaction from moviegoers who’ve seen the movies in the last week disturb the integrity of any sample taken this weekend? When Victoria and Abdul goes into wide release on October 6, is its release into just 4 theaters last weekend small enough not to taint the sample? I don’t know the answers to these questions. I’ll be looking to see if these movies get graded when they go into wide release. In Box Office Mojo’s article on last weekend’s box office performance, they indicate that CinemaScore graded Stronger an “A-”, even though the grade hasn’t been officially posted on CinemaScore’s website. Perhaps they are waiting to post it until after wide release?

I understand why grades don’t exist on CinemaScore for many limited release movies. I understand the importance of data integrity in the creation of a credible survey. I will just observe, though, that in this age of social media, using limited releases to build pre-wide-release momentum for quality movies is an increasingly viable strategy. Just this week, A24, the studio behind the rise of Moonlight last year, decided to put its dark horse candidate this year, Lady Bird, into limited release on November 3rd after it emerged from the Telluride and Toronto film festivals with a 100% Fresh grade from Rotten Tomatoes. CinemaScore may be facing the prospect of an even broader inventory of ungraded top-tier movies than it has today. It will be interesting to see how they respond to this challenge, if at all.

 

So Now Rotten Tomatoes Has No Impact On the Box Office? Not So Fast.

A conventional wisdom has been evolving that Rotten Tomatoes movie ratings are negatively impacting ticket sales at the movies. Over the last couple of weeks, a counterargument has been made based on a study posted in a September 10th blog. The Wrap, Variety, and other websites reporting on the movie industry have run with the story that Rotten Tomatoes has little, if any, impact on movie ticket sales. I believe that is an oversimplification of the study and of the intersection of movie ratings and movie consumption.

The points made in the study that are getting the most attention are:

  1. There is very little statistical correlation between Rotten Tomatoes ratings and box office performance.
  2. The median Rotten Tomatoes rating for 2017 is 77.5% Fresh, whereas the medians for each of the prior four years were either 72% or 73% Fresh.
  3. There is a correlation between Rotten Tomatoes ratings and Audience ratings.

So, the argument goes, you can’t blame Rotten Tomatoes for bad box office when it is statistically proven that it has no impact on box office and, by the way, critics have actually rated this year’s movies higher than last year’s, and audiences stay away from bad movies because they are more savvy today than they’ve been in the past.

I believe the third point should be the headline. When I’ve looked at this before, I’ve found a very strong correlation between the Certified Fresh, Fresh, and Rotten designations and my “really like” ratings. On the other hand, I’ve found that the % Fresh rating has a weaker correlation to whether I’ll “really like” a movie. I wonder what the statistical correlation to box office performance is for just the three broad designations.

As to the second point, the overlooked item in the study is that not only have critics in the aggregate liked 2017 movies better than those of prior years, the worldwide box office has responded with higher ticket sales in 2017 than in 2016. Is it possible that better movies in 2017 have translated into more people worldwide going to the movies?

The first point, and the one that became the headline in so many articles, doesn’t make a lot of sense to me. If there is a correlation between Rotten Tomatoes ratings and audience ratings, doesn’t that suggest that Rotten Tomatoes has contributed to a more informed movie public? And, because they are more informed, audiences are staying away from bad movies. Therefore, Rotten Tomatoes has impacted the box office. Calling that no impact because the impact is indirect rather than direct is a little misleading. Isn’t it?

Near the end of his study presentation, Yves Bergquist, the author of the study, concludes that “Audiences are becoming extremely adept at predicting and judging the quality of a film”. Rotten Tomatoes is just one of the tools audiences are using to pre-screen the movies they watch. IMDB ratings are taken into account, as are CinemaScore grades. For example, Box Office Mojo, the go-to site for movie box office information, specifically cited the “F” grade that CinemaScore gave to Mother! last weekend as a factor in its “supremely disappointing $7.5 million from 2,368 locations” opening weekend box office. CinemaScore has only given out nineteen F’s in almost forty years of movie surveys.

The movie industry may be looking for someone to blame for movie consumers behaving differently than they have in the past. But, the sooner the industry comes to grips with the new reality that movie audiences are more savvy today than they were in the past, the sooner they will improve their own fortunes. It is arrogant to blame Rotten Tomatoes for contributing to a more informed movie audience.

***

It has been seven weeks since a new movie, Detroit, joined the Objective Top Fifteen after its opening weekend. There is a chance that streak will be broken this weekend. Assuming CinemaScore surveys the movie, I think it’s likely that the Boston Marathon bombing bio-pic Stronger will join the list. I have hopes that Battle of the Sexes will sneak in as well. Check out my update on Monday to see how good my instincts were.

 

Before You See Mother! This Weekend, You Might Read This Article

As you might expect, I’m a big fan of Nate Silver’s FiveThirtyEight website. Last Thursday they published an interesting article on the impact of polarizing movies on IMDB ratings, using Al Gore’s An Inconvenient Sequel: Truth to Power as an example. This is not the first instance of this happening and it won’t be the last.

When the new Ghostbusters movie with the all-female cast came out in July 2016, there was a similar attempt to tank the IMDB ratings for that movie. That attempt was by men who resented the all-female cast. At the time, I posted this article. Has a year of new ratings done anything to smooth out the initial polarizing impact of the attempt to tank the ratings? Fortunately, IMDB has a nice little feature that allows you to look at the demographic distribution behind a movie’s rating. If you access IMDB on its website, clicking the number of votes that a rating is based on will take you to the demographics behind the rating.

Before looking at the distribution for Ghostbusters, let’s look at a movie that wasn’t polarizing. The 2016 movie Sully is such a movie according to the following demographics:

Votes Average
Males  99301  7.4
Females  19115  7.6
Aged under 18  675  7.7
Males under 18  566  7.6
Females under 18  102  7.8
Aged 18-29  50050  7.5
Males Aged 18-29  40830  7.5
Females Aged 18-29  8718  7.6
Aged 30-44  47382  7.4
Males Aged 30-44  40321  7.4
Females Aged 30-44  6386  7.5
Aged 45+  12087  7.5
Males Aged 45+  9871  7.5
Females Aged 45+  1995  7.8
IMDb staff  17  7.7
Top 1000 voters  437  7.2
US users  17390  7.5
Non-US users  68746  7.4

There is very little difference in the average rating (the number to the far right) among all of the groups. When you have a movie that is not polarizing, like Sully, the distribution by rating should look something like this:

Votes  Percentage  Rating
12465  8.1% 10
19080  12.4% 9
52164  33.9% 8
47887  31.1% 7
15409  10.0% 6
4296  2.8% 5
1267  0.8% 4
589  0.4% 3
334  0.2% 2
576  0.4% 1

It approximates a bell curve, with most ratings clustering around the movie’s average.

Here’s what the demographic breakdown for Ghostbusters looks like today:

Votes Average
Males  87119  5.0
Females  27237  6.7
Aged under 18  671  5.3
Males under 18  479  4.9
Females under 18  185  6.6
Aged 18-29  36898  5.4
Males Aged 18-29  25659  5.0
Females Aged 18-29  10771  6.7
Aged 30-44  54294  5.2
Males Aged 30-44  43516  5.0
Females Aged 30-44  9954  6.6
Aged 45+  11422  5.3
Males Aged 45+  9087  5.1
Females Aged 45+  2130  6.3
IMDb staff  45  7.4
Top 1000 voters  482  4.9
US users  25462  5.5
Non-US users  54869  5.2

There is still a big gap in the ratings between men and women, and it persists in all age groups. This polarizing effect produces a ratings distribution very different from the one for Sully.

Votes  Percentage  Rating
20038  12.8% 10
6352  4.1% 9
13504  8.6% 8
20957  13.4% 7
24206  15.5% 6
18686  12.0% 5
10868  7.0% 4
7547  4.8% 3
6665  4.3% 2
27501  17.6% 1

It looks like a bell curve sitting inside a football goal post. But the distribution is still useful: it suggests that, when you exclude the 1’s and the 10’s, the average IMDB rating for the movie is closer to 6 than to its posted 5.3.
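
That adjustment is easy to reproduce from the distribution table above. Note that a simple mean of the raw distribution won’t exactly match IMDB’s posted 5.3, which is a weighted average.

```python
# Recompute the Ghostbusters average from the distribution above,
# excluding the 1-star and 10-star votes.
votes = {10: 20038, 9: 6352, 8: 13504, 7: 20957, 6: 24206,
         5: 18686, 4: 10868, 3: 7547, 2: 6665, 1: 27501}

trimmed = {r: n for r, n in votes.items() if r not in (1, 10)}
avg = sum(r * n for r, n in trimmed.items()) / sum(trimmed.values())
print(round(avg, 1))  # about 5.8 with these figures, versus the posted 5.3
```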

You are probably thinking: while interesting, is this information useful? Does it help me decide whether or not to watch a movie? Well, here’s the payoff. The big movie opening this weekend that the industry will be watching closely is Mother!. The buzz coming out of the film festivals is that it is a brilliant but polarizing movie. All four of the main actors (Jennifer Lawrence, Javier Bardem, Michelle Pfeiffer, Ed Harris) are in the discussion for acting awards. I haven’t seen the movie, but I don’t sense that it is politically polarizing like An Inconvenient Sequel and Ghostbusters. I think it probably strikes the sensibilities of different demographics in different ways.

So, should you go see Mother! this weekend? Fortunately, its early screenings at the film festivals give us an early peek at the data trends. The IMDB demographics so far are revealing. First, the rating distribution has the goal post shape, confirming that the film is polarizing moviegoers.

Votes  Percentage  Rating
486  36.0% 10
108  8.0% 9
112  8.3% 8
92  6.8% 7
77  5.7% 6
44  3.3% 5
49  3.6% 4
40  3.0% 3
52  3.8% 2
291  21.5% 1
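
A quick way to quantify that goal-post shape is the share of votes at the two extremes; a small sketch using the table above:

```python
# Share of Mother! votes at the extremes (1 or 10), from the table above.
votes = {10: 486, 9: 108, 8: 112, 7: 92, 6: 77,
         5: 44, 4: 49, 3: 40, 2: 52, 1: 291}

extreme_share = (votes[1] + votes[10]) / sum(votes.values())
print(f"{extreme_share:.1%}")  # 57.5% of voters rated it a 1 or a 10
```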

57.5% of IMDB voters have rated it either a 10 or a 1. So are you likely to love it or hate it? Here’s what the demographics suggest:

Votes Average
Males  717  6.1
Females  242  5.4
Aged under 18  25  8.4
Males under 18  18  8.2
Females under 18  6  10.0
Aged 18-29  404  7.3
Males Aged 18-29  305  7.5
Females Aged 18-29  98  6.1
Aged 30-44  288  5.0
Males Aged 30-44  215  5.0
Females Aged 30-44  69  5.2
Aged 45+  152  4.3
Males Aged 45+  111  4.3
Females Aged 45+  40  4.1
Top 1000 voters  48  4.6
US users  273  4.4
Non-US users  438  6.5

While men overall like the movie more than women do, among voters over 30 both men and women dislike it almost equally. There is also a 2-point gap between U.S. and non-U.S. voters. This is a small sample, but it has a distinct trend. I’ll be interested to see if the trends hold up as the sample grows.

So, be forewarned. If you take your entire family to see Mother! this weekend, some of you will probably love the trip and some of you will probably wish you stayed home.

 
