Will I "Really Like" this Movie?

Navigating Movie Website Ratings to Select More Enjoyable Movies


I’m Stating the Obvious But You Will Probably “Really Like” Oscar Nominated Movies.

You are more likely to “really like” a movie that has received an Oscar nomination than one that hasn’t. Now, there’s a bold statement. But while most people would intuitively agree with the statement, I have statistical data to support it.

As followers of this blog are aware, I'm building a database of objective movie ratings data from the past 25 years. Last week I added a fifth year of data. With each year I add, I can pose questions that are easier to test statistically, such as: do Oscar nominations have "really like" statistical significance? I even take it a step further by exploring whether there are differences between major nominations and minor ones.

Major nominations are the commonly accepted major awards: Best Picture, Director, Actor, Actress, and Screenplay. Minor nominations cover all of the other categories presented on Oscar night. They don't include the special technical awards presented in a separate ceremony.

Here are the results for the years 1992 to 1996. The movies are grouped by whether they received at least one major and/or minor nomination. The table shows the percentage of IMDB voters who gave the movies in each group a rating of 7 or higher.

Movies with:                % Rating 7+
Major & Minor Nominations   90.5%
Major Nominations Only      84.6%
Minor Nominations Only      74.7%
No Nominations              61.4%
All Sample Movies           73.0%

Major nominations have a clear statistical advantage over minor nominations. The size of the gap between movies with only minor nominations and those with no nominations might be surprising. My gut tells me that this gap will narrow as more years are added, especially more recent years. But, it is interesting nonetheless. It suggests that members of the Academy of Motion Picture Arts and Sciences (AMPAS) understand their craft, and that their expertise does a great job of identifying the "really like" movies released in a given year.
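
For readers curious about the mechanics, here's a minimal sketch of how these group percentages can be computed. The record fields are hypothetical, and vote-weighting the group percentage (rather than averaging per movie) is my assumption:

```python
# Rough sketch: bucket movies by nomination type, then compute each group's
# share of IMDB votes at 7 or higher. Field names are hypothetical.
movies = [
    # e.g. {"major_noms": 2, "minor_noms": 3, "votes_7_plus": 61000, "votes_total": 70000},
]

def group_of(movie):
    has_major = movie["major_noms"] > 0
    has_minor = movie["minor_noms"] > 0
    if has_major and has_minor:
        return "Major & Minor Nominations"
    if has_major:
        return "Major Nominations Only"
    if has_minor:
        return "Minor Nominations Only"
    return "No Nominations"

totals = {}
for m in movies:
    sevens, total = totals.get(group_of(m), (0, 0))
    totals[group_of(m)] = (sevens + m["votes_7_plus"], total + m["votes_total"])

for group, (sevens, total) in totals.items():
    print(f"{group}: {sevens / total:.1%}")
```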

There are more questions to answer regarding Oscar performance as a "really like" indicator. What is the predictive value of an Oscar win? Does predictive value increase with the number of nominations a movie receives? Does a Best Picture nomination have more predictive value than a nomination in any other category? All of these questions and more will have to wait for more data.

One question we have answered is why all of the movies at the top of the Objective Top Twenty are Oscar-nominated movies from last year's voting. The other takeaway is that the movies on the list that didn't go through last year's nominating process probably won't stay on the list unless their names are called on January 23, 2018, when this year's Oscar nominations are announced.

***

It might be a light weekend for new Objective Top Twenty contenders. I'm keeping my eye on Only the Brave, which chronicles the story of the Granite Mountain Hotshots, one of the elite firefighting units in the USA. As of this morning, it is 89% Fresh on Rotten Tomatoes and has a 7.3 on IMDB.

Will “You” Really Like This Movie?

If you reviewed this week’s Objective Top Twenty, you might have noticed something other than five additional movies on the list. You might have noticed that, other than Hidden Figures holding onto the number one spot on the list, all of the rankings had changed.

A few months back, I mentioned that I was developing a new objective database to project "really like" movies, one not influenced at all by my taste in movies. This week's Objective Top Twenty reflects the early fruits of that labor.

The plan is to build a very robust database of all of the movies from the last twenty-five years that finished in the top 150 in box office sales for each year. I have 1992 through 1995 completed, which gives me enough movies to get started.

The key change in the "really like" formula is that my algorithm now measures the probability that users of the IMDB database will rate a particular movie a 7 out of 10 or higher, which is my definition of a "really like" movie. The key components of the formula are the IMDB average rating, the Rotten Tomatoes rating, the Cinemascore grade, and the number of Academy Award wins and nominations in both the major and the minor categories.

In future posts, I'll flesh out my logic for all of these factors. But, the key factor is the ability to measure on IMDB the percentage of voters who have rated a particular movie a 7 or higher. When you aggregate all of the movies with a particular IMDB average rating, you get results that look like this sample:

Avg. Rating   % Rating 7+
8.5           92.8%
8.0           88.8%
7.5           81.4%
7.0           69.2%
6.5           54.7%
6.0           41.5%
5.5           28.7%

Note that a movie with an average rating of 7.0 isn't automatically a "really like" movie. Only 69.2% of the votes cast for movies with a 7.0 rating were ratings of 7 or higher. Conversely, a movie with an average rating of 6.0 isn't always a "don't really like" movie, since 41.5% of its voters handed out 7's or higher. It does mean, though, that a movie with a 7.0 average rating is more likely to be a "really like" movie than one with a 6.0 rating.
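
For the technically curious, here's a rough sketch of how the sample table above can be turned into a "really like" probability for any average rating. The linear interpolation between rows is my assumption, not a stated part of the formula:

```python
# Sketch: estimate the probability that an IMDB voter rates a movie 7+,
# given the movie's average rating, by interpolating over the sample above.
SAMPLE = [(5.5, 0.287), (6.0, 0.415), (6.5, 0.547),
          (7.0, 0.692), (7.5, 0.814), (8.0, 0.888), (8.5, 0.928)]

def pct_rating_7_plus(avg_rating):
    if avg_rating <= SAMPLE[0][0]:
        return SAMPLE[0][1]
    if avg_rating >= SAMPLE[-1][0]:
        return SAMPLE[-1][1]
    # Linear interpolation between the two nearest table rows.
    for (x0, y0), (x1, y1) in zip(SAMPLE, SAMPLE[1:]):
        if x0 <= avg_rating <= x1:
            return y0 + (y1 - y0) * (avg_rating - x0) / (x1 - x0)

print(f"{pct_rating_7_plus(6.8):.1%}")  # falls between the 6.5 and 7.0 rows
```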

These changes represent a start down a path towards a movie pre-screening tool that is more useful to the followers of this blog. It is a work in progress that will only get better as more years are added to the database. But, we have a better answer now to the question, “Will you ‘really like’ this movie?”

***

If you're going to the movies this weekend, chances are you're going to see Blade Runner 2049. The early indications are that it is going to live up to the hype. You might also check out The Florida Project, an under-the-radar movie that is getting some apparently well-deserved buzz.

So Now Rotten Tomatoes Has No Impact On the Box Office? Not So Fast.

There has been a conventional wisdom evolving that Rotten Tomatoes movie ratings are negatively impacting ticket sales at the movies. Over the last couple of weeks, a counter-argument has been made based on a study posted in a September 10th blog post. The Wrap, Variety, and other websites covering the movie industry have run with the story that Rotten Tomatoes has little, if any, impact on movie ticket sales. I believe that is an oversimplification of the study and of the intersection of movie ratings and movie consumption.

The points made in the study that are getting the most attention are:

  1. There is very little statistical correlation between Rotten Tomatoes ratings and box office performance.
  2. The median Rotten Tomatoes rating for 2017 is 77.5% Fresh, whereas the ratings for each of the prior four years was either 72% or 73% Fresh.
  3. There is a correlation between Rotten Tomatoes ratings and Audience ratings.

So, the argument goes, you can't blame Rotten Tomatoes for bad box office when it is statistically proven that it has no impact on box office; and, by the way, critics have actually rated this year's movies higher than last year's; and audiences stay away from bad movies because they are more savvy today than they've been in the past.

I believe the third point should be the headline. When I've looked at this before, I've found a very strong correlation between the Certified Fresh, Fresh, and Rotten ratings and my "really like" ratings. On the other hand, I've found that the percentage-fresh rating has a weaker correlation to whether I'll "really like" a movie. I wonder what the statistical correlation to box office performance is for just the three broad ratings?

As to the second point, the overlooked item in the study is that not only have critics in the aggregate liked 2017 movies better than those of prior years, the worldwide box office has responded with higher ticket sales in 2017 than in 2016. Is it possible that better movies in 2017 have translated into more people worldwide going to the movies?

The first point, and the one that became the headline in so many articles, doesn't make a lot of sense to me. If there is a correlation between Rotten Tomatoes ratings and Audience ratings, doesn't that suggest that Rotten Tomatoes has contributed to a more informed movie public? And, because they are more informed, audiences are staying away from bad movies. Therefore, Rotten Tomatoes has impacted the box office. Dismissing that impact because it is indirect rather than direct is a little misleading. Isn't it?

Near the end of his study presentation, Yves Bergquist, the author of the study, concludes that "Audiences are becoming extremely adept at predicting and judging the quality of a film". Rotten Tomatoes is just one of the tools audiences use to pre-screen the movies they watch. IMDB ratings are taken into account, as are Cinemascore grades. For example, Box Office Mojo, the go-to site for movie box office information, specifically cited the "F" grade that Cinemascore gave to Mother! last weekend as a factor in its "supremely disappointing $7.5 million from 2,368 locations" opening weekend box office. Cinemascore has handed out only nineteen F's in almost forty years of movie surveys.

The movie industry may be looking for someone to blame for movie consumers behaving differently than they have in the past. But, the sooner the industry comes to grips with the new reality that movie audiences are more savvy than they used to be, the sooner it will improve its own fortunes. It is arrogant to blame Rotten Tomatoes for contributing to a more informed movie audience.

***

It has been seven weeks since a new movie, Detroit, joined the Objective Top Fifteen after its opening weekend. There is a chance that streak might be broken this weekend. Assuming Cinemascore surveys the movie, I think it's likely that the Boston Marathon bombing biopic Stronger will join the list. I have hopes that Battle of the Sexes will sneak in as well. Check out my update on Monday to see how good my instincts were.


Before You See Mother! This Weekend, You Might Read This Article

As you might expect, I’m a big fan of Nate Silver’s FiveThirtyEight website. Last Thursday they published an interesting article on the impact of polarizing movies on IMDB ratings, using Al Gore’s An Inconvenient Sequel: Truth to Power as an example. This is not the first instance of this happening and it won’t be the last.

When the new Ghostbusters movie with the all-female cast came out in July 2016, there was a similar attempt to tank the IMDB ratings for that movie, this time by men who resented the all-female cast. At the time, I posted this article. Has a year of new ratings done anything to smooth out the initial polarizing impact of the attempt to tank the ratings? Fortunately, IMDB has a nice little feature that lets you look at the demographic distribution behind a movie's rating. If you access IMDB on its website, clicking the number of votes that a rating is based on will take you to the demographics behind the rating.

Before looking at the distribution for Ghostbusters, let's look at a movie that wasn't polarizing. The 2016 movie Sully fits the bill, according to the following demographics:

Group                Votes   Average
Males                99301   7.4
Females              19115   7.6
Aged under 18          675   7.7
Males under 18         566   7.6
Females under 18       102   7.8
Aged 18-29           50050   7.5
Males Aged 18-29     40830   7.5
Females Aged 18-29    8718   7.6
Aged 30-44           47382   7.4
Males Aged 30-44     40321   7.4
Females Aged 30-44    6386   7.5
Aged 45+             12087   7.5
Males Aged 45+        9871   7.5
Females Aged 45+      1995   7.8
IMDb staff              17   7.7
Top 1000 voters        437   7.2
US users             17390   7.5
Non-US users         68746   7.4

There is very little difference in the average rating (the right-hand column) among the groups. When a movie is not polarizing, like Sully, its distribution by rating should look something like this:

Votes   Percentage   Rating
12465    8.1%        10
19080   12.4%         9
52164   33.9%         8
47887   31.1%         7
15409   10.0%         6
 4296    2.8%         5
 1267    0.8%         4
  589    0.4%         3
  334    0.2%         2
  576    0.4%         1

It approximates a bell curve, with most ratings clustered around the movie's average.

Here’s what the demographic breakdown for Ghostbusters looks like today:

Group                Votes   Average
Males                87119   5.0
Females              27237   6.7
Aged under 18          671   5.3
Males under 18         479   4.9
Females under 18       185   6.6
Aged 18-29           36898   5.4
Males Aged 18-29     25659   5.0
Females Aged 18-29   10771   6.7
Aged 30-44           54294   5.2
Males Aged 30-44     43516   5.0
Females Aged 30-44    9954   6.6
Aged 45+             11422   5.3
Males Aged 45+        9087   5.1
Females Aged 45+      2130   6.3
IMDb staff              45   7.4
Top 1000 voters        482   4.9
US users             25462   5.5
Non-US users         54869   5.2

There is still a big gap in the ratings between men and women and it persists in all age groups. This polarizing effect produces a ratings distribution graph very different from the one for Sully.

Votes   Percentage   Rating
20038   12.8%        10
 6352    4.1%         9
13504    8.6%         8
20957   13.4%         7
24206   15.5%         6
18686   12.0%         5
10868    7.0%         4
 7547    4.8%         3
 6665    4.3%         2
27501   17.6%         1

It looks like a bell curve sitting inside a football goal post. But it is still useful: it suggests that, when you exclude the 1's and the 10's, the average IMDB rating for the movie is around 6 rather than 5.3.
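
Here's a small sketch of that calculation, using the Ghostbusters histogram above. Both measures, the share of 1's and 10's as a polarization signal and the average with those extremes trimmed, are simple illustrations of my own rather than anything IMDB publishes:

```python
# Sketch: measure polarization and a trimmed average from an IMDB-style
# rating histogram (rating -> vote count), using the Ghostbusters numbers.
ghostbusters = {10: 20038, 9: 6352, 8: 13504, 7: 20957, 6: 24206,
                5: 18686, 4: 10868, 3: 7547, 2: 6665, 1: 27501}

def polarization_share(hist):
    """Share of all votes that are 1s or 10s -- the 'goal post' signal."""
    return (hist.get(1, 0) + hist.get(10, 0)) / sum(hist.values())

def trimmed_average(hist):
    """Average rating after dropping the 1s and 10s."""
    middle = {r: n for r, n in hist.items() if r not in (1, 10)}
    return sum(r * n for r, n in middle.items()) / sum(middle.values())

print(f"1s and 10s: {polarization_share(ghostbusters):.1%}")    # ~30.4%
print(f"trimmed average: {trimmed_average(ghostbusters):.1f}")  # ~5.8 vs 5.3 overall
```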

You are probably thinking that, while interesting, this information isn't obviously useful. Does it help you decide whether to watch a movie or not? Well, here's the payoff. The big movie opening this weekend that the industry will be watching closely is Mother!. The buzz coming out of the film festivals is that it is a brilliant but polarizing movie. All four of the main actors (Jennifer Lawrence, Javier Bardem, Michelle Pfeiffer, Ed Harris) are in the discussion for acting awards. I haven't seen the movie, but I don't sense that it is politically polarizing like An Inconvenient Sequel and Ghostbusters. I think it probably strikes the sensibilities of different demographics in different ways.

So, should you go see Mother! this weekend? Fortunately, its early screenings at the film festivals give us an early peek at the data trends. The IMDB demographics so far are revealing. First, the rating distribution has the goal-post shape, confirming that the film is polarizing moviegoers.

Votes   Percentage   Rating
  486   36.0%        10
  108    8.0%         9
  112    8.3%         8
   92    6.8%         7
   77    5.7%         6
   44    3.3%         5
   49    3.6%         4
   40    3.0%         3
   52    3.8%         2
  291   21.5%         1

57.5% of IMDB voters have rated it either a 10 or a 1. So are you likely to love it or hate it? Here’s what the demographics suggest:

Group                Votes   Average
Males                  717   6.1
Females                242   5.4
Aged under 18           25   8.4
Males under 18          18   8.2
Females under 18         6   10.0
Aged 18-29             404   7.3
Males Aged 18-29       305   7.5
Females Aged 18-29      98   6.1
Aged 30-44             288   5.0
Males Aged 30-44       215   5.0
Females Aged 30-44      69   5.2
Aged 45+               152   4.3
Males Aged 45+         111   4.3
Females Aged 45+        40   4.1
Top 1000 voters         48   4.6
US users               273   4.4
Non-US users           438   6.5

While men like the movie more than women overall, voters over 30, men and women alike, hate the movie almost equally. There is also a two-point gap between U.S. and non-U.S. voters. This is a small sample, but the trend is distinct. I'll be interested to see if it holds up as the sample grows.

So, be forewarned. If you take your entire family to see Mother! this weekend, some of you will probably love the trip and some of you will probably wish you had stayed home.


When Art Mirrors Reality: American History X and the Events in Charlottesville

At the end of July, I went through my monthly ritual of identifying movies I had watched 15 years ago and moving them onto my list of potential movies to watch now. One of these recycled movies, American History X, immediately moved to the top of my Watch List. Because it wasn't available on any of the platforms I subscribe to, I added it to the top of my Netflix DVD queue. It was happenstance that I watched the DVD yesterday, a few days after the events in Charlottesville.

My experience has been that, when these movies come up for a second viewing fifteen years later, I have a few common recollections. I have a general memory of what the movie is about. I have very little memory of the details. And, most importantly, I have a distinct memory of whether I "really liked" the movie, even if everything else about it is indistinct. If I remember "loving" a movie, I know that I am about to re-experience the highs of being a movie lover even if I can't remember why.

I have no memory of American History X from when it was first released. It was only a few years later that my exploration of IMDB surfaced this highly rated movie about a topic that repulsed me: the neo-Nazi movement in California. It took a little time, but I finally overcame my reluctance and watched it in 2002, four years after its release. I remember being surprised at how good a movie it was.

The movie is told as two stories. One is the 24-hour period after Derek Vinyard, played by Edward Norton, is released from prison after serving three years for the voluntary manslaughter of two black men who were attempting to steal his car. His prison experience leads him to rethink the path he followed, and he is determined to dissuade his younger brother, Danny, from following the same path.

Danny tells the second story. At the beginning of the movie, a teacher who is trying to get through to Danny gives him an assignment to write a history of his brother, called American History X. This second story is a flashback, filmed in black and white, of Derek's evolution from inquisitive high-schooler to neo-Nazi leader to his disillusionment with the movement.

I watched it yesterday with a heightened sense of its relevance. I listened to the rhetoric spewed by Derek and was amazed how closely it mirrored the rhetoric we hear daily. I noted how the two main characters in the movie were well educated, just as many of the neo-Nazi marchers at Charlottesville were young, college-educated males. The movie portrays the recruitment of young men who have been preyed upon or feel vulnerable, with the pitch that their problems are caused by "those people" rather than by their own inability to cope with the lemons that life has tossed their way.

One scene in the film is particularly poignant. In a flashback, a high-school-aged Derek is having breakfast with the father he idolizes. Derek is expressing his excitement about a class that is exposing him to the cultural experiences of other races. His father, a fireman and an otherwise decent man, shuts him down and proceeds to indoctrinate him in his racist "reality". I immediately thought of Barack Obama's viral tweet of the words of Nelson Mandela: "No one is born hating another person because of the color of his skin, or his background, or his religion. People must learn to hate…"

At the end of the movie, the younger brother, Danny, narrates the end of his American History X paper with the following words:

“So I guess this is where I tell you what I learned – my conclusion, right? Well, my conclusion is: Hate is baggage. Life’s too short to be pissed off all the time. It’s just not worth it. Derek says it’s always good to end a paper with a quote. He says someone else has already said it best. So if you can’t top it, steal from them and go out strong. So I picked a guy I thought you’d like. ‘We are not enemies, but friends. We must not be enemies. Though passion may have strained, it must not break our bonds of affection. The mystic chords of memory will swell when again touched, as surely they will be, by the better angels of our nature.’ “

Danny is quoting here from Abraham Lincoln’s First Inaugural Address. We can only hope that the hardened shells of our hatred can be penetrated “by the better angels of our nature”.

What IMDB Ratings Give You the Best Chance for a “Really Like” Movie?

As I was browsing the IMDB ratings for the movies released in July, I wondered how the average user of IMDB knows what a good rating for a movie is. I'm sure the more-than-casual visitor to IMDB would see the 8.2 rating for Baby Driver and immediately recognize that only above-average movies receive ratings that high. Or, they might see the 1.5 rating for The Emoji Movie and fully understand that this is a really bad movie. But, what about the 6.8 for Valerian and the City of a Thousand Planets, or the 7.2 for Atomic Blonde? They might have a number in their head as the tipping point between a good and a bad rating, but that number could only be a guess. To really know, you'd have to compile a list of all the movies you've seen and compare their IMDB ratings to how you rated them. That would be crazy. Right? But, wait a minute. I'm that crazy! I've done that! Well, maybe not for every movie I've ever seen. But, for every movie I've seen in the last fifteen years.

So, given that I've done what only a crazy man would do, what can I tell you about what makes a good IMDB rating? Here's my breakdown:

IMDB Avg. Rating   # I Really Liked   # I Didn't Really Like   Really Like %
8.2 or higher            108                  43                  71.5%
7.2 to 8.1               732                 427                  63.2%
6.2 to 7.1               303                 328                  48.0%
6.1 or lower               6                  71                   7.8%
7.2 or higher            840                 470                  64.1%
7.1 or lower             309                 399                  43.6%
All                     1149                 869                  56.9%

The data suggests that IMDB ratings of 7.2 or higher give me the best chance of choosing a “really like” movie.
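
If you want to try this on your own movie log, here's a minimal sketch of the bucketing. The record format is hypothetical, and treating each bucket's lower bound as inclusive is my assumption:

```python
# Sketch: reproduce the bucket table from a personal movie log. Each record
# pairs a movie's IMDB average rating with whether I "really liked" it.
seen = [
    # (imdb_avg_rating, really_liked), e.g. (7.8, True), (6.4, False), ...
]

BUCKETS = [("8.2 or higher", 8.2, 10.1), ("7.2 to 8.1", 7.2, 8.2),
           ("6.2 to 7.1", 6.2, 7.2), ("6.1 or lower", 0.0, 6.2)]

for label, lo, hi in BUCKETS:
    liked_flags = [liked for avg, liked in seen if lo <= avg < hi]
    if liked_flags:
        # True counts as 1, so the mean of the flags is the "really like" rate.
        print(f"{label}: {sum(liked_flags) / len(liked_flags):.1%} really liked")
```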

I mentioned a few posts ago that my new long-range project is to develop a database that is totally objective, free from the biases of my movie tastes. I'm compiling data for the top 150 movies in box office receipts for each of the last 25 years. It's a time-consuming project that should produce a more robust sample for analysis. One of my concerns has been that the database of movies I've seen doesn't have a representative sample of bad movies. While the project is a long way from completion, I have completed the years 1992 and 1993, which are representative enough to make my point.

IMDB Avg. Rating   % of Objective Database (1992 & 1993)   % of My Seen-Movie Database
8.2 or higher                       1%                                  7%
7.2 to 8.1                         23%                                 57%
6.2 to 7.1                         35%                                 31%
6.1 or lower                       41%                                  4%

Over the last six or seven years in particular, I have made a concerted effort to avoid watching bad movies. You can see this in the data. If 7.2 is the "really like" benchmark, then only 24% of the top 150 movies at the box office in a typical year are "really like" movies. On the other hand, my selective database has generated 64% "really like" movies over the past 15 years. That is a big difference.

***

While no newly released movies broke into the Objective Top Fifteen this week, Megan Leavey, which was released around eight weeks ago, slipped onto the list. This under-the-radar movie didn't have enough critics' reviews to be Certified Fresh on Rotten Tomatoes until recently.

As for this weekend, The Dark Tower could be a disappointment to everyone but the most die-hard of Stephen King fans. Instead, I’m keeping an eye on Detroit. This urban drama, directed by Kathryn Bigelow, captures the chaos of Detroit in 1967. It probably will be surveyed by Cinemascore.

A third movie, one that probably won't be surveyed by Cinemascore but that I'm watching nevertheless, is Wind River. It was written by Taylor Sheridan, who wrote the acclaimed movies Hell or High Water and Sicario. Sheridan is a great young talent, and he steps behind the camera here in his directorial debut as well.

Why Did “The Big Sick” Drop Out of the Objective Top Fifteen This Week?

This past Sunday my wife, Pam, and I went to see The Big Sick. The movie tells the story of the early relationship days of the two screenwriters, Emily Gordon and Kumail Nanjiani. In fact, Nanjiani plays himself in the movie. It is the authenticity of the story, told in a heartfelt and humorous way, that makes this film special.

On the following day, last weekend's blockbuster, Dunkirk, moved into the second spot in the revised Objective Top Fifteen rankings. When a new movie comes onto the list, another one exits. This week's exiting movie, ironically, was The Big Sick. Wait! If The Big Sick is such a great movie, why isn't it in my top fifteen for the year? Are all of the other movies on the list better movies? Maybe yes. Maybe no. You'll have to determine that for yourselves. You see, the Objective Top Fifteen is your list, not mine.

I developed the Objective Top Ten, which became Fifteen at the beginning of July and will become Twenty at the beginning of October, to provide you with a ranking of 2017 widely released movies that are most likely to be "really like" movies. Because the ranking is based on objective benchmarks, my taste in movies has no influence on the list. The four benchmarks presently in use are: IMDB average rating, Rotten Tomatoes rating, Cinemascore grade, and Academy Award nominations and wins. A movie like Hidden Figures that meets all four benchmarks has the greatest statistical confidence in its "really like" status and earns the highest "really like" probability. A movie that meets three benchmarks has a greater "really like" probability than a movie that meets only two. And so on.

The important thing to note, though, is that this is not a list of the fifteen best movies of the year. It is a ranking of probabilities (with some tie breakers thrown in) that you'll "really like" a movie. It is subject to data availability: the more positive data that's available, the more statistical confidence, i.e. a higher probability, the model has in the projection.
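
For the curious, here's one way the ranking logic could look in code. The thresholds, the Cinemascore cutoff, and the IMDB-average tie breaker are illustrative assumptions; the actual probability model behind the list isn't spelled out here:

```python
# Sketch of the ranking idea: each objective benchmark a movie meets adds
# statistical confidence, and ties are broken by the underlying scores.
GRADES = ["F", "D-", "D", "D+", "C-", "C", "C+", "B-", "B", "B+", "A-", "A", "A+"]

def benchmarks_met(movie):
    """Count how many of the four objective benchmarks a movie meets.
    Thresholds are illustrative, not the blog's exact cutoffs."""
    met = 0
    met += movie.get("imdb_avg", 0.0) >= 7.2                  # IMDB benchmark
    met += movie.get("rt_rating") == "Certified Fresh"        # Rotten Tomatoes
    cs = movie.get("cinemascore")
    met += cs in GRADES and GRADES.index(cs) >= GRADES.index("A-")  # Cinemascore
    met += movie.get("oscar_noms", 0) > 0                     # Academy Awards
    return met

def rank_key(movie):
    # More benchmarks met ranks higher; IMDB average as an illustrative tie breaker.
    return (benchmarks_met(movie), movie.get("imdb_avg", 0.0))

hidden_figures = {"imdb_avg": 7.8, "rt_rating": "Certified Fresh",
                  "cinemascore": "A+", "oscar_noms": 3}
print(benchmarks_met(hidden_figures))  # 4 -- meets all four benchmarks
```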

Which brings me back to The Big Sick. Cinemascore surveys the movies it considers "major releases". The Big Sick probably didn't have a big advertising budget. Instead, the producers chose to roll the movie out gradually, beginning on June 23rd, to create some buzz and momentum before putting it into wide release on July 14th. This is probably one of the reasons Cinemascore didn't survey The Big Sick. But, because The Big Sick is missing the third benchmark needed to earn a higher probability, it dropped out of the Top Fifteen. On the other hand, if it had earned at least an "A-" from Cinemascore, The Big Sick would be the #2 movie on the list based on the tie breakers.

And, that is the weakness, and the strength, of movie data. "Major releases" have it. Smaller movies like The Big Sick don't.

***

This weekend may be the end of the four-week run of Objective Top Fifteen breakthroughs. Atomic Blonde, the Charlize Theron spy thriller, has an outside chance of earning a spot on the list. As of this morning, it is borderline on the IMDB and Rotten Tomatoes benchmarks. I'm also tracking Girls Trip, which earned a Certified Fresh from Rotten Tomatoes just in the last couple of days and has an "A+" in hand from Cinemascore. For now, it is just below the IMDB benchmark. We'll see if that changes over the weekend.

Vacation, My 100th Post, and a July “Really Like” Movie Hot Streak

I arrived in the city of Seattle yesterday in the wee hours of the morning. I’m here to introduce myself to my new, beautiful granddaughter. So if there is a contemplative, or distracted, feel to this week’s post, there is good reason.

This is also my 100th post. Not quite as momentous as your first grandchild, but a marker worthy of reflection nevertheless. It has been a labor of love and a challenge. Blogging was new to me when I started out 99 posts ago. I discovered that you don't find your voice in the first post. Little by little, though, you develop a style that you and the readers of your blog become comfortable with. If you're lucky, enough people become engaged in your passion and come back for more. Thanks for your support, whether you're one of those loyal followers or you've just stopped by for an occasional "check and see". On to the next 100 posts, beginning with a look at what's caught my eye at the Cineplex this coming weekend.

Dunkirk, which goes into wide release tomorrow, is poised to become the fourth high-quality mega-hit in four weeks. As of this morning, it is 94% Certified Fresh on Rotten Tomatoes. And, the early overseas feedback on IMDB has produced an impressive 9.6 average rating. This Christopher Nolan depiction of the rescue of the surrounded British army early in World War II is being compared to the classic Saving Private Ryan. The Saving Private Ryan benchmarks to keep an eye on are Certified Fresh 92%, IMDB average rating 8.6, and Cinemascore "A". Pre-wide release, Dunkirk is exceeding the Rotten Tomatoes and IMDB scores. We'll have to wait until Saturday for Cinemascore results. I'm excited about this one.

In addition to off-schedule posts to this site, vacation for the Mad Movie Man invariably involves a trip to the movies. With an unusually high number of Certified Fresh movies in theaters, it is almost a can't-miss proposition. But, the absolute can't-miss feature of this vacation is the incredible miracle of my granddaughter, Addie Rose.

This Is Turning Into a “Really Like” Summer at the Movies.

In case you haven't noticed, we are in the midst of a pretty good run of high-quality movies this summer. Since the first weekend in May, which serves as the unofficial beginning of the summer movie season, at least ten movies have posted a 7.2 or higher IMDB average rating along with a Certified Fresh rating on Rotten Tomatoes.

May to July 2017 Wide Releases   IMDB Rating   RT Rating   RT % Fresh
Baby Driver                      8.4           C. Fresh    97%
Spider-Man: Homecoming           8.2           C. Fresh    93%
Wonder Woman                     8.0           C. Fresh    92%
Guardians of the Galaxy Vol. 2   8.1           C. Fresh    81%
The Big Sick                     8.0           C. Fresh    97%
I, Daniel Blake                  7.9           C. Fresh    92%
A Ghost Story                    7.5           C. Fresh    87%
Okja                             7.7           C. Fresh    84%
The Beguiled                     7.3           C. Fresh    77%
The Hero                         7.3           C. Fresh    76%

And if early indicators are accurate, War for the Planet of the Apes will join the list after this coming weekend. And, if the early buzz on social media holds up, Christopher Nolan’s new movie Dunkirk will join the list the following weekend.

This seems to me an unusually high number of quality movies for the summer so far, but I can't tell you how unusual…yet. I'm working on a new long-term project: a database solely made up of objective "really like" movie indicators. It will include all movies finishing in the top 150 in box office receipts for each of the last 25 years. This database will provide a better representation of the bad movies released each year, as well as a more robust sample size.

For now, I can only compare this year's quality to 1992 (the first of the 25 years in my new database). Because Rotten Tomatoes wasn't launched until 1998, I've included movies that aren't Certified Fresh but would otherwise be if they had enough critic reviews. Even with that allowance, only 3 movies released between May and July 1992 meet the quality criteria I'm using for this summer.

May to July 1992 Wide Releases   IMDB Rating   RT Rating   RT % Fresh
Night on Earth                   7.5           Fresh       73%
Enchanted April                  7.6           Fresh       83%
A League of Their Own            7.2           C. Fresh    78%

I'll also add that IMDB average ratings tend to decline over time. It is probable that a few of this year's movies will ultimately fall below the 7.2 IMDB minimum. But, with 7 of the 10 movies sitting at IMDB ratings of 7.7 or better, this year's list should hold up pretty well over time.

***

As I mentioned above, War for the Planet of the Apes opens tomorrow. It is easy to overlook how good this franchise has been. Here are the "really like" indicators for the franchise, including a very early look at tomorrow's entry.

Movie                                   IMDB Rating   RT Rating   RT % Fresh   Cinemascore
Rise of the Planet of the Apes (2011)   7.6           C. Fresh    81%          A-
Dawn of the Planet of the Apes (2014)   7.6           C. Fresh    90%          A-
War for the Planet of the Apes (2017)   9.1           C. Fresh    93%          ?

Franchises tend to get tired after the first movie. From the critics’ perspective, this franchise appears to get better with each new movie. I expect to see War for the Planet of the Apes on the Objective Top Fifteen list on Monday.

Leave Mummy Out of Your Father’s Day Plans

One of the goals of this blog is to make sure that you are aware of the internet tools that can protect you from wasting your time on blockbusters like The Mummy. While it had a disappointing opening in the U.S., moviegoers still shelled out an estimated $32.2 million at the box office last weekend for this bad movie. Overseas, it met its blockbuster expectations with a box office of $141.8 million. However, if you were really in the mood for a horror movie, a better choice, though not a sure thing, might have been It Comes At Night, which had a more modest U.S. box office of $6 million.

As a general rule, I won't go to a movie on its opening weekend. I prefer to get at least a weekend's worth of data. But if you just have to see a movie on its opening weekend, here are a couple of hints. First, if you are seeing the movie on its opening Friday, the most reliable indicator is Rotten Tomatoes. Most critics release their reviews before the day of the movie's release, so the Rotten Tomatoes rating on release day is a statistically mature evaluation of the movie. It won't change much after that.

If you are going to the movies on the Saturday of opening weekend, you can add Cinemascore to the mix. I’ve blogged about this tool before. This grade is based on feedback moviegoers provide about the movie as they are leaving the theater. The grade is posted on the Saturday after the Friday release.

Finally, by Sunday IMDB will produce a pretty good, though slightly inflated, average rating for the movie.

The comparison of these three checkpoints for The Mummy and for It Comes At Night might’ve been helpful to those who thought they were in for a “really like” movie experience.

Movie               Rotten Tomatoes         IMDB Avg. Rating   Cinemascore Grade
The Mummy           Rotten (17%)            5.9                B-
It Comes At Night   Certified Fresh (86%)   7.2                D

While the Cinemascore grade of D for It Comes At Night would keep me away from opening weekend for both movies, if I had to see one, it wouldn’t be The Mummy.

Here's the data behind my reasoning. For IMDB, the breakpoint between a movie with a good chance that I will "really like" it and one that I probably won't is an average rating of 7.2. Movies with an IMDB average rating of 7.2 or higher I "really like" 63.3% of the time. Movies with an IMDB rating less than 7.2 I "really like" 43.3% of the time. Turning to Rotten Tomatoes, movies that are Certified Fresh I "really like" 68% of the time. Those percentages drop to 49.6% for movies that are Fresh and 37.5% for movies that are Rotten. So, absent any information based on my own personal tastes, I won't go to the movieplex for a movie unless it is graded Certified Fresh by Rotten Tomatoes and has an IMDB rating of 7.2 or higher. That doesn't mean I won't "really like" some movies that fall short of those criteria. A movie may be in a genre that appeals to me, which provides some tolerance for a little less quality. That being said, the odds that I'll "really like" a low-rated movie are less than 50/50.
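
Here's that decision rule as a small sketch. Treating a "D" or "F" Cinemascore as an additional veto reflects the reasoning above, though exactly how to fold Cinemascore in is, as I note below, still an open question for me:

```python
# Sketch of the pre-screening rule described above: skip a movie unless it is
# Certified Fresh AND has an IMDB average of 7.2+; a very weak audience grade
# (e.g. the D for It Comes At Night) is an assumed extra veto.
def worth_opening_weekend(rt_rating, imdb_avg, cinemascore=None):
    if rt_rating != "Certified Fresh" or imdb_avg < 7.2:
        return False
    if cinemascore in ("D", "F"):
        return False
    return True

print(worth_opening_weekend("Rotten", 5.9, "B-"))          # The Mummy -> False
print(worth_opening_weekend("Certified Fresh", 7.2, "D"))  # It Comes At Night -> False
```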

I should probably explore adding Cinemascore to the objective factors I use in developing "really like" probabilities. To date, though, I don't have any Cinemascore data, so I don't yet have a feel for its "really like" reliability. For now, I just use it as another piece of data that might tip me one way or the other if I'm on the fence about a new movie.

Enjoy Father’s Day but stay away from Mummy.
