Will I "Really Like" this Movie?

Navigating Movie Website Ratings to Select More Enjoyable Movies

Archive for the month “June, 2016”

Will We Find Something Special in the July Movie Releases?

As I look ahead to July, the peak month of the blockbuster season, my challenge is to find five potential movie gems that I will “really like”. I’m not looking for the top five possible winners at the box office, though some of my choices will be on that list. I might even watch one of the July releases and walk away feeling that the movie is special. In the last five years, only a handful of movies released in July have approached this level. In July 2012, there were two: one was a blockbuster, The Dark Knight Rises, while the other was a low-budget documentary that went on to win the Academy Award for Best Documentary, Searching for Sugar Man. The five movies that follow are the ones I think have the best chance of being special.

Jason Bourne. Release Date: July 29, 2016    “Really Like” Probability: 62.5%

This is the third collaboration in the series between director Paul Greengrass and actor Matt Damon. Putting aside The Bourne Legacy, which you can almost consider a stand-alone movie since it involved neither Greengrass nor Damon, each movie in the series was a “really like” movie. The last movie in their collaboration, The Bourne Ultimatum, is arguably the best in the series. The word from producer Frank Marshall is that this movie features a chase along the Las Vegas Strip that is the best chase scene in the series.

Captain Fantastic. Release Date: July 8, 2016    “Really Like” Probability: 44.3%

This movie is in the early discussion for Oscar consideration, having won Best Film at the Seattle International Film Festival and garnered Director Awards for Matt Ross at Cannes and Palm Springs. It’s about a father, played by Viggo Mortensen, who raises his six children off the grid in the forests of the Pacific Northwest and the challenges they face when they are forced to return to the “civilized” world to address an emergency.

Star Trek Beyond. Release Date: July 22, 2016    “Really Like” Probability: 32.1%

J. J. Abrams directed the first two movies in this Star Trek series reincarnation, and I rated both of them as “really like” movies. Justin Lin, who directed four of the Fast and Furious franchise movies, takes over from Abrams in the director’s chair for this film. I have never seen a Justin Lin-directed movie, which makes me a little hesitant about this one. But, with Abrams overseeing as producer and Idris Elba playing the villain, Krall, I am ready for another trip on the Enterprise.

The BFG. Release Date: July 1, 2016    “Really Like” Probability: 32.1%

I’ll finalize my probability for this movie tomorrow. I don’t believe it will be a special movie, but I do believe it can be a “really like” movie. Roald Dahl, Steven Spielberg, and last year’s Oscar winner for Best Supporting Actor, Mark Rylance, are a promising combination. Early reviews are mixed, but audiences have yet to weigh in. This is one you may want to wait on until after opening weekend.

Café Society. Release Date: July 15 (limited), July 29 (wide)    “Really Like” Probability: 23.0%

The low probability on this one is deceptive. I actually feel that, while not on the same level as Midnight in Paris and Blue Jasmine, this latest film from Woody Allen may be in the ballpark. Based on buzz from the Cannes and Seattle festivals, it has made some early lists of Oscar contenders. Allen, who has a knack for attracting top talent, recently lined up Kate Winslet for his next movie. For this comedy, set in the 1930s, he has another ensemble of A-list talent including Steve Carell, Jesse Eisenberg, Kristen Stewart, and Blake Lively.

One of the joys of looking ahead at the movies being released in the next month is the hope that one or two of them will be magical. July, historically, has been a month that produces its fair share of these special movies. Maybe one of these five movies will be one of those iconic movies that earn the sobriquet “classic”.

 

Do July Movies Crackle or Fizzle?

In the United States, the highlight of the month of July is the celebration of Independence Day on the 4th of July, commemorated by parades, cookouts, and fireworks spectaculars. Readers of this blog, though, may be wondering whether the movies released in July crackle like the fireworks displays put on in the cities claiming the mantle of “cradle of liberty” (Boston, New York, and Philadelphia), or fizzle out like damp Roman candles launched in your neighbor’s backyard. I’m happy to report that not only is July National Hot Dog Month in the U.S., it is also a pretty good month for movies in most years.

Like June, July is a big month at the box office. For the five-year period from 2011 to 2015, ticket sales per movie across all months averaged $15.91 million, while ticket sales for movies released in July averaged $22.13 million, very close to June’s $22.45 million. Where June and July part ways is in the quality of the movies released. Of the top 75 movies in IMDB’s Top 250, ten were released in July, compared to five in June. Based on the movies I’ve seen over the last 15 years, there is a 53.6% probability I will “really like” a movie released in July, compared to 36.1% for June releases. Here are my top five “really like” July movie releases:

Movie                  Oscar Noms   Oscar Wins   Best Picture Noms
When Harry Met Sally   1            0            0
Saving Private Ryan    12           5            1
The Dark Knight        8            2            0
Seabiscuit             7            0            1
Die Hard               4            0            0

All, with the possible exception of the very good Seabiscuit, are considered iconic by most film buffs and were strong Academy Award performers. Throw in Inception, with its 8 nominations, including Best Picture, and 4 Academy Award wins, and the list is even more impressive. Based on the evidence presented here, we might conclude that July movie releases crackle.

Let’s not be hasty, though. In the last ten years, of the 78 movies nominated for Best Picture, only Inception and The Dark Knight were released in July, a paltry 2.6% of the total. Take a look at last year’s top five July Box Office Movies and you begin to wonder, where’s the crackle?

Top Movies                         IMDB Avg Rating   Rotten Tomatoes
Minions                            6.4               Rotten 56%
Mission Impossible: Rogue Nation   7.5               Certified Fresh 93%
Ant-Man                            7.4               Certified Fresh 81%
Trainwreck                         6.3               Certified Fresh 85%
Terminator: Genisys                6.6               Rotten 26%

Of these five movies, only Mission Impossible: Rogue Nation and Ant-Man are “really like” prospects, and neither is destined to be labeled iconic. There isn’t a single Academy Award nomination of any kind in the group. 2014 wasn’t much better, with only one nomination among the top five July releases. Does this mean that July used to crackle but now it fizzles? I think a two-year sample isn’t large enough to declare a trend, but it’s worth watching.

On Thursday, I’ll take a shot at identifying five movies that have the potential to crackle this July. With five July weekends this year the odds are in our favor. Like every month, though, there will be movies that you expect to crackle but, like that damp Roman Candle, will fizzle out.

In the Battle of Memory vs. Movie Website, Netflix is Still the Champ

On Monday I posed the question: is your memory of a movie you’ve already seen the best predictor of “really like” movies? Based on Monday’s analysis, memory certainly comes out on top against IMDB and Rotten Tomatoes. Today, I’m extending the analysis to Criticker, Movielens, and Netflix. By reconfiguring the data used in Monday’s post, you can also measure the relative effectiveness of each site. For example, let’s look again at IMDB.

Probability I Will “Really Like” Based on IMDB Recommendation
                    Recommended   Not Recommended   Spread (points)
Seen Before         80.1%         69.2%             11
Never Seen Before   50.6%         33.6%             17

It’s not surprising that the probabilities are higher for the movies that were seen before. After all, it wouldn’t make sense to rewatch movies you wished you hadn’t seen the first time. But by looking at the gap between the probability of a recommended movie and that of a non-recommended movie, you begin to see how effective the movie recommender is at sorting high-probability movies from low-probability movies. In this instance, the small 11-point spread for Seen Before movies suggests that IMDB is only sorting these movies into small departures from average. The low probabilities for the Never Seen Before movies suggest that, without the benefit of the memory of a movie seen before, IMDB doesn’t do a very good job of identifying “really like” movies.
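The spread calculation described above is simple enough to sketch in Python. The record layout and toy data below are hypothetical illustrations, not the blog's actual database:

```python
def really_like_rate(records, seen_before, recommended):
    """Share of matching movies that were 'really liked'.

    records: iterable of (seen_before, recommended, really_liked) tuples.
    """
    matches = [liked for sb, rec, liked in records
               if sb == seen_before and rec == recommended]
    return sum(matches) / len(matches) if matches else 0.0

def point_spread(records, seen_before):
    """Gap between the recommended and not-recommended 'really like' rates."""
    return (really_like_rate(records, seen_before, True)
            - really_like_rate(records, seen_before, False))

# Toy watch log: (seen_before, recommended, really_liked)
log = [(True, True, 1), (True, True, 1), (True, False, 1), (True, False, 0),
       (False, True, 1), (False, True, 0), (False, False, 0), (False, False, 0)]
```

A wider `point_spread` means the recommender separates good bets from bad ones more sharply, which is exactly the comparison made across the five sites in this post.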

Rotten Tomatoes follows a similar pattern.

Probability I Will “Really Like” Based on Rotten Tomatoes Recommendation
                    Recommended   Not Recommended   Spread (points)
Seen Before         80.5%         65.1%             15
Never Seen Before   49.8%         31.8%             18

Rotten Tomatoes is a little better than IMDB at sorting movies. The point spreads are a little broader. But, like IMDB, Rotten Tomatoes doesn’t effectively identify “really like” movies for the Never Seen Before group.

Theoretically, when we look at the same data for the remaining three sites, the Percentage Point Spread should be broader to reflect the more personalized nature of the ratings. Certainly, that is the case with Criticker.

Probability I Will “Really Like” Based on Criticker Recommendation
                    Recommended   Not Recommended   Spread (points)
Seen Before         79.3%         56.4%             23
Never Seen Before   45.3%         18.9%             26

Like IMDB and Rotten Tomatoes, though, Criticker isn’t very effective at identifying “really like” movies for those movies in the Never Seen Before group.

When you review the results for Movielens, you can begin to see why I’m so high on it as a movie recommender.

Probability I Will “Really Like” Based on Movielens Recommendation
                    Recommended   Not Recommended   Spread (points)
Seen Before         86.6%         59.6%             27
Never Seen Before   65.1%         22.3%             43

Unlike the three sites we’ve looked at so far, Movielens is a good predictor of “really like” movies for Never Seen Before movies. And, the spread of 43 points for the Never Seen Before movies is dramatically better than the three previous sites. It is a very effective sorter of movies.

Last, but certainly not least, here are the results for Netflix.

Probability I Will “Really Like” Based on Netflix Recommendation
                    Recommended   Not Recommended   Spread (points)
Seen Before         89.8%         45.7%             44
Never Seen Before   65.7%         21.4%             44

What jumps off the page is that Netflix shows no memory advantage in how it sorts movies. As expected, the Seen Before probabilities are higher, but there is an identical 44-point gap for movies Seen Before and movies Never Seen Before. Netflix is the only site where you have a less than 50% chance of “really liking” a movie you’ve already seen if Netflix doesn’t recommend it.

“If memory serves me correctly, I ‘really liked’ this movie the first time I saw it.” That is an instinct worth following, even if the movie websites suggest otherwise. But, if Netflix doesn’t recommend it, you might think twice.
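As a rough sketch, the rule of thumb above can be written down using the Netflix probabilities reported in this post. The function and its even-odds cutoff are my own illustrative reading of the advice, not a formula from the blog:

```python
# Netflix "really like" probabilities from the table above, keyed by
# (seen_before, netflix_recommends).
NETFLIX_PROB = {
    (True, True): 0.898, (True, False): 0.457,
    (False, True): 0.657, (False, False): 0.214,
}

def rewatch_advice(netflix_recommends):
    """For a movie you remember liking: rewatch, unless the odds drop below even."""
    p = NETFLIX_PROB[(True, netflix_recommends)]
    return "go ahead" if p >= 0.5 else "think twice"
```

With a Netflix recommendation the rewatch is a near-90% bet; without one, the odds fall under a coin flip, which is why memory alone isn't quite enough here.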

***

6/24/2016 Addendum

I’ve finalized my forecast for the last three movies on my June prospect list. My optimism is turning to pessimism regarding my hopes that Independence Day: Resurgence and Free State of Jones would be “really like” movies. Unfavorable reviews from the critics and a less than enthusiastic response from audiences suggest that they could be disappointments. Of my five June prospects, Finding Dory seems to be the only safe bet for theater viewing, with Me Before You a possibility for female moviegoers. The IMDB gender split is pronounced for Me Before You, with female voters giving it an 8.1 rating and males a 7.3 rating. It is also one of those rare movies with more female IMDB voters than male.

For “Really Like” Movies, Trust Your Memory More Than IMDB and Rotten Tomatoes

As of today, there are 1,984 movies in my database. Of those 1,984 movies, I’ve watched 446 movies more than once. The general rule I’ve set for myself is to wait at least 15 years before seeing a movie an additional time. I’ve found that over that time span I retain a general idea of what happens in the movie but forget the specific details of what happens. When I watch the movie the additional time it feels like a new movie to me.

I find that, while I forget the specifics of a movie, I retain a sense of whether I liked it or not, and that often drives whether I want to see it a second time. The question I’m exploring today is: when deciding to watch a movie a second time, should I trust the impression of the movie I remember, or the recommendations of IMDB and Rotten Tomatoes?

The data suggests that the answer to this question is pretty clear-cut. The movies that I’ve seen before have a much greater probability that I will “really like” them whether or not they are recommended by IMDB or Rotten Tomatoes.

 

When I watched a Movie, I had:   And it was IMDB:   Probability I Will “Really Like”
Seen Before                      Recommended        80.1%
Seen Before                      Not Recommended    69.2%
Never Seen Before                Recommended        50.6%
Never Seen Before                Not Recommended    33.6%

When I watched a Movie, I had:   And it was Rotten Tomatoes:   Probability I Will “Really Like”
Seen Before                      Recommended                   80.5%
Seen Before                      Not Recommended               65.1%
Never Seen Before                Recommended                   49.8%
Never Seen Before                Not Recommended               31.8%

While there are qualitative differences between IMDB and Rotten Tomatoes recommendations, the movies your memory wants you to see a second time are clearly the superior indicator of whether a movie will be a “really like” movie. Indeed, the probabilities are higher for a non-recommended movie you’ve seen before than for a recommended movie you haven’t seen before.

The other aspect of this data is that it reveals the weaknesses of IMDB and Rotten Tomatoes as recommenders of movies you haven’t seen. Strip the data of movies that you have seen before and a recommended movie has only a slightly better chance that you will “really like” it than selecting a movie randomly from the database. It reinforces the notion that these sites are better at weeding out movies that you won’t like rather than identifying movies that you will “really like”. Later this week, in Thursday’s post, I’ll share how this cut of the data looks for the websites that are sensitive to my individual taste in movies. Theoretically, it should better replicate the wisdom of movie memories.

For now, though, the wisdom of crowds of moviegoers and critics can’t compete with those golden memories you have of great movies seen in the past when it comes to “really like” movies.

 

Cinemascore Is For Opening Weekend, but Beware of Grade Inflation

We have just endured another Presidential Primary season where every tea leaf was micro-analyzed and every phrase parsed to death. One of the primary tools of the political pundits is the exit poll. In key districts across the primary state, pollsters await voters as they exit the polling place to determine whom the voters were pinning their hopes on to lead the free world at that very moment, and why. The exit poll fills our insatiable desire for instant feedback on what we’re collectively thinking.

The movie industry has its own version of the exit poll: Cinemascore. In the pre-IMDB days of 1978, the movie industry had the same concerns about critics that it has about Rotten Tomatoes today; the industry felt critics had too much influence with the viewing public. Cinemascore filled this perceived need to balance the sway of critics by measuring the opening-night reaction to a movie from moviegoers who were walking out of the theater. Like political exit polls, the theaters polled in the survey are specifically selected to provide a cross-section, regionally and demographically, of the viewing public in the U.S. and Canada. Participants in the survey answer six questions about the movie they’ve just watched, including the assignment of a grade from A to F.

By going to the website linked above you can view the average grade from the surveys given to recent major movie releases. You can also type in a movie title released after 1978 to see that movie’s average grade. With a paid subscription you can enter the website and presumably access results from the other five questions surveyed. Not all movies are surveyed, only those considered major releases.

How useful are these grades? Well, if you absolutely can’t wait to see a movie, but you can hold off until Saturday night, they can be quite useful. The survey sample is representative of moviegoers like you. I would expect that most attendees of an opening night movie have a high degree of interest in the movie, just like you. On the other hand, if your decision to attend an opening weekend movie is more casually made, Cinemascore could be deceiving.

The 24 recent movie releases currently displayed on the Cinemascore Home Page are ranked below by grade with the accompanying IMDB rating results:

Cinemascore: Recent Movie Results

Movie                                              Cinemascore   # IMDB Votes   IMDB Avg Rating
Me Before You                                      A                   6,217    7.9
Captain America: Civil War                         A                 229,127    8.3
God’s Not Dead 2                                   A                   3,809    3.3
The Jungle Book                                    A                  84,945    7.8
Teenage Mutant Ninja Turtles: Out of the Shadows   A-                  8,013    6.5
X-Men: Apocalypse                                  A-                102,507    7.4
Alice Through the Looking Glass                    A-                 12,479    6.4
The Conjuring 2                                    A-                  7,238    8.4
Now You See Me 2                                   A-                  3,696    7.2
Barbershop: The Next Cut                           A-                  2,099    6.1
My Big Fat Greek Wedding 2                         A-                  8,083    6.2
The Angry Birds Movie                              B+                 13,128    6.4
Warcraft                                           B+                 45,628    7.8
The Huntsman: Winter’s War                         B+                 23,522    6.2
Money Monster                                      B+                 11,349    6.8
Mother’s Day                                       B+                  3,643    5.4
Batman v Superman: Dawn of Justice                 B                 299,641    7.0
Keanu                                              B                   7,507    6.6
Neighbors 2: Sorority Rising                       B                  16,437    6.1
Popstar: Never Stop Never Stopping                 B                   2,815    7.4
Ratchet and Clank                                  B                   1,778    6.1
Criminal                                           B-                  5,452    6.4
The Nice Guys                                      B-                 23,900    7.8
The Darkness                                       C                   1,901    4.1
If an IMDB rating of 7.3 or higher is considered an above-average rating, then only a Cinemascore of A is solidly reinforced by the IMDB average ratings. Of the 7 movies receiving an A- grade, only X-Men: Apocalypse and The Conjuring 2 were considered above average by IMDB voters. A Cinemascore of A- may not translate favorably once the more general audience begins to view the film. If, on the other hand, you are really into Christian movies and were really looking forward to God’s Not Dead 2, Cinemascore is going to be a better indicator of the quality of the movie than IMDB, whose voters may not be representative of your taste in movies.

Cinemascore was created before we had sites like IMDB. It still has its use for “must see” opening weekend moviegoers and movies for unique tastes. Once you get past opening weekend, however, IMDB is probably a better tool for word of mouth feedback.

***

6/17/2016

I’ve entered my final estimate for Finding Dory this morning. The early indicators are that this will be a critical and box office success. I’ve forecast an 85% “really like” probability for it.

If You Want to Watch “Really Like” Movies, Don’t Count on IMDB.

Today’s post is for those of you who want to get your “geek” on. As regular readers of these pages are aware, IMDB is the least reliable indicator of whether I will “really like” a given movie. As you might also be aware, I am constantly making adjustments to my forecasting algorithm for “really like” movies. I follow the practice of establishing probabilities for the movies in my database, measuring how effectively those probabilities are at selecting “really like” movies, and revising the model to improve on the results. When that’s done, I start the process all over. Which brings me back to IMDB, the focus of today’s study.

My first step in measuring the effectiveness of IMDB at selecting “really like” movies is to rank the movies in the database by IMDB average rating and then divide the movies into ten groups of the same size. Here are my results:

IMDB Avg Rating Range   # of Movies   Probability I Will “Really Like”
> 8.1                   198           64.6%
7.8 to 8.1              198           60.6%
7.7 to 7.8              198           64.6%
7.5 to 7.7              198           58.6%
7.4 to 7.5              198           55.1%
7.2 to 7.4              198           52.5%
7.0 to 7.2              198           42.4%
6.8 to 7.0              198           39.4%
6.5 to 6.8              198           35.4%
< 6.5                   197           11.7%
All Movies              1,979         48.5%
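The ranking-and-grouping step behind these figures can be sketched in a few lines. The input here is assumed to be a plain list of (rating, liked) pairs rather than the blog's actual database:

```python
def decile_report(movies):
    """Rank movies by rating, split into ten equal groups, score each group.

    movies: list of (imdb_rating, really_liked) tuples, really_liked in {0, 1}.
    Returns a list of (min_rating, max_rating, really_like_rate) per decile,
    ordered from the highest-rated group to the lowest.
    """
    ranked = sorted(movies, key=lambda m: m[0], reverse=True)
    n = len(ranked)
    groups = [ranked[i * n // 10:(i + 1) * n // 10] for i in range(10)]
    report = []
    for g in groups:
        rate = sum(liked for _, liked in g) / len(g)
        report.append((min(r for r, _ in g), max(r for r, _ in g), rate))
    return report
```

The integer slicing handles totals like 1,979 that don't divide evenly by ten, which is why one group above ends up with 197 movies instead of 198.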

There seems to be a correlation between IMDB rating and the probability of “really like” movies in each group. The problem is that the results suggest IMDB does a better job of identifying movies that you won’t “really like” than movies that you will. For example, when I’ve gone through the same exercise for Netflix and Movielens, the probabilities for the top 10% of the ratings have been over 90% for each site, compared to 64.6% for IMDB.

With the graph displayed here, you can begin to picture the problem.

[Graph: distribution of IMDB average ratings across the database]

The curve peaks at 7.4. There are enough ratings on the low side of the curve to create significant probability differences between the groups; on the low side, it looks more like a classic bell curve. On the high side, the highest-rated movie, The Shawshank Redemption, has a 9.2 rating. The range between 7.4 and 9.2 is too narrow to create the kind of probability differences that would make IMDB a good predictor of “really like” movies. IMDB would probably work as a predictor of “really like” movies if its voters rated average movies a 5.0. Instead, an average movie probably lands in the low 7s.

So, what is a good average IMDB rating to use for “really like” movies? Let’s simplify the data from above:

IMDB Avg Rating Range   # of Movies   Probability I Will “Really Like”
7.7 or higher           636           62.7%
7.3 to 7.6              502           55.4%
7.2 or lower            841           33.7%
All Movies              1,979         48.5%

If we want to incrementally improve IMDB as a predictor of “really like” movies, we might set the bar at movies rated 7.7 or higher. I’m inclined to go in the opposite direction and utilize what IMDB does best: identifying which movies have a high probability of not being “really like” movies. By setting the IMDB recommendation threshold at 7.3, we are identifying better-than-average movies and relying on the other recommender websites to identify the “really like” movies.
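A minimal sketch of this two-stage idea, with made-up titles and a placeholder personal score standing in for the personalized sites' recommendations:

```python
def shortlist(movies, imdb_floor=7.3):
    """Weed out below-average movies with IMDB, then rank by a personal score.

    movies: list of (title, imdb_rating, personal_score) tuples.
    The 7.3 floor is the threshold chosen in the post; the personal score
    is a stand-in for a Movielens- or Netflix-style recommendation.
    """
    survivors = [m for m in movies if m[1] >= imdb_floor]
    return sorted(survivors, key=lambda m: m[2], reverse=True)

films = [("Movie A", 8.0, 0.55), ("Movie B", 7.1, 0.90), ("Movie C", 7.4, 0.80)]
# "Movie B" is dropped by the IMDB floor despite its strong personal score.
```

The design choice matches the post: IMDB is used only as a coarse filter, never as the final ranking.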

IMDB is one of the most utilized movie sites in the world, and it has a tremendous amount of useful information. But, if you want to select movies that you will “really like”, don’t count on IMDB.

Is Opening Weekend at the Movie Theaters a Flip of the Coin?

Last weekend, the top five movies at the Box Office all earned Rotten grades from Rotten Tomatoes. Two of the five have managed to receive favorable scores from IMDB, while the remaining three have received very mediocre feedback from their IMDB voters. Here are the five movies:

TOP FIVE MOVIES AT THE BOX OFFICE
WEEKEND OF 6/3 TO 6/5

Movie                                              Box Office ($M)   Rotten Tomatoes   IMDB Avg Rating
Teenage Mutant Ninja Turtles: Out of the Shadows   $35.3             36% Rotten        6.6
X-Men: Apocalypse                                  $22.8             48% Rotten        7.5
Me Before You                                      $18.7             56% Rotten        8.1
Alice Through the Looking Glass                    $11.3             29% Rotten        6.4
The Angry Birds Movie                              $10.2             43% Rotten        6.4

These results raise the question: should we ever go to the movies when a movie first comes out? Without feedback from actual moviegoers, our potential enjoyment of a movie during its early run in the theaters might be no better than the flip of a coin. Three of the five movies were released Memorial Day weekend, and their numbers are down significantly from their strong first weekend, possibly under the influence of their adverse Rotten Tomatoes grades. All of the movies have a built-in audience, whether because they are sequels or because, in the case of Me Before You, the audience read the book, or, in the case of The Angry Birds Movie, plays the phone app. Despite audiences predisposed to like each movie, only X-Men: Apocalypse and Me Before You have actually been liked, as evidenced by the IMDB ratings. Moviegoers spent $98.3 million last weekend expecting to be entertained by these five movies. Those who saw the TMNT movie, or Angry Birds, or the latest adventure of Alice, were a little disappointed. There has to be a better way.

I don’t know if it’s possible to improve the odds of selecting “really like” movies when they are first released. My efforts to forecast “really like” movies beginning in June will at least test whether I can do it. You may have noticed that I’ve made a notation in my June forecast that my forecast for Me Before You is final. In order to truly test the ability to project a movie before its opening weekend, all forecast adjustments have to be finalized before it opens in theaters. After four to six months, I plan to go back and compare how the actual “really like” probabilities developed against what I projected. After all, a forecast doesn’t have much credibility unless you keep score and demonstrate a track record of success.

I’ve been to movies on opening weekend where I felt pretty confident that I would “really like” the movie, Captain America: Civil War for example. In that instance there was a significant amount of data available from its international run the week before the U.S. opening. For most other movies the data will be less robust, requiring more creativity.

I’d like to think I can do better than the flip of a coin.

Netflix Attacks Its Movie Deficit But Is It Enough?

Over the last few months there have been a number of stories in the blogosphere about the shrinking Netflix movie catalogue. I contributed to this storyline in my post Netflix Streaming: The Other Story. The early response to these stories was that Netflix was going to focus its resources on original content because licensing fees were becoming cost prohibitive. Over the last couple of weeks, though, Netflix has decided, at least for now, not to entirely abandon customers who want more than just original content. On May 23rd, Netflix announced that, beginning in September, it would be the exclusive U.S. pay TV provider for the latest films from Disney, Marvel, Lucasfilm, and Pixar. Once they’ve left the movie theaters, the only place you’ll be able to see films from these proven producers of blockbuster movies is on Netflix. Along with this announcement, Netflix released a short clip on what’s coming to Netflix this summer; you can find it in the linked story. This deal, worth $600 million, is on the heels of a $100 million, 5-year deal with Miramax to retain the rights to its inventory of 700 movies. Absent a deal with Miramax, Netflix would have lost the licensing rights to many of its older classics, such as Good Will Hunting and Pulp Fiction. Amazon’s hopes of catching up to Netflix in terms of the size of its movie inventory have been seriously damaged.

These deals shore up a growing weakness for Netflix. For movie lovers like myself, though, a huge gap remains between the movies we want to see and the movies available on Netflix streaming and Amazon Prime. In my movie database there are 423 “really like” movies. Only 68, or 16%, are available on these two leading streaming services combined. If you only have Amazon Prime, 20 of these 423 movies, or 5%, are available. If you only have Netflix, it’s 48 of 423, or 11%. À la carte purchase of movies through pay-per-view venues isn’t going away. This isn’t good news for those of us who are already paying for streaming services and/or premium cable channels and still have to pay more to see 84% of the movies we’re interested in.
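Those percentages are easy to verify from the counts given above:

```python
TOTAL = 423  # "really like" movies in the database, per the post

counts = {
    "Netflix or Amazon Prime combined": 68,
    "Amazon Prime only": 20,
    "Netflix only": 48,
}

# Coverage as whole-number percentages of the "really like" catalogue.
coverage = {service: round(100 * n / TOTAL) for service, n in counts.items()}
# -> 16, 5, and 11, matching the 16%, 5%, and 11% quoted above
```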

I guess, with apologies to Crosby, Stills, Nash & Young, if you can’t be with the movie you love, love the movie you’re with.

Franchise Action Movies are Gold

X-Men: Apocalypse dominated the box office this Memorial Day weekend taking in $65.3 million in ticket sales. It also received a less than enthusiastic response from the critics, earning a 49% Rotten grade from Rotten Tomatoes. Does this mean that Rotten Tomatoes has little impact on how well a movie does at the box office? Absolutely not. X-Men: Days of Future Past received a 91% Certified Fresh Rating when it opened on Memorial Day weekend in 2014 and took in $90.8 million in ticket sales. The latest entry into the X-Men franchise underperformed its predecessor by 28% at the box office for the same holiday weekend. I suspect much of that deficit was due to the lukewarm reviews X-Men: Apocalypse received.

The story isn’t the impact of bad reviews on this particular movie but the box office immunity that franchise action movies have in general. They come with a built in audience that will show up regardless of how the movie is reviewed. The following is a list of the top ten U.S. box office performers for action movies including their Rotten Tomatoes grades:

Movie                                        Release Year   US Box Office ($M)   Rotten Tomatoes Grade   % Fresh
Star Wars: Episode VII – The Force Awakens   2015           $937                 Certified Fresh         92%
Avatar                                       2009           $761                 Certified Fresh         83%
Jurassic World                               2015           $652                 Fresh                   72%
The Avengers                                 2012           $623                 Certified Fresh         92%
The Dark Knight                              2008           $533                 Certified Fresh         94%
Star Wars: Episode I – The Phantom Menace    1999           $475                 Rotten                  56%
Star Wars                                    1977           $461                 Certified Fresh         94%
Avengers: Age of Ultron                      2015           $459                 Certified Fresh         75%
The Dark Knight Rises                        2012           $448                 Certified Fresh         87%
Pirates of the Caribbean: Dead Man’s Chest   2006           $423                 Rotten                  54%

These box office behemoths have combined for $5.8 billion in ticket sales. Three of these blockbusters aren’t graded Certified Fresh, and a fourth, Avengers: Age of Ultron, just barely qualifies. What all of these movies have in common is that they are part of movie franchises. Avatar 2, 3, 4, & 5 are scheduled for release between 2018 and 2023, in case you were wondering why Avatar is considered a franchise. Only Star Wars and Avatar didn’t come with a built-in audience from a previous movie in the franchise. Eight of the ten movies were released during blockbuster season. The other two, Avatar and Star Wars: The Force Awakens, were released during movie award season, earning 14 Academy Award nominations and 3 wins between them. These movies have a combined IMDB rating of 8.2 from males and 8.1 from females; there is little gender gap for this genre.

Franchise action movies are box office gold for the good, the bad, and the ugly (a reference to a franchise of a different genre).
