Will I "Really Like" this Movie?

Navigating Movie Website Ratings to Select More Enjoyable Movies

Archive for the tag “Rotten Tomatoes”

Why Did “The Big Sick” Drop Out of the Objective Top Fifteen This Week?

This past Sunday my wife, Pam, and I went to see The Big Sick. The movie tells the story of the early relationship days of the two screenwriters, Emily Gordon and Kumail Nanjiani. In fact, Nanjiani plays himself in the movie. It is the authenticity of the story, told in a heartfelt and humorous way, that makes this film special.

On the following day, last weekend’s blockbuster, Dunkirk, moved into the second spot in the revised Objective Top Fifteen rankings. When a new movie comes on the list another one exits. This week’s exiting movie, ironically, was The Big Sick. Wait! If The Big Sick is such a great movie why isn’t it in my top fifteen for the year? Are all of the other movies on the list better movies? Maybe yes. Maybe no. You’ll have to determine that for yourselves. You see the Objective Top Fifteen is your list, not mine.

I developed the Objective Top Ten, which became Fifteen at the beginning of July and will become Twenty at the beginning of October, to provide you with a ranking of 2017 widely released movies that are most likely to be “really like” movies. Because the ranking is based on objective benchmarks, my taste in movies has no influence on the list. The four benchmarks presently in use are: IMDB Avg. Rating, Rotten Tomatoes Rating, Cinemascore Rating, and Academy Award Nominations and Wins. A movie like Hidden Figures that meets all four benchmarks has the greatest statistical confidence in its “really like” status and earns the highest “really like” probability. A movie that meets three benchmarks has a greater “really like” probability than a movie that meets only two benchmarks. And so on.
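To make the benchmark-counting idea concrete, here is a minimal sketch in Python. It is an illustration only: the thresholds are my assumptions drawn from elsewhere on this blog (IMDB 7.2+, Certified Fresh, Cinemascore “A-” or better, any Oscar nomination or win), and the actual model converts the benchmark count into a probability and applies tie breakers that aren’t reproduced here.

```python
# Hedged sketch of the benchmark-counting idea behind the Objective Top Fifteen.
# Thresholds are assumptions taken from elsewhere on this blog; the real
# rankings also turn the count into a "really like" probability and apply
# tie breakers, which aren't shown here.
from dataclasses import dataclass
from typing import Optional

CINEMASCORE_ORDER = ["A+", "A", "A-", "B+", "B", "B-", "C+", "C", "C-", "D", "F"]

@dataclass
class Movie:
    title: str
    imdb_avg: Optional[float]       # None = not enough votes yet
    certified_fresh: bool
    cinemascore: Optional[str]      # None = not surveyed (e.g., The Big Sick)
    oscar_noms_and_wins: int        # 0 for any 2017 wide release until next January

def benchmarks_met(m: Movie) -> int:
    met = 0
    if m.imdb_avg is not None and m.imdb_avg >= 7.2:
        met += 1
    if m.certified_fresh:
        met += 1
    if m.cinemascore is not None and \
            CINEMASCORE_ORDER.index(m.cinemascore) <= CINEMASCORE_ORDER.index("A-"):
        met += 1
    if m.oscar_noms_and_wins > 0:
        met += 1
    return met

# The Big Sick clears the IMDB and Rotten Tomatoes benchmarks, but with no
# Cinemascore survey and no Oscar data yet it meets only two of the four.
print(benchmarks_met(Movie("The Big Sick", 8.0, True, None, 0)))  # -> 2
```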

The important thing to note, though, is that this is not a list of the fifteen best movies of the year. It is a ranking of the probabilities (with some tie breakers thrown in) that you’ll “really like” a movie, and it is subject to data availability. The more positive data that’s available, the more statistical confidence the model has in the projection, i.e., the higher the probability.

Which brings me back to The Big Sick. Cinemascore surveys only those movies that it considers “major releases”. The Big Sick probably didn’t have a big advertising budget. Instead, the producers of the film chose to roll the movie out gradually, beginning on June 23rd, to create some buzz and momentum behind the movie before putting it into wide release on July 14th. This is probably one of the reasons why Cinemascore didn’t survey The Big Sick. But, because The Big Sick is missing that third benchmark needed to develop a higher probability, it dropped out of the Top Fifteen. On the other hand, if it had earned at least an “A-” from Cinemascore, The Big Sick would be the #2 movie on the list based on the tie breakers.

And that is both the weakness and the strength of movie data: “major releases” have it. Smaller movies like The Big Sick don’t.

***

This weekend may be the end of the four-week run of Objective Top Fifteen movie breakthroughs. Atomic Blonde, the Charlize Theron spy thriller, has an outside chance of earning a spot on the list. As of this morning, it is borderline for the IMDB and Rotten Tomatoes benchmarks. I’m also tracking Girls Trip, which earned a Certified Fresh rating from Rotten Tomatoes just in the last couple of days and already has an “A+” in hand from Cinemascore. For now, it is just below the IMDB benchmark. We’ll see if that changes over the weekend.

 

 

Vacation, My 100th Post, and a July “Really Like” Movie Hot Streak

I arrived in the city of Seattle yesterday in the wee hours of the morning. I’m here to introduce myself to my new, beautiful granddaughter. So if there is a contemplative, or distracted, feel to this week’s post, there is good reason.

This is also my 100th post. Not quite as momentous as your first grandchild, but a marker worthy of reflection nevertheless. It has been a labor of love and a challenge. Blogging was new to me when I started out 99 posts ago. I discovered that you don’t find your voice in the first post. Little by little, though, you develop a style that both you and the readers of your blog become comfortable with. If you’re lucky, enough people become engaged in your passion and come back for more. Thanks for your support if you’re one of those loyal followers, or even if you’ve just stopped by for an occasional “check and see”. On to the next 100 posts, beginning with a look at what’s caught my eye at the Cineplex this coming weekend.

Dunkirk, which goes into wide release tomorrow, is poised to become the fourth high quality mega-hit in four weeks. As of this morning, it is 94% Certified Fresh on Rotten Tomatoes. And, the early overseas feedback on IMDB has produced an impressive 9.6 average rating. This Christopher Nolan depiction of the rescue of the surrounded British army at the beginning of World War II is being compared to the classic Saving Private Ryan. The Saving Private Ryan comparison benchmarks to keep an eye on are Certified Fresh 92%, IMDB Avg Rating 8.6 and Cinemascore “A”. Pre-wide release Dunkirk is exceeding the Rotten Tomatoes and IMDB scores. We’ll have to wait until Saturday for Cinemascore results. I’m excited about this one.

In addition to off-schedule posts to this site, vacation for the Mad Movie Man invariably involves a trip to the movies. With an unusually high number of Certified Fresh movies at the theater, it is almost a can’t-miss proposition. But the absolute can’t-miss feature of this vacation is the incredible miracle of my granddaughter Addie Rose.

This Is Turning Into a “Really Like” Summer at the Movies.

In case you haven’t noticed, we are in the midst of a pretty good run of high quality movies this summer. Since the first weekend in May, which serves as the unofficial beginning of the summer movie season, there have been at least ten movies that have a 7.2 or higher IMDB average rating and have a Certified Fresh rating on Rotten Tomatoes.

May to July 2017 Wide Release Movies IMDB Rating Rotten Tomatoes Rating Rotten Tomatoes % Fresh
Baby Driver 8.4 C. Fresh 97%
Spider-Man: Homecoming 8.2 C. Fresh 93%
Wonder Woman 8.0 C. Fresh 92%
Guardians of the Galaxy Vol. 2 8.1 C. Fresh 81%
The Big Sick 8.0 C. Fresh 97%
I, Daniel Blake 7.9 C. Fresh 92%
A Ghost Story 7.5 C. Fresh 87%
Okja 7.7 C. Fresh 84%
The Beguiled 7.3 C. Fresh 77%
The Hero 7.3 C. Fresh 76%

And if early indicators are accurate, War for the Planet of the Apes will join the list after this coming weekend. And, if the early buzz on social media holds up, Christopher Nolan’s new movie Dunkirk will join the list the following weekend.

This seems to me to be an unusually high number of quality movies for the summer so far, but I can’t tell you how unusual…yet. I’m working on a new long-term project: a database made up solely of objective “really like” movie indicators. It will include all movies finishing in the top 150 in box office receipts for each of the last 25 years. This database will provide a better representation of the bad movies that are released each year, as well as a more robust sample size.

For now, I can only compare this year’s quality to 1992 (the first of the 25 years in my new database). Because Rotten Tomatoes wasn’t launched until 1998, I’ve included movies that aren’t Certified Fresh but would otherwise qualify if there were enough critic reviews of the movie. Even with that allowance, there are only 3 movies released between May and July 1992 that meet the quality criteria I’m using for this summer.

May to July 1992 Wide Release Movies IMDB Rating Rotten Tomatoes Rating Rotten Tomatoes % Fresh
Night on Earth 7.5 Fresh 73%
Enchanted April 7.6 Fresh 83%
A League of Their Own 7.2 C. Fresh 78%

I’ll also add that the IMDB average ratings tend to decline over time. It is probable that a few of this year’s movies will ultimately not meet the 7.2 IMDB rating minimum. But, with 7 of the 10 movies sitting with IMDB ratings at 7.7 or better, this year’s list should hold up pretty well over time.

***

As I mentioned above, War for the Planet of the Apes opens tomorrow. It is easy to overlook how good this franchise has been. Here are the “really like” indicators for the franchise, including a very early look at tomorrow’s entry.

Movie IMDB Rating Rotten Tomatoes Rating Rotten Tomatoes % Fresh Cinemascore
Rise of the Planet of the Apes (2011) 7.6 C. Fresh 81% A-
Dawn of the Planet of the Apes (2014) 7.6 C. Fresh 90% A-
War for the Planet of the Apes (2017) 9.1 C. Fresh 93% ?

Franchises tend to get tired after the first movie. From the critics’ perspective, this franchise appears to get better with each new movie. I expect to see War for the Planet of the Apes on the Objective Top Fifteen list on Monday.

Leave Mummy Out of Your Father’s Day Plans

One of the goals of this blog is to make sure that you are aware of the internet tools that can protect you from wasting your time on blockbusters like The Mummy. While it had a disappointing opening in the U.S., moviegoers still shelled out an estimated $32.2 million at the box office last weekend for this bad movie. Overseas, it met its blockbuster expectations with a box office of $141.8 million. However, if you were really in the mood for a horror movie, a better choice, but not a sure thing, might have been It Comes At Night, which had a more modest U.S. box office of $6 million.

As a general rule, I won’t go to a movie on its opening weekend. I prefer to get at least a weekend’s worth of data. But if you just have to see a movie on its opening weekend, here are a couple of hints. First, if you are seeing the movie on its opening Friday, the most reliable indicator is Rotten Tomatoes. Most critics have released their reviews before the day of the movie’s release, so the Rotten Tomatoes rating on release day is already a statistically mature evaluation of the movie. It won’t change much after that day.

If you are going to the movies on the Saturday of opening weekend, you can add Cinemascore to the mix. I’ve blogged about this tool before. This grade is based on feedback moviegoers provide about the movie as they are leaving the theater. The grade is posted on the Saturday after the Friday release.

Finally, by Sunday IMDB will produce a pretty good, though slightly inflated, average rating for the movie.

The comparison of these three checkpoints for The Mummy and for It Comes At Night might’ve been helpful to those who thought they were in for a “really like” movie experience.

Movie Rotten Tomatoes IMDB Avg. Rating Cinemascore Grade
The Mummy Rotten (17%) 5.9 B-
It Comes At Night Certified Fresh (86%) 7.2 D

While the Cinemascore grade of D for It Comes At Night would keep me away from both movies on opening weekend, if I had to see one, it wouldn’t be The Mummy.

Here’s the data behind my reasoning. For IMDB, the breakpoint between a movie with a good chance that I will “really like” it and one that I probably won’t is an average rating of 7.2. Movies with an IMDB average rating of 7.2 or higher I “really like” 63.3% of the time. Movies with an IMDB rating below 7.2 I “really like” only 43.3% of the time. Turning to Rotten Tomatoes, movies that are Certified Fresh I “really like” 68% of the time. That percentage drops to 49.6% for movies that are Fresh and 37.5% for movies that are Rotten. So, absent any information based on my own personal tastes, I won’t go to the movieplex for a movie unless it is graded Certified Fresh by Rotten Tomatoes and has an IMDB rating of 7.2 or higher. That doesn’t mean I won’t “really like” any movies that fall short of those criteria. A movie may be in a genre that appeals to me, which can provide some tolerance for a little less quality. That being said, the odds that I’ll “really like” a low-rated movie are less than 50/50.
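Here is a minimal sketch of that screen. The percentages come from my own viewing history as quoted above, the function name is purely illustrative, and this is not the full “really like” model.

```python
# Hedged sketch of the opening-weekend screen described above. The
# percentages are from my own viewing history; the names are illustrative.
REALLY_LIKE_PCT_BY_IMDB = {"7.2 or higher": 63.3, "below 7.2": 43.3}
REALLY_LIKE_PCT_BY_RT = {"Certified Fresh": 68.0, "Fresh": 49.6, "Rotten": 37.5}

def passes_opening_weekend_screen(imdb_avg: float, rt_rating: str) -> bool:
    """Absent any personal-taste information: Certified Fresh AND IMDB >= 7.2."""
    return rt_rating == "Certified Fresh" and imdb_avg >= 7.2

print(passes_opening_weekend_screen(5.9, "Rotten"))           # The Mummy -> False
print(passes_opening_weekend_screen(7.2, "Certified Fresh"))  # It Comes At Night -> True
# The Cinemascore "D" for It Comes At Night is the extra data point that
# would still keep me home on opening weekend.
```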

I should probably explore the potential of adding Cinemascore to the objective factors I use in developing “really like” probabilities. To date, though, I don’t have any Cinemascore data, so I don’t yet have a feel for its “really like” reliability. For now, I just use it as another piece of data that might tip me one way or the other if I’m on the fence about a new movie.

Enjoy Father’s Day but stay away from Mummy.

Wonder Woman Is Wonderful But Is It the GOAT Superhero Movie?

Everybody is talking about Wonder Woman and its record-breaking box office last weekend. Critics and audiences agree that Wonder Woman is worth a trip to the theater. The Mad Movie Man is convinced as well. You’ll find the movie in the top half of the 2017 Top Ten List and it is on my Watch List for the week, which means I plan on seeing it within the next week.

I mentioned last week that critics were falling all over themselves in praising this movie with some calling it the Superhero GOAT (Greatest Of All Time). Does it warrant such acclaim? Maybe. When you compare it to four other highly rated superhero movies that kicked off franchises, it holds up pretty well.

Movie Oscar Noms/Wins IMDB Rating Rotten Tomatoes Rating Rotten Tomatoes % Fresh Combined Rating
Wonder Woman (2017) 0/0 8.3 C. Fresh 93% 17.6
Iron Man (2008) 2/0 7.9 C. Fresh 94% 17.3
Batman Begins (2005) 1/0 8.3 C. Fresh 84% 16.7
Superman (1978) 3/0 7.3 C. Fresh 93% 16.6
Spider-Man (2002) 2/0 7.3 C. Fresh 89% 16.2

All four of these comparison movies were Oscar nominated. We’ll have to wait until next January to see if Wonder Woman earns Oscar recognition. The combined rating presented here totals the IMDB rating and the Rotten Tomatoes % Fresh (converted to a 10 pt. scale) to measure the response of both critics and audiences to the five movies. It is still early, and IMDB ratings tend to fade a little over time, but for now Wonder Woman is clearly in the GOAT discussion.
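The arithmetic behind that combined rating is simple; here is a quick sketch (the function name is mine, not the blog’s):

```python
# Combined rating = IMDB average + (Rotten Tomatoes % Fresh converted to a
# 10-point scale). The function name is illustrative only.
def combined_rating(imdb_avg: float, pct_fresh: int) -> float:
    return round(imdb_avg + pct_fresh / 10, 1)

print(combined_rating(8.3, 93))  # Wonder Woman -> 17.6
print(combined_rating(7.9, 94))  # Iron Man     -> 17.3
```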

If Wonder Woman holds on to its statistical GOAT position it will be fueled by the response of women to the movie. A comparison of Female and Male IMDB ratings for the five movies compared here lays this out pretty clearly.

Movie Female IMDB Rating Male IMDB Rating IMDB Rating Difference
Wonder Woman 8.6 8.2 0.4
Iron Man 7.9 7.9 0.0
Superman 7.3 7.3 0.0
Batman Begins 8.1 8.3 -0.2
Spider-Man 7.1 7.3 -0.2

While men “really like” Wonder Woman, women love the movie. Women are responding like they never have before to a superhero movie. Men, on the other hand, have a slight preference for Christopher Nolan’s vision of Batman. I also have to admit that I personally consider Batman Begins one of the GOAT movies, irrespective of genre. That being said, I am really excited about seeing Wonder Woman.

***

After all of this praise for Wonder Woman, you might be wondering why it is only fifth on the 2017 Top Movies List. Does that mean that the four movies ahead of it are better movies? It might, but not necessarily. The top four movies all went into limited release in 2016 to qualify for Oscar consideration. They didn’t go into wide release until early 2017, which is why they are on this list. All of the other movies on the list won’t be considered for Oscar recognition until January 2018. As I mentioned last week, this list is based on objective criteria. The Oscar nominations that the top four movies received are additional objective pieces of evidence that they are quality movies. That evidence allows the algorithm to be more confident in its evaluation and, as a result, produces a higher “really like” probability. Again, just in case you were wondering.

 

When It Comes to Unique Movies, Don’t Put All of Your Movie Eggs in the Netflix Basket.

It is rare to find a movie that isn’t a sequel, or a remake, or a borrowed plot idea, or a tried and true formula. Many of these movies are entertaining because they feel familiar and remind us of another pleasant movie experience. The movie recommender websites Netflix, Movielens, and Criticker do a terrific job of identifying those movie types that you “really liked” before and surfacing those movies that match the familiar plot lines you’ve enjoyed in the past.

But what about those movies that are truly original? What about those movies that present ideas and plot lines that aren’t in your usual comfort zone? Will these reliable websites surface unique movies that you might like? Netflix has 20,000+ genre categories that it slots movies into. But what if a movie doesn’t fit neatly into one of those 20,000 categories?

Yesterday I watched a great movie, Being There.


This 1979 movie, starring Peter Sellers in an Academy Award-nominated performance, is about a simple-minded gardener who never left the home of his employer during the first fifty years of his life. Aside from gardening, the only knowledge he has is what he’s seen on television. After his employer dies, he is forced to leave his home. The movie follows Chance the gardener as he becomes Chauncey Gardiner, economic advisor to the President. It’s not a story of transformation but of perception. The movie is fresh.

My most reliable movie recommenders, Netflix and Movielens, warned me away from this movie. The only reason I added it to my weekly Watch List is because I saw the movie in the theater when it first came out in 1979 and “really liked” it.

Another possible reason why Netflix missed on this movie is that I hated Peter Sellers’ other classic movie, Dr. Strangelove; I rated it 1 out of 5 stars. If Being There is slotted into a Netflix category of Peter Sellers classics, that might explain the mismatch.

Is it impossible, then, to surface movies with creative, original content that you might like? Not entirely. Criticker predicted I would rate Being There an 86 out of 100. I gave it an 89. The IMDB rating is 8.0 based on over 54,000 votes. Rotten Tomatoes has it at 96% Certified Fresh. This is why I incorporate ratings from five different websites into the “really like” model rather than relying on Netflix alone.

When it comes to unique movies, don’t put all of your movie eggs in the Netflix basket.

 

 

Does Critic Expertise on Rotten Tomatoes Overcome the Law of Large Numbers?

In the evolution of my “really like” movie algorithm, one of the difficulties I keep encountering is how to integrate Rotten Tomatoes ratings in a statistically significant way. Every time I try, I keep rediscovering that its ratings are not as useful as those from the other websites I use. It’s not that it has no use. To determine if a movie is worth seeing within a week of its release, you’ll be hard pressed to find a better indicator. The problem is that most of the data for a particular movie is counted in that first week. Most of the critic reviews are completed close to the release date to provide moviegoers with guidance on the day a movie comes out. After that first week, the critics are on to the next batch of new movies to review. With all of the other websites, the ratings continually get better as more people see the movie and provide input. The data pool gets larger and the law of large numbers kicks in. With Rotten Tomatoes, there is very little data growth. Its value rests on the expertise of the critics and less on the law of large numbers.

The question becomes: what is the value of film critics’ expertise? It is actually pretty valuable. When Rotten Tomatoes slots movies into one of its three main rating buckets (Certified Fresh, Fresh, Rotten), it does create a statistically significant differentiation.

Rating “Really Like” %
Certified Fresh 69.7%
Fresh 50.0%
Rotten 36.6%

Rotten Tomatoes is able to separate pretty well those movies I “really like” from those I don’t.

So what’s the problem? If we stick to Certified Fresh movies, we’ll “really like” them 7 out of 10 times. That’s true. And, if I’m deciding on which new release to see in the movie theater, that’s really good. But, if I’m deciding what movie my wife and I should watch on Friday night movie night, and our selection is from the movies on cable or our streaming service, we can do better.

Of the 1,998 movies I’ve seen in the last 15 years, 923 are Certified Fresh. Which of those movies am I most likely to “really like”? Based on the following table, I wouldn’t rely on the Rotten Tomatoes % Fresh number.

Rating % Fresh Range “Really Like” %
Certified Fresh 96 to 100% 69.9%
Certified Fresh 93 to 95% 73.4%
Certified Fresh 89 to 92% 68.3%
Certified Fresh 85 to 88% 71.2%
Certified Fresh 80 to 84% 73.0%
Certified Fresh 74 to 79% 65.3%

This grouping of six equal-size buckets suggests that there isn’t much difference between a movie in my database that is 75% Fresh and one that is 100% Fresh. Now, it is entirely possible that there is an actual difference between 75% Fresh and 100% Fresh. If my database were larger, the data might produce a less random pattern, one that might be statistically significant. For now, though, the data is what it is. Certified Fresh is predictive; the % Fresh part of the rating, less so.
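For anyone curious how such a bucket check can be run, here is a minimal sketch. It assumes a file of my own viewing history with hypothetical column names; it is not the exact analysis behind the table above.

```python
# Hedged sketch of the equal-size bucket analysis. Assumes a CSV of my own
# Certified Fresh viewing history with hypothetical columns:
#   title, pct_fresh (Rotten Tomatoes % Fresh), really_liked (0 or 1).
import pandas as pd

movies = pd.read_csv("certified_fresh_movies.csv")   # e.g., the 923 CF titles

# Split into six equal-size buckets by % Fresh, then compute the share of
# movies in each bucket that I "really liked".
movies["bucket"] = pd.qcut(movies["pct_fresh"], q=6, duplicates="drop")
summary = (movies.groupby("bucket", observed=True)["really_liked"]
                 .mean().mul(100).round(1))
print(summary)   # roughly flat across buckets, per the table above
```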

Expertise can reduce the numbers needed for meaningful differentiation between what is Certified Fresh and what is Rotten. The law of large numbers, though, may be too daunting for credible guidance much beyond that.

 

 

The Art of Selecting “Really Like” Movies: New Movies

I watch a lot of movies, a fact that my wife, and occasionally my children, like to remind me of. Unlike the average, non-geeky movie fan, though, I am constantly analyzing the process I go through to determine which movies I watch. I don’t like to watch mediocre, or worse, movies. I’ve pretty much eliminated bad movies from my selections. But, every now and then a movie I “like” rather than “really like” will get past my screen.

Over the next three weeks I’ll outline the steps I’m taking this year to improve my “really like” movie odds. Starting this week with New Movies, I’ll lay out a focused strategy for three different types of movie selection decisions.

The most challenging “really like” movie decision I make is which movies that I’ve never seen before are likely to be “really like” movies. There is only a 39.3% chance that watching a movie I’ve never seen before will result in a “really like” experience. My goal is to improve those odds by the end of the year.

The first step I’ve taken is to separate movies I’ve seen before from movies I’ve never seen when establishing my “really like” probabilities. As a frame of reference, there is a 79.5% chance that I will “really like” a movie I’ve seen before. By setting my probabilities for movies I’ve never seen off of the 39.3% base rate, I have created a tighter screen for those movies. This should result in me watching fewer never-before-seen movies than I’ve typically watched in previous years. Of the 20 movies I’ve watched so far this year, only two were never-before-seen movies.

The challenge in selecting never-before-seen movies is that, because I’ve watched close to 2,000 movies over the last 15 years, I’ve already watched the “cream of the crop” from those years. From 2006 to 2015, there were 331 movies that I rated as “really like” movies; that is about 33 movies a year, or fewer than 3 a month. Last year I watched 109 movies that I had never seen before. So, except for the 33 or so new movies that came out last year that, statistically, might be “really like” movies, I watched 76 movies that didn’t have a great chance of being “really like” movies.

Logically, the probability of selecting a “really like” movie that I’ve never seen before should be highest for new releases. I just haven’t seen that many of them. I’ve only seen 6 movies that were released in the last six months, and I “really liked” 5 of them. If, on average, there are 33 “really like” movies released each year, then, statistically, there should be around a dozen “really like” movies released in the last six months that I haven’t seen yet (roughly half of 33, less the handful I’ve already seen). I just have to discover them. Here is my list of the top ten new movie prospects that I haven’t seen yet.

My Top Ten New Movie Prospects
New Movies = Released Within the Last 6 Months
Movie Title Release Date Last Data Update “Really Like” Probability
Hacksaw Ridge 11/4/2016 2/4/2017 94.9%
Arrival 11/11/2016 2/4/2017 94.9%
Doctor Strange 11/4/2016 2/6/2017 78.9%
Hidden Figures 1/6/2017 2/4/2017 78.7%
Beatles, The: Eight Days a Week 9/16/2016 2/4/2017 78.7%
13th 10/7/2016 2/4/2017 78.7%
Before the Flood 10/30/2016 2/4/2017 51.7%
Fantastic Beasts and Where to Find Them 11/18/2016 2/4/2017 51.7%
Moana 11/23/2016 2/4/2017 51.7%
Deepwater Horizon 9/30/2016 2/4/2017 45.4%
Fences 12/25/2016 2/4/2017 45.4%

Based on my own experience, I believe you can identify most of the new movies that will be “really like” movies within 6 months of their release, which is how I’ve defined “new” for this list. I’m going to test this theory this year.

In case you are interested, here is the ratings data driving the probabilities.

My Top Ten New Movie Prospects 
Movie Site Ratings Breakdown
Ratings *
Movie Title # of Ratings All Sites Age 45+ IMDB Rotten Tomatoes ** Criticker Movielens Netflix
Hacksaw Ridge         9,543 8.2 CF 86% 8.3 8.3 8.6
Arrival      24,048 7.7 CF 94% 8.8 8.1 9.0
Doctor Strange      16,844 7.7 CF 90% 8.2 8.3 7.8
Hidden Figures         7,258 8.2 CF 92% 7.7 7.3 8.2
Beatles, The: Eight Days a Week         1,689 8.2 CF 95% 8.0 7.3 8.0
13th    295,462 8.1 CF 97% 8.3 7.5 8.0
Before the Flood         1,073 7.8 F 70% 7.6 8.2 7.8
Fantastic Beasts and Where to Find Them      14,307 7.5 CF 73% 7.3 6.9 7.6
Moana         5,967 7.7 CF 95% 8.4 8.0 7.0
Deepwater Horizon      40,866 7.1 CF 83% 7.8 7.6 7.6
Fences         4,418 7.6 CF 95% 7.7 7.1 7.2
*All Ratings Except Rotten Tomatoes Calibrated to a 10.0 Scale
** CF = Certified Fresh, F = Fresh

Two movies, Hacksaw Ridge and Arrival, are already probable “really like” movies and should be selected to watch when available. The # of Ratings All Sites is a key column. The ratings from Movielens and Netflix need volume before they can credibly reach their true level. Until there is a credible amount of data, the rating you get is closer to what an average movie would get. A movie like Fences, at 4,418 ratings, hasn’t reached the critical mass needed to migrate to the higher ratings I would expect that movie to reach. Deepwater Horizon, on the other hand, with 40,866 ratings, has reached a fairly credible level and may not improve upon its current probability.
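The notion that a low-volume rating hugs the average and only migrates toward its “true” level as votes pile up can be illustrated with a simple credibility-weighted blend. This is my own illustration of the concept, not the formula any of these websites actually uses; the prior mean and the constant k are assumptions.

```python
# Illustration only: a credibility-weighted blend of an observed rating and
# a prior (average) rating. Neither k nor the prior comes from any of the
# rating sites -- both are assumptions chosen to show the shape of the effect.
def credibility_weighted(observed: float, n_ratings: int,
                         prior: float = 7.0, k: int = 25_000) -> float:
    """More ratings => more weight on the observed rating, less on the prior."""
    weight = n_ratings / (n_ratings + k)
    return round(weight * observed + (1 - weight) * prior, 2)

# With few ratings the blended value barely moves off the prior; with tens
# of thousands of ratings it sits much closer to the observed rating.
print(credibility_weighted(8.0, 4_418))    # Fences-scale volume    -> ~7.15
print(credibility_weighted(8.0, 40_866))   # Deepwater-scale volume -> ~7.62
```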

I’m replacing my monthly forecast on the sidebar of this website with the top ten new movie prospects exhibit displayed above. I think it is a better reflection of the movies that have the best chance of being “really like” movies. Feel free to share any comments you might have.

 

The Eighth Decade of Oscar Belonged to the Remarkable Dame Judi

In 1995 two actors eased their way into the consciousness of United States moviegoers after learning their craft across the oceans in Australia and England. The actor made an impression in a box office loser, The Quick and the Dead. The actress broke down the gender barrier in the testosterone-laden James Bond franchise, becoming the first woman to play M in GoldenEye. The New Zealand-born actor was 31 years old. The English actress was 61. They are my Actor and Actress of the Decade for 1997 to 2006.

Dame Judi Dench is the Actress of the Decade.

Top Actresses of the Decade
1997 to 2006
Actress Lead Actress Nominations Lead Actress Wins Supporting Actress Nominations Supporting Actress Wins Total Academy Award Points
Judi Dench 4 0 2 1 15
Hilary Swank 2 2 0 0 12
Meryl Streep 3 0 1 0 10
Kate Winslet 3 0 1 0 10
Nicole Kidman 2 1 0 0 9
Charlize Theron 2 1 0 0 9
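The post doesn’t spell out how Total Academy Award Points are computed, but the totals in this and the later decade tables are consistent with a simple weighting: 3 points per Lead nomination, 3 more per Lead win, 1 per Supporting nomination, and 1 more per Supporting win. A sketch under that assumption:

```python
# Hedged reconstruction of the Total Academy Award Points column. The weights
# (lead nomination 3, lead win +3, supporting nomination 1, supporting win +1)
# are inferred from the tables in these posts, not stated by the author.
def award_points(lead_noms: int, lead_wins: int,
                 supp_noms: int, supp_wins: int) -> int:
    return 3 * lead_noms + 3 * lead_wins + supp_noms + supp_wins

print(award_points(4, 0, 2, 1))  # Judi Dench   -> 15
print(award_points(2, 2, 0, 0))  # Hilary Swank -> 12
print(award_points(3, 0, 1, 0))  # Meryl Streep -> 10
```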

It is remarkable for a woman to become a Hollywood star in her sixties. As I pointed out in a previous post, good roles for female actors peak between ages 22 and 31. Judi Dench has turned that statistic on its head. From Mrs. Brown at age 63 to, most recently, Philomena at age 79, Judi Dench has been nominated for an Academy Award seven times. She won Best Supporting Actress for Shakespeare in Love, a Best Picture winner. While Judi Dench may have been fairly anonymous to United States audiences until the mid-90’s, she was not anonymous across the pond in Great Britain. She was a member of the Royal Shakespeare Company and is one of the most decorated actors in British theater history. She is also a ten-time BAFTA winner, the BAFTAs being the British equivalent of the Academy Awards. So, Judi Dench did not just show up in the 90’s; she was always great.

The Actor of the Decade goes to Russell Crowe, beating out Sean Penn in a tie-breaker.

Top Actors of the Decade
1997 to 2006
Actor Lead Actor Nominations Lead Actor Wins Supporting Actor Nominations Supporting Actor Wins Total Academy Award Points
Sean Penn 3 1 0 0 12
Russell Crowe 3 1 0 0 12
Jack Nicholson 2 1 0 0 9
Denzel Washington 2 1 0 0 9
Jamie Foxx 1 1 1 0 7
Tie Breakers for Top Actor of the Decade
Avg IMDB & Rotten Tomatoes Ratings for Nominated Movies
Released from 1997 to 2006
Actor IMDB Avg Rating # of Votes Rotten Tomatoes % Fresh How Fresh? # of Critics Reviews
Russell Crowe 8.3    1,798,645 81% Certified Fresh 522
Sean Penn 7.9       500,465 67% Fresh 398

Russell Crowe’s only three nominations in his career so far came in three consecutive years, 1999 to 2001. He won for Gladiator, which was released in 2000.

If you were to read critics’ reviews of the 2012 Best Picture nominee Les Miserables, a common criticism of the movie is that Russell Crowe, in the role of Javert, wasn’t a very good singer. The irony in that criticism is that Russell Crowe was the lead singer for a moderately successful rock band called 30 Odd Foot of Grunts, also known as TOFOG. During a U.S. concert tour, there were nights when a ticket to see TOFOG might command as much as $500 on eBay. In 2001, Crowe and his band performed on the Tonight Show with Jay Leno. If you are interested, you can download TOFOG songs from iTunes.

The next Actors of the Decade post will be for the current decade. The last nominations to be considered were announced two days ago. The winners will be announced on February 26th. My announcement of the decade winners will be in early March. Who knows, there may be another story as remarkable as Dame Judi’s.

For 1987 to 1996, the Actress of the Decade Comes Down to a Coin Toss?

Three months ago I began a series of articles on the best actors and actresses of each of the nine decades of Oscar. I was satisfied with the approach I was taking until…this month. My scoring system works great when the results come out like the 1987 to 1996 Actor of the Decade.

Top Actors of the Decade
1987 to 1996
Actor Lead Actor Nominations Lead Actor Wins Supporting Actor Nominations Supporting Actor Wins Total Academy Award Points
Tom Hanks 3 2 0 0 15
Anthony Hopkins 3 1 0 0 12
Robin Williams 3 0 0 0 9
Daniel Day Lewis 2 1 0 0 9
Al Pacino 1 1 2 0 8

Clearly, Tom Hanks deserves that honor since he won Best Actor twice and Anthony Hopkins won only once. Both were nominated three times.

Now, let’s look at the Actresses of the decade.

Top Actresses of the Decade
1987 to 1996
Actress Lead Actress Nominations Lead Actress Wins Supporting Actress Nominations Supporting Actress Wins Total Academy Award Points
Susan Sarandon 4 1 0 0 15
Jodie Foster 3 2 0 0 15
Emma Thompson 3 1 1 0 13
Meryl Streep 4 0 0 0 12
Holly Hunter 2 1 1 0 10

It’s a tie…and it’s kind of a mess. Including Supporting Actress nominations, Susan Sarandon, Meryl Streep, and Emma Thompson all have one more nomination than Jodie Foster. Because Jodie Foster won twice, she passes everyone except Susan Sarandon. The two actresses tie because my scoring system values a Lead Actress win twice as much as a nomination. Previously I’ve handled ties by letting IMDB and Rotten Tomatoes results for nominated movies act as a tie breaker. In this case, it’s inconclusive.

Tie Breakers for Top Actresses of the Decade
Avg IMDB & Rotten Tomatoes Ratings for Nominated Movies
Released from 1987 to 1996
Actress IMDB Avg Rating # of Votes Rotten Tomatoes % Fresh How Fresh? # of Critics Reviews
Susan Sarandon 7.3 242,422 88% Certified Fresh 191
Jodie Foster 8.5 971,401 84% Certified Fresh 125

The critics like Susan Sarandon’s movies more, but Jodie Foster rides The Silence of the Lambs to a decisive IMDB nod.

In trying to decipher an advantage in these tie-breaker results, I reached a very different conclusion: they’re probably not that relevant. Critics and viewers may like a movie because of an actor’s performance, or they may like it for an entirely different reason. It isn’t like Oscar voting, which is focused solely on the performance of a single actor. It would be better to use Golden Globe or Screen Actors Guild results as tie breakers or supplements to the scoring system.

And, is an Oscar win twice as valuable an indicator of greatness as an Oscar nomination? No, it’s even more valuable.

For Best Actress in a Leading Role
Number of Actresses Who Have: Count % of Total Nominated
Been Nominated 219
Been Nominated More than Once 85 38.8%
Won 72 32.9%
Won More Than Once 13 5.9%

It is easier to be nominated twice than it is to win once (85 multiple nominees versus 72 winners). And it has been more than five times as hard to win twice as it is to be nominated twice (13 versus 85).

I’ve got to rework my scoring system. For now, with only two decades left to consider, we’ll keep it as it is. For Actress of this decade, it is a coin toss with a coin weighted towards Jodie Foster and her two wins.
