Will I "Really Like" this Movie?

Navigating Movie Website Ratings to Select More Enjoyable Movies

Archive for the month “January, 2017”

The Eighth Decade of Oscar Belonged to the Remarkable Dame Judi

In 1995, two actors eased their way into the consciousness of United States moviegoers after learning their craft across the oceans in Australia and England. The actor made an impression in a box office loser, The Quick and the Dead. The actress broke down the gender barrier in the testosterone-laden James Bond franchise, becoming the first woman to play M in GoldenEye. The New Zealand-born actor was 31 years old. The English actress was 61. They are my Actor and Actress of the Decade for 1997 to 2006.

Dame Judi Dench is the Actress of the Decade.

Top Actresses of the Decade
1997 to 2006
Actress Lead Actress Nominations Lead Actress Wins Supporting Actress Nominations Supporting Actress Wins Total Academy Award Points
Judi Dench 4 0 2 1 15
Hilary Swank 2 2 0 0 12
Meryl Streep 3 0 1 0 10
Kate Winslet 3 0 1 0 10
Nicole Kidman 2 1 0 0 9
Charlize Theron 2 1 0 0 9

It is remarkable for a woman to become a Hollywood star in her sixties. As I pointed out in a previous post, good roles for female actors peak between ages 22 and 31. Judi Dench has turned that statistic on its head. Beginning with Mrs. Brown at age 63 and continuing through the most recent, Philomena, at age 79, Judi Dench has been nominated for an Academy Award seven times. She won Best Supporting Actress for Shakespeare in Love, a Best Picture winner. While Judi Dench may have been fairly anonymous to United States audiences until the mid-'90s, she was not anonymous across the pond in Great Britain. She was a member of the Royal Shakespeare Company and is one of the most decorated actors in British theater history. She is also a ten-time winner of the BAFTA, the British equivalent of the Academy Award. So Judi Dench did not just show up in the '90s; she was always great.

The Actor of the Decade honor goes to Russell Crowe, who beat out Sean Penn in a tie-breaker.

Top Actors of the Decade
1997 to 2006
Actor Lead Actor Nominations Lead Actor Wins Supporting Actor Nominations Supporting Actor Wins Total Academy Award Points
Sean Penn 3 1 0 0 12
Russell Crowe 3 1 0 0 12
Jack Nicholson 2 1 0 0 9
Denzel Washington 2 1 0 0 9
Jamie Foxx 1 1 1 0 7
Tie Breakers for Top Actor of the Decade
Avg IMDB & Rotten Tomatoes Ratings for Nominated Movies
Released from 1997 to 2006
Actor IMDB Avg Rating # of Votes Rotten Tomatoes % Fresh How Fresh? # of Critics Reviews
Russell Crowe 8.3 1,798,645 81% Certified Fresh 522
Sean Penn 7.9 500,465 67% Fresh 398

Russell Crowe’s only three career nominations to date came in three consecutive years, 1999 to 2001. He won for Gladiator, which was released in 2000.

If you were to read critics’ reviews of the 2012 Best Picture nominee Les Miserables, a common criticism of the movie is that Russell Crowe, in the role of Javert, wasn’t a very good singer. The irony in that criticism is that Russell Crowe was the lead singer for a moderately successful rock band called 30 Odd Foot of Grunts, also known as TOFOG. During a US concert tour, there were nights when a ticket to see TOFOG might command as much as $500 on eBay. In 2001, Crowe and his band performed on The Tonight Show with Jay Leno. If you are interested, you can download TOFOG’s songs from iTunes.

The next Actors of the Decade post will be for the current decade. The last nominations to be considered were announced two days ago. The winners will be announced on February 26th. My announcement of the decade winners will be in early March. Who knows, there may be another story as remarkable as Dame Judi’s.


For 1987 to 1996, the Actress of the Decade Comes Down to a Coin Toss?

Three months ago I began a series of articles on the best actors and actresses of each of the nine decades of Oscar. I was satisfied with the approach I was taking until…this month. My scoring system works great when the results come out like the 1987 to 1996 Actor of the Decade.

Top Actors of the Decade
1987 to 1996
Actor Lead Actor Nominations Lead Actor Wins Supporting Actor Nominations Supporting Actor Wins Total Academy Award Points
Tom Hanks 3 2 0 0 15
Anthony Hopkins 3 1 0 0 12
Robin Williams 3 0 0 0 9
Daniel Day Lewis 2 1 0 0 9
Al Pacino 1 1 2 0 8

Clearly, Tom Hanks deserves that honor since he won Best Actor twice and Anthony Hopkins won only once. Both were nominated three times.

Now, let’s look at the Actresses of the decade.

Top Actresses of the Decade
1987 to 1996
Actress Lead Actress Nominations Lead Actress Wins Supporting Actress Nominations Supporting Actress Wins Total Academy Award Points
Susan Sarandon 4 1 0 0 15
Jodie Foster 3 2 0 0 15
Emma Thompson 3 1 1 0 13
Meryl Streep 4 0 0 0 12
Holly Hunter 2 1 1 0 10

It’s a tie…and it’s kind of a mess. Including Supporting Actress nominations, Susan Sarandon, Meryl Streep, and Emma Thompson all have one more nomination than Jodie Foster. Because Jodie Foster won twice, she passes everyone except Susan Sarandon. The two actresses tie because my scoring system values a Lead Actress win twice as much as a nomination. Previously I’ve handled ties by letting IMDB and Rotten Tomatoes results for nominated movies act as a tie breaker. In this case, it’s inconclusive.
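Though the point weights are never spelled out explicitly, they can be inferred from the published totals: 3 points per Lead nomination plus 3 more for a Lead win, and 1 point per Supporting nomination plus 1 more for a Supporting win, so a win is worth twice a nomination in either category. A minimal sketch, assuming those inferred weights:

```python
# Academy Award point scoring as implied by the decade tables.
# Assumed weights (inferred, not stated in the post): Lead nomination = 3,
# Lead win = +3; Supporting nomination = 1, Supporting win = +1.

def award_points(lead_noms, lead_wins, supp_noms, supp_wins):
    """Total Academy Award Points for one performer over a decade."""
    return 3 * lead_noms + 3 * lead_wins + 1 * supp_noms + 1 * supp_wins

# The tie at the top of the 1987 to 1996 actress table:
print(award_points(4, 1, 0, 0))  # Susan Sarandon -> 15
print(award_points(3, 2, 0, 0))  # Jodie Foster  -> 15
```

The same weights reproduce every row above, including the mixed lead/supporting totals such as Judi Dench’s 15 and Al Pacino’s 8.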

Tie Breakers for Top Actresses of the Decade
Avg IMDB & Rotten Tomatoes Ratings for Nominated Movies
Released from 1987 to 1996
Actress IMDB Avg Rating # of Votes Rotten Tomatoes % Fresh How Fresh? # of Critics Reviews
Susan Sarandon 7.3 242,422 88% Certified Fresh 191
Jodie Foster 8.5 971,401 84% Certified Fresh 125

The critics like Susan Sarandon’s movies more, but Jodie Foster rides Silence of the Lambs to a decisive IMDB nod.

In trying to decipher an advantage in these tie-breaker results, I reached a very different conclusion: they’re probably not that relevant. Critics and viewers may like a movie because of an actor’s performance, or they may like it for an entirely different reason. It isn’t like Oscar voting, which is focused solely on the performance of a single actor. It would be better to use Golden Globe or Screen Actors Guild results as tie-breakers or supplements to the scoring system.

And, is an Oscar win twice as valuable an indicator of greatness as an Oscar nomination? No, it’s even more valuable.

For Best Actress in a Leading Role
Number of Actresses Who Have:        Count   % of Total Nominated
Been Nominated                         219                 100.0%
Been Nominated More than Once           85                  38.8%
Won                                     72                  32.9%
Won More Than Once                      13                   5.9%

It is easier to be nominated twice than it is to win once. And, it has been more than five times as hard to win twice as it is to be nominated twice.
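The percentages above follow directly from the counts, with each measured against the 219 actresses ever nominated:

```python
# Each percentage in the table is the count divided by the 219 actresses
# who have ever been nominated for Best Actress in a Leading Role.
nominated = 219

for count in (85, 72, 13):
    print(round(count / nominated * 100, 1))
# 85 -> 38.8, 72 -> 32.9, 13 -> 5.9
```

The 38.8% versus 5.9% comparison is the source of the “more than five times as hard” claim: the ratio is roughly 6.6.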

I’ve got to rework my scoring system. For now, with only two decades left to consider, we’ll keep it as it is. For Actress of this decade, it is a coin toss with a coin weighted towards Jodie Foster and her two wins.

Create, Test, Analyze, and Recreate

Apple’s iPhone just turned 10 years old. Why has it been such a successful product? It might be because the product hasn’t stayed static. The latest version is the iPhone 7 Plus. As a product, it is constantly reinventing itself to improve its utility. It is always fresh. Apple, like most producers of successful products, probably follows a process whereby they:

  1. Create.
  2. Test what they’ve created.
  3. Analyze the results of their tests.
  4. Recreate.

They never dust off their hands and say, “My job is done.”

Now, I won’t be so presumptuous as to claim to have created something as revolutionary as the iPhone. But regardless of how small your creation, its success requires you to follow the same steps outlined above.

My post last week outlined the testing process I put my algorithm through each year. This week I will provide some analysis and take some steps toward a recreation. The results of my test were that using my “really like” movie selection system significantly improved the overall quality of the movies I watch. On the negative side, the test showed that once you pass some optimal number of movies in a year, the additional movies you watch are of diminishing quality as the remaining pool of “really like” movies shrinks.

A deeper dive into these results begins to clarify the key issues. Separating movies that I’ve seen at least twice from those that were new to me is revealing.

                                  Seen More than Once          Seen Once
                                  1999 to 2001   2014 to 2016  1999 to 2001   2014 to 2016
# of Movies                                 43            168           231            158
% of Total Movies in Timeframe           15.7%          51.5%         84.3%          48.5%
IMDB Avg Rating                            7.6            7.6           6.9            7.5
My Avg Rating                              8.0            8.4           6.1            7.7
% Difference                              5.2%          10.1%        -12.0%           2.0%

There is so much interesting data here I don’t know where to start. Let’s start with the notion that the best opportunity for a “really like” movie experience is the “really like” movie you’ve already seen. For movies seen more than once, My Avg Rating outperforms the IMDB Avg Rating in both timeframes (by 5.2% and 10.1%). The fact that, from 1999 to 2001, I was able to watch movies that I “really liked” more than the average IMDB voter did, without the assistance of any movie recommender website, suggests that memory of a “really like” movie is a pretty reliable “really like” indicator. The 2014 to 2016 results suggest that my “really like” system can help prioritize the movies that memory tells you that you will “really like” seeing again.

The “Seen Once” columns clearly display the advantages of the “really like” movie selection system. It’s for the movies you’ve never seen that movie recommender websites are worth their weight in gold. With limited availability of movie websites from 1999 to 2001, my selection of new movies underperformed the IMDB Avg Rating by 12%, and those movies represented 84.3% of all the movies I watched during that timeframe. From 2014 to 2016, my “really like” movie selection system recognized that there is a limited supply of new “really like” movies. As a result, fewer than half of the movies I watched from 2014 through 2016 were movies I’d never seen before. Of the new movies I did watch, there was a significant improvement over the 1999 to 2001 timeframe in terms of quality, as represented by the IMDB Avg Rating, and in my enjoyment of the movies, as represented by My Avg Rating.

Still, while the 2014 to 2016 new movies were significantly better than the new movies watched from 1999 to 2001, is it unrealistic to expect My Ratings to be better than IMDB by more than 2%? To gain some perspective on this question, I profiled the new movies I “really liked” in the 2014 to 2016 timeframe and contrasted them with the movies I didn’t “really like”.

Movies Seen Once
2014 to 2016
                                  “Really Liked”   Didn’t “Really Like”
# of Movies                                  116                     42
% of Total Movies in Timeframe             73.4%                  26.6%
IMDB Avg Rating                              7.6                    7.5
My Avg Rating                                8.1                    6.3
“Really Like” Probability                  82.8%                  80.7%

The probability results for these movies suggest that I should “really like” between 80.7% and 82.8% of the movies in the sample. I actually “really liked” 73.4%, not too far off the probability expectations. The IMDB Avg Rating for the movies I didn’t “really like” is only a tick lower than the rating for the “really liked” movies. Similarly, the “Really Like” Probability is only a tick lower for the Didn’t “Really Like” movies. My conclusion is that there is some, but not much, opportunity to improve selection of new movies through a more disciplined approach. The better approach would be to favor “really like” movies that I’ve seen before and give new movies more time for their data to mature.

Based on my analysis, here is my action plan:

  1. Set separate probability standards for movies I’ve seen before and movies I’ve never seen.
  2. Incorporate the probability revisions into the algorithm.
  3. Set a minimum probability threshold for movies I’ve never seen before.
  4. When the supply of “really like” movies gets thin, only stretch for movies I’ve already seen and memory tells me I “really liked”.
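Steps 1 and 3 of the plan can be sketched as a simple rule. The threshold values below are purely illustrative assumptions; the actual standards are still to be set.

```python
# Hypothetical sketch of the revised selection rule: separate probability
# standards for movies seen before versus movies never seen.
# Both threshold values are assumptions for illustration only.

SEEN_BEFORE_THRESHOLD = 0.70   # assumed: lower bar for remembered "really like" movies
NEVER_SEEN_THRESHOLD = 0.80    # assumed: stricter minimum for new movies

def should_watch(really_like_prob, seen_before):
    """Apply the appropriate probability standard to a candidate movie."""
    threshold = SEEN_BEFORE_THRESHOLD if seen_before else NEVER_SEEN_THRESHOLD
    return really_like_prob >= threshold

# A new movie needs the higher probability to make the cut:
print(should_watch(0.75, seen_before=True))   # True
print(should_watch(0.75, seen_before=False))  # False
```

The asymmetry encodes step 4 as well: when the supply of candidates runs thin, only the already-seen pool clears its lower bar.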

Create, test, analyze and recreate.

 

A New Year’s Ritual: Looking Back to Help Move Forward

I’m a big fan of the New Year’s ritual of taking stock of where you’ve been and resolving to make some adjustments to make the coming year better. This New Year marks the completion of my third year of working with an algorithm to help me select better movies to watch. Since establishing my database, I’ve used each New Year to take two snapshots of my viewing habits.

The first snapshot is of the movies that have met the fifteen-year limit I’ve imposed on my database. This year it’s 2001 that is frozen in time. I became a user of IMDB in June 2000. That makes 2001 the first full year in which I used a data-based resource to supplement my movie selection process, which at the time was still primarily guided by the weekly recommendations of Siskel & Ebert.

The second snapshot is of the data supporting the movie choices I made in 2016. By looking at a comparison of 2001 with 2016, I can gain an appreciation of how far I’ve come in effectively selecting movies. Since this is the third set of snapshots I’ve taken I can also compare 1999 with 2014 and 2000 with 2015, and all years with each other.

Here are the questions I had and the results of the analysis. In some instances it suggests additional targets of research.

Am I more effective now than I was before in selecting movies to watch?

There is no question that the creation of online movie recommender websites, and the systematic use of them to select movies, improves overall selection. The comparison below of the two snapshots for each of the last three years demonstrates significant improvement.

Year          # of Movies   My Avg Rating   Year          # of Movies   My Avg Rating   % Rating Diff.
2001                  109             6.0   2016                  144             7.4            23.3%
2000                  106             6.9   2015                  106             8.4            21.7%
1999                   59             6.4   2014                   76             8.8            37.5%
1999 – 2001           274             6.4   2014 – 2016           326             8.1            25.1%
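The % Rating Diff. column is just the relative improvement of the later year’s average over the earlier year’s:

```python
# Percentage rating difference between a paired early year and late year,
# as shown in the comparison table above.

def pct_rating_diff(old_avg, new_avg):
    """Relative improvement of new_avg over old_avg, in percent."""
    return (new_avg - old_avg) / old_avg * 100

print(round(pct_rating_diff(6.0, 7.4), 1))  # 2001 vs 2016 -> 23.3
print(round(pct_rating_diff(6.9, 8.4), 1))  # 2000 vs 2015 -> 21.7
print(round(pct_rating_diff(6.4, 8.8), 1))  # 1999 vs 2014 -> 37.5
```

Note that the three-year summary row presumably uses unrounded averages, so recomputing it from the displayed one-decimal figures gives a slightly different value than the published 25.1%.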

One area of concern is a possible pattern in the 2014 to 2016 data, though it could be random: the overall quality of the movies watched appears to diminish as the number of movies watched increases.

Am I more likely to watch movies I “really like”?

Again, the answer is a resounding “Yes”.

# of Movies # “Really Liked” % “Really Liked”
1999 59 25 42.4%
2000 106 50 47.2%
2001 109 40 36.7%
2014 76 76 100.0%
2015 106 91 85.8%
2016 144 100 69.4%

The concern raised about diminishing returns from increasing the number of movies watched is in evidence here as well. In 2014 I “really liked” all 76 movies I watched. Is it worth my time to watch another 30 movies, as I did in 2015, if I will “really like” 15 of them? Maybe. Maybe not. Is it worth my while to watch an additional 68 movies, as I did in 2016, if I will “really like” only 24? Probably not.

How do I know that I am selecting better movies and not just rating them higher?

As a control, I’ve used the IMDB average rating as an objective measure of quality.

        IMDB Avg Rating   My Avg Rating   Difference
1999                7.0             6.4         -0.6
2000                7.1             6.9         -0.2
2001                6.9             6.0         -0.9
2014                7.8             8.8          1.0
2015                7.6             8.4          0.8
2016                7.4             7.4          0.0

The average IMDB voter agrees that the movies I’ve watched from 2014 to 2016 are much better than the movies I watched from 1999 to 2001. What is particularly interesting is that the movies I chose to watch from 1999 to 2001, without the benefit of any website recommending movies I’d personally like, were movies I ended up liking less than the average IMDB voter did. From 2014 to 2016, with the benefit of tools like Netflix, Movielens, and Criticker, I’ve selected movies that I’ve liked better than the average IMDB voter. The 2016 results feed the diminishing-returns narrative, suggesting that the more movies I watch, the more my overall ratings will migrate toward the average.

My 2017 “Really Like” resolution.

My selection algorithm is working effectively. But, the combination of a diminishing number of “really like” movies that I haven’t seen in the last fifteen years, and my desire to grow the size of my database, may be causing me to reach for movies that are less likely to result in a “really like” movie experience. Therefore, I resolve to establish within the next month a minimum standard below which I will not reach.

Now that’s what New Year’s is all about, the promise of an even better “really like” movie year.
