Will I "Really Like" this Movie?

Navigating Movie Website Ratings to Select More Enjoyable Movies


The Art of Selecting “Really Like” Movies: Oscar Provides a Helping Hand

Sunday is Oscar night!! From my perspective, the night is a little bittersweet. The movies that have been nominated offer up “really like” prospects to watch in the coming months. That’s a good thing. Oscar night, though, also signals the end of the best time of the year for new releases. Between now and November, there won’t be much more than a handful of new Oscar-worthy movies released to the public. That’s a bad thing. There is only a 35.8% chance I will “really like” a movie that doesn’t earn a single Academy Award nomination. On the other hand, even a single minor nomination increases the “really like” probability to 56%. If a movie wins one of the major awards (Best Picture, Director, Actor, Actress, Screenplay), the probability increases to 69.7%.

At the end of last week’s post, I expressed a desire to come up with a “really like” movie indicator that was independent of the website data driven indicators. The statistical significance of Academy Award performance would seem to provide the perfect solution. All movies released over the past 90 years have been considered for Oscar nominations. A movie released in 1936 has statistical equivalence to a movie released in 2016 in terms of Academy Award performance.

By using the Total # of Ratings Quintiles introduced last week, credibility weights can be assigned to each quintile to allocate between the website data driven probabilities and the Oscar performance probabilities. These ten movies, none seen in the last 15 years, illustrate how the allocation works.

My Top Ten Seen Before Movie Prospects
Not Seen in Last 15 Years

| Movie Title | Total Ratings Quintile | Website Driven Probability | Oscar Driven Probability | Net “Really Like” Probability |
|---|---|---|---|---|
| Deer Hunter, The | 4 | 97.1% | 73.8% | 88.5% |
| Color Purple, The | 4 | 97.9% | 69.3% | 87.4% |
| Born on the Fourth of July | 4 | 94.0% | 73.8% | 86.6% |
| Out of Africa | 4 | 94.0% | 73.8% | 86.6% |
| My Left Foot | 3 | 94.0% | 73.8% | 83.9% |
| Coal Miner’s Daughter | 3 | 97.9% | 69.3% | 83.6% |
| Love Story | 3 | 92.7% | 72.4% | 82.6% |
| Fight Club | 5 | 94.0% | 55.4% | 81.9% |
| Tender Mercies | 2 | 94.0% | 73.8% | 81.2% |
| Shine | 3 | 88.2% | 73.8% | 81.0% |

The highly credible website data in Quintiles 4 and 5 weights the Net Probability closer to the Website Driven Probability. The Quintile 3 movies are weighted 50/50, so the resulting Net Probability lands at the midpoint between the Website Driven Probability and the Oscar Driven Probability. The lone Quintile 2 movie, Tender Mercies, with a less credible website driven result, tilts closer to the Oscar Driven Probability.
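In code, this allocation amounts to a credibility-weighted blend of the two probabilities. This is a minimal sketch: the per-quintile weights here are my own illustrative assumptions, except the 50/50 weight for Quintile 3, which matches the table above.

```python
# Credibility weight given to the website-driven probability in each
# quintile. These values are illustrative guesses; only the 50/50
# Quintile 3 weight is confirmed by the table above.
CREDIBILITY_WEIGHT = {1: 0.10, 2: 0.35, 3: 0.50, 4: 0.65, 5: 0.70}

def net_probability(quintile, website_prob, oscar_prob):
    """Blend the website-driven and Oscar-driven probabilities.

    The weight on the website-driven probability rises with the
    quintile, i.e., with the credibility of the ratings data.
    """
    w = CREDIBILITY_WEIGHT[quintile]
    return w * website_prob + (1 - w) * oscar_prob

# Quintile 3 example from the table (My Left Foot): the 50/50 blend
# lands exactly at the midpoint.
print(round(net_probability(3, 94.0, 73.8), 1))  # 83.9
```

The higher the quintile, the more the blend trusts the website data; in Quintile 1 the Oscar-driven probability dominates, which is exactly the behavior the tables show.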

The concern I raised last week about the “really like” viability of older movies I’ve never seen before goes away with this change. Take a look at my revised older movie top ten now.

My Top Ten Never Seen Movie Prospects
Never Seen Movies = > Release Date + 6 Months

| Movie Title | Total Ratings Quintile | Website Driven Probability | Oscar Driven Probability | Net “Really Like” Probability |
|---|---|---|---|---|
| Yearling, The | 1 | 42.1% | 73.8% | 71.4% |
| More the Merrier, The | 1 | 26.9% | 73.8% | 70.2% |
| 12 Angry Men (1997) | 1 | 42.1% | 69.3% | 67.2% |
| Lili | 1 | 26.9% | 69.3% | 66.0% |
| Sleuth | 1 | 42.1% | 66.8% | 64.9% |
| Of Mice and Men (1939) | 1 | 42.1% | 66.8% | 64.9% |
| In a Better World | 1 | 41.5% | 66.8% | 64.9% |
| Thousand Clowns, A | 1 | 11.8% | 69.3% | 64.9% |
| Detective Story | 1 | 11.8% | 69.3% | 64.9% |
| Body and Soul | 1 | 11.8% | 69.3% | 64.9% |

Strong Oscar performing movies that I’ve never seen before become viable prospects. Note that all of these movies are Quintile 1 movies. Because of their age and lack of interest from today’s movie website visitors, these movies would never achieve enough credible ratings data to become recommended movies.

There is now an ample supply of viable, Oscar-worthy, “really like” prospects to hold me over until next year’s Oscar season. Enjoy your Oscar night in La La Land.

 


The Art of Selecting “Really Like” Movies: Older Never Before Seen

Last week I stated that I could pretty much identify whether a movie has a good chance of being a “really like” movie within six months of its release. If you need any further evidence, here are my top ten prospects among movies I’ve never seen that are older than six months.

My Top Ten Never Seen Movie Prospects
Never Seen Movies = > Release Date + 6 Months

| Movie Title | Last Data Update | Release Date | Total # of Ratings | “Really Like” Probability |
|---|---|---|---|---|
| Hey, Boo: Harper Lee and ‘To Kill a Mockingbird’ | 2/4/2017 | 5/13/2011 | 97,940 | 51.7% |
| Incendies | 2/4/2017 | 4/22/2011 | 122,038 | 51.7% |
| Conjuring, The | 2/4/2017 | 7/19/2013 | 241,546 | 51.7% |
| Star Trek Beyond | 2/4/2017 | 7/22/2016 | 114,435 | 51.7% |
| Pride | 2/4/2017 | 9/26/2014 | 84,214 | 44.6% |
| Glen Campbell: I’ll Be Me | 2/9/2017 | 10/24/2014 | 105,751 | 44.6% |
| Splendor in the Grass | 2/5/2017 | 10/10/1961 | 246,065 | 42.1% |
| Father of the Bride | 2/5/2017 | 6/16/1950 | 467,569 | 42.1% |
| Imagine: John Lennon | 2/5/2017 | 10/7/1998 | 153,399 | 42.1% |
| Lorenzo’s Oil | 2/5/2017 | 1/29/1993 | 285,981 | 42.1% |

The movies from this group with a high “really like” probability have already been watched. Of the remaining movies, three are 50/50 propositions and the rest have the odds stacked against them. In other words, if I watch all ten movies I probably won’t “really like” half of them; the flip side of that dilemma is that I probably would “really like” the other half. The reality is that I won’t watch any of these ten movies as long as there are movies I’ve already seen with better odds. Is there a way to improve the odds for any of these ten movies?

You’ll note that all ten movies have probabilities based on fewer than 500,000 ratings. Will some of these movies improve their probabilities as they receive more ratings? Maybe. Maybe not. To explore this possibility, I divided my database into quintiles based on the total number of ratings. The quintile with the most ratings, the most credible quintile, defines the optimal performance of my algorithm.
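The quintile assignment itself is simple arithmetic. Here is a minimal sketch using the cut points from the tables below; how ties exactly on a boundary are handled is my assumption, not something stated in the post.

```python
# Assign a movie a quintile (1 = fewest ratings, 5 = most) from its
# total number of ratings across all sites.
def quintile_of(n_ratings, cut_points):
    """Return 1-5: one plus the number of cut points exceeded."""
    return 1 + sum(n_ratings > c for c in cut_points)

# Cut points taken from the quintile tables in this post.
cuts = [179_456, 516_040, 1_197_745, 2_872_053]

print(quintile_of(467_569, cuts))    # Father of the Bride -> quintile 2
print(quintile_of(3_000_000, cuts))  # above the top cut -> quintile 5
```

With roughly 2,000 movies in the database, each quintile holds about 398-400 movies, which matches the “All Movies in Range” counts below.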

Quintile 5
# Ratings Range: > 2,872,053

| | # of Movies | # “Really Like” Movies | % “Really Like” Movies | Proj. Avg. Rating All Sites | My Avg. Rating | My Rating to Proj. Rating Diff. |
|---|---|---|---|---|---|---|
| Movies Seen More than Once | 152 | 134 | 88% | 8.6 | 8.5 | -0.1 |
| Movies Seen Once | 246 | 119 | 48% | 7.5 | 6.9 | -0.7 |
| All Movies in Range | 398 | 253 | 64% | 7.9 | 7.5 | |

All of the movies in Quintile 5 have more than 2,872,053 ratings. My selection of movies that I had seen before is clearly better than my selection of movies I watched for the first time. This better selection is because the algorithm results led me to the better movies and my memory did some additional weeding. My takeaway is that, when considering movies I’ve never seen before, put my greatest trust in the algorithm if the movie falls in this quintile.

Let’s look at the next four quintiles.

Quintile 4
# Ratings Range: 1,197,745 to 2,872,053

| | # of Movies | # “Really Like” Movies | % “Really Like” Movies | Proj. Avg. Rating All Sites | My Avg. Rating | My Rating to Proj. Rating Diff. |
|---|---|---|---|---|---|---|
| Movies Seen More than Once | 107 | 85 | 79% | 8.3 | 8.3 | 0.1 |
| Movies Seen Once | 291 | 100 | 34% | 7.1 | 6.4 | -0.7 |
| All Movies in Range | 398 | 185 | 46% | 7.4 | 6.9 | |
Quintile 3
# Ratings Range: 516,040 to 1,197,745

| | # of Movies | # “Really Like” Movies | % “Really Like” Movies | Proj. Avg. Rating All Sites | My Avg. Rating | My Rating to Proj. Rating Diff. |
|---|---|---|---|---|---|---|
| Movies Seen More than Once | 122 | 93 | 76% | 7.8 | 8.0 | 0.2 |
| Movies Seen Once | 278 | 102 | 37% | 7.1 | 6.6 | -0.6 |
| All Movies in Range | 400 | 195 | 49% | 7.3 | 7.0 | |
Quintile 2
# Ratings Range: 179,456 to 516,040

| | # of Movies | # “Really Like” Movies | % “Really Like” Movies | Proj. Avg. Rating All Sites | My Avg. Rating | My Rating to Proj. Rating Diff. |
|---|---|---|---|---|---|---|
| Movies Seen More than Once | 66 | 46 | 70% | 7.4 | 7.5 | 0.2 |
| Movies Seen Once | 332 | 134 | 40% | 7.0 | 6.4 | -0.6 |
| All Movies in Range | 398 | 180 | 45% | 7.1 | 6.6 | |
Quintile 1
# Ratings Range: < 179,456

| | # of Movies | # “Really Like” Movies | % “Really Like” Movies | Proj. Avg. Rating All Sites | My Avg. Rating | My Rating to Proj. Rating Diff. |
|---|---|---|---|---|---|---|
| Movies Seen More than Once | 43 | 31 | 72% | 7.0 | 7.5 | 0.5 |
| Movies Seen Once | 355 | 136 | 38% | 6.9 | 6.2 | -0.7 |
| All Movies in Range | 398 | 167 | 42% | 6.9 | 6.4 | |

Look at the progression of the algorithm projections as the ratings volume shrinks. The gap between movies seen more than once and those seen only once narrows as the number of ratings gets smaller. Notice that the difference between my ratings and the projected ratings for Movies Seen Once is fairly constant across all quintiles, either -0.6 or -0.7. For Movies Seen More than Once, though, the difference grows more positive as the number of ratings gets smaller. This suggests that the higher than expected ratings I give Quintile 1 and 2 movies I’ve seen more than once are driven primarily by my memory of the movies rather than by the algorithm.

What does this mean for my top ten never before seen movies listed above? All of the top ten are in Quintile 1 or 2. As they grow into the higher quintiles, some may emerge with higher “really like” probabilities. Certainly Star Trek Beyond, which is only 7 months old, can be expected to grow into the higher quintiles. But what about Splendor in the Grass, which was released in 1961 and, at 55 years old, might not move into Quintile 3 until another 55 years pass?

This suggests that a secondary movie quality indicator is needed, one separate from the movie recommender sites already in use. It sounds like I’ve just added another project to my 2017 “really like” project list.

 

 

The Art of Selecting “Really Like” Movies: New Movies

I watch a lot of movies, a fact that my wife, and occasionally my children, like to remind me of. Unlike the average, non-geeky movie fan, though, I am constantly analyzing the process I go through to determine which movies I watch. I don’t like to watch mediocre, or worse, movies. I’ve pretty much eliminated bad movies from my selections. But every now and then a movie I “like” rather than “really like” will get past my screen.

Over the next three weeks I’ll outline the steps I’m taking this year to improve my “really like” movie odds. Starting this week with New Movies, I’ll lay out a focused strategy for three different types of movie selection decisions.

The most challenging “really like” movie decision I make is which movies that I’ve never seen before are likely to be “really like” movies. There is only a 39.3% chance that watching a movie I’ve never seen before will result in a “really like” experience. My goal is to improve those odds by the end of the year.

The first step I’ve taken is to separate movies I’ve seen before from movies I’ve never seen in establishing my “really like” probabilities. As a frame of reference, there is a 79.5% chance that I will “really like” a movie I’ve seen before. By basing my probabilities for movies I’ve never seen on the 39.3% figure, I have created a tighter screen for those movies. This should result in me watching fewer never-before-seen movies than I’ve typically watched in previous years. Of the 20 movies I’ve watched so far this year, only two were never-before-seen movies.

The challenge in selecting never-before-seen movies is that, because I’ve watched close to 2,000 movies over the last 15 years, I’ve already watched the “cream of the crop” from those years. From 2006 to 2015, there were 331 movies that I rated as “really like” movies; that’s 33 movies a year, or fewer than 3 a month. Last year I watched 109 movies that I had never seen before. So, except for the 33 new movies that came out last year that, statistically, might be “really like” movies, I watched 76 movies that didn’t have a great chance of being “really like” movies.

Logically, the probability of selecting a “really like” movie that I’ve never seen before should be highest for new releases. I just haven’t seen that many of them. I’ve only seen 6 movies that were released in the last six months and I “really liked” 5 of them. If, on average, there are 33 “really like” movies released each year, then, statistically, there should be a dozen “really like” movies released in the last six months that I haven’t seen yet. I just have to discover them. Here is my list of the top ten new movie prospects that I haven’t seen yet.

My Top Ten New Movie Prospects
New Movies = < Release Date + 6 Months

| Movie Title | Release Date | Last Data Update | “Really Like” Probability |
|---|---|---|---|
| Hacksaw Ridge | 11/4/2016 | 2/4/2017 | 94.9% |
| Arrival | 11/11/2016 | 2/4/2017 | 94.9% |
| Doctor Strange | 11/4/2016 | 2/6/2017 | 78.9% |
| Hidden Figures | 1/6/2017 | 2/4/2017 | 78.7% |
| Beatles, The: Eight Days a Week | 9/16/2016 | 2/4/2017 | 78.7% |
| 13th | 10/7/2016 | 2/4/2017 | 78.7% |
| Before the Flood | 10/30/2016 | 2/4/2017 | 51.7% |
| Fantastic Beasts and Where to Find Them | 11/18/2016 | 2/4/2017 | 51.7% |
| Moana | 11/23/2016 | 2/4/2017 | 51.7% |
| Deepwater Horizon | 9/30/2016 | 2/4/2017 | 45.4% |
| Fences | 12/25/2016 | 2/4/2017 | 45.4% |

Based on my own experience, I believe you can identify most of the new movies that will be “really like” movies within 6 months of their release, which is how I’ve defined “new” for this list. I’m going to test this theory this year.

In case you are interested, here is the ratings data driving the probabilities.

My Top Ten New Movie Prospects
Movie Site Ratings Breakdown

| Movie Title | # of Ratings All Sites | IMDB Age 45+ * | Rotten Tomatoes ** | Criticker * | Movielens * | Netflix * |
|---|---|---|---|---|---|---|
| Hacksaw Ridge | 9,543 | 8.2 | CF 86% | 8.3 | 8.3 | 8.6 |
| Arrival | 24,048 | 7.7 | CF 94% | 8.8 | 8.1 | 9.0 |
| Doctor Strange | 16,844 | 7.7 | CF 90% | 8.2 | 8.3 | 7.8 |
| Hidden Figures | 7,258 | 8.2 | CF 92% | 7.7 | 7.3 | 8.2 |
| Beatles, The: Eight Days a Week | 1,689 | 8.2 | CF 95% | 8.0 | 7.3 | 8.0 |
| 13th | 295,462 | 8.1 | CF 97% | 8.3 | 7.5 | 8.0 |
| Before the Flood | 1,073 | 7.8 | F 70% | 7.6 | 8.2 | 7.8 |
| Fantastic Beasts and Where to Find Them | 14,307 | 7.5 | CF 73% | 7.3 | 6.9 | 7.6 |
| Moana | 5,967 | 7.7 | CF 95% | 8.4 | 8.0 | 7.0 |
| Deepwater Horizon | 40,866 | 7.1 | CF 83% | 7.8 | 7.6 | 7.6 |
| Fences | 4,418 | 7.6 | CF 95% | 7.7 | 7.1 | 7.2 |

* All ratings except Rotten Tomatoes calibrated to a 10.0 scale
** CF = Certified Fresh, F = Fresh

Two movies, Hacksaw Ridge and Arrival, are already probable “really like” movies and should be selected to watch when available. The # of Ratings All Sites is a key column. The Movielens and Netflix ratings need volume before they can credibly reach their true level. Until there is a credible amount of data, the rating you get is closer to what an average movie would get. A movie like Fences, at 4,418 ratings, hasn’t reached the critical mass needed to migrate to the higher ratings I would expect it to reach. Deepwater Horizon, on the other hand, with 40,866 ratings, has reached a fairly credible level and may not improve upon its current probability.
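One way to picture this volume effect is as a credibility shrinkage toward an average rating: with few ratings, a site’s displayed rating sits close to the mean and drifts toward its true level as ratings accumulate. This is purely an illustrative sketch; the `site_mean` and `k` values are assumptions, not numbers from my model.

```python
# Hypothetical credibility shrinkage: a movie's displayed rating is a
# blend of its "true" average and the site-wide mean, weighted by how
# many ratings it has. site_mean and k are assumed values.
def credible_rating(true_avg, n_ratings, site_mean=7.0, k=25_000):
    weight = n_ratings / (n_ratings + k)
    return weight * true_avg + (1 - weight) * site_mean

# A Fences-sized sample is pulled strongly toward the site mean...
print(round(credible_rating(8.0, 4_418), 2))
# ...while a Deepwater Horizon-sized sample sits much closer to 8.0.
print(round(credible_rating(8.0, 40_866), 2))
```

Under this sketch, a low-volume movie’s probability can rise substantially as ratings arrive, while a high-volume movie’s rating, and hence its probability, is already close to settled.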

I’m replacing my monthly forecast on the sidebar of this website with the top ten new movie prospects exhibit displayed above. I think it is a better reflection of the movies that have the best chance of being “really like” movies. Feel free to share any comments you might have.

 

Blogging Shouldn’t Be Like Groundhog Day

It’s Groundhog Day!!! As I was preparing my post for this week, I discovered I was going through the motions. You see, the first post of the month is reserved for my movie picks for the month and there weren’t any February movies I was particularly excited about. Except for Oscar nominees moving to a wide audience in February (e.g. the 2014 movie Still Alice went into wide release on 2/20/2015), February is a wasteland for new releases, filled with cheesy romances and awards wanna-be’s that couldn’t make the cut. Why should I write about that? Sometimes a good idea gets stale, especially for the writer. If repeated too often, it becomes like Groundhog Day.

My series of articles on the actors of the decade has produced some interesting insights into overlooked male and female actors. But, when I obligate myself to two such articles a month, the interest of both the reader and the writer begins to wane. It becomes Groundhog Day.

This is my 76th, and shortest, article since I began writing this blog last February 12th. I still enjoy analyzing movie data. When I go to bed at night, I look forward to discovering something new the next day. I enjoy sharing in this blog those discoveries that induce me to say “Aha”. Most of all, I enjoy experiencing “really like” movies. Maybe I should connect more with those “Aha” moments.

Next week, as I finish my first year as a blogger, I’ll remember that blogging shouldn’t be like Groundhog Day. Instead, I’m shooting for “fresh as the first day of spring”.

 
