Will I "Really Like" this Movie?

Navigating Movie Website Ratings to Select More Enjoyable Movies


When It Comes to Unique Movies, Don’t Put All of Your Movie Eggs in the Netflix Basket.

It is rare to find a movie that isn’t a sequel, or a remake, or a borrowed plot idea, or a tried and true formula. Many of these movies are entertaining because they feel familiar and remind us of another pleasant movie experience. The movie recommender websites Netflix, Movielens, and Criticker do a terrific job of identifying those movie types that you “really liked” before and surfacing those movies that match the familiar plot lines you’ve enjoyed in the past.

But what about those movies that are truly original? What about those movies that present ideas and plot lines that aren’t in your usual comfort zone? Will these reliable websites surface these unique movies that I might like? Netflix has 20,000+ genre categories that it slots movies into. But what if a movie doesn’t fit neatly into one of those 20,000 categories?

Yesterday I watched a great movie, Being There.

Being There

This 1979 movie, starring Peter Sellers in an Academy Award-nominated performance, is about a simple-minded gardener who never left his employer’s home during the first fifty years of his life. Aside from gardening, the only knowledge he has is what he’s seen on television. After his employer dies, he is forced to leave his home. The movie follows Chance the Gardener as he becomes Chauncey Gardner, economic advisor to the President. It’s not a story of transformation but of perception. The movie is fresh.

My most reliable movie recommenders, Netflix and Movielens, warned me away from this movie. The only reason I added it to my weekly Watch List is because I saw the movie in the theater when it first came out in 1979 and “really liked” it.

Another possible reason Netflix missed on this movie is that I hated Peter Sellers’ other classic, Dr. Strangelove. I rated it 1 out of 5 stars. If Being There is slotted into a Netflix category of Peter Sellers classics, that might explain the mismatch.

Is it impossible, then, to surface creative, original movies that you might like? Not entirely. Criticker predicted I would rate Being There an 86 out of 100. I gave it an 89. The IMDB rating is 8.0 based on over 54,000 votes. Rotten Tomatoes has it at 96% Certified Fresh. This is why I incorporate ratings from five different websites into the “really like” model rather than relying on Netflix alone.

When it comes to unique movies, don’t put all of your movie eggs in the Netflix basket.

 

 


What Am I Actually Going to Watch This Week? Netflix Helps Out with One of My Selections.

The core mission of this blog is to share ideas on how to select movies that we’ll “really like”. At times, I believe, I’ve gotten bogged down in the mechanics of building the “really like” model. I’d like to reorient the dialogue back to the primary mission: which “really like” movies I am going to watch and, more importantly, why.

Each Wednesday I publish the ten movies on my Watch List for the week. These movies usually represent the ten movies with the highest “really like” probability that are available to me to watch on platforms that I’ve already paid for. This includes cable and streaming channels I’m paying for and my Netflix DVD subscription. I rarely use a movie on demand service.

Now, 10 movies are too many, even for the Mad Movie Man, to watch in a week. The ten-movie Watch List instead serves as a menu for the 3 or 4 movies I actually most want to watch during the week. So, how do I select those 3 or 4 movies?

The first and most basic question to answer is who, if anyone, I am watching the movie with. Friday night is usually the night that my wife and I sit down and watch a movie together. The rest of the week I’ll watch two or three movies by myself. So, right from the start, I have to find a movie that my wife and I will both enjoy. This week that movie is Hidden Figures, the 2016 Oscar-nominated film about the role three black female mathematicians played in John Glenn’s orbit of the earth in the early 1960s.

This movie became available to Netflix DVD subscribers on Tuesday, May 9, and I received my Hidden Figures DVD that day. Something I’ve learned over the years is that Netflix ships DVDs on Monday for movies that become available on Tuesday. For this to happen, you have to time the return of your old DVD to arrive on the Saturday or Monday before the Tuesday release. This gives you the best chance to avoid “long wait” queues.

I generally use Netflix DVD to see new movies that I don’t want to wait another 3 to 6 months to see or for old movies that I really want to see but aren’t available on my usual platforms.

As of the first quarter of 2017, Netflix reported only 3.94 million subscribers to its DVD service. I am one of them. The DVD service is the only way you can still access Netflix’s best-in-the-business 5-star system of rating movies. It is easily the most reliable predictor of how you’ll rate a movie or TV show. Unfortunately, Netflix streaming customers no longer have the benefit of the 5-star system; it has been replaced by a less granular “thumbs up” or “thumbs down” rating. To be fair, I haven’t gathered any data on this new system yet, so I’ll reserve judgment as to its value. As for the DVD service, it will have me as a customer as long as the 5-star recommender system remains one of the benefits of being a DVD subscriber.

The 5-star system is a critical assist in finding a movie for both my wife and me. Netflix allows you to set up profiles for other members of the family. After my wife and I watch a movie, she gives it a rating and I give it a rating, each entered under our separate profiles. This allows a unique predicted rating for each of us based on our individual taste in movies. For example, Netflix predicts that I will rate Hidden Figures a 4.6 out of 5 and my wife will rate it a 4.9. In other words, according to Netflix, this is a movie that both of us will not only “really like” but absolutely “love”.

Hidden Figures has a “really like” probability of 61.4%. Its Oscar Performance probability is 60.7% based on its three nominations. Its probability based solely on the feedback from the recommender sites that I use is 69.1%. At this point in time, it is a Quintile 1 movie from a credibility standpoint, which means the 69.1% probability is based on a limited number of ratings and isn’t very credible yet. That’s why the 61.4% “really like” probability sits so close to the Oscar Performance probability of 60.7%. I fully expect that, as more people see Hidden Figures and enter their ratings, the “really like” probability will move higher.
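For readers curious about the mechanics, the paragraph above implies a simple way to think about the blend. The sketch below is a simplification for illustration, not the model’s exact formula; the roughly 8% credibility weight is just what the published Hidden Figures numbers imply if the blend were a straight weighted average.

```python
# A minimal sketch of a credibility-weighted blend (a simplification, not the
# model's exact formula). The fewer the ratings behind the recommender-site
# probability, the more the final number leans on the Oscar Performance probability.

def really_like_probability(site_prob, oscar_prob, credibility):
    """Blend the recommender-site probability with the Oscar Performance
    probability, where credibility is a weight between 0 and 1."""
    return credibility * site_prob + (1 - credibility) * oscar_prob

# Hidden Figures, using the numbers quoted above
site_prob = 0.691    # probability from the recommender sites (few ratings so far)
oscar_prob = 0.607   # Oscar Performance probability (three nominations)
credibility = 0.083  # implied Quintile 1 weight: (0.614 - 0.607) / (0.691 - 0.607)

print(round(really_like_probability(site_prob, oscar_prob, credibility), 3))  # 0.614
```

As the number of ratings grows, a weight like this would move toward 1 and the blended number would drift toward the 69.1% recommender-site figure, which matches the expectation above that the probability will move higher as more people rate the movie.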

Friday Night Movie Night this week looks like a “really like” lock…thanks to Netflix DVD.

 

 

The Wandering Mad Movie Mind

Last week in my post I spent some time leading you through my thought process in developing a Watch List. There were some loose threads in that article that I’ve been tugging at over the last week.

The first thread was the high “really like” probability that my algorithm assigned to two movies, Fight Club and Amelie, that I “really” didn’t like the first time I saw them. It bothered me to the point that I took another look at my algorithm. Without boring you with the details, I had an “aha” moment and was able to reengineer my algorithm in such a way that I can now develop unique probabilities for each movie. Prior to this I was assigning the same probability to groups of movies with similar ratings. The result is a tighter range of probabilities clustered around the base probability. The base probability is defined as the probability that I would “really like” a movie randomly selected from the database. If you look at this week’s Watch List, you’ll notice that my top movie, The Untouchables, has a “really like” probability of 72.2%. In my revised algorithm that is a high probability movie. As my database gets larger, the extremes of the assigned probabilities will get wider.

One of the by-products of this change is that the rating assigned by Netflix is now the most dominant driver of the final probability. This is as it should be. Netflix has by far the largest database of any site I use, and because of this it produces the most credible and reliable ratings of any of the rating websites. Which brings me back to Fight Club and Amelie. The probability for Fight Club went from 84.8% under the old formula to 50.8% under the new formula. Amelie went from 72.0% to 54.3%. On the other hand, a movie that I’m pretty confident I will like, Hacksaw Ridge, changed only slightly, from 71.5% to 69.6%.

Another thread I tugged at this week was in response to a question from one of the readers of this blog. The question was why Beauty and the Beast was earning a low “really like” probability of 36.6% when I felt there was a high likelihood that I was going to “really like” it. In fact, I saw the movie this past week and it turned out to be a “really like” instant classic. I rated it a 93 out of 100, a very high rating from me for a new movie. In my algorithm, new movies are underrated for two reasons. First, they generate so few ratings in their early months (Netflix has only 2,460 ratings for Beauty and the Beast so far) that the credibility of the movie’s own data is too small to matter, and the “really like” probability is driven by the Oscar performance part of the algorithm. Second, new movies haven’t been through the Oscar cycle yet, so their Oscar performance probability is that of a movie that didn’t earn an Oscar nomination, or 35.8%. That is why Beauty and the Beast sat at only a 36.6% “really like” probability on my Watch List last week.

I’ll leave you this week with a concern. As I mentioned above, Netflix is the cornerstone of my whole “really like” system. You can appreciate, then, my heart palpitations when it was announced a couple of weeks ago that Netflix is abandoning its five-star rating system in April. It is being replaced with a thumbs up or down rating with a percentage next to it, perhaps a little like Rotten Tomatoes. While I am keeping an open mind about the change, it has the potential to destroy the best movie recommender system in the business. If it does, I will be one “mad” movie man, and that’s not “crazy” mad.

Oh, What To Do About Those Tarnished Old Quarters.

In one of my early articles, I wrote about the benefits of including older movies in your catalogue of movies to watch. I used the metaphor of our preference for holding onto shiny new pennies rather than tarnished old quarters. One of the things that has been bothering me is that my movie selection system hasn’t been surfacing older movie gems that I haven’t seen. Take a look at the table below, based on the movies I’ve watched over the last 15 years:

Movie Release Time Frame | # of Movies Seen | % of Total
2007 to 2016 | 573 | 29%
1997 to 2006 | 606 | 31%
1987 to 1996 | 226 | 11%
1977 to 1986 | 128 | 6%
1967 to 1976 | 101 | 5%
1957 to 1966 | 122 | 6%
1947 to 1956 | 109 | 6%
1937 to 1946 | 87 | 4%
1920 to 1936 | 25 | 1%

60% of the movies I’ve watched in the last 15 years were released in the last 20 years. That’s probably typical. In fact, watching movies more than 20 years old 40% of the time is probably unusual. Still, there are probably quality older movies out there that I’m not seeing.

My hypothesis has been that the databases for the movie websites that produce my recommendations are smaller for older movies. This results in recommendations that are based on less credible data. In the world of probabilities, if your data isn’t credible, your probability stays closer to the average probability for randomly selected movies.

I set out to test this hypothesis against the movies I’ve watched since I began diligently screening my movies through my movie selection system. It was around 2010 that I began putting together my database and using it to select movies. Here is a profile of those movies.

Seen after 2010
Movie Release Time Frame | My Average Rating | # of Movies Seen | % of Total Seen
2007 to 2016 | 7.2 | 382 | 55%
1997 to 2006 | 7.9 | 60 | 9%
1987 to 1996 | 7.9 | 101 | 15%
1977 to 1986 | 7.8 | 57 | 8%
1967 to 1976 | 7.9 | 23 | 3%
1957 to 1966 | 8.2 | 26 | 4%
1947 to 1956 | 8.2 | 20 | 3%
1937 to 1946 | 8.4 | 17 | 2%
1920 to 1936 | 6.9 | 4 | 1%

It seems that it’s the shiniest pennies, the ones I watch most often, that I’m least satisfied with. So again I have to ask: why aren’t my recommendations producing more older movies to watch?

It comes back to my original hypothesis. Netflix has the greatest influence on the movies that are recommended for me. So, I compared my ratings to Netflix’s Best Guess ratings for me and added the average number of ratings those “best guesses” were based on.

Movie Release Time Frame | My Average Rating | Netflix Average Best Guess | Avg. # of Ratings per Movie | My Rating Difference from Netflix
2007 to 2016 | 7.2 | 7.7 | 1,018,163 | -0.5
1997 to 2006 | 7.9 | 8.0 | 4,067,544 | -0.1
1987 to 1996 | 7.9 | 8.1 | 3,219,037 | -0.2
1977 to 1986 | 7.8 | 7.8 | 2,168,369 | 0.0
1967 to 1976 | 7.9 | 7.6 | 1,277,919 | +0.3
1957 to 1966 | 8.2 | 7.9 | 991,961 | +0.3
1947 to 1956 | 8.2 | 7.8 | 547,577 | +0.4
1937 to 1946 | 8.4 | 7.8 | 541,873 | +0.6
1920 to 1936 | 6.9 | 6.1 | 214,569 | +0.8

A couple of observations on this table:

  • Netflix pretty effectively predicts my rating for movies released between 1977 and 2006. The movies from this thirty-year time frame base their Netflix best guesses on more than 2,000,000 ratings per movie.
  • Netflix overestimates my ratings for movies released from 2007 to today by half a point. It may be that the people who see newer movies first are those most likely to rate them highly. It might take twice as many ratings before the best guess finds its equilibrium, as it has for the 1987 to 2006 releases.
  • Netflix consistently underestimates my ratings for movies released prior to 1977. And the fewer ratings the Netflix best guess is based on, the more Netflix underestimates my rating of the movies.

What have I learned? First, to improve the quality of the new movies I watch, I should wait until the recommendations are based on a greater number of ratings. How many ratings are enough is something I have to explore further.

The second thing I’ve learned is that my original hypothesis is probably correct. The number of ratings Netflix has available to base its recommendations on for older movies is probably too small for those recommendations to be adequately responsive to my taste for older movies. The problem is that the answer to “Oh, what to do about those tarnished old quarters” isn’t readily apparent.

 

When It Comes To Movie DNA, Do Directors Have It and Has Netflix Mapped It?

I can’t wait to see Christopher Nolan’s next movie, Dunkirk, which is due to reach the theaters in 2017. I’ve seen 8 of Nolan’s 9 feature films and have given those movies an average rating of 87.5 out of 100, my highest average rating for any director with at least 8 movies seen. On the other hand, I’m bored to tears by Wes Anderson’s movies. I’ve seen 2 of his 8 movies and I’ve awarded them an average rating of 43.5 out of 100. Not included in the two movies I watched were Rushmore and Grand Budapest Hotel, both of which I tried to watch but couldn’t finish. Each of us has a distinct movie taste that guides our movie selection. It is our own unique movie DNA, if you will.

Do movie directors have a movie-making DNA, common traits in their films that particular viewers might be drawn to or repelled by? Netflix has made millions of dollars by identifying movies and TV shows that we are predisposed to enjoy. Does Netflix, indirectly or directly, draw you to favorite directors and push you away from directors you just don’t get? These are the questions I’ve been researching over the past week.

I haven’t come up with a broad, systematic answer to these questions yet. But by looking at a couple of directors, one I like and another I don’t, I can begin to develop a hypothesis. The two directors I looked at have a sizeable body of work. The director I enjoy is Ron Howard; in the last 15 years I’ve seen 9 of the 21 feature films he has directed. The director I just don’t get is Stanley Kubrick. Everyone praises his genius, but I don’t “really like” his movies. Here are my average ratings for these two directors’ movies that I’ve seen over the last 15 years, compared to the average ratings of all Netflix customers for the same movies. For an apples-to-apples comparison, Netflix ratings have been converted to a 100-point scale (e.g. a 3.8 out of 5 Netflix rating becomes a 76).

Director | My Avg Rating | Netflix Avg Rating | My Rating Difference
Ron Howard | 77 | 76 | +1
Stanley Kubrick | 52 | 76 | -24

My enjoyment of Ron Howard is fairly consistent with everyone’s enjoyment of Ron Howard. He makes movies that appeal to the general audience. This probably suggests that well done mainstream movies are in my movie DNA. On the other hand, there is a clear difference between my taste for Kubrick and everyone else’s taste for Kubrick. He is not mainstream.

So does Netflix recognize the different appeal these two directors have for me? Here’s the same table as the one above, except with the Netflix Best Guess average rating for how I’ll rate the movies instead of how I actually rated them.

Director | Netflix Best Guess Avg Rating for Me | Netflix Avg Rating | Netflix Best Guess Difference
Ron Howard | 83 | 76 | +7
Stanley Kubrick | 72 | 76 | -4

Directionally, it is consistent with my ratings: more bullish than my actual ratings for Ron Howard’s movies and less bearish for Kubrick’s. Interestingly enough, it is most bullish for Ron Howard’s best movies, as displayed below:

Ron Howard’s Movies I’ve Seen
Group | Netflix Best Guess Avg Rating for Me | Netflix Avg Rating | Netflix Best Guess Difference
Netflix Avg Rating > 76 | 92 | 78 | +14
Netflix Avg Rating < 76 | 72 | 73 | -1

Netflix highly recommends Ron Howard’s best movies to me while taking a neutral position toward his middle of the road movies.

If there is such a thing as a director’s movie making DNA and if Netflix is successfully factoring it into the Best Guess Ratings developed for me, then that DNA relationship should exist in the movies I haven’t seen in the last 15 years as well. Here’s a look at the sample for those movies:

Director | Netflix Best Guess Avg Rating for Me | Netflix Avg Rating | Netflix Best Guess Difference
Ron Howard | 67 | 70 | -3
Stanley Kubrick | 55 | 70 | -15

Again, the results are consistent with the results for the movies I’ve seen. My additional observation is that the director I like gets a Netflix recommendation boost for the movies the Netflix universe rates the highest. Conversely, Netflix more aggressively steers me away from the lowest-rated movies of the director I don’t like.

Without a broader study, I can’t say for sure that there is such a thing as movie DNA specific to a director, or that Netflix’s algorithm indirectly recognizes it in its recommendations. But based on this isolated comparison, it sure looks like there is, and Netflix might have it well mapped.

 

 

When Might We See the Next Perfect Netflix-DVD Movie?

Last Thursday I posted a list of 51 movies that received a Netflix-DVD perfect score of 4.9. For anyone who has experienced the joy of seeing a movie that they absolutely love, you know that those couple of hours of cinema nirvana don’t happen every day. If I’m lucky enough to run across a movie with a Netflix-DVD Best Guess of 4.9, that I haven’t seen, I know that there is a high probability that movie heaven has arrived. So the question is, “When am I likely to discover another Netflix-DVD movie with a 4.9 rating?”

Well, of the 51 perfect score movies out there today, here is the breakdown of how many went into wide release in a given month:

Dec 14
Jun 7
Oct 7
May 6
Remaining Months < 5

It is not surprising that December is far and away the most represented month. Producers that are most confident in a particular movie’s chances of winning Oscar gold release those movies in December. If we consolidate this list down to the three movie seasons, we see that Netflix perfection isn’t limited to Awards Season.

Season | # of Movies | # of Movies per Month
Awards Season | 25 | 8.3
Blockbuster Season | 21 | 4.2
Dump Season | 5 | 1.3

While it might appear that a perfect score movie is almost as likely to be released during Blockbuster Season as Awards Season, keep in mind that Awards Season (Oct – Dec) is three months long while Blockbuster Season (Mar – Jul) is five months long. Based on the monthly average, a perfect Netflix movie is almost twice as likely to be released during Awards Season as during Blockbuster Season. Rarely is a perfect movie released in Dump Season. One of the five, Million Dollar Baby, went into limited release in December to be eligible for that year’s Awards Season before going into wide release in January, so it only technically counts as a Dump Season release.
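If you want to check the “almost twice as likely” arithmetic, it is just the season counts divided by the number of months in each season, using the season definitions above:

```python
# Per-month release rate of perfect-score movies by season
# (Awards = Oct-Dec, 3 months; Blockbuster = Mar-Jul, 5 months;
#  Dump = the remaining 4 months)
seasons = {
    "Awards Season": (25, 3),
    "Blockbuster Season": (21, 5),
    "Dump Season": (5, 4),
}

for name, (movies, months) in seasons.items():
    print(f"{name}: {movies / months:.2f} perfect-score movies per month")
# Awards Season: 8.33, Blockbuster Season: 4.20, Dump Season: 1.25
# 8.33 / 4.20 is roughly 2, hence "almost twice as likely"
```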

So, now we know that the most likely time of the year for a new perfect score movie to be released is during Awards Season, particularly in December. Are we likely to see one released this year? Here’s where it gets tricky. From 1992 to 2010, at least one perfect score movie was released every year. Since 2010, we’ve had three released in 2012 and one released last year. Here’s the breakdown by decade:

2010’s 6
2000’s 15
1990’s 14
1980’s 7
1970’s 5
1960’s 2
1950’s 0
1940’s 2

Does this mean that movie heaven begins and ends between 1990 and 2009? No, the answer is more mundane. It lies in the statistical concept of the law of large numbers. Netflix needs a large base of ratings for a particular movie before its model will assign it a 4.9. Only with those large numbers can the Netflix model confidently predict that you will love a particular movie. Of the 51 perfect score movies on my list, only four have fewer than 1,000,000 ratings: the relatively recent movies The Martian, Argo, and Lincoln, and the 1946 classic It’s a Wonderful Life. The preponderance of perfect score movies between 1990 and 2009 has more to do with the fact that they are the movies most seen by Netflix raters.

To the question, “When will the next perfect Netflix-DVD movie come along?”, the answer is that it probably already has and is just waiting for enough Netflix ratings. Based on the results from 1992 to 2010, there is likely to be a perfect score movie this year, although it probably hasn’t been released yet (the one already-released movie with a shot is Captain America: Civil War). In the meantime, watch those perfect Netflix movies from my last post that may have slipped by you. Experience a little bit of movie heaven while we wait for the next perfect movie to reveal itself.

A Netflix-DVD Perfect Score Movie Is a Must See Movie

Nothing in life is guaranteed. How often have you heard that? Those who use that phrase are probably right…most of the time. But when Netflix-DVD provides you with a “Best Guess” of 4.9 for a particular movie, I can say that you are guaranteed to “really like” that movie and be pretty confident that I am right. In my database of 1,980 movies, 51 have received a perfect score of 4.9 from Netflix-DVD. That is 2.6% of all the movies I have watched in the last 15 years. Of those 51 perfect score movies, I have given a “really like” score of 75 (out of 100) or higher to all 51, and a “love” score of 85 or higher to 48 of the 51. If Netflix-DVD presents me with a movie with a Best Guess of 4.9, there is a 94.1% probability that I will “love” the movie, and close to a 100% probability that I will “really like” it. That is pretty darn close to a guarantee.
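The percentages above are straight division, for anyone who wants to check them:

```python
# Quick check of the quoted percentages
perfect_scores = 51     # movies with a Netflix-DVD Best Guess of 4.9
total_watched = 1980    # movies in my database
loved = 48              # of the 51, those I scored 85 or higher

print(f"{perfect_scores / total_watched:.1%}")  # 2.6% of all movies watched
print(f"{loved / perfect_scores:.1%}")          # 94.1% chance I "love" a 4.9 movie
```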

So, after providing all of these guarantees, it would be just cruel of me not to share with you the 51 perfect score movies. Here they are:

Netflix-DVD Perfect Score Movies
American President, The
Apollo 13
Argo
Batman Begins
Bourne Identity, The
Bourne Ultimatum, The
Braveheart
Casablanca
Cinderella Man
Dark Knight, The
Departed, The
Few Good Men, A
Field of Dreams
Forrest Gump
Fugitive, The
Gladiator
Glory
Godfather, The
Godfather: Part II, The
Gone Baby Gone
Good Will Hunting
Hoosiers
It’s a Wonderful Life
Jerry Maguire
Juno
King’s Speech, The
L.A. Confidential
Lincoln
Lord of the Rings: The Fellowship of the Ring
Lord of the Rings: The Return of the King, The
Martian, The
Million Dollar Baby
Mystic River
Raiders of the Lost Ark
Rocky
Saving Private Ryan
Schindler’s List
Shawshank Redemption, The
Silver Linings Playbook
Sixth Sense, The
Sleepless in Seattle
Social Network, The
Sound of Music, The
Spider-Man 2
Star Trek
Star Wars IV: A New Hope
Star Wars V: The Empire Strikes Back
Star Wars VI: Return of the Jedi
Sting, The
To Kill a Mockingbird
When Harry Met Sally

Those of you who are movie lovers have probably seen all or most of these. If not, you can’t go wrong sampling some movies from this list. The list is also a peek at my taste in movies. Netflix-DVD is uncanny in its ability to look into the depths of my movie soul and pick out the perfect movie. I’ll just mention again that I’m not referring to the recommendations you get on streaming Netflix; it seems like they give five stars to everything. The perfect scores in this post are from the DVD recommender.

We all strive for perfection at different times in our lives. Netflix 4.9 movies define perfection for movie recommendations.

In the Battle of Memory vs. Movie Website, Netflix is Still the Champ

On Monday I posed the question: is your memory of a movie that you’ve already seen the best predictor of “really like” movies? Based on Monday’s analysis, memory certainly comes out on top against IMDB and Rotten Tomatoes. Today, I’m extending the analysis to Criticker, Movielens, and Netflix. By reconfiguring the data used in Monday’s post, you can also measure the relative effectiveness of each site. For example, let’s look again at IMDB.

Probability I Will “Really Like” Based on IMDB Recommendation
                  | Recommended | Not Recommended | Percentage Point Spread
Seen Before       | 80.1% | 69.2% | 11
Never Seen Before | 50.6% | 33.6% | 17

It’s not surprising that the probabilities are higher for the movies that were seen before. After all, it wouldn’t make sense to rewatch the movies you wished you hadn’t seen the first time. But by looking at the gap between the probability of a recommended movie and a non-recommended movie, you begin to see how effective the movie recommender is at sorting high probability movies from low probability movies. In this instance, the small 11-point spread for Seen Before movies suggests that IMDB is only sorting these movies into small departures from average. The low probabilities for the Never Seen Before movies suggest that, without the benefit of the memory of a movie seen before, IMDB doesn’t do a very good job of identifying “really like” movies.
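If you keep your own ratings log and want to reproduce this kind of comparison, here is a rough sketch of the calculation. The field names and the toy records are mine for illustration; the spread is simply the difference between the two conditional “really like” rates.

```python
# Sketch of the point-spread calculation from a personal ratings log.
# Field names are hypothetical; each record says whether I had seen the movie
# before, whether the website recommended it, and whether I "really liked" it.

def really_like_rate(records):
    """Fraction of records marked as 'really liked'."""
    return sum(r["really_liked"] for r in records) / len(records)

def point_spread(records, seen_before):
    """P(really like | recommended) minus P(really like | not recommended),
    in percentage points, for movies with the given seen-before status."""
    group = [r for r in records if r["seen_before"] == seen_before]
    recommended = [r for r in group if r["recommended"]]
    not_recommended = [r for r in group if not r["recommended"]]
    return 100 * (really_like_rate(recommended) - really_like_rate(not_recommended))

# Toy log, just to show the shape of the data
log = [
    {"seen_before": True,  "recommended": True,  "really_liked": True},
    {"seen_before": True,  "recommended": False, "really_liked": False},
    {"seen_before": False, "recommended": True,  "really_liked": True},
    {"seen_before": False, "recommended": False, "really_liked": False},
]
print(point_spread(log, seen_before=True))  # 100.0 on this toy data
```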

Rotten Tomatoes follows a similar pattern.

Probability I Will “Really Like” Based on Rotten Tomatoes Recommendation
                  | Recommended | Not Recommended | Percentage Point Spread
Seen Before       | 80.5% | 65.1% | 15
Never Seen Before | 49.8% | 31.8% | 18

Rotten Tomatoes is a little better than IMDB at sorting movies. The point spreads are a little broader. But, like IMDB, Rotten Tomatoes doesn’t effectively identify “really like” movies for the Never Seen Before group.

Theoretically, when we look at the same data for the remaining three sites, the Percentage Point Spread should be broader to reflect the more personalized nature of the ratings. Certainly, that is the case with Criticker.

Probability I Will “Really Like” Based on Criticker Recommendation
                  | Recommended | Not Recommended | Percentage Point Spread
Seen Before       | 79.3% | 56.4% | 23
Never Seen Before | 45.3% | 18.9% | 26

Like IMDB and Rotten Tomatoes, though, Criticker isn’t very effective at identifying “really like” movies for those movies in the Never Seen Before group.

When you review the results for Movielens, you can begin to see why I’m so high on it as a movie recommender.

Probability I Will “Really Like” Based on Movielens Recommendation
                  | Recommended | Not Recommended | Percentage Point Spread
Seen Before       | 86.6% | 59.6% | 27
Never Seen Before | 65.1% | 22.3% | 43

Unlike the three sites we’ve looked at so far, Movielens is a good predictor of “really like” movies for Never Seen Before movies. And, the spread of 43 points for the Never Seen Before movies is dramatically better than the three previous sites. It is a very effective sorter of movies.

Last, but certainly not least, here are the results for Netflix.

Probability I Will “Really Like” Based on Netflix Recommendation
                  | Recommended | Not Recommended | Percentage Point Spread
Seen Before       | 89.8% | 45.7% | 44
Never Seen Before | 65.7% | 21.4% | 44

What jumps off the page is that, for Netflix, there is no memory advantage. As expected, the Seen Before probabilities are higher, but there is an identical 44-point gap for movies Seen Before and movies Never Seen Before. It is the only site where you have a less than 50% chance of “really liking” a movie you’ve already seen if Netflix doesn’t recommend it.

“If memory serves me correctly, I ‘really liked’ this movie the first time I saw it.” That is an instinct worth following even if the movie websites suggest otherwise. But if Netflix doesn’t recommend it, you might think twice.

***

6/24/2016 Addendum

I’ve finalized my forecast for the last three movies on my June prospect list. My optimism is turning to pessimism regarding my hopes that Independence Day: Resurgence and Free State of Jones would be “really like” movies. Unfavorable reviews from the critics and a less than enthusiastic response from audiences suggest that they could be disappointments. Of my five June prospects, Finding Dory seems to be the only safe bet for theater viewing, with Me Before You a possibility for female moviegoers. The IMDB gender split is pronounced for Me Before You, with female voters giving it an 8.1 rating and males a 7.3. It is also one of those rare movies with more female IMDB voters than male.

Netflix Attacks Its Movie Deficit But Is It Enough?

Over the last few months there have been a number of stories in the blogosphere about the shrinking Netflix movie catalogue. I contributed to this storyline in my post Netflix Streaming: The Other Story. The early response to these stories was that Netflix was going to focus its resources on original content because licensing fees were becoming cost prohibitive. Over the last couple of weeks, Netflix has, at least for now, decided not to abandon entirely those customers who want more than just original content. On May 23rd, Netflix announced that, beginning in September, it would be the exclusive U.S. pay TV provider for the latest films from Disney, Marvel, Lucasfilm, and Pixar. Once they’ve left the movie theaters, the only place you’ll be able to see films from these proven producers of blockbuster movies is on Netflix. Along with this announcement, Netflix released a short clip on what’s coming to Netflix this summer; you can find it in this linked story. This deal, worth $600 million, is on the heels of a $100 million, 5-year deal with Miramax to retain the rights to its inventory of 700 movies. Absent a deal with Miramax, Netflix would have lost the licensing rights to many of its older classic movies, such as Good Will Hunting and Pulp Fiction. Amazon’s hopes of catching up to Netflix in terms of the size of its movie inventory have been seriously damaged.

These deals shore up a growing weakness for Netflix. For movie lovers like myself, though, a huge gap remains between the movies we want to see and the movies available on Netflix streaming and Amazon Prime. In my movie database there are 423 “really like” movies. Only 68, or 16%, are available on these two leading streaming services combined. If you only have Amazon Prime, 20 of these 423 movies, or 5%, would be available. If you only have Netflix, 48 of 423, or 11%, would be available. À la carte purchase of movies through pay-per-view venues isn’t going away. This isn’t good news for those of us who are already paying for streaming services and/or premium cable channels and still have to pay more to see the other 84% of the movies we’re interested in.

I guess, with apologies to Crosby, Stills, Nash, and Young, if you can’t be with the movie you love, love the movie you’re with.

When It Comes to Movie Rating Websites, There is Strength in Numbers.

If you can only use one website to help you select movies that you will “really like”, which should you choose? That’s a tougher question than you might think. Because I have used all five of the websites recommended here to select movies to watch, my data has been heavily influenced by their synergy. I have no data to suggest how effective using only one site would be. Here’s what I do have:

Probability I Will “Really Like”
Website | Recommendation Standard | When Recommended in Combination with Other Sites | When Recommended by This Site Only
MovieLens | > 3.73 | 70.2% | 2.8%
Netflix | > 3.8 | 69.9% | 8.4%
Criticker | > 76 | 66.4% | 10.1%
IMDB | > 7.4 | 64.1% | 0.3%
Rotten Tomatoes | Certified Fresh | 62.7% | 4.3%

When MovieLens recommends a movie, in synergy with other websites, it produces the highest probability. When Criticker recommends a movie but the other four sites don’t recommend the movie, then Criticker has the highest probability. Netflix is second in both groups. Which one is the best is unclear. What is clear is that the three sites that recommend movies based on your personal taste in movies, MovieLens, Netflix, & Criticker, outperform the two sites that are based on third party feedback, Rotten Tomatoes and IMDB. When Netflix, MovieLens, & Criticker recommend the same movie there is an 89.9% chance I’ll “really like” it. When both IMDB & Rotten Tomatoes recommend the same movie the probability is 75.8% I’ll “really like” it.

What also is clear is that if four websites are recommending that you don’t watch a movie and one is recommending that you do, the probability is that you won’t “really like” the movie no matter how good that one website is overall. The progression of probabilities in the example below gives some perspective of how combining websites works:

Websites Recommending a Movie | Probability I Will “Really Like”
None | 3.9%
Netflix Only | 8.4%
Netflix & MovieLens Only | 31.9%
Netflix, MovieLens, & Criticker Only | 50.9%
Netflix, MovieLens, Criticker & IMDB Only | 71.1%
All Five | 96.6%

Stated simply, your odds increase with each website that recommends a particular movie. If, for example, you were to use only Netflix for your movie recommendations, the probability of “really liking” a recommended movie might be 69.9%, but in reality it could be any one of the probabilities in the table above, with the exception of the 3.9% for no recommendations; you wouldn’t know whether other websites had also recommended the movie.
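To make the “you wouldn’t know” point concrete, here is the progression from the table above expressed as a simple lookup. The probabilities are the ones published above; only this one Netflix-first chain of combinations is shown, not every possible subset of the five sites.

```python
# The progression from the table above, keyed by which sites recommend the movie
really_like_odds = {
    (): 0.039,
    ("Netflix",): 0.084,
    ("Netflix", "MovieLens"): 0.319,
    ("Netflix", "MovieLens", "Criticker"): 0.509,
    ("Netflix", "MovieLens", "Criticker", "IMDB"): 0.711,
    ("Netflix", "MovieLens", "Criticker", "IMDB", "Rotten Tomatoes"): 0.966,
}

# Checking only Netflix tells you you're somewhere at or past the 8.4% row;
# each additional recommending site moves you further down the table.
for sites, prob in really_like_odds.items():
    print(f"{len(sites)} site(s) recommending: {prob:.1%}")
```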

So, if I had to choose one website, I’d choose Netflix-DVD if I were one of their 5,000,000 DVD subscribers. If I’m not already a subscriber I’d go with MovieLens. It would be a reluctant recommendation, though, because the strength in numbers provided by using multiple websites is just so compelling.

***

You’ll notice that a number of movies in the Top Ten Movies Available to Watch This Week are available on Starz. I’m taking advantage of Comcast Watchathon Week, which provides free Starz, HBO, & Cinemax. Some of my highly rated movies that would ordinarily be unavailable are available for the short duration of this promotion. Bonus movies. Wahoo!!

 

 
