Will I "Really Like" this Movie?

Navigating Movie Website Ratings to Select More Enjoyable Movies

Archive for the tag “Criticker”

When It Comes to Unique Movies, Don’t Put All of Your Movie Eggs in the Netflix Basket.

It is rare to find a movie that isn’t a sequel, a remake, a borrowed plot idea, or a tried-and-true formula. Many of these movies are entertaining because they feel familiar and remind us of another pleasant movie experience. The movie recommender websites Netflix, Movielens, and Criticker do a terrific job of identifying the movie types you “really liked” before and surfacing the movies that match the familiar plot lines you’ve enjoyed in the past.

But what about those movies that are truly original? What about those movies that present ideas and plot lines outside your usual comfort zone? Will these reliable websites surface unique movies that I might like? Netflix has 20,000+ genre categories into which it slots movies. But what if a movie doesn’t fit neatly into one of those 20,000 categories?

Yesterday I watched a great movie, Being There.


This 1979 movie, starring Peter Sellers in an Academy Award-nominated performance, is about a simple-minded gardener who never left his employer’s home during the first fifty years of his life. Aside from gardening, the only knowledge he has is what he’s seen on television. After his employer dies, he is forced to leave the house. The movie follows Chance the gardener as he becomes Chauncey Gardiner, economic advisor to the President. It’s not a story of transformation but of perception. The movie is fresh.

My most reliable movie recommenders, Netflix and Movielens, warned me away from this movie. The only reason I added it to my weekly Watch List is that I saw it in the theater when it first came out in 1979 and “really liked” it.

Another possible reason Netflix missed on this movie is that I hated Peter Sellers’ other classic, Dr. Strangelove. I rated it 1 out of 5 stars. If Being There is slotted into a Netflix category of Peter Sellers classics, that might explain the mismatch.

Is it impossible, then, to surface creative, original movies that you might like? Not entirely. Criticker predicted I would rate Being There an 86 out of 100; I gave it an 89. The IMDB rating is 8.0, based on over 54,000 votes. Rotten Tomatoes has it at 96%, Certified Fresh. This is why I incorporate ratings from five different websites into the “really like” model rather than relying on Netflix alone.

When it comes to unique movies, don’t put all of your movie eggs in the Netflix basket.


The Art of Selecting “Really Like” Movies: New Movies

I watch a lot of movies, a fact that my wife, and occasionally my children, like to remind me of. Unlike the average, non-geeky movie fan, though, I am constantly analyzing the process I go through to determine which movies I watch. I don’t like to watch mediocre, or worse, movies. I’ve pretty much eliminated bad movies from my selections. But every now and then a movie I “like” rather than “really like” gets past my screen.

Over the next three weeks I’ll outline the steps I’m taking this year to improve my “really like” movie odds. Starting this week with New Movies, I’ll lay out a focused strategy for three different types of movie selection decisions.

The most challenging “really like” decision I make is figuring out which movies I’ve never seen before are likely to be “really like” movies. There is only a 39.3% chance that watching a movie I’ve never seen before will result in a “really like” experience. My goal is to improve those odds by the end of the year.

The first step I’ve taken is to separate movies I’ve seen before from movies I’ve never seen in establishing my “really like” probabilities. As a frame of reference, there is a 79.5% chance that I will “really like” a movie I’ve seen before. By setting my probabilities for movies I’ve never seen off of the 39.3% probability, I have created a tighter screen for those movies. This should result in my watching fewer never-before-seen movies than I’ve typically watched in previous years. Of the 20 movies I’ve watched so far this year, only two were never-before-seen movies.

The challenge in selecting never-before-seen movies is that, because I’ve watched close to 2,000 movies over the last 15 years, I’ve already watched the “cream of the crop” from those years. From 2006 to 2015, there were 331 movies that I rated as “really like” movies; that is 33 movies a year, or fewer than 3 a month. Last year I watched 109 movies that I had never seen before. So, except for the 33 new movies that came out last year that, statistically, might be “really like” movies, I watched 76 movies that didn’t have a great chance of being “really like” movies.

Logically, the probability of selecting a “really like” movie that I’ve never seen before should be highest for new releases. I just haven’t seen that many of them. I’ve only seen 6 movies that were released in the last six months and I “really liked” 5 of them. If, on average, there are 33 “really like” movies released each year, then, statistically, there should be a dozen “really like” movies released in the last six months that I haven’t seen yet. I just have to discover them. Here is my list of the top ten new movie prospects that I haven’t seen yet.
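Spelling out that arithmetic: 33 “really like” releases a year is roughly 16 or 17 over any six-month window; subtract the 5 recent releases I’ve already seen and “really liked”, and about a dozen should still be out there waiting to be discovered.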

My Top Ten New Movie Prospects
New Movies = Released Within the Last 6 Months
Movie Title | Release Date | Last Data Update | “Really Like” Probability
Hacksaw Ridge | 11/4/2016 | 2/4/2017 | 94.9%
Arrival | 11/11/2016 | 2/4/2017 | 94.9%
Doctor Strange | 11/4/2016 | 2/6/2017 | 78.9%
Hidden Figures | 1/6/2017 | 2/4/2017 | 78.7%
Beatles, The: Eight Days a Week | 9/16/2016 | 2/4/2017 | 78.7%
13th | 10/7/2016 | 2/4/2017 | 78.7%
Before the Flood | 10/30/2016 | 2/4/2017 | 51.7%
Fantastic Beasts and Where to Find Them | 11/18/2016 | 2/4/2017 | 51.7%
Moana | 11/23/2016 | 2/4/2017 | 51.7%
Deepwater Horizon | 9/30/2016 | 2/4/2017 | 45.4%
Fences | 12/25/2016 | 2/4/2017 | 45.4%

Based on my own experience, I believe you can identify most of the new movies that will be “really like” movies within 6 months of their release, which is how I’ve defined “new” for this list. I’m going to test this theory this year.

In case you are interested, here is the ratings data driving the probabilities.

My Top Ten New Movie Prospects 
Movie Site Ratings Breakdown
Movie Title | # of Ratings, All Sites | IMDB (Age 45+)* | Rotten Tomatoes** | Criticker* | Movielens* | Netflix*
Hacksaw Ridge | 9,543 | 8.2 | CF 86% | 8.3 | 8.3 | 8.6
Arrival | 24,048 | 7.7 | CF 94% | 8.8 | 8.1 | 9.0
Doctor Strange | 16,844 | 7.7 | CF 90% | 8.2 | 8.3 | 7.8
Hidden Figures | 7,258 | 8.2 | CF 92% | 7.7 | 7.3 | 8.2
Beatles, The: Eight Days a Week | 1,689 | 8.2 | CF 95% | 8.0 | 7.3 | 8.0
13th | 295,462 | 8.1 | CF 97% | 8.3 | 7.5 | 8.0
Before the Flood | 1,073 | 7.8 | F 70% | 7.6 | 8.2 | 7.8
Fantastic Beasts and Where to Find Them | 14,307 | 7.5 | CF 73% | 7.3 | 6.9 | 7.6
Moana | 5,967 | 7.7 | CF 95% | 8.4 | 8.0 | 7.0
Deepwater Horizon | 40,866 | 7.1 | CF 83% | 7.8 | 7.6 | 7.6
Fences | 4,418 | 7.6 | CF 95% | 7.7 | 7.1 | 7.2
*All Ratings Except Rotten Tomatoes Calibrated to a 10.0 Scale
** CF = Certified Fresh, F = Fresh

Two movies, Hacksaw Ridge and Arrival, are already probable “really like” movies and should be selected to watch when they become available. The # of Ratings, All Sites column is a key one. The Movielens and Netflix ratings need volume before they can credibly reach their true level. Until there is a credible amount of data, the rating you get is closer to what an average movie would get. A movie like Fences, at 4,418 ratings, hasn’t reached the critical mass needed to migrate to the higher ratings I would expect it to reach. Deepwater Horizon, on the other hand, with 40,866 ratings, has reached a fairly credible level and may not improve on its current probability.
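One way to picture this volume effect is as a credibility weighting: with only a few ratings, a site’s estimate sits close to the average movie, and it migrates toward the movie’s true level only as votes accumulate. Here is a minimal sketch of the idea; the weighted-average form, the 7.0 “average movie” prior, and the constant k are illustrative assumptions, not how Netflix or Movielens actually compute their ratings.

```python
def credibility_weighted_rating(observed_mean, n_ratings, prior_mean=7.0, k=5000):
    """Blend a movie's observed average rating with an 'average movie' prior.

    With few ratings the estimate stays near prior_mean; as n_ratings grows it
    converges on the observed average. k controls how many ratings it takes for
    the movie's own data to dominate (illustrative value only).
    """
    weight = n_ratings / (n_ratings + k)
    return weight * observed_mean + (1 - weight) * prior_mean

# The same observed 8.0 average is pulled toward the prior much harder at
# Fences-like volume (4,418 ratings) than at Deepwater Horizon-like volume (40,866).
print(credibility_weighted_rating(8.0, 4418))    # ≈ 7.5
print(credibility_weighted_rating(8.0, 40866))   # ≈ 7.9
```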

I’m replacing my monthly forecast on the sidebar of this website with the top ten new movie prospects exhibit displayed above. I think it is a better reflection of the movies that have the best chance of being “really like” movies. Feel free to share any comments you might have.


Until That Next Special Movie Comes Along

Happy 4th of July to all of my visitors from the States and, to my friends to the North, Happy Canada Day, which was celebrated this past Saturday. It is a good day to watch Yankee Doodle Dandy, one of those special movie experiences I’m fond of.

This past weekend I watched another patriotic movie, Courage Under Fire, with Denzel Washington, Meg Ryan, and a young Matt Damon among others in a terrific cast. It was one of those special movies that I yearned for in my last post on July movie prospects. It was a July 1996 release that wasn’t nominated for an Academy Award (how it didn’t get an acting nomination from among several powerful performances astounds me). It earned a 94 out of 100 score from me. I loved this movie. The feeling I get after watching a movie this good is why I watch so many movies. It is the promise that there are more movies out there to see that I will love that feeds my passion for movies.

As I was thinking about special movies over the last few days, a question occurred to me. Can I use my rating system to find movies I’ll “love” rather than just “really like”? Of course I can. Any movie that earns a rating of 85 out of 100 or higher meets my definition of a movie I will “love”. An 85 also converts to a five-star movie on Netflix. I can rank each of the movie rating websites that I use in my algorithm from highest rating to lowest, take the top 10% of each ranking, and calculate the probability that a movie in that top 10% would earn a score of 85 or higher from me. Regular readers of this blog shouldn’t be surprised by the results.

Website | Top 10% Threshold | Actual % of My Database | Probability for “Love” Movie
Netflix | > 4.5 | 9.5% | 81.4%
Movielens | > 4.2 | 10.7% | 76.9%
Criticker | > 90 | 10.3% | 55.4%
IMDB | > 8.1 | 10.8% | 45.8%
Rotten Tomatoes | > Cert. Fresh 95% | 10.4% | 41.7%
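For the curious, the calculation behind each row of that table is simple enough to sketch. Assuming a personal ratings log with each site’s score alongside my own 0-100 rating (the file name and column names below are hypothetical), it is just a top-decile cut followed by a conditional frequency:

```python
import pandas as pd

# Hypothetical log of rated movies with columns: my_rating (0-100),
# netflix, movielens, criticker, imdb (each site's score for the movie).
movies = pd.read_csv("my_movie_log.csv")

def love_probability(df, site_column, my_score_column="my_rating", love_cutoff=85):
    """P(my rating >= love_cutoff | the movie is in that site's top 10% for me)."""
    threshold = df[site_column].quantile(0.90)        # top-10% cut for the site
    top_decile = df[df[site_column] > threshold]
    return threshold, (top_decile[my_score_column] >= love_cutoff).mean()

for site in ["netflix", "movielens", "criticker", "imdb"]:
    threshold, prob = love_probability(movies, site)
    print(f"{site}: top 10% is above {threshold:.1f} -> {prob:.1%} chance of a 'love' movie")
```

Rotten Tomatoes is left out of the loop only because its Certified Fresh label isn’t a simple numeric cut; it needs its own rule.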

High Netflix and Movielens scores are the most reliable indicators of “love” movies. Here’s my problem: there are no movies that I haven’t seen in the last fifteen years with a Netflix Best Guess of 4.5 or higher. There are fewer than 10 movies that I haven’t seen in the last fifteen years with a Movielens predicted score greater than 4.2. Here’s the kicker: the probability that I will “love” a movie with a Movielens predicted score of 4.2 or better that doesn’t also have a Netflix Best Guess greater than 4.5 is only 62%. It seems the chances of finding movies to “love” are significantly diminished without the strong support of Netflix.

On the 1st of each month Netflix Streaming and Amazon Prime shake up the movies that are available in their inventory. The July 1 shakeup has resulted in a couple of new movies being added to my list of the Top Ten “Really Like” Movies Available on Netflix or Amazon Prime. This list is actually mistitled. It should be the Top Ten “Love” Movies Available. Take a look at the list. Perhaps you haven’t seen one of these movies, or haven’t seen it in a while. It is your good fortune to be able to watch one of these movies the next time you are in the mood for a special movie experience.

As for me, I’m still hoping that one of the movies released this year rises to the top of my watch list and is able to captivate me. If it were easy to find movies that I will “love”, I would have named this blog Will I “Love” This Movie? For now, I will continue to watch movies that I will “really like” until that next special movie comes along.

In the Battle of Memory vs. Movie Website, Netflix is Still the Champ

On Monday I posed the question: is your memory of a movie you’ve already seen the best predictor of “really like” movies? Based on Monday’s analysis, memory certainly comes out on top against IMDB and Rotten Tomatoes. Today I’m extending the analysis to Criticker, Movielens, and Netflix. By reconfiguring the data used in Monday’s post, you can also measure the relative effectiveness of each site. For example, let’s look again at IMDB.

Probability I Will “Really Like” Based on IMDB Recommendation
Group | Recommended | Not Recommended | Percentage Point Spread
Seen Before | 80.1% | 69.2% | 11
Never Seen Before | 50.6% | 33.6% | 17

It’s not surprising that the probabilities are higher for the movies that were seen before. After all, it wouldn’t make sense to rewatch the movies you wished you hadn’t seen the first time. But by looking at the gap between the probability of a recommended movie and a non-recommended movie, you begin to see how effectively the movie recommender sorts high-probability movies from low-probability ones. In this instance, the small 11-point spread for Seen Before movies suggests that IMDB is only sorting these movies into small departures from average. The low probabilities for the Never Seen Before movies suggest that, without the benefit of a remembered first viewing, IMDB doesn’t do a very good job of identifying “really like” movies.
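If you keep your own viewing log, the same kind of table is easy to reproduce. A minimal sketch, assuming a hypothetical CSV with 0/1 flags for “really liked”, “seen before”, and one “recommended” flag per site:

```python
import pandas as pd

# Hypothetical viewing log with 0/1 columns: really_liked, seen_before,
# and one recommendation flag per site (e.g. imdb_recommended).
log = pd.read_csv("viewing_log.csv")

def spread_table(df, rec_column):
    """Really-like rates for recommended vs. not recommended, split by seen-before."""
    rows = []
    for label, seen in [("Seen Before", 1), ("Never Seen Before", 0)]:
        subset = df[df["seen_before"] == seen]
        p_rec = subset.loc[subset[rec_column] == 1, "really_liked"].mean()
        p_not = subset.loc[subset[rec_column] == 0, "really_liked"].mean()
        rows.append((label, p_rec, p_not, round((p_rec - p_not) * 100)))
    return pd.DataFrame(rows, columns=["group", "recommended", "not_recommended", "point_spread"])

print(spread_table(log, "imdb_recommended"))
```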

Rotten Tomatoes follows a similar pattern.

Probability I Will “Really Like” Based on Rotten Tomatoes Recommendation
Group | Recommended | Not Recommended | Percentage Point Spread
Seen Before | 80.5% | 65.1% | 15
Never Seen Before | 49.8% | 31.8% | 18

Rotten Tomatoes is a little better than IMDB at sorting movies. The point spreads are a little broader. But, like IMDB, Rotten Tomatoes doesn’t effectively identify “really like” movies for the Never Seen Before group.

Theoretically, when we look at the same data for the remaining three sites, the Percentage Point Spread should be broader to reflect the more personalized nature of the ratings. Certainly, that is the case with Criticker.

Probability I Will “Really Like” Based on Criticker Recommendation
Group | Recommended | Not Recommended | Percentage Point Spread
Seen Before | 79.3% | 56.4% | 23
Never Seen Before | 45.3% | 18.9% | 26

Like IMDB and Rotten Tomatoes, though, Criticker isn’t very effective at identifying “really like” movies for those movies in the Never Seen Before group.

When you review the results for Movielens, you can begin to see why I’m so high on it as a movie recommender.

Probability I Will “Really Like” Based on Movielens Recommendation
Group | Recommended | Not Recommended | Percentage Point Spread
Seen Before | 86.6% | 59.6% | 27
Never Seen Before | 65.1% | 22.3% | 43

Unlike the three sites we’ve looked at so far, Movielens is a good predictor of “really like” movies for Never Seen Before movies. And, the spread of 43 points for the Never Seen Before movies is dramatically better than the three previous sites. It is a very effective sorter of movies.

Last, but certainly not least, here are the results for Netflix.

Probability I Will “Really Like” Based on Netflix Recommendation
Group | Recommended | Not Recommended | Percentage Point Spread
Seen Before | 89.8% | 45.7% | 44
Never Seen Before | 65.7% | 21.4% | 44

What jumps off the page is that there is no memory advantage in how Netflix sorts movies. As expected, the Seen Before probabilities are higher. But there is an identical 44-point gap for Seen Before movies and Never Seen Before movies. It is also the only site where you have a less than 50% chance of “really liking” a movie you’ve already seen if Netflix doesn’t recommend it.

“If memory serves me correctly, I “really liked” this movie the first time I saw it.” That is an instinct worth following even if the movie websites suggest otherwise. But, if Netflix doesn’t recommend it, you might think twice.

***

6/24/2016 Addendum

I’ve finalized my forecast for the last three movies on my June Prospect list. My optimism is turning to pessimism regarding my hopes that Independence Day: Resurgence and Free State of Jones would be “really like” movies. Unfavorable reviews from the critics and a less than enthusiastic response from audiences suggest that they could be disappointments. Of my five June prospects, Finding Dory seems to be the only safe bet for theater viewing, with Me Before You a possibility for female moviegoers. The IMDB gender split is pronounced for Me Before You, with female voters giving it an 8.1 rating and males a 7.3. It is also one of those rare movies with more female IMDB voters than male.

Will You “Really Like” This Movie?

My vision for this blog has never been to recommend movies for you to watch. Instead, my focus has been to familiarize you with tools on the internet that will help you find movies that you will “really like”. I also realize that not everyone has the time to rate movies to generate personalized recommendations. So, each week I’ve been posting two lists of movies that I will “really like”, not so much as recommendations, but as ideas for movies you might look into. As I’ve generated these lists, however, I’ve discovered that finding movies with a high probability that I will “really like” them, after already watching 1,981 movies, can be problematic. Worse still, many of the movies that I’m suggesting you look into don’t even have a high probability that I will “really like” them.

With yesterday’s post, I’ve substituted a new list for My Top Ten Movies to Watch, which was really a misnomer: it was my top ten movies excluding the 1,981 movies I’d already watched. The new list, which I call the Top Ten Movies You Will “Really Like” Available on Netflix and Amazon Prime, includes the movies I’ve already seen as well as the movies I haven’t.

Where do these movies come from? They are movies recommended by all five of the websites that I use. They are recommended by IMDB and Rotten Tomatoes, which are not influenced by my ratings. They are also recommended by the three sites that are driven by my tastes: Netflix, Movielens, and Criticker. When all five sites recommend a movie, there is a 74% probability that I will “really like” it. To provide some perspective, the sites you are most likely to use if you don’t have time to do your own ratings are IMDB and Rotten Tomatoes. If IMDB has a 7.4 or higher average rating for a movie, there is a 55.9% chance I will “really like” it. If a movie is Certified Fresh on Rotten Tomatoes, there is a 60.4% probability I will “really like” it.

Of the 1,981 movies I’ve seen over the last 15 years, 438 have been recommended by all five sites. Of those 438, I “really liked” 362, or 82.6%. That’s a high percentage for a relatively large sample size. These movies are your best bets. There are only 8 movies I haven’t seen in the last 15 years that meet the 74% criterion.

I’ve posted only 10 of the 446 movies that are recommended by all five sites. Most of those ten you have probably already seen, but they might be worth another look if you haven’t watched them in a while. They are also available to watch now if you are a Netflix or Amazon Prime subscriber. I want to help you find movies that you will “really like”, even if you don’t have the time to rate your own movies.

Rating Movies: If You Put Garbage In, You’ll Get Garbage Out

In my prior life, I would on occasion find myself leading a training session on the predictive model we were using in our business. Since the purpose of the model was to help our Account Executives make more effective business decisions, one of the points of emphasis was to point out instances when the model would present them with misleading information that could result in ineffective business decisions. One of the most basic of these predictive-model traps is that the model relies on input data that accurately reflects the conditions being tested. If you put garbage into the model, you will get garbage out of the model.

Netflix, MovieLens, and Criticker are predictive models. They predict movies that you might like based on your ratings of the movies you have seen. Just like the predictive model discussed above, if the ratings you input into these movie models are inconsistent from movie to movie, you increase the chances that the websites will recommend movies you won’t like. Having a consistent standard for rating movies is a must.

The best approach to rating movies is a simple approach. I start with the Netflix guidelines for rating a movie:

  • 5 Stars = I loved this movie.
  • 4 Stars = I really liked this movie.
  • 3 Stars = I liked this movie.
  • 2 Stars = I didn’t like this movie.
  • 1 Star = I hated this movie.

When I’ve used this standard to guide others in rating movies, the feedback has been that it is an easily understood standard. The primary complaint has been that sometimes the rater can’t decide between the higher and lower rating; the movie fits somewhere in between. For example, “I can’t decide whether I ‘really like’ this movie or just ‘like’ it.” This happens often enough that I’ve concluded that a 10-point scale is best:

  • 10 = I loved this movie.
  • 9 = I can’t decide between “really liked” and “loved”.
  • 8 = I really liked this movie.
  • 7 = I can’t decide between “liked” and “really liked”.
  • 6 = I liked this movie.
  • 5 = I can’t decide between “didn’t like” and “liked”.
  • 4 = I didn’t like this movie.
  • 3 = I can’t decide between “hated” and “didn’t like”.
  • 2 = I hated this movie.
  • 1 = My feeling for this movie is beyond hate.

The nice thing about a 10-point scale is that it is easy to convert to other standards. Using the scales that exist for each of the websites, an example of the conversion would look like this (a short script after the list automates it):

  • IMDB = 7  (IMDB uses a 10 point scale already)
  • Netflix = 7 /2 = 3.5 = 4 rounded up.  (Netflix uses 5 star scale with no 1/2 stars)
  • Criticker = 7 x 10 = 70 (Criticker uses 100 point scale).
  • MovieLens = 7 /2 = 3.5 (MovieLens has a 5 star scale but allows input of 1/2 star)
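Here is that conversion written out as a small function. It simply encodes the arithmetic in the list above; the round-up rule for Netflix, which has no half stars, is the only judgment call.

```python
import math

def convert_rating(my_rating_10pt):
    """Convert a 1-10 personal rating to each site's scale, per the list above."""
    return {
        "imdb": my_rating_10pt,                    # IMDB already uses a 10-point scale
        "netflix": math.ceil(my_rating_10pt / 2),  # 5 stars, no half stars; 3.5 rounds up to 4
        "movielens": my_rating_10pt / 2,           # 5 stars, half stars allowed
        "criticker": my_rating_10pt * 10,          # 100-point scale
    }

print(convert_rating(7))
# {'imdb': 7, 'netflix': 4, 'movielens': 3.5, 'criticker': 70}
```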

Criticker, being on a 100 point scale, gives you the capability to fine tune your ratings even more. I think it is difficult to subjectively differentiate, for example, between an 82 and an 83. In a future post we can explore this issue further.

So from one simple evaluation of a movie you can generate a consistent rating across all of the websites that you might use. This consistency allows for a more “apples to apples” comparison.

So throw out the garbage. Good data in will produce good data out, and a more reliable list of movies that you will “really like”.


When It Comes to Movie Rating Websites, There is Strength in Numbers.

If you can only use one website to help you select movies that you will “really like”, which should you choose? That’s a tougher question than you might think. Because I have used all five of the websites recommended here to select movies to watch, my data has been heavily influenced by their synergy. I have no data to suggest how effective using only one site would be. Here’s what I do have:

Probability I Will “Really Like”
Website | Recommendation Standard | When Recommended in Combination with Other Sites | When Recommended by This Site Only
MovieLens | > 3.73 | 70.2% | 2.8%
Netflix | > 3.8 | 69.9% | 8.4%
Criticker | > 76 | 66.4% | 10.1%
IMDB | > 7.4 | 64.1% | 0.3%
Rotten Tomatoes | Certified Fresh | 62.7% | 4.3%

When MovieLens recommends a movie in combination with other websites, it produces the highest probability. When Criticker recommends a movie but the other four sites don’t, Criticker has the highest probability. Netflix is second in both groups. Which one is best is unclear. What is clear is that the three sites that recommend movies based on your personal taste in movies (MovieLens, Netflix, & Criticker) outperform the two sites that are based on third-party feedback (Rotten Tomatoes and IMDB). When Netflix, MovieLens, & Criticker all recommend the same movie, there is an 89.9% chance I’ll “really like” it. When both IMDB & Rotten Tomatoes recommend the same movie, the probability is 75.8% that I’ll “really like” it.

What is also clear is that if four websites recommend that you don’t watch a movie and one recommends that you do, the probability is that you won’t “really like” the movie, no matter how good that one website is overall. The progression of probabilities in the example below gives some perspective on how combining websites works:

Websites Recommending a Movie | Probability I Will “Really Like”
None | 3.9%
Netflix Only | 8.4%
Netflix & MovieLens Only | 31.9%
Netflix, MovieLens, & Criticker Only | 50.9%
Netflix, MovieLens, Criticker & IMDB Only | 71.1%
All Five | 96.6%

Stated simply, your odds increase with each website that recommends a particular movie. If, for example, you were to use only Netflix for your movie recommendations, the probability of “really liking” a recommended movie might be 69.9%, but in reality it could be any one of the probabilities in the table above, with the exception of the 3.9% for no recommendations. You just wouldn’t know whether other websites had also recommended the movie.
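To make the “strength in numbers” idea concrete, here is a minimal sketch that simply encodes the progression table above as a lookup. The probabilities are the ones from my data shown in the table; combinations not listed there (Criticker alone, for instance) would need their own estimates from your viewing history.

```python
# "Really like" probabilities from the progression table above (my data).
REALLY_LIKE_PROB = {
    frozenset(): 0.039,
    frozenset({"netflix"}): 0.084,
    frozenset({"netflix", "movielens"}): 0.319,
    frozenset({"netflix", "movielens", "criticker"}): 0.509,
    frozenset({"netflix", "movielens", "criticker", "imdb"}): 0.711,
    frozenset({"netflix", "movielens", "criticker", "imdb", "rotten_tomatoes"}): 0.966,
}

def really_like_probability(recommending_sites):
    """Chance I'll 'really like' a movie, given which sites recommend it."""
    return REALLY_LIKE_PROB.get(frozenset(recommending_sites))

print(really_like_probability({"netflix", "movielens"}))  # 0.319
```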

So, if I had to choose one website, I’d choose Netflix-DVD if I were one of their 5,000,000 DVD subscribers. If I weren’t already a subscriber, I’d go with MovieLens. It would be a reluctant recommendation, though, because the strength in numbers provided by using multiple websites is just so compelling.

***

You’ll notice in the Top Ten Movies Available to Watch This Week that a number of movies on the list are available on Starz. I’m taking advantage of the Comcast Watchathon Week, which provides free Starz, HBO, & Cinemax. Some of my highly rated movies that would ordinarily be unavailable are available for the short duration of this promotion. Bonus movies. Wahoo!!


Hollywood Has an Even Deeper Diversity Problem than It Thinks

Children should be seen and not heard. This is a proverb whose origins date back to medieval times. It is rarely used today because, well, it’s so medieval. When it comes to roles for actresses in major motion pictures, however, we aren’t far removed from those medieval times. Actresses are seen in the movies but are not heard as much as their male counterparts. According to a study released within the last few weeks by the website Polygraph Cool, actresses have significantly less dialogue than male actors in 2,000 of the top-grossing box office films of all time. The study measures the words of dialogue for each character in the screenplays of these movies. Some of the key findings in the study are:

  • Female characters have the most dialogue in only 22% of the films studied.
  • Female characters have two of the top three roles in only 18% of the movies.
  • In all age groups, actresses have less dialogue than male actors in the same age group.
  • This dialogue discrepancy gets more pronounced as actresses age. Actresses 22-31 have 3 words of dialogue for every 4 words for actors in the same age group. In comparison, actresses 42-65 have 1 word of dialogue for every 5 words of male dialogue.
  • Even in Romantic Comedies the dialogue is 58% male.

Are there movies out there with greater gender parity? If so, how do you find them? The answer is yes, they do exist, but not in great numbers. At the bottom of the Polygraph study linked above, the authors provide a tool that you can use to explore the movies used in the study. As I’ve mentioned in a prior article, there is a male bias to both IMDB and Rotten Tomatoes. You might check out ChickFlix.net, which provides movie reviews from a female perspective, as a substitute for Rotten Tomatoes.

There is also the Bechdel Test, which is cited in the Polygraph study. It tests movies against a simple criterion: there must be two main female characters who have a conversation with each other about something other than a man. Based on studies, only about 50% of movies pass the test.

You can also use the personalized movie recommenders that I’ve recommended on my posts. By rating movies on Netflix-DVD, MovieLens, or Criticker, you will generate movie recommendations based on your taste in movies.

The lack of diversity in today’s movies reflects the box office. The first step is being able to identify which movies reflect the diversity that we’d like to see in film. I would like to think that we can push film evolution out of medieval times.

Who is Your Favorite Actor or Actress? Criticker Knows.

Who is your favorite actor or actress?  If you can’t wait for the next Leonardo DiCaprio or Jennifer Lawrence movie, does that make them your favorite actors?  If you have rated on Criticker every movie you’ve ever seen, or in my case every movie seen in the last 15 years, the answer to these questions is just a click away.

Criticker has a number of neat tools on its website. One of my favorites is its Filmmaker List, which can be found by clicking the Explore button that appears along the top banner. You can rank Actors, as well as Directors or Screenwriters, using a variety of criteria. I like to rank actors by the average rating I’ve given the movies they’ve appeared in. Once you’ve ranked them, you can click on a name and see which of their movies you’ve seen and which ones you haven’t. You can also set a minimum number of movies for your rankings so that you get the most representative sample your number of ratings will allow.

For example, I have 1,999 movies rated in Criticker. If I set my minimum at 20 movies for Actors and rank them by average score, my top five favorite Actors are:

Actor | Avg. Score | # of Movies Seen | Best Movie Not Seen
Tom Hanks | 85.88 | 26 | Cloud Atlas
Harrison Ford | 83.50 | 24 | The Conversation
Morgan Freeman | 82.50 | 22 | The Lego Movie
Philip Seymour Hoffman | 81.18 | 22 | Boogie Nights
Samuel L. Jackson | 81.00 | 25 | Kingsman: The Secret Service

Based on a 15 movie minimum, my favorite Actresses are:

Actress | Avg. Score | # of Movies Seen | Best Movie Not Seen
Kate Winslet | 79.13 | 15 | Hamlet (1996)
Scarlett Johansson | 75.52 | 21 | Hail, Caesar!
Judi Dench | 74.22 | 17 | Hamlet (1996)
Laura Linney | 74.63 | 16 | Mr. Holmes
Natalie Portman | 74.35 | 17 | Paris, je t’aime
Meryl Streep | 74.28 | 18 | Fantastic Mr. Fox

I included 6 on my Actress list because you just can’t leave Meryl Streep off of any list of the best actresses. Also, the Best Movie Not Seen is based on the highest predicted Criticker score among movies I haven’t seen, or haven’t seen in the last 15 years.

There are a couple of surprises on my lists. Samuel L. Jackson is one. I can’t say that I watch a particular movie because Samuel L. Jackson is in it. His ranking does, however, reflect how many quality movies he’s been in. Scarlett Johansson is another surprise. It’s amazing that I have seen 21 of her movies and she is only 31 years old.

There are favorite actors of mine who didn’t make the list, such as Paul Newman and Jodie Foster. In Paul Newman’s case, he didn’t meet the 20-movie minimum and his average movie score wasn’t high enough (79.32 over 19 movies). Jodie Foster would have been the highest on my list with an average score of 79.64, but I’ve only seen 11 of her movies, under the 15-movie minimum I set.

When you first go to the Filmmaker List, the default minimum number of movies rated is 3. Under that criterion, my favorite actor of all time is Billy Dee Williams (95.67 over 3 movies). “I love Brian Piccolo”, and Lando Calrissian (Star Wars V & VI) as well.
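If you keep your ratings in a simple file rather than (or in addition to) Criticker, the same ranking is easy to reproduce. A minimal sketch, assuming a hypothetical export with one row per actor-and-movie pair:

```python
import pandas as pd

# Hypothetical export: one row per (actor, movie) pair with my score for the movie.
ratings = pd.read_csv("my_ratings_by_actor.csv")   # columns: actor, movie, my_score

def rank_people(df, min_movies=20):
    """Rank actors by the average score of their movies, with a minimum-count filter."""
    grouped = df.groupby("actor")["my_score"].agg(avg_score="mean", movies_seen="count")
    qualified = grouped[grouped["movies_seen"] >= min_movies]
    return qualified.sort_values("avg_score", ascending=False)

print(rank_people(ratings, min_movies=20).head(5))
```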


Movielens: The Reliable Alternative

In previous posts I’ve expressed my concern about corporate interests impacting the integrity of movie recommender algorithms. IMDB is owned by Amazon. Rotten Tomatoes is owned by Fandango. Netflix is owned by, well, …Netflix. Criticker isn’t corporately owned but is partially funded by commercial advertising. Now I present to you Movielens, which isn’t owned by a corporation and doesn’t advertise on its website. Movielens is operated by GroupLens Research at the University of Minnesota. It exists for the benefit of students at the University who are researching predictive modeling. In other words, it exists to build the best recommender of movies that you will “really like” that can possibly be built. There is no corporate bottom line. There is just the goal of building a better mousetrap.

So far, it’s done a pretty good job. My benchmark for movies that I will “really like” is 4 out of 5 stars, or 7.5 out of 10, or 75 out of 100, depending on the rating scale used. When I calculate the probability that I will “really like” a movie that meets that benchmark for each individual website, I get the following results:

Website | Recommend Criteria | Probability I Will “Really Like”
Netflix | > 3.8 | 94.4%
MovieLens | > 3.75 | 93.7%
IMDB | > 7.4 | 90.2%
Criticker | > 76 | 89.8%
Rotten Tomatoes | Certified Fresh | 89.0%

Movielens holds its own with Netflix and, unlike Netflix, its algorithm is not held hostage to corporate interests, and it’s free. All you have to do is click on the MovieLens link above, sign up, and begin rating movies that you’ve seen. Even though MovieLens uses a five-star scale, you can enter half stars. You will, at times, be torn between whether you “really like” a movie or just “like” it. MovieLens lets you enter 3 1/2 stars for that situation.

I encourage you to use MovieLens. You can pat yourself on the back for making a contribution to science.

***

Geek Alert!! Geek Alert!!

If you look at the Movie Lists I updated yesterday, you may be puzzled that so many movies have the same probability. Each month I recalibrate the probabilities in my Bayesian model. I’m constantly experimenting to find the right balance between a model with many probability differences among movies, but more uncertainty about their reliability, and a model with fewer probability differences among individual movies but more reliability. Too many probability groups create the risk of randomness creeping into the probabilities. The Bayesian model recognizes this and shifts the probabilities closer to the probability of a random movie selection. This happened last month when I used 20 groups. When fewer groups are used, the larger groups that result are more credible and produce less randomness. The model recognizes this and allows probabilities closer to the tendencies of the group. This month I went back to 5 groups, which produces more reliable probability results but more movies with the same probability. Just in case you were wondering :)
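For anyone curious about the mechanics, the shrinkage described above can be illustrated with a simple Beta-Binomial estimate: small groups get pulled toward the base rate of a random movie selection, while large groups keep their own tendency. This is only a sketch of the idea; the 40% base rate and the prior strength of 50 are illustrative values, not my actual calibration.

```python
def shrunk_probability(really_liked, n_movies, base_rate=0.4, prior_strength=50):
    """Beta-Binomial style estimate of a group's "really like" probability.

    Small groups stay near the base rate of a random movie selection; large
    groups converge on their observed rate. prior_strength controls how many
    movies it takes before the group's own data dominates (illustrative values).
    """
    alpha = base_rate * prior_strength + really_liked
    beta = (1 - base_rate) * prior_strength + (n_movies - really_liked)
    return alpha / (alpha + beta)

# A 10-movie group "really liked" 8 times barely moves off the base rate,
# while a 200-movie group with the same 80% observed rate mostly keeps it.
print(shrunk_probability(8, 10))      # ≈ 0.47
print(shrunk_probability(160, 200))   # ≈ 0.72
```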
