Will I "Really Like" this Movie?

Navigating Movie Website Ratings to Select More Enjoyable Movies


If January Makes You Shiver with Every Movie They Deliver, Then Stick with the Oscar Bait.

What do the movies Molly’s Game, The Post, Phantom Thread, and Hostiles have in common? For one thing, they all hope to receive Academy Award nominations when the nominees are announced on January 23rd. Secondly, after these movies went into limited release in December to qualify for 2017 movie awards, most of the world will finally get a chance to actually see them this January. Thirdly, these movies are the early front-runners for the 2018 Objective Top Twenty. Finally, they will be your very best bets for “really like” movies released in January.

Why do movie producers push some Oscar contenders into January and sometimes even into February? Are these movies artistically worthy but with limited audience appeal? Sometimes. That may be the case with Hostiles, for example. I’ve heard that the beginning of the movie is intensely violent, which might turn off some audiences, particularly women and older viewers. The overall IMDB rating is 7.1, but the male/female split is 7.2 and 5.3 respectively. The age demographics in IMDB reflect similar polarization. Voters under 30 give it a 7.6 so far, while voters 30 and older give it a 6.5. Like the similarly violent The Revenant, which also went into wide release in January, it may have a better chance to find its audience away from the family-dominated audiences of December.

Phantom Thread is another movie that might not appeal to wide audiences. This is a Paul Thomas Anderson-directed film and, to say the least, he is an acquired taste, a taste that I have yet to acquire. The last time he collaborated with Daniel Day-Lewis was for the film There Will Be Blood, a movie I hated. Personal opinion aside, it has been reported that Phantom Thread may be the most mainstream movie that Paul Thomas Anderson has ever made. Early IMDB ratings are strong, with an average rating of 8.8. Sometimes the selection of a release date is nothing more than superstition. There Will Be Blood opened wide on January 25, 2008, roughly the same January weekend (the 19th) on which Phantom Thread will open.

Molly’s Game, which I was fortunate enough to see already, is definitely not a January holdover because it lacks audience appeal. Its IMDB rating is 7.6, and it is consistently strong across all demographic groups. This is an under-buzzed movie, and sometimes the strategy is to roll out a movie slowly to build up the buzz.

The Post, on the other hand, has all the buzz and star power it needs. With Spielberg, Streep, and Hanks, along with a topical storyline, this movie screams Best Picture. So why slide this movie into January? It’s strategic. The producers hope that this will be the movie that everyone is talking about when Oscar voting is taking place. The strategy is to have the buzz be about The Post just as the buzz is winding down for other Best Picture contenders like The Shape of Water and Lady Bird.

So what about the rest of the January releases? Well, you might find a diamond in the rough, but the odds are against you.

                                                % with IMDB   Probability You
                                                Rating 7+     Will “Really Like”
Prior Year Oscar Contender, Jan. Wide Release      84.3%           75.39%
All Other January Wide Releases                    51.3%           64.81%
Movies Released in All Other Months                72.0%           71.20%

The high IMDB ratings go to the prior-year holdovers, not the movies being released for the first time in January. The movies held over from the prior year are better, on average, than the movies released during the other eleven months, and the rest of the January slate is significantly worse.

To avoid the January shivers on your next trip to the Cineplex, stick to the Oscar bait from last year, whenever it was released.

 

 


Merry Christmas

No in-depth movie analysis this week. Like many of you, I’m enjoying the holidays with family and friends, and a few movies tossed in. I will be back next week with my year-end list of the top ten movies I watched in 2017. Until then, here’s wishing you and your loved ones a very Merry Christmas and a Happy “really like” movie New Year.

 

There Are No Turkeys in the Objective Top Seven Movies From 1992 to 1998

Shall we call it “The Drive for Twenty Five”? If so, this installment of our journey to the Objective Top Twenty Five Movies of the last Twenty Five years raises the question: which of these Cinematic Seven will survive to Twenty Five? With 1998 added to the Objective Database, more discrete groupings of data are statistically viable. As future years are added, the number of groupings will grow, resulting in many changes to this list. From the initial Top Six list that was published just two weeks ago, only three movies remain in the Top Seven. I think we can expect this kind of volatility with each year we add. How many of these movies will be in the Top Twenty Five at the end? Fewer than we’d expect, I’m sure.

Here’s our significant seven:

7. Scent of a Woman (IMDB 8.0, Certified Fresh 88%, CinemaScore A, Major Academy Award Win)

This movie is a favorite of mine. It produced Al Pacino’s only Academy Award win after being shut out for his seven previous nominations.

6. Good Will Hunting (IMDB 8.3, Certified Fresh 97%, CinemaScore A, Major Academy Award Win)

One of my followers wondered why his favorite movie didn’t make the list. Good Will Hunting is a good illustration of what it takes. It requires high ratings from all feedback groups: movie watchers, movie critics, opening-night moviegoers, and peer movie artists.

5. The Shawshank Redemption (IMDB 9.3, Certified Fresh 91%, CinemaScore A, Major Academy Award Nomination)

Another one of the original Top Six. The Achilles Heel for this movie from an objective rating standpoint is its failure to win a major Academy Award despite three major nominations.

4. The Usual Suspects (IMDB 8.6, Certified Fresh 88%, No CinemaScore rating, Major Academy Award Win)

Because this is an objective ranking rather than a subjective one, Kevin Spacey movies are still considered. In the long run, I wonder how much the absence of a CinemaScore rating will hurt this movie and, if it does, whether it should.

3. The Lion King (IMDB 8.5, Certified Fresh 83%, CinemaScore A+, Minor Academy Award Win)

A few weeks before the release of this picture, Elton John was given a private screening of the movie. He noticed the love song he wrote wasn’t in the film and successfully lobbied to have it put back in. That song, Can You Feel the Love Tonight, won Elton John an Academy Award for Best Original Song.

2. Saving Private Ryan (IMDB 8.6, Certified Fresh 92%, CinemaScore A, Major Academy Award Win)

The only movie from the newly added 1998 to make the list. It is also the only movie on the list that was the top-grossing movie of the year it was released.

1. Schindler’s List (IMDB 8.9, Certified Fresh 96%, CinemaScore A+, Major Academy Award Win)

Under the Objective “Really Like” algorithm, the highest probability a movie can achieve is 76.98%. So far, Schindler’s List is the only movie with that perfect score.

***

Disney animated movies rule Thanksgiving weekend. According to Box Office Mojo, Disney owns 9 of the 10 highest grossing Thanksgiving movies of all time. Coco, which opened in theaters yesterday, is this year’s entrant into their tradition of Thanksgiving dominance. Early IMDB ratings give it a 9.1 average rating to go along with its 96% Certified Fresh Rotten Tomatoes rating. This morning CinemaScore gave it an A+ rating.

Also, two more Oscar hopefuls go into limited release this weekend. Darkest Hour is the perfect bookend to Dunkirk. It follows Winston Churchill’s response to the events at Dunkirk. Gary Oldman’s portrayal of Churchill has him on everyone’s short list for Best Actor. Also worth considering is a festival favorite, Call Me By Your Name, which was nominated this week for an Independent Spirit Award for Best Picture.

Happy Thanksgiving to you and your families.

Add a Year Here. Tweak a Formula There. And, the Objective Top Twenty Looks Very Different.

I was able to add 1998 to the Objective Database last weekend. The extra data allowed me to factor Oscar wins into the algorithm. But it was one little tweak to the Oscar performance factor that dramatically altered the 2017 Objective Top Twenty this week.

For the Oscar performance part of my algorithm I created five groupings of movies based on their highest Academy Award achievement. If a movie won in a major category it went in the first group. If it was nominated for a major but didn’t win, it went in the second group. If it wasn’t nominated for a major but won in a minor category, it went into the third group. If it was only nominated in a minor category but didn’t win, it went into the fourth group. Finally, if it wasn’t nominated in any Oscar category, it went into the fifth group.
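
For readers who think in code, here is a minimal sketch of those five groupings. The function name and inputs are illustrative assumptions; the blog’s actual implementation isn’t published.

```python
# A minimal sketch of the five Oscar-performance groupings described above.
# Function name and inputs are illustrative assumptions, not the blog's actual code.

def oscar_group(major_win, major_nomination, minor_win, minor_nomination):
    """Return the Oscar performance group (1 = best, 5 = worst) for a movie."""
    if major_win:
        return 1  # won in a major category
    if major_nomination:
        return 2  # nominated in a major category, but no win
    if minor_win:
        return 3  # no major nomination, but a win in a minor category
    if minor_nomination:
        return 4  # only a minor-category nomination, no win
    return 5      # not nominated in any Oscar category

# A movie with a major nomination and a minor win lands in group 2,
# because the major nomination is its highest achievement.
print(oscar_group(major_win=False, major_nomination=True,
                  minor_win=True, minor_nomination=True))  # -> 2
```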

Here is the percentage of movies in each group with an average IMDB rating of 7 or better:

Best Oscar Performance    % with 7+ IMDB Avg. Rating
Major Win                           90.3%
Major Nomination                    87.7%
Minor Win                           79.7%
Minor Nomination                    71.7%
No Nominations                      59.8%

Wins seem to matter, particularly for the minor categories. Major nominations clearly are better “really like” indicators than minor nominations. It’s the no nominations grouping that’s most revealing. If a movie doesn’t get at least one nomination, the odds of it being a “really like” movie are dramatically reduced. This led to my discovery of some faulty thinking on my part.

If movies like Dunkirk, Lady Bird, and Three Billboards Outside Ebbing, Missouri, all movies headed toward major Oscar nominations in January, are treated in my algorithm as if they failed to earn a single Oscar nomination, those movies are being unfairly penalized. It was this flaw in my system that needed fixing. Now, movies that haven’t yet gone through the Oscar nominating process are designated as Not Applicable, and no Oscar performance test is applied to them. Without the weight of the No Nominations designation, many of the movies that didn’t get their first release until 2017 have risen significantly in the 2017 Objective Top Twenty rankings.
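
Continuing the sketch from earlier, here is one hedged way the Not Applicable designation might work. The cutoff year is an assumption, and the group probabilities are simply the 7+ percentages from the table above used as stand-ins; the algorithm’s real factor values aren’t published in this post.

```python
# Hypothetical sketch of the fix: movies that haven't yet been through an Oscar
# nominating cycle get "Not Applicable" instead of the No Nominations group.
# The cutoff year and the group probabilities are illustrative assumptions.

LAST_COMPLETED_NOMINATION_YEAR = 2016  # 2017 releases are still waiting on the January 2018 nominations

def oscar_performance_probability(release_year, group):
    """Return the Oscar performance probability, or None when the test shouldn't apply."""
    if release_year > LAST_COMPLETED_NOMINATION_YEAR:
        return None  # Not Applicable: skip the Oscar performance test entirely
    # Stand-in values: the share of each group rated 7+ on IMDB, from the table above.
    group_probability = {1: 0.903, 2: 0.877, 3: 0.797, 4: 0.717, 5: 0.598}
    return group_probability[group]

print(oscar_performance_probability(2017, group=5))  # -> None, no penalty applied
print(oscar_performance_probability(1994, group=1))  # -> 0.903
```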

***

Get ready for a Thanksgiving treat. Now that 1998 has been added to the Objective Database, we can reveal the Objective Top Seven Movies from the years 1992-1998. Adding Academy Award Wins to the mix will shake up those rankings as well. Check in next Thursday after you’ve taken your post-turkey dinner nap.

***

The wide releases this weekend are Justice League, The Star, and Wonder, but it’s the limited release, Mudbound, that I’ll be watching closely. This movie, set in the post-WWII rural American South, is being mentioned as a Best Picture contender. Here’s the thing, though. Most people won’t see it in a movie theater, since it opens simultaneously on Friday on Netflix streaming. Can a movie that is more widely viewed at home than in the theater gain Academy Award traction? Stay tuned.

 

Why Does CinemaScore Leave Out So Many Good Movies When Issuing Grades?

The 2017 Academy Awards will be forever remembered as the year that La La Land was awarded Best Picture for about a minute before they discovered that Moonlight was the actual winner. Those two movies have something else in common. Neither movie received a CinemaScore grade despite, arguably, being the top two movies of 2016.

I’m thinking about this issue this week because three movies with Oscar buzz, Stronger, Battle of the Sexes, and Victoria and Abdul, went into limited release last weekend. None of them were graded by CinemaScore. There is a valid reason for this, but that doesn’t make it any less disappointing to movie pre-screeners like myself.

For me, Cinemascore is appealing because it measures only opening night reaction. Most people who go to the opening night of a movie are there because they really want to see that movie. The pre-release buzz has grabbed their attention to such an extent that they can’t wait to see it. They walk into an opening night movie expecting to love it. When they walk out of the movie and respond to CinemaScore’s survey they are probably grading based on expectations. So, when a movie receives an “A” from Cinemascore, it tells us that the movie lives up to the hype. Anything less than that suggests that the movie experience was less than they expected.

CinemaScore gets stuck when it comes to movies that are released in a limited number of theaters before they are released widely in most theaters. CinemaScore samples locations throughout the U.S. and Canada to establish a credible, unbiased sample. When a movie goes into limited release, it is released in some of their sample locations but not in most of them. Last weekend, Stronger was released in 573 theaters, Battle of the Sexes was released in 21 theaters, and Victoria and Abdul was released in 4 theaters. To provide some perspective, Kingsman: The Golden Circle opened in 4,003 theaters last weekend and earned a “B+” grade from CinemaScore. When Stronger and Battle of the Sexes go into wide release tomorrow, does word-of-mouth reaction from moviegoers who’ve seen the movie in the last week disturb the integrity of any sample taken this weekend? When Victoria and Abdul goes into wide release on October 6, is its release into just 4 theaters last weekend sufficiently small to not taint the sample? I don’t know the answers to these questions. I’ll be looking to see if these movies get graded when they go into wide release. In Box Office Mojo’s article on last weekend’s box office performance, they indicate that CinemaScore graded Stronger an “A-”, even though it hasn’t been officially posted on the CinemaScore website. Perhaps they are waiting to post it until after wide release?

I understand why grades don’t exist on CinemaScore for many limited-release movies. I understand the importance of data integrity in the creation of a credible survey. I will just observe, though, that in this age of social media, using limited movie releases to build pre-wide-release momentum for quality movies is an increasingly viable strategy. Just this week, A24, the studio behind the rise of Moonlight last year, decided to put their dark horse candidate this year, Lady Bird, into limited release on November 3rd after it emerged from the Telluride and Toronto film festivals with a 100% Fresh rating from Rotten Tomatoes. CinemaScore may be facing the prospect of an even broader inventory of ungraded top-tier movies than it has today. It will be interesting to see how they respond to this challenge, if at all.

 

Before You See Mother! This Weekend, You Might Want to Read This Article

As you might expect, I’m a big fan of Nate Silver’s FiveThirtyEight website. Last Thursday they published an interesting article on the impact of polarizing movies on IMDB ratings, using Al Gore’s An Inconvenient Sequel: Truth to Power as an example. This is not the first instance of this happening and it won’t be the last.

When the new Ghostbusters movie with the all-female cast came out in July 2016, there was a similar attempt to tank the IMDB ratings for that movie. That attempt was by men who resented the all-female cast. At that time I posted this article. Has a year of new ratings done anything to smooth out the initial polarizing impact of the attempt to tank the ratings? Fortunately, IMDB has a nice little feature that allows you to look at the demographic distribution behind a movie’s rating. If you access IMDB on its website, clicking the number of votes that a rating is based on will get you to the demographics behind the rating.

Before looking at the distribution for Ghostbusters, let’s look at a movie that wasn’t polarizing. The 2016 movie Sully is such a movie according to the following demographics:

Demographic  Votes  Average Rating
Males  99301  7.4
Females  19115  7.6
Aged under 18  675  7.7
Males under 18  566  7.6
Females under 18  102  7.8
Aged 18-29  50050  7.5
Males Aged 18-29  40830  7.5
Females Aged 18-29  8718  7.6
Aged 30-44  47382  7.4
Males Aged 30-44  40321  7.4
Females Aged 30-44  6386  7.5
Aged 45+  12087  7.5
Males Aged 45+  9871  7.5
Females Aged 45+  1995  7.8
IMDb staff  17  7.7
Top 1000 voters  437  7.2
US users  17390  7.5
Non-US users  68746  7.4

There is very little difference in the average rating (the number to the far right) among all of the groups. When you have a movie that is not polarizing, like Sully, the distribution by rating should look something like this:

Votes  Percentage  Rating
12465  8.1% 10
19080  12.4% 9
52164  33.9% 8
47887  31.1% 7
15409  10.0% 6
4296  2.8% 5
1267  0.8% 4
589  0.4% 3
334  0.2% 2
576  0.4% 1

It approximates a bell curve, with most ratings clustered around the movie’s average.

Here’s what the demographic breakdown for Ghostbusters looks like today:

Demographic  Votes  Average Rating
Males  87119  5.0
Females  27237  6.7
Aged under 18  671  5.3
Males under 18  479  4.9
Females under 18  185  6.6
Aged 18-29  36898  5.4
Males Aged 18-29  25659  5.0
Females Aged 18-29  10771  6.7
Aged 30-44  54294  5.2
Males Aged 30-44  43516  5.0
Females Aged 30-44  9954  6.6
Aged 45+  11422  5.3
Males Aged 45+  9087  5.1
Females Aged 45+  2130  6.3
IMDb staff  45  7.4
Top 1000 voters  482  4.9
US users  25462  5.5
Non-US users  54869  5.2

There is still a big gap in the ratings between men and women and it persists in all age groups. This polarizing effect produces a ratings distribution graph very different from the one for Sully.

Votes  Percentage  Rating
20038  12.8% 10
6352  4.1% 9
13504  8.6% 8
20957  13.4% 7
24206  15.5% 6
18686  12.0% 5
10868  7.0% 4
7547  4.8% 3
6665  4.3% 2
27501  17.6% 1

It looks like a bell curve sitting inside a football goal post. But it is still useful: it suggests that, when you exclude the 1’s and the 10’s, the average IMDB rating for the movie is around 6 rather than 5.3.
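
If you want to check that for yourself, here is a small sketch that recomputes the numbers from the Ghostbusters distribution above. The polarization measure (share of 1’s and 10’s) is my own shorthand, not an IMDB statistic.

```python
# Recompute polarization and a trimmed average from the Ghostbusters vote
# distribution shown above (votes per rating value).

votes = {10: 20038, 9: 6352, 8: 13504, 7: 20957, 6: 24206,
         5: 18686, 4: 10868, 3: 7547, 2: 6665, 1: 27501}

total = sum(votes.values())
polarized_share = (votes[10] + votes[1]) / total
trimmed = {rating: count for rating, count in votes.items() if rating not in (1, 10)}
trimmed_avg = sum(rating * count for rating, count in trimmed.items()) / sum(trimmed.values())

print(f"Share of 10's and 1's: {polarized_share:.1%}")       # ~30.4%
print(f"Average excluding 1's and 10's: {trimmed_avg:.1f}")  # ~5.8, i.e. roughly 6
```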

You are probably thinking: this is interesting, but is it useful? Does it help me decide whether or not to watch a movie? Well, here’s the payoff. The big movie opening this weekend that the industry will be watching closely is Mother!. The buzz coming out of the film festivals is that it is a brilliant but polarizing movie. All four of the main actors (Jennifer Lawrence, Javier Bardem, Michelle Pfeiffer, Ed Harris) are in the discussion for acting awards. I haven’t seen the movie, but I don’t sense that it is politically polarizing like An Inconvenient Sequel and Ghostbusters. I think it probably impacts the sensibilities of different demographics in different ways.

So, should you go see Mother! this weekend? Fortunately, its early screenings at the film festivals give us an early peek at the data trends. The IMDB demographics so far are revealing. First, by looking at the rating distribution, you can see the goal post shape of the graph, confirming that the film is polarizing moviegoers.

Votes  Percentage  Rating
486  36.0% 10
108  8.0% 9
112  8.3% 8
92  6.8% 7
77  5.7% 6
44  3.3% 5
49  3.6% 4
40  3.0% 3
52  3.8% 2
291  21.5% 1

57.5% of IMDB voters have rated it either a 10 or a 1. So are you likely to love it or hate it? Here’s what the demographics suggest:

Demographic  Votes  Average Rating
Males  717  6.1
Females  242  5.4
Aged under 18  25  8.4
Males under 18  18  8.2
Females under 18  6  10.0
Aged 18-29  404  7.3
Males Aged 18-29  305  7.5
Females Aged 18-29  98  6.1
Aged 30-44  288  5.0
Males Aged 30-44  215  5.0
Females Aged 30-44  69  5.2
Aged 45+  152  4.3
Males Aged 45+  111  4.3
Females Aged 45+  40  4.1
Top 1000 voters  48  4.6
US users  273  4.4
Non-US users  438  6.5

While men like the movie more than women do, viewers over 30 of both sexes hate the movie almost equally. There is also a 2-point gap between U.S. and non-U.S. voters. This is a small sample, but it shows a distinct trend. I’ll be interested to see if the trends hold up as the sample grows.

So, be forewarned. If you take your entire family to see Mother! this weekend, some of you will probably love the trip and some of you will probably wish you stayed home.

 

It’s a Good Week To Be on Vacation 

Every now and then I wonder if anyone would notice if I didn’t blog one week. I’m a creature of habit. I update the Objective Top Fifteen every Monday. I update my Watch List every Wednesday. I publish a new article every Thursday.

This week I’m spending the week in beautiful Newport, RI. I didn’t update the Objective Top Fifteen. Did you notice? As it turns out, there were no changes. Both The Hitman’s Bodyguard and Logan Lucky received mediocre grades from Cinemascore, which kept them off the list.

I didn’t update my Watch List today. But that wouldn’t have changed much either. After watching American History X last Wednesday, I haven’t watched another movie since.

As for the movies opening this weekend, there isn’t much to talk about. August is typically weak. If you can believe it, this August is running 64% behind last August at the box office, making it a weaker-than-weak August. If you want to use your newly purchased MoviePass (its price was recently cut to $10), check out the indies I’ve mentioned before. Good Time, the Robert Pattinson crime drama, opens to a wide audience this weekend. Positive buzz is following its limited release last weekend.

Finally, I just wanted to let you know that I won’t be publishing tomorrow. I’ll be continuing to sample signature drinks throughout Newport. Given where we are in the movie cycle, it’s a very viable alternative.

Why Did “The Big Sick” Drop Out of the Objective Top Fifteen This Week?

This past Sunday my wife, Pam, and I went to see The Big Sick. The movie tells the story of the early relationship days of the two screenwriters, Emily Gordon and Kumail Nanjiani. In fact, Nanjiani plays himself in the movie. It is the authenticity of the story, told in a heartfelt and humorous way, that makes this film special.

On the following day, last weekend’s blockbuster, Dunkirk, moved into the second spot in the revised Objective Top Fifteen rankings. When a new movie comes onto the list, another one exits. This week’s exiting movie, ironically, was The Big Sick. Wait! If The Big Sick is such a great movie, why isn’t it in my top fifteen for the year? Are all of the other movies on the list better movies? Maybe yes. Maybe no. You’ll have to determine that for yourselves. You see, the Objective Top Fifteen is your list, not mine.

I developed the Objective Top Ten, which became Fifteen at the beginning of July and will become Twenty at the beginning of October, to provide you with a ranking of the 2017 widely released movies that are most likely to be “really like” movies. Because the ranking is based on objective benchmarks, my taste in movies has no influence on the list. The four benchmarks presently in use are: IMDB Avg. Rating, Rotten Tomatoes Rating, CinemaScore Rating, and Academy Award Nominations and Wins. A movie like Hidden Figures that meets all four benchmarks has the greatest statistical confidence in its “really like” status and earns the highest “really like” probability. A movie that meets three benchmarks has a greater “really like” probability than a movie that meets only two benchmarks. And so on.

The important thing to note, though, is that this is not a list of the fifteen best movies of the year. It is a ranking of probabilities (with some tie-breakers thrown in) that you’ll “really like” a movie. It is subject to data availability: the more positive data that’s available, the more statistical confidence, i.e. a higher probability, the model has in the projection.
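
As a rough illustration of that idea, here is a hedged sketch of counting benchmarks. The thresholds (IMDB 7+, Certified Fresh, CinemaScore A- or better) are drawn from elsewhere in the blog, but the scoring itself is my shorthand, not the actual algorithm, and the example inputs are illustrative.

```python
# Hedged sketch: count how many objective benchmarks a movie meets. Thresholds and
# example inputs are illustrative; the real algorithm's details aren't published here.

def benchmarks_met(imdb_avg=None, certified_fresh=None, cinemascore=None, oscar_noms_or_wins=None):
    """Count the benchmarks a movie satisfies; None means the data isn't available."""
    checks = [
        imdb_avg is not None and imdb_avg >= 7.0,
        certified_fresh is True,
        cinemascore in ("A+", "A", "A-"),
        oscar_noms_or_wins is True,
    ]
    return sum(checks)

# A movie that meets all four benchmarks vs. one missing a CinemaScore grade and Oscar data.
print(benchmarks_met(imdb_avg=7.8, certified_fresh=True, cinemascore="A+", oscar_noms_or_wins=True))  # 4
print(benchmarks_met(imdb_avg=7.6, certified_fresh=True, cinemascore=None, oscar_noms_or_wins=None))  # 2
```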

Which brings me back to The Big Sick. CinemaScore surveys only those movies it considers “major releases”. The Big Sick probably didn’t have a big advertising budget. Instead, the producers of the film chose to roll the movie out gradually, beginning on June 23rd, to create some buzz and momentum behind the movie before putting it into wide release on July 14th. This is probably one of the reasons why CinemaScore didn’t survey The Big Sick. But because The Big Sick is missing that third benchmark needed to develop a higher probability, it dropped out of the Top Fifteen. On the other hand, if it had earned at least an “A-” from CinemaScore, The Big Sick would be the #2 movie on the list based on the tie-breakers.

And that is the weakness, and the strength, of movie data: “major releases” have it; smaller movies like The Big Sick don’t.

***

This weekend may be the end of the four-week run of Objective Top Fifteen movie breakthroughs. Atomic Blonde, the Charlize Theron spy thriller, has an outside chance of earning a spot on the list. As of this morning, it is borderline for the IMDB and Rotten Tomatoes benchmarks. I’m also tracking Girls Trip, which earned a Certified Fresh rating from Rotten Tomatoes just in the last couple of days and has an “A+” in hand from CinemaScore. For now, it is just below the IMDB benchmark. We’ll see if that changes over the weekend.

 

 

What Am I Actually Going to Watch This Week? Netflix Helps Out with One of My Selections.

The core mission of this blog is to share ideas on how to select movies to watch that we’ll “really like”. I believe there have been times when I’ve gotten bogged down in how to build the “really like” model. I’d like to reorient the dialogue back to the primary mission: which “really like” movies I am actually going to watch and, more importantly, why.

Each Wednesday I publish the ten movies on my Watch List for the week. These movies usually represent the ten movies with the highest “really like” probability that are available to me to watch on platforms that I’ve already paid for. This includes cable and streaming channels I’m paying for and my Netflix DVD subscription. I rarely use a movie on demand service.

Now, 10 movies is too much, even for the Mad Movie Man, to watch in a week. The ten movie Watch List instead serves as a menu for the 3 or 4 movies I actually most want to watch during the week. So, how do I select those 3 or 4 movies?

The first and most basic question to answer is who, if anyone, I am watching the movie with. Friday night is usually the night that my wife and I will sit down and watch a movie together. The rest of the week I’ll watch two or three movies by myself. So, right from the start, I have to find a movie that my wife and I will both enjoy. This week that movie is Hidden Figures, the 2016 Oscar-nominated film about the role three black female mathematicians played in John Glenn’s orbit of the Earth in the early 1960’s.

This movie became available to Netflix DVD subscribers on Tuesday, May 9. I received my Hidden Figures DVD that day. Something I’ve learned over the years is that Netflix ships DVDs on Monday for titles that become available on Tuesday. For this to happen, you have to time the return of your old DVD to arrive on the Saturday or Monday before the Tuesday release. This gives you the best chance to avoid “long wait” queues.

I generally use Netflix DVD to see new movies that I don’t want to wait another 3 to 6 months to see or for old movies that I really want to see but aren’t available on my usual platforms.

As of the first quarter of 2017, Netflix reported only 3.94 million subscribers to its DVD service. I am one of them. The DVD service is the only way you can still access Netflix’s best-in-the-business 5-star system for rating movies. It is easily the most reliable predictor of how you’ll rate a movie or TV show. Unfortunately, Netflix streaming customers no longer have the benefit of the 5-star system. They have gone to a less granular “thumbs up” and “thumbs down” rating system. To be fair, I haven’t gathered any data on this new system yet, so I’ll reserve judgment as to its value. As for the DVD service, they will have me as a customer as long as they maintain the 5-star recommender system as one of the benefits of being a DVD subscriber.

The 5-star system is a critical assist in finding a movie for both my wife and me. Netflix allows you to set up profiles for other members of the family. After my wife and I watch a movie, she gives it a rating and I give it a rating. These ratings are entered under our separate profiles. This allows a unique predicted rating for each of us based on our individual taste in movies. For example, Netflix predicts that I will rate Hidden Figures a 4.6 out of 5 and my wife will rate it a 4.9. In other words, according to Netflix, this is a movie that both of us will not only “really like” but absolutely “love”.

Hidden Figures has a “really like” probability of 61.4%. Its Oscar Performance probability is 60.7%, based on its three nominations. Its probability based solely on the feedback from the recommender sites that I use is 69.1%. At this point in time, it is a Quintile 1 movie from a credibility standpoint. This means that the 69.1% probability is based on a limited number of ratings. It’s not very credible yet. That’s why the 61.4% “really like” probability is closer to the Oscar Performance probability of 60.7%. I would fully expect that, as more people see Hidden Figures and enter their ratings, the “really like” probability will move higher for this movie.
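
One hedged way to picture that blending is below. The quintile weights are assumptions I picked so that the Hidden Figures numbers quoted above roughly reproduce the 61.4%; they are not the actual weights behind the Objective rankings.

```python
# Hedged sketch: blend the Oscar Performance probability with the recommender-site
# probability, weighting the recommender signal more heavily as its credibility grows.
# The quintile weights are illustrative assumptions, not the blog's actual values.

RECOMMENDER_WEIGHT_BY_QUINTILE = {1: 0.10, 2: 0.30, 3: 0.50, 4: 0.70, 5: 0.90}

def really_like_probability(oscar_prob, recommender_prob, credibility_quintile):
    w = RECOMMENDER_WEIGHT_BY_QUINTILE[credibility_quintile]
    return (1 - w) * oscar_prob + w * recommender_prob

# Hidden Figures, a Quintile 1 movie: 60.7% Oscar probability, 69.1% recommender probability.
print(f"{really_like_probability(0.607, 0.691, credibility_quintile=1):.1%}")  # ~61.5%, near the 61.4% quoted
```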

Friday Night Movie Night this week looks like a “really like” lock…thanks to Netflix DVD.

 

 

Sometimes When You Start To Go There You End Up Here

There are some weeks when I’m stumped as to what I should write about in this weekly trip to Mad Moviedom. Sometimes I’m in the middle of an interesting study that isn’t quite ready for publication. Sometimes an idea isn’t quite fully developed. Sometimes I have an idea but I find myself blocked as to how to present it. When I find myself in this position, one avenue always open to me is to create a quick study that might be halfway interesting.

This is where I found myself this week. I had ideas that weren’t ready to publish yet. So, my fallback was going to be a quick study of which movie decades offer the best “really like” viewing potential. Here are the results of my first pass at this:

“Really Like” Decades
Based on Number of “Really Like” Movies
As of April 6, 2017
                         My Rating
Decade     Really Liked   Didn't Really Like    Total   “Really Like” Probability
All               1,108                  888    1,996
2010's              232                  117      349   60.9%
2000's              363                  382      745   50.5%
1990's              175                   75      250   62.0%
1980's               97                   60      157   58.4%
1970's               56                   49      105   54.5%
1960's               60                   55      115   53.9%
1950's               51                   78      129   46.6%
1940's               55                   43       98   55.8%
1930's               19                   29       48   46.9%

These results are mildly interesting. The 2010’s, 1990’s, 1980’s, and 1940’s are above-average decades for me. There is an unusually high number of movies in the sample that were released in the 2000’s. Remember that movies stay in my sample for 15 years from the year I last watched them. After 15 years they are removed from the sample and put into the pool of movies available to watch again. The good movies get watched again and the other movies are never seen again, hopefully. Movies last seen after 2002 haven’t yet gone through that process of separating out the “really like” movies to be watched again and permanently weeding the didn’t-“really like” movies from the sample. The contrast between the 2000’s and the 2010’s is a good measure of the impact of undisciplined versus disciplined movie selection.

As I’ve pointed out in recent posts, I’ve made some changes to my algorithm. One of the big changes is that I’ve replaced the number of movies that are “really like” movies with the number of ratings those movies have received. After doing my decade study based on the number of movies, I realized I should have used the number-of-ratings method to be consistent with my new methodology. Here are the results based on the new methodology:

“Really Like” Decades
Based on Number of “Really Like” Ratings
As of April 6, 2017
                              My Rating
Decade        Really Liked   Didn't Really Like            Total   “Really Like” Probability
All          2,323,200,802        1,367,262,395    3,690,463,197
2010's         168,271,890          166,710,270      334,982,160   57.1%
2000's       1,097,605,373          888,938,968    1,986,544,341   56.6%
1990's         610,053,403          125,896,166      735,949,569   70.8%
1980's         249,296,289          111,352,418      360,648,707   65.3%
1970's          85,940,966           25,372,041      111,313,007   67.7%
1960's          57,485,708           15,856,076       73,341,784   68.0%
1950's          28,157,933           23,398,131       51,556,064   59.5%
1940's          17,003,848            5,220,590       22,224,438   67.4%
1930's           9,385,392            4,517,735       13,903,127   64.6%

While the results are different, the big reveal was that 63.0% of the ratings are for “really like” movies while only 55.5% of the movies are “really like” movies. It starkly reinforces the impact of the law of large numbers. Movie website indicators of “really like” movies are more reliable when the number of ratings driving those indicators is larger. The following table illustrates this better:

“Really Like” Decades
Based on Average Number of “Really Like” Ratings per Movie
As of April 6, 2017
                            My Rating
Decade       Really Liked   Didn't Really Like           Total   % Difference
All          2,096,751.63         1,539,709.90    1,848,929.46    36.2%
2010's         725,309.87         1,424,874.10      959,834.27   -49.1%
2000's       3,023,706.26         2,327,065.36    2,666,502.47    29.9%
1990's       3,486,019.45         1,678,615.55    2,943,798.28   107.7%
1980's       2,570,064.84         1,855,873.63    2,297,125.52    38.5%
1970's       1,534,660.11           517,796.76    1,060,123.88   196.4%
1960's         958,095.13           288,292.29      637,754.64   232.3%
1950's         552,116.33           299,976.04      399,659.41    84.1%
1940's         309,160.87           121,409.07      226,779.98   154.6%
1930's         493,968.00           155,783.97      289,648.48   217.1%

With the exception of the 2010’s, the average number of ratings per movie is larger for the “really like” movies. In fact, the averages are dramatically different for the decades prior to 2000. My educated guess is that the post-2000 years will end up fitting the pattern of the other decades once those years mature.
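
To make the arithmetic behind that table concrete, here is a small sketch that reproduces the 1990’s row from the two tables above, assuming the % Difference column is the relative gap between the two per-movie averages (which is how the numbers check out).

```python
# Reproduce the 1990's row of the averages table from the counts in the two tables above.

ratings_really_liked, movies_really_liked = 610_053_403, 175
ratings_didnt_like, movies_didnt_like = 125_896_166, 75

avg_really_liked = ratings_really_liked / movies_really_liked  # ~3,486,019 ratings per movie
avg_didnt_like = ratings_didnt_like / movies_didnt_like        # ~1,678,616 ratings per movie
pct_difference = avg_really_liked / avg_didnt_like - 1

print(f"{avg_really_liked:,.2f}  {avg_didnt_like:,.2f}  {pct_difference:.1%}")
# -> 3,486,019.45  1,678,615.55  107.7%
```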

So what is the significance of this finding? It clearly suggests that waiting until a sufficient number of ratings come in before deciding whether to see a new movie will produce a more reliable result. The unanswered question is how many ratings are enough.

The finding also reinforces the need to have something like Oscar performance to act as a second measure of quality for movies that will never have “enough” ratings for a reliable result.

Finally, the path from “there to here” is not always found on a map.
