Cinemascore Is a “Really Like” Indicator

Those of you who checked in on Monday to see the updated Objective Top Ten may have noticed that Cinemascore grades were included in the information provided for each movie. If you were particularly observant, you might have also noticed that the bar at the top of the page, which includes links to the movie ratings websites I use, now includes a link to Cinemascore. All of which means that Cinemascore grades are now officially part of the “really like” algorithm.

As I’ve mentioned before, the folks at Cinemascore have been surveying moviegoers as they leave the theater since 1978. They limit their surveys to the three or four movies each week that they expect to do the best at the box office. This limited sample still represents around 40% of the movies in my database, which is more than enough for me to work with.

The other factor that makes the data usable is that the grades line up neatly with “really like” potential:

Cinemascore Database Results

Grade        Movies Graded   “Really Like” %
A+                      51              82%
A                      201              80%
A-                     212              73%
B+                     156              58%
B                      117              50%
B-                      52              42%
C+                      21              33%
C                        9              11%
C-                       4               0%
D+                       1               0%
D                        0               0%
D-                       1               0%

The “really like” percentages follow a logical progression by grade. Now, because the sample sizes for each grade are relatively small, I’ve had to group the grades into two buckets that represent above average Cinemascore grades and below average grades.

Bucket        Movies Graded   “Really Like” %
All Grades              825              65%
A+, A, A-               464              77%
All Other               361              50%

This suggests that a good Cinemascore grade is an A- or better (Talk about grade inflation!!). The statistical gap between the two buckets is large enough to make the grade an effective differentiator of “really like” movies.
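For anyone who wants to see the arithmetic, here is a minimal sketch (in Python, which is not necessarily what the model itself uses) that collapses the per-grade counts from the table above into the two buckets and reproduces the 77% and 50% rates:

```python
# Per-grade Cinemascore results, taken directly from the table above:
# grade -> (movies graded, "really like" rate)
grades = {
    "A+": (51, 0.82), "A": (201, 0.80), "A-": (212, 0.73),
    "B+": (156, 0.58), "B": (117, 0.50), "B-": (52, 0.42),
    "C+": (21, 0.33), "C": (9, 0.11), "C-": (4, 0.00),
    "D+": (1, 0.00), "D": (0, 0.00), "D-": (1, 0.00),
}

def bucket_rate(bucket):
    """Total movies and weighted 'really like' rate for a set of grades."""
    total = sum(grades[g][0] for g in bucket)
    liked = sum(grades[g][0] * grades[g][1] for g in bucket)
    return total, liked / total

top = ["A+", "A", "A-"]
rest = [g for g in grades if g not in top]

print(bucket_rate(top))   # ~ (464, 0.77)  -> the A- or better bucket
print(bucket_rate(rest))  # ~ (361, 0.50)  -> everything else
```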

The practical effect of this change is that the Objective Top Ten will be weighted more toward mainstream movies; independent movies, for example, are less likely to be surveyed by Cinemascore. On the other hand, a movie like Hidden Figures, which already benefited from high IMDB and Rotten Tomatoes scores, now adds a Cinemascore grade of A+. That makes the model even more confident that it is a “really like” movie, so its probability rises, lifting it to the top of the list.
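To illustrate how an above-average grade moves the needle, here is a hedged sketch of a single-indicator Bayes update. The prior and the likelihoods are derived only from the bucket table above; the actual model also folds in IMDB, Rotten Tomatoes, and the other sites, and its exact weighting isn’t shown here.

```python
# Sketch of a one-indicator Bayesian update using only the Cinemascore buckets.
# All numbers are backed out of the tables above, not the model's real parameters.

prior = 0.65  # base "really like" rate across all 825 graded movies

# Roughly 536 of the 825 graded movies were "really like" (65%).
# Of the 464 movies graded A- or better, about 77% (357) were "really like".
p_top_given_like = 357 / 536           # P(grade >= A- | really like)  ~ 0.67
p_top_given_not = (464 - 357) / (825 - 536)  # P(grade >= A- | not)    ~ 0.37

# Bayes' theorem: probability of "really like" after seeing an A- or better grade
posterior = (prior * p_top_given_like) / (
    prior * p_top_given_like + (1 - prior) * p_top_given_not
)
print(round(posterior, 2))  # ~ 0.77: the grade lifts the 65% base rate to about 77%
```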

I’m excited about this enhancement and I hope you will be too.

***

I mentioned last week that I had my eye on two movies, The Beguiled and The Big Sick. I jumped the gun a little bit because both of these movies only went into limited release last Friday. The Beguiled goes into wide release tomorrow, while The Big Sick goes into wide release on July 14th. Baby Driver, which went into wide release yesterday, is another new movie that looks good from the early indicators.

Next Monday the Objective Top Ten will become the Objective Top Fifteen (just in case you needed something else to look forward to this weekend). Have a “Really Like” 4th of July weekend at the movies!

Author: Mad Movie Man

I love good movies. In my prior life I worked with predictive models. I've combined my love of movies with my prior experience to create a simple Bayesian probability model to help select movies that you will probably "really like".
