Will I "Really Like" this Movie?

Navigating Movie Website Ratings to Select More Enjoyable Movies

Musings After a Quiet Movie Weekend

There were no changes this week to the 2017 Objective Top Ten. None of the movies that opened last weekend earned a Certified Fresh on Rotten Tomatoes. So, I have nothing to talk about. Right? Oh, you are so wrong.

First, regarding that Objective Top Ten that I update every Monday, I want to be clear about something. I'm not suggesting that you will like every movie on that list. I'm not suggesting that there aren't good movies that didn't make the list. In fact, my two favorite movies so far, Beauty and the Beast and Gifted, aren't on the list. It is just an objective measure of quality. It doesn't take into account your personal taste in movies. For example, if you typically don't like Art House movies, you may not like Kedi, a documentary about the hundreds of thousands of cats that have roamed Istanbul for thousands of years, or Truman, a Spanish-language film that celebrates the enduring nature of good friendship. These low-budget movies tend to take risks and aren't intended to please a general audience. But would you really prefer to see the new Transformers movie, which opened yesterday and is 16% Rotten on Rotten Tomatoes? You may prefer to avoid all three movies, and that's okay. The point of the list is to give you a menu of quality movies; if any of them naturally intrigues you, the odds are that it will be a "really like" movie for you.

Turning from low-budget Art House films to big-budget Blockbusters, the success of two other movies on the list explains why movies based on comic books are here to stay for the foreseeable future. Logan, with its estimated $97 million production budget, and Wonder Woman, with its estimated $149 million budget, have returned tidy profits, with worldwide box office receipts of over $617 million and $578 million, respectively. When quality movies in the comic book genre are made, they spin box office gold.

A couple of other notes on the Objective Top Ten list. In July I plan to expand the list to fifteen movies, and in October I'll expand it again to twenty movies. This will better accommodate the number of quality movies typically released over the second half of the year. Also, I'm close to being able to incorporate Cinemascore grades into the probabilities for the Objective Top Ten, possibly as early as next Monday's update. This change will better differentiate one movie from the next.

Finally, two movies that I have my eye on for this weekend are The Beguiled, which earned Sofia Coppola the top director award at Cannes, and The Big Sick, which is already 98% Certified Fresh on Rotten Tomatoes.

Leave Mummy Out of Your Father’s Day Plans

One of the goals of this blog is to make sure that you are aware of the internet tools that can protect you from wasting your time on blockbusters like The Mummy. While it had a disappointing opening in the U.S., moviegoers still shelled out an estimated $32.2 million at the box office last weekend for this bad movie. Overseas, it met its blockbuster expectations with a box office of $141.8 million. However, if you were really in the mood for a horror movie, a better choice, though not a sure thing, might have been It Comes At Night, which had a more modest U.S. box office of $6 million.

As a general rule, I won't go to a movie on its opening weekend. I prefer to get at least a weekend's worth of data. But if you just have to see a movie on its opening weekend, here are a couple of hints. First, if you are seeing the movie on its opening Friday, the most reliable indicator is Rotten Tomatoes. Most critics release their reviews before the day of the movie's release, which makes the Rotten Tomatoes rating on release day a statistically mature evaluation of the movie. It won't change much after that day.

If you are going to the movies on the Saturday of opening weekend, you can add Cinemascore to the mix. I’ve blogged about this tool before. This grade is based on feedback moviegoers provide about the movie as they are leaving the theater. The grade is posted on the Saturday after the Friday release.

Finally, by Sunday IMDB will produce a pretty good, though slightly inflated, average rating for the movie.

The comparison of these three checkpoints for The Mummy and for It Comes At Night might’ve been helpful to those who thought they were in for a “really like” movie experience.

Movie               Rotten Tomatoes         IMDB Avg. Rating   Cinemascore Grade
The Mummy           Rotten (17%)            5.9                B-
It Comes At Night   Certified Fresh (86%)   7.2                D

While the Cinemascore grade of D for It Comes At Night would keep me away from opening weekend for both movies, if I had to see one, it wouldn’t be The Mummy.

Here's the data behind my reasoning. For IMDB, the breakpoint between a movie with a good chance that I will "really like" it and one that I probably won't is an average rating of 7.2. Movies with an IMDB average rating of 7.2 or higher I "really like" 63.3% of the time; movies with an IMDB rating below 7.2 I "really like" only 43.3% of the time. Turning to Rotten Tomatoes, movies that are Certified Fresh I "really like" 68% of the time. These "really like" percentages drop to 49.6% for movies that are Fresh and 37.5% for movies that are Rotten. So, absent any information based on my own personal tastes, I won't go to the movieplex for a movie unless it is Certified Fresh on Rotten Tomatoes and has an IMDB rating of 7.2 or higher. That doesn't mean there aren't movies that fall short of those criteria that I would "really like". A movie may be in a genre that appeals to me, which provides some tolerance for a little less quality. That being said, the odds that I'll "really like" a low rated movie are less than 50/50.
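
For the statistically inclined, here's a minimal sketch of that opening-weekend rule as code. The percentages come from my database; the function name and structure are purely illustrative, not part of my actual model.

```python
# A minimal sketch of the opening-weekend rule described above.
# The breakpoint and percentages come from my viewing history;
# the function itself is illustrative, not my actual model.

IMDB_BREAKPOINT = 7.2

# Observed "really like" rates by rating bucket.
RT_REALLY_LIKE = {"Certified Fresh": 0.680, "Fresh": 0.496, "Rotten": 0.375}
IMDB_REALLY_LIKE = {"7.2 or higher": 0.633, "below 7.2": 0.433}

def worth_opening_weekend(rt_bucket, imdb_rating):
    """See a new release only if both indicators clear their breakpoints."""
    return rt_bucket == "Certified Fresh" and imdb_rating >= IMDB_BREAKPOINT

print(worth_opening_weekend("Rotten", 5.9))           # The Mummy -> False
print(worth_opening_weekend("Certified Fresh", 7.2))  # It Comes At Night -> True
```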

I should probably explore adding Cinemascore to the objective factors I use in developing "really like" probabilities. To date, though, I haven't gathered any Cinemascore data, so I don't yet have a feel for its "really like" reliability. For now, I just use it as another piece of data that might tip me one way or the other if I'm on the fence about a new movie.

Enjoy Father’s Day but stay away from Mummy.

Wonder Woman Is Wonderful But Is It the GOAT Superhero Movie?

Everybody is talking about Wonder Woman and its record-breaking box office last weekend. Critics and audiences agree that Wonder Woman is worth a trip to the theater. The Mad Movie Man is convinced as well. You'll find the movie in the top half of the 2017 Top Ten List, and it's on my Watch List, which means I plan to see it within the next week.

I mentioned last week that critics were falling all over themselves in praising this movie with some calling it the Superhero GOAT (Greatest Of All Time). Does it warrant such acclaim? Maybe. When you compare it to four other highly rated superhero movies that kicked off franchises, it holds up pretty well.

                       Oscar Noms/Wins   IMDB Rating   Rotten Tomatoes Rating   Rotten Tomatoes % Fresh   Combined Rating
Wonder Woman (2017)    0/0               8.3           C. Fresh                 93%                       17.6
Iron Man (2008)        2/0               7.9           C. Fresh                 94%                       17.3
Batman Begins (2005)   1/0               8.3           C. Fresh                 84%                       16.7
Superman (1978)        3/0               7.3           C. Fresh                 93%                       16.6
Spider-Man (2002)      2/0               7.3           C. Fresh                 89%                       16.2

All four of these comparison movies were Oscar nominated. We’ll have to wait until next January to see if Wonder Woman earns Oscar recognition. The combined rating presented here totals the IMDB rating and the Rotten Tomatoes % Fresh (converted to a 10 pt. scale) to measure the response of both critics and audiences to the five movies. It is still early, and IMDB ratings tend to fade a little over time, but for now Wonder Woman is clearly in the GOAT discussion.
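
If you want to check the table's math yourself, here's the combined rating as a couple of lines of code; the rounding is mine, just to keep the output tidy.

```python
# Combined rating from the table above: IMDB rating (0-10) plus
# Rotten Tomatoes % Fresh rescaled to a 10-point scale.

def combined_rating(imdb_rating, pct_fresh):
    return round(imdb_rating + pct_fresh / 10.0, 1)

print(combined_rating(8.3, 93))  # Wonder Woman -> 17.6
print(combined_rating(7.9, 94))  # Iron Man -> 17.3
```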

If Wonder Woman holds on to its statistical GOAT position, it will be fueled by the response of women to the movie. A comparison of female and male IMDB ratings for these five movies lays this out pretty clearly.

                Female IMDB Rating   Male IMDB Rating   IMDB Rating Difference
Wonder Woman    8.6                  8.2                 0.4
Iron Man        7.9                  7.9                 0.0
Superman        7.3                  7.3                 0.0
Batman Begins   8.1                  8.3                -0.2
Spider-Man      7.1                  7.3                -0.2

While men "really like" Wonder Woman, women love the movie. Women are responding to a superhero movie like they never have before. Men, on the other hand, have a slight preference for Christopher Nolan's vision of Batman. I also have to admit that I personally consider Batman Begins one of the GOAT movies, irrespective of genre. That being said, I am really excited about seeing Wonder Woman.

***

After all of this praise for Wonder Woman, you might be wondering why it is only fifth on the 2017 Top Movies List. Does that mean that the four movies ahead of it are better movies? It might, but not necessarily. The top four movies all went into limited release in 2016 to qualify for Oscar consideration. They didn't go into wide release until early 2017, which is why they are on this list. All of the other movies on the list won't be considered for Oscar recognition until January 2018. As I mentioned last week, this list is based on objective criteria. The Oscar nominations that the top four movies received are additional objective evidence that they are quality movies. This allows the algorithm to be more confident in its evaluation of the movie and, as a result, produce a higher "really like" probability. Again, just in case you were wondering.


“Really Like” Previews of Coming Attractions 

Recently I mentioned to someone that I was a movie blogger. Naturally they assumed I wrote movie reviews. It did get me thinking, though, “what is my blog really about?”

Looking back at my previous 92 posts, it’s hard to discern a consistent theme. I confess that it has been hard to come up with a fresh idea every single week. The result has been a hodgepodge of articles that intersect movies and data, but lack a unifying core. That is…until now.

It occurs to me that, while I’m not in the movie reviewing business, I am in the movie previewing business. I use statistical analysis to preview what movies I might “really like”. It also occurs to me that I created my algorithm for my benefit, not yours. I write this blog, though, for your benefit.

With all of that in mind, I've decided to reorient this blog to a discussion of movies you might "really like", using my statistical analysis as the foundation of the discussion. My algorithm has two parts. The first produces a "really like" probability based on data from websites like Netflix, Movielens, and Criticker that are oriented to movies that I, personally, will "really like".

The second part of the equation is based on general data that has nothing to do with my personal taste in movies. IMDB and Rotten Tomatoes produce ratings based on the input of all of their website participants. Oscar award performance has nothing to do with me. I’m not a member of the academy. For now, these are the factors that go into my “really like” probability based on general information. It’s this “really like” probability that might be most useful to you, the followers of this blog.
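
To make the two-part structure concrete, here's a rough sketch. The input probabilities and the simple averaging are illustrative assumptions on my part, not the actual weights in my algorithm.

```python
# A rough sketch of the algorithm's two halves. The example values and
# the simple averaging are illustrative assumptions, not my real weights.

def average(probs):
    return sum(probs) / len(probs)

def really_like_probabilities(movie):
    # Part 1: sites that model my personal taste
    personal = average([movie["netflix"], movie["movielens"], movie["criticker"]])
    # Part 2: general data independent of my taste
    objective = average([movie["imdb"], movie["rotten_tomatoes"], movie["oscar"]])
    return personal, objective

example = {"netflix": 0.72, "movielens": 0.68, "criticker": 0.70,
           "imdb": 0.63, "rotten_tomatoes": 0.68, "oscar": 0.61}
print(really_like_probabilities(example))  # roughly (0.70, 0.64)
```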

On Monday I added a new list to this site. The Top Ten 2017 Movies Based on Objective Criteria uses this second half of my algorithm to suggest movies that you might “really like”. I intend to update this list every Monday after the initial data from the previous weekend’s new releases comes in. This Friday, for example, Wonder Woman goes into wide release. Some critics are calling it the “best superhero movie of all time”. It will be interesting to look at the data feedback on Monday to see if it’s actually trending that way.

I'm also exploring the addition of other general data to the criteria. For example, is there statistical significance to when a movie is released? I'm in the data gathering stage of that study. In future months I'm also planning to add Top Ten lists for years prior to 2017.

I will also continue to update my Watch List for the week every Wednesday. While it is based on movies I should "really like", you might find some movies there that pique your interest.

As for this blog, I plan to orient each week’s post around one or two of the movies on my lists and offer up some ideas as to why it might be a movie that you’ll “really like”. For now I would encourage you to check back on Monday to see if the hyperbolic buzz surrounding Wonder Woman is supported by strong enough numbers to move it into 2017’s “really like” Top Ten. Then, return again on Thursday to see what movies that you might “really like” have caught my eye.

This One Is All About You and Movielens

A few months ago my daughter texted me for recommendations for good movies on Netflix or Amazon Prime. I recommended a hidden treasure of a movie, Begin Again, but I couldn’t remember if it was on Netflix or Amazon. I knew it was on one of them. I had to go to each site to find the movie to nail down which streaming service it was on.

My daughter, and others like her, will no longer need to search blindly for movies on the streaming services they subscribe to if they’ve signed up to use my favorite movie recommender site, Movielens. Aside from being a very reliable movie recommender site, it is also the most useful in terms of finding movies to watch.

Within the last couple of months, Movielens has added its best feature to date. Once you've taken the time to rate the movies you've seen, not only can you get pages and pages of recommended movies, but now you can filter them by the most popular streaming services.

Movielens allows you to filter recommendations by movies currently on Netflix, Amazon, Hulu, HBO, and Showtime. You can filter them individually or in combination. In my case, I filter by Netflix, Amazon and HBO. This means that you can get a list of movies that you can watch right now, ranked by the probability that you will “really like” them.

If I go to the Home Page of Movielens right now and go to Top Picks, I can click on the filter’s drop down menu and select Streaming Services. This will provide me with a list of the five services mentioned previously. By clicking on Netflix, Amazon, and HBO, I get a list of movies that I can watch now that I haven’t seen before. There are 5,256 movies available for me to watch right now ranked from the one I’m most likely to enjoy, last summer’s box office surprise Me Before You (Amazon), to the movie I’m least likely to enjoy, The Admirer (Amazon). You’ve never heard of The Admirer? Neither have I. It is a 2012 Russian movie based on the love between Anton Chekhov and a young writer, Lidiya Avilova. ZZZ.

More often than not my posts are about my experiences in finding movies that I will “really like”. This one’s for you. If you only have time to track one movie recommender website, go to Movielens. It will be fun and it will save you time scrolling through lines and lines of movies searching for movies that you might like.

When It Comes to Unique Movies, Don’t Put All of Your Movie Eggs in the Netflix Basket.

It is rare to find a movie that isn’t a sequel, or a remake, or a borrowed plot idea, or a tried and true formula. Many of these movies are entertaining because they feel familiar and remind us of another pleasant movie experience. The movie recommender websites Netflix, Movielens, and Criticker do a terrific job of identifying those movie types that you “really liked” before and surfacing those movies that match the familiar plot lines you’ve enjoyed in the past.

But what about those movies that are truly original? What about those movies that present ideas and plot lines that aren't in your usual comfort zone? Will these reliable websites surface unique movies that I might like? Netflix has 20,000+ genre categories that they slot movies into. But what if a movie doesn't fit neatly into one of those 20,000 categories?

Yesterday I watched a great movie, Being There.

This 1979 movie, starring Peter Sellers in an Academy Award nominated performance, is about a simple-minded gardener who never left the home of his employer over the first fifty years of his life. Aside from gardening, the only knowledge he has is what he's seen on television. After his employer dies, he is forced to leave his home. The movie follows Chance the gardener as he becomes Chauncey Gardiner, economic advisor to the President. It's not a story of transformation but of perception. The movie is fresh.

My most reliable movie recommenders, Netflix and Movielens, warned me away from this movie. The only reason I added it to my weekly Watch List is because I saw the movie in the theater when it first came out in 1979 and “really liked” it.

Another possible reason why Netflix missed on this movie is that I hated Peter Sellers' other classic movie, Dr. Strangelove. I rated it 1 out of 5 stars. If Being There is slotted into a Netflix category of Peter Sellers classics, it might explain the mismatch.

Is it impossible, then, to surface creative, original movies that you might like? Not entirely. Criticker predicted I would rate Being There an 86 out of 100; I gave it an 89. The IMDB rating is 8.0, based on over 54,000 votes. Rotten Tomatoes has it at 96% Certified Fresh. This is why I incorporate ratings from five different websites into the "really like" model rather than relying on Netflix alone.

When it comes to unique movies, don’t put all of your movie eggs in the Netflix basket.


What Am I Actually Going to Watch This Week? Netflix Helps Out with One of My Selections.

The core mission of this blog is to share ideas on how to select movies to watch that we'll "really like". I believe there have been times when I've gotten bogged down in how to build the "really like" model. I'd like to reorient the dialogue back to the primary mission: what "really like" movies am I going to watch and, more importantly, why?

Each Wednesday I publish the ten movies on my Watch List for the week. These movies usually represent the ten movies with the highest “really like” probability that are available to me to watch on platforms that I’ve already paid for. This includes cable and streaming channels I’m paying for and my Netflix DVD subscription. I rarely use a movie on demand service.

Now, ten movies is too many, even for the Mad Movie Man, to watch in a week. The ten-movie Watch List instead serves as a menu for the 3 or 4 movies I actually most want to watch during the week. So, how do I select those 3 or 4 movies?

The first and most basic question to answer is who, if anyone, I am watching the movie with. Friday night is usually the night that my wife and I sit down and watch a movie together. The rest of the week I'll watch two or three movies by myself. So, right from the start, I have to find a movie that my wife and I will both enjoy. This week that movie is Hidden Figures, the 2016 Oscar nominated film about the role three black female mathematicians played in John Glenn's orbit of the earth in the early 1960s.

This movie became available to Netflix DVD subscribers on Tuesday, May 9, and I received my Hidden Figures DVD that day. Something I've learned over the years is that Netflix ships DVDs on Monday for movies that become available on Tuesday. For this to happen, you have to time the return of your old DVD to arrive on the Saturday or Monday before the Tuesday release. This gives you the best chance of avoiding "long wait" queues.

I generally use Netflix DVD to see new movies that I don’t want to wait another 3 to 6 months to see or for old movies that I really want to see but aren’t available on my usual platforms.

As of the first quarter of 2017, Netflix reported only 3.94 million subscribers to its DVD service. I am one of them. The DVD service is the only way you can still access Netflix's best-in-the-business 5-star system of rating movies. It is easily the most reliable predictor of how you'll rate a movie or TV show. Unfortunately, Netflix streaming customers no longer have the benefit of the 5-star system; they have gone to a less granular "thumbs up" and "thumbs down" rating system. To be fair, I haven't gathered any data on this new system yet, so I'll reserve judgment as to its value. As for the DVD service, they will have me as a customer as long as they maintain the 5-star recommender system as one of the benefits of being a DVD subscriber.

The 5-star system is a critical assist in finding a movie for both my wife and me. Netflix allows you to set up profiles for other members of the family. After my wife and I watch a movie, she gives it a rating and I give it a rating, each entered under our separate profiles. This allows a unique predicted rating for each of us based on our individual taste in movies. For example, Netflix predicts that I will rate Hidden Figures a 4.6 out of 5 and that my wife will rate it a 4.9. In other words, according to Netflix, this is a movie that both of us will not only "really like" but absolutely "love".

Hidden Figures has a "really like" probability of 61.4%. Its Oscar Performance probability is 60.7%, based on its three nominations. Its probability based solely on feedback from the recommender sites I use is 69.1%. At this point in time, it is a Quintile 1 movie from a credibility standpoint, which means the 69.1% probability is based on a limited number of ratings and isn't very credible yet. That's why the overall 61.4% "really like" probability sits closer to the Oscar Performance probability of 60.7%. I fully expect that, as more people see Hidden Figures and enter their ratings, the "really like" probability will move higher.
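
For the curious, here's one way the credibility blend might work mechanically. The 8.3% weight is not from my model; it's simply the value that reproduces the Hidden Figures numbers above, so treat it as illustration only.

```python
# A sketch of a credibility-weighted blend: the final probability sits
# between the Oscar Performance probability and the recommender-site
# probability, pulled toward the sites as ratings accumulate. The weight
# below is reverse-engineered from the Hidden Figures numbers and is
# purely illustrative.

def blended_probability(site_prob, oscar_prob, credibility):
    # credibility: 0.0 (no ratings yet) .. 1.0 (fully credible)
    return credibility * site_prob + (1 - credibility) * oscar_prob

# Quintile 1 ~ low credibility:
print(round(blended_probability(0.691, 0.607, 0.083), 3))  # -> 0.614
```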

Friday Night Movie Night this week looks like a “really like” lock…thanks to Netflix DVD.


Can You Increase Your Odds of Having a "Really Like" Experience at the Movie Theater?

Last Friday, my wife and I were away from home visiting two different sets of friends. One group we met for lunch. The second group we were meeting in the evening. With some time to spare between visits, we decided to go to a movie. The end of April usually has slim pickings for “really like” movies at the theater. With the help of IMDB and Rotten Tomatoes, I was able to surface a couple of prospects but only one that both my wife and I might “really like”. We ended up seeing a terrific little movie, Gifted.

My experience got me thinking about the probabilities of seeing "really like" movies at the movie theater. These movies have the least data on which to base a decision, and yet I can't recall many movies I've seen in the theater that I haven't "really liked". Was this reality or merely perception?

I created a subset of my database of movies that I’ve seen within 3 months of their release. Of the 1,998 movies in my database, 99 movies, or 5%, met the criteria. Of these 99 movies, I “really liked” 86% of them. For the whole database, I “really liked” 60% of the movies I’ve watched over the last 15 years. My average score for the 99 movies was 7.8 out of 10. For the remaining 1,899 movies my average score was 6.8 out of 10.
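
If you're wondering how a subset like this gets pulled, here's a minimal sketch. The record fields, the 90-day window, and the "really like" cutoff of 7 are assumptions for illustration; my actual criteria live in my database.

```python
# A minimal sketch of the subset analysis above, assuming each record
# holds the release date, the date I watched it, and my 0-10 score.
# The 90-day window and cutoff of 7 are illustrative assumptions.
from datetime import date, timedelta

def theater_window_stats(movies, cutoff=7):
    early = [m for m in movies
             if m["watched"] - m["released"] <= timedelta(days=90)]
    rate = sum(m["score"] >= cutoff for m in early) / len(early)
    avg_score = sum(m["score"] for m in early) / len(early)
    return len(early), rate, avg_score

sample = [{"released": date(2017, 4, 7), "watched": date(2017, 4, 28), "score": 8},
          {"released": date(2016, 11, 25), "watched": date(2017, 5, 12), "score": 7}]
print(theater_window_stats(sample))  # -> (1, 1.0, 8.0)
```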

How do I explain this? My working theory is that when a movie comes with an additional cash outlay, i.e. theater tickets, I become a lot more selective about what I see. But how can I be more selective with less data? I think it's by selecting safe movies. There are movies that I know I am going to like. When I went to the movie theater a couple of months ago to see Beauty and the Beast, I knew I was going to love it, and I did. Those are the types of movie selections I tend to reserve for the theater experience.

There are occasions, like last Friday, when a specific movie isn't drawing me to the movies; instead, I'm drawn by the movie theater experience itself. Can I improve my chances of selecting a "really like" movie in those instances?

Last week I mentioned that I needed to better define what I need my "really like" probability model to do. One of the things it needs to do is provide better guidance for new releases. The current model has a gap when it comes to new releases. Because the data is scarce, most new releases are Quintile 1 movies in the model. In other words, very little of the indicator data based on my taste in movies, i.e. Netflix, Movielens, and Criticker, is factored into the "really like" probability.

A second gap in the model is that new releases haven’t been considered for Academy Awards yet. The model treats them as if they aren’t award worthy, even though some of them will be Oscar nominated.

I haven't finalized a solution to these gaps, but I'm experimenting with one. As a substitute for the Oscar performance factor in my model, I'm considering a combined IMDB/Rotten Tomatoes probability factor. These two outputs are viable indicators of the quality of a new release. This factor would be used until the movie goes through the Oscar nomination process; at that point, it would convert to the Oscar performance factor.
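
Here's a sketch of what that interim substitution might look like. The field names and the simple average are my illustrative assumptions, not the finalized approach.

```python
# A sketch of the interim factor: until a movie has been through an
# Oscar nomination cycle, substitute a combined IMDB/Rotten Tomatoes
# probability for the Oscar performance factor. The simple average and
# field names are illustrative assumptions.

def quality_factor(movie):
    if movie["oscar_cycle_complete"]:
        return movie["oscar_performance_prob"]
    return (movie["imdb_prob"] + movie["rt_prob"]) / 2

new_release = {"oscar_cycle_complete": False, "imdb_prob": 0.63, "rt_prob": 0.68}
print(quality_factor(new_release))  # -> 0.655
```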

I’ve created a 2017 new release list of the new movies I’m tracking. You can find it on the sidebar with my Weekly Watch List movies. This list uses the new “really like” probability approach I’m testing for new releases. Check it out.

If you plan on going to the movies this weekend to see Guardians of the Galaxy Vol. 2, it is probably because you really liked the first one. Based on IMDB and Rotten Tomatoes, you shouldn’t be disappointed. It is Certified Fresh 86% on Rotten Tomatoes and 8.2 on IMDB.


“Really Like” Movies: Is That All There Is?

After scoring a movie that I've watched, one of my rituals is to read a critic's review of the movie. If the movie is contemporaneous with Roger Ebert's tenure as the world's most read critic, he becomes my critic of choice. I choose Ebert, first of all, because he is a terrific writer. He has a way of seeing beyond the entertainment value of a movie and observing how it fits into the culture of the time. I also choose Ebert because I find that he "really likes" many of the movies I "really like". He acts as a validator of my film taste.

The algorithm that I use to find “really like” movies to watch is also a validator. It sifts through a significant amount of data about a movie I’m considering and validates whether I’ll probably “really like” it or not based on how I’ve scored other movies. It guides me towards movies that will be “safe” to watch. That’s a good thing. Right? I guess so. Particularly, if my goal is to find a movie that will entertain me on a Friday night when I might want to escape the stress of the week.

But what if I want to experience more than a comfortable escape? What if I want to develop a more sophisticated movie palate? That won’t happen if I only watch movies that are “safe”. Is it possible that my algorithm is limiting my movie options by guiding me away from movies that might expand my taste? My algorithm suggests that because I “really liked” Rocky I & II, I’ll “really like” Rocky III as well. While that’s probably a true statement, the movie won’t surprise me. I’ll enjoy the movie because it is a variation of a comfortable and enjoyable formula.

By the same token, I don’t want to start watching a bunch of movies that I don’t “really like” in the name of expanding my comfort zone. I do, however, want to change the trajectory of my movie taste. In the end, perhaps it’s an algorithm design issue. Perhaps, I need to step back and define what I want my algorithm to do. It should be able to walk and chew gum at the same time.

I mentioned that I used Roger Ebert's reviews because he seemed to "really like" many of the same movies that I "really liked". It's important to note that, over his lifetime, Roger Ebert "really liked" many more movies than I have. Many of those movies are outside my "really like" comfort zone. Perhaps I should aspire to "really like" the movies that Ebert did, rather than find comfort in the fact that Ebert "really liked" the movies that I did.


Does Critic Expertise on Rotten Tomatoes Overcome the Law of Large Numbers?

In the evolution of my "really like" movie algorithm, one of the difficulties I keep encountering is how to integrate Rotten Tomatoes ratings in a statistically significant way. Every time I try, I rediscover that its ratings are not as useful as those of the other websites I use. It's not that it has no use. To determine whether a movie is worth seeing within a week of its release, you'll be hard-pressed to find a better indicator. The problem is that most of the data for a particular movie is counted in that first week. Most critic reviews are completed close to the release date to provide moviegoers with guidance on the day a movie is released. After that first week, the critics are on to the next batch of new movies to review. With all of the other websites, the ratings keep getting more reliable as more people see the movie and provide input. The data pool gets larger and the law of large numbers kicks in. With Rotten Tomatoes, there is very little data growth. Its value rests on the expertise of the critics and less on the law of large numbers.

The question becomes: what is the value of film critics' expertise? It is actually pretty valuable. When Rotten Tomatoes slots movies into one of its three main rating buckets (Certified Fresh, Fresh, Rotten), it creates a statistically significant differentiation.

Rating            "Really Like" %
Certified Fresh   69.7%
Fresh             50.0%
Rotten            36.6%

Rotten Tomatoes is able to separate pretty well those movies I “really like” from those I don’t.

So what's the problem? If we stick to Certified Fresh movies, we'll "really like" them 7 out of 10 times. That's true. And if I'm deciding which new release to see in the movie theater, that's really good. But if I'm deciding what movie my wife and I should watch on Friday night movie night, and our selection is from the movies on cable or our streaming services, we can do better.

Of the 1,998 movies I’ve seen in the last 15 years, 923 are Certified Fresh. Which of those movies am I most likely to “really like”? Based on the following table, I wouldn’t rely on the Rotten Tomatoes % Fresh number.

Rating            % Fresh Range   "Really Like" %
Certified Fresh   96 to 100%      69.9%
Certified Fresh   93 to 95%       73.4%
Certified Fresh   89 to 92%       68.3%
Certified Fresh   85 to 88%       71.2%
Certified Fresh   80 to 84%       73.0%
Certified Fresh   74 to 79%       65.3%

This grouping of six equal-size buckets suggests that there isn't much difference between a movie in my database that is 75% Fresh and one that is 100% Fresh. Now, it is entirely possible that there is an actual difference between 75% Fresh and 100% Fresh; if my database were larger, the data might produce a less random pattern that is statistically significant. For now, though, the data is what it is. Certified Fresh is predictive; the % Fresh part of the rating, less so.
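
For anyone who wants to replicate the bucketing, here's a rough sketch. The (pct_fresh, really_liked) data layout is an assumption for illustration; the sextile cut itself is the technique used for the table above.

```python
# A rough sketch of the sextile analysis above: sort Certified Fresh
# movies by % Fresh, cut them into six equal-size buckets, and measure
# the "really like" rate in each. Assumes at least six movies.

def sextile_rates(movies):
    ordered = sorted(movies)                      # sort by % Fresh
    n = len(ordered)
    buckets = [ordered[i * n // 6:(i + 1) * n // 6] for i in range(6)]
    return [(b[0][0], b[-1][0],                   # % Fresh range of the bucket
             sum(liked for _, liked in b) / len(b))
            for b in buckets]

# e.g. sextile_rates([(75, True), (88, False), ...]) -> [(low, high, rate), ...]
```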

Expertise can reduce the numbers needed for meaningful differentiation between what is Certified Fresh and what is Rotten. Without the law of large numbers behind it, though, Rotten Tomatoes can't provide credible guidance much beyond that.

