Will I "Really Like" this Movie?

Navigating Movie Website Ratings to Select More Enjoyable Movies

“Really Like” Movie Recommendations Are Even Better When You Exercise a Little Judgement

Last Saturday night my wife Pam and I watched 20th Century Women for our weekend movie night. If you’ve been following the Objective Top Twenty, you’ll note that this movie has been on the list for most of the year. We were pretty excited to see it. In the end, though, it wasn’t the movie we expected.

20th Century Women is a semi-autobiographical movie written and directed by Mike Mills, reminiscing about his teenage years in Santa Barbara, CA. The teenage stand-in for Mills is raised by a single mother, played by Annette Bening, with the assistance of two other women in his social circle.

It is an intriguing movie with interesting characters. I wasn’t bored by it, but the movie didn’t quite connect with me. As an aside, I found it interesting that Greta Gerwig, who co-stars as one of the other female influences in the story, turned around after this movie and drew on her own teenage experience in Sacramento, CA. Gerwig wrote and directed a similar movie, the recently released and highly acclaimed Lady Bird. While Mills made the mother the focus of his movie, Gerwig centered hers on Lady Bird, the teenager. Perhaps 20th Century Women would have connected with me more effectively if it had focused on the teenager, Jamie. Punk rock also has a prominent place in 20th Century Women, a music genre that passed me by with hardly an acknowledgement of its existence.

I ended up rating this movie as a “like” but not a “really like” movie. The “really like” algorithm estimated that there was a 67% probability that I would “really like” 20th Century Women. Is this simply a case of the movie landing in the 33% probability that I wouldn’t “really like” it? Sure, but that doesn’t mean there weren’t warning signs that it might end up in the 33%.

Without getting into the mathematical weeds of the algorithm, suffice it to say that the probability that I will “really like” a movie is a blend of the objective data that goes into the Objective Top Twenty and subjective data from Netflix, Movielens, and Criticker, which is based on my personal taste in movies. If the data from the subjective sites is limited, my “really like” probability is weighted heavily towards the objective data. On the other hand, if the subjective data is plentiful, then its recommendation is very reliable and my “really like” probability sits close to the subjective recommendation.
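To make the blend concrete, here's a minimal Python sketch. The linear credibility weight is an assumption of mine for illustration; the post doesn't publish the actual formula, only the blending idea.

```python
def really_like_probability(objective_pct, subjective_pct, credibility_quintile):
    """Blend the objective and subjective "really like" probabilities.

    credibility_quintile runs from 1 (unreliable personal data) to 5
    (very reliable). The linear weight below is a hypothetical stand-in
    for the real, unpublished weighting: Quintile 1 leans entirely on
    the objective estimate, Quintile 5 entirely on the subjective one,
    and Quintile 3 lands halfway between.
    """
    subjective_weight = (credibility_quintile - 1) / 4.0
    return (subjective_weight * subjective_pct
            + (1 - subjective_weight) * objective_pct)

# Nebraska (Quintile 3): halfway between 69.1% objective and 63.4% subjective
print(really_like_probability(69.1, 63.4, 3))
```

With that assumed weight, a Quintile 3 movie like Nebraska lands exactly halfway between its two estimates, which matches the table below; the end-quintile movies in the table don't sit exactly at the pure objective or subjective values, so the real weighting is clearly more nuanced than this sketch.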

You might find this illustration helpful. The Credibility Quintile organizes movies into five groups based on how reliable the subjective data is: Quintile 5 data is very reliable, Quintile 1 data is not. The five movies listed below all have close to the same probability that I will “really like” them but sit in different quintiles.

Movie | Credibility Quintile | Objective “Really Like” Probability | Subjective “Really Like” Probability | Probability I Will “Really Like” This Movie
Men of Honor | 5 | 63.4% | 69.0% | 67.2%
Far and Away | 4 | 61.6% | 69.6% | 66.6%
Nebraska | 3 | 69.1% | 63.4% | 66.3%
Fabulous Baker Boys, The | 2 | 65.3% | 69.9% | 67.0%
20th Century Women | 1 | 68.3% | 51.2% | 67.0%

While all five movies have roughly the same overall probability, they aren’t equally reliable. Men of Honor is clearly a movie that, according to the highly reliable Quintile 5 data, I will like more than the rest of the world, and the algorithm reflects that. The same could be said for Far and Away. The movie Nebraska, on the other hand, seems to be a movie that I would like less than the general public. Note that, as a Quintile 3 movie, its probability is halfway between the objective and the subjective probabilities.

It’s the last two movies that illustrate the point I want to make. The probability that I will “really like” The Fabulous Baker Boys is identical to 20th Century Women. Both movies are in below-average credibility quintiles. That is where the similarities end. When you look at the subjective probabilities for both movies, The Fabulous Baker Boys has a strong trend towards being a movie I will “really like”. Even without reliable data, it might be a movie worth taking a chance on. 20th Century Women is headed in the opposite direction, towards being a movie I probably wouldn’t “really like”. I should have caught that before watching the movie. It doesn’t mean I would have given up on the movie. It just means that I should have waited another cycle or two for more data to more reliably predict whether I would “really like” it or not.

Algorithms are tools to help you analyze data. Using algorithms to make decisions requires the exercise of a little judgement.

 

 


“Really Like” Movie Experiences With My Family at Thanksgiving

Over the course of a typical Thanksgiving weekend, movies have become a part of our family experience. We watch them. We discuss them. For me, my family is my own private focus group. They challenge my ideas and generate new avenues of thought to explore.

This Thanksgiving was no different as my wife Pam and I flew into Seattle to visit with Meggie, Richie and Addie, our daughter, son-in-law and 4-month-old granddaughter. Our son Brendan and his girlfriend Kristen (a very loyal follower of this blog) flew in from Boston. And our youngest, Colin, made the trip up the coast from L.A. With our family scattered from coast to coast, these family gatherings are very special.

Movies aren’t the only topic of conversation, especially when Addie’s in the room, but they do surface from time to time. Richie and I had a conversation about my Objective Top Seven from the years 1992 to 1998 that was in my last post. While he thought Schindler’s List was good, he would never put it at number one. He liked movies that made him feel happy when they were over. Now, Scent of a Woman, that was a movie on my list he could get on board with. On the other hand, my son Brendan couldn’t understand why his favorite movie Braveheart wasn’t on the list.

My conversations with Richie and Brendan illustrate why I rank movies based on “really like” probabilities. What movies we like and why we like them are unique to our own experiences and tastes. Many of us watch a movie to boost our mood. Schindler’s List is not a mood booster. On the other hand, if we are in the mood to expose ourselves to a harsh reality of the human experience and have our emotions touched in a very different way, there are few movies as moving as Schindler’s List. I confess that, like Richie, I prefer the mood boost to the harsh reality of life. The movie Moonlight has been sitting on my Watch List for some time now, waiting for me to be in the mood to experience it.

Later in the weekend, Meggie and Colin watched The Big Sick with me on Amazon Prime. They were really excited to see it based on the enthusiastic recommendations from Pam and me, and from many of the other people in their lives. At the end of the movie, they indicated that they both liked it but expected more from a movie that everyone else had raved about. It gave me another interesting insight into why people “really like” some movies but not others. Your expectation for a movie can significantly shape your opinion of it. Watching a movie that others say you “gotta see” may set the bar so high that only the great movies will reach it. A merely really good movie has no shot.

That expectations shape your opinion of a movie is a truism. If I flip the scenario to movies I’ve stumbled upon that became unexpected treasures, I can attest to a second truism: good movies that fly under the radar are enjoyed more than they have any reason to be. One of my personal top fifty movies is the greatest baseball movie few people have seen, Bang the Drum Slowly. Fewer than 5,000 voters have rated it on IMDB. Released in 1973, it stars De Niro before he was “De Niro”. At the time, it didn’t go entirely unnoticed; the movie earned a Best Supporting Actor nomination for Vincent Gardenia. I only saw it because I went to a double feature at the drive-in. The second movie was one of those “gotta see” movies. Bang the Drum Slowly was the first. That’s the movie I fondly remember today, not the second feature.

Rating movies is not a science. Movie fans who rate movies on websites like IMDB don’t use a Pythagorean Formula to derive that one correct answer. But it’s from these disparate reasons for each individual rating that I try to tease out some understanding each week as to which movies you will “really like”.

I am very thankful for the strong support and inspiration of my family at Thanksgiving and all of the other 364 days of the year.

 

There Are No Turkeys in the Objective Top Seven Movies From 1992 to 1998

Shall we call it “The Drive for Twenty Five”? If so, this installment of our journey to the Objective Top Twenty Five Movies of the last twenty five years raises the question: which of these Cinematic Seven will survive to Twenty Five? Adding 1998 to the Objective Database makes more discrete groupings of data statistically viable. As future years are added, the number of groupings will grow, resulting in many changes to this list. Of the initial Top Six list published just two weeks ago, only three movies remain in the Top Seven. I think we can expect this kind of volatility with each year we add. How many of these movies will be in the Top Twenty Five at the end? Fewer than we’d expect, I’m sure.

Here’s our significant seven:

7. Scent of a Woman (IMDB 8.0, Certified Fresh 88%, CinemaScore A, Major Academy Award Win)

This movie is a favorite of mine. It produced Al Pacino’s only Academy Award win after being shut out for his seven previous nominations.

6. Good Will Hunting (IMDB 8.3, Certified Fresh 97%, CinemaScore A, Major Academy Award Win)

One of my followers wondered why his favorite movie didn’t make the list. Good Will Hunting is a good illustration of what it takes. It requires high ratings from all feedback groups, movie watchers, movie critics, opening night moviegoers, and peer movie artists.

5. The Shawshank Redemption (IMDB 9.3, Certified Fresh 91%, CinemaScore A, Major Academy Award Nomination)

Another one of the original Top Six. The Achilles’ heel for this movie, from an objective rating standpoint, is its failure to win a major Academy Award despite three major nominations.

4. The Usual Suspects (IMDB 8.6, Certified Fresh 88%, No CinemaScore rating, Major Academy Award Win)

Because this is an objective ranking rather than a subjective one, Kevin Spacey movies are still considered. In the long run, I wonder how much the absence of a CinemaScore rating will hurt this movie, and whether it should.

3. The Lion King (IMDB 8.5, Certified Fresh 83%, CinemaScore A+, Minor Academy Award Win)

A few weeks before the release of this picture, Elton John was given a private screening of the movie. He noticed the love song he wrote wasn’t in the film and successfully lobbied to have it put back in. That song, Can You Feel the Love Tonight, won Elton John an Academy Award for Best Original Song.

2. Saving Private Ryan (IMDB 8.6, Certified Fresh 92%, CinemaScore A, Major Academy Award Win)

The only movie from the just added 1998 year to make the list. It is also the only movie on the list to be the top grossing movie for the year it was released.

1. Schindler’s List (IMDB 8.9, Certified Fresh 96%, CinemaScore A+, Major Academy Award Win)

According to the Objective “Really Like” algorithm, a 76.98% “really like” probability is the highest score that can be achieved with the algorithm. So far, Schindler’s List is the only movie with that perfect score.

***

Disney animated movies rule Thanksgiving weekend. According to Box Office Mojo, Disney owns 9 of the 10 highest grossing Thanksgiving movies of all time. Coco, which opened in theaters yesterday, is this year’s entrant into their tradition of Thanksgiving dominance. Early IMDB ratings give it a 9.1 average rating to go along with its 96% Certified Fresh Rotten Tomatoes rating. This morning CinemaScore gave it an A+ rating.

Also, two more Oscar hopefuls go into limited release this weekend. Darkest Hour is the perfect bookend to Dunkirk. It follows Winston Churchill’s response to the events at Dunkirk. Gary Oldman’s portrayal of Churchill has him on everyone’s short list for Best Actor. Also worth considering is a festival favorite, Call Me By Your Name, which was nominated this week for an Independent Spirit Award for Best Picture.

Happy Thanksgiving to you and your families.

Add a Year Here. Tweak a Formula There. And, the Objective Top Twenty Looks Very Different.

I was able to add 1998 to the Objective Database last weekend. The extra data allowed me to factor in Oscar wins to the algorithm. But, it was one little tweak to the Oscar performance factor that dramatically altered the 2017 Objective Top Twenty this week.

For the Oscar performance part of my algorithm I created five groupings of movies based on their highest Academy Award achievement:

  • Group 1: won in a major category
  • Group 2: nominated for a major category but didn’t win
  • Group 3: not nominated for a major category but won in a minor one
  • Group 4: nominated only in a minor category, without a win
  • Group 5: not nominated in any Oscar category
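The five groupings amount to an ordered check from best achievement to worst. A small sketch (the function name and boolean inputs are my framing, not the post's):

```python
def oscar_group(major_win, major_nomination, minor_win, minor_nomination):
    """Return the Oscar-performance group (1 = best) for a movie,
    checking its highest achievement first, as described above."""
    if major_win:
        return 1  # won in a major category
    if major_nomination:
        return 2  # major nomination, no win
    if minor_win:
        return 3  # no major nomination, but a minor win
    if minor_nomination:
        return 4  # only a minor nomination
    return 5      # no nominations at all

# A movie with a minor win but only a major nomination still lands in group 2,
# because the major nomination is the higher achievement.
print(oscar_group(False, True, True, True))  # prints 2
```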

Here is the percentage of movies in each group with an average IMDB rating of 7 or better:

Best Oscar Performance | % 7+ IMDB Avg. Rating
Major Win | 90.3%
Major Nomination | 87.7%
Minor Win | 79.7%
Minor Nomination | 71.7%
No Nominations | 59.8%

Wins seem to matter, particularly for the minor categories. Major nominations clearly are better “really like” indicators than minor nominations. It’s the no nominations grouping that’s most revealing. If a movie doesn’t get at least one nomination, the odds of it being a “really like” movie are dramatically reduced. This led to my discovery of some faulty thinking on my part.

If movies like Dunkirk, Lady Bird, and Three Billboards Outside of Ebbing, Missouri, all of them headed towards major Oscar nominations in January, are treated in my algorithm as if they failed to earn a single Oscar nomination, those movies are being unfairly penalized. It was this flaw in my system that needed fixing. Now, movies that haven’t yet gone through the Oscar nominating process are designated as Not Applicable, and no Oscar performance test is applied to them. Without the weight of the No Nominations designation, many of the movies that didn’t get their first release until 2017 have risen significantly in the 2017 Objective Top Twenty rankings.

***

Get ready for a Thanksgiving treat. Now that 1998 has been added to the Objective Database, we can reveal the Objective Top Seven Movies from the years 1992-1998. Adding Academy Award Wins to the mix will shake up those rankings as well. Check in next Thursday after you’ve taken your post-turkey dinner nap.

***

The wide releases this weekend are Justice League, The Star, and Wonder, but it’s the limited release, Mudbound, that I’ll be watching closely. This movie, set in the post-WWII rural American South, is being mentioned as a Best Picture contender. Here’s the thing, though. Most people won’t see it in the movie theater since it opens simultaneously on Friday on Netflix streaming. Can a movie that is more widely viewed at home than in the theater gain Academy Award traction? Stay tuned.

 

Objectively Speaking, What Are the Top Six Movies From 1992 to 1997?

I’ll admit that a Top Six list from a seemingly random six-year period seems a little odd, but there is a method to my Movie Madness.

As I’ve mentioned on more than one occasion, I’m building a twenty five year movie database with solely objective factors to better identify those movies most of us would “really like”. It’s a time consuming process. If I’m uninterrupted by other priorities in my life, I can usually add a complete year to the database in a week and a half. There will always be interruptions, though, and I don’t expect to finish my project before mid-year 2018.

I’m a little impatient to get some useful information from my efforts, so I thought it might be fun to create an Objective Best Movie List for however many years I’ve completed. Six years are done, which gives me a list of the best six movies from that time frame. I should complete 1998 by the weekend, and after incorporating the new data into my algorithm I’ll be able to create a Top Seven list. Now that you have the picture, here’s the Top Six in ascending order.

6. Sense and Sensibility (1995). IMDB Avg. 7.7, Certified Fresh 80%, CinemaScore A, Oscar- 3 Major nominations, 4 Minor

This was the first of a mid-1990’s run of Jane Austen titles to make it to the big screen. Emma Thompson won the Oscar for Best Screenplay. She is the only person to ever win both a Best Acting and a Best Screenwriting award. The movie is also noteworthy for the breakthrough performance of Kate Winslet who at age 20 earned her first of seven Oscar nominations.

5. In the Name of the Father (1994). IMDB Avg. 8.1, Certified Fresh 94%, CinemaScore A, Oscar- 4 Major nominations, 3 Minor

This is the movie that will probably surprise many of you. This biopic of Gerry Conlon, who was wrongly imprisoned for an IRA bombing, was the second of Daniel Day-Lewis’ five Best Actor nominations. He lost 30 pounds in preparation for the role and spent his nights on the set in the prison cell designed for the movie.

4. Good Will Hunting (1997). IMDB Avg. 8.3, Certified Fresh 97%, CinemaScore A, Oscar- 4 Major nominations, 5 Minor

This movie is in my personal top ten. Two relatively unknown actors, Matt Damon and Ben Affleck, became stars overnight and won Oscars for Best Screenplay as well. If either of them ever gets a Best Acting award, he’ll join Emma Thompson in that select group. In his fourth nominated performance, Robin Williams won his only Oscar, for Best Supporting Actor.

3. Toy Story (1995). IMDB Avg. 8.3, Certified Fresh 100%, CinemaScore A, Oscar-1 Major Nomination, 2 Minor

Toy Story’s ranking is driven by its 100% Fresh Rotten Tomatoes rating from 78 critics. While its Oscar performance is weaker than the other movies on the list, it should be noted that Toy Story was the first animated movie to ever be nominated for Best Screenplay. As the database grows, I would expect that the number of Oscar nominations and the number of wins will become credible factors in these rankings. For now, receiving one Major and one Minor nomination has the same impact on the algorithm as for a movie like Titanic that won eleven awards. This is probably the only movie of the six that appears out of place in the rankings.

2. Shawshank Redemption (1994). IMDB Avg. 9.3, Certified Fresh 91%, CinemaScore A, Oscar- 3 Major nominations, 4 Minor

Shawshank still ranks as IMDB’s top movie of all time. At some point, I’m going to write an article about movies that achieve cult status after having only modest success at the box office. Shawshank would be one of those movies. After a pedestrian $28,341,469 domestic gross at the Box Office, it became one of the highest grossing video rentals of all time.

1. Schindler’s List (1994). IMDB Avg. 8.9, Certified Fresh 96%, CinemaScore A+, Oscar- 4 Major nominations, 8 Minor

Interestingly, this is the only movie of the six on the list to win Best Picture. It is also the only one on the list to earn an A+ from CinemaScore. Combine that with its twelve Oscar nominations and you can see why, objectively, it is at the top of the list.

Objectivity improves as data grows. It should be fun to see this list change as the database grows.

What do you think?

 

What Does the Best Picture Oscar Race Look Like Today?

I was away most of the week and so I wasn’t able to update my databases, my lists, or come up with new interesting studies. But, that doesn’t mean I haven’t been thinking about “really like” movies.

This time of year I follow AwardsCircuit.com for the latest thinking in the Oscar race. AwardsCircuit updated their projected nominees this past Monday, and with nine weekends left in the year, eight of the ten Best Picture projections have not yet gone into wide release. Does this mean that the best is yet to come? It could. But it could also mean that they are still hyped for Best Picture because their exposure to critics and audiences has been limited.

There were other movies that have already been released that were expected to be Best Picture contenders. Of these only Dunkirk and Blade Runner 2049 have met their pre-release expectations and are still considered Best Picture caliber movies. Other Best Picture hyped movies, like Battle of the Sexes, Marshall, Suburbicon, and Mother, have either wilted or flopped when exposed to critics and audiences. The same could happen to the eight pre-release movies still projected for Best Picture nominations.

If Dunkirk and Blade Runner 2049 have survived the scrutiny of critics and audiences to remain Best Picture contenders, how do the remaining eight projected contenders measure up to those movies so far? All eight have been seen at film festivals, to a limited degree, by critics and audiences, so there is some feedback on how these movies are trending. Using average ratings from IMDB and Rotten Tomatoes % Fresh ratings, we can get some early feedback on how those eight movies are faring. I’ve converted the Rotten Tomatoes % Fresh to a ten-point scale to get an apples-to-apples comparison with IMDB. I’ve also included the four movies mentioned above that haven’t lived up to the hype so far. The eight pre-release contenders are in bold on the list.

Movie | IMDB | Rotten Tomatoes | Total Score
Call Me By Your Name | 8.3 | 9.8 | 18.1
Three Billboards Outside of Ebbing, Missouri | 8.3 | 9.8 | 18.1
Lady Bird | 7.8 | 10.0 | 17.8
Dunkirk | 8.3 | 9.2 | 17.5
Blade Runner 2049 | 8.5 | 8.8 | 17.3
Shape of Water, The | 7.5 | 9.7 | 17.2
I, Tonya | 7.4 | 9.1 | 16.5
Mudbound | 6.3 | 9.5 | 15.8
Battle of the Sexes | 6.9 | 8.5 | 15.4
Marshall | 7.0 | 8.3 | 15.3
Mother | 7.1 | 6.9 | 14.0
Last Flag Flying | 6.7 | 6.8 | 13.5
Darkest Hour | 5.3 | 7.9 | 13.2
Suburbicon | 4.7 | 2.5 | 7.2
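The Total Score column above is just the IMDB average plus the % Fresh rescaled from 100 points to 10. A quick sketch:

```python
def total_score(imdb_avg, percent_fresh):
    """Rescale Rotten Tomatoes % Fresh to a ten-point scale and add the
    IMDB average rating, giving a combined score out of 20."""
    return imdb_avg + percent_fresh / 10.0

# Call Me By Your Name: 8.3 IMDB average, 98% Fresh
print(round(total_score(8.3, 98), 1))  # prints 18.1, matching the table
```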

If the post-release feedback is consistent with the pre-release feedback, then Call Me By Your Name, Three Billboards Outside of Ebbing, Missouri, and Lady Bird are the real deal. The Shape of Water and I, Tonya also appear solid. Mudbound could be on the fence. The early audience response to Last Flag Flying and Darkest Hour may be a warning sign that these movies have been overhyped. If they falter, Battle of the Sexes could move back into contention. You could also see two movies that haven’t been seen by either critics or audiences yet, The Post and Phantom Thread, emerge as contenders, or a dark horse like The Florida Project (IMDB=8.1, Rotten Tomatoes=97% Fresh) sneak in. There are still many twists and turns before Best Picture nominations are announced in January.

The first of these eight movies to test itself will be Lady Bird, which goes into limited release this coming weekend. With fifty critic reviews registered on Rotten Tomatoes, it is still at 100% Certified Fresh. This is one that I’ll probably see in the theater. Saoirse Ronan has become one of my favorite young actresses.

The Objective Top Twenty Doesn’t Account for Personal Taste

Over the last few months I’ve spent a lot of time introducing you to the 2017 Objective Top Twenty Movies. But, let’s be clear. It isn’t my list of the top twenty movies so far. As a matter of fact, I’ve only seen a handful of the movies and I may only see a handful more in the future. There are some movies on the list that I’ll never watch. At the end of the day, which movies you watch on the list is a matter of personal taste.

The Objective Top Twenty is a ranking of movies that, based on the data available today, I’m most confident are good movies, regardless of personal taste. Hidden Figures and Lion are at the top of the list because there is more data available for them than for any other movies on the list, and that mature data continues to support their quality. I can say with a high degree of confidence that both are really good movies. On the other hand, Blade Runner 2049, which probably is a good movie, just doesn’t yet have the data to confidently support that subjective opinion.

While I’m confident all of the movies on the Objective Top Twenty are good movies, I’m not confident that you, personally, would “really like” every movie on the list. In fact, I’m fairly confident you wouldn’t like every movie on the list. Our personal taste in movies reflects our life experiences. Those movies that we “really like” somehow connect with our memories, our aspirations, our curiosity, or represent a fun place to escape. Not every movie on the Objective Top Twenty is going to make the personal connection needed to make it a “really like” movie for each of us.

So, which of the Objective Top Twenty should you watch? Other than using the websites I promote in this blog, most people use trailers to see if they connect with a small sample of the movie. If it’s an Objective Top Twenty movie and the trailer connects with you, that’s not a bad approach. The only caution is that sometimes a trailer leaves you with the impression that a movie is about X when it’s really about Y.

My recommendation is to use at least one personal rating website that will model your personal taste in movies. I use three, Netflix-DVD, Movielens, and Criticker. There are links for all three at the top of the page. I’ve created a subjective “really like” model to go along with the objective model used to create the Objective Top Twenty. Here’s a ranking of the Objective Top Twenty based on the probability today that I will personally “really like” the movie.

2017 Released Movies | Subjective “Really Like” Probability | Objective “Really Like” Probability | My Rating for Seen Movies
Hidden Figures | 74.3% | 76.78% | 7.9
Lion | 74.0% | 76.00% | 7.9
Wonder Woman | 73.2% | 71.39% | 8.5
Dunkirk | 72.7% | 70.71% | 8.4
Patriots Day | 72.7% | 71.01% |
Spider-Man: Homecoming | 71.9% | 71.39% |
Logan | 71.3% | 70.71% |
Big Sick, The | 69.5% | 70.56% | 8.4
Guardians of the Galaxy Vol. 2 | 69.2% | 71.01% |
Only the Brave | 62.6% | 71.01% |
Monster Calls, A | 62.2% | 71.01% |
Land of Mine | 61.2% | 74.72% |
Salesman, The | 59.2% | 75.18% |
I Am Not Your Negro | 56.0% | 75.18% |
Kedi | 52.4% | 70.56% |
Florida Project, The | 51.6% | 70.56% |
Truman | 50.8% | 70.56% |
20th Century Women | 50.5% | 75.21% |
Silence | 48.7% | 72.78% |
Lucky | 45.7% | 70.56% |

The movies that I’ve seen so far are, for the most part, the movies at the top of the list. I’ve, in effect, ranked the Objective Top Twenty based on those movies with the greatest probability that I will “really like” them. I am certain that I will watch all of the top nine movies on this list. I will probably watch some of the remaining eleven movies on the list. I will definitely not watch all of them.

However you choose to do it, the Objective Top Twenty needs a personal touch when you use the list to pick movies to watch. I can only guarantee that they are good movies. It’s up to you to figure out which ones will be “really like” movies for you.

I’m Stating the Obvious But You Will Probably “Really Like” Oscar Nominated Movies.

You are more likely to “really like” a movie that has received an Oscar nomination than one that hasn’t. Now, there’s a bold statement. But while most people would intuitively agree with the statement, I have statistical data to support it.

As followers of this blog are aware, I’m building a database of objective movie ratings data from the past 25 years. Last week I added a fifth year of data. With each year that I add, I can pose questions that are easier to test statistically, such as: do Oscar nominations have “really like” statistical significance? I even take it a step further by exploring whether there are differences between major nominations and minor ones.

Major nominations are the commonly accepted major awards for Best Picture, Director, Actor, Actress, and Screenplay. Minor nominations cover all of the other categories presented on Oscar night, excluding the special technical awards presented in a separate ceremony.

Here are the results for the years 1992 to 1996. The movies are grouped by whether they were awarded at least one major and/or minor nomination. The table represents the percentage of IMDB voters who gave the movies in each group a rating of 7 or higher.

Movies with: | % 7+
Major & Minor Nominations | 90.5%
Major Nominations Only | 84.6%
Minor Nominations Only | 74.7%
No Nominations | 61.4%
All Sample Movies | 73.0%

Major nominations have a clear statistical advantage over minor nominations. The size of the gap between movies with just minor nominations and those with no nominations might be surprising. My gut tells me that this gap will narrow as we add more years, especially when we add more recent years. But, it is interesting nonetheless. It does suggest that members of the Academy of Motion Picture Arts and Sciences (AMPAS) understand their craft and that knowledge does a great job identifying the “really like” movies released in a given year.

There are more questions to answer regarding Oscar performance as a “really like” indicator. What is the predictive value of an Oscar win? Does predictive value increase with the number of nominations a movie receives? Does a Best Picture nomination have more predictive value than any other category? All of these questions and more will have to wait for more data.

One question we have answered is why all of the movies at the top of the Objective Top Twenty are Oscar nominated movies from last year’s voting. The other takeaway is that all of the other movies on the list that didn’t go through last year’s nominating process, probably won’t stay on the list unless their name is called on January 23, 2018 when this year’s Oscar nominations are announced.

***

It might be a light weekend for new Objective Top Twenty contenders. I’m keeping my eye on Only The Brave which chronicles the story of the Granite Mountain Hotshots, one of the elite firefighting units in the USA. As of this morning, it is 89% Fresh on Rotten Tomatoes and has a 7.3 on IMDB.


In the Objective Top Twenty, a Certified Fresh Is a Must…But Is It Enough?

When you review the Objective Top Twenty you’ll notice that every movie has earned a Certified Fresh designation from Rotten Tomatoes. It is a dominant factor in my rating system. It may even be too dominant.

All of the analysis that I’ve done so far suggests that a Certified Fresh designation by Rotten Tomatoes is a strong indicator of a “really like” movie. The new Objective Database that I’m working with also shows that a Certified Fresh rating results in a high likelihood that IMDB voters will rate the movie a 7 or higher.

Rotten Tomatoes Rating | # of IMDB Votes | IMDB Votes 7+ %
Certified Fresh | 19,654,608 | 88.2%
Fresh | 6,144,742 | 75.4%
Rotten | 9,735,096 | 48.5%

And, as you might expect, the likelihood of a 7 or higher rating stair steps down as you move into the Fresh and Rotten groups of movies.

This exposes a flaw in my previous thinking about Rotten Tomatoes. In the past I’ve indicated that I haven’t seen a statistical relationship between the % Fresh and the likelihood of a “really like” movie. And, actually, that’s a true statement. The flaw in my thinking was that because I didn’t see it I assumed it didn’t exist.

The Certified Fresh, Fresh, and Rotten designations are primarily defined by % Fresh:

  • Certified Fresh for most movies is > 75% Fresh
  • Fresh for most movies is > 60% and < 75% Fresh
  • Rotten is < 60% Fresh
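As a sketch, the designations can be approximated from % Fresh alone. Note that the real Certified Fresh badge also carries requirements (such as a minimum review count) that this ignores, and how the exact 60%/75% boundaries are handled here is my assumption:

```python
def rt_designation(percent_fresh):
    """Approximate a movie's Rotten Tomatoes designation from % Fresh.

    Simplified sketch: the actual Certified Fresh rules also require a
    minimum number of reviews, and the treatment of movies sitting
    exactly on the 60% and 75% boundaries is assumed here.
    """
    if percent_fresh >= 75:
        return "Certified Fresh"
    if percent_fresh >= 60:
        return "Fresh"
    return "Rotten"

print(rt_designation(91))  # The Shawshank Redemption's 91%: Certified Fresh
```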

If differentiation exists for these three groups then it should exist between other % Fresh groups. For example, movies that are 95% Certified Fresh should have a greater “really like” probability than movies that are 80% Certified Fresh. I now believe that I haven’t seen the difference because there hasn’t been enough data to produce stable differences.

Marrying Rotten Tomatoes data with IMDB data also gives me more to work with. Below I’ve grouped the Certified Fresh movies into four groups based on % Fresh.

Certified Fresh   # of IMDB Votes   IMDB Rating 7+ %
100%                      966,496              90.7%
90-99%                 10,170,946              89.9%
80-89%                  5,391,437              87.3%
70-79%                  3,125,729              83.5%

We may finally be seeing the differences you’d expect to see now that the sample sizes are larger.
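
The grouping above can be reproduced in a few lines: bucket the movies by % Fresh, then compute the vote-weighted share of 7+ ratings in each bucket. The records below are invented for illustration; they are not the actual database:

```python
# Hypothetical per-movie records: (% Fresh, total IMDB votes, votes of 7+).
# These numbers are made up for illustration, not the blog's data.
movies = [
    (100, 120_000, 109_000),
    (95,  800_000, 715_000),
    (85,  400_000, 350_000),
    (72,  250_000, 205_000),
]

def bucket_7plus_share(records, lo, hi):
    """Vote-weighted share of 7+ ratings for movies with lo <= % Fresh < hi."""
    total = sum(votes for pct, votes, _ in records if lo <= pct < hi)
    sevens = sum(s for pct, _, s in records if lo <= pct < hi)
    return sevens / total if total else None

for label, (lo, hi) in [("100%", (100, 101)), ("90-99%", (90, 100)),
                        ("80-89%", (80, 90)), ("70-79%", (70, 80))]:
    share = bucket_7plus_share(movies, lo, hi)
    print(f"{label}: {share:.1%}")
```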

So, why is this important? If we treat all Certified Fresh movies as strong “really like” prospects, we are in effect saying that we are as likely to “really like” The Shawshank Redemption (Certified Fresh 91%, IMDB Avg. Rating 9.3) as The Mask (Certified Fresh 77%, IMDB Avg. Rating 6.9). The “really like” model becomes a more dynamic movie pre-screening tool if it can make a Rotten Tomatoes distinction between those two movies.

I believe that the database has to get much larger before we can statistically differentiate between Certified Fresh 87% movies and Certified Fresh 85% movies. But, I think I can begin to integrate the Certified Fresh groupings I developed above to create some additional means of defining quality movies within the Certified Fresh grade.

You might just see this change in next Monday’s Objective Top Twenty.

***

In looking at this weekend’s new releases, there are no sure things but three of the movies are worth keeping an eye on. The Foreigner, the Jackie Chan action thriller, is getting good early feedback from critics and IMDB voters. I expect it to do well at the box office. Marshall, the Thurgood Marshall bio-pic starring Chadwick Boseman, has received some early Oscar buzz. It appears to be headed towards a Certified Fresh rating from Rotten Tomatoes. The movie that may sneak up on audiences is Professor Marston and the Wonder Women. Professor Marston created the character of Wonder Woman in the 1940s. This movie tells that story. Already 34 of 38 critics have given it a Fresh rating on Rotten Tomatoes. I would expect it to receive its Certified Fresh designation by tomorrow morning.


Will “You” Really Like This Movie?

If you reviewed this week’s Objective Top Twenty, you might have noticed something other than five additional movies on the list. You might have noticed that, other than Hidden Figures holding onto the number one spot on the list, all of the rankings had changed.

A few months back I mentioned that I was developing a new objective database to project “really like” movies, one not influenced at all by my taste in movies. This week’s Objective Top Twenty reflects the early fruits of that labor.

The plan is to build a very robust database of all of the movies from the last twenty-five years that finished in the top 150 in box office sales for each year. I have 1992 through 1995 completed, which gives me enough movies to get started with.

The key change in the “really like” formula is that my algorithm measures the probability that users of the IMDB database will rate a particular movie as a 7 out of 10 or higher, which is my definition of a “really like” movie. The key components of the formula are IMDB Average Rating, Rotten Tomatoes Rating, CinemaScore Grade, and the number of Academy Award wins and nominations for the major categories and for the minor categories.

In future posts, I’ll flesh out my logic for all of these factors. But the key factor is the capability to measure, for each movie on IMDB, the percentage of voters who have rated it a 7 or higher. When you aggregate all of the movies with a particular IMDB average rating, you get results that look like this sample:

Avg. Rating   % Rating 7+
        8.5         92.8%
        8.0         88.8%
        7.5         81.4%
        7.0         69.2%
        6.5         54.7%
        6.0         41.5%
        5.5         28.7%

Note that an average rating of 7.0 doesn’t make every movie with that rating a “really like” movie. Only 69.2% of the votes cast for movies with a 7.0 average were ratings of 7 or higher. Conversely, a movie with a 6.0 average isn’t always a “don’t really like” movie, since 41.5% of those voters handed out 7’s or higher. It does mean, though, that a movie with a 7.0 average rating is more likely to be a “really like” movie than one with a 6.0 average.
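
One way to turn this sample into a usable estimate is to interpolate between the measured points. The sketch below linearly interpolates the table’s figures; the actual “really like” formula blends several other factors beyond the IMDB average rating:

```python
# Sample figures from the table above: IMDB average rating mapped to the
# share of votes that were 7 or higher across movies at that rating.
rating_to_7plus = {
    8.5: 0.928, 8.0: 0.888, 7.5: 0.814, 7.0: 0.692,
    6.5: 0.547, 6.0: 0.415, 5.5: 0.287,
}

def really_like_probability(avg_rating: float) -> float:
    """Estimate P(a voter rates the movie 7+) from its IMDB average rating.

    Linear interpolation between the sampled points above -- a sketch,
    not the blog's full formula.
    """
    pts = sorted(rating_to_7plus.items())
    if avg_rating <= pts[0][0]:
        return pts[0][1]
    if avg_rating >= pts[-1][0]:
        return pts[-1][1]
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if x0 <= avg_rating <= x1:
            return y0 + (y1 - y0) * (avg_rating - x0) / (x1 - x0)

print(f"{really_like_probability(6.8):.1%}")
```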

These changes represent a start down a path towards a movie pre-screening tool that is more useful to the followers of this blog. It is a work in progress that will only get better as more years are added to the database. But, we have a better answer now to the question, “Will you ‘really like’ this movie?”

***

If you’re going to the movies this weekend, chances are that you’re going to see Blade Runner 2049. The early indications are that it is going to live up to the hype. You might also check out The Florida Project, an under-the-radar movie that is getting some apparently well-deserved buzz.
