Will I "Really Like" this Movie?

Navigating Movie Website Ratings to Select More Enjoyable Movies

Archive for the category “Movie Lists”

There Are No Turkeys in the Objective Top Seven Movies From 1992 to 1998

Shall we call it “The Drive for Twenty Five”? If so, this installment of our journey to the Objective Top Twenty Five Movies of the last twenty five years raises the question: which of these Cinematic Seven will survive to Twenty Five? Adding 1998 to the Objective Database makes more discrete groupings of data statistically viable. As future years are added, the number of groupings will grow, resulting in many changes to this list. From the initial Top Six list published just two weeks ago, only three movies remain in the Top Seven. I think we can expect this kind of volatility with each year we add. How many of these movies will be in the Top Twenty Five at the end? Fewer than we’d expect, I’m sure.

Here’s our significant seven:

7. Scent of a Woman (IMDB 8.0, Certified Fresh 88%, CinemaScore A, Major Academy Award Win)

This movie is a favorite of mine. It produced Al Pacino’s only Academy Award win after being shut out for his seven previous nominations.

6. Good Will Hunting (IMDB 8.3, Certified Fresh 97%, CinemaScore A, Major Academy Award Win)

One of my followers wondered why his favorite movie didn’t make the list. Good Will Hunting is a good illustration of what it takes: high ratings from every feedback group, that is, movie watchers, movie critics, opening-night moviegoers, and peer movie artists.

5. The Shawshank Redemption (IMDB 9.3, Certified Fresh 91%, CinemaScore A, Major Academy Award Nomination)

Another one of the original Top Six. The Achilles’ heel for this movie, from an objective rating standpoint, is its failure to win a major Academy Award despite three major nominations.

4. The Usual Suspects (IMDB 8.6, Certified Fresh 88%, No CinemaScore rating, Major Academy Award Win)

Because this is an objective ranking rather than a subjective one, Kevin Spacey movies are still considered. In the long run, I wonder how much the absence of a CinemaScore rating will hurt this movie, and whether it should.

3. The Lion King (IMDB 8.5, Certified Fresh 83%, CinemaScore A+, Minor Academy Award Win)

A few weeks before the release of this picture, Elton John was given a private screening of the movie. He noticed the love song he had written wasn’t in the film and successfully lobbied to have it put back in. That song, “Can You Feel the Love Tonight,” won Elton John the Academy Award for Best Original Song.

2. Saving Private Ryan (IMDB 8.6, Certified Fresh 92%, CinemaScore A, Major Academy Award Win)

The only movie from the newly added 1998 data to make the list. It is also the only movie on the list that was the top-grossing movie of the year it was released.

1. Schindler’s List (IMDB 8.9, Certified Fresh 96%, CinemaScore A+, Major Academy Award Win)

A 76.98% “really like” probability is the highest score the Objective “Really Like” algorithm can produce. So far, Schindler’s List is the only movie with that perfect score.

***

Disney animated movies rule Thanksgiving weekend. According to Box Office Mojo, Disney owns 9 of the 10 highest grossing Thanksgiving movies of all time. Coco, which opened in theaters yesterday, is this year’s entrant into their tradition of Thanksgiving dominance. Early IMDB ratings give it a 9.1 average rating to go along with its 96% Certified Fresh Rotten Tomatoes rating. This morning CinemaScore gave it an A+ rating.

Also, two more Oscar hopefuls go into limited release this weekend. Darkest Hour is the perfect bookend to Dunkirk. It follows Winston Churchill’s response to the events at Dunkirk. Gary Oldman’s portrayal of Churchill has him on everyone’s short list for Best Actor. Also worth considering is a festival favorite, Call Me By Your Name, which was nominated this week for an Independent Spirit Award for Best Picture.

Happy Thanksgiving to you and your families.


Objectively Speaking, What Are the Top Six Movies From 1992 to 1997?

Now, I’ll admit that a Top Six list from a seemingly random six-year period seems a little odd. But there is a method to my Movie Madness.

As I’ve mentioned on more than one occasion, I’m building a twenty-five-year movie database of solely objective factors to better identify the movies most of us would “really like”. It’s a time-consuming process. If I’m uninterrupted by other priorities in my life, I can usually add a complete year to the database in a week and a half. There will always be interruptions, though, and I don’t expect to finish the project before mid-2018.

I’m a little impatient to get some useful information from my efforts, so I thought it might be fun to create an Objective Best Movie List for however many years I’ve completed. So far that’s six years, which gives me a list of the best six movies from the completed time frame. I should finish 1998 by the weekend, and after incorporating the new data into my algorithm I’ll be able to create a Top Seven list. Now that you have the picture, here’s the Top Six in ascending order.

6. Sense and Sensibility (1995). IMDB Avg. 7.7, Certified Fresh 80%, CinemaScore A, Oscar- 3 Major nominations, 4 Minor

This was the first of a mid-1990s run of Jane Austen titles to make it to the big screen. Emma Thompson won the Oscar for Best Screenplay; she is the only person ever to win Oscars for both acting and screenwriting. The movie is also noteworthy for the breakthrough performance of Kate Winslet, who, at age 20, earned the first of her seven Oscar nominations.

5. In the Name of the Father (1994). IMDB Avg. 8.1, Certified Fresh 94%, CinemaScore A, Oscar- 4 Major nominations, 3 Minor

This is the movie that will probably surprise many of you. This biopic of Gerry Conlon, who was wrongly imprisoned for an IRA bombing, was the second of Daniel Day-Lewis’ five Best Actor nominations. He lost 30 pounds in preparation for the role and spent his nights on the set in the prison cell designed for the movie.

4. Good Will Hunting (1997). IMDB Avg. 8.3, Certified Fresh 97%, CinemaScore A, Oscar- 4 Major nominations, 5 Minor

This movie is in my personal top ten. Two relatively unknown actors, Matt Damon and Ben Affleck, became stars overnight and shared the Oscar for Best Screenplay as well. If either of them ever wins a Best Actor award, he’ll join Emma Thompson in that select group. In his fourth nominated performance, Robin Williams won his only Oscar, for Best Supporting Actor.

3. Toy Story (1995). IMDB Avg. 8.3, Certified Fresh 100%, CinemaScore A, Oscar-1 Major Nomination, 2 Minor

Toy Story’s ranking is driven by its 100% Fresh Rotten Tomatoes rating from 78 critics. While its Oscar performance is weaker than the other movies on the list, it should be noted that Toy Story was the first animated movie ever nominated for Best Screenplay. As the database grows, I expect the number of Oscar nominations and wins to become credible factors in these rankings. For now, receiving one Major and one Minor nomination has the same impact on the algorithm as the eleven awards won by a movie like Titanic. This is probably the only movie of the six that appears out of place in the rankings.

2. Shawshank Redemption (1994). IMDB Avg. 9.3, Certified Fresh 91%, CinemaScore A, Oscar- 3 Major nominations, 4 Minor

Shawshank still ranks as IMDB’s top movie of all time. At some point, I’m going to write an article about movies that achieve cult status after only modest success at the box office; Shawshank would be one of them. After a pedestrian $28,341,469 domestic gross at the box office, it became one of the highest-grossing video rentals of all time.

1. Schindler’s List (1994). IMDB Avg. 8.9, Certified Fresh 96%, CinemaScore A+, Oscar- 4 Major nominations, 8 Minor

Interestingly, this is the only movie of the six on the list to win Best Picture. It is also the only one on the list to earn an A+ from CinemaScore. Combine that with its twelve Oscar nominations and you can see why, objectively, it is at the top of the list.

Objectivity improves as data grows, and it should be fun to watch this list change as the database expands.

What do you think?


Why Did “The Big Sick” Drop Out of the Objective Top Fifteen This Week?

This past Sunday my wife, Pam, and I went to see The Big Sick. The movie tells the story of the early relationship days of the two screenwriters, Emily Gordon and Kumail Nanjiani. In fact, Nanjiani plays himself in the movie. It is the authenticity of the story, told in a heartfelt and humorous way, that makes this film special.

On the following day, last weekend’s blockbuster, Dunkirk, moved into the second spot in the revised Objective Top Fifteen rankings. When a new movie comes onto the list, another one exits. This week’s exiting movie, ironically, was The Big Sick. Wait! If The Big Sick is such a great movie, why isn’t it in my top fifteen for the year? Are all of the other movies on the list better movies? Maybe yes. Maybe no. You’ll have to determine that for yourselves. You see, the Objective Top Fifteen is your list, not mine.

I developed the Objective Top Ten, which became Fifteen at the beginning of July and will become Twenty at the beginning of October, to provide you with a ranking of 2017 widely released movies that are most likely to be “really like” movies. Because the ranking is based on objective benchmarks, my taste in movies has no influence on the list. The four benchmarks presently in use are: IMDB Avg. Rating, Rotten Tomatoes Rating, CinemaScore Rating, and Academy Award Nominations and Wins. A movie like Hidden Figures that meets all four benchmarks has the greatest statistical confidence in its “really like” status and earns the highest “really like” probability. A movie that meets three benchmarks has a greater “really like” probability than a movie that meets only two. And so on.

The important thing to note, though, is that this is not a list of the fifteen best movies of the year. It is a ranking of probabilities (with some tie breakers thrown in) that you’ll “really like” a movie, and it is subject to data availability. The more positive data that’s available, the more statistical confidence, i.e. the higher the probability, the model has in its projection.
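The benchmark-counting idea described above can be sketched in a few lines of Python. To be clear, the thresholds and field names here are illustrative assumptions, not the blog’s actual algorithm; the only point is that each benchmark met adds confidence.

```python
# Illustrative sketch of the benchmark-counting idea, NOT the real algorithm.
# Thresholds are assumptions (IMDB 7.2+, Certified Fresh, an "A-range"
# CinemaScore, any Oscar nomination).

def benchmarks_met(movie):
    """Count how many of the four objective benchmarks a movie meets."""
    met = 0
    if movie.get("imdb", 0) >= 7.2:                     # assumed IMDB cutoff
        met += 1
    if movie.get("rt_certified_fresh"):                 # Certified Fresh on RT
        met += 1
    if movie.get("cinemascore") in ("A+", "A", "A-"):   # assumed A-range cutoff
        met += 1
    if movie.get("oscar_noms", 0) > 0:                  # any Oscar recognition
        met += 1
    return met

# A movie meeting all four benchmarks sits in the highest-confidence tier.
hidden_figures = {"imdb": 7.8, "rt_certified_fresh": True,
                  "cinemascore": "A+", "oscar_noms": 3}
print(benchmarks_met(hidden_figures))  # 4
```

A movie missing one benchmark, the way The Big Sick missed CinemaScore, would count only three and fall into a lower-probability tier.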

Which brings me back to The Big Sick. CinemaScore surveys only the movies it considers “major releases”. The Big Sick probably didn’t have a big advertising budget. Instead, the producers chose to roll the movie out gradually, beginning on June 23rd, to create some buzz and momentum before putting it into wide release on July 14th. This is probably one of the reasons CinemaScore didn’t survey The Big Sick. But because The Big Sick is missing that third benchmark needed to develop a higher probability, it dropped out of the Top Fifteen. On the other hand, if it had earned at least an “A-” from CinemaScore, The Big Sick would be the #2 movie on the list based on the tie breakers.

And that is both the weakness and the strength of movie data: “major releases” have it. Smaller movies like The Big Sick don’t.

***

This weekend may end the four-week run of Objective Top Fifteen breakthroughs. Atomic Blonde, the Charlize Theron spy thriller, has an outside chance of earning a spot on the list. As of this morning, it is borderline for the IMDB and Rotten Tomatoes benchmarks. I’m also tracking Girls Trip, which earned a Certified Fresh from Rotten Tomatoes just in the last couple of days and has an “A+” in hand from CinemaScore. For now, it is just below the IMDB benchmark. We’ll see if that changes over the weekend.


This Is Turning Into a “Really Like” Summer at the Movies.

In case you haven’t noticed, we are in the midst of a pretty good run of high-quality movies this summer. Since the first weekend in May, the unofficial beginning of the summer movie season, at least ten movies have earned a 7.2 or higher IMDB average rating along with a Certified Fresh rating on Rotten Tomatoes.

May to July 2017 Wide Released Movies | IMDB Rating | Rotten Tomatoes Rating | Rotten Tomatoes % Fresh
Baby Driver | 8.4 | C. Fresh | 97%
Spider-Man: Homecoming | 8.2 | C. Fresh | 93%
Wonder Woman | 8.0 | C. Fresh | 92%
Guardians of the Galaxy Vol. 2 | 8.1 | C. Fresh | 81%
Big Sick, The | 8.0 | C. Fresh | 97%
I, Daniel Blake | 7.9 | C. Fresh | 92%
A Ghost Story | 7.5 | C. Fresh | 87%
Okja | 7.7 | C. Fresh | 84%
The Beguiled | 7.3 | C. Fresh | 77%
The Hero | 7.3 | C. Fresh | 76%

If early indicators are accurate, War for the Planet of the Apes will join the list after this coming weekend. And if the early buzz on social media holds up, Christopher Nolan’s new movie Dunkirk will join the list the weekend after.

This seems to me an unusually high number of quality movies for the summer so far, but I can’t tell you how unusual…yet. I’m working on a new long-term project: a database made up solely of objective “really like” movie indicators. It will include every movie that finished in the top 150 in box office receipts in each of the last 25 years. This database will better represent the bad movies released each year and provide a more robust sample size.

For now, I can only compare this year’s quality to 1992 (the first of the 25 years in my new database). Because Rotten Tomatoes wasn’t launched until 1998, I’ve included movies that aren’t Certified Fresh but would be if there were enough critic reviews. Even with that allowance, only three movies released between May and July 1992 meet the quality criteria I’m using for this summer.

May to July 1992 Wide Released Movies | IMDB Rating | Rotten Tomatoes Rating | Rotten Tomatoes % Fresh
Night on Earth | 7.5 | Fresh | 73%
Enchanted April | 7.6 | Fresh | 83%
A League of Their Own | 7.2 | C. Fresh | 78%

I’ll also add that IMDB average ratings tend to decline over time, so a few of this year’s movies will probably end up below the 7.2 IMDB minimum. But with 7 of the 10 movies sitting at 7.7 or better on IMDB, this year’s list should hold up pretty well over time.
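The summer-quality screen used above, IMDB 7.2 or higher plus a Certified Fresh rating, is easy to express as a filter. This sketch uses a few rows copied from the 2017 table, plus one illustrative non-qualifier for contrast.

```python
# Minimal filter for the quality criteria used above:
# IMDB average >= 7.2 AND Certified Fresh on Rotten Tomatoes.
# First three rows come from the 2017 table; the last is an
# illustrative non-qualifier added for contrast.

movies_2017 = [
    ("Baby Driver", 8.4, True),
    ("Spider-Man: Homecoming", 8.2, True),
    ("The Hero", 7.3, True),
    ("Some Summer Flop", 5.4, False),  # illustrative, not from the table
]

qualifiers = [title for title, imdb, certified_fresh in movies_2017
              if imdb >= 7.2 and certified_fresh]
print(qualifiers)
```

Run against a full year of releases, a filter like this is what turns the raw database into the kind of summer-vs-summer comparison made in this post.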

***

As I mentioned above War for the Planet of the Apes opens tomorrow. It is easy to overlook how good this franchise has been. Here are the “really like” indicators for the franchise including a very early look at tomorrow’s entry.

Movie | IMDB Rating | Rotten Tomatoes Rating | Rotten Tomatoes % Fresh | CinemaScore
Rise of the Planet of the Apes (2011) | 7.6 | C. Fresh | 81% | A-
Dawn of the Planet of the Apes (2014) | 7.6 | C. Fresh | 90% | A-
War for the Planet of the Apes (2017) | 9.1 | C. Fresh | 93% | ?

Franchises tend to get tired after the first movie. From the critics’ perspective, this franchise appears to get better with each new movie. I expect to see War for the Planet of the Apes on the Objective Top Fifteen list on Monday.

What Was The “Really Like” Movie of 2016? The Result May Surprise You.

According to Box Office Mojo, the website that tracks all things related to movie box office results, Baby Driver was last weekend’s big surprise at the box office. It also debuted in the number two spot on the 2017 Objective Top Fifteen posted on this site on Monday. What exactly does that mean? Not much yet. Think of it as the score in a game that is almost half over, where most of the scoring occurs near the end. The final result won’t crystallize until the Academy Award winners are announced next February. Also, keep in mind that most of the major Oscar contenders won’t be released until late in the year.

To give you some idea of what a final score does look like, here is the 2016 Objective Top Ten:

Top Ten 2016 Movies Based on Objective Criteria
As Of 7/7/2017

2016 Released Movies | Oscar Noms/Wins | IMDB Rating | Rotten Tomatoes Rating | Rotten Tomatoes % Fresh | CinemaScore | Objective “Really Like” Probability
Hacksaw Ridge | 6/2 | 8.2 | C. Fresh | 87% | A | 65.9%
La La Land | 14/6 | 8.2 | C. Fresh | 92% | (none) | 65.7%
Big Short, The | 5/1 | 7.8 | C. Fresh | 88% | A- | 65.4%
Moonlight | 8/3 | 7.5 | C. Fresh | 98% | (none) | 65.1%
Fences | 4/1 | 7.3 | C. Fresh | 93% | A- | 65.0%
Rogue One | 2/0 | 7.9 | C. Fresh | 85% | A | 64.7%
Deepwater Horizon | 2/0 | 7.2 | C. Fresh | 84% | A- | 64.7%
Jungle Book, The | 1/1 | 7.5 | C. Fresh | 95% | A | 64.6%
Sully | 1/0 | 7.5 | C. Fresh | 85% | A | 64.6%
Revenant, The | 12/3 | 8.0 | C. Fresh | 81% | B+ | 64.6%

Just to clarify, eligibility for the list is based on when a movie goes into wide release. This pits Oscar contenders from 2015, like The Big Short and The Revenant, that were widely released in early 2016 against Oscar contenders from 2016, like Moonlight and La La Land, that were widely released late in 2016.

Are you surprised that Hacksaw Ridge is the 2016 “Really Like” Movie of the Year? The response of movie watchers is what separates this movie from the others. That, and the fact that CinemaScore for some reason didn’t survey La La Land. I will say this, though. I have talked to people who didn’t like Moonlight. I have also talked to people who felt that La La Land was over-hyped. But I haven’t talked to a single person who hasn’t “really liked” Hacksaw Ridge.

This ranking approach intersects a number of different movie viewing perspectives. Movie critics are represented in Rotten Tomatoes. People who go to the movie theaters on opening weekend and provide feedback before movie word of mouth has influenced their opinion are represented by Cinemascore. People who watch movies on a variety of platforms are represented by IMDB. And, finally, the people who understand how difficult it is to create movies, the artists themselves, are represented by their Academy Award performance. All of them are statistically significant indicators of whether you will “really like” a movie or not.

Not all of you will like every movie on this list. While there is around a 65% chance you will “really like” these movies, there is also around a 35% chance that you won’t. All I’m saying is that there is a better chance you will “really like” one of these movies than the latest installment in the Transformers or Pirates of the Caribbean franchises.
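The arithmetic behind that claim can be made concrete. If each pick from the list carries roughly a 65% “really like” probability, and we treat the picks as independent (a simplifying assumption the post does not make explicitly), the chance that at least one of n picks works out climbs quickly:

```python
# With ~65% "really like" probability per pick, assumed independent,
# the chance of at least one hit in n picks is 1 - 0.35**n.

def chance_of_at_least_one_hit(n, p_like=0.65):
    return 1 - (1 - p_like) ** n

for n in (1, 2, 3):
    print(n, round(chance_of_at_least_one_hit(n), 3))
```

Three picks from the list already push the odds of at least one “really like” night above 95%, which is the practical argument for keeping a menu rather than a single recommendation.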

***

While my last paragraph may sound as if I have a reflexive aversion to movies that are part of a franchise, that couldn’t be further from the truth. Whether it’s part of a franchise or not, well-made movies with fresh perspectives are worth the time of movie lovers. The big movie opening this weekend is the second reboot of the Spider-Man franchise, Spider-Man: Homecoming, and I’m really looking forward to it. The early indicators from Rotten Tomatoes and IMDB are all positive. Keep an eye on this one.

Musings After a Quiet Movie Weekend

There were no changes this week to the 2017 Objective Top Ten. None of the movies that opened last weekend earned a Certified Fresh on Rotten Tomatoes. So, I have nothing to talk about. Right? Oh, you are so wrong.

First, regarding that Objective Top Ten that I update every Monday, I want to be clear about something. I’m not suggesting that you will like every movie on that list. I’m not suggesting that there aren’t good movies that didn’t make the list. In fact, my two favorite movies so far, Beauty and the Beast and Gifted, aren’t on the list. It is just an objective measure of quality. It doesn’t take into account your personal taste in movies.

For example, if you typically don’t like Art House movies, you may not like Kedi, a documentary about the hundreds of thousands of cats that have roamed Istanbul for thousands of years, or Truman, a Spanish-language film that celebrates the enduring nature of good friendship. These low-budget movies tend to take risks and aren’t intended to please a general audience. But would you really prefer to see the new Transformers movie, which opened yesterday and is 16% Rotten on Rotten Tomatoes? You may prefer to avoid all three movies, and that’s okay. The point of the list is to give you a menu of quality movies; if one naturally intrigues you, the odds are that it will be a “really like” movie for you.

Turning from low-budget Art House films to big-budget Blockbusters, the success of two other movies on the list explains why movies based on comic books are here to stay for the foreseeable future. Logan, with its estimated $97 million production budget, and Wonder Woman, with its estimated $149 million budget, have returned worldwide box office receipts of over $617 million and $578 million, respectively. When quality movies in the comic book genre are made, they spin box office gold.

A couple of other notes on the Objective Top Ten list. In July I plan to expand it to fifteen movies, and in October I’ll expand it again to twenty. This will better accommodate the number of quality movies typically released over the second half of the year. Also, I’m close to being able to incorporate CinemaScore grades into the probabilities for the Objective Top Ten; it’s possible this change will appear as early as next Monday’s update. It will better differentiate one movie from the next.

Finally, two movies that I have my eye on for this weekend are The Beguiled, which earned Sofia Coppola the top director award at Cannes, and The Big Sick, which is already 98% Certified Fresh on Rotten Tomatoes.

Wonder Woman Is Wonderful But Is It the GOAT Superhero Movie?

Everybody is talking about Wonder Woman and its record-breaking box office last weekend. Critics and audiences agree that Wonder Woman is worth a trip to the theater. The Mad Movie Man is convinced as well. You’ll find the movie in the top half of the 2017 Top Ten List and it is on my Watch List for the week, which means I plan on seeing it within the next week.

I mentioned last week that critics were falling all over themselves in praising this movie with some calling it the Superhero GOAT (Greatest Of All Time). Does it warrant such acclaim? Maybe. When you compare it to four other highly rated superhero movies that kicked off franchises, it holds up pretty well.

Movie | Oscar Noms/Wins | IMDB Rating | Rotten Tomatoes Rating | Rotten Tomatoes % Fresh | Combined Rating
Wonder Woman (2017) | 0/0 | 8.3 | C. Fresh | 93% | 17.6
Iron Man (2008) | 2/0 | 7.9 | C. Fresh | 94% | 17.3
Batman Begins (2005) | 1/0 | 8.3 | C. Fresh | 84% | 16.7
Superman (1978) | 3/0 | 7.3 | C. Fresh | 93% | 16.6
Spider-Man (2002) | 2/0 | 7.3 | C. Fresh | 89% | 16.2

All four of these comparison movies were Oscar nominated. We’ll have to wait until next January to see if Wonder Woman earns Oscar recognition. The combined rating presented here totals the IMDB rating and the Rotten Tomatoes % Fresh (converted to a 10-point scale) to measure the response of both critics and audiences to the five movies. It is still early, and IMDB ratings tend to fade a little over time, but for now Wonder Woman is clearly in the GOAT discussion.
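The combined rating in the table reduces to one line of arithmetic, IMDB average plus % Fresh divided by ten. A quick check against the table’s numbers:

```python
# Combined rating as described above: IMDB average plus the
# Rotten Tomatoes % Fresh converted to a 10-point scale.

def combined_rating(imdb_rating, rt_percent_fresh):
    return imdb_rating + rt_percent_fresh / 10.0

# Wonder Woman: 8.3 + 93/10 = 17.6; Iron Man: 7.9 + 94/10 = 17.3
print(round(combined_rating(8.3, 93), 1))
print(round(combined_rating(7.9, 94), 1))
```

Both values match the table, which is all the formula claims: equal weight for the audience score and the critic score.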

If Wonder Woman holds on to its statistical GOAT position, it will be fueled by the response of women to the movie. A comparison of female and male IMDB ratings for the five movies lays this out pretty clearly.

Movie | Female IMDB Rating | Male IMDB Rating | IMDB Rating Difference
Wonder Woman | 8.6 | 8.2 | 0.4
Iron Man | 7.9 | 7.9 | 0.0
Superman | 7.3 | 7.3 | 0.0
Batman Begins | 8.1 | 8.3 | -0.2
Spider-Man | 7.1 | 7.3 | -0.2

While men “really like” Wonder Woman, women love it. Women are responding to a superhero movie like they never have before. Men, on the other hand, have a slight preference for Christopher Nolan’s vision of Batman. I also have to admit that I personally consider Batman Begins one of the GOAT movies, irrespective of genre. That being said, I am really excited to see Wonder Woman.

***

After all of this praise for Wonder Woman, you might be wondering why it is only fifth on the 2017 Top Movies List. Does that mean the four movies ahead of it are better? Maybe, but not necessarily. The top four movies all went into limited release in 2016 to qualify for Oscar consideration and didn’t go into wide release until early 2017, which is why they are on this list. All of the other movies on the list won’t be considered for Oscar recognition until January 2018. As I mentioned last week, this list is based on objective criteria. The Oscar nominations the top four movies received are additional objective evidence that they are quality movies, which allows the algorithm to be more confident in its evaluation and produces a higher “really like” probability. Again, just in case you were wondering.


“Really Like” Previews of Coming Attractions 

Recently I mentioned to someone that I was a movie blogger. Naturally they assumed I wrote movie reviews. It did get me thinking, though, “what is my blog really about?”

Looking back at my previous 92 posts, it’s hard to discern a consistent theme. I confess that it has been hard to come up with a fresh idea every single week. The result has been a hodgepodge of articles that intersect movies and data, but lack a unifying core. That is…until now.

It occurs to me that, while I’m not in the movie reviewing business, I am in the movie previewing business. I use statistical analysis to preview what movies I might “really like”. It also occurs to me that I created my algorithm for my benefit, not yours. I write this blog, though, for your benefit.

With all of that in mind, I’ve decided to reorient this blog to a discussion of movies you might “really like”, using my statistical analysis as the foundation of the discussion. My algorithm has two parts. The first produces a “really like” probability based on data from websites like Netflix, MovieLens, and Criticker that is oriented to movies that I, personally, will “really like”.

The second part of the equation is based on general data that has nothing to do with my personal taste in movies. IMDB and Rotten Tomatoes produce ratings based on the input of all of their website participants. Oscar award performance has nothing to do with me. I’m not a member of the academy. For now, these are the factors that go into my “really like” probability based on general information. It’s this “really like” probability that might be most useful to you, the followers of this blog.

On Monday I added a new list to this site. The Top Ten 2017 Movies Based on Objective Criteria uses this second half of my algorithm to suggest movies that you might “really like”. I intend to update this list every Monday after the initial data from the previous weekend’s new releases comes in. This Friday, for example, Wonder Woman goes into wide release. Some critics are calling it the “best superhero movie of all time”. It will be interesting to look at the data feedback on Monday to see if it’s actually trending that way.

I’m also exploring the addition of other general data to the criteria. For example, is there statistical significance to when a movie is released? I’m in the data-gathering stage of that study. In future months I’m also planning to add Top Ten lists for years prior to 2017.

I will also continue to update my Watch List for the week on Wednesdays. While it is based on movies I should “really like”, you might find some movies there that pique your interest.

As for this blog, I plan to orient each week’s post around one or two of the movies on my lists and offer up some ideas as to why it might be a movie that you’ll “really like”. For now I would encourage you to check back on Monday to see if the hyperbolic buzz surrounding Wonder Woman is supported by strong enough numbers to move it into 2017’s “really like” Top Ten. Then, return again on Thursday to see what movies that you might “really like” have caught my eye.

A Movie Watch List is Built by Thinking Fast and Slow

In early 2012 I read a book by Daniel Kahneman titled Thinking, Fast and Slow. Kahneman is a psychologist who studies human decision making and, more precisely, the thinking process. He suggests that the human mind has two thinking processes. The first is the snap judgment that evolved to identify threats and react to them quickly in order to survive. He calls this “thinking fast”. The second is the rational thought process that weighs alternatives and evidence before reaching a decision. This he calls “thinking slow”. In the book, Kahneman discusses what he calls the “law of least effort”: the mind naturally gravitates to the easiest solution or action rather than to the more reliable, evidence-based one. He suggests that the mind is most subject to the “law of least effort” when it is fatigued, which leads, more often than not, to less than satisfactory decisions.

How we select the movies we watch, I believe, is generally driven by the “law of least effort”. For most of us, movie watching is a leisure activity. Other than on social occasions, we watch movies when we are too tired to do anything else in our productive lives. Typically, our choices are driven by what’s available at the moment we decide to watch. From the movies available, we pick whatever seems appealing at that moment. We choose by “thinking fast”. Sometimes we are happy with our choice. Other times, we get halfway through the movie and start wondering, over-optimistically I might add, if this dreadful movie will ever be over.

It doesn’t have to be that way. One tool I use is a Movie Watch List that I update each week using a “thinking slow” process. My current watch list can be found on the sidebar under Ten Movies on My Watch List This Week. Since you may be reading this blog entry sometime in the future, here’s the watch list I’ll be referring to today:

Ten Movies On My Watch List This Week
As Of March 22, 2017

Movie Title | Release Year | Where Available | Probability I Will “Really Like”
Fight Club | 1999 | Starz | 84.8%
Amélie | 2002 | Netflix – Streaming | 72.0%
Hacksaw Ridge | 2016 | Netflix – DVD | 71.5%
Emigrants, The | 1972 | Warner Archive | 69.7%
Godfather: Part III, The | 1990 | Own DVD | 68.7%
Pride and Prejudice | 1940 | Warner Archive | 67.3%
Steel Magnolias | 1989 | Starz | 67.1%
Paper Moon | 1973 | HBO | 63.4%
Confirmation | 2016 | HBO | 57.0%
Beauty and the Beast | 2017 | Movie Theater | 36.6%

The movies that make it to this list are carefully selected based on the movies that are available in the coming week on the viewing platforms I can access. I use my algorithm to guide me towards movies with a high “really like” probability. I determine who I’m likely to watch movies with during the upcoming week. If I’m going to watch movies with others, I make sure that there are movies on the list that those others might like. And, finally, I do some “thinking fast” and identify those movies that I really want to see and those movies that, instinctively, I am reluctant to see.
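The “thinking slow” build described above, start from what’s available, rank by the algorithm’s probability, and keep titles that suit likely co-viewers, can be sketched as follows. The field names and the family flag are illustrative assumptions; the probabilities are taken from the table.

```python
# Sketch of the weekly watch-list build described above.
# Field names and the "family" flag are illustrative, not the real algorithm;
# probabilities come from the March 22, 2017 table.

candidates = [
    {"title": "Fight Club", "platform": "Starz", "prob": 0.848, "family": False},
    {"title": "Hacksaw Ridge", "platform": "Netflix - DVD", "prob": 0.715, "family": True},
    {"title": "Beauty and the Beast", "platform": "Movie Theater", "prob": 0.366, "family": True},
]

# Step 1: restrict to platforms I can actually access this week.
available = {"Starz", "Netflix - DVD", "Movie Theater"}

# Step 2: rank what's available by "really like" probability.
watch_list = sorted(
    (m for m in candidates if m["platform"] in available),
    key=lambda m: m["prob"], reverse=True,
)

# Step 3: flag titles that fit likely co-viewers; these stay on the list
# even with lower probability, since who you watch with matters too.
family_picks = [m["title"] for m in watch_list if m["family"]]
print([m["title"] for m in watch_list], family_picks)
```

Note how Beauty and the Beast survives the cut despite its 36.6% probability: the co-viewer step deliberately overrides the pure ranking, just as described above.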

The movies on my list above in green are those movies that I really want to see. The movies in turquoise are those movies I’m indifferent to but are highly recommended by the algorithm. The movies in red are movies that I’m reluctant to see.

So, you may ask, why do I have movies that I don’t want to see on my watch list? Well, it’s because I’m the Mad Movie Man. These are movies that my algorithm suggests have a high “really like” probability. In the case of Fight Club, for example, I’ve seen the movie before and was turned off by the premise. On the other hand, it is the movie that my algorithm, based on highly credible data, indicates is the surest “really like” bet of all the movies I haven’t seen in the last 15 years. Either my memory is faulty, or my tastes have changed, or there is a flaw in my algorithm, or a flaw in the data coming from the websites I use. It may just be that it is among the 15% of movies I won’t like. So, I put these movies on my list because I need to know why the mismatch exists. I have to admit, though, that it is hard getting these red movies off the list because I often succumb to the “law of least effort” and watch another movie I’d much rather see.

Most of our family is gathering in the coming week, so Beauty and the Beast and Hacksaw Ridge are family-movie candidates. In case my wife and I watch a movie together this week, Amélie, Pride and Prejudice, and Steel Magnolias are on the list.

The point in all this is that by having a Watch List of movies with a high “really like” probability you are better equipped to avoid the “law of least effort” trap and get more enjoyment out of your leisure time movie watching.

 

The Art of Selecting “Really Like” Movies: Older, Never Before Seen

Last week I stated in my article that I could pretty much identify whether a movie has a good chance of being a “really like” movie within six months of its release. If you need any further evidence, here are my top ten never-seen movies that are older than six months.

My Top Ten Never Seen Movie Prospects
(Never Seen Movies: Release Date + 6 Months or Older)

Movie Title | Last Data Update | Release Date | Total # of Ratings | “Really Like” Probability
Hey, Boo: Harper Lee and ‘To Kill a Mockingbird’ | 2/4/2017 | 5/13/2011 | 97,940 | 51.7%
Incendies | 2/4/2017 | 4/22/2011 | 122,038 | 51.7%
Conjuring, The | 2/4/2017 | 7/19/2013 | 241,546 | 51.7%
Star Trek Beyond | 2/4/2017 | 7/22/2016 | 114,435 | 51.7%
Pride | 2/4/2017 | 9/26/2014 | 84,214 | 44.6%
Glen Campbell: I’ll Be Me | 2/9/2017 | 10/24/2014 | 105,751 | 44.6%
Splendor in the Grass | 2/5/2017 | 10/10/1961 | 246,065 | 42.1%
Father of the Bride | 2/5/2017 | 6/16/1950 | 467,569 | 42.1%
Imagine: John Lennon | 2/5/2017 | 10/7/1998 | 153,399 | 42.1%
Lorenzo’s Oil | 2/5/2017 | 1/29/1993 | 285,981 | 42.1%

The movies with a high “really like” probability in this group have already been watched. Of the remaining movies, three are roughly 50/50 and the rest have the odds stacked against them. In other words, if I watched all ten I probably wouldn’t “really like” about half of them; the flip side of the dilemma is that I probably would “really like” the other half. The reality is that I won’t watch any of these ten movies as long as there are movies I’ve already seen with better odds. Is there a way to improve the odds for any of these ten?
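That “half of them” figure can be checked with simple arithmetic. Treating the ten outcomes as independent, the expected number of “really like” movies is just the sum of the individual probabilities from the table:

```python
# The ten "really like" probabilities from the table above.
probs = [0.517, 0.517, 0.517, 0.517,   # the four 51.7% movies
         0.446, 0.446,                 # the two 44.6% movies
         0.421, 0.421, 0.421, 0.421]   # the four 42.1% movies

# Expected number of "really like" outcomes = sum of the probabilities.
expected_hits = sum(probs)
print(round(expected_hits, 1))  # -> 4.6
```

So watching all ten would, on average, yield about 4.6 “really like” movies, which is the “roughly half” intuition made precise.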

You’ll note that all ten movies have probabilities based on fewer than 500,000 ratings. Will some of these movies improve their probabilities as they receive more ratings? Maybe. Maybe not. To explore this possibility, I divided my database into quintiles based on the total number of ratings. The quintile with the most ratings, the most credible quintile, produces results that define the best-case performance of my algorithm.
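A minimal sketch of that quintile split, assuming plain Python and made-up rating counts (per the quintile tables that follow, the real database holds roughly 1,992 movies): sort by total number of ratings, then cut the sorted list into five equal-sized bins.

```python
# Sketch of splitting a movie database into quintiles by rating count.
# The ten sample movies here are stand-ins for the real database.

def quintiles(movies):
    """Return five lists: Quintile 1 (fewest ratings) .. Quintile 5 (most)."""
    ordered = sorted(movies, key=lambda m: m["num_ratings"])
    n = len(ordered)
    return [ordered[i * n // 5:(i + 1) * n // 5] for i in range(5)]

movies = [{"num_ratings": k} for k in (100, 200, 300, 400, 500,
                                       600, 700, 800, 900, 1000)]
q = quintiles(movies)
print([len(b) for b in q])        # -> [2, 2, 2, 2, 2]
print(q[4][0]["num_ratings"])     # -> 900  (Quintile 5 holds the most-rated)
```

The bin boundaries that fall out of this split on the real data are the “# Ratings Range” values shown in each quintile table below.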

Quintile 5 (# Ratings Range: more than 2,872,053)

 | # of Movies | # “Really Like” Movies | % “Really Like” Movies | Proj. Avg. Rating (All Sites) | My Avg. Rating | Rating Diff. (Mine minus Proj.)
Movies Seen More than Once | 152 | 134 | 88% | 8.6 | 8.5 | -0.1
Movies Seen Once | 246 | 119 | 48% | 7.5 | 6.9 | -0.7
All Movies in Range | 398 | 253 | 64% | 7.9 | 7.5 |

All of the movies in Quintile 5 have more than 2,872,053 ratings. My selection of movies I had seen before is clearly better than my selection of movies I watched for the first time: the algorithm led me to the better movies, and my memory did some additional weeding. My takeaway is that, for movies I’ve never seen before, I should put my greatest trust in the algorithm when the movie falls in this quintile.

Let’s look at the remaining four quintiles.

Quintile 4 (# Ratings Range: 1,197,745 to 2,872,053)

 | # of Movies | # “Really Like” Movies | % “Really Like” Movies | Proj. Avg. Rating (All Sites) | My Avg. Rating | Rating Diff. (Mine minus Proj.)
Movies Seen More than Once | 107 | 85 | 79% | 8.3 | 8.3 | 0.1
Movies Seen Once | 291 | 100 | 34% | 7.1 | 6.4 | -0.7
All Movies in Range | 398 | 185 | 46% | 7.4 | 6.9 |
Quintile 3 (# Ratings Range: 516,040 to 1,197,745)

 | # of Movies | # “Really Like” Movies | % “Really Like” Movies | Proj. Avg. Rating (All Sites) | My Avg. Rating | Rating Diff. (Mine minus Proj.)
Movies Seen More than Once | 122 | 93 | 76% | 7.8 | 8.0 | 0.2
Movies Seen Once | 278 | 102 | 37% | 7.1 | 6.6 | -0.6
All Movies in Range | 400 | 195 | 49% | 7.3 | 7.0 |
Quintile 2 (# Ratings Range: 179,456 to 516,040)

 | # of Movies | # “Really Like” Movies | % “Really Like” Movies | Proj. Avg. Rating (All Sites) | My Avg. Rating | Rating Diff. (Mine minus Proj.)
Movies Seen More than Once | 66 | 46 | 70% | 7.4 | 7.5 | 0.2
Movies Seen Once | 332 | 134 | 40% | 7.0 | 6.4 | -0.6
All Movies in Range | 398 | 180 | 45% | 7.1 | 6.6 |
Quintile 1 (# Ratings Range: fewer than 179,456)

 | # of Movies | # “Really Like” Movies | % “Really Like” Movies | Proj. Avg. Rating (All Sites) | My Avg. Rating | Rating Diff. (Mine minus Proj.)
Movies Seen More than Once | 43 | 31 | 72% | 7.0 | 7.5 | 0.5
Movies Seen Once | 355 | 136 | 38% | 6.9 | 6.2 | -0.7
All Movies in Range | 398 | 167 | 42% | 6.9 | 6.4 |

Look at the progression of the algorithm projections as the quintiles get smaller. The gap between movies seen more than once and those seen only once narrows as the number of ratings shrinks. Notice that the difference between my ratings and the projected ratings for Movies Seen Once is fairly constant across all quintiles, either -0.6 or -0.7. For Movies Seen More than Once, however, the difference grows more positive as the number of ratings gets smaller. This suggests that, for Movies Seen More than Once, the higher-than-expected ratings I give movies in Quintiles 1 and 2 are driven primarily by my memory of the movies rather than by the algorithm.
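The pattern can be stated as data: the dictionaries below simply restate the Rating Diff. column from the five quintile tables, keyed by quintile number, and check the two claims directly.

```python
# Rating Diff. (mine minus projected) by quintile, from the tables above.
diff_seen_again = {5: -0.1, 4: 0.1, 3: 0.2, 2: 0.2, 1: 0.5}
diff_seen_once  = {5: -0.7, 4: -0.7, 3: -0.6, 2: -0.6, 1: -0.7}

# Claim 1: for rewatched movies, the gap grows more positive as ratings shrink.
assert diff_seen_again[1] > diff_seen_again[5]

# Claim 2: for first watches, the gap is nearly constant across quintiles.
spread = max(diff_seen_once.values()) - min(diff_seen_once.values())
assert spread <= 0.15
print("both claims hold")
```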

What does this mean for my top ten never-before-seen movies listed above? All ten are in Quintile 1 or 2. As they grow into the higher quintiles, some may emerge with higher “really like” probabilities. Certainly Star Trek Beyond, which is only seven months old, can be expected to climb. But what about Splendor in the Grass? Released in 1961, it might not accumulate enough ratings to reach Quintile 3 for another 55 years.

This suggests that a secondary movie quality indicator is needed, separate from the movie recommender sites already in use. It sounds like I’ve just added another project to my 2017 “really like” project list.

 

 
