Will I "Really Like" this Movie?

Navigating Movie Website Ratings to Select More Enjoyable Movies


What Am I Actually Going to Watch This Week? Netflix Helps Out with One of My Selections.

The core mission of this blog is to share ideas on how to select movies to watch that we’ll “really like”. At times, though, I’ve gotten bogged down in the mechanics of building the “really like” model. I’d like to reorient the dialogue back to the primary mission: which “really like” movies I am going to watch and, more importantly, why.

Each Wednesday I publish the ten movies on my Watch List for the week. These movies usually represent the ten movies with the highest “really like” probability that are available to me to watch on platforms that I’ve already paid for. This includes cable and streaming channels I’m paying for and my Netflix DVD subscription. I rarely use a movie on demand service.

Now, 10 movies are too many, even for the Mad Movie Man, to watch in a week. The ten-movie Watch List instead serves as a menu for the 3 or 4 movies I actually most want to watch during the week. So, how do I select those 3 or 4 movies?

The first and most basic question to answer is who, if anyone, I’m watching the movie with. Friday night is usually the night that my wife and I sit down and watch a movie together. The rest of the week I’ll watch two or three movies by myself. So, right from the start, I have to find a movie that my wife and I will both enjoy. This week that movie is Hidden Figures, the 2016 Oscar-nominated film about the role three black female mathematicians played in John Glenn’s orbit of the Earth in the early 1960’s.

This movie became available to Netflix DVD subscribers on Tuesday, May 9, and I received my Hidden Figures DVD that day. Something I’ve learned over the years is that Netflix ships on Monday the DVDs that become available on Tuesday. For this to happen, you have to time the return of your old DVD to arrive on the Saturday or Monday before the Tuesday release. This gives you the best chance to avoid “long wait” queues.

I generally use Netflix DVD to see new movies that I don’t want to wait another 3 to 6 months to see or for old movies that I really want to see but aren’t available on my usual platforms.

As of the first quarter of 2017, Netflix reported only 3.94 million subscribers to its DVD service. I am one of them. The DVD service is the only way you can still access Netflix’s best-in-the-business 5-star system of rating movies. It is easily the most reliable predictor of how you’ll rate a movie or TV show. Unfortunately, Netflix streaming customers no longer have the benefit of the 5-star system. They have gone to a less granular “thumbs up” and “thumbs down” rating system. To be fair, I haven’t gathered any data on this new system yet, so I’ll reserve judgment as to its value. As for the DVD service, it will have me as a customer as long as it maintains the 5-star recommender system as one of the benefits of being a DVD subscriber.

The 5-star system is a critical assist in finding a movie for both my wife and me. Netflix allows you to set up profiles for other members of the family. After my wife and I watch a movie, she gives it a rating and I give it a rating, each entered under our separate profiles. This allows a unique predicted rating for each of us based on our individual taste in movies. For example, Netflix predicts that I will rate Hidden Figures a 4.6 out of 5 and my wife will rate it a 4.9. In other words, according to Netflix, this is a movie that both of us will not only “really like” but absolutely “love”.

Hidden Figures has a “really like” probability of 61.4%. Its Oscar Performance probability is 60.7% based on its three nominations. Its probability based solely on the feedback from the recommender sites that I use is 69.1%. At this point in time, it is a Quintile 1 movie from a credibility standpoint, meaning the 69.1% probability is based on a limited number of ratings and isn’t very credible yet. That’s why the blended 61.4% “really like” probability sits closer to the Oscar Performance probability of 60.7%. I fully expect that, as more people see Hidden Figures and enter their ratings, the “really like” probability will move higher for this movie.
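The blend described above can be sketched as a simple credibility weighting. The post doesn’t publish the actual weight it uses for a Quintile 1 movie; a value near 0.08 happens to reproduce the Hidden Figures numbers, so treat this as an illustration only:

```python
def blended_really_like(site_prob, oscar_prob, credibility):
    """Weight the recommender-site probability by its credibility and
    lean on the Oscar Performance probability for the rest.

    credibility: assumed weight in [0, 1] -- low for Quintile 1 movies
    whose site probability rests on few ratings.
    """
    return credibility * site_prob + (1 - credibility) * oscar_prob

# Hidden Figures: site-based 69.1%, Oscar Performance 60.7%.
# A low credibility of ~0.083 (my assumption, not the blog's actual
# parameter) lands near the published 61.4%.
print(round(blended_really_like(0.691, 0.607, 0.083), 3))  # 0.614
```

With zero credibility the blend collapses to the Oscar Performance probability, which matches the post’s description of how low-data movies behave.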

Friday Night Movie Night this week looks like a “really like” lock…thanks to Netflix DVD.


Sometimes When You Start To Go There You End Up Here

There are some weeks when I’m stumped as to what I should write about in this weekly trip to Mad Moviedom. Sometimes I’m in the middle of an interesting study that isn’t quite ready for publication. Sometimes an idea isn’t quite fully developed. Sometimes I have an idea but I find myself blocked as to how to present it. When I find myself in this position, one avenue always open to me is to create a quick study that might be halfway interesting.

This is where I found myself this week. I had ideas that weren’t ready to publish yet. So, my fallback study was going to be a quick study of which movie decades present the best “really like” viewing potential. Here are the results of my first pass at this:

“Really Like” Decades
Based on Number of “Really Like” Movies
As of April 6, 2017
                         My Rating
Decade     Really Liked   Didn't Really Like     Total   "Really Like" Probability
All               1,108                  888     1,996
2010's              232                  117       349    60.9%
2000's              363                  382       745    50.5%
1990's              175                   75       250    62.0%
1980's               97                   60       157    58.4%
1970's               56                   49       105    54.5%
1960's               60                   55       115    53.9%
1950's               51                   78       129    46.6%
1940's               55                   43        98    55.8%
1930's               19                   29        48    46.9%

These results are mildly interesting. The 2010’s, 1990’s, 1980’s, and 1940’s are above-average decades for me. There is an unusually high number of movies in the sample that were released in the 2000’s. Remember that movies stay in my sample for 15 years from the year I last watched them. After 15 years they are removed from the sample and put into the pool of movies available to watch again. The good movies get watched again and the others, hopefully, are never seen again. Movies last seen after 2002 have not yet gone through this process of separating out the “really like” movies to be watched again and permanently weeding out of the sample the didn’t-“really like” movies. The contrast between the 2000’s and the 2010’s is a good measure of the impact of undisciplined versus disciplined selection.

As I’ve pointed out in recent posts, I’ve made some changes to my algorithm. One of the big changes I’ve made is that I’ve replaced the number of movies that are “really like” movies with the number of ratings for the movies that are “really like” movies. After doing my decade study based on number of movies, I realized I should have used the number of ratings method to be consistent with my new methodology. Here are the results based on the new methodology:

“Really Like” Decades
Based on Number of “Really Like” Ratings
As of April 6, 2017
                             My Rating
Decade     Really Liked         Didn't Really Like     Total           "Really Like" Probability
All        2,323,200,802        1,367,262,395          3,690,463,197
2010's       168,271,890          166,710,270            334,982,160    57.1%
2000's     1,097,605,373          888,938,968          1,986,544,341    56.6%
1990's       610,053,403          125,896,166            735,949,569    70.8%
1980's       249,296,289          111,352,418            360,648,707    65.3%
1970's        85,940,966           25,372,041            111,313,007    67.7%
1960's        57,485,708           15,856,076             73,341,784    68.0%
1950's        28,157,933           23,398,131             51,556,064    59.5%
1940's        17,003,848            5,220,590             22,224,438    67.4%
1930's         9,385,392            4,517,735             13,903,127    64.6%

While the results are different, the big reveal was that 63.0% of the ratings are for “really like” movies while only 55.5% of the movies themselves are “really like” movies. It starkly reinforces the impact of the law of large numbers: movie website indicators of “really like” movies are more reliable when the number of ratings driving those indicators is larger. The following table illustrates this better:

“Really Like” Decades
Based on Average Number of “Really Like” Ratings per Movie
As of April 6, 2017
                           My Rating
Decade     Really Liked     Didn't Really Like    Total           % Difference
All        2,096,751.63     1,539,709.90          1,848,929.46      36.2%
2010's       725,309.87     1,424,874.10            959,834.27     -49.1%
2000's     3,023,706.26     2,327,065.36          2,666,502.47      29.9%
1990's     3,486,019.45     1,678,615.55          2,943,798.28     107.7%
1980's     2,570,064.84     1,855,873.63          2,297,125.52      38.5%
1970's     1,534,660.11       517,796.76          1,060,123.88     196.4%
1960's       958,095.13       288,292.29            637,754.64     232.3%
1950's       552,116.33       299,976.04            399,659.41      84.1%
1940's       309,160.87       121,409.07            226,779.98     154.6%
1930's       493,968.00       155,783.97            289,648.48     217.1%

With the exception of the 2010’s, the average number of ratings per movie is larger for the “really like” movies. In fact, they are dramatically different for the decades prior to 2000. My educated guess is that the post-2000 years will end up fitting the pattern of the other decades once those years mature.
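The headline shares quoted above, 63.0% of ratings versus 55.5% of movies, come straight from the “All” rows of the first two tables. A quick sanity check:

```python
# "All" row from the number-of-movies table
liked_movies, not_liked_movies = 1_108, 888
# "All" row from the number-of-ratings table
liked_ratings, not_liked_ratings = 2_323_200_802, 1_367_262_395

movie_share = liked_movies / (liked_movies + not_liked_movies)
rating_share = liked_ratings / (liked_ratings + not_liked_ratings)

print(round(100 * movie_share, 1))   # 55.5
print(round(100 * rating_share, 1))  # 63.0
```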

So what is the significance of this finding? It clearly suggests that waiting to decide whether to see a new movie until a sufficient number of ratings has come in will produce a more reliable result. The unanswered question is how many ratings are enough.

The finding also reinforces the need to have something like Oscar performance to act as a second measure of quality for movies that will never have “enough” ratings for a reliable result.

Finally, the path from “there to here” is not always found on a map.

Is Meryl Streep’s Oscar Record Out of Reach?

With the presentation of the Academy Awards completed last Sunday, I am able to tabulate the last Actors of the Decade winners. For the male actors, the winner is Daniel Day-Lewis.

Top Actors of the Decade
2007 to 2016 Releases
Actor               Lead Actor Noms   Lead Actor Wins   Supporting Noms   Supporting Wins   Total Academy Award Points
Daniel Day-Lewis           2                 2                  0                 0                    12
Jeff Bridges               2                 1                  1                 0                    10
Leonardo DiCaprio          2                 1                  0                 0                     9
Colin Firth                2                 1                  0                 0                     9
Eddie Redmayne             2                 1                  0                 0                     9
George Clooney             3                 0                  0                 0                     9

This result is pretty incredible when you consider that Daniel Day-Lewis appeared in only three movies during the entire decade. His three career Academy Award Best Actor wins stand alone in the history of the category. It might be interesting to measure Oscar nominations per movie made; I’d be surprised if we found any actor who is even close to Daniel Day-Lewis.

As for the Best Female Actor, once again, it is Meryl Streep.

Top Actresses of the Decade
2007 to 2016 Releases
Actress             Lead Actress Noms   Lead Actress Wins   Supporting Noms   Supporting Wins   Total Academy Award Points
Meryl Streep               5                   1                   1                 0                    19
Cate Blanchett             3                   1                   1                 0                    13
Jennifer Lawrence          3                   1                   1                 0                    13
Marion Cotillard           2                   1                   0                 0                     9
Sandra Bullock             2                   1                   0                 0                     9
Natalie Portman            2                   1                   0                 0                     9
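The post never states how Total Academy Award Points are computed, but a simple scoring rule is consistent with every row of both decade tables: 3 points per lead nomination, 3 more per lead win, and 1 per supporting nomination. This is my inference, not the blog’s published formula, and the supporting-win weight is a guess since no row in the sample exercises it:

```python
def academy_award_points(lead_noms, lead_wins, supp_noms, supp_wins=0):
    # Inferred scoring: reproduces both tables above. The supporting-win
    # weight (1) is an assumption; no actor or actress in the sample has one.
    return 3 * lead_noms + 3 * lead_wins + 1 * supp_noms + 1 * supp_wins

assert academy_award_points(2, 2, 0) == 12   # Daniel Day-Lewis
assert academy_award_points(2, 1, 1) == 10   # Jeff Bridges
assert academy_award_points(3, 0, 0) == 9    # George Clooney
assert academy_award_points(5, 1, 1) == 19   # Meryl Streep
assert academy_award_points(3, 1, 1) == 13   # Cate Blanchett, Jennifer Lawrence
```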

When the 28-year-old Emma Stone accepted her Best Actress in a Leading Role award, she commented that she still has a lot to learn. It is that kind of attitude, and a commensurate work ethic, that it will take for a young actress today to make a run at Meryl Streep’s Oscar record of 20 nominations. Consider that the actresses Streep chased early in her career, Katharine Hepburn and Bette Davis, received their first nominations some 45 years before Streep earned her first. It has been 38 years since Meryl Streep received her first nomination. We should be on the lookout for the next actress of a generation. Is there a contender already out there?

Let’s look first at the career Oscar performance of Streep, Hepburn, and Davis.

Acting Nomination Points
Lead Actress = 1 point,  Supporting Actress = .5 points
Points at Age:       30    40    50    60    70    80
Meryl Streep          1     7    11  14.5    18     -
Katharine Hepburn     2     4     6     9    10    11
Bette Davis           3     8    10    11    11    11

I chose not to equate a supporting actress role with a lead actress role to be fair to Hepburn and Davis. With the studios in control of the movies they appeared in, stars didn’t get the chance to do supporting roles. Bette Davis had a strong career before age 50. Katharine Hepburn was strong after age 50. Meryl Streep has outperformed both of them before 50 and after 50. It is not unreasonable to expect more nominations in her future.

As for today’s actresses, I looked at multiple nominated actresses in different age groups to see if anyone is close to tracking her.

Actress             Age as of 12/31/2016   Comparison Age   Points at Comparison Age   Streep at Comparison Age
Cate Blanchett              47                   50                   5.5                       11
Viola Davis                 51                   50                   2                         11
Kate Winslet                41                   40                   5.5                        7
Michelle Williams           36                   40                   3                          7
Amy Adams                   42                   40                   3                          7
Natalie Portman             35                   40                   2.5                        7
Marion Cotillard            41                   40                   2                          7
Jennifer Lawrence           26                   30                   3.5                        1
Emma Stone                  28                   30                   1.5                        1
Keira Knightley             31                   30                   1.5                        1
Rooney Mara                 31                   30                   1.5                        1

Except for the 30-ish actresses, none are keeping pace. You might argue that Kate Winslet is within striking distance but, given Streep’s strength after 40, that’s probably not good enough.

Of the young actresses, Jennifer Lawrence has had a very strong start to her career. With 3 lead actress nominations and 1 supporting nomination over the next 14 years, she would join Bette Davis as the only actresses to keep pace with Meryl Streep through age 40. Then all she would have to do is average between 3.5 and 4 points every 10 years for another 30 years or more.

Good luck with that. Alongside Joe DiMaggio’s 56-game hitting streak, it may become a record that will never be broken.

Create, Test, Analyze, and Recreate

Apple’s IPhone just turned 10 years old. Why has it been such a successful product? It might be because the product hasn’t stayed static. The latest version of the IPhone is the IPhone 7+. As a product, it is constantly reinventing itself to improve its utility. It is always fresh. Apple, like most producers of successful products, probably follows a process whereby they:

  1. Create.
  2. Test what they’ve created.
  3. Analyze the results of their tests.
  4. Recreate.

They never dust off their hands and say, “My job is done.”

Now I won’t be so presumptuous to claim to have created something as revolutionary as the IPhone. But, regardless of how small your creation, its success requires you to follow the same steps outlined above.

My post last week outlined the testing process I put my algorithm through each year. This week I will provide some analysis and take some steps toward a recreation. The result of my test was that using my “really like” movie selection system significantly improved the overall quality of the movies I watch. On the negative side, the test showed that once you hit some optimal number of movies in a year, the additional movies you might watch are of diminishing quality as the remaining pool of “really like” movies shrinks.

A deeper dive into these results begins to clarify the key issues. Separating movies that I’ve seen at least twice from those that were new to me is revealing.

                                  Seen More than Once          Seen Once
                                1999 to 2001  2014 to 2016  1999 to 2001  2014 to 2016
# of Movies                          43           168           231           158
% of Total Movies in Timeframe     15.7%         51.5%         84.3%         48.5%
IMDB Avg Rating                     7.6           7.6           6.9           7.5
My Avg Rating                       8.0           8.4           6.1           7.7
% Difference                        5.2%         10.1%        -12.0%          2.0%

There is so much interesting data here I don’t know where to start. Let’s start with the notion that the best opportunity for a “really like” movie experience is the “really like” movie you’ve already seen. The % Difference row shows the margin by which My Avg Rating outperforms the IMDB Avg Rating, and it is positive in both “Seen More than Once” timeframes. The fact that, from 1999 to 2001, I was able to watch movies that I “really liked” more than the average IMDB voter, without the assistance of any movie recommender website, suggests that memory of a “really like” movie is a pretty reliable “really like” indicator. The 2014 to 2016 results suggest that my “really like” system can help prioritize the movies that memory tells you you’ll “really like” seeing again.

The “Seen Once” columns clearly display the advantages of the “really like” movie selection system. It’s for the movies you’ve never seen that movie recommender websites are worth their weight in gold. With limited availability of movie websites from 1999 to 2001, my selection of new movies underperformed the IMDB Avg Rating by 12%, and new movies represented 84.3% of all of the movies I watched during that timeframe. From 2014 to 2016, my “really like” movie selection system recognized that there is a limited supply of new “really like” movies. As a result, less than half of the movies watched from 2014 through 2016 were movies I’d never seen before. Of the new movies I did watch, there was a significant improvement over the 1999 to 2001 timeframe both in quality, as represented by the IMDB Avg Rating, and in my enjoyment of the movies, as represented by My Avg Rating.

Still, while the 2014 to 2016 new movies were significantly better than the new movies watched from 1999 to 2001, is it unrealistic to expect My Ratings to be better than IMDB by more than 2%? To gain some perspective on this question, I profiled the new movies I “really liked” in the 2014 to 2016 timeframe and contrasted them with the movies I didn’t “really like”.

Movies Seen Once
2014 to 2016
                                  "Really Liked"   Didn't "Really Like"
# of Movies                             116                 42
% of Total Movies in Timeframe         73.4%              26.6%
IMDB Avg Rating                         7.6                7.5
My Avg Rating                           8.1                6.3
"Really Like" Probability              82.8%              80.7%

The probability results for these movies suggest that I should “really like” between 80.7% and 82.8% of the movies in the sample. I actually “really liked” 73.4%, not too far off the probability expectations. The IMDB Avg Rating for the movies I didn’t “really like” is only a tick lower than the rating for the “really liked” movies. Similarly, the “Really Like” Probability is only a tick lower for the Didn’t “Really Like” movies. My conclusion is that there is some, but not much, opportunity to improve selection of new movies through a more disciplined approach. The better approach would be to favor “really like” movies that I’ve seen before and give new movies more time for their data to mature.
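The “not too far off” judgment can be checked with the numbers from the table above:

```python
liked, not_liked = 116, 42
actual_rate = liked / (liked + not_liked)
print(round(100 * actual_rate, 1))  # 73.4

# Gap between the actual rate and the low end of the predicted range
predicted_low, predicted_high = 0.807, 0.828
print(round(100 * (predicted_low - actual_rate), 1))  # 7.3 points short
```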

Based on my analysis, here is my action plan:

  1. Set separate probability standards for movies I’ve seen before and movies I’ve never seen.
  2. Incorporate the probability revisions into the algorithm.
  3. Set a minimum probability threshold for movies I’ve never seen before.
  4. When the supply of “really like” movies gets thin, only stretch for movies I’ve already seen and memory tells me I “really liked”.

Create, test, analyze and recreate.


Oh, What To Do About Those Tarnished Old Quarters.

In one of my early articles, I wrote about the benefits of including older movies in your catalogue of movies to watch. I used the metaphor of our preference for holding onto shiny new pennies rather than tarnished old quarters. One of the things that has been bothering me is that my movie selection system hasn’t been surfacing older movie gems that I haven’t seen. Take a look at the table below, based on the movies I’ve watched over the last 15 years:

Movie Release Time Frame   # of Movies Seen   % of Total
2007 to 2016                     573              29%
1997 to 2006                     606              31%
1987 to 1996                     226              11%
1977 to 1986                     128               6%
1967 to 1976                     101               5%
1957 to 1966                     122               6%
1947 to 1956                     109               6%
1937 to 1946                      87               4%
1920 to 1936                      25               1%

60% of the movies I’ve watched in the last 15 years were released in the last 20 years. That’s probably typical. In fact, watching movies more than 20 years old 40% of the time is probably unusual. Still, there are probably quality older movies out there that I’m not seeing.

My hypothesis has been that the databases for the movie websites that produce my recommendations are smaller for older movies. This results in recommendations that are based on less credible data. In the world of probabilities, if your data isn’t credible, your probability stays closer to the average probability for randomly selected movies.
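That idea, an estimate that stays near the overall average until enough data accumulates, can be sketched as a count-based credibility weight. This is my illustration of the principle, not the blog’s actual formula, and the constant k is an arbitrary tuning choice:

```python
def credibility_adjusted(raw_prob, baseline_prob, n_ratings, k=100_000):
    """Shrink a site-based probability toward the baseline when it rests
    on few ratings. k controls how fast credibility grows with data."""
    z = n_ratings / (n_ratings + k)   # credibility weight in [0, 1]
    return z * raw_prob + (1 - z) * baseline_prob

# With no ratings you get the baseline; with millions, nearly the raw value.
print(credibility_adjusted(0.80, 0.50, 0))                      # 0.5
print(round(credibility_adjusted(0.80, 0.50, 10_000_000), 3))   # 0.797
```

This is why an older gem with few ratings can be crowded out of the recommendations: its adjusted probability never strays far from the average.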

I set out to test this hypothesis against the movies I’ve watched since I began to diligently screen my movies through my movie selection system. It was around 2010 that I began putting together my database and using it to select movies. Here is a profile of those movies.

Seen after 2010
Movie Release Time Frame   My Average Rating   # of Movies Seen   % of Total Seen
2007 to 2016                     7.2                 382               55%
1997 to 2006                     7.9                  60                9%
1987 to 1996                     7.9                 101               15%
1977 to 1986                     7.8                  57                8%
1967 to 1976                     7.9                  23                3%
1957 to 1966                     8.2                  26                4%
1947 to 1956                     8.2                  20                3%
1937 to 1946                     8.4                  17                2%
1920 to 1936                     6.9                   4                1%

It seems that it’s the shiniest pennies, the ones I watch most often, that I’m least satisfied with. So again I have to ask: why aren’t my recommendations producing more older movies to watch?

It comes back to my original hypothesis. Netflix has the greatest influence on the movies that are recommended for me. So, I compared my ratings to Netflix’s best-guess ratings for me and added the average number of ratings those best guesses were based on.

Movie Release   My Average   Netflix Average   Avg. # of Ratings   My Rating Difference
Time Frame        Rating       Best Guess          per Movie           from Netflix
2007 to 2016       7.2            7.7              1,018,163             -0.5
1997 to 2006       7.9            8.0              4,067,544             -0.1
1987 to 1996       7.9            8.1              3,219,037             -0.2
1977 to 1986       7.8            7.8              2,168,369              0.0
1967 to 1976       7.9            7.6              1,277,919              0.3
1957 to 1966       8.2            7.9                991,961              0.3
1947 to 1956       8.2            7.8                547,577              0.4
1937 to 1946       8.4            7.8                541,873              0.6
1920 to 1936       6.9            6.1                214,569              0.8

A couple of observations on this table:

  • Netflix pretty effectively predicts my rating for movies released between 1977 and 2006. The movies from this thirty-year time frame base their Netflix best guesses on more than 2,000,000 ratings per movie.
  • Netflix overestimates my ratings for movies released from 2007 to today by a half point. It may be that the people who see newer movies first are those most likely to rate them higher. It might take twice as many ratings before the best guess finds its equilibrium, like the best guesses for the 1987 to 2006 releases.
  • Netflix consistently underestimates my ratings for movies released prior to 1977. And the fewer ratings the Netflix best guess is based on, the more Netflix underestimates my rating of the movies.

What have I learned? First, to improve the quality of the new movies I watch, I should wait until the recommendations are based on a greater number of ratings. How many ratings are enough is something I have to explore further.

The second thing I’ve learned is that my original hypothesis is probably correct. The number of ratings Netflix can base its recommendations on for older movies is probably too small for those recommendations to be adequately responsive to my taste for older movies. The problem is that the answer to “Oh, what to do about those tarnished old quarters” isn’t readily apparent.


September, When the Movies You Expect to be Good Are Bad and Vice Versa

Picking “really like” movie prospects for September is a tricky game. The movies that sound good probably aren’t, and the movies that don’t sound good may be alright. As I mentioned in my last post, there is a 38.3% chance that I will “really like” a movie released in September. That means that if I picked five movies at random as September prospects, I’d stand a good chance that two of them would be “really like” movies. Right? Theoretically, that’s true. But I’m not picking randomly; I’m trying to pick movies I’d like, which may work against me.
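The pick-five-expect-two intuition checks out under a simple binomial model. This is my illustration; it assumes each pick independently carries the 38.3% “really like” chance, which only holds for random selection:

```python
from math import comb

p = 0.383  # chance of "really like" for a September release
n = 5      # number of prospects picked

# Expected number of "really like" movies among the five picks
expected = n * p
print(round(expected, 1))  # 1.9 -- about two of five

# Probability that at least two of the five are "really like" movies
at_least_two = 1 - sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in (0, 1))
print(round(at_least_two, 2))  # 0.63
```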

For example, my first movie is:

Sully. Release Date: September 9, 2016       “Really Like” Probability: 40%

This movie is directed by Clint Eastwood and stars Tom Hanks. In the last 15 years I’ve seen 15 movies directed by Clint Eastwood and “really liked” 12 of them. Over the same time frame I’ve watched 25 movies that Tom Hanks starred in and 20 of those I “really liked”. So, I “really liked” 80% of the movies I’ve seen for both the Director and the Actor. Here’s something else those 40 movies have in common. None of them were released in September. If a movie that involves the pedigree of Eastwood and Hanks is released in September, should we be skeptical? Yes, but because of the pedigree, I have to put the movie on the prospect list.

Similarly, my second prospect:

The Magnificent Seven. Release Date: September 23, 2016  “Really Like” Probability: 35%

This movie also stars a very bankable actor, Denzel Washington, and a new star, Chris Pratt. I’ve seen 23 Denzel Washington movies and “really liked” 17 of them. The movie is also in one of my favorite genres, the Western. But, guess what, none of those 23 movies was released in September. Again, this is a movie I want to see but the release date makes me skeptical.

Which brings me to three movies that don’t jump out and say “watch me” but are intriguing nonetheless. The first is:

The Light Between Oceans. Release Date: September 2, 2016 “Really Like” Probability: 35%

This movie stars two big-name actors, Alicia Vikander and Michael Fassbender, who happen to be in a relationship in their private lives. Does the off-screen chemistry translate on-screen a la Katharine Hepburn and Spencer Tracy? We’ll have to wait and see if this melodrama rises above September expectations.

The next movie may be September-proof:

Queen of Katwe.  Release Date: September 30, 2016    “Really Like” Probability: 45%

Walt Disney Pictures over the last couple of decades has developed a sub-genre specialty in its efforts to produce family-oriented entertainment: the “true underdog” sports movie. While on occasion they’ve taken liberty with the facts, as in 2015’s McFarland, USA, their product has been consistently entertaining. This year’s underdog competitor is a young girl from a Ugandan village who trains to be a world chess champion. I believe that this movie is the most promising of the month.

For my final choice, I’m going with a selection from the odd filmography of Tim Burton:

Miss Peregrine’s Home for Peculiar Children. Release Date: September 30, 2016  “Really Like” Probability: 35%

Why did I pick this movie? Maybe it’s because it has a little bit of a Harry Potter feel to it. The clincher though is that Samuel L. Jackson is in the movie. There are five actors who I’ve seen in at least 25 movies over the past 15 years. Samuel L. Jackson is one of the five. And, he ranks only behind Tom Hanks in the average rating I’ve given those movies.

That’s all I’ve got. After all it is September movies we’re talking about.


The Best Thing About September Movies is that Awards Season is Just Around the Corner

You might think from my title that no good movie has ever been released for broad distribution in September. Of course, that would be wrong. After all, the movie industry has released movies such as Goodfellas, Moneyball, Almost Famous, L.A. Confidential, and The Town in the month of September. But, as followers of this blog are aware, I deal in probabilities. And, based on my data, there is only a 38.3% chance I will “really like” a movie released in September. For the optimists out there, that does mean I will “really like” almost 4 out of every 10 movies released. There will be some good movies, but I wouldn’t gamble that seeing a movie on a September opening weekend is a good bet. After all, you wouldn’t go to a casino and feed your entire paycheck to a slot machine with only a 38.3% chance of winning.

If you need more convincing, consider this. Out of 158 movies nominated for a Best Picture Oscar since 1990, only 4 (2.5%) were released in September and none of them won. Three of those four movies were mentioned above. The fourth is The Full Monty. How many of you have seen The Full Monty?

Here is a list of last September’s top five box office hits:

Movie                            Gross ($ millions)   Budget ($ millions)
Hotel Transylvania 2                  $169.7                $80.0
Maze Runner: The Scorch Trials         $81.7                $61.0
The Intern                             $75.8                $35.0
The Visit                              $65.2                 $5.0
Black Mass                             $62.6                $53.0

Only one of the five, Black Mass, earned a Certified Fresh rating on Rotten Tomatoes, and that was with the bare minimum for the rating, 75% Fresh. None of the five met my IMDB baseline of 7.3 for a “really like” movie.

So should we just skip the movie theater in September? Maybe. But, if you really want to see a movie in the theater in September, pick one of the little movies released in August, like Hell or High Water or Florence Foster Jenkins. Be patient! Awards Season at the movies is just around the corner, beginning in October.

According to IMDB, Do We Get More Conservative as We Age, or Just Smarter?

We are in the midst of a hotly contested election season in which the pundits attempt to place groups of people in defined boxes. Two of the boxes are the liberal youth vote and the conservative Senior Citizen vote. Some pundits argue that as Senior Citizens die off the country will become more liberal. Others argue that young people will become conservative as they age, maintaining the status quo. Do we become more conservative as we age? There are studies on both sides of the issue as evidenced in this 2012 article and this 2014 article.

I won’t attempt to resolve this issue here but, instead, use it as a backdrop for another interesting finding in my demographic study of IMDB’s Top 250 Movies. Age does appear to factor in IMDB voting. Take a look at these results by Age Group:

                Avg. IMDB Rating
Age Group      All    Male   Female
Under 18       8.7    8.8     8.6
18 to 29       8.5    8.5     8.4
30 to 45       8.3    8.4     8.3
Over 45        8.2    8.2     8.0

As IMDB voters get older, the average rating for the same group of movies declines. It doesn’t matter whether the groups are male or female; the pattern is the same. The fact that the average ratings for the female groups are consistently lower than for the male groups is probably due to the bias toward male-oriented movies in the Top 250. Is this further evidence that we get more conservative as we get older?

I’ll offer up a counter-argument: maybe we get smarter as we get older. There are scientific studies that support this, including those cited in this 2011 article. There is some IMDB support for this argument as well. One of the demographic groups that IMDB captures data for is the Top 1,000 IMDB voters. These are the voters who have rated the most movies on IMDB and presumably have watched the most movies. The average IMDB rating from this group for the Top 250 Movies is 7.9. Perhaps the more movies you watch, the smarter you get at differentiating one movie from another. If so, then maybe the lower average ratings for the older age groups reflect the experience gained from watching a greater number of movies. Whether we get more conservative or smarter as we age, it would be wise for the older moviegoer to recognize that the average IMDB rating is heavily influenced by males aged 18 to 29. You’ll need to apply a senior discount to the rating. What do you think?


Screening Room and the Long Shadow of Netflix

Over the last few weeks, an interesting news item that might significantly impact the future of home entertainment has been making its way into the mainstream media. A new home video service called Screening Room is being pitched in Hollywood. Here is the idea: for $150, customers would buy a set-top box, probably similar to a cable TV box, which would let them rent a movie for 48 hours, on the day it is released in theaters, for $50 per movie, with a share of that $50 going back to movie theaters. But wait, the renter would also get two tickets to see the same movie at the theater. This is either a crazy idea or the next big thing on the horizon for an industry already in great flux.

My initial reaction was that this is a crazy idea because the $50 cost is out of reach for the average moviegoer. What gives me pause, though, is that the idea belongs to Sean Parker, the co-founder of Napster and former president of Facebook. For moviegoers, he was the character played by Justin Timberlake in The Social Network. Screening Room is backed by a powerful cadre of a who’s who in Hollywood, including producer Brian Grazer and directors Steven Spielberg, Ron Howard, J.J. Abrams and Peter Jackson. Next week it is being demonstrated to industry professionals at the CinemaCon convention in Las Vegas. Stay tuned for further developments. This story could have legs.

For now, let’s consider: is the idea viable, and why are powerful moviemakers getting behind it? Many home entertainment customers have bought a streaming device, Roku, Apple TV, etc., for $50 to $100. The $150 for the set-top box probably isn’t unreasonable if there is sufficient demand for $50 movies. That’s the big question. Event movies like Star Wars: The Force Awakens, which are candidates for viewing parties, would be a reasonable target for the service. It would be an extension of the demand that exists today for pay-per-view boxing matches. Families that often go to the movies together are another potential market. Couples with young children might also be a target once you add the cost of hiring a babysitter to the price of theater tickets. But for singles and couples with no children, the cost-benefit math doesn’t seem to work.
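The back-of-the-envelope math above can be made concrete. The ticket and babysitter prices below are my own assumed, 2016-era round numbers, not figures from the Screening Room pitch; only the $50 rental (which includes two theater tickets) comes from the story.

```python
# Assumed prices (hypothetical round numbers for illustration).
TICKET = 10.00           # average movie ticket
SITTER = 40.00           # babysitter for the evening
SCREENING_ROOM = 50.00   # 48-hour rental, two theater tickets included

# A couple with young kids: two tickets plus a sitter.
couple_with_kids_theater = 2 * TICKET + SITTER   # 60.00

# A couple with no kids: just two tickets.
couple_no_kids_theater = 2 * TICKET              # 20.00

print(couple_with_kids_theater > SCREENING_ROOM)  # True: the rental can win
print(couple_no_kids_theater > SCREENING_ROOM)    # False: the theater is cheaper
```

Under these assumptions, the $50 rental beats a theater night for the couple paying a sitter, but costs more than double a theater night for everyone else, which is exactly why the addressable market looks narrow.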

As for moviemakers, the motivations might be complex. There is the possibility that it could add to the box office. For example, young couples with kids who would rather stay home than pay for a sitter, might see more first run movies than they normally would go to the theater to see. The biggest motivation for the movie industry, though, may be the looming threat posed by Netflix and other streaming services.

When Netflix streamed Beasts of No Nation (Idris Elba won a Screen Actors Guild Award for the film) to all of its subscribers on the same day it was released in movie theaters, it sent a shot across the bow of the movie industry, particularly theater owners. The four major theater chains in the US boycotted the film, and it ended up getting only a limited release in 200-250 independent and art film theaters. Another Sundance favorite, The Fundamentals of Caring with Paul Rudd and Selena Gomez, was purchased by Netflix and will be streamed on Netflix on June 24, 2016. It is dawning on the industry that moviegoers may not need to actually go to movie theaters to see first run movies. The movies can be delivered to them at home.

Screening Room may be seen by the movie theater industry as a financially viable alternative to Netflix, at least until Screening Room, too, cuts out the middleman's share to make the service more affordable to average movie watchers. After all, that's how Netflix does it.

What do you think? Is it a viable concept?

What Movie Are You?

This past weekend I watched Saturday Night Fever for the fourth time. Roger Ebert mentions in his Great Movies review of the film that it was Gene Siskel’s favorite movie of all-time, having seen it 17 times. I’m in the Siskel camp. It is one of my favorite movies of all-time as well. I watched it for the first time in a Chicago area theater when it first came out in 1977. I was in the first year of my new job, the first of 35 successful years with the same company. I was within a year of meeting my future wife; we've been married 36+ years and are still going strong. And, a little less than two years earlier, I had left the middle-class New England town I grew up in and moved to the Chicago area. As it turned out, that momentous decision shaped my entire adult life.

When I mention to others that Saturday Night Fever is a favorite of mine, a typical reaction is “I hate disco”. It is so much more than a disco movie. Disco is just its milieu. It is a movie about dreams and the barriers that get in the way of realizing those dreams. It is about being stuck in your current existence and coming to the realization that you won’t like the consequences of staying stuck. It is about breaking away and giving yourself a chance.

As I watched Saturday Night Fever that first time, I began to identify with the movie. I identified with Tony Manero’s yearning to create a bigger footprint in his life than he could in his Bay Ridge neighborhood. I recognized the emotional traps that were holding him back from pursuing his dream. I felt his relief when he finally decided to make the move to Manhattan, even though he had no job to go to. I was Saturday Night Fever without, of course, the disco dance king lifestyle.

In the next series of Posts, I will introduce movie recommender sites that try to answer the question “What Movie Are You?” based on the movies that you “really like”. No site can identify all of the deep-down personal reasons why a movie connects with you. Under my system, for example, there was only a 28.2% chance that I would “really like” Saturday Night Fever. But the movies you do “really like” identify the types of movies that draw you in, and these sites effectively select quality movies within genres you enjoy watching. The sites are all different, using a variety of assumptions and methodologies. They are all just waiting for you to start rating the movies you’ve seen, both good and bad, so that they can get to know you.

In the meantime, consider sharing a comment on your reaction to this Post. Are there any movies that connect with you on a personal level? What Movie Are You?

Post Navigation