Will I "Really Like" this Movie?

Navigating Movie Website Ratings to Select More Enjoyable Movies


Musings After a Quiet Movie Weekend

There were no changes this week to the 2017 Objective Top Ten. None of the movies that opened last weekend earned a Certified Fresh on Rotten Tomatoes. So, I have nothing to talk about. Right? Oh, you are so wrong.

First, regarding that Objective Top Ten that I update every Monday, I want to be clear about something. I’m not suggesting that you will like every movie on that list. Nor am I suggesting that there aren’t good movies that didn’t make the list. In fact, my two favorite movies so far, Beauty and the Beast and Gifted, aren’t on the list. It is just an objective measure of quality. It doesn’t take into account your personal taste in movies. For example, if you typically don’t like Art House movies, you may not like Kedi, a documentary about the hundreds of thousands of cats that have been roaming Istanbul for thousands of years, or Truman, a Spanish-language film that celebrates the enduring nature of good friendship. These low-budget movies tend to take risks and aren’t intended to please a general audience. But would you really prefer to see the new Transformers movie, which opened yesterday and is 16% Rotten on Rotten Tomatoes? You may prefer to avoid all three movies, and that’s okay. The point of the list is to give you a menu of quality movies; if any of them naturally intrigues you, the odds are that it will be a “really like” movie for you.

Turning from low-budget Art House films to big-budget Blockbusters, the success of two other movies on the list explains why movies based on comic books are here to stay for the foreseeable future. Logan, with its estimated $97 million production budget, and Wonder Woman, with its estimated budget of $149 million, have delivered tidy returns in worldwide box office receipts of over $617 million and $578 million, respectively. When quality movies in the comic book genre are made, they spin box office gold.

A couple of other notes on the Objective Top Ten List. In July I plan to expand the list to fifteen movies, and in October I’ll expand it again to twenty movies. This will better accommodate the number of quality movies that typically are released over the second half of the year. Also, I’m close to being able to incorporate Cinemascore grades into the probabilities for the Objective Top Ten. It’s possible that this change may be incorporated as early as next Monday’s update. It will better differentiate one movie from the next.

Finally, two movies that I have my eye on for this weekend are The Beguiled, which earned Sofia Coppola the top director award at Cannes, and The Big Sick, which is already 98% Certified Fresh on Rotten Tomatoes.


Can You Increase Your Odds of Having a “Really Like” Experience at the Movie Theater?

Last Friday, my wife and I were away from home visiting two different sets of friends. One group we met for lunch. The second group we were meeting in the evening. With some time to spare between visits, we decided to go to a movie. The end of April usually has slim pickings for “really like” movies at the theater. With the help of IMDB and Rotten Tomatoes, I was able to surface a couple of prospects but only one that both my wife and I might “really like”. We ended up seeing a terrific little movie, Gifted.

My experience got me thinking about the probabilities of seeing “really like” movies at the movie theater. These movies have the least data on which to base a decision, and yet I can’t recall many movies that I’ve seen in the theater that I haven’t “really liked”. Was this reality or merely perception?

I created a subset of my database of movies that I’ve seen within 3 months of their release. Of the 1,998 movies in my database, 99 movies, or 5%, met the criteria. Of these 99 movies, I “really liked” 86% of them. For the whole database, I “really liked” 60% of the movies I’ve watched over the last 15 years. My average score for the 99 movies was 7.8 out of 10. For the remaining 1,899 movies my average score was 6.8 out of 10.
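The subset comparison above boils down to a simple calculation: split the database on one criterion, then compare the "really like" rate and average score of each side. The sketch below illustrates the idea with made-up ratings; the threshold and the sample numbers are my assumptions, not the author's actual data.

```python
# Illustrative sketch of the theater-vs-home comparison described above.
# Ratings and the "really like" threshold of 7 are hypothetical examples.

def really_like_rate(ratings, threshold=7):
    """Fraction of ratings at or above the 'really like' threshold."""
    liked = [r for r in ratings if r >= threshold]
    return len(liked) / len(ratings)

# Movies seen within 3 months of release vs. everything else (sample data)
theater = [8, 9, 7, 8, 10, 6, 8]
home = [6, 7, 5, 8, 4, 7, 6, 9, 5]

print(round(really_like_rate(theater), 2))  # 0.86
print(round(really_like_rate(home), 2))     # 0.44
```

With real data the same two-line comparison would reproduce the 86% versus 60% split the post reports.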

How do I explain this? My working theory is that when a movie comes with an additional cash outlay, i.e. theater tickets, I become a lot more selective in what I see. But how can I be more selective with less data? I think it’s by selecting safe movies. There are movies that I know I am going to like. When I went into the movie theater a couple of months ago to see Beauty and the Beast, I knew I was going to love it, and I did. Those are the types of movie selections I tend to reserve for the theater experience.

There are occasions like last Friday when a specific movie isn’t drawing me to the movies but instead I’m drawn by the movie theater experience itself. Can I improve my chances of selecting a “really like” movie in those instances?

Last week I mentioned in my article that I needed to better define what I need my “really like” probability model to do. One of the things it needs to do is provide better guidance for new releases. The current model has a gap when it comes to new releases. Because the data is scarce, most new releases will be Quintile 1 movies in the model. In other words, few of the indicators based on my taste in movies, i.e. Netflix, Movielens, and Criticker, are factored into the “really like” probability.

A second gap in the model is that new releases haven’t been considered for Academy Awards yet. The model treats them as if they aren’t award worthy, even though some of them will be Oscar nominated.

I haven’t finalized a solution to these gaps but I’m experimenting with one. As a substitute for the Oscar performance factor in my model I’m considering a combined IMDB/Rotten Tomatoes probability factor. These two outputs are viable indicators of the quality of a new release. This factor would be used until the movie goes through the Oscar nomination process. At that time, it would convert to the Oscar performance factor.
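The substitution being experimented with above can be sketched as a simple switch: until a movie has been through an Oscar cycle, blend normalized IMDB and Rotten Tomatoes scores into a quality factor; afterwards, fall back to the Oscar performance factor. The 50/50 weighting and the field names here are my assumptions, not the author's actual formula.

```python
# Hypothetical stand-in factor for new releases, as described above.
# The equal weighting of IMDB and RT is an assumption for illustration.

def new_release_factor(imdb_rating, rt_percent):
    """Blend IMDB (0-10 scale) and RT (0-100 scale) into a 0-1 factor."""
    return 0.5 * (imdb_rating / 10) + 0.5 * (rt_percent / 100)

def quality_factor(movie):
    """Use the IMDB/RT blend until the Oscar cycle completes."""
    if movie.get("oscar_cycle_complete"):
        return movie["oscar_factor"]  # e.g. 0.358 for no nomination
    return new_release_factor(movie["imdb"], movie["rt"])

# Guardians of the Galaxy Vol. 2, using the figures quoted in this post
guardians = {"imdb": 8.2, "rt": 86, "oscar_cycle_complete": False}
print(round(quality_factor(guardians), 2))  # 0.84
```

The appeal of this design is that the factor converts automatically once nomination results exist, so no movie is permanently treated as un-nominated.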

I’ve created a 2017 new release list of the new movies I’m tracking. You can find it on the sidebar with my Weekly Watch List movies. This list uses the new “really like” probability approach I’m testing for new releases. Check it out.

If you plan on going to the movies this weekend to see Guardians of the Galaxy Vol. 2, it is probably because you really liked the first one. Based on IMDB and Rotten Tomatoes, you shouldn’t be disappointed. It is Certified Fresh 86% on Rotten Tomatoes and 8.2 on IMDB.


Some Facts Are Not So Trivial

As I’ve mentioned before on these pages, I always pay a visit to the IMDB trivia link after watching a movie. Often I will find a fun but ultimately trivial fact such as the one I discovered after viewing Beauty and the Beast. According to IMDB, Emma Watson was offered the Academy Award winning role of Mia in La La Land but turned it down because she was committed to Beauty and the Beast. Coincidentally, the heretofore non-musical Ryan Gosling was offered the role of the Beast and turned it down because he was committed to that other musical, La La Land. You really can’t fault either of their decisions. Both movies have been huge successes.

On Tuesday I watched the “really like” 1935 film classic Mutiny on the Bounty. My visit to the trivia pages of this film unearthed facts that were more consequential than trivial. For example, the film was the first movie of historically factual events, with actors playing historically factual people, to win the Academy Award for Best Picture. The previous eight winners were all based on fiction. Real life became a viable source for great films, as the next two Best Picture winners, The Great Ziegfeld and The Life of Emile Zola, were also biographies. Interestingly, it would be another 25 years before another non-fictional film, Lawrence of Arabia, would win a Best Picture award.

Mutiny on the Bounty also has the distinction of being the only movie ever to have three actors nominated for Best Actor: Clark Gable, Charles Laughton, and Franchot Tone. Everyone expected one of them to win. After splitting the votes amongst themselves, none of them did. Oscar officials vowed never to let that happen again. For the next Academy Awards in 1937, they created two new awards, for Actor and Actress in a Supporting Role. Since then, two actors from the same movie have been nominated for Best Actor in only six other instances.

Before leaving Mutiny on the Bounty, there is one more non-trivial fact to relate about the movie. The characters of Captain Bligh and First Mate Fletcher Christian grow to hate each other in the plot. To further that requisite hate in the movie, Irving Thalberg, one of the producers, purposely cast the overtly gay Charles Laughton as Bligh and the notorious homophobe Gable as Fletcher Christian. This crass manipulation of the actors’ prejudice seemed to have worked as the hate between the two men was evident on the set and clearly translated to the screen. This kind of morally corrupt behavior was not uncommon in the boardrooms of the Studio system in Hollywood at the time.

Some other older Best Picture winning films with facts, not trivial, but consequential to the film industry or the outside world include:

  • It Happened One Night, another Clark Gable classic, in 1935 became the first of only three films to win the Oscar “grand slam”. The other two were One Flew Over the Cuckoo’s Nest and Silence of the Lambs. The Oscar “grand slam” is when a movie wins all five major awards, Best Picture, Director, Actor, Actress, and Screenplay.
  • Gone with the Wind, along with being the first Best Picture filmed in color, is the longest movie, at four hours, to win Best Picture. Hattie McDaniel became the first black actor to be nominated for and win an Oscar for her role in the film.
  • In Casablanca, there is a scene where the locals drown out the Nazi song “Watch on the Rhine” with their singing of the “Marseillaise”. In that scene you can see tears running down the cheeks of many of the locals. For many of these extras the tears were real since they were actual refugees from Nazi tyranny. Ironically, many of the Nazis in the scene were also German Jews who had escaped Germany.
  • To prepare for his 1946 award winning portrayal of an alcoholic in The Lost Weekend, IMDB reveals that “Ray Milland actually checked himself into Bellevue Hospital with the help of resident doctors, in order to experience the horror of a drunk ward. Milland was given an iron bed and he was locked inside the “booze tank.” That night, a new arrival came into the ward screaming, an entrance which ignited the whole ward into hysteria. With the ward falling into bedlam, a robed and barefooted Milland escaped while the door was ajar and slipped out onto 34th Street where he tried to hail a cab. When a suspicious cop spotted him, Milland tried to explain, but the cop didn’t believe him, especially after he noticed the Bellevue insignia on his robe. The actor was dragged back to Bellevue where it took him a half-hour to explain his situation to the authorities before he was finally released.”
  • In the 1947 film Gentlemen’s Agreement about anti-Semitism, according to IMDB, “The movie mentions three real people well-known for their racism and anti-Semitism at the time: Sen. Theodore Bilbo (D-Mississippi), who advocated sending all African-Americans back to Africa; Rep. John Rankin (D-Mississippi), who called columnist Walter Winchell  “the little kike” on the floor of the House of Representatives; and leader of “Share Our Wealth” and “Christian Nationalist Crusade” Gerald L. K. Smith, who tried legal means to prevent Twentieth Century-Fox from showing the movie in Tulsa. He lost the case, but then sued Fox for $1,000,000. The case was thrown out of court in 1951.”

One of the definitions of “trivia” is “an inessential fact; trifle”. The fact that IMDB lists these items under its Trivia link does not make them trivia. The facts presented here either promoted creative growth in the film industry or made a significant statement about society. Some facts are not so trivial.


The Wandering Mad Movie Mind

Last week in my post I spent some time leading you through my thought process in developing a Watch List. There were some loose threads in that article that I’ve been tugging at over the last week.

The first thread was the high “really like” probability that my algorithm assigned to two movies, Fight Club and Amelie, that I “really” didn’t like the first time I saw them. It bothered me to the point that I took another look at my algorithm. Without boring you with the details, I had an “aha” moment and was able to reengineer my algorithm in such a way that I can now develop unique probabilities for each movie. Prior to this I was assigning the same probability to groups of movies with similar ratings. The result is a tighter range of probabilities clustered around the base probability. The base probability is defined as the probability that I would “really like” a movie randomly selected from the database. If you look at this week’s Watch List, you’ll notice that my top movie, The Untouchables, has a “really like” probability of 72.2%. In my revised algorithm that is a high probability movie. As my database gets larger, the extremes of the assigned probabilities will get wider.
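The clustering behavior described above can be sketched as a shrinkage calculation: each movie's site ratings nudge the base probability up or down, rather than whole groups of movies sharing one probability. The 60% base, the 0.3 weight, and the sample scores below are my assumptions for illustration, not the actual reengineered formula.

```python
# Hedged sketch of per-movie probabilities clustered around the base.
# BASE is the chance a randomly selected movie is "really liked" (60%,
# per the database statistics quoted elsewhere in this blog).
BASE = 0.60

def movie_probability(site_scores, weight=0.3):
    """site_scores: dict of website -> score normalized to a 0-1 scale.
    Shrink the averaged signal toward the base probability, so results
    stay in a tight range around BASE and only strong, consistent
    signals push a movie far above or below it."""
    signal = sum(site_scores.values()) / len(site_scores)
    return (1 - weight) * BASE + weight * signal

# Hypothetical normalized scores for a well-rated catalog title
untouchables = {"netflix": 0.96, "movielens": 0.90, "criticker": 0.94}
print(round(movie_probability(untouchables), 2))  # 0.7
```

As the database grows, raising the weight on the movie's own signal would widen the extremes, which matches the behavior the post predicts.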

One of the by-products of this change is that the rating assigned by Netflix is the most dominant driver of the final probability. This is as it should be. Netflix has by far the largest database of any I use. Because of this, it produces the most credible and reliable ratings of any of the rating websites. Which brings me back to Fight Club and Amelie. The probability for Fight Club went from 84.8% under the old formula to 50.8% under the new formula. Amelie went from 72.0% to 54.3%. On the other hand, a movie that I’m pretty confident I will like, Hacksaw Ridge, changed only slightly, from 71.5% to 69.6%.

Another thread I tugged at this week was in response to a question from one of the readers of this blog. The question was why Beauty and the Beast was earning the low “really like” probability of 36.6% when I felt there was a high likelihood that I was going to “really like” it. The fact is that I saw the movie this past week and it turned out to be a “really like” instant classic. I rated it a 93 out of 100, which is a very high rating from me for a new movie. In my algorithm, new movies are underrated for two reasons. First, because they generate so few ratings in their early months, e.g. Netflix has only 2,460 ratings for Beauty and the Beast so far, the credibility of the movie’s own data is so small that the “really like” probability is driven by the Oscar performance part of the algorithm. Second, new movies haven’t been through the Oscar cycle yet, so their Oscar performance probability is that of a movie that didn’t earn an Oscar nomination, or 35.8%. This is why Beauty and the Beast was only at a 36.6% “really like” probability on my Watch List last week.
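The mechanics of why a new release sinks toward the no-nomination prior can be sketched as a credibility-weighted blend: with few ratings, the movie's own data gets almost no weight. The credibility formula n / (n + k) and the k value below are my assumptions; the 35.8% prior and the 2,460-rating count come from the post itself.

```python
# Hedged sketch of credibility weighting for new releases.
# Oscar performance prior for a film with no nomination (from the post).
OSCAR_LESS_PRIOR = 0.358

def blended_probability(own_estimate, n_ratings, k=50000):
    """Weight the movie's own 'really like' signal by how many ratings
    back it up; with few ratings, the result hugs the prior. The
    half-credibility point k is a hypothetical tuning constant."""
    credibility = n_ratings / (n_ratings + k)
    return credibility * own_estimate + (1 - credibility) * OSCAR_LESS_PRIOR

# Beauty and the Beast: strong early signal, but only 2,460 Netflix ratings
print(round(blended_probability(0.90, 2460), 3))  # 0.383
```

Even with a 90% personal signal, 2,460 ratings barely move the needle off 35.8%, which is the same dynamic that left Beauty and the Beast near 36.6% on the Watch List.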

I’ll leave you this week with a concern. As I mentioned above, Netflix is the cornerstone of my whole “really like” system. You can appreciate, then, my heart palpitations when it was announced a couple of weeks ago that Netflix is abandoning its five-star rating system in April. It is replacing it with a thumbs up or down rating with a % next to it, perhaps a little like Rotten Tomatoes. While I am keeping an open mind about the change, it has the potential of destroying the best movie recommender system in the business. If it does, I will be one “mad” movie man, and that’s not “crazy” mad.
