In my prior life, I would on occasion find myself leading a training session on the predictive model we were using in our business. Since the purpose of the model was to help our Account Executives make more effective business decisions, one point of emphasis was to identify instances when the model would present them with misleading information that could lead to ineffective decisions. One of the most basic of these predictive model traps is that the model relies on input data that accurately reflects the conditions being tested. If you put garbage into the model, you will get garbage out of the model.
Netflix, MovieLens, and Criticker are predictive models. They predict movies that you might like based on your ratings of the movies you have seen. Just like the predictive model discussed above, if the ratings you input into these movie models are inconsistent from movie to movie, you increase the chances that the site will recommend movies that you won't like. A consistent standard for rating movies is a must.
The best approach to rating movies is a simple approach. I start with the Netflix guidelines to rating a movie:
- 5 Stars = I loved this movie.
- 4 Stars = I really liked this movie.
- 3 Stars = I liked this movie.
- 2 Stars = I didn’t like this movie.
- 1 Star = I hated this movie.
When I’ve used this standard to guide others in rating movies, the feedback has been that it is easily understood. The primary complaint has been that sometimes the rater can’t decide between the higher and lower rating; the movie fits somewhere in between. For example, “I can’t decide whether I ‘really liked’ this movie or just ‘liked’ it.” This happens often enough that I’ve concluded that a 10-point scale is best:
- 10 = I loved this movie.
- 9 = I can’t decide between “really liked” and “loved”.
- 8 = I really liked this movie.
- 7 = I can’t decide between “liked” and “really liked”.
- 6 = I liked this movie.
- 5 = I can’t decide between “didn’t like” and “liked”.
- 4 = I didn’t like this movie.
- 3 = I can’t decide between “hated” and “didn’t like”.
- 2 = I hated this movie.
- 1 = My feeling for this movie is beyond hate.
The nice thing about a 10-point scale is that it is easy to convert to other standards. Using the scales that exist on each of the websites, the conversion of a rating of 7 would look like this:
- IMDb = 7 (IMDb already uses a 10-point scale)
- Netflix = 7 / 2 = 3.5, rounded up to 4 (Netflix uses a 5-star scale with no half stars)
- Criticker = 7 × 10 = 70 (Criticker uses a 100-point scale)
- MovieLens = 7 / 2 = 3.5 (MovieLens uses a 5-star scale but allows half stars)
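The conversions above are simple enough to automate. Here is a minimal sketch in Python; the function name and the dictionary of site scales are my own illustrative choices, not any website's API:

```python
import math

def convert_rating(score):
    """Convert a rating on the 10-point scale to each site's scale.

    `score` is a whole number from 1 to 10. The rounding choices
    mirror the conversions described above.
    """
    return {
        "IMDb": score,                    # already a 10-point scale
        "Netflix": math.ceil(score / 2),  # whole stars only, round up
        "Criticker": score * 10,          # 100-point scale
        "MovieLens": score / 2,           # half stars allowed
    }

print(convert_rating(7))
# A 7 becomes IMDb 7, Netflix 4, Criticker 70, MovieLens 3.5
```

Note the use of `math.ceil` for Netflix: since half stars aren't allowed there, an in-between score always rounds up to the higher star.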
Criticker, being on a 100-point scale, gives you the ability to fine-tune your ratings even more, though I think it is difficult to subjectively differentiate between, say, an 82 and an 83. We can explore this issue in a future post.
So from one simple evaluation of a movie, you can generate a consistent rating across all of the websites you might use. This consistency allows for a more “apples to apples” comparison.
So throw out the garbage. Good data in will produce good data out, and a more reliable list of movies that you will “really like”.