Netflix Replacing Star Ratings With Thumbs Ups and Thumbs Down (variety.com)
242 points by gerosan 186 days ago | 289 comments



As I posted in the dupe:

This saddens me.

>Users would rate documentaries with 5 stars, and silly movies with just 3 stars, but still watch silly movies more often than those high-rated documentaries

That's not incongruous to me. The stars are not about an "enjoyment" factor; they are about perceived quality. I may have a go-to cheap ice cream and rate it 3 stars, but rate a good affogato 5 stars and only have it once in a while.

They are diluting the meaning of quality and instead are opting for a saccharine "enjoyment" factor. This binary choice does not sit well with me and I hope they abandon the idea soon.


We've analysed product ratings in the past and found that online ratings tend to have primarily bimodal distributions. People who rate online self-select: they either love something and give very good ratings, or they dislike it and give poor ratings. Unlike surveys, where people have to answer questions, only people with extreme reactions tend to answer these online rating questions, so you don't get any meaningful distributions. So the only meaningful way to model, say, drivers from comments vs. ratings is to treat the ratings as binary and fit a logistic-type model.

Ultimately, the key reason to have these ratings in the first place is not to let people express their feelings; it is to get data for the recommendation modeling to predict better. Perhaps showing a simple like/don't-like increases the number of people rating something, which gives you more training data and thus a better business outcome from your model, without compromising on model quality, since you had to model it bimodally anyway.
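A minimal sketch of what I mean, in Python (toy data; the features and threshold are made up, not anything from a real pipeline):

  # Binarize 1-5 star ratings at a threshold, then fit a logistic model.
  import numpy as np
  from sklearn.linear_model import LogisticRegression

  stars = np.array([1, 5, 5, 2, 4, 1, 5, 3])   # mostly extreme, as observed
  X = np.random.rand(len(stars), 3)            # stand-in for driver features
  y = (stars >= 4).astype(int)                 # 4-5 stars counts as "liked"

  model = LogisticRegression().fit(X, y)
  print(model.predict_proba(X)[:, 1])          # P(liked) per observation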


I'm far from an expert on statistics, but wouldn't the fact that star ratings tend to fall on a bimodal distribution just mean that the ratings that don't fit the bimodal distribution have a higher signal/noise ratio than the ones that do? I'd think you could use a 2-4 rating as an indicator of a thoughtful reviewer. Additionally, it seems like you could also use the fact that a particular reviewer doesn't fit the overall bimodal distribution as a sign that this reviewer's reviews are similarly more thought out.


I've built recommender systems at Etsy, among other places, and you get much better results if you make the data yes/no first. Shockingly MUCH better results. You'd think it would help to take into account, say, how _many_ items of a shop a user likes, rather than just the fact that they liked some item, but it doesn't – it actively hurts. The only meaningful piece of information, it turns out, is whether this person likes this thing or not.
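To sketch the binarization step (purely illustrative, with a made-up interaction matrix; not Etsy's actual code):

  # Collapse interaction counts to yes/no, then score item-item similarity.
  import numpy as np

  counts = np.array([[3, 0, 1],    # rows: users, columns: items
                     [0, 2, 4],
                     [5, 0, 2]])
  liked = (counts > 0).astype(float)   # any interaction becomes a plain "yes"

  norms = np.linalg.norm(liked, axis=0, keepdims=True)
  sim = (liked.T @ liked) / (norms.T @ norms)   # cosine similarity of items
  print(sim)

The point being that 3 vs. 5 purchases carries no useful signal here; only the zero-vs-nonzero boundary does.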


That doesn't really surprise me. A lot of people imagine that highly granular, multi-dimensional rating systems must be best, and they're often not. Of course, it depends a bit on what you're trying to distinguish, but then it may not really matter to distinguish among the 75% of things that people are genuinely happy with.


The fact that a reviewer uses 2 and 4 ratings might not be too meaningful. One reviewer might use 2s and 4s the way another uses 1s and 5s.

There may be reviewers who do not generally fit the bimodal distribution, but if they are the minority, why design your whole rating system around them? Better to optimize your system for the way most of your users behave.


Yes, scaled ratings do tend to be subjective, which in practice we normalize by building up a database of surveys over time for each category x market combination.

The reason online ratings tend to be bimodal is that the respondents are a self-selecting sample rather than a randomized one. Since product ratings are not typically compulsory, the extreme likers/dislikers tend to be the ones who fill in the rating.

When ratings are made compulsory, it drives a poor user experience, and having to fill out a survey to get something anyway introduces a bias (just fill in something to get it over with). So there's no easy answer.

This is also the reason traditional survey firms haven't really gone out of business even with the wealth of online data available from Facebook, Twitter and product reviews, though better/easier access to actual behavioral signals is certainly replacing surveys in some areas.


I have no doubt you are right about the people who rate online (you definitely see it in restaurant ratings and comment sections), but when I rate movies on Netflix I very, very rarely rate something a 1 or a 5. I would imagine that 2/3 of what I rate has been a three (average content for me) and that it takes something special to go up or down from there.

Just curious if I am an outlier with how I do my ratings.


I'm the same way.

1 - Terrible. Good Lord, who thought making this was a good idea?

2 - Poor. Would not watch again

3 - Fair. Neither for nor against. Unlikely to watch again, but I don't begrudge the time spent.

4 - Good. I enjoyed it. I would probably watch again.

5 - Exceptional. Probably in my top 10 now, and foresee enjoying watching for years to come.

3 and 4 are used a lot. 2 is used occasionally. 1 and 5 are rare.


I prefer the Goodreads rating system which compresses 1 and 2:

1 - Did not like it

2 - It was ok

3 - Liked it

4 - Really liked it

5 - It was amazing

The reason is that most books won't end up being 1 or 2, and there really isn't much difference between them. But it sucks giving 2 stars to something that wasn't bad. Perhaps emoji arranged in a non-star-like pattern would be best.


Interesting. I pretty much only rate 1s and 5s, hoping that this will influence what content is recommended to me.


I've heard this a lot. When I look at reviews anywhere there is a star rating and a text review, the only useful reviews tend to be the 2-4 star ones. 1- and 5-star reviews tend to be outliers, or to come from people who are really upset about something most people won't care about. For example, 1-star restaurant reviews from someone who waited for their order to be taken and left because it took too long. I wonder if that could be applied to weighting the ratings accordingly. I have no data to test this, unfortunately.


> They are diluting the meaning of quality and instead are opting for a saccharine "enjoyment" factor.

They are just trying to increase the number of views for a poor-quality library. This will mask how bad most of the content is and inflate their numbers.


Like many people I've had a Netflix account for a long time - I can't recall the last thing I watched on their streaming service. It has probably been four months since I've used it to watch a movie for example. Their library is increasingly complete trash outside of a few original content programs. I'd cancel it if it weren't so cheap, mostly I keep the subscription as an entertainment fall back (in the category of: well, maybe I'll watch something on Netflix, oh, nope). I suspect I'll just ditch Netflix in the near future and be done with it, as I'm pretty happy with Prime as essentially a freebie to my Amazon shopping.


Or edge cases like me will get that little boost to just give up on Netflix. I already use them less than other streaming services, and I could always just turn it on and off again.


Honestly one of the only reasons I keep it is because I have so many movies that I've rated and didn't want that time investment to go to waste. I guess now that isn't going to be a factor.


This!

My dream is to build an open-source app that gathers all these personal ratings (from Netflix to Pandora to Goodreads to restaurants) and puts them back where they belong: in the user's hands!

Unfortunately, finding the time to work on this will be tricky...

Anyone know if that's a thing already?


I've rated thousands of movies and shows on Netflix, not just what I streamed or received from Netflix on disc but anything I've seen anywhere that I can recall and came across on their site. Having rated something is also a reminder that I've seen it. I didn't want to lose all that so I downloaded the pages [0]. I'll figure out how to scrape them and put them in a better format some other time.

[0] https://www.reddit.com/r/netflix/comments/5zzc4g/batch_downl...


If you go to your ratings page and scroll down so they all load, this worked for me (paste into the console):

  // Collect each rating row into an array of objects
  ratings = [];
  jQuery('li.retableRow').each(function (i, row) { 
    ratings.push({
      // hrefs look like "/title/12345"; strip the prefix to keep the id
      id: jQuery(row).find('.title a').attr('href').replace('/title/', ''), 
      title: jQuery(row).find('.title a').text(), 
      // the number of filled personal-star elements is the rating itself
      rating: jQuery(row).find('.starbar .personal').length 
    }) 
  });
then

  JSON.stringify(ratings)

Or for CSV:

  // Build CSV lines: date, title id, quoted title, rating
  ratings = '';
  jQuery('li.retableRow').each(function (i, row) { 
    ratings += 
      jQuery(row).find('.date').text() + ',' +
      jQuery(row).find('.title a').attr('href').replace('/title/', '') + ',' +
      // escape every embedded double quote (global regex) for valid CSV
      '"' + jQuery(row).find('.title a').text().replace(/"/g, '""') + '",' +
      jQuery(row).find('.starbar .personal').length  + "\n"; 
  });
then print out the ratings variable.


That's the streaming site page which has less information and other problems (follow the link).


I pulled mine from https://www.netflix.com/MoviesYouveSeen. I did notice some that just said MOVIE, but it was just a handful. My ratings page goes back to 2005 and I'm not subscribed to the DVD plan anymore.


Why on earth would you use Netflix as a rating service? First of all their library lacks a lot of great movies, or info about the movies they have. Netflix isn't IMDB.


If you include the DVD service they have practically everything. I've been rating movies I've watched on Netflix since 2005-ish. IMDB makes sense, but I chose Netflix and now I'm stuck with them. Data silos suck.


>If you include the DVD service they have practically everything.

Though quite a bit less than they used to have. My observation is that more and more back-catalog items are no longer available.


They're definitely not replenishing stocks of some DVDs, which I suppose makes sense given the trend. I would assume that if they get a certain number of requests they probably do, though?

Or is that my wishful thinking?


Maybe? The thing is that the incentives don't really align to make this a real priority of theirs, and they've been pretty clear that shipping physical DVDs is not their long-term business model. So long as a lot of people don't start canceling their DVD service because of unavailable discs, they have no real reason to replenish.

Of course, given that they pretty much put every other rental place out of business, your only choice in a lot of cases is to buy a disc if you want to watch something that isn't available streaming.


Almost all of the ratings I did were back before they even had a streaming service and the library was great. Plus it was helpful to get good recommendations. Now it doesn't really matter, but all of those ratings are locked in.

I've looked before halfheartedly for a way to export them but now that they cut the API off it doesn't seem like there's a great way to do that.


This lock-in is a pain. A similar thing happens with music streaming where getting playlists between services is just awful.


That is one reason I've always preferred to control my music directly. Music in particular is something I can't stand to have discontinuity of playlists and collections with.


Spotify reordered my lists. It frustrates me just less than the chore of fixing them would, which is a special class of irritation.


I hear you. It's one of the few things that actually elicits a physical response from me... a kind of shiver of anxiety. I put time and effort and maybe even a little bit of myself into my music lists... it's personal!


That was definitely part of it for me too; the rating system has always been a draw. It took time to make it actually respond the way I needed it to, and now it's just... sigh


That used to be a big deal for me too. I had a little happy moment back when I rated my 1000th movie. Oh well.


What other services do you use? I pay for Prime, so I've watched some stuff on Prime Video, but there's really not much on there. Netflix is pretty much my only option for most TV shows. Worth noting I'm in Canada.


If you enjoy anime or Asian drama (Asian drama being live action, not animated), Crunchyroll has a huge catalog. It fills the void that Netflix has between two big Netflix Originals.

Edit: Giving away one of my 48h Crunchyroll premium guest passes. No credit card required. I'm not working for Crunchyroll in any way; they give out guest passes monthly to premium users. First one to use this code wins: UWWJAQTDKYZ. Note that I will receive a notification telling me your username and perhaps your email address, in case privacy is important to you.


I'm in the US here, so I'm not sure where our options differ. For me, I do (as a poster below noted) like anime, so Crunchyroll is a good choice. Hulu is good for seasonal TV stuff, and then I actually use a smartDNS to get access to UK channels. It's probably a matter of taste though... I don't like most movies, and prefer longer forms for telling stories.


>use a smartDNS

At that point, isn't just plain old piracy more convenient?


Sadly, the simple answer is "yes".

I used to use a VPN to watch US Netflix, but they started to block known VPNs and I didn't feel like spending time on workarounds. On the other hand, downloading the latest episode of some TV series takes only a few mouse clicks. For sure easier than fighting geo-restrictions.

I still pay for Netflix, but I don't really watch it much because of the tiny catalog available to me locally; I simply want to support them. Perhaps it would make me feel better if I downloaded some of their content. :)


>Sadly, the simple answer is "yes".

>I simply want to support them.

Nothing to be sad about. Also curious: why do you want to support them? Being a market leader, Netflix is in a unique position to influence some of these regressive practices. More support is clearly not helping; it's only making things worse by creating monopolies that exert even more negative influence.


The biggest problem with Netflix is that it is not able to obtain rights for a lot of movies or series, so you don't have a full catalog. Series and movies are fragmented between providers or totally unavailable.

Netflix is not much of a market leader until it contains at least 90% of the movies I want to watch, which can't happen without the leverage that user support gives them.


I do like some of their own content. I also like their general business model, and I believe (well, I hope) that the stupid geo-restrictions are the result of old-school licensing and not active business decisions made by Netflix.

But ofc, I wouldn't mind if they had some real competition; I don't want a monopoly.


>(well, I hope) that the stupid geo-restrictions are results of old-school licensing and not active business decisions made by Netflix.

Geo-restrictions have existed forever. What Netflix started was actively blocking entire ranges of IPs belonging to VPNs and commercial ISPs (cloud providers like AWS, dedicated servers like OVH, etc.). This should fall under some corollary to net neutrality, where a service cannot discriminate between ISPs. Does such a corollary exist in countries that have some form of net neutrality (US, India, Netherlands, etc.)? Can someone with legal expertise comment? From my standpoint, they can have geo-licensing, but they should stick with some consistent way of gating content, based either on billing info or on IP. Blocking out vast portions of the internet because it doesn't suit them should clearly be a violation of laws like net neutrality.


There is a proposal in the EU that would prevent "unjustified geo-blocking" but I'm not holding my breath waiting for it to become law.


I've been so long out of that game, I wouldn't know where to begin, so not for me, no.


I have to say, I quit Netflix a few years ago after a long membership, due to security concerns (it was a nightmare at the time). Recently I started my membership up again, and I'm really disappointed with their library. Every movie I've wanted to watch has turned out to not be available for streaming. They've got some good shows, but I'm getting tired of super heroes and I don't always want to watch television, sometimes I want movies.

I'm on the brink of cancelling my membership as it is; this certainly won't help, though it's obviously the proverbial straw rather than a big deal in its own right.


Indeed. Note that Netflix has never offered a filter by rating, or the ability to hide already-watched or disliked content.


When I watch Netflix, I use the stars as a measure of whether I would enjoy something. I don't think most people use stars as a measure of "quality", because when they browse movies, they are looking for something they will enjoy, not something high quality that they might not enjoy.


This is precisely the difficulty for me. When I think of thumbs up / down metrics, I think in terms of "would I endorse this?"

A movie like "Suicide Squad" was (to me at least) highly enjoyable, though I thought it objectively had many problems with plot and delivery. I would probably give it 3 stars, which is relatively neutral for me (or leave it altogether unrated). If it were thumbs up/down, I would probably have to leave it unrated completely.

Maybe that's an improvement, but the more discussion I see around this, the more I realize the futility of single-axis metrics for things like this, unless you're asking a single well-defined question. "Did you enjoy this movie?" is a much different question from "Is this a good movie?", and I don't really know which one I've been answering on Netflix all these years.


I think you are an anomalous user. I'm not sure why you're OK with rating 1-5 stars but not a thumbs up or down. I think you're just upset because they are changing it.


I like giving weight to my vote.

3 stars = "not a bad movie, but wouldn't recommend"

4 stars = "decent movie, would recommend"

5 stars = "great movie, would definitely recommend"

A binary choice doesn't have any weight to it.


I don't think there's a need for weight since there are many reviewers. The weight is the percentage of people who thumbed it up.


How does weight have anything to do with the number of reviewers? Those are completely unrelated metrics.


The current suggested rating is influenced by your history of rating, with weight given to how highly you rated it. There are shows that are suggested at 4 stars for my wife but at 2 stars for me.


This. And based on my rating history, Netflix has been pretty good at guessing whether something will be 3+ stars. If they think it's 3 stars, I consider watching. If they think it's 4+, I tend to really enjoy the suggestion. Though nearly every Netflix original is suggested at 5 stars, lol.


I agree. I even want a couple more increments between 2, 3, and 4. There are a lot of movies I see that I think are better or worse than a 3 rating but don't justify a 2 or a 4. I feel there's a lot of room for nuance around the middle area.


Let's say a movie previously would have shown 3 out of 5 stars. Where does that fit in with the binary thumbs up/down? I think they are just doing this because recently some actors' movies are being attacked with 1-star ratings for the actors being politically vocal.


> I think you're just upset because they are changing.

This is actually rude of you, bordering on ridicule. He explained very clearly why he is dissatisfied.


I've had this argument with my (tending towards elderly) parents while introducing them to Netflix. They don't understand rating for enjoyment, and keep rating for "quality". They'll watch a show, absolutely hate it, but give it a 4 or 5 star rating because "it was well produced and the camera work was good, but it isn't the sort of thing I personally like to watch".

I'm worried about the thumbs up/down, as Netflix has increasingly recommended shows I do not like this year, and I don't think the thumbs solve the "rating for perceived quality" issue. I've been using Movielens for recommendations a lot instead, and it seems more accurate for me... and it has enough stats on their site to show that most rating is not binary: much closer to a normal curve for average movies, or a linear graph for really good movies. (It lets you see the distribution curve on individual movies, and other metrics.)


Interesting. I always rated for enjoyment.

I thought that Netflix made it pretty clear that the ratings were used to choose which content to show me, so I should only rate based on what I'd actually like to see.


I think a lot of people look at the stars as a measure of enjoyment when trying to find something to watch, but when actually rating something, as a measure of quality.


I use the stars to signal quality as a means of determining if I'd enjoy it. 5 stars? This is likely to be high-quality and enjoyable. 3 stars? This is going to be lower quality and I may not enjoy it. 1 star? This is likely shit quality, and I'll regret playing—or it's a cult classic and all the haters don't get it.


They're optimizing the wrong metric.

They should be optimizing "number of people who pay for Netflix". Instead they are optimizing "number of hours spent watching Netflix".

Those two metrics are correlated, but definitely not fully.

If I spend hundreds of hours a month watching meaningless action movies and TV on Netflix, I can cancel my subscription and save myself hundreds of hours and $10. I've got lots of other ways of filling my time with endless pablum.

If I spend 2 hours watching a thought-provoking documentary, I say to myself "yup, that justified spending $10 on Netflix this month."


Accounting for your average person, who do you estimate is more likely to cancel: the person watching hundreds of hours of meaningless action movies and TV, or the person subscribing but rarely/never watching anything? By average I mean someone not prone to suddenly realizing they want, or feel sufficiently motivated, to spend those hundreds of hours on something else, or to posting on Hacker News.

Speaking anecdotally, the one (temporary) time I canceled was when I found myself in the paying-but-not-using category.


How do you know what metric Netflix are optimising?


They tell you so in TFA.

> Users would rate documentaries with 5 stars, and silly movies with just 3 stars, but still watch silly movies more often than those high-rated documentaries.


It's interesting, because the more someone uses Netflix, the less money Netflix makes.


Not necessarily. I strongly suspect (but can't prove) that they sell their rating data back to the studios.


> The stars are not about "enjoyment" factor, they are about perceived quality.

For you. But are you representative of the whole or even a majority? I personally use it to reflect whether I enjoyed it or considered it a waste of time.

I give some of my favorites - like Hackers - 5 stars but would never consider it a "quality" movie. It's awful and silly but that's the fun of it.


Netflix have said for years that the star ratings they gather are not actually very good indicators for their recommendation engines compared to implicit signals such as - 'did the user watch the show / film to the end?' If they still believe this, it seems fairly irrelevant what they use.


> implicit signals such as - 'did the user watch the show / film to the end?'

I hope their version of that signal is better than what they're presenting in the UI. I have a few movies Netflix shows me as not finished, because I closed them just as the ending credits started.


Also, it seems incredibly delicate.

Upon finishing the final season of Dexter, I clicked on an episode in the first season to look up something. I jumped to the 18 minute mark and watched about 10 seconds and closed the tab.

Now Netflix asks me every time if I want to continue watching Dexter from where it now thinks I left off in season one even though I spent a whole year watching all of the seasons.

So I have the same concern. If they have a more sophisticated model, it certainly doesn't get reflected in the UI...


The UI... They can't even make an 'f' shortcut key in the Win10 app (app, not browser) to go fullscreen.

And they force 21:9 movies to be played back inside the 16:9 portion of a 21:9 ultrawide monitor, with massive black borders as a result where there should be none.

I'm seriously unimpressed with Netflix as a company, for what they do with all their resources. Sometimes the MVP just isn't good enough.


Shouldn't this quality rating be within a genre? Not all documentaries are 5-star, and not all silly movies are 3-star.

How do you compare ice-cream with cheesecake?


Yeah, if they went with up/downs but allowed genre-level up/downs, I think this would correct some anxiety about this change. "Yes, I still want to watch anime. No, the last 4 shows you recommended to me were not good."


I don't work in the recommendations side of the house, but the way I understand things is that nowadays your star ratings (and now up/down ratings) are primarily to learn about you, what you liked and didn't like and therefore what you might like to see in the future. They're not really about creating a Rotten Tomatoes or IMDB style database of trying to rank all movies on some sort of objective "regardless of what you like, this is a high quality movie" sort of scale.


> The stars are not about "enjoyment" factor, they are about perceived quality

I ignore ratings because of that. They don't mean anything, and I'm guilty of it myself.

On Goodreads, I have given what is deemed a trashy piece of writing 5 stars. At the same time, I have given Dune 1 star because the story was garbage.

So what is this 5-star rating? Some people say it is the experience, some say it is the writing style, others say it is the storytelling, or the plot.

Let's also face it: when it comes to films, the newspapers have their own agendas and personal commitments to other agencies. My film could be absolute garbage, but since I dined with the editor a few times, it gets 4/5 stars; and then it gets 4/5 stars because the rest of the population is told how to feel.

I would even say that this whole system of rating based on stars or likes could simply be removed. If a film is good, you will know about it. If a film has been watched 1000000 more times this weekend than another, base it on that if anything.


When I rate things, it's usually an "overall" rating, rather than weighing one dimension disproportionately. Which is why I would almost never grant a 5-star rating to something. But I'll dish out 2s, 3s and some 4s regularly.


That is only you; I can't use that information. You might be taking into consideration "special effects were a 3 on their own", whereas my mum would never even consider that as something which influences the rating.


I understand that. It's not perfect, and I don't think the assumption is that it is; it's more that overall, people as a group will smooth out those rough edges. In other words, the majority will compensate for the minority who use the rating system differently.

For all the faults of the old system, the new one is even worse. Like Pandora, I'll probably just never rate things or mostly just give them a down vote if they don't crack a threshold of goodness.


Which is fine, because the recommendation engine should be using your history, not someone else's. They gave it a 5, you gave it a 1. They will see recommendations for similar movies and you won't.


Your thesis is basically that ratings mean absolutely nothing, and that how good something is should be based solely on how popular it is?


The problem as I understand it is that the "rating space" is multidimensional (a piece can be funny with great plot but bad writing compensated for by a great lead and awesome special effects although the set design was inconsistent) but all of this has to be projected into a 1-dimensional system.

And the rub is that not everybody uses the same projection, so without knowing another person's projection (the weight they give to each sub-component of the overall rating), the utility of their rating for you is completely unknown.


A much more succinct way of making the point. Thank you.


This is a very clarifying way to put it, thank you.


Yes, the whole thing is meaningless, but at least views offer some glimpse. I would have given Barb Wire 6/5 as a teenager with a crush on Pamela. How can that relate to a rating of the film that you could use?


I agree with you. And hat tip for the nice analogy. Perhaps they should implement a system analogous to Facebook's Reactions (e.g. "disliked", "thought-provoking", "easy watching", etc.) as a way to differentiate between cheap/easy and high-quality releases.


But there's no difference in price between going to see a silly movie or a "good" movie.


If you are sophisticated enough to appreciate "foreign documentaries", you should have little trouble discovering them by another means than Netflix rating algorithm. The latter is for the laziest, couch-potatoest audience, and it makes sense to optimize for that.


Some friends of mine used to have a movie rating site which asked two questions:

1) How good was the movie? [1-5]

2) How likely are you to watch it again? [1-5]

I think it was too many questions to ask, but trying to separate out the enjoyment from the quality struck me as a good idea.


But Netflix doesn't care about the quality of a movie. They care about whether you would watch it or not.


I feel like star ratings, if just given in general, are pretty useless for anything that has multiple qualities. For example, maybe I like the cinematography of a movie but I think the acting is shit, or maybe I think the taste of the beer is nice but it's too acidic or sour and wouldn't work with specific foods because it's overpowering.

Airbnb does this pretty much spot on, in that they ask for 1-5 stars for five categories, rather than just "did you enjoy this Airbnb?" I also think that Untappd does this wrong, because in general beers can't be described in just a 1-5 star rating without deeper explanation.

I think that moving from a pure star rating to a thumbs up and down rating is better overall, if only because it makes me, as a watcher, not have to think as much and therefore give a rating where I might not have before. If I want to go more in depth, I can explain more too.


> I feel like star ratings, if just given in general, are pretty useless for anything that has multiple qualities.

This is only true when those star ratings are taken out of context. Individual reviewers can have completely coherent star ratings that are very useful for making determinations.

The golden age of Netflix discoverability, for me, was back when they added social networking features. I connected my account with many friends and family members and really got a sense of everyone's taste in movies. When it came time to discover new movies/TV to watch, I'd open up the ratings history of someone whose tastes I felt aligned with the mood I was in. Some of my friends would only rate high-quality content with 5 stars, others would typically rate mass-market, mindless entertainment with 5 stars, and others tended to include a lot more 5-star reviews within a certain genre. Over time, I got a feel for that and could use it to my advantage.

But when you remove that individual context and only aggregate 5 star reviews across a massive population who all view the 5-star scale in their own way, you get that uselessness that you talked about. Some people may reserve 5-star ratings for truly extraordinary content, but Netflix doesn't weight those ratings any more highly than the 5-star ratings of someone who rates nearly everything as 5 stars. This makes the ratings that Netflix displays meaningless. But that's only because Netflix's current methodology makes them meaningless, not because they're inherently meaningless.


> Netflix doesn't weight those ratings any more highly than the 5-star ratings of someone who rates nearly everything as 5 stars.

Is that fact or speculation? It's a common recommendation technique to find the mean rating for a person and use that to normalize: normalized score = (actual_score - mean_score + 2.5) for a 5-point scale, for example.

I have no insight into whether or not Netflix does something similar, but I'd be surprised if they don't.
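A quick sketch of that normalization (hypothetical users and ratings):

  # Shift each user's ratings so their personal mean sits at the scale midpoint.
  import numpy as np

  generous = np.array([5, 5, 4, 5])   # rates almost everything high
  strict   = np.array([3, 2, 1, 3])   # has a lower personal baseline

  def normalize(scores):
      return scores - scores.mean() + 2.5

  print(normalize(generous))   # a 5 from this user becomes 2.75
  print(normalize(strict))     # a 3 from this user becomes 3.25

So a strict user's 3 can end up counting for more than a generous user's 5.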


Doesn't normalisation assume that a person who rates everything five-star has low standards? It may just be that that person rates something only if they're really (un)happy.

Imagine two people watched 100 movies and assessed them the same way: say 20 are five-star, 20 are four-star, 20 are three-star, and so on. The first person rates all 100 movies, but the second has a habit of writing a review only if he considers a movie excellent. Then your algorithm will wrongly discount the second person's ratings.

Is this true in the real world? Have you found different people to have different habits, like rating everything vs rating only good movies vs rating only bad movies?


First, few people will put in the effort to understand their friends' psychology and habits in order to use Netflix effectively. We shouldn't optimise for such users.

Second, I don't know whether individual ratings should be normalised as you say, but the final rating that's displayed should be. For example, the app stores should have 20% of apps rated five stars, 20% four stars, and so on. This will make the ratings more objective and easy to understand. If the average rating is 4, then an app rated 4 is merely average, which is counter-intuitive.

(Perhaps normalised within a top-level category, like Productivity and Games)
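A sketch of that kind of rank normalization (my own toy illustration, not any store's actual method):

  # Rank items by average rating, then hand out stars by quintile:
  # bottom 20% -> 1 star, ..., top 20% -> 5 stars.
  import numpy as np

  avg_ratings = np.array([4.8, 4.1, 4.5, 3.9, 4.6, 4.0, 4.9, 4.2, 4.3, 4.7])
  ranks = avg_ratings.argsort().argsort()        # 0 = lowest ... n-1 = highest
  displayed = ranks * 5 // len(avg_ratings) + 1  # quintile index -> 1..5 stars
  print(displayed)   # the 3.9 app shows 1 star; the 4.9 app shows 5

This makes a displayed 3 genuinely mean "average", whatever the raw averages are.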


I like to use the French Laundry example for how meaningless the star ratings are. Was it the best meal of my life? Yes. Was having the best meal of my life worth $1000? Not even close. So, did I have a 5-star meal, or did I have a one-star money waste? And are my preferences at all meaningful to a random person who has no knowledge of my finances or desire for good tasting food?


I think that binary ratings are the purest form of rating. But you have to ask the right question. "Did you enjoy this Airbnb?" is not the right question, because you end up wanting to give a nuanced answer. But "Would you recommend this Airbnb to a friend?" is, I think, the right binary question because it forces you to distill all of your likes and dislikes into a single, unambiguous decision.

The nuance comes when you aggregate many people's binary decisions. A value near 1 tells you many people felt the good way outweighed the bad; a value near 0 tells you the opposite; and a value near 0.5 tells you that people are split. I think that's far more useful than trying to ask many individuals to provide you with a granular score.


Incidentally, Steam ratings are also simple up/down, with total and sliding window 30 days aggregate, and an Amazon-style helpful/unhelpful/funny meta-rating.


Steam ratings have their own problems. Like the creators changing something minor and getting thousands of protest negative reviews from upset people. Which doesn't reflect the quality of the game at all. But does discourage new people from trying the game.

My favorite example is Fallout 4. There are tons of negative reviews from people who put 600+ hours into the game. How the hell do you put 25 days of your life into a game you don't like? They claim it "got boring". Not many games will not be boring after 600 hours...


That may be true from your point of view as a rater, but from mine as someone deciding whether to watch a movie, I want you to condense your thoughts down to a yes or no by assigning an appropriate weight to each factor. Was the acting so bad that it ruined the movie despite the good cinematography?

For example, I found Mirror Mask enjoyable and would recommend it. They dreamed up and brought to life a fantastical world full of magic and wonder. The visuals are excellent. The story is mediocre, so you should watch it with the right expectations, but watch it nevertheless.

The ability to distill something complex and multi-dimensional to a clear conclusion ("enjoyable and I recommend it" in the above paragraph) is a sign of clarity of thinking and expression. If you can't do that, I'll go by other people's ratings.

Remember that the goal isn't to accurately capture one person's opinion, but to help potential viewers decide whether to watch it.


Certainly the data you get will be of better quality that way; I just don't know that you'll get a huge quantity of it. Seems like, for better or worse, Netflix is going for a higher quantity of ratings. (As well as the possible ulterior motive mentioned elsewhere in this thread, of making lame movies look not-so-bad - like helping out the dullards in class by grading pass/fail.)

An exquisitely detailed system like Airbnb's costs the user more time & hassle. Which means, assuming you make it voluntary, that a lot of them skip it. Meanwhile a simple/easy system like this, turns a scalpel into a sledgehammer somewhat, but you get a lot of them. So there's a balance to be struck for your site's particular userbase.


I agree, that's a good point. I think it's a different cost/value proposition with Airbnb vs. Netflix: with Airbnb you paid for a sometimes expensive service, and once the service is rendered you can review it, whereas with Netflix, while you are paying for the service, you're not paying directly for a movie/show. So with Airbnb, since you invested a bit of money, you feel like you should take the time and review, but with Netflix, if it's not really simple you may just skip the review.


Is this just part of the trend to dumb everything down to a polarising dichotomy? Because the trend to stop people from having a more complex and rational opinion than just picking sides doesn't seem to exactly improve discourse or debate, or inform and educate...

Wouldn't it be more beneficial to have fewer people give a more detailed review to describe what they like and dislike about something? What use is "good" or "meh" compared to for example "I rate this product low because the shoes are very narrow for my normal feet" so you can actual relate it to what you actually are looking for? One person's "meh" could be another's "good"...

From the article:

"This makes sense – giving a five-star rating takes some thought, especially for something like a movie or TV show.

A binary “yes or no” option is much easier for viewers to commit to, [...]


I think one problem is that people have vastly different opinions on what 3 stars mean. For some it's quite bad, for some it's in the middle.

A binary option actually has three states. Did it prompt me to rate it as bad, did it prompt me to rate it as good, or was it bland enough that I don't want to sway its rating at all?

If I rate a movie on IMDB as a 5-6, and maybe even a 7, whether I'd recommend it to someone "depends". In that case I might as well not rate it at all.

Ratings are inherently tricky, because we all like different things. I don't think you lose anything by going to a like/dislike system. Recommendations should be based on aggregate maps and matching you with similar viewers that have a certain overlap in ratings. That's made easier with binary choices.

IMDB ratings are only partially informative to me anyways.


From my perspective, that shouldn't be a problem. Ultimately (as a user) the point of the rating is for me to get stuff that I like -- so it's all relative to me, regardless of individual rating differences.

Now, I realize that's almost certainly not the level of individualization offered by their ranking algorithm (I'm convinced there's way more of a "groupthink" dynamic), but that's partly my point: if your ratings and recommendation system isn't capable of supporting individual differences, then your algorithm is broken.

If you have a complex problem (personalized recommendations absolutely are complicated) then you shouldn't fix a broken algorithm (Netflix recommendations are, at least for me, a bit meh) by simplifying the inputs (switching from stars to up/down). You should fix the broken algorithm, by creating a better algorithm.

I realize it isn't that simple, particularly for a publicly-traded company whose stock price would absolutely be influenced by a decision to scrap part of their core technology, but from my perspective as a user, this is going to hurt my experience with the product.


I think it's more a reaction to a usage pattern of their star system. YouTube abandoned their five star rating system citing that most people voted only 1 or 5 stars with almost nothing in between. [1] I'd guess a vast amount of Netflix users do the same.

[1] https://youtube.googleblog.com/2009/09/five-stars-dominate-r...


YouTube is partly to blame for this. Since YouTube reported the average stars, if you saw a 4-star video and thought that it should be 3-star, then the optimal strategy was to vote 1-star to pull it down as fast as possible. If instead YouTube had reported the majority option (and publicized that fact) then it would have incentivized people to vote what they truly think.


Your suggestion has its own problems. Imagine two videos, each with a majority rating of three stars: say 60% of raters gave each video three stars. But one video has the remaining 40% at one star, and the other has the remaining 40% at five stars. Your algorithm would ignore this difference, wrongly showing that both videos are equally good.


From the linked Variety article, "However, over time, Netflix realized that explicit star ratings were less relevant than other signals. Users would rate documentaries with 5 stars, and silly movies with just 3 stars, but still watch silly movies more often than those high-rated documentaries."

But the signal isn't just for Netflix; it is also for users, who might sometimes be in the mood for something silly and sometimes in the mood for something good. Also, people might rather get more suggestions of good movies even if they are more likely to watch bad ones. (Of course, people might also just overrate documentaries.)


I'll admit that I have had Netflix recommend a movie that it thought I would rate at one star, and I have watched that movie and then rated it one star. Not only did their current system predict that I would watch that movie, it predicted my relative opinion of the movie's quality. And yet I watched the movie, and thus felt like their recommendation system served its purpose. I have to say that going in expecting a one-star rating allowed me to set expectations for the film.

If they remove the axis of quality and merely opt for "will you watch this: yes/no", it will make it harder to select "good popcorn trash" versus "high-quality, high-enjoyment" to fit my mood.


But I want Netflix to surface good documentaries as well as good silly movies. I'm more likely to watch a silly movie, so with this change my queue is more likely to be filled with silly movies, making me more likely to unsubscribe because it appears that Netflix only has silly movies. I watch more silly movies than good documentaries, but I pay for Netflix mostly for the good documentaries. I can get mindless pablum elsewhere easily.


I'd be pretty happy if Netflix managed to actually suggest things there's a vague chance I'll be interested in, rather than whatever they recently released/bought and desperately want viewers for...


It's a good thing Netflix created original programming, because I would have been gone from Netflix long ago if they didn't have /something/ nobody else had. Their non-original selection has gone to total shit. Even Hulu is competitive with Netflix's selection, excluding originals.


To me it sounds like they should instead use categories as hints for how to weight different users' ratings in different categories. I definitely don't want my feed to be filled with silly comedies all of a sudden.


Thumbs up or thumbs down is just as good of a signal for the users. Everybody has some innate star rating threshold where they won't watch a series or film. If it's below 3.5 stars I won't even give it a chance. Thumbs down accomplishes the same thing.

I'm going to rely on Netflix's recommendation engine to put things in front of me that I am going to enjoy.


The problem with that approach is that it assumes that liking is normally distributed.

I'm reminded of Jerry Garcia's famous quote about the Grateful Dead: "We're like licorice. A lot of people don't like licorice - but the people who like licorice REALLY like licorice".

A 3.5 could either mean 50% of users give it a 3, and 50% give it a 4, or it could mean that, say, 30% of users give it a 5 and 70% give it a 3 or less. How do you know you aren't part of the 30%? Great art is often polarizing.


Their recommendation system is useless anyway, to the point that I've started to see five-starred content as a negative indicator.

Why not remove recommendations altogether? They clearly don't have a library to support it.


I don't get it. Don't they already have the signal available to them of "did the user watch all/most of this video, or did they abandon it early"? That seems like it does most of what they want, and thumbs up/down will add only very little information on top of it.
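Something as crude as this would already go a long way (the thresholds are my own guesses, purely illustrative, not anything Netflix has published):

  # Turn watch-completion into an implicit up/down/no-signal rating.
  def implicit_signal(watched_seconds, runtime_seconds):
      completion = watched_seconds / runtime_seconds
      if completion >= 0.9:    # finished (give or take credits): implicit up
          return 1
      if completion <= 0.25:   # bailed early: implicit down
          return -1
      return 0                 # ambiguous: no signal

  print(implicit_signal(5200, 5400))   # closed during credits -> still a 1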


If you have many reviewers, you don't need to get more information than a thumbs up or down from each person. The fraction of people who thumbed it up is a good indicator.

Star ratings can be too much detail anyway: if you're comparing two shows, and one has a higher fraction of five-star ratings, but the other has a higher fraction of ratings that are four or above, which is better? The extra detail just causes confusion.

If you want more detail from each person, you can ask specific questions with a yes/no answer, like, "Were parts of it boring?" or "Was it violent?" That's probably better than star ratings.
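The confusion is easy to demonstrate with toy numbers: two distributions can order differently depending on which summary you pick.

  # Fractions of 1..5-star votes for two hypothetical shows.
  import numpy as np

  show_a = np.array([0.05, 0.05, 0.10, 0.20, 0.60])  # more 5-star votes
  show_b = np.array([0.00, 0.00, 0.10, 0.55, 0.35])  # more 4-and-above votes

  stars = np.arange(1, 6)
  print((show_a * stars).sum(), (show_b * stars).sum())  # means: 4.25, 4.25
  print(show_a[4], show_b[4])                # 5-star fraction: A wins, 0.60 vs 0.35
  print(show_a[3:].sum(), show_b[3:].sum())  # 4+ fraction: B wins, 0.80 vs 0.90

Same mean, opposite winners on the two fractions.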


The goal isn't to broadly determine which movies are good and bad; the goal is to make accurate predictions about which titles a particular person will enjoy.

One good way to accomplish this is to pool users into cohorts of similar tastes, then suggest movies enjoyed by their cohort but not yet watched by a given user.

The question is whether doubling the frequency of rating a title is more valuable than a slightly more nuanced understanding of what each person likes. 'More valuable' meaning they can build you a better cohort from which to make title suggestions.


Everything you've said makes sense, but I don't see it being contradictory to what I said.

If you decide you want a more nuanced understanding of what a given reviewer likes, I think it's better to evaluate different dimensions (Was it funny? Was it violent? etc) than shades of grey* of the same one.

* pun not intended :)


I do find IMDB's ratings breakdown fascinating to explore, though.

For example, you can look at the 50 Shades films or A Dog's Purpose or (especially) Gunday and have a pretty good idea that lots of people are giving them bad ratings without having seen the films. (Their weighted average ratings are supposed to combat that, but they don't seem to be doing a very good job.)


Actually, Netflix is in a better position to filter between people who just rated the item and people who rated it AFTER viewing it (for a minimum duration). The latter class of people would have more weight attached to their votes.


That's true, but it doesn't sound as fun to poke around in.

(Obviously, "collect a dataset that philh finds interesting" should not be a business goal of Netflix. I'm not suggesting this is a reason to keep the star ratings. It's just a thing that I like about having ratings more detailed than yes-no.)


Interesting point. Would polarised reviews make you less or more likely to watch a film (or download an app or whatever)? In other words, two items with the same average, but different distributions.


That, and Netflix has far more data that indicates likability to a far greater degree, such as how many people finished watching the show, over what period of time they finished it, after which episode people are more likely to get hooked and finish the series or binge on it, etc.


>> Star ratings can be too much detail and cause confusion

The iOS App Store is a good example of this. An unbelievable number of people, for some reason, think 1 star = great and 5 stars = bad.


Haha yes! Where the hell do these people learn that??

The other thing that annoys me with ratings in general, specifically places like amazon, is when people pan a perfectly good item because of some third party factor.

"Item was great, but it came in a beat up box from the post office, I'm giving it one star."

WTH does someone drop-kicking your box have to do with the product?


The other thing you see in ratings: "I've patronized this business every day for 5 years and always loved it. Then the service was kinda rude the other day. 1 star. Can't believe they stay in business."


I have to think that some of those reviews are just trolls.

Bad reviewers will always be part of the system. You have to expect it. Netflix has tons of reviews that go "I turned this off before the opening credit sequence finished, worst movie ever!!!! -- 1 star".


IIRC there was a blog post on HN a while back that complained about it, and it seemed to be people going with 1 star so that their review was potentially more visible.


Very true. There could be socio-cultural reasons as well. If a movie hurts the sentiments of a particular section of society, you can expect 1-star ratings even from people who weren't going to watch the movie anyway.


Since the early days of Yelp I learned to read restaurant star ratings:

3.5: Good place, check it out

4.0: Great place, impress your friends

< 3.3: Double check your health insurance first

> 4.2: Fake


Haha totally agree. I've often wished they would put the actual decimal rating (4.12, 3.56 etc.) because we have no idea what their rounding strategy is. Does 3.99 become 4? How about 3.75? 3.51?


I've been in favor of up/down for years over x out of 5 or x out of 10. The core reasoning is: it captures your overall emotional response to something very succinctly. Sure you might hate X more than Y or like A more than you like B, but you'll return, revisit, recommend all the things you like and not do so for the things you don't like. When you can rate out of 5 or 10 you get thrown into a sort of tizzy trying to rationalize a system of what is 1 star, 2 stars, etc. Do you subtract a star because you think something is overpriced even though you like it? Do you subtract a star because it's hard to get it (say slow delivery) even though you like it? Basically you want to start to categorize every facet of a product/service which becomes complicated quickly. Thumbs up/down avoids all that and I think over many hundreds or thousands of reviews the ratio of up to down will provide a good probability indicator of liking something or not.


This benefits those that want to data mine such data, not those that want to meticulously catalog their interests. This is one of my biggest gripes with music services such as Google Play. Thumbs up/down ratings are great for Google, but suck for me, the listener, as I wish to construct a detailed, per-song breakdown of my musical tastes.


I don't believe up/down should replace written reviews, but neither does an x out of 5 or 10 system. Written reviews are the place for that in depth analysis. That said I'm not familiar with Google Play and don't know if they have written reviews or not.


If they captured, and displayed to the user, how many users watched it vs. how many gave it ups and downs, it might give users a better indication.


That's a pretty interesting idea. It would indicate a sort of apathy rating. Not that a product or service with a high apathy rating is bad - might actually be good for certain things.


It's about damn time.

5-star rating systems are broken. A 1/2/3-star rating is effectively a dislike, and a 4/5-star rating is effectively a like. In the big-data sense, deviation from 1-star-for-dislike and 5-star-for-like is statistically meaningless. (And this is universal; star ratings on Amazon products and Yelp locations have the exact same distribution.)

The interesting part of this is what Netflix gains from the change, since their recommendation algorithms will become less granular. Maybe they came to the same conclusion?


The problem is that that is just your reading of a 5-star system. Everybody approaches them differently. Some people give everything they like 5 stars, everything they kind of like 4 stars, and just don't vote on anything else. Some people just do 5 for like and 1 for dislike. The 10-point system on IMDB is even worse.


But that's the problem with 5-star systems - everyone uses them differently. Like/dislike leaves little room for ambiguity.


That's what I meant.


No: 1 is a bad movie, 2.5 is an average movie (not bad, not really good), 5 is a masterpiece seen once every few years.

It's sad that teenagers who think life is black and white are the ones deciding it for others.

Luckily there is TMDb, where I have my watchlist and my ratings imported from IMDb, and can discuss each individual movie.


> 2.5 is average movie

That's the other problem: since the breakdowns of ratings are skewed, the average movie (the arithmetic mean) tends to have a rating of ~4 stars. Another reason to get rid of the system.


> star ratings on Amazon Products and Yelp Locations have the exact same distribution

Source?



I actually thought Randy Farmer touched on this in his Google talk on "Building Web Reputation Systems" - but maybe I saw it somewhere else, years ago. Like this:

http://www.lifewithalacrity.com/2006/08/using_5star_rat.html

At any rate I thought it was common knowledge that 5-point ratings have issues - especially as a "social rating system"?


Unfortunately, as other comments on this HN submission demonstrate, the 5-star system is sacrosanct and any change, even one made a decade later, is bad.


I think a lot of people are reacting here as if Netflix were moving from the star-and-review system being the only signal they have to the up/down system being the only signal they have, and are thus confused as to why Netflix would throw away the vast majority of their rating ability.

But that's not even close to true. In addition to it technically being up/down/no rating, we've got how long people watch the show for, how the shows they watch form a pattern, how all the patterns of what people watch show global preference patterns, whether they rewatch a show, when they watch what sort of shows, what individual scenes are rewatched vs. skipped... Netflix is swimming in a sea of preference data, not sitting here trying to figure out "Gosh, um, if the user likes this movie 3 vs. 4 stars, uh... what does that mean?"

It makes perfect sense to me to optimize this one-data-stream-among-many to increase user participation and get more bits of information from more people engaging with the simpler system, rather than trying to squeeze bits out of the few people willing to use the star system and the even fewer willing to write useful reviews.

It isn't really even as shocking as it may seem at first. The star system has 6 states: "no rating", 1, 2, 3, 4, and 5 stars. That's about 2.6 bits, with some simplifying assumptions [1]. The thumb system has up, down, and no rating; that's about 1.6 bits [1]. To make up for the bits, you need only see a ~63% increase in participation over the current star system... think that's going to happen?

[1]: The simplifying assumption is that all outcomes are equally likely, but that's not true. I don't have the numbers to run a more complete information-theory analysis, but it's not hard to imagine the "no star rating" case is so common that it produces such a small fractional bit that a higher-participation-rate yea/nay/no rating (if you put this UI in people's faces, "no rating" becomes much more meaningful, too) straight-up produces more bits of information on average, and is thus simply an improvement even before considering the superior UI experience. I rather suspect this is the case; some very sensible assumptions would suggest it, but I lack the ability to prove it. The "assume all outcomes are equally likely" case is at least a concrete one I can discuss.
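For what it's worth, the arithmetic behind those figures, under the same equal-likelihood assumption:

  # Bits per rating event if every outcome were equally likely.
  import math

  star_bits  = math.log2(6)   # {none, 1..5 stars}  ~= 2.585
  thumb_bits = math.log2(3)   # {none, up, down}    ~= 1.585
  print(star_bits / thumb_bits - 1)   # ~0.63, i.e. ~63% more ratings needed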


> To make up for the bits, you need only see a ~63% increase in participation over the current star system... think that's going to happen?

For some people, such as myself, it will decrease participation (making "no rating" indistinguishable from "it was okay, but I don't love it and don't hate it, so I wouldn't know whether to upvote or downvote"). A 63% increase is also quite a significant change. I'd be astonished if they got that much more engagement out of it.

There's also quite a difference between having the data for their internal algorithms and what people want to see. Just like Facebook eventually added alternatives for just the "Like" button.

I don't mind having a percentage displayed rather than a star rating (I automatically convert everything in my mind anyway, even though the star ratings are more easily visible at a glance when scrolling through a list), but I mourn losing a lot of expressiveness in the ability to record what I really thought of the video that I watched.


Yes, but the ratings aren't just for Netflix to use, they're also interesting information for users.

This move seems very inward-focused: Netflix is thinking about the information it wants to get from users, but I wonder if they surveyed people about how much they want to see percentage-likes instead of stars.


I'm not looking forward to a world where I have to like "Boyhood" and "Pirates of the Caribbean" the same amount. What a joke.

Of course, Netflix solves this by removing the quality content....


Does Netflix offer star histogram charts the way Amazon does? I don't recall seeing them. If not, there's no difference to a user between a 6.8/10 generated by thumbs and 6.8/10 generated by stars. They are already synthesized numbers, not just an average across all users or anything.


> The streaming service said it had been testing thumbs up and down ratings “with hundreds of thousands of members” in 2016 – and it led to 200% more ratings being given.

I don't understand why engagement is the right metric here. If someone isn't sure how they feel about a movie, why is it a benefit to have them spew their half-formed thoughts into a like/dislike rating?


>If someone isn't sure how they feel about a movie, why is it a benefit to have them spew their half-formed thoughts into a like/dislike rating?

That's pretty harsh.

I've watched plenty of movies that I enjoyed and would give a thumbs up, but I'm not sure whether they should be 3- or 4-star movies. There are movies I love but consider too flawed to give 5 stars. And I find little difference between 1 and 2 stars other than the level of regret for wasting my time.

Overall I rarely bother rating things on Netflix because it just doesn't feel worth it. So they get nothing out of me, whereas with a thumbs up or down I am more likely to chime in.


Thumbs up/down also misses an important metric, for me: the "meh" factor. Maybe something sucks, maybe it's great, but what if it's like...I have nothing to do, I guess I'll watch this show.

You could say "well that's the same as not rating it!" but it's not. I'd like there to be a clear distinction between "I abstain from voting" and "I thought this content neither vomit-inducing nor life-changing."


Some reaction from

many has more value than

thoughtful one from few


Did ... did you just answer with a Haiku?


Here's the benefit - getting a clearer picture of your tastes, which allows them to make you better recommendations. Their recommendation engine would be based on finding users similar to you and telling you what they liked. More ratings is better than nuanced ratings for this engine because you're more likely to find overlap between users.

The goal isn't making a database of the "best" and "worst" movies and TV shows. We already have the International Movie Database for that. This is just about finding out what you like, and finding other stuff like it.
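As a toy sketch of that idea (made-up users and titles, nothing like the scale or the actual algorithm Netflix runs):

    # Toy user-based collaborative filtering on binary "liked" data.
    likes = {
        "alice": {"Stranger Things", "Dark", "The OA"},
        "bob":   {"Stranger Things", "Dark", "Narcos"},
        "carol": {"Chef's Table", "Cooked"},
    }

    def jaccard(a, b):
        return len(a & b) / len(a | b) if a | b else 0.0

    def recommend(user):
        # find the most similar other user, suggest what they liked
        best = max((o for o in likes if o != user),
                   key=lambda o: jaccard(likes[user], likes[o]))
        return likes[best] - likes[user]

    print(recommend("alice"))  # {'Narcos'}, via bob

More raters means more set overlap to measure, which is exactly why participation matters more than granularity here.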


Big data.

Seriously. Lots of low fidelity measurement is sometimes better than a smaller amount of higher fidelity measurement. Especially when that higher fidelity measurement has errors.


I'm still disappointed they removed their friends feature. I valued my friends' ratings more than the ratings of the Netflix user base as a whole. I have noticed that Netflix tends to be more accurate with the "users like you" star rating than the whole user base ratings.

I no longer see a sum of my ratings but I believe it is well over 2,500. I know others that have rated a lot more (when there was the friends feature you could see your friends' number of ratings). Because of my number of ratings I felt that Netflix did a pretty good job with recommending me content. I will be sad to see this go as I definitely refer to star ratings when adding content to my queue.

Others have mentioned that the two factors of quality and enjoyment make the star rating more valuable and I agree. The only time I remember it breaking down for me was the film Rachel Getting Married (though I am sure there were others). I couldn't stand Anne Hathaway's character to the point that I gave it one star but at the same time I recognized that she gave a really strong performance in what was probably a good film.

Are they converting ratings to thumbs up/down? What does a three-star rating convert to? Those are typically movies that I enjoyed but wouldn't rewatch or recommend to others.


> Are they converting ratings to thumbs up/down? What does a three-star rating convert to?

It would convert to no rating.


why not use TMDb or IMDb for ratings? you can export your ratings from IMDb and import them into TMDb, which also has discussions for individual movies


Bummer. I understand why they do it but I think it's too coarse. I'd much rather know how many people thought a movie was fantastic and not just decent.


The current rating system is not a running average of how all users rated the movie, but represents users with similar taste to yours. The same movie could show a different rating for different users.


It would be neat if they gave each user a set number of super thumbs (obviously with a better name). You could use them only so many times per month or year. I think monthly would be ideal, but if they went the yearly route, they should make it possible to remove your super thumb so you can use it on another movie.


On the contrary, I think that superthumbs would be a brilliant name. If you introduce something as dry and complicated as a quota for strong upvotes, you'd better do anything possible to make it more playful. A certain dosage of silly naming might be a good start.


Likes?


Super Thumbs!


The problem is that everyone has a different idea of what the 1-5 scale means. My girlfriend never gives a 5 except for her favorite movie since nothing can match its greatness.

Meanwhile I rate everything 1-star vs 5-star.

So I'm not too convinced there's any nuance in star averages, much less a difference between "decent" and "fantastic". It might just be a more divisive movie that averages out to a 3-4.


You're forgetting all the other metrics that Netflix collects, like which shows get binge watched, finished, at what episode are people hooked or more or less guaranteed to finish a show after, etc.


What's next? Just a thumbs up like on Facebook?

Rating is a hard problem. No system works universally well. IMDb (an Amazon property), for example, uses 10 stars; Amazon uses five stars. Facebook uses thumbs up ("likes"). Game ratings are often percentages, 1-100% (summarized by metacritic.com and others). School systems around the world use letter grades like A, B, etc., or numbers like 1-5 or 1-6.


You're lumping together different kinds of rating systems. Netflix just wants to recommend you stuff you might like, which is a very different problem from scoring things on an absolute scale (grades, IMDb, game critics). Facebook likes have a social interaction component that's not there in Netflix, so it isn't comparable either.


Netflix lacks the issues that lead Facebook not to implement a thumbs-down. Netflix should still want a thumbs-down.

You should also distinguish between aggregate ratings and individual ratings. Metacritic and other aggregators normalize to some scale, but not all the individual reviews use the normalized granularity. A common movie case is Metacritic having to normalize a 5-star review (in half-star increments) to a 100-point scale. I believe Metacritic also allows user reviews on a 10-point scale, but presents the average on a 100-point scale (a 0-10 input shown with one decimal place, technically allowing 101 values from 0.0 to 10.0). Aggregate ratings always have more detail than their constituent parts.
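(If the normalization is linear, the arithmetic is just stars / 5 × 100, so a 3.5-star review would land at 70 on the 100-point scale; I don't know whether Metacritic does anything fancier than that.)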


I can safely say up/down voting is pretty terrible across all social media. You end up with HN and Reddit: popularity contests. Meanwhile, popular movies are rarely the ones worth watching, and the top 100 movies on metacritic is a grab bag of (great) movies that probably don't fit your taste profile.

Meanwhile, I have to like all the things the same amount. This is just going to lead to me not rating the vast majority of mediocre content on the platform.

Why not let you choose as a user?


HN uses upvotes and flags


Amy Schumer's comedy special bombed so hard they're changing their ranking system? Total failure.


She's blaming the boogeyman of the "alt-right" for the poor reviews. Yeah, it's got to be that, not that she has no actual material beyond "I'm a girl saying gross things, now laugh at me."

I'd rather have seen the money wasted on her spent boosting the publicity of female comics that are actually, you know, talented. Start with Tig Notaro.


I read that headline too, but the actual story didn't provide any evidence to support it. Schumer's 'Leather Special' was your usual tedious progressive lecture disguised as comedy with a few laughs and a lot of porn, and was modded appropriately, but that doesn't connect it to the Netflix rating change as far as I can see.


If/When Dave Chappelle sets a new record for 5 Star reviews, perhaps they'll consider reversing this decision.


This system would arguably be more unfavorable to Netflix/Amy Schumer.


Right. The original thumbs down meant being put to death.


At no point do I recall Netflix promoting this rating system as anything but a way to get better matches. The fact that so many comments are about losing a system for criticism just confirms that it was the wrong metaphor to use in the first place.

Perhaps they could bring back a "critical review" mechanism, but I'd guess it wouldn't affect your matches at all. And probably be mostly unusable on a TV anyway.


The issue is that their digital library is vanishingly small. I swear half the tv shows on there are Netflix original content from the last five years.


A problem I'd like to see a good solution to is selection bias. Unless there is some kind of reward, only people who really care are likely to rate something.

I think Netflix's move might help this. It certainly lowers the cost of rating.


There's another type of selection bias as well. I'm not going to watch a movie unless I think I'm going to like it (based on reviews, comments, description, etc.) As a result, with the exception of a few real disappointments, I'm generally at least sorta OK with most of the movies I watch.

Even most of my disappointments tend to be in the vein of "It was OK I guess but I don't see why people think this is so great."


But the UX of the current rating system sucks. The rating stars are very small and you have to hover over them to know you can rate. Only if you watch a movie to the very end will you see clearer rating functionality, and even then you have only a few seconds to use it.

I bet many new users don't even know they can rate. I wonder if people aren't rating much because the UX sucks, not because it's a 5-star system.


The UX on their web app may have this issue, but they have different methods for rating across their platforms. Some are easier than others.


What bothers me is that once you've rated something you can't unrate it, or see what the public rating is.


I believe what they display is just a guess at how much you will like it (based on people with similar viewing/rating histories), not a public rating as such, so once you've actually rated it, there's very little reason to show the other figure.


Wait, really? Damn, that explains why I love literally every 5-star-rated thing they serve me!


I don't believe you can ever see a public rating. Before you rate a title, you can see netflix's prediction of what you will rate it, but that's not public.


YouTube got rid of star ratings back in 2009 for similar reasons: https://techcrunch.com/2009/09/22/youtube-comes-to-a-5-star-...


And not only that, but also removing their existing collections of videos and replacing it with their "Made by Netflix" shows that mostly amount to shovelware.

I'm about ready to drop them, just like they've dropped MASH, soon to be X-Files, and many many movies.


Sigh.

Why do so many not seem to understand that, in most cases, Netflix is likely not "dropping" anything? Netflix does not own the rights to the content and the owners can decide on the terms of the agreement on how long the content will be on Netflix.

People threatening to leave, or actually leaving, lowers Netflix's leverage to try to get and keep the content in their system.

This is why Netflix is creating their own content and bringing in content that's not normally within your area. So they can keep it up longer and have more control over their own destiny.


I really don't care how they run their business. Not my problem. What is my problem is:

I started watching X-Files for the first time (yeah, never got around to it during the initial airing). Started watching in late February. And starting March 1st, there's a warning in the upper left of my monitor saying "Show will be removed April 1".

From my viewpoint, I'm paying the same, and getting less. Not only less, but specifically something I'm trying to watch.

And yes, I tried watching that psychic alien-abduction Netflix show. The sex scene in ep1 annoyed me, and I still had no interest by ep3. Boring is putting it mildly.


> "Show will be removed April 1".

Don't worry, the show goes downhill after the first couple of seasons, so it's probably for the best.


I watched until the T-1000 showed up. Can't see him as an organic lifeform. But in all seriousness it's a huge UI improvement that they give you a few weeks heads up now. I've narrowly missed the ending to some damn good shows and all I needed was a few days to get to a spot with some closure. I get that the rights aren't entirely in their control, so at least it's something.


I'm just pointing out that such complaints are aimed at the wrong target, that's the way it shall be with any service as long as people don't complain to the companies that control the rights to such content.


Sure, that makes sense, but as a consumer, all that matters to me is whether they have the content I want or not. If they don't, Amazon will gladly sell me the season DVDs that Netflix hasn't been able to host. I'm not personally interested in 'investing' in Netflix creating their own content (which I generally haven't been that into, honestly).


I'm cool with that, I'm just bothered by the constant blaming Netflix for something that is not of their creation. These problems will exist with any streaming service as long as people point their complaints to the wrong companies. People should complain to the content rights holders, not the streamers.

It's just as possible for the content owner to tell Amazon to stop selling their content if they wanted for some strange reason.


Yeah I think anyone who's been following Netflix should know that keeping content they didn't create available for streaming is actually quite difficult and not just because 'Netflix sucks' or something.


I read something (on HN actually) a while ago about how Amazon seems to be actively trying to kill Netflix, not by going after subscribers but by going after licenses for programming by over-bidding.

I think Netflix though is making more off of their in-house produced content these days, so I wouldn't say they're in danger; they just might look quite different in a few years.


I use Amazon Channels instead of Netflix now. I subscribe to a channel for a while and watch its content exclusively till I am ready to move on. I am currently watching the excellent XiveTV channel on Amazon [1].

Netflix is waay overhyped for what they actually have in their catalog.

https://www.amazon.com/Instant-Video/b?benefitId=xivetv&node...


They used to have more. As they've built up their own programming they've been losing a lot more of the other content, but like I said, I think they're doing it out of necessity.


The way Amazon is competing against one of its largest customers should be concerning for anyone on AWS.


Same. They dropped Peep Show and that was about the only show I liked that was still on. Plus King of the Hill and Spongebob, but they took those off ages ago.


I'd hardly call most of their shows shovelware. A great deal of them are at least decent or comparable to top shows on TV. Also, you do realize they don't set licensing prices, and they probably drop shows after people have already gone through them or stopped watching them. If you like a show, watch it while it's still on there; don't put it in your queue for 2 years and then get mad when it disappears before you got around to it.


I support Netflix out of philosophy. I want to support media which is offered without restriction and most importantly without advertisement. Netflix goes out of its way to make sure it's as watchable as possible (sans being able to download their stuff without DRM, but I can't blame them for that, it's too risky for their business).

I have similar reasons to support HBO. However, I will not send money towards FOX/Comcast/Disney/Viacom because they still want to restrict their content and make it as hard as possible to watch it.

Of course, the danger is they could just start introducing advertising again by way of product placement in the story itself, but if it goes too far, I guess I'll cancel then.


After you hand Comcast your money, you can also watch their content on your phone or tablet, anywhere, both live and on-demand. You can also use the phone or tablet to access what's on your DVR. Comcast is actually doing a good job of making it quite easy to watch the content on any device.


I haven't looked it up in a long time, but I don't think it's as easy as Netflix. Also, I want zero ads, not just the ability to stream.

It's a business model problem, and the entertainment model media providers offer is just not comparable to what is available on the internet.

Currently, I don't think the interest of Comcast's shareholders is the same as mine, they want to protect and grow their existing revenue streams, which depend on ads and outdated licensing contracts. My interest is to watch what I want, when I want, with as little friction as possible. Netflix and HBO provide that, none of the other big media companies do.

Also, Comcast has an interest in opposing net neutrality and providing a shitty internet experience so as to support their dying ad and TV business, so I also don't wish to support them for those reasons.


Why are they dropping shows? Does it save them money somehow?


They originally negotiated licenses super cheap, as the studios thought they were just some niche upstart. Now that they're eating the market, the studios are realizing their mistake and either not renewing licenses or jacking the rates up massively - hence the move into original content, which can't be taken away.


Lots of assumptions, presumptions, guesses and maybes here, but:

- Netflix's licences to stream shows are presumably for a limited period of time.

- There's only so much money in the pot, and they want to spend it on the renewals and new acquisitions they think most likely to retain existing subscribers and attract new ones.

- Sometimes the rights owner may not want to renew Netflix's licence: they may strike an exclusive deal with another platform, or may feel that having the show on Netflix is cannibalising their DVD sales.


> but also removing their existing collections of videos and replacing it with their "Made by Netflix" shows that mostly amount to shovelware.

You are not wrong

"US Netflix offers only 31 of IMDb's top 250 movies, study shows"

"David Wells, Netflix’s chief financial officer, was quoted by Variety stating that the company wants half of its content to be original productions over the next few years." [1]

RIP Netflix!

1. https://www.theguardian.com/film/2016/oct/14/us-netflix-imdb...


Netflix vs non-Netflix content aside, the low number of movies is a direct result of viewing data. TV shows are shorter, so they can be squeezed into smaller viewing slots, and they have longer plot lines to hook viewers. Can't blame them for putting their money where people waste their time.


They're becoming their own content creator...like HBO, thus reducing their dependence on often-hostile networks for content and becoming able to offer more content to more international markets. There is nothing wrong with that.

I find myself watching more of the stuff they've produced than the licensed content, so for me, it's been an improvement. If I run out of things to watch, maybe I'll care then, but at least I'm not tied to a cable contract; I can just switch to another streaming provider until they fix that issue. At this point though, I can't keep pace.


I can't imagine any film whose message, cinematography, style, and meaning could be encapsulated in a simple thumbs up or thumbs down.

What a disappointing simplification.


I have found with the current 5 star rating system that anything with less than 4 stars isn't worth watching.

And anything with 5 stars has a 50/50 chance of being worth watching.

So, for me, any simplification is welcome.


The star system didn't capture that either though


Those are very rare cases. Cinema is the most inhuman conveyor of entertainment consumption. Modern movies are nice-looking moving pictures generated from big data and emotional exploitation.


I can only wonder why. 5 stars is the normal rating system for movies, and there's a huge difference between each star, at least for me.

I basically rate like this: unbearable-cannot-finish (1), bad (2), okay-for-background-noise (3), what-I-expect (4), awesome (5).

Of course, some movies are somewhere between 3 and 4, or between 4 and 5. But if you just give me thumbs up or down, there are like 80% of movies I cannot rate at all, because they are neither.


I've always liked Rotten Tomatoes ratings more than IMDB star ratings. Rotten Tomatoes rating is basically, what percentage of critics gave a thumbs up.


Terrible idea. Thumbs up or down ratings only promote political reviews. A movie that criticizes feminism? Thumbs down immediately, regardless of the acting or content. A movie showing the good parts of religion? Thumbs down from visceral atheists. Someone just needs to look at YouTube to see how this doesn't work.


I think it's a great move. Maybe on a similar note, one of the reasons I really like and get more use out of Foursquare is the way their rating system works compared to the five-star Yelp system. For example, on Yelp I found that all "good" restaurants sit above a 4-star rating, and that's all the information I can get out of that rating. With Foursquare, they have turned the upvote/downvote/neutral rating into a number between 1-10 that tells me more. Above average is 7+, something unique about the place 8+, truly exceptional 9+.

It just seems to me it's a simpler way to rate something by a user, and at the same time classify it more adequately.
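I don't know Foursquare's actual formula, but you can get a 1-10 number like that out of raw up/down votes with something as simple as a smoothed upvote fraction, sketched here:

    # Hypothetical up/down -> 1-10 score via Bayesian smoothing.
    # prior_votes/prior_rate pull low-sample places toward an average score.
    def score(ups, downs, prior_votes=10, prior_rate=0.7):
        p = (ups + prior_rate * prior_votes) / (ups + downs + prior_votes)
        return round(1 + 9 * p, 1)  # map the 0..1 fraction onto 1..10

    print(score(180, 20))  # popular, well-reviewed place -> 9.0
    print(score(3, 1))     # too few votes to stray far from the prior -> 7.4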


What Netflix needs to do is improve their content, not screw around with ratings. The quality of the content has definitely deteriorated over the last few years. It is almost impossible to find good material to watch. I have found myself wanting to watch a good movie only to give up after searching for half an hour.

Amazingly enough non-premium TV channels have better movies than Netflix, so we record them and skip the commercials.

Frankly, the only reason we keep Netflix is because there are a couple of kids TV series our kids like. We are using them to improve their Spanish language skills. If it weren't for that we would have cancelled a long time ago.


I wish we could have the IMDB/RT ratings directly in Netflix and a page with all the movies sorted by those ratings.

There are articles on the Internet like "top 100 Netflix movies of the month", but it's almost always for the US Netflix.


Unfortunately, A Better Queue is dead, but you can use NEnhancer (https://chrome.google.com/webstore/detail/nenhancer/ijanohec...) in Chrome to add Rotten Tomatoes and IMDB ratings to Netflix.


The way I understand their explanation is that users tend to view the act of rating as a stronger endorsement than it actually is. In other words, users think their ratings are more predictive of what they would enjoy in the future than the implicit signals that can be derived from actual behaviour. Netflix concluded that it's better to boost the weak(er) signal by encouraging users to rate more. They will continue to use it on top of the stronger implicit signals to provide personalized ratings, which they will display as a percentage match as long as it's greater than 50%.

So their recommendations may actually improve.


Question to the Machine Learning folk: are five-star ratings "better" than thumbs up/down, or is it just a matter of algorithm design?

I know ratings/prediction has long been studied by the MovieLens.org scientists.


I think the fact that Netflix saw a 200% increase in "thumbs" ratings vs "stars" in the A/B test makes it much better from a machine learning perspective, even if it might at first seem like the data is of "lower quality". One of the biggest problems for recommender systems is that the data is extremely sparse - most users will have only rated a tiny proportion of films, and most films will only have been rated by a tiny proportion of users.

This is just my opinion now, but having studied recommender systems in a decent amount of depth, I don't think the design of the algorithm will need to change. The current techniques that provide the best results use matrix factorisation to simultaneously learn characteristics of films and how much each user likes each characteristic. My intuition is that the algorithm can learn a lot more about a user from 3 up/down ratings than from 1 star rating, and since the only reason Netflix are doing this is to provide better recommendations, that's almost certainly the case.

TL;DR: All else being equal, 5-star ratings carry more signal for any machine learning algorithm, but the fact that thumbs up/down will result in 200% more data is a lot more significant than the delta in the signal.
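For anyone who hasn't seen matrix factorisation in action, here's a bare-bones SGD sketch on invented thumbs data (real systems add bias terms, tuned regularisation, and the implicit signals discussed elsewhere in the thread):

    import numpy as np

    # (user, film, rating) with thumbs up = 1.0, thumbs down = 0.0
    ratings = [(0, 0, 1.0), (0, 1, 1.0), (1, 1, 1.0), (1, 2, 0.0), (2, 2, 0.0)]
    n_users, n_films, k, lr, reg = 3, 3, 2, 0.05, 0.01

    rng = np.random.default_rng(0)
    U = rng.normal(scale=0.1, size=(n_users, k))  # latent user factors
    V = rng.normal(scale=0.1, size=(n_films, k))  # latent film factors

    for _ in range(500):
        for u, f, r in ratings:
            err = r - U[u] @ V[f]
            U[u] += lr * (err * V[f] - reg * U[u])
            V[f] += lr * (err * U[u] - reg * V[f])

    # predict user 2's affinity for an as-yet-unseen film 0
    print(U[2] @ V[0])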


The technical term for a multi-preference rating is a Likert scale. There are different schools of thought on this -- an even or odd number of options (even "forces" reviewers to indicate some preference, positive or negative; odd allows fence-sitting), how to compare different raters' preferences, etc.

I'm finding interesting the rating system I've created and am trying to apply with some consistency to my Pocket archive of a few thousand items. Nominally it runs from 0 to 5, though I may reserve a 6 for an absolutely mind-blowing piece.

A 0 is a net negative: you are less informed for having read it; it reduces the intelligence of its reader.

A 1 is, generally, a simple noting of some event.

A 2 should be a general news story, without strong insight.

A 3 is a news or general interest story with strong insight, or a typical scientific paper, or an undistinguished book (generally nonfiction).

A 4 is a particularly good scientific paper, or a typically well-thought-out book.

A 5 is a document which establishes a fundamental idea or field. Claude Shannon's original paper on information theory, say.

I don't think I've run across a 6 yet, but that might be a work which ties together two or more previously unrelated fields into a common theory.

My problem has been in assigning far too many '3' class articles. I've already carved out a list to re-assess and downgrade if appropriate.

I'd also like to be able to report on the numbers for each classification, though Pocket's utter lack of quantitative reporting (I cannot even state how many articles I've collected in total) stymies this.

I am ... increasingly dissatisfied with Pocket as an information management tool.


I get why they do this and it'll probably work out fine, but for me personally, it's going to mean I will rate a lot fewer movies/shows. In the past, I've rated pretty much every single thing I've ever watched on Netflix. On YouTube, I hit the thumbs-up or thumbs-down buttons on a very small fraction of the videos that I watch. That's just because there is no middle ground (and no 2 or 4 star ratings either). There's plenty of stuff that I enjoyed, but didn't love, plenty of stuff that I disliked, but didn't really hate.


I don't have particularly strong opinions about this change, I don't actually rate many things that I watch (I think the recommendation system can go mostly by what I watch).

I think the fundamental problem with these rating systems is that they are all one-dimensional. Maybe a two-category system would help, like 1-5 stars for "enjoyment" and another 1-5 stars for "quality". Or a 1-5 rating paired with a label, like "campy" or "serious", to prevent apples-and-oranges comparisons.


In most cases I would agree with the simple up/down rating, because we're seeing the lunacy of "less than 5* is failure" in various platforms. But in the case of Netflix where it's primarily being used to manage my own personal preferences on content, some nuance of "more like this" and "well I kinda enjoyed it but it was badly flawed" and "meh" and "never ever show this to me again" is useful.


I wish there was a "meh" option. Sometimes things are ok, but not good or bad. This will leave me not rating a bunch of things unless they're good or bad.


Special categories would also be nice. "Would probably be funny if I was high". "Don't watch when high - will freak you the hell out".


I'd bet Netflix uses the watched but not rated signal, especially if you often rate things.


I understand people who want more expressiveness in their reviewing, but the reality is that the 5-star system just leaves too much ambiguity.

It's like how in Uber/Lyft, most people just default to 1 or 5 star ratings. In that case, the average rating a driver has to maintain gets skewed to a pretty high number, like 4.5, and people who think "oh, this was a pretty good ride! 4 stars!" end up unintentionally boning the driver.


You could come up with an algorithm that, instead of showing an average of 5-star ratings, biases the weighting towards people who rate things in a similar way to you. We all have an internal set of assumptions about what a 5-star rating system means to us; this is a case where the filter bubble leads to better understanding. People would then see the ratings they expected, rather than a mix of different methodologies averaged together.
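A minimal sketch of that weighting, with hypothetical histories (inverse mean disagreement is just one of many possible similarity measures):

    # Weight other raters by how similarly they've rated titles I rated,
    # then average their votes on a new title. All data made up.
    my_history = {"A": 5, "B": 1, "C": 5}
    histories = {
        "harsh":    {"A": 4, "B": 1, "C": 5},
        "generous": {"A": 5, "B": 4, "C": 5},
    }
    votes_on_x = {"harsh": 2, "generous": 5}

    def similarity(mine, theirs):
        shared = mine.keys() & theirs.keys()
        gap = sum(abs(mine[t] - theirs[t]) for t in shared) / len(shared)
        return 1 / (1 + gap)  # 1.0 = identical rating style

    w = {u: similarity(my_history, h) for u, h in histories.items()}
    rating = sum(w[u] * votes_on_x[u] for u in w) / sum(w.values())
    print(round(rating, 1))  # 3.2 -- pulled toward "harsh", who rates like me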


I don't rate things on Netflix (and rarely on any other service). If I made it through the movie, I probably enjoyed it; if not, I didn't.

What I would like to see is a collection of all available movies by genre, and I'll pick what I want to watch. That's it - I don't need or want recommendations. In fact, if it's recommended, my first thought is "who's getting paid for that?"


I've found some solace using Mubi, which has a pretty devoted film fan base that really puts thought into written reviews + star ratings. I almost always read a handful of these before following through with a viewing. They also have a nice 1-in-1-out movie-per-day system which lets them theme films by event (Cannes, director birthdays, etc). Not for everyone, but it's out there.


Glad to see MUBI mentioned here. I spent 9 years of my life building it as co-founder and tech lead.

My biased view is that if you are sick of bland streaming selections then you should really support MUBI. The more subscribers we can get the more we are able to fund great unique content. Our approach is fundamentally different from the data-driven, mass-appeal approach that tends to sand all the corners off of Netflix/Amazon content. Currently most MUBI content is licensed with a few exclusive releases sprinkled in, and the US selection is not as good as the UK, but as the subscriber base grows it will get better and better. Netflix et al are a volume game and the unit economics don't work out for them to purchase great indie films. If you care about great film as opposed to episodic content, MUBI is the streaming service that is pushing the envelope there.


Wow, how random that you are on here too, haha. Thanks for building a great product. It addressed pretty much every gripe I had with the bigger streaming services.


Normalize star ratings. If something has an average rating of 4 stars, but four stars is in the lowest 10 percent, it should be given 1 star. This solves the classic issue with star ratings where anything under 5 stars is below average.
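A sketch of that normalization on invented averages: rank each title's mean against all the others, then re-map the rank onto the star scale.

    # Percentile-normalize inflated average ratings (invented numbers).
    means = {"A": 4.0, "B": 4.6, "C": 4.4, "D": 4.8, "E": 4.5}

    def normalized_stars(title):
        below = sum(m < means[title] for m in means.values())
        pct = below / (len(means) - 1)   # percentile rank among other titles
        return round(1 + 4 * pct, 1)     # re-map onto the 1-5 star scale

    print(normalized_stars("A"))  # 1.0 -- "4 stars" but the worst of the lot
    print(normalized_stars("D"))  # 5.0 -- the best of the lot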

Personally, I use star ratings badly. I only ever rate titles 5 stars, but that's because I mostly only watch stuff I am fairly sure I am going to like.


I have always wanted to get ratings from a selected group of viewers who have similar tastes to mine. For example, I like sci-fi movies that involve planetary travel, i.e. Star Trek or similar. If I could get ratings from folks with similar interests, then the rating system would make sense for me. Otherwise, it is still hit and miss for me.


That's exactly how their recommendation system works. If it just surfaced the highest-rated movies globally, everyone's recs would be identical. Collaborative filtering implicitly weighs the opinions of those similar to you more highly.


I already push my Netflix ratings higher or lower than I really feel just to more heavily influence Netflix's suggestions. Otherwise, Netflix thinks I would give a 3.5 stars to pretty much all movies, which is not helpful to me.

That said, a simple thumbs up/down doesn't allow me to tell Netflix which movies were not bad and which to never show me again.


I've been a Netflix subscriber for a decade.

To me, 3-stars means: this movie has all the characteristics of something you'd enjoy - but we don't think you will.

Unfortunately, it's been more and more challenging for me to find 4- & 5- star stuff in their streaming interface.

Perhaps the thumbs-up will help expose those things, or at least help me hide the thumbs-down things.


Netflix's star rating system is really poorly designed. Of course people clicked on a thumbs up/thumbs down widget more. I used Netflix for months before I even noticed the star rating was interactive.

I wish they would invest in a quality star rating system like Amazon's, rather than dumbing down the UI.


My previous ratings already in Netflix were the reason for being there. This dumbed-down version of up/down voting is ridiculous. Most companies become more sophisticated over time; Netflix is going backwards.

All in all this is a RIDICULOUS move. I wish I could download all my previous ratings before they remove them!


I'm going to guess they looked at things and found that 80% of the ratings were either 5 stars or 1 star. (edit - not that this is a good thing.. I'm guessing they could have carved out the 2-4 star ratings and done separate data analysis on them, which probably would have provided much more nuanced information)


What I really want to avoid is the mediocre content, i.e. 2.0-3.5 stars; very good and very bad can both be entertaining.

Now that mediocre content will be scattered across the two categories I really care about; I guess IMDB/Rotten Tomatoes will steer clear of these ridiculous binary rating systems.


Isn't Rotten Tomatoes exactly an (aggregated) binary rating system?


Yes it is, and it sometimes gives strange ratings, like "Ash vs Evil Dead: season 2" having a 100% rating[0]. Though it's probably a good show, it's not the absolute perfection the rating would imply.

0: https://www.rottentomatoes.com/tv/ash_vs_evil_dead/s02


Aha, I misunderstood, I thought Netflix was going to recommend movies or not based on that rating system, without giving us the background. I would still prefer something more fine grained though.


I've seen so much good content on Netflix rated at 2 stars or less. Movies and TV shows are way too subjective for user reviews to be used with any kind of reliability. Or maybe I just have low standards.


I also find the Netflix critics somewhat harsh; perhaps we are more critical from our sofas than at the cinema.


There's always a better option if you like metrics: instantwatcher.com sorts Netflix/Amazon content with a variety of external metrics like IMDB score etc. It's pretty great if you are just browsing (rather than seeking recommendations).


After almost 5 years I finally cancelled my Netflix subscription. I just realized I was watching more movies / shows on Amazon, Youtube, or any of the other paid ones more than Netflix. I used to enjoy it a lot more back then.


I would recommend FMovies.se or 123movies.gs


Why not keep the rating system, and have an additional thumbs up/down rating? That way you get to vote on the 'objective' quality and the 'subjective' enjoyment factor of the content.


This is what even Google does now for movies and TV shows in search. If you are logged in while searching for a movie or TV show, it offers thumbs up or down and shows the overall approval rating.


Uh oh, now they're going to need to re-film the Black Mirror "Nosedive" episode (s3e1) using thumbs-up/down instead of 5-star ratings...


The best way to rate a movie for me would be: when will I watch it again?

They kind of already know this though. (How many times have I watched a particular movie on Netflix?)


Not for me. I rarely watch movies twice, even those I really enjoy. A better metric for me would be "am I glad I watched this movie?"


Therefore, any movie you've watched more than once becomes pure gold for Netflix! Spread that across several million subscribers and get enough data to be useful.


Should be "five star-rating"..."five-star rating" indicates they are replacing ratings of five stars :)


I don't have Netflix, so I'm not clear on whether the rating shown to users is changing. Or does it not show a rating at all, just use it to decide what movies to surface for you?

In any case, simplifying the ratings users can bestow makes a lot of sense, unfortunately. We like to imagine people carefully considering the merits and flaws like a professional reviewer, but the fact is that the vast majority of users only ever use two ratings anyway: if they like it, 5 stars; if they didn't, 1 star. There's maybe a third option where if they're ambivalent they just don't rate it.

So star ratings aren't actually very useful for evaluating products. This xkcd https://xkcd.com/1098/ made me realize that an Amazon product which has a 20% chance of exploding when you open the box, and otherwise works normally but unexceptionally, is gonna average out to a four-star rating for a product you really shouldn't buy. I always look at the one-star reviews before buying stuff now, no matter how few of them there are; knowing a thing's failure modes is much more useful than a bunch of praise.
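(Assuming the happy 80% leave 5-star reviews and the unlucky 20% leave 1-star ones, the mean works out to 0.8 × 5 + 0.2 × 1 = 4.2, which displays as roughly four stars.)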


The histogram of star ratings is much more valuable. I tend to avoid bathtub curves.


Whenever I would browse the Netflix reviews it appeared most people voted either 5 stars or 1 star anyway.


Interesting. YouTube made the same switch some years ago, I wonder how it's worked out for them.


YouTube used to have a star based ratings system too. I wonder if they drew the same conclusions.


Great. I can't stand 5-star systems. Anything other than 5 stars is negative signal.


Netflix recommendations are quite bad.

You've seen this standup? How about watching it again a day later?


This makes sense to me, considering most people only bother to use the extreme star ratings anyway. Anyway, you're kind of a sap if you take Netflix ratings seriously and don't use something like Rotten Tomatoes.


What they should be asking is "would you watch another movie like this?"


Well, if no one else is gonna link it I may as well: https://xkcd.com/1098/

I'm all for this!


Life is not black and white, how you feel about something isn't terrible or brilliant every time. This is idiotic.


What does this say about their algorithm? Forget UX/UI.

I feel as though this devalues their recommendations to you.


so how exactly do I distinguish between liking something a little and loving something?

this seems like a pretty retarded decision

at least we have TMDb where we can rate and discuss movies, unlike IMDb or Netflix


I think you should distribute your TMDB shilling across a few more accounts, mate. I actually recognized your username from one of your shills I found so bad that I linked it to a buddy of mine.

Just some of your recent posts on HN:

    > why would i participate on dumb backup of database, 
    > done without my consent (OP stole 14 years of my posts),
    > when I can go to proper movie database with discussions, 
    > imported ratings and watchlist from IMDb, called TMDb (themoviedb.org)
Shill Tip: don't link to your shill target when it's googleable like "TMDb" is. It makes it look like you are just dropping a name everyone should know.

Love the fake outrage about having your comments "stolen", as if anybody really cares.

    > he doesn't and he illegally copied 14 years of my posts which 
    > i don't like especially since I moved to TMDb which is proper
    > movie database where you can also import your IMDb ratings 
    > and watchlist and each movie there has own discussions
Another:

    > i would recommend checking TMDb which launched discussions 
    > for each listed title just yesterday, so don't expect there 
    > will be much content, but at least each movie or TV show had
    > its own page and own forum, much more neat than reddit
Consider leveling the load across more accounts though, and mix in some different strategies.

ctrl-f for "TMDb" on each page of your comment history and you'll see the problem.


it's related to the discussion, for people who prefer normal star ratings instead of thumbs up/down, so I'm not sure what your problem is. Also not sure what your definition of a shill is, but I doubt it's someone who doesn't even bother to provide a link for his potential victims and who is just a regular user of a website he was forced to move to from IMDb.

not sure where you got fake outrage from my post, when some thief is stealing 14 years' worth of my content from IMDb

judging by your language you are most likely a shill from some competitor's failed forums which are already forgotten as IMDb alternatives

everyone feel free to go through my history to see if I am some account made to spam TMDb, as this expert implies


> everyone feel free to go through my history to see if I am some account made to spam TMDb, as this expert implies

36 mentions in 76 days.

https://hn.algolia.com/?query=author:Markoff%20tmdb&sort=byP...


> judging by your language you are most likely shill

Please keep this sort of tit-for-tat off HN.

I don't think you're a spammer and it's ok to occasionally promote your own stuff on HN in relevant contexts, but if you do it too much then other readers here will start to strenuously object.


except it's not even my stuff. I am just an annoyed IMDb user who moved to TMDb and wants to let other people know about the alternative so we can all continue discussing movies. After all, it's in my best interest to attract as many users as possible so I have more people to discuss with, even though I have zero monetary benefit and am in no way affiliated with TMDb beyond being a user like anyone else. Heck, I don't even help edit movies there. No mod, nothing...


That's totally fine! But if users have noticed it enough to get annoyed it might mean you were mentioning it excessively. In that case the solution is simply to do so a bit less, and make sure the context is a good and relevant one when you do.




