The comment that it was "human error" was also interesting; perhaps someone made the video a favorite while logged in as the Editor? The really interesting bit for me was that the system already had a porn screen in place. It "knew" the video was porny, since it required you to tap through to view it, so why does the program / tool that makes a video "Editor's Choice" not automatically reject, as an error, an attempt to promote a porny video to that spot?

Google Video (the service that existed at Google before, and to some extent after, they bought YouTube :-) ) early on picked the top videos for its page with an algorithm based on views, ratings, comments, and so on, but even that early algorithm had a built-in check for NSFW content that prevented flagged videos from ever making the list.
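A minimal sketch of that shape of pipeline, assuming a per-video NSFW flag set by an upstream screen; the field names and score weights here are invented for illustration, since the actual Google Video algorithm was never public:

    from dataclasses import dataclass

    @dataclass
    class Video:
        title: str
        views: int
        rating: float    # e.g. average user rating, 0-5
        comments: int
        nsfw: bool       # hypothetical flag set by an upstream porn screen

    def top_videos(videos: list[Video], n: int = 10) -> list[Video]:
        # The NSFW check is built into the selection pipeline itself, so a
        # flagged video can never make the list no matter how well it scores.
        eligible = [v for v in videos if not v.nsfw]

        def score(v: Video) -> float:
            # Invented weighting of views / ratings / comments.
            return v.views + 50 * v.rating + 10 * v.comments

        return sorted(eligible, key=score, reverse=True)[:n]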

Seems like a brain fart. Either that or a poor attempt at getting publicity for the service. The latter would be really lame if Apple pulls the app over the commotion.


-----

why does the program / tool that makes a video "Editor's Choice" not automatically reject, as an error, an attempt to promote a porny video to that spot

This was the second mistake. There should be a business rule in place to prevent it, along the lines of the sketch below. Most likely they rushed the app to production without considering this case.
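A minimal sketch of such a rule, assuming the video record carries the same NSFW flag that drives the tap-to-view screen; the names (Video, FeaturedSlot, promote_to_editors_choice) are hypothetical, since nothing about the app's internals is public:

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Video:
        title: str
        nsfw: bool   # hypothetical flag, the same one behind the porn screen

    @dataclass
    class FeaturedSlot:
        video: Optional[Video] = None

    class PromotionError(Exception):
        pass

    def promote_to_editors_choice(video: Video, slot: FeaturedSlot) -> None:
        # Business rule: if the system already "knows" a video is porny,
        # the editorial tool refuses to feature it, even for the Editor.
        if video.nsfw:
            raise PromotionError(f"refusing to feature NSFW video {video.title!r}")
        slot.video = video

Raising an error rather than silently skipping the promotion means the human mistake surfaces immediately instead of being masked.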

-----


I bet it automatically rejects porn for featured spots now. Features like that tend not to get put in until somebody screws up. It's too easy to think, "nobody will ever do this in the first place."

-----