The comment that it was "human error" was also interesting; perhaps someone made it a favorite while logged in as the Editor? The interesting bit for me, though, was that the system already had a porn screen in place. It "knew" the video was porny, since it required you to tap through a warning to view it. So why doesn't the program / tool that marks something as "Editor's Choice" automatically reject, as an error, any attempt to promote a flagged video to that spot?
Google Video (the service that existed at Google before, and to some extent after :-), they bought YouTube) early on used an algorithm to pick the top videos for the page based on views, ratings, comments, etc. But that early algorithm had a check for NSFW content built in from the start, preventing such videos from ever making the list.
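The idea is simple enough that it fits in a few lines. Here's a minimal sketch of that kind of check (all names and scoring weights are made up for illustration, not Google's actual code): rank by engagement, but filter out anything already flagged NSFW before it can ever reach the featured list.

```python
# Hypothetical sketch: pick top videos by engagement, but drop anything
# already flagged NSFW before ranking, so a flagged video can never be
# promoted. Field names and score weights are invented for illustration.

def top_videos(videos, n=10):
    """Return the n highest-scoring videos, excluding NSFW-flagged ones."""
    safe = [v for v in videos if not v.get("nsfw", False)]

    def score(v):
        # Simple engagement score: views plus weighted ratings and comments.
        return v["views"] + 10 * v["ratings"] + 5 * v["comments"]

    return sorted(safe, key=score, reverse=True)[:n]

videos = [
    {"id": "a", "views": 9000, "ratings": 50, "comments": 20, "nsfw": False},
    {"id": "b", "views": 99999, "ratings": 500, "comments": 300, "nsfw": True},
    {"id": "c", "views": 1200, "ratings": 80, "comments": 10, "nsfw": False},
]

# Video "b" has by far the best numbers, but it never makes the list.
print([v["id"] for v in top_videos(videos, n=2)])  # prints ['a', 'c']
```

The point is that the filter sits in front of the ranking, so no amount of popularity (or a careless click by an editor) can push flagged content onto the page.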
Seems like a brain fart. Either that or a poor attempt at getting publicity for the service. The latter would be really lame, especially if Apple pulls the app over the commotion.