> ... management didn't quite see what was wrong with
> that. Instead, they told the client to NEVER copy
> paste content from other pages.
I'm reminded of the story about Feynman exposing the security problems with the combination locks used on safes at Los Alamos and Oak Ridge. Instead of fixing the problem, management just told people not to let Feynman near their safes.
> He tells the story of being in Oak Ridge, and delivering
> a report to a colonel there. He reports that the colonel
> felt himself far too important to have an ordinary safe
> -- he ordered a special multi-ton safe. Feynman was
> delighted to discover that this big, important safe used
> the exact same type of lock as their little safes did,
> and just to be sure, he took the last two numbers off it
> while standing in the colonel's office. After the colonel
> closed the safe, Feynman told him the safes weren't
> secure, and proved it by opening the safe, then
> explaining how he did it. He told the colonel that the
> vulnerability was in leaving the safe open while he
> worked. "I see, very interesting," replied the colonel.
> Several months later, Feynman was again at Oak Ridge, and
> was surprised at all the secretaries telling him, "Don't
> come in the office! Don't come in here!"
> It developed that the colonel had immediately sent around
> a memo asking everyone, "During his last visit here, was
> Professor Feynman in your office?" Those that answered
> yes received another memo: "Change your safe combination."
> That was his solution - Feynman himself was the danger.
> Meanwhile, of course, people still continued to work with
> their safes open ...
To me, the problem isn't that authentication depends on cookies and JS. The problem is that the system does the exact opposite of what it should.
An authentication system should allow access if and only if the client presents valid credentials in the expected manner (e.g. sending a cryptographically signed cookie back with the HTTP request). In other words, the auth system should use the presence of valid credentials as the criterion for access.
Instead, this system used the absence of something as the criterion. Imagine if a secure building did this. Everyone arrives at the security checkpoint, and if the guards don't recognize you, they give you a badge that says "I'm not allowed in." Anyone who's not wearing the badge is allowed in. That's what this website was doing.
Moral of the story: your authentication system can require cookies, JavaScript, or any other client-side technology as a precondition for access. If you do that and someone has disabled the technology, they're locked out and there's no security breach--just a frustrated user. But your system should never treat the absence of some marker as proof that the client is allowed in.
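To make the contrast concrete, here's a minimal Python sketch of the two policies. This is not the site's actual code; the cookie names, SECRET_KEY, and sign_session helper are all illustrative assumptions. The point is only which question each check asks.

```python
# Sketch only: contrasting "presence of valid credentials" with
# "absence of a marker" as the access criterion.
import hmac
import hashlib

SECRET_KEY = b"server-side secret"  # illustrative; keep real keys out of source

def sign_session(username: str) -> str:
    """Issue a cookie value the server can later verify."""
    mac = hmac.new(SECRET_KEY, username.encode(), hashlib.sha256).hexdigest()
    return f"{username}:{mac}"

def is_authenticated(cookies: dict) -> bool:
    """Correct: allow only if a *valid* credential is present."""
    token = cookies.get("session")
    if not token or ":" not in token:
        return False
    username, mac = token.rsplit(":", 1)
    expected = hmac.new(SECRET_KEY, username.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(mac, expected)

def broken_is_authenticated(cookies: dict) -> bool:
    """What the site effectively did: allow unless an "I'm not allowed in"
    marker is present -- a client that sends no cookies at all gets waved through."""
    return cookies.get("logged_out") is None

# A cookie-less client like GoogleBot:
print(is_authenticated({}))         # False -- locked out, no breach
print(broken_is_authenticated({}))  # True  -- treated as logged in
```

A crawler, a curl one-liner, or a browser with cookies disabled all look identical to the broken check: no marker, so come on in.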
Local legend has it that the guy didn't have a backup, but senior Googler Matt Cutts wrote a custom MapReduce to process the spidering results and make him a tarball of the content GoogleBot recorded as it deleted the site.
Supposedly he sent it with a note along the lines of, "This one's on the house, but we're not doing it again."
I think the authentication problem is more dangerous. If you fix the GET issue, you're still allowing any savvy stranger to delete your articles by hand.
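A rough sketch of that point, with a made-up handle_delete handler and an in-memory ARTICLES store standing in for the real app: restricting deletes to POST only stops well-behaved crawlers, while the credential check is the part that actually keeps a stranger out.

```python
# Illustrative only: fixing the GET issue without fixing auth.
ARTICLES = {1: "first post", 2: "second post"}

def is_valid_session(cookies: dict) -> bool:
    # Stand-in for verifying a signed session cookie (see earlier sketch).
    return cookies.get("session") == "valid-signed-token"

def handle_delete(method: str, cookies: dict, article_id: int) -> int:
    if method != "POST":
        return 405  # Method Not Allowed: this alone only deters crawlers
    if not is_valid_session(cookies):
        return 403  # Forbidden: this is what stops the savvy stranger
    ARTICLES.pop(article_id, None)
    return 200

print(handle_delete("GET", {}, 1))                                  # 405
print(handle_delete("POST", {}, 1))                                  # 403
print(handle_delete("POST", {"session": "valid-signed-token"}, 2))   # 200
```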