One of my former colleagues used to put weird error messages in his code (mostly made-up words, but really funny to pronounce). One of these messages escaped QA and went to a customer - but he was a nice guy, and he called up and asked what the word meant (it was a while ago, so I can't remember the word now).
We didn't have anything to do one afternoon, so we searched Google Maps for funny place names - there is a town called Hell. Imagine going there and reading the sign "Welcome to Hell". Good times were had at that job :p
"Mooning Steve Ballmer: How a Bungie dev's butt may have cost Microsoft $500K" http://www.polygon.com/2015/4/14/8382089/bungie-butt-microso...
One morning, the inevitable happened: a very angry client phoned up complaining that his site was swearing at him.
It turned out that the debug log had filled up, causing the site to crash on load, displaying only a single line:
CANNOT OPEN SHIT
The lessons you learn early on in your career.
I probably don't have to mention who got that alert while working one day. The president of the company, of course! Luckily for me it was not a customer-facing function, and they found it funny.
Now, older and wiser, I include the full stack trace and my explicit assumptions in all errors - orders of magnitude easier to debug than cryptic error messages. Treat your future self kindly!
Finally, one day I accidentally triggered that annoying error by doing a specific sequence of things and then clicking the browser back button a few times, then doing another specific sequence. That was a day of celebration!
You can be professional and have fun at the same time. (Or at least most Google engineers believe so.)
This was while we were working on a product called the A?????????  Information Delivery System. We actually laughed on the conference call when they told us, but they weren't joking, so someone had to use that super-polite voice you use when telling your boss he's been an obvious moron and explain.
Name of company omitted :)
We'll make mistakes; it's better to be proactive.
(another eg: vine shipping iOS apps with debug enabled)
Perhaps the skull was just having some fun, though!
But the build strips it out for production during minification, along with a bunch of other development-only code that never makes it into the release.
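For context, minifiers usually achieve this through constant folding plus dead-code elimination: when the bundler substitutes a compile-time-false flag, everything guarded by it becomes unreachable and is dropped from the bundle. A minimal sketch of the pattern (my own illustration; the names are not from any actual build):

```javascript
// Hypothetical example of code a minifier can strip. If the bundler
// replaces DEBUG with false at build time (e.g. a DefinePlugin-style
// substitution or terser's global_defs), the whole branch is dead
// code and is removed from the production bundle.
const DEBUG = process.env.NODE_ENV !== "production";

function loadInbox(user) {
  if (DEBUG) {
    // Development-only logging: never reaches the minified build.
    console.log(`debug: loading inbox for ${user}`);
  }
  return `inbox:${user}`;
}
```

In development the flag is true and the logging runs; in a production build the branch simply doesn't exist.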
Just be careful, and have fun. It's what hacking's all about. :)
Creeped me out; I assumed it was some browser plugin gone rogue.
Here's some old-school Windows programming, along with great ads for the hot computer products of the day:
Edit: "publicizing", not "releasing". As others pointed out, the website doesn't contain any actual code.
I certainly, certainly hope not. While of course there is an ethical question involved here, making it illegal to release "code that was obviously not intended to be public" is a MASSIVE slippery slope. I could put your grandmother in jail for clicking a broken link in her email and sharing the confusing things she saw, because of a "bug" in my app.
It's called the "Computer Fraud and Abuse Act".
The exact portion is: "Whoever intentionally accesses a computer without authorization or exceeds authorized access, and thereby obtains information from any protected computer" is guilty of a criminal offence.
This user of Gmail obviously surmised this was debug information he was not meant to see. As soon as he clicked that debug link or the detail link, he was intentionally accessing information without authorized access. He knew he was not supposed to access that information and he did so anyways.
The CFAA has been used before for things not too far off from this. 3Taps was found to have violated the CFAA when it scraped Craigslist after its IPs were banned.
Weev was prosecuted under the CFAA for accessing unprotected AT&T customer data that was exposed behind a URL with an incrementing integer ID (no password, no username, just a Perl script incrementing a URL parameter in a GET request).
This is a fairly well documented law that has been used a number of times and it's almost certain that the author is guilty under it, as written. It's one hell of a broad law.
Would you say that everyone who has ever clicked 'view source' is a criminal? (despite the fact that the source was sent to them in plain-text with the knowledge that a 'view-source' function is available to them)
This user was accessing Google's debug servers and debug information. Did you read up on the Weev case? It's not that dissimilar.
It seems like you're being intentionally obtuse in saying "his computer was not protected from him", well, no, Google's debug servers and information were meant to be.
If someone accesses the source code of a website while knowing that the website author intends them to not access it, then yes, they're potentially exceeding their authorized access and breaking the law under the CFAA.
I don't think the law is good nor makes sense, but explaining why it's dumb logically to me doesn't help. You're preaching to the choir. I know it's dumb and doesn't make sense. This law was created by people who do not understand technology or the internet except by analogies to it being "kinda like a supermarket" or such.
I gave suitable evidence that this is quite possibly illegal because of a dumb law. You've told me that it's dumb for this to be illegal (yes it is dumb) as if that means it can't be illegal. That's not a rebuttal to the links and statements I provided and, without a meaningful counter argument that isn't you intentionally being obtuse about what I said, you aren't furthering this discussion.
I don't know much about weev's case except that it sounds like it was information that AT&T had decided that the public was authorized to access without any authentication or protection. They screwed up. I agree that a lot of legal people are tech-illiterate, and they screw up, too. Which may be why they eventually bailed on a venue technicality rather than address the actual case.
But your point was: "Whoever intentionally accesses a computer without authorization or exceeds authorized access, and thereby obtains information from any protected computer" (which doesn't really sound that dumb on its own)
My point is that you're authorized to access your own computer and it isn't protected from you, so that would not apply (unless the lawyers involved couldn't figure it out). Is clicking the 'About' button in the help menu of an application and accessing the version number a crime? Seeing the Gmail debug info in Chrome is just that with more detail. Try putting "chrome://about" in your Chrome URL bar. Ooh, there's data. Lots of debug data. Are you a criminal now? No, it's your system and you're authorized to use it. And the makers of Chrome chose to give you access to that data. Just for fun, try chrome://quit/
> As soon as he clicked that debug link or the detail link, he was intentionally accessing information without authorized access
No he was not; he was accessing information because he did not know what it was (it was not explicit enough), and he had been authorized to do so by Gmail. In both cases you are citing, there was work done on the user's part to access the information; in this case there was not.
The initial load had no intent, but all exploration afterwards probably did. He stated himself that he thought it was debug information. If he only realized that Google did not intend him to have that information after he finished screenshotting everything, maybe he's in the clear for intent.
Again, this is similar to Weev. Weev found a url which AT&T gave him that had a number in it. He knew AT&T didn't intend him to change the number (just like this author knew Google didn't intend him to see the links), but he changed the number anyways (and this user clicked the links anyways).
I don't see the fundamental difference here.