Probably 99.99% of google.com submissions are not linking to any official Google statements. And when submissions do happen to deal with official Google statements, they tend to be Google+ posts from someone who represents Google. So subdomains won't magically fix anything.
Why can't you just click through if it is ambiguous to you?
(Note: I am not jcs)
Each represents a rather different type of communication, which may or may not originate from Google HQ.
2. Consider how the JS will affect your pages' instant previews. Overlays are likely to show up in the previews, and these will take some time to update as we re-generate the previews.
3. Consider your users. Some webmasters are suggesting keeping the blackout overlay visible without any means of hiding it. That may not be the best user experience, and may be annoying to your non-US based users.
I'm happy to answer any other questions.
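For anyone implementing the blackout this way, here's a minimal sketch of the 503 approach using only Python's standard library (the page content, port, and 12-hour Retry-After value are placeholders, not anything Google prescribes):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

BLACKOUT_PAGE = b"<h1>This site is blacked out in protest of SOPA/PIPA</h1>"

class BlackoutHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # 503 tells crawlers the outage is temporary, so the index
        # and instant previews should recover after the blackout.
        self.send_response(503, "Service Unavailable")
        # Hint when crawlers should come back, in seconds (12 hours here).
        self.send_header("Retry-After", "43200")
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(BLACKOUT_PAGE)))
        self.end_headers()
        self.wfile.write(BLACKOUT_PAGE)

def serve_blackout(port=8000):
    """Serve the blackout page on every GET until stopped."""
    HTTPServer(("", port), BlackoutHandler).serve_forever()
```

In practice you'd do this at the web-server or CDN layer rather than in application code, but the headers are the same.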
They ought to start off with "Hi, I'm XYZ, lead PQR coordinator of Google ABC." or something.
At least this time it says "google" if you mouse over his photo; the last one didn't even do that.
What do I do at Google? I help webmasters build better websites, so you'll see me talking to webmasters on our forums and others, writing blog posts, saying things like "I'll ask internally", and the like.
Security-wise, a bad idea. For example, the email of an actual company employee (at a company that offers email) might normally be firstname.lastname@example.org or email@example.com, etc.
By posting in this manner, there is nothing to prevent someone from posting the wrong information.
I'm the author of the post. Your concern is valid and it's one of the reasons the Google+ account is verified.
Also, there is plenty of evidence that it's a real Googler's account. For example, when I blog on our official Webmaster Central blog, I link to this Google+ profile; for example the most recent post: http://googlewebmastercentral.blogspot.com/2012/01/better-pa...
Hope this assures you a bit!
And even if they did, how do you even know that info is true? Who verifies that?
Retry-After is also useful for "down for maintenance" pages, since you usually know how long your site will be down.
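As a sketch (Python stdlib; the helper names are mine): RFC 7231 allows Retry-After as either a delay in seconds or an HTTP-date, and the latter suits a maintenance window whose end you already know:

```python
from datetime import datetime, timedelta, timezone
from email.utils import format_datetime

def retry_after(minutes):
    """Retry-After as delay-seconds, one of the two forms RFC 7231 allows."""
    return str(minutes * 60)

def retry_after_http_date(minutes):
    """The same hint as an HTTP-date, e.g. for a scheduled maintenance window."""
    back_up_at = datetime.now(timezone.utc) + timedelta(minutes=minutes)
    return format_datetime(back_up_at, usegmt=True)
```

Either value goes in the `Retry-After` header alongside the 503 status.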
However, I can see how your initial suggestion of showing full content to Google could be viewed as solely for preserving rankings in an artificial manner.
As an example of cloaking, some news sites let Google index all their pages, while requiring actual users to login/register to view them.
Typically, if the user has a Google referrer, they can view the page once for free and then need to login/register to view anything else.
Visiting the page directly, or with a non-Google referrer, shows a register/login page.
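The gating described above can be modelled with a toy function (entirely hypothetical; real implementations key off sessions and crawler verification, not bare strings):

```python
def may_view(referrer: str, views_from_google: int,
             logged_in: bool, is_crawler: bool) -> bool:
    """Toy model of 'first click free' gating. Crawlers always see the
    full content; that asymmetry is exactly why Google treats the
    pattern as cloaking when it's abused."""
    if is_crawler or logged_in:
        return True
    came_from_google = "google." in referrer
    # One free view per visitor when arriving from a Google result.
    return came_from_google and views_from_google < 1
```

A direct visit (`referrer=""`) falls straight through to the login wall, which is the behaviour the parent comment describes.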
The New York Times was one that does (did?) this. I stopped visiting them when they started. I think the Washington Post, or one of the Posts, was doing it too, along with a number of other sites.
Experts Exchange used to be basically the same way, although I think they are doing it differently now. They were slapped by Google a long time ago for cloaking, so they changed to a different method of cloaking...
Will a 503 make it into the listing? If so, be sure to put your message in the title.
If you just add a banner or message to your site and keep the rest of your content intact then sure, it may show up.
If they want censorship, let's give them censorship.
Why not set the page titles and meta descriptions for all your pages so that "This site was blacked out in protest of SOPA" shows up in your search results for a while? That would honestly probably have more impact, and Google will get around to indexing you again.
Doing a blackout for twelve hours harms your business, fucking up your SEO harms it again. Obviously in this, as in any protest, there is a decision to be made on what to do. Why not only blackout for one hour? Why not blackout for an entire month?
If you want people searching for anti-SOPA messages to find your site and not your regular traffic then go for it.
The other thing it could do is redirect back to the source site for every x visitors, so one could set the ratio: 1 in 3 visitors gets the redirect. Search bots don't get the redirect.
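A crude sketch of that ratio logic (a hypothetical helper; real bot detection is far more involved than a User-Agent substring check):

```python
import itertools

_visitor_counter = itertools.count()

def should_redirect(user_agent: str, ratio: int = 3) -> bool:
    """Redirect 1 in `ratio` human visitors back to the source site.
    Crawlers never get the redirect, so indexing is unaffected."""
    known_bots = ("Googlebot", "bingbot", "Slurp")
    if any(bot in user_agent for bot in known_bots):
        return False
    # Only human visits advance the counter toward the ratio.
    return next(_visitor_counter) % ratio == 0
```

With `ratio=3`, every third human visitor is sent through while bots are skipped entirely.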
It'd also make a good point for media coverage, providing some metrics on the effect of the blackout, so long as people use it.
"xx,xxx internets were censored in the last yy seconds/minutes/hours from zz domains"