// Do AJAXy thing instead, then
Edit: I just realized that if you have links to `http://twitter.com/#!ev` in the wild, then the AJAX crawler thing actually becomes pretty useful.
The biggest holdout is Internet Explorer, though (well, just like with almost everything else).
I think Google could have done this better: they could have made all AJAX URLs crawlable (with standard robots rules applying) as long as your site opts in. Then you'd only have to support pretty URLs, something you probably have to do anyway.
For example, Twitter now needs to support twitter.com/#!/ev, twitter.com/ev, and twitter.com/?_escaped_fragment_=ev.
How else are people supposed to bookmark or share a URL, in your opinion?
The simple fact is that plenty of sites (like Twitter, Facebook, and lots of Google properties) are using client-side code to build fast interactive sites, and it necessitates this kind of infrastructure.
For instance, use Chrome and browse around the site. You'll notice there are no hashed URLs.
The same thing is done on Flickr. Look at this page: http://www.flickr.com/photos/timdorr/3707685058/in/set-72157... Now click on the sections under "This photo belongs to". As you expand them out, you'll notice that the URL in your address bar changes. This is particularly useful in Flickr because you can use the arrow keys to navigate through photos. And when you link to a page, your personal state might have been browsing through a set instead of a full photostream. This keeps the state intact when sending links to other people. It's a great usability feature.
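The hash-free URLs described above are typically built on the HTML5 History API. Here's a rough sketch of the pattern, assuming a hypothetical `buildPhotoPath` helper and a Flickr-like path layout; none of this is the sites' actual code:

```javascript
// Hypothetical sketch of hash-free navigation via the HTML5 History API.
// buildPhotoPath is an assumed helper, not Flickr's real code.
function buildPhotoPath(userId, photoId, setId) {
  let path = "/photos/" + userId + "/" + photoId + "/";
  // Keep the "browsing through a set" state in the URL itself,
  // so shared links preserve that context.
  if (setId) path += "in/set-" + setId + "/";
  return path;
}

if (typeof window !== "undefined" && window.history && window.history.pushState) {
  const showPhoto = function (userId, photoId, setId) {
    // Render the photo client-side, then update the address bar
    // without a full page load.
    window.history.pushState({ userId: userId, photoId: photoId, setId: setId },
                             "", buildPhotoPath(userId, photoId, setId));
  };
  // Back/Forward fire "popstate"; the app re-renders from the saved state.
  window.addEventListener("popstate", function (e) {
    if (e.state) { /* re-render the photo from e.state */ }
  });
}
```

The browser-only wiring is guarded so the pure path-building part stands on its own.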
This mapping is needed because the fragment string is never passed to the server, so it has to be encoded elsewhere and the query section is the only available place.
The "!" is needed because otherwise crawlers would start fruitlessly hammering all the existing sites that use "#" but don't support the "?_escaped_fragment_=" hack.
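The mapping is mechanical: under Google's proposal, a crawler that sees `#!state` requests the same URL with `?_escaped_fragment_=state` instead. A minimal sketch (function name is mine):

```javascript
// Sketch of the crawler-side rewrite from Google's AJAX crawling proposal:
// "#!state" becomes "?_escaped_fragment_=state", since the fragment itself
// is never sent to the server.
function toCrawlerUrl(url) {
  const bang = url.indexOf("#!");
  if (bang === -1) return url; // plain "#" fragments are left alone
  const base = url.slice(0, bang);
  const state = url.slice(bang + 2);
  // Append to an existing query string if there is one.
  const sep = base.indexOf("?") === -1 ? "?" : "&";
  return base + sep + "_escaped_fragment_=" + encodeURIComponent(state);
}

console.log(toCrawlerUrl("http://twitter.com/#!/ev"));
// → "http://twitter.com/?_escaped_fragment_=%2Fev"
```

Note the `#!` check: a bare `#` returns the URL unchanged, which is exactly why the "!" opt-in marker exists.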
edit: Yes, it does: http://news.ycombinator.com/item?id=1799047
Before this technique was widespread, it was all too common to hit "Back" and be taken to the previous website you'd viewed, even if you'd viewed more than one page on the original site.
By changing the URL's fragment, the website can add entries to the browser's history. That way, when the user clicks Back, the website can react appropriately, stepping its state back one iteration. This preserves the expected behavior for the end user while letting the website leverage the benefits of AJAX.
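The fragment-based history trick above can be sketched like this, assuming a single-page app that encodes its state after `#!` (the names are illustrative, not from any particular library):

```javascript
// Sketch of fragment-based history for an AJAX app.
// "#!/ev" -> "/ev"; anything else falls back to a default state.
function stateFromHash(hash) {
  return hash.indexOf("#!") === 0 ? hash.slice(2) : "/";
}

function render(state) {
  /* update the UI for this state (stub) */
}

if (typeof window !== "undefined") {
  // Setting location.hash pushes a history entry without reloading...
  const navigate = function (state) { window.location.hash = "!" + state; };
  // ...and Back/Forward fire "hashchange", so the app can restore
  // the previous state instead of leaving the site.
  window.addEventListener("hashchange", function () {
    render(stateFromHash(window.location.hash));
  });
}
```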
The shebang (#!) is merely a way to distinguish AJAX links from bookmarks, for the purpose of crawling.
> Before this technique was widespread, it was all too common to hit "Back" and be taken to the previous website you'd viewed, even if you'd viewed more than one page on the original site.
Uh yes, the original poster of the question is aware of that:
> As far as I can recall, earlier this year it was just a normal URL-fragment-like string (starting with #), without the exclamation mark.
His question was not about the hash, but specifically about the shebang.
There, a Twitter employee confirmed that they used the hashbang to comply with Google's proposal.
Are invisible URLs (for anchor URLs) still frowned upon by Google?
It was a major PITA because you basically have to implement URL routing on both the client and the server side.
The libraries I have used (jQTouch and Sammy) serve a single page, and the client interprets the anchor. Thus there is no need to implement URL routing on the server: you simply serve the application and let the client route the URL.
When you change states on the client, instead of changing state and then changing the anchor, you change the anchor and that triggers an event which causes the client to interpret the URL and change its own state. Thus, there is only one piece of code, ever, that handles routing.
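That "change the anchor, then react to it" pattern can be sketched with a single route table that handles every navigation, whether it came from code or from the Back button (route names here are made up):

```javascript
// Minimal sketch of anchor-driven client routing: one route table,
// consulted by one handler, is the only routing code that ever runs.
const routes = {
  "/": function () { return "home"; },
  "/photos": function () { return "photo list"; }
};

function dispatch(hash) {
  const path = hash.indexOf("#!") === 0 ? hash.slice(2) : "/";
  const handler = routes[path] || routes["/"];
  return handler();
}

if (typeof window !== "undefined") {
  // State changes only ever set the anchor; the hashchange handler
  // is the single piece of code that routes.
  window.addEventListener("hashchange", function () {
    dispatch(window.location.hash);
  });
}
```

Because both programmatic navigation and the Back button funnel through the same `hashchange` event, there is no second routing path to keep in sync.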
Thanks for the Go game link; I learned about some "fun" variants of the game I didn't know before. I'll try them next time I play!