HTML5 provides the new history.pushState API, which provides all the functionality hashbangs are being used for, without the hashbangs: http://www.kylescholz.com/blog/2010/04/html5_history_is_the_.... The fact that a new feature has been added for it goes to show that it's something necessary - Facebook, for instance, uses history.pushState where it's available, and falls back on hashbangs where it isn't.
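A minimal sketch of that fallback pattern (names like `navigate` and `hashbangFor` are illustrative, not Facebook's actual code): use history.pushState where the browser supports it, and encode the same state in a hashbang fragment otherwise.

```javascript
// Turn an app path into a hashbang fragment for old browsers.
function hashbangFor(path) {
  // "/photos/42" -> "#!/photos/42"
  return '#!' + path;
}

// Navigate to an app state, preferring real URLs via pushState.
function navigate(path) {
  if (typeof history !== 'undefined' && typeof history.pushState === 'function') {
    // Real URL, no fragment needed; the back button now walks app states.
    history.pushState({ path: path }, '', path);
  } else if (typeof location !== 'undefined') {
    // Fallback: encode the state in the fragment instead.
    location.hash = hashbangFor(path);
  }
}
```

With pushState the back button walks through app states as real URLs; with the fallback, the same states live in the fragment.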
It's often a very important thing to be able to do; otherwise, if you have (for example) multiple tabs or other frontend navigation in your web app, pressing back would have to take you right out of the application, rather than back to the last place you were.
Why do we need a back button?
The DOS and Unix command lines never had a back button. Lotus 1-2-3 and WordStar and Word and Excel never had a back button. Photoshop and GIMP never had a back button. SQL scripts and database servers never had a back button.
Many of these programs do have application-specific implementations of an "undo" function. Perhaps that is the way to go for future software, rather than trying to wedge a universal "back" function into applications that don't really map to that paradigm? What does "back" even mean when the application is a game, for example? Flight Simulator and Ultima and Doom never had back buttons.
The web is a platform for modal applications with addressable state. To meet the expectations of users, any app on the web must somehow shape itself to that model.
BTW have you ever used a navigation based app that didn't have a back button? Try the native YouTube app on iOS. It's infuriating.
Of course, but not in all applications and situations. What does it mean to link to the middle of a Farmville game session? It's by definition a transient state within a process that does not need re-linkability or history. Similarly, we've been using desktop applications forever with no notion of addressability. You don't send an image with the expectation that the recipient can jump into the middle of your editing actions in Photoshop.
Nobody expects a "back" or "undo" button in everyday activities like driving a car or cooking or making social conversation. How did this expectation become so irreversibly fundamental in computers?
Wow. We're only just now getting to the point where it isn't possible to accidentally render your computer unusable with a few stray actions.
You seem to be forgetting just how unforgiving computers were even 8 years ago, and why iOS devices are so enticing to regular folks.
Don't conflate the two concepts. Undo good, back... not always required?
But most apps can be broken into at least a few states. For any document based app, like Photoshop, navigation should probably move between document views. Any modal dialogs should be closable with the back button. You don't have to use it for everything, it just has to do something sensible and non-destructive at any given time.
You don't send an image with the expectation that the recipient can jump into the middle of your editing actions in Photoshop.
People make mistakes. All people, in everything they do, at every level of responsibility.
"Back" or "undo" buttons in everyday activities like driving a car or cooking or making social conversation would be astoundingly useful. In your first example alone, literally tens of thousands of lives would be saved every year.
The expectation is irreversibly fundamental in computing because in computing, unlike reality, it is actually possible.
* If the back button behaviour is important, perhaps not breaking it in the first place is an appropriate approach
* The addressability of resources is one third of the fundamental basis of the Web. Ignoring it is essentially non-web, so I suggest the label "web-app" is incorrect.
* The user expectation of clicking the back button but remaining on your one-page website sounds like a usability problem brought about by applying a non-web-like interface inside a web browser. Perhaps a better user experience is not breaking the well-understood web-navigation model, and thus not confusing the user. Or perhaps these "web-apps" need to make it clear to the user that they deliberately don't follow the characteristics of a web resource?
- The web application could serve static pages and then update them with AJAX
What if the user clicks back /then/? All the AJAX updates would be lost, and they'd go back to an older version of the page - they'd then see the data on the page "jump" to the updated version.
- The web application could not use AJAX at all, and serve everything as static pages, without automatic updating
Some web-applications can't work like that. Mine has instant messaging, and I want the IM windows to be there when it loads. Besides, most users /like/ AJAX. They're not thinking "oh no, this application ruined the web", they're thinking "nice, the page updated itself".
Which of these are you suggesting developers do instead? Why is the HTML5 history/hashbang URL model not better than all of these?
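To make the trade-off concrete, here is a hedged sketch of what the history model buys: each AJAX update pushes a state snapshot, and "back" re-renders from the previous snapshot instead of surfacing a stale page. The stack is modeled as a plain object so the logic is visible without a browser; in a real app, push() would wrap history.pushState and back() would be driven by the popstate event.

```javascript
// Model of an app-level history stack (illustrative names, not a real API).
function createAppHistory(renderFn) {
  var stack = [];
  var index = -1;
  return {
    push: function (state) {
      // Discard any "forward" entries, as a browser would on new navigation.
      stack = stack.slice(0, index + 1);
      stack.push(state);
      index += 1;
      renderFn(state);
    },
    back: function () {
      if (index > 0) {
        index -= 1;
        renderFn(stack[index]); // re-render the older snapshot
      }
      return stack[index];
    }
  };
}
```

Pressing back then replays the previous snapshot rather than reloading an out-of-date page and watching the data "jump" forward.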
There are already internet protocols for real-time notifications, like XMPP and a whole host of non-open protocols. Re-implementing messaging on a platform that isn't by its nature a push platform will always be inferior to the real thing.
Wouldn't it make more sense to implement a web-like application on-top of a real-time messaging platform instead of the other way around?
(Though, it's not clear what a back button should be doing in an "instant messaging" interface - will it undo a conversation? Does a user bookmark a message or a conversation so they can edit/refine individual messages later?)
"Which of these are you suggesting developers do instead?"
I'm suggesting developers use the right tools and platforms for the job.
Well, it's pretty clear what you mean now, and I guess we'll just have to agree to differ. Real-time AJAX applications in the web browser are the future, and I know I'm not the only one who thinks so.
Also, if you think I'm trying to say the back button should do anything to an IM conversation, you're misunderstanding the whole issue.
That's a pretty broad generalization. On Twitter for example I'd say they're intentionally trying to make it harder for you to have a URL to reference an individual tweet.
Why should we be forced NOT to use a technology that provides superior response and productivity compared to what was traditionally possible? Why force us to live in a world with server round trips for everything, just so we can say we perfectly fit within an artificial restriction defined 20 years ago, before any consideration was given to the Internet being used for more than viewing linked documents?
There still exists the 'content web', suitable for browsing and navigating content! For Wikipedia and the plethora of blogs carrying content, it remains business as usual, with no need to change.
There are other apps that provide enhanced functionality - Gmail, Google Maps / Docs / Calendar, Zoho Writer, Grooveshark, etc. - delivering, to anyone with an internet connection and a modern browser, a far superior experience to any set of hyper-linked documents.
You might be overestimating the differences between Flash and JS here -- especially when you add HTML5 into the mix. What's to stop a skilled JS dev from playing <audio> when the user hits a page, burning processors with a few autoplaying <videos>, after forcing the user to sit through an interminable, animated <js/css3/svg> 'loading' screen? At that point, it's up to our own good tastes to decide how to wield that kind of power.
Then their site doesn't get return visitors and starts dying, just like Flash is.
Most JS use is sparing and lightweight and more about interaction than action/animation. Most Flash use is heavy-handed, ham-fisted, and overbearing, done more for cosmetic effect than additional function. Is this a part of the technology? No, but it certainly is nurtured by the differences between the development environments — Flash is still sold as an animation studio! What's more, JS gets to manipulate the DOM directly, giving it a much nicer relationship with the typical web experience than Flash, which is effectively a replacement for the DOM.
Have you used Twitter or Facebook on resource constrained machines? Even on latest nightly builds of Chromium it's painful on my Dell Mini netbook.
Imagine what Gmail would be like if they went overboard with CSS.
Who said anything about forcing? The article just seems to be asking what people think... has anyone actually talked about forcing developers to step backwards from using JS?
I always love taking philosophical guidance and development advice from people who can't even get their own shit together.
You mean the web. Do not confuse the web with the internet.
It makes life difficult for us by making the browser into a huge ball of complexity and churn. (I still remember the days when the people who wanted to use the web as an applications-delivery platform implored people to upgrade to Internet Explorer 5 to make their lives easier, and of course they were not satisfied with IE 5 for long.) The vast majority of the crashes, unresponsiveness and mystifying software behavior on my Linux and OS X systems come from the graphical browser.
In other words, how do we get it all, future-proofed, without so much frustration and so many debates?
Having one URL per piece of content is especially important in this age of sharing and aggregation, where you don't want your pagerank / number-of-shares / other metrics spread equally across two different URLs.
But it would be nice to be able to degrade nicely, and still support some degree of ajax with linkable states. Beggars can't always be so choosey, I suppose.
1. browser: get /something; referrer /anotherpage
2. server: here's the diff data only
I guess one of the problems is that Google only pulls down content + markup. It would be nice if there was a way to provide machine-optimized content to crawlers for indexing, basically telling the crawler "here's the data in XML format, and here's a link to the data in human-interactable format". It's sort of what Google's escaped_fragment URLs do, but escaped_fragment still expects content + markup.
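For reference, the escaped_fragment mapping mentioned here can be sketched as a small transform (a sketch of the crawler-side URL rewriting in Google's AJAX-crawling scheme, not a complete implementation): everything after `#!` gets URL-encoded into an `_escaped_fragment_` query parameter.

```javascript
// "#!/photos/42" becomes "?_escaped_fragment_=%2Fphotos%2F42".
// Non-hashbang URLs pass through unchanged.
function escapedFragmentUrl(url) {
  var i = url.indexOf('#!');
  if (i === -1) return url;
  var base = url.slice(0, i);
  var fragment = url.slice(i + 2);
  var sep = base.indexOf('?') === -1 ? '?' : '&';
  return base + sep + '_escaped_fragment_=' + encodeURIComponent(fragment);
}
```

The server is then expected to answer that query with the content + markup the fragment state would have produced - which is exactly the limitation described above.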
That doesn't have to be by malicious intent, it can be enough that a backend developer adds some fields, and the template doesn't use them.
On the other hand the machine-exposed data misses headlines that are usually present on all web pages, so that if the user searches for <company name> + <keyword>, special care must be taken that the <company name> part doesn't fail.
But it's academic in any case; any web that anyone tried to engineer without something along the lines of Postel's law would have been outcompeted very early by one that followed it. A strict web would have kept too many marginally qualified people out and would have had much smaller returns owing to lack of scale (Metcalfe's law etc.). Keeping marginally qualified people out means they are strongly discouraged from learning (instead they give up), and that would throttle any possible growth.
Any developer working on any application has to make decisions and trade-offs about platforms, user experience, accessibility, and a hundred other things. If something else is more important than making the back button work or having pretty urls, then who cares? Focus on what improves your app the most, not adhering to tradition. It breaks the web? Get real, it doesn't break anything. It just has other priorities. The only reasonable barometers for whether something works are user adoption and satisfaction.
The hash should look suspiciously like the fragment that non-ajax agents would see.
I say this not as a dogmatic should, but because I find separating concerns makes it easier to develop, and preserves web-ness. Ajax is an enhancement, not an architecture.
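That "hash mirrors the fragment" convention can be sketched in one line (an assumption about the URL scheme, not any framework's API): strip the `#` (or `#!`) and you should get the very path a non-ajax agent would request from the server.

```javascript
// "#/users/5" or "#!/users/5" -> "/users/5": the ajax client's fragment
// names the same resource a non-ajax agent would fetch directly.
function pathFromHash(hash) {
  return hash.replace(/^#!?/, '');
}
```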
If every website was a blog, then sure - addressability, archiveability, searchability are paramount. If you're building the control panel for a nuclear power plant, you have a completely different set of concerns.
The moment you accept that there is The Web and there are web apps and they're not the same thing, you're right in the middle of the debate Shiflett is talking about. The down-with-hashbang crowd doesn't acknowledge any such difference.
Specifically, there are at least some which use ASP.NET in their SCADA control systems (although they are not connected to the internet).
I have performed security assessments on several of these.
(BTW, that's a pretty cool site.)
I think it's a disservice to common users of any site when the address becomes an opaque mess of seemingly random characters.
What is truly amazing is that there actually are other solutions that still offer best practices like progressive enhancement and graceful degradation, with less effort. This article goes into the issues and the alternatives:
Unfortunately, there are two reasons why this assumption falls apart. First, there are a lot of non-Computer Scientists out there writing software. Second, the push for everyone-gets-a-degree credentialing and the cries of "we're short on qualified candidates" in industry have led to a serious erosion of standards in universities; either the topics aren't being taught or students are passed without the lesson being absorbed.
On the first point, I think this is both a good and a bad thing. It's good because it's democratic. It means that more people are finding their own way, it means that systems are getting more intuitive, it means that barriers to entry are being torn down. Literacy is good for everyone. But it's also bad because the actually trained Computer Scientists and Software Engineers aren't focusing on keeping this process going. Instead, they just want to open consulting firms and roll out yet another CRUD business-intelligence web app. Concerning, to say the least.
On the second point, there is a serious problem with the number of people we are graduating through universities. And I don't care what the top universities are doing; for each one of them there are 50 more low-end state schools and technical schools churning out people. I know at my university, I received an education in such Software Engineering topics. BUT--and this is a big but--it was all but unrequired knowledge. I can't say any of it was truly "required", because I ended up working with quite a few of my classmates several years later, and they had forgotten it all. They stood in awe of my prowess as a programmer, when all I was doing was using the same lessons we had all received. If you were good enough at programming to avoid basic syntax errors, you were good enough to get through to the end with a barely passing grade. And few hiring managers care about grades.
These people that I graduated with have benefited from the democratization of programming as well. We should make this relationship more explicit. Keep Computer Science pure and only graduate people who have the chops. Encourage people who want to write websites to go into Graphic Design. Put programming on equal terms with algebra and creative writing. We all have to study it, and we don't all have to be experts, but the people studying the subject specifically had better be held to the highest standards.
The only way this problem will be resolved is for more of your "true" computer scientists to start working on browsers. However, at the moment, many of them are making too much money at their consulting firms fixing the websites created by graphic designers.
HTML 5 is the right answer to the wrong problem. The problem is not how to make HTML more expressive and more powerful. The problem is how to serve applications to users. We need an open standard for WebStart/ClickOnce, not a hack on top of a hack on top of HTML to make web pages appear like applications (with every web designer's own notions as to how to render UI elements and display notifications and control flow, etc.).
Next time you see a jQuery modal dialog, ask yourself: "why did someone at the client level have to write this code? Why wasn't it part of the platform?" Why do we have competing implementations of modal dialogs in web apps, in this day and age? And it's often implemented as a severely nested DIV in a section of markup that has no semantic relation to the modal itself. Semantic markup, for sure.