If your career is currently bet on apps, I'd be sure to diversify in your spare time into something not-app. (Said not-app thing may also be useful in apps; I'd just avoid specializing extensively and solely on things that are only useful in the app context.)
That's certainly possible, but here we are in 2012 and offline use of web apps is still atrocious. To say nothing of how inefficiently web traffic currently maps to the realities of cell phone networks.
And then there are user interface concerns (website support for multitouch? camera use? video capture? audio capture?). And everywhere, the all-too-familiar bugbear of disparate specs and support for any and all would-be extensions to the web.
I personally think the odds are higher that we'll see another method of computer use rise to prominence, one that shakes up everyone's expectations of what an Everywhere Web needs to be able to do.
That's ultimately where we're headed and ultimately why the web site vs app distinction is moot. The future is services with many interfaces. Web interfaces, desktop OS interfaces, mobile interfaces, even interfaces to other services -- whatever makes sense so that all of its features are ready for you anywhere and everywhere you might want them.
Heads-up displays? Maybe. But I'm thinking more like presence-aware general computing resources. Think Apple TV's AirPlay, but for further computing tasks. Walk over to an all-in-one and it becomes an extension of your workspace: more display, more resources, a better connection, and all your data being sent to it as needed.
Right... because the browsers aren't there yet. They absolutely could be, though. Maybe Apple will stop Safari from getting there, but they have yet to show any indication that they would do that.
App ecosystems, however, will continue to evolve over time. I see most of the big players holding a foothold on the industry for a median of about ten years until the next player out-innovates them and manages to gain significant market share.
In all likelihood, though, we'll be way off. I should save my HN posts so I can laugh at myself in 20 years.
> If your career is currently bet on apps, I'd be sure to diversify in your spare time into something not-app.
People still make plenty of money doing desktop apps in this day and age. "Computer stuff" is a big enough field that there are plenty of niches for everyone.
"To be fair, eBay is struggling under the massive accumulated design debt of a website originally conceived in the late 90s, whereas their mobile and tablet app experiences are recent inventions."
We have done a bad job up until now of meeting users' expectations that the web should have a great, simple UI and work offline. We just need to improve, much as the slow transition from desktop apps to web apps has been going. I think it's inevitable that we will build most of our technology for the web.
> I think it's inevitable that we will build most of our
> technology for the web
Our connectivity is getting better, but it is never going to be 100% failure-free, and latency is not going to disappear.
Maybe UIs will start acting more like a browser and let me find apps easily. Maybe browsers will just present apps with all the same power apps have. Maybe it doesn't matter.
I would suggest the essential difference may be found along an axis of user input that has data entry at one end and pure selection at the other.
I'd further suggest that as this technology evolves the emphasis shifts from data entry to selection and interaction with technology becomes less about inputting data and more about making choices.
What data is needed is grabbed from your device's sensors; you don't actually need to enter it.
Ultimately what we're trying to build is a device that knows what you want (before you do!) and serves it up to you with the greatest efficiency.
The best thing is having multiple ways to do things. Which the iPhone provides, of course.
One especially nice one is how double-pressing the home button presents icons for your running apps at the bottom of the screen. You tap one to re-enter the app, or press and hold for a way to kill it.
Google/Yahoo/Bing are still the jump-off point for most desktop web sessions. Because applications are sandboxed and don't play nice with crawlers, there's still a huge advantage for websites with efficiently structured and persistent information to capture user demand.
While applications do offer significantly improved performance and better payment workflows, an app's infrastructure / OS is unfortunately only as good as the restricted set of people (in the case of private companies, employees) who govern it.
Apple is an amazing company with extremely talented, passionate people and they have built an incredible ecosystem. But they will eventually fade into history, just like everything naturally does.
Luckily, the residue of their design will persist and drive future enterprises to continue to innovate on this front, which in turn will spawn the next generation of Zuckerbergs and Jobses.
Essentially, we're witnessing the exact evolution of development ecosystems that we'd expect within our economic constructs. The only danger is that our legal constructs impede the ability for those outside of the walls to extend the technologies to accelerate innovation for the entire industry.
In the longer term these cycles will become less extreme because we are migrating to a model where every device will always be 'online'. At that point apps will lose a lot of their power.
Another thing that will drive migration to web based applications is that selling a subscription is far more lucrative than selling a piece of software for a fixed amount.
Apps are transitional, and I think they always will be.
They can give a short-lived advantage over the functionality that can be created on new devices, especially if not all the hardware on those devices is supported or accessible by web browsers.
For instance, on the current generation of mobile devices there are all kinds of input devices (GPS, compass, inertial guidance sensors) that are not supported by HTML, but I fully expect future versions of HTML to allow access to those devices.
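Some of that access is in fact already arriving: the W3C Geolocation API and the `deviceorientation` event ship in many mobile browsers. A minimal sketch of feature-detecting them follows; the `availableSensors` helper and its `env` parameter are my own illustration, not a standard API.

```javascript
// Report which sensor APIs a given environment exposes.
// Passing the globals in as `env` keeps the helper testable
// outside a browser (hypothetical helper, not a standard API).
function availableSensors(env) {
  var found = [];
  if (env.navigator && env.navigator.geolocation) {
    found.push("geolocation"); // W3C Geolocation API (GPS, cell, wifi)
  }
  if (env.window && "ondeviceorientation" in env.window) {
    found.push("orientation"); // compass/gyro via deviceorientation
  }
  return found;
}

// In a real page you would call it with the actual globals:
// availableSensors({ navigator: navigator, window: window });
```

In a page, the browser either exposes these objects or it doesn't, so the same check doubles as a graceful fallback for older browsers.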
Falls under the category of articles whose headline poses a question the answer to which is "no".
The number one reason I prefer the web is that it's extensible! If there's a web app I don't like, I can write a userscript to modify it to my needs. The web has a standard API (HTML/JS/CSS) and a means for extension so I'm free to make the web suit me. This freedom is something that I miss when I'm using a mobile device and I hope it never goes away.
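As an illustration, here is a hypothetical Greasemonkey-style userscript; the `.item` selector, the site URL, and the `shouldHide` helper are all made up for the example.

```javascript
// ==UserScript==
// @name  Hide sponsored items (hypothetical example)
// @match http://example.com/*
// ==/UserScript==

// Pure decision logic, kept separate from the DOM so it is easy to test.
function shouldHide(text) {
  return /sponsored|promoted/i.test(text);
}

// The DOM pass only runs where a document actually exists.
if (typeof document !== "undefined") {
  var items = document.querySelectorAll(".item");
  for (var i = 0; i < items.length; i++) {
    if (shouldHide(items[i].textContent)) {
      items[i].style.display = "none";
    }
  }
}
```

A few lines like this can reshape any site you visit, which is exactly the freedom native mobile apps don't offer.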
Imagine if someone found your phone with email logged in. They would immediately be able to scan through your email for registration confirmation emails, go to those sites, and reset the passwords, giving them complete access. If one of those accounts was for a site which saved CC details and had something the thief wanted (or could sell), they can drain your card without setting off any fraud warnings, because you wanted the convenience of not having to type your password every time.
I agree that it's troublesome that people have their phones set up so that anyone with physical access to an unlocked phone has access to everything. But I'm inclined to think of that as a symptom of the problem rather than the problem itself. The root of the problem being that too many folks who draw up security schemes don't seem to grasp the most basic lesson about how people deal with security: When given a choice between excessively inconvenient security and no security, your average user will always opt for no security. If that's not an option by default, they will figure out a way to make it an option, and then opt for it. (Sticky notes, y'all.) If there's no way to make it an option, they will go find someone who lets it be an option, then opt for it.
One great thing about non-biometric authentication systems is that it's easy to replace a compromised keycard or password. Replacing your own fingers, not so much.
Aren't we talking about alternatives to conventional password entry because it is a great nuisance?
Voice recognition, face recognition, and fingerprinting all seem reasonable alternatives. I do believe it is only a matter of time before they become viable in smartphones.
How about an RFID chip in your wristwatch which logs your smartphone in? Outside of a meter or two, it asks for a complex password instead. Why not? I will buy it :-)
First came applications. In the mid-90s Microsoft was terrified that the Internet and Netscape would kill the Windows/Office golden goose and they (fairly successfully) subverted the Internet through browser fragmentation.
The advantage of the Web was that it wasn't OS-specific. Microsoft wanted (wants!) you to be locked into their platform.
The 2000s saw the rise of the RIA (Rich Internet Application): one-page sites like GMail, etc. (although they aren't always strictly one page). The core idea here is that even though performance was (is?) bad, increasing computer power will solve that problem sooner rather than later.
Let's face it, HTML/CSS/JS is a pretty terrible solution. Browser/OS differences are endemic. It's slow. Modularization (of a Web app) is awkward at best. Offline is incredibly awkward.
What caught people by surprise was mobile. Unlike a desktop, power usage and size became far more important than raw CPU power. Uh oh, Moore's Law no longer to the rescue.
You can be pedantic about J2ME apps (or whatever) predating iOS apps but let's face it: Apple popularized and commercialized the idea of apps even if they didn't outright invent them.
While the rise of the mobile app may appear quick, the pedigree of iOS in particular goes back 15 years. It's really an amazing set of APIs. At the same time, Apple has largely avoided fragmentation issues.
So what makes the app market successful on mobile is:
1. Easy to purchase, install and update. You cannot discount the lack of friction in purchasing apps. It is (IMHO) incredibly important;
2. Much better performance both online and especially offline; and
3. Ease of discovery.
Apple may not have been the first to recognize it but they've also embraced this same strategy on OSX. Google (disclaimer: I work for Google) has the Chrome Web Store. Microsoft is essentially copying the OSX App Store for Windows 8.
I don't see any doomsday scenarios about the Web going away. That's just linkbait. If anything, what I see will happen is consolidation. Now instead of producing just a Website, you need an app (or, preferably, several apps for the different relevant mobile OSs).
Take Newegg. The website is still as good as ever but honestly it's a joy to use their app on the iPad, so much so that I will have trouble buying my parts from anywhere else.
Apple has recognized the need in the modern computing environment to essentially sandbox everything. The Microsoft of old used to take as gospel the need for backwards compatibility so always avoided breaking changes. Google too has realized this to a degree (websites are sandboxed).
Personally I believe the dark horse in that race is Chrome's NaCl (Native Client) as it combines the delivery of the Web (to Chrome at least) with the speed of native applications. Time will tell.
But please do me one favour and quit it with the linkbait-y "apps will kill the Web", "Apple's/Facebook's walled garden will ruin everything", etc. Fears of the worst are nearly always overblown.
You must be new to Coding Horror.
I appreciate upholding high standards.
Compared to what? Not having to purchase, install and update at all? Lack of friction compared to not having to travel the mile at all?
> Ease of discovery
I think "read tweet, click link" is as fast as it can get. And still, the most-used mobile app is the browser.
My personal opinion is that native apps that have a WWW equivalent are doomed: their appeal rests on the fact that they can still do "cool" stuff by taking advantage of the touch interfaces that the web was not designed for. Otherwise, they are indistinguishable from desktop apps: games and other heavy-duty processing. The web is not going anywhere.
If I were a startup, I would try to iterate on a single platform (à la Instagram) and get my product right, then move to new platforms. It takes more effort to get the product right than to implement it on several platforms. If my product is successful, supporting it on a new platform will take 1% of the money and effort I could raise. Who cares a hoot if you have a half-baked product released at once on 100 platforms?
Do I think it is the best suited technology to all the things we are trying to do with it? - Absolutely not, but I wouldn't dare invest myself in anything that competes with it.
I want to have applications that integrate with the operating system, and take advantage of its features.
No more web projects where the customers try, without success, to replicate a desktop UI inside the browser, full of CSS dark magic and browser idiosyncrasies, please.
If you want a desktop-look-alike web page, then create a native application for it.
I guess it's natural that we want to push boundaries, but maybe Gopher wasn't such a bad idea? :)
If apps really do "kill" websites, or make a significant dent in them, this could be bad news for Google's business model, as well as anyone else who relies on adsense income.
HTML was invented for web documents. We are now using documents to create web apps. It's simply difficult.
This is something I've been thinking about when redesigning the 99bikes.com.au ecommerce website. How to bring the app experience to the web.
I don't think apps will kill websites, but should influence the way websites work.
A great example of an app-style ecommerce site is fab.com.
That's probably quite a bit farther down the road, though.
You have less real-estate so you have to focus on delivering what matters. Less is more!
I believe that the .NET-, Objective-C- and Java-based languages/platforms have dated core characteristics that make them disadvantageous, on top of requiring extra coding. So even if people aren't developing mobile apps on HTML5 or HTML5-based platforms, they are likely to move towards some other layer above the native one based on the economics alone.
It's difficult to predict what will happen for sure, though. Personally, if I had my way, a lot of the current stack would be dropped immediately: CSS would be first to go. The concept of coding markup directly would be dropped; instead, interactive tools would be used for creating and maintaining UIs and other application structures. Those tools would exchange data in the simplest possible common formats that could be agreed upon, and the common formats across tools would extend to include semantic descriptions of components above the implementation level. ASCII source code would fall out of favor, replaced with more interactive structural and semantic application editors. HTTP, the web, and the traditional client-server model would be dropped and replaced with content-oriented networking integrated with the common application and data-exchange formats. Browser-type applications/sites and desktop applications would converge so that web applications can integrate with the desktop environment.