Linus Torvalds on new Chromebook Aura UI (plus.google.com)
201 points by simanyay on May 1, 2012 | 152 comments



I think a lot of people are somewhat missing the point of what Linus is saying.

He's really just addressing the elephant in the room by saying that, so far, the whole open source desktop mess has failed to produce anything that looks like a compelling, easy-to-use desktop.

If Google can produce a Linux-based operating system that addresses the needs of casual users and power users/technophiles (like Linus) and get it on enough hardware (i.e. not just netbooks but "serious" desktops as well), then they can (combined with Android) simply eat everyone else's lunch.

What is not necessarily so important is whether or not local storage or local apps will "go away", but that Google understands that a modern OS needs to treat "the cloud" (i.e. web apps, online storage, social networks) as first-class citizens inside the OS. This is a metaphor that, in a really intuitive form, is mostly missing even from new OSes like Windows 8.

Hard disks are not dead, but I think the idea of having C:\Program Files\.. is. Even as a technical user myself, the directory hierarchy is mostly just an abstraction that gets in my way. My work (i.e. source code) is in git, my games are in Steam, my email is at Google, my music on Spotify, and all the "other stuff" is in Dropbox.

What I need is an OS that ties all of these things together seamlessly, regardless of whether the bytes themselves reside on the disk inside my computer or on some website somewhere, but can provide sufficient tools when needed to fix leaky abstractions.

Let's suppose you could soon buy a new workstation that ran a nice clean Chrome OS desktop but had sufficient memory and hardware-level virtualization that clipping in a "proper" Linux, or all the MS libraries such as DirectX needed to run Windows apps, was a trivial activity that could be abstracted away from the end user if required. Would you not be curious to buy one?


I'm not quite sure what you're trying to argue for. You seem to want an integrated way of accessing remote resources (a la SMB but for extremely remote resources), but then you seem to be arguing in favour of hiding the directory hierarchy entirely (forgive me if I'm mistaken here).

With regard to remote resources, there are currently various FUSE plugins which can present remote resources in the filesystem (e.g. flickrfs). There are of course issues with the abstraction (how do we represent tags in a hierarchy?), but the use of these plugins is mostly transparent to the user (ignoring performance). It's certainly possible to build a layer atop these that would integrate better with the system (e.g. add a Facebook selection to the open dialog).
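The flickrfs idea can be sketched in a few lines. This is purely illustrative: a real plugin would subclass `fuse.Operations` from the fusepy package and be mounted with `FUSE()`, whereas here the "remote service" is a plain dict and names like `RemotePhotoFS` are hypothetical, so only the shape of the path mapping is shown.

```python
# Illustrative sketch of the mapping a flickrfs-style FUSE plugin performs.
# A real plugin would subclass fuse.Operations (fusepy) and talk to a web
# API; the dict below stands in for that remote service.

class RemotePhotoFS:
    """Present remotely tagged photos as /<tag>/<photo> paths."""

    def __init__(self, remote):
        # remote: tag -> {filename: bytes}, standing in for an API client
        self.remote = remote

    def readdir(self, path):
        # "/" lists tags; "/<tag>" lists the photos carrying that tag
        if path == "/":
            return sorted(self.remote)
        tag = path.strip("/")
        return sorted(self.remote.get(tag, {}))

    def read(self, path):
        # "/<tag>/<photo>" fetches the photo's bytes from the remote store
        tag, name = path.strip("/").split("/", 1)
        return self.remote[tag][name]

fake_service = {"holiday": {"beach.jpg": b"JPEG..."},
                "work": {"whiteboard.jpg": b"JPEG..."}}
fs = RemotePhotoFS(fake_service)
print(fs.readdir("/"))               # ['holiday', 'work']
print(fs.read("/holiday/beach.jpg"))
```

Note the tag problem immediately shows up: a photo carrying two tags appears under two directories at once, which a plain tree can't represent without duplication.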

You also seem to want to conflate applications with the data they manage. For a large number of users this is fine (iTunes already is their music, IE is the internet, etc.), but for many others this is problematic (how do you move data between these applications without a common intermediary? Consider scanning an image, touching it up, then including it in a document: where do the original and altered versions live, and how are they accessed by applications?).


FUSE plugins such as the ones you mentioned could be a fine enabling technology for this, or at least a stepping stone. As you suggested, tags might not cleanly fit into the hierarchical system, so perhaps a simple tree hierarchy is not the way to go for organising data at all. How we actually map things into the file system is a difficult question, since there will need to be some common interface at some level. This could be a relational system, a key/value store, or perhaps something more like a graph.
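As a toy sketch (not any particular system's design), a tag-oriented key/value layer might look like this. A "folder" becomes a query over tag sets, which is exactly what a strict tree cannot express.

```python
# Toy illustration of why tags fit a key/value store better than a tree:
# each tag maps to a set of object ids, and a "folder" is just a query
# over those sets rather than a fixed location.
from collections import defaultdict

class TagStore:
    def __init__(self):
        self.by_tag = defaultdict(set)   # tag -> {object ids}

    def add(self, obj_id, *tags):
        for t in tags:
            self.by_tag[t].add(obj_id)

    def query(self, *tags):
        # Objects carrying ALL the given tags; in a tree, the same file
        # would have to live in several directories at once to allow this.
        sets = [self.by_tag[t] for t in tags]
        return set.intersection(*sets) if sets else set()

store = TagStore()
store.add("report.pdf", "work", "2012")
store.add("beach.jpg", "holiday", "2012")
print(sorted(store.query("2012")))          # ['beach.jpg', 'report.pdf']
print(sorted(store.query("work", "2012")))  # ['report.pdf']
```

A FUSE layer could then render each tag query as a synthetic directory, giving the familiar interface on top of the non-hierarchical store.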

Conflating data with applications is an interesting issue. On the one hand, I feel there may be a move towards this; for example, it would be advantageous to (say) Apple for every photo you took on your iPhone to be tied into the iCloud ecosystem, never to escape.

On the other hand, storage is such a commodity that there's no reason you couldn't hold your data where you want but hook into the functionality of a particular application to organise it.

For example, maybe you use Gmail to look through your archives of old emails from a previous provider that are stored on your HDD. Spotify already does something like this with its local music functionality.


He's really just addressing the elephant in the room by saying that, so far, the whole open source desktop mess has failed to produce anything that looks like a compelling, easy-to-use desktop.

If Google can produce a Linux-based operating system that addresses the needs of casual users... [they can] simply eat everyone else's lunch.

He also addresses the elephant in the room that the major Linux UI pushes that consciously and grandly targeted those goals were grand steps backwards. That the things the UI prognosticators think users want, like abstraction and simplification, alienated real Linux users like Linus. That normal users missed features the UI cognoscenti banished as "too confusing," such as "easy mouse configurability for things like how to launch applications."

When you talk about abstraction, metaphors, making the filesystem invisible to users, and (most telling of all) appealing to "casual users" it sounds like more of the same talk that we've had for many years now. The more grand and self-conscious the community gets about user interfaces, and the more they target non-techie users, the less useful the desktop environments become for their actual users. That missing configurability that Linus laments, which was sacrificed for the sake of "casual" users -- did we really attract a bunch of casual users? Did we eat anyone's lunch, take market share from OSX or Windows? Or did we just make our desktop environments less usable for the people who actually use them?

Linus says GNOME was "useless" without that configurability, and that this new device might be usable as a laptop if it had a terminal and a development environment. That doesn't sound like a call for Linux environments to appeal to "casual" users. I don't think Linus is asking for even more simplifications to get in the way of his usage, like abstracting away the filesystem.

The elephant in the room is that Linux desktop environments did a better job of giving Linux users what they wanted when that's all they tried to do. Trying to give us what we didn't know we wanted has been a failure. Targeting non-techie users has gained us nothing that we didn't already want for ourselves. It's time for the community to get off its high horse about UIs and go back to what it's good at, which is humbly (and very successfully) catering to its own needs. And nobody should feel ashamed of creating or using an operating system that never eats Apple's or Microsoft's lunch (isn't that such a ten years ago obsession?)

(P.S. You know what's good at abstracting away the filesystem when and how it's appropriate? Applications.)


I kind of read it as "look at this, the Chromebook was designed with the most casual of casual users in mind and it's still more useful to programmers than GNOME 3".

I agree that applications are good at abstracting away the filesystem, which is why I would argue for a lighter-weight OS (which I presume Chrome OS is).


For most of the 90s I remember hearing how Linux was so stupid because it was derivative and behind the times and never innovated.

Now whenever there is an innovation, it is reviled and everyone asks for the interface which looked roughly like Windows 95.

So I guess they listened to the wrong messages.


What I remember people complaining about in the 90s was that UIs were unstable and it was really hard to get X set up at all. Any complaints they had about UI design were muffled by the tears of joy they shed when they finally saw the black X on the screen instead of junk and patterns. But in all seriousness, I'm not against innovation at all. I'd just like to hear about some innovation aimed at existing Linux users rather than insisting that the users who really matter are the casual, non-techie users necessary for mainstream desktop adoption.

Targeting "casual" users has resulted in some valuable work such as better stability and completeness in many programs, simpler configuration interfaces for many things (such as wireless configuration GUIs that were better than Windows'), and better support for multimedia. However (leaving aside the fact that those were all things that techie users were begging for anyway) there was a downside. Customizability and configuration options disappeared. People made assumptions about "normal" users that implicitly labeled a large chunk of the Linux community abnormal. Case in point: if you configured your wireless connection using NetworkManager and then switched to a different desktop environment, your wireless connection might not work anymore. That might sound like a bug, but it wasn't. It wasn't designed to work that way. How could that be considered remotely acceptable? Because normal people don't use alternative desktop environments. (Thankfully, I've since read that NetworkManager accepts plugins that write to the correct system configuration files, though I don't know if distros test and install them.)

And where's the payoff? What was the payoff supposed to be, anyway? Back in the nineties and early 2000s many people assumed Linux had to make it on the mainstream desktop in order to be successful, but I think we've put that misconception behind us. Yet people still hold up the "casual user" as the gold standard that Linux is supposed to cater to, as if it were a moral imperative. Few Linux users fit that stereotype. It isn't that we're concretely knowledgeable about Linux. If you put ten randomly chosen Linux users in a room, chances are that for every component of a running Linux system there will be somebody in the room who is utterly unfamiliar with how it works or how to configure it. Obviously we should cater to ignorance, if only so we have the luxury of remaining ignorant ourselves. But there is a general savoir faire with computers that it is acceptable to assume. By savoir faire I mean whatever factor it is that explains why I am the person in my family who always gets a call on the phone when someone needs help with Windows 7. Even though I've never developed on Windows, haven't used Windows for anything more than e-mail, web browsing, and Office in ten years, even though I haven't used Windows personally in six months, even though I've never used Windows 7 or Vista at all. They still ask me for help with Windows 7, and usually I can help them.

All I'm saying is that it's acceptable for Linux UI designers to assume that the people using their UIs are likely to resemble current and past Linux users. Take a break from innovating for projected, postulated, hoped-for users. Innovate for the ones we already have instead.


Like the shell, the hierarchical filesystem is valuable as a 'maintenance hatch' for power users and administrators. If it is too much underemphasized then it will suck to work with. Similarly it is very valuable for the functionality you get in a GUI to be exposed at a lower API level and for the GUI to use that API, rather than for APIs to be locked up in GUIs which are incredibly awkward to wrap or automate.


I have a feeling Google will announce some kind of merger of ChromeOS and Android for Android 5.0 at Google I/O, or at least they will announce porting NativeClient to Android.


The most interesting thing to me about each of these little episodes is how Linus, by creating the brilliant Linux kernel, is lauded as a sort-of-oracle when giving his opinion about very non-kernel things. I quite like the simple, opinionated approach to Gnome and Unity but my opinion (and others who may agree with me) matters somehow less because I didn't create this great thing that has nothing to do with UI design philosophy.

I guess that's the true value of geek cred.


I suspect that the people who approve of what Linus says on this matter are the people who already agree with him. Already holding this kind of opinion, it feels good to have some smart, high-profile person say it in an articulate way.

It's true that liking Gnome and/or Unity seems to have a kind of stigma these days.


I think you are wrong. I don't think people listen to Torvalds because he created and leads the development of the Linux kernel. I think people listen to him because he is very active in the free software community, voicing his opinions, writing in a somewhat balanced way and clearly expressing what he believes. People don't always agree with him (obviously the Gnome guys don't).

You can do exactly the same thing: make your voice heard, and state clearly, and preferably factually, what you think is best for the free software stack. People will listen!


You are making some huge assumptions about how people form their own opinions and then using those assumptions to invalidate those of us who disagree with your view. Stop.


It's probably also because we find him 'interesting' (whatever that means). I give my opinion too but it doesn't really matter. :)


I think all the comments about the benefits/detriments of Chrome OS as a whole are a little off topic. I think the most important element of this piece is Torvalds' recognition that "simple" UI doesn't need to limit a user's ability to visually organize.

I completely agree with his comparison to GTK3 (and I think the same applies to Unity). Forcing users into a singular workflow with limited options will always seem somewhat inhibiting, regardless of how "easy to use" the UI may seem. This crucial element is lacking from these modern Linux window managers.


The understated comment in his post that really caught my eye was this:

"Today it decided to update itself to the new chrome version with the Aura window manager."

This will be a boon to the average user, as they'll have constantly up-to-date software, and security holes can get patched as quickly as they're found. That said, I would be concerned with things changing like that without my control, but I expect I'm in the minority.


This is one of my heroes; his opinions have never let me down. No BS, straight talk, one of the very few relevant people in this business who hasn't sold out.

Turning the desktop into a dumb terminal for some company's JavaScript apps is a bad idea, but some can't seem to grasp that.


Anyone have a link to screenshots of the Aura UI?



Looks like Windows 7 if it had OS X's Launchpad.


I bet they wanted to make the UI as Windows-like as possible because that's what most users are used to. The fact that the Chrome icon sits where the 'Start' button sits on Windows clearly isn't a coincidence.


Apparently it does have a terminal (Ctrl + Alt + T).


It saddens me that Chrome OS is switching from full-screen windows to a stacking metaphor. The opportunity to move to a tiling (even just two columns like Metro) paradigm was there and would be a natural extension of the tab metaphor.

Disappointing, although I'm happy to see progress towards an OS that treats web apps as first-class.


Aura does do two-column tiling.


I like the cloud, but I really hate giving up control of my data. It makes a lot of sense for public data (tweets, blog posts), which you want to share anyway.

I see a lot of potential in a "drop": a personal cloud server installed at home, with the same user-friendliness as e.g. Google Docs. Public data/services in the cloud, personal stuff at home. There is still enough opportunity to integrate mmoc? (massively multiple online collaboration?) into those apps. An example of this might be Diaspora.

Many drops make a cloud, too.


I desire something along similar lines, TonidoPlug plus a rich app store. Owned by me. Plug in and log on simplicity. The option to mirror it up on a VPS rented by me if/when it needs a fatter pipe than my home connection. And mmoc (massively multiple online collaboration) instead of Facebook et alia sounds great to me. If someone wants to buy/use data about me, I would like them to have to come negotiate with the source.


While reading the post, I was not sure if the tone was sarcastic or genuine! On a second read, it does feel like a genuine comment. Some folks should use a 'Not Sarcasm' tag!


That's the Linus way. You never know when he's being sarcastic, but mostly he's being serious with a few sarcastic turns.


I agree, especially about the Atom processor. This computer is mid-priced; other laptops in that price range use real CPUs.


I don't think a new UI is enough to make Chromebooks suddenly matter. I got one at a conference and ended up giving it to a friend because the build quality and specs were terrible - it looked and felt and ran crap without even taking into consideration how different they are to use.


What's sad is that they couldn't get the original Chrome OS to feel any faster in use than Windows 7 on Atom.


I've never had the chance to try a Chromebook yet, so does anyone know whether it has a native terminal and a good text editor? I would love to turn it into a web development environment.


I've been using the first-gen Chromebooks since they came out and have to say that the UI is a lot better than before. It felt like you could not escape the browser unless you closed all your tabs or shut it off, but with Aura you feel at home with a "desktop" experience that just seems right.

The hardware is another story... they're great for light browsing and Netflix... kinda.


UIs that have worked for people for years should never be replaced.


So we should all be using the Windows 3.1 UI?


Close.

I've been using WindowMaker for 10 years. It's pretty perfect. No gimmicks. No unnecessary features. Lots of flexibility. Minimal. User friendly. Fast. Stable. It just works.

If someone has used Windows 95 before, a whole slew of operating systems after will be intuitive and a random user will know how to use it without instruction. They don't need anything more advanced. They just need it not to crash or bloat or change and become confusing. Most people just want shit to work.


Last time I used WindowMaker it required hand-editing config files; that's a pretty long way from shit that just works for most people.


I've been using this handy graphical tool called WPrefs that ships with WindowMaker (since at least 1998). The only time I hand-edited a config or menu was to enable functionality which nobody ever used (hiding all dock items but keeping icons small and visible and preserving both global and virtual desktop-specific icons/windows depending on context).


I think I last used WindowMaker around 2004, and I have a distinct recollection of editing config files. Maybe I am remembering wrong.

I still think that OS X provides a much better GUI than WindowMaker; by your logic it should never have been introduced, and people would have missed out on improvements (but I miss focus-follows-mouse...).


Time to boot back into Chrome OS on my Cr-48!


Maybe not, though:

"The latest Chrome OS release is only available for Samsung and Acer Chromebooks as Cr-48 Chromebooks will skip Chrome 19."

from http://googlesystem.blogspot.jp/2012/04/new-window-manager-f...


Google will be adding support for the Cr-48 in Chrome OS 20.

From the comments: http://googlechromereleases.blogspot.com/2012/04/dev-channel...


The "All-Web" paradigm is coming, folks. And it really doesn't matter how much you love your iPhone, or your Android, or your Windows phone. Native apps are toast in the long run. Your data is moving to the cloud -- your pictures, your music, your movies, and every document you write. It's all going up there, and local hard drives will be history within 3 years. And that means ALL software is heading there too. Native apps running locally on your computer are going to be a thing of the past, and it simply blows my mind that even people here on Hacker News completely fail to understand this.

The computing landscape is changing right now, and any company that revolves around servicing Windows desktop software is going to be in for a real hard time.

Whether or not Google's Chrome OS is the eventual successor is not clear (I don't think it is), but their general idea is correct. The all-network, all-cloud world is coming whether you like it or not.


Even if you were 100% correct, hard drives aren't going anywhere. Browser cache is around 1GB and web developers are itching for vast landscapes of local storage systems.

And where do you expect the operating system and marvelous web stack where all these things are created to reside? Burned in EEPROM?

I think consumers already can't really differentiate between what is local and what is over the internet; by and large, they don't know what a browser is.

But if what you say is right, the pendulum, at the consumer level, will soon swing back the other way the first time that someone can't access the "cloud" on a trip or forgets to pay their "cloud access bill" (phone/internet whatever) and then gets locked out of ... well, everything.

Castrating the device and agitating the users by making everything Microsoft Palladium style will continue to be a bad idea. Remember Sun's motto, "The network is the computer"?

Nah, don't think so; the computer is the computer. Networks are pretty important, but they do not replace everything, they are fundamentally not reliable, responsive, or resilient enough to.

And every time a step is made in any of those directions, it's done because some machine or some network, somewhere else, is better off. That rising tide will lift all boats, keeping the consumer devices edging ahead; that is, unless you are harking back to "give consumers bare-minimum craptaculous cloud terminals".

That's been tried about every 3 months for the past 35 years or so and has never gone well.


>But if what you say is right, the pendulum, at the consumer level, will soon swing back the other way the first time that someone can't access the "cloud" on a trip or forgets to pay their "cloud access bill" (phone/internet whatever) and then gets locked out of ... well, everything.

One day the idea that you don't have internet access will be just as silly as saying you don't have electricity. It's in the very infant stages, but it's coming. For what it's worth, I'm typing this on my Cr-48, which has 3G, so I'm pretty much there.

I'm inclined to agree, especially in cases like movies or music. The idea that we all have a copy of a song on our hard drives is odd. Leaving aside ideas of copyright and licenses, there really isn't a reason all media shouldn't live in some kind of shared music/video folder in the sky. I'm going to watch Breaking Bad S2E4 maybe a handful of times and eventually forget about it. Keeping it in the cloud frees the user from needing to think about keeping it or backing it up or deleting it.

Or maybe I'm biased from living in a fairly cloudy world already.


Well this is because the word "computer" needs to be dismantled a bit.

Computers as a personal mainstream entertainment device will eventually work much like television. Data retention policies will be at the whim of the content distributor and consumers will select from sets of content dictated by different price packages.

Computers as midrange workstations for engineers, artists, musicians, and scientists aren't going anywhere and will be the same as they always have. Same goes for the NOC and the Datacenter. Devices that collect and analyze data will work much in the same way. The people that will create and maintain the consumption paradise will continue to do things on traditional configurations. The biggest change here is that display and input systems will continue to diversify in form and function. However, the machine will still be a rectangular block, with all the internal components you are used to.

The million-dollar question is what people who do mostly office-productivity work will use. For instance, the person at the dentist's office who makes the phone calls and does appointment and billing management. Most could probably get by with a desktop version of a smartphone, but nobody has produced the right one yet.

But this doesn't mean that the traditional computer will disappear from the dentist's office. Just the opposite: they are in every room now, hooked up to cameras and x-ray machines, talking to redundant servers either on- or off-site.


Agreed. Many people are starting to ask less for a general-purpose computing machine and more for something that can go on the internet or watch Netflix or use Skype. They want a simple-to-use appliance like an iPad that simplifies the experience. Want Netflix? Push the Netflix button. Want Facebook? Push the Facebook button. Technical people will still want general machines that do whatever is demanded of them. I think this will lead to more divergence in the OS market, with very simple OSs taking over the 'I just want it to work and I don't care how you do it' market, and more technical OSs that let you destroy your computer for anyone else.

One worrying thing is that when things just work, you lose some people who might otherwise have played around with things, possibly (probably) breaking things along the way, and learning important lessons. I learned to use Dreamweaver by downloading it illegally and playing around with it, but it made it easy to write HTML and got me to learn ColdFusion and SQL. Poking around a DOS prompt and accidentally formatting your computer teaches people some important lessons.

To me the million-dollar question is how to keep kids engaged in the transistors and languages of computers when we start to pretend they don't exist. What happens when the next would-be Linus grows up in a family with only iPads and iPhones?


In the US, computers might work like television. In some other countries, they might work like cell phones: always bought unlocked, no service lock-in. It just depends on what we tolerate. There's no historic inevitability to the kind of peculiar consumer exploitation that goes on in the US in markets like cable TV or cell phone data plans.


"One day the idea that you don't have internet access will be just as silly as saying you don't have electricity."

Much of the world lives with unreliable or intermittent electrical power, and likewise unreliable or intermittent internet service.

Also with mobile devices you're talking wireless bandwidth which is shared by everyone in a given area. The combination of crowds and streaming media is not trivial to engineer for. There are some promising technologies (MIMO, etc) but I don't believe any of them remove this fundamental scaling.

Local cache will continue to be important. What will be lost is the requirement to manage which device your shit is on.


>One day the idea that you don't have internet access will be just as silly as saying you don't have electricity.

This is interesting, because I see things from the totally opposite perspective. I think a period of war is coming and that people in America will look back astonished that they didn't realize the value of what they had, including reliable electrical service.

Of course, the internet is far too fragile to successfully operate in a real war zone so I expect the internet as we know it will change rather dramatically too.


> One day the idea that you don't have internet access will be just as silly as saying you don't have electricity.

I don't think that day will be in the next 3 years.


  Browser cache is around 1GB and web developers are itching
  for vast landscapes of local storage systems.
We're moving toward a computing world where hard drives don't even exist. Within 3 years we may see devices that have only persistent RAM. And for large files, I bet we'll see some nice competition between Dropbox, Google, Microsoft, and others which may eventually lead to making hard drives obsolete.

  Nah, don't think so; the computer is the computer. 
  Network... is not reliable, responsive, or resilient enough to.
Not yet, but I think we are steamrolling toward this conclusion.

Apple might have successfully convinced everyone that downloading software and installing it is some fancy new thing, but really it's just old-fashioned nonsense that is only necessary for their arcane software platform to work.


No. Even if we assume a magical device that stores nothing, cloud storage is still backed by, guess what? Hard disks. And this time, heavily redundant to ensure low latency and high availability.

So no, moving everything to the cloud, tomorrow, would vastly increase hard drive sales permanently. Oompff! http://allthingsd.com/20111005/it%E2%80%99s-all-about-conten...

Here's an example. I upload a video to youtube. youtube converts it into about 8 formats (depending on the initial quality; (mobile + flash 8/10 + html5 ogg/webm) * multiple resolutions), stores the original, generates dozens of multiple sized thumbnails for the seekbar, farms these out to multiple data centers, and then stores a bunch of metadata along with that (every like/dislike, comment, playlist, landing source, geoip of user). You are looking at something like 15 times more space needed for the cloud architecture.
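Just to make that multiplier concrete, here's the back-of-the-envelope arithmetic. Every number below is my own illustrative guess (format count, relative sizes, replica count), not YouTube's actual figures.

```python
# Rough back-of-envelope for the "~15x" claim. All numbers are
# illustrative guesses, not YouTube's real pipeline parameters.
original = 1.0                      # size of the uploaded file, normalized

transcodes = 8 * 0.5                # ~8 derived formats/resolutions, each
                                    # smaller than the source on average
thumbnails = 0.05                   # dozens of small seekbar thumbnails
metadata = 0.05                     # likes, comments, playlists, geoip...

one_copy = original + transcodes + thumbnails + metadata
replicas = 3                        # farmed out to multiple data centers

total = one_copy * replicas
print(round(total, 1))              # 15.3 -- roughly the 15x above
```

The point survives any reasonable choice of guesses: transcoding multiplies the bytes before replication even starts, so moving a file "to the cloud" stores far more than one copy of it.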


> Even if we assume a magical device that stores nothing, cloud storage is still backed by guess what?

You store that in the cloud too, of course! It's turtles all the way down.


Depends on the content; something like a movie that is on millions of hard drives around the world today could be replaced by some redundant cloud storage in a handful of formats.


You're conflating two different things: Local Storage/caching, and the medium it is stored on, spinning disks.

The data may live in the cloud, but it will be heavily cached locally. We'll still need several GB or TB of local storage, be it spinning disks, SSDs or permanent RAM.

Grandparents won't want to be without the home video of their grandchildren just because their WiFi router stopped working or their ISP is having problems. And it will be a long time before mobile coverage reaches everywhere you might take your computer: communications black spots, rural areas, Planes, Tunnels etc.


  We'll still need several GB or TB of local storage, be 
  it spinning disks, SSDs or permanent RAM.
Yes I should have clarified -- I expect memristors to be on the market within a few years. Once they're ready I bet we'll see all laptops and mobile devices move over to them. Between memristor based local storage/cache, and large/cheap cloud storage, I don't see much reason for having hard disks, software, and data in our local computers for very long. Just keep all your data and apps on the cloud, and use web-based apps for accessing them. The foundation for this type of computing world is absolutely being built right now by thousands of companies. It's definitely coming.


Between memristor based local storage/cache, ... I don't see much reason for having hard disks, software, and data in our local computers for very long.

These apps will still be stored (permanently cached as a web-app) and running on the local box and able to run in an offline mode. We won't be installing pre-packaged software that we have to upgrade manually, but we'll still need a powerful machine to run them; not a thin client.

The mechanisms whereby the software gets on our machines may change, but it's still going to be on our machines.


We're moving toward a computing world where hard drives don't even exist.

That will only happen when my pipe hits SATA speed.


That will only happen when my pipe hits SATA speed.

I dunno.

GMail is orders of magnitude faster in every way than my work Outlook installation.

Google Docs is much quicker to start & use than Office.

I've often used online image editors rather than start Photoshop.

Yes, if I had an SSD on my work computer it might help, but still I think there is an important principle here.


Sure, but compare your pipe speed today with your pipe speed ten years ago. I don't think that it's out of the realm of possibility that we will begin to see devices with no local storage within three years.

Take Google Music, for instance. You can have 200 GB of music "on your phone", even though none of it is actually stored on your phone.


> Sure, but compare your pipe speed today with your pipe speed ten years ago.

Actually, things haven't improved much in this department for the vast majority of Americans.


It hasn't increased nearly as much as my HDD speed.


>Sure, but compare your pipe speed today with your pipe speed ten years ago.

You mean the 3Mb/s DSL I had 10 years ago versus the 3Mb/s DSL that's still the fastest thing available at my house today?


And then we also run into contention - the more people wirelessly streaming, the less bandwidth each gets. Add in that with the cloud there are more links in the chain that can go wrong, plus you frequently get transient outages (instead of the rare catastrophic ones with local storage), and it's not a zero-sum game. Some things are fine in the cloud. Some really aren't.


> We're moving toward a computing world where hard drives don't even exist.

How do you think that "the cloud" stores data? Maybe you meant "hard drives in consumer devices."


Why is this comment being downvoted? It's not aggressive, rude or otherwise mean. It's also not factually incorrect (because it's talking about a possible future).

I agree that I don't believe this future is likely, but that's no reason to downvote him.


Care to explain why you believe native apps are toast in the long run, or how that's related to data moving to the cloud?

Web apps solved the problem of having to figure out how to find, install, or update software and not having it everywhere you go. App stores have caught up on discovery/installation/updates, and mobile has solved the problem of not having it everywhere I go.

Web apps will eventually catch up on performance and maybe someday they will feel like first class citizens in any given OS rather than being relegated to a mess of throwaway tabs. We've been within 3 years of that for something like a decade, but maybe we really finally are within 3 years of that.

But we aren't within 3 years of having any two web apps looking similar and using familiar UI elements. Every web app is a new UI to learn, maybe not a problem for you or me, but mobile has reached 3 year olds and 90 year olds and everyone in between, most of whom have not acquired the skill to pick up unfamiliar UIs and don't want to.

As for data moving to the cloud, I'm with you. And I have a lot of native apps that interact with remote data.


I believe that Scott Jenson put it best in 'Why Mobile Apps Must Die'- http://vimeo.com/33692624

Watch it.

The problem with native apps is the installation barrier- if I had apps installed to do everything that I'd love my phone to do, I'd be overwhelmed with hundreds of apps installed. Web-based apps have the potential to slip in and out of use as necessary, with more applicable discovery mechanisms (geo, proximity, etc).


You speak of the "installation barrier" as if it's something insurmountable. But why should it be? A native app zero-install mechanism seems more plausible to me than wide adoption of a hypothetical geolocating proximity web.

The reason is that native apps can move so much more quickly than web apps. I'll bet Apple could get widespread adoption of a zero-install feature on iOS in six months, simply by mandating it for the app store and featuring early adopters. But HTML5 YouTube has been in beta for over two years, and the <video> tag was introduced over half a decade ago. The web moves at a snail's pace by comparison.

I did try to watch the video, but scrubbing didn't work, which sort of illustrates my point.


Both web apps and native apps are here to stay. They have complementary strengths and weaknesses.


Mobile apps, especially iOS apps, are hobbled by their curators. iPhones don't even have an open file system, and the platform was purposely designed so that apps are isolated and funneled through a strict paradigm. This most certainly is not what I, or a lot of people, want in a computing device -- it will not work, simply because a majority of innovators do not want to be locked into that universe. Apple has designed their platform to be a dead end. It is almost comical that Google, Microsoft, and RIM have been trying to follow Apple toward this dead end.

On the development side of things I believe we're going to see a massive shift back to web-app development simply because of economics. There is no way any small group of developers can build desktop, iOS, and Android versions of their application simultaneously. This iOS fad WILL DIE because of costs, and the fact that Android numbers are rising dramatically, and people need a way to support each mobile platform and still be able to use that app from their desktops too -- mobile web apps are the answer.


And what open file system is there for "web apps"? Dropbox and a cobbled-together flash uploader / email attachment / webapp-specific API which probably only goes to Facebook?

Aren't webapps isolated? Funneled through a strict paradigm? And also at the other end of a comparatively slow link?

The idea of a general purpose computer as it exists in the current desktop form must be put aside. We can't head to a future of everything we do in one portable device and from there to ubiquitous life integration when every app can read all your data and send it to anyone for any purpose; Android is falling foul of that right now. We have to have isolation and sandboxing, different trust levels, and restricted access to more important data (contacts, for example), to make a solid and trustable future platform.


And what open file system is there for "web apps"? Dropbox and a cobbled-together flash uploader / email attachment / webapp-specific API which probably only goes to Facebook?

Playing Devil's advocate, Google Drive has an API for webapps to access the user's files (with permissions only to files created by the app itself or that the user specifically opens). The webapp chooses the MIME types it can open and then the user can just do Open With → App.

Sure, it's GDrive specific for now, but the API is fairly generic and if it's successful, I can see other online drives implementing it themselves. Well, assuming Oracle loses this lawsuit, I guess :)

Aren't webapps isolated? Funneled through a strict paradigm? And also at the other end of a comparatively slow link?

APIs de-isolate them. Both the proprietary ones that already exist and open, distributed ones like http://webintents.org/

And besides, aren't native apps on mobile isolated too?

And the link between the user and the app may be slow, but it only needs to carry the UI. The data will flow from "cloud to cloud" at very high speed.

We can't head to a future of everything we do in one portable device and from there to ubiquitous life integration when every app can read all your data and send it to anyone for any purpose; Android is falling foul of that right now. We have to have isolation and sandboxing, different trust levels, and restricted access to more important data (contacts, for example), to make a solid and trustable future platform.

Uh, Android does have sandboxing and isolation, and apps can only access your data if you specifically allow them to. iOS was the one which didn't require permission to access your contacts (though Apple has said it will).


> Uh, Android does have sandboxing and isolation, and apps can only access your data if you specifically allow them to.

Wrong; the filesystem is FAT32, and has no additional permissions beyond a "read only" bit. Any data that's there can be accessed by apps, as things stand.


No. Android has used Ext4 since Gingerbread, and before that it used YAFFS:

http://arstechnica.com/open-source/news/2010/12/ext4-filesys...


For internal storage. On SD cards or internal "external" storage, it's world-readable: http://developer.android.com/guide/topics/data/data-storage....

Relevant text: "Every Android-compatible device supports a shared "external storage" that you can use to save files. This can be a removable storage media (such as an SD card) or an internal (non-removable) storage. Files saved to the external storage are world-readable and can be modified by the user when they enable USB mass storage to transfer files on a computer."

I forgot to specify that in my original post. My mistake.


Web 2.0 Expo NY 2010: John Gruber, "Apple and the Open Web" http://www.youtube.com/watch?v=Qss5RnD7wK8

This is an old presentation by Gruber (Daring Fireball) from 2010, but it is still excellent. And only 10 minutes long. He argues that iOS vs Web is a false dichotomy; in fact, the Web was from the start a big part of iOS. He then differentiates between "Web Apps vs Browser Apps". Your notion is that a Web App is something which uses HTML. But he argues, and I think this rings true, that a Web App is something which uses HTTP.

"How can you say that Twitter, even though it is a native iPhone App, isn't really a Web App?"


This most certainly is not what I, or a lot of people want in a computing device

Most people don't even understand what a filesystem is, never mind whether it's open or not. I agree with you that data is moving online but that's orthogonal to the native vs web app question.


I want a "cloud filesystem", but want it controlled by ME, not Apple, and not iCloud. That's the problem. Apple wants to control the filesystem, and I, and others are not willing to accept this horrific scenario.


Yes you and your techie friends care about this. Most people don't.

Current trends indicate user attention is shifting even more to native, BTW: http://blog.flurry.com/bid/80241/Mobile-App-Usage-Further-Do...

The web vs native debate is really just a debate about which technology is better suited to building cloud client apps. HTML5 + JS is just another UI stack like Cocoa Touch or Android.


You "and others" are in the tiniest minority. Surely the current state of the market is clear enough indication of this?

I don't know you from a bar of soap but reading your comments, I get the impression of an American late-teen with a solid amount of technical knowledge, strong opinions but staggeringly little perspective.

Even amongst those who are capable of administering an 'open platform cloud infrastructure' there are few who have the time. In my own circumstances: ten years of programming experience, recently shifted from consultancy to an analysis role at a major financial institution, start up on the side, seven-week-old baby, ten-year-old daughter, wife, 33 years of age... Let me tell you: the more my technology 'just works' the better. I won't accept my rights being trampled on, but a system that is elegant and robust, albeit a bit restrictive, is far preferable to spending hours on end tinkering with something that I need to actually use as opposed to configure endlessly.


Go read cageface's comment again. I think you are using the words "native app" wrong, or at least different than what most people think it means.

You are correct that data is moving into the cloud, and away from Apple's restricted file system (or any local file system). But that has nothing to do with native vs. web (HTML) apps. Native apps can easily store all their data in the cloud (ANY cloud, not just iCloud), and still be native apps in the usual sense of the word.

It seems to me that the trend is toward native, cloud based apps. And these two concepts are not in conflict. As cageface says, they are orthogonal.


I find it hilarious that you are embracing cloud computing as your personal savior because you want MORE control. It's almost oxymoronic!


I want a cloud everything. Docs, spreadsheets, filesystem, email, todo list, media player - everything. There's a niche in the OSS field for these kinds of applications at the moment.


> Mobile apps, especially iOS apps, are hobbled by their curators. iPhones don't even have an open file system, and the platform was purposely designed so that apps are isolated and funneled through a strict paradigm.

Which is different from browser apps... how, again?


Your reasons, while important and meaningful from a developer's perspective, mean little to the consumer choosing between a native app and a mobile web app. What's in it for the consumer?


Why would a consumer even know or care about that? They'll use whatever is easy to find, "install" (even if it means just a button that links to an URL), use and looks nice. They don't give a fuck if it's written in JS or Java/ObjC, or if it's HTML or XML or code rendering the UI.


It's really goddamn depressing to hear this, whether or not it's a joke. Both because it's asinine (most of my computing is done offline except for small bursts of internet activity, because the alternative would be completely unusable) and the idea is totally nonsensical.

Next you're gonna tell me they're gonna introduce a language which when compiled into a single compressed file in some kind of 'virtual mechanism' will run on any platform and everyone will use it because the technology is so much better than anything else.


I mostly agree, but web apps have been set back a few years by mobile.

I had this conversation with my girlfriend the other day:

Her: If ____ had a [mobile] app I'd waste so much time on it.

Me: You know you can use their website on your phone, right?

Her: Yeah, but apps are so much better.

Out of curiosity, I went to this particular site on my phone and they actually do have a mobile web app, complete with home screen icon and the meta tags to make it fullscreen. If you add it to your homescreen it could pass as a decent native app.
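For reference, the iOS-era tags that make a site installable and fullscreen look roughly like this (a sketch; the icon path and sizes are hypothetical):

```html
<!-- Hide the browser chrome when launched from the home screen (iOS) -->
<meta name="apple-mobile-web-app-capable" content="yes">
<meta name="apple-mobile-web-app-status-bar-style" content="black">
<!-- Icon shown when the user adds the site to their home screen -->
<link rel="apple-touch-icon" href="/icons/touch-icon.png">
<!-- Lock the viewport so the page behaves more like an app -->
<meta name="viewport" content="width=device-width, initial-scale=1, user-scalable=no">
```

With these in place, launching from the home screen icon hides the URL bar and navigation buttons, which is most of what makes a web app "pass as a decent native app".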

There are two problems with web apps right now:

1) Quality. It's pretty hard to make really nice mobile web apps, or apps in general. There are lots of behaviors in the browser that are un-app-like which you need to work around. Native platforms were designed for apps from the start. Performance can also be an issue (possibly solved by something like NativeClient).

2) Discoverability. Users look for an app in the app store first. If they don't find it they often don't check for a mobile web app, either because they're not aware they exist or they assume it will be crap. App stores need to have better integration for web apps. In the meantime, wrappers like PhoneGap can help.


Great, another prophet. Worst thing is, if you somehow are correct, you won't stop shoving it in people's faces how right you are. The cloud is not the future, not with the current US internet infrastructure at least.

Also why would someone pay 10 dollars a month for 50 GB of space in the cloud when you can buy 1 TB portable hard drives for dirt cheap?


> Also why would someone pay 10 dollars a month for 50 GB of space in the cloud when you can buy 1 TB portable hard drives for dirt cheap?

housefires.


If anything I'd be more worried about thieves stealing my data. (In the literal, 'break into my house and take it.' sort of way.)

I know it's happened to at least one indie developer. And at least one youtube blogger that comes to mind.

That having been said, local hard drives aren't going to die anytime soon. Uncle McCarthy has not yet had his day, at least not on that one.


You know, encryption and a decently strong password pretty much seals the deal on somebody physically stealing your data. There is no reason to not encrypt your home computer if you are so concerned.
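As a rough illustration of why the password matters: disk-encryption schemes don't use your passphrase directly, they derive the actual encryption key from it with a deliberately slow function such as PBKDF2, so an attacker with the stolen drive can only try passphrase guesses at great cost. A minimal stdlib sketch (the passphrase and iteration count are illustrative, not a recommendation):

```python
import hashlib
import os

# A random salt stops precomputed (rainbow-table) attacks.
salt = os.urandom(16)

# PBKDF2-HMAC-SHA256: many iterations make every passphrase guess
# expensive for anyone who physically has the drive.
key = hashlib.pbkdf2_hmac("sha256", b"a decently strong passphrase",
                          salt, 200_000)

print(len(key))  # 32-byte key, e.g. suitable for AES-256
```

The salt and iteration count are stored alongside the encrypted data; only the passphrase stays in the owner's head, which is why a weak one undoes the whole scheme.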


>You know, encryption and a decently strong password pretty much seals the deal on somebody physically stealing your data.

Maybe stealing it and using the data. If someone swipes my drive the data is still gone. Even if it is inaccessible.


Backups, offsite too.


When you consider the time and effort required for a good offsite backup, it's easier to just pay the $10/month.


If a lot of what takes up hard disk space moves to on demand providers, 50GB may go a long way to satisfying most people.


> Great, another prophet.

Heh. "Pandora didn't get half the kicking around she deserved", as I think R. A. Heinlein jokingly put it.

(Not that I believe that GP is that prescient.)


It's Cassandra, actually, and she was pretty harmless. It's not like anybody believed her.


I'm only a prophet if I'm correct. If I'm wrong I'm just some dude on the internet.


Yeah, and in 2 years desktop computers will be gone, people will only use tablets, and Linux will have completely disappeared as a desktop environment (together with the aforementioned gone desktop computers, probably?).

I always like people who over-simplify things to the extreme. They don't seem to understand that different solutions exist because there are different needs. Ever thought about editing more-than-HD video over a network connection? Ever thought of RAW image storage, taking 30 seconds to load one image through your ISP? Ever thought of people living away from large cities, with slow internet connections?

Obviously not. You must be living in a microcosm where everyone has 1Tb/s Net access under their tables.


"and Linux will have completely disappeared as a desktop environment"

Of course not! 2014 will be the year of Linux on the desktop, just like every other year for the last 15 years!


Hehe. Joking aside, I do not see the Linux desktop losing "share" in the near future. It may be a very long time before it reaches significant market size, but the latest major distributions are very solid (OpenSuse 12.1, Ubuntu 12.04, Mint), driver support is better than ever (while not perfect), and "app stores" are becoming more and more common. There are signs of progress, and if the rumours of a Linux client for Steam are real, there is no reason it can't grow further.


The year of Linux on the desktop already happened a few years ago. I recently fixed a friend's PC by installing Ubuntu on it. Everything worked out of the box - sound, flash, games. All I had to do was adjust the audio input levels. The only reason everyone else isn't using it is politics (in particular, shoddy marketing by companies like Ubuntu, Microsoft's constant attacks on organisations which use OSS, Microsoft's deals with OEMs).


"The year of Linux on the desktop" is not the year in which installing Linux actually works for the first time (gasp). I installed Redhat on machines in 2002 where things worked, too. "Linux on the desktop" is about Linux taking a significant market share of desktop installations. The fact of the matter is that Linux is confusing to users ('distributions'? 'desktop environments'? and that's just the start), there is very little software for it (relative to what is available for Windows), it's a moving target (both for users and developers - radical changes in UI from one year to the next, libraries that aren't ABI stable for a year let alone 2 decades), and it's too hard (approaching impossible) to make real money from Linux software.


>The fact of the matter is that Linux is confusing to users ('distributions'? 'desktop environments'? and that's just the start)

I was doing this for a very non-technical user. All I said was that Ubuntu was like Windows but free. Nothing confusing about that (from their reaction).

>there is very little software for it (relative to what is available for Windows)

Yes and no. Games are a big problem. For most tasks that most users perform, there's a web browser, email client and an office suite. Applications moving to the web only helps with this. Bespoke software which manages a warehouse's stock we don't really care about in this context.

>a moving target (both for users and developers - radical changes in UI from one year to the next, libraries that aren't ABI stable for a year let alone 2 decades)

I agree. Distributions should take more care with this, or desktop environment developers should call their new DE something completely different and not make it look like a replacement but rather a competitor. Then again, installing an LTS release should keep a person happy for a couple of years. After that, if they're not a power user, there may be little to no benefit in upgrading to the newest release.

>it's too hard (approaching impossible) to make real money from Linux software.

Tell that to: Sun (hah, great example), RedHat, Google, Apple, the countless software agencies developing on an OSS stack, all the businesses in the world running on a network backbone mostly composed of open software and all the research projects using fancy supercomputers running on Linux. Almost everyone uses Linux in one way or another and almost everyone makes money. Redhat is proof that you can directly use Linux to build a billion dollar business. It's a young business model, but it works well.


I think there's going to be a big watershed moment. One of the big cloud service providers is going to have a major data loss event. (Probably Google or Facebook, but who knows, Microsoft or Amazon or a number of others are equally vulnerable.) At that point, it will be interesting to see how far into the rabbit hole we've gone and how we react to seeing so much of our info vanish.

I'll be hanging on to my local hard drive tooth and nail. (Backing up to S3 or something mind you, but make no mistake my local disk is the master copy.)


Google's been indexing the web and Amazon executing bazillions of legally binding sales without a "major data loss event" for 13 years now. I don't think that's really likely. At that scale, backups aren't really a serious problem, as you have to have redundancy in the system for reliability reasons anyway.

A "major privacy event" seems quite likely though.


> Google's been indexing the web and Amazon executing bazillions of legally binding sales without a "major data loss event" for 13 years now. I don't think that's really likely. At that scale, backups aren't really a serious problem, as you have to have redundancy in the system for reliability reasons anyway.

How do you know that there hasn't been a major data loss event for Google? Would you be able to tell the difference if, say, a quarter of their data were gone?

(I agree that it would be more visible if Amazon had a major data loss, but the fact that you don't know about it doesn't mean it didn't. In fact, can you name a major data loss of which you do know? (I can't, but that may just be because I don't follow these things.) Surely one can't conclude from the fact that there are few well publicised examples that businesses do not suffer such losses.)


It's happened before, at least once very early on:

  Google had taken ill.
  
  The problem was the index storing the contents of the web 
  in Google's servers. For a couple of months in early 2000, 
  it wasn't updating at all. Millions of documents created 
  during that period weren't being collected. As far as 
  the Google search engine was concerned, they didn't exist.
http://books.google.com/books?id=V1u1f8sv3k8C&pg=PA41...


It's a nit, but this wasn't a "data loss" event. It was a bug that prevented the storing of the data in the first place. Backup strategies (either cloud-managed or "hanging on to your local hard drive" as lukeschlather posits) would not have helped.


I know of at least one incident at Google in the early part of the last decade where the entire crawl got deleted (or otherwise corrupted), meaning results didn't get updated for an extra six weeks or so.


>A "major privacy event" seems quite likely though.

The breakdown that is going to happen when Facebook and/or Gmail's database gets dumped and uploaded to The Pirate Bay or Wikileaks is going to be mind-blowingly tremendous.

On top of that, I wouldn't expect any slowdown after "a major privacy event". I fully believe that we are in a world where very few organizational secrets can be kept. Virtually no one, including major corporations and governments, has a correct concept of what is necessary to protect digital data. In the early days of the war, when everyone is still able to use the internet, the intelligence leakage is just going to be astonishing.


Not in most of America. Maybe the Valley or NYC. But in places like Florida, where Comcast (the biggest carrier) drops when thunderstorms roll by, good luck.

And business... pushing all your data to the cloud is a mistake, for many reasons, such as security. Look, there is no way I'm putting sensitive data, like accounting, patient information, etc., in the cloud. It's just not worth the risk or the lawsuit.

Also, I know a guy who owns a gym. Was talked into a "hosted cash register". BIG MISTAKE. After the install, a week later early morning storm rolls in from the ocean. His power flips (a common thing in South Florida). But his Comcast never came back online. He was down (and out of business) for 4 hours. Personally though, I enjoyed hearing his plight. He's a cheap sob.

Some of you might say, that is what he deserves for relying on Comcast. Fine. But what small business do you know is gonna go through the expense of a T1/T3 or fiber? That $29.99 a month cash register just became $300+ a month. Lovely.

My point: out in the real world (in America) the infrastructure is barely stable. Until the carriers seriously invest in their networks (and I doubt they ever will, since they only provide exactly what they have to - I don't even want to get started on our TERRIBLE cell service in this country), the "cloud" will be a site where you share pictures of your dog.


People have been saying this for decades. It may turn out to be true, but there aren't competitive versions of a lot of the programs I use on the web yet. And "whether you like it or not" is a silly thing to say in light of the fact that user preferences are exactly what will determine whether your utopia is the one that actually comes.


Great concept - your timeline is way off. Wireless network connectivity is unlikely to be where it needs to be for at least another 20 years for people who work with larger documents. Hell, I have a tough enough time in our corporate offices in São Paulo, Brazil accessing 100 MB Slide Decks in our Redwood City offices over _wired_ connections - I don't want to think how long before places like Belém get half decent _wireless connectivity_. Until then - I'll still need my Windows/Macintosh Laptop with local Apps/data.

Thick apps aren't going anywhere in the near future. Video Editing, Adobe Photoshop - they'll all be huge-honking local apps for at least another 10-15 years.


"Cloud devices" can still use local hardware acceleration, storage, and memory for client-side things. Nothing is really preventing us from developing cloud-based video and photo editing software.

Some work might have to be done to minimize the amount of data that gets transferred when saving and loading, but that's about it. I'm sure if you compress your 100MB Slide Decks file you'll find it much easier to send over!


Why jump through hoops to make a suboptimal solution half-viable, if we got a solution that just works and works well: native apps.

Seriously, all that cloud stuff is overrated nonsense. There are some areas where it might make sense, but stop trying to put every fucking thing on the cloud. It just doesn't work. End of story.


I hate to break it to you, but a web browser is just an arbitrary native app running data over TCP/IP. Native apps aren't going anywhere.


taps his sarcasm detector looking confused, frowns, throws it over his shoulder and glares


Nonsense.

True, companies will encourage voluntarily uploading your stuff to their computers. Why shouldn't they? Just like they solicit your email address, track your movements, snarf your smartphone address book and other neat tricks. There's value in getting that info.

That doesn't mean everyone is going to fall for it. Some will, no doubt.

But GBs are getting cheaper every day. And not just for "cloud providers". How many terabytes does one person need? You can fit your whole life's worth of data onto today's capacities of consumer digital storage.

Are we to imagine a future where consumers cannot purchase storage media? What drugs are you on?

The web is brittle. It's but one of many things that can be run over IP. You web-fanatics crack me up.


  Are we to imagine a future where consumers cannot purchase storage media?
Yes, that's coming. Many countries are starting to put tariffs on hard drives. And in my country, Canada, they also slow down your internet when downloading large files, and it costs $1.50 per gigabyte to exceed my 85GB/mo limit. I might as well redirect my downloads to an online service and keep all my data online until I need it.

My bet is within 3 years we'll reach the threshold where it might as well be worth it to store files online rather than download them and store them permanently on hard drives on a computer running constantly in your home.


I feel like I'm missing something terribly obvious, but...

How does "it is getting harder and slower to move large files over the net" lead to the conclusion "therefore we should store all our large files on the net and retrieve them whenever we want, rather than keep them on local media with negligible latency and huge storage space"?


The problem with this is, generally speaking, if you take possession of a file, either locally or on a remote filesystem, you are likely to use it, and also likely to use it shortly after taking possession. Be it music, photos, documents, or a movie, the data that normal people take possession of needs to be downloaded and viewed to be useful.

If at this point there is no local storage and it's stored in the cloud, you need to pay the bandwidth price every time you wish to view it, rather than just on taking possession.

This particular example, about Canada increasing bandwidth charges, actually works against your argument.

Sure, some data won't need to make it to the client, but that's mostly only true if you're a scientist or otherwise run computation on data, and that's hardly the case for the general person.


Solution: VPS outside Canada to do your downloads from.

But what you describe is a networking problem (transit costs of your ISP passed on to you). Not a storage problem.

A "cloud storage" provider is just going to keep the data you store in a datacenter near to you anyway, not halfway around the world. But it's not like keeping your physical stuff with a storage company. You can easily store the data yourself with many more advantages. The marketing teams for "cloud storage" services will no doubt try to convince people otherwise.


That's not a solution; you still need to get the downloads to a local system to use them, for the vast majority of media.


True, but he's admitting he doesn't access ("use") the data in the same month he downloads it. So downloading from his VPS IP address to the VPS's secondary storage, outside Canada, won't count toward his monthly Canada ISP's cap. It's only when he decides to download data from the VPS's secondary storage to his "local system" he'll begin using his monthly allowance.


But that is a pretty atypical usage pattern. So, not the rule we should expect when we are talking about the general "future" of technology.


I'm waiting for it to come full circle - the day when the browser VM is so advanced it's basically a VMWare shim and each web page is really an OS that boots up, then we can finally throw away all the development crap collectively known as web duct tape and use real tools for software development.


Presumably you've seen http://bellard.org/jslinux/ ?

If someone (Google, Mozilla) pushed for it, this could be merged with a browser API / VirtualBox for speed and completeness, and you could download a paused machine with state.

It's not happening now, but it's possible already.


As jodrellblank says, it's been possible to do this for a long time:

Inferno IE4 plugin: http://www.vitanuova.com/inferno/plugin/index.html

Lively Kernel: http://www.lively-kernel.org/


"All-Web" makes hardware a commodity. Apple, Amazon, Microsoft, etc. have a huge incentive (literally billions of dollars of incentive) to keep the user experience on their devices better than that of any other device.

Native apps may be toast, but they won't go without a fight.


OOO, so in your future I get to choose between having ads on all my apps or renting my own data.


>but their general idea is correct. The all-network, all-cloud world is coming whether you like it or not.

Clearly you have the religious fervor upon you. Personally I like it when I see the benefits. Currently that's for a subset of my computing tasks. Sharing stuff? Sure. Off-site backup, OK. Most other stuff? I'm happy sticking to the PC model rather than the thin client-mainframe timeshare model.


Bollocks, I say. It's all about latency and bandwidth. If I had fiber in my cabin in Montana, I might agree. Perfect connectivity at all times is just a dream. Plus, native will always have an edge. Sure, my calendar is in the cloud, but everything? Nah, don't think so. For photos and videos the pipes ain't fat enough for anything but backup and sync.


Written like someone who has never experienced 3rd world Internet.


Exactly. It's easy to tout this 'cloud' based world as a looming, common-place utopia when you're an uber nerd who is completely detached from everyday 'normal' people who barely understand the basic concepts of the internet/web, or who have severely lacking bandwidth and data usage quotas.


Or who has never experienced the bill-shock nightmare that is international data roaming. Always on is nice if you live in a city with good coverage and you can afford the bill.


I can have all my data and my apps in the cloud and still run native software. In fact, this is the iOS approach.

On the whole, the ability to transparently cache big chunks of data locally turns out to offer a superior user experience in many significant use cases (e.g. downloading content for later consumption while off the grid).

I don't think it's clear that the web approach will win out, and I certainly think it will not be settled in three years. And photo and video libraries (which are popular and growing faster than bandwidth or storage) aren't going to the cloud any time soon.


"Native apps are toast, in the long run."

Until mobile browsers have access to the full range of hardware on a device (camera, microphone), native apps will have an edge. Example: camera/pictures. iOS doesn't allow camera/camera-roll access from the browser, Android does, and WP7 used to but removed it. If you want to offer people a way to send you pictures from your app, it's gotta be native for now.


I think it's a bit binary to suggest that everything's going to the cloud and nothing will be local any more. I rather expect we're heading toward the more robust, resilient approach of local storage with seamless cloud synchronization, dropbox-style.
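That local-first, sync-on-the-side model can be sketched in a few lines. This is a minimal illustration, not any real Dropbox API: every name here (`LocalFirstStore`, the `upload` callback) is hypothetical. Writes land on local disk immediately, reads never touch the network, and a pending queue pushes changes to remote storage whenever connectivity allows.

```python
import os


class LocalFirstStore:
    """Local disk is canonical; the "cloud" is an asynchronously
    updated replica. (Illustrative sketch, not a real sync client.)"""

    def __init__(self, root, upload):
        self.root = root          # local cache directory
        self.upload = upload      # callable(name, data) -> bool (True = synced)
        self.pending = []         # names written locally but not yet pushed

    def write(self, name, data):
        path = os.path.join(self.root, name)
        with open(path, "wb") as f:
            f.write(data)         # local write succeeds even while offline
        self.pending.append(name)
        return path

    def read(self, name):
        # Reads are served from local disk only: fast, and they work off grid.
        with open(os.path.join(self.root, name), "rb") as f:
            return f.read()

    def sync(self):
        # Try to push everything queued; keep failures for the next attempt.
        still_pending = []
        for name in self.pending:
            if not self.upload(name, self.read(name)):
                still_pending.append(name)
        self.pending = still_pending
        return not self.pending   # True once fully synced
```

The design point is the asymmetry: the user-facing operations (`write`, `read`) never block on the network, while `sync` is free to fail and retry, which is what makes the approach resilient when the pipe is slow, capped, or absent.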


I remember similar things being said in the '90s. We'd all be ditching Windows, running internet appliances, and doing everything via the web and Java applets.

I'm not saying you're wrong, just that similar predictions haven't panned out in the past.


Are you being sarcastic?


I surely hope not.

The web is for documents.


>local hard drives will be history within 3 years

Want to make a bet? Name your stakes.


I didn't know Aura was making such progress until my buddy got it on his CR48. I'm tempted to remove Ubuntu and reinstall crOS to use it longer, but I also still really appreciate my full Ubuntu environment.

If I recall correctly, Aura also serves as a real proper full window manager and already has support for Wayland too!


Love Linus, but I think he's just excited that someone is taking on Windows (cough cough :)). Files in the cloud are great, until you get locked out and lose everything, so...


Chrysis (cough cough)


Nobody else thinks it's an odd coincidence that this post by Linus and a post by Douglas Crockford both on Google Plus were submitted two hours apart?

To clarify, I don't think it's a coincidence and I think Google employees probably submitted these.


> To clarify, I don't think it's a coincidence and I think Google employees probably submitted these.

If you look at the posters' profiles you can see where they both work. (Neither is currently a Google employee.)


And?

They were both worthy of submission IMO.

(The Douglas Crockford post more than this one though.)



