He's really just addressing the elephant in the room: so far, the whole open source desktop mess has failed to produce anything that looks like a compelling, easy-to-use desktop.
If Google can produce a Linux-based operating system that addresses the needs of casual users and power users/technophiles (like Linus), and get it on enough hardware (i.e. not just netbooks but "serious" desktops as well), then they can (combined with Android) simply eat everyone else's lunch.
What is not necessarily so important is whether local storage or local apps will "go away", but that Google understands that a modern OS needs to treat "the cloud" (i.e. web apps, online storage, social networks) as first-class citizens inside the OS. That metaphor is still missing, in any really intuitive form, even from new OSes like Windows 8.
Hard disks are not dead, but I think the idea of having C:\Program Files\.. is.
Even as a technical user myself, the directory hierarchy is mostly just an abstraction that gets in my way. My work (i.e. source code) is in git, my games are in Steam, my email is at Google, my music is on Spotify, and all the "other stuff" is in Dropbox.
What I need is an OS that ties all of these things together seamlessly, regardless of whether the bytes themselves reside on the disk inside my computer or on some website somewhere, but that can provide sufficient tools, when needed, to fix leaky abstractions.
Let's suppose you could soon buy a new workstation that ran a nice clean Chrome OS desktop but had sufficient memory and hardware-level virtualization that clipping in a "proper" Linux, or all the MS libraries such as DirectX needed to run Windows apps, was a trivial activity that could be abstracted away from the end user if required. Wouldn't you be curious to buy one?
Regarding remote resources, there are currently various FUSE plugins which can present remote resources in the filesystem (e.g. flickrfs). There are of course issues with the abstraction (how do we represent tags in a hierarchy?), but the use of these plugins is mostly transparent to the user (ignoring performance). It's certainly possible to build a layer atop these that would integrate better with the system (e.g. add a Facebook selection to the open dialog).
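For the curious, here's a minimal sketch of that idea in the spirit of flickrfs, using the fusepy library (pip install fusepy). fetch_photo_list and fetch_photo_bytes are hypothetical stand-ins for whatever remote API (Flickr, Facebook, etc.) you'd actually call:

    # Read-only FUSE filesystem exposing remote photos as local files.
    import errno
    import stat
    from fuse import FUSE, FuseOSError, Operations  # fusepy

    def fetch_photo_list():
        # Hypothetical stand-in for the remote service's "list photos" call.
        return {"cat.jpg": 4}

    def fetch_photo_bytes(name):
        # Hypothetical stand-in for the remote "download photo" call.
        return b"...\n"

    class RemotePhotoFS(Operations):
        def __init__(self):
            self.photos = fetch_photo_list()  # {filename: size}

        def readdir(self, path, fh):
            return [".", ".."] + list(self.photos)

        def getattr(self, path, fh=None):
            if path == "/":
                return dict(st_mode=stat.S_IFDIR | 0o755, st_nlink=2)
            name = path.lstrip("/")
            if name not in self.photos:
                raise FuseOSError(errno.ENOENT)
            return dict(st_mode=stat.S_IFREG | 0o444, st_nlink=1,
                        st_size=self.photos[name])

        def read(self, path, size, offset, fh):
            data = fetch_photo_bytes(path.lstrip("/"))
            return data[offset:offset + size]

    if __name__ == "__main__":
        FUSE(RemotePhotoFS(), "/mnt/photos", foreground=True, ro=True)

Tags don't map cleanly onto a single hierarchy, so a real implementation has to pick a convention (e.g. one directory per tag, with photos appearing in several).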
You also seem to want to conflate applications with the data they manage. For a large number of users this is fine (iTunes is already their music, IE is the internet, etc.), but for many others it's problematic: how do you manage moving data between these applications without a common intermediary? Consider scanning an image, touching it up, then including it in a document: where do the original and altered versions live, and how are they accessed by applications?
Conflating data with applications is an interesting issue. On the one hand, I feel there may be a move towards this; for example, it would be advantageous to (say) Apple for every photo you took on your iPhone to be tied into the iCloud ecosystem, never to escape.
On the other hand, storage is such a commodity that there's no reason you couldn't hold your data where you want but hook into the functionality of a particular application to organise it.
For example, maybe you use Gmail to look through the archives of old emails from a previous provider that are stored on your HDD. Spotify already does something like this with its local music functionality.
If Google can produce a Linux-based operating system that addresses the needs of casual users... [they can] simply eat everyone else's lunch.
He also addresses the elephant in the room that the major Linux UI pushes that consciously and grandly targeted those goals were grand steps backwards. That the things the UI prognosticators think users want, like abstraction and simplification, alienated real Linux users like Linus. That normal users missed features the UI cognoscenti banished as "too confusing," such as "easy mouse configurability for things like how to launch applications."
When you talk about abstraction, metaphors, making the filesystem invisible to users, and (most telling of all) appealing to "casual users" it sounds like more of the same talk that we've had for many years now. The more grand and self-conscious the community gets about user interfaces, and the more they target non-techie users, the less useful the desktop environments become for their actual users. That missing configurability that Linus laments, which was sacrificed for the sake of "casual" users -- did we really attract a bunch of casual users? Did we eat anyone's lunch, take market share from OSX or Windows? Or did we just make our desktop environments less usable for the people who actually use them?
Linus says GNOME was "useless" without that configurability, and that this new device might be usable as a laptop if it had a terminal and a development environment. That doesn't sound like a call for Linux environments to appeal to "casual" users. I don't think Linus is asking for even more simplifications to get in the way of his usage, like abstracting away the filesystem.
The elephant in the room is that Linux desktop environments did a better job of giving Linux users what they wanted when that's all they tried to do. Trying to give us what we didn't know we wanted has been a failure. Targeting non-techie users has gained us nothing that we didn't already want for ourselves. It's time for the community to get off its high horse about UIs and go back to what it's good at, which is humbly (and very successfully) catering to its own needs. And nobody should feel ashamed of creating or using an operating system that never eats Apple's or Microsoft's lunch (isn't that such a ten-years-ago obsession?).
(P.S. You know what's good at abstracting away the filesystem when and how it's appropriate? Applications.)
I agree that applications are good at abstracting away the filesystem, which is why I would argue for a lighter-weight OS (which I presume Chrome OS is).
Now whenever there is an innovation, it is reviled and everyone asks for the interface which looked roughly like Windows 95.
So I guess they listened to the wrong messages.
Targeting "casual" users has resulted in some valuable work such as better stability and completeness in many programs, simpler configuration interfaces for many things (such as wireless configuration GUIs that were better than Windows'), and better support for multimedia. However (leaving aside the fact that those were all things that techie users were begging for anyway) there was a downside. Customizability and configuration options disappeared. People made assumptions about "normal" users that implicitly labeled a large chunk of the Linux community abnormal. Case in point: if you configured your wireless connection using NetworkManager and then switched to a different desktop environment, your wireless connection might not work anymore. That might sound like a bug, but it wasn't. It wasn't designed to work that way. How could that be considered remotely acceptable? Because normal people don't use alternative desktop environments. (Thankfully, I've since read that NetworkManager accepts plugins that write to the correct system configuration files, though I don't know if distros test and install them.)
And where's the payoff? What was the payoff supposed to be, anyway? Back in the nineties and early 2000s many people assumed Linux had to make it on the mainstream desktop in order to be successful, but I think we've put that misconception behind us. Yet people still hold up the "casual user" as the gold standard that Linux is supposed to cater to, as if it were a moral imperative. Few Linux users fit that stereotype. It isn't that we're concretely knowledgeable about Linux. If you put ten randomly chosen Linux users in a room, chances are that for every component of a running Linux system there will be somebody in the room who is utterly unfamiliar with how it works or how to configure it. Obviously we should cater to ignorance, if only so we have the luxury of remaining ignorant ourselves. But there is a general savoir faire with computers that it is acceptable to assume. By savoir faire I mean whatever factor it is that explains why I am the person in my family who always gets a call on the phone when someone needs help with Windows 7. Even though I've never developed on Windows, haven't used Windows for anything more than e-mail, web browsing, and Office in ten years, even though I haven't used Windows personally in six months, even though I've never used Windows 7 or Vista at all. They still ask me for help with Windows 7, and usually I can help them.
All I'm saying is that it's acceptable for Linux UI designers to assume that the people using their UIs are likely to resemble current and past Linux users. Take a break from innovating for projected, postulated, hoped-for users. Innovate for the ones we already have instead.
I guess that's the true value of geek cred.
It's true that liking Gnome and/or Unity seems to have a kind of stigma these days.
You can do exactly the same thing: make your voice heard; state clearly, and preferably factually, what you think is best for the free software stack. People will listen!
I completely agree with his comparison to GTK3 (and I think the same applies to Unity). Forcing users into a singular workflow with limited options will always seem somewhat inhibiting, regardless of how "easy to use" the UI may seem. That flexibility is exactly what's lacking from these modern Linux window managers.
"Today it decided to update itself to the new chrome version with the Aura window manager."
This will be a boon to the average user, as they'll have constantly up-to-date software, and security holes can get patched as quickly as they're found. That said, I would be concerned with things changing like that without my control, but I expect I'm in the minority.
Disappointing, although I'm happy to see progress towards an OS that treats web apps as first-class.
I see a lot of potential in a "drop": a personal cloud server installed at home, with the same user-friendliness as e.g. Google Docs. Public data/services in the cloud, personal stuff at home. There's still plenty of opportunity to integrate MMOC (massively multiple online collaboration?) into those apps. An example of this might be Diaspora.
Many drops make a cloud, too.
The hardware is another story... they're great for light browsing and Netflix... kinda.
I've been using WindowMaker for 10 years. It's pretty perfect. No gimmicks. No unnecessary features. Lots of flexibility. Minimal. User friendly. Fast. Stable. It just works.
If someone has used Windows 95 before, a whole slew of operating systems after it will be intuitive, and a random user will know how to use them without instruction. They don't need anything more advanced. They just need it not to crash or bloat or change and become confusing. Most people just want shit to work.
I still think that OS X provides a much better GUI than WindowMaker, which by your logic should never have been introduced, so people would have missed out on its improvements (but I do miss focus-follows-mouse...).
"The latest Chrome OS release is only available for Samsung and Acer Chromebooks as Cr-48 Chromebooks will skip Chrome 19."
From the comments: http://googlechromereleases.blogspot.com/2012/04/dev-channel...
The computing landscape is changing right now, and any company that revolves around servicing Windows desktop software is going to be in for a real hard time.
Whether or not Google's Chrome OS is the eventual successor is not clear (I don't think it is), but their general idea is correct. The all-network, all-cloud world is coming whether you like it or not.
And where do you expect the operating system and marvelous web stack where all these things are created to reside? Burned in EEPROM?
I think the consumer already can't really differentiate between what is local and what is over the internet; by and large, they don't even know what a browser is.
But if what you say is right, the pendulum, at the consumer level, will soon swing back the other way the first time someone can't access the "cloud" on a trip, or forgets to pay their "cloud access bill" (phone/internet, whatever) and then gets locked out of... well, everything.
Castrating the device and agitating the users by making everything Microsoft Palladium style will continue to be a bad idea. Remember Sun's motto, "The network is the computer"?
Nah, don't think so; the computer is the computer. Networks are pretty important, but they do not replace everything, they are fundamentally not reliable, responsive, or resilient enough to.
And every time a step is made in any of those directions, it's done because some machine or some network, somewhere else, is better off. That rising tide will lift all boats, keeping the consumer devices edging ahead; that is, unless you are harking back to the "give consumers bare-minimum craptaculous cloud terminals" idea.
That's been tried about every 3 months for the past 35 years or so and has never gone well.
One day the idea that you don't have internet access will be just as silly as saying you don't have electricity. It's in the very infant stages, but it's coming. For what it's worth, I'm typing this on my Cr-48, which has 3G, so I'm pretty much there.
I'm inclined to agree, especially in cases like movies or music. The idea that we all have a copy of a song on our hard drives is odd. Leaving aside ideas of copyright and licenses, there really isn't a reason all media shouldn't live in some kind of shared music/video folder in the sky. I'm going to watch Breaking Bad S2E4 maybe a handful of times and eventually forget about it. Keeping it in the cloud frees the user from needing to think about keeping it or backing it up or deleting it.
Or maybe I'm biased from living in a fairly cloudy world already.
Computers as a personal mainstream entertainment device will eventually work much like television. Data retention policies will be at the whim of the content distributor and consumers will select from sets of content dictated by different price packages.
Computers as midrange workstations for engineers, artists, musicians, and scientists aren't going anywhere and will be the same as they always have. Same goes for the NOC and the Datacenter. Devices that collect and analyze data will work much in the same way. The people that will create and maintain the consumption paradise will continue to do things on traditional configurations. The biggest change here is that display and input systems will continue to diversify in form and function. However, the machine will still be a rectangular block, with all the internal components you are used to.
The million-dollar question is what people who do mostly office-productivity work will use. For instance, the person at the dentist's office who makes the phone calls and does appointment and billing management. Most could probably get by with a desktop version of a smartphone, but nobody has produced the right one yet.
But this doesn't mean that, say, the traditional computer will disappear from the dentist's office. Just the opposite. They are in every room now, hooked up to cameras and x-ray machines, talking to redundant servers either on- or off-site.
One worrying thing is that when things just work you lose some people that might have otherwise played around with things, possibly (probably) breaking things along the way, and learning important lessons. I learned to use Dreamweaver from downloading it illegally and playing around with it but it made it easy to write HTML and got me to learn ColdFusion and SQL. Poking around a DOS prompt and accidentally formatting your computer teaches people some important lessons.
To me the million-dollar question is how to keep kids engaged with the transistors and languages of computers when we start to pretend they don't exist. What happens when the next would-be Linus grows up in a family with only iPads and iPhones?
Much of the world lives with unreliable or intermittent electrical power, and likewise unreliable or intermittent internet service.
Also, with mobile devices you're talking wireless bandwidth, which is shared by everyone in a given area. The combination of crowds and streaming media is not trivial to engineer for. There are some promising technologies (MIMO, etc.) but I don't believe any of them removes this fundamental scaling limit.
Local cache will continue to be important. What will be lost is the requirement to manage which device your shit is on.
This is interesting, because I see things from the totally opposite perspective. I think a period of war is coming and that people in America will look back astonished that they didn't realize the value of what they had, including reliable electrical service.
Of course, the internet is far too fragile to successfully operate in a real war zone so I expect the internet as we know it will change rather dramatically too.
I don't think that day will be in the next 3 years.
Browser cache is around 1GB and web developers are itching for vast landscapes of local storage systems.
Nah, don't think so; the computer is the computer. Network... is not reliable, responsive, or resilient enough to.
Apple might have successfully convinced everyone into thinking that downloading software and installing it is some fancy new thing, but really it's just old-fashioned nonsense that is only necessary for their arcane software platform to work.
So no, moving everything to the cloud, tomorrow, would vastly increase hard drive sales permanently. Oompff! http://allthingsd.com/20111005/it%E2%80%99s-all-about-conten...
Here's an example. I upload a video to YouTube. YouTube converts it into about 8 formats (depending on the initial quality: (mobile + Flash 8/10 + HTML5 Ogg/WebM) * multiple resolutions), stores the original, generates dozens of multiple-sized thumbnails for the seekbar, farms these out to multiple data centers, and then stores a bunch of metadata along with that (every like/dislike, comment, playlist, landing source, geoip of user). You are looking at something like 15 times more space needed for the cloud architecture.
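A back-of-the-envelope version of that multiplier; every number below is an illustrative assumption, not a real YouTube figure:

    original_gb = 1.0
    transcodes = 8               # formats x resolutions, per the estimate above
    avg_transcode_ratio = 0.6    # each transcode ~60% of the original (assumed)
    thumbnails_gb = 0.05         # seekbar thumbnails in several sizes (assumed)
    metadata_gb = 0.02           # likes, comments, playlists, logs (assumed)
    datacenters = 3              # original site plus two replicas (assumed)

    per_site = (original_gb + transcodes * avg_transcode_ratio * original_gb
                + thumbnails_gb + metadata_gb)
    total = per_site * datacenters
    print("~%.0fx the uploaded size" % (total / original_gb))  # ~18x here

Tweak the assumptions and you land anywhere in the 10-20x range, which is the point: cloud storage multiplies the bytes rather than eliminating them.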
You store that in the cloud too, of course! It's turtles all the way down.
The data may live in the cloud, but it will be heavily cached locally. We'll still need several GB or TB of local storage, be it spinning disks, SSDs or permanent RAM.
Grandparents won't want to be without the home videos of their grandchildren just because their WiFi router stopped working or their ISP is having problems. And it will be a long time before mobile coverage reaches everywhere you might take your computer: communications black spots, rural areas, planes, tunnels, etc.
We'll still need several GB or TB of local storage, be it spinning disks, SSDs or permanent RAM.
These apps will still be stored (permanently cached as a web-app) and running on the local box and able to run in an offline mode. We won't be installing pre-packaged software that we have to upgrade manually, but we'll still need a powerful machine to run them; not a thin client.
The mechanisms whereby the software gets on our machines may change, but it's still going to be on our machines.
That will only happen when my pipe hits SATA speed.
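For scale, a rough sketch of the gap behind that claim (illustrative numbers, not measurements):

    sata2_mbit = 3000       # SATA II: 3 Gbit/s (SATA III doubles this)
    broadband_mbit = 10     # an optimistic 2012-era cable/DSL line (assumed)
    print("local disk link is ~%dx faster" % (sata2_mbit // broadband_mbit))  # ~300x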
Gmail is orders of magnitude faster in every way than my work Outlook installation.
Google Docs is much quicker to start & use than Office.
I've often used online image editors rather than start Photoshop.
Yes, if I had an SSD on my work computer it might help, but I still think there is an important principle here.
Take Google Music, for instance. You can have 200 GB of music "on your phone", even though none of it is actually stored on your phone.
Actually, things haven't improved much in this department for the vast majority of Americans.
You mean the 3Mbps DSL I had 10 years ago versus the 3Mbps DSL that's still the fastest thing available at my house today?
How do you think that "the cloud" stores data? Maybe you meant "hard drives in consumer devices."
I agree that I don't believe this future is likely, but that's no reason to downvote him.
Web apps solved the problem of having to figure out how to find, install, or update software and not having it everywhere you go. App stores have caught up on discovery/installation/updates, and mobile has solved the problem of not having it everywhere I go.
Web apps will eventually catch up on performance and maybe someday they will feel like first class citizens in any given OS rather than being relegated to mess of throwaway tabs. We've been within 3 years of that for something like a decade, but maybe we really finally are within 3 years of that.
But we aren't within 3 years of having any two web apps looking similar and using familiar UI elements. Every web app is a new UI to learn; maybe not a problem for you or me, but mobile has reached 3-year-olds and 90-year-olds and everyone in between, most of whom have not acquired the skill to pick up unfamiliar UIs and don't want to.
As for data moving to the cloud, I'm with you. And I have a lot of native apps that interact with remote data.
The problem with native apps is the installation barrier: if I had apps installed to do everything I'd love my phone to do, I'd be overwhelmed with hundreds of installed apps. Web-based apps have the potential to slip in and out of use as necessary, with more applicable discovery mechanisms (geo, proximity, etc.).
The reason is that native apps can move so much more quickly than web apps. I'll bet Apple could get widespread adoption of a zero-install feature on iOS in six months, simply by mandating it for the app store and featuring early adopters. But HTML5 YouTube has been in beta for over two years, and the <video> tag was introduced over half a decade ago. The web moves at a snail's pace by comparison.
I did try to watch the video, but scrubbing didn't work, which sort of illustrates my point.
On the development side of things, I believe we're going to see a massive shift back to web-app development simply because of economics. There is no way any small group of developers can build desktop, iOS, and Android versions of their application simultaneously. This iOS fad WILL DIE because of costs, and the fact that Android numbers are rising dramatically; people need a way to support each mobile platform and still be able to use the app from their desktops too. Mobile web apps are the answer.
Aren't webapps isolated? Funneled through a strict paradigm? And also at the other end of a comparatively slow link?
The idea of the general-purpose computer as it exists in the current desktop form must be put aside. We can't head towards a future where everything we do is on one portable device, and from there to ubiquitous life integration, when every app can read all your data and send it to anyone for any purpose. Android is falling foul of that right now. We have to have isolation and sandboxing, different trust levels, and restricted access to more important data (contacts, for example) to make a solid and trustworthy future platform.
Playing devil's advocate: Google Drive has an API for web apps to access the user's files (with permissions only for files created by the app itself or that the user specifically opens). The web app declares the MIME types it can open, and then the user can just do Open With → App.
Sure, it's GDrive-specific for now, but the API is fairly generic, and if it's successful I can see other online drives implementing it themselves. Well, assuming Oracle loses this lawsuit, I guess :)
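To make that concrete, here's a minimal sketch (not Google's sample code) of how the scoping works with google-api-python-client; the OAuth dance that produces `credentials` is elided, and it's assumed to be an oauth2client credential limited to the drive.file scope:

    # With the drive.file scope, files().list() only returns files this app
    # created or files the user explicitly chose to "Open with" it; the rest
    # of the user's Drive stays invisible to the app.
    import httplib2
    from apiclient.discovery import build  # pip install google-api-python-client

    DRIVE_FILE_SCOPE = "https://www.googleapis.com/auth/drive.file"

    def list_visible_files(credentials):
        http = credentials.authorize(httplib2.Http())
        service = build("drive", "v2", http=http)
        result = service.files().list(maxResults=10).execute()
        for item in result.get("items", []):
            print("%s (%s)" % (item["title"], item["mimeType"]))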
APIs de-isolate them. Both the proprietary ones that already exist and open, distributed ones like http://webintents.org/
And besides, aren't native apps on mobile isolated too?
And the link between the user and the app may be slow, but it only needs to carry the UI. The data will flow from "cloud to cloud" at very high speed.
We can't head towards a future where everything we do is on one portable device, and from there to ubiquitous life integration, when every app can read all your data and send it to anyone for any purpose. Android is falling foul of that right now. We have to have isolation and sandboxing, different trust levels, and restricted access to more important data (contacts, for example) to make a solid and trustworthy future platform.
Uh, Android does have sandboxing and isolation, and apps can only access your data if you specifically allow them to. iOS was the one which didn't require permission to access your contacts (though Apple has said it will).
Wrong; the filesystem is FAT32 and has no additional permissions beyond a "read-only" bit. Any data that's there can be accessed by apps, as things stand.
Relevant text: "Every Android-compatible device supports a shared "external storage" that you can use to save files. This can be a removable storage media (such as an SD card) or an internal (non-removable) storage. Files saved to the external storage are world-readable and can be modified by the user when they enable USB mass storage to transfer files on a computer."
I forgot to specify that in my original post. My mistake.
This is an old presentation by Gruber (Daring Fireball) from 2010, but it is still excellent. And only 10 minutes long. He argues that iOS vs. the Web is a false dichotomy; in fact, the Web was from the start a big part of iOS. He then differentiates between "Web Apps vs. Browser Apps". Your notion is that a Web App is something which uses HTML. But he argues, and I think this rings true, that a Web App is something which uses HTTP.
"How can you say that Twitter, even though it is a native iPhone App, isn't really a Web App?"
Most people don't even understand what a filesystem is, never mind whether it's open or not. I agree with you that data is moving online, but that's orthogonal to the native vs. web app question.
Current trends indicate user attention is shifting even more to native, BTW:
The web vs native debate is really just a debate about which technology is better suited to building cloud client apps. HTML5 + JS is just another UI stack like Cocoa Touch or Android.
I don't know you from a bar of soap but reading your comments, I get the impression of an American late-teen with a solid amount of technical knowledge, strong opinions but staggeringly little perspective.
Even amongst those who are capable of administering an 'open platform cloud infrastructure', there are few who have the time. In my own circumstances: ten years of programming experience, recently shifted from consultancy to an analysis role at a major financial institution, a startup on the side, a seven-week-old baby, a ten-year-old daughter, a wife, 33 years of age... Let me tell you: the more my technology 'just works', the better. I won't accept my rights being trampled on, but a system that is elegant and robust, albeit a bit restrictive, is far preferable to spending hours on end tinkering with something that I need to actually use, as opposed to configure endlessly.
You are correct that data is moving into the cloud, and away from Apple's restricted file system (or any local file system). But that has nothing to do with native vs. web (HTML) apps. Native apps can easily store all their data in the cloud (ANY cloud, not just iCloud), and still be native apps in the usual sense of the word.
It seems to me that the trend is toward native, cloud based apps. And these two concepts are not in conflict. As cageface says, they are orthogonal.
Which is different from browser apps... how, again?
Next you're gonna tell me they're gonna introduce a language which when compiled into a single compressed file in some kind of 'virtual mechanism' will run on any platform and everyone will use it because the technology is so much better than anything else.
I had this conversation with my girlfriend the other day:
Her: If ____ had a [mobile] app I'd waste so much time on it.
Me: You know you can use their website on your phone, right?
Her: Yeah, but apps are so much better.
Out of curiosity, I went to this particular site on my phone, and they actually do have a mobile web app, complete with home screen icon and the meta tags to make it fullscreen. If you add it to your home screen it could pass as a decent native app.
There are two problems with web apps right now:
1) Quality. It's pretty hard to make really nice mobile web apps, or apps in general. There are lots of behaviors in the browser that are un-app-like which you need to work around. Native platforms were designed for apps from the start. Performance can also be an issue (possibly solved by something like NativeClient).
2) Discoverability. Users look for apps in the app store first. If they don't find one, they often don't check for a mobile web app, either because they're not aware such things exist or because they assume it will be crap. App stores need better integration for web apps. In the meantime, wrappers like PhoneGap can help.
Also, why would someone pay 10 dollars a month for 50 GB of space in the cloud when you can buy 1 TB portable hard drives for dirt cheap?
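The arithmetic, with 2012-ish prices (both assumptions):

    cloud_per_year = 10 * 12                  # $10/month for 50 GB -> $120/year
    cloud_per_gb_year = cloud_per_year / 50.0 # $2.40 per GB, every year
    drive_per_gb = 100 / 1000.0               # ~$100 once for a 1 TB portable drive
    print(cloud_per_gb_year, drive_per_gb)    # 2.4 vs 0.1

Of course the cloud price buys redundancy and anywhere-access, which is the part the raw $/GB comparison leaves out.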
I know it's happened to at least one indie developer, and at least one YouTube blogger comes to mind.
That having been said, local hard drives aren't going to die anytime soon. Uncle McCarthy has not yet had his day, at least not on that one.
Maybe stealing it and using the data. If someone swipes my drive the data is still gone. Even if it is inaccessible.
Heh. "Pandora didn't get half the kicking around she deserved", as I think R. A. Heinlein jokingly put it.
(Not that I believe that GP is that prescient.)
I always like people who over-simplify things to the extreme. They don't seem to understand that different solutions exist because there are different needs. Ever thought about editing a more-than-HD video over a network connection? Ever thought of RAW image storage, where one image takes 30 seconds to load through your ISP? Ever thought of people living away from large cities, with slow internet connections?
Obviously not. You must be living in a microcosm where everyone has 1Tb/s Net access under their tables.
Of course not! 2014 will be the year of Linux on the desktop, just like every other year for the last 15 years!
I was doing this for a very non-technical user. All I said was that Ubuntu was like Windows but free. Nothing confusing about that (from their reaction).
>there is very little software for it (relative to what is available for Windows)
Yes and no. Games are a big problem. For most tasks that most users perform, there's a web browser, email client and an office suite. Applications moving to the web only helps with this. Bespoke software which manages a warehouse's stock we don't really care about in this context.
>a moving target (both for users and developers - radical changes in UI from one year to the next, libraries that aren't ABI stable for a year let alone 2 decades)
I agree. Distributions should take more care with this, or desktop environment developers should call their new DE something completely different and not make it look like a replacement but rather a competitor. Then again, installing an LTS release should keep a person happy for a couple of years. After that, if they're not a power user, there may be little to no benefit in upgrading to the newest release.
>it's too hard (approaching impossible) to make real money from Linux software.
Tell that to: Sun (hah, great example), Red Hat, Google, Apple, the countless software agencies developing on an OSS stack, all the businesses in the world running on a network backbone mostly composed of open software, and all the research projects using fancy supercomputers running Linux. Almost everyone uses Linux in one way or another, and almost everyone makes money. Red Hat is proof that you can directly use Linux to build a billion-dollar business. It's a young business model, but it works well.
I'll be hanging on to my local hard drive tooth and nail. (Backing up to S3 or something mind you, but make no mistake my local disk is the master copy.)
A "major privacy event" seems quite likely though.
How do you know that there hasn't been a major data loss event for Google? Would you be able to tell the difference if, say, a quarter of their data were gone?
(I agree that it would be more visible if Amazon had a major data loss, but the fact that you don't know about it doesn't mean it didn't. In fact, can you name a major data loss of which you do know? (I can't, but that may just be because I don't follow these things.) Surely one can't conclude from the fact that there are few well publicised examples that businesses do not suffer such losses.)
Google had taken ill. The problem was the index storing the contents of the web in Google's servers. For a couple of months in early 2000, it wasn't updating at all. Millions of documents created during that period weren't being collected. As far as the Google search engine was concerned, they didn't exist.
The breakdown that is going to happen when Facebook and/or Gmail's database gets dumped and uploaded to The Pirate Bay or Wikileaks is going to be mind-blowingly tremendous.
On top of that, I wouldn't expect any slowdown after "a major privacy event". I fully believe that we are in a world where very few organizational secrets can be kept. Virtually no one, including major corporations and governments, has a correct concept of what is necessary to protect digital data. In the early days of the war, when everyone is still able to use the internet, the intelligence leakage is just going to be astonishing.
And business... pushing all your data to the cloud is a mistake, for many reasons, such as security. Look, there is no way I'm putting sensitive data (accounting, patient information, etc.) in the cloud. It's just not worth the risk or the lawsuit.
Also, I know a guy who owns a gym. He was talked into a "hosted cash register". BIG MISTAKE. A week after the install, an early-morning storm rolls in from the ocean. His power flips (a common thing in South Florida), but his Comcast never came back online. He was down (and out of business) for 4 hours. Personally though, I enjoyed hearing his plight. He's a cheap sob.
Some of you might say that's what he deserves for relying on Comcast. Fine. But what small business you know is gonna go through the expense of a T1/3 or fiber? That $29.99-a-month cash register just became $300+ a month. Lovely.
My point: out in the real world (in America) the infrastructure is barely stable. Until the carriers seriously invest in their networks (and I doubt they ever will, since they only provide exactly what they have to - I don't even want to get started on our TERRIBLE cell service in this country), the "cloud" will be a site where you share pictures of your dog.
Thick apps aren't going anywhere in the near future. Video Editing, Adobe Photoshop - they'll all be huge-honking local apps for at least another 10-15 years.
Some work might have to be done to minimize the amount of data that gets transferred when saving and loading, but that's about it. I'm sure if you compress your 100MB slide deck you'll find it much easier to send over!
Seriously, all that cloud stuff is overrated nonsense. There are some areas where it might make sense, but stop trying to put every fucking thing on the cloud.
It just doesn't work. End of story.
True, companies will encourage voluntarily uploading your stuff to their computers. Why shouldn't they? Just like they solicit your email address, track your movements, snarf your smartphone address book and other neat tricks. There's value in getting that info.
That doesn't mean everyone is going to fall for it. Some will, no doubt.
But GBs are getting cheaper every day, and not just for "cloud providers". How many terabytes does one person need? You can fit your whole life's worth of data onto today's consumer digital storage capacities.
Are we to imagine a future where consumers cannot purchase storage media? What drugs are you on?
The web is brittle. It's but one of many things that can be run over IP. You web-fanatics crack me up.
Are we to imagine a future where consumers cannot purchase storage media?
My bet is that within 3 years we'll reach the threshold where it's worth it to store files online rather than download them and store them permanently on hard drives in a computer running constantly in your home.
How does "it is getting harder and slower to move large files over the net" lead to the conclusion "therefore we should store all our large files on the net and retrieve them whenever we want, rather than keep them on local media with negligible latency and huge storage space"?
If at this point, there is no local storage, and it's stored in the cloud, you need to pay the bandwidth price every time you wish to view it, rather just on taking possession.
This particular example, about Canada increasing bandwidth charges, actually works against your argument.
Sure, some data won't need to make it to the client, but that's mostly only true if you're a scientist or otherwise run computation on data, and that's hardly the case for the general person.
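Concretely, under metered bandwidth (sizes and view counts are illustrative assumptions):

    movie_gb = 2.0                        # a feature film at decent quality
    views = 5                             # watched a handful of times
    download_once = movie_gb              # pay the transfer cost on possession
    stream_each_time = movie_gb * views   # pay it again on every viewing
    print("%.0fx the metered transfer" % (stream_each_time / download_once))  # 5x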
But what you describe is a networking problem (transit costs of your ISP passed on to you). Not a storage problem.
A "cloud storage" provider is just going to keep the data you store in a datacenter near to you anyway, not halfway around the world. But it's not like keeping your physical stuff with a storage company. You can easily store the data yourself with many more advantages. The marketing teams for "cloud storage" services will no doubt try to convince people otherwise.
If someone (Google, Mozilla) pushed for it, this could be merged with a browser API / VirtualBox for speed and completeness, and you could download a paused machine with state.
It's not happening now, but it's possible already.
Inferno IE4 plugin: http://www.vitanuova.com/inferno/plugin/index.html
Lively Kernel: http://www.lively-kernel.org/
Native apps may be toast, but they won't go without a fight.
Clearly you have the religious fervor upon you. Personally I like it when I see the benefits. Currently that's for a subset of my computing tasks. Sharing stuff? Sure. Off-site backup, OK. Most other stuff? I'm happy sticking to the PC model rather than the thin client-mainframe timeshare model.
On the whole, the ability to transparently cache big chunks of data locally turns out to offer a superior user experience in many non-insignificant use cases (e.g. downloading content for later consumption while off-grid).
I don't think it's clear that the web approach will win out, and I certainly think it will not be settled in three years. And photo and video libraries (which are popular and growing faster than bandwidth or storage) aren't going to the cloud any time soon.
Until mobile browsers have access to the full range of hardware on a device (camera, microphone), native apps will have an edge. Example: iOS doesn't allow camera/camera-roll access from the browser, Android does, and WP7 used to but removed it. If you want to offer people a way to send you pictures from your app, it's gotta be native for now.
I'm not saying you're wrong, just that similar predictions haven't panned out in the past.
The web is for documents.
Want to make a bet? Name your stakes.
If I recall correctly, Aura also serves as a proper, full window manager and already has support for Wayland too!
To clarify, I don't think it's a coincidence and I think Google employees probably submitted these.
If you look at the posters' profiles you can see where they both work. (Neither is currently a Google employee.)
They were both worthy of submission IMO.
(The Douglas Crockford post more than this one though.)