Web 2.0 was a pretty annoying term, but it was at least loosely defined as something. Basically, if you had some mix of HTML, JavaScript and XMLHttpRequest doing stuff without complete page reloads, you were more or less there. Add some social context to it and you were hot.
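For what it's worth, the "without complete page reloads" part boils down to something like this minimal sketch (the /api/comments endpoint and the "comments" element are made up for illustration):

    // Fetch a fragment asynchronously and patch it into the page in place,
    // instead of reloading the whole document.
    const xhr = new XMLHttpRequest();
    xhr.open("GET", "/api/comments"); // hypothetical endpoint
    xhr.onload = () => {
      const target = document.getElementById("comments"); // hypothetical element
      if (target && xhr.status === 200) {
        target.innerHTML = xhr.responseText; // swap just this fragment
      }
    };
    xhr.send();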
Cloud computing, on the other hand, has to be the fuzziest, least definable, over-hyped buzzword I have heard in a long time. When people started calling a networked machine with a web browser a "cloud computing device", it totally broke my bullshit-meter, and my tolerance for the term approached zero.
That Amazon S3 incident also showed that the "guaranteed uptime" supposedly promised by the "cloud infrastructure" was pretty much fiction.
Without a second thought, too busy to do a reality check, people started talking about setting up "cloud balancers", using multiple cloud providers and whatnot, to maintain uptime. Obviously not realizing they were full of shit, too busy chasing hype instead of a working solution.
I'll stick to local storage and local DBs, you know, a proven solution without the added risk of externalizing your core data and services, thank you very much.
Adding "cloud" or "grid" is really bs... when you think about it, companies like IBM and EDS have done this for years with their data centers. I don't think there is a health insurer in the U.S. who's purchased their own hardware in 20 years or more. It's all gussied up with a new, trendy name, and it's repackaged for the little guys, but it's about as new as pegging your jeans.
Hyped? Sure.
But fuzzy... it's not that vague, is it? It's moving data storage and/or processing to servers. Even cloudier if it's done through the web, and cloudier still if it's something that was once done on the desktop.
Outlook >> Gmail
Word Perfect >> Google Docs
Even wording it most fuzzily, it'd be something like: 'Making the device less important.'