If you want to make a new prediction, a new thread has gotten started: https://news.ycombinator.com/item?id=21941278.
- Bitcoin would become the decade's best investment by far
- the President of the US would conduct foreign policy through Twitter
- electric scooters would become a billion dollar business (Bird, Lime, etc.)
- the sharing economy would threaten the taxi and hotel industries (AirBNB and Uber)
- escalation of school shootings
- the explosive growth of quantitative easing
- negative nominal interest rates on sovereign debt
- revelations of mass surveillance by the US government
- streaming video services would begin to produce content rivaling major studios in quality
- Amazon would become the everything company
- DNA testing would become a major recreational activity with numerous judicial and social implications
- the US would approach energy independence, driven largely by a boom in oil extraction technologies
Also interesting to consider how all of these ideas would have seemed more or less ludicrous in 2010.
Bubble thought: school shootings are down from previous highs and are actually very rare.
COVERAGE and willingness to use any incident to promote a very specific narrative are up.
What we're seeing is a large increase in the number of school shootings, but a decrease in the average success of each school shooter. Is it surprising that the average shooter has become less effective when schools are spending a lot of time and money preparing to deal with them? Shooter drills, established lockdown procedures, buildings being designed with escape and hiding in mind, staff training, employment of security guards, etc.
The data you're citing states it plainly: there are more shootings at schools now than 10 or 20 years ago, and more people are being shot at school than 10 or 20 years ago. 50 people killing 3 kids each isn't somehow safer than 25 people killing 4 kids each.
> 50 people killing 3 kids each isn't somehow safer than 25 people killing 4 kids each.
But you have to treat people killing 2 kids in a completely different way from people that are killing 20. Drills, lockdown procedures, almost all gun limitations... those don't do anything against a single murder.
Seems many will not disarm in the US, so should we be exploring other options rather than repeatedly saying the US should be more like other countries?
Not at all. He's accurately reflecting gun laws and car laws. You are willfully hand-waving the fact that cars are entirely unregulated except for use on public roads, whereas guns are regulated in every facet of their existence.
You are doing this because you read it on a liberal site and heard the liberal talking point that guns are easier to get than a car, which is wrong.
It’s not semantic, your argument is invalid.
In my jurisdiction one may not "move" or "leave standing" upon a public road an unregistered vehicle, in addition to not being permitted to "drive" it.
The state wants its registration dollars.
The ONLY people that say this are the ones that would be (fucking) horrified if we treated guns like cars.
Ok. You are proposing I should be able to buy any gun I want whenever I want regardless of size or capacity, automatic, or loud or silenced for cash without any checks or regulations at all? That the only requirement to use it in public is that I need a license - BUT - that carry license is good in all 50 states?
... that sounds good to me! Let’s do it.
Yours is an argument based entirely in ignorance of gun laws. And maybe car laws too.
> regardless of size or capacity
Vehicles of different size/capacity do require different qualifications to drive. Certain classes of vehicles require a CDL for their class. See https://driving-tests.org/cdl-classification-licenses/.
You also have to be 21+ in many states to even apply for them.
Maybe you’re confusing purchasing a car with licensing a car?
Amusingly... it was likely their opposition. This is why you should hold yourself and those you align with to high standards. It only strengthens your position... even if you are completely wrong (although only a little bit).
Does The Onion's satirical joke take into account all the places where there are no guns but this still happens?
Paris, France and Norway have both had worse mass and school shootings than any in the USA ever.
Norway: a school summer camp in 2011. I suppose it's not exactly what you think of when you think of a shooting inside a school building, but the people killed were almost exclusively children. The perpetrator used a rifle that has not been legal in Norway for civilians.
He used a Ruger Mini-14 and a Glock pistol for which he had a legal license as a civilian (for hunting and sport shooting respectively), purchased in Norway - the country with the second highest legal gun ownership rate in the EU - after he failed to illegally acquire guns in Prague.
You have a source for that?
I understood that neither of those guns was legal for civilians.
Also... to the topic. If Norway's extremely strict (draconian) gun laws didn't prevent this shooting, which was worse than any in US history, why would the US benefit from them?
Sure, I can do a Google search for you:
> Also... to the topic.
Oh, I'm not interested in playing out that tired debate. I just wanted to correct the misinformation.
Someone in New Jersey shot a rat 2 blocks from a school - school shooting.
Drive by between rival gangs near a college - school shooting.
Resource officer discharges his gun in his car - school shooting.
I think the only thing worse than lying about incidents to improve your statistics, is ignoring reality in arguing for policies and regulations that further instead of solve the problem.
It goes to show you never can tell
My overall sentiments, more or less:
1990->2000 we went from very few people owning personal computers to wider adoption of PCs and broadband was gaining traction (I personally had @Home cable internet in 1998).
2000->2010 mass adoption of cell phones, starting with the flip phones to pocket PCs to the first iPhone in 2007. Social media (MySpace, The Facebook). Peer to peer networking, piracy (The Pirate Bay). Laptops. The first tablets.
2010->2020 mild improvements on the stuff from the previous two decades
EDIT: I'll give the last decade Uber (although I've only used it once) and AirBnb since that was a huge shift as well, but my overall feeling remains the same.
I am surprised there is no mention of the smartphone, which is arguably the most important innovation in modern history.
In the 2010s, the smartphone (the iPhone 3GS at the time) went from niche to 4B users (iOS, Android, and KaiOS); that is nearly every person above age 14 in developed countries. I don't think there has ever been a product or technology innovation that important that spread faster than the smartphone. And it changed everyone's life. The post mentioned the Google, Facebook, and Amazon empires, all of which partly grew to this point because of smartphones. Technology companies together are now worth close to $10 trillion. The whole manufacturing supply chain in Shenzhen exists and became huge because of the smartphone. It was the reason TSMC managed to catch up to Intel in both capacity and leading-edge node. It was the reason we went from 3G to 5G in a mere 10 years, with all the investment that kept pouring in. It was the reason everyone went onto the Internet and an Internet economy emerged. It brought a handheld PC and the Internet to a much wider audience.
I would even argue it was the smartphone innovation that saved us from post-2008 financial crisis doom, as it created so much wealth, innovation, and opportunity.
I would point to the 2010's as the rise of two phenomena:
- Social networking and meme culture
- Explosive growth of software
These will prove to be just as important as the developments of the 2000's, but haven't had the time for other developments to build on them.
We can expect Chile to drop positions since its economy is going downhill, creating massive unemployment. Inequality will increase as people from the lower middle class move into poverty, as we've seen in Brazil since 2015.
In any case, many economists argue over this question :)
There is a reason those who don’t like inequality had to come up with new measurements to highlight it. There is little impact on the mean, median, and mode if the rich get richer.
They were clearly describing the handling and discourse (I suppose on a large public scale) of climate change as FUD, not climate change itself.
Valued at a billion dollars by venture capitalists. I'd be genuinely surprised if they are anywhere near a $B in revenue
And surprisingly the WoW one is the most off :D
Interesting how plausible this one is, yet turned out to be terribly wrong: the newer hyped languages that got some uptake were largely compiled ones like Swift, Rust, Kotlin, and Dart.
Like yeah, you can make an interpreted dynamic language that's pretty neat, but you can also make something like Go, Swift, or Julia that JITs and also captures 90% of that ease of use while significantly reducing your hosting costs/energy consumption.
Going forward I think static compiler inference will be the future of language design, either for safety as in Rust or for convenience as in Swift.
And we can already see pitfalls in the interpreted world of python where the wrong implementation, like using a loop instead of numpy, can lead to devastating performance impacts. Looking forward, something like this seems as outdated as having to manage your 640k of executable space in DOS: an unreasonable design constraint caused by the legacy implications of the day.
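To make that concrete, here is a minimal sketch of the loop-versus-vectorized gap (the array size is arbitrary and exact timings vary by machine):

    import time
    import numpy as np

    xs = np.arange(10_000_000, dtype=np.float64)

    # Pure-Python loop: every element is boxed and dispatched dynamically.
    t0 = time.perf_counter()
    total = 0.0
    for x in xs:
        total += x
    loop_secs = time.perf_counter() - t0

    # Vectorized numpy: one call, the loop runs in compiled C.
    t0 = time.perf_counter()
    total_np = xs.sum()
    numpy_secs = time.perf_counter() - t0

    print(f"loop: {loop_secs:.2f}s, numpy: {numpy_secs:.4f}s")
    # Typically a 100x-plus difference for the same logical operation.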
My prediction is 10 years from now we'll look at interpreters and language VMs as relics from a simpler time when clockrates were always increasing and energy was cheap.
As far as the carbon footprint, well, yeah, it depends. At Google I remember a friend talking about how he cringed whenever he added more code to the pipeline that ingests the entire internet. He said that he wondered how much extra carbon was released into the atmosphere just because of his stupid code.
As for numpy, we are seeing that loops are stupid and Iverson and APL were right. :-)
I wish companies were under more financial pressure to actually track and mitigate this. (hint hint, carbon taxes)
Funnily enough someone said that exact same thing 10 years ago in that thread:
>* Functional programming / dynamic languages will go out of fashion. People still using them will be judged as incompetent programmers by the people who moved on to the new fashionable programming paradigm(s). At the same time, huge corporations will embrace functional programming / dynamic languages and third world universities will start focusing on them in their courses.
Wrong then wrong now.
While pithy, that's not actually an interesting rebuttal.
You should elaborate, especially since "huge corporations", namely Apple, Google, and Mozilla (I guess?), are the ones pushing Swift, Go, and Rust, respectively.
So what happens is that people gravitate towards languages which are pleasant enough to work on to try new ideas in.
The progression of bash -> awk -> perl -> python happened for a reason. Hell, we're even seeing people use Lisp-like languages unironically for the first time in decades.
More powerful languages enable talented developers to be more productive individually, but it's hard to teach all the other developers about the new and interesting abstractions that the powerful languages enable, and it's impossible to hire for them directly. This limits velocity.
And it's not just huge corporations, it's any company which is trying to scale from 10 to 100 developers; where you're hiring continuously, where there are too many other developers to efficiently rely on ambient learning.
Cloud spot pricing is, I think, one example where things that can be batched and deferred will be cheap even in an energy-decline future.
This is assuming we don’t sink so far as to lose our productive capacity for energy infrastructure altogether. Depends how bull/bear you are about the whole thing I suppose. In that scenario, there’s also gonna be no market to sell whatever you’re coding to.
In overall total effort needed to make something robust, something like rust will beat something like ruby, because the dynamic language-ists compensate via a larger test suite.
JS really shows what's possible in this field, with some benchmarks even outperforming C++. This space of expressive yet fast languages is a gap that remains to be filled and will probably be filled by 2030. Sadly it's a bit of a chicken-and-egg situation, because you don't only need a good syntax to become popular; you also need a healthy ecosystem around it. What we will see is probably not any new language, but rather existing languages from both sides of the spectrum converging more towards the middle ground.
I doubt Swift replaced much besides other compiled languages, and Kotlin just compiles to JVM bytecode anyway. Dart's VM idea was dropped, so its small usage is largely compiling to JS still.
I would say that the overall idea of performance being traded for programmer time is definitely happening despite the emergence of Rust.
Are you sure about this? I was under the impression that today's Flutter development is highly dependent on the niceties the Dart VM provides, and newer Dart releases improved upon them.
I personally don't think Swift will ever escape the Mac ecosystem, just like Objective-C never did, but something with the same DNA will.
In the same way that Objective-C and Ruby both implement the philosophy of Smalltalk, I think that philosophy has yet to be fleshed out in a simple syntax that is natively jit'ed/compiled.
The same goes for julia as an answer to R/pandas.
In some sense Rust is in that same vein - reducing certain errors and the need for tools like Coverity, which in turn makes programmers more efficient.
ryacko is perfectly clearly stating that Electron reduces the amount of training that developers need. This is different from making them develop faster. It just makes them more replaceable.
I have no idea how to parse twobat's question. The "or" is especially confusing. I'm not surprised that ryacko is baffled by it.
Hence the training for "what" or for "who" not being immediately parseable.
Until @ryacko said "Whom do you think I'm talking about?", I wasn't sure either.
I still think asking "what or who?" with no other elaboration is really confusing. It's such a vague question that you have to guess how to answer, and it's super easy to answer it in a way that doesn't satisfy what the asker actually meant to ask.
We're adding more and more cores, but more cores don't help you write a program in easy mode. Easy mode doesn't multithread.
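A minimal sketch of what that looks like in practice: in CPython, the GIL keeps threads from parallelizing CPU-bound work at all (workload size and thread count here are arbitrary):

    import time
    from threading import Thread

    def count_down(n):
        # CPU-bound busy work; a CPython thread holds the GIL while running it.
        while n > 0:
            n -= 1

    N = 20_000_000

    t0 = time.perf_counter()
    count_down(N)
    print(f"one thread:  {time.perf_counter() - t0:.2f}s")

    t0 = time.perf_counter()
    threads = [Thread(target=count_down, args=(N // 2,)) for _ in range(2)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    # The same total work split across two threads runs no faster (often
    # slower), because the GIL serializes bytecode execution.
    print(f"two threads: {time.perf_counter() - t0:.2f}s")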
I think the prediction still had a kernel of truth to it though - every edge that chip manufacturers and JIT developers can give us has been burned away on slower and slower client-side rendering frameworks.
I was always curious what happened to HipHop. Didn't it become a part of Facebook's custom flavor of PHP? (Forgot its name)
I'd wager there are lots of people like me - writing small tools at work in Rust, but the employer would never hire for Rust.
I think we'll see broader market adoption in the next 2-5yrs though, in terms of people actually hiring Rust developers. There's still a little maturing to do in a couple areas to clean up some papercuts.
Disclaimer: I really like Rust and am a little biased.
The not-hiring-but-using bit is based on not seeing Rust jobs despite companies actively using Rust, and many anecdotes of "I wasn't hired for Rust, but we use it" over on Reddit's r/rust when people ask how to get a Rust job.
I mean, that's not entirely wrong. Besides JS, Python is arguably the most dominant language across all domains. Python is also faster than Ruby, though, and 3.x added static-like features to the language. It doesn't seem like there's much appetite for languages any slower than what we have now.
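The "static-like features" are presumably the type annotations Python 3 grew (PEP 484 and later); a minimal sketch with invented names:

    from typing import Optional

    # Annotations are ignored at runtime but checkable ahead of time.
    def find_user(user_id: int) -> Optional[str]:
        users: dict[int, str] = {1: "alice", 2: "bob"}
        return users.get(user_id)

    assert find_user(1) == "alice"
    assert find_user(99) is None
    # A checker like mypy flags find_user("1") before the code ever runs:
    #   error: Argument 1 to "find_user" has incompatible type "str"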
I think the perception is there because of python’s scientific and ML libraries that have a lot of c bindings, perhaps?
Both are drastically faster than they were a decade ago, so 2010-era python or ruby are still the high-water mark.
binarytrees-5 220.557 elapsed secs
That isn't something which can be checked, just stuff someone said.
If I look at a composite benchmark like debian's and don't see one leading by an order of magnitude, I file that away as "probably about the same speed, depends on your use case"
and golang (Go)
I think this basically describes AI.
It replaces trust with massive resource consumption and doesn't do a good job of being digital money (it is hard to use safely, it is hard to use anonymously).
What is groundbreaking is the concept of P2P value exchange. And that's what bitcoin really is, still a young PoC that will surely evolve or get superseded by a 2.0 tech.
The genie is out of the bottle....
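For anyone curious what "replacing trust with resource consumption" means mechanically, here is a toy proof-of-work loop; this is a sketch of the idea only, not Bitcoin's actual block format or difficulty scheme:

    import hashlib

    def mine(block: bytes, difficulty: int) -> int:
        # Brute-force a nonce whose SHA-256 hash starts with
        # `difficulty` zero hex digits.
        nonce = 0
        while True:
            digest = hashlib.sha256(block + str(nonce).encode()).hexdigest()
            if digest.startswith("0" * difficulty):
                return nonce  # cheap for others to verify, expensive to find
            nonce += 1

    # Each extra digit multiplies the expected work by 16; real mining keeps
    # raising the difficulty until the network's whole hash power is consumed.
    print(mine(b"example block", 5))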
> "various 'experts' on it will spring from nowhere to lecture us all about it"
The fall and return of the cat picture still amuses me. In the early days of Web adoption, when people would put up a personal home page just for the thrill of having one, stereotypically the only bits of content they would be able to come up with were a list of favourite bands and one or more scanned-in pictures of their cat. So for years and years afterwards cat pictures were a synecdoche, a punchline for jokes about the naïvety of the early Web. Then phone cameras and low-effort social media became big, and ...
Turns out the “killer enterprise web application” was YouTube :)
> if there is a way to work Elon Musk in there somewhere that would be good, I have a feeling he's going to make some big waves in the next 10 years but I haven't a clue how.
"not [electronically] tracking your kids will be considered somewhat negligent"
i dunno, seems negligent to me
The only language I've felt that way about was Prolog. When it works right, it's amazing. Too bad I can rarely make that happen.
Chrome and V8 were released in 2008, and Node.js (which leverages V8) in 2009. V8 was already considered fast back then. Mozilla and Apple also focused on optimizing their JS stack in the following years.
 I'm not a fan of linking to benchmarks, but that's the best I can do: https://benchmarksgame-team.pages.debian.net/benchmarksgame/... / https://benchmarksgame-team.pages.debian.net/benchmarksgame/...
As for Adobe, they never did much for JS; they tried to get people to use ActionScript instead, which was a statically-typed language with syntactic similarities to JS.
Chrome + V8 was already behind in real world performance, not ahead!
Macromedia (later Adobe) funded a whole bunch of JIT research as the Tamarin Tracing project, which was shared with Firefox as TraceMonkey. The NanoJIT assembly backend was actually shared code.
I don't want to piss too much on SunSpider, but unsurprisingly V8 did better on V8’s benchmarks, while SFX and TM did better on SunSpider, once they were eventually released. That's because V8 was written to do well on its benchmarks, presumably, not because the benchmarks were rigged. Certainly they were in accordance with my experience at the time.
When Chrome was released, V8 was way ahead of the other browser JS engines at JS performance, partly because it had the first JS JIT. But the other browsers took a year or two to catch up, which they had time to do because it also took websites a few years to move most of their functionality into browser JS.
I don't know why you keep emphasizing this “real-world” thing. Are you saying you think nbody and fannkuch are especially realistic benchmarks?
You're right though that Tamarin was a JS engine, not just an AS engine. We regret the error.
nbody and fannkuch are meaningless for real-world performance in a client-side JS app. V8's benchmarks were widely considered to be unrepresentative of what browsers actually do. SunSpider and derivatives were the most realistic tests at the time, and Chrome wasn't a dramatic improvement.
Chrome’s marketing was incredibly successful in getting more people to use a browser better than IE but it didn’t have a dramatic improvement over FF or Safari.
That's true relative to the work they had done on, for example, Gopher support, but it's not true relative to the work they did after Chrome was released. As I said, JS performance from 1995 to 2008 hadn't increased by even a factor of 4; then Chrome was released in 2008 and it immediately increased by a factor of 4, more than it had increased in the entire previous 13-year history of JS. It's true that there existed other optimization efforts. But they weren't successful, probably because not nearly enough effort was devoted to them. Tamarin Tracing/TraceMonkey was eventually discarded and is not part of SpiderMonkey today, although a different JIT strategy is. (LuaJIT uses the tracing-JIT strategy very successfully, though.)
> nbody and fannkuch are meaningless for real-world performance in a client-side JS app. V8's benchmarks were widely considered to be unrepresentative of what browsers actually do. SunSpider and derivatives were the most realistic tests
I mentioned nbody and fannkuch because they are in SunSpider, so it seems that you are contradicting yourself in addition to, as demonstrated previously, mixing up the historical sequence of how things happened.
I had just written a parser generator and an experimental functional language that compiled to JS when Chrome came out, and the performance improvements I saw were in line with the Chrome benchmarks. My experience is not part of Google's marketing.
I’m working on a tracing JIT for Ruby at the moment partially inspired by LuaJIT. Fingers crossed it’ll be published research this year!
Right, but the SunSpider benchmark also tested strings, regex, and date manipulation. V8's original benchmarks didn't. Performance on stuff like a parser generator was way up, but it didn't run jQuery-based Ajax sites any faster.
This should read, "before Chrome was released on 2008-09-02".
100x Ruby 1.9 #2 34 min
But, nobody is paying attention to it. Some of that is because of the Perl 6 baggage, but just as much is probably because this past decade was so heavily entrenched in proliferation of the web-cloud-mobile paradigm that new scripting systems weren't part of the hype cycle. If it didn't get the backing of the FAANGs, it didn't register.
I can imagine a day coming where scripting shines again, though. It might actually be closer than we think. There is always a need for glue code, and glue code benefits from being a kitchen sink of built-in functionality.
It's not good enough to build a slightly better language any more. People won't learn a new language without their favourite packages (or equivalents).
Also the community migration to Python 3 is done now. Yes there are massive P2 code bases out there still, but for new projects P3 has been the clear choice for a long time now. It’s over.
I sense a strong convergence between all of them (builtin DS, linguistic traits, bits of static typing)
It's rather interesting to see how the web went from IE dominating, to Chrome doing essentially the same. Looks like predicting such a growth was farfetched.
Funny to see that the prediction of IE sticking around was mostly right. I long for the day that IE11 will mostly be gone.
The latter half of this one didn't come to be. Rule of law is weaker and civil liberties have declined.
“China Must Never Adopt Constitutionalism, Separation of Powers, or Judicial Independence“ - Xi Jinping
My favorite is the joke that Zuck will buy Portugal.
As in the saying. Hindsight is 20/20.
Nadella’s excellent pivoting of MS into a cloud and services focused company saved Microsoft from a stagnant or declining state.
I don't think this happened?
I can totally picture new languages as described doing well today if we had exponentially more single threaded processing power/watt at our disposal than we actually do.
I think there is still opportunity for languages that offer huge leaps in expressiveness w.r.t handling concurrency even at the cost of raw single threaded performance. Though I would not label a language like that as "slower" as it'd allow us to actually make much better use of our computing resources than we reasonably could today without blowing through most of our complexity budget, resulting in faster programs in practice.
I feel like that one simply missed the mark. If anything, there was greater emphasis towards languages that were more performant, even if less productive for banging out a working product.
I rewrote it using XML + XSLT and it was instant. But it was truly gnarly code.
Below is how I calculated 6 out of 17 predictions correct
> Facebook will not be displaced by another social network. It will IPO some time in the next two years.
> Twitter will become profitable, but not as much as some expect. It will be less profitable than Facebook, and may sell to another company.
Correct (profitable since last year)
> Microsoft will .. have shrunk and may have evolved into a consultancy company on the lines of IBM
> Internet Explorer will shrink, but won't go away
Correct? (debatable, since software never completely goes away, but MS is no longer developing it and Edge doesn't use the same rendering engine)
> Chrome OS or a similar operating system that relies on web access may grow extremely slowly at first, before rapidly gaining share amongst certain market segments. It will be most successful in places like cities that grant free municipal wifi access.
> Mobile phones won't replace computers, but increasing penetration amongst the poorest in developing countries, and increasingly capable handsets in developed countries (and developing countries) will make them a colossal juggernaut. Many of the really big changes, especially social changes, will be caused by mobiles.
This was already true in 2010, so it doesn't even count.
> For any definition of 'success', there will be more tech startups reaching that level in the 2010s than in the 2000s. For example, there will be more than four startups of Youtube/Facebook/Twitter/Zynga proportions.
> In addition, at least one of the 'big' startups of the second half of the decade will have been possible with 2009 technology. By this I mean that people will still be discovering new potential for browser-based web applications built with current client-side technologies, which will remain ubiquitous, although new alternatives will appear.
Correct, but the prediction that there will be at least one new web-based startup is not very interesting.
> It will be an even better time to start a startup in 2020 than it is now. One of the key drivers of ease-of-starting-up-ness will not be new technology, but new platforms - like Facebook and viral marketing, but better; or that solve other problems like micropayments, customer development, retention, and so on.
> Hence, starting up will become a more attractive career option, though well-meaning family will still say "at least finish your degree first".
> As Moore's Law marches on, dynamic languages that are even slower than Ruby are likely to catch on. They may be to Ruby what Ruby is to Java, trading even more programmer time for CPU time.
> Having said that, Moore's law will at least hiccup and may stop altogether in the middle of the decade, as semiconductor feature widths drop below 11nm. Since this will likely encourage investment in quantum computing and nanotechnology, by 2020 we might be seeing something faster than Moore's Law.
Wrong, IMO (transistors are still on the curve, but the performance impact of adding more transistors doesn't matter the same way it used to; but he cited Moore's Law specifically, so he's wrong).
> An international deal, of the kind that was aimed for at Copenhagen, will be reached over the next five years, though it might not be far-reaching enough to limit warming to 2 degrees in the long-term. (Despite the failure of the Copenhagen talks, it appears that world leaders almost universally recognize the need to take action over man-made climate change, though the various political problems will remain hard problems). China may not be part of such a deal, though the US likely will. Environmental disasters will begin to increase through the decade, as will disasters that are probably not caused by anthropogenic global warming but will be blamed by it anyway; this will provoke more of a push for action.
> Increasing fuel prices, and green taxes or incentives, will mean large shops will begin to be replaced by warehouses, as traditional retail gives way to home delivery.
Wrong, but only because he cited a reason. Otherwise, the prediction that brick-and-mortar stores will continue to be replaced would have been correct.
> China will not become a democracy, or even make moves in that direction. However the rule of law will strengthen, and some civil liberties will increase. Internet crackdowns will continue, and may increase in severity, and will still be rationalized by porn.
I want to say the civil liberties situation in China is the same, but I don't know enough about China to comment on it.
> Despite multiple new fads that purport to make software development ten times faster and error-free, it will remain a hard problem.
Correct, but trivial. Of course writing better software is hard.
> You still won't be able to talk to your fridge, and gesture-based HCI will remain a fun gimmick.
I'm counting this as wrong because of Alexa.
> Virtual worlds like Second Life will remain niche, but World of Warcraft will pass 20 million users and a Facebook game or similar will pass 200 million users.
> The next big thing will be something totally unknown and unpredictable now, as user-generated content and social networking were in 1999. However, when it does appear, various 'experts' on it will spring from nowhere to lecture us all about it. It will still be really cool, though.
Wrong, but maybe I'm forgetting about some big tech thing in the 2010s. It seems that the biggest disrupters were a modernization of cab companies and hotel bookings. We may be on the dawn of several actually unpredicted new things, like driverless cars and the use of neural networks for some kind of cool new applications, but there's been no killer app. These are hopefuls on the horizon, not profitable products and industries.
I don't know about that. ChromeOS did grow slowly, and has gained a significant share of the entire US education space (K-12). Yeah, it's a niche market. But I think that fits the "certain market segments" qualifier.
> (Virtual worlds will remain niche, but WoW will pass 20 million users, a Facebook-like game or similar will pass 200 million users) Wrong.
This is technically wrong, but in-spirit correct. Virtual worlds / MMO games did get the traction claimed, just not WoW specifically. (The MMORPG FF14 Online has ~20 million users, MMO Warframe has 50 million registered users, 'facebook-game' Farmville has 73 million users, and 'or similar game' Fortnite has 250 million registered users). Minecraft holds similar numbers.
And while WoW itself never quite hit those numbers, a different game from the same studio did. Blizzard's Hearthstone has over 100 million registered users today.
Facebook was started as a silly php script. Now you “need” half of npm, react, backend, devops etc to get started.
Anyone who starts off with anything more than that (or Django or your favorite language’s equivalent) is just wasting time and effort trying to be trendy. Over-engineering a startup only serves to keep you from testing market fit, which is only a good plan if you’re trying to milk more investment money because you know you’re going to fail as soon as you launch.
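For what it's worth, the "silly script" starting point is still available; a minimal sketch using only the Python standard library (the port and message are invented):

    from http.server import BaseHTTPRequestHandler, HTTPServer

    class App(BaseHTTPRequestHandler):
        def do_GET(self):
            # One handler, no framework: enough to start testing market fit.
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(b"<h1>Hello, first users</h1>")

    HTTPServer(("", 8000), App).serve_forever()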
Says who? You can literally spin up nearly any server computing infrastructure imaginable in a matter of seconds. Just because software is getting bloated under the covers doesn't mean it's any harder.
> This was already true in 2010, so it doesn't even count.
This definitely wasn't true in 2010. On June 1, 2010, Steve Jobs proclaimed that the post-PC era had arrived, and was promptly ridiculed for it by the industry and media. Massive social changes caused by mobile phones have only occurred this decade, as the gig economy has exploded and places like India have 10x'd the number of citizens with internet access.
>> For any definition of 'success', there will be more tech startups reaching that level in the 2010s than in the 2000s. For example, there will be more than four startups of Youtube/Facebook/Twitter/Zynga proportions.
Huh? Of those 4, Facebook in 2009 was the most valuable at $10 billion. Now there are 20+ private unicorns with that valuation, as well as dozens more that have IPOd in the last few years. How is that prediction wrong in any way, shape, or form?
>> Hence, starting up will become a more attractive career option, though well-meaning family will still say "at least finish your degree first".
>> An international deal, of the kind that was aimed for at Copenhagen, will be reached over the next five years, though it might not be far-reaching enough to limit warming to 2 degrees in the long-term. (Despite the failure of the Copenhagen talks, it appears that world leaders almost universally recognize the need to take action over man-made climate change, though the various political problems will remain hard problems). China may not be part of such a deal, though the US likely will. Environmental disasters will begin to increase through the decade, as will disasters that are probably not caused by anthropogenic global warming but will be blamed by it anyway; this will provoke more of a push for action.
Again, this is correct. The US did enter the Paris Agreement, and has not actually formally withdrawn yet as it is not legally eligible to do so until November 2020. "Environmental disasters will begin to increase through the decade, as will disasters that are probably not caused by anthropogenic global warming but will be blamed by it anyway; this will provoke more of a push for action." is an especially cogent prediction.
>> China will not become a democracy, or even make moves in that direction. However the rule of law will strengthen, and some civil liberties will increase. Internet crackdowns will continue, and may increase in severity, and will still be rationalized by porn.
> I want to say the civil liberties situation in China is the same, but I don't know enough about China to comment on it.
Civil liberties have likely gotten worse, but the rule of law has indeed strengthened. Internet crackdowns being rationalized by porn is also correct, as seen in the UK and elsewhere.
As far as ChromeOS and video games go, maxsilver covered those pretty well already.
This is wrong too.
Curious if the "wisdom of crowds" works with these future predictions or if only a few were on target.
And yet, people regularly think "a decade is a long time, surely this trend will be finished by then". Recognizing that existing trends are still likely valid is worthwhile.
- Apple and Google are still the only two mobile operating systems that matter and they are still in relatively the same position. iOS still controls the high end where the money is and Android has the market share but the OEMs are not making any money.
- Facebook is more profitable and popular.
- Amazon is still the number one online retailer, the Kindle is still by far the most popular ereader but more importantly, the Kindle platform is still dominant.
- Google still hasn’t managed to diversify from its ad business and YouTube is still dominant for video.
- Netflix is still the dominant streaming platform.
- Microsoft is doing better than ever.
Facebook didn’t exist.
Apple was “beleaguered”.
Google was in its infancy and Yahoo was still dominant (before the crash)
Microsoft was seen as unassailable.
Amazon was mostly just an interesting book seller. They were just getting into CDs and hadn’t even made the deal with Toys R Us yet.
Netflix was struggling against BlockBuster renting DVDs.
AOL was still dominant.
The entire app ecosystem was in its earliest infancy in 2010. AWS was in its infancy. Slack wasn't released.
The app ecosystem was growing like crazy in 2010. It was a year and a half into “there is an app for that”. Facebook had already pivoted to being big in mobile.
The same can be said about Uber. Once the companies have a profitable and sustainable business model that doesn’t involve burning cash then they can be considered a major player.
By 2010, streaming services were big, Apple was selling movies and TV shows digitally and had already started removing DVD drives from its computers.
Apple has made a lot more in mobile from Google than Google has made from Android.
For a while, MS was making more in patent fees from Android OEMs than Google was making in licensing and advertising.
Losing mobile was the best thing that could happen to MS, so it could focus on Azure and making Office ubiquitous - where the real money is.
That is not the way to think about it. Imagine if Android didn't exist or had faltered like Symbian, RIM, Windows Mobile, etc. Google would have been paying way more than 8 billion. Android is definitely worth more than the money accounted towards it.
But the best I could find in Europe is.
Personally, I have quite a few paid apps, but the majority I bought over the years and keep using them. The top grossing apps are all subscriptions and the only one that I have with one is OSMAnd (Open Street Maps app) which includes a donation to OSM/contributors.
You're badly misremembering or recontextualizing with hindsight bias. At the start of 2010, iOS had 32%, Android had 4.5%, and there were at least 5 other mobile OSs with >4% market share. Start of 2020 and we're looking at Android with 74%, iOS with 24%, and zero other OSs with >1% market share. These are very different situations!
I don’t think this was a sure thing in 2009. Symbian still had half the market. BB had a larger share than either iOS or Android. WebOS was cool.
Netflix didn't start streaming until 2011. So it wasn't a streaming platform at all in 2010.
By 2009, they had 12,000 movies up for streaming, and Netflix-compatible devices were advertised heavily in stores, with Best Buy including Netflix apps on their store-brand devices.