Ask HN: A New Decade. Any Predictions? (2010) (news.ycombinator.com)
908 points by tcharlton on Jan 2, 2020 | 437 comments



All: this thread is about predictions that HN users made 10 years ago. Note the year in the title above.

If you want to make a new prediction, a new thread has been started: https://news.ycombinator.com/item?id=21941278.


The most striking thing about the thread is how little of current importance was even mentioned:

- Bitcoin would become the decade's best investment by far

- the President of the US would conduct foreign policy through Twitter

- electric scooters would become a billion dollar business (Bird, Lime, etc.)

- the sharing economy would threaten the taxi and hotel industries (AirBNB and Uber)

- escalation of school shootings

- the explosive growth of quantitative easing

- negative nominal interest rates on sovereign debt

- revelations of mass surveillance by the US government

- streaming video services would begin to produce content rivaling major studios in quality

- Amazon would become the everything company

- DNA testing would become a major recreational activity with numerous judicial and social implications

- the US would approach energy independence, driven largely by a boom in oil extraction technologies

Also interesting to consider how all of these ideas would have seemed more or less ludicrous in 2010.


Regarding escalation of school shootings, it seems plausible, but only because (here is an unpopular opinion that will get 20 downvotes) the rich want to disarm the poor and use school/kids/safety as an excuse. And that's understandable: if I were rich enough to afford private schools for my kids in a classy neighborhood, I'd want to keep weirdos with guns far from my family, while at the same time I'd have 24/7 former-marine guards armed with full-auto weapons.


> escalation of school shootings

Bubble thought. School shootings are down from previous highs and are actually very rare [0].

COVERAGE and willingness to use any incident to promote a very specific narrative are up.

[0] https://news.northeastern.edu/2018/02/26/schools-are-still-o...


Your source doesn't say school shootings are down from previous highs nor that they're very rare; it specifically only deals with shootings where _4 or more students were killed_ excluding the shooter. That's a very different thing from "school shootings", where the mean number of fatalities is 3. If you define "school shooting" as "people being shot at school", then use the same data as in your source, you see 247 people shot in 64 incidents during the 2000s, and 555 people shot in 213 incidents during the 2010s. (It's worth noting that Fridel & Fox came under some scrutiny for choosing their definition of school shooting, because the standard definition used--by the FBI, the Bureau of Justice Statistics, and by Fox himself in previous studies that conflicted with his new one--is four people _shot_.)

What we're seeing is a large increase in the number of school shootings, but a decrease in the average success of each school shooter. Is it surprising that the average shooter has become less effective when schools are spending a lot of time and money preparing to deal with them? Shooter drills, established lockdown procedures, buildings being designed with escape and hiding in mind[0], staff training, employment of security guards, etc.

The data you're citing states it plainly: there are more shootings at schools now than 10 or 20 years ago, and more people are being shot at school than 10 or 20 years ago. 50 people killing 3 kids each isn't somehow safer than 25 people killing 4 kids each.

[0] https://www.washingtonpost.com/education/2019/08/22/new-high...


Hang on. If you're counting things where only one or two people are shot, there is very often no intent to shoot more than that. You can't use that data point by itself to talk about the "success" of school shooters. And those smaller incidents are definitely not what I think of when I hear the words "school shooting".

> 50 people killing 3 kids each isn't somehow safer than 25 people killing 4 kids each.

But you have to treat people killing 2 kids in a completely different way from people that are killing 20. Drills, lockdown procedures, almost all gun limitations... those don't do anything against a single murder.


"No Way To Prevent This", Says Only Nation Where This Regularly Happens [0]

[0] https://en.wikipedia.org/wiki/%27No_Way_To_Prevent_This,%27_...


On that thought, is there any way to prevent shootings without making guns harder or impossible for citizens to own?

Seems many will not disarm in the US, so should we be exploring other options rather than repeatedly saying the US should be more like other countries?


Is it so bad to make guns as hard to own as a car? Simple licensing would go a long way to reduce the risk of guns.

https://www.vox.com/2019/9/11/20861019/gun-solution-backgrou...


Last I checked, cars do not require any licensing to own. They require special licensing and often times yearly taxes to operate on public roads (but merely possessing them on public roads does not).


You need to register the car, and that often requires proof of insurance, which requires a driver's license. Either way, my point still stands. You're quibbling over semantics. So few people want to buy a car just to not drive it, that we don't even consider it a problem.


> You're quibbling over semantics.

Not at all. He’s accurately reflecting gun laws and car laws. You are willfully hand-waving the fact that cars are entirely unregulated except for use on public roads, whereas guns are regulated in every facet of their existence.

You are doing this because you read a liberal site and heard the liberal talking point that guns are easier to get than a car, which is wrong.

It’s not semantic, your argument is invalid.


No, you don’t need to register the car if you don’t intend to drive it on public roads. See farms.


> and often times yearly taxes to operate on public roads (but merely possessing them on public roads does not).

In my jurisdiction one may not "move" or "leave standing" upon a public road an unregistered vehicle, in addition to not being permitted to "drive" it.


Wow, you can't tow an unregistered vehicle? That definitely sounds off, but then I've seen even more silly laws in the past (such as one that forbids minors from playing pinball).


You can tow it if its wheels do not touch the road; otherwise the owner is liable for it being on the road without registration.

The state wants its registration dollars.


> Is it so bad to make guns as hard to own as a car?

The ONLY people that say this are the ones that would be (fucking) horrified if we treated guns like cars.

Ok. You are proposing I should be able to buy any gun I want whenever I want regardless of size or capacity, automatic, or loud or silenced for cash without any checks or regulations at all? That the only requirement to use it in public is that I need a license - BUT - that carry license is good in all 50 states?

... that sounds good to me! Let’s do it.

Yours is an argument based entirely in ignorance of gun laws. And maybe car laws too.


I believe he was referring to licensing a car, since the following sentence referred to "simple licensing."

> regardless of size or capacity

Vehicles of different size/capacity do require different qualifications to drive. Certain classes of vehicles require a CDL for their class. See https://driving-tests.org/cdl-classification-licenses/.

You also have to be 21+ in many states to even apply for them.


> Ok. You are proposing I should be able to buy any gun I want whenever I want regardless of size or capacity, automatic, or loud or silenced for cash without any checks or regulations at all?

What? No.


But that's how loosely cars are regulated, and you said you wanted guns regulated like cars. I think what you meant to say is to regulate guns much more strictly than cars.


Well that’s how purchasing a car works, so...

Maybe you’re confusing purchasing a car with licensing a car?


It's not a tradeoff between manufacturing restrictions and licensing restrictions.


They've moved the goalpost from this to "the deaths are statistically insignificant".


Where did they get that idea from? [1]

Amusingly... it was likely their opposition. This is why you should hold yourself and those you align with to high standards. It only strengthens your position... even if you are completely wrong (although only a little bit).

[1] https://ourworldindata.org/terrorism


Despite the veil of Wikipedia, your source is The Onion.

Does The Onion's satirical joke take into account all the places where there are no guns but this still happens?

Paris, France, and Norway have both had worse mass and school shootings than any in the USA ever.


Which school shootings in France and Norway have been worse than any in the USA?


Paris was the mass shooting I was referencing. 184 dead, iirc.

Norway: school summer camp in 2011. I suppose it's not exactly what you think of when you think of a shooting inside a school building, but the people killed were almost exclusively children. The perpetrator used a rifle that was not legal for civilians in Norway.


It wasn't a school summer camp - it was organized by a left-wing political party - and most of the people killed were adults.

He used a Ruger Mini-14 and a Glock pistol for which he had a legal license as a civilian (for hunting and sport shooting respectively), purchased in Norway - the country with the second highest legal gun ownership rate in the EU - after he failed to illegally acquire guns in Prague.


> He used a Ruger Mini-14 and a Glock pistol for which he had a legal license as a civilian

You have a source for that?

I understood that neither of those guns was legal for civilians.

Also... to the topic. If Norway's extremely strict (draconian) gun laws didn't prevent this shooting, which was worse than any in US history, why would the US benefit from them?


> You have a source for that?

Sure, I can do a Google search for you:

https://translate.googleusercontent.com/translate_c?u=https:...

https://translate.googleusercontent.com/translate_c?u=https:...

> Also... to the topic.

Oh, I'm not interested in playing out that tired debate. I just wanted to correct the misinformation.


This one also surprised me. Are school shootings up? Or is the media peddling violent news more aggressively?


There's a weird thing that happens here with language: school shootings mean multiple things. So, if someone is shot at a school at an after-hours drug deal gone awry, that counts just as much as parkland, even though there is a very distinguishable difference between the two.


That’s exactly what the lists of “4375 school shootings this year!” are actually made of.

Someone in New Jersey shot a rat 2 blocks from a school - school shooting.

Drive by between rival gangs near a college - school shooting.

Resource officer discharges his gun in his car - school shooting.


Well, they're comprised of that, and, y'know, children being murdered en masse.


Those should not be put in the same category.


I agree that’s bad.

I think the only thing worse than lying about incidents to improve your statistics, is ignoring reality in arguing for policies and regulations that further instead of solve the problem.


So don’t you think it’s disingenuous to group them together? Or is it just too convenient to pass up for those looking for anti-gun talking points?


I don’t think this should be downvoted. The facts are the facts. It’s also the case that even 1 shooting is a tragedy and should give us pause for how we can fix it (whether that is gun control or something else).


"C'est la vie", say the old folks

It goes to show you never can tell


For those who don't recognize the reference, it's the chorus of a song made famous by Chuck Berry:

https://genius.com/Chuck-berry-you-never-can-tell-lyrics



People tend to make optimistic predictions. A lot of predictions were correct, and many of them were on the right subject, but predicted the positive instead of the negative turn of things. Interesting views regardless, but I can't help but notice that tech progress was nothing like 2000->2010.


I feel like enormous cultural and technological hardware advancements were made between 1990-2010 that affected people on a national or global level. The advances were revolutionary. I have a hard time associating the last decade the same way. To me, the 2010-2020 decade has been stagnant.

My overall sentiments, more or less:

1990->2000 we went from very few people owning personal computers to wider adoption of PCs and broadband was gaining traction (I personally had @Home cable internet in 1998).

2000->2010 mass adoption of cell phones, starting with the flip phones to pocket PCs to the first iPhone in 2007. Social media (MySpace, The Facebook). Peer to peer networking, piracy (The Pirate Bay). Laptops. The first tablets.

2010->2020 mild improvements on the stuff from the previous two decades

EDIT: I'll give the last decade Uber (although I've only used it once) and AirBnb since that was a huge shift as well, but my overall feeling remains the same.


Copying my answer from another thread

I am surprised there's no mention of the smartphone, which is arguably the most important innovation in modern history. Over the 2010s, the smartphone (the iPhone 3GS at the time) went from niche to 4B users (iOS, Android, and KaiOS), that is, nearly every person above age 14 in developed countries. I don't think there has ever been a product or technology innovation as important that spread faster than the smartphone. And it changed everyone's life. The post mentioned the Google, Facebook and Amazon empires, which all partly grew to this point because of smartphones. Technology companies together are now worth close to 10 trillion dollars. The whole manufacturing supply chain exists and became huge in Shenzhen because of the smartphone. It was the reason why TSMC managed to catch up to Intel in both capacity and leading-edge node. It was the reason why we went from 3G to 5G in a mere 10 years, because of all the investment that kept pouring in. It was the reason why everyone went onto the Internet and we had an Internet economy. It brought a handheld PC and the Internet to a much wider audience. I would even argue it was smartphone innovation that saved us from the post-2008 financial crisis doom, as it created so much wealth, innovation and opportunity.


There is a tendency to undervalue recent advancements because they haven't had the same amount of time to demonstrate an impact on the world.

I would point to the 2010's as the rise of two phenomena:

- Social networking and meme culture

- Explosive growth of software

These will prove to be just as important as the developments of the 2000's, but haven't had the time for other developments to build on them.


Electric cars really started gaining traction in the 2010s, with the Leaf, Model S, Zoe, etc.


One more to add: Hong Kong's protests, on a scale the world has never seen.


And the collapse (in a matter of weeks) of Chile as the most prosperous, richest, free/open society in Latin America.


Chile has massive income and wealth inequality. What good is being "rich" in terms of raw GDP if in practice that wealth goes to a small elite?


Apparently quite a lot. Poverty has gone down six-fold in a generation: https://data.worldbank.org/indicator/SI.POV.DDAY?end=2017&lo...


I wonder how much more it would have gone down if Chile had a more equitable economic system.


Hasn't Chile done much better than its Latin American peers? I suppose it would be unfair to mention Venezuela, but how about Brazil?


Chile is much better than Brazil in terms of equality. Brazil is ranked the 8th most unequal (tied with Botswana), whereas Chile holds the 26th position in the GINI index [1].

We can expect Chile to drop positions since its economy is going downhill, creating massive unemployment. Inequality will increase as people from the lower middle class move into poverty, as we've seen in Brazil since 2015.

[1] https://www.indexmundi.com/facts/indicators/SI.POV.GINI/rank...


I think many economists would argue it'd have gone _up_ rather than down due to the counterintuitive nature of capitalism.

In any case, many economists argue over this question :)


Not much. The thing about inequality is that it makes the top more powerful politically. The amount of money held by the top 0.1 percent is irrelevant to moving the income/wealth needle for the rest of the population.

There is a reason those who don’t like inequality had to come up with new measurements to highlight it. There is little impact on the mean, median, and mode if the rich get richer.


How does your list not include climate change as a thing of current importance? It wasn't mentioned much in the original either - though I saw one comment calling it "FUD" - but they were making forecasts ;)


> Hopefully network analysis/data mining laws/politics will be handled with a little less FUD, a little less grandstanding, and a little more efficiency than we're seeing with climate change politics...

They were clearly describing the handling and discourse (I suppose on a large public scale) of climate change as FUD, not climate change itself.


> - electric scooters would become a billion dollar business (Bird, Lime, etc.)

Valued at a billion dollars by venture capitalists. I'd be genuinely surprised if they are anywhere near a $B in revenue


Now just to predict what will be the big thing in 2030 so I can get rich :)


Honk-honk!


IsaacL nailed it the best

https://news.ycombinator.com/item?id=1027093

And surprisingly the WoW one is the most off :D


> - As Moore's Law marches on, dynamic languages that are even slower than Ruby are likely to catch on. They may be to Ruby what Ruby is to Java, trading even more programmer time for CPU time.

Interesting how plausible this one is, yet turned out to be terribly wrong: the newer hyped languages that got some uptake were largely compiled ones like Swift, Rust, Kotlin, and Dart.


One thing that always really bugged me about the "the language will figure it out" philosophy is the complete disregard towards carbon footprint.

Like yeah, you can make an interpreted dynamic language that's pretty neat, but you can also make something like go, swift, or Julia that jits and also captures 90% of that ease of use while significantly reducing your hosting costs/energy consumption.

Going forward I think static compiler inference will be the future of language design. Either for safety like in rust or for convenience like in swift.

And we can already see pitfalls in the interpreted world of python where the wrong implementation, like using a loop instead of numpy, can lead to devastating performance impacts. Looking forward, something like this seems as outdated as having to manage your 640k of executable space in DOS: an unreasonable design constraint caused by the legacy implications of the day.
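To make the loop-vs-numpy gap concrete, here is a minimal sketch (my own illustration, not from the comment above); the exact numbers depend on the machine, but the difference is typically one to two orders of magnitude:

    # Summing squares with an interpreted loop vs. a single vectorized numpy call.
    # The loop pays per-element interpreter overhead; np.dot is one call into
    # optimized C/BLAS. Timings are illustrative only and vary by machine.
    import time
    import numpy as np

    xs = np.random.rand(5_000_000)

    t0 = time.perf_counter()
    total = 0.0
    for x in xs:          # one bytecode dispatch (and float boxing) per element
        total += x * x
    loop_secs = time.perf_counter() - t0

    t0 = time.perf_counter()
    total_vec = np.dot(xs, xs)   # the entire reduction happens in compiled code
    vec_secs = time.perf_counter() - t0

    print(f"loop: {loop_secs:.2f}s  numpy: {vec_secs:.4f}s  ratio: {loop_secs / vec_secs:.0f}x")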

My prediction is 10 years from now we'll look at interpreters and language VMs as relics from a simpler time when clockrates were always increasing and energy was cheap.


I'm actually pretty amazed at what the Javascript VM people have done in the past decade. It's way more than I ever expected.

As far as the carbon footprint, well, yeah, it depends. At Google I remember a friend talking about how he cringed whenever he added more code to the pipeline that ingests the entire internet. He said that he wondered how much extra carbon was released into the atmosphere just because of his stupid code.

As for numpy, we are seeing that loops are stupid and Iverson and APL were right. :-)


> At Google I remember a friend talking about how he cringed whenever he added more code to the pipeline that ingests the entire internet. He said that he wondered how much extra carbon was released into the atmosphere just because of his stupid code.

I wish companies were under more financial pressure to actually track and mitigate this. (hint hint, carbon taxes)


If you run your data centers where electricity is cheap, you're using electricity but not emitting very much carbon dioxide.


>My prediction is 10 years from now we'll look at interpreters and language VMs as relics from a simpler time when clockrates were always increasing and energy was cheap.

Funnily enough someone said that exact same thing 10 years ago in that thread:

>* Functional programming / dynamic languages will go out of fashion. People still using them will be judged as incompetent programmers by the people who moved on to the new fashionable programming paradigm(s). At the same time, huge corporations will embrace functional programming / dynamic languages and third world universities will start focusing on them in their courses.

Wrong then wrong now.


> Wrong then wrong now.

While pithy, that's not actually an interesting rebuttal.

You should elaborate, especially since "huge corporations", namely Apple, Google, and Mozilla (I guess?), are the ones pushing Swift, Go, and Rust, respectively.


Huge corporations always push for languages which are brain dead and make developers be as fungible as possible. Unfortunately not all software has been invented yet and writing in such a language is a nightmare.

So what happens is that people gravitate towards languages which are pleasant enough to work on to try new ideas in.

The progression of bash -> awk -> perl -> python happened for a reason. Hell, we're even seeing people use Lisp-like languages unironically for the first time in decades.


Enabling developers of different talents and backgrounds to be productive and contribute successfully, and to scale up in number of separate teams, is a hard problem and doesn't deserve the kind of dismissiveness you give it here.

More powerful languages enable talented developers to be more productive individually, but it's hard to teach all the other developers about the new and interesting abstractions that the powerful languages enable, and it's impossible to hire for them directly. This limits velocity.

And it's not just huge corporations, it's any company which is trying to scale from 10 to 100 developers; where you're hiring continuously, where there are too many other developers to efficiently rely on ambient learning.


Maybe the progression is actually bash -> awk -> perl -> python -> raku ? One can wish :-)


Not so sure our energy problems are going to come to a head in the form of all-out energy poverty. More likely I think: the convenience of energy being instantly available at all the right times and places will diminish.

Cloud spot pricing is, I think, one example where things that can be batched and deferred will be cheap even in an energy-decline future.

This is assuming we don’t sink so far as to lose our productive capacity for energy infrastructure altogether. Depends how bull/bear you are about the whole thing I suppose. In that scenario, there’s also gonna be no market to sell whatever you’re coding to.


I think the new languages are about developer productivity, but in a different way. It allows you to make more robust software via a better typing system, or more robust multithreading via rust's borrow system. In the past, our machines couldn't handle such strict languages as well because they have a high compile time cost.

In overall total effort needed to make something robust, something like rust will beat something like ruby, because the dynamic language-ists compensate via a larger test suite.


Most setups and most hardware isn’t efficient enough for language to really matter. So the solution to carbon footprint isn’t to stop using python, it’s to stop burning coal.


The sad thing about Python is that it shouldn't need to be interpreted. People don't use it for that reason. Remove one or two obscure metaprogramming features that nobody uses (or shouldn't be using) and you could in principle remove the GIL and make it JIT just as well as JavaScript.
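As one hedged guess at the kind of feature meant here, consider frame introspection: any Python function can reach up and read its caller's local variables, which forces an implementation to keep those locals materialized rather than optimizing them away the way a JIT would like. A small sketch:

    # Illustration only (one possible example of such a metaprogramming feature):
    # sys._getframe lets a callee inspect its caller's locals, so the runtime
    # can't freely keep them in registers or elide them.
    import sys

    def spy():
        caller = sys._getframe(1)      # the caller's stack frame
        return dict(caller.f_locals)   # a snapshot of the caller's locals

    def compute():
        secret = 41
        answer = secret + 1
        return spy(), answer

    print(compute())   # ({'secret': 41, 'answer': 42}, 42)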

JS really shows what's possible in this field, with some benchmarks even outperforming C++. This space of expressive yet fast languages is a gap that remains to be filled and will probably be filled by 2030; sadly it's a bit of a chicken-and-egg situation, because you don't only need good syntax to become popular, but also a healthy ecosystem around it. What we will see is probably not any new language but rather existing languages from both sides of the spectrum converging more towards the middle ground.


Energy is gonna keep on getting cheaper.


I mean, JavaScript and Python have definitely become much more popular for many more things than they were in 2010.

I doubt Swift replaced much besides other compiled languages, and Kotlin just compiles to Java anyway. Dart's VM idea was dropped so its small usage is largely compiling to JS still.

I would say that the overall idea of performance being traded for programmer time is definitely happening despite the emergence of Rust.


> Dart's VM idea was dropped so its small usage is largely compiling to JS still.

Are you sure about this? I was under the impression that today's Flutter development is highly dependent on the niceties the Dart VM provides, and newer Dart releases improved upon them.


In terms of language design, it seems like swift and julia are the most forward looking.

I personally don't think swift will ever escape the Mac ecosystem, just like Objective-C never did, but something with the same DNA will.

In the same way that Objective-C and Ruby both implement the philosophy of Smalltalk, I think that philosophy has yet to be fleshed out in a simple syntax that is natively jit'ed/compiled.

The same goes for julia as an answer to R/pandas.


Those are both typically faster than Ruby, tho. Ruby (especially pre-1.9) probably does represent the mainstream peak for tolerance of slowness.


> I would say that the overall idea of performance being traded for programmer time is definitely happening despite the emergence of Rust.

In some sense Rust is in that same vein - reducing certain errors and the need for tools like Coverity, which in turn makes programmers more efficient.


JS and Python are not to Ruby what Ruby is to Java, though.


In my opinion, Electron is the fulfillment of that prediction.


JavaScript is quite a bit faster than Ruby in most cases.


Electron isn’t just JavaScript, else I would have said JavaScript. Electron is front and center in the movement to shorten development time by consuming more PC resources at runtime. I feel that’s the spirit of the prediction.


I think Electron's advantage is to shorten training time, not development time.


I don't agree. I've implemented GUIs with raw Xlib, Tcl/Tk, Qt, GTK, MFC, SDL, XUL, and DHTML. Of all of these, DHTML is the most productive for me, and by a significant margin. The Tcl/Tk topic in Dercuano goes into those experiences in more detail, if you're interested.


Training for what or for who?


Training for developers, presumably, especially web developers who want to build native apps. If retraining was free, Electron would have approximately no market.


Who? Whom do you think I'm talking about?


Well I as an innocent bystander have no idea what you mean either, unfortunately.


I'm also an innocent bystander!

ryacko is perfectly clearly stating that Electron reduces the amount of training that developers need. This is different from making them develop faster. It just makes them more replaceable.

I have no idea how to parse twobat's question. The "or" is especially confusing. I'm not surprised that ryacko is baffled by it.


Training is an abstract term these days. Could've meant training an algorithm, or training a person, depending on context.

Hence the training for "what" or for "who" not being immediately parseable.

Until @ryacko said "Whom do you think I'm talking about?", I wasn't sure either.


Electron gives you a UI and some standard library stuff. That's pretty obviously a totally separate issue from training an algorithm... I think? Am I missing something?

I still think asking "what or who?" with no other elaboration is really confusing. It's such a vague question that you have to guess how to answer, and it's super easy to answer it in a way that doesn't satisfy what the asker actually meant to ask.


Because it's not Moore's Law that matters here, but rather the subset that is the performance of single cores. And that subset stalled out pretty badly.

We're adding more and more cores, but more cores don't help you write a program in easy mode. Easy mode doesn't multithread.


It's also been quite a decade of growth for old stodgy languages. In 2010, it seemed like C++, Java, and Javascript were never going to change, but they've all significantly cleaned up their acts.

I think the prediction still had a kernel of truth to it though - every edge that chip manufacturers and JIT developers can give us has been burned away on slower and slower client-side rendering frameworks.


And PHP. PHP 5.3 was the newest version in 2010, and as the years wore on many thought HipHop by Facebook would start to take over for serious PHP applications. PHP 7 completely changed the course of the PHP community and took the wind out of HHVM's sails.


Did PHP 7 introduce optimizations similar to HipHop?

I was always curious what happened to HipHop. Didn't it become a part of Facebook's custom flavor of PHP? (Forgot its name)


HipHop was a cross-compiler which facebook abandoned for HHVM, a rewrite of the PHP engine with faster internals. It always had compatibility issues, but the improved performance made it interesting (afaik nobody outside facebook really got that much into Hack the language). PHP7 was a rewrite of the internal data structures which put it on close enough performance footing. Because HHVM was never a drop-in replacement for PHP and moved further away over time the community ended up sticking with PHP7 and now pretty much only facebook uses HHVM, with the big players that moved to HHVM all transitioning back to PHP7.


Hack, right? I'm curious as well.


Rust and Dart hasn’t seen that much “uptake” by the broader market, it’s hyped by geeks but that’s about it. Swift is becoming popular because if you want to develop for iOS, you don’t have a choice. Kotlin is getting some uptake.


Rust has seen lots of quiet uptake. What you don't see is hiring for people for Rust because most organizations don't experience a whole-sale shift to Rust, and you just retrain people on the inside.

I'd wager there are lots of people like me - writing small tools at work in Rust, but the employer would never hire for Rust.

I think we'll see broader market adoption in the next 2-5yrs though, in terms of people actually hiring Rust developers. There's still a little maturing to do in a couple areas to clean up some papercuts.

Disclaimer: I really like Rust and am a little biased.


Any citations?


So, for the uptake, you can see a large list of companies here: https://www.rust-lang.org/production/users

The not hiring but using bit is based on not seeing Rust jobs despite companies actively using Rust, and many anecdotes of "I wasn't hired for rust, but we use it" over in reddit on r/rust when people ask how to get a rust job.


I was thinking that exact same thing. Rails was hitting its peak in 2010, and the idea that programmer time is more valuable than CPU time suggested dynamic languages were the future.

I mean, that’s not entirely wrong — Besides JS, Python is arguably the most dominant language across all domains. Python is also faster than Ruby, though, and 3.x added static-like features to the language. It doesn’t seem like there’s much appetite for languages any slower than what we have now.
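For reference, a minimal sketch of the "static-like features" meant here: type hints (PEP 484, Python 3.5+), which have no effect at runtime but can be checked ahead of time by external tools such as mypy. The function below is just an illustration:

    # Type annotations: ignored by CPython at runtime, but a checker like mypy
    # can use them to catch mismatched arguments before the code ever runs.
    from typing import Dict, Optional

    def display_name(user_id: int, cache: Dict[int, str]) -> Optional[str]:
        # Return the cached display name for a user, or None if absent.
        return cache.get(user_id)

    print(display_name(42, {42: "Ada"}))       # "Ada"
    # display_name("42", {})  # still runs, but mypy would flag the str argument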


I don’t think one can say that python is faster than ruby: https://benchmarksgame-team.pages.debian.net/benchmarksgame/...

I think the perception is there because of python’s scientific and ML libraries that have a lot of c bindings, perhaps?

Both are drastically faster than they were a decade ago, so 2010-era python or ruby are still the high-water mark.


afaict until the 25 Dec 2019 ruby 2.7.0 release you probably could "say that python is faster than ruby".

fwiw

    ruby 2.6.3p62
    ruby 2.7.0p0

    binarytrees-5 220.557 elapsed secs
    binarytrees-5 124.421

    binarytrees-2 178.751 
    binarytrees-2 149.458

    binarytrees-3 176.672
    binarytrees-3 150.645

    binarytrees-4 136.930 
    binarytrees-4 113.516 

    binarytrees-1 176.208 
    binarytrees-1 149.078 

    fannkuchredux-1 1702.734 
    fannkuchredux-1 1738.016

    fannkuchredux-2 2129.404 
    fannkuchredux-2 1153.396 

    fasta-3 132.449 
    fasta-3 81.565 

    fasta-6 105.649 
    fasta-6 54.305 

    fasta-2 236.321 
    fasta-2 180.647 

    fasta-5 181.954 
    fasta-5 146.962 

    fasta-4 178.753 
    fasta-4 144.130 

    knucleotide-7 702.380 
    knucleotide-7 361.174 

    knucleotide-2 412.034 
    knucleotide-2 365.819 

    knucleotide-1 678.888 
    knucleotide-1 378.368 

    knucleotide-3 426.748 
    knucleotide-3 385.934 

    mandelbrot-4 886.246
    mandelbrot-4 1202.783 

    mandelbrot-2 1770.692
    mandelbrot-2 1329.560

    mandelbrot-3 1689.111
    mandelbrot-3 2090.329

    mandelbrot-5 1789.859
    mandelbrot-5 2119.157

    nbody-2 423.428
    nbody-2 376.758

    pidigits-1 60.476
    pidigits-1 30.232

    pidigits-5 6.446
    pidigits-5 4.790

    pidigits-2 18.670
    pidigits-2 11.841

    regexredux-3 17.774
    regexredux-3 9.431

    regexredux-9 71.050
    regexredux-9 40.018

    regexredux-2 55.729
    regexredux-2 33.070

    revcomp-2 66.741
    revcomp-2 32.203

    revcomp-5 44.957
    revcomp-5 21.445

    revcomp-3 89.536
    revcomp-3 39.507

    revcomp-4 107.196
    revcomp-4 43.994

    spectralnorm-5 1059.201
    spectralnorm-5 587.204

    spectralnorm-1 318.088
    spectralnorm-1 291.167

    spectralnorm-4 271.367
    spectralnorm-4 209.650


Ruby pulled ahead of Python in speed in version 1.9 and has stayed ahead (most of the time, in most tests) since.


> most of the time, in most tests

That isn't something which can be checked, just stuff someone said.


I wasn't trying to assert that ruby is faster, just trying to correct the notion that python is an inherently faster language than ruby.

If I look at a composite benchmark like debian's and don't see one leading by an order of magnitude, I file that away as "probably about the same speed, depends on your use case"


> Swift, Rust, Kotlin, and Dart

and golang (Go)


> - The next big thing will be something totally unknown and unpredictable now, as user-generated content and social networking were in 1999. However, when it does appear, various 'experts' on it will spring from nowhere to lecture us all about it. It will still be really cool, though.

I think this basically describes AI.


No, definitely Bitcoin


Bitcoin isn't really all that cool though.

It replaces trust with massive resource consumption and doesn't do a good job of being digital money (it is hard to use safely, it is hard to use anonymously).


It is cool because it really disintermediates powerful entities. Maybe the first iteration of digital money/assets is not that efficient, but that could also be argued for many other innovative technologies.

What is groundbreaking is the concept of P2P value exchange. And that's what bitcoin really is, still a young PoC that will surely evolve or get superseded by a 2.0 tech.

The genie is out of the bottle....


Eh, bitcoin is still a cool proof-of-concept. It's just that as it often happens, people took it to production when they shouldn't.


Agree, particularly this bit describes the blockchain hype...

> "various 'experts' on it will spring from nowhere to lecture us all about it"


I suppose that partly depends on exactly how novel the blockchain part is, but broadly speaking the world of efforts to combine some of electronic money, cryptography, proof of work and decentralisation was not a new one in the late 2000s. For example Ecash had been hot once https://en.wikipedia.org/wiki/Ecash , while later on Hashcash got a fair bit of attention as a possible solution to the spam crisis. In 2007 (early 2010 is another story) it was definitely more of a deep cut than AI or cloud, though.


AI was hot too before winter came


Definitely Bitcoin/blockchain. I wouldn't say it will actually disrupt traditional banking empires, but it was the next big thing, 'experts' definitely came out of nowhere, and it's still pretty cool.


AI wasn't totally unknown: basically every coastal university during the 1970s and 1980s was filled with people working on AI, and it didn't die off, it just became embarrassing to work on.


Neither was user generated content (that's what the original web was) or social networking (irc, email, bulletin boards, etc). Making it useful and usable beyond an academic and techie niche was new.


> Neither was user generated content (that's what the original web was)

The fall and return of the cat picture still amuses me. In the early days of Web adoption, when people would put up a personal home page just for the thrill of having one, stereotypically the only bits of content they would be able to come up with for it were a list of favourite bands and one or more scanned-in pictures of their cat. So for years and years afterwards cat pictures were a synecdoche, a punchline for jokes about the naïvety of the early Web. Then phone cameras and low-effort social media became big, and ...


And more, the AI hype-cycle was definitely on the upswing again already by the beginning of 2010. The Netflix Prize and the motor-vehicle DARPA Grand Challenges were all 2000s events, for instance.


Ummm, Uber?


Crowdsourcing / the ‘sharing’ or gig economy are huge.


Doesn’t really qualify as unknown and unpredictable. We’re in about the third AI bubble right now, and I doubt it’ll be the last.


I thought about virtual assistants and deep fakes


Regarding deep fakes, I don't think you're wrong, just early. I think we are just getting a glimpse of what is unfortunately possible as this decade closes out; they could introduce some real chaos in this new decade.

Edit: spelling


Neither virtual assistants nor deep fakes can be considered "the next big thing". They're in their infancy.


Nah, the rise of cloud, Shirley?


Not at all unforeseen by the end of 2009 though. The GMail beta dates to 2004, while EC2 was announced by 2006, and that's aside from all older precedents or predictions. (That said, not everyone 'got' EC2 at first.)


neural networks are not new


> IE6 will hang around for a few years, but may die very rapidly in workplaces when some killer enterprise web application stops supporting it.

Turns out the “killer enterprise web application” was YouTube :)

https://blog.chriszacharias.com/a-conspiracy-to-kill-ie6


In my experience (very large contact center multinational), what killed IE6 in the enterprise was PCI and other security compliance standards. Large companies do whatever it takes to stay compliant to a lot of these standards because having that "stamp of approval" is the only way to keep other equally large companies as clients.


Also jacquesm on Jan 1, 2010 ...

> if there is a way to work Elon Musk in there somewhere that would be good, I have a feeling he's going to make some big waves in the next 10 years but I haven't a clue how.


I thought ericb did very well:

https://news.ycombinator.com/item?id=1025787

"not [electronically] tracking your kids will be considered somewhat negligent"


We're not really quite there yet. Give it another 10 years.


Are you saying you have a child with a cell phone or tablet that has location services explicitly turned off?

I dunno, seems negligent to me.


If you're tracking them, so are other people. (Though, as we all know, the real, proper privacy fix is not having a phone.)


Hey, does your child have an RFID implant for easy ID and tracking?


The dynamic languages slower than ruby prediction didn't hold up either. The past ten years have just seen nibbling around the edges for dynamic languages, without any compelling ideas about how to make them fundamentally more expressive.


Doesn’t that describe the rise of JavaScript? I haven’t done or seen any specific benchmarks, but I gathered that JS was among the slowest of languages.


Like others have said, major Javascript engines are much faster than Ruby implementations, owing to the huge effort Mozilla, Google, Apple, and Microsoft have put into those engines. In principle, I'm not sure if there are reasons that make one intrinsically harder to optimize.

But even if Javascript were slower, it wouldn't mean the prediction was right. Javascript didn't do something super amazing that let you write beautiful code (the way Rubyists feel about Ruby, or Python folks feel about Python). It just ran the web so we used it. Over time, it's morphed into a language that's about as expressive as any other good dynamic language, also with some warts. But it's not a leap ahead, which is what the original post predicted.

The only language I've felt that way about was Prolog. When it works right, it's amazing. Too bad I can rarely make that happen.


I am convinced there has to be a way to leverage the expressive power of a language like pure prolog without the strain of the 'hints' you need to give it when the depth-first algorithm can't solve the problem. My bet is that if the interpreter 'knows' enough algorithms to solve common calculations, it could leverage these to compute 90% of problems (without needing to have one algorithm for each problem). For the other 10% you would have to supply the interpreter with the necessary algorithms yourself.


JS is pretty much the fastest of the popular dynamic langs, if only because JS interpreters see highly-competitive investment from well-funded companies with armies of developers thanks to the web.


No, it's pretty fast [1].

Chrome and V8 were released in 2008, and Node.js (which leverages V8) in 2009. V8 was already considered fast back then. Mozilla and Apple also focused on optimizing their JS stack in the following years.

[1] I'm not a fan of linking to benchmarks, but that's the best I can do: https://benchmarksgame-team.pages.debian.net/benchmarksgame/... / https://benchmarksgame-team.pages.debian.net/benchmarksgame/...


Google’s marketing for Chrome and V8 was incredibly effective. We’ve basically forgotten that Mozilla, Adobe and Apple were pushing JS perf long before V8 was released. Memory usage of early V8 was atrocious and real world performance was barely improved over Safari and Firefox but Google dominated dev mindshare for a decade.


No, Mozilla and Apple didn't release their JIT engines until after Chrome (though I think at least Apple was working on theirs) and so when Chrome came out JS immediately got about 4x faster, which was a bigger improvement than had happened over the entire previous lifetime of JS. Popular web pages didn't immediately get 4x faster, because they had carefully avoided doing anything in JS if its performance mattered, but JS did. It's true that V8 used a ridiculous amount of RAM, and still does, but its existence made it possible to do things in JS that previously required native code.

As for Adobe, they never did much for JS; they tried to get people to use ActionScript instead, which was a statically-typed language with syntactic similarities to JS.


Apple built the Sunspider benchmarks in 2007, and along with Mozilla, they made dramatic real world performance improvements before Chrome was even publicly announced.

https://webkit.org/blog/152/announcing-sunspider-09/ https://webkit.org/blog/214/introducing-squirrelfish-extreme...

Chrome + V8 was already behind in real world performance, not ahead!

http://www.satine.org/archives/2008/09/19/squirrelfish-extre...

Macromedia (later Adobe) funded a whole bunch of JIT research as the Tamarin Tracing project, which was shared with Firefox as TraceMonkey. The NanoJIT assembly backend was actually shared code.


I guess you're right that TraceMonkey was announced before Chrome was released on 2007-09-02, one day after its inadvertent public announcement. TraceMonkey was announced, though incomplete, more than a week earlier, 2008-08-23: https://brendaneich.com/2008/08/tracemonkey-javascript-light.... SFX, however, was announced later, 2008-09-18.

I don't want to piss too much on SunSpider, but unsurprisingly V8 did better on V8’s benchmarks, while SFX and TM did better on SunSpider, once they were eventually released. That's because V8 was written to do well on its benchmarks, presumably, not because the benchmarks were rigged. Certainly they were in accordance with my experience at the time.

When Chrome was released, V8 was way ahead of the other browser JS engines at JS performance, partly because it had the first JS JIT. But the other browsers took a year or two to catch up, which they had because it also took websites a few years to move most of their functionality into browser JS.

I don't know why you keep emphasizing this “real-world” thing. Are you saying you think nbody and fannkuch are especially realistic benchmarks?

You're right though that Tamarin was a JS engine, not just an AS engine. We regret the error.


The point is Mozilla, Apple and Adobe did enormous work on JS before Chrome was released.

nbody and Fannkuch are meaningless for real world performance on a client side JS app. V8s benchmarks were widely considered to be unrepresentative of what browsers actually do. Sunspider and derivatives were the most realistic tests at the time and Chrome wasn’t a dramatic improvement.

https://johnresig.com/blog/javascript-performance-rundown/

Chrome’s marketing was incredibly successful in getting more people to use a browser better than IE but it didn’t have a dramatic improvement over FF or Safari.


> The point is Mozilla, Apple and Adobe did enormous work on JS before Chrome was released.

That's true relative to the work they had done on, for example, Gopher support, but it's not true relative to the work they did after Chrome was released. As I said, JS performance from 1995 to 2008 hadn't increased by even a factor of 4; then Chrome was released in 2008 and it immediately increased by a factor of 4, more than it had increased in the entire previous 13-year history of JS. It's true that there existed other optimization efforts. But they weren't successful, probably because not nearly enough effort was devoted to them. Tamarin Tracing/TraceMonkey was eventually discarded and is not part of SpiderMonkey today, although a different JIT strategy is. (LuaJIT uses the tracing-JIT strategy very successfully, though.)

> nbody and Fannkuch are meaningless for real world performance on a client side JS app. V8s benchmarks were widely considered to be unrepresentative of what browsers actually do. Sunspider and derivatives were the most realistic tests

I mentioned nbody and fannkuch because they are in SunSpider, so it seems that you are contradicting yourself in addition to, as demonstrated previously, mixing up the historical sequence of how things happened.

I had just written a parser generator and an experimental functional language that compiled to JS when Chrome came out, and the performance improvements I saw were in line with the Chrome benchmarks. My experience is not part of Google's marketing.


Safari perf had already improved more than 4x with the move from AST Interpretation to a register based bytecode VM with SquirrelFish. This happened before Chrome.

I’m working on a tracing JIT for Ruby at the moment partially inspired by LuaJIT. Fingers crossed it’ll be published research this year!

Right, but the Sunspider benchmark also tested strings, regex and date manipulation. V8s original benchmarks didn’t. Performance on stuff like a parser generator was way up, but it didn’t run jQuery based Ajax sites any faster.


Best of luck with that! I look forward to seeing it.


> before Chrome was released on 2007-09-02

This should read, "before Chrome was released on 2008-09-02".


fwiw n-body 2009

10x JavaScript V8 211.76 secs

100x Ruby 1.9 #2 34 min

106x JavaScript TraceMonkey 36 min

http://web.archive.org/web/20090705014239/http://shootout.al...


Real world performance was the opposite story, with Safari's super efficient bytecode VM out performing V8s JIT: http://www.satine.org/archives/2008/09/19/squirrelfish-extre...


No, it really wasn't. You had to use it in really specific ways and it had all sorts of gotchas. One I vaguely remember is that if an array got too big performance would tank like crazy.


Did anybody predict the slow death of jQuery or the rise of giant monster JavaScript frameworks? 2010 is when jQuery was a cult of personality. It was everywhere and any criticism of it was quickly met with desperate mob violence.

Did anybody predict a typical corporate web page would require 3gb of JavaScript code from frameworks and thousands of foreign packages just to write a couple of instructions to screen?


Applications written in JavaScript tend to be hideously slow (and getting worse) but this is more of a community problem; the language itself is now rather fast by dynamic language standards.


JavaScript is now both faster and less productive than Ruby, so it doesn't match his prediction that a slower and more productive language will become popular.


What it does show is that being a web standard trumps everything else.


Raku (formerly Perl 6) basically sits alone as the successor to the slower-and-more-expressive throne. It's a great language to study for the gee-whiz-they-thought-of-everything factor.

But, nobody is paying attention to it. Some of that is because of the Perl 6 baggage, but just as much is probably because this past decade was so heavily entrenched in proliferation of the web-cloud-mobile paradigm that new scripting systems weren't part of the hype cycle. If it didn't get the backing of the FAANGs, it didn't register.

I can imagine a day coming where scripting shines again, though. It might actually be closer than we think. There is always a need for glue code, and glue code benefits from being a kitchen sink of built-in functionality.


We've reached a point where package ecosystem eclipses the importance of language features (which, in brand new languages, typically only offer incremental improvements these days).

It's not good enough to build a slightly better language any more. People won't learn a new language without their favourite packages (or equivalents).


A solution to this is leveraging the ecosystem of another language, like elixir (erlang) and all the jvm languages do.


Julia does this by being able to call Python, R and Fortran libraries.


This has a habit of creating impedance mismatch problems though.


Python is eating the world. In many measures, Python is slower than Ruby.


Python is just a very nice interface to a large amount of optimized C code.
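As a small, hedged illustration of that point (assuming a Linux system where the C math library is available as libm.so.6), the standard library alone lets Python call straight into compiled C:

    # ctypes: load a shared C library and call it directly from Python.
    import ctypes

    libm = ctypes.CDLL("libm.so.6")          # the C math library (Linux path)
    libm.cos.argtypes = [ctypes.c_double]    # declare the C signature...
    libm.cos.restype = ctypes.c_double       # ...so values convert correctly

    print(libm.cos(0.0))   # 1.0, computed by compiled C, not by the interpreter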


Don't neglect the Fortran code!


I would have agreed with you in 2013. Now it just seems so fractured and all over the place, like it's had its peak and is on the way back down.


Sometimes fractured and all over the place is a sign of success. Python is used by a huge variety of different communities for a myriad of use cases, which is pulling it in different directions. That's not a bad problem to have, as long as it's managed well, and the core devs seem very tuned in to this issue.

Also the community migration to Python 3 is done now. Yes there are massive P2 code bases out there still, but for new projects P3 has been the clear choice for a long time now. It’s over.


python won't last, just like ruby / js / php

I sense a strong convergence between all of them (builtin DS, linguistic traits, bits of static typing)


JavaScript will last. There's no question there at all.


> Now that Google is advertising Chrome on billboards here in the UK, all that can be safely predicted about the browser market is that it'll be extremely competitive.

It's rather interesting to see how the web went from IE dominating, to Chrome doing essentially the same. Looks like predicting such a growth was farfetched.

Funny to see that the prediction of IE sticking around was mostly right. I long for the day that IE11 will mostly be gone.


Internet Explorer 6 could have been the dominant browser essentially forever if Microsoft hadn’t zeroed its investment into it. Its incentives weren’t aligned with a robust web platform. Google’s incentives are definitely aligned for Chrome to dominate.


> - China will not become a democracy, or even make moves in that direction. However the rule of law will strengthen, and some civil liberties will increase.

The latter half of this one didn't come to be. Rule of law is weaker and civil liberties have declined.


Civil liberty and rule of law are mostly orthogonal. China has used the decade to make most court cases publicly accessible through the internet, all corporations and their holding/debt status similarly accessible, and a host of other governmental information available. While there is still a significant amount of corruption, law enforcement is arguably at its most transparent in history. Civil liberty is on a different track, however, with laws obstructing it being enforced.


Most court cases yes, the ones that actually matter from a political and civil liberties perspective, no not at all. China is ruled by Xi Jinping and the Communist Party, not the law.

“China Must Never Adopt Constitutionalism, Separation of Powers, or Judicial Independence“ - Xi Jinping


And you actually believe the accuracy of those published documents?


Yes. Most of these are not related to politics at all; 99.9% of cases in China are the same as in the West: traffic, violence, contract breaches, etc.


According to the World Bank's metric, China's percentile ranking on rule of law improved somewhat from 2010 to 2018 (eyeballing the chart, looks like it moved from 40th to 48th). This is of course not quite the same thing as an absolute increase in rule of law; if rule of law is worsening everywhere, then the ranking change could still be a decrease (though globally worse rule of law isn't my impression).

https://info.worldbank.org/governance/wgi/Home/Reports


Rule of law seems to have mainly improved in economy-related areas, not outside of that.


I think the Microsoft prediction is even more off...


My gut response to his thought of MS pivoting to consulting a la IBM was that it was way off, but after thinking about it for a minute you could kind of make a case that Azure is a form of "self-serve operationalized consulting"; he was just off on its impact and success by a factor of 50 or 100.


I heard someone recently describe the value proposition of enterprise cloud like IBM and Azure as “not having to talk to a sales guy as a service”.


Well then every SaaS is just consulting. That’s pretty pointless then.


There was one comment in the thread about what Steve Jobs will do at the end of the decade. As a ghost I suppose? (Another says he will step down due to his health, which is closer.)

My favorite is the joke that Zuck will buy Portugal.


Saying he'll step down due to his health isn't just closer. It's what happened. He resigned a bit over a month before his death.


Jobs was alive in 2010.


Yes I know.

As in the saying. Hindsight is 20/20.


You have to remember that in 2010, Windows was still saving its reputation from Vista, Office was stale, and OneDrive didn’t exist. I don’t think Azure did, and if so it was more enterprise focused.

Nadella’s excellent pivoting of MS into a cloud and services focused company saved Microsoft from a stagnant or declining state.


I don’t disagree that was the situation at the time - but it doesn’t change how incorrect the prediction was :-)


That was shockingly accurate, really. Thanks for linking it. Who is this IsaacL and which year did he come from? ;)


IsaacL, if you're still around, you should make predictions for stuff by 2030!


This is why I like to stick around HN and just read. The minds here seem to be much more brilliant compared to other parts of the web. That dude nailed it!


IsaacL is still around, it seems: https://news.ycombinator.com/threads?id=IsaacL


It's interesting that it's so far down the page, too. Popularity and accuracy were inversely correlated from my reading of that thread.


- As Moore's Law marches on, dynamic languages that are even slower than Ruby are likely to catch on. They may be to Ruby what Ruby is to Java, trading even more programmer time for CPU time.

I don't think this happened?


The fact that in 2020 we still don't really have CPUs significantly faster, in terms of single-threaded perf, than those released in the middle of the decade (single-threaded execution still being the paradigm most programming languages operate under, for the most part) kind of put a nail in the coffin of that prediction.

I can totally picture new languages as described doing well today if we had exponentially more single threaded processing power/watt at our disposal than we actually do.

I think there is still an opportunity for languages that offer huge leaps in expressiveness w.r.t. handling concurrency, even at the cost of raw single-threaded performance. Though I would not label a language like that "slower", since it would let us make much better use of our computing resources than we reasonably can today without blowing through most of our complexity budget, resulting in faster programs in practice.


You could argue that it's JavaScript. JavaScript had terrible performance even in 2008; I can't remember now what it was like by 2010, but I would guess V8 was still pretty lacklustre back then.


I read it as saying that if, for argument's sake, Ruby is 10 times slower than Java, then people would start using a new language that is 10 times slower than Ruby. I'm not sure that ever described JavaScript, and it certainly didn't as the decade progressed, with the advancements that have been made in executing it.

I feel like that one simply missed the mark. If anything, there was greater emphasis towards languages that were more performant, even if less productive for banging out a working product.


It definitely did. IE6's JavaScript was super slow. You cannot even imagine how slow it was if you never worked in it.

Remember, IE6 was the dominant browser for years, so very slow JavaScript was the norm.

For example, around 2008 I made a client-side pivot table for a web app. I tried it in JavaScript first, and even trivial tables would take 10 seconds despite all the optimization I tried (in IE6, IE7, and Firefox).

I rewrote it using XML + XSLT and it was instant. But it was truly gnarly code.
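(For anyone who hasn't built one: the core of a client-side pivot is just a group-and-aggregate pass over the rows. Here is a minimal sketch in modern JavaScript; the field names are invented, since the original app isn't described. A loop like this is instant on today's engines, and the comment above is about how even this much work crawled in IE6-era interpreters.)

  // Group rows by a key column and sum a numeric column.
  // rows: array of plain objects, e.g. { region: 'EU', sales: 120 }
  function pivot(rows, keyField, valueField) {
    const totals = new Map();
    for (const row of rows) {
      const key = row[keyField];
      totals.set(key, (totals.get(key) || 0) + row[valueField]);
    }
    return totals;
  }

  const rows = [
    { region: 'EU', sales: 120 },
    { region: 'US', sales: 80 },
    { region: 'EU', sales: 40 },
  ];
  console.log(pivot(rows, 'region', 'sales')); // Map { 'EU' => 160, 'US' => 80 }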


I think the point of the prediction is that the new, slower language would be more expressive. More natural-language-like. That doesn't describe JS.


People who have seen more than I have: how much slower was JavaScript in IE6 vs. V8?


Maybe JavaScript libraries and frameworks? I haven't touched too many of the "batteries included" ones, but I see them out of the corner of my eye at meetups and hackathons; they seem unwieldy enough to fit the bill.


Well, Moore's Law had pretty much ended by then, but it was less visible.


If getting 6 out of 17 predictions correct, with the only unambiguously correct predictions being trivial or vague, is "nailing it," then I probably won't be taking stock advice from you any time soon. On the other hand, predicting the future is kind of hard. IsaacL made some really conservative predictions, sticking to things that seemed obvious. In fact, most of the wrong ones would be perfectly reasonable predictions for the next 10 years. It just goes to show how hard making predictions really is.

Below is how I calculated 6 out of 17 predictions correct.

---

> Facebook will not be displaced by another social network. It will IPO some time in the next two years.

Correct

> Twitter will become profitable, but not as much as some expect. It will be less profitable than Facebook, and may sell to another company.

Correct (profitable since last year)

> Microsoft will .. have shrunk and may have evolved into a consultancy company on the lines of IBM

Wrong

> Internet Explorer will shrink, but won't go away

Correct? (debatable, since software never completely goes away, but MS is no longer developing it and Edge doesn't use the same rendering engine)

> Chrome OS or a similar operating system that relies on web access may grow extremely slowly at first, before rapidly gaining share amongst certain market segments. It will be most successful in places like cities that grant free municipial wifi access.

Wrong

> Mobile phones won't replace computers, but increasing penetration amongst the poorest in developing countries, and increasingly capable handsets in developed countries (and developing countries) will make them a colossal juggernaut. Many of the really big changes, especially social changes, will be caused by mobiles.

This was already true in 2010, so it doesn't even count.

> For any definition of 'success', there will be more tech startups reaching that level in the 2010s than in the 2000s. For example, there will be more than four startups of Youtube/Facebook/Twitter/Zynga proportions.

Wrong.

> In addition, at least one of the 'big' startups of the second half of the decade will have been possible with 2009 technology. By this I mean that people will still be discovering new potential for browser-based web applications built with current client-side technologies, which will remain ubiquitous, although new alternatives will appear.

Correct, but the prediction that there will be at least one new web-based startup is not very interesting.

> It will be an even better time to start a startup in 2020 than it is now. One of the key drivers of ease-of-starting-up-ness will not be new technology, but new platforms - like Facebook and viral marketing, but better; or that solve other problems like micropayments, customer development, retention, and so on.

Wrong.

> Hence, starting up will become a more attractive career option, though well-meaning family will still say "at least finish your degree first".

Wrong.

> As Moore's Law marches on, dynamic languages that are even slower than Ruby are likely to catch on. They may be to Ruby what Ruby is to Java, trading even more programmer time for CPU time.

Wrong.

> Having said that, Moore's law will at least hiccup and may stop altogether in the middle of the decade, as semiconductor feature widths drop below 11nm. Since this will likely encourage investment in quantum computing and nanotechnology, by 2020 we might be seeing something faster than Moore's Law.

Wrong, IMO (transistors are still on the curve, but the performance impact of adding more transistors doesn't matter the same way it used to; but he cited Moore's Law specifically, so he's wrong).

> An international deal, of the kind that was aimed for at Copenhagen, will be reached over the next five years, though it might not be far-reaching enough to limit warming to 2 degrees in the long-term. (Despite the failure of the Copenhagen talks, it appears that world leaders almost universally recognize the need to take action over man-made climate change, though the various political problems will remain hard problems). China may not be part of such a deal, though the US likely will. Environmental disasters will begin to increase through the decade, as will disasters that are probably not caused by anthropogenic global warming but will be blamed by it anyway; this will provoke more of a push for action.

Wrong.

> Increasing fuel prices, and green taxes or incentives, will mean large shops will begin to replaced by warehouses, as traditional retail gives way to home delivery.

Wrong, but only because he cited a reason. Otherwise, the prediction that brick-and-mortar stores would continue to be replaced would have been correct.

> China will not become a democracy, or even make moves in that direction. However the rule of law will strengthen, and some civil liberties will increase. Internet crackdowns will continue, and may increase in severity, and will still be rationalized by porn.

I want to say the civil liberties situation in China is the same, but I don't know enough about China to comment on it.

> Despite multiple new fads that purport to make software development ten times faster and error-free, it will remain a hard problem.

Correct, but trivial. Of course writing better software is hard.

> You still won't be able to talk to your fridge, and gesture-based HCI will remain a fun gimmick.

I'm counting this as wrong because of Alexa.

> Virtual worlds like Second Life will remain niche, but World of Warcraft will pass 20 million users and a Facebook game or similar will pass 200 million users.

Wrong.

> The next big thing will be something totally unknown and unpredictable now, as user-generated content and social networking were in 1999. However, when it does appear, various 'experts' on it will spring from nowhere to lecture us all about it. It will still be really cool, though.

Wrong, but maybe I'm forgetting about some big tech thing in the 2010s. It seems that the biggest disrupters were a modernization of cab companies and hotel bookings. We may be at the dawn of several genuinely unpredicted new things, like driverless cars and the use of neural networks for some kinds of cool new applications, but there's been no killer app. These are hopefuls on the horizon, not profitable products and industries.


> (ChromeOS will grow slowly, then rapidly gaining share amongst certain market segments) Wrong.

I don't know about that. ChromeOS did grow slowly, and has gained significant share of the entire US education space (K-12). Yeah, it's a niche market. But I think that fits the "certain market segments" qualifier.

> (Virtual worlds will remain niche, but WoW will pass 20 million users, a Facebook-like game or similar will pass 200 million users) Wrong.

This is technically wrong, but correct in spirit. Virtual worlds / MMO games did get the traction claimed, just not WoW specifically. (The MMORPG FF14 Online has ~20 million users, the MMO Warframe has 50 million registered users, the 'Facebook game' FarmVille has 73 million users, and the 'or similar game' Fortnite has 250 million registered users.) Minecraft holds similar numbers.

And while WoW itself never quite hit those numbers, a different game from the same studio did. Blizzard's Hearthstone has over 100 million registered users today.


How is it not easier to start something now than ten years ago? I would say he was correct on that one.


Maybe analysis paralysis?

Facebook was started as a silly PHP script. Now you “need” half of npm, React, a backend, devops, etc. to get started.


You can still take the 2010 approach and just run everything on Rails and Postgres. It’ll be a long time before any startup outgrows that stack, and by the time they do they’ll have the ability to rewrite like Twitter did.

Anyone who starts off with anything more than that (or Django or your favorite language’s equivalent) is just wasting time and effort trying to be trendy. Over-engineering a startup only serves to keep you from testing market fit, which is only a good plan if you’re trying to milk more investment money because you know you’re going to fail as soon as you launch.
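(To make that point concrete, the "boring 2010 stack" is still only a few lines of glue. Here is a minimal sketch in Node, using the express and pg packages as stand-ins, since the comment talks about Rails and Django rather than any particular JavaScript setup; the route and table names are made up.)

  // One web framework, one Postgres database: the whole "stack".
  const express = require('express');
  const { Pool } = require('pg');

  const app = express();
  const db = new Pool({ connectionString: process.env.DATABASE_URL });

  app.get('/users/:id', async (req, res) => {
    const { rows } = await db.query(
      'SELECT id, name FROM users WHERE id = $1',
      [req.params.id]
    );
    if (rows.length) res.json(rows[0]);
    else res.sendStatus(404);
  });

  app.listen(3000);

Swap in Rails plus ActiveRecord, or Django plus its ORM, and the shape is the same: one process, one database, nothing else to operate.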


That's a tech-only view. Starting up is harder because incumbents in innovative businesses have built huge moats, in pharma, chemistry, web, mobile, etc. Big businesses are more ready to react to disrupting newcomers by pricing them out, lobbying for new rules, buying them out, and so on. Incumbents are also reaching a global scale that in itself generates a huge advantage...


> Now you “need” half of npm, react, backend, devops etc to get started.

Says who? You can literally spin up nearly any server computing infrastructure imaginable in a matter of seconds. Just because software is getting bloated under the covers doesn't mean it's any harder.


>> Mobile phones won't replace computers, but increasing penetration amongst the poorest in developing countries, and increasingly capable handsets in developed countries (and developing countries) will make them a colossal juggernaut. Many of the really big changes, especially social changes, will be caused by mobiles.

> This was already true in 2010, so it doesn't even count.

This definitely wasn't true in 2010. On June 1, 2010, Steve Jobs proclaimed that the post-PC era had arrived, and was promptly ridiculed for it by the industry and media. Massive social changes caused by mobile phones have only occurred this decade, as the gig economy has exploded and places like India have 10x'd the number of citizens with internet access.

>> For any definition of 'success', there will be more tech startups reaching that level in the 2010s than in the 2000s. For example, there will be more than four startups of Youtube/Facebook/Twitter/Zynga proportions.

> Wrong.

Huh? Of those 4, Facebook in 2009 was the most valuable at $10 billion. Now there are 20+ private unicorns with that valuation, as well as dozens more that have IPO'd in the last few years. How is that prediction wrong in any way, shape, or form?

>> Hence, starting up will become a more attractive career option, though well-meaning family will still say "at least finish your degree first".

> Wrong.

>> An international deal, of the kind that was aimed for at Copenhagen, will be reached over the next five years, though it might not be far-reaching enough to limit warming to 2 degrees in the long-term. (Despite the failure of the Copenhagen talks, it appears that world leaders almost universally recognize the need to take action over man-made climate change, though the various political problems will remain hard problems). China may not be part of such a deal, though the US likely will. Environmental disasters will begin to increase through the decade, as will disasters that are probably not caused by anthropogenic global warming but will be blamed by it anyway; this will provoke more of a push for action.

> Wrong.

Again, this is correct. The US did enter the Paris Agreement, and has not actually formally withdrawn yet as it is not legally eligible to do so until November 2020. "Environmental disasters will begin to increase through the decade, as will disasters that are probably not caused by anthropogenic global warming but will be blamed by it anyway; this will provoke more of a push for action." is an especially cogent prediction.

>> China will not become a democracy, or even make moves in that direction. However the rule of law will strengthen, and some civil liberties will increase. Internet crackdowns will continue, and may increase in severity, and will still be rationalized by porn.

> I want to say the civil liberties situation in China is the same, but I don't know enough about China to comment on it.

Civil liberties have likely gotten worse, but the rule of law has indeed strengthened. Internet crackdowns being rationalized by porn is also correct, as seen in the UK and elsewhere.

As far as ChromeOS and video games go, maxsilver covered those pretty well already.


Fair, but OP (DanielBMarkham) was only 2 days off on his prediction that "major changes will happen in Iran."


The way he phrased it pointed to an internal change, not something like this.


I remember Iran had very large scale protests in 2009 leading into 2010, so that was probably on their mind, thinking it would lead to some kind of radical change.


And even more surprisingly, WoW in 2019 released a 2006 version :D


- You still won't be able to talk to your fridge

This is wrong too.


> IsaacL nailed it the best

Curious if the "wisdom of crowds" works with these future predictions or if only a few were on target.


Yes, pretty close... too bad we can't comment on old threads but at least I upvoted him =)


these time travelers are getting cocky


Half of those predictions read like "the thing that is a trend now, will continue to be a trend". I can be a prophet too - Google, Amazon, Facebook, Apple and Microsoft are not going to be displaced in the next decade.


> Half of those predictions read like "the thing that is a trend now, will continue to be a trend"

And yet, people regularly think "a decade is a long time, surely this trend will be finished by then". Recognizing that existing trends are still likely valid is worthwhile.


Just wait until we hit the 2040s and get to compare Kurzweil’s predictions.


What’s amazing is how little changed.

- Apple and Google are still the only two mobile operating systems that matter and they are still in relatively the same position. iOS still controls the high end where the money is and Android has the market share but the OEMs are not making any money.

- Facebook is more profitable and popular.

- Amazon is still the number one online retailer, the Kindle is still by far the most popular ereader but more importantly, the Kindle platform is still dominant.

- Google still hasn’t managed to diversify from its ad business and YouTube is still dominant for video.

- Netflix is still the dominant streaming platform.

- Microsoft is doing better than ever.


I think Microsoft was the biggest surprise. In 2010 MacBooks were starting to become very popular, and it looked like Microsoft’s monopoly on consumer computing was coming to an end. It was common to think Microsoft was on a long decline in 2010.


It's interesting how none of the commenters envisioned Microsoft repositioning itself to adapt to technological and business trends. Everyone seems to have thought they would either fight them and lose, or contract to becoming an IBM-like consulting business.


Also, look how much things changed between 2000 and 2010 compared to 2010 to 2020.

Facebook didn’t exist.

Apple was “beleaguered”.

Google was in its infancy and Yahoo was still dominant (before the crash)

Microsoft was seen as unassailable.

Amazon was mostly just an interesting book seller. They were just getting into CDs and hadn’t even made the deal with Toys R Us yet.

Netflix was struggling against BlockBuster renting DVDs.

AOL was still dominant.


Sure, the large players stayed about the same in the last decade but we also found some new large players.

The entire app ecosystem was in its earliest infancy in 2010. AWS was in its infancy. Slack hadn't been released.


Slack isn’t a “large player”; they are nowhere near being profitable and are nowhere in the same league as Facebook, Amazon, Apple, Google, or Microsoft.

The app ecosystem was growing like crazy in 2010. It was a year and a half into “there’s an app for that”. Facebook had already pivoted to being big in mobile.

The same can be said about Uber. Once the companies have a profitable and sustainable business model that doesn’t involve burning cash then they can be considered a major player.


Regardless of profitability, I think people are spending a lot more money on the internet today than they were 10 years ago. Subscription services, the gig economy, and online retail have exploded in the last 10 years in a way that I don't think many people predicted.


Everyone saw online retail exploding by 2010. Amazon was already big and growing, Barnes and Noble was already struggling against Amazon, and Apple/iTunes was the number one music retailer as brick-and-mortar music retailers had started to close.

By 2010, streaming services were big, Apple was selling movies and TV shows digitally and had already started removing DVD drives from its computers.


Also, Kodak was still the king of photography. In decline, but still king in 2000. Who'd have thought that in just 5 to 10 years it would become a nobody?


And Nokia as well...


But nobody would quite have imagined that Microsoft wouldn't control an iota of the mobile operating system market.


Honestly it doesn’t matter. It came out in the Oracle trial that Google only made $23 billion in profit from Android from its inception to the start of the trial, and it still has to pay Apple a reported $8 billion a year to be the default search engine on iOS.

Apple has made a lot more in mobile from Google than Google has made from Android.

For a while, MS was making more in patent fees from Android OEMs than Google was making in licensing and advertising.

Losing mobile was the best thing that could have happened to MS, letting it focus on Azure and making Office ubiquitous, which is where the real money is.


> still has to pay Apple a reported $8 billion a year to be the default search engine for iOS.

That is not the way to think about it. Imagine if Android didn't exist or had faltered like Symbian, RIM, Windows Mobile, etc.; Google would have been paying way more than $8 billion. Android is definitely worth more than the money directly attributed to it.


Not really. The Android market is really not worth that much. Statistically it's made up of consumers who aren't willing to spend as much as iOS owners. The high-end, non-Chinese Android market is minuscule. They just aren't as attractive to advertisers as iOS users.


Only slightly related, but is it the same in the EU? Unlike the USA, Android has a much higher market share here.


I can’t find data on the EU specifically.

https://www.mobilemarketer.com/news/survey-iphone-owners-spe...

But the best I could find for Europe is:

https://sensortower.com/blog/europe-app-revenue-and-download...


Thanks, the 2nd link seems to show that iPhone owners here spend way more relative to their share of downloads as well.

Personally, I have quite a few paid apps, but the majority I bought over the years and keep using. The top-grossing apps are all subscriptions, and the only one I subscribe to is OSMAnd (an OpenStreetMap app), which includes a donation to OSM contributors.


Even after buying Nokia's phone business.


> - Apple and Google are still the only two mobile operating systems that matter and they are still in relatively the same position. iOS still controls the high end where the money is and Android has the market share but the OEMs are not making any money.

You're badly misremembering or recontextualizing with hindsight bias. At the start of 2010, iOS had 32%, Android had 4.5%, and there were at least 5 other mobile OSs with >4% market share. Start of 2020 and we're looking at Android with 74%, iOS with 24%, and zero other OSs with >1% market share. These are very different situations!


Android was at 22.7% in 2010. It was growing like crazy.

https://www.computerworld.com/article/2512940/android-smartp...


It was! It grew from ~4% at the beginning to ~22% at the end.


> - Apple and Google are still the only two mobile operating systems that matter and they are still in relatively the same position.

I don’t think this was a sure thing in 2009. Symbian still had half the market. BB had a larger share than either iOS or Android. WebOS was cool.


BlackBerry was already going downhill by 2010.

https://business.financialpost.com/technology/personal-tech/...


Hindsight is 2020... err that article is 2016.


> Netflix is still the dominant streaming platform.

Netflix didn't start streaming until 2011. So it wasn't a streaming platform at all in 2010.


Netflix was a streaming platform in 2007.

https://arstechnica.com/uncategorized/2007/01/8627/

By 2009, they had 12,000 movies up for streaming, and Netflix-compatible devices were advertised heavily in stores, with Best Buy including Netflix apps on their store-brand devices.

https://www.cnet.com/news/netflix-compatible-video-devices-c...


The big announcement when the iPad was introduced in March 2010 was that Netflix would have an app for it. It had been streaming for at least three or four years by then.


They used to have discs for many platforms (e.g. PS3, Wii) that you could use to run Netflix and play online content before a native app was released.


I was streaming from Netflix in 2007.

