Regarding escalation of school shootings, it seems plausible, but only because (here is an unpopular opinion that will get 20 downvotes) the rich want to disarm the poor and use school/kids/safety as an excuse. And that's understandable: if I were rich enough to afford private schools for my kids in a classy neighborhood, I'd want to keep weirdos with guns far from my family, while at the same time I'd have 24/7 former-Marine guards armed with full-auto weapons.
Your source doesn't say school shootings are down from previous highs nor that they're very rare; it specifically only deals with shootings where _4 or more students were killed_ excluding the shooter. That's a very different thing from "school shootings", where the mean number of fatalities is 3. If you define "school shooting" as "people being shot at school" and use the same data as your source, you see 247 people shot in 64 incidents during the 2000s, and 555 people shot in 213 incidents during the 2010s. (It's worth noting that Fridel & Fox came under some scrutiny for choosing their definition of school shooting, because the standard definition used--by the FBI, the Bureau of Justice Statistics, and by Fox himself in previous studies that conflicted with his new one--is four people _shot_.)
What we're seeing is a large increase in the number of school shootings, but a decrease in the average success of each school shooter. Is it surprising that the average shooter has become less effective when schools are spending a lot of time and money preparing to deal with them? Shooter drills, established lockdown procedures, buildings being designed with escape and hiding in mind[0], staff training, employment of security guards, etc.
The data you're citing states it plainly: there are more shootings at schools now than 10 or 20 years ago, and more people are being shot at school than 10 or 20 years ago. 50 people killing 3 kids each isn't somehow safer than 25 people killing 4 kids each.
Hang on. If you're counting things where only one or two people are shot, there is very often no intent to shoot more than that. You can't use that data point by itself to talk about the "success" of school shooters. And those smaller incidents are definitely not what I think of when I hear the words "school shooting".
> 50 people killing 3 kids each isn't somehow safer than 25 people killing 4 kids each.
But you have to treat people killing 2 kids in a completely different way from people killing 20. Drills, lockdown procedures, almost all gun limitations... those don't do anything against a single murder.
On that thought, is there any way to prevent shootings without making guns harder or impossible for citizens to own?
Seems many will not disarm in the US, so should we be exploring other options rather than repeatedly saying the US should be more like other countries?
Last I checked, cars do not require any licensing to own. They require special licensing and often times yearly taxes to operate on public roads (but merely possessing them on public roads does not).
You need to register the car, and that often requires proof of insurance, which requires a driver's license. Either way, my point still stands. You're quibbling over semantics. So few people want to buy a car just to not drive it, that we don't even consider it a problem.
Not at all. He’s accurately reflecting gun laws and car laws. You are willfully hand-waving away the fact that cars are entirely unregulated except for use on public roads, whereas guns are regulated in every facet of their existence.
You are doing this because you read a liberal site and heard the liberal talking point that guns are easier to get than a car, which is wrong.
> and often times yearly taxes to operate on public roads (but merely possessing them on public roads does not).
In my jurisdiction one may not "move" or "leave standing" upon a public road an unregistered vehicle, in addition to not being permitted to "drive" it.
Wow, you can't tow an unregistered vehicle? That definitely sounds off, but then I've seen even more silly laws in the past (such as one that forbids minors from playing pinball).
> Is it so bad to make guns as hard to own as a car?
The ONLY people that say this are the ones that would be (fucking) horrified if we treated guns like cars.
Ok. You are proposing I should be able to buy any gun I want whenever I want regardless of size or capacity, automatic, or loud or silenced for cash without any checks or regulations at all? That the only requirement to use it in public is that I need a license - BUT - that carry license is good in all 50 states?
... that sounds good to me! Let’s do it.
Yours is an argument based entirely in ignorance of gun laws. And maybe car laws too.
> Ok. You are proposing I should be able to buy any gun I want whenever I want regardless of size or capacity, automatic, or loud or silenced for cash without any checks or regulations at all?
But that's how loosely cars are regulated, and you said you wanted guns regulated like cars. I think what you meant to say is to regulate guns much more strictly than cars.
Amusingly... it was likely their opposition. This is why you should hold yourself and those you align with to high standards. It only strengthens your position... even if you are completely wrong (although only a little bit)
Paris was the mass shooting I was referencing. 184 dead, iirc.
Norway, school summer camp in 2011. I suppose it’s not exactly what you think of when you think of a shooting inside a school building, but the people killed were almost exclusively children. The perpetrator used a rifle that was not legal for civilians in Norway.
It wasn't a school summer camp - it was organized by a left-wing political party - and most of the people killed were adults.
He used a Ruger Mini-14 and a Glock pistol for which he had a legal license as a civilian (for hunting and sport shooting respectively), purchased in Norway - a country with one of the highest legal gun ownership rates in Europe - after he failed to illegally acquire guns in Prague.
> He used a Ruger Mini-14 and a Glock pistol for which he had a legal license as a civilian
You have a source for that?
I understood neither of those guns are to be civilian legal.
Also... to the topic. If Norway’s extremely strict (draconian) gun laws didn’t prevent this shooting, which was worse than any in US history, why would the US benefit from them?
There's a weird thing that happens here with language: "school shooting" means multiple things. So, if someone is shot at a school at an after-hours drug deal gone awry, that counts just as much as Parkland, even though there is a very distinguishable difference between the two.
I think the only thing worse than lying about incidents to improve your statistics is ignoring reality while arguing for policies and regulations that further the problem instead of solving it.
I don’t think this should be downvoted. The facts are the facts. It’s also the case that even 1 shooting is a tragedy and should give us pause for how we can fix it (whether that is gun control or something else).
People tend to make optimistic predictions. A lot of predictions were correct, and many of them were right on subject, but predicted the positive instead of the negative turn of things. Interesting views regardless, but I can't help noticing that tech progress was nothing like 2000->2010.
I feel like enormous cultural and technological hardware advancements were made between 1990-2010 that affected people on a national or global level. The advances were revolutionary. I have a hard time associating the last decade the same way. To me, the 2010-2020 decade has been stagnant.
My overall sentiments, more or less:
1990->2000 we went from very few people owning personal computers to wider adoption of PCs and broadband was gaining traction (I personally had @Home cable internet in 1998).
2000->2010 mass adoption of cell phones, starting with the flip phones to pocket PCs to the first iPhone in 2007. Social media (MySpace, The Facebook). Peer to peer networking, piracy (The Pirate Bay). Laptops. The first tablets.
2010->2020 mild improvements on the stuff from the previous two decades
EDIT: I'll give the last decade Uber (although I've only used it once) and AirBnb since that was a huge shift as well, but my overall feeling remains the same.
I am surprised there is no mention of the smartphone, which is arguably the most important innovation in modern history.
In the 2010s, the smartphone (the iPhone 3GS at the time) went from niche to 4B users (iOS, Android and KaiOS) - that is nearly every person on earth above age 14 in developed countries. I don't think there has ever been a product or technology innovation this important that spread faster than the smartphone. And it changed everyone's life. The post mentions the Google, Facebook and Amazon empires, which all partly grew to this point because of smartphones. Technology companies together are now worth close to 10 trillion dollars. The whole manufacturing supply chain in Shenzhen exists and became huge because of the smartphone. It was the reason TSMC managed to catch up to Intel in both capacity and leading-edge node. It was the reason we went from 3G to 5G in a mere 10 years, because of all the investment that kept pouring in. It was the reason everyone went onto the Internet and why we have an Internet economy. It brought a handheld PC and the Internet to a much wider audience.
I would even argue it was the Smartphone innovation that saved us from the post 2008 Financial Crisis doom as it created so much wealth, innovation and opportunities.
Chile is much better when compared to Brazil in terms of equality. Brazil is ranked the 8th most unequal (tied with Botswana) whereas Chile holds the 26th position in the GINI index [1].
We can expect Chile to drop positions since its economy is going downhill, creating massive unemployment. Inequality will increase as people from the lower middle class move into poverty, as we've seen in Brazil since 2015.
Not much. The thing about inequality is that it makes the top more powerful politically. The amount of money held by the top 0.1 percent is irrelevant to moving the income/wealth needle for the rest of the population.
There is a reason those who don’t like inequality had to come up with new measurements to highlight it. There is little impact on the mean, median, and mode if the rich get richer.
How does your list not include climate change as a thing of current importance? It wasn't mentioned much in the original either - though I saw one comment calling it "FUD" - but they were making forecasts ;)
> Hopefully network analysis/data mining laws/politics will be handled with a little less FUD, a little less grandstanding, and a little more efficiency than we're seeing with climate change politics...
They were clearly describing the handling and discourse (I suppose on a large public scale) of climate change as FUD, not climate change itself.
> - As Moore's Law marches on, dynamic languages that are even slower than Ruby are likely to catch on. They may be to Ruby what Ruby is to Java, trading even more programmer time for CPU time.
Interesting how plausible this one is, yet turned out to be terribly wrong: the newer hyped languages that got some uptake were largely compiled ones like Swift, Rust, Kotlin, and Dart.
One thing that always really bugged me about the "the language will figure it out" philosophy is the complete disregard towards carbon footprint.
Like yeah, you can make an interpreted dynamic language that's pretty neat, but you can also make something like Go, Swift, or Julia that JITs and also captures 90% of that ease of use while significantly reducing your hosting costs/energy consumption.
Going forward I think static type inference will be the future of language design, either for safety as in Rust or for convenience as in Swift.
And we can already see pitfalls in the interpreted world of Python, where the wrong implementation, like using a loop instead of numpy, can lead to devastating performance impacts. Looking forward, something like this seems as outdated as having to manage your 640K of conventional memory in DOS: an unreasonable design constraint imposed by the limitations of the day.
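To make the loop-vs-numpy point concrete, here is a minimal sketch (plain CPython assumed; the array size and workload are invented for illustration) that times a hand-written loop against the vectorised equivalent. The exact gap depends on hardware, but it is typically one to two orders of magnitude.

    # Hedged sketch only: both versions compute the same sum of squares, but
    # the loop runs element-by-element in the interpreter while np.dot
    # dispatches once into compiled code.
    import time
    import numpy as np

    data = np.random.rand(1_000_000)

    start = time.perf_counter()
    total_loop = 0.0
    for x in data:                         # interpreted, one element at a time
        total_loop += x * x
    loop_time = time.perf_counter() - start

    start = time.perf_counter()
    total_np = float(np.dot(data, data))   # one call into compiled code
    np_time = time.perf_counter() - start

    assert np.isclose(total_loop, total_np)
    print(f"loop: {loop_time:.3f}s  numpy: {np_time:.5f}s")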
My prediction is 10 years from now we'll look at interpreters and language VMs as relics from a simpler time when clockrates were always increasing and energy was cheap.
I'm actually pretty amazed at what the Javascript VM people have done in the past decade. It's way more than I ever expected.
As far as the carbon footprint, well, yeah, it depends. At Google I remember a friend talking about how he cringed whenever he added more code to the pipeline that ingests the entire internet. He said that he wondered how much extra carbon was released into the atmosphere just because of his stupid code.
As for numpy, we are seeing that loops are stupid and Iverson and APL were right. :-)
> At Google I remember a friend talking about how he cringed whenever he added more code to the pipeline that ingests the entire internet. He said that he wondered how much extra carbon was released into the atmosphere just because of his stupid code.
I wish companies were under more financial pressure to actually track and mitigate this. (hint hint, carbon taxes)
>My prediction is 10 years from now we'll look at interpreters and language VMs as relics from a simpler time when clockrates were always increasing and energy was cheap.
Funnily enough someone said that exact same thing 10 years ago in that thread:
>* Functional programming / dynamic languages will go out of fashion. People still using them will be judged as incompetent programmers by the people who moved on to the new fashionable programming paradigm(s). At the same time, huge corporations will embrace functional programming / dynamic languages and third world universities will start focusing on them in their courses.
While pithy, that's not actually an interesting rebuttal.
You should elaborate, especially since "huge corporations", namely Apple, Google, and Mozilla (I guess?), are the ones pushing Swift, Go, and Rust, respectively.
Huge corporations always push for languages which are brain dead and make developers be as fungible as possible. Unfortunately not all software has been invented yet and writing in such a language is a nightmare.
So what happens is that people gravitate towards languages which are pleasant enough to work on to try new ideas in.
The progression of bash -> awk -> perl -> python happened for a reason. Hell, we're even seeing people use Lisp-like languages unironically for the first time in decades.
Enabling developers of different talents and backgrounds to be productive and contribute successfully, and to scale up in number of separate teams, is a hard problem and doesn't deserve the kind of dismissiveness you give it here.
More powerful languages enable talented developers to be more productive individually, but it's hard to teach all the other developers about the new and interesting abstractions that the powerful languages enable, and it's impossible to hire for them directly. This limits velocity.
And it's not just huge corporations, it's any company which is trying to scale from 10 to 100 developers; where you're hiring continuously, where there are too many other developers to efficiently rely on ambient learning.
Not so sure our energy problems are going to come to a head in the form of all-out energy poverty. More likely I think: the convenience of energy being instantly available at all the right times and places will diminish.
Cloud spot pricing is, I think, one example where things that can be batched and deferred will be cheap even in an energy-decline future.
This is assuming we don’t sink so far as to lose our productive capacity for energy infrastructure altogether. Depends how bull/bear you are about the whole thing I suppose. In that scenario, there’s also gonna be no market to sell whatever you’re coding to.
I think the new languages are about developer productivity, but in a different way. They allow you to make more robust software via a better type system, or more robust multithreading via Rust's borrow system. In the past, our machines couldn't handle such strict languages as well because of their high compile-time cost.
In overall total effort needed to make something robust, something like rust will beat something like ruby, because the dynamic language-ists compensate via a larger test suite.
Most setups and most hardware isn’t efficient enough for language to really matter. So the solution to carbon footprint isn’t to stop using python, it’s to stop burning coal.
The sad thing about Python is that it shouldn't need to be interpreted. People don't use it for that reason. Remove one or two obscure metaprogramming features that nobody uses (or shouldn't be using) and you could in principle remove the GIL and make it JIT just as well as JavaScript.
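To be concrete about the kind of dynamism at issue, here is a rough, CPython-specific sketch (all names invented for illustration) of two features a JIT cannot easily assume away: live classes can be rewritten and a callee can inspect its caller's frame, so cached method dispatch and local-variable reasoning can be invalidated at any time.

    # Hedged illustration only; Greeter/peek_at_caller are made-up names.
    import sys

    class Greeter:
        def greet(self):
            return "hello"

    g = Greeter()
    print(g.greet())                      # "hello"

    # Monkey-patch the method on the live class; existing instances change too,
    # so any cached or inlined dispatch for g.greet() is now stale.
    Greeter.greet = lambda self: "goodbye"
    print(g.greet())                      # "goodbye"

    # Frame introspection: a callee can peek at its caller's local variables,
    # which defeats simple assumptions about who can see a local.
    def peek_at_caller():
        return sys._getframe(1).f_locals.get("secret")

    def caller():
        secret = 42
        return peek_at_caller()

    print(caller())                       # 42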
JS really shows what's possible in this field, with some benchmarks even outperforming C++. This space of expressive yet fast languages is a gap that remains to be filled and will probably be filled by 2030. Sadly it's a bit of a chicken-and-egg situation, because you don't only need good syntax to become popular, but also a healthy ecosystem around it. What we will see is probably not any new language but rather existing languages from both sides of the spectrum converging more towards the middle ground.
I mean, JavaScript and Python have definitely become much more popular for many more things than they were in 2010.
I doubt Swift replaced much besides other compiled languages, and Kotlin just compiles to the JVM anyway. Dart's VM idea was dropped, so its small usage is largely compiling to JS still.
I would say that the overall idea of performance being traded for programmer time is definitely happening despite the emergence of Rust.
> Dart's VM idea was dropped so its small usage is largely compiling to JS still.
Are you sure about this? I was under the impression that today's Flutter development is highly dependent on the niceties the Dart VM provides, and newer Dart releases improved upon them.
In terms of language design, it seems like swift and julia are the most forward looking.
I personally don't think swift will ever escape the Mac ecosystem, just like Objective-C never did, but something with the same DNA will.
In the same way that Objective-C and Ruby both implement the philosophy of Smalltalk, I think that philosophy has yet to be fleshed out in a simple syntax that is natively jit'ed/compiled.
> I would say that the overall idea of performance being traded for programmer time is definitely happening despite the emergence of Rust.
In some sense Rust is in that same vein - reducing certain errors and the need for tools like Coverity, which in turn makes programmers more efficient.
Electron isn’t just JavaScript, else I would have said JavaScript. Electron is front and center in the movement to shorten development time by consuming more PC resources at runtime. I feel that’s the spirit of the prediction.
I don't agree. I've implemented GUIs with raw Xlib, Tcl/Tk, Qt, GTK, MFC, SDL, XUL, and DHTML. Of all of these, DHTML is the most productive for me, and by a significant margin. The Tcl/Tk topic in Dercuano goes into those experiences in more detail, if you're interested.
Training for developers, presumably, especially web developers who want to build native apps. If retraining was free, Electron would have approximately no market.
ryacko is perfectly clearly stating that Electron reduces the amount of training that developers need. This is different from making them develop faster. It just makes them more replaceable.
I have no idea how to parse twobat's question. The "or" is especially confusing. I'm not surprised that ryacko is baffled by it.
Electron gives you a UI and some standard library stuff. That's pretty obviously a totally separate issue from training an algorithm... I think? Am I missing something?
I still think asking "what or who?" with no other elaboration is really confusing. It's such a vague question that you have to guess how to answer, and it's super easy to answer it in a way that doesn't satisfy what the asker actually meant to ask.
Because it's not Moore's Law that matters here, but rather the subset that is the performance of single cores. And that subset stalled out pretty badly.
We're adding more and more cores, but more cores don't help you write a program in easy mode. Easy mode doesn't multithread.
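As a toy illustration of "easy mode doesn't multithread" (CPython assumed; the workload and numbers are invented), the obvious loop below uses one core no matter how many the machine has, and recruiting the other cores means explicitly restructuring the program, here with multiprocessing.

    # Hedged sketch: the serial sum is the "easy mode" version; the Pool
    # version does the same work but requires splitting it across processes.
    from multiprocessing import Pool

    def count_primes(n):
        """CPU-bound toy workload: count primes below n by trial division."""
        count = 0
        for candidate in range(2, n):
            if all(candidate % d for d in range(2, int(candidate ** 0.5) + 1)):
                count += 1
        return count

    if __name__ == "__main__":
        chunks = [50_000] * 8

        # Easy mode: one core.
        serial = sum(count_primes(n) for n in chunks)

        # Multicore mode: same work, different program structure.
        with Pool() as pool:
            parallel = sum(pool.map(count_primes, chunks))

        assert serial == parallel
        print(serial)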
It's also been quite a decade of growth for old stodgy languages. In 2010, it seemed like C++, Java, and Javascript were never going to change, but they've all significantly cleaned up their acts.
I think the prediction still had a kernel of truth to it though - every edge that chip manufacturers and JIT developers can give us has been burned away on slower and slower client-side rendering frameworks.
And PHP. PHP 5.3 was the newest version in 2010, and as the years wore on many thought HipHop by Facebook would start to take over for serious PHP applications. PHP 7 completely changed the course of the PHP community and took the wind out of HHVM’s sails.
HipHop was a cross-compiler which facebook abandoned for HHVM, a rewrite of the PHP engine with faster internals. It always had compatibility issues, but the improved performance made it interesting (afaik nobody outside facebook really got that much into Hack the language). PHP7 was a rewrite of the internal data structures which put it on close enough performance footing. Because HHVM was never a drop-in replacement for PHP and moved further away over time the community ended up sticking with PHP7 and now pretty much only facebook uses HHVM, with the big players that moved to HHVM all transitioning back to PHP7.
Rust and Dart haven’t seen that much “uptake” by the broader market; they’re hyped by geeks but that’s about it. Swift is becoming popular because if you want to develop for iOS, you don’t have a choice. Kotlin is getting some uptake.
Rust has seen lots of quiet uptake. What you don't see is hiring for people for Rust because most organizations don't experience a whole-sale shift to Rust, and you just retrain people on the inside.
I'd wager there are lots of people like me - writing small tools at work in Rust, but the employer would never hire for Rust.
I think we'll see broader market adoption in the next 2-5yrs though, in terms of people actually hiring Rust developers. There's still a little maturing to do in a couple areas to clean up some papercuts.
Disclaimer: I really like Rust and am a little biased.
The not-hiring-but-using bit is based on not seeing Rust jobs despite companies actively using Rust, and many anecdotes of "I wasn't hired for Rust, but we use it" over on Reddit in r/rust when people ask how to get a Rust job.
I was thinking that exact same thing. Rails was hitting its peak in 2010, and the idea that programmer time is more valuable than CPU time suggested dynamic languages were the future.
I mean, that’s not entirely wrong — Besides JS, Python is arguably the most dominant language across all domains. Python is also faster than Ruby, though, and 3.x added static-like features to the language. It doesn’t seem like there’s much appetite for languages any slower than what we have now.
I wasn't trying to assert that ruby is faster, just trying to correct the notion that python is an inherently faster language than ruby.
If I look at a composite benchmark like debian's and don't see one leading by an order of magnitude, I file that away as "probably about the same speed, depends on your use case"
> - The next big thing will be something totally unknown and unpredictable now, as user-generated content and social networking were in 1999. However, when it does appear, various 'experts' on it will spring from nowhere to lecture us all about it. It will still be really cool, though.
It replaces trust with massive resource consumption and doesn't do a good job of being digital money (it is hard to use safely, it is hard to use anonymously).
It is cool because it really disintermediates powerful entities. Maybe the first iteration of digital money/assets is not that efficient, but that could also be argued for many other innovative technologies.
What is groundbreaking is the concept of P2P value exchange. And that's what bitcoin really is, still a young PoC that will surely evolve or get superseded by a 2.0 tech.
I suppose that partly depends on exactly how novel the blockchain part is, but broadly speaking the world of efforts to combine some of electronic money, cryptography, proof of work and decentralisation was not a new one in the late 2000s. For example Ecash had been hot once https://en.wikipedia.org/wiki/Ecash , while later on Hashcash got a fair bit of attention as a possible solution to the spam crisis. In 2007 (early 2010 is another story) it was definitely more of a deep cut than AI or cloud, though.
Definitely Bitcoin/blockchain. I wouldn't say it will actually disrupt traditional banking empires, but it was the next big thing, and 'experts' definitely came out of nowhere, and it's still pretty cool.
AI wasn't totally unknown: basically every coastal university during the 1970s and 1980s was filled with people working on AI, and it didn't die off, it just became embarrassing to work on.
Neither was user generated content (that's what the original web was) or social networking (irc, email, bulletin boards, etc). Making it useful and usable beyond an academic and techie niche was new.
> Neither was user generated content (that's what the original web was)
The fall and return of the cat picture still amuses me. In the early days of Web adoption when people would put up a personal home page just for the thrill of having one, stereotypically the only bits of content they would be able to come up with for it were a list of favourite bands and one or more scanned-in pictures of their cat. So for years and years afterwards cat pictures were a synecdoche, a punchline for jokes about the naïvety of the early Web. Then phone cameras and low-effort social media became big, and ...
And more, the AI hype-cycle was definitely on the upswing again already by the beginning of 2010. The Netflix Prize and the motor-vehicle DARPA Grand Challenges were all 2000s events, for instance.
Regarding deep fakes, I don't think you're wrong, just early. I think we are just getting a glimpse of what is unfortunately possible as this decade closes out; they could introduce some real chaos in this new decade.
Not at all unforeseen by the end of 2009 though. The GMail beta dates to 2004, while EC2 was announced by 2006, and that's aside from all older precedents or predictions. (That said, not everyone 'got' EC2 at first.)
In my experience (very large contact center multinational), what killed IE6 in the enterprise was PCI and other security compliance standards. Large companies do whatever it takes to stay compliant to a lot of these standards because having that "stamp of approval" is the only way to keep other equally large companies as clients.
> if there is a way to work Elon Musk in there somewhere that would be good, I have a feeling he's going to make some big waves in the next 10 years but I haven't a clue how.
The dynamic languages slower than ruby prediction didn't hold up either. The past ten years have just seen nibbling around the edges for dynamic languages, without any compelling ideas about how to make them fundamentally more expressive.
Doesn’t that describe the rise of JavaScript? I haven’t done or seen any specific benchmarks, but I gathered that JS was among the slowest of languages.
Like others have said, major JavaScript engines are much faster than Ruby implementations, owing to the huge effort Mozilla, Google, Apple, and Microsoft have put into those engines. In principle, I'm not sure if there are reasons that make one intrinsically harder to optimize.
But even if Javascript were slower, it wouldn't mean the prediction was right. Javascript didn't do something super amazing that let you write beautiful code (the way Rubyists feel about Ruby, or Python folks feel about Python). It just ran the web so we used it. Over time, it's morphed into a language that's about as expressive as any other good dynamic language, also with some warts. But it's not a leap ahead, which is what the original post predicted.
The only language I've felt that way about was Prolog. When it works right, it's amazing. Too bad I can rarely make that happen.
I am convinced there has to be a way to leverage the expressive power of a language like pure prolog without the strain of the 'hints' you need to give it when the depth-first algorithm can't solve the problem. My bet is that if the interpreter 'knows' enough algorithms to solve common calculations, it could leverage these to compute 90% of problems (without needing to have one algorithm for each problem). For the other 10% you would have to supply the interpreter with the necessary algorithms yourself.
JS is pretty much the fastest of the popular dynamic langs, if only because JS interpreters see highly-competitive investment from well-funded companies with armies of developers thanks to the web.
Chrome and V8 were released in 2008, and Node.js (which leverages V8) in 2009. V8 was already considered fast back then. Mozilla and Apple also focused on optimizing their JS stack in the following years.
Google’s marketing for Chrome and V8 was incredibly effective. We’ve basically forgotten that Mozilla, Adobe and Apple were pushing JS perf long before V8 was released. Memory usage of early V8 was atrocious and real world performance was barely improved over Safari and Firefox but Google dominated dev mindshare for a decade.
No, Mozilla and Apple didn't release their JIT engines until after Chrome (though I think at least Apple was working on theirs) and so when Chrome came out JS immediately got about 4x faster, which was a bigger improvement than had happened over the entire previous lifetime of JS. Popular web pages didn't immediately get 4x faster, because they had carefully avoided doing anything in JS if its performance mattered, but JS did. It's true that V8 used a ridiculous amount of RAM, and still does, but its existence made it possible to do things in JS that previously required native code.
As for Adobe, they never did much for JS; they tried to get people to use ActionScript instead, which was a statically-typed language with syntactic similarities to JS.
Apple built the Sunspider benchmarks in 2007, and along with Mozilla, they made dramatic real world performance improvements before Chrome was even publicly announced.
Macromedia (later Adobe) funded a whole bunch of JIT research as the Tamarin Tracing project, which was shared with Firefox as TraceMonkey. The NanoJIT assembly backend was actually shared code.
I guess you're right that TraceMonkey was announced before Chrome was released on 2008-09-02, one day after its inadvertent public announcement. TraceMonkey was announced, though incomplete, more than a week earlier, on 2008-08-23: https://brendaneich.com/2008/08/tracemonkey-javascript-light.... SFX, however, was announced later, on 2008-09-18.
I don't want to piss too much on SunSpider, but unsurprisingly V8 did better on V8’s benchmarks, while SFX and TM did better on SunSpider, once they were eventually released. That's because V8 was written to do well on its benchmarks, presumably, not because the benchmarks were rigged. Certainly they were in accordance with my experience at the time.
When Chrome was released, V8 was way ahead of the other browser JS engines at JS performance, partly because it had the first JS JIT. But the other browsers took a year or two to catch up, which they had time to do because it also took websites a few years to move most of their functionality into browser JS.
I don't know why you keep emphasizing this “real-world” thing. Are you saying you think nbody and fannkuch are especially realistic benchmarks?
You're right though that Tamarin was a JS engine, not just an AS engine. We regret the error.
The point is Mozilla, Apple and Adobe did enormous work on JS before Chrome was released.
nbody and fannkuch are meaningless for real-world performance in a client-side JS app. V8's benchmarks were widely considered to be unrepresentative of what browsers actually do. SunSpider and derivatives were the most realistic tests at the time, and Chrome wasn’t a dramatic improvement.
Chrome’s marketing was incredibly successful in getting more people to use a browser better than IE but it didn’t have a dramatic improvement over FF or Safari.
> The point is Mozilla, Apple and Adobe did enormous work on JS before Chrome was released.
That's true relative to the work they had done on, for example, Gopher support, but it's not true relative to the work they did after Chrome was released. As I said, JS performance from 1995 to 2008 hadn't increased by even a factor of 4; then Chrome was released in 2008 and it immediately increased by a factor of 4, more than it had increased in the entire previous 13-year history of JS. It's true that there existed other optimization efforts. But they weren't successful, probably because not nearly enough effort was devoted to them. Tamarin Tracing/TraceMonkey was eventually discarded and is not part of SpiderMonkey today, although a different JIT strategy is. (LuaJIT uses the tracing-JIT strategy very successfully, though.)
> nbody and fannkuch are meaningless for real-world performance in a client-side JS app. V8's benchmarks were widely considered to be unrepresentative of what browsers actually do. SunSpider and derivatives were the most realistic tests
I mentioned nbody and fannkuch because they are in SunSpider, so it seems that you are contradicting yourself in addition to, as demonstrated previously, mixing up the historical sequence of how things happened.
I had just written a parser generator and an experimental functional language that compiled to JS when Chrome came out, and the performance improvements I saw were in line with the Chrome benchmarks. My experience is not part of Google's marketing.
Safari perf had already improved more than 4x with the move from AST Interpretation to a register based bytecode VM with SquirrelFish. This happened before Chrome.
I’m working on a tracing JIT for Ruby at the moment partially inspired by LuaJIT. Fingers crossed it’ll be published research this year!
Right, but the SunSpider benchmark also tested strings, regex and date manipulation. V8's original benchmarks didn't. Performance on stuff like a parser generator was way up, but it didn't run jQuery-based Ajax sites any faster.
No, it really wasn't. You had to use it in really specific ways and it had all sorts of gotchas. One I vaguely remember is that if an array got too big performance would tank like crazy.
Did anybody predict the slow death of jQuery or the rise of giant monster JavaScript frameworks? 2010 is when jQuery was a cult of personality. It was everywhere and any criticism of it was quickly met with desperate mob violence.
Did anybody predict a typical corporate web page would require 3gb of JavaScript code from frameworks and thousands of foreign packages just to write a couple of instructions to screen?
Applications written in JavaScript tend to be hideously slow (and getting worse) but this is more of a community problem; the language itself is now rather fast by dynamic language standards.
JavaScript is now both faster and less productive than Ruby, so it doesn't match his prediction that a slower and more productive language will become popular.
Raku (formerly Perl 6) basically sits alone as the successor to the slower-and-more-expressive throne. It's a great language to study for the gee-whiz-they-thought-of-everything factor.
But, nobody is paying attention to it. Some of that is because of the Perl 6 baggage, but just as much is probably because this past decade was so heavily entrenched in proliferation of the web-cloud-mobile paradigm that new scripting systems weren't part of the hype cycle. If it didn't get the backing of the FAANGs, it didn't register.
I can imagine a day coming where scripting shines again, though. It might actually be closer than we think. There is always a need for glue code, and glue code benefits from being a kitchen sink of built-in functionality.
We've reached a point where package ecosystem eclipses the importance of language features (which, in brand new languages, typically only offer incremental improvements these days).
It's not good enough to build a slightly better language any more. People won't learn a new language without their favourite packages (or equivalents).
Sometimes fractured and all over the place is a sign of success. Python is used by a huge variety of different communities for a myriad of use cases, which is pulling it in different directions. That’s not a bad problem to have, as long as it’s managed well, and the core devs seem very tuned in to this issue.
Also the community migration to Python 3 is done now. Yes there are massive P2 code bases out there still, but for new projects P3 has been the clear choice for a long time now. It’s over.
> Now that Google is advertising Chrome on billboards here in the UK, all that can be safely predicted about the browser market is that it'll be extremely competitive.
It's rather interesting to see how the web went from IE dominating, to Chrome doing essentially the same. Looks like predicting such a growth was farfetched.
Funny to see that the prediction of IE sticking around was mostly right. I long for the day that IE11 will mostly be gone.
Internet Explorer 6 could have been the dominant browser essentially forever if Microsoft hadn’t zeroed its investment into it. Its incentives weren’t aligned with a robust web platform. Google’s incentives are definitely aligned for Chrome to dominate.
> - China will not become a democracy, or even make moves in that direction. However the rule of law will strengthen, and some civil liberties will increase.
The latter half of this one didn't come to be. Rule of law is weaker and civil liberties have declined.
Civil liberty and rule of law are mostly orthogonal. China has spent the decade making most court cases publicly accessible through the internet, making all corporations and their holding/debt status similarly accessible, and making a host of other governmental information available. While there is still a significant amount of corruption, law enforcement has been at its most transparent in history. Civil liberty is on a different track, however, with laws obstructing it being enforced.
Most court cases, yes; the ones that actually matter from a political and civil liberties perspective, not at all. China is ruled by Xi Jinping and the Communist Party, not the law.
“China Must Never Adopt Constitutionalism, Separation of Powers, or Judicial Independence“ - Xi Jinping
According to the World Bank's metric, China's percentile ranking on rule of law improved somewhat from 2010 to 2018 (eyeballing the chart, looks like it moved from 40th to 48th). This is of course not quite the same thing as an absolute increase in rule of law; if rule of law is worsening everywhere, then the ranking change could still be a decrease (though globally worse rule of law isn't my impression).
My gut response was that his thought of MS pivoting to consulting a la IBM was way off, but after thinking about it for a minute you could kind of make a case that Azure is a form of "self-serve operationalized consulting"; he was just off on its impact and success by a factor of 50 or 100.
There was one comment in the thread about what Steve Jobs will do at the end of the decade. As a ghost I suppose? (Another says he will step down due to his health, which is closer.)
My favorite is the joke that Zuck will buy Portugal.
You have to remember that in 2010, Windows was still recovering its reputation after Vista, Office was stale, and OneDrive didn’t exist. I don’t think Azure existed either, and if it did, it was more enterprise-focused.
Nadella’s excellent pivoting of MS into a cloud and services focused company saved Microsoft from a stagnant or declining state.
This is why I like to stick around HN and just read. The minds here seem to be much more brilliant compared to other parts of the web. That dude nailed it!
- As Moore's Law marches on, dynamic languages that are even slower than Ruby are likely to catch on. They may be to Ruby what Ruby is to Java, trading even more programmer time for CPU time.
The fact that in 2020 we still don't really have CPUs that are significantly faster than those released in the middle of the decade in terms of single threaded perf (which is still the paradigm that most programming languages today operate under for the most part) kind of put a nail in the coffin of that prediction.
I can totally picture new languages as described doing well today if we had exponentially more single threaded processing power/watt at our disposal than we actually do.
I think there is still opportunity for languages that offer huge leaps in expressiveness w.r.t handling concurrency even at the cost of raw single threaded performance. Though I would not label a language like that as "slower" as it'd allow us to actually make much better use of our computing resources than we reasonably could today without blowing through most of our complexity budget, resulting in faster programs in practice.
You could argue that it's JavaScript. JavaScript had terrible performance even in 2008, can't remember now what it was like by 2010, but I would guess V8 was still pretty lacklustre back then.
I read it such that if, for argument's sake, Ruby is 10 times slower than Java, people would start using a new language that is 10 times slower than Ruby. I'm not sure that ever described JavaScript, and it certainly didn't as the decade progressed, with the advancements that have been made in executing it.
I feel like that one simply missed the mark. If anything, there was greater emphasis towards languages that were more performant, even if less productive for banging out a working product.
It definitely did. IE6's JavaScript was super slow. You cannot even imagine how slow it was if you never worked in it.
Remember, IE6 was the dominant browser for years, so very slow JavaScript was the norm.
For example, around 2008 I made a client-side pivot table for a web app. I tried it in JavaScript first; even trivial tables would take 10 seconds despite all the optimization I tried (in IE6, IE7 and Firefox).
I rewrote it using XML + XSLT and it was instant. But it was truly gnarly code.
Maybe JavaScript libraries and frameworks? I haven't touched too many of the "batteries included" ones but I see them out of the corner of my eye at meetups and hackathons, they seem unwieldy enough to fit the bill.
If 6 out of 17 predictions correct with the only unambiguously correct predictions being trivial or vague is "nailing it," then I probably won't be taking stock advice from you any time soon. On the other hand, predicting the future is kind of hard. IsaacL made some really conservative predictions, sticking to things that seemed obvious. In fact, most of the wrong ones would be perfectly reasonable predictions for the next 10 years. It just goes to show how hard making predictions really is.
Below is how I calculated 6 out of 17 predictions correct
---
> Facebook will not be displaced by another social network. It will IPO some time in the next two years.
Correct
> Twitter will become profitable, but not as much as some expect. It will be less profitable than Facebook, and may sell to another company.
Correct (profitable since last year)
> Microsoft will .. have shrunk and may have evolved into a consultancy company on the lines of IBM
Wrong
> Internet Explorer will shrink, but won't go away
Correct? (debatable, since software never completely goes away, but MS is no longer developing it and Edge doesn't use the same rendering engine)
> Chrome OS or a similar operating system that relies on web access may grow extremely slowly at first, before rapidly gaining share amongst certain market segments. It will be most successful in places like cities that grant free municipal wifi access.
Wrong
> Mobile phones won't replace computers, but increasing penetration amongst the poorest in developing countries, and increasingly capable handsets in developed countries (and developing countries) will make them a colossal juggernaut. Many of the really big changes, especially social changes, will be caused by mobiles.
This was already true in 2010, so it doesn't even count.
> For any definition of 'success', there will be more tech startups reaching that level in the 2010s than in the 2000s. For example, there will be more than four startups of Youtube/Facebook/Twitter/Zynga proportions.
Wrong.
> In addition, at least one of the 'big' startups of the second half of the decade will have been possible with 2009 technology. By this I mean that people will still be discovering new potential for browser-based web applications built with current client-side technologies, which will remain ubiquitous, although new alternatives will appear.
Correct, but the prediction that there will be at least one new web-based startup is not very interesting.
> It will be an even better time to start a startup in 2020 than it is now. One of the key drivers of ease-of-starting-up-ness will not be new technology, but new platforms - like Facebook and viral marketing, but better; or that solve other problems like micropayments, customer development, retention, and so on.
Wrong.
> Hence, starting up will become a more attractive career option, though well-meaning family will still say "at least finish your degree first".
Wrong.
> As Moore's Law marches on, dynamic languages that are even slower than Ruby are likely to catch on. They may be to Ruby what Ruby is to Java, trading even more programmer time for CPU time.
Wrong.
> Having said that, Moore's law will at least hiccup and may stop altogether in the middle of the decade, as semiconductor feature widths drop below 11nm. Since this will likely encourage investment in quantum computing and nanotechnology, by 2020 we might be seeing something faster than Moore's Law.
Wrong, IMO (transistors are still on the curve, but the performance impact of adding more transistors doesn't matter the same way it used to; but he cited Moore's Law specifically, so he's wrong).
> An international deal, of the kind that was aimed for at Copenhagen, will be reached over the next five years, though it might not be far-reaching enough to limit warming to 2 degrees in the long-term. (Despite the failure of the Copenhagen talks, it appears that world leaders almost universally recognize the need to take action over man-made climate change, though the various political problems will remain hard problems). China may not be part of such a deal, though the US likely will. Environmental disasters will begin to increase through the decade, as will disasters that are probably not caused by anthropogenic global warming but will be blamed by it anyway; this will provoke more of a push for action.
Wrong.
> Increasing fuel prices, and green taxes or incentives, will mean large shops will begin to be replaced by warehouses, as traditional retail gives way to home delivery.
Wrong, but only because he cited a reason. Otherwise, the prediction that brick-and-mortar stores will continue to be replaced would have been correct.
> China will not become a democracy, or even make moves in that direction. However the rule of law will strengthen, and some civil liberties will increase. Internet crackdowns will continue, and may increase in severity, and will still be rationalized by porn.
I want to say the civil liberties situation in China is the same, but I don't know enough about China to comment on it.
> Despite multiple new fads that purport to make software development ten times faster and error-free, it will remain a hard problem.
Correct, but trivial. Of course writing better software is hard.
> You still won't be able to talk to your fridge, and gesture-based HCI will remain a fun gimmick.
I'm counting this as wrong because of Alexa.
> Virtual worlds like Second Life will remain niche, but World of Warcraft will pass 20 million users and a Facebook game or similar will pass 200 million users.
Wrong.
> The next big thing will be something totally unknown and unpredictable now, as user-generated content and social networking were in 1999. However, when it does appear, various 'experts' on it will spring from nowhere to lecture us all about it. It will still be really cool, though.
Wrong, but maybe I'm forgetting about some big tech thing in the 2010s. It seems that the biggest disrupters were a modernization of cab companies and hotel bookings. We may be on the dawn of several actually unpredicted new things, like driverless cars and the use of neural networks for some kind of cool new applications, but there's been no killer app. These are hopefuls on the horizon, not profitable products and industries.
> (ChromeOS will grow slowly, then rapidly gaining share amongst certain market segments) Wrong.
I don't know about that. ChromeOS did grow slowly, and has gained significant share of the entire US education space (K-12). Yeah, it's a niche market. But I think that fits the "certain market segments" qualifier.
> (Virtual worlds will remain niche, but WoW will pass 20 million users, a Facebook-like game or similar will pass 200 million users) Wrong.
This is technically wrong, but in-spirit correct. Virtual worlds / MMO games did get the traction claimed, just not WoW specifically. (The MMORPG FF14 Online has ~20 million users, MMO Warframe has 50 million registered users, 'facebook-game' Farmville has 73 million users, and 'or similar game' Fortnite has 250 million registered users). Minecraft holds similar numbers.
And while WoW itself never quite hit those numbers, a different game from the same studio did. Blizzard's Hearthstone has over 100 million registered users today.
You can still take the 2010 approach and just run everything on Rails and Postgres. It’ll be a long time before any startup outgrows that stack, and by the time they do they’ll have the ability to rewrite like Twitter did.
Anyone who starts off with anything more than that (or Django or your favorite language’s equivalent) is just wasting time and effort trying to be trendy. Over-engineering a startup only serves to keep you from testing market fit, which is only a good plan if you’re trying to milk more investment money because you know you’re going to fail as soon as you launch.
That's a tech-only view. Starting up is harder because incumbents in innovative businesses have built huge moats, in pharma, chemistry, web, mobile, etc. Big businesses are more ready to react to disruptive newcomers by pricing them out, lobbying for new rules, buying them out, etc. Incumbents are also reaching a global scale that in itself generates such an advantage...
> Now you “need” half of npm, react, backend, devops etc to get started.
Says who? You can literally spin up nearly any server computing infrastructure imaginable in a matter of seconds. Just because software is getting bloated under the covers doesn't mean it's any harder.
>> Mobile phones won't replace computers, but increasing penetration amongst the poorest in developing countries, and increasingly capable handsets in developed countries (and developing countries) will make them a colossal juggernaut. Many of the really big changes, especially social changes, will be caused by mobiles.
> This was already true in 2010, so it doesn't even count.
This definitely wasn't true in 2010. On June 1, 2010, Steve Jobs proclaimed that the post-PC era had arrived, and was promptly ridiculed for it by the industry and media. Massive social changes caused by mobile phones have only occurred this decade, as the gig economy has exploded and places like India have 10x'd the number of citizens with internet access.
>> For any definition of 'success', there will be more tech startups reaching that level in the 2010s than in the 2000s. For example, there will be more than four startups of Youtube/Facebook/Twitter/Zynga proportions.
> Wrong.
Huh? Of those 4, Facebook in 2009 was the most valuable at $10 billion. Now there are 20+ private unicorns with that valuation, as well as dozens more that have IPOd in the last few years. How is that prediction wrong in any way, shape, or form?
>> Hence, starting up will become a more attractive career option, though well-meaning family will still say "at least finish your degree first".
> Wrong.
>> An international deal, of the kind that was aimed for at Copenhagen, will be reached over the next five years, though it might not be far-reaching enough to limit warming to 2 degrees in the long-term. (Despite the failure of the Copenhagen talks, it appears that world leaders almost universally recognize the need to take action over man-made climate change, though the various political problems will remain hard problems). China may not be part of such a deal, though the US likely will. Environmental disasters will begin to increase through the decade, as will disasters that are probably not caused by anthropogenic global warming but will be blamed by it anyway; this will provoke more of a push for action.
> Wrong.
Again, this is correct. The US did enter the Paris Agreement, and has not actually formally withdrawn yet as it is not legally eligible to do so until November 2020. "Environmental disasters will begin to increase through the decade, as will disasters that are probably not caused by anthropogenic global warming but will be blamed by it anyway; this will provoke more of a push for action." is an especially cogent prediction.
>> China will not become a democracy, or even make moves in that direction. However the rule of law will strengthen, and some civil liberties will increase. Internet crackdowns will continue, and may increase in severity, and will still be rationalized by porn.
> I want to say the civil liberties situation in China is the same, but I don't know enough about China to comment on it.
Civil liberties have likely gotten worse, but the rule of law has indeed strengthened. Internet crackdowns being rationalized by porn is also correct, as seen in the UK and elsewhere.
As far as ChromeOS and video games go, maxsilver covered those pretty well already.
I remember Iran had very large scale protests in 2009 leading into 2010, so that was probably on their mind, thinking it would lead to some kind of radical change.
Half of those predictions read like "the thing that is a trend now, will continue to be a trend".
I can be a prophet too - Google, Amazon, Facebook, Apple and Microsoft are not going to be displaced in the next decade.
> Half of those predictions read like "the thing that is a trend now, will continue to be a trend"
And yet, people regularly think "a decade is a long time, surely this trend will be finished by then". Recognizing that existing trends are still likely valid is worthwhile.
- Apple and Google are still the only two mobile operating systems that matter and they are still in relatively the same position. iOS still controls the high end where the money is and Android has the market share but the OEMs are not making any money.
- Facebook is more profitable and popular.
- Amazon is still the number one online retailer, the Kindle is still by far the most popular ereader but more importantly, the Kindle platform is still dominant.
- Google still hasn’t managed to diversify from its ad business and YouTube is still dominant for video.
- Netflix is still the dominant streaming platform.
I think Microsoft was the biggest surprise. In 2010 MacBooks were starting to become very popular, and it looked like Microsoft’s monopoly on consumer computing was coming to an end. It was common to think Microsoft was on a long decline in 2010.
It's interesting how none of the commenters envisioned Microsoft repositioning itself to adapt to technological and business trends. Everyone seems to have thought they would either fight them and lose, or contract to becoming an IBM-like consulting business.
Slack isn’t a “large player” they are nowhere near being profitable and are nowhere in the same league as Facebook, Amazon, Apple, Google or Microsoft.
The app ecosystem was growing like crazy in 2010. It was a year and a half into “there is an app for that”. Facebook had already pivoted to being big in mobile.
The same can be said about Uber. Once the companies have a profitable and sustainable business model that doesn’t involve burning cash then they can be considered a major player.
Regardless of profitability, I think people are spending a lot more money on the internet today than they were 10 years ago. Subscription services, the gig economy, and online retail have exploded in the last 10 years in a way that I don't think many people predicted.
Everyone saw online retail exploding by 2010. Amazon was already big and growing, Barnes and Noble was already struggling against Amazon and Apple/iTunes was the number one music retailer as bricks and mortar music retailers had started to close.
By 2010, streaming services were big, Apple was selling movies and TV shows digitally and had already started removing DVD drives from its computers.
Honestly it doesn’t matter. It came out in the Oracle trial that Google only made $23 billion in profit from Android from its inception to the start of the trial, and it still has to pay Apple a reported $8 billion a year to be the default search engine for iOS.
Apple has made a lot more in mobile from Google than Google has made from Android.
For a while, MS was making more in patent fees from Android OEMs than Google was making in licensing and advertising.
Losing mobile was the best thing that could happen to MS so it could focus on Azure and making Office ubiquitous- where the real money is.
> still has to pay Apple a reported $8 billion a year to be the default search engine for iOS.
That is not the way to think about it - Imagine if Android didn't exist or faltered like Symbian, RIM, Windows Mobile etc. Google would have been paying way more than 8 billion. Android is definitely worth more than the money accounted towards it.
Not really. The Android market is really not worth that much. Statistically it’s made up of consumers who aren’t willing to spend that much compared to iOS owners. The high end non Chinese Android market is minuscule. They just aren’t as attractive as iOS users to advertisers.
Thanks, the 2nd link seems to show that iPhone owners spend way more relative to the number of downloads here as well.
Personally, I have quite a few paid apps, but the majority I bought over the years and keep using them. The top grossing apps are all subscriptions and the only one that I have with one is OSMAnd (Open Street Maps app) which includes a donation to OSM/contributors.
> - Apple and Google are still the only two mobile operating systems that matter and they are still in relatively the same position. iOS still controls the high end where the money is and Android has the market share but the OEMs are not making any money.
You're badly misremembering or recontextualizing with hindsight bias. At the start of 2010, iOS had 32%, Android had 4.5%, and there were at least 5 other mobile OSs with >4% market share. Start of 2020 and we're looking at Android with 74%, iOS with 24%, and zero other OSs with >1% market share. These are very different situations!
By 2009, they had 12,000 movies up for streaming, and Netflix-compatible devices were advertised heavily in stores, with Best Buy including Netflix apps on their store-brand devices.
The big announcement when the iPad was introduced in March 2010 was that Netflix would have an app for it. It had been streaming for at least three or four years by then.
If you want to make a new prediction, a new thread has gotten started: https://news.ycombinator.com/item?id=21941278.