Hacker News
The next big thing will start out looking like a toy (2010) (cdixon.org)
191 points by feross 49 days ago | 198 comments

I think most participants in the Internet circa 1996 and personal computers in the early 1980s would understand this idea.

The Internet was being widely written off as a toy that many couldn't ever imagine having practical applications.

Case in point, this widely-cited piece from 1998:

> The growth of the Internet will slow drastically, as the flaw in "Metcalfe's law"--which states that the number of potential connections in a network is proportional to the square of the number of participants--becomes apparent: most people have nothing to say to each other! By 2005 or so, it will become clear that the Internet's impact on the economy has been no greater than the fax machine's.
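The "square of the number of participants" in Metcalfe's law is just the count of pairwise connections, n(n-1)/2. A quick Python sketch of how fast that grows (the numbers are simple combinatorics, not anything from the quoted piece):

```python
# Potential pairwise connections in a network of n participants --
# the quantity Metcalfe's law says value is proportional to.
def potential_connections(n):
    return n * (n - 1) // 2

for n in [10, 100, 1000, 10000]:
    print(f"{n:>6} participants -> {potential_connections(n):>12,} potential connections")
```

Krugman's counter-argument was that most of those potential connections are worthless ("most people have nothing to say to each other"), i.e. that realized value grows far slower than the quadratic count of possible links.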


Here are some fun things about this concept of toys turning into big things:

1. toys start out looking like toys, too

2. it's extremely difficult to distinguish the toys that will remain toys forever from those that will become "the next big thing."

A useful approach to the problem posed by (2) is to look at things that appear to be toys but refuse to die despite every indication that they should have done so long ago.

Toys that remain toys often have a trendiness and inelasticity to them that gives them naturally short lives. If you see a toy lingering long beyond the time it should have given up the ghost, you may be looking at the next big thing.

The next problem is that it can take a very long time for a toy to turn into the next big thing.

I think your last sentence is perhaps the most important - timing the "Toys to Big Thing" transition is really hard and leaves a lot of dead companies in its wake. For example, at some point in the future, VR and AR will probably be in this category, but who knows how long it will actually take (basically-modern VR has been around since the 80s!) or what will cause the shift. Meanwhile, VR has left a decades-long trail of dead plans/businesses and AR seems to be doing the same.

Augmented reality is a great example. Everyone knows that it will become a "Big Thing" in the future. Like why would smartphones be the final form factor of personal computing? Of course there will be something that will dethrone it.

But despite every big company throwing a couple of billion dollars at the problem over the last few years (Facebook's $2 billion acquisition of Oculus, Google's $500 million investment in Magic Leap, Apple's acquisition of Metaio/ARKit, Microsoft's HoloLens), all we have are literal toys!

So now you can't talk about AR as a serious subject anymore. People all dismiss it, saying we are still too early. I imagine it's like that for every "toy to big thing" transition.

I think there’s a very good chance the smartphone will be the dominant form factor essentially forever. Anything smaller won’t be good for presenting visual information, and when you don’t need it, it isn’t stuck to your face.

Alternatively taking a long view, how many times do you think the dominant form factor of personal computers (in the general sense, including smart phones) will change? Another 3 times, another 5? Every 20 years forever?

The only exception I can think of is a direct neural link, but that’s generations away and even then I’m not entirely convinced. Maybe in the very long term.

> You could use glasses, but not everyone wants to wear them.

What about contact lenses?

I don't see how that could physically work. Presenting a high-fidelity, bright, clear display has irreducible base power requirements, and I don't see any way to provision that much energy in that form factor. Signal reception would also be a problem, and then there's the issue of dealing with thermal losses. Thermodynamics is not our friend, especially for powered devices in physical contact with our eyeballs.

Contact lenses are a pain. I suspect it's going to be a smartphone-like form factor combined with a voice UI similar to the movie Her from 2013. Eventually the visuals on the phone screen will be replaced with a neural link. It would make sense if the implant were just a dumb antenna, combined with a removable "earbud" that connects it to the computing device.

We have not yet conceived useful applications that actually require AR. It has to be something requiring immediate feedback - maybe driving a car or piloting a similar vehicle would be such an application. Maybe the stock market. Or something entirely unforeseen, like data analysis or warehousing - using human intelligence better?

Voice is much too slow, as is listening - might as well use the current devices at those speeds.

Computers started with bookkeeping and mass production control. It took quite some time for them to penetrate office market or to (partially) replace the previously used communication tools.

Yes because contacts are an absolute pleasure in comparison to glasses.

Are you being sarcastic? I've been wearing contacts and glasses for nearly 40 years. Contacts are an absolute pleasure compared to glasses. I only wear glasses when I absolutely have to (i.e., right after I take out my contacts until I go to sleep at night, and vice versa in the morning). For instance, I hate working out in glasses: they get all foggy and start slipping down my face from the sweat. Doing high-intensity sprints in glasses is the worst.

You're thinking very small-picture here. Contacts can be improved upon somewhat if we have the technology to make them into full AR screens.

As could glasses, starting with how they're kept on your head and the weight. Just look at how far VR helmets have come, or skiing goggles. There's more that can be done still, but not enough pressure to do it.

Yes, this is correct.

You can't readily "disengage" from contact lenses. Imagine getting hacked and somebody beaming shockporn onto your vision.

Remember the PC revolution.

Altair was a toy. Atari 2600 was a toy. TRS-80, Sinclair Spectrum, Apple II — all toys. Amiga was almost not a toy, and Apple Macintosh and IBM PC were mostly not toys anymore.

From there on, businesses started to use massive amounts of PCs, thanks to killer apps like word processors and spreadsheets.

Prices fell, home users took note, and a real explosion began.

It took a mere 20 years, say, from 1980 to 2000.

In those 20 years, hardware and software grew exponentially in capability and dropped dramatically in cost. Computers turned into something you'd want in your pocket once they could be unobtrusive enough to be tolerable with you all day.

On that note, who knows what AR would require in order to take off. It probably needs to deliver 80% of the killer features without requiring glasses. It also needs to support whatever frames glasses wearers want to wear. It might require actual visual implants, as I don't think there's a way to turn a contact lens into a display.

I've read about contact lenses that used super-tiny LEDs to produce a focused overlay image in the eye.

It of course had a large external induction coil to power it, and very low resolution.

I, for one, do not know that AR will become a "Big Thing". It is fine for some applications, but not enough to be Big.

Ehhhhh, I would disagree with your statement that "basically-modern VR has been around since the 80s." This phase of VR really began with the release of the Oculus Rift DK1. Sure, there had been earlier prototypes floating around, but they weren't taken seriously.

The game changer between the 80's VR and 10's VR is basically the smartphone. In the old days, folks were hooking up big bulky CRTs to their face. Now we have cheap flat panel displays as a commodity.

But there are still more technical problems to solve before the killer application and corresponding headset will be cheap, useful, and lightweight enough for mass adoption.

By this test, is Lego the next big thing?

The real test, I think, is something that has genuinely useful applications to anyone with a little bit of vision.

Big entrenched players are unable to see the next big thing, because they see the world through the glasses of their successful business. Microsoft, for example, almost missed the Internet, suddenly realizing it in 1995. I believe Bill Gates was quoted as saying it was just a fad. The funny thing is that more than a decade earlier (around 1980) I heard the same thing about "personal computers" or "microcomputers." Just a fad. A toy. They'd never compete with IBM's "real" computers, etc.

Similarly, these newfangled "automobile" thingies will never become popular. They are noisy. Smelly. Unreliable. Difficult to start. You can even break your arm if it kicks back while you are cranking it. And worst of all (gasp!) they frighten the horses! That's why these automobile thingies are just rich kids' toys. A fad.

I guess they couldn't see that all of those imperfections I listed could be fixed.

Back in the 1990s I saw an interesting TV program (sorry, I can't remember which) about personal computers. The person being interviewed said the entrenched players NEVER see the next big thing coming. So they hire someone smart who will see it coming and tell them when it's coming. And then, when they are told "this is the NEXT BIG THING!", they never believe it.

Speaking of Microsoft almost missing the Internet: they did entirely miss the new generation of smartphones that started with the iPhone. Ballmer laughed at the iPhone. By the time they woke up to the new reality, it was too late; iPhone and Android already dominated. Smartphones sold in unit volumes that dwarfed PC unit volumes. Windows Phone 7, followed by an incompatible Windows Phone 8, didn't stand a chance -- even with nice Nokia hardware.

I remember building dial-up Internet providers in the late nineties (97/98) here in Brazil as a teen, and some consulting folks trying to sell us Solaris licenses laughed at us because we were using Linux, 'that toy OS'.

It was fairly easy to make a stand for Linux against the Microsoft fanboys and NT 4, but it was much more difficult to make a good case for Linux against Solaris back then. (Remember, Linux only started to be taken seriously around the 2.2 and 2.4 kernel era.)

I also got the same dismissive snark when installing and programming in PHP back then to make dynamic websites, in a world dominated by ASP and ColdFusion. (The only other alternative on Unix was Perl; I thank PHP for sparing me that trouble.)

It's a reminder for me that the most common trend in the IT sector is folks with a "colonized" sort of mentality, who fail to recognize the things that will turn into real things in the future.

It's sad when you think that it's even worse than that, given that most people just show off and wait for their next paycheck. They don't even bother thinking about these things.

So the psychological profile of who is saying what is kind of a very big deal, not just what they are saying.

They didn't miss it, they saw it as a threat to be stomped.

They almost succeeded, but Google would've canned their game eventually. Believe it or not, "free stuff" is popular.

The idea that you speak of, about big entrenched companies not being able to see new things coming - it's the central theme of The Innovator's Dilemma. I highly recommend that book if you wish to read further about this.

I think you are replying to someone who has worked for G for a long time and is using their known alt account (i.e., not hiding anything). He probably read that book when he was ~25 yrs. old.

> Smart phones sold in unit volumes that dwarfed PC unit volumes.

Popcorn sold in unit volumes even higher than that, but so what? These aren't comparable things.

I think this is a legitimate question. There are two main things smartphones and PCs have in common, but both are arguable:

1. They do the same things. Both are general purpose computers. For some use cases, they both work fine.

2. They're about the same cost: ~$800 or so. But that's comparing flagship smartphone costs with entry-level desktop PC costs.

I suppose from the perspective of a large company, a smartphone sale may seem more or less valuable than a desktop PC sale depending on a variety of factors. So I think it's safe to say they are similarly comparable in that sense.

Literally, no. Conceptually, maybe.

Lego represents the construction of objects from catalogs of standard components. That’s in comparison to carving unique, one off toys from raw wood.

Professional fabrication has followed in the footsteps of LEGO (and Lincoln Logs, Erector Set). If you use SketchUp to design a building you’ll notice it’s encouraged to use standard building components that can be picked from a library. Lots of those components can then be picked up at Home Depot. Ikea furniture kits are also similar to Lego (or at least the broader category of toy).

LEGO’s Mindstorms GUI programming model is very forward looking. LabVIEW and Simulink are picking up on those concepts & professional engineers are definitely using them.

If Lego blocks were bigger and all building codes and regulations were removed and zoning could be eliminated, then yes absolutely legos would be the next big thing that could upend the entire multi trillion dollar housing market. Imagine the enormous societal wealth creation, if anyone could build a house with the same ease (and lack of regulations) of building a website.

of course, these hypothetical legos would need plumbing and electrical infrastructure and maybe a few other things ;)

Do those need siding or is that an acceptable external wall? Would be nice if these structures were widely allowed, but I have a feeling that it would conflict with many building codes. That sawdust wall filler also looks like a fire hazard. Look like it'd be good fun building though!

Already was - spawned games, movies, theme-park rides, etc.

Lego has an annual revenue in the billions and has recently grown bigger than Mattel or Hasbro. So, yes

Maybe? The definitions evolve and stretch. What if Legos were made of smarter material or had more powerful components? 3D printing might actually be the right way to think about small-scale at-home manufacturing rather than per-brick components - but even then, plenty of components (such as compute-related ones) require different manufacturing methods - so standards based on something like Lego might allow for interoperability. It's a toy ... until it is not.

Yes; by 2036, 78% of the earth's surface will be composed of Mindstorms.

if by Lego you mean Minecraft, then yes

I was certainly a "participant in the Internet circa 1996" (and before) and besides the inevitable quote from Krugman pretending not to understand what is happening here [Network effects don't care that "most people have nothing to say to each other" and we should laugh at a supposed "economist" who didn't understand that in 1998] I don't see a connection to the Internet as a "toy".

I guess there are two and a half possible mistakes here, which I'll address separately.

1. The Network is a toy. No. At least as far back as the Treaty of Bern (in 1874) the importance of the global Network is so obvious that national governments give up a little sovereignty to have it work better. Communication makes the difference almost everywhere almost all the time. Bern of course is about written letters, but the Telephone follows the exact same course, and so if you accept the Internet as a natural successor all the same things apply.

2. Maybe the Internet is not the Network? This argument looks completely reasonable in 1986 when IP networks are mostly an American phenomenon and there is a lot of X.25 out in the rest of the world. Maybe the Network will be X.25. By 1996 that's a distant memory, everything is IP. It's a done deal, the Internet is the Network.

2.5. Mistaking the Web for the Internet. The Web initially looks like a toy. HTTP/0.9 era Web has lots of toy-like qualities. Academics have been designing powerful hypermedia technology for many years by 1990, surely this half-arsed tech demo from a CERN guy who isn't even a Computer Scientist is a flash in the pan? You can make a better argument that the Web is still a toy in 1996 than the Internet, I don't think it's a good argument but it can be made -- however the Web is just an application. A competitor hypermedia solution that crushes Tim's Web but uses IP is still the Internet.

It might not have been a toy in the Fisher-Price sense of the word, but in 1998 it was only a couple of years into the mainstream, and had a long history of being the preserve of geeks and hobbyists. Big business was only just starting to take it seriously, and even then, not particularly so.

> By 2005 or so, it will become clear that the Internet's impact on the economy has been no greater than the fax machine's.

I wonder how big the fax machine's impact on the economy actually was; I feel like it probably was quite large. Not nearly as large as the internet, of course.

This makes me wonder, though.

I wonder if you went back to the very earliest days of the WWW, did everyone see it as a toy? Or were there already people (those developing it, perhaps) who saw that it had awesome non-toy potential?

I suspect people often brand new tech as "toys" when it isn't, and I bet there are also developers who take their toys and call them products :)

Maybe the solution is to completely discard the "toy" / "non-toy" dichotomy. It seems like it just muddies the discussion when the important thing is to evaluate the actual merits of the product/technology in question.

> I wonder if you went back to the very earliest days of the WWW, did everyone see it as a toy?

The WWW that was created to share scientific papers? Or you mean the GP's Internet that was created to organize the US troops on the event of a nuclear strike?

I guess so; there were plenty of people aware they were not toys. Most of those people used them as toys most of the time, like everybody else back then, but the "toy" classification came mostly from uninformed people and trolls.

Just as a side note: the Internet was not actually created to organize US troops in the event of a nuclear strike. The Internet, the one initially birthed by ARPA, had academic and civilian aims from the start. That said, some of the early research on networking was developed to aid the military in case of nuclear war. In particular, Paul Baran invented packet switching explicitly to help the US Air Force maintain its defense radar network in case of war.

That was in fact the impetus:

While working at RAND on a scheme for U.S. telecommunications infrastructure to survive a “first strike,” Paul Baran conceived of the Internet and digital packet switching, the Internet's underlying data communications technology. His concepts are still employed today; just the terms are different. His seminal work first appeared in a series of RAND studies published between 1960 and 1962 and then finally in the tome “On Distributed Communications,” published in 1964.


Baran's original packet (block) switched networks development papers, and in fact his entire catalogue, are available at RAND:

"On Distributed Communications I. Introduction to Distributed Communications Networks"


(This and the others are listed at the first link.)

>The WWW that was created to share scientific papers?

Most citizens of the Western world (even those with a degree) definitely pigeonhole expensive university projects as "toys," because they don't have one and don't see a need for one.

Academia is a weird place and is often underestimated.

I meant the WWW. Thanks for checking!

Random, single data point 80's person here.

Compuserve, AOL and newsgroups were already big when I discovered the www - Compuserve and newsgroups seemed vastly superior to me at the time.

Netscape Navigator seemed so basic, and how would anyone know which URLs to use (discoverability)? I chuckled, wrote it off, and stuck with what I knew.

Well, I was just a little bit wrong :)

I was in college in the mid-90s. I don't know anyone who saw it as a toy. We'd already been using Pine and BBS for many years prior and this seemed like the next iteration of that. The first time I saw the WWW was when I was in a computer lab and someone was on ESPN (or some other sport's website) on the computer next to me. I leaned over and asked them what they were looking at. They showed me what a browser is and how you go to web pages. I tried it out and was hooked. I immediately started to try to make my own webpages. Of course, back then webpages were ugly, but they were infinitely easier to build than these days. No CSS, no JavaScript, just plain HTML and nothing else. Anyway, getting off topic. The WWW was not seen as a toy by anyone that I knew of.


A wonderful 1 minute snippet of David Bowie explaining how huge of a deal it’s going to be to a skeptical interviewer in the 90s

What about the next big things that turn into toys? I'm thinking of the Segway. In the days around its unveiling, there were all sorts of people saying this was the next big thing and that it would revolutionize human transportation. They even said that cities would be rebuilt to accommodate them. These days I only see them being used as toys for tourists to ride around on during city tours.

I guess by a bit of a leap you could see it as a precursor to city-wide electric transport, such as electric scooters?

>The Internet was being widely written off as a toy

There was a definite split in opinion with some writing it off as a toy and some smart and informed people thinking it the next big thing (eg Gates https://www.wired.com/2010/05/0526bill-gates-internet-memo/)

>The next problem is that it can take a very long time for a toy to turn into the next big thing.

And humans are very impatient and are driven toward immediate gratification.

If you're hung up on the word "toy," another way of phrasing this is the next big thing will look like just a hobby to most people.

Looking forward, smart homes and 3D printing are 2 industries that fall in this bucket. Both feel like toys to the majority of the market right now, but have the potential to completely change how we live.

3D printing - low-cost manufacturing has completely revolutionized what's possible for startups, and developed a large hobbyist following, but we haven't scratched the surface of how they can change day-to-day life for everybody. Imagine what's possible once they support more material types (metal, electrical circuits, food, etc) and every home has one, like an oven.

Smart homes - we just have fancy remote controls so far, but eventually we'll be living like Tony Stark with homes that constantly adapt to us without input. This is where we'll see the benefit of ambient computing present itself clearest. Imagine a home that helps you raise your child, keep your chores in order, or handle the repetitive tasks without requiring a primary care-taker to be responsible for keeping the home in order.

disclaimer: I'm building Hiome (https://hiome.com) to turn smart homes into more than just toys, so maybe I'm a little biased.

I was working on software in the 3D printing space back in 2012, and I too thought it was going to be something big that looked like a toy.

However, I got out once I realized that print speed mattered a lot to consumers. The methods used in consumer 3D printers at the time were FDM (fused deposition modeling) and stereolithography, and both had asymptotic limits on how fast a print can go. For FDM, there's a head you need to shunt back and forth. For stereolithography, consumer printers like the Form 1 had a mechanical step between each layer.

Unlike chips with Moore's law or LEDs with Haitz's law, 3D printers aren't going to get much faster. That's when I got out.

So in order to get an order of magnitude faster print, someone needed to invent a completely different method--a step function.
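The asymptotic limit the parent describes can be put in rough numbers: an FDM print's time is basically the deposited volume divided by the volumetric rate of one moving nozzle. A back-of-envelope sketch in Python, where every figure (nozzle width, layer height, head speed) is an assumed round number for illustration, not the spec of any real printer:

```python
# Rough back-of-envelope for FDM print time: deposited volume divided by
# the volumetric rate of a single moving nozzle. All defaults are assumed
# round numbers for illustration only.
def fdm_print_hours(part_volume_cm3, nozzle_width_mm=0.4,
                    layer_height_mm=0.2, head_speed_mm_s=60):
    bead_area_mm2 = nozzle_width_mm * layer_height_mm   # extruded cross-section
    rate_mm3_s = bead_area_mm2 * head_speed_mm_s        # volumetric deposition rate
    return (part_volume_cm3 * 1000) / rate_mm3_s / 3600 # cm^3 -> mm^3, s -> h

# At these settings a 100 cm^3 part takes on the order of several hours.
print(f"{fdm_print_hours(100):.1f} h")
```

The only levers in that formula are bead size and head speed, and both are mechanically bounded, which is why an order-of-magnitude speedup needs a different method rather than incremental tuning.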

So I think it's not just that it's a hobby; you might need another criterion, like a plausible incremental path of improvement along the parameter the market cares about.

PDAs fit this, IMO. We had Newtons, Palm Pilots (and Sony Cliés), and Windows CE devices (iPAQ, Axim, ...), and all of them were for geeks/hobbyists only until the iPhone. Anyone carrying one was considered a geek out of the norm. Now you couldn't pry a smartphone from a non-geek's hands.

I had Palm Pilots, Psions, a Newton (wow, that thing was huge) and some Windows CE devices (all of them HP). I was considered a geek for that at first, sure, but I didn't have to lug around an agenda and notebook (the Psion especially replaced both in one device, which was nice to use). And when 2G became a thing, I had Windows CE HP "phones" (bulky, with keyboard, and very fragile), but on vacation I did all my company business on them (buying, selling, talking to developers, setting meetings, sending/receiving emails) while my colleagues had to go to internet cafés or lug around laptops. But yeah, they were things for geeks, mostly impractical unless you really wanted to use them (which I did; I hate carrying crap around). The iPhone changed that by making it all easy to use. Of course I bought one the moment it came out in my country (iPhone 3G) and never looked back. That one is also still working, by the way, not that I have any real use for it :)

I was a geeky programmer (still am), but I had absolutely no interest in owning a cell phone or any other PDA until the first iPhone came out. I was floored by it. The touch screen, the built in useable web browser, the apps, etc. were so much better than what had come before.

It's far too early for most consumers... despite what the promoters claimed.[1] Most of the same issues existed with paper printers in the 80's: they could be fiddly, slow, noisy, power-hungry, prone to jamming, and messy. If you had to print something much longer than a book report, you had to keep your eyes on it while it printed and cross your fingers. There was no Moore's law for paper printers, just years of incremental improvements in the respective technologies and mechanisms. In time, the problems will get solved, and then we'll see how big the consumer market for them is or isn't.

[1] Names like 'Replicator' conjured visions of Star Trek, when the most many printers could reliably produce was a pile of expensive plastic spaghetti whenever a print went wrong.

But for paper printers you had no alternatives... so you just focused and printed. You could not take your luggable[0] out on the train and read off the screen, or take it to the couch, or sit with it on your college bench.

For 3D printers, etc., we have alternatives that are fine; we don't actually need them for anything urgent.

[0] https://oldcomputers.net/pics/osborne1.jpg

Before smart homes are more than just a toy, they need to have functionality that is actually time saving.

I've messed with smart home stuff and so far the only thing that has been useful beyond being a toy is our robot vacuum, which I'm not sure is really classified as a smart home thing.

As someone that lives in California, something that could be an actual $$ and time saver rather than just a toy is if I could have a system that would open windows rather than turning on the AC when the outside temperature is low enough.

A simple example of useful stuff than can be done today but is annoying to set up is to have all the lights in a home turn off when everyone is gone, but switch back to their previous state when one or more people get home.

Another is to have blinds that open just before sunrise, so that you can sleep in darkness but wake up with the sun (especially useful if you're in a city). This is what I do myself, but it's a pain to change anything because the legacy motorized shades don't integrate with anything.
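The "lights off when everyone leaves, restore on return" automation described above can be sketched in a few lines. The `Light` class and method names here are hypothetical stand-ins, since real hubs (Home Assistant, SmartThings, etc.) each expose their own API; the point is just the snapshot-and-restore logic:

```python
# Sketch of "all lights off when everyone leaves, restore on return".
# Light is a hypothetical stand-in for a real smart-home device API.
class Light:
    def __init__(self, name, is_on=False, brightness=0):
        self.name, self.is_on, self.brightness = name, is_on, brightness

class PresenceAutomation:
    def __init__(self, lights):
        self.lights = lights
        self.saved = None  # snapshot of light states taken when the house empties

    def everyone_left(self):
        # Remember each light's state, then switch everything off.
        self.saved = {l.name: (l.is_on, l.brightness) for l in self.lights}
        for l in self.lights:
            l.is_on, l.brightness = False, 0

    def someone_home(self):
        # Restore the snapshot exactly as it was.
        if self.saved is None:
            return
        for l in self.lights:
            l.is_on, l.brightness = self.saved[l.name]
        self.saved = None
```

The annoying part in practice is not this logic but wiring the presence trigger (phone geofencing, door sensors) and devices like legacy motorized shades into one system that can call it.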

> which I'm not sure is really classified as a smart home thing.

This is amusing to me, I think as soon as something becomes somewhat common it will no longer be classified as "smart home". Sort of like how the definition of AI is a moving target.

It's definitely a smart device. But to me a smart home has a central system that can monitor a variety of inputs and, in response, control a set of things to accomplish some goal.

We have a robot vacuum and robot mopper. Me going over and pressing the robot vacuum's button, then, once it's done, placing the robot mopper in position and pressing its start button, is using smart devices but isn't a smart home. If I can tell a central system to "clean the kitchen" and it starts the vacuum, and when that's done dispatches the mopper, that's a smart home.

I know higher end iRobots can do this sort of thing in their dedicated app. And if you can control that via a smart home hub then that would be smart home functionality to me.
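The vacuum-then-mop coordination described above is essentially sequential dispatch by a hub. A minimal sketch, where `Device` and its `run()` interface are hypothetical rather than any real vendor's API:

```python
# Sketch of a hub routine that chains devices: vacuum first, then mop.
# Device and run() are hypothetical stand-ins, not a real robot API.
class Device:
    def __init__(self, name):
        self.name = name

    def run(self, room):
        # A real device would block here (or fire a callback) until finished.
        return f"{self.name} finished {room}"

def clean_room(room, devices):
    """Dispatch each device in order, waiting for one to finish before
    starting the next -- the coordination a hub adds over pressing each
    robot's start button yourself."""
    return [d.run(room) for d in devices]

log = clean_room("kitchen", [Device("vacuum"), Device("mop")])
```

The value of centralizing this is that, as a later comment notes, the same presence and status signals can then be shared with the thermostat, lights, and security system instead of each gadget inferring them alone.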

> it starts the vacuum, and when that's done dispatches the mopper

Great point -- outsourcing control leads to logic that can improve and be coordinated between unrelated systems.

> If I can tell a central system

Or better yet, not need to tell it, and have your kitchen cleaned while you're at work, possibly with a phone notification if you really want it. Probably just a passive history+status interface normally, though, once you trust it.

Obviously a vacuum cleaner could have heuristics to tell when you're out (clocks and radar etc) but that's just a dead-end solution when your thermostat and your lights and your curtains and your security system all want to know too.

A vacuum is really easy to use; what's a real pain is cleaning the vacuum cleaner. When are they going to automate that?

It might not sound like a lot, but it saves us 15-20 minutes of cleaning per day. We have 3 kids and get about 1-2 hours of free time per night. So it's 12-33% more time to spend doing whatever we want rather than cleaning the house.

Vacuuming every day strikes me as misguided, even if (especially if) you have allergies, because the residual dust never has a chance to settle. I'm speaking as someone who has issues with household dust much beyond most people, and who has found even a good air purifier to be counterproductive if used 100% of the time.

> they need to have functionality that is actually time saving.

Robot vacuums didn't achieve this for me, I'd have to spend more time removing "clutter" from the floor than I would vacuuming and half the time it'd get stuck on something anyway. Some of that clutter was me being messy, but some of it was things I liked to keep handy under the couch. As advanced as they are, they're still stupid machines that don't properly adapt to humans.

I 100% agree! No denying there's still a lot of work, and, like usual, the transformation from toy to mainstream will take time and significant R&D.

The thing is a hobby isn't too different from a business. They can even be hard to distinguish: https://www.irs.gov/newsroom/hobby-or-business-irs-offers-ti...

The next big thing will start out looking like a business! Wahey!

I think cpsmith said it well in another comment on this page that we're talking about technologies, not individual companies[0]. Would other people (mainstream users) say the tech is for hobbyists? The examples I cited (desktop 3D printers and smart homes) are definitely considered hobbies today, regardless of whether anyone would classify Hiome as a hobby.

[0]: https://news.ycombinator.com/item?id=21316776

This did not hold up well, imo. Not sure how to count it but by some lists this is stuff like: Tencent, Alibaba, Amazon, Netflix, Priceline, Baidu, Salesforce.com, JD.com.

Others would include Uber/Lyft, Airbnb, GitHub, ...

There are only a few that would qualify that I can think of - Instagram, Snapchat, and WhatsApp.

If folks here have other examples or thoughts on why this does or doesn't hold true would love to hear them.

I think the "Thing" in Next Big Thing as this article uses it is rarely a business, more often a technology. WhatsApp has never been considered a toy, but IM as a medium has—WhatsApp's genesis as a valuable business is itself that toy becoming a Big Thing. Netflix was never a toy; streaming video was a toy in 2005, Netflix turned it into a billion dollar business.

Edit: Technology, not product, and Netflix example

Could just be where we are in the technology cycle. Using Carlota Perez terminology, in 2010 we were in the midst of Synergy for web technology and Frenzy for mobile. Now both web & mobile are nearing Maturity and whatever the next big technology cycle is still in Irruption.

If you looked at PCs from 1993-2003 you would've had a similar view. PCs from 1983-1993 underwent dramatic progress: you went from 16-color TV outputs, 64K of RAM, 8-bit CPUs, floppies, command-line interfaces, and BASIC to 24-bit color, 3D computer graphics, GUIs, 16MB of RAM, 200+ MB hard disks, 32-bit CPUs, IDEs, desktop publishing, CD-ROMs, modems and Internet access, even speech recognition and text-to-speech on some Apple machines. From 1993-2003, you had incremental progress: Microsoft won, Windows 3.1 became Win95 and then eventually Win2k, CPUs got faster, RAM and disks expanded, broadband happened, but what we used the computer for didn't change much, except for the advent of the Internet. The Internet itself was supposed to revolutionize computing, but the dot-com bust happened in 2001 and in 2003 it was still pretty much a toy. And other much-hyped developments like WebTV, VR, voice recognition, and AI had fallen flat.

There are plenty of toys that are still in Irruption now. Cryptocurrency was supposed to change the world; the bubble burst in 2018, but maybe we'll see it come back in 2020 with DeFi the way the Internet did in 2005 with social media. Drones are literal toys right now. So are VR and AR. There's been a lot of progress in computing for kids with things like Scratch, Roblox, or Minecraft.

I can see current-gen VR and AR games leading to next-gen "real AR" glasses with an impact on far more than just gaming.

Most VR/AR development activity I've seen seems to be about (industrial) training.

Where it would be most definitely not a toy after a few iterations.

I see AR as having an enormous future impact, games only a small fraction of that. We jumped at the chance to escape reality into our phones, but AR is far more seductive. VR limits your mobility, but AR can be used every waking hour.

I see it starting with small tweaks to reality...a dingy concrete sidewalk replaced with a golden brick road. Empty walls in an apartment awash with art, scenic views, and/or entertainment.

I look forward to my golden brick road speckled with syringes and human feces.

It didn’t hold up because, while it does sound catchy and nice, the assertion is wrong.

And it’s wrong not because of the “toy” part but because of the “next big thing” part.

There is absolutely no guarantee of a ”next”, a ”big” or “a thing“ coming, regardless of origin.

A different future is not inevitable, at best it’s a toss up. This last decade went for inertia. Next one? Who knows...

Evan Spiegel said as much about Snapchat himself [0] (2014):

"When we first started working on Snapchat in 2011, it was just a toy. In many ways it still is – but to quote [Charles] Eames, 'Toys are not really as innocent as they look. Toys and games are preludes to serious ideas.' "

[0] https://www.snap.com/en-GB/news/post/2014-axs-partner-summit...

Cars started out as ridiculous luxury toys for rich people.

Sometimes you gotta wait a few decades to see where things go.

That doesn't appear to be true: Duryea and Benz vehicles (ICE vehicles burning petrol/gasoline in the late 1800s) appear to have replaced bicycles and horse-drawn carriages and served as functional means of transportation, rather than "toys".

Earlier electric and hydrogen powered vehicles I've seen appear similarly to have been created as functional replacements for horse-drawn vehicles.

Maybe you could expand your comment to demonstrate your point?

I don't know about horse-drawn carriages, but bicycles of the late 1800s were luxuries for, well, not rich people but certainly for the well-to-do. It's one of the reasons that child-sized highwheel bicycles are rare: they were ridiculously expensive, and most certainly not a child's toy. And they did arguably serve a utilitarian purpose; at least a bicycle doesn't eat if you don't ride it. But how were bicycles, specifically highwheelers, portrayed by the press of the day? As ridiculous toys for wealthy people. Source: grew up around antique bicycle collectors, and have an 1886 Columbia Standard myself.

I suspect, however, that an early ICE-powered automobile was nothing but a temperamental toy that the owner really, really wanted to be practical. I've hung around with enough people with Ford Model Ts (my parents also had a Ford Model A for a while) to suspect that something built twenty years prior to the Ford T had to be laughably unreliable. I'd only rely on a T to get me to work if my boss were pretty laid back. :-)

There were steam-powered cars in the late 1700s that were toys of a few wealthy patrons ...

Your first link is blocked for me by the website owner.

The second in turn links to https://www.cbc.ca/news/business/climate-change-will-push-ca... . When they say "cars are an expensive toy for the rich" they're saying figuratively that cars aren't reliable or efficient. I don't think anyone would disagree that [important, useful] new technologies tend to get cheaper and more efficient over time.

If this is the sort of backing for the "toy" claim, it's really not clear what it's hoped to prove.

The early users appeared to use cars for transportation, which the second link doesn't contradict, so not literal toys. The "toy" claim is just a pejorative that the link appears, implicitly, to say was used by those who were already profiting from horse-related industry.

From the first piece:

> The oft-printed statement that early automobiles were ‘playthings of the rich’ until Henry Ford’s super-cheap Model T started rolling off the assembly line in October 1908, is easily proven by scanning the makes of high-priced chariots and their wealthy owners who participated in the Amsterdam Evening Recorder’s Saturday, July 10, 1909 “Sociability Run” from Amsterdam to Lake Luzerne.

Everything I've ever seen indicates cars were considered to be "toys" for the rich when they first came out. Roads were not really designed for them. They were insanely expensive. They were not generally considered to be serious new tech that would eventually compete as transportation.

That changed when Henry Ford made cars affordable for the masses.

Old laws often said things like "You must have someone walk in front of your car ringing a bell so you don't spook the horses." This implicitly tells you that early cars were also extremely slow and horses were the form of serious transportation that the culture revolved around.

Even more so when post-WWII cheap petrol made cars affordable, and suburban living made them essential.

Half of all households (based on ~4 persons per household) didn't own an automobile until between 1945-1950. Ownership didn't cross the 10% threshold until the 1930s.



Internet Archive mirror, should be accessible.

For clarity, that's a mirror of the first link I posted.

Updated to note that.

"The early users appeared to use cars for transportation, which the second link doesn't contradict, so not literal toys."

I think when people say they were toys, they mean that in the same sense that a Spyker or Ariel Atom are "toys" today.

I think the "looking like a toy" terminology is confusing the discussion.

From the article:

> Disruptive technologies are dismissed as toys because when they are first launched they “undershoot” user needs.

So instead of asking if it "looks like a toy", what if we asked, which of those products started out by "undershooting" user's needs?

In that lens, I think you could make arguments for a few of those having started by "undershooting." The most obvious example to me is Amazon starting out as an online bookstore.

edit: I just want to add that from my vantage point this might be a true idea, but it gives very little actionable information. Perhaps Christensen's books give better insight on why this is something we should care about.

Scooters, Bitcoin, Tinder, Meditation Apps. Then literal toys like Fortnite, Pokemon Go.

Would you call Bitcoin a "toy"? If so, it really is just saying that it is likely to be something you are initially dismissive of.

World of Warcraft Gold & similar in-game currencies are toys.

Bitcoin is not a toy, but follows in the footsteps of those toys (in addition to traditional currencies).

Bitcoin started off as a toy, with 10,000 bitcoins once being exchanged for a pizza. Ethereum smart contracts, even ones dealing with crypto-asset exchange, are often treated as toys.

Uber/Lyft/AirBnb were considered toys when they began. It may be that nowadays the transition from toy to big company is faster.

I don't remember anyone saying Uber/Lyft/Airbnb were toys when they began. Some might have thought them unlikely to succeed, but that's not the same thing as being a toy.

I do not think we are referring to the literal meaning of "toy" here. IMO, "toy" here means something insignificant that only a few people/a small market uses.

Exactly. Those of you that are hung up on toy meaning a literal toy or video game with no intrinsic value can replace the word toy with "gimmick" or "novelty" or even "a serious idea that won't scale" (http://paulgraham.com/ds.html).

Early Air Bed and Breakfast charging people for couch surfing and selling Obama O's (https://www.airbnb.com/obamaos) to fund their startup definitely feels toy like to me.

I'd argue there are entire categories that are currently considered toys, but eventually with the right business model, tech improvements and cultural changes will make the serious billion dollar tech companies we have now look like toys.

Toy vs. tool means novelty vs. efficiency creator. A toy is fun to use; it exists to be used for pleasure. A tool is useful; it exists to make work easier. "Toy" should be used to mean a superfluous bonus, an unnecessary but enjoyable pastime.

> Disruptive technologies are dismissed as toys because when they are first launched they “undershoot” user needs. The first telephone could only carry voices a mile or two. The leading telco of the time, Western Union, passed on acquiring the phone because they didn’t see how it could possibly be useful to businesses and railroads – their primary customers. What they failed to anticipate was how rapidly telephone technology and infrastructure would improve (technology adoption is usually non-linear due to so-called complementary network effects). The same was true of how mainframe companies viewed the PC (microcomputer), and how modern telecom companies viewed Skype. (Christensen has many more examples in his books).

Skype feels like a bad example; it was always fairly obvious video chat would eventually be prevalent. It was never obvious that Skype was the best answer, and there were Skype alternatives with similar offerings which have failed.

Skype itself may not last in the long run if it can’t keep up with Zoom or whatnot.

Here’s a fuller account of the Western Union and Bell settlement on patent infringement-


I’m also generally skeptical of revisionist narratives. People often say no one saw X coming, no one saw apps on a phone or a screen with buttons etc etc. But it seems hundreds or thousands of writers, designers, futurists, failed technologists and so on did predict these things. The hard part is building them and then creating a system (eg financed hardware) to create widespread adoption.

> People often say no one saw X coming, no one saw apps on a phone or a screen with buttons etc etc. But it seems hundreds or thousands of writers, designers, futurists, failed technologists and so on did predict these things.

Often technologies are available long before they ever catch on with the mainstream. There were several PDAs sold in the 90s, and at least one had a built-in telephone. The Buick Reatta had a touch screen center stack in the late 1980s.

Technology adoption is weird. There needs to be the perfect storm of integration, marketing, and (for lack of a better term) zeitgeist to really make something mainstream. And even with all those things, it still takes YEARS to really get there. Think about how long techies were downloading TV shows/movies before streaming really took off -- Windows Media Center came out in 2002! That's a full five years before Netflix started its streaming service in 2007, which itself took several years to take off.

This went by on HN a week or so ago: https://www.gwern.net/Timing I tend to agree, the problem isn't seeing what's coming exactly (though that is non-trivial). The problem is the timing.

I have a question for those who were there at the very beginning of the internet. I heard that many were skeptical about both personal computers and the internet. Did it feel the same as it does right now with cryptocurrency/quantum computing? I wasn't there, but from my vantage point, it feels like people were able to use the internet right from the beginning, whereas there's a bigger leap to using crypto/quantum. Is it just that the infrastructure isn't in place to allow normal people to do easy things?

In the very beginning, it was restricted to a very, very small list of users in the military and colleges. At the time it seemed like a very useful tool and pretty fun to play with. The next phase was Mosaic and then Netscape creating the web, which also felt like a pretty neat and potentially useful toy. I remember passing around sheets of paper with URLs to type in, before there were search engines. And then it all exploded, everyone had to have it, and things changed a lot as the average user was both way less technical and way less well-behaved.

Given that, blockchain may be in a similar state to the early-middle Internet. It's out there, you can play with it, it's neat and seems useful even if for most people that practical purpose is elusive. Maybe someone has yet to create the Netscape that turns it into a must-have tech, but it might be happening as we speak.

(Edit: TCP/IP came out in 1983, and HTTP in 1990. In 1995, the year of the Netscape IPO and when everyone "knew" the web was a big deal, there were 40 million Internet users. Now there are about 32 million Bitcoin wallets. I think some of the other answers are remembering things a bit later in time, or more colored by hindsight.)

On the other hand, quantum is much farther out. The percentage of people who have used a quantum computer at all is minuscule. When it does become useful, it may be too expensive to see mass adoption for decades.

> I remember passing around sheets of paper with URLs to type in, before there were search engines

Wow, what a throwback in time! I had completely forgotten that. Thanks for the sweet memories!

Also, I agree with your take. One key difference re. cryptocurrencies might be legal context, however. The internet wasn't stepping on governments' toes, more like disrupting the telco sector (which was already focused on emerging mobile at the time, a decade prior to smartphones). I'm not so sure the road is wide open for alternative currencies, because central banks and, oh, stability of the economy, etc. It's a much tougher sell.

Blockchain itself, however (the tech), may have killer apps and uses — either as an evolution of currencies, or decoupled entirely from 'currency' (trustless, secure distributed databases may fit many needs IMHO, notably for social purposes like contracts or unhackable communication).

I was there in early 1990s, and it was completely different.

The immediate benefits of the Internet were obvious, and much more interesting than any theoretical "take over the world" growth story. People were willing to tackle slow connections, bad UIs and device configuration hell so that they could get chatting on IRC or posting on Usenet or downloading files from weird/shady FTP servers.

With cryptocurrency, there's nothing new or interesting you can actually do, no meaningful human connections to be made. It's just speculating on a made-up growth story, like penny stocks you can trade 24/7 — but we already had penny stocks and boiler room scams.

>With cryptocurrency, there's nothing new or interesting you can actually do, no meaningful human connections to be made. It's just speculating on a made-up growth story, like penny stocks you can trade 24/7 — but we already had penny stocks and boiler room scams.

You really should look into that tech (blockchain, DLT) a bit deeper. This sounds exactly like the "It's just a toy, it's never gonna change the world" that people said about literally everything that did change the world. Yet that toy has been here for 10+ years and it's not going away anytime soon. (Not talking about Bitcoin, which could still end any day.)

The speculation/trading/scams etc. are the "kids" playing with the toy. One day value will move like data, and people will question why it took so long to implement this. Exchanging data for value will be normal, and young people will ask how it ever could have worked without it. Kids will ask if it's true that back in the day, people who inserted data into the internet had to pay for it and not the people who consumed the data. The internet really works backwards because we do not yet have the "internet of value". The whole internet runs on ads instead.

As always when it comes to this topic, I suggest David Schwartz's talk at We Are Developers World Congress Berlin: https://youtu.be/Lxqz_NlX3GI

> Exchanging data for value will be normal and young people will ask how it ever could work without that.

How is this different from using a credit card to e.g. purchase a streaming movie? Is it just more seamless? Or does the fact that there isn't a centralized bank behind the transaction somehow significant for the consumer or business?

(I'm really just trying to understand this btw)

> Kids will ask if it's true that back in the day, people who inserted data into the internet had to pay for it and not the people who consumed the data.

If I understand correctly, the current situation is that there are free services where you can upload/insert data, free services where you can download/consume—and there are paid services for both too.

Could you elaborate on what the difference would be there? Anyone who creates/hosts content does so for free, and the people who consume it always pay?

In short, credit cards do not move value at all. You authorize payment instructions, which in turn leads to balances being adjusted. It's basically a debt transaction instead of a value transaction. It has several advantages, like that banks can accumulate balance adjustments with many parties over a period of time, then calculate the effective difference (a lot will cancel each other out) and only settle the real differences. There is more, of course, but that's the basics.

For the consumer, streaming value doesn't need any kind of subscription. It doesn't even require a contract. You do not need to stream from Netflix just because that's what you already have and pay for monthly. You pay whoever gives you what you want. It's kind of like being able to use cash without the obvious downsides of physically moving coins, and with the ability to split it into whatever fraction you'd like. Please watch the YouTube video; he explains this way better than I could. It goes down to the level of IP data packets. Something similar could be made (is being made) with value.

>If I understand correctly, the current situation is that there are free services where you can upload/insert data, free services where you can download/consume—and there are paid services for both too.

Most of the content people access on the internet is "free", as in no need to pay for it beyond paying for the internet access. The same is also true for "inserting", as in uploading it somewhere. However, physically inserting it (hosting) isn't free; even if you find a free host, it's just paid for by someone else. Hence hosting (inserting) costs money while pulling is mostly free. Every other market works exactly the opposite way. You can't get an apple for free at any store. There is no way someone else is gonna pay for the apple for you just because you looked at some ads.

If the internet worked more like a store, some people would insert data and other people would access/download it, and they would pay for that directly, not let someone else pay in exchange for horrible user experiences, horrible privacy, huge additional traffic waste, etc. All of that could really only become successful because data started moving freely over the internet and grew exponentially, while moving value has basically not changed at all.

Why do I care whether it's a "debt transaction" or a "value transaction?"

If you're on the receiving end of that transaction, you probably prefer the value rather than the debt. Value is free to move; debt is an agreement between two parties. If you get paid with debt, it means you get nothing, but someone owes you something now. This someone is usually your bank. You can cash out that debt at any time, and that gives the feeling that there is value stored at the bank. But that's not really the case; everyone knows it would not work if a lot of people wanted their debt paid back.

Instead of cashing the debt out, you can just send that debt to someone else. This is called a transaction, but as you already know, there is nothing to move; it just adjusts balances of debt. You tell the bank that a part of what they owe you should now be owed to someone else. If that someone uses the same bank, then this is super simple. If he is from another bank, the two banks must work together to update the balances across two systems. If the two banks do not have any relationship with each other, then they need to find correspondent banks and create a chain of trusted relationships to move that debt around. This is why a transaction between accounts at the same bank is super simple, fast, and cheap/free, but the further you go, the longer it takes and the more banks in between want some kind of fee for the service. It gets to the point where you simply can't make a transaction anymore, because either the fees are way too high or there just isn't a bank-to-bank-to-bank chain that could move the debt. And of course, there are billions of people who do not have a bank. There are payment companies who fill in that gap, where you can input a bank transaction (debt) and they deliver cash, etc. It's still expensive and slow in most cases, and it absolutely does not work with small-value (debt) transactions. There is no way to send 1 USD from the USA to Mexico for an acceptable fee in an acceptable time.

If you could send value instead, you could avoid literally all these problems. Value is value everywhere; it doesn't matter where you send it from or where you send it to. There is no need for cooperating banks. It's like being able to pay cash but globally (without the different currencies, of course, because value is currency independent). Imagine if value were represented by gold: I buy gold worth 1 USD and send it anywhere in the world, and the person who gets it sells it for their local currency. Now replace the gold with crypto, because gold cannot be sent over the internet but cryptos can. Everywhere, very fast/cheap (not with BTC), and without any chain of trusted entities between the endpoints.

It's also final, unlike debt transactions, which can be reversed by the banks and payment providers in between. That's bad for the endpoints, but also for the correspondent bank chains, since they all have to take on the risk of reversed transactions, which leads to higher fees.

It goes into my bank account either way. I trust the bank to accurately count my dollars and to give them to me whenever I ask, because they have a very long history of doing just that.

You're proposing a scenario in which all banks collapse and FDIC insurance fails, but also one in which my computer continues to be a productive member of the bitcoin network and my neighbor will sell me access to his bunker for 3 bitcoins. And my ISP won't be upset that I can't pay them in USD.

If I was worried about the collapse, I think I'd want to store my money in something with real value, not imaginary internet coins that none of my neighbors have ever heard of. Maybe gold, to be cliche. Canned food?

You wanted to know what the benefits are, and I told you. I never said that it should/could replace all debt transactions or the debt system in general. I never said anything about banks collapsing. I never said anything about Bitcoin.

You can trust your bank; you choose the one you want to trust. But what about companies who have to deal with foreign banks, or banks who have to deal with other banks? What if the best choice is still bad and should not be trusted? Value transactions can solve this because they are trustless.

First, loans create deposits. It is a (short-term) loan for the purchaser, but it creates a real deposit for the seller. Inasmuch as you equate "value" with the exchange medium (money/crypto), credit cards create value out of thin air (the purchaser still has his deposit AND the seller gains a deposit, all the while their deposits enable the loans by the bank). The fact that we have more than one bank, and that real settling between banks doesn't reflect the minutiae of the actual transaction, is irrelevant.

In terms of "who pays," well, I doubt people who currently don't pay will somehow decide to pay while free services remain available (even at the cost of targeted ads/loss of privacy). If anything, the right to be targeted is the real "value" that is being (and presumably will continue to be) exchanged.

The legacy payments network is a dumpster fire. It works, but just barely. Crypto is much cleaner and will take it over eventually; give it time.

This is an example of what I meant by speculating on a growth story.

1994 Internet let you talk to people and exchange information around the world — things that you couldn’t do before.

2019 cryptocurrency lets you pay for something, sort of, not really, but will be very clean one day — never mind that millions are already paying for things and sending money because “it’s a dumpster fire”.

> 1994 Internet let you talk to people and exchange information around the world — things that you couldn’t do before.

You could. There were phones and snail mail after all, weren't there?

They were slow and expensive, just like wire transfers between US and EU bank accounts are: 2-3 workdays and a $35-45 fee to move around a few bytes in existing financial messaging systems. Pure madness.

Wire transfers between US and EU take a few hours to clear and cost $25. I send them all the time.

In Europe, transfers between all banks are becoming instantaneous starting Jan 2020: https://www.europeanpaymentscouncil.eu/what-we-do/sepa-insta...

The cost is pennies.

No need for a crypto-anything solution.

> Wire transfers between US and EU take a few hours to clear and cost $25. I send them all the time.

I receive a lot of payments from the US and I have never seen a wire transfer that takes less than a few days to arrive. This is consistent with the info given on Bank of America's website: the fee is $35 when sending foreign currency, $45 when sending USD, time 1-2 workdays. https://www.bankofamerica.com/foreign-exchange/wire-transfer... Chase: fee $40, time 3-5 workdays. https://www.chase.com/digital/wire-transfer

Even $25 for that is an absolutely insane price, as instantaneous and free transfers within the European SEPA system demonstrate. Nothing comparable to SEPA exists for dealing with the rest of the world. Express delivery services like FedEx deliver physical items from the US to Europe faster than payments move over digital channels.

The situation is even worse for businesses that deal internationally with larger sums of money and may have to take significant hits from currency fluctuations during slow transactions. There are billions to be saved annually with systems like RippleNet (xRapid).

US <-> EU is a corridor with huge volume; the legacy systems are therefore very, very optimized for it. Inside the EU, transactions are basically within the same system, which is centralized, so updating balances is super simple, cheap, and fast. Try to send some USD to Mexico or something instead.

Also, a transaction taking hours and costing 25 bucks is absolutely not going to allow the Internet of Value to show its possibilities. It may be acceptable when buying something to be shipped to you, but what about services? How could instant services be paid for with such a system?

Watch the YouTube video linked above if you want to get an idea of the IoV. It's on a completely different scale: value streaming a fraction of a cent at a time, instant and basically free.

Like millions of people, I buy instant services on the Internet all the time using those nasty legacy "dumpster fire" systems.

They even provide a nice built-in feature called "chargeback" in case the service isn't what I expected. Cryptocurrencies don't seem to care about that part of customer experience at all.

(Btw, when you say "just watch this YouTube video about an amazing business opportunity", what I hear is: "I believe you're a sucker".)

If you don't want to watch the video I suggested but then argue with semi-valid arguments because you haven't seen it, I kind of feel like I should just not reply anymore, which is exactly what I will do after this post.

>Cryptocurrencies don't seem to care about that part of customer experience at all.

The idea, which again is described in the video you don't want to watch, makes "chargebacks" something you do not need in many cases. The data/value streaming just stops if one side stops, hence you don't need any agreement or contract. You pay for what you get, and if you stop paying, the other side will stop as well. Some services obviously don't work that way, and there will always be use for a third-party middleman who takes the risk and guarantees service/money for both sides. The idea isn't to destroy all the business agreements that exist and then start them all again with some fancy new way to pay each other that does the exact same thing just differently. The idea is that a whole new sector of services will become possible because of that new tech.

I don't feel the need to convince you of anything; I just think you aren't aware of the fundamental changes the new tech can bring, because all you saw is Bitcoin. I suggested the video purely because he explains the vision of the IoV very accurately. He's not gonna tell you to buy shitcoin XY or throw money into companies that try to build stuff with that new tech. If you watch it and think this will never happen, that's fine; who knows. At least we'd be talking about the same idea then.

Here is the link again https://youtu.be/Lxqz_NlX3GI Should be quite interesting for anyone in the tech/IT space even if one completely disagrees with his vision.

Cryptocurrency seems like the inverse of the internet revolution to me.

Internetworking realized the vast potential of all the computers in the world. They could do so much more if talking to each other.

Crypto via "proof of work" takes all the unused computing capacity everywhere and creates an incentive to waste it.

What it seems like to me is a hybrid virus that uses both human minds and machines to spread and suck up resources.

BTC is 10 years old. Tech has come a long way since then. Proof of work clearly isn't the revolution; there is no need to waste energy to make blockchains/DLT work, just like light doesn't have to be produced by turning 95% of the energy into heat. Inventions never start with the final product. But there will always be people who prefer the first product.

I mean, if someone says that (a) proof of work is the only viable solution and all others have flaws you can't get around, (which I have noticed people claiming) and (b) BTC is still the most successful cryptocurrency, which people are not abandoning, what's your response?

You just made a string of assertions without explanation, so why do you think these things?

Well, (a) is simply wrong. Proof of work has flaws (energy waste, a tendency to centralize) which cannot be fixed. But others have fixed these flaws by not using PoW.

Yes, there are people claiming that PoW and Bitcoin are the best and all other systems are flawed. They are called Bitcoin maximalists, and their arguments usually fall flat when you apply logic. They don't even agree with each other about which Bitcoin is the real Bitcoin.

>BTC is still the most successful cryptocurrency

How do you measure the success of a cryptocurrency? Also, being first gives a huge advantage for "success". WhatsApp is probably the most successful instant messenger, but does that mean it's the best? It takes time to dethrone something like WhatsApp or Bitcoin; just having the better tech isn't enough to instantly make everyone switch.

>You just made a string of assertions without explanation, so why do you think these things?

Any specific question? I don't really know what I should explain. If you are interested in alternatives to PoW, you should look up XRPL consensus (a federated Byzantine agreement); a variant of this is also used by the Stellar Consensus Protocol. Then there is PoS, which is the best-known alternative, and DPoS, which builds on top of that idea. There are also Proof of Weight, Tangle, Hashgraph, and some other less popular and/or not (yet) working ideas/theories.
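As a rough illustration of why the PoS family avoids the energy cost, here is a minimal and deliberately naive stake-weighted leader selection. Real protocols add verifiable randomness, slashing, and much more, and the federated-agreement designs (XRPL, Stellar) work differently again; the names below are purely illustrative.

```python
import random

def pick_validator(stakes, rng):
    """Pick the next block producer with probability proportional to stake.

    One random draw replaces the brute-force hash grinding of PoW."""
    names = list(stakes)
    weights = [stakes[n] for n in names]
    return rng.choices(names, weights=weights, k=1)[0]

stakes = {"alice": 60, "bob": 30, "carol": 10}
rng = random.Random(42)  # seeded for reproducibility
picks = [pick_validator(stakes, rng) for _ in range(1000)]
print(picks.count("alice") / 1000)  # roughly 0.6, matching alice's stake share
```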

Personal computing and the Internet were a much bigger deal than cryptocurrency. Each completely dominated the overall cultural dialogue for years, at least in the U.S.

If you ever watch "The Graduate," where the guy gives the college grad advice to go into "plastics" because it's the next big thing: that was computers in the '80s and the Internet in the '90s.

People were quick to latch on to the web because it facilitated things they really wanted to do, like keep in touch with friends and buy things easily. Cryptocurrency doesn't do that for most of the population. PCs were a hobbyist thing at home, more akin to cryptocurrency, but at the same time they were completely taking over how business operated.

The web had a similar impact on small businesses too. Having a web presence exposed small companies to a much larger customer base than they could have ever hoped for before.

Cryptocurrency doesn't really solve real-world problems like this. Most people are perfectly fine using traditional currency and wouldn't see any benefit to replacing that with Bitcoin, et al.

Edit: I think the most recent change that you might have been around for was the introduction of the iPhone and the smartphones that followed. Huge paradigm shift for the general population.

Your question tickled something really interesting for me. Here's a brief timeline from my own life:

1991 - I am 8 years old and am learning to program BASIC on a Vic-20

1992 - My family gets an XT-class machine (640kB of RAM!)

1993 - NCSA Mosaic is released.

1994 - An Internet Cafe opens in my small town of 35,000. Using library books, I manage to find cool software to download onto 5 1/4" floppies and bring home.

1995 - I get my first C compiler (Power C) and start programming my own games (still on my XT). I start spending time at a local LAN gaming place, wishing for a better computer.

1997 - my family gets a Pentium 133, dial-up Internet, and shortly thereafter I discover Linux. I also build a 386 from parts that I scrounge up.

1999 - I get a job working at the LAN shop as a tech. The company has shifted to being a PC Hardware and repair shop.

2000 - I get 1.5mbit DSL at home, build my own Celeron 433, and dive way deeper into programming.

Crypto-currency first showed up around, what, 2010 or so? In those 9 years, I don't think there's been anywhere near the kind of (pardon the pun) quantum leap we had technology- and utility-wise in the 90s.

The argument is, it is year 1981 of internet in cryptocurrency world.

And the rate of innovation in cryptocurrency might not be as breakneck as it was in networks. And its Mosaic is not yet written (this is all very obvious).

But smart contracts are going to be big, given enough time. Smart-everything requires them to work, and we want smart-everything.

> The argument is, it is year 1981 of internet in cryptocurrency world.

Fair enough!

> And rate of innovation in cryptocurrency might not be as breakneck as it was in networks.

Any thoughts on why that might be? It seems like there are a whole lot more people looking to carve out businesses in that space than there were in 1981, although I was -2 years old in '81 so I don't know for sure. There is definitely a lot more opportunity for globally-dispersed people to collaborate on things than there was in '81.

> And its Mosaic is not yet written (this is all very obvious).

I'm really excited to see the "Mosaic" of cryptocurrency, if it ever comes to pass.

> But smart contracts are going to be big, given enough time. Smart-everything requires them to work, and we want smart-everything.

I'm not sure I follow what "smart-everything" means, nor how smart contracts are required for that to happen. It seems like there's a lot of large companies who would be delighted if smart-everything was dependent on long-term subscriptions paid to centralized entities :)

It allows for permissionless systems, decentralized organizations, built in incentive models, built in authorization systems, trustless smart contracts.....I'd say it's a pretty huge leap. The issue is that everyone always just talks about it being a currency and the news makes a joke about it similar to what was done with the internet early on.

The other thing is that governments are greatly hindering growth. Enterprises really want to get involved but are forced to take it slow and be careful with all their actions for fear of legal repercussions. Of course governments aren't happy and don't want to support it, because it puts government currency at risk, as noted by US representatives.

If you mean “the mid 90s”, I was there for that.

Niche magazine articles and TV shows at the time were hyping the internet and the web. I think they were about right. Mainstream media didn't really catch on until the 00s.

I was anticipating a complete revolution that would change everything. I remember repeatedly telling my parents how it was going to change the world. They weren’t dismissive but also didn’t think it was really going to be something for them. Eventually they relented and I got a modem - so I could surf the small number of websites available.

In retrospect I was incorrect in that it has changed everything more than I imagined it would. I didn’t see mobile phones as the main access point, for example.

As for the comparison with crypto and quantum: only quantum has the potential to make a comparable impact on our lives, and it would likely be totally different, in that quantum will be indirectly impactful, at least for the foreseeable future, whereas the internet was and is most impactful when actually used by the individual. It feels a bit silly to compare crypto (a clever but relatively simple idea for electronic currency that is currently producing a lot of CO2 and suffers from other huge problems) with quantum computing (a revolutionary technology that has required and will require major breakthroughs in theoretical physics and engineering, vast sums of investment, and some of the smartest minds working for generations to enable previously unrealisable control of the physical world and paradigm-shifting computation). Real quantum is also a long way off.

If you want a more near-term tech that I am as excited about now as I was about the internet in the 90s, it is without a doubt VR. It’s going to change everything.

Skepticism about the PC was well dead by the 90s. I had access to BBC Micros in the early 80s and my own C64 in '84. Two years later I was using WordStar on a PC to type all my school assignments and learning MS-DOS, BASICA, Turtle Graphics, and DB2.

Computers were very much about creativity and doing work.

I used an Apple Macintosh in 89/90 to access Usenet and it was a whole other world and the flood of information that came with the web took it to another level.

Crypto/quantum don't really have the same impact for the average person. Quantum is just different hardware to do the same thing (though it makes some calculations feasible to execute in hours or minutes rather than the user's lifetime).

Crypto will probably stay niche like other alternate payment methods like barter cards or the technology becomes part of the underlying mainstream infrastructure that no one sees along with all those billions of lines of COBOL.

When the first web browser came out, for me (and, I've heard, very many others) it was a "where were you the day of..?" moment. I literally remember the afternoon X-Mosaic came out. Computer lab, 1993, a group of grad students huddled around a SPARCStation, I walked over and tried out this new program. No, I didn't see the huge eventual potential that day (the World Wide Web would remain an ad-free space for a few more years), but it was a definitely a "today everything changed" moment.

I'm trying to brainstorm what I think is a "toy" in today's tech market, and I can't think of much. Tech pundits who have seen 30 years of crazy endeavors take everything overly seriously, and companies have the resolve to invest billions in "jokes" (like Facebook Portal), stubbornly refusing to recognize a sunk-cost fallacy.

Instagram is the most recent example of a revolutionary toy: when everyone thought social media was a puzzle solved by Facebook, Instagram cut the fat and became a more elegant tool for getting a message across (and making some money in the process).

I could see Slack's fun approach to workflow automation and collaboration take off, but maybe I'm biased by what I'm investing my own time into learning.

Some examples of 'toys' in today's market: AR/VR, drones.

Another factor is that investment in tech is dramatically higher today than it was 20 years ago (partly due to the success that early investors had), and people are willing to invest in anything that looks like a 'toy' on the off chance it pays off.

Are drones still considered 'toys'? At least in photography, they have long ceased to be toys. Given that big companies are working on them (like Amazon for deliveries), I wonder if they are still in the 'toy' phase.

VR is. The incremental changes are not yet enough to make it practical for non-gaming use (and even for gaming, headsets are amazing but cumbersome). AR is firmly still in the toy camp.

Just look at what's polarizing, what you might dismiss without looking deep into it. Cryptocurrencies are a good candidate. So are VR and AR.

Surprised no one here has mentioned bitcoin.

Fun to see that a decade ago, people were convinced that AR was going to be the next big thing. I could see that exact same argument being made today, nearly ten years later!

If you’re excited by this concept I recommend the book “Wonderland: How Play Made the Modern World” by Steven Johnson. Fascinating stuff!

Here’s a recorded talk and more info: http://longnow.org/seminars/02017/jan/04/wonderland-how-play...

I can't see how Skype was different in 2010 vs. 2003. It provided perfectly clear voice calls over the Internet right away, and that never changed (arguably, the clarity of a Skype voice call is worse now than it used to be). And it worked perfectly well over the Internet connections of the day. I had a mediocre-quality dial-up connection in 2003, already long in the past for most developed countries, and I used Skype a lot; it worked OK. When I moved to a 112 kbps dedicated line, also considered slowish by developed countries' standards of the day, Skype calls were literally perfectly clear all the time, every time. Much better than over a landline phone, if you had a headset.

So while I agree with the rest of the article, Skype sounds like a wrong example. It was a total replacement for phone calls from day one. Since getting on Skype I never made, and maybe received only about 10, calls to/from my U.S. customers over the phone; the PSTN immediately lost me as a long-distance client.

Recently watched George Hotz on video announce his latest update for Comma.ai. To sign up for their optional prime(?) monthly subscription account you've got to give them your GitHub account.

I gave a friend who was anxious to try self-driving a link. He then asks me, what is GitHub? I explain it, and he asks, what is this, some sort of tech toy? I could tell that he began to lose interest.

I well remember talking to people about getting on the Internet in 1994, and as soon as they found out they'd have to install a TCP/IP stack on Windows 3.1, they too soon lost interest.

I believe, against all odds, that Comma.ai just might be one of the eventual winners in the self-driving market. Which is crazy, because they're battling people who've invested billions of dollars. By coincidence, the partner who brought Comma.ai to Andreessen Horowitz is Chris Dixon! Coincidence?

Personally I'm just waiting for them to add support for Ford.

This was the approach Anki (which recently shut down) took for robotics. I actually had a similar idea, but whenever I went to the drawing board to make robot toys I could never make anything that was both fun and affordable - using videogames as a benchmark it basically always lost. I think Anki did as close to an admirable job as you could at trying to sell robotic toys while developing more utilitarian robots (they were working on a chore bot but it was woefully inadequate), but they still failed pretty hard. I was always surprised by their ability to continue to raise large sums of money (they raised about a quarter billion in total).

There is a similar approach being taken in the eVTOL space, with Kitty Hawk trying to develop their Heaviside as an obvious toy for the rich (single seater, short range, "quiet"). I think people will refuse to allow so much noise pollution in the skies, not to mention the inherent question of safety as all these designs make serious compromises to flight stability in order to make them so lightweight and take off vertically.

There is merit to this approach though - you could certainly argue hypercars are toys for the super rich and eventually that technology trickles down to cheaper vehicles. But the economics around racing cars are much more complicated, including making an entire sport (and an extremely popular one at that) to justify the development costs. You could also argue Tesla did this, starting with the Roadster, then the Model S, and now the Model 3. Tesla's financials have been a source of great debate for a decade now though.

Whatever the next big thing is, I bet the media hasn't caught onto it yet. I don't think it's VR, AR, self-driving cars, general AI, or anything else that has already been woefully overhyped. It'll be something nobody was looking for, until the right person stumbles upon it at the right place and time.

I actually think AR could still be the next big thing, once the integration becomes seamless and the peripheral technology doesn't look ridiculous

I'd have to try it. The question is, what's the killer app? VR essentially has three things going for it after becoming somewhat mature: Google Earth, Beat Saber, and porn. All are considered better in VR, but none are really "omg VR is the future!" levels of amazing (to be fair, I haven't tried the porn yet, so maybe I don't realize how much I'm missing out).

The notion of having a HUD for my eyes everywhere I go sounds neat at first, until I think about it a bit more and it just seems like sensory overload. I can always just pull out my phone if there's something I want to know, and that thing is already pretty invasive when it comes to my daily living - do I really need something more than that?

When it happens, we'll know. The iPhone was famously expected to be such a failure that Verizon refused to be a carrier. Microsoft also joked about the idea of Apple making a smartphone. BlackBerry felt the same way. The only company that took them seriously was Google, with Android.

Out of curiosity, what is it you think AR will revolutionize? What do you consider the killer app?

A possible killer app might be something for business meetings. Video calls are already a huge thing, but many feel they're missing some real personal intimacy. If instead of a video call you could all use AR to see each other walk around in front of you, that might be a whole lot more appealing. If it's done right, you could have the same private one-on-one side conversations that are common in real life, just in some AR space.

AR has a clear advantage over VR here, as you wouldn't be blind to your real surroundings, so you'd still be able to talk with the people around you in real life too (and not look like a doofus who walks into walls).

I don't think there's going to be one killer app per se; the feed of information streamed to people's brains is just going to become so much larger. The one good example I can think of is walking through a city wearing an AR device and getting relevant news, historical pieces, restaurants, etc. as you walk.

>The reason big new things sneak by incumbents is that the next big thing always starts out being dismissed as a “toy.” This is one of the main insights of Clay Christensen’s “disruptive technology” theory. This theory starts with the observation that technologies tend to get better at a faster rate than users’ needs increase.

Or perhaps new technologies are solutions in search of a problem, and without any concrete sense of self and hierarchy, user "needs" can get artificially inflated to infinity...

Kind of how people "needed" cigarettes in 1950, and "needed" bell bottoms in the 70s, and "need" a good smartphone camera today (because of course they also "need" to broadcast to their friends and everybody else what they ate for dinner, their latest selfie, and their life 24/7) -- even if nobody on the receiving end really cares, except perhaps their parents and grandparents...

Observation: "comic book" movies today are largely not suitable for non-teenage children (practically all rated PG-13). What was literary version of "toy" is now 9-digit-budget blockbusters.

FWIW, my son is in 3rd grade and most of his peer group (8-9yo) has seen at least some of the MCU movies.

This made me think of disparaging comments from ESA about SpaceX:

'Asked about how the Ariane 5 compares to lower-cost alternatives on the market today, such as SpaceX's Falcon 9 rocket, Stefano Bianchi, Head of ESA Launchers Development Department, responded with a question of his own. “Are you buying a Mercedes because it is cheap?”

Ranzo, sitting nearby, chimed in and referenced the India-based maker of the world’s least expensive car. As he put it, “We don’t sell a Tata.”'

But whoever makes the Fiat Ducato or equivalent for space will make big bank. Different markets warrant different solutions, including custom ones.

Agreed, but lower-cost access to space could also change the game itself. If launch costs are 1/10 or 1/100 of what they are now, your bespoke $100M absolutely-cannot-fail-at-all-costs satellite can similarly be blown away by satellites built without the super-expensive verification required to make them infallible. Just build them cheap and good enough, and if/when they fail, throw up another.

Video games will always show what's next. Gaming companies are developing the expertise at constructing virtual worlds, with rules that engage and immerse users, that will be essential to the popularization of VR. VR will go mainstream under the umbrella of gaming before it's used in any practical sense. It's the same way computers went mainstream, and I bet video games will spawn whatever comes after VR as well.

Most toys remain toys...

I really like Clay Christensen's ideas, but he dismissed the greatest recent disruption (smartphones) as "sustaining"... His theories are not predictive.

What actual toys are there? Especially new ones. def toy: fun/enjoyable/popular but useless.

Video games (e.g. Minecraft, PUBG), social media (Stack Exchange, Reddit, meme-makers), scooters, fidget spinners, drones, 3D printers, face swaps, Shadertoy.

This makes me think of Fortnite. What if Fortnite, and the impact it's going to have on us, is yet but a tiny baby?

Have you tried Fortnite Creative mode? It can be as much of a virtual world as an FPS game engine. People are building islands for other types of games, for socializing, and just as art for art's sake.

You must not have school-aged kids, Fortnite is already dead.

You sure about that? It currently has the second-highest viewership on Twitch, topped only by League of Legends (which is surprisingly almost a decade old).

It was mostly a joke for people with school-aged kids. At least in my kids' school, and judging by what r/memes seems to reflect around the world, they all say it's "trash" and Minecraft is the thing again. I think they all still play it; it's just cool to say it's not cool. However, a lot of them seem to have stopped buying V-Bucks, at least for skins.

What're kids into these days? I could believe it, but haven't seen anything else get hot recently.

Of course I don't know what your kids are into, but Fortnite just started a whole new map, and there seem to be plenty of people playing.

Tiktok comes to mind. Maybe drones as well

This was my thought regarding Google's Soli chip[0] when I saw comments saying its usability is limited to cooking or other fringe cases.

[0] https://news.ycombinator.com/item?id=21269877

The chip has the potential to change the way we live, but this potential will only become reality when proper applications and business models create really useful stuff around it.

I think seeing this tech as somehow disruptive is kind of obvious, but to turn our lives upside down it needs much more.

The same can be said about 3D printing. It has a lot of potential, but timing + the right ecosystem around it is also a big deal.

Apple with the Newton, MS, and especially Palm were there first with the pocket computer + phone, but why did things only really start to happen with the iPhone?

The right bet + timing + ecosystem around it is also pretty important.

That's an interesting topic and an evergreen article, as every few years we can try to observe the landscape of up-and-coming toys. Some candidates:

Drones, Bitcoin, electric scooters, VR, AR, voice recognition (Alexa, Siri, OK Google), 3D printers, robotics.

what else?

I hope the next big thing will move atoms around.

I think bits are very limiting. My iPhone, no matter how good the apps are, will never be able to get up and clean the kitchen, or build me a house for that matter.

Aren’t atomic bombs a counterexample to this? Though maybe an example that proves the trend: there’s a strong connection between nuclear power failures and political pushback.

PCs, smartphones, the Web, JavaScript... all things that helped me make a good living.

Kubernetes didn't start out looking like a toy. Nor did TensorFlow or Kafka, nor the iPod/iPhone, nor SpaceX, nor Tesla.

Unless "looks like a toy" is relative to the incumbents, but not to the population at large.

In “Wonderland: How Play Made the Modern World” the toy as predictor concept is explored more deeply.

The emphasis is less on the implementation of the toy than on the popularity of the style of play or human amusement that the toy enables. Then over time various technologies come along to implement the mode of play. Eventually someone says, wow that might help in serious adult activity X. And then disruption.

Kubernetes, Tensorflow, Kafka, Falcon 9, iPhone, and Model S aren’t toys.

Video game AI, model rockets, iPod (limited to music playback), and RC cars are toys that predicted these modern, serious adult innovations.

The toy as predictor theory doesn’t require that all technologies begin as toys. So maybe Kubernetes & Kafka don’t fit (I can’t think of a relevant toy).

Ten years later... it's still very true :)

There are some interesting user comments, like one guy who talked up Google Search compared to Yahoo and AOL: how it gave you what you wanted and sent you on your way, and didn't try to keep you on site indefinitely.

How things change and end up repeating the same old mistakes.

Remember 3D TV?

Or weapons / military hardware.

Am I the only one thinking of WebAssembly? (e.g. https://wasmer.io )


Possibly the absolute last market sector where you want people thinking of their products as toys.

Wearables started off looking like toys, but are poised to drastically change how we think about health. The whole point of this argument is that the next big thing will come out of left field, not that you should build toys to try to solve a major problem.

Imo, if a statement about the future doesn't contain the word "probably", it is probably wrong.

Probably... suppose... it seems to me... I hate it when people try to hedge their opinions.

I hate when people make no effort to hedge their opinions. It comes off as lacking nuance.

Why? Hedging is fine. I think it's important to accurately communicate how certain you are of a thing. Constant hedging is annoying, but so is stating everything as if it's a fact regardless of how sure you really are.

Communicating confidence is useful.

I totally agree. My beef was with people using vague words like "probably". What does probably mean? 51% likely? 55% likely? 90% likely? It contains no sense of probability that can be agreed upon.

If you're going to attach a probability to an opinion, use "most likely", or "almost certainly", or "unlikely".

You are right. I could rephrase my comment: all statements about the future which don't contain a notion of probability are pure gambling.

I totally agree, except "probably" contains a weak notion of probability. What does probably mean? 51% likely? 55% likely? 90% likely? If you're going to attach a probability to an opinion, use "most likely", "almost certainly", or "unlikely". That, to me, isn't hedging an opinion. That's attaching a real probability to the outcome.


Mods: Please put (2010) in the title.

