Ask HN: What tech were you convinced would take the world by storm but didn't? - swyx
======
Jtsummers
I'm not sure I expected it to take the world "by storm", but I expected more
from Google Wave. It was a great concept, by a major company, but it was too
slow on release and the rollout killed it. In retrospect, it probably wouldn't
have lasted anyways. Google was in the process of minimizing their federated
communication services by that point, and that was another major selling point
of Google Wave's initial proposal.

But the rollout, that was just the worst way to ever get a product into the
hands of users. If you got in they gave you some number of invites. You'd set
them up to be sent out to your friends or coworkers or whatever. Turned out,
it just put them in a queue to _eventually_ get an invite. Wave was
fundamentally a collaboration platform; without anyone to collaborate with, it
had zero value. Fantastic way to fail.

~~~
deepsun
Internally at Google it was really, really hard to get onto the Wave team. So
many people saw it as an opportunity to show what they could do and get a
career boost. So, presumably, they had the best Google engineers working on
that product.

What I want to say is an obvious thing -- a stellar team doesn't imply success.

~~~
ewjordan
The alternative facts: maybe the great team they had _did_ maximize their
chance of success, and it was just one of those 90% of "startups" that never
find their market.

~~~
smt88
That's a good excuse for a product that fails to profit. But a team definitely
isn't "great" if it can't release a coherent, useful prototype to the market.

------
captainmuon
Peer-to-peer file sharing.

There was a time when Napster, Kazaa, and eMule were king. The content industry
fought against it, but developers came up with decentralized solutions like
DHTs or supernodes.

I was convinced the next step would be friend-of-a-friend sharing. You only
share files with your direct friends, but they can pass those files on
automatically, so their friends can get them too. The friends of your friend
don't know that the file originally came from you; the forwarding is
completely transparent. The goal is that nobody you don't trust ever learns
what you are down- or uploading.
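
A minimal sketch of the idea (hypothetical, in Python; the names are made up
and it ignores networking entirely) just to illustrate the anonymity property:
each node only talks to direct friends and re-announces a file as its own, so
provenance never travels more than one hop:

    # Hypothetical friend-of-a-friend forwarding sketch (illustration only).
    class Node:
        def __init__(self, name):
            self.name = name
            self.friends = []      # direct, trusted peers only
            self.files = {}        # file_id -> bytes held locally

        def befriend(self, other):
            self.friends.append(other)
            other.friends.append(self)

        def announce(self, file_id, data):
            """Offer a file to direct friends; no origin information attached."""
            for friend in self.friends:
                friend.receive(file_id, data)

        def receive(self, file_id, data):
            if file_id in self.files:
                return                    # already have it, stop the flood
            self.files[file_id] = data    # store locally...
            self.announce(file_id, data)  # ...and re-announce as our own

    # alice only shares with bob; carol still gets the file,
    # but only ever hears about it from bob.
    alice, bob, carol = Node("alice"), Node("bob"), Node("carol")
    alice.befriend(bob)
    bob.befriend(carol)
    alice.files["song.ogg"] = b"..."
    alice.announce("song.ogg", alice.files["song.ogg"])
    assert "song.ogg" in carol.files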

I would piggyback on the social graph of existing networks to get started. I
actually had a prototype of the friend discovery code, based on XMPP (using
custom stanzas). You log in with your credentials, and it shows you all your
friends that are using the same app. It worked with GTalk, Facebook Messenger,
and Skype (via Skype API). One nice feature was that this worked without
getting an API key from Facebook etc., or having a central server. I was so
pissed when they all changed their APIs to make this impossible. It felt like
a conspiracy to stop this use case.

I still think if somebody pulled this off, it might work, and it would be
pretty disruptive (in the good and the bad sense of the word). It would be
the last word in the debate between "data wants to be free, change society to
enable what technology promises" and "data has to be a commodity, restrict
technology because in capitalism we can't feed artists otherwise".

~~~
scoggs
I, personally, felt that Audiogalaxy was the hallmark program from this era.
It was the most surefire way to get the songs you were looking for and
discover plenty more in the process.

You'd set the music folder you wanted to share (there was no opting out) and,
so long as you had the program open, your files were available for download by
other users. The program operated like an always-on satellite torrent client
with very low impact. You'd find a file to download and the file would come in
many chunks from any available user who had it on their client side. Downloads
were fast, and in the event you lost your connection or closed Audiogalaxy,
the download would resume immediately on reload.
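
Roughly how I picture that working under the hood (a hypothetical sketch in
Python, not Audiogalaxy's real protocol; `fetch_chunk` is an assumed peer
call): the client tracks which chunks are already on disk, asks any peer
holding the file for the missing ones, and a restart simply resumes from
whatever chunks survived.

    # Hypothetical multi-source, resumable chunked download (illustration only).
    import itertools

    CHUNK = 64 * 1024  # bytes per chunk

    def download(file_id, size, peers, have):
        """`have` maps chunk index -> bytes already on disk, so it survives restarts."""
        total_chunks = -(-size // CHUNK)             # ceiling division
        missing = [i for i in range(total_chunks) if i not in have]
        sources = itertools.cycle(peers)             # spread requests across peers
        for index in missing:
            have[index] = next(sources).fetch_chunk(file_id, index)
        return b"".join(have[i] for i in range(total_chunks))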

From my POV Audiogalaxy was extremely threatening to copyright holders in a
way that prior p2p programs weren't. Your comment about software being
especially 'disruptive' and ruffling all of the right feathers is what
reminded me of Audiogalaxy.

There was no indication of who you were receiving files from. There were no
usernames or features outside of download and upload. In terms of creating a
piece of software that picked a mission and executed it, I'll always look at
Audiogalaxy and think it achieved precisely what it set out to do.

~~~
sotojuan
Soulseek is still alive with the same system. The program does not make you
share anything, but users can disable sharing their files with users who don't
share anything themselves.

At least in 2012-14 Soulseek was very popular with fans of extremely niche and
obscure music (stuff not even on What.cd!). Not sure how it is now.

~~~
twelvechairs
Yes, strange that Soulseek seems to have been the one that has lasted longest.
Probably it's just flown under the radar and hence not been targeted, legally
or otherwise, like Napster, Kazaa, Limewire, eDonkey, etc. I don't think it's
in any particular way more technologically advanced.

In any case, YouTube is now surely the biggest repository of illegally
shared music by some distance, yet seems accepted in general by the recording
industry.

~~~
dabockster
> yet seems accepted in general by the recording industry

YouTube (Google) pays the recording royalties out of their own pocket for
those videos to keep user engagement on the site. If YouTube actually banned
this sort of content, everyone would have left the site a long time ago.
(IIRC, some gamers moved to Twitch after ContentID flagged their channels on
YouTube.)

------
Al-Khwarizmi
I don't know if this has a name, but... ICQ-like "searchable" instant
messaging.

Let me explain myself. In the late 90s, there was an IM client called ICQ. For
starters, its UI was leaps and bounds ahead of anything to be seen in the next
decade, and it had functionality, like bots and groups, that only became
widespread again a few years ago.

But what I'm talking about is the fact that I could look for "female, age
between 16 and 20, who likes writing, RPGs and electronic music and is
available for chat" and results would come up. You could enter your data,
interests, hobbies, etc. in the program and mark yourself as available (only
if you wanted) and it was a really nice way to meet people. I collected stamps
at the time, so I remember I would search for stamp collectors in countries
for which I had no stamp, leading to stamp exchanges on top of some interesting
conversation (as you could also throw some of your own interests into the
search window).

I thought that was the future and the tech could only get better from there,
but then came MSN Messenger, with really bare-bones features in general and in
particular no "searchable" functionality, and displaced ICQ. And since then,
nothing similar has appeared. Instant messengers are focused on talking to
people you already know, and if you want to meet new people, you have to use
specific-purpose communities or dating sites. But good luck finding someone
who likes writing, RPGs and electronic music at the same time... That function
of searching people by interests in a huge directory of people (not
restricted, e.g., to a dating site) is what I thought would take the world by
storm, and as far as I know it doesn't even exist anymore at enough scale to
be meaningful, at least in the West (maybe in WeChat, QQ or one of those apps
used in Asia they have something similar, I don't know).

~~~
herbst
Facebook Graph Search was like this (and creepier, as people didn't always know
_how_ they were searchable), but Facebook has heavily crippled it in the last
few years.

~~~
nl
Graph search still works surprisingly well if you know what you can search
for.

Try [http://graph.tips/](http://graph.tips/)

~~~
herbst
It respects privacy settings now though, right?

~~~
nl
Yes. And it actually goes further, hiding some things which privacy settings
would let people see.

But there is a lot you can still do with it.

------
grumblestumble
E-Ink. It is the "correct" choice for display technology, and with enough
research money put into it, it could replace these abominable light-emitting
displays. But what we already have is "good enough", despite all of the hidden
costs, and so we're stuck with it.

~~~
swyx
100% agree. imagine my shock when, as a young technology analyst, I discovered
that E-Ink was a small-cap tech company in Taiwan that basically produced
Kindle displays, store price tags, and that cool double-sided phone that one
time. I thought it was a huge deal when they managed to do color E-Ink. but no
one cared.

~~~
wrinkl3
Around 2010 I assumed it would be a matter of a couple of years until we were
reading comic books on our Kindle Color ink tablets.

~~~
deusum
Still waiting on that E-Ink phone that lasts a month between charges.

~~~
frik
I bought a Motorola F3 phone like 10 years ago for relatives.

[https://en.wikipedia.org/wiki/Motorola_Fone](https://en.wikipedia.org/wiki/Motorola_Fone)

E-ink display, good for elderly people; you can drop it from 100 feet, run a
tank over it, or drop it in a pool. It will survive.

------
turc1656
PayPal used to have a feature that allowed you to install a browser add-on
and generate a CC number on the fly that was good for either one-time use or
recurring use (for subscription services). This feature served two primary
purposes: 1) letting you pay using PayPal on sites that didn't support it, and
2) helping to prevent fraud, which was becoming a massive problem at the time.
If the number was stolen, it immediately stopped working, and a hacker/thief
could not use the CC number to purchase/steal anything.

It was that second aspect that I thought would totally eliminate all credit
card fraud and make people comfortable with online purchases on smaller sites.
I have no idea why PayPal killed the program, but even before it did, not many
people used it. I was the only person I knew that was even aware it existed.

EDIT - if anyone is curious, I looked it up. Two ex-PayPal employees explain
here: [https://www.quora.com/Why-did-PayPal-discontinue-their-
one-t...](https://www.quora.com/Why-did-PayPal-discontinue-their-one-time-
credit-card-numbers)

~~~
arfrank
I’d suggest you look at what we built at Final. Getfinal.com

We took a hard, deep look at a massive, stagnant industry (credit cards) and
used experience and features as differentiators.

~~~
sjs382
Too bad it requires an invite code. Do you have one for us? :)

I use privacy.com for something similar, but it's a debit card (so it just
connects to your bank account) rather than a credit card and doesn't offer any
rewards.

~~~
bwanab
They don't even seem to have a way to apply for an invite.

------
MichaelGG
F#. Back in 2006 I first stumbled upon it and was amazed. It had so much
potential. Everything C# did, and more, better. I was sure we'd see 20% of MS
devs moving to it.

I underestimated the momentum of MS, the power of embarrassment of hiring a
high-profile figure to be shown up by a researcher, the incredible anti-FP and
even anti-generics...resentment(?) that MS kept towards them. Plus the insane
comments from actual developers that literally did not understand the basics
of C# ("C is a subset of C#" and "var is dynamic typing" <\- comments from a
public high-profile MS hire).

I've basically given up hope on programming becoming better over time. A lot
of apps are boring grunt work anyway, so the edge in using better tools can be
beaten just by throwing a lot of sub-par people at it.

On the plus side, for people looking to strike it rich in "tech", knowing tech
isn't really a prerequisite. Persistence and the 'hacker' spirit, even if it
means you spend all night writing something in PHP that would literally be one
line if you knew what you were doing, hey, that's what leads to big exits.

~~~
RyanZAG
I feel it's more a case of functional programming trying to target the wrong
segment - albeit out of necessity.

There are two kinds of programs being made. The first and most common one is
the kind you're talking about here: grunt work. It's not about the code, but
more about having the code do a straightforward task with very flexible
constraints. A web app that talks to a database and does a few simple
transformations -- like forum software or a todo app.

The second type is the interesting one: actual difficult code that does
something unique and difficult and can take years to write by very experienced
developers. Usually the difficult part here is coming up with the correct
algorithm and then applying it, often with performance considerations being
extremely important as the code is doing a lot of work. An impressive new
MMORPG game, a new rendering technique, complex simulations, control code for
rockets or advanced batteries.

The big problem with functional programming is that it's never been positioned
as a solution to the second type. Generally people trying the second type are
told to use C, C++ or, recently, Rust. Functional programming is marketed as
making the first type "better" because trying to market it as being more
efficient than C/C++ has not worked, as people rely on microbenchmarks for
these decisions. But this falls apart with the argument you gave: for the
first type, it's better to just hire more junior programmers as there's
nothing really difficult involved. And using a functional language makes it
extremely difficult to hire junior programmers, because functional languages
are aimed at advanced programmers.

It's a massive product-market fit problem.

~~~
lastofus
I think there is a third segment where FP fits well: large difficult problems
that can afford a 10-20% performance hit.

The examples you gave of difficult problems are mostly soft realtime, which
not everything needs to be.

Granted this is a relatively small subset of problems.

~~~
wbl
It's all the problems where you get a nice clear market if you solve it.

------
bsaul
BeOS. That was the best OS at the time by far: stellar performance, a fantastic
C++ API usable by a newbie. I still haven't found a GUI as responsive as that
one.

DVD-Audio. Great multi-channel, high-rate, high-resolution audio. But it
required the whole production chain to upgrade, as well as new consumer
equipment, and it competed with Sony's own format (which was also a
failure)... It would have given the record industry a few more years of
revenue before bandwidth became sufficient to download or stream music at
this resolution.

~~~
IshKebab
I'm not surprised DVD-Audio and co failed. Their quality is indistinguishable
from CDs despite what audiophiles would have you believe, and by the time they
came out MP3 was clearly the future.

~~~
baldfat
> Their quality is indistinguishable from CDs despite what audiophiles would
> have you believe

Well, I blame the system. You needed upgraded headphones or speakers. I had
$2000 studio monitors when I was an audio engineer, and I can 100% tell you
that in a blind test I could tell the difference.

95% of the reason why is that people don't care about quality audio. It's
something about human brains. MP3s sound horrible compared to FLAC on good
equipment. Instead people wear Beats Bluetooth headphones listening to
streamed audio.

I also blame myself. I use a $15 in-ear Bluetooth earpiece (looks like a
hearing aid), an Inovate G10. I listen almost exclusively to podcasts or
YouTube videos. The convenience matters so much more to me than sound quality.
I listen to music at home (when my kids aren't around, because all they do is
complain). I need to get a good pair of headphones.

~~~
abainbridge
> Well, I blame the system. You needed upgraded headphones or speakers. I had
> $2000 studio monitors when I was an audio engineer, and I can 100% tell you
> that in a blind test I could tell the difference.

Do you have an explanation of how that is possible? On the face of it, CD
seems to have more SNR and frequency response than necessary. I've got
intelligent friends on both sides of the debate, but I've never heard a
plausible explanation from the "I promise I can tell the difference" friends.
I tend to believe Monty Montgomery (of Ogg Vorbis fame). After watching
[https://www.youtube.com/watch?v=2qFjdQP7Ep0](https://www.youtube.com/watch?v=2qFjdQP7Ep0)
there's little obvious room for doing better than CD. The best explanation
I've heard is that CD mastering is generally aimed at mainstream equipment,
and therefore isn't optimal for good equipment.

~~~
baldfat
I can explain. On a stereo, most people will be hard pressed to tell. The issue
is where and how everything is done, from recording all the way to mastering.

Technically, here is how it works. CDs are at a 44.1 kHz sample rate, and back
in the 2000s you recorded at 96 kHz, similar to how video is shot in 8K for 4K
or 1080p delivery. Then, when you exported your mix to be mastered, it would
get knocked down to 44.1 kHz; not very noticeable in the least unless you
listened to stuff all day, every day, and had trained your ear to hear
everything.

If you compare 96 kHz and 44.1 kHz there is a distinct difference, BUT it
is not something most people would care about because their equipment only
resolves less than 44.1 kHz. My system, from recording all the way to the
audio output and DAC, ran at 96 kHz and cost me thousands and thousands of
dollars.

I do believe that you can tell when you hear the two. Kind of like how FLAC
and MP3 sound different on my LG V20 phone.

~~~
dhimes
But the question remains. Given the Nyquist theorem and an upper limit to human
ear response, what is the advantage of going over ~40 kHz? I understand you can
encode more information, but decoding to higher frequencies than we can hear
doesn't seem (to the novice) to be beneficial. Do we somehow 'detect'
harmonics above what we can hear? Or is the extra information used to process
something else differently, like the relative volumes of two slightly
different instruments making a note at the same pitch? It's an interesting
question IMO.

EDIT: upper ear response is presumably limited to ~20 kHz
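
For reference, the back-of-the-envelope numbers behind that figure (just the
textbook Nyquist bound; f_s is the sampling rate, f_max the highest audible
frequency, nothing specific to any one format):

    f_s \geq 2 f_{\max}, \qquad f_{\max} \approx 20\ \mathrm{kHz}
        \;\Rightarrow\; f_s \geq 40\ \mathrm{kHz}

    \text{CD: } 44.1\ \mathrm{kHz} / 2 = 22.05\ \mathrm{kHz} > 20\ \mathrm{kHz}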

~~~
Loginid
You are right about upper ear response.

I can tell the difference with my now-old ears, if it is a good recording of
real-world instruments in a common space.

That is a lot of qualifiers, but on playback you can ‘feel’ that other space.

I believe that it is because of the way that the higher frequency components
of the sound interact with the environment and effectively down-transpose and
affect the rest of the signal before they hit the listener’s ear.

------
kobeya
The general-purpose computer. It's a dinosaur on the verge of extinction at
the hands of walled-garden app stores on phones and tablets. I never, ever
expected that to happen.

~~~
fsloth
Uh oh, I think I'm about to rant. Scuse me.

Given how cheap even powerful general purpose computers are, I don't think they
will ever be priced out of availability. Aficionados, internet stores, ARM,
Linux, market forces, etc.

To some extent, good riddance. Software engineers and product designers are the
reason the PC is effectively broken by design in UX, because their products are
inexcusably slow. There is zero reason anything I want to do with e.g. Office
should not have zero wait time, except non-user-centric design. My friggin circa
1984 Mac Plus felt faster on most stuff than the generic desktop PC.

The decade when Moore's law gave freebies to software is probably one of the
reasons for this astronomical sluggishness. Application feels slow? Well, just
wait a year and get a new, faster CPU. No need to spend resources on
optimization.

I don't know who started the idea that it's smarter to buy more hardware
than to spend resources on programming, but if I had to guess, it was the
mainframe vendors. Cool for embarrassingly parallelizable batch processing
jobs for massive bureaucracies, not so much for desktops.

And don't get me started on the idiocy of thinking it's fine that software
developers don't need to develop domain understanding in the field they
operate in.

With the general purpose computer desktop development, we have a system that's
broken on so, so many levels.

At least OS X seems to try to do the right thing in its attempt at fluency, and
I _can_ get a Linux desktop to operate smoothly. Even Windows 10 _starts_ fast,
but O brother and sister, the clunkiness of software.

Despite what I said, I think the situation is improving. Immediacy in handheld
devices puts pressure on the desktop to concentrate on not wasting the user's
time as well.

Let's see how it goes.

~~~
baldfat
> My friggin circa 1984 Mac Plus felt faster on most stuff than the generic
> desktop PC

See how fast it felt.

[https://youtu.be/XwbrCYJcrKQ?t=4m34s](https://youtu.be/XwbrCYJcrKQ?t=4m34s)

No way and NO HOW! That statement is 100% false. The issue is that you remember
it feeling fast. My Amiga felt like lightning to me in 1985 and I still boot it
up from time to time. The delays in everything are over the top, and the Mac
128k was horribly slow.

~~~
fsloth
I meant the Macintosh Plus, not the Macintosh 128k. I was off by a few years;
the Mac Plus came in 1986.

[https://en.wikipedia.org/wiki/Macintosh_128K](https://en.wikipedia.org/wiki/Macintosh_128K)

[https://en.wikipedia.org/wiki/Macintosh_Plus](https://en.wikipedia.org/wiki/Macintosh_Plus)

I think the example is a bit too sluggish but is probably a more correct
presentation of the actual facts than my nostalgic outburst.

~~~
baldfat
The delay of just drawing a window is so slow. Macintosh Plus review:
[https://youtu.be/_bI0moHdjPQ?t=22m29s](https://youtu.be/_bI0moHdjPQ?t=22m29s)

I was an Amiga kid. I had friends and co-workers who had the Macintosh Plus and
it always seemed so S-L-O-W compared to an Amiga.

Amiga 1000 from 1986 [https://youtu.be/CDWdVk-
hmgA?t=51s](https://youtu.be/CDWdVk-hmgA?t=51s)

The window draws were also slow, BUT ray tracing and full animation compared to
that lame Macintosh is just a whole different generation. The Amiga lost, and it
took until 1994 or 1995 for other machines to beat the Amiga of 1986.

~~~
Veedrac
That did not look any less responsive than doing the same on LibreOffice on my
high-end gaming laptop, with an SSD, and in many cases it looked more so.

Of course, the newer one has newer graphics, but it also has enough
transistors to replace each one in the 68k with a whole new 68k.

------
mncolinlee
Near-field communication (NFC).

It came out in 2010 for Android and Microsoft phones, but has been hamstrung
by Apple for most of its development. I had several startup ideas which
required NFC market penetration on phones, but which even today are
impractical due to Apple's chokehold on NFC APIs for fear of a payments
competitor.

In order to even find "tap to pair" devices, you often need to seek out the
one manufacturer or option available. There should be personal phone-to-phone
"tap to pay" by e-cash or cryptocurrency. "Tap to auth" using your phone. "Tap
to key exchange" should be part of doing business, sending "business card"
contact details as well. Samsung had S-Beam, "tap to share files" securely
with Wi-Fi Direct.

But even with the addition of Core NFC, you need to find a responsible adult
with an Android phone to even write a single bit as an iPhone owner.

~~~
allthing
We use it in products at work and it always works seamlessly. It could be a
quick and easy way to pair a phone to a car or connect a phone to WiFi. Just
tapping is way more efficient than the current connecting options when you're
near the hardware you want to connect with.

------
adrianmsmith
GWT.

To create client-side web apps, rather than writing Javascript, write Java and
have it transpiled to Javascript. Java has IDEs, isn't a new language to learn
(like Dart), has static type checking, support for refactoring in the IDEs,
etc.

GWT could do tree shaking, obfuscation, etc.

You could write your server code in Java, and "call" remote server methods
from your client code, "passing" Java objects transparently back and forth
between the client and the server.

It was released in around 2006, it was not without issues but seemed to be a
much better way to develop complex client code than the Javascript options
available in 2006, as far as I could see.

It never really took off.

Google moved their team over to Dart at some point. I guess the Oracle Java
lawsuit meant Google were unenthusiastic about continuing to have anything to
do with Java where they didn't need to.

The compile times were slow. Not all Java compiled, so you had to have a mental
model in your head not only of the Java but also of the Javascript that was
going to be generated, which was extra mental overhead.

~~~
leephillips
Doesn't clojurescript use GWT? If it does, then GWT has certainly lived on. If
I'm confused, sorry.

~~~
adrianmsmith
I'm no expert but I think Clojurescript takes the Clojure language (spelt with
a "j") and converts it to unoptimized Javascript, which then gets processed by
the Closure compiler (spelt with an "s"). The Closure compiler takes
unoptimized Javascript and emits optimized/minified Javascript.

Closure is from Google, but is independent of GWT which is also from Google,
despite the two projects having similar aims. Closure is modern, whereas GWT
is not.

The open source team who maintain GWT now (after Google abandoned it) are in
the middle of a re-write, which will be called GWT 3 (current version is GWT
2). GWT 3 will convert Java to unoptimized Javascript and use the Closure
compiler to optimize it etc (as Clojurescript does). Current GWT 2 does not
use Closure but does its optimizations itself.

I may have got some of that wrong, corrections welcome!

~~~
leephillips
Thanks for untangling that.

------
Dowwie
The _semantic web_. It spawned an entire generation of academic study, but
aside from assuring job security for researchers it hasn't amounted to anything
beyond their ivory towers.

~~~
perlgeek
I get the impression that "dumb" crawling and machine learning have turned out
to be more practicable.

~~~
mannykannot
It is definitely more practicable in the sense that no-one has figured out how
to implement the vision of the semantic web.

------
shove
WebGL. I spent ~10 years doing Flash / ActionScript and when Jobs nixed the
mobile Flash plugin, I figured everything would shift to HTML5 / WebGL. Nope.
The era of fancy, interactive websites faded into the rearview mirror.

I'm sure most of you don't miss it, but there was so much amazing work. Rather
than the budgets shifting to Three.js etc., they went to apps or dried up
completely.

~~~
disease
There is still demand for interactive, animated and "flashy" websites in the
educational sector.

~~~
shove
Any way you slice it, demand is probably less than a tenth what it once was.

------
Animats
Virtual worlds. The "metaverse" of Snow Crash. "Snow Crash", the TV series,
has been green-lighted by Amazon. That may drum up more interest.

Second Life comes close. They've had a working system for over a decade, with
adequate solutions to most of the problems. But it's hard to use, and not that
many people are interested.

~~~
nathan_f77
I agree with that. I think it just suffers from stigma, like online dating a
few years ago. It's very easy to make fun of Second Life. For example, there's
a scene in The Office where Dwight has an account, and Jim signs up in order
to prank him, but then he ends up taking it too seriously and gets made fun
of. All the articles I've read about Second Life paint a sad picture of lonely
people in their basement or bedroom.

I would like to try a virtual world with facial feature tracking, like the new
Animojis on iPhone X. It would be interesting to play a multiplayer game where
you communicate with voice, body language, and facial expressions.

A VR headset obscures the eyes, so you couldn't use a 3D camera. Maybe a
combination of 3D tracking and sensors inside the headset.

~~~
Cthulhu_
> It would be interesting to play a multiplayer game where you communicate
> with voice, body language, and facial expressions.

Simple applications already exist; look at Rec Room for example. That's got
voice, limited body language (mostly hands) and I think even simple facial
expressions that it infers from... idk, something. Voice chat too.

------
johan_larson
I'm disappointed none of the various constructed "universal" languages (like
Esperanto) ever took off. Having a single common language spoken by everyone
would be enormously useful. As it is, the closest we have is English, but it's
a natural language and therefore complicated and difficult to learn.

~~~
hans_mueller
Yes, but the English spoken by most non-natives is actually something like an
Esperanto. The pronunciation, vocabulary and grammar are almost normalized into
an artificial dialect. I think English is a very good compromise. The only
problem is French people and other communities who disdain English for
patriotic, historical, or political reasons.

~~~
dandare
English is considered to be a very difficult language, and I find it a
historical tragedy that English became the lingua franca of the world. I
remember that as a child I could not wrap my head around the "spelling
competitions" I saw in English cartoons. What is the deal with spelling words?
If you can say it you just spelled it; if you can read it you just pronounced
it. And that is only one of many things that make English difficult. Only
Mandarin Chinese with its tonality could be worse (will be worse?).

~~~
jabretti
Spelling difficulties are an artefact of what makes English so powerful,
though -- its ability to simply adopt foreign words any time it feels the need
to do so.

The base language was formed by crashing two major European language families
into each other a thousand years ago, and we've been happily borrowing words
from every other language on Earth ever since as we've needed them. If we want
to make up a brand new word, as we so often do in science, we'll probably
reach for a handful of Greek and Latin prefixes and suffixes and stick them
together any way we like.

After a thousand years of this, English probably has the biggest and most
expressive vocabulary of any language in the world.

~~~
ufo
I think the spelling difficulties have more to do with English not having
phonetic spelling than with it having many loanwords, prefixes or suffixes. I
speak Portuguese and we have lots of loanwords as well, but after a while they
are all spelled like the rest of the words instead of copying the original
spelling. And every once in a while there is a spelling reform to replace
archaic spellings with more modern forms (this never happened with English
because it was politically unpopular).

~~~
maffydub
I think the cause of a lot of the bizarre spelling in English is the "Great
Vowel Shift" \- basically, sometime between 1350 and 1700, most uses of most
long vowels changed their pronunciation.

Obviously (because this is English), the change didn't apply universally, but
it applied widely enough to move our pronunciation out of sync with the
European languages it was derived from.

See
[https://en.wikipedia.org/wiki/Great_Vowel_Shift](https://en.wikipedia.org/wiki/Great_Vowel_Shift)
for more detail.

------
marcus_holmes
Optical computing. I remember talking to a bloke working on it back in 1990,
and he said they had a working optical transistor so all the hard bits were
done. Soon we'd have computers that used multiple frequencies of light to
process operations concurrently on the same hardware!

Still waiting...

~~~
btown
These computers do exist on tabletops in labs; the problem is that it's
comparatively difficult/expensive to mass-produce and miniaturize these
components compared to silicon (where you can etch billions of transistors at
once on a modern CPU). There are really cool things you can do, especially in
terms of continuous/non-discretized-time computations, but I wouldn't hold my
breath for this to become mainstream.

~~~
avian
Are you talking about analog computing? We tried that both with electronic and
mechanical computers and it obviously didn’t work out since nobody uses them
anymore. What benefits do optical computers have in that regard?

~~~
krylon
I once heard somebody talk about analog computers who is old enough to have
used them.

The impression I got was that they did work _exceedingly_ well on a very
narrow niche of problems (simulations, mostly). But "programming" them was a
huge pain, and for the general purpose arena, they were not useful.

~~~
marcosdumay
They would be comparing them with the digital computers of the same time.

There's a reason people used analog computers back then, and there's a reason
nobody uses them now. Most of the reason is about how noise interferes with
small components, and that's a physical reality that won't go away.

------
delgaudm
The Lytro Camera[1]. The idea of refocusing a photo after the fact felt like
an absolute revelation. Their initial camera was somewhat affordable, but
ended up being just downright awful. The pictures were low res, low quality,
and the camera was difficult to use and lacked basic, yet essential functions.
I was a super-excited early adopter. After I got the camera I think I took,
maybe 100 pictures with mine and it's sat in a box of disappointment ever
since, because you couldn't actually _do_ anything with the crummy pictures it
took. All I wanted was a reasonably good 4MP image where I could refocus a
picture on my kid's face when the camera focused on the wall behind her so I
could send that picture to grandma, or post it to facebook. They clearly had
something else very different in mind and the novelty of all the interactive-y
stuff they added around the refocusing forgot that what we(I) actually wanted
was a picture.

It's just as well that I ended up not taking more pictures, because the one
place you could see them is being shut down at the end of the month.

They appear to have pivoted to professional cinema.

[1] [https://www.wired.com/2011/06/lytro-camera-lets-you-focus-
ph...](https://www.wired.com/2011/06/lytro-camera-lets-you-focus-photos-after-
you-take-them/)

~~~
greggman
I wanted to like Lytro but after trying it I realized I don't want to refocus
later. Just don't care. I'm not saying some Pros wouldn't want that but for
the masses my guess was "no, can't be bothered"

On the other hand, what I do expect to happen, maybe, is that my mobile camera
will advance to the point where it takes 1000 images in 1/1000th of a second,
so I can just wave it around and then post-process any image at nearly any
resolution or any DOF, so I don't need lenses and don't even have to aim much.
Unfortunately, actually processing 1000 images in a short amount of time is
10-20 years off?

I'm a little surprised that their pro video camera hasn't been more popular. It
seems like the no-need-for-a-green-screen feature would be useful for getting
better lighting for effects scenes.

------
grahamburger
3D printing. Turns out making little plastic shapes isn't all that useful.
Seems obvious in hindsight.

~~~
chx
I believe the problem is the sort of plastic used. You need structural
strength in most things, and filament pieces melted together are just not
cutting it.

~~~
cheeze
Absolutely. One area they have revolutionized is casting. Hell, even Jay
Leno's crew used a 3D printer to create prototypes of parts that were no longer
made, and then used the printed plastic part to make a sand mold for casting in
aluminum.

~~~
vlehto
The next step would be to 3D print the sand mold directly.

------
kumarski
CRISPR.

My naivete got the better of me. I didn't realize that even though there was a
high output of genetic marker tests, very few of them would result in viable
CRISPR targets.

There's 60K genetic marker tests on the market.

8 to 10 new ones come out each day.

Humanity has little hope of figuring out which of these are "druggable targets."

Genome Wide Association Studies aren't like websites or apps waiting to be
optimized with more data.

The more data you feed in, the more obfuscated the truth becomes.

Drug discovery is hard, and CRISPR modification is a technology way ahead of
clear problem framing.

[https://omicsomics.blogspot.com/2017/03/targets-
drugability-...](https://omicsomics.blogspot.com/2017/03/targets-drugability-
revisited.html)

For 12 years, I've had a chronic autoimmune disorder that I had hoped would be
solved by the likes of a massive genetic dataset of Ulcerative Colitis twins
that had been separated at birth where one twin had the disease and the other
didn't.

I had hoped that CRISPR would be the solution given enough problem context
from a solid GWAS, but this was foolish.

Our best bets in biotech are not in the latest technologies, they are in
ensuring that basic science, phase 0, and phase 1 trials get large unfettered
funding.

I had hoped Liz Parrish's outfit would make leaps and bounds in genetic
modification for patients like me, but that ended up being baloney.
[https://news.ycombinator.com/item?id=11560943](https://news.ycombinator.com/item?id=11560943)

CRISPR technologies are like bullets to attack problems with, but the operator
is increasingly blind. Bonferroni corrections abound, and family-wise error
rates are unforgiving in all things related to GWAS studies.
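
To make that last point concrete, the textbook Bonferroni arithmetic under the
usual GWAS convention of roughly a million independent tests (alpha is the
overall significance level, m the number of tests) is what produces the
familiar genome-wide significance threshold:

    \alpha_{\text{per test}} = \frac{\alpha}{m}
        \approx \frac{0.05}{10^{6}} = 5 \times 10^{-8}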

~~~
TrueDuality
A close mentor of mine has been working on private cancer research for close
to three decades and just recently retired. He was both incredibly hopeful and
positively terrified of CRISPR.

He explained CRISPR as "one of the crucial technologies we don't yet have the
wisdom to wield"

I recently learned (I believe on here, in the past week) that CRISPR was used
to develop a genetic treatment, given to its first human patient, to treat an
inherently genetic disorder.

I'm honestly terrified we're accelerating our technology faster than our
societies can adapt to the potential threats they may pose.

~~~
faltad
Are you speaking of the person being treated for Hunter syndrome via a gene-
editing treatment (I think it was done this week)? It doesn't use CRISPR but
another tool named ZFN. Pretty exciting that we're not only trying those out
in vivo but also in fully grown patients with a specific disorder!

~~~
baldfat
"Like the newer gene-editing technology CRISPR, ZFNs can cut both strands of
the genome’s double DNA helix at a specific location."

They are different technologies, but zinc finger nucleases (ZFNs) and CRISPR
both do DNA editing.

------
antjanus
Adobe Air. HTML/CSS/JS in a native executable with access to native methods?
It sounded like a dream. And yeah, you can do Adobe Air without Flash. You
could build desktop apps and phone apps with the same tools on top of it. I
really liked the desktop stuff.

I made a couple of apps and used a ton of Adobe Air apps (non-Flash) back in
the day. Thought it was the future.

It failed, for some reason.

^^^^ this is the reason why the rise of Electron wasn't surprising to me. In
fact, I was surprised by people being surprised. I realize that Electron is
WAY more sophisticated but "desktop web apps" have been around for a while.

~~~
shove
it was soooooo slow though. But yeah, the comparison to Electron et al is spot
on.

~~~
antjanus
not for smaller apps. I mean, can't imagine running VS Code on it.

------
Johnny_Brahms
I am still waiting for all my favourite applications to be extendable in GNU
Guile.

On a more general note, I am still waiting for the world to fall for lisp in
general.

~~~
jknoepfler
> On a more general note, I am still waiting for the world to fall for lisp in
> general.

Gah, aren't we all? Watching John Carmack push Scheme for the Oculus Rift 2
[[https://www.youtube.com/watch?v=ydyztGZnbNs](https://www.youtube.com/watch?v=ydyztGZnbNs)]
made me feel like something was right in the universe.

~~~
marcosdumay
> Gah, aren't we all?

Since I found Haskell, no. Not anymore.

And I don't expect Haskell to take over the world either. I do expect a large
crop of new languages that extend on the concept, and some of them to take
over the world.

------
Simulacra
Google Glass. I thought this would become standard, miniaturized, and used in
all aspects of life. Perhaps I read _Daemon_ too much, but I was sure Glass
would fly.

~~~
swyx
it died when the word "Glasshole" was coined.

~~~
toyg
It had everything to do with recording. People wouldn’t mind people flaunting
weird gadgetry, but the feeling of being recorded was too creepy. It didn’t
help that a lot of early adopters were very creepy to start with.

It could have revolutionised journalism, and I still think we will eventually
get something like it once the tech is really invisible, but the social aspect
will take ages to sort out.

~~~
csmattryder
Robert Scoble in that Glass shower pic.

You can spot the exact moment all the hype died, in that one post.

[https://plus.google.com/+Scobleizer/posts/TcaqNeYJWXo](https://plus.google.com/+Scobleizer/posts/TcaqNeYJWXo)

~~~
toyg
Considering what came out later about Scoble...

------
michaelchisari
The blockchain. A strange thing to say when Bitcoin is nearing $8k each, but
I'm not interested in cryptocurrencies as an abstract store of value based on
the fever of the market.

By now, however, I really thought someone would have found a use for the
blockchain as the underpinning of some kind of new app or tech that would be
able to create real value for bitcoin or whichever crypto they built it on.

However, we just haven't seen that. We have a lot of gambling, a lot of whales
moving the waters, and a lot of irrational exuberance.

But no solid tech. It's still early, but I haven't even heard of anything in
the works that really knocks my boots off. All in all, I'm glad people are
getting rich (although my hunch is that most getting rich were so to begin
with), but so far the tech part of it all has been a big disappointment.

I guess other than finding a way to make space heaters that generate money.
That's pretty cool.

~~~
marcus_holmes
yep. Haven't seen a single application that isn't primarily aimed at raising
investment money. It's Ponzis all the way down as far as I can tell.

That may change, of course.

~~~
intruder
Golem seems interesting: rent out your CPU cycles to others. I think it
currently works only with Blender, i.e. you could rent out a rendering farm.

~~~
anonymous5133
Golem is definitely interesting considering that big cloud computing companies
are now selling computing power.

------
the_decider
Semantic ontologies that would totally change the nature of search.

~~~
narrator
The problem with user-supplied metadata is that users lie. For example, the
"keywords" meta field was mainly used for SEO spamming.

~~~
williamscales
I dunno, I think as long as you do statistical analysis that's vulnerable to
adversarial inputs you're going to see the same problem. Meaning that you can
infer the keywords and prevent obvious lies only until the users figure out
how to trick your keyword machine into inferring the wrong keywords.

~~~
hobofan
As someone who only briefly dipped their toes into the semantic web topic, I
found Google's Knowledge Vault[0] to be a very interesting approach to the
problem of "everybody lies".

[0]:
[https://en.wikipedia.org/wiki/Knowledge_Vault](https://en.wikipedia.org/wiki/Knowledge_Vault)

------
rticesterp
3D Sports. In 2005, one local theater in Austin showed the National
Championship in 3D. To this day it was the best football viewing experience
I've come across. During the game, I could see the holes for running backs open
and then close in a split second. I felt like I was actually on the field as a
player as the game played out. At halftime, everyone gathered in the lobby of
the theater, and talking to other people there was just this feeling that this
was the future of watching sports.

I was convinced at that time that 3D would eventually be the way everyone
watched sports, and that it would eventually top the experience of being at
the game. I'm still surprised that theaters and sports networks haven't
partnered to show more sporting events in this fashion.

------
b0rsuk
Separate screen for touching on the back of the mobile device.

I forgot what the device was called, but there was a BlackBerry or similar
device that didn't have your oily fingers smearing the very screen you're
trying to interact with. You didn't obscure the display with your fingers.
Instead, the touch-sensitive rectangle was on the back of the device, and a
cursor was displayed on the front side, in the spot corresponding to where you
touched.

~~~
gempir
I like the trend of adding this functionality to the fingerprint sensor.

I have a Pixel 2 and I can pull down the notifications by swiping down on the
fingerprint sensor.

------
zinckiwi
The Commodore Amiga. Used it from release in 1985 through to 1993, at which
point I had to admit defeat and defect. Much has been written, including a
great retrospective on Ars, but its failure wasn't due to engineering. Way
ahead of its time.

Minidisc. This was generally popular in Hong Kong and Japan, perhaps other
parts of the world, but in the US I gather it was marketed as a CD alternative
instead of a mix-tape alternative. Digital copy protection hobbled the latter
immediately, of course. I still used a pair of them ten years after getting my
first player as a poor man's two-track recorder.

~~~
ekianjo
The Amiga was awesome and it did sell quite well in some areas, like Europe.
Its failure in the US market is clearly what killed it, as well as the lack of
innovation after the Amiga 500... It did take the world by storm with the
Video Toaster though, and stayed there even long after it was dead.

------
0xcde4c3db
In the '90s I was convinced that basically everything with a name attached to
it was going to be built around PowerPC and MIPS processors. When StrongARM
went to Intel, I figured ARM was basically dead outside of
controller/coprocessor/offload applications and expected Pentium II/III to
suffer a similar fate to Pentium Pro.

~~~
kobeya
RISC-V is a derivative of MIPS heritage and there's still a chance it will
take the world by storm...

~~~
0xcde4c3db
I thought RISC (Berkeley) and MIPS (Stanford) were originally rival projects,
with RISC being more closely related to SPARC. I wasn't there, though.

------
kdoherty
I remember being in high school when Google Plus came out and thinking it
would be incredibly popular. I totally missed the mark, and it's funny because
while I thought it would take off with _other_ people, I never really used it
myself.

~~~
Kequc
I touched on this before in my rambling. Google botched it, not the product
itself; the initial offering of G+ was actually really, really good. Far
better than Facebook: a return to basics in a clean interface with Circles,
groups being a feature Facebook added later.

When Google launched G+ it was actually one of two new products, the primary
one being Google Accounts. Before that point you needed to use a Gmail account
to sign up for any of Google's services, and it made everything a big pain when
you had more than one Gmail.

Now with a single Google Account you could sign in to Google's entire
ecosystem, including all of your Gmail accounts at the same time. It also came
with a free, ad-free, totally simple, you-don't-have-to-use-it social network
called G+.

Google should have marketed this all honestly. Instead they started talking up
G+, and hid Google Accounts underneath it. This was a huge mistake. The
general public felt like they were being tricked, because they were, albeit for
far less nefarious reasons than was assumed. Nobody wanted to give their name
to sign up for G+, and therefore rejected both G+ and the Google Accounts
system.

This compounded because Google really needed everyone to switch to the new
account system. Otherwise they were doomed to continue supporting both Gmail
authentication and Google Accounts authentication forever. So Google began
pushing for people to create Google Accounts, as you would imagine. People
then saw this as Google trying to force them to give their full name; trust
was already gone.

It's amazing to me that people trusted Facebook over Google, because at the
time Google had a very good record of protecting private information. Now
that's all gone, of course, but at the time they had an incredible amount of
goodwill which was seemingly not reciprocated, and now Google is a slimy,
gross privacy succubus the same as Facebook. Probably they just threw their
hands up and said 'fuck it'.

Google tried to further reduce complexity in their myriad of different
platforms. They now had two social networks, G+ and Youtube. G+ was eventually
going to absorb and replace Youtube; it would have been pretty elegant,
really. The first step was to have the Google Accounts system absorb Youtube
accounts. People resisted this, again, because at this point the goodwill was
gone. Eventually Google gave up on everything.

Now G+ is a cesspool of tacked-on features and advertising, just a bad smell
entirely. It did have a lively user base for a short while. Things really went
downhill after the Youtube consolidation because G+ suddenly had what was
notoriously one of the worst communities on the internet, Youtube comments.

Google has, I believe, still not successfully moved everyone over to Google
Accounts. Youtube was never successfully merged into G+. G+ is pretty much
dead at this point. Nothing went well at all and now Google has to support a
million different things that all barely work.

Meanwhile Youtube hasn't seen a facelift in about a decade while Google
figures out what it wants to do with it.

------
contingencies
X3D - [http://en.wikipedia.org/wiki/X3D](http://en.wikipedia.org/wiki/X3D)

Rationale: It was the 1990s, VRML flew but didn't quite make it. 3D hardware
acceleration had arrived, gaming was exploding, Descent and Quake had proven
complex worlds and freedom of movement were possible on commodity hardware,
and talk of buffering 3D content in video streams seemed to make perfect sense
for next generation passive entertainment. Product placements, architectural
documentation, mechanical design courses, etc.

What killed it (IMHO): Took too long to standardize, too much formalism, no
consumer electronics manufacturers on board sending actual products to market.
If they had done an IETF approach ("rough consensus and running code") they
might not have missed the window of opportunity.

~~~
greggman
What killed it IMO, and what will continue to kill these kinds of things for a
long time is that nearly all 3D applications need massive amounts of custom
optimization solutions. Stuff like X3D gets you a few cubes on the screen.
Then you want to do GTA or Google Maps or Minecraft or pick your app and
suddenly your generic scene graph just isn't good enough.

------
bane
I thought (and still think) the phone as a single computing device is going to
replace desktops and laptops via some kind of dock. There have been a few
attempts at it by Dell and Microsoft and a few others, but none really
satisfactory.

I think the most successful to date is the Nintendo Switch.

~~~
robodale
I thought this as well. I'd love to plug my phone into a laptop-looking-
thingy...and instantly have a keyboard, screen, and maybe a battery (to keep
the phone charged up).

~~~
ezconnect
Samsung is advertising that usage when promoting the current Galaxy phones

~~~
bane
You're right, here it is.

[https://www.samsung.com/us/explore/dex/](https://www.samsung.com/us/explore/dex/)

------
tlrobinson
The converse is possibly even more interesting: what tech were you convinced
would fail but took the world by storm?

~~~
barrkel
YouTube - it didn't look profitable and I thought they'd go out of business

~~~
hobofan
From official statements here and there, and especially the "Adpocalypse"
happening this year, it doesn't look like they are profitable even now.

Sometimes I think the only thing keeping Youtube alive as a Google product is
that the internet would rip them to shreds if they were to shut it down.

~~~
krapp
I think Google was expecting Youtube to no longer exist as an entity in its
own right, but just be a part of Google+ after they integrated all of their
properties into a massive Facebook killer social media app, but that was a
colossal failure.

Now, I think they want to use the infrastructure to build something that
competes against Netflix and Amazon, but to do that they have to destroy the
existing economic incentives for content creators and drive off as many
content derivative users as they can (anyone incorporating or using licensed
content, even as covered by fair use), and basically rebuild Youtube from the
inside out.

------
indubitable
I thought this was going to be 'The VR Thread.'

I still don't entirely understand why virtual reality didn't really catch on.
I think there were some very poor business decisions by Facebook, which was
treating it as a success before it had even launched -- trying to ring-fence a
platform, both hardware and software, whose early demographic was going to be
entirely high-information users seems a very questionable decision.

But beyond this, I don't really understand why it didn't catch on. The first
time I tried it I was hooked simply due to the issue of presence. Though
perhaps the fact I ended up never purchasing a device answers my own question
-- I had no interest in a Facebook driven platform, and the price point to get
involved with VR for things like the Vive did not mesh well with the
availability of software, as well as the fact that hardware tends to rapidly
iterate.

~~~
Animats
Spend 10 minutes with an HTC Vive or an Oculus Rift on your head, and you'll
be impressed.

Now spend 4 hours. Most of those things end up in closets. Or on eBay.

~~~
vortico
I like this way of thinking. It explains a load of products and technologies.

------
hannob
I would have expected robots and automation technologies that replace human
workers to come much faster to everyday life.

Of course this is happening in some areas, but not nearly as much as it could
happen.

Just one example: it's easy to imagine running a shop with little to no staff.
The closest thing that exists is the self-checkout lane, but even those aren't
that prevalent; the norm is still a human cashier.

~~~
blattimwind
A foreshadowing of this is that while industrial robots can do impressive
work, they do it within a very specific set of constraints:

- they are very expensive in relation to consumer income

- they are complex to program

- generally specialized: more or less general-purpose frames (like your
typical 5-axis arm) with special single-purpose tools mounted to them

- safety architecture is generally total enclosure, with no human
access/interaction during operation; if not totally enclosed, usually with
light curtains or very stringent requirements.

A general-purpose helper bot looks like a completely different beast from this
angle.

------
jawns
In early 2013 I read this article:

[https://www.sciencedaily.com/releases/2013/02/130220084442.h...](https://www.sciencedaily.com/releases/2013/02/130220084442.htm)

It described what seemed like an incredible advance in capacitor design that
would allow xenon flashes to be installed in even the slimmest of smartphones.

I consider xenon flashes, which are found in most cameras that aren't
smartphones, to be wildly superior to LED flashes, which are used in phones,
and assumed that within a couple of years, the new technology would be
ubiquitous.

But it never happened. So far as I know, the partnership between Nanyang
Technological University and Xenon Technologies never really went anywhere.

After a couple of years of waiting, I even went so far as to try to contact
the researchers to find out what happened, but never got a response.

I hope that someday this tech does hit the mass market. Although LED flash
quality has improved somewhat, xenon flashes are still so much better.

------
geowwy
I expected PCs to be in every home pretty much forever. Now a lot of non-tech
people prefer phones and tablets.

~~~
vortico
To avoid redundancy, here's why I don't think phones and tablets will take
over the concept of a personal computer in the future.
[https://news.ycombinator.com/item?id=15719836](https://news.ycombinator.com/item?id=15719836)

I believe the situation is even better. Desktops and laptops, no longer
needing to support the layman on their phones and tablets, will actually
become _more_ oriented toward the needs of power users, although their total
userbase may decrease. In a decade, Macs will no longer be heading toward
being "Facebook machines" because you can do that on your phone, so Apple will
focus more on the target userbase of creative digital artists. Windows
machines will always need to support word processors and CAD programs but in
ways that are cleaner to install and manage. Linux will keep growing to keep
up with the number of programmers in the world, so more professional and
amateur solutions will be made available.

There are too many cheap, low quality PC hardware solutions in the market now.
Their users will switch to phones and tablets. Higher class hardware will
remain the same as the power users will keep buying these.

~~~
icebraining
Yeah, but I'm not sure if I would have become a programmer if "real" computers
only targeted the professional market, while you had much cheaper appliances
that solved the needs of my parent. Will a single mother be able to afford the
difference just so her child can geek out? Turning computing into a privilege
of the upper middle class doesn't appeal to me.

~~~
vortico
The "learning programming was much easier back in the day" rhetoric is
exaggerated. Programming on any device is much easier than it was 10 years ago
in VBScript, which was easier than 10 years before. Downloading some $1 app to
write code on an iPad is way easier to dive into than learning BASIC on an IBM
PC. There are even apps that actually specifically help kids learn instead of
dumping a 200 page manual on you.

Also, phones and tablets will remain as expensive as, or more expensive than, the future "working man" desktop computer, so this isn't a matter of one wealth
class vs. another.

------
muzani
I expected microtransaction games to take off, but not in the form of "pay to
win" as it is today.

I expected things like buying side quests or episodes for games. People could
just pay for as much content as they would play. It would have been similar to
how musicians could sell singles instead of albums.

Paradox games and CS:GO seem to have adopted this model very well but it never
happened in mobile games, where microtransactions would have been most
successful.

~~~
Zpalmtree
Well, DLC is pretty common in PC games.

------
tnitsche
Mobile/Micro payments. There is still no friction-free solution that I would be sure my mother could use.

~~~
vortico
This would be my #1, as it would have drastically changed (for better or
worse) the internet. Although, it's a psychological problem instead of a technical one. You can put a PayPal button or Stripe popup on your page using
technology _today_ , which will usually take 10 seconds (given that the user
is logged into those services, since we're assuming that this micro-payments
practice is common), but it's a wild expectation that people would actually
get used to willingly giving up a nonzero amount of time to pay for something so
small as an article.

Another problem may have been this. If _everyone_ on the internet agreed that
say, watching a video should cost $0.02/minute, then people would get
accustomed to paying this amount because that's such a small cost. But it's
only a $0.02 decision for a content creator to restrict or allow it to be
watched for free, and if say, 5% of artists just decided to "eat" the
potential earnings, then you'd have 5% free videos and 95% premium videos. The 5% of free videos would prevent the other 95% from being watched,
because why would anyone pay to watch a video if you can watch another one for
free. Basically I'm saying that this idealization of micropayment content is
destroyed by free content, whether that is a good thing or not. I do this
myself with respect to Android apps, since I don't take my time seriously
enough to pay the occasional $1 per app.

~~~
alexasmyths
This is because VISA/AMEX have an oligopoly, and VISA is backed by the banks,
i.e. the entities that issue credit.

So, the financial system is 'behind' the crappy payment scheme.

Something would have to come along to basically offer credit without all that.

I'm mostly surprised that Amazon and/or Walmart have not simply done that.

I'll bet Bezos does some day: a low-interest rate 'Amazon Card' that other
merchants can use, credit provided by Amazon (not banks), and it's 0% payment
for other merchants.

~~~
vortico
That is a good point, but PayPal and other services don't need to use cards at
all (PayPal balance, attached to a bank routing number), so I don't think it's
the _reason_ micro-payments are not the norm.

A credit card alternative from Amazon would be a fantastic competitor to VISA / MasterCard and could break the lack of innovation in the personal finance industry.

~~~
alexasmyths
To use PayPal - you have to have your money in PayPal, or it will use an
underlying credit card. So, it's really just like using a bank account to pay
directly.

Most people don't like having to keep a separate balance in PayPal to buy stuff.

Also - I'm not sure if the merchant fees that PayPal charges are any less,
i.e. they didn't take the 'no fees' angle for the merchants.

And of course there is the consumer benefit: even if a system charged merchants 0% instead of 2-3%, well, consumers don't care. So getting enough
consumers to sign up is hard.

VISA / AMEX prevent merchants from advertising cheaper prices for use of other
cards - so even if a merchant had a 0% processor option, they're not allowed
to discount by 2-3% for that card vis-a-vis VISA / AMEX.

It's a tightly controlled oligopoly. It's huge, entrenched, it's the entire consumer financial system. Going after it would be economic thermonuclear war.

A disruptor will have to come from an outside angle - or - be huge and have
their own base.

So I can see Amazon just 'doing it' themselves, for their own customer base,
which equals a critical mass.

Or - someone does something 'online' that shakes things up, and that bleeds
over into the brick and mortar. A crypto-currency ... I can see doing that in
some way.

~~~
vortico
You can connect PayPal to your bank without a credit or debit card, so yes,
they can be out of the equation if the consumer wants to. I don't think PayPal
or card fees are significant in this discussion of micropayments. The issue of
micropayments is getting people to pay _at all_ , either 0% or 100%. A 3%
detail doesn't have a huge effect on why they aren't commonly used.

~~~
alexasmyths
"A 3% detail doesn't have a huge effect on why they aren't commonly used."

At brick and mortars, where they are operating on 5% margins in many cases, a
3% skim off the top is massive.

A 3% arbitrary fee for every electronic payment in north America is really,
really quite a huge thing.

It's quite a massive form of rent extraction, given that there is no real IP,
and the actual cost is almost nothing.

It's one of the biggest and juiciest forms of obvious oligarchy/monopoly that
there is.

~~~
vortico
Right, totally agree, but this is not the reason that micropayments are not
more commonplace. It's equally difficult to get people to log into a payment
system to pay $1.40 vs $1.55.

~~~
alexasmyths
The micropayments are not there because of the minimum charge on VISA
transactions of about 25 cents.

~~~
vortico
I already mentioned two ways around this, which is why your argument is void:
Holding small amounts in PayPal (or similar) accounts, and bypassing cards
entirely and using bank routing transactions. Both are solved technical
problems.

------
altitudinous
Kinect - such a human interface that doesn't require direct integration with
hardware or technical knowledge or buttons, and a lot of fun in the games I
played. But it failed.

~~~
flockonus
Maybe the product failed, but I'll argue the technology did not.

~~~
popcorncolonel
Classic Microsoft. See the Windows Phone as well.

------
api
The open web, open OSes, open in general. We (average HN users) use these
things, but most people use walled garden platforms all the time. For non-techies the trend is that way. The computing market seems to be bifurcating into pro and consumer.

Two issues have prevented this open computing for the masses IMO:

1\. Spam and malware. Any open system that gets any user base becomes spam
hell. This is even happening to the more open of the walled gardens now (e.g.
YouTube) so the future will likely be even more closed.

2\. People like "free" and with no revenue stream you can't pay programmers to
work on the parts that are not fun. This includes UX, which is a brutal grind.
Walled gardens have a revenue stream from being surveillance driven
advertising platforms users can't customize.

------
examancer
Flying cars. I remember seeing prototypes in late 80s and early 90s,
constantly showing up on Discovery Channel programs I watched as a kid like
Beyond 2000 or Next Step.

Over time it seemed increasingly far out, but possible. Eventually I realized that things like how catastrophic a breakdown or accident would be mean it will likely never happen. I and everyone else who thought flying cars were on the
horizon were suckers.

~~~
anon1253
We have helicopters though, even quadcopters that can carry people, not to
mention airplanes. But general purpose "flying cars" are a ridiculous idea:
the failure mode is catastrophic (as you pointed out), the noise will be
terrible (especially in cities), not to mention pollution (you might not care
about CO2, but breathable air is still nice). And there's regulation: since there are no roads you need centralized control (e.g. commercial/military flight control systems), which makes it almost impossible to scale and would mean sacrificing, to a great extent, the main benefit of cars: autonomy. And above
that it solves a weird problem: congestion and capacity, but those are better
tackled with self-driving cars and better public transit.

------
akkartik
RSS and Google Reader. What can I say, I was young and foolish.

~~~
vortico
The fundamental question for the users here is "should I read an article on a
website, which is designed exactly as the users should read and explore it and
has 100% support for all typographical elements, or on a third party reader
which displays a processed copy of the article and requires effort to set up?"

RSS hasn't _died_ , but it is definitely not growing, as those who prefer to
use it are already using it.

~~~
eadmund
> The fundamental question for the users here is "should I read an article on
> a website, which is designed exactly as the users should read and explore it
> and has 100% support for all typographical elements, or on a third party
> reader which displays a processed copy of the article and requires effort to
> set up?"

I'd take the plain text of an article over a JavaScript-laden monstrosity of a
single-page app any day of the week, and twice on Sundays.

I really miss being able to quickly & easily read articles, without
distraction. But then, I miss lynx — and I still harbour a deep-seated hatred
for those who have destroyed the Internet I once loved.

------
vssr
Speech to text. 20 years ago there were programs already doing a fairly good
job at it. By now I'd have expected to see people dictate documents to their
computer instead of typing them. Not the case.

------
wbl
Nuclear power. Carbon free. Reliable. No waste if you reprocess. Cheap fuel.

~~~
api
Three things are killing it:

It is exponentially harder to finance a few huge super costly projects than
loads of small ones.

It has a high "PITA factor." That stands for pain in the you know what. A high
PITA factor means there is a very long tail of non obvious problems that
compound and multiply. It looks much better on paper than in real life because
on paper that stuff doesn't show up.

An exponential is now in place for solar and storage. It would not surprise me
if almost all energy outside aviation and a few heavy industry applications is
solar and batteries by 2050.

~~~
wbl
It would surprise me. Battery chemistry has already tried everything on the
periodic table together, and storage remains expensive. Right now more solar
leads to more natural gas to cover loads when the sun doesn't shine. Load
shifting can only go so far.

~~~
Jeff_Brown
> already tried everything on the periodic table together

Shall we consider the size of the space to explore? If n is the number of
elements, at first blush it's n^2. But there are compounds -- more than nine
million organic compounds alone. Electrical and temperature treatments can
have different effects -- and those are functional spaces, you don't just
choose a current, you vary the current over time. Simple positioning can have
an effect: rotate one surface relative to another and it can behave totally
differently. Wafers behave differently than strings, which behave differently
than spheres and crystals, which behave differently than dust ...

If there was a race between our exploring the entire chemical space and
colonizing the universe, I would not know which to bet on.
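
Just to put a rough number on the size of that space, here's a toy sketch (a Haskell back-of-the-envelope I'm adding; the 118 known elements and the restriction to plain two- and three-element combinations are my own simplifying assumptions):

    -- Count simple k-element combinations of the ~118 known elements.
    choose :: Integer -> Integer -> Integer
    choose n k = product [n - k + 1 .. n] `div` product [1 .. k]

    main :: IO ()
    main = do
      putStrLn $ "element pairs:   " ++ show (choose 118 2)   -- 6903
      putStrLn $ "element triples: " ++ show (choose 118 3)   -- 266916

And that's before ratios, compounds, treatments and geometry multiply it further.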

------
ilkan
Google Glass. IMHO they mistakenly went after the consumer and advertising
markets instead of starting with maintenance workers and hobbyists. On a
related note, I also expected some mobile computers would be "cyberdecks",
embedded in the keyboard. Instead, the computer is the screen.

~~~
SimbaOnSteroids
Glass failed because it was too early for the tech. They needed the functionality of the camera they had, but in a form you didn't immediately notice. The display mechanism needed to be less noticeable as well. What killed Glass was that it made people uncomfortable.

Also, Glass relaunched in July targeting enterprise applications.

~~~
mdjorgensen
Agreed. Here's more on the enterprise relaunch for anyone interested:
[https://www.wired.com/story/google-glass-2-is-
here/](https://www.wired.com/story/google-glass-2-is-here/)

------
codesternews
Mobile Augmented Reality, or ARKit. But it was just hype, as before; I mean, nothing in the mainstream.

I think the main reason is the UX or user interface. It is very hard for people who do not know how it works and who expect it to work perfectly (in low light etc.), as with the normal user interface of a smartphone. It was not very intuitive for the normal user because of smartphone limitations.

I can see it taking off in the future if they improve the correctness of the platform. It has a lot of potential, so let's see what the future brings.

~~~
kromem
Give it time.

Just as a frame of reference, the original iPad sold fewer units in its first
year than the PlayStation VR.

Virtual reality is still in its infancy, and honestly AR is a technology
that's going to have to build on top of a lot of VR's technical underpinnings.

Hardware growth takes a lot longer and more iterations than software. VR is
going to be huge eventually, and AR is then going to dwarf VR... eventually.

There's a lot of work in wireless signal standards, battery life, displays,
and pure computational power that needs to happen before these technologies
can deliver on the promise though.

We're slightly past the Newton stage, and somewhere behind the Windows Mobile
3.1 era with these technologies. It's going to still be a while before we see
the "iPhone" for VR/AR, and even then, it'll be a few years before we see mass
adoption.

But in my opinion, the end result is an inevitability.

------
bitwarrior
Nanotubes. Seemed like it was all upside, the stuff we were going to be making
starships out of.

~~~
kobeya
It will be. When we're able to manufacture them in large quantities and with
few imperfections. Give it time.

~~~
vortico
Yup, it hasn't gone away or stopped growing. It might be taking too long for
journalists to find interest in it right now, but that has low correlation to
the research effort.

~~~
swyx
mind explaining a little more in terms of how far away we might be from being able to mass produce these things?

~~~
vortico
I'm not in the field so I don't want to try to guess how long it will be. But
of course it will be in gradual increments of cost, quality, and size. There
won't be an exact point in time where we transition to being able to mass
produce this material on a large scale.

------
eklavya
Memristor. I thought it would reshape the entire industry.

~~~
swyx
these things?
[https://en.wikipedia.org/wiki/Memristor](https://en.wikipedia.org/wiki/Memristor)

why didn't they? (at a high level... im not too smart about these EE topics)

~~~
henrikeh
AFAIK they are very difficult to manufacture, and the effects governing their behavior are poorly understood.

Another issue is that memristors are not part of any standard curriculum, which limits the number of people who will work on them.

------
j45
Palm/HP/LG WebOS - a JavaScript-based OS a few years ahead of its time in mobile form.

~~~
examancer
Came here to vote for this. I even published an app to their store during the
~2 weeks it looked like HP was going to throw their weight behind the
TouchPad.

It was really ahead of its time and most of its major metaphors can be found
in the surviving mobile OSes. Just wish the web-centric development model
survived as well.

I have WebOS on my LG TV now. It sucks.

~~~
jetti
I'm curious as to why you think WebOS on your TV sucks? I too have it and am
content with it. To be fair, though, I don't have any other smart TV OS
experience to compare it to.

I really enjoyed the HP TouchPad with WebOS but there were some rendering
issues with it that I noticed when going to certain websites. My father in law
got me one during the flash sale as he worked at OfficeMax Corporate at the
time and had access. For ~$60 it wasn't too bad at all.

------
d--b
Theranos, hands down. I don't understand why we need more than a drop of blood
for medical tests. A drop is a lot of molecules.

------
digitalzombie
Those VR glasses for augmented reality, the google glasses... but nope.

~~~
kobeya
Is it really augmented reality if it's just a HUD overlay? I was excited for
Google Glass until I found out it was more about the camera and text overlays.
I wanted stereoscopic full-frame overlays and bionic camera input...

------
phillc73
UDP for fast file transfer.

Working in the television industry at the time and a few companies were making
tools based on UDP, for sending large files across public networks. Smartjog,
Aspera, Signiant. The companies are still around, but only operating in the
B2B space as far as I know.

I thought this technology was going to replace FTP. People wanting to punt
bigger and bigger files around, they needed better tools. Smartjog even open
sourced an early version of their tool.

Anyway, it didn't happen. I'm not in the televisual industry anymore and find
myself entirely satisfied with the current file transfer tools commonly
available (FTP!). I guess there was just never consumer demand and I was
looking at things from the perspective of a very narrow niche.

~~~
swyx
that's interesting. Isn't UDP still in use today? (honestly i'm not too clear
in what context, i just know that its used in some webapps)

~~~
phillc73
Maybe it is and I'm just not aware of it. I left TV about four years ago, so
not sure if the technology has made it to consumer file transfer tools, but as
far as I know clients for services like Dropbox (mentioned below) don't use
UDP.

If they do, and I just don't know about it, then the future is here!

------
mmgutz
MS Silverlight. I thought the world needed a better Flash, and MS was behind it while IE was still popular enough. I was a corporate C# developer so I saw everything through Microsoft. It turns out the iPhone killed Flash. MS could hardly get anyone to adopt Silverlight.

~~~
sterex
This happens quite a bit even now. Many corporate Microsoft technology
developers have a myopic view of the software world. This is on a downward
trend, but still exists.

------
pera
Lisp, p2p everything, and GNU/Linux on desktop...

~~~
chx
Well, Linux on the desktop kind of happened this year, it's just running on
the Windows desktop, this time.

Here's how I set up Windows 10 coming from Linux.
[https://github.com/chx/chx.github.io/wiki/How-I-set-up-my-
Wi...](https://github.com/chx/chx.github.io/wiki/How-I-set-up-my-
Windows-10-\(coming-from-Linux\))

------
ryandrake
OpenGL: For a brief, glorious period, we had the promise of a truly cross-
platform general-purpose graphics API. Then Microsoft began its relentless
assault on it, ran its tried-and-true 1990s playbook on it and eventually
FUDded it into irrelevance on the Windows platform. For a short while, it at
least held up as a Linux/Mac solution, and survived on Windows as a second-
class citizen. For the last few years, however, Apple seems to have even
abandoned it for their own proprietary technology, dooming developers back to
the bad-old-days of needing to support N platform-specific APIs if they want
to support N platforms. As Trump would say: "Sad!"

~~~
bhouston
OpenGL is the basis of computer graphics on Android and Linux and works well on Windows. So it is fairly universal these days. It does work well. There are
always new proprietary APIs to compete with OpenGL and they usually jump ahead
but then OpenGL slowly but surely consumes them.

------
johnmw
Java applets. Back in the mid 90's Java was at the peak of its hype cycle and
was the darling language. I thought the idea of writing an app in a 'modern
advanced' language, compiling it down to a bytecode, and running it in your
browser was the future of the web. And if someone had told me back then that
Javascript would be the language of the web I would have spat my coffee out
laughing.

Unfortunately speed, security, and the need to install a separate run-
time/plugin layer were too much for people at the time.

But hey, now 20 years later we are finally starting to see that vision in Web
Assembly. ;-)

------
therealmarv
VR (yes, the standalone/PC systems or the kind you can have with your phone). I have the feeling it's dying already because most non-geeks are not really interested in it. It's the 3D TV of 2017.

~~~
zimpenfish
It's not just that non-geeks aren't really interested; it's that it has a
whole bunch of barriers as well for those that are interested.

1\. You need a reasonably powerful PC or PS4 ($$$)

2\. You need the VR kit itself ($$)

3\. You need space to use it

4\. You need to be physically able to use it

5\. You need a lack of motion sickness, etc.

6\. There's bugger all games at the moment

7\. Probably something else I've not thought of

~~~
lookACamel
7\. You need to find time to use it. (Pairs nicely with reason 3.) Currently,
VR is a separate _activity_ which doesn't fit into one's existing activities.
It's a nice experience the first time, but what's the compelling reason to
turn it into a habit?

~~~
zimpenfish
8\. Wildly unportable (pairs with 7 and 3) - you can't play it whilst
commuting, travelling, etc.

------
harrisreynolds
_Parts_ of the old Web Services stack.

It is obvious why SOAP died (to me at least). It was a solution in search of a
problem that was already solved by REST/HTTP with generic payloads.

BUT.... things like WSDL that gave you an interface description... it would be
nice if something like that had survived.

The closest thing to this that I know of is Swagger. But not having a standard
machine-readable way to integrate with REST APIs/services is a missing piece
of the puzzle I think.

The Web Services Description Language provided that. I'm surprised that or an
equivalent never got traction.

------
virtualized
Email.

I work at a software development shop and most of our internal Email-like
communication happens face-to-face or via phone.

"Will you be available for a meeting two weeks from now?"

"I just wanted to tell you that I checked in my code."

"Please answer my trivially Google-able programming question right now."

"Tell person X that I want to talk to them later."

"The boss just told me [important news that concerns several people who are
not at work today]."

I have to assume that most of the world works like that and Open Source
communities are the great exception to this rule.

~~~
adrianmsmith
I have been thinking about this recently and come to the following conclusion:

\- If you receive a call, it breaks your flow, which is bad.

\- If you send an email, it breaks your flow, which is bad. (As you have to
wait for an answer, and do something else in the meantime.)

So if you have a question, the optimal strategy (for you) is to make a call.
But if someone else has a question, the optimal strategy (for you) is they
email you.

There's even a further level, which is that if someone else has a question,
it's optimal (for you) that they email you, and you call them to give your
answer, and discuss it until it's done.

So if it's up to you (and you're thinking only about your own productivity
above all else) then tell people to email you when they need something, but
call them when you need something.

------
jetti
Sega Channel[0]. My buddy had it around ~1995 and it was amazing. Sleeping
over at his house was always unique because they would change up what games
they offered each month. There are game streaming services offered now, such
as Playstation Now and I think Gamefly has game streaming as well, but it
amazes me that it didn't catch on sooner.

[0]
[https://en.wikipedia.org/wiki/Sega_Channel](https://en.wikipedia.org/wiki/Sega_Channel)

------
simonsarris
Google Wave had so many potential routes to usability as a kind of
collaborative scratch pad.

Google Keep and the now-very-good Google Docs collaborating is what we really
needed all along, I guess.

~~~
sidcool
Google Keep is still quite low on features.

------
peterburkimsher
Smalltalk. Everybody was supposed to be able to program, not just professional
software developers.

~~~
AnimalMuppet
Believe it or not, that was the original goal of COBOL.

~~~
eadmund
And SQL!

The problem is that it turns out most human beings don't actually think in a
clear and logical manner, and thus there's a need for someone to translate
unclear and illogical statements into clear and logical instructions for a
computer — that someone is a programmer.

~~~
hyperpallium
And BASIC!

BTW I find maths, with equations that are true in both directions, much less
intuitive than programming, with functions that accept input and return
output.

I suppose maths is "clear and logical statements", code is "clear and logical
_instructions_ ". Does that mean maths is easier than programming for most
people?

------
farseer
I would have thought that by now Anti Ageing research and products would be mainstream, but most of that area is still fringe research and the products seem like snake oil.

~~~
swyx
Anti Ageing seems too broad a category to be useful. were there any specific
vectors or products you had particular hopes for?

------
true_religion
XHTML. I bought into it completely but HTML5 is a better system.

~~~
nayuki
XHTML5 works today, and my site is live proof. It helps catch dumb typos when
typing HTML code by hand. It also means the CSS is more likely to be applied
to the correct DOM tree.

~~~
Kiro
I had a look at your site and it looks like normal HTML to me, except <?xml
version="1.0" encoding="UTF-8"?> at the top. How does that help anything?

~~~
AndrewOMartin
Many sites are very subtly not normal HTML.

~~~
Kiro
Example from this site?

------
hashim-warren
I was sure the Facebook Platform would take over the world. I thought the fast
success of Zynga would create a blueprint for how all web apps would be built in
the future

~~~
zerostar07
I think it did. It's Facebook itself that killed it as soon as their ecosystem had enough content of its own.

In fact, I believe even today there's an opportunity for a social network for silly games. It's a great way to make random new friends.

------
zwischenzug
erlang.

As a programmer it was a revelatory thing of beauty. When we tried to spread
it within our org we found that the 'typical' (even relatively strong)
programmer resisted its concepts and found it hard to learn.

Therefore it didn't scale for us.

~~~
gozur88
How very odd. That's one language I found intuitive.

~~~
zwischenzug
Yeah, me too. It's an interesting thing. Letting go of C syntax and the imperative mindset is hard for a lot of people. Especially if they're not motivated to make the effort.

------
awinder
I was pretty convinced that microtransactions were going to become super
ubiquitous and solve a lot of monetization problems. Particularly in terms of
press / media. I think you could make arguments that this still might be true
in a long-term sense, and there's some movement in this space still, but I
don't think this has panned out to be the slam-dunk hit that I thought it
would be.

------
deerpig
For me it was VRML. I was working with SGI (not for) in Hong Kong at the time
and it was amazing what you could do with it. They were even pushing it for
use in interactive 3D banner ads. I went to the Tokyo event where they
streamed someone in a motion capture suit in Mountain View with the animated
figure being rendered on a browser in Tokyo. Very cool stuff. I wish it had
taken off.

------
DoubleGlazing
Interactive digital TV.

Back when digital TV launched in the UK in late 1998, the major platforms (Sky, ONdigital, Telewest and ntl) were all hyping up their interactive services.

Sky's service was called Open....
([https://en.wikipedia.org/wiki/Open....](https://en.wikipedia.org/wiki/Open....))
and you could order pizza, email, go shopping, dating, betting etc. Sky were
so convinced it was going to be a money spinner they gave away all the boxes
for free.

The underlying technology was also used to add interactivity to TV shows. For
example you could choose the camera angle in some football games and the BBC
made a documentary about dinosaurs that worked like a multimedia CD-ROM.

Most of these services lasted less than five years; they were sluggish, the proprietary tech didn't help, and in any case the web did it better.

I didn't think the shopping/commercial side would succeed, but I was convinced
there would be a huge market for interactive TV. Nope, there wasn't.

~~~
arethuza
I remember looking at some of this stuff when I worked at an interactive TV
company in the early 2000s - I remember being deeply troubled looking at the
MHEG spec:

[https://en.wikipedia.org/wiki/MHEG-5](https://en.wikipedia.org/wiki/MHEG-5)

------
Blazespinnaker
Yahoo pipes. I am still holding out hope for this, especially AI pipelines.

~~~
vax
I really miss pipes. Wondering how much we'd have to pay Yahoo for the source
code. Think they'd take $100 for it? :)

------
too_tired
End-to-end email encryption.

~~~
vortico
Meaning PGP, or services like ProtonMail?

If PGP, it's a huge hassle to set up, and _both_ parties have to do it, so
it's no surprise it is rarely used.

------
ilkan
I expected Remote Desktop to be on every system... it's really frustrating
that I can't remotely see and configure my elderly mom's iPhone screen when
she needs tech support. "Do you see an x? Or is there something that looks
like a backwards arrow? No? A less-than symbol? Or the word Back? No? Cancel?
Maybe at the bottom?"

~~~
Viper007Bond
[https://www.teamviewer.us/features/ios-screen-
sharing/](https://www.teamviewer.us/features/ios-screen-sharing/)

------
Steel_Phoenix
Projection mapping. It seems like a natural step between standard interfaces
and AR. We've had the tech for a while. I want devices that use projected
touchscreens. I loved the idea of Augmented Reality Sandbox as a type of toy
for my kids. It seems like the tech got skipped over in anticipation of an AR
tech that has yet to hit market.

~~~
hyperpallium
I liked this too. There have been some phones with it; problem is projectors
use a lot of power.

------
VLM
Token Ring Networking as a protocol/concept (not so much physical IBM
hardware, of course) rather than CSMA/CD.

It naturally makes sense that allocating BW via a token would be much more
power efficient and faster than transmitting into the void and hoping for no
collision. It's hard to get line-rate traffic with CSMA/CD, but the financial
services co I was working at back then had very high util in the upper 90%s
for hours.
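
To make that intuition concrete, here's a toy Haskell sketch I'll add (an ALOHA-style slotted-contention caricature, not real CSMA/CD with carrier sense and backoff, so take it as illustration only): under saturation a token ring hands each slot to exactly one station, while naive contention wastes slots on collisions and silence and tops out well below line rate.

    -- Toy model: n stations each transmit in a slot with probability p;
    -- a contended slot carries data only if exactly one station transmits.
    contentionThroughput :: Int -> Double -> Double
    contentionThroughput n p = fromIntegral n * p * (1 - p) ^ (n - 1)

    main :: IO ()
    main = do
      let n = 20
          p = 1 / fromIntegral n  -- the best case for pure contention
      -- With n = 20 this prints ~0.377: barely over a third of slots carry
      -- data, vs ~1.0 for a saturated token ring (ignoring token overhead).
      print (contentionThroughput n p)

Real CSMA/CD does better than this caricature, but the gap at high utilization is the effect described above.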

I also miss ATM and its complicated and semi-obscure adaptation layers, pretty
cool although toward its death it was just a weird way to send pt-to-pt
ethernet frames. ISDN was like that too, a whole protocol stack, pretty
interesting exciting ideas, toward the end it was just a 128K pt-pt interface
for internet access and of course a voice trunk signalling protocol replacing
old E+M signalling. Both could have been interesting/cool. X.25 switching
never really took off either.

~~~
sah2ed
I did a cursory search [0] on Token Ring vs CSMA/CD and it appears that "worse
is better" is why CSMA/CD won out.

CSMA/CD was much cheaper to deploy than Token Ring.

"Token-Ring has been a bit of a mystery for many people. This is due to the
fact that Ethernet, and other Carrier Sense Multi Access - Collision Detection
(CSMA/CD) networks, are the most widely installed network topology. This is
because most network designers _cannot look past the initial cost_ of Token-
Ring. While Token-Ring does cost more per port to install, it offers vast
benefits over Ethernet and other CSMA/CD topologies."

[0]
[http://www.let.rug.nl/bosveld/algoritmiek/aptokenhc7.htm](http://www.let.rug.nl/bosveld/algoritmiek/aptokenhc7.htm)

------
AndrewDucker
Asynchronous circuits:
[https://en.wikipedia.org/wiki/Asynchronous_circuit](https://en.wikipedia.org/wiki/Asynchronous_circuit)

I was convinced that we were going to enter a new phase of computing, where
the processor wasn't stuck at the speed of its slowest component.

~~~
baldfat
It will have to happen sometime in our life time.

------
hnmullany
I wrote a whole blog post with source data on the history of Gartner's tech
predictions in its Hype Cycle.

[https://www.linkedin.com/pulse/8-lessons-from-20-years-
hype-...](https://www.linkedin.com/pulse/8-lessons-from-20-years-hype-cycles-
michael-mullany/)

------
albeebe1
I remember going to a cafe in Harvard Square called Cybersmiths in the mid
90s. You got a card, loaded some money on it, then you could surf the world
wide web at high speeds. I thought you were going to see these types of cafes
everywhere, but high speed internet to the home put them out of business.

~~~
ufo
These still exist in many third world countries. A similar service is also
available for PC gaming because even when people have a PC at home with
internet, they might need a better PC with a better connection to be able to
play their favourite games.

------
1065
Holidays on the moon never got the traction I was hoping for.

~~~
Jeff_Brown
This will happen for sure, at least for princes and the like.

------
DannyB2
Ten years ago I was debating with a friend that GPUs would gradually change
into massive general purpose processors. Instead they have remained special
purpose and difficult to program.

My thinking was that with a large number of cores of processors that are more
general in nature, you could still do great graphics. But you could do many
other things as well. There are plenty of problems that are embarrassingly
parallel if done right. There are also plenty of applications for adding
significant amounts of computational power at the cost of a good graphics
card. Computer vision. Speech recognition. Speech synthesis. Photo and video
editing filters and processes. Even amazing screen savers.

~~~
lookACamel
Well deep learning is pretty general purpose so ...

------
RalphJr45
I thought the laptop arms race would be about raw power and battery life, not weight, thinness and style.

~~~
lowry
Competition in laptops has stalled; it is probably ripe for disruption. Looking at the shameful ThinkPad 25, and the hype around it... there is a market for a
developer laptop that does not suck.

~~~
hyperpallium
Laptops were disrupted years ago by netbooks, which got killed by phones.

------
croisillon
Sony's MiniDisc music format!

~~~
VinzO
I came here to say that. It was much more convenient than burning CDs.

------
qilo
Large OLED panels everywhere: TVs, monitors, laptops, etc.

There was a lot of excitement about how, once blue color longevity issues were solved, it would be easier and thus cheaper to manufacture and would soon outcompete LCD. 20 years later, still waiting.

~~~
d3ckard
That one is just becoming true I guess.

------
raphinou
Sun-readable laptop screens. The OLPC had one, and a tablet was also produced with such a screen, which could switch from color mode to e-ink mode. It amazes me we still don't have sun-readable screens for laptops and smartphones.

------
vadimberman
Segway and Code Morphing Software by Transmeta.

------
andrewstuart
CDROM & "interactive multimedia"

Linked Data & "Open Data"

On the other hand, I really wasn't convinced about the web until I essentially
missed the giant early opportunities, which I was well placed to capitalize
on.

------
aryehof
I thought that we would see a world where we would increasingly "mash-up" a
solution using different services accessible through APIs. One where service
and data providers would provide a programmable interface to their
functionality (and data), perhaps in addition to their own user-interface.

It was conceivable for a time; however, it didn't occur to me that doing so would mean no advertising dollars for them unless they made their offerings "walled gardens".

Truly to me, it seems that "advertising, marketing and consumerism" are what
makes the world go around.

------
no_gravity
VR glasses / headsets

I remember when I tried the first headset sometime between 1999 and 2002. I
thought "This will be big". 15 years later it's still not big.

Tablets

I expected tablets to become omnipresent. But phones took that spot instead.

------
coke
Mach microkernel and GNU Hurd

------
with_a_herring
Plan 9 from Bell Labs

~~~
PeachPlum
I was a user from 2000. The first question new users would ask was "where's
the web browser". You basically had to run two computers, one with
Linux/Windows and one Plan9.

The next question was "what are the keyboard shortcuts" and the answer was
"escape toggles text entry in the shell, research says using the mouse is
quicker".

It was a hard sell.

------
grandalf
It's fascinating to read everyone's responses. Mine are:

\- mobile devices with extremely long battery life (weeks)

\- mobile devices that can be put into a dishwasher and do not need to be
treated gently.

\- HashCash

\- non-ai personal assistant/concierge services

\- Theranos-like tech

\- Teledildonics

~~~
hyperpallium
> mobile devices with extremely long battery life (weeks)

Problem is charging overnight solves this for most people, and they turn to
other needs.

~~~
grandalf
Maybe. But I see lots of cases for sale w extra batteries, public charging
stations, portable batteries, etc. I have to scramble to avoid a dead phone a
few times per month, usually after not charging the night before.

------
kapilkaisare
I remember holding the ideas behind Jini (now practically defunct Apache
River[0]) in very high regard.

[0]: [https://river.apache.org/](https://river.apache.org/)

~~~
erik_seaberg
Thank you, I actually forgot about how excited I had once been about untrusted
mobile code.

------
knackers
Google Wave. RIP T_T

------
peterburkimsher
Airships. I wish they got off the ground.

~~~
work_account
...ba-dum- _tss_...

------
TallGuyShort
Adapteva's Parallella. I don't follow the semiconductor industry that closely, so perhaps very similar architectures are becoming more common, but as someone with a passing interest in HPC I was very excited about their crowdfunded board for playing with parallel programming on an architecture built just for that, but it suffered from many problems typical of crowdfunding. It was
delivered late with a lot of business problems leading up to it. I played with
it a bit and haven't heard of it otherwise since.

------
jFriedensreich
The famo.us frontend framework. The initial demos and vision seemed impressive and I fell for it. Unfortunately they got overambitious and failed to get something solid enough out fast enough.

------
c_shu
1\. E-book and products like evernote/google keep. I rarely write things on
paper now because I often experience the difficulty of copying/searching. On
computers copying/searching is a breeze.

2\. VoIP, Skype, WhatsApp, etc. Many years ago I thought they could totally replace traditional phone calls, at least for 99% of users. But that didn't
happen.

3\. Technologies for telecommuting. Less congestion, less pollution. And it
saves both time and money. But it's still not so popular. (In Asia, it's very
rare.)

------
jraines
NFC, Wave, Glass, semantic web, the DCI architecture; a few frameworks that I
wouldn't want to speak ill of because they're not dead yet and their
maintainers pop in here.

~~~
blowski
NFC seems to be rather popular.

~~~
sigi45
I bought my NFC phone 3.9 years ago and never used it. Apple is now doing something with it and it might come, but it should have taken off way earlier.

Right now I already have a contactless credit card and that works like a charm.

~~~
blowski
Correct me if I’m wrong, but don’t Apple Pay and Android Pay already use NFC?

~~~
sigi45
That's what I meant with Apple. Android Pay exists? Is it usable? Haven't seen that anywhere.

~~~
blowski
I live in London, and both Apple Pay and Android Pay are almost ubiquitous
here. I recently hired a rowboat from a guy renting them at the side of a
river in the countryside - and paid with Apple Pay.

Admittedly, I’ve no idea how much either are used, and how widespread they are
outside London.

~~~
sigi45
Nice, still hoping for it to arrive in Germany.

There was one supermarket where I was able to pay with their app (it used a 4 digit code). It was quite nice to receive the digital receipt by email.

It would have been easy to run automatic analysis on them, but after just a few weeks the supermarket switched brand.

~~~
germanier
Almost all German supermarkets actually support them. It's just the banks that
don't offer it. If you make use of your EU freedoms and open an account abroad
you could use Android/Apple Pay in Germany.

------
Tade0
Leap motion.

I bought one used just three months after rollout - should have seen this as a
red flag back then.

The user experience was great - for the first 30 minutes. After that, the pain from holding my hand in the air became too annoying to ignore.

Switching between this and the mouse/touchpad proved to be a great optimum
though.

Anyway I successfully used it for my master's thesis to precisely place a few
microphones in space, but after that I stopped using it altogether - too few
useful applications to bother.

------
magoghm
4GL (Fourth Generation Languages). I never thought they would take the world
by storm, but many people in the 1980's thought they were the future of
(business) software.

------
alexee
Online accredited bachelor's degrees in CS. It seems there is a lot of resistance in this area too; why is it taking so long for Coursera/Udacity to implement this?

------
monk_e_boy
API / AI government. It seems that we could replace a lot of politicians with
a simple bash script.

Voting reform.

It seems that with fake news and real news labelled as fake news, stupid
facebook memes, junk internet adverts on everything... I thought the internet
would be better. It started off with so much promise. Lots of smart people
connected together, they all chanted "Get the masses online!" ... it turns out
that comes with its own set of issues.

------
jraby3
3D printing. I thought it'd put China out of business and we'd all be able to
3D print random plastic Chinese goods on demand from everyone's home.

------
zimpenfish
Fractal Image Compression. Way better than JPEG in the mid-1990s but patents +
expensive restrictive licenses left it pretty much dead in the water as a
result.

------
dquail
Biometric Auth. My first job out of university in 2003 was with a biometric
security company. Even back then the technology was shockingly mature. But getting it deployed was so difficult. I also remember in 2004 going to my
favorite waterslide park and being able to use my fingerprint to get in and
out of my locker - rather than an awkward key or combo. But a year later those
lockers were replaced by clunky combo ones.

------
em3rgent0rdr
Fuel Cells.

~~~
donpdonp
Yes Hydrogen Fuel Cells! Toshiba was claiming in 2006 that they were coming
for laptop batteries. I still think the future of energy is solar cells, using
water+electrolysis to get hydrogen for storage, and a fuel cell for conversion
back to electricity.

------
ricardobeat
Bump, a technology that allowed you to match two devices (phone and/or
computer) over the internet. You did this by literally bumping them together,
or bumping the phone against your spacebar on a computer. A mix of
geolocation, timing, and accelerometer data.

I used it a lot for file sharing, exchanging contact cards. Worked like magic,
until Google bought and murdered it.

------
fallingmeat
Webvan. I could have ordered groceries to my door!

~~~
swyx
i mean.. you were right... eventually!

------
magoghm
NeWS (Network Extensible Window System).

Although, we now have web browsers + JavaScript (not exactly the same thing,
but there are some similarities).

------
ekianjo
3DO. I was not "convinced it would take the world by storm" but I really liked
the idea of a console standard that anyone could manufacture, all compatible
with each other. The PC equivalent for consoles. Too bad it did not work out
(and there are many reasons why it did not, but that's not the place here to
discuss it).

------
santaclaus
3D printing.

~~~
api
It's huge in prototyping and is being used to build parts that can't be made
any other way. Just never took off for personal use.

------
Iwan-Zotow
Plan9 (or rather whole stack of P9/Alef/Inferno/Limbo). Seems so elegant,
small, could run everywhere

------
roryisok
Windows phones! And before those, minidiscs

------
CM30
From the gaming world, some I believed would be popular (but ended up failing)
are:

1\. Augmented reality. When I played around with it on the 3DS, I thought the
idea was going to really blow up and end up having a ton of games based around
it. And in the first year or two, we did get a few like that. Such as Kid
Icarus Uprising having said features or Spirit Camera being entirely based on
the technology.

But it quickly died off afterwards, and since about 2012 I don't think I've
seen a single major game on the system where AR has been a central feature.
Same goes with other consoles and systems too. It's occasionally been
advertised (like with Microsoft and the Hololens or whatever it is), but it's
generally remained a niche idea.

2\. Also, the eReader. No, not the Kindle type, that thing the Game Boy
Advance had in the early 00s where you could scan cards to unlock features in
games. I genuinely believed that would be a huge revolution, to the point of
importing the device from America to try it out. Alas, it failed pretty damn
hard, and even the titles which had support for it in the US dropped said
support in the European versions.

3\. For non gaming stuff, VRML was a good example as well. Again, I thought
the future of the internet would be 3D worlds accessed through the browser,
and looked at the sites writing about it as if they were a glimpse of a high
tech future only a few years off in the distance. Nope, VRML failed, and
pretty much every attempt to implement VR functionality in the browser died
too.

4\. Also, I'm not sure if it counts as revolutionary on a tech level, but at
one point there was a lot of talk about using oauth to tie various communities
together into something akin to a forum network or Reddit equivalent. I think
a group called Zoints tried this in the early 00s or so, and I expected their
service to do pretty well off it.

Again, didn't really happen. Shared login systems did, in the most basic sense
(login with Facebook/Twitter/Google/whatever) but it seemed people preferred
walled gardens run by large companies over individual communities networked
together.

Finally, there were an awful lot of Google products and services I expected to
take the world by storm too. Google Wave has already been mentioned, but
Google Buzz was another one too. Think it acted as a neat alternative to using
Facebook comments or Disqus when it was active.

But yeah, my record with tech predictions is not exactly a great one.

------
dexterdexter
Segway! I'm surprised no one else mentioned this. The hype equaled the
excitement it triggered in people. Personally, it signaled the arrival of the
future at the time. Fast forward to today and they are just niche
transportation devices used by tourists and to a lesser degree by some
security officials.

~~~
krapp
The hype with Segway was high as long as no one knew what it was... for a
while people were seriously speculating that it was an anti-gravity hoverboard
like from Back to the Future or something just as exotic - but as soon as
everyone found out it was just a scooter (albeit a slightly clever one), the
hype vanished.

------
rodolphoarruda
Wearables. I was expecting much better sensors and services to be consumed,
especially in the healthcare field.

What I see now is extreme attention to design and looks of smartwatches, but
little talk about the type of value they could be adding to people's lives when
connected to online services, phones or what have you.

------
trhway
Quantum computers. I thought so until I spent some time looking at superposition & entanglement and got convinced that there is no superposition & entanglement. Looking at the published Bell inequality experiments, I see exactly the opposite of what those experiments supposedly confirm :/

------
MarkMMullin
The original idea of the internet as a highly distributed fault tolerant
system that could route around localized failures. I will admit that part of
that philosophy emerged when leaving arpanet and trying to get stupid bang
email addys and netnews flowing over dialup :-)

------
ezconnect
Google Glass. When I first saw it, I thought it would be the greatest tool
every human should have.

------
arca_vorago
Wireless power. In about 2007 I was convinced it was "the future", but other
than a few mobile devices with "lay it directly on top of this charging pad"
setups, the safety and power-loss issues seem to be pretty large barriers to
overcome.

------
NautilusWave
Mirasol display technology. I wouldn't be surprised if the display's colors
simply weren't saturated enough to be marketable; but I still pine for a
screen that I could actually see better under bright light instead of
struggling with glare.

------
epx
Windows Phone

------
virtualized
Solid State Drives.

In 2012 I did not expect that you could still buy $2000 laptops with spinning
rust in 2017.

~~~
dsschnau
yeah but that's just because spinning platters are still cheaper by the byte -
if you don't care much about performance you buy that. But any computer worth
using (imo) has an SSD in it, and that's been the case since 2012.

------
vog
Parser generators. They get better and easier to use every year (PEGs being
the newest kid on the street), but we still see loads of buggy ad-hoc parsers
with loops and regexes. And many of those bugs turn into actual security
issues.

~~~
Jeff_Brown
Yes, these are great! Text.Megaparsec.Expr blew my mind. It lets you write
something that can evaluate expressions like "(3 + 4/(-2)) * 7" \-- a nested
expression parser, with prefix, postfix and infix operators of varying
associativity and precedence -- in 12 lines of code. I made a repo about it
[here]([https://github.com/JeffreyBenjaminBrown/megaparsec/tree/mast...](https://github.com/JeffreyBenjaminBrown/megaparsec/tree/master/Expr-
studies)).
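
For anyone curious what that looks like, here's a minimal sketch written from memory against the megaparsec-6-era API (where these combinators lived in Text.Megaparsec.Expr; newer releases moved them to Control.Monad.Combinators.Expr), not code from that repo, evaluating integer expressions like "(3 + 4/(-2)) * 7" directly:

    import Control.Applicative (empty, (<|>))
    import Data.Void (Void)
    import Text.Megaparsec
    import Text.Megaparsec.Char
    import Text.Megaparsec.Expr                       -- makeExprParser, Operator(..)
    import qualified Text.Megaparsec.Char.Lexer as L

    type Parser = Parsec Void String

    sc :: Parser ()
    sc = L.space space1 empty empty                   -- skip whitespace, no comments

    symbol :: String -> Parser String
    symbol = L.symbol sc

    term :: Parser Integer
    term = between (symbol "(") (symbol ")") expr
       <|> L.lexeme sc L.decimal

    expr :: Parser Integer
    expr = makeExprParser term
      [ [ Prefix (negate <$ symbol "-") ]
      , [ InfixL ((*) <$ symbol "*"), InfixL (div <$ symbol "/") ]
      , [ InfixL ((+) <$ symbol "+"), InfixL ((-) <$ symbol "-") ] ]

    main :: IO ()
    main = print (parse (sc *> expr <* eof) "" "(3 + 4/(-2)) * 7")

Each inner list in the operator table is one precedence level, so adding an operator is one more entry rather than another layer of hand-rolled recursive descent.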

------
smt88
Google Glass

------
chx
Intel Itanium. Hell, _everyone_ was convinced!
[https://www.pcmag.com/article2/0,2817,2339629,00.asp](https://www.pcmag.com/article2/0,2817,2339629,00.asp)

------
brudgers
THOR-CD,
[http://articles.latimes.com/1988-04-22/business/fi-1732_1_co...](http://articles.latimes.com/1988-04-22/business/fi-1732_1_compact-
disc)

------
realrocker
Smartwatches. I was convinced enough to bet two years of my life working on
them.

~~~
pimmen
I have two big reasons why I didn't get a smart watch. Number one is the
price, I just don't think it's worth hundreds of dollars not to pick up my
phone when I get a notification.

The other is that the interface is too small to do anything but consuming
notifications or looking at a compass. And, I just don't see the problem being
solved by speech recognition either, I don't want the rest of the bus to know
that I have to Google "where do I buy bigger condoms?".

~~~
sigi45
My reason: Don't like to add another thing on the power cord every night.

------
stfnhrrs
:CueCat. This device allowed you to open up a link from a magazine or newspaper without typing it - truly a timesaver! I can't figure out why everyone wasn't using these. It also had a really cute form factor.

------
JustSomeNobody
Mesh networking.

I figured by now the cost of nodes would be so low and the technology so far
more advanced that everyone could just stick a few nodes around and we'd have
a completely free, open and decentralized internet.

------
swah
Segways - I thought every one moving by more than a block would be on one.

------
antfarm
Ubiquitous P2P networking.

~~~
api
I see more and more of that.

------
Raphmedia
Pokemon Go. It did take the world by storm but promptly died due to
incompetence from Niantic. One can only hope that they didn't set enthusiasm
for AR games too far back.

~~~
jetti
I don't think Niantic was responsible for the decline but just the game itself
was. It only garnered the attention it did because of the Pokemon name but at
the end of the day the game was a grind. I saw a lot of people lose interest
because there was nothing really unique about it. Everybody that I knew that
played it turned the AR off as well, so it became just another game.

~~~
Raphmedia
> I don't think Niantic was responsible for the decline but just the game
> itself was.

They re-skinned their previous game and only added a gym fighting minigame.
It's entirely their fault. Even today the game doesn't have all the features
shown in the trailer. Their game has no end-game content either so even the
core demographic lost any reason to play.

~~~
jetti
That is true. I guess you can't blame the game without putting the blame on
the developers. I'm honestly shocked that the game got approved by all parties
involved, though I'm sure it still made a bunch of money for all parties
involved.

------
wj
Palm. A computer (some with wireless Internet) in your pocket!

------
technofiend
Iridium held so much promise and was such a disappointment.

~~~
swyx
the radioactive element? what was the promise you were excited about?

~~~
theotherplanb
For the record, Iridium is stable.

------
GFischer
I thought videocalls and live video-shopping would be ubiquitous (and I put my
money where my mouth was).

I still think we're due more interactive shopping experiences.

~~~
michaelmior
Aren't video calls pretty ubiquitous now? (Facetime, Skype, etc.) Not in
connection to shopping though.

------
Heraclite
NFC on the phone.

I thought it would revolutionise lots of things. Turns out it took 10 years
longer than I expected and is just a "fun" feature for now.

~~~
jpatokal
NFC & Android/Apple Pay is actually kinda amazing if you live in a country
like Australia where tap to pay is universally supported. It's starting to get
traction for transport smart cards as well.

------
magoghm
Lisp for AI. Frames for Knowledge Representation.

~~~
Jeff_Brown
Frames might come back: [https://research.googleblog.com/2017/11/sling-
natural-langua...](https://research.googleblog.com/2017/11/sling-natural-
language-frame-semantic.html)

------
robk
OS/2

~~~
swyx
why were you convinced it would take over the world? I'm pretty ignorant on OS
history, but would love to learn.

------
godisdad
Types.

~~~
Jeff_Brown
And purity! And I'm still waiting for dependent types. Idris looks beautiful
but I'm afraid to commit to such a new language.

------
King-Aaron
Betamax :(

------
lldata
Scala ... now betting on Kotlin to replace Java.

------
ainiriand
Any GNU/Linux desktop for the common user.

------
krapp
I didn't expect it to take the world by storm, but I expected Hack to be a lot
more popular by now than it seems to be.

------
amelius
I was convinced at some point that CPU clocks would become much faster than
the 4GHz that seems to have become the limit.

------
deepnotderp
Silicon on insulator.

I was convinced the entire industry would rapidly adopt it and that it would
quickly replace bulk CMOS.

------
mindcrime
Various approaches to authentication / identity management:

OpenID, WS-Federation, etc.

Multicast

XHTML, XQuery/XPath/XLink/XPointer/etc.

------
antfarm
Interactive computer simulations of complex adaptive systems, i.e. flight
simulators for decision makers.

~~~
swyx
i remember the Scorpion guy claimed to have built something like that and
somehow applied it to war scenario modeling in Afghanistan? Smelled like a
crock of b.s. but then again I don't have a 200 IQ

for those who don't know about it: it's this thing:
[https://scorpioncomputerservices.com/scengen](https://scorpioncomputerservices.com/scengen)

------
eyko
The semantic web, and speech to text.

------
susi22
Spaced repetition is the most efficient way to learn associations. Though,
it's still barely used.

------
apapli
100VG-AnyLAN. It was superior to Ethernet in performance, but its proprietary
nature held it back.

------
firozansar
Personally I thought Google Glass had the potential, but technically it's not
dead yet.

------
pknerd
Internet Explorer... kidding :P

------
Lapsa
Microsoft LightSwitch :->

------
foobazzy
Nokia + Belle operating system. It was a beautiful future I dreamed of.

------
magoghm
Fractal image compression.

------
fixermark
TechShop. ;)

(context: [https://techcrunch.com/2017/11/15/techshop-shuts-down-
all-u-...](https://techcrunch.com/2017/11/15/techshop-shuts-down-all-u-s-
locations-declares-bankruptcy/))

------
dejv
Smart homes, with computer-controlled switches and automation.

------
peterkelly
Linux as a desktop OS

~~~
k3a
Unfortunately. I still use it as my primary desktop but I see why people
prefer others.

It seems like most people don't want flexibility and configurability, and
they don't want to learn to manage an OS. They just want to use a PC
comfortably and not have to care about anything else.

For example, one of the few things I find annoying on Linux is the multiple
GUI frameworks, each having its own OpenFileDialog, so I can't always easily
open a recent folder or recent file. It looks totally different in Qt and in
GTK.

I think the Linux kernel is very solid and cool, but userspace is a mess;
there is probably not enough standardization. For example, X11 (hacky,
security problems) vs. the still buggy and unfinished Wayland. :( Compare that
to the OSX frameworks.

Yet I love free software, appreciate all the effort people put into it, and
will continue using it. Maybe one day I will also be able to help GNU/Linux
improve.

------
anonymous5133
Cell phone coverage broadcast by satellites. The FCC eventually killed off
the plan. A company called LightSquared launched a satellite to make it
happen.

------
reacweb
Google glasses. For me, a smartphone should not have a display (it should
only be a touchpad), and the glasses should be the display+camera+headset.

------
the_decider
Browser extensions

------
saluki
QR Codes . . . although they are BIG IN JAPAN

~~~
dnh44
Well you should see how common they are in China.

~~~
mszcz
I don't know and I can't tell if that's sarcasm ;)

------
pers0n
E-ink, RSS, virtual reality, WebRTC, Firefox OS

------
amelius
Laptop battery power that lasts a month.

------
dogcow
XMPP - federated instant messaging

------
GnarfGnarf
Bubble memory. Josephson junction.

------
ananthrk
Semantic web

------
magoghm
The Eiffel Programming Language.

------
atomicbeanie
Google Wave

------
antfarm
Beacon technology in shops.

------
magoghm
FORTH.

------
magoghm
Cyc.

------
anodari
Segway

------
magoghm
The Connection Machine.

------
0x4f3759df
Linux on the Desktop.

------
zerostar07
VR, until i tried it.

------
magoghm
Genetic algorithms.

~~~
redeploy-test
Only a matter of time

------
MBO35711
Space flight. Sigh!

------
cjsuk
Windows Phone 7 :(

~~~
jetti
I really liked aspects of my Windows Phone but there were other things that
just drove me crazy. It would lock up when I was on a call and I wouldn't be
able to do anything. I'd have to pull the battery in order to get it back to a
usable state. The speakers were terribly soft too, I couldn't hear my phone
ring if it fell under the couch. I absolutely loved the live tiles though.

~~~
cjsuk
Yes, that was always the problem. If it had worked properly it would have
been amazing. But it didn't. My daughter has a Lumia 650 and it's a buggy pile
of crap. That's SOP for MSFT mobile products.

------
rssllm
The Physical Web.

------
magoghm
Space Colonies.

------
magoghm
Expert systems.

------
bstamour
xhtml: it's just a good idea, IMO.

------
wokawoka
Microsoft Bob

------
sicher
Magic wands.

------
losteverything
Video calls

------
Jeff_Brown
Anybody who thinks reading and writing is powerful, and anybody who thinks
sharing is powerful, ought to be excited about knowledge graphs.

Knowledge graphs are powerful -- they underpin Google search, Facebook, Siri,
etc. And open source tools exist for keeping your own knowledge graph. (Here's
my favorite:
[https://github.com/synchrony/smsn/wiki/](https://github.com/synchrony/smsn/wiki/).
It has, to my knowledge, two ongoing users.)

I was part of a small study group once -- four to seven people, meeting daily
for a couple hours to share economics notes. It was critical -- I could not
have made it through the first year of grad school otherwise. Knowledge graphs
can scale to far greater numbers of users.

So many people are biting their nails about the rise of AI. I believe we could
leverage existing technology to bring about human superintelligence before
then.

Internally, our minds work nonlinearly, but when it comes to written media, we
process text linearly. That is slow, wasteful -- if someone writes a book full
of gems of wisdom, but scatters redundant illustrations, obvious examples, or
unnecessary motivating passages between them, you've got to wade through that
chaff in order to find the gems. Knowledge graphs let us write and read
nonlinearly -- faster, better targeted, hence able to cover more information,
more kinds of information. It's like tables of contents all the way down.

Writing is great because it allows a reader to build on the work of earlier
writers. Knowledge graphs, in addition to that, let readers build on the work
of other _readers_. If I mark a passage as "obvious" or "critical" or
"beautiful", that metadata can guide another reader. Online systems already
use this idea to some extent -- facebook likes, reddit and HN votes, etc. can
help inform reader choices. But knowledge graphs in principle allow for
arbitrarily general metadata -- for instance, "show me every statement [group]
has written about [topic] which has been marked useful for [goal] by at least
[number] readers".
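
As a rough illustration of that kind of filter (a toy in-memory sketch in
Python with a made-up data model -- not the smsn API or any particular graph
store), the query might look something like this:

    # Hypothetical sketch: statements annotated with reader metadata,
    # filtered by group, topic, goal, and a minimum count of endorsements.
    from dataclasses import dataclass, field

    @dataclass
    class Statement:
        author: str
        group: str
        topic: str
        text: str
        # reader annotations: goal -> set of readers who marked it useful
        useful_for: dict = field(default_factory=dict)

    def query(statements, group, topic, goal, min_readers):
        """Statements from `group` about `topic` marked useful for `goal`
        by at least `min_readers` distinct readers."""
        return [
            s for s in statements
            if s.group == group
            and s.topic == topic
            and len(s.useful_for.get(goal, set())) >= min_readers
        ]

    notes = [
        Statement("alice", "econ-study-group", "game theory",
                  "Backward induction fails under imperfect information.",
                  useful_for={"exam prep": {"bob", "carol"}}),
        Statement("bob", "econ-study-group", "game theory",
                  "A restatement of the textbook definition.",
                  useful_for={"exam prep": {"alice"}}),
    ]
    print(query(notes, "econ-study-group", "game theory", "exam prep", 2))

The point is only that once annotations are structured data rather than prose,
"show me what this group found useful for this goal" becomes a mechanical
filter instead of a rereading job.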

Publishing even a small body of useful, organized information can be extremely
valuable. Craigslist, for instance (although there was an obvious economic
argument for publishing the information on Craigslist).

In fact, economics might be making us stupid. Non-monetizable information is
of enormous importance. Globally, we are experiencing an epistemic crisis.
Huge swaths of the public mistrust science, journalism, history. We appear to
be at least nearly as susceptible to fascism as we were in the thirties. The
arguments for freedom of speech, or civil rights, or being kind to strangers,
ought to be as obvious to us as which job pays how much money.

Camus talked about something he called "philosophical suicide", wherein
someone stops trying to think high thoughts. They fall into an economic
routine, they specialize, their awareness narrows. They might excel at what
they do. But a world full of such narrow thinking is a dangerous place.

I had heart surgery, almost died, and for years after, knowledge-gardened
furiously. I wrote, reviewed, organized, categorized. I pondered long passages
and reduced them to a few words, which became easier to process. What is
pleasure? What are the elements of consciousness? Where is my boundary of
certainty in ethics? Can I enumerate the state space of a conversation? It was
a transformative experience -- I changed from an awkward, selfish, angry young
man to a warm, gregarious, relaxed middle-aged man.

It was also slow, because I was working alone, and I used trees, which are
less expressive than graphs. I can only dream of how transformative it would
be to process those ideas in a group, nonlinearly, using a knowledge graph.

------
johnchristopher
RSS.

------
xonaether
Jetpacks

------
alfiedotwtf
BBSes

------
ps3udo
Rebol

------
magoghm
Oberon.

------
beamatronic
WebTV

~~~
Double_a_92
Well didn't it? Considering Youtube and Netflix...

Or is that some specific technology that I don't know about?

~~~
vortico
No, it hasn't. Still, the majority of TV viewers use TV utility providers, not
ISPs. When the average couch dweller sits down, grabs a remote, and turns on
Netflix instead of "channel 42" on cable TV, WebTV will be commonplace, but it
won't happen until the young Netflix-watching generation completely replaces
the older cable-watching generation.

~~~
jayflux
This is moving a lot faster than you think, we may be thinking more in years
than generations.

------
Manicdave
Google glasses and those VR headsets

------
kapauldo
Virtual reality. It was a solution looking for a problem last decade, and it
still is.

------
timthelion
Microsoft XAML and .NET. They seem popularish, but the lack of an open
ecosystem means that they didn't take the world by storm.

------
throw-away-8
Lisp machines. Expert systems. Functional languages: Scheme, Ocaml, etc.
Itanium. OpenStack.

------
fallingmeat
CueCat. Coupon clipping was going to be rocked!

~~~
swyx
were you actually convinced it would take the world by storm? is there a
deeper story here?

