
Ray Kurzweil 2019 Predictions - melling
https://en.wikipedia.org/wiki/Predictions_made_by_Ray_Kurzweil#2019
======
aetherson
There are a lot of predictions in there that are either correct or kind of,
"Well, I mean, sort of." But if you step back and look at the forest instead
of the trees, it's clear that we are much less technologically advanced than
Kurzweil expected.

This is a persistent failure of well-known futurists, probably because it's
more interesting to predict rapid advances than slow ones.

~~~
krapp
>This is a persistent failure of well-known futurists, probably because it's
more interesting to predict rapid advances than slow ones.

Given his belief in a technological singularity, and the underlying assumption
that infinite, exponential technological expansion is not only possible but
inevitable, this shouldn't surprise anyone. All of his predictions are
essentially religious prophecies of the coming of the Eschaton.

~~~
erikpukinskis
That said, if he’s right about the singularity we should cut him some slack on
the exact dates. As Elon Musk frequently points out, a small variation in the
timing of an S-curve can make huge differences in the value at specific dates.
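
This sensitivity is easy to illustrate numerically. A toy logistic curve (illustrative parameters only, not anyone's actual model of technology adoption) shows that slipping an S-curve's midpoint by just four years changes the value sampled in 2019 from 50% of the ceiling to under 2%:

```python
import math

def logistic(t, midpoint, rate=1.0, ceiling=1.0):
    """Value of an S-curve at time t: ceiling / (1 + e^(-rate*(t - midpoint)))."""
    return ceiling / (1.0 + math.exp(-rate * (t - midpoint)))

# Two adoption curves, identical except the midpoint slips by 4 years.
on_time = logistic(2019, midpoint=2019)  # exactly halfway up: 0.5
delayed = logistic(2019, midpoint=2023)  # barely started: ~0.018
print(on_time, delayed)
```

Same eventual curve, same sampling date, wildly different readings: that is the slack being argued for.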

~~~
gooseus
I don't think that means cutting him some slack, that means that he shouldn't
be so foolish as to make such specific claims about such specific timings.

Also, my impression of Kurzweil was that he didn't think in S-curves, but
assumed technological progress follows an unbounded exponential curve - which
is why he thinks he'll become immortal and bring his father back from the
dead.

------
hprotagonist
This is, as usual for Kurzweil, full of Andy Grove Fallacies [0]: roughly, the
premise that everything is a lot like a semiconductor so we can just apply
Moore’s Law to biology.

Many/most things are not nearly as simple as transistors and computation are.
Nature prefers sigmoids over exponentials anyway, so it’s basically always
unrealistic to predict things getting exponentially different unless you have
really good priors to let you predict when that sigmoid necks over.

[0]
[https://blogs.sciencemag.org/pipeline/archives/2007/11/06/an...](https://blogs.sciencemag.org/pipeline/archives/2007/11/06/andy_grove_rich_famous_smart_and_wrong)

~~~
seppel
> Nature prefers sigmoids over exponentials anyway, so it’s basically always
> unrealistic to predict things getting exponentially different unless you
> have really good priors to let you predict when that sigmoid necks over.

I'm pretty sure that Kurzweil is aware of this, given that he sees ICs already
as the fifth paradigm in computation:

[https://en.wikipedia.org/wiki/The_Singularity_Is_Near#/media...](https://en.wikipedia.org/wiki/The_Singularity_Is_Near#/media/File:PPTMooresLawai.jpg)

The problem (it seems to me) is that there basically was no further paradigm
shift after ICs so far, which would have been necessary for many of Kurzweil's
predictions to come true this year.

~~~
ianai
I think mobile processors may be paving a way forward for us. They’ve been
engineered all along to increase performance through increased core counts and
decreased thermal footprints. I wonder how far they can be expanded in core
counts for general computing and not graphical/vectorized computations. Could
we eventually see 100s or even 1000s of cores in a U?

------
jcranmer
Predictions that sort of came true:

> Most business transactions or information inquiries involve dealing with a
> simulated person.

"Most" is strong, but virtual agents are definitely widespread and fairly
heavily loathed.

> Most people own more than one PC, though the concept of what a "computer" is
> has changed considerably

Undoubtedly correct.

> Cables connecting computers and peripherals have almost completely
> disappeared.

> Rotating computer hard drives are no longer used.

Very strong trendlines, although "almost completely" and "no longer" are not
accurate.

> Massively parallel neural nets and genetic algorithms are in wide use.

Depending on how you define "wide use". Applications that rely on neural nets
are very heavily used, but neural nets are closer to a niche technology than a
widespread algorithmic hammer.

> All students have access to computers.

In developed countries, yes.

> Public places and workplaces are ubiquitously monitored to prevent violence
> and all actions are recorded permanently. Personal privacy is a major
> political issue, and some people protect themselves with unbreakable
> computer codes.

Scarily, this is more true than not.

... That's pretty much the lot of the 2019 predictions that came close to
coming true. At best that's a 20% hit rate for predictions, although many of
these hits require reinterpreting "most" as "many" and so on.

~~~
gooseus
> although many of these hits require reinterpreting "most" as "many" and so
> on.

Words have meanings, if he says "most" and it is less than 50% of the total
then I'd have to say it's a miss.

I mean, just look at his 2009 predictions, if we just assessed his 2009
predictions as if they were for today I'd guess he'd still struggle to get 50%
hit rate without favorable "reinterpretations".

I've always had a hard time understanding the appeal of futurists and their
wild claims, especially amongst a population (techies) who I continue to
assume are more skeptical in general.

~~~
jcranmer
I didn't compare the list to the 2009 predictions. When you try to see the
trend line of what he meant from 2009 on... yeah, the miss rate is pretty
abysmal.

In his 2009 predictions, there are 8 that are pretty thoroughly not true
today, most of which have to do with medicine. There's another dozen or so
that I would also flag as not-true or just-barely-true today. That still
leaves about half of the predictions as being true for today.

But the general theme? Yeah, that's a pretty bad miss all around.

~~~
eesmith
Here's what HN thought about the 2009 predictions, in 2010:
[https://news.ycombinator.com/item?id=1064789](https://news.ycombinator.com/item?id=1064789)

------
andr
Sadly some of the predictions where Kurzweil is the most wrong are the ones
that depend not so much on technology, but on human society. Life expectancy
over 100, the needs of the underclass being met, and most human workers
spending their time acquiring knowledge are not the case, no matter which way
you squint. Heck, even the paperless office is not really the case in many
companies and countries.

------
chillingeffect
Although often vague, these aren't bad, really, but you can see Kurzweil's
bias for nerdiness and longevity:

> Average life expectancy is over 100.

> Most human workers spend the majority of their time acquiring new skills and
> knowledge.

Also, just as a PSA, many people do not realize most deaf people don't
consider themselves disabled, nor do they want to join the hearing world with
cochlear implants, because it means leaving their deaf community. They are
quite happy in their own network. For talking with hearing people, they use
notes written on paper or phones.

> Deaf people use special glasses that convert speech into text or signs, and
> music into images or tactile sensations. Cochlear and other implants are
> also widely used.

~~~
grkvlt
That's crazy. I find it hard to believe that any profoundly deaf person would
refuse some technological device that would give them perfect hearing because
'They are quite happy in their own network.'

No matter whether they consider themselves disabled or not, hearing > not
hearing, and anything they could do while deaf can still be done with 100%
hearing restored.

~~~
chillingeffect
> 100% hearing restored.

I think you're imagining CIs as shiny and perfect technology, but they aren't.
Check out this article below for some details. [1] They take quite a long time
to learn how to use. Also, for people born deaf, their hearing isn't
_restored_ , it never existed, so it becomes an addition and an added
responsibility. It's two new skills - hearing and speech therapy - they have
to learn since ASL/etc is already their first language. It's really just a
random cultural conditioning that we tend to think we have to be perfect or
even enhanced to have a good life, be loved, successful, etc., but there's
much more to life than the hearing world.

> hearing > not hearing

Actually, it turns out "not hearing == hearing." They make friends, make
money, program, cook, play sports, exercise, etc. You might say they can't do
this with people who aren't deaf, so they have a smaller population to choose
from, but their community is huge and well-connected. They actually don't need
the rest of us. I sat next to a deaf guy on a plane and he ordered coffee with
cream and sugar by typing on his iPhone, then read the whole plane ride.

[1] [https://www.thisisinsider.com/why-deaf-people-turn-down-cochlear-implants-2016-12](https://www.thisisinsider.com/why-deaf-people-turn-down-cochlear-implants-2016-12)

~~~
grkvlt
No. Hearing is, unquestionably, absolutely, no argument whatsoever,
qualitatively _BETTER_ than not being able to hear. A hearing person has a
strict superset of the abilities and experiences of a non-hearing person. If a
deaf person had my hypothetical 100% hearing restoration device fitted, but
still wanted to do the things they did while deaf, nothing is stopping them -
maybe my imaginary device has an off switch. But, when that switch is off,
they will _never_ be able to hear their name called from across a dimly lit
room, or experience the full gamut of orchestral music, or many other things.
But they can still use sign language, or write things down on paper to
communicate, and so on. I agree that there's more to be developed in terms of
CI type technology before this becomes a reality, and there will absolutely be
a period of learning and adjustment.

It is unrealistic to suggest that most deaf people would refuse an operation
to have their hearing restored. This is just as crass as suggesting that
someone who has learned to live a full life with a prosthetic leg would turn
down the (hypothetical, futuristic) opportunity to have a brand new biological
leg...

------
hitchhiker999
meh. // as a programmer for 37 years, and a person heavily familiar with
technology -- meh.

That is 80s-level crap right here; he didn't seem to know people yet. Fair
enough, but still.

The future is nothing like this man thinks, plus he doesn't understand the
power of organic tech (the brain) - or the idea that any intelligence is
hampered not by speed but by the density and depth of the "graph" it has to
explore to provide context for any given answer.

Just my 2c -- I'm sure I will get hammered for it sometime in the future.

~~~
trashtester
The hammering may come sooner than you realize. Typical objections to many of
the AI predictions are based on a mindset that sees computers as Turing
Machines and focuses on single-threaded computational power.

As I'm sure you know, AlphaGo/Alpha0 replaced much of the "graphs" for
Go/Chess already, and as we talk, an open source clone (Lc0) is doing quite
well in TCEC:

[https://www.chessbomb.com/arena/2018-tcec-s14pd](https://www.chessbomb.com/arena/2018-tcec-s14pd)

I would not be surprised if this is the last year a classical engine wins that
tournament.

And as tensor-based computing (and other non-Turing-Machine substrates)
continue to mature, I expect more and more problems to be solved by carefully
designed special purpose substrate/software combination, enabling solutions to
problems that are practically unsolvable with traditional algorithms running
on Turing Machines.

~~~
grkvlt
Tensor-based computing, as you put it, is linear algebra, and it can be
simulated on a Turing machine quite easily, but this has really no relevance
in terms of practical use. Maybe you are thinking of von Neumann
architectures? But there is no escaping the fact that tensorflow and similar
algorithms have nothing special about them that requires a different
computational paradigm...
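
A minimal sketch of that claim: a dense neural-network layer is just a matrix-vector product, a bias add, and a pointwise nonlinearity, all trivially computable on a conventional machine (weights here are made up for illustration):

```python
def dense_layer(W, b, x):
    """One fully connected layer in pure Python: y = relu(W @ x + b)."""
    y = []
    for row, bias in zip(W, b):
        s = sum(w * xi for w, xi in zip(row, x)) + bias
        y.append(max(0.0, s))  # ReLU nonlinearity
    return y

# Made-up weights, biases, and input vector.
W = [[0.5, -1.0], [2.0, 0.25]]
b = [0.1, -0.2]
x = [1.0, 2.0]
print(dense_layer(W, b, x))  # [0.0, 2.3]
```

GPUs and TPUs execute exactly this arithmetic, only massively in parallel; nothing about it steps outside what a Turing machine can compute.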

~~~
trashtester
You are correct in that a Turing machine, given enough time, can perform all
computation required by Tensorflow. So, if you don't care about time and cost,
you are correct that a new paradigm is not required.

But, since a Tensorflow network does not _require_ all the functionality of a
Turing Machine, it _allows_ you to limit yourself to more specialized
computational models. And in doing so, you can run your system several orders
of magnitude faster. (And the TPU type of processors are still at an early
stage; expect at least a few more innovations to widen this gap further.)

This comes with a few changes:

\- A lot of algorithms (for instance those using recursion) are hard to
transfer to a "tensor machine", making some skills obsolete.

\- Many/most problems are solved using machine learning techniques, which tend
to favor more stats/data science/maths than many pure programmers have.

\- Simultaneously, the new approaches are almost equally alien to old school
statisticians, as the problem definitions (in particular the kinds of patterns
to look for) tend to be quite different.

\- Also, the way to approach problems, and to estimate what problems
can/cannot be solved or how complex they are, is quite different. (Requiring
changed intuition.)

\- New development is to a large extent fueled by changes in hardware
(CPU->GPU->TPU->?), and I think problem solutions will become more and more
about seeing possibilities across both hardware and software, instead of
treating them mostly separately.

In sum, I think the change in emphasis, skills, methodology, etc. is enough to
be thought of as a paradigm shift.

------
SpeakMouthWords
I remember having band practice at our drummer's house when I was younger. The
drummer's mother and our vocalist knew each other. To reach the high notes,
she told him, you have to "sing higher" than the note and your voice will
naturally fall down to where you want to be.

These predictions kind of feel like that. Kurzweil was "singing higher" than
where we ended up. Just enough that it's a clear overshoot, but not so much
that he's off-key.

~~~
shhehebehdh
That’s a pretty charitable interpretation. I would suggest that we’d be in
more or less exactly the same spot whether or not Kurzweil published these
predictions. He didn’t do it to advance the state of the art. Or if he did,
he’s totally deluded. He was just wrong.

------
scandox
> The basic needs of the underclass are met

I mean if you're going to fantasize why not bring this forward and give
imaginary relief to millions? I would have predicted this easily for 2016...or
even maybe 2012. Obviously any earlier would have been ridiculous.

~~~
jimktrains2
Poverty has been falling consistently and we, as a species, are more than able
to meet everyone's needs; we simply choose not to because we're greedy and
short-sighted.

~~~
scandox
I think to talk about this as a moment in time when the job is done is the
mistake. It needs to be considered both as something achieved and demonstrably
sustained/sustainable for a meaningful length of time into the future.

That's why I just find it funny to see that statement. For 2019. Generalizing
to some global "underclass" that anyway doesn't exist.

~~~
jimktrains2
> It needs to be considered both as something achieved and demonstrably
> sustained/sustainable for a meaningful length of time into the future.

100% agree. I just wanted to highlight that one of the largest obstacles to
ensuring that everyone's basic needs are met (and that our grandchildren will
still have a planet they can easily live on) is our politics and
short-sightedness, not necessarily our technology.

------
edw519
His predictions about technology seem to be much more accurate than his
predictions about what people actually do with that technology.

The technology has always been easier than the people. No surprise that has
affected his "business" too.

------
mrfusion
Is a high end computer anywhere near 20 quadrillion calculations per second?

Edit: 20 quintillion would be 20 million teraflops. Sound right?

~~~
1024core
Not that I'm much into "predictions", but for $4000 you can get a lot of GPU
cards (assuming you can squeeze them all into a box, with the necessary power
and connections).

~~~
mrfusion
It was 1999 dollars, so maybe 5 or 6k today?

Do all the gpu cards add up to 20 quintillion?

~~~
gibybo
A typical $500 graphics card today will do around 10 teraflops. You'd need 2k
of them ($1M) for 20 quadrillion flops (the prediction is quadrillion, not
quintillion).
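
The arithmetic behind that estimate, assuming (as above) roughly 10 teraflops and $500 per card, both rough 2019 figures:

```python
brain_estimate = 20e15   # Kurzweil's figure: 20 quadrillion calculations/sec
gpu_flops = 10e12        # ~10 teraflops per high-end consumer GPU
gpu_price = 500          # dollars per card, rough

cards_needed = brain_estimate / gpu_flops
total_cost = cards_needed * gpu_price
print(cards_needed, total_cost)  # 2000.0 cards, $1,000,000
```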

------
xhrpost
Interesting how several of the 2009 predictions were very wrong at the time
but now are coming true in 2019. Perhaps we can look at the 2019 predictions
to see what 2029 might look like.

------
yodon
Given he was trying to predict 20 years into the future this is an amazingly
good forecast compared to most other "in 20 years" lists.

By my reading about half his projections are on target and surprisingly few
are completely absurd. His biggest area of concentrated failure was probably
around VR/AR/Haptics, likely a result of writing the list during the first
irrational exuberance period around VR and VRML.

------
AndrewKemendo
I looked at each of the predictions and gave them a binary score. Of the 49
predictions on here 11 were in my estimation unambiguously right: 8, 12, 13,
16, 19, 25, 26, 27, 28, 33, 44. That's a 22% hit rate. I'm not sure if this is
good or bad considering the predictions were made in 1999.

Most of the "correct" predictions are of the following nature, namely that
they are not absolute in nature:

 _" Most people own more than one PC, though the concept of what a "computer"
is has changed considerably: Computers are no longer limited in design to
laptops or CPUs contained in a large box connected to a monitor. Instead,
devices with computer capabilities come in all sorts of unexpected shapes and
sizes"_

I marked some correct because there are examples of it existing, but the
wording didn't indicate it was absolute for everyone, such as:

 _" People with spinal cord injuries can walk and climb steps using computer-
controlled nerve stimulation and exoskeletal robotic walkers."_

I marked many wrong because wording was absolute such as:

 _" All students have access to computers."_

If the prediction would have been "Most..." or something more granular I think
we could agree this is probably true.

Many of the ones that are wrong are very wrong such as:

 _" Computers do most of the vehicle driving—humans are in fact prohibited
from driving on highways unassisted. Furthermore, when humans do take over the
wheel, the onboard computer system constantly monitors their actions and takes
control whenever the human drives recklessly. As a result, there are very few
transportation accidents."_

------
johnhenry
If you scroll up, above the things that are specifically for 2019, there is a
section for 2020 - 2050 containing the bullet point:

\- By 2020, there will be a new World government.

I wonder what is going to have to change in 2019 for this to be true? Are we
already there?

~~~
notus
They are talking about the US 2020 elections obviously

------
mrfusion
Oh wow these are way off. I wonder what went wrong?

~~~
VMG
His predictions went wrong

~~~
mrfusion
Maybe he didn’t take into account how much things depend on software. He just
assumed faster hardware will make things happen.

And even on that front, Moore’s law died somewhere in the 2010s. That might
have thrown these off.

~~~
jaked89
Moore's law didn't die, it's been para(lle)lized.

------
SirHound
It's worth remembering when these were written.

> Most people own more than one PC, though the concept of what a "computer" is
> has changed considerably. Computers are no longer limited in design to
> laptops or CPUs contained in a large box connected to a monitor. Instead,
> devices with computer capabilities come in all sorts of unexpected shapes
> and sizes.

Doesn't seem like much of a prediction these days, but from the vocabulary of
the past the iPad and smartwatch would qualify, and they weren't easy to
predict at the time.

I remember telling my mum in the early 2000s that one day (I didn't realise
quite how soon) all phones would just be touch screens. She didn't believe me,
or thought she'd be dead before it happened. It is hard to imagine the future.

I'd say we're tracking behind on a lot of these but I don't think he's so far
off (and it'd be interesting to check again in just a couple years). For
instance:

> People communicate with their computers via two-way speech and gestures
> instead of with keyboards.

In the parlance of the past, touch screens, virtual assistants and face unlock
all qualify.

> Most business transactions or information inquiries involve dealing with a
> simulated person.

Our bar for a "simulated person" is quite high these days, but again look at
it from the perspective of the past. I quite often have chats with services of
varying sophistications that try and emulate a natural conversation. I'm not
fooled, but that isn't the prediction.

> Pinhead-sized cameras are everywhere.

Quite literally. I have a couple looking at me right now.

> Cables connecting computers and peripherals have almost completely
> disappeared.

Most of my peripherals actually are wireless. "Almost completely disappeared"?
Sadly not. But we're probably talking years, and this was written twenty years
ago.

> Rotating computer hard drives are no longer used.

> Massively parallel neural nets and genetic algorithms are in wide use.

Speak for themselves.

> Thin, lightweight, handheld displays with very high resolutions are the
> preferred means for viewing documents.

This one is interesting because again if you immerse yourself in the time it
was written, paper was a very common occurrence on a work desk. Now? Rarely.

> Worldwide economic growth has continued. There has not been a global
> economic collapse.

Not as far off as it might seem.

> "Virtual sex"—in which two people are able to have sex with each other
> through virtual reality

Increasingly true. Check back in 12 months.

> Three-dimensional nanotube lattices are the dominant computing substrate.

Oh.

I actually think that, all in all, and taken in the spirit of the time, these
are surprisingly accurate. Many are of course way off, but I personally think
they show a rare prescience.

------
fallingfrog
I think that Kurzweil basically makes 2 critical mistakes:

1) he underestimated the degree to which all those fancy products of modern
capitalism are created not by machines making machines but by the nimble
fingers of vast numbers of very poorly paid people- as it must be, because the
maintenance on the machines to do the same work would be more expensive than
just paying humans to do it. But from Kurzweil’s point of view as a wealthy
westerner, that fact is invisible, as it’s designed to be- the products simply
appear to come out of a black box which we are not supposed to look inside of.

2) Technological progress has undergone a sort of phase transition in the last
100 years or so, with the discovery, for the first and only time, of
fundamental facts about nature like electricity, thermodynamics, how human
biology works, and so forth: discoveries that can only be made once. An
s-curve looks like an exponential, for a while.
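
That last point is easy to see numerically: a logistic curve with, say, a ceiling of 1000 (an arbitrary toy value) tracks e^t almost exactly at first, then flattens while the exponential keeps going:

```python
import math

CEILING = 1000.0  # arbitrary cap for the toy S-curve

def s_curve(t):
    """Logistic scaled so that for small t it approximates e^t."""
    return CEILING / (1.0 + (CEILING - 1.0) * math.exp(-t))

for t in [0, 1, 2, 3]:
    # Early on the two curves are nearly indistinguishable.
    print(t, round(math.exp(t), 3), round(s_curve(t), 3))

# Later, the exponential has left the S-curve far behind.
print(math.exp(10), s_curve(10))  # ~22026 vs. under 1000
```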

------
gwbas1c
A computer might be as powerful as my brain, or more powerful than my brain.

That's not the issue. The real issue is knowing how to program it.

------
petermcneeley
There are so many No/Fails that this is more of a wish list than a list of
predictions.

------
MPSimmons
My take analyzing this. It was fun!

##########

Definitely:

##########

Computers are embedded everywhere in the environment (inside of furniture,
jewelry, walls, clothing, etc.).

People experience 3-D virtual reality through glasses and contact lenses that
beam images directly to their retinas (retinal display). Coupled with an
auditory source (headphones), users can remotely communicate with other people
and access the Internet.

(This isn't exclusive, but Siri and home assistants definitely count, imo)

People communicate with their computers via two-way speech and gestures
instead of with keyboards. Furthermore, most of this interaction occurs
through computerized assistants with different personalities that the user can
select or customize. Dealing with computers thus becomes more and more like
dealing with a human being.

Most people own more than one PC, though the concept of what a "computer" is
has changed considerably: Computers are no longer limited in design to laptops
or CPUs contained in a large box connected to a monitor. Instead, devices with
computer capabilities come in all sorts of unexpected shapes and sizes.

Cables connecting computers and peripherals have almost completely
disappeared.

Rotating computer hard drives are no longer used. (other than exceptional or
specialized cases)

Massively parallel neural nets and genetic algorithms are in wide use.

Pinhead-sized cameras are everywhere. (pinhead may be stretching it, but I
think 1999 Ray Kurzweil would count our cellphone-type cameras)

Thin, lightweight, handheld displays with very high resolutions are the
preferred means for viewing documents. The aforementioned computer eyeglasses
and contact lenses are also used for this same purpose, and all download the
information wirelessly. (less the glasses, but definitely the handheld
displays)

Computers have made paper books and documents almost completely obsolete.
(grumble)

Most learning is accomplished through intelligent, adaptive courseware
presented by computer-simulated teachers. In the learning process, human
adults fill the counselor and mentor roles instead of being academic
instructors. These assistants are often not physically present, and help
students remotely. (I'm counting the rise of MOOCs and things like Khan
Academy in this. You may disagree)

Students still learn together and socialize, though this is often done
remotely via computers.

All students have access to computers.

Language translating machines are of much higher quality, and are routinely
used in conversations. (I suspect we can count this, but I don't know if it's
exactly routine yet. I've used it, and I'm not cutting edge in this realm)

Effective language technologies (natural language processing, speech
recognition, speech synthesis) exist.

Anyone can wirelessly access the internet with wearable devices such as
computerized glasses, contacts, and watches.

Traditional computers and communication devices such as desktop PCs, laptops,
and cell phones still exist, but most of their functions can be performed by
wearable gadgets. Examples include reading books, listening to music, watching
movies, playing games, and teleconferencing.

Worldwide economic growth has continued. There has not been a global economic
collapse. (There have been, 2 actually, but we got better)

Household robots are ubiquitous and reliable. (Roomba? Google Home / Alexa?
Though they may not be robotic)

Computers do most of the vehicle driving—humans are in fact prohibited from
driving on highways unassisted. Furthermore, when humans do take over the
wheel, the onboard computer system constantly monitors their actions and takes
control whenever the human drives recklessly. As a result, there are very few
transportation accidents. (I'm throwing this here, because it's so close.
Everything here is true except for the unassisted highway driving, and the
'very few transportation accidents' part - the tech is definitely here)

Most decisions made by humans involve consultation with machine intelligence.
For example, a doctor may seek the advice of a digital assistant. A lawyer
might utilize a virtual researcher. Or a shopper may receive recommendations
from a software program that has learned his or her shopping habits.

Interaction with virtual personalities becomes a primary interface.

Public places and workplaces are ubiquitously monitored to prevent violence
and all actions are recorded permanently. Personal privacy is a major
political issue, and some people protect themselves with unbreakable computer
codes.

Virtual artists—creative computers capable of making their own art and
music—emerge in all fields of the arts.

Computerized watches, clothing, and jewelry can monitor the wearer's health
continuously. They can detect many types of diseases and offer recommendations
for treatment.

###################################

Probably, or in cutting edge cases:

###################################

These special glasses and contact lenses can deliver "augmented reality" and
"virtual reality" in three different ways. First, they can project "heads-up-
displays" (HUDs) across the user's field of vision, superimposing images that
stay in place in the environment regardless of the user's perspective or
orientation. Second, virtual objects or people could be rendered in fixed
locations by the glasses, so when the user's eyes look elsewhere, the objects
appear to stay in their places. Third, the devices could block out the "real"
world entirely and fully immerse the user in a virtual reality environment.

Nanotechnology is more capable and is in use for specialized applications, yet
it has not yet made it into the mainstream. "Nanoengineered machines" begin to
be used in manufacturing.

Most human workers spend the majority of their time acquiring new skills and
knowledge.

Blind people wear special glasses that interpret the real world for them
through speech. Sighted people also use these glasses to amplify their own
abilities.

Retinal and neural implants also exist, but are in limited use because they
are less useful.

Deaf people use special glasses that convert speech into text or signs, and
music into images or tactile sensations. Cochlear and other implants are also
widely used.

People with spinal cord injuries can walk and climb steps using computer-
controlled nerve stimulation and exoskeletal robotic walkers.

Computers are also found inside of some humans in the form of cybernetic
implants. These are most commonly used by disabled people to regain normal
physical faculties (e.g. Retinal implants allow the blind to see and spinal
implants coupled with mechanical legs allow the paralyzed to walk).

Devices that deliver sensations to the skin surface of their users (e.g. tight
body suits and gloves) are also sometimes used in virtual reality to complete
the experience. "Virtual sex"—in which two people are able to have sex with
each other through virtual reality, or in which a human can have sex with a
"simulated" partner that only exists on a computer—becomes a reality.

Just as visual- and auditory virtual reality have come of age, haptic
technology has fully matured and is completely convincing, yet requires the
user to enter a V.R. booth. It is commonly used for computer sex and remote
medical examinations. It is the preferred sexual medium since it is safe and
enhances the experience.

Prototype personal flying vehicles using microflaps exist. They are also
primarily computer-controlled. (I don't know what microflaps are. Most of the
personal flying drone things I've seen are primarily computer controlled)

Humans are beginning to have deep relationships with automated personalities,
which hold some advantages over human partners. The depth of some computer
personalities convinces some people that they should be accorded more rights.

The basic needs of the underclass are met. (Not specified if this pertains
only to the developed world or to all countries) (sigh)

Most flying weapons are bird-sized robots. Some are as small as insects.

############

Not so much:

############

The computational capacity of a $4,000 computing device (in 1999 dollars) is
approximately equal to the computational capability of the human brain (20
quadrillion calculations per second).

The summed computational powers of all computers is comparable to the total
brainpower of the human race.

Most business transactions or information inquiries involve dealing with a
simulated person. (it isn't majority online yet I think -
[https://www.digitalcommerce360.com/article/us-ecommerce-sales/](https://www.digitalcommerce360.com/article/us-ecommerce-sales/))

The vast majority of business interactions occur between humans and simulated
retailers, or between a human's virtual personal assistant and a simulated
retailer. (samesies, but we're close!)

Three-dimensional nanotube lattices are the dominant computing substrate.

Destructive scans of the brain and noninvasive brain scans have allowed
scientists to understand the brain much better. The algorithms that allow the
relatively small genetic code of the brain to construct a much more complex
organ are being transferred into computer neural nets.

Most roads now have automated driving systems—networks of monitoring and
communication devices that allow computer-controlled automobiles to safely
navigate.

While a growing number of humans believe that their computers and the
simulated personalities they interact with are intelligent to the point of
human-level consciousness, experts dismiss the possibility that any could pass
the Turing Test. (throwing this here. In my experience, most people
anthropomorphize, but don't believe the assistants are that intelligent)

Average life expectancy is over 100.

~~~
grkvlt
I think you're overstating the 'definitely' column here by an enormous amount.
Most of the things you included just have not happened. Here are two examples:

1\. Household robots are ubiquitous and reliable.

No. Household robots _exist_ and are in an early stage of adoption and
development, but are not very capable (single tasks) or reliable, and have yet
to reach widespread adoption outside of wealthy, early adopters and
enthusiasts.

2\. Computers do most of the vehicle driving.

No. Some cars have driver assistance technologies, but mainly expensive luxury
models. Most vehicles are driven by humans with zero computer assistance.

I think you are confusing the existence of a thing with the universal
availability and adoption of that thing. Where Kurzweil says _most_ he means
it. Even back in 1999 there were examples of these things in research labs and
so on, the prediction he was making was that they would have moved forward
from there to becoming everyday household items, present in the council
estates, trailer parks and so on of the world, as well as the San Francisco
area...

~~~
armitron
1\. Roomba.

2\. Computers in an average modern car control steering, braking and
acceleration to degrees not possible in the past.

It's clear that Kurzweil is more right than wrong if you read his predictions
as trends and account for some degree of variation. Since he's talking about
processes, I am not sure what reading his predictions with a letter-of-the-law
critical mindset buys you. The vast majority of what he talks about has been
invented and is well on the way to becoming widespread. For me, it's the point
of origin ("invention") of a technology and its near-term projection that are
most salient, and Kurzweil nails them.

~~~
grkvlt
No, he's not. Autonomous vehicles and household robots _existed_ when he made
these predictions. The predictions are about the ubiquity and widespread, mass
market adoption of these things - which has _not_ happened. Yes, the robots
and driver assistance systems are better, sometimes hugely improved, and more
widespread than 20 years ago, but the operative word in the predictions is
_MOST_ and that is where they fall down, and fail to be true.

