
The Dark Ages of AI (1985) [pdf] - fpoling
https://pdfs.semanticscholar.org/2a85/7e042331c53b856b3ad5018454f72d798790.pdf
======
gumby
Everything that Drew McDermott thought wouldn't really happen did:

> "To sketch a worst case scenario, suppose that five years from now the
> strategic computing initiative collapses miserably as autonomous vehicles
> fail to roll. The fifth generation turns out not to go anywhere, and the
> Japanese government immediately gets out of computing. Every startup company
> fails. Texas Instruments and Schlumberger and all other companies lose
> interest. And there’s a big backlash so that you can’t get money for
> anything connected with AI. Everybody hurriedly changes the names of their
> research projects to something else. This condition, called the “AI Winter”
> by some, prompted someone to ask me if “nuclear winter” were the situation
> where funding is cut off for nuclear weapons. So that’s the worst case
> scenario.

> "I don’t think this scenario is very likely to happen, nor even a milder
> version of it."

Exactly that happened, from the autonomous vehicles not panning out to funding
evaporating. I remember it well.

Then Waldrop's comment could almost have been written last week (except for
the loss of Marvin, RIP), down to the confusion between computers and "AI".

This Time It's Not Different.

Edit: I actually attended this panel.

~~~
6gvONxR4sf7o
This time is different. This line doesn't seem to apply so well today:

>The computer seems to be a mythic emblem for a bright, high-tech future that
is going to make our lives so much easier.

This time, there's a big groundswell of distrust and anger against "Big Tech."

~~~
tachyonbeam
Distrust? Sort of. People like to whine about Facebook, but they are still
using it, or they migrate from Facebook to Instagram. Amazon is still growing.
The news that the US government was spying on everyone came out years ago, and
nobody did anything about it.

The current story is that automation will make all the dull jobs go away, and
we're going to get Universal Basic Income. That's the mythic emblem for a
bright, high-tech future, alive and well.

I work in AI, and nobody is stopping to ask the question: how happy are people
going to be in a future without a practical sense of purpose, where a machine
can do anything you can do, better than you can? Yes, I know, if you're a
well-adjusted individual, you should be able to derive your sense of purpose
from something other than work... Like, by making paintings that nobody will
care about.

~~~
codesushi42
> _Like, by making paintings that nobody will care about._

That didn't work out for Hitler.

~~~
tachyonbeam
UBI utopia: everyone is free to live life to the fullest, pursue their
creative passion and have fulfilling friendships in a stress-free environment.

UBI dystopia: everyone is crammed into a tiny standardized living unit, barely
long enough to lie down in; the cities are all slums. Everyone feels useless
and disconnected. People spend their time playing videogames, using VR porn
and doing copious amounts of chemical drugs.

~~~
codesushi42
_> People spend their time playing videogames, using VR porn and doing copious
amounts of chemical drugs._

This is a more likely outcome, since it is already the trend.

~~~
gumby
Because that's the current social matrix in some societies, but it need not be.

------
tim333
Moravec had quite an interesting take on the period, on how the available
hardware had stagnated at the time
([https://jetpress.org/volume1/moravec.htm](https://jetpress.org/volume1/moravec.htm)).
The 1 million instructions per second of that era compares with, say, the $120
"Best budget graphics card AMD Radeon RX 570" today, which does 5,100,000
million instructions per second, so quite a difference.

>Funding improved somewhat in the early 1980s, but the number of research
groups had grown, and the amount available for computers was modest. Many
groups purchased Digital's new Vax computers, costing $100,000 and providing 1
MIPS. By mid-decade, personal computer workstations had appeared. Individual
researchers reveled in the luxury of having their own computers, avoiding the
delays of time-shared machines. A typical workstation was a Sun-3, costing
about $10,000, and providing about 1 MIPS.

>By 1990, entire careers had passed in the frozen winter of 1-MIPS computers,
mainly from necessity, but partly from habit and a lingering opinion that the
early machines really should have been powerful enough. In 1990, 1 MIPS cost
$1,000 in a low-end personal computer. There was no need to go any lower.
Finally spring thaw has come. Since 1990...
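A back-of-the-envelope check of the comparison above, using the figures from
the comment (treating the RX 570's throughput loosely as "instructions per
second" is apples-to-oranges, but the order of magnitude is the point):

```python
# Figures taken from the comments above; the loose unit conversion is an
# assumption, not a benchmark.
vax_ips = 1e6        # 1 MIPS: the ~$100,000 Vax of the early 1980s
rx570_ips = 5.1e12   # 5,100,000 MIPS: a ~$120 Radeon RX 570

# Raw throughput ratio between the two machines
speedup = rx570_ips / vax_ips
print(f"raw throughput: {speedup:,.0f}x")  # about 5,100,000x

# Dollars per MIPS, then and now (same loose units)
vax_dollars_per_mips = 100_000 / (vax_ips / 1e6)    # $100,000 per MIPS
rx570_dollars_per_mips = 120 / (rx570_ips / 1e6)    # a fraction of a cent
print(f"${vax_dollars_per_mips:,.0f} vs ${rx570_dollars_per_mips:.5f} per MIPS")
```

So the "frozen winter of 1-MIPS computers" Moravec describes sat about six to
seven orders of magnitude below a budget GPU today, in both speed and price
per unit of compute.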

------
lisper
There is some frighteningly prophetic speculation here, and not just about AI:

"Governments can use large databases to violate people’s privacy and to harass
them. For that matter, credit card companies can do that, too. One can even
envision a natural language system that monitors telephone conversations."

Written in 1985. Before the Internet. Wow.

~~~
jjtheblunt
1985 wasn't really before the internet, just before public access to it
proliferated, and before browsers existed as a tool for navigating it.

------
YeGoblynQueenne
And here's an editorial about the AI winter that actually did happen, starting
around five years after Drew McDermott's opening comment, just for that extra
bit of cosmic irony:

Avoiding Another AI Winter

James Hendler, Rensselaer Polytechnic Institute

[https://www.computer.org/csdl/magazine/ex/2008/02/mex2008020...](https://www.computer.org/csdl/magazine/ex/2008/02/mex2008020002/13rRUyeCkdP)

------
tachyonbeam
I wonder if it's not just a question of human nature for things to go into a
cycle of overhyped boom and bust. People get very excited about something,
progress doesn't happen fast enough, and then they get bored and move on to
something else.

If we look at Google Trends, it looks like "deep learning" hit a plateau back
in 2017:
[https://trends.google.com/trends/explore?date=all&geo=US&q=d...](https://trends.google.com/trends/explore?date=all&geo=US&q=deep%20learning)

Artificial intelligence peaked in January 2018:
[https://trends.google.com/trends/explore?date=all&geo=US&q=a...](https://trends.google.com/trends/explore?date=all&geo=US&q=artificial%20intelligence)

I work in AI and I'm very much thinking that another AI winter could come. I
find myself wondering where all the real-world AI deployments are, and how
effective they are. To my knowledge, there aren't any hugely successful AI
startups out there. There are startups getting a lot of investor money for AI
research, but they don't have profitable business models. The venture
capitalists are going to get tired of that at some point, if they aren't
already.

~~~
sdenton4
There's a bunch of extremely valuable progress in less glamorous areas, like
data center operations:

[https://www.datacenterknowledge.com/google-alphabet/google-s...](https://www.datacenterknowledge.com/google-alphabet/google-switching-self-driving-data-center-management-system)

------
dang
Url changed from
[https://aaai.org/ojs/index.php/aimagazine/article/download/4...](https://aaai.org/ojs/index.php/aimagazine/article/download/494/430)
to one that doesn't do an auto download.

