
Eliza, the Rogerian Therapist (1999) - jrhouston
http://psych.fullerton.edu/mbirnbaum/psych101/Eliza.htm
======
kristopolous
Go read ELIZA author Joseph Weizenbaum's book, "Computer Power and Human Reason" (1976);
well, at least the first 2 chapters. Let me justify it (I'm typing quotes out from
a hard copy, so excuse my brevity).

Page 4:

"DOCTOR...soon became famous...mainly because it was an easy program to
demonstrate. Most other programs could not vividly demonstrate the
information-processing power of a computer to visitors who did not already
have some specialized knowledge"

That is, in the mid-1960s this was (one of) the first non-technical computer
programs. No commands, no manual, no conceptual overview needed - just sit
down and start typing.

Page 6:

"I suggested I might rig the system so that I could examine all conversations
anyone had had with it, say, overnight. I was promptly bombarded with
accusations that what I proposed amounted to spying on people's most intimate
thoughts; clear evidence that people were conversing with the computer as if
it were a person who could be appropriately and usefully addressed in intimate
terms ... what I had not realized is that extremely short exposure to a
relatively simple computer program could induce powerful delusional thinking
in quite normal people."

Page 38:

"Like highways and automobiles, they [new forms of media] enable the society
to articulate entirely new forms of social action, but at the same time they
irreversibly disable formerly available modes of social behavior."

... that's a pretty deep insight about social media, especially from 1976.

On pages 115-130 he writes quite critically of "hacker culture", characterizing
the participants as "compulsive programmers" and comparing them to gamblers.
I'm not making any judgment on his claims in those pages, but his arguments
are unusual and I had not seen them before.

~~~
tantalor
The bit about "delusional thinking" is incredibly condescending.

~~~
kristopolous
Although you can certainly read it that way, I don't think that was his
intention. From the broader context, I believe he was trying to use it
clinically and not as a derisive insult. He was trying to understand the
mechanisms which made people have emotional connections with a computer
program.

"Maybe it's because people saw it as a tool and people become emotionally
attached to tools" you may think. Coincidentally his chapter following that is
entitled "On Tools" where he goes into psychological literature on human-tool
interaction.

---

Also there's an unspoken context here that he gets a step away from a few
times without hitting it:

Non-technical people understood computers from their popular depictions:
perfect electronic brains, profoundly superior-to-human machines.

Some people likely assumed there was an infallibility to the interaction, as
if someone unfamiliar with stage magic saw a magic trick and assumed it was
sorcery. People employ whatever mental model they're familiar with that
fits best.

The idea that it was _merely_ rearranging your words and feeding them back at
you likely seemed too outrageously galling to be the thing that a multi-million
dollar machine, one taking up an entire room at the MIT Artificial Intelligence
Lab, was actually doing. No. It must be real...

Almost as outrageous as someone suggesting your sorcerer simply put the item
in his pocket while he was distracting you.

Eliza, in 1965, was a computer version of Uri Geller.
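(The "rearranging your words and feeding them back" trick really is tiny. A minimal sketch of an ELIZA-style responder, with hypothetical rules and phrasings rather than Weizenbaum's actual DOCTOR script, might look like:)

```python
import re

# Toy ELIZA-style responder: match a keyword pattern, reflect first/second-person
# words in the captured fragment, and echo it back in a canned template.
# The rules and templates here are illustrative assumptions, not the real script.

REFLECTIONS = {"i": "you", "my": "your", "am": "are", "me": "you", "you": "I"}

RULES = [
    (re.compile(r"i feel (.*)", re.I), "Why do you feel {0}?"),
    (re.compile(r"i am (.*)", re.I),   "How long have you been {0}?"),
    (re.compile(r"my (.*)", re.I),     "Tell me more about your {0}."),
]

def reflect(fragment: str) -> str:
    # Swap pronouns so the echoed fragment reads naturally from the other side.
    return " ".join(REFLECTIONS.get(w.lower(), w) for w in fragment.split())

def respond(text: str) -> str:
    for pattern, template in RULES:
        m = pattern.search(text)
        if m:
            return template.format(reflect(m.group(1)))
    return "Please go on."  # content-free fallback when no keyword matches

print(respond("I feel ignored by my computer"))
# -> Why do you feel ignored by your computer?
```

No understanding anywhere, just pattern matching and pronoun swapping - which is exactly why the effect it had on people was so startling.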

---

Another idea, relating to Milgram: he never pointed out whether people trusted
that a machine would have safety precautions built in. Of all his variations,
I'm not familiar with any where he made the machine "malfunction".

Would subjects then trust the proctor but distrust the machine and back off
more, or, perhaps even more interesting, would there be no difference at all?

~~~
tantalor
People can have strong emotional/sentimental connections with anything. It
doesn't make them delusional, even in the clinical sense. For example, a
diary.

But I take your point that people at the time may have misunderstood the
program, possibly believing it was _actually_ sentient. That's a mistaken
belief, not a delusion.

~~~
kristopolous
He does point that out in a part I omitted. I wish the work were online so I
could just link you, but please don't take my selective quoting as a faithful
representation; it probably isn't - I'm just some internet rando.

Go get your hands on a copy; you can probably find one for under $7 on eBay.

------
dang
So many great Eliza submissions in the past:

[https://hn.algolia.com/?dateRange=all&page=0&prefix=false&qu...](https://hn.algolia.com/?dateRange=all&page=0&prefix=false&query=comments%3E0%20Eliza&sort=byDate&type=story&storyText=none)

But not really any fundamental discussions, probably because it was already so
well known as a thing.

------
AaronLasseigne
For anyone interested in learning more I recommend the 99% Invisible episode
on it: [https://99percentinvisible.org/episode/the-eliza-effect/](https://99percentinvisible.org/episode/the-eliza-effect/)

------
wombatmobile
Eliza was great value for as long as you could stay within its limitations.
Eliza was a true friend.

Imagine Eliza GPT 2020, a speech-enabled friend in your pocket.

Who's going to implement it?

------
tim333
I've got one for my 404 page. Quite fun.

