
An interview with Barbara Liskov - theafh
https://www.quantamagazine.org/barbara-liskov-is-the-architect-of-modern-algorithms-20191120/
======
_bxg1
> In my version of computational thinking, I imagine an abstract machine with
> just the data types and operations that I want. If this machine existed,
> then I could write the program I want. But it doesn’t. Instead I have
> introduced a bunch of subproblems — the data types and operations — and I
> need to figure out how to implement them. I do this over and over until I’m
> working with a real machine or a real programming language. That’s the art
> of design.

Love the way she put this. Good software design is a process of coming up with
layers of vocabularies that most expressively describe what needs to be
described, until you hit the bedrock of something that already exists that you
can hand the rest off to.
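Her description maps onto code almost directly. A minimal sketch in Python (every name here is illustrative, not from CLU): write the program against the "abstract machine" you wish you had, then implement that machine as a subproblem on top of something that already exists.

```python
from abc import ABC, abstractmethod

# The "abstract machine" I wish I had: just the operations my program needs.
class IntSet(ABC):
    @abstractmethod
    def insert(self, x: int) -> None: ...
    @abstractmethod
    def member(self, x: int) -> bool: ...

# The program is written against the abstract machine...
def dedupe(xs, s: IntSet):
    out = []
    for x in xs:
        if not s.member(x):
            s.insert(x)
            out.append(x)
    return out

# ...and the subproblem -- implementing the machine -- bottoms out
# on something real (here, Python's built-in set).
class BuiltinIntSet(IntSet):
    def __init__(self):
        self._s = set()
    def insert(self, x: int) -> None:
        self._s.add(x)
    def member(self, x: int) -> bool:
        return x in self._s
```

`dedupe` never knows or cares how `IntSet` is implemented; that's the "bedrock" hand-off.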

~~~
jackfoxy
Which is why I prefer working with a strongly typed functional language.
Domain-Driven Design (well, my version of it) is the first step. Get all
the types right. The necessary operations should be obvious by this point.
Code them up! Write unit tests as you go along. You're likely to do some
refactoring as your operations materialize, and unit tests are going to help
you stay safe.
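A tiny sketch of that workflow in Python (the domain and all names are invented for illustration; a typed functional language would express this more directly as sum types): get the types right first, and the necessary operation falls out.

```python
from dataclasses import dataclass
from typing import Union

# Model the domain states first; distinct types keep illegal states
# (e.g. treating an unverified address as verified) out of the program.
@dataclass(frozen=True)
class UnverifiedEmail:
    address: str

@dataclass(frozen=True)
class VerifiedEmail:
    address: str

Email = Union[UnverifiedEmail, VerifiedEmail]

# Once the types are right, the operation is obvious: the only way
# to obtain a VerifiedEmail is through verify().
def verify(e: UnverifiedEmail, token_ok: bool) -> Email:
    return VerifiedEmail(e.address) if token_ok else e
```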

------
skybrian
If you're interested in computer history, here's a transcript of a long, in-
depth interview with Barbara Liskov in 2016 about how she got into computing,
the CLU project, and many other aspects of her career. Quite inspiring!

[https://amturing.acm.org/pdf/LiskovTuringTranscript.pdf](https://amturing.acm.org/pdf/LiskovTuringTranscript.pdf)

------
leibnitz27
Such a waste of a good title. Should have been:

"An interview with Barbara Liskov. Accept no substitutes."

~~~
devnulloverflow
It would have been a great title.

But the pedants like me would have crept up to say "but Liskov tells us we
_can_ accept substitutes".

------
sreyaNotfilc
You know, I've heard about the Liskov substitution principle but have never
really equated it with anyone (especially anyone living). It's nice to see that
some pioneers in our field are still alive and well.

------
BubRoss
"Barbara Liskov invented the architecture that underlies modern programs."

Basic explanation of this claim: she invented CLU, the first language without
goto statements.

~~~
ubertakter
I don't think your statement is intended to be dismissive, but it is a bit...
sparse.

Related to CLU: It introduced a lot of concepts together in one language. "Key
contributions include abstract data types,[6] call-by-sharing, iterators,
multiple return values (a form of parallel assignment), type-safe
parameterized types, and type-safe variant types." [1]

Liskov also has other contributions[2]. Per the article, she says herself in
the early days there were a lot of big problems to solve, so it was somewhat
easier to make a "big" contribution.

It's interesting to look at where many modern concepts came from, and it would
be worth digging into these a little more. Time is the one resource that's
hard to come by, though.

[1]
[https://en.wikipedia.org/wiki/CLU_(programming_language)](https://en.wikipedia.org/wiki/CLU_\(programming_language\))
[2]
[https://en.wikipedia.org/wiki/Barbara_Liskov](https://en.wikipedia.org/wiki/Barbara_Liskov)

~~~
blotter_paper
Reading about the history of so many major design decisions coming from one
person also makes me wonder how different modern languages would be without
the contributions of that person. Would we be more or less where we are, but a
couple years behind? Would we have gone a different route with some of these
decisions?

------
galaxyLogic
The article doesn't seem to mention
[https://en.wikipedia.org/wiki/Liskov_substitution_principle](https://en.wikipedia.org/wiki/Liskov_substitution_principle)
at all; formulating it is, I think, her greatest accomplishment.

------
agumonkey
pg (among others) talked a lot about this: middle-layer language design,
eDSLs, language-oriented programming à la Racket.

Fun to see Liskov mention this when so few, if any, in the OO world thought
this way.

~~~
dragontamer
Liskov is one of the few programmers who truly understood OO-programming
waaaaayyyy back in the 80s.

[https://dl.acm.org/citation.cfm?doid=62139.62141](https://dl.acm.org/citation.cfm?doid=62139.62141)

And this was later developed into the Liskov Substitution Principle ('94):
[https://www.cs.cmu.edu/~wing/publications/LiskovWing94.pdf](https://www.cs.cmu.edu/~wing/publications/LiskovWing94.pdf)

Over the next 20 years, people would be writing about bad examples like "Class
Car extends class transportation", but the fundamentals of good OO-programming
were laid out by Liskov pretty early on...
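For readers who haven't seen one of those bad examples worked out, here's the classic square/rectangle violation sketched in Python (illustrative, not taken from Liskov's papers):

```python
class Rectangle:
    def __init__(self, w, h):
        self.w, self.h = w, h
    def set_width(self, w):
        self.w = w
    def area(self):
        return self.w * self.h

# A Square "is-a" Rectangle grammatically, but not behaviorally:
# keeping the sides equal changes what set_width means.
class Square(Rectangle):
    def __init__(self, side):
        super().__init__(side, side)
    def set_width(self, w):
        self.w = self.h = w

# Code written against Rectangle's contract breaks under substitution:
def stretch(r: Rectangle):
    r.set_width(10)
    return r.area()  # caller expects 10 * original height
```

`stretch(Rectangle(2, 5))` gives 50 as the contract promises; `stretch(Square(5))` gives 100, silently violating it.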

~~~
agumonkey
The thing with pioneers is that most of the followers in the decades after
have no clue about what they meant.

------
bonyen
Back in the day, I treasured my copy of Liskov and Guttag's "Program
Development in Java". I enjoyed its discussion of a disciplined programming
style centered around maintaining invariants in your data structures. It was
also the only book I've seen which talked about modeling the state of your
design using a graphical notation inspired by Alloy. I wanted more exposure
to this material so I even purchased a copy of their earlier work (which is
also excellent). Point being, I have a lot of respect for her.

Like some of the other comments, I loved how she expresses her thoughts about
computing in this article, e.g: "In my version of computational thinking, I
imagine an abstract machine with just the data types and operations that I
want..." (reminds me of Norvig/Graham talking about Lisp). But I don't like
this article overall because I think it takes cheap shots at men in computing.

"I’m worried about the divorced couple in which the husband publishes slander
about the wife, including information about where she lives." When I first
encountered this statement, I thought it was being unnecessarily specific but
didn't think anything else of it. After finishing the rest of the article, I
think it was a purposeful choice, and in the current environment, I don't
think it's a fair statement to make. A male making the opposite statement
would likely be called out, so I'm calling her out here.

"At Berkeley, I was one of one or two women in classes of 100. No one ever
said, “Gee, you’re doing well, why don’t you work with me?” I didn’t know such
things went on. I went to graduate school at Stanford. When I graduated,
nobody talked to me about jobs. I did notice that male colleagues, like Raj
Reddy, who was a friend of mine, were recruited for academic positions. Nobody
recruited me." She didn't say this explicitly, but my interpretation is that
she's using this as evidence of a gender bias in recruiting. Clearly, gender
bias is one possibility, but what about giving people the benefit of the
doubt? There are other more charitable explanations (somebody who understands
conditional probability better than me could comment on this situation from a
probabilistic perspective).

"Back then, advisers placed graduates through deals with departments around
the country.

Yes, but nobody made deals for me. In the ’90s, I went back to Stanford for a
department celebration. A panel of the old professors, without knowing what
they were doing, described the old boy network. They said, “Oh, my friend over
there told me that I’ve got this nice young guy you should hire.” It was just
how it was. They were clueless. They talked about a young woman who did so
well because she married a professor! Clueless. Another colleague had a pinup
in his office. I asked him, “What’s that pinup in your office?” Clueless." I
love her directness here, but on the other hand her choice of words suggests
to me that she has a chip on her shoulder, something I wasn't expecting from
somebody so accomplished!

This reminds me of how greatly our beliefs affect our perception of the world,
like in the beginning of a relationship when everything seems perfect, but
after a fight, suddenly even the smallest things about your partner start to
annoy you. The quirkiness of their laugh, or the way they chew their food –
how did I miss how annoying that was before? But in reality they didn't change
at all – only my perceptions did, because I no longer liked them.

"Even so, is it correct that there were approximately 1,000 faculty members
when you started at MIT, only 10 of whom were women?

That was my recollection.

So there was progress, but …" I don't like the use of the word "progress"
here, but I realize that is potentially another highly inflammatory
discussion. Here, it's just more evidence that this discussion is politically
charged.

"My sense is that all scientific fields have failed to recognize some
foundational contributions by women.

In the 10 years before I was head of computer science at MIT, the department
identified only one woman worth hiring. When I was the head [from 2001 to
2004], I hired seven women. We didn’t scrape the bottom of the barrel. All
three junior women I hired are outstanding. There was a long period of time
where women were not considered at all." Is that last sentence meant to be
taken literally? I would be surprised if it were true. I'm ignorant of the
goings-on at these levels, but I have faith that the majority of people
involved in these decisions (around the time that she's referring to) were not
sexist (the mere fact that she became the department head is evidence of
this). I would also be interested in more details about why she was successful
in hiring women compared to the previous head.

The big tech companies are making an effort to hire more women. One way they
do this is by recruiting heavily at events where there is a preponderance of
women. Somehow, this is seen as not being discriminatory, but I disagree. It's
like there are two separate lines of applicants, and by choosing to focus on
one, the people in the other line are hurt (and to be clear, my position is
that I'd rather gender not be part of the equation at all, at least for tech
jobs).

I could go on about the article, but I think I've made my basic point.

I also realize that I'm subject to the same biases that I allude to above. I
too have a chip on my shoulder because I believe I was unfairly accused of
being a sexist at work. As a result, I've become highly sensitive around this
topic (e.g. I followed the James Damore events very closely) and I too view
things through an (even more) distorted lens.

~~~
jrochkind1
you are in fact the one who sounds like you have a chip on your shoulder.

~~~
oneepic
OP says as much, which I respect, even if I don't share his position. It's
just a rough subject these days.

------
vilhelm_s
Liskov's work on programming languages is (of course) well known. She then
switched research topic to distributed systems and among other things
developed the "Practical Byzantine Fault Tolerance" (PBFT) algorithm.

I thought this was interesting, because like 10 years ago I had never really
heard of her distributed systems work, and kind of thought of Liskov as a one-
hit-wonder. But now in the last few years PBFT is suddenly pretty hyped,
because it's a key part of a bunch of new blockchains, so maybe she was far
ahead of the curve...

~~~
dgacmu
She's done both for a long time. She came up with the replication protocol
that Lamport termed Paxos a year before Lamport did (she called it Viewstamped
Replication). That was 1988.

------
gwbas1c
Meta: Until I clicked this link, I had no idea who Barbara Liskov is.

Perhaps re-title this to "An interview with Barbara Liskov, The Architect of
Modern Algorithms"

~~~
dang
Not every title should explain itself. On HN, it's good for readers to work a
little.
[https://hn.algolia.com/?dateRange=all&page=0&prefix=false&qu...](https://hn.algolia.com/?dateRange=all&page=0&prefix=false&query=by%3Adang%20%22work%20a%20little%22&sort=byDate&type=comment)

Among other reasons, it interrupts the reflexive circuits and gives time for
the reflective ones to kick in.
[https://hn.algolia.com/?dateRange=all&page=0&prefix=true&que...](https://hn.algolia.com/?dateRange=all&page=0&prefix=true&query=by%3Adang%20reflexive%20reflect&sort=byDate&type=comment)

------
justicezyx
Poor journalism and writing ruined an otherwise thoughtful piece about an
influential computer scientist.

The root cause is probably that the author does not understand the ideas well
enough, so they felt compelled to overblow the implications...

~~~
_bxg1
Yeah, "You came of age professionally during the development of artificial
intelligence" in the opening question told me right off the bat that something
was off with the interviewer. Some version of AI research has been around
nearly as long as computers themselves, but the late 60s weren't exactly a
focal point like, say, the 80s were. Plus, her primary contribution to
computer science has little to do with AI directly. It felt like a
misunderstanding, and/or a reach for a hot topic.

~~~
farazbabar
The perceptron, which preceded both the Minsky-induced AI winter and the
current hype around deep learning, was in fact invented in 1958.

------
proc0
Hmmm...

"The language, CLU (short for “cluster”), relied on an approach she invented —
data abstraction — that organized code into modules."

Let's ignore how preposterous it is to claim you invented data abstraction
and move on to the module claim. A simple search for "who discovered
programming modules" shows:

"The Modula programming language ... by Niklaus Wirth, the same person who
designed Pascal. The main innovation of Modula over Pascal is a module system,
"

And it seems others were thinking the same way: 'After you won the Turing
Award, a comment appeared online saying, “Why did she get this award? She
didn’t do anything that we didn’t already know.”'

Not sure what to think here but I never heard CLU was the first to add
modules. I'll just leave it at that.

~~~
_pius
_And it seems others were thinking the same way: 'After you won the Turing
Award, a comment appeared online saying, “Why did she get this award? She
didn’t do anything that we didn’t already know.”'_

This is not only distasteful but false.

First, modules in Modula were primarily about _scope_ not abstraction.

Liskov pioneered a core aspect of _abstraction_ which, among other things,
allows modules to rely on abstract interfaces rather than their
implementations (or a type or class rather than its subtypes or subclasses).

Abstraction only works reliably if any provably true property of a type is
also provably true about its subtype.

Want to learn more? You’ll find it referenced in the literature as the
_Liskov_ Substitution Principle.

[https://dl.acm.org/citation.cfm?doid=197320.197383](https://dl.acm.org/citation.cfm?doid=197320.197383)
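A sketch of that idea in Python (all names here are invented for illustration): a property provable from the supertype's contract must survive substitution by any subtype, and a client can check exactly that.

```python
from abc import ABC, abstractmethod

# An abstract Counter type with one property clients may rely on:
# value() increases by exactly 1 after each bump().
class Counter(ABC):
    @abstractmethod
    def bump(self) -> None: ...
    @abstractmethod
    def value(self) -> int: ...

# Any property provable from Counter's contract must hold for every
# subtype; clients may then substitute one implementation for another.
def satisfies_contract(c: Counter) -> bool:
    before = c.value()
    c.bump()
    return c.value() == before + 1

class SimpleCounter(Counter):
    def __init__(self): self._n = 0
    def bump(self): self._n += 1
    def value(self): return self._n

# A subtype that changes the observable behavior (counts by 2)
# breaks substitutability even though it type-checks as a Counter.
class DoubleCounter(SimpleCounter):
    def bump(self): self._n += 2
```

`SimpleCounter` passes the check; `DoubleCounter` fails it, which is precisely the violation the substitution principle rules out.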

~~~
zozbot234
When did ML gain its module feature (which does enable the equivalent of data
abstraction)? From a _very_ casual search, the earliest references I can find
to it are from the early 1980s, so well after CLU - but the very first
versions of ML itself were being worked on around the same time.

~~~
_pius
That’s a good question, I don’t know the history well enough to comment on
that.

Much more important than CLU itself — groundbreaking as it was — was Liskov’s
formalization of substitutability as a rigorous principle one can use to
reason about the quality or correctness of an arbitrary abstraction (or model
thereof).

