
Bill Manning has died - dredmorbius
https://rdvlivefromtokyo.blogspot.com/2020/01/bill-manning.html
======
lbenes
For anyone wondering who he was or his contributions:

> Manning has been working in the Internet industry since 1979 when he started
> working at Texas Instruments and helped in building its IP network. After
> which he joined Rice University and made SESQUINET. He played a significant
> role in the migration of MIDNET and SESQUINET from NSFnet regional networks
> to commercial networks.

> He worked on the COREN and CALREN-2 technical committees. At ISI he worked
> in the Routing Arbiter Project.

> Bill has been working with the IETF and IEPG as an individual
> participant, working group chair, and code developer. He specified the method
> to add NSAP support to the DNS.

[https://icannwiki.org/Bill_Manning](https://icannwiki.org/Bill_Manning)

~~~
dghughes
Thanks for that.

I always feel bad seeing that someone has died and how important they were in
computer science but it's the first time I've heard of them.

------
ipnon
When I see eulogies like this for people I've never heard of, I'm reminded
sometimes that everything good in my life that I take for granted exists
because of someone somewhere just doing the best they can. It helps ease the
angst.

------
bifrost
I knew Bill professionally, he was a great guy and we'll all be at a loss
without him around.

Back in 2003 when I started SFIX he was instrumental in getting it off the
ground.

I can't believe he's gone.

~~~
Nexxxeh
I'm sorry for your loss.

------
hinkley
Some stand-up comic in the '90s had a joke to the effect of:

And when you hit a certain age, your friends start playing a game called,
"Guess Who Died?"

This seems like the year when all the DotCom era kids start playing the game.

~~~
dredmorbius
Flipside is that many of the founding fathers of computers, software,
networking, and security are still alive, and can still tell their stories.

That's an opportunity for any historians out there.

~~~
gunglefunk
One of the best experiences of my college life was escorting Tony Hoare around
the campus during his several talks that day (as well as attending those talks
of course). I distinctly remember thinking how lucky I was to be around during
the time of the giants of my field. Like a physics major getting to sit and
talk with Newton or similar.

------
krustyburger
> On other occasions, when the waiter asked for his order, Bill would point to
> another person at the table, and say, "I'll have what she's having." "Well,
> what is she having?" "I don't know, I haven't heard her say." Once in a
> while, he would point to someone else in the restaurant and say, "I'll have
> what they are having." It was funny and sometimes disconcerting, which was
> very Bill, and it was also his way of making sure he himself was eating (and
> thinking and doing) as broadly as possible, without getting stale.

What an endearing fellow. I wish I was more like him.

------
tombert
I'm always impressed (and admittedly envious) when I see a person who manages
to get a PhD while skipping the bachelor's degree; being a dropout I wish I
could do that, but of course I didn't build TI's IP network or SESQUINET :)

RIP Bill; HN should consider putting a black bar.

~~~
wpietri
Does anybody know more about how this happens? As a fellow dropout, I've
sometimes wanted to go back for a master's, but there's no way I'm spending n
years getting a bachelor's first.

~~~
tombert
From the research I've done on this, it looks like you typically have to have
made huge contributions to a field already, and have a professor or ten
advocating for you at a university to accept you into the PhD program.

I've emailed about twenty math professors in the NYC area asking to be let
into a master's or PhD program, since I thought my work history for the big
"brand name" compsci research companies and my work as an ostensible research
scientist at a big university would suffice, but sadly not a single one
was even willing to meet with me.

~~~
toomuchtodo
Knowledge > credentials. Hope it doesn’t get you too down, sometimes the only
winning move is to not play.

~~~
tombert
You're not wrong, though it does sadly preclude me from the more interesting
compsci/math research positions (stuff involving type theory and compilers and
whatnot); most universities and research teams (understandably) have a firm
minimum requirement of a master's degree, preferring a PhD.

I totally understand why they do this...if you're dedicated enough to get a
PhD in a field then you're almost-inherently going to be a decent researcher,
so I can't blame whoever is in charge of hiring at places like MS Research
for having these criteria. It's just a bit sad that this means, at minimum,
even if I were an otherwise perfect candidate, I probably couldn't get a job
doing real research for at least another seven years (3 to finish the
bachelors, and 4 to get the PhD if I'm being _very_ generous).

EDIT: To those wondering if this is contradictory to my previous comment, I
will clarify; I do work for one of the big brand name compsci research
companies, but (sadly) not doing research. When I worked for a big university,
my title was technically "research scientist", but my actual job could have
been more-accurately described as "code-monkey", and that's putting it
generously.

~~~
toomuchtodo
Never give up on applying for roles you believe you’re qualified for even
though you’re missing some letters; every once in a while someone breaks the
mold and hires based on the right signals (versus a checkbox). Applying is
free, and you miss 100 percent of the shots you don’t take.

~~~
tombert
Oh, I do keep applying, I just try not to get my hopes up.

------
jascii
Maybe there is a little bit of solace in knowing that, in some sense, he lived
more in his too-short life than most of us will in our full-length ones.

------
Snelius
RIP

------
tabtab
Re: _...we called Bill "the bad idea fairy". He always brought a slightly-off-
kilter view of technical problems, which triggered endless discussions of
fascinating, if usually implausible, alternatives._

I've been "accused" of similar. Although, I can't claim I'm as successful as
Bill. I tend to ruffle sacred cows. It's not like I'm trying to agitate
people, I just view the world a bit different from established opinions.

For example, we have dynamic programming languages, but not dynamic RDBMS
(that use SQL or a close variation). Why not?

And we have XML as a fairly flexible meta-standard for data structures, but
why not something comparable for C-like-syntax programming languages? You
could roll your own programming language without creating a parser from
scratch (and hopefully mix and match behavior based on "part kits").

Both of these seem like logical extensions of existing tools to me, but I get
a lot of vague flak. It's as if the universe is calling for them, not just me,
yet nobody cares. It's not outright inventing new concepts; it's just taking
concepts that worked well in one computing domain and applying them to
another. They might fail, but so might every other IT experiment going on
currently.

~~~
samatman
> _For example, we have dynamic programming languages, but not dynamic RDBMS
> (that use SQL or a close variation). Why not?_

Depending on what you mean, SQLite is exactly this.

It is relational, so one does have to structure tables, but I consider this
equivalent to naming variables.

Unlike most (all?) other SQL databases, however, the declared 'types' of
columns are just suggestions, influencing only how SQLite stores and retrieves
values internally.

But you can put a string in an INTEGER field, SQLite won't stop you.
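A quick demonstration of that behavior, using Python's built-in sqlite3 module
(table and column names are just examples). A value that looks numeric gets
coerced by the column's INTEGER affinity; anything else is stored as-is:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE t (n INTEGER)")

# A non-numeric string goes into the INTEGER column untouched.
con.execute("INSERT INTO t VALUES (?)", ("not a number",))
# A numeric-looking string is coerced to an integer by column affinity.
con.execute("INSERT INTO t VALUES (?)", ("42",))

rows = con.execute("SELECT n, typeof(n) FROM t").fetchall()
print(rows)  # [('not a number', 'text'), (42, 'integer')]
```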

~~~
tabtab
But you have to pre-specify the columns. In a truly dynamic DB, you could
enter "INSERT INTO foo (myNewColumn) VALUES (123)" and both the table "foo"
and the column "myNewColumn" would be created on the fly. SQLite doesn't do
this. Further, one could add restrictions as needed to "lock down" new column
creation, and add type-like validation ("parse" checks). Thus, as requirements
solidify, it could act more like a traditional RDBMS.
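For what it's worth, the create-on-insert behavior can be faked as a thin
wrapper over SQLite. A toy sketch (the `DynamicDB` class and names are
hypothetical, not a real library), which auto-creates the table and any
missing columns before inserting:

```python
import sqlite3

class DynamicDB:
    """Toy wrapper: tables and columns come into existence on first insert."""

    def __init__(self):
        self.con = sqlite3.connect(":memory:")

    def insert(self, table, **values):
        # Create the table on first use.
        exists = self.con.execute(
            "SELECT name FROM sqlite_master WHERE type='table' AND name=?",
            (table,)).fetchone()
        if exists is None:
            self.con.execute(f"CREATE TABLE {table} (_id INTEGER PRIMARY KEY)")
        # Add any columns we haven't seen before (no declared type).
        existing = {row[1] for row in
                    self.con.execute(f"PRAGMA table_info({table})")}
        for col in values:
            if col not in existing:
                self.con.execute(f"ALTER TABLE {table} ADD COLUMN {col}")
        cols = ", ".join(values)
        placeholders = ", ".join("?" * len(values))
        self.con.execute(
            f"INSERT INTO {table} ({cols}) VALUES ({placeholders})",
            tuple(values.values()))

db = DynamicDB()
db.insert("foo", myNewColumn=123)  # "foo" and "myNewColumn" created on the fly
print(db.con.execute("SELECT myNewColumn FROM foo").fetchall())  # [(123,)]
```

The "lock down" phase would then amount to refusing the CREATE/ALTER branch
once the schema is frozen. (String-formatted identifiers like this are unsafe
with untrusted input; it's purely a sketch of the idea.)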

------
warbaker
Really tragic. RIP.

