It is too early for those of us who loved him to recount "Bert stories", and especially "Bert and Ivan" stories, but Dan has provided the link to the YouTube video of the CHM tribute to the two brothers. Everyone should also read the Wikipedia article about Bert.
Bert's PhD thesis is most often characterized by its title, "Online Graphical Specification Of Procedures", but once you look at it you realize that he was one of the first (if not the first) inventors of "dataflow" programming, and in fact this thesis was central to many "prior art" defenses used to quash lawsuits over dataflow ideas.
Another dimension to Bert's scientific and engineering career that is not mentioned enough is that he was one of the earliest and main drivers of what is called CAD today (a rather small number of people in different places made this happen in the early 60s -- including Bert's brother Ivan -- and Bert focused some of the powerful human and computing resources of Lincoln Labs on this vital technology).
Bert's personality was sunny, friendly, and "sweetly firm", to the point that many people clamored to have him as their manager (including, only half-jokingly, Ivan). I was completely thrilled when Parc brought in Bert to run the Systems Science Lab, in which my group, Lynn Conway's group, Bill English's group, etc., were all ensconced.
Bert, as with the other enlightened ARPA research managers, knew that "the geese wanted to lay their golden eggs" and that the manager's job was to support those efforts, not to try to tell the geese how to lay the special eggs. He was superb at this, and many critical inventions and systems happened because he was the nurturer.
I guess I should tell a "Bert and Ivan" story. Their father was a civil engineer who brought not just blueprints home but gadgets and kits for the two brothers -- who were just two years apart in age -- to play with. Bert would recall that Ivan was so smart that he would just start putting the stuff together while Bert read the manual. At the 95% point Ivan would get stuck and Bert would know what to do next. The two brothers with very different personalities got along wonderfully well over their entire lives, and would occasionally do a company together.
A big deal when the kids were young was their mother driving them down from Scarsdale to Murray Hill to Bell Labs to meet Claude Shannon. Years later at MIT, Shannon wound up being a thesis supervisor of both of their PhDs done a few years apart.
I think most of us from 50+ years ago in the ARPA community just revered and were in awe of the research generations that came before us, especially the one right before us. It was tough to do computing back then, but they didn't let this bother them at all. They would program anything they wanted to have happen -- mostly in machine code -- and they would design and build any hardware they needed to run the programs they needed -- mostly with discrete components and relatively high voltages over sometimes acres of computer.
They showed us how to work and play and design and sculpt and the deep art that lies behind the components. We can never thank them enough, and can only "pay forward" by helping those who come after us.
His Ph.D. dissertation is here: https://dspace.mit.edu/handle/1721.1/13474. It was some of the earliest work on dataflow and graphical programming. I know this because Alan Kay told me to read it, so I did. You should too.
Edit: Bert, of course, was Ivan Sutherland's (of Sketchpad) older brother. There's a delightful dialogue with the two of them from 2004: https://www.youtube.com/watch?v=sM1bNR4DmhU ("Mom Loved Him Best: Bert & Ivan Sutherland").
For those who aren't familiar, the TX-2 is one of the systems that led to modern interactive computers. It was far more powerful than you would expect for a computer in 1958; the company DEC was basically founded on this computer's architecture, and remarkably advanced computer graphics and image-perception work was being done on it (https://dspace.mit.edu/bitstream/handle/1721.1/11589/3395912...)
Lincoln Labs, where it was hosted, was a fertile area of research at a time when far more people were in favor of academic military research (MIT played a huge role in WWII).
I built a visual programming system for grid computing a while ago, and it's interesting how often this paradigm keeps coming up in random places: music making, Blender effects pipelines, etc. You'd think that researchers would design neural network architectures in a visual programming space, not in a programming language.
Give it a few years, this will definitely happen.
Other than the notion of "layers of neurons", there isn't much about NNs that I would consider graphical -- and that term is merely a remnant of the original motivation for NNs back in the day. Today, this metaphor has mostly been abandoned, and we think of NNs more as what they really are: combinations of linear and non-linear functions.
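That "combinations of linear and non-linear functions" view can be made concrete with a minimal sketch in plain Python (all names here -- `linear`, `relu`, `compose`, `net` -- are illustrative, not from any real library): a small feed-forward network is literally just function composition.

```python
# A tiny feed-forward "network" as a composition of linear and
# non-linear functions -- no graph metaphor required.
# All names are illustrative stand-ins, not a real NN library.

def linear(weights, bias):
    """Return a function computing weights @ x + bias over plain lists."""
    def apply(x):
        return [sum(w * xi for w, xi in zip(row, x)) + b
                for row, b in zip(weights, bias)]
    return apply

def relu(x):
    """Elementwise non-linearity."""
    return [max(0.0, v) for v in x]

def compose(*fns):
    """Chain functions left to right: compose(f, g)(x) == g(f(x))."""
    def apply(x):
        for f in fns:
            x = f(x)
        return x
    return apply

# Two "layers": linear -> relu -> linear
net = compose(
    linear([[1.0, -1.0], [0.5, 0.5]], [0.0, 0.0]),
    relu,
    linear([[1.0, 1.0]], [0.0]),
)

print(net([2.0, 1.0]))  # -> [2.5]
```

The point is that nothing here is inherently pictorial; the "architecture" is just the order of composition.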
You may be thinking of sequence processing, where a basic network is repeated in various ways. This is already often displayed as a graph of sorts today.

But I don't see how this would obsolete using a programming language. The strength of visually laying out a network architecture lies in communicating it to other humans, not in implementing it for a machine.
This is because the building blocks are often "stable" in the software-maintenance sense, and when designing a special-purpose NN most of the hard work seems to be in their configuration and composition. In many projects you'd only ever configure/compose pre-existing functions like "Conv2d" and "Softmax", rather than write them yourself. And the graphical programming model excels at configuration and composition, so it's a natural fit. However, when you have to customize the building blocks, or break away from the "data flowing through functions" pattern, that advantage quickly vanishes.
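The "configure and compose pre-existing blocks" style described above can be sketched in a few lines of plain Python: a pipeline is just data (block name plus config), which is exactly the kind of thing a graphical editor would emit. The block implementations here (`scale`, a toy stand-in for a configurable layer, and a real `softmax`) and the `BLOCKS` registry are hypothetical, not any library's API.

```python
import math

def scale(factor):
    """Toy stand-in for a configurable building block (e.g. a layer)."""
    return lambda xs: [factor * x for x in xs]

def softmax():
    """Numerically stable softmax over a list of floats."""
    def apply(xs):
        m = max(xs)
        exps = [math.exp(x - m) for x in xs]
        s = sum(exps)
        return [e / s for e in exps]
    return apply

# Registry of pre-built blocks; a graphical tool would only ever
# reference these by name and supply their configuration.
BLOCKS = {"Scale": scale, "Softmax": softmax}

def build(pipeline):
    """Turn a declarative pipeline [(name, config), ...] into a callable."""
    fns = [BLOCKS[name](**cfg) for name, cfg in pipeline]
    def run(x):
        for f in fns:
            x = f(x)
        return x
    return run

# This list of (name, config) pairs is the part a visual editor excels at.
model = build([("Scale", {"factor": 2.0}), ("Softmax", {})])
print(model([0.0, 1.0]))
```

The moment you need a block that isn't in the registry, you're back to writing code -- which is the "advantage quickly vanishes" point above.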
I'm not claiming that every implementation of a NN would go better with a graphical programming tool. But just like with Audio/DSP tools I can easily see lots of people being able to justify using that kind of thing.
It was always cool to see it, but I wasn’t aware of the significance until now.
Edit: Looks like I was wrong. The computer in the LL library is more likely the predecessor TX-0.
In 2020, he would say: "All engineers are the computer's assistant".
Where did we lose our way, or have we?
This was huge. In hindsight, I'm amazed he trusted me as a teenager that way, but having been given that trust I never abused it. I'm forever grateful for the way he helped me get started on a path that I've followed my whole life.
This one appears to be the first (18 hours ago):
The Computer History Museum has since tweeted about it:
I've added their tweet as a citation on the Wikipedia page.
/me wonders how many 'modern' managers approach their work with that kind of sensibility.
We definitely don't see enough deference to those fields of expertise, I think.
He seems like a neat guy.
Just a note: the Wikipedia article currently doesn't cite any sources stating that he has died, and I wasn't able to find anything on the internet. So it's not totally clear at this moment that the title is true.
The fact that computer science is one of the few domains where new entrants simply accept the status quo and do not spend a minute on understanding how we got here is a problem. For one, it creates a disconnect between the older (and often wiser) generation in the field and the newcomers. For another, that disconnect results in endless re-invention of the same wheels, because by the time the new cadre has acquired the wisdom, the cycle will repeat.
Try imagining a modern day genetics student who does not know who Crick & Watson are. That's your typical CS specialist.