
Peter Naur has died - cperciva
https://en.wikipedia.org/wiki/Peter_Naur
======
fsloth
Peter Naur's paper 'Programming as Theory Building' is perhaps the most lucid, thought-provoking, and practically useful single thing I have ever read on the sociological aspects of software engineering. Right up there with Brooks's "The Mythical Man-Month" (though the value density of the former is much higher, since it's a paper and the latter is a book).

~~~
wclax04
Link to the paper:
[http://www.dc.uba.ar/materias/plp/cursos/material/programmingAsTheoryBuilding](http://www.dc.uba.ar/materias/plp/cursos/material/programmingAsTheoryBuilding)

~~~
hcrisp
Briefly looking at the paper, it seems what he calls the "theory of the
program" is akin to what Fred Brooks called "conceptual integrity". Am I
wrong?

~~~
dang
I think that's right. The terms 'design' and 'model' are often used for this
nowadays. Naur's point is that a program is a shared mental construct that
lives in the minds of the people who build it. The source code is not the
program. It's the canonical written representation, but a lossy one.

------
mixmax
It's interesting how many programming languages were made by Danes.

Naur of course contributed to ALGOL 60, but did you know PHP was written by Rasmus Lerdorf? Anders Hejlsberg developed Delphi, Turbo Pascal, and C#; Bjarne Stroustrup developed C++; Ruby on Rails was developed by DHH. All Danes.

For a nation of 5 million people, I think this is quite unusual.

~~~
webkike
Ruby on Rails is not a unique programming language. Matz created Ruby.

~~~
JadeNB
> Ruby on Rails is not a unique programming language. Matz created Ruby.

What does "not a unique programming language" mean? Perhaps 'discrete' or
'distinct' in place of 'unique'? As not an RoR developer, I'm hardly in a
position to say, but surely at a certain point a DSL deserves to be called a
language in its own right. (Extreme case (that, I realise, is far past what
anyone would call a DSL): Perl, as with many other languages, is written in C,
but certainly counts as a separate programming language.)

EDIT, in response to two downvotes: surely it's a legitimate question? I
genuinely don't know what it means to say that something isn't unique (except
mathematically, that it means that there's more than one of it). Assuming that
it means what it seems to mean, how _does_ one draw the line that I mentioned?

~~~
tamana
Rails is not a programming language, for various pedantic reasons. That doesn't make your idea bad: Rails is a defined pattern of expressing a program, conceptually similar to a language. You could call it a dialect.

~~~
phyllostachys
Rail[1] is a programming language, but that is orthogonal to your point.

[1] - [http://esolangs.org/wiki/Rail](http://esolangs.org/wiki/Rail)

------
Tistel
It's a bit weird being a small part of computer science. The big early movers and shakers were still around for most of our lives; the whole field is within living memory. Now they are starting to pass away: Engelbart, McCarthy, etc. Think how far physics got from Newton to the Manhattan Project. Where will CS be in 100 years? We have so much work to do.

~~~
coldtea
> _Think how far physics got from Newton to the Manhattan Project. Where will
> CS be in 100 years? We have so much work to do._

Might not be so. You're applying the same progress curve to both, but remember
that Newton developed his theories at the start (or close) of the scientific
era, without so many institutions, collaborations, communications,
infrastructure, experimental apparatus, computer simulations, the necessary
math etc.

Computing was developed in the middle (or close) of the 20th century, and had
all that. It also saw much faster development cycles, so the speedup curve we
saw with physics since Newton won't really apply to it (I mean not in the same
timescale: it could still apply "compressed" in the last 5 decades).

I'd say we're already in the Manhattan Project era of computing. And with Moore's law expiring and other limits being reached, we won't be moving that much faster in the future -- much like physics since the '50s.

~~~
sillysaurus3
_I'd say we're already in the Manhattan Project era of computing._

There's no possible way.

We were born at the beginning of civilization. There were the ancients, then
us. A hundred thousand years from now, what we write today has a chance of
still being around. The language that people speak will be completely alien.
But what we write -- our programs -- will still be runnable. There will be
emulators for our architectures, and random work from random people in 2015 AD
will happen to persist to 102,283 AD.

We're going to be the oldest ancients that they consider real. They'll be able to look at and interact with our work, and even modify and remix it. And we'll be thought of as just barely more advanced than cave dwellers.

Now, with that sort of context, there seems no possible chance that we can know how far away we are from the Manhattan Project era of computing. Very far, suffice it to say.

To a certain extent, it's not possible to compare the progress in physics with
computing. On one hand, we've unravelled nature to the point where there are
no mysteries left except in high-energy physics. There's also no way to know
how profoundly the remaining mysteries might impact the world.

Whereas the depth and the mysteries within computing are eternal. If there are
still humans in 102,283 AD, there will still be computing, and they'll still
be coming up with ever-more complex ways of computing. All of the
institutional effort between now and then will push the field way, way beyond
the limited horizons that we can currently imagine.

~~~
coldtea
> _Now, with that sort of context, there seems no possible chance that we can
> know how far away we are from the Manhattan Project era of computing._

Just piling up years doesn't mean much. Yeah, civilization might go on for 100,000 years. Or 1,000,000 years. Or 20,000,000. That doesn't mean it will progress the same way it did in the first x years.

Just think of how we had Homo sapiens for 200,000 years, yet progress was almost flat until around 4-5 thousand years ago. The mere availability of time doesn't ensure incremental progress at scale, or even monotonically increasing progress.

There are things such as low hanging fruit, diminishing returns, etc.

One can already see in physics that the most earth-shattering progress was made up until around 1930-1950, and from then on it's mostly been slim pickings. When you start fresh, there is lots of low-hanging fruit to pick. At some point, you've reached several limits (including physical limits in measuring and experimental equipment, without which you can't go further, and which you can't overcome because they're, well, physical limits).

And that's even without taking into account a regression (e.g. because of a huge famine, a world war, nuclear war, environmental catastrophe, a deadly new "plague"-like disease, etc.).

~~~
sillysaurus3
_One can already see in physics that the most earth-shattering progress was made up until around 1930-1950, and from then on it's mostly been slim pickings. When you start fresh, there is lots of low-hanging fruit to pick. At some point, you've reached several limits (including physical limits in measuring and experimental equipment, without which you can't go further, and which you can't overcome because they're, well, physical limits)._

Certainly. And you've hit on the core reason why computing is so eternal:
there aren't physical limits.

When a computer becomes ten times faster, we don't affect the world ten times
more profoundly. So the physical limits that hold back clock speeds don't
matter too much. But when we suddenly no longer need to own cars because we
can hail one on demand, our lives become completely different.

The limitations are social, rather than physical. Our own minds, and our lack
of ability to manage complexity, is the primary bottleneck standing between us
affecting the world, right now. Tonight. Imagine a hypothetical superhuman
programmer who can write hundreds of thousands of applications per day. Think
of how that'd reshape the world with time.

It seems fair to say that the amount it affects the world is linear w.r.t. time: the longer that superhuman programmer churns out permutations of apps in as many fields as possible, the more the world changes.

But that's exactly what's happening: hundreds of thousands of applications are
being written per day, by humanity as a whole. Project that process forward to
102,283 AD. Are you sure the rate of change will be a sigmoidal falloff like
physics?

~~~
coldtea
> _Certainly. And you've hit on the core reason why computing is so eternal:
> there aren't physical limits._

On the contrary, there are several: the speed of light, the Planck constant, heat issues, interference issues, issues with printing ever-smaller CPU transistors, plasma damage to low-k materials (whatever that means).

And all kinds of diminishing-returns situations in computer science (e.g. adding more RAM stops having much of a speed impact beyond some threshold; or you can build a supercluster of tens of thousands of nodes, but you're limited by the communication speed between them unless the job is totally parallelizable; etc.).
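
To put rough numbers on that last point: this is Amdahl's law, under which the speedup from n nodes is capped at 1/(1 - p) when only a fraction p of the job is parallelizable. A minimal sketch in Python (illustrative figures, not from any real benchmark):

    # Amdahl's law: overall speedup from n workers when only a
    # fraction p of the job can run in parallel.
    def speedup(p: float, n: int) -> float:
        return 1.0 / ((1.0 - p) + p / n)

    # With 95% of the work parallelizable, even 10,000 nodes barely
    # approach the hard cap of 1 / (1 - 0.95) = 20x.
    print(round(speedup(0.95, 64), 1))      # ~15.4
    print(round(speedup(0.95, 10_000), 1))  # ~20.0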

> _When a computer becomes ten times faster, we don't affect the world ten
> times more profoundly. So the physical limits that hold back clock speeds
> don't matter too much._

Huh? What does that mean? If we can't make faster CPUs, then we're not getting much further. Increased cooling etc. only helps up to a point.

> _Imagine a hypothetical superhuman programmer who can write hundreds of
> thousands of applications per day. Think of how that'd reshape the world
> with time._

We already have hundreds of thousands of applications. It's not exactly what
we're lacking. We're talking about qualitative advances, not just churning out
apps.

~~~
sillysaurus3
_If we can't make faster CPUs, then we're not getting much further._

On the contrary. In our day to day lives, we can do far more today than we
could a decade ago.

In 2006, Intel released Pentium 4 chips clocked at 3.6 GHz. In 2016 we use quad-core chips where each core is about as fast as that. Yet we're far more powerful today than a mere 4x multiplier would suggest, if you measure what we can _do_. Think of how limited our tools were just a decade ago.

Clock speed isn't a great measurement, but the point is that making CPUs faster isn't what makes the field more powerful.

It seems like we're talking past each other. I was referring to effects that
computers have on the world and our daily life, but it sounds like you're
referring to total worldwide computation speed, and how it will change over
time. If you're saying the rate will slow sigmoidally, similar to the progress
in physics, I agree.

But the thesis is that the rate at which the world changes w.r.t. the field of computing is unrelated to the total available computation power. Our minds are the bottleneck.

Compare this situation with the field of physics: the rate at which the world changed w.r.t. physics was tied to how important the discoveries were.

The contrast is pretty stark, and it might have some interesting long-term
consequences.

~~~
TeMPOraL
> _It seems like we're talking past each other. I was referring to effects
> that computers have on the world and our daily life, but it sounds like
> you're referring to total worldwide computation speed, and how it will
> change over time._

Interesting, because you seem to be arguing for each other's side. If you look at what we have, then yes: for scientific purposes, total computational power has increased enormously and continues to do so. But if you ask what effects computers have on the daily life of you and me, then not much has changed in the last two decades. Software bloat, piling up abstraction layers, turning everything into webapps - it all eats up the gains in hardware. Yes, the screens have better resolution and we now have magic shiny rectangles as another interface, but it seems like software only gets slower and less functional with time.

> _Yet we're far more powerful today than a mere 4x multiplier would suggest,
> if you measure what we can do._

Scientists? Maybe. SpaceX can run moar CFD for their rockets on a bunch of graphics cards than the entire world could a decade or two ago. The rest of us? Not so much. It really feels like our tools are getting _less_ powerful with time, and I don't feel like I can do _that_ much more (and if anything, the primary benefits come from faster access to information -- Googling for stuff instead of being stuck with just the spec and the source code speeds things up considerably).

~~~
sillysaurus3
It's pretty clear at this point that I've failed to communicate. I'll bow out
now.

I have a magical device in my pocket that can summon a car on demand.

Two effective hackers can set up a complete product within a few weeks, and
then host it without having to think too much about what we now call devops.
And when their servers start to melt, they can spin up however many they need.

We no longer get lost. Remember what it was like to get lost? Like, "I have no
idea where I am. Hey, you there. Do you know how to get over to X street?"

These things were not possible ten years ago. Maybe people here simply don't remember, or choose to forget. Or maybe I just suck at writing. But every one of these incredible advances was thanks to advances in the field of computing, both theoretical and practical. For an instance of the former, see the recent neural-net advancements; for the latter, Rails, Homebrew, the pervasiveness of VMs, and everything else that our forerunners could only dream of but we take for granted.

Have a good evening, and thanks for the enjoyable conversations. You and
coldtea both do really cool work.

~~~
coldtea
> _I have a magical device in my pocket that can summon a car on demand._

As a gadget lover, it seems magical to me too, especially since I once lusted after things like a ZX Spectrum. But in the big picture, is it really life-changing technology? You could do more or less the same thing in 1970 with a landline and a cab service (and in 1995 with a mobile phone). I'm not sure about the US, but where I live I have used cab services all the time since the eighties -- it took around 10 minutes from the phone call for a cab to get to you, so not totally unlike hailing one with an iPhone.

Same for not getting lost. GPS is nice and all, but was getting lost much of an everyday problem in the first place (for regular people, of course, not trekkers etc.)? Maybe for tourists, but I remember the first 3-4 times I visited the States, when I did two huge road trips with Rand McNally road maps, and I didn't have much of an issue (compared to later trips, when I used an iPhone + Navigon). I got lost a few times, but that was it; I could always ask at a gas station, or find where I was on the map and take the exit in the right direction.

I'd go as far as to say that even the two biggest changes of the internet age, fast worldwide communications and e-commerce, haven't really changed the world either.

Some industries died and some thrived -- as happens -- and we got more gadgets, but nothing as life-changing as when printing or toilets or electricity or cars or even TV were developed (things that brought mass changes in how people lived and consumed, in how states functioned, in urbanization, in societal norms and mores, etc. -- heck, even in nation-building; see e.g. Benedict Anderson on the role of print in the emergence of nation states).

What I want to say (and this is a different discussion from the original one about limits to computing power over time, etc.) is that technology also has some kind of marginal returns. Having a PC in our office/home was important and enabled lots of things. Having a PC in our pocket enabled a few more things (almost all due to the added mobility). Having a PC on our wrist? Even fewer (we already had the PC-plus-mobility combination).

> _Have a good evening, and thanks for the enjoyable conversations. You and
> coldtea both do really cool work._

Thanks, but don't let our counter-comments turn you off! The way I see it, we're painting different pictures of the same thing (based on our individual experiences and observations), and those reading can weigh them or compose them into a fuller picture.

------
cperciva
Via Poul-Henning Kamp:
[https://twitter.com/bsdphk/status/683750072841072640](https://twitter.com/bsdphk/status/683750072841072640)

I haven't seen any "official" announcement, so I figured Peter Naur's Wikipedia page was the best link to use.

~~~
jmartinpetersen
There's an article in Danish at [http://www.version2.dk/artikel/datalogi-pioneren-peter-naur-er-doed-538017](http://www.version2.dk/artikel/datalogi-pioneren-peter-naur-er-doed-538017)

DIKU (University of Copenhagen Department of Computer Science) linked to that
one. Naur was the first professor at DIKU and one of the founders of the
institute.

------
scoot
I first became aware of computers as a thing with the launch of 8-bit home computers in the early 80s. As a result, I seem to have a hard time imagining the computing scene before then, as even in the 80s it seemed so nascent -- so the fact that computers were a big enough thing in 1959 for there to be a Danish institute of computing blows my mind. Everything before the 80s seems so abstract. Any recommended reading/viewing to help me rebase my historical perspective?

~~~
neffy
_The Psychology of Computer Programming_, Gerald M. Weinberg (1971)

and

_The Soul of a New Machine_, Tracy Kidder (1981)

ought to be required reading for anybody in tech.

~~~
dandrews
I'll easily second this recommendation. 30 years ago I was struck by Tom
West's quote from Kidder's book: "Not everything worth doing is worth doing
well".

------
orionblastar
Interesting that ALGOL 60 inspired other languages such as Pascal and C as
well as Ada, etc.

[https://en.wikipedia.org/wiki/ALGOL_60](https://en.wikipedia.org/wiki/ALGOL_60)

~~~
poizan42
Naur also bore a good deal of the responsibility for getting support for recursion into ALGOL 60: [https://vanemden.wordpress.com/2014/06/18/how-recursion-got-into-programming-a-comedy-of-errors-3/](https://vanemden.wordpress.com/2014/06/18/how-recursion-got-into-programming-a-comedy-of-errors-3/)

Just think how the programming language landscape might look today if recursion had never gotten into mainstream languages, or had only recently begun to become the norm.
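
Hard to picture now, but without language-level recursion, something as routine as a tree walk means managing the call stack by hand. A rough sketch of the contrast in Python (the node type with left/right fields is hypothetical, just for illustration):

    # With recursion: the language's call stack does the bookkeeping.
    def count_nodes(node):
        if node is None:
            return 0
        return 1 + count_nodes(node.left) + count_nodes(node.right)

    # Without it: an explicit stack, the way pre-recursion languages
    # would have forced you to write it.
    def count_nodes_explicit(root):
        count, stack = 0, [root]
        while stack:
            node = stack.pop()
            if node is not None:
                count += 1
                stack.append(node.left)
                stack.append(node.right)
        return count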

------
protomyth
I guess I should be grateful that our profession gets to live in the same time
as the masters. RIP

------
leonroy
Literally spent last night reading up on the work he did on BNF. He sounded like a very capable and, more unusually, a modest man.

------
morenoh149
Link to a retyped version of the revised "Report on the Algorithmic Language ALGOL 60": [http://datamuseum.dk/site_dk/rc/algol/algol60retyped.pdf](http://datamuseum.dk/site_dk/rc/algol/algol60retyped.pdf)

A very famous paper.

------
dopeboy
Anyone who has taken a compiler class has written something in BNF. RIP.

------
jarmitage
Can anyone recommend (a) good resource(s) for learning about BNF?

~~~
mahmud
The Wikipedia page does a good enough job. It shouldn't take more than 1-3 pages; it's a fairly simple concept.

[https://en.wikipedia.org/wiki/Backus%E2%80%93Naur_Form#Example](https://en.wikipedia.org/wiki/Backus%E2%80%93Naur_Form#Example)

I stumbled into compilers by accidentally buying the _Dragon book_ as a teen. I got through the BNF part fairly quickly.
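
For a sense of how little there is to it, here's the classic toy grammar for arithmetic expressions in BNF (a sketch along the lines of the Wikipedia example, not copied from it):

    <expr>   ::= <term> | <expr> "+" <term>
    <term>   ::= <factor> | <term> "*" <factor>
    <factor> ::= <digit> | "(" <expr> ")"
    <digit>  ::= "0" | "1" | "2" | "3" | "4" | "5" | "6" | "7" | "8" | "9"

Each rule names a symbol on the left and lists the forms it can take on the right, with | separating alternatives; that, plus rules referring to themselves (note <expr> and <term> are recursive), is essentially the whole formalism.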

------
smegel
Never heard of him, but I think I need to know about him. I find this book he
wrote to be very enticing:

> Computing: A Human Activity

Sounds like a forerunner to Agile programming.

~~~
nine_k
Isn't he the N in BNF?

~~~
phkamp
Yes.

~~~
ymse
Speaking of prominent Danish programmers... :)

------
s_dev
I think he deserves a black bar? Anyone else think so? He made some serious contributions in his lifetime to the world of comp sci and general hackery. How do we vote to initiate a black bar?

~~~
mintplant
Sometimes I think introducing the black bar was a mistake. Now every death has
to be judged as to whether the person is "worthy" or not of a black bar. This
caused a lot of tension in the threads about Ian Murdock's death, for example.

~~~
mschuster91
Agreed, but what angered me a bit about Murdock's death is how fast it fell off the front page despite, IIRC, over 1300 upvotes.

~~~
shawn-butler
The editorializing choices HN staff made regarding Ian were poor at best. Not surprised they locked the discussion down.

But this is a different topic. Any Turing Award winner deserves recognition and reflection on their passing, IMHO.

~~~
chris_wot
I'm not sure I'm following this comment. What editorializing did the HN staff
do? I didn't see any.

If anything, I may have inadvertently got it caught up in some sort of flame
filter because of a comment I wrote about asking for help if you feel
suicidal, then (seemingly, but not actually) contradicting myself later by
saying that medical help may not be the best solution.

Flame filters, AFAIK, kick in automatically. Mental health is a hot-button issue, and flame fests and controversial discussions are really not good or appropriate for HN. I should know; I've been part of a few and have to remember when to take a step back.

------
mgpwr_new
:(

------
ldesegur
Will HN have a black banner to remember him?

