Hacker News
Peter Naur has died (wikipedia.org)
570 points by cperciva on Jan 3, 2016 | 96 comments



Peter Naur's paper 'Programming as Theory Building' is perhaps the most lucid, thought-provoking and practically useful single thing I have ever read on the sociological aspects of software engineering. Right up there with Brooks's "The Mythical Man-Month" (but the value density of the former is much higher, since it's a paper and the latter is a book).



Briefly looking at the paper, it seems what he calls the "theory of the program" is akin to what Fred Brooks called "conceptual integrity". Am I wrong?


I think that's right. The terms 'design' and 'model' are often used for this nowadays. Naur's point is that a program is a shared mental construct that lives in the minds of the people who build it. The source code is not the program. It's the canonical written representation, but a lossy one.


It also has a lot of overlap with the concept of "tacit knowledge": https://en.wikipedia.org/wiki/Tacit_knowledge

The programmer is unable to completely and unambiguously articulate the "design" in source code and documentation. Yes, the source code can be improved with more descriptive variable and function names and liberal code comments. And documentation can be expanded to include chapters on "architectural overview" and "technical motivations" to help fill the gaps, but it will inevitably be incomplete.
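A contrived Python sketch of the gap being described (the example and all names are mine, purely illustrative): renaming and commenting recover part of the "theory", but the reasoning behind the design still lives largely outside the code.

    # Before: runs fine, but the design intent is invisible.
    def f(xs, k):
        return [x for x in xs if x[2] > k]

    # After: descriptive names and a comment recover some of the theory...
    def filter_recent_orders(orders, cutoff_timestamp):
        # Old orders are dropped here rather than in the database query because
        # upstream caching can serve stale query results -- a decision you can
        # note in a comment, but whose full rationale lives with the team.
        return [order for order in orders if order[2] > cutoff_timestamp]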


UBA is a university in Argentina. Happy to see they include it in their curriculum :)


I agree. Years if not decades ahead of its time.

https://hn.algolia.com/?query=programming%20as%20theory%20bu...


Peter Naur was editor of the Algol 60 report, and with foresight, cunning, and a bit of trickery, he is responsible for getting recursion into a mainstream programming language. https://vanemden.wordpress.com/2014/06/18/how-recursion-got-...


Ha, another work I get to know about only through its author's death ... sigh. Obi-Wan's quote gets boring.


Which quote's that?


"If you strike me down, I shall become more powerful than you can possibly imagine."

^ Is the one that comes to mind? Or maybe

"[...] I'm getting too old for this sort of thing."

or

"An elegant weapon... for a more civilized age."

Actually now that I think about it, I really don't know what the parent was referring to.


I was thinking "Now that's a name I've not heard in a long time."

It fits. Murdock and Naur were both names I was familiar with but hadn't heard anything about for quite a while... until I heard they had died.


mattlutze's first one: the fact that passing away spreads his influence even more. Though 'a name I haven't heard in a long time' could also fit.


It's interesting how many programming languages were made by Danes.

Naur of course contributed to ALGOL 60, but did you know PHP was written by Rasmus Lerdorf, Anders Hejlsberg developed Delphi, Turbo Pascal and C#, Bjarne Stroustrup developed C++, and Ruby on Rails was developed by DHH? All Danes.

Given a nation of 5 million people, I think this is quite unusual.


Ruby on Rails is not a unique programming language. Matz created Ruby.


You're right, of course. I thought I'd include it though since it's so popular.


> Ruby on Rails is not a unique programming language. Matz created Ruby.

What does "not a unique programming language" mean? Perhaps 'discrete' or 'distinct' in place of 'unique'? Not being an RoR developer, I'm hardly in a position to say, but surely at a certain point a DSL deserves to be called a language in its own right. (Extreme case (that, I realise, is far past what anyone would call a DSL): Perl, as with many other languages, is written in C, but certainly counts as a separate programming language.)

EDIT, in response to two downvotes: surely it's a legitimate question? I genuinely don't know what it means to say that something isn't unique (except mathematically, that it means that there's more than one of it). Assuming that it means what it seems to mean, how does one draw the line that I mentioned?


Rails is not a programming language, for various pedantic reasons. That doesn't make your idea bad. Rails is a defined pattern of expressing a program, conceptually similar to a language. You could call it a dialect.


Rail[1] is a programming language, but that is orthogonal to your point.

[1] - http://esolangs.org/wiki/Rail


Rails isn't a programming language at all, so it can't really be a "unique" one.


Ruby on Rails is one of many frameworks on top of the Ruby language.


The grandparent was right, in a sense, because while Rails definitely is not a standalone language on its own, it is an embedded DSL since you are working at a much higher level of abstraction than what Ruby offers out-of-the-box.
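To make "embedded DSL" concrete, here's a toy sketch - in Python rather than Ruby, and with made-up names (route, ROUTES), so it only illustrates the idea and is not how Rails actually works: the host language's syntax is reused, but the code reads as declarations at a higher level of abstraction.

    ROUTES = {}

    def route(path):
        # Returns a decorator that registers a handler for the given path.
        def register(handler):
            ROUTES[path] = handler
            return handler
        return register

    # Reads like a declaration ("this URL maps to this handler"),
    # but it's ordinary Python executing at import time.
    @route("/articles")
    def list_articles():
        return "all articles"

    print(ROUTES["/articles"]())   # -> all articles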


You could say the same about Ruby and C.

>Ruby is a DSL for C since you are working at a much higher level of abstraction than what C offers out of the box.


I don't think that's how it works. Ruby is general-purpose, unlike Rails, which solves a very specific problem by building on top of core Ruby.


Back to the topic, here's an explanation http://www.maximise.dk/blog/2005/04/why-danish-programmers-a...


That article is just a bunch of prejudices. And I say that as a Dane. I am sure American, French or Spanish programmers think for themselves too and that they also take pride in their work.

First of all, a number of nations in Northern Europe have done well in tech: the Dutch (Python), Norwegians (CSS, Opera), Swedes (Erlang, Skype), Finns (Linux, Nokia) and Brits. What do these countries have in common? They are close to each other, they had strong science and engineering universities before the computer revolution, they have been wealthy for the past 50 years or more, and they speak English as a first language or as a strong second language.

You get nowhere in computer science without a good understanding of English. Virtually all the popular programming languages use English terms, most (all?) articles are in English and almost all documentation and tutorials are in English years before they are translated to local languages (if ever). So it's much easier for an average Scandinavian or Dutch teenager or college student to get into computer science than for an average Italian or Greek.

Oh, and they have all had generous welfare states for the past 50-100 years, so even back when computer science didn't pay well it wasn't so risky to go down that path.


> it's much easier for an average Scandinavian or Dutch teenager or college student to get into computer science than for an average Italian or Greek.

Speaking as a Greek person who is also quite average, English was never a problem getting into programming. I live and work in the UK now, but when I first landed on these rain-sodden shores, my spoken English was bad and I had trouble communicating with people. Not so with computers, and I never had trouble with programming languages.

I think the same would go for Italians and even more so, since English has borrowed a lot from Latin, which is basically just an older form of Italian (with high school Latin, you can understand Italian perfectly well and even parse sentences on the fly as people speak to you :)

I think the reason why you may not find many Greeks or Italians, or generally people from non-English speaking countries in the history of computing is the opposite of what you suggest: any contributions they might be able to make are hard to transfer over to the English-speaking world. Any papers are in languages that are not understood in the English-speaking world and translation is not always available.


If I were sure that this was written seriously, I would have replied along the lines that there's little chance of such an amazing resource - the best programmers in the world by a large margin - going untapped by greedy capitalists, who would then topple America's software empires.

Since I'm not sure that this was written seriously, I'll point to anecdotal evidence of cfront, Stroustrup's C++ compiler, being the buggiest C++ compiler ever written. (Here's a bunch of bugs found by modern static analysis tools - http://www.i-programmer.info/programming/cc/9212-finding-bug... - but unfortunately I can't find the old Usenet posts saying that cfront was the buggiest C++ compiler of its time, and that early g++ was the next buggiest. The Unix-Haters Handbook has evidence from James Roskind, who tried to make a yacc grammar for C++ - to a first approximation, an impossible thing to do due to a few goofy design decisions in the C++ grammar - and who said that every time he tried to feed cfront input that would enlighten him about some dark corner of the grammar, cfront crashed.)


Huh? From your article:

> I should say right away, I haven't found anything crucial and I think there are three reasons why PVS-Studio hasn't found serious bugs

The rest of the article doesn't seem to back up your claim really. It even says "The code is of high quality."

Regardless, I agree the article isn't very good, serious or otherwise. If it is serious, it uses one anecdotal piece to make a sweeping claim about the productivity of programmers across nations...


They're showing bugs in cfront while trying to sell their product to C++ developers; of course they're going to call cfront high quality.

That said, Roskind's experience is more telling than the output of a static analyzer because generally and certainly with a compiler, actual testing shows orders of magnitude more issues than static analysis.


And Lars Bak is the co-author of Dart.


Ah yes, forgot that one. He also wrote the V8 JavaScript engine that powers Chrome.

Other notable Danes include Poul-Henning Kamp, who among many other things wrote the Varnish cache, and Lars and Jens Eilstrup Rasmussen, who wrote the original Google Maps.

The Eilstrup Rasmussen brothers also invented Google Wave, but I'm not sure how big an accomplishment that is :-)


You may have heard of the Rasmussens' other project: it's now called Google Maps.

https://en.wikipedia.org/wiki/Lars_Rasmussen_(software_devel...


See the 2nd paragraph of the comment you're replying to.


You may have heard about something called Maps, though. Google Maps.


I believe BETA is also a programming language born in Denmark: https://en.wikipedia.org/wiki/BETA_%28programming_language%2...


Anders of course also did TypeScript.


Scandinavia as a whole has done alright on the programming front. Of the LAMP stack - which still runs a huge part of the web, though many of us have moved on - all but the 'A' originated here: Linux in Finland, MySQL in Sweden, and PHP in Denmark. The output of course being styled with Norwegian CSS.


Don't forget the Icelandic Dalvik VM.


I'll add COMAL to that list. I believe it has become somewhat obscure today, but still, it was developed by Danes too: https://en.wikipedia.org/wiki/COMAL


Yeah, makes me proud to be a Dane.


I don't want to be unpleasant at all. But let's please not build up artificial walls between us. Humans moved to different parts of the world and developed slightly differently. But let's not think that being born into the "same" culture or ethnic group gives anyone some kind of password to share in some achievement.

I also admire this man's work and am grateful for his intelligence and efforts.


It's not so much that you're entitled to anything because you're of a certain ethnicity or belong to a certain culture, but if we have 200 different cultures, and some metric that we want to perform well according to, should we not try to learn from those cultures who excel?

Saying that being proud of a culture is building walls and thus bad is over-the-top PC. Just because I am proud (of something) does not mean that I don't want to share it with you if you want to give it a go.

You're acting as if "I'm proud to be Danish, 'cus look at all these great CS Danes" is the same as saying "We need to be the only nation in the world, and we want to wipe out everyone who thinks differently." Being proud is great, as long as you don't turn blind to other people's value and worth.


>But let's not think that being born into the "same" culture or ethnic group gives anyone some kind of password to share in some achievement.

Why not? He is a contributor to the culture that produces a disproportionate amount of successful computer scientists. If these walls were artificial, there would be no disparity.


It's a bit weird being a small part of computer science. The big early movers and shakers were still around for most of our lives. The whole field is within living memory. Now they are starting to pass away: Engelbart, McCarthy, etc. Think how far physics got from Newton to the Manhattan Project. Where will CS be in 100 years? We have so much work to do.


>Think how far physics got from Newton to the Manhattan project. Where will CS be in 100 years? We have so much work to do.

Might not be so. You're applying the same progress curve to both, but remember that Newton developed his theories at the start (or close to it) of the scientific era, without so many institutions, collaborations, communications, infrastructure, experimental apparatus, computer simulations, the necessary math, etc.

Computing was developed in the middle (or thereabouts) of the 20th century, and had all that. It also saw much faster development cycles, so the speedup curve we saw with physics since Newton won't really apply to it (I mean not on the same timescale: it could still apply, "compressed", over the last five decades).

I'd say we're already in the Manhattan Project era of computing. And with Moore's law expiring and other limits being reached in those areas, we won't be moving that much faster in the future -- much like physics since the '50s.


> I'd say we're already in the Manhattan Project era of computing.

There's no possible way.

We were born at the beginning of civilization. There were the ancients, then us. A hundred thousand years from now, what we write today has a chance of still being around. The language that people speak will be completely alien. But what we write -- our programs -- will still be runnable. There will be emulators for our architectures, and random work from random people in 2015 AD will happen to persist to 102,283 AD.

We're going to be the oldest ancients that they consider real. They'll be able to look at and interact with our work. And even modify it and remix it. And we'll be thought of as just barely more advanced than cavedwellers.

Now, with that sort of context, there seems no possible chance that we can know how far away we are from the Manhattan Project era of computing. Very far, suffice it to say.

To a certain extent, it's not possible to compare the progress in physics with computing. On one hand, we've unravelled nature to the point where there are no mysteries left except in high-energy physics. There's also no way to know how profoundly the remaining mysteries might impact the world.

Whereas the depth and the mysteries within computing are eternal. If there are still humans in 102,283 AD, there will still be computing, and they'll still be coming up with ever-more complex ways of computing. All of the institutional effort between now and then will push the field way, way beyond the limited horizons that we can currently imagine.


>Now, with that sort of context, there seems no possible chance that we can know how far, far away we are from the Manhattan project era of computing.

Just piling up years doesn't mean much. Yeah, civilization might go on for 100,000 years. Or 1,000,000 years. Or 20,000,000. That doesn't mean it will progress the same as it did in the first x years.

Just think of how we've had Homo sapiens for 200,000 years but progress was almost flat until around 4-5 thousand years ago. The mere availability of time doesn't ensure incremental progress at scale, or even monotonically increasing progress.

There are things such as low-hanging fruit, diminishing returns, etc.

One can already see in physics that the most earth-shattering progress was made up until around 1930-1950, and from then on it's mostly been slim pickings. When you start fresh, there's lots of low-hanging fruit to pick. At some point, you've reached several limits (including physical limits of measuring and experimental equipment without which you can't go further, and which you can't overcome because they're, well, physical limits).

And that's even without taking into account a regression (e.g. because of a huge famine, a world war, nuclear war, environmental catastrophe, a deadly new plague-like thing, etc.).


> One can already see in physics that the most earth-shattering progress was made up until around 1930-1950, and from then on it's mostly been slim pickings. When you start fresh, there's lots of low-hanging fruit to pick. At some point, you've reached several limits (including physical limits of measuring and experimental equipment without which you can't go further, and which you can't overcome because they're, well, physical limits).

Certainly. And you've hit on the core reason why computing is so eternal: there aren't physical limits.

When a computer becomes ten times faster, we don't affect the world ten times more profoundly. So the physical limits that hold back clock speeds don't matter too much. But when we suddenly no longer need to own cars because we can hail one on demand, our lives become completely different.

The limitations are social rather than physical. Our own minds, and our lack of ability to manage complexity, are the primary bottleneck standing between us and affecting the world, right now. Tonight. Imagine a hypothetical superhuman programmer who could write hundreds of thousands of applications per day. Think of how that would reshape the world over time.

It seems fair to say that the amount it would affect the world is linear w.r.t. time. The longer that superhuman programmer churns out permutations of apps in as many fields as possible, the more the world will change.

But that's exactly what's happening: hundreds of thousands of applications are being written per day, by humanity as a whole. Project that process forward to 102,283 AD. Are you sure the rate of change will be a sigmoidal falloff like physics?


>Certainly. And you've hit on the core reason why computing is so eternal: there aren't physical limits.

On the contrary, there are several. The speed of light. The Planck constant. Heat issues. Interference issues. Issues with printing ever-smaller CPU transistors. Plasma damage to low-k materials (whatever that means).

And all kinds of diminishing-returns situations in computer science (e.g. adding more RAM stops having much of a speed impact beyond some threshold; or you can build a supercluster of tens of thousands of nodes, but you're limited by the communication speed between them unless the job is totally parallelizable; etc.).
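To put a rough number on that last point, here's a back-of-the-envelope Amdahl's-law sketch in Python; the 95% parallel fraction is an assumed figure for illustration, not a measurement of any real workload.

    def amdahl_speedup(parallel_fraction, nodes):
        # Amdahl's law: the serial part of a job caps the overall speedup,
        # no matter how many nodes you throw at it.
        return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / nodes)

    for nodes in (10, 100, 1000, 10000):
        print(nodes, round(amdahl_speedup(0.95, nodes), 1))
    # 10 -> 6.9x, 100 -> 16.8x, 1000 -> 19.6x, 10000 -> 20.0x:
    # past a few hundred nodes, the extra hardware buys almost nothing.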

>When a computer becomes ten times faster, we don't affect the world ten times more profoundly. So the physical limits that hold back clock speeds don't matter too much.

Huh? What does that mean? If we can't make faster CPUs, then we're not getting much further. Increasing cooling etc. only helps up to a point.

>Imagine a hypothetical superhuman programmer who can write hundreds of thousands of applications per day. Think of how that'd reshape the world with time.

We already have hundreds of thousands of applications. It's not exactly what we're lacking. We're talking about qualitative advances, not just churning out apps.


> If we can't make faster CPUs, then we're not getting much further.

On the contrary. In our day to day lives, we can do far more today than we could a decade ago.

In 2006, Intel released Pentium 4 chips clocked at 3.6 GHz. In 2016 we use quad-core chips where each core is about as fast as that. Yet we're way more powerful today than a mere 4x multiplier would suggest, if you measure what we can do. Think of how limited our tools were just a decade ago.

Clock speed isn't a great measurement, but the point is that making CPUs faster doesn't make the field more powerful.

It seems like we're talking past each other. I was referring to effects that computers have on the world and our daily life, but it sounds like you're referring to total worldwide computation speed, and how it will change over time. If you're saying the rate will slow sigmoidally, similar to the progress in physics, I agree.

But the thesis is that the rate at which the world changes w.r.t the field of computing is unrelated to the total available computation power. Our minds are the bottleneck.

Compare this situation to the field of physics. The rate that the world changes w.r.t physics was related to how important the discoveries were.

The contrast is pretty stark, and it might have some interesting long-term consequences.


> It seems like we're talking past each other. I was referring to effects that computers have on the world and our daily life, but it sounds like you're referring to total worldwide computation speed, and how it will change over time.

Interesting, because you seem to be arguing each other's sides. If you look at what we have, then yes - for scientific purposes, total computational power has increased enormously and continues to do so. But if you ask what effect computers have on the daily life of you and me, then not much has changed in the last two decades. Software bloat, piling up abstraction layers, turning everything into webapps - it all eats up the gains in hardware. Yes, the screens have better resolution and we now have magic shiny rectangles as another interface, but it seems like software only gets slower and less functional with time.

> Yet we're way more powerful today than a mere 4x multiplier, if you measure what we can do.

Scientists? Maybe. SpaceX can run moar CFD for their rockets on a bunch of graphics cards than the entire world could a decade or two ago. The rest of us? Not so much; it really feels like our tools are getting less powerful with time, and I don't feel like I can do that much more (and if anything, the primary benefit comes from faster access to information - Googling for stuff instead of being stuck with just the spec and the source code speeds things up considerably).


It's pretty clear at this point that I've failed to communicate. I'll bow out now.

I have a magical device in my pocket that can summon a car on demand.

Two effective hackers can set up a complete product within a few weeks, and then host it without having to think too much about what we now call devops. And when their servers start to melt, they can spin up however many they need.

We no longer get lost. Remember what it was like to get lost? Like, "I have no idea where I am. Hey, you there. Do you know how to get over to X street?"

These things were not possible ten years ago. Maybe people here simply don't remember, or choose to forget. Or maybe I just suck at writing. But every one of these incredible advances was thanks to advances in the field of computing, both theoretical and practical. For an instance of the former, see the recent neural net advancements; for the latter, Rails, Homebrew, the pervasiveness of VMs, and everything else that our forerunners could only dream of but we take for granted.

Have a good evening, and thanks for the enjoyable conversations. You and coldtea both do really cool work.


>I have a magical device in my pocket that can summon a car on demand.

As a gadget lover, it seems magical to me too, especially since I was once lusting over things like a ZX Spectrum. But in the big picture, is it really life-changing technology? You could do sort of the same thing already in 1970 with a landline phone and a cab service (and in 1995 with a mobile phone). I'm not sure about the US, but where I live I used a cab service all the time ever since the eighties -- it took around 10 minutes after the call for the cab to get to you, so not totally unlike calling one with an iPhone.

Same for not getting lost. GPS is nice and all, but was getting lost much of an everyday problem in the first place (for regular people, of course, not trekkers etc.)? Maybe for tourists, but I remember the first 3-4 times I visited the States, where I did two huge roadtrips with McNally road maps, and I didn't have much of an issue (compared to later times, when I used an iPhone + Navigon). I got lost a few times, but that was it; I could always ask at a gas station, or find where I was on the map and take the exit in the right direction.

I'd go as far as to say that even the two biggest changes of the internet age, fast worldwide communications and e-commerce, haven't really changed the world either.

Some industries died and some thrived -- as it happens -- and we got more gadgets, but nothing as life-changing as when printing or toilets or electricity or cars or even TV were developed: things that brought mass changes in how people lived and consumed, in how states functioned, in urbanization, in societal norms and mores, etc. -- heck, even in nation-building (e.g. see Benedict Anderson on the role of print in the emergence of nation states).

What I want to say (and this is a different discussion from the original one about limits to computing power over time, etc.) is that technology also has diminishing returns. Having a PC in our office/home was important and enabled lots of things. Having a PC in our pocket added a few more (almost all of them due to the added mobility). Having a PC in our watch? Even less (we already had the PC+mobility combination).

>Have a good evening, and thanks for the enjoyable conversations. You and coldtea both do really cool work.

Thanks, but don't let our counter-comments turn you off! The way I see it, we're painting different pictures of the same thing (based on our individual experiences and observations), and those reading can decide or compose them into a fuller picture.


>In 2006, Intel released Pentium 4 chips clocked at 3.6 GHz. In 2016 we use quadcore chips where each core is about as fast as that. Yet we're way more powerful today than a mere 4x multiplier, if you measure what we can do. Think of how limited our tools were just a decade ago.

While the 2006 Pentium 4 chips might have a 3.6 GHz clock speed, they were also slower in execution (due to architecture and x86 cruft) than a single 3.5 GHz core today. Factor in much slower RAM at the time and much slower spinning disks and you can see where most of the multiplier comes from.

But even so, I don't see us being "way more powerful" in what we can do today. In raw terms, mainstream CPU development has plateaued around 2010 or so, with merely incremental 10%-15% (at best) improvements year over year. In qualitative terms, are we that much more advanced? Sure, we've gone from HD to 4K (including in our monitors), but it's not like we're solving totally different problems.

>It seems like we're talking past each other. I was referring to effects that computers have on the world and our daily life, but it sounds like you're referring to total worldwide computation speed, and how it will change over time. If you're saying the rate will slow sigmoidally, similar to the progress in physics, I agree.

Yes, I was talking about advancements in computing as a field (and I think the thread started by discussing just this, hence the OP talking about Newton in the comment that started the subthread, etc.), and not about what we can apply existing computing equipment to.

>But the thesis is that the rate at which the world changes w.r.t the field of computing is unrelated to the total available computation power. Our minds are the bottleneck.

I don't think the field of computing is special in this regard. Our minds are the bottleneck in all kinds of world-changing things (e.g. achieving world peace); the application of computers is just another thing our imagination limits.

That said, increases in computational power would enable novel technologies that we can already imagine, which could bring far more change to the world than current computers can (not necessarily all for the better, either). Advanced AI is an obvious example, but so is more basic, computing-hungry stuff, from entertainment to drug research.


>Think of how limited our tools were just a decade ago.

Not sure what you mean here. What were the limits a decade ago?

Many top programmers still swear by vi(m)/emacs, both started 40 years ago. C is 40+ years old, C++ is 30+ years old, Python is turning 25, JavaScript 20, Rails is 10 years old.

Turbo Pascal, which is over 30 years old, was arguably on par with modern IDEs on very many fronts.

I would say the contrary: there has been very little improvement on the tools front.


> There will be emulators for our architectures, and random work from random people in 2015 AD will happen to persist to 102,283 AD

That will require the relevant ISAs to be recorded somewhere robust enough to last 100 millennia; you can bet it won't be "the cloud". They might even need "Programmer Archaeologists" who know where to hunt for the ancient specs, and prove Vernor Vinge right (A Deepness in the Sky).

I hope there won't be some Library of Alexandria event in the intervening period.


> I hope there won't be some Library of Alexandria event in the intervening period.

It would take a pretty well-coordinated attack, but you could probably erase the better part of the modern Internet by striking a dozen targets simultaneously[0].

[0] - https://aws.amazon.com/about-aws/global-infrastructure/


Are you suggesting that AWS will last 100 millennia?! I know we still have some roads the Romans built, but 100,000 years is a long time (compared to recorded history).

Amazon will be lucky to last 100 years.


Agreed. Let's take another example: do you know how many thousands of tomes of 19th-century/early-20th-century books we have that we don't give a fuck about and don't digitize/preserve?

Even if we had digitized them and had access to them, how many would even care? Heck, most people don't even read the books of their own time -- 50+% read one book a year (if that). And those who do read are already overloaded with stuff.

So why assume people in 100,000 years will give much of a fuck about books and data from 2015?

Apart from some archaeologists, that is -- and even they will have tons of higher-level descriptions of our century, in the form of history books, movies and documentaries, so they won't care much about the hundreds of thousands of individual artifacts we have produced.


It's interesting that you say this. I made a similar point some time ago on another forum but I was ignored and even downvoted there (no prize for guessing which forum that was).

My idea was that you could gauge how far along a field has come by whether the pioneers were still alive and how far removed you are from them. CS, despite huge advances, is still in its toddler years.

So, yes. More work to do and exciting days ahead.


How does your idea handle situations when a field gets renamed, or simply subdivided? E.g. Newton died long ago, but pioneers of quantum chromodynamics are still alive and kicking.


I can't truly say I have considered that angle. But I suppose subdividing a field creates new pioneers and a new field. Renaming it - that's a trickier situation. (And I have no idea what quantum chromodynamics is!)


In physics we reverse-engineer a very complex system given to us. In CS we create the system ourselves, and there is some overlap with mathematics, so I doubt the progress will be as drastic.


Via Poul-Henning Kamp: https://twitter.com/bsdphk/status/683750072841072640

I haven't seen any "official" announcement, so I figured Peter Naur's Wikipedia page was the best link to use.


There's an article in Danish at http://www.version2.dk/artikel/datalogi-pioneren-peter-naur-...

DIKU (University of Copenhagen Department of Computer Science) linked to that one. Naur was the first professor at DIKU and one of the founders of the institute.


I first became aware of computers as a thing with the launch of 8-bit home computers in the early 80s. As a result, I seem to have a hard time imagining the computing scene before then, as in the 80s it seemed so nascent - so the fact that computers were a big enough thing in 1959 for there to be a Danish institute of computing blows my mind. Everything before the 80s seems so abstract. Any recommended reading/viewing to try to rebase my historical perspective?


The Psychology of Computer Programming, Gerald M. Weinberg (1971)

and:

The Soul of a New Machine, Tracy Kidder (1981)

ought to be required reading for anybody in tech.


I'll easily second this recommendation. 30 years ago I was struck by Tom West's quote from Kidder's book: "Not everything worth doing is worth doing well".



Interesting that ALGOL 60 inspired other languages such as Pascal and C as well as Ada, etc.

https://en.wikipedia.org/wiki/ALGOL_60


Also, Naur bore a good deal of the responsibility for getting support for recursion into Algol 60: https://vanemden.wordpress.com/2014/06/18/how-recursion-got-...

Just think how the programming language landscape might look today if recursion had never gotten into mainstream languages, or maybe only had begun to become the norm recently.
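To make the counterfactual concrete, a toy Python sketch (my own example, not from the linked article): the same tree walk written with language-level recursion, and the way you'd be forced to write it in a language without it, managing an explicit stack by hand.

    # With recursion in the language: the definition mirrors the data.
    def tree_size(node):
        return 1 + sum(tree_size(child) for child in node.get("children", []))

    # Without recursion: simulate the call stack yourself.
    def tree_size_no_recursion(root):
        size, stack = 0, [root]
        while stack:
            node = stack.pop()
            size += 1
            stack.extend(node.get("children", []))
        return size

    tree = {"children": [{"children": []}, {"children": [{"children": []}]}]}
    assert tree_size(tree) == tree_size_no_recursion(tree) == 4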


I guess I should be grateful that our profession gets to live in the same time as the masters. RIP


Literally spent last night reading up on the work he did on BNF. He sounded like a very capable and, rather unusually, modest man.


Link to a retyped version of the revised "Report on the Algorithmic Language ALGOL 60": http://datamuseum.dk/site_dk/rc/algol/algol60retyped.pdf

A very famous paper.


Anyone who has taken a compiler class has written something in BNF. RIP.


Can anyone recommend (a) good resource(s) for learning about BNF?


The Wikipedia page does a good enough job. It shouldn't take more than 1-3 pages; it's a fairly simple concept.

https://en.wikipedia.org/wiki/Backus%E2%80%93Naur_Form#Examp...

I stumbled into compilers by accidentally buying the Dragon book as a teen. The BNF part I got through fairly quickly.
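If a worked example helps, here's the classic expression grammar in BNF and a hand-rolled recursive-descent recognizer for it in Python (the function names and tokenizer are mine, just a minimal sketch): each nonterminal becomes one function, which is most of what you need to start reading and writing BNF.

    import re

    # <expr>   ::= <term>   | <expr> "+" <term>
    # <term>   ::= <factor> | <term> "*" <factor>
    # <factor> ::= <number> | "(" <expr> ")"

    def tokenize(text):
        # Numbers become ints, everything else a one-character operator token.
        return [int(num) if num else op
                for num, op in re.findall(r"\s*(?:(\d+)|(\S))", text)]

    def parse_expr(tokens, i):
        value, i = parse_term(tokens, i)
        while i < len(tokens) and tokens[i] == "+":   # left recursion as a loop
            rhs, i = parse_term(tokens, i + 1)
            value += rhs
        return value, i

    def parse_term(tokens, i):
        value, i = parse_factor(tokens, i)
        while i < len(tokens) and tokens[i] == "*":
            rhs, i = parse_factor(tokens, i + 1)
            value *= rhs
        return value, i

    def parse_factor(tokens, i):
        if tokens[i] == "(":
            value, i = parse_expr(tokens, i + 1)
            assert tokens[i] == ")", "expected closing parenthesis"
            return value, i + 1
        return tokens[i], i + 1   # a number token

    print(parse_expr(tokenize("2 + 3 * (4 + 1)"), 0)[0])   # -> 17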


Jack Crenshaw's "Let's Build a Compiler" [1], starting from Part 2.

[1]: http://compilers.iecc.com/crenshaw/


Never heard of him, but I think I need to know about him. I find this book he wrote to be very enticing:

> Computing: A Human Activity

Sounds like a forerunner to Agile programming.


BNF (a formal notation for specifying syntax) is baked into the structure of the Internet [0]. Knuth himself suggested BNF be referred to as "Backus–Naur Form" in '64 [1].

References

[0] For example RFC5511: "Routing Backus-Naur Form (RBNF): A Syntax Used to Form Encoding Rules in Various Routing Protocol Specifications" https://tools.ietf.org/html/rfc5511

[1] "backus normal form vs. Backus Naur form" http://dl.acm.org/citation.cfm?doid=355588.365140


Isn't he the N in BNF?


Yes.


Speaking of prominent Danish programmers... :)


I think he deserves a black bar? Anyone else think so? He made some serious contributions in his lifetime to the world of comp sci and general hackery. How do we vote to initiate a black bar?


Sometimes I think introducing the black bar was a mistake. Now every death has to be judged as to whether the person is "worthy" or not of a black bar. This caused a lot of tension in the threads about Ian Murdock's death, for example.


Agreed, but what angered me a bit about Murdock's death was how fast it fell off the front page despite, IIRC, over 1300 upvotes.


The editorializing choices made by HN staff regarding Ian were poor at best. Not surprised they locked discussion down.

But, this is a different topic. Any Turing award winner deserves recognition and reflection on their passing imho.


I'm not sure I'm following this comment. What editorializing did the HN staff do? I didn't see any.

If anything, I may have inadvertently got it caught up in some sort of flame filter because of a comment I wrote about asking for help if you feel suicidal, then (seemingly, but not actually) contradicting myself later by saying that medical help may not be the best solution.

Flame filters, AFAIK, kick in automatically. Mental health is a hot button issue, and flame fests and controversial discussions are really not good or appropriate for HN. I should know, I've been part of a few and have to remember when to take a step back.


I'm only aware of the black bar because of the discussions that inevitably seem to describe its absence for a given person.


It's indeed trouble-inducing. But Naur's name has been printed and mentioned in classes since forever; that's a pretty strong threshold to clear.


Good idea! I suggest you send an e-mail to hn@ycombinator.com


We don't. It's best to just leave it up to the mods.


Perhaps but I doubt he'll make the cut.


:(


Will HN have a black banner to remember him?



