The programmer is unable to completely and unambiguously articulate the "design" in source code and documentation. Yes, the source code can be improved with longer variable and function names in addition to liberal code comments. And documentation can be expanded to include chapters on "architectural overview" and "technical motivations" to help fill the gaps, but it will inevitably be incomplete.
^ Is that the one that comes to mind? Or maybe
"[...] I'm getting too old for this sort of thing."
"An elegant weapon... for a more civilized age."
Actually now that I think about it, I really don't know what the parent was referring to.
It fits. With both Murdock and Naur, both of them were names I was familiar with but hadn't heard anything about for quite a while... until I heard they both died.
Naur of course contributed to ALGOL 60, but did you know PHP was written by Rasmus Lerdorf, Anders Hejlsberg developed Delphi, Turbo Pascal and C#, Bjarne Stroustrup developed C++, and Ruby on Rails was developed by DHH? All Danes.
Given a nation of 5 million people I think this is quite unusual.
What does "not a unique programming language" mean? Perhaps 'discrete' or 'distinct' in place of 'unique'? As not an RoR developer, I'm hardly in a position to say, but surely at a certain point a DSL deserves to be called a language in its own right. (Extreme case (that, I realise, is far past what anyone would call a DSL): Perl, as with many other languages, is written in C, but certainly counts as a separate programming language.)
EDIT, in response to two downvotes: surely it's a legitimate question? I genuinely don't know what it means to say that something isn't unique (except mathematically, that it means that there's more than one of it). Assuming that it means what it seems to mean, how does one draw the line that I mentioned?
 - http://esolangs.org/wiki/Rail
>Ruby is a DSL for C since you are working at a much higher level of abstraction than what C offers out of the box.
First of all, a number of nations in Northern Europe have done well in tech: the Dutch (Python), Norwegians (CSS, Opera), Swedes (Erlang, Skype), Finns (Linux, Nokia) and Brits. What do these countries have in common? They are close to each other, they had strong universities in science or engineering before the computer revolution, they have been wealthy for the past 50 years or more, and they speak English as a first language or as a strong second language.
You get nowhere in computer science without a good understanding of English. Virtually all the popular programming languages use English terms, most (all?) articles are in English and almost all documentation and tutorials are in English years before they are translated to local languages (if ever). So it's much easier for an average Scandinavian or Dutch teenager or college student to get into computer science than for an average Italian or Greek.
Oh, and they have all had a generous welfare state for the past 50-100 years, so even back when computer science didn't pay well it wasn't so risky to go down that path.
Speaking as a Greek person who is also quite average, English was never a problem getting into programming. I live and work in the UK now, but when I first landed on these rain-sodden shores, my spoken English was bad and I had trouble communicating with people. Not so with computers -- and I never had trouble with programming languages.
I think the same would go for Italians and even more so, since English has borrowed a lot from Latin, which is basically just an older form of Italian (with high school Latin, you can understand Italian perfectly well and even parse sentences on the fly as people speak to you :)
I think the reason why you may not find many Greeks or Italians, or generally people from non-English speaking countries in the history of computing is the opposite of what you suggest: any contributions they might be able to make are hard to transfer over to the English-speaking world. Any papers are in languages that are not understood in the English-speaking world and translation is not always available.
Since I'm not sure that this was written seriously, I'll point to anecdotal evidence of cfront, Stroustrup's C++ compiler, being the buggiest C++ compiler ever written. (Here's a bunch of bugs found by modern static analysis tools - http://www.i-programmer.info/programming/cc/9212-finding-bug... - but unfortunately I can't find the old Usenet posts saying that cfront was the buggiest C++ compiler of its time, and that early g++ was the next buggiest. The Unix-Haters Handbook has evidence from James Roskind, who tried to make a yacc grammar for C++ - to a first approximation an impossible thing to do, due to a few goofy design decisions in the C++ grammar - and who said that every time he tried to feed cfront input that would enlighten him with respect to some dark corner of the grammar, cfront crashed.)
> I should say right away, I haven't found anything crucial and I think there are three reasons why PVS-Studio hasn't found serious bugs
The rest of the article doesn't seem to back up your claim really. It even says "The code is of high quality."
Regardless, I agree the article isn't very good, serious or otherwise. If it is serious, it uses one anecdotal piece to make a sweeping claim about the productivity of programmers across nations...
That said, Roskind's experience is more telling than the output of a static analyzer, because generally -- and certainly with a compiler -- actual testing shows orders of magnitude more issues than static analysis.
Other notable Danes include Poul-Henning Kamp, who among many other things wrote the Varnish cache, and Lars and Jens Eilstrup Rasmussen, who wrote the original Google Maps.
The Eilstrup Rasmussen brothers also invented Google Wave, but I'm not sure how big an accomplishment that is :-)
I also admire this man's work and am grateful for his intelligence and efforts.
Saying that being proud of a culture is building walls and thus bad is over-the-top PC. Just because I am proud (of something) does not mean that I don't want to share it with you if you want to give it a go.
You're acting as if "I'm proud to be Danish, 'cus look at all these great CS Danes" is the same as saying "We need to be the only nation in the world, and we want to wipe out everyone who thinks differently." Being proud is great, as long as you don't turn blind to other people's value and worth.
Why not? He is a contributor to the culture that produces a disproportionate amount of successful computer scientists. If these walls were artificial, there would be no disparity.
Might not be so. You're applying the same progress curve to both, but remember that Newton developed his theories at the start (or close) of the scientific era, without so many institutions, collaborations, communications, infrastructure, experimental apparatus, computer simulations, the necessary math etc.
Computing was developed in the middle (or close) of the 20th century, and had all that. It also saw much faster development cycles, so the speedup curve we saw with physics since Newton won't really apply to it (I mean not in the same timescale: it could still apply "compressed" in the last 5 decades).
I'd say we're already in the Manhattan project era of computing. And with Moore's law expiring and other limits being reached in those areas, we won't be moving that fast in the future -- much like physics hasn't since the 1950s.
There's no possible way.
We were born at the beginning of civilization. There were the ancients, then us. A hundred thousand years from now, what we write today has a chance of still being around. The language that people speak will be completely alien. But what we write -- our programs -- will still be runnable. There will be emulators for our architectures, and random work from random people in 2015 AD will happen to persist to 102,283 AD.
We're going to be the oldest ancients that they consider real. They'll be able to look at and interact with our work. And even modify it and remix it. And we'll be thought of as just barely more advanced than cavedwellers.
Now, with that sort of context, there seems no possible chance that we can know how far away we are from the Manhattan project era of computing. Very far, suffice to say.
To a certain extent, it's not possible to compare the progress in physics with computing. On one hand, we've unravelled nature to the point where there are no mysteries left except in high-energy physics. There's also no way to know how profoundly the remaining mysteries might impact the world.
Whereas the depth and the mysteries within computing are eternal. If there are still humans in 102,283 AD, there will still be computing, and they'll still be coming up with ever-more complex ways of computing. All of the institutional effort between now and then will push the field way, way beyond the limited horizons that we can currently imagine.
Just piling up years doesn't mean much. Yeah, civilization might go on for 100,000 years. Or 1,000,000 years. Or 20,000,000. That doesn't mean it will progress the same as it did in the first x years.
Just think of how we had Homo sapiens for 200,000 years, yet progress was almost flat until around 4-5 thousand years ago. The mere availability of time doesn't ensure incremental progress at scale, or even monotonically increasing progress.
There are things such as low hanging fruit, diminishing returns, etc.
One can already see in physics that the most earth-shattering progress was made up until around 1930-1950, and from then on it's been mostly slim pickings. When you start fresh, there is lots of low-hanging fruit to pick. At some point, you've reached several limits (including physical limits in measuring and experimental equipment, without which you can't go further, and which you can't overcome because they're, well, physical limits).
And that's even without taking into account a regression (e.g. because of a huge famine, a world war, nuclear war, environmental catastrophe, a deadly new "plague"-like thing, etc.).
Certainly. And you've hit on the core reason why computing is so eternal: there aren't physical limits.
When a computer becomes ten times faster, we don't affect the world ten times more profoundly. So the physical limits that hold back clock speeds don't matter too much. But when we suddenly no longer need to own cars because we can hail one on demand, our lives become completely different.
The limitations are social, rather than physical. Our own minds, and our lack of ability to manage complexity, is the primary bottleneck standing between us affecting the world, right now. Tonight. Imagine a hypothetical superhuman programmer who can write hundreds of thousands of applications per day. Think of how that'd reshape the world with time.
It seems fair to say that the resulting effect on the world would be linear w.r.t. time. The longer that superhuman programmer churns out permutations of apps in as many fields as possible, the more the world will change.
But that's exactly what's happening: hundreds of thousands of applications are being written per day, by humanity as a whole. Project that process forward to 102,283 AD. Are you sure the rate of change will be a sigmoidal falloff like physics?
On the contrary, there are several. The speed of light. The Planck constant. Heat issues. Interference issues. Issues with printing ever-smaller CPU transistors. Plasma damage to low-k materials (whatever that means).
And all kinds of diminishing returns situations in Computer Science (e.g. adding more RAM stops having that much of a speed impact over some threshold, or you can make a supercluster of tens of thousands of nodes, but you're limited by communication speed between them, unless the job is totally parallelizable, etc).
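The cluster point is essentially Amdahl's law: the serial fraction of a job caps the speedup no matter how many nodes you throw at it. A small illustrative sketch (Python; the function name and numbers are my own, not from anything upthread):

```python
def amdahl_speedup(parallel_fraction, n):
    """Best-case speedup on n nodes when only `parallel_fraction`
    of the job can be parallelized (Amdahl's law)."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n)

# Even with 95% of the work parallelizable, ten thousand nodes
# buy less than a 20x speedup -- the serial 5% dominates.
print(round(amdahl_speedup(0.95, 10_000), 1))  # ~20.0
```

Only a totally parallelizable job (parallel_fraction = 1.0) scales linearly with node count, which is exactly the caveat above.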
>When a computer becomes ten times faster, we don't affect the world ten times more profoundly. So the physical limits that hold back clock speeds don't matter too much.
Huh? What does that mean? If we can't make faster CPUs, then we're not getting much further. Increasing cooling etc only helps up to a point.
>Imagine a hypothetical superhuman programmer who can write hundreds of thousands of applications per day. Think of how that'd reshape the world with time.
We already have hundreds of thousands of applications. It's not exactly what we're lacking. We're talking about qualitative advances, not just churning out apps.
On the contrary. In our day to day lives, we can do far more today than we could a decade ago.
In 2006, Intel released Pentium 4 chips clocked at 3.6 GHz. In 2016 we use quadcore chips where each core is about as fast as that. Yet we're way more powerful today than a mere 4x multiplier, if you measure what we can do. Think of how limited our tools were just a decade ago.
Clock speed isn't a great measurement, but the point is that making CPUs faster doesn't make the field more powerful.
It seems like we're talking past each other. I was referring to effects that computers have on the world and our daily life, but it sounds like you're referring to total worldwide computation speed, and how it will change over time. If you're saying the rate will slow sigmoidally, similar to the progress in physics, I agree.
But the thesis is that the rate at which the world changes w.r.t the field of computing is unrelated to the total available computation power. Our minds are the bottleneck.
Compare this situation to the field of physics. The rate that the world changes w.r.t physics was related to how important the discoveries were.
The contrast is pretty stark, and it might have some interesting long-term consequences.
Interesting, because you seem to be arguing for each other's side. If you look at what we have then yes - for scientific purposes, total computational power has increased enormously and continues to do so. But if you ask what effects computers have on daily life of you and me, then not much has changed in the last two decades. Software bloat, piling up abstraction layers, turning everything into webapps - it all eats up the gains in hardware. Yes, the screens have better resolution and we now have magic shiny rectangles as another interface, but it seems like software only gets slower and less functional with time.
> Yet we're way more powerful today than a mere 4x multiplier, if you measure what we can do.
Scientists? Maybe. SpaceX can run moar CFD for their rockets on a bunch of graphics cards than the entire world could a decade or two ago. Rest of us? Not so much -- it really feels like our tools are getting less powerful with time, and I don't feel I can do that much more (if anything, the primary benefit comes from faster access to information -- Googling for stuff instead of being stuck with just the spec and the source code speeds things up considerably).
I have a magical device in my pocket that can summon a car on demand.
Two effective hackers can set up a complete product within a few weeks, and then host it without having to think too much about what we now call devops. And when their servers start to melt, they can spin up however many they need.
We no longer get lost. Remember what it was like to get lost? Like, "I have no idea where I am. Hey, you there. Do you know how to get over to X street?"
These things were not possible ten years ago. Maybe people here simply don't remember, or choose to forget. Or maybe I just suck at writing. But every one of these incredible advances were thanks to advances in the field of computing. Both theoretical and practical. For an instance of the former, see the recent neural net advancements; for the latter, rails, homebrew, the pervasiveness of VMs, and everything else that our forerunners could only dream of but we take for granted.
Have a good evening, and thanks for the enjoyable conversations. You and coldtea both do really cool work.
As a gadget lover, it seems magical to me too, especially since I was once lusting over things like a ZX Spectrum. But in the big picture, is it really life-changing technology? You could do sort of the same thing already in 1970 with a stationary phone and a cab service (and in 1995 with a mobile phone). Not sure about the US, but where I live I used a cab service all the time ever since the eighties -- it took around 10 minutes after the phone call for the cab to get to you, so not totally unlike calling one with an iPhone.
Same for not getting lost. GPS is nice and all, but was getting lost much of an everyday problem in the first place (for regular people of course, not trekkers etc.)? Maybe for tourists, but I remember the first 3-4 times I visited the States, when I did two huge road trips with McNally road maps, and I didn't have much of an issue (compared to later times, when I used an iPhone + Navigon). I got lost a few times, but that was it; I could always ask at a gas station, or find where I was on the map and get on the exit toward the right direction.
I'd go as far as to say that even the two biggest changes resulting from the internet age -- fast worldwide communication and e-commerce -- haven't really changed the world either.
Some industries died and some thrived -- as it happens --, and we got more gadgets, but nothing as life-changing as when typography or toilets or electricity or cars or even TV was developed (things that brought mass changes in how people lived, consumed, how states functioned, in urbanization and in societal norms, in mores, etc., heck even in nation-building --e.g. see Benedict Anderson on the role of typography on the emergence of nation states).
What I want to say (and this is a different discussion than the original about limits to computing power over time, etc.) is that technology also has some kind of marginal returns. Having a PC in our office/home was important and enabled lots of things. Having a PC in our pocket a few more things (almost all because of the addition of mobility). Having a PC in our watch? Even less (we already had PC+mobility solution).
>Have a good evening, and thanks for the enjoyable conversations. You and coldtea both do really cool work.
Thanks, but don't let our counter-comments turn you off! The way I see it, we're painting different pictures of the same thing (based on our individual experiences and observations), and those reading can weigh them or compose them into a fuller picture.
While 2006 Pentium 4 chips might have had a 3.6 GHz clock, they were also slower in execution (due to architecture and x86 cruft) than a single 3.5 GHz core today. Factor in the much slower RAM of the time and much slower spinning disks, and you can see where most of the multiplier comes from.
But even so, I don't see us being "way more powerful" in what we can do today. In raw terms, mainstream CPU development has plateaued around 2010 or so, with merely incremental 10%-15% (at best) improvements year over year. In qualitative terms, are we that much more advanced? Sure, we've gone from HD to 4K (including in our monitors), but it's not like we're solving totally different problems.
>It seems like we're talking past each other. I was referring to effects that computers have on the world and our daily life, but it sounds like you're referring to total worldwide computation speed, and how it will change over time. If you're saying the rate will slow sigmoidally, similar to the progress in physics, I agree.
Yes, I was talking about advancements in computing as a field (and I think the thread started out discussing just this, hence the OP mentioning Newton in the comment that started the subthread), not about what we can apply existing computing equipment to.
>But the thesis is that the rate at which the world changes w.r.t the field of computing is unrelated to the total available computation power. Our minds are the bottleneck.
I don't think the field of computing is of much importance to this observation. Our minds are the bottleneck in all kinds of world-changing things (e.g. achieving world peace) -- the application of computers is just another area where our imagination might limit us.
That said, changes in computational power would enable novel technologies we can already imagine, which could bring far more change to the world than current computers can (not necessarily all for the better, either). Advanced AI is an obvious example. But also more basic, computing-hungry stuff, from entertainment to drug research.
Not sure what you mean here. What were the limits a decade ago?
Turbo Pascal, which is over 30 years old, was arguably on par with modern IDEs on very many fronts.
I would say the contrary: there has been very little improvement on the tools front.
That will need the required ISAs to be recorded somewhere robust enough to last 100 millennia; you can bet it won't be "the cloud". They might even need "Programmer Archaeologists" who will know where to hunt for the ancient specs, and prove Vernor Vinge right (A Deepness in the Sky).
I hope there won't be some Library of Alexandria event in the intervening period.
It would take a pretty well-coordinated attack, but you could probably erase the better part of modern Internet by striking a dozen targets simultaneously.
 - https://aws.amazon.com/about-aws/global-infrastructure/
Amazon will be lucky to last 100 years.
Even if we had digitized them and people had access to them, how many would even care? Heck, most people don't even read the books of their own time -- 50+% read one book a year (if that). And those who read are already overloaded with stuff.
So why assume people in 100,000 years will give a fuck about books and data from 2015?
Apart from some archaeologists, that is -- and even they would have tons of higher-level descriptions of our century, in the form of history books, movies and documentaries, so they won't care much about the hundreds of thousands of individual artifacts we have produced.
My idea was that you could gauge how far along a field has come by whether the pioneers are still alive and how far removed you are from them. CS, despite huge advances, is still in its toddler years.
So, yes. More work to do and exciting days ahead.
I haven't seen any "official" announcement, so I figured Peter Naur's wikipedia page was the best link to use.
DIKU (University of Copenhagen Department of Computer Science) linked to that one. Naur was the first professor at DIKU and one of the founders of the institute.
The Soul of a New Machine, Tracy Kidder, 1981
ought to be required reading for anybody in tech.
Just think how the programming language landscape might look today if recursion had never gotten into mainstream languages, or maybe only had begun to become the norm recently.
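For anyone too young to remember why this mattered: recursion lets a definition mirror the structure of the data. A hypothetical Python sketch (function names are my own) contrasting the recursive style that ALGOL 60 helped normalize with the explicit-stack bookkeeping a language without recursion would force on you:

```python
# Natural with recursion: the code follows the shape of the data.
def depth(tree):
    """Nesting depth of a nested-list 'tree', e.g. [1, [2, [3]]] -> 3."""
    if not isinstance(tree, list):
        return 0
    return 1 + max((depth(child) for child in tree), default=0)

# The same computation without recursion: manage the stack yourself.
def depth_iterative(tree):
    best, stack = 0, [(tree, 0)]
    while stack:
        node, d = stack.pop()
        if isinstance(node, list):
            best = max(best, d + 1)          # this list sits at depth d + 1
            stack.extend((child, d + 1) for child in node)
    return best
```

Both return the same answer; the second is what "mainstream" code might have looked like everywhere if recursion had stayed out of common languages.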
a very famous paper
I stumbled into compilers by accidentally buying the Dragon book as a teen. The BNF part I got through fairly quickly.
> Computing: A Human Activity
Sounds like a forerunner to Agile programming.
For example, RFC 5511: "Routing Backus-Naur Form (RBNF): A Syntax Used to Form Encoding Rules in Various Routing Protocol Specifications"
 "backus normal form vs. Backus Naur form"
But, this is a different topic. Any Turing award winner deserves recognition and reflection on their passing imho.
If anything, I may have inadvertently got it caught up in some sort of flame filter because of a comment I wrote about asking for help if you feel suicidal, then (seemingly, but not actually) contradicting myself later by saying that medical help may not be the best solution.
Flame filters, AFAIK, kick in automatically. Mental health is a hot button issue, and flame fests and controversial discussions are really not good or appropriate for HN. I should know, I've been part of a few and have to remember when to take a step back.