
I normally agree with Dan's essays, but this time I very much disagree. As someone who first started programming around the time Brooks's article was written, my conclusion is that not only was Brooks right but, if anything, he was too optimistic.

In the late 80s and early 90s, a lot of software development was done with various RAD languages (Magic, Business Basic -- my first programming job used that -- and later VB). We fully expected the then-nascent Smalltalk, or something like it, to become dominant soon; does anyone claim that JS or Python is a significant improvement over it? Our current state is a big disappointment compared to where most programmers in the 80s believed we would be by now, and very much in line with Brooks's pouring of cold water (perhaps with the single exception of the internet).

My perception is that the total productivity boost over the past three decades is less than one order of magnitude (Brooks was careful to predict only that no single improvement in language design or programming methodology would yield a 10x boost within one decade), and almost all of it comes from improvements in hardware and the online availability of free libraries (Brooks's "Buy vs Build", which he considered promising) and information -- not from changes in programming methodology or language design (although garbage collection and automated unit tests have certainly helped, too). The article also mixes in hardware improvements and their relationship to languages, but we knew back then that those were coming, and I think they were factored well into Brooks's prediction. Moreover, my perception is that we're in a period of diminishing returns from languages, and that the productivity improvements Fortran and C offered over Assembly are significantly greater than the gains since.

The best way to judge Brooks's prediction, I think, is in comparison to opposite predictions made at the time -- like those that claimed Brooks's predictions were pessimistic -- and those were even more wrong in retrospect.

I would also add that if you want to compare the ratio of essential to accidental complexity in line with Brooks's prescient analysis, you should compare the difficulty of designing a system in an accidental-complexity-free specification language like TLA+ to the difficulty of implementing it in a programming language from scratch. I find the claim that this ratio has improved by even one order of magnitude, let alone several, to be dubious.

> Brooks states a bound on how much programmer productivity can improve. But, in practice, to state this bound correctly, one would have to be able to conceive of problems that no one would reasonably attempt to solve due to the amount of friction involved in solving the problem with current technologies.

I don't think so. Although he stated it in practical terms, Brooks was careful to make a rather theoretical claim -- one that's supported by computational complexity results obtained in the 80s, 90s and 00s, on the hardness of program analysis -- about the ability to express what it is that a program is supposed to do.




It is my observation that if you want to write a GUI, the tools peaked around Delphi and Visual Basic 6. Git is way nicer than making manual PKZIP backups, and there are nice refactoring tools in Lazarus... but it's not THAT much better.

What was really surprising to me is the lack of GUI building IDEs for python, etc. Now I know why they are scripting languages, and not application languages.


> What was really surprising to me is the lack of GUI building IDEs for python

https://www.learnpyqt.com/tutorials/first-steps-qt-creator/
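The workflow there is roughly: lay the window out in Qt Designer, save a .ui file, then load it from Python. A minimal sketch, assuming a hypothetical mainwindow.ui saved from Qt Designer:

    # minimal PyQt5 sketch -- "mainwindow.ui" is a hypothetical file from Qt Designer
    import sys
    from PyQt5 import QtWidgets, uic

    app = QtWidgets.QApplication(sys.argv)
    window = uic.loadUi("mainwindow.ui")  # widgets defined in Designer become attributes of `window`
    window.show()
    sys.exit(app.exec_())

Since the .ui file is loaded at runtime, you can keep tweaking the layout in Designer without touching the Python side.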


I went through wxBuilder, but the problem is the same... it's a one-way trip, and once you start hooking Python to the generated code, you lose the ability to tweak the UI without losing work.

I got around this with wxBuilder by building my own interface layer to decouple everything, but then I needed to change a list to a combo box, and everything broke.
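
Roughly the kind of interface layer I mean -- a simplified sketch with hypothetical names, not the actual code:

    # sketch of a decoupling layer over wxPython widgets (hypothetical names)
    import wx

    class ChoiceAdapter:
        """Hide whether the generated widget is a wx.ListBox or a wx.ComboBox."""
        def __init__(self, widget):
            self.widget = widget

        def selected_text(self):
            if isinstance(self.widget, wx.ComboBox):
                return self.widget.GetValue()
            return self.widget.GetStringSelection()  # wx.ListBox

    # application code only ever calls adapter.selected_text(), so in theory
    # swapping the underlying widget type should only touch this class.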

Things that take HOURS this way are a few seconds in Lazarus.


With Tkinter, Python itself was the GUI-building IDE. (I have no idea why Tkinter isn't more popular. The Tcl/Tk widgets that it wraps are boss. For example, the text editor widget comes with word wrap modes, undo/redo, tags, marks, customizable selection patterns, rich text, embedded widgets, ... it's really cool.)
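
A minimal sketch of what the Text widget gives you out of the box (word wrap, undo, tags):

    # minimal Tkinter sketch: the standard-library Text widget
    import tkinter as tk

    root = tk.Tk()
    text = tk.Text(root, wrap="word", undo=True)   # word-wrap mode and an undo/redo stack
    text.pack(fill="both", expand=True)
    text.insert("1.0", "Hello, rich text world")
    text.tag_configure("highlight", background="yellow")
    text.tag_add("highlight", "1.0", "1.5")        # tag a range of characters
    root.mainloop()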


I remember writing stuff in Delphi 6, and making GUIs resize correctly wasn't particularly convenient.

On the other hand, Qt3-era Qt Designer was great at everything Delphi was good at and much better at layouts.


How maintainable were creations in those old RAD tools? Because in my limited experience, problems quickly outgrow them or turn into nigh-incomprehensible monstrosities.


It depends on what you compare it to, I guess. I think Qt fares pretty well once you get over the initial learning curve, for instance.

But these days it seems that the standard is web-based interfaces, and honestly, whatever scaling problems these "old" RAD tools have, the web has them twenty times over.

I was late to the webdev party, I only reluctantly started to write JS a couple of years ago, and to this day I'm still baffled by how barebones it is compared to the GUI toolkits I was used to. You have to import a trillion dependencies because the browser, despite being mostly a glorified layout engine, doesn't really support much but bare primitives.

Things like date pickers or color pickers are a recent development and are not supported everywhere.

Customizing range inputs is not really supported by browsers and requires a heavy dose of JavaScript and CSS to achieve, and you end up with something that won't look or feel like a native control in all browsers.

Ditto for treeviews.

Styling and customizing combo-boxes is so limited and browser-dependent that you have dozens of libraries reinventing the wheel by creating completely custom controls, each with their own quirks and feature set.

There's no built-in support for translations and localization (unless you count the Accept-Language HTTP header, I suppose). On something called "the World Wide Web", that's pretty embarrassing and short-sighted IMO. But you do have a Bluetooth stack now, so that's nice.


Can't speak about Visual Basic, other than it apparently worked just fine for small-to-medium complexity UIs.

On the Delphi side, there was a significant difference: code produced by someone just starting out tended to be an unholy mess of generated code mixing UI and non-UI operations, but it was quite workable for bigger projects if the developer was more experienced and put some architectural thinking into the project (for example, separating business logic from the UI code that called into it).


I suspect this is the key. The RAD tools made application developers more productive, but they came with a downside: if the tool became popular, vendor lock-in kicked in, and it became expensive, or the company maintaining it stopped doing a proper job of supporting it.

Therefore, in the 90s, people became tired of this and looked for ways out of the vendor lock-in. That's why OSS started to get traction, and also Java. It turned out that it is cheaper to develop your business app in Java from scratch than to build it in a commercial RAD tool and then pay through the nose to a disinterested company for maintaining the base on which it stands.

So I think OSS is actually less productive and less polished than it could be (many of the RAD tools are actually really cool, but insanely expensive), but it is still a case of worse is better.

I think it's possible that some business DSLs (which are at the core of the RAD tools) will win mindshare again, but it is going to be quite difficult.

(I work in mainframes, and there are quite a few RAD tools that were pretty good in the 80s, when the mainframe was the go-to choice for large business apps.)


Agreed. And Dan isn't dealing with ANY essential complexity here. Seriously: he's copying logs and generating plots from them. That's no different from copying logs and generating plots in 1986. Only the underlying software and hardware infrastructure has advanced. Conceptually it's an identical problem. And touchingly he thinks that having a grasp of how to use sophisticated tools to achieve what is a very simple task is somehow essential complexity. It isn't. It's still just copying logs from one place to another and generating a plot from them.

And you and Brooks have been absolutely right: there was no one improvement in language design that gave a 10x boost in productivity.


I think you misunderstood. Dan agrees there is no essential complexity. On logs: "this task ... is still nearly entirely accidental complexity". On query and plot: "I think it's unbelievable that essential complexity could have been more than 1% of the complexity I had to deal with".

Brooks claimed in No Silver Bullet that a 2x programming productivity improvement is unlikely because essential complexity now dominates over accidental complexity. Dan is trying to demonstrate that this is completely untrue.


But you can't demonstrate it to be untrue by picking specially crafted examples (which, BTW, are simple not because of any advancements Brooks was talking about in PL design and programming methodology, but due to hardware improvements and the availability of OTS components). You could only demonstrate that by showing that most of the effort in software at large is of this kind, and that the improvement is due to language design and/or programming methodology.

When distilled, Brooks's claim is that the relative share of accidental complexity in programming overall is (or was) "low", i.e. less than 90%, while Dan wants to claim it is (or was) "high", i.e. more than 90% (if accidental work is a fraction f of the total, eliminating all of it yields a speedup of 1/(1-f), so a 10x gain requires f > 0.9). His experiment does not establish that.


Exactly. Developing an operating system, for example, still has the same ratio of essential to accidental complexity as it ever did, and guess what: OSs are still written in C and assembler. Has there been an order-of-magnitude improvement in productivity because of programming language design in either of those two languages since 1986? Nope.


> Our current state is a big disappointment compared to where most programmers in the 80s believed we would be by now, and very much in line with Brooks's pouring of cold water

People not exposed to that era have a hard time believing it. My litmus test is this: how easy is it for any person to make a button in a given computing system? On a Mac in the late 90s this was so easy that non-programmers were doing it (HyperCard). Today, where would one even begin?




