
>, Brooks wasn't talking about that. He was talking about productivity boosts due to programming language design and programming methodology.

Brooks was also talking about non-programming-language advancements as possible "silver bullets". See the pdf and look for the following sections that are not about programming syntax:

  Hopes for the Silver
    Artificial Intelligence
    Environments and Tools
  Promising Attacks on the Conceptual Essence
    Buy versus Build
He underestimated AI and machine learning. With hindsight, we see that deep-layer neural nets combined with GPUs beat human rule-based programming for automatic language translation, DeepMind's AlphaZero beat the kind of expert hand-tweaking that went into IBM's Deep Blue, etc.

EDIT reply to : ", nor have they reduced the overall cost of programming by 10x even after more than three decades."

I'm not convinced of that. The issue with your conclusion is that it omits the increasing expectations of more complex systems. I'd argue we did get a 10x improvement -- if -- we hold the size & complexity of the system constant. E.g. write 1970s-style text-based code to summarize sales revenue by region and output it to green-bar dot-matrix printers. This was tedious work in old COBOL but much easier today with Python Pandas or even no-code tools like MS Excel.
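To make that contrast concrete, here's a rough stdlib-only sketch (the sample data is made up) of the kind of region-revenue rollup described above; with Pandas it collapses further to a one-line groupby, and in 1970s COBOL it would be pages of record layouts and PERFORM loops:

```python
# Summarize sales revenue by region: a few lines in modern Python.
from collections import defaultdict

# Hypothetical sample data standing in for the sales file.
sales = [
    ("North", 1200.0),
    ("South", 800.0),
    ("North", 300.0),
    ("West", 450.0),
]

totals = defaultdict(float)
for region, revenue in sales:
    totals[region] += revenue

for region in sorted(totals):
    print(f"{region}: {totals[region]:.2f}")
```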

The invisible factor we often overlook is that our demand for software to do more complex things will always outpace the smaller productivity increases of the tools. This makes it look like our newer programming tools never gave us 10x improvement when it actually did.



How did he underestimate statistical machine learning? Whatever achievements were made, they did not take place within a decade, nor have they reduced the overall cost of programming by 10x even after more than three decades.

And, indeed, the one thing that Brooks presented as the most promising direction was buy vs. build. He said that unlike changes to programming languages and methodology, which wouldn't give a huge boost, buy vs. build might (although he wasn't sure about that). So he was exactly right about that.

Also, 1986 wasn't 1966. In 1986 people didn't write simple software in COBOL. They used things like Magic and Business Basic and even Smalltalk, and, shortly after, Visual Basic and Access. Excel was released in 1987, and we had spreadsheets in the early 80s, too (and Brooks explicitly mentions them in No Silver Bullet). RAD tools and "no code" were very much in vogue in the late 80s and early 90s. That was Brooks's present, not future. He even says that this is the right direction:

I believe the single most powerful software-productivity strategy for many organizations today is to equip the computer-naive intellectual workers who are on the firing line with personal computers and good generalized writing, drawing, file, and spreadsheet programs and then to turn them loose. The same strategy, carried out with generalized mathematical and statistical packages and some simple programming capabilities, will also work for hundreds of laboratory scientists.

When generalised, Brooks's prediction amounts to expecting diminishing returns due to reduction of accidental complexity, and we're seeing exactly that.

> I'd argue we did get 10x improvement -- if -- we hold the size & complexity of the system constant.

Only if we do the one thing Brooks says would work: Buy vs. Build. When we write from scratch -- no way. Even then, I think that while we may see a 10x reduction for a specific simple task, we won't see it for large, complex software, which is where most of the effort in software is invested.


>, Visual Basic and Access. Excel was released in 1987, and we had spreadsheets in the early '80s, too. When generalised, Brooks's prediction amounts to diminishing returns due to reduction of accidental complexity, and we're seeing exactly that.

The "diminishing returns" of _what_ exactly?

That's what I'm trying to make clear. Let me try to restate it another way:

(1) 10x improvement in programming tasks

vs

(2) 10x improvement in completing business projects

I'm emphasizing that (1) has been achieved many times in multiple areas but it's overshadowed by not seeing (2) happen.

I previously mentioned some things that I'm more than 10x faster on now: https://news.ycombinator.com/item?id=23758199

Visual Basic is another good example. When I first used VB forms in the 1990s, it was more than 10x faster than hand-coding the raw C "WndProc()" message loop. But that legitimate productivity gain is dwarfed by the business wanting new complexity (the app needs to connect to the internet, it needs to be a new-fangled web app for browsers instead of a desktop exe, it needs to work on mobile phones, etc., etc.). Our desires for new business functionality multiply faster than the time savings from progress in tools.

And "accidental complexity" isn't fixed either. New deployment environments and new features also add a new set of accidental complexity. E.g. if the next generation of apps needs to interface with virtual reality (headsets, etc.), the programming code will have logic that doesn't have direct business value. So we'll get a new 10x programming tool/library to manage that accidental complexity in the VR environment, but then... we're on to the neural-implant SDK, and we have no silver bullets for that new thing, which means we revisit this topic again.

>while we may see a 10x reduction for specific simple task, we won't see it for large, complex software, which is where most of the effort in software is invested.

I agree. But again to be clear, today's expectation of "large, complex software" -- has also changed.

EDIT reply to: "I'm saying that (1) has not been achieved even within a period of time that's 3x Brooks's prediction, "

Raw Windows SDK C-language WndProc() was late 1980s, and by 1993 I was using Visual Basic 3 to drag & drop buttons onto forms. Just that one example was a 10x improvement within a decade. For line-of-business apps, VB was 10x+ more productive because of the paradigm shift (in addition to things like not worrying about the mental bookkeeping of malloc()/free(), etc.)

>But most tasks cannot be achieved today 10x faster than in 1986

For discussion purposes, I don't know why we have to constantly refer to 1986 even though the paper has that date. Its repeated submission for discussion makes it seem like people consider it an evergreen topic that transcends Brooks's days of the IBM 360 mainframe.

As another example, the productivity improvement is in writing and deploying complex apps using Ruby on Rails or Javascript frameworks deployed on AWS. That's more than 10x faster than the 1990s CGI days of having C-language code write HTML to stdout. Those early web apps were simpler, and yet they were so utterly tedious and slow to code.


I'm saying that (1) has not been achieved even within a period of time that's 3x Brooks's prediction, and that (2) has been achieved as Brooks claimed it would. Again, don't confuse 1986 with 1966. We had Smalltalk in 1980. In 1987 we had Perl. Everybody was using Visual Basic starting in 1991, and Python came out around the same time. The most popular "fast and easy" programming language we have today is 30 years old. We've been using virtually the same IDEs for over 25 years. Clojure would be immediately familiar to anyone who learned Scheme with SICP at school in 1980, and when I was in uni in the mid 90s, you know what the most hyped language was, the one that was thought to take over programming in a few short years? That's right -- Haskell. Things have really changed very little except when it comes to the easy availability of free libraries and knowledge on the internet.

> But again to be clear, today's expectation of "large, complex software" -- has also changed

But most tasks cannot be achieved today 10x faster than in 1986 except by the one way Brooks said it might be.

In other words, Brooks's prediction was incredibly prescient, and it is those who disagreed with him (and many, many did) who turned out to have been wrong.

> I don't know why we have to constantly refer to 1986

Because Brooks's prediction is predicated on the thesis that improvements due to changes in programming languages and methodology mostly impact the ratio of accidental/essential complexity, which means that we expect to see diminishing returns, i.e. a smaller improvement between 1990 and 2020 than we had between 1960 and 1990, which is exactly what we see.

> Just that one example was 10x improvement within a decade

Brooks didn't say there can't be 10x improvements due to languages in any decade, only that there won't be in the following decade(s), because when essential complexity is reduced, there's less of it to reduce further. To grossly over-simplify his claim, yes, we did see a big difference between 1980 and 1990, but we won't see as big a difference between 1990 and 2000. Or, in other words, he claimed that while you certainly can be 10x more productive than Assembly, you can't be 10x more productive than Python.

> As another example, the productivity improvement is in writing and deploying complex apps using Ruby on Rails or Javascript frameworks deployed on AWS. That's more than 10x faster than the 1990s CGI days of having C-language code write HTML to stdout. Those early web apps were simpler, and yet they were so utterly tedious and slow to code.

But that's because the web took us backward at first. It was just as easy to develop and deploy a VB app connected to an Access DB in, say, 1993, as it is to develop and deploy a RoR one today. A lot of effort was spent reimplementing stuff on the web.


>It was just as easy to develop and deploy a VB app connected to an Access DB in,

And that VB desktop app was not accessible to web browsers.

The business expectations/requirements changed.

E.g. when I wrote the VB desktop app for internal sales rep, I didn't need to code a "login screen" because the rep was already authenticated by virtue of being on the corporate network.

But if business says that checking product prices should be "self-service" by outside customers using a web browser, now I have to code a login screen. ... which means I also have to create database tables for customer login credentials... and code the web pages to work for different screen resolutions, etc, etc.
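As a sketch of that extra work, here's roughly what "self-service login" drags in that the internal VB app never needed (table name, column names, and the PBKDF2 parameters are all hypothetical choices, not from any particular app):

```python
# Credentials table plus password hashing: code with no direct
# business value, created purely by the new "customer access" requirement.
import hashlib
import os
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE customer_logins (
    email TEXT PRIMARY KEY, salt BLOB, pw_hash BLOB)""")

def register(email, password):
    salt = os.urandom(16)
    pw_hash = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    conn.execute("INSERT INTO customer_logins VALUES (?, ?, ?)",
                 (email, salt, pw_hash))

def check_login(email, password):
    row = conn.execute("SELECT salt, pw_hash FROM customer_logins WHERE email = ?",
                       (email,)).fetchone()
    if row is None:
        return False
    salt, pw_hash = row
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000) == pw_hash

register("customer@example.com", "s3cret")
print(check_login("customer@example.com", "s3cret"))  # expected: True
```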

Yes, the VB paradigm shift was a 10x productivity improvement over raw C (that's a programming syntax and environment change, not just a library)... but it's overshadowed by having to write more code for previously unavailable business capabilities. New business expectations will always make it seem like we're running to stand still. It's not just the ratio of accidental to essential complexity.

After writing out a bunch of thoughts on this... I propose another way to translate Brooks's paper which is still consistent with his message: new business requirements (new essential complexity) will always outpace programming-technology improvements (e.g. the accidental-complexity reduction of using GC to manage memory instead of manual malloc()/free()).

This is why accidental complexity is always a smaller component of total complexity. Thus, real 10x programming improvements don't actually make us 10x faster at finishing business software.

EDIT reply to: ">business requirements change has little to do with his point"

I interpret "essential tasks" and "essential difficulties" as modeling the business requirements. I'm saying his first paragraph can be interpreted that way. If you disagree, what would be an example of programming code that shows "essential tasks" that's not related to business requirements and also not "accidental complexity"?

(He's saying the essential task is the "complex conceptual structure".)

>For most given requirements, it is not 10x easier to do something from scratch today than it was 30 years ago.

If the given requirements are the same as the 1980s but I also get to use newer tools that didn't exist in the 1980s (SQLite instead of writing raw b-trees, dynamic collections instead of raw linked lists, GUI toolkits instead of drawing raw rectangles to the screen memory buffer by hand, etc.), then yes, coding from scratch will be much faster.
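The SQLite case, for example, is now a few lines of plain Python (the in-memory DB and the `prices` table here are made up for illustration), versus hand-rolling b-tree storage in 1980s C:

```python
# Indexed storage and lookup without writing a b-tree by hand:
# sqlite3 ships in Python's standard library.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE prices (product TEXT PRIMARY KEY, price REAL)")
conn.executemany("INSERT INTO prices VALUES (?, ?)",
                 [("widget", 9.99), ("gadget", 24.50), ("gizmo", 3.75)])

# The b-tree index behind PRIMARY KEY is maintained for us.
row = conn.execute("SELECT price FROM prices WHERE product = ?",
                   ("gadget",)).fetchone()
print(row[0])  # expected: 24.5
```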

>Of course applications 30 years ago had login screens and were accessed remotely, they just didn't do that using HTTP and HTML, which set us back in terms of some capabilities for a while.

This misses the point of my example. I was trying to emphasize the new business requirement of customers -- not employees -- accessing corporate database systems.

Therefore, a new GC language that alleviates the mental burden of malloc() doesn't really help with that new complexity. It wasn't about going from mainframe green screens to http/html. It was about the new business functionality of customer access that makes it seem like programming productivity didn't improve at all.

The mainframe green screen wasn't relevant because customers at home don't have X.25 T1/T3/ISDN connections to the company's mainframe.

This is not an example of "web being backwards" or "catching up to old greenscreens". The end customers didn't previously have access to the mainframe at all. Therefore, it's new business functionality to empower customers that must be coded.

>we haven't developed any new programming paradigm or technique that has increased our ability by much.

Even if we exclude libraries, I still think a garbage-collected language (deployed on a regular commodity PC instead of an expensive Smalltalk workstation) is a 10x productivity improvement over C/C++ malloc/free/new/delete for line-of-business apps. Chasing random pointer bugs will slow productivity way down. And languages like PHP, where HTML templating was a 1st-class concept alongside the code, are a 10x improvement over HTML generated on CGI stdout by C and Perl scripts. New programming paradigms do help the coding aspect of productivity a lot. They just don't increase total business-project productivity.


Of course applications 30 years ago had login screens and were accessed remotely, they just didn't do that using HTTP and HTML, which set us back in terms of some capabilities for a while.

What you're saying about requirements outpacing ability might be true, but it is not Brooks's point, which is also true, and business requirements changing has little to do with his point: for most given requirements, it is not 10x easier to do something from scratch today than it was 30 years ago.

We've certainly accumulated many software components over the past 30 years and made them freely available, and that has helped productivity a lot -- as Brooks wrote it might. But, as he predicted, we haven't developed any new programming paradigm or technique that has increased our ability by much.


I was so productive writing VB apps. I miss those days. Everything in one application, no browser -> server communication delay. You just assumed a minimum screen size of 640x480 and made sure it fit into that. No mobile responsive bullshit, no CSS malarkey to deal with. What you drew in Visual Studio is exactly what the user got.

But there was also a lot of boilerplate involved. No handy-dandy open-source libraries sitting around on the internet that could just be pulled in to deal with a task. You could buy a COM object or a commercial visual control, but it was rare and expensive. If you were doing tricky things you had to work it out yourself the hard way, and make mistakes doing it.

Now... most programmers I know are plumbers wiring up different ready-made product APIs using a scripting language.

Yeah, I think jasode is right - we're 10x as productive now, and I think that's because of the popularity of Open Source and Stack Overflow rather than IDEs (if anything, modern IDEs are less productive than 1990s-era Visual Studio). However, the business tasks have got 10x more complex. And I don't think that's unconnected - things that were previously too complex/difficult/expensive are now routine, and we're now expected to do more.


> we're 10x as productive now, and I think that's because of the popularity of Open Source and Stack Overflow

So you agree Brooks was right.


yeah, basically - there's an amount of complexity which is acceptable to the business. Less than this and the business will add features until it's met. More than this and the features are not commercially viable.

The amount of complexity that the programmer needs to cope with is about the same, regardless.


> And that VB desktop app was not accessible to web browsers.

That wouldn't be a problem. I work in mainframes; web applications of today are a reimplementation of CICS and other green-screen apps. Yes, there are more bells and whistles, it's nicer, but the technology for "distributed" applications has been there for a long time.



