Hacker News

The author mistakes coding for typing. You maybe get faster at typing, but coding also involves thinking and doing research. Unless you do very repetitive tasks and you can type from memory.

I don't think the author is making that mistake. He worked on the Eve language, whose entire ethos was that the premise of Fred Brooks' "No Silver Bullet" -- that no single change to development tooling or methodology can produce a 10x speedup in productivity -- isn't really true anymore. In "Out of the Tar Pit", Moseley and Marks argue that today there is so much incidental complexity (from poorly designed or ill-fitting tools) slowing down our work that removing it would yield the 10x gains Brooks argues can't be achieved in NSB.

The question then is how do we remove that incidental complexity? Eve tried to do this through programming language design, combining Prolog-like semantics with a relational database and modern web technologies. In some scenarios it really is at least 10x more productive than other languages, letting you do very sophisticated things in a few hours or days that could take experienced developers a literal week or more in other languages.

The author is musing here about how much he could get done if his tools got out of the way -- i.e. how much more productive he could be if he only had to deal with the essential complexity of the problem, rather than being mired in all the incidental complexity of the tooling. What if a programmer could express themselves as freely as a writer can? Where typing and imagination are the only barriers between your mind and expression. It's a nice fantasy future.

Yes there is a lot of thinking and doing research in programming, any experienced programmer knows this. I think if you look at this developer's history you would agree he is an experienced developer who knows these things to be true.



(edit: also, I think the author addresses your criticism directly here: "When I think about speed I think about the whole process - researching, planning, designing, arguing, coding, testing, debugging, documenting etc. Often when I try to convince someone to get faster at one of those steps, they'll argue that the others are more important so it's not worthwhile trying to be faster. Eg choosing the right idea is more important than coding the wrong idea really quickly. But that's totally conditional on the speed of everything else!")

> In "Out of the Tar Pit", Moseley and Marks argue that today there is so much incidental complexity (from poorly designed or ill-fitting tools) slowing down our work, that removing it would result in those 10x gains Brooks argues can't be achieved in NSB.

Just to clarify for others, Brooks' paper does not say that there will not or cannot be 10x (order of magnitude) improvements in programming.

He made two principal claims in his paper:

1. That there will not be a 2x improvement in programming every 2 years (comparable to Moore's Law, under which transistor density doubled roughly every 18-24 months). That isn't to say there won't be a 2x improvement in some particular 2-year period, but there will not be consistent 2x improvements every 2 years.

2. That within a decade of 1987 (that is, a period ending in 1997) there would not be a 10x (order of magnitude) improvement in programming (reliability, productivity, etc.) from any single thing.

So trying to refute Brooks' second assertion, as many people do, by pointing to changes after 1997 is absurd: it ignores both what he actually said and the context in which he said it.

I suspect Brooks put a time horizon of 10 years on his predictions because of AI. If we achieved AGI, that would be an obvious example of a single thing that would overnight yield (at least) a 10x improvement in productivity. But it was a safe bet for Brooks to say we wouldn't be there by 1997.

That said, on the topic of environments and tools Brooks writes, "Surely this work is worthwhile, and surely it will bear some fruit in both productivity and reliability. But by its very nature, the return from now on must be marginal." This claim is not timeboxed to a decade; it is projected to perpetuity. He boldly asserts, without support, that from now on improvements to tooling and environments definitionally cannot yield 10x improvements in programming. I think this is where people have taken issue, and have shown the claim to be wrong.

I think this error comes from a misconception in the previous paragraph: "Language-specific smart editors are developments not yet widely used in practice, but the most they promise is freedom from syntactic errors and simple semantic errors."

There is no reason to believe that language-specific smart editors could only save you from syntax and simple semantic errors. In fact, language-specific editors in the ethos of Eve and Light Table supported sophisticated debugging and development modalities:

- time-travel debugging capabilities that rewind the state of a program to a previous point in its execution history

- what-if scenarios to test how the program would respond to different inputs without restarting it

- provenance tracing of program state to see how it was calculated

- live debugging and modification of a program while it's running

- saving, sharing, and replaying program state to trace bugs

- the ability to ask new questions about program output e.g. "Why is nothing drawing in this area of the screen?"

These are not "simple semantic errors" but deep program analyses and insights that are not even supported by most (any?) mainstream languages (nor could they be without changing core language semantics or adding on new language features with orthogonal semantics). Brooks doesn't imagine any of this and dismisses it all as "marginal" whereas in my experience actually using these tools, I found them literally transformative to my productivity and workflow.
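To make the first of those modalities concrete, the core mechanism of time-travel debugging can be sketched in a few lines (a toy Python illustration of the idea only; real systems like Eve's debugger record far richer state):

```python
# Toy sketch of time-travel debugging: keep every state the program has
# been in, so execution can be rewound to any earlier point.

class TimeTravelCounter:
    def __init__(self):
        self.history = [0]        # full execution history, oldest first

    @property
    def value(self):
        return self.history[-1]  # current state is the latest entry

    def increment(self, by=1):
        self.history.append(self.value + by)

    def rewind(self, steps):
        # Travel back `steps` transitions into the past.
        if steps > 0:
            del self.history[-steps:]

c = TimeTravelCounter()
c.increment()
c.increment(5)    # c.value is now 6
c.rewind(1)       # undo the last transition; c.value is back to 1
```

The same recorded history also gives you replay and what-if experiments almost for free: rewind to an earlier state, then apply different inputs.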

Probably just my misunderstanding:

Some of the items in your list sound like features that Smalltalk IDEs already had back in the late '80s?

- time-travel debugging

- live debugging and modification

- saving, sharing, and replaying

p304, "Debugger Windows", in "Smalltalk/V 286 Tutorial and Programming Handbook"



And later, Rewrite Rules:

A very large Smalltalk application was developed at Cargill to support the operation of grain elevators and the associated commodity trading activities. The Smalltalk client application has 385 windows and over 5,000 classes. About 2,000 classes in this application interacted with an early (circa 1993) data access framework. The framework dynamically performed a mapping of object attributes to data table columns.

Analysis showed that although dynamic look up consumed 40% of the client execution time, it was unnecessary.

A new data layer interface was developed that required the business class to provide the object attribute to column mapping in an explicitly coded method. Testing showed that this interface was orders of magnitude faster. The issue was how to change the 2,100 business class users of the data layer.

A large application under development cannot freeze code while a transformation of an interface is constructed and tested. We had to construct and test the transformations in a parallel branch of the code repository from the main development stream. When the transformation was fully tested, then it was applied to the main code stream in a single operation.

Less than 35 bugs were found in the 17,100 changes. All of the bugs were quickly resolved in a three-week period.

If the changes were done manually we estimate that it would have taken 8,500 hours, compared with 235 hours to develop the transformation rules.

The task was completed in 3% of the expected time by using Rewrite Rules. This is an improvement by a factor of 36.

from “Transformation of an application data layer” Will Loew-Blosser OOPSLA 2002
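The shape of that data-layer change can be sketched roughly as follows (a hypothetical Python illustration of the before/after idea, not the Smalltalk code from the paper; all names are invented):

```python
# Before: the framework discovers the attribute -> column mapping by
# reflection on every access (the dynamic lookup that consumed 40% of
# client execution time).
class DynamicDataLayer:
    def columns_for(self, obj):
        return {name: name.upper() for name in vars(obj)}

# After: each business class provides the mapping in an explicitly
# coded method, so no runtime lookup is needed.
class Trade:
    def __init__(self, ticker, qty):
        self.ticker = ticker
        self.qty = qty

    def column_mapping(self):
        return {"ticker": "TICKER", "qty": "QTY"}
```

The rewrite rules then mechanically generated the explicit mapping method for each of the ~2,100 affected classes -- exactly the kind of repetitive, pattern-shaped edit that tooling does faster and more reliably than people.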


There are no silver bullets if you are already near the top of your game. However, in my experience, most engineers and programmers are far from working optimally.

Here's the silver bullet that seems to work very well.

> Master your current stack.

Time and time again I see programmers who know computer science but don't really know their language. They run to Stack Overflow for everything from sorting to reading files. They make obvious syntax errors. They pause constantly, not to think about problems, but to remember how to implement something simple.

Most programmers can 10x themselves just by learning their language and stack really well.
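For example, the two tasks mentioned above are each a couple of lines for a Python programmer who knows the standard library (a generic sketch; the file name is invented):

```python
from pathlib import Path
import tempfile

# Sorting: the built-in sorted() plus a key function covers most
# "how do I sort by X?" questions without leaving the language.
people = [("Grace", 45), ("Ada", 36), ("Alan", 41)]
by_age = sorted(people, key=lambda p: p[1])   # [("Ada", 36), ("Alan", 41), ("Grace", 45)]

# Reading (and writing) a file: one call each, no manual open/close.
path = Path(tempfile.gettempdir()) / "demo_notes.txt"   # invented file name
path.write_text("hello\n", encoding="utf-8")
text = path.read_text(encoding="utf-8")                 # "hello\n"
```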
