
Exactly. This is ignoring the fact that code is a more useful blueprint than anything else for programs.

I'm sure if architects had the ability to magically conjure building materials out of thin air and try things in real life for free, architecture would involve a lot more trying and a lot less planning.

Of course, I'm sure the article is talking about people just rushing into production code without thought, and I get that that is a problem. It's just that when people make comparisons like that, it can imply this horrendous future where we whiteboard programs for months, which is just a terrible idea.




I think there's a bit of a problem with the architecture metaphor in general. It implies a limited range of things planned and built. As software eats the world and does more and more things, a metaphor that more closely reflects the variety of software is people telling other people to do stuff.

If you tell someone to bring you your socks, you don't need to plan for it. If they bring the wrong socks, you just fix it. If you tell them to invade the beaches at Normandy, you might want to work out more of the details in advance. You can tell someone to remove a splinter or remove a brain tumor, and your part of the instructions might be roughly equivalent if you are telling an "API" that has already been adequately told how to do what you ask.

The problem of unintended consequences of instructions has been with us far longer than computer software. In any story of a magic genie granting three wishes, the third wish always ended up being a version of `git reset --hard`. I love having direct manipulation tools that simulate your proposed solution, giving you much faster feedback. Midas with VR goggles would have quickly seen something unintended turn to gold and canceled before committing. That's extremely helpful.

But this isn't the ultimate solution for how to deal with software complexity. It's a very helpful tool in some cases. Some software should still just be coded immediately and fixed as needed (takes less time to do it again than to create a simulator), some would benefit most from a good, debugged library (I'd rather the robot already know how to remove the tumor than show me direct feedback of my fumbling), some from direct manipulation tools, some from mathematical techniques (remembering that mathematically proven software is buggy if an errant RAM chip violates the axiom that `0=0`), some from better testing, some from better runtime monitoring, and so on.

But as with humans' verbal instructions, there will always be leftover unanticipated consequences due to flaws in the spec, bugs in the code, and breakage in the implementation.


One way the architecture metaphor breaks down is that very tiny details can have incredibly large knock-on effects in software. This is different from normal architecture and building engineering, where it's certainly true that there are many details that need to be carefully considered, but those things are well understood and managed from project to project. Software, on the other hand, could be doing anything, in a world of pure logic traversing scales analogous to everything from the smallest quark up to the entire universe. Building physical things just doesn't deal with those scales; you only have to worry about material properties and structures within a range of a few orders of magnitude.


Apparently a "subtle conceptual error" can have massive consequences in architecture.

http://people.duke.edu/~hpgavin/cee421/citicorp1.htm


Coding isn't always the best blueprint. One example: what if you're writing a service that talks to two other internal services and a third-party API? Your code might capture what your specific service does, but it doesn't capture the overall design and intent of the complete system. That's one of the places where formal methods like TLA+ excel.


You might be interested in Servant, a Haskell library that uses types to capture these high-level interactions between components and then validate them, and even generate clients, documentation, servers, and simulators.
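For a flavor of what that looks like, here's a minimal sketch of a Servant API type. The `Item` type, its fields, and the `/items` endpoints are hypothetical, purely for illustration; the combinators (`:>`, `:<|>`, `Get`, `Post`, `ReqBody`, `JSON`) are Servant's own.

```haskell
{-# LANGUAGE DataKinds     #-}
{-# LANGUAGE DeriveGeneric #-}
{-# LANGUAGE TypeOperators #-}

module ItemApi where

import Data.Aeson   (FromJSON, ToJSON)
import Data.Proxy   (Proxy (..))
import GHC.Generics (Generic)
import Servant.API

-- Hypothetical domain type; the fields are illustrative only.
data Item = Item { itemName :: String, itemCount :: Int }
  deriving (Generic, Show)

instance ToJSON   Item
instance FromJSON Item

-- The whole HTTP interface expressed as a single type:
--   GET  /items  returns the list of items
--   POST /items  accepts an Item and echoes it back
type ItemAPI =
       "items" :> Get '[JSON] [Item]
  :<|> "items" :> ReqBody '[JSON] Item :> Post '[JSON] Item

-- A value-level witness of the API type; companion libraries
-- (servant-server, servant-client, servant-docs) consume this to
-- derive server skeletons, client functions, and documentation.
itemAPI :: Proxy ItemAPI
itemAPI = Proxy
```

The point is that the interaction contract lives in one type, and the concrete servers, clients, and docs are all checked against it rather than drifting apart.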


I definitely would! One thing a lot of people miss in these discussions is that this isn't an either-or, and we should be using multiple different correctness techniques to increase our confidence.


> This is ignoring the fact that code is a more useful blueprint than anything else for programs.

Personally, I find good ol' bubble-and-arrow diagrams, lists of data members, and hand-written C structs with arrows between them to be extremely valuable prototyping tools.


It's true that source code is a blueprint (we don't have builders; for software, unlike civil engineering, the build process is done by automated tools, not people).

There's certainly an argument that coding is often done without clear understanding of the larger-scale structure that the module is going to fit into, the exact domain functionality, etc., and that much development has gone too far in avoiding analysis and requirements, overreacting to the broken and frontloaded abstract design process that often used to be done prior to writing any code.

But missing blueprints are the wrong analogy.


> code is a more useful blueprint than anything else for programs

This is so clearly false that it can actually be mathematically disproven. To prove that there are better software blueprints than code, I will prove that there exist algorithms/systems for which sufficient formal blueprints exist, yet which no code can capture. To prove that, I will need to define the terms more precisely. A "sufficient blueprint" is one that makes it possible to prove properties of interest about your software. "Code" is a formal representation of an algorithm that always allows efficient mechanical execution, meaning that mechanically executing code is within a polynomial bound of the complexity of the algorithm that code expresses.

Now for the counterexample: take the specification of the Quicksort algorithm, [1], shortened here as: 1. Pick a pivot. 2. Partition the elements around the pivot. 3. Recursively apply 1 and 2 to the resulting partitions.

I claim that this specification is sufficient. For example, from a direct formal statement of it, you can formally prove that it actually sorts and that it runs in worst-case quadratic time complexity. Yet I claim that no code can express the above specification; in fact, no code can express any of the three steps. For example, step 1 says "pick a pivot", but it doesn't say which. This is not an omission. It doesn't say which because it doesn't matter -- any choice will do. And yet, code must necessarily say which pivot to pick, and when it does, it no longer captures the specification, and is no longer a blueprint for all implementations of that specification. Similarly for the other two steps: code must specify which of the many possible and perfectly acceptable partitions are created, and in what order the recursion takes place. QED
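To make the gap concrete, here is a minimal Haskell sketch of just one of the many implementations that satisfy the specification; every concrete choice below is the code's, not the spec's:

```haskell
-- One implementation among many that satisfy the spec above.
quicksort :: Ord a => [a] -> [a]
quicksort []     = []
quicksort (x:xs) = quicksort smaller ++ [x] ++ quicksort larger
  where
    -- "pick a pivot": this code commits to the head element x
    smaller = [a | a <- xs, a <  x]  -- one particular partitioning
    larger  = [a | a <- xs, a >= x]  -- out of the many acceptable ones
```

Picking the median instead of the head, partitioning in place, or recursing right-before-left would all still satisfy the specification, but each is a different piece of code.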

Languages like TLA+ can formally, i.e., mathematically, capture precisely the specification above and express anything code can. There is no magic here: Efficient execution of code (provably!) cannot be done in the presence of some quantifiers, yet quantifiers increase the expressiveness and "generality" of the language.
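As a rough illustration (ordinary math notation, not actual TLA+ syntax, with `in`, `out`, `n` and the permutation symbol purely illustrative), the kind of quantified statement such a specification rests on says the output is sorted and is a rearrangement of the input, with no commitment to how that is achieved:

```latex
% Illustrative only; standard notation, not TLA+ syntax.
\forall i, j:\ 0 \le i < j < n \;\Rightarrow\; \mathit{out}_i \le \mathit{out}_j
\qquad \text{and} \qquad
\exists \pi \in S_n:\ \forall i:\ \mathit{out}_i = \mathit{in}_{\pi(i)}
```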

[1]: https://en.wikipedia.org/wiki/Quicksort#Algorithm


Oh dear. Thank goodness civil engineers do not build by trial and error. Many programmers do use the trial-and-error approach, but their programs are riddled with bugs. There is no way that code is the best or most useful "blueprint" for software.

Your analogy is quite good, though, just not in the way you intended. If civil engineers did what you suggest, then they would build magnificent structures which work in their office environment. Of course, when that window that's never opened is opened by a cleaner, everything comes tumbling down due to a gust of wind.


> Thank goodness civil engineers do not build by trial and error.

Oh but they do. Hundreds - if not thousands - of years of trial and error and catastrophes and deaths.

But the failures get written up, taught, remembered, and applied to future projects.

(And even then not perfectly - witness Mexico City and buildings on a drained lake in an earthquake zone; or Houston's wild building expansion in a hurricane-prone floodplain.)

Software, in contrast, has barely 60 years under its belt and a culture of hiding failures. Give it another 500 years and it'll start to look more like the architecting and building industries we have today.



