> Let me say it plainly: We know how to produce small portions of software using small development teams (up to 10 or so) but we don't know how to make software any larger except by accident or by rough trial and error. Because the software we're trying to build is too massive, it is simply too difficult to plan it all out, and we have no idea how to coordinate the number of people it takes. Every piece of software built requires tremendous attention to detail and endless fiddling to get right.
I was faced with the same problem when I wrote my first program in assembler. I solved this by making blocks. At the beginning and end of each block, I put the requirements and guarantees on the value in each register in a comment.
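In a higher-level language the same discipline translates to something like this rough Rust sketch (the function and its bounds are illustrative, not from my original assembler): requirements stated at the top of the block, guarantees stated at the bottom.

```rust
/// Integer average of a slice.
///
/// Requires: `xs` is non-empty.
/// Guarantees: the result lies between min(xs) and max(xs).
fn average(xs: &[i64]) -> i64 {
    // Requirement stated (and checked, in debug builds) at block entry.
    debug_assert!(!xs.is_empty(), "requirement violated: xs is empty");
    let sum: i64 = xs.iter().sum();
    let avg = sum / xs.len() as i64;
    // Guarantees stated and checked at block exit.
    debug_assert!(avg >= *xs.iter().min().unwrap());
    debug_assert!(avg <= *xs.iter().max().unwrap());
    avg
}
```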
I work as an IT consultant and maybe my vision is skewed, but I have never encountered that very basic level of specification before. At the companies where I have worked, requirements are usually specified verbally, in a loose way. Additionally, OOP concepts are (ab)used, so there is a lot more state to keep track of.
I imagine that if you think about the structure of the project and specify interfaces of units very precisely (both in terms of what in- and output is allowed, what data types, etc.), this would work a lot better.
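For instance, even a tiny interface like the following (a hypothetical sketch in Rust; the names are mine) pins down the allowed inputs, outputs, and failure modes far more precisely than a verbal requirement ever does:

```rust
/// The unit's contract as types: allowed inputs, allowed outputs,
/// and every failure mode are explicit at the boundary.
pub enum LookupError {
    NotFound,
    Ambiguous { candidates: usize },
}

pub trait CustomerDirectory {
    /// Input: a query string (non-empty, already trimmed).
    /// Output: exactly one customer id, or a named error; no nulls,
    /// no sentinel values, no exceptions crossing the boundary.
    fn find_customer(&self, query: &str) -> Result<u64, LookupError>;
}
```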
In terms of working in a structured way myself, my background in math and low-level programming has helped me a lot. In terms of working in a team, it has made me aware of how little concrete information is written down, and how much information is in people's heads only.
> Programming languages have hardly shown one scintilla of difference from the designs made in the first few years of computing. All significant programming languages are expressively, conceptually, and aesthetically equivalent to Fortran and assembly language.
There's been a hidden (or perhaps not so hidden) war between the first-principles people and the hackers since we started writing software.
The first-principles people want to treat software as a mathematical discipline and prove software correct. They want to tackle problems from the top down and rely heavily on abstraction. The Lisp and ML language families follow this approach.
The hacker approach is to iterate until something works. This is the prevailing method. It treats writing software as a craft or trade over a scientific discipline. Correctness is impossible to reason about, but it's quick, dirty, and gets the job done.
In the early days, first principles lost because it was slow and time-consuming. Errors in correctness were less detrimental; the world had yet to become so reliant on software. Quick and dirty meant companies could hire from a broader pool of people than just trained computer scientists. It made software a commodity.
Now we're reaching a crossover, where the first-principles approach has narrowed the efficiency gap and quick and dirty is starting to cost us fortunes. Bugs are now a matter of national security, or cause companies to betray the privacy of billions of people.
Our languages do need to evolve to approach problems from first principles. Correctness is now something we can have alongside speed, and it has become more important than ever.
This is why I'm bullish on Rust, a language designed from a first-principles perspective but as efficient as C++. But I think we have to stop accepting lowest-common-denominator languages. We have to wean ourselves off easy languages that stop us from reasoning about the correctness of our programs. It's not going to be easy, but the weekly breaches of our privacy and the ongoing cyberwarfare between nation states will push us in the right direction eventually.
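To make that concrete, here's a toy Rust sketch of my own (not from any article): the invariant lives in the type, the compiler enforces it, and it costs nothing at runtime.

```rust
/// Connection state lives in the type; `match` must be exhaustive,
/// so forgetting the closed case is a compile error, not an outage.
enum Connection {
    Open { bytes_sent: usize },
    Closed,
}

fn send(conn: &mut Connection, buf: &[u8]) -> Result<usize, &'static str> {
    match conn {
        Connection::Open { bytes_sent } => {
            *bytes_sent += buf.len(); // stand-in for a real write
            Ok(buf.len())
        }
        Connection::Closed => Err("send on closed connection"),
    }
}
```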
> iterate until something works... craft or trade over a scientific discipline.
Huh? Science is the quintessence of iterating "until something works", i.e. until you have a theory that appears to fit the observations, until it doesn't, and then you iterate again to find a new theory.
Science is exactly the turning away from a "first principles" approach, and only when we did turn away from that "first principles" approach did first the scientific and then the industrial revolution take off.
And of course in software, the "first principles" approach was what we successfully overcame with agile approaches that stress quick feedback loops; the same thing happened in manufacturing and other disciplines as well.
> Huh? Science is the quintessence of iterating "until something works"
It is not enough in science to show that something works; you must also prove that it is correct. That is why in mathematics, even though we believe something to be true through intuition, it is still simply conjecture until we have a rigorous proof. We may even call it a theory if we can observe it with a high degree of certainty. Science is effective because we pay attention to correctness at every level, slowly building understanding in an iterative fashion. The only assumptions we make are at the level of axioms.
The current state of programming makes no claims about the overall correctness of a program. We may write tests about various subsystems, and increase our chances that a system will behave as we expect, but a rigorous proof of correctness, or even something close to a proof of correctness, is still a fantasy.
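To see the gap, compare what a machine-checked proof looks like in a proof assistant. A deliberately trivial Lean example (mine, purely illustrative):

```lean
-- Not a test over sampled inputs: a claim checked for every pair of
-- natural numbers, verified mechanically by Lean's kernel.
theorem add_comm_all (a b : Nat) : a + b = b + a :=
  Nat.add_comm a b
```

Scaling that kind of guarantee from arithmetic facts up to whole programs is exactly the part that remains a fantasy.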
> Science is exactly the turning away from a "first principles" approach
Science is a first-principles approach. Invention is not necessarily scientific.
> And of course in Software, the "first principles" approach was what we successfully overcame with agile approaches that stress quick feedback loops
Agile has nothing to do with software correctness. It's a methodology for gathering and executing on requirements and constraints. What I'm talking about is ensuring a program behaves in a way we can reason about mathematically.
Math ≠ Science. In fact, it is questionable whether math even is a science.
"There is disagreement,[16][17] however, on whether the formal sciences [these include mathematics] actually constitute a science as they do not rely on empirical evidence."
"Thus we are led to ask: What is a proof? Heuristically, a proof is a rhetorical device for convincing someone else that a mathematical statement is true or valid."
Not to mention, Lisp family, really, top-down, prove correctness, first principles? This is a family of the most flexible languages, the very best tools for iterative, bottom-up, experimental development.