Even the simplest "real world" programs do things like communicating with a database, manipulating strings, and calling library functions that can fail.
> As someone who actually does formally prove things about algorithms significantly larger than that from time to time, I believe you’re exaggerating here.
I wrote a master's thesis that touched on the subject, and this is what I put in the introduction as an illustration of the field's difficulty (I believe it was not plucked out of thin air ;). I don't know the field now, but ca. 10 years ago proofs of programs were mainly refinements of high-level axioms down to executable code, so there was no separate "proof" step over a finished program. Even so, the added complexity of proving the refinement steps was huge.
Sure, but these things are exactly what I’m talking about with accidental complexity. You’re talking here about databases and strings and library functions, not about whatever real world problem you’re ultimately trying to solve. Complexity that comes from the tools you use or how you use them is exactly what I’m arguing we should ideally minimise when programming, but we often don’t for various reasons.
```haskell
add x y = x + y
```
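For what it's worth, even this one-liner admits a short machine-checked proof of a property. Here is a sketch in Lean 4 (my own illustration, not from the thread; it assumes the standard library's `Nat.add_comm` lemma):

```lean
-- The trivial program from above, over natural numbers.
def add (x y : Nat) : Nat := x + y

-- A machine-checked property: `add` is commutative.
-- `add x y` unfolds definitionally to `x + y`, so the library lemma applies.
theorem add_comm' (x y : Nat) : add x y = add y x :=
  Nat.add_comm x y
```

The point of the earlier objection is that the gap between this and a program that talks to a database is enormous.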
This is why I prefer to talk about minimising accidental complexity rather than eliminating it. Essential complexity is logic you can’t avoid. It’s fundamental to the problem you’re solving, and any correct and complete solution must take it into account. Accidental complexity is logic that in principle you can avoid. However, sometimes you don’t want to: diagnostics and test suites bring practical benefits other than directly solving the original problem, and we will usually accept some extra complexity in return for the value they add. Similarly, using a tried and tested tool, albeit one designed to solve a more general problem, may be more attractive than implementing a completely custom solution to our specific problem from scratch, even though again it might introduce extra complexity in some respects.
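To sketch the distinction with a toy example (my own illustration, with hypothetical names, not anything from a real codebase): in Haskell, the essential logic of "add two numbers" is one line, while the plumbing around inputs that can fail is accidental complexity we may or may not choose to accept.

```haskell
-- Essential complexity: the problem really is just addition.
add :: Int -> Int -> Int
add x y = x + y

-- Accidental complexity: the same logic once both inputs come from
-- operations that can fail (a database read, a parse, a library call).
-- The Maybe plumbing is about our tools, not about addition itself.
addFromLookups :: Maybe Int -> Maybe Int -> Maybe Int
addFromLookups mx my = add <$> mx <*> my
```

Applicative style keeps the plumbing to one line here; in real code the failure handling often dwarfs the essential logic, which is exactly the complexity worth scrutinising.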
I don’t really mind whether we break down accidental complexity into finer divisions. There are lots of sources of accidental complexity. There are lots of potential benefits we might receive in return for accepting some degree of accidental complexity. My original point was just that some complexity is avoidable and often it is possible to simplify real world code by eliminating some of that accidental complexity, even if we choose to accept it to a degree for other reasons.