Looking at the decision to adopt some defect prevention strategy in software.
Cost of strategy < ∑(perceived chance of a defect being prevented)*(cost of the defect + cost of correcting the defect)
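The inequality above can be sketched as a simple expected-value comparison. All the numbers below are invented for illustration; the function name and structure are my own, not anything from the paper:

```python
# Toy expected-value check for adopting a defect-prevention strategy.
# A strategy is worth adopting when its cost is below the expected
# savings summed over the defects it might prevent.

def adoption_pays_off(strategy_cost, defects):
    """defects: list of (perceived chance of prevention,
    cost of the defect, cost of correcting the defect)."""
    expected_savings = sum(p * (defect_cost + fix_cost)
                           for p, defect_cost, fix_cost in defects)
    return strategy_cost < expected_savings

# Hypothetical example: a review process costing 10,000 weighed
# against three potential defects.
defects = [
    (0.5, 20_000, 5_000),   # likely caught, expensive in production
    (0.1, 100_000, 2_000),  # rare long-tail event, huge defect cost
    (0.9, 1_000, 500),      # common but cheap to fix
]
print(adoption_pays_off(10_000, defects))  # expected savings = 24,050
```

Note how the middle entry dominates despite its low probability; that is the long-tail effect discussed below.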
1968 vs 2018
Cost of strategy
For most strategies I doubt this changed much, but for some it changed a lot, like Buy vs. Build, where the cost to buy has gone to near zero thanks to npm, NuGet, CPAN, etc.
Cost of the defect
I doubt the perception of this changed much; whether that perception is accurate is up for debate. Software defects are prone to long-tail events that have a disproportionate effect.
Cost of correcting the defect
This went from engineers sent on a plane with physical media to a customer's mainframe, to floppies in the mail, to downloadable patch installers, to asking the customer to patch from within the application, to pushing code and letting automated build, deploy, test, and background updates do the rest. Compared to 1968, the cost went almost to zero.
Strategies have to be better or cheaper than in 1968 to be adopted, because the cost of correcting defects has plummeted for many organizations. Unfortunately, the author only references "cost" once in the 15 pages.
To me, LoC is an indicator of how much you'll spend on maintenance; less is better.
I think that defects per unit of value have plummeted.
Long-tail events are a big problem in software: a few lines of code are responsible for a large share of the costs (for a longer version of this answer, see https://possumlabs.com/the-cost-of-software-bugs-5-powerful-...).
I don't think we should call what VW did "incorrect software". It actually did what it was supposed to do.
Could someone provide an extra clear ELI5 so I can re-read with that context in mind?