Hacker News

I think your fallacy is in the "less code" assumption when you say "The result is less code". I'd argue that empirically we've seen this to be false. The result isn't less code, at least in a global sense; it's more productivity, more features, more customization, and more specificity, with less code per feature. Software has really interesting economics: as the cost per feature drops by some factor, the set of features that can be profitably worked on expands by something like 10x, so paradoxically, as cost per feature decreases, it makes sense to hire more engineers and expand R&D budgets.
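The arithmetic behind that claim can be made concrete with a toy sketch. All numbers below are illustrative assumptions (none come from the thread): cost per feature halves, and the profitably buildable feature set expands 10x, so total engineering spend still grows.

```python
# Toy model of the Jevons-style argument above.
# Illustrative numbers only; the 10x expansion is the
# comment's hypothesized elasticity, not measured data.

cost_per_feature_before = 100.0  # arbitrary units
cost_per_feature_after = 50.0    # better tooling halves cost/feature

features_before = 20
features_after = features_before * 10  # assumed 10x expansion

spend_before = cost_per_feature_before * features_before
spend_after = cost_per_feature_after * features_after

# Cost per feature halved, yet total spend grows 5x:
# the efficiency gain is plowed into more features, not savings.
print(spend_before, spend_after)  # 2000.0 10000.0
```

The point of the sketch is only that per-unit efficiency and total spend can move in opposite directions when demand expands faster than cost falls.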



The Jevons effect is not especially peculiar to software.


I think ultimately, the question is whether this trend will result in "fewer programmers needed", which is the most important by-product of "less code" in the author's thesis.


Did we slash R&D budgets once we standardized on the x86 instruction set and thus needed fewer compiler devs? Did we slash R&D budgets when we moved from on-prem to cloud hosting? We have seen this happen many times before; we know the economics. Decreasing cost per feature is synonymous with increasing productivity, and we know that a given increase in productivity results in a very large increase in the number of features that become feasible.

There isn’t some fixed factor here that causes it all to collapse. Productivity increases are plowed into growing the market 10x and building the business, not into reducing eng budgets. At some point in the future this will slow down, but that is so far from happening — many decades from now, maybe never in any non-theoretical sense.


Yup - this is a much better way of describing the intent of my words. Thanks.



