Reading it is like the first time you run a profiler: you realize that what you thought you knew is partly true, but that other factors you hadn't counted on have an outsized effect you'd misestimated.
The widespread solution in software seems to be to make software engineering more rigorous. While I agree with that to some extent, it still largely avoids the problem. Not only do we know that even the best programmers currently can't consistently write programs free of catastrophic failure (e.g. compromised data), but what we currently demand from software doesn't fit well into the field of engineering. John Carmack hasn't made a leading game in a decade, because technology isn't enough anymore. It's games like Minecraft (creative, with technical problems) and Call of Duty (almost cinematic entertainment) that are pushing the boundaries.
I think to a large extent we missed the opportunity to build a good foundation in computing: the Internet in terms of decentralization and security, software in terms of process and robustness, and probably many more examples I don't know about. Partly because the field (and demand) is moving so fast, but also because of the culture in software. I'm always surprised how conservative (not in the political sense) people in software are, and how we collectively seem to agree, despite slogans like "software eats the world", that it's still a niche thing.
I think rigor is an important part of the "solution" but in a way that's too easy. It's simplistic to just say "do better", which is sort of what a call for rigor is. Ultimately I think it's not rigor but just seriousness which is called for. Right now the industry is downright delusional and avoiding even talking about some of the fundamentals of software engineering as a practice. So many people want to feel like they're on the bleeding edge, rather than struggling through the mire (which is the reality). We have to get to the stage where people are talking about these things first before we get to the stage where we can do anything about them.
Right now the industry is in rough shape. And the industry mostly doesn't realize this because there's still so much money (and money is a salve for many wounds) and because not long ago the industry was in crisis, so we're still seeing improvement from that state. Merely producing anything of value from a software project is still a big deal, and not a sure thing. Moreover, shipping any major product or service without relying on "crunch mode" or a death march is a rarity in the industry these days.
But we can do better. Not just with rigor but with how people use their time and do work. The norm is still too close to everyone running around in firefighting mode all the time, while technical debt blooms and burnout looms. A better mode is to work at a slower pace, get the right work done at a higher quality level, and still maintain good development velocity because you aren't constantly slowed down by your technical debt. There are some small pockets of the industry where that's the norm, but it's still a hard sell. At the end of the day the default is still to fall back on the ridiculous notion that software development is like factory work: that you get more bang for your buck just by adding more bodies and having more butts in seats (or fingers at keyboards) for more hours per week.
My google-fu is clearly failing me.
That said, it seems that his last practical experience in the field was better than two decades ago ( https://en.wikipedia.org/wiki/Capers_Jones ).
A criticism of that sort is available to other folks as well--I feel compelled to ask, "Interesting results, but what have you shipped recently?"
More seriously, I see what you're saying--that said, there is a huuuuge difference between how we develop software now versus even five years ago, much less ten or twenty.
Github. Stack Overflow. Agile methods (ugh). Lean methodologies. Various testing methodologies. Changes in the funding structure of companies. Order of magnitude increase (comfortably) in debugging tools for web and network applications. The entire single-page browser app movement. The entire GPGPU movement. Truly convenient and open collaborative coding platforms.
The field, and the people in it, have changed a great deal, and we've learned a lot. I'm increasingly skeptical of professors of professional software engineering who don't, as a day job, professionally engineer software. This applies to other folks--and I mean this nicely--like Uncle Bob or Martin Fowler as well.
Folks like Carmack who have been pushing the envelope and getting scarred the hard way for 20 years are probably simply better sources of learning.
What I've observed in the industry has been a shocking lack of self-awareness about application of best practices, and decidedly half-assed attempts to improve. The state of the industry is not dev shops on the bleeding edge of the best tools, best methodologies, best policies, etc. It's almost universally a tale of failing to even get to "ok" in terms of well-known best (or even better) practices. There are a ton of dev shops that don't do any code reviews at all, and formal code reviews (which have been shown to be one of the best methods for improving code quality) are almost unheard of. Even at places that take testing seriously or do TDD they still typically don't do it very well.
That's why folks like Uncle Bob and Martin Fowler continue to have so much traction in the industry, because simply following good advice that was old 20 years ago is still a huge step forward for the average, or even above average, dev shop in the industry today.
Saying "we've learned a lot" just tells me that I shouldn't take you seriously, because clearly we haven't. Software development is still a shambles. Security is still a nightmare. Performance is still a nightmare. Quality and robustness are still a nightmare. Work/life balance is still a nightmare.
Books like "The Mythical Man Month" (which is 40 years old!) still hold a ton of lessons that have yet to be taken to heart by the majority of the industry.
Looks like he has shipped 14 books and at least 36 papers on software engineering methodologies.
It's really only relevant when someone is passing down personal wisdom based on their own experiences (like John is).
Capers is conducting and providing research findings and so we should care about his research abilities, and the evidence he presents. His personal software development abilities aren't important.
I believe that because I find there to be a lot of dynamics in software development that are seemingly completely unaccounted for in the literature. I admit that this may be because I'm unfamiliar with the current edge of the research being done, but then again, research does tend to lag rather far behind the state-of-the-art, and our field moves quickly.
Is that really true in terms of organisational structure, communication patterns, team sizes, management structure, or best practices?
Tooling has improved significantly across the board but, for the most part, it has improved and refined existing patterns and processes rather than enabling radically different/novel practices.
The two major differences, IMO, over the last 15 years have been in 1) remote work and 2) the ops space.
On 1), better tooling has made remote work viable in many places where it previously wasn't, and many companies are starting to take advantage of it.
On 2), the real improvement here is the power shift from ops to development. When I first started working in software, I was frequently blocked from experimenting by the ops team. These days I'll fire up a few dynos, try out my idea, and then, if it works, bring the ops team on board. The workflow itself has changed significantly.
First of all, we still haven't settled on a set of common abstractions and tools for the industry--so much reinvention happens that most projects can almost be considered bespoke every time. There is no common and simple way of estimating development time. There is no common and effective way of accrediting engineers. There is no commonly-understood legal liability and responsibility for software engineers. Frankly, any schmuck who decides to smoosh jQuery libraries together can claim to be just as much of an engineer as a high-end PhD or person with decades of experience--and the nature of the work is such that, quite frankly, that schmuck may deliver more value than their more experienced counterparts.
That said, the technology on which we are deploying has changed a great deal in the last forty years. Orders of magnitude increase in both processing speed, secondary storage, and memory have made a lot of formerly unthinkable approaches commonplace. The problems we face in terms of networked systems and security concerns are far different and more ubiquitous than they've ever been.
If the fundamentals of software engineering haven't changed, we shouldn't take ourselves seriously at all.
And sure, technology has changed, some ways of doing things are different, but the core software engineering fundamentals haven't changed that much.
Take a classic book that is definitely showing its age, Code Complete, and look at what that book has to say about what the fundamental nature of software projects is: managing complexity. That's still true and will always be true, that's what software is all about. As I pointed out in another comment, you could go back even further to a book like The Mythical Man Month and find tons of lessons that are still applicable today. Because despite all the technological advancement that's happened, the fundamental practice of software engineering has not advanced much. "Agile" is just a way to make sure that a team is able to do anything at all and actually ship something that works and is not miles away from what the customer wants. Agile isn't an advanced technique, it's a rudimentary technique to rescue potentially failing projects/teams.
And security? Security is more important than ever. But what does that mean? That means that dev shops follow, oh, say 10% of the most important security best practices instead of 1% or 0% as they might have done 20 years ago. And it means you have to protect against a few specific kinds of attack vectors that were previously unknown. But that's not a sea-change in the way things are done, it's just incremental changes to details.
This seems to be very much the direction Rust is going in.
There's nothing quite like looking at some code and saying "who in the hell did this?", then looking at the blame and realizing it was you, years ago. Learning from those moments is super valuable.
We're looking for the right set of constraints on the programming activity based on patterns of common mistakes.
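A minimal sketch (my own, not from the thread) of what such a constraint looks like in practice: Rust's borrow checker rejects, at compile time, a pattern that commonly causes runtime bugs in other languages, and nudges you toward a construction that is safe by design.

```rust
fn main() {
    let mut scores = vec![10, 20, 30];

    // In many languages, pushing to `scores` while iterating over it
    // compiles fine and then misbehaves at runtime. Rust rejects it at
    // compile time: the iterator holds an immutable borrow, so a
    // simultaneous mutable borrow (`push`) is a compile error.
    //
    // for s in &scores {
    //     scores.push(*s * 2); // error[E0502]: cannot borrow `scores`
    //                          // as mutable because it is also
    //                          // borrowed as immutable
    // }

    // The constraint pushes you toward computing the new values first,
    // then mutating, which is correct by construction:
    let doubled: Vec<i32> = scores.iter().map(|s| s * 2).collect();
    scores.extend(doubled);

    assert_eq!(scores, vec![10, 20, 30, 20, 40, 60]);
    println!("{:?}", scores);
}
```

The point isn't this particular example; it's that the language encodes a pattern of common mistakes as a rule the compiler enforces, rather than leaving it to reviewer vigilance.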
Being aware of best practices gives you the ability to move in that direction when you see opportunities. It is a mistake, in my opinion, to try to force the opportunities by dictating best practices (something that is done far too often in software process engineering). The best practice you choose depends a lot on the situation and the people involved.
Having said that, I have all too often been in situations where I'm thinking, "I know how to be successful if we could do X, Y, and Z. I have absolutely no idea how to be successful doing the things we are doing now." It's tempting to try to force people's hands, but I have never seen it work. It's usually much better to search for an A, B, and C that will work instead. Not always easy, but I suppose that's how you demonstrate that you deserve the big bucks ;-)