I disagree with a lot of this, for a few main reasons and some minor ones.
One, application development technology has considerably improved over time - most applications simply do not need to reinvent the wheel! Yes, over-engineering something that will never see more than 1 qps so it can scale infinitely is bad - I’m sure it happens, but I think it’s more of a strawman. If you need a simple CRUD application with a good-enough UI, you have no need to introduce additional complexity (and potential maintenance and reliability issues) with custom tooling.
Two, the software talent market is bifurcated. There is basically commodity development of CRUD apps, and technically complex novel development. If you think there are no rockstars, you might just be in the commodity development scene. There literally are so-called “rockstars” being flown into SF to work on new stuff in the ML sphere, or into NYC/Chicago to work on bleeding-edge performance. Maybe the dissonance here is that the commodity developer market has grown a lot, and that over time some technology shifts from rockstar to commodity as it matures (web applications, for example, were a lot harder to do at scale in 2005 than they are now).
Reverting to pets-not-cattle and statefulness can be appropriate at low scale. But honestly that’s a “choose the right solution to the problem” thing, not a rockstar thing. Following this model as you reach large scale allows for cool production heroism and cowboy coding that makes you feel like a true hacker, but that doesn’t mean your users are getting a more reliable experience, or that your dev time is being spent efficiently.
My minor quibble: I think as you get more experienced, what you used to think of as rockstar development starts to look routine, because you’re better at writing software.
Another minor point: you can’t engender a rockstar culture at a company that hires commodity developers as easily as is asserted here. The big thing not mentioned: PAY. Nobody wants to be paid like a commodity developer and expected to perform like a rockstar. Being a commodity developer is more chill, with less day-to-day risk and stress. Once you get toward the bleeding edge, or start reinventing the wheel, your work becomes riskier and demands more mental effort and attention.
> Two, the software talent market is bifurcated. There is basically commodity development of CRUD apps, and technically complex novel development.
I’ve been making this point for years. I think it’s telling that other nearby disciplines are bifurcated. Electrical engineers vs electricians. Architects vs builders. Etc. The virtues of a good electrician are that they’re reliable, have good customer service, and work efficiently. A good building company can put a building up on time and on budget. But beautiful architectural work doesn’t share those values. The best architecture is bold and creative, and still has people talking years later.
I think this split already exists in programming; we just don’t admit it to ourselves. Inventing React is different work from using it to make a website. Inventing React, or Rust, or Redis requires considered boldness, and it takes time to nurture the insights needed to get it right. In contrast, the virtues of a good web consultancy team look more like the virtues of an electrician: good customer service, clear estimates, work delivered on time and on budget.
But we’re so enamoured with the idea of ourselves as “elite hackers” that clock-in/clock-out programmers can’t give up the mantle. I get it. But it’s stupid. There’s no point evaluating candidates on their ability to reimplement a B-tree from scratch if their job is to style buttons. You’re evaluating people for a different job. Test their software estimation skills. Their social skills. Ask about workflow and productivity.
Essays like this one ask “where did all the hackers go?” They’re still around; they’re just harder to find now. They went into weird crypto projects. llama.cpp. Obsessive database projects like ScyllaDB. They’re inventing Rust, and finding security vulnerabilities in io_uring for that sweet Google bug bounty money. They’re in the demoscene, or making programmatic art for Burning Man.
Do you need real hackers at your company? It depends. Are you making an apartment building (on time, on budget) or the Sydney Opera House - iconic now, but nearly left unfinished because of delays and creative disagreements? They aren’t the same kind of work.
> But we’re so enamoured with the idea of ourselves as “elite hackers” that clock-in/clock-out programmers can’t give up the mantle. I get it. But it’s stupid.
There aren't that many wheels that need inventing, and there weren't that many even back in the so-called rockstar era.
> Two, the software talent market is bifurcated. There is basically commodity development of CRUD apps, and technically complex novel development.
This, I think, is the source of most of the wheel invention. Someone pays a good deal of money for 'talent' they do not actually require, and the two end up working at cross purposes. Assign someone clever to work on a CRUD app, and they're bound to try to reinvent something, if not to keep their resume fresh, then at least to fight the boredom.
But it's not bifurcation, it's trifurcation. The choice isn't build or invent; it's build, buy, or invent. Too many of us are writing applications that were never really needed in the first place. Not for any single reason, but for a whole host of them, from empire building to rent seeking by vendors who could have made a tool that adapted well to customers, but saw much more money in keeping those customers engaged than in letting them operate under their own steam. Open source is driven by a lot of motivations too, but 'your own steam' is a pretty compelling one.