> I was once in a meeting with four people, plus the CTO. The CTO forbade us to take notes: it seems the fad of the week was that note-taking is what makes meetings a waste of time. The meeting took two hours. Afterwards, the four of us had about eight different opinions of what had been decided. No follow-up actions were ever taken.
This is still the case in many companies I meet with (I'm the founder of a small troubleshooting firm; we seek out problematic large, cash-rich companies like these, and they all have failing processes and IT all over the place). Sometimes note-taking is frowned upon, but often simply no one does it. Which shows these meetings (almost all meetings I go into) are a complete charade of managers wanting to show they have 'something important to do, really!' (somehow there are often 10+ people there). I am in meetings (Zoom and in person) where I'm sure no one really heard or understood what anyone else said: different accents of English from different countries and backgrounds, and of course no one can say anything because we all have to be respectful. No notes, no recordings (and of course the captions of the video chat or my phone didn't understand anything that was said either), while someone explains quite difficult, in-depth stuff for 2-3 hours. Afterwards, it all gets rehashed in a 15-minute text chat with the person who explained the difficult stuff. Why didn't they write it down in the first place and forgo the meeting? Because half (that's generous; it's more like 80%) of the room shouldn't have a (high-paying) job; they are just there for being there.
Ah, the enterprise world, such joys. I enjoy it because I treat it like a cosmic joke; it's a comedy show, not unlike The Office, right down to the main characters' high pay.
In almost all the meetings I attend, I'm the note-taker by choice unless I have earmarked someone else to take notes. My notes usually end up being the go-to shared document after the meeting.
I'm still learning, and I love taking pictorial/visual notes. Try it; it is fun. If one is interested, I suggest the books by Dan Roam[1]. They are quick to read/browse, and I keep re-reading them. If you are starting with just one book, start with “Draw to Win”.
> Vetinari was very good at committees, especially when Drumknott took the minutes. What the Iron Maiden was to stupid tyrants, the committee was to Lord Vetinari; it was only slightly more expensive,* far less messy, considerably more efficient and, best of all, you had to force people to climb inside the Iron Maiden.
> * The only real expense was tea and biscuits halfway through, which seldom happened with the Iron Maiden.
What has worked well for me is sharing my screen and capturing the common notes/mural for everyone to see. You might not catch the spelling of a name and have to ask, but that's probably the question everyone else has too.
That way there are fewer misunderstandings, everyone is aligned, and I can send the notes right after the meeting.
Yep, and the kicker is these are the types that put “good communication skills” on their job requisitions.
Upper management can’t communicate anything worthwhile, and mid-levels have varying levels of English from different first languages and never develop good shorthand ways to communicate.
Seeing “good communication” on a job req is a red flag at this point, might as well say can do basic math or use a computer.
> why didn't they write it down in the first place and forgo the meeting?
To play devil's advocate for a sec, I wonder if the above could be used to argue that if you're using writing, why have a meeting at all?
I guess maybe the combination of voice and text might be better than just one of the two. But I'll be honest: I'm not much of a meeting note-taker. It actually feels harder to remember stuff with my face stuffed in a notebook while other people are talking.
> if you're using writing, why have a meeting at all?
Most communication is waste, written or spoken; the writing is not to write down everything discussed, but to summarise and archive it for future reference.
Don't write everything down, just the important parts. And if the conversation continues while you need to focus on writing, ask them to pause.
Useless meetings have always felt like a strong argument for Basic Income to me. These have negative value, in the sense that they destroy time for the competent workers. As a society we have to make sure a bunch of people don’t become unemployed and then homeless, but we should have done it in a way that doesn’t incentivize pretending to have a necessary job.
> Useless meetings have always felt like a strong argument for Basic Income to me. These have negative value, in the sense that they destroy time for the competent workers.
To me, changing company processes to somewhat get a company rid of useless meetings rather sounds like a suitable value proposition that business consultancies could provide to their customers. :-)
That's the bullshit-jobs argument of David Graeber. I know Andrew Yang talked a lot about it as well.
The concept makes sense to a degree, but I don't know how it could be implemented en masse without issues. In sci-fi like Star Trek, they have a post-scarcity society and everyone wants to work towards self-betterment. I'd assume that in modern society everyone would just want to sleep, eat, and watch Netflix, to a degree. I'm being a little over-pessimistic here, of course. Perhaps the "basic" aspect of it, just enough for food, clothing, shelter, and health, helps incentivize people to take risks, like entrepreneurship, that they'd normally be too scared to take?
How did the UBI study in (I think it was Sweden) go?
People eat, sleep and watch Netflix in their spare time because they're dedicating their thoughtful useful hours to work.
A friend of mine is on long-term disability support payments. When he was doing full-time work he was a couch potato in his spare time and I could never get him to join in on hobbies with me. Now that he doesn't work, he's producing music and painting; he picked both up eagerly, out of casual interest and to fill his surplus free time.
I picked that one because it seems “fair” in the sense that they clearly are taking a negative spin on it, and I like basic income. It includes lines like:
> Moreover, even though all job search requirements were waived, participation in reemployment services remained high.
They label this as a failure because it didn't increase participation. But it looks like a fairly good outcome to me.
> Perhaps the "basic" aspect of it, just enough for food, clothing, shelter, and health, helps incentivize people to take risks, like entrepreneurship, that they'd normally be too scared to take?
Exactly.
It is not as expensive as you might think, because it replaces welfare systems. It also avoids welfare disincentive traps such as we get in the UK, where taking on a low-paid job can get you hardly any extra income because the earnings are mostly offset by reductions in benefits.
Very few people want to live on a minimum income so most people will work. People like to eat better than the minimum, go out, have nice stuff, live somewhere a bit nicer....
Also, it should be noted that it just removes a lot of the administrative overhead of other systems, in the sense that we won’t be employing a bunch of people to figure out who deserves money.
Netflix is optimising for our attention as a way to get money; if the content creators are also on UBI and create only for their own artistic vision, we'll probably mostly be a lot less interested in what they make — the alternative is that Hollywood is run by people who add nothing and possibly actively harm their own businesses, for which I will let y'all supply your own punchlines.
After about 30 years of programming and seeing mostly everything getting worse, I figured every company must be making absolute garbage that will eventually get them into issues costing real and serious money. This turned out to be true everywhere, small and big. I started with smaller companies whose few-year-old software is rotting away, leaving them vulnerable and unable to do anything except hire (hard to find) people to replace or update it (mostly the same proposition, because it's all so bad), or rely on services that are broken (unmaintained or with removed features), shut down, or getting too expensive for them. Then we moved to bigger and bigger companies, and it's indeed everywhere. I always wanted and recommended using tried and tested, old and boring software: mature Apache products, Postgres, MySQL, SQLite, Java (not the latest), Common Lisp, HTML/CSS with minimal JS, Qt, etc., but companies far and wide are not listening or doing that; they adopt the latest burning piles of poop, which guarantees our business millions of clients for decades to come.
It’s a simple business model; you must like (love, even) poking around in horrible stuff that might not work after a restart and that no one dares to touch. We offer services that a) secure that software so it can keep running, be restarted and reinstalled, b) make performance improvements, c) do vital fixes or features/integrations, d) fix downtimes in emergencies (we have a 24/7 service for that). We only do things that need to happen now and are thus most urgent; we never offer rewrites, as even for our smaller current clients that would tie the team up for years on one client. Boring and dangerous.
But there is more than enough money in rewrites too. If you are planning one, please use the latest and greatest frameworks, DBs, microservices, PaaS, SaaS, and definitely never any version over 0.9, please: anything that might not survive 3 years without continuous poking, upgrading and fixing! We want to be around to clean it up! /s
Over 40 years of programming here, and I'm back in my prime again.
Started out in BASIC on ZX Spectrum and BBC Micro/Electron.
The toughest programming I did was writing a full texture-mapped, shaded 3D engine in x86 from scratch, aged about 12.
(I pasted my 12-year-old code into this commercial video game, although I think the texture-mapping here I replaced with Direct3D's much slower software renderer: https://www.youtube.com/watch?v=t2kdKB18c7I&t=332s )
The best paid was modifying some existing source for a torrent site that brought in ~$13m.
Funny moment: putting a language I didn't know on my resume, getting hired from it on a Friday and starting on a Monday as lead developer :D (crazy weekend)
My favorite language of all time was VB.NET, but that's not worth pursuing any longer.
Right now I'm mostly writing hardcore ASP.NET in C#, running on Linux.
Over 40 years and I never got sick of it. I am learning new things every day.
I strongly disagree with: "Simple, obvious code is easier to write, easier to get to work .."
It takes real skill, time and effort to write simple code in any production setting. I am not talking about some 100 line algorithm some leet guy once wrote, but code that's been in production for years. It's only the simple code that survives.
> I strongly disagree with: "Simple, obvious code is easier to write, easier to get to work .."
I get your point, as well as the author's point.
Complex code is difficult to write, and unless you really pay attention, likely to be wrong.
Simple code is easier to write, easier to verify and more likely to be correct, but it's very hard to learn how to express your solution to a complex problem in simple terms.
Maybe I should have said, "simple code is hard to write and complex code is easy to get wrong".
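To make the contrast concrete, here's a toy sketch of my own (a hypothetical example, not from the article or anyone's production code): both versions count word frequencies, but the "clever" one crams everything into a single expression, while the simple one reads top to bottom and is easy to step through and test.

```typescript
// Hypothetical illustration: two ways to count word frequencies.

// "Clever" version: one dense expression. Easy to type, harder to read,
// and the per-iteration object spread quietly makes it quadratic.
const countClever = (text: string): Record<string, number> =>
  text
    .toLowerCase()
    .split(/\W+/)
    .filter(Boolean)
    .reduce(
      (acc, word) => ({ ...acc, [word]: (acc[word] ?? 0) + 1 }),
      {} as Record<string, number>,
    );

// Simple version: a few obvious steps, trivial to verify and to debug.
function countSimple(text: string): Map<string, number> {
  const counts = new Map<string, number>();
  for (const word of text.toLowerCase().split(/\W+/)) {
    if (word === "") continue; // split can yield empty strings at the edges
    counts.set(word, (counts.get(word) ?? 0) + 1);
  }
  return counts;
}
```

Both produce the same counts; the point is only that the second is the one you want to find when you open the file two years later.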
As an aside, I'm not going to post an AI generated essay, but pasting that quote into the new Claude gives an essay well worth reading if anyone cares.
Funny thing is, being the oldest programmer, I get given that code. Code that has had bugs in it that destroyed the careers of people given it. I could tell a story about one time I fixed something and they had a company wide party, which I couldn't go to as I had another impossible bug.
Thank you for the brilliant article. It is short and concise and covers a wide range of topics for lifelong programmers. I'm bookmarking it as a reference for when someone asks about this line of thought.
And, I’m happy to know and jealous of, “I have code running on billions of devices, on all continents, on all oceans, in orbit, and on Mars.”
I’m a muggle in the programming world. I wandered around learning the dark spells of management, design, and everything else in between. Looking back on my 20+ years of technical-professional career, I realized that I’m much better off in bursts—5 years of extensive programming in an extremely narrow field of focus, another 5 in code-driven designs, etc.
I hope that one day I can also write so beautifully about my own 40+ years of career, learning, etc.
> Interesting and significant software is beyond the capacity of any one person to build alone in a reasonable time frame.
I disagree with that. VisiCalc, MacPaint, and many modern “significant” apps started out as 1 person projects. I agree collaboration and communication are vital skills but you don’t need to make this grandiose statement to defend that.
I have done some very “significant” stuff, that was a big deal, when I released it, but would be considered “quaint,” nowadays.
That said, my significant work created a baseline that has been extended by a team, so it’s still “significant.”
The app I just released ain’t Facebook, but it’s pretty non-trivial; consisting of three backend servers (that I wrote), and two native frontends (that I also wrote).
Takes a while, though, and there’s a lot of moving parts. I’ve gotten used to working at this scale, but I’ve also been writing shipping software, for over 35 years.
This may have been true back in the era of those applications (when I was also a developer), but speaking from 40+ years of experience myself, I agree: most significant projects need more than one person. There may be outliers, but back then one-person projects were the norm; now they're not.
Also consider games, back then one or two guys could do a game, now all but the outliers are done by large teams.
>Also consider games, back then one or two guys could do a game, now all but the outliers are done by large teams.
If anything the opposite is true. Gamedev tooling is now so good and high-level that one or two people can make sophisticated games, in terms of audio, gameplay and visuals, that an entire team could not have made in the 80s or 90s.
Just one example: Signalis (https://rose-engine.org/signalis/). One developer and one artist, if I'm not mistaken; a stunning game that in, say, 2000 would have been a release from a major publisher.
There's a narrative around game budgets and timelines that seems to assume that they're increasing due to it being inherently more expensive to create them. The reality is that the technology has massively raised the potential output of a small team & that increasing budgets are an economic decision - selling a small game that takes players' attention away from your GaaS omnigame is inefficient. And you need to take it away from Fortnite while also competing with excellent games that are free.
It's not an absolute statement, you'd have to have a childish interpretation of the article to have that takeaway. Not every generalization needs a "well actually".
Feel free to come back to us when you have an actual example. The days of one-person languages might be over, but we haven't seen any language created using an LLM yet. Much less one people complain about.
I'm writing a language by myself so for me at least the days of one person languages are not over. And I have the joy of working with a language no one complains about ;)
I have been programming for over 35 years, 23 years of it professionally, since I was about 15 years old. I've started with ZX-81, Commodore 64, Amiga 500, IBM PC, ...
I don't like programming and "software development" anymore. People are tiring, jobs are tiring, development processes are tiring, complexities of modern systems are tiring. How many goddamn front- and back-end "frameworks" does one need to learn and use to put together a goddamn website? It is all so boring and soul-destroying.
I find it a bit more enjoyable these days to program (numerical, control) algorithms and firmware (for MCUs), getting myself closer back again to my second life-long interest, which is electronics. At least, at the intersection of hardware and software, there is less hyped-up stuff. And "scary" hardware keeps pseudo-programmers and internet charlatans away. To some extent...
My programming journey spans 25 years, around 18 of them professionally. Not as long as your good self, but I share your frustrations!
I started my love for coding as a young teenager. I never had the opportunity to program on an Amiga or Commodore, but I'm very thankful to have had a 486 PC in the early-to-mid '90s, leading up to "Intel Inside" PCs a few years later.
I avoid working with a large team; I like no more than 4 people including myself. Projects can be broken down into (what I call) modules. I don't bother with sprint meetings or the like, just a quick morning catch-up with coffee!
I started a new job a few years ago, which also meant getting back into web development. How much things have changed in this domain! What was originally server-side processing (of HTML) 20 years ago has slowly turned into the job of the frontend, so now we have Node, React, etc. The deployment tools were also using Node, which made it a pain to upgrade!
We have an opportunity to rewrite some of our websites due to changing internal systems. My team has learning allocated for htmx. This will make our lives sooo much easier!
I will hit 20 years of professional programming experience in a few months.
My q to everyone who has kept at it: did you continue up the career ladder into staff eng-type roles? They’re put on the IC ladder but get further away from programming. I’m still conflicted about that. I can see myself trying that for a few years then just dropping back to senior level.
I did realize recently that it is hard to give up programming when my whole professional career has been built around being seen as the person who programs things well.
I'm not a bad communicator and have entrepreneurial blood so I explored various leadership opportunities over the years but it wasn't really a fit for me. I did well in the various roles but didn't enjoy it.
To point straight at the childhood desires behind it all, I like receiving puzzles to work on and I like wowing people by anticipating their needs and exceeding their expectations.
One can frame management/executive-track work that way too, but the third desire is to do that in the peace and quiet of my own space and (mostly) on my own time -- so for me it all pointed back to increasingly sophisticated remote contract work and using my entrepreneurial blood and communication skills to facilitate it.
I don't think anybody can predict what direction you might want to go, but I do suggest you think about what you're really after in your career and which of your opportunities are more or less aligned with that.
I'm at around 25 years and have decided to stick with the middle of the IC ladder. Moving past that means no more working directly with the technology, only touching it indirectly. Plus these are "leadership" roles, which pretty much just means moving into the dysfunctional, hierarchical structure most companies seem to default to. I prefer peers, not subordinates.
To put it another way, my favorite thing about programming is the creative aspect. Getting a problem, thinking of a solution and then writing the code that solves it is more rewarding than anything the higher positions have to offer.
I have a similar question. It seems to me that the software engineering salary distribution is bimodal. I'm simplifying a bit here, but basically into those that make <100k USD or >200k USD a year. What makes the difference?
Is it management vs codemonkey? FAANG vs "small" <10k employee corporate?
I like engineering, but my impression is that management often lies to engineers saying "oh, 80k USD is a high salary", when it's clearly not. The divide seems to be between those that know, and those that don't.
FAANG got into a bidding war when money was cheap and stocks were moving fast, and turned their SWE jobs into upper-middle-class vehicles like medicine, law, finance, etc.
Cargo-cult startups followed along out of naivete, and businesses that had to compete for the same pool of talent followed along out of necessity.
But those compensation packages are disconnected from the value generated by the engineers and eclipse it in almost every business and division, so it's not sustainable. The big layoffs are the first part of the correction, and a quieter adjustment will happen as non-cash compensation (options, stock grants, etc.) calibrates itself to today's more grounded economy.
The managers saying "80k is a high salary" to you are give-or-take correct (assuming you're in a mid/low cost of living area). Software engineering remains a hot market with lots of money flying into it and will pay better than other engineering fields for a while longer, but ultimately pegs into the same "professional middle class" bucket as the rest.
If you want to be filthy rich, software engineering is only the right choice when you have an opportunity to ride one of the boom waves and FIRE yourself. You probably won't have that opportunity again for years if you aren't on today's cresting wave already. That's the second hill in your bimodal distribution and it's a dwindling aberration.
But if you just like writing software and being financially secure, the first hill is pretty great.
This is way off. The median US SWE makes ~$120k according to BLS (Govt data). Median. High comp is not at all disconnected from value. Revenue per head for engineers can be extremely high.
> But those compensation packages are disconnected from the value generated by the engineers and eclipse it in almost every business and division, so it's not sustainable.
That's incorrect. Statistics show that those who capture the largest share of the economic value they generate are the higher-paid tiers of engineers in Silicon Valley, and even they get at most 10% of the actual economic value they generate.
This means that not only are those engineers already being paid WAY less than the actual economic value they generate, but the majority of the population also receives only a pittance from the economic value they generate.
You're getting to the real root of the question here: what is most important to you? Is it the code, or the money?
Management earns more money. Partly because they control how the money is spent. Partly because they're a value-multiplier.
Coders get to write code.
You seldom get both [1].
So for each person, it's really important to understand your root motivation. If it's money, then climb the ladder. It's fun in a different way, and if done well, can bring satisfaction.
However, if writing code is what gets you out of bed, if meetings are something you detest, then perhaps "enough money" is enough, and you might prefer enjoying your path through life.
[1] you seldom get both in a large company. However in very small companies, and one-man operations, you can often find a very lucrative niche. Fewer people means little management (although more time talking to customers) so you get to spend more time actually producing.
> my impression is that management often lies to engineers saying "oh, 80k USD is a high salary"
Cost of living always anchors these discussions. You can't compare FAANG salaries in very high cost of living areas against somewhere like the midwest.
This is also a question of leverage: they're able to find developers by paying 80k, so they continue doing that. If you had something they needed but wasn't as readily available, then you could start discussing what it looked like to be paid more.
I've been programming for over 45 years, since I was 14. I consciously made the choice, back around 2005, to NOT move into task management, but to remain a working software engineer. Writing software is what I love; had I become a manager of programmers, I'd have been miserable. (I tried it for six months, actually; I was miserable.) If you find the jobs farther up the ladder challenging and exciting, go for it. But I've found I was able to build a career on being the "person who programs things well".
I'm glad that's a viable path (at least in your experience)! I really hope it'll be in mine. Management seems awful. :p
"Hey, you already make great money doing what you love and are good at. The next step is to make more money while having a much higher workload full of stuff you hate and are bad at."
...Like, what?
It makes sense if you're a janitor or a paper pusher or something. Now you manage janitors without getting dirty; it's mostly upsides. But with our bizarre, quasi-"artist" attachment to the work, it's an almost tragic trajectory.
"quasi-artist": That's how I've always felt about it. My job title says, "engineer", but I think of myself as more the software equivalent of a master cabinet maker, an artisan of fine heirloom software. ("Heirloom" is no joke. I'm currently replacing software I wrote over 25 years ago, that's been in use ever since.)
Been slinging code since 1999 (practicum/internship). Started writing InstallShield scripts and JUnit tests (JUnit 1.0). The next 10 years were Java / C++ / Visual Basic. Was always trying to avoid the JavaScript revolution ("I'm a real programmer," I said, lol). Fast forward to today and I sling TypeScript / Angular / Java as a Senior Software Engineer. I had the chance to move up about 2 years ago and did not even apply (you need to apply for promotions at the company I am at). Everyone asked why, but I see what happens to my coworkers once they do: still expected to support every app they have ever worked on and do 8+ hours of meetings a day. The pay is more but so is the stress. I have a great family and just want to enjoy my life. I also work a second job facilitating online courses for a major university when I want to pick up a course and make my fun money (plus I get to learn more than any of the students in the courses). I still love coding, be it evergreen or legacy. I am not ruling out moving up the ladder, but for now I'm still happy slinging code... I guess if you have 20 years in, I only have slightly more, but man, I see the burnout above me (talking to people who were once joyful and fun to be around, now snippy and stressed) and I just want to enjoy life.
> Was always trying to avoid the JavaScript revolution
I studied compilers during this time. Dynamic typing + production systems are a waste of life. :)
> Fast forward to today and I sling TypeScript / Angular / Java as a Senior Software Engineer
Yeah, TS is good enough for me now. I swapped Angular for Svelte in there, but otherwise the same. Enjoying learning the frontend now that it is somewhat sane.
> [They're] still expected to support every app they have ever worked on and do 8+ hours of meetings a day. The pay is more but so is the stress. I have a great family and just want to enjoy my life.
> I see the burnout above me
Yeah. Leadership imposes a large emotional cost. And I prefer my job stay in a box for the most part because I have lots of other things in my life.
Do you want to wield "code and tech" to create, or "wield people" to create for you? If you can motivate people to execute on your behalf, you can go into sales. I find that less aggravating than managing development projects.
Over 40 years here too, from BASIC & assembly on a Speccy 48K, 68000 on the Amiga, through to my latest just-for-fun project of Zig SoC programming on the A64 PinePhone.
Thanks for making me feel old today! But also lucky to have had such a hobby turn into a career. Although I am sure it has rewired my brain in the process, and not all for the best: depression, and frustration at how software now mostly controls lives rather than enhances / supports them. I feel like I got a glimpse behind the curtain, and this problem-solving brain now sees a broken system destroying the planet and humanity for the last extra cent of profit, with no solution. EOL
Over 30 years of professional experience. Currently at a FAANG. Just today I fixed a bug that the entire team couldn't figure out, including our team lead, who is brilliant and 15 years my junior. I'm not a better programmer than he is, but I pull my own weight. When I fixed the bug, I had a rush the entire day. 30+ years and I still love programming, and I feel blessed and grateful to have fallen into this career. It felt like divine, Godly intervention put me on the path that led me here.
There are some things on which I disagree with the author, most especially things like this:
> If you don't agree with this, I have no hope for you.
> When in doubt, choose different. If you exclude people based on them being unlike you, you will likely be choosing poorly.
He says he believes in diversity but contradicts himself a few sentences previous. I find this typical with a lot of people who claim to believe in diversity: they don’t actually believe in true diversity, they believe in people agreeing with their pre-existing beliefs. And if someone doesn’t agree, there’s no hope for them at least according to the author. That’s not someone who believes in diversity.
And I actually believe the greatest software projects have a single strong voice with a strong vision and strong competency that drives the entire project. Look at Steve Jobs with Apple and the iPhone, Linus and Linux, Elon Musk with Tesla, Zuckerberg and Facebook. You don’t find a lot of collaboration, what you see is a brilliant visionary with strong opinions and not much diversity in opinion. Too many cooks spoil the broth, as they say.
If you want a fun environment, then sure you can collaborate and give equal time to others but you won’t go as fast and usually the end results aren’t as dramatic.
You took his first statement out of context, which makes it sound worse than it is. Hell, I thought the author was an a-hole because I read your comment prior to reading the article.
He says human rights are important and treating other people well is the right thing to do and that's non-negotiable. Which is a fair statement. What decent human being would disagree with that?
He's right about having a diverse set of eyes and opinions on a team being a positive thing. But most people see the word "diversity" and think about the forced diversity quota of 'we need 1 black person, 1 gay person, 2 women, one Latino and one Indian' kind of diversity, which, speaking as someone from a minority, I too am against.
> Human rights are fundamentally important. Treating other people well is the right thing to do. All of this is of paramount priority, whatever you do. If you don't agree with this, I have no hope for you.
He actually says that treating people well is the morally right thing to do and the highest priority. That's different from human rights, which are the minimum standard for how people should be treated. So I don't think the "if you don't agree" relates to the idea of human rights, but rather to the idea that you should treat everyone as well as you can and consider that your highest priority. And if you disagree, then he has no hope for you.
And this is the point. What if you believe that you should treat people "okay", not as well as you can, and not as your highest priority? What if you instead believe that you should give people the freedom to do what they want, whatever the outcome, and not interfere one way or another? Basically, not treat anyone any particular way and just leave them alone?
That would produce unfettered success but also unfettered failure. What if you’re okay with some people utterly failing as long as some people have unparalleled success?
What if you’re okay with the idea of treating those who work hard very well, but ignore and don’t help anyone who is lazy or doesn’t work hard? Does that go counter to his idea that people should be treated well?
I don’t subscribe to these arguments, but my point is that just because people don't believe in treating everyone equally doesn't mean that they are evil; they just have different priorities. A CEO like Steve Jobs would fire unproductive employees and reward productive employees; does that run counter to his argument of treating people well as the highest priority? And does he accept this diversity of thought, or does he reject it?
In my home country, we have a saying: "No religion or politics in the bar." People who drink alcohol best avoid these as you end up in a fight sooner or later.
We also do this at work. I am here to do a job. Just keep it focused on the professional side. I don't need to be your best friend. Or share the same religion or political beliefs. At work, I come together to reach a common goal.
But that said, I hate people enforcing diversity for diversity's sake. They are actually introducing politics on the work floor.
> He says he believes in diversity but contradicts himself a few sentences previous. I find this typical with a lot of people who claim to believe in diversity: they don’t actually believe in true diversity, they believe in people agreeing with their pre-existing beliefs. And if someone doesn’t agree, there’s no hope for them at least according to the author. That’s not someone who believes in diversity.
Yeah, I got weirded out by the person saying the creator covenant is something good.
Diversity of thought is what should be striven for (to a certain extent) rather than the much more common case of striving for diversity of superficial attributes.
Over the rest of the article though, the writer is actually talking about the former case.
An autocratic government where there is little diversity of opinion is the same: it can get things done more quickly than a democracy. Not that it's a good thing, but velocity of execution is generally higher.
Over 40 myself, and having a similar experience (I just started with an Amstrad CPC6128 instead of a Luxor ABC-802, and it came with a few games, but after that you had to write your own in BASIC). I like the part about the past/present/future "me". I think it's something that happens on smaller timescales as well.
Especially when starting a new project, or writing a new feature, start by being lazy and sloppy: just make the thing work. Once you have it working, refine it so the code is clear and maintainable; this is the "superb work" you'll be doing, inspired by "future me"'s view of perfection.
However, instead of this being a linear progression with a clear start and end, this should be more of a cycle; don't get stuck trying to achieve impossible perfection, instead don't be afraid to break stuff now and then to make actual progress.
I too got my start in 1984, when Santa brought me a Commodore 64. I'm not good at remembering things like anniversaries, so I'm glad not only to read this essay, but to be reminded of the occasion.
Over 50 years for me. Started with a CARDIAC [0]. My first job after graduating university was FORTRAN IV on a PDP-11. I've done some pretty cool stuff over the decades, much of which has disappeared as the market or technology has moved on.
I know the answer is mostly "it depends", but I wonder how that much experience affects job prospects, and what kinds of unique skills and advantages it gives. Software development has changed a lot, and there are surely people who have been doing the same thing with no improvement, but 40 years is a lot of time to gain very specific knowledge, and experience gives you skills that can't simply be taught. At the least, it means you're reliable.
I don’t have 40 years, but I’m a member of the multiple decades club.
Job prospects vary depending on who is hiring, but people develop a wisdom when doing anything for a very long time that’s mostly based on everything you’ve seen (good and bad) in that time. The greatest value I bring to being the old person in the room is knowing what not to do and to be able to explain clearly why.
I’m not meaning the above to sound negative, it’s just that a single good idea rarely makes a business, but a bad one can break it with ease (for example, people wanting to jump on web framework fads that no-one in the company understands well enough to implement securely).
Getting hired tends to be something you worry about more in your 20s than in your 50s.
Speaking generally of course. It's not much fun being fired in your 50s. Although by that point you probably know enough about something that consulting (ie short gigs) are both available and lucrative.
But generally, by the time you're in your 50's you've acquired enough expertise that you add more value than you consume. Which means you're unlikely to be fired.
That experience, and ability to understand both the upside, and downside, can be really valuable.
Certainly good communication skills help, and the better you are at getting information out of your brain, and into everyone else's, the better you'll be for it.
Yeah, this is a complex question. For background, I have 40 years of programming experience, about 20 or 25 professionally, depending on how you count. My first system was a Commodore Vic-20. I've done assembly and COBOL and mainframes and I've been doing web stuff since it was called "dhtml".
If you zoom out far enough, programming has changed very little in the last 40 years, and a valuable programmer 40 years ago could be just as valuable today. Distill programming down into its basic essences, and you get abstractions described in a way that machines can implement. I have occasionally helped introduce non-programmers to programming, and abstract thinking is the biggest and most difficult hurdle by far; the rest is just nuts and bolts, implementation details.
Of course, as you zoom in to tricky patterns in object-oriented programming, or lambdas, or the sharp edges of CAP in distributed systems, and so on, things get a bit more nuanced. A lot of that simply wasn't a part of typical software development 40 years ago. I've met a lot of older developers that found a niche and stayed in it and didn't participate in the broader field.
I just recently worked with one such developer on a project that the average HN reader would recognize. He was brilliant; I'm a trial-and-error developer, a little bit sloppy, but I rely on tooling and following best practices to tune up my code. Usually, that's good enough. His code was horrifically hard to follow; he dogmatically avoided any of the common habits that had developed over the last 20 years, so his code was built out of piles of functions that were hundreds of lines long and had all kinds of tricky branching logic and would return arrays of values and so on. But, he could stare at it for a while and model the entire damn thing in his head. We went together like oil and water. He is someone I think of as a good cautionary example of what happens when you decide all the new stuff isn't worth your time.
Still, though: he really was brilliant, and could be a valuable member of lots of different development teams.
So, conversely, there's this widespread assumption that if you've been slinging code for more than 10 years and you haven't moved into management or don't have some really big-wow project on your CV, then you've done something wrong, you've stopped learning, and nobody should hire you. Every single place I interview with acts like they have some kind of special, one-of-a-kind tech infrastructure, and if I'm not already an expert in that, then I can't provide any value. Most of the people I interview with have never ventured outside of web development, so it's cruelly ironic that they are the stick by which my ability to adapt is measured.
I have a hunch that there's a subconscious incentive for hiring managers and lead developers to keep stitching together ever more esoteric tech ecosystems because it limits competition. If you build infrastructure out of all the latest stuff you can find, you not only develop your own expertise at it (resume-driven development), but you also reduce the number of viable candidates that the company might be able to hire to work with you.
All in all, I still enjoy most of software development, I still do it as a hobby in some of my free time, and I'm more productive and skilled than I ever was, but I've also never been as disillusioned with the state of the industry, and I'm no-joke considering pivoting into becoming a personal trainer.
The biggest change to programming over the last 40 years is most software running 24/7 with frequent updates, thus requiring round-the-clock support and pager duty. It really changes the way you think about a lot of things.
With software pressed onto physical media, you couldn’t provide instant support even if you wanted to.
Many years of programming here as well, either as a hobby, while studying or for work.
My main driving motivator after 32 years in the industry is that I still love to code and do problem solving. I have been fortunate to have been working on interesting and challenging tasks with very nice people throughout my career.
And I still enjoy learning new stuff, so never stop learning.
A short chronicle of my linguistic adventures in the computing domain during this 40+ year-long programming journey, although some are just vague memories at this point, like "fingerprints on an abandoned handrail" (Bob Mortimer):
1. Basic
2. Assembly (6502 and 68000) C64 and Amiga Demo Scene
I will let you in on a little secret. Very few devs, full stack or otherwise, remember everything on any non-trivial system. What the experience gives you is the ability to read and comprehend code that is new to you very quickly. And “new to you” code includes code you wrote a year ago and haven’t touched since.
> And “new to you” code includes code you wrote a year ago and haven’t touched since.
True, dat. I write code and documentation for future Me[0].
I can’t remember lots of stuff, and Google the most basic stuff, every day.
But my stuff ships, works well, and tends to last. I think actual results can count. That seems to be a minority opinion, these days, but what do I know? I have to Google basic stuff, so I suppose that means I don’t know what I’m doing.
There’s a difference between full stack and solo developers. There’s absolute overlap, but being aware of the stack and actually being the person who put it all together are different.
All my solo projects I was intimately familiar with. Start adding people and you naturally start losing complete grasp of the project.
In the current zeitgeist it's a sensitive topic, but quite explicitly the article said not to hire people just because they think the same. The author warns against getting stuck in a local maximum. I don't think it has anything to do with current day diversity quotas.