geetee's comments | Hacker News

> But the code is the easy part. Solving the right problem is the hard part.

Repeating this banality does not make it true. Plenty of tech companies over the past 30 years or so solved the same problems and still lost out to competitors because they had worse programmers.

I actually agree that the code is one of the most important things to get right at a software company. Still, I would argue very few companies win on code merit alone. Strategy, customer communication, and market timing on the business side; design, system architecture, and dev velocity on the technical side. So many factors matter beyond the quality of the code.

> Repeating this banality does not make it true.

If anything matches the definition of banality in this discussion, it's the puerile assertion that writing code is software development.

It isn't.

Even at FAANGs, the first thing they tell new joiners and entry-level hiring prospects is that writing code amounts to nearly 50% of your total workload.

And now, all of a sudden, we're expected to believe that optimizing that 50% solves the 100%?


Now we are shifting the goalposts. Who even claimed AI solves 100%? I'd be damned if AI could solve even 50%, and that alone would be huge. Personally, I don't think current AI even solves the 50%.

> Now we are shifting the goalposts. Who even claimed AI solves 100%?

I think you lost track of the discussion. I pointed out that, in the absolute best-case scenario, LLMs only address tasks that represent a fraction of a software engineer's work.

Then, once you realize that, you will understand that optimizing away the time spent on a fraction of the work only buys you a modest improvement in total performance. It can speed up a task, but it does not and cannot possibly eliminate the whole job.

To see what I mean, see Amdahl's law.

https://en.wikipedia.org/wiki/Amdahl%27s_law

Again, only a fraction of the work in a regular software engineering role involves writing code. Some high-profile companies claim their entry-level positions spend at best 50% of their time writing code. If LLMs could magically get rid of that 50%, the total speedup in delivery is at best 2x.
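
As a quick sanity check on that arithmetic (my own sketch, not something from the article): Amdahl's law puts the overall speedup at 1 / ((1 - p) + p/s), where p is the fraction of the work that gets faster and s is how much faster it gets. With p = 0.5, even an effectively infinite s caps you at 2x:

    # Amdahl's law: overall speedup when a fraction p of the work
    # is accelerated by a factor s.
    def amdahl_speedup(p: float, s: float) -> float:
        return 1.0 / ((1.0 - p) + p / s)

    # If coding is half the job and LLMs made it a million times faster,
    # delivery as a whole still only gets ~2x faster.
    print(amdahl_speedup(0.5, 1_000_000))  # ~1.999998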

You can look at that and think to yourself, "Hey, that's a lot." But that is not what's being discussed here. I mean, read the blog post you are commenting on. What's being discussed is that LLMs reduce the time spent on a fraction of software development tasks, but work on other software engineering activities increases, as it's no longer blocked by this bottleneck.

As others have written, so-called AI doesn't reduce work: it intensifies it.

https://hbr.org/2026/02/ai-doesnt-reduce-work-it-intensifies...

Also, why do you think the phenomenon of AI-induced burnout, dubbed AI fatigue, is emerging? Processes are shifting, but the work is still there.


> the total speedup in delivery is at best 2x

Which is just huge, if we can actually get a 2x speedup.


I miss human writing. I miss the different voices.

The difference is that a real engineer will say, "Hey, I need more information to give you decent output." And when the AI does do that, congrats: the time you spend identifying and explaining the complexity _is_ the hard, time-consuming work. The code is trivial once you figure out the rest. The time savings are fake.

That real engineer knows what decent looks like. This parrot knows only its own best current attempt.

Engineers who have the audacity to think they can context-switch between a dozen different lines of work deserve every ounce of burnout they feel. You're the tech equivalent of wanting to be a Kardashian, and you're complicit in the damage being caused to society. No, this isn't hyperbole.

Who knew managing a team of ten occasionally brilliant but generally unreliable engineers would be so draining.

I think you mean _micro_managing.

Ugh, yes. Normally, you can pair someone up with a stronger engineer and watch as they learn and grow through their mistakes, while the stronger engineer keeps what they produce on the proverbial straight and narrow through code reviews, documents, etc.

But now, I can't trust any of the models to be that reliable. I can't delegate that responsibility. And since context and prompting are such fickle things, I can't really trust any of them to learn from their mistakes, either.


All this disruption to every facet of society. For what? So you can roleplay as the next billionaire startup founder with your weekend project? All while the actual tech billionaires have a giant dick measuring contest.

I appreciate the author making that the first sentence.

It's not that our identity is grounded in being competent; it's that we're tired of cleaning up messes left by people taking shortcuts.

It's that, but it's also that the incentives are misaligned.

How many supposed "10x" coders actually produced unreadable code that no one else could maintain? And yet the effort to produce that code is lauded, while the nightmare of maintaining it is somehow regarded as unimpressive, despite being massively more difficult.

I worry that we're creating a world where it is becoming easy, even trivial, to be that dysfunctional "10x" coder, and dramatically harder to be the competent maintainer. And the existence of AI tools will reinforce that culture gap rather than reduce it.


It's a societal problem; we are just seeing the effects in computing now. People have given up; everything is too much; the sociopaths won; they can do what they want with my body, mind, and soul. Give me convenience or give me death.

Writing the code is where I discover the complexity I missed while planning. I don't truly understand my creation until I've gone through a few iterations of this. Maybe I'm just bad at planning.

My tinfoil-hat guess is that archive.today is compromised by a state actor. Simply shutting it down would cause too much drama. Instead, turn it into a villain, and then take it down.

