Labor GPT does is being seriously under-appreciated by programmers because it has automated them away
4 points by johnwheeler 6 months ago | 32 comments
Yes, it's a somewhat provocative title, but I've been programming professionally now for 24 years, and I really think there's a quirk of human nature that's preventing the bulk of us from seeing something basic.

So if I'm wrong, please help educate me:

The bulk of what programmers do is search on Google as an entry-point into a string of S.O. posts, Github issues, and tech documents.

The programmer's job is then to synthesize that information into a solution.

The better the programmer is at searching, and the more experience they have integrating solutions, the better they are at programming.

Edit: The sentence above is important. I'm not saying programmers are worthless! But for some reason (probably cognitive biases) some commenters aren't seeing this.

If anything, I'm saying programmers are more valuable than ever.

--

That's the way I see it.

Another way of saying it is that all we programmers _used_ to do was copy and paste from Stack Overflow, and while that's extremely simplistic and untrue, it's enough to capture the essence of my argument.

And now, ChatGPT does all that in seconds. That's the automation. Not the coding.

If you're going to comment, please drop an upvote to help get the conversation rolling. Thanks




Maybe it’s just me, but the things I’ve tried to get ChatGPT (and other related tools) to do have fallen over really fast once you get past the initial toy examples. We’re talking complete inability to resolve introduced errors, grossly insecure APIs, misinterpreted requirements, and straight-up hallucinated libraries where it couldn’t figure out how to implement something itself.

For light research and rapid prototyping it’s just another tool that augments a human, and I think the fundamental underlying technology will not be able to fully resolve those issues until new architectures come out.


All those things you say are true:

1. The toy examples are toys

2. Bad code

3. Hallucinations

Yes!

My point is people are so caught up in that narrative (because it's true and immediately apparent) that they're not seeing what has really been automated away.

Not all programmers are the same. Some don't do shit. There's the 80/20 rule. But good programmers know how to leverage their extensive experience in conjunction with search to integrate solutions.

GPT hasn't made the work any less tedious or easier as a whole. In fact, you can do more of it if you're really productive.

People on an assembly line are still working their asses off. That's more of what I'm saying. What should I be saying?


This happened with sysadmins a long time ago, when Google improved its search capabilities.

At some point just before the 2000s, there was LOTS of system administration stuff on the Internet, literally hundreds of thousands of howtos and documentation for the most obscure tasks related to system administration, just lying there, hidden by the infrequent updating of the web indexes used until that point (Yahoo mainly).

Then Google happened.

Back in the beginning of the 2000s, there was a glass ceiling of senior-sysadmin knowledge; above it, you could only find solutions by having one of these Sr. guys in-house and/or through consulting.

Then Google happened, and suddenly beginners like me (lots and lots of juniors) popped up out of the blue, coming right from having deployed some OS, configured an Ethernet interface, and little else, and started to work deploying serious datacenter hardware everywhere.

Initially, there was a huge gap between the already-senior guys, almost at god level in their play, and the other guys, just rapidly typing search queries into Google, reaching maybe 20-30% of the advanced solutions previously only provided by seasoned senior sysadmins, but most remarkably, maybe 50-70% of the day-to-day usual work of SSr. and Sr. sysadmins.

From there, we went straight to the sky in terms of skillset availability: out of nothing, there were thousands of capable sysadmins, at several levels of expertise, where you used to have just a few dozen guys.


I remember when you didn't do shit without the prior permission of a sysadmin-- when the sysadmin was king. Had their own coffee cups telling people to "Back off" and the whole nine yards.

And the really good sysadmins became cloudops and devops, etc. I don't know anything about it to talk intelligently really.

But this is my point -- the knowledge labor gets easier. And with GPT it has, in a major way, because of the way it integrates information. Never mind that it can't code. That's not the point -- that's what people are getting caught up on.

Maybe it's that you just have to have a lot of experience to see it? That can't be it.


Seriously, I think this is way off-base; the job of a software engineer (not a "programmer") is to distill problems that are both technical (bits) AND psychological (Bobs and Barbaras) into a reasonable, reproducible solution that can be subsequently improved upon and maintained. Most days, it's the latter that's the real work.

ChatGPT is, on one level, a fancy form of Google search: you still need to know how to leverage the prompts in order to make decent progress, and it's based on knowledge (correct OR incorrect) that was originally generated by a human, one of whom must still evaluate as to whether or not the outputs are relevant to the task at hand.

There's a whole branch of philosophy dedicated to this, by the way, that deals with the knowledge lifecycle: how concepts come to be, how they are reinforced socially, and then discarded when no longer relevant. The same lifecycle applies to the data pools used for training "old" models.


Thank you for responding.

"Software Engineer", "Computer Programmer" Again - I've been programming for 24 years.

I prefer calling myself a computer programmer, and I don't make a distinction personally. I think we're all just engineers or whatever.

> ChatGPT is, on one level, a fancy form of Google search: you still need to know how to leverage it in order to make progress, and it's based on knowledge (correct OR incorrect) that was originally generated by a human, one of whom must still evaluate as to whether the outputs are (still, in the case of historical knowledge) suitable for the task at hand, or not.

This is 100% true. And NOTHING in my original post says anything otherwise. I'm just saying it in a way that pisses people off, and I'm not meaning to.


    And now, ChatGPT does all that in seconds
I've seen ChatGPT generate a five-line Elixir function that had three hallucinated stdlib functions, several hallucinated options to the ones that existed, and several MORE hallucinated explanations for what those calls were supposed to do.

IMO that's objectively worse than "copy-pasting from Stack Overflow" because sites like SO have at least some chance of obvious nonsense being moderated / corrected.
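
(To make that concrete: before trusting generated code, you can at least mechanically check that the names it references actually exist. A minimal sketch in Python -- the module and the hallucinated-sounding name here are just illustrative, not the Elixir code I actually saw:)

    import importlib

    def missing_callables(module_name, function_names):
        # Return the names in function_names that module_name doesn't provide.
        module = importlib.import_module(module_name)
        return [name for name in function_names
                if not callable(getattr(module, name, None))]

    # json.loads is real; "safe_parse" is a plausible-sounding invention.
    print(missing_callables("json", ["loads", "safe_parse"]))  # ['safe_parse']

It says nothing about semantics, but it does catch the "three hallucinated stdlib functions" class of error instantly.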


It's clear to me that I'm not communicating my point correctly. Either that, or it's such an emotionally charged topic that programmers aren't reading what I've written -- they're just responding.


Also, the valuation of the knowledge asset decreases when those who pay for it start thinking "this must be easy to do, because the AI solves the problem in a matter of seconds." It doesn't matter how good the artist thinks his painting is if others think that anyone can do it better, cheaper, and faster with the help of Artificial Intelligence. In the end he will have to find a profession that will feed him.


I don't think GPT4 did. It's really good at helping programmers. But left alone to complete a more complicated task, it's going to fail spectacularly.

However, I do think that another generation of models such as GPT5 + more mature programming agents will truly begin the transition of fully replacing entry level programmers. If not GPT5, then GPT6. It's inevitable.


Hallucination is a hard problem to solve, and likely impossible with the current paradigms and models. Making a bigger GPT is not expected to change this. More complex systems (CoT, RAG, agents) help, but they still have issues and take longer to respond. There is another dimension of complexity and optimization that gets introduced when you adopt these systems.
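
(For anyone unfamiliar: RAG means retrieving relevant documents first and having the model answer from those instead of from memory alone. A toy sketch in Python -- the word-overlap scoring and the canned documents are stand-ins for a real embedding-based search:)

    def retrieve(query, documents, k=2):
        # Rank documents by naive word overlap with the query;
        # production systems use embeddings, not this.
        words = set(query.lower().split())
        return sorted(documents,
                      key=lambda d: len(words & set(d.lower().split())),
                      reverse=True)[:k]

    docs = ["split and trim functions live in the Elixir String module",
            "Enum.map applies a function to each element of a list",
            "GPT-4 was released in March 2023"]
    context = "\n".join(retrieve("split a string in Elixir", docs))
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: ..."

The model can still hallucinate around the retrieved context, which is why this helps but doesn't solve the problem.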

There was a post on HN yesterday about research into how the brain works differently when handling human vs. programming languages. What we have today is good at human language. We probably need to rethink and find new models for programming/engineering tasks.


OK, so I'm not saying it correctly. Thank you.

Here's what I need to convey:

People want it to be an agent, but it's not an agent, and if you expect it to be one you get a false negative. The risk is you then assume it's immature or not yet up to the task.

But the bulk of what employers pay programmers for -- searching Google and integrating solutions -- was automated away the second GPT 4 came out. You can leverage that to think like an employer and get these glorious chunks of labor that used to cost thousands of dollars and take days.

Do you see what I'm saying or am I still saying something wrong?


But your title said "has automated them away". Do you think GPT4 can truly replace everything that programmers do? Taking vague instructions and turning them into working solutions?

I don't think so. GPT4 is really far away from that. You can look at Devin (powered by GPT4) and see that it can only resolve 13.86% of issues on real-world projects without human intervention. That's a tiny number. I'd bet that it performs even worse on internal projects due to more complexity in business requirements and worse-written specs (open source projects write better specs because they have to).


OK -- you and this other kind soul in the thread at 5 AM PST (my time) are very much helping me understand how much I suck at this, because you're saying my point exactly.

People are looking at Devin and saying: "Gah, parlor trick! GPT hasn't replaced the programmer!"

But what they're missing is that the bulk of what an experienced programmer does is search Google for Stack Overflow posts and use their extensive expertise to assemble solutions.

I am absolutely not saying that GPT has replaced the programmer--I'm saying people are too caught up in that to see what's really happened:

My job, as a 24-year programmer, has been largely automated. It hasn't been replaced, but the automation frees me to do other activities, or to do more of the old activity at a new scale.

What do you think now?


The bulk of what an experienced programmer does is communicate with business leaders on project goals, align teams, clarify specs, suggest specs, test technologies that can accomplish said goals, create a plan to adopt that technology, improve the efficiency of the team, and fix really hard bugs that just broke production while customers are angry and your boss wants to kill you.

I don't do that much Googling. It's a small part of my job.


> I don't do that much Googling.

If you're a software engineer and you're saying that, you're full of shit.


Why would I be full of shit? I go into the office for 8 hours. How many of those 8 hours do you think I spend on Google as a senior programmer?

Probably 30 mins on average at most is my guess.


I concur, but would say the amount of search depends on the current task. Much of it is getting to the long-form content that LLMs would summarize, but you actually need to read the original source to get a complete understanding or find the quirks.

I'm currently working on something that requires above-average searching, but it is definitely something that an LLM fails hard at; it requires poring through copious amounts of content looking for the needle in the haystack. The current LLMs are OK at average or common tasks, but are miserable at anything uncommon, a product of their architecture and training.

(currently trying to figure out how to get a CentOS 7 -> Rocky 8 in-place upgrade with LUKS working, which is not supported in leapp...)


I think the wrong bit is "bulk" in:

"But the bulk of what employers pay programmers for: searching google and integrating solutions"


Thank you, I'll work on this.


"But much of what..."

Generic enough for everyone to interpret as they want, say from ten to ninety percent.


ChatGPT Answers Programming Questions Incorrectly 52% of the Time: Study

https://mail.google.com/mail/mu/mp/110/#cv/All%20Mail/18fb8d...


We, as software developers, are so afraid and blind about how necessary we are in this world that the punch in our face is going to hurt a lot. It's a matter of time before the industry replaces us all -- not today, not tomorrow, but the industry is pushing hard to remove the human factor from the software development equation. I'm not saying this is good or bad, I'm just saying the egocentrism that we have, as a group, scares the s..t out of me. We're not that necessary, guys; we have to start facing this reality sooner rather than later.


Instead of coders, we'd just have prompt engineers. It'd make building software easier and faster for the masses. But at the end of the day, you still have to tell the computer "if you see this, then do this, else, do that" whether it's in English, Chinese, French, JavaScript, Python, or C++.
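
(In other words, the precision requirement doesn't disappear with the language; it just moves. A toy illustration -- the same spec, once as an English prompt and once as Python:)

    # Prompt: "If the order total is over 100, apply a 10% discount;
    # otherwise charge full price."
    def total_due(order_total):
        if order_total > 100:
            return order_total * 0.9
        return order_total

Either way, somebody has to decide whether "over 100" includes 100 exactly -- and that somebody is still doing the programmer's job.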


"A very comprehensive and precise spec": https://www.commitstrip.com/en/2016/08/25/a-very-comprehensi...?

Switch in "prompt" in stead of "code", if you will; still pretty much same thing.


This is half my point. The part about lying to ourselves I agree with.

I don't think that AI is at the point yet where it has made us redundant. Just the way we used to work.


Agree. My point is: I (and I'm just talking about me) started working in this industry because it let me create new things based on problems that required a great level of creativity and introspection. Maybe because I'm a problem solver, who knows. But with this pivot in the industry I don't feel engaged with the new AI paradigm; asking a machine for answers holds no interest for me (no matter what we want to call this new thing...)


> But with this pivot in the industry I don't feel engaged with the new AI paradigm

So I'm a 45-year-old dev trying to start my own company. GPT has helped me build my product and has freed up time for me to learn about marketing. I've been watching these guys on YouTube, Kipp and Kienen.

One of the things I'm learning is how important it is to get your message right. My message is that people should stop expecting GPT to be an agent, since it's not there yet, and should instead use it like an employee: it carries out the knowledge work and leaves it to the human to untangle the ambiguity, much the way a good team lead would.

My thesis is that these chunks of labor that GPT can do are being seriously underappreciated by current programmers because they're worried it can't code.


Excellent (46 here), and I see it the same way. Software is not only code, but also the representation and automation of repeating tasks. For these tasks, AI will be king soon (if not now).



Congrats John, I really liked the idea of video-introducing your SaaS. In your mind, what's your ideal customer? How do these customers find your value proposition?


In fact, managers will soon come up with the idea of removing development cells, because the one-man band is a reality (one developer covering all the functions of the team, helped by an AI agent). This will shock us all much, much more than the coding capacities of one LLM or another.



