> I personally know of a few large companies laying off devs over this.
They’re laying people off and replacing them with ChatGPT-generated code? That seems... aggressive. Or are they laying off devs who copy-pasted GPT-generated code?
My company recently hired someone who I'm absolutely convinced can't code and produces all their code by copy-pasting into/from ChatGPT. I absolutely think they should be fired; it's not even aggressive, it's just common sense. First, it means they cheated on their coding interview. Second, it means their code is consistently a pile of shit.
I think it's more natural than you might think. For example, my company laid off a lot of people to try to become profitable, and now they pay me more but I have a smaller team with tighter deadlines. I have no choice but to use GPT for a lot of my analysis, design, and code, which I've gotten pretty used to over the past year in my hobby time.
The way I see it, if you code without it, you won't be able to compete on speed and value.
I’m not knocking the parent post here for not replying. But of the 4 or so times on HN that I’ve seen someone asked to provide detail, I haven’t seen a single answer. I’m not on here a ton, but do people tend not to check back?
I’ll admit my bias of having seen enough vacuous industry hype over the years to be naturally skeptical. Heck, I worked in a marketing department once where I helped manufacture the stuff (forgive me father for I have sinned, in my defense they did fire me after three months in a layoff). But my few personal experiences with ChatGPT were pretty disappointing and I’m actually looking for someone to tell me otherwise.
Yes, I think since it requires you to look up your own comments to see if you got any replies, it's quite common not to get any. I'm very guilty of this myself.
My personal use of GPT-4 (also daily) is: correcting and rephrasing the spelling in my brain dumps; making Python plots (stylize, convert, add subplots, labels, handle indexing when things get inverted); writing short shell scripts (generate 2FA codes, log in to a VPN through the console using 2FA, a script for disabling the keyboard, etc.); and helping debug my code ("my situation is this, here's some code, what do you suggest?").
The last part is an interesting one for people with attention deficit disorders, like myself, where procrastination can be conquered when there's an assistant that keeps you on track.
> The last part is an interesting one for people with attention deficit disorders, like myself, where procrastination can be conquered when there's an assistant that keeps you on track.
Aha, I’ll def have to give it a whirl. My procrastination ability is world class.
You can’t replace devs with LLMs, because someone who knows what they are doing still needs to put it all together.
You can only make employees more productive. This in turn could, in theory, lessen the need for developers in the long run, but it assumes the company won't use the extra bandwidth for other projects.
This is exactly what would happen if ChatGPT was actually a productivity boost for senior devs. I don't know why some idiots on here keep insisting businesses want to get rid of people when that's not how the game works at all. Extra work capacity will always be used. Regardless of what payroll costs, what's always more important is the ROI.
I also believe the gain is in productivity more than in needing fewer people. They will fire as many people as possible, but the largest gains seem to me to be in productivity.
And in exactly the same way, some future brain-implant tech will add another layer of pressure. People will get it because it'll give them an edge on certain fronts.