Ask HN: SWEs – Are y'all worried about your jobs?
25 points by neom on March 13, 2024 | 47 comments
I saw that Devin[1] thing yesterday and read through the comments. It was the first time I felt like I'd seen SWEs switch from "this is going to augment us" to "ruh oh".

Are you worried about your work prospects over the next 5 years?

If so, why and what are you going to do about it?

If not, why, and what should others do with their worry?

[1]https://news.ycombinator.com/item?id=39679787




No. I was a JavaScript programmer for 15 years. I should have left that line of work more than 10 years ago, but I was really good at it. The problem is that the industry baseline for competence is exceptionally low, so the longer I remained doing that work the more incompatible I became. I was laid off last year and desperate for employment so I could pay my bills, but I was also patient.

I am in a different line of work now. In this new line of work the business challenges are greater but the technical challenges are less challenging. I am essentially starting over so at the moment the technical challenges are still plenty challenging for me.

I still enjoy writing JavaScript for personal applications, but I will never go back to that line of work. It's an industry where the least competent dominate the hiring requirements because nobody wants to invest in formal training, so everything is a race to the bottom. AI can replace 90% of those overpaid, unqualified people, and the world will be better off, enjoying faster and more secure products at far lower costs.


> I am in a different line of work now. In this new line of work the business challenges are greater but the technical challenges are less challenging. I am essentially starting over so at the moment the technical challenges are still plenty challenging for me.

What are you doing now exactly?


Enterprise API management.


Your last paragraph is scary :)


It shouldn’t be. If 90% of your work force is ready for replacement by AI, given the immature state of AI, something has fundamentally failed.

Perhaps the greatest failure is in perceptions of originality. A transportation engineer primarily works through communicating original ideas in the forms of drafts, papers, models, and so forth. Contrast that to a mechanic that changes oil, rotates tires, and flushes your transmission. Mechanics do work that is not original.

Most software developers want to call themselves engineers, and yet their first fear is originality. You will hear that expressed as "reinventing the wheel". These are mechanics, not engineers. Sure, there are senior mechanics who can repair transmissions and rebuild engines, but that still does not rise to the level of engineering. Software does not formalize what defines a senior or an engineer. The result is a lot of unqualified people wearing lofty labels: Dunning-Kruger in action. Ultimately, what matters is what you produce, but most developers trust nothing that wasn't produced by strangers.


Not at all. I work in a field that can't use publicly (or in many cases commercially) available solutions. We have studied the use of AI using private models based on the publicly available ones for some isolated use cases. The outcome was that AI is nowhere near mature enough to write anything besides basic solutions.

If AI does penetrate and become effective in my area, it's another tool. Just like you write less and less code as you achieve higher roles, you'll spend more time using AI rather than writing "raw" code.

One way or another, AI will become part of our jobs. To what level and when will greatly vary from industry to industry.


Worried, but in a measured way.

Craftsmanship is still important and will remain important. Being able to work with systems as they exist is also incredibly important. There will be a lot of apps created and they will need to be maintained, or made to scale, and that will require more programmers as the solutions start to strain - AI code may end up the new "legacy PHP" code.

My prediction (worth almost nothing): A lot of folks will lose their jobs, and average pay will go down, sometimes by a very high amount for developers that aren't at staff level - the "able to buy a house where there are jobs" rung on the economic ladder will keep rising and income inequality will keep increasing until something major is done about it. Business owners will be able to do more with less, and the job of programming will get more difficult as more productivity will be expected.

What I'm doing about it: Let the future come and react to what happens, not what I think will happen. Keep living my life and find things that are important to me outside of work. Remind myself our time is always limited, and make the most of relationships I have and seek contentment rather than happiness.


I was absolutely blown away when I saw the demos.

Then about 15 minutes later, after some reflection, I thought about what I just saw:

- An LLM generating code

- An LLM agent taking a request, breaking it down into tasks, and then creating a plan

- An LLM using tools (shell, compiler, web browser)
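
Stripped down, those three pieces are basically one small loop. A rough, runnable sketch of the pattern (purely illustrative; `call_llm` is a made-up stub standing in for whatever model API a real agent would call, not anything Devin-specific):

    import subprocess

    def call_llm(messages):
        # Stub standing in for a real model call: first ask for a shell
        # command, then declare the task done. A real agent would send
        # `messages` to a model and parse its reply.
        if any(m["role"] == "tool" for m in messages):
            return {"final": "Task complete."}
        return {"tool": "shell", "args": {"cmd": "echo hello from the agent"}}

    def run_shell(cmd):
        # One "tool": run a command and capture its output for the model.
        out = subprocess.run(cmd, shell=True, capture_output=True, text=True)
        return out.stdout + out.stderr

    TOOLS = {"shell": lambda args: run_shell(args["cmd"])}

    def agent(task, max_steps=10):
        # Plan/act loop: ask the model what to do, run the tool, feed the
        # result back, repeat until it says it's finished.
        messages = [{"role": "user", "content": task}]
        for _ in range(max_steps):
            decision = call_llm(messages)
            if "final" in decision:
                return decision["final"]
            result = TOOLS[decision["tool"]](decision["args"])
            messages.append({"role": "tool", "content": result})
        return "Gave up after max_steps."

    print(agent("Benchmark Llama2 on these 3 cloud providers"))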

Wait a minute, I've seen all these things before. Many times. So why was I so impressed?

I thought about it. Well, the UI was slick. But really, the wow factor was due to how much it was able to accomplish with a single prompt like "Benchmark Llama2 on these 3 cloud providers". Then I started wondering whether it actually did all those things from that one prompt, or whether there was an entire day's worth of prompting, interspersed with manual steps, spliced together into a 2-minute video. Like that staged Gemini demo video from a few months ago. Hmmm.


Your description reminds me of DHH's original "how to build a blog in 15 min" video to demonstrate Rails. Tons of time went into understanding the framework, the design patterns in use, and OO and web code in general. If you sat a non-technical person down and asked them to generate the same blog in 15 minutes, it just wouldn't happen. As a Java developer, seeing a web app go from zero to working in 15 minutes was amazing.

So I think the same thing is in effect with any AI/LLM that is outputting code. There is a ton of LLM knowledge, plus trial and error crafting prompts and clarifying things, that goes into getting any output that will work.

People keep focusing on the text output of these tools, but the majority of time spent in software development is understanding business problems, working with other humans on those, and defining a solution that solves the business problem. Coding and testing that solution are a minority share of the total time from problem/idea inception to working software in production.


Why would we be worried at all?

One year after ChatGPT became a thing, VC money is still being poured into the supposed revolution that so far has not materialized. And I am not saying one year is a long enough time to judge, no; I am saying that the financial incentives are severely tilted against making "AI" itself productive. The VC-backed AI startups have to expand, grow, and hype up their product, and not worry about accuracy and reproducibility very much.

They are sabotaging their own area. The incentives are extremely wrong. I and 99.9999% of all programmers everywhere don't have to lift a finger; the area itself will self-destruct after a while.

And as other posters pointed out -- if an "AI" can displace all programmers out of their jobs, I'd rather worry about that when it happens (I have my doubts whether it will happen in this century, but that's a separate topic).

Our current society's / civilization's incentives are severely fucked as well. What even is "value" these days? We need to get permits to interact with nature, whereas being able to destroy said nature is rewarded with money.

Worrying about losing one's job is of course rooted in common sense, but only in the general sense; worrying about the nebulous "AI" that is always juuuuust around the corner is rooted in fantasies. Just one example: for all the money invested in "AI", nobody has figured out how to plug in CI/CD or just general verifier scripts so the LLM can self-improve on whether it spits out the right code when prompted (and I am not saying you can capture _all_ human prompts, of course; I am saying that a good chunk of the smaller algorithms can be detected and tested before the LLM gives you your output -- and even that is not being done).
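
To make that verifier point concrete, the missing piece is something as dumb as the loop below (a minimal sketch; `ask_llm` is a stand-in stubbed with canned attempts so it runs without any model, and the tests are just a throwaway script):

    import subprocess
    import sys
    import tempfile
    import textwrap
    from pathlib import Path

    # Canned "model outputs": a deliberately wrong first attempt, then a fix.
    ATTEMPTS = [
        "def add(a, b):\n    return a - b\n",
        "def add(a, b):\n    return a + b\n",
    ]

    def ask_llm(prompt, attempt):
        # Stand-in for a real model call.
        return ATTEMPTS[min(attempt, len(ATTEMPTS) - 1)]

    TESTS = textwrap.dedent("""
        from solution import add
        assert add(2, 2) == 4
        assert add(-1, 1) == 0
        print("all tests passed")
    """)

    def generate_verified(prompt, max_attempts=3):
        for attempt in range(max_attempts):
            code = ask_llm(prompt, attempt)
            with tempfile.TemporaryDirectory() as d:
                Path(d, "solution.py").write_text(code)
                Path(d, "check.py").write_text(TESTS)
                result = subprocess.run([sys.executable, "check.py"], cwd=d,
                                        capture_output=True, text=True)
            if result.returncode == 0:
                return code  # the verifier passed; only now show the user
            # A real system would feed result.stderr back into the next prompt.
        raise RuntimeError("no passing code after max_attempts")

    print(generate_verified("write an add(a, b) function"))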

Food for thought.


I wouldn't say "not at all", because I think the industry is going to change at least a little bit, and with change comes instability.

I think that my problem-solving ability will be valuable in the future. Maybe, if most of software engineering becomes fully automated, I could build some valuable product more easily, just thinking, writing, and using feedback loops for the continuous iteration of the product. That would be awesome; I like programming, but I love problem solving.

If the above is not possible, I've some investments and savings that could help me take some time to adapt to the new environment and find something where I could provide some value.

And if the above is not possible there are some possibilities:

- we live in leisure while the machines do the work for us -> great!

- we don't exist anymore, the machines won -> bummer, we had a nice ride.

- I'm no longer useful to the new environment -> I prefer not to think about this :P


The question remains: what is "value" (i.e., money) that we can use to exchange for other things that bring us "value" (i.e., money)? Preparing water, catching fish, and making fire can bring "value" in the form of nutritional energy and bodily satisfaction.

However, our Western governments require us to have permits to catch fish or make fire. Yet employers do not want us entry-level workers at their companies. In California, they are moving the goalposts on the homeless (not a few of whom are older folks, some of whom simply ran out of luck).

So tell me, is this “consumer economy” still a thing?


We're one year into the hysteria of AI replacing engineers and I have yet to see a company that actually did this sort of thing.


I’ve found reality often moves much slower than optimistic projections from someone trying to sell something. The reality of the solution is often far more complicated than the sales pitch as well.

15-20 years ago everyone was worried about getting outsourced, and all the software engineering jobs in the US would be gone. Here we still are.


I don’t think you’ll find many people on HN that are too concerned. On the other hand, I truly pity aspiring devs in school now. The drawbridges on the profession seem like they are getting raised and I don’t know how an intern or junior can add enough value to cut their teeth in the face of things like Devin.


If you work at a company where software is a cost center, then I think you should be. Those companies will try to reduce software costs as much as they can. Non-technical executives who don't understand software will be particularly prone to shoving as much AI in as possible. I think this will lead to the same outcome as the first wave of outsourcing - garbage software that opens the door for technical startups to disrupt incumbents.

If you work at a company that sells software products, there is still a lot of value in craftsmanship and experience. AI will be an accelerator that lets high-performing engineers do their job more effectively.


Software as we know it, at least enterprise CRUD and the like, will become cheap to make with fewer programmers. The AIs are going to help millions of organizations that don't know what they're doing create lots of complex, bloated, seemingly correct applications. The world will drown in complexity and data fiefdoms and duplication and special cases and bureaucratic rules. The qualified people who can cut down complexity are too few, and will be easily overwhelmed. The world will need ever more CS majors to deal with the complexity; but their work will be joyless, more like PMs negotiating with one another what orders to give the AI code monkeys. Then the AI PMs enter the picture. They'll need to be trusted with more and more business context, more and more of the hidden institutional knowledge, to understand management's intent. Until, eventually, they themselves become the managers, with broad mandates like 'make profits' or 'provide quality education' or 'improve public health.' That's when they either take over and keep us as pets or turn us into paperclips or something.


I'm not. I'm no longer a SWE, but I still write software to support my business. I just can't picture AI solving even the basic hurdles of my work without significant work on my part.

Right now I'm crawling apartment listings across multiple websites. It's tedious work, but it's not challenging. AI can't even figure _that_ out. I can't imagine it sitting in meetings and making sense of complex real world problems.
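
For flavor, each per-site scraper is mostly a variation on something like this (a sketch only; the URL and CSS selectors are invented, and every real site needs its own quirks handled):

    import requests
    from bs4 import BeautifulSoup

    def fetch_listings(url="https://example.com/apartments"):
        # Grab one hypothetical listings page and pull out the basics.
        html = requests.get(url, timeout=10).text
        soup = BeautifulSoup(html, "html.parser")
        listings = []
        for card in soup.select("div.listing"):  # invented selector
            listings.append({
                "title": card.select_one("h2").get_text(strip=True),
                "price": card.select_one(".price").get_text(strip=True),
                "link": card.select_one("a")["href"],
            })
        return listings

    for item in fetch_listings():
        print(item)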


As a junior having only started two years ago, I am worried. I am very behind on what I need to know to be valuable beyond creating CRUD apps.

It feels like I need to either have an idea and build it myself or change careers. Perhaps this is overly pessimistic, but the rate at which everything is changing is alarming.


It's not too late to consider a trade or nursing.


Public service is awful. I was a LEO/paramedic before switching to intelligence/SDE work. I was paid less than a quarter of what I make now.


I already have a trade; I was a furniture maker for 10+ years before changing careers to a more financially stable option...round and round I go.


We’re at the “self driving is next year” stage of unfounded hysteria when it comes to AI being a viable replacement for developers. On the surface it may seem around the corner but it’s actually very far away.


Not at all. Those tools barely help if you need to deal with encrypted data or any kind of complex domain, and they can't handle distributed systems tradeoffs very well either.

I believe it can probably do CRUD well, but before AI there were already no-code tools that could handle such cases. I personally don't see any big win.

Even GitHub Copilot is kinda useless; it offers too many bad suggestions before showing the right one. Sometimes I just disable it.

It can be useful when learning a new prog language though. I actually wish that those AI tools would be better...


No, I'm excited. Better tools mean I can achieve more.


I think that's hedonistic fantasy thinking.

You think "achieving more" is good. To me, npm's left-pad incident, that every new fridge and TV is connected to the Internet, the Great Video Game Crash of '83, modern cinema, and AI art you see everywhere are examples of how "more" is not "better" -- it's very much "worse". A new world of meaningless, mass-produced mediocrity doesn't excite me.


The world has been awash with meaningless, mass-produced mediocrity for the last half a century. There's already more noise than anyone can handle. If you haven't adapted to that by now, the noise from AI won't make any difference since you're already hopelessly overwhelmed.


> If you haven't adapted to that by now, the noise from AI won't make any difference since you're already hopelessly overwhelmed.

That one hit hard.


Not particularly. Reading through https://www.reddit.com/r/cscareerquestions/comments/1bd12gc/..., this looks like good old-fashioned vaporware.


First they came for the weavers, and I did not speak out — because I was not a weaver.

Then they came for the artisans, and I did not speak out — because I was not an artisan.

Then they came for me, and I started screeching and coping — because now I might have to get a heckin' blue collar job!


My GF earns more money selling her hand made weaves than the average senior SWE at google.

When did they come for the weavers?


Assuming a senior SWE there probably makes around $500K, I guess your GF would need to sell one woven item for $2,000 every weekday for the whole year (roughly 260 weekdays × $2,000 ≈ $520K) to come close to that? That seems like an amazing amount of work; is she a spider, perhaps? ;)


No, these are not able to reason. LLMs are not the answer to AGI.


I’m pretty sure GPT4 would wipe the floor with you in any debate (including this one).


That’s because most debates are not won based on reason. In fact, a large part of how the world works is based on how you say something, not what you say. Perhaps engineering and mathematics (and subjects that strongly depend on them) are the only fields where reasoning is a stronger lever on outcomes (outside of perhaps economic incentives).


Please give me an example where GPT4 reasons worse than an average adult college educated human. Any topic, STEM or otherwise.


LLMs don't necessarily need to think to be useful. Having a simple minimal working example from an API you're trying to grok is helpful, as long as it's error-free and not redundant.

However, given the availability of LLMs, how would you justify the expense of hiring junior developers fresh out of school?

Check Microsoft.com's or Google.com's career pages as an example (you will also see something similar going on within mid-sized companies). Do you find any jobs for entry-level applicants? Are those entry-level jobs offered mostly in India or Eastern Europe?

So tell me, how is the ability to think helpful for an entry-level applicant when HR departments won't hire you because you lack “2+ years of non-internship C++ experience” for an entry-level position?

My point: being able to think is of course fine, but HR departments require you to do back and front flips, speak perfect reverse Mandarin Chinese and have 2+ years of non-internship experience in C++ on top of that. Being able to think won't justify the expense of hiring a junior dev fresh out of school in a high-income country. (At least this is my impression of the current job market.)

PS: So, I just finished school last year, and I'm still on the hunt for a job. It seems like a lot of companies in wealthy countries are holding off on hiring newbies like me. I get why they're doing it (e.g., inflation, interest rates), but it's really frustrating when you're the one affected.

I mean, how are we supposed to get experience if no one's willing to give us a shot, right? It's a tough spot to be in. Then there are the LLMs: even if they cannot think, they still bring value and accelerate development (with a potentially lower headcount). Seniors will be fine without us juniors, given the prevalence of LLMs and the high-inflation environment (which makes companies risk-averse and very picky).


Not really? Someone could always get a fast but not very good developer for very cheap to 'replace' me. Like, I'm not paid well because of the 80% of the work I do. I'm paid because 2% of the time I do work that no one else can do very well.


It is hard even for a PM/PO to define the functions/workflow of a project/app by writing the requirements.


Just gotta keep building things and trying to get that payday, man. There's no other way.


I’m honestly not worried at all. I just see it as another tool to take care of mundane work, if it’s even that helpful. You still need context and domain knowledge to build features / fix bugs.


> SWEs – Are y'all worried about your jobs?

no


I visit job search sites and select the “entry level” filter. Then I read requirements such as “2+ years (non-internship) experience in C++”, “5 years of professional experience in the game industry”, and “excellent C++ skills”. Furthermore, there are requirements like “fluent in speaking reverse Mandarin Chinese” and “able to perform perfect back and front flips on command”. The latter two are, of course, exaggerations, but I couldn't help joking about them to alleviate the dire circumstances that not only I find myself in, but also others in my circles.

I know individuals who attended top computer science universities and earned master's degrees, yet they still struggle to find a job. The missing element in such conversations is that recent graduates often lack the expertise that HR departments demand. Companies, or their HR departments, have become extremely selective and require multiple interview rounds before considering employment. Alternatively, companies hire student workers because they are cheaper, or they outsource hiring to places like India, Eastern Europe, or Turkey, pausing hiring for entry-level applicants in high-income countries.

However, some governments claim a “shortage of skilled labor especially in the IT sector,” while individuals with computer science degrees find themselves unemployed in these “uncertain times”. Experienced individuals may manage to stay afloat, but those without experience, such as recent graduates like myself, are deemed less valuable. This is partly due to the prevalence of Large Language Models (LLMs), which I refer to as “Google search on steroids”.

Have you heard of “bullshit jobs”? If so, I suspect that many positions are actually insignificant. As a student worker, I co-developed an audio editing application (C++17, Qt 5). To be honest, there was nothing in it that hadn't already been solved. Would you argue that a “level meter”, “equalizer”, or “JSON parser” are things that need to be reinvented despite the availability of MIT-licensed libraries?

Rather, these jobs appear to be a form of “collective busywork”. Nevertheless, the fortunate few engaged in such “busywork” earn significant sums despite not contributing much. What kind of economy is this, where one can thrive without generating actual value (e.g., innovation, non-copy paste work)?

A “consumer-oriented economy”, huh? We need consumers, yet we cannot drive up consumption, because we collectively play a game of hot potato until someone solves all the world's woes for us.


Bruh, you are me. You are literally me. You speak the same things I speak. You wrote the same things I think. The idea that only a few write something novel and everyone else rides waves on top of it.

I am really happy that someone else also came to the same conclusion. It means it's not just us! There will be many more like us who truly want to build new things.

How should we do that? Like you, I also find that anything I want to build already exists as MIT-licensed software, and it works. Then I wonder: what do the engineers do?

Oh yes, the new graduates like us are the bottom feeders that are valueless to the companies. They demand 3+ years of experience for entry level. It's "entry level".

To them we are just expenses that should be cut. I am blessed to have learned this really early on. Now I can quit even before starting. Kinda happy about it, NGL. I am thinking of building things for myself, and when these companies come after me, I will charge them millions. Let's see how long they can play this game.

They can't speak the local language like I do, since I come from a fairly backward country. That is a huge advantage.


Love your username.


Nope.

It's nice at implementing simple stuff at the level of a first-semester CS student, after somebody over-specified it, but that's it.

Code monkeys might want to start looking for a new job in 10-20 years, but you asked about SWEs, and only 1-10% of our job is actually typing code.


Come on now... Devin is make-believe vaporware. It's too bad they didn't use Devin to build a working website ;)

But yes, AI will probably get to that level. Still, someone has got to oversee and prompt these AI agents.



