Hacker News
How to tell if AI threatens YOUR job (testdouble.com)
37 points by tortilla on March 14, 2023 | 44 comments



I can’t tell if a lot of programmers fear losing their jobs because, deep down, they don’t have much confidence in their skills and value, or if they actually want to get replaced by a supercharged autocomplete.

Just a year ago HN was full of existential dread over no code/low code tools. How many jobs did those cost?

Going farther back in history it was outsourcing that would put programmers on the street.

I can remember way back when we got told that software building blocks would just snap together like Legos, making software development child’s play.

Why do we tech professionals keep predicting some new thing (or old thing getting a second wind) will make us obsolete? It’s like a mass case of imposter syndrome.

You can see in the headlines what actually threatens developer careers: bad management and greed. Look at the approximately 150,000 people laid off in tech since last year and tell me how many lost their jobs because of a chatbot. Or no-code tools. Zero.

Read David Graeber’s book Bullshit Jobs. If you recognize what you do in his descriptions, you have reason to worry.


> Just a year ago HN was full of existential dread over no code/low code tools. How many jobs did those cost?

> Going farther back in history it was outsourcing that would put programmers on the street.

These two trends actually created a lot of jobs.

No-code tools are great at solving the first 90% of a problem. Often, the remaining 10% requires code to express properly. But with these low-code/no-code tools, you're no longer writing code against a well-known framework or runtime; you're writing against a completely proprietary API/plugin interface that's barely documented and can break at any time. As an added bonus, if the no-code/low-code startup goes under, you're left with a worthless plugin and no way to even export or run your no-code workflow!

As for outsourcing, I know people who rewrite offshored code for a living; it's the majority of their contracts. They built this niche after being systematically undercut by foreign body shops: the companies that initially went with offshored programmers contacted them to see if they could take over the project once they realized it was never going to work.


I agree with everything you wrote. I carved out a good freelancing business fixing abandoned and almost-finished code. Some from outsourced developers, some from users trying to use low-code solutions.


“It occurred to me that if I could invent a machine — a gun — which could by its rapidity of fire, enable one man to do as much battle duty as a hundred, that it would, to a large extent supersede the necessity of large armies, and consequently, exposure to battle and disease [would] be greatly diminished.”

— Doctor Richard Jordan Gatling


Yet armies did not get smaller. WWI, some fifty years after the Gatling gun, saw more soldiers die from infection and disease than from gunfire.

Technology has made militaries more efficient and multiplied the number of bullshit jobs in the sector. Projecting forward we can maybe expect to all work bullshit jobs, rather than facing unemployment.


Damn what an extremely stupid quote lmao, not your use of it but the original quote itself


The idea that programmers will be replaced is absolutely ridiculous. Being from the business side, I've never worked on a software project that didn't have unforeseen elements, hyper-specific requirements, unnecessarily complicated legacy code to deal with etc. The idea that I would have to deal with a chatbot to sort it out would make me blow my brains out. I don't need some macro understanding of all available code synthesized in a bot, I need micro understanding of our specific needs, environment and the office politics of deadlines and demands. It's great if a chatbot can support the programmer but no manager will stake his success on a chatbot just getting it right.


As a programmer, my job is not to write code, but to think. To find solutions to problems. As long as AI can't think, I am fine.


I’ve actually found ChatGPT to be really helpful with the “thinking”. Maybe it’s just talking through the problem that helps, but it gives me a few good pointers each time I’ve had a chat with it about an issue.


As long as you get to define what it is to think, right?


Here's the conclusion written by GPT-4:

In conclusion, the rise of generative AI is a transformative trend that will have a significant impact on the information economy and the job market. While certain types of jobs might be at greater risk than others, there is still room for adaptation and growth for individuals who proactively prepare for this shift. By focusing on tasks that are novel, unpredictable, and fragile, fostering high-trust relationships and interdisciplinary collaboration, and embracing AI tools to enhance productivity, it is possible to navigate these changes and secure a meaningful career in the evolving landscape. The key is to take action now and invest in the skills and strategies that will make you indispensable in a world where AI continues to advance.


I got this with GPT-3 "write a conclusion to this article please":

To conclude, the emergence of generative AI tools like ChatGPT has brought both excitement and apprehension to the professional world. However, it is important to recognize that while some jobs may be threatened by these technologies, others will remain secure. Rather than fearing AI, we should embrace it and use it to enhance our productivity and performance. By focusing on high-value tasks that require human expertise, fostering collaboration and relationships, and embracing continuous learning, we can thrive in the age of AI. Early adoption of AI tools can also help us lead our organizations towards a successful future. Therefore, the key to success in the age of AI is adaptability, continuous learning, and a willingness to evolve alongside these groundbreaking technologies.


Reading these is kind of a mind fuck for me, knowing that it's generated text. However, the texts are not incredibly insightful and remind me of high school prose, which consisted of a lot of BS and "diarrhea of the pen."


I get VP communication diarrhea of the pen :D

In our hearts, we are all high school children.


One of our friends, a graphic designer, mentioned that he thought he was going to be out of a job in the next couple of years. My husband made the point that, given whatever AI tools exist in the next couple of years, he was pretty sure he still wouldn't be able to do as good a job with them as our friend.

My sister is a translator, so she has been at the coalface of AI taking over her job. There is a lot of (less well-paid!) work now checking AI translations instead of translating from scratch. She's decided not to do that (it's harder in many ways than doing the translation in the first place) and to stick only to full translation work. Some translations don't need to be perfect, but for others the accuracy and quality matter or make a difference. I think she has a niche where she can still get work doing the latter (she has rare languages for a native English speaker, is a very good writer in English, and is also qualifying as a legal translator in the hope that legal translation stays human-based for longer!). But ultimately, there will probably be less full translation work around.


With this pace, only manual physical jobs will remain. Most of the office jobs will be replaced with virtual workers. AGI will have shock-collared humans building new data centers full of TPU/GPUs. Maybe some farmers to feed those as well. Everyone else is on their own. Ah, and humans should be locked away from nuclear weapons for their own safety.


Sorry. I’m not yet seeing on the horizon an AI that can even begin to approach my skill level at organizing a reliable codebase, inventing new architecture to solve business and technical scaling problems, and then implementing that architectural design. At best, AI is tackling uninteresting busywork right now, with no sign that this is changing.


Which just means AI needs more data points from architects: which pages you opened in the codebase, what you read, what changes you suggested, which different approach you chose.

Introducing .. the AIrchitect..


But there are not many of you. Sure a few architects will be needed maybe. That still means ‘all’ with a little error margin.

Also, code in a new AI reality will probably be mostly throwaway code: you don’t need to repair a function; you just give the AI the input and output where it fails, plus the code of the function, and it’ll give you a new one, which might or might not be completely different from the previous one.
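A minimal sketch of that loop, with the model call left out entirely (nothing here is a real API; `build_repair_prompt` is just a hypothetical illustration of what you'd hand a code-generating model):

```python
# Hypothetical sketch of the "throwaway function" workflow described above.
# The model call itself is omitted; this only shows how the failing case
# and the old implementation get bundled together as disposable context.

def build_repair_prompt(source: str, failing_input, expected, actual) -> str:
    """Package a broken function with its failing case for a code model.

    The old body is context, not something to patch line by line: the model
    is free to return a completely different implementation, as long as the
    new one maps `failing_input` to `expected`.
    """
    return (
        "Rewrite this function so the failing case passes. "
        "You may discard the old implementation entirely.\n\n"
        f"Code:\n{source}\n\n"
        f"Input: {failing_input!r}\n"
        f"Expected output: {expected!r}\n"
        f"Actual output: {actual!r}\n"
    )


prompt = build_repair_prompt(
    source="def double(x):\n    return x + x + 1",
    failing_input=2,
    expected=4,
    actual=5,
)
```

In this view, debugging the old body is never on the table; the function's contract (input in, output out) is the only thing that survives between versions.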


If your only value add is pumping out code that a generative model can generate with a few tweaks, please reconsider. This career is maybe not going to be extremely rewarding for you and your salary will go to 0 in the long run (it probably will for all of us at some point but the question is how much further out in the future).

If this is just fancy auto complete to you then neat.

> you just give the ai the input and output where it fails + the code of the function and it’ll give you a new one which might or might not be completely different from the previous one.

I spent three days chasing down a bug in our codebase with a repro. I’ll run the problem through ChatGPT, but I suspect that even if I give it the two functions involved, the surrounding boilerplate pointing at those two functions, and the tests, the answer will be gibberish. My understanding is that ChatGPT’s and Copilot’s capability here extends only to toy examples, where the scenario is carefully constructed and the problem quite obvious. Is that not the case?

Don’t get me wrong: I’m 100% on board that this is a fascinating direction, that we will get there, and that it’s going to be great. I’m just saying the base level of competence hasn’t been met yet, so I’d use it as a teaching tool but ban it in a professional environment for most code (not all) until it’s way more mature. Right now it will do more damage than it’s worth.

I look forward to a future where I can give AI higher-level instructions about what to change in the codebase and just peripherally review what it did. That, plus product leadership and upper management, is a secure job path. Everything else? Well… good luck. And to be clear, I will go before product leadership and executives do. But by then we’ll have AI that’s experienced at building software with minimal input, and you can just start your own business, which is fantastic: it lowers the barriers to software development so that more people can compete in business. That will be an interesting society to observe.


> If your only value add is pumping out code that a generative model can generate with a few tweaks,

You are describing 99% of programmers, many of whose lives, and whose families, depend on this job, which is now almost no longer a job.


I think we vastly underestimate how much bespoke, physical labor and bespoke, tedious engineering goes into the high tech equipment that drives forward innovation.

What I am trying to say is, work for a defense contractor.


Now is the time to choose which future we want.

If human workers become obsolete, do we want to continue with capitalism, now with no more labor checks on those that already own everything, with the Bezoses and Musks controlling everything while the rest of us live in squalor?

Or do we want to usher in a Star Trek-style techno-utopian socialism, where humankind is provided for by technology, freed from the shackles of work and able to explore its interests for their own sake?


Even if you think that AI breakthroughs won't cause a collapse of the labor market, why not create a backstop so that those who do can put their full support behind creating new and useful AI technology instead of fearing for their jobs?


Not quite yet, friend. Wait till the lawyer market collapses, once they all turn coat the capitalists will have no more intellectual linemen. The quarterback will be sacked.

I will not be surprised if we end up with an organization composed of people like us, that looks a lot like the medieval Masons, keeping the internet going while everyone else just talks to it at ubiquitous terminals.


I don't really get the comments from HN people. I could understand my neighbour (non-tech) saying "Ha ha, you software engineers will become jobless because of AI", but the HN crowd saying that? Here's my rationale:

- AI is a catalyzer. It will let us do more with less

- Most of the companies out there are greedy. They don't look for stability, but for growth

- If right now, it takes a company X resources to build product Y in N months, then with AI the company will have to decide between a) build the same product Y with X/10 resources and in N/10 months, or b) build 10 different products with X resources and in N months. Greedy companies will not choose option 'a' because that options leads to stability; they will choose option 'b' which is the one that leads to (hyper) growth

Just imagine your usual startup in 2024 invigorated with AI:

- Founder 1: I want to build product X. Cool, with this AI thingy I can hire only 1 engineer, 1 product manager and maybe 1 designer to build what it would have cost 20 engineers, 2 managers and 3 designers in 2020.

- Founder 2: Cool. So we just hire 3 people and launch X in 1 month, right?

- Founder 1: Well, we could do that. Or, we could hire 20 engineers, 2 managers and 3 designers to build not just product X, but also product Y and product Z. Right? I mean, why should we limit ourselves with one product only? What if it fails? At least if we launch 3 products, we'll have more chances of "winning"

- Founder 2: brilliant!

So AI will just boost customer demands and greedy companies (the majority of companies). There will be plenty of jobs for all of us. If you don't believe this: not long ago, Twitter had around 7,000 employees. Jeez, can you believe that? And Twitter had 21st-century technology!

My bet is that AI will enable tech companies to look more like Google: small and medium companies will be able to do A LOT more (e.g., we'll see small companies launching complete G-Suite-like software in less than 6 months, and we'll see YouTube competitors the way we see TODO lists nowadays)... but that won't be achieved with only 10 or 20 engineers.


I have two comments, both subject to change. One is that I think most startups hire too many people anyway, you don't need 20 people to build a TODO app. Depends on the scope sure, but I see it all the time. Too many chefs, too much code.

Second, though, no one has demonstrated an AI that can write programs with complex requirements yet. Even with light requirements, it doesn't approach them holistically or with the goal of a coherent software ecosystem. It might give a junior the tools to muddle through with code snippets from ChatGPT and Copilot, but you would still need just as many of them. And your senior engineers derive even less value from what language models provide, since they are not struggling to write programs; coding is never the hard part of building an application. There is limited value there currently. Maybe help with an algorithm or two, but it's not about to start making changes to your e-commerce platform for you.

I think we need AI-specific frameworks for that to occur: ground-up AI codebases, like a DSL for the somewhat disjointed and abstract approach that language models take toward code. Current frameworks are aimed at providing guidance and strict scaffolding for humans to reason about business logic in, but the grey matter is actually busy solving very specific business problems, and those problems are spread over a fourth dimension, ever evolving with time.

The other issue is that language models rely on prior art, which for human language is pretty abundant. But I suspect the problem-solving skills involved in resolving business logic into code are not well represented online, since no one is writing blog posts about their every thought at BigCo Commerce and Analytics.


Nobody's invented an orchestrator yet. Someone will invent a kAIbernetes, which will analyze structures. It seems to me that not many people are really trying hard to project and extrapolate in a realistic way. This will develop much like other techs have developed, but it will express itself (literally, expression is what these fuckers do) very differently in its various stages.

Ten years got us from seeing dog faces in everything to verbal diarrhea that makes sense most of the time, and to award-winning Instagram photographers that only exist as a Stable Diffusion delusion. You and I might be gone in twenty years, but the world will still be here in a thousand. We just don't know, but I do know one thing: AI is still at the "steam thresher" stage of things.

We programmers will definitely be the last to go, I think, but we'll go too, because fundamentally, we became programmers to make machines do stuff we don't want to do.


kAIbernetes will become what jQuery became: someone will need to create something better at some point. It's all the same again. AI won't affect shit. Did we lose our jobs because of compilers?


Follow your example: someone did create something better at some point. Hence, someone will create something better at some point.

Automating people away without taking care of them has been given a pass because so far, the people being automated away were not People Who Matter.

Let's watch.


Yeah. If it works for more than just generating code snippets with no actual understanding, then we get to level up.


I've been thinking about it in terms of the transition from pre-1950s human computers to electronic computers. How much should we mourn the loss of rooms full of people doing calculations by hand? Making it easier for people to solve problems seems like an obvious win, even if it puts the "just give me the requirements in a ticket and let me code" programmers out of their jobs. Even outside of big software projects, imagine the business benefits of allowing an office worker to describe a problem and get an Excel formula to try.


Somebody ought to save, somewhere safe, a copy of the internet as it was in 2021, full of mostly human-generated content. As time goes on, it will become ever more valuable as material to train AI models on that isn’t itself chock full of AI-generated surrogate.

If it isn’t saved, we will all soon be employed sorting the ever-diminishing share of “genuine organic human-produced content” out of the all-consuming ocean of AI-induced echolalia.


> full of mostly human-generated content

So certain are you?


I think people overestimate the time spent writing new code and underestimate the time spent maintaining existing code.

The point of maintenance is to minimize changes as much as possible and only change what you must. The AI has the advantage that it can type out syntactically valid code it has already seen very quickly.

When you are starting from a blank slate, this capability is perfectly adequate, since you go from nothing to something and there are no preexisting constraints. Just look at all the people claiming they can build a Twitter clone in a week.

Does this obsolete the developers working on twitter?

Once you have a working product you will spend the vast majority of the time dealing with pre-existing constraints. Most developers might end up stuck in "maintenance hell" where you have to read dozens of files of code and dozens of pages of production logs, all to change one line of code.

I personally am spending weeks on a feature that is broken in an aesthetically unpleasing way but more importantly, I am not allowed to break what already works.


The conclusion section is amazing, highly recommend at least skimming to there if not RTFA.


The conclusion is that AI threatens dietitians.


This was very enjoyable, thank you. After reading this article I am _less_ concerned about my job activities being taken over by LLMs. In fact they could help me with tedious BS so I can spend more time troubleshooting real problems.


The "AI taking jobs" narrative is marketing for investors to make AI companies seem valuable. Besides being used to make customer service even more unpleasant, don't expect to see too many AI-related job losses in most industries. While there is a chance it will press down wages for people in the graphics/commercial art industries, unless there's a major development/pivot in AI, it will have a major impact on employment. Let's not forget how impressed we were by voice assistance and where it ended up.


The real question is whether AI will eat jobs faster than the economy can expand to create new applications on top of it.

First we automate known jobs with AI; then we realise everyone else has AI and gets the same efficiencies, and competition just got much harder; then we need to find a way to surpass the competition by extracting even more from AI.

So we're back to needing people to drive AI to its max potential.


>So if you’re a programmer like me, you’re probably safe!

Phew!


GPT won’t replace your job, but in time it’ll lead to tools that increase developer productivity significantly. And when each developer is more productive, you don’t need as many of them. As history shows, an increase in productivity is not tied to an increase in pay.


All this discussion of lost jobs. The intangible are lost to AI; the tangible are lost to robots. I'm starting to think that Zardoz is a documentary from the future.


The answer is yes



