I don't worry about AI automating away my job, not because I don't think it's possible (though I do think it's much, much further out than the hype would suggest), but because an AI automating away software creation is the economic equivalent of nuclear war: it would irrevocably alter everything about the way the world is run and is therefore impossible to adequately prepare for on an individual basis.
Most of us in software are automating other people's jobs: we learn and understand requirements and build things that make other people more productive or erase their jobs entirely. The rate of automation is right now limited in large part by the availability and cost of software engineers. If AGI can reduce that cost to ~0, then a massive percentage of the economy would be wiped out in a matter of months. What is any individual software developer supposed to do to prepare for that scenario?
If AI automates software creation, which then automates everybody else's job, why does it follow that a "massive percentage of the economy would be wiped out"? What is so undesirable about productivity going up by a factor of 100 or 1,000 and everyone living like an aristocrat because machines do all the work? Why is full employment such an obsession?
Because the benefits of these tools are not distributed equally. The ultra wealthy will become even more wealthy and the unwashed masses will starve.
Of course the ideal outcome is the Star Trek post-scarcity utopia. But humans are not currently incentivized in a way that I can see leading to that outcome in our lifetimes.
Benefits of innovation start with the privileged, but invariably end up benefitting the masses. Cellphones were once available only to the rich. Poor people have smartphones now.
Many may argue that the science and innovation of the last century were funded by taxpayers, and by a government that taxes the poor proportionally more than the rich.
Workers merely built tools to make their own work processes more efficient; those tools were then taken by corporations and applied across the workforce, without a proportional share of the savings going to the people who invented them.
Mass adoption of technology funds the next round of research, so the profit is diverted from the masses toward maintaining competitive advantage.
And the time saved by technology never did reduce working hours or increase the leisure time of the workforce.
Yet healthier foods are now less accessible to the poor. Where once the rich were fat and had rotten teeth from sweets, now the rich have trainers and personal chefs and healthier food. Poorer folks have food deserts and junk calories.
Innovation doesn't always trickle down. Sometimes it works in reverse to exploit the poor and the ignorant.
Having a smartphone is probably not hugely conducive to an increase in quality of health or happiness.
Sounds like OP is worried that they aren't going to be able to afford a roof over their head. No government seems to be ready to roll out a plan to deal with huge swathes of the populace suddenly being out of a job because their employer wants bigger profits.
I didn't say it would be undesirable! Just that it would completely alter the world in ways that we cannot prepare for on an individual basis.
Ideally, we move to UBI and, as you say, everyone lives like an aristocrat. But we don't get there by trying to hedge against AI taking over our individual jobs.
With our current societal structure, a few people would live like aristocrats guarded with AI weaponry while the rest eke out an existence in shanty towns.
The bottleneck to wealth creation wouldn't be nothing; it would be access to natural resources, which is, and always has been, mediated ultimately through violence. Land ownership is mediated by states until they lose their monopoly on violence.
Already the % of wealth that is earned through labor has dropped to record lows. The productivity gains get swallowed by things like rent, because land has been bottlenecked through excess hoarding.
What happens when labor is commoditized to death and the only value comes from natural resources? Fighting. LOTS of fighting.
> What is so undesirable about productivity going up by a factor of a 100 or a 1000 and everyone living like an aristocrat, because machines do all the work? Why is full employment such an obsession?
Nothing is undesirable about the future you laid out in your comment, except for the fact that it will never ever happen the way you describe (it would be nice if it did, but I'm not holding my breath). You could have made similar claims about the computer or the internet back in the day, but what we've seen from those technologies is ever-increasing concentration of wealth in a few hands. That is what technology does: it centralizes by default. And most human attempts to redistribute wealth on a large scale have been more disastrous than the inequality they were trying to solve.
Food would be so cheap that watching a 15-second ad played on the disposable e-ink screen that wraps your burger would pay for it.
Note: I don't think this will happen. I think rich people now would rather let people starve so they could save the price of the 15-Second Ad Burger.
We have incentivized sociopaths to run things in the name of efficiency. We will either realize this is a false goal, remove all those people from power and influence, and live in a post scarcity society... Or we will all suffer the consequences.
If everything is automated, that includes the production of whatever goods or services may be advertised. I think that leaves only ideological advertising: theological, political, whatever.
If the food has a meaningful cost, or the adverts have meaningful revenue, then there is meaningful labour to earn money to pay — directly or indirectly — for the food and/or advertising.
> What is any individual software developer supposed to do to prepare for that scenario?
Do what countless people in the past did when they got "automated" away or otherwise ran into shifting times: be unemployed until you find a different way of surviving.
The industrial revolution changed the lives of a lot of people, making vast numbers of people unemployed over the course of some decades. It's not hard to imagine that future changes will happen much faster, as everything happens much faster in modern times. So once it starts shifting, the effects will probably be greater.
But, where there is a will there is a way, and surely software developers would discover this as well if they ever find themselves unemployed because of a shift in the world.
That implies a heavy reliance on my past employers having paid me a surplus over a living wage. The reality is that many employers do not pay a living wage, forcing people to seek other means of earning extra income to top up their regular wage.
If you are a software developer, which was the context here, chances are you are paid enough to have it considered a living wage if not more. Is that not the case?
What if AI doesn't automate your job, but walks right past it by discovering a truly reusable piece of software, so that all the single-customer software we develop nowadays becomes far too expensive? That possibility sounds much more believable to me.
Think about it. If AI can truly automate software creation then it's going to improve itself, at which point the Butlerian Jihad begins. If you want to prepare for this start learning a trade at the renaissance fair.
It's not just issues of tech and economics but, perhaps more importantly, of power. If workers have political representation (power), then these productivity increases can feed into wage increases and hour reductions. A big if. In the USA, workers are pretty powerless. That means it is most likely to result in massive waves of unemployment and political instability.
When I first moved to the city in which I live there were NO homeless. That was in 1980. Since then the working class economy here crashed and the professionals took over. Now we step over bodies in the streets.
I think it is likely that the AI wave will be this, but worse.
BTW my last job was at an AI company as a high level tech contributor.
The statistics are pretty clear that the 80s were a MUCH better time to rent.
Drug usage is mostly caused BY homelessness. There is a strong economic incentive in play explaining why you believe the opposite.
We're told that homelessness is caused by drugs rather than endlessly rising rents because a lot of very rich people are making ungodly sums of money from endlessly rising rents and they'd like to continue. For that they need ownership of media and a good scapegoat - both of which many of them have.
Not many magazines are owned by homeless people rather than property developers, but if there were, they'd tell the opposite story.
The people that maintain jobs aren’t the visible homeless OP is referring to. The dude in a tent on embarcadero shooting up isn’t working as a barista 40 hours a week.
He probably will be after 5-10 years of homelessness. Drugs salve the pain and despair of being homeless which becomes overwhelming after a few years on the street with no hope in sight.
There's no way somebody on a low wage is going to be able to afford rent anywhere near the Embarcadero. The land-parasites have the market locked up.
>Visible homelessness you are referring to is generally a drug policy issue more than anything.
Drug and/or alcohol abuse (not use) is often a symptom of an underlying condition, typically associated with despair. It's escapism.
Despair: the complete loss or absence of hope.
I think a lot of people suffer from despair in the US today. I remember in the 1980s, farmers were killing themselves at quite a high rate because of automation and consolidation (big agg) and them losing their family's farms. I think that same thing has caught up with the rest of the population in the last 40 years.
There has to be a tipping point when enough jobs are done by machines that there are not enough wage earners to buy products and services created by these machines. At that point a choice must be made, to either take the machines and lock them behind secure walls to only be used by the privileged and leave everyone else to squalor, or to fundamentally change the structure of society where work and money are no longer core elements. I've got my bets on which one will occur.
Cost of products made solely by machines will approach zero. If machines take all the jobs, humans will live like aristocrats. Imagine telling farmers 200 years ago that 95% of humanity wouldn't be farming at some point. Imagine them making bets similar to yours.
Money is just a game coin. Aristocracies need it because they need someone to provide services to them; creating that bond requires an economic and currency system as well. But once they don't need people to provide services, well, they can remove that system too.
That pretty much means the end of everything for ordinary people and the beginning of the "Lord of light". That also means that humans probably won't be able to go through the great filter -- yes I believe it is not a scientific filter, but a social one.
That's not going to happen overnight. Producers will keep the price the same for higher profit margins, because other things in life still require money.
Labor is not the only requirement for production. Resources will become the limiting factor, and their price will rise above what non-AI-owning people will be able to afford. So back to square one of scarcity.
* Busy people are peaceful people. Idle hands are the devil's playthings.
* Those "arbitrary people" are our neighbors, and prone bodies draped across the sidewalks are bad for property values.
* Allowing domestic industries and expertise to vanish too quickly can have unintended consequences. Try buying a new set of subway cars in the US today...
That being said, nobody suggested forcing employers to keep redundant positions open. There are plenty of other ways to help keep people productive and engaged in a shifting industrial landscape, which could be funded through various kinds of taxes on the use of AI.
Okay, so, the year is 2036 and your second child has just celebrated her first birthday. The economy is humming; production of all goods and commodities is way up. Unfortunately, you and your spouse are both unemployed at the moment, because for any skill that you have to offer the labor market, a machine is already available that does the job better, faster, and more cheaply. Quicker fingers, stronger arms, faster legs, and now a mind.
Your kid is hungry and your savings account is running low. What do you do?
I think there will come a point, where almost everything will be automated by machines. Since we cannot just get rid of billions of people, we have to find a way to both feed the people and have something for them to do. That is unless AI takes over control of humanity. This is similar to how prisons function. They have to keep large masses of people, feed them, keep them peaceful, without expecting anything in return.
I think a lot of this thread (and HN in general) seems to think that their case is special and AI will not make any dent there. While this is true for many comments, it reminds me of several studies showing that a majority of people think they are better than average in their field. We tend to view our tasks as special cases. People are trying to see how the current AI fits in their current process.
However this won't be the case. The AI might improve and improve. And companies might change processes to revolve around the AI. Just like how companies change their processes to cater to ERP systems, they might change them to cater to AI systems. AI won't have to talk to 10 teams and play politics for prioritisation.
If AI is solving everything and ultimately phasing workers out, then what exactly would you be educating your kids in? Humans' limiting factor is the time it takes them to learn something (18-27 years of education depending on the field). What job would be worth striving for? They'd be training for jobs that a future AI would be trying to phase out. This whole article just described how most of the highest paid workers (doctors, lawyers, driving, logistics, film, music, radio, etc etc) would be automated. I don't know what you'd need a person for at that point. Feels like a race to the bottom.
It is also sad that every other species had already suffered before humans started to feel this way. There was massive biodiversity loss on this planet, and most mammalian biomass now consists of humans and the animals that feed humans. Humans need to retreat. We have enough tools and people improving human lives. We now need more people and tools to improve the many other lives and systems on this planet.
I think education will still be crucial in human-to-human relations, but it remains to be seen whether the value of education will be lessened once it no longer gives any other edge. I really share this concern because, let's face it, we're pushing ourselves for the sake of a better life, a better job, a better X.
This, just like many other takes, confirms that there will be a short term boom in applying AI to everything. If you are in your 30s or 40s and surf this wave correctly, this may very well be the last job you will ever do.
So my plan is to learn how to plug GPT into Excel, databases, etc. and provide AI to the smaller companies that can't integrate AI themselves the way a Salesforce can, and that the big players like Azure Professional Services won't find profitable enough to deal with.
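To give a flavor of what I mean by "plugging GPT into a database": this is just a rough sketch of the direction, not working product code, and it assumes the pre-1.0 openai Python client plus a made-up SQLite orders database.

    import sqlite3
    import openai  # pre-1.0 client; newer versions use OpenAI() instead

    openai.api_key = "sk-..."  # your key here

    def ask_database(question, db_path="orders.db"):
        """Turn a plain-English question into SQL, run it, return the rows."""
        conn = sqlite3.connect(db_path)
        # Hypothetical schema; a real integration would introspect the actual DB.
        schema = "orders(id, customer, product, amount, ordered_at)"
        resp = openai.ChatCompletion.create(
            model="gpt-3.5-turbo",
            messages=[
                {"role": "system",
                 "content": "Write one SQLite SELECT query for this schema: "
                            + schema + ". Reply with SQL only."},
                {"role": "user", "content": question},
            ],
        )
        sql = resp["choices"][0]["message"]["content"].strip()
        return conn.execute(sql).fetchall()  # a real version would validate the SQL first

    print(ask_database("What were our five biggest orders last month?"))

The model call is the trivial part. The real work is everything around it (schemas, permissions, validating what comes back), which is exactly the integration work these smaller shops can't do in-house.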
In the early days of widespread internet connectivity (dial-up) it generally went two ways:
1) You lived in a big city and AOL/Compuserve/Prodigy/etc was a local call. Practically everyone had it.
2) You lived "somewhere else" and some random guy you could pay had a T1 and a bunch of modems in a garage (this was me).
Eventually it became worth it and technology advanced enough (with demand) for cable cos to deploy cable modems. At first only in dense metro areas, then slowly out to more rural areas.
Point is - there's a TON of opportunity for "guy with a T1 and modems in a garage" in this space. There is a huge market of ignored and underserved businesses/customers/applications/integrations/etc that (like AOL/Cable Cos) the big AI guys don't care about (yet).
Right now seems like the right time to be doing this for sure. There are also a lot of potential applications for GPT to be used to solve problems that were previously too expensive or infeasible to be done by humans at scale. It definitely doesn't hurt for you to hedge your bets and try to learn how the tech works at the bare minimum.
Even if all the jobs creating CRUD apps are automated away, there will probably still be an interest in automating them even faster. LLMs are currently the state of the art technique for doing so, hence there will be a continual need for more AI engineers/researchers.
And if something that performs better than current LLMs comes around, all those engineers will shift over to whatever field develops them.
20 years is a long time in an industry with an incredible rate of change. Though I guess if you're in the market now, you're 3-5 years ahead of kids coming through tertiary studies specialising in AI.
I hear many other tech professionals doing more or less the same. Shame you don't have any contact information in your bio, I would love to have a chat.
Squarespace was supposed to take away the jobs of people who make websites, but it didn't. The same people who you used to hire to make a website for you are still doing that, but now they just use Squarespace for it.
I saw a recent quote for a very basic, static website that was greater than $10k. This would be about a day of work to put together in squarespace.
AI is not going to take away jobs, it's just going to make the people already doing them more efficient.
A few points on this from someone who's been building websites for 25 years (I started around when images were added to HTML) -
The web designers don't hear from many of these previous prospects who now go straight to Squarespace.
When people come to me for a Shopify site, it's usually because they've done all but the hardest 10%. Then they want to pay me a tiny amount to do the most unpredictable and difficult 10%. Usually something custom/difficult within the parts of the platform that are locked down.
I've seen budgets from local brand-name companies go from $20k for a build to $2k.
Often, the people charging $10k for a Squarespace site are justifying the majority of that with related services (copywriting, photography, content, marketing, etc). Many surviving web companies needed to become agencies. Shopify has some automated marketing options now. Copywriting is increasingly done with ChatGPT/similar.
Don't get me wrong - this is all very liberating for the client side and a boon for platforms like Squarespace and Shopify, but don't underestimate the upheaval for web designers.
We've been re-inventing the wheel for mom and pops for so long it made sense that someone made it more efficient. I used to do it for small businesses and now I do it on the exact opposite side of the spectrum building web apps and my industry is ripe for the same kind of thing.
In the near-term I think we'll see it as an efficiency gain for developers, but longer term we will be able to just make applications on the fly.
It's hard to predict. Consider bank tellers. ATMs initially allowed banks to run more local branches, resulting in more jobs. But now that ATMs, especially the ones inside, are VERY full function, the job numbers are reducing fast.
Already reduced numbers, and a prediction of a 12% drop over the next 10 years, versus an average 5% gain for other jobs. As mentioned in other replies, not just ATMs, but cashless self-service of all types.
Right. But now Squarespace can integrate a prompt that has a conversation with you and build a site and continues to iterate on it until you’re happy. This adds very little cost to Squarespace. Maybe it gets most people to 80%. The last 20% will be a service provided by a human.
This is an interesting example, because they probably don't use Squarespace for it, as they are likely more effective with other types of tooling. Whilst Squarespace is a great general-purpose tool for people who don't know specialized tools, a specialist will be more effective with different tools.
That's where I see the difference in AI as well. A specialist is probably faster using their own tooling rather than muddling through an AI interaction. But the AI gives non specialists the ability to muddle through tasks they can't do on their own, or don't have specialized knowledge for.
It's really feeling like AI is the new crypto. The majority of the people hyping up AI benefit directly from the rise of AI, the same way the people hyping up crypto either ran blockchain or dapp projects or were VCs invested in them.
I don't know the writer of this blog post, but I'm willing to bet he has posts about how crypto will replace the financial system.
Tbh this makes me much more wary about all these AI claims. It wasn't too long ago that people were clamoring about how their latest favorite technology, blockchain, was going to magically come in and solve every problem since the dawn of society.
Agreed. It's always possible that this is a "a stopped clock is right twice a day" situation—that by hopping on every bandwagon the hype artists eventually land on something that happens to work out—but in general I find it likely that this blows over to a degree.
There's an anti-productivity link on the front page that answers this well: the amount of useless work to be done is infinite, and growing productivity on useless work only lets you do more useless work.
The more useless work AI is able to do, the more useless work will be asked of it. The AI doesn't have opinions about what it should be doing, so without people to push it into doing the right thing, it's going to be answering a lot of emails.
The recent rounds of tech giant layoffs suggest that capital is far more capable of asking whether a given unit of work is actually useful than we might have thought before.
If n employees can earn m money, sure, one form of institutional greed might assume that 2n employees will earn 2m money, and 100n 100m. But if that fails (and chances are it will, because everything has a sweet spot beyond which lies the land of diminishing returns), it's hardly surprising that the next form of greed that will be attempted is discovering how much of m will remain with n/2.
As a non senior data engineer, I think AI can take my job already, except that it requires my company to open up its code base for AI to consume and train seniors to be better prompt engineers so that they spend less time modifying code written by AI. There are a lot of edge cases but I think the tech is already here.
To escape that fate I'm going to drill deeper into k8s, Docker and Linux system programming. I know a better route is to launch a career translating requirements from business stakeholders and developing an AI-friendly architecture so that our AI lords can conveniently code functions, scripts and programs without much interference. But throughout my career I have tried my best to avoid that road (because I'm not interested in business), so I have to pick the second-best way: serve our AI lords by maintaining infrastructure for them. Just as rhino birds serve rhinos, I shall serve our AI lords faithfully for eternity, even after death, when my coding skills can be consumed by them.
I think that's the most dangerous way to go, TBH. AI is good at stuff that is very widely represented in its training data, and common pieces of infrastructure that many apps use, like Docker, will be some of the easiest stuff to automate away.
"AI, spin up a K8 with these properties like you have for 100,000 others..."
After seeing a demo of ChatGPT generating terraform code I have to agree with you.
I'm not sure what else I can do, though. Looks like AI is indeed going to be very good at everything I love to do, mostly very technical stuff. I just fed ChatGPT a bunch of DOS viruses and it did a good analysis. It might take me quite a few months of full-time study to reach that level.
Do I have to be an entrepreneur? Sigh. I have spent my whole tech life to be away from that road...
There are plenty of students who can ace these exams but then utterly fail at getting things done at companies because they lack problem-solving experience, a tolerance for ambiguity, or any number of interpersonal skills needed to complete complex projects. Until AI can just deal with other AI and not messy humans, there will be job security.
Yes but AI won't have to deal with messy humans. Once companies see how quickly this ultra cheap AI can get things done, a lot of companies would restructure things to ensure the AI has a superhighway to whatever it needs. That sounds like an easy competitive advantage for any company. Messy politics exist within companies because there are so many people and so many orgs. If one cheap AI can do so much concurrently, structures optimal to the AI will be formed focused around the AI needs.
who checks this output? who troubleshoots when prod is down? what about regulations? which company is the first one to have a huge data leak and blame it on AI? which legislator bans use of AI unless there are human engineers?
I’ve never seen any one company be successful at streamlining their projects.
I am worried about AI, but I doubt this will happen the way you're saying it will.
Code is already reviewed by at least one other person right now in most companies. So it continues the same way.
> who troubleshoots when prod is down?
A human engineer for now. The amount of time when prod is down should be low enough for this to not be a huge factor.
> what about regulations
The AI will do a better job of memorizing and understanding those.
> which legislator bans use of AI unless there are human engineers?
What does this even mean? Is it OK to have one AI and 5 engineers where previously 50 were needed? Since nobody is going to have a company with 0 human engineers and only AI.
> I am worried about AI, but I doubt this will happen the way you're saying it will.
I also doubt it happens exactly as I am saying it, but I think the possibility is non-zero.
I think my job is reasonably safe simply because "AI" is never going to be as effective at it as "me + AI". Someone has to be there to understand the problem in human terms in order to know what to tell the AI to do. I've recently started pushing myself to take a more Systems Thinking approach to solving problems, and to start using AI in my work, precisely because I see this as a change that's coming. It is inevitable. I might as well embrace it.
Also, and maybe this is selfish, but I suspect fewer people are going to see tech as a good career option in the very long term (10+ years). The experience I have now puts me ahead right now, but if there are fewer devs entering the industry in a decade, it'll keep me in work for a long time. I think (hope!) I'm going to be OK.
If I was a junior right now, or a student, I would see AI as a much greater threat.
And eventually it would just be "business owner" + AI. The way I see it, a small business owner would use the AI by directly talking to it, via natural language, and continuously iterate on the output.
The feedback loop would be fast, because the AI is almost instantly able to produce an observable outcome/product, even if it's wrong. Then the owner tests it out, or ask the AI to test it etc, and whenever the owner finds any deficiency, they can easily ask for a change.
My hope (as a mid 30s developer) is that I can effectively use AI to become a Me+ version of myself, and this helps me level up my career enough that I appropriately ride the AI wave, pick the right engineering positions/companies, and am able to effectively retire in ~5-10 years.
I really do feel that I have like 5 years left of my currently high income before I am making what my plumber friend makes. I'm OK with that, as I'll be out of the early child rearing years, but also, phew, it's scary!
Yeah that's me. I'm hoping that if I keep up to speed as much as possible I'm employable for the next 20-30 years until (early) retirement.
I started transitioning away from programming before that (for unrelated reasons) but I'm starting to think that it was a good call. If I am to become the COBOL guy of the future I'd consider that a win.
Yes, a very limited AI still devalued all that a human brought to the table.
Now AI models are getting both more general, and deepening their comprehension, at the same time. Rapidly.
If there are highly talented humans in your field today who would find your extra help more of a distraction than a help, then it's likely more advanced models will too.
Not being critical. This applies to me too. All of us!
Eventually the ‘me’ part hits diminishing marginal returns if AI is capable of getting you 90% of the way. As good as you + AI? Maybe not. Good enough? Probably.
What I find interesting is what professions are concerned about AI. For the most part they appear to be better paying white collar need-a-college-degree type of jobs. Tech, copywriters, marketing, designers, lawyers, accountants, etc. All feeling the heat.
The person pumping gas? Cutting hair? Janitors? Nurses? They, for now, seem to be immune.
As a side note, anecdotally, productive gains have a ceiling. Sure Copilot and ChatGPT free me up to focus on the heavy lifting. But my brain can't run that relentlessly all day. It seems to need to catch its breath.
> I haven't seen anyone paid to pump gas in my country for decades?
Meanwhile, I've seen people at gas stations pumping my gas for me in plenty of countries, especially in South America. Some places in South Europe do it too during rush hour and at really busy gas stations in metropolitan areas.
I will admit that I am personally on the fence. There are parts of my job that could definitely be augmented by LLMs (I know, because I have been trying to use them to test whether it could work in my domain). Some results were cool and shaved off a fair amount of time for me (for example, it cut down the time spent on a proposed approach to a problem where a solution already existed), but I can't say it would, in its current state, replace me (the quality of the Python and SQL code GPT generated seemed to depend too much on how specific the prompt was, and I typically had to refine it as I went).
But we are merely starting out. I actually started looking into building an Excel plugin (first time ever, so who knows how that will pan out) for my use case (categorizing some unstructured data into structured data, which is what GPT seems to excel at).
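As a rough illustration of what I mean by categorization, this is the shape of the call I've been playing with; purely a sketch, assuming the pre-1.0 openai Python client and a made-up list of expense categories.

    import openai  # pre-1.0 client; newer versions expose OpenAI() instead

    openai.api_key = "sk-..."  # your key here

    CATEGORIES = ["travel", "software", "hardware", "meals", "other"]  # made-up labels

    def categorize(description):
        """Map one free-text expense line onto exactly one known category."""
        resp = openai.ChatCompletion.create(
            model="gpt-3.5-turbo",
            temperature=0,  # we want a stable label, not creative writing
            messages=[
                {"role": "system",
                 "content": "Classify the expense into exactly one of: "
                            + ", ".join(CATEGORIES) + ". Reply with the category only."},
                {"role": "user", "content": description},
            ],
        )
        label = resp["choices"][0]["message"]["content"].strip().lower()
        return label if label in CATEGORIES else "other"  # model can reply off-list

    print(categorize("Lyft to the airport for the vendor visit"))  # hopefully "travel"

In the plugin the idea would be to run that over a column of cells and write the label into the neighbouring column; the guard on the last line is there because the model occasionally answers with something that isn't in the list.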
And the one thing that was initially protecting me is that some companies, rightfully, identified ingestion of prompt data as a massive privacy issue. Now OpenAI supposedly said data ingestion from prompts and provided files will be opt in only, which may mollify some of those companies.
OTOH, some things can't possibly be replaced by AI ( last week we just dealt with a vendor requirements translation and, uhh, good luck with all that GPT ).
> The main winners of these trends will be all of us consumers, with access to much better content.
Also, the best creators, who will use these tools to generate lots of great content and rake in its benefits.
Maybe yes. But I wonder. Take the example of disappearing local football leagues: what is the impact on local communities? When everything is "centralized" to the very best providers... is best always best?
Are you talking American Football or the "actually using your feet" one (Soccer to us)?
Youth (American) Football is dying because the data on CTE and other debilitating injuries is clear and parents are less enthusiastic/willing to allow their kids to damage their brains.
Related to this quote, I don't think we're guaranteed access to better content. I think we're guaranteed more prolific, cheap-to-produce content. Think the Instagram era relative to the Flickr era. Less about the art, more about fast fashion, or fast art, to capture your ever more brief attention. Certainly the next chapter of Tim Wu's book, The Attention Merchants.
Before AI can take my job, AI can be my tool. I can become more productive with AI, many times more productive than now. Then the question will not be "is AI better than credit_guy?" but "is AI alone better than credit_guy who uses AI?". If I learn how to take full advantage of AI, I have a fighting chance.
Tanks may have made cavalry obsolete (was it tanks? I don't know), but they didn't make generals obsolete. When was there ever a military engagement that wasn't described in terms of commanders facing off? It's been the story since the beginning of war.
Likewise there are decision makers in civilian life. They decide all sorts of things, sometimes relying on people to implement their decisions, sometimes machines. Sometimes they use a machine to replace a lower level decision maker. But there's always a boss.
This is what will determine whether your job is automated away.
Here is my day-to-day from this last Friday:
1. Made circuit boards for a photodiode circuit (1 nA max out. Welp!) with Altium the previous week.
2. Got bare fabs from the local fab house.
3. Received them, checked for opens and shorts between power and ground. Also verified the internal planes were intact (no dead copper).
4. Stuffed one board (made 5).
5. Passed the smoke test (huzzah!!), but the op-amp output was zilch.
6. Debugged the entire afternoon and found out that the enable on the reference chip was pulled down instead of pulled up. I fixed it, the circuit started working, and the op-amp output went high.
In the intervening steps between 5 and 6, I tried searching for specific debug tasks using Bing AI (I was an early adopter, hihi). The results were sub-par, essentially summarizing datasheets. Went directly to the source (ChatGPT) with alarmingly similar results. Hence I decided to have the datasheet on one screen and my schematic on the other, traced the connections, and found my error within, say, 15 minutes.
Long story short, in my professional career AI is not going to be of any immediate help(in its current avatar) for the next 3-4 years. Beyond that I probably would be long retired and gone off grid.
Definitely, debugging is going to be a manifold problem. Particularly because it's typically not just undocumented behavior but also wrongly documented or inversely documented or it's not parallel safe, etc.
> get in the bandwagon of AI, start using the tools, explode your productivity
I'm reluctant to start using LLM-based coding assistants (e.g. Copilot) in my programming work because I don't want to taint my code with laundered code that was under a copyleft or other restrictive license. Is this a reasonable concern?
In short, no. Once you try out the tool you’ll understand this concern never made much sense to begin with for normal usage patterns. Copilot primarily excels in writing very short snippets, far too short to be under copyleft.
Oddly enough half of my job is learning to get answers (from the physical world), the other half finding out if those answers are true. So far no human has come close to threatening my job if they don't actually care about those things. So it will be when AI learns to care about the truth.
This got me thinking: as more of what can be produced with existing technology is made trivial to produce through enhanced automation (e.g. AI), one area that will still require labor, and that people will still be willing to expend resources to procure, is the creation of new technology.
It would be in our interest to remove legal impediments to raising capital for new ventures, so that more of the labor freed up by automation of existing industries can be allocated to the creation of new industries.
Such a world would see more kickstarters, equity crowdsales, crypto token sales and DAOs for collaborative creation, and more people employed in the ventures these capital raising projects fund.
Parents already use technology (TV, tablet computer) to alleviate some of the burden of caring for their children. Why not use a far more engaging product?
I think even if the utopian case happens and the machine wealth gets shared among people, with everyone living happily in leisure while the machines do everything, even then it would severely curtail social mobility. There would be no way to get better outcomes in life. An even larger number of folks won't have any upward mobility. A lot of human inventions in the past 2 centuries have owed to individual drive and ambition. Once that goes away, how does anything new happen?
I think the mistake a lot of people make is assuming that their job is actually important. We got a little sneak preview of that during the lock downs in the last two years when people's daily routines got disrupted a lot and a lot of day to day work either stopped happening or was reduced by a lot. It had a lot less effect than you would expect if you assume that most people do something important/essential. Most of what people do in their jobs isn't all that important.
This is a result of automation that has gradually been introduced since the industrial revolution. People kept on working but the nature of what they do has become increasingly detached from things that people back in the day would have recognized as work or important.
We work less than ever, and a lot of "work" basically involves people having meetings, managing other people, or doing things in the services industry in a complex network of interdependent tasks and processes where you might question what it actually is that we are doing collectively, or why.
Obviously, there is some value in doing these things because we're obviously doing them and getting something out of the process of doing that. At least subjectively. But what matters economically is that money changes hands and is used to create the demand for all these things. Without demand, there's no need to do a lot of the things that we do. And without that, the need to automate it also goes away.
So there's this paradox that automating what we do, removes the demand for the thing (or at least downgrades its value). And it helps create demand for more valuable things we can do.
Take bread for example. Rather crucial to a lot of people and it used to be the main component of a lot of people's diets. Bread now comes from factories and ever since we invented sliced bread it's been considered the best thing ever. Sliced bread (sliced by a machine) from a factory that produces bread at an industrial scale.
Except we pay extra for artisanal bread that is made the old-fashioned way. So we now have people with college degrees opening bakeries and selling us really expensive baked goods that they made with their own hands. Could what they do be automated? Yes, that already happened. But the whole point of what they do and why it is valuable is that it isn't. The factory bread is low value. And the bread is just part of the experience. The whole bread experience involves interacting with the amazing people that bake the bread, that knead the dough, and visiting the beautiful shops in which they do that. Add premium packaging to the mix and it makes you feel really good about spending $10 on a loaf of bread. Can AI automate that experience? No. It might bake some bread, but it won't be worth $10.
There's going to be a lot of that. People will look for activities that are valuable to others in some way. The automated stuff is inherently lower value, boring, and a commodity. And we need value creation to create demand for whatever else it is that we do for each other.
Under the current system, millions will lose their jobs and a handful of extremely rich people will get richer: the capitalists. All of this will be completely lawful: it is the law of the free market.
The only answer is to destroy capitalism and unite as workers. Only if we unite as proletarians can we redistribute the immense profits of automation and make sure we live a worthy life.