Hacker News

There are approximately zero economic indicators showing this.



"Economic indicators" are entirely irrelevant for that discussion. This isn't going to be a gradual change either. At some point, GPT-n is going to be available with super-human capabilities in all tasks that matter, and then it's simply over, from one day to the next. Nobody will continue paying people to do things that AI can do better, faster, and more reliably, at lower cost.


One day we will eventually have general human-equivalent AI, at least for the vast majority of work tasks. Sure, but that's as much a premise for a science fiction story as a prediction about the future.

We are absolutely nowhere near knowing how to even start building such a thing. Chat bots, language models, and image generators are fun tools that look amazing to people who don't understand how they work, but they're extremely rudimentary compared to real intelligence.

I'll make a counter-prediction. All the low-hanging fruit in language model development has been picked. Like all technologies, there's a steep part of the S-curve of development, and that's where we are now, but you can't extrapolate that to infinity. We'll soon hit the top of the curve and it will level off, and the inherent limitations of these systems will become a severe obstacle to further major advances. They will become powerful, useful tools that may even be transformative in some activities, but they won't turn out to be a significant step towards general AI. An important step, maybe, but not a tipping point.


No, that's not how jobs work. Comparative advantage means it's worth paying people to do a job even if you're better at it than they are, because you have more important things to do.
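To make the comparative advantage point concrete, here's a toy calculation with made-up productivity numbers (not from the thread): even when one party is absolutely better at both tasks, specializing by opportunity cost produces more total output than each party splitting their time.

```python
# Hypothetical output per day for two workers and two tasks.
# "You" are absolutely better at both, yet hiring the assistant
# still pays, because your time is scarce.
you       = {"reports": 10, "spreadsheets": 10}
assistant = {"reports": 2,  "spreadsheets": 8}

# Opportunity cost of one spreadsheet, measured in reports given up.
cost_you       = you["reports"] / you["spreadsheets"]              # 1.0
cost_assistant = assistant["reports"] / assistant["spreadsheets"]  # 0.25

# The assistant gives up fewer reports per spreadsheet, so they
# hold the comparative advantage in spreadsheets.
assert cost_assistant < cost_you

# One day, each side specializing by comparative advantage:
specialized = you["reports"] + assistant["spreadsheets"]  # 10 + 8 = 18

# Versus each splitting the day in half across both tasks:
split = (you["reports"] + you["spreadsheets"]) / 2 \
      + (assistant["reports"] + assistant["spreadsheets"]) / 2  # 10 + 5 = 15

assert specialized > split
print(specialized, split)  # 18 15.0
```

The same arithmetic is why "the AI is better at everything" doesn't by itself imply nobody gets paid; what matters is the opportunity cost of the scarce resource.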

Hiring AIs to do something is extremely expensive. You're basically setting a warehouse of GPUs on fire.

Anyway, if that were true, total factor productivity would be exploding, but it's actually kinda underperforming. (And automation almost always causes increased employment.)


Yeah but have you tried dealing with people? Just like horses replaced cars, for lots of reasons, an http endpoint, powered by AI, that could do the work of a person, would replace so many jobs. Or ATMs for example.


> Just like horses replaced cars, for lots of reasons

Other way round, right?

Humans (labor) are different from horses (capital) because: 1. they actively participate in work, i.e., they don't just literally do what you tell them; 2. they actually signed up to work, whereas horses don't care; and 3. if you give them money, they'll also become your customers. Though I don't know if that last one is a major factor for employers, even if there is that Henry Ford anecdote.

ATMs are a good example here because there are more bank tellers now than before ATMs were invented. (see Jevons' paradox)


Lol oops yeah I wrote that backwards. Too late to edit now, ah well.

There are some jobs where labor needs to care. Most tech jobs, for example. But there are lots of jobs, especially temp ones, that are about throwing as many bodies as you can afford at a problem, where you don't ask questions or try to do it smarter. So point 1 is actually a detriment in those kinds of jobs.

To point 3, Henry Ford aside, if businesses really wanted employees to be able to afford their goods, they'd stop offshoring jobs!

ATMs put bank tellers out of work. There do happen to be more bank tellers now than before, because there are more bank customers needing more bank services, but my bank only needs X tellers at a time vs. X+1 or 2 or 3, and it doesn't hire any for 24-hour service. It's a bit hard to see, because the number of tellers is higher now than before, but the question is how many more tellers there would be without said machines.


Maybe in training, but presumably you buy the trained AI service or whatever and run it on CPUs. Even AWS machines with GPUs top out at roughly $25 an hour. That's the low end for a desk worker.


I'll admit my last comment is a bit off. Turns out ChatGPT is running on GPUs.


It is, but especially high-memory ones, and presumably it's loaded onto a bunch of them, since it supports simultaneous queries.


True, but surely the cost of multiple machines is amortized by simultaneous queries?
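Roughly, yes. A back-of-the-envelope sketch with entirely made-up numbers (the hourly rate, concurrency, and latency are assumptions, not figures from the thread) shows how batching simultaneous queries drives the per-query cost far below the hourly machine cost:

```python
# Assumed figures for illustration only.
hourly_cost = 25.0        # $/hour for a GPU instance
concurrent_queries = 16   # queries served simultaneously per machine
seconds_per_query = 5     # average latency per query

# Throughput: how many queries one machine completes in an hour.
queries_per_hour = concurrent_queries * 3600 / seconds_per_query  # 11520.0

# Amortized cost per query.
cost_per_query = hourly_cost / queries_per_hour
print(f"${cost_per_query:.4f} per query")  # $0.0022 per query
```

Under these assumptions, the "warehouse of GPUs on fire" works out to a fraction of a cent per query, which is why the comparison to a $25/hour desk worker is misleading if each machine serves many users at once.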


You're predicting something that has never happened before by invoking a magic AI that can do any task, without specifying the real-world limitations that might come up.





