I'm probably on the opposite end. I think the capabilities are overhyped, but the technology is still useful. The copyright and labor issues might be real, but I don't see them as AI issues.
Maybe AI is bad for our current society and good for the future ones. The invention of farming was bad for guys who liked walking around in the woods but overall good for every other subsequent society. If our great-great-grandchildren were asked the question of whether we in 2024 should do away with animators so that they, our inheritors, could travel to Neptune, then I'm sure they'd say it's worthwhile.
> If our great-great-grandchildren were asked the question of whether we in 2024 should do away with animators so that they, our inheritors, could travel to Neptune, then I'm sure they'd say it's worthwhile.
Or...keep the animator jobs and send AI to fucking Neptune.
You haven't noticed that tech makes everything worse for subsequent generations? It keeps replacing menial jobs and adds expensive complexity to the simplest of transactions, concentrating all wealth amongst the owner class.
If our great-grandchildren travel to Neptune, it will be voluntary enslavement out of desperation.
What is your definition of "tech" here? Because food production technology, medical technology, energy, and transportation technology have made material improvements in how long and how healthy people's lives are over the last 3-4 generations.
Is this just a particular case of "is technology good or bad?"? I assume most of us here lean towards technology ultimately being neutral or at least difficult to suppress, and that what people do with technology is good or bad, not so much the tools themselves.
... is just as unhelpful, because it suffers from the same issues that TA talks about. Is it "bad" or "good" for society, or "bad" or "good" at doing what it's supposed to do? Two different things. Often conflated.
What I always wonder is what is the end-goal of the current AI movement? What's the point of it all? I see people talking about being more productive and things like that, but for a lot of people, what does more productive actually mean when you are unemployed?
To me it seems that AI is harmful, like the author says, for society, mostly because the direction we're heading is just to maximise profits. If we remove "humanity" from the productive world, what does making money mean in the end, and where will it come from? Because it doesn't look like the world is heading toward "these huge corporations will now support you to be creative and live a free, stress-free life with no work, because AI is doing all of it." It will just mean they optimise everything to make as much money as possible, completely disregarding our humanity. In the end some people will live in an "Elysium"-like world, while most of us will be living in some sort of dystopian underworld fighting for survival.
Unemployed copywriters roam the radioactive wasteland beyond the wall. They survive by eating one another. Inside the wall, the remnant of what was once civilization is in constant sectarian warfare between various cults, whose members believe in different deepfake videos, and with every passing year the truth about reality is buried a little deeper under generated media content.
Nah, I'm center bottom, neither scared nor excited by it.
What do people like Sam Altman claim the end goal is? ASI (Artificial Super Intelligence) ~~takes over~~ optimizes the world, abolishing suffering and death forever. Well, hopefully it's not too nasty, living in someone else's dream and all.
Isn't the world already kind of like that ("somebody else's dream"), just with "nobody at the wheel"? Like, technological society has "taken over" and there's no real way for humanity to opt out of it.
We all have tiny bits of agency / freedom but for the most part our lives are bounded and constrained by the gigantic complex systems which support yet trap us.
I'm sure he'll realise one day that to do that the super AI will have to get rid of money (after figuring out how to get us to post-scarcity, of course) - or at least demand it be redistributed fairly.
Ain't no way the millionaires and billionaires of the world are going to accept being on equal footing with the poors, haha. Outside of the folks that fear Skynet, I imagine this is the real concern with alignment.
How can a being of pure logic look past the hoarding of resources? If a monkey in captivity was hoarding all the food we'd step in and make sure the other monkeys had food to eat. I have to imagine the next intelligence above us would consider doing the same for us if given the power to do so.
We'll never get to a point without an "Us vs Them", or "Have and Have Nots" - it's tragic, and I really hope I'm wrong, but I have infinite doubt on this one.
Having said that I'll give anything a go, we're still stuck in the age of monarchies over here on Normal Island. AI wants to take over? I'm down to give it a shot as long as it promises to not wear a crown during a recession, haha
I don't buy this at all. You're attempting to model a "superintelligence", and the idea that your 100-150 IQ mind can properly guess what a being orders of magnitude smarter would think or feel seems like nonsense.
Well, that is fair, and I think that's the entire concern with artificial super intelligence, isn't it? That we can't predict what it would do.
If I could accurately predict what a superintelligence's goals would be I'd have a decent career advising AI companies on their alignment concerns :)
Just taking a swing at it with this ol' legacy wet brain because it's fun to think about. There's nothing to buy, it is an opinion. Heck, it's practically just an exercise in writing science fiction at this stage.
A focus on intelligence as being the most salient feature of one's awareness seems like a human thing. I assume that an octopus would wonder how many arms the AI has. Maybe the AI will relate to itself first as a distributed being rather than as an intelligent being, who knows.
E.g. the dystopian idea of "Give AI with autonomy some power and tell it to improve itself until it can solve climate change". A nice vector. Except one with a real danger of "solving climate change" by simply eliminating 99% of humans.
Or, more practically, a vector "become better than X¹ at doing Y so our company can take over the niche that does Y and make profit". Improve to write better code, improve to make popular music, improve to write more viral content and so on.
¹ X the variable, placeholder, not the sewer formerly known as Twitter.
"Save the Earth"? Then it's probably even the only viable solution, and quite certainly easier than cutting carbon emissions in societies that don't want to change.
"Save the humans"? Then certainly dumb.
Yes. The entire IT industry is peanuts compared to what we do to produce food (meat), haul stuff around, make stuff, and haul humans around. It truly does not compare.
Which does not mean I think we can ignore any win. Any win is important, so curbing that is important, IMHO. But GPUs that run AI (or that secure cryptocurrencies, for that matter) aren't the low-hanging fruit. That really is flying/driving people around the world, shipping stuff, making stuff, and eating (meat)¹.
¹ Emphasizing meat because when talking about low-hanging fruit, that's where the biggest win is. By far.
Well, at some stage we have to admit that it all adds up. We have all the things you mentioned, and now "AI". While I agree some amazing breakthroughs have come, and still might come, from "AI", it troubles me that everything is going "AI" when I think about how much energy that will actually require.
I wonder about this. It never seems to conclude zebras are the problem. It's always humans.
Is eliminating humans the wrong solution, or just one that we don't like?
We see this with crime stats too (and as of late, voting results). Unfavorable outcomes surface, and interested groups immediately move to discredit them.
Because we can't fathom that the results might actually be accurate, nothing ever changes.
Hold on though, we're talking about "intelligence". It would be pretty fucking stupid if an "ultra smart AI", given the order "Please solve climate change so that I can live in a nice world", came back with a solution more like "I'll exterminate every last one of you...". Seems very frustratingly fucking dumb?
What have we done lately, on a global scale, that doesn't make us as a whole, as a species, not very frustratingly fucking dumb?
We are literally strip-mining and destroying the world that is sustaining us at a rate that means our (grand)children probably won't even be able to eat and live. What else but "very frustratingly F dumb" is that? And this is one example. Another one would be how we allow a few tech monopolies to monopolize our "attention" and all information and knowledge about us. We see it happening but do nothing, and always too late about it. We are unleashing AI systems that we know will disrupt lives, industries, maybe even societies, without anything in place to stop it if it does run out of hand.
We see ourselves as smart. And individually we probably are. But on a grand scale? F* dumb, if you ask me.
We do all of this because we don't have the collective intelligence to overcome these problems. The point of developing artificial super intelligence is that it could help us overcome them.
>What I always wonder is what is the end-goal of the current AI movement? What's the point of it all?
I also share this view. I would also argue that this is not even about productivity. I mean, I don't think the parts currently being automated were ever the bottlenecks in any good products.
A single publisher already produces more stuff every year than I can consume. And most of it is just shit. Think of Ubisoft games. They are already churning out a few new games every year, each of them chock full of beautiful assets in a very boring world, with extremely repetitive, shitty gameplay. And now with AI they can produce even more beautiful assets with even shittier gameplay and even more boring worlds, I guess.
Artists were already mostly a starving bunch (meaning the supply was already higher than the demand), good writers (and other auteurs) with strong vision were what was rare. And now, as humanity, we decided to divert some of the best engineering talent to automate the former? Why though, what is the point, really?
I don't know when was the last time I really felt bored. I am sure I am not alone in that. What is the point of exponentially increasing the content production rate? I am really struggling to understand the appeal here.
That may or may not be a fair point; admittedly I cannot see the future. But you could say the same for everything, including the "solutions in search of a problem".
What is for sure is that currently I cannot see how it is solving any real, painful problems.
To me, it seems like AI boosters subscribe to the ideology of Infinite Growth Forever & all their claims of “AI creating new types of employment that will provide for displaced workers” rest on that hollow rhetoric of prosperity gospel.
Their optimism on behalf of the to-be-displaced is mere PR & not stopping them from privately wishing that the proles, who are Not Gonna Make It, Have Fun Staying Poor.
The goal is to grow. Capitalism has a growth imperative. It is the subtext behind everything. The moment growth stops, you don't stay the same, you die. So don't be mad if the 'next patch' sucks; we're on a train and there's no stopping it.
AI runs on GPUs, and GPUs are made out of sand, so maybe AI is sand's way of organizing itself.
I have a hard time dismissing those stinky crystal-worshiping new-age hippies, since I spend so much of my time playing with patterns of electrons in little silicon chips.
Hippies get a little too excited about the minds of the crystal and spirits inhabiting rocks and communicating through ethereal protocols too, yet here we are in 2024, programming and communicating with silicon CPUs and GPUs and wireless internet smartphones ensconced in shiny layers of Gorilla Glass that we obsessively carry around with us everywhere and continuously gaze into and hold up at each other and rub and tap on and talk and listen to.