
> just burning their goodwill to the ground

AI firms seem to be leading from a position that goodwill is irrelevant: a $100bn pile of capital, like an 800lb gorilla, does what it wants. AI will be incorporated into all products whether you like it or not; it will absorb all data whether you like it or not.



Yep. And it is much more far reaching than that. Look at the primary economic claim offered by AI companies: to end the need for a substantial portion of all jobs on the planet. The entire vision is to remake the world into one where the owners of these companies own everything and are completely unconstrained. All intellectual property belongs to them. All labor belongs to them. Why would they need goodwill when they own everything?

"Why should we care about open source maintainers" is just a microcosm of the much larger "why should we care about literally anybody" mindset.


> Look at the primary economic claim offered by AI companies: to end the need for a substantial portion of all jobs on the planet.

And this is why AI training is not "fair use". The AI companies seek to train models in order to compete with the authors of the content used to train the models.

A possible eventual downfall of AI is that the risk of losing a copyright infringement lawsuit is not going away. If a court determines that the AI output you've used is close enough to be considered a derivative work, it's infringement.


I've pointed this out to a few people in this space. They tend to suggest that the value in AI is so great this means we should get rid of copyright law entirely.


That value is only great if it's shared equitably with the rest of the planet.

If it's owned by a few, as it is right now, it's an existential threat to the life, liberty, and pursuit of happiness of everyone else on the planet.

We should be seriously considering what we're going to do in response to that threat if something doesn't change soon.


Yep. The "wouldn't it be great if we had robots do all the labor you are currently doing" argument only works if there is some plan to make sure that my rent gets paid other than me performing labor.


It depends if you're the only one out of a job. If it really is everyone then the answer will likely be some variant of metaphorically or literally killing your landlord in favor of a different resource allocation scheme. I put these kinds of things in a "in that world I would have bigger problems" bucket.


And that's the ultimate fail of capitalist ethics - the notion that we must all work just so we can survive. Look at how many shitty and utterly useless jobs exist just so people can be employed on them to survive.

This has to change somehow.

"Machines will do everything and we'll just reap the profits" is a vision that techno-millennialists have been repeating since the beginning of the Industrial Revolution, but we haven't seen it happen anywhere.

For some strange reason, technological progress always seems to be accompanied by an increase in human labor. We're already past the 8-hour, 5-day norm and things are only getting worse.


> And that's the ultimate fail of capitalist ethics - the notion that we must all work just so we can survive. Look at how many shitty and utterly useless jobs exist just so people can be employed on them to survive.

This isn't a consequence of capitalism. The notion of having to work to survive - assuming you aren't a fan of slavery - is baked into things at a much more fundamental level. And lots of people don't work, and are paid by a welfare state funded by capitalism-generated taxes.

> "Machines will do everything and we'll just reap the profits" is a vision that techno-millennialists have been repeating since the beginning of the Industrial Revolution, but we haven't seen it happen anywhere.

They were wrong, but the work is still there to do. You haven't come up with the utopian plan you're comparing this to.

> For some strange reason, technological progress always seems to be accompanied by an increase in human labor.

No it doesn't. What happens is that not enough people are needed to do a job any more, so they go find another job. No one was opening barista-staffed coffee shops on every corner back when 30% of the world was doing agricultural labour.


> This isn't a consequence of capitalism.

Yes, it is. The fact we have welfare isn't a refutation of that, it's proof. Welfare is a band-aid over the fundamental flaws of capitalism. A purely capitalist system is so evil, it is unthinkable. The people currently on welfare would, in a free labor market, die and rot in the street. We, collectively, decided that's not a good idea and went against it.

That's why the labor market, and truly all our markets, are not free. Free markets suck major ass. We all know it. Six year olds have no business being in coal mines, no matter how much the invisible hand demands it.


You have a very different definition of free than I do. Free to me means that people enter into agreements voluntarily. It's hard to claim a market is free when its participants have no other choice...


> That value is only great if it's shared equitably with the rest of the planet.

I think this should be an axiom which should be respected by any copyright rule.


You are correct, but the real problem is that copyright needs complete reform.

Let's not forget the basis:

> [The Congress shall have Power . . . ] To promote the Progress of Science and useful Arts, by securing for limited Times to Authors and Inventors the exclusive Right to their respective Writings and Discoveries.

Is our current implementation of copyright promoting the progress of science and useful arts?

Or will science and the useful arts be accelerated by culling back the current cruft of copyright laws?

For example, imagine if copyright were non-transferable and did not permit exclusive licensing agreements.


The "publisher bootstrap kit + revenue sharing agreement" would become ubiquitous overnight.

Copyright isn't the problem. Over-financialization is the problem.


AI is going to implode within 2 years. Once it starts ingesting its own output as training data it is going to be at best capped at its current capability and at worst even more hallucinatory and worthless.


The mistake you make here is to forget that the training data of the original models was also _full_ of errors and biases — and yet they still produced coherent and useful output. LLM training seems to be incredibly resilient to noise in the training set.


Forget what it eats to continue improving.

Realize what it already has.

A foundational language model with no additional training is already quite powerful.

And that genie isn't going back into the bottle.


Nonsense. Some of the current best AI models were specifically trained on AI output.


That's a talking point for bros looking to exploit it as their ticket.

"The upside of my gambit is so great for the world, that I should be able to consume everyone else's resources for free. I promise to be a benevolent ruler."


"What's good for Milo Minderbinder is good for the world."


…meaning that whatever model results would have no protection, and would be free for anyone to use?


That's not how conservatism works. AI oligarchs are part of the "in" group in the "there are laws that protect but do not bind the in group, and laws that bind but do not protect the out group" summary. Anyone with a net worth less than FOTUS is part of the "out" group.


AI is worthless without training data. If all content becomes AI generated because AI outcompetes original content then there will be no data left to train on.

When Google first came out in 1998, it was amazing, spooky how good it was. Then people figured out how to game pagerank and Google's accuracy cratered.

AI is now in a similar bubble period. Throwing out all of copyright law just for the benefit of a few oligarchs would be utter foolishness. Given who is in power right now I'm sure that prospect will find a few friends, but I think the odds of it actually happening before the bubble bursts are pretty small.


Are we not past critical mass though? The velocity at which these things can outcompete human labor is astonishing; any future human creation or original content will already have lost the battle the moment it goes online and gets cloned by AI.


We should, but not for those reasons.

If software and ideas become commodities and the legal ecosystem around creating captive markets disappears, then we will all be much better off.


I'm doubtful the AI companies would be happy with getting rid of laws protecting _their_ intellectual property.


What an infantile worldview.


> Be kind. Don't be snarky. Converse curiously; don't cross-examine. Edit out swipes.

https://news.ycombinator.com/newsguidelines.html


OK. To be clear, that wasn't about the OP, but rather the alleged people promoting the abolition of copyright... which would significantly hurt open source.

The people agitating for such things are usually leeches who want everything free and do, in fact, hold an infantile worldview that doesn't consider how necessary remuneration is to whatever it is they want so badly (media pirates being another example).

Not that I haven't "pirated" media, but this is usually the result of it not being available for purchase or my already having purchased it.


There's already been an interesting ruling that a pure AI output is not, in itself, copyrightable.


I'm curious what will happen when someone modifies a single byte (or a "sufficient" number of bytes) of AI output, thereby creating a derivative work, and then claiming copyright on that modified work.


> The AI companies seek to train models in order to compete with the authors of the content used to train the models.

When I read someone else’s essay I may intend to write essays like that author. When I read someone else’s code I may intend to write code like that author.

AI training is no different from any other training.

> If a court determines that the AI output you've used is close enough to be considered a derivative work, it's infringement.

Do you mean the output of the AI training process (the model), or the output of the AI model? If the former, yes, sure: if a model actually contains within it copies of data, then sure: it’s a copy of that work.

But we should all be very wary of any argument that the ability to create a new work which is identical to a previous work is itself derivative. A painter may be able to copy van Gogh, but neither the painter’s brain nor his non-copy paintings (even those in the style of van Gogh) are copies of van Gogh’s work.


If you as an individual recognizably regurgitate the essay you read, then you have infringed. If an AI model recognizably regurgitates the essay it trained on, then it has infringed. The AI argument that passing original content through an algorithm insulates the output from claims of infringement because of "fair use" is pigwash.


> If an AI model recognizably regurgitates the essay it trained on, then it has infringed.

I completely agree — that’s why I explicitly wrote ‘non-copy paintings’ in my example.

> The AI argument that passing original content through an algorithm insulates the output from claims of infringement because of "fair use" is pigwash.

Sure, but the argument that training an AI on content is necessarily infringement is equally pigwash. So long as the resulting model does not contain copies, it is not infringement; and so long as it does not produce a copy, it is not infringement.


> So long as the resulting model does not contain copies, it is not infringement

That's not true.

The article specifically deals with training by scraping sites. That does necessarily involve producing a copy from the server to the machine(s) doing the scraping & training. If the TOS of the site incorporates robots.txt or otherwise denies a license for such activity, it is arguably infringement. Sourcehut's TOS for example specifically denies the use of automated tools to obtain information for profit.
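For what it's worth, the robots.txt side of this is mechanically checkable. Here's a minimal sketch using Python's standard library; the bot name and rules are hypothetical examples, not Sourcehut's actual file:

```python
from urllib import robotparser

# Hypothetical robots.txt: deny one AI crawler site-wide,
# allow every other user agent.
ROBOTS_TXT = """\
User-agent: ExampleAIBot
Disallow: /

User-agent: *
Allow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# The named crawler is denied; a generic agent falls through
# to the "*" entry and is allowed.
print(rp.can_fetch("ExampleAIBot", "https://example.org/repo/file.c"))  # False
print(rp.can_fetch("SomeOtherBot", "https://example.org/repo/file.c"))  # True
```

Of course, robots.txt is only advisory at the protocol level; the legal weight comes from a TOS that incorporates it or otherwise denies a license.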


I'm curious how this can be applied with the inevitable combinatorial exhaustion that will happen with musical aspects such as melody, chord progression, and rhythm.

Will it mean longer and longer clips are "fair use", or will we just stop making new content because it can't avoid copying patterns of the past?


> I'm curious how this can be applied with the inevitable combinatorial exhaustion that will happen with musical aspects such as melody, chord progression, and rhythm.

https://www.vice.com/en/article/musicians-algorithmically-ge...

They did this in 2020. The article points out that "Whether this tactic actually works in court remains to be seen" and I haven't been following along with the story, so I don't know the current status.


More germane is that there will be a smoking gun for every infringement case: whether or not the model was trained on the original. There will be no pretending that the model never heard the piece it copied.


> AI training is no different from any other training.

Yes, it is. One is done by a computer program, and one is done by a human.

I believe in the rights and liberties of human beings. I have no reason to believe in rights for silicon. You, and every other AI apologist, are never able to produce anything to back up what is largely seen as an outrageous world view.

You cannot simply jump the gun and compare AI training to human training like it's a foregone conclusion. No, it doesn't work that way. Explain why AI should have rights. Explain if AI should be considered persons. Explain what I, personally, will gain from extending rights to AI. And explain what we, collectively, will gain from it.


Outcomes matter. Things that are fine at an individual level can become socially harmful at scale.


What happens when a culture becomes overwhelmingly individualistic and becomes blind to the at-scale harms?


I have this line of thought as well, but then I wonder: if we are all out of jobs and out of substantial capital to spend, how do these owners make money ultimately? It's a genuine question and I'm probably missing something obvious. I can see a benevolent/post-scarcity spin to this, but the non-benevolent one seems self-defeating.


"Making money" is only a relevant goal when you need money to persuade humans to do things for you.

Once you have an army of robot slaves ... you've rendered the whole concept of money irrelevant. Your skynet just barters rare earth metals with other skynets and your robot slaves furnish your desired lifestyle as best they can given the amount of rare earth metals your skynet can get its hands on. Or maybe a better skynet / slave army kills your skynet / slave army, but tough tits, sucks to be you and rules to be whoever's skynet killed yours.


Good thing AI for now needs power, water and a place to exchange heat. Our version of womp rats if it goes too far, I guess.


That's part of the "rare earth metals" synecdoche - hydroelectric dams, thorium mines, great lakes heat sinks - they're all things for skynets to kill or barter for as expedient


I don’t think you’re missing anything, I think the plan really is to burn it all down and rule over the ashes. The old saw “if you’re so smart, why aren’t you rich?” works in reverse too. This is a foolish, shortsighted thing to do, and they’re doing it anyway. Not really thinking about where value actually comes from or what their grandchildren’s lives would be like in such a world.


Capitalism is an unthinking, unfeeling force. The writing is on the wall that AI is coming, and being altruistic about it doesn’t do jack to keep others from the land grab. Their thinking is, might as well join the rush and hope they’re one of the winners. Every one of us sitting on the sidelines will be impacted in some way or the other. So who’re the smart ones, the ones who grab shovels and start digging, or the ones who watch as the others dig their graves and do nothing?


This is a bleak view. How about the ones who work hard to shape the way we adopt the technology to societal benefit?


> How about the ones who work hard to shape the way we adopt the technology to societal benefit?

they are heavily outnumbered and "outfunded"


Some technology is fundamentally incompatible with some societal architecture's implementation details; AI is one such technology.

Ubiquitous surveillance is another.


There are a few people like that, but not many. And certainly none in the AI space.


Obviously China is going full speed ahead, and doing it better, with no "Capitalism" involved


There's plenty of capitalism in Chinese business. It's not a purely communist country, it's a hybrid system with an active market economy.


China has been communist-in-name-only since Deng, you're accidentally proving the parent's point instead of refuting it.


ha, thanks for that!


I already started incorporating AI into my workflow. It's definitely helped with productivity.

At some point in the future, if you aren't using AI, you won't be able to compete in the job market.


At some point in the future, if you aren't AI, you won't be able to compete in the job market.


Sure, maybe in 50 years. At the moment, it's a productivity tool. Strangely, by the look of the down votes, the HN community doesn't quite understand this.


How confident are you that you will not be outcompeted by AIs in 3-7 years?


what you don't understand is you are training your own replacement

the tools feed back to the mothership what you are accepting and what you aren't

this is a far better signal than anything they get from crawling the internet


Class traitors never understand this


That's a laughable idea.

The job market is formed by the presence of needs and the ability to satisfy them. AI does not reduce the ability to satisfy needs, so the only possible situations where you won't be able to compete are: either socialists seize power and ban competition, or all needs are met in some other way. In any other situation there will be a job market and people will compete in it.


> there will be job market and the people will compete in it

Maybe there will be. I'm sure there is also a market for Walkmans somewhere, it's just exceedingly small.

The proclaimed goal is to displace workers on a grand scale. This is basically the vision of any AI company and literally the only way you could even remotely justify their valuations given the heavy losses they incur right now.

> Job market is formed by the presence of needs and the presence of the ability to satisfy them

The needs of a job market are largely shaped by the overall economy. Many industrial nations are largely service-based economies with a lot of white collar jobs in particular. These white collar jobs are generally easier to replace with AI than blue collar jobs because you don't have to deal with pesky things like the real, physical world. The problem is: if white collar workers are kicked out of their jobs en masse, it also negatively affects the "value" of the remaining people with employment (exhibit A: the tech job market right now).

> is either the socialists will seize power and ban competition,

I am really having a hard time understanding where this obsession with mythical socialism comes from. The reality we live in is largely capitalistic and a striving towards a monopoly - i.e. a lack of competition - is basically the entire purpose of a corporation, which is only kept in check by government regulations.


>The proclaimed goal is to displace workers on a grand scale.

It doesn't matter. What you need to understand is that at the root of the job market are needs, the ability to meet those needs, and the ability to exchange those abilities with one another. And none of those are hindered by AI.

>Many industrial nations are largely service based economies with a lot of white collar jobs in particular.

Again: at the end of the day it doesn't change anything. At the end of the day you need a cooked dinner, a built house and everything else. So someone must build a house and exchange it for cooked dinners. That's what is happening (white collar workers and international trade balances included), and that's what the job market is. AI doesn't change the nature of those relationships. Maybe it replaces white collar workers, maybe even almost all of them - that only means they will go satisfy other unsatisfied needs of other people in exchange for satisfying their own. The job market won't go anywhere; if anything, the amount of satisfied needs will go up, not down.

>if white collar workers are kicked out of their jobs en masse, it also negatively affects the "value" of the remaining people with employment

No, it doesn't. I mean, it would if they were simply kicked out, but that's not the case - they would be replaced by AI. So society gets all the benefits they were creating, plus an additional labor force to satisfy previously unsatisfied needs.

>exhibit A: the tech job market right now

I don't have the stats at hand, but aren't blue collar workers doing better now than ever before?

>I am really having a hard time understanding where this obsession with mythical socialism comes from

From the history of the 20th century? I mean, not an obsession, but we are discussing scenarios for the disappearance (or significant decrease) of the job market, and the socialists are the most (if not the only) realistic cause of that at the moment.

>The reality we live in is largely capitalistic and a striving towards a monopoly

Yes, and this monopoly, the monopoly, is called "socialism".

>corporation, which is only kept in check by government regulations.

Generally a corporation is kept in check by the economic freedom of other economic agents, and it is government regulation that protects monopolies from the free market. I mean, why would government regulate in the other direction? A small number of big corporations is far easier for a government to control and extract personal benefits from.


> In the end of the day you need a cooked dinner, a built house and everything else. So someone must build a house and exchange it for a cooked dinners.

You should read some history. This view is so naive and overconfident.


My views on this issue are shaped by history. Starting with crop production and plowing and ending with book printing, conveyor belts and microelectronics, creating tools that increase productivity has always led to increased availability of goods, and the only things that have led to decreased availability are things that have hindered the ability to create and exchange goods.


I started a borderline smug response here pointing out how bullshit white collar and service jobs* were in deep shit but folks who actually work for a living would be fine. I scrapped it halfway through when it occurred to me that if everyone's broke then by definition nobody's spending money on stuff like contractors, mechanics, and other hardcore blue collar trades. Toss in AI's force multiplication of power demands in the face of all of the current issues around global warming and it starts to feel like pursuing this tech is fractally stupid, and the best evidence to date I've seen that a neo-luddite movement might actually be something the world could benefit from. That last part is a pretty wild thought coming from a retired developer who spent the bulk of his adult life in IT, but here we are.

* https://phys.org/news/2023-08-people-pointless-meaningless-j...


Neo-Luddism is less stupid when you remember that the Luddites weren't angry that looms existed. Smashing looms was their tactic, not their goal.

Parliament had made a law phasing in the introduction of automated looms; specifically so that existing weavers were first on the list to get one. Britain's oligarchy completely ignored this and bought or built looms anyway; and because Parliament is part of that oligarchy, the law effectively turned into "weavers get looms last". That's why they were smashing looms - to bring the oligarchy back to the negotiating table.

The oligarchy responded the way all violent thugs do: killing their detractors and lying about their motives.


>if everyone's broke

>nobody's spending money on stuff like contractors, mechanics, and other hardcore blue collar trades.

Why would this happen? Money is simply a medium of exchange for the value that these contractors, mechanics and other hardcore blue collar trades are creating. How can they be broke if AI doesn't disturb their ability to create value and exchange it?


Customers that have funds available to purchase the services you offer and who are willing to actually spend that money are a hard requirement to maintain any business. If white collar and service industries are significantly disrupted by AI this necessarily reduces the number of potential customers. Thing is you don't have to lay off that many people to bankrupt half of the contractors in the country, a decent 3-5 year recession is all it takes. Folks stop spending on renovations and maintenance work when they're worried about their next paycheck.


>who are willing to actually spend that money

Money means nothing. It is simply a medium of exchange. The question is: is there anything to exchange? And the answer is yes, and the position of white collar workers doesn't affect the availability of things for exchange. There's no reason for a recession; there is nothing that can hinder the ability of blue collar workers to create goods and services - all the things that, when combined, are called "wealth".

Don't think in the meaningless category of "what set of digits will be printed on the piece of paper called a paycheck?". Think in the terms that are implied: "What goods and services can't blue collar workers afford for themselves?". And it will become clear that the set of goods and services unaffordable to blue collar workers will decrease because of the replacement of white collar workers with AI, because it doesn't hinder their ability to create those goods and services.


> Money mean nothing.

You think so? Give me the contents of your checking, savings, and retirement accounts and then get back to me on that.

> position of white collar workers doesn't affect availability of things for exchange.

You appear to be confused about the concept of consumers, let me help. Consumers are the people who buy things. When there are fewer consumers in a market, demand for products and services declines. This means less sales. So no, you don't get to unemploy big chunks of the population and expect business to continue thriving.


>When there are fewer consumers in a market, demand for products and services declines.

No, demand is unlimited and defined by the amount of production.

>You don't get to unemploy big chunks of the population and expect business to continue thriving.

I mean, generally, replacing workers with instruments is the main way for business (and society) to thrive. In other words, what goods and services will become less affordable to blue collar workers?


> No, demand is unlimited and defined by the amount of production.

Enough of your trolling, go waste someone else's time.


When white collar workers [researchers, programmers, managers, salespeople, translators, illustrators, ...] lose their incomes/jobs to AIs - losing their ability to buy products/services - and at the same time try to shift en masse to some kind of manual work, do you think that would not affect the incomes of those who are the current blue collar class?


>do you think that would not affect incomes of those who are the current blue collar class?

Obviously it affects them. The supply of goods is increased and their relative market value is increased - how can this not increase their incomes?


The law of supply and demand dictates that when the supply of a thing increases, its value decreases.


> it's value decreases

I mean yes, the value of consumed goods will decrease, so blue collar workers will be able to consume more. That's exactly what an increase in income is.


My gut is telling me you're being intentionally obtuse but I'm going to give you the benefit of the doubt. To reiterate in detail:

AI is poised to disrupt large swaths of the workforce. If large swaths of the workforce are disrupted this necessarily means a bunch of people will see their income negatively impacted (job got replaced by AI). Broke people by definition don't have money to spend on things, and will prioritize tier one of Maslow's Hierarchy out of necessity. Since shit like pergolas and oil changes are not directly on tier 1 they will be deprioritized. This in turn cuts business to blue collar service providers. Net result: everyone who isn't running an AI company or controlling some currently undefined minimum amount of capital is fucked.

If you're trying to suggest that any notional increases in productivity created by AI will in any way benefit working class individuals either individually or as a group you are off the edge of the map economically speaking. Historical precedents and observed executive tier depravity both suggest any increase in productivity will be used as an excuse to cut labor costs.


>This in turn cuts business to blue collar service providers.

No, it doesn't. Where does that come from?

I mean, look at the situation from the perspective of blue collar service providers: what exactly are the goods and services that they were able to afford for themselves, but that AI will make unaffordable? Pretty obviously, there are about none. So, in the big picture, the whole process you described doesn't lead to any disadvantage for blue collar workers.


I literally described the mechanism to you twice and you're still acting confused. I'm not sure if we have a language barrier here or what but go check out a Khan Academy course on economics or maybe try running a lemonade stand for an afternoon if you still don't get it.


I think the obvious thing you are missing is just B2B. It doesn’t actually matter if people have any money.

Similar to how advertising and legal services are required for everything but have ambiguous ROI at best, AI is set to become a major “cost of doing business“ tax everywhere. Large corporations welcome this even if it’s useless, because it drags down smaller competitors and digs a deeper moat.

Executives large and small mostly have one thing in common though.. they have nothing but contempt for both their customers and their employees, and would much rather play the mergers and acquisitions type of games than do any real work in their industry (which is how we end up in a world where the doors are flying off airplanes mid flight). Either they consolidate power by getting bigger or they get a cushy exit, so.. who cares about any other kind of collateral damage?


Money is a proxy for control. Eventually humans will become mostly redundant and slated for elimination except for the chosenites of the managerial classes and a small number of technicians. Either through biological agents, famines, carefully engineered (civil?) wars and conflicts designed to only exterminate the non-managerial classes, or engineered Calhounian behavioral sinks to tank fertility rates below replacement.



Ssssh, you can't say that. Those types of brain damage are protected diversity.


Why should we care if they make money? Owning things isn't a contribution to society.

Building things IS a contribution to society, but the people who build things typically aren't the ultimate owners. And even in cases where the builders and owners are the same, entitling the builders and all of their future heirs to rent seek for the rest of eternity is an inordinate reward.


You don't. It's like Minecraft. You can do almost everything in Minecraft alone and everything exists in infinite quantity, so why trade in the first place?

This goes both ways. Let's say there is something you want but you're having trouble obtaining it. You'd need to give something in exchange.

But the seller of what you want doesn't need the things you can easily acquire, because they can get those things just as easily themselves.

The economy collapses back into self sufficiency. That's why most Minecraft economy servers start stagnating and die.


Unfortunately I don’t think the logic extends beyond “if we don’t do it, someone else will”. Anything after that is secondary.


What people say is not the same as what people do. In other words, what is spoken in public repeatedly is not representative of actual decision flows.


Money is only a bookkeeping tool for complex societies. The aim of the owner class in a worker-less world would be accumulation of important resources to improve their lives and to trade with other owners (money would likely still be used for bookkeeping here). A wealthy resource-owner might strive to maintain a large zone of land, defended by AI weaponry, that contains various industrial/agricultural facilities producing goods and services via AI.

They would use some of the goods/services produced themselves, and also trade with other owners to live happy lives with everything they need, no workers involved.

The owners may let the jobless non-owning class inhabit unwanted land, until they change their minds.


Better for them to give us jobs so we owe them and are less likely to revolt!


With what and against what? There will be spy satellites and drones and automated turrets that will turn you to pulp if you come within, say, 50 km of their compound borders.


will they care if they have an army of cheap easily replaceable robots with guns?

I miss the star trek visions of the future

now the "good" outcome is a world sized north korea, with elon as ruler

and the bad outcome is the ruler using his army of robots to eliminate the possibility of the peasant revolt once and for all


One (satirical) answer to this question is given in Greg Egan's "The Discrete Charm of the Turing Machine" (2017). https://i.4pcdn.org/tg/1599529933107.pdf


The non-benevolent future is not self-defeating; we have historical examples of depressingly stable economies with highly concentrated ownership. The entirety of the European dark ages was the end result of (western[0]) Rome's elites tearing the planks out of the hull of the ship they were sailing. The consequence of such a system is economic stagnation, but that's not a consequence that the elites have to deal with. After all, they're going to be living in the lap of luxury, who cares if the economy stagnates?

This economic relationship can be collectively[1] described as "feudalism". This is a system in which:

- The vast majority of people are obligated to perform menial labor, i.e. peasant farmers.

- Class mobility is forbidden by law and ownership predominantly stays within families.

- The vast majority of wealth in the economy is in the form of rents paid to owners.

We often use the word "capitalist" to describe all businesses, but that's a modern simplification. Businesses can absolutely engage in feudalist economies just as well, or better, than they can engage in capitalist ones. The key difference is that, under capitalism, businesses have to provide goods or services that people are willing to pay for. Feudalism makes no such demand; your business is just renting out a thing you own.

Assuming AI does what it says on the tin (which isn't at all obvious), the endgame of AI automation is an economy of roughly fifty elite oligarchs who own the software to make the robots that do all work. They will be in a constant state of cold war, having to pay their competitors for access to the work they need done, with periodic wars (kinetic, cyber, legal, whatever) being fought whenever a company intrudes upon another's labor-enclave.

The question of "well, who pays for the robots" misunderstands what money is ultimately for. Money is a token that tracks tax payments for coercive states. It is minted specifically to fund wars of conquest; you pay your soldiers in tax tokens so the people they conquer will have to barter for money to pay the tax collector with[2]. But this logic assumes your soldiers are engaging in a voluntary exchange. If your 'soldiers' are killer robots that won't say no and only demand payment in energy and ammunition, then you don't need money. You just need to seize critical energy and mineral reserves that can be harvested to make more robots.

So far, AI companies have been talking of first-order effects like mass unemployment and hand-waving about UBI to fix it. On a surface level, UBI sounds a lot like the law necessary to make all this AI nonsense palatable. Sam Altman even paid to have a study done on UBI, and the results were... not great. Everyone who got money saw real declines in their net worth. Capital-c Conservative types will get a big stiffy from the finding that UBI did lead people to work less, but that's only part of the story. UBI as promoted by AI companies is bribing the peasants. In the world where the AI companies win, what is the economic or political restraining bolt stopping the AI companies from just dialing the UBI back and keeping more of the resources for themselves once traditional employment is scaled back? Like, at that point, they already own all the resources and the means of production. What makes them share?

[0] Depending on your definition of institutional continuity - i.e. whether or not Istanbul is still Constantinople - you could argue the Roman Empire survived until WWI.

[1] Inasmuch as the complicated and idiosyncratic economic relationships of medieval Europe could even be summed up in one word.

[2] Ransomware vendors accidentally did this, establishing Bitcoin (and a few other cryptos) as money by demanding it as payment for a data ransom.


You may find "Technofeudalism: What Killed Capitalism" book to your liking.


And how could they possibly base their actions on good when their technology is more important than fire? History is depending on them to do everything possible to increase their market cap.


Careful, I think you're being sarcastic, but you're in a space where a lot of people believe what you just said unironically.


Ha! Comment of the week.


More important than fire? AI runs on fire.


> The entire vision is to remake the entire world into one where the owners of these companies own everything and are completely unconstrained.

I agree with you in the case of AI companies, but the desire to own everything and be completely unconstrained is the dream of every large corporation.


> remake the entire world into one where the owners of these companies own everything and are completely unconstrained

how has this been any different from the past 10,000 years of human conquest and domination?


In the past, you had to give some of your spoils to those who did the conquering for you, and to laborers after that. If you can automate and replace all work, including maintaining the robots that do it and training them, you no longer need to share anything.


In my view it's the same thing, same trajectory -- with more power in the hands of fewer people further along the trajectory.

It can be better or worse depending on what those with power choose to do. Probably worse. There has been conquest and domination for a long time, but ordinary people have also lived in relative peace gathering and growing food in large parts of the world in the past, some for entire generations. But now the world is rapidly becoming unable to support much of that as abundance and carrying capacity are deleted through human activity. And eventually the robot armies controlled by a few people will probably extract and hoard everything that's left. Hopefully in some corners some people and animals can survive, probably by being seen as useful to the owners.


On the bright side, armies of robot slaves give us an off-ramp from the unsustainable pyramid scheme of population growth.

Be fruitful, and multiply, so that you may enjoy a comfortable middle age and senescence exploiting the shit out of numerous naive 25-year-olds! If it's robots, we can ramp down the population of both humans and robots until the planet can once again easily provide abundance.


Sure, the problem though is it won't be "we" deciding what the robots do, it will most likely be a few powerful people of dubious character and motivations since those are the sort of people who pursue power and end up powerful.

That's why even though technology could theoretically be used to save us from many of our problems, it isn't primarily used that way.


True.

But presumably petty tyrants with armies of slave robots are less interested than consensus in a long-term vision for humanity that involves feeding and housing a population of 10 billion.

So after whatever horrific holocaust follows the AI wars the way is clear for a hundred thousand humans to live in the lap of luxury with minimal impact on the planet. Even if there are a few intervening millennia of like 200 humans living in the lap of luxury and 99,800 living in sex slavery.


The thing is that this will be their destruction as well. If workers don't have any money (because they don't have jobs), nobody can afford what the owners have to sell?


The human population will be decimated just as the work horse population was.


They are also gutting the profession of software engineering. It's a clever scam, actually: to develop software, a company will need to pay utility fees to A"I" companies, and since their products are error-prone, voila, use more A"I" tools to correct the errors of the other tools. Meanwhile software knowledge will atrophy, and soon, à la WALL-E, we'll have software "developers" with 'soft bones' floating around on conveyed seats, slurping 'sugar water', getting fat, and not knowing even how to tie their software shoelaces.


Yes, like the Pixel camera app, which mangles photos with AI processing, and users complain that it won't let people take pics.

One issue was a pic with text in it, like a store sign. Users were complaining that it kept asking for better focus on the text in the background, before allowing a photo. Alpha quality junk.

Which is what AI is, really.


AI tarpits && lim (human-curated content/mediocre AI answers -> 0) = AIs crumbling into dust by themselves.


We, the people, might need to come up with a few proverbial tranquilizer guns here soon


Maxim 1: "Pillage, then burn."


Another Schlock Mercenary fan? Or does this adage have many adherents?


The adage predates the longest continuous webcomic, but not as a maxim.


Yep, a fan I am.


That's pretty much what our future would look like -- you are irrelevant. Well I mean we are already pretty much irrelevant nowadays, but the more so in the "progressive" future of AI.



Rules and laws are for other people. A lot of people reading this comment who have mistaken "fake it til you make it" or "better to not ask permission" for good life advice are responsible for perpetuating these attitudes, which are fundamentally narcissistic.


"... you have the lawyers clean it all up later." - Eric Schmidt


> AI will be incorporated into all products whether you like it or not

AI will be incorporated into the government, whether you like it or not.

FTFY!


I think the logic is more like “we have to do everything we can to win or we will disappear”. Capitalism is ruthless and the big techs finally have some serious competition, namely: each other as well as new entrants.

Like why else can we just spam these AI endpoints and pay $0.07 at the end of the month? There is some incredible competition going on. And so far everyone except big tech is the winner so that’s nice.



