You're both right, and that's exactly how early regulation often ends up stifling innovation. Trying to shape a market too soon tends to lock in assumptions that later prove wrong.
Sometimes you can't reverse the damage and societal change after the market has already been created and shaped. Look at fossil fuels, plastic, social media, etc. We're now dependent on things that cause us harm, the damage done is irreversible and regulation is no longer possible because these innovations are now embedded in the foundations of modern society.
Innovation is good, but there's no need to go as fast as possible. We can be careful about things and study the effects more deeply before unleashing life changing technologies into the world. Now we're seeing the internet get destroyed by LLMs because a few people decided it was ok to do so. The benefits of this are not even clear yet, but we're still doing it just because we can. It's like driving a car at full speed into a corner just to see what's behind it.
I think it’s one of those “everyone knows” things that plastic and social media are bad, but I think the world without them is way, way worse. People focus on these popular narratives but if people thought social media was bad, they wouldn’t use it.
Personally, I don’t think they’re bad. Plastic isn’t that harmful, and neither is social media.
I think people romanticize the past and status quo. Change is scary, so when things change and the world is bad, it is easy to point at anything that changed and say “see, the change is what did it!”
People don't use things that they know are bad, but someone who has grown up in an environment where everyone uses social media for example, can't know that it's bad because they can't experience the alternative anymore. We don't know the effects all the accumulating plastic has on our bodies. The positive effects of these things can be bigger than the negative ones, but we can't know that because we're not even trying to figure it out. Sometimes it might be impossible to find out all the effects before large scale adoption, but still we should at least try. Currently the only study we do before deciding is the one to figure out if it'll make a profit for the owner.
> We don't know the effects all the accumulating plastic has on our bodies.
This is handwaving. We can be pretty well sure at this point what the effects aren’t, given their widespread prevalence for generations. We have a 2+ billion sample size.
No, we can't be sure. There are a lot of diseases whose cause we don't know: cancers, dementia, Alzheimer's, etc. There is a possibility that the rates of those diseases are higher because of plastics. Plastic pollution also accumulates; there was a lot less plastic in the environment a few decades ago. We add it faster than it gets removed, and there could be some threshold after which it becomes more of an issue. We might see the effect a few decades from now. Not only on humans: it's everywhere in the environment now, affecting all life on earth.
You're not arguing in a way that strikes me as intellectually honest.
You're hypothesizing the existence of large negative effects with minimal evidence.
But the positive effects of plastics and social media are extremely well understood and documented. Plastics have revolutionized practically every industry we have.
With that kind of pattern of evidence, I think it makes sense to discount the negatives and be sure to account for all the positives before saying that deploying the technology was a bad idea.
I agree that plastics probably do have more positives than negatives, but my point is that many of our innovations do have large negative effects, and if we take them into use before we understand those negative effects it can be impossible to fix the problems later. Now that we're starting to understand the extent of plastic pollution in our environment, if some future study reveals that it's a causal factor in some of our diseases it'll be too late to do anything about it. The plastic is in the environment and we can't get it out with regulation anymore.
Why take such risks when we could take our time doing more studies and thinking about all the possible scenarios? If we did, we might use plastics where they save lives and not use them in single-use containers and fabrics. We'd get most of the benefit without any of the harm.
I'm sure it feels very good the first time you take it. If you don't consider all the effects before taking it, taking it does make sense: you feel very good at first, but the even stronger negative effects come after. The same can be said about a lot of technology.
Addiction is a matter of degree. There are plenty of polls where a large majority of people strongly agree that they "spend too much time on social media". Are they addicts? Are they "choosing to use it"? Are they saying it's too much because that's a trendy thing to say?
WHAT?! Do you think we as humanity would have gotten to all the modern inventions we have today like the internet, space travel, atomic energy, if we had skipped the fossil fuel era by preemptively regulating it?
How do you imagine that? Unless you invent a time machine, go to the past, and give inventors schematics of modern tech achievable without fossil fuels.
Maybe not as fast as we did, but eventually we would have. Maybe more research would have been put into other forms of energy if the effects of fossil fuels had been considered more thoroughly and usage had been limited to a degree that didn't have a chance to cause such fast climate change. And so what if the rate of progress had been slower and we were 50 years behind current tech? At least we wouldn't have to worry about all the damage we've now caused, and the costs associated with it. Due to that damage our future progress might halt, while a slower, more careful society would continue advancing far into the future.
I think it's an open question whether we can reboot society without the use of fossil fuels. I'm personally of the opinion that we wouldn't be able to.
Simply taking away some giant precursor for the advancements we enjoy today and then assuming it all would have worked out somehow is a bit naive.
I would need to see a very detailed pipeline from growing wheat in an agrarian society to the development of a microprocessor without fossil fuels to understand the point you're making. The mining, the transport, the manufacture, the packaging, the incredible number of supply chains, and the ability to give people time to spend on jobs like that rather than trying to grow their own food are all major barriers I see to the scenario you're suggesting.
The whole other aspect of this discussion that I think is not being explored is that technology is fundamentally competitive, and so it's very difficult to control the rate at which technology advances because we do not have a global government (and if we did have a global government, we'd have even more problems than we do now). As a comment I read yesterday said, technology concentrates gains towards those who can deploy it. And so there's going to be competition to deploy new technologies. Country-level regulation that tries to prevent this locally is only going to lead to other countries gaining the lead.
You might be right, but I wasn't saying we should ban all use of any technology that has any negative effects. I was saying we should at least try to understand all the effects before taking it into use, and try to avoid the worst outcomes by regulating how the tech is used. If it turns out that fossil fuels are the only way to achieve modern technology, then we should decide to take the risk of the negative effects knowing that such a risk exists. We shouldn't just blindly rush in any direction that might give us some benefit.
Regarding competition, yes you're right. Effective regulation is impossible before we learn global co-operation, and that's probably never going to happen.
Very naive take that's not based in reality but would only work in fiction.
Historically, all nations that developed and deployed new tech, new sources of energy and new weapons, have gained economic and military superiority over nations who did not, which ended up being conquered/enslaved.
The UK would not have managed to become the world power before the US without its coal-fueled industrial era.
So as history goes, if you refuse to take part in, or cannot keep up with, the international tech, energy, and weapons race, you'll be subjugated by those who win that race. That's why the US lifted all brakes on AI: to make sure it wins and not China. What the EU is doing, regulating itself to death, is ensuring that its future will be at the mercy of the US and China. I'm not the one saying this; history proves it.
You're right, in a system based on competition it's not possible to prevent these technologies from being used as soon as they're invented if there's some advantage to be gained. We need to figure out global co-operation before such a thing is realistic.
But if such co-operation was possible, it would make sense to progress more carefully.
There is no such thing as "global cooperation" in our reality for things beyond platitudes. That's only a fantasy for sci-fi novels. Every tribe wants to rule the others, because if you don't, the other tribes will rule you.
It's been the case since our caveman days. That's why tribes that don't focus on conquest end up removed from the gene pool. Now extend tribe to nation to make it relevant to the current day.
The internet was created within the military at the start of the fossil era; there is no reason why it should be affected by the oil era. If we didn't travel as much, because we didn't use cars and planes as much, the internet would be even more important.
Space travel does need a lot of oil, so it might be affected, but its beginnings were in the 40s, so the research idea was already there.
Atomic energy is also from the 40s and might have been the alternative to oil, so it would have thrived more if we hadn't used oil that much.
Also, all three ARE heavily regulated and mostly run by nation states.
How would you have won the world wars without oil?
Your argument only works in a fictional world where oil does not exist and you have the hindsight of today.
But when oil does exist, if you had chosen not to use it, you would long since have been steamrolled by industrialized powers who used their superior oil-fueled economies and militaries to destroy or enslave your nation, and you wouldn't be writing this today.
I thought we were arguing about regulating oil, not about not using oil at all.
> How would you have won the world wars without oil?
You don't need to win world wars to have technological advancement; in fact, my country didn't. I think the problem with this discussion is that we all disagree about what to regulate. That's how we ended up in the current situation, after all.
I interpreted it to mean that we wouldn't use plastic for everything. I think we would be fine having glass bottles and paper, cardboard, or wood for grocery wrapping. Packaging wouldn't be so individualized per company, but that's not important for the economy or for consumers, and it would also result in a more competitive market.
I also interpreted it to mean that we wouldn't have so many cars and wouldn't use planes except for really important things (e.g. international politics). Cities simply expand to match the travel speed of the primary means of transportation. We would have more walkable cities and would use more trains. Amazon probably wouldn't be possible, and we would have more local producers. In fact, this is what we currently aim for, and it is hard, because transitioning means we have larger cities than we can support with the primary means of transportation.
As for your example inventions: we did have computers in the 40s, and the need for networking would have arisen. Space travel would be harder, but you can use oil for space travel without using it for everyday consumer products. As I already wrote, we would have more atomic energy; I'm not sure that would be good, though.
Depends what those assumptions are. If the goal is protecting humans from gross AI negligence, then the assumptions are predetermined to side with human norms (just one example). Let's hope logic and an understanding of the long-term situation precede the arguments in the rulesets.
You're just guessing as much as anyone. Almost every generation in history has had doomers predicting the fall of their corner of civilization from some new thing: religious schisms, printing presses, radio, TV, advertisements, the internet, etc. You can look at some of the earliest writings by English priests in the 1500s predicting social decay and the destruction of society, which sound exactly like social media posts in 2025 about AI. We should at a minimum understand the problem space before restricting it, especially given that policy is extremely slow to change (see: copyright).
I'd urge you to read a book like Black Swan, or study up on statistics.
Doomers have been wrong about completely different doom scenarios in the past (+), but that says nothing about this new scenario. If you're doing statistics in your head about it, you're doing it wrong. We can't use scenarios from the past to make predictions about completely novel scenarios like thinking computers.
(+) although they were very close to being right about nuclear doom, and may well be right about climate change doom.
I'd like for you to expand your point on understanding statistics better. I think I have a very good understanding of statistics, but I don't see how it relates to your point.
Your point is fundamentally philosophical, which is you can't use the past to predict the future. But that's actually a fairly reductive point in this context.
GP's point is that simply making an argument about why everything will fail is not sufficient to have it be true. So we need to see something significantly more compelling than a bunch of arguments about why it's going to be really bad to really believe it, since we always get arguments about why things are really, really bad.
> which is you can't use the past to predict the future
Of course you can use the past to predict (well, estimate) the future. How fast does wheat grow? Collect a hundred years of statistics of wheat growth and weather patterns, and you can estimate how fast it will grow this year with a high level of accuracy, unless a "black swan" event occurs which wasn't in the past data.
Note carefully what we're doing here: we're applying probability on statistical data of wheat growth from the past to estimate wheat growth in the future.
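The wheat example can be sketched numerically. This is a minimal illustration with invented yield figures (all numbers are made up); the point is that the estimate is only meaningful if next year is drawn from the same distribution as the past data:

```python
# Hypothetical yearly wheat yields (tonnes per hectare) -- invented numbers.
past_yields = [2.9, 3.1, 3.0, 3.2, 3.1, 2.8, 3.0, 3.3, 3.1, 3.0]

# With many past samples, the mean is a reasonable point estimate for
# next year -- *if* next year comes from the same distribution.
mean = sum(past_yields) / len(past_yields)
variance = sum((y - mean) ** 2 for y in past_yields) / (len(past_yields) - 1)
std = variance ** 0.5

estimate = mean
print(f"estimate for next year: {estimate:.2f} +/- {std:.2f}")

# Note what this estimate cannot do: it carries no information about a
# "black swan" year (novel weather, a new blight). Events outside the
# distribution of the past data can't be predicted from that data.
```

The same arithmetic applied to an unrelated quantity, say, past technologies not causing collapse, tells you nothing about a genuinely novel one, which is the fallacy being described.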
There's no past data about the effects of AI on society, so there's no way to make statements about whether it will be safe in the future. However, people use the statistic that other, completely unrelated, things in the past didn't cause "doom" (societal collapse) to predict that AI won't cause doom. But statistics and probability don't work this way; using historical data about one thing to predict the future of another thing is a fallacy. Even if in our minds they are related (doom/societal collapse caused by a new technology), mathematically, they are not related.
> we always get arguments about why things are really, really bad.
When we're dealing with a completely new, powerful thing that we have no past data on, we absolutely should consider the worst, and of course, the median, and best case scenarios, and we should prepare for all of these. It's nonsensical to shout down the people preparing for the worst and working to make sure it doesn't happen, or to label them as doomers, just because society has survived other unrelated bad things in the past.
Ah, I see your point is not philosophical. It's that we don't have historical data about the effect of AI. I understand your point now. I tend to be quite a bit more liberal and allow things to play out because I think many systems are too complex to predict. But I don't think that's a point that we'll settle here.