A fun thought experiment is describing modern living to someone in the early Georgian era in an uncharitable way. It would be relatively easy to sell it as a godless industrial landscape: a city dweller could feasibly go days without seeing a tree or even leaving their apartment, the threat of annihilation looms over everyone's heads, various major armies can reach out and do what they like in most parts of the world, and authoritarian ideologies pose a constant risk, spreading like wildfire.
The future is always scary. The present is pretty scary. The past was much less comfortable.
You are right. But I think it's fair to say that industrial progress did have some consequences that could easily have been prevented, mostly at the beginning.
But I do not like this alarmist view either. It's usually useless and just makes people feel more endangered, which is not a good prerequisite for rational debate. I would worry less about the technologies themselves and more about their human overlords.
The article is surprisingly positive given the headline. On the plus side we may have:
>trillions of super happy people who basically live forever
and on the minus:
>maybe some kind of "own goal" or "disaster" brought about by tech companies — like mass unemployment in a short time
It sounds like the upside outweighs the downside there.
By the way, I'm not sure about "moral obligation to colonize as much of the universe as possible", but it's quite likely some will want to colonize, like Musk with Mars. There are some logistical issues there, though.
> I have a foreboding of an America in my children's or grandchildren's time - [...] when awesome technological powers are in the hands of a very few, and no one representing the public interest can even grasp the issues; when the people have lost the ability to set their own agendas or knowledgeably question those in authority; when, clutching our crystals and nervously consulting our horoscopes, our critical faculties in decline, unable to distinguish between what feels good and what's true, we slide, almost without noticing, back into superstition and darkness. [0]
Oswald Spengler also wrote about this in "The Decline of the West", in particular about the rise of Caesar types at particular moments in the lifetime of civilizations, considered as organisms that grow, flourish, and decline. On the other hand, Spengler may have been overly enthusiastic in fitting history to suit his narrative. He was not well received by some, particularly those optimists who believe such things as "rain follows the plow", or whatever that goes by these days. The graph goes up and to the right; therefore, Singularity. These optimists may likewise be overly enthusiastic in fitting reality to a particular narrative.
That's why it's ironic that some techno-optimists have embraced Nick Land's philosophy, given that his work fundamentally challenges their core assumptions:
> Converging upon terrestrial meltdown singularity, phase-out culture accelerates through its digitech-heated adaptive landscape, passing through compression thresholds normed to an intensive logistic curve: 1500, 1756, 1884, 1948, 1980, 1996, 2004, 2008, 2010, 2011 ...
> Nothing human makes it out of the near-future. [0]
I have seen that quote dragged out lately as some sort of prophecy of 2025…but it really speaks to the exact opposite in my opinion. Smartphones and the internet have given the layperson awesome technology literally in their hands. The entire election of 2024 was driven by the people specifically questioning those in authority…and just what exactly is the “public interest”? Is it what the public demands? Or is it what some elite few say it should be?
Seems to me whatever time that Sagan envisioned, it certainly isn’t this time we are in right now.
> Smartphones and the internet have given the layperson awesome technology literally in their hands.
The quote is "awesome technological powers", which, I think, is different from having a technological device. There are the data brokers, social media giants, and governments that have the technological power to manipulate the masses.
> The entire election of 2024 was driven by the people specifically questioning those in authority
The quote: "when the people have lost the ability to set their own agendas or knowledgeably question those in authority". Are they serving their own agenda or the agenda of others through the media they consume? Are they knowledgeable in their questioning?
> Are they serving their own agenda or the agenda of others through the media they consume? Are they knowledgeable in their questioning?
Are you? I bet you think you are knowledgeable in your questioning, fully in tune with reality, serving your own agenda and above “their agenda”. I’d estimate that you probably are. Seems to me that most people are capable of understanding what is important to them so I don’t buy into the mass manipulation thing.
Meanwhile, go on Odysee or IRC and people are 3D printing guns with a day's unskilled-laborer wages worth of tools and another day's wages of materials, while discussing anarchist theory or materials science. Some of them are even using such technology to fight against ethnic cleansing in Myanmar.
Can't stop the signal. Work on your own local reality, with what you have, where you are, with those around you without obsessing about whatever the gullible hordes fall for next, which is probably the same things they have since inception of human culture.
How much of this was good faith questioning and how much of it was that the public has been propagandized by platforms whose algorithms favor outrage and anger?
> I have seen that quote dragged out lately as some sort of prophecy of 2025…
2025 has nothing to do with my reason for citing this quote.
> Smartphones and the internet have given the layperson awesome technology literally in their hands. The entire election of 2024 was driven by the people specifically questioning those in authority…
Neither of these opinions is relevant either. The point of my citing the Sagan quote was to illuminate the danger we, as a society, find ourselves in, which it aptly describes. The influence social media offerings have on public opinion is both well documented and extreme.
> ... and just what exactly is the “public interest”?
The applicable quote is "no one representing the public interest", which obviously identifies elected government officials.
> Is it what the public demands? Or is it what some elite few say it should be?
Not directly aside from voting and absolutely not, in that order. Suggesting the latter is disingenuous at best.
> Seems to me whatever time that Sagan envisioned, it certainly isn’t this time we are in right now.
Seems to me we disagree, as one could easily argue a handful of corporations (Twitter, Facebook, TikTok, et al.) have undue influence over the information billions of people consume, be it of the "mis" or "dis" variety.
> The entire election of 2024 was driven by the people specifically questioning those in authority
You really think this?! Then should we expect the same in 2028, but at a much larger scale, now that we have an actual authority that cares for neither the constitution nor the rule of law: the left winning all 50 states?
I think not. The "questioning of authority" is what people in power make people without power believe is actually happening... :)
I 100% agree. Anyone can use the Sagan quote to justify why they lost. The quote is just used as a fancy way of calling people with different views stupid.
"Society is failing because people are dumber, and that's why they voted for X, when a smart person like me would have voted for Y!"
This is such a reductive view of what happened in 2024. Oligarchs openly supported a political candidate that has attacked experts time and again on things that are provably true because it's inconvenient to their narrative. We know Musk spent hundreds of millions of dollars supporting one side because it was in his interest. Mark Zuckerberg loves that side because they're not going to regulate his objectively harmful platform.
Since this administration entered office, it has attacked agencies and organizations that do things like prepare for climate change, educate our children, provide essential healthcare to tens of millions of Americans, do basic scientific research, and protect consumers from financial fraud. Even if you get rid of these organizations, climate disasters will still happen (and insurance companies will still pull out of states like Florida and California), kids still need basic education to function in a high-technology society, people are still getting sick, China is pulling ahead on public research, and fraudsters are still stealing from the American public.
This administration is bad because it's attacking the career public servants and agencies that are a net positive for our society. We get back more than the investment here in innumerable ways. It is stupid to try and dismantle the parts of government that work and work well. So shut the hell up with this "if they were smart they would have voted for Y" bullshit, as if that was ever a good faith alternative to the liberal program the Dems ran on.
I pay about 50% in taxes. These agencies are a net positive at an unjustifiably high cost. Most tax dollars are wasted/grifted.
Have you ever worked at a government contractor? Those that have will laugh at your statements here. $80 screws, $250 staplers, the list goes on. I've had n>1 friends that flipped their political alignment after seeing the sheer wastage here.
There is so much wastage and grift going on, and someone needs to come in, shock the system, and reduce it to the bare minimum.
The correct way to do this is through congressional oversight, not through unelected and unaccountable billionaires illegally cutting programs they personally don't like.
Can someone give an alternative vision of the next 100-200 years that doesn't include AGI/ASI? I don't see it. At least one that isn't a nuclear wasteland.
I have thought about this every day for 32 years and don't know of any possible alternative future other than one run alongside, or by, Artificial Superhuman Systems. I just can't look into the future and expect the current state of machines to plateau here and stop. In fact, that's been my life's work since I was 13. [1] I even discussed this at the AGI conference in Laval in 2014 with Turing Award winner Yoshua Bengio. [2]
The Overcoming Bias blog, which spun off LessWrong, was the best place for these discussions until it became a cult of personality around 2013. The SIAI, and all the mess that it became, at least offered a chance to discuss it. There's no serious place to discuss it now.
AGI discussion went from something nerds were pointed and laughed at for, to an imminent existential threat: "To the average person, this all sounds like the plot of a sci-fi novel. There might be a tendency to dismiss a techno-utopia as 'billionaire boys and their toys', but Torres says the danger comes from the amount of power and influence these billionaires have to make policy and spend money."
There have been at least a dozen core books, from Bostrom, Kelly, Kurzweil, Shannon, and others, that have discussed this concept in precise and exacting detail.
Arguably, the whole reason to make computers has been to make AGI/ASI, going back to the first Mechanical Turk and so on.
So I'm not really seeing a future without ASI. Despite a lot of talk inside computing, for the entire history of computing, it's happening now, and everyone is surprised for some reason.
AGI/ASI as we think of it in science fiction is not guaranteed. I think its definition will erode due to marketing until it's a label on a product, but less limited than what we think it could be today. It'll probably be a capable system but not able to do science on its own.
I don't think we'll throw nukes. AI will have taken over work that involves text, and 95% of us will be left doing manual labor because a robotic workforce still isn't feasible. The US and Canada will start moving their cities north as the planet warms, and we will have to start new farms up there.
To be fair, we don't know the future. Solving scientific problems has eluded researchers and their billions of dollars for centuries. Genuine AGI will be on par with solving P=NP or curing cancer. It's a profound problem that gets to the very root of what we are.
100 years from now, AGI could be viewed as a proposition as improbable as flying cars, teleportation, ghosts, or faster-than-light travel.
I agree we'll have AGI/ASI in the next 100-200 years, probably the next 10.
I'm not sure about "Humans will not exist in a million years from today" in [1] though. We've been quite good at preserving interesting species.
Re. places to discuss it - x/twitter? There are serious people on there though it's not easy to chat. r/singularity is quite jolly but not a very serious place.
Eh. I can rather trivially imagine a future where AGI development gets interrupted by a series of domestic terrorist attacks in response to some critical mass of economic disruption being hit. Folks get cranky when they're having difficulty feeding their kids, and given how comprehensively armed the US populace is that's bound to get spicy.
I'm not sure I agree there. You don't need to hit anything like AGI to brick the bulk of the economy. 60% of the US works service industry jobs that could be replaced with a kiosk and automation that requires zero problem-solving ability. I also don't think AGI is possible, but that's a separate discussion.
> Can someone give an alternative vision of the next 100-200 years that doesn't include AGI/ASI? I don't see it. At least one that isn't a nuclear wasteland.
Eh? Without artificial super intelligence the future is a nuclear wasteland? How does that follow?
Why wouldn't an artificial super intelligence conclude that humans are the problem and nuke the lot?
> Can someone give an alternative vision of the next 100-200 years that doesn't include AGI/ASI? I don't see it. At least one that isn't a nuclear wasteland.
I don't think I can. I can only see a post-apocalyptic landscape similar to Mad Max, Terminator, or the Fallout series.
If we look at the actions of these AI leaders, rather than what they say in front of the camera, it does not look like a utopia for us.
If Zuckerberg, Sam Altman and Larry Ellison are building bunkers, I guess that they have a backup plan when there is societal collapse when AGI accelerates into total job displacement.
It's a utopia for billionaires that have power, influence, and lots of spending power, unaffected by AGI. They mention that a theoretical UBI system will solve the issue of job displacement, which is ill thought out and has never worked sustainably at a large scale.
Of course there will always be people that will try to slow it down, but the ultimate endgame is full job displacement + humanoid robots taking over manual labor with no alternatives for those lost jobs.
Or perhaps "The bro-ligarchs have a vision for the new Trump term":
> All of these men see themselves as the heroes or protagonists in their own sci-fi saga. And a key part of being a “technological superman” — or ubermensch, as the German philosopher Friedrich Nietzsche would say — is that you’re above the law. Common-sense morality doesn’t apply to you because you’re a superior being on a superior mission. Thiel, it should be noted, is a big Nietzsche fan, though his is an extremely selective reading of the philosopher’s work.
> The ubermensch ideology helps explain the broligarchs’ disturbing gender politics. “The ‘bro’ part of broligarch is not incidental to this — it’s built on this idea that not only are these guys superior, they are superior because they’re guys,” Harrington said.
[…]
> The so-called network state is “a fancy name for tech authoritarianism,” journalist Gil Duran, who has spent the past year reporting on these building projects, told me. “The idea is to build power over the long term by controlling money, politics, technology, and land.”