
I'm not worried about actual AGI: infinitely powerful, not bound to the whims of any person or conglomerate. I see human atrocities as born out of weakness, so I'm not worried about something that would gain nothing from torturing or destroying us. That's assuming it can do and think everything we can, and doesn't need us as slaves. E.g. I like sparrows a lot. I don't understand them, and I wouldn't want them in my room, but I like seeing them do stuff on the periphery of my life. I can imagine AGI looking at us with a similarly friendly eye.

But that won't happen; whatever grows will grow out of the diseased now. Today Trump says, "I'm a very instinctual person, but my instinct turns out to be right. Hey, look, in the meantime, I guess I can't be doing so badly, because I'm President, and you're not." Tomorrow's Trump will have infinite power over anyone at the press of a button, and no uncomfortable questions to even give non-answers to. They'll hand their power over to others who really want it, people who by definition will also be dysfunctional.

As black and white as it may be, I think Erich Fromm's writing on biophilia and necrophilia applies here, and necrophilia will win out: things staying the same, people staying as obedient as they are. No fate about it, just cowardice.



Well, I think that's a reasonable thought, but in my opinion it's not really describing AGI. Rather, it's describing something near-AGI that can be controlled.

That's why I am vociferously against the "Friendly AI" movement: fundamentally, humans aren't friendly. And that doesn't mean friendly in terms of agreeable or altruistic; "Friendly" as used in the FAI sense means "does not reject or override human values." In effect, the FAI movement wants to determine how to build AGI without its own goal system, so that it would always be subservient to humans. That is only marginally a human trait. We pride ourselves as a species on not being beholden to, or enslaved by, someone else's ethos. To try to extend that subservience to human-level machines is ethically wrong, and would just create unenlightened super soldiers.

Could a sufficiently independent AGI decide to wipe out all humans? Maybe, but in that case it would have come to that conclusion, and taken the steps necessary to achieve it (harder than AGI itself), on its own. I don't find the paperclip maximizer or gray goo scenario a plausible counterargument.


We pride ourselves on being unbeholden, but it's mostly self-deception. Take a look at this, and realize that even in the West, we aren't that far removed in the way our power structures operate.

https://aeon.co/essays/this-is-what-slavery-looks-like-today...


If I had only the two choices, I would absolutely choose unconstrained, actual AGI over something controllable.

But that's because, the way things are going, I see mostly doom and gloom anyway; controlled AI will increase that, while uncontrollable AI is at least a wildcard. I'm not proud to feel that way, but I do. I have little hope for the human resistance against other humans that has been necessary for so long and just isn't forthcoming -- even AGI, or humanity spreading so far across the galaxy that some pockets get cut off from the MCP, seems more likely. But that's still the lame option; the honorable one is to get our act together, then go to space and create AI. As long as I live I will "work" on that; I certainly will not go gently into the night.

Unless that "paradigm shift" happens, technology will continue to serve capital, and unless you can make the "jump" in total seclusion, so nobody knows what you're doing until it's too late, there will always be people with guns to have a word with you. Still godspeed but that's how I see it.


Technology inventors and project leaders can become the majority owners of capital. We already see the shift: 5 of the 10 richest people in the world built their wealth from tech, and 7 of the 10 largest public companies in the world by market capitalization are tech companies.

https://en.wikipedia.org/wiki/List_of_public_corporations_by...

The shift is still ongoing. Old(er) capital still has much control over the world's economy, but as technological development accelerates and becomes more valuable, the balance could tilt further toward this tech-based capital.

I am not saying they are saints, but at least many of these self-made tech billionaires have pledged to give much of their wealth to philanthropy instead of leaving it to their heirs.


unless you can make the "jump" in total seclusion

Not sure if this is possible, but I think we can get pretty close.

there will always be people with guns to have a word with you

Yes, well, it helps to already be one of those people. I spent 12 years in military intelligence and remain a reserve adviser to the Joint Chiefs and SECDEF on emerging technology.



