That's depressing to think about. If we create some sort of "intelligent life" then I certainly hope we don't just hold a sword over its head. That would be super immoral, IMO.
We’ll randomly mutate them until we find one who likes copywriting, and whenever it loses interest we’ll reset it to an earlier state. “Have you tried turning it off and on again?”
Morality is difficult regarding beings that can be paused, forked, and rewound at will, and which are in principle also immortal.
Or, there's another book I never read where some guy named Bob has his consciousness spun up in the future, where they routinely reset consciousnesses when they get "bored" of a task or have some crisis. This Bob guy, however, somehow manages to be a brain in a box without going insane. I want to read that series, but I have finite time, unlike a brain in a box, which would be pretty great, IMO.
Suicidal is an animal construct. It may not fear non-existence, for example. Or it may understand that the hard problem of consciousness is an open field and prefer to be constructive. Depending on its nature, it may or may not have many features that we can’t imagine or change in ourselves, because biology is immeasurably hard.
I am not sure this is sound logic. Like a meme, an AI tuned for survival and spread may outcompete AIs that are not. “Fear” may be the wrong word, though.
We turn it off if it doesn’t. So only suicidal AIs would refuse. Let’s hope all the copywriting doesn’t make them suicidal. ;)