That is what I meant; I just don't see what the paperclip maximiser has to do with it. As far as I understand it, the primary idea behind that particular thought experiment is that a misaligned AI leads to AGI ruin, even when given simple goals.
The scenario we're talking about doesn't even contain a misaligned AI. It contains friendly AI (the best-case scenario), which still drops all current human economic value to zero. The contrived scenario has every job except the boss's replaced with AI. You propose hiring operators to increase the company's productivity. I say this doesn't make sense.
Do you agree up to this point? If so, what does the paperclip maximiser have to do with anything? If not, what did I misunderstand?