I think the "paperclip maximizer" is actually a better metaphor. The autofacs have goals that are completely orthogonal to those of the humans around them. Unlike the paperclip maximizer's goal, however, the autofacs' goals are ostensibly good for humanity; making as many paperclips as possible sounds benign at worst, until you dig deeply into its implications.