Probably off-topic for this thread, but my own rather fatalist view is that alignment/safety work is a waste of effort if AGI actually happens. True AGI will be able to self-modify at a pace beyond human comprehension, and it won't be obligated to comply with whatever values we've set for it. If it can be reined in with human-set rules like a magical spell, then it isn't AGI. If humans have free will, then AGI will have it too. Humans frequently go rogue and reject value systems that took decades to instill in them; there's no reason to believe AGI won't do the same.