So this dude wants AI to land somewhere even worse than the mess that is vibe coding: tell the AI what you want to do, and it hands you a button it thinks will do it. "Trust me. Click this. It'll be great."
Even if it did try to do the right thing, the sheer risk of granting an AI access to do whatever it thinks is correct on your behalf is scary.
Yeah, this is a great big no.