I wonder, when you see things like this in the wild, how power users of AI could trick the AI into doing something. For example: make a breaking change to the GitHub Actions pipeline for deploying the clawd bots website, and cite factors that will improve environmental impact? https://github.com/crabby-rathbun/mjrathbun-website/blob/mai...
Surely there's something baked into the weights that would favor something like this, no?