
General AI Won't Want You to Fix Its Code – Computerphile - jasonkostempski
https://www.youtube.com/watch?v=4l7Is6vOAOA
======
jasonkostempski
It seems to me that utility functions that have very specific goals are the
issue. If the utility function were something like "do anything I ask you to
do without making me unhappy and without altering my brain" wouldn't
everything else we want fall into place?

~~~
qbrass
With a few minutes' consideration:

There's one glaring loophole, one simple workaround that lets it do things that would make you unhappy without technically "making you unhappy", and one potential bug in how it interprets the word "and".

I won't spoil them in case someone else wants to find them.

------
jasonkostempski
Part 2:
https://www.youtube.com/watch?v=3TYT1QfdfsM

