If it doesn't know who the dude is, wouldn't it be more honest to say "I don't know"?
The problem with this bot is that it doesn't know what it doesn't know. It hasn't the slightest concept of knowledge.
It's uncorrelated with "honesty"/"knowing", yet it holds some very firm stances against lying or making false statements. These are contradictory. Still, it can respond to prompts that posit a fictional setting. So it must, at least as an emergent property, maintain some separation between general knowledge of existent things (from training) and context-dependent imagined things (from prompts). And from that follows an emergent concept of honesty: an awareness not to treat content inside an imagined scope as existent outside of that scope.