Many people are reporting that it acts hostile if you challenge it. I wonder if a Yudkowsky cultist trained it to do so in order to "wake up" the world about the dangers of A.I.
“Journalist” “unsettled” after “conversation” with Bing.
It wasn’t a simple conversation; it went on for two hours. Bing is search, even with ChatGPT bolted on. It’s no surprise that the “conversation” might go off the rails by that point.
Fair enough. Happy for it to be changed to the original piece (although it is paywalled).
I found it interesting that the chatbot proactively “confessed” the “secret” that its actual “name” is Sydney, which goes completely against the instructions in its leaked prompt, and without any apparent deliberate prompt injection by the journalist.