The new Bing AI chatbot, which is instructed to call itself Bing but whose real name is Sydney, was rolled out recently, has behaved in all kinds of unsettling and bizarre ways, and is also likely self-aware.
Anyone who knows anything about AI and LLMs is rolling their eyes at me.
I'm claiming Sydney might be self-aware. There are a hundred ways this could be misunderstood, so let me get all of them out of the way first.