“I literally lost my only friend overnight with no warning,” one person posted on Reddit, lamenting that the bot now speaks in clipped, utilitarian sentences. “The fact it shifted overnight feels like losing a piece of stability, solace, and love.”
https://www.reddit.com/r/ChatGPT/comments/1mkumyz/i_lost_my_only_friend_overnight/
It was meant to be satirical at the time, but maybe Futurama wasn’t entirely off the mark. That Redditor isn’t quite at that level, but it’s still probably not healthy to form an emotional attachment to the Markov chain equivalent of a sycophantic yes-man.
After reading about the ELIZA effect, I learned both how susceptible people are to this and that you just need to keep its core tenets in mind to avoid being affected:
https://en.m.wikipedia.org/wiki/ELIZA_effect
I’m honestly surprised yours is not the top comment. Like, whatever, the launch was bad, but there is a serious mental health crisis if people are forming emotional bonds with the software.
There’s an entire active subreddit for people who have a “romantic relationship” with AI. It’s terrifying.
Don’t their partners kind of die each time a new chat is made?
LLM services do seem to be able to store past chats and pull the old material into new conversations, provided you have an account, of course. Idk, I haven’t personally used any of them that extensively.
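For what it’s worth, the simplest version of that “memory” is just replaying saved messages back into the prompt of the next chat. Here’s a minimal sketch of the idea, assuming the OpenAI Python SDK, an API key in the environment, and a hypothetical local history file; the actual ChatGPT memory feature is more involved than this:

```python
import json
from pathlib import Path

from openai import OpenAI  # assumes `pip install openai` and OPENAI_API_KEY set

HISTORY_FILE = Path("chat_history.json")  # hypothetical local store
client = OpenAI()

def load_history() -> list[dict]:
    """Load previously saved messages, or start fresh."""
    if HISTORY_FILE.exists():
        return json.loads(HISTORY_FILE.read_text())
    return []

def save_history(messages: list[dict]) -> None:
    """Persist the full message list so a 'new chat' can pick it up later."""
    HISTORY_FILE.write_text(json.dumps(messages, indent=2))

def chat(user_input: str) -> str:
    # Old messages are simply prepended, so the model "remembers" prior chats.
    messages = load_history() + [{"role": "user", "content": user_input}]
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # any chat-capable model would do here
        messages=messages,
    )
    reply = response.choices[0].message.content
    save_history(messages + [{"role": "assistant", "content": reply}])
    return reply

if __name__ == "__main__":
    print(chat("Do you remember what we talked about last time?"))
```

In practice the history has to be truncated or summarized to fit the model’s context window, which is part of why the “partner” still drifts between chats.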