those who used ChatGPT for “personal” reasons — like discussing emotions and memories — were less emotionally dependent upon it than those who used it for “non-personal” reasons, like brainstorming or asking for advice.
Interesting
It’s a roundabout way of saying that people who explicitly seek out AI for emotional support quickly realize it’s bad at it.
Is that what the article is saying though?
OK, yes, there’s a difference between inference and implication, but it’s OK to do either in a social media setting.
Especially in colleges, I can see a lot of them washing out of school or even out of the field.
Whether you’re using ChatGPT text or voice, asking it personal questions, or just brainstorming for work, it seems that the longer you use the chatbot, the more likely you are to become emotionally dependent upon it.
So, what I get from this is that people who use CGPT a lot become emotionally attached, especially if they are lonely hearts.
Can they write more than one paragraph before publishing? Lol
Attachment causes misery in general. Also in relationships to humans.