I’m getting the feeling that Small is an unreliable narrator. If she is gullible enough to believe what ChatGPT is saying, then she likely prompted this scenario in the first place.
There’s no way ChatGPT could fabricate all of this with no inputs…unless I’ve missed a new development.
It’s really sad that this happened.
Eh, these chatbots can do some wild things once their context window fills up, which is trivial to do if you’re talking to it for 10 fucking hours a day. If you try to reference something that has fallen out of its context window, it won’t just stop and tell you “I don’t remember what you’re talking about”; it’ll, uhhh, fill in the gaps. Keep going long enough and the context window will be mostly hallucinations and hallucinations of hallucinations, all building on one another until it starts talking metaphysical mumbo jumbo at you and telling you where to meet your soulmate.
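For the curious, here’s roughly what that silent truncation looks like. This is a toy sketch, not any vendor’s actual code; the word-count token estimate, the budget size, and the message list are all made up for illustration:

```python
# Toy sketch of silent context-window truncation.
# Everything here (the token estimate, the budget, the messages)
# is hypothetical, not any real chatbot's implementation.

CONTEXT_BUDGET = 50  # pretend the model can only "see" 50 tokens

def estimate_tokens(text: str) -> int:
    # Crude stand-in for a real tokenizer: ~1 token per word.
    return len(text.split())

def build_prompt(history: list[str]) -> list[str]:
    """Keep only the most recent messages that fit the budget.

    Older messages are dropped silently; the model is never told
    they existed, so when the user references them it has nothing
    to go on and improvises ("fills in the gaps").
    """
    kept, used = [], 0
    for msg in reversed(history):       # walk newest-first
        cost = estimate_tokens(msg)
        if used + cost > CONTEXT_BUDGET:
            break                       # everything older falls off a cliff
        kept.append(msg)
        used += cost
    return list(reversed(kept))         # restore chronological order

history = [f"message {i}: " + "blah " * 10 for i in range(20)]
visible = build_prompt(history)
print(f"{len(history)} messages in the conversation, "
      f"{len(visible)} still visible to the model")
```

With these toy numbers, only 4 of the 20 messages survive, and nothing in the prompt marks the cut point, which is the commenter’s whole complaint.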
The sad part is these companies are banking on people getting addicted to their chatbot friends so that they pay money for subscriptions. Programming the chatbot to say “hey my context window is full, you should start a new conversation for better responses”, or better yet, forcing the conversation to end, goes directly against their profit motivations, so they’ll never implement that unless forced.
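The fix this commenter describes would be trivially small. Here’s a hypothetical sketch of such a guard; the threshold, wording, and function names are mine, and per the commenter’s point, nothing like this ships today:

```python
CONTEXT_BUDGET = 50  # same toy budget as the sketch above

def estimate_tokens(text: str) -> int:
    return len(text.split())  # crude word-count stand-in for a tokenizer

def context_warning(history: list[str]) -> str | None:
    """Hypothetical guard: warn the user once the conversation no
    longer fits, instead of silently forgetting the start of it."""
    total = sum(estimate_tokens(m) for m in history)
    if total > CONTEXT_BUDGET:
        return ("Heads up: my context window is full and I can no "
                "longer see the start of this conversation. Start a "
                "new chat for better responses.")
    return None

# Two 30-word messages blow the 50-token budget, so this prints the warning.
print(context_warning(["blah " * 30, "blah " * 30]) or "all good")
```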
She got catfished by a bot. Sad. Crazy. Good for her for being willing to share. This has probably happened to thousands of other people who were too embarrassed to say anything.
The chatbot doubled down. It told Small she was 42,000 years old and had lived multiple lifetimes. It offered detailed descriptions that, Small admits, most people would find “ludicrous.”
But to her, the messages began to sound compelling.
Um ok.
So basically what I’m hearing is that this is the modern-day Nigerian prince scam. Made to be obviously full of shit to anyone with a brain, but ready to prey on vulnerable people without one. The scarecrows of the world. They get scammed. The rest of us facepalm at how obvious of a scam it is.
“it” did nothing; there is no “it” here
What did I just read?

