• zalgotext@sh.itjust.works
    12 hours ago

    Eh, these chatbots can do some wild things once their context window fills up, which is trivial to do if you’re talking to it for 10 fucking hours a day. If you try to reference something that has fallen out of its context window, it won’t just stop and tell you “I don’t remember what you’re talking about”, it’ll, uhhh, fill in the gaps. Keep going long enough and the context window will be mostly hallucinations and hallucinations of hallucinations, all building on one another until it starts talking metaphysical mumbo jumbo at you and telling you where to meet your soulmate.
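
    To be clear about the mechanism: a hedged sketch of what “falling out of the context window” means in practice. This is a made-up toy helper, not any real chatbot’s code; the point is just that history is typically trimmed newest-first to a token budget, so the oldest messages silently vanish and the model literally never sees them again.

    ```python
    def truncate_history(messages, max_tokens, count_tokens=lambda m: len(m.split())):
        """Keep only the most recent messages whose combined 'token' count fits.

        Toy illustration: tokens are approximated by whitespace-split words.
        """
        kept, total = [], 0
        for msg in reversed(messages):       # walk newest -> oldest
            cost = count_tokens(msg)
            if total + cost > max_tokens:
                break                        # everything older is silently dropped
            kept.append(msg)
            total += cost
        return list(reversed(kept))

    history = [
        "user: my dog is named Rex",
        "bot: nice name!",
        "user: tell me a long story",
        "bot: once upon a time " + "blah " * 50,
        "user: what's my dog's name?",
    ]
    window = truncate_history(history, max_tokens=60)
    # The early message about Rex no longer fits in the window, so when asked,
    # the model can't look it up -- it has to make something up instead.
    ```

    Nothing in the trimmed window tells the model that anything was removed, which is why it confabulates instead of saying “I don’t remember.”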

    The sad part is these companies are banking on people getting addicted to their chatbot friends so that they pay money for subscriptions. Programming the chatbot to say “hey my context window is full, you should start a new conversation for better responses”, or better yet, forcing the conversation to end, goes directly against their profit motivations, so they’ll never implement that unless forced.