• ameancow@lemmy.world
    16 hours ago

    Welcome to the late 2020s. It’s only going to get weirder.

    To be clear, the LLM in this story did not actually “want” a robot body. It doesn’t “want” anything; it’s not a thinking entity like you or me (assuming you’re real).

    The guy fed it a ton of crazy shit, and he got that crazy shit amplified back at him by the world’s best association machine, which crafted detailed, fleshed-out narratives from every inadvertent prompt he fed into it. People are very bad at understanding how these things work even in the best circumstances, so if you’re already unbalanced or have deep emotional or mental-health problems, an LLM can be incredibly dangerous for you.