• ℍ𝕂-𝟞𝟝@sopuli.xyz

    You can download instructions on how to make bioweapons from the NCBI, a US government website. You can download The Anarchist Cookbook.

    These “guardrails can be circumvented” articles are so awful because of two conflicting reasons:

    • On the one hand, it fuels the AI hype in an “AI is going to destroy the world, it’s inevitable, it’s going to kill us all bro” way, while it’s actually just a better but very expensive chatbot. It furthers the myth that it will be a fundamentally transformative technology beyond students generating essays and lonely people going mad.
    • On the other hand, it shows the thinking behind the owners of this technology, since these “safety systems” don’t in general exist on other systems LLMs are supposed to supplant. You can look up Dementia Don memes even on Google Search, but somehow they want to make sure ChatGPT is able to completely gate off certain information. And this gets normalised with these articles, as the “news” is that “oh, the online narrative control machine can be bypassed, you can look up how to make dangerous weapons, FEAR THIS”.
    • Shiggles@sh.itjust.works

      There are a couple good declassified CIA guides to sabotage and insurgency that have much better information than the anarchist’s cookbook.

      • blave@lemmy.world

        Yeah, The Anarchist Cookbook was a huge letdown when I first read it. A lot of the stuff in there is inaccurate. If anything, it’s a good way to blow yourself up. Or get arrested, anyway.

        Pretty much everything in there can be found elsewhere, in much more contemporary sources with better information. I think The Anarchist Cookbook was written in 1971?

    • betterdeadthanreddit@lemmy.world

      Yeah, if someone has to ask a slop machine for instructions on making WMDs, I don’t think they’re much of a threat. Aum Shinrikyo and the Rajneeshees did their stuff without LLM assistance.

  • Grimy@lemmy.world

    ‘Safety systems’ are simply censorship, pushed to the public as a good thing so that it’s easier to kill and ban open-source solutions when the time comes.

    • lumen@feddit.nl

      You think? I’m convinced the guardrails LLM companies impose only serve to boost these companies’ reputations.