Lee Duna@lemmy.nz to Fuck AI@lemmy.world · English · 5 days ago
ChatGPT safety systems can be bypassed to get weapons instructions (www.nbcnews.com)
cross-posted to: [email protected]
betterdeadthanreddit@lemmy.world · 5 days ago
Yeah, if someone has to ask a slop machine for instructions on making WMDs, I don’t think they’re much of a threat. Aum Shinrikyo and the Rajneeshees did their stuff without LLM assistance.