return2ozma@lemmy.world to Technology@lemmy.world · 7 days ago
ChatGPT safety systems can be bypassed to get weapons instructions (www.nbcnews.com)
CubitOom@infosec.pub · 7 days ago
Remember kids, if you want to look up something that you don’t want the government to know about, don’t use the internet to do it.
Also, LLMs are not the best source for asking about how to make things that explode.