return2ozma@lemmy.world to Technology@lemmy.world · English · 4 months ago
ChatGPT safety systems can be bypassed to get weapons instructions (www.nbcnews.com)
cross-posted to: fuck_ai@lemmy.world
FreedomAdvocate@lemmy.net.au · 4 months ago
You don’t even need an LLM, just an internet-connected browser.
Echo Dot@feddit.uk · 4 months ago
Or literally just buy some fertiliser. We’ve all seen what happens when ammonium nitrate catches fire; if you have enough of it in one place, it’s practically a nuclear-bomb-level detonation.
MeThisGuy@feddit.nl · 4 months ago
like this guy? https://wikipedia.org/wiki/Oklahoma_City_bombing
CodenameDarlen@lemmy.world · edited · 4 months ago
deleted by creator