Zerush@lemmy.ml to Technology@lemmy.ml · 12 hours ago
Google's AI Deletes User's Entire Hard Drive, Issues Groveling Apology: "I Cannot Express How Sorry I Am" (futurism.com)
cross-posted to: technology@lemmy.zip, fuck_ai@lemmy.world
Scrubbles@poptalk.scrubbles.tech · 3 hours ago
Oh my god, really? Cursor explicitly asks you about each command and could only do this in "yolo" mode. Not having these guardrails is insane.
utopiah@lemmy.ml · 3 hours ago
Well, there are guardrails from what I understood, including:
- executing commands (off by default)
- executing commands without user confirmation (off by default)
which are IMHO reasonable, but if the person this happened to is right, there is no filesystem sandbox, e.g. one limited solely to the project repository.
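The filesystem sandbox utopiah is asking for would amount to a path check in front of every destructive operation: resolve the target and refuse anything that escapes the project repository. A minimal sketch in Python, using hypothetical names (PROJECT_ROOT, guarded_delete) that are not any real agent's API:

```python
from pathlib import Path

# Hypothetical example: PROJECT_ROOT, is_within_sandbox and guarded_delete
# are illustrative names, not part of any real coding agent's API.
PROJECT_ROOT = Path.cwd().resolve()

def is_within_sandbox(target: str) -> bool:
    """True only if the resolved path stays inside the project repository."""
    return Path(target).resolve().is_relative_to(PROJECT_ROOT)

def guarded_delete(target: str) -> None:
    """Refuse destructive operations on anything outside the project root."""
    if not is_within_sandbox(target):
        raise PermissionError(f"refusing to touch path outside the sandbox: {target}")
    path = Path(target)
    if path.is_file():
        path.unlink()
    # Recursive or directory deletion would still warrant an explicit user
    # confirmation on top of this path check.

if __name__ == "__main__":
    try:
        guarded_delete("/")  # blocked: resolves outside the project root
    except PermissionError as err:
        print(err)
```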
Scrubbles@poptalk.scrubbles.tech · 3 hours ago
Okay, that changes things. If they turned off these guardrails then that was on them; never blindly trust an LLM like that.