return2ozma@lemmy.world to Technology@lemmy.worldEnglish · 13 days agoAI agents now have their own Reddit-style social network, and it's getting weird fastarstechnica.comexternal-linkmessage-square174linkfedilinkarrow-up1473arrow-down112cross-posted to: futurology@futurology.todaytechnology@lemmy.zipfuck_ai@lemmy.worldAii@programming.devartificial_intel@lemmy.mlaboringdystopia@lemmy.world
arrow-up1461arrow-down1external-linkAI agents now have their own Reddit-style social network, and it's getting weird fastarstechnica.comreturn2ozma@lemmy.world to Technology@lemmy.worldEnglish · 13 days agomessage-square174linkfedilinkcross-posted to: futurology@futurology.todaytechnology@lemmy.zipfuck_ai@lemmy.worldAii@programming.devartificial_intel@lemmy.mlaboringdystopia@lemmy.world
minus-squareBradleyUffner@lemmy.worldlinkfedilinkEnglisharrow-up37arrow-down1·13 days agoThere is no way to prevent prompt injection as long as there is no distinction between the data channel and the command channel.
minus-squareKeenFlame@feddit.nulinkfedilinkEnglisharrow-up1·11 days agoI don’t understand what you mean. Why is there no way?
minus-squareBradleyUffner@lemmy.worldlinkfedilinkEnglisharrow-up1·11 days agoWatch this video. https://youtu.be/_3okhTwa7w4
There is no way to prevent prompt injection as long as there is no distinction between the data channel and the command channel.
I don’t understand what you mean. Why is there no way?
Watch this video.
https://youtu.be/_3okhTwa7w4