RobotToaster@mander.xyz to Not The Onion@lemmy.world (English) · 9 days ago
OpenAI says dead teen violated TOS when he used ChatGPT to plan suicide (arstechnica.com)
Cross-posted to: fuck_ai@lemmy.world, technology@lemmy.world, nottheonion@sh.itjust.works, nottheonion@lemmy.ml, technology@hexbear.net, aboringdystopia@mander.xyz
Joe@lemmy.world · 8 days ago:
It certainly should be designed to handle those types of queries, though. At the very least, it should avoid discussing the topic.
Wouldn’t ChatGPT be liable if someone planned a terror attack with it?