cm0002@lemmy.world to Technology@lemmy.zip · English · 2 months ago

**ChatGPT's hallucination problem is getting worse according to OpenAI's own tests and nobody understands why** (www.pcgamer.com)

Cross-posted to: technology@lemmit.online, technology@hexbear.net, technology@lemmygrad.ml, technology@lemmy.ml
Optional@lemmy.world · English · 2 months ago

*raises hand* Because it never "understood" what any "word" ever "meant" anyway?
geekwithsoul@lemm.ee · English · 2 months ago

Yeah, it's all hallucinations - it's just that sometimes the hallucinations manage to approximate correctness, and it can't tell one from the other.