misk@sopuli.xyz to Technology@lemmy.world · English · 2 years ago
We have to stop ignoring AI’s hallucination problem (www.theverge.com) · 204 comments
UnsavoryMollusk@lemmy.world · English · edited 2 years ago
They are right, though. LLMs, at their core, are just about determining what is statistically most probable to spit out.
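The claim that an LLM just picks the statistically most probable output can be illustrated with a minimal sketch. The logits below are made-up numbers standing in for what a model might score candidate next tokens; the softmax-plus-argmax step is the core "pick the most probable token" idea (greedy decoding), not any specific model's implementation.

```python
import math

# Hypothetical scores (logits) a model might assign to candidate next tokens
logits = {"sky": 2.0, "sea": 0.5, "banana": -1.0}

# Softmax: turn raw scores into a probability distribution
total = sum(math.exp(v) for v in logits.values())
probs = {tok: math.exp(v) / total for tok, v in logits.items()}

# Greedy decoding: emit whichever token is statistically most probable
most_probable = max(probs, key=probs.get)
print(most_probable)  # "sky"
```

Real systems usually sample from this distribution (with temperature, top-k, etc.) rather than always taking the argmax, which is one reason outputs can be fluent yet factually wrong.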
Cyberflunk@lemmy.world · English · edited 2 months ago
deleted by creator