misk@sopuli.xyz to Technology@lemmy.world · English · 2 years ago
We have to stop ignoring AI’s hallucination problem (www.theverge.com)
Cyberflunk@lemmy.world · 2 years ago
Wtf are you even talking about.
UnsavoryMollusk@lemmy.world · 2 years ago (edited)
They are right, though. LLMs, at their core, are just about determining what is statistically most probable to spit out.
Cyberflunk@lemmy.world · 2 years ago
Your one sentence makes more sense than the slop above.