Sixth0795@sh.itjust.works to Science Memes@mander.xyz · English · 7 months ago — "Huh" (image, 45 comments)
Limeey@lemmy.world · 7 months ago: It all comes down to the fact that LLMs are not AGI - they have no clue what they're saying, or why, or to whom. They have no concept of "context" and as a result have no way to "know" whether they're giving correct info or just hallucinating.