Lugh@futurology.today to Futurology@futurology.today · English · 8 months ago
Evidence is growing that LLMs will never be the route to AGI. They are consuming exponentially increasing energy, to deliver only linear improvements in performance. (arxiv.org)
Umbrias@beehaw.org · 8 months ago
Hallucinations are not qualia.
Please go talk to an LLM about hallucinations (you can use DuckDuckGo's implementation of ChatGPT) and see why the term is being used to mean something fairly different from human hallucinations.