Lugh@futurology.today to Futurology@futurology.today · English · 2 years ago
Evidence is growing that LLMs will never be the route to AGI. They are consuming exponentially increasing energy, to deliver only linear improvements in performance. (arxiv.org)
Umbrias@beehaw.org · English · 2 years ago
Hallucinations are not qualia.
Please go talk to an LLM about hallucinations (you can use DuckDuckGo's implementation of ChatGPT) and see why the term is being used to mean something fairly different from human hallucinations.