brt01010101@sh.itjust.works to No Stupid Questions@lemmy.world · 8 months ago (edited)
XXX [NSFW]
A_Very_Big_Fan@lemmy.world · 8 months ago
So you do unironically think it takes that amount of equipment and power to output to a single device lmao

mojofrododojo@lemmy.world · 8 months ago
I can’t tell if you’re fucking dense or can’t read.

AN LLM RUN ON A PHONE WILL DO YOU FUCKALL GOOD.

You uninformedly think you can run an AI worth a damn on your phone - and the corpus to teach it?

Fuck off, you stupid git. Good luck with your HAL 9. You’re gonna walk through the apocalypse with a moron. Which fits, you’ll be equals.

A_Very_Big_Fan@lemmy.world · 8 months ago (edited)
Here’s a Raspberry Pi doing a variety of tasks with various LLMs, like programming and accurately describing a picture.

There’s a literal mountain of evidence of what these models can do. It’s been fun making you rage :3