SuspciousCarrot78@lemmy.world to LocalLLaMA@sh.itjust.works · English · 4 hours ago (edited)
Most AI tools try to replace your thinking. I built one that doesn't (codeberg.org)
Cross-posted to: privacy@lemmy.ml, auai@programming.dev, privacy@programming.dev
NebLem@lemmy.world · 3 points · 29 days ago
Termux + CPU inference + llama.cpp can get ~4B models running, even if slowly, on 5-year-old flagship phones.
muzzle@lemmy.zip · 2 points · 29 days ago
I'm experimenting with "tool neuron" and "off grid"
NebLem@lemmy.world · 2 points · 29 days ago
Neat, cool to see these all-in-one native Android tools come so far.