baatliwala@lemmy.world to memes@lemmy.world · 3 days ago
The AI revolution is coming
Deepseek is good locally.

Mora@pawb.social · 2 days ago
As someone who is rather new to the topic: I have a GPU with 16 GB of VRAM and only recently installed Ollama. Which size should I use for DeepSeek R1? 🤔

kyoji@lemmy.world · 1 day ago
I also have 16 GB of VRAM, and the 32B version runs OK. Anything larger would take too long, I think.

Lurker@sh.itjust.works · 1 day ago
You can work your way up from the smallest to the biggest. You can probably run the biggest too, but it will be slow.
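The back-of-the-envelope reasoning behind these size recommendations can be sketched as follows. This is a rough heuristic, not a measurement: it assumes 4-bit quantized weights (a common default for Ollama model builds) and a hypothetical ~20% overhead factor for the KV cache and activations; real usage varies with context length and quantization format.

```python
def estimate_vram_gb(params_billion: float, bits: int = 4, overhead: float = 1.2) -> float:
    """Rough VRAM estimate for running a quantized LLM.

    weights ~= parameter count * bits-per-weight / 8 bytes,
    scaled by an assumed overhead factor for KV cache and activations.
    """
    weight_gb = params_billion * bits / 8  # e.g. 32B at 4-bit -> 16 GB of weights alone
    return weight_gb * overhead

# On a 16 GB card, a 14B model fits comfortably, while 32B is borderline and
# spills into system RAM -- consistent with "runs OK but slow" reports above.
for size in (7, 14, 32):
    print(f"{size}B ~ {estimate_vram_gb(size):.1f} GB")
```

The takeaway matches the thread: on 16 GB of VRAM, the 14B distill is a safe starting point, and larger variants trade speed for quality as layers get offloaded to system RAM.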