• BaroqueInMind · 3 months ago

    You don’t even need a GPU; I can run Ollama with Open WebUI on my CPU with an 8B model, fast af
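    (If you’d rather script against it than use the Open WebUI frontend, here’s a minimal Python sketch hitting Ollama’s local REST API. Assumes the server is already running on the default port 11434 and the model has been pulled; the llama3.1:8b tag is just an example, swap in whatever you have.)

    ```python
    import requests

    # A sketch, not gospel: assumes an Ollama server is running locally
    # (`ollama serve`) and the model below has been pulled
    # (`ollama pull llama3.1:8b`). The tag is an example, not necessarily yours.
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={
            "model": "llama3.1:8b",   # example tag
            "prompt": "Why do small LLMs run fine on a CPU?",
            "stream": False,          # one JSON response instead of a token stream
        },
        timeout=300,
    )
    resp.raise_for_status()
    print(resp.json()["response"])
    ```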

    • bi_tux@lemmy.world · 3 months ago

      I tried it on my CPU (with Llama 3 7B), but unfortunately it ran really slow (I’ve got a Ryzen 5700X)

      • tomjuggler@lemmy.world · 3 months ago

        I ran it on my dual-core Celeron and… just kidding. Try the mini Llama 1B. I’m in the same boat with a Ryzen 5000-something CPU
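        (Quick way to sanity-check how much a smaller model helps on CPU before committing: ask Ollama’s /api/generate endpoint and read the eval stats it returns. A minimal sketch, assuming both model tags below have been pulled; they’re examples, not gospel.)

        ```python
        import requests

        # Compare CPU generation speed of a small vs. mid-size model using the
        # stats Ollama returns: eval_count tokens generated over eval_duration ns.
        # Assumes an Ollama server on the default port and both tags pulled.
        for model in ("llama3.2:1b", "llama3.1:8b"):  # example tags
            resp = requests.post(
                "http://localhost:11434/api/generate",
                json={"model": model,
                      "prompt": "Write one sentence about CPUs.",
                      "stream": False},
                timeout=600,
            )
            resp.raise_for_status()
            stats = resp.json()
            tokens_per_sec = stats["eval_count"] / (stats["eval_duration"] / 1e9)
            print(f"{model}: {tokens_per_sec:.1f} tokens/sec")
        ```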