• rebelsimile@sh.itjust.works
    4 hours ago

    I’ve been using Open WebUI (search for it with those terms) to run local models in a Docker container, served by Ollama, for the last few months and I love it.
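
    If anyone wants to try it, this is roughly the `docker run` command from the Open WebUI docs that a setup like this starts from (the port mapping and volume name are just common defaults, not anything specific to my setup, and it assumes Ollama is already running on the host):

    ```shell
    # Run Open WebUI in Docker, talking to an Ollama instance on the host.
    # host.docker.internal lets the container reach the host's Ollama API (default port 11434).
    docker run -d \
      -p 3000:8080 \
      --add-host=host.docker.internal:host-gateway \
      -v open-webui:/app/backend/data \
      --name open-webui \
      ghcr.io/open-webui/open-webui:main
    ```

    Then the web UI is at http://localhost:3000 and it auto-detects models you’ve pulled with `ollama pull`.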