• taanegl@beehaw.org
    1 year ago

    An open-source, locally run LLM that runs on a GPU or on dedicated open-hardware PCIe accelerators and doesn’t touch the cloud…