• sp3ctr4l@lemmy.dbzer0.com

    What do you mean by ‘local AI suffering’?

    Did you mean to say ‘surviving’?

    As in: small, less capable, but still potentially useful models when used in sane ways… and people doing more of that?

    Like, the fundamental problem with the idea of local AI dying out as a thing… is that most of the Chinese-developed models are released under a much more open source type of paradigm.

    It’s not 100% open source, but it’s way more open than the US corpo models.

    So… anybody can still download and run one of those.

    I’ve had Qwen3-8B working on my Steam Deck for around a year now. Not super fast, but it does work, and… a Steam Deck is not exactly a juggernaut of GPU compute power.
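    The Steam Deck claim checks out with some back-of-envelope math. Here’s a rough sketch (the bytes-per-parameter figures are approximations for common llama.cpp-style quantization levels, not measured values; the Deck has 16 GB of shared LPDDR5 RAM):

    ```python
    # Rough estimate: can a quantized 8B-parameter model fit in a
    # Steam Deck's 16 GB of shared RAM? Figures are approximations.

    PARAMS = 8e9  # ~8 billion parameters (e.g. Qwen3-8B)

    BYTES_PER_PARAM = {
        "fp16":   2.0,   # full half-precision weights
        "q8_0":   1.0,   # ~8-bit quantization
        "q4_k_m": 0.5,   # ~4-bit quantization, a common llama.cpp default
    }

    for quant, bpp in BYTES_PER_PARAM.items():
        gb = PARAMS * bpp / 1e9
        print(f"{quant}: ~{gb:.0f} GB of weights")

    # fp16 lands around 16 GB, which is too tight once the OS and
    # KV cache are counted; 8-bit comes in around 8 GB; 4-bit around
    # 4 GB, which is why 4-bit quants are the usual pick for
    # hardware like the Deck.
    ```

    This ignores KV cache and runtime overhead, so treat it as a floor, not a full budget.
    
    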

    Anybody with a modern laptop could figure it out.