Guess it’s all in the subject. I’ve found some implementations of AI practical, but they’re always asking for more data, more everything. Just curious how others use AI as carefully as possible.

  • brucethemoose@lemmy.world
    1 month ago

I’m all for local models, but if you don’t have a giant computer, pay a few bucks for a no-log API like the Cerebras API. Or any company, in any jurisdiction; take your pick.

And you can use them with any number of chat front ends, including easily self-hostable ones.

    Obviously the caveat for sending anything over the internet or “trusting” a cloud business applies, but we’re talking about inference-only companies that mostly host open LLMs for other businesses to use; you aren’t their product. They don’t do any training, and their business isn’t invading your privacy.
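    For what it’s worth, most of these inference providers expose an OpenAI-compatible chat endpoint, so wiring one into a self-hosted front end is usually just a base URL and an API key. A minimal sketch (the Cerebras URL and model name here are assumptions; check the provider’s docs, and the `CEREBRAS_API_KEY` env var is just an illustrative name):

    ```python
    import json
    import os
    import urllib.request

    # Assumed OpenAI-compatible endpoint; swap in whichever provider you picked.
    API_URL = "https://api.cerebras.ai/v1/chat/completions"

    def build_request(prompt, model="llama3.1-8b", api_key=None):
        """Build an OpenAI-style chat completion request.

        The same payload shape works with any OpenAI-compatible host,
        which is why self-hosted front ends can point at any of them.
        """
        payload = {
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        }
        headers = {
            "Content-Type": "application/json",
            # Hypothetical env var name; use whatever your provider issues.
            "Authorization": f"Bearer {api_key or os.environ.get('CEREBRAS_API_KEY', '')}",
        }
        return urllib.request.Request(
            API_URL, data=json.dumps(payload).encode(), headers=headers
        )

    if __name__ == "__main__":
        req = build_request("Summarize this text.")
        with urllib.request.urlopen(req) as resp:
            print(json.load(resp)["choices"][0]["message"]["content"])
    ```

    Since the key lives in an environment variable and the request is plain HTTPS, nothing about your setup is tied to one vendor; changing providers is a one-line URL edit.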