• @snek_boi@lemmy.ml
    6 months ago

    I can’t see why AI couldn’t be done in a privacy-respecting way. The problem that worries me is performance. I have used text-to-speech AI and it absolutely destroys my poor processors. I really hope there’s an efficient way of adding alt text, or an option to turn the feature off for users who don’t need it.

    • @MangoPenguin@lemmy.blahaj.zone
      6 months ago

      If it runs locally, then no data ever leaves your system, so privacy would be respected. There are tons of good local-only LLMs out there right now.
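
      As a rough illustration of what “local-only” means here, below is a minimal sketch of generating alt text entirely on-device. It assumes Python with the Hugging Face transformers library and the Salesforce/blip-image-captioning-base checkpoint; the file name photo.jpg is just a placeholder. After the weights are downloaded once, nothing touches the network.

      ```python
      # Minimal local alt-text sketch: an image-captioning model run entirely
      # on the local machine (assumes: pip install transformers torch pillow).
      from PIL import Image
      from transformers import BlipProcessor, BlipForConditionalGeneration

      MODEL_ID = "Salesforce/blip-image-captioning-base"  # smallish captioning model

      # Weights are fetched once, then cached locally and used offline.
      processor = BlipProcessor.from_pretrained(MODEL_ID)
      model = BlipForConditionalGeneration.from_pretrained(MODEL_ID)

      def generate_alt_text(path: str) -> str:
          """Return a short caption for the image at `path`, computed locally."""
          image = Image.open(path).convert("RGB")
          inputs = processor(images=image, return_tensors="pt")
          output_ids = model.generate(**inputs, max_new_tokens=30)
          return processor.decode(output_ids[0], skip_special_tokens=True)

      print(generate_alt_text("photo.jpg"))  # placeholder image path
      ```

      On a plain CPU this takes on the order of a second or two per image, which is exactly the performance trade-off snek_boi is worried about; turning the feature off would simply mean not running the model.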

      As far as performance goes, current x86 CPUs handle this kind of workload poorly, but the chips coming out from ARM vendors, and likely from Intel and AMD in the future, will be much better at running it.