• FunkyStuff [he/him]@hexbear.net · 5 points · 1 month ago

    You’re right that phones are more efficient than I gave them credit for, but power costs are absolutely a consideration for the tech companies that are training large models.

    Besides, how much more power-efficient would a phone have to be to make up for running one query at a time, compared to a GPU serving several at once and benefiting from cache locality, since it reuses the same weights across different queries? I highly doubt mobile hardware’s power-usage edge could outweigh those economies of scale.
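
    The batching argument can be made concrete with a rough back-of-envelope sketch. All numbers here are illustrative assumptions (model size, energy per byte moved, compute energy), not measurements; the point is only that weight traffic from memory is paid once per forward pass and shared across the whole batch:

    ```python
    # Back-of-envelope: batching amortizes the energy cost of streaming
    # model weights from memory. All constants are illustrative guesses.

    def energy_per_query_j(weight_bytes, batch_size, j_per_byte_moved, compute_j):
        """Weights are read from memory once per forward pass; with batching,
        that memory traffic is split across every query in the batch."""
        weight_move_j = weight_bytes * j_per_byte_moved
        return weight_move_j / batch_size + compute_j

    # Hypothetical ~3B-parameter model at 4-bit quantization: ~1.5 GB of weights.
    WEIGHTS = 1.5e9

    # Phone: batch of 1; GPU server: batch of 64 (assumed values throughout).
    phone = energy_per_query_j(WEIGHTS, batch_size=1,  j_per_byte_moved=60e-12,  compute_j=0.05)
    gpu   = energy_per_query_j(WEIGHTS, batch_size=64, j_per_byte_moved=100e-12, compute_j=0.05)

    print(f"phone: {phone:.3f} J/pass/query, gpu: {gpu:.3f} J/pass/query")
    ```

    Even granting the datacenter a worse per-byte energy cost, the per-query share of weight traffic shrinks with batch size, which is the efficiency-of-scale effect described above.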

    • fuckwit_mcbumcrumble@lemmy.dbzer0.com · 6 points · 1 month ago

      The key is that Apple’s model isn’t all that large; that’s how they’re targeting running it efficiently on a phone. It also sucks, so IDK.