• @bassomitron@lemmy.world

    Would that actually be decent? Even 6B models feel way too rudimentary after experiencing 33B+ models and/or ChatGPT. I haven’t tried those really scaled-down and optimized models, though!

    • @gibson@sopuli.xyz

      They’re decent for text-completion purposes, e.g. generating some corpspeak for an email, or generating some “Wikipedia”-like text. You have to know how to write good prompts; don’t try to treat it like ChatGPT.

      For example, if I want to know about the history of Puerto Rico, I would put:

      “The history of puerto rico starts in about 480BC when”
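
      To make that concrete, here’s a minimal sketch of completion-style prompting with the llama-cpp-python bindings (my choice of library for illustration; the model path, context size, and sampling settings are placeholders you’d swap for your own local model):

      ```python
      # Minimal sketch: completion-style prompting with llama-cpp-python
      # (assumes a local GGUF model; the path and settings below are placeholders).
      from llama_cpp import Llama

      llm = Llama(model_path="./models/7b-model.gguf", n_ctx=2048)

      # Base models continue text rather than answer questions, so the prompt
      # is written as the opening of the passage you want generated.
      prompt = "The history of puerto rico starts in about 480BC when"

      result = llm(prompt, max_tokens=200, temperature=0.7)
      print(prompt + result["choices"][0]["text"])
      ```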