cross-posted to: docker@programming.dev
cross-posted from: https://programming.dev/post/39062524
Do you guys successfully use small self-hosted models anywhere? Although I can see use cases for LLMs, it’s always the biggest models that produce any useful results (and by “biggest” I usually mean the ones I can’t afford to run myself).
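
For context, here is a minimal sketch of what "small and self-hosted" can look like in practice, assuming Ollama as the local runtime on its default port; the model name is only an example, not a recommendation:

```python
# Minimal sketch: querying a small self-hosted model through Ollama's HTTP API.
# Assumes Ollama is running locally on its default port (11434) and that a
# small model (here "llama3.2:3b", just an example name) has already been pulled.
import requests


def ask_local_model(prompt: str, model: str = "llama3.2:3b") -> str:
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=120,
    )
    resp.raise_for_status()
    # The non-streaming response returns the full completion in "response".
    return resp.json()["response"]


if __name__ == "__main__":
    print(ask_local_model("Summarize this docker-compose error in one sentence: ..."))
```

Whether a 3B-class model gives useful output for a task like this is exactly the open question; the sketch only shows how cheap the plumbing is once a runtime like Ollama is in place.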