

Very cool! Thanks for sharing your experience with it.
I’m trying to run things offline myself.


Nice! Do you use the models for coding? Or image generation, for example?


Thanks! I will do some searching on my own, and your comment is a good starting point. I will probably ask you for links if I’m unable to find anything.
May I ask what kind of hardware you use to run your LLMs? Like, do you have a rack full of GPUs?


Thank you!


Any examples of these, please?
You can still program on those platforms, if you want to.