Our gracious host @JonahAragorn asks that we sign up on a server listed on join-lemmy.org or on kbin.social, rather than directing everybody to the lemmy.one server specifically, in order to distribute the load.

Thank you!

  • @arbiter@lemmy.ml

    I purchased lemmy.club yesterday (haven’t deployed it yet) and I’m going to look into deploying it via Kubernetes this weekend. We’ll need some instances that can scale horizontally so that we can absorb as many users as we can from the APIcalypse.

    I’m slightly worried about the costs, but I’m loving the platform so far and am willing to do as much as I can!
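
    To give an idea of the direction, here’s a rough sketch using the official Kubernetes Python client. Everything in it is a placeholder (namespace, image tag, replica count, env vars), not a tested lemmy.club setup:

    ```python
    # Sketch only: a Deployment for the stateless Lemmy API pods, created with
    # the official Kubernetes Python client. Namespace, image tag, env vars and
    # replica count are placeholders.
    from kubernetes import client, config

    config.load_kube_config()  # or config.load_incluster_config() inside the cluster

    container = client.V1Container(
        name="lemmy",
        image="dessalines/lemmy:0.17.4",  # pin whichever release you actually deploy
        ports=[client.V1ContainerPort(container_port=8536)],  # Lemmy's default HTTP port
        env=[client.V1EnvVar(name="LEMMY_CONFIG_LOCATION", value="/config/config.hjson")],
    )

    deployment = client.V1Deployment(
        metadata=client.V1ObjectMeta(name="lemmy-api", namespace="lemmy"),
        spec=client.V1DeploymentSpec(
            replicas=3,  # the part that scales horizontally; Postgres lives elsewhere
            selector=client.V1LabelSelector(match_labels={"app": "lemmy-api"}),
            template=client.V1PodTemplateSpec(
                metadata=client.V1ObjectMeta(labels={"app": "lemmy-api"}),
                spec=client.V1PodSpec(containers=[container]),
            ),
        ),
    )

    client.AppsV1Api().create_namespaced_deployment(namespace="lemmy", body=deployment)
    ```

    The only thing replicated here is the stateless API tier; Postgres would sit outside this Deployment, which is where the real scaling questions start.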

    • @FirstWizardZorander

      I’m interested in how this would work, especially with regard to the database, which I understand is the most resource-intensive part of the workload.

      If this could be solved in a way that scales well, I think we’d be in a good spot. I’ll be looking into the Lemmy backend code in the near future to see whether there’s any way to offload the DB.
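
      The first thing I’d want to try is the classic read/write split: writes go to the Postgres primary, reads go to one or more streaming replicas. Purely a sketch of the idea in Python with psycopg2 (host names, credentials and queries are made up, and this is not how Lemmy’s Rust backend is wired today):

      ```python
      # Sketch of a read/write split in front of Postgres. Host names, credentials
      # and queries are placeholders, not Lemmy's actual schema usage.
      from psycopg2 import pool

      primary = pool.SimpleConnectionPool(1, 10, host="db-primary.internal",
                                          dbname="lemmy", user="lemmy", password="...")
      replica = pool.SimpleConnectionPool(1, 20, host="db-replica.internal",
                                          dbname="lemmy", user="lemmy", password="...")

      def recent_posts(community_id: int):
          """Read path: fine to serve from a slightly lagging replica."""
          conn = replica.getconn()
          try:
              with conn.cursor() as cur:
                  cur.execute("SELECT id, name FROM post WHERE community_id = %s "
                              "ORDER BY published DESC LIMIT 20", (community_id,))
                  return cur.fetchall()
          finally:
              replica.putconn(conn)

      def create_post(community_id: int, name: str):
          """Write path: must always hit the primary."""
          conn = primary.getconn()
          try:
              with conn.cursor() as cur:
                  cur.execute("INSERT INTO post (name, community_id) VALUES (%s, %s) RETURNING id",
                              (name, community_id))
                  conn.commit()
                  return cur.fetchone()[0]
          finally:
              primary.putconn(conn)
      ```

      The hard part would be finding every query in the backend that can tolerate replica lag, which is why I want to read the code first.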

    • @empireOfLove

      I am also interested in anything you learn about horizontal scaling. Once I get some free time, I’m planning to open my own instance for car-culture-related stuff, and I want to get ahead of the curve in case there are legitimate traffic spikes.

      I was under the impression that the Lemmy DB did not scale safely without some core code rewrites.
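
      If that’s the case, then the only thing that scales out cheaply is the stateless API layer, with Postgres left as a single primary. On the Kubernetes side that would look roughly like this (sketch only; names and thresholds are made up):

      ```python
      # Sketch: autoscale only the stateless lemmy-api Deployment; the database
      # stays a single Postgres primary. Names and thresholds are placeholders.
      from kubernetes import client, config

      config.load_kube_config()

      hpa = client.V2HorizontalPodAutoscaler(
          metadata=client.V1ObjectMeta(name="lemmy-api", namespace="lemmy"),
          spec=client.V2HorizontalPodAutoscalerSpec(
              scale_target_ref=client.V2CrossVersionObjectReference(
                  api_version="apps/v1", kind="Deployment", name="lemmy-api"),
              min_replicas=2,
              max_replicas=10,  # absorbs a traffic spike without touching the DB tier
              metrics=[client.V2MetricSpec(
                  type="Resource",
                  resource=client.V2ResourceMetricSource(
                      name="cpu",
                      target=client.V2MetricTarget(type="Utilization", average_utilization=70),
                  ),
              )],
          ),
      )

      client.AutoscalingV2Api().create_namespaced_horizontal_pod_autoscaler(
          namespace="lemmy", body=hpa)
      ```

      That still leaves the database itself as the single choke point, which is exactly the rewrite problem you’re describing.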