I’ve been using a VPS for a while to host some personal projects and services. Lately I’ve started thinking about moving all my git projects onto it as well. But at the moment, I’m not really sure how to go about off-site backups of the data. How do you usually run backups on your servers?
I do not
I recompiled darkplaces dedicated for Xonotic (aarch64), overcoming many dependency hells and pitfalls
I would be absolutely devastated to lose it and that would be a net negative to the Indian xonotic community
You’re doing god’s work.
You’re welcome!
I made this since I hated the high ping from the Australian servers
Fair! I run a US based instagib server.
Nice! I might drop in sometime, what’s the name?
It’s one of the few SMB servers left. SMB Chicago, or something like that.
Nice!
Often it’s dead; people play once in a while. I’ve seen a max of 7 concurrent players, but at the same time I’ve got a Discord with half of them :P
Indian xonotic community
How big is the Xonotic community over here anyway?
Daily backups using restic to Wasabi S3.
Restic already speaks S3 natively, no need to mount it or anything, just point it at a bucket and hand it an API key.
You can use an API key that’s only allowed to read and write, but not delete/modify, so you’ve got some protection from ransomware.
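A minimal sketch of that setup, assuming a hypothetical bucket name and with the key values left as placeholders (restic reads the S3 credentials from the standard AWS environment variables):

```shell
# Placeholders: fill in your own key pair, repo password, and bucket.
export AWS_ACCESS_KEY_ID=...        # key limited to read/write, no delete
export AWS_SECRET_ACCESS_KEY=...
export RESTIC_PASSWORD=...          # encrypts the repository contents

# Initialise the repository once, then run backups against it.
restic -r s3:https://s3.wasabisys.com/my-backup-bucket init
restic -r s3:https://s3.wasabisys.com/my-backup-bucket backup /srv /etc
```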
Thanks for sharing, I didn’t know about restic, I will definitely have a look
You can also feed database dumps directly into restic, like this:
mysqldump --defaults-file=/root/backup_scripts/.my.cnf --databases mydatabase | restic backup --stdin --stdin-filename mydatabase.sql
I wrote a bash script that runs daily. It dumps the databases as text and 7z’s them (AES-256), along with the web files (mostly WordPress), user files, all of /etc, and a list of all installed packages, then copies the archives to a timestamped folder on my Google Drive (I keep the last two nights, plus the last 3 Sundays).
TBH, the zipped content is around 1.5GB for each backup, so my 17GB of free GDrive space is more than enough. If I actually had a significant amount of data, I’d look into a more robust long-term solution.
If there was a catastrophic failure, it’d take me around six hours to rebuild a new server and test it.
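A rough sketch of that kind of script — the paths, the 7z password variable, and the “gdrive” rclone remote are all placeholders, not the commenter’s actual setup:

```shell
#!/bin/bash
# Hypothetical nightly backup sketch based on the description above.
set -euo pipefail

STAMP=$(date +%F)
WORK=/tmp/backup-$STAMP
mkdir -p "$WORK"

mysqldump --all-databases > "$WORK/databases.sql"   # DB dump as text
dpkg --get-selections > "$WORK/packages.txt"        # list of installed packages

# 7z uses AES-256 for encrypted archives; -mhe=on also encrypts filenames.
7z a -p"$BACKUP_PASSWORD" -mhe=on "$WORK/files.7z" /var/www /home /etc

# Copy into a timestamped folder on Google Drive via a preconfigured remote.
rclone copy "$WORK" "gdrive:backups/$STAMP"
```

Pruning the old folders (keep two nights plus three Sundays) would be a separate cleanup step after the copy.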
That’s a good idea. I was thinking of doing something similar with S3 before deciding to check what other people were doing. Thanks
I only choose hosting that provides automated backups of the VPS. And it has to be credible, like Hetzner, who keeps those backups in a different location.
Additionally, if I have something really important, I do periodic backups to my local Mac, which has all sorts of backup processes (iCloud, Time Machine, plus an extra encrypted backup that I keep on, well… Hetzner)
To be fair, I’m on Hetzner as well, but the paranoid side of me would want to have a fallback in the unlikely case that something happens to the company
It’s a German GmbH. From them announcing bankruptcy to switching off the servers would take some months.
Not using a VPS, but I use ‘restic’ to back up my servers over SFTP to my NAS.
Works really well. I do daily incremental backups and set it to keep 1 backup a day for the last week, 1 backup a week for the last 4 weeks, and 1 backup per month for the last 6 months.
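That retention policy maps directly onto restic’s forget flags; a sketch, assuming a hypothetical SFTP repository path:

```shell
# Keep 7 dailies, 4 weeklies, and 6 monthlies; drop and reclaim the rest.
# The repo path is a placeholder, and RESTIC_PASSWORD is assumed to be set.
restic -r sftp:nas:/backups/myserver forget \
    --keep-daily 7 --keep-weekly 4 --keep-monthly 6 \
    --prune
```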
Having the backups “in-house” is also something to explore, since this could then become the backup target for other services as well
If it’s something mission critical, consider following the 3-2-1 backup rule.
I tend to use whatever built-in snapshot option the service provider offers, and then for off-site backups you can use something like Veeam (free for the first 10 VMs/machines) - https://www.veeam.com/virtual-machine-backup-solution-free.html
Thanks for sharing, I’ll have a read through this
You can use a few tools.
rsync
rclone - probably want this one over rsync, though.
Tarsnap
Duplicati
Restic
There’s obviously a lot more, but these are some of the more popular ones.
Now you need a way to back it up. Probably the best way is to tar it up first and then dump that file somewhere. You can also get something like Dead Man’s Snitch to ensure backups don’t silently break.
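A sketch of that pattern — the source directory is a stand-in, and the snitch check-in URL (taken from your Dead Man’s Snitch dashboard) is left as an optional environment variable:

```shell
#!/bin/sh
# Hypothetical nightly job: tar up the data, then check in with a
# Dead Man's Snitch URL so a missed run still raises an alert.
SRC="${SRC:-$HOME/projects}"
DEST="/tmp/backup-$(date +%F).tar.gz"

mkdir -p "$SRC"    # only so this sketch runs standalone

if tar -czf "$DEST" -C "$(dirname "$SRC")" "$(basename "$SRC")"; then
    # Ping the snitch only after a successful archive.
    [ -n "${SNITCH_URL:-}" ] && curl -fsS "$SNITCH_URL" >/dev/null
fi
echo "wrote $DEST"
```

If the archive step fails, the check-in never fires, and the snitch alerts you that the backup went quiet.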
As you mentioned, if this is just source code, then the best thing would be to put it under source control and set it up that way. Then you can automate it, deploy the code when you make updates, and have a history of changes.
It sounds like tarsnap is your best bet though. It will be the cheapest.
You can also back up to another storage provider like Google, Dropbox, or even AWS S3. S3 can get costly, but you can archive everything to the Glacier tier, which is pretty cheap.
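For the Glacier route, a minimal sketch with the AWS CLI (the bucket name and archive filename are placeholders):

```shell
# Upload an archive directly into the Glacier storage class.
aws s3 cp backup.tar.gz s3://my-backup-bucket/ \
    --storage-class GLACIER
```

Retrieval from Glacier takes hours rather than seconds, so it suits archives you hope never to need.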
Thank you for the suggestions. I’ve been planning on moving my git repositories off GitHub and onto something like Gitea. You’ve given me a good starting point to research the available options.
If you don’t want to use a hosted provider, you can at least just start using git. Just do git init, then you can start committing changes. This way you at least have a history of changes. Then just back that folder up like normal
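The whole flow, sketched with a throwaway directory and file standing in for a real project:

```shell
# Minimal sketch: put a project folder under git with no hosted provider.
cd "$(mktemp -d)"                 # stand-in for your project directory
echo 'hello' > notes.txt          # stand-in project file
git init -q
git add .
git -c user.email=me@example.com -c user.name=me commit -q -m "Initial commit"
git log --oneline                 # your history of changes, one per commit
```

From there, backing up that folder with any of the tools above also carries the full history, since it lives in the .git directory.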
+1 on Restic
Restic to Backblaze
Inexpensive
Reliable
Easy to recover
I don’t