I just started getting into self-hosting using Docker Compose and I'm wondering about possible backup solutions. So far I only need to save my Docker config, but I want to back up host files as well. What software and hardware are you using for backups?

    • @webjukebox@mujico.org · 3 points · 1 year ago

      I was in the same boat, until my prayers went unanswered and my hopes died.

      I lost some important data from my phone a few days ago. My plan was to back up that night, but disaster struck that same morning.

    • @neardeaf@lemm.ee · 3 points · 1 year ago

      Seconding this. On my unRAID host, I run a Docker container called “Vorta” that uses Borg as its backend to back up to my Synology NAS over NFS. Then on my Synology, I run two backup jobs using Hyper Backup: one goes to my cousin's NAS over a site-to-site OpenVPN connection between our edge devices (Ubiquiti UniFi Security Gateway Pro <-> UDM Pro), and the other goes to Backblaze B2 Cloud Storage.

      OP, let me know if you need any assistance setting something like this up. Gotta share the knowledge over here on Lemmy that we’re still used to searching evil Reddit for.
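
      In that spirit, the Borg side of a setup like this can be sketched from the CLI. This is a minimal sketch, not the exact setup above: the repo path (an NFS mount of the NAS), passphrase, and source directory are all placeholders.

      ```shell
      # Placeholders: adjust the repo path and source directory to your setup.
      export BORG_REPO=/mnt/nas/borg-repo
      export BORG_PASSPHRASE='change-me'

      # One-time: create an encrypted repository
      borg init --encryption=repokey "$BORG_REPO"

      # Nightly job: create a dated archive, then thin out old ones
      borg create --stats --compression zstd \
          "$BORG_REPO::{hostname}-{now:%Y-%m-%d}" /appdata
      borg prune --keep-daily 7 --keep-weekly 4 --keep-monthly 6 "$BORG_REPO"
      ```
      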

        • @neardeaf@lemm.ee · 1 point · 1 year ago

          Niiiice, quick question, are both of y’all running the latest UniFi Controller version & using the new WebUI view layout?

          • @PlutoniumAcid@lemmy.world · 2 points · 1 year ago

            His gear is v7 (UniFi and also Synology DSM) and I am still on v6 because I didn't have a good reason to upgrade. If it works, don't fix it, you know? Feature-wise they're the same anyway, just a different UI. But sure, give me a good reason to upgrade, and I will :)

  • 0110010001100010 · 7 points · 1 year ago

    Local backup to my Synology NAS every night, which is then replicated to another NAS at my folks' house through a secure VPN tunnel. Pretty simple and easy to deploy.

  • @francisco_1844@discuss.online · 6 points · 1 year ago

    Restic for backup; it can send backups to S3 and SFTP, amongst other target options.

    There are S3-compatible (object storage) services, such as Backblaze's B2, which are very affordable for backups.
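
    As a rough sketch of what restic pointed at B2 looks like (bucket name, key ID, application key, and paths below are placeholders):

    ```shell
    # Placeholders: replace with your own B2 credentials and bucket.
    export B2_ACCOUNT_ID=00112233aabb
    export B2_ACCOUNT_KEY=secret-application-key
    export RESTIC_REPOSITORY=b2:my-backup-bucket:host1
    export RESTIC_PASSWORD='change-me'

    restic init                 # one-time: create the encrypted repository
    restic backup /srv/docker   # encrypted, deduplicated snapshot
    restic forget --prune --keep-daily 7 --keep-weekly 4
    ```
    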

  • @kaotic@lemmy.world · 6 points · 1 year ago

    I’ve had excellent luck with Kopia, backing up to Backblaze B2.

    At work, I do the same to a local directory in my company provided OneDrive account to keep company data on company resources.

  • @bier@lemmy.blahaj.zone · 5 points · 1 year ago

    My 20 TB of storage is currently hosted by Hetzner on an SMB share with an accompanying server. The storage is accessible via NFS/SMB. I have a Windows 10 VPS running Backblaze Personal Backup for $7/month with unlimited storage, mounting the SMB share as a “physical drive” using Dokan, because Backblaze Personal Backup doesn't allow backing up network shares. If your storage is local, you can run the Windows backup agent in a Docker container.

  • @ComptitiveSubset@lemmy.world · 5 points · 1 year ago

    For app data, Borg as the backup/restore software. Backup data is then stored on Hetzner as an offsite backup; super easy and cheap to set up. Also add healthchecks.io to get notified if a backup fails.

    Edit: Back up docker compose files and other scripts (without API keys!!!) with git to GitHub.
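
    The healthchecks.io part can be as simple as pinging the check URL only when the backup command exits cleanly. A minimal sketch; the check UUID and the backup command are placeholders:

    ```shell
    #!/bin/sh
    # Ping healthchecks.io only on success; a missed ping raises the alert.
    # The UUID is a placeholder for your own check.
    if borg create ::'{hostname}-{now}' /appdata; then
        curl -fsS --retry 3 https://hc-ping.com/your-check-uuid >/dev/null
    else
        curl -fsS --retry 3 https://hc-ping.com/your-check-uuid/fail >/dev/null
    fi
    ```
    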

  • @lynny@lemmy.world · 5 points · 1 year ago

    Someone on lemmy here suggested Restic, a backup solution written in Go.

    I back up to an internal 4TB HDD every 30 minutes. My most important files are stored in an encrypted file storage online in the cloud.

    Restic is good stuff.
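
    A 30-minute schedule like that is one crontab line plus an occasional prune. A sketch, with the repository path and password file as placeholders:

    ```shell
    # crontab entries (crontab -e):
    # every 30 minutes: snapshot /home into the repo on the internal HDD
    */30 * * * * restic -r /mnt/hdd/restic-repo --password-file /root/.restic-pw backup /home
    # nightly: drop snapshots beyond the retention policy
    0 4 * * * restic -r /mnt/hdd/restic-repo --password-file /root/.restic-pw forget --prune --keep-hourly 48 --keep-daily 14
    ```
    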

  • @DataDreadnought · 5 points · 1 year ago

    I doubt you're using NixOS, so this config might seem useless, but at its core it is a simple systemd timer plus a bash script.

    To convert this to another OS, use cron to call the script at the time you want. Copy the part between the script = '' ... ''; quotes, then change out variables like the location of docker-compose, since its path is NixOS-specific.

    Let me explain the script. We start by defining the backupDate variable; this will be the name of the zip file. As of now, that variable would be 2023-07-12. We then go to each folder with a docker-compose.yml file and take the stack down. You could replace down with stop if you don't plan on updating each night like I do. I use rclone to connect to Dropbox, but rclone supports many providers, so check it out and see if it has the one you need. Lastly, I use rclone to delete anything older than 7 days in the backup folder on Dropbox. If you end up going my route and get stuck, let me know and I can help out. Good luck.

    systemd = {
      timers.docker-backup = {
        wantedBy = [ "timers.target" ];
        partOf = [ "docker-backup.service" ];
        timerConfig.OnCalendar = "*-*-* 3:30:00";
      };
      services.docker-backup = {
        serviceConfig.Type = "oneshot";
        serviceConfig.User = "root";
        script = ''
          backupDate=$(date +'%F')

          # Take every stack down so the files being zipped are consistent
          cd /docker/apps/rss
          ${pkgs.docker-compose}/bin/docker-compose down

          cd /docker/apps/paaster
          ${pkgs.docker-compose}/bin/docker-compose down

          cd /docker/no-backup-apps/nextcloud
          ${pkgs.docker-compose}/bin/docker-compose down

          cd /docker/apps/nginx-proxy-manager
          ${pkgs.docker-compose}/bin/docker-compose down

          # Archive the app data
          cd /docker/backups/
          ${pkgs.zip}/bin/zip -r server-backup-$backupDate.zip /docker/apps

          # Pull fresh images and bring everything back up
          cd /docker/apps/nginx-proxy-manager
          ${pkgs.docker-compose}/bin/docker-compose pull
          ${pkgs.docker-compose}/bin/docker-compose up -d

          cd /docker/apps/paaster
          ${pkgs.docker-compose}/bin/docker-compose pull
          ${pkgs.docker-compose}/bin/docker-compose up -d

          cd /docker/apps/rss
          ${pkgs.docker-compose}/bin/docker-compose pull
          ${pkgs.docker-compose}/bin/docker-compose up -d

          cd /docker/no-backup-apps/nextcloud
          ${pkgs.docker-compose}/bin/docker-compose pull
          ${pkgs.docker-compose}/bin/docker-compose up -d

          # Ship the archive offsite and expire anything older than a week
          cd /docker/backups/
          ${pkgs.rclone}/bin/rclone copy server-backup-$backupDate.zip Dropbox:Server-Backup/
          rm server-backup-$backupDate.zip
          ${pkgs.rclone}/bin/rclone delete --min-age 7d Dropbox:Server-Backup/
        '';
      };
    };
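
    For non-NixOS users, the cron translation described above might look like this as a standalone bash script. A sketch only: the script location, cron line, and paths mirror the Nix config and are assumptions.

    ```shell
    #!/usr/bin/env bash
    # /usr/local/bin/docker-backup.sh
    # cron entry: 30 3 * * * /usr/local/bin/docker-backup.sh
    set -euo pipefail
    backupDate=$(date +'%F')

    apps=(/docker/apps/rss /docker/apps/paaster
          /docker/no-backup-apps/nextcloud /docker/apps/nginx-proxy-manager)

    # Take every stack down so the files being zipped are consistent
    for d in "${apps[@]}"; do (cd "$d" && docker-compose down); done

    # Archive the app data
    cd /docker/backups
    zip -r "server-backup-$backupDate.zip" /docker/apps

    # Pull fresh images and restart everything
    for d in "${apps[@]}"; do (cd "$d" && docker-compose pull && docker-compose up -d); done

    # Ship offsite and expire archives older than a week
    rclone copy "server-backup-$backupDate.zip" Dropbox:Server-Backup/
    rm "server-backup-$backupDate.zip"
    rclone delete --min-age 7d Dropbox:Server-Backup/
    ```
    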
    
    
    • thejevans · 1 point · 1 year ago

      Thanks! I just started setting up NixOS on my laptop and I’m planning to use it for servers next. Saving this for later!

  • @gobbling871@lemmy.world · 4 points · 1 year ago

    I use restic (and Déjà Dup, just to be safe), backing up to multiple cloud storage points. Among these are borgbase.com, Backblaze B2, and Microsoft's cloud.

    • @gibnihtmus@lemmy.world · 3 points · 1 year ago

      You should look into S3 Glacier Deep Archive. It's about $0.001 per GB per month. The caveat is a 180-day (6-month) minimum storage charge per object.
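
      Uploading straight into that storage class is a single flag on the AWS CLI; a sketch, with the bucket name as a placeholder:

      ```shell
      # Upload directly into Glacier Deep Archive; remember the 180-day
      # minimum storage charge applies per object.
      aws s3 cp server-backup.zip s3://my-backup-bucket/backups/ \
          --storage-class DEEP_ARCHIVE
      ```
      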

  • lnxtx · 4 points · 1 year ago

    VM instances on Proxmox VE, with native integration with Proxmox Backup Server (PBS).

    For non-VM hosts, the little PBS agent (proxmox-backup-client).

  • stown · 4 points · 1 year ago

    I host everything in Proxmox VMs, so I just take daily snapshots to my NAS.

  • Sam · 3 points · 1 year ago

    RAID 1 + data duplication.

    Photos, videos, music, documents, etc. are available on multiple devices using Syncthing.

        • tables · 3 points · 1 year ago

          It’s not pedantry, it’s just that RAID and instant data duplication or synchronization aren’t meant to protect you from many of the situations in which you would need a backup. If a drive fails, you can restore the information from wherever you duplicated the data to. If, however, your data is corrupted somehow, the corruption is just duplicated over and you have no way to restore the data to a state before the corruption happened. If you accidentally delete files you didn’t want to delete, the deletion is replicated over and, again, no way to restore them. RAID wasn’t built to solve the problems a backup tries to solve.

          • Sam · 2 points · 1 year ago

            Well I guess my personal definition of backup is wrong.

        • @lynny@lemmy.world · 3 points · 1 year ago

          If a program screws up and crashes while writing data to your drive, it can take out more than just the data it was dealing with. RAID will simply destroy data on both your drives at the same time, making any data recovery impossible.

  • Morethanevil · 3 points · 1 year ago

    I rsync my data once a day to another drive via a script. If I accidentally delete files, I can easily copy them back. Then once a day, rclone makes an encrypted backup to a Hetzner Storage Box.
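
    A minimal sketch of that flow, using temporary directories so it can be run safely; real paths and the rclone remote name are placeholders:

    ```shell
    #!/bin/sh
    # Demo with temp dirs; in real use SRC is the data drive, DST the backup drive.
    SRC=$(mktemp -d)
    DST=$(mktemp -d)
    echo "important" > "$SRC/notes.txt"

    # -a preserves ownership/timestamps; --delete mirrors deletions too,
    # so yesterday's copy is the window for restoring deleted files.
    rsync -a --delete "$SRC"/ "$DST"/

    cat "$DST/notes.txt"   # prints: important

    # Offsite step (requires a configured, encrypted rclone remote):
    # rclone sync "$DST" storagebox-crypt:backup
    ```
    
    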