Hi Everyone
I need some help
I’m currently self-hosting some of my applications on DigitalOcean, and I run the containers using Portainer CE. I was wondering how you guys keep backups for applications running on Docker.
I’m currently using DigitalOcean’s snapshots feature, but is there a better way? Any help on this is highly appreciated.
For databases and data I use restic-compose-backup, because it can be configured with labels in your Docker Compose files.
For config files I use a git repository.
I back up all the mounted Docker volumes once every hour (snapshots). Additionally, I create dumps of all databases with https://github.com/tiredofit/docker-db-backup (once every hour or once a day, depending on the database).
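If you’d rather not add another tool, the same idea is easy to reproduce with a small cron script. A rough sketch, where the container name, database user, paths, and retention are just placeholders (this is not how docker-db-backup works internally):

```bash
#!/usr/bin/env bash
# dump-postgres.sh - hourly Postgres dump from a running container (example names only)
set -euo pipefail

CONTAINER="postgres"        # placeholder container name
BACKUP_DIR="/backups/db"    # placeholder dump directory
STAMP="$(date +%Y%m%d-%H%M)"

mkdir -p "$BACKUP_DIR"

# pg_dumpall runs inside the container; stdout is compressed on the host
docker exec "$CONTAINER" pg_dumpall -U postgres \
  | gzip > "$BACKUP_DIR/pg-$STAMP.sql.gz"

# keep the newest 48 dumps (roughly two days at hourly frequency)
ls -1t "$BACKUP_DIR"/pg-*.sql.gz | tail -n +49 | xargs -r rm -f
```

Run it from cron, e.g. `0 * * * * /usr/local/bin/dump-postgres.sh`.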
Duplicati to take live, crash-consistent backups of all my Windows servers and VMs with the Volume Shadow Copy Service (VSS).
When backing up Docker volumes, shouldn’t the container be stopped first? I can’t see any support for that in the backup tools mentioned.
Yes, the containers do need to be stopped. I actually built a project that does exactly that.
Thanks, I will look into this.
I use Nautical. It will stop your containers before performing an rsync backup.
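For anyone wondering what that amounts to, the general stop → copy → start pattern looks roughly like this in plain shell (the container name and paths are made up for the example; this is not Nautical’s actual code):

```bash
#!/usr/bin/env bash
# Cold-copy a container's data directory with rsync (illustrative names only)
set -euo pipefail

CONTAINER="nextcloud"                          # placeholder
SRC="/var/lib/docker/volumes/nextcloud_data"   # placeholder volume path
DEST="/mnt/backup/nextcloud_data"              # placeholder destination

docker stop "$CONTAINER"

# restart the container even if the copy fails
trap 'docker start "$CONTAINER"' EXIT

# -a preserves permissions/ownership, --delete mirrors removals
rsync -a --delete "$SRC/" "$DEST/"
```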
I have bind mounts to NFS shares that are backed by ZFS pools, with snapshots and sync jobs to another storage device. All containers are ephemeral.
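If it helps, the snapshot-and-sync part can be sketched in a couple of commands, assuming a dataset called tank/docker and a second box reachable as backuphost (both names invented for the example):

```bash
#!/usr/bin/env bash
# Snapshot a ZFS dataset and replicate it to another machine (example names)
set -euo pipefail

DATASET="tank/docker"                    # placeholder dataset
SNAP="$DATASET@$(date +%Y%m%d-%H%M)"

zfs snapshot "$SNAP"

# Full send shown for simplicity; ongoing syncs would use `zfs send -i <prev> <new>`
zfs send "$SNAP" | ssh backuphost zfs receive -F backup/docker
```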
On Proxmox, I use a Hetzner Storage Box as my backup solution.
As others said, use volume mounts; I incrementally back those up with Borg to minimize storage space requirements.
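In case it helps someone, a minimal Borg run over a volumes directory might look roughly like this (the repo path, passphrase handling, and retention are placeholders, not my actual setup):

```bash
#!/usr/bin/env bash
# Incremental (deduplicated) backup of Docker volume data with Borg (example paths)
set -euo pipefail

export BORG_REPO="/mnt/backup/borg-repo"   # placeholder repo, created once with `borg init -e repokey`
export BORG_PASSPHRASE="change-me"         # better: read this from a secrets file

# Each archive is timestamped; Borg deduplicates against previous archives
borg create --stats --compression zstd \
  ::docker-volumes-{now:%Y-%m-%d_%H:%M} \
  /var/lib/docker/volumes

# Keep 24 hourly, 7 daily, and 4 weekly archives
borg prune --keep-hourly 24 --keep-daily 7 --keep-weekly 4
borg compact
```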
I use rdiff-backup to back up the volumes directory of my VPS to a local machine over a VPN. The container images live in a public registry anyway. I also use Ansible for all the configuration and container settings.
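For anyone who hasn’t used rdiff-backup, the pull over the VPN is essentially one command; the hostname and paths below are examples only:

```bash
# Pull the VPS volumes directory to a local mirror, keeping reverse increments
# (hostname and paths are placeholders)
rdiff-backup root@vps.example.internal::/var/lib/docker/volumes /srv/backups/vps-volumes

# Drop increments older than 30 days
rdiff-backup --remove-older-than 30D /srv/backups/vps-volumes
```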
ZFS snapshots.
I just run pg_dump through kubectl exec and pipe the stdout to a file on my master node. The same script then runs restic to send encrypted backups over to S3. I use the hostname flag on the restic command as a bit of a hack to get backups per service name; this eliminates the risk of overwriting files or directories with the same name.
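Roughly what that script looks like, with the deployment name, database, bucket, and paths invented for the example (not my actual setup):

```bash
#!/usr/bin/env bash
# Dump Postgres via kubectl exec, then ship it to S3 with restic (example names)
set -euo pipefail

SERVICE="my-app"                    # placeholder service name
DUMP="/backups/${SERVICE}.sql"

# Run pg_dump inside the Postgres pod and write stdout to a local file
kubectl exec deploy/postgres -- pg_dump -U postgres my_app_db > "$DUMP"

# --host keeps each service's snapshots separate inside the same repo;
# AWS credentials and RESTIC_PASSWORD are assumed to be in the environment
restic -r s3:s3.amazonaws.com/my-backup-bucket backup "$DUMP" --host "$SERVICE"
```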
I use Docker in Proxmox and I back up all the containers.
I use an Ubuntu VM for all my containers in Proxmox and make backups of the VM onto my ZFS pool.
Cron jobs to back up important folders to a separate disk
Git repo(s) for services & configs, with weekly automated commits and pushes (rough crontab sketch below)
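Roughly what those two cron entries could look like; the paths, user, and remote are placeholders:

```bash
# /etc/cron.d/selfhost-backups - example entries only

# Hourly: mirror important folders to a second disk
0 * * * *  root  rsync -a --delete /srv/appdata/ /mnt/backup-disk/appdata/

# Weekly (Sunday 03:00): commit and push the service/config repo
0 3 * * 0  root  cd /srv/compose && git add -A && (git commit -m "weekly auto-backup" || true) && git push origin main
```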
I do the reverse… all configs are Ansible playbooks and files, and I just push them to the servers. That way I can spin up a new machine from scratch, completely automated, within minutes… just the time it takes the machine to set itself up.
I use Duplicati to back up to a secure off-site location. Useful for something like Vaultwarden.
Most of mine are lightweight, so they live in private Git repos.
For big data I have two NAS units that sync daily.