A way to get Daily PostgreSQL Backups from Docker Swarm
May 15, 2021 • 8 minutes read

How do you make backups of a PostgreSQL database running in Docker Swarm?

PostgreSQL backups

A running Docker container always has a unique container ID, and Docker Swarm (Docker itself, actually) starts service tasks with dynamic IDs, so you cannot simply hardcode a container ID in a backup script.

The Docker service name, however, is fixed. We can use this name to look up the ID of the task that is currently running the service.

You can list all services with the following command:

docker service ls
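
If you run it on a manager node, the output looks roughly like this (the ID and image below are only illustrative):

ID             NAME             MODE         REPLICAS   IMAGE
x4f0u0vqmkkn   swarm_postgres   replicated   1/1        postgres:13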

My service is named swarm_postgres. What we actually need is the ID of the task (container) that runs the PostgreSQL service. We can get it with the following command:

docker service ps -f "name=swarm_postgres" swarm_postgres -q --no-trunc | head -n1

This command returns the Docker Swarm task (container) ID of the postgres service, for example: m4q72hmem43talp8ndxn6a37r

On the node, the container for that task is named after the pattern <service_name>.<slot>.<task_id>, so now that we know the task ID we also know the full container name and can make a PostgreSQL backup using the native pg_dump tool:

docker exec -t swarm_postgres.1.m4q72hmem43talp8ndxn6a37r pg_dump -U dbusername dbname > dump_dbname.sql

As a result of this command you get a dump_dbname.sql file containing the SQL dump.
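
Restoring from such a dump is the mirror operation. A minimal sketch, assuming the same service, user and database names as above, would be:

# -i (not -t) because the dump is fed through stdin
docker exec -i swarm_postgres.1.m4q72hmem43talp8ndxn6a37r psql -U dbusername dbname < dump_dbname.sql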

All of these actions can be combined into a single bash script. Let's call it postgres-backup.sh:

#!/bin/bash
#configuration settings
POSTGRES_USER=dbusername
POSTGRES_DB=dbname
CURRENT_MONTH=$(date +%Y-%m)
CURRENT_DATE=$(date +%Y-%m-%d)
CURRENT_DATETIME=$(date +%d-%b-%Y_%H_%M_%Z)
BACKUPS_PATH=/backups
DOCKER_SWARM_SERVICE_NAME=swarm_postgres
####################################
#backup PostgreSQL database
BACKUP_FOLDER=$BACKUPS_PATH/$CURRENT_MONTH/$CURRENT_DATE
if [ ! -d "$BACKUP_FOLDER" ]; then
    mkdir -p "$BACKUP_FOLDER"
fi

echo 'Creating PostgreSQL backups...'
cd "$BACKUP_FOLDER"
if [ -f 'dump_'"$POSTGRES_DB"'.sql' ]; then
   rm 'dump_'"$POSTGRES_DB"'.sql'
fi
db_backup_filename=$POSTGRES_DB'_'$CURRENT_DATETIME'.tar.gz'
# find the ID of the task currently running the PostgreSQL service
postgres_container_id=$(docker service ps -f "name=$DOCKER_SWARM_SERVICE_NAME" $DOCKER_SWARM_SERVICE_NAME -q --no-trunc | head -n1)
# dump the database from inside the container, then compress the dump
docker exec -t "$DOCKER_SWARM_SERVICE_NAME".1."$postgres_container_id" pg_dump -U "$POSTGRES_USER" "$POSTGRES_DB" > 'dump_'"$POSTGRES_DB"'.sql'
tar -cf - 'dump_'"$POSTGRES_DB"'.sql' | gzip -9 > "$db_backup_filename"
rm 'dump_'"$POSTGRES_DB"'.sql'

cd "$BACKUP_FOLDER"
md5sum * > MD5SUMS

echo 'Done.'

The script performs a couple of extra steps: it packs the dump into a tar.gz archive and writes an MD5SUMS file containing the md5 checksum of that archive.
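
If you ever need to check that an archive was not corrupted in transit or on disk, you can verify it against that file (the path below is just an example of the folder layout the script creates):

# example path: $BACKUPS_PATH/$CURRENT_MONTH/$CURRENT_DATE
cd /backups/2021-05/2021-05-15
md5sum -c MD5SUMS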

You can add this script to cron and run it every day. Just add a line to your /etc/crontab:

0 8 * * * root /root/postgres-backup.sh

In this example I run this script each day at 8 a.m.
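
Don't forget to make the script executable, otherwise cron will not be able to run it:

chmod +x /root/postgres-backup.sh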

Removing outdated PostgreSQL backups

After some time you will have a lot of archives with PostgreSQL database dumps. We don't need all of them; I suggest keeping backups only for the last 14 days.

We can check each file's modification date and remove outdated files with another bash script. Let's call it postgres-backups-cleaner.sh:

#!/bin/bash
EXPIRE_DAYS=14
SCAN_DIR=/backups
FILES=$(find $SCAN_DIR -type f)
for file in $FILES; do
    echo "Processing $file file..";
    # file modification date
    timestamp=$(date -r "$file" +%Y%m%d);
    date1yrs=$(date -d "$timestamp" +%Y);
    date1days=$(date -d "$timestamp" +%j);
    # current date
    date2yrs=$(date +%Y);
    date2days=$(date +%j);
    # approximate age of the file in days
    diffYear=$(expr "$date2yrs" - "$date1yrs");
    diffYear2days=$(expr "$diffYear" \* 365);
    diffDays=$(expr "$date2days" - "$date1days");
    DAYS=$(expr "$diffYear2days" + "$diffDays");

    if [ "$DAYS" -ge $EXPIRE_DAYS ]; then
        echo "Deleting $file file...";
        rm "$file";
    fi
done

We need to run this script on a daily basis, the same way we did for the backups. Cron is a good option for this. Edit your /etc/crontab file and add an extra line:

0 9 * * * root /root/postgres-backups-cleaner.sh

In this example I run this script each day at 9 a.m.
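
As a side note, if per-file logging is not important to you, GNU find can do roughly the same cleanup in a single line; this is just an alternative sketch, not what the script above does:

# delete regular files under /backups that are older than 14 days
find /backups -type f -mtime +14 -delete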

Copying backups to a safe place

The created backups are stored on the host machine that runs Docker Swarm. It would be bad if something happened to that machine.

I decided to copy the backups to my home PC. I have a PC running all the time, so in my case that's the cheapest way to store backups. I added a cron task that copies backups from the host machine to my PC with the rsync command:

30 21 * * * luxurydab /usr/bin/rsync --progress -zoguhvr --compress-level=9 sshusername@domain.example.com:/backups/ /media/storage/Backups/

You can use any cloud storage like Amazon S3, Google Cloud Storage, DigitalOcean Spaces, etc. Just add extra copy actions to your backup script.
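
For example, with the AWS CLI installed and configured, a line like the following at the end of postgres-backup.sh would mirror the backups folder to a bucket (the bucket name here is only a placeholder):

# "my-postgres-backups" is a placeholder bucket name
aws s3 sync /backups s3://my-postgres-backups/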

Last update March 18, 2023