Setting up automatic server backup to cloud storage


Amazon S3 or Wasabi Cloud Storage

It’s easy to set up an automated backup of your server to cloud storage to protect yourself from data loss or corruption.

We’re going to link your cloud storage bucket (Amazon S3 or Wasabi), where your files will be stored in the cloud, to your Linux computer or server. It’s similar to installing Dropbox on Windows and having a Dropbox folder on your system.

If you don’t already have an Amazon S3 or Wasabi account, the following guides will show you how to set them up and create a bucket (which you need for storing the backups):

How to create an Amazon S3 bucket

How to create a Wasabi bucket

I recommend Wasabi because it’s much easier to set up, thanks to its user-friendly interface, and it costs less than Amazon S3.

The Wasabi guide in the link above also shows you how to get the access keys which you’ll need to connect Wasabi to your computer or server.

Step 1. Install s3fs

Begin by installing s3fs:

sudo apt update
sudo apt install s3fs
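
If you want to confirm the install worked, you can check the version s3fs reports:

s3fs --version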

Step 2. Create your Access Keys

If you’re using Wasabi, use the guide from the start of this article to get your access keys (How to create a Wasabi bucket).

If you’re using Amazon S3, the process is slightly more complicated because you need to create an IAM user within Amazon AWS and then create the access keys separately:

Create an IAM user in your AWS account

Create access keys for the IAM user

Store your Amazon or Wasabi keys and set permissions (replace “access_key” with your Access Key ID and “secret_key” with the Secret Key for your Amazon IAM user or Wasabi bucket user):

sudo echo "access_key:secret_key" > ~/.passwd-s3fs
sudo chmod 600 ~/.passwd-s3fs

Step 3. Create your backup folder

Create a folder (directory) on your Linux computer or server that you can link to your cloud storage:

If you’re using Amazon S3 I suggest:

sudo mkdir /mnt/s3

If you’re using Wasabi I suggest:

sudo mkdir /mnt/wasabi

Step 4. Connect

Now you need to connect (mount) your cloud storage bucket to your Linux computer or server. In the examples below I’ve used the regions “eu-west-1” and “eu-west-2”; set this to match the region shown in your Amazon S3 or Wasabi account for the bucket you created. Replace bucketname with the name of your bucket in Amazon S3 or Wasabi.

For Amazon S3:

sudo s3fs bucketname /mnt/s3 -o passwd_file=~/.passwd-s3fs -o allow_other -o endpoint=eu-west-2 -o use_path_request_style

For Wasabi:

sudo s3fs bucketname /mnt/wasabi -o passwd_file=~/.passwd-s3fs -o url=https://s3.eu-west-1.wasabisys.com -o endpoint=eu-west-1
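
If the mount command returns without any errors, the bucket should now be attached. A quick way to double-check is to look for it in the mount list (this works for either provider):

mount | grep s3fs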

Step 5. Test

Now create a test file on your Linux computer or server using one of the commands below, then check that it shows up in your Amazon S3 or Wasabi bucket.

For Amazon S3:

sudo touch /mnt/s3/linuxmadeeasy.txt

For Wasabi:

sudo touch /mnt/wasabi/linuxmadeeasy.txt

Step 6. Make it permanent

Now that you’ve successfully linked your cloud storage to your Linux computer or server, you can make sure it stays permanently connected (mounted) across reboots.
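
One thing to watch: the fstab lines below read the credentials from /etc/passwd-s3fs (a system-wide location) rather than the ~/.passwd-s3fs file we created earlier, so copy your credentials file there first and restrict its permissions:

sudo cp ~/.passwd-s3fs /etc/passwd-s3fs
sudo chmod 640 /etc/passwd-s3fs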

Open the file:

sudo nano /etc/fstab

Add the following line for Amazon S3:

s3fs#bucketname /mnt/s3 fuse _netdev,allow_other,passwd_file=/etc/passwd-s3fs,endpoint=eu-west-2,use_path_request_style 0 0

Add the following line for Wasabi:

s3fs#bucketname /mnt/wasabi fuse _netdev,allow_other,passwd_file=/etc/passwd-s3fs,url=https://s3.eu-west-1.wasabisys.com,endpoint=eu-west-1 0 0

Remember to change bucketname and the region in the lines above to match your own. To save and exit, use CTRL-O and then CTRL-X.
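
You can check the fstab entry works without rebooting by unmounting the bucket and asking the system to mount everything in fstab again (replace /mnt/wasabi with your path):

sudo umount /mnt/wasabi
sudo mount -a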

Creating your server backup

Now that you’ve connected Amazon S3 or Wasabi to your Linux computer or server, you can make a backup and save it there.

I’ve covered making backups using BorgBackup, Restic and Duplicity. I recommend Duplicity because I feel it’s the easiest to set up and it provides incremental backups.

Using BorgBackup

If you want to use BorgBackup, first update your packages:

sudo apt update

Now install BorgBackup:

sudo apt install borgbackup

Install any compression library you want to use, for example zstd:

sudo apt install zstd

You can also use LZ4 or zlib, for example.

Now you can start making a backup. First, initialise the directory for the backups (replace /mnt/s3 with your path):

borg init --encryption=repokey /mnt/s3

You will be asked to enter a passphrase, as you’re encrypting the backup directory/folder here to keep it secure.

Next you should see a message saying that you will need both KEY AND PASSPHRASE to access this repo. To get your key, use the following command (replace /mnt/s3 with your path):

borg key export /mnt/s3

You should copy the key and keep it safe; you won’t be able to access your backups without your key and passphrase.

Now create a backup like this (replace /mnt/s3 with your path). The excludes keep the pseudo-filesystems out of the backup and stop Borg from trying to back up the mounted bucket into itself:

borg create --compression zstd --exclude /proc --exclude /sys --exclude /dev --exclude /run --exclude /mnt /mnt/s3::backup-{now} /
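
To confirm the archive was created, you can list what’s in the repository (again, replace /mnt/s3 with your path):

borg list /mnt/s3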

You can automate the backup process by creating a cron job or systemd timer to run the backup command at regular intervals. This ensures that your backups are created automatically according to your schedule.
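
As a rough sketch, a crontab entry (crontab -e) for a daily 2:00 AM Borg backup might look like the line below. It assumes you supply the repository passphrase through the BORG_PASSPHRASE environment variable, so replace 'your_borg_passphrase' with the passphrase you set during borg init and adjust the paths and schedule to suit:

# Daily Borg backup at 2:00 AM; 'your_borg_passphrase' is a placeholder
0 2 * * * BORG_PASSPHRASE='your_borg_passphrase' borg create --compression zstd --exclude /proc --exclude /sys --exclude /dev --exclude /run --exclude /mnt /mnt/s3::backup-{now} /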

Using Restic

Use Restic if you want to back up snapshots of your system that you can restore from for disaster recovery purposes.

Update packages and install Restic:

sudo apt update
sudo apt install restic

Initialise a repository (replace /mnt/s3 with your path):

restic init --repo /mnt/s3/restic-backup

Create a tar archive of your entire filesystem:

sudo tar --exclude=/proc --exclude=/mnt --exclude=/sys --exclude=/dev --exclude=/run -czvf /tmp/filesystem_backup.tar.gz /

It may take a while to compress your entire filesystem.

Copy the backup archive we just created to your cloud storage using the following command:

restic backup /tmp/filesystem_backup.tar.gz --repo /mnt/s3/restic-backup

Verify the Restic backup:

restic snapshots --repo /mnt/s3/restic-backup

This should display the backups you just created.
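
If you ever need the archive back, Restic can restore a snapshot directly. As a sketch, the command below pulls the latest snapshot into a temporary directory (the /tmp/restic-restore path is just an example location):

restic restore latest --repo /mnt/s3/restic-backup --target /tmp/restic-restore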

Using Duplicity

Duplicity is ideal if you want to make use of incremental backups rather than performing full backups each time.

What’s an incremental backup: The three types of backups

Step 1. Install Duplicity

Update your packages and install Duplicity:

sudo apt update
sudo apt install duplicity

Step 2. Make a backup folder (directory)

Create a backup folder within your mounted cloud storage bucket to store your backup:

For Amazon S3:

sudo mkdir /mnt/s3/backup

For Wasabi:

sudo mkdir /mnt/wasabi

Step 3. Start backup

Now you’re ready to begin your first backup. Use the following command to perform a full backup; every time you run the same command in the future, it will detect file changes and automatically perform an incremental backup only.

For Amazon S3:

duplicity / --exclude=/proc --exclude=/mnt --exclude=/sys --exclude=/dev --exclude=/run/ --exclude=/tmp/ file:///mnt/s3/backup

For Wasabi:

duplicity / --exclude=/proc --exclude=/mnt --exclude=/sys --exclude=/dev --exclude=/run/ --exclude=/tmp/ file:///mnt/wasabi/backup

It will ask you to create a passphrase which will be used to encrypt your backup. Make sure you remember your passphrase as you’ll need it whenever you restore your backups.
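
Once the first backup finishes, you can check what’s in the backup set at any time. The command below assumes the Amazon S3 path, so swap in /mnt/wasabi/backup if you’re using Wasabi:

duplicity collection-status file:///mnt/s3/backup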

Step 4. Automate the server cloud storage backup

It’s easy to automate the backup. To set it up, open your crontab:

crontab -e

Add a new line to the file to schedule your backup. For example, to perform the backup daily at 3:00 AM you would add:

For Amazon S3:

0 3 * * * PASSPHRASE='your_backup_password' duplicity / --exclude=/proc --exclude=/mnt --exclude=/sys --exclude=/dev --exclude=/run/ --exclude=/tmp/ file:///mnt/s3/backup

For Wasabi:

0 3 * * * PASSPHRASE='your_backup_password' duplicity / --exclude=/proc --exclude=/mnt --exclude=/sys --exclude=/dev --exclude=/run/ --exclude=/tmp/ file:///mnt/wasabi/backup

Replace your_backup_password with a password to encrypt your backup; you will need this whenever you restore your backup.
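
The backup set will keep growing over time, so you may also want to prune old backups occasionally. As a sketch, this would delete backup sets older than 30 days (adjust the age and the path, and use the same passphrase placeholder as above):

PASSPHRASE='your_backup_password' duplicity remove-older-than 30D --force file:///mnt/s3/backup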

Other Server Backup Options

You have plenty of options for backing up your Linux computer or server. For example, if you wanted to create an image backup for easy restoration of your system, you could use Clonezilla.

Or you could consider using one of the popular paid backup services like Veeam Backup or Acronis Cloud Backup.

It’s worth mentioning Backblaze B2 Cloud Storage as well: it’s S3-compatible, so you can use s3fs to set it up in a similar way to Amazon S3 and Wasabi, and the prices are unbeatable.

Restoring backup with Duplicity

Restoring the backup with Duplicity is mostly the same process as making the backup, so I’ll just summarise it here; if you’re stuck, refer to the same steps further up the page.

Install Duplicity on the fresh server:

sudo apt update
sudo apt install duplicity

Install s3fs and add your keys

sudo apt install s3fs

Replace “access_key” and “secret_key” with your keys

sudo echo "access_key:secret_key" > ~/.passwd-s3fs
sudo chmod 600 ~/.passwd-s3fs

Create the backup folder

For Amazon S3:

sudo mkdir /mnt/s3

For Wasabi:

sudo mkdir /mnt/wasabi

Mount your bucket

For Amazon S3:

sudo s3fs bucketname /mnt/s3 -o passwd_file=~/.passwd-s3fs -o allow_other -o endpoint=eu-west-2 -o use_path_request_style

For Wasabi:

sudo s3fs bucketname /mnt/wasabi -o passwd_file=~/.passwd-s3fs -o url=https://s3.eu-west-1.wasabisys.com -o endpoint=eu-west-1

Restore using Duplicity and rsync

First, create a temporary directory for the backup:

sudo mkdir /tmp/restored

Now you can restore your backup to this temporary directory.

For Amazon S3:

sudo duplicity restore file:///mnt/s3/backup /tmp/restored

For Wasabi:

sudo duplicity restore file:///mnt/wasabi/backup /tmp/restored

It will ask for the passphrase you set when you created the backup earlier.

It may take a while. Once the process completes, you need to restore the full system backup to the root of the system, which you can do using rsync:

sudo rsync -av /tmp/restored/ /

That’s it! You’ve restored your backup.

Following this you will probably find that many services are not running because we excluded the /run/ (runtime) directory, so you’ll have to start your services again one by one (or simply reboot).
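
One more tip: if you only ever need a single file or folder back rather than the whole system, Duplicity can restore an individual path. As a sketch, the command below would restore etc/nginx/nginx.conf (a hypothetical example path, written relative to the backup root with no leading slash) to a temporary location:

# etc/nginx/nginx.conf and /tmp/nginx.conf are example paths only
sudo duplicity restore --file-to-restore etc/nginx/nginx.conf file:///mnt/s3/backup /tmp/nginx.conf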

Whatever backup and recovery process you decide on, I hope this guide has been useful to you!

PS: have you considered setting up a Cloudflared Tunnel?