Backup VPS with Rclone and upload to Google Drive

Categories: Blog

This post is adapted from Rclone – VPS Backup to Google Drive. I fixed some bugs in the original code and added a function to back up the Gogs database.

I used to back up my VPS with tools like Duplicity or Rsync. Now there is a newer, more effective, and cheaper (free) method: backing up to the cloud with Rclone.

Rclone is a data synchronization tool similar to Rsync, but focused on connecting to cloud storage services.

The advantages of cloud storage services are high speed (servers are located around the world), data safety (no worries about hardware or network failures), and, best of all, most of them are free.

Rclone supports many popular Cloud services such as:

  • Google Drive
  • Amazon S3
  • OpenStack Swift / Rackspace cloud files / memset Memstore
  • Dropbox
  • Google Cloud Storage
  • Amazon Cloud Drive
  • Microsoft One Drive
  • Hubic
  • Backblaze B2
  • Yandex Disk

Instead of using up storage on another VPS for backups, I switched to Google Drive: 15GB of free storage, and more is cheap to buy, only 45k/month for 100GB. If you have a Google Apps account, even better.

This article has two main parts: first, installing Rclone on the VPS; second, using Rclone to upload the compressed backup files to Google Drive. The steps are the same for other cloud services.

Creating a full backup of your VPS data is covered in detail in the article Guidelines automatically backup the entire VPS; this article focuses only on automatically uploading the compressed files to Google Drive. More manuals for Rclone with Google Drive and other cloud services are here.
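Before the script below can run, Rclone needs to be installed and a Google Drive remote configured. A minimal setup sketch (the remote name "GDrive" matches the `REMOTE_NAME` used in the script; the install URL is rclone's official install script):

```shell
# Install rclone using the official install script
curl https://rclone.org/install.sh | sudo bash

# Create a remote interactively: name it "GDrive", choose the
# "drive" (Google Drive) storage type, and follow the OAuth prompts
rclone config

# Verify the remote works by listing its top-level directories
rclone lsd GDrive:
```

`rclone config` stores the credentials (by default in `~/.config/rclone/rclone.conf`), so the backup script can run unattended afterwards.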

The automatic backup scenario is as follows:

  • Dump the entire MySQL server into .sql files, one file per database
  • Back up all site code in the folder /data/wwwroot
  • Back up the entire Nginx configuration folder /usr/local/nginx
  • Back up the Gogs database
  • Compress all the data into a single TGZ file
  • Upload the backup file to Google Drive at 2:00 am
  • After the upload, delete the backup files on the VPS, and delete backup files on Google Drive that are more than one month old

Make sure to use absolute paths for all files, as crontab may run with different environment variables. Details can be found at running-a-cron-job-manually-and-immediately
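To see why this matters: cron runs jobs with a minimal environment (PATH is often just `/usr/bin:/bin`), so commands that work in your interactive shell may not be found. A small sketch of resolving absolute paths up front, as the backup script does with `MYSQL`, `Mysqldump`, and `rclone`:

```shell
# Resolve each tool's absolute path once, so the same script works
# both interactively and under cron's minimal PATH.
TAR=$(command -v tar)   || { echo "tar not found" >&2; exit 1; }
GZIP=$(command -v gzip) || { echo "gzip not found" >&2; exit 1; }

echo "tar is at:  $TAR"
echo "gzip is at: $GZIP"
```

Alternatively, you can set `PATH` explicitly at the top of the script or in the crontab file itself.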

#!/bin/bash

cwd=$(pwd)
SERVER_NAME=VPS
REMOTE_NAME=GDrive

TIMESTAMP=$(date +%F)
BAK_DIR=/mnt/blockstorage/Backup
BACKUP_DIR=${BAK_DIR}/${TIMESTAMP}
#BACKUP_DIR="/mnt/blockstorage/Backup/$TIMESTAMP"
MYSQL_USER="root"
MYSQL=/usr/local/mysql/bin/mysql
MYSQL_PASSWORD=XXXXXX
Mysqldump=/usr/local/mysql/bin/mysqldump
rclone=/usr/bin/rclone

SECONDS=0

mkdir -p "$BACKUP_DIR/mysql"

echo "Starting Backup Database";
databases=$($MYSQL -u $MYSQL_USER -p$MYSQL_PASSWORD -e "SHOW DATABASES;" | grep -Ev "(Database|information_schema|performance_schema|mysql)")

for db in $databases; do
echo ${db}
$Mysqldump --compact -u $MYSQL_USER -p$MYSQL_PASSWORD --opt --force --databases $db | gzip > "$BACKUP_DIR/mysql/$db.gz"
done
echo "Finished";
echo '';

echo "Starting Backup Website";
mkdir -p $BACKUP_DIR/web
#Loop Through the / home directory
for D in /data/wwwroot/*; do
if [ -d "$D" ]; then #if a directory
domain=${D##*/}  # Domain name
echo "-" $domain;
zip -r -y -q $BACKUP_DIR/web/$domain.zip /data/wwwroot/$domain -x "/data/wwwroot/$domain/wp-content/cache/*" #Exclude cache (quoted so zip, not the shell, expands the glob)
fi
done
echo "Finished";
echo '';

echo "Starting Backup Nginx Configuration";
mkdir -p $BACKUP_DIR/nginx/
cp -r /usr/local/nginx/conf $BACKUP_DIR/nginx/
echo "Finished";
echo '';

echo "Starting Backup Git file";
su git -c "cd /home/git/gogs; ./gogs backup --archive-name git.zip"
mv /home/git/gogs/git.zip ${BACKUP_DIR}/
echo "Finished";
echo '';

echo "Starting compress file";
size1=$(du -sh ${BACKUP_DIR} | awk '{print $1}')
#cd /mnt/blockstorage/Backup/
cd ${BAK_DIR}
tar -czf  ${TIMESTAMP}".tgz" $TIMESTAMP
cd $cwd
size2=$(du -sh ${BACKUP_DIR}".tgz"| awk '{print $1}')
rm -rf ${BACKUP_DIR}
echo "File compress from "$size1"  to "$size2
echo "Finished";
echo '';

echo "Starting Backup Uploading";
$rclone copy ${BACKUP_DIR}.tgz "$REMOTE_NAME:/$SERVER_NAME/" 
$rclone -q delete --min-age 30d "$REMOTE_NAME:/$SERVER_NAME"   #remove remote backups older than 30 days (in rclone, "1m" means 1 minute, not 1 month)
find ${BAK_DIR} -mindepth 1 -mtime +30 -delete
echo "Finished";
echo '';

duration=$SECONDS
echo "Total $size2, $(($duration/60)) minutes and $(($duration%60)) seconds elapsed."

To run the script every Sunday at 2:00 am, just add the following line to your crontab:

0 2 * * 0 /root/backup_vps.sh > /dev/null 2>&1
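To restore, fetch an archive back from the remote (e.g. `rclone copy "GDrive:/VPS/2024-01-01.tgz" /tmp/restore/`, where the date is a placeholder archive name) and unpack it to recover the dated backup directory. The unpacking step, demonstrated here with a locally created archive and placeholder data:

```shell
# Build a sample archive with the same layout the backup script produces
mkdir -p /tmp/demo/2024-01-01/mysql
echo "-- dump --" > /tmp/demo/2024-01-01/mysql/site.sql
tar -czf /tmp/demo/2024-01-01.tgz -C /tmp/demo 2024-01-01

# Unpack it, as you would after downloading the archive from Google Drive
mkdir -p /tmp/restore
tar -xzf /tmp/demo/2024-01-01.tgz -C /tmp/restore

# The per-database dumps, web zips, and nginx config reappear
# under the dated directory
ls /tmp/restore/2024-01-01/mysql
```

Database dumps can then be restored with `gunzip` piped into `mysql`, and the web zips unpacked back into /data/wwwroot.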