
How to Set Up Automatic Backups for Websites and Databases (Complete Guide)

11/2/2025 • Festus Ayomike

Your website might be live and running smoothly, but what happens if your server crashes, a bad update breaks your files, or your database gets corrupted? That’s where automatic backups come in.

A solid backup system ensures that no matter what happens, you can restore your website and database quickly and easily.

In this guide, you’ll learn how to:

  • Create daily backups using cron jobs
  • Back up databases automatically
  • Sync backups to cloud storage (Google Drive, Dropbox, or AWS S3)
  • Manage old backups safely

See Also: How to Set Up a Reverse Proxy and SSL with Nginx and Certbot (Production Server Setup)

Step 1: Understand the Backup Strategy

Follow the 3-2-1 rule of backups:

  1. Keep 3 copies of your data
  2. Store on 2 different storage types (e.g., server + cloud)
  3. Keep 1 copy offsite (in case your server is lost or damaged)

Step 2: Back Up Website Files

Start by creating an archive of your site files.

For manual backups:

Code · bash
tar -czf website-backup.tar.gz /var/www/mywebsite.com/html

For automated backups, use a cron job. Open the crontab editor:

Code · bash
crontab -e

Create the destination directory first (`sudo mkdir -p /backups`), then add this line to back up every night at midnight:

Code · bash
0 0 * * * tar -czf /backups/website-$(date +\%F).tar.gz /var/www/mywebsite.com/html

Now your website files will automatically be compressed and stored daily. Note that the `%` in the date format is escaped as `\%` because cron treats an unescaped `%` as a newline.
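Before relying on the schedule, it is worth confirming that an archive is actually readable. A minimal sketch, using a scratch directory under /tmp as a stand-in for your real web root:

```shell
# Sketch: build a sample archive and verify it without extracting.
# The /tmp paths are illustrative stand-ins for your site directory.
mkdir -p /tmp/demo-site/html
echo "<h1>hello</h1>" > /tmp/demo-site/html/index.html

# Same form as the nightly cron command, pointed at the scratch directory.
tar -czf /tmp/website-$(date +%F).tar.gz -C /tmp/demo-site html

# -t lists the archive contents; a corrupt archive exits non-zero here.
tar -tzf /tmp/website-$(date +%F).tar.gz
```

The same `tar -tzf` check can be pointed at any archive in /backups after the cron job has run.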

See Also: How to Host a Website on an Nginx VPS (Complete Ubuntu Server Guide)

Step 3: Back Up Your Database

For MySQL or MariaDB databases, you can use mysqldump:

Code · bash
mysqldump -u root -p mydatabase > /backups/db-$(date +%F).sql

Add that command to a cron job so it runs every day at 1 AM:

Code · bash
0 1 * * * mysqldump -u root -p'yourpassword' mydatabase > /backups/db-$(date +\%F).sql

Be aware that an inline password like this is visible to anyone who can read your crontab, so prefer a credentials file where possible.
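One way to keep the password out of the crontab is a `~/.my.cnf` credentials file, which mysqldump reads automatically. A sketch (`yourpassword` is a placeholder, as above):

```shell
# Sketch: store MySQL credentials in ~/.my.cnf instead of the crontab.
# "yourpassword" is a placeholder; use your real password on the server.
cat > ~/.my.cnf <<'EOF'
[client]
user=root
password=yourpassword
EOF
chmod 600 ~/.my.cnf   # readable by the owner only

# The cron entry can then drop the -u/-p flags entirely:
# 0 1 * * * mysqldump mydatabase > /backups/db-$(date +\%F).sql
```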

For PostgreSQL (typically run as the `postgres` user):

Code · bash
pg_dump mydatabase > /backups/db-$(date +%F).sql

Step 4: Sync Backups to Cloud Storage

Local backups are good, but if the entire server fails, you could lose everything.

To prevent this, use rclone to sync backups to cloud storage like Google Drive, Dropbox, or AWS S3.

Install rclone:

Code · bash
sudo apt install rclone -y

Configure it:

Code · bash
rclone config

Then sync your backups:

Code · bash
rclone copy /backups remote:website-backups --progress

Automate it to run daily at 2 AM (dropping `--progress`, which is only useful in an interactive terminal):

Code · bash
0 2 * * * rclone copy /backups remote:website-backups

See Also: How to Monitor Website Uptime and Performance (Step-by-Step Tutorial)

Step 5: Manage Backup Retention

Backups take up space quickly. To delete backups older than 7 days, run:

Code · bash
find /backups -type f -mtime +7 -delete

You can also add that to your cron job schedule.
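To see the retention rule in action without touching real backups, you can simulate it in a scratch directory (the /tmp paths are illustrative):

```shell
# Sketch: simulate a 7-day retention policy in a scratch directory.
mkdir -p /tmp/demo-backups
touch /tmp/demo-backups/fresh.tar.gz
# Back-date one file 10 days to stand in for an old backup (GNU touch -d).
touch -d "10 days ago" /tmp/demo-backups/old.tar.gz

# Same find command as above, pointed at the scratch directory.
find /tmp/demo-backups -type f -mtime +7 -delete

ls /tmp/demo-backups   # only fresh.tar.gz should remain
```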

Step 6: Test Backup Restoration

A backup is only useful if you can actually restore it.

To restore your database:

Code · bash
mysql -u root -p mydatabase < db-2025-10-03.sql

To restore your files, extract from the filesystem root — tar strips the leading `/` when it creates an archive, so the stored paths already begin with `var/www/...`:

Code · bash
tar -xzf website-2025-10-03.tar.gz -C /

Make sure to test your backups regularly to confirm that they’re valid.
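One low-risk way to run that test regularly is to extract into a scratch directory and compare against the source, rather than overwriting the live site. A sketch with placeholder paths:

```shell
# Sketch: restore into a scratch directory and diff against the original.
mkdir -p /tmp/site/html /tmp/restore
echo "content" > /tmp/site/html/index.html

tar -czf /tmp/site-backup.tar.gz -C /tmp/site html   # make a test archive
tar -xzf /tmp/site-backup.tar.gz -C /tmp/restore     # restore elsewhere

# diff exits non-zero if anything differs, so this only prints on success.
diff -r /tmp/site/html /tmp/restore/html && echo "restore OK"
```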

Conclusion

Automatic backups are your website’s safety net. By combining local cron jobs with off-site cloud storage, you can protect your data against hardware failure, human error, or cyberattacks.
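The steps above can be combined into a single script run from one cron entry. This is a sketch rather than a drop-in tool: the paths are placeholders, and the mysqldump and rclone lines are commented out because they assume credentials and a configured remote that exist only on your server.

```shell
#!/usr/bin/env bash
# Sketch of a combined nightly backup: archive, dump, sync, prune.
# Paths are placeholders; adjust BACKUP_DIR and SITE_DIR to your setup.
set -euo pipefail

BACKUP_DIR=/tmp/backups            # e.g. /backups in production
SITE_DIR=/tmp/demo-site/html       # e.g. /var/www/mywebsite.com/html
STAMP=$(date +%F)

mkdir -p "$BACKUP_DIR" "$SITE_DIR"

# 1. Archive website files.
tar -czf "$BACKUP_DIR/website-$STAMP.tar.gz" \
    -C "$(dirname "$SITE_DIR")" "$(basename "$SITE_DIR")"

# 2. Dump the database (uncomment on a real server; assumes ~/.my.cnf credentials).
# mysqldump mydatabase > "$BACKUP_DIR/db-$STAMP.sql"

# 3. Sync to cloud storage (uncomment once an rclone remote named "remote" exists).
# rclone copy "$BACKUP_DIR" remote:website-backups

# 4. Prune local backups older than 7 days.
find "$BACKUP_DIR" -type f -mtime +7 -delete

echo "backup complete: website-$STAMP.tar.gz"
```

Saved as, say, /usr/local/bin/backup.sh and made executable, it can replace the separate entries with a single schedule such as `0 0 * * * /usr/local/bin/backup.sh`.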

In the next Hosting Academy post, we’ll take your skills further — optimizing website performance and scaling your servers for maximum speed and reliability.