Your website might be live and running smoothly, but what happens if your server crashes, a bad update breaks your files, or your database gets corrupted? That’s where automatic backups come in.
A solid backup system ensures that no matter what happens, you can restore your website and database quickly and easily.
In this guide, you’ll learn how to:
- Create daily backups using cron jobs
- Back up databases automatically
- Sync backups to cloud storage (Google Drive, Dropbox, or AWS S3)
- Manage old backups safely
See Also: How to Set Up a Reverse Proxy and SSL with Nginx and Certbot (Production Server Setup)
Step 1: Understand the Backup Strategy
Follow the 3-2-1 rule of backups:
- Keep 3 copies of your data
- Store on 2 different storage types (e.g., server + cloud)
- Keep 1 copy offsite (in case your server is lost or damaged)
Step 2: Back Up Website Files
Start by creating an archive of your site files.
For manual backups:
tar -czf website-backup.tar.gz /var/www/mywebsite.com/html

For automated backups, use a cron job. Open the crontab editor:
crontab -e

Add this line to back up every night at midnight:
0 0 * * * tar -czf /backups/website-$(date +\%F).tar.gz /var/www/mywebsite.com/html

Now your website files will automatically be compressed and stored daily.
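In practice it is often cleaner to point the cron entry at a small script than to inline the whole tar command, so you can add error handling later in one place. A minimal sketch; the `backup_files` helper is hypothetical, and the demo uses throwaway directories instead of your real paths:

```shell
#!/usr/bin/env bash
# Sketch of a file-backup helper a cron entry could call.
set -euo pipefail

backup_files() {
  local src="$1" dest="$2"
  mkdir -p "$dest"
  local archive="$dest/website-$(date +%F).tar.gz"
  # -C strips the leading path, so the archive restores relocatably.
  tar -czf "$archive" -C "$(dirname "$src")" "$(basename "$src")"
  echo "$archive"
}

# Demo against a throwaway directory instead of /var/www:
site="$(mktemp -d)/html"
mkdir -p "$site" && echo '<h1>hi</h1>' > "$site/index.html"
archive="$(backup_files "$site" "$(mktemp -d)")"
echo "Created $archive"
```

With real paths, the cron line becomes a single call to the script instead of a long tar invocation.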
See Also: How to Host a Website on an Nginx VPS (Complete Ubuntu Server Guide)
Step 3: Back Up Your Database
For MySQL or MariaDB databases, you can use mysqldump:
mysqldump -u root -p mydatabase > /backups/db-$(date +%F).sql

Add that command to a cron job so it runs every day:
0 1 * * * mysqldump -u root -p'yourpassword' mydatabase > /backups/db-$(date +\%F).sql

Keep in mind that putting the password directly in the crontab exposes it to anyone who can read that file; storing the credentials in a ~/.my.cnf file that only your user can read is safer.

For PostgreSQL:
pg_dump mydatabase > /backups/db-$(date +%F).sql

Step 4: Sync Backups to Cloud Storage
Local backups are good, but if the entire server fails, you could lose everything.
To prevent this, use rclone to sync backups to cloud storage like Google Drive, Dropbox, or AWS S3.
Install rclone:
sudo apt install rclone -y

Configure it:
rclone config

Then sync your backups:
rclone copy /backups remote:website-backups --progress

Automate it to run daily at 2 AM:
0 2 * * * rclone copy /backups remote:website-backups

(The --progress flag is only useful in an interactive shell, so it is dropped from the cron entry.)

See Also: How to Monitor Website Uptime and Performance (Step-by-Step Tutorial)
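Put together, Steps 2 through 4 can run from a single nightly script invoked by one cron entry. A hedged sketch: the script name is hypothetical, and DUMP_CMD and SYNC_CMD default to stubs so the sketch runs without a live database or a configured rclone remote:

```shell
#!/usr/bin/env bash
# nightly-backup.sh (hypothetical): archive files, dump the database,
# and push everything off-site. The defaults below are test-friendly
# stand-ins; in production set SITE_DIR=/var/www/mywebsite.com/html,
# BACKUP_DIR=/backups, DUMP_CMD="mysqldump mydatabase",
# SYNC_CMD="rclone copy /backups remote:website-backups".
set -euo pipefail

SITE_DIR="${SITE_DIR:-$(mktemp -d)}"
BACKUP_DIR="${BACKUP_DIR:-$(mktemp -d)}"
DUMP_CMD="${DUMP_CMD:-echo stub-dump}"
SYNC_CMD="${SYNC_CMD:-echo stub-sync}"

stamp="$(date +%F)"
mkdir -p "$BACKUP_DIR"

# 1. Archive the site files.
tar -czf "$BACKUP_DIR/website-$stamp.tar.gz" -C "$(dirname "$SITE_DIR")" "$(basename "$SITE_DIR")"

# 2. Dump the database, compressed; pipefail makes a failed dump fatal.
$DUMP_CMD | gzip > "$BACKUP_DIR/db-$stamp.sql.gz"

# 3. Sync the backup directory off-site.
$SYNC_CMD

echo "Backup complete in $BACKUP_DIR"
```

With real values filled in, one cron entry running this script can replace the three separate jobs (the install path for the script is up to you).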
Step 5: Manage Backup Retention
Backups take up space quickly. To delete backups older than 7 days, run:
find /backups -type f -mtime +7 -delete

You can also add that to your cron job schedule.
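Before pointing that command at your real backups, it is worth a dry run against a throwaway directory to confirm it deletes only what you expect (GNU find and touch assumed, as shipped with Ubuntu):

```shell
#!/usr/bin/env bash
# Dry run of the retention rule against a throwaway directory.
set -euo pipefail

dir="$(mktemp -d)"
touch "$dir/fresh.tar.gz"
touch -d '10 days ago' "$dir/stale.tar.gz"

# Same rule as the cron job: delete files older than 7 days.
find "$dir" -type f -mtime +7 -delete

ls "$dir"   # only fresh.tar.gz remains
```

Note that -mtime +7 matches files whose age is strictly more than 7 whole 24-hour periods, so a backup made exactly a week ago survives one more day.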
Step 6: Test Backup Restoration
A backup is only useful if you can actually restore it.
To restore your database:
mysql -u root -p mydatabase < db-2025-10-03.sql

To restore your files:
tar -xzf website-2025-10-03.tar.gz -C /

Because the archive was created from the absolute path /var/www/mywebsite.com/html, tar stored the full directory structure (minus the leading slash), so extracting at / puts the files back in their original location; extracting into the html directory itself would nest a second copy of the path inside it. Make sure to test your backups regularly to confirm that they’re valid.
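The restore drill itself can be scripted as a round-trip check: archive a directory, extract it somewhere else, and diff the two. A minimal sketch using throwaway stand-ins for your real site and backup paths:

```shell
#!/usr/bin/env bash
# Round-trip check: archive a directory, restore it elsewhere, and diff.
# All paths here are throwaway stand-ins, not your production layout.
set -euo pipefail

site="$(mktemp -d)/html"
mkdir -p "$site" && echo '<h1>hi</h1>' > "$site/index.html"

archive="$(mktemp -d)/website-test.tar.gz"
tar -czf "$archive" -C "$(dirname "$site")" html

restore="$(mktemp -d)"
tar -xzf "$archive" -C "$restore"

# diff exits non-zero (failing the script) if the restore is incomplete.
diff -r "$site" "$restore/html" && echo "Restore verified"
```

Running a check like this on a schedule catches silently corrupted archives long before you need them in an emergency.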
Conclusion
Automatic backups are your website’s safety net. By combining local cron jobs with off-site cloud storage, you can protect your data against hardware failure, human error, or cyberattacks.
In the next Hosting Academy post, we’ll take your skills further — optimizing website performance and scaling your servers for maximum speed and reliability.

