blog.n11n

2026-02-26

How I back up

History

When I started to build my homelab, backups were a core part of the design. I wanted to be sure my data was available and wouldn't be lost in an emergency. Having a reliable backup plan has already saved me more than once.

Here's my workflow for any data I can't afford to lose.

3-2-1-1 strategy

Everything below works toward the 3-2-1-1 rule: three copies of the data, on two different types of media, with one copy off-site and one copy offline.

Nightly backups

Before migrating to Proxmox, I generated incremental backups with rsync from a cron job that looked like this:

30 4 * * * rsync_script.sh || alert "Backup: ERROR"

rsync_script.sh handled the rsync configuration and stopped and restarted any services that needed it. On failure, alert sent a message to my notification service using curl.
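The alert helper only needs a few lines of shell. A minimal sketch, assuming an HTTP notification endpoint that accepts a POSTed message body (the URL below is a placeholder, not my real service):

```shell
# Minimal alert helper: POST the message to a notification endpoint,
# falling back to a local log line if delivery fails.
# The URL is a placeholder (.invalid never resolves), not my real service.
alert() {
  msg="$*"
  curl -fsS --max-time 10 -d "$msg" \
    "${ALERT_URL:-https://notify.invalid/homelab}" >/dev/null 2>&1 ||
    echo "ALERT (delivery failed): $msg" >&2
}

alert "Backup: ERROR"
```

Because curl exits non-zero when delivery fails, the fallback still leaves a trace even if the notification service itself is down.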

Now, this is managed with Proxmox Backup Server running in a separate VM. Daily snapshots are retained for the last 30 days, while monthly snapshots are kept for the last six months.
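In PBS terms, that retention maps onto the keep options of a prune job. A hypothetical invocation via proxmox-backup-client (the repository and backup group names are placeholders):

```shell
# Sketch of a prune matching the retention above;
# repository and backup group are placeholders.
proxmox-backup-client prune vm/100 \
  --repository backup@pbs@pbs.lan:datastore \
  --keep-daily 30 \
  --keep-monthly 6
```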

Off-site / Offline export

Every two weeks I run a script that:

  1. Creates a compressed archive (tar.gz) of the latest snapshot
  2. Encrypts it using AES-256
  3. Uploads the encrypted archive to the cloud
  4. Notifies me the archive is ready to be moved offline
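Steps 1 and 2 above can be sketched as a single pipeline so the plaintext archive never touches disk. Everything here (paths, the demo data, the passphrase) is a stand-in; the real script goes on to upload the .enc file and fire the notification:

```shell
set -eu

SNAP="${SNAP:-/tmp/demo-snap}"          # stand-in for the latest snapshot dir
OUT="${OUT:-/tmp/export.tar.gz.enc}"
PASS="${PASS:-demo-passphrase}"         # the real script reads a key file instead

mkdir -p "$SNAP"; echo "vm config" > "$SNAP/vm.conf"   # demo data

# Steps 1-2: stream the tarball straight into AES-256 encryption.
tar -czf - -C "$SNAP" . |
  openssl enc -aes-256-cbc -pbkdf2 -pass "pass:$PASS" -out "$OUT"

# Round-trip check: decrypt and list the archive contents.
openssl enc -d -aes-256-cbc -pbkdf2 -pass "pass:$PASS" -in "$OUT" | tar -tzf -
```

The -pbkdf2 flag matters: without it, openssl falls back to a weak legacy key-derivation scheme.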

In the past this ran every week. However, I noticed most of my really important data wasn't changing much week over week. Moving to a bi-weekly schedule has been a good balance between convenience and how much recent data I could stand to lose.

Testing

Regularly testing my backups has been crucial. I do it in two ways:

  1. Checksum validation: this is automated and runs every night, notifying me of any problems.
  2. Restore drill: once a month, I spin up a new VM using my latest off-site archive and confirm everything starts correctly.
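The nightly checksum pass is conceptually just a manifest plus sha256sum -c. A self-contained sketch (the directory and file here are stand-ins, and PBS also has its own built-in verify jobs):

```shell
set -eu

BACKUP_DIR="${BACKUP_DIR:-/tmp/demo-backup}"    # stand-in for the real backup path
mkdir -p "$BACKUP_DIR"
echo "important data" > "$BACKUP_DIR/data.txt"  # demo payload

# Written once, when the backup is created:
( cd "$BACKUP_DIR" && sha256sum data.txt > SHA256SUMS )

# Run nightly: re-hash everything and compare against the manifest.
if ( cd "$BACKUP_DIR" && sha256sum -c --quiet SHA256SUMS ); then
  echo "checksums OK"
else
  echo "checksums FAILED"   # the real script calls the alert hook here
fi
```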

I keep a simple "restore checklist" in a repository (along with a printed copy) so anyone can follow it step-by-step.

Being able to really trust my backups has helped turn any scary situation into a routine confidence boost.