Stop Hoping Your Backups Work: A Self-Hosted Backup Strategy That Actually Scales

Why you need automated, encrypted, offsite backups. A practical guide to Restic and Duplicati—and why I ditched the rest.

The Moment Everything Goes Wrong

Your SSD fails. Your homelab crashes. Your Nextcloud instance gets encrypted by ransomware. You tell yourself “it’s fine, I have backups,” but then you realize: you never actually tested them.

I’ve been there. Three years ago, I lost 6 months of photos because my backup strategy was basically “copy files to another folder on the same machine.” When that drive died, so did everything.

Now? Automated, encrypted, distributed backups. And yeah, it’s easier than you think.

The Backup Holy Trinity

Before we get technical, here’s what a real backup strategy needs:

  1. Automated — If it requires you to remember to do it, it won’t happen
  2. Encrypted — Your backup server shouldn’t be a liability
  3. Offsite — If your house burns down, your backups shouldn’t be in it

That’s it. If your solution doesn’t check all three boxes, you’re gambling.

Why I Chose Restic (And Why You Probably Should Too)

I’ve tested Duplicati, Bacula, Borgmatic, and Veeam. They all work. But Restic won. Here’s why:

Restic is stupidly simple. One binary. Initialize a repo, set a cron job, done. No agents, no database, no backup windows. Just incremental snapshots that happen in the background.

The kicker: incremental deduplication. Your first backup of 500GB takes a while. The next backup? 2 minutes if only 5GB changed. Restic doesn’t care about full/differential/incremental—it just stores what’s new.
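The idea behind that speed is content-addressed deduplication: data gets split into chunks, each chunk is hashed, and a chunk the repository already holds is never stored twice. Here’s a toy illustration of the concept in plain shell — fixed-size chunks for simplicity, whereas restic actually uses content-defined chunking and SHA-256:

```shell
# Toy sketch of content-addressed deduplication (NOT how restic
# stores data on disk -- just the core idea).
workdir=$(mktemp -d)
head -c 65536 /dev/urandom > "$workdir/data"              # 64 KiB of random data
cat "$workdir/data" "$workdir/data" > "$workdir/doubled"  # the same data, twice

# Split into fixed 4 KiB chunks and hash each one
split -b 4096 "$workdir/doubled" "$workdir/chunk."
total=$(ls "$workdir"/chunk.* | wc -l)
unique=$(sha256sum "$workdir"/chunk.* | awk '{print $1}' | sort -u | wc -l)

echo "total chunks:  $total"    # 32
echo "unique chunks: $unique"   # 16 -- the duplicate half adds nothing new
rm -rf "$workdir"
```

Half the chunks hash to values the “repository” has already seen, so a second identical copy costs nothing. That’s why restic’s second run of a mostly-unchanged 500GB dataset finishes in minutes.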

The cryptography is solid. Everything is encrypted before it leaves your machine. Even if someone breaks into your backup server, all they get is encrypted blobs. AES-256 by default. No backdoors.

I run Restic backing up:

  • My Nextcloud data (photos, documents, the works)
  • My Forgejo repos (belt and suspenders for code)
  • My Immich database (because I’m paranoid about losing photos twice)
  • Database dumps from my applications

All automated. All encrypted. All off-site.

Duplicati: The GUI Alternative

Here’s the thing—not everyone wants to live in a terminal. Duplicati exists for that reason.

Pros:

  • Web UI that’s actually intuitive
  • Supports a ridiculous number of backends (AWS, Azure, Google Drive, Wasabi, B2, even SFTP)
  • Incremental backups
  • Can run as a service

Cons:

  • Slower than Restic (sometimes noticeably)
  • Requires more system resources
  • Database can get corrupted if you’re not careful
  • Community is smaller than Restic’s

I tested Duplicati for about a month. It worked fine. But when I ran restic restore vs duplicati restore side-by-side, Restic was 3-4x faster. On a NAS backing up 2TB+, that matters.

Use Duplicati if:

  • You prefer a GUI
  • You want to back up to a managed service (Google Drive, AWS)
  • Your backup jobs are infrequent
  • You don’t want to touch the command line

My Restic Setup (Copy This)

Here’s my actual config. You can use it as a template.

1. Install Restic

# On most Linux distros
apt install restic

# Or grab the binary from the site if you're on something weird
wget https://github.com/restic/restic/releases/download/v0.17.0/restic_0.17.0_linux_amd64.bz2
bunzip2 restic_0.17.0_linux_amd64.bz2
chmod +x restic_0.17.0_linux_amd64
mv restic_0.17.0_linux_amd64 /usr/local/bin/restic

2. Create a Backup Repository

I use SFTP to back up to my VPS. (You could also use S3, Wasabi, B2, or any remote server.)

# Initialize the repo (choose a strong password!)
restic -r sftp:[email protected]:/backups/homelab init

# Restic will ask for a password. Write it down. Seriously.
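If you’d rather not type (or hardcode) that password, restic can read it from a file via the `RESTIC_PASSWORD_FILE` environment variable or the `--password-file` flag. A quick sketch — the path is just an example, put it wherever suits your setup:

```shell
# Generate a strong random password and keep it in a file only
# you can read. The path is an example, not a restic convention.
pwfile="$HOME/.restic-password"
head -c 32 /dev/urandom | base64 > "$pwfile"
chmod 600 "$pwfile"

# Tell restic to read the password from the file
export RESTIC_PASSWORD_FILE="$pwfile"

# If you lose every copy of this password, the repo is gone for
# good -- keep an offline copy somewhere safe.
```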

3. Write Your Backup Script

Create /usr/local/bin/backup-homelab.sh and make it executable with chmod +x:

#!/bin/bash

export RESTIC_REPOSITORY="sftp:[email protected]:/backups/homelab"
# Better: point RESTIC_PASSWORD_FILE at a chmod-600 file instead of
# hardcoding the password here.
export RESTIC_PASSWORD="your-super-secret-password-here"

# Backup Nextcloud data
restic backup /var/lib/nextcloud/data \
  --exclude="/var/lib/nextcloud/data/*/files/.ipynb_checkpoints" \
  --exclude="/var/lib/nextcloud/data/*/cache"

# Backup app configs
restic backup /etc/appdata --exclude="/etc/appdata/*/cache"

# Backup databases (dump them first! $DB_PASSWORD must be set elsewhere)
mysqldump -u root -p"$DB_PASSWORD" --all-databases | \
  restic backup --stdin --stdin-filename "databases-$(date +%Y%m%d).sql"

# Prune old snapshots (keep 7 daily, 4 weekly, 12 monthly)
restic forget --keep-daily 7 --keep-weekly 4 --keep-monthly 12 --prune

echo "Backup completed at $(date)"
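Two hardening touches worth adding near the top of a script like this — a sketch, with an arbitrary lock path: fail fast on errors, and refuse to start while a previous run is still going.

```shell
#!/bin/bash
# Abort on errors, unset variables, and failures inside pipelines --
# without pipefail, a failing mysqldump would go unnoticed, because
# restic (the last command in the pipe) can still exit 0.
set -euo pipefail

# Take an exclusive lock so overlapping cron runs can't collide.
# The lock path is an arbitrary choice.
exec 9>/tmp/backup-homelab.lock
if ! flock -n 9; then
  echo "Previous backup still running, skipping this run." >&2
  exit 1
fi

# ... the restic backup / forget commands from the script above ...
```

The lock is released automatically when the script exits, so there’s nothing to clean up.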

4. Automate With Cron

# Backup every day at 2 AM
0 2 * * * /usr/local/bin/backup-homelab.sh >> /var/log/restic-backup.log 2>&1

That’s it. Restic handles the rest.
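If you’re on a systemd distro, a timer is a solid alternative to cron: runs show up in the journal, and Persistent=true catches up on runs missed while the box was off. A sketch — unit names and paths are arbitrary:

```ini
# /etc/systemd/system/backup-homelab.service
[Unit]
Description=Restic homelab backup

[Service]
Type=oneshot
ExecStart=/usr/local/bin/backup-homelab.sh

# /etc/systemd/system/backup-homelab.timer
[Unit]
Description=Run restic backup daily at 2 AM

[Timer]
OnCalendar=*-*-* 02:00:00
Persistent=true

[Install]
WantedBy=timers.target
```

Enable it with `systemctl enable --now backup-homelab.timer`, and check the schedule with `systemctl list-timers`.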

Testing Your Backups (The Part Everyone Skips)

Here’s the truth: a backup that’s never been tested is just wishful thinking.

I test my backups quarterly. Here’s how:

# List available snapshots
restic snapshots

# Pick one and restore to a test folder
restic -r sftp:... restore 8ac0a --target /tmp/restore-test

# Verify the files are actually there
ls -la /tmp/restore-test

When you restore files, Restic pulls from the encrypted blobs and reassembles them. If this works, you’re golden. If it fails, you know now—not when you actually need it.

Pro tip: Restore to a VM or different machine if possible. Just to be 100% sure.
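Beyond eyeballing `ls`, actually diff the restored tree against the live one while the live data is still intact. (Restic also has a built-in integrity check: `restic check`, optionally with `--read-data-subset` to sample the stored blobs.) A sketch, with example paths:

```shell
# Compare a restored tree against the live tree, file by file.
# diff -r exits non-zero on the first mismatch or missing file.
live=/var/lib/nextcloud/data                       # example paths --
restored=/tmp/restore-test/var/lib/nextcloud/data  # adjust to yours

if diff -r "$live" "$restored" >/dev/null; then
  echo "Restore verified: trees are identical."
else
  echo "MISMATCH -- investigate before you trust this backup!" >&2
fi
```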

What About Ransomware?

Restic doesn’t prevent ransomware. But it mitigates damage.

Here’s my defense:

  1. Versioned snapshots — Restic keeps old versions. Even if your live data gets encrypted, the snapshots from 3 days ago are untouched—provided the attacker can’t reach the repository itself.

  2. Air-gapped backups — I run one backup to my NAS (fast recovery), and another to my VPS (offsite). If ransomware hits my NAS, the VPS copy is still clean.

  3. Restricted credentials — The SSH key my homelab uses to reach the VPS is locked down with a forced command in authorized_keys, so it can run the backup transfer and nothing else. (For true append-only semantics—where stolen credentials can add snapshots but never delete them—restic’s rest-server has an --append-only flag worth a look.)

  4. Monitoring — I get an alert if a backup hasn’t completed in 24 hours. If my homelab gets encrypted and backups stop running, that alert fires. That’s my signal something’s wrong.
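The monitoring piece can be as simple as a marker file the backup script touches on success, plus a cron job that complains when the marker gets stale. A minimal sketch — the marker path is an example, and the echo is a placeholder for your real notifier (mail, ntfy, whatever):

```shell
# In the backup script, after a successful run:
#   touch /var/local/last-backup-ok
# Then check the marker's age from a separate cron job:

marker=/var/local/last-backup-ok   # example path

# find -mmin +1440 prints the file only if it's older than 24h;
# a missing marker also counts as "backup never ran".
if [ ! -f "$marker" ] || [ -n "$(find "$marker" -mmin +1440)" ]; then
  echo "ALERT: no successful backup in the last 24 hours" >&2
  # replace this echo with your actual notification command
fi
```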

No backup strategy is 100% bulletproof. But Restic with offsite copies, versioned snapshots, and monitoring? That’s pretty damn close.

The One Thing I Almost Forgot

Bandwidth costs you money.

If you’re parking 2TB in AWS S3, storage alone runs roughly $45–50/month—plus egress fees the day you actually need to restore it. Wasabi charges a flat storage rate with no egress fees; Backblaze B2 sits in the same budget tier.

I chose a cheap VPS ($5/month) and run backups over SFTP. Total cost: the VPS. No surprise charges.

Your choice depends on your risk tolerance:

  • Cheap VPS — Cheaper and faster, but securing it is on you
  • Managed service (S3, B2, Wasabi) — Costs more, but durability and security are someone else’s full-time job

I do both. VPS for quick recovery. B2 for paranoia.

Now You Try

Here’s your action plan for today:

  1. Pick a backup solution (Restic or Duplicati)
  2. Set up a destination (VPS, Wasabi, B2, whatever)
  3. Write the backup script
  4. Set a cron job
  5. Test the restore before you need it

That’s not overkill. That’s responsibility.

In a week, you’ll be the person saying “yeah, my backups are fine” and actually meaning it.


Have a backup story? Either the disaster that taught you, or the peace of mind you got from having one? Drop me a line. I’m curious how other people approach this.

Until next time—backup early, backup often, and test them occasionally.

Updated March 2026 | Restic 0.17+ | Tested on Debian 12 + Proxmox
