Methods of Backing Up From a Linux Desktop to a Linux Server

Here’s what I use for performing backups. I use SSH with public key authentication. My file server runs in a VM with two hard drives passed through to it: one for general storage and one for storing movies and music.

My setup is built around rsync. Here are my scripts for performing a full backup and a differential backup:

grayson@grayson-epcotcenter:~$ cat bin/backup-home-grayson.sh
#!/bin/bash
HNAME=nas
BACKUPDIR=/data/nas/backups/grayson-epcotcenter/grayson-$(date +%Y%m%d-%H%M%S)-full
# Directories to back up (some entries elided). Keeping them in an array
# lets the commented-out entries sit in the list without breaking the
# line continuations of a single rsync command.
SOURCES=(
    ~/Documents
    ~/Videos
    ~/Storage
    ~/eBooks
    # ...
    ~/.mozilla
    ~/.ssh
    ~/.pki
)
rsync -e 'ssh' -ra "${SOURCES[@]}" "$HNAME:$BACKUPDIR"
# Document the date and time the files were backed up on the remote server.
ssh "$HNAME" "date > $BACKUPDIR/backupdate.txt"

grayson@grayson-epcotcenter:~$ cat bin/diffbackup-home-grayson.sh
#!/bin/bash
HNAME=nas
BACKUPDIR=/data/nas/backups/grayson-epcotcenter/grayson-$(date +%Y%m%d-%H%M%S)-diff
# Directories to back up (some entries elided); same list as the full backup.
SOURCES=(
    ~/Documents
    ~/Videos
    ~/Storage
    ~/eBooks
    # ...
    ~/.mozilla
    ~/.ssh
    ~/.pki
)
# --link-dest hardlinks unchanged files to the referenced full backup, so
# each differential snapshot stores only what changed since then.
rsync -e 'ssh' -ra --link-dest=/data/nas/backups/grayson-epcotcenter/grayson-20210701-093714-full \
    "${SOURCES[@]}" "$HNAME:$BACKUPDIR" > /dev/null
ssh "$HNAME" "date > $BACKUPDIR/backupdate.txt"
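
Note that --link-dest above is hardcoded to the July full backup, so every differential links against that same baseline. A minimal sketch of picking the newest full backup automatically instead (this is a hypothetical helper, not one of the scripts above, and only two source directories are shown for brevity):

#!/bin/bash
# Hypothetical helper: link each new differential against the most recent
# full backup instead of a hardcoded date. Timestamped names sort correctly
# as plain strings, so the last entry is the newest.
HNAME=nas
BASE=/data/nas/backups/grayson-epcotcenter
LATEST_FULL=$(ssh "$HNAME" "ls -1d $BASE/grayson-*-full | sort | tail -n 1")
BACKUPDIR=$BASE/grayson-$(date +%Y%m%d-%H%M%S)-diff
rsync -e 'ssh' -ra --link-dest="$LATEST_FULL" \
    ~/Documents ~/Videos "$HNAME:$BACKUPDIR" > /dev/null
ssh "$HNAME" "date > $BACKUPDIR/backupdate.txt"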

Here’s how it looks over the last couple of months:

[username-redacted]@nas:/data/nas/backups/grayson-epcotcenter$ du -h -d 1
...
109G       ./grayson-20210701-093714-full
377M       ./grayson-20210705-174243-diff
417M       ./grayson-20210709-010948-diff
422M       ./grayson-20210713-143840-diff
53G        ./grayson-20210723-170221-diff
176G       ./grayson-20210805-013615-full
...
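
If you want to verify that --link-dest is doing its job, matching inode numbers between snapshots show that an unchanged file is stored only once. A quick check on the server (the file path here is hypothetical; pick any file you know didn’t change between the two snapshots):

# Identical inode numbers (first column) mean the full and diff snapshots
# share one copy of the file's data via a hardlink.
stat -c '%i %h %n' \
    grayson-20210701-093714-full/Documents/notes.txt \
    grayson-20210705-174243-diff/Documents/notes.txt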

I do have NFS on my file server virtual machine, but the last thing I want is ransomware encrypting all the files on my NAS, which is why I do not automatically mount my NFS share on my desktop PC unless absolutely necessary (a sketch of the manual mount follows below). Not even sshfs.

Of course, I do not receive any kind of spam, because I make use of 190+ email aliases (no catch-all and no plus addressing at all). T-Mobile suffered a data breach recently, so even though I’m not affected, in addition to changing my password I also changed the unique email address I use for T-Mobile, so I can become a moving target and not get any phishing email messages. Is it overkill? Yes, but that’s because paranoia is part of my mindset, in addition to protecting my privacy online.
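
That manual-mount workflow might look like this (the export path and mount point are made up for illustration):

# Mount the share by hand only while a transfer actually runs...
sudo mount -t nfs nas:/data/nas /mnt/nas
# ...and unmount it right afterwards, so malware on the desktop never has
# a standing write path to the NAS.
sudo umount /mnt/nas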

I’ve been a Linux user for the past 10+ years and I have not encountered any malware in that time. So yeah, I do follow as much as I can regarding my security hygiene. I’m even protecting myself with NoScript and pfBlockerNG in pfSense. I don’t want to come across any form of malvertising (malicious advertising).

Sure, my thread might seem more security-related than backup-related; however, my question is: how do you perform your own backups from your desktop to your home server? Do you use a script like the one shown above?

If you’re doing incremental backups, then ransomware would only affect that increment (and later ones), not earlier ones, assuming you’re backing up to another server (as well as off-site).

We mount our shared volumes from our primary Synology NAS, and most of our working files live on there, as well as our media libraries, etc. It’s backed up nightly with Hyper Backup to another Synology NAS here, and also to Synology’s C2 service. Backups are incremental, with a retention policy set so that we can easily restore a file from over a year ago.

If a ransomware attack ran, it would just encrypt the working files. The next backup would pick them up as changed and add them to the backup servers - it wouldn’t replace backup files. So if it happened today, we’d just restore yesterday’s backup state and only be out today’s work.

The other thing we have is a self-hosted GitLab server running in Docker on our main NAS (which is also backed up), so project files are always under version control through that, and we also have our dotfiles in a git repository.
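
For anyone curious, self-hosting GitLab in Docker can be as simple as running the upstream image with persistent volumes. A minimal sketch (the hostname, ports, and host paths are placeholders, not the actual setup described above):

# Run GitLab CE with config, logs, and data persisted on the host, so the
# container is disposable and the volumes are what gets backed up.
docker run -d --name gitlab \
    --hostname gitlab.example.com \
    --restart always \
    -p 8443:443 -p 8080:80 -p 2222:22 \
    -v /srv/gitlab/config:/etc/gitlab \
    -v /srv/gitlab/logs:/var/log/gitlab \
    -v /srv/gitlab/data:/var/opt/gitlab \
    gitlab/gitlab-ce:latest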

Also, we have the system configurations for all our different Linux systems in our Ansible project, so rebuilding a machine means installing from media, then running Ansible to configure it, and ta-da.
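
A sketch of that rebuild flow, assuming made-up repository, inventory, and playbook names:

# Fresh install from media, then pull the Ansible project and let it put
# the machine back into its known configuration.
git clone https://gitlab.example.com/home/ansible-config.git
cd ansible-config
ansible-playbook -i inventory.ini workstation.yml --ask-become-pass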

So, we don’t back up the workstations, as they can be easily rebuilt, our working projects can easily be re-cloned from git, and our resource files are mounted from the NAS.

Unless the house burns down; then we’d have to get by on what we have with us at the time while we wait for replacement hardware to arrive.

My strategy is similar to what @Buffy does, though I don’t use C2.

Most of my critical data is in the form of source code, so I have three different Git service providers, plus my own GitLab server, providing redundancy. On my workstation, I do use rsync to send data to a local storage volume, which is backed up to my TrueNAS server running ZFS. I don’t back up any code projects, as those are already in Git.
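
On the ZFS side, snapshotting the backup dataset after each rsync run makes the copies cheap to keep and read-only once taken. A sketch with a hypothetical pool and dataset name:

# Snapshot the backup dataset; snapshots are immutable, so anything that
# later tampers with the live copies can't touch them.
zfs snapshot tank/backups@$(date +%Y%m%d-%H%M%S)
# List the snapshots available for a restore or rollback.
zfs list -t snapshot -r tank/backups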

My next home-lab project will be adding a second NAS server, probably Synology, with the sole purpose of backup and recovery.

I back up to an encrypted external HDD using rsync. The backup drive is only connected to my desktop/laptop during backups.
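
A sketch of that flow with a LUKS-encrypted drive (the device path, mapper name, and mount point are placeholders):

# Unlock and mount the encrypted drive only for the duration of the backup.
sudo cryptsetup open /dev/sdX1 backup   # substitute the real partition
sudo mount /dev/mapper/backup /mnt/backup
rsync -a ~/ /mnt/backup/home/
sudo umount /mnt/backup
sudo cryptsetup close backup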

+1 for NoScript

I run Pi-hole instead of pfBlockerNG.

Oh, okay. Well, thanks everyone.

@GraysonPeddie thank you for sharing your rsync script with us. I’m going to look at it in a bit more depth and try to learn some things to apply to my own backups. I don’t think I’ve been using rsync to do incremental backups, which is what your script does. We can always learn something from one another.

I am having issues setting up git on GitLab to back up my dotfiles.
I’ve read and gone through the instructions in the GitLab docs, Git - ArchWiki, and How to store dotfiles | Atlassian Git Tutorial, but I can’t get it to work.
Can someone point me to a resource that shows this being set up, from creating the git repo to committing the dotfiles?
Apologies if this is the wrong place for this post.

For managing my dotfiles, I use yadm (https://yadm.io/), which is basically a wrapper for git.

So, using GitLab, you can just create a new project (say, “My Dotfiles”), and I always check the “Create a Readme.md” box, just so there will be something to clone.

Then you can do “yadm clone” on that project’s URL. From there, just use ‘yadm’ the same way you’d use ‘git’:

yadm add .bashrc .bash_profile .bash_aliases
yadm commit -m "Initial commit."
yadm push

and so on.
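
For completeness, the clone step with a placeholder URL would look like:

# Clone the dotfiles repo; yadm overlays the tracked files directly onto
# $HOME instead of using a separate working tree.
yadm clone https://gitlab.example.com/user/dotfiles.git
yadm status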

I’ll give that a try when I get the chance. Thanks.

I’ve done it a similar way myself, although I used Python. In retrospect, Python was probably not the best way to go; more lines of code than necessary.

I would like to second @Buffy’s suggestion of yadm. I looked at a few different tools, but to keep my dotfiles synced between two Debian 10 machines (one at the office and one at home), it worked well. Because of version differences, I didn’t use it to also sync my home workstation, which runs Fedora 34 at the moment.

Oh, okay! Thank you everyone for sharing!

@Buffy in a worst-case scenario, what if your computer gets infected with ransomware, which encrypts your home directory and discovers that there is a .git folder or similar and that yadm or git is installed? Wouldn’t the ransomware encrypt the repository stored on the remote server?

Sure, good security hygiene is important; however, not everyone has the best security hygiene in the world or follows best security practices all the time, so ransomware infections can happen regardless of the operating system. Of course, I have not seen any news of Linux desktop users getting infected with ransomware, since the main target is Windows and possibly Mac users. But then, this is a Linux forum, so I’d better stick to Linux. :)

Well, you’d be backing up your remote server with proper incrementals, etc., so even if it could, you’d restore from backup.
