There are two sayings that I'm fond of.

"It's better to have it and not need it, then to need it and not have it" and "One is none, two is one" 

Both sayings fit perfectly when it comes to backups.

Backups exist for the sole purpose of restoring data, whether it's a few files or entire servers, after either a catastrophic failure or an accidental deletion. Within my homelab and home network, I run several different types of backup solutions. While redundant and overlapping, they give me the opportunity to test and play around with different backup solutions, and they give me different ways to recover the same file(s) in the event of a deletion or catastrophic failure.

Backup solutions that I'm currently using:

  • Bareos (Backup Archiving Recovery Open Sourced) - for file system backups
  • Robocopy - for backing up Windows servers
  • Bash scripts - for backing up MySQL databases and important files on Linux machines (a sketch of one such script follows this list)
  • Veeam - for my VMware infrastructure
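
To give a rough idea of the third item, here's a minimal sketch of what a nightly backup script along those lines can look like. The paths, database name, and retention period are hypothetical placeholders, not my actual configuration.

```bash
#!/usr/bin/env bash
# Minimal sketch of a nightly Linux backup script (illustrative paths/names only).
set -euo pipefail

BACKUP_DIR="/srv/backups"        # hypothetical destination directory
DB_NAME="example_db"             # hypothetical database name
DATE="$(date +%F)"

mkdir -p "${BACKUP_DIR}/${DATE}"

# Dump the MySQL database (credentials read from ~/.my.cnf) and compress it.
mysqldump --single-transaction "${DB_NAME}" | gzip > "${BACKUP_DIR}/${DATE}/${DB_NAME}.sql.gz"

# Archive the important files, e.g. /etc and a web root.
tar -czf "${BACKUP_DIR}/${DATE}/files.tar.gz" /etc /var/www

# Keep roughly the last 30 daily backups.
find "${BACKUP_DIR}" -mindepth 1 -maxdepth 1 -type d -mtime +30 -exec rm -rf {} +
```

A script like this is typically run from cron or a systemd timer so it happens without any manual intervention.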

No matter how good a backup solution is, if all your data is stored on site or exists on the same server(s) as the machines being backed up, you're still in a position where you can lose everything. For that reason, offsite backups are extremely important, whether that means replicating data from point A to point B or having a dedicated offsite solution in place.
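
Replicating from point A to point B can be as simple as a scheduled rsync job over SSH. The host name and paths below are placeholders for illustration, not my actual setup.

```bash
# Sketch of mirroring a local backup directory to an offsite host over SSH
# (placeholder user, host, and paths).
rsync -az --delete \
    /srv/backups/ \
    backupuser@offsite.example.com:/srv/backups-mirror/
```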

The offsite solution I have in place for my critical data is to export the data to encrypted external hard drives and store them at a secondary location. These drives are rotated on a monthly basis, so that I always have an N+1 copy of my data and, in a worst-case scenario, only lose a month's worth of data. I'll go over my offsite solution(s) in more detail in upcoming articles.
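
As a rough sketch of what that export step can look like (the specifics are for the upcoming articles), the example below assumes a LUKS-encrypted external drive on Linux; the device name and mount point are placeholders.

```bash
# Sketch of copying backups onto a LUKS-encrypted external drive
# (placeholder device name and mount point).
cryptsetup open /dev/sdb1 offsite_drive    # prompts for the passphrase
mount /dev/mapper/offsite_drive /mnt/offsite

rsync -a /srv/backups/ /mnt/offsite/

umount /mnt/offsite
cryptsetup close offsite_drive
```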

For security reasons, I won't go into extreme depth about how things are set up, specific IP addresses of hosts, configuration files, etc.

I will go over a generalization of my environment, and this will be covered across multiple articles.
