RAID is not a backup

I’m not sure who will benefit from me posting this, and I may be preaching to the choir on this forum, but it’s something I’ve seen among people developing their backup strategies, whether for their home computers or for the computers and servers in their businesses, and I wanted to bring it up.

Please remember that RAID is not a backup. A sound backup strategy (this is covered in detail all over the web, so I won’t go deep here) is the 3-2-1 method: keep 3 copies of your data on 2 different types of media, with at least 1 of those copies offsite. But a lot of people assume that if they have a NAS that features some type of RAID, the RAID itself counts as a backup. That’s asking for trouble, and here’s why:

1.) RAID won’t save you from accidental deletion of files
2.) RAID won’t save you from a natural disaster that destroys the entire server
3.) RAID won’t save you from a supply-chain failure (I have a long story about this from my web hosting experience)
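
To make 3-2-1 concrete, here’s a minimal sketch using rsync for the local mirror and an offsite sync tool like rclone for the cloud copy (all paths and remote names are placeholders, not a recommendation of any particular provider):

```
#!/bin/bash
# 3 copies: the original, a local mirror, an offsite copy
# 2 media:  internal disk + external drive
# 1 offsite: a cloud remote ("offsite" is a placeholder rclone remote name)

SRC=/home/me/data

# copy 2: mirror to an external drive
rsync -a --delete "$SRC/" /mnt/external/data/

# copy 3: sync to the offsite remote
rclone sync "$SRC" offsite:backups/data
```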

I’d like to hear about your experiences with data loss nightmares if you have any. Also tell me about your backup strategies. I want to create some discussion here.

5 Likes

I think your 3-2-1 strategy is right for user data.
For system backups there is no need to be so meticulous, at least for home computers: an OS can easily be reinstalled from scratch, whereas user data is unrecoverable unless you have backups.
I don’t really have an offsite strategy, other than using GitHub for specific projects. I’d be interested to hear what others recommend there.

1 Like

On my main machine, which runs W11 Pro, I keep a local backup and a backup on an external drive that is only connected when I back up. I run both the built-in backup and EaseUS Todo Backup for redundancy. Both have saved me from data loss, due mostly to user error. I also keep my data on other PCs, on separate data drives.

Yes, using other PCs is a good option. The more redundant copies you keep, the lower the risk of loss.

You mention data partitions. It used to be said that one’s backup strategy started with one’s choice of partitioning scheme. Is that still true today?

@nevj
That may be true, but my system-image backups have let me restore an OS image without a reinstall. I don’t worry too much about Linux, but Windows can be a nightmare.

3 Likes

That is true, especially if a reinstall means a lot of configuring.
What I meant was that you don’t need to do system backups as frequently, and I don’t think offsite is needed. Systems don’t change as much as user data.
An important server might be more in need of regular system backups than a humble home computer.

The role of snapshots (Timeshift, Systemback, etc.) in all this is not clear to me?

Snapshots are used to reverse changes to a system in the moment - e.g. you’re going to upgrade something, and you want a way to “roll back” the changes if something goes wrong. They’re not usually portable, though, so you can’t separate them from the VM or system they’re attached to, and they usually disappear when the system gets reinstalled. They’re good as an intermediary measure, but they should never take the place of actual, proper backups.
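
To make that concrete: with Timeshift the whole cycle is a couple of commands (the snapshot name below is illustrative - use whatever --list shows on your box):

```
# take a snapshot before an upgrade (run as root)
sudo timeshift --create --comments "pre-upgrade"

# see what snapshots exist
sudo timeshift --list

# roll back if the upgrade goes wrong
sudo timeshift --restore --snapshot "2024-01-15_10-30-01"
```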

3 Likes

@nevj
I try to run backups once a month, mostly incremental, where only the data that has changed gets backed up. I also make a new system-image backup after Windows Update has changed the system. I for one do not use online backup, and I will not use Timeshift, for various reasons; I have used Windows System Restore a few times.

My backups consist of a Synology NAS that has 2 external drives mounted to it; they back up the NAS alternately, one on one day and the other the next, and so on. I also have larger backups that don’t change, which I perform manually as needed. I don’t bother with backing up the OS - if I need to do that I’ll just make a system image. Right now my issue is that I’m running out of space on both the NAS and the external drives, so I’m having to be selective about what I choose to back up.
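
For anyone who wants to mimic that alternation with plain tools, a rough sketch using bash, cron, and rsync might look like this (mount points and paths are placeholders, not my actual setup):

```
#!/bin/bash
# run daily from cron; alternates between two external drives
# (10# forces base ten so day numbers like "089" aren't read as octal)
day=$(( 10#$(date +%j) ))
if (( day % 2 == 0 )); then
    dest=/volumeUSB1/usbshare    # even days -> drive A
else
    dest=/volumeUSB2/usbshare    # odd days -> drive B
fi

# mirror the NAS data share to whichever drive is today's target
rsync -a --delete /volume1/data/ "$dest/data/"
```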

I learnt the lesson the hard way. Many, many years ago I had a 750 GB external Samsung drive that kept a noticeable amount of precious data. Of course this drive was the only place I had this data.
The drive failed suddenly, from one second to the next…
Since then I keep EVERYTHING important on 2 different hard drives. Nowadays I have a bunch of drives, of different ages, from different vendors…
I have had some failing drives, but I always had the data on another, so no data loss since then…
This is not the backup part though, just a kind of archive, mirrored on 2 separate sets of drives.

On my home server I run Seafile server. All of our computers synchronize precious folders to Seafile.
Seafile has a (per-volume) configurable version history, so when someone overwrites a document on Monday, I can still revert it on Friday, or even the next month…
The Seafile instance is backed up to a backup server as a whole: the data directory with the file chunks, and the database. Then we mount the Seafile shared volumes on the server via FUSE (we don’t use encryption) and rsync the stored files to the backup server; when that’s done, the backup server shuts down.
So we have all the data on our computers and phones (phones sync via FolderSync to a WebDAV directory provided by Seafile), AND on the Seafile server (Seafile holds multiple versions of each file).
Seafile’s data is backed up to another machine every day automatically.
Losing data could only occur if the Seafile server, the backup server, and the computer originally storing that data all broke at the same time.
I can’t really imagine this…
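
In case it helps anyone, the backup job is conceptually something like this sketch (the install path and backup-server name are placeholders; seaf-fuse is Seafile’s own FUSE tool):

```
#!/bin/bash
# 1. expose the Seafile libraries as plain files via FUSE
/opt/seafile/seafile-server-latest/seaf-fuse.sh start /mnt/seafile-fuse

# 2. copy the plain files over to the backup server
rsync -a --delete /mnt/seafile-fuse/ backup-server:/backups/seafile-files/

# 3. unmount again
/opt/seafile/seafile-server-latest/seaf-fuse.sh stop

# 4. power the backup server down until the next run
ssh backup-server 'sudo poweroff'
```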

3 Likes

@Doron_Beit-Halahmi
Nothing that fancy for my machine!!! I have a 1 TB HDD and a 4 TB HDD for local backups and data, and a WD 3 TB external drive for backups only. EaseUS Todo Backup is set to back up all my data partitions and make a system image. I would not want to reinstall W11. The reason for separate data partitions, on a drive that is separate from the C: drive, is to defeat W11 and OneDrive. I hate OneDrive.

2 Likes

Yeah, I’m a bit OCD about backups, if you couldn’t tell lol

Business vs Home computing backup needs are completely different.

Way back in the day when I was involved, the company I worked at backed up everything every night, from disk to tape, and the tapes were sent offsite that morning. We also contracted with a company called Sungard that provided an offsite computer center where we could practice recovering the data and linking back up to the offices 100 miles away. Of course that is “very old” history from the ’80s and ’90s.

For home, a monthly backup of the system OS on a separate disk, and every other month on a USB disk. And even if the OS could not be restored, like @nevj said, a rebuild of the OS does not take that long. For daily backups I use Timeshift, stored in a separate partition.

I keep system and personal data on separate partitions. For my personal data I keep either 3 or 5 copies, depending on how important the data is to me, plus some data is in the cloud.

1 Like

Timeshift has saved my ass a few times, or at least saved me time restoring the OS. A Timeshift snapshot before an update or an install of software can usually be backed out easily, as if the change to the system was never made. Plus the recovery is fast - usually less than 5 minutes.

2 Likes

That is the same as what I do. Clonezilla everything monthly to one external disk, plus rsync my data directory to a second external disk whenever there is activity.
I deal with the second desktop the same way with Clonezilla, but I rsync its data directory to my main PC’s data directory.
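
For anyone who hasn’t used rsync for this, the data-directory half is a one-liner (paths are just examples):

```
# mirror the data directory to the second external disk;
# -a preserves permissions/times, --delete keeps the copy exact
rsync -a --delete /home/me/data/ /mnt/external2/data/
```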

1 Like

100% agree - I run RAIDZ1 (I think that’s the ZFS equivalent of RAID 5) on my TrueNAS… if one drive fails, I’ve still got my data; if two drives go, or the whole server fails, I’m stuffed… several decades’ worth of downloads - most of it I could eventually put back together, but there are some fairly obscure things, like movies that are hard to find, which may be impossible to replace (too bad).

My changing volatile data? I keep a copy (sync) on my NAS (I run a Resilio Sync jail on it) - and copies on about 5 or more different computers… It’s not foolproof, but it does do version control on the machines I allow to store deleted/changed copies of files… That’s not a backup solution either, but I’m lazy… I’ve got 2 x 6TB external drives which I hardly ever use…

Too hard to back up my NAS anyway - I don’t have enough storage for that… 10+ years ago I had an LTO3 tape drive hooked up (and I’ve still got plenty of tapes), but that was cumbersome (I mostly just used tar - didn’t worry about compression, as LTO3 does hardware compression).

I also run backups of my personal M1 MacBook Pro with Time Machine onto a 1 TB external SSD… Don’t need backup for my Linux boxes, as mentioned above - all my volatile data (shell scripts, favourite music, documents, pictures [note I have my shell scripts sync’d across about 10 devices]) is easily sync’d again after a new machine build - takes maybe 2 hours to get it all back after a new build…

187 GB in my Resilio Sync “self hosted cloud solution” - that’s mostly all I need… Of course the machines hosting “deleted copies” of files need more space than that…

3 Likes

I vote for keeping really important data in an off-site location as well as having multiple copies in the house. Some years ago my work partner lost his house to a wildfire. He had less than ten minutes advance notice. All that was left were a few puddles of melted metal and the concrete slab. Disasters are rare but they do happen.

I use ProtonDrive for this. My data is encrypted and under some mountain in Switzerland.

1 Like

Been there. We used dump onto 9-track reel tapes.
Dump is still around… why is it not used today?
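It still works on ext2/3/4 at least; something like this (the tape device name is illustrative):

```
# full (level 0) dump of /home to tape, recording it in /etc/dumpdates
dump -0u -f /dev/st0 /home

# browse and restore files interactively from the same tape
restore -i -f /dev/st0
```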

1 Like

Ah… yes, I remember them very well.
I believe it was in 1993 that the company switched over to cartridge tape and a robotic tape system.
The robotic system was from Storage Technology. Pic below shows a peek inside one.


[image: inside a Storage Technology robotic tape library]

1 Like

Until about 10 years ago or so, the UNIX admin team used to run all “Enterprise” backup solutions - usually something like EMC / Legato NetWorker or Veritas / Symantec NetBackup, running on Solaris or Linux; some big IBM shops used another product (Tivoli?).

So that was my job. I wrote lots of shell scripts (mostly KSH) to get reports and generate tape recall lists (which had to be printed and then faxed). I also used to do things like stop the enterprise backup solution and use dump (ufsdump) to get low-level operating system backups onto “remote” tapes mounted in the tape libraries - man, that was slow on measly 100 Mbit LAN links!

Somewhere along the way, new products like Commvault got their foot in the door with the Wintel teams. Now everywhere I’ve worked since, they’re running Commvault, and the Wintel team runs the backup and restore function and is responsible for sending tapes offsite (and recalling them)…

And many of those Commvault instances still use Red Hat or CentOS for media agents, so sometimes we get an SOS from our Wintel colleagues for some help with Linux…

Just took a look at Proton - might look at that down the track a bit I reckon…

I’d still be using my free 11 GB of Dropbox storage - if they hadn’t restricted it to three devices (and I’ve been at ~5 since the last time I paid for it)…

1 Like