Robust Backup System

Data loss

Every IT specialist has faced data loss, caused by hardware malfunctions, malicious (or just badly designed) software, and their own mistakes. Those of you who'd say otherwise are just too young, and your first failure is still ahead of you.

I can survive the loss of any of my personal data, because I am not that data. But losing work data can definitely cause trouble: it's pretty uncomfortable when you can't show your achievements to, say, a new employer.

The reasons

Modern computers are not reliable. Well, they never were: the troubles we face are the same as 10, 20, and 30 years ago. Computers were never made for storing data. Their purpose is processing, literally, computing. They were never designed to store data, and even now that hasn't changed.

So we store our data on devices that were not made for it. These devices are not foolproof by design, because they must work nimbly with information, keeping it entirely within the system, i.e., under its total control. This makes stored data vulnerable by default, and the situation gets worse when we remember that the information must also be easy to manipulate. In other words, the very design of our computer systems assumes that data cannot be stored reliably.

We accept this dangerous situation only because computers let us manipulate and transmit our data relatively simply and extremely fast.

Of course, greater computing power, high-speed networks, and broad IT infrastructure (and, above all, a higher level of computer literacy) slightly reduce the chance of catastrophic data loss. But modern operating systems are extremely complex (overcomplicated, I'd say!), change dynamically with badly tested online updates, and are poorly documented (i.e., their functions can work surprisingly differently from what you expect!)

At the same time, we IT professionals with decades of experience have now reached «power we're hardly able to control». And for us especially, it's easier than ever before to lose really valuable data.

Counter-arguments

Of course you could say to me: «But, Daniil, why are you so anxious about this?» There is cloud storage, version control, all kinds of backup software, external HDDs, after all!

OK. Agreed, there are. But let me break your illusions step by step.

Cloud storage is not a solution. First, it's slow. Yeah, it's fine when your project weighs 50 MB. But a typical Unity project is 10+ GB. And what should artists do? A typical artist's work folder weighs 50+ GB; for an experienced digital creator it can easily be 500+ GB! Pushing all that through the internet to a storage service looks silly. On top of that, renting storage costs money, and your access to it could be blocked any day, just because the political situation changes (i.e., literally because of the bad mood of an old man with a strange haircut).

Version control is not about backing up data. It's about version control, no more, no less. Your Git repo will die along with all your other files if your HDD breaks apart.

External HDDs are good. But you will forget to copy your data to them. This is very typical. In most cases your USB drive works as a dust collector until some hardware failure hits your data. Then you pull it off a far shelf and run around screaming «But the last time I copied everything was 3 months ago!»

And backup software is just never worth its price. Enterprise solutions are mostly expensive and complex; consumer solutions are unreliable. They can help… but you can't be sure of it!

Yes, in the worst case there are data recovery tools that let you gravedig deleted data out of your damaged drive. That's an option if everything is lost… but in some cases even this won't help. For example, after a DEL . /S /Q on a modern SSD, the TRIM mechanism tells the drive it may wipe the freed blocks. So the recovery tool can restore your files, but they may come back empty!

And why do you need this stress?

Solution

The only solution is a lot of full-scale backup copies. Like, full sweeps: 100% of the files, primitively and bluntly copied to a safe place. Better to two safe places. More is even better, but that's already paranoia; 3 copies are OK in most cases.

The options depend on your goals.

For operating systems this could be a full image of the HDD or one of its volumes. But now, when your OS updates almost every single day, that doesn't look so good. Your OS today and your OS 6 months ago are two different operating systems.

For data, the best way is just an archive, packed in a simple and popular format (like .zip). Don't chase efficiency and complex storage formats. Don't forget: if your computer fails, you need to be able to unpack your data anywhere!
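This idea can be sketched in a few lines of Python, using `shutil.make_archive` from the standard library, which produces an ordinary .zip that any tool can unpack. The function name and paths here are my own illustration, not part of any particular tool:

```python
# Minimal full-copy backup sketch: pack a whole folder into a timestamped .zip.
# Everything here (names, paths) is a hypothetical example; adjust to your setup.
import shutil
from datetime import datetime
from pathlib import Path

def backup_folder(source: str, dest_dir: str) -> Path:
    """Archive `source` into `dest_dir` as <name>-<timestamp>.zip, return the path."""
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    base = Path(dest_dir) / f"{Path(source).name}-{stamp}"
    # make_archive appends ".zip" itself and walks the folder recursively.
    return Path(shutil.make_archive(str(base), "zip", root_dir=source))
```

Run something like this against the same folders every day and you get exactly the dumb, full-through copies described above.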

You need to do this regularly. And because it's a tedious and boring process, it must be automated.
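As a sketch of the automation step, assuming the backup logic lives in a script, a daily task can be registered with the stock Windows `schtasks` tool. The script path and task name below are made up; the snippet only builds the command line, which you would then run with `subprocess.run(cmd, check=True)` on an actual Windows machine:

```python
# Sketch: build the schtasks command line that registers a daily backup task
# on Windows. The task name and script path are hypothetical placeholders.
def daily_backup_task(script_path: str, start_time: str = "03:00") -> list[str]:
    return [
        "schtasks", "/Create",
        "/SC", "DAILY",                    # run once a day
        "/ST", start_time,                 # at this time, HH:MM
        "/TN", "NightlyBackup",            # task name (made up)
        "/TR", f'python "{script_path}"',  # command the task executes
        "/F",                              # overwrite a task with the same name
    ]
```

On Unix-like systems the equivalent would be a cron entry; the point is simply that the schedule lives in the OS, not in your memory.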

Robust Backup System

On November 25 I lost the contents of my Desktop folder because of an irresponsible experiment with .bat files. Happily, there wasn't much important data there, but the accident showed that we (as always) cannot trust the «reliability» of our computers.

So I decided to create a simple and free standalone app able to make backup copies of the directories I need. Two days later, here it is: Robust Backup System!

Functions

Robust Backup System lets you create backups of your data, packed into .zip files and placed in the storage locations you set up for this. The app runs from the console and can read and store data on your computer (including external media like USB sticks and external HDDs) or on a NAS (LAN file server). You can easily configure it by describing the parameters of backup repositories in .ini files stored in the folder for repo configs. You can set up scheduled runs of Robust Backup System using any scheduler, such as Windows Task Scheduler or nnCron.
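The app's actual .ini layout isn't documented here, but as an illustration of why .ini is a friendly choice for this job: a repo config of this general kind is trivial to read with Python's standard `configparser`. The section and key names below are invented for the example and are not Robust Backup System's real format:

```python
# Illustration only: parsing a hypothetical backup-repo .ini with the standard
# library. The [repo] section and its keys are invented, NOT the app's format.
import configparser

SAMPLE = """
[repo]
source = C:\\Projects\\MyGame
destination = \\\\nas\\backups\\mygame
copies = 3
"""

config = configparser.ConfigParser()
config.read_string(SAMPLE)
repo = config["repo"]
print(repo["source"], "->", repo["destination"], f"({repo.getint('copies')} copies)")
```

A plain-text format like this means a repo config can be created or fixed with Notepad, which matches the "keep it primitive and recoverable" philosophy above.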

The app is standalone, lightweight, tested, and freeware.

Feel free to use and share it. I hope it helps you protect your data well.

Download Robust Backup System v.1.0.0.