Backup software (rant)

Discussion in 'General Software' started by maldotcom2, Oct 29, 2018.

  1. maldotcom2

    maldotcom2 Member

    Joined:
    Feb 18, 2006
    Messages:
    1,966
    Seems like I post this thread at least once a year. The currently available consumer backup software SUCKS.

    I'm currently using Acronis; previously I used fbackup, and before that something else I can't even remember. Inevitably they all fail to provide the robust and reliable data protection I require. Honestly, in the 10 or so years I've spent trialling software, I could have learned to program and written something better than the pile of garbage currently available.

    The main reason for failure? No verification of data integrity. Inevitably, I set up a scheduled backup with the chosen software, only to check it a few months later and find it hasn't done a backup in weeks, despite the software reporting that it has. With the latest occurrence, I can hit "Manual backup" in the Acronis client and it responds along the lines of "Yep, done." I check, and there are no files in the destination newer than September. AND THIS IS FROM A WORLD-RENOWNED DATA SOFTWARE COMPANY!

    I can set up all the required infrastructure: redundancy, scheduling, versioning, measures against cryptoware. It's always let down by piss-poor coding on the client side.

    Why is it so hard?

    • Full image backups
    • FTP transfer
    • Scheduling
    • Versioning
    • Monitoring of backup file integrity
    • Alerts when something screws up (see the sketch below for the kind of check I mean)
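
    Even the last two points are maybe a dozen lines of script. A rough sketch of the kind of freshness check I mean is below; the destination path, the threshold, and the mail settings are all made up:

    ```python
    #!/usr/bin/env python3
    """Sketch of a 'did the backup actually run?' check.
    All paths, thresholds, and mail settings are hypothetical placeholders."""
    import smtplib
    import time
    from email.message import EmailMessage
    from pathlib import Path

    DEST = Path(r"\\nas\backups")   # hypothetical backup destination
    MAX_AGE_DAYS = 2                # complain if nothing new for this long

    # Find the modification time of the newest file under the destination.
    newest = max((p.stat().st_mtime for p in DEST.rglob("*") if p.is_file()),
                 default=0.0)
    age_days = (time.time() - newest) / 86400

    if age_days > MAX_AGE_DAYS:
        msg = EmailMessage()
        msg["Subject"] = f"BACKUP STALE: newest file is {age_days:.1f} days old"
        msg["From"], msg["To"] = "backup@home.lan", "me@example.com"  # hypothetical
        msg.set_content(f"No new files in {DEST} for {age_days:.1f} days.")
        with smtplib.SMTP("localhost") as s:   # hypothetical mail relay
            s.send_message(msg)
    ```

    Schedule that independently of the backup software and it catches exactly the "says done, wrote nothing" failure mode.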
     
  2. HobartTas

    HobartTas Member

    Joined:
    Jun 22, 2006
    Messages:
    822
    Tried VEEAM? It appears to be highly rated, and I'm impressed with it; I can even get the free version to back up to LTO tape.

    If you want this, you will need ZFS (or Btrfs) as your backup target: it checksums every block, so if anything goes wrong it will tell you exactly which file is damaged, and if you have RAID or mirroring it can also repair it.

    Well, if you can do all this, then set up your ZFS backup target and use Robocopy with the /MIR mirroring option to get the initial data moved over, then take a ZFS snapshot. From there, do further Robocopy runs that pick up only the incremental changes, using the /A option ("copies only files for which the Archive attribute is set"), and take another snapshot after each run. The end result: deleted files are held by the snapshots until you release them, while changed and new files have the "A" attribute set and get copied over as incremental updates. A rough sketch of the whole routine is below.
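
    Something like this, scheduled nightly, covers it (the paths, pool name, and SSH target are all made up; adjust to taste):

    ```python
    #!/usr/bin/env python3
    """Sketch of the Robocopy + ZFS snapshot routine described above.
    Source, share, host, and dataset names are hypothetical placeholders."""
    import subprocess
    import sys
    from datetime import date

    SOURCE = r"D:\Data"                        # hypothetical source folder
    TARGET = r"\\nas\backup\data"              # hypothetical share backed by the dataset
    NAS, DATASET = "root@nas", "tank/backup"   # hypothetical ZFS box and dataset

    def robocopy(*opts):
        # Robocopy exit codes 0-7 mean success; 8 and above mean failures.
        rc = subprocess.run(["robocopy", SOURCE, TARGET, *opts]).returncode
        if rc >= 8:
            raise RuntimeError(f"robocopy failed (exit code {rc})")

    def zfs_snapshot():
        name = f"{DATASET}@backup-{date.today().isoformat()}"
        subprocess.run(["ssh", NAS, "zfs", "snapshot", name], check=True)

    if "--full" in sys.argv:
        robocopy("/MIR")        # initial full mirror
    else:
        # Incremental: only files with the Archive attribute set.
        # (/M instead of /A would also clear the attribute after copying.)
        robocopy("/A", "/E")
    zfs_snapshot()
    ```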

    Another alternative is to get rsync running on Windows, but people say pretty much all the Windows implementations of it are no good.

    Windows is always writing to its drives, and AFAIK it's next to impossible to boot Windows off a read-only disk, so I don't trust full disk images of Windows taken while the system is running. If I do an image, I boot off a CD-ROM or a USB stick and copy the source disk to a target disk, say when I'm upgrading to a larger SSD. The only time I used the Samsung live migration tool, the copy was very unstable.

    Problems pretty much solved, as there's not a lot that can go wrong with my proposed arrangement. Otherwise I presume you'll be posting a similar thread in 12 months' time.

    Cheers
     
  3. maldotcom2 (OP)

    maldotcom2 Member

    Joined:
    Feb 18, 2006
    Messages:
    1,966
    Cheers for the reply. I actually used to run ZFS on the destination, but abandoned it when I upgraded (downgraded?) to a Synology for simplicity and for the ability to salvage data from a single drive without complicated rebuilds. You may be right about disk imaging while the system is running being a bad idea; I'm not sure how I can automate it otherwise, though.

    Also, Veeam doesn't support FTP destinations, which rules it out for me.
     
    Last edited: Oct 31, 2018
  4. BurningFeetMan

    BurningFeetMan Member

    Joined:
    Apr 22, 2003
    Messages:
    8,338
    Location:
    Veg City
    I too use Acronis for my system images and have done for almost 10 years. I take my backups while the systems are offline, via a USB boot disk that copies various system images to a large USB storage HDD that spends its life sitting on a shelf.

    To be honest though, I only use this method of backing up now and then, a couple of times a year. This is because I'm using encrypted online storage for my various hobbies and projects, which syncs on file save, and virtually all the games I play save to the cloud in one form or another too. All my office apps use cloud-based storage as well.

    Have you played with the likes of Spideroak?

    Hmm, and on further thought - as you seem fairly passionate about this topic, why not start a "backup review" project/YouTube channel?
     
  5. elvis

    elvis Old school old fool

    Joined:
    Jun 27, 2001
    Messages:
    37,520
    Location:
    Brisbane
  6. domsmith

    domsmith Member

    Joined:
    Nov 7, 2002
    Messages:
    284
    Last edited: Oct 31, 2018
  7. elh9

    elh9 Member

    Joined:
    Feb 28, 2016
    Messages:
    107
    Location:
    Perth NOR
    If your setup accommodates it, UrBackup is pretty good: client/server setup, web interface for easy control, folder watching for differential backups.
    https://www.urbackup.org
     
  8. NSanity

    NSanity Member

    Joined:
    Mar 11, 2002
    Messages:
    17,681
    Location:
    Canberra
    Legitimately - fix that problem, not your backup application.
     
  9. HobartTas

    HobartTas Member

    Joined:
    Jun 22, 2006
    Messages:
    822
    If they managed to get something like this right in the past, when on-the-fly disk defragmentation was introduced without corrupting NTFS, then there shouldn't be any problems. But the only utterly safe way I'm aware of is to have a separate small Linux partition that you set active when you shut down or reboot Windows. All it does is copy the Windows partition to, say, another hard drive/partition, then set the Windows partition back to active and reboot, and Windows fires up again. A sketch of the idea is below.
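
    Roughly, that Linux partition would run something like this at boot. The device names and image path are made up (check yours with lsblk first), and it assumes an MBR disk where sfdisk --activate flips the boot flag:

    ```python
    #!/usr/bin/env python3
    """Sketch of the copy step the minimal Linux partition would run.
    Device names and the image path are hypothetical placeholders."""
    import subprocess
    from datetime import date

    WIN_PART = "/dev/sda2"   # hypothetical: the (now offline) Windows partition
    DISK = "/dev/sda"        # hypothetical: the disk that holds it
    IMAGE = f"/mnt/backup/windows-{date.today().isoformat()}.img"

    # Raw copy of the Windows partition to an image file on another drive.
    subprocess.run(["dd", f"if={WIN_PART}", f"of={IMAGE}",
                    "bs=4M", "conv=fsync", "status=progress"], check=True)

    # Mark the Windows partition active/bootable again and hand control back.
    subprocess.run(["sfdisk", "--activate", DISK, "2"], check=True)
    subprocess.run(["reboot"], check=True)
    ```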
     
  10. NSanity

    NSanity Member

    Joined:
    Mar 11, 2002
    Messages:
    17,681
    Location:
    Canberra
  11. maldotcom2 (OP)

    maldotcom2 Member

    Joined:
    Feb 18, 2006
    Messages:
    1,966
    I'll look into the suggestions above, cheers.

    I assume this is about the insecurities of FTP? I'm not using FTP over the net. If there's an alternative to SMB that doesn't have the overhead, I'm happy to look into it.
     
  12. NSanity

    NSanity Member

    Joined:
    Mar 11, 2002
    Messages:
    17,681
    Location:
    Canberra
  13. DarkYendor

    DarkYendor Member

    Joined:
    Feb 25, 2008
    Messages:
    3,196
    Location:
    Perth
    VEEAM, if you have the money, and it's all virtualised.

    What are you backing up?

    Personally, for my file server I use SnapRAID. You can say "RAID is not a backup", but non-striping snapshot RAIDs can actually act as a backup in lots of scenarios. If I lost everything on the file server, most of it could be re-downloaded, and the bits that can't (photos etc.) are also stored on an FTP server and on an external HDD I leave at work. The config is trivial too; see the sketch below.
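
    For reference, a minimal snapraid.conf looks something like this (the paths here are made up):

    ```
    # parity lives on its own disk, at least as big as the largest data disk
    parity /mnt/parity1/snapraid.parity

    # keep copies of the content file on more than one disk
    content /var/snapraid/snapraid.content
    content /mnt/disk1/snapraid.content

    # the data disks to protect
    data d1 /mnt/disk1/
    data d2 /mnt/disk2/

    exclude *.tmp
    exclude /lost+found/
    ```

    Then it's snapraid sync after adding files, snapraid scrub now and then to verify the data, and snapraid fix if a disk dies.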
     
  14. NSanity

    NSanity Member

    Joined:
    Mar 11, 2002
    Messages:
    17,681
    Location:
    Canberra
    I have ~40 servers using Veeam Windows Agent and about five using Veeam Linux Agent.

    It's fine, and they tie back into our existing VBR infrastructure nicely.
     
  15. pH@tTm@N

    pH@tTm@N Member

    Joined:
    Jun 27, 2001
    Messages:
    1,993
    Location:
    BRISBANE
    Another Veeam user for years here, at a few different places. You can run pre- or post-job scripts, which could FTP the files if you want; a sketch of what such a script might look like is below.
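
    For example, a post-job script along these lines could push the newest backup file to an FTP box (the host, credentials, and paths are all made up):

    ```python
    #!/usr/bin/env python3
    """Sketch of a post-job script that FTPs the newest backup file.
    Host, credentials, and paths are hypothetical placeholders."""
    from ftplib import FTP
    from pathlib import Path

    BACKUP_DIR = Path(r"D:\Backups")                        # hypothetical
    HOST, USER, PASSWORD = "nas.local", "backup", "secret"  # hypothetical

    # Pick the most recently written file in the backup folder.
    newest = max(BACKUP_DIR.iterdir(), key=lambda p: p.stat().st_mtime)

    with FTP(HOST) as ftp:
        ftp.login(USER, PASSWORD)
        with newest.open("rb") as f:
            ftp.storbinary(f"STOR {newest.name}", f)
        # Sanity check: the uploaded size should match the local size.
        if ftp.size(newest.name) != newest.stat().st_size:
            raise RuntimeError("FTP upload size mismatch")
    ```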

    People tell me Zerto is similar to Veeam for less money; maybe check that out too?
     
  16. mareke

    mareke Member

    Joined:
    Jun 1, 2003
    Messages:
    7,335
    Location:
    Sydney, NSW
  17. cvidler

    cvidler Member

    Joined:
    Jun 29, 2001
    Messages:
    12,339
    Location:
    Canberra
    OP needs to detail the requirements for backups.

    Snapshotting OSes is a waste of time. As covered, on Windows it's unreliable at best.

    Separate data and software, as they require different backup strategies.

    - software (including the OS): ensure you have copies of the installers handy and backed up. It doesn't take long to rebuild a Windows PC to a workable state (much less time than trying to recover a disk image), so you'll be up and running faster from a disk failure. Installing all your crapware can be done as required (you'll find you install less of it, as you don't really use it, resulting in a cleaner system - win).

    This stuff only needs archiving monthly at most.


    - data: now I'm talking your personal stuff, the irreplaceable stuff; all your Linux ISOs can be re-torrented (or archived monthly as per your software). Here you want snapshotting (and maybe version control, depending on what you're actually doing), data integrity checking and so on (see the sketch below), and you back up the daily snapshot to removable media every day. The snapshots protect you from cryptoware; offline nightlies protect you from media failures/fire/flood/act of god.
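
    Integrity checking doesn't need to be fancy. A rough sketch (both paths are yours to supply) is a hash manifest comparison between the live data and the backup copy:

    ```python
    #!/usr/bin/env python3
    """Sketch of a data integrity check: hash every file in two trees
    and compare. The two paths come from the command line."""
    import hashlib
    import sys
    from pathlib import Path

    def sha256(path: Path) -> str:
        # Hash the file in chunks so large files don't eat all the RAM.
        h = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    def manifest(root: Path) -> dict:
        # Map each file's path (relative to root) to its digest.
        return {str(p.relative_to(root)): sha256(p)
                for p in sorted(root.rglob("*")) if p.is_file()}

    live, backup = manifest(Path(sys.argv[1])), manifest(Path(sys.argv[2]))
    for rel, digest in live.items():
        if rel not in backup:
            print(f"MISSING in backup: {rel}")
        elif backup[rel] != digest:
            print(f"MISMATCH in backup: {rel}")
    ```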

    It's not really hard; backups aren't a new thing. Planning and doing it right requires you to think about what you're doing. Trying to back up an image of everything you have every hour or day just won't work.
     
  18. miicah

    miicah Member

    Joined:
    Jun 3, 2010
    Messages:
    6,310
    Location:
    Brisbane, QLD
    SnapRAID is perfect for my MicroServer. If a disk dies I don't lose all my Linux ISOs, and I don't have drives spinning 24/7 for no reason eating up power.
     
  19. cvidler

    cvidler Member

    Joined:
    Jun 29, 2001
    Messages:
    12,339
    Location:
    Canberra
    Power cycling drives (indeed most electronics: thermal shock) is where the most wear and tear, and thus the highest risk of failure, occurs.

    For longevity, it's best to keep things running 24x7.
     
  20. elvis

    elvis Old school old fool

    Joined:
    Jun 27, 2001
    Messages:
    37,520
    Location:
    Brisbane
    Triggering memories of the 2011 Brisbane floods, when the data centre power, backup power, generators, UPSes and everything else were all stupidly located in the basement of the building I worked in. The river broke its banks, flooded everything electrical, and instantly shut down several very large SANs that had been running 24x7 for eons.

    Weeks later when all the power stuff was sorted and we powered up those SANs... yeah... lots of failed drives.

    Kudos to whichever knob designed the power stuff to *ALL* sit below river level. Some great planning there.
     
