Has anyone used this product, especially with servers that have SQL Server, mail, or other locked-file types?
Never heard of it before; it seems new, as I looked around a lot last year. It has a nice website and nice features, though the language isn't well written, e.g. "The CLI version is free for personal use so there are no personal licenses for the CLI version." The website isn't very deep either. Because it's new, the site isn't well written, and there's not much info, I'd be careful.
Look at Duplicati (open source) as well, and CloudBerry Backup (commercial). Duplicati is awesome, but the restore didn't work for me when I used non-standard block sizes, which spooked me somewhat; it does seem to be built on good technology, though. I use CloudBerry, though I don't trust it 100%, so as well as deduplicated backups I also sync to a disk. It's great for backing up to the cloud; I have some files on B2 and lots on S3. There's a huge thread on backup software.
Thanks.
I'm using Duplicati, but it seems slow and painful when restoring, especially if you have to go a long way back through full backups and snapshots (I'm looking at backing up over 200GB of data with a 1%-5% change per week).
Also, even on a 1Gbps connection, housed in a data centre and backboned with high-speed networks, it is still a dog backing up to Amazon S3. Restores are incredibly painful.
The thread you pointed to seems to love CrashPlan, but recent posts discuss Backblaze. I'm wondering if there are any comparisons.
However, it is the software that is important. Bad software = bad backups.
@timmmay You did a bunch of testing on Backblaze and CloudBerry. How's it going for you? I'm thinking about the MSP offering as I have multiple clients and a variety of servers (Linux, MSSQL, Server 2003, Server 2012, Mac) to deal with. The big problem is pricing, as the USD to NZD conversion hikes the prices, and a client with 10 PCs may not want to pay $5-10 USD per PC per month, and that's before I add my margin.
I also need an old 2003 server with SAP to be able to run up in a VM when the hardware dies, and ditto a Server 2012 box with IIS and SmarterMail on it. I figured I'd drop them as images onto Amazon, then run them up once a month or so to receive updates, without paying for them to run all the time.
We also had a client who downloaded an email, opened the Word doc, typed in a password, and got crypto-locked. They had no off-site backups (despite my careful explanations about data recovery (onsite) vs disaster recovery (offsite and disconnected)), so versioning is really important. Thankfully Dropbox can roll back, but it's a pain to do it file by file by file, with no easy way to do it in bulk.
I tried Duplicati and duplicity in the past because of their ability to save backups directly to cloud storage, but neither worked properly for my use case (backing up a server with a gazillion small files). Backups took too long and frequently crashed because they made the server run out of memory.
I ended up using borg with rsync.net for storing the backups, and it has worked a treat for months. They have a special price for borg backups, although it's not as cheap as Backblaze.
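For anyone curious what that setup involves, a minimal borg-to-rsync.net workflow looks roughly like this. The hostname, username, and paths below are placeholders (rsync.net gives you the real server address when you sign up), and the retention numbers are just an example:

```shell
# Initialize an encrypted repository on the remote host (placeholder address).
borg init --encryption=repokey ssh://user@usw-s001.rsync.net/./backups

# Create a dated archive; lz4 keeps CPU cost low on servers with many small files.
borg create --stats --compression lz4 \
    ssh://user@usw-s001.rsync.net/./backups::'{hostname}-{now:%Y-%m-%d}' \
    /var/www /etc

# Thin out old archives so the repository doesn't grow forever.
borg prune --keep-daily 7 --keep-weekly 4 --keep-monthly 6 \
    ssh://user@usw-s001.rsync.net/./backups
```

Because borg deduplicates at the chunk level, the daily `create` runs only upload changed data, which is what makes it workable for servers with lots of small files.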
nunz:
@timmmay You did a bunch of testing on Backblaze and Cloudberry. How's it going for you? I'm thinking about the MSP offering as have multiple clients and a variety of servers (Linux MSSQL, Server 2003, Server 2012, Mac) to deal with. The big problem becomes pricing as USD to NZD starts to hike the prices and a client with 10 pcs, may not want to pay $$5 - 10USD per pc per month. and that's before I put my profits in.
I didn't test Backblaze. I use CloudBerry myself, to back up to disks and to the cloud, but for no good reason I don't trust it 100%. It's never failed me, but I've never done a wide-scale restore. It's fine for syncing to disks and S3/B2.
timmmay:
I didn't test Backblaze. I use CloudBerry myself, to back up to disks and to the cloud, but for no good reason I don't trust it 100%. It's never failed me, but I've never done a wide-scale restore. It's fine for syncing to disks and S3/B2.
I'm not a huge fan of CloudBerry. I've used it a few times; the errors seem pretty unhelpful, and it fails for random reasons after working fine for ages.
ShadowProtect is my favourite backup software; it's never let me down and has saved my bacon many times. But it's for whole disks, not specific data (although that may have changed in recent years).
Andib:
I use Duplicati on my personal VMs and at home; however, I wouldn't personally use it for commercial backups. For clients we use either Veeam or Commvault.
It is interesting that you and @timmmay and others don't trust your backup software 100% :) I feel the same about a lot of the backups I run, except for the stuff I have scripted using old-fashioned tools like tar, rsync, xcopy, and a few similar ones.
I've looked at a few commercial offerings, but in the small-business space they seemed very expensive for what they do. With a commercial offering it would be ideal to have a VM image of servers for quick disaster recovery, incremental backups for data, and something that really works with shadow copies (VSS) for IIS, SQL Server, PST files, and the like. Still looking.
I'll give CloudBerry a crack: run a VM / disk image test and see what I get out of it. If I get a good VM in the cloud, then I'll have to trust that the incrementals and data work.
Duplicati keeps screwing up on shadow copies. It works one day and not the next, and it also keeps giving errors that file xxxx has not been backed up as it is locked or gone. These are often index files from the mail server, but it is worrying that shadow copy backups still can't properly deal with that scenario.
My dream would be an rsync server with attached cloud storage: cloud storage accessibility plus rsync's awesome reliability. Throw backups into an rsync watch directory and let it do the heavy lifting to a remote server and into the cloud. Still looking.
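Something close to that dream can be approximated today with rsync's `--link-dest` option pushing rotating snapshots to any box you control. The host and paths below are illustrative, and `/backups/latest` is assumed to be a symlink to the newest snapshot on the receiving side:

```shell
# Hypothetical paths; /backups/latest points at the most recent snapshot.
TODAY=$(date +%Y-%m-%d)

# Unchanged files become hard links into the previous snapshot,
# so each daily snapshot costs only the space of the changed files.
rsync -a --delete \
    --link-dest=/backups/latest \
    /data/ backupuser@backuphost:/backups/$TODAY/

# Repoint "latest" at the snapshot we just made.
ssh backupuser@backuphost "ln -sfn /backups/$TODAY /backups/latest"
```

From there, a second job on the backup host could sync the snapshot tree up to cloud storage, which gets you the "rsync reliability in front, cloud behind" shape described above.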
Has anyone had any more experiences with Duplicacy?
I was backing up a Nextcloud server's data with Duplicati, but its initial backup and restore times are insane. For about 200k files and 850GB of data it takes 48 hours to complete the initial backup, and I couldn't test the restore times because, while waiting in the web UI for a test recovery to start, I decided I had already waited too long to consider Duplicati viable.
I am currently giving Duplicacy a try, and the initial backup of the same data set takes 4 hours. The performance difference alone more than offsets the $20/year for its license.
(Both Duplicacy and Duplicati used encryption and default settings, backing up to a local server over SFTP on a gigabit link.)
I haven't tried Duplicacy, but I did have similar issues to yours with Duplicati, except for me the initial backup took a couple of weeks (just under 2TB backing up to Backblaze). Once it was fully backed up, the incrementals seemed to work okay. But my main issues with Duplicati were the logging and file selection. It became really difficult to analyze the logs when errors occurred, and even harder to exclude the files that were causing locking issues. I ended up switching to Arq and paid the one-off licence cost with future upgrades included. That won't help you, though, since Arq is Windows and Mac only.
Have you checked out restic?
SumnerBoy:
Have you checked out restic?
Not until now... Their model seems good and the documentation looks pretty complete.
Will try to run a benchmark backup today and will come back with the results tomorrow!
I use restic for some backups from Windows and Linux, but as a second backup, to both local disks and Amazon S3. So far, so good. Remember it's not at version 1.0 yet, so while it seems reliable I'd be careful using it as a primary backup. While it does encryption, it doesn't do compression. It's a technical backup product: you have to write scripts, and occasionally run or schedule commands to do things like check the backup archive and purge unneeded files. I think it has a LOT of potential, but I'd be a bit wary of it as a primary backup.
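To give a sense of the scripting involved, a typical restic wrapper is just a few commands run on a schedule. The bucket name, password file path, and retention policy below are made-up examples, not anything restic mandates:

```shell
# Assumed environment; substitute your own repository and credentials.
export RESTIC_REPOSITORY=s3:s3.amazonaws.com/my-backup-bucket
export RESTIC_PASSWORD_FILE=/root/.restic-password

restic backup /srv/data        # incremental, deduplicated, encrypted
restic check                   # verify repository integrity
restic forget --keep-daily 7 --keep-weekly 4 --keep-monthly 12 --prune
```

The `check` and `forget --prune` steps are the housekeeping mentioned above: nothing runs them for you, so they belong in the same cron job as the backup itself.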
Duplicati has far too many bugs and problems, though I think in a few years it'll be good. They fixed one bug I logged a year after I logged it, but within a few days I found another showstopper. I don't remember the details; I gave up on it.
Restic results:
Added to the repo: 360.400 GiB
processed 290873 files, 834.526 GiB in 2:47:50
Pretty good, considering Duplicacy took 4 hours to achieve the same initial backup, both with encryption enabled.
@timmmay does restic really not have any compression? I'm quite impressed with the results considering the reduced size comes only from deduplication. Duplicacy's backup is using 318G vs 362G for restic.
On the left is 4 hours of CPU usage while Duplicacy ran the initial backup. On the right, the same initial backup done by restic. I like that restic uses more CPU threads to get the job done quicker :)
On the scripting side of things I don't see any difference between restic and the Duplicacy CLI; both will need the same kind of work to get up and running.
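For comparison, the Duplicacy CLI workflow over SFTP is similarly small. The host, user, storage path, and snapshot ID below are placeholders:

```shell
# Run once inside the directory to be backed up;
# "nextcloud" is an arbitrary snapshot ID, and the double slash
# after the host marks an absolute path on the SFTP server.
duplicacy init nextcloud sftp://backupuser@backuphost//backups/duplicacy

duplicacy backup -stats -threads 4    # multi-threaded chunk upload
duplicacy restore -r 1 -threads 4     # restore revision 1 into the repository directory
```

So the scripting burden really is comparable: in both tools the work is wrapping a handful of commands in a scheduled job and handling retention.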
Edit: true, compression is still an open issue for restic: https://github.com/restic/restic/issues/21
So I decided to try both the Duplicacy CLI and restic for a week on the same data set. The data being backed up comes from a Nextcloud server where 15 users keep their work; it's about 360k files and 1TB. I ran daily backups after the users finished their work day, and these were the results of the initial backup, the next 6 days, and a full restore from the last snapshot to an empty folder.
Backups were stored on a local server, and both programs connected over SFTP.
CPU usage and bandwidth comparison of initial backup:
Restic restore CPU and bandwidth:
Duplicacy restore CPU and bandwidth:
I must admit that using restic felt nicer than Duplicacy, and it also has better output stats, but doubling the restore time is a killer for me. 15 vs 7 hours may not look like much of a difference now, because both can restore overnight, but I'm not yet backing up everything I need. Restic also failed to restore one file with a "failed: ciphertext verification failed" error.
I'll be using Duplicacy but will keep a close eye on restic's progress.