Geekzone: technology news, blogs, forums
antoniosk
#1784620 19-May-2017 10:28

Out of curiosity, what is the default mechanism for storing these files in CP?

I'm a simple person and can comprehend COMPUTER_NAME > DRIVE NAME > FOLDER > FILES as a backup structure. Makes it easy to go the other way when restoring. Basically, if I save all the folders and files in my E: drive to the cloud, I'd expect to see them in the same layout for restore.

Have worked with too much software that has historically manipulated a backup into a single virtual file, meaning you're hosed if the file is corrupted, or backups that put extensions and additional characters on files and folders ("E:_").









timmmay
#1784623 19-May-2017 10:37

CrashPlan is de-duplicating, so it stores all your backup files as blocks in a series of large files. Whichever blocks need to be stored in their cloud service are sent as required. I think every modern backup program does something similar. You can access the individual files in this archive using their client.
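CrashPlan's actual on-disk format isn't published here, but the general block-store idea can be sketched in a few lines. This is an illustrative assumption only: fixed-size blocks keyed by SHA-256, whereas real tools typically use variable-size chunking and more compact recipes.

```python
import hashlib

BLOCK_SIZE = 4096  # fixed-size blocks for simplicity; real tools often chunk by content


def store_blocks(data: bytes, store: dict) -> list:
    """Split data into blocks, store each unique block once (keyed by its
    SHA-256 hash), and return the list of hashes needed to rebuild the data."""
    recipe = []
    for i in range(0, len(data), BLOCK_SIZE):
        block = data[i:i + BLOCK_SIZE]
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)  # a duplicate block is stored only once
        recipe.append(digest)
    return recipe


def restore(recipe: list, store: dict) -> bytes:
    """Reassemble the original data from its block recipe."""
    return b"".join(store[d] for d in recipe)
```

The point of the design is visible here: two files (or two versions of one file) that share blocks only pay for the shared blocks once, and a restore is just a lookup walk over the recipe.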

 

On their website you can navigate to your files just as you say, by computer name, drive, folder, subfolder, etc. If you restore they come out in folders as you'd expect.


Stu
#1785208 20-May-2017 16:51

Haven't seen it mentioned yet, but how about the basic Amazon Drive for cloud storage? Just having a look at it now in case there's any truth to the rumours regarding CrashPlan Home.





People often mistake me for an adult because of my age.

 

Keep calm, and carry on posting.

 

Referral Links: Sharesies

 

Are you happy with what you get from Geekzone? If so, please consider supporting us by subscribing.

 

No matter where you go, there you are.




timmmay
#1785214 20-May-2017 16:59

Not a bad idea, but I'm not so keen on "drive" or dropbox type services. I'd prefer something more "archival" than just storage.

 

I've decided on zip + par2 stored in AWS Glacier. I understand AWS well, I can control access easily, and it offers enterprise-level reliability. Because of the security around it I'm as confident as I can be that the data can't be corrupted or deleted by either the service or some kind of virus / worm on my computer.

 

The only problem I'm having is the free Glacier clients limit you to two upload threads, so I'm only getting 4-8Mbps upload speed to the USA with CloudBerry. I'm going to try FastGlacier soon. It's $40 to get pro versions with more threads, but given uploading is rare maybe I just let it go for a few days.


Stu
#1785229 20-May-2017 17:12

 It certainly doesn't seem there's a way to use Amazon Drive as a Crashplan replacement. Back to the drawing board!







timmmay
#1786475 23-May-2017 10:00

I've been playing with Duplicati, backing up to disks and S3. I tried to do a test restore today and got an odd error - mismatched block size. It's probably easily solvable, but it highlights that the software is still immature. The releases are labelled "experimental" or "canary"; it's not even in beta yet. Because of that I think I want commercial, paid backup software that comes with support.

 

I think I'm going to give Arq a go. I did a quick evaluation. Its UI is a bit clunky, but it seems to have all the required features.

 

One Arq question: can it backup to external disks that aren't always connected? It can back up to folders or a NAS, so I suspect yes. I wonder how well it handles not having the external disk connected all the time - anyone know?

 

I did note that with Arq you can "run all backups", but there doesn't seem to be a way to run backups to a specific destination. That seems like a fairly major omission.

 

Tagging @Rappelle and @ripdog


 
 
 
 

timmmay
#1786503 23-May-2017 10:31

Rappelle:

I've not tried backing up to an external drive. Seems like it would work given folders and NAS locations work. Best bet is to try it.

As for backing up a particular location, when you click 'Back up now' and you have multiple destinations, it will ask which you want to do.

Ok I'll give that a shot. I've also asked support about that, partly for the question, partly to look at response times.

 

Good to know about backing up to multiple locations. Its user interface is a bit quirky.


ripdog
#1786588 23-May-2017 11:45

I don't think it would actually be necessary to implement any code whatsoever to support backing up to sometimes-connected drives. Arq simply tries to back up every hour, and if it fails, it tries again next hour.
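That retry-on-schedule behaviour really does need almost no code. A minimal sketch of the idea (hypothetical, not Arq's actual implementation): attempt the backup on a fixed interval, and treat a missing destination as "skip and try again next cycle".

```python
import time


def run_backup_loop(backup_once, interval_seconds=3600, max_cycles=None):
    """Call backup_once() every interval. If the destination is unavailable
    the attempt fails quietly and is simply retried next cycle, which is all
    that's needed to support a sometimes-connected external drive."""
    cycles = 0
    results = []
    while max_cycles is None or cycles < max_cycles:
        try:
            backup_once()
            results.append("ok")
        except OSError:  # e.g. external drive not connected
            results.append("skipped")
        cycles += 1
        if max_cycles is None or cycles < max_cycles:
            time.sleep(interval_seconds)
    return results
```

With an hourly interval, a drive that's plugged in once a week just means six days of harmless "skipped" cycles and one successful backup.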

 

Not sure what you mean about Arq having a clunky UI. To me, it seems very orthodox for a backup app.

 

I had a similar experience with Duplicati: a loosely-coupled web UI (it often seemed to have trouble communicating with the daemon when the daemon was busy, and error-display issues left me confused about what was wrong), combined with unhelpful errors in the error log. I didn't know what was happening, and I didn't know whether the errors were harmless or whether I was missing data in my backups. I get errors with Arq too, but they're all "file in use by another process", which makes sense. Hmm, wasn't VSS supposed to solve those? Perhaps the error prompts Arq to use VSS? I'd better check that.

 

Arq is developed by a single guy, so although support is excellent, don't expect super response times.


rb99
#1786599 23-May-2017 11:58

Stu:

It certainly doesn't seem there's a way to use Amazon Drive as a Crashplan replacement. Back to the drawing board!
I presume as offline storage Amazon Drive would be fine? I also presume it's not a CrashPlan replacement because Amazon Drive is so manual - no automatic drive monitoring, no versioning, etc.?





“The modern conservative is engaged in one of man's oldest exercises in moral philosophy; that is, the search for a superior moral justification for selfishness.” -John Kenneth Galbraith

 

rb99


timmmay
#1786654 23-May-2017 13:16

ripdog:

I don't think it would actually be necessary to implement any code whatsoever to support backing up to sometimes-connected drives. Arq simply tries to back up every hour, and if it fails, it tries again next hour.

[snip]

That's good that it should work out of the box; I'll give it a go when I have time.

 

The UI just isn't as refined as it could be. For example, why is there no "back up now" button for an individual destination or folder in the main / right window? You have to go to a menu to do it - just extra steps. Why do you have to create your destination as S3 Standard, then edit it to choose the S3 IA class? When you add a folder to your S3 backup, why does it say "S3 Standard" even when the destination has the S3 IA class set? When you add a new folder, why do you need to add the folder, then edit it to remove subfolders you don't want, rather than doing everything in one step?

 

It's not bad as such; it's just slower and clunkier than I'd expect from modern software. It's not surprising that it's written by one guy, an engineer - a team with a UI expert would probably do a better job.

 

Also, a one man band is additional risk. If he gets hit by a bus there are no updates or fixes. Things like S3 integration probably need maintenance over time.

 

I'll evaluate Arq, but if there was another similar piece of software that was not a one man band I'd probably favour it.


ripdog
#1786782 23-May-2017 14:38

timmmay:

For example, why is there no "back up now" button for an individual destination or folder in the main / right window? You have to go to a menu to do it. It's just extra steps that are required.

Well, I see your point, but very rarely would you ever need to force a backup. It's only one extra click to access a rarely needed function. Good UI design is not filling up the main view with buttons.

 

 

 

timmmay:

Why do you have to create your destination as S3 standard, then edit it to choose S3 IA class? When you add a folder to your S3 backup why does it say "s3 standard" even when the destination has S3 IA class set?

I don't use S3, but that sounds like... a bug? Certainly the second one sounds like a bug. Email him about it.

 

 

 

timmmay:

When you add a new folder why do you need to add the folder, then edit it to remove subfolders you don't want, rather than doing everything in one step?

Ahh, the great continuum between simplicity and power. I guess he thought that most people wouldn't need to exclude subfolders, so again relegated it to an extra step. The nice thing is that for the average case, adding a new folder is a one-step process.

 

 

 

timmmay:

Also, a one man band is additional risk. If he gets hit by a bus there are no updates or fixes. Things like S3 integration probably need maintenance over time.

They do. There is, however, never going to be any actual lock-in. Even if Arq development died today, you'd always be able to download your backup using standard S3 apps and extract your data locally, either using Arq or the open-source command-line restoration tool he offers.

 

 

 

timmmay:

I'll evaluate Arq, but if there was another similar piece of software that was not a one man band I'd probably favour it.

One alternative is CloudBerry, which only gained Google Drive compression and encryption in the very latest release (oddly enough). I'm personally a bit wary of them - they feel like a company more interested in looking good than in actually caring about the consumer. They don't publish their backup format like Arq does, and offer no open-source restoration tool. If their client breaks, your data is goneburgers. Reading the FAQ, they also impose lots of silly restrictions even after you pay; for example, you can only back up one network folder with the standard version, and you can't install it on server versions of Windows.

 

Just discovered this in the FAQ:

Q: If I delete a file on my local computer is it deleted from the backup storage as well?

A: Yes, the file will be deleted from the online storage too. By default, though, the file will remain in the online storage for 30 days to prevent accidental deletions. We call it Smart Delete and you can learn more about it here.

Oh lord.

 

 

 

 

I just feel that everything the Arq developer does is honest and good-willed. Publishing his data format and an open source restoration utility is going above and beyond simply for the sake of his customers. Just something to consider.


 
 
 

timmmay
#1786824 23-May-2017 15:11

If you back up to an external disk that's only brought on site for occasional backups, you would use a "back up now" button regularly.

 

Open format is good, but if he stops developing it the product dies. That's not great for a backup system that should be long term. One of the reasons I'm changing from my current backup system is the software isn't supported any more, and it's not inspiring confidence.

 

Duplicati will be great in a few years, but it's just too immature. I've been looking at other pieces of software - EaseUS Todo, Acronis, etc. - but they all seem limiting in some way, like not backing up to both local disks and the cloud. Genie9 seems promising though.


MikeAqua
#1786842 23-May-2017 15:23

After a bit of reading and research I'm going with the following:

1) Synology NAS (maybe 216j);

2) OneDrive or similar for files actively being edited etc. on the machines;

3) Inactive files on the NAS (available via remote connection if required);

4) Archival backup of the NAS to a cloud service.

The choice of backup service will depend on price and what plays nicely with the NAS.





Mike


timmmay
#1786918 23-May-2017 17:08

Mike, that will give you incremental, versioned backups right? If a virus on your PC encrypts everything on the NAS then backs it up before you notice, will you be able to roll back versions?

 

Genie9

 

I just had a play with Genie9 Pro 9 backup. It has all the features, including S3 and external-disk backup, but it just doesn't feel right to me. The UI, the way the options are set up, the fact that it doesn't de-duplicate - it all feels clunky. Feeling clunky isn't a problem as such, but it reduces confidence in the software. The Arq interface is better.

 

Arq

 

I'm still not sold on Arq, mostly because it's a one man band. The software and technology look pretty good. I have it running now, but oddly it doesn't clearly say anywhere that a backup is running - if you look carefully it's shown at the bottom of the window. That program really needs a UI designer. It also doesn't have any kind of progress indicator during backup: it says how much has been scanned and how much has been uploaded to the storage device, but not how much is left to scan.

 

It seems to be the best of the bunch right now.

 

Duplicati

 

It's unfortunate that Duplicati isn't seeming that reliable. I restored fine from my PC, which has all the settings and the local metadata database on it. I have those stored on Dropbox so I could use them remotely, but the software just doesn't inspire confidence right now. I do like that it's open source, but using experimental software for backups may not be prudent. If it was finished it'd be the best of the lot by a significant margin.

 

I just ran a local backup. Duplicati complained it couldn't find a file, which was exactly where it should've been.

 

 

 

So I guess I'll keep playing around for a while before I decide what my main backup system will be. If the Duplicati bug can be worked out I might still consider it.

 

 

 

Glacier

 

I'm happy with zipping things, using PAR2 for a bit of redundancy, then storing the data in AWS Glacier using FastGlacier. I've put around 60GB up, which will cost me US$0.24 per month. That's my long term archive sorted, so if I lose my PC I only lose recent stuff. CrashPlan is still working fine for that. It's backing up to offsite disks that's the main thing I want to be better, and ideally that software should back up to S3 as well.
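Those numbers line up with Glacier's storage rate at the time of roughly US$0.004 per GB-month (an assumed figure here; pricing varies by region, and this ignores request and retrieval fees):

```python
def glacier_monthly_cost(gb: float, rate_per_gb: float = 0.004) -> float:
    """Storage-only monthly cost estimate in USD; excludes request,
    retrieval, and early-deletion fees."""
    return round(gb * rate_per_gb, 2)
```

So 60 GB at that rate works out to the quoted US$0.24 per month.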


timmmay
#1787055 23-May-2017 20:16

True. Arq seems to handle disks being removed just fine, it just reports an error when the backup tries to run.

 

I have found a fairly major bug though: on my PC it completely failed. I can't run backups, I can't add new destinations, nothing, as the service fails every time the UI tries to have it do anything. I've sent in a fairly comprehensive bug report.

 

So I'm still looking for a good tool. My requirements are:

 

  • Runs reliably on PC
  • Backs up to hard disk (internal and external) and doesn't take ages to scan it every time an external disk is attached
  • Backs up to S3 (I could probably live without this if I had to, but I'd rather not)
  • Does incremental versioned backups, configurable, and has a system to delete old versions
  • Trustworthy company
  • Maintained, ideally not by a single person
  • Demonstrably reliable, stable software
  • Doesn't limit backup size for no reason

 

 

CrashPlan fails because it's slow with external disks, which I use a lot. Duplicati seems awesome but it's not released, not stable, and has defects. CloudBerry limits your backup size for no apparent good reason.

 

Genie9 Pro 9 seems to work, and it explicitly supports external disks and S3. It purges old versions after a given number of days, it's not tiered but it's ok. However purging doesn't work with S3, so you never lose your old versions or documents - maybe a plus, maybe a minus. Scheduling is a bit confusing, it seems to want to do full backups quite regularly, but I think I can make it do just one with incremental after that. That makes me wonder if it's reliable enough to do always incremental backups, or if it has to do full backups to keep things reliable.

 

The Genie UI doesn't inspire confidence, it was clearly designed by an engineer. But the software should probably work, it's by a good company, and it'll have been tested.

 

Guess I should give Genie9 a bit more of a test.

