5385 posts | Uber Geek | +1 received by user: 2213
# 1784612 19-May-2017 10:05

timmmay:

"I think you can have multiple computers on an account. Alternately you can back up from one computer to another using CrashPlan, but that creates an encrypted archive. I use BitTorrent Sync to mirror files over, then CrashPlan to back them up. CrashPlan runs in the background and doesn't require any user actions."

Thanks. We both travel a lot, so we're not often on the same network with both computers turned on. We definitely need to back up each machine independently.





Mike

1906 posts | Uber Geek | +1 received by user: 458 | Trusted | Subscriber
# 1784620 19-May-2017 10:28

Out of curiosity, what is the default mechanism for storing these files in CrashPlan?

I'm a simple person and can comprehend COMPUTER_NAME > DRIVE NAME > FOLDER > FILES as a backup structure. It makes it easy to go the other way when restoring. Basically, if I save all the folders and files under my E: drive to the cloud, I'd expect to see them in the same layout for restore.

I've worked with too much software that manipulates a backup into a single virtual file, meaning you're hosed if that file is corrupted, or that puts extensions and additional characters on files and folders ("E:_").





________

 

Antonios K

 

14869 posts | Uber Geek | +1 received by user: 2790 | Trusted | Subscriber
# 1784623 19-May-2017 10:37

CrashPlan is de-duplicating, so it stores all your backup files as blocks within a series of large archive files. Whichever blocks need to go to their cloud service are sent as required. I think every modern backup program does something similar. You can access the individual files in this archive using their client.
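To make the block idea concrete, here's a rough sketch of content-hash de-duplication - not CrashPlan's actual format, just an illustration of how a tool can store each unique block once and rebuild files from block references (folder name is made up):

import hashlib
from pathlib import Path

BLOCK_SIZE = 4 * 1024 * 1024  # 4 MiB blocks; the size is arbitrary for this sketch

def backup_file(path, block_store, manifest):
    # Split a file into blocks and store each unique block once, keyed by its hash.
    refs = []
    with open(path, "rb") as f:
        while True:
            block = f.read(BLOCK_SIZE)
            if not block:
                break
            digest = hashlib.sha256(block).hexdigest()
            block_store.setdefault(digest, block)  # de-duplication across all files
            refs.append(digest)
    manifest[str(path)] = refs

def restore_file(path, block_store, manifest, out_path):
    # Rebuild a file by concatenating its blocks in manifest order.
    with open(out_path, "wb") as out:
        for digest in manifest[str(path)]:
            out.write(block_store[digest])

# In a real tool the block store and manifest live in large archive files on disk
# or in cloud storage; plain dicts keep the sketch short.
block_store, manifest = {}, {}
source = Path("example_data")  # hypothetical folder
if source.is_dir():
    for p in source.iterdir():
        if p.is_file():
            backup_file(p, block_store, manifest)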

 

On their website you can navigate to your files just as you say, by computer name, drive, folder, subfolder, etc. If you restore they come out in folders as you'd expect.


Stu | Hammered | 5244 posts | Uber Geek | +1 received by user: 1178 | Moderator | Trusted | Lifetime subscriber
# 1785208 20-May-2017 16:51

Haven't seen it mentioned yet, but how about the basic Amazon Drive for cloud storage? Just having a look at it now in case there's any truth to the rumours regarding Crashplan Home.





Keep calm, and carry on posting.

 

 

 





14869 posts | Uber Geek | +1 received by user: 2790 | Trusted | Subscriber
# 1785214 20-May-2017 16:59

Not a bad idea, but I'm not so keen on "drive" or Dropbox-type services. I'd prefer something more "archival" than just storage.

 

I've decided on zip + par2 stored in AWS Glacier. I understand AWS well, I can control access easily, and it offers enterprise-level reliability. Because of the security around it, I'm as confident as I can be that the data can't be corrupted or deleted either by the service or by some kind of virus / worm on my computer.
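For anyone who wants to script the same idea rather than use a GUI client, here's a rough sketch. The vault and file names are made up, and it assumes the zip and par2 command-line tools are installed and boto3 is set up with AWS credentials:

import subprocess
import boto3

ARCHIVE = "photos-2017.zip"   # hypothetical archive name
VAULT = "my-archive-vault"    # hypothetical Glacier vault

# Build the zip, then add ~10% PAR2 recovery data alongside it.
subprocess.run(["zip", "-r", ARCHIVE, "photos/"], check=True)
subprocess.run(["par2", "create", "-r10", ARCHIVE], check=True)

# Upload the archive to the Glacier vault. Fine for modest files; very large
# archives should use the multipart upload API instead. The .par2 files would
# be uploaded the same way.
glacier = boto3.client("glacier")
with open(ARCHIVE, "rb") as f:
    resp = glacier.upload_archive(vaultName=VAULT, archiveDescription=ARCHIVE, body=f)

# Keep the archive ID somewhere safe - Glacier vaults have no friendly file listing.
print("Archive ID:", resp["archiveId"])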

 

The only problem I'm having is that the free Glacier clients limit you to two upload threads, so I'm only getting 4-8Mbps upload speed to the USA with CloudBerry. I'm going to try FastGlacier soon. It's $40 to get the pro versions with more threads, but given that uploading is rare, maybe I'll just let it run for a few days.
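If you script it yourself instead, one crude way to get more concurrency is to push several archives in parallel. The GUI clients typically parallelise within a single archive using multipart uploads, which is different, but the idea is the same. Rough sketch with made-up names:

from concurrent.futures import ThreadPoolExecutor
import boto3

VAULT = "my-archive-vault"  # hypothetical vault name
ARCHIVES = ["photos-2015.zip", "photos-2016.zip", "photos-2017.zip"]  # made up

# boto3 low-level clients are thread-safe, so one shared client is fine.
glacier = boto3.client("glacier")

def upload(name):
    with open(name, "rb") as f:
        resp = glacier.upload_archive(vaultName=VAULT, archiveDescription=name, body=f)
    return name, resp["archiveId"]

# Four concurrent uploads instead of the two threads the free clients allow.
with ThreadPoolExecutor(max_workers=4) as pool:
    for name, archive_id in pool.map(upload, ARCHIVES):
        print(name, "->", archive_id)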


Stu | Hammered | 5244 posts | Uber Geek | +1 received by user: 1178 | Moderator | Trusted | Lifetime subscriber
# 1785229 20-May-2017 17:12

 It certainly doesn't seem there's a way to use Amazon Drive as a Crashplan replacement. Back to the drawing board!





Keep calm, and carry on posting.

 

 

 





14869 posts | Uber Geek | +1 received by user: 2790 | Trusted | Subscriber
# 1786475 23-May-2017 10:00

I've been playing with Duplicati, backing up to disks and S3. I tried to do a test restore today and got an odd error - mismatched block size. It's probably easily solvable, but it highlights that Duplicati is still immature. The releases are labelled "experimental" or "canary"; it's not even beta yet. Because of that, I think I want to use commercial, paid backup software that provides support.

 

I think I'm going to give Arq a go. I did a quick evaluation. Its UI is a bit clunky, but it seems to have all the required features.

 

One Arq question: can it back up to external disks that aren't always connected? It can back up to folders or a NAS, so I suspect yes. I wonder how well it handles not having the external disk connected all the time - anyone know?

 

I did note that with Arq you can "run all backups", but there doesn't seem to be a way to run backups to a specific destination. That seems like a fairly major omission.

 

Tagging @Rappelle and @ripdog


 
 
 
 




14869 posts | Uber Geek | +1 received by user: 2790 | Trusted | Subscriber
# 1786503 23-May-2017 10:31

Rappelle:

"I've not tried backing up to an external drive. Seems like it would work, given folders and NAS locations work. Best bet is to try it.

As for backing up a particular location, when you click 'Back up now' and you have multiple destinations, it will ask which one you want to do."

OK, I'll give that a shot. I've also asked support about it, partly for the question and partly to gauge their response times.

 

Good to know about backing up to multiple locations. Its user interface is a bit quirky.


449 posts | Ultimate Geek | +1 received by user: 249 | Subscriber
# 1786588 23-May-2017 11:45

I don't think it would actually be necessary to implement any code whatsoever to support backing up on sometimes-connected drives. Arq simply tries to back up every hour, and if it fails, it tries again next hour.
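Roughly what I mean, as a toy sketch - the paths and interval are made up, and this is obviously not Arq's actual code, just the shape of the retry loop:

import os
import time

DESTINATION = "/Volumes/BackupDisk"  # hypothetical external drive mount point
INTERVAL = 60 * 60                   # try once an hour

def run_backup(dest):
    print("backing up to", dest)     # the real backup work would go here

while True:
    if os.path.ismount(DESTINATION):
        run_backup(DESTINATION)
    else:
        # Drive not connected: skip this run and try again next hour.
        print("destination not available, skipping this run")
    time.sleep(INTERVAL)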

 

Not sure what you mean about Arq having a clunky UI. To me, it seems very orthodox for a backup app.

 

I had a similar experience with Duplicati: a loosely-coupled web UI (it often seemed to have trouble communicating with the daemon when the daemon was busy, and its error display left me confused about what was actually wrong), combined with unhelpful errors in the error log. I didn't know what was happening, and I didn't know whether the errors were harmless or whether I was missing data in my backups. I get errors with Arq too, but they're all "file in use by another process", which makes sense. Hmm, wasn't VSS supposed to solve those? Perhaps the error prompts Arq to use VSS? I'd better check that.

 

Arq is developed by a single guy, so although support is excellent, don't expect super response times.


1308 posts | Uber Geek | +1 received by user: 279 | Subscriber
# 1786599 23-May-2017 11:58

Stu:

"It certainly doesn't seem there's a way to use Amazon Drive as a Crashplan replacement. Back to the drawing board!"

I presume Amazon Drive would be fine as offline storage? I also presume it's not a CrashPlan replacement because Amazon Drive is so manual - no automatic drive monitoring, no versioning, etc.?





rb99




14869 posts | Uber Geek | +1 received by user: 2790 | Trusted | Subscriber
# 1786654 23-May-2017 13:16

ripdog:

"I don't think it would actually be necessary to implement any code whatsoever to support backing up on sometimes-connected drives. Arq simply tries to back up every hour, and if it fails, it tries again next hour. [...]"

That's good that it should work out of the box; I'll give it a go when I have time.

 

The UI just isn't as refined as it could be. For example, why is there no "back up now" button for an individual destination or folder in the main / right window? You have to go to a menu to do it - it's just extra steps. Why do you have to create your destination as S3 Standard, then edit it to choose the S3 IA class? When you add a folder to your S3 backup, why does it say "S3 Standard" even when the destination has the S3 IA class set? When you add a new folder, why do you need to add the folder and then edit it to remove subfolders you don't want, rather than doing everything in one step?
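For what it's worth, at the S3 API level the storage class is just a per-object attribute set at upload time, so in principle a tool can put data into IA from the start. A minimal boto3 sketch with made-up bucket and key names:

import boto3

s3 = boto3.client("s3")

# Upload one object straight into the Infrequent Access storage class.
s3.upload_file(
    "backup-block-0001.dat",            # hypothetical local file
    "my-backup-bucket",                 # hypothetical bucket
    "backups/backup-block-0001.dat",
    ExtraArgs={"StorageClass": "STANDARD_IA"},
)

# Check which class the object actually landed in.
head = s3.head_object(Bucket="my-backup-bucket", Key="backups/backup-block-0001.dat")
print(head.get("StorageClass", "STANDARD"))  # S3 omits the field for STANDARD objects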

 

It's not bad, as such; it's just slower and clunkier than I'd expect from modern software. It's not surprising that it's written by one guy, an engineer - a team with a UI expert would probably do a better job.

 

Also, a one man band is additional risk. If he gets hit by a bus there are no updates or fixes. Things like S3 integration probably need maintenance over time.

 

I'll evaluate Arq, but if there was another similar piece of software that was not a one man band I'd probably favour it.


449 posts | Ultimate Geek | +1 received by user: 249 | Subscriber
# 1786782 23-May-2017 14:38

"For example, why is there no 'back up now' button for an individual destination or folder in the main / right window? You have to go to a menu to do it. It's just extra steps that are required."

 

 

 

Well, I see your point, but very rarely would you ever need to force a backup. It's only one extra click to access a rarely needed function. Good UI design is not filling up the main view with buttons.

 

 

 

"Why do you have to create your destination as S3 Standard, then edit it to choose the S3 IA class? When you add a folder to your S3 backup, why does it say 'S3 Standard' even when the destination has the S3 IA class set?"

 

 

 

I don't use S3, but that sounds like... a bug? Certainly the second one sounds like a bug. Email him about it.

 

 

 

"When you add a new folder, why do you need to add the folder and then edit it to remove subfolders you don't want, rather than doing everything in one step?"

 

 

 

Ahh, the great continuum between simplicity and power. I guess he thought that most people wouldn't need to exclude subfolders, so again relegated it to an extra step. The nice thing is that, for the average case, adding a new folder is a one-step process.

 

 

 

"Also, a one man band is additional risk. If he gets hit by a bus there are no updates or fixes. Things like S3 integration probably need maintenance over time."

 

 

 

They do. There is, however, never going to be any actual lock-in. Even if Arq development died today, you will always be able to download your backup using standard S3 apps and extract your data locally, either using Arq or the open source command line restoration tool he offers.

 

 

 

"I'll evaluate Arq, but if there was another similar piece of software that was not a one man band I'd probably favour it."

 

One alternative is CloudBerry, which only gained Google Drive compression and encryption in the very latest release (oddly enough). I'm personally a bit wary of them - they feel like a company more interested in looking good than actually caring about the consumer. They do not publish their backup format like Arq does, and offer no open source restoration tool. If their client breaks, your data is goneburgers. Reading the FAQ, they also impose lots of silly restrictions on you even after you pay; for example, you can only back up one network folder with the standard version, and you can't install it on server versions of Windows.

 

Just discovered this in the FAQ:

 

 

 

Q: If I delete a file on my local computer is it deleted from the backup storage as well?

 

A: Yes, the file will be deleted from the online storage too. By default, though, the file will remain in the online storage for 30 days to prevent accidental deletions. We call it Smart Delete and you can learn more about it here

 

 

Oh lord.

 

 

 

 

I just feel that everything the Arq developer does is honest and good-willed. Publishing his data format and an open source restoration utility is going above and beyond simply for the sake of his customers. Just something to consider.




14869 posts | Uber Geek | +1 received by user: 2790 | Trusted | Subscriber
# 1786824 23-May-2017 15:11

If you back up to an external disk that's only brought on site for occasional backups, you would use a "back up now" button regularly.

 

Open format is good, but if he stops developing it the product dies. That's not great for a backup system that should be long term. One of the reasons I'm changing from my current backup system is that the software isn't supported any more, which doesn't inspire confidence.

 

Duplicati will be great in a few years, but it's just too immature. I've been looking at other pieces of software (EaseUS Todo, Acronis, etc.), but they all seem limited in some way, like not backing up to both local disks and the cloud. Genie9 seems promising, though.


5385 posts | Uber Geek | +1 received by user: 2213
# 1786842 23-May-2017 15:23

After a bit of reading and research I'm going with the following:

1) Synology NAS (maybe 216j);
2) OneDrive or similar for files actively being edited etc. on the machines;
3) Inactive files on the NAS (available via remote connection if required);
4) Archival backup of the NAS to a cloud service.

The choice of backup service will depend on price and what plays nicely with the NAS.





Mike



14869 posts | Uber Geek | +1 received by user: 2790 | Trusted | Subscriber
# 1786918 23-May-2017 17:08

Mike, that will give you incremental, versioned backups, right? If a virus on your PC encrypts everything on the NAS and then backs it up before you notice, will you be able to roll back to earlier versions?
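For cloud targets, the equivalent protection is object versioning plus restricted delete rights. Not specific to any of the NAS backup apps mentioned here, but as an illustration, here's how you'd enable versioning and dig out an older copy in S3 with boto3 (bucket and key names are made up):

import boto3

BUCKET = "my-nas-backup-bucket"  # hypothetical bucket name
s3 = boto3.client("s3")

# Keep every version of every object, so an encrypted or overwritten copy
# doesn't destroy the good one underneath it.
s3.put_bucket_versioning(
    Bucket=BUCKET,
    VersioningConfiguration={"Status": "Enabled"},
)

# Later: list the versions of a file and pick one from before the incident.
versions = s3.list_object_versions(Bucket=BUCKET, Prefix="documents/report.docx")
for v in versions.get("Versions", []):
    print(v["VersionId"], v["LastModified"], v["IsLatest"])

# Restoring is just a download with the chosen VersionId:
# s3.download_file(BUCKET, "documents/report.docx", "report.docx",
#                  ExtraArgs={"VersionId": "chosen-version-id"})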

 

Genie9

 

I just had a play with Genie9 Pro 9 backup. It has all the features, including S3 and external disk backup, but it just doesn't feel right to me. The UI, the way the options are set up, the lack of de-duplication - it all feels clunky. Feeling clunky isn't a problem as such, but it reduces confidence in the software. The Arq interface is better.

 

Arq

 

I'm still not sold on Arq, mostly because it's a one man band. The software and technology look pretty good. I have it running now, but oddly it doesn't say anywhere obvious that a backup is running - if you look carefully it's at the bottom of the window. That program really needs a UI designer. It also doesn't have any kind of progress indicator during backup - it says how much has been scanned and how much has been uploaded to the storage device, but not how much is left to scan.

 

It seems to be the best of the bunch right now.

 

Duplicati

 

It's unfortunate that Duplicati isn't seeming that reliable. I restored fine from my PC, which has all the settings and the local metadata database on it. I have that stored on Dropbox so I could use it remotely, but the software just doesn't inspire confidence right now. I do like that it's open source, but using experimental software for backups may not be prudent. If it were finished it'd be the best of the lot by a significant margin.

 

I just ran a local backup, and Duplicati complained it couldn't find a file that was exactly where it should've been.

 

 

 

So I guess I'll keep playing around for a while before I decide what to make my main backup system. If the Duplicati bug can be worked out I might still consider it.

 

 

 

Glacier

 

I'm happy with zipping things, using PAR2 for a bit of redundancy, then storing the data in AWS Glacier using FastGlacier. I've put around 60GB up, which will cost me US$0.24 per month. That's my long term archive sorted, so if I lose my PC I only lose recent stuff. CrashPlan is still working fine for that. Backing up to offsite disks is the main thing I want to improve, and ideally that software should back up to S3 as well.
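For anyone checking the maths, that matches Glacier's storage price at the time of roughly US$0.004 per GB-month (retrieval and request fees are extra):

storage_gb = 60
price_per_gb_month = 0.004  # approximate US-region Glacier storage price in 2017
print(f"US${storage_gb * price_per_gb_month:.2f} per month")  # US$0.24 per month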

