Geekzone: technology news, blogs, forums
Foo (508 posts, Ultimate Geek)
# 1778139 8-May-2017 21:54

I use Backblaze which works well for me. Gave Crashplan a trial but stuck with Backblaze.


IcI (978 posts, Ultimate Geek, Trusted)
# 1778158 8-May-2017 23:33

Foo: I use Backblaze which works well for me. Gave Crashplan a trial but stuck with Backblaze.

Why did you choose to stay with Backblaze?

Any Carbonite users out there?





Please keep this GZ community vibrant by contributing in a constructive & respectful manner.


davidcole (4476 posts, Uber Geek, Trusted)
# 1778176 9-May-2017 06:20

IcI:

Foo: I use Backblaze which works well for me. Gave Crashplan a trial but stuck with Backblaze.

Why did you choose to stay with Backblaze?

Any Carbonite users out there?
I used to be, before CrashPlan (and Mozy).

Their iOS tools were better (thumbnails for photos in the iOS app), and their unlimited plan had no storage cap, but they slowed your uploads down past certain limits.

Also not sure if they have the same ability to save a backup set to the cloud, a local folder on the machine, another machine on your network, and/or a friend's place.




Previously known as psycik

OpenHAB: Gigabyte AMD A8 Brix, OpenHAB with Aeotec Z-Wave controller, Raspberry Pi, Wemos D1 Mini, Z-Wave, Xiaomi humidity and temperature sensors, and Bluetooth LE sensors
Media: Chromecast v2, ATV4, Roku 3, HDHomeRun Dual
Windows 10
Host (Plex Server/CrashPlan): 2x2TB, 2x3TB, 1x4TB using DriveBender; Samsung 850 Evo 512 GB SSD; Hyper-V server with 1x W10, 1x W2k8, 2x Ubuntu 16.04 LTS; CrashPlan; NextPVR channel for Plex; NextPVR Metadata Agent and Scanner for Plex




timmmay (15012 posts, Uber Geek, Trusted, Subscriber)
# 1778204 9-May-2017 07:18

Does anyone roll their own cloud backups? It's probably not worth it given the number of "unlimited backup" providers for negligible money.

I'll play with CrashPlan backup sets some time soon. I'll get my offsite backups back into the house and see how well it works with drives that are mostly disconnected.



185 posts, Master Geek, Lifetime subscriber
# 1778606 9-May-2017 16:51

http://www.hashbackup.com/

And B2 cloud storage from Backblaze.

It only works on Unix-like systems, though.




timmmay (15012 posts, Uber Geek, Trusted, Subscriber)
# 1778623 9-May-2017 17:29

I've seen the Backblaze B2 file storage, which is S3-compatible. Having a Unix client is handy. Backing up EC2 / VM instance data from AWS to somewhere outside the AWS ecosystem isn't as easy as it should be.




timmmay (15012 posts, Uber Geek, Trusted, Subscriber)
# 1778872 10-May-2017 07:11

CrashPlan doesn't seem to deduplicate between backup sets.

 

I moved my backups into a few sets, so I can vary the destinations. For example I want RAW files in the cloud and on an external disk, but not on my internal mirror drive. The RAW files are in two backup sets, one fully uploaded, yet they're being uploaded again. Seems like an edge case they may not have considered that will increase their bandwidth requirements. Perhaps they'll de-duplicate on the server rather than on the client, to reduce total storage requirements.
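As an aside for anyone curious what "de-duplicating between sets" means mechanically, client-side block de-duplication can be sketched in a few lines. The block size and hash choice below are illustrative assumptions only; CrashPlan's actual design isn't public.

```python
# Toy sketch of client-side block de-duplication: files are split into
# fixed-size blocks, each block is stored once under its hash, and a
# backup set is just a list of hash references.
import hashlib

def dedup_blocks(data: bytes, block_size: int = 4096):
    """Return (unique_blocks, refs) for the given data.

    unique_blocks maps hash -> block, so repeated blocks are stored once;
    refs is the ordered list of hashes needed to rebuild the data.
    """
    unique_blocks = {}
    refs = []
    for i in range(0, len(data), block_size):
        block = data[i:i + block_size]
        h = hashlib.sha256(block).hexdigest()
        unique_blocks.setdefault(h, block)  # keep one copy per distinct block
        refs.append(h)
    return unique_blocks, refs

# The same file placed in a second "backup set" adds no new blocks:
payload = b"same RAW file contents" * 1000
store, _ = dedup_blocks(payload)
store_again, _ = dedup_blocks(payload)
before = len(store)
store.update(store_again)
assert len(store) == before  # nothing new to upload
```

If the client works this way, re-adding an already-uploaded file to a second set should cost a scan plus hash lookups, not a re-upload.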


12 posts, Geek
# 1778880 10-May-2017 07:49

timmmay:

Does anyone roll their own cloud backups? It's probably not worth it given the number of "unlimited backup" providers for negligible money.

I'll play with CrashPlan backup sets some time soon. I'll get my offsite backups back into the house and see how well it works with drives that are mostly disconnected.
I run OwnCloud off a Raspberry Pi with a TB of space on an external HD. I will be updating this to 2 or 3 TB as I have run out of space. OwnCloud supports versioning of files, so the extra space will allow more versions to be kept. The same Pi also acts as a file server for a separate media server.

 

The OwnCloud client is used to back up our laptops and Android phone.

 

A backup copy of the external hard drive is made periodically. Automating that backup and running it from off-site is on the to-do list, but fibre is still a year away for us. OwnCloud can sync with Google Drive, Dropbox, etc., but I haven't tried that yet.
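For anyone scripting that periodic copy before proper automation arrives, a minimal one-way mirror might look like the sketch below. The paths and the mtime-newer rule are assumptions, a rough stand-in for rsync, not what the poster actually runs.

```python
# Minimal one-way mirror: copy files from src to dst when they are
# missing or newer, leaving everything else untouched.
import os
import shutil

def mirror(src: str, dst: str) -> list[str]:
    """Mirror src into dst; return the list of destination paths copied."""
    copied = []
    for root, _dirs, files in os.walk(src):
        rel = os.path.relpath(root, src)
        target_dir = os.path.join(dst, rel)
        os.makedirs(target_dir, exist_ok=True)
        for name in files:
            s = os.path.join(root, name)
            d = os.path.join(target_dir, name)
            # copy when the destination is absent or older than the source
            if not os.path.exists(d) or os.path.getmtime(s) > os.path.getmtime(d):
                shutil.copy2(s, d)  # copy2 preserves mtime, so reruns skip it
                copied.append(d)
    return copied
```

Scheduled from cron, repeat runs only touch changed files; note this sketch never deletes from the destination, which is usually what you want for a backup.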


davidcole (4476 posts, Uber Geek, Trusted)
# 1778885 10-May-2017 08:06

timmmay:

CrashPlan doesn't seem to deduplicate between backup sets.

I moved my backups into a few sets, so I can vary the destinations. For example I want RAW files in the cloud and on an external disk, but not on my internal mirror drive. The RAW files are in two backup sets, one fully uploaded, yet they're being uploaded again. Seems like an edge case they may not have considered that will increase their bandwidth requirements. Perhaps they'll de-duplicate on the server rather than on the client, to reduce total storage requirements.

Does it actually upload the second copy? I got the impression they were pretty hot on dedupe, and when it rips through analysing the files, that's when it figures out what's already there.

You could ask them; I wouldn't think they'd have missed that.

So are your RAW files in two backup sets from the same path, or are they two different copies (i.e. two different paths) holding the same data?

I tend to add large datasets to one of my backup sets that doesn't back up as frequently, then add them to a more permanent set later on and remove them from the first, and I've not noticed it having to restart, which would be similar behaviour to backing up twice in two different sets.

 

 

 

 









timmmay (15012 posts, Uber Geek, Trusted, Subscriber)
# 1778887 10-May-2017 08:12

davidcole:

Does it actually upload the second copy? I got the impression they were pretty hot on dedupe, and when it rips through analysing the files, that's when it figures out what's already there.

You could ask them; I wouldn't think they'd have missed that.

So are your RAW files in two backup sets from the same path, or are they two different copies (i.e. two different paths) holding the same data?

I tend to add large datasets to one of my backup sets that doesn't back up as frequently, then add them to a more permanent set later on and remove them from the first, and I've not noticed it having to restart, which would be similar behaviour to backing up twice in two different sets.

Same files in two different backup sets. I don't keep duplicate information on the same hard drive. I do have a second copy of some files on another drive, but that's not backed up with CrashPlan.

It's using all my bandwidth (because I told it to), so yeah, I think it is. I think I created the second backup set including the files, then I may have taken them out of the first backup set. I might have been better off waiting until after the first backup ran.

Not that it particularly matters; it's an unlimited backup service and my fibre is unlimited. I prefer to be efficient when I can though.

I was describing my backup system to a friend, who thought it might be a little over the top. I do run a few small businesses though, and have terabytes of images, including my own and customers'. Not everything goes to each destination. I don't back up customer wedding photos to the cloud, as 2 TB is too large, so I keep them on hard drives in three locations.

 

  • RAID mirror of my main data disk
  • CrashPlan mirroring data to another drive
  • CrashPlan uploading data to the cloud service
  • One hard drive in my shed (first line backups)
  • Two hard drives at work (main backups)
  • One hard drive with a friend (mostly wedding backups)
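With the same photos sitting on drives in three locations, a checksum manifest is one simple way to confirm the copies still agree. This is just an illustrative sketch with placeholder paths, not part of CrashPlan or the setup above.

```python
# Build a SHA-256 manifest per folder, then diff two manifests to find
# files that are missing or have diverged between backup drives.
import hashlib
import os

def manifest(folder: str) -> dict[str, str]:
    """Map each file's path (relative to folder) to its SHA-256 hex digest."""
    result = {}
    for root, _dirs, files in os.walk(folder):
        for name in files:
            path = os.path.join(root, name)
            h = hashlib.sha256()
            with open(path, "rb") as f:
                # hash in 1 MiB chunks so large RAW files don't load into RAM
                for chunk in iter(lambda: f.read(1 << 20), b""):
                    h.update(chunk)
            result[os.path.relpath(path, folder)] = h.hexdigest()
    return result

def compare(a: dict[str, str], b: dict[str, str]) -> list[str]:
    """Return relative paths that are missing or differ between two manifests."""
    return sorted(k for k in a.keys() | b.keys() if a.get(k) != b.get(k))
```

Run `compare(manifest("/mnt/shed_drive"), manifest("/mnt/work_drive"))` (paths hypothetical) whenever a drive comes home; an empty list means the copies match.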

davidcole (4476 posts, Uber Geek, Trusted)
# 1778890 10-May-2017 08:23

timmmay:

Same files in two different backup sets. I don't keep duplicate information on the same hard drive. I do have a second copy of some files on another drive, but that's not backed up with CrashPlan.

[snip]

  • RAID mirror of my main data disk
  • CrashPlan mirroring data to another drive
  • CrashPlan uploading data to the cloud service
  • One hard drive in my shed (first line backups)
  • Two hard drives at work (main backups)
  • One hard drive with a friend (mostly wedding backups)

I don't quite have all the extra drives, but like you I have multiple destinations:

  • CrashPlan to the cloud
  • CrashPlan to another drive in the same machine
  • CrashPlan to another machine at home
  • Folder replication from one drive (where files are "mirrored" - I use Drive Bender, not proper drive mirroring) to another drive where files are also mirrored - things like VM exports
  • Folder replication from one machine to the other that also has CrashPlan - VM exports and system backups

It's 100% automatic; I don't do anything manually.

 

 









timmmay (15012 posts, Uber Geek, Trusted, Subscriber)
# 1778892 10-May-2017 08:29

Fully automatic would be nice. I could back up to my wife's laptop I guess, but it has a 120 GB SSD and I have 6 TB of storage ;) Because of the data volumes I find offsite disks essential for now, which involves scripting. I'm hoping to use CrashPlan to automate it more though.

 

I've read that the more you store, the more memory and resources it takes. That makes sense: deduplication keeps block hashes in RAM. Not a problem at my volume though.
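A back-of-the-envelope check shows why the hash table grows with data but stays small at this scale. The block size and per-entry cost below are assumptions for illustration, not CrashPlan's real figures.

```python
# Rough RAM estimate for a dedup hash table: one entry per block.
TB = 10 ** 12  # decimal terabyte, as drives are marketed

def hash_table_ram(data_bytes: int,
                   block_size: int = 4 * 2 ** 20,   # assume 4 MiB blocks
                   entry_bytes: int = 64) -> int:    # assume 64 B per entry
    """Bytes of RAM to hold one hash-table entry per data block."""
    blocks = data_bytes // block_size
    return blocks * entry_bytes

# 6 TB at these assumptions is only ~87 MiB of table; real clients also
# carry per-file metadata, caches and (for Java clients) JVM overhead.
print(hash_table_ram(6 * TB) // 2 ** 20, "MB")  # prints: 87 MB
```

Smaller blocks deduplicate better but multiply the entry count, which is where the "more data, more RAM" behaviour comes from.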




timmmay (15012 posts, Uber Geek, Trusted, Subscriber)
# 1778893 10-May-2017 08:32
One person supports this post

I asked CrashPlan about de-duplication between sets and got a reply in about 20 minutes. Impressive!

 

--

 

We do de-duplicate between backup sets! It's likely that you saw a file verification scan. The scan puts ALL of your data in the to-do list. Then, as the backup after the scan runs, it analyzes each file to determine whether it has been backed up before and can be skipped, or is new/changed and needs to be backed up. The percentage complete that you see is CrashPlan's progress through the current to-do list, not the overall backup.

 

The scan is an important and normal part of CrashPlan's scheduled activities, but it can also be triggered by several events. To read more, click on the link below:

 

The explanation for the lengthy time estimation relates to how CrashPlan prioritizes files for backup. CrashPlan backs up your most recent changes first. When the scan finds new files for backup, these files go straight to the top of CrashPlan's “to do” list. This impacts the estimated time to complete backup because the estimate is based on the type of files the scan is reviewing (i.e., new or existing) and your current network speed.

 

As these new or newly modified files complete backup, CrashPlan moves on to your already-backed-up files. At that point, the effective transfer rate rises dramatically because CrashPlan sends significantly less data to your backup destinations for previously-backed up files. This is data de-duplication in action.

 

Please let me know if there's anything else I can do for you—I'm happy to help.


davidcole (4476 posts, Uber Geek, Trusted)
# 1778902 10-May-2017 08:44

timmmay:

Fully automatic would be nice. I could back up to my wife's laptop I guess, but it has a 120 GB SSD and I have 6 TB of storage ;) Because of the data volumes I find offsite disks essential for now, which involves scripting. I'm hoping to use CrashPlan to automate it more though.

I've read that the more you store, the more memory and resources it takes. That makes sense: deduplication keeps block hashes in RAM. Not a problem at my volume though.

For the wife's, since "don't play with it" rules apply, I just Acronis it every day back to my main server; then the images are copied to the mirrored drive and to the other machine.

 

I also have a big robocopy script that runs over her profile dir and copies it to a directory on the server that is covered by the three CrashPlan destinations.

 

 

 

Re the dedupe, yeah, it's the difference between the client saying "backing up" (normally at around 1-3 Mbps) and "analysing", which you see running in the 50-100 Mbps range, where I think it's just hashing the file and comparing against a stored hash.

 

 






