richms:
I think it's transferring the details of those blocks of the files, which, because my internet is crap, just takes so long, but it's more the fact that they have designed the process so it can't be stopped and started that is the most annoying thing.
I also think they use it as a method of stopping people putting too much up onto their service. That, and the problem where the service will silently crash and restart all the time, effectively doing nothing until you give it some commands to let it use more memory to do its thing, are also really shady.
Check it's backing up. Drop a brand new file in the middle of your stuff and check that it can be restored a week or so later. If not, then investigate what's gone wrong. It really is not a set-up-and-leave platform like they would have you believe.
Sounds like something to keep an eye on, but is it likely to be more of a problem than the other options (Backblaze etc.)? It doesn't seem to have a dubious reputation, though it's unlikely to be perfect.
rb99
@richms I think backing up external drives is probably a power user thing. It's not unreasonable to check an external drive each time it's connected. Using the cloud is fine.
I might change my strategy slightly, because keeping multiple TB of files I rarely access in my online backups is going to slow things down. I might put my previous years' RAW image archives onto AWS Glacier, which will cost about US$1/month. That's extra cost, but not much. I could reduce that a lot by archiving a high-res JPEG instead of the RAW, which is almost as good because all my images are processed already.
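As a rough sketch, one way to do the Glacier side today is with boto3 and the S3 Glacier storage class; the bucket name, key prefix and local folder below are made up, and AWS credentials are assumed to be configured already:

```python
# Rough sketch: push a folder of processed image archives to S3 using the
# Glacier storage class, so they cost very little to keep but are slow to
# restore. Assumes AWS credentials are already configured; the bucket name,
# key prefix and local folder are all placeholders.
import os
import boto3

s3 = boto3.client("s3")
BUCKET = "photo-archive-example"  # hypothetical bucket

def archive_folder(folder, prefix):
    for name in sorted(os.listdir(folder)):
        path = os.path.join(folder, name)
        if not os.path.isfile(path):
            continue
        s3.upload_file(
            path,
            BUCKET,
            f"{prefix}/{name}",
            ExtraArgs={"StorageClass": "GLACIER"},  # billed at Glacier rates
        )

archive_folder(r"D:\Photos\2016-raw", "raw-archive/2016")
```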
The issue is its single-threadedness. While it's doing that block info for a 3TB external that you may have just dumped 20 gigs of new stuff onto, it's not backing up the new stuff, it's not backing up any other drives in the system, and it seems that if that backup set gets to the end of the allocated backup time you gave it, it will just give up and restart again the next day when its backup time comes around. I have to disable scheduling and just leave it going. It's quite low bandwidth and not affecting my net usage, so I don't even know if it would be quicker on a gig fiber connection instead of crappy DSL.
If it was smart enough to find files that do not exist at the other end at all and do those while checking out all the old stuff, and also able to pause and resume the sync, it wouldn't be so bad, but I have things on the computer go unbacked-up for weeks because of it.
Does CrashPlan get upset if you just shut down the computer when it's busy? I guess it'll either resume (or start again) the next opportunity it gets, but I've always been a turn-things-off-overnight type.
rb99
richms:
The issue is its single-threadedness. While it's doing that block info for a 3TB external that you may have just dumped 20 gigs of new stuff onto, it's not backing up the new stuff, it's not backing up any other drives in the system, and it seems that if that backup set gets to the end of the allocated backup time you gave it, it will just give up and restart again the next day when its backup time comes around. I have to disable scheduling and just leave it going. It's quite low bandwidth and not affecting my net usage, so I don't even know if it would be quicker on a gig fiber connection instead of crappy DSL.
If it was smart enough to find files that do not exist at the other end at all and do those while checking out all the old stuff, and also able to pause and resume the sync, it wouldn't be so bad, but I have things on the computer go unbacked-up for weeks because of it.
Based on what you've said I don't think CrashPlan is the best tool for dealing with large external disks that are disconnected regularly. If they're working disks, sure, but if they're more archive / media maybe you need a different setup. Maybe something more manual, that doesn't deduplicate, or an archiving tool.
rb99:
Does CrashPlan get upset if you just shut down the computer when it's busy? I guess it'll either resume (or start again) the next opportunity it gets, but I've always been a turn-things-off-overnight type.
Nope, no problem.
I had a good play with CrashPlan this weekend, and interesting to read what others are doing. It seems that:
Adding heaps of rarely changing folders, like image archives, doesn't seem like a great option if you want to back up to external disks as well as the cloud. That would push me to a separate solution for archiving to keep data volumes down, which adds cost and decreases the value of CrashPlan.
What Next?
So what next? That puts me back where I started.
Strategy
I won't be giving up my hard drive based backups, because backup and restore speed is massively faster than cloud. I do need a bit more automation, and I want to get down to one piece of software. My NexStar enclosures are being annoying about connecting to my PC though, so I might replace them with a dock.
Online backup seems great to me for data that changes regularly or needs to be backed up right after it's created: financials, family photos, and so on.
So maybe I keep both: hard drives for the bulk archive data, and online backup for the stuff that changes often.
Tools
Ideally I'd like a single tool that can backup to a Cloud drive and to hard drives. If you can define backup sets and multiple destinations, great, because I have multiple drives, and having to keep multiple backup sets is a little annoying. I don't think anything other than CrashPlan does this though.
Duplicati: I'm not super keen on Duplicati, just because it's been in beta for a couple of years. It's probably great and reliable, but it's almost too flexible.
Cloudberry: CloudBerry Backup, however, seems interesting: it's commercial, well maintained, and has a free version. I might start to experiment with it more. Having had a little play though, each backup seems to go to one destination only. That means more time to configure, and changes have to be made in multiple places.
Arq: I've done a little reading. It seems like a decent tool, but nothing really makes it stand out.
All in all, I guess I'll keep using CrashPlan and pondering options for now.
Slightly inverse query, but I'm wondering what solution others use to back up their cloud storage locally?
Background: I was at a conference yesterday where a frankly terrifying story was described. The takeaway was essentially that if it's plugged into the internet, it might be vulnerable.
I use Office 365 / OneDrive for Business. I have looked at all the relevant documents and assessments and am satisfied that the level of security provided by Microsoft is more than enough for the significance of the documents and records stored there. We get far more benefit from having the documents hosted by Microsoft than the potential downsides of cloud-based storage.
Far more likely than Microsoft getting hacked is that someone's passwords become compromised (PEBKAC). Obviously all the usual precautions are taken but it seems nothing is foolproof. If that resulted in destruction of data it would be a significant problem.
I realise that Office 365 caches at least some stuff locally, but I'm kind of working to a worst-case scenario where local machines are somehow compromised too.
I'd therefore like to keep offline backups that are as isolated as possible. I know I can do this by plugging in an external hard drive once a day/week, but I know myself and know that will happen more like once a month (at best). I was toying with the idea of having a small PC (Raspberry Pi or equivalent) sitting behind a switch connected to a timer, to physically break any connection to the internet other than when a backup is scheduled, but that feels like a stupid idea. Maybe better would be some kind of router/local block on internet access other than at certain times?
It's all highly unlikely, I know, but if the precautions are simple enough they are worth taking.
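For what it's worth, a rough sketch of the "only online during the backup window" idea, assuming a Linux box (Pi or similar) with iptables and a couple of cron entries; the interface name and the times in the comments are made up:

```python
# Rough sketch of a "gate" script for a small backup box: cron opens the
# gate just before the scheduled backup and closes it afterwards, e.g.
#   55 2 * * * /usr/bin/python3 /usr/local/bin/gate.py open
#   30 4 * * * /usr/bin/python3 /usr/local/bin/gate.py close
# Assumes Linux, iptables and root; the interface name is a placeholder.
import subprocess
import sys

IFACE = "eth0"  # hypothetical network interface

def close_gate():
    # Drop all outbound traffic on the interface so the box is effectively offline.
    subprocess.run(["iptables", "-A", "OUTPUT", "-o", IFACE, "-j", "DROP"], check=True)

def open_gate():
    # Remove the drop rule so the backup job can reach the internet again.
    subprocess.run(["iptables", "-D", "OUTPUT", "-o", IFACE, "-j", "DROP"], check=False)

if __name__ == "__main__":
    if len(sys.argv) != 2 or sys.argv[1] not in ("open", "close"):
        sys.exit("usage: gate.py open|close")
    open_gate() if sys.argv[1] == "open" else close_gate()
```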
mdf:
Slightly inverse query, but I'm wondering what solution others use to back up their cloud storage locally?
Background: I was at a conference yesterday where a frankly terrifying story was described. The takeaway was essentially that if it's plugged into the internet, it might be vulnerable.
I use Office 365 / OneDrive for Business. I have looked at all the relevant documents and assessments and am satisfied that the level of security provided by Microsoft is more than enough for the significance of the documents and records stored there. We get far more benefit from having the documents hosted by Microsoft than the potential downsides of cloud-based storage.
Far more likely than Microsoft getting hacked is that someone's passwords become compromised (PEBKAC). Obviously all the usual precautions are taken but it seems nothing is foolproof. If that resulted in destruction of data it would be a significant problem.
I realise that Office 365 caches at least some stuff locally, but I'm kind of working to a worst-case scenario where local machines are somehow compromised too.
I'd therefore like to keep offline backups that are as isolated as possible. I know I can do this by plugging in an external hard drive once a day/week, but I know myself and know that will happen more like once a month (at best). I was toying with the idea of having a small PC (Raspberry Pi or equivalent) sitting behind a switch connected to a timer, to physically break any connection to the internet other than when a backup is scheduled, but that feels like a stupid idea. Maybe better would be some kind of router/local block on internet access other than at certain times?
It's all highly unlikely, I know, but if the precautions are simple enough they are worth taking.
My Dropbox, Box and Gmail accounts are all pulled back down to my server, which is then CrashPlanned. Gmail uses a script called gmvault.
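A minimal sketch of that kind of scheduled pull, assuming gmvault is installed and already authorised for the account; the address and database path are placeholders, and the resulting gmvault-db folder is what the backup tool then picks up:

```python
# Minimal sketch of a scheduled "pull my Gmail down locally" job around
# gmvault. Assumes gmvault is installed and has already been authorised for
# the account; the address and database directory are placeholders. The
# gmvault-db folder is what CrashPlan then backs up.
import subprocess

ACCOUNT = "someone@gmail.com"        # hypothetical address
DB_DIR = "/srv/backups/gmvault-db"   # folder included in the CrashPlan set

# "-t quick" only fetches recent changes; a full sync is the default behaviour.
subprocess.run(
    ["gmvault", "sync", "-t", "quick", "-d", DB_DIR, ACCOUNT],
    check=True,
)
```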
Previously known as psycik
OpenHAB: Gigabyte AMD A8 Brix, OpenHAB with Aeotech ZWave Controller, Raspberry PI, Wemos D1 Mini, Zwave, Xiaomi Humidity and Temperature sensors
Media:Chromecast v2, ATV4 4k, ATV4, HDHomeRun Dual
Windows 10 Host Plex Server 3x3TB, 4x4TB using DriveBender, Samsung 850 evo 512 GB SSD, Hyper-V Server with 1xW10, 2xUbuntu 20.04 LTS, Backblaze Backups,
I have my Dropbox and Google Drive in folders that are sent to CrashPlan.
If a cloud backup service was any good it would allow a direct connection to those services, so it wasn't going out my slow DSL to the cloud, back down the DSL to several computers, and then each of them backing it up to CrashPlan when its time came overnight. But none seem to offer that as a service yet, as I assume they use people's crap internet and slow backups as a way of throttling the use of an unlimited service.
davidcole:
My Dropbox, Box and Gmail accounts are all pulled back down to my server, which is then CrashPlanned. Gmail uses a script called gmvault.
Worst-case scenario/unlikely, I know, but what happens if it's the server that is compromised?
I use Attic to make deduplicated, compressed backups of my AWS EC2 data (web files, databases, etc). Though if I was doing it now I might use Duplicati.
I back the Attic database up to Dropbox. I sync Dropbox to my PC, and I make a zip archive of the files occasionally. Those archive files are stored to offline backups in multiple locations.
If I was actually running anything critical I'd probably do things a bit differently, to enable faster restore. For example you can sync data between accounts on AWS, and keep the second account login info separate.
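A rough sketch of that Attic routine, assuming the repository has already been created with `attic init`; the repo path, source paths and retention numbers are placeholders, and the flag spellings may vary slightly between Attic versions and its Borg fork:

```python
# Rough sketch of the Attic routine: one deduplicated archive per day, then
# prune old ones. Assumes a repository already created with "attic init";
# repo path, source paths and retention numbers are placeholders.
import subprocess
from datetime import date

REPO = "/backups/attic-repo"                  # hypothetical repository path
SOURCES = ["/var/www", "/var/backups/mysql"]  # web files and database dumps

# Create today's archive inside the repository.
archive = f"{REPO}::site-{date.today().isoformat()}"
subprocess.run(["attic", "create", archive, *SOURCES], check=True)

# Keep a week of dailies and a few months of monthlies; drop the rest.
subprocess.run(
    ["attic", "prune", REPO, "--keep-daily=7", "--keep-monthly=3"],
    check=True,
)
```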
One day soon ransomware will start deleting online backups. This is why you need offline backups.
Of course you should use 2FA and access policies for everything.
mdf:
davidcole:
My Dropbox, Box and Gmail accounts are all pulled back down to my server, which is then CrashPlanned. Gmail uses a script called gmvault.
Worst-case scenario/unlikely, I know, but what happens if it's the server that is compromised?
As in, everything pulled to that server is then compromised and backed up like that to CrashPlan?
Then, on another system where I'm using my CrashPlan credentials (i.e. other machines on my LAN), or the iOS app, or their web app, you'd restore the n-1 version.
I am expecting that malware will start to pull CrashPlan/Backblaze etc. credentials and nuke any backups that you have in place. None seem to require two-factor authentication for removing data (yet), and it's the next logical step to kill off any backups before destroying local copies and demanding a ransom.
I've started using Duplicati along with Amazon S3. I could change my bucket policy to disallow deletion, but that would stop the deduplication working properly. Enabling versioning would work, but it increases costs as nothing is ever deleted. Replication, perhaps...
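A sketch of the versioning route with boto3, plus a lifecycle rule so old versions eventually expire instead of piling up forever; the bucket name and retention period are placeholders, and it only really helps if the compromised credentials can't also change these settings:

```python
# Sketch of the versioning option for the Duplicati + S3 setup: keep every
# overwritten or deleted object as a non-current version, and expire those
# versions after 90 days to cap the extra storage cost. Bucket name and
# retention period are placeholders.
import boto3

s3 = boto3.client("s3")
BUCKET = "my-duplicati-backups"  # hypothetical bucket

s3.put_bucket_versioning(
    Bucket=BUCKET,
    VersioningConfiguration={"Status": "Enabled"},
)

s3.put_bucket_lifecycle_configuration(
    Bucket=BUCKET,
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "expire-old-versions",
                "Status": "Enabled",
                "Filter": {"Prefix": ""},
                "NoncurrentVersionExpiration": {"NoncurrentDays": 90},
            }
        ]
    },
)
```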