Geekzone: technology news, blogs, forums
timmmay
#1782095 14-May-2017 20:00

@richms I think backing up external drives is probably a power user thing. It's not unreasonable to check an external drive each time it's connected. Using the cloud is fine.

 

I might change my strategy slightly, because keeping multiple TB of files I rarely access in my online backups is going to slow things down. I might put my previous years' RAW image archives onto AWS Glacier, which will cost about US$1/month. That's extra cost, but not much. I could reduce that a lot by archiving a high-res JPEG instead of the RAW, which is almost as good because all my images are processed already.
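As a sanity check on that estimate, Glacier storage cost is just size times the per-GB rate. A minimal sketch, assuming the ~2017 Glacier price of about US$0.004 per GB-month (check current AWS pricing before relying on it):

```python
# Back-of-the-envelope Glacier storage cost.
# The rate is an assumption based on ~2017 pricing.
GLACIER_RATE_PER_GB_MONTH = 0.004  # USD per GB-month (assumed)

def monthly_cost_usd(size_gb: float, rate: float = GLACIER_RATE_PER_GB_MONTH) -> float:
    """Estimated monthly storage cost in USD for size_gb of data."""
    return size_gb * rate

# At this rate, the quoted US$1/month corresponds to roughly 250 GB.
print(round(monthly_cost_usd(250), 2))
```

This also shows why archiving high-res JPEGs instead of RAW helps: halving the archive size halves the monthly bill directly.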




richms
#1782096 14-May-2017 20:03

The issue is its single-threadedness. While it's building that block info for a 3TB external that you may have just dumped 20 GB of new stuff onto, it's not backing up the new stuff and it's not backing up any other drives in the system. It also seems that if that backup set reaches the end of the backup window you gave it, it just gives up and restarts the next day when its backup time comes around, so I have to disable scheduling and just leave it running. It's quite low bandwidth and not affecting my net usage, so I don't even know if it would be quicker on a gigabit fibre connection instead of crappy DSL.

 

If it was smart enough to find files that don't exist at the other end at all and upload those while checking all the old stuff, and if it could pause and resume the sync, it wouldn't be so bad. As it is, I have things on the computer going un-backed-up for weeks because of it.





Richard rich.ms

rb99
#1782105 14-May-2017 20:10

Does CrashPlan get upset if you just shut down the computer when it's busy? I guess it'll either resume (or start again) at the next opportunity it gets, but I've always been a turn-things-off-overnight type.





“The modern conservative is engaged in one of man's oldest exercises in moral philosophy; that is, the search for a superior moral justification for selfishness.” -John Kenneth Galbraith

 


timmmay
#1782111 14-May-2017 20:35

richms:

The issue is its single-threadedness. [...] I have things on the computer going un-backed-up for weeks because of it.

Based on what you've said, I don't think CrashPlan is the best tool for dealing with large external disks that are disconnected regularly. If they're working disks, sure, but if they're more archive/media drives maybe you need a different setup: something more manual, that doesn't deduplicate, or an archiving tool.

 

rb99:

Does CrashPlan get upset if you just shut down the computer when it's busy?

Nope, no problem.


freitasm
#1782126 14-May-2017 21:29

I use CrashPlan, turn off laptops every night, and don't have a problem.





 


timmmay
#1782284 15-May-2017 09:53

I had a good play with CrashPlan this weekend, and it's been interesting to read what others are doing. It seems that:

 

  • Backing up files from your internal disks to their cloud is great, especially with versioning keeping things safe.
  • Backing up to or from external disks that hold a lot of data involves significant delays, as it wants to recheck everything on the disk first.
  • Backing up to multiple destinations is pretty good, because you can define your backup sets once and have them go to their cloud, other computers, external disks, etc. That's convenient, but the disk-checking delay applies there too.
  • The more data you have, the slower things are and the more RAM it uses, probably due to deduplication.
  • The automation is good; at least it's better than the series of scripts and the two different backup programs I used before.

Adding heaps of rarely changing folders, like image archives, doesn't seem like a great option if you want to back up to external disks as well as the cloud. That pushes me towards a separate archiving solution to keep data volumes down, which adds cost and reduces the value of CrashPlan.

 

What Next?

 

So what next? That puts me back where I started.

 

Strategy

 

I won't be giving up my hard-drive-based backups, because backup and restore speed is massively faster than cloud. I do need a bit more automation, and I want to get down to one piece of software. My NexStar enclosures are being annoying about connecting to my PC though, so I might replace them with a dock.

 

Online backup seems great to me for data that changes regularly or needs to be backed up right after it's created: financials, family photos, etc.

 

So maybe I'll have:

 

  • Everything, including RAW files and video, on hard drives (I have around 1TB of customer images and 0.5TB of personal). I back these up every month or so.
  • Medium JPEG versions of all personal and customer images in AWS Glacier; right now I'm batching to 2500 pixels and Photoshop quality 8. This is a "last resort" backup.
  • A backup tool backing up my frequently changing data: financials, documents, new photos and video. This could be CrashPlan, but it would possibly be cheaper to use the S3 IA storage class.
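Whether S3 IA works out cheaper than a flat-rate plan comes down to data volume, so a rough break-even calculation helps. This is a sketch using assumed ~2017 prices; both rates are placeholders to check against current pricing:

```python
# Rough cost comparison: S3 Standard-IA per-GB pricing versus a
# flat-rate unlimited plan. Both rates are assumptions based on
# ~2017 prices (CrashPlan Individual was around US$5.99/month).
S3_IA_RATE = 0.0125   # USD per GB-month (assumed)
FLAT_RATE = 5.99      # USD per month (assumed)

def s3_ia_monthly_cost(size_gb: float) -> float:
    """Estimated S3 Standard-IA monthly storage cost in USD."""
    return size_gb * S3_IA_RATE

def breakeven_gb() -> float:
    """Data size above which the flat-rate plan becomes cheaper."""
    return FLAT_RATE / S3_IA_RATE

print(round(s3_ia_monthly_cost(100), 2))  # ~100 GB of documents/financials
print(round(breakeven_gb()))
```

At these assumed rates, a frequently-changing working set of under a few hundred GB is cheaper on S3 IA, which matches the reasoning above.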

 

 

Tools

 

Ideally I'd like a single tool that can back up to a cloud drive and to hard drives. Being able to define backup sets with multiple destinations would be great, because I have multiple drives, and having to maintain multiple backup sets is a little annoying. I don't think anything other than CrashPlan does this though.

 

Duplicati: I'm not super keen on Duplicati, just because it's been in beta for a couple of years. It's probably great and reliable, but it's almost too flexible.

 

CloudBerry: CloudBerry Backup, however, seems interesting: it's commercial, well maintained, and has a free version. I might experiment with it more. Having had a little play though, each backup seems to go to one destination only. That means more time to configure, and changes have to be made in multiple places.

 

Arq: I've done a little reading. It seems like a decent tool, but nothing really makes it stand out.

 

All in all, I guess I'll keep using CrashPlan and pondering options for now.


 
 
 


mdf
#1784182 18-May-2017 12:59

Slightly inverse query, but I'm wondering what solution others use to back up their cloud storage locally?

 

Background: I was at a conference yesterday where a frankly terrifying story was described. The takeaway was essentially: if it's plugged in to the internet, it might be vulnerable.

 

I use Office 365 / OneDrive for Business. I have looked at all the relevant documents and assessments and am satisfied that the level of security provided by Microsoft is more than enough for the significance of the documents and records stored there. We get far more benefit from having the documents hosted by Microsoft than the potential downsides of cloud-based storage.

 

Far more likely than Microsoft getting hacked is that someone's passwords become compromised (PEBKAC). Obviously all the usual precautions are taken but it seems nothing is foolproof. If that resulted in destruction of data it would be a significant problem.

 

I realise that Office 365 caches at least some stuff locally, but I'm kind of working to a worst-case scenario where local machines are somehow compromised too.

 

I'd therefore like to keep backups off-line and as isolated as possible. I know I can do this by plugging in an external hard drive once a day/week, but I know myself, and that will happen more like once a month (at best). I was toying with the idea of having a small PC (Raspberry Pi or equivalent) sitting behind a switch connected to a timer, to physically break any connection to the internet other than when a backup is scheduled, but that feels like kind of a stupid idea. Maybe better would be some kind of router/local block on internet access other than at certain times?

 

It's all highly unlikely I know but if the precautions are simple enough they are worth taking.
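The timer idea can also be done in software rather than with a physical switch. A sketch of a root crontab (a config fragment, not a tested setup) for such a backup box, keeping the network interface down outside a nightly window; the interface name, the 02:00-03:00 window, and the pull-backups.sh script are all hypothetical:

```shell
# Hypothetical root crontab for a backup Pi: keep the network
# interface down except during a nightly backup window.

# 02:00 - bring the link up, then run the (hypothetical) backup script
0 2 * * *  /sbin/ip link set eth0 up && /usr/local/bin/pull-backups.sh
# 03:00 - drop the link again, isolating the machine
0 3 * * *  /sbin/ip link set eth0 down
```

This doesn't protect against malware already on the box, but it shrinks the window in which network-borne ransomware could reach it.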


davidcole
#1784189 18-May-2017 13:04

mdf:

Slightly inverse query, but I'm wondering what solution others use to back up their cloud storage locally? [...] It's all highly unlikely I know, but if the precautions are simple enough they are worth taking.

My Dropbox, Box and Gmail accounts are all pulled back down to my server, which is then CrashPlanned. Gmail uses a script called gmvault.





Previously known as psycik

Home Assistant: Gigabyte AMD A8 Brix, Home Assistant with Aeotech ZWave Controller, Raspberry PI, Wemos D1 Mini, Zwave, Shelly Humidity and Temperature sensors
Media:Chromecast v2, ATV4 4k, ATV4, HDHomeRun Dual
Server
Host Plex Server 3x3TB, 4x4TB using MergerFS, Samsung 850 evo 512 GB SSD, Proxmox Server with 1xW10, 2xUbuntu 22.04 LTS, Backblaze Backups, usenetprime.com fastmail.com Sharesies Trakt.TV Sharesight 


richms
#1784192 18-May-2017 13:06

I have my Dropbox and Google Drive in folders that are sent to CrashPlan.

 

If a cloud backup service was any good it would connect directly to those services, so data wasn't going out my slow DSL to the cloud, back down the DSL to several computers, and then each of them backing it up to CrashPlan when its time came overnight. But none seem to offer that as a service yet; I assume they rely on people's crap internet and slow backups as a way of throttling the use of an unlimited service.






mdf
#1784193 18-May-2017 13:06

davidcole:

My Dropbox, Box and Gmail accounts are all pulled back down to my server, which is then CrashPlanned.

Worst-case scenario and unlikely, I know, but what happens if it's the server that is compromised?


timmmay
#1784197 18-May-2017 13:07

I use Attic to make deduplicated, compressed backups of my AWS EC2 data (web files, databases, etc.). Though if I were doing it now, I might use Duplicati.

 

I back the Attic repository up to Dropbox. I sync Dropbox to my PC, and I make a zip archive of the files occasionally. Those archives are stored as offline backups in multiple locations.
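That occasional zip step is easy to script. A minimal sketch in Python (the folder paths are placeholders you supply); it stamps the archive with the date so successive runs don't overwrite each other:

```python
import zipfile
from datetime import date
from pathlib import Path

def archive_folder(src: str, dest_dir: str) -> Path:
    """Zip src into dest_dir/<name>-YYYY-MM-DD.zip and return the archive path."""
    src_path = Path(src)
    out = Path(dest_dir) / f"{src_path.name}-{date.today():%Y-%m-%d}.zip"
    with zipfile.ZipFile(out, "w", zipfile.ZIP_DEFLATED) as zf:
        for f in src_path.rglob("*"):
            if f.is_file():
                # store paths relative to the source folder
                zf.write(f, f.relative_to(src_path))
    return out
```

Copying the resulting file to two external drives gives the multiple offline locations described above.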

 

If I was actually running anything critical I'd probably do things a bit differently, to enable faster restore. For example you can sync data between accounts on AWS, and keep the second account login info separate.

 

One day soon ransomware will start deleting online backups. This is why you need offline backups.

 

Of course you should use 2FA and access policies for everything.


 
 
 
 

davidcole
#1784201 18-May-2017 13:09

mdf:

Worst-case scenario and unlikely, I know, but what happens if it's the server that is compromised?

As in: everything pulled to that server is then compromised, and backed up like that to CrashPlan?

 

Then on another system where I'm using my CrashPlan credentials (i.e. other machines on my LAN), or the iOS app, or their web app, you'd restore the n-1 version.







richms
#1784202 18-May-2017 13:12

I'm expecting that malware will start to pull CrashPlan/Backblaze etc. credentials and nuke any backups that you have in place. None seem to require two-factor for removing data (yet), and it's the next logical step to kill off any backups before destroying local copies and demanding a ransom.






timmmay
#1784205 18-May-2017 13:17

I've started using Duplicati along with Amazon S3. I could change my bucket policy to disallow deletion, but that would prevent deduplication from working properly. Enabling versioning would work, but it increases costs because nothing is ever deleted. Replication, perhaps...
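For the disallow-deletion option, an S3 bucket policy can deny the delete actions outright. A sketch (the bucket name is a placeholder); as noted, this also blocks the cleanup a deduplicating tool like Duplicati needs, so it suits a write-once archive bucket better than a Duplicati target:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "DenyObjectDeletion",
      "Effect": "Deny",
      "Principal": "*",
      "Action": ["s3:DeleteObject", "s3:DeleteObjectVersion"],
      "Resource": "arn:aws:s3:::my-backup-bucket/*"
    }
  ]
}
```

Because it's a Deny statement with `Principal: "*"`, it overrides any Allow, including ones granted to stolen credentials, which is the ransomware scenario discussed above.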


mdf
#1784211 18-May-2017 13:29

davidcole:

Then on another system where I'm using my CrashPlan credentials (i.e. other machines on my LAN), or the iOS app, or their web app, you'd restore the n-1 version.

The scenario I guess I had in mind was some kind of malware that provides remote access to the server (I don't know whether this is even possible). The server presumably holds the credentials for all the cloud services it is syncing. With that access, the hacker then either deletes everything (malicious/psychopath) or changes all the passwords on the cloud services and the server itself (ransom).

 

Given I'm still working through options, I'm aiming for something as invulnerable as possible (if that's possible). Right now, to me that means something physically isolated from the rest of the internet, which might only be achievable with an external HDD and a bit of personal discipline, but I was wondering if there was a smarter option than that.









