Nil Einne
339 posts

Ultimate Geek
+1 received by user: 14


  Reply # 1855031 30-Aug-2017 00:35

SirHumphreyAppleby:

 

There may be some additional resource used to flag blocks encrypted with Blowfish vs AES, but there must already be headers for blocks, as encryption and compression can be disabled on archives. It seems improbable they wouldn't have a free bit somewhere in those headers to flag Blowfish/AES. The limit therefore appears to be entirely arbitrary.

 

I believe the technical platform constraint is that they are hoping people with larger backups go elsewhere. Last year they changed the definition of "commercially reasonable" for the multi-computer Home plan from 20TB to 5TB. I note the CrashPlan for Small Business terms now say "e.g. 5TB", while the Home terms said "i.e. 5TB" (previously 20TB). That's quite a significant change in definition. Sneaky.

 

 

They didn't say it had anything to do with the Blowfish/AES issue. I imagine it's much more likely to do with how the storage works on their end, on either or both sides. Maybe if you have more than 5TB it's stored in multiple parts at some level. I'm sure they could have found a solution if they tried hard enough, and the fact they'd prefer people with such large backups to go elsewhere is probably a factor, but I wouldn't be surprised if there really is a technical reason for it.
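To illustrate the point quoted above about a spare header bit: distinguishing two ciphers only takes one bit in a per-block header. The layout below is purely hypothetical (CrashPlan's archive format isn't public), and the field names and sizes are my own assumptions for the sketch.

```python
import struct

# Hypothetical per-block archive header -- NOT CrashPlan's real format, just a
# sketch of how one spare flag bit could record which cipher was used.
#
#   offset 0: uint32  payload length in bytes
#   offset 4: uint8   flags  (bit 0: 0 = Blowfish, 1 = AES-256;
#                             bit 1: compressed; bit 2: encrypted)
HEADER_FMT      = ">IB"          # big-endian uint32 + uint8 (5 bytes total)
FLAG_AES        = 0b001
FLAG_COMPRESSED = 0b010
FLAG_ENCRYPTED  = 0b100

def pack_header(length: int, aes: bool, compressed: bool, encrypted: bool) -> bytes:
    flags = (FLAG_AES if aes else 0) \
          | (FLAG_COMPRESSED if compressed else 0) \
          | (FLAG_ENCRYPTED if encrypted else 0)
    return struct.pack(HEADER_FMT, length, flags)

def describe(header: bytes) -> str:
    length, flags = struct.unpack(HEADER_FMT, header)
    cipher = "AES-256" if flags & FLAG_AES else "Blowfish"
    return f"{length} bytes, cipher={cipher}, compressed={bool(flags & FLAG_COMPRESSED)}"

print(describe(pack_header(4096, aes=True, compressed=True, encrypted=True)))
# -> 4096 bytes, cipher=AES-256, compressed=True
```

Whether the real block headers have such a bit free is exactly the open question in the quote; the sketch only shows how cheap the flag itself would be.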


Paul1977
2121 posts

Uber Geek
+1 received by user: 605


  Reply # 1855191 30-Aug-2017 11:20

RmACK:

 

Good luck! I had a deep pruning forced on me right when I wanted to do a restore - and it took over 24 hours! 

 

I can't emphasise enough just how big a speed increase I got by turning dedupe & compression to minimum settings - I frequently get over 120Mbps, but it does vary down to 20Mbps sometimes. Before changing settings, I never went over 8Mbps. This is on Bigpipe 1000/500 fibre. And for large backup sets, assign 1GB per TB to Java or risk random crashes.

 

 

All finished now and migration complete.

 

The pruning seemed to just randomly stop and restart again on its own several times, but if it seemed like it wasn't going to restart on its own (or I just got impatient) I restarted the CrashPlan service, which re-enabled the "Compact" button so I could manually tell it to restart. It always seemed to pick up where it left off, and I suspect that if I'd been a little more patient it would have completed on its own without my interference (it probably would have been safer to just leave it alone rather than restarting services in the middle of it, but it seems to have worked out OK in my case).

 

I've now turned off compression and set dedupe to minimum, and while I've only backed up a little since the migration, the speed does seem much improved.

 

I have Java set at 5GB for my 4.9TB backup set, but memory usage seems to sit at "only" 1.8GB most of the time.

 

 


 
 
 
 


Paul1977
2121 posts

Uber Geek
+1 received by user: 605


  Reply # 1855195 30-Aug-2017 11:25

Nil Einne:

 

They didn't say it had anything to do with the Blowfish/AES issue. I imagine it's much more likely to do with how the storage works on their end, on either or both sides. Maybe if you have more than 5TB it's stored in multiple parts at some level. I'm sure they could have found a solution if they tried hard enough, and the fact they'd prefer people with such large backups to go elsewhere is probably a factor, but I wouldn't be surprised if there really is a technical reason for it.

 

 

I have no idea about the technicalities of it, but the migration is essentially instantaneous, so it doesn't seem like they are actually "migrating" the data itself from one system to another. It really just seems like the data stays exactly where it is.


jpwise
572 posts

Ultimate Geek
+1 received by user: 8


  Reply # 1855196 30-Aug-2017 11:28

Do you have to change clients when you move to the Pro plan? Or does the existing client continue?





Working for Service Plus - www.serviceplus.co.nz

Authorised Service Agent for Apple, BenQ, Navman, Sony, and Toshiba - warranty & non-warranty repairs.

Paul1977
2121 posts

Uber Geek
+1 received by user: 605


  Reply # 1855199 30-Aug-2017 11:30
One person supports this post

jpwise:

 

Do you have to change clients when you move to the Pro plan? Or does the existing client continue?

 

 

It automatically updates the client for you the next time you launch it.


RmACK
161 posts

Master Geek
+1 received by user: 18


  Reply # 1855201 30-Aug-2017 11:32

Paul1977:
I've now turned off compression and set dedupe to minimum, and while I've only backed up a little since the migration, the speed does seem much improved.

I have Java set at 5GB for my 4.9TB backup set, but memory usage seems to sit at "only" 1.8GB most of the time.


The 1GB/TB is only a rule of thumb I saw suggested by CP support. The key is that you allow more than what is needed; I know from experience that hitting the limit causes it to crash. The default 1GB is clearly too little for your backup set. IIRC the RAM usage varies with dedupe settings and what data it is backing up at the time, so YMMV.
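For anyone applying that rule of thumb, here's a minimal sketch of the arithmetic. The 256 MB rounding and the 1 GB floor are my own assumptions for the sketch, not anything official from Code42:

```python
# Rough heap sizing per the ~1 GB of Java heap per 1 TB of backup data rule
# of thumb quoted above. The minimum and rounding are assumptions, not an
# official formula.

def suggested_heap_mb(backup_tb: float, gb_per_tb: float = 1.0,
                      minimum_mb: int = 1024) -> int:
    """Return a suggested -Xmx value in MB, rounded up to the next 256 MB."""
    needed_mb = int(backup_tb * gb_per_tb * 1024)
    return ((max(needed_mb, minimum_mb) + 255) // 256) * 256

for size_tb in (0.5, 1.0, 4.9, 5.4):
    print(f"{size_tb:>4} TB backup set -> -Xmx{suggested_heap_mb(size_tb)}m")

# Example output:
#  0.5 TB backup set -> -Xmx1024m
#  1.0 TB backup set -> -Xmx1024m
#  4.9 TB backup set -> -Xmx5120m
#  5.4 TB backup set -> -Xmx5632m
```

By that estimate a 4.9TB set wants roughly -Xmx5120m, which lines up with the 5GB Paul1977 mentioned. How you actually apply the value depends on platform; on the old Home client it was reportedly done through the hidden command-line area, or by editing the service's -Xmx setting on Linux, so check the current CrashPlan documentation rather than relying on those specifics here.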

Paul1977
2121 posts

Uber Geek
+1 received by user: 605


  Reply # 1855202 30-Aug-2017 11:38

RmACK: The 1GB/TB is only a rule of thumb I saw suggested by CP support. The key is that you allow more than what is needed; I know from experience that hitting the limit causes it to crash. The default 1GB is clearly too little for your backup set. IIRC the RAM usage varies with dedupe settings and what data it is backing up at the time, so YMMV.

 

Yeah, I'm going to leave it at 5GB as there is no benefit in lowering it since it only uses what it needs anyway.

 

I'd obviously read the same support article as you when I originally set it at 5GB.


SirHumphreyAppleby
392 posts

Ultimate Geek
+1 received by user: 130


  Reply # 1855229 30-Aug-2017 12:21

Paul1977:

 

jpwise:

 

Do you have to change clients when you move to the Pro plan? Or does the existing client continue?

 

 

It automatically updates the client for you the next time you launch it.

 

 

Upgrades are fully automatic. If you have remote systems, they will be updated without any user interaction, which is nice. The client is essentially (if not exactly) the same, just rebranded for CrashPlan Pro.

 

There is a CrashPlan native client, which I thought Pro was using... turns out that's something different again. CrashPlan have been promising a native client for CrashPlan Home for more than two years, and still nothing. Hopefully we will _finally_ see it, and the excessive waste of memory and annoying Java configuration will be a thing of the past.


13835 posts

Uber Geek
+1 received by user: 2450

Trusted
Subscriber

  Reply # 1855231 30-Aug-2017 12:22

Memory usage is for deduplication block hashes. A native client might be able to use memory a little more efficiently, and could be a touch faster, but I doubt it will be night and day. Java is pretty well optimised these days.
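As a rough back-of-the-envelope for why that block-hash index is memory-hungry, assume the client keeps one in-memory entry per deduplication block. The block size and per-entry overhead below are illustrative guesses, not CrashPlan's actual internals:

```python
# Back-of-the-envelope estimate of the RAM a deduplication block-hash index
# might need. Block size and per-entry cost are assumed values for
# illustration only, not CrashPlan's real internals.

def dedup_index_mb(backup_tb: float, block_kb: int = 64,
                   bytes_per_entry: int = 64) -> float:
    """Estimate index size in MB, assuming one in-memory entry per block."""
    blocks = (backup_tb * 1024**4) / (block_kb * 1024)
    return blocks * bytes_per_entry / 1024**2

for size_tb in (1.0, 4.9):
    print(f"{size_tb} TB at 64 KB blocks -> ~{dedup_index_mb(size_tb):,.0f} MB of index")

# Example output:
# 1.0 TB at 64 KB blocks -> ~1,024 MB of index
# 4.9 TB at 64 KB blocks -> ~5,018 MB of index
```

With those assumed numbers the index grows at roughly 1GB per TB of data, the same ballpark as the heap rule of thumb discussed above; smaller dedupe blocks would mean more entries and more memory, which may be part of why minimum dedupe settings also ease memory pressure.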





AWS Certified Solutions Architect Professional, SysOps Administrator Associate, and Developer Associate
TOGAF certified enterprise architect
Professional photographer


UncleArthur
184 posts

Master Geek
+1 received by user: 63


  Reply # 1855408 30-Aug-2017 17:06

Hi, so here's my story.

 

 

 

I renewed my crashplan home annual subscription about a month ago.

 

I've now upgraded to crashplan pro (I had 5.4TB of backups online and the migration didn't delete any of it).

 

So, the next 11 months are covered by my annual subscription renewal, then the next 12 months will be $2.50/month, and after that I'll worry about what to do.

 

 

 

I back up 1 server, and I have 5 workstations using Veeam endpoint protection, which back up to the server. Then CrashPlan uploads that (I have the "do not back up open files" option ticked, and that seems to avoid issues).

 

 

 

I am using MyRepublic Gigabit UFB.

 

Backups typically 120Mb/s

 

Restores typically 300Mb/s

 

 

 

 


32 posts

Geek
+1 received by user: 11


  Reply # 1855412 30-Aug-2017 17:21
One person supports this post

I was a CrashPlan customer. For the past 18 months or so I've used Arq to back up both to a local hard drive and to Amazon Web Services. The storage is very, very cheap. In addition, I've found the 50GB free from Mega useful for backing up and syncing one particular set of data between two machines.

 

It has all run very well.

 

 

 

 

 

 


SirHumphreyAppleby
392 posts

Ultimate Geek
+1 received by user: 130


  Reply # 1855418 30-Aug-2017 17:29

UncleArthur:

 

 

 

I back up 1 server, and I have 5 workstations using Veeam endpoint protection, which back up to the server. Then CrashPlan uploads that (I have the "do not back up open files" option ticked, and that seems to avoid issues).

 

 

I'd be very cautious using such an approach, as CrashPlan doesn't upload a snapshot of files, and it's unlikely the archive software would keep files open until the backup was complete (assuming you consider that good software design, and trust file locking over the network, which I don't). If CrashPlan is backing up while you are updating your backups from client to server, you could end up with corrupt archives. At a minimum, I'd set CrashPlan to only scan for files once a day, at a time when you're not going to be updating the backups.
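One generic way to check for the kind of corruption described above is to keep a checksum manifest of the archive files while they're known-good, and verify a restored copy against it. This is just a sketch of that idea, not a CrashPlan or Veeam feature; the .vbk/.vib extensions and the command-line usage are assumptions about a typical Veeam job folder.

```python
# Sketch: record SHA-256 checksums of backup archive files while they are
# known-good, then verify a restored copy against that manifest. Generic
# approach, not tied to CrashPlan's or Veeam's own tooling.

import hashlib
import json
import sys
from pathlib import Path

ARCHIVE_SUFFIXES = {".vbk", ".vib"}   # assumed Veeam full/incremental extensions

def sha256_of(path: Path, chunk: int = 1 << 20) -> str:
    h = hashlib.sha256()
    with path.open("rb") as f:
        while data := f.read(chunk):
            h.update(data)
    return h.hexdigest()

def build_manifest(folder: Path) -> dict:
    return {p.name: sha256_of(p)
            for p in sorted(folder.iterdir())
            if p.suffix.lower() in ARCHIVE_SUFFIXES}

if __name__ == "__main__":
    mode, folder = sys.argv[1], Path(sys.argv[2])     # mode: "record" or "verify"
    manifest_file = folder / "manifest.json"
    if mode == "record":
        manifest_file.write_text(json.dumps(build_manifest(folder), indent=2))
    elif mode == "verify":
        expected = json.loads(manifest_file.read_text())
        actual = build_manifest(folder)
        for name, digest in expected.items():
            status = "OK" if actual.get(name) == digest else "MISMATCH/MISSING"
            print(f"{status:16} {name}")
```

Run it with something like "python check_archives.py record D:\VeeamBackups" after a job finishes cleanly, then "python check_archives.py verify <restore folder>" after pulling files back down (script name and paths are hypothetical). A mismatch doesn't prove CrashPlan corrupted anything, but it tells you whether the restored archives match what was on disk when the manifest was taken.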


UncleArthur
184 posts

Master Geek
+1 received by user: 63


  Reply # 1855556 30-Aug-2017 23:20
One person supports this post

SirHumphreyAppleby:

 

 

 

.... At a minimum, I'd set CrashPlan to only scan for files once a day, at a time you're not going to be updating the backups.

 

 

 

 

Thanks, that is what I have.

 

Of note, I have had no issues with several restore scenarios, including a complete RAID array failure and recovering everything from CrashPlan (which took about 2 days to restore 5.4 TB, btw).
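As a quick sanity check on those numbers, 5.4TB over roughly two days implies a sustained rate in the same region as the ~300Mb/s restore speed mentioned earlier in the thread (the exact figure depends on elapsed hours and whether you count TB as decimal or binary):

```python
# What sustained rate does "5.4 TB restored in about 2 days" imply?
restored_tb = 5.4
days = 2

bits = restored_tb * 1e12 * 8              # decimal terabytes -> bits
seconds = days * 24 * 3600
print(f"~{bits / seconds / 1e6:.0f} Mb/s sustained")   # ~250 Mb/s
```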

 

I've also recovered Veeam backups from CrashPlan without issue.


mgeek
46 posts

Geek
+1 received by user: 8


  Reply # 1855871 31-Aug-2017 11:53

UncleArthur:

 

 I've now upgraded to crashplan pro  

 

 

Hmm, that's a point - is there any reason NOT to upgrade to Pro now rather than later?

 

Or to put it another way, is there any reason to stay with Home while it still exists (apart from computer-to-computer backups, which I don't use)?

 

 

 

UncleArthur:

 

So, the next 11 months

 

 

Actually I think you have 13 months - they've extended existing subs by 60 days.


UncleArthur
184 posts

Master Geek
+1 received by user: 63


  Reply # 1856052 31-Aug-2017 17:09

mgeek:

 

 

 

Actually I think you have 13 months - they've extended existing subs by 60 days.

 

 

 

 

Good point.




