Geekzone: technology news, blogs, forums


timmmay

20581 posts

Uber Geek

Trusted
Lifetime subscriber

#138285 28-Dec-2013 14:46
Send private message

I've noticed that my larger backup disks that aren't used very often are starting to fail, often just with a single sector being reallocated, then another sector later. I've read this will happen more often as drives get bigger: single-bit or single-sector errors that silently corrupt files. I assume there's some error checking and correction built in at the hardware level, but can a file system be designed to combat this?

For example, you could treat a 2TB hard drive as a 1TB drive, with every byte written twice to different areas of the disk. This would give you better reliability, and disks are cheap enough that the lost space probably isn't important for most consumers.

Has anything like this been done in a way that's usable for windows machines?
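In the absence of a file system that does this, the idea can be approximated in user space. Here is a minimal Python sketch (the function names and two-directory layout are my own invention, purely for illustration): write every file to two directories on the same disk, record a SHA-256, and later repair whichever copy fails verification from the one that still matches.

```python
import hashlib
import shutil
from pathlib import Path

def sha256(p: Path) -> str:
    return hashlib.sha256(p.read_bytes()).hexdigest()

def write_twice(data: bytes, name: str, root_a: Path, root_b: Path) -> str:
    # Write the same payload to two separate directories (ideally far
    # apart on the disk) and return its SHA-256 for later verification.
    for root in (root_a, root_b):
        root.mkdir(parents=True, exist_ok=True)
        (root / name).write_bytes(data)
    return hashlib.sha256(data).hexdigest()

def read_with_repair(name: str, expected: str, root_a: Path, root_b: Path) -> bytes:
    # Return whichever copy still matches the recorded hash, and
    # overwrite the damaged copy with the good one.
    a, b = root_a / name, root_b / name
    for good, bad in ((a, b), (b, a)):
        if sha256(good) == expected:
            if sha256(bad) != expected:
                shutil.copyfile(good, bad)  # heal the corrupt copy
            return good.read_bytes()
    raise IOError("both copies of %s are corrupt" % name)
```

This halves usable capacity and doubles write wear, exactly the trade-off discussed below, but it does survive a single-copy corruption.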

insane
3240 posts

Uber Geek

ID Verified
Trusted

  #958331 28-Dec-2013 15:16
Send private message

That's where SAS disks come into play: they're more reliable overall, and I believe they have more spare sectors too.



timmmay

20581 posts

Uber Geek

Trusted
Lifetime subscriber

  #958337 28-Dec-2013 15:30
Send private message

Spare sectors aren't quite what I mean. I mean keeping two copies of the data on different parts of the same disk. I assume not all errors are recoverable.

Fred99
13684 posts

Uber Geek


  #958347 28-Dec-2013 16:11
Send private message

timmmay: Spare sectors aren't quite what I mean. I mean keeping two copies of the data on different parts of the same disk. I assume not all errors are recoverable.


I get what you mean, but it's not a good idea IMO. It might marginally reduce the chance of permanently losing data after a head crash etc., but because everything is written twice there's more wear and tear on the mechanical components, and it gives you no advantage in case of electronic/board failure. I expect that if it could be achieved, it would also be very slow.



Hammerer
2476 posts

Uber Geek

Lifetime subscriber

  #958349 28-Dec-2013 16:17
Send private message

I keep multiple copies by syncing folders or entire partitions. There's a lot of free software to do this, e.g. FreeFileSync or rsync. If data does become corrupted you don't want the corruption copied to the other partition, so don't trigger the sync for a file based on a comparison of file contents alone.

There are file systems that provide redundancy, and there were some solutions under development for Windows, but I'm not sure if they are available yet.

The Wikipedia article on data corruption might point you in the right direction. ZFS, for example, performs continuous integrity checks and can correct most data errors and I use it on my NAS.

[Edit to correct spelling]
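The point about not propagating corruption can be approximated without ZFS by keeping a hash manifest alongside the data. A hedged Python sketch (the function names are my own): a file whose modification time is unchanged but whose hash now differs was not edited by anyone, so flag it as suspected bit rot instead of letting the next sync overwrite the good copy.

```python
import hashlib
from pathlib import Path

def snapshot(root: Path) -> dict:
    # Record SHA-256 and mtime (ns) for every file under root.
    out = {}
    for p in sorted(root.rglob("*")):
        if p.is_file():
            out[str(p.relative_to(root))] = (
                hashlib.sha256(p.read_bytes()).hexdigest(),
                p.stat().st_mtime_ns,
            )
    return out

def suspected_rot(root: Path, old: dict) -> list:
    # A file whose mtime is unchanged but whose hash differs was not
    # edited deliberately: treat it as silent corruption and exclude it
    # from the next sync rather than copying it over the good version.
    bad = []
    for name, (digest, mtime) in old.items():
        p = root / name
        if p.is_file() and p.stat().st_mtime_ns == mtime:
            if hashlib.sha256(p.read_bytes()).hexdigest() != digest:
                bad.append(name)
    return sorted(bad)
```

Run `snapshot()` after each backup, store the result next to the data, and run `suspected_rot()` before the next sync.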

gundar
488 posts

Ultimate Geek

Trusted

  #958381 28-Dec-2013 18:10
Send private message

Hi,

I've seen this on large, infrequently used drives that have power saving enabled, meaning they are constantly being powered up and down because they are rarely accessed.

To avoid problems I've had in the past, I've disabled all forms of power saving and bought enterprise grade drives where possible.

I haven't heard of software RAID between partitions on the same device, and I doubt it would make disks more reliable, because you'd wear out the heads keeping the partitions in sync. I know it's not what you want to hear, but at minimum you should have two disks on separate power supplies, each with a copy of your stuff.

+1 for ZFS and its friend RAID-Z. I think Windows Server 2012 has a similar file/RAID system available once storage services are set up.

Good luck.

timmmay

20581 posts

Uber Geek

Trusted
Lifetime subscriber

  #958432 28-Dec-2013 21:02
Send private message

Thanks for the thoughts guys, it doesn't sound very practical. I have my backups on external drives in three separate locations, but I use mirroring a lot so I am vulnerable to corruption being copied. I'm wary of differential backup software; I wouldn't want a single bit error in a 50GB backup file to corrupt everything, but I guess you can't protect against everything.

Maybe I'll just use Amazon Glacier and let them worry about it. The problem is the 2TB+ of images; it gets expensive.

gundar
488 posts

Ultimate Geek

Trusted

  #958435 28-Dec-2013 21:06
Send private message

What you need is a mate across town who has VDSL or FTTH, and to leave a spare encrypted disk at his place. It would be free if you offered to return the favour.

 
 
 

kyhwana2
2566 posts

Uber Geek


  #958491 29-Dec-2013 00:19
Send private message

btrfs can do this. (RAID1 on a single disk/block cloning)

https://en.wikipedia.org/wiki/Btrfs#Cloning

jpoc
1043 posts

Uber Geek


  #958505 29-Dec-2013 05:50
Send private message

Both ZFS and btrfs can do what you want on a single drive. (With ZFS you would have to split the drive into partitions; btrfs can do it on a drive with one partition.)

Personally, I would pick zfs - I use it myself.

Neither of those will work on windows.

For a pure windows solution, for backups that need to be protected against single sector disk failures, I would pick the following:

Use WinRAR to create a rar archive of the part of your Windows file system that you wish to back up. Rather than have this as one single large rar file, select the option to split it into (say) 100 parts, and then write a further (say) five recovery volumes.

If you want to get back just one file, open the first rar file with WinRAR, navigate the directories until you find the file, and extract it.

If you get sector-level corruption on the drive then, as long as you can read any 100 of the 105 files (the 100 rar parts plus the five recovery volumes), WinRAR can restore your full backup. You can tolerate as many lost files as you have recovery volumes.

In a way this also gets you one of the big advantages that ZFS has over something like RAID: protection against silent corruption*. You get the same from WinRAR used this way because it checksums all of its data.

I have used exactly this technique on windows systems in the past and have been able to recover from cases where I had lost the maximum allowable number of rar or recovery files. It works very well indeed.

*If you do not know about this, google for descriptions of the raid 5 write hole.
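The recovery-volume idea can be shown in miniature with plain XOR parity, the same trick RAID 5 uses. (Real WinRAR recovery volumes use a Reed-Solomon style code so that N recovery volumes can repair any N lost parts; a single XOR block repairing a single lost part is a simplified sketch of the principle, assuming equal-length parts.)

```python
from functools import reduce

def xor_blocks(blocks):
    # Byte-wise XOR of equal-length byte strings.
    return reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), blocks)

def make_recovery(parts):
    # One "recovery volume": the XOR of every part.
    return xor_blocks(parts)

def rebuild(parts, recovery):
    # With exactly one part missing (None), XOR the survivors together
    # with the recovery block to reconstruct it.
    i = parts.index(None)
    survivors = [p for p in parts if p is not None]
    repaired = list(parts)
    repaired[i] = xor_blocks(survivors + [recovery])
    return repaired
```

Losing any one of the 100 parts (or the recovery block itself) is survivable; losing two is not, which is why you write several recovery volumes.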

timmmay

20581 posts

Uber Geek

Trusted
Lifetime subscriber

  #958520 29-Dec-2013 09:03
Send private message

Interesting about the file systems, I'll do some reading. I'm not sure how practical those file systems would be given these are for external backup disks that are connected weekly to quarterly.

gundar: What you need is a mate across town who has VDSL or FTTH, and to leave a spare encrypted disk at his place. It would be free if you offered to return the favour.


This is for backup disks that are connected anywhere from weekly to quarterly. Online backup isn't practical for the time frames or data volumes in question; I often add 40GB in a single backup after photographing an event. That's half a month's download allowance.

jpoc
1043 posts

Uber Geek


  #958527 29-Dec-2013 09:22
Send private message

timmmay: Interesting about the file systems, I'll do some reading. I'm not sure how practical those file systems would be given these are for external backup disks that are connected weekly to quarterly.

gundar: What you need is a mate across town who has VDSL or FTTH, and to leave a spare encrypted disk at his place. It would be free if you offered to return the favour.


This is for backup disks that are connected anywhere from weekly to quarterly. Online backup isn't practical for the time frames or data volumes in question; I often add 40GB in a single backup after photographing an event. That's half a month's download allowance.


You could consider making a USB boot stick with Linux on it; come backup time, boot the PC off the stick, mount an external drive you have formatted with btrfs, and copy the data onto it. Linux/btrfs is a better choice than FreeBSD/ZFS for that, as Linux works better with your Windows disks. With FreeBSD/ZFS you are better off running it on a standalone machine and mounting it over the network.

PANiCnz
990 posts

Ultimate Geek


  #958537 29-Dec-2013 10:14
Send private message

You can run ZFS on Linux ;)

jpoc
1043 posts

Uber Geek


  #958543 29-Dec-2013 10:37
Send private message

PANiCnz: You can run ZFS on Linux ;)


Is it still run as a user mode file system? Are there drawbacks with that?


freitasm
BDFL - Memuneh
79294 posts

Uber Geek

Administrator
ID Verified
Trusted
Geekzone
Lifetime subscriber

  #958545 29-Dec-2013 10:43
Send private message

timmmay: Thanks for the thoughts guys, it doesn't sound very practical. I have my backups on external drives in three separate locations, but I use mirroring a lot so I am vulnerable to corruption being copied. I'm wary of differential backup software; I wouldn't want a single bit error in a 50GB backup file to corrupt everything, but I guess you can't protect against everything.

Maybe I'll just use Amazon Glacier and let them worry about it. The problem is the 2TB+ of images; it gets expensive.


How much space are you using for backup?








timmmay

20581 posts

Uber Geek

Trusted
Lifetime subscriber

  #958588 29-Dec-2013 12:55
Send private message

Booting to Linux isn't very practical. I'd probably go full RAID before I did that. Thanks for the idea though.

freitasm:
timmmay: Thanks for the thoughts guys, it doesn't sound very practical. I have my backups on external drives in three separate locations, but I use mirroring a lot so I am vulnerable to corruption being copied. I'm wary of differential backup software; I wouldn't want a single bit error in a 50GB backup file to corrupt everything, but I guess you can't protect against everything.

Maybe I'll just use Amazon Glacier and let them worry about it. The problem is the 2TB+ of images; it gets expensive.


How much space are you using for backup?


Somewhere between 2TB and 4TB, but I don't back up everything to every location.





