Posts: 1,318
Threads: 353
Joined: Jan 2014
Reputation: 0
So I'm running the zero-out test, 7 passes; it says it will be done in 6 days...
If the drive doesn't fail after 6 days of continuous writing, then I think it will be ok.
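For anyone curious what one pass of a "zero out" test actually does, here's a minimal Python sketch of a write-then-verify pass. It's a file-based stand-in only (the function name and structure are mine): real runs target the raw device, e.g. /dev/diskX via dd or Disk Utility, and a failing block would normally be remapped silently by the drive firmware rather than show up here.

```python
import os
import tempfile

def zero_fill_pass(path, size_bytes, block_size=1 << 20):
    """One pass of a zero-out test: write zeros, then read back and verify.

    Sketch only -- a real test writes to the raw device, not a file.
    Returns the number of chunks that failed verification.
    """
    zeros = bytes(block_size)
    written = 0
    with open(path, "wb") as f:            # write pass
        while written < size_bytes:
            n = min(block_size, size_bytes - written)
            f.write(zeros[:n])
            written += n
    bad_chunks = 0
    with open(path, "rb") as f:            # verify pass
        while True:
            chunk = f.read(block_size)
            if not chunk:
                break
            if chunk.count(0) != len(chunk):
                bad_chunks += 1
    return bad_chunks

# Seven passes over a 4 MiB scratch file, mirroring the 7x test above.
with tempfile.TemporaryDirectory() as tmp:
    scratch = os.path.join(tmp, "scratch.bin")
    results = [zero_fill_pass(scratch, 4 << 20) for _ in range(7)]
print(results)  # a healthy target reports 0 bad chunks on every pass
```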
Posts: 2,569
Threads: 443
Joined: Oct 2021
My question would be:
Is this really SOP anywhere? Couldn't this actually reduce the life of the drive? I'd love to see a link to info on if/when this practice is used in "enterprise-level"* situations.
I'm not doubting any of your knowledge, just wondering about the background. I've never heard of doing this to a new drive before...
Thanks,
John
*I hate jargon like that, but didn't know how else to say it!
Posts: 52,256
Threads: 2,802
Joined: May 2025
Reputation: 2
If the drive doesn't fail after 6 days of continuous writing then I think it will be ok.
I don't think that statement has any statistical merit. MTBF figures for hard drives are based on much longer run times.
All the test shows is that *if* the drive was going to fail early on, it *might* have failed during the run rather than later. And that doesn't mean much.
Zeroing a new drive *does* have merit by virtue of mapping out any bad blocks that might exist. But doing so for six days really doesn't improve upon that, nor does one drive failing or not failing have any statistical value over that time frame.
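To illustrate why one full write pass is enough to map out bad blocks, here's a toy model. The class and attribute names are invented for illustration; real remapping happens inside the drive firmware, transparently to the host, and surfaces only as SMART attribute 5 (Reallocated Sector Count).

```python
class ToyDrive:
    """Toy model of firmware bad-sector remapping. Illustrative only."""

    def __init__(self, sectors, bad, spares=8):
        self.data = [None] * sectors
        self.bad = set(bad)            # physically unreliable sectors
        self.remap = {}                # logical sector -> spare slot
        self.spare = [None] * spares   # reserved spare-sector pool
        self.free_spares = list(range(spares))

    def write(self, lba, value):
        if lba in self.bad and lba not in self.remap:
            # A write to a bad sector fails; firmware redirects the
            # sector to a spare from the reserved pool, transparently.
            self.remap[lba] = self.free_spares.pop()
        if lba in self.remap:
            self.spare[self.remap[lba]] = value
        else:
            self.data[lba] = value

    def read(self, lba):
        if lba in self.remap:
            return self.spare[self.remap[lba]]
        return self.data[lba]

# Zeroing every sector touches each one once, so bad sectors get
# mapped out before any user data is ever at risk.
drive = ToyDrive(sectors=64, bad={3, 17})
for lba in range(64):
    drive.write(lba, 0)
assert all(drive.read(lba) == 0 for lba in range(64))
assert len(drive.remap) == 2   # both bad sectors were remapped
```

The point of the model: the benefit comes from touching every block once, not from repeating the pass seven times or running it for six days.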
It might run flawlessly for that time, then fail on the next startup or restart.
Posts: 2,569
Threads: 443
Joined: Oct 2021
RAMd®d wrote: It might run flawlessly for that time, then fail on the next startup or restart.
That was my gut reaction as well, and I was just curious if there was any widespread use of these practices that I wasn't aware of...
I guess it boils down to this question: what's the simplest, most efficient way to map out bad blocks, both on a new drive, and over time? DiskWarrior should do this, right?
Posts: 420
Threads: 30
Joined: Jan 2020
Reputation: 0
I think this makes some sense: if I remember correctly, I saw statistics (maybe on BareFeats, or a hard-drive-specific site) showing that a new drive is most likely to fail very early in its life (the "infant mortality" end of the bathtub curve). Presumably running it through a few months' worth of normal use in a few days before putting data on it gets it over that hurdle. It probably isn't an enterprise practice, though, because it's cheaper (in the long run), easier, and more reliable to buy drives in pairs.
cheers
scott
Posts: 1,318
Threads: 353
Joined: Jan 2014
Reputation: 0
I think I'm on day 4, and the drive has gone non-responsive. I'm going to go home during my lunch break, unplug it, and plug it back in. Maybe this test isn't pointless after all: if I had written a bunch of data to it and then had it go unresponsive, I'd be pretty upset right now.