
Does writing zeros to a hard disk completely wipe out data?

xMax

Senior member
Because I've downloaded the Blancco file shredder, and even after shredding my hard disk I was still able to recover empty folder and file names from the disk. It's almost like nobody wants you to be able to completely delete all the information off your hard disk.

So I figure writing zeros to the entire drive should completely wipe out the data beyond any conceivable means of recovery. I mean, how would the information be recovered if zeros have been written over every bit that stores it?
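For what it's worth, a single zero-fill pass is conceptually trivial. Here's a minimal sketch in Python of what such a pass amounts to; the disk.img target is a stand-in file invented for illustration, since pointing this at a real raw device needs administrator rights and destroys everything on it:

import os

CHUNK = 1024 * 1024  # write zeros in 1 MiB chunks
zeros = b"\x00" * CHUNK

# "disk.img" is a hypothetical stand-in; aiming this at a real raw device
# (e.g. \\.\PhysicalDrive1 on Windows) wipes everything on it.
with open("disk.img", "r+b") as disk:
    size = os.fstat(disk.fileno()).st_size
    written = 0
    while written < size:
        written += disk.write(zeros[: min(CHUNK, size - written)])
    disk.flush()
    os.fsync(disk.fileno())  # flush OS caches so the zeros actually land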

 
Originally posted by: meatfestival
Wouldn't overwriting the hard disk with random data be better than zeros?

Once you write 0's 7 times, the polarity of the platters makes it almost impossible to recover ANY data on the disk down to the bit level. Random writes could still leave clusters that haven't been written enough times to be totally clean.

It's kind of the same logic as a paper shredder: you don't want to leave any pieces big enough to read...

 
Originally posted by: meatfestival
Wouldn't overwriting the hard disk with random data be better than zeros?
Yeah, several times. That's what DBAN does. Another option (for WinXP Pro and any Win2000 variant) is the cipher command, which overwrites any blank space with three layers of Encrypting File System-encrypted junk data, deleting it after each round. But I'd go with DBAN on one of its higher settings, if I were looking to blank a disk before letting a company PC go out the door or something.

To use cipher, let's say you deleted all partitions on a drive and then created a single blank partition that's got drive letter F:. Start > Run > cmd and use this command line:

cipher /w:F:\

and off she goes. Incidentally, you can run cipher /w on your C: drive while using your computer; it doesn't do any harm to existing data.
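If you're curious what cipher /w actually does under the hood, Microsoft's docs describe it as three passes over the free space: zeros, then ones (0xFF), then random data. Here's a rough Python sketch of that pattern; the junk.bin target and the three_pass_wipe helper are made up for illustration, standing in for the free space cipher really fills:

import os

CHUNK = 1024 * 1024
PASSES = [b"\x00", b"\xff", None]  # None = a pass of random bytes

def three_pass_wipe(path):
    # Hypothetical helper: overwrites an existing file in place,
    # zeros first, then ones, then random data.
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for pattern in PASSES:
            f.seek(0)
            left = size
            while left > 0:
                n = min(CHUNK, left)
                f.write(os.urandom(n) if pattern is None else pattern * n)
                left -= n
            f.flush()
            os.fsync(f.fileno())  # force each pass to disk before the next

three_pass_wipe("junk.bin")  # stand-in target file, not real free space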
 
Originally posted by: thorny169
Once you write 0's 7 times, the polarity of the platters makes it almost impossible to recover ANY data on the disk down to the bit level. Random writes could still leave clusters that haven't been written enough times to be totally clean.

It's kind of the same logic as a paper shredder: you don't want to leave any pieces big enough to read...

You're wrong.
 
Originally posted by: n0cmonkey
You're wrong.


Explain it then
 
Originally posted by: thorny169
Explain it then

Read up on it.
 
Originally posted by: n0cmonkey
Read up on it.

It's been several years since school, but in class we were told the DoD standard (which we were left to assume was the best) was 7 writes, not random. What about the law of probability? If you're dealing with sensitive data, do you leave anything to chance?

I'm really not interested in reading up on it; I'm recalling something I was told years ago from memory. If I'm wrong, I'm sorry for spreading misinformation, but please at least explain it instead of just saying I'm wrong.
 
Originally posted by: n0cmonkey
Read up on it.


Why do you think people come here, dipsh!t.
 
Originally posted by: thorny169
I'm really not interested in reading up on it; I'm recalling something I was told years ago from memory. If I'm wrong, I'm sorry for spreading misinformation, but please at least explain it instead of just saying I'm wrong.

Multiple overwrites are good, and random overwrites help. The only really reliable method is destroying the drives.

To quote from a paper on USENIX (Gutmann's "Secure Deletion of Data from Magnetic and Solid-State Memory"):
In conventional terms, when a one is written to disk the media records a one, and when a zero is written the media records a zero. However the actual effect is closer to obtaining a 0.95 when a zero is overwritten with a one, and a 1.05 when a one is overwritten with a one. Normal disk circuitry is set up so that both these values are read as ones, but using specialised circuitry it is possible to work out what previous "layers" contained. The recovery of at least one or two layers of overwritten data isn't too hard to perform by reading the signal from the analog head electronics with a high-quality digital sampling oscilloscope, downloading the sampled waveform to a PC, and analysing it in software to recover the previously recorded signal. What the software does is generate an "ideal" read signal and subtract it from what was actually read, leaving as the difference the remnant of the previous signal. Since the analog circuitry in a commercial hard drive is nowhere near the quality of the circuitry in the oscilloscope used to sample the signal, the ability exists to recover a lot of extra information which isn't exploited by the hard drive electronics (although with newer channel coding techniques such as PRML (explained further on) which require extensive amounts of signal processing, the use of simple tools such as an oscilloscope to directly recover the data is no longer possible).

So, with multiple overwrites of random data (0s and 1s, obviously), it'd be harder to recover the original. If you overwrite it with just 0s, it'd be easier to find the original.

9. Conclusion
Data overwritten once or twice may be recovered by subtracting what is expected to be read from a storage location from what is actually read. Data which is overwritten an arbitrarily large number of times can still be recovered provided that the new data isn't written to the same location as the original data (for magnetic media), or that the recovery attempt is carried out fairly soon after the new data was written (for RAM). For this reason it is effectively impossible to sanitise storage locations by simple overwriting them, no matter how many overwrite passes are made or what data patterns are written. However by using the relatively simple methods presented in this paper the task of an attacker can be made significantly more difficult, if not prohibitively expensive.

IIRC the paper explains that the read/write arm in the hard drive won't land in the exact same place every time, which makes finding remnants of the original easier.

The DoD doesn't generally sit on its butt. I'm sure their standards have changed over the years. 😉
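To make the paper's subtraction idea concrete, here's a toy numeric model in Python. The 0.05 remnant amplitude and the bit patterns are invented for illustration, loosely mirroring the paper's 0.95/1.05 example; real analog margins are far messier:

# Toy model of remanence: the analog level read back is the new bit plus a
# faint remnant of the old bit (+0.05 if the old bit was 1, -0.05 if 0).
old = [1, 0, 1, 1, 0]
new = [0, 0, 1, 0, 1]
REMNANT = 0.05  # invented amplitude, purely for illustration

analog = [n + (REMNANT if o else -REMNANT) for n, o in zip(new, old)]

# The drive's own electronics just threshold the signal and see the new data.
assert [1 if a > 0.5 else 0 for a in analog] == new

# An attacker with finer sampling subtracts the "ideal" signal for the new
# data and reads the previous layer off the sign of what's left over.
recovered = [1 if (a - n) > 0 else 0 for a, n in zip(analog, new)]
assert recovered == old
print("previous layer:", recovered)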
 
Originally posted by: aeternitas
Why do you think people come here, dipsh!t.

To either ask questions that aren't answered anywhere else, or to be lazy jackholes and ask the same questions a million other people asked. Crawl back in your hole, newbie.
 
Originally posted by: n0cmonkey
The DoD doesn't generally sit on its butt. I'm sure their standards have changed over the years. 😉

You sir, have enlightened me, and I stand corrected. Not sure how accurate your comment about the DoD sitting on its butt is, though 😉
 
Originally posted by: thorny169
You sir, have enlightened me, and I stand corrected. Not sure how accurate your comment about the DoD sitting on its butt is, though 😉

The government spends money.

Overwriting hard drives better costs more money.

The government overwrites their disks better.

Hell, they probably just incinerate the things (Nuke 'em from orbit, it's the only way to be sure). 😉
 
Hmm, how often do these threads come up, every three weeks? Hahaha. I agree that random writes are better than zeros, as long as you do a lot of them, obviously.
 
Originally posted by: aeternitas
Originally posted by: n0cmonkey
Read up on it.

Why do you think people come here, dipsh!t.

HAH, 0wn3d! Sorry N0c, but I'm going to have to disagree with your stance on this. I would rather discuss something on AT than go rummaging through the net if I could.
 
Originally posted by: n0cmonkey
To quote from a paper on usenix:
In conventional terms, when a one is written to disk the media records a one, and when a zero is written the media records a zero. However the actual effect is closer to obtaining a 0.95 when a zero is overwritten with a one, and a 1.05 when a one is overwritten with a one.(...)Data overwritten once or twice may be recovered by subtracting what is expected to be read from a storage location from what is actually read.

But does that paper give some real-life examples? The theory looks nice and sound, but how about a practical experiment?

And the latter part of what I quoted above "subtracting what is expected to be read" is somewhat ambiguous, is it not? (you've read 1.05, but expected 1.00, so you conclude the previous layer was a 1?)

I've been dimly aware of this type of recovery since the mid-eighties, but after a brief e-mail exchange a few years back with a researcher working for IBAS, I got the distinct impression that such recovery is highly theoretical.

The NSA may (or may not) have technology to do this, but as with all things: nobody knows what the NSA is capable of -- we can only guess. The NSA certainly isn't going to dissect xMax's hard drive to track down the deleted porn pictures he is hiding from his fiancée! 😀

And all this assumes the file can be located... Although the file may not have been overwritten much, you still need to look at the file allocation table (or similar) and hope it hasn't been overwritten (too much), plus a fragmented file will present further difficulties. (I suspect NTFS will provide a steeper challenge than FAT -- I'm not too familiar with NTFS in this regard though)
 
So it's basically a ridiculously difficult process to try to permanently erase data off your disk. But I guess the bottom line of what you're all saying is that to recover data that has been overwritten 7 times with random overwrites, one would need hardware tools, and couldn't simply hack into your computer over the internet and recover the data using only software tools. Because if that's so, then it's not a problem for the average user who isn't worried about the government or some major organization seizing his computer and using specialized equipment to recover the data.



 