
A few suggestions


Saintj


Lol, well, this is turning more into a Gutmann/wipe type discussion than suggestions :P The suggestions are still valid though lol.

 

On the other hand, one last thing about the whole Gutmann thing: the 35-pass wipe method it uses has about 5-7 passes that are pretty much useless nowadays, right? But it still does what it says, right? It overwrites the data a couple of times with different data, making it unrecoverable. Or is it better to switch to NSA? That was my question.
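For reference, the published Gutmann sequence is 4 random passes, then 27 fixed-pattern passes built around specific (long-obsolete) MFM/RLL encodings, then 4 more random passes. Here's a minimal Python sketch of that structure, purely as an illustration - the pattern bytes shown are just a few examples from the paper, not the full list of 27:

```python
# Sketch of the Gutmann pass order: 4 random passes, 27 fixed-pattern
# passes (built around long-obsolete MFM/RLL encodings), then 4 more
# random passes. Only a handful of the 27 documented patterns are shown.
FIXED_PATTERNS = [
    b"\x55", b"\xAA",                  # 01010101..., 10101010...
    b"\x92\x49\x24", b"\x49\x24\x92",  # repeating 100100... aimed at MFM/RLL(2,7)
    b"\x00", b"\x11", b"\x22",         # ...and so on, 27 patterns in total
]

def gutmann_passes():
    """Yield a (kind, pattern) tuple per pass; pattern None means random data."""
    for _ in range(4):
        yield ("random", None)         # fresh random data each pass
    for p in FIXED_PATTERNS:           # stand-in for the full 27 patterns
        yield ("pattern", p)
    for _ in range(4):
        yield ("random", None)

for i, (kind, pattern) in enumerate(gutmann_passes(), 1):
    print(i, kind, pattern)
```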

 

As another note, in the original Gutmann paper he says some of his passes alter the dB level, which on new drives most likely won't work anymore. That idea did work on older drives, though, altering the magnetic data a bit and thus making even forensic teams come up with junk. I don't think that works with new drives anymore.

 

But anyway, as I asked: is Gutmann still an unrecoverable wipe (just like the 1-pass), or does it make recovery easier because of the old methods it uses? I'm just curious about that :)


I strongly agree that the registry cleaner will indeed speed up your system, and indeed CCleaner's registry cleaner is far, far from perfect. Even though it is not designed to be aggressive, people have reported problems, and some would put the blame on Piriform for this. Many would even argue that the downsides of registry cleaners outweigh their benefits.

 

If another registry cleaner is doing better for you, who is stopping you from using it alongside CCleaner?

 

I have four different registry cleaners installed, but even so I had to choose manually which entries to fix when using the aggressive ones.

I love computer maintenance tasks.

Some of my favorite programs:

WordPad - basic word processing

Notepad - temporary clipboard and basic scripting module

Windows Media Player 12 - video, music and online radio player

Windows Media Center - live TV, local FM radio

CCleaner - handy computer maintenance tool

 

If something fails to work after using the registry cleaner, use SYSTEM RESTORE.


Unrecoverable. :)

 

Figured, just wanted to make sure :) The only other question I have about that is: how many passes are useless nowadays? I think about six (judging from the diagram of what the passes do, that is). And which of the passes that change the magnetisation of wiped parts are still intact? (That was the only "special" part of Gutmann: the fact that it changed the digital signal, MFM, RLL etc.) I ask this more out of curiosity :P since there is no written documentation available on which passes have been rendered useless.


  • Moderators

None of the Gutmann passes are relevant today, and haven't been for the past 15 years or so (well, perhaps one of the first and last four, which are random passes). The authority - the written documentation - for this is Peter Gutmann himself, as has already been mentioned in this thread: "A good scrubbing with random data will do about as well as can be expected".

 

The Gutmann method is irrelevant, as it attempts to eliminate the possible use of threshold detection to identify a value (where the strength of a retrieved signal is above or below a threshold), whereas all current disks use PRML coding, which relies on detecting sequences of data to establish a signal.

 

The process of writing and subsequent reading/decoding data on a hard drive is phenomenally complicated. No user data is ever stored on a disk, or ever has been. It is coded and expanded many times before being written, and subsequently needs the exact reverse process to be decoded. If I published a track's worth of data as it is stored on a disk, in zeroes and ones, deleted or otherwise, then few if any would be able to translate it into data.

 

To read data off a disk you, or mercifully the disk controller, would have to detect and process the waveform into a digital signal (a very complex filtering and sampling process), strip off the parity bits, detect the sector sync mark, decode the RLL sequence, descramble the data, correct any errors using the ECC algorithm, and unwrap the CRC code. And possibly quite a lot more. To do this you would need to be in possession beforehand of many parameters such as the RLL type, the pseudo-random scrambler sequence, and the type of correction codes used. You would need to know what disk make and model and what coding the manufacturer used. So it's no small task. And with Hyper-tuning, where some of these parameters are optimised iteratively when the disk is burned-in at the factory, you would never know what they are.
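To make that ordering concrete, here's a toy Python sketch - the scrambler and CRC here are invented stand-ins, nothing like real drive firmware - showing that decoding has to be the exact mirror of encoding, and that without the right scrambler seed you get junk:

```python
import zlib

def lfsr_stream(seed, n):
    """Toy pseudo-random scrambler sequence (a stand-in for the real one)."""
    state, out = seed, bytearray()
    for _ in range(n):
        bit = ((state >> 0) ^ (state >> 2) ^ (state >> 3) ^ (state >> 5)) & 1
        state = (state >> 1) | (bit << 15)
        out.append(state & 0xFF)
    return bytes(out)

def encode(data, seed=0xACE1):
    """Encode roughly as a drive might: wrap a CRC, then scramble."""
    framed = data + zlib.crc32(data).to_bytes(4, "big")
    key = lfsr_stream(seed, len(framed))
    return bytes(b ^ k for b, k in zip(framed, key))

def decode(raw, seed=0xACE1):
    """Exact reverse order: descramble first, then check and strip the CRC."""
    key = lfsr_stream(seed, len(raw))
    framed = bytes(b ^ k for b, k in zip(raw, key))
    data, crc = framed[:-4], framed[-4:]
    assert zlib.crc32(data).to_bytes(4, "big") == crc, "CRC mismatch"
    return data

raw = encode(b"user data never hits the platter verbatim")
assert decode(raw) == b"user data never hits the platter verbatim"
# decode(raw, seed=0xBEEF) would descramble to junk and fail the CRC check.
```

Real drives add RLL coding, ECC and the rest on top of this, which only makes the "raw bits off the platter" even less readable without the controller's parameters.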

 

So, one pass of random data will do. Using CC's one pass of zeroes is just as good, as it's scrambled on the disk anyway. So why does CC offer Gutmann (and other methods of overwriting)? I can only guess, as mentioned, that it's for marketing. It's certainly a waste of time and effort.
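And a one-pass overwrite really is as simple as it sounds. A minimal Python sketch (an illustration only, not how CCleaner actually does it; it deliberately ignores filesystem journals, SSD wear levelling, and other copies of the data elsewhere on the disk):

```python
import os

def one_pass_wipe(path, block=1024 * 1024):
    """Overwrite a file in place with one pass of random data, then delete it.
    Illustration only: journaling/copy-on-write filesystems and SSD wear
    levelling can keep copies of the data this never touches."""
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        remaining = size
        while remaining:
            n = min(block, remaining)
            f.write(os.urandom(n))   # random data; a pass of zeroes does as well
            remaining -= n
        f.flush()
        os.fsync(f.fileno())         # push the overwrite out to the device
    os.remove(path)
```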


 

I can only guess, as mentioned, that it's for marketing. It's certainly a waste of time and effort.

 

This, basically.

No fate but what we make


Here's an interesting article I read. It talks about the differences between Secure Erase (ATA-SE) and block-erasure methods (the typical pass-over wipes, I believe).

 

The author of the post says, "However: remember that data CAN (probably) be recovered from a drive wiped by ATA-SE. Granted, the level of expertise and equipment required is high and the time committed is huge - but it CAN (probably) be done." Now, from what I've read, isn't Secure Erase supposed to be infallible and work similarly to how TRIM does?
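For context, ATA Secure Erase isn't a pass-over wipe from the host at all: the host sends one command and the drive's own firmware does the overwriting. On Linux it's usually issued with hdparm; a rough sketch below (the device path and password are placeholders, frozen-drive checks are omitted, and pointing this at the wrong device will destroy the wrong disk):

```python
import subprocess

DEV = "/dev/sdX"   # placeholder - triple-check this is really the target drive
PW = "p"           # temporary ATA security password; cleared by the erase

# 1. A security password must be set before the drive will accept an erase.
subprocess.run(["hdparm", "--user-master", "u",
                "--security-set-pass", PW, DEV], check=True)

# 2. Issue the firmware-level erase. The drive itself overwrites every
#    user-addressable sector (--security-erase-enhanced also covers
#    reassigned/spare sectors on drives that support it).
subprocess.run(["hdparm", "--user-master", "u",
                "--security-erase", PW, DEV], check=True)
```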


None of the Gutmann passes are relevant today, and haven't been for the past 15 years or so... To read data off a disk you would have to decode the RLL sequence, descramble the data, correct any errors using the ECC algorithm, and unwrap the CRC code... So why does CC offer Gutmann (and other methods of overwriting)? I can only guess, as mentioned, that it's for marketing. It's certainly a waste of time and effort.

 

Got to make you wonder: with the knowledge we have nowadays, why isn't anyone busy creating a new method of wiping, one that IS current? As you state yourself, "they" need to unwrap the CRC code, so why not create a couple of passes scrambling that randomly? We know data could be recovered from any magnetic traces left behind, so why not create passes that backtrack those traces and rewrite the same file with different magnetic traces a couple of times? At the end of the day, the current "secure" wipe methods are outdated while some people still want to securely delete stuff. It should be possible; the methods just have to be updated. The Gutmann wipe obviously was a good wipe back in the day, so why not recreate something for the way data is stored, and magnetic traces are left, on current drives? Of course, the most secure way to delete something is a giant magnet/hammer/magnesium burn, but that's not what we want unless you want to securely dump your disk.

There has to be a way to create a roughly 25-pass wipe that does the trick: 5 passes filling it with different random data, 5 passes scrambling the RLL sequence and the ability to unwrap the CRC code, 5 passes changing the magnetic frequencies randomly so it's unrecoverable even with a microscope, 7 passes filling it with 1's, 2's, 3's, 4's, 5's, 6's and 7's, one more pass scrambling the code and the magnetic force, and a last pass filling it with 0's. This should be possible, and then you WILL have a seriously secure delete that IS 100% bulletproof. You could even add a pass encrypting the data with the latest/best encryption algorithms (random on each wipe, of course), or even a pass corrupting the data completely. It's just an idea, but that's most likely what a "Gutmann method" would look like in our current day and age.

I'd have to say the 0's challenge might be impossible for any ordinary recovery agency, buuut an intelligence agency will have a lot less trouble unscrambling/recovering said data (even though it costs a lot, their forensics people will be able to).

 

One thing I've got to wonder about, though: even though you say the random scrubbing passes are the only working passes in Gutmann, what about the MFM changes it creates? They should still be slightly accurate, shouldn't they?

 

And what about the NSA way or the DoD way? Why would the US, Canadian and other governments still bother using these wipe methods if they "don't work any better than a one-pass wipe"? Got to make you wonder.


Here's an interesting article I read. It talks about the differences between Secure Erase (ATA-SE) and block-erasure methods... Now, from what I've read, isn't Secure Erase supposed to be infallible and work similarly to how TRIM does?

 

Did you even read my reply? Data could be partially recovered, at best, from a true pass.

 

End of discussion; you lot can argue it till you're blue in the face.

No fate but what we make


  • Moderators

The article referred to also says that 'On modern, high-capacity drives, multiple overwrites are no more effective than a single overwrite' and that 'The single overwrite ATA-SE method ....... has replaced the old DoD 5220.22-M standard'.

 

I don't seem to be getting my point over very well. None of the Gutmann passes are relevant, as the disk data coding methods they are trying to obliterate haven't been used for 15 years. Of course any one Gutmann pass is as effective in overwriting data as any other pass, but because it's overwriting, not because of what it's overwriting with. There's no difference between overwriting with a passage from the Book of Revelations or a selection of juicy swear-words.

 

The reason Gutmann created his multiple passes is that with the old coding methods you could predict what bit pattern would be written to the disk, and overwrite accordingly. You can't predict the bit pattern with current PRML/EPRML coding.
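To see why that mattered, here's a toy MFM encoder using the textbook rule (a data 1 becomes channel bits 01; a data 0 becomes 10 after a 0, or 00 after a 1). Because the mapping is fixed, anyone could compute exactly what flux pattern a given byte produced and craft an overwrite pattern to target it - which is precisely the trick PRML's sequence detection took away:

```python
def mfm_encode(bits):
    """Deterministic MFM encoding: data 1 -> '01'; data 0 -> '00' after a 1,
    '10' after a 0. The same input always yields the same channel bits."""
    out, prev = [], 0
    for b in bits:
        if b:
            out.append("01")
        else:
            out.append("00" if prev else "10")
        prev = b
    return "".join(out)

# 0x92 0x49 0x24 is the repeating data pattern 100100100..., one of the
# bit patterns Gutmann's fixed passes were built around:
data = [int(c) for c in "100100100100"]
print(mfm_encode(data))   # always the same channel sequence -> predictable
```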

 

The rather complex description of the way data is encoded on disk was to show how difficult it would be to decode the raw signals of even non-overwritten data. The reason the disk controller does this is solely to ensure that data can be read from the disk with an error rate of around one bit in 1,000,000,000,000,000 (10 to the power of 15). It has nothing to do with data security.
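As a quick back-of-envelope check on what that rate means (the 1 TB drive size is just an example):

```python
# Expected residual bit errors from reading an entire 1 TB drive once,
# at the quoted post-correction rate of one bit error in 10**15.
bits_read = 1_000_000_000_000 * 8   # 1 TB in bits
expected_errors = bits_read * (1 / 10**15)
print(expected_errors)              # 0.008 -> about one error per 125 full reads
```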

 

Of course you can't scramble the RLL or any other coding sequence, that's how the data is coded to be physically represented on the disk.

 

I find your suggestions for secure deletion rather fantastic. I mean in the realm of fantasy, not super duper. Just run one overwrite and be sure that nobody can ever read that data again. The stuff you're trying to secure may well be found elsewhere on the disk, but that's another topic I don't want to get involved with.


I vote the heading of this thread be changed to 'The Gutmann method' :) and the thread moved to the Lounge.

I vote for the little + and - buttons to be fixed cos Augeas certainly deserves a couple of positive votes for his explanations in his last couple of posts.

