I've tried to look for it, after seeing other topics about this happening to others, but to no avail. I think it must be a problem with the free space wiping, since this has happened once before and I fixed it then. I need help finding that folder that's over 100 GB.
Not a folder but a file.
Wipe Free Space is what causes this. If you run WFS again (set it to 1 pass and don't interrupt it), this file will go away.
After that, untick Wipe Free Space.
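If you want to track down the oversized leftover file before (or after) doing that, something like the rough sketch below can list anything on the drive over a chosen size. It assumes Windows with Python available and an administrator prompt so protected folders are readable; the drive letter and the 50 GB threshold are just placeholders.

```
import os

ROOT = "C:\\"            # drive to scan (placeholder)
THRESHOLD_GB = 50        # report anything at least this big
threshold_bytes = THRESHOLD_GB * 1024 ** 3

for dirpath, dirnames, filenames in os.walk(ROOT):
    for name in filenames:
        path = os.path.join(dirpath, name)
        try:
            size = os.path.getsize(path)
        except OSError:
            continue  # locked or inaccessible file, skip it
        if size >= threshold_bytes:
            print(f"{size / 1024 ** 3:8.1f} GB  {path}")
```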
Hi there!
I have been having a similar (but not identical) problem which has reliably and reproducibly occurred with the secure overwrite function set to Gutmann. When writing over very large files (>5 GB) in my wastebasket, CCleaner finishes successfully, only to leave behind a file in the wastebasket which is not visible in the analysis or in Explorer but fills up all of my remaining hard drive space. CCleaner will attempt to overwrite this on the next pass, but it will not kill the phantom data-garbage file when set to single pass, and it will reproduce the file at the size of all available space when set to Gutmann. This might be because it fills up all the space in the wastebasket and Windows gives the wastebasket all the remaining space as a dumping ground.

I can fix the problem by defragging my drive, but: a) this happens repeatedly with large files, which means a lot of work (and computer time!) just to delete them properly; b) it makes me wonder about the large amount of data garbage that CCleaner must be producing; c) most importantly, all this overwriting/defragging/rewriting must be taking a toll on my drive and eating up its lifetime, especially if I have to do it repeatedly. Since getting a solid-state drive is not an option financially, I have to stick with a magnetic hard disk, and working it like this produces errors and data loss (which is a b***h). I'd like to know if there is a solution to this, whether Piriform knows about this problem and is working on a fix, or whether this is a limit of the overwrite method that I have to work around.
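For what it's worth, a rough way to check whether the phantom data really sits in the wastebasket is to total up what is physically stored in the hidden Recycle Bin folder. The sketch below assumes Windows/NTFS (where each drive keeps a hidden $Recycle.Bin folder) and Python run from an administrator prompt; the drive letter is a placeholder.

```
import os

RECYCLE_DIR = r"C:\$Recycle.Bin"   # drive letter is a placeholder

total = 0
for dirpath, dirnames, filenames in os.walk(RECYCLE_DIR):
    for name in filenames:
        try:
            total += os.path.getsize(os.path.join(dirpath, name))
        except OSError:
            pass  # skip entries we are not allowed to read

print(f"Bytes under {RECYCLE_DIR}: {total:,} ({total / 1024 ** 3:.1f} GB)")
```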
Whilst I try to advocate that anyone can do what they like to their own kit, I would suggest that you consider what you are doing. For a start, do you really want to securely delete very large files? What would be the risk if you just deleted them?
Secondly, the mechanics of a Gutmann overwrite are horrendous. I don't know whether CC bypasses the normal transaction logging used when updating data, but in any event a Gutmann wipe makes 35 passes, so you will be writing hundreds of gigabytes of data just to overwrite one 5 GB file. That is going to kill your hard drive, quite apart from your time.
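To put rough numbers on that, here is a quick back-of-the-envelope calculation. It assumes the commonly cited pass counts (1 for a simple overwrite, 3 for a DOD-style wipe, 7 passes, 35 for Gutmann); treat the labels as approximations rather than CCleaner's exact settings.

```
# Back-of-the-envelope write volume for securely overwriting a single file.
file_gb = 5  # size of the file being deleted

for label, passes in [("1 pass (simple)", 1),
                      ("3 passes (DOD-style)", 3),
                      ("7 passes", 7),
                      ("35 passes (Gutmann)", 35)]:
    print(f"{label:>22}: {passes * file_gb} GB written")
```

For a 5 GB file that works out to 175 GB of writes under Gutmann, against 5 GB for a single pass.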
I don't know what's causing your problem (or really understand it), but I would go for a simpler approach to deletion.
@Augeas
When I think about it, you're right. But should I be using the Gutmann setting at all, then? As you said, writing hundreds of GB (which does show up in the fragmentation of my drive) to securely overwrite a file is certainly going to kill my drive, and it's probably what produced my problem. On the other hand, I want to get rid of this file and not have it traceable. I guess I have to set priorities.
Thanks for the insight, I will be using the settings a little more cautiously from now on.
Greets from the Continent and the Krauts.
No Gutmann. One overwrite will do fine for secure deletion.
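For illustration, a single-pass secure delete is conceptually just one overwrite of the file's contents followed by a normal delete. The sketch below only shows that idea; it is not how CCleaner implements it, and the example path is a placeholder.

```
import os

def single_pass_delete(path, chunk_size=1024 * 1024):
    """Overwrite a file once with random data, flush it to disk, then delete it."""
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        remaining = size
        while remaining > 0:
            n = min(chunk_size, remaining)
            f.write(os.urandom(n))   # one pass of random data
            remaining -= n
        f.flush()
        os.fsync(f.fileno())         # make sure the overwrite reaches the disk
    os.remove(path)                  # then delete the file as normal

# single_pass_delete(r"C:\path\to\big_file.bin")  # placeholder path
```

Note this only overwrites the file's current clusters; it does not touch copies sitting in shadow copies, backups or filesystem metadata.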
Greetings from the Inselaffen.