I’m using DUP Finder to clean up two large 2TB flash drives. 1TB of duplicates is reported, but when I scroll down (to the larger dups), the app becomes unresponsive and the busy spinner is displayed. The flash drives are NVMe drives mounted in Sabrent cases with a USB-C interface, connected to an M3 MacBook Pro with 64GB of RAM.
I have used DUP Finder before and found it very helpful. Have I overflowed some buffer?
The problem is that the drives we are using at home today have much more capacity than we used to have.
So scanning operations such as duplicate finding, full backups, etc., consequently take a lot longer because there is a lot more to be scanned.
(1TB of duplicates is a lot; I’d guess these are probably backups?)
As you appear to be interested in files of a certain size, the best I can suggest is to try doing things in stages.
Use the ‘Ignore’ settings on file sizes to limit the size range, and thus the number of duplicate files searched for, at each stage.
This is from Windows; I assume it’s not too different on a Mac.
I suspect that it’s still going to take time to scan 4TB of storage (obviously with a lot of files on there if 1TB of them are duplicates), but it should be quicker if the scan can quickly discount some files from checking because of their file size.
It will also mean fewer results to look at each time, for each range of sizes.
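For what it’s worth, the staged idea above can be sketched outside the app too. The following is a minimal illustration (not CCleaner’s actual implementation; the function name and size bounds are my own): only look at files whose size falls in one band at a time, and since only same-size files can be duplicates, hash just those to confirm matches.

```python
# Sketch of a staged duplicate scan: restrict each pass to one file-size
# band, group by exact size, then hash only the size-collisions.
# Names and defaults here are illustrative, not from any particular app.
import hashlib
import os
from collections import defaultdict


def find_duplicates(root, min_size=0, max_size=50 * 1024 * 1024):
    """Return {sha256_digest: [paths]} for duplicate files in the size band."""
    by_size = defaultdict(list)
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                size = os.path.getsize(path)
            except OSError:
                continue  # unreadable entry; skip it
            if min_size <= size <= max_size:
                by_size[size].append(path)

    groups = defaultdict(list)
    for size, paths in by_size.items():
        if len(paths) < 2:
            continue  # a unique size can't have a duplicate
        for path in paths:
            h = hashlib.sha256()
            with open(path, "rb") as f:
                for chunk in iter(lambda: f.read(1 << 20), b""):
                    h.update(chunk)
            groups[h.hexdigest()].append(path)

    # Keep only digests that actually matched more than one file.
    return {digest: ps for digest, ps in groups.items() if len(ps) > 1}
```

Running it repeatedly with adjacent, non-overlapping size bands (say 0–50MB, then 50–500MB) covers the whole drive while keeping each pass, and each result list, small.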
Thanks for the suggestion. I found the Limit search option; however, the maximum exclude file size is 50MB. So I set it to 50MB (from 16MB) and I’m running DUP Finder now. Yup, it takes a while.
I have a habit of backing up without regard to previous backups, so I end up with bits and pieces of backups scattered over several drives. I’m attempting to consolidate so I can get the use of my external drives back…
DEVELOPERS, IF YOU ARE READING, MAC USERS NEED A HIGHER EXCLUDE LIMIT. Please - 500MB would be nice…
That’s going to exclude files over 50MB; I thought those bigger files were the ones you were particularly interested in?
I also tend to do full backup rather than incrementals.
I do a full image onto one removable drive (Macrium Reflect saves the last 12 and takes care of removing older ones).
Then I do a copy/paste manual backup of the complete Documents folder to a networked drive. (I delete the older ones manually when I remember to get round to it).
You might be best just doing a new complete backup to an empty drive (or two, to different drives) and deleting all of the old ones.
Unless you need a record of how things were before any revisions to the files.
I just wanted to report on my success. Setting the exclude limit to 50MB made the difference. No lock-up - full scrolling without issue. Right now the app is “Cleaning Duplicates” - very slowly, but I’m happy if it completes…
Another suggestion for the developers: can we have the app programmatically select the first duplicate for deletion?
Backup may be a misnomer. Archival copy may be better. My intention is to delete dups then fold both drives into one (if possible). I fear the original may have been removed from the original location to make space on a system drive…
Anyway, I got it from here… Thanks for the help and conversation.
Lesson learned: Do your work in small manageable bites, where possible.
It’s deliberate that you have to select them yourself, and only one-by-one.
Years ago when they first added the Duplicate Finder it was decided that any quick/auto option would be open to users making errors and deleting the wrong things, and then of course blaming CCleaner for doing exactly what they told it to do.