Hi all,
In a previous posting on this subject I highlighted a pretty nasty performance issue I was seeing with Defraggler: even on a relatively empty and relatively un-fragmented drive, Defraggler was taking hours to complete a job that should realistically have taken no more than 20 to 30 minutes. I see from reading the change logs for subsequent versions that you have tried to address this issue, and I have seen a slight increase in performance. Sadly, however, Defraggler is still running extremely slowly on my system.
I've been pondering why that might be, because when using, say, Auslogics Disk Defrag I see quite good performance. Even on a heavily fragmented disk (something on the order of 50%), Auslogics Disk Defrag never seems to take more than an hour or so to get the job done, and that's doing a full defrag and optimise, including moving system files to the start of the drive for faster operation. It's got me puzzled as to why Defraggler is so slow in comparison.
Thinking about it, the thought occurs that it might be something to do with the way the two apps process files. Defraggler seems to try and process entire files at once, which on a high-powered system isn't a problem. My system, however, only has 1 GB of RAM, and therefore when processing very large files, such as those found in the World of Warcraft folder, the whole thing chokes and Defraggler moves with all the speed of a glacier. Auslogics Disk Defrag, on the other hand, seems to process files cluster by cluster, which makes for much better performance on a low-powered system because the RAM isn't being overloaded with more data than it has capacity to handle.
I'm no programmer, so I could be wrong; admittedly these are only visual observations from watching the two programmes operate. But if you want to improve the speed of Defraggler further, and actually get it to operate at somewhere near a reasonable level on all systems regardless of their hardware, then it might be a good idea to have a look at the way Defraggler operates, as there might be a better, more efficient way to do things, i.e. processing smaller chunks of data sequentially rather than trying to process entire files at once. That's fine for small files, but not for large multi-gigabyte files, not on a lower-end system.
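To illustrate what I mean (purely as a sketch; I have no idea how Defraggler is actually written internally, and the chunk size here is just an arbitrary example value), here is the general idea of the difference between reading a whole file into memory versus processing it in fixed-size chunks:

```python
# Illustrative sketch only -- not Defraggler's actual code.
# It shows why chunked processing keeps memory use bounded
# while whole-file processing can swamp a machine with 1 GB of RAM.

def process_whole_file(path):
    # Loads the entire file into RAM at once. Fine for small files,
    # but a multi-gigabyte file forces the system into heavy swapping.
    with open(path, "rb") as f:
        data = f.read()
    return len(data)

def process_in_chunks(path, chunk_size=64 * 1024):
    # Processes the file piece by piece (here 64 KB at a time),
    # so peak memory use stays around chunk_size no matter how
    # big the file is -- analogous to working cluster by cluster.
    total = 0
    with open(path, "rb") as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                break
            total += len(chunk)
    return total
```

Both approaches get the same answer; the difference is only in how much data sits in RAM at any one moment, which is exactly the difference I think I'm seeing between the two defragmenters.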