When analysing any NTFS drive formatted with a cluster size of 128 KB or larger, Defraggler either crashes or the analysis gets stuck at "starting".
Other file systems with 128 KB clusters, such as exFAT, work as expected.
I know it's unusual to have clusters this large, but I need them for a particular backup program: when it creates huge backup files it tends to fragment them massively, and large clusters greatly reduce the problem.
So why isn't there any support? Older defrag programs such as UltraDefrag appeared to handle such drives years ago (I haven't tested this extensively, but it seemed to work fine), and programmatically I would have thought it would be relatively simple to implement.
At least, if for some strange reason you choose not to support such drives, the program should fail gracefully and show a prompt telling the user it can't defrag the drive, rather than just falling over or hanging.
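For illustration, here is a rough C / Win32 sketch of the kind of up-front check I mean (the drive letter and the 64 KB limit are just placeholders for whatever the program actually supports):

```c
#include <windows.h>
#include <stdio.h>

int main(void)
{
    const wchar_t *root = L"D:\\";   /* placeholder volume to analyse */
    DWORD sectorsPerCluster, bytesPerSector, freeClusters, totalClusters;

    if (!GetDiskFreeSpaceW(root, &sectorsPerCluster, &bytesPerSector,
                           &freeClusters, &totalClusters)) {
        wprintf(L"Could not query %ls (error %lu)\n", root, GetLastError());
        return 1;
    }

    DWORD clusterSize = sectorsPerCluster * bytesPerSector;
    if (clusterSize > 64 * 1024) {   /* assumed supported maximum */
        /* Fail gracefully instead of crashing or hanging. */
        wprintf(L"Cluster size is %lu KB - this volume is not supported.\n",
                clusterSize / 1024);
        return 1;
    }

    wprintf(L"Cluster size is %lu KB - OK to analyse.\n", clusterSize / 1024);
    return 0;
}
```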
Just because one defrag tool was updated to support it doesn't mean the myriad of others ever will; you can't assume that. A lot of defrag tools are very legacy at this point, and much older than Microsoft's support for cluster sizes above 64 KB. I guess that makes it fairly obvious that although defrag tools call the Microsoft defrag API, they're still doing their own thing on top of it, and I think a crash is better than the tool trying to defrag the disk and corrupting it.
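For what it's worth, the analysis side of that API is basically a DeviceIoControl call per file; here is a rough C sketch of FSCTL_GET_RETRIEVAL_POINTERS (the file path and buffer size are placeholders). The extents come back in clusters whatever the cluster size is, so I'd guess the 128 KB breakage is in each tool's own bookkeeping on top of this rather than in the API itself:

```c
#include <windows.h>
#include <winioctl.h>
#include <stdio.h>

int main(void)
{
    /* Placeholder path; in practice this is the fragmented file being analysed. */
    HANDLE h = CreateFileW(L"D:\\backup.img", FILE_READ_ATTRIBUTES,
                           FILE_SHARE_READ | FILE_SHARE_WRITE, NULL,
                           OPEN_EXISTING, 0, NULL);
    if (h == INVALID_HANDLE_VALUE) {
        printf("open failed: %lu\n", GetLastError());
        return 1;
    }

    STARTING_VCN_INPUT_BUFFER in = {0};   /* start at the file's first cluster */
    union {
        RETRIEVAL_POINTERS_BUFFER rp;     /* extent list header + first extents */
        BYTE raw[4096];                   /* arbitrary buffer size for the sketch */
    } out;
    DWORD bytes;

    /* One call is enough for a sketch; a real tool loops on ERROR_MORE_DATA
       until it has the whole extent list, then decides what to move with
       FSCTL_MOVE_FILE. */
    if (DeviceIoControl(h, FSCTL_GET_RETRIEVAL_POINTERS, &in, sizeof in,
                        &out.rp, sizeof out, &bytes, NULL)) {
        printf("%lu extent(s) returned in the first chunk\n", out.rp.ExtentCount);
    } else if (GetLastError() == ERROR_MORE_DATA) {
        printf("file has more extents than fit in this buffer\n");
    } else {
        printf("FSCTL_GET_RETRIEVAL_POINTERS failed: %lu\n", GetLastError());
    }

    CloseHandle(h);
    return 0;
}
```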