To add to my previous comment: it occurred to me afterwards that it could be even more meaningful to calculate and display the median file fragment size instead of the average, and to identify the most problematic files using that metric.
Calculating the average fragment size just distributes the total file size evenly across all fragments, without any more specific knowledge of each fragment's size. But I assume that after a completed disk analysis, Defraggler knows exactly how large each fragment of a particular file is, i.e. it knows the distribution of the file data among its fragments.
From this analysis data it should be possible to compute the median file fragment size (the size of the middle fragment when all fragments of a file are sorted by size, or the mean of the two middle ones when the count is even), which is a more representative metric than the average, since a single unusually large fragment can't skew it.
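To illustrate the difference, here is a small sketch in Python using made-up fragment sizes for a single file (the numbers are purely hypothetical, not anything Defraggler reports):

```python
import statistics

# Hypothetical fragment sizes (in bytes) for one file, as a disk
# analysis might report them: one large fragment plus many tiny ones.
fragment_sizes = [8_000_000, 4096, 4096, 8192, 4096, 12288, 4096]

average = sum(fragment_sizes) / len(fragment_sizes)
median = statistics.median(fragment_sizes)  # sorts, then takes the middle value

print(f"average: {average:,.0f} bytes")  # pulled far upward by the one big fragment
print(f"median:  {median:,.0f} bytes")   # reflects the typical tiny fragment
```

Here the average comes out above a megabyte, while the median is 4096 bytes, which much better describes what most of the file's fragments actually look like.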