
"Best Practice" or Cookbook for Defraggler


Peter2


Hi

 

Is there a description of how to use Defraggler, something like a "Best Practice: Advantages and Disadvantages" guide?

 

I read the docs, and I found a lot of info like "If you want Quick Defrag, press Quick Defrag". "If you want to move the files, enable the move files option."

 

That's fine, but not always helpful - sometimes I need a "cookbook" like this...

 

For example, to get the most out of defragging:
- a) defrag "empty space" (enable option X and disable option Y)
- b) defrag the drive without moving ABC
- c) enable boot-time defrag
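Steps like these can partly be reproduced outside any GUI with the defragmenter built into Windows XP. A minimal sketch, shown as a dry run (assumptions: the `-a` and `-f` flags come from XP's own `defrag /?` help; Defraggler's free-space and boot-time options have no command-line equivalent that I know of, so only the basic full pass is covered):

```shell
# Dry run: print the Windows XP defrag.exe commands instead of executing them.
# Drop the 'run' wrapper when typing these at an actual XP command prompt.
run() { echo "$@"; }

run defrag C: -a     # analyze only: report fragmentation without moving files
run defrag C: -f     # force a full defrag even if free space is low
```

The dry-run wrapper simply makes the intended commands visible; on a non-Windows machine nothing is actually defragmented.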

 

This is the kind of information that I'm looking for...


  • What is recommended?
  • What are (dis)advantages?
  • Which options are dangerous or useless to combine?
     

Who can help?

 

Peter

Win XP


2 weeks later...

This is the kind of info I was looking for as well, but no replies were forthcoming. Guess we'll just have to use common sense and hope for/test for the best. You pretty much hit what SEEMS optimal in the posted question. Best thing I like: using my system WHILE Defraggler is running, with no apparent side effects. With the regular defrag that comes with Windows (XP Pro in my case), EVERY time the mouse is clicked or the system is used, defrag restarts!!! What a blessing it is not to have to put up with that any more :-) ...JL


A simple "cookbook" is an attractive idea, but what makes sense is going to depend on your computer configuration and how you use it; there are tradeoffs.

 

There are also going to be differing opinions, depending on the goals of the "expert": a perfect layout with every file (no matter how large) in one extent, or simply high system performance for minimal maintenance effort.

 

For example, an optimal routine for my system appears to be:

 

o Regular weekly scheduled run of the Windows 7 defragger

o Run Defraggler's "Quick Defrag" regularly to catch new files (perhaps daily)

o Occasionally run a free-space defrag (perhaps monthly)

o Separate out a cache with lots of small files, and a very large database file, into a small filesystem which also contains the pagefile

o Separate out large archive files like media or downloads into a filesystem at the end of the disk

o Move large files to the end of partitions to avoid them being regularly shuffled about when defrag compacts the used area after deletions
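The weekly/monthly cadence in the first bullets can be automated with Windows' own Task Scheduler. A hedged sketch, again printed as a dry run (assumptions: `schtasks` and `defrag.exe` as shipped with Windows 7, where `/X` performs free-space consolidation; the task names "WeeklyDefrag" and "MonthlyFreeSpace" are made-up examples; Defraggler's Quick Defrag has no command-line switch I'd rely on, so only the built-in defragger is scheduled here):

```shell
# Dry run: print the Windows 7 schtasks commands instead of executing them.
# Drop the 'sched' wrapper when running these in an elevated Windows prompt.
sched() { echo "$@"; }

# Weekly full defrag of C: early Sunday morning (mirrors the first bullet)
sched schtasks /Create /TN WeeklyDefrag /TR "defrag.exe C:" /SC WEEKLY /D SUN /ST 03:00

# Monthly free-space consolidation, mirroring the occasional free-space pass
sched schtasks /Create /TN MonthlyFreeSpace /TR "defrag.exe C: /X" /SC MONTHLY /ST 03:00
```

Scheduling the built-in defragger this way keeps the routine maintenance hands-off, leaving Defraggler for the interactive Quick Defrag runs.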

 

This works well for me, resulting in good performance for little effort, with the scheduled defrag mostly going unnoticed, but it benefits from the disk partitioning scheme I planned. If everything were in one huge C: drive, moving the large files could be less attractive.

 

I stopped using Defraggler for full defrags because it tended to work too hard (coalescing even very large fragments, so the defrag cost was larger than the fragmentation overhead), and I had an issue with the scheduled defrag not operating with the same options as an interactive defrag (which meant large files were moved back to the start of the disk).


..There are also going to be differing opinions, depending on the goals of the "expert", ...

This is what I wanted to say: there should be several variants of the solution.

 

a) If you want high performance in daily work...

b) If you want high performance when making backups to tape...

c) If you want to schedule a short daily job...

d) If you want to schedule an intensive monthly job...

 

Peter

Win XP


OK, but with Windows 7 (and Vista), doesn't the system's scheduled defrag already give you that (relatively high performance)?

 

I'm using Defraggler mainly for the Quick Defrag, on small cache files which otherwise tend to be stored in a horribly large number of pieces and can be expected to be re-accessed, so reducing the seek time makes sense. The new boot-time defrag feature is another reason, as is the free-space defragging feature in Advanced.

 

Someone else may make "having a fragmentation-free file system" their goal, and then the recipe is going to require advice about restore points, making contiguous space for page files, multiple passes, and other tricks to get to 0% fragmentation. I don't think performance numbers would support such an extreme focus; after all, the PC is still going to be opening and reading thousands of individual small files, each requiring seeks to be located on disk, no matter how well laid out the large files are.

