  1. Hmm.. for jkDefrag there is a GUI now. : ) - [Link] To move everything to the beginning or to the end of a partition, I can recommend [MyDefrag]. That tool also has a GUI, and you simply click on what you'd like to do - for our use case, "Move files to the beginning of disk". I have to add that I recommend using this tool ONLY for that purpose. For everyday defragging I recommend Defraggler [in file defragging mode].
  2. Yes, I can only agree with nukecad: there is no performance improvement in defragging whole disks in one task. It just takes an excessive amount of time, wears down the disk mechanics, and all of this for no noticeable performance improvement afterwards.

     You should perhaps consider defragging only the files with the most fragments. But also keep in mind: the bigger the file, the more fragments it needs before defragging is worth it. As an example:

     Imagine you have a file which is 800 KB in size but has 2000 fragments - you should consider defragging this one, as it is really fragmented. If you have a 9 GB file with 10 fragments... well, who cares. That file is so big that 10 fragments make no difference at all when reading; it does not fit in any read buffer anyway. Operating systems use read buffers of a limited size per file access (the 9 GB file does not fit in one regardless), and since the disk most likely serves more than one task at the same time, the head will sweep across the surface multiple times anyway - 10 extra seeks won't hurt performance at all. Actually defragging the 9 GB file, by contrast, takes considerably longer than just reading its 10 fragments.

     You should rather defrag the most frequently used files, like those in %windir%\System32 or %windir%\INF, as Windows accesses them quite often. For instance, if you have a 50 KB INF file with 5 fragments, you might want to defrag it: Windows may read it quite often, and since it is quite small, it might fit into the operating system's read buffer and be read in one go - which only works if the data is sequential (1 fragment) on your disk. Also consider defragging folder information.
     Important folders like those in %windir% are among the most-used on your disk and need to be accessed as fast as possible. There is no sense in defragging log files and other data that is rarely ever accessed again.

     Do not defrag open databases. Defraggler (and every other defrag tool) WILL leave the DB corrupted: the database service does not know you are defragg(l)ing its database and keeps writing into the file while Defraggler is also writing to it. The result is a corrupted file. Always avoid defragging open databases.

     All of the above leads to one conclusion: NEVER use the "Defrag" button. Always use the FILE tab to choose what to defrag. Sort by file size or by number of fragments, then use "Defrag selected file(s)". You can also multi-select files first and then launch the process.
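The rule of thumb above (small file with many fragments: defrag; huge file with few fragments: skip) can be sketched as a simple heuristic that weighs fragment count against file size. The threshold values below are illustrative assumptions for this sketch, not Defraggler's actual selection logic.

```python
def worth_defragging(size_bytes, fragment_count,
                     frags_per_mb_threshold=1.0, min_fragments=2):
    """Heuristic from the post: weigh fragment count by file size.

    Thresholds are illustrative assumptions, not Defraggler's real logic.
    """
    if fragment_count < min_fragments:
        return False  # already contiguous (or close enough)
    size_mb = max(size_bytes / (1024 * 1024), 0.001)
    # More fragments per megabyte means more seeks per useful byte read.
    return fragment_count / size_mb >= frags_per_mb_threshold

# The post's examples:
print(worth_defragging(800 * 1024, 2000))  # 800 KB, 2000 fragments -> True
print(worth_defragging(9 * 1024**3, 10))   # 9 GB, 10 fragments -> False
print(worth_defragging(50 * 1024, 5))      # 50 KB INF, 5 fragments -> True
```

Normalising by size is the point: 2000 fragments in 800 KB means thousands of seeks per megabyte read, while 10 fragments spread over 9 GB are negligible next to the transfer time itself.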
  3. Well, it is indeed quite hard to see what has already been suggested when there are 20+ pages of suggestions. But I do have to agree: it would be excellent to have a choice between caching the to-be-defragged data in RAM, OR choosing a Windows drive letter to tell Defraggler which drive to use for caching - a RAM disk that exists anyway, for instance.

     A few things to consider when caching to RAM or to another drive/RAM drive:
     - You should have backup power; in a power outage the cached data would be lost.
     - There needs to be a settings dialogue in which to choose how many MB, TB, whatever size you'd like Defraggler to use for this cache.
     - Your machine requires enough free RAM/disk space for that limit.
     - Should individual files be cached, or whole sets of (selected) files?
     - The cache drive needs to be faster than the source drive.

     Two reasons to even consider caching onto another drive or into RAM:
     1) Performance. If you read from the source drive and write to the same drive (source == destination drive), the disk has to do both one after the other - and in today's world it most likely has other tasks running in parallel in the meanwhile. That slows the process down and puts more stress on the disk. Caching would put the disk in a position to only read the data, and write it back later when disk load allows.
     2) More space on the source drive. Moving the to-be-defragged data off the source drive frees up space, which can then be used for data later on while defragg(l)ing.
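The pre-flight checks this suggestion implies (respect the user-set limit, enough free space on the cache target, cache faster than the source) could be sketched as follows. Everything here is hypothetical - a sketch of the checks the post lists, not any actual Defraggler feature.

```python
def can_cache(batch_bytes, user_limit_bytes, free_cache_bytes,
              cache_read_mbps, source_read_mbps):
    """Pre-flight check for the suggested caching feature.

    All names and rules are hypothetical, mirroring the considerations
    in the post rather than a real Defraggler API.
    """
    if batch_bytes > user_limit_bytes:
        return False  # exceeds the user-configured cache size limit
    if batch_bytes > free_cache_bytes:
        return False  # not enough free RAM / cache-drive space
    if cache_read_mbps <= source_read_mbps:
        return False  # cache must be faster than the source drive
    return True

MB = 1024 * 1024
# 500 MB batch, 2 GB limit, 4 GB free RAM disk, RAM disk much faster than HDD:
print(can_cache(500 * MB, 2048 * MB, 4096 * MB, 5000, 150))  # -> True
```

Such a check would run per batch before any data is moved, so a failed condition degrades gracefully to the current in-place behaviour instead of starting a transfer that cannot complete.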