HSchulze

Experienced Members
  • Posts: 12

Everything posted by HSchulze

  1. The problem is that file sizes and HDD capacities keep growing, so poorly threaded software that uses only 1 or 2 of 16+ cores and manages 20MB/s on a 220MB/s drive is inefficient and slow. It's almost faster to copy the remaining files from one HDD to an empty one. RAID10 wouldn't make the defrag go much faster, but with that much more data it would take about 5x longer. Background defragmentation starts to make sense.
  2. Glary Utilities is much faster, but not as accurate or as flexible. I end up using three tools to scrub the system drive and three tools to fix massively fragmented drives after an astrophotography session explodes into multiple files per frame from a 100-megapixel camera. Both a 2TB M.2 SSD and a 12TB HDD are needed for scratch files. Glary's defrag gets 10k files consolidated in less than 5 minutes. Then Defraggler handles the ~50 remaining massive OS and protected files, followed by a boot-time defrag for the MFT etc. When the work is done, I move the files to a third drive and a NAS for backup, delete the temp files and start over. It's 20 hours of intensive disk work on terabytes of temp files.
  3. The leading digit is truncated. Soon to be changing from 1x to 2x.
  4. 5th year, and users of large hard drives like myself would like some improvements. I would buy the Business version if your website showed that it had more features. Performance is important for video editing; it drops to 20% on a fragmented drive. I already use SSD pairs to bounce edits between, but on a productive day I can generate or replace 1TB or more of files. Making the drives larger (i.e. RAID of 2-5 drives, striping, etc.) just exacerbates the problem: it increases the peak speed but does not decrease the defrag time. My requests (a rough sketch of the batching idea follows this post):
     1) Allow processing more than one drive at a time. This exists in competing software.
     2) Proper multi-thread support, so the file/fragment sorting operations run in parallel.
     3) Increase the number of files being read/written at once so that scatter-gather can raise throughput. On a 32GB machine, only 300MB appears to be in use. Some defrag software does long read/write operations under some circumstances, which is much faster than reading and writing single small files.
     4) Decide whether moving the next 50,000 semi-consecutive files one at a time is slower than reading them all at once and writing them back as one consolidated block. Making use of the hard drive's internal DRAM for read-ahead would also speed things up. I know it can be fast, because on an SSD you can move 500MB/sec or more.
     At more than 3 days per drive (a 70%-full 12TB, not even the latest 24TB) capable of 220MB/s, it takes 10 hours to copy the drive to an empty one and 10 hours to copy it back. I need to do this type of defrag once a month or so. Consolidation (full or fragmented) is important because writing new files scatters them into all the potholes. A fast consolidation shouldn't take an hour. I would put $100 on the table for 2 or more of the above.
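
To make request 3) concrete, here is a minimal Python sketch of the batched read-then-write idea: gather many small files into RAM up to a budget, then write them out in one sequential pass instead of copying them one at a time. This is my illustration, not Defraggler's code; the directory paths and the 256MB budget are made-up placeholders, and path order stands in for the physical cluster order a real defragmenter would use.

```python
# Hypothetical sketch: batch many small files into RAM, then write them out
# sequentially, instead of copying one file at a time.
import os

BATCH_BUDGET = 256 * 1024 * 1024  # illustrative RAM budget in bytes

def batched_copy(src_dir: str, dst_dir: str) -> None:
    os.makedirs(dst_dir, exist_ok=True)
    batch, used = [], 0
    # Stand-in for on-disk order: a real tool would sort by first cluster.
    for entry in sorted(os.scandir(src_dir), key=lambda e: e.path):
        if not entry.is_file():
            continue
        with open(entry.path, "rb") as f:
            data = f.read()              # gather: many small reads into RAM
        batch.append((entry.name, data))
        used += len(data)
        if used >= BATCH_BUDGET:
            flush(batch, dst_dir)        # one long, mostly sequential write pass
            batch, used = [], 0
    flush(batch, dst_dir)

def flush(batch: list, dst_dir: str) -> None:
    for name, data in batch:
        with open(os.path.join(dst_dir, name), "wb") as f:
            f.write(data)

if __name__ == "__main__":
    # Hypothetical scratch and destination folders.
    batched_copy(r"D:\scratch\frames", r"E:\consolidated\frames")
```
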
  5. The query cache isn't going to be cleaned while Excel is still installed. I was under the impression that CC would clean out temporary files. I wish it would also tackle the many copies of installer files, where I find 2 to 4 levels of unpacking, caching, and backing up. I removed a total of 77GB of working file copies, a hibernation file that isn't used by a box PC running 24/7, a dozen unused apps, some released cloud caches, and PST files from work (just retired), for a total of 170GB. That matters on a 512GB SSD. Duplicate photos and documents were found by other tools. I am just posting to make the developers aware of these trashcans. The point about file age is that temp files may need to lie around for a day to a month, but not much longer.
  6. ..\AppData\Local\Microsoft\Office\16.0\PowerQuery\Cache\0c22c117-47d9-4ebd-b5e5-7c6d8243a4f3.dat is 1.2GB, and there are many others like it, last used 1.5 years ago. The cache files cover a period of 4 months. I also found 300MB in 6000 files in ..\AppData\Local\Google\Chrome\User Data\Default\Code Cache\js, which are up to 1.5 years old, and another 200MB right next to it in IndexedDB; I will toss those as soon as I close Chrome. AppData\Roaming\Zoom\logs had 150MB as well.
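
A minimal Python sketch of the age-based scan implied by the two posts above (my illustration, not CCleaner's logic): total up how much data in these cache folders has not been touched within a cutoff, so stale files can be reviewed before deletion. The 30-day cutoff is an assumption taken from the "a day to a month" remark; the paths are the ones named above.

```python
# Sketch only: report cache data older than a cutoff. Whether to delete it
# is left to the user; the cutoff is an assumption, not a CCleaner setting.
import os
import time

CUTOFF_DAYS = 30
CACHE_DIRS = [
    os.path.expandvars(r"%LOCALAPPDATA%\Microsoft\Office\16.0\PowerQuery\Cache"),
    os.path.expandvars(r"%LOCALAPPDATA%\Google\Chrome\User Data\Default\Code Cache\js"),
    os.path.expandvars(r"%APPDATA%\Zoom\logs"),
]

def stale_bytes(root: str, cutoff_days: int = CUTOFF_DAYS) -> int:
    """Sum the sizes of files under root not modified within cutoff_days."""
    cutoff = time.time() - cutoff_days * 86400
    total = 0
    for dirpath, _dirs, names in os.walk(root):
        for name in names:
            try:
                st = os.stat(os.path.join(dirpath, name))
            except OSError:
                continue  # file vanished or is locked; skip it
            if st.st_mtime < cutoff:
                total += st.st_size
    return total

if __name__ == "__main__":
    for root in CACHE_DIRS:
        print(f"{root}: {stale_bytes(root) / 1e6:.1f} MB older than {CUTOFF_DAYS} days")
```
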
  7. Same here. I had the 64-bit version installed, with no sign of the 32-bit one in the registry, yet CC installed the 32-bit one. Uninstalling IV32 trashed some registry links, so I had to uninstall the older IV64 version and then reinstall the latest IV. Not useful. Should I trust CC to update 4 drivers?
  8. The OP asked: why isn't it faster? The throughput algorithms that worked for 50MB/s HDDs no longer hold for SSDs. Or is there additional driver and/or OS-level (Win10 Pro) overhead that is slowing it down? I am defragging about 200x 500KB files with 3 fragments each, and throughput is <100KB/sec on a Samsung 980.
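
For scale, a quick back-of-envelope check of the numbers in that post (my arithmetic, not a measurement of Defraggler internals):

```python
# Back-of-envelope check of the figures in the post above.
files, size_kb, observed_kb_s = 200, 500, 100
total_mb = files * size_kb / 1024                     # ~98 MB to move
minutes = files * size_kb / observed_kb_s / 60        # time at the observed rate
print(f"{total_mb:.0f} MB at {observed_kb_s} KB/s -> about {minutes:.0f} minutes")
# A single 500 MB/s sequential pass would move the same data in well under a
# second, which suggests per-file overhead, not drive bandwidth, is the limit.
print(f"at 500 MB/s -> {total_mb / 500:.2f} s")
```
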
  9. Files that were last accessed or last written a long time ago should be pushed to the end of the drive, to save re-defragmenting them and to speed up access to newer files at the beginning and middle of the drive. I watched Norton SD for many hours trying to divine what it was up to; it was a Great Thing in the days of DOS. The various things that people want out of their defrag performance should be weighted based on their priorities.
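
A small Python sketch of that weighting idea (entirely illustrative; the weights and the split between read age and write age are assumptions, not anything Defraggler exposes): score each file by how stale it is, then place the coldest files toward the end of the drive.

```python
# Illustrative only: rank files by "coldness" so the stalest ones can be
# placed at the end of the drive. Weights are user-tunable assumptions.
import os
import time

W_READ_AGE = 1.0    # how much a long-unread file counts
W_WRITE_AGE = 2.0   # how much a long-unmodified file counts

def coldness(st: os.stat_result, now: float) -> float:
    # st_atime may be coarse on NTFS if last-access updates are disabled.
    days_since_read = (now - st.st_atime) / 86400
    days_since_write = (now - st.st_mtime) / 86400
    return W_READ_AGE * days_since_read + W_WRITE_AGE * days_since_write

def placement_order(root: str) -> list:
    """Return paths sorted warmest-first; the tail is the end-of-drive pile."""
    now = time.time()
    scored = []
    for dirpath, _dirs, names in os.walk(root):
        for name in names:
            path = os.path.join(dirpath, name)
            try:
                scored.append((coldness(os.stat(path), now), path))
            except OSError:
                continue  # unreadable entries are simply skipped
    return [path for _score, path in sorted(scored)]
```
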
  10. I am doing a once-every-6-months defrag of a 512GB Samsung SSD, which does 1 to 3 GB/s sequential reads and usually over 600MB/s sequential writes. In an effort to understand the fragmentation, I started by defragging the largest and most fragmented files (top 100, etc.). Now that I am looking at 500x 1MB files with ~4 fragments each, I have been seeing <1MB/s throughput for the last hour.
     I don't know why Defraggler doesn't use scatter-gather with a live bitmap: read a pile of files into memory (32GB of DRAM here) in sector order (which helps HDDs even more), then write them back to the new locations, preferably sequentially one after another. With the bitmap you mark the files you are going to read, and skip (or re-plan) any file whose write would land on a cluster with a read still pending. This can be done multi-threaded instead of single-threaded: one thread reading and writing, one committing changes to the bitmap, and another tracking writes to completed files so they can be removed from the bitmap (a rough sketch follows this post). I have a 16-thread CPU at 4GHz, which can saturate a 2GB/s SSD with one thread. This should run at least 10 to 20x faster, since it isn't throughput constrained.
     The first time I ran Defraggler was what seems like 10 years ago. It works, but it has always been the slowest of the lot. I would pay $100 for a version that ran at full speed.
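
A rough, self-contained Python sketch of that pipeline (my illustration, not Defraggler's implementation): one thread gathers file data, another writes it back into consecutive locations, and a shared set stands in for the live cluster bitmap. The simulated volume, file sizes, and locations are all invented so the example runs on its own; real cluster moves would go through the OS defrag APIs.

```python
# Sketch only: gather/scatter pipeline with a shared "pending" set playing the
# role of the live bitmap.
import queue
import threading

# Simulated volume: file name -> (current_location, data). All values invented.
volume = {f"file{i:03d}": (i * 10, bytes(1024)) for i in range(500)}

pending = set()                 # files with a move in flight
pending_lock = threading.Lock()
work = queue.Queue(maxsize=64)  # bounded queue keeps RAM use under a budget

def reader():
    """Gather: read files in current-location order and hand them to the writer."""
    for name, (_loc, data) in sorted(volume.items(), key=lambda kv: kv[1][0]):
        with pending_lock:
            pending.add(name)
        work.put((name, data))
    work.put(None)              # sentinel: no more files

def writer():
    """Scatter: write files back into consecutive new spots, then retire them."""
    next_loc = 0
    while (item := work.get()) is not None:
        name, data = item
        volume[name] = (next_loc, data)   # "move" the file to its new location
        next_loc += 1
        with pending_lock:
            pending.discard(name)         # committed; drop it from the bitmap

t_read = threading.Thread(target=reader)
t_write = threading.Thread(target=writer)
t_read.start(); t_write.start()
t_read.join(); t_write.join()
print("files relocated:", len(volume), "| still pending:", len(pending))
```
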
  11. Defragmenting all files all the time only increases the likelihood of bit rot, as the data travels through your CPU, memory, and I/O chip without CRC protection. Defragging once every one to three months is enough, unless you are a movie editor, in which case you will copy your files to different media or a server once you are done and erase the fragmented files anyway (then defrag).
  12. Use the Windows recovery options, which you can find via Google and other search engines. The latest recovery software will work out what is wrong and make recommendations.