Huge Memory Usage

Machine Stats

[attachment: post-13221-1200696376_thumb.jpg]

Look at the number of files found!!!

The memory usage is actually small compared to that number!

The best request would be an option to not show files older than... or to not remember files...

luik said:

You can only apply the filter after you've already done the search? The memory usage might be smaller if you could set the filter options before searching.

If that is the case, that Recuva is keeping the whole list of recoverable files in RAM, I'm sure that would do it. If a PC works anything like my calculator, that would easily cause it, and there should be an easy fix: simply dump portions of the list onto the hard drive as a temp file and start a new list in RAM.

I wrote a program on my calculator that built a list of numbers over 3,000 entries long (don't ask why, it's a long story), and it would hit a Buffer Overflow error at 88% because of the constant opening, appending, and closing of the list. I simply set it to work on only a tenth of the list at a time and append each tenth onto the previous one. Not only would the program finish running this way, it also ran a lot faster.

True, the exact same thing couldn't be done with Recuva, because it doesn't know in advance how many files it will find (the calculator program covered a fixed range and worked on a tenth of that range at a time, not a tenth of the results, since it was still finding them), but it could store only, say, 10,000 entries in memory at a time. When dealing with over one million entries, that would probably speed it up.

I don't really know how this would be accomplished with computer code, but I'm willing to bet there is a way, provided that is the problem.
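I don't know how Recuva actually stores its results, so this is only a rough sketch of the chunking idea in Python, not real Recuva code; the spool_results name, the 10,000 chunk size, and the fake file names are all made up for illustration. It buffers results and appends each full chunk to a temp file on disk, so only one chunk ever sits in RAM at a time:

```python
import os
import tempfile

CHUNK_SIZE = 10_000  # keep at most this many result entries in RAM at once

def spool_results(scan, chunk_size=CHUNK_SIZE):
    """Write scan results to a temp file in chunks instead of holding
    the whole list in memory. `scan` is any iterable of result strings."""
    spill = tempfile.NamedTemporaryFile("w", delete=False, suffix=".lst")
    buffer = []
    try:
        for entry in scan:
            buffer.append(entry)
            if len(buffer) >= chunk_size:
                spill.write("\n".join(buffer) + "\n")  # append the full chunk to disk
                buffer.clear()                         # and free that RAM
        if buffer:                                     # flush the final partial chunk
            spill.write("\n".join(buffer) + "\n")
    finally:
        spill.close()
    return spill.name  # the UI can page results back in from this file

if __name__ == "__main__":
    # Simulate a scan that finds a million deleted files.
    fake_scan = ("C:\\deleted\\file_{}.dat".format(i) for i in range(1_000_000))
    path = spool_results(fake_scan)
    print("results spooled to", path)
    os.remove(path)
```

The trade-off is that the program then has to read the list back from disk to display or filter it, which is slower per lookup, but memory use stays flat no matter how many files the scan finds.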