I left my computer for a few hours and came back to some "out of memory" messages from MediaMonkey. Turns out it managed to use up 2 GB of memory: [screenshot]

I have never, ever had any memory issues with MediaMonkey before, so this has to be something I did recently. I can't say it for a fact, but it seems to be because I ticked "analyze tracks for duplicates". I have around 30,000 songs in my library, and this is the first time I have ticked this option, so it is being built for every track. If it is relevant, it is doing this rebuilding during the run-time "monitoring tracks" check on the folders I have set to watch.

When I closed MediaMonkey and started it back up again, it looks like it left off at around track 14,000, so it probably took around 12,000-13,000 tracks to reach this mass usage.

I watched it as it finished monitoring a few thousand more tracks, and the memory usage is definitely rising. Tracks that have already had their hash value computed for the duplicate check do not seem to be contributing to this at all, only the ones computed since MediaMonkey started running, so I am assuming it is a memory leak.

Unfortunately I can not kill the "Monitoring..." task, and it has a long way to go until it finishes completely, so I can't say if it will clean up or not when finished. Definitely not a high priority problem, especially if the memory cleans up after the analyzing has finished (I will have to wait a few hours to get back to you on that one).

I know it's not the latest version, but I didn't see anything even remotely related to this in the updates. It would probably help if someone can confirm this with the latest version:

1. Add a few thousand tracks to your library from a folder being monitored.
2. You should see them in the "Monitoring x of x" progress bar, going really slowly, since they are being checked for duplicates for the first time.
3. The overall memory consumption will constantly be jumping up and down for a while, but you'll see the base consumption slowly start to increase, passing a few hundred MB of memory usage in 10-30 minutes.

I'll probably post again when it finally finishes, whether the memory was cleaned up in the end or not.

Edit: Alright, it finished, and far sooner than I expected. I was hoping for it to finish when a ton more memory was allocated, but I planned it out poorly and it ended with 100 MB allocated. Though seeing as MediaMonkey only takes up 50 MB of memory when loaded, even after the monitoring check it seems that at least a good chunk of it is not being cleaned up.

Rovingcowboy wrote: looking at the image you posted, you need to go look for the cleanmem program. it was just released a few days ago, maybe last week? i posted about it in the news and other stuff forum room. that will do wonders for you, and it is a program that you can use on either 32 or 64 bit systems.

That will do almost nothing here, since all but 20 MB of the memory is in the private set. Along with that, if you look at the paged pool, there are hardly any page file hits going on, so it isn't like my system is struggling for resources. I have plenty of memory still available - 69% of 8 GB, so over 5 GB free.

The best I could hope for is that MediaMonkey could recognize when the system is low on resources and free up garbage, but if it could somehow do that, I doubt there would be this problem in the first place. This problem isn't a concern for me at all, but I doubt I am the first one to try to analyze over 10,000 tracks for duplicates, and for those with 2 GB or less who try it, their system will just die in an unholy mess of page file hits. Even when you close the offending program, it will still take a long time for all the memory pages in virtual memory to make it back into system memory, and the alternative is to just reboot - neither of which is appealing.

Rovingcowboy wrote: that program i mentioned does not use the swap file, but if your programs are in the swap file, yep, it can take time to get that ram back. yep, with that much memory you can still notice the speed up. i see in your list a program i also have, but mine is listed as needing zero memory to stay loaded, so it's less than one KB to load it, while you have it taking 9,334 KB. it's juschedl.exe; unless yours was running at the time, it should not need that much, and if that happens he might send the other programs to the swap file. i know that mm will grab all the ram he can to do his tasks, and that can be a lot of ram, which i am thinking might give you a slowdown if you do more than the 10,000 files you said.
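For readers curious what "analyze tracks for duplicates" boils down to, the posts describe computing a hash value per track and comparing them. This is only a rough sketch of that idea, not MediaMonkey's actual implementation (the function names and hashing scheme here are illustrative assumptions); note that hashing in fixed-size chunks keeps per-track memory flat, which is exactly the property a leak-free analyzer would want:

```python
import hashlib
from collections import defaultdict

def file_hash(path, chunk_size=1 << 16):
    """Hash a file in fixed-size chunks so memory use stays flat
    no matter how large the track is."""
    h = hashlib.sha1()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            h.update(chunk)
    return h.hexdigest()

def find_duplicates(paths):
    """Group paths by content hash; any group with two or more
    entries is a set of byte-identical duplicates."""
    groups = defaultdict(list)
    for p in paths:
        groups[file_hash(p)].append(p)
    return [g for g in groups.values() if len(g) > 1]
```

A real music-library tool would likely hash decoded audio rather than raw bytes (so differing tags don't hide duplicates), but the grouping logic is the same.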
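The repro steps describe the classic leak signature: consumption jumps up and down, but the base slowly rises. That pattern can be checked mechanically by sampling the process's memory periodically and comparing the floor of each window of samples, since a leak raises the floor even while the instantaneous value bounces around. A minimal, platform-neutral sketch of that check (the sample numbers below are synthetic, not measurements from MediaMonkey):

```python
def leak_suspected(samples, window=4):
    """Split memory samples (e.g. MB, taken at a fixed interval) into
    consecutive windows and take each window's minimum, the "base"
    usage. If every window's floor is higher than the previous one,
    the noise is riding on a rising baseline - a leak candidate."""
    floors = [min(samples[i:i + window])
              for i in range(0, len(samples) - window + 1, window)]
    return all(later > earlier for earlier, later in zip(floors, floors[1:]))
```

On Windows the samples would come from something like Task Manager's private working set for the process; the function itself doesn't care where they come from.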