System.OutOfMemoryException because of Large Dictionary

Submitted by 半世苍凉 on 2019-12-30 09:57:32

Question


I keep a large cache in a dictionary whose values are of type IEnumerable<KeyValuePair<DateTime, Double>>. I remove items from the dictionary periodically and add items to it periodically. Every now and again I get a System.OutOfMemoryException. I wanted to know why the garbage collector doesn't come to my rescue.


Answer 1:


Remember that the heap can become fragmented, as @Gabe mentioned. Even though you might have free memory overall, there might not be a contiguous chunk large enough to hold the dictionary's internal arrays when it performs its resize.

Perhaps you could use the caching block from the Patterns & Practices library (MSDN Library Link), which will assist you in implementing a good cache. Or maybe you could choose an algorithm that doesn't allocate memory dynamically and instead works with a fixed number of entries?

Also note that if there isn't any memory left for you to use, then the problem is the size of your cache, not the garbage collector.
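The fixed-number-of-entries idea above can be sketched as follows. This is a minimal illustration, not a library type: `BoundedCache` and its members are hypothetical names, and the eviction policy (drop the oldest insertion) is just one possible choice. Pre-sizing the inner dictionary to its final capacity also avoids the repeated internal resizes mentioned above.

```csharp
using System;
using System.Collections.Generic;

// Hypothetical fixed-capacity cache: once Capacity is reached, the
// oldest entry is evicted, so the dictionary never grows past the
// size it was pre-allocated for.
class BoundedCache<TKey, TValue>
{
    private readonly int _capacity;
    private readonly Dictionary<TKey, TValue> _map;
    private readonly Queue<TKey> _insertionOrder = new Queue<TKey>();

    public BoundedCache(int capacity)
    {
        _capacity = capacity;
        // Pre-sizing avoids the resize-time reallocation that can fail
        // on a fragmented heap.
        _map = new Dictionary<TKey, TValue>(capacity);
    }

    public int Count => _map.Count;

    public void Add(TKey key, TValue value)
    {
        if (!_map.ContainsKey(key))
        {
            if (_map.Count >= _capacity)
            {
                // Evict the oldest key to keep memory bounded.
                _map.Remove(_insertionOrder.Dequeue());
            }
            _insertionOrder.Enqueue(key);
        }
        _map[key] = value;
    }

    public bool TryGet(TKey key, out TValue value) =>
        _map.TryGetValue(key, out value);
}
```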




Answer 2:


It's quite possible that the GC is coming to your rescue a lot of the time, but that you're just going beyond its capabilities sometimes.

Just to be absolutely clear, is this a Dictionary<DateTime, Double> or a Dictionary<SomeKeyType, IEnumerable<KeyValuePair<DateTime, Double>>>? If it's the latter, then perhaps you're holding onto references somewhere else?

How large is your cache getting? Do you have monitoring to keep track of it? What makes you think it's the dictionary which is causing the problem? If you have control over how much you cache, have you tried reducing the size?




Answer 3:


Since you're asking why the GC doesn't rescue you, I'll give an answer to that.

Using a programming language or environment with a garbage collector makes life easier for you, but it won't make memory management a thing of the past.

If you allocate a chunk of memory that pushes your process past about 2 GB on a 32-bit XP machine, you have just reached one of the first .NET memory boundaries. Keeping 2 GB in memory is always a bad idea.

On a memory-constrained machine running a huge database or the like, you'll quickly hit the boundaries of available memory. Since the GC is not OS-aware, it might not notice a low-memory situation in time (creating huge objects like bitmaps can trigger this situation). Manually calling GC.Collect once you've set a huge object to nothing helps a lot here.

"Keeping a big dictionary in memory" is a very vague description. Can you tell us what's in the collection and how big those items theoretically are?

If by big you mean something like 2,147,483,647 items, you could have hit the integer size limit.

To sum up:

  • Don't keep unneeded items in memory or swap them out to disk.
  • Do call GC.Collect once you've freed 'big' items (but not inside the loop that deletes items; call it after the loop, please).
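The second bullet can be sketched as below. This is an illustration of the pattern only (the `TrimAndCollect` helper and the even-key criterion are made up for the example); routinely calling GC.Collect is usually unnecessary, but after dropping many large objects at once it can reclaim memory promptly.

```csharp
using System;
using System.Collections.Generic;

static class CacheTrim
{
    // Removes every even-numbered entry from the cache, then triggers
    // one collection after the loop rather than one per removal.
    public static int TrimAndCollect(Dictionary<int, double[]> cache)
    {
        foreach (var key in new List<int>(cache.Keys))
        {
            if (key % 2 == 0)
                cache.Remove(key); // no GC.Collect here, inside the loop
        }

        // Collect once, after the loop, so all the freed arrays are
        // reclaimed in a single pass instead of a pass per item.
        GC.Collect();
        GC.WaitForPendingFinalizers();
        return cache.Count;
    }
}
```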



Answer 4:


I'm not sure, excuse me if I'm wrong, but maybe the Dictionary's backing storage ends up on the Large Object Heap when it grows larger than 85 KB (like a byte[90000]).

As Gabe said:

I can see that maybe the large object heap gets fragmented and you wouldn't be able to grow the dictionary after a point

When the LOH gets fragmented, it sometimes doesn't have enough space to store an object at contiguous addresses. That's what causes the OutOfMemoryException. It's more like an "out of contiguous space in the LOH" exception.
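A quick way to observe the LOH threshold mentioned above: on the CLR, arrays of roughly 85,000 bytes or more are allocated on the Large Object Heap, and such objects are reported as generation 2 immediately after allocation, while small objects start in generation 0. (The exact generation numbers are an implementation detail, so this is a probe, not a guarantee.)

```csharp
using System;

class LohProbe
{
    static void Main()
    {
        var small = new byte[1_000];   // small object heap
        var large = new byte[90_000];  // over ~85,000 bytes: Large Object Heap

        // On the CLR, freshly allocated LOH objects typically report
        // generation 2, while small objects start in generation 0.
        Console.WriteLine(GC.GetGeneration(small)); // typically 0
        Console.WriteLine(GC.GetGeneration(large)); // typically 2
    }
}
```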



Source: https://stackoverflow.com/questions/5336011/system-outofmemoryexception-because-of-large-dictionary
